WorldWideScience

Sample records for intravenous human big

  1. Human factors in Big Data

    NARCIS (Netherlands)

    Boer, J. de

    2016-01-01

    Since 2014 I have been involved in various (research) projects that try to make the hype around Big Data more concrete and tangible for industry and government. Big Data is about multiple sources of (real-time) data that can be analysed, transformed into information and used to make 'smart' decisions.

  2. The human experience with intravenous levodopa

    Directory of Open Access Journals (Sweden)

    Shan H Siddiqi

    2016-01-01

    Objective: To compile a comprehensive summary of published human experience with levodopa given intravenously, with a focus on information required by regulatory agencies. Background: While safe intravenous (IV) use of levodopa has been documented for over 50 years, regulatory supervision for pharmaceuticals given by a route other than that approved by the U.S. Food and Drug Administration (FDA) has become increasingly cautious. If delivering a drug by an alternate route raises the risk of adverse events, an investigational new drug (IND) application is required, including a comprehensive review of toxicity data. Methods: Over 200 articles referring to IV levodopa were examined for details of administration, pharmacokinetics, benefit and side effects. Results: We identified 142 original reports describing IV levodopa (IVLD) use in humans, beginning with psychiatric research in 1959-1960 before the development of peripheral decarboxylase inhibitors. Over 2750 subjects have received IV levodopa, and reported outcomes include parkinsonian signs, sleep variables, hormone levels, hemodynamics, CSF amino acid composition, regional cerebral blood flow, cognition, perception and complex behavior. Mean pharmacokinetic variables were summarized for 49 healthy subjects and 190 with Parkinson’s disease. Side effects were those expected from clinical experience with oral levodopa and dopamine agonists. No articles reported deaths or induction of psychosis. Conclusion: Over 2750 patients have received IV levodopa with a safety profile comparable to that seen with oral administration.

  3. Effect of intravenous lipid on human pancreatic secretion.

    Science.gov (United States)

    Edelman, K; Valenzuela, J E

    1983-11-01

    Parenteral alimentation, including intravenous fat, is sometimes used in the treatment of patients with pancreatitis, although the effect of intravenous fat on human pancreatic secretion has not been systematically studied. Intravenous fat, however, has been shown to stimulate pancreatic protein secretion in the dog. The purpose of these studies was to clarify the effect of intravenous fat on human pancreatic secretion. Pancreatic secretion was assessed by measurement of enzymes and bicarbonate in duodenal aspirate collected via a double-lumen tube from 6 healthy volunteers. Four studies were randomly conducted on different days. On day 1, graded concentrations of Intralipid (5%, 10%, and 20%) were given intravenously for 1 h each, while secretin (8.2 pmol . kg-1 . h-1) was given as a background. On day 2, the same doses of Intralipid were infused intravenously without secretin. On day 3, the same doses of Intralipid were perfused into the intestine, and, finally, on day 4, 20% Intralipid was given by intestinal infusion for 2 h while 10% Intralipid was infused intravenously during the second hour. Significant stimulation of enzyme secretion was observed only during the infusion of fat into the intestine, not after intravenous infusion at any concentration. Pancreatic enzyme secretion, stimulated by intraintestinal fat, was not significantly modified by simultaneous intravenous lipid infusion. We conclude that since intravenous fat does not stimulate pancreatic secretion, its use in conditions where pancreatic stimulation is undesirable appears safe.

  4. Intravenous buprenorphine and norbuprenorphine pharmacokinetics in humans

    Science.gov (United States)

    Huestis, M.A.; Cone, E.J.; Pirnay, S.O.; Umbricht, A.; Preston, K.L.

    2013-01-01

    Background: Prescribed sublingual (SL) buprenorphine is sometimes diverted for intravenous (IV) abuse, but no human pharmacokinetic data are available following high-dose IV buprenorphine. Methods: Plasma was collected for 72 h after administration of placebo or 2, 4, 8, 12, or 16 mg IV buprenorphine in escalating order (single-blind, double-dummy) in 5 healthy male non-dependent opioid users. Buprenorphine and its primary active metabolite, norbuprenorphine, were quantified by liquid chromatography-tandem mass spectrometry with limits of quantitation of 0.1 μg/L. Results: Maximum buprenorphine concentrations (mean ± SE) were detected 10 min after 2, 4, 8, 12, and 16 mg IV: 19.3±1.0, 44.5±4.8, 85.2±7.7, 124.6±16.6, and 137.7±18.8 μg/L, respectively. Maximum norbuprenorphine concentrations occurred 10–15 min (3.7±0.7 μg/L) after 16 mg IV administration. Conclusions: Buprenorphine concentrations increased in a significant, linear, dose-dependent manner up to 12 mg IV buprenorphine. Thus, previously demonstrated pharmacodynamic ceiling effects (over 2–16 mg) are not due to pharmacokinetic adaptations within this range, although pharmacokinetics may play a role at doses higher than 12 mg. PMID:23246635
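
    As a rough illustration of the dose-linearity claim, the mean Cmax values reported in the abstract can be fitted with a through-origin least-squares line; the 16 mg observation then falls below the extrapolated trend. This is a sketch for illustration only, not the study's statistical analysis:

```python
# Check dose-proportionality of buprenorphine Cmax using the abstract's means.
# Doses 2-12 mg are reported to scale linearly; 16 mg falls below the trend.
doses = [2, 4, 8, 12]             # mg IV
cmax = [19.3, 44.5, 85.2, 124.6]  # ug/L (mean values from the abstract)

# Least-squares slope through the origin: sum(x*y) / sum(x*x)
slope = sum(d * c for d, c in zip(doses, cmax)) / sum(d * d for d in doses)
predicted_16 = slope * 16  # extrapolated Cmax at 16 mg, ug/L

print(round(slope, 2))           # slope in ug/L per mg
print(round(predicted_16, 1))    # linear extrapolation to 16 mg
print(137.7 < predicted_16)      # True: observed mean sits below the trend
```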

  5. Intravenous polyclonal human immunoglobulins in multiple sclerosis

    DEFF Research Database (Denmark)

    Sørensen, Per Soelberg

    2008-01-01

    Intravenous immunoglobulin (IVIG) is an established therapy for demyelinating diseases of the peripheral nervous system. IVIG exerts a number of effects that may be beneficial in multiple sclerosis (MS). Four double-blind IVIG trials have been performed in relapsing-remitting MS. A meta-analysis…

  6. Intravenous polyclonal human immunoglobulins in multiple sclerosis

    DEFF Research Database (Denmark)

    Sørensen, Per Soelberg

    2008-01-01

    Intravenous immunoglobulin (IVIG) is an established therapy for demyelinating diseases of the peripheral nervous system. IVIG exerts a number of effects that may be beneficial in multiple sclerosis (MS). Four double-blind IVIG trials have been performed in relapsing-remitting MS. A meta-analysis of the four trials has shown that IVIG reduces the relapse rate and, possibly, disease progression. In patients with a first episode of demyelinating disease, IVIG delays the time to the second relapse and thereby to the diagnosis of definite MS. In patients with an acute MS relapse, IVIG as add-on therapy to methylprednisolone does not make remission of symptoms faster or more complete. IVIG does not seem to be of any benefit to chronic visual or motor symptoms in MS. In secondary progressive MS, IVIG has not shown any effect on disease progression, relapses or new magnetic resonance imaging lesions. Experimental…

  7. Exploring Data in Human Resources Big Data

    Directory of Open Access Journals (Sweden)

    Adela BARA

    2016-01-01

    Nowadays, social networks and informatics technologies and infrastructures are constantly developing and affecting each other. In this context, the HR recruitment process has become complex, and many multinational organizations have encountered selection issues. The objective of the paper is to develop a prototype system for assisting the selection of candidates for an intelligent management of human resources. Such a system can be a starting point for the efficient organization of semi-structured and unstructured data on recruitment activities. The article extends the research presented at the 14th International Conference on Informatics in Economy (IE 2015) in the scientific paper "Big Data challenges for human resources management".

  8. Human neuroimaging as a "Big Data" science.

    Science.gov (United States)

    Van Horn, John Darrell; Toga, Arthur W

    2014-06-01

    The maturation of in vivo neuroimaging has led to incredible quantities of digital information about the human brain. While much is made of the data deluge in science, neuroimaging represents the leading edge of this onslaught of "big data". A range of neuroimaging databasing approaches has streamlined the transmission, storage, and dissemination of data from such brain imaging studies. Yet few, if any, common solutions exist to support the science of neuroimaging. In this article, we discuss how modern neuroimaging research represents a multifactorial and broad ranging data challenge, involving the growing size of the data being acquired; sociological and logistical sharing issues; infrastructural challenges for multi-site, multi-datatype archiving; and the means by which to explore and mine these data. As neuroimaging advances further, e.g. aging, genetics, and age-related disease, new vision is needed to manage and process this information while marshalling of these resources into novel results. Thus, "big data" can become "big" brain science.

  9. Big Data Analysis of Human Genome Variations

    KAUST Repository

    Gojobori, Takashi

    2016-01-25

    Since the human genome draft sequence was first made public in 2000, genomic analyses have been intensively extended to the population level. The following three international projects are good examples of large-scale studies of human genome variations: 1) HapMap Data (1,417 individuals) (http://hapmap.ncbi.nlm.nih.gov/downloads/genotypes/2010-08_phaseII+III/forward/), 2) HGDP (Human Genome Diversity Project) Data (940 individuals) (http://www.hagsc.org/hgdp/files.html), 3) 1000 Genomes Data (2,504 individuals) (http://ftp.1000genomes.ebi.ac.uk/vol1/ftp/release/20130502/). If we can integrate all three into a single data set, we should be able to conduct a more detailed analysis of human genome variations for a total of 4,861 individuals (= 1,417 + 940 + 2,504). In fact, we successfully integrated these three data sets by use of information on the reference human genome sequence, and we conducted the big data analysis. In particular, we constructed a phylogenetic tree of about 5,000 human individuals at the genome level. As a result, we were able to identify clusters of ethnic groups, with detectable admixture, that were not identifiable by an analysis of any one of the three data sets. Here, we report the outcome of this kind of big data analysis and discuss the evolutionary significance of human genomic variations. Note that the present study was conducted in collaboration with Katsuhiko Mineta and Kosuke Goto at KAUST.
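
    The pooled total quoted above can be reproduced with a toy sketch: once variants from each project are expressed in the same reference-genome coordinates, the three cohorts are disjoint sample sets, so integration amounts to a set union. The sample IDs below are invented placeholders, not real accession identifiers:

```python
# Toy illustration of pooling three cohorts after mapping all variants to one
# reference genome: the sample sets are disjoint, so the pooled size is the sum.
hapmap = {f"HM{i}" for i in range(1417)}   # HapMap: 1,417 individuals
hgdp = {f"HGDP{i}" for i in range(940)}    # HGDP: 940 individuals
kg = {f"KG{i}" for i in range(2504)}       # 1000 Genomes: 2,504 individuals

pooled = hapmap | hgdp | kg
print(len(pooled))  # 4861, matching the abstract's total
```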

  10. Biliary excretion of intravenous [14C]omeprazole in humans

    Energy Technology Data Exchange (ETDEWEB)

    Lind, T.; Andersson, T.; Skanberg, I.O.; Olbe, L.

    1987-11-01

    We have studied the biliary excretion of [14C]omeprazole in humans. The study was performed in eight healthy subjects and the technique used was based on multiple marker dilution principles with double-lumen tubes placed in both the stomach and intestine. The results obtained show a 16% biliary excretion of [14C]omeprazole. These data suggest a minimal spillover of omeprazole from the gastric mucosa into the gastric lumen in humans. The results also agree with previous data of the fecal recovery of radiolabeled omeprazole that suggest that the fecal excretion of intravenous omeprazole in humans is entirely accounted for by biliary excretion.

  11. Biliary excretion of intravenous [14C] omeprazole in humans

    International Nuclear Information System (INIS)

    Lind, T.; Andersson, T.; Skanberg, I.O.; Olbe, L.

    1987-01-01

    We have studied the biliary excretion of [14C]omeprazole in humans. The study was performed in eight healthy subjects and the technique used was based on multiple marker dilution principles with double-lumen tubes placed in both the stomach and intestine. The results obtained show a 16% biliary excretion of [14C]omeprazole. These data suggest a minimal spillover of omeprazole from the gastric mucosa into the gastric lumen in humans. The results also agree with previous data of the fecal recovery of radiolabeled omeprazole that suggest that the fecal excretion of intravenous omeprazole in humans is entirely accounted for by biliary excretion.

  12. Big Data is a big lie without little data: Humanistic intelligence as a human right

    Directory of Open Access Journals (Sweden)

    Steve Mann

    2017-02-01

    This article introduces an important concept: transparency by way of Humanistic Intelligence as a human right, and in particular Big/Little Data and Sur/Sous Veillance, where “Little Data” is to sousveillance (undersight) as “Big Data” is to surveillance (oversight). Veillance (sur- and sous-veillance) is a core concept not just in human–human interaction (e.g. people watching other people) but also in terms of Human–Computer Interaction. In this sense, veillance is the core of Human-in-the-loop Intelligence (Humanistic Intelligence) rather than Artificial Intelligence, leading us to the concept of “Sousveillant Systems”, which are forms of Human–Computer Interaction in which internal computational states are made visible to end users, allowing users (but not requiring them) to “jump” into the computational feedback loop whenever or wherever they want. An important special case of Sousveillant Systems is that of scientific exploration: not only is (big/little) data considered, but due consideration must also be given to how data is captured, understood, explored, and discovered, and in particular to the use of scientific instruments to collect data, to make important new discoveries, and to learn about the world. Science is a domain where bottom-up transparency is of the utmost importance, and scientists have the right and responsibility to be able to understand the instruments that they use to make their discoveries. Such instruments must be sousveillant systems!

  13. Think Big! The Human Condition Project

    Science.gov (United States)

    Metcalfe, Gareth

    2014-01-01

    How can educators provide children with a genuine experience of carrying out an extended scientific investigation? And can teachers change the perception of what it means to be a scientist? These were key questions that lay behind "The Human Condition" project, an initiative funded by the Primary Science Teaching Trust to explore a new…

  14. The Human Genome Project: big science transforms biology and medicine.

    Science.gov (United States)

    Hood, Leroy; Rowen, Lee

    2013-01-01

    The Human Genome Project has transformed biology through its integrated big science approach to deciphering a reference human genome sequence along with the complete sequences of key model organisms. The project exemplifies the power, necessity and success of large, integrated, cross-disciplinary efforts - so-called 'big science' - directed towards complex major objectives. In this article, we discuss the ways in which this ambitious endeavor led to the development of novel technologies and analytical tools, and how it brought the expertise of engineers, computer scientists and mathematicians together with biologists. It established an open approach to data sharing and open-source software, thereby making the data resulting from the project accessible to all. The genome sequences of microbes, plants and animals have revolutionized many fields of science, including microbiology, virology, infectious disease and plant biology. Moreover, deeper knowledge of human sequence variation has begun to alter the practice of medicine. The Human Genome Project has inspired subsequent large-scale data acquisition initiatives such as the International HapMap Project, 1000 Genomes, and The Cancer Genome Atlas, as well as the recently announced Human Brain Project and the emerging Human Proteome Project.

  15. Pharmacokinetics of high-dose intravenous melatonin in humans

    DEFF Research Database (Denmark)

    Andersen, Lars P H; Werner, Mads U; Rosenkilde, Mette Marie

    2016-01-01

    This crossover study investigated the pharmacokinetics and adverse effects of high-dose intravenous melatonin. Volunteers participated in 3 identical study sessions, receiving an intravenous bolus of 10 mg melatonin, 100 mg melatonin, and placebo. Blood samples were collected at baseline and 0, 60, 120, 180, 240, 300, 360, and 420 minutes after the bolus. Quantitative determination of plasma melatonin concentrations was performed using a radioimmunoassay technique. Pharmacokinetic parameters were estimated by a compartmental pharmacokinetic analysis. Adverse effects included assessments of sedation and registration of other symptoms. Sedation, evaluated as simple reaction times, was measured at baseline and 120, 180, 300, and 420 minutes after the bolus. Twelve male volunteers completed the study. Median (IQR) Cmax after the bolus injections of 10 mg and 100 mg of melatonin were 221…
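
    The compartmental analysis mentioned above can be illustrated with its simplest instance, a one-compartment IV bolus model, C(t) = C0·exp(−k·t), where C0 = dose/Vd and k = ln(2)/t½. The parameter values below are hypothetical, chosen only to show the shape of the calculation, and are not the study's estimates:

```python
import math

def one_compartment_iv(dose_mg, vd_l, half_life_min, t_min):
    """Plasma concentration (mg/L) t_min after an IV bolus, one-compartment model."""
    c0 = dose_mg / vd_l               # initial concentration = dose / volume of distribution
    k = math.log(2) / half_life_min   # first-order elimination rate constant, 1/min
    return c0 * math.exp(-k * t_min)

# Hypothetical parameters: 10 mg bolus, Vd = 100 L, elimination half-life 45 min
c60 = one_compartment_iv(10, 100, 45, 60)
c90 = one_compartment_iv(10, 100, 45, 90)
print(round(c60, 4))  # concentration 60 min post-bolus, mg/L
print(round(c90, 4))  # after two half-lives: exactly a quarter of C0
```

    A sanity check on the model: at t = 90 min (two half-lives) the concentration is C0/4 = 0.025 mg/L, independent of the other parameters.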

  16. Big Data and Intelligence: Applications, Human Capital, and Education

    Directory of Open Access Journals (Sweden)

    Michael Landon-Murray

    2016-06-01

    The potential for big data to contribute to the US intelligence mission goes beyond bulk collection, social media and counterterrorism. Applications will speak to a range of issues of major concern to intelligence agencies, from military operations to climate change to cyber security. There are challenges too: procurement lags, data stovepiping, separating signal from noise, sources and methods, a range of normative issues and, central to managing these challenges, human capital. These potential applications and challenges are discussed, and a closer look at what data scientists do in the Intelligence Community (IC) is offered. Effectively filling the ranks of the IC’s data science workforce will depend on the provision of well-trained data scientists from the higher education system. Program offerings at America’s top fifty universities are thus surveyed (just a few years ago there were reportedly no degrees in data science). One Master’s program that has melded data science with intelligence is examined, as well as a university big data research center focused on security and intelligence. This discussion goes a long way toward clarifying the prospective uses of data science in intelligence while probing perhaps the key challenge to optimal application of big data in the IC.

  17. The dynamics of big data and human rights: the case of scientific research

    OpenAIRE

    Vayena, Effy; Tasioulas, John

    2016-01-01

    In this paper, we address the complex relationship between big data and human rights. Because this is a vast terrain, we restrict our focus in two main ways. First, we concentrate on big data applications in scientific research, mostly health-related research. And, second, we concentrate on two human rights: the familiar right to privacy and the less well-known right to science. Our contention is that human rights interact in potentially complex ways with big data, not only constraining it, but…

  18. Big History or the 13800 million years from the Big Bang to the Human Brain

    Science.gov (United States)

    Gústafsson, Ludvik E.

    2017-04-01

    Big History is the integrated history of the Cosmos, Earth, Life, and Humanity. It is an attempt to understand our existence as a continuous unfolding of processes leading to ever more complex structures. Three major steps in the development of the Universe can be distinguished, the first being the creation of matter/energy and forces in the context of an expanding universe, while the second and third steps were reached when completely new qualities of matter came into existence. 1. Matter comes out of nothing: quantum fluctuations and the inflation event are thought to be responsible for the creation of stable matter particles in what is called the Big Bang. Along with simple particles the universe is formed. Later, larger particles like atoms and the simplest chemical elements, hydrogen and helium, evolved. Gravitational contraction of hydrogen and helium formed the first stars and later on the first galaxies. Massive stars ended their lives in violent explosions, releasing heavier elements like carbon, oxygen, nitrogen, sulfur and iron into the universe. Subsequent star formation led to star systems with bodies containing these heavier elements. 2. Matter starts to live: about 9,200 million years after the Big Bang, a rather inconspicuous star of middle size formed in one of a billion galaxies. The leftovers of the star formation clumped into bodies rotating around the central star. In some of them, elements like silicon, oxygen, iron and many others became the dominant matter. On the third of these bodies from the central star, much of the surface was covered with an already very common chemical compound in the universe, water. Fluid water and plenty of various elements, especially carbon, were the ingredients of very complex chemical compounds that made up even more complex structures. These were able to replicate themselves. Life had appeared, the only occasion that we human beings know of. Life evolved subsequently, leading eventually to the formation of multicellular…

  19. Human Neuroimaging as a “Big Data” Science

    Science.gov (United States)

    Van Horn, John Darrell; Toga, Arthur W.

    2013-01-01

    The maturation of in vivo neuroimaging has led to incredible quantities of digital information about the human brain. While much is made of the data deluge in science, neuroimaging represents the leading edge of this onslaught of “big data”. A range of neuroimaging databasing approaches has streamlined the transmission, storage, and dissemination of data from such brain imaging studies. Yet few, if any, common solutions exist to support the science of neuroimaging. In this article, we discuss how modern neuroimaging research represents a multifactorial and broad ranging data challenge, involving the growing size of the data being acquired; sociological and logistical sharing issues; infrastructural challenges for multi-site, multi-datatype archiving; and the means by which to explore and mine these data. As neuroimaging advances further, e.g. aging, genetics, and age-related disease, new vision is needed to manage and process this information while marshalling of these resources into novel results. Thus, “big data” can become “big” brain science. PMID:24113873

  20. A Big Bang model of human colorectal tumor growth.

    Science.gov (United States)

    Sottoriva, Andrea; Kang, Haeyoun; Ma, Zhicheng; Graham, Trevor A; Salomon, Matthew P; Zhao, Junsong; Marjoram, Paul; Siegmund, Kimberly; Press, Michael F; Shibata, Darryl; Curtis, Christina

    2015-03-01

    What happens in early, still undetectable human malignancies is unknown because direct observations are impractical. Here we present and validate a 'Big Bang' model, whereby tumors grow predominantly as a single expansion producing numerous intermixed subclones that are not subject to stringent selection and where both public (clonal) and most detectable private (subclonal) alterations arise early during growth. Genomic profiling of 349 individual glands from 15 colorectal tumors showed an absence of selective sweeps, uniformly high intratumoral heterogeneity (ITH) and subclone mixing in distant regions, as postulated by our model. We also verified the prediction that most detectable ITH originates from early private alterations and not from later clonal expansions, thus exposing the profile of the primordial tumor. Moreover, some tumors appear 'born to be bad', with subclone mixing indicative of early malignant potential. This new model provides a quantitative framework to interpret tumor growth dynamics and the origins of ITH, with important clinical implications.

  1. Oral versus intravenous flucytosine in patients with human immunodeficiency virus-associated cryptococcal meningitis

    NARCIS (Netherlands)

    Brouwer, Annemarie E.; van Kan, Hendrikus J. M.; Johnson, Elizabeth; Rajanuwong, Adul; Teparrukkul, Prapit; Wuthiekanun, Vannaporn; Chierakul, Wirongrong; Day, Nick; Harrison, Thomas S.

    2007-01-01

    In a randomized controlled trial of amphotericin B-based therapy for human immunodeficiency virus (HIV)-associated cryptococcal meningitis in Thailand, we also compared the mycological efficacy, toxicity, and pharmacokinetics of oral versus intravenous flucytosine at 100 mg/kg of body weight/day for

  2. Induction of circulating phospholipase A2 by intravenous administration of recombinant human tumour necrosis factor

    Directory of Open Access Journals (Sweden)

    Waldemar Pruzanski

    1992-01-01

    We have examined the effects of intravenous infusion of recombinant human tumour necrosis factor (rh-TNF) on serum activity of phospholipase A2 (PLA2) in patients with malignancies. Nine patients received a 24 h continuous intravenous infusion ranging from 1.0 × 10⁵ U/m² to 3.0 × 10⁵ U/m²; 14 patients received a 5 day continuous intravenous infusion ranging from 0.5 × 10⁵ U/m²/day to 3.0 × 10⁵ U/m²/day. Twenty-one of 23 patients responded with marked increases in serum PLA2 activity that were detectable 3 h after the beginning of the rh-TNF infusion and reached maximum levels at 18 h, with a mean increase of 16.2-fold. In patients receiving a 5 day rh-TNF infusion, the highest levels of PLA2 were observed after the first day of infusion. Serum PLA2 activity then declined continuously, to 2.9-fold above baseline at the end of the infusion. A significant correlation was noted between the dose of infused rh-TNF and the maximum increase in PLA2 activity. To our knowledge, this is the first time that an association between intravenous TNF administration and induction of circulating PLA2 in man has been established.

  3. First-pass metabolism of ethanol in human beings: effect of intravenous infusion of fructose

    DEFF Research Database (Denmark)

    Parlesak, Alexandr; Billinger, MH; Schäfer, C.

    2004-01-01

    Intravenous infusion of fructose has been shown to enhance reoxidation of the reduced form of nicotinamide adenine dinucleotide and, thereby, to enhance the metabolism of ethanol. In the current study, the effect of fructose infusion on first-pass metabolism of ethanol was studied in human volunteers. … In conclusion, the results of the current study support the assumption that only a negligible part of first-pass metabolism of ethanol occurs in the stomach.

  4. How Big Data Fast Tracked Human Mobility Research and the Lessons for Animal Movement Ecology

    Directory of Open Access Journals (Sweden)

    Michele Thums

    2018-02-01

    The rise of the internet, coupled with technological innovations such as smartphones, has generated massive volumes of geo-referenced data (big data) on human mobility. This has allowed the number of studies of human mobility to rapidly overtake those of animal movement. Today, telemetry studies of animals are also approaching big data status. Here, we review recent advances in studies of human mobility and identify the opportunities they present for advancing our understanding of animal movement. We describe key analytical techniques, potential bottlenecks and a roadmap for progress toward a synthesis of movement patterns of wild animals.

  5. How Big Data Fast Tracked Human Mobility Research and the Lessons for Animal Movement Ecology

    KAUST Repository

    Thums, Michele

    2018-02-13

    The rise of the internet, coupled with technological innovations such as smartphones, has generated massive volumes of geo-referenced data (big data) on human mobility. This has allowed the number of studies of human mobility to rapidly overtake those of animal movement. Today, telemetry studies of animals are also approaching big data status. Here, we review recent advances in studies of human mobility and identify the opportunities they present for advancing our understanding of animal movement. We describe key analytical techniques, potential bottlenecks and a roadmap for progress toward a synthesis of movement patterns of wild animals.

  6. The safety of studies with intravenous Δ⁹-tetrahydrocannabinol in humans, with case histories.

    Science.gov (United States)

    Carbuto, Michelle; Sewell, R Andrew; Williams, Ashley; Forselius-Bielen, Kim; Braley, Gabriel; Elander, Jacqueline; Pittman, Brian; Schnakenberg, Ashley; Bhakta, Savita; Perry, Edward; Ranganathan, Mohini; D'Souza, Deepak Cyril

    2012-02-01

    Delta-9-tetrahydrocannabinol (THC) is one of the few cannabinoid receptor ligands that can be used to probe the cannabinoid system in humans. Despite increasing interest in the cannabinoid receptor system, use of intravenous THC as a research tool has been limited by concerns about its abuse liability and psychoactive effects. This study aims to evaluate the safety of all intravenous THC studies conducted at this center over the past 13 years. Included were 11 studies with 266 subjects (14 schizophrenia patients and 252 healthy subjects, of whom 76 were frequent cannabis users), 351 active THC infusions, and 226 placebo infusions. Subjects were monitored for subjective and physical adverse events and followed up to 12 months beyond study participation. There were one serious and 70 minor adverse events, in 9.7% of subjects and 7.4% of infusions, with 8.5% occurring after the end of the test day. Nausea and dizziness were the most frequent side effects. Adverse events were more likely to be associated with faster infusion rates (2-5 min) and higher doses (>2.1 mg/70 kg). Of 149 subjects on whom long-term follow-up data were gathered, 94% reported either no change or a reduction in their desire to use cannabis in the post-study period, 18% stated that their cannabis use decreased, and 3% stated that it increased in the post-study period. With careful subject selection and screening, risk to subjects is relatively low. Safeguards are generally sufficient and effective, reducing both the duration and severity of adverse events.

  7. The acute effects of intravenously administered mibefradil, a new calcium antagonist, on the electrophysiologic characteristics of the human heart

    NARCIS (Netherlands)

    Rosenquist, M; Brembilla-Perrot, B; Meinertz, T; Neugebauer, A; Crijns, HJMG; Smeets, JLRM; van der Vring, JAFM; Fromer, M; Kobrin

    Objective: This multicenter, double-blind, placebo-controlled, parallel-group study was designed to assess the acute effects of intravenous mibefradil on the electrophysiologic characteristics of the human heart. Methods: Seventy-one patients referred for routine electrophysiologic testing were

  8. Effect of intravenous infusion of glyceryl trinitrate on gastric and small intestinal motor function in healthy humans

    DEFF Research Database (Denmark)

    Madsen, Jan Lysgård; Fuglsang, Stefan; Graff, J

    2006-01-01

    OBJECTIVE: To examine the effect of intravenous infusion of glyceryl trinitrate on gastric and small intestinal motor function after a meal in healthy humans. METHODS: Nine healthy volunteers participated in a placebo-controlled, double-blind, crossover study. Each volunteer was examined during intravenous infusion of glyceryl trinitrate 1 microg/kg x min or saline. A gamma camera technique was used to measure gastric emptying and small intestinal transit after a 1600-kJ mixed liquid and solid meal. Furthermore, duodenal motility was assessed by manometry. RESULTS: Glyceryl trinitrate did not change gastric mean emptying time, gastric half emptying time, gastric retention at 15 min or small intestinal mean transit time. Glyceryl trinitrate did not influence the frequency of duodenal contractions, the amplitude of duodenal contractions or the duodenal motility index. CONCLUSIONS: Intravenous infusion of glyceryl…

  9. Intravenous Injection of Clinical Grade Human MSCs After Experimental Stroke: Functional Benefit and Microvascular Effect.

    Science.gov (United States)

    Moisan, Anaïck; Favre, Isabelle; Rome, Claire; De Fraipont, Florence; Grillon, Emmanuelle; Coquery, Nicolas; Mathieu, Hervé; Mayan, Virginie; Naegele, Bernadette; Hommel, Marc; Richard, Marie-Jeanne; Barbier, Emmanuel Luc; Remy, Chantal; Detante, Olivier

    2016-12-13

    Stroke is the leading cause of disability in adults. Many current clinical trials use intravenous (IV) administration of human bone marrow-derived mesenchymal stem cells (BM-MSCs). This autologous graft requires a delay for ex vivo expansion of cells. We followed microvascular effects and mechanisms of action involved after an IV injection of human BM-MSCs (hBM-MSCs) at a subacute phase of stroke. Rats underwent a transient middle cerebral artery occlusion (MCAo) or a surgery without occlusion (sham) at day 0 (D0). At D8, rats received an IV injection of 3 million hBM-MSCs or PBS-glutamine. In a longitudinal behavioral follow-up, we showed delayed somatosensory and cognitive benefits 4 to 7 weeks after hBM-MSC injection. In a separate longitudinal in vivo magnetic resonance imaging (MRI) study, we observed an enhanced vascular density in the ischemic area 2 and 3 weeks after hBM-MSC injection. Histology and quantitative polymerase chain reaction (qPCR) revealed an overexpression of angiogenic factors such as angiopoietin-1 (Ang1) and transforming growth factor-β1 (TGF-β1) at D16 in hBM-MSC-treated MCAo rats compared to PBS-treated MCAo rats. Altogether, delayed IV injection of hBM-MSCs provides functional benefits and increases cerebral angiogenesis in the stroke lesion via a release of endogenous angiogenic factors enhancing the stabilization of newborn vessels. Enhanced angiogenesis could therefore be a means of improving functional recovery after stroke.

  10. Big Hat, No Cattle: Managing Human Resources, Part 1.

    Science.gov (United States)

    Skinner, Wickham

    1982-01-01

    Presents an in-depth analysis of problems and a suggested approach to developing human resources which goes beyond identifying symptoms and provides a comprehensive perspective for building an effective work force. (JOW)

  11. Pharmacokinetics and pharmacodynamics of eltanolone (pregnanolone), a new steroid intravenous anaesthetic, in humans

    DEFF Research Database (Denmark)

    Carl, Peder; Høgskilde, S; Lang-Jensen, T

    1994-01-01

    Eltanolone, a new intravenous steroid anaesthetic agent was administered intravenously in a dose of 0.6 mg.kg-1 over 45 s to eight healthy male volunteers to evaluate some of its pharmacokinetic and pharmacodynamic effects. Drug concentration-time data were analysed by PCNONLIN, a non-linear regr...

  12. The dynamics of big data and human rights: the case of scientific research

    Science.gov (United States)

    Tasioulas, John

    2016-01-01

    In this paper, we address the complex relationship between big data and human rights. Because this is a vast terrain, we restrict our focus in two main ways. First, we concentrate on big data applications in scientific research, mostly health-related research. And, second, we concentrate on two human rights: the familiar right to privacy and the less well-known right to science. Our contention is that human rights interact in potentially complex ways with big data, not only constraining it, but also enabling it in various ways; and that such rights are dynamic in character, rather than fixed once and for all, changing in their implications over time in line with changes in the context we inhabit, and also as they interact among themselves in jointly responding to the opportunities and risks thrown up by a changing world. Understanding this dynamic interaction of human rights is crucial for formulating an ethic tailored to the realities—the new capabilities and risks—of the rapidly evolving digital environment. This article is part of the themed issue ‘The ethical impact of data science’. PMID:28336802

  13. The dynamics of big data and human rights: the case of scientific research.

    Science.gov (United States)

    Vayena, Effy; Tasioulas, John

    2016-12-28

    In this paper, we address the complex relationship between big data and human rights. Because this is a vast terrain, we restrict our focus in two main ways. First, we concentrate on big data applications in scientific research, mostly health-related research. And, second, we concentrate on two human rights: the familiar right to privacy and the less well-known right to science. Our contention is that human rights interact in potentially complex ways with big data, not only constraining it, but also enabling it in various ways; and that such rights are dynamic in character, rather than fixed once and for all, changing in their implications over time in line with changes in the context we inhabit, and also as they interact among themselves in jointly responding to the opportunities and risks thrown up by a changing world. Understanding this dynamic interaction of human rights is crucial for formulating an ethic tailored to the realities (the new capabilities and risks) of the rapidly evolving digital environment. This article is part of the themed issue 'The ethical impact of data science'.

  14. A chromatographic method for the production of a human immunoglobulin G solution for intravenous use

    Directory of Open Access Journals (Sweden)

    K. Tanaka

    1998-11-01

    Immunoglobulin G (IgG) of excellent quality for intravenous use was obtained from the cryosupernatant of human plasma by a chromatographic method combining DEAE-Sepharose FF ion-exchange and arginine Sepharose 4B affinity chromatography, with a final purification step by Sephacryl S-300 HR gel filtration. The yield of 10 experimental batches produced was 3.5 g IgG per liter of plasma. A solvent/detergent combination of 1% tri(n-butyl) phosphate and 1% Triton X-100 was used to inactivate lipid-coated viruses. Analysis of the final product (5% liquid IgG), based on the mean for 10 batches, showed 94% monomers, 5.5% dimers and 0.5% polymers and aggregates. Anticomplementary activity was 0.3 CH50/mg IgG and prekallikrein activator levels were less than 5 IU/ml. Stability at 37°C for 30 days in the liquid state was satisfactory. IgG was stored in flasks (2.5 g/flask) at 4 to 8°C. All the characteristics of the product were consistent with the requirements of the 1997 Pharmacopée Européenne.

  15. A baboon syndrome induced by intravenous human immunoglobulins: report of a case and immunological analysis.

    Science.gov (United States)

    Barbaud, A; Tréchot, P; Granel, F; Lonchamp, P; Faure, G; Schmutz, J L; Béné, M C

    1999-01-01

    Following the second series of intravenous human immunoglobulins (IVIg; 0.4 g/kg) prescribed to treat a sensorimotor polyneuritis, a 28-year-old woman developed pompholyx that recurred after each of the following monthly treatments with IVIg. During the administration of the 10th series, the patient developed a typical baboon syndrome. Immunohistochemical studies of a skin biopsy revealed an unexpected epidermal expression of P-selectin, usually expressed by endothelial cells. Patch, prick and intradermal tests performed with IVIg on the back, arms and buttocks gave negative results on immediate and delayed readings. IVIg were re-administered, with the informed consent of the patient, and induced a generalized maculopapular rash. This is the first reported case of baboon syndrome induced by IVIg. Although extensive skin testing was performed, all test sites remained negative. We wonder whether IVIg could reproduce immunological mechanisms involved in the 3 types of systemic contact dermatitis (pompholyx, baboon syndrome and maculopapular rash), including the epidermal expression of P-selectin.

  16. The excretion of hexavalent uranium following intravenous administration. II, Studies on human subjects

    Energy Technology Data Exchange (ETDEWEB)

    Bassett, S.H.; Frankel, A.; Cedars, N.; VanAlstine, H.; Waterhouse, C.; Cusson, K.

    1948-06-25

    Tracer studies employing uranium enriched in the isotopes U-234 and U-235 have been carried out in six human subjects: four males and two females. The uranium, 6 to 70 micrograms per kilogram of body weight, was given intravenously in the hexavalent state as uranyl nitrate. Each individual in the series received a single injection of the metal except for one, who was given two widely spaced doses: the first when his condition was normal and the second after an acidosis had been produced by ingestion of ammonium chloride. Renal function tests, including urinary catalase, protein, amino N to creatinine N ratio and clearances of mannitol and p-aminohippurate, were done before and after administration of uranium. Only at the 70 microgram per kilogram level, in Subject 6, was there a slight rise in urinary catalase and protein, suggesting that tolerance had been reached. The excretion of uranium was mainly in the urine, where 70 to 85% of the administered dose appeared in the first twenty-four hours. Urine of the second twenty-four hours contained about 4%, and the third twenty-four-hour urine 1.5%, of the administered dose. Detectable amounts were excreted for at least two weeks.

  17. Short faces, big tongues: developmental origin of the human chin.

    Directory of Open Access Journals (Sweden)

    Michael Coquerelle

    During the course of human evolution, the retraction of the face underneath the braincase, and closer to the cervical column, has reduced the horizontal dimension of the vocal tract. By contrast, the relative size of the tongue has not been reduced, implying a rearrangement of the space at the back of the vocal tract to allow breathing and swallowing. This may have left a morphological signature such as a chin (mental prominence) that can potentially be interpreted in Homo. Although long considered an autapomorphic trait of Homo sapiens, different forms of mental prominence occur in various extinct hominins. These features may be the evolutionary by-product of equivalent developmental constraints correlated with an enlarged tongue. In order to investigate developmental mechanisms related to this hypothesis, we compare 34 modern human infants with 8 chimpanzee fetuses, in whom development of the mandibular symphysis passes through similar stages. The study sets out to test whether the shared ontogenetic shape changes of the symphysis observed in both species are driven by the same factor: space restriction at the back of the vocal tract and the associated arrangement of the tongue and hyoid bone. We apply geometric morphometric methods to an extensive three-dimensional configuration of anatomical landmarks and semilandmarks, capturing the geometry of the cervico-craniofacial complex including the hyoid bone, tongue muscle and the mandible. We demonstrate that in both species the forward displacement of the mental region derives from the arrangement of the tongue and hyoid bone, in order to cope with the relative horizontal narrowing of the oral cavity. Because humans and chimpanzees share this pattern of developmental integration, the different forms of mental prominence seen in some extinct hominins likely originate from equivalent ontogenetic constraints. Variations in this process could account for similar morphologies.

  18. Intravenous Thrombolysis for Stroke and Presumed Stroke in Human Immunodeficiency Virus-Infected Adults: A Retrospective, Multicenter US Study.

    Science.gov (United States)

    AbdelRazek, Mahmoud A; Gutierrez, Jose; Mampre, David; Cervantes-Arslanian, Anna; Ormseth, Cora; Haussen, Diogo; Thakur, Kiran T; Lyons, Jennifer L; Smith, Bryan R; O'Connor, Owen; Willey, Joshua Z; Mateen, Farrah J

    2018-01-01

    Human immunodeficiency virus (HIV) infection has been shown to increase both ischemic and hemorrhagic stroke risks, but there are limited data on the safety and outcomes of intravenous thrombolysis with tPA (tissue-type plasminogen activator) for acute ischemic stroke in HIV-infected patients. A retrospective chart review of intravenous tPA-treated HIV patients who presented with acute stroke symptoms was performed in 7 large inner-city US academic centers (various search years between 2000 and 2017). We collected data on HIV, National Institutes of Health Stroke Scale score, ischemic stroke risk factors, opportunistic infections, intravenous drug abuse, neuroimaging findings, and modified Rankin Scale score at last follow-up. We identified 33 HIV-infected patients treated with intravenous tPA (mean age, 51 years; 24 men), 10 of whom were stroke mimics. Sixteen of 33 (48%) patients had an HIV viral load less than the limit of detection, while 10 of 33 (30%) had a low CD4 count. The National Institutes of Health Stroke Scale score at presentation was 9, and mean time from symptom onset to tPA was 144 minutes (median, 159). The median modified Rankin Scale score for the 33-patient cohort was 1 and for the 23-patient actual stroke cohort was 2, measured at a median of 90 days poststroke symptom onset. Two patients had nonfatal hemorrhagic transformation (6%; 95% confidence interval, 1%-20%), both in the actual stroke group. Two patients had varicella zoster virus vasculitis of the central nervous system, 1 had meningovascular syphilis, and 7 other patients were actively using intravenous drugs (3 cocaine, 1 heroin, and 3 unspecified), none of whom had hemorrhagic transformation. Most HIV-infected patients treated with intravenous tPA for presumed and actual acute ischemic stroke had no complications, and we observed no fatalities. Stroke mimics were common, and thrombolysis seems safe in this group. We found no data to suggest an increased risk of intravenous tPA-related complications because of concomitant...
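    The 95% confidence interval quoted for hemorrhagic transformation (2 of 33 patients; 1%-20%) is consistent with an exact binomial (Clopper-Pearson) interval. A minimal sketch, assuming this exact method was used (the abstract does not name it):

    ```python
    import math

    def binom_cdf(k, n, p):
        """P(X <= k) for X ~ Binomial(n, p)."""
        return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

    def clopper_pearson(k, n, alpha=0.05):
        """Exact two-sided (1 - alpha) CI for a binomial proportion, via bisection."""
        lo, hi = 0.0, 1.0
        if k > 0:
            a, b = 0.0, 1.0
            for _ in range(100):  # P(X >= k) increases with p; find its alpha/2 root
                m = (a + b) / 2
                if 1 - binom_cdf(k - 1, n, m) < alpha / 2:
                    a = m
                else:
                    b = m
            lo = (a + b) / 2
        if k < n:
            a, b = 0.0, 1.0
            for _ in range(100):  # P(X <= k) decreases with p; find its alpha/2 root
                m = (a + b) / 2
                if binom_cdf(k, n, m) < alpha / 2:
                    b = m
                else:
                    a = m
            hi = (a + b) / 2
        return lo, hi

    # 2 hemorrhagic transformations among 33 tPA-treated patients
    lo, hi = clopper_pearson(2, 33)
    print(f"95% CI: {lo:.1%} to {hi:.1%}")  # roughly 0.7% to 20.2%
    ```

    Rounded to whole percentages, this reproduces the 1%-20% interval reported above.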

  19. Natural and Human-Induced Dynamics on Big Hickory Island, Florida

    Directory of Open Access Journals (Sweden)

    Tiffany M. Roberts Briggs

    2016-02-01

    Big Hickory Island, located in Lee County along the mixed-energy west Florida coast, experiences high long-term rates of shoreline recession, with much of the erosion concentrated along the central and southern portions of the island. In 2013, approximately 86,300 cubic meters of sand from an adjacent tidal inlet to the north were placed along 457 m of shoreline to restore the beach and dune system. In an effort to combat erosion, seven concrete king-pile groins with adjustable panels were constructed after completion of the beach nourishment. Natural and human-induced dynamics of Big Hickory Island are discussed through analysis of shoreline and morphologic change using historic aerial photographs and topographic and bathymetric field surveys of the recent beach erosion mitigation project. Although much of the long-term, anomalously high rate of erosion for the area is related to natural interchanges between the sand resources of the barrier islands and adjacent ebb tidal shoals, an additional reduction in sand supply is the result of human interventions updrift of Big Hickory over the last several decades. The coupled natural and anthropogenic influences are driving the coastal processes toward a different morphodynamic state than would have occurred under natural processes alone.

  20. Intravenous human immunoglobulins for refractory recurrent pericarditis: a systematic review of all published cases.

    Science.gov (United States)

    Imazio, Massimo; Lazaros, George; Picardi, Elisa; Vasileiou, Panagiotis; Carraro, Mara; Tousoulis, Dimitrios; Belli, Riccardo; Gaita, Fiorenzo

    2016-04-01

    Refractory recurrent pericarditis is a major clinical challenge after colchicine failure, especially in corticosteroid-dependent patients. Human intravenous immunoglobulins (IVIGs) have been proposed as a possible therapeutic option for these cases. The goal of this systematic review is to assess the efficacy and safety of IVIGs in this context. Studies reporting the use of IVIG for the treatment of recurrent pericarditis and published up to October 2014 were searched in several databases. All references found, upon initial assessment at title and abstract level for suitability, were subsequently retrieved as full reports for further appraisal. Among the 18 citations retrieved, 17 reports (4 case series and 13 single case reports, with an overall population of 30 patients) were included. The mean disease duration was 14 months and the mean number of recurrences before IVIG was 3. Approximately 47% of patients had idiopathic recurrent pericarditis, 10% had an infective cause, and the remainder had a systemic inflammatory disease. Nineteen of the 30 patients (63.3%) were on corticosteroids at IVIG commencement. IVIGs were generally administered at a dose of 400-500 mg/kg/day for 5 consecutive days, with repeated cycles according to the clinical response. Complications were uncommon (headache in ~3%) and not life-threatening. After a mean follow-up of approximately 33 months, recurrences occurred in 26.6% of cases after the first IVIG cycle, and 22 of the 30 patients (73.3%) were recurrence-free. Five patients (16.6%) were on corticosteroids at the end of the follow-up. IVIGs are rapidly acting, well tolerated, and efficacious steroid-sparing agents in refractory pericarditis.

  1. Usefulness of high-dose intravenous human immunoglobulins treatment for refractory recurrent pericarditis.

    Science.gov (United States)

    Moretti, Michele; Buiatti, Alessandra; Merlo, Marco; Massa, Laura; Fabris, Enrico; Pinamonti, Bruno; Sinagra, Gianfranco

    2013-11-01

    The management of refractory recurrent pericarditis is challenging. Previous clinical reports have noted a beneficial effect of high-dose intravenous human immunoglobulins (IvIgs) in isolated and systemic inflammatory disease-related forms. In this article, we retrospectively analyzed our clinical experience with IvIg therapy in a series of clinical cases of pericarditis refractory to conventional treatment. We retrospectively analyzed 9 patients (1994 to 2010) with refractory recurrent pericarditis who received high-dose IvIg as part of their medical treatment. Nonsteroidal anti-inflammatory drug (NSAID), steroid, or colchicine treatment was not discontinued during IvIg treatment. No patients had a history of autoimmune or connective tissue diseases. During an average period of 11 months from the first recurrence, patients had experienced a mean of 5 relapses before the first IvIg treatment. In 4 cases, patients showed complete clinical remission with no further relapse after the first IvIg cycle. Two patients experienced a single minor relapse, responsive to short-term nonsteroidal anti-inflammatory drugs. In 2 patients, we performed a second cycle of IvIg after a recurrence of pericarditis, with subsequent complete remission. One patient did not respond to 3 cycles of IvIg and subsequently underwent pericardial window and long-term immunosuppressive treatment. No major adverse effects were observed following IvIg administration in any of the cases. In conclusion, although the IvIg mode of action is still poorly understood in this setting, this treatment can be considered as an option in patients with recurrent pericarditis refractory to conventional medical treatment and, in our small series, proved to be effective in 8 of 9 cases.

  2. Systemic administration of antiretrovirals prior to exposure prevents rectal and intravenous HIV-1 transmission in humanized BLT mice.

    Directory of Open Access Journals (Sweden)

    Paul W Denton

    2010-01-01

    Successful antiretroviral pre-exposure prophylaxis (PrEP) for mucosal and intravenous HIV-1 transmission could reduce new infections among targeted high-risk populations including discordant couples, injection drug users, high-risk women and men who have sex with men. Targeted antiretroviral PrEP could be particularly effective at slowing the spread of HIV-1 if a single antiretroviral combination were found to be broadly protective across multiple routes of transmission. Therefore, we designed our in vivo preclinical study to systematically investigate whether rectal and intravenous HIV-1 transmission can be blocked by antiretrovirals administered systemically prior to HIV-1 exposure. We performed these studies using a highly relevant in vivo model of mucosal HIV-1 transmission, humanized Bone marrow/Liver/Thymus (BLT) mice. BLT mice are susceptible to HIV-1 infection via three major physiological routes of viral transmission: vaginal, rectal and intravenous. Our results show that BLT mice given systemic antiretroviral PrEP are efficiently protected from HIV-1 infection regardless of the route of exposure. Specifically, systemic antiretroviral PrEP with emtricitabine and tenofovir disoproxil fumarate prevented both rectal (Chi square = 8.6, df = 1, p = 0.003) and intravenous (Chi square = 13, df = 1, p = 0.0003) HIV-1 transmission. Our results indicate that antiretroviral PrEP has the potential to be broadly effective at preventing new rectal or intravenous HIV transmissions in targeted high-risk individuals. These in vivo preclinical findings provide strong experimental evidence supporting the potential clinical implementation of antiretroviral-based pre-exposure prophylactic measures to prevent the spread of HIV/AIDS.
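    The protection statistics above (Chi square = 8.6 and 13, df = 1) come from 2×2 contingency comparisons of infected versus uninfected counts in PrEP-treated and control mice. The group sizes are not reported in the abstract, so the counts below are hypothetical; a sketch of the uncorrected Pearson test for such a table:

    ```python
    import math

    def chi2_2x2(a, b, c, d):
        """Pearson chi-square for a 2x2 table [[a, b], [c, d]], no continuity correction."""
        n = a + b + c + d
        return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

    def chi2_p_1df(x):
        """Upper-tail p-value of the chi-square distribution with 1 df."""
        return math.erfc(math.sqrt(x / 2))

    # Hypothetical counts (NOT from the study): 1/12 PrEP-treated mice infected
    # versus 9/12 controls infected.
    chi2 = chi2_2x2(1, 11, 9, 3)
    print(round(chi2, 2), chi2_p_1df(chi2))
    ```

    With 1 degree of freedom, any chi-square value above 3.84 corresponds to p < 0.05, so statistics of 8.6 and 13 indicate strongly significant protection.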

  3. How They Move Reveals What Is Happening: Understanding the Dynamics of Big Events from Human Mobility Pattern

    Directory of Open Access Journals (Sweden)

    Jean Damascène Mazimpaka

    2017-01-01

    The context in which a moving object moves contributes to the movement pattern observed. Likewise, the movement pattern reflects the properties of the movement context. In particular, big events influence human mobility depending on the dynamics of the events. However, this influence has not been explored to understand big events. In this paper, we propose a methodology for learning about big events from human mobility patterns. The methodology involves extracting and analysing the stopping, approaching, and moving-away interactions between public transportation vehicles and the geographic context. The analysis is carried out at two different temporal granularity levels to discover global and local patterns. The results of evaluating this methodology on bus trajectories demonstrate that it can discover occurrences of big events from mobility patterns, roughly estimate the event start and end times, and reveal the temporal patterns of arrival and departure of event attendees. This knowledge can be usefully applied in transportation and event planning and management.

  4. Safety and pharmacokinetics of oral cannabidiol when administered concomitantly with intravenous fentanyl in humans.

    Science.gov (United States)

    Manini, Alex F; Yiannoulos, Georgia; Bergamaschi, Mateus M; Hernandez, Stephanie; Olmedo, Ruben; Barnes, Allan J; Winkel, Gary; Sinha, Rajita; Jutras-Aswad, Didier; Huestis, Marilyn A; Hurd, Yasmin L

    2015-01-01

    Cannabidiol (CBD) is hypothesized as a potential treatment for opioid addiction, with safety studies an important first step for medication development. We determined CBD safety and pharmacokinetics when administered concomitantly with a high-potency opioid in healthy subjects. This double-blind, placebo-controlled crossover study of CBD, coadministered with intravenous fentanyl, was conducted at the Clinical Research Center in Mount Sinai Hospital, a tertiary care medical center in New York City. Participants were healthy volunteers aged 21 to 65 years with prior opioid exposure, regardless of the route. Blood samples were obtained before and after 400 or 800 mg of CBD pretreatment, followed by a single intravenous fentanyl dose of 0.5 μg/kg (session 1) or 1.0 μg/kg (session 2). The primary outcome was the Systematic Assessment for Treatment Emergent Events (SAFTEE) to assess safety and adverse effects. CBD peak plasma concentration, time to reach peak plasma concentration (tmax), and area under the curve (AUC) were measured. SAFTEE data were similar between groups, without respiratory depression or cardiovascular complications during any test session. After low-dose CBD, tmax occurred at 3 and 1.5 hours in sessions 1 and 2, respectively. After high-dose CBD, tmax occurred at 3 and 4 hours in sessions 1 and 2, respectively. There were no significant differences in plasma CBD or cortisol (AUC P = NS) between sessions. Cannabidiol does not exacerbate adverse effects associated with intravenous fentanyl administration. Coadministration of CBD and opioids was safe and well tolerated. These data provide the foundation for future studies examining CBD as a potential treatment for opioid abuse.
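    The pharmacokinetic endpoints named above (peak concentration, tmax, AUC) reduce to simple operations on a concentration-time series; AUC is conventionally computed by the linear trapezoidal rule. A minimal sketch using hypothetical sampling data, not values from the study:

    ```python
    def pk_summary(times, conc):
        """Cmax, tmax and linear-trapezoidal AUC from a concentration-time profile."""
        cmax = max(conc)
        tmax = times[conc.index(cmax)]
        auc = sum((conc[i] + conc[i + 1]) / 2 * (times[i + 1] - times[i])
                  for i in range(len(times) - 1))
        return cmax, tmax, auc

    # Hypothetical plasma samples; times in hours, concentrations in ng/mL.
    cmax, tmax, auc = pk_summary([0, 1, 2, 4, 8], [0, 50, 80, 40, 10])
    print(cmax, tmax, auc)  # 80 2 310.0
    ```

    Each trapezoid averages two adjacent concentrations and multiplies by the sampling interval, so unevenly spaced samples (as in real PK studies) are handled naturally.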

  5. A Pilot Study Evaluating the Safety of Intravenously Administered Human Amnion Epithelial Cells for the Treatment of Hepatic Fibrosis

    Directory of Open Access Journals (Sweden)

    Rebecca Lim

    2017-08-01

    Liver cirrhosis is the 6th leading cause of death in adults aged 15-59 years in high-income countries. For many who progress to cirrhosis, the only prospect for survival is liver transplantation. While there is some indication that mesenchymal stem cells may be useful in reversing established liver fibrosis, there are limitations to their widespread use, namely their rarity, the need for extensive serial passaging and the associated potential for genomic instability and cellular senescence. To this end, we propose the use of allogeneic amnion epithelial cells. This clinical trial will assess the safety of intravenously delivered allogeneic human amnion epithelial cells (hAECs) in patients with compensated liver cirrhosis. It will also provide clinical data to inform phase 2 and 3 clinical trials, with the ultimate goal of developing hAECs as a therapeutic option for patients with cirrhosis who are at significant risk of disease progression. We will recruit 12 patients with compensated cirrhosis, based on their hepatic venous pressure gradient, for a dose escalation study. Patients will be closely monitored in the first 24 h post-infusion, then via daily telephone interviews until clinical assessment on day 5. Long-term follow-up will include standard liver tests, transient elastography and hepatic ultrasound. Ethics approval was obtained from Monash Health for this trial (16052A, "A Pilot Study Evaluating the Safety of Intravenously Administered Human Amnion Epithelial Cells for the Treatment of Liver Fibrosis, A First in Adult Human Study"). The trial will be conducted in accordance with Monash Health Human Ethics guidelines. Outcomes from this study will be disseminated in the form of conference presentations and submission to a peer-reviewed journal. This trial has been registered on the Australian and New Zealand Clinical Trials Registry (ACTRN12616000437460).

  6. Human factors/ergonomics implications of big data analytics: Chartered Institute of Ergonomics and Human Factors annual lecture.

    Science.gov (United States)

    Drury, Colin G

    2015-01-01

    In recent years, advances in sensor technology, connectedness and computational power have come together to produce huge data-sets. The treatment and analysis of these data-sets is known as big data analytics (BDA); the somewhat related term is data mining. Fields allied to human factors/ergonomics (HFE), e.g. statistics, have developed computational methods to derive meaningful, actionable conclusions from these databases. This paper examines BDA, often characterised by volume, velocity and variety, giving examples of successful BDA use. This examination provides context by considering examples of using BDA on human data, using BDA in HFE studies, and studies of how people perform BDA. Significant issues for HFE are the reliance of BDA on correlation rather than hypotheses and theory, the ethics of BDA and the use of HFE in data visualisation.

  7. Clearance of 131I-labeled murine monoclonal antibody from patients' blood by intravenous human anti-murine immunoglobulin antibody

    International Nuclear Information System (INIS)

    Stewart, J.S.; Sivolapenko, G.B.; Hird, V.; Davies, K.A.; Walport, M.; Ritter, M.A.; Epenetos, A.A.

    1990-01-01

    Five patients treated with intraperitoneal 131I-labeled mouse monoclonal antibody for ovarian cancer also received i.v. exogenous polyclonal human anti-murine immunoglobulin antibody. The pharmacokinetics of 131I-labeled monoclonal antibody in these patients were compared with those of 28 other patients receiving i.p. radiolabeled monoclonal antibody for the first time without exogenous human anti-murine immunoglobulin, and who had no preexisting endogenous human anti-murine immunoglobulin antibody. Patients receiving i.v. human anti-murine immunoglobulin antibody demonstrated a rapid clearance of 131I-labeled monoclonal antibody from their circulation. The mean maximum 131I blood content was 11.4% of the injected activity in patients receiving human anti-murine immunoglobulin antibody, compared to 23.3% in patients not given it. Intravenous human anti-murine immunoglobulin antibody decreased the radiation dose to bone marrow (from 131I-labeled monoclonal antibody in the vascular compartment) 4-fold. Following the injection of human anti-murine immunoglobulin antibody, 131I-monoclonal/human anti-murine immunoglobulin antibody immune complexes were rapidly transported to the liver. Antibody dehalogenation in the liver was rapid, with 87% of the injected 131I excreted within 5 days. Despite the efficient hepatic uptake of immune complexes, dehalogenation of monoclonal antibody was so rapid that the radiation dose to liver parenchyma from circulating 131I was decreased 4-fold rather than increased. All patients developed endogenous human anti-murine immunoglobulin antibody 2 to 3 weeks after treatment.

  8. High Efficiency of Human Normal Immunoglobulin for Intravenous Administration in a Patient with Kawasaki Syndrome Diagnosed in the Later Stages

    Directory of Open Access Journals (Sweden)

    Tatyana V. Sleptsova

    2016-01-01

    The article describes a case of late diagnosis of mucocutaneous lymph node syndrome (Kawasaki syndrome). At the beginning of therapy, the child had fever, conjunctivitis, stomatitis, rash, solid swelling of the hands and feet, and coronaritis with the development of aneurysms. The article describes the successful use of normal human immunoglobulin for intravenous administration at a dose of 2 g/kg body weight per course in combination with acetylsalicylic acid at a dose of 80 mg/kg per day. After 3 days of treatment, the rash disappeared; limb swelling and symptoms of conjunctivitis significantly reduced; and laboratory parameters of disease activity (erythrocyte sedimentation rate, C-reactive protein concentration) became normal. After 3 months, inflammation in the coronary arteries had stopped. After 6 months, regression of the coronary artery aneurysms was recorded. No adverse effects during the immunoglobulin therapy were observed.

  9. Scaling of City Attractiveness for Foreign Visitors through Big Data of Human Economical and Social Media Activity

    OpenAIRE

    Sobolevsky, Stanislav; Bojic, Iva; Belyi, Alexander; Sitko, Izabela; Hawelka, Bartosz; Arias, Juan Murillo; Ratti, Carlo

    2015-01-01

    Scientific studies investigating laws and regularities of human behavior are nowadays increasingly relying on the wealth of widely available digital information produced by human activity. In this paper we use big data created by three different aspects of this activity (i.e., bank card transactions, geotagged photographs and tweets) in Spain for quantifying city attractiveness for foreign visitors. An important finding of this paper is a strong superlinear scaling law of city attractiveness...
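    A superlinear scaling law means attractiveness grows as population raised to an exponent β > 1, and β is conventionally estimated by least squares in log-log space. A sketch on synthetic, noise-free data with an assumed exponent of 1.5 (the paper's actual exponent is not given in the truncated abstract):

    ```python
    import math

    def fit_power_law(x, y):
        """Least-squares fit of y = c * x**beta in log-log space; returns (c, beta)."""
        lx = [math.log(v) for v in x]
        ly = [math.log(v) for v in y]
        n = len(x)
        mx, my = sum(lx) / n, sum(ly) / n
        beta = (sum((a - mx) * (b - my) for a, b in zip(lx, ly))
                / sum((a - mx) ** 2 for a in lx))
        c = math.exp(my - beta * mx)
        return c, beta

    # Synthetic city populations and an attractiveness measure scaling as pop**1.5.
    pop = [1e4, 1e5, 1e6, 1e7]
    attract = [2 * p ** 1.5 for p in pop]
    c, beta = fit_power_law(pop, attract)
    print(c, beta)  # beta > 1 indicates superlinear scaling
    ```

    With real, noisy data the fitted β would carry a confidence interval; the log-log transform is what turns the power law into a straight-line fit.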

  10. Noninvasive quantification of human brain antioxidant concentrations after an intravenous bolus of vitamin C

    Science.gov (United States)

    Background: Until now, antioxidant based initiatives for preventing dementia have lacked a means to detect deficiency or measure pharmacologic effect in the human brain in situ. Objective: Our objective was to apply a novel method to measure key human brain antioxidant concentrations throughout the ...

  11. Intravenous Lipid Emulsion as an Antidote for the Treatment of Acute Poisoning: A Bibliometric Analysis of Human and Animal Studies.

    Science.gov (United States)

    Zyoud, Sa'ed H; Waring, W Stephen; Al-Jabi, Samah W; Sweileh, Waleed M; Rahhal, Belal; Awang, Rahmat

    2016-11-01

    In recent years, there has been increasing interest in the role of intravenous lipid formulations as potential antidotes in patients with severe cardiotoxicity caused by drug toxicity. The aim of this study was to conduct a comprehensive bibliometric analysis of all human and animal studies featuring lipid emulsion as an antidote for the treatment of acute poisoning. The Scopus database was searched on 5 February 2016 to analyse the research output related to intravenous lipid emulsion as an antidote for the treatment of acute poisoning. Research indicators used for analysis included total number of articles, date (year) of publication, total citations, value of the h-index, document types, countries of publication, journal names, collaboration patterns and institutions. A total of 594 articles were retrieved from the Scopus database for the period 1955-2015. The percentage share of global intravenous lipid emulsion research output showed that 85.86% of output appeared in 2006-2015, with an average growth in this field of 51 articles per year. The USA, United Kingdom (UK), France, Canada, New Zealand, Germany, Australia, China, Turkey and Japan accounted for 449 (75.6%) of all the publications. The total number of citations for all documents was 9,333, with an average of 15.7 citations per document. The h-index of the retrieved documents on lipid emulsion as an antidote for the treatment of acute poisoning was 49. The USA and the UK achieved the highest h-indices, 34 and 14, respectively. New Zealand produced the greatest proportion of documents with international collaboration (51.9%), followed by Australia (50%) and Canada (41.4%), out of the total number of publications for each country. In summary, we found an increase in the number of publications in the field of lipid emulsion after 2006. The results of this study demonstrate that the majority of publications in the field of lipid emulsion were published by high-income countries.
Researchers from
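
    The h-index figures quoted above follow directly from per-article citation counts: h is the largest number such that h articles each have at least h citations. A minimal sketch (toy citation counts, not the study's data):

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for i, c in enumerate(ranked, start=1):
        if c >= i:
            h = i          # the i-th ranked paper still has >= i citations
        else:
            break
    return h

# toy example: five papers with these citation counts
print(h_index([10, 8, 5, 4, 3]))   # 4
```

    The same computation, applied per country to the retrieved Scopus records, yields the country-level h-indices reported in the abstract.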

  12. Intravenous Leiomyomatosis

    African Journals Online (AJOL)

    Hemostasis was well achieved. The tumor weighed 6.7 kg. The postoperative course. Intravenous Leiomyomatosis. Narayanaswamy Mariyappa, Uday Kumar Manikyam1, Dinesh Krishnamurthy2, Preeti K., Yamini Agarwal, Prakar U. Departments of Obstetrics and Gynaecology, 1Pathology and 2Anaesthesia, Sri Devaraj ...

  13. Where are human subjects in Big Data research? The emerging ethics divide

    Directory of Open Access Journals (Sweden)

    Jacob Metcalf

    2016-06-01

    Full Text Available There are growing discontinuities between the research practices of data science and established tools of research ethics regulation. Some of the core commitments of existing research ethics regulations, such as the distinction between research and practice, cannot be cleanly exported from biomedical research to data science research. Such discontinuities have led some data science practitioners and researchers to move toward rejecting ethics regulations outright. These shifts occur at the same time as a proposal for major revisions to the Common Rule—the primary regulation governing human-subjects research in the USA—is under consideration for the first time in decades. We contextualize these revisions in long-running complaints about regulation of social science research and argue data science should be understood as continuous with social sciences in this regard. The proposed regulations are more flexible and scalable to the methods of non-biomedical research, yet problematically largely exclude data science methods from human-subjects regulation, particularly uses of public datasets. The ethical frameworks for Big Data research are highly contested and in flux, and the potential harms of data science research are unpredictable. We examine several contentious cases of research harms in data science, including the 2014 Facebook emotional contagion study and the 2016 use of geographical data techniques to identify the pseudonymous artist Banksy. To address disputes about application of human-subjects research ethics in data science, critical data studies should offer a historically nuanced theory of “data subjectivity” responsive to the epistemic methods, harms and benefits of data science and commerce.

  14. Characterizing Big Data Management

    Directory of Open Access Journals (Sweden)

    Rogério Rossi

    2015-06-01

    Full Text Available Big data management is a reality for an increasing number of organizations in many areas and represents a set of challenges involving big data modeling, storage and retrieval, analysis and visualization. However, technological resources, people and processes are crucial to facilitate the management of big data in any kind of organization, allowing information and knowledge from a large volume of data to support decision-making. Big data management can be supported by three dimensions: technology, people and processes. Hence, this article discusses these dimensions: the technological dimension, which relates to storage, analytics and visualization of big data; the human aspects of big data; and the process management dimension, which addresses big data management from both a technological and a business perspective.

  15. Long-term intravenous treatment of Pompe disease with recombinant human alpha-glucosidase from milk.

    NARCIS (Netherlands)

    Hout, J.M. van den; Kamphoven, J.H.; Winkel, L.P.; Arts, W.F.M.; Klerk, J.B.C. de; Loonen, M.C.B.; Vulto, A.G.; Cromme-Dijkhuis, A.H.; Weisglas-Kuperus, N.; Hop, W.C.J.; Hirtum, H. van; Diggelen, O.P. van; Boer, M. de; Kroos, M.A.; Doorn, P.A. van; Voort, E.I. van der; Sibbles, B.; Corven, E.J. van; Brakenhoff, J.P.; Hove, J.L. van; Smeitink, J.A.M.; Jong, G. de; Reuser, A.J.J.; Ploeg, A.T. van der

    2004-01-01

    OBJECTIVE: Recent reports warn that the worldwide cell culture capacity is insufficient to fulfill the increasing demand for human protein drugs. Production in milk of transgenic animals is an attractive alternative. Kilogram quantities of product per year can be obtained at relatively low costs,

  16. Long-term intravenous treatment of Pompe disease with recombinant human alpha-glucosidase from milk

    NARCIS (Netherlands)

    J.M.P. van den Hout (Johanna); B. Sibbles (Barbara); J.P. Brakenhoff (Just); A.H. Cromme-Dijkhuis (Adri); N. Weisglas-Kuperus (Nynke); A.J.J. Reuser (Arnold); M.A. Boer (Marijke); J.A.M. Smeitink (Jan); O.P. van Diggelen (Otto); E. van der Voort (Edwin); E.J.J.M. van Corven (Emiel); H. van Hirtum (Hans); J.H.J. Kamphoven (Joep); A.T. van der Ploeg (Ans); J. van Hove (Johan); W.F.M. Arts (Willem Frans); P.A. van Doorn (Pieter); J.B.C. de Klerk (Johannes); M.C.B. Loonen (Christa); A.G. Vulto (Arnold); M.A. Kroos (Marian); W.C.J. Hop (Wim); L.P.F. Winkel (Léon); G. de Jong (Gerard)

    2004-01-01

    textabstractOBJECTIVE: Recent reports warn that the worldwide cell culture capacity is insufficient to fulfill the increasing demand for human protein drugs. Production in milk of transgenic animals is an attractive alternative. Kilogram quantities of product per year can be

  17. Dosimetry of intravenously administered oxygen-15 labelled water in man: a model based on experimental human data from 21 subjects

    International Nuclear Information System (INIS)

    Smith, T.; Tong, C.; Lammertsma, A.A.; Butler, K.R.; Schnorr, L.; Watson, J.D.G.; Ramsay, S.; Clark, J.C.; Jones, T.

    1994-01-01

    Models based on uniform distribution of tracer in total body water underestimate the absorbed dose from H2(15)O because of the short half-life (2.04 min) of (15)O, which leads to non-uniform distribution of absorbed dose and also complicates the direct measurement of organ retention curves. However, organ absorbed doses can be predicted by the present kinetic model based on the convolution technique. The measured time course of arterial H2(15)O concentration following intravenous administration represents the input function to organs. The impulse response of a given organ is its transit time function, determined by blood flow and the partition of water between tissue and blood. Values of these two parameters were taken from the literature. Integrals of the arterial input function and organ transit time functions were used to derive integrals of organ retention functions (organ residence times). The latter were used with absorbed dose calculation software (MIRDOSE-2) to obtain estimates for 24 organs. From the mean values of organ absorbed doses, the effective dose equivalent (EDE) and effective dose (ED) were calculated. From measurements on 21 subjects, the average value for both EDE and ED was calculated to be 1.2 μSv/MBq, compared with a value of about 0.5 μSv/MBq predicted by uniform water distribution models. Based on the human data, a method of approximating H2(15)O absorbed dose values from body surface area is described. (orig.)
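
    The convolution step described above can be illustrated numerically. In this sketch (illustrative parameter values, not the paper's fitted ones), the organ impulse response is a mono-exponential washout with rate F/λ (flow over partition coefficient), and the organ residence time is the integral of the resulting retention curve:

```python
import numpy as np

def organ_retention(t, arterial, flow_per_g, partition):
    """Organ retention curve: arterial input function convolved with a
    mono-exponential impulse response whose washout rate is F/lambda.
    (Physical decay of 15-O, T1/2 = 2.04 min, is assumed to be carried
    by the measured arterial curve already.)"""
    dt = t[1] - t[0]
    k = flow_per_g / partition            # biological washout rate, 1/min
    h = np.exp(-k * t)                    # organ impulse response
    return np.convolve(arterial, h)[: len(t)] * dt

t = np.arange(0.0, 60.0, 0.01)            # minutes, well past full washout
arterial = np.zeros_like(t)
arterial[0] = 1.0 / 0.01                  # unit-area bolus as input function
ret = organ_retention(t, arterial, flow_per_g=0.8, partition=0.95)
tau = ret.sum() * 0.01                    # residence time, integral of ret
```

    With a unit-area bolus the residence time reduces to λ/F; with real data the measured arterial curve replaces the bolus, and the resulting residence times feed MIRDOSE-style absorbed-dose tables.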

  18. GH receptor signaling in skeletal muscle and adipose tissue in human subjects following exposure to an intravenous GH bolus

    DEFF Research Database (Denmark)

    Jørgensen, Jens O L; Jessen, Niels; Pedersen, Steen Bønløkke

    2006-01-01

    Growth hormone (GH) regulates muscle and fat metabolism, which impacts on body composition and insulin sensitivity, but the underlying GH signaling pathways have not been studied in vivo in humans. We investigated GH signaling in biopsies from muscle and abdominal fat obtained 30 (n = 3) or 60 (n...... = 3) min after an intravenous bolus of GH (0.5 mg) vs. saline in conjunction with serum sampling in six healthy males after an overnight fast. Expression of the following signal proteins were assayed by Western blotting: STAT5/p-STAT5, MAPK, and Akt/PKB. IRS-1-associated PI 3-kinase activity...... was measured by in vitro phosphorylation of PI. STAT5 DNA binding activity was assessed with EMSA, and the expression of IGF-I and SOCS mRNA was measured by real-time RT-PCR. GH induced a 52% increase in circulating FFA levels with peak values after 155 min (P = 0.03). Tyrosine-phosphorylated STAT5...

  19. Effects of Intravenous Administration of Human Umbilical Cord Blood Stem Cells in 3-Acetylpyridine-Lesioned Rats

    Directory of Open Access Journals (Sweden)

    Lucía Calatrava-Ferreras

    2012-01-01

    Full Text Available Cerebellar ataxias include a heterogeneous group of infrequent diseases characterized by lack of motor coordination caused by disturbances in the cerebellum and its associated circuits. Current therapies are based on the use of drugs that correct some of the molecular processes involved in their pathogenesis. Although these treatments have yielded promising results, there is not yet an effective therapy for these diseases. Cell replacement strategies using human umbilical cord blood mononuclear cells (HuUCBMCs) have emerged as a promising approach for restoration of function in neurodegenerative diseases. The aim of this work was to investigate the potential therapeutic activity of HuUCBMCs in the 3-acetylpyridine (3-AP) rat model of cerebellar ataxia. Intravenously administered HuUCBMCs reached the cerebellum and brain stem of 3-AP ataxic rats. Grafted cells reduced 3-AP-induced neuronal loss, promoted the activation of microglia in the brain stem, and prevented the overexpression of GFAP elicited by 3-AP in the cerebellum. In addition, HuUCBMCs upregulated the expression of proteins that are critical for cell survival, such as phospho-Akt and Bcl-2, in the cerebellum and brain stem of 3-AP ataxic rats. As all these effects were accompanied by a temporary but significant improvement in motor coordination, HuUCBMC grafts can be considered an effective cell replacement therapy for cerebellar disorders.

  20. Combining Human Computing and Machine Learning to Make Sense of Big (Aerial) Data for Disaster Response.

    Science.gov (United States)

    Ofli, Ferda; Meier, Patrick; Imran, Muhammad; Castillo, Carlos; Tuia, Devis; Rey, Nicolas; Briant, Julien; Millet, Pauline; Reinhard, Friedrich; Parkan, Matthew; Joost, Stéphane

    2016-03-01

    Aerial imagery captured via unmanned aerial vehicles (UAVs) is playing an increasingly important role in disaster response. Unlike satellite imagery, aerial imagery can be captured and processed within hours rather than days. In addition, the spatial resolution of aerial imagery is an order of magnitude higher than the imagery produced by the most sophisticated commercial satellites today. Both the United States Federal Emergency Management Agency (FEMA) and the European Commission's Joint Research Center (JRC) have noted that aerial imagery will inevitably present a big data challenge. The purpose of this article is to get ahead of this future challenge by proposing a hybrid crowdsourcing and real-time machine learning solution to rapidly process large volumes of aerial data for disaster response in a time-sensitive manner. Crowdsourcing can be used to annotate features of interest in aerial images (such as damaged shelters and roads blocked by debris). These human-annotated features can then be used to train a supervised machine learning system to learn to recognize such features in new unseen images. In this article, we describe how this hybrid solution for image analysis can be implemented as a module (i.e., Aerial Clicker) to extend an existing platform called Artificial Intelligence for Disaster Response (AIDR), which has already been deployed to classify microblog messages during disasters using its Text Clicker module and in response to Cyclone Pam, a category 5 cyclone that devastated Vanuatu in March 2015. The hybrid solution we present can be applied to both aerial and satellite imagery and has applications beyond disaster response such as wildlife protection, human rights, and archeological exploration. As a proof of concept, we recently piloted this solution using very high-resolution aerial photographs of a wildlife reserve in Namibia to support rangers with their wildlife conservation efforts (SAVMAP project, http://lasig.epfl.ch/savmap ). The
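
    The crowdsourcing-to-machine-learning hand-off described above can be sketched with a deliberately simplified stand-in: crowd-annotated image tiles (reduced to toy feature vectors here) train a nearest-centroid classifier that then labels unseen tiles. All names and feature values below are hypothetical, not the AIDR implementation:

```python
import numpy as np

def fit_centroids(X, y):
    """One mean feature vector per crowd-assigned label."""
    return {c: X[y == c].mean(axis=0) for c in set(y)}

def predict(centroids, x):
    """Label of the nearest class centroid."""
    return min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))

# toy 2-D tile features, e.g. (debris texture score, edge density),
# with labels supplied by crowd annotators
X = np.array([[0.9, 0.8], [0.8, 0.9], [0.1, 0.2], [0.2, 0.1]])
y = np.array(["damaged", "damaged", "intact", "intact"])
model = fit_centroids(X, y)
print(predict(model, np.array([0.85, 0.95])))   # damaged
```

    A production pipeline would replace the toy features with learned image representations, but the division of labor is the same: humans supply labels, the supervised model generalizes to unseen tiles.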

  1. A Big Video Manifesto

    DEFF Research Database (Denmark)

    Mcilvenny, Paul Bruce; Davidsen, Jacob

    2017-01-01

    For the last few years, we have witnessed a hype about the potential results and insights that quantitative big data can bring to the social sciences. The wonder of big data has moved into education, traffic planning, and disease control with a promise of making things better with big numbers...... and beautiful visualisations. However, we also need to ask what the tools of big data can do both for the Humanities and for more interpretative approaches and methods. Thus, we prefer to explore how the power of computation, new sensor technologies and massive storage can also help with video-based qualitative...

  2. HARNESSING BIG DATA VOLUMES

    Directory of Open Access Journals (Sweden)

    Bogdan DINU

    2014-04-01

    Full Text Available Big Data can revolutionize humanity. Hidden within the huge amounts and variety of the data we are creating, we may find information, facts, social insights and benchmarks that were once virtually impossible to find or simply inexistent. Large volumes of data allow organizations to tap in real time the full potential of all the internal or external information they possess. Big data calls for quick decisions and innovative ways to assist customers and society as a whole. Big data platforms and product portfolios will help customers harness the full value of big data volumes. This paper deals with technical and technological issues related to handling big data volumes in the Big Data environment.

  3. Characterization of ornidazole metabolites in human bile after intravenous doses by ultraperformance liquid chromatography/quadrupole time-of-flight mass spectrometry

    Directory of Open Access Journals (Sweden)

    Jiangbo Du

    2012-04-01

    Full Text Available Ultraperformance liquid chromatography/quadrupole time-of-flight mass spectrometry (UPLC/Q-TOF MS) was used to characterize ornidazole metabolites in human bile after intravenous doses. A liquid chromatography tandem mass spectrometry (LC–MS/MS) assay was developed for the determination of the bile level of ornidazole. Bile samples, collected from four patients with T-tube drainage after biliary tract surgery, were prepared by protein precipitation with acetonitrile before analysis. A total of 12 metabolites, including 10 novel metabolites, were detected and characterized. The metabolites of ornidazole in human bile were the products of hydrochloride (HCl) elimination, oxidative dechlorination, hydroxylation, sulfation, diastereoisomeric glucuronidation, substitution of the NO2 group or Cl atom by cysteine or N-acetylcysteine, and oxidative dechlorination followed by further carboxylation. The bile levels of ornidazole at 12 h after multiple intravenous infusions were well above its minimal inhibitory concentration for common strains of anaerobic bacteria.

  4. Big Society, Big Deal?

    Science.gov (United States)

    Thomson, Alastair

    2011-01-01

    Political leaders like to put forward guiding ideas or themes which pull their individual decisions into a broader narrative. For John Major it was Back to Basics, for Tony Blair it was the Third Way and for David Cameron it is the Big Society. While Mr. Blair relied on Lord Giddens to add intellectual weight to his idea, Mr. Cameron's legacy idea…

  5. Early Intravenous Delivery of Human Brain Stromal Cells Modulates Systemic Inflammation and Leads to Vasoprotection in Traumatic Spinal Cord Injury.

    Science.gov (United States)

    Badner, Anna; Vawda, Reaz; Laliberte, Alex; Hong, James; Mikhail, Mirriam; Jose, Alejandro; Dragas, Rachel; Fehlings, Michael

    2016-08-01

    Spinal cord injury (SCI) is a life-threatening condition with multifaceted complications and limited treatment options. In SCI, the initial physical trauma is closely followed by a series of secondary events, including inflammation and blood spinal cord barrier (BSCB) disruption, which further exacerbate injury. This secondary pathology is partially mediated by the systemic immune response to trauma, in which cytokine production leads to the recruitment/activation of inflammatory cells. Because early intravenous delivery of mesenchymal stromal cells (MSCs) has been shown to mitigate inflammation in various models of neurologic disease, this study aimed to assess these effects in a rat model of SCI (C7-T1, 35-gram clip compression) using human brain-derived stromal cells. Quantitative polymerase chain reaction for a human-specific DNA sequence was used to assess cell biodistribution/clearance and confirmed that only a small proportion (approximately 0.001%-0.002%) of cells are delivered to the spinal cord, with the majority residing in the lung, liver, and spleen. Intriguingly, although cell populations drastically declined in all aforementioned organs, there remained a persistent population in the spleen at 7 days. Furthermore, the cell infusion significantly increased splenic and circulating levels of interleukin-10, a potent anti-inflammatory cytokine. Through this suppression of the systemic inflammatory response, the cells also reduced acute spinal cord BSCB permeability, hemorrhage, and lesion volume. These early effects further translated into enhanced functional recovery and tissue sparing 10 weeks after SCI. This work demonstrates an exciting therapeutic approach whereby a minimally invasive cell-transplantation procedure can effectively reduce secondary damage after SCI through systemic immunomodulation. Central nervous system pericytes (perivascular stromal cells) have recently gained significant attention within the scientific community.
In addition to

  6. Rapid intravenous infusion of 20 mL/kg saline alters the distribution of perfusion in healthy supine humans.

    Science.gov (United States)

    Henderson, A C; Sá, R C; Barash, I A; Holverda, S; Buxton, R B; Hopkins, S R; Prisk, G K

    2012-03-15

    Rapid intravenous saline infusion, a model meant to replicate the initial changes leading to pulmonary interstitial edema, increases pulmonary arterial pressure in humans. We hypothesized that this would alter lung perfusion distribution. Six healthy subjects (29 ± 6 years) underwent magnetic resonance imaging to quantify perfusion using arterial spin labeling. Regional proton density was measured using a fast-gradient echo sequence, allowing blood delivered to the slice to be normalized for density and quantified in mL/min/g. Contributions from flow in large conduit vessels were minimized using a flow cutoff value (blood delivered > 35% maximum in mL/min/cm(3)) in order to obtain an estimate of blood delivered to the capillary bed (perfusion). Images were acquired supine at baseline, after infusion of 20 mL/kg saline, and after a short upright recovery period, for a single sagittal slice in the right lung during breath-holds at functional residual capacity. Thoracic fluid content measured by impedance cardiography was elevated post-infusion by up to 13%. Apart from changes in conduit vessels, there were no significant changes in perfusion in dependent lung following infusion (7.8 ± 1.9 mL/min/g baseline, 7.9 ± 2.0 post, 8.5 ± 2.1 recovery, p = 0.36). There were no significant changes in lung density. These data suggest that saline infusion increased perfusion to nondependent lung, consistent with an increase in intravascular pressures. Dependent lung may have been "protected" from increases in perfusion following infusion due to gravitational compression of the pulmonary vasculature.
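
    The conduit-vessel cutoff described above is a simple thresholding operation; a minimal sketch with toy voxel values (the 35% figure is from the abstract, everything else is illustrative):

```python
import numpy as np

def capillary_perfusion(blood_delivered, cutoff_frac=0.35):
    """Exclude voxels dominated by large conduit vessels: any voxel
    delivering more than cutoff_frac of the maximum is masked out,
    leaving an estimate of blood delivered to the capillary bed."""
    keep = blood_delivered <= cutoff_frac * blood_delivered.max()
    return np.where(keep, blood_delivered, np.nan), keep

# toy 2x2 map in mL/min/cm^3; the 10.0 voxel stands in for a conduit vessel
vals = np.array([[1.0, 2.0], [10.0, 3.0]])
perf, keep = capillary_perfusion(vals)
print(int(keep.sum()))   # 3 voxels survive the 35%-of-maximum cutoff
```

    Summary statistics over the surviving voxels then give the per-region perfusion values the abstract reports.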

  7. Near Real Time Analytics of Human Sensor Networks in the Realm of Big Data

    Science.gov (United States)

    Aulov, O.; Halem, M.

    2012-12-01

    With the prolific development of social media, emergency responders have an increasing interest in harvesting social media from outlets such as Flickr, Twitter, and Facebook in order to assess the scale and specifics of extreme events including wildfires, earthquakes, terrorist attacks, oil spills, etc. A number of experimental platforms have successfully been implemented to demonstrate the utilization of social media data in extreme events, including Twitter Earthquake Detector, which relied on tweets for earthquake monitoring; AirTwitter, which used tweets for air quality reporting; and our previous work, using Flickr data as boundary value forcings to improve the forecast of oil beaching in the aftermath of the Deepwater Horizon oil spill. The majority of these platforms addressed a narrow, specific type of emergency and harvested data from a particular outlet. We demonstrate an interactive framework for monitoring, mining and analyzing a plethora of heterogeneous social media sources for a diverse range of extreme events. Our framework consists of three major parts: a real-time social media aggregator, a data processing and analysis engine, and a web-based visualization and reporting tool. The aggregator gathers tweets, Facebook comments from fan pages, Google+ posts, forum discussions, blog posts (such as LiveJournal and Blogger.com), images from photo-sharing platforms (such as Flickr and Picasa), videos from video-sharing platforms (YouTube, Vimeo), and so forth. The data processing and analysis engine pre-processes the aggregated information and annotates it with geolocation and sentiment information. In many cases, the metadata of a social media post does not contain geolocation information; however, a human reader can easily guess from the body of the text what location is discussed. We automate this task using Named Entity Recognition (NER) algorithms and a gazetteer service. The visualization and reporting tool provides a web-based, user
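
    The NER-plus-gazetteer step described above can be sketched with a deliberately simplified stand-in: a hard-coded gazetteer and plain string matching in place of a trained NER model and a gazetteer service. The place names and coordinates below are illustrative only:

```python
import re

# Toy gazetteer (place name -> approximate lat/lon); a production system
# would combine a trained NER model with a full gazetteer service.
GAZETTEER = {
    "New Orleans": (29.95, -90.07),
    "Baton Rouge": (30.45, -91.19),
}

def geolocate(text):
    """Return (place, coords) pairs for gazetteer names found in the text."""
    hits = []
    for place, coords in GAZETTEER.items():
        if re.search(r"\b" + re.escape(place) + r"\b", text):
            hits.append((place, coords))
    return hits

post = "Oil sighted on the beach near New Orleans this morning."
print(geolocate(post))   # [('New Orleans', (29.95, -90.07))]
```

    Real NER first extracts candidate location spans (including names absent from any fixed list), and the gazetteer then resolves each span to coordinates; the sketch collapses both steps into one lookup.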

  8. A data analysis framework for biomedical big data: Application on mesoderm differentiation of human pluripotent stem cells.

    Directory of Open Access Journals (Sweden)

    Benjamin Ulfenborg

    Full Text Available The development of high-throughput biomolecular technologies has resulted in generation of vast omics data at an unprecedented rate. This is transforming biomedical research into a big data discipline, where the main challenges relate to the analysis and interpretation of data into new biological knowledge. The aim of this study was to develop a framework for biomedical big data analytics, and apply it for analyzing transcriptomics time series data from early differentiation of human pluripotent stem cells towards the mesoderm and cardiac lineages. To this end, transcriptome profiling by microarray was performed on differentiating human pluripotent stem cells sampled at eleven consecutive days. The gene expression data was analyzed using the five-stage analysis framework proposed in this study, including data preparation, exploratory data analysis, confirmatory analysis, biological knowledge discovery, and visualization of the results. Clustering analysis revealed several distinct expression profiles during differentiation. Genes with an early transient response were strongly related to embryonic- and mesendoderm development, for example CER1 and NODAL. Pluripotency genes, such as NANOG and SOX2, exhibited substantial downregulation shortly after onset of differentiation. Rapid induction of genes related to metal ion response, cardiac tissue development, and muscle contraction were observed around day five and six. Several transcription factors were identified as potential regulators of these processes, e.g. POU1F1, TCF4 and TBP for muscle contraction genes. Pathway analysis revealed temporal activity of several signaling pathways, for example the inhibition of WNT signaling on day 2 and its reactivation on day 4. This study provides a comprehensive characterization of biological events and key regulators of the early differentiation of human pluripotent stem cells towards the mesoderm and cardiac lineages. 
The proposed analysis framework can be

  9. A data analysis framework for biomedical big data: Application on mesoderm differentiation of human pluripotent stem cells.

    Science.gov (United States)

    Ulfenborg, Benjamin; Karlsson, Alexander; Riveiro, Maria; Améen, Caroline; Åkesson, Karolina; Andersson, Christian X; Sartipy, Peter; Synnergren, Jane

    2017-01-01

    The development of high-throughput biomolecular technologies has resulted in generation of vast omics data at an unprecedented rate. This is transforming biomedical research into a big data discipline, where the main challenges relate to the analysis and interpretation of data into new biological knowledge. The aim of this study was to develop a framework for biomedical big data analytics, and apply it for analyzing transcriptomics time series data from early differentiation of human pluripotent stem cells towards the mesoderm and cardiac lineages. To this end, transcriptome profiling by microarray was performed on differentiating human pluripotent stem cells sampled at eleven consecutive days. The gene expression data was analyzed using the five-stage analysis framework proposed in this study, including data preparation, exploratory data analysis, confirmatory analysis, biological knowledge discovery, and visualization of the results. Clustering analysis revealed several distinct expression profiles during differentiation. Genes with an early transient response were strongly related to embryonic- and mesendoderm development, for example CER1 and NODAL. Pluripotency genes, such as NANOG and SOX2, exhibited substantial downregulation shortly after onset of differentiation. Rapid induction of genes related to metal ion response, cardiac tissue development, and muscle contraction were observed around day five and six. Several transcription factors were identified as potential regulators of these processes, e.g. POU1F1, TCF4 and TBP for muscle contraction genes. Pathway analysis revealed temporal activity of several signaling pathways, for example the inhibition of WNT signaling on day 2 and its reactivation on day 4. This study provides a comprehensive characterization of biological events and key regulators of the early differentiation of human pluripotent stem cells towards the mesoderm and cardiac lineages. 
The proposed analysis framework can be used to structure
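
    The clustering step of the framework above can be sketched with a toy k-means over simulated 11-day expression profiles: one early-transient group (like CER1/NODAL) and one steadily rising group. All values are simulated, not the study's data; for determinism the centers are seeded with one profile from each group:

```python
import numpy as np

rng = np.random.default_rng(0)

def kmeans(profiles, init_idx, iters=50):
    """Plain k-means over expression profiles (genes x timepoints)."""
    centers = profiles[list(init_idx)].copy()
    labels = np.zeros(len(profiles), dtype=int)
    for _ in range(iters):
        # squared distance of every profile to every center
        d = ((profiles[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(1)
        for j in range(len(centers)):
            if (labels == j).any():
                centers[j] = profiles[labels == j].mean(0)
    return labels

days = np.arange(11)
early = np.exp(-days / 2.0) + rng.normal(0, 0.05, (20, 11))   # transient
late = days / 10.0 + rng.normal(0, 0.05, (20, 11))            # rising
profiles = np.vstack([early, late])
labels = kmeans(profiles, init_idx=(0, 39))   # seed one from each group
```

    With real microarray data, the recovered cluster profiles are then handed to the biological knowledge discovery stage (enrichment and pathway analysis) rather than inspected directly.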

  10. Big data, big responsibilities

    Directory of Open Access Journals (Sweden)

    Primavera De Filippi

    2014-01-01

    Full Text Available Big data refers to the collection and aggregation of large quantities of data produced by and about people, things or the interactions between them. With the advent of cloud computing, specialised data centres with powerful computational hardware and software resources can be used for processing and analysing a humongous amount of aggregated data coming from a variety of different sources. The analysis of such data is all the more valuable to the extent that it allows for specific patterns to be found and new correlations to be made between different datasets, so as to eventually deduce or infer new information, as well as to potentially predict behaviours or assess the likelihood for a certain event to occur. This article will focus specifically on the legal and moral obligations of online operators collecting and processing large amounts of data, to investigate the potential implications of big data analysis on the privacy of individual users and on society as a whole.

  11. Big Data for Global History: The Transformative Promise of Digital Humanities

    Directory of Open Access Journals (Sweden)

    Joris van Eijnatten

    2013-12-01

    Full Text Available This article discusses the promises and challenges of digital humanities methodologies for historical inquiry. In order to address the great outstanding question of whether big data will re-invigorate macro-history, a number of research projects are described that use cultural text mining to explore big data repositories of digitised newspapers. The advantages of quantitative analysis, visualisation and named entity recognition in both exploration and analysis are illustrated in the study of public debates on drugs, drug trafficking, and drug users in the early twentieth century (WAHSP), the comparative study of discourses about heredity, genetics, and eugenics in Dutch and German newspapers, 1863-1940 (BILAND), and the study of trans-Atlantic discourses (Translantis). While many technological and practical obstacles remain, advantages over traditional hermeneutic methodology are found in heuristics, analytics, quantitative trans-disciplinarity, and reproducibility, offering a quantitative and trans-national perspective on the history of mentalities.

  12. Metabolic coronary-flow regulation and exogenous nitric oxide in human coronary artery disease: assessment by intravenous administration of nitroglycerin

    NARCIS (Netherlands)

    Kal, J. E.; van Wezel, H. B.; Porsius, M.; Vergroesen, I.; Spaan, J. A.

    2000-01-01

    We sought to evaluate the effect of intravenous administration of the nitric oxide--donor substance nitroglycerin (NTG) on metabolic coronary-flow regulation in patients with coronary artery disease (CAD). In 12 patients with stable CAD, we measured coronary sinus blood flow and myocardial oxygen

  13. Big science

    CERN Multimedia

    Nadis, S

    2003-01-01

    " "Big science" is moving into astronomy, bringing large experimental teams, multi-year research projects, and big budgets. If this is the wave of the future, why are some astronomers bucking the trend?" (2 pages).

  14. Big data challenges in decoding cortical activity in a human with quadriplegia to inform a brain computer interface.

    Science.gov (United States)

    Friedenberg, David A; Bouton, Chad E; Annetta, Nicholas V; Skomrock, Nicholas; Mingming Zhang; Schwemmer, Michael; Bockbrader, Marcia A; Mysiw, W Jerry; Rezai, Ali R; Bresler, Herbert S; Sharma, Gaurav

    2016-08-01

    Recent advances in Brain Computer Interfaces (BCIs) have created hope that one day paralyzed patients will be able to regain control of their paralyzed limbs. As part of an ongoing clinical study, we have implanted a 96-electrode Utah array in the motor cortex of a paralyzed human. The array generates almost 3 million data points from the brain every second. This presents several big data challenges for developing algorithms, which must not only process the data in real time (for the BCI to be responsive) but also be robust to temporal variations and non-stationarities in the sensor data. We demonstrate an algorithmic approach to analyze such data and present a novel method to evaluate such algorithms. We present our methodology with examples of decoding human brain data in real-time to inform a BCI.

  15. Intraarticular and intravenous administration of 99MTc-HMPAO-labeled human mesenchymal stem cells (99MTC-AH-MSCS): In vivo imaging and biodistribution

    International Nuclear Information System (INIS)

    Meseguer-Olmo, Luis; Montellano, Antonio Jesús; Martínez, Teresa; Martínez, Carlos M.; Revilla-Nuin, Beatriz; Roldán, Marta; Mora, Cristina Fuente; López-Lucas, Maria Dolores; Fuente, Teodomiro

    2017-01-01

    Introduction: Therapeutic application of intravenously administered (IV) human bone marrow-derived mesenchymal stem cells (ahMSCs) appears to have as its main drawback the massive retention of cells in the lung parenchyma, questioning the suitability of this route of administration. Intraarticular administration (IAR) could be considered as an alternative route for therapy in degenerative and traumatic joint lesions. Our work is outlined as a comparative study of the biodistribution of 99mTc-ahMSCs after IV and IAR administration, via scintigraphic study in an animal model. Methods: An isolated primary culture of adult human mesenchymal stem cells was labeled with 99mTc-HMPAO for scintigraphic study of in vivo distribution after intravenous and intra-articular (knee) administration in rabbits. Results: IV administration of radiolabeled ahMSCs showed the bulk of radioactivity in the lung parenchyma, while IAR images showed activity mainly in the injected cavity and complete absence of uptake in the pulmonary bed. Conclusions: Our study shows that IAR administration overcomes the limitations of IV injection, in particular those related to cell destruction in the lung parenchyma. After IAR administration, cells remain within the joint cavity, as expected given their size and adhesion properties. Advances in knowledge: Intra-articular administration of adult human mesenchymal stem cells could be a suitable route for therapeutic effect in joint lesions. Implications for patient care: Local administration of adult human mesenchymal stem cells could improve their therapeutic effects, minimizing side effects in patients.

  16. Crystallization and preliminary crystallographic analysis of the fourth FAS1 domain of human BigH3

    International Nuclear Information System (INIS)

    Yoo, Ji-Ho; Kim, EungKweon; Kim, Jongsun; Cho, Hyun-Soo

    2007-01-01

    The crystallization and X-ray diffraction analysis of the fourth FAS1 domain of human BigH3 are reported. The protein BigH3 is a cell-adhesion molecule induced by transforming growth factor-β (TGF-β). It consists of four homologous repeat domains known as FAS1 domains; mutations in these domains have been linked to corneal dystrophy. The fourth FAS1 domain was expressed in Escherichia coli B834 (DE3) (a methionine auxotroph) and purified by DEAE anion-exchange and gel-filtration chromatography. The FAS1 domain was crystallized using the vapour-diffusion method. A SAD diffraction data set was collected to a resolution of 2.5 Å at 100 K. The crystal belonged to space group P6₁ or P6₅ and had two molecules per asymmetric unit, with unit-cell parameters a = b = 62.93, c = 143.27 Å, α = β = 90.0, γ = 120.0°.

  17. Big universe, big data

    DEFF Research Database (Denmark)

    Kremer, Jan; Stensbo-Smidt, Kristoffer; Gieseke, Fabian Cristian

    2017-01-01

    Astrophysics and cosmology are rich with data. The advent of wide-area digital cameras on large aperture telescopes has led to ever more ambitious surveys of the sky. Data volumes of entire surveys a decade ago can now be acquired in a single night, and real-time analysis is often desired. Thus, modern astronomy requires big data know-how; in particular, it demands highly efficient machine learning and image analysis algorithms. But scalability is not the only challenge: astronomy applications touch several current machine learning research questions, such as learning from biased data and dealing with label and measurement noise. We argue that this makes astronomy a great domain for computer science research, as it pushes the boundaries of data analysis. In the following, we will present this exciting application area for data scientists. We will focus on exemplary results and discuss main challenges...

  19. Intravenously injected human multilineage-differentiating stress-enduring cells selectively engraft into mouse aortic aneurysms and attenuate dilatation by differentiating into multiple cell types.

    Science.gov (United States)

    Hosoyama, Katsuhiro; Wakao, Shohei; Kushida, Yoshihiro; Ogura, Fumitaka; Maeda, Kay; Adachi, Osamu; Kawamoto, Shunsuke; Dezawa, Mari; Saiki, Yoshikatsu

    2018-02-21

    Aortic aneurysms result from the degradation of multiple components represented by endothelial cells, vascular smooth muscle cells, and elastic fibers. Cells that can replenish these components are desirable for cell-based therapy. Intravenously injected multilineage-differentiating stress-enduring (Muse) cells, endogenous nontumorigenic pluripotent-like stem cells, reportedly integrate into the damaged site and repair the tissue through spontaneous differentiation into tissue-compatible cells. We evaluated the therapeutic efficacy of Muse cells in a murine aortic aneurysm model. Human bone marrow Muse cells, isolated as stage-specific embryonic antigen-3+ cells from bone marrow mesenchymal stem cells, or non-Muse cells (stage-specific embryonic antigen-3- cells in mesenchymal stem cells), bone marrow mesenchymal stem cells, or vehicle was intravenously injected at day 0, day 7, and 2 weeks (20,000 cells/injection) after inducing aortic aneurysms by periaortic incubation of CaCl2 and elastase in severe combined immunodeficient mice. At 8 weeks, infusion of human Muse cells attenuated aneurysm dilation, and the aneurysmal size in the Muse group corresponded to approximately 62.5%, 55.6%, and 45.6% of that in the non-Muse, mesenchymal stem cell, and vehicle groups, respectively. Multiphoton laser confocal microscopy revealed that infused Muse cells migrated into aneurysmal tissue from the adventitial side and penetrated toward the luminal side. Histologic analysis demonstrated robust preservation of elastic fibers and spontaneous differentiation into endothelial cells and vascular smooth muscle cells. After intravenous injection, Muse cells homed and expanded to the aneurysm from the adventitial side. Subsequently, Muse cells differentiated spontaneously into vascular smooth muscle cells and endothelial cells, and elastic fibers were preserved. These Muse cell features together led to substantial attenuation of aneurysmal dilation.

  20. Age diminishes the testicular steroidogenic response to repeated intravenous pulses of recombinant human LH during acute GnRH-receptor blockade in healthy men.

    Science.gov (United States)

    Veldhuis, Johannes D; Veldhuis, Nathan J D; Keenan, Daniel M; Iranmanesh, Ali

    2005-04-01

    Testosterone (Te) concentrations fall gradually in healthy aging men. Postulated mechanisms include relative failure of gonadotropin-releasing hormone (GnRH), luteinizing hormone (LH), and/or gonadal Te secretion. Available methods to test Leydig cell Te production include pharmacological stimulation with human chorionic gonadotropin (hCG). We reasoned that physiological lutropic signaling could be mimicked by pulsatile infusion of recombinant human (rh) LH during acute suppression of LH secretion. To this end, we studied eight young (ages 19-30 yr) and seven older (ages 61-73 yr) men in an experimental paradigm comprising 1) inhibition of overnight LH secretion with a potent selective GnRH-receptor antagonist (ganirelix, 2 mg sc), 2) intravenous infusion of consecutive pulses of rh LH (50 IU every 2 h), and 3) chemiluminometric assay of LH and Te concentrations sampled every 10 min for 26 h. Statistical analyses revealed that 1) ganirelix suppressed LH and Te equally (> 75% median inhibition) in young and older men, 2) infused LH pulse profiles did not differ by age, and 3) successive intravenous pulses of rh LH increased concentrations of free Te (ng/dl) to 4.6 +/- 0.38 (young) and 2.1 +/- 0.14 (older; P physiological LH pulses unmasks significant impairment of short-term Leydig cell steroidogenesis in aging men. Whether more prolonged pulsatile LH stimulation would normalize this inferred defect is unknown.

  1. LLNL's Big Science Capabilities Help Spur Over $796 Billion in U.S. Economic Activity Sequencing the Human Genome

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, Jeffrey S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-07-28

    LLNL’s successful history of taking on big science projects spans beyond national security and has helped create billions of dollars per year in new economic activity. One example is LLNL’s role in helping sequence the human genome. Over $796 billion in new economic activity in over half a dozen fields has been documented since LLNL successfully completed this Grand Challenge.

  2. Big projects, big problems.

    Science.gov (United States)

    Zeide, B

    1994-11-01

    Our environmental concerns prompt launching large monitoring programs. Examining the history and accomplishments of similar endeavors is the best way to avoid errors. One lesson taught by the oldest and largest survey of national renewable resources in the United States, the Forest Inventory and Analysis Program, is that the program itself is not capable of learning from its errors. Among other problems that beset big programs are unrealistic promises. It is not possible to inventory "every animal and plant species in the United States and their habitats" as the newly created National Biological Survey vows to do. Even if it were possible, this would hardly help to attain the ultimate goal of the Survey, survival of all species. In his request to fund the Survey, Secretary of the Interior, Bruce Babbitt, compared the conflict between economic development and environmental integrity with train wrecks. This metaphor is as brilliant as it is deceitful. Its brilliance, attested by the success with the press and legislature, is in a vivid and blithe image suggesting that, given sufficient information, the conflict could be averted. After all, railroad accidents are rare and avoidable exceptions. This hopeful situation cannot be honestly compared with the plight of our environment. The crucial piece of information - that there is no spare track for economic development - is readily available. Our population and economic growth take place in the same space that has already been fully occupied by other species. To be trustworthy, monitoring programs should face the reality that development necessitates environmental degradation.

  3. Kinetics of intravenous radiographic contrast medium injections as used on CT: simulation with time delay differential equations in a basic human cardiovascular multicompartment model.

    Science.gov (United States)

    Violon, D

    2012-12-01

    To develop a multicompartment model of only essential human body components that predicts the contrast medium concentration vs time curve in a chosen compartment after an intravenous injection, and to show that the model can be used to adequately time contrast-enhanced CT series. A system of linked time delay differential equations, rather than ordinary differential equations, described the model and was solved with a Matlab program (Matlab v. 6.5; The Mathworks, Inc., Natick, MA). All the injection and physiological parameters could be modified to cope with normal or pathological situations. In vivo time-concentration curves from the literature were recalculated to validate the model. The recalculated contrast medium time-concentration curves and parameters are given. The results of the statistical analysis of the study findings are expressed as the median prediction error and the median absolute prediction error values for both the time delay and ordinary differential equation systems; these are situated well below the generally accepted maximum 20% limit. The presented program correctly predicts the time-concentration curve of an intravenous contrast medium injection and, consequently, allows an individually tailored approach to CT examinations with optimised use of the injected contrast medium volume, as long as time delay rather than ordinary differential equations are used. The presented program offers good preliminary knowledge of the time-contrast medium concentration curve after any intravenous injection, allowing adequate timing of a CT examination, as required by the short scan times of present-day scanners. The injected volume of contrast medium can be tailored to the individual patient with no more contrast medium than is strictly needed.
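    The delay-differential approach described above can be illustrated with a toy simulation. This is a minimal sketch, not the paper's Matlab model: the two compartments, rate constants, transit delay, and injection parameters below are all illustrative assumptions.

    ```python
    # Toy delay-compartment model: contrast injected at a peripheral vein
    # reaches the central (plasma) compartment only after a transit delay
    # tau, which is what distinguishes a time delay differential equation
    # from an ordinary one. All parameter values are invented.

    def simulate(duration=300.0, dt=0.1, tau=12.0,
                 inj_rate=4.0, inj_end=20.0,       # constant-rate injection for 20 s
                 k_elim=0.02, k12=0.05, k21=0.03,  # elimination / exchange rates (1/s)
                 v_central=5.0):                   # central distribution volume
        n = int(duration / dt)
        c1 = [0.0] * n   # central (plasma) concentration
        c2 = [0.0] * n   # peripheral compartment concentration
        for i in range(n - 1):
            # delayed input: contrast injected at time t - tau arrives now
            t_delayed = i * dt - tau
            inflow = inj_rate / v_central if 0.0 <= t_delayed < inj_end else 0.0
            dc1 = inflow - (k_elim + k12) * c1[i] + k21 * c2[i]
            dc2 = k12 * c1[i] - k21 * c2[i]
            c1[i + 1] = c1[i] + dt * dc1   # forward-Euler step
            c2[i + 1] = c2[i] + dt * dc2
        return c1

    curve = simulate()
    t_peak = max(range(len(curve)), key=curve.__getitem__) * 0.1
    ```

    Because of the delay term, enhancement in the observed compartment cannot peak before the transit delay has elapsed, which is exactly the behaviour used to time a CT series.
    
    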

  4. Human neuroblastoma cell growth in xenogeneic hosts: comparison of T cell-deficient and NK-deficient hosts, and subcutaneous or intravenous injection routes.

    Science.gov (United States)

    Turner, W J; Chatten, J; Lampson, L A

    1990-04-01

    We have examined two features of neuroblastoma cells that had not been well-characterized in a xenogeneic model: The cells display unusual immunologic properties in other experimental systems, and the original tumors display widespread and characteristic patterns of metastasis. To determine the most appropriate immunodeficient host for primary tumor growth, T cell-deficient nude mice, NK-deficient beige mice, beige-nudes, and controls were injected with the well-characterized line CHP-100. To define the pattern of tumor spread, complete autopsies were performed following subcutaneous, intraperitoneal and intravenous injections. CHP-100 consistently formed subcutaneous tumors in T cell-deficient mice (nude and beige-nude), but not in T cell-competent mice (beige, heterozygous nu/+ and bg/+, or wild-type). The growth rate and final size of the subcutaneous tumors were not greater in beige-nudes than in nudes. All mice showed early CHP-100 cell death after subcutaneous injection; the nature of the immunodeficiency was more relevant for the surviving subpopulation. Widespread dissemination was seen following intravenous injection, particularly in beige-nudes. Aspects of the growth patterns were appropriate to the tumor of origin. The behavior in immunodeficient mice suggests that T cells can play a role in controlling the growth of these cells; the next steps will be to define the effector mechanisms, and to determine if they can be exploited for human patients. The hematogenous spread following intravenous injection suggests that insights into the control of blood-borne tumor may also come from further study of this model.

  5. Comparative methods of quantifying fecal neutral sterols in rats and humans after intravenous [14C]-, [3H] or [2H]cholesterol labeling.

    Science.gov (United States)

    Ferezou, J; Chevallier, F

    1982-12-13

    Several methods used to estimate the fecal elimination of neutral sterols and of cholesterol having a plasmatic origin (called 'excreted cholesterol') were compared in rats and humans according to the tracer intravenously administered: [14C]-, [1,2-3H]- or octadeuterated cholesterol. In both species, octadeuterated cholesterol had no isotopic effect, and the chance occurrence of epicoprostanol in fecal sterols induced an error in the calculation of the fecal excretion of cholesterol. In humans, the use of [1,2-3H]cholesterol appeared to be inaccurate in measuring the fecal flows of cholesterol, because of a loss of 3H radioactivity during the bacterial transformation of cholesterol in the digestive tract. Consequently, the reference method needed to calculate the proportion of excreted cholesterol in fecal cholesterol consisted of dividing the isotopic concentration measured in purified fecal cholesterol by that measured in the appropriate plasma cholesterol sample.
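    The reference calculation in the last sentence is a simple ratio; the sketch below works a hypothetical example (the function name and all numbers are invented for illustration, not taken from the study):

    ```python
    # Fraction of fecal cholesterol of plasma origin ("excreted cholesterol"):
    # isotopic concentration of purified fecal cholesterol divided by that of
    # the appropriate plasma cholesterol sample (same units, e.g. dpm/mg).

    def excreted_fraction(fecal_isotopic_conc, plasma_isotopic_conc):
        return fecal_isotopic_conc / plasma_isotopic_conc

    # Hypothetical values: 120 dpm/mg in fecal cholesterol, 400 dpm/mg in plasma
    frac = excreted_fraction(fecal_isotopic_conc=120.0, plasma_isotopic_conc=400.0)
    # i.e. 30% of fecal cholesterol would derive from plasma in this example
    ```
    
    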

  6. Big data

    DEFF Research Database (Denmark)

    Madsen, Anders Koed; Flyverbom, Mikkel; Hilbert, Martin

    2016-01-01

    The claim that big data can revolutionize strategy and governance in the context of international relations is increasingly hard to ignore. Scholars of international political sociology have mainly discussed this development through the themes of security and surveillance. The aim of this paper is to outline a research agenda that can be used to raise a broader set of sociological and practice-oriented questions about the increasing datafication of international relations and politics. First, it proposes a way of conceptualizing big data that is broad enough to open fruitful investigations into the emerging use of big data in these contexts. This conceptualization includes the identification of three moments contained in any big data practice. Second, it suggests a research agenda built around a set of subthemes that each deserve dedicated scrutiny when studying the interplay between big data...

  7. Reduction of microhemorrhages in the spinal cord of symptomatic ALS mice after intravenous human bone marrow stem cell transplantation accompanies repair of the blood-spinal cord barrier

    Science.gov (United States)

    Eve, David J.; Steiner, George; Mahendrasah, Ajay; Sanberg, Paul R.; Kurien, Crupa; Thomson, Avery; Borlongan, Cesar V.; Garbuzova-Davis, Svitlana

    2018-01-01

    Blood-spinal cord barrier (BSCB) alterations, including capillary rupture, have been demonstrated in animal models of amyotrophic lateral sclerosis (ALS) and ALS patients. To date, treatment to restore BSCB in ALS is underexplored. Here, we evaluated whether intravenous transplantation of human bone marrow CD34+ (hBM34+) cells into symptomatic ALS mice leads to restoration of capillary integrity in the spinal cord as determined by detection of microhemorrhages. Three different doses of hBM34+ cells (5 × 10⁴, 5 × 10⁵ or 1 × 10⁶) or media were intravenously injected into symptomatic G93A SOD1 mice at 13 weeks of age. Microhemorrhages were determined in the cervical and lumbar spinal cords of mice at 4 weeks post-treatment, as revealed by Perls’ Prussian blue staining for ferric iron. Numerous microhemorrhages were observed in the gray and white matter of the spinal cords in media-treated mice, with a greater number of capillary ruptures within the ventral horn of both segments. In cell-treated mice, microhemorrhage numbers in the cervical and lumbar spinal cords were inversely related to administered cell doses. In particular, the pervasive microvascular ruptures determined in the spinal cords in late symptomatic ALS mice were significantly decreased by the highest cell dose, suggestive of BSCB repair by grafted hBM34+ cells. The study results provide translational outcomes supporting transplantation of hBM34+ cells at an optimal dose as a potential therapeutic strategy for BSCB repair in ALS patients. PMID:29535831

  8. The big challenges in modeling human and environmental well-being.

    Science.gov (United States)

    Tuljapurkar, Shripad

    2016-01-01

    This article is a selective review of quantitative research, historical and prospective, that is needed to inform sustainable development policy. I start with a simple framework to highlight how demography and productivity shape human well-being. I use that to discuss three sets of issues and corresponding challenges to modeling: first, population prehistory and early human development and their implications for the future; second, the multiple distinct dimensions of human and environmental well-being and the meaning of sustainability; and, third, inequality as a phenomenon triggered by development and models to examine changing inequality and its consequences. I conclude with a few words about other important factors: political, institutional, and cultural.

  9. Elevated Endotoxin Levels in Human Intravenous Immunoglobulin Concentrates Caused by (1→3)-β-D-Glucans.

    Science.gov (United States)

    Buchacher, Andrea; Krause, Dagmar; Wiry, Gerda; Weinberger, Josef

    2010-01-01

    Endotoxins have been measured routinely in the final product and during the production process to produce non-pyrogenic parenterals. Limulus-amoebocyte-lysate-reactive material was found in in-process samples and final product of one of Octapharma's intravenous immunoglobulin (IVIG) preparations. Limulus-amoebocyte-lysate (LAL) is activated by bacterial endotoxins and by (1→3)-β-D-glucans. The contribution of both compounds on the LAL-related signal was determined by three different approaches: (1) using a test specific for (1→3)-β-D-glucans, (2) by addition of β-glucan blocker, and (3) by the use of a recombinant endotoxin assay. It was shown that none of our IVIG concentrates contained elevated endotoxin contents but that the higher LAL reaction could be ascribed to (1→3)-β-D-glucans extracted from cellulose filter pads. The use of an endotoxin test kit highly sensitive for (1→3)-β-D-glucans might lead to false-positive results. (1→3)-β-D-glucans spike solutions did not evoke an increase of temperature in rabbits, suggesting that a pyrogenic reaction is not expected in patients.

  10. Efficacy, pharmacokinetics, safety, and tolerability of Flebogamma 10% DIF, a high-purity human intravenous immunoglobulin, in primary immunodeficiency.

    Science.gov (United States)

    Berger, Melvin; Pinciaro, Paul J; Althaus, Arthur; Ballow, Mark; Chouksey, Akhilesh; Moy, James; Ochs, Hans; Stein, Mark

    2010-03-01

    Flebogamma 10% DIF represents an evolution of intravenous immune globulin from the previous 5% product, to be administered at higher rates and with smaller infusion volumes. Pathogen safety is enhanced by the combination of multiple methods with different mechanisms of action. The objective of this study was to evaluate the efficacy, pharmacokinetics, and safety of Flebogamma 10% DIF for immunoglobulin replacement therapy in primary immunodeficiency diseases (PIDD). Flebogamma 10% DIF was administered to 46 subjects with well-defined PIDD at a dose of 300-600 mg/kg every 21-28 days for 12 months. The serious bacterial infection rate was 0.025/subject/year. The half-life in serum of the administered IgG was approximately 35 days. No serious treatment-related adverse event (AE) occurred in any patient. Most of the potentially treatment-related AEs occurred during the infusion, accounting for 20% of the 601 infusions administered. Flebogamma 10% DIF is efficacious and safe, has adequate pharmacokinetic properties, and is well tolerated for the treatment of PIDD.

  11. Zooniverse: Combining Human and Machine Classifiers for the Big Survey Era

    Science.gov (United States)

    Fortson, Lucy; Wright, Darryl; Beck, Melanie; Lintott, Chris; Scarlata, Claudia; Dickinson, Hugh; Trouille, Laura; Willi, Marco; Laraia, Michael; Boyer, Amy; Veldhuis, Marten; Zooniverse

    2018-01-01

    Many analyses of astronomical data sets, ranging from morphological classification of galaxies to identification of supernova candidates, have relied on humans to classify data into distinct categories. Crowdsourced galaxy classifications via the Galaxy Zoo project provided a solution that scaled visual classification for extant surveys by harnessing the combined power of thousands of volunteers. However, the much larger data sets anticipated from upcoming surveys will require a different approach. Automated classifiers using supervised machine learning have improved considerably over the past decade but their increasing sophistication comes at the expense of needing ever more training data. Crowdsourced classification by human volunteers is a critical technique for obtaining these training data. But several improvements can be made on this zeroth order solution. Efficiency gains can be achieved by implementing a “cascade filtering” approach whereby the task structure is reduced to a set of binary questions that are more suited to simpler machines while demanding lower cognitive loads for humans. Intelligent subject retirement based on quantitative metrics of volunteer skill and subject label reliability also leads to dramatic improvements in efficiency. We note that human and machine classifiers may retire subjects differently, leading to trade-offs in performance space. Drawing on work with several Zooniverse projects, including Galaxy Zoo and Supernova Hunter, we will present recent findings from experiments that combine cohorts of human and machine classifiers. We show that the most efficient system results when appropriate subsets of the data are intelligently assigned to each group according to their particular capabilities. With sufficient online training, simple machines can quickly classify “easy” subjects, leaving more difficult (and discovery-oriented) tasks for volunteers. We also find humans achieve higher classification purity while samples...
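    The subject-retirement idea in the abstract can be sketched as a toy Bayesian vote-accumulation scheme. This is not the Zooniverse implementation: the assumed volunteer `skill`, the `retire_at` threshold, and the function names are all illustrative.

    ```python
    # Toy "intelligent subject retirement" for a binary label: each volunteer
    # vote shifts the log-odds of the positive label; the subject is retired
    # (stops receiving classifications) once the log-odds pass a threshold.

    import math

    def update_log_odds(log_odds, vote, skill=0.8):
        """One Bayesian update, assuming each volunteer is correct with
        probability `skill`; vote 1 supports the positive label, 0 the negative."""
        llr = math.log(skill / (1.0 - skill))
        return log_odds + (llr if vote == 1 else -llr)

    def classify(votes, retire_at=3.0):
        """Accumulate votes until |log-odds| >= retire_at; return (label, n_votes_used)."""
        log_odds = 0.0
        for n, vote in enumerate(votes, start=1):
            log_odds = update_log_odds(log_odds, vote)
            if abs(log_odds) >= retire_at:
                return (1 if log_odds > 0 else 0), n   # retired early
        return (1 if log_odds > 0 else 0), len(votes)  # ran out of votes

    label, n_used = classify([1, 1, 1, 0, 1, 1])
    ```

    Skill-weighted updates like this are one way easy subjects retire after only a handful of votes, freeing volunteers for the harder, discovery-oriented subjects the abstract mentions.
    
    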

  12. How Do Small Things Make a Big Difference? Activities to Teach about Human-Microbe Interactions.

    Science.gov (United States)

    Jasti, Chandana; Hug, Barbara; Waters, Jillian L; Whitaker, Rachel J

    2014-11-01

    Recent scientific studies are providing increasing evidence for how microbes living in and on us are essential to our good health. However, many students still think of microbes only as germs that harm us. The classroom activities presented here are designed to shift student thinking on this topic. In these guided inquiry activities, students investigate human-microbe interactions as they work together to interpret and analyze authentic data from published articles and develop scientific models. Through the activities, students learn and apply ecological concepts as they come to see the human body as a fascinatingly complex ecosystem.

  13. A big blank white canvas? Mapping and modeling human impact in Antarctica

    Science.gov (United States)

    Steve Carver; Tina Tin

    2015-01-01

    Antarctica is certainly what most people would consider being the world's last great wilderness; largely untouched and undeveloped by humans. Yet it is not inviolate - there are scientific bases, tourist operations, expeditions, airstrips and even roads. Although these impacts are by and large limited in extent, their very presence in an otherwise "blank...

  15. Development of a translational model to screen medications for cocaine use disorder II: Choice between intravenous cocaine and money in humans

    Science.gov (United States)

    Lile, Joshua A.; Stoops, William W.; Rush, Craig R.; Negus, S. Stevens; Glaser, Paul E. A.; Hatton, Kevin W.; Hays, Lon R.

    2016-01-01

    Background A medication for treating cocaine use disorder has yet to be approved. Laboratory-based evaluation of candidate medications in animals and humans is a valuable means to demonstrate safety, tolerability and initial efficacy of potential medications. However, animal-to-human translation has been hampered by a lack of coordination. Therefore, we designed homologous cocaine self-administration studies in rhesus monkeys (see companion article) and human subjects in an attempt to develop linked, functionally equivalent procedures for research on candidate medications for cocaine use disorder. Methods Eight (N=8) subjects with cocaine use disorder completed 12 experimental sessions in which they responded to receive money ($0.01, $1.00 and $3.00) or intravenous cocaine (0, 3, 10 and 30 mg/70 kg) under independent, concurrent progressive-ratio schedules. Prior to the completion of 9 choice trials, subjects sampled the cocaine dose available during that session and were informed of the monetary alternative value. Results The allocation of behavior varied systematically as a function of cocaine dose and money value. Moreover, a similar pattern of cocaine choice was demonstrated in rhesus monkeys and humans across different cocaine doses and magnitudes of the species-specific alternative reinforcers. The subjective and cardiovascular responses to IV cocaine were an orderly function of dose, although heart rate and blood pressure remained within safe limits. Conclusions These coordinated studies successfully established drug vs. non-drug choice procedures in humans and rhesus monkeys that yielded similar cocaine choice behavior across species. This translational research platform will be used in future research to enhance the efficiency of developing interventions to reduce cocaine use. PMID:27269368

  16. Big Surveys, Big Data Centres

    Science.gov (United States)

    Schade, D.

    2016-06-01

Well-designed astronomical surveys are powerful and have consistently been keystones of scientific progress. The Byurakan Surveys, using a Schmidt telescope with an objective prism, produced a list of about 3000 UV-excess Markarian galaxies, but these objects have stimulated an enormous amount of further study and appear in over 16,000 publications. The CFHT Legacy Surveys used a wide-field imager to cover thousands of square degrees, and those surveys are mentioned in over 1100 publications since 2002. Both ground- and space-based astronomy have been increasing their investments in survey work. Survey instrumentation strives toward fair samples and large sky coverage and therefore tends to produce massive datasets. Thus we are faced with the "big data" problem in astronomy. Survey datasets require specialized approaches to data management, and big data places additional challenging requirements on it. If the term "big data" is defined as data collections that are too large to move, then there are profound implications for the infrastructure that supports big data science: the current model of data centres is obsolete. In the era of big data, the central problem is how to create architectures that effectively manage the relationship between data collections, networks, processing capabilities, and software, given the science requirements of the projects that need to be executed. A stand-alone data silo cannot support big data science. I'll describe the current efforts of the Canadian community to deal with this situation, our successes and failures, and how we are planning in the next decade to create a workable and adaptable solution to support big data science.

  17. Big bad wolf or man's best friend? Unmasking a false wolf aggression on humans.

    Science.gov (United States)

    Caniglia, R; Galaverni, M; Delogu, M; Fabbri, E; Musto, C; Randi, E

    2016-09-01

The return of the wolf to its historical range is raising social conflicts with local communities over the perceived threat to people's safety. In this study we applied molecular methods to solve an unusual case of a wolf attack on a man in the Northern Italian Apennines. We analysed seven biological samples, collected from the clothes of the injured man, using mtDNA sequences, the Amelogenin gene, 39 unlinked autosomal microsatellites and four Y-linked microsatellites. Results indicated that the aggression was carried out by a male dog, not by a wolf or a wolf x dog hybrid. Our findings were later confirmed by the victim, who confessed he had been attacked by a neighbour's guard dog. The genetic profile of the owned dog perfectly matched that identified from the samples previously collected. Our results prove once again that the wolf does not currently represent a risk to human safety in developed countries, whereas most animal aggressions are carried out by its domestic relative, the dog. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  18. The metabolic fate of the Anti-HIV active drug carrier succinylated human serum albumin after intravenous administration in rats

    NARCIS (Netherlands)

    Swart, P J; Kuipers, M E; Smit, C; Beljaars, L; Ter Wiel, J; Meijer, D K

The pharmacokinetics and metabolic fate of the intrinsically active (anti-HIV) drug carrier succinylated human serum albumin (Suc-HSA) were studied in rats. Suc-HSA was prepared by derivatizing HSA with 1,4-[C-14]-succinic anhydride, a modification by which all available ε-NH2 groups in HSA were

  19. Distribution of 131I-labeled recombinant human erythropoietin in maternal and fetal organs following intravenous administration in pregnant rats

    International Nuclear Information System (INIS)

    Yilmaz, O.; Lambrecht, F.Y.; Durkan, K.; Gokmen, N.; Erbayraktar, S.

    2007-01-01

The aim of the present study was to demonstrate the possible transplacental transmission of 131I-labeled recombinant human erythropoietin (131I-rh-EPO) in pregnant rats and its distribution through maternal and fetal organs. Six Wistar albino rats at day 18 of pregnancy were used. 131I-labeled recombinant human erythropoietin (specific activity = 2.4 μCi/IU) was injected into the tail vein of the rats. Thirty minutes after the labeled erythropoietin infusion, the maternal stomach, kidney, lung, liver, brain and heart, as well as the fetuses, were removed; the same organs were then removed from each fetus. Maternal and fetal organs and the placentas were weighed, and their radioactivity was counted with a Cd(Te) detector. 131I-labeled recombinant human erythropoietin was found to cross the rat placenta, and its distribution among fetal organs was similar to that among maternal organs. Moreover, uptake decreased in each fetus and its corresponding placenta as measurements were performed closer to the cornu uteri. (author)

  20. Big Opportunities and Big Concerns of Big Data in Education

    Science.gov (United States)

    Wang, Yinying

    2016-01-01

    Against the backdrop of the ever-increasing influx of big data, this article examines the opportunities and concerns over big data in education. Specifically, this article first introduces big data, followed by delineating the potential opportunities of using big data in education in two areas: learning analytics and educational policy. Then, the…

  1. Big Dreams

    Science.gov (United States)

    Benson, Michael T.

    2015-01-01

    The Keen Johnson Building is symbolic of Eastern Kentucky University's historic role as a School of Opportunity. It is a place that has inspired generations of students, many from disadvantaged backgrounds, to dream big dreams. The construction of the Keen Johnson Building was inspired by a desire to create a student union facility that would not…

  2. Intentional intravenous mercury injection

    African Journals Online (AJOL)

In this case report, intravenous complications, treatment strategies and possible ... Mercury toxicity is commonly associated with vapour inhalation or oral ingestion, for which there exist definite treatment options. Intravenous mercury ... personality, anxiousness, irritability, insomnia, depression and drowsiness.[1] However ...

  3. Significant neutralizing activities against H2N2 influenza A viruses in human intravenous immunoglobulin lots manufactured from 1993 to 2010

    Directory of Open Access Journals (Sweden)

    Ikuta K

    2012-07-01

Full Text Available Ritsuko Kubota-Koketsu,1,2 Mikihiro Yunoki,2,3 Yoshinobu Okuno,1 Kazuyoshi Ikuta2 1Kanonji Institute, The Research Foundation for Microbial Diseases of Osaka University, Kagawa; 2Department of Virology, Research Institute for Microbial Diseases, Osaka University; 3Pathogenic Risk Management, Benesis Corporation, Osaka, Japan. Abstract: Influenza A H2N2 virus, also known as the Asian flu, spread worldwide from 1957 to 1967, although there have been no cases reported in humans in the past 40 years. A vaccination program was introduced in Japan in the 1960s. Older Japanese donors could have been naturally infected with the H2N2 virus or vaccinated in the early 1960s. Human intravenous immunoglobulin (IVIG) reflects the epidemiological status of the donating population in a given time period. Here, the possible viral neutralizing (VN) activities of IVIG against the H2N2 virus were examined. Hemagglutination inhibition (HI) and VN activities of IVIG lots manufactured from 1993 to 2010 in Japan and the United States were evaluated against H2N2 viruses. High HI and VN activities against H2N2 viruses were found in all the IVIG lots investigated. HI titers were 32–64 against the isolate from 1957 and 64–128 against the isolates from 1965. VN titers were 80–320 against the isolate from 1957 and 1280–5120 against the isolates from 1965. Both the HI and VN titers were higher against the isolates from 1965 than from 1957. Thus, antibody titers of IVIG against influenza viruses correlate well with the history of infection and the vaccine program in Japan. Therefore, evaluation of antibody titers provides valuable information about IVIGs, which could be used for immune stimulation when a new influenza virus emerges in the human population. Keywords: IVIG, influenza, H2N2, neutralization

  4. Safety of intravenous insulin aspart compared to regular human insulin in patients undergoing ICU monitoring post cardiac surgery: an Indian experience.

    Science.gov (United States)

    Chawla, Manoj; Malve, Harshad; Shah, Harshvi; Shinde, Shwetal; Bhoraskar, Anil

    2015-01-01

Poor perioperative glycemic control increases the risk of infection, cardiovascular accidents and mortality in patients undergoing surgery. Tight glycemic control by insulin therapy is known to yield better outcomes in such patients. Intravenous (IV) insulin therapy, with or without adjunctive subcutaneous insulin therapy, is the mainstay of managing hyperglycemia in the perioperative period. This observational study assessed the safety of IV insulin aspart (IAsp) compared to regular human insulin (RHI) in patients who underwent cardiac surgery at a tertiary care hospital. A total of 203 patients received IV IAsp (n = 103) or RHI (n = 100). Safety was assessed by the frequency and severity of adverse events (AEs) and serious adverse events (SAEs) during hospitalization. IAsp effectively controlled mean blood glucose levels to 159.87 ± 41.41 mg/dl, similar to RHI (160.77 ± 44.39 mg/dl). No serious adverse event was reported. The incidence of hypoglycemia was similar in both groups. The insulin infusion rate, the time for which insulin infusion was withheld, and the mean blood glucose during hypoglycemia were significantly higher in the RHI group. This study has shown similar safety of IV IAsp compared to IV RHI in post cardiac surgery patients. However, physicians preferred IAsp as it offers an advantage during transition. IV IAsp offers an effective and safe option for managing hyperglycemia in ICU patients after cardiac procedures.

  5. The plasma and cerebrospinal fluid pharmacokinetics of erlotinib and its active metabolite (OSI-420) after intravenous administration of erlotinib in non-human primates.

    Science.gov (United States)

    Meany, Holly J; Fox, Elizabeth; McCully, Cynthia; Tucker, Chris; Balis, Frank M

    2008-08-01

Erlotinib hydrochloride is a small-molecule inhibitor of the epidermal growth factor receptor (EGFR). EGFR is over-expressed in primary brain tumors and in solid tumors that metastasize to the central nervous system. We evaluated the plasma and cerebrospinal fluid (CSF) pharmacokinetics of erlotinib and its active metabolite OSI-420 after an intravenous (IV) dose in a non-human primate model. Erlotinib was administered as a 1 h IV infusion to four adult rhesus monkeys. Serial blood and CSF samples were drawn over 48 h, and erlotinib and OSI-420 were quantified with an HPLC/tandem mass spectrometric assay. Pharmacokinetic parameters were estimated using non-compartmental and compartmental methods. CSF penetration was calculated from the ratio AUC(CSF):AUC(plasma). Erlotinib disappearance from plasma after a short IV infusion was biexponential, with a mean terminal half-life of 5.2 h and a mean clearance of 128 ml/min per m^2. OSI-420 exposure (AUC) in plasma was 30% (range 12-59%) of erlotinib, and OSI-420 clearance was more than 5-fold higher than that of erlotinib. Erlotinib and OSI-420 were detectable in CSF after an IV dose. The drug exposure (AUC) in the CSF is limited relative to total plasma concentrations but is substantial relative to the free drug exposure in plasma.
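The CSF-penetration metric in this record, the ratio AUC(CSF):AUC(plasma) with AUCs estimated non-compartmentally, can be sketched numerically. The snippet below is a minimal illustration using the linear trapezoidal rule (a standard non-compartmental choice, assumed here rather than stated in the abstract); all sampling times and concentrations are invented for demonstration and are not data from the study.

```python
# Non-compartmental AUC by the linear trapezoidal rule, and CSF penetration
# as the ratio AUC_CSF / AUC_plasma. All concentration values are invented.

def auc_trapezoid(times, concs):
    """Area under the concentration-time curve via the linear trapezoidal rule."""
    return sum(
        (t2 - t1) * (c1 + c2) / 2.0
        for (t1, c1), (t2, c2) in zip(zip(times, concs), zip(times[1:], concs[1:]))
    )

times_h = [0.5, 1, 2, 4, 8, 24, 48]                # sampling times (h), invented
plasma_ng_ml = [900, 1200, 800, 450, 210, 25, 3]   # invented plasma concentrations
csf_ng_ml = [2, 6, 9, 8, 5, 1, 0.2]                # invented CSF concentrations

auc_plasma = auc_trapezoid(times_h, plasma_ng_ml)
auc_csf = auc_trapezoid(times_h, csf_ng_ml)
penetration = auc_csf / auc_plasma
print(f"CSF penetration (AUC_CSF:AUC_plasma): {penetration:.1%}")
```

With real data one would also extrapolate the terminal phase to infinity; the trapezoidal sum over observed samples is the core of the calculation.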

  6. Big Egos in Big Science

    DEFF Research Database (Denmark)

    Jeppesen, Jacob; Vaarst Andersen, Kristina; Lauto, Giancarlo

on preferential attachment, but more of an assortativity effect, creating not merely a rich-gets-richer effect but an elitist network with high entry barriers. In this acclaimed democratic and collaborative environment of Big Science, the elite closes in on itself. We propose this tendency to be even more explicit in other...

  7. The dose-response effect of acute intravenous transplantation of human umbilical cord blood cells on brain damage and spatial memory deficits in neonatal hypoxia-ischemia.

    Science.gov (United States)

    de Paula, S; Greggio, S; Marinowic, D R; Machado, D C; DaCosta, J Costa

    2012-05-17

Despite the beneficial effects of cell-based therapies on brain repair shown in most studies, there has been no consensus regarding the optimal dose of human umbilical cord blood cells (HUCBC) for neonatal hypoxia-ischemia (HI). In this study, we compared the long-term effects of intravenous administration of HUCBC at three different doses on spatial memory and brain morphological changes after HI in newborn Wistar rats. In addition, we tested whether the transplanted HUCBC migrate to the injured brain after transplantation. Seven-day-old animals underwent right carotid artery occlusion and were exposed to 8% O2 inhalation for 2 h. After 24 h, randomly selected animals were assigned to four different experimental groups: HI rats administered vehicle (HI+vehicle) and HI rats treated with 1×10^6 (HI+low-dose), 1×10^7 (HI+medium-dose), or 1×10^8 (HI+high-dose) HUCBC into the jugular vein. A control group (sham-operated) was also included in this study. After 8 weeks of transplantation, spatial memory performance was assessed using the Morris water maze (MWM), and subsequently the animals were euthanized for brain morphological analysis using stereological methods. In addition, we performed immunofluorescence and polymerase chain reaction (PCR) analyses to identify HUCBC in the rat brain 7 days after transplantation. The MWM test showed a significant spatial memory recovery at the highest HUCBC dose compared with HI+vehicle rats (P<0.05). Furthermore, brain atrophy was also significantly lower in the HI+medium- and high-dose groups compared with the HI+vehicle animals (P<0.01 and P<0.001, respectively). In addition, HUCBC were demonstrated to be localized in host brains by immunohistochemistry and PCR analyses 7 days after intravenous administration. These results revealed that HUCBC transplantation has a dose-dependent potential to promote robust tissue repair and stable cognitive improvement after HI brain injury. Copyright © 2012 IBRO. Published by

  8. Biodistribution and Clearance of Human Mesenchymal Stem Cells by Quantitative Three-Dimensional Cryo-Imaging After Intravenous Infusion in a Rat Lung Injury Model.

    Science.gov (United States)

    Schmuck, Eric G; Koch, Jill M; Centanni, John M; Hacker, Timothy A; Braun, Rudolf K; Eldridge, Marlowe; Hei, Derek J; Hematti, Peiman; Raval, Amish N

    2016-12-01

Cell tracking is a critical component of the safety and efficacy evaluation of therapeutic cell products. To date, cell-tracking modalities have been hampered by poor resolution, low sensitivity, and an inability to track cells beyond the short term. Three-dimensional (3D) cryo-imaging coregisters fluorescent and bright-field microscopy images and allows for single-cell quantification within a 3D organ volume. We hypothesized that 3D cryo-imaging could be used to measure cell biodistribution and clearance after intravenous infusion in a rat lung injury model compared with normal rats. A bleomycin lung injury model was established in Sprague-Dawley rats (n = 12). Human mesenchymal stem cells (hMSCs) labeled with QTracker655 were infused via the jugular vein. After 2, 4, or 8 days, a second dose of hMSCs labeled with QTracker605 was infused, and animals were euthanized after 60, 120, or 240 minutes. Lungs, liver, spleen, heart, kidney, testis, and intestine were cryopreserved, followed by 3D cryo-imaging of each organ. At 60 minutes, 82% ± 9.7% of cells were detected; detection decreased to 60% ± 17% and 66% ± 22% at 120 and 240 minutes, respectively. At day 2, 0.06% of cells were detected, and this level remained constant at days 4 and 8 postinfusion. At 60, 120, and 240 minutes, 99.7% of detected cells were found in the liver, lungs, and spleen, with cells primarily retained in the liver. This is the first study using 3D cryo-imaging to track hMSCs in a rat lung injury model. hMSCs were retained primarily in the liver, with fewer detected in lungs and spleen. Effective bench-to-bedside clinical translation of cellular therapies requires careful understanding of cell fate through tracking. Tracking cells is important to measure cell retention so that delivery methods and cell dose can be optimized and so that biodistribution and clearance can be defined to better understand potential off-target toxicity and redosing strategies. This article demonstrates, for the first

  9. What Role for Law, Human Rights, and Bioethics in an Age of Big Data, Consortia Science, and Consortia Ethics? The Importance of Trustworthiness.

    Science.gov (United States)

    Dove, Edward S; Özdemir, Vural

    2015-09-01

The global bioeconomy is generating new paradigm-shifting practices of knowledge co-production, such as collective innovation; large-scale, data-driven global consortia science (Big Science); and consortia ethics (Big Ethics). These bioeconomic and sociotechnical practices can be forces for progressive social change, but they can also raise predicaments at the interface of law, human rights, and bioethics. In this article, we examine one such double-edged practice: the growing, multivariate exploitation of Big Data in the health sector, particularly by the private sector. Commercial exploitation of health data for knowledge-based products is a key aspect of the bioeconomy and is also a topic of concern among publics around the world. It is exacerbated in the current age of globally interconnected consortia science and consortia ethics, which is characterized by accumulating epistemic proximity, diminished academic independence, "extreme centrism", and conflicted/competing interests among innovation actors. Extreme centrism is of particular importance as a new ideology emerging from consortia science and consortia ethics; this relates to invariably taking a middle-of-the-road populist stance, even in the event of human rights breaches, so as to sustain the populist support needed for consortia building and collective innovation. What role do law, human rights, and bioethics-separate and together-have to play in addressing these predicaments and opportunities in early 21st century science and society? One answer we propose is an intertwined ethico-legal normative construct, namely trustworthiness. By considering trustworthiness as a central pillar at the intersection of law, human rights, and bioethics, we enable others to trust us, which in turn allows different actors (both nonprofit and for-profit) to operate more justly in consortia science and ethics, as well as to access and responsibly use health data for public benefit.

  10. What Role for Law, Human Rights, and Bioethics in an Age of Big Data, Consortia Science, and Consortia Ethics? The Importance of Trustworthiness

    Directory of Open Access Journals (Sweden)

    Edward S. Dove

    2015-08-01

Full Text Available The global bioeconomy is generating new paradigm-shifting practices of knowledge co-production, such as collective innovation; large-scale, data-driven global consortia science (Big Science); and consortia ethics (Big Ethics). These bioeconomic and sociotechnical practices can be forces for progressive social change, but they can also raise predicaments at the interface of law, human rights, and bioethics. In this article, we examine one such double-edged practice: the growing, multivariate exploitation of Big Data in the health sector, particularly by the private sector. Commercial exploitation of health data for knowledge-based products is a key aspect of the bioeconomy and is also a topic of concern among publics around the world. It is exacerbated in the current age of globally interconnected consortia science and consortia ethics, which is characterized by accumulating epistemic proximity, diminished academic independence, “extreme centrism”, and conflicted/competing interests among innovation actors. Extreme centrism is of particular importance as a new ideology emerging from consortia science and consortia ethics; this relates to invariably taking a middle-of-the-road populist stance, even in the event of human rights breaches, so as to sustain the populist support needed for consortia building and collective innovation. What role do law, human rights, and bioethics—separate and together—have to play in addressing these predicaments and opportunities in early 21st century science and society? One answer we propose is an intertwined ethico-legal normative construct, namely trustworthiness. By considering trustworthiness as a central pillar at the intersection of law, human rights, and bioethics, we enable others to trust us, which in turn allows different actors (both nonprofit and for-profit) to operate more justly in consortia science and ethics, as well as to access and responsibly use health data for public benefit.

  11. What Role for Law, Human Rights, and Bioethics in an Age of Big Data, Consortia Science, and Consortia Ethics? The Importance of Trustworthiness

    Science.gov (United States)

    Dove, Edward S.; Özdemir, Vural

    2015-01-01

The global bioeconomy is generating new paradigm-shifting practices of knowledge co-production, such as collective innovation; large-scale, data-driven global consortia science (Big Science); and consortia ethics (Big Ethics). These bioeconomic and sociotechnical practices can be forces for progressive social change, but they can also raise predicaments at the interface of law, human rights, and bioethics. In this article, we examine one such double-edged practice: the growing, multivariate exploitation of Big Data in the health sector, particularly by the private sector. Commercial exploitation of health data for knowledge-based products is a key aspect of the bioeconomy and is also a topic of concern among publics around the world. It is exacerbated in the current age of globally interconnected consortia science and consortia ethics, which is characterized by accumulating epistemic proximity, diminished academic independence, “extreme centrism”, and conflicted/competing interests among innovation actors. Extreme centrism is of particular importance as a new ideology emerging from consortia science and consortia ethics; this relates to invariably taking a middle-of-the-road populist stance, even in the event of human rights breaches, so as to sustain the populist support needed for consortia building and collective innovation. What role do law, human rights, and bioethics—separate and together—have to play in addressing these predicaments and opportunities in early 21st century science and society? One answer we propose is an intertwined ethico-legal normative construct, namely trustworthiness. By considering trustworthiness as a central pillar at the intersection of law, human rights, and bioethics, we enable others to trust us, which in turn allows different actors (both nonprofit and for-profit) to operate more justly in consortia science and ethics, as well as to access and responsibly use health data for public benefit. PMID:26345196

  12. Big data need big theory too.

    Science.gov (United States)

    Coveney, Peter V; Dougherty, Edward R; Highfield, Roger R

    2016-11-13

The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, machine learning appears to provide a shortcut to reveal correlations of arbitrary complexity between processes at the atomic, molecular, meso- and macroscales. Here, we point out the weaknesses of pure big data approaches with particular focus on biology and medicine, which fail to provide conceptual accounts for the processes to which they are applied. No matter their 'depth' and the sophistication of data-driven methods, such as artificial neural nets, in the end they merely fit curves to existing data. Not only do these methods invariably require far larger quantities of data than anticipated by big data aficionados in order to produce statistically reliable results, but they can also fail in circumstances beyond the range of the data used to train them because they are not designed to model the structural characteristics of the underlying system. We argue that it is vital to use theory as a guide to experimental design for maximal efficiency of data collection and to produce reliable predictive models and conceptual knowledge. Rather than continuing to fund, pursue and promote 'blind' big data projects with massive budgets, we call for more funding to be allocated to the elucidation of the multiscale and stochastic processes controlling the behaviour of complex systems, including those of life, medicine and healthcare. This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'. © 2015 The Authors.

  13. Nursing Needs Big Data and Big Data Needs Nursing.

    Science.gov (United States)

    Brennan, Patricia Flatley; Bakken, Suzanne

    2015-09-01

Contemporary big data initiatives in health care will benefit from greater integration with nursing science and nursing practice; in turn, nursing science and nursing practice have much to gain from data science initiatives. Big data arises secondary to scholarly inquiry (e.g., -omics) and everyday observations like cardiac flow sensors or Twitter feeds. Emerging data science methods ensure that these data can be leveraged to improve patient care. Big data encompasses data that exceed human comprehension, that exist at a volume unmanageable by standard computer systems, that arrive at a velocity not under the control of the investigator, and that possess a level of imprecision not found in traditional inquiry. Data science methods are emerging to manage and gain insights from big data. The primary methods included investigation of emerging federal big data initiatives and exploration of exemplars from nursing informatics research to benchmark where nursing is already poised to participate in the big data revolution. We provide observations and reflections on experiences in the emerging big data initiatives. Existing approaches to large data set analysis provide a necessary but not sufficient foundation for nursing to participate in the big data revolution. Nursing's Social Policy Statement guides a principled, ethical perspective on big data and data science. There are implications for basic and advanced practice clinical nurses in practice, for the nurse scientist who collaborates with data scientists, and for the nurse data scientist. Big data and data science have the potential to provide greater richness in understanding patient phenomena and in tailoring interventional strategies that are personalized to the patient. © 2015 Sigma Theta Tau International.

  14. Phase I dose escalation pharmacokinetic assessment of intravenous humanized anti-MUC1 antibody AS1402 in patients with advanced breast cancer.

    Science.gov (United States)

    Pegram, Mark D; Borges, Virginia F; Ibrahim, Nuhad; Fuloria, Jyotsna; Shapiro, Charles; Perez, Susan; Wang, Karen; Schaedli Stark, Franziska; Courtenay Luck, Nigel

    2009-01-01

MUC1 is a cell-surface glycoprotein that establishes a molecular barrier at the epithelial surface and engages in morphogenetic signal transduction. Alterations in MUC1 glycosylation accompany the development of cancer and influence cellular growth, differentiation, transformation, adhesion, invasion, and immune surveillance. A 20-amino-acid tandem repeat that forms the core protein of MUC1 is overexpressed and aberrantly glycosylated in the majority of epithelial tumors. AS1402 (formerly R1550) is a humanized IgG1κ monoclonal antibody that binds to PDTR sequences within this tandem repeat that are not exposed in normal cells. AS1402 is a potent inducer of antibody-dependent cellular cytotoxicity (ADCC), specifically against MUC1-expressing tumor cells. The objective of this study was to determine the safety, tolerability, and pharmacokinetic (PK) characteristics of AS1402 monotherapy in patients with locally advanced or metastatic MUC1-positive breast cancer that had progressed after anthracycline- and taxane-based therapy. Patients received AS1402 over a 1- to 3-hour intravenous (i.v.) infusion at doses between 1 and 16 mg/kg, with repeated dosing every 1 to 3 weeks (based on patient-individualized PK assessment) until disease progression. Serum AS1402 levels were measured at multiple times after i.v. administration. Human anti-human antibody (HAHA) responses were measured to determine the immunogenicity of AS1402. Noncompartmental pharmacokinetic parameters were determined and used to assess dose dependency across the dose range studied. Twenty-six patients were treated. AS1402 was generally well tolerated. Two grade 3/4 drug-related adverse events were reported, both at the 3-mg/kg dose; neither was observed in expanded or subsequent dosing cohorts. No anti-human antibodies were detected. Plasma concentrations of AS1402 appeared to be proportional to dose within the 1- to 16-mg/kg dose range assessed, with a mean terminal half-life of 115.4 +/- 37.1 hours.
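Dose proportionality of the kind reported above is commonly assessed with a power model, AUC = a·dose^b, where a fitted exponent b near 1 supports proportionality across the dose range. The sketch below fits the exponent by log-log linear regression; the exposure values are invented for illustration and are not data from this trial.

```python
import math

# Power-model check of dose proportionality: fit b in AUC = a * dose**b
# by ordinary least squares on log-transformed data. AUC values are invented.
doses = [1, 3, 8, 16]            # mg/kg, spanning a 1-16 mg/kg range
aucs = [110, 340, 905, 1750]     # invented exposure values

x = [math.log(d) for d in doses]
y = [math.log(a) for a in aucs]
n = len(x)
mx, my = sum(x) / n, sum(y) / n
b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum((xi - mx) ** 2 for xi in x)
print(f"power-model exponent b = {b:.2f}")   # b close to 1 suggests dose proportionality
```

In practice a confidence interval around b, rather than the point estimate alone, is used to conclude proportionality.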

  15. Networking for big data

    CERN Document Server

    Yu, Shui; Misic, Jelena; Shen, Xuemin (Sherman)

    2015-01-01

Networking for Big Data supplies an unprecedented look at cutting-edge research on the networking and communication aspects of Big Data. Starting with a comprehensive introduction to Big Data and its networking issues, it offers deep technical coverage of both theory and applications. The book is divided into four sections: introduction to Big Data, networking theory and design for Big Data, networking security for Big Data, and platforms and systems for Big Data applications. Focusing on key networking issues in Big Data, the book explains network design and implementation for Big Data. It exa

  16. Retrospective and Prospective Human Intravenous and Oral Pharmacokinetic Projection of Dipeptidyl peptidase-IV Inhibitors Using Simple Allometric Principles - Case Studies of ABT-279, ABT-341, Alogliptin, Carmegliptin, Sitagliptin and Vildagliptin.

    Science.gov (United States)

    Gilibili, Ravindranath R; Bhamidipati, Ravi Kanth; Mullangi, Ramesh; Srinivas, Nuggehally R

    2015-01-01

The purpose of this exercise was to explore the utility of an allometric scaling approach for the prediction of the intravenous and oral pharmacokinetics of six dipeptidyl peptidase-IV (DPP-IV) inhibitors, viz. ABT-279, ABT-341, alogliptin, carmegliptin, sitagliptin and vildagliptin. The availability of intravenous and oral pharmacokinetic data in animals enabled the allometric scaling of the 6 DPP-IV inhibitors. The relationship between the main pharmacokinetic parameters [viz. volume of distribution (Vd) and clearance (CL)] and body weight was studied across three or four mammalian species, using double logarithmic plots, to predict the human CL and Vd by simple allometry. A simple allometric relationship, Y = aW^b, was found to be adequate for the prediction of intravenous and oral human clearance/volume of distribution for the DPP-IV inhibitors. The allometric equations for alogliptin, carmegliptin, sitagliptin, vildagliptin, ABT-279 and ABT-341 were 1.867W^0.780, 1.170W^0.756, 2.020W^0.529, 1.959W^0.847, 0.672W^1.016 and 1.077W^0.649, respectively, to predict intravenous clearance (CL), and the corresponding equations to predict intravenous volume of distribution (Vd) were 3.313W^0.987, 6.096W^0.992, 7.140W^0.805, 2.742W^0.941, 1.299W^0.695 and 5.370W^0.803. With the exception of a few discordant values, the exponent rule appeared to hold for CL (0.75) and Vd (1.0) for the predictions of the various DPP-IV inhibitors. Regardless of the route, the predicted values were within 2-3 fold of the observed values, and intravenous allometry was better than oral allometry. Simple allometry retrospectively predicted the reported human values of the gliptins with reasonable accuracy and could be used as a prospective tool for this class of drugs.
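The simple allometric relationship Y = aW^b quoted above lends itself to a direct numeric sketch. The snippet below evaluates the intravenous-clearance coefficients from the abstract at an assumed 70 kg human body weight; the 70 kg figure is an assumption for illustration, and the output units follow whatever units the source study used for its coefficients.

```python
# Simple allometry: predict a PK parameter from body weight via Y = a * W**b.

def allometric_predict(a, b, weight_kg):
    """Evaluate the allometric equation Y = a * W**b at a given body weight."""
    return a * weight_kg ** b

# (a, b) coefficients for intravenous clearance, as quoted in the abstract.
iv_clearance_equations = {
    "alogliptin": (1.867, 0.780),
    "carmegliptin": (1.170, 0.756),
    "sitagliptin": (2.020, 0.529),
    "vildagliptin": (1.959, 0.847),
    "ABT-279": (0.672, 1.016),
    "ABT-341": (1.077, 0.649),
}

for drug, (a, b) in iv_clearance_equations.items():
    cl = allometric_predict(a, b, 70.0)   # assumed 70 kg human body weight
    print(f"{drug}: predicted human IV CL ~ {cl:.1f} (units as in the source study)")
```

The exponent rule noted in the abstract (b near 0.75 for CL, near 1.0 for Vd) can be checked directly against the b values in the table.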

  17. Big Data

    DEFF Research Database (Denmark)

    Aaen, Jon; Nielsen, Jeppe Agger

    2016-01-01

    Big Data presents itself as one of the most hyped technological innovations of our time, proclaimed to contain the seeds of new, valuable operational insights for private companies and public organizations. While optimistic claims abound, research on Big Data in the public sector is...... so far limited. This article examines how the public health sector can reuse and exploit an ever-growing amount of data while taking public values into account. The article is based on a case study of the use of large amounts of health data in the Danish General Practice Database (Dansk AlmenMedicinsk Database, DAMD......). The analysis shows that (re)use of data in new contexts is a multifaceted trade-off not only between economic rationales and quality considerations, but also between control over sensitive personal data and the ethical implications for the citizen. In the DAMD case, data are used on the one hand "in the service of a good cause" to...

  18. Big data analytics turning big data into big money

    CERN Document Server

    Ohlhorst, Frank J

    2012-01-01

    Unique insights to implement big data analytics and reap big returns to your bottom line Focusing on the business and financial value of big data analytics, respected technology journalist Frank J. Ohlhorst shares his insights on the newly emerging field of big data analytics in Big Data Analytics. This breakthrough book demonstrates the importance of analytics, defines the processes, highlights the tangible and intangible values and discusses how you can turn a business liability into actionable material that can be used to redefine markets, improve profits and identify new business opportuni

  19. Pain management in emergency department: intravenous morphine vs. intravenous acetaminophen

    Directory of Open Access Journals (Sweden)

    Morteza Talebi Doluee

    2015-01-01

    Pain is the most common complaint in the emergency department, and there are several methods for its control, of which pharmaceutical methods are the most effective. Although intravenous morphine has been the most common choice for many years, it has some adverse effects. There has been considerable research on intravenous acetaminophen as an analgesic agent, and it appears to have good analgesic effects for various types of pain. We searched several electronic resources for clinical trials comparing the analgesic effects of intravenous acetaminophen vs. intravenous morphine for acute pain treatment in the emergency setting. In two clinical trials, the analgesic effect of intravenous acetaminophen was compared with that of intravenous morphine for renal colic; the results revealed no significant difference between the analgesic effects of the two medications. Another clinical trial revealed that intravenous acetaminophen has acceptable analgesic effects on post-cesarean section pain when combined with other analgesic medications. One study revealed that administration of intravenous acetaminophen, compared to placebo, before hysterectomy decreased morphine consumption via a patient-controlled analgesia pump and decreased side effects. Similarly, another study revealed that infusion of intravenous acetaminophen vs. placebo after orthopedic surgery decreased morphine consumption after the surgery. A clinical trial revealed that intravenous acetaminophen provided a level of analgesia comparable to intravenous morphine in isolated limb trauma, while causing fewer side effects than morphine. It appears that intravenous acetaminophen has good analgesic effects for visceral, traumatic and postoperative pain compared with intravenous morphine.

  20. Clinical Evaluation of Ciprofloxacin Intravenous Preparation ...

    African Journals Online (AJOL)

    The most common site of bacteria infection in humans is the urinary tract. For nosocomial infections it is the catheterized urinary tract. Compromised immune responses in hospitalized patients contribute to the difficulties encountered in treating their infections. In these patients, administration of intravenous antibiotic is ...

  1. Big data naturally rescaled

    International Nuclear Information System (INIS)

    Stoop, Ruedi; Kanders, Karlis; Lorimer, Tom; Held, Jenny; Albert, Carlo

    2016-01-01

    We propose that a handle could be put on big data by looking at the systems that actually generate the data, rather than the data itself, realizing that there may be only few generic processes involved in this, each one imprinting its very specific structures in the space of systems, the traces of which translate into feature space. From this, we propose a practical computational clustering approach, optimized for coping with such data, inspired by how the human cortex is known to approach the problem.

  2. On Big Data Benchmarking

    OpenAIRE

    Han, Rui; Lu, Xiaoyi

    2014-01-01

    Big data systems address the challenges of capturing, storing, managing, analyzing, and visualizing big data. Within this context, developing benchmarks to evaluate and compare big data systems has become an active topic for both research and industry communities. To date, most of the state-of-the-art big data benchmarks are designed for specific types of systems. Based on our experience, however, we argue that considering the complexity, diversity, and rapid evolution of big data systems, fo...

  3. Risk-benefit evaluation of fish from Chinese markets: Nutrients and contaminants in 24 fish species from five big cities and related assessment for human health

    Energy Technology Data Exchange (ETDEWEB)

    Du, Zhen-Yu, E-mail: zdu@nifes.no [National Institute of Nutrition and Seafood Research (NIFES), N-5817 Bergen (Norway); Zhang, Jian [National Institute of Nutrition and Seafood Research (NIFES), N-5817 Bergen (Norway); Institute of Nutrition and Food Safety, Chinese Center for Disease Control and Prevention, Beijing, 100050 (China); Department of Biomedicine, University of Bergen (Norway); Wang, Chunrong; Li, Lixiang; Man, Qingqing [Institute of Nutrition and Food Safety, Chinese Center for Disease Control and Prevention, Beijing, 100050 (China); Lundebye, Anne-Katrine; Froyland, Livar [National Institute of Nutrition and Seafood Research (NIFES), N-5817 Bergen (Norway)

    2012-02-01

    The risks and benefits of fish from markets in Chinese cities have not previously been fully evaluated. In the present study, 24 common fish species with more than 400 individual samples were collected from markets from five big Chinese cities in 2007. The main nutrients and contaminants were measured and the risk-benefit was evaluated based on recommended nutrient intakes and risk level criteria set by relevant authorities. The comprehensive effects of nutrients and contaminants in marine oily fish were also evaluated using the data of two related human dietary intervention trials performed in dyslipidemic Chinese men and women in 2008 and 2010, respectively. The results showed that concentrations of contaminants analyzed including DDT, PCB{sub 7}, arsenic and cadmium were much lower than their corresponding maximum limits with the exception of the mercury concentration in common carp. Concentrations of POPs and n-3 LCPUFA, mainly EPA and DHA, were positively associated with the lipid content of the fish. With a daily intake of 80-100 g marine oily fish, the persistent organic pollutants in fish would not counteract the beneficial effects of n-3 LCPUFA in reducing cardiovascular disease (CVD) risk markers. Marine oily fish provided more effective protection against CVD than lean fish, particularly for the dyslipidemic populations. The risk-benefit assessment based on the present daily aquatic product intake in Chinese urban residents (44.9 and 62.3 g for the average values for all cities and big cities, respectively) indicated that fish, particularly marine oily fish, can be regularly consumed to achieve optimal nutritional benefits from n-3 LCPUFA, without causing significant contaminant-related health risks. However, the potential health threat from contaminants in fish should still be emphasized for the populations consuming large quantities of fish, particularly wild fish. 
- Highlights: ► We collected 24 fish species with more than

  4. Risk–benefit evaluation of fish from Chinese markets: Nutrients and contaminants in 24 fish species from five big cities and related assessment for human health

    International Nuclear Information System (INIS)

    Du, Zhen-Yu; Zhang, Jian; Wang, Chunrong; Li, Lixiang; Man, Qingqing; Lundebye, Anne-Katrine; Frøyland, Livar

    2012-01-01

    The risks and benefits of fish from markets in Chinese cities have not previously been fully evaluated. In the present study, 24 common fish species with more than 400 individual samples were collected from markets from five big Chinese cities in 2007. The main nutrients and contaminants were measured and the risk–benefit was evaluated based on recommended nutrient intakes and risk level criteria set by relevant authorities. The comprehensive effects of nutrients and contaminants in marine oily fish were also evaluated using the data of two related human dietary intervention trials performed in dyslipidemic Chinese men and women in 2008 and 2010, respectively. The results showed that concentrations of contaminants analyzed including DDT, PCB 7 , arsenic and cadmium were much lower than their corresponding maximum limits with the exception of the mercury concentration in common carp. Concentrations of POPs and n-3 LCPUFA, mainly EPA and DHA, were positively associated with the lipid content of the fish. With a daily intake of 80–100 g marine oily fish, the persistent organic pollutants in fish would not counteract the beneficial effects of n-3 LCPUFA in reducing cardiovascular disease (CVD) risk markers. Marine oily fish provided more effective protection against CVD than lean fish, particularly for the dyslipidemic populations. The risk–benefit assessment based on the present daily aquatic product intake in Chinese urban residents (44.9 and 62.3 g for the average values for all cities and big cities, respectively) indicated that fish, particularly marine oily fish, can be regularly consumed to achieve optimal nutritional benefits from n-3 LCPUFA, without causing significant contaminant-related health risks. However, the potential health threat from contaminants in fish should still be emphasized for the populations consuming large quantities of fish, particularly wild fish. - Highlights: ► We collected 24 fish species with more than 400 individual samples

  5. Computed tomography intravenous cholangiography

    International Nuclear Information System (INIS)

    Nascimento, S.; Murray, W.; Wilson, P.

    1997-01-01

    Indications for direct visualization of the bile ducts include bile duct dilatation demonstrated by ultrasound or computed tomography (CT) scanning, where the cause of the bile duct dilatation is uncertain or where the anatomy of bile duct obstruction needs further clarification. Another indication is right upper quadrant pain, particularly in a post-cholecystectomy patient, where choledocholithiasis is suspected. A possible new indication is pre-operative evaluation prior to laparoscopic cholecystectomy. The bile ducts are usually studied by endoscopic retrograde cholangiopancreatography (ERCP), or, less commonly, trans-hepatic cholangiography. The old technique of intravenous cholangiography has fallen into disrepute because of inconsistent bile-duct opacification. The advent of spiral CT scanning has renewed interest in intravenous cholangiography. The CT technique is very sensitive to the contrast agent in the bile ducts, and angiographic and three-dimensional reconstructions of the biliary tree can readily be obtained using the CT intravenous cholangiogram technique (CT IVC). Seven patients have been studied using this CT IVC technique, between February 1995 and June 1996, and are the subject of the present report. Eight further studies have since been performed. The results suggest that CT IVC could replace ERCP as the primary means of direct cholangiography, where pancreatic duct visualization is not required. (authors)

  6. Computed tomography intravenous cholangiography

    Energy Technology Data Exchange (ETDEWEB)

    Nascimento, S.; Murray, W.; Wilson, P. [Pittwater Radiology, Dee Why, NSW, (Australia)

    1997-08-01

    Indications for direct visualization of the bile ducts include bile duct dilatation demonstrated by ultrasound or computed tomography (CT) scanning, where the cause of the bile duct dilatation is uncertain or where the anatomy of bile duct obstruction needs further clarification. Another indication is right upper quadrant pain, particularly in a post-cholecystectomy patient, where choledocholithiasis is suspected. A possible new indication is pre-operative evaluation prior to laparoscopic cholecystectomy. The bile ducts are usually studied by endoscopic retrograde cholangiopancreatography (ERCP), or, less commonly, trans-hepatic cholangiography. The old technique of intravenous cholangiography has fallen into disrepute because of inconsistent bile-duct opacification. The advent of spiral CT scanning has renewed interest in intravenous cholangiography. The CT technique is very sensitive to the contrast agent in the bile ducts, and angiographic and three-dimensional reconstructions of the biliary tree can readily be obtained using the CT intravenous cholangiogram technique (CT IVC). Seven patients have been studied using this CT IVC technique, between February 1995 and June 1996, and are the subject of the present report. Eight further studies have since been performed. The results suggest that CT IVC could replace ERCP as the primary means of direct cholangiography, where pancreatic duct visualization is not required. (authors). 11 refs., 6 figs.

  7. The International Big History Association

    Science.gov (United States)

    Duffy, Michael; Duffy, D'Neil

    2013-01-01

    IBHA, the International Big History Association, was organized in 2010 and "promotes the unified, interdisciplinary study and teaching of history of the Cosmos, Earth, Life, and Humanity." This is the vision that Montessori embraced long before the discoveries of modern science fleshed out the story of the evolving universe. "Big…

  8. Big data for bipolar disorder.

    Science.gov (United States)

    Monteith, Scott; Glenn, Tasha; Geddes, John; Whybrow, Peter C; Bauer, Michael

    2016-12-01

    The delivery of psychiatric care is changing with a new emphasis on integrated care, preventative measures, population health, and the biological basis of disease. Fundamental to this transformation are big data and advances in the ability to analyze these data. The impact of big data on the routine treatment of bipolar disorder today and in the near future is discussed, with examples that relate to health policy, the discovery of new associations, and the study of rare events. The primary sources of big data today are electronic medical records (EMR), claims, and registry data from providers and payers. In the near future, data created by patients from active monitoring, passive monitoring of Internet and smartphone activities, and from sensors may be integrated with the EMR. Diverse data sources from outside of medicine, such as government financial data, will be linked for research. Over the long term, genetic and imaging data will be integrated with the EMR, and there will be more emphasis on predictive models. Many technical challenges remain when analyzing big data that relates to size, heterogeneity, complexity, and unstructured text data in the EMR. Human judgement and subject matter expertise are critical parts of big data analysis, and the active participation of psychiatrists is needed throughout the analytical process.

  9. Big Data, Big Problems: A Healthcare Perspective.

    Science.gov (United States)

    Househ, Mowafa S; Aldosari, Bakheet; Alanazi, Abdullah; Kushniruk, Andre W; Borycki, Elizabeth M

    2017-01-01

    Much has been written on the benefits of big data for healthcare, such as improving patient outcomes, public health surveillance, and healthcare policy decisions. Over the past five years, Big Data, and the data sciences field in general, has been hyped as the "Holy Grail" for the healthcare industry, promising a more efficient healthcare system with improved healthcare outcomes. More recently, however, healthcare researchers have been exposing the potentially harmful effects Big Data can have on patient care, associating it with increased medical costs, patient mortality, and misguided decision making by clinicians and healthcare policy makers. In this paper, we review current Big Data trends with a specific focus on the inadvertent negative impacts that Big Data could have on healthcare in general and on patient and clinical care in particular. Our study results show that although Big Data is built up to be the "Holy Grail" for healthcare, small data techniques using traditional statistical methods are, in many cases, more accurate and can lead to better healthcare outcomes than Big Data methods. In sum, Big Data for healthcare may cause more problems for the healthcare industry than solutions, and in short, when it comes to the use of data in healthcare, "size isn't everything."

  10. Stalin's Big Fleet Program

    National Research Council Canada - National Science Library

    Mauner, Milan

    2002-01-01

    Although Dr. Milan Hauner's study 'Stalin's Big Fleet program' has focused primarily on the formation of Big Fleets during the Tsarist and Soviet periods of Russia's naval history, there are important lessons...

  11. Big Game Reporting Stations

    Data.gov (United States)

    Vermont Center for Geographic Information — Point locations of big game reporting stations. Big game reporting stations are places where hunters can legally report harvested deer, bear, or turkey. These are...

  12. A simple reason for a big difference: wolves do not look back at humans, but dogs do.

    Science.gov (United States)

    Miklósi, Adám; Kubinyi, Enikö; Topál, József; Gácsi, Márta; Virányi, Zsófia; Csányi, Vilmos

    2003-04-29

    The present investigations were undertaken to compare the interspecific communicative abilities of dogs and wolves that were socialized to humans at comparable levels. The first study demonstrated that socialized wolves were able to locate hidden food indicated by the touching and, to some extent, pointing cues provided by a familiar human experimenter, but their performance remained inferior to that of dogs. In the second study, we found that, after undergoing training to solve a simple manipulation task, dogs that are faced with an insoluble version of the same problem look/gaze at the human, while socialized wolves do not. Based on these observations, we suggest that the key difference between dog and wolf behavior is the dogs' ability to look at the human's face. Since looking behavior has an important function in initializing and maintaining communicative interaction in human communication systems, we suppose that through positive feedback processes (both evolutionarily and ontogenetically) the readiness of dogs to look at the human face has led to complex forms of dog-human communication that cannot be achieved in wolves even after extended socialization.

  13. Social big data mining

    CERN Document Server

    Ishikawa, Hiroshi

    2015-01-01

    Social Media. Big Data and Social Data. Hypotheses in the Era of Big Data. Social Big Data Applications. Basic Concepts in Data Mining. Association Rule Mining. Clustering. Classification. Prediction. Web Structure Mining. Web Content Mining. Web Access Log Mining, Information Extraction and Deep Web Mining. Media Mining. Scalability and Outlier Detection.

  14. Intravenous lidocaine infusion.

    Science.gov (United States)

    Soto, G; Naranjo González, M; Calero, F

    2018-02-26

    Systemic lidocaine given as a continuous infusion during the peri-operative period has analgesic, anti-hyperalgesic, and anti-inflammatory properties. This makes it capable of reducing the use of opioids and inhalational anaesthetics, promoting the early return of bowel function, and shortening hospital stay. The aim of this narrative review was to highlight the pharmacology and indications for clinical application, along with new and interesting research areas. The clinical applications of peri-operative lidocaine infusion have been examined in several recent systematic reviews and meta-analyses in patients undergoing open and laparoscopic abdominal procedures, ambulatory procedures, and other types of surgery. Peri-operative lidocaine infusion may be a useful analgesic adjunct in enhanced recovery protocols. Potential benefits of intravenous lidocaine in chronic post-surgical pain, post-operative cognitive dysfunction, and cancer recurrence are under investigation. Given its immunomodulatory effect on the surgical stress response, current evidence suggests that intravenous lidocaine could be used in the context of multimodal analgesia. Copyright © 2018 Sociedad Española de Anestesiología, Reanimación y Terapéutica del Dolor. Publicado por Elsevier España, S.L.U. All rights reserved.

  15. Immunoglobulin (Ig)G purified from human sera mirrors intravenous Ig human leucocyte antigen (HLA) reactivity and recognizes one's own HLA types, but may be masked by Fab complementarity-determining region peptide in the native sera.

    Science.gov (United States)

    Ravindranath, M H; Terasaki, P I; Maehara, C Y; Jucaud, V; Kawakita, S; Pham, T; Yamashita, W

    2015-02-01

    Intravenous immunoglobulin (IVIg) reacted with a wide array of human leucocyte antigen (HLA) alleles, in contrast to normal sera, due possibly to the purification of IgG from the pooled plasma. The reactivity of IgG purified from normal sera was compared with that of native sera to determine whether any serum factors mask the HLA reactivity of anti-HLA IgG and whether IgG purified from sera can recognize the HLA types of the corresponding donors. The purified IgG, unlike native sera, mirrored IVIg reactivity to a wide array of HLA-I/-II alleles, indicating that anti-HLA IgG may be masked in normal sera - either by peptides derived from soluble HLA or by those from antibodies. HLA peptides masked HLA recognition by the purified IgG. Most importantly, some of the anti-HLA IgG purified from normal sera - and serum IgG from a few donors - indeed recognized the HLA types of the corresponding donors, confirming the presence of auto-HLA antibodies. Comparison of HLA types with the profile of HLA antibodies showed auto-HLA IgG to the donors' HLA antigens in this order of frequency: DPA (80%), DQA (71%), DRB345 (67%), DQB (57%), Cw (50%), DPB (43%), DRB1 (21%), A (14%) and B (7%). The auto-HLA antibodies, when unmasked in vivo, may perform immunoregulatory functions similar to those of therapeutic preparations of IVIg. © 2014 British Society for Immunology.

  16. [Effect of compound danshen dripping pill combined with intravenous transplantation of human umbilical cord blood mononuclear cells on local inflammatory response in the myocardium of rabbits with acute myocardial infarction].

    Science.gov (United States)

    Deng, Liu-xia; Yu, Guo-long; Al, Qi; Yuan, Chun-ju

    2013-11-01

    To investigate effect of Compound Danshen Dripping Pill (CDDP) on the inflammatory response of the myocardium of acute myocardial infarction (AMI) rabbits, to observe the therapeutic effect of CDDP combined intravenous transplantation of human umbilical cord blood mononuclear cells (HUCBMCs) on inflammatory response, pro-inflammatory cytokine tumor necrosis factor alpha (TNF-alpha) , and heart function in the myocardium of AMI rabbits, and to explore the possible protective mechanisms of the combined therapy. The AMI model was successfully established by ligation of the left anterior coronary artery (LAD) in 40 healthy rabbits.Then they were randomly divided into four groups, i.e., the control group, the CDDP group, the transplantation group, and the combined group, 10 in each group. Rabbits in the control group received intravenous injection of 0.5 mL normal saline via ear vein within 24 h after AMI and then intragastric infusion of normal saline at 5 mL per day. Rabbits in the CDDP group received intravenous injection of 0.5 mL normal saline via ear vein within 24 h after AMI and then intragastric infusion of solution obtained by solving 270 mg CDDP in 5 mL normal saline per day. Rabbits in the transplantation group received intravenous injection of 0.5 mL normal saline labeled with green fluorescent protein (GFP) containing 3 x 10(7) of HUCBMCs via ear vein within 24 h after AMI and then intragastric infusion of normal saline at 5 mL per day. Rabbits in the combined group received intravenous injection of 0.5 mL normal saline labeled with GFP containing 3 x 10(7) of HUCBMCs via ear vein within 24 h after AMI and then intragastric infusion of solution obtained by solving 270 mg CDDP in 5 mL normal saline per day. 
At weeks 1 and 4 after treatment, cardiac function indices such as left ventricular fractional shortening (LVFS) and left ventricular ejection fraction (LVEF) were measured by echocardiography; the number of transplanted cells in the myocardium was found

  17. Big data computing

    CERN Document Server

    Akerkar, Rajendra

    2013-01-01

    Due to market forces and technological evolution, Big Data computing is developing at an increasing rate. A wide variety of novel approaches and tools have emerged to tackle the challenges of Big Data, creating both more opportunities and more challenges for students and professionals in the field of data computation and analysis. Presenting a mix of industry cases and theory, Big Data Computing discusses the technical and practical issues related to Big Data in intelligent information management. Emphasizing the adoption and diffusion of Big Data tools and technologies in industry, the book i

  18. Microsoft big data solutions

    CERN Document Server

    Jorgensen, Adam; Welch, John; Clark, Dan; Price, Christopher; Mitchell, Brian

    2014-01-01

    Tap the power of Big Data with Microsoft technologies Big Data is here, and Microsoft's new Big Data platform is a valuable tool to help your company get the very most out of it. This timely book shows you how to use HDInsight along with HortonWorks Data Platform for Windows to store, manage, analyze, and share Big Data throughout the enterprise. Focusing primarily on Microsoft and HortonWorks technologies but also covering open source tools, Microsoft Big Data Solutions explains best practices, covers on-premises and cloud-based solutions, and features valuable case studies. Best of all,

  19. Privacy as human flourishing: could a shift towards virtue ethics strengthen privacy protection in the age of Big Data?

    NARCIS (Netherlands)

    van der Sloot, B.

    2014-01-01

    Privacy is commonly seen as an instrumental value in relation to negative freedom, human dignity and personal autonomy. Article 8 ECHR, protecting the right to privacy, was originally coined as a doctrine protecting the negative freedom of citizens in vertical relations, that is between citizen and

  20. Big Data: Progress or a Big Headache?

    Science.gov (United States)

    Dhawan, Aman; Brand, Jefferson C; Rossi, Michael J; Lubowitz, James H

    2018-03-01

    Large administrative database, or "big data" research studies can include an immense number of patients. Strengths of research based on big data include generalizability resulting from diverse patients, diverse providers, and diverse clinical settings. Limitations of research based on large administrative databases may include indeterminate quality and obscure purpose of data entry, lack of information regarding confounding variables, and suboptimal clinical outcome measures. Thus, research conclusions based on big data must be scrutinized in a discerning and critical manner. Copyright © 2018 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.

  1. Intravenous fluids: balancing solutions.

    Science.gov (United States)

    Hoorn, Ewout J

    2017-08-01

    The topic of intravenous (IV) fluids may be regarded as "reverse nephrology", because nephrologists usually treat to remove fluids rather than to infuse them. However, because nephrology is deeply rooted in fluid, electrolyte, and acid-base balance, IV fluids belong in the realm of our specialty. The field of IV fluid therapy is in motion due to the increasing use of balanced crystalloids, partly fueled by the advent of new solutions. This review aims to capture these recent developments by critically evaluating the current evidence base. It reviews both the indications for and complications of IV fluid therapy, including the characteristics of the currently available solutions. It also covers the use of IV fluids in specific settings such as kidney transplantation and pediatrics. Finally, this review addresses the pathogenesis of saline-induced hyperchloremic acidosis, its potential effect on outcomes, and the question of whether this should lead to a definitive switch to balanced solutions.

  2. Intravenous versus oral etoposide

    DEFF Research Database (Denmark)

    Ali, Abir Salwa; Grönberg, Malin; Langer, Seppo W.

    2018-01-01

    High-grade gastroenteropancreatic neuroendocrine neoplasms (GEP-NENs, G3) are aggressive cancers of the digestive system with poor prognosis and survival. Platinum-based chemotherapy (cisplatin/carboplatin + etoposide) is considered the first-line palliative treatment. Etoposide is frequently...... administered intravenously; however, oral etoposide may be used as an alternative. Concerns for oral etoposide include decreased bioavailability, inter- and intra-patient variability and patient compliance. We aimed to evaluate possible differences in progression-free survival (PFS) and overall survival (OS......) in patients treated with oral etoposide compared to etoposide given as infusion. Patients (n = 236) from the Nordic NEC study were divided into three groups receiving etoposide as a long infusion (24 h, n = 170), short infusion (≤ 5 h, n = 33) or oral etoposide (n = 33) according to hospital tradition. PFS...

  3. Intraarterial reteplase and intravenous abciximab for treatment of acute ischemic stroke. A preliminary feasibility and safety study in a non-human primate model

    International Nuclear Information System (INIS)

    Qureshi, Adnan I.; Suri, M. Fareed K.; Ali, Zulfiqar; Ringer, Andrew J.; Boulos, Alan S.; Guterman, Lee R.; Hopkins, L. Nelson; Nakada, Marian T.; Alberico, Ronald A.; Martin, Lisa B.E.

    2005-01-01

    We performed a preliminary feasibility and safety study using intravenous (IV) administration of a platelet glycoprotein IIb/IIIa inhibitor (abciximab) in conjunction with intraarterial (IA) administration of a thrombolytic agent (reteplase) in a primate model of intracranial thrombosis. We introduced thrombus through superselective catheterization of the intracranial segment of the internal carotid artery in 16 primates. The animals were randomly assigned to receive IA reteplase and IV abciximab (n =4), IA reteplase and IV placebo (n =4), IA placebo and IV abciximab (n =4) or IA and IV placebo (n =4). Recanalization was assessed by serial angiography during the 6-h period after initiation of treatment. Postmortem magnetic resonance (MR) imaging was performed to determine the presence of cerebral infarction or intracranial hemorrhage. Partial or complete recanalization at 6 h after initiation of treatment (decrease of two or more points in pre-treatment angiographic occlusion grade) was observed in two animals treated with IA reteplase and IV abciximab, three animals treated with IA reteplase alone and one animal treated with IV abciximab alone. No improvement in perfusion was observed in animals that received IV and IA placebo. Cerebral infarction was demonstrated on postmortem MR imaging in three animals that received IA and IV placebo and in one animal each from the groups that received IA reteplase and IV abciximab or IV abciximab alone. One animal that received IV abciximab alone had a small intracerebral hemorrhage on MR imaging. (orig.)

  4. Comparative removal of solvent and detergent viral inactivating agents from human intravenous immunoglobulin G preparations using SDR HyperD and C18 sorbents.

    Science.gov (United States)

    Burnouf, Thierry; Sayed, Makram A; Radosevich, Miryana; El-Ekiaby, Magdy

    2009-06-01

    The capacity of hydrophobic octadecyl (C18) and SDR HyperD materials to remove the combination of 1% (v/v) solvent (tri-n-butyl phosphate, TnBP) with 1% (v/v) nonionic detergents (Triton X-100 and Triton X-45) used for viral inactivation of plasma-derived polyvalent intravenous immunoglobulin G (IVIG) preparation has been evaluated. Efficient removal of TnBP (SDR HyperD/7 ml of IVIG. Binding capacities of TnBP were greater than 140 mg/g of C18 and greater than 318 mg/g of dry SDR HyperD. Complete removal of Triton X-45 (SDR HyperD/7 ml of IVIG or above, corresponding to binding capacities in excess of 70 mg/g of C18 and in excess of 159 mg/g of dry SDR HyperD. Residual Triton X-100 was less than 30 ppm at a ratio of 4 g/14 ml of immunoglobulin G (IgG) for the C18 sorbent. Triton X-100 was less than 10 ppm when using SDR HyperD at a ratio of 0.66 g/7 ml of IgG, corresponding to a binding capacity of approximately 106 mg of Triton X-100/g of dry SDR HyperD. Good recoveries of IVIG were achieved in the effluent from both sorbents.

  5. Big Data Mining in the Cloud

    OpenAIRE

    Shi, Zhongzhi

    2012-01-01

    Part 1: Keynote Presentations; International audience; Big Data is the growing challenge that organizations face as they deal with large and fast-growing sources of data or information that also present a complex range of analysis and use problems. Digital data production in many fields of human activity from science to enterprise is characterized by an exponential growth. Big data technologies will become a new generation of technologies and architectures which is beyond the ability of commo...

  6. The big bang

    International Nuclear Information System (INIS)

    Chown, Marcus.

    1987-01-01

    The paper concerns the 'Big Bang' theory of the creation of the Universe 15 thousand million years ago, and traces events which physicists predict occurred soon after the creation. Unified theory of the moment of creation, evidence of an expanding Universe, the X-boson -the particle produced very soon after the big bang and which vanished from the Universe one-hundredth of a second after the big bang, and the fate of the Universe, are all discussed. (U.K.)

  7. Ultrasonography versus intravenous urography

    International Nuclear Information System (INIS)

    Aslaksen, A.

    1991-01-01

    The present study was performed to compare the clinical value of urography and ultrasonography in a non-selected group of patients referred for urography to a university hospital. The conclusions and clinical implications of the study are as follows: Intravenous urography remains the cornerstone imaging examination in the evaluation of ureteral calculi. Ultrasonography is a valuable adjunct in cases of non-visualization of the kidneys, in distal obstruction and in known contrast media allergy. When women with recurrent urinary tract infection are referred for imaging of the urinary tract, ultrasonography should be used. Ultrasonography should replace urography for screening of non-acute hydronephrosis, as in female genital cancer and benign prostate hyperplasia. There is good correlation between urography and ultrasonography in assessing the degree of hydronephrosis. However, more research on the relationship between hydronephrosis and obstruction is necessary. Ultrasonography should be used as the only imaging method of the upper urinary tract in patients with microscopic hematuria. In patients less than 50 years with macroscopic hematuria, ultrasonography should be used as the only imaging of the upper urinary tract, and an examination of the urinary bladder should be included. In patients over 50 years, urography supplemented with ultrasonography should be used, but more research is necessary on the subject of imaging method and age. 158 refs

  8. Summary big data

    CERN Document Server

    2014-01-01

    This work offers a summary of the book "Big Data: A Revolution That Will Transform How We Live, Work, and Think" by Viktor Mayer-Schönberger and Kenneth Cukier. The summary explains that big data is where we use huge quantities of data to make better predictions based on the patterns we identify in the data, rather than trying to understand the underlying causes in more detail. It highlights that big data will be a source of new economic value and innovation in the future. Moreover, it shows that it will

  9. Intravenous immunoglobulin therapy for refractory recurrent pericarditis.

    Science.gov (United States)

    del Fresno, M Rosa; Peralta, Julio E; Granados, Miguel Ángel; Enríquez, Eugenia; Domínguez-Pinilla, Nerea; de Inocencio, Jaime

    2014-11-01

    Recurrent pericarditis is a troublesome complication of idiopathic acute pericarditis and occurs more frequently in pediatric patients after cardiac surgery (postpericardiotomy syndrome). Conventional treatment with nonsteroidal antiinflammatory drugs, corticosteroids, and colchicine is not always effective or may cause serious adverse effects. There is no consensus, however, on how to proceed in those patients whose disease is refractory to conventional therapy. In such cases, human intravenous immunoglobulin, immunosuppressive drugs, and biological agents have been used. In this report we describe 2 patients with refractory recurrent pericarditis after cardiac surgery who were successfully treated with monthly high-dose (2 g/kg) intravenous immunoglobulin (3 and 5 doses, respectively) until resolution of the effusion. Our experience supports the effectiveness and safety of this therapy. Copyright © 2014 by the American Academy of Pediatrics.

  10. Digital humanitarians how big data is changing the face of humanitarian response

    CERN Document Server

    Meier, Patrick

    2015-01-01

    Contents: The Rise of Digital Humanitarians; Mapping Haiti Live; Supporting Search and Rescue Efforts; Preparing for the Long Haul; Launching an SMS Life Line; Sending in the Choppers; OpenStreetMap to the Rescue; Post-Disaster Phase; The Human Story; Doing Battle with Big Data; Rise of Digital Humanitarians; This Book and You; The Rise of Big (Crisis) Data; Big (Size) Data; Finding Needles in Big (Size) Data; Policy, Not Simply Technology; Big (False) Data; Unpacking Big (False) Data; Calling 991 and 999; Big (

  11. What do Big Data do in Global Governance?

    DEFF Research Database (Denmark)

    Krause Hansen, Hans; Porter, Tony

    2017-01-01

    Two paradoxes associated with big data are relevant to global governance. First, while promising to increase the capacities of humans in governance, big data also involve an increasingly independent role for algorithms, technical artifacts, the Internet of things, and other objects, which can reduce the control of human actors. Second, big data involve new boundary transgressions as data are brought together from multiple sources, while also creating new boundary conflicts as powerful actors seek to gain advantage by controlling big data and excluding competitors. These changes are not just about new data sources for global decision-makers, but instead signal more profound changes in the character of global governance.

  12. Bliver big data til big business?

    DEFF Research Database (Denmark)

    Ritter, Thomas

    2015-01-01

    Denmark has a digital infrastructure, a registration culture, and IT-competent employees and customers that make a leading position possible, but only if companies get ready for the next big data wave.

  13. Big Data Analytics

    Indian Academy of Sciences (India)

    This is called big data. It is possible to analyse such huge data collections with clusters of thousands of inexpensive computers to discover patterns in the data that have many applications. But analysing massive amounts of data available in the Internet has the potential of impinging on our privacy. Inappropriate analysis of big ...

  14. Big Data and Neuroimaging.

    Science.gov (United States)

    Webb-Vargas, Yenny; Chen, Shaojie; Fisher, Aaron; Mejia, Amanda; Xu, Yuting; Crainiceanu, Ciprian; Caffo, Brian; Lindquist, Martin A

    2017-12-01

    Big Data are of increasing importance in a variety of areas, especially in the biosciences. There is an emerging critical need for Big Data tools and methods, because of the potential impact of advancements in these areas. Importantly, statisticians and statistical thinking have a major role to play in creating meaningful progress in this arena. We would like to emphasize this point in this special issue, as it highlights both the dramatic need for statistical input for Big Data analysis and for a greater number of statisticians working on Big Data problems. We use the field of statistical neuroimaging to demonstrate these points. As such, this paper covers several applications and novel methodological developments of Big Data tools applied to neuroimaging data.

  15. Reframing Open Big Data

    DEFF Research Database (Denmark)

    Marton, Attila; Avital, Michel; Jensen, Tina Blegind

    2013-01-01

    Recent developments in the techniques and technologies of collecting, sharing and analysing data are challenging the field of information systems (IS) research, let alone the boundaries of organizations and the established practices of decision-making. Coined 'open data' and 'big data', these developments introduce an unprecedented level of societal and organizational engagement with the potential of computational data to generate new insights and information. Based on the commonalities shared by open data and big data, we develop a research framework that we refer to as open big data (OBD), drawing on other categories that stem from computer science and engineering, namely 'big/small' and 'open/closed', to address the complex interplay between people and data, social interaction and technological operations. Thus conceived, this paper contributes an alternative approach for the study of open and big data.

  16. Methods of preparing and using intravenous nutrient compositions

    International Nuclear Information System (INIS)

    Beigler, M.A.; Koury, A.J.

    1983-01-01

    A method for preparing a stable, dry-packaged, sterile, nutrient composition which upon addition of sterile, pyrogen-free water is suitable for intravenous administration to a mammal, including a human, is described. The method comprises providing the nutrients in a specific dry form and state of physical purity acceptable for intravenous administration, sealing the nutrients in a particular type of container adapted to receive and dispense sterile fluids and subjecting the container and its sealed contents to a sterilizing, nondestructive dose of ionizing radiation. The method results in a packaged, sterile nutrient composition which may be dissolved by the addition of sterile pyrogen-free water. The resulting aqueous intravenous solution may be safely administered to a mammal in need of nutrient therapy. The packaged nutrient compositions of the invention exhibit greatly extended storage life and provide an economical method of providing intravenous solutions which are safe and efficacious for use. (author)

  17. Causes of intravenous medication errors: an ethnographic study

    OpenAIRE

    Taxis, K; Barber, N

    2003-01-01

    Background: Intravenous (IV) medication errors are frequent events. They are associated with considerable harm, but little is known about their causes. Human error theory is increasingly used to understand adverse events in medicine, but has not yet been applied to study IV errors. Our aim was to investigate causes of errors in IV drug preparation and administration using a framework of human error theory.

  18. Big data, big knowledge: big data for personalized healthcare.

    Science.gov (United States)

    Viceconti, Marco; Hunter, Peter; Hose, Rod

    2015-07-01

    The idea that the purely phenomenological knowledge that we can extract by analyzing large amounts of data can be useful in healthcare seems to contradict the desire of VPH researchers to build detailed mechanistic models for individual patients. But in practice no model is ever entirely phenomenological or entirely mechanistic. We propose in this position paper that big data analytics can be successfully combined with VPH technologies to produce robust and effective in silico medicine solutions. In order to do this, big data technologies must be further developed to cope with some specific requirements that emerge from this application. Such requirements are: working with sensitive data; analytics of complex and heterogeneous data spaces, including nontextual information; distributed data management under security and performance constraints; specialized analytics to integrate bioinformatics and systems biology information with clinical observations at tissue, organ and organisms scales; and specialized analytics to define the "physiological envelope" during the daily life of each patient. These domain-specific requirements suggest a need for targeted funding, in which big data technologies for in silico medicine becomes the research priority.

  19. Big Data in industry

    Science.gov (United States)

    Latinović, T. S.; Preradović, D. M.; Barz, C. R.; Latinović, M. T.; Petrica, P. P.; Pop-Vadean, A.

    2016-08-01

    The amount of data at the global level has grown exponentially. Along with this phenomenon comes the need for new units of measure such as the exabyte, zettabyte, and yottabyte, the last of these being the largest current unit for measuring amounts of data. This growth creates a situation in which the classic systems for the collection, storage, processing, and visualization of data are losing the battle against the amount, speed, and variety of data that is generated continuously. Much of this data is created by the Internet of Things, IoT (cameras, satellites, cars, GPS navigation, etc.). Our challenge is to come up with new technologies and tools for the management and exploitation of these large amounts of data. Big Data has been a hot topic in IT circles in recent years; it is also recognized in the business world and, increasingly, in public administration. This paper proposes an ontology of big data analytics and examines how to enhance business intelligence through big data analytics as a service by presenting a big data analytics service-oriented architecture. It also discusses the interrelationship between business intelligence and big data analytics. The proposed approach might facilitate the research and development of business analytics, big data analytics, and business intelligence, as well as intelligent agents.
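    The unit ladder mentioned in the abstract (exabyte, zettabyte, yottabyte) is simply successive factors of 1,000. A minimal sketch, using the standard decimal (SI) prefixes rather than any figures from the paper, converts a raw byte count to a human-readable unit:

```python
# Decimal (SI) byte units: each step up the ladder is a factor of 1000.
UNITS = ["B", "kB", "MB", "GB", "TB", "PB", "EB", "ZB", "YB"]

def human_readable(num_bytes: float) -> str:
    """Render a byte count using the largest unit that keeps the value below 1000."""
    value = float(num_bytes)
    for unit in UNITS:
        if value < 1000 or unit == UNITS[-1]:
            return f"{value:.1f} {unit}"
        value /= 1000

print(human_readable(2.5e21))  # 2.5 x 10^21 bytes -> "2.5 ZB"
print(human_readable(512))     # stays in plain bytes -> "512.0 B"
```

The yottabyte is the top of the SI ladder, so anything at 1000 YB or beyond simply stays expressed in YB.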

  20. Big data a primer

    CERN Document Server

    Bhuyan, Prachet; Chenthati, Deepak

    2015-01-01

    This book is a collection of chapters written by experts on various aspects of big data. The book aims to explain what big data is and how it is stored and used. The book starts from  the fundamentals and builds up from there. It is intended to serve as a review of the state-of-the-practice in the field of big data handling. The traditional framework of relational databases can no longer provide appropriate solutions for handling big data and making it available and useful to users scattered around the globe. The study of big data covers a wide range of issues including management of heterogeneous data, big data frameworks, change management, finding patterns in data usage and evolution, data as a service, service-generated data, service management, privacy and security. All of these aspects are touched upon in this book. It also discusses big data applications in different domains. The book will prove useful to students, researchers, and practicing database and networking engineers.

  1. Recht voor big data, big data voor recht

    NARCIS (Netherlands)

    Lafarre, Anne

    Big data is a phenomenon that can no longer be ignored in our society. It is past the hype cycle, and the first implementations of big data techniques are being carried out. But what exactly is big data? What do the five V's that are often mentioned in relation to big data stand for? As an introduction to

  2. Big data for dummies

    CERN Document Server

    Hurwitz, Judith; Halper, Fern; Kaufman, Marcia

    2013-01-01

    Find the right big data solution for your business or organization Big data management is one of the major challenges facing business, industry, and not-for-profit organizations. Data sets such as customer transactions for a mega-retailer, weather patterns monitored by meteorologists, or social network activity can quickly outpace the capacity of traditional data management tools. If you need to develop or manage big data solutions, you'll appreciate how these four experts define, explain, and guide you through this new and often confusing concept. You'll learn what it is, why it m

  3. Assessing Big Data

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2015-01-01

    In recent years, big data has been one of the most controversially discussed technologies in terms of its possible positive and negative impact. Therefore, the need for technology assessments is obvious. This paper first provides, based on the results of a technology assessment study, an overview of the potential and challenges associated with big data, and then describes the problems experienced during the study as well as methods found helpful to address them. The paper concludes with reflections on how the insights from the technology assessment study may have an impact on the future governance of big data.

  4. Big bang nucleosynthesis

    International Nuclear Information System (INIS)

    Boyd, Richard N.

    2001-01-01

    The precision of measurements in modern cosmology has made huge strides in recent years, with measurements of the cosmic microwave background and the determination of the Hubble constant now rivaling the level of precision of the predictions of big bang nucleosynthesis. However, these results are not necessarily consistent with the predictions of the Standard Model of big bang nucleosynthesis. Reconciling these discrepancies may require extensions of the basic tenets of the model, and possibly of the reaction rates that determine the big bang abundances

  5. Big Data ethics

    Directory of Open Access Journals (Sweden)

    Andrej Zwitter

    2014-11-01

    The speed of development in Big Data and associated phenomena, such as social media, has surpassed the capacity of the average consumer to understand his or her actions and their knock-on effects. We are moving towards changes in how ethics has to be perceived: away from individual decisions with specific and knowable outcomes, towards actions by many unaware that they may have taken actions with unintended consequences for anyone. Responses will require a rethinking of ethical choices, the lack thereof and how this will guide scientists, governments, and corporate agencies in handling Big Data. This essay elaborates on the ways Big Data impacts on ethical conceptions.

  6. Pyeloureteral visualization using glucagon during intravenous urography

    International Nuclear Information System (INIS)

    Nepper-Rasmussen, J.; Nielsen, P.H.; Kruse, V.

    1983-01-01

    194 adult patients were subjected to intravenous urography. In order to study the effect of glucagon on the visualization of the pyeloureteral system, IVUs were performed in four different ways: I. with abdominal compression, II. with glucagon 1 mg i.v., III. without abdominal compression and without glucagon, and IV. with abdominal compression and glucagon 1 mg i.v. Coded objective and subjective analyses showed significantly worsened visualization of the pyelocalyceal systems when IVU was performed with glucagon alone. Ureteral visualization was equal in all four groups. Glucagon fails as a pharmacological alternative to abdominal compression in adult human subjects. (orig.)

  7. Intravenous pyogenic granuloma or intravenous lobular capillary hemangioma

    Energy Technology Data Exchange (ETDEWEB)

    Ghekiere, Olivier; Galant, Christine; Berg, Bruno Vande [Cliniques Universitaires St. Luc, Department of Radiology, Brussels (Belgium)

    2005-06-01

    Lobular capillary hemangioma is a vascular neoplasm that commonly occurs as a cutaneous tumor. When it involves the skin and mucosal surfaces, ulceration and suppuration may occur, hence the classic term of pyogenic granuloma. Intravenous pyogenic granuloma is a rare solitary form of lobular capillary hemangioma that usually occurs in the veins of the neck and upper extremities. We report the ultrasonographic and magnetic resonance imaging findings of a pyogenic intravenous granuloma localized in the right cephalic vein. The imaging and pathological findings and the differential diagnoses are discussed. (orig.)

  8. Big data opportunities and challenges

    CERN Document Server

    2014-01-01

    This ebook aims to give practical guidance for all those who want to understand big data better and learn how to make the most of it. Topics range from big data analysis, mobile big data and managing unstructured data to technologies, governance and intellectual property and security issues surrounding big data.

  9. Big Data in der Cloud

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2014-01-01

    Technology assessment of big data, in particular cloud based big data services, for the Office for Technology Assessment at the German federal parliament (Bundestag).

  10. Small Big Data Congress 2017

    NARCIS (Netherlands)

    Doorn, J.

    2017-01-01

    TNO, in collaboration with the Big Data Value Center, presents the fourth Small Big Data Congress! Our congress aims at providing an overview of practical and innovative applications based on big data. Do you want to know what is happening in applied research with big data? And what can already be

  11. A liquid chromatography-tandem mass spectrometry method for the simultaneous quantification of escin Ia and escin Ib in human plasma: application to a pharmacokinetic study after intravenous administration.

    Science.gov (United States)

    Liu, Lidong; Wu, Xiujun; Wu, Dan; Wang, Yingwu; Li, Pengfei; Sun, Yantong; Yang, Yan; Gu, Jingkai; Cui, Yimin

    2010-12-01

    A rapid and sensitive liquid chromatography-tandem mass spectrometry (LC-MS/MS) method was developed and validated for simultaneous quantification of escin Ia and escin Ib in human plasma. After a solid-phase extraction (SPE), the analytes were separated on a Zorbax Extend C(18) column by isocratic elution with a mobile phase of methanol-acetonitrile-10 mM ammonium acetate (27:27:46, v/v/v) at a flow rate of 1.0 mL/min and analyzed by mass spectrometry in the positive ion multiple reaction monitoring mode. The precursor-to-product ion transition of m/z 1131.8 → 807.6 was used to quantify both escin Ia and escin Ib. Good linearity was achieved over a wide range of 2.00-900 ng/mL for escin Ia and 1.50-662 ng/mL for escin Ib. The intra- and inter-day precisions (as relative standard deviation) were less than 11% for each QC level of escin Ia and escin Ib. The accuracies (as relative error) were within ±5.27% for escin Ia and within ±4.07% for escin Ib. The method was successfully employed in a pharmacokinetic study after a single intravenous infusion administration of sodium aescinate injection containing 10 mg escin to each of the 10 healthy volunteers. Copyright © 2010 John Wiley & Sons, Ltd.
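    The precision and accuracy figures quoted above follow the usual bioanalytical definitions: precision is the relative standard deviation across QC replicates, and accuracy is the relative error of the mean against the nominal concentration. A minimal sketch with invented replicate values (not data from this study) shows the arithmetic:

```python
import statistics

def rsd_percent(replicates):
    """Precision: relative standard deviation (sample SD / mean), in percent."""
    mean = statistics.mean(replicates)
    return 100 * statistics.stdev(replicates) / mean

def relative_error_percent(mean_measured, nominal):
    """Accuracy: relative error of the mean against the nominal value, in percent."""
    return 100 * (mean_measured - nominal) / nominal

# Hypothetical QC replicates (ng/mL) at a nominal 50.0 ng/mL level
qc = [48.9, 51.2, 50.4, 49.7, 50.8]
print(f"RSD: {rsd_percent(qc):.2f}%")                                      # precision
print(f"RE:  {relative_error_percent(statistics.mean(qc), 50.0):+.2f}%")   # accuracy
```

A QC level "passes" in typical validations when both figures stay within the acceptance limits, which is what the abstract's "less than 11%" and "within ±5.27%" statements report.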

  12. Orthostatic stability with intravenous levodopa

    Directory of Open Access Journals (Sweden)

    Shan H. Siddiqi

    2015-08-01

    Intravenous levodopa has been used in a multitude of research studies due to its more predictable pharmacokinetics compared to the oral form, which is used frequently as a treatment for Parkinson’s disease (PD). Levodopa is the precursor for dopamine, and intravenous dopamine would strongly affect vascular tone, but peripheral decarboxylase inhibitors are intended to block such effects. Pulse and blood pressure, with orthostatic changes, were recorded before and after intravenous levodopa or placebo (after oral carbidopa) in 13 adults with a chronic tic disorder and 16 tic-free adult control subjects. Levodopa caused no statistically or clinically significant changes in blood pressure or pulse. These data add to previous data that support the safety of i.v. levodopa when given with adequate peripheral inhibition of DOPA decarboxylase.
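    The orthostatic measurements described above amount to comparing supine and standing readings. A minimal sketch, using the widely cited consensus thresholds for orthostatic hypotension (a fall of at least 20 mmHg systolic or 10 mmHg diastolic on standing) and invented example readings, illustrates the check:

```python
def orthostatic_drop(supine, standing):
    """Return (systolic_drop, diastolic_drop) in mmHg; positive means a fall on standing."""
    return supine[0] - standing[0], supine[1] - standing[1]

def is_orthostatic_hypotension(supine, standing):
    """Consensus criterion: systolic fall >= 20 mmHg or diastolic fall >= 10 mmHg."""
    sys_drop, dia_drop = orthostatic_drop(supine, standing)
    return sys_drop >= 20 or dia_drop >= 10

# Hypothetical (systolic, diastolic) readings in mmHg
print(is_orthostatic_hypotension((130, 80), (105, 76)))  # systolic fell by 25
print(is_orthostatic_hypotension((120, 78), (115, 74)))  # small drops only
```

A study like the one summarized would apply such a comparison to each subject's before/after drug readings; the threshold values here are the general clinical convention, not parameters taken from the paper.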

  13. Big Data as Governmentality

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Klinkby Madsen, Anders; Rasche, Andreas

    This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data’s impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects ... The analysis shows that big data problematizes selected aspects of traditional ways to collect and analyze data for development (e.g. via household surveys). We also demonstrate that using big data analyses to address development challenges raises a number of questions that can deteriorate its impact.

  14. Big Data as Governmentality

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Madsen, Anders Koed; Rasche, Andreas

    This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data’s impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects ... The analysis shows that big data problematizes selected aspects of traditional ways to collect and analyze data for development (e.g. via household surveys). We also demonstrate that using big data analyses to address development challenges raises a number of questions that can deteriorate its impact.

  15. Big Data Revisited

    DEFF Research Database (Denmark)

    Kallinikos, Jannis; Constantiou, Ioanna

    2015-01-01

    We elaborate on key issues of our paper "New games, new rules: big data and the changing context of strategy" as a means of addressing some of the concerns raised by the paper’s commentators. We initially deal with the issue of social data and the role it plays in the current data revolution and the technological recording of facts. We further discuss the significance of the very mechanisms by which big data is produced, as distinct from the attributes of big data often discussed in the literature. In the final section of the paper, we qualify the alleged importance of algorithms and claim that the structures of data capture and the architectures in which data generation is embedded are fundamental to the phenomenon of big data.

  16. The Big Bang Singularity

    Science.gov (United States)

    Ling, Eric

    The big bang theory is a model of the universe which makes the striking prediction that the universe began a finite amount of time in the past at the so-called "Big Bang singularity." We explore the physical and mathematical justification of this surprising result. After laying down the framework of the universe as a spacetime manifold, we combine physical observations with global symmetry assumptions to deduce the FRW cosmological models, which predict a big bang singularity. Next we prove a couple of theorems due to Stephen Hawking which show that the big bang singularity exists even if one removes the global symmetry assumptions. Lastly, we investigate the conditions one needs to impose on a spacetime if one wishes to avoid a singularity. The ideas and concepts used here to study spacetimes are similar to those used to study Riemannian manifolds; we therefore compare and contrast the two geometries throughout.

  17. Boarding to Big data

    Directory of Open Access Journals (Sweden)

    Oana Claudia BRATOSIN

    2016-05-01

    Today Big Data is an emerging topic: as the quantity of information grows exponentially, its main challenge becomes the value of the information. That value is defined not only by extracting value from huge data sets as quickly and optimally as possible, but also by extracting value from uncertain and inaccurate data in an innovative manner using Big Data analytics. At this point, the main challenge for businesses that use Big Data tools is to clearly define the scope and the necessary output of the business so that real value can be gained. This article aims to explain the Big Data concept and its various classification criteria and architecture, as well as its impact on processes worldwide.

  18. Big Creek Pit Tags

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The BCPITTAGS database is used to store data from an Oncorhynchus mykiss (steelhead/rainbow trout) population dynamics study in Big Creek, a coastal stream along the...

  19. Big Science and the Large Hadron Collider

    CERN Document Server

    Giudice, Gian Francesco

    2012-01-01

    The Large Hadron Collider (LHC), the particle accelerator operating at CERN, is probably the most complex and ambitious scientific project ever accomplished by humanity. The sheer size of the enterprise, in terms of financial and human resources, naturally raises the question whether society should support such costly basic-research programs. I address this question here by first reviewing the process that led to the emergence of Big Science and the role of large projects in the development of science and technology. I then compare the methodologies of Small and Big Science, emphasizing their mutual linkage. Finally, after examining the cost of Big Science projects, I highlight several general aspects of their beneficial implications for society.

  20. Crisis analytics : big data-driven crisis response

    NARCIS (Netherlands)

    Qadir, Junaid; ur Rasool, Raihan; Zwitter, Andrej; Sathiaseelan, Arjuna; Crowcroft, Jon

    2016-01-01

    Disasters have long been a scourge for humanity. With the advances in technology (in terms of computing, communications, and the ability to process, and analyze big data), our ability to respond to disasters is at an inflection point. There is great optimism that big data tools can be leveraged to

  1. Citizens’ Media Meets Big Data: The Emergence of Data Activism

    NARCIS (Netherlands)

    Milan, S.; Gutiérrez, M.

    2015-01-01

    Big data presents citizens with new challenges and opportunities. ‘Data activism’ practices emerge at the intersection of the social and technological dimension of human action, whereby citizens take a critical approach to big data, and appropriate and manipulate data for advocacy and social change.

  2. Conociendo Big Data

    Directory of Open Access Journals (Sweden)

    Juan José Camargo-Vega

    2014-12-01

    Full Text Available Given the importance the term Big Data has acquired, this study set out to examine and analyze the state of the art of Big Data exhaustively; as a second objective, it analyzed the characteristics, tools, technologies, models and standards related to Big Data; and finally it sought to identify the most relevant characteristics in Big Data management, in order to cover everything concerning the central topic of the research. The methodology consisted of reviewing the state of the art of Big Data and presenting its current situation; surveying Big Data technologies; introducing some of the NoSQL databases, which are the ones that make it possible to process data in unstructured formats; and describing the data models and the technologies for analyzing them, closing with some benefits of Big Data. The methodological design used for the research was non-experimental, since no variables were manipulated, and exploratory, since this research begins to explore the Big Data landscape.

  3. Big Bang baryosynthesis

    International Nuclear Information System (INIS)

    Turner, M.S.; Chicago Univ., IL

    1983-01-01

    In these lectures I briefly review Big Bang baryosynthesis. In the first lecture I discuss the evidence which exists for the BAU, the failure of non-GUT symmetrical cosmologies, the qualitative picture of baryosynthesis, and numerical results of detailed baryosynthesis calculations. In the second lecture I discuss the requisite CP violation in some detail, further the statistical mechanics of baryosynthesis, possible complications to the simplest scenario, and one cosmological implication of Big Bang baryosynthesis. (orig./HSI)

  4. Big Data and medicine: a big deal?

    Science.gov (United States)

    Mayer-Schönberger, V; Ingelsson, E

    2018-05-01

    Big Data promises huge benefits for medical research. Looking beyond superficial increases in the amount of data collected, we identify three key areas where Big Data differs from conventional analyses of data samples: (i) data are captured more comprehensively relative to the phenomenon under study; this reduces some bias but surfaces important trade-offs, such as between data quantity and data quality; (ii) data are often analysed using machine learning tools, such as neural networks rather than conventional statistical methods resulting in systems that over time capture insights implicit in data, but remain black boxes, rarely revealing causal connections; and (iii) the purpose of the analyses of data is no longer simply answering existing questions, but hinting at novel ones and generating promising new hypotheses. As a consequence, when performed right, Big Data analyses can accelerate research. Because Big Data approaches differ so fundamentally from small data ones, research structures, processes and mindsets need to adjust. The latent value of data is being reaped through repeated reuse of data, which runs counter to existing practices not only regarding data privacy, but data management more generally. Consequently, we suggest a number of adjustments such as boards reviewing responsible data use, and incentives to facilitate comprehensive data sharing. As data's role changes to a resource of insight, we also need to acknowledge the importance of collecting and making data available as a crucial part of our research endeavours, and reassess our formal processes from career advancement to treatment approval. © 2017 The Association for the Publication of the Journal of Internal Medicine.

  5. Intravenous urography and childhood trauma

    OpenAIRE

    Okorie, N. M.; MacKinnon, A. E.

    1982-01-01

    Results of intravenous urography (IVU) in 33 patients suspected of suffering from renal trauma were reviewed. It was concluded that when haematuria is detected only microscopically and clears within 24 hr, an IVU is not necessary in the absence of other evidence of significant urinary tract injury.

  6. BigDansing

    KAUST Repository

    Khayyat, Zuhair

    2015-06-02

    Data cleansing approaches have usually focused on detecting and fixing errors with little attention to scaling to big datasets. This presents a serious impediment since data cleansing often involves costly computations such as enumerating pairs of tuples, handling inequality joins, and dealing with user-defined functions. In this paper, we present BigDansing, a Big Data Cleansing system to tackle efficiency, scalability, and ease-of-use issues in data cleansing. The system can run on top of most common general-purpose data processing platforms, ranging from DBMSs to MapReduce-like frameworks. A user-friendly programming interface allows users to express data quality rules both declaratively and procedurally, without requiring awareness of the underlying distributed platform. BigDansing translates these rules into a series of transformations that enable distributed computations and several optimizations, such as shared scans and specialized join operators. Experimental results on both synthetic and real datasets show that BigDansing outperforms existing baseline systems by up to more than two orders of magnitude without sacrificing the quality provided by the repair algorithms.
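The rule-driven cleansing workflow described above can be made concrete with a toy, single-machine sketch: a functional dependency (zip determines city) checked by enumerating tuple pairs, exactly the kind of costly pairwise computation the abstract mentions. The data, rule, and function names here are invented for illustration and are not BigDansing's actual API:

```python
from itertools import combinations

# Toy dataset that should satisfy the functional dependency zip -> city.
rows = [
    {"id": 1, "zip": "10001", "city": "New York"},
    {"id": 2, "zip": "10001", "city": "NYC"},      # violates zip -> city
    {"id": 3, "zip": "60601", "city": "Chicago"},
]

def fd_violations(rows, lhs, rhs):
    """Return id pairs of rows that agree on `lhs` but differ on `rhs`."""
    return [
        (a["id"], b["id"])
        for a, b in combinations(rows, 2)
        if a[lhs] == b[lhs] and a[rhs] != b[rhs]
    ]

print(fd_violations(rows, "zip", "city"))  # [(1, 2)]
```

The quadratic pair enumeration is what makes naive cleansing expensive at scale; BigDansing's contribution is distributing and optimizing such computations.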

  7. SETI as a part of Big History

    Science.gov (United States)

    Maccone, Claudio

    2014-08-01

    Big History is an emerging academic discipline which examines history scientifically from the Big Bang to the present. It uses a multidisciplinary approach combining numerous disciplines from science and the humanities, and explores human existence in the context of this bigger picture. It is taught at some universities. In a series of recent papers ([11] through [15] and [17] through [18]) and in a book [16], we developed a new mathematical model embracing Darwinian Evolution (RNA to Humans; see, in particular, [17]) and Human History (Aztecs to USA; see [16]), and then extrapolated even that into the future up to ten million years (see [18]), the minimum time required for a civilization to expand to the whole Milky Way (Fermi paradox). In this paper, we further extend that model into the past so as to let it start at the Big Bang (13.8 billion years ago), thus merging Big History, Evolution on Earth and SETI (the modern Search for ExtraTerrestrial Intelligence) into a single body of knowledge of a statistical type. Our idea is that the Geometric Brownian Motion (GBM), so far used as the key stochastic process of financial mathematics (the Black-Scholes model and the related 1997 Nobel Prize in Economics), may be successfully applied to the whole of Big History. The relevant mathematical results have previously appeared in the International Journal of Astrobiology and Acta Astronautica, and are not repeated in this paper in order not to make it too long; possibly a whole new book about GBMs will be written by the author. Mass Extinctions of the geological past are one more topic that may be cast in the language of a decreasing GBM over a short time lapse, since Mass Extinctions are sudden all-time lows in the number of living species. In this paper, we give formulae for the decreasing GBMs of Mass Extinctions, like the K-Pg one of 64 million years ago. Finally, we note that the Big History Equation is just the extension of the Drake Equation to 13.8 billion years.
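A Geometric Brownian Motion of the kind the abstract invokes can be simulated with the standard exact log-normal update. This is a generic illustrative sketch with invented parameter values, not the paper's actual Big History model:

```python
import math
import random

def gbm_path(s0, mu, sigma, steps, dt, seed=0):
    """Simulate one Geometric Brownian Motion path using the exact
    log-normal update S_{t+dt} = S_t * exp((mu - sigma^2/2) dt + sigma sqrt(dt) Z),
    where Z is a standard normal draw."""
    rng = random.Random(seed)
    path = [s0]
    for _ in range(steps):
        z = rng.gauss(0.0, 1.0)
        path.append(path[-1] * math.exp((mu - 0.5 * sigma ** 2) * dt
                                        + sigma * math.sqrt(dt) * z))
    return path

# A decreasing GBM (negative drift), loosely analogous to the paper's
# treatment of mass extinctions as sudden declines in species number.
path = gbm_path(s0=100.0, mu=-0.5, sigma=0.1, steps=50, dt=0.1)
print(len(path), path[0])
```

Note that a GBM stays strictly positive however negative the drift, which is one reason it suits quantities like species counts or stock prices that cannot go below zero.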

  8. Cognitive computing and big data analytics

    CERN Document Server

    Hurwitz, Judith; Bowles, Adrian

    2015-01-01

    MASTER THE ABILITY TO APPLY BIG DATA ANALYTICS TO MASSIVE AMOUNTS OF STRUCTURED AND UNSTRUCTURED DATA Cognitive computing is a technique that allows humans and computers to collaborate in order to gain insights and knowledge from data by uncovering patterns and anomalies. This comprehensive guide explains the underlying technologies, such as artificial intelligence, machine learning, natural language processing, and big data analytics. It then demonstrates how you can use these technologies to transform your organization. You will explore how different vendors and different industries are a

  9. Comparison of functional and histological outcomes after intralesional, intracisternal, and intravenous transplantation of human bone marrow-derived mesenchymal stromal cells in a rat model of spinal cord injury.

    Science.gov (United States)

    Shin, Dong Ah; Kim, Jin-Myung; Kim, Hyoung-Ihl; Yi, Seong; Ha, Yoon; Yoon, Do Heum; Kim, Keung Nyun

    2013-10-01

    Few studies have compared methods of stem cell transplantation. The aim of the present study was to determine the optimal method of delivery of therapeutic stem cells in spinal cord injury (SCI). We compared functional and histologic outcomes after administration of human bone marrow stromal cells (BMSCs) by intralesional (ILT), intracisternal (ICT), and intravenous transplantation (IVT). A rat model of spinal cord injury was produced by dropping a 10-g weight, 2 mm in diameter, onto the exposed spinal cords of animals from a height of 25 mm. In each treatment group, 24 animals were randomly assigned for functional assessment and 24 for histologic examination. BMSCs (3 × 10(5), ILT; 1 × 10(6), ICT; 2 × 10(6), IVT) were transplanted 1 week after SCI in numbers determined in previous studies. Basso-Beattie-Bresnahan scoring was performed in all animals weekly for 6 weeks. Spinal cord specimens were obtained from eight animals in each group 2, 4, and 6 weeks after SCI. Viable BMSCs were counted in six sagittal sections from each spinal cord. All three treatment groups showed improved functional recovery compared to controls beginning 2 weeks after stem cell injection (P < 0.01). The ICT group showed the best functional recovery, followed by the ILT and IVT groups, respectively (P < 0.01). Histological analysis showed the largest number of viable BMSCs in the ILT group, followed by the ICT and IVT groups, respectively (P < 0.01). ICT may be the safest and most effective method for delivering stem cells and improving functional outcome in SCI when no limits are placed on the number of cells transplanted. As research on enhancing engraftment rates advances, further improvement of functional outcome can be expected.

  10. Big data challenges

    DEFF Research Database (Denmark)

    Bachlechner, Daniel; Leimbach, Timo

    2016-01-01

    Although reports on big data success stories have been accumulating in the media, most organizations dealing with high-volume, high-velocity and high-variety information assets still face challenges. Only a thorough understanding of these challenges puts organizations into a position in which...... they can make an informed decision for or against big data, and, if the decision is positive, overcome the challenges smoothly. The combination of a series of interviews with leading experts from enterprises, associations and research institutions, and focused literature reviews allowed not only...... framework are also relevant. For large enterprises and startups specialized in big data, it is typically easier to overcome the challenges than it is for other enterprises and public administration bodies....

  11. Big Data and Peacebuilding

    Directory of Open Access Journals (Sweden)

    Sanjana Hattotuwa

    2013-11-01

    Full Text Available Any peace process is an exercise in the negotiation of big data. From centuries-old communal hagiography to the reams of official texts, media coverage and social media updates, peace negotiations generate data. Peacebuilding and peacekeeping today are informed by, and often respond and contribute to, big data. This is no easy task. As recently as a few years ago, before the term big data embraced the virtual on the web, what informed peace process design and implementation was in the physical domain – from contested borders and resources to background information in the form of text. The move from analogue, face-to-face negotiations to online, asynchronous, web-mediated negotiations – which can still include real-world meetings – has profound implications for how peace is strengthened in fragile democracies.

  12. A tomographic approach to intravenous coronary arteriography

    International Nuclear Information System (INIS)

    Ritman, E.L.; Bove, A.A.

    1986-01-01

    Coronary artery anatomy can be visualized using high speed, volume scanning X-ray CT. A single scan during a bolus injection of contrast medium provides image data for display of all angles of view of the opacified coronary arterial tree. Due to the tomographic nature of volume image data the superposition of contrast filled cardiac chambers, such as would occur in the levophase of an intravenous injection of contrast agent, can be eliminated. Data are presented which support these statements. The Dynamic Spatial Reconstructor (DSR) was used to scan a life-like radiologic phantom of an adult human thorax in which the left atrial and ventricular chambers and the major epicardial coronary arteries were opacified so as to simulate the levophase of an intravenous injection of contrast agent. A catheter filled with diluted contrast agent and with regions of luminal narrowing (i.e. 'stenoses') was advanced along a tract equivalent to a right ventricular catheterization. Ease of visualization of the catheter 'stenoses' and the accuracy with which they can be measured are presented. (Auth.)

  13. Intravenous Antiepileptic Drugs in Russia

    Directory of Open Access Journals (Sweden)

    P. N. Vlasov

    2014-01-01

    Full Text Available The launch of four intravenous antiepileptic drug products onto the Russian market, valproate (Depakene and Convulex), lacosamide (Vimpat), and levetiracetam (Keppra), has significantly broadened the possibilities of rendering care to patients in seizure emergencies. The chemical structure, mechanisms of action, indications/contraindications, clinical effectiveness and tolerability, advantages/disadvantages, and adverse events of these drugs in urgent and elective neurology are discussed.

  14. Muscle power during intravenous sedation

    Directory of Open Access Journals (Sweden)

    Nobuyuki Matsuura

    2017-11-01

    Full Text Available Intravenous sedation is effective in reducing fear and anxiety during dental treatment, and it has also been used as a behavior management technique in dental patients with special needs. Midazolam and propofol are commonly used for intravenous sedation. Although there has been much research on the effects of midazolam and propofol on vital function and the recovery profile, little is known about their effects on muscle power. This review discusses the effects of intravenous sedation using midazolam and propofol on both grip strength and bite force. During light propofol sedation, grip strength increases slightly and bite force increases in a dose-dependent manner. Grip strength decreases while bite force increases during light midazolam sedation, and also during light sedation using a combination of midazolam and propofol. Flumazenil did not antagonise the midazolam-induced increase in bite force. These results suggest the following possibilities: (1) activation of peripheral benzodiazepine receptors located within the temporomandibular joint region and masticatory muscles may cause the increase in bite force; (2) propofol limited the long-latency exteroceptive suppression (ES2) period during the jaw-opening reflex, so control of masticatory muscle contraction, which is thought to exert negative feedback on excessive bite force, may be depressed by propofol.

  15. Seeding considerations in restoring big sagebrush habitat

    Science.gov (United States)

    Scott M. Lambert

    2005-01-01

    This paper describes methods of managing or seeding to restore big sagebrush communities for wildlife habitat. The focus is on three big sagebrush subspecies, Wyoming big sagebrush (Artemisia tridentata ssp. wyomingensis), basin big sagebrush (Artemisia tridentata ssp. tridentata), and mountain...

  16. New 'bigs' in cosmology

    International Nuclear Information System (INIS)

    Yurov, Artyom V.; Martin-Moruno, Prado; Gonzalez-Diaz, Pedro F.

    2006-01-01

    This paper contains a detailed discussion of new cosmic solutions describing the early and late evolution of a universe filled with a kind of dark energy that may or may not satisfy the energy conditions. The main distinctive property of the resulting space-times is that the single singular events predicted by the corresponding quintessential (phantom) models appear twice, in a manner that can be made symmetric with respect to the origin of cosmic time. Thus, the big bang and big rip singularities are shown to take place twice, once on the positive branch of time and once on the negative one. We have also considered dark energy and phantom energy accretion onto black holes and wormholes in the context of these new cosmic solutions. It is seen that the space-times of these holes would then undergo swelling processes leading to big trip and big hole events taking place at distinct epochs along the evolution of the universe. In this way, the possibility is considered that the past and future be connected in a non-paradoxical manner in the universes described by the new symmetric solutions.

  17. Big Java late objects

    CERN Document Server

    Horstmann, Cay S

    2012-01-01

    Big Java: Late Objects is a comprehensive introduction to Java and computer programming, which focuses on the principles of programming, software engineering, and effective learning. It is designed for a two-semester first course in programming for computer science students.

  18. A Big Bang Lab

    Science.gov (United States)

    Scheider, Walter

    2005-01-01

    The February 2005 issue of The Science Teacher (TST) reminded everyone that by learning how scientists study stars, students gain an understanding of how science measures things that cannot be set up in the lab, either because they are too big, too far away, or happened in the very distant past. The authors of "How Far are the Stars?" show how the…

  19. Big Data ethics

    NARCIS (Netherlands)

    Zwitter, Andrej

    2014-01-01

    The speed of development in Big Data and associated phenomena, such as social media, has surpassed the capacity of the average consumer to understand his or her actions and their knock-on effects. We are moving towards changes in how ethics has to be perceived: away from individual decisions with

  20. Big Data Analytics

    Indian Academy of Sciences (India)

    IAS Admin

    2016-08-20

    Aug 20, 2016 ... Facebook posts, images, and videos. This is called big data. It .... image data. Surveillance cameras and movies produce video data. ..... games. Over 30 million records are collected each day. Based on this data, they improve popular games and withdraw games that are less played. 9. Aviation industry: An ...

  1. The Big Bang

    CERN Multimedia

    Moods, Patrick

    2006-01-01

    How did the Universe begin? The favoured theory is that everything - space, time, matter - came into existence at the same moment, around 13.7 thousand million years ago. This event was scornfully referred to as the "Big Bang" by Sir Fred Hoyle, who did not believe in it and maintained that the Universe had always existed.

  2. Small Places, Big Stakes

    DEFF Research Database (Denmark)

    Garsten, Christina; Sörbom, Adrienne

    left of much of ‘what is really going on', and ‘what people are really up to.' Meetings, however, as organized and ritualized social events, may provide the ethnographer with a loupe through which key tenets of larger social groups and organizations, and big issues, may be carefully observed. In formal...

  3. Big data in history

    CERN Document Server

    Manning, Patrick

    2013-01-01

    Big Data in History introduces the project to create a world-historical archive, tracing the last four centuries of historical dynamics and change. Chapters address the archive's overall plan, how to interpret the past through a global archive, the missions of gathering records, linking local data into global patterns, and exploring the results.

  4. Thick-Big Descriptions

    DEFF Research Database (Denmark)

    Lai, Signe Sophus

    to a commercial logic (boyd & Crawford 2011) and is as such directed by motives that call for specific types of sellable user data and specific segmentation strategies. In combining big data and ‘thick descriptions’ (Geertz 1973) scholars need to question how ethnographic fieldwork might map the ‘data not seen...

  5. The big bang

    Science.gov (United States)

    Silk, Joseph

    Our universe was born billions of years ago in a hot, violent explosion of elementary particles and radiation - the big bang. What do we know about this ultimate moment of creation, and how do we know it? Drawing upon the latest theories and technology, this new edition of The Big Bang is a sweeping, lucid account of the event that set the universe in motion. Joseph Silk begins his story with the first microseconds of the big bang, on through the evolution of stars, galaxies, clusters of galaxies, quasars, and into the distant future of our universe. He also explores the fascinating evidence for the big bang model and recounts the history of cosmological speculation. Revised and updated, this new edition features all the most recent astronomical advances, including: photos and measurements from the Hubble Space Telescope, Cosmic Background Explorer Satellite (COBE), and Infrared Space Observatory; the latest estimates of the age of the universe; new ideas in string and superstring theory; recent experiments on neutrino detection; new theories about the presence of dark matter in galaxies; new developments in the theory of the formation and evolution of galaxies; the latest ideas about black holes, worm holes, quantum foam, and multiple universes.

  6. Big Data and Cycling

    NARCIS (Netherlands)

    Romanillos, Gustavo; Zaltz Austwick, Martin; Ettema, Dick; De Kruijf, Joost

    2016-01-01

    Big Data has begun to create significant impacts in urban and transport planning. This paper covers the explosion in data-driven research on cycling, most of which has occurred in the last ten years. We review the techniques, objectives and findings of a growing number of studies we have classified

  7. Big Data Analytics

    Indian Academy of Sciences (India)

    IAS Admin

    2016-08-20

    Aug 20, 2016 ... The volume and variety of data being generated using computers is doubling every two years. It is estimated that in 2015, 8 Zettabytes (Zetta = 10^21) were generated, consisting mostly of unstructured data such as emails, blogs, Twitter, Facebook posts, images, and videos. This is called big data. It.

  8. Governing Big Data

    Directory of Open Access Journals (Sweden)

    Andrej J. Zwitter

    2014-04-01

    Full Text Available 2.5 quintillion bytes of data are created every day through pictures, messages, GPS data, etc. "Big Data" is seen simultaneously as the new Philosopher's Stone and as Pandora's box: a source of great knowledge and power, but equally the root of serious problems.

  9. Big Data and Chemical Education

    Science.gov (United States)

    Pence, Harry E.; Williams, Antony J.

    2016-01-01

    The amount of computerized information that organizations collect and process is growing so large that the term Big Data is commonly being used to describe the situation. Accordingly, Big Data is defined by a combination of the Volume, Variety, Velocity, and Veracity of the data being processed. Big Data tools are already having an impact in…

  10. Scaling Big Data Cleansing

    KAUST Repository

    Khayyat, Zuhair

    2017-07-31

    Data cleansing approaches have usually focused on detecting and fixing errors with little attention to big data scaling. This presents a serious impediment since identifying and repairing dirty data often involves processing huge input datasets, handling sophisticated error discovery approaches and managing huge arbitrary errors. With large datasets, error detection becomes overly expensive and complicated, especially when considering user-defined functions. Furthermore, a distinctive algorithm is desired to optimize inequality joins in sophisticated error discovery rather than naïvely parallelizing them. Also, when repairing large errors, their skewed distribution may obstruct effective error repairs. In this dissertation, I present solutions to overcome the above three problems in scaling data cleansing. First, I present BigDansing as a general system to tackle efficiency, scalability, and ease-of-use issues in data cleansing for Big Data. It automatically parallelizes the user's code on top of general-purpose distributed platforms. Its programming interface allows users to express data quality rules independently from the requirements of parallel and distributed environments. Without sacrificing their quality, BigDansing also enables parallel execution of serial repair algorithms by exploiting the graph representation of discovered errors. The experimental results show that BigDansing outperforms existing baselines by up to more than two orders of magnitude. Although BigDansing scales cleansing jobs, it still lacks the ability to handle sophisticated error discovery requiring inequality joins. Therefore, I developed IEJoin as an algorithm for fast inequality joins. It is based on sorted arrays and space-efficient bit-arrays to reduce the problem's search space. By comparing IEJoin against well-known optimizations, I show that it is more scalable, and several orders of magnitude faster. BigDansing depends on vertex-centric graph systems, i.e., Pregel
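The sorted-array idea behind fast inequality joins can be illustrated in a heavily simplified, single-predicate form: once one side is sorted, each probe value finds its matches with a binary search instead of a full scan. The real IEJoin handles pairs of inequality predicates using permutation arrays and bit-arrays, which this toy sketch omits:

```python
from bisect import bisect_right

def naive_ineq_join(r_vals, s_vals):
    """All (r, s) pairs with r < s: the quadratic nested-loop baseline."""
    return sorted((r, s) for r in r_vals for s in s_vals if r < s)

def sorted_ineq_join(r_vals, s_vals):
    """Same join via a sorted array: for each r, every element of
    s_sorted strictly to the right of bisect_right(s_sorted, r)
    satisfies r < s, so no per-pair comparison is needed."""
    s_sorted = sorted(s_vals)
    out = []
    for r in r_vals:
        i = bisect_right(s_sorted, r)
        out.extend((r, s) for s in s_sorted[i:])
    return sorted(out)

r = [3, 1, 7]
s = [2, 5]
print(sorted_ineq_join(r, s))  # [(1, 2), (1, 5), (3, 5)]
```

Both functions return the same result; the sorted variant replaces the inner scan with a logarithmic search, which is the kind of search-space reduction the dissertation pursues at distributed scale.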

  11. From Big Data to Big Business

    DEFF Research Database (Denmark)

    Lund Pedersen, Carsten

    2017-01-01

    Idea in Brief: Problem: There is an enormous profit potential for manufacturing firms in big data, but one of the key barriers to obtaining data-driven growth is the lack of knowledge about which capabilities are needed to extract value and profit from data. Solution: We (BDBB research group at CBS......) have developed a research-based capability mapping tool, entitled DataProfit, which the public business consultants can use to upgrade their tool kit to enable data-driven growth in manufacturing organizations. Benefit: The DataProfit model/tool comprises insights of an extensive research project...... that has been developed and tested in a multitude of manufacturing organizations – thus making it both a proven solution and a new idea. Moreover, resources for the application of the model are freely available online....

  12. Business and Science - Big Data, Big Picture

    Science.gov (United States)

    Rosati, A.

    2013-12-01

    Data Science is more than the creation, manipulation, and transformation of data. It is more than Big Data. The business world seems to have a hold on the term 'data science' and, for now, they define what it means. But business is very different than science. In this talk, I address how large datasets, Big Data, and data science are conceptually different in business and science worlds. I focus on the types of questions each realm asks, the data needed, and the consequences of findings. Gone are the days of datasets being created or collected to serve only one purpose or project. The trick with data reuse is to become familiar enough with a dataset to be able to combine it with other data and extract accurate results. As a Data Curator for the Advanced Cooperative Arctic Data and Information Service (ACADIS), my specialty is communication. Our team enables Arctic sciences by ensuring datasets are well documented and can be understood by reusers. Previously, I served as a data community liaison for the North American Regional Climate Change Assessment Program (NARCCAP). Again, my specialty was communicating complex instructions and ideas to a broad audience of data users. Before entering the science world, I was an entrepreneur. I have a bachelor's degree in economics and a master's degree in environmental social science. I am currently pursuing a Ph.D. in Geography. Because my background has embraced both the business and science worlds, I would like to share my perspectives on data, data reuse, data documentation, and the presentation or communication of findings. My experiences show that each can inform and support the other.

  13. Big data analysis new algorithms for a new society

    CERN Document Server

    Stefanowski, Jerzy

    2016-01-01

    This edited volume is devoted to Big Data Analysis from a Machine Learning standpoint as presented by some of the most eminent researchers in this area. It demonstrates that Big Data Analysis opens up new research problems which were either never considered before, or were only considered within a limited range. In addition to providing methodological discussions on the principles of mining Big Data and the difference between traditional statistical data analysis and newer computing frameworks, this book presents recently developed algorithms affecting such areas as business, financial forecasting, human mobility, the Internet of Things, information networks, bioinformatics, medical systems and life science. It explores, through a number of specific examples, how the study of Big Data Analysis has evolved and how it has started, and will most likely continue, to affect society. While the benefits brought about by Big Data Analysis are underlined, the book also discusses some of the warnings that have been issued...

  14. Semantic Web technologies for the big data in life sciences.

    Science.gov (United States)

    Wu, Hongyan; Yamaguchi, Atsuko

    2014-08-01

    The life sciences field is entering an era of big data with the breakthroughs of science and technology. More and more big data-related projects and activities are being performed around the world. Life sciences data generated by new technologies are continuing to grow rapidly not only in size but also in variety and complexity. To ensure that big data has a major influence in the life sciences, comprehensive data analysis across multiple data sources, and even across disciplines, is indispensable. The increasing volume of data and the heterogeneous, complex varieties of data are the two principal issues discussed in life science informatics. The ever-evolving next-generation Web, characterized as the Semantic Web, is an extension of the current Web that aims to provide information that not only humans but also computers can semantically process at large scale. This paper presents a survey of big data in the life sciences, big data-related projects and Semantic Web technologies. It introduces the main Semantic Web technologies and their current status, and provides a detailed analysis of how Semantic Web technologies address the heterogeneous variety of life sciences big data. The paper helps the reader understand the role of Semantic Web technologies in the big data era and how they provide a promising solution for big data in the life sciences.
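The machine-processable, triple-structured data at the heart of the Semantic Web can be illustrated with a toy pattern matcher over RDF-style (subject, predicate, object) triples. The data and predicate names below are invented for illustration; a real system would store the triples in an RDF store and query them with SPARQL rather than this hand-rolled matcher:

```python
# A tiny graph of life-sciences facts as (subject, predicate, object) triples.
triples = {
    ("BRCA1", "encodes", "BRCA1_protein"),
    ("BRCA1_protein", "involved_in", "DNA_repair"),
    ("TP53", "involved_in", "DNA_repair"),
}

def match(pattern, store):
    """Return triples matching an (s, p, o) pattern; None is a wildcard,
    analogous to a variable in a SPARQL basic graph pattern."""
    return sorted(t for t in store
                  if all(q is None or q == v for q, v in zip(pattern, t)))

# "What is involved in DNA repair?"
print(match((None, "involved_in", "DNA_repair"), triples))
```

The point of the triple model is that data from heterogeneous sources can be merged by simply taking the union of their triple sets, after which the same pattern queries span all of them.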

  15. Treatment with high-dose recombinant human hyaluronidase-facilitated subcutaneous immune globulins in patients with juvenile dermatomyositis who are intolerant to intravenous immune globulins: a report of 5 cases.

    Science.gov (United States)

    Speth, Fabian; Haas, Johannes-Peter; Hinze, Claas H

    2016-09-13

    High-dose intravenous immune globulins (IVIg) are frequently used in refractory juvenile dermatomyositis (JDM) but are often poorly tolerated. High-dose recombinant human hyaluronidase-facilitated subcutaneous immune globulins (fSCIg) allow the administration of much higher doses of immune globulins than conventional subcutaneous immune globulin therapy and may be an alternative to IVIg. The safety and efficacy of fSCIg therapy in JDM is unknown. In this retrospective case series, five patients with steroid-refractory severe JDM were treated with high-dose fSCIg due to IVIg adverse effects (severe headaches, nausea, vomiting, difficult venous access). Peak serum IgG levels, muscle enzymes, the childhood myositis assessment scale and adverse effects were retrieved for at least 6 months following initiation of fSCIg. Data were analyzed by descriptive statistics. Patients initially received fSCIg 1 g/kg every 14 days, resulting in median IgG peak levels of 1901 mg/dl (1606-2719 mg/dl), compared to median IgG peak and trough levels while previously receiving IVIg of 2741 mg/dl (2429-2849 mg/dl) and 1351 mg/dl (1156-1710 mg/dl). Additional antirheumatic therapies consisted of low-dose glucocorticoid therapy, methotrexate, mycophenolate mofetil and/or rituximab. Two patients maintained clinically inactive disease and three patients had only a partial treatment response. In the three patients with partial treatment response, fSCIg 1 g/kg was then given on days 1 and 6 of every 28-day cycle, resulting in IgG peak levels between 2300 and 2846 mg/dl (previously 1606-1901 mg/dl on the biweekly regimen) and in clinically inactive disease in two of the three patients. There were no relevant adverse effects that limited continuation of fSCIg treatment. High-dose fSCIg is well-tolerated in patients with JDM, and high peak serum IgG levels can be achieved, which may be important for treatment success. High-dose fSCIg may therefore be an alternative to high-dose IVIg.

  16. Big Data-Survey

    Directory of Open Access Journals (Sweden)

    P.S.G. Aruna Sri

    2016-03-01

    Full Text Available Big data is the term for any collection of data sets so large and complex that it becomes difficult to process using conventional data-handling applications. The challenges include analysis, capture, curation, search, sharing, storage, transfer, visualization, and privacy violations. To spot business trends, predict diseases, anticipate conflict and so on, we require larger data sets than the smaller ones traditionally used. Big data is difficult to work with using most relational database management systems and desktop statistics and visualization packages, requiring instead massively parallel software running on tens, hundreds, or even thousands of servers. In this paper we review the Hadoop architecture, the different tools used for big data, and its security issues.
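    The massively parallel model behind Hadoop, which this survey reviews, is MapReduce: map each input record to key-value pairs, shuffle the pairs by key, then reduce each group to a result. A single-process Python sketch of the classic word-count example (real Hadoop distributes each phase across many servers; this toy version only shows the data flow):

    ```python
    from itertools import groupby

    def map_phase(lines):
        """Map: emit a (word, 1) pair for every word in every input line."""
        return [(word, 1) for line in lines for word in line.split()]

    def shuffle(pairs):
        """Shuffle: group the pairs by key, as Hadoop does between
        the map and reduce phases."""
        return groupby(sorted(pairs), key=lambda kv: kv[0])

    def reduce_phase(groups):
        """Reduce: sum the counts within each word's group."""
        return {word: sum(count for _, count in kvs) for word, kvs in groups}

    lines = ["big data big tools", "data tools"]
    counts = reduce_phase(shuffle(map_phase(lines)))
    print(counts)  # {'big': 2, 'data': 2, 'tools': 2}
    ```

    The key design point is that map and reduce are pure functions over independent records and groups, so each phase can be parallelized across servers with only the shuffle requiring data movement.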

  17. Finding the big bang

    CERN Document Server

    Page, Lyman A; Partridge, R Bruce

    2009-01-01

    Cosmology, the study of the universe as a whole, has become a precise physical science, the foundation of which is our understanding of the cosmic microwave background radiation (CMBR) left from the big bang. The story of the discovery and exploration of the CMBR in the 1960s is recalled for the first time in this collection of 44 essays by eminent scientists who pioneered the work. Two introductory chapters put the essays in context, explaining the general ideas behind the expanding universe and fossil remnants from the early stages of the expanding universe. The last chapter describes how the confusion of ideas and measurements in the 1960s grew into the present tight network of tests that demonstrate the accuracy of the big bang theory. This book is valuable to anyone interested in how science is done, and what it has taught us about the large-scale nature of the physical universe.

  18. Really big numbers

    CERN Document Server

    Schwartz, Richard Evan

    2014-01-01

    In the American Mathematical Society's first-ever book for kids (and kids at heart), mathematician and author Richard Evan Schwartz leads math lovers of all ages on an innovative and strikingly illustrated journey through the infinite number system. By means of engaging, imaginative visuals and endearing narration, Schwartz manages the monumental task of presenting the complex concept of Big Numbers in fresh and relatable ways. The book begins with small, easily observable numbers before building up to truly gigantic ones, like a nonillion, a tredecillion, a googol, and even ones too huge for names! Any person, regardless of age, can benefit from reading this book. Readers will find themselves returning to its pages for a very long time, perpetually learning from and growing with the narrative as their knowledge deepens. Really Big Numbers is a wonderful enrichment for any math education program and is enthusiastically recommended to every teacher, parent and grandparent, student, child, or other individual i...

  19. Big nuclear accidents

    International Nuclear Information System (INIS)

    Marshall, W.; Billingon, D.E.; Cameron, R.F.; Curl, S.J.

    1983-09-01

    Much of the debate on the safety of nuclear power focuses on the large number of fatalities that could, in theory, be caused by extremely unlikely but just imaginable reactor accidents. This, along with the nuclear industry's inappropriate use of vocabulary during public debate, has given the general public a distorted impression of the risks of nuclear power. The paper reviews the way in which the probability and consequences of big nuclear accidents have been presented in the past and makes recommendations for the future, including the presentation of the long-term consequences of such accidents in terms of 'loss of life expectancy', 'increased chance of fatal cancer' and 'equivalent pattern of compulsory cigarette smoking'. The paper presents mathematical arguments, which show the derivation and validity of the proposed methods of presenting the consequences of imaginable big nuclear accidents. (author)

  20. Big Bang 5

    CERN Document Server

    Apolin, Martin

    2007-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang opens with a motivating overview and guiding questions, then proceeds from fundamentals to applications, from the simple to the complex. Throughout, the language remains simple, everyday and literary in style. Volume 5 RG covers the fundamentals (systems of units, orders of magnitude) and mechanics (translation, rotation, force, conservation laws).

  1. Big Bang 6

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang opens with a motivating overview and guiding questions, then proceeds from fundamentals to applications, from the simple to the complex. Throughout, the language remains simple, everyday and literary in style. Volume 6 RG covers gravitation, oscillations and waves, thermodynamics, and an introduction to electricity, using everyday examples and cross-links to other disciplines.

  2. Big Bang 7

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang opens with a motivating overview and guiding questions, then proceeds from fundamentals to applications, from the simple to the complex. Throughout, the language remains simple, everyday and literary in style. In addition to an introduction, Volume 7 covers many current aspects of quantum mechanics (e.g. teleportation) and electrodynamics (e.g. electrosmog), as well as climate change and chaos theory.

  3. Big Bang 8

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang opens with a motivating overview and guiding questions, then proceeds from fundamentals to applications, from the simple to the complex. Throughout, the language remains simple, everyday and literary in style. Volume 8 presents, in an accessible way, relativity, nuclear and particle physics (and their applications in cosmology and astrophysics), nanotechnology, and bionics.

  4. Big Data ethics

    OpenAIRE

    Andrej Zwitter

    2014-01-01

    The speed of development in Big Data and associated phenomena, such as social media, has surpassed the capacity of the average consumer to understand his or her actions and their knock-on effects. We are moving towards changes in how ethics has to be perceived: away from individual decisions with specific and knowable outcomes, towards actions by many unaware that they may have taken actions with unintended consequences for anyone. Responses will require a rethinking of ethical choices, the l...

  5. Big Bang Darkleosynthesis

    OpenAIRE

    Krnjaic, Gordan; Sigurdson, Kris

    2014-01-01

    In a popular class of models, dark matter comprises an asymmetric population of composite particles with short range interactions arising from a confined nonabelian gauge group. We show that coupling this sector to a well-motivated light mediator particle yields efficient darkleosynthesis, a dark-sector version of big-bang nucleosynthesis (BBN), in generic regions of parameter space. Dark matter self-interaction bounds typically require the confinement scale to be above ΛQCD, which generica...

  6. Recent big flare

    International Nuclear Information System (INIS)

    Moriyama, Fumio; Miyazawa, Masahide; Yamaguchi, Yoshisuke

    1978-01-01

    The features of three big solar flares observed at Tokyo Observatory are described in this paper. The active region, McMath 14943, caused a big flare on September 16, 1977. The flare appeared on both sides of a long dark line which runs along the boundary of the magnetic field. Two-ribbon structure was seen. The electron density of the flare observed at Norikura Corona Observatory was 3 × 10¹²/cc. Several arc lines which connect both bright regions of different magnetic polarity were seen in the H-α monochrome image. The active region, McMath 15056, caused a big flare on December 10, 1977. At the beginning, several bright spots were observed in the region between the two main solar spots. Then the area and the brightness increased, and the bright spots became two ribbon-shaped bands. A solar flare was observed on April 8, 1978. At first, several bright spots were seen around the solar spot in the active region, McMath 15221. Then these bright spots developed into a large bright region. On both sides of a dark line along the magnetic neutral line, bright regions were generated. These developed into a two-ribbon flare. The time required for growth was more than one hour. A bright arc which connects the two ribbons was seen, and this arc may be a loop prominence system. (Kato, T.)

  7. Big bang and big crunch in matrix string theory

    OpenAIRE

    Bedford, J; Papageorgakis, C; Rodríguez-Gómez, D; Ward, J

    2007-01-01

    Following the holographic description of linear dilaton null Cosmologies with a Big Bang in terms of Matrix String Theory put forward by Craps, Sethi and Verlinde, we propose an extended background describing a Universe including both Big Bang and Big Crunch singularities. This belongs to a class of exact string backgrounds and is perturbative in the string coupling far away from the singularities, both of which can be resolved using Matrix String Theory. We provide a simple theory capable of...

  8. The Opportunities and Challenges of the Big Data Implementation in Social Science Research: a Literature Review - Peluang Dan Tantangan Big Data Dalam Penelitian Ilmu Sosial: Sebuah Kajian Literatur

    OpenAIRE

    Rumata, Vience Mutiara

    2016-01-01

    In the era of digital information, data can be accessed, recorded, traced, and analysed conveniently. Big Data is not solely a trend among an exclusive group; rather, it marks a paradigm shift, particularly in how we understand social processes. Data generated on social media is the avalanche of the Big Data era. For academics, Big Data challenges social researchers by changing the unit of analysis from humans to algorithms. This article discusses the opportunities and pitfa...

  9. Use of intravenous immunoglobulin in neonates with haemolytic disease and immune thrombocytopenia

    Directory of Open Access Journals (Sweden)

    Marković-Sovtić Gordana

    2013-01-01

    Full Text Available Background/Aim. Intravenous immunoglobulin is a blood product made of human polyclonal immunoglobulin G. The mode of action of intravenous immunoglobulin is very complex. It is indicated in the treatment of neonatal immune thrombocytopenia and haemolytic disease of the newborn. The aim of the study was to present our experience in the use of intravenous immunoglobulin in a group of term neonates. Methods. We analysed all relevant clinical and laboratory data of 23 neonates who received intravenous immunoglobulin during their hospitalization in the Neonatal Intensive Care Unit of the Mother and Child Health Care Institute over a five-year period, from 2006 to 2010. Results. There were 11 patients with haemolytic disease of the newborn and 12 neonates with immune thrombocytopenia. All of them received 1-2 g/kg intravenous immunoglobulin in the course of their treatment. There were no adverse effects of intravenous immunoglobulin use. The use of intravenous immunoglobulin led to an increase in platelet number in thrombocytopenic patients, whereas in those with haemolytic disease the serum bilirubin level decreased significantly, so that some patients whose bilirubin level was very close to the exchange transfusion criterion avoided this procedure. Conclusion. The use of intravenous immunoglobulin was shown to be an effective treatment in reducing the need for exchange transfusion, the duration of phototherapy and the length of hospital stay in neonates with haemolytic disease. When used in the treatment of neonatal immune thrombocytopenia, it leads to an increase in the platelet number, thus decreasing the risk of serious complications of thrombocytopenia.

  10. Disaggregating asthma: Big investigation versus big data.

    Science.gov (United States)

    Belgrave, Danielle; Henderson, John; Simpson, Angela; Buchan, Iain; Bishop, Christopher; Custovic, Adnan

    2017-02-01

    We are facing a major challenge in bridging the gap between identifying subtypes of asthma to understand causal mechanisms and translating this knowledge into personalized prevention and management strategies. In recent years, "big data" has been sold as a panacea for generating hypotheses and driving new frontiers of health care; the idea that the data must and will speak for themselves is fast becoming a new dogma. One of the dangers of ready accessibility of health care data and computational tools for data analysis is that the process of data mining can become uncoupled from the scientific process of clinical interpretation, understanding the provenance of the data, and external validation. Although advances in computational methods can be valuable for using unexpected structure in data to generate hypotheses, there remains a need for testing hypotheses and interpreting results with scientific rigor. We argue for combining data- and hypothesis-driven methods in a careful synergy, and the importance of carefully characterized birth and patient cohorts with genetic, phenotypic, biological, and molecular data in this process cannot be overemphasized. The main challenge on the road ahead is to harness bigger health care data in ways that produce meaningful clinical interpretation and to translate this into better diagnoses and properly personalized prevention and treatment plans. There is a pressing need for cross-disciplinary research with an integrative approach to data science, whereby basic scientists, clinicians, data analysts, and epidemiologists work together to understand the heterogeneity of asthma. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  11. The trashing of Big Green

    International Nuclear Information System (INIS)

    Felten, E.

    1990-01-01

    The Big Green initiative on California's ballot lost by a margin of 2-to-1. Green measures lost in five other states, shocking ecology-minded groups. According to the postmortem by environmentalists, Big Green was a victim of poor timing and big spending by the opposition. Now its supporters plan to break up the bill and try to pass some provisions in the Legislature

  12. Microsystems - The next big thing

    Energy Technology Data Exchange (ETDEWEB)

    STINNETT,REGAN W.

    2000-05-11

    Micro-Electro-Mechanical Systems (MEMS) is a big name for tiny devices that will soon make big changes in everyday life and the workplace. These and other types of Microsystems range in size from a few millimeters to a few microns, much smaller than a human hair. These Microsystems have the capability to enable new ways to solve problems in commercial applications ranging from automotive, aerospace, telecommunications, manufacturing equipment, medical diagnostics to robotics, and in national security applications such as nuclear weapons safety and security, battlefield intelligence, and protection against chemical and biological weapons. This broad range of applications of Microsystems reflects the broad capabilities of future Microsystems to provide the ability to sense, think, act, and communicate, all in a single integrated package. Microsystems have been called the next silicon revolution, but like many revolutions, they incorporate more elements than their predecessors. Microsystems do include MEMS components fabricated from polycrystalline silicon processed using techniques similar to those used in the manufacture of integrated electrical circuits. They also include optoelectronic components made from gallium arsenide and other semiconducting compounds from the III-V groups of the periodic table. Microsystems components are also being made from pure metals and metal alloys using the LIGA process, which utilizes lithography, etching, and casting at the micron scale. Generically, Microsystems are micron scale, integrated systems that have the potential to combine the ability to sense light, heat, pressure, acceleration, vibration, and chemicals with the ability to process the collected data using CMOS circuitry, execute an electrical, mechanical, or photonic response, and communicate either optically or with microwaves.

  13. Disinfectant and Antimicrobial Susceptibility Profiles of the Big Six Non-O157 Shiga Toxin-Producing Escherichia coli Strains from Food Animals and Humans.

    Science.gov (United States)

    Beier, Ross C; Franz, Eelco; Bono, James L; Mandrell, Robert E; Fratamico, Pina M; Callaway, Todd R; Andrews, Kathleen; Poole, Toni L; Crippen, Tawni L; Sheffield, Cynthia L; Anderson, Robin C; Nisbet, David J

    2016-08-01

    The disinfectant and antimicrobial susceptibility profiles of 138 non-O157 Shiga toxin-producing Escherichia coli strains (STECs) from food animals and humans were determined. Antimicrobial resistance (AMR) was moderate (39.1% of strains) in response to 15 antimicrobial agents. Animal strains had a lower AMR prevalence (35.6%) than did human strains (43.9%) but a higher prevalence of the resistance profile GEN-KAN-TET. A decreasing prevalence of AMR was found among animal strains from serogroups O45 > O145 > O121 > O111 > O26 > O103 and among human strains from serogroups O145 > O103 > O26 > O111 > O121 > O45. One animal strain from serogroups O121 and O145 and one human strain from serogroup O26 had extensive drug resistance. A high prevalence of AMR in animal O45 and O121 strains and no resistance or a low prevalence of resistance in human strains from these serogroups suggests a source other than food animals for human exposure to these strains. Among the 24 disinfectants evaluated, all strains were susceptible to triclosan. Animal strains had a higher prevalence of resistance to chlorhexidine than did human strains. Both animal and human strains had a similar low prevalence of low-level benzalkonium chloride resistance, and animal and human strains had similar susceptibility profiles for most other disinfectants. Benzyldimethylammonium chlorides and C10AC were the primary active components in disinfectants DC&R and P-128, respectively, against non-O157 STECs. A disinfectant FS512 MIC ≥ 8 μg/ml was more prevalent among animal O121 strains (61.5%) than among human O121 strains (25%), which may also suggest a source of human exposure to STEC O121 other than food animals. Bacterial inhibition was not dependent solely on pH but was correlated with the presence of dissociated organic acid species and some undissociated acids.

  14. Today and tomorrow of intravenous coronary angiography programme in Japan

    International Nuclear Information System (INIS)

    Ando, Masami; Hyodo, Kazuyuki

    1994-01-01

    Development of an intravenous coronary angiography system using monochromated synchrotron radiation at the Photon Factory is described. This comprises an asymmetrically cut silicon monochromator crystal to obtain a larger exposure area, a two-dimensional imaging system using an image intensifier coupled to a CCD TV camera, and a fast video data acquisition system. The whole system is under development using live dogs. A future system, including a dedicated insertion device applicable to living humans, is also proposed. (author)

  15. Big data in fashion industry

    Science.gov (United States)

    Jain, S.; Bruniaux, J.; Zeng, X.; Bruniaux, P.

    2017-10-01

    Significant work has been done in the field of big data in the last decade. The concept of big data involves analysing voluminous data to extract valuable information. In the fashion world, big data is increasingly playing a part in trend forecasting and in analysing consumer behaviour, preferences and emotions. The purpose of this paper is to introduce the term fashion data and explain why it can be considered big data. It also gives a broad classification of the types of fashion data and briefly defines them. Finally, the methodology and working of a system that will use this data are briefly described.

  16. Was there a big bang

    International Nuclear Information System (INIS)

    Narlikar, J.

    1981-01-01

    In discussing the viability of the big-bang model of the Universe, relevant evidence is examined, including the discrepancies in the age of the big-bang Universe, the red shifts of quasars, the microwave background radiation, general theory of relativity aspects such as the change of the gravitational constant with time, and quantum theory considerations. It is felt that the arguments considered show that the big-bang picture is not as soundly established, either theoretically or observationally, as it is usually claimed to be, that the cosmological problem is still wide open, and that alternatives to the standard big-bang picture should be seriously investigated. (U.K.)

  17. Big Data Analytics An Overview

    Directory of Open Access Journals (Sweden)

    Jayshree Dwivedi

    2015-08-01

    Full Text Available Big data is data that exceeds ordinary storage capacity and processing power. The term is used for data sets so large or complex that traditional data-processing applications cannot handle them. Big data size is a constantly moving target, ranging year by year from a few dozen terabytes to many petabytes; on social networking sites, for instance, the amount of data produced by people is growing rapidly every year. Big data is not merely data; it has become a complete subject encompassing various tools, techniques and frameworks. It describes the explosive growth and evolution of data, both structured and unstructured. Big data is a set of techniques and technologies that require new forms of integration to uncover large hidden values from datasets that are diverse, complex and of massive scale. It is difficult to work with using most relational database management systems and desktop statistics and visualization packages, requiring instead massively parallel software running on tens, hundreds, or even thousands of servers. A big data environment is used to acquire, organize and analyse these various types of data. In this paper we describe the applications, problems and tools of big data and give an overview of the field.

  18. How Big is Earth?

    Science.gov (United States)

    Thurber, Bonnie B.

    2015-08-01

    How Big is Earth celebrates the Year of Light. Using only the sunlight striking the Earth and a wooden dowel, students meet each other and then measure the circumference of the earth. Eratosthenes did it over 2,000 years ago. In Cosmos, Carl Sagan shared the process by which Eratosthenes measured the angle of the shadow cast at local noon when sunlight strikes a stick positioned perpendicular to the ground. By comparing his measurement to another made a distance away, Eratosthenes was able to calculate the circumference of the earth. How Big is Earth provides an online learning environment where students do science the same way Eratosthenes did. A notable project in which this was done was The Eratosthenes Project, conducted in 2005 as part of the World Year of Physics; in fact, we will be drawing on the teacher's guide developed by that project. How Big Is Earth? expands on the Eratosthenes project through the online learning environment of the iCollaboratory, www.icollaboratory.org, where teachers and students from Sweden, China, Nepal, Russia, Morocco, and the United States collaborate, share data, and reflect on their learning of science and astronomy. They share their information and discuss their ideas, brainstorming solutions in a discussion forum. There is an ongoing database of student measurements and another database to collect data on both teacher and student learning from surveys, discussions, and self-reflection done online. We will share our research about the kinds of learning that takes place only in global collaborations. The entrance address for the iCollaboratory is http://www.icollaboratory.org.
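    The geometry students reproduce is a single proportion: if the stick at one site casts a noon shadow at angle θ while a stick a known distance away casts none (or a different angle), the two sites span θ/360 of the full circle. A short sketch using Eratosthenes' classical figures (roughly 7.2° and 5000 stadia; since the stadion-to-kilometre conversion is uncertain, the modern check below uses the approximate 800 km Alexandria-Syene distance instead):

    ```python
    def earth_circumference(shadow_angle_deg, distance_between_sites):
        """The shadow angle equals the angle subtended at Earth's centre,
        so the measured distance is shadow_angle/360 of the circumference."""
        return 360.0 / shadow_angle_deg * distance_between_sites

    # Eratosthenes' classical values: about 7.2 degrees over 5000 stadia
    print(earth_circumference(7.2, 5000))  # 250000.0 stadia

    # Modern check: about 7.2 degrees over roughly 800 km
    print(earth_circumference(7.2, 800))   # 40000.0 km
    ```

    The second call lands close to the accepted ~40,075 km equatorial circumference, which is why the classroom measurement works with nothing more than a dowel, a protractor, and a partner school far enough away.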

  19. Big Data Challenges

    Directory of Open Access Journals (Sweden)

    Alexandru Adrian TOLE

    2013-10-01

    Full Text Available The amount of data traveling across the internet today is not only large but complex as well. Companies, institutions, healthcare systems and others all use piles of data, which are further used to create reports that ensure continuity of the services they offer. The processing behind the results these entities request represents a challenge for software developers and companies that provide IT infrastructure. The challenge is how to manipulate an impressive volume of data that has to be securely delivered through the internet and reach its destination intact. This paper treats the challenges that Big Data creates.

  20. A Matrix Big Bang

    OpenAIRE

    Craps, Ben; Sethi, Savdeep; Verlinde, Erik

    2005-01-01

    The light-like linear dilaton background represents a particularly simple time-dependent 1/2 BPS solution of critical type IIA superstring theory in ten dimensions. Its lift to M-theory, as well as its Einstein frame metric, are singular in the sense that the geometry is geodesically incomplete and the Riemann tensor diverges along a light-like subspace of codimension one. We study this background as a model for a big bang type singularity in string theory/M-theory. We construct the dual Matr...

  1. Privacy and Big Data

    CERN Document Server

    Craig, Terence

    2011-01-01

    Much of what constitutes Big Data is information about us. Through our online activities, we leave an easy-to-follow trail of digital footprints that reveal who we are, what we buy, where we go, and much more. This eye-opening book explores the raging privacy debate over the use of personal data, with one undeniable conclusion: once data's been collected, we have absolutely no control over who uses it or how it is used. Personal data is the hottest commodity on the market today-truly more valuable than gold. We are the asset that every company, industry, non-profit, and government wants. Pri

  2. On minorities and outliers: The case for making Big Data small

    Directory of Open Access Journals (Sweden)

    Brooke Foucault Welles

    2014-07-01

    Full Text Available In this essay, I make the case for choosing to examine small subsets of Big Data datasets—making big data small. Big Data allows us to produce summaries of human behavior at a scale never before possible. But in the push to produce these summaries, we risk losing sight of a secondary but equally important advantage of Big Data—the plentiful representation of minorities. Women, minorities and statistical outliers have historically been omitted from the scientific record, with problematic consequences. Big Data affords the opportunity to remedy those omissions. However, to do so, Big Data researchers must choose to examine very small subsets of otherwise large datasets. I encourage researchers to embrace an ethical, empirical and epistemological stance on Big Data that includes minorities and outliers as reference categories, rather than the exceptions to statistical norms.

  3. Intravenous Therapy: Hazards, Complications and Their Prevention ...

    African Journals Online (AJOL)

    In this review article, the local and systemic complications of intravenous therapy are highlighted and their preventive measures are discussed. Intravenous therapy exposes the patient to numerous hazards and many of them are avoidable, if the health care provider understands the risks involved and acts appropriately and ...

  4. Intentional intravenous mercury injection | Yudelowitz | South African ...

    African Journals Online (AJOL)

    Intravenous mercury injection is rarely seen, with few documented cases. Treatment strategies are not clearly defined for such cases, although a few options do show benefit. This case report describes a 29-year-old man suffering from bipolar disorder, who presented following self-inflicted intravenous injection of mercury.

  5. Intravenous immunoglobulin prophylaxis in neonates on artificial ...

    African Journals Online (AJOL)

    The efficacy of the prophylactic use of intravenous immunoglobulin (Ig) was evaluated in a double-blind placebo-controlled trial of 21 pairs of ventilated neonates weighing more than 1 500 g, Each infant received 0.4 g/kglday of intravenous Ig or a similar volume of placebo daily for 5 days. Criteria used to assess the ...

  6. [Big data in imaging].

    Science.gov (United States)

    Sewerin, Philipp; Ostendorf, Benedikt; Hueber, Axel J; Kleyer, Arnd

    2018-04-01

    Until now, most major medical advancements have been achieved through hypothesis-driven research within the scope of clinical trials. However, due to a multitude of variables, only a certain number of research questions could be addressed during a single study, thus rendering these studies expensive and time consuming. Big data acquisition enables a new data-based approach in which large volumes of data can be used to investigate all variables, thus opening new horizons. Due to universal digitalization of the data as well as ever-improving hard- and software solutions, imaging would appear to be predestined for such analyses. Several small studies have already demonstrated that automated analysis algorithms and artificial intelligence can identify pathologies with high precision. Such automated systems would also seem well suited for rheumatology imaging, since a method for individualized risk stratification has long been sought for these patients. However, despite all the promising options, the heterogeneity of the data and highly complex regulations covering data protection in Germany would still render a big data solution for imaging difficult today. Overcoming these boundaries is challenging, but the enormous potential advances in clinical management and science render pursuit of this goal worthwhile.

  7. Big bang nucleosynthesis

    International Nuclear Information System (INIS)

    Fields, Brian D.; Olive, Keith A.

    2006-01-01

We present an overview of the standard model of big bang nucleosynthesis (BBN), which describes the production of the light elements in the early universe. The theoretical prediction for the abundances of D, 3He, 4He, and 7Li is discussed. We emphasize the role of key nuclear reactions and the methods by which experimental cross section uncertainties are propagated into uncertainties in the predicted abundances. The observational determination of the light nuclides is also discussed. Particular attention is given to the comparison between the predicted and observed abundances, which yields a measurement of the cosmic baryon content. The spectrum of anisotropies in the cosmic microwave background (CMB) now independently measures the baryon density to high precision; we show how the CMB data test BBN, and find that the CMB and the D and 4He observations paint a consistent picture. This concordance stands as a major success of the hot big bang. On the other hand, 7Li remains discrepant with the CMB-preferred baryon density; possible explanations are reviewed. Finally, moving beyond the standard model, primordial nucleosynthesis constraints on early universe and particle physics are also briefly discussed.
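The error-propagation step described in this abstract (experimental cross-section uncertainties mapped into uncertainties in predicted abundances) is in practice often done by Monte Carlo sampling. The sketch below is a toy illustration only: the power-law sensitivity, the 5% rate uncertainty, and the function name are hypothetical and not taken from the paper.

```python
import random
import statistics

def propagate(rate_mean, rate_rel_err, exponent=0.7, n=20000, seed=1):
    """Toy Monte Carlo propagation: sample a reaction rate R and push each
    draw through an assumed abundance law Y proportional to (R/R0)**exponent.
    The exponent 0.7 is a made-up sensitivity, not a BBN fit."""
    rng = random.Random(seed)
    samples = [
        (max(rng.gauss(rate_mean, rate_mean * rate_rel_err), 1e-12) / rate_mean)
        ** exponent
        for _ in range(n)
    ]
    return statistics.mean(samples), statistics.stdev(samples)

# A 5% rate uncertainty maps to roughly exponent * 5% (about 3.5%) in Y.
mean_y, sd_y = propagate(rate_mean=1.0, rate_rel_err=0.05)
print(round(mean_y, 3), round(sd_y, 3))
```

Real BBN codes propagate many correlated reaction rates through a full network integration; this only shows the sampling idea behind the propagation.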

  8. IZVEDBENI ELEMENTI U BIG BROTHERU

    OpenAIRE

    Radman, Korana

    2009-01-01

Big Brother offers its audience an "ultimate reality" secured by round-the-clock surveillance by television cameras, which has been a subject of debate since the show first aired in Europe and worldwide. With this in mind, this paper approaches Big Brother from the perspective of performance studies, attempting to identify in it some possible performances.

  9. Ethische aspecten van big data

    NARCIS (Netherlands)

    N. (Niek) van Antwerpen; Klaas Jan Mollema

    2017-01-01

Big data has not only given rise to challenging technical questions; it also comes with a host of new ethical and moral issues. To handle big data responsibly, these issues must be considered as well, because poor use of data can have adverse consequences for

  10. Passport to the Big Bang

    CERN Multimedia

    De Melis, Cinzia

    2013-01-01

On 2 June 2013, CERN inaugurates the Passport to the Big Bang project at a large public event. Poster and programme. On 2 June 2013 CERN launches a scientific tourist trail through the Pays de Gex and the Canton of Geneva known as the Passport to the Big Bang. Poster and Programme.

  11. From Big Bang to Eternity?

    Indian Academy of Sciences (India)

    that the expansion started billions of years ago from an explosive Big Bang. Recent research sheds new light on the key cosmological question about the distant future: Will the universe expand forever or will it eventually revert to a contraction that ends in an apocalyptic "big crunch?" Our own galaxy, the Milky Way, has ...

  12. Was the big bang hot

    International Nuclear Information System (INIS)

    Wright, E.L.

    1983-01-01

    The author considers experiments to confirm the substantial deviations from a Planck curve in the Woody and Richards spectrum of the microwave background, and search for conducting needles in our galaxy. Spectral deviations and needle-shaped grains are expected for a cold Big Bang, but are not required by a hot Big Bang. (Auth.)

  13. Big Data and Nursing: Implications for the Future.

    Science.gov (United States)

    Topaz, Maxim; Pruinelli, Lisiane

    2017-01-01

Big data is becoming increasingly prevalent and it affects the way nurses learn, practice, conduct research and develop policy. The discipline of nursing needs to maximize the benefits of big data to advance the vision of promoting human health and wellbeing. However, current practicing nurses, educators and nurse scientists often lack the required skills and competencies necessary for meaningful use of big data. Some of the key skills for further development include the ability to mine narrative and structured data for new care or outcome patterns, effective data visualization techniques, and further integration of nursing-sensitive data into artificial intelligence systems for better clinical decision support. We provide growth-path vision recommendations for big data competencies for practicing nurses, nurse educators, researchers, and policy makers to help prepare the next generation of nurses and improve patient outcomes through better-quality connected health.

  14. Intravenous adenosine SPECT thallium imaging

    International Nuclear Information System (INIS)

    Joyce, J.M.; Grossman, S.J.; Garrett, J.S.; Sharma, B.; Geller, M.; Sweeney, P.J.

    1991-01-01

This paper determines the safety and efficacy of intravenous (IV) adenosine in females for the evaluation of coronary artery disease, since only limited data are available. Eighty consecutive studies of 78 female subjects (aged 43-83 years) using IV adenosine (0.14 mg/kg per minute) with Tl-201 SPECT imaging were reviewed. Fifty-eight (73%) had mild symptoms: mild dyspnea (24%), flushing (23%), chest pain (23%), headache (11%), dizziness (11%), weakness (9%), nausea (8%), abdominal pain (8%), arm pain (6%), chest tightness (4%), neck tightness (4%), dry mouth (4%), and dropped P waves (4%). Four had moderate symptoms: dyspnea requiring Proventil or aminophylline (2%), significant hypotension (1%), and third-degree atrioventricular heart block (1%). Two had severe symptoms: ventricular tachycardia requiring cardioversion (1%) and severe dyspnea requiring epinephrine (1%). Twenty-two (28%) underwent cardiac catheterization that demonstrated coronary artery disease or postangioplasty results. The thallium SPECT images were 94% sensitive and 100% specific in detecting significant disease. The one false-negative result was in a subject who experienced no symptoms or ECG changes during adenosine infusion. Ischemic ECG changes were 35% sensitive and 100% specific. Chest pain was 53% sensitive and 60% specific.
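The sensitivity and specificity figures reported above follow from simple confusion-matrix arithmetic. The counts below are hypothetical, chosen only to be consistent with the reported numbers (94% sensitivity with one false negative, 100% specificity, 22 catheterized patients); the actual split is not given in the abstract.

```python
def sensitivity(tp, fn):
    """True-positive rate: diseased patients correctly flagged."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """True-negative rate: disease-free patients correctly cleared."""
    return tn / (tn + fp)

# Hypothetical split of the 22 catheterized patients:
# 16 true positives, 1 false negative, 5 true negatives, 0 false positives.
tp, fn, tn, fp = 16, 1, 5, 0
print(round(100 * sensitivity(tp, fn)))   # 94
print(round(100 * specificity(tn, fp)))   # 100
```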

  15. Tiny tweaks, big changes: An alternative strategy to empower ethical culture of human research in anesthesia (A Taiwan Acta Anesthesiologica Taiwanica-Ethics Review Task Force Report).

    Science.gov (United States)

    Luk, Hsiang-Ning; Ennever, John F; Day, Yuan-Ji; Wong, Chih-Shung; Sun, Wei-Zen

    2015-03-01

For this guidance article, the Ethics Review Task Force (ERTF) of the Journal reviewed and discussed the ethics issues related to publication of human research in the field of anesthesia. ERTF first introduced international ethics principles and minimal reporting requirements for ethics practices, then discussed the universal problems of publication ethics. ERTF then compared the accountability and methodology of several medical journals in assuring authors' ethics compliance. Using the Taiwan Institutional Review Board system as an example, ERTF emphasized the importance of institutional review board registration and accreditation in assuring human participant protection. ERTF presented four major cases of human research misconduct in the field of anesthesia in recent years. ERTF finally proposed a flowchart to guide journal peer reviewers and editors in ethics review during the editorial process in publishing. Examples of template language applied in the Ethics statement section of the manuscript are expected to strengthen the ethics compliance of authors and to set an ethical culture for all the stakeholders involved in human research. Copyright © 2015. Published by Elsevier B.V.

  16. [Three applications and the challenge of the big data in otology].

    Science.gov (United States)

    Lei, Guanxiong; Li, Jianan; Shen, Weidong; Yang, Shiming

    2016-03-01

With the expansion of human practical activities, more and more areas are confronted with big data problems. The emergence of big data requires people to update the research paradigm and develop new technical methods. This review discusses how big data may bring opportunities and challenges in the areas of auditory implantation, the deafness genome, and auditory pathophysiology, and points out that appropriate theories and methods are needed to turn these expectations into reality.

  17. A matrix big bang

    Science.gov (United States)

    Craps, Ben; Sethi, Savdeep; Verlinde, Erik

    2005-10-01

    The light-like linear dilaton background represents a particularly simple time-dependent 1/2 BPS solution of critical type-IIA superstring theory in ten dimensions. Its lift to M-theory, as well as its Einstein frame metric, are singular in the sense that the geometry is geodesically incomplete and the Riemann tensor diverges along a light-like subspace of codimension one. We study this background as a model for a big bang type singularity in string theory/M-theory. We construct the dual Matrix theory description in terms of a (1+1)-d supersymmetric Yang-Mills theory on a time-dependent world-sheet given by the Milne orbifold of (1+1)-d Minkowski space. Our model provides a framework in which the physics of the singularity appears to be under control.

  18. The Last Big Bang

    Energy Technology Data Exchange (ETDEWEB)

    McGuire, Austin D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Meade, Roger Allen [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-09-13

    As one of the very few people in the world to give the “go/no go” decision to detonate a nuclear device, Austin “Mac” McGuire holds a very special place in the history of both the Los Alamos National Laboratory and the world. As Commander of Joint Task Force Unit 8.1.1, on Christmas Island in the spring and summer of 1962, Mac directed the Los Alamos data collection efforts for twelve of the last atmospheric nuclear detonations conducted by the United States. Since data collection was at the heart of nuclear weapon testing, it fell to Mac to make the ultimate decision to detonate each test device. He calls his experience THE LAST BIG BANG, since these tests, part of Operation Dominic, were characterized by the dramatic displays of the heat, light, and sounds unique to atmospheric nuclear detonations – never, perhaps, to be witnessed again.

  19. A matrix big bang

    International Nuclear Information System (INIS)

    Craps, Ben; Sethi, Savdeep; Verlinde, Erik

    2005-01-01

    The light-like linear dilaton background represents a particularly simple time-dependent 1/2 BPS solution of critical type-IIA superstring theory in ten dimensions. Its lift to M-theory, as well as its Einstein frame metric, are singular in the sense that the geometry is geodesically incomplete and the Riemann tensor diverges along a light-like subspace of codimension one. We study this background as a model for a big bang type singularity in string theory/M-theory. We construct the dual Matrix theory description in terms of a (1+1)-d supersymmetric Yang-Mills theory on a time-dependent world-sheet given by the Milne orbifold of (1+1)-d Minkowski space. Our model provides a framework in which the physics of the singularity appears to be under control

  20. A matrix big bang

    Energy Technology Data Exchange (ETDEWEB)

    Craps, Ben [Instituut voor Theoretische Fysica, Universiteit van Amsterdam, Valckenierstraat 65, 1018 XE Amsterdam (Netherlands); Sethi, Savdeep [Enrico Fermi Institute, University of Chicago, Chicago, IL 60637 (United States); Verlinde, Erik [Instituut voor Theoretische Fysica, Universiteit van Amsterdam, Valckenierstraat 65, 1018 XE Amsterdam (Netherlands)

    2005-10-15

    The light-like linear dilaton background represents a particularly simple time-dependent 1/2 BPS solution of critical type-IIA superstring theory in ten dimensions. Its lift to M-theory, as well as its Einstein frame metric, are singular in the sense that the geometry is geodesically incomplete and the Riemann tensor diverges along a light-like subspace of codimension one. We study this background as a model for a big bang type singularity in string theory/M-theory. We construct the dual Matrix theory description in terms of a (1+1)-d supersymmetric Yang-Mills theory on a time-dependent world-sheet given by the Milne orbifold of (1+1)-d Minkowski space. Our model provides a framework in which the physics of the singularity appears to be under control.

  1. Big Data Aesthetics

    DEFF Research Database (Denmark)

    Bjørnsten, Thomas

    2016-01-01

The article addresses both the problems and possibilities entailed in extending the use of large data sets – or Big Data – into the sphere of art and the aesthetic. Central to the discussion here is the analysis of how different structuring principles of data and the discourses that surround these principles shape our perception of data. This discussion involves considerations on various notions of the ‘database’ and ‘narrative’, as well as ‘aesthetics’ and ‘aesthetic experience’, with the latter conceived as a theoretical field and methodological approach to understand the interplay between sensation, information, experience, and knowledge production. The aim is to critically question how and whether such artistic practices can eventually lead to the experience and production of knowledge that could not otherwise be obtained via more traditional ways of data representation.

  2. Big bang and big crunch in matrix string theory

    International Nuclear Information System (INIS)

    Bedford, J.; Ward, J.; Papageorgakis, C.; Rodriguez-Gomez, D.

    2007-01-01

    Following the holographic description of linear dilaton null cosmologies with a big bang in terms of matrix string theory put forward by Craps, Sethi, and Verlinde, we propose an extended background describing a universe including both big bang and big crunch singularities. This belongs to a class of exact string backgrounds and is perturbative in the string coupling far away from the singularities, both of which can be resolved using matrix string theory. We provide a simple theory capable of describing the complete evolution of this closed universe

  3. Intravenous Iron Carboxymaltose as a Potential Therapeutic in Anemia of Inflammation.

    Directory of Open Access Journals (Sweden)

    Niklas Lofruthe

Intravenous iron supplementation is an effective therapy in iron deficiency anemia (IDA), but controversial in anemia of inflammation (AI). Unbound iron can be used by bacteria and viruses for their replication and can enhance the inflammatory response. High molecular weight iron complexes now available for intravenous iron substitution, such as ferric carboxymaltose, might be useful in AI, as these pharmaceuticals deliver low doses of free iron over a prolonged period of time. We tested the effects of intravenous iron carboxymaltose in murine AI: wild-type mice were exposed to the heat-killed Brucella abortus (BA) model and treated with or without high molecular weight intravenous iron. 4 h after BA injection followed by 2 h after intravenous iron treatment, inflammatory cytokines were upregulated by BA, but not enhanced by iron treatment. In long-term experiments, mice were fed a regular or an iron-deficient diet and then treated with intravenous iron or saline 14 days after BA injection. Iron treatment in mice with BA-induced AI was effective 24 h after iron administration. In contrast, mice with IDA (on the iron-deficient diet prior to BA-induced AI) required 7 days to recover from AI. In these experiments, inflammatory markers were not further induced in iron-treated compared to vehicle-treated BA-injected mice. These results demonstrate that intravenous iron supplementation effectively treated murine BA-induced AI without further enhancing the inflammatory response. Studies in humans are needed to establish treatment options for AI in patients.

  4. BigOP: Generating Comprehensive Big Data Workloads as a Benchmarking Framework

    OpenAIRE

    Zhu, Yuqing; Zhan, Jianfeng; Weng, Chuliang; Nambiar, Raghunath; Zhang, Jinchao; Chen, Xingzhen; Wang, Lei

    2014-01-01

Big Data is considered a proprietary asset of companies, organizations, and even nations. Turning big data into real treasure requires the support of big data systems. A variety of commercial and open source products have been unleashed for big data storage and processing. While big data users are facing the choice of which system best suits their needs, big data system developers are facing the question of how to evaluate their systems with regard to general big data processing needs. System b...

  5. INFECTIVE ENDOCARDITIS IN INTRAVENOUS DRUGS ABUSED PATIENT

    Directory of Open Access Journals (Sweden)

    E. Y. Ponomareva

    2011-01-01

Three-year observation of acute tricuspid infective endocarditis in an intravenous drug-abusing patient: diagnosis, clinical features, visceral lesions, the possibility of cardiac surgery and conservative treatment, and outcome.

  6. Intravenous immunoglobulin for chronic inflammatory demyelinating polyradiculoneuropathy

    NARCIS (Netherlands)

    Eftimov, Filip; Winer, John B.; Vermeulen, Marinus; de Haan, Rob; van Schaik, Ivo N.

    2013-01-01

    Chronic inflammatory demyelinating polyradiculoneuropathy (CIDP) causes progressive or relapsing weakness and numbness of the limbs, developing over at least two months. Uncontrolled studies suggest that intravenous immunoglobulin (IVIg) helps. This review was first published in 2002 and has since

  7. Intravenous immunoglobulin for chronic inflammatory demyelinating polyradiculoneuropathy

    NARCIS (Netherlands)

    Eftimov, Filip; Winer, John B.; Vermeulen, Marinus; de Haan, Rob; van Schaik, Ivo N.

    2009-01-01

    Background Chronic inflammatory demyelinating polyradiculoneuropathy (CIDP) causes progressive or relapsing weakness and numbness of the limbs, developing over at least two months. Uncontrolled studies suggest that intravenous immunoglobulin (IVIg) helps. Objectives To review systematically the

  8. From big bang to big crunch and beyond

    International Nuclear Information System (INIS)

    Elitzur, Shmuel; Rabinovici, Eliezer; Giveon, Amit; Kutasov, David

    2002-01-01

    We study a quotient Conformal Field Theory, which describes a 3+1 dimensional cosmological spacetime. Part of this spacetime is the Nappi-Witten (NW) universe, which starts at a 'big bang' singularity, expands and then contracts to a 'big crunch' singularity at a finite time. The gauged WZW model contains a number of copies of the NW spacetime, with each copy connected to the preceding one and to the next one at the respective big bang/big crunch singularities. The sequence of NW spacetimes is further connected at the singularities to a series of non-compact static regions with closed timelike curves. These regions contain boundaries, on which the observables of the theory live. This suggests a holographic interpretation of the physics. (author)

  9. Big Data Mining: Tools & Algorithms

    Directory of Open Access Journals (Sweden)

    Adeel Shiraz Hashmi

    2016-03-01

We are now in the Big Data era, and there is a growing demand for tools which can process and analyze it. Big data analytics deals with extracting valuable information from complex data that can't be handled by traditional data mining tools. This paper surveys the available tools which can handle large volumes of data as well as evolving data streams. The data mining tools and algorithms which can handle big data have also been summarized, and one of the tools has been used for mining of large datasets using distributed algorithms.
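As a concrete taste of the bounded-memory algorithms that stream-mining tools of this kind rely on (the abstract does not name a specific algorithm; this choice is ours for illustration), here is the classic Misra–Gries heavy-hitters sketch, which finds every item occurring more than n/k times in a single pass while keeping at most k-1 counters:

```python
def misra_gries(stream, k):
    """One-pass heavy-hitters sketch using at most k-1 counters.
    Any item with frequency > len(stream)/k is guaranteed to survive."""
    counters = {}
    for item in stream:
        if item in counters:
            counters[item] += 1
        elif len(counters) < k - 1:
            counters[item] = 1
        else:
            # Counters full: decrement every counter, dropping zeros.
            for key in list(counters):
                counters[key] -= 1
                if counters[key] == 0:
                    del counters[key]
    return counters

stream = ["a"] * 50 + ["b"] * 30 + ["c"] * 5 + ["d"] * 5
candidates = misra_gries(stream, k=3)
print(sorted(candidates))  # → ['a', 'b']
```

The surviving counters are candidates only; an exact second pass over the stream (or a stored sample) is needed to confirm true frequencies.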

  10. Google BigQuery analytics

    CERN Document Server

    Tigani, Jordan

    2014-01-01

    How to effectively use BigQuery, avoid common mistakes, and execute sophisticated queries against large datasets Google BigQuery Analytics is the perfect guide for business and data analysts who want the latest tips on running complex queries and writing code to communicate with the BigQuery API. The book uses real-world examples to demonstrate current best practices and techniques, and also explains and demonstrates streaming ingestion, transformation via Hadoop in Google Compute engine, AppEngine datastore integration, and using GViz with Tableau to generate charts of query results. In addit

  11. Rate Change Big Bang Theory

    Science.gov (United States)

    Strickland, Ken

    2013-04-01

    The Rate Change Big Bang Theory redefines the birth of the universe with a dramatic shift in energy direction and a new vision of the first moments. With rate change graph technology (RCGT) we can look back 13.7 billion years and experience every step of the big bang through geometrical intersection technology. The analysis of the Big Bang includes a visualization of the first objects, their properties, the astounding event that created space and time as well as a solution to the mystery of anti-matter.

  12. Intravenous iron-containing products: EMA procrastination.

    Science.gov (United States)

    2014-07-01

    A European reassessment has led to identical changes in the summaries of product characteristics (SPCs) for all intravenous iron-containing products: the risk of serious adverse effects is now highlighted, underlining the fact that intravenous iron-containing products should only be used when the benefits clearly outweigh the harms. Unfortunately, iron dextran still remains on the market despite a higher risk of hypersensitivity reactions than with iron sucrose.

  13. Contrast medium extravasation in intravenous urography

    International Nuclear Information System (INIS)

    Tosch, U.; Becker-Gaab, C.; Hahn, D.

    1984-01-01

Aetiology and diagnostic procedure of calyceal fornix rupture during intravenous urography are discussed. In the literature the fornix rupture is described as a spontaneous event - not so in the four cases presented. In two cases a sudden increase in intrapelvic pressure was due to an ureteric calculus; in the other cases an obstruction of the ureter was secondary to neoplasm. It is recommended to perform a CT as soon as a contrast medium extravasation in intravenous urography is diagnosed. (orig.)

  14. Contrast medium extravasation in intravenous urography

    Energy Technology Data Exchange (ETDEWEB)

    Tosch, U.; Becker-Gaab, C.; Hahn, D.

    1984-09-01

Aetiology and diagnostic procedure of calyceal fornix rupture during intravenous urography are discussed. In the literature the fornix rupture is described as a spontaneous event - not so in the four cases presented. In two cases a sudden increase in intrapelvic pressure was due to an ureteric calculus; in the other cases an obstruction of the ureter was secondary to neoplasm. It is recommended to perform a CT as soon as a contrast medium extravasation in intravenous urography is diagnosed.

  15. The ethics of big data in big agriculture

    OpenAIRE

    Carbonell (Isabelle M.)

    2016-01-01

    This paper examines the ethics of big data in agriculture, focusing on the power asymmetry between farmers and large agribusinesses like Monsanto. Following the recent purchase of Climate Corp., Monsanto is currently the most prominent biotech agribusiness to buy into big data. With wireless sensors on tractors monitoring or dictating every decision a farmer makes, Monsanto can now aggregate large quantities of previously proprietary farming data, enabling a privileged position with unique in...

  16. Big Data is invading big places as CERN

    CERN Multimedia

    CERN. Geneva

    2017-01-01

Big Data technologies are becoming more popular with the constant growth of data generation in fields such as social networks, the internet of things, and laboratories like CERN. How is CERN making use of such technologies? How is machine learning applied at CERN with Big Data technologies? How much data do we move, and how is it analyzed? All these questions will be answered during the talk.

  17. Intravenous and Intramuscular Formulations of Antiseizure Drugs in the Treatment of Epilepsy.

    Science.gov (United States)

    Patel, Sima I; Birnbaum, Angela K; Cloyd, James C; Leppik, Ilo E

    2015-12-01

Intravenous and intramuscular antiseizure drugs (ASDs) are essential in the treatment of clinical seizure emergencies as well as in replacement therapy when oral administration is not possible. The parenteral formulations provide rapid delivery and complete (intravenous) or nearly complete (intramuscular) bioavailability. Controlled administration of the ASD is feasible with intravenous but not intramuscular formulations. This article reviews the literature and discusses the chemistry, pharmacology, pharmacokinetics, and clinical use of currently available intravenous and intramuscular ASD formulations as well as the development of new formulations and agents. Intravenous or intramuscular formulations of lorazepam, diazepam, midazolam, and clonazepam are typically used as the initial treatment agents in seizure emergencies. Recent studies also support intramuscular midazolam as easier to administer than intravenous lorazepam in the pre-hospital setting. However, benzodiazepines may be associated with hypotension and respiratory depression. Although loading with intravenous phenytoin was an early approach to treatment, it is associated with cardiac arrhythmias, hypotension, and tissue injury at the injection site. This has made it less favored than fosphenytoin, a water-soluble, phosphorylated phenytoin molecule. Other drugs being used for acute seizure emergencies are intravenous formulations of valproic acid, levetiracetam, and lacosamide. However, the comparative effectiveness of these for status epilepticus (SE) has not been evaluated adequately. Consequently, guidelines for the medical management of SE continue to recommend lorazepam followed by fosphenytoin, or phenytoin if fosphenytoin is not available. Intravenous solutions for carbamazepine, lamotrigine, and topiramate have been developed but remain investigational. The current ASDs were not developed for use in emergency situations, but were adapted from ASDs approved for chronic oral use.

  18. Hey, big spender

    Energy Technology Data Exchange (ETDEWEB)

    Cope, G.

    2000-04-01

Business to business electronic commerce is looming large in the future of the oil industry. It is estimated that by adopting e-commerce the industry could achieve bottom line savings of between $1.8 to $3.4 billion a year on annual gross revenues in excess of $30 billion. At present there are several teething problems to overcome such as inter-operability standards, which are at least two or three years away. Tying in electronically with specific suppliers is also an expensive proposition, although the big benefits are in fact in doing business with the same suppliers on a continuing basis. Despite these problems, 14 of the world's largest energy and petrochemical companies joined forces in mid-April to create a single Internet procurement marketplace for the industry's complex supply chain. The exchange was designed by B2B (business-to-business) software provider Commerce One Inc.; it will leverage the buying clout of these industry giants (BP Amoco, Royal Dutch Shell Group, Conoco, Occidental Petroleum, Phillips Petroleum, Unocal Corporation and Statoil among them), currently about $125 billion on procurement per year; they hope to save between 5 to 30 per cent depending on the product and the region involved. Other similar schemes such as Chevron and partners' Petrocosm Marketplace and Network Oil, a Houston-based Internet portal aimed at smaller petroleum companies, are also doing business in the $10 billion per annum range. e-Energy, a cooperative project between IBM, Ericsson and Telus Advertising, is another neutral, virtual marketplace targeted at the oil and gas sector. PetroTRAX, a Calgary-based website, plans to take online procurement and auction sales a big step forward by establishing a portal to handle any oil company's asset management needs. There are also a number of websites targeting specific needs: IndigoPool.com (acquisitions and divestitures) and WellBid.com (products related to upstream oil and gas operators) are just two examples.

  19. Big climate data analysis

    Science.gov (United States)

    Mudelsee, Manfred

    2015-04-01

The Big Data era has begun in the climate sciences too, not only in economics or molecular biology. We measure climate at increasing spatial resolution by means of satellites and look farther back in time at increasing temporal resolution by means of natural archives and proxy data. We use powerful supercomputers to run climate models. The model output of the calculations made for the IPCC's Fifth Assessment Report amounts to ~650 TB. The 'scientific evolution' of grid computing has started, and the 'scientific revolution' of quantum computing is being prepared. This will increase computing power, and data amount, by several orders of magnitude in the future. However, more data does not automatically mean more knowledge. We need statisticians, who are at the core of transforming data into knowledge. Statisticians notably also explore the limits of our knowledge (uncertainties, that is, confidence intervals and P-values). Mudelsee (2014, Climate Time Series Analysis: Classical Statistical and Bootstrap Methods, second edition, Springer, Cham, xxxii + 454 pp.) coined the term 'optimal estimation'. Consider the hyperspace of climate estimation. It has many, but not infinite, dimensions. It consists of the three subspaces: Monte Carlo design, method, and measure. The Monte Carlo design describes the data-generating process. The method subspace describes the estimation and confidence interval construction. The measure subspace describes how to detect the optimal estimation method for the Monte Carlo experiment. The envisaged large increase in computing power may bring the following idea of optimal climate estimation into existence. Given a data sample, some prior information (e.g. measurement standard errors) and a set of questions (parameters to be estimated), the first task is simple: perform an initial estimation on the basis of existing knowledge and experience with such types of estimation problems.
The second task requires the computing power: explore the hyperspace to
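The bootstrap confidence-interval construction that this 'optimal estimation' framework builds on can be sketched minimally as a percentile bootstrap (the data values, resample count, and function name below are illustrative assumptions, not taken from the abstract):

```python
import random
import statistics

def bootstrap_ci(data, stat=statistics.mean, n_resamples=2000,
                 alpha=0.05, seed=42):
    """Percentile bootstrap confidence interval for a statistic:
    resample the data with replacement, recompute the statistic each
    time, and read off the empirical (alpha/2, 1-alpha/2) quantiles."""
    rng = random.Random(seed)
    replicates = sorted(
        stat([rng.choice(data) for _ in data]) for _ in range(n_resamples)
    )
    lo = replicates[int((alpha / 2) * n_resamples)]
    hi = replicates[int((1 - alpha / 2) * n_resamples) - 1]
    return lo, hi

# Illustrative sample, e.g. ten proxy-derived temperature values (deg C).
sample = [14.2, 13.9, 14.8, 15.1, 13.5, 14.4, 14.9, 13.7, 14.1, 14.6]
lo, hi = bootstrap_ci(sample)
print(round(lo, 2), round(hi, 2))
```

For serially correlated climate time series, the plain resampling above would be replaced by a block bootstrap to preserve autocorrelation.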

  20. Hey, big spender

    International Nuclear Information System (INIS)

    Cope, G.

    2000-01-01

Business to business electronic commerce is looming large in the future of the oil industry. It is estimated that by adopting e-commerce the industry could achieve bottom line savings of between $1.8 to $3.4 billion a year on annual gross revenues in excess of $30 billion. At present there are several teething problems to overcome such as inter-operability standards, which are at least two or three years away. Tying in electronically with specific suppliers is also an expensive proposition, although the big benefits are in fact in doing business with the same suppliers on a continuing basis. Despite these problems, 14 of the world's largest energy and petrochemical companies joined forces in mid-April to create a single Internet procurement marketplace for the industry's complex supply chain. The exchange was designed by B2B (business-to-business) software provider Commerce One Inc.; it will leverage the buying clout of these industry giants (BP Amoco, Royal Dutch Shell Group, Conoco, Occidental Petroleum, Phillips Petroleum, Unocal Corporation and Statoil among them), currently about $125 billion on procurement per year; they hope to save between 5 to 30 per cent depending on the product and the region involved. Other similar schemes such as Chevron and partners' Petrocosm Marketplace and Network Oil, a Houston-based Internet portal aimed at smaller petroleum companies, are also doing business in the $10 billion per annum range. e-Energy, a cooperative project between IBM, Ericsson and Telus Advertising, is another neutral, virtual marketplace targeted at the oil and gas sector. PetroTRAX, a Calgary-based website, plans to take online procurement and auction sales a big step forward by establishing a portal to handle any oil company's asset management needs. There are also a number of websites targeting specific needs: IndigoPool.com (acquisitions and divestitures) and WellBid.com (products related to upstream oil and gas operators) are just two examples.
All in

  1. Measuring the Promise of Big Data Syllabi

    Science.gov (United States)

    Friedman, Alon

    2018-01-01

    Growing interest in Big Data is leading industries, academics and governments to accelerate Big Data research. However, how teachers should teach Big Data has not been fully examined. This article suggests criteria for redesigning Big Data syllabi in public and private degree-awarding higher education establishments. The author conducted a survey…

  2. A History of Intravenous Anesthesia in War (1656-1988).

    Science.gov (United States)

    Roberts, Matthew; Jagdish, S

    2016-01-01

    The practice of anesthesia in war places significant restraints on the choice of anesthetic technique used; these include, but are not limited to, safety, simplicity, and portability. Ever since intravenous anesthesia became a practical alternative, there have been military doctors who felt that this technique was particularly suited to this environment. The challenge, as in civilian practice, has been to find the appropriate drugs as well as simple and safe delivery systems. The urgency of war has always stimulated innovation in medicine to counteract the ongoing development of weapons of war and their effects on the human body and to achieve improved survival as public expectations rise. This article traces the development of and the use of intravenous anesthesia by military physicians for battle casualties. The story starts long before the era of modern anesthesia, and the discussion concludes in the dog days of the cold war. The rapidly increasing interest in intravenous anesthesia in both civilian and military practice since the early 1990s is left for other authors to examine. Copyright © 2015 Elsevier B.V. All rights reserved.

  3. The BigBOSS Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schlegel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna

    2011-01-01

    BigBOSS will obtain observational constraints that will bear on three of the four 'science frontier' questions identified by the Astro2010 Cosmology and Fundamental Physics Panel of the Decadal Survey: Why is the universe accelerating? What is dark matter? What are the properties of neutrinos? Indeed, the BigBOSS project was recommended for substantial immediate R and D support by the PASAG report. The second highest ground-based priority from the Astro2010 Decadal Survey was the creation of a funding line within the NSF to support a 'Mid-Scale Innovations' program, and it used BigBOSS as a 'compelling' example for support. This choice was the result of the Decadal Survey's Program Prioritization panels reviewing 29 mid-scale projects and recommending BigBOSS 'very highly'.

  4. Medios ciudadanos y big data: La emergencia del activismo de datos

    NARCIS (Netherlands)

    Milan, S.; Gutiérrez, M.

    2015-01-01

    Big data presents citizens with new challenges and opportunities. ‘Data activism’ practices emerge at the intersection of the social and technological dimension of human action, whereby citizens take a critical approach to big data, and appropriate and manipulate data for advocacy and social

  5. The role of big laboratories

    CERN Document Server

    Heuer, Rolf-Dieter

    2013-01-01

    This paper presents the role of big laboratories in their function as research infrastructures. Starting from the general definition and features of big laboratories, the paper goes on to present the key ingredients and issues, based on scientific excellence, for the successful realization of large-scale science projects at such facilities. The paper concludes by taking the example of scientific research in the field of particle physics and describing the structures and methods required to be implemented for the way forward.

  6. The role of big laboratories

    International Nuclear Information System (INIS)

    Heuer, R-D

    2013-01-01

    This paper presents the role of big laboratories in their function as research infrastructures. Starting from the general definition and features of big laboratories, the paper goes on to present the key ingredients and issues, based on scientific excellence, for the successful realization of large-scale science projects at such facilities. The paper concludes by taking the example of scientific research in the field of particle physics and describing the structures and methods required to be implemented for the way forward. (paper)

  7. Big Data, Small Sample.

    Science.gov (United States)

    Gerlovina, Inna; van der Laan, Mark J; Hubbard, Alan

    2017-05-20

    Multiple comparisons and small sample size, common characteristics of many types of "Big Data" including those that are produced by genomic studies, present specific challenges that affect reliability of inference. Use of multiple testing procedures necessitates calculation of very small tail probabilities of a test statistic distribution. Results based on large deviation theory provide a formal condition that is necessary to guarantee error rate control given practical sample sizes, linking the number of tests and the sample size; this condition, however, is rarely satisfied. Using methods that are based on Edgeworth expansions (relying especially on the work of Peter Hall), we explore the impact of departures of sampling distributions from typical assumptions on actual error rates. Our investigation illustrates how far the actual error rates can be from the declared nominal levels, suggesting potentially widespread problems with error rate control, specifically excessive false positives. This is an important factor that contributes to the "reproducibility crisis". We also review some other commonly used methods (such as permutation and methods based on finite sampling inequalities) in their application to multiple testing/small sample data. We point out that Edgeworth expansions, providing higher order approximations to the sampling distribution, offer a promising direction for data analysis that could improve reliability of studies relying on large numbers of comparisons with modest sample sizes.
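
    The multiplicity problem the abstract describes can be seen in a few lines of simulation (an illustrative sketch, not code from the paper): with thousands of true-null tests at a fixed nominal level, uncorrected false positives accumulate at roughly alpha times the number of tests, which is why procedures such as Bonferroni must control very small tail probabilities.

```python
# Illustrative sketch: false positives under multiple testing with small samples.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
B, n, alpha = 5000, 10, 0.05      # number of tests, per-test sample size, nominal level

# All B null hypotheses are true: samples drawn from N(0, 1), testing mu = 0.
data = rng.normal(size=(B, n))
_, pvals = stats.ttest_1samp(data, popmean=0.0, axis=1)

uncorrected = int(np.sum(pvals < alpha))      # expect roughly alpha * B = 250
bonferroni = int(np.sum(pvals < alpha / B))   # family-wise error control

print(f"uncorrected rejections: {uncorrected} of {B}")
print(f"Bonferroni rejections:  {bonferroni} of {B}")
```

    Note that this toy simulation uses a well-behaved normal sampling distribution; the paper's point is that with skewed or heavy-tailed data the true tail probabilities depart from the t-distribution's, so actual error rates can exceed even this uncorrected count.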

  8. Big bang darkleosynthesis

    Directory of Open Access Journals (Sweden)

    Gordan Krnjaic

    2015-12-01

    Full Text Available In a popular class of models, dark matter comprises an asymmetric population of composite particles with short range interactions arising from a confined nonabelian gauge group. We show that coupling this sector to a well-motivated light mediator particle yields efficient darkleosynthesis, a dark-sector version of big-bang nucleosynthesis (BBN), in generic regions of parameter space. Dark matter self-interaction bounds typically require the confinement scale to be above ΛQCD, which generically yields large (≫MeV/dark-nucleon) binding energies. These bounds further suggest the mediator is relatively weakly coupled, so repulsive forces between dark-sector nuclei are much weaker than Coulomb repulsion between standard-model nuclei, which results in an exponential barrier-tunneling enhancement over standard BBN. Thus, darklei are easier to make and harder to break than visible species with comparable mass numbers. This process can efficiently yield a dominant population of states with masses significantly greater than the confinement scale and, in contrast to dark matter that is a fundamental particle, may allow the dominant form of dark matter to have high spin (S≫3/2), whose discovery would be smoking gun evidence for dark nuclei.

  9. Predicting big bang deuterium

    Energy Technology Data Exchange (ETDEWEB)

    Hata, N.; Scherrer, R.J.; Steigman, G.; Thomas, D.; Walker, T.P. [Department of Physics, Ohio State University, Columbus, Ohio 43210 (United States)

    1996-02-01

    We present new upper and lower bounds to the primordial abundances of deuterium and ³He based on observational data from the solar system and the interstellar medium. Independent of any model for the primordial production of the elements we find (at the 95% C.L.): 1.5×10⁻⁵ ≤ (D/H)_P ≤ 10.0×10⁻⁵ and (³He/H)_P ≤ 2.6×10⁻⁵. When combined with the predictions of standard big bang nucleosynthesis, these constraints lead to a 95% C.L. bound on the primordial abundance of deuterium: (D/H)_best = (3.5 +2.7/−1.8)×10⁻⁵. Measurements of deuterium absorption in the spectra of high-redshift QSOs will directly test this prediction. The implications of this prediction for the primordial abundances of ⁴He and ⁷Li are discussed, as well as those for the universal density of baryons. © 1996 The American Astronomical Society.

  10. Big bang darkleosynthesis

    Science.gov (United States)

    Krnjaic, Gordan; Sigurdson, Kris

    2015-12-01

    In a popular class of models, dark matter comprises an asymmetric population of composite particles with short range interactions arising from a confined nonabelian gauge group. We show that coupling this sector to a well-motivated light mediator particle yields efficient darkleosynthesis, a dark-sector version of big-bang nucleosynthesis (BBN), in generic regions of parameter space. Dark matter self-interaction bounds typically require the confinement scale to be above ΛQCD, which generically yields large (≫MeV /dark-nucleon) binding energies. These bounds further suggest the mediator is relatively weakly coupled, so repulsive forces between dark-sector nuclei are much weaker than Coulomb repulsion between standard-model nuclei, which results in an exponential barrier-tunneling enhancement over standard BBN. Thus, darklei are easier to make and harder to break than visible species with comparable mass numbers. This process can efficiently yield a dominant population of states with masses significantly greater than the confinement scale and, in contrast to dark matter that is a fundamental particle, may allow the dominant form of dark matter to have high spin (S ≫ 3 / 2), whose discovery would be smoking gun evidence for dark nuclei.

  11. Biophotonics: the big picture

    Science.gov (United States)

    Marcu, Laura; Boppart, Stephen A.; Hutchinson, Mark R.; Popp, Jürgen; Wilson, Brian C.

    2018-02-01

    The 5th International Conference on Biophotonics (ICOB) held April 30 to May 1, 2017, in Fremantle, Western Australia, brought together opinion leaders to discuss future directions for the field and opportunities to consider. The first session of the conference, "How to Set a Big Picture Biophotonics Agenda," was focused on setting the stage for developing a vision and strategies for translation and impact on society of biophotonic technologies. The invited speakers, panelists, and attendees engaged in discussions that focused on opportunities and promising applications for biophotonic techniques, challenges when working at the confluence of the physical and biological sciences, driving factors for advances of biophotonic technologies, and educational opportunities. We share a summary of the presentations and discussions. Three main themes from the conference are presented in this position paper that capture the current status, opportunities, challenges, and future directions of biophotonics research and key areas of applications: (1) biophotonics at the nano- to microscale level; (2) biophotonics at meso- to macroscale level; and (3) biophotonics and the clinical translation conundrum.

  12. Big Data Blind Separation

    Directory of Open Access Journals (Sweden)

    Mujahid N. Syed

    2018-02-01

    Full Text Available Data or signal separation is one of the critical areas of data analysis. In this work, the problem of non-negative data separation is considered. The problem can be briefly described as follows: given X ∈ R^(m×N), find A ∈ R^(m×n) and S ∈ R+^(n×N) such that X = AS. Specifically, the problem with sparse locally dominant sources is addressed in this work. Although the problem is well studied in the literature, a test to validate the locally dominant assumption is not yet available. In addition to that, the typical approaches available in the literature sequentially extract the elements of the mixing matrix. In this work, a mathematical modeling-based approach is presented that can simultaneously validate the assumption, and separate the given mixture data. In addition to that, a correntropy-based measure is proposed to reduce the model size. The approach presented in this paper is suitable for big data separation. Numerical experiments are conducted to illustrate the performance and validity of the proposed approach.

  13. Intravenous fluids in acute decompensated heart failure.

    Science.gov (United States)

    Bikdeli, Behnood; Strait, Kelly M; Dharmarajan, Kumar; Li, Shu-Xia; Mody, Purav; Partovian, Chohreh; Coca, Steven G; Kim, Nancy; Horwitz, Leora I; Testani, Jeffrey M; Krumholz, Harlan M

    2015-02-01

    This study sought to determine the use of intravenous fluids in the early care of patients with acute decompensated heart failure (HF) who are treated with loop diuretics. Intravenous fluids are routinely provided to many hospitalized patients. We conducted a retrospective cohort study of patients admitted with HF to 346 hospitals from 2009 to 2010. We assessed the use of intravenous fluids during the first 2 days of hospitalization. We determined the frequency of adverse in-hospital outcomes. We assessed variation in the use of intravenous fluids across hospitals and patient groups. Among 131,430 hospitalizations for HF, 13,806 (11%) were in patients treated with intravenous fluids during the first 2 days. The median volume of administered fluid was 1,000 ml (interquartile range: 1,000 to 2,000 ml), and the most commonly used fluids were normal saline (80%) and half-normal saline (12%). Demographic characteristics and comorbidities were similar in hospitalizations in which patients did and did not receive fluids. Patients who were treated with intravenous fluids had higher rates of subsequent critical care admission (5.7% vs. 3.8%). The proportion of hospitalizations with fluid treatment varied widely across hospitals (range: 0% to 71%; median: 12.5%). Many patients who are hospitalized with HF and receive diuretics also receive intravenous fluids during their early inpatient care, and the proportion varies among hospitals. Such practice is associated with worse outcomes and warrants further investigation. Copyright © 2015 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.

  14. Optimal composition of intravenous lipids

    African Journals Online (AJOL)

    63. Mayer K, Merfels M, Muhly-Reinholz M et al. Omega-3 fatty acids suppress monocyte adhesion to human endothelial cells: role of endothelial PAF generation. Am J Physiol Heart Circ Physiol. 2002;283(2):H811–H818. 64. Fisher M, Levine PH, Weiner BH et al. Dietary n-3 fatty acid supplementation reduces superoxide.

  15. What's cheapest, intravenous iron sucrose or intravenous iron carboxymaltose treatment in IBD patients?

    DEFF Research Database (Denmark)

    Bager, Palle; Dahlerup, Jens Frederik

      What's cheapest, intravenous iron sucrose or intravenous iron carboxymaltose treatment in IBD patients? It depends on the economic evaluation perspective!   Aim: To evaluate the health care cost for intravenous iron sucrose (Venofer®, Vifor) and intravenous iron carboxymaltose (Ferinject......-cost per mg iron is for iron carboxymaltose approximately double the cost of iron sucrose.   Patients and Methods: Data related to 111 IBD-patients treated with intravenous iron at Aarhus University Hospital from August 2005 until October 2009 was used for the economic evaluation. Analysis included......, utensils and ½ hour spent by a nurse per visit; showed approximately 150€ extra cost per 1000 mg Fe++ administered, if iron carboxymaltose was chosen. In contrast the CEA including both BIA-values and patient-related costs (transportation and lost income) showed iron carboxymaltose to be more cost...

  16. Official statistics and Big Data

    Directory of Open Access Journals (Sweden)

    Peter Struijs

    2014-07-01

    Full Text Available The rise of Big Data changes the context in which organisations producing official statistics operate. Big Data provides opportunities, but in order to make optimal use of Big Data, a number of challenges have to be addressed. This stimulates increased collaboration between National Statistical Institutes, Big Data holders, businesses and universities. In time, this may lead to a shift in the role of statistical institutes in the provision of high-quality and impartial statistical information to society. In this paper, the changes in context, the opportunities, the challenges and the way to collaborate are addressed. The collaboration between the various stakeholders will involve each partner building on and contributing different strengths. For national statistical offices, traditional strengths include, on the one hand, the ability to collect data and combine data sources with statistical products and, on the other hand, their focus on quality, transparency and sound methodology. In the Big Data era of competing and multiplying data sources, they continue to have a unique knowledge of official statistical production methods. And their impartiality and respect for privacy as enshrined in law uniquely position them as a trusted third party. Based on this, they may advise on the quality and validity of information of various sources. By thus positioning themselves, they will be able to play their role as key information providers in a changing society.

  17. GEOSS: Addressing Big Data Challenges

    Science.gov (United States)

    Nativi, S.; Craglia, M.; Ochiai, O.

    2014-12-01

    In the sector of Earth Observation, the explosion of data is due to many factors including: new satellite constellations, the increased capabilities of sensor technologies, social media, crowdsourcing, and the need for multidisciplinary and collaborative research to face Global Changes. In this area, there are many expectations and concerns about Big Data. Vendors have attempted to use this term for their commercial purposes. It is necessary to understand whether Big Data is a radical shift or an incremental change for the existing digital infrastructures. This presentation tries to explore and discuss the impact of Big Data challenges and new capabilities on the Global Earth Observation System of Systems (GEOSS) and particularly on its common digital infrastructure called GCI. GEOSS is a global and flexible network of content providers allowing decision makers to access an extraordinary range of data and information at their desk. The impact of the Big Data dimensionalities (commonly known as 'V' axes: volume, variety, velocity, veracity, visualization) on GEOSS is discussed. The main solutions and experimentation developed by GEOSS along these axes are introduced and analyzed. GEOSS is a pioneering framework for global and multidisciplinary data sharing in the Earth Observation realm; its experience on Big Data is valuable for the many lessons learned.

  18. Considerations on Geospatial Big Data

    Science.gov (United States)

    LIU, Zhen; GUO, Huadong; WANG, Changlin

    2016-11-01

    Geospatial data, as a significant portion of big data, has recently gained the full attention of researchers. However, few researchers focus on the evolution of geospatial data and its scientific research methodologies. When entering into the big data era, fully understanding the changing research paradigm associated with geospatial data will definitely benefit future research on big data. In this paper, we look deep into these issues by examining the components and features of geospatial big data, reviewing relevant scientific research methodologies, and examining the evolving pattern of geospatial data in the scope of the four ‘science paradigms’. This paper proposes that geospatial big data has significantly shifted the scientific research methodology from ‘hypothesis to data’ to ‘data to questions’ and it is important to explore the generality of growing geospatial data ‘from bottom to top’. Particularly, four research areas that mostly reflect data-driven geospatial research are proposed: spatial correlation, spatial analytics, spatial visualization, and scientific knowledge discovery. It is also pointed out that privacy and quality issues of geospatial data may require more attention in the future. Also, some challenges and thoughts are raised for future discussion.

  19. Priming the Pump for Big Data at Sentara Healthcare.

    Science.gov (United States)

    Kern, Howard P; Reagin, Michael J; Reese, Bertram S

    2016-01-01

    Today's healthcare organizations are facing significant demands with respect to managing population health, demonstrating value, and accepting risk for clinical outcomes across the continuum of care. The patient's environment outside the walls of the hospital and physician's office, and outside the electronic health record (EHR), has a substantial impact on clinical care outcomes. The use of big data is key to understanding factors that affect the patient's health status and enhancing clinicians' ability to anticipate how the patient will respond to various therapies. Big data is essential to delivering sustainable, high-quality, value-based healthcare, as well as to the success of new models of care such as clinically integrated networks (CINs) and accountable care organizations. Sentara Healthcare, based in Norfolk, Virginia, has been an early adopter of the technologies that have readied us for our big data journey: EHRs, telehealth-supported electronic intensive care units, and telehealth primary care support through MDLIVE. Although we would not say Sentara is at the cutting edge of the big data trend, it certainly is among the fast followers. Use of big data in healthcare is still at an early stage compared with other industries. Tools for data analytics are maturing, but traditional challenges such as heightened data security and limited human resources remain the primary focus for regional health systems to improve care and reduce costs. Sentara primarily makes actionable use of big data in our CIN, Sentara Quality Care Network, and at our health plan, Optima Health. Big data projects can be expensive, and justifying the expense organizationally has often been easier in times of crisis. We have developed an analytics strategic plan separate from but aligned with corporate system goals to ensure optimal investment and management of this essential asset.

  20. Quantum fields in a big-crunch-big-bang spacetime

    International Nuclear Information System (INIS)

    Tolley, Andrew J.; Turok, Neil

    2002-01-01

    We consider quantum field theory on a spacetime representing the big-crunch-big-bang transition postulated in ekpyrotic or cyclic cosmologies. We show via several independent methods that an essentially unique matching rule holds connecting the incoming state, in which a single extra dimension shrinks to zero, to the outgoing state in which it reexpands at the same rate. For free fields in our construction there is no particle production from the incoming adiabatic vacuum. When interactions are included the particle production for fixed external momentum is finite at the tree level. We discuss a formal correspondence between our construction and quantum field theory on de Sitter spacetime

  1. Turning big bang into big bounce: II. Quantum dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Malkiewicz, Przemyslaw; Piechocki, Wlodzimierz, E-mail: pmalk@fuw.edu.p, E-mail: piech@fuw.edu.p [Theoretical Physics Department, Institute for Nuclear Studies, Hoza 69, 00-681 Warsaw (Poland)

    2010-11-21

    We analyze the big bounce transition of the quantum Friedmann-Robertson-Walker model in the setting of the nonstandard loop quantum cosmology (LQC). Elementary observables are used to quantize composite observables. The spectrum of the energy density operator is bounded and continuous. The spectrum of the volume operator is bounded from below and discrete. It has equally distant levels defining a quantum of the volume. The discreteness may imply a foamy structure of spacetime at a semiclassical level which may be detected in astro-cosmo observations. The nonstandard LQC method has a free parameter that should be fixed in some way to specify the big bounce transition.

  2. A phase I trial of intravenous catumaxomab

    DEFF Research Database (Denmark)

    Mau-Sørensen, Morten; Dittrich, Christian; Dienstmann, Rodrigo

    2015-01-01

    design in epithelial cancers with known EpCAM expression. The dose-limiting toxicity (DLT) period consisted of 4 weeks, with weekly intravenous administration of catumaxomab. Key DLTs were ≥grade 3 optimally treated non-hematological toxicity; ≥grade 3 infusion-related reactions refractory to supportive....... A reversible decrease in liver function test (prothrombin time) at the 7-µg dose level was considered a DLT. The first patient at 10 µg experienced a fatal hepatic failure related to catumaxomab that led to the termination of the study. CONCLUSIONS: The MTD of weekly intravenous catumaxomab was 7 µg. Major...

  3. Big Data Analytics in Healthcare.

    Science.gov (United States)

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S M Reza; Navidi, Fatemeh; Beard, Daniel A; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined.

  4. Big Data Analytics in Healthcare

    Science.gov (United States)

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S. M. Reza; Beard, Daniel A.

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined. PMID:26229957

  5. Big Five personality group differences across academic majors

    DEFF Research Database (Denmark)

    Vedel, Anna

    2016-01-01

    During the past decades, a number of studies have explored personality group differences in the Big Five personality traits among students in different academic majors. To date, though, this research has not been reviewed systematically. This was the aim of the present review. A systematic...... literature search identified twelve eligible studies yielding an aggregated sample size of 13,389. Eleven studies reported significant group differences in one or multiple Big Five personality traits. Consistent findings across studies were that students of arts/humanities and psychology scored high...... on Conscientiousness. Effect sizes were calculated to estimate the magnitude of the personality group differences. These effect sizes were consistent across studies comparing similar pairs of academic majors. For all Big Five personality traits medium effect sizes were found frequently, and for Openness even large...

  6. 'Big data', Hadoop and cloud computing in genomics.

    Science.gov (United States)

    O'Driscoll, Aisling; Daugelaite, Jurate; Sleator, Roy D

    2013-10-01

    Since the completion of the Human Genome project at the turn of the Century, there has been an unprecedented proliferation of genomic sequence data. A consequence of this is that the medical discoveries of the future will largely depend on our ability to process and analyse large genomic data sets, which continue to expand as the cost of sequencing decreases. Herein, we provide an overview of cloud computing and big data technologies, and discuss how such expertise can be used to deal with biology's big data sets. In particular, big data technologies such as the Apache Hadoop project, which provides distributed and parallelised data processing and analysis of petabyte (PB) scale data sets will be discussed, together with an overview of the current usage of Hadoop within the bioinformatics community. Copyright © 2013 Elsevier Inc. All rights reserved.
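
    The distributed, parallelised processing model that Hadoop provides can be sketched in miniature. The following single-process word count follows the map, shuffle, reduce phases that Hadoop distributes across a cluster; the function names and structure here are illustrative, not the Hadoop API itself.

```python
# Single-process sketch of the MapReduce pattern Hadoop distributes at scale.
from collections import defaultdict

def mapper(line):
    # Map phase: emit a (word, 1) pair for every word in an input line.
    for word in line.lower().split():
        yield word, 1

def shuffle(pairs):
    # Shuffle phase: group intermediate pairs by key, as the framework
    # does between the map and reduce phases.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped.items()

def reducer(key, values):
    # Reduce phase: combine all counts for one word.
    return key, sum(values)

lines = ["big data big insight", "big data tools"]
intermediate = [pair for line in lines for pair in mapper(line)]
counts = dict(reducer(k, v) for k, v in shuffle(intermediate))
print(counts)  # {'big': 3, 'data': 2, 'insight': 1, 'tools': 1}
```

    In a real Hadoop job the mapper and reducer run on different machines over blocks of a distributed file system, and the shuffle is performed by the framework over the network; the per-record logic, however, stays this simple.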

  7. The Shadow of Big Data: Data-Citizenship and Exclusion

    DEFF Research Database (Denmark)

    Rossi, Luca; Hjelholt, Morten; Neumayer, Christina

    2016-01-01

    Big data are understood as being able to provide insights on human behaviour at an individual as well as at an aggregated societal level (Manyika et al. 2011). These insights are expected to be more detailed and precise than anything before...... thanks to the large volume of digital data and to the unobtrusive nature of the data collection (Fishleigh 2014). Within this perspective, these two dimensions (volume and unobtrusiveness) define contemporary big data techniques as a socio-technical offering to society, a live representation of itself...... this process "data-citizenship" emerges. Data-citizenship assumes that citizens will be visible to the state through the data they produce. On a general level data-citizenship shifts citizenship from an intrinsic status of a group of people to a status achieved through action. This approach assumes equal...

  8. From big data to deep insight in developmental science.

    Science.gov (United States)

    Gilmore, Rick O

    2016-01-01

    The use of the term 'big data' has grown substantially over the past several decades and is now widespread. In this review, I ask what makes data 'big' and what implications the size, density, or complexity of datasets have for the science of human development. A survey of existing datasets illustrates how existing large, complex, multilevel, and multimeasure data can reveal the complexities of developmental processes. At the same time, significant technical, policy, ethics, transparency, cultural, and conceptual issues associated with the use of big data must be addressed. Most big developmental science data are currently hard to find and cumbersome to access, the field lacks a culture of data sharing, and there is no consensus about who owns or should control research data. But, these barriers are dissolving. Developmental researchers are finding new ways to collect, manage, store, share, and enable others to reuse data. This promises a future in which big data can lead to deeper insights about some of the most profound questions in behavioral science. © 2016 The Authors. WIREs Cognitive Science published by Wiley Periodicals, Inc.

  9. A practical guide to big data research in psychology.

    Science.gov (United States)

    Chen, Eric Evan; Wojcik, Sean P

    2016-12-01

    The massive volume of data that now covers a wide variety of human behaviors offers researchers in psychology an unprecedented opportunity to conduct innovative theory- and data-driven field research. This article is a practical guide to conducting big data research, covering data management, acquisition, processing, and analytics (including key supervised and unsupervised learning data mining methods). It is accompanied by walkthrough tutorials on data acquisition, text analysis with latent Dirichlet allocation topic modeling, and classification with support vector machines. Big data practitioners in academia, industry, and the community have built a comprehensive base of tools and knowledge that makes big data research accessible to researchers in a broad range of fields. However, big data research does require knowledge of software programming and a different analytical mindset. For those willing to acquire the requisite skills, innovative analyses of unexpected or previously untapped data sources can offer fresh ways to develop, test, and extend theories. When conducted with care and respect, big data research can become an essential complement to traditional research. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
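
    The pipeline described above (data management, acquisition, processing, analytics) starts in every case from a numeric representation of text. A minimal, dependency-free sketch of that first step, building a document-term matrix; the toy documents below are invented for illustration and are not from the article's tutorials:

```python
# Build a document-term matrix with plain dicts: the representation that
# downstream methods such as LDA topic models and SVM classifiers consume.
# The corpus is toy data, not the article's tutorial material.
from collections import Counter

docs = [
    "reaction time and accuracy were measured",
    "participants reported mood and anxiety",
    "response time and error rate were recorded",
]

# Sorted vocabulary over all documents, then one count-row per document
vocab = sorted({w for d in docs for w in d.split()})
dtm = [[Counter(d.split())[w] for w in vocab] for d in docs]

print(len(vocab), len(dtm))  # vocabulary size, number of documents
```

    In practice a library vectorizer replaces this loop, but the shape of the output, one row per document and one column per vocabulary term, is the same.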

  10. Big Book of Windows Hacks

    CERN Document Server

    Gralla, Preston

    2008-01-01

    Bigger, better, and broader in scope, the Big Book of Windows Hacks gives you everything you need to get the most out of your Windows Vista or XP system, including its related applications and the hardware it runs on or connects to. Whether you want to tweak Vista's Aero interface, build customized sidebar gadgets and run them from a USB key, or hack the "unhackable" screensavers, you'll find quick and ingenious ways to bend these recalcitrant operating systems to your will. The Big Book of Windows Hacks focuses on Vista, the new bad boy on Microsoft's block, with hacks and workarounds that

  11. Do big gods cause anything?

    DEFF Research Database (Denmark)

    Geertz, Armin W.

    2014-01-01

    This is a contribution to a review symposium on Ara Norenzayan's book Big Gods: How Religion Transformed Cooperation and Conflict (Princeton University Press 2013). The book is fascinating but problematic with regard to causality, atheism, and stereotypes about hunter-gatherers.

  12. The Inverted Big-Bang

    OpenAIRE

    Vaas, Ruediger

    2004-01-01

    Our universe appears to have been created not out of nothing but from a strange space-time dust. Quantum geometry (loop quantum gravity) makes it possible to avoid the ominous beginning of our universe with its physically unrealistic (i.e. infinite) curvature, extreme temperature, and energy density. This could be the long sought after explanation of the big-bang and perhaps even opens a window into a time before the big-bang: Space itself may have come from an earlier collapsing universe tha...

  13. Baryon symmetric big bang cosmology

    International Nuclear Information System (INIS)

    Stecker, F.W.

    1978-01-01

    It is stated that the framework of baryon symmetric big bang (BSBB) cosmology offers our greatest potential for deducing the evolution of the Universe because its physical laws and processes require the minimum number of arbitrary assumptions about initial conditions in the big bang. In addition, it offers the possibility of explaining the photon-baryon ratio in the Universe and how galaxies and galaxy clusters are formed. BSBB cosmology also provides the only acceptable explanation at present for the origin of the cosmic γ-ray background radiation. (author)

  14. Big Textual Data in Transportation

    DEFF Research Database (Denmark)

    Beheshti-Kashi, Samaneh; Buch, Rasmus Brødsgaard; Lachaize, Maxime

    2018-01-01

    With the emergence of Big Data and the growth of Big Data techniques, a huge amount of textual information is now utilizable and may be applied by different stakeholders. Formerly unexplored textual data from internal information assets of organisations, as well as textual data from social media...... applications, have been converted into utilizable and meaningful insights. However, prior to this, the availability of textual sources relevant for logistics and transportation has to be examined. Accordingly, the identification of potential textual sources and their evaluation in terms of extraction barriers...

  15. Integrative Analysis of Omics Big Data.

    Science.gov (United States)

    Yu, Xiang-Tian; Zeng, Tao

    2018-01-01

    The diversity and sheer volume of omics data have taken biology and biomedicine research and application into a big data era, much as happened across human society a decade ago. They open a new challenge, from horizontal data ensembles (e.g., similar types of data collected from different labs or companies) to vertical data ensembles (e.g., different types of data collected for a group of persons with matched information), which requires integrative analysis in biology and biomedicine and also calls for the development of data integration methods to address the shift from population-guided to individual-guided investigations. Data integration is an effective approach for solving complex problems and understanding complicated systems. Several benchmark studies have revealed the heterogeneity and trade-offs that exist in the analysis of omics data. Integrative analysis can combine and investigate many datasets in a cost-effective, reproducible way. Current integration approaches for biological data have two modes: a "bottom-up integration" mode with follow-up manual integration, and a "top-down integration" mode with follow-up in silico integration. This paper first summarizes combinatory analysis approaches to give a candidate protocol for biological experiment design in effective integrative studies on genomics, and then surveys data fusion approaches to give instruction on computational model development for detecting biological significance; these have also provided new data resources and analysis tools to support precision medicine dependent on big biomedical data. Finally, the problems and future directions for integrative analysis of omics big data are highlighted.
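
    The horizontal versus vertical ensemble distinction in this abstract can be made concrete with a toy join; the sample IDs, assay names, and values below are invented for illustration:

```python
# Horizontal ensemble: the SAME measurement (expression of one gene)
# collected by two different labs, pooled into one larger sample.
lab_a = {"s1": 5.1, "s2": 4.8}   # expression values, lab A
lab_b = {"s3": 5.0, "s4": 5.3}   # same assay, lab B
horizontal = {**lab_a, **lab_b}  # pooled across sources

# Vertical ensemble: DIFFERENT omics layers matched per individual,
# joined on the shared sample IDs.
expression  = {"s1": 5.1, "s2": 4.8}
methylation = {"s1": 0.62, "s2": 0.71}
vertical = {s: (expression[s], methylation[s])
            for s in expression.keys() & methylation.keys()}

print(len(horizontal), vertical["s1"])
```

    Real integrative pipelines add normalization and batch correction on top of this join, but the two data layouts are the structural difference the abstract describes.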

  16. Administration and monitoring of intravenous anesthetics

    NARCIS (Netherlands)

    Sahinovic, Marko M.; Absalom, Anthony R.; Struys, Michel M. R. F.

    2010-01-01

    Purpose of review The importance of accuracy in controlling the dose-response relation for intravenous anesthetics is directly related to the importance of optimizing the efficacy and quality of anesthesia while minimizing adverse drug effects. Therefore, it is important to measure and control all

  17. Intravenous immunoglobulin treatment for secondary recurrent miscarriage

    DEFF Research Database (Denmark)

    Christiansen, O B; Larsen, E C; Egerup, P

    2015-01-01

    OBJECTIVE: To determine whether infusions with intravenous immunoglobulin (IVIg) during early pregnancy increase live birth rate in women with secondary recurrent miscarriage compared with placebo. DESIGN: A single-centre, randomised, double-blind, placebo-controlled trial. SETTING: A tertiary...

  18. Intravenous iron supplementation in children on hemodialysis.

    NARCIS (Netherlands)

    Leijn, E.; Monnens, L.A.H.; Cornelissen, E.A.M.

    2004-01-01

    BACKGROUND: Children with end-stage renal disease (ESRD) on hemodialysis (HD) are often absolute or functional iron deficient. There is little experience in treating these children with intravenous (i.v.) iron-sucrose. In this prospective study, different i.v. iron-sucrose doses were tested in

  19. Intravenous and intramuscular magnesium sulphate regimens in ...

    African Journals Online (AJOL)

    1993-09-03

    Sep 3, 1993 ... parenterally, usually according to one of two popular regimens: the intramuscular (IM) regimen introduced by Pritchard and a continuous intravenous (IV) infusion described by Zuspan. Sibai et al. have reported that lower serum magnesium values are achieved with Zuspan's regimen (maintenance dose ...

  20. A Comparison of Prophylactic Intravenous Glycopyrrolate and ...

    African Journals Online (AJOL)

    Ephedrine is gradually falling out of favour because of the associated tachyarrhythmia and foetal acidosis. This study compared the effect of preoperative administration of intravenous glycopyrrolate and ephedrine on spinal-induced maternal hypotension. Patients and Methods: Fifty patients scheduled for elective C/S were ...

  1. Comparative Evaluation of Ultrasonography and Intravenous ...

    African Journals Online (AJOL)

    Background: Renal ultrasonography, an easily available procedure, was compared to intravenous urogram (IVU) to determine its suitability as an alternative to the latter, which is a relatively invasive test for demonstrating hydronephrosis or ureteric obstruction in cervical cancer staging. Study design: Thirty five histologically ...

  2. Intravenous voriconazole after toxic oral administration

    NARCIS (Netherlands)

    Alffenaar, J.W.C.; Van Assen, S.; De Monchy, J.G.R.; Uges, D.R.A.; Kosterink, J.G.W.; Van Der Werf, T.S.

    In a male patient with rhinocerebral invasive aspergillosis, prolonged high-dosage oral administration of voriconazole led to hepatotoxicity combined with a severe cutaneous reaction while intravenous administration in the same patient did not. High concentrations in the portal blood precipitate

  3. Intravenous paracetamol overdose in a paediatric patient

    NARCIS (Netherlands)

    Broeks, Ilse J.; Van Roon, Eric N.; Van Pinxteren-Nagler, Evelyn; De Vries, Tjalling W.

    2013-01-01

    BACKGROUND: Paracetamol is a widely used drug in children. In therapeutic doses, paracetamol has an excellent safety profile. Since the introduction of the intravenous form in 2004, only three reports of accidental overdose in children have been published. The low number is probably due to

  4. Intramuscular compared to intravenous midazolam for paediatric ...

    African Journals Online (AJOL)

    Background: Sedation in children remains a controversial issue in emergency departments (ED). Midazolam, a benzodiazepine, is widely used for procedural sedation in paediatrics. We compared the effectiveness and safety of two forms of midazolam administration: intramuscular (IM) and intravenous (IV). Patients ...

  5. Intravenous platelet blockade with cangrelor during PCI

    NARCIS (Netherlands)

    Bhatt, Deepak L.; Lincoff, A. Michael; Gibson, C. Michael; Stone, Gregg W.; McNulty, Steven; Montalescot, Gilles; Kleiman, Neal S.; Goodman, Shaun G.; White, Harvey D.; Mahaffey, Kenneth W.; Pollack, Charles V.; Manoukian, Steven V.; Widimsky, Petr; Chew, Derek P.; Cura, Fernando; Manukov, Ivan; Tousek, Frantisek; Jafar, M. Zubair; Arneja, Jaspal; Skerjanec, Simona; Harrington, Robert A.; Bhatt, D. L.; Harrington, R. A.; Lincoff, A. M.; Pollack, C. V.; Gibson, C. M.; Stone, G. W.; Mahaffey, K. W.; Kleiman, N. S.; Montalescot, G.; White, H. D.; Goodman, S. G.; Greenbaum, A.; Simon, D.; Lee, D.; Feit, F.; Dauerman, H.; Gurbel, P.; Berger, P.; Makkar, R.; Becker, R. C.; Manoukian, S.; Jorgova, J.; Chew, D. P.; Storey, R.; Desmet, W.; Cura, F.; Herrmann, H.; Rizik, D.; DeServi, S.; Huber, K.; Jukema, W. J.; Knopf, W.; Steg, P. G.; Schunkert, H.; Widimsky, P.; Betriu, A.; Aylward, P.; Polonestsky, L.; Lima, V.; Kobulia, B.; Navickas, R.; Gasior, Z.; Vasilieva, E.; Bennett, J. M.; Kraiz, I.; Van de Werf, F.; Faxon, D.; Ohman, E. M.; Tijssen, J. G. P.; Verheugt, F.; Weaver, W. D.; Califf, R. M.; Mehta, C.; Hamm, C. W.; Pepine, C. 
J.; Ware, J.; Wilson, M.; Gorham, C.; Maran, A.; McNulty, S.; Fasteson, D.; Ryan, G.; Bradsher, J.; Connolly, P.; Mehta, R.; Leonardi, S.; Brennan, M.; Patel, M.; Petersen, J.; Bushnel, C.; Jolicoeur, M.; Chan, M.; Dowd, L.; Skinner, P.; Lawrence, G.; Jordon, M.; Dickerson, S.; Meyer, M.; Hartford, S.; Garcia Escudero, Alejandro; Poy, Carlos; Miceli, Miguel; Pocovi, Antonio; Londero, Hugo; Baccaro, Jorge; Polonetsky, Leonid; Karotkin, Aliaksey; Shubau, Leanid; Maffini, Eduardo; Machado, Bruno; Airton, José; Lima, Valter; Martinez Filho, Eulogio; Herdy, Arthur; Tumelero, Rogerio; Precoma, Dalton; Botelho, Roberto; Saad, Jamil; Jatene, Jose; Vilas-Boas, Fabio; Godinho, Antonio; Perin, Marco; Caramori, Paulo; Castro, Iran; Grigorov, Mladen; Milkov, Plamen; Jorgova, Julia; Georgiev, Svetoslav; Rifai, Nizar; Doganov, Alexander; Petrov, Ivo; Hui, William; Lazzam, Charles; Reeves, Francois; Tanguay, Jean-Francois; Richter, Marek; Klimsa, Zdenek; Padour, Michal; Mrozek, Jan; Branny, Marian; Coufal, Zdenek; Simek, Stanislav; Rozsival, Vladimir; Pleva, Leos; Stasek, Josef; Kala, Petr; Groch, Ladislav; Kocka, Viktor; Shaburishvili, Tamaz; Khintibidze, Irakli; Chapidze, Gulnara; Mamatsashvili, Merab; Mohanan, Padinhare; Jain, Rajesh; Parikh, Keyur; Patel, Tejas; Kumar, Sampath; Mehta, Ashwani; Banker, Darshan; Krishna, Lanka; Gadkari, Milind; Joshi, Hasit; Hiremath, Shirish; Grinius, Virgilijus; Norkiene, Sigute; Petrauskiene, Birute; Michels, Rolf; Tjon, Melvin; de Swart, Hans; de Winter, Robbert; White, Harvey; Devlin, Gerard; Abernethey, Malcolm; Osiev, Alexander; Linev, Kirill; Kalinina, Svetlana; Baum, Svetlana; Kosmachova, Elena; Shogenov, Zaur; Markov, Valentin; Boldueva, Svetlana; Barbarash, Olga; Kostenko, Victor; Vasilieva, Elena; Gruzdev, Aleksey; Lusov, Victor; Dovgalevsky, Pavel; Azarin, Oleg; Chernov, Sergey; Smolenskaya, Olga; Duda, Alexey; Fridrich, Viliam; Hranai, Marian; Studencan, Martin; Kurray, Peter; Bennett, John; Blomerus, Pieter; Disler, Laurence; 
Engelbrecht, Johannes; Klug, Eric; Routier, Robert; Venter, Tjaart; van der Merwe, Nico; Becker, Anthony; Cha, Kwang-Soo; Lee, Seung-Hwan; Han, Sang-Jin; Youn, Tae Jin; Hur, Seung-Ho; Seo, Hong Seog; Park, Hun-Sik; Rhim, Chong-Yun; Pyun, Wook-Bum; Choe, Hyunmin; Jeong, Myung-Ho; Park, Jong-Seon; Shin, Eak-Kyun; Hernández, Felipe; Figueras, Jaume; Hernández, Rosana; López-Minguez, José Ramón; González Juanatey, José Ramón; Palop, Ramón López; Galeote, Guillermo; Chamnarnphol, Noppadol; Buddhari, Wacin; Sansanayudh, Nakarin; Kuanprasert, Srun; Penny, William; Lui, Charles; Grimmett, Garfield; Srinivasan, Venkatraman; Ariani, Kevin; Khan, Waqor; Blankenship, James; Cannon, Louis; Eisenberg, Steven; McLaurin, Brent; Mahoney, Paul; Greenberg, Jerry; Breall, Jeffrey; Chandna, Harish; Hockstad, Eric; Tolerico, Paul; Kao, John; Shroff, Adhir; Nseir, Georges; Greenbaum, Adam; Cohn, Joel; Gogia, Harinder; Nahhas, Ahed; Istfan, Pierre; Orlow, Steve; Spriggs, Douglas; Sklar, Joel; Paulus, Richard; Cochran, David; Smith, Robert; Ferrier, L. Norman; Scott, J. Christopher; Xenopoulos, Nicholaos; Mulumudi, Mahesh; Hoback, James; Ginete, Wilson; Ballard, William; Stella, Joseph; Voeltz, Michele; Staniloae, Cezar; Eaton, Gregory; Griffin, John; Kumar, Krishna; Ebrahimi, Ramin; O'Shaughnessy, Charles; Lundstrom, Lundstrom; Temizer, Dogan; Tam, Kenneth; Suarez, Jose; Raval, Amish; Kaufman, Jay; Brilakis, Emmanouil; Stillabower, Michael; Quealy, Kathleen; Nunez, Boris; Pow, Thomas; Samuels, Bruce; Argenal, Agustin; Srinivas, Vankeepuram; Rosenthal, Andrew; Tummala, Pradyumna; Myers, Paul; LaMarche, Nelson; Chan, Michael; Bach, Richard; Simon, Daniel; Kettelkamp, Richard; Helmy, Tarek; Schaer, Gary; Kosinski, Edward; Buchbinder, Maurice; Sharma, Mukesh; Goodwin, Mark; Horwitz, Phillip; Mann, J. Tift; Holmes, David; Angiolillo, Dominick; Rao, Sunil; Azrin, Michael; Gammon, Roger; Mavromatis, Kreton; Ahmed, Abdel; Kent, Kenneth; Zughaib, Marcel; Westcott, R. 
Jeffrey; Jain, Ash; Gruberg, Luis; LeGalley, Thomas

    2009-01-01

    BACKGROUND: Intravenous cangrelor, a rapid-acting, reversible adenosine diphosphate (ADP) receptor antagonist, might reduce ischemic events during percutaneous coronary intervention (PCI). METHODS: In this double-blind, placebo-controlled study, we randomly assigned 5362 patients who had not been

  6. Effect of intravenous dexmedetomidine infusion on some ...

    African Journals Online (AJOL)

    Background: This study was designed to evaluate the effect of intravenous dexmedetomidine infusion in patients undergoing major abdominal surgery on stress response markers as plasma interleukin-6, cortisol and blood glucose level. It also assessed its effect on recovery profile and postoperative pain. Methods: Thirty ...

  7. Organizational Design Challenges Resulting From Big Data

    OpenAIRE

    Jay R. Galbraith

    2014-01-01

    Business firms and other types of organizations are feverishly exploring ways of taking advantage of the big data phenomenon. This article discusses firms that are at the leading edge of developing a big data analytics capability. Firms that are currently enjoying the most success in this area are able to use big data not only to improve their existing businesses but to create new businesses as well. Putting a strategic emphasis on big data requires adding an analytics capability to the exist...

  8. A Social Framework for Big Data

    OpenAIRE

    Ruppert, Evelyn; Harvey, Penny; Lury, Cellia; Mackenzie, Adrian; McNally, Ruth; Baker, Stephanie Alice; Kallianos, Yannis; Lewis, Camilla

    2015-01-01

    The Social Framework proposes an agenda for understanding how social composition and social effects are related, and argues that giving Big Data a ‘social intelligence’ requires acting with an ethic of care. It is the product of an ESRC-funded project, Socialising Big Data, led by Evelyn Ruppert (2013-14). Accompanying documents available are a working paper, 'Socialising Big Data: from concept to practice', and 'Background: A Social Framework for Big Data.'

  9. Cost-effectiveness of oral phenytoin, intravenous phenytoin, and intravenous fosphenytoin in the emergency department.

    Science.gov (United States)

    Rudis, Maria I; Touchette, Daniel R; Swadron, Stuart P; Chiu, Amy P; Orlinsky, Michael

    2004-03-01

    Oral phenytoin, intravenous phenytoin, and intravenous fosphenytoin are all commonly used for loading phenytoin in the emergency department (ED). The cost-effectiveness of each was compared for patients presenting with seizures and subtherapeutic phenytoin concentrations. A simple decision tree was developed to determine the treatment costs associated with each of 3 loading techniques. We determined effectiveness by comparing adverse event rates and by calculating the time to safe ED discharge. Time to safe ED discharge was defined as the time at which therapeutic concentrations of phenytoin (≥10 mg/L) were achieved with an absence of any adverse events that precluded discharge. The comparative cost-effectiveness of alternatives to oral phenytoin was determined by combining net costs and number of adverse events, expressed as cost per adverse event avoided. Cost-effectiveness was also determined by comparing the net costs of each loading technique required to achieve the time to safe ED discharge, expressed as cost per hour of ED time saved. The outcomes and costs were primarily derived from a prospective, randomized controlled trial, augmented by time-motion studies and alternate-cost sources. Costs included the cost of drugs, supplies, and personnel. Analyses were also performed in scenarios incorporating labor costs and savings from using a lower-urgency area of the ED. The mean number of adverse events per patient for oral phenytoin, intravenous phenytoin, and intravenous fosphenytoin was 1.06, 1.93, and 2.13, respectively. Mean time to safe ED discharge in the 3 groups was 6.4 hours, 1.7 hours, and 1.3 hours. Cost per patient was $2.83, $21.16, and $175.19, respectively, and did not differ substantially in the Labor and Triage (lower-urgency area of ED) scenarios. When the measure of effectiveness was adverse events, oral phenytoin dominated intravenous phenytoin and intravenous fosphenytoin, with a lower cost and number of adverse
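
    From the means reported in this abstract, the simple incremental ratios can be reproduced in a few lines. This is a sketch of the arithmetic only; the study's full analysis also weighs adverse events, and the labels below are shorthand rather than the study's variable names:

```python
# Figures taken directly from the abstract: cost per patient (USD),
# mean adverse events per patient, mean hours to safe ED discharge.
strategies = {
    "oral phenytoin":  (2.83,   1.06, 6.4),
    "IV phenytoin":    (21.16,  1.93, 1.7),
    "IV fosphenytoin": (175.19, 2.13, 1.3),
}

base = strategies["oral phenytoin"]
for name, (cost, ae, hours) in strategies.items():
    if name == "oral phenytoin":
        continue
    # Incremental cost per hour of ED time saved, relative to oral loading
    icer_time = (cost - base[0]) / (base[2] - hours)
    print(f"{name}: ${icer_time:.2f} per ED hour saved, "
          f"{ae - base[1]:+.2f} adverse events per patient")
```

    The loop prints roughly $3.90 per ED hour saved for IV phenytoin and $33.80 for IV fosphenytoin, consistent with the abstract's conclusion that the IV routes buy discharge time at added cost and added adverse events.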

  10. Big Data for personalized healthcare

    NARCIS (Netherlands)

    Siemons, Liseth; Sieverink, Floor; Vollenbroek, Wouter; van de Wijngaert, Lidwien; Braakman-Jansen, Annemarie; van Gemert-Pijnen, Lisette

    2016-01-01

    Big Data, often defined according to the 5V model (volume, velocity, variety, veracity and value), is seen as the key towards personalized healthcare. However, it also confronts us with new technological and ethical challenges that require more sophisticated data management tools and data analysis

  11. Big data en gelijke behandeling

    NARCIS (Netherlands)

    Lammerant, Hans; de Hert, Paul; Blok, P.H.; Blok, P.H.

    2017-01-01

    In this chapter we first examine the main basic concepts of equal treatment and discrimination (Section 6.2). We then look at the Dutch and European legal framework on non-discrimination (Sections 6.3-6.5) and how those rules should be applied to big

  13. NASA's Big Data Task Force

    Science.gov (United States)

    Holmes, C. P.; Kinter, J. L.; Beebe, R. F.; Feigelson, E.; Hurlburt, N. E.; Mentzel, C.; Smith, G.; Tino, C.; Walker, R. J.

    2017-12-01

    Two years ago NASA established the Ad Hoc Big Data Task Force (BDTF - https://science.nasa.gov/science-committee/subcommittees/big-data-task-force), an advisory working group within the NASA Advisory Council system. The scope of the Task Force included all NASA Big Data programs, projects, missions, and activities. The Task Force focused on such topics as exploring the existing and planned evolution of NASA's science data cyber-infrastructure that supports broad access to data repositories for NASA Science Mission Directorate missions; best practices within NASA, other Federal agencies, private industry and research institutions; and Federal initiatives related to big data and data access. The BDTF has completed its two-year term and produced several recommendations plus four white papers for NASA's Science Mission Directorate. This presentation will discuss the activities and results of the Task Force, including summaries of key points from its focused study topics. The paper serves as an introduction to the papers following in this ESSI session.

  14. Big data e data science

    OpenAIRE

    Cavique, Luís

    2014-01-01

    This article presented the basic concepts of Big Data and the new field to which it gave rise, Data Science. Within Data Science, the notion of reducing the dimensionality of data was discussed and illustrated with examples.

  15. BIG DATA IN BUSINESS ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Logica BANICA

    2015-06-01

    In recent years, dealing with large volumes of data originating from social media sites and mobile communications, alongside data from business environments and institutions, has led to the definition of a new concept, known as Big Data. The economic impact of the sheer amount of data produced in the last two years has increased rapidly. It is necessary to aggregate all types of data (structured and unstructured) in order to improve current transactions, to develop new business models, to provide a real image of supply and demand and thereby generate market advantages. Thus, the companies that turn to Big Data have a competitive advantage over other firms. From the perspective of IT organizations, they must accommodate the storage and processing of Big Data, and provide analysis tools that are easily integrated into business processes. This paper aims to discuss aspects of the Big Data concept and the principles for building, organizing and analysing huge datasets in the business environment, offering a three-layer architecture based on current software solutions. The article also covers graphical tools for exploring and representing unstructured data, Gephi and NodeXL.

  16. Research Ethics in Big Data.

    Science.gov (United States)

    Hammer, Marilyn J

    2017-05-01

    The ethical conduct of research includes, in part, patient agreement to participate in studies and the protection of health information. In the evolving world of data science and the accessibility of large quantities of web-based data created by millions of individuals, novel methodologic approaches to answering research questions are emerging. This article explores research ethics in the context of big data.

  17. Finding errors in big data

    NARCIS (Netherlands)

    Puts, Marco; Daas, Piet; de Waal, A.G.

    No data source is perfect. Mistakes inevitably creep in. Spotting errors is hard enough when dealing with survey responses from several thousand people, but the difficulty is multiplied hugely when that mysterious beast Big Data comes into play. Statistics Netherlands is about to publish its first

  18. China: Big Changes Coming Soon

    Science.gov (United States)

    Rowen, Henry S.

    2011-01-01

    Big changes are ahead for China, probably abrupt ones. The economy has grown so rapidly for many years, over 30 years at an average of nine percent a year, that its size makes it a major player in trade and finance and increasingly in political and military matters. This growth is not only of great importance internationally, it is already having…

  19. Banking Wyoming big sagebrush seeds

    Science.gov (United States)

    Robert P. Karrfalt; Nancy Shaw

    2013-01-01

    Five commercially produced seed lots of Wyoming big sagebrush (Artemisia tridentata Nutt. var. wyomingensis (Beetle & Young) S.L. Welsh [Asteraceae]) were stored under various conditions for 5 y. Purity, moisture content as measured by equilibrium relative humidity, and storage temperature were all important factors to successful seed storage. Our results indicate...

  20. Big data: een zoektocht naar instituties

    NARCIS (Netherlands)

    van der Voort, H.G.; Crompvoets, J

    2016-01-01

    Big data is a well-known phenomenon, even a buzzword nowadays. It refers to an abundance of data and new possibilities to process and use them. Big data is the subject of many publications. Some pay attention to the many possibilities of big data, others warn us of its consequences. This special

  1. A survey of big data research

    Science.gov (United States)

    Fang, Hua; Zhang, Zhaoyang; Wang, Chanpaul Jin; Daneshmand, Mahmoud; Wang, Chonggang; Wang, Honggang

    2015-01-01

    Big data create values for business and research, but pose significant challenges in terms of networking, storage, management, analytics and ethics. Multidisciplinary collaborations from engineers, computer scientists, statisticians and social scientists are needed to tackle, discover and understand big data. This survey presents an overview of big data initiatives, technologies and research in industries and academia, and discusses challenges and potential solutions. PMID:26504265

  2. Big City Education: Its Challenge to Governance.

    Science.gov (United States)

    Haskew, Laurence D.

    This chapter traces the migration from farms to cities and the later movement from cities to suburbs and discusses the impact of the resulting big city environment on the governance of big city education. The author (1) suggests how local, State, and Federal governments can improve big city education; (2) discusses ways of planning for the future…

  3. Data, Data, Data : Big, Linked & Open

    NARCIS (Netherlands)

    Folmer, E.J.A.; Krukkert, D.; Eckartz, S.M.

    2013-01-01

    The entire business and IT world is currently talking about Big Data, a trend that overtook Cloud Computing in mid-2013 (based on Google Trends). Policymakers are also actively engaged with Big Data. Neelie Kroes, vice-president of the European Commission, speaks of the 'Big Data

  4. Big Data: Implications for Health System Pharmacy.

    Science.gov (United States)

    Stokes, Laura B; Rogers, Joseph W; Hertig, John B; Weber, Robert J

    2016-07-01

    Big Data refers to datasets that are so large and complex that traditional methods and hardware for collecting, sharing, and analyzing them are not possible. Big Data that is accurate leads to more confident decision making, improved operational efficiency, and reduced costs. The rapid growth of health care information results in Big Data around health services, treatments, and outcomes, and Big Data can be used to analyze the benefit of health system pharmacy services. The goal of this article is to provide a perspective on how Big Data can be applied to health system pharmacy. It will define Big Data, describe the impact of Big Data on population health, review specific implications of Big Data in health system pharmacy, and describe an approach for pharmacy leaders to effectively use Big Data. A few strategies involved in managing Big Data in health system pharmacy include identifying potential opportunities for Big Data, prioritizing those opportunities, protecting privacy concerns, promoting data transparency, and communicating outcomes. As health care information expands in its content and becomes more integrated, Big Data can enhance the development of patient-centered pharmacy services.

  5. Big Data: Implications for Health System Pharmacy

    Science.gov (United States)

    Stokes, Laura B.; Rogers, Joseph W.; Hertig, John B.; Weber, Robert J.

    2016-01-01

    Big Data refers to datasets that are so large and complex that traditional methods and hardware for collecting, sharing, and analyzing them are not possible. Big Data that is accurate leads to more confident decision making, improved operational efficiency, and reduced costs. The rapid growth of health care information results in Big Data around health services, treatments, and outcomes, and Big Data can be used to analyze the benefit of health system pharmacy services. The goal of this article is to provide a perspective on how Big Data can be applied to health system pharmacy. It will define Big Data, describe the impact of Big Data on population health, review specific implications of Big Data in health system pharmacy, and describe an approach for pharmacy leaders to effectively use Big Data. A few strategies involved in managing Big Data in health system pharmacy include identifying potential opportunities for Big Data, prioritizing those opportunities, protecting privacy concerns, promoting data transparency, and communicating outcomes. As health care information expands in its content and becomes more integrated, Big Data can enhance the development of patient-centered pharmacy services. PMID:27559194

  6. A SWOT Analysis of Big Data

    Science.gov (United States)

    Ahmadi, Mohammad; Dileepan, Parthasarati; Wheatley, Kathleen K.

    2016-01-01

    This is the decade of data analytics and big data, but not everyone agrees with the definition of big data. Some researchers see it as the future of data analysis, while others consider it as hype and foresee its demise in the near future. No matter how it is defined, big data for the time being is having its glory moment. The most important…

  7. A survey of big data research.

    Science.gov (United States)

    Fang, Hua; Zhang, Zhaoyang; Wang, Chanpaul Jin; Daneshmand, Mahmoud; Wang, Chonggang; Wang, Honggang

    2015-01-01

    Big data create values for business and research, but pose significant challenges in terms of networking, storage, management, analytics and ethics. Multidisciplinary collaborations from engineers, computer scientists, statisticians and social scientists are needed to tackle, discover and understand big data. This survey presents an overview of big data initiatives, technologies and research in industries and academia, and discusses challenges and potential solutions.

  8. A survey of big data research

    OpenAIRE

    Fang, Hua; Zhang, Zhaoyang; Wang, Chanpaul Jin; Daneshmand, Mahmoud; Wang, Chonggang; Wang, Honggang

    2015-01-01

    Big data create values for business and research, but pose significant challenges in terms of networking, storage, management, analytics and ethics. Multidisciplinary collaborations from engineers, computer scientists, statisticians and social scientists are needed to tackle, discover and understand big data. This survey presents an overview of big data initiatives, technologies and research in industries and academia, and discusses challenges and potential solutions.

  9. Big sagebrush seed bank densities following wildfires

    Science.gov (United States)

Big sagebrush (Artemisia spp.) is a critical shrub for many wildlife species, including sage grouse (Centrocercus urophasianus), mule deer (Odocoileus hemionus), and pygmy rabbit (Brachylagus idahoensis). Big sagebrush is killed by wildfires, and big sagebrush seeds are generally short-lived and do not s...

  10. Intravenous/oral ciprofloxacin therapy versus intravenous ceftazidime therapy for selected bacterial infections.

    Science.gov (United States)

    Gaut, P L; Carron, W C; Ching, W T; Meyer, R D

    1989-11-30

The efficacy and toxicity of sequential intravenous and oral ciprofloxacin therapy were compared with intravenously administered ceftazidime in a prospective, randomized, controlled, non-blinded trial. Thirty-two patients (16 patients receiving ciprofloxacin and 16 patients receiving ceftazidime) with 38 infections caused by susceptible Pseudomonas aeruginosa, enteric gram-negative rods, Salmonella group B, Serratia marcescens, Pseudomonas cepacia, and Xanthomonas maltophilia at various sites were evaluable for determination of efficacy. Length of therapy varied from seven to 25 days. Concomitant antimicrobials included intravenously administered beta-lactams for gram-positive organisms, intravenous/oral metronidazole and clindamycin for anaerobes, and intravenous/local amphotericin B for Candida albicans. Intravenous administration of 200 mg ciprofloxacin every 12 hours to 11 patients produced peak serum levels between 1.15 and 3.12 micrograms/ml; trough levels ranged between 0.08 and 0.86 micrograms/ml. Overall response rates were similar for patients receiving ciprofloxacin and ceftazidime. Emergence of resistance was similar in both groups--one Enterobacter cloacae and two P. aeruginosa became resistant after ciprofloxacin therapy and two P. aeruginosa became resistant after ceftazidime therapy. The frequency of superinfection with a variety of organisms was also similar in both groups. Adverse events related to ciprofloxacin included transient pruritus at the infusion site and generalized rash leading to drug discontinuation (one patient each), and with ceftazidime adverse effects included pain at the site of infusion and the development of allergic interstitial nephritis (one patient each). Overall, intravenous/oral ciprofloxacin therapy appears to be as safe and effective as intravenous ceftazidime therapy in the treatment of a variety of infections due to susceptible aerobic gram-negative organisms.

  11. The BigBoss Experiment

    Energy Technology Data Exchange (ETDEWEB)

Schlegel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; Bebek, C.; Becerril, S.; Blanton, M.; Bolton, A.; Bromley, B.; Cahn, R.; Carton, P.-H.; Cervantes-Cota, J.L.; Chu, Y.; Cortes, M.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna / /IAC, Mexico / / /Madrid, IFT /Marseille, Lab. Astrophys. / / /New York U. /Valencia U.

    2012-06-07

BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra over a wavelength range from 340 nm to 1060 nm, with a resolution R = λ/Δλ = 3000-4800. Using data from imaging surveys that are already underway, spectroscopic targets are selected that trace the underlying dark matter distribution. In particular, targets include luminous red galaxies (LRGs) up to z = 1.0, extending the BOSS LRG survey in both redshift and survey area. To probe the universe out to even higher redshift, BigBOSS will target bright [OII] emission line galaxies (ELGs) up to z = 1.7. In total, 20 million galaxy redshifts are obtained to measure the BAO feature, trace the matter power spectrum at smaller scales, and detect redshift space distortions. BigBOSS will provide additional constraints on early dark energy and on the curvature of the universe by measuring the Ly-alpha forest in the spectra of over 600,000 2.2 < z < 3.5 quasars. BigBOSS galaxy BAO measurements combined with an analysis of the broadband power, including the Ly-alpha forest in BigBOSS quasar spectra, achieve a FOM of 395 with Planck plus Stage III priors. This FOM is based on conservative assumptions for the analysis of broadband power (k_max = 0.15), and could grow to over 600 if current work allows us to push the analysis to higher wave numbers (k_max = 0.3). BigBOSS will also place constraints on theories of modified gravity and inflation, and will measure the sum of neutrino masses to 0.024 eV accuracy.
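
A side note on the quoted resolution figure: R = λ/Δλ directly fixes the smallest resolvable wavelength interval. A quick illustrative calculation (pairing the quoted R values with the band edges is an assumption, not stated in the abstract):

```python
# Smallest resolvable wavelength step implied by R = lambda / delta_lambda.
# Pairing the quoted R range with the band edges is illustrative only.
def delta_lambda(lam_nm: float, R: float) -> float:
    return lam_nm / R

for lam_nm, R in [(340, 3000), (1060, 4800)]:
    print(f"lambda = {lam_nm} nm, R = {R}: "
          f"delta_lambda = {delta_lambda(lam_nm, R):.3f} nm")
```

So the spectrograph resolves features on the order of 0.1-0.2 nm across its band.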

  12. Acute toxicity of intravenously administered titanium dioxide nanoparticles in mice.

    Directory of Open Access Journals (Sweden)

    Jiaying Xu

Full Text Available BACKGROUND: With a wide range of applications, titanium dioxide (TiO₂) nanoparticles (NPs) are manufactured worldwide in large quantities. Recently, in the field of nanomedicine, intravenous injection of TiO₂ nanoparticulate carriers directly into the bloodstream has raised public concerns on their toxicity to humans. METHODS: In this study, mice were injected intravenously with a single dose of TiO₂ NPs at varying dose levels (0, 140, 300, 645, or 1387 mg/kg). Animal mortality, blood biochemistry, hematology, genotoxicity and histopathology were investigated 14 days after treatment. RESULTS: Death of mice in the highest dose (1387 mg/kg) group was observed at day two after TiO₂ NPs injection. At day 7, acute toxicity symptoms, such as decreased physical activity and decreased intake of food and water, were observed in the highest dose group. Hematological analysis and the micronucleus test showed no significant acute hematological or genetic toxicity except an increase in the white blood cell (WBC) count among mice in the 645 mg/kg dose group. However, in TiO₂ NPs treated mice the spleen showed significantly higher tissue weight/body weight (BW) coefficients, and the liver and kidneys showed lower coefficients, compared to control. The biochemical parameters and histological tissue sections indicated that TiO₂ NPs treatment could induce different degrees of damage in the brain, lung, spleen, liver and kidneys. However, no pathological effects were observed in the heart of TiO₂ NPs treated mice. CONCLUSIONS: Intravenous injection of TiO₂ NPs at high doses in mice could cause acute toxicity effects in the brain, lung, spleen, liver, and kidney. No significant hematological or genetic toxicity was observed.

  13. The Big Five default brain: functional evidence.

    Science.gov (United States)

    Sampaio, Adriana; Soares, José Miguel; Coutinho, Joana; Sousa, Nuno; Gonçalves, Óscar F

    2014-11-01

Recent neuroimaging studies have provided evidence that different dimensions of human personality may be associated with specific structural neuroanatomic correlates. Identifying brain correlates of a situation-independent personality structure would require evidence of a stable default mode of brain functioning. In this study, we investigated the correlates of the Big Five personality dimensions (Extraversion, Neuroticism, Openness/Intellect, Agreeableness, and Conscientiousness) and the default mode network (DMN). Forty-nine healthy adults completed the NEO Five-Factor Inventory. The results showed that Extraversion (E) and Agreeableness (A) were positively correlated with activity in the midline core of the DMN, whereas Neuroticism (N), Openness (O), and Conscientiousness (C) were correlated with the parietal cortex system. Activity of the anterior cingulate cortex was positively associated with A and negatively with C. Regions of the parietal lobe were differentially associated with each personality dimension. The present study not only confirms previous functional correlates of the Big Five personality dimensions, but also expands our knowledge by showing the association between different personality dimensions and specific patterns of brain activation at rest.

  14. Intravenous cidofovir for resistant cutaneous warts in a patient with psoriasis treated with monoclonal antibodies.

    LENUS (Irish Health Repository)

    McAleer, M A

    2012-02-01

    Human papilloma virus is a common and often distressing cutaneous disease. It can be therapeutically challenging, especially in immunocompromised patients. We report a case of recalcitrant cutaneous warts that resolved with intravenous cidofovir treatment. The patient was immunocompromised secondary to monoclonal antibody therapy for psoriasis.

  15. Evaluation of the effects of intravenous anaesthesia using a ...

    African Journals Online (AJOL)

    medetomidine for total intravenous anaesthesia were evaluated in six sahel goats. The goats were administered a combination of ketamine (5mg/kg) and medetomidine (0.01mg/kg) intravenously. Baseline measurements of heart rate, respiratory ...

  16. Incarcerated intravenous heroin users: predictors of post-release utilization of methadone maintenance treatment.

    Science.gov (United States)

    Lin, Huang-Chi; Wang, Peng-Wei; Yang, Yi-Hsin; Tsai, Jih-Jin; Yen, Cheng-Fang

    2016-01-01

Incarcerated intravenous heroin users have more problematic patterns of heroin use, but are less likely to access methadone maintenance treatment on their own initiative than heroin users in the community. The present study examined predictors of receiving methadone maintenance treatment post-release among incarcerated intravenous heroin users within a 24-month period. This cohort study recruited 315 incarcerated intravenous heroin users detained in 4 prisons in southern Taiwan and followed them up within the 24-month period post-release. Cox proportional hazards regression analysis was applied to determine the predictive effects of sociodemographic and drug-use characteristics, attitude toward methadone maintenance treatment, human immunodeficiency virus serostatus, perceived family support, and depression on access to methadone maintenance treatment after release. There were 295 (93.7%) incarcerated intravenous heroin users released who entered the follow-up phase of the study. During the 24-month follow-up period, 50.8% of them received methadone maintenance treatment. After controlling for the effects of the detainment period before and after recruitment by Cox proportional hazards regression analysis, incarcerated intravenous heroin users who had positive human immunodeficiency virus serostatus (HR = 2.85, 95% CI = 1.80-4.52) or had received methadone maintenance treatment before committal (HR = 1.94, 95% CI = 1.23-3.05) were more likely to receive methadone maintenance treatment within the 24-month follow-up period. Positive human immunodeficiency virus serostatus with fully subsidized treatment and previous methadone maintenance treatment experience predicted access to methadone maintenance treatment post-release. Strategies for getting familiar with methadone maintenance treatment during detainment, including providing methadone maintenance treatment prior to release and lowering the economic burden of receiving treatment, may facilitate entry into methadone maintenance treatment for incarcerated intravenous heroin users.
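
A small arithmetic aside on hazard ratios such as those reported above: when a confidence interval is computed on the log scale (the usual Wald construction in Cox regression), the point estimate is the geometric mean of the interval bounds, which offers a quick consistency check on reported figures:

```python
import math

def hr_from_ci(lower: float, upper: float) -> float:
    """Point estimate implied by a log-symmetric (Wald) CI:
    the geometric mean of the bounds."""
    return math.sqrt(lower * upper)

# Hazard ratios reported in the abstract above:
print(round(hr_from_ci(1.80, 4.52), 2))  # HIV-positive serostatus -> 2.85
print(round(hr_from_ci(1.23, 3.05), 2))  # prior methadone treatment -> 1.94
```

Both reported point estimates match the geometric mean of their CI bounds, as expected for log-scale intervals.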

  17. Big Biology: Supersizing Science During the Emergence of the 21st Century

    Science.gov (United States)

    Vermeulen, Niki

    2017-01-01

Is biology the youngest member of the Big Science family? Increased collaboration in biological research became the subject of heated discussion in the wake of the Human Genome Project, but debates and reflections remained mostly polemical and showed limited appreciation for the diversity and explanatory power of the concept of Big Science. At the same time, scholars of science and technology studies have avoided the term Big Science in their descriptions of the changing research landscape. This interdisciplinary article combines a conceptual analysis of Big Science with data and ideas from a multi-method study of several large research projects in biology. The aim is to develop an empirically grounded, nuanced, and analytically useful understanding of Big Biology, leaving behind the normative debates with their simple dichotomies and rhetorical positions. While the concept of Big Science can be seen as a fashion in science policy, by now perhaps even an old-fashioned one, I argue that its analytical use directs our attention to the expansion of collaboration in the life sciences. The analysis of Big Biology reveals differences from Big Physics and other forms of Big Science, notably in the patterns of research organization, the technologies used, and the societal contexts in which it operates. Reflections on Big Science, Big Biology, and their relations to knowledge production can thus place recent claims about fundamental changes in life science research in a historical context. PMID:27215209

  18. Optimal timing for intravenous administration set replacement.

    Science.gov (United States)

    Gillies, D; O'Riordan, L; Wallen, M; Morrison, A; Rankin, K; Nagy, S

    2005-10-19

    Administration of intravenous therapy is a common occurrence within the hospital setting. Routine replacement of administration sets has been advocated to reduce intravenous infusion contamination. If decreasing the frequency of changing intravenous administration sets does not increase infection rates, a change in practice could result in considerable cost savings. The objective of this review was to identify the optimal interval for the routine replacement of intravenous administration sets when infusate or parenteral nutrition (lipid and non-lipid) solutions are administered to people in hospital via central or peripheral venous catheters. We searched The Cochrane Central Register of Controlled Trials (CENTRAL), MEDLINE, CINAHL, EMBASE: all from inception to February 2004; reference lists of identified trials, and bibliographies of published reviews. We also contacted researchers in the field. We did not have a language restriction. We included all randomized or quasi-randomized controlled trials addressing the frequency of replacing intravenous administration sets when parenteral nutrition (lipid and non-lipid containing solutions) or infusions (excluding blood) were administered to people in hospital via a central or peripheral catheter. Two authors assessed all potentially relevant studies. We resolved disagreements between the two authors by discussion with a third author. We collected data for the outcomes; infusate contamination; infusate-related bloodstream infection; catheter contamination; catheter-related bloodstream infection; all-cause bloodstream infection and all-cause mortality. We identified 23 references for review. We excluded eight of these studies; five because they did not fit the inclusion criteria and three because of inadequate data. We extracted data from the remaining 15 references (13 studies) with 4783 participants. We conclude that there is no evidence that changing intravenous administration sets more often than every 96 hours

  19. The Death of the Big Men

    DEFF Research Database (Denmark)

    Martin, Keir

    2010-01-01

Recently, Tolai people of Papua New Guinea have adopted the term 'Big Shot' to describe an emerging post-colonial political elite. The emergence of the term is a negative moral evaluation of new social possibilities that have arisen as a consequence of the Big Shots' privileged position within a global political economy. Grassroots Tolai pass judgement on the Big Shots through rhetorical contrast with idealized Big Men of the past, a particular local version of a global trend for the emergence of new words to mark changing perceptions of local elites. As such the 'Big Shot' acts...

  20. Perspectives on Big Data and Big Data Analytics

    Directory of Open Access Journals (Sweden)

    Elena Geanina ULARU

    2012-12-01

Full Text Available Nowadays companies are starting to realize the importance of using more data in order to support decisions about their strategies. It has been said, and shown through case studies, that "more data usually beats better algorithms". With this in mind, companies have started to realize that they can choose to invest in processing larger sets of data rather than in expensive algorithms. A large quantity of data is better used as a whole because of the possible correlations that emerge at larger scale, correlations that can never be found if the data is analyzed in separate or smaller sets. A larger amount of data gives better output, but working with it can also become a challenge due to processing limitations. This article intends to define the concept of Big Data and stress the importance of Big Data Analytics.
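
The claim that scale can reveal what small samples miss can be illustrated with a toy sketch: a genuinely weak correlation is invisible in a small sample but clearly detectable in a large one. Everything below is synthetic and illustrative (the 2/√n cut-off is only a rough significance rule of thumb):

```python
import math
import random

def pearson_r(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / math.sqrt(vx * vy)

def sample(n, rho, rng):
    """Draw n pairs whose true correlation is rho."""
    xs, ys = [], []
    for _ in range(n):
        x = rng.gauss(0, 1)
        y = rho * x + math.sqrt(1 - rho ** 2) * rng.gauss(0, 1)
        xs.append(x)
        ys.append(y)
    return xs, ys

rng = random.Random(42)
rho = 0.05  # a real but tiny effect
for n in (100, 100_000):
    xs, ys = sample(n, rho, rng)
    r = pearson_r(xs, ys)
    cutoff = 2 / math.sqrt(n)  # rough 5% significance rule of thumb
    print(f"n={n:>6}: r={r:+.4f}, detectable={abs(r) > cutoff}")
```

The detection threshold shrinks as 1/√n, so the same tiny effect that is lost in noise at n=100 stands out at n=100,000.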

  1. Solution of a braneworld big crunch/big bang cosmology

    International Nuclear Information System (INIS)

    McFadden, Paul L.; Turok, Neil; Steinhardt, Paul J.

    2007-01-01

We solve for the cosmological perturbations in a five-dimensional background consisting of two separating or colliding boundary branes, as an expansion in the collision speed V divided by the speed of light c. Our solution permits a detailed check of the validity of four-dimensional effective theory in the vicinity of the event corresponding to the big crunch/big bang singularity. We show that the four-dimensional description fails at the first nontrivial order in (V/c)². At this order, there is nontrivial mixing of the two relevant four-dimensional perturbation modes (the growing and decaying modes) as the boundary branes move from the narrowly separated limit described by Kaluza-Klein theory to the well-separated limit where gravity is confined to the positive-tension brane. We comment on the cosmological significance of the result and compute other quantities of interest in five-dimensional cosmological scenarios.

  2. Big Bang Day : The Great Big Particle Adventure - 3. Origins

    CERN Document Server

    2008-01-01

    In this series, comedian and physicist Ben Miller asks the CERN scientists what they hope to find. If the LHC is successful, it will explain the nature of the Universe around us in terms of a few simple ingredients and a few simple rules. But the Universe now was forged in a Big Bang where conditions were very different, and the rules were very different, and those early moments were crucial to determining how things turned out later. At the LHC they can recreate conditions as they were billionths of a second after the Big Bang, before atoms and nuclei existed. They can find out why matter and antimatter didn't mutually annihilate each other to leave behind a Universe of pure, brilliant light. And they can look into the very structure of space and time - the fabric of the Universe

  3. Antigravity and the big crunch/big bang transition

    Energy Technology Data Exchange (ETDEWEB)

    Bars, Itzhak [Department of Physics and Astronomy, University of Southern California, Los Angeles, CA 90089-2535 (United States); Chen, Shih-Hung [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada); Department of Physics and School of Earth and Space Exploration, Arizona State University, Tempe, AZ 85287-1404 (United States); Steinhardt, Paul J., E-mail: steinh@princeton.edu [Department of Physics and Princeton Center for Theoretical Physics, Princeton University, Princeton, NJ 08544 (United States); Turok, Neil [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada)

    2012-08-29

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  4. Antigravity and the big crunch/big bang transition

    International Nuclear Information System (INIS)

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.; Turok, Neil

    2012-01-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  5. The ethics of big data in big agriculture

    Directory of Open Access Journals (Sweden)

    Isabelle M. Carbonell

    2016-03-01

Full Text Available This paper examines the ethics of big data in agriculture, focusing on the power asymmetry between farmers and large agribusinesses like Monsanto. Following the recent purchase of Climate Corp., Monsanto is currently the most prominent biotech agribusiness to buy into big data. With wireless sensors on tractors monitoring or dictating every decision a farmer makes, Monsanto can now aggregate large quantities of previously proprietary farming data, enabling a privileged position with unique insights, on a field-by-field basis, into a third or more of US farmland. This power asymmetry may be rebalanced through open-sourced data and publicly funded data-analytic tools that rival Climate Corp. in complexity and innovation for use in the public domain.

  6. Small Artifacts - Big Technologies

    DEFF Research Database (Denmark)

    Kreiner, Kristian

    2005-01-01

The computer IC is the heart of information and telecommunication technology. It is a tiny artifact, but one with incredible organizing powers. We use this physical artifact as the location for studying central problems of the knowledge economy. First, the paper describes the history of chip design and the emergence of the technological community involved in designing and manufacturing computer chips. The community is structured in a way that reflects the underlying physical nature of silicon and the numerous other materials and chemicals involved. But it also reflects the human agency of defining new projects, of visioning the liberation from atoms, of committing to travel many detours in the labyrinths of development, and of perceiving and exploring the affordances that new technologies hide. Some of these characteristics are analyzed empirically in a case study of designing a chip for a digitalized hearing...

  7. Fuzzy VIKOR approach for selection of big data analyst in procurement management

    Directory of Open Access Journals (Sweden)

    Surajit Bag

    2016-07-01

Full Text Available Background: Big data and predictive analysis have been hailed as the fourth paradigm of science. Big data and analytics are critical to the future of business sustainability. The demand for data scientists is increasing with the dynamic nature of businesses, making it indispensable to manage big data, derive meaningful results, and interpret management decisions. Objectives: The purpose of this study was to provide a brief conceptual review of big data and analytics and to illustrate the use of a multi-criteria decision-making technique for selecting the right skilled candidate for big data and analytics in procurement management. Method: It is important for firms to select and recruit the right data analyst, both in terms of skill sets and scope of analysis. Such a selection problem is complex and multi-criteria in nature, dealing with both qualitative and quantitative factors. In the current study, the Fuzzy VIsekriterijumska optimizacija i KOmpromisno Resenje (VIKOR) method was applied to solve the big data analyst selection problem. Results: The study identified Technical knowledge (C1), Intellectual curiosity (C4), and Business acumen (C5) as the most influential criteria, which must be present in a candidate for a big data and analytics job. Conclusion: Fuzzy VIKOR is well suited to this kind of multi-criteria decision-making problem. This study will assist human resource managers and procurement managers in selecting the right workforce for big data analytics.
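
For readers unfamiliar with the method, the core of VIKOR can be sketched in a few lines. This is the classic crisp formulation; the fuzzy variant used in the study scores alternatives with fuzzy numbers and defuzzifies before ranking, but the S/R/Q computation is the same. The candidate scores and weights below are hypothetical, not taken from the paper:

```python
def vikor(matrix, weights, v=0.5):
    """Rank alternatives (rows) on benefit criteria (columns).
    Returns Q values; lower Q = better compromise solution."""
    cols = list(zip(*matrix))
    f_best = [max(c) for c in cols]   # best value per criterion
    f_worst = [min(c) for c in cols]  # worst value per criterion
    S, R = [], []
    for row in matrix:
        d = [w * (fb - x) / (fb - fw) if fb != fw else 0.0
             for x, w, fb, fw in zip(row, weights, f_best, f_worst)]
        S.append(sum(d))  # group utility
        R.append(max(d))  # individual regret
    s_best, s_worst = min(S), max(S)
    r_best, r_worst = min(R), max(R)
    Q = [v * (s - s_best) / (s_worst - s_best)
         + (1 - v) * (r - r_best) / (r_worst - r_best)
         for s, r in zip(S, R)]
    return Q

# Hypothetical candidates scored 1-9 on C1 technical knowledge,
# C4 intellectual curiosity, C5 business acumen.
candidates = [[8, 7, 6], [6, 9, 5], [7, 6, 9]]
weights = [0.5, 0.25, 0.25]
Q = vikor(candidates, weights)
best = min(range(len(Q)), key=Q.__getitem__)
print(Q, "-> best candidate index:", best)
```

Lower Q indicates a better compromise between group utility (S) and individual regret (R); the parameter v weights the two.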

  8. [Reducing fear in preschool children receiving intravenous injections].

    Science.gov (United States)

    Hsieh, Yi-Chuan; Liu, Hui-Tzu; Cho, Yen-Hua

    2012-06-01

Our pediatric medical ward administers an average of 80 intravenous injections to preschool children. We found that 91.1% exhibit behavior indicative of fear and anxiety. Over three-quarters (77.8%) of these children suffer severe fear and actively resist receiving injections. Such behavior places a greater than normal burden on human and material resources and often gives family members negative impressions that lower their trust in the healthcare service while raising nurse-patient tensions. Using observation and interviews, we found the primary factors in injection fear to be: past negative experiences, lack of adequate prior communication, measures taken to preemptively control child resistance, and the absence of cognitive behavioral strategies among nursing staff. This project worked to reduce the proportion of preschool children experiencing severe injection fear from 77.8% to 38.9% and to achieve a 50% capability-improvement target among team members. Our team identified several potential solutions from research papers and books between August 1st, 2009 and April 30th, 2010. Our proposed method included therapeutic games, self-selection of injection position, and cognitive behavioral strategies to divert attention. These measures were also specified as standard operating procedures for administering pediatric intravenous injections. We applied the strategy to 45 preschool children and identified a post-injection "severe fear" level of 37.8%. This project was designed to reduce fear in children to make them more accepting of vaccinations and to enhance children's positive treatment experience in order to raise nursing care quality.

  9. Cardiovascular effects of intravenous ghrelin infusion in healthy young men

    DEFF Research Database (Denmark)

    Vestergaard, Esben Thyssen; Andersen, Niels Holmark; Hansen, Troels Krarup

    2007-01-01

Ghrelin infusion improves cardiac function in patients suffering from cardiac failure, and bolus administration of ghrelin increases cardiac output in healthy subjects. The cardiovascular effects of more continuous intravenous ghrelin exposure remain to be studied. We therefore studied the cardiovascular effects of a constant infusion of human ghrelin at a rate of 5 pmol/kg per minute for 180 min. Fifteen healthy, young (aged 23.2 ± 0.5 yr), normal-weight (23.0 ± 0.4 kg/m2) men volunteered in a randomized double-blind, placebo-controlled crossover study. With the subjects remaining fasting, peak myocardial systolic velocity S′, tissue tracking TT, left ventricular ejection fraction EF, and endothelium-dependent flow-mediated vasodilatation were measured. Ghrelin infusion increased S′ 9% (P = 0.002) and TT 10% (P
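
The total dose implied by the protocol is simple arithmetic. Assuming a hypothetical 70 kg subject and a molecular weight of roughly 3371 g/mol for human acyl-ghrelin (an assumption, not stated in the abstract):

```python
# Total-dose arithmetic for the infusion protocol above.
# Assumptions (not from the study): 70 kg body weight;
# human acyl-ghrelin molecular weight ~3371 g/mol.
rate_pmol_per_kg_min = 5
weight_kg = 70
duration_min = 180
mw_g_per_mol = 3371

total_pmol = rate_pmol_per_kg_min * weight_kg * duration_min  # 63,000 pmol
total_nmol = total_pmol / 1000                                # 63 nmol
total_ug = total_nmol * mw_g_per_mol / 1000                   # ~212 micrograms

print(f"{total_pmol} pmol = {total_nmol:.0f} nmol = about {total_ug:.0f} ug")
```

That is, roughly 0.2 mg of peptide over the 3-hour infusion for a 70 kg subject.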

  10. Big data is not a monolith

    CERN Document Server

    Ekbia, Hamid R; Mattioli, Michael

    2016-01-01

Big data is ubiquitous but heterogeneous. Big data can be used to tally clicks and traffic on web pages, find patterns in stock trades, track consumer preferences, and identify linguistic correlations in large corpora of text. This book examines big data not as an undifferentiated whole but contextually, investigating the varied challenges posed by big data for health, science, law, commerce, and politics. Taken together, the chapters reveal a complex set of problems, practices, and policies. The advent of big data methodologies has challenged the theory-driven approach to scientific knowledge in favor of a data-driven one. Social media platforms and self-tracking tools change the way we see ourselves and others. The collection of data by corporations and government threatens privacy while promoting transparency. Meanwhile, politicians, policy makers, and ethicists are ill-prepared to deal with big data's ramifications. The contributors look at big data's effect on individuals as it exerts social control throu...

  11. Retroperitoneal fibrosis with normal intravenous urogram.

    OpenAIRE

    Creagh, F. M.; Stone, T.; Stephenson, T. P.; Lazarus, J. H.

    1985-01-01

    A 58 year old male presented with a two week history of low back pain and malaise. The intravenous urogram (IVU) at presentation was normal but within three months he had developed renal failure with bilateral ureteric obstruction on repeat IVU. Primary retroperitoneal fibrosis was confirmed at operation. This case demonstrates that retroperitoneal fibrosis may progress rapidly to renal failure within a few months of the first symptoms. In addition, the IVU may be normal in the early stages o...

  12. Total intravenous anesthesia for major burn surgery

    OpenAIRE

    Cancio, Leopoldo C; Cuenca, Phillip B; Walker, Stephen C; Shepherd, John M

    2013-01-01

    Total intravenous anesthesia (TIVA) is frequently used for major operations requiring general anesthesia in critically ill burn patients. We reviewed our experience with this approach. Methods: During a 22-month period, 547 major burn surgeries were performed in this center’s operating room and were staffed by full-time burn anesthesiologists. The records of all 123 TIVA cases were reviewed; 112 records were complete and were included. For comparison, 75 cases were selected at random from a t...

  13. Contrast agent choice for intravenous coronary angiography

    International Nuclear Information System (INIS)

    Zeman, H.D.; Siddons, D.P.

    1989-01-01

    The screening of the general population for coronary artery disease would be practical if a method existed for visualizing the extent of occlusion after an intravenous injection of contrast agent. Measurements performed with monochromatic synchrotron radiation x-rays and an iodine-containing contrast agent at the Stanford Synchrotron Radiation Laboratory have shown that such an intravenous angiography procedure would be possible with an adequately intense monochromatic x-ray source. Because of the size and cost of synchrotron radiation facilities it would be desirable to make the most efficient use of the intensity available, while reducing as much as possible the radiation dose experienced by the patient. By choosing contrast agents containing elements with a higher atomic number than iodine, it is possible to both improve the image quality and reduce the patient radiation dose, while using the same synchrotron source. By using Si monochromator crystals with a small mosaic spread, it is possible to increase the x-ray flux available for imaging by over an order of magnitude, without any changes in the storage ring or wiggler magnet. The most critical imaging task for intravenous coronary angiography utilizing synchrotron radiation x-rays is visualizing a coronary artery through the left ventricle or aorta, which also contains contrast agent. Calculations have been made of the signal-to-noise ratio expected for this imaging task for various contrast agents with atomic numbers between that of iodine and bismuth.

  14. Intravenous Lipids for Preterm Infants: A Review

    Directory of Open Access Journals (Sweden)

    Ghassan S. A. Salama

    2015-01-01

    Extremely low birth weight (ELBW) infants are born at a time when the fetus is undergoing rapid intrauterine brain and body growth. Continuing this growth in the first several weeks postnatally, while these infants are on ventilator support and receiving critical care, is often a challenge. These infants are usually highly stressed and at risk for catabolism. Parenteral nutrition is needed because most cannot meet the majority of their nutritional needs by the enteral route. Despite adoption of a more aggressive approach with amino acid infusions, there still appears to be a reluctance to use early intravenous lipids. This is based on several dogmas suggesting that lipid infusions may be associated with the development or exacerbation of lung disease, displace bilirubin from albumin, exacerbate sepsis, and cause CNS injury and thrombocytopenia. Several recent reviews have focused on intravenous nutrition for the premature neonate, but very little exists that comprehensively reviews intravenous lipids for very low birth weight and other critically ill neonates. Here, we provide a brief overview of lipid biochemistry and metabolism, especially as they pertain to the preterm infant, discuss the origin of some current clinical practices, and review the literature as a basis for revising clinical care and providing some clarity in a controversial area where practice is often based more on tradition and dogma than on science.

  15. Research Dilemmas with Behavioral Big Data.

    Science.gov (United States)

    Shmueli, Galit

    2017-06-01

    Behavioral big data (BBD) refers to very large and rich multidimensional data sets on human and social behaviors, actions, and interactions, which have become available to companies, governments, and researchers. A growing number of researchers in social science and management fields acquire and analyze BBD for the purpose of extracting knowledge and scientific discoveries. However, the relationships between the researcher, data, subjects, and research questions differ in the BBD context compared to traditional behavioral data. Behavioral researchers using BBD face not only methodological and technical challenges but also ethical and moral dilemmas. In this article, we discuss several dilemmas, challenges, and trade-offs related to acquiring and analyzing BBD for causal behavioral research.

  16. George and the big bang

    CERN Document Server

    Hawking, Lucy; Parsons, Gary

    2012-01-01

    George has problems. He has twin baby sisters at home who demand his parents’ attention. His beloved pig Freddy has been exiled to a farm, where he’s miserable. And worst of all, his best friend, Annie, has made a new friend whom she seems to like more than George. So George jumps at the chance to help Eric with his plans to run a big experiment in Switzerland that seeks to explore the earliest moment of the universe. But there is a conspiracy afoot, and a group of evildoers is planning to sabotage the experiment. Can George repair his friendship with Annie and piece together the clues before Eric’s experiment is destroyed forever? This engaging adventure features essays by Professor Stephen Hawking and other eminent physicists about the origins of the universe and ends with a twenty-page graphic novel that explains how the Big Bang happened—in reverse!

  17. Big Data for Precision Medicine

    Directory of Open Access Journals (Sweden)

    Daniel Richard Leff

    2015-09-01

    This article focuses on the potential impact of big data analysis to improve health, prevent and detect disease at an earlier stage, and personalize interventions. The role that big data analytics may have in interrogating the patient electronic health record toward improved clinical decision support is discussed. We examine developments in pharmacogenetics that have increased our appreciation of the reasons why patients respond differently to chemotherapy. We also assess the expansion of online health communications and the way in which this data may be capitalized on in order to detect public health threats and control or contain epidemics. Finally, we describe how a new generation of wearable and implantable body sensors may improve wellbeing, streamline management of chronic diseases, and improve the quality of surgical implants.

  18. Big Data where N=1

    DEFF Research Database (Denmark)

    Bardram, Jakob Eyvind

    2017-01-01

    Research on the use of 'big data' in health care has only just begun, and may in time become a great help in organizing more personal and holistic health care for patients with multiple chronic conditions. Personal health technology, briefly presented in this chapter, holds great potential for carrying out 'big data' analyses for the individual person, that is, where N=1. There are major technological challenges in building technologies and methods to collect and manage personal data that can be shared across systems in a standardized, responsible, robust, secure, and non...

  19. Did the Big Bang begin?

    International Nuclear Information System (INIS)

    Levy-Leblond, J.

    1990-01-01

    It is argued that the age of the universe may well be numerically finite (20 billion years or so) and conceptually infinite. A new and natural time scale is defined on a physical basis using group-theoretical arguments. An additive notion of time is obtained according to which the age of the universe is indeed infinite. In other words, never did the Big Bang begin. This new time scale is not supposed to replace the ordinary cosmic time scale, but to supplement it (in the same way as rapidity has taken a place by the side of velocity in Einsteinian relativity). The question is discussed within the framework of conventional (big-bang) and classical (nonquantum) cosmology, but could easily be extended to more elaborate views, as the purpose is not so much to modify present theories as to reach a deeper understanding of their meaning

  20. Big data and ophthalmic research.

    Science.gov (United States)

    Clark, Antony; Ng, Jonathon Q; Morlet, Nigel; Semmens, James B

    2016-01-01

    Large population-based health administrative databases, clinical registries, and data linkage systems are a rapidly expanding resource for health research. Ophthalmic research has benefited from the use of these databases in expanding the breadth of knowledge in areas such as disease surveillance, disease etiology, health services utilization, and health outcomes. Furthermore, the quantity of data available for research has increased exponentially in recent times, particularly as e-health initiatives come online in health systems across the globe. We review some big data concepts, the databases and data linkage systems used in eye research-including their advantages and limitations, the types of studies previously undertaken, and the future direction for big data in eye research. Copyright © 2016 Elsevier Inc. All rights reserved.

  1. The big wheels of ATLAS

    CERN Multimedia

    2006-01-01

    The ATLAS cavern is filling up at an impressive rate. The installation of the first of the big wheels of the muon spectrometer, a thin gap chamber (TGC) wheel, was completed in September. The muon spectrometer will include four big moving wheels at each end, each measuring 25 metres in diameter. Of the eight wheels in total, six will be composed of thin gap chambers for the muon trigger system and the other two will consist of monitored drift tubes (MDTs) to measure the position of the muons (see Bulletin No. 13/2006). The installation of the 688 muon chambers in the barrel is progressing well, with three-quarters of them already installed between the coils of the toroid magnet.

  2. Cryptography for Big Data Security

    Science.gov (United States)

    2015-07-13


  3. Combining Gamification with Big Data

    OpenAIRE

    Godinho, Roger Alan Mateus

    2014-01-01

    This dissertation aims to understand whether the combination of Gamification and Big Data technologies has an impact on the strategy of companies. To study this, the dissertation tests whether the benefits raised by combining the two create value for final users and, moreover, whether this can become a sustainable competitive advantage. For this, a research model is developed in which 5 companies that are already introducing such a strategy are analyzed, as well as the Gamificatio...

  4. Big Bang or vacuum fluctuation

    International Nuclear Information System (INIS)

    Zel'dovich, Ya.B.

    1980-01-01

    Some general properties of vacuum fluctuations in quantum field theory are described. The connection between the "energy dominance" of the energy density of vacuum fluctuations in curved space-time and the presence of a singularity is discussed. It is pointed out that a de Sitter space-time (with the energy density of the vacuum fluctuations in the Einstein equations) that matches the expanding Friedmann solution may describe the history of the Universe before the Big Bang. (P.L.)

  5. Big bang is not needed

    Energy Technology Data Exchange (ETDEWEB)

    Allen, A.D.

    1976-02-01

    Recent computer simulations indicate that a system of n gravitating masses breaks up, even when the total energy is negative. As a result, almost any initial phase-space distribution results in a universe that eventually expands under the Hubble law. Hence Hubble expansion implies little regarding an initial cosmic state. Especially it does not imply the singularly dense superpositioned state used in the big bang model.

  6. Big Data Comes to School

    Directory of Open Access Journals (Sweden)

    Bill Cope

    2016-03-01

    The prospect of “big data” at once evokes optimistic views of an information-rich future and concerns about surveillance that adversely impacts our personal and private lives. This overview article explores the implications of big data in education, focusing by way of example on data generated by student writing. We have chosen writing because it presents particular complexities, highlighting the range of processes for collecting and interpreting evidence of learning in the era of computer-mediated instruction and assessment as well as the challenges. Writing is significant not only because it is central to the core subject area of literacy; it is also an ideal medium for the representation of deep disciplinary knowledge across a number of subject areas. After defining what big data entails in education, we map emerging sources of evidence of learning that separately and together have the potential to generate unprecedented amounts of data: machine assessments, structured data embedded in learning, and unstructured data collected incidental to learning activity. Our case is that these emerging sources of evidence of learning have significant implications for the traditional relationships between assessment and instruction. Moreover, for educational researchers, these data are in some senses quite different from traditional evidentiary sources, and this raises a number of methodological questions. The final part of the article discusses implications for practice in an emerging field of education data science, including publication of data, data standards, and research ethics.

  7. Small Area Model-Based Estimators Using Big Data Sources

    Directory of Open Access Journals (Sweden)

    Marchetti Stefano

    2015-06-01

    The timely, accurate monitoring of social indicators, such as poverty or inequality, on a fine-grained spatial and temporal scale is a crucial tool for understanding social phenomena and for policymaking, but poses a great challenge to official statistics. This article argues that an interdisciplinary approach, combining the body of statistical research in small area estimation with the body of research in social data mining based on Big Data, can provide novel means to tackle this problem successfully. Big Data derived from the digital crumbs that humans leave behind in their daily activities are in fact providing ever more accurate proxies of social life. Social data mining from these data, coupled with advanced model-based techniques for fine-grained estimates, has the potential to provide a novel microscope through which to view and understand social complexity. This article suggests three ways to use Big Data together with small area estimation techniques, and shows how Big Data has the potential to mirror aspects of well-being and other socioeconomic phenomena.
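    One way such a combination can work, sketched here with invented numbers (the `composite_estimate` helper and all inputs are hypothetical, not taken from the article): a noisy direct survey estimate for a small area is shrunk toward a synthetic estimate predicted from a big-data proxy, in the spirit of Fay-Herriot-type area-level models:

```python
# Minimal sketch of a composite (Fay-Herriot-style) small area estimate
# that blends a noisy direct survey estimate with a synthetic estimate
# predicted from a big-data proxy (e.g. mobile-phone activity).
# All numbers are invented for illustration.

def composite_estimate(direct, var_direct, synthetic, var_model):
    """Shrink the direct estimate toward the model-based prediction.

    gamma weights the direct estimate by its reliability: areas with
    noisy surveys lean more on the big-data-driven synthetic estimate.
    """
    gamma = var_model / (var_model + var_direct)
    return gamma * direct + (1 - gamma) * synthetic

# An area with a small sample: the direct poverty-rate estimate is noisy,
# so the composite lands close to the synthetic estimate (about 0.192).
print(composite_estimate(direct=0.24, var_direct=0.004,
                         synthetic=0.18, var_model=0.001))
```

    The design choice mirrored here is the usual small-area trade-off: sampling variance of the direct estimator against model error of the big-data regression.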

  8. Surface urban heat island across 419 global big cities.

    Science.gov (United States)

    Peng, Shushi; Piao, Shilong; Ciais, Philippe; Friedlingstein, Pierre; Ottle, Catherine; Bréon, François-Marie; Nan, Huijuan; Zhou, Liming; Myneni, Ranga B

    2012-01-17

    Urban heat island is among the most evident aspects of human impacts on the earth system. Here we assess the diurnal and seasonal variation of surface urban heat island intensity (SUHII) defined as the surface temperature difference between urban area and suburban area measured from the MODIS. Differences in SUHII are analyzed across 419 global big cities, and we assess several potential biophysical and socio-economic driving factors. Across the big cities, we show that the average annual daytime SUHII (1.5 ± 1.2 °C) is higher than the annual nighttime SUHII (1.1 ± 0.5 °C) (P < 0.001). But no correlation is found between daytime and nighttime SUHII across big cities (P = 0.84), suggesting different driving mechanisms between day and night. The distribution of nighttime SUHII correlates positively with the difference in albedo and nighttime light between urban area and suburban area, while the distribution of daytime SUHII correlates negatively across cities with the difference of vegetation cover and activity between urban and suburban areas. Our results emphasize the key role of vegetation feedbacks in attenuating SUHII of big cities during the day, in particular during the growing season, further highlighting that increasing urban vegetation cover could be one effective way to mitigate the urban heat island effect.
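    The SUHII metric above is simply an urban-minus-suburban surface-temperature difference averaged across cities; a minimal sketch with invented temperatures (illustrative values, not MODIS retrievals) shows the computation:

```python
# Minimal sketch of the SUHII definition: urban minus suburban land
# surface temperature (LST), summarized across cities. Temperatures are
# invented illustrative values in degrees Celsius, not MODIS data.
from statistics import mean, stdev

# (urban LST, suburban LST) for a handful of hypothetical cities
daytime_lst = [(34.1, 32.2), (38.5, 37.6), (30.0, 28.1), (27.3, 26.5)]

suhii = [urban - suburban for urban, suburban in daytime_lst]
print("per-city SUHII:", [round(d, 1) for d in suhii])
print(f"mean ± sd: {mean(suhii):.1f} ± {stdev(suhii):.1f} °C")
```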

  9. Leveraging Big Data Analytics for Cache-Enabled Wireless Networks

    OpenAIRE

    Abdel Kader, Manhal; Bastug, Ejder; Bennis, Mehdi; zeydan, engin; Karatep, Alper; Salih Er, Ahmet; Debbah, Merouane

    2015-01-01

    While 5G wireless networks are expected to handle the ever-growing data avalanche, classical deployment/optimization approaches such as hyper-dense deployment of base stations or adding more bandwidth are cost-inefficient and are therefore seen as stopgaps. In this regard, context-aware approaches which exploit human predictability, recent advances in storage, edge/cloud computing, and big data analytics are needed. In this article, we approach this problem from a pr...

  10. From individual to group privacy in big data analytics

    OpenAIRE

    Mittelstadt, B

    2017-01-01

    Mature information societies are characterised by mass production of data that provide insight into human behaviour. Analytics (as in big data analytics) has arisen as a practice to make sense of the data trails generated through interactions with networked devices, platforms and organisations. Persistent knowledge describing the behaviours and characteristics of people can be constructed over time, linking individuals into groups or classes of interest to the platform. Analytics allows for a...

  11. Small data, data infrastructures and big data (Working Paper 1)

    OpenAIRE

    Kitchin, Rob; Lauriault, Tracey P.

    2014-01-01

    The production of academic knowledge has progressed for the past few centuries using small data studies characterized by sampled data generated to answer specific questions. It is a strategy that has been remarkably successful, enabling the sciences, social sciences and humanities to advance in leaps and bounds. This approach is presently being challenged by the development of big data. Small data studies will, however, continue to be important in the future because of their utility in answer...

  12. Mycotic aneurysms in intravenous drug abusers: the utility of intravenous digital subtraction angiography

    International Nuclear Information System (INIS)

    Shetty, P.C.; Krasicky, G.A.; Sharma, R.P.; Vemuri, B.R.; Burke, M.M.

    1985-01-01

    Two hundred thirteen intravenous digital subtraction angiographic (DSA) examinations were performed on 195 intravenous drug abusers to rule out the possibility of a mycotic aneurysm in a groin, neck, or upper extremity infection. Twenty-three surgically proved cases of mycotic aneurysm were correctly identified, with no false positive results. In addition, six cases of major venous occlusion were documented. The authors present the results of their experience and conclude that DSA is an effective and cost-efficient method of examining this high-risk patient population.

  13. Early Prediction of Movie Box Office Success based on Wikipedia Activity Big Data

    OpenAIRE

    Mestyán, Márton; Yasseri, Taha; Kertész, János

    2012-01-01

    Use of socially generated "big data" to access information about collective states of the minds in human societies has become a new paradigm in the emerging field of computational social science. A natural application of this would be the prediction of the society's reaction to a new product in the sense of popularity and adoption rate. However, bridging the gap between "real time monitoring" and "early predicting" remains a big challenge. Here we report on an endeavor to build a minimalistic...
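    A minimalistic predictor of the kind described might be a one-variable least-squares fit of box-office revenue on pre-release page-view activity; the sketch below uses invented data purely for illustration, not figures from the study:

```python
# Hypothetical sketch: predict opening box office from pre-release
# Wikipedia page views with a one-variable least-squares fit.
# All data points are invented for illustration.
views = [1.2, 3.4, 0.8, 5.1, 2.0]           # millions of page views
box_office = [10.0, 31.0, 7.0, 44.0, 18.0]  # millions of dollars

n = len(views)
mean_x = sum(views) / n
mean_y = sum(box_office) / n

# Ordinary least squares for slope and intercept
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(views, box_office))
         / sum((x - mean_x) ** 2 for x in views))
intercept = mean_y - slope * mean_x

# Predict revenue for a new film with 4.0M pre-release views (≈ 35.1)
print(f"predicted: {intercept + slope * 4.0:.1f}M USD")
```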

  14. Big Data Innovation Challenge : Pioneering Approaches to Data-Driven Development

    OpenAIRE

    World Bank Group

    2016-01-01

    Big data can sound remote and lacking a human dimension, with few obvious links to development and impacting the lives of the poor. Concepts such as anti-poverty targeting, market access or rural electrification seem far more relevant – and easier to grasp. And yet some of today’s most groundbreaking initiatives in these areas rely on big data. This publication profiles these and more, sho...

  15. Turning big bang into big bounce. I. Classical dynamics

    Science.gov (United States)

    Dzierżak, Piotr; Małkiewicz, Przemysław; Piechocki, Włodzimierz

    2009-11-01

    The big bounce (BB) transition within a flat Friedmann-Robertson-Walker model is analyzed in the setting of loop geometry underlying the loop cosmology. We solve the constraint of the theory at the classical level to identify physical phase space and find the Lie algebra of the Dirac observables. We express energy density of matter and geometrical functions in terms of the observables. It is the modification of classical theory by the loop geometry that is responsible for BB. The classical energy scale specific to BB depends on a parameter that should be fixed either by cosmological data or determined theoretically at quantum level, otherwise the energy scale stays unknown.
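    The big-bounce mechanism analyzed here is commonly summarized, in the effective dynamics of loop cosmology, by a modified Friedmann equation (quoted as standard background for the reader, not from the paper itself):

```latex
H^{2} = \frac{8\pi G}{3}\,\rho\left(1 - \frac{\rho}{\rho_{c}}\right)
```

    The Hubble rate H vanishes as the energy density ρ approaches the critical density ρ_c, so expansion turns into contraction and the singular big bang is replaced by a bounce; ρ_c plays the role of the parameter-dependent energy scale mentioned in the abstract.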

  16. Big Data Knowledge in Global Health Education.

    Science.gov (United States)

    Olayinka, Olaniyi; Kekeh, Michele; Sheth-Chandra, Manasi; Akpinar-Elci, Muge

    The ability to synthesize and analyze massive amounts of data is critical to the success of organizations, including those that involve global health. As countries become highly interconnected, increasing the risk for pandemics and outbreaks, the demand for big data is likely to increase. This requires a global health workforce that is trained in the effective use of big data. To assess implementation of big data training in global health, we conducted a pilot survey of members of the Consortium of Universities of Global Health. More than half the respondents did not have a big data training program at their institution. Additionally, the majority agreed that big data training programs will improve global health deliverables, among other favorable outcomes. Given the observed gap and benefits, global health educators may consider investing in big data training for students seeking a career in global health. Copyright © 2017 Icahn School of Medicine at Mount Sinai. Published by Elsevier Inc. All rights reserved.

  17. Intravenous drugs infusion safety through smart pumps

    Directory of Open Access Journals (Sweden)

    C. Gómez-Baraza

    2014-07-01

    Objective: To analyze the role of smart infusion pumps in reducing errors related to the administration of intravenous medications. Method: Retrospective, observational study analyzing the implementation of a system of smart intravenous infusion pumps (Hospira MedNet™) and the role of the safety system in detecting errors during the administration of drugs, sera, and blood. We included infusions administered at the day-care hospitals of hematology, oncology, rheumatology, and oncopediatrics, and analyzed adherence to the safety system, the number of programming errors detected, the drugs commonly implicated in these errors, and improvement actions. Results: During the study period, 120 smart pumps were implemented and data on 70,028 infusions were gathered. Adherence to the safety program was 62.30% in hematology (6,887 infusions), 60.30% in oncology (28,127 infusions), 46.50% in rheumatology (1,950 infusions), and 1.8% in oncopediatrics (139 infusions). The pumps generated 3,481 programming alerts outside the established limits: 2,716 at the relative limit and 765 at the absolute limit. In 807 infusions (2.17%), errors that could have had consequences for the patients were prevented. These findings enabled a series of strategies aimed at minimizing such errors in the future. Conclusions: The Hospira MedNet™ system detects deviations from the established intravenous infusion protocols, thereby preventing potential adverse events. It also supports establishing corrective measures and implementing improvement strategies.
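    The reported error rate can be reproduced from the per-unit infusion counts, assuming the parenthetical figures are the infusions monitored in each unit:

```python
# Back-of-the-envelope check of the rates reported in the abstract.
# Per-unit infusion counts are taken from the abstract; the assumption
# is that they form the denominator for the reported error percentage.
infusions = {"hematology": 6887, "oncology": 28127,
             "rheumatology": 1950, "oncopediatrics": 139}

total = sum(infusions.values())
print("total monitored infusions:", total)

# 807 infusions carried a potentially consequential programming error
error_rate = 807 / total * 100
print(f"error rate: {error_rate:.2f}%")  # close to the 2.17% reported
```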

  18. Retroperitoneal fibrosis with normal intravenous urogram.

    Science.gov (United States)

    Creagh, F. M.; Stone, T.; Stephenson, T. P.; Lazarus, J. H.

    1985-01-01

    A 58 year old male presented with a two week history of low back pain and malaise. The intravenous urogram (IVU) at presentation was normal but within three months he had developed renal failure with bilateral ureteric obstruction on repeat IVU. Primary retroperitoneal fibrosis was confirmed at operation. This case demonstrates that retroperitoneal fibrosis may progress rapidly to renal failure within a few months of the first symptoms. In addition, the IVU may be normal in the early stages of the illness. PMID: 3983053

  19. Switching between intravenous and subcutaneous trastuzumab

    DEFF Research Database (Denmark)

    Gligorov, Joseph; Curigliano, Giuseppe; Müller, Volkmar

    2017-01-01

    AIM: To assess the safety and tolerability of switching between subcutaneous (SC) and intravenous (IV) trastuzumab in the PrefHer study (NCT01401166). PATIENTS AND METHODS: Patients with HER2-positive early breast cancer completed (neo)adjuvant chemotherapy and were randomised to receive four... Rates of clinically important events, including grade ≥3 AEs, serious AEs, AEs leading to study drug discontinuation, and cardiac AEs, were low and similar between treatment arms; no new safety signals for trastuzumab were observed. CONCLUSIONS: PrefHer revealed that switching from IV to SC trastuzumab (hand-held syringe or SID) or vice versa did not impact the known safety profile of trastuzumab.

  20. Medical big data: promise and challenges

    OpenAIRE

    Lee, Choong Ho; Yoon, Hyung-Jin

    2017-01-01

    The concept of big data, commonly characterized by volume, variety, velocity, and veracity, goes far beyond the data type and includes the aspects of data analysis, such as hypothesis-generating, rather than hypothesis-testing. Big data focuses on temporal stability of the association, rather than on causal relationship and underlying probability distribution assumptions are frequently not required. Medical big data as material to be analyzed has various features that are not only distinct fr...

  1. Background: A Social Framework for Big Data

    OpenAIRE

    Ruppert, Evelyn; Harvey, Penny; Lury, Celia; Mackenzie, Adrian; McNally, Ruth; Baker, Stephanie Alice; Kallianos, Yannis; Lewis, Camilla

    2015-01-01

    This is a Background document to 'A Social Framework for Big Data', which proposes an agenda that understands how social composition and social effects are related and proposes that giving Big Data a ‘social intelligence’ requires acting with an ethic of care. The Background provides a discussion of some conceptual issues and debates related to this agenda. Both documents along with a working paper, 'Socialising Big Data: from concept to practice' are the product of an ESRC funded project, So...

  2. The Natural Science Underlying Big History

    Directory of Open Access Journals (Sweden)

    Eric J. Chaisson

    2014-01-01

    Nature’s many varied complex systems—including galaxies, stars, planets, life, and society—are islands of order within the increasingly disordered Universe. All organized systems are subject to physical, biological, or cultural evolution, which together comprise the grander interdisciplinary subject of cosmic evolution. A wealth of observational data supports the hypothesis that increasingly complex systems evolve unceasingly, uncaringly, and unpredictably from big bang to humankind. These are global history greatly extended, big history with a scientific basis, and natural history broadly portrayed across ∼14 billion years of time. Human beings and our cultural inventions are not special, unique, or apart from Nature; rather, we are an integral part of a universal evolutionary process connecting all such complex systems throughout space and time. Such evolution writ large has significant potential to unify the natural sciences into a holistic understanding of who we are and whence we came. No new science (beyond frontier, nonequilibrium thermodynamics) is needed to describe cosmic evolution’s major milestones at a deep and empirical level. Quantitative models and experimental tests imply that a remarkable simplicity underlies the emergence and growth of complexity for a wide spectrum of known and diverse systems. Energy is a principal facilitator of the rising complexity of ordered systems within the expanding Universe; energy flows are as central to life and society as they are to stars and galaxies. In particular, energy rate density—contrasting with information content or entropy production—is an objective metric suitable to gauge relative degrees of complexity among a hierarchy of widely assorted systems observed throughout the material Universe. Operationally, those systems capable of utilizing optimum amounts of energy tend to survive, and those that cannot are nonrandomly eliminated.

  3. The Natural Science Underlying Big History

    Science.gov (United States)

    Chaisson, Eric J.

    2014-01-01

    Nature's many varied complex systems—including galaxies, stars, planets, life, and society—are islands of order within the increasingly disordered Universe. All organized systems are subject to physical, biological, or cultural evolution, which together comprise the grander interdisciplinary subject of cosmic evolution. A wealth of observational data supports the hypothesis that increasingly complex systems evolve unceasingly, uncaringly, and unpredictably from big bang to humankind. These are global history greatly extended, big history with a scientific basis, and natural history broadly portrayed across ∼14 billion years of time. Human beings and our cultural inventions are not special, unique, or apart from Nature; rather, we are an integral part of a universal evolutionary process connecting all such complex systems throughout space and time. Such evolution writ large has significant potential to unify the natural sciences into a holistic understanding of who we are and whence we came. No new science (beyond frontier, nonequilibrium thermodynamics) is needed to describe cosmic evolution's major milestones at a deep and empirical level. Quantitative models and experimental tests imply that a remarkable simplicity underlies the emergence and growth of complexity for a wide spectrum of known and diverse systems. Energy is a principal facilitator of the rising complexity of ordered systems within the expanding Universe; energy flows are as central to life and society as they are to stars and galaxies. In particular, energy rate density—contrasting with information content or entropy production—is an objective metric suitable to gauge relative degrees of complexity among a hierarchy of widely assorted systems observed throughout the material Universe. Operationally, those systems capable of utilizing optimum amounts of energy tend to survive, and those that cannot are nonrandomly eliminated. PMID:25032228
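    Energy rate density is simply power per unit mass; a quick illustration with round, assumed textbook-style figures (not values taken from this article):

```python
# Energy rate density = power / mass, in erg s^-1 g^-1 (the cgs units
# used in this literature). Figures below are assumed round numbers:
# an adult human metabolizing ~130 W at a body mass of ~70 kg.
WATT_TO_ERG_PER_S = 1e7  # 1 W = 10^7 erg/s

power_w = 130.0   # metabolic power, watts (assumed)
mass_g = 70e3     # body mass, grams (assumed)

phi_m = power_w * WATT_TO_ERG_PER_S / mass_g
print(f"{phi_m:.2e} erg/s/g")  # ~2e4: orders of magnitude above a star's ~2
```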

  4. Medical big data: promise and challenges.

    Science.gov (United States)

    Lee, Choong Ho; Yoon, Hyung-Jin

    2017-03-01

The concept of big data, commonly characterized by volume, variety, velocity, and veracity, goes far beyond the data type and includes aspects of data analysis, such as hypothesis generating rather than hypothesis testing. Big data focuses on the temporal stability of associations rather than on causal relationships, and underlying probability distribution assumptions are frequently not required. Medical big data, as material to be analyzed, has various features that are distinct not only from big data in other disciplines but also from traditional clinical epidemiology. Big data technology has many areas of application in healthcare, such as predictive modeling and clinical decision support, disease and safety surveillance, public health, and research. Big data analytics frequently exploits analytic methods developed in data mining, including classification, clustering, and regression. Medical big data analyses are complicated by many technical issues, such as missing values, the curse of dimensionality, and bias control, and share the inherent limitations of observational studies, namely the inability to test causality owing to residual confounding and reverse causation. Recently, propensity score analysis and instrumental variable analysis have been introduced to overcome these limitations, and they have accomplished a great deal. Many challenges, such as the absence of evidence of the practical benefits of big data, methodological issues including legal and ethical issues, and clinical integration and utility issues, must be overcome to realize the promise of medical big data as the fuel of a continuous learning healthcare system that will improve patient outcomes and reduce waste in areas including nephrology.
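The propensity-score adjustment mentioned in the abstract can be illustrated with a deliberately small sketch (all data hypothetical): with a single discrete confounder, the propensity score reduces to the treatment frequency within each stratum, and inverse-probability weighting yields a confounding-adjusted comparison of outcomes.

```python
from collections import defaultdict

# Hypothetical records: (confounder_stratum, treated, outcome)
records = [
    ("old", 1, 1), ("old", 1, 1), ("old", 1, 0), ("old", 0, 1),
    ("young", 1, 0), ("young", 0, 0), ("young", 0, 0), ("young", 0, 1),
]

# Propensity score per stratum: P(treated | stratum)
counts = defaultdict(lambda: [0, 0])  # stratum -> [treated count, total]
for stratum, treated, _ in records:
    counts[stratum][0] += treated
    counts[stratum][1] += 1
propensity = {s: t / n for s, (t, n) in counts.items()}

def ipw_mean(arm):
    """Inverse-probability-weighted mean outcome in one treatment arm."""
    num = den = 0.0
    for stratum, treated, outcome in records:
        if treated != arm:
            continue
        p = propensity[stratum]
        w = 1.0 / (p if arm == 1 else 1.0 - p)
        num += w * outcome
        den += w
    return num / den

ate = ipw_mean(1) - ipw_mean(0)  # confounding-adjusted effect estimate
print(round(ate, 3))
```

In this toy data the treated are concentrated in the "old" stratum, so the crude and weighted comparisons differ; real propensity-score analyses fit a regression model for the score rather than tabulating strata.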

  5. Big data analytics with R and Hadoop

    CERN Document Server

    Prajapati, Vignesh

    2013-01-01

    Big Data Analytics with R and Hadoop is a tutorial-style book that focuses on all the powerful big data tasks that can be achieved by integrating R and Hadoop. This book is ideal for R developers who are looking for a way to perform big data analytics with Hadoop. It is also aimed at those who know Hadoop and want to build intelligent applications over big data with R packages. Basic knowledge of R is helpful.

  6. Survey of Big Data Information Security

    Directory of Open Access Journals (Sweden)

    Aida Tofikovna Makhmudova

    2016-06-01

    Today, the information security (IS) of data mining is a crucial and comprehensive issue for organizations of different spheres and sizes. The main challenges of Big Data are the management of large amounts of heterogeneous information and the provision of its availability. Protecting Big Data against unauthorized access and corruption (keeping its confidentiality and integrity) and maintaining its availability form the key research priorities in this field. The issues related to providing these Big Data properties are considered in the paper, and the existing approaches to their solution are analyzed. Some concepts for their improvement when designing secure Big Data mining algorithms are also formulated in accordance with IS properties.

  7. Medical big data: promise and challenges

    Science.gov (United States)

    Lee, Choong Ho; Yoon, Hyung-Jin

    2017-01-01

    The concept of big data, commonly characterized by volume, variety, velocity, and veracity, goes far beyond the data type and includes aspects of data analysis, such as hypothesis generating rather than hypothesis testing. Big data focuses on the temporal stability of associations rather than on causal relationships, and underlying probability distribution assumptions are frequently not required. Medical big data, as material to be analyzed, has various features that are distinct not only from big data in other disciplines but also from traditional clinical epidemiology. Big data technology has many areas of application in healthcare, such as predictive modeling and clinical decision support, disease and safety surveillance, public health, and research. Big data analytics frequently exploits analytic methods developed in data mining, including classification, clustering, and regression. Medical big data analyses are complicated by many technical issues, such as missing values, the curse of dimensionality, and bias control, and share the inherent limitations of observational studies, namely the inability to test causality owing to residual confounding and reverse causation. Recently, propensity score analysis and instrumental variable analysis have been introduced to overcome these limitations, and they have accomplished a great deal. Many challenges, such as the absence of evidence of the practical benefits of big data, methodological issues including legal and ethical issues, and clinical integration and utility issues, must be overcome to realize the promise of medical big data as the fuel of a continuous learning healthcare system that will improve patient outcomes and reduce waste in areas including nephrology. PMID:28392994

  8. Big Data's Role in Precision Public Health.

    Science.gov (United States)

    Dolley, Shawn

    2018-01-01

    Precision public health is an emerging practice to more granularly predict and understand public health risks and customize treatments for more specific and homogeneous subpopulations, often using new data, technologies, and methods. Big data is one element that has consistently helped to achieve these goals, through its ability to deliver to practitioners a volume and variety of structured or unstructured data not previously possible. Big data has enabled more widespread and specific research and trials of stratifying and segmenting populations at risk for a variety of health problems. Examples of success using big data are surveyed in surveillance and signal detection, predicting future risk, targeted interventions, and understanding disease. Using novel big data or big data approaches has risks that remain to be resolved. The continued growth in volume and variety of available data, decreased costs of data capture, and emerging computational methods mean big data success will likely be a required pillar of precision public health into the future. This review article aims to identify the precision public health use cases where big data has added value, identify classes of value that big data may bring, and outline the risks inherent in using big data in precision public health efforts.

  9. Traffic information computing platform for big data

    Science.gov (United States)

    Duan, Zongtao; Li, Ying; Zheng, Xibin; Liu, Yan; Dai, Jiting; Kang, Jun

    2014-10-01

    The big data environment creates the data conditions for improving the quality of traffic information services. The goal of this article is to construct a traffic information computing platform for the big data environment. Through in-depth analysis of the connotation and technical characteristics of big data and traffic information services, a distributed traffic atomic information computing platform architecture is proposed. Under the big data environment, this type of traffic atomic information computing architecture helps to guarantee traffic safety and efficient operation, and more intelligent and personalized traffic information services can be provided to traffic information users.

  10. Traffic information computing platform for big data

    International Nuclear Information System (INIS)

    Duan, Zongtao; Li, Ying; Zheng, Xibin; Liu, Yan; Dai, Jiting; Kang, Jun

    2014-01-01

    The big data environment creates the data conditions for improving the quality of traffic information services. The goal of this article is to construct a traffic information computing platform for the big data environment. Through in-depth analysis of the connotation and technical characteristics of big data and traffic information services, a distributed traffic atomic information computing platform architecture is proposed. Under the big data environment, this type of traffic atomic information computing architecture helps to guarantee traffic safety and efficient operation, and more intelligent and personalized traffic information services can be provided to traffic information users.

  11. Organizational Design Challenges Resulting From Big Data

    Directory of Open Access Journals (Sweden)

    Jay R. Galbraith

    2014-04-01

    Business firms and other types of organizations are feverishly exploring ways of taking advantage of the big data phenomenon. This article discusses firms that are at the leading edge of developing a big data analytics capability. Firms that are currently enjoying the most success in this area are able to use big data not only to improve their existing businesses but to create new businesses as well. Putting a strategic emphasis on big data requires adding an analytics capability to the existing organization. This transformation process results in power shifting to analytics experts and in decisions being made in real time.

  12. Big data: an introduction for librarians.

    Science.gov (United States)

    Hoy, Matthew B

    2014-01-01

    Modern life produces data at an astounding rate and shows no signs of slowing. This has led to new advances in data storage and analysis and the concept of "big data," that is, massive data sets that can yield surprising insights when analyzed. This column will briefly describe what big data is and why it is important. It will also briefly explore the possibilities and problems of big data and the implications it has for librarians. A list of big data projects and resources is also included.

  13. Big data optimization recent developments and challenges

    CERN Document Server

    2016-01-01

    The main objective of this book is to provide the necessary background to work with big data by introducing some novel optimization algorithms and codes capable of working in the big data setting, as well as some applications of big data optimization, for interested academics and practitioners, and to benefit society, industry, academia, and government. Presenting applications in a variety of industries, this book will be useful for researchers aiming to analyze large-scale data. Several optimization algorithms for big data, including convergent parallel algorithms, the limited memory bundle algorithm, the diagonal bundle method, network analytics, and many more, are explored in this book.

  14. Medical big data: promise and challenges

    Directory of Open Access Journals (Sweden)

    Choong Ho Lee

    2017-03-01

    Full Text Available The concept of big data, commonly characterized by volume, variety, velocity, and veracity, goes far beyond the data type and includes the aspects of data analysis, such as hypothesis-generating, rather than hypothesis-testing. Big data focuses on temporal stability of the association, rather than on causal relationship and underlying probability distribution assumptions are frequently not required. Medical big data as material to be analyzed has various features that are not only distinct from big data of other disciplines, but also distinct from traditional clinical epidemiology. Big data technology has many areas of application in healthcare, such as predictive modeling and clinical decision support, disease or safety surveillance, public health, and research. Big data analytics frequently exploits analytic methods developed in data mining, including classification, clustering, and regression. Medical big data analyses are complicated by many technical issues, such as missing values, curse of dimensionality, and bias control, and share the inherent limitations of observation study, namely the inability to test causality resulting from residual confounding and reverse causation. Recently, propensity score analysis and instrumental variable analysis have been introduced to overcome these limitations, and they have accomplished a great deal. Many challenges, such as the absence of evidence of practical benefits of big data, methodological issues including legal and ethical issues, and clinical integration and utility issues, must be overcome to realize the promise of medical big data as the fuel of a continuous learning healthcare system that will improve patient outcome and reduce waste in areas including nephrology.

  15. Big Data; A Management Revolution : The emerging role of big data in businesses

    OpenAIRE

    Blasiak, Kevin

    2014-01-01

    Big data is a term that was coined in 2012 and has since emerged as one of the top trends in business and technology. Big data is an agglomeration of different technologies resulting in data processing capabilities that had not been reached before. Big data is generally characterized by three factors: volume, velocity, and variety. These three factors distinguish it from traditional data use. The possibilities for utilizing this technology are vast. Big data technology has touch points in differ...

  16. BigDataBench: a Big Data Benchmark Suite from Internet Services

    OpenAIRE

    Wang, Lei; Zhan, Jianfeng; Luo, Chunjie; Zhu, Yuqing; Yang, Qiang; He, Yongqiang; Gao, Wanling; Jia, Zhen; Shi, Yingjie; Zhang, Shujie; Zheng, Chen; Lu, Gang; Zhan, Kent; Li, Xiaona; Qiu, Bizhu

    2014-01-01

    As architecture, systems, and data management communities pay greater attention to innovative big data systems and architectures, the pressure of benchmarking and evaluating these systems rises. Considering the broad use of big data systems, big data benchmarks must include diversity of data and workloads. Most of the state-of-the-art big data benchmarking efforts target evaluating specific types of applications or system software stacks, and hence they are not qualified for serving the purpo...

  17. [Effects of intravenous transplantation of human umbilical cord blood mononuclear cells combined compound Danshen dripping pills on the microenvironment and apoptosis in the myocardium of the rabbits with acute myocardial infarction].

    Science.gov (United States)

    Yuan, Chunjun; Ai, Qi; Deng, Liuxia; Yu, Guolong

    2013-08-01

    To explore the effects of compound Danshen dripping pills (CDDP), alone and combined with transplantation of human umbilical cord blood mononuclear cells (HUCBMCs), on the inflammatory response, oxidative stress, myocardial cell apoptosis, and cardiac function, and to investigate the possible mechanisms of the combined therapy in acute myocardial infarction (AMI). A rabbit model of AMI was established by ligation of the left anterior descending coronary artery (LAD). Forty rabbits were randomly divided into 4 groups (n=10 per group): a control group, injected with 0.5 mL of saline 24 h after AMI and then gavaged with 5 mL of saline daily; a CDDP group, injected with 0.5 mL of saline after AMI and then gavaged with CDDP (270 mg/d) daily; a transplantation group, injected with 0.5 mL of saline containing 3 × 10^7 HUCBMCs [labeled with green fluorescent protein (GFP)] and then gavaged with 5 mL of saline daily; and a combined group, injected with 0.5 mL of saline containing 3 × 10^7 HUCBMCs (labeled with GFP) and then gavaged with CDDP (270 mg/d) daily. Cardiac function indices such as left ventricular fractional shortening (LVFS) and ejection fraction (LVEF) were measured by echocardiography; the pathological changes were observed by HE staining, and the white blood cells in the myocardium were determined by light microscopy. The superoxide dismutase (SOD) activity and malondialdehyde (MDA) content in the myocardium were detected by nitrotetrazolium blue chloride (NBT) and thiobarbituric acid colorimetric measurement, respectively. The number of transplanted cells in the myocardium was examined by counting GFP-positive cells under fluorescence microscopy. 1) Compared with the control group (at 1 or 4 weeks), LVEF and LVFS were significantly improved in the CDDP, transplantation, and combined groups (all P<0.05); the myocardial cell apoptosis ratio was decreased significantly in the CDDP, transplantation, and combined groups (all P<0.05); the myocardial infarction area in the

  18. Big bang nucleosynthesis: An update

    Energy Technology Data Exchange (ETDEWEB)

    Olive, Keith A. [William I. Fine Theoretical Physics Institute, School of Physics and Astronomy, University of Minnesota, Minneapolis, MN 55455 (United States)

    2013-07-23

    An update on the standard model of big bang nucleosynthesis (BBN) is presented. With the value of the baryon-to-photon ratio determined to high precision by WMAP, standard BBN is a parameter-free theory. In this context, the theoretical prediction for the abundances of D, {sup 4}He, and {sup 7}Li is discussed and compared to their observational determination. While concordance for D and {sup 4}He is satisfactory, the prediction for {sup 7}Li exceeds the observational determination by a factor of about four. Possible solutions to this problem are discussed.

  19. The faces of Big Science.

    Science.gov (United States)

    Schatz, Gottfried

    2014-06-01

    Fifty years ago, academic science was a calling with few regulations or financial rewards. Today, it is a huge enterprise confronted by a plethora of bureaucratic and political controls. This change was not triggered by specific events or decisions but reflects the explosive 'knee' in the exponential growth that science has sustained during the past three-and-a-half centuries. Coming to terms with the demands and benefits of 'Big Science' is a major challenge for today's scientific generation. Since its foundation 50 years ago, the European Molecular Biology Organization (EMBO) has been of invaluable help in meeting this challenge.

  20. Inhomogeneous Big Bang Nucleosynthesis Revisited

    OpenAIRE

    Lara, J. F.; Kajino, T.; Mathews, G. J.

    2006-01-01

    We reanalyze the allowed parameters for inhomogeneous big bang nucleosynthesis in light of the WMAP constraints on the baryon-to-photon ratio and a recent measurement which has set the neutron lifetime to be 878.5 +/- 0.7 +/- 0.3 seconds. For a set baryon-to-photon ratio the new lifetime reduces the mass fraction of He4 by 0.0015 but does not significantly change the abundances of other isotopes. This enlarges the region of concordance between He4 and deuterium in the parameter space of the b...

  1. Towards Geo-spatial Information Science in Big Data Era

    Directory of Open Access Journals (Sweden)

    LI Deren

    2016-04-01

    Since the 1990s, with the advent of the worldwide information revolution and the development of the internet, geospatial information science has come of age, pushing forward the building of the digital Earth and the cyber city. As we entered the 21st century, with the development and integration of global information technology and industrialization, the internet of things and cloud computing came into being, and human society entered the big data era. This article covers the key features of geospatial information science in the big data era (ubiquity; multi-dimensionality and dynamics; internet+ networking; full automation and real time; from sensing to recognition; crowdsourcing and VGI; and service orientation) and addresses the key technical issues that need to be resolved (a non-linear four-dimensional Earth reference frame system; space-based enhanced GNSS; unified space-air-land network communication techniques; on-board processing techniques for multi-source image data; smart interface service techniques for space-borne information; space-based resource scheduling and network security; and the design and development of a payload-based multi-functional satellite platform) to provide a new definition of geospatial information science in the big data era. Based on the discussion in this paper, the author finally proposes a new definition of geospatial information science (geomatics): geomatics is a multidisciplinary science and technology which, using a systematic approach, integrates all the means for spatio-temporal data acquisition, information extraction, networked management, knowledge discovery, spatial sensing and recognition, and intelligent location-based services for any physical objects and human activities around the earth and its environment. Starting from this new definition, geospatial information science will find many more opportunities and tasks in the big data era for the generation of the smart earth and the smart city. Our profession

  2. Comparative Validity of Brief to Medium-Length Big Five and Big Six Personality Questionnaires

    Science.gov (United States)

    Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are…

  3. Comparative validity of brief to medium-length Big Five and Big Six personality questionnaires

    NARCIS (Netherlands)

    Thalmayer, A.G.; Saucier, G.; Eigenhuis, A.

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five

  4. Big Data en surveillance, deel 1 : Definities en discussies omtrent Big Data

    NARCIS (Netherlands)

    Timan, Tjerk

    2016-01-01

    Following a (fairly short) lecture on surveillance and Big Data, I was asked to go somewhat deeper into the theme, the definitions, and the various questions related to big data. In this first part I will try to set out the theory of Big Data and

  5. Big game hunting practices, meanings, motivations and constraints: a survey of Oregon big game hunters

    Science.gov (United States)

    Suresh K. Shrestha; Robert C. Burns

    2012-01-01

    We conducted a self-administered mail survey in September 2009 with randomly selected Oregon hunters who had purchased big game hunting licenses/tags for the 2008 hunting season. Survey questions explored hunting practices, the meanings of and motivations for big game hunting, the constraints to big game hunting participation, and the effects of age, years of hunting...

  6. Ultrasonography-guided peripheral intravenous access versus traditional approaches in patients with difficult intravenous access.

    Science.gov (United States)

    Costantino, Thomas G; Parikh, Aman K; Satz, Wayne A; Fojtik, John P

    2005-11-01

    We assess the success rate of emergency physicians in placing peripheral intravenous catheters in difficult-access patients who were unsuccessfully cannulated by emergency nurses. A technique using real-time ultrasonographic guidance by 2 physicians was compared with traditional approaches using palpation and landmark guidance. This was a prospective, systematically allocated study of all patients requiring intravenous access who presented to 2 university hospitals between October 2003 and March 2004. The inclusion criterion was the inability of any available nurse to obtain intravenous access after at least 3 attempts in a subgroup of patients who had a history of difficult intravenous access because of obesity, a history of intravenous drug abuse, or chronic medical problems. The exclusion criterion was the need for central venous access. Patients presenting on odd days were allocated to the ultrasonographically guided group, and those presenting on even days were allocated to the traditional-approach group. Endpoints were successful cannulation, number of sticks, time, and patient satisfaction. Sixty patients were enrolled, 39 on odd days and 21 on even days. The success rate was greater for the ultrasonographic group (97%) than for the control group (33%), a difference in proportions of 64% (95% confidence interval [CI] 39% to 71%). The ultrasonographic group required less overall time (13 minutes versus 30 minutes, for a difference of 17 [95% CI 0.8 to 25.6]), less time to successful cannulation from first percutaneous puncture (4 minutes versus 15 minutes, for a difference of 11 [95% CI 8.2 to 19.4]), and fewer percutaneous punctures (1.7 versus 3.7, for a difference of 2.0 [95% CI 1.27 to 2.82]) and had greater patient satisfaction (8.7 versus 5.7, for a difference of 3.0 [95% CI 1.82 to 4.29]) than the traditional landmark approach. Ultrasonography-guided peripheral intravenous access is more successful than traditional "blind" techniques, requires less time, decreases the number of
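The headline difference in success proportions can be reproduced, approximately, from the counts implied by the abstract (about 38 of 39 ultrasound successes versus 7 of 21 control successes). The sketch below uses a plain Wald normal approximation for the confidence interval, which need not match the method the authors used.

```python
import math

def diff_proportions_ci(x1, n1, x2, n2, z=1.96):
    """Difference of two proportions with a Wald 95% confidence interval."""
    p1, p2 = x1 / n1, x2 / n2
    d = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return d, d - z * se, d + z * se

# Counts reconstructed from the abstract: ~97% of 39 is 38; 33% of 21 is 7
d, lo, hi = diff_proportions_ci(38, 39, 7, 21)
print(f"difference = {d:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```

The point estimate matches the reported 64%; the Wald interval is wider than the one quoted in the abstract, a reminder that interval methods for proportions differ.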

  7. Intravenous Carbamazepine for Adults With Seizures.

    Science.gov (United States)

    Vickery, P Brittany; Tillery, Erika E; DeFalco, Alicia Potter

    2018-03-01

    To review the pharmacology, pharmacokinetics, efficacy, safety, dosage and administration, potential drug-drug interactions, and place in therapy of the intravenous (IV) formulation of carbamazepine (Carnexiv) for the treatment of seizures in adult patients. A comprehensive PubMed and EBSCOhost search (1945 to August 2017) was performed utilizing the keywords carbamazepine, Carnexiv, carbamazepine intravenous, IV carbamazepine, seizures, epilepsy, and seizure disorder. Additional data were obtained from literature review citations, manufacturer's product labeling, and Lundbeck website as well as Clinicaltrials.gov and governmental sources. All English-language trials evaluating IV carbamazepine were analyzed for this review. IV carbamazepine is FDA approved as temporary replacement therapy for treatment of adult seizures. Based on a phase I trial and pooled data from 2 open-label bioavailability studies comparing oral with IV dosing, there was no noted indication of loss of seizure control in patients switched to short-term replacement antiepileptic drug therapy with IV carbamazepine. The recommended dose of IV carbamazepine is 70% of the patient's oral dose, given every 6 hours via 30-minute infusions. The adverse effect profile of IV carbamazepine is similar to that of the oral formulation, with the exception of added infusion-site reactions. IV carbamazepine is a reasonable option for adults with generalized tonic-clonic or focal seizures, previously stabilized on oral carbamazepine, who are unable to tolerate oral medications for up to 7 days. Unknown acquisition cost and lack of availability in the United States limit its use currently.
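The dosing rule described above (70% of the total daily oral dose, divided into infusions every 6 hours) amounts to simple arithmetic. The sketch below uses a hypothetical 800 mg/day oral dose; it is an illustration of the conversion described in the abstract, not dosing guidance.

```python
def iv_carbamazepine_regimen(oral_daily_mg):
    """Convert a total daily oral carbamazepine dose to the IV regimen
    described above: 70% of the oral dose, split into four 6-hourly
    30-minute infusions. Illustration only, not clinical guidance."""
    iv_daily = 0.70 * oral_daily_mg
    per_infusion = iv_daily / 4  # four infusions per 24 hours
    return iv_daily, per_infusion

daily, per_dose = iv_carbamazepine_regimen(800)  # hypothetical 800 mg/day oral
print(f"{daily:.0f} mg/day IV, {per_dose:.0f} mg per 6-hourly infusion")
```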

  8. Flank pain: is Intravenous Urogram necessary?

    Science.gov (United States)

    Teh, H S; Lin, M B; Khoo, T K

    2001-09-01

    To determine the diagnostic yield of the Intravenous Urogram (IVU) and the value of the plain radiograph of the kidneys, ureters and bladder (KUB) and urinalysis as screening tests, with the objective of improving cost-effectiveness in the management of patients presenting with flank pain due to urinary lithiasis. All Intravenous Urogram (IVU) request forms and reports for the month of February 1998 were audited. The case notes, urinalysis results, and KUB and IVU films were traced and reviewed. There were 110 patients investigated: 61.8% (68) had a normal IVU and 38.2% (42) an abnormal IVU. The sensitivity and specificity of KUB alone were 79.4% and 90%. The sensitivity of urinalysis alone was 90.9% and its specificity 33.8%. The sensitivity of combined KUB and urinalysis was 100% and its specificity 26%, with a negative predictive value of 100%. All the patients with both negative KUB and negative urinalysis in our study were found to have a negative IVU. Our study shows that in patients with both negative KUB and negative urinalysis, the yield of IVU is very low, and the examination may not be necessary. This is important, as an IVU examination is not without risk. A combination of KUB with urinalysis and careful evaluation of clinical symptoms will improve the cost-effectiveness of patient management.
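The screening statistics quoted above follow from an ordinary 2×2 table. The sketch below uses hypothetical counts consistent with the combined KUB-plus-urinalysis result (all 42 abnormal IVUs flagged, so no false negatives; roughly 26% of the 68 normal IVUs correctly negative, about 18 patients).

```python
def screening_stats(tp, fn, tn, fp):
    """Sensitivity, specificity, and negative predictive value
    from the four cells of a 2x2 screening table."""
    sensitivity = tp / (tp + fn)   # flagged among truly abnormal
    specificity = tn / (tn + fp)   # negative among truly normal
    npv = tn / (tn + fn)           # truly normal among test-negatives
    return sensitivity, specificity, npv

# Hypothetical counts reconstructed from the abstract's percentages
sens, spec, npv = screening_stats(tp=42, fn=0, tn=18, fp=50)
print(f"sensitivity {sens:.0%}, specificity {spec:.0%}, NPV {npv:.0%}")
```

With zero false negatives, the NPV is 100% regardless of specificity, which is exactly why the authors argue a doubly negative screen can safely defer the IVU.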

  9. Intravenous dynamic nucleography of the brain

    International Nuclear Information System (INIS)

    Rosenthall, L.

    1972-01-01

    The advent of stationary imaging devices has created interest in studying cerebral blood flows and transits with diffusible and nondiffusible radioactive indicators. Much of this has disclosed interesting pathophysiology, but not necessarily of significant diagnostic import to include in routine patient workup. The conventional static brain scan is one of the more useful tests in the nuclear medicine armamentarium for uncovering and localizing intracranial disease. Unfortunately, it does not as a rule clearly distinguish cerebral vascular accidents, neoplasms, arteriovenous malformations, and so forth, which is important from the standpoint of patient management. Aside from clinical impressions a diagnosis is often based on the appearance of the radiocontrast angiogram, which is not always desirable because of the implicit hazards. Thus it is incumbent upon investigators to search for innocuous intravenous methods of identifying the various intracranial afflictions. Intravenous 99mTc-pertechnetate comparisons of brain hemisphere perfusion as a routine complement to static brain imaging are useful. Estimations of disparate radioactive transits are made qualitatively from serial 4 to 5 sec exposure scintiphotographs. (U.S.)

  10. Adverse reactions to iotroxate at intravenous cholangiography

    International Nuclear Information System (INIS)

    Nilsson, U.

    1987-01-01

    The number and type of adverse reactions to meglumine iotroxate at intravenous infusion cholangiography, performed one day prior to elective cholecystectomy, were recorded in a prospective investigation of 196 asymptomatic, anicteric patients. One hundred ml (50 mg I/ml) of contrast medium was infused over a period of 30 minutes. Only 2 minor (1%) and no severe or fatal reactions were noted. A review of the literature on the use of iotroxate in 2492 patients, including those in the present investigation, revealed a complication rate of 3.5% (3.0% minor, 0.3% moderate and 0.2% severe reactions) at infusion of iotroxate (5.0-8.0 g I) over a period of 30 to 120 minutes. This compared favourably with the 5% complication rate (4% minor, 0.5% moderate and 0.5% severe reactions) at infusion of iodoxamate and the 9% complication rate (5% minor, 1% moderate and 3% severe reactions) at infusion of ioglycamide. Irrespective of the contrast agent used, the frequency of adverse reactions at infusion was found to be 3 times lower than when equal amounts (5.0-5.6 g I) of the same medium were injected. It is concluded that, at present, infusion of iotroxate in an amount which approximates to the transportation maximum of the liver is the least toxic way of performing intravenous cholangiography with an optimum filling of the bile ducts. (orig.)
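The iodine load implied by the protocol above (100 ml at 50 mg I/ml infused over 30 minutes) is a one-line computation, and it lands at the low end of the 5.0-8.0 g I range cited from the literature review:

```python
volume_ml = 100          # infused volume of contrast medium
conc_mg_i_per_ml = 50    # iodine concentration
duration_min = 30        # infusion duration

total_iodine_g = volume_ml * conc_mg_i_per_ml / 1000
rate_ml_per_min = volume_ml / duration_min
print(f"{total_iodine_g:.1f} g iodine at {rate_ml_per_min:.1f} ml/min")
```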

  11. Panlobular emphysema in young intravenous Ritalin abusers

    International Nuclear Information System (INIS)

    Schmidt, R.A.; Glenny, R.W.; Godwin, J.D.; Hampson, N.B.; Cantino, M.E.; Reichenbach, D.D.

    1991-01-01

    We studied a distinctive group of young intravenous Ritalin abusers with profound obstructive lung disease. Clinically, they seemed to have severe emphysema, but the pathologic basis of their symptoms had not been investigated previously. Seven patients have died and been autopsied: in four, the lungs were fixed, inflated, dried, and examined in detail radiologically, grossly, microscopically, and by electron probe X-ray microanalysis. All seven patients had severe panlobular (panacinar) emphysema that tended to be more severe in the lower lung zones and that was associated with microscopic talc granulomas. Vascular involvement by talc granulomas was variable, but significant interstitial fibrosis was not present. Five patients were tested for alpha-1-antitrypsin deficiency and found to be normal, as were six similar living patients. These findings indicate that some intravenous drug abusers develop emphysema that clinically, radiologically, and pathologically resembles that caused by alpha-1-antitrypsin deficiency but which must have a different pathogenesis. Talc from the Ritalin tablets may be important, but the mechanism remains to be elucidated

  12. Intravenous immunoglobulin therapy and systemic lupus erythematosus.

    Science.gov (United States)

    Zandman-Goddard, Gisele; Levy, Yair; Shoenfeld, Yehuda

    2005-12-01

    Systemic lupus erythematosus (SLE) is a multisystem autoimmune disease with diverse manifestations. We suggest that intravenous immunoglobulin (IVIg) therapy may be beneficial and safe for various manifestations in SLE. A structured literature search of articles published on the efficacy of IVIg in the treatment of SLE between 1983 and 2005 was conducted. We searched the terms "IVIg," "intravenous immunoglobulin," "lupus," "SLE," and "systemic lupus erythematosus." The various clinical manifestations of SLE that were reported to be successfully treated by IVIg in case reports include autoimmune hemolytic anemia, acquired factor VIII inhibitors, acquired von Willebrand disease, pure red cell aplasia, thrombocytopenia, pancytopenia, myelofibrosis, pneumonitis, pleural effusion, pericarditis, myocarditis, cardiogenic shock, nephritis, end-stage renal disease, encephalitis, neuropsychiatric lupus, psychosis, peripheral neuropathy, polyradiculoneuropathy, and vasculitis. The most extensive experience is with lupus nephritis. There are only a few case series of IVIg use in patients with SLE with various manifestations, in which the response rate to IVIg therapy ranged from 33% to 100%. We suggest that IVIg devoid of sucrose, at a dose of 2 g/kg given over a 5-d period at a uniform, slow infusion rate in patients without an increased risk for thromboembolic events or renal failure, is a safe and beneficial adjunct therapy for patients with SLE who are resistant to, or decline, conventional treatment. The duration of therapy is yet to be established. Controlled trials are warranted.
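    The dosing regimen above (2 g/kg over a 5-d period) implies a simple arithmetic split into daily fractions. A minimal sketch of that arithmetic, for a hypothetical 70 kg patient (the weight and function name are illustrative, not from the source, and this is not clinical guidance):

```python
# Hypothetical illustration of the dosing arithmetic described in the abstract:
# a 2 g/kg total course divided evenly over 5 days. Not clinical guidance.

def ivig_course(weight_kg: float, dose_g_per_kg: float = 2.0, days: int = 5):
    """Return (total grams for the course, grams per day)."""
    total_g = dose_g_per_kg * weight_kg
    return total_g, total_g / days

# For a hypothetical 70 kg patient:
total, per_day = ivig_course(70)
print(total, per_day)  # 140.0 28.0
```

    The per-day figure is only the even division of the course; the abstract's point is the slow, uniform infusion rate, which this arithmetic does not capture.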

  13. An analysis of cross-sectional differences in big and non-big public accounting firms' audit programs

    NARCIS (Netherlands)

    Blokdijk, J.H. (Hans); Drieenhuizen, F.; Stein, M.T.; Simunic, D.A.

    2006-01-01

    A significant body of prior research has shown that audits by the Big 5 (now Big 4) public accounting firms are quality differentiated relative to non-Big 5 audits. This result can be derived analytically by assuming that Big 5 and non-Big 5 firms face different loss functions for "audit failures"…

  14. Big Challenges and Big Opportunities: The Power of "Big Ideas" to Change Curriculum and the Culture of Teacher Planning

    Science.gov (United States)

    Hurst, Chris

    2014-01-01

    Mathematical knowledge of pre-service teachers is currently "under the microscope" and the subject of research. This paper proposes a different approach to teacher content knowledge based on the "big ideas" of mathematics and the connections that exist within and between them. It is suggested that these "big ideas"…

  15. Comparison of the Effectiveness of a Virtual Simulator With a Plastic Arm Model in Teaching Intravenous Catheter Insertion Skills.

    Science.gov (United States)

    Günay İsmailoğlu, Elif; Zaybak, Ayten

    2018-02-01

    The objective of this study was to compare the effectiveness of a virtual intravenous simulator with a plastic arm model in teaching intravenous catheter insertion skills to nursing students. We used a randomized controlled quasi-experimental trial design and recruited 65 students who were assigned to the experimental (n = 33) and control (n = 32) groups using the simple random sampling method. The experimental group received intravenous catheterization skills training on the virtual intravenous simulator, and the control group received the same training on a plastic model of a human arm. Data were collected using the personal information form, intravenous catheterization knowledge assessment form, Intravenous Catheterization Skill Test, Self-Confidence and Satisfaction Scale, and Fear Symptoms Scale. In the study, the mean scores in the control group were 20.44 for psychomotor skills, 15.62 for clinical psychomotor skills, 31.78 for self-confidence, and 21.77 for satisfaction. The mean scores in the experimental group were 45.18 for psychomotor skills, 16.28 for clinical psychomotor skills, 34.18 for self-confidence, and 43.89 for satisfaction. The results indicated that psychomotor skills and satisfaction scores were higher in the experimental group, while the clinical psychomotor skills and self-confidence scores were similar in both groups. More students in the control group reported experiencing symptoms such as cold and sweaty hands, significant restlessness, and tense muscles than those in the experimental group.

  16. Big data in oncologic imaging.

    Science.gov (United States)

    Regge, Daniele; Mazzetti, Simone; Giannini, Valentina; Bracco, Christian; Stasi, Michele

    2017-06-01

    Cancer is a complex disease and unfortunately understanding how the components of the cancer system work does not help understand the behavior of the system as a whole. In the words of the Greek philosopher Aristotle, "the whole is greater than the sum of its parts." To date, thanks to improved information technology infrastructures, it is possible to store data from each single cancer patient, including clinical data, medical images, laboratory tests, and pathological and genomic information. Indeed, medical archive storage constitutes approximately one-third of total global storage demand and a large part of the data are in the form of medical images. The opportunity now is to draw insight from the whole to the benefit of each individual patient. In the oncologic patient, big data analysis is at the beginning, but several useful applications can be envisaged, including development of imaging biomarkers to predict disease outcome, assessing the risk of X-ray dose exposure or of renal damage following the administration of contrast agents, and tracking and optimizing patient workflow. The aim of this review is to present current evidence of how big data derived from medical images may impact on the diagnostic pathway of the oncologic patient.

  17. Baryon symmetric big bang cosmology

    Science.gov (United States)

    Stecker, F. W.

    1978-01-01

    Both the quantum theory and Einstein's theory of special relativity lead to the supposition that matter and antimatter were produced in equal quantities during the big bang. It is noted that local matter/antimatter asymmetries may be reconciled with universal symmetry by assuming (1) a slight imbalance of matter over antimatter in the early universe, annihilation, and a subsequent remainder of matter; (2) localized regions of excess for one or the other type of matter as an initial condition; and (3) an extremely dense, high-temperature state with zero net baryon number; i.e., matter/antimatter symmetry. Attention is given to the third assumption, which is the simplest and the most in keeping with current knowledge of the cosmos, especially as pertains to the universality of the 3 K background radiation. Mechanisms of galaxy formation are discussed, whereby matter and antimatter might have collided and annihilated each other, or have coexisted (and continue to coexist) at vast distances. It is pointed out that baryon symmetric big bang cosmology could probably be proved if an antinucleus could be detected in cosmic radiation.

  18. Georges et le big bang

    CERN Document Server

    Hawking, Lucy; Parsons, Gary

    2011-01-01

    Georges and Annie, his best friend, are about to witness one of the most important scientific experiments of all time: exploring the first moments of the Universe, the Big Bang! Thanks to Cosmos, their supercomputer, and to the Large Hadron Collider built by Éric, Annie's father, they will finally be able to answer the essential question: why do we exist? But Georges and Annie discover that a diabolical plot is brewing. Worse, all of scientific research is in peril! Swept up in incredible adventures, Georges will travel to the far reaches of the galaxy to save his friends... A thrilling plunge into the heart of the Big Bang, with the latest theories of Stephen Hawking and of today's greatest scientists.

  19. The BigBOSS spectrograph

    Science.gov (United States)

    Jelinsky, Patrick; Bebek, Chris; Besuner, Robert; Carton, Pierre-Henri; Edelstein, Jerry; Lampton, Michael; Levi, Michael E.; Poppett, Claire; Prieto, Eric; Schlegel, David; Sholl, Michael

    2012-09-01

    BigBOSS is a proposed ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a 14,000 square degree galaxy and quasi-stellar object redshift survey. It consists of a 5,000-fiber-positioner focal plane feeding the spectrographs. The optical fibers are separated into ten 500-fiber slit heads at the entrance of ten identical spectrographs in a thermally insulated room. Each of the ten spectrographs has a spectral resolution (λ/Δλ) between 1500 and 4000 over a wavelength range from 360 to 980 nm. Each spectrograph uses two dichroic beam splitters to separate the spectrograph into three arms. It uses volume phase holographic (VPH) gratings for high efficiency and compactness. Each arm uses a 4096x4096 15 μm pixel charge coupled device (CCD) for the detector. We describe the requirements and current design of the BigBOSS spectrograph. Design trades (e.g. refractive versus reflective) and manufacturability are also discussed.
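    The quoted resolution R = λ/Δλ of 1500-4000 fixes the smallest resolvable wavelength element at any given wavelength. A small sketch of that relation (the pairing of specific R values with the band edges below is an assumption for illustration; the actual R varies across the three arms):

```python
# Resolving power R = lambda / delta_lambda, so the smallest resolvable
# wavelength element is delta_lambda = lambda / R.

def delta_lambda(wavelength_nm: float, resolving_power: float) -> float:
    """Smallest resolvable wavelength element, in the same units as the input."""
    return wavelength_nm / resolving_power

# Illustrative pairings only (assumed, not from the paper):
print(delta_lambda(360, 1500))   # 0.24 nm at the blue end if R = 1500 there
print(delta_lambda(980, 4000))   # 0.245 nm at the red end if R = 4000 there
```

    Note that a roughly constant Δλ across the band is exactly what a rising R from blue to red produces.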

  20. Effects of intravenous diclofenac on postoperative sore throat in ...

    African Journals Online (AJOL)

    Effects of intravenous diclofenac on postoperative sore throat in patients undergoing laparoscopic surgery at Aga Khan University Hospital, Nairobi: A prospective, randomized, double blind controlled trial.

  1. Intravenous methylprednisolone pulse therapy for children with epileptic encephalopathy

    OpenAIRE

    Pera, Maria Carmela; Randazzo, Giovanna; Masnada, Silvia; Dontin, Serena Donetti; De Giorgis, Valentina; Balottin, Umberto; Veggiotti, Pierangelo

    2015-01-01

    The aim of this retrospective study of children affected by epileptic encephalopathy was to evaluate seizure frequency, electroencephalographic pattern and neuropsychological status, before and after intravenous methylprednisolone therapy.

  2. [Big data, medical language and biomedical terminology systems].

    Science.gov (United States)

    Schulz, Stefan; López-García, Pablo

    2015-08-01

    A variety of rich terminology systems, such as thesauri, classifications, nomenclatures and ontologies support information and knowledge processing in health care and biomedical research. Nevertheless, human language, manifested as individually written texts, persists as the primary carrier of information, in the description of disease courses or treatment episodes in electronic medical records, and in the description of biomedical research in scientific publications. In the context of the discussion about big data in biomedicine, we hypothesize that the abstraction of the individuality of natural language utterances into structured and semantically normalized information facilitates the use of statistical data analytics to distil new knowledge out of textual data from biomedical research and clinical routine. Computerized human language technologies are constantly evolving and are increasingly ready to annotate narratives with codes from biomedical terminology. However, this depends heavily on linguistic and terminological resources. The creation and maintenance of such resources is labor-intensive. Nevertheless, it is sensible to assume that big data methods can be used to support this process. Examples include the learning of hierarchical relationships, the grouping of synonymous terms into concepts and the disambiguation of homonyms. Although clear evidence is still lacking, the combination of natural language technologies, semantic resources, and big data analytics is promising.
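    The annotation of narratives with terminology codes described above can be pictured, at its simplest, as dictionary lookup that groups synonymous terms under one concept identifier. A toy sketch with hypothetical terms and concept codes (real systems use large curated resources such as SNOMED CT, not a hand-written dictionary):

```python
# Toy illustration of grouping synonymous terms under one concept code.
# Terms and codes below are hypothetical, not from any real terminology.

SYNONYMS = {
    "heart attack": "C-001",
    "myocardial infarction": "C-001",   # same concept as "heart attack"
    "high blood pressure": "C-002",
    "hypertension": "C-002",
}

def annotate(text: str) -> set:
    """Return the set of concept codes whose terms occur in the text."""
    lowered = text.lower()
    return {code for term, code in SYNONYMS.items() if term in lowered}

codes = annotate("Patient with hypertension and prior myocardial infarction.")
print(sorted(codes))  # ['C-001', 'C-002']
```

    Naive substring matching like this is precisely what makes homonym disambiguation hard, which is why the abstract argues for statistical methods on top of such resources.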

  3. Big Data and the brave new world of social media research

    Directory of Open Access Journals (Sweden)

    Ralph Schroeder

    2014-12-01

    Full Text Available The recent Facebook study about emotional contagion has generated a high-profile debate about the ethical and social issues in Big Data research. These issues are not unprecedented, but the debate highlighted that, in focusing on research ethics and the legal issues about this type of research, an important larger picture is overlooked about the extent to which free will is compatible with the growth of deterministic scientific knowledge, and how Big Data research has become central to this growth of knowledge. After discussing the ‘emotional contagion study’ as an illustration, these larger issues about Big Data and scientific knowledge are addressed by providing definitions of data, Big Data and of how scientific knowledge changes the human-made environment. Against this background, it will be possible to examine why the uses of data-driven analyses of human behaviour in particular have recently experienced rapid growth. The essay then goes on to discuss the distinction between basic scientific research as against applied research, a distinction which, it is argued, is necessary to understand the quite different implications in the context of scientific as opposed to applied research. Further, it is important to recognize that Big Data analyses are both enabled and constrained by the nature of data sources available. Big Data research is bound to become more widespread, and this will require more awareness on the part of data scientists, policymakers and a wider public about its contexts and often unintended consequences.

  4. A little big history of Tiananmen

    NARCIS (Netherlands)

    Quaedackers, E.; Grinin, L.E.; Korotayev, A.V.; Rodrigue, B.H.

    2011-01-01

    This contribution aims at demonstrating the usefulness of studying small-scale subjects such as Tiananmen, or the Gate of Heavenly Peace, in Beijing - from a Big History perspective. By studying such a 'little big history' of Tiananmen, previously overlooked yet fundamental explanations for why…

  5. Probing the pre-big bang universe

    International Nuclear Information System (INIS)

    Veneziano, G.

    2000-01-01

    Superstring theory suggests a new cosmology whereby a long inflationary phase preceded a non-singular big bang-like event. After discussing how pre-big bang inflation naturally arises from an almost trivial initial state of the Universe, I will describe how present or near-future experiments can provide sensitive probes of how the Universe behaved in the pre-bang era.

  6. An embedding for the big bang

    Science.gov (United States)

    Wesson, Paul S.

    1994-01-01

    A cosmological model is given that has good physical properties for the early and late universe but is a hypersurface in a flat five-dimensional manifold. The big bang can therefore be regarded as an effect of a choice of coordinates in a truncated higher-dimensional geometry. Thus the big bang is in some sense a geometrical illusion.

  7. Exploiting big data for critical care research.

    Science.gov (United States)

    Docherty, Annemarie B; Lone, Nazir I

    2015-10-01

    Over recent years the digitalization, collection and storage of vast quantities of data, in combination with advances in data science, have opened up a new era of big data. In this review, we define big data, identify examples of critical care research using big data, discuss the limitations and ethical concerns of using these large datasets and finally consider scope for future research. Big data refers to datasets whose size, complexity and dynamic nature are beyond the scope of traditional data collection and analysis methods. The potential benefits to critical care are significant, with faster progress in improving health and better value for money. Although not replacing clinical trials, big data can improve their design and advance the field of precision medicine. However, there are limitations to analysing big data using observational methods. In addition, there are ethical concerns regarding maintaining confidentiality of patients who contribute to these datasets. Big data have the potential to improve medical care and reduce costs, both by individualizing medicine, and bringing together multiple sources of data about individual patients. As big data become increasingly mainstream, it will be important to maintain public confidence by safeguarding data security, governance and confidentiality.

  8. Practice Variation in Big-4 Transparency Reports

    DEFF Research Database (Denmark)

    Girdhar, Sakshi; Klarskov Jeppesen, Kim

    2018-01-01

    Purpose: The purpose of this paper is to examine the transparency reports published by the Big-4 public accounting firms in the UK, Germany and Denmark to understand the determinants of their content within the networks of big accounting firms. Design/methodology/approach: The study draws...

  9. Big Science and Long-tail Science

    CERN Document Server

    2008-01-01

    Jim Downing and I were privileged to be the guests of Salvatore Mele at CERN yesterday and to see the ATLAS detector of the Large Hadron Collider. This is a wow experience - although I knew it was big, I hadn't realised how big.

  10. Big system: Interactive graphics for the engineer

    Science.gov (United States)

    Quenneville, C. E.

    1975-01-01

    The BCS Interactive Graphics System (BIG System) approach to graphics was presented, along with several significant engineering applications. The BIG System precompiler, the graphics support library, and the function requirements of graphics applications are discussed. It was concluded that graphics standardization and device-independent code can be developed to assure maximum graphics terminal transferability.

  11. A New Look at Big History

    Science.gov (United States)

    Hawkey, Kate

    2014-01-01

    The article sets out a "big history" which resonates with the priorities of our own time. A globalizing world calls for new spatial scales to underpin what the history curriculum addresses, "big history" calls for new temporal scales, while concern over climate change calls for a new look at subject boundaries. The article…

  12. Big data analysis for smart farming

    NARCIS (Netherlands)

    Kempenaar, C.; Lokhorst, C.; Bleumer, E.J.B.; Veerkamp, R.F.; Been, Th.; Evert, van F.K.; Boogaardt, M.J.; Ge, L.; Wolfert, J.; Verdouw, C.N.; Bekkum, van Michael; Feldbrugge, L.; Verhoosel, Jack P.C.; Waaij, B.D.; Persie, van M.; Noorbergen, H.

    2016-01-01

    In this report we describe results of a one-year TO2 institutes project on the development of big data technologies within the milk production chain. The goal of this project is to ‘create’ an integration platform for big data analysis for smart farming and to develop a show case. This includes both…

  13. The ethics of Big data: analytical survey

    OpenAIRE

    GIBER L.; KAZANTSEV N.

    2015-01-01

    The number of recent publications on the matter of ethical challenges of the implementation of Big Data has signified the growing interest to all the aspects of this issue. The proposed study specifically aims at analyzing ethical issues connected with Big Data.

  14. Big Data Analytics and Its Applications

    Directory of Open Access Journals (Sweden)

    Mashooque A. Memon

    2017-10-01

    The term Big Data has been coined to refer to the extensive volumes of data that cannot be managed by traditional data-handling methods or techniques. The field of Big Data plays an indispensable role in many areas, such as agriculture, banking, data mining, education, chemistry, finance, cloud computing, marketing, health care and stocks. Big data analytics is the method of examining big data to reveal hidden patterns, unseen relationships and other important information that can be utilized to make better decisions. There has been perpetually expanding interest in big data because of its fast growth and because it covers many areas of application. The open-source Apache Hadoop technology, written in Java and running on the Linux operating system, was used. The primary contribution of this work is to present an effective and free solution for big data applications in a distributed environment, with its advantages, and to show its ease of use. An analytical review of new developments in big data technology will subsequently be required. Healthcare is one of the world's foremost concerns. Big data in healthcare refers to electronic health data sets related to patient health and well-being. Data in the healthcare sector is growing beyond the management capacity of healthcare organizations and is expected to increase substantially in the coming years.
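    The MapReduce pattern underlying Hadoop can be sketched in a few lines. This is a plain-Python illustration of the map, shuffle, and reduce phases only; Hadoop itself is a Java framework with a very different API, and in practice the phases run distributed across a cluster:

```python
# A minimal word-count sketch of the MapReduce idea behind Hadoop,
# in plain Python (illustrative only; Hadoop is a Java framework).
from collections import defaultdict

def map_phase(records):
    # Emit (key, 1) for every word in every input record.
    for record in records:
        for word in record.split():
            yield word.lower(), 1

def shuffle(pairs):
    # Group values by key, as the framework does between map and reduce.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Sum the grouped counts for each key.
    return {key: sum(values) for key, values in groups.items()}

docs = ["big data big analytics", "data mining"]
counts = reduce_phase(shuffle(map_phase(docs)))
print(counts)  # {'big': 2, 'data': 2, 'analytics': 1, 'mining': 1}
```

    The appeal of the pattern is that map and reduce are stateless per key, so the framework can parallelize them across machines without the user writing any distribution logic.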

  15. Practice variation in Big-4 transparency reports

    NARCIS (Netherlands)

    Girdhar, Sakshi; Jeppesen, K.K.

    2018-01-01

    Purpose The purpose of this paper is to examine the transparency reports published by the Big-4 public accounting firms in the UK, Germany and Denmark to understand the determinants of their content within the networks of big accounting firms. Design/methodology/approach The study draws on a…

  16. Total knee arthroplasty in patients with a history of illicit intravenous drug abuse.

    Science.gov (United States)

    Bauer, David E; Hingsammer, Andreas; Ernstbrunner, Lukas; Aichmair, Alexander; Rosskopf, Andrea B; Eckers, Franziska; Wieser, Karl; Fucentese, Sandro F

    2018-01-01

    Injection drug users are at high risk both for infection with blood-borne pathogens, namely human immunodeficiency virus (HIV), hepatitis B and C viruses, and various bacterial infections, and for early primary and secondary joint degeneration. When total knee arthroplasty (TKA) is anticipated, the risk of septic complications is a major concern. The purpose of this study was to assess the clinical and radiographic outcome of patients with a history of intravenous drug use after total knee arthroplasty. The primary outcome was revision rate. Secondary outcomes were the Western Ontario and McMaster Universities Arthritis Index (WOMAC), Knee Society Score (KSS) and radiographic loosening. We retrospectively reviewed the records of 1,692 TKAs performed or revised in our institution. Data on 18 TKAs in 12 patients (11 male, 1 female; average age 42, range 23-62 years) with a history of intravenous opioid abuse were available for final analysis. The mean follow-up was 125 (range 25-238) months. Seven patients required revision surgery due to periprosthetic joint infection after 62 (range 5-159) months: one two-stage revision, three arthrodeses and three amputations. The median prosthesis survival was 101 (95%-CI 48-154) months. Total knee arthroplasty in patients with a history of intravenous drug abuse is associated with major complications, including above-the-knee amputation. If permanent abstinence from intravenous drug abuse is doubtful, other therapeutic options, including primary arthrodesis, should be considered.
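    The reported median prosthesis survival with a confidence interval is the kind of figure typically produced by a Kaplan-Meier analysis. A minimal sketch of a KM median on hypothetical follow-up data (not the study's patient data; `km_median` is an illustrative helper, not a validated statistical routine):

```python
# A minimal Kaplan-Meier median sketch on hypothetical follow-up data
# (illustrative only; these are not the study's patients).
# event = 1 means the endpoint (revision) occurred; 0 means censored.

def km_median(times_months, events):
    """Return the first time at which the KM survival curve falls to <= 0.5."""
    data = sorted(zip(times_months, events))
    n = len(data)
    survival = 1.0
    i = 0
    while i < n:
        t = data[i][0]
        d = sum(1 for tt, e in data if tt == t and e == 1)  # events at time t
        m = sum(1 for tt, _ in data if tt == t)             # subjects leaving at t
        if d:
            survival *= 1 - d / (n - i)                     # n - i still at risk
            if survival <= 0.5:
                return t
        i += m
    return None  # curve never reaches 0.5; median not estimable

# Hypothetical cohort: revisions at 10, 20 and 30 months; one patient
# censored at 25 months, one still event-free at 40 months.
print(km_median([10, 20, 25, 30, 40], [1, 1, 0, 1, 1]))  # 30
```

    Censored patients contribute to the at-risk denominator until they drop out but never drop the curve themselves, which is why median survival can exceed the median raw follow-up time.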

  17. The ethics of biomedical big data

    CERN Document Server

    Mittelstadt, Brent Daniel

    2016-01-01

    This book presents cutting edge research on the new ethical challenges posed by biomedical Big Data technologies and practices. ‘Biomedical Big Data’ refers to the analysis of aggregated, very large datasets to improve medical knowledge and clinical care. The book describes the ethical problems posed by aggregation of biomedical datasets and re-use/re-purposing of data, in areas such as privacy, consent, professionalism, power relationships, and ethical governance of Big Data platforms. Approaches and methods are discussed that can be used to address these problems to achieve the appropriate balance between the social goods of biomedical Big Data research and the safety and privacy of individuals. Seventeen original contributions analyse the ethical, social and related policy implications of the analysis and curation of biomedical Big Data, written by leading experts in the areas of biomedical research, medical and technology ethics, privacy, governance and data protection. The book advances our understan...

  18. Big data analytics methods and applications

    CERN Document Server

    Rao, BLS; Rao, SB

    2016-01-01

    This book has a collection of articles written by Big Data experts to describe some of the cutting-edge methods and applications from their respective areas of interest, and provides the reader with a detailed overview of the field of Big Data Analytics as it is practiced today. The chapters cover technical aspects of key areas that generate and use Big Data such as management and finance; medicine and healthcare; genome, cytome and microbiome; graphs and networks; Internet of Things; Big Data standards; benchmarking of systems; and others. In addition to different applications, key algorithmic approaches such as graph partitioning, clustering and finite mixture modelling of high-dimensional data are also covered. The varied collection of themes in this volume introduces the reader to the richness of the emerging field of Big Data Analytics.

  19. Ethics and Epistemology in Big Data Research.

    Science.gov (United States)

    Lipworth, Wendy; Mason, Paul H; Kerridge, Ian; Ioannidis, John P A

    2017-12-01

    Biomedical innovation and translation are increasingly emphasizing research using "big data." The hope is that big data methods will both speed up research and make its results more applicable to "real-world" patients and health services. While big data research has been embraced by scientists, politicians, industry, and the public, numerous ethical, organizational, and technical/methodological concerns have also been raised. With respect to technical and methodological concerns, there is a view that these will be resolved through sophisticated information technologies, predictive algorithms, and data analysis techniques. While such advances will likely go some way towards resolving technical and methodological issues, we believe that the epistemological issues raised by big data research have important ethical implications and raise questions about the very possibility of big data research achieving its goals.

  20. 2nd INNS Conference on Big Data

    CERN Document Server

    Manolopoulos, Yannis; Iliadis, Lazaros; Roy, Asim; Vellasco, Marley

    2017-01-01

    The book offers a timely snapshot of neural network technologies as a significant component of big data analytics platforms. It promotes new advances and research directions in efficient and innovative algorithmic approaches to analyzing big data (e.g. deep networks, nature-inspired and brain-inspired algorithms); implementations on different computing platforms (e.g. neuromorphic, graphics processing units (GPUs), clouds, clusters); and big data analytics applications to solve real-world problems (e.g. weather prediction, transportation, energy management). The book, which reports on the second edition of the INNS Conference on Big Data, held on October 23–25, 2016, in Thessaloniki, Greece, depicts an interesting collaborative adventure of neural networks with big data and other learning technologies.

  1. The big de Rham–Witt complex

    DEFF Research Database (Denmark)

    Hesselholt, Lars

    2015-01-01

    This paper gives a new and direct construction of the multi-prime big de Rham–Witt complex, which is defined for every commutative and unital ring; the original construction by Madsen and myself relied on the adjoint functor theorem and accordingly was very indirect. The construction given here … by the universal derivation of the underlying ring together with an additional structure depending directly on the λ-ring structure in question. In the case of the ring of big Witt vectors, this additional structure gives rise to divided Frobenius operators on the module of Kähler differentials. … It is the existence of these divided Frobenius operators that makes the new construction of the big de Rham–Witt complex possible. It is further shown that the big de Rham–Witt complex behaves well with respect to étale maps, and finally, the big de Rham–Witt complex of the ring of integers is explicitly evaluated.

  2. The Big bang and the Quantum

    Science.gov (United States)

    Ashtekar, Abhay

    2010-06-01

    General relativity predicts that space-time comes to an end and physics comes to a halt at the big bang. Recent developments in loop quantum cosmology have shown that these predictions cannot be trusted. Quantum geometry effects can resolve singularities, thereby opening new vistas. Examples are: the big bang is replaced by a quantum bounce; the 'horizon problem' disappears; immediately after the big bounce, there is a super-inflationary phase with its own phenomenological ramifications; and, in the presence of a standard inflation potential, initial conditions are naturally set for a long, slow-roll inflation independently of what happens in the pre-big bang branch. As in my talk at the conference, I will first discuss the foundational issues and then the implications of the new Planck scale physics near the Big Bang.

  3. Epidemiology in the Era of Big Data

    Science.gov (United States)

    Mooney, Stephen J; Westreich, Daniel J; El-Sayed, Abdulrahman M

    2015-01-01

    Big Data has increasingly been promoted as a revolutionary development in the future of science, including epidemiology. However, the definition and implications of Big Data for epidemiology remain unclear. We here provide a working definition of Big Data predicated on the so-called ‘3 Vs’: variety, volume, and velocity. From this definition, we argue that Big Data has evolutionary and revolutionary implications for identifying and intervening on the determinants of population health. We suggest that as more sources of diverse data become publicly available, the ability to combine and refine these data to yield valid answers to epidemiologic questions will be invaluable. We conclude that, while epidemiology as practiced today will continue to be practiced in the Big Data future, a component of our field’s future value lies in integrating subject matter knowledge with increased technical savvy. Our training programs and our visions for future public health interventions should reflect this future. PMID:25756221

  4. Anaphylaxis after intravenous infusion of dexketoprofen trometamol

    Directory of Open Access Journals (Sweden)

    Sertac Guler

    2016-09-01

    Dexketoprofen trometamol (DT), a nonsteroidal anti-inflammatory drug, is a highly water-soluble salt and the active enantiomer of rac-ketoprofen. Its parenteral form is commonly used for acute pain management in emergency departments of our country. Side effects such as diarrhea, indigestion, nausea, stomach pain, and vomiting may be seen after the use of DT. Anaphylactic shock (AS) secondary to infusion of DT is very rare and, to our knowledge, this is the first case report describing this side effect. This case report is presented to emphasize that AS may be seen after the use of DT. Keywords: Anaphylactic shock, Dexketoprofen trometamol, Intravenous infusion (MeSH database)

  5. Intravenous urography in children and youth

    Energy Technology Data Exchange (ETDEWEB)

    Pedersen, H.K.; Gudmundsen, T.E.; Oestensen, H.; Pape, J.F.

    1987-10-01

    This report derives from Tromsoe in northern Norway. In a retrospective study of the indications for intravenous urography (IU) and the findings at IU in 740 patients (451 girls and 289 boys) aged 0-19 years, we found that urinary tract infections accounted for 69.4% of the IUs in females and 30.1% of the IUs in males, most often in the youngest patients. The pathological findings most frequently seen were anomalies (17 females and 10 males) and urinary tract obstruction (3 females and 15 males). The present study indicates the following: first, that the yield of IU in the primary investigation of children and youth suffering from enuresis and non-specific abdominal disturbances is small; and second, that the use of IU in children and youth with urinary tract infection and haematuria should be questioned and reconsidered.

  6. Renal trauma and the intravenous urogram.

    Science.gov (United States)

    Oakland, C D; Britton, J M; Charlton, C A

    1987-01-01

A retrospective analysis of all patients with blunt abdominal trauma associated with haematuria admitted to one hospital (Royal United, Bath) in a 10-year period was conducted to establish the contribution of the intravenous urogram (IVU) in their management. Eighty-one case records were analysed. Of 35 IVUs performed in patients with microscopic (reagent-strip positive) haematuria, only one was abnormal. In contrast, 27 IVUs performed in patients with macroscopic (naked eye) haematuria revealed 17 major injuries and 5 previously unrecognized congenital abnormalities. It is concluded that an IVU is an unnecessary and non-contributory investigation in patients with microscopic haematuria, and guidelines are suggested for the role of IVU in patients with blunt abdominal trauma associated with haematuria. PMID:3560121

  7. Retrocaval ureter: the importance of intravenous urography.

    Science.gov (United States)

    Hassan, Radhiana; Aziz, Azian Abd; Mohamed, Siti Kamariah Che

    2011-10-01

Retrocaval ureter is a rare cause of hydronephrosis. Its rarity and non-specific presentation pose a challenge to surgeons and radiologists in making the correct diagnosis. Differentiation from other causes of urinary tract obstruction, especially the more common urolithiasis, is important for successful surgical management. Current practice has seen multislice computed tomography (MSCT) rapidly replacing intravenous urography (IVU) in the assessment of patients with hydronephrosis due to suspected urolithiasis, especially ureterolithiasis. However, MSCT without adequate opacification of the entire ureter may allow the physician to overlook a retrocaval ureter as the cause of hydronephrosis. High-resolution IVU images can demonstrate the typical appearance that leads to the accurate diagnosis of a retrocaval ureter. We report a case that illustrates this scenario and highlights the importance of IVU in the assessment of a complex congenital disorder involving the urinary tract.

  8. Intravenous immunoglobulin, pharmacogenomics, and Kawasaki disease.

    Science.gov (United States)

    Kuo, Ho-Chang; Hsu, Yu-Wen; Wu, Mei-Shin; Chien, Shu-Chen; Liu, Shih-Feng; Chang, Wei-Chiao

    2016-02-01

    Kawasaki disease (KD) is a systemic vasculitis of unknown etiology and it is therefore worth examining the multifactorial interaction of genes and environmental factors. Targeted genetic association and genome-wide association studies have helped to provide a better understanding of KD from infection to the immune-related response. Findings in the past decade have contributed to a major breakthrough in the genetics of KD, with the identification of several genomic regions linked to the pathogenesis of KD, including ITPKC, CD40, BLK, and FCGR2A. This review focuses on the factors associated with the genetic polymorphisms of KD and the pharmacogenomics of the response to treatment in patients with intravenous immunoglobulin resistance. Copyright © 2014. Published by Elsevier B.V.

  9. [Use of intravenous immunoglobulins in pediatrics].

    Science.gov (United States)

    Duse, M; Plebani, A; Crispino, P; Ugazio, A G

    1991-01-01

Intramuscular immunoglobulin (IMIG) has been used for 40 years in substitution therapy for antibody deficiencies and as prophylaxis for and treatment of several infectious diseases. Modified and intact intravenous immunoglobulin preparations (IVIG) have now been available for more than 10 years: only the intact product expresses full Fc-mediated functions with the biological half-life of IgG (3-4 weeks). These preparations have constituted an important achievement in the treatment of humoral immunodeficiencies, also resulting in a dramatic improvement of the prognosis. The use of IVIG has also modified the therapeutic approach to several secondary and acquired immunodeficiencies. Treatment with IVIG for immune modulation in several diseases is under investigation: substantial data indicate a useful role in selected cases of idiopathic thrombocytopenic purpura, Kawasaki disease and in some neurologic diseases. IVIG preparations are substantially safe, and severe side effects have been rarely reported.

  10. Intravenous urography in children and youth

    International Nuclear Information System (INIS)

    Pedersen, H.K.; Gudmundsen, T.E.; Oestensen, H.; Pape, J.F.

    1987-01-01

    This report derives from Tromsoe in northern Norway. In a retrospective study of the indications for intravenous urography (IU) and the findings at IU in 740 patients (451 girls and 289 boys) aged 0-19 years, we found that urinary tract infections accounted for 69.4% of the IU in females and 30.1% of the IU in males, most often seen in the youngest patients. The pathological findings most frequently seen were anomalies (17 females and 10 males) and urinary tract obstruction (3 females and 15 males). The present study indicates the following: first, that the yield of IU in the primary investigation of children and youth suffering from enuresis and non-specific abdominal disturbancies is small; and second, that the use of IU in children and youth with urinary tract infection and haematuria should be questioned and reconsidered. (orig.)

  11. Solar urticaria successfully treated with intravenous immunoglobulin.

    LENUS (Irish Health Repository)

    Hughes, R

    2012-02-01

Idiopathic solar urticaria (SU) is a rare, debilitating photodermatosis, which may be difficult to treat. First-line treatment with antihistamines is effective in mild cases, but remission after phototherapeutic induction of tolerance is often short-lived. Other treatment options include plasma exchange, photopheresis and cyclosporin. We present two cases of severe, idiopathic SU, which were resistant to conventional treatment. Both patients achieved remission after administration of intravenous immunoglobulin (IVIg) and have remained in remission at 13 months and 4 years, respectively. There are only two case reports of successful treatment of solar urticaria with IVIg. In our experience, IVIg given at a total dose of 2 g/kg over several 5-day courses about a month apart is an effective treatment option for severe idiopathic SU. It is also generally safe, although subject to significant theoretical risks, such as induction of viral infection or anaphylaxis.

  12. Ceftaroline fosamil: just another intravenous antibiotic.

    Science.gov (United States)

    2013-12-01

    Various antibiotics, especially cephalosporins, are used for empirical treatment of community-acquired pneumonia requiring hospitalisation and intravenous treatment, and for serious infections of the skin and soft tissues. When the infection is caused by bacteria that are resistant to common antibiotics, some antibiotics such as vancomycin are available. Ceftaroline (Zinforo, AstraZeneca) is a new cephalosporin intended for intravenous administration (as ceftaroline fosamil). It is authorised for the treatment of community-acquired pneumonia and for serious infections of the skin and soft tissues. In two double-blind, randomised trials of ceftaroline versus ceftriaxone (a cephalosporin), ceftaroline showed no advantage in patients with community-acquired pneumonia. Note that the results of these trials are undermined by the use of a suboptimal dose of ceftriaxone. Ceftaroline has not been evaluated versus a first-line treatment for serious skin infections. It has been compared with second-line antibiotics in patients with serious skin infections in four randomised trials. None of these trials showed that ceftaroline has superior efficacy. The known adverse effect profile of ceftaroline is similar to that of all cephalosporins, and comprises hypersensitivity reactions (including anaphylaxis) and gastrointestinal disorders (including rare cases of pseudomembranous colitis). A possible excess of haematological and renal adverse effects has also been raised. Given the absence of relevant data, it is best to avoid using ceftaroline during pregnancy. In practice, there is no proof that ceftaroline represents a therapeutic advance for patients with community-acquired pneumonia warranting hospitalisation or with serious skin or soft-tissue infections. It is best to stick with better-known antibiotics.

  13. Phytonadione Content in Branded Intravenous Fat Emulsions.

    Science.gov (United States)

    Forchielli, Maria Luisa; Conti, Matteo; Motta, Roberto; Puggioli, Cristina; Bersani, Germana

    2017-03-01

Intravenous fat emulsions (IVFE) with different fatty acid compositions contain vitamin E as a by-product of vegetable and animal oil during the refining processes. Likewise, other lipid-soluble vitamins may be present in IVFE. No data, however, exist about phytonadione (vitamin K1) concentration in IVFE information leaflets. Therefore, our aim was to evaluate the phytonadione content in different IVFE. Analyses were carried out in triplicate on 6 branded IVFE as follows: 30% soybean oil (100%), 20% olive-soybean oil (80%-20%), 20% soybean-medium-chain triglycerides (MCT) coconut oil (50%-50%), 20% soybean-olive-MCT-fish oil (30%-25%-30%-15%), 20% soybean-MCT-fish oil (40%-50%-10%), and 10% pure fish oil (100%). Phytonadione was analyzed and quantified by a quali-quantitative liquid chromatography-mass spectrometry (LC-MS) method after its extraction from the IVFE by an isopropyl alcohol-hexane mixture, reverse phase-liquid chromatography, and specific multiple-reaction monitoring for phytonadione and vitamin D3 (as internal standard). This method was validated through specificity, linearity, and accuracy. Average vitamin K1 content was 500, 100, 90, 100, 95, and 70 µg/L in soybean oil, olive-soybean oil, soybean-MCT coconut oil, soybean-olive-MCT-fish oil, soybean-MCT-fish oil, and pure fish oil intravenous lipid emulsions (ILEs), respectively. The analytical LC-MS method was extremely effective in terms of specificity, linearity (r = 0.99), and accuracy (coefficient of variation <5%). Phytonadione is present in IVFE, and its intake varies according to IVFE type and the volume administered. It can contribute to daily requirements and become clinically relevant when simultaneously infused with multivitamins during long-term parenteral nutrition. LC-MS seems adequate in assessing vitamin K1 intake in IVFE.
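
    The abstract's point that phytonadione intake scales with the IVFE type and the volume infused is simple arithmetic. A minimal sketch using the mean concentrations reported above; the daily infusion volume is an assumed example value, not from the study:

    ```python
    # Mean vitamin K1 content per emulsion type, in ug/L, as reported above.
    vitamin_k1_ug_per_L = {
        "soybean": 500,
        "olive-soybean": 100,
        "soybean-MCT coconut": 90,
        "soybean-olive-MCT-fish": 100,
        "soybean-MCT-fish": 95,
        "pure fish": 70,
    }

    # Assumed (hypothetical) daily infusion: 250 mL of the 30% soybean-oil IVFE.
    infused_L_per_day = 0.25

    # Daily phytonadione intake = concentration x volume infused.
    intake_ug = vitamin_k1_ug_per_L["soybean"] * infused_L_per_day
    print(intake_ug)  # 125.0 ug/day
    ```

    Under these assumptions the intake is on the order of the usual adult daily requirement, which illustrates why the authors call it potentially clinically relevant.
    
    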

  14. Intravenous volume tomographic pulmonary angiography imaging

    Science.gov (United States)

    Ning, Ruola; Strang, John G.; Chen, Biao; Conover, David L.; Yu, Rongfeng

    1999-05-01

    This study presents a new intravenous (IV) tomographic angiography imaging technique, called intravenous volume tomographic digital angiography (VTDA) for cross sectional pulmonary angiography. While the advantages of IV-VTDA over spiral CT in terms of volume scanning time and resolution have been validated and reported in our previous papers for head and neck vascular imaging, the superiority of IV-VTDA over spiral CT for cross sectional pulmonary angiography has not been explored yet. The purpose of this study is to demonstrate the advantage of isotropic resolution of IV-VTDA in the x, y and z directions through phantom and animal studies, and to explore its clinical application for detecting clots in pulmonary angiography. A prototype image intensifier-based VTDA imaging system has been designed and constructed by modifying a GE 8800 CT scanner. This system was used for a series of phantom and dog studies. A pulmonary vascular phantom was designed and constructed. The phantom was scanned using the prototype VTDA system for direct 3D reconstruction. Then the same phantom was scanned using a GE CT/i spiral CT scanner using the routine pulmonary CT angiography protocols. IV contrast injection and volume scanning protocols were developed during the dog studies. Both VTDA reconstructed images and spiral CT images of the specially designed phantom were analyzed and compared. The detectability of simulated vessels and clots was assessed as the function of iodine concentration levels, oriented angles, and diameters of the vessels and clots. A set of 3D VTDA reconstruction images of dog pulmonary arteries was obtained with different IV injection rates and isotropic resolution in the x, y and z directions. The results of clot detection studies in dog pulmonary arteries have also been shown. This study presents a new tomographic IV angiography imaging technique for cross sectional pulmonary angiography. The results of phantom and animal studies indicate that IV-VTDA is

  15. Penetration of moxalactam and cefazolin into atrial appendage after simultaneous intramuscular or intravenous administration.

    OpenAIRE

    Polk, R E; Smith, J E; Ducey, K; Lower, R R

    1982-01-01

    This study compared the penetration of moxalactam and cefazolin into the human atrial appendage after simultaneous administration of both drugs by two routes. Nineteen adult patients scheduled for coronary vein bypass surgery randomly received 10 mg of moxalactam and cefazolin per kg by either the intramuscular or intravenous (bolus) route on administration of anesthesia. Concentrations of cefazolin in serum were significantly greater than concentrations of moxalactam at all times for both ro...

  16. Intravenous topiramate: comparison of pharmacokinetics and safety with the oral formulation in healthy volunteers.

    Science.gov (United States)

    Clark, Anne M; Kriel, Robert L; Leppik, Ilo E; Marino, Susan E; Mishra, Usha; Brundage, Richard C; Cloyd, James C

    2013-06-01

    Although oral topiramate (TPM) products are widely prescribed for migraines and epilepsy, injectable TPM is not available for human use. We have developed a solubilized TPM formulation using a cyclodextrin matrix, Captisol with the long-term goal of evaluating its safety and efficacy in neonatal seizures. This study in healthy adult volunteers was performed as required by the U.S. Food and Drug Administration (FDA) to demonstrate the pharmacokinetics and safety prior to initiation of studies involving children. This study allowed investigation of absolute bioavailability, absolute clearance, and distribution volume of TPM, information that could not be obtained without using an intravenous TPM formulation. This study was an open-label, two-way crossover of oral and intravenous TPM in 12 healthy adult volunteers. Initially two subjects received 50 mg, intravenously and orally. Following evidence of safety in the first two subjects, 10 individuals received 100 mg doses of intravenous and oral TPM randomly sequenced 2 weeks apart. Blood samples were taken just prior to drug administration and at intervals up to 120 h afterwards. TPM was measured using a validated liquid chromatography-mass spectrometry method. Concentration-time data were analyzed using a noncompartmental approach with WinNonlin 5.2. All subjects completed the study. The mean (±standard deviation) absolute oral bioavailability was 109% (±10.8%). For intravenous and oral TPM the mean distribution volumes were 1.06 L/kg (±0.29) and 0.94 L/kg (±0.24). Clearances were 1.33 L/h (±0.26) and 1.22 L/h (±0.26). The half-life values were 42.3 h (±6.2) and 41.2 h (±7.5). No changes in heart rate, blood pressure, electrocardiography, or infusion site reactions were observed. Mild central nervous system cognitive adverse events and ataxia occurred between dosing and 2 h post dose with both intravenous and oral administration. With intravenous TPM, these adverse effects occurred as early as during the 15
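
    As an illustration of the noncompartmental relations behind these figures, the reported distribution volume and clearance imply the elimination half-life via t1/2 = ln(2)·V/CL. The 70 kg body weight below is an assumption (not stated in the abstract), used to convert the per-kilogram volume to litres:

    ```python
    import math

    # Values from the abstract's IV arm; body weight is an assumption.
    weight_kg = 70
    V = 1.06 * weight_kg   # distribution volume, L (reported as 1.06 L/kg)
    CL = 1.33              # clearance, L/h (taken as total, not per kg)

    # Elimination half-life for a one-compartment model: t1/2 = ln(2) * V / CL
    t_half = math.log(2) * V / CL
    print(round(t_half, 1))  # ~38.7 h
    ```

    The result is consistent with the reported half-life of 42.3 ± 6.2 h, which supports reading the published clearance as a total (not per-kilogram) value.
    
    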

  17. The big war over brackets.

    Science.gov (United States)

    Alvarez, R O

    1994-01-01

    The Third Preparatory Committee Meeting for the International Conference on Population and Development (ICPD), PrepCom III, was held at UN headquarters in New York on April 4-22, 1994. It was the last big preparatory meeting leading to the ICPD to be held in Cairo, Egypt, in September 1994. The author attended the second week of meetings as the official delegate of the Institute for Social Studies and Action. Debates mostly focused upon reproductive health and rights, sexual health and rights, family planning, contraception, condom use, fertility regulation, pregnancy termination, and safe motherhood. The Vatican and its allies' preoccupation with discussing language which may imply abortion caused sustainable development, population, consumption patterns, internal and international migration, economic strategies, and budgetary allocations to be discussed less extensively than they should have been. The author describes points of controversy, the power of women at the meetings, and afterthoughts on the meetings.

  18. Was the Big Bang hot?

    Science.gov (United States)

    Wright, E. L.

    1983-01-01

    Techniques for verifying the spectrum defined by Woody and Richards (WR, 1981), which serves as a base for dust-distorted models of the 3 K background, are discussed. WR detected a sharp deviation from the Planck curve in the 3 K background. The absolute intensity of the background may be determined by the frequency dependence of the dipole anisotropy of the background or the frequency dependence effect in galactic clusters. Both methods involve the Doppler shift; analytical formulae are defined for characterization of the dipole anisotropy. The measurement of the 30-300 GHz spectra of cold galactic dust may reveal the presence of significant amounts of needle-shaped grains, which would in turn support a theory of a cold Big Bang.

  19. Advancements in Big Data Processing

    CERN Document Server

    Vaniachine, A; The ATLAS collaboration

    2012-01-01

The ever-increasing volumes of scientific data present new challenges for Distributed Computing and Grid-technologies. The emerging Big Data revolution drives new discoveries in scientific fields including nanotechnology, astrophysics, high-energy physics, biology and medicine. New initiatives are transforming data-driven scientific fields by pushing Big Data limits, enabling massive data analysis in new ways. In petascale data processing scientists deal with datasets, not individual files. As a result, a task (comprised of many jobs) became a unit of petascale data processing on the Grid. Splitting of a large data processing task into jobs enabled fine-granularity checkpointing analogous to the splitting of a large file into smaller TCP/IP packets during data transfers. Transferring large data in small packets achieves reliability through automatic re-sending of the dropped TCP/IP packets. Similarly, transient job failures on the Grid can be recovered by automatic re-tries to achieve reliable Six Sigma produc...

  20. Big Bang nucleosynthesis in crisis?

    International Nuclear Information System (INIS)

    Hata, N.; Scherrer, R.J.; Steigman, G.; Thomas, D.; Walker, T.P.; Bludman, S.; Langacker, P.

    1995-01-01

A new evaluation of the constraint on the number of light neutrino species (N_ν) from big bang nucleosynthesis suggests a discrepancy between the predicted light element abundances and those inferred from observations, unless the inferred primordial ⁴He abundance has been underestimated by 0.014±0.004 (1σ) or less than 10% (95% C.L.) of ³He survives stellar processing. With the quoted systematic errors in the observed abundances and a conservative chemical evolution parametrization, the best fit to the combined data is N_ν = 2.1±0.3 (1σ), and the upper limit on N_ν excludes the standard model (N_ν = 3) at the 98.6% C.L. copyright 1995 The American Physical Society

  1. Intelligent search in Big Data

    Science.gov (United States)

    Birialtsev, E.; Bukharaev, N.; Gusenkov, A.

    2017-10-01

An approach to data integration, aimed at ontology-based intelligent search in Big Data, is considered for the case when information objects are represented as relational databases (RDBs) structurally described by their schemas. The source of information for constructing an ontology, and later for organizing the search, is natural-language text treated as semi-structured data; for the RDBs, these are the comments on the names of tables and their attributes. A formal definition of the RDB integration model in terms of ontologies is given. Within the framework of this model, a universal RDB representation ontology, an oil-production subject domain ontology, and a linguistic thesaurus of the subject domain language are built. A technique for automatic generation of SQL queries for subject domain specialists is proposed. On this basis, an information system for the RDBs of the TATNEFT oil-producing company was implemented. Operation of the system showed good relevance for the majority of queries.

  2. Big Data and Grand Challenges

    DEFF Research Database (Denmark)

    Finnemann, Niels Ole

    2016-01-01

    The paper discusses the future of the Humanities and the role of Digital Humanities. Taking the point of departure in Rens Bods “A new History of the humanities” (2013) it is argued that the current crisis within the Humanities and Digital Humanities is not as much about different notions of cult...

  3. Hydrothorax, hydromediastinum and pericardial effusion: a complication of intravenous alimentation.

    Science.gov (United States)

    Damtew, B; Lewandowski, B

    1984-01-01

Complications secondary to intravenous alimentation are rare but potentially lethal. Massive bilateral pleural effusions and a pericardial effusion developed in a patient receiving prolonged intravenous alimentation. Severe respiratory distress and renal failure ensued. He recovered with appropriate treatment. PMID:6428731

  4. Intravenous lipid emulsion and dexmedetomidine for treatment of ...

    African Journals Online (AJOL)

All cats presented in this study were treated with intravenous lipid emulsion (ILE) at variable dosages, and dexmedetomidine was also administered intravenously. No adverse reactions such as thrombophlebitis or circulatory overload were noticed during or after administration of ILE. Dexmedetomidine was ...

  5. Clinical effect of intravenous thrombolysis combined with nicorandil ...

    African Journals Online (AJOL)

    Purpose: To evaluate the effectiveness of intravenous thrombolysis in combination with nicorandil in the treatment of acute ST-segment elevation myocardial infarction (STEMI). Methods: Patients who developed acute STEMI and underwent intravenous thrombolysis in the hospital were selected and divided into observation ...

  6. Effects of intravenous diclofenac on postoperative sore throat in ...

    African Journals Online (AJOL)

    EB

Objective: To evaluate the effect of intravenous diclofenac sodium on the occurrence and severity of postoperative sore throat. Methods: ... Conclusion: Intravenous diclofenac sodium does not reduce the occurrence or severity of postoperative sore throat. .... 8.4% sodium bicarbonate (also a colourless liquid) was added to ...

  7. Cost-minimization of mabthera intravenous versus subcutaneous administration

    NARCIS (Netherlands)

    Bax, P.; Postma, M.J.

    2013-01-01

    Objectives: To identify and compare all costs related to preparing and administrating MabThera for the intravenous and subcutaneous formulations in Dutch hematological patients. The a priori notion is that the costs of subcutaneous MabThera injections are lower compared to intravenous infusion due

  8. Microbiological quality of some brands of intravenous fluids ...

    African Journals Online (AJOL)

    Microbiological quality of some brands of intravenous fluids produced by some pharmaceutical companies in Nigeria was investigated. Membrane filtration method was used for concentration of contaminating organisms in the intravenous fluids. Thioglycollate medium, Tryptone Soya broth, Brilliant Green Agar ...

  9. On Establishing Big Data Wave Breakwaters with Analytics (Invited)

    Science.gov (United States)

    Riedel, M.

    2013-12-01

    The Research Data Alliance Big Data Analytics (RDA-BDA) Interest Group seeks to develop community based recommendations on feasible data analytics approaches to address scientific community needs of utilizing large quantities of data. RDA-BDA seeks to analyze different scientific domain applications and their potential use of various big data analytics techniques. A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics and scientific queries will be covered in these recommendations. These combinations are complex since a wide variety of different data analysis algorithms exist (e.g. specific algorithms using GPUs of analyzing brain images) that need to work together with multiple analytical tools reaching from simple (iterative) map-reduce methods (e.g. with Apache Hadoop or Twister) to sophisticated higher level frameworks that leverage machine learning algorithms (e.g. Apache Mahout). These computational analysis techniques are often augmented with visual analytics techniques (e.g. computational steering on large-scale high performance computing platforms) to put the human judgement into the analysis loop or new approaches with databases that are designed to support new forms of unstructured or semi-structured data as opposed to the rather tradtional structural databases (e.g. relational databases). More recently, data analysis and underpinned analytics frameworks also have to consider energy footprints of underlying resources. To sum up, the aim of this talk is to provide pieces of information to understand big data analytics in the context of science and engineering using the aforementioned classification as the lighthouse and as the frame of reference for a systematic approach. 
This talk will provide insights into big data analytics methods in the context of science within various communities, and offers different views of how approaches based on correlation and causality provide complementary methods.

  10. What makes Big Data, Big Data? Exploring the ontological characteristics of 26 datasets

    Directory of Open Access Journals (Sweden)

    Rob Kitchin

    2016-02-01

Full Text Available Big Data has been variously defined in the literature. In the main, definitions suggest that Big Data possess a suite of key traits: volume, velocity and variety (the 3Vs), but also exhaustivity, resolution, indexicality, relationality, extensionality and scalability. However, these definitions lack ontological clarity, with the term acting as an amorphous, catch-all label for a wide selection of data. In this paper, we consider the question ‘what makes Big Data, Big Data?’, applying Kitchin’s taxonomy of seven Big Data traits to 26 datasets drawn from seven domains, each of which is considered in the literature to constitute Big Data. The results demonstrate that only a handful of datasets possess all seven traits, and some do not possess volume and/or variety. Instead, there are multiple forms of Big Data. Our analysis reveals that the key definitional boundary markers are the traits of velocity and exhaustivity. We contend that Big Data as an analytical category needs to be unpacked, with the genus of Big Data further delineated and its various species identified. It is only through such ontological work that we will gain conceptual clarity about what constitutes Big Data, formulate how best to make sense of it, and identify how it might be best used to make sense of the world.
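
    The trait-based classification described above amounts to a set comparison. A minimal sketch; the dataset names and trait assignments below are invented for illustration and are not drawn from the paper's 26 datasets:

    ```python
    # The traits named in the abstract, encoded as a set.
    TRAITS = {"volume", "velocity", "variety", "exhaustivity", "resolution",
              "indexicality", "relationality", "extensionality", "scalability"}

    # Hypothetical datasets with hypothetical trait assignments.
    datasets = {
        "mobile-network-records": set(TRAITS),                 # possesses every trait
        "national-census": TRAITS - {"velocity", "variety"},   # exhaustive but slow and uniform
    }

    # Only datasets possessing the full suite of traits qualify under the strictest reading.
    full_big_data = [name for name, traits in datasets.items() if traits == TRAITS]
    print(full_big_data)  # ['mobile-network-records']
    ```

    The point the sketch makes is the paper's own: under an all-traits definition, many datasets commonly labelled Big Data would not qualify, which is why the authors argue for delineating multiple species of Big Data.
    
    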

  11. [Big data in medicine and healthcare].

    Science.gov (United States)

    Rüping, Stefan

    2015-08-01

Healthcare is one of the business fields with the highest Big Data potential. According to the prevailing definition, Big Data refers to the fact that data today is often too large and heterogeneous and changes too quickly to be stored, processed, and transformed into value by previous technologies. Technological trends drive Big Data: business processes are increasingly executed electronically, consumers produce more and more data themselves - e.g. in social networks - and digitalization is ever increasing. Currently, several new trends towards new data sources and innovative data analysis appear in medicine and healthcare. From the research perspective, omics research is one clear Big Data topic. In practice, electronic health records, free open data and the "quantified self" offer new perspectives for data analytics. Regarding analytics, significant advances have been made in information extraction from text data, which unlocks a lot of data from clinical documentation for analytics purposes. At the same time, medicine and healthcare are lagging behind in the adoption of Big Data approaches. This can be traced to particular problems regarding data complexity and organizational, legal, and ethical challenges. The growing uptake of Big Data in general, and first best-practice examples in medicine and healthcare in particular, indicate that innovative solutions will be coming. This paper gives an overview of the potentials of Big Data in medicine and healthcare.

  12. [Relevance of big data for molecular diagnostics].

    Science.gov (United States)

    Bonin-Andresen, M; Smiljanovic, B; Stuhlmüller, B; Sörensen, T; Grützkau, A; Häupl, T

    2018-04-01

Big data analysis raises the expectation that computerized algorithms may extract new knowledge from otherwise unmanageable vast data sets. What are the algorithms behind the big data discussion? In principle, high-throughput technologies in molecular research already introduced big data, and the development and application of analysis tools, into the field of rheumatology some 15 years ago. This includes especially omics technologies, such as genomics, transcriptomics and cytomics. Some basic methods of data analysis are provided along with the technology; however, functional analysis and interpretation require adaptation of existing software tools or development of new ones. For these steps, structuring and evaluating according to the biological context is extremely important and not only a mathematical problem. This aspect has to be considered much more for molecular big data than for data analyzed in health economics or epidemiology. Molecular data are structured in a first order determined by the applied technology and present quantitative characteristics that follow the principles of their biological nature. These biological dependencies have to be integrated into software solutions, which may require networks of molecular big data of the same or even different technologies in order to achieve cross-technology confirmation. Increasingly extensive recording of molecular processes, including in individual patients, is generating personal big data and requires new management strategies in order to develop data-driven individualized interpretation concepts. With this perspective in mind, translation of information derived from molecular big data will also require new specifications for education and professional competence.

  13. Big Pharma: a former insider's view.

    Science.gov (United States)

    Badcott, David

    2013-05-01

    There is no lack of criticisms frequently levelled against the international pharmaceutical industry (Big Pharma): excessive profits, dubious or even dishonest practices, exploiting the sick and selective use of research data. Neither is there a shortage of examples used to support such opinions. A recent book by Brody (Hooked: Ethics, the Medical Profession and the Pharmaceutical Industry, 2008) provides a précis of the main areas of criticism, adopting a twofold strategy: (1) An assumption that the special nature and human need for pharmaceutical medicines requires that such products should not be treated like other commodities and (2) A multilevel descriptive approach that facilitates an ethical analysis of relationships and practices. At the same time, Brody is fully aware of the nature of the fundamental dilemma: the apparent addiction to (and denial of) the widespread availability of gifts and financial support for conferences etc., but recognises that 'Remove the industry and its products, and a considerable portion of scientific medicine's power to help the patient vanishes' (Brody 2008, p. 5). The paper explores some of the relevant issues, and argues that despite the identified shortcomings and a need for rigorous and perhaps enhanced regulation, and realistic price control, the commercially competitive pharmaceutical industry remains the best option for developing safer and more effective medicinal treatments. At the same time, adoption of a broader ethical basis for the industry's activities, such as a triple bottom line policy, would register an important move in the right direction and go some way toward answering critics.

  14. Psycho-informatics: Big Data shaping modern psychometrics.

    Science.gov (United States)

    Markowetz, Alexander; Błaszkiewicz, Konrad; Montag, Christian; Switala, Christina; Schlaepfer, Thomas E

    2014-04-01

    For the first time in history, it is possible to study human behavior at great scale and in fine detail simultaneously. Online services and ubiquitous computational devices, such as smartphones and modern cars, record our everyday activity. The resulting Big Data offers unprecedented opportunities for tracking and analyzing behavior. This paper hypothesizes the applicability and impact of Big Data technologies in the context of psychometrics, both for research and clinical applications. It first outlines the state of the art, including the severe shortcomings with respect to quality and quantity of the resulting data. It then presents a technological vision comprising (i) numerous data sources such as mobile devices and sensors, (ii) a central data store, and (iii) an analytical platform employing techniques from data mining and machine learning. To further illustrate the dramatic benefits of the proposed methodologies, the paper then outlines two current projects logging and analyzing smartphone usage: one study attempts to quantify the severity of major depression dynamically; the other investigates (mobile) Internet addiction. Finally, the paper addresses some of the ethical issues inherent to Big Data technologies. In summary, the proposed approach is about to induce the single biggest methodological shift since the beginning of psychology or psychiatry. The resulting range of applications will dramatically shape the daily routines of researchers and medical practitioners alike. Indeed, transferring techniques from computer science to psychiatry and psychology is about to establish Psycho-Informatics, an entire research direction of its own. Copyright © 2013 Elsevier Ltd. All rights reserved.

  15. BLENDING IOT AND BIG DATA ANALYTICS

    OpenAIRE

    Tulasi.B*; Girish J Vemulkar

    2016-01-01

    The Internet is continuously evolving and changing. The Internet of Things (IoT) can be considered the future of Internet applications, involving machine-to-machine (M2M) learning. Actionable intelligence can be derived through the fusion of Big Data and real-time analytics with IoT. Big Data and IoT can be viewed as two sides of a coin: with the connection between Big Data and the objects on the Internet, the benefits of IoT can be easily reaped. The applications of IoT spread across various domains l...

  16. Smart Information Management in Health Big Data.

    Science.gov (United States)

    Muteba A, Eustache

    2017-01-01

    The smart information management system (SIMS) is concerned with the organization of anonymous patient records in a big data store and their extraction in order to provide needful real-time intelligence. The purpose of the present study is to highlight the design and the implementation of the smart information management system. We emphasize, on the one hand, the organization of big data in flat files simulating a NoSQL database and, on the other hand, the extraction of information based on a lookup table and a cache mechanism. SIMS applied to health big data aims at the identification of new therapies and approaches to delivering care.
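    The flat-file organization and lookup-table/cache extraction described in this abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the record fields, file handling, and cache size are assumptions made for the example.

```python
import json
import tempfile
from functools import lru_cache

# Illustrative anonymised patient records; the field names are assumptions.
RECORDS = [
    {"id": "p001", "diagnosis": "diabetes", "therapy": "metformin"},
    {"id": "p002", "diagnosis": "hypertension", "therapy": "lisinopril"},
]

# Flat file with one JSON document per line, simulating a schemaless
# NoSQL collection.
flat_file = tempfile.NamedTemporaryFile(mode="w", suffix=".jsonl", delete=False)
for rec in RECORDS:
    flat_file.write(json.dumps(rec) + "\n")
flat_file.close()

def build_lookup_table(path):
    """Map record id -> byte offset, so a single record can be fetched
    without scanning the whole file."""
    table = {}
    with open(path, "rb") as f:
        while True:
            offset = f.tell()
            line = f.readline()
            if not line:
                break
            table[json.loads(line)["id"]] = offset
    return table

LOOKUP = build_lookup_table(flat_file.name)

@lru_cache(maxsize=1024)  # cache mechanism: repeated reads are served from memory
def fetch(record_id):
    with open(flat_file.name, "rb") as f:
        f.seek(LOOKUP[record_id])
        return json.loads(f.readline())

print(fetch("p002")["therapy"])  # lisinopril
```

    The lookup table trades a one-time index build for constant-time record access, and the cache keeps hot records in memory, which is the real-time extraction idea the abstract gestures at.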

  17. Processing Solutions for Big Data in Astronomy

    Science.gov (United States)

    Fillatre, L.; Lepiller, D.

    2016-09-01

    This paper gives a simple introduction to processing solutions applied to massive amounts of data. It proposes a general presentation of the Big Data paradigm. The Hadoop framework, which is considered as the pioneering processing solution for Big Data, is described together with YARN, the integrated Hadoop tool for resource allocation. This paper also presents the main tools for the management of both the storage (NoSQL solutions) and computing capacities (MapReduce parallel processing schema) of a cluster of machines. Finally, more recent processing solutions like Spark are discussed. Big Data frameworks are now able to run complex applications while keeping the programming simple and greatly improving the computing speed.
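    As a minimal illustration of the MapReduce processing schema this abstract refers to, the sketch below runs the classic word count in a single process; Hadoop or Spark would distribute the same map, shuffle, and reduce phases across a cluster of machines. The input splits and function names are illustrative assumptions.

```python
from collections import defaultdict
from itertools import chain

def map_phase(split):
    # Map: emit a (word, 1) pair for every word in one input split.
    return [(word, 1) for word in split.split()]

def shuffle(pairs):
    # Shuffle: group all values by key, as the framework would do
    # between the map and reduce phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    # Reduce: aggregate the values collected for one key.
    return key, sum(values)

splits = ["big data big compute", "data pipelines at scale"]
pairs = chain.from_iterable(map_phase(s) for s in splits)
counts = dict(reduce_phase(k, v) for k, v in shuffle(pairs).items())
print(counts["big"], counts["data"])  # 2 2
```

    Because each map call touches only its own split and each reduce call only one key's values, both phases parallelize trivially, which is what lets frameworks like Hadoop keep the programming model simple while scaling out.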

  18. Big Data and historical social science

    Directory of Open Access Journals (Sweden)

    Peter Bearman

    2015-11-01

    “Big Data” can revolutionize historical social science if it arises from substantively important contexts and is oriented towards answering substantively important questions. Such data may be especially important for answering previously largely intractable questions about the timing and sequencing of events, and of event boundaries. That said, “Big Data” makes no difference for social scientists and historians whose accounts rest on narrative sentences. Since such accounts are the norm, the effects of Big Data on the practice of historical social science may be more limited than one might wish.

  19. Ethics and Epistemology of Big Data.

    Science.gov (United States)

    Lipworth, Wendy; Mason, Paul H; Kerridge, Ian

    2017-12-01

    In this Symposium on the Ethics and Epistemology of Big Data, we present four perspectives on the ways in which the rapid growth in size of research databanks, i.e. their shift into the realm of "big data", has changed their moral, socio-political, and epistemic status. While there is clearly something different about "big data" databanks, we encourage readers to place the arguments presented in this Symposium in the context of longstanding debates about the ethics, politics, and epistemology of biobank, database, genetic, and epidemiological research.

  20. Big data governance an emerging imperative

    CERN Document Server

    Soares, Sunil

    2012-01-01

    Written by a leading expert in the field, this guide focuses on the convergence of two major trends in information management (big data and information governance) by taking a strategic approach oriented around business cases and industry imperatives. With the advent of new technologies, enterprises are expanding and handling very large volumes of data; this book, nontechnical in nature and geared toward business audiences, encourages the practice of establishing appropriate governance over big data initiatives and addresses how to manage and govern big data, highlighting the relevant processes,