WorldWideScience

Sample records for big losses lead

  1. Big losses lead to irrational decision-making in gambling situations: relationship between deliberation and impulsivity.

    Directory of Open Access Journals (Sweden)

    Yuji Takano

    Full Text Available In gambling situations, we found a paradoxical reinforcing effect of high-risk decision-making after repeated big monetary losses. A computerized version of the Iowa Gambling Task (Bechara et al., 2000), which contained six big-loss cards in deck B', was administered to normal healthy college students. The results indicated that the combined number of selections from decks A' and B' decreased across trials; however, selections from deck B' alone did not decrease. Detailed analysis of the card selections revealed that some people persisted in selecting from the "risky" deck B' as the number of big losses increased. This tendency was prominent in self-rated deliberative people. However, these participants were implicitly impulsive, as revealed by the matching familiar figure test. These results suggest that the gap between explicit deliberation and implicit impulsivity drew them into pathological gambling.

  2. Leading Undergraduate Students to Big Data Generation

    OpenAIRE

    Yang, Jianjun; Shen, Ju

    2015-01-01

    People are facing a flood of data today. Data are being collected at unprecedented scale in many areas, such as networking, image processing, virtualization, scientific computation, and algorithms. Such huge data sets are now called Big Data. Big Data is an all-encompassing term for any collection of data sets so large and complex that it becomes difficult to process them using traditional data processing applications. In this article, the authors present a unique way which uses network simula...

  3. Cool horizons lead to information loss

    Science.gov (United States)

    Chowdhury, Borun D.

    2013-10-01

    There are two pieces of evidence for information loss during black hole evaporation: (i) a pure state evolves to a mixed state, and (ii) the map from the initial state to the final state is non-invertible. Any proposed resolution of the information paradox must address both these issues. The firewall argument focuses only on the first, and this leads to order-one deviations from the Unruh vacuum for maximally entangled black holes. The nature of the argument does not extend to black holes in pure states. It was shown by Avery, Puhm and the author that requiring the initial-state-to-final-state map to be invertible mandates structure at the horizon even for pure states. The proof works if black holes can be formed in generic states, and in this paper we show that this is indeed the case. We also demonstrate how models proposed by Susskind, Papadodimas et al. and Maldacena et al. end up making the initial-to-final-state map non-invertible and thus make the horizon "cool" at the cost of unitarity.

  4. A Brief Review on Leading Big Data Models

    OpenAIRE

    Sharma, Sugam; Tim, Udoyara S; Wong, Johnny; Gadia, Shashi; Sharma, Subhash

    2014-01-01

    Today, science is passing through an era of transformation, where the inundation of data, dubbed the data deluge, is influencing the decision-making process. Science is now driven by data and is being termed data science. In this internet age, the volume of data has grown to petabytes, and this large, complex, structured or unstructured, heterogeneous data in the form of "Big Data" has gained significant attention. The rapid pace of data growth through various disparate sources, ...

  5. [Overall digitalization: leading innovation of endodontics in big data era].

    Science.gov (United States)

    Ling, J Q

    2016-04-01

    In the big data era, digital technologies bring great challenges and opportunities to modern stomatology. The applications of digital technologies, such as cone-beam CT (CBCT), computer-aided design (CAD), computer-aided manufacturing (CAM), 3D printing, and digital approaches to education, provide new concepts and patterns for the treatment and study of endodontic diseases. This review provides an overview of the application and prospects of commonly used digital technologies in the development of endodontics. PMID:27117212

  6. Lead uptake and lead loss in the fresh water field crab, Barytelphusa guerini, on exposure to organic and inorganic lead

    Energy Technology Data Exchange (ETDEWEB)

    Tulasi, S.J.; Yasmeen, R.; Reddy, C.P.; Rao, J.V.R.

    1987-07-01

    Lead is a heavy metal that is widely used in the paint industry, pigments, dyes, electrical components and electronics, plastics chemicals and various other products. Since some lead salts are soluble in water, lead presents a potential threat to aquatic organisms. Studies dealing with invertebrates include those on mortality, growth and lead uptake in Lymnaea palustris and bioaccumulation of heavy metals in oysters and mussels. Little information exists regarding the effect of lead on freshwater crustaceans. Hence the present investigation was undertaken to study the uptake and loss of lead on exposure to subtoxic levels of organic and inorganic lead.

  7. Costs of IQ Loss from Leaded Aviation Gasoline Emissions.

    Science.gov (United States)

    Wolfe, Philip J; Giang, Amanda; Ashok, Akshay; Selin, Noelle E; Barrett, Steven R H

    2016-09-01

    In the United States, general aviation piston-driven aircraft are now the largest source of lead emitted to the atmosphere. Elevated lead concentrations impair children's IQ and can lead to lower earnings potential. This study is the first assessment of the nationwide annual costs of IQ losses from aircraft lead emissions. We develop a general aviation emissions inventory for the continental United States and model its impact on atmospheric concentrations using the community multiscale air quality model (CMAQ). We use these concentrations to quantify the impacts of annual aviation lead emissions on the U.S. population using two methods: through static estimates of cohort-wide IQ deficits and through dynamic economy-wide effects using a computational general equilibrium model. We also examine the sensitivity of these damage estimates to different background lead concentrations, showing the impact of lead controls and regulations on marginal costs. We find that aircraft-attributable lead contributes $1.06 billion (2006 USD; range $0.01-$11.6 billion) in annual damages from lifetime earnings reductions, and that dynamic economy-wide methods result in damage estimates that are 54% larger. Because the marginal costs of lead are dependent on background concentration, the costs of piston-driven aircraft lead emissions are expected to increase over time as regulations on other emissions sources are tightened. PMID:27494542
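
    The static cohort method described above multiplies the number of exposed children by the IQ deficit per child and a dollar value per IQ point. A minimal sketch of that arithmetic, with purely illustrative numbers (none taken from the study):

```python
# Hedged sketch of a *static* cohort damage estimate of the kind the
# abstract describes. All numbers below are illustrative assumptions,
# not values from the study.

def static_iq_damages(children, iq_loss_per_child, usd_per_iq_point):
    """Lifetime-earnings damages for one birth cohort (USD)."""
    return children * iq_loss_per_child * usd_per_iq_point

# Illustrative inputs: 100,000 exposed children, 0.1 IQ points lost
# each, $10,000 of lifetime earnings per IQ point.
damages = static_iq_damages(100_000, 0.1, 10_000)
print(f"${damages / 1e6:.0f} million")  # $100 million
```

    The dynamic, economy-wide estimate in the study additionally propagates the earnings loss through a general equilibrium model, which is why it comes out larger than this simple product.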

  8. Losses as ecological guides: minor losses lead to maximization and not to avoidance.

    Science.gov (United States)

    Yechiam, Eldad; Retzer, Matan; Telpaz, Ariel; Hochman, Guy

    2015-06-01

    Losses are commonly thought to result in a neuropsychological avoidance response. We suggest that losses also provide ecological guidance by increasing focus on the task at hand, and that this effect may override the avoidance response. This prediction was tested in a series of studies. In Study 1a we found that minor losses did not lead to an avoidance response. Instead, they guided participants to make advantageous choices (in terms of expected value) and to avoid disadvantageous choices. Moreover, losses were associated with less switching between options after the first block of exploration. In Study 1b we found that this effect was not simply a by-product of the increase in visual contrast with losses. In Study 1c we found that the effect of losses did not emerge when alternatives did not differ in their expected value but only in their risk level. In Study 2 we investigated the autonomic arousal dynamics associated with this behavioral pattern via pupillometric responses. The results showed increased pupil diameter following losses compared to gains. However, this increase was not associated with a tendency to avoid losses, but rather with a tendency to select more advantageously. These findings suggest that attention and reasoning processes induced by losses can outweigh the influence of affective processes leading to avoidance. PMID:25797454

  9. Modeling of Blood Lead Levels in Astronauts Exposed to Lead from Microgravity-Accelerated Bone Loss

    Science.gov (United States)

    Garcia, H.; James, J.; Tsuji, J.

    2014-01-01

    Human exposure to lead has been associated with toxicity to multiple organ systems. Studies of various population groups with relatively low blood lead concentrations have associated blood lead levels with lower cognitive test scores in children, later onset of puberty in girls, and increased blood pressure and cardiovascular mortality rates in adults. Cognitive effects are considered by regulatory agencies to be the most sensitive endpoint at low doses. Although 95% of the body burden of lead is stored in the bones, the adverse effects of lead correlate better with the concentration of lead in the blood than with that in the bones. NASA has found that prolonged exposure to microgravity during spaceflight results in a significant loss of bone minerals, the extent of which varies from individual to individual and from bone to bone, but generally averages about 0.5% per month. During such bone loss, lead that had been stored in bones would be released along with calcium. The effects on the concentration of lead in the blood (PbB) of various concentrations of lead in drinking water (PbW), of lead released from bones due to accelerated osteoporosis in microgravity, and of changes in exposure to environmental lead before, during, and after spaceflight were evaluated using a physiologically based pharmacokinetic (PBPK) model that incorporated exposure to environmental lead both on Earth and in flight and included temporarily increased rates of osteoporosis during spaceflight.
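
    The bone-release pathway described above can be sketched as a toy mass balance: if bone loses about 0.5% of its mineral per month in microgravity, lead stored in bone is assumed to be released in the same proportion. This is an illustrative simplification, not the study's PBPK model:

```python
# Toy mass-balance sketch (not the study's PBPK model): lead release
# from resorbing bone is taken to be proportional to the bone mineral
# loss rate. The burden and rate below are illustrative assumptions.

def monthly_pb_release(bone_pb_burden_mg, bone_loss_rate=0.005):
    """Lead (mg) released into blood per month from resorbing bone."""
    return bone_pb_burden_mg * bone_loss_rate

# Illustrative: a 100 mg skeletal lead burden at 0.5%/month bone loss
# releases 0.5 mg of lead per month.
print(monthly_pb_release(100.0))  # 0.5
```

    A real PBPK model would then partition this release between blood, soft tissue, and excretion rather than assuming it all reaches the blood.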

  10. Streamflow losses and ground-water level changes along the Big Lost River at the Idaho National Engineering Laboratory, Idaho

    International Nuclear Information System (INIS)

    The Big Lost River flows onto the eastern Snake River Plain near Arco, Idaho, and across the INEL. Streamflow recharges the Snake River Plain aquifer from the Big Lost River channel, the INEL spreading areas, and playas at the river terminus. Average annual streamflow of the Big Lost River from 1965 to 1987 above the INEL diversion was 104,436 acre-ft; about 50% of this flow was diverted to the INEL spreading areas, about 9% infiltrated between the INEL diversion and Lincoln Boulevard, and the remainder infiltrated below Lincoln Boulevard or flowed into the playas. Streamflow losses to evapotranspiration were minor compared with infiltration losses. Infiltration loss was estimated along the 44 river miles from Arco to playa 1 at discharges ranging from 37 to 372 cu ft/sec. At discharges less than 100 cu ft/sec, loss from Arco to the Big Lost River Sinks was 1 to 4 cu ft/sec/mile, and loss from the sinks to playa 1 ranged from 7 to 12 cu ft/sec/mile. Infiltration loss was greatest at large discharges; at a discharge of 372 cu ft/sec, a loss of 28 cu ft/sec/mile was estimated. Infiltration from the Big Lost River is substantial immediately southwest of the Radioactive Waste Management Complex and in the area from the Naval Reactor Facility to playas 1 and 2
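
    The flow budget quoted above can be checked with simple arithmetic (figures taken from the abstract; the split of the remainder between channel infiltration and playa inflow is not resolved further):

```python
# Arithmetic check of the Big Lost River flow budget: 104,436 acre-ft
# average annual streamflow, ~50% diverted to the INEL spreading areas,
# ~9% infiltrating between the INEL diversion and Lincoln Boulevard,
# remainder infiltrating downstream or reaching the playas.

annual_flow_af = 104_436           # acre-ft, 1965-1987 average
diverted = 0.50 * annual_flow_af   # INEL spreading areas
upper_reach = 0.09 * annual_flow_af
remainder = annual_flow_af - diverted - upper_reach

print(round(diverted), round(upper_reach), round(remainder))
# 52218 9399 42819
```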

  11. On the evaluation of the pressure losses in a lead-bismuth-eutectics cooled fuel assembly with TRACE and SUSA

    International Nuclear Information System (INIS)

    The prediction of the pressure drop in a pool-type reactor operated with lead-bismuth eutectic is of crucial importance. A pressure drop of e.g. 1 bar is equivalent to a lead-bismuth eutectic column of about 1 m, which has a big influence on the financial aspects of the design proposal. The paper presents results on the hydraulic evaluation of a fuel assembly with the emphasis on uncertainties and variations of relevant parameters such as the mass flow rate and the form and friction loss coefficients. With the subsequent uncertainty and sensitivity study, in connection with thermal-hydraulic investigations, the influence of these uncertain parameters was evaluated. (author)
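
    The quoted equivalence (a 1 bar pressure drop corresponding to roughly a 1 m column of lead-bismuth eutectic) follows from the hydrostatic relation Δp = ρgh. A quick check, using a typical handbook LBE density rather than a value from the paper:

```python
# Hydrostatic check of the "1 bar ~ 1 m of LBE" statement.
# RHO_LBE is a typical handbook value for LBE density near operating
# temperature (an assumption, not a figure from the paper).

RHO_LBE = 10_400   # kg/m^3, approximate LBE density around 400 C
G = 9.81           # m/s^2

def lbe_column_height(delta_p_pa):
    """Height (m) of an LBE column matching a pressure drop (Pa)."""
    return delta_p_pa / (RHO_LBE * G)

print(round(lbe_column_height(1e5), 2))  # 0.98
```

    So a 1 bar (10^5 Pa) loss indeed corresponds to just under 1 m of LBE, which is why pressure losses drive the pump head and vessel height in such designs.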

  12. Does faster loan growth lead to higher loan losses?

    OpenAIRE

    William R. Keeton

    1999-01-01

    During the last couple of years, concern has increased that the exceptionally rapid growth in business loans at commercial banks has been due in large part to excessively easy credit standards. Some analysts argue that competition for loan customers has greatly increased, causing banks to reduce loan rates and ease credit standards to obtain new business. Others argue that as the economic expansion has continued and memories of past loan losses have faded, banks have become more willing to ta...

  13. Association of defects in lead chloride and lead bromide: Ionic conductivity and dielectric loss measurements

    NARCIS (Netherlands)

    Brom, W.E. van den; Schoonman, J.; Wit, J.H.W. de

    1972-01-01

    The ionic conductivity data of pure and doped lead bromide without associated defects are used in order to explain the anomalous conductivity behaviour of copper (I) bromide and lead oxide-doped lead-bromide crystals. In these crystals precipitated dopant and associated defects are present. The asso

  14. Size of genetic bottlenecks leading to virus fitness loss is determined by mean initial population fitness.

    OpenAIRE

    Novella, I S; Elena, S F; Moya, A.; Domingo, E; Holland, J J

    1995-01-01

    Genetic bottlenecks are important events in the genetic diversification of organisms and colonization of new ecological niches. Repeated bottlenecking of RNA viruses often leads to fitness losses due to the operation of Muller's ratchet. Herein we use vesicular stomatitis virus to determine the transmission population size which leads to fitness decreases of virus populations. Remarkably, the effective size of a genetic bottleneck associated with fitness loss is greater when the fitness of th...

  15. AdS/CFT heavy quark energy loss beyond the leading order

    Science.gov (United States)

    Horowitz, W. A.

    2014-11-01

    Naïve, leading-order, fully strongly-coupled AdS/CFT energy loss models are either falsified or put into significant doubt when constrained by RHIC data and then compared to LHC data. The proper inclusion of fluctuations in heavy quark momentum loss leads to LHC predictions, constrained by RHIC, that are not in qualitative disagreement with measurements. Once renormalized, strong-coupling calculations of jet suppression with a new, physically motivated jet definition within AdS/CFT yield predictions in surprisingly good agreement with preliminary LHC results.

  16. Concentration trends for lead and calcium-normalized lead in fish fillets from the Big River, a mining-contaminated stream in southeastern Missouri USA

    Science.gov (United States)

    Schmitt, Christopher J.; McKee, Michael J.

    2016-01-01

    Lead (Pb) and calcium (Ca) concentrations were measured in fillet samples of longear sunfish (Lepomis megalotis) and redhorse suckers (Moxostoma spp.) collected in 2005–2012 from the Big River, which drains a historical mining area in southeastern Missouri and where a consumption advisory is in effect due to elevated Pb concentrations in fish. Lead tends to accumulate in Ca-rich tissues such as bone and scale. Concentrations of Pb in fish muscle are typically low but can become elevated in fillets from Pb-contaminated sites, depending in part on how much bone, scale, and skin is included in the sample. We used analysis of covariance to normalize Pb concentrations to the geometric mean Ca concentration (415 µg/g wet weight, ww), which reduced variation between taxa, sites, and years, as well as the number of samples that exceeded the Missouri consumption advisory threshold (300 ng/g ww). Concentrations of Pb in 2005–2012 were lower than in the past, especially after Ca-normalization, but the consumption advisory is still warranted because concentrations were >300 ng/g ww in samples of both taxa from contaminated sites. For monitoring purposes, a simple linear regression model is proposed for estimating Ca-normalized Pb concentrations in fillets from Pb:Ca molar ratios, as a way of reducing the effects of differing preparation methods on fillet Pb variation.
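
    The proposed monitoring model is a simple linear regression from Pb:Ca molar ratios to fillet Pb concentration. A hedged sketch of fitting such a line by ordinary least squares; the data and coefficients below are synthetic illustrations, not values from the study:

```python
# Hedged sketch of the regression idea in the abstract: predict fillet
# Pb from the Pb:Ca molar ratio with ordinary least squares. The data
# below are synthetic and lie exactly on pb = 1e5 * ratio + 10.

def ols_fit(x, y):
    """Return (slope, intercept) of a simple least-squares line."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

# Synthetic Pb:Ca molar ratios and Pb concentrations (ng/g ww).
ratios = [0.001, 0.002, 0.004, 0.008]
pb_ngg = [110.0, 210.0, 410.0, 810.0]

slope, intercept = ols_fit(ratios, pb_ngg)
print(round(slope), round(intercept, 1))  # 100000 10.0
```

    With a fitted line of this form, a monitoring program could flag fillets whose predicted Pb exceeds the 300 ng/g ww advisory threshold regardless of how much bone or skin a given preparation method leaves in the sample.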

  17. Parkin loss leads to PARIS-dependent declines in mitochondrial mass and respiration.

    Science.gov (United States)

    Stevens, Daniel A; Lee, Yunjong; Kang, Ho Chul; Lee, Byoung Dae; Lee, Yun-Il; Bower, Aaron; Jiang, Haisong; Kang, Sung-Ung; Andrabi, Shaida A; Dawson, Valina L; Shin, Joo-Ho; Dawson, Ted M

    2015-09-15

    Mutations in parkin lead to early-onset autosomal recessive Parkinson's disease (PD) and inactivation of parkin is thought to contribute to sporadic PD. Adult knockout of parkin in the ventral midbrain of mice leads to an age-dependent loss of dopamine neurons that is dependent on the accumulation of parkin interacting substrate (PARIS), zinc finger protein 746 (ZNF746), and its transcriptional repression of PGC-1α. Here we show that adult knockout of parkin in mouse ventral midbrain leads to decreases in mitochondrial size, number, and protein markers consistent with a defect in mitochondrial biogenesis. This decrease in mitochondrial mass is prevented by short hairpin RNA knockdown of PARIS. PARIS overexpression in mouse ventral midbrain leads to decreases in mitochondrial number and protein markers and PGC-1α-dependent deficits in mitochondrial respiration. Taken together, these results suggest that parkin loss impairs mitochondrial biogenesis, leading to declining function of the mitochondrial pool and cell death. PMID:26324925

  18. Investigations of the factors causing performance losses of lead/acid traction batteries

    Science.gov (United States)

    Kronberger, H.; Fabjan, Ch.; Gofas, N.

    A failure analysis is carried out on a lead/acid traction battery after a two-year test run in an electric passenger car. A survey of the operational data, in combination with laboratory tests and chemical and physical analyses, reveals the main causes of battery damage and performance loss: insufficiencies of the charging procedure, inadequate maintenance (water-refilling system), antimony contamination, and loss of the active material due to grid corrosion and shedding of PbO2.

  19. How to minimize iron loss while decontaminating converter dust from lead.

    Science.gov (United States)

    Moussavi, Mohsen; Fathikalajahi, Jamshid; Khalili, Fariba

    2011-12-01

    The purpose of this work was to decontaminate a converter flue dust from lead while minimizing the iron loss from the dust matrix. A physicochemical method based on a leaching cascade was applied to remove lead impurities with HCl. Finally, the lead-rich wastes generated at the end of the operations were further treated in order to meet the standards of waste disposal sites. The results show that lead could be removed from the dust with efficiencies better than 90%. However, some iron was lost in these operations. It was noticed that under optimum conditions 8.5 g of iron would be lost as leachate per gram of removed lead. It was also noticed that the lead-treated dust was more concentrated in iron and had less calcium. A number of parameters that could affect the amount of iron loss, such as acid dose, exposure time, and temperature, were identified, and the extent of their effects is presented. It was concluded that the lead content in the flue dust could practically be lowered to any level. It was also concluded that the difference between the solubility of lead chloride in cold and hot water, as well as the affinity of certain ligands such as Cl- and OH- to form stable complexes with lead ions, played a key role in this study. The lead-containing complexes in the leachate were identified to be predominantly PbCl3- and PbCl+. PMID:22439558

  20. Theoretical and experimental loss and efficiency studies of a magnetic lead screw

    DEFF Research Database (Denmark)

    Berg, Nick Ilsø; Holm, Rasmus Koldborg; Rasmussen, Peter Omand

    2013-01-01

    This paper investigates mechanical and magnetic losses in a magnetic lead screw (MLS). The MLS converts a low-speed, high-force linear motion of a translator into a high-speed, low-torque rotational motion of a rotor through helically shaped magnets. Initial tests performed with a novel 17 kN demonstrator and a simplified motor model showed an efficiency of only 80% at low load; however, it was expected that the efficiency should be above 95%. For understanding and future optimization, a detailed study of the loss in the MLS is presented with the aim of identifying and segregating various loss...

  1. Theoretical and Experimental Loss and Efficiency Studies of a Magnetic Lead Screw

    DEFF Research Database (Denmark)

    Berg, Nick Ilsø; Holm, Rasmus Koldborg; Rasmussen, Peter Omand

    2015-01-01

    This paper investigates mechanical and magnetic losses in a magnetic lead screw (MLS). The MLS converts a low-speed high-force linear motion of a translator into a high-speed low-torque rotational motion of a rotor through helically shaped magnets. Initial tests performed with a novel 17 kN demonstrator and a simplified motor model showed an efficiency of only 80% at low load; however, it was expected that the efficiency should be above 95%. For understanding and future optimization, a detailed study of the loss in the MLS is presented with the aim of identifying and segregating various loss...

  2. Lead pressure loss in the heat exchanger of the ELSY fast lead-cooled reactor by CFD approach

    International Nuclear Information System (INIS)

    In the frame of the ELSY (European lead-cooled system) design proposal for a fast lead-cooled reactor, which should comply with the goals of the 4th-generation nuclear power plants, the focus is set on the possible advantages offered by lead technology in comparison to lead-bismuth eutectic (LBE). Lead is less expensive, less corrosive and has a smaller radiological emissivity than LBE. The ELSY project aims at demonstrating the feasibility of a lead fast reactor for energy generation and at identifying solutions for a simple but safe system. In order to properly dimension the reactor and to allow the flow of lead in natural circulation, as required by nuclear accident scenarios, knowledge of the lead pressure losses through each component is mandatory. The present paper discusses the pressure loss through the new innovative design proposed for the ELSY spiral heat exchanger (HX). The lack of experimental data for lead flows through heat exchangers, as well as the novelty of the HX design, motivated an approach based on CFD (Computational Fluid Dynamics) analysis. We employed the commercial tool ANSYS CFX and successfully validated the program against theoretical predictions for pressure loss simulations through perforated plates and pipe bundles. The ELSY HX has a cylindrical design and uniformly perforated double inner and double outer walls, as described in [4]. The flow of lead constitutes the primary circuit, while supercritical water is planned for the secondary circuit of the reactor. The perforations in the walls and in the corresponding companion shells are displaced in a staggered way. About 200 tubes arranged vertically in a staggered way are planned for the secondary circuit of one HX. A detailed complete model is not feasible at the actual stage of the design, due to the complex geometry, whose reference elements range between the 10^-3 m and 1 m scales. Therefore, unit slice models consisting of

  3. Analysis of the Six Big Losses Factors on the Cane Cutter I Machine Affecting Production Efficiency at the PTPN II Sei Semayang Sugar Mill

    OpenAIRE

    Arwanie, Meilya Nurul

    2010-01-01

    The PTPN II Sei Semayang sugar mill is a company engaged in sugar-cane processing that is not free of problems related to machine/equipment effectiveness caused by the six big losses. This is evident from unplanned shutdowns and from the frequency of machine/equipment breakdowns; because of these breakdowns, production targets are not met. Effective and efficient measures are therefore required in the maintenance of machines and equi...
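
    The six big losses are conventionally rolled up into Overall Equipment Effectiveness (OEE), the product of the availability, performance, and quality rates. A minimal sketch with illustrative numbers (not plant data from the study):

```python
# Standard OEE formula used in six-big-losses analyses:
# OEE = availability x performance x quality. The rates below are
# illustrative assumptions, not measurements from the sugar mill.

def oee(availability, performance, quality):
    """Overall Equipment Effectiveness as a fraction in [0, 1]."""
    return availability * performance * quality

# Illustrative: 90% availability, 95% performance rate, 99% quality.
print(round(oee(0.90, 0.95, 0.99), 3))  # 0.846
```

    Breakdowns and unplanned shutdowns of the kind described above depress the availability factor, which is why OEE studies start by attributing downtime to the individual losses.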

  4. Pulsations of red supergiant pair-instability supernova progenitors leading to extreme mass loss

    CERN Document Server

    Moriya, Takashi J

    2014-01-01

    Recent stellar evolution models show consistently that very massive metal-free stars evolve into red supergiants shortly before they explode. We argue that the envelopes of these stars, which will form pair-instability supernovae, become pulsationally unstable, and that this will lead to extreme mass-loss rates even though the metal content of the envelopes is very small. We investigate the pulsational properties of such models, and derive pulsationally induced mass-loss rates which take the damping effects of the mass loss on the pulsations self-consistently into account. We find that the pulsations may induce mass-loss rates of ~1e-4 - 1e-2 Msun/yr shortly before the explosions, which may create a dense circumstellar medium. Our results show that very massive stars with dense circumstellar media may originate from a wider initial mass range than that of pulsational pair-instability supernovae. The extreme mass loss will cease when so much of the hydrogen-rich envelope is lost that the star becomes more comp...

  5. Research and Manufacture of a Big-lead Variable-pitch Screw

    Institute of Scientific and Technical Information of China (English)

    揭晓; 覃岭

    2011-01-01

    A process analysis of the big-lead, variable-pitch screw was carried out, a rational NC program was written, and a qualified part was produced, satisfying the customer's requirements.

  6. Loss of FTO antagonises Wnt signaling and leads to developmental defects associated with ciliopathies.

    Directory of Open Access Journals (Sweden)

    Daniel P S Osborn

    Full Text Available Common intronic variants in the human fat mass and obesity-associated gene (FTO) are found to be associated with an increased risk of obesity. Overexpression of FTO correlates with increased food intake and obesity, whilst loss-of-function results in lethality and severe developmental defects. Despite intense scientific discussion around the role of FTO in energy metabolism, the function of FTO during development remains undefined. Here, we show that loss of Fto leads to developmental defects such as growth retardation, craniofacial dysmorphism and aberrant neural crest cell migration in zebrafish. We find that the important developmental pathway Wnt is compromised in the absence of FTO, both in vivo (zebrafish) and in vitro (Fto(-/-) MEFs and HEK293T). Canonical Wnt signalling is down-regulated by abrogated β-Catenin translocation to the nucleus, whilst the non-canonical Wnt/Ca(2+) pathway is activated via its key signal mediators CaMKII and PKCδ. Moreover, we demonstrate that loss of Fto results in short, absent or disorganised cilia, leading to situs inversus, renal cystogenesis, neural crest cell defects and microcephaly in zebrafish. Congruently, Fto knockout mice display aberrant tissue-specific cilia. These data identify FTO as a protein regulator of the balanced activation between canonical and non-canonical branches of the Wnt pathway. Furthermore, we present the first evidence that FTO plays a role in development and cilia formation/function.

  7. Preventing mitochondrial fission impairs mitochondrial function and leads to loss of mitochondrial DNA.

    Directory of Open Access Journals (Sweden)

    Philippe A Parone

    Full Text Available Mitochondria form a highly dynamic tubular network, the morphology of which is regulated by frequent fission and fusion events. However, the role of mitochondrial fission in homeostasis of the organelle is still unknown. Here we report that preventing mitochondrial fission, by down-regulating expression of Drp1 in mammalian cells, leads to a loss of mitochondrial DNA and a decrease of mitochondrial respiration coupled to an increase in the levels of cellular reactive oxygen species (ROS). At the cellular level, mitochondrial dysfunction resulting from the lack of fission leads to a drop in the levels of cellular ATP, an inhibition of cell proliferation and an increase in autophagy. In conclusion, we propose that mitochondrial fission is required for preservation of mitochondrial function and thereby for maintenance of cellular homeostasis.

  8. Loss of ATF2 function leads to cranial motoneuron degeneration during embryonic mouse development.

    Directory of Open Access Journals (Sweden)

    Julien Ackermann

    Full Text Available The AP-1 family transcription factor ATF2 is essential for development and tissue maintenance in mammals. In particular, ATF2 is highly expressed and activated in the brain and previous studies using mouse knockouts have confirmed its requirement in the cerebellum as well as in vestibular sense organs. Here we present the analysis of the requirement for ATF2 in CNS development in mouse embryos, specifically in the brainstem. We discovered that neuron-specific inactivation of ATF2 leads to significant loss of motoneurons of the hypoglossal, abducens and facial nuclei. While the generation of ATF2 mutant motoneurons appears normal during early development, they undergo caspase-dependent and independent cell death during later embryonic and foetal stages. The loss of these motoneurons correlates with increased levels of stress activated MAP kinases, JNK and p38, as well as aberrant accumulation of phosphorylated neurofilament proteins, NF-H and NF-M, known substrates for these kinases. This, together with other neuropathological phenotypes, including aberrant vacuolisation and lipid accumulation, indicates that deficiency in ATF2 leads to neurodegeneration of subsets of somatic and visceral motoneurons of the brainstem. It also confirms that ATF2 has a critical role in limiting the activities of stress kinases JNK and p38 which are potent inducers of cell death in the CNS.

  9. Knockdown of cytosolic glutaredoxin 1 leads to loss of mitochondrial membrane potential: implication in neurodegenerative diseases.

    Directory of Open Access Journals (Sweden)

    Uzma Saeed

    Full Text Available Mitochondrial dysfunction, including that caused by oxidative stress, has been implicated in the pathogenesis of neurodegenerative diseases. Glutaredoxin 1 (Grx1), a cytosolic thiol disulfide oxido-reductase, reduces glutathionylated proteins to protein thiols and helps maintain the redox status of proteins during oxidative stress. Grx1 downregulation aggravates mitochondrial dysfunction in animal models of neurodegenerative diseases, such as Parkinson's and motor neuron disease. We examined the mechanism underlying the regulation of mitochondrial function by Grx1. Downregulation of Grx1 by shRNA results in loss of mitochondrial membrane potential (MMP), which is prevented by the thiol antioxidant alpha-lipoic acid or by cyclosporine A, an inhibitor of mitochondrial permeability transition. The thiol groups of the voltage-dependent anion channel (VDAC), an outer-membrane protein in mitochondria, but not of adenine nucleotide translocase (ANT), an inner-membrane protein, are oxidized when Grx1 is downregulated. We then examined the effect of beta-N-oxalyl amino-L-alanine (L-BOAA), an excitatory amino acid implicated in neurolathyrism (a type of motor neuron disease), which causes mitochondrial dysfunction. Exposure of cells to L-BOAA resulted in loss of MMP, which was prevented by overexpression of Grx1. Grx1 expression is regulated by estrogen in the CNS, and treatment of SH-SY5Y cells with estrogen upregulated Grx1 and protected from L-BOAA-mediated MMP loss. Our studies demonstrate that Grx1, a cytosolic oxido-reductase, helps maintain mitochondrial integrity and prevents MMP loss caused by oxidative insult. Further, downregulation of Grx1 leads to mitochondrial dysfunction through oxidative modification of the outer-membrane protein VDAC, providing support for the critical role of Grx1 in maintenance of MMP.

  10. Fuel rod mechanical deformation during the PBF/LOFT lead rod loss-of-coolant experiments

    International Nuclear Information System (INIS)

    Results of four PBF/LOFT Lead Rod (LLR) sequential blowdown tests conducted in the Power Burst Facility (PBF) are presented. Each test employed four separately shrouded fuel rods. The primary objective of the test series was to evaluate the extent of mechanical deformation that would be expected to occur to low pressure (0.1 MPa), light water reactor design fuel rods when subjected to a series of double ended cold leg break loss-of-coolant accident (LOCA) tests, and to determine whether subjecting these deformed fuel rods to subsequent testing would result in rod failure. The extent of mechanical deformation (buckling, collapse, or waisting of the cladding) was evaluated by comparison of cladding temperature and system pressure measurements with out-of-pile experimental data, and by posttest visual examinations and cladding diametral measurements

  11. Neospora caninum is the leading cause of bovine fetal loss in British Columbia, Canada.

    Science.gov (United States)

    Wilson, Devon J; Orsel, Karin; Waddington, Josh; Rajeev, Malavika; Sweeny, Amy R; Joseph, Tomy; Grigg, Michael E; Raverty, Stephen A

    2016-03-15

    The protozoan pathogen Neospora caninum is recognized as a leading cause of infectious abortions in cattle worldwide. To evaluate the impact of neosporosis on dairy and beef herd production, a retrospective, longitudinal study was performed to identify neosporosis alongside other causes of fetal abortion in British Columbia, Canada. Retrospective analysis of pathology records of bovine fetal submissions to the Animal Health Centre, Abbotsford, British Columbia, a provincial veterinary diagnostic laboratory, from January 2007 to July 2013 identified 182 abortion cases (passive surveillance). From July 2013 to May 2014, an active surveillance program identified a further 54 abortion cases from dairy farmers in the Upper Fraser Valley, British Columbia. Of the total 236 fetal submissions analyzed, N. caninum was diagnosed in 18.2% of cases, making it the most commonly identified infectious agent associated with fetal loss. During active surveillance, N. caninum was associated with 41% of fetuses submitted compared to 13.3% during passive surveillance (p < …); fetuses of … age had the highest prevalence of N. caninum. There was no significant association with dam parity. N. caninum was diagnosed in every year except 2009 and cases were geographically widespread throughout the province. Furthermore, the active surveillance program demonstrates that N. caninum is highly prevalent in the Upper Fraser Valley and is a major causal agent of production losses in this dairy intensive region. PMID:26872927

  12. FORUM: Indirect leakage leads to a failure of avoided loss biodiversity offsetting

    OpenAIRE

    Moilanen, Atte; Laitila, Jussi

    2015-01-01

    Summary Biodiversity offsetting has quickly gained political support all around the world. Avoided loss (averted risk) offsetting means compensation for ecological damage via averted loss of anticipated impacts through the removal of threatening processes in compensation areas. Leakage means the phenomenon of environmentally damaging activity relocating elsewhere after being stopped locally by avoided loss offsetting. Indirect leakage means that locally avoided losses displace to other admini...

  13. Transgenic n-3 PUFAs enrichment leads to weight loss via modulating neuropeptides in hypothalamus.

    Science.gov (United States)

    Ma, Shuangshuang; Ge, Yinlin; Gai, Xiaoying; Xue, Meilan; Li, Ning; Kang, Jingxuan; Wan, Jianbo; Zhang, Jinyu

    2016-01-12

    Body weight is related to fat mass, which is associated with obesity. Our study explored the effect of the fat-1 gene on body weight in fat-1 transgenic mice. In the present study, we observed that the weight/length ratio of fat-1 transgenic mice was lower than that of wild-type mice. The serum levels of triglycerides (TG), cholesterol (CT), high-density lipoprotein cholesterol (HDL-c), low-density lipoprotein cholesterol (LDL-c) and blood glucose (BG) in fat-1 transgenic mice were all decreased. The weights of peri-bowel fat, perirenal fat and peri-testicular fat in fat-1 transgenic mice were reduced. We hypothesized that the increase in n-3 PUFAs might alter the expression of hypothalamic neuropeptide genes and lead to loss of body weight in fat-1 transgenic mice. Therefore, we measured mRNA levels of appetite neuropeptides, Neuropeptide Y (NPY), Agouti-related peptide (AgRP), Proopiomelanocortin (POMC), Cocaine and amphetamine regulated transcript (CART), ghrelin and nesfatin-1 in hypothalamus by real-time PCR. Compared with wild-type mice, the mRNA levels of CART, POMC and ghrelin were higher, while the mRNA levels of NPY, AgRP and nesfatin-1 were lower in fat-1 transgenic mice. The results indicate that the fat-1 gene, or n-3 PUFAs, participates in the regulation of body weight, and the mechanism of this phenomenon involves the expression of appetite neuropeptides and lipoproteins in fat-1 transgenic mice. PMID:26610903

  14. Wildlife and Wildlife Habitat Loss Assessment at Detroit Big Cliff Dam and Reservoir Project, North Santiam River, Oregon, 1985 Final Report.

    Energy Technology Data Exchange (ETDEWEB)

    Noyes, J.H.

    1985-02-01

    A habitat based assessment was conducted of the US Army Corps of Engineers' Detroit/Big Cliff Dam and Reservoir Project (Detroit Project) on the North Santiam River, Oregon, to determine losses or gains resulting from the development and operation of the hydroelectric-related components of the project. Preconstruction, postconstruction, and recent vegetation cover types at the project site were mapped based on aerial photographs from 1939, 1956, and 1979, respectively. Vegetation cover types were identified within the affected area and acreages of each type at each time period were determined. Ten wildlife target species were selected to represent a cross-section of species groups affected by the project. An interagency team evaluated the suitability of the habitat to support the target species at each time period. An evaluation procedure which accounted for both the quantity and quality of habitat was used to aid in assessing impacts resulting from the project. The Detroit Project extensively altered or affected 6,324 acres of land and river in the North Santiam River drainage. Impacts to wildlife centered on the loss of 1,608 acres of conifer forest and 620 acres of riparian habitat. Impacts resulting from the Detroit Project included the loss of winter range for black-tailed deer and Roosevelt elk, and the loss of year-round habitat for deer, river otter, beaver, ruffed grouse, pileated woodpecker, spotted owl, and many other wildlife species. Bald eagles and ospreys benefited from an increase in foraging habitat. The potential of the affected area to support wildlife was greatly altered as a result of the Detroit Project. Losses or gains in the potential of the habitat to support wildlife will persist over the life of the project.
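    The quantity-times-quality accounting described above (habitat units as area weighted by a suitability index) can be sketched as follows; the acreages and index values are illustrative placeholders, not the report's figures:

```python
# Habitat-unit accounting: HUs = acres x habitat suitability index (HSI, 0-1).
# All numbers below are hypothetical, for illustration only.

def habitat_units(acres, hsi):
    """Habitat units for one cover type at one time period."""
    if not 0.0 <= hsi <= 1.0:
        raise ValueError("HSI must lie in [0, 1]")
    return acres * hsi

# hypothetical conifer-forest figures before and after dam construction
pre = habitat_units(1608, 0.8)    # preconstruction cover, good suitability
post = habitat_units(400, 0.5)    # remnant cover, degraded suitability
net_change = post - pre           # negative value = net habitat loss
```

A net change summed over cover types and target species would then support the kind of loss/gain statement the assessment reports.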

  15. Systematic Evaluation of Dissolved Lead Sorption Losses to Particulate Syringe Filter Materials

    Science.gov (United States)

    Distinguishing between soluble and particulate lead in drinking water is useful in understanding the mechanism of lead release and identifying remedial action. Typically, particulate lead is defined as the amount of lead removed by a 0.45 µm filter. Unfortunately, there is little...

  16. The association between low levels of lead in blood and occupational noise-induced hearing loss in steel workers

    International Nuclear Information System (INIS)

    As the use of leaded gasoline has ceased in the last decade, background lead exposure has generally been reduced. The aim of this study was to examine the effect of low-level lead exposure on human hearing loss. This study was conducted in a steel plant and 412 workers were recruited from all over the plant. Personal information such as demographics and work history was obtained through a questionnaire. All subjects took part in an audiometric examination of hearing thresholds, for both ears, with air-conducted pure tones at frequencies of 500, 1000, 2000, 3000, 4000, 6000 and 8000 Hz. Subjects' blood samples were collected and analyzed for levels of manganese, copper, zinc, arsenic, cadmium and lead by inductively coupled plasma mass spectrometry. Meanwhile, noise levels in different working zones were determined using a sound level meter with an A-weighting network. Only subjects with a hearing loss difference of no more than 15 dB between the two ears and no congenital abnormalities were included in further data analysis. Lead was the only metal in blood found to be significantly correlated with hearing loss at most tested sound frequencies (p < 0.05 to p < 0.0001). After adjustment for age and noise level, logistic regression analysis indicated that elevated blood lead over 7 μg/dL was significantly associated with hearing loss at the sound frequencies of 3000 through 8000 Hz, with odds ratios ranging from 3.06 to 6.26 (p < 0.05 ∼ p < 0.005). We concluded that elevated blood lead at levels below 10 μg/dL might enhance noise-induced hearing loss. Future research needs to further explore the detailed mechanism.
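    The adjusted odds ratios reported here come from logistic regression; the core odds-ratio arithmetic can be illustrated with a toy unadjusted 2×2 table (the counts below are hypothetical, not the study's data):

```python
import math

def odds_ratio(exposed_cases, exposed_noncases, unexposed_cases, unexposed_noncases):
    """Unadjusted odds ratio from a 2x2 exposure/outcome table."""
    return (exposed_cases * unexposed_noncases) / (exposed_noncases * unexposed_cases)

# hypothetical counts: hearing loss vs. blood lead over 7 ug/dL
or_estimate = odds_ratio(30, 20, 15, 40)   # (30*40)/(20*15) = 4.0

# 95% confidence interval on the log-odds scale (Woolf's method)
se = math.sqrt(1/30 + 1/20 + 1/15 + 1/40)
ci_low = math.exp(math.log(or_estimate) - 1.96 * se)
ci_high = math.exp(math.log(or_estimate) + 1.96 * se)
```

In the study itself the ratios are additionally adjusted for age and noise level, which requires the full regression model rather than a single table.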

  17. Multiple Events Lead to Dendritic Spine Loss in Triple Transgenic Alzheimer's Disease Mice

    OpenAIRE

    Tobias Bittner; Martin Fuhrmann; Steffen Burgold; Ochs, Simon M.; Nadine Hoffmann; Gerda Mitteregger; Hans Kretzschmar; LaFerla, Frank M.; Jochen Herms

    2010-01-01

    The pathology of Alzheimer's disease (AD) is characterized by the accumulation of amyloid-β (Aβ) peptide, hyperphosphorylated tau protein, neuronal death, and synaptic loss. By means of long-term two-photon in vivo imaging and confocal imaging, we characterized the spatio-temporal pattern of dendritic spine loss for the first time in 3xTg-AD mice. These mice exhibit an early loss of layer III neurons at 4 months of age, at a time when only soluble Aβ is abundant. Later on, dendritic spines ar...

  18. Big Society, Big Deal?

    Science.gov (United States)

    Thomson, Alastair

    2011-01-01

    Political leaders like to put forward guiding ideas or themes which pull their individual decisions into a broader narrative. For John Major it was Back to Basics, for Tony Blair it was the Third Way and for David Cameron it is the Big Society. While Mr. Blair relied on Lord Giddens to add intellectual weight to his idea, Mr. Cameron's legacy idea…

  19. Continuous Feedings of Fortified Human Milk Lead to Nutrient Losses of Fat, Calcium and Phosphorous

    OpenAIRE

    Abrams, Steven A.; Veit, Lauren E.; Maria Hamzo; Rogers, Stefanie P; Hicks, Penni D

    2010-01-01

    Substantial losses of nutrients may occur during tube (gavage) feeding of fortified human milk. Our objective was to compare the losses of key macronutrients and minerals based on method of fortification and gavage feeding method. We used clinically available gavage feeding systems and measured pre- and post-feeding (end-point) nutrient content of calcium (Ca), phosphorus (Phos), protein, and fat. Comparisons were made between continuous, gravity bolus, and 30-minute infusion pump feeding sys...

  20. Multiple events lead to dendritic spine loss in triple transgenic Alzheimer's disease mice.

    Directory of Open Access Journals (Sweden)

    Tobias Bittner

    Full Text Available The pathology of Alzheimer's disease (AD) is characterized by the accumulation of amyloid-β (Aβ) peptide, hyperphosphorylated tau protein, neuronal death, and synaptic loss. By means of long-term two-photon in vivo imaging and confocal imaging, we characterized the spatio-temporal pattern of dendritic spine loss for the first time in 3xTg-AD mice. These mice exhibit an early loss of layer III neurons at 4 months of age, at a time when only soluble Aβ is abundant. Later on, dendritic spines are lost around amyloid plaques once they appear at 13 months of age. At the same age, we observed spine loss also in areas apart from amyloid plaques. This plaque-independent spine loss manifests exclusively at dystrophic dendrites that accumulate both soluble Aβ and hyperphosphorylated tau intracellularly. Collectively, our data show that three spatio-temporally independent events contribute to a net loss of dendritic spines. These events coincided either with the occurrence of intracellular soluble or extracellular fibrillar Aβ alone, or the combination of intracellular soluble Aβ and hyperphosphorylated tau.

  1. Lead

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-12-31

    This is one of a series of reports on industrial pollutants made by the Expert Panel on Air Quality Standards to advise the United Kingdom Government on air quality standards. It describes the main sources of lead exposure, including the relative contributions of lead in the air and lead in the diet, and the methods by which lead is measured in air. The Panel also considers the airborne concentrations recorded to date in the United Kingdom, the ways in which lead is handled by the body, and its toxic effects on people. The dominant source of airborne lead is petrol combustion. Other sources include coal combustion, the production of non-ferrous metals, and waste treatment and disposal. The justification of an air quality standard for lead is set down. The Panel recommends an air quality standard for lead in the United Kingdom of 0.25 µg/m³ measured as an annual average. This is intended to protect young children, the group most vulnerable to impairment of brain function. 17 refs., 3 figs., 2 tabs.

  2. Loss of P2X7 nucleotide receptor function leads to abnormal fat distribution in mice

    OpenAIRE

    Beaucage, Kim L.; Xiao, Andrew; Pollmann, Steven I.; Grol, Matthew W.; Beach, Ryan J.; Holdsworth, David W.; Sims, Stephen M.; Darling, Mark R.; Dixon, S. Jeffrey

    2013-01-01

    The P2X7 receptor is an ATP-gated cation channel expressed by a number of cell types. We have shown previously that disruption of P2X7 receptor function results in downregulation of osteogenic markers and upregulation of adipogenic markers in calvarial cell cultures. In the present study, we assessed whether loss of P2X7 receptor function results in changes to adipocyte distribution and lipid accumulation in vivo. Male P2X7 loss-of-function (KO) mice exhibited significantly greater body weigh...

  3. Chronic skin inflammation leads to bone loss by IL-17-mediated inhibition of Wnt signaling in osteoblasts.

    Science.gov (United States)

    Uluçkan, Özge; Jimenez, Maria; Karbach, Susanne; Jeschke, Anke; Graña, Osvaldo; Keller, Johannes; Busse, Björn; Croxford, Andrew L; Finzel, Stephanie; Koenders, Marije; van den Berg, Wim; Schinke, Thorsten; Amling, Michael; Waisman, Ari; Schett, Georg; Wagner, Erwin F

    2016-03-16

    Inflammation has important roles in tissue regeneration, autoimmunity, and cancer. Different inflammatory stimuli can lead to bone loss by mechanisms that are not well understood. We show that skin inflammation induces bone loss in mice and humans. In psoriasis, one of the prototypic IL-17A-mediated inflammatory human skin diseases, low bone formation and bone loss correlated with increased serum IL-17A levels. Similarly, in two mouse models with chronic IL-17A-mediated skin inflammation, K14-IL17A(ind) and JunB(Δep), strong inhibition of bone formation was observed, different from classical inflammatory bone loss where osteoclast activation leads to bone degradation. We show that under inflammatory conditions, skin-resident cells such as keratinocytes, γδ T cells, and innate lymphoid cells were able to express IL-17A, which acted systemically to inhibit osteoblast and osteocyte function by a mechanism involving Wnt signaling. IL-17A led to decreased Wnt signaling in vitro, and importantly, pharmacological blockade of IL-17A rescued Wnt target gene expression and bone formation in vivo. These data provide a mechanism where IL-17A affects bone formation by regulating Wnt signaling in osteoblasts and osteocytes. This study suggests that using IL-17A blocking agents in psoriasis could be beneficial against bone loss in these patients. PMID:27089206

  4. Loss of Sip1 leads to migration defects and retention of ectodermal markers during lens development

    OpenAIRE

    Manthey, Abby L.; Salil A. Lachke; FitzGerald, Paul G.; Mason, Robert W.; Scheiblin, David A.; McDonald, John H.; Duncan, Melinda K.

    2013-01-01

    SIP1 encodes a DNA-binding transcription factor that regulates multiple developmental processes, as highlighted by the pleiotropic defects observed in Mowat-Wilson Syndrome, which results from mutations in this gene. Further, in adults, dysregulated SIP1 expression has been implicated in both cancer and fibrotic diseases, where it functionally links TGFβ signaling to the loss of epithelial cell characteristics and gene expression. In the ocular lens, an epithelial tissue important for vision,...

  5. The loss of local HGF, an endogenous gastrotrophic factor, leads to mucosal injuries in the stomach of mice

    International Nuclear Information System (INIS)

    The stomach is constantly exposed to mechanical and chemical stresses. Under persistent damage, epithelial cell proliferation is required to maintain mucosal integrity. Nevertheless, which ligand system(s) is physiologically involved in gastric defense remains unclear. Herein, we provide evidence that HGF is a key 'natural ligand' for reversing gastric injury. The injection of cisplatin in mice led to the loss of HGF in the gastric interstitium, associated with a decrease in proliferating epithelium and the progression of mucositis. When c-Met tyrosine phosphorylation was abolished by anti-HGF IgG, mucosal cell proliferation became faint, leading to delayed recovery from mucositis, and vice versa in cases of HGF supplementation. Our findings indicate that: (1) HGF/c-Met signaling on mucosa is needed to restore gastric injuries; and (2) the loss of local HGF leads to the manifestation of gastric lesions. This study provides a rationale that explains why HGF supplementation is useful for reversing gastric diseases.

  6. Big Data Era Leads the Transformation of Financial Reporting

    Institute of Scientific and Technical Information of China (English)

    管萍; 宋良荣

    2014-01-01

    The improvement of financial reporting is inseparable from the development of information technology. Extensible Business Reporting Language (XBRL) has promoted the development of financial reporting in the internet age, and the technological and conceptual transformations of the big data era will again lead financial reporting to new breakthroughs, with significant impacts on the reconstruction of the financial reporting model, the improvement of the information disclosure system, and internet financial reporting.

  7. Loss of intercellular adhesion leads to differential accumulation of hypericin in bladder cancer

    Science.gov (United States)

    Lucky, S. Sasidharan; Bhuvaneswari, Ramaswamy; Chin, William W. L.; Lau, Weber K. O.; Olivo, Malini C. D.

    2009-06-01

    Photodynamic diagnosis (PDD) exploits the photoactive nature of certain compounds, namely photosensitizers, in order to enhance the visual demarcation between normal and neoplastic tissue. Hypericin is one such potent photosensitizer that preferentially accumulates in neoplastic tissue and fluoresces in the visible spectrum when illuminated with light of an appropriate wavelength. In our study, we investigated the role of E-cadherin in the selective permeation of hypericin in bladder cancer tissues. Clinical studies were done on a series of 43 histologically graded bladder cancer biopsy specimens, obtained from 28 patients who received intravesical instillations with 8 μM hypericin solution for at least 2 hours. Immunohistochemical staining was used to assess the expression of E-cadherin in the cryosectioned tissues. Hypericin uptake was examined by fluorescence microscopy. Immunohistochemical staining showed a clear expression of E-cadherin along the urothelial lining of the normal and pre-malignant tissues. Partial expression of these cell adhesion molecules was still observed in malignant tissues; however, there was a loss of expression to variable extents along the urothelium. Thus, loss of intercellular adhesion can be associated with enhanced hypericin permeation through paracellular diffusion.

  8. Modeling of lead-acid battery capacity loss in a photovoltaic application

    Energy Technology Data Exchange (ETDEWEB)

    JUNGST,RUDOLPH G.; URBINA,ANGEL; PAEZ,THOMAS L.

    2000-04-12

    The authors have developed a model for the probabilistic behavior of a rechargeable battery acting as the energy storage component in a photovoltaic power supply system. Stochastic and deterministic models are created to simulate the behavior of the system components. The components are the solar resource, the photovoltaic power supply system, the rechargeable battery, and a load. One focus of this research is to model battery state of charge and battery capacity as a function of time. The capacity damage effect that occurs during deep discharge is introduced via a non-positive function of duration and depth of deep discharge events. Because the form of this function is unknown and varies with battery type, the authors model it with an artificial neural network (ANN) whose parameters are to be trained with experimental data. The battery capacity loss model will be described and a numerical example will be presented showing the predicted battery life under different PV system use scenarios.
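    A minimal sketch of the modeling idea (a small feedforward network whose output is constrained to be non-positive, standing in for the capacity-damage ANN; the weights here are random placeholders, not parameters trained on battery test data):

```python
import numpy as np

rng = np.random.default_rng(0)

# One hidden layer standing in for the trained ANN; in practice W1 and W2
# would be fit to measured capacity-loss data from deep-discharge experiments.
W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)

def capacity_damage(depth_of_discharge, duration_h):
    """Non-positive capacity change for one deep-discharge event.

    Inputs are the fractional depth of discharge (0-1) and the event
    duration in hours, matching the two-variable damage function in the text.
    """
    x = np.array([depth_of_discharge, duration_h], dtype=float)
    hidden = np.tanh(x @ W1 + b1)
    raw = (hidden @ W2 + b2)[0]
    return -np.log1p(np.exp(raw))   # -softplus keeps the output <= 0

# accumulate damage over a sequence of hypothetical deep-discharge events
capacity = 1.0   # normalized battery capacity
for dod, hours in [(0.8, 4.0), (0.9, 6.0), (0.7, 3.0)]:
    capacity += 0.01 * capacity_damage(dod, hours)
```

In the actual model the network would be trained on experimental data and embedded in the stochastic simulation of the PV system.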

  9. Loss of Cell-Substrate Adhesion Leads to Periodic Shape Oscillations in Fibroblasts

    CERN Document Server

    Pullarkat, P A

    2006-01-01

    We report the phenomenon of periodic shape oscillations occurring in 3T3 fibroblasts merely as a consequence of a loss of cell-substrate adhesion. The oscillatory behavior can last many hours at a constant frequency, and can be switched off and on using chemical agents. This robustness allows for the extraction of quantitative data using single cells. We demonstrate that the frequency of oscillations increases with increasing actomyosin contractility. Both the Myosin Light Chain Kinase and the Rho Kinase pathways are shown to operate during this process. Further, we reveal significant similarities between the oscillatory dynamics and the commonly observed phenomenon of blebbing. We show that both these processes cease when extracellular calcium is depleted or when stretch-activated calcium channels are blocked. This, along with the fact that these dynamical processes require actomyosin contractility, points towards strong similarities in the respective mechanisms. Finally, we speculate on a possible mec...

  10. Loss of NAD Homeostasis Leads to Progressive and Reversible Degeneration of Skeletal Muscle.

    Science.gov (United States)

    Frederick, David W; Loro, Emanuele; Liu, Ling; Davila, Antonio; Chellappa, Karthikeyani; Silverman, Ian M; Quinn, William J; Gosai, Sager J; Tichy, Elisia D; Davis, James G; Mourkioti, Foteini; Gregory, Brian D; Dellinger, Ryan W; Redpath, Philip; Migaud, Marie E; Nakamaru-Ogiso, Eiko; Rabinowitz, Joshua D; Khurana, Tejvir S; Baur, Joseph A

    2016-08-01

    NAD is an obligate co-factor for the catabolism of metabolic fuels in all cell types. However, the availability of NAD in several tissues can become limited during genotoxic stress and the course of natural aging. The point at which NAD restriction imposes functional limitations on tissue physiology remains unknown. We examined this question in murine skeletal muscle by specifically depleting Nampt, an essential enzyme in the NAD salvage pathway. Knockout mice exhibited a dramatic 85% decline in intramuscular NAD content, accompanied by fiber degeneration and progressive loss of both muscle strength and treadmill endurance. Administration of the NAD precursor nicotinamide riboside rapidly ameliorated functional deficits and restored muscle mass despite having only a modest effect on the intramuscular NAD pool. Additionally, lifelong overexpression of Nampt preserved muscle NAD levels and exercise capacity in aged mice, supporting a critical role for tissue-autonomous NAD homeostasis in maintaining muscle mass and function. PMID:27508874

  11. Rhabdomyolysis-Associated Mutations in Human LPIN1 Lead to Loss of Phosphatidic Acid Phosphohydrolase Activity.

    Science.gov (United States)

    Schweitzer, George G; Collier, Sara L; Chen, Zhouji; Eaton, James M; Connolly, Anne M; Bucelli, Robert C; Pestronk, Alan; Harris, Thurl E; Finck, Brian N

    2015-01-01

    Rhabdomyolysis is an acute syndrome due to extensive injury of skeletal muscle. Recurrent rhabdomyolysis is often caused by inborn errors in intermediary metabolism, and recent work has suggested that mutations in the human gene encoding lipin 1 (LPIN1) may be a common cause of recurrent rhabdomyolysis in children. Lipin 1 dephosphorylates phosphatidic acid to form diacylglycerol (phosphatidic acid phosphohydrolase; PAP) and acts as a transcriptional regulatory protein to control metabolic gene expression. Herein, a 3-year-old boy with severe recurrent rhabdomyolysis was determined to be a compound heterozygote for a novel c.1904T>C (p.Leu635Pro) substitution and a previously reported genomic deletion of exons 18-19 (E766-S838_del) in LPIN1. Western blotting with patient muscle biopsy lysates demonstrated a marked reduction in lipin 1 protein, while immunohistochemical staining for lipin 1 showed abnormal subcellular localization. We cloned cDNAs to express recombinant lipin 1 proteins harboring pathogenic mutations and showed that the E766-S838_del allele was not expressed at the RNA or protein level. Lipin 1 p.Leu635Pro was expressed, but the protein was less stable, was aggregated in the cytosol, and was targeted for proteasomal degradation. Another pathogenic single amino acid substitution, lipin 1 p.Arg725His, was well expressed and retained its transcriptional regulatory function. However, both p.Leu635Pro and p.Arg725His proteins were found to be deficient in PAP activity. Kinetic analyses demonstrated a loss of catalysis rather than diminished substrate binding. These data suggest that loss of lipin 1-mediated PAP activity may be involved in the pathogenesis of rhabdomyolysis in lipin 1 deficiency. PMID:25967228

  12. An analysis of cross-sectional differences in big and non-big public accounting firms' audit programs

    NARCIS (Netherlands)

    Blokdijk, J.H. (Hans); Drieenhuizen, F.; Stein, M.T.; Simunic, D.A.

    2006-01-01

    A significant body of prior research has shown that audits by the Big 5 (now Big 4) public accounting firms are quality differentiated relative to non-Big 5 audits. This result can be derived analytically by assuming that Big 5 and non-Big 5 firms face different loss functions for "audit failures" a

  13. DENTAL CARIES LEADING TO PREMATURE LOSS OF BABY TEETH- IMPLICATIONS AND MANAGEMENT

    OpenAIRE

    Rachana Bahuguna; Sansriti Narain; Tapan Singh

    2011-01-01

    Dental caries is a destructive process causing decalcification of the tooth enamel and leading to continued destruction of enamel and dentin, and cavitation of the tooth. Dental caries can occur soon after eruption of the primary teeth, starting at 6 months of age. Primary teeth are present for a reason. One key reason is that they save space for the permanent tooth, which will erupt into its position when the deciduous / primary tooth is lost normally. If a primary tooth (baby or milk tooth...

  14. Loss of arylformamidase with reduced thymidine kinase expression leads to impaired glucose tolerance

    Directory of Open Access Journals (Sweden)

    Alison J. Hugill

    2015-11-01

    Full Text Available Tryptophan metabolites have been linked in observational studies with type 2 diabetes, cognitive disorders, inflammation and immune system regulation. A rate-limiting enzyme in tryptophan conversion is arylformamidase (Afmid), and a double knockout of this gene and thymidine kinase (Tk) has been reported to cause renal failure and abnormal immune system regulation. In order to further investigate possible links between abnormal tryptophan catabolism and diabetes and to examine the effect of single Afmid knockout, we have carried out metabolic phenotyping of an exon 2 Afmid gene knockout. These mice exhibit impaired glucose tolerance, although their insulin sensitivity is unchanged in comparison to wild-type animals. This phenotype results from a defect in glucose-stimulated insulin secretion and these mice show reduced islet mass with age. No evidence of a renal phenotype was found, suggesting that this published phenotype resulted from loss of Tk expression in the double knockout. However, despite specifically removing only exon 2 of Afmid in our experiments we also observed some reduction of Tk expression, possibly due to a regulatory element in this region. In summary, our findings support a link between abnormal tryptophan metabolism and diabetes and highlight beta cell function for further mechanistic analysis.

  15. Loss of GSNOR1 Function Leads to Compromised Auxin Signaling and Polar Auxin Transport.

    Science.gov (United States)

    Shi, Ya-Fei; Wang, Da-Li; Wang, Chao; Culler, Angela Hendrickson; Kreiser, Molly A; Suresh, Jayanti; Cohen, Jerry D; Pan, Jianwei; Baker, Barbara; Liu, Jian-Zhong

    2015-09-01

    Cross talk between phytohormones, nitric oxide (NO), and auxin has been implicated in the control of plant growth and development. Two recent reports indicate that NO promoted auxin signaling but inhibited auxin transport probably through S-nitrosylation. However, genetic evidence for the effect of S-nitrosylation on auxin physiology has been lacking. In this study, we used a genetic approach to understand the broader role of S-nitrosylation in auxin physiology in Arabidopsis. We compared auxin signaling and transport in Col-0 and gsnor1-3, a loss-of-function GSNOR1 mutant defective in protein de-nitrosylation. Our results showed that auxin signaling was impaired in the gsnor1-3 mutant as revealed by significantly reduced DR5-GUS/DR5-GFP accumulation and compromised degradation of AXR3NT-GUS, a useful reporter in interrogating auxin-mediated degradation of Aux/IAA by auxin receptors. In addition, polar auxin transport was compromised in gsnor1-3, which was correlated with universally reduced levels of PIN or GFP-PIN proteins in the roots of the mutant in a manner independent of transcription and 26S proteasome degradation. Our results suggest that S-nitrosylation and GSNOR1-mediated de-nitrosylation contribute to auxin physiology, and impaired auxin signaling and compromised auxin transport are responsible for the auxin-related morphological phenotypes displayed by the gsnor1-3 mutant. PMID:25917173

  16. Predicting short-term weight loss using four leading health behavior change theories

    Directory of Open Access Journals (Sweden)

    Barata José T

    2007-04-01

    Full Text Available Abstract. Background: This study was conceived to analyze how exercise and weight management psychosocial variables, derived from several health behavior change theories, predict weight change in a short-term intervention. The theories under analysis were the Social Cognitive Theory, the Transtheoretical Model, the Theory of Planned Behavior, and Self-Determination Theory. Methods: Subjects were 142 overweight and obese women (BMI = 30.2 ± 3.7 kg/m2; age = 38.3 ± 5.8 y), participating in a 16-week University-based weight control program. Body weight and a comprehensive psychometric battery were assessed at baseline and at program's end. Results: Weight decreased significantly (-3.6 ± 3.4%, p < …). Conclusion: The present models were able to predict 20–30% of variance in short-term weight loss, and changes in weight management self-efficacy accounted for a large share of the predictive power. As expected from previous studies, exercise variables were only moderately associated with short-term outcomes; they are expected to play a larger explanatory role in longer-term results.
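    The "20–30% of variance" figure is an R² statistic; the computation can be illustrated with a toy least-squares fit on synthetic data (the predictor, coefficients, and noise level below are invented for illustration, not taken from the study):

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in data: change in weight-management self-efficacy
# (standardized) vs. percent weight change for 142 participants.
n = 142
self_efficacy_change = rng.normal(size=n)
weight_change = -1.5 * self_efficacy_change + rng.normal(scale=2.5, size=n)

# ordinary least squares via a design matrix with an intercept column
X = np.column_stack([np.ones(n), self_efficacy_change])
beta, *_ = np.linalg.lstsq(X, weight_change, rcond=None)

residuals = weight_change - X @ beta
r_squared = 1.0 - residuals.var() / weight_change.var()
```

With the chosen noise level the synthetic R² is roughly comparable in magnitude to the share of variance the study reports.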

  17. DENTAL CARIES LEADING TO PREMATURE LOSS OF BABY TEETH- IMPLICATIONS AND MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Rachana Bahuguna

    2011-12-01

    Full Text Available Dental caries is a destructive process causing decalcification of the tooth enamel and leading to continued destruction of enamel and dentin, and cavitation of the tooth. Dental caries can occur soon after eruption of the primary teeth, starting at 6 months of age. Primary teeth are present for a reason. One key reason is that they save space for the permanent tooth, which will erupt into its position when the deciduous/primary tooth is lost normally. If a primary tooth (baby or milk tooth) has to be removed early due to, say, an abscess, which is mostly a result of dental caries, a space maintainer may be recommended to save the space. If the space is not preserved, the other teeth may drift, causing difficult-to-treat crowding and orthodontic problems. These "spacers" are placed temporarily, and are not permanent. They are removed when the new tooth (usually a bicuspid) erupts or the abutment teeth become loose.

  18. Human impact on atolls leads to coral loss and community homogenisation: a modeling study.

    Directory of Open Access Journals (Sweden)

    Bernhard M Riegl

    Full Text Available We explore impacts on pristine atolls subjected to anthropogenic near-field (human habitation) and far-field (climate and environmental change) pressure. Using literature data of human impacts on reefs, we parameterize forecast models to evaluate trajectories in coral cover under impact scenarios that primarily act via recruitment and increased mortality of larger corals. From surveys across the Chagos, we investigate the regeneration dynamics of coral populations distant from human habitation after natural disturbances. Using a size-based mathematical model based on a time series of coral community and population data from 1999-2006, we provide hind- and forecast data for coral population dynamics within lagoons and on ocean-facing reefs, verified against monitoring from 1979-2009. Environmental data (currents, temperatures) were used for calibration. The coral community was simplified into growth typologies: branching and encrusting, arborescent, and massive corals. Community patterns observed in the field were influenced by bleaching-related mortality, most notably in 1998. Survival had been highest in deep lagoonal settings, which suggests a refuge. Recruitment levels were higher in lagoons than on ocean-facing reefs. When adding stress by direct human pressure, climate, and environmental change as increased disturbance frequency and modified recruitment and mortality levels (due to eutrophication, overfishing, pollution, heat, acidification, etc.), models suggest steep declines in coral populations and loss of community diversification among habitats. We found it likely that degradation of lagoonal coral populations would impact the regeneration potential of all coral populations, also on ocean-facing reefs, thus decreasing reef resilience on the entire atoll.

  19. Big data leads us into the intelligent era

    Institute of Scientific and Technical Information of China (English)

    张振兴; 牟如玲

    2014-01-01

    With the coming of the era of big data, new intelligent applications of every kind are emerging constantly. This paper examines the real meaning of big data, analyzes its current state and the difficulties it faces, and summarizes the significance of big data analysis; on that basis, it offers suggestions on the techniques and methods of big data analysis and on how to carry out big data work.

  20. Loss of runt-related transcription factor 3 expression leads hepatocellular carcinoma cells to escape apoptosis

    International Nuclear Information System (INIS)

    Runt-related transcription factor 3 (RUNX3) is known as a tumor suppressor gene for gastric cancer and other cancers; this gene may also be involved in the development of hepatocellular carcinoma (HCC). RUNX3 expression was analyzed by immunoblot and immunohistochemistry in HCC cells and tissues, respectively. RUNX3 constructs were introduced into Hep3B cells, which lack endogenous RUNX3. Cell proliferation was measured using the MTT assay and apoptosis was evaluated using DAPI staining. Apoptosis signaling was assessed by immunoblot analysis. RUNX3 protein expression was frequently inactivated in the HCC cell lines (91%) and tissues (90%). RUNX3 expression inhibited 90 ± 8% of cell growth at 72 h in serum-starved Hep3B cells. Forty-eight hours of serum starvation induced apoptosis, and the percentage of apoptotic cells reached 31 ± 4% and 4 ± 1% in RUNX3-expressing Hep3B and control cells, respectively. Apoptotic activity was increased by Bim expression and caspase-3 and caspase-9 activation. RUNX3 expression enhanced serum starvation-induced apoptosis in HCC cell lines. RUNX3 is deleted or weakly expressed in HCC, which leads to tumorigenesis by allowing cells to escape apoptosis

  1. Loss of runt-related transcription factor 3 expression leads hepatocellular carcinoma cells to escape apoptosis

    Directory of Open Access Journals (Sweden)

    Nakamura Shinichiro

    2011-01-01

    Full Text Available Abstract Background Runt-related transcription factor 3 (RUNX3) is known as a tumor suppressor gene for gastric cancer and other cancers; this gene may be involved in the development of hepatocellular carcinoma (HCC). Methods RUNX3 expression was analyzed by immunoblot and immunohistochemistry in HCC cells and tissues, respectively. RUNX3 constructs were introduced into Hep3B cells, which lack endogenous RUNX3. Cell proliferation was measured using the MTT assay and apoptosis was evaluated using DAPI staining. Apoptosis signaling was assessed by immunoblot analysis. Results RUNX3 protein expression was frequently inactivated in the HCC cell lines (91%) and tissues (90%). RUNX3 expression inhibited 90 ± 8% of cell growth at 72 h in serum-starved Hep3B cells. Forty-eight hours of serum starvation induced apoptosis, and the percentage of apoptotic cells reached 31 ± 4% and 4 ± 1% in RUNX3-expressing Hep3B and control cells, respectively. Apoptotic activity was increased by Bim expression and caspase-3 and caspase-9 activation. Conclusion RUNX3 expression enhanced serum starvation-induced apoptosis in HCC cell lines. RUNX3 is deleted or weakly expressed in HCC, which leads to tumorigenesis by allowing cells to escape apoptosis.

  2. Comparison of Inventory Systems with Service, Positive Lead-Time, Loss, and Retrial of Customers

    Directory of Open Access Journals (Sweden)

    A. Krishnamoorthy

    2007-01-01

    Full Text Available We analyze and compare three (s,S) inventory systems with positive service time and retrial of customers. In all of these systems, arrivals of customers form a Poisson process and service times are exponentially distributed. When the inventory level depletes to s due to services, an order of replenishment is placed. The lead time follows an exponential distribution. In model I, an arriving customer, finding the inventory dry or the server busy, proceeds to an orbit with probability γ and is lost forever with probability (1−γ). A retrial customer from the orbit, finding the inventory dry or the server busy, returns to the orbit with probability δ and is lost forever with probability (1−δ). In addition to the description in model I, we provide a buffer of varying (finite) capacity equal to the current inventory level for model II, and another of capacity equal to the maximum inventory level S for model III. In models II and III, an arriving customer, finding the buffer full, proceeds to an orbit with probability γ and is lost forever with probability (1−γ). A retrial customer from the orbit, finding the buffer full, returns to the orbit with probability δ and is lost forever with probability (1−δ). In all these models, the inter-retrial times are exponentially distributed with linear rate. Using the matrix-analytic method, we study these inventory models. Some measures of the system performance in the steady state are derived. A suitable cost function is defined for all three cases and analyzed using graphical illustrations.
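The dynamics of model I above form a continuous-time Markov chain, so they can be sketched with a Gillespie-style simulation. The following is a minimal illustration only: the numeric rates are assumed values, and replenishment is modeled as order-up-to-S, which is one common reading of an (s,S) policy rather than anything stated in the abstract.

```python
import random

def simulate_model_i(s=2, S=5, lam=1.0, mu=1.5, beta=0.5,
                     gamma=0.6, delta=0.7, theta=0.4,
                     horizon=5000.0, seed=7):
    """Gillespie-style sketch of model I: (s,S) inventory with Poisson
    arrivals (rate lam), exponential service (mu) and lead time (beta),
    and an orbit of retrial customers with linear retrial rate orbit*theta.
    All numeric parameter values are illustrative assumptions."""
    rng = random.Random(seed)
    t, inv, busy, orbit, pending = 0.0, S, False, 0, False
    served = lost = 0
    while t < horizon:
        # enabled transitions and their exponential rates
        rates = [("arrival", lam)]
        if busy:
            rates.append(("service", mu))
        if pending:
            rates.append(("replenish", beta))
        if orbit:
            rates.append(("retrial", orbit * theta))
        total = sum(r for _, r in rates)
        t += rng.expovariate(total)
        u = rng.random() * total
        for event, r in rates:          # pick the next event
            if u < r:
                break
            u -= r
        blocked = (inv == 0) or busy     # "inventory dry or server busy"
        if event == "arrival":
            if blocked:
                if rng.random() < gamma:
                    orbit += 1           # joins the orbit
                else:
                    lost += 1            # lost forever with prob 1-gamma
            else:
                busy = True              # enters service immediately
        elif event == "retrial":
            if blocked:
                if rng.random() >= delta:
                    orbit -= 1           # lost forever with prob 1-delta
                    lost += 1
                # else: returns to the orbit (state unchanged)
            else:
                orbit -= 1
                busy = True
        elif event == "service":
            busy = False
            inv -= 1                     # each service consumes one item
            served += 1
            if inv == s and not pending:
                pending = True           # place a replenishment order
        else:                            # replenishment after lead time
            inv = S                      # order-up-to-S assumption
            pending = False
    return served, lost, inv
```

Running the sketch and varying γ, δ, or θ gives a rough feel for the loss/retrial trade-off that the paper's cost function formalizes via the matrix-analytic method.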

  3. Loss of Frataxin induces iron toxicity, sphingolipid synthesis, and Pdk1/Mef2 activation, leading to neurodegeneration

    Science.gov (United States)

    Chen, Kuchuan; Lin, Guang; Haelterman, Nele A; Ho, Tammy Szu-Yu; Li, Tongchao; Li, Zhihong; Duraine, Lita; Graham, Brett H; Jaiswal, Manish; Yamamoto, Shinya; Rasband, Matthew N; Bellen, Hugo J

    2016-01-01

    Mutations in Frataxin (FXN) cause Friedreich’s ataxia (FRDA), a recessive neurodegenerative disorder. Previous studies have proposed that loss of FXN causes mitochondrial dysfunction, which triggers elevated reactive oxygen species (ROS) and leads to the demise of neurons. Here we describe a ROS-independent mechanism that contributes to neurodegeneration in fly FXN mutants. We show that loss of frataxin homolog (fh) in Drosophila leads to iron toxicity, which in turn induces sphingolipid synthesis and ectopically activates 3-phosphoinositide dependent protein kinase-1 (Pdk1) and myocyte enhancer factor-2 (Mef2). Dampening iron toxicity, inhibiting sphingolipid synthesis by Myriocin, or reducing Pdk1 or Mef2 levels all effectively suppress neurodegeneration in fh mutants. Moreover, increasing dihydrosphingosine activates Mef2 activity through PDK1 in a mammalian neuronal cell line, suggesting that the mechanisms are evolutionarily conserved. Our results indicate that an iron/sphingolipid/Pdk1/Mef2 pathway may play a role in FRDA. DOI: http://dx.doi.org/10.7554/eLife.16043.001 PMID:27343351

  4. Big Data Analytics

    Indian Academy of Sciences (India)

    2016-08-01

    The volume and variety of data being generated using computers is doubling every two years. It is estimated that in 2015, 8 Zettabytes (Zetta = 10^21) were generated, consisting mostly of unstructured data such as emails, blogs, Twitter, Facebook posts, images, and videos. This is called big data. It is possible to analyse such huge data collections with clusters of thousands of inexpensive computers to discover patterns in the data that have many applications. But analysing massive amounts of data available in the Internet has the potential of impinging on our privacy. Inappropriate analysis of big data can lead to misleading conclusions. In this article, we explain what big data is, how it is analysed, and give some case studies illustrating the potentials and pitfalls of big data analytics.
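Cluster analysis of the kind described above is commonly organized as a map phase (run independently on each machine's shard of the data) followed by a reduce phase that merges partial results. A single-process sketch of that split, with word counting as the pattern being discovered; the function names are illustrative and not tied to any particular framework:

```python
from collections import Counter
from itertools import chain

def map_phase(doc):
    """Map: emit (word, 1) pairs from one document."""
    return ((word.lower(), 1) for word in doc.split())

def reduce_phase(pairs):
    """Reduce: sum the counts per word, as the shuffle/reduce
    stage of a cluster job would across nodes."""
    counts = Counter()
    for word, n in pairs:
        counts[word] += n
    return counts

# On a real cluster each document (or shard) would be mapped on a
# different inexpensive node; here we simply chain the per-document
# streams to stand in for the shuffle.
docs = ["big data is big", "data about data"]
totals = reduce_phase(chain.from_iterable(map_phase(d) for d in docs))
```

The same two-phase shape scales from this toy example to the thousands-of-machines setting the article describes, because each map call touches only its own shard.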

  5. Big Data

    Directory of Open Access Journals (Sweden)

    Prachi More

    2013-05-01

    Full Text Available The demand for, and spurt in, the collection and accumulation of data has coined the new term "Big Data". Accidentally, incidentally, and by the interaction of people, information, so-called data, is massively generated. This big data is to be used smartly and effectively. Computer scientists, physicists, economists, mathematicians, political scientists, bio-informaticists, sociologists, and many other members of the intelligentsia debate the potential benefits and costs of analysing information from Twitter, Google, Facebook, Wikipedia, and every space where large groups of people leave digital traces and deposit data. Given the rise of Big Data as both a phenomenon and a methodological persuasion, it is time to start critically interrogating this phenomenon, its assumptions, and its biases. Big data, which refers to data sets that are too big to be handled using existing database management tools, is emerging in many important applications, such as Internet search, business informatics, social networks, social media, genomics, and meteorology. Big data presents a grand challenge for database and data analytics research. This paper is a blend of non-technical and introductory-level technical detail, ideal for the novice. We conclude with some technical challenges as well as solutions that can be applied to them. Big Data differs from other data in five characteristics: volume, variety, value, velocity, and complexity. The article will focus on some current and future cases of, and causes for, BIG DATA.

  6. Big Data

    DEFF Research Database (Denmark)

    Madsen, Anders Koed; Flyverbom, Mikkel; Hilbert, Martin;

    2016-01-01

    The claim that big data can revolutionize strategy and governance in the context of international relations is increasingly hard to ignore. Scholars of international political sociology have mainly discussed this development through the themes of security and surveillance. The aim of this paper is to outline a research agenda that can be used to raise a broader set of sociological and practice-oriented questions about the increasing datafication of international relations and politics. First, it proposes a way of conceptualizing big data that is broad enough to open fruitful investigations into the emerging use of big data in these contexts. This conceptualization includes the identification of three moments contained in any big data practice. Second, it suggests a research agenda built around a set of subthemes that each deserve dedicated scrutiny when studying the interplay between big...

  7. Loss of l(3)mbt leads to acquisition of the ping-pong cycle in Drosophila ovarian somatic cells.

    Science.gov (United States)

    Sumiyoshi, Tetsutaro; Sato, Kaoru; Yamamoto, Hitomi; Iwasaki, Yuka W; Siomi, Haruhiko; Siomi, Mikiko C

    2016-07-15

    In Drosophila germ cells, PIWI-interacting RNAs (piRNAs) are amplified through a PIWI slicer-dependent feed-forward loop termed the ping-pong cycle, yielding secondary piRNAs. However, the detailed mechanism remains poorly understood, largely because an ex vivo model system amenable to biochemical analyses has not been available. Here, we show that CRISPR-mediated loss of function of lethal (3) malignant brain tumor [l(3)mbt] leads to ectopic activation of the germ-specific ping-pong cycle in ovarian somatic cells. Perinuclear foci resembling nuage, the ping-pong center, appeared following l(3)mbt mutation. This activation of the ping-pong machinery in cultured cells will greatly facilitate elucidation of the mechanism underlying secondary piRNA biogenesis in Drosophila. PMID:27474440

  8. GNA13 loss in germinal center B cells leads to impaired apoptosis and promotes lymphoma in vivo.

    Science.gov (United States)

    Healy, Jane A; Nugent, Adrienne; Rempel, Rachel E; Moffitt, Andrea B; Davis, Nicholas S; Jiang, Xiaoyu; Shingleton, Jennifer R; Zhang, Jenny; Love, Cassandra; Datta, Jyotishka; McKinney, Matthew E; Tzeng, Tiffany J; Wettschureck, Nina; Offermanns, Stefan; Walzer, Katelyn A; Chi, Jen-Tsan; Rasheed, Suhail A K; Casey, Patrick J; Lossos, Izidore S; Dave, Sandeep S

    2016-06-01

    GNA13 is the most frequently mutated gene in germinal center (GC)-derived B-cell lymphomas, including nearly a quarter of Burkitt lymphoma and GC-derived diffuse large B-cell lymphoma. These mutations occur in a pattern consistent with loss of function. We have modeled the GNA13-deficient state exclusively in GC B cells by crossing the Gna13 conditional knockout mouse strain with the GC-specific AID-Cre transgenic strain. AID-Cre(+) GNA13-deficient mice demonstrate disordered GC architecture and dark zone/light zone distribution in vivo, and demonstrate altered migration behavior, decreased levels of filamentous actin, and attenuated RhoA activity in vitro. We also found that GNA13-deficient mice have increased numbers of GC B cells that display impaired caspase-mediated cell death and increased frequency of somatic hypermutation in the immunoglobulin VH locus. Lastly, GNA13 deficiency, combined with conditional MYC transgene expression in mouse GC B cells, promotes lymphomagenesis. Thus, GNA13 loss is associated with GC B-cell persistence, in which impaired apoptosis and ongoing somatic hypermutation may lead to an increased risk of lymphoma development. PMID:26989201

  9. Big data

    DEFF Research Database (Denmark)

    Madsen, Anders Koed; Ruppert, Evelyn; Flyverbom, Mikkel;

    2016-01-01

    The claim that big data can revolutionize strategy and governance in the context of international relations is increasingly hard to ignore. Scholars of international political sociology have mainly discussed this development through the themes of security and surveillance. The aim of this paper is to outline a research agenda that can be used to raise a broader set of sociological and practice-oriented questions about the increasing datafication of international relations and politics. First, it proposes a way of conceptualizing big data that is broad enough to open fruitful investigations into the emerging use of big data in these contexts. This conceptualization includes the identification of three moments contained in any big data practice. Second, it suggests a research agenda built around a set of subthemes that each deserve dedicated scrutiny when studying the interplay...

  10. Lung Injury Combined with Loss of Regulatory T Cells Leads to De Novo Lung-Restricted Autoimmunity.

    Science.gov (United States)

    Chiu, Stephen; Fernandez, Ramiro; Subramanian, Vijay; Sun, Haiying; DeCamp, Malcolm M; Kreisel, Daniel; Perlman, Harris; Budinger, G R Scott; Mohanakumar, Thalachallour; Bharat, Ankit

    2016-07-01

    More than one third of patients with chronic lung disease undergoing lung transplantation have pre-existing Abs against lung-restricted self-Ags, collagen type V (ColV), and k-α1 tubulin (KAT). These Abs can also develop de novo after lung transplantation and mediate allograft rejection. However, the mechanisms leading to lung-restricted autoimmunity remain unknown. Because these self-Ags are normally sequestered, tissue injury is required to expose them to the immune system. We previously showed that respiratory viruses can induce apoptosis in CD4(+)CD25(+)Foxp3(+) regulatory T cells (Tregs), the key mediators of self-tolerance. Therefore, we hypothesized that lung-tissue injury can lead to lung-restricted immunity if it occurs in a setting when Tregs are impaired. We found that human lung recipients who suffer respiratory viral infections experienced a decrease in peripheral Tregs. Pre-existing lung allograft injury from donor-directed Abs or gastroesophageal reflux led to new ColV and KAT Abs post respiratory viral infection. Similarly, murine parainfluenza (Sendai) respiratory viral infection caused a decrease in Tregs. Intratracheal instillation of anti-MHC class I Abs, but not isotype control, followed by murine Sendai virus infection led to development of Abs against ColV and KAT, but not collagen type II (ColII), a cartilaginous protein. This was associated with expansion of IFN-γ-producing CD4(+) T cells specific to ColV and KAT, but not ColII. Intratracheal anti-MHC class I Abs or hydrochloric acid in Foxp3-DTR mice induced ColV and KAT, but not ColII, immunity, only if Tregs were depleted using diphtheria toxin. We conclude that tissue injury combined with loss of Tregs can lead to lung-tissue-restricted immunity. PMID:27194786

  11. Big Egos in Big Science

    DEFF Research Database (Denmark)

    Jeppesen, Jacob; Vaarst Andersen, Kristina; Lauto, Giancarlo; Valentin, Finn

    utilize a stochastic actor oriented model (SAOM) to analyze both network endogenous mechanisms and individual agency driving the collaboration network, and further whether being a Big Ego in Big Science translates into increasing performance. Our findings suggest that the selection of collaborators is not based on preferential attachment, but rather on an assortativity effect creating not merely a rich-gets-richer effect but an elitist network with high entry barriers. In this acclaimed democratic and collaborative environment of Big Science, the elite closes in on itself. We propose this tendency to be even...

  12. Lack of X-linked inhibitor of apoptosis protein leads to increased apoptosis and tissue loss following neonatal brain injury

    Directory of Open Access Journals (Sweden)

    Tim West

    2009-04-01

    Full Text Available Neurological deficits caused by H-I (hypoxia-ischaemia) to the perinatal brain are often severely debilitating and lead to motor impairment, intellectual disability and seizures. Perinatal brain injury is distinct from adult brain injury in that the developing brain is undergoing the normal process of neuronal elimination by apoptotic cell death and thus the apoptotic machinery is more easily engaged and activated in response to injury. Thus cell death in response to neonatal H-I brain injury is partially due to mitochondrial dysfunction and activation of the apoptosome and caspase 3. An important regulator of the apoptotic response following mitochondrial dysfunction is XIAP (X-linked inhibitor of apoptosis protein). XIAP inhibits apoptosis at the level of caspase 9 and caspase 3 activation, and lack of XIAP in vitro has been shown to lead to increased apoptotic cell death. In the present study we show that mice lacking the gene encoding the XIAP protein have an exacerbated response to neonatal H-I injury as measured by tissue loss at 7 days following the injury. In addition, when the XIAP-deficient mice were studied at 24 h post-H-I we found that the increase in injury correlates with an increased apoptotic response in the XIAP-deficient mice and also with brain imaging changes in T2-weighted magnetic resonance imaging and apparent diffusion coefficient that correspond to the location of apoptotic cell death. These results identify a critical role of XIAP in regulating neuronal apoptosis in vivo and demonstrate the enhanced vulnerability of neurons to injury in the absence of XIAP in the developing brain.

  13. Overexpression of galectin-7 in mouse epidermis leads to loss of cell junctions and defective skin repair.

    Directory of Open Access Journals (Sweden)

    Gaëlle Gendronneau

    Full Text Available The proteins of the galectin family are implicated in many cellular processes, including cell interactions, polarity, intracellular trafficking, and signal transduction. In human and mouse, galectin-7 is almost exclusively expressed in stratified epithelia, notably in the epidermis. Galectin-7 expression is also altered in several human tumors of epithelial origin. This study aimed at dissecting the consequences of galectin-7 overexpression on epidermis structure and functions in vivo. We established transgenic mice specifically overexpressing galectin-7 in the basal epidermal keratinocytes and analyzed the consequences on untreated skin and after UVB irradiation or mechanical injury. The intercellular cohesion of the epidermis is impaired in transgenic animals, with gaps developing between adjacent keratinocytes, associated with loss of adherens junctions. The epidermal architecture is aberrant, with perturbations in the multilayered cellular organisation of the tissue and structural defects in the basement membrane. These transgenic animals displayed a reduced re-epithelialisation potential following a superficial wound, due to a defective collective migration of keratinocytes. Finally, a single mild dose of UVB induced an abnormal apoptotic response in the transgenic epidermis. These results indicate that an excess of galectin-7 leads to a destabilisation of adherens junctions associated with defects in epidermal repair. As this phenotype shares similarities with that of galectin-7 null mutant mice, we conclude that a critical level of this protein is required for maintaining proper epidermal homeostasis. This study brings new insight into the mode of action of galectins in normal and pathological situations.

  14. Big Surveys, Big Data Centres

    Science.gov (United States)

    Schade, D.

    2016-06-01

    Well-designed astronomical surveys are powerful and have consistently been keystones of scientific progress. The Byurakan Surveys using a Schmidt telescope with an objective prism produced a list of about 3000 UV-excess Markarian galaxies, but these objects have stimulated an enormous amount of further study and appear in over 16,000 publications. The CFHT Legacy Surveys used a wide-field imager to cover thousands of square degrees, and those surveys are mentioned in over 1100 publications since 2002. Both ground- and space-based astronomy have been increasing their investments in survey work. Survey instrumentation strives toward fair samples and large sky coverage and therefore strives to produce massive datasets. Thus we are faced with the "big data" problem in astronomy. Survey datasets require specialized approaches to data management, and big data places additional challenging requirements on it. If the term "big data" is defined as data collections that are too large to move, then there are profound implications for the infrastructure that supports big data science. The current model of data centres is obsolete. In the era of big data, the central problem is how to create architectures that effectively manage the relationship between data collections, networks, processing capabilities, and software, given the science requirements of the projects that need to be executed. A standalone data silo cannot support big data science. I'll describe the current efforts of the Canadian community to deal with this situation, and our successes and failures. I'll talk about how we are planning in the next decade to try to create a workable and adaptable solution to support big data science.

  15. Big Science

    International Nuclear Information System (INIS)

    Astronomy, like particle physics, has become Big Science where the demands of front line research can outstrip the science budgets of whole nations. Thus came into being the European Southern Observatory (ESO), founded in 1962 to provide European scientists with a major modern observatory to study the southern sky under optimal conditions

  16. Big Dreams

    Science.gov (United States)

    Benson, Michael T.

    2015-01-01

    The Keen Johnson Building is symbolic of Eastern Kentucky University's historic role as a School of Opportunity. It is a place that has inspired generations of students, many from disadvantaged backgrounds, to dream big dreams. The construction of the Keen Johnson Building was inspired by a desire to create a student union facility that would not…

  17. Big Opportunities and Big Concerns of Big Data in Education

    Science.gov (United States)

    Wang, Yinying

    2016-01-01

    Against the backdrop of the ever-increasing influx of big data, this article examines the opportunities and concerns over big data in education. Specifically, this article first introduces big data, followed by delineating the potential opportunities of using big data in education in two areas: learning analytics and educational policy. Then, the…

  18. Mitotic defects lead to pervasive aneuploidy and accompany loss of RB1 activity in mouse LmnaDhe dermal fibroblasts.

    Directory of Open Access Journals (Sweden)

    C Herbert Pratt

    Full Text Available BACKGROUND: Lamin A (LMNA) is a component of the nuclear lamina and is mutated in several human diseases, including Emery-Dreifuss muscular dystrophy (EDMD; OMIM ID# 181350) and the premature aging syndrome Hutchinson-Gilford progeria syndrome (HGPS; OMIM ID# 176670). Cells from progeria patients exhibit cell cycle defects in both interphase and mitosis. Mouse models with loss of LMNA function have reduced Retinoblastoma protein (RB1) activity, leading to aberrant cell cycle control in interphase, but how mitosis is affected by LMNA is not well understood. RESULTS: We examined the cell cycle and structural phenotypes of cells from mice with the Lmna allele Disheveled hair and ears (Lmna(Dhe)). We found that dermal fibroblasts from heterozygous Lmna(Dhe) (Lmna(Dhe)/+) mice exhibit many phenotypes of human laminopathy cells. These include severe perturbations to the nuclear shape and lamina, increased DNA damage, and slow growth rates due to mitotic delay. Interestingly, Lmna(Dhe)/+ fibroblasts also had reduced levels of hypophosphorylated RB1 and of the non-SMC condensin II subunit D3 (NCAP-D3), a mitosis-specific centromere condensin subunit that depends on RB1 activity. Mitotic checkpoint control by mitotic arrest deficient-like 1 (MAD2L1) was also perturbed in Lmna(Dhe)/+ cells. Lmna(Dhe)/+ fibroblasts were consistently aneuploid and had higher levels of micronuclei and anaphase bridges than normal fibroblasts, consistent with chromosome segregation defects. CONCLUSIONS: These data indicate that RB1 may be a key regulator of cellular phenotype in laminopathy-related cells, and suggest that the effects of LMNA on RB1 include both interphase and mitotic cell cycle control.

  19. Big Data and Big Science

    OpenAIRE

    Di Meglio, Alberto

    2014-01-01

    Brief introduction to the challenges of big data in scientific research based on the work done by the HEP community at CERN and how the CERN openlab promotes collaboration among research institutes and industrial IT companies. Presented at the FutureGov 2014 conference in Singapore.

  20. AC Transport Losses Calculation in a Bi-2223 Current Lead Using Thermal Coupling With an Analytical Formula

    OpenAIRE

    Berger, Kévin; Lévêque, Jean; Netter, Denis; Douine, Bruno; Rezzoug, Abderrezak

    2005-01-01

    When a superconductor is fed with an alternating current, the temperature rise created by the losses tends to reduce the current-carrying capacity. If the amplitude of the current exceeds the value of the critical current, then the losses become particularly high and the thermal heating considerable. In this paper, a numerical and an analytical model which allow AC transport losses to be estimated are presented. These models, which use the expressions of Ic(T) and n(T), are available for any appli...
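The behaviour described above, losses that are modest below the critical current but grow steeply once it is exceeded, follows from the standard E-J power law for superconductors, E = Ec (|I|/Ic)^n sign(I). The sketch below cycle-averages the transport dissipation numerically at a fixed temperature; it is an illustration only, not the paper's coupled thermal model with temperature-dependent Ic(T) and n(T), and all parameter values are assumptions.

```python
import math

def ac_transport_loss(i_peak, i_c, n, e_c=1e-4, freq=50.0, steps=4000):
    """Cycle-averaged transport dissipation per unit length (W/m) for a
    superconductor obeying the E-J power law
        E = e_c * (|I|/i_c)**n * sign(I),
    with a sinusoidal current I(t) = i_peak * sin(2*pi*freq*t).
    Fixed-temperature sketch: e_c, freq, and steps are assumed values."""
    period = 1.0 / freq
    dt = period / steps
    energy = 0.0
    for k in range(steps):
        i = i_peak * math.sin(2 * math.pi * freq * k * dt)
        e = e_c * (abs(i) / i_c) ** n * math.copysign(1.0, i)
        energy += e * i * dt        # instantaneous dissipation E*I >= 0
    return energy / period
```

Because the exponent n is large (typically 20-40 for practical conductors), the computed loss is negligible for i_peak well below i_c and rises sharply once i_peak exceeds it, which is exactly the runaway regime the abstract warns about.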

  1. Overall digitalization: leading innovation in endodontics in the big data era

    Institute of Scientific and Technical Information of China (English)

    凌均棨

    2016-01-01

    In the big data era, digital technologies bring new challenges and opportunities to modern stomatology. The application of modern digital technologies such as cone-beam CT (CBCT), computer-aided design (CAD) and computer-aided manufacturing (CAM), 3D printing, micro-CT, and digital approaches to education provides new concepts and patterns for the diagnosis, treatment, and study of endodontic diseases. This review outlines the application and prospects of commonly used digital technologies in the development of endodontics.

  2. Big Data: Implications for Health System Pharmacy.

    Science.gov (United States)

    Stokes, Laura B; Rogers, Joseph W; Hertig, John B; Weber, Robert J

    2016-07-01

    Big Data refers to datasets that are so large and complex that traditional methods and hardware for collecting, sharing, and analyzing them are inadequate. Big Data that is accurate leads to more confident decision making, improved operational efficiency, and reduced costs. The rapid growth of health care information results in Big Data around health services, treatments, and outcomes, and Big Data can be used to analyze the benefit of health system pharmacy services. The goal of this article is to provide a perspective on how Big Data can be applied to health system pharmacy. It will define Big Data, describe the impact of Big Data on population health, review specific implications of Big Data in health system pharmacy, and describe an approach for pharmacy leaders to effectively use Big Data. A few strategies involved in managing Big Data in health system pharmacy include identifying potential opportunities for Big Data, prioritizing those opportunities, protecting privacy concerns, promoting data transparency, and communicating outcomes. As health care information expands in its content and becomes more integrated, Big Data can enhance the development of patient-centered pharmacy services. PMID:27559194

  3. AC loss in striped (filamentary) YBCO coated conductors leading to designs for high frequencies and field-sweep amplitudes

    International Nuclear Information System (INIS)

    AC losses of YBCO coated conductors are investigated by calculation and experiment for the higher frequency regime. Previous research using YBCO film deposited onto single-crystal substrates demonstrated the effectiveness of 'striping' or filamentary subdivision as a technique for AC loss reduction. As a result of these studies the idea of subdividing YBCO 'coated conductors' (YBCO, overlayer, and even underlayer) into such stripes suggested itself. The suggestion was implemented by burning grooves into samples of coated conductor using laser micromachining. Various machining parameters were investigated, and the striping and slicing characteristics are presented. Loss measurements were performed on unstriped as well as striped samples by the pick-up coil technique at frequencies from 50 to 200 Hz at field sweep amplitudes of up to 150 mT. The effect of soft ferromagnetic Fe shielding was also investigated. The results of the experiments form a starting point for a more general study of reduced-loss coated conductor design (including hysteretic, coupling, normal eddy current, and transport losses) projected into higher ranges of frequency and field-sweep amplitude with transformer and all-cryogenic-motor/generator applications in mind

  4. Prostate Epithelial Pten/TP53 Loss Leads to Transformation of Multipotential Progenitors and Epithelial to Mesenchymal Transition

    OpenAIRE

    Martin, Philip; Liu, Yen-Nien; Pierce, Rachel; Abou-Kheir, Wassim; Casey, Orla; Seng, Victoria; Camacho, Daniel; Simpson, R. Mark; Kelly, Kathleen

    2011-01-01

    Loss of PTEN and loss of TP53 are common genetic aberrations occurring in prostate cancer. PTEN and TP53 contribute to the regulation of self-renewal and differentiation in prostate progenitors, presumptive tumor initiating cells for prostate cancer. Here we characterize the transformed phenotypes resulting from deletion of the Pten and TP53 tumor suppressors in prostate epithelium. Using the PB-Cre4+Ptenfl/flTP53fl/fl model of prostate cancer, we describe the histological and metastatic prop...

  5. Big queues

    CERN Document Server

    Ganesh, Ayalvadi; Wischik, Damon

    2004-01-01

    Big Queues aims to give a simple and elegant account of how large deviations theory can be applied to queueing problems. Large deviations theory is a collection of powerful results and general techniques for studying rare events, and has been applied to queueing problems in a variety of ways. The strengths of large deviations theory are these: it is powerful enough that one can answer many questions which are hard to answer otherwise, and it is general enough that one can draw broad conclusions without relying on special case calculations.
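    The blurb above invokes large deviations theory without stating its core result. As a hedged illustration (standard textbook material, not drawn from the book itself), the central statement for i.i.d. sample means, Cramér's theorem, can be sketched as:

    ```latex
    % Cramér's theorem (sketch): for i.i.d. X_1, ..., X_n with mean \mu,
    % the probability of a rare deviation a > \mu decays exponentially in n,
    % at a rate given by the Legendre transform of the log moment generating function.
    \[
      \lim_{n\to\infty} \frac{1}{n}\log
        \Pr\!\left(\frac{1}{n}\sum_{i=1}^{n} X_i \ge a\right) = -I(a),
      \qquad
      I(a) = \sup_{\theta \in \mathbb{R}}
        \bigl(\theta a - \log \mathbb{E}\, e^{\theta X_1}\bigr).
    \]
    ```

    In queueing applications, rate functions of this kind yield exponential decay estimates for rare events such as large queue lengths or buffer overflows.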

  6. Networking for big data

    CERN Document Server

    Yu, Shui; Misic, Jelena; Shen, Xuemin (Sherman)

    2015-01-01

    Networking for Big Data supplies an unprecedented look at cutting-edge research on the networking and communication aspects of Big Data. Starting with a comprehensive introduction to Big Data and its networking issues, it offers deep technical coverage of both theory and applications.The book is divided into four sections: introduction to Big Data, networking theory and design for Big Data, networking security for Big Data, and platforms and systems for Big Data applications. Focusing on key networking issues in Big Data, the book explains network design and implementation for Big Data. It exa

  7. Really big numbers

    CERN Document Server

    Schwartz, Richard Evan

    2014-01-01

    In the American Mathematical Society's first-ever book for kids (and kids at heart), mathematician and author Richard Evan Schwartz leads math lovers of all ages on an innovative and strikingly illustrated journey through the infinite number system. By means of engaging, imaginative visuals and endearing narration, Schwartz manages the monumental task of presenting the complex concept of Big Numbers in fresh and relatable ways. The book begins with small, easily observable numbers before building up to truly gigantic ones, like a nonillion, a tredecillion, a googol, and even ones too huge for names! Any person, regardless of age, can benefit from reading this book. Readers will find themselves returning to its pages for a very long time, perpetually learning from and growing with the narrative as their knowledge deepens. Really Big Numbers is a wonderful enrichment for any math education program and is enthusiastically recommended to every teacher, parent and grandparent, student, child, or other individual i...

  8. Correlation of Beam Electron and LED Signal Losses under Irradiation and Long-term Recovery of Lead Tungstate Crystals

    OpenAIRE

    Batarin, V. A.; Butler, J.; Davidenko, A. M.; Derevschikov, A. A.; Goncharenko, Y. M.; Grishin, V. N.; Kachanov, V A.; Konstantinov, A. S.; Kravtsov, V. I.; Kubota, Y.; Lukanin, V. S.; Matulenko, Y. A.; Melnick, Y. M.; Meschanin, A. P.; Mikhalin, N. E.

    2005-01-01

    Radiation damage in lead tungstate crystals reduces their transparency. The calibration that relates the amount of light detected in such crystals to the incident energy of photons or electrons is of paramount importance to maintaining the energy resolution of the detection system. We report on tests of lead tungstate crystals, read out by photomultiplier tubes, exposed to irradiation by monoenergetic electron or pion beams. The beam electrons themselves were used to measure the scintillation light ...

  9. Confirmation bias leads to overestimation of losses of woody plant foliage to insect herbivores in tropical regions

    Directory of Open Access Journals (Sweden)

    Mikhail V. Kozlov

    2014-12-01

    Confirmation bias, i.e., the tendency of humans to seek out evidence in a manner that confirms their hypotheses, is almost overlooked in ecological studies. For decades, insect herbivory was commonly accepted to be highest in tropical regions. By comparing data collected blindly (when the observer was not aware of the research hypothesis being tested) with the results of non-blind studies (when the observer knew what results could be expected), we tested the hypothesis that the records made in the tropics could have overestimated community-wide losses of plant foliage to insects due to confirmation bias. The average loss of leaf area of woody plants to defoliating insects in Brazil, when measured by a blind method (1.11%), was significantly lower than the loss measured in non-blind studies, both original (5.14%) and published (6.37%). We attribute the overestimation of the community-wide losses of plant foliage to insects in non-blind studies to the unconsciously preconceived selection of study species with higher-than-average levels of herbivory. Based on our findings, we urge caution in obtaining community-wide characteristics from the results of multiple single-species studies. Our data suggest that we may need to revise the paradigm of the highest level of background insect herbivory in the tropical regions. More generally, we argue that more attention should be paid by ecologists to the problem of biases occurring at the pre-publication phases of scientific research and, consequently, to the development and the wide application of methods that avoid biases arising from unconscious psychological processes.

  10. Muscle-Specific Loss of Apoptosis-Inducing Factor Leads to Mitochondrial Dysfunction, Skeletal Muscle Atrophy, and Dilated Cardiomyopathy

    OpenAIRE

    Joza, Nicholas; Oudit, Gavin Y.; Brown, Doris; Bénit, Paule; Kassiri, Zamaneh; Vahsen, Nicola; Benoit, Loralyn; Patel, Mikin M.; Nowikovsky, Karin; Vassault, Anne; Backx, Peter H; Wada, Teiji; Kroemer, Guido; Rustin, Pierre; Penninger, Josef M.

    2005-01-01

    Cardiac and skeletal muscle critically depend on mitochondrial energy metabolism for their normal function. Recently, we showed that apoptosis-inducing factor (AIF), a mitochondrial protein implicated in programmed cell death, plays a role in mitochondrial respiration. However, the in vivo consequences of AIF-regulated mitochondrial respiration resulting from a loss-of-function mutation in Aif are not known. Here, we report tissue-specific deletion of Aif in the mouse. Mice in which Aif has b...

  11. Loss of TET2 in hematopoietic cells leads to DNA hypermethylation of active enhancers and induction of leukemogenesis

    OpenAIRE

    Rasmussen, Kasper D.; Jia, Guangshuai; Johansen, Jens V.; Pedersen, Marianne T.; Rapin, Nicolas; Bagger, Frederik O.; Porse, Bo T; Bernard, Olivier A; Christensen, Jesper; Helin, Kristian

    2015-01-01

    The methylcytosine dioxygenase TET2 is frequently mutated in hematological disorders, including acute myeloid leukemia (AML), and has been suggested to protect CpG islands and promoters from aberrant DNA methylation. Rasmussen et al. used a novel Tet2-dependent leukemia mouse model to show that the primary effect of Tet2 loss in preleukemic hematopoietic cells is progressive and widespread DNA hypermethylation affecting up to 25% of active enhancer elements.

  12. Big data analytics turning big data into big money

    CERN Document Server

    Ohlhorst, Frank J

    2012-01-01

    Unique insights to implement big data analytics and reap big returns to your bottom line Focusing on the business and financial value of big data analytics, respected technology journalist Frank J. Ohlhorst shares his insights on the newly emerging field of big data analytics in Big Data Analytics. This breakthrough book demonstrates the importance of analytics, defines the processes, highlights the tangible and intangible values and discusses how you can turn a business liability into actionable material that can be used to redefine markets, improve profits and identify new business opportuni

  13. Loss of p53-regulatory protein IFI16 induces NBS1 leading to activation of p53-mediated checkpoint by phosphorylation of p53 SER37.

    Science.gov (United States)

    Tawara, Hideyuki; Fujiuchi, Nobuko; Sironi, Juan; Martin, Sarah; Aglipay, Jason; Ouchi, Mutsuko; Taga, Makoto; Chen, Phang-Lang; Ouchi, Toru

    2008-01-01

    Our previous results indicated that IFI16 is involved in p53 transcription activity under conditions of ionizing radiation (IR), and that the protein is frequently lost in human breast cancer cell lines and breast adenocarcinoma tissues, suggesting that IFI16 plays a crucial role in controlling cell growth. Here, we show that loss of IFI16 by RNA interference in cell culture causes elevated phosphorylation of p53 Ser37 and accumulation of NBS1 (nibrin) and p21WAF1, leading to growth retardation. Consistent with these observations, doxycycline-induced NBS1 caused accumulation of p21WAF1 and increased phosphorylation of p53 Ser37, leading to cell cycle arrest in G1 phase. Wortmannin treatment was found to decrease p53 Ser37 phosphorylation in NBS1-induced cells. These results suggest that loss of IFI16 activates the p53 checkpoint through the NBS1-DNA-PKcs pathway. PMID:17981542

  14. Early-life lead exposure recapitulates the selective loss of parvalbumin-positive GABAergic interneurons and subcortical dopamine system hyperactivity present in schizophrenia

    OpenAIRE

    Stansfield, K H; Ruby, K N; Soares, B D; McGlothan, J L; Liu, X.; Guilarte, T.R.

    2015-01-01

    Environmental factors have been associated with psychiatric disorders and recent epidemiological studies suggest an association between prenatal lead (Pb2+) exposure and schizophrenia (SZ). Pb2+ is a potent antagonist of the N-methyl-D-aspartate receptor (NMDAR) and converging evidence indicates that NMDAR hypofunction has a key role in the pathophysiology of SZ. The glutamatergic hypothesis of SZ posits that NMDAR hypofunction results in the loss of parvalbumin (PV)-positive GABAergic intern...

  15. Rewiring yeast acetate metabolism through MPC1 loss of function leads to mitochondrial damage and decreases chronological lifespan

    Directory of Open Access Journals (Sweden)

    Ivan Orlandi

    2014-11-01

    During growth on fermentable substrates, such as glucose, pyruvate, which is the end-product of glycolysis, can be used to generate acetyl-CoA in the cytosol via acetaldehyde and acetate, or in mitochondria by direct oxidative decarboxylation. In the latter case, the mitochondrial pyruvate carrier (MPC) is responsible for pyruvate transport into the mitochondrial matrix space. During chronological aging, yeast cells which lack the major structural subunit Mpc1 display a reduced lifespan accompanied by an age-dependent loss of autophagy. Here, we show that the impairment of pyruvate import into mitochondria linked to Mpc1 loss is compensated by a flux redirection of TCA cycle intermediates through the malic enzyme-dependent alternative route. In such a way, the TCA cycle operates in a “branched” fashion to generate pyruvate and is depleted of intermediates. Mutant cells cope with this depletion by increasing the activity of the glyoxylate cycle and of the pathway which provides the nucleocytosolic acetyl-CoA. Moreover, cellular respiration decreases and ROS accumulate in the mitochondria which, in turn, undergo severe damage. These acquired traits in concert with the reduced autophagy restrict cell survival of the mpc1∆ mutant during chronological aging. Conversely, the activation of the carnitine shuttle by supplying acetyl-CoA to the mitochondria is sufficient to abrogate the short-lived phenotype of the mutant.

  16. Loss of neurogenesis in Hydra leads to compensatory regulation of neurogenic and neurotransmission genes in epithelial cells.

    Science.gov (United States)

    Wenger, Y; Buzgariu, W; Galliot, B

    2016-01-01

    Hydra continuously differentiates a sophisticated nervous system made of mechanosensory cells (nematocytes) and sensory-motor and ganglionic neurons from interstitial stem cells. However, this dynamic adult neurogenesis is dispensable for morphogenesis. Indeed animals depleted of their interstitial stem cells and interstitial progenitors lose their active behaviours but maintain their developmental fitness, and regenerate and bud when force-fed. To characterize the impact of the loss of neurogenesis in Hydra, we first performed transcriptomic profiling at five positions along the body axis. We found neurogenic genes predominantly expressed along the central body column, which contains stem cells and progenitors, and neurotransmission genes predominantly expressed at the extremities, where the nervous system is dense. Next, we performed transcriptomics on animals depleted of their interstitial cells by hydroxyurea, colchicine or heat-shock treatment. By crossing these results with cell-type-specific transcriptomics, we identified epithelial genes up-regulated upon loss of neurogenesis: transcription factors (Dlx, Dlx1, DMBX1/Manacle, Ets1, Gli3, KLF11, LMX1A, ZNF436, Shox1), epitheliopeptides (Arminins, PW peptide), neurosignalling components (CAMK1D, DDCl2, Inx1), ligand-ion channel receptors (CHRNA1, NaC7), G-Protein Coupled Receptors and FMRFRL. Hence epitheliomuscular cells seemingly enhance their sensing ability when neurogenesis is compromised. This unsuspected plasticity might reflect the extended multifunctionality of epithelial-like cells in early eumetazoan evolution. PMID:26598723

  17. Correlation of Beam Electron and LED Signal Losses under Irradiation and Long-term Recovery of Lead Tungstate Crystals

    CERN Document Server

    Batarin, V A; Davidenko, A M; Derevshchikov, A A; Goncharenko, Yu M; Grishin, V N; Kachanov, V A; Konstantinov, A S; Kravtsov, V I; Kubota, Y; Lukanin, V S; Matulenko, Yu A; Melnik, Yu M; Meshchanin, A P; Mikhalin, N E; Minaev, N G; Mochalov, V V; Morozov, D A; Nogach, L V; Ryazantsev, A V; Semenov, P A; Semenov, V K; Shestermanov, K E; Soloviev, L F; Stone, S; Uzunian, A V; Vasilev, A N; Yakutin, A E; Yarba, J V

    2005-01-01

    Radiation damage in lead tungstate crystals reduces their transparency. The calibration that relates the amount of light detected in such crystals to the incident energy of photons or electrons is of paramount importance to maintaining the energy resolution of the detection system. We report on tests of lead tungstate crystals, read out by photomultiplier tubes, exposed to irradiation by monoenergetic electron or pion beams. The beam electrons themselves were used to measure the scintillation light output, and a blue light emitting diode (LED) was used to track variations of the crystals' transparency. We report on the correlation of the LED measurement with radiation damage by the beams and also show that it can accurately monitor the crystals' recovery from such damage.

  18. Symptomatic Central Venous Stenosis in a Hemodialysis Patient Leading to Loss of Arteriovenous Access: A Case Report and Literature Review

    OpenAIRE

    Tatapudi, Vasishta S.; Spinowitz, Noam; Goldfarb, David S.

    2014-01-01

    Central venous stenosis is a well-described sequel to the placement of hemodialysis catheters in the central venous system. The presence of an ipsilateral arteriovenous fistula or graft often leads to severe venous dilatation, arm edema and recurrent infections. Vascular access thrombosis, compromised blood flow and inadequate dialysis delivery are dreaded complications that eventually render the access unusable. We report the case of a 58-year-old male hemodialysis patient who developed symp...

  19. Loss of TET2 in hematopoietic cells leads to DNA hypermethylation of active enhancers and induction of leukemogenesis

    DEFF Research Database (Denmark)

    Rasmussen, Kasper D; Jia, Guangshuai; Johansen, Jens V;

    2015-01-01

    DNA methylation is tightly regulated throughout mammalian development, and altered DNA methylation patterns are a general hallmark of cancer. The methylcytosine dioxygenase TET2 is frequently mutated in hematological disorders, including acute myeloid leukemia (AML), and has been suggested to protect CG dinucleotide (CpG) islands and promoters from aberrant DNA methylation. In this study, we present a novel Tet2-dependent leukemia mouse model that closely recapitulates gene expression profiles and hallmarks of human AML1-ETO-induced AML. Using this model, we show that the primary effect of Tet2 loss in preleukemic hematopoietic cells is progressive and widespread DNA hypermethylation affecting up to 25% of active enhancer elements. In contrast, CpG island and promoter methylation does not change in a Tet2-dependent manner but increases relative to population doublings. We confirmed this...

  20. Loss of Survivin in Intestinal Epithelial Progenitor Cells Leads to Mitotic Catastrophe and Breakdown of Gut Immune Homeostasis.

    Science.gov (United States)

    Martini, Eva; Wittkopf, Nadine; Günther, Claudia; Leppkes, Moritz; Okada, Hitoshi; Watson, Alastair J; Podstawa, Eva; Backert, Ingo; Amann, Kerstin; Neurath, Markus F; Becker, Christoph

    2016-02-01

    A tightly regulated balance of proliferation and cell death of intestinal epithelial cells (IECs) is essential for maintenance of gut homeostasis. Survivin is highly expressed during embryogenesis and in several cancer types, but little is known about its role in adult gut tissue. Here, we show that Survivin is specifically expressed in transit-amplifying cells and Lgr5(+) stem cells. Genetic loss of Survivin in IECs resulted in destruction of intestinal integrity, mucosal inflammation, and death of the animals. Survivin deletion was associated with decreased epithelial proliferation due to defective chromosomal segregation. Moreover, Survivin-deficient animals showed induced phosphorylation of p53 and H2AX and increased levels of cell-intrinsic apoptosis in IECs. Consequently, induced deletion of Survivin in Lgr5(+) stem cells led to cell death. In summary, Survivin is a key regulator of gut tissue integrity by regulating epithelial homeostasis in the stem cell niche. PMID:26832409

  1. Loss of Survivin in Intestinal Epithelial Progenitor Cells Leads to Mitotic Catastrophe and Breakdown of Gut Immune Homeostasis

    Directory of Open Access Journals (Sweden)

    Eva Martini

    2016-02-01

    A tightly regulated balance of proliferation and cell death of intestinal epithelial cells (IECs) is essential for maintenance of gut homeostasis. Survivin is highly expressed during embryogenesis and in several cancer types, but little is known about its role in adult gut tissue. Here, we show that Survivin is specifically expressed in transit-amplifying cells and Lgr5+ stem cells. Genetic loss of Survivin in IECs resulted in destruction of intestinal integrity, mucosal inflammation, and death of the animals. Survivin deletion was associated with decreased epithelial proliferation due to defective chromosomal segregation. Moreover, Survivin-deficient animals showed induced phosphorylation of p53 and H2AX and increased levels of cell-intrinsic apoptosis in IECs. Consequently, induced deletion of Survivin in Lgr5+ stem cells led to cell death. In summary, Survivin is a key regulator of gut tissue integrity by regulating epithelial homeostasis in the stem cell niche.

  2. DKK1 mediated inhibition of Wnt signaling in postnatal mice leads to loss of TEC progenitors and thymic degeneration.

    Directory of Open Access Journals (Sweden)

    Masako Osada

    BACKGROUND: Thymic epithelial cell (TEC) microenvironments are essential for the recruitment of T cell precursors from the bone marrow, as well as the subsequent expansion and selection of thymocytes resulting in a mature self-tolerant T cell repertoire. The molecular mechanisms which control both the initial development and subsequent maintenance of these critical microenvironments are poorly defined. Wnt signaling has been shown to be important to the development of several epithelial tissues and organs. Regulation of Wnt signaling has also been shown to impact both early thymocyte and thymic epithelial development. However, early blocks in thymic organogenesis or death of the mice have prevented analysis of a role of canonical Wnt signaling in the maintenance of TECs in the postnatal thymus. METHODOLOGY/PRINCIPAL FINDINGS: Here we demonstrate that tetracycline-regulated expression of the canonical Wnt inhibitor DKK1 in TECs localized in both the cortex and medulla of adult mice results in rapid thymic degeneration characterized by a loss of DeltaNP63+ Foxn1+ and Aire+ TECs, loss of K5K8DP TECs thought to represent or contain an immature TEC progenitor, decreased TEC proliferation and the development of cystic structures, similar to an aged thymus. Removal of DKK1 from DKK1-involuted mice results in full recovery, suggesting that canonical Wnt signaling is required for the differentiation or proliferation of TEC populations needed for maintenance of properly organized adult thymic epithelial microenvironments. CONCLUSIONS/SIGNIFICANCE: Taken together, the results of this study demonstrate that canonical Wnt signaling within TECs is required for the maintenance of epithelial microenvironments in the postnatal thymus, possibly through effects on TEC progenitor/stem cell populations. Downstream targets of Wnt signaling, which are responsible for maintenance of these TEC progenitors, may provide useful targets for therapies aimed at

  3. Big data: an introduction for librarians.

    Science.gov (United States)

    Hoy, Matthew B

    2014-01-01

    Modern life produces data at an astounding rate and shows no signs of slowing. This has led to new advances in data storage and analysis and the concept of "big data," that is, massive data sets that can yield surprising insights when analyzed. This column will briefly describe what big data is and why it is important. It will also briefly explore the possibilities and problems of big data and the implications it has for librarians. A list of big data projects and resources is also included. PMID:25023020

  4. Big Data : Overview

    OpenAIRE

    Richa Gupta; Sunny Gupta; Anuradha Singhal

    2014-01-01

    Big data is data that exceeds the processing capacity of traditional databases. The data is too big to be processed by a single machine. New and innovative methods are required to process and store such large volumes of data. This paper provides an overview of big data, its importance in our lives and some technologies to handle big data.

  5. Big Data: Overview

    OpenAIRE

    Gupta, Richa; Gupta, Sunny; Singhal, Anuradha

    2014-01-01

    Big data is data that exceeds the processing capacity of traditional databases. The data is too big to be processed by a single machine. New and innovative methods are required to process and store such large volumes of data. This paper provides an overview of big data, its importance in our lives and some technologies to handle big data.

  6. Loss of TET2 in hematopoietic cells leads to DNA hypermethylation of active enhancers and induction of leukemogenesis.

    Science.gov (United States)

    Rasmussen, Kasper D; Jia, Guangshuai; Johansen, Jens V; Pedersen, Marianne T; Rapin, Nicolas; Bagger, Frederik O; Porse, Bo T; Bernard, Olivier A; Christensen, Jesper; Helin, Kristian

    2015-05-01

    DNA methylation is tightly regulated throughout mammalian development, and altered DNA methylation patterns are a general hallmark of cancer. The methylcytosine dioxygenase TET2 is frequently mutated in hematological disorders, including acute myeloid leukemia (AML), and has been suggested to protect CG dinucleotide (CpG) islands and promoters from aberrant DNA methylation. In this study, we present a novel Tet2-dependent leukemia mouse model that closely recapitulates gene expression profiles and hallmarks of human AML1-ETO-induced AML. Using this model, we show that the primary effect of Tet2 loss in preleukemic hematopoietic cells is progressive and widespread DNA hypermethylation affecting up to 25% of active enhancer elements. In contrast, CpG island and promoter methylation does not change in a Tet2-dependent manner but increases relative to population doublings. We confirmed this specific enhancer hypermethylation phenotype in human AML patients with TET2 mutations. Analysis of immediate gene expression changes reveals rapid deregulation of a large number of genes implicated in tumorigenesis, including many down-regulated tumor suppressor genes. Hence, we propose that TET2 prevents leukemic transformation by protecting enhancers from aberrant DNA methylation and that it is the combined silencing of several tumor suppressor genes in TET2 mutated hematopoietic cells that contributes to increased stem cell proliferation and leukemogenesis. PMID:25886910

  7. Acute cholesterol depletion leads to net loss of the organic osmolyte taurine in Ehrlich Lettré tumor cells

    DEFF Research Database (Denmark)

    Villumsen, Kasper Rømer; Duelund, Lars; Lambert, Ian Henry

    2010-01-01

    In mammalian cells, the organic osmolyte taurine is accumulated by the Na-dependent taurine transporter TauT and released through the volume- and DIDS-sensitive organic anion channel. Incubating Ehrlich Lettré tumor cells with methyl-ß-cyclodextrin (5 mM, 1 h) reduces the total cholesterol pool ... reveals that cholesterol depletion increases TauT's affinity toward taurine but reduces its maximal transport capacity. Cholesterol depletion has no impact on TauT regulation by protein kinases A and C. Phospholipase A2 activity, which is required for the activation of the volume-sensitive organic anion channel (VSOAC), is increased under isotonic and hypotonic conditions following cholesterol depletion, whereas taurine release under hypotonic conditions is reduced following cholesterol depletion. Hence, acute cholesterol depletion of Ehrlich Lettré cells leads to reduced TauT and VSOAC activities and ...

  8. Leading causes of certifiable visual loss in England and Wales during the year ending 31 March 2013.

    Science.gov (United States)

    Quartilho, A; Simkiss, P; Zekite, A; Xing, W; Wormald, R; Bunce, C

    2016-04-01

    Purpose: The last article on causes of sight impairment (SI) in England and Wales was for April 2007-March 2008. This report updates these figures for April 2012-March 2013. Methods: In England and Wales, registration for SI is initiated by completion of a certificate of vision impairment (CVI). The main cause of visual impairment was ascertained for certificates completed April 2012-March 2013. A proportional comparison against April 2007-March 2008 was made. Results: We received 24 009 CVIs, of which 10 410 were for severe sight impairment (SSI) and 13 129 were for SI. These numbers were slightly higher than those observed in April 2007-March 2008 (9823 SSI; 12 607 SI). The ratio SI:SSI has remained static, with 55% of all certifications being SI. The proportion of certificates without a single main cause has fallen slightly (16.6 to 14%). The proportion of certificates with a main cause of degeneration of the macula and posterior pole (mostly age-related macular degeneration (AMD)) decreased from 58.6 to 50% for SSI and from 57.2 to 52.5% for SI. Glaucoma remains the second most common cause (11% SSI; 7.6% SI), but hereditary retinal disorders overtook diabetes as the third leading cause of SSI. Conclusion: AMD is still by far the leading cause of certifications for sight impairment in England and Wales (both SI and SSI). Proportionate changes have been observed since 2008, but it is important to note that a proportionate increase in one condition will impact on others. PMID:26821759

  9. Loss of dysbindin-1, a risk gene for schizophrenia, leads to impaired group 1 metabotropic glutamate receptor function in mice.

    Directory of Open Access Journals (Sweden)

    Sanjeev K Bhardwaj

    2015-03-01

    The expression of dysbindin-1, a protein coded by the risk gene dtnbp1, is reduced in the brains of schizophrenia patients. Evidence indicates a role of dysbindin-1 in dopaminergic and glutamatergic transmission. Glutamatergic transmission and plasticity at excitatory synapses is critically regulated by G-protein coupled metabotropic glutamate receptor (mGluR) family members, that have been implicated in schizophrenia. Here, we report a role of dysbindin-1 in hippocampal group 1 mGluR (mGluRI) function in mice. In hippocampal synaptoneurosomal preparations from sandy (sdy) mice, that have a loss-of-function mutation in the dysbindin-1 gene, we observed a striking reduction in mGluRI agonist [(S)-3,5-dihydroxyphenylglycine] (DHPG)-induced phosphorylation of extracellular signal regulated kinase 1/2 (ERK1/2). This mGluR-ERK1/2 deficit occurred in the absence of significant changes in protein levels of the two members of the mGluRI family (i.e., mGluR1 and mGluR5) or in another mGluRI signaling pathway, i.e., protein kinase C (PKC). Aberrant mGluRI-ERK1/2 signaling affected hippocampal synaptic plasticity in the sdy mutants, as DHPG-induced long-term depression (LTD) at CA1 excitatory synapses was significantly reduced. Behavioral data suggest that the mGluRI hypofunction may underlie some of the cognitive abnormalities described in sdy mice, as the administration of CDPPB (3-cyano-N-(1,3-diphenyl-1H-pyrazol-5-yl)benzamide), a positive allosteric modulator of mGluR5, rescued short-term object recognition and spatial learning and memory deficits in these mice. Taken together, our data suggest a novel role of dysbindin-1 in regulating mGluRI functions.

  10. Arsenite binding-induced zinc loss from PARP-1 is equivalent to zinc deficiency in reducing PARP-1 activity, leading to inhibition of DNA repair

    International Nuclear Information System (INIS)

Inhibition of DNA repair is a recognized mechanism for arsenic enhancement of ultraviolet radiation-induced DNA damage and carcinogenesis. Poly(ADP-ribose) polymerase-1 (PARP-1), a zinc finger DNA repair protein, has been identified as a sensitive molecular target for arsenic. The zinc finger domains of PARP-1 function as a critical structure in DNA recognition and binding. Since cellular poly(ADP-ribosyl)ation capacity has been positively correlated with zinc status in cells, we hypothesize that arsenite binding-induced zinc loss from PARP-1 is equivalent to zinc deficiency in reducing PARP-1 activity, leading to inhibition of DNA repair. To test this hypothesis, we compared the effects of arsenite exposure with those of zinc deficiency, created using the membrane-permeable zinc chelator TPEN, on 8-OHdG formation, PARP-1 activity and zinc binding to PARP-1 in HaCaT cells. Our results show that arsenite exposure and zinc deficiency had similar effects on PARP-1 protein, whereas supplemental zinc reversed these effects. To investigate the molecular mechanism of the zinc loss induced by arsenite, ICP-AES, near-UV spectroscopy, fluorescence, and circular dichroism spectroscopy were utilized to examine arsenite binding to, and occupation of, a peptide representing the first zinc finger of PARP-1. We found that arsenite binding as well as zinc loss altered the conformation of the zinc finger structure, which functionally leads to PARP-1 inhibition. These findings suggest that arsenite binding to PARP-1 protein creates adverse biological effects similar to those of zinc deficiency, which establishes the molecular mechanism for zinc supplementation as a potentially effective treatment to reverse the detrimental outcomes of arsenic exposure. - Highlights: • Arsenite binding is equivalent to zinc deficiency in reducing PARP-1 function. • Zinc reverses arsenic inhibition of PARP-1 activity and enhancement of DNA damage. • Arsenite binding and zinc loss alter the conformation of zinc finger

  11. Enhancement of β-catenin activity by BIG1 plus BIG2 via Arf activation and cAMP signals.

    Science.gov (United States)

    Li, Chun-Chun; Le, Kang; Kato, Jiro; Moss, Joel; Vaughan, Martha

    2016-05-24

    Multifunctional β-catenin, with critical roles in both cell-cell adhesion and Wnt-signaling pathways, was among HeLa cell proteins coimmunoprecipitated by antibodies against brefeldin A-inhibited guanine nucleotide-exchange factors 1 and 2 (BIG1 or BIG2) that activate ADP-ribosylation factors (Arfs) by accelerating the replacement of bound GDP with GTP. BIG proteins also contain A-kinase anchoring protein (AKAP) sequences that can act as scaffolds for multimolecular assemblies that facilitate and limit cAMP signaling temporally and spatially. Direct interaction of BIG1 N-terminal sequence with β-catenin was confirmed using yeast two-hybrid assays and in vitro synthesized proteins. Depletion of BIG1 and/or BIG2 or overexpression of guanine nucleotide-exchange factor inactive mutant, but not wild-type, proteins interfered with β-catenin trafficking, leading to accumulation at perinuclear Golgi structures. Both phospholipase D activity and vesicular trafficking were required for effects of BIG1 and BIG2 on β-catenin activation. Levels of PKA-phosphorylated β-catenin S675 and β-catenin association with PKA, BIG1, and BIG2 were also diminished after BIG1/BIG2 depletion. Inferring a requirement for BIG1 and/or BIG2 AKAP sequence in PKA modification of β-catenin and its effect on transcription activation, we confirmed dependence of S675 phosphorylation and transcription coactivator function on BIG2 AKAP-C sequence. PMID:27162341

  12. Health impact assessment and monetary valuation of IQ loss in pre-school children due to lead exposure through locally produced food.

    Science.gov (United States)

    Bierkens, J; Buekers, J; Van Holderbeke, M; Torfs, R

    2012-01-01

A case study has been performed involving the full-chain assessment, from policy drivers to health-effect quantification, of lead exposure through locally produced food and the resulting loss of IQ in pre-school children at the population level across the EU-27, including monetary valuation of the estimated health impact. The main policy scenarios cover the period from 2000 to 2020 and include the most important Community policy developments expected to affect the environmental release of lead (Pb) and the corresponding human exposure patterns. Three distinct scenarios were explored: the emission situation based on 2000 data, a business-as-usual (BAU) scenario up to 2010 and 2020, and a scenario incorporating the most likely technological change expected (Most Feasible Technical Reductions, MFTR) in response to current and future legislation. Consecutive model calculations (MSCE-HM, WATSON, XtraFOOD, IEUBK) were performed by different partners on the project as part of the full-chain approach to derive estimates of blood lead (B-Pb) levels in children as a consequence of the consumption of local produce. The estimated B-Pb levels were translated into an average loss of IQ points/child using an empirical relationship based on a meta-analysis by Schwartz (1994). The calculated losses in IQ points were subsequently translated into an average cost/child using a cost estimate of €10,000 per IQ point lost, based on data from a literature review. The estimated average reductions of cost/child for all countries considered in 2010 under BAU and MFTR are 12.16% and 18.08% compared to baseline conditions, respectively. In 2020 the percentages amount to 20.19% and 23.39%. The case study provides an example of the full-chain impact-pathway approach, taking into account all foreseeable pathways both for assessing the environmental fate and the associated human exposure and the mode of toxic action, to arrive at quantitative estimates of health impacts at the individual and
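    The valuation step described in this record reduces to simple arithmetic: monetised impact per child = (IQ points lost) × a unit value of €10,000 per IQ point, and scenario benefits are reported as percentage reductions relative to baseline. A minimal sketch of that calculation follows; the IQ-loss inputs are hypothetical placeholders, not results from the study.

    ```python
    # Sketch of the abstract's monetary-valuation step.
    # Unit value (EUR 10,000 per IQ point lost) is taken from the abstract;
    # the per-child IQ-loss figures below are illustrative assumptions only.

    COST_PER_IQ_POINT_EUR = 10_000

    def cost_per_child(iq_points_lost: float) -> float:
        """Monetised IQ loss per child, in euros."""
        return iq_points_lost * COST_PER_IQ_POINT_EUR

    def percent_reduction(baseline_cost: float, scenario_cost: float) -> float:
        """Reduction of cost/child under a policy scenario, relative to baseline (%)."""
        return (baseline_cost - scenario_cost) / baseline_cost * 100

    # Hypothetical example: 0.50 IQ points lost at baseline vs 0.44 under a BAU-like scenario.
    baseline = cost_per_child(0.50)  # EUR 5,000
    scenario = cost_per_child(0.44)  # EUR 4,400
    print(percent_reduction(baseline, scenario))  # 12.0 (% reduction vs baseline)
    ```

    The published scenario figures (e.g. 12.16% under BAU in 2010) are outputs of the full model chain, not of this toy formula; the sketch only shows how an IQ-point estimate is converted to a cost and compared across scenarios.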

  13. "Big Data" : big gaps of knowledge in the field of internet science

    OpenAIRE

    Snijders, CCP Chris; Matzat, U Uwe; Reips, UD

    2012-01-01

Research on so-called 'Big Data' has received considerable momentum and is expected to grow in the future. One very interesting stream of research on Big Data analyzes online networks. Many online networks are known to have some typical macro-characteristics, such as 'small world' properties. Much less is known about the underlying micro-processes leading to these properties. The models used by Big Data researchers are usually inspired by mathematical ease of exposition. We propose to follow in...

  14. Early-life lead exposure recapitulates the selective loss of parvalbumin-positive GABAergic interneurons and subcortical dopamine system hyperactivity present in schizophrenia.

    Science.gov (United States)

    Stansfield, K H; Ruby, K N; Soares, B D; McGlothan, J L; Liu, X; Guilarte, T R

    2015-01-01

Environmental factors have been associated with psychiatric disorders, and recent epidemiological studies suggest an association between prenatal lead (Pb(2+)) exposure and schizophrenia (SZ). Pb(2+) is a potent antagonist of the N-methyl-D-aspartate receptor (NMDAR), and converging evidence indicates that NMDAR hypofunction has a key role in the pathophysiology of SZ. The glutamatergic hypothesis of SZ posits that NMDAR hypofunction results in the loss of parvalbumin (PV)-positive GABAergic interneurons (PVGI) in the brain. Loss of PVGI inhibitory control of pyramidal cells alters the excitatory drive to midbrain dopamine neurons, increasing subcortical dopaminergic activity. We hypothesized that if Pb(2+) exposure in early life is an environmental risk factor for SZ, it should recapitulate the loss of PVGI and reproduce subcortical dopaminergic hyperactivity. We report that on postnatal day 50 (PN50), adolescent rats chronically exposed to Pb(2+) from gestation through adolescence exhibit loss of PVGI in SZ-relevant brain regions. PV and glutamic acid decarboxylase 67 kDa (GAD67) protein were significantly decreased in Pb(2+)-exposed rats, with no apparent change in calretinin or calbindin protein levels, suggesting a selective effect on the PV phenotype of GABAergic interneurons. We also show that Pb(2+)-exposed animals exhibit a heightened locomotor response to cocaine and express significantly higher levels of dopamine metabolites and D2-dopamine receptors relative to controls, indicative of subcortical dopaminergic hyperactivity. Our results show that developmental Pb(2+) exposure reproduces specific neuropathology and functional dopamine system changes present in SZ. We propose that exposure to environmental toxins that produce NMDAR hypofunction during critical periods of brain development may contribute significantly to the etiology of mental disorders. PMID:25756805

  15. Characterizing Big Data Management

    OpenAIRE

    Rogério Rossi; Kechi Hirama

    2015-01-01

    Big data management is a reality for an increasing number of organizations in many areas and represents a set of challenges involving big data modeling, storage and retrieval, analysis and visualization. However, technological resources, people and processes are crucial to facilitate the management of big data in any kind of organization, allowing information and knowledge from a large volume of data to support decision-making. Big data management can be supported by these three dimensions: t...

  16. The ethics of biomedical big data

    CERN Document Server

    Mittelstadt, Brent Daniel

    2016-01-01

    This book presents cutting edge research on the new ethical challenges posed by biomedical Big Data technologies and practices. ‘Biomedical Big Data’ refers to the analysis of aggregated, very large datasets to improve medical knowledge and clinical care. The book describes the ethical problems posed by aggregation of biomedical datasets and re-use/re-purposing of data, in areas such as privacy, consent, professionalism, power relationships, and ethical governance of Big Data platforms. Approaches and methods are discussed that can be used to address these problems to achieve the appropriate balance between the social goods of biomedical Big Data research and the safety and privacy of individuals. Seventeen original contributions analyse the ethical, social and related policy implications of the analysis and curation of biomedical Big Data, written by leading experts in the areas of biomedical research, medical and technology ethics, privacy, governance and data protection. The book advances our understan...

  17. Social big data mining

    CERN Document Server

    Ishikawa, Hiroshi

    2015-01-01

    Social Media. Big Data and Social Data. Hypotheses in the Era of Big Data. Social Big Data Applications. Basic Concepts in Data Mining. Association Rule Mining. Clustering. Classification. Prediction. Web Structure Mining. Web Content Mining. Web Access Log Mining, Information Extraction and Deep Web Mining. Media Mining. Scalability and Outlier Detection.

  18. Five Big Ideas

    Science.gov (United States)

    Morgan, Debbie

    2012-01-01

    Designing quality continuing professional development (CPD) for those teaching mathematics in primary schools is a challenge. If the CPD is to be built on the scaffold of five big ideas in mathematics, what might be these five big ideas? Might it just be a case of, if you tell me your five big ideas, then I'll tell you mine? Here, there is…

  19. Loss of MLCK leads to disruption of cell-cell adhesion and invasive behavior of breast epithelial cells via increased expression of EGFR and ERK/JNK signaling.

    Science.gov (United States)

    Kim, D Y; Helfman, D M

    2016-08-25

Myosin light chain kinase (MLCK) expression is downregulated in breast cancer, including in invasive ductal carcinoma compared with ductal breast carcinoma in situ, and in metastatic breast tumors. However, little is known about how loss of MLCK expression contributes to tumor progression. MLCK is a component of the actin cytoskeleton and its known role is the phosphorylation of the regulatory light chain of myosin II. To gain insights into the role of MLCK in breast cancer, we perturbed its function using small interfering RNA (siRNA) or pharmacological inhibition in untransformed breast epithelial cells (MCF10A). Loss of MLCK by siRNAs led to increased cell migration and invasion, disruption of cell-cell adhesions and enhanced formation of focal adhesions at the leading edge of migratory cells. In addition, downregulation of MLCK cooperated with HER2 in MCF10A cells to promote cell migration and invasion, and low levels of MLCK are associated with a poor prognosis in HER2-positive breast cancer patients. Associated with these altered migratory behaviors were increased expression of the epidermal growth factor receptor and activation of the extracellular signal-regulated kinase and c-Jun N-terminal kinase signaling pathways in MLCK-downregulated MCF10A cells. By contrast, inhibition of the kinase function of MLCK using pharmacological agents inhibited cell migration and invasion, and did not affect cellular adhesions. Our results show that loss of MLCK contributes to the migratory properties of epithelial cells through changes in cell-cell and cell-matrix adhesions and increased epidermal growth factor receptor signaling. These findings suggest that decreased expression of MLCK may have a critical role during tumor progression by facilitating the metastatic potential of tumor cells. PMID:26876209

  20. Microsoft big data solutions

    CERN Document Server

    Jorgensen, Adam; Welch, John; Clark, Dan; Price, Christopher; Mitchell, Brian

    2014-01-01

    Tap the power of Big Data with Microsoft technologies Big Data is here, and Microsoft's new Big Data platform is a valuable tool to help your company get the very most out of it. This timely book shows you how to use HDInsight along with HortonWorks Data Platform for Windows to store, manage, analyze, and share Big Data throughout the enterprise. Focusing primarily on Microsoft and HortonWorks technologies but also covering open source tools, Microsoft Big Data Solutions explains best practices, covers on-premises and cloud-based solutions, and features valuable case studies. Best of all,

  1. Big data computing

    CERN Document Server

    Akerkar, Rajendra

    2013-01-01

    Due to market forces and technological evolution, Big Data computing is developing at an increasing rate. A wide variety of novel approaches and tools have emerged to tackle the challenges of Big Data, creating both more opportunities and more challenges for students and professionals in the field of data computation and analysis. Presenting a mix of industry cases and theory, Big Data Computing discusses the technical and practical issues related to Big Data in intelligent information management. Emphasizing the adoption and diffusion of Big Data tools and technologies in industry, the book i

  2. Mining "big data" using big data services

    OpenAIRE

    Reips, UD; Matzat, U Uwe

    2014-01-01

While many colleagues within science are fed up with the "big data fad", empirical analyses we conducted for the current editorial actually show an inconsistent picture: we use big data services to determine whether there really is an increase in writing about big data, or even widespread use of the term. Google Correlate (http://www.google.com/trends/correlate/), the first free tool we are presenting here, doesn't list the term, showing that the number of searches for it is below an absolute min...

  3. Big data governance an emerging imperative

    CERN Document Server

    Soares, Sunil

    2012-01-01

    Written by a leading expert in the field, this guide focuses on the convergence of two major trends in information management-big data and information governance-by taking a strategic approach oriented around business cases and industry imperatives. With the advent of new technologies, enterprises are expanding and handling very large volumes of data; this book, nontechnical in nature and geared toward business audiences, encourages the practice of establishing appropriate governance over big data initiatives and addresses how to manage and govern big data, highlighting the relevant processes,

  4. BIG DATA IN BUSINESS ENVIRONMENT

    OpenAIRE

    Logica BANICA; Alina HAGIU

    2015-01-01

In recent years, dealing with large amounts of data originating from social media sites and mobile communications, alongside data from business environments and institutions, has led to the definition of a new concept, known as Big Data. The economic impact of the sheer amount of data produced in the last two years has increased rapidly. It is necessary to aggregate all types of data (structured and unstructured) in order to improve current transactions, to develop new business models, to provide a real image ...

  5. BIG DATA IN BUSINESS ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Logica BANICA

    2015-06-01

Full Text Available In recent years, dealing with large amounts of data originating from social media sites and mobile communications, alongside data from business environments and institutions, has led to the definition of a new concept, known as Big Data. The economic impact of the sheer amount of data produced in the last two years has increased rapidly. It is necessary to aggregate all types of data (structured and unstructured) in order to improve current transactions, to develop new business models, to provide a real image of the supply and demand and thereby generate market advantages. Thus, companies that turn to Big Data have a competitive advantage over other firms. From the perspective of IT organizations, they must accommodate the storage and processing of Big Data, and provide analysis tools that are easily integrated into business processes. This paper aims to discuss aspects regarding the Big Data concept and the principles used to build, organize and analyse huge datasets in the business environment, offering a three-layer architecture based on actual software solutions. The article also refers to graphical tools for exploring and representing unstructured data, Gephi and NodeXL.

  6. Effective Dynamics of the Matrix Big Bang

    OpenAIRE

    Craps, Ben; Rajaraman, Arvind; Sethi, Savdeep

    2006-01-01

    We study the leading quantum effects in the recently introduced Matrix Big Bang model. This amounts to a study of supersymmetric Yang-Mills theory compactified on the Milne orbifold. We find a one-loop potential that is attractive near the Big Bang. Surprisingly, the potential decays very rapidly at late times, where it appears to be generated by D-brane effects. Usually, general covariance constrains the form of any effective action generated by renormalization group flow. However, the form ...

  7. From Big Crunch to Big Bang

    OpenAIRE

    Khoury, Justin; Ovrut, Burt A.; Seiberg, Nathan; Steinhardt, Paul J.(Princeton Center for Theoretical Science, Princeton University, Princeton, NJ, 08544, USA); Turok, Neil

    2001-01-01

    We consider conditions under which a universe contracting towards a big crunch can make a transition to an expanding big bang universe. A promising example is 11-dimensional M-theory in which the eleventh dimension collapses, bounces, and re-expands. At the bounce, the model can reduce to a weakly coupled heterotic string theory and, we conjecture, it may be possible to follow the transition from contraction to expansion. The possibility opens the door to new classes of cosmological models. F...

  8. Big fundamental groups: generalizing homotopy and big homotopy

    OpenAIRE

    Penrod, Keith

    2014-01-01

    The concept of big homotopy theory was introduced by J. Cannon and G. Conner using big intervals of arbitrarily large cardinality to detect big loops. We find, for each space, a canonical cardinal that is sufficient to detect all big loops and all big homotopies in the space.

  9. Expression of HIV-1 Vpu leads to loss of the viral restriction factor CD317/Tetherin from lipid rafts and its enhanced lysosomal degradation.

    Directory of Open Access Journals (Sweden)

    Ruth Rollason

Full Text Available CD317/tetherin (aka BST2 or HM1.24 antigen) is an interferon-inducible membrane protein present in regions of the lipid bilayer enriched in sphingolipids and cholesterol (often termed lipid rafts). It has been implicated in an eclectic mix of cellular processes including, most notably, the retention of fully formed viral particles at the surface of cells infected with HIV and other enveloped viruses. Expression of the HIV viral accessory protein Vpu has been shown to lead to intracellular sequestration and degradation of tetherin, thereby counteracting the inhibition of viral release. There is evidence that tetherin interacts directly with Vpu, but it remains unclear where in the cell this interaction occurs or if Vpu expression affects the lipid raft localisation of tetherin. We have addressed these points using biochemical and cell imaging approaches focused on endogenous rather than ectopically over-expressed tetherin. We find (i) no evidence for an interaction between Vpu and endogenous tetherin at the cell surface, (ii) the vast majority of endogenous tetherin that is at the cell surface in control cells is in lipid rafts, (iii) internalised tetherin is present in non-raft fractions, (iv) expression of Vpu in cells expressing endogenous tetherin leads to the loss of tetherin from lipid rafts, (v) internalised tetherin enters early endosomes and late endosomes in both control cells and cells expressing Vpu, but the proportion of tetherin molecules destined for degradation rather than recycling is increased in cells expressing Vpu, and (vi) lysosomes are the primary site for degradation of endogenous tetherin in cells expressing Vpu. Our studies underline the importance of studying endogenous tetherin and lead us to propose a model in which Vpu intercepts newly internalised tetherin and diverts it for lysosomal destruction rather than recycling to the cell surface.

  10. Cisplatin Induces Overactivation of the Dormant Primordial Follicle through PTEN/AKT/FOXO3a Pathway which Leads to Loss of Ovarian Reserve in Mice.

    Directory of Open Access Journals (Sweden)

    Eun Mi Chang

Full Text Available Cisplatin is a first-line chemotherapeutic agent for ovarian cancer that acts by promoting DNA cross-links and adducts. However, drug resistance and considerable side effects, including reproductive toxicity, remain a significant challenge. PTEN is well known as a tumor suppressor that plays a fundamental role in the regulation of the cell cycle, apoptosis and the development of cancer. At the same time, PTEN has been revealed to be critically important for the maintenance of the primordial follicle pool. In this study, we investigated the role of the PTEN/Akt/FOXO3 pathway in cisplatin-induced primordial follicle depletion. A cisplatin-induced ovarian failure mouse model was used to evaluate the involvement of this pathway. In vitro maturation was used to rescue oocytes after cisplatin damage. We found that cisplatin treatment decreased PTEN levels, leading to a subsequent increase in the phosphorylation of key molecules in the pathway. The activation of the PTEN/Akt/FOXO3 pathway cascade increased cytoplasmic translocation of FOXO3a in cisplatin-treated follicles, which in turn increased the pool size of growing follicles and rapidly depleted the number of dormant follicles. Once activated, the follicles were more prone to apoptosis, and their cumulus cells showed a loss of luteinizing hormone (LH) receptor expression, which leads to failure during final maturation and ovulation. In vitro maturation to rescue oocytes in a cisplatin-treated mouse model resulted in successful maturation and fertilization. This study is the first to show the involvement of the PTEN/Akt/FOXO3 pathway in premature ovarian failure after cisplatin treatment and the possibility of rescue through in vitro maturation.

  11. ANALYSIS OF BIG DATA

    OpenAIRE

    Anshul Sharma; Preeti Gulia

    2014-01-01

    Big Data is data that either is too large, grows too fast, or does not fit into traditional architectures. Within such data can be valuable information that can be discovered through data analysis [1]. Big data is a collection of complex and large data sets that are difficult to process and mine for patterns and knowledge using traditional database management tools or data processing and mining systems. Big Data is data whose scale, diversity and complexity require new architecture, technique...

  12. Matrix Big Brunch

    OpenAIRE

    Bedford, J; Papageorgakis, C.; Rodriguez-Gomez, D.; Ward, J.

    2007-01-01

    Following the holographic description of linear dilaton null Cosmologies with a Big Bang in terms of Matrix String Theory put forward by Craps, Sethi and Verlinde, we propose an extended background describing a Universe including both Big Bang and Big Crunch singularities. This belongs to a class of exact string backgrounds and is perturbative in the string coupling far away from the singularities, both of which can be resolved using Matrix String Theory. We provide a simple theory capable of...

  13. The big bang

    International Nuclear Information System (INIS)

The paper concerns the 'Big Bang' theory of the creation of the Universe 15 thousand million years ago, and traces events which physicists predict occurred soon after the creation. The unified theory of the moment of creation, evidence of an expanding Universe, the X-boson (the particle produced very soon after the big bang, which vanished from the Universe one-hundredth of a second after the big bang), and the fate of the Universe are all discussed. (U.K.)

  14. Summary big data

    CERN Document Server

    2014-01-01

This work offers a summary of the book "Big Data: A Revolution That Will Transform How We Live, Work, and Think" by Viktor Mayer-Schönberger and Kenneth Cukier. The summary explains that big data is where we use huge quantities of data to make better predictions, based on identifying patterns in the data rather than trying to understand the underlying causes in more detail. This summary highlights that big data will be a source of new economic value and innovation in the future. Moreover, it shows that it will

  15. Differential changes in serum uric acid concentrations in sibutramine promoted weight loss in diabetes: results from four weeks of the lead-in period of the SCOUT trial

    Directory of Open Access Journals (Sweden)

    Caterson Ian D

    2009-10-01

Full Text Available Abstract Background and aims: Elevated levels of serum uric acid are associated with an increased risk of cardiovascular morbidity and mortality. The response of uric acid to weight loss therapy (lifestyle plus sibutramine) in an overweight and obese cardiovascular high-risk population was studied. Methods and results: Data from a four-week single-blind lead-in period of the Sibutramine Cardiovascular OUTcomes (SCOUT) study were analyzed. 2584 patients (24%) had diabetes mellitus (DM) only, 1748 (16%) had cardiovascular disease (CVD) only and 6397 (60%) had both DM + CVD. Uric acid concentrations (mean ± standard deviation) at screening were significantly higher among patients with CVD compared to patients without CVD. Conclusion: A four-week daily intake of sibutramine together with lifestyle changes was associated with significant reductions in mean uric acid levels. Changes in renal glucose load in diabetes seem to counteract a potential uricosuric effect of sibutramine. Trial Registration: The trial is registered at ClinicalTrial.gov, number NCT00234832.

  16. Dual of big bang and big crunch

    International Nuclear Information System (INIS)

    Starting from the Janus solution and its gauge theory dual, we obtain the dual gauge theory description of the cosmological solution by the procedure of double analytic continuation. The coupling is driven either to zero or to infinity at the big-bang and big-crunch singularities, which are shown to be related by the S-duality symmetry. In the dual Yang-Mills theory description, these are nonsingular as the coupling goes to zero in the N=4 super Yang-Mills theory. The cosmological singularities simply signal the failure of the supergravity description of the full type IIB superstring theory

  17. Bliver big data til big business?

    DEFF Research Database (Denmark)

    Ritter, Thomas

    2015-01-01

Denmark has a digital infrastructure, a culture of record-keeping, and IT-competent employees and customers, which make a leading position possible, but only if companies ready themselves for the next wave of big data.

  18. Big Ideas in Art

    Science.gov (United States)

    Day, Kathleen

    2008-01-01

    In this article, the author shares how she was able to discover some big ideas about art education. She relates how she found great ideas to improve her teaching from the book "Rethinking Curriculum in Art." She also shares how she designed a "Big Idea" unit in her class.

  19. Research on Privacy Protection in Big Data Environment

    OpenAIRE

    Gang Zeng

    2015-01-01

Now big data has become a hot topic in academia and industry; it is affecting modes of thinking and working, and daily life. But there are many security risks in data collection, storage and use. Privacy leakage causes serious problems for users, and false data will lead to erroneous results in big data analysis. This paper first introduces the security problems faced by big data, analyzes the causes of privacy problems, and discusses the principles for solving them. Finally, it discusses techni...

  20. Big data for health.

    Science.gov (United States)

    Andreu-Perez, Javier; Poon, Carmen C Y; Merrifield, Robert D; Wong, Stephen T C; Yang, Guang-Zhong

    2015-07-01

    This paper provides an overview of recent developments in big data in the context of biomedical and health informatics. It outlines the key characteristics of big data and how medical and health informatics, translational bioinformatics, sensor informatics, and imaging informatics will benefit from an integrated approach of piecing together different aspects of personalized information from a diverse range of data sources, both structured and unstructured, covering genomics, proteomics, metabolomics, as well as imaging, clinical diagnosis, and long-term continuous physiological sensing of an individual. It is expected that recent advances in big data will expand our knowledge for testing new hypotheses about disease management from diagnosis to prevention to personalized treatment. The rise of big data, however, also raises challenges in terms of privacy, security, data ownership, data stewardship, and governance. This paper discusses some of the existing activities and future opportunities related to big data for health, outlining some of the key underlying issues that need to be tackled. PMID:26173222

  1. Amyloid beta protein-induced zinc sequestration leads to synaptic loss via dysregulation of the ProSAP2/Shank3 scaffold

    Directory of Open Access Journals (Sweden)

    Hof Patrick R

    2011-09-01

Full Text Available Abstract Background Memory deficits in Alzheimer's disease (AD) manifest together with the loss of synapses caused by the disruption of the postsynaptic density (PSD), a network of scaffold proteins located in dendritic spines. However, the underlying molecular mechanisms remain elusive. Since it was shown that ProSAP2/Shank3 scaffold assembly within the PSD is Zn2+-dependent and that the amyloid beta protein (Aβ) is able to bind Zn2+, we hypothesized that sequestration of Zn2+ ions by Aβ contributes to ProSAP/Shank platform malformation. Results To test this hypothesis, we designed multiple in vitro and in vivo assays demonstrating ProSAP/Shank dysregulation in rat hippocampal cultures following Aβ oligomer accumulation. These changes were independent of alterations at the ProSAP/Shank transcriptional level. However, application of soluble Aβ prevented association of Zn2+ ions with ProSAP2/Shank3 in a cell-based assay and decreased the concentration of Zn2+ clusters within dendrites. Zn2+ supplementation or saturation of Aβ with Zn2+ ions prior to cell treatment was able to counter the effects induced by Aβ on synapse density and ProSAP2/Shank3 levels at the PSD. Interestingly, intracellular Zn2+ levels in APP-PS1 mice and human AD hippocampus are reduced, along with a reduction in synapse density and synaptic ProSAP2/Shank3 and Shank1 protein levels. Conclusions We conclude that sequestration of Zn2+ ions by Aβ contributes significantly to changes in ProSAP2/Shank3 platforms. These changes in turn lead to less consolidated (mature) synapses, reflected by a decrease in Shank1 protein levels at the PSD and decreased synapse density in hippocampal neurons.

  2. "Big Data": Big Knowledge Gaps in the Field of Internet Science

    Directory of Open Access Journals (Sweden)

    Ulf-Dietrich Reips

    2012-01-01

Full Text Available Research on so-called ‘Big Data’ has received considerable momentum and is expected to grow in the future. One very interesting stream of research on Big Data analyzes online networks. Many online networks are known to have some typical macro-characteristics, such as ‘small world’ properties. Much less is known about the underlying micro-processes leading to these properties. The models used by Big Data researchers are usually inspired by mathematical ease of exposition. We propose, in addition, to follow a different strategy that leads to knowledge about micro-processes that match actual online behavior. This knowledge can then be used for the selection of mathematically tractable models of online network formation and evolution. Insight from social and behavioral research is needed for pursuing this strategy of knowledge generation about micro-processes. Accordingly, our proposal points to a unique role that social scientists could play in Big Data research. ...
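The ‘small world’ macro-characteristics mentioned above combine high clustering with short average path lengths. A minimal, self-contained sketch of the two defining metrics, computed on a toy ring-plus-shortcuts graph in the Watts-Strogatz spirit (the graph, function names, and parameters are illustrative assumptions, not from the record above):

```python
from collections import deque

def avg_path_length(adj):
    """Mean shortest-path length over all ordered node pairs (BFS per node)."""
    total, pairs = 0, 0
    for s in adj:
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total += sum(d for t, d in dist.items() if t != s)
        pairs += len(dist) - 1
    return total / pairs

def clustering(adj):
    """Mean local clustering coefficient: fraction of each node's
    neighbour pairs that are themselves connected."""
    coeffs = []
    for u in adj:
        nb = list(adj[u])
        k = len(nb)
        if k < 2:
            coeffs.append(0.0)
            continue
        links = sum(1 for i in range(k) for j in range(i + 1, k)
                    if nb[j] in adj[nb[i]])
        coeffs.append(2 * links / (k * (k - 1)))
    return sum(coeffs) / len(coeffs)

# A ring of 12 nodes, each linked to its 2 nearest neighbours on each
# side, plus two long-range "shortcut" edges: shortcuts shrink the
# average path length while the ring keeps clustering high.
n = 12
adj = {i: set() for i in range(n)}
def add_edge(u, v):
    adj[u].add(v)
    adj[v].add(u)
for i in range(n):
    add_edge(i, (i + 1) % n)
    add_edge(i, (i + 2) % n)
add_edge(0, 6)
add_edge(3, 9)
```

High clustering together with a short mean path is exactly the small-world signature the abstract refers to; real online networks exhibit it at vastly larger scale.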

  3. Big Data, Big Knowledge: Big Data for Personalized Healthcare.

    OpenAIRE

    Viceconti, M.; Hunter, P.; Hose, R.

    2015-01-01

    The idea that the purely phenomenological knowledge that we can extract by analyzing large amounts of data can be useful in healthcare seems to contradict the desire of VPH researchers to build detailed mechanistic models for individual patients. But in practice no model is ever entirely phenomenological or entirely mechanistic. We propose in this position paper that big data analytics can be successfully combined with VPH technologies to produce robust and effective in silico medicine soluti...

  4. Big data, big knowledge: big data for personalized healthcare.

    Science.gov (United States)

    Viceconti, Marco; Hunter, Peter; Hose, Rod

    2015-07-01

    The idea that the purely phenomenological knowledge that we can extract by analyzing large amounts of data can be useful in healthcare seems to contradict the desire of VPH researchers to build detailed mechanistic models for individual patients. But in practice no model is ever entirely phenomenological or entirely mechanistic. We propose in this position paper that big data analytics can be successfully combined with VPH technologies to produce robust and effective in silico medicine solutions. In order to do this, big data technologies must be further developed to cope with some specific requirements that emerge from this application. Such requirements are: working with sensitive data; analytics of complex and heterogeneous data spaces, including nontextual information; distributed data management under security and performance constraints; specialized analytics to integrate bioinformatics and systems biology information with clinical observations at tissue, organ and organisms scales; and specialized analytics to define the "physiological envelope" during the daily life of each patient. These domain-specific requirements suggest a need for targeted funding, in which big data technologies for in silico medicine becomes the research priority. PMID:26218867

  5. Mutations in the HLA class II genes leading to loss of expression of HLA-DR and HLA-DQ in diffuse large B-cell lymphoma

    NARCIS (Netherlands)

    Jordanova, ES; Philippo, K; Giphart, MJ; Schuuring, E; Kluin, PM

    2003-01-01

Loss of expression of human leukocyte antigen (HLA) class II molecules on tumor cells affects the onset and modulation of the immune response through lack of activation of CD4(+) T lymphocytes. Previously, we showed that the frequent loss of expression of HLA class II in diffuse large B-cell lymphoma

  6. BigDog

    Science.gov (United States)

    Playter, R.; Buehler, M.; Raibert, M.

    2006-05-01

    BigDog's goal is to be the world's most advanced quadruped robot for outdoor applications. BigDog is aimed at the mission of a mechanical mule - a category with few competitors to date: power autonomous quadrupeds capable of carrying significant payloads, operating outdoors, with static and dynamic mobility, and fully integrated sensing. BigDog is about 1 m tall, 1 m long and 0.3 m wide, and weighs about 90 kg. BigDog has demonstrated walking and trotting gaits, as well as standing up and sitting down. Since its creation in the fall of 2004, BigDog has logged tens of hours of walking, climbing and running time. It has walked up and down 25 & 35 degree inclines and trotted at speeds up to 1.8 m/s. BigDog has walked at 0.7 m/s over loose rock beds and carried over 50 kg of payload. We are currently working to expand BigDog's rough terrain mobility through the creation of robust locomotion strategies and terrain sensing capabilities.

  7. Big Data in industry

    Science.gov (United States)

    Latinović, T. S.; Preradović, D. M.; Barz, C. R.; Latinović, M. T.; Petrica, P. P.; Pop-Vadean, A.

    2016-08-01

The amount of data at the global level has grown exponentially. Along with this phenomenon, we need new units of measure, such as the exabyte, zettabyte, and yottabyte, to express the amount of data. This growth creates a situation in which the classic systems for the collection, storage, processing, and visualization of data are losing the battle against the volume, velocity, and variety of data that is generated continuously. Much of this data is created by the Internet of Things (IoT): cameras, satellites, cars, GPS navigation, etc. Our challenge is to come up with new technologies and tools for the management and exploitation of these large amounts of data. Big Data has been a hot topic in IT circles in recent years; it is also increasingly recognized in the business world and in public administration. This paper proposes an ontology of big data analytics and examines how to enhance business intelligence through big data analytics as a service, by presenting a big data analytics service-oriented architecture. The paper also discusses the interrelationship between business intelligence and big data analytics. The proposed approach might facilitate research and development in business analytics, big data analytics, and business intelligence, as well as intelligent agents.

  8. Big data a primer

    CERN Document Server

    Bhuyan, Prachet; Chenthati, Deepak

    2015-01-01

    This book is a collection of chapters written by experts on various aspects of big data. The book aims to explain what big data is and how it is stored and used. The book starts from  the fundamentals and builds up from there. It is intended to serve as a review of the state-of-the-practice in the field of big data handling. The traditional framework of relational databases can no longer provide appropriate solutions for handling big data and making it available and useful to users scattered around the globe. The study of big data covers a wide range of issues including management of heterogeneous data, big data frameworks, change management, finding patterns in data usage and evolution, data as a service, service-generated data, service management, privacy and security. All of these aspects are touched upon in this book. It also discusses big data applications in different domains. The book will prove useful to students, researchers, and practicing database and networking engineers.

  9. Big Data Analytics for Disaster Preparedness and Response of Mobile Communication Infrastructure during Natural Hazards

    Science.gov (United States)

    Zhong, L.; Takano, K.; Ji, Y.; Yamada, S.

    2015-12-01

The disruption of telecommunications is one of the most critical disasters during natural hazards. With the rapid expansion of mobile communications, the mobile communication infrastructure plays a fundamental role in disaster response and recovery activities. For this reason, its disruption will lead to loss of life and property due to information delays and errors. Therefore, disaster preparedness and response of the mobile communication infrastructure itself is quite important. In many experienced disasters, the disruption of mobile communication networks is usually caused by network congestion and subsequent long-term power outages. In order to reduce this disruption, knowledge of communication demands during disasters is necessary. Big data analytics provides a very promising way to predict communication demands by analyzing the large amount of operational data of mobile users in a large-scale mobile network. Under the US-Japan collaborative project on 'Big Data and Disaster Research (BDD)' supported by the Japan Science and Technology Agency (JST) and the National Science Foundation (NSF), we are investigating the application of big data techniques to the disaster preparedness and response of mobile communication infrastructure. Specifically, in this research, we exploit the large amount of operational information of mobile users to predict communication needs at different times and locations. By incorporating other data, such as the shake distribution of an estimated major earthquake and the power outage map, we are able to predict stranded people whose safety is difficult to confirm, or who cannot ask for help, due to network disruption. In addition, this result could further help network operators assess the vulnerability of their infrastructure and make suitable decisions for disaster preparedness and response. In this presentation, we are going to introduce the

  10. Recht voor big data, big data voor recht [Law for big data, big data for law]

    NARCIS (Netherlands)

    Lafarre, Anne

    2016-01-01

Big data is a phenomenon in our society that can no longer be ignored. It is past the hype cycle, and the first implementations of big data techniques are being carried out. But what exactly is big data? What do the five V's, often mentioned in relation to big data, entail? As an introduction to this

  11. Assessing Big Data

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2015-01-01

In recent years, big data has been one of the most controversially discussed technologies in terms of its possible positive and negative impact. Therefore, the need for technology assessments is obvious. This paper first provides, based on the results of a technology assessment study, an overview of the potential and challenges associated with big data, and then describes the problems experienced during the study as well as methods found helpful to address them. The paper concludes with reflections on how the insights from the technology assessment study may have an impact on the future governance of big data.

  12. Big bang nucleosynthesis

    International Nuclear Information System (INIS)

    The precision of measurements in modern cosmology has made huge strides in recent years, with measurements of the cosmic microwave background and the determination of the Hubble constant now rivaling the level of precision of the predictions of big bang nucleosynthesis. However, these results are not necessarily consistent with the predictions of the Standard Model of big bang nucleosynthesis. Reconciling these discrepancies may require extensions of the basic tenets of the model, and possibly of the reaction rates that determine the big bang abundances

  13. Big data for dummies

    CERN Document Server

    Hurwitz, Judith; Halper, Fern; Kaufman, Marcia

    2013-01-01

    Find the right big data solution for your business or organization Big data management is one of the major challenges facing business, industry, and not-for-profit organizations. Data sets such as customer transactions for a mega-retailer, weather patterns monitored by meteorologists, or social network activity can quickly outpace the capacity of traditional data management tools. If you need to develop or manage big data solutions, you'll appreciate how these four experts define, explain, and guide you through this new and often confusing concept. You'll learn what it is, why it m

  14. Physical training and weight loss in dogs lead to transcriptional changes in genes involved in the glucose-transport pathway in muscle and adipose tissues

    DEFF Research Database (Denmark)

    Herrera Uribe, Juber; Vitger, Anne Désiré; Ritz, Christian;

    2016-01-01

Obesity is a worldwide problem in humans and domestic animals. Interventions, including a combination of dietary management and exercise, have proven to be effective for inducing weight loss in humans. In companion animals, the role of exercise in the management of obesity has received relatively little attention. The aim of the present study was to investigate changes in the transcriptome of key energy metabolism genes in muscle and adipose tissues in response to diet-induced weight loss alone, or combined with exercise, in dogs. Overweight pet dogs were enrolled on a weight loss programme, based...

  15. The Big Bang Singularity

    Science.gov (United States)

    Ling, Eric

The big bang theory is a model of the universe which makes the striking prediction that the universe began a finite amount of time in the past at the so-called "Big Bang singularity." We explore the physical and mathematical justification of this surprising result. After laying down the framework of the universe as a spacetime manifold, we combine physical observations with global symmetry assumptions to deduce the FRW cosmological models, which predict a big bang singularity. Next we prove a couple of theorems due to Stephen Hawking which show that the big bang singularity exists even if one removes the global symmetry assumptions. Lastly, we investigate the conditions one needs to impose on a spacetime if one wishes to avoid a singularity. The ideas and concepts used here to study spacetimes are similar to those used to study Riemannian manifolds; therefore we compare and contrast the two geometries throughout.
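For context, the FRW cosmological models mentioned above take the standard textbook form (background material, not a result specific to this work): a homogeneous, isotropic metric whose scale factor obeys the Friedmann equation,

```latex
ds^2 = -dt^2 + a(t)^2\left[\frac{dr^2}{1-kr^2} + r^2\,d\Omega^2\right],
\qquad
\left(\frac{\dot a}{a}\right)^2 = \frac{8\pi G}{3}\,\rho - \frac{k}{a^2}.
```

For matter satisfying the strong energy condition ($\rho + 3p > 0$), the acceleration equation forces $\ddot a < 0$, so $a(t)$ must vanish at a finite time in the past; this is the big bang singularity that the Hawking theorems then establish without the symmetry assumptions.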

  16. Reframing Open Big Data

    DEFF Research Database (Denmark)

    Marton, Attila; Avital, Michel; Jensen, Tina Blegind

    2013-01-01

Recent developments in the techniques and technologies of collecting, sharing and analysing data are challenging the field of information systems (IS) research, let alone the boundaries of organizations and the established practices of decision-making. Coined ‘open data’ and ‘big data’, these developments introduce an unprecedented level of societal and organizational engagement with the potential of computational data to generate new insights and information. Based on the commonalities shared by open data and big data, we develop a research framework that we refer to as open big data (OBD) by employing the dimensions of ‘order’ and ‘relationality’. We argue that these dimensions offer a viable approach for IS research on open and big data because they address one of the core value propositions of IS; i.e. how to support organizing with computational data. We contrast these dimensions with two...

  17. Big Creek Pit Tags

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The BCPITTAGS database is used to store data from an Oncorhynchus mykiss (steelhead/rainbow trout) population dynamics study in Big Creek, a coastal stream along...

  18. Big data opportunities and challenges

    CERN Document Server

    2014-01-01

    This ebook aims to give practical guidance for all those who want to understand big data better and learn how to make the most of it. Topics range from big data analysis, mobile big data and managing unstructured data to technologies, governance and intellectual property and security issues surrounding big data.

  19. Sharing big biomedical data

    OpenAIRE

    Toga, Arthur W.; Dinov, Ivo D.

    2015-01-01

    Background The promise of Big Biomedical Data may be offset by the enormous challenges in handling, analyzing, and sharing it. In this paper, we provide a framework for developing practical and reasonable data sharing policies that incorporate the sociological, financial, technical and scientific requirements of a sustainable Big Data dependent scientific community. Findings Many biomedical and healthcare studies may be significantly impacted by using large, heterogeneous and incongruent data...

  20. Testing Big Bang Nucleosynthesis

    OpenAIRE

    Steigman, Gary

    1996-01-01

Big Bang Nucleosynthesis (BBN), along with the cosmic background radiation and the Hubble expansion, is one of the pillars of the standard, hot, big bang cosmology, since the primordial synthesis of the light nuclides (D, $^3$He, $^4$He, $^7$Li) must have occurred during the early evolution of a universe described by this model. The overall consistency between the predicted and observed abundances of the light nuclides, each of which spans a range of some nine orders of magnitude, provides impr...

  1. Minsky on "Big Government"

    Directory of Open Access Journals (Sweden)

    Daniel de Santana Vasconcelos

    2014-03-01

Full Text Available This paper's objective is to assess, in light of Minsky's main works, his view and analysis of what he called "Big Government": the huge institution which, in parallel with the "Big Bank", was capable of ensuring stability in the capitalist system and regulating its inherently unstable financial system in the mid-20th century. In this work, we analyze how Minsky proposes an active role for the government in a complex economic system flawed by financial instability.

  2. Big Bang baryosynthesis

    International Nuclear Information System (INIS)

    In these lectures I briefly review Big Bang baryosynthesis. In the first lecture I discuss the evidence which exists for the BAU, the failure of non-GUT symmetrical cosmologies, the qualitative picture of baryosynthesis, and numerical results of detailed baryosynthesis calculations. In the second lecture I discuss the requisite CP violation in some detail, further the statistical mechanics of baryosynthesis, possible complications to the simplest scenario, and one cosmological implication of Big Bang baryosynthesis. (orig./HSI)

  3. Conociendo Big Data [Getting to Know Big Data]

    Directory of Open Access Journals (Sweden)

    Juan José Camargo-Vega

    2014-12-01

Full Text Available Given the importance the term Big Data has acquired, this research sought to study and analyze exhaustively the state of the art of Big Data; in addition, as a second objective, it analyzed the characteristics, tools, technologies, models, and standards related to Big Data; and finally, it sought to identify the most relevant characteristics in the management of Big Data, in order to cover everything concerning the central topic of the research. The methodology used included reviewing the state of the art of Big Data and presenting its current situation; examining Big Data technologies; presenting some of the NoSQL databases, which are the ones that allow processing of data in unstructured formats; and showing the data models and the technologies for analyzing them, ending with some benefits of Big Data. The methodological design used for the research was non-experimental, since no variables are manipulated, and exploratory, because this research begins to explore the Big Data landscape.

  4. Effective dynamics of the matrix big bang

    International Nuclear Information System (INIS)

    We study the leading quantum effects in the recently introduced matrix big bang model. This amounts to a study of supersymmetric Yang-Mills theory compactified on the Milne orbifold. We find a one-loop potential that is attractive near the big bang. Surprisingly, the potential decays very rapidly at late times where it appears to be generated by D-brane effects. Usually, general covariance constrains the form of any effective action generated by renormalization group flow. However, the form of our one-loop potential seems to violate these constraints in a manner that suggests a connection between the cosmological singularity and long wavelength, late time physics

  5. Effective Dynamics of the Matrix Big Bang

    CERN Document Server

    Craps, Ben; Rajaraman, Arvind; Sethi, Savdeep

    2006-01-01

    We study the leading quantum effects in the recently introduced Matrix Big Bang model. This amounts to a study of supersymmetric Yang-Mills theory compactified on the Milne orbifold. We find a one-loop potential that decays near the Big Bang. More surprisingly, the potential decays very rapidly at late times where it appears to be generated by D-brane effects. Usually, general covariance constrains the form of any effective action generated by renormalization group flow. However, the form of our one-loop potential seems to violate these constraints in a manner that suggests a connection between the cosmological singularity and long wavelength, late time physics.

  6. Reduced skeletal muscle mitochondrial respiration and improved glucose metabolism in nondiabetic obese women during a very low calorie dietary intervention leading to rapid weight loss

    DEFF Research Database (Denmark)

    Rabøl, Rasmus; Svendsen, Pernille F; Skovbro, Mette;

    2009-01-01

    measured in permeabilized muscle fibers using high-resolution respirometry. Average weight loss was 11.5% (P < .05), but the levels of IMTG remained unchanged. Fasting plasma glucose, plasma insulin homeostasis model assessment of insulin resistance, and insulin sensitivity index (composite) obtained...

  7. Big Data in Caenorhabditis elegans: quo vadis?

    Science.gov (United States)

    Hutter, Harald; Moerman, Donald

    2015-11-01

    A clear definition of what constitutes "Big Data" is difficult to identify, but we find it most useful to define Big Data as a data collection that is complete. By this criterion, researchers on Caenorhabditis elegans have a long history of collecting Big Data, since the organism was selected with the idea of obtaining a complete biological description and understanding of development. The complete wiring diagram of the nervous system, the complete cell lineage, and the complete genome sequence provide a framework to phrase and test hypotheses. Given this history, it might be surprising that the number of "complete" data sets for this organism is actually rather small--not because of lack of effort, but because most types of biological experiments are not currently amenable to complete large-scale data collection. Many are also not inherently limited, so that it becomes difficult to even define completeness. At present, we only have partial data on mutated genes and their phenotypes, gene expression, and protein-protein interaction--important data for many biological questions. Big Data can point toward unexpected correlations, and these unexpected correlations can lead to novel investigations; however, Big Data cannot establish causation. As a result, there is much excitement about Big Data, but there is also a discussion on just what Big Data contributes to solving a biological problem. Because of its relative simplicity, C. elegans is an ideal test bed to explore this issue and at the same time determine what is necessary to build a multicellular organism from a single cell. PMID:26543198

  8. Effects of a torsion field on Big Bang nucleosynthesis

    OpenAIRE

    Brüggen, M.

    1999-01-01

    In this paper it is investigated whether torsion, which arises naturally in most theories of quantum gravity, has observable implications for the Big Bang nucleosynthesis. Torsion can lead to spin flips amongst neutrinos thus turning them into sterile neutrinos. In the early Universe they can alter the helium abundance which is tightly constrained by observations. Here I calculate to what extent torsion of the string theory type leads to a disagreement with the Big Bang nucleosynthesis predic...

  9. Big Data Issues: Performance, Scalability, Availability

    Directory of Open Access Journals (Sweden)

    Laura Matei

    2014-03-01

Full Text Available Nowadays, Big Data is probably one of the most discussed topics not only in the area of data analysis but, I believe, in the whole realm of information technology. Simply typing the words "big data" into an online search engine like Google retrieves approximately 1,660,000,000 results. With such a buzz gathered around this term, I could not help but wonder what this phenomenon means. The ever greater portion of our lives occupied by the combination of the Internet, cloud computing, and mobile devices leads to an ever increasing amount of data that must be captured, communicated, aggregated, stored, and analyzed. The sets of data that we are generating are called Big Data.

  10. Application of the hybrid Big Bang-Big Crunch algorithm to optimal reconfiguration and distributed generation power allocation in distribution systems

    International Nuclear Information System (INIS)

In this paper, a multi-objective framework is proposed for simultaneous optimal network reconfiguration and DG (distributed generation) power allocation. The proposed method encompasses objective functions of power losses, voltage stability, DG cost, and greenhouse gas emissions, and it is optimized subject to power system operational and technical constraints. In order to solve the optimization problem, the HBB-BC (Hybrid Big Bang-Big Crunch) algorithm, one of the most recent heuristic tools, is modified and employed here by introducing a mutation operator to enhance its exploration capability. To resolve the scaling problem of differently-scaled objective functions, a fuzzy membership is used to bring them onto the same scale, and the fuzzy fitness of the final objective function is utilized to measure the satisfaction level of the obtained solution. The proposed method is tested on balanced and unbalanced test systems and its results are comprehensively compared with previous methods considering different scenarios. According to the results, the proposed method not only offers an enhanced exploration capability but also has a better convergence rate compared with previous methods. In addition, simultaneous network reconfiguration and DG power allocation leads to a more optimal result than performing the reconfiguration and DG power allocation tasks separately. - Highlights: • Hybrid Big Bang-Big Crunch algorithm is applied to the network reconfiguration problem. • Joint reconfiguration and DG power allocation leads to a more optimal solution. • A mutation operator is used to improve the exploration capability of the HBB-BC method. • The HBB-BC has a better convergence rate than the compared algorithms
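The Big Bang-Big Crunch loop with a mutation operator described above can be sketched in a few lines. This is a generic, single-objective skeleton minimizing a toy sphere function; the parameter names and the 1/k shrinkage schedule are illustrative assumptions, not the paper's exact formulation (which additionally uses fuzzy multi-objective aggregation and power-system constraints):

```python
import random

def bbbc_minimize(f, dim, bounds, pop=30, iters=100, mutation_rate=0.1, seed=0):
    """Minimize f over a box using a Big Bang-Big Crunch loop with a
    simple mutation operator (a sketch of the hybrid variant)."""
    rng = random.Random(seed)
    lo, hi = bounds
    # Big Bang: initial random population
    pop_x = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop)]
    best = min(pop_x, key=f)
    for k in range(1, iters + 1):
        # Big Crunch: fitness-weighted centre of mass
        # (smaller objective value => larger weight)
        weights = [1.0 / (1e-12 + f(x)) for x in pop_x]
        total = sum(weights)
        centre = [sum(w * x[d] for w, x in zip(weights, pop_x)) / total
                  for d in range(dim)]
        # Big Bang: scatter new candidates around the centre; the spread
        # shrinks as 1/k, so the search contracts over iterations
        radius = (hi - lo) / k
        pop_x = []
        for _ in range(pop):
            x = [min(hi, max(lo, c + radius * rng.gauss(0, 1))) for c in centre]
            # Mutation operator: occasionally re-randomise one coordinate
            # to keep exploration alive (the motivation for the hybrid)
            if rng.random() < mutation_rate:
                x[rng.randrange(dim)] = rng.uniform(lo, hi)
            pop_x.append(x)
        cand = min(pop_x, key=f)
        if f(cand) < f(best):
            best = cand
    return best

# Usage: minimise the sphere function, whose optimum is the origin.
sphere = lambda x: sum(v * v for v in x)
best = bbbc_minimize(sphere, dim=3, bounds=(-5.0, 5.0))
```

Without the mutation step, the contracting radius can trap the whole population near an early centre of mass; the occasional re-randomised coordinate is what restores exploration, which is the paper's stated reason for modifying the basic BB-BC algorithm.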

  11. ALICE: Simulated lead-lead collision

    CERN Multimedia

    2003-01-01

    This track is an example of simulated data modelled for the ALICE detector on the Large Hadron Collider (LHC) at CERN, which will begin taking data in 2008. ALICE will focus on the study of collisions between nuclei of lead, a heavy element that produces many different particles when collided. It is hoped that these collisions will produce a new state of matter known as the quark-gluon plasma, which existed billionths of a second after the Big Bang.

  12. BIG3 Inhibits the Estrogen-Dependent Nuclear Translocation of PHB2 via Multiple Karyopherin-Alpha Proteins in Breast Cancer Cells.

    Directory of Open Access Journals (Sweden)

    Nam-Hee Kim

Full Text Available We recently reported that brefeldin A-inhibited guanine nucleotide-exchange protein 3 (BIG3) binds Prohibitin 2 (PHB2) in the cytoplasm, thereby causing a loss of function of the PHB2 tumor suppressor in the nuclei of breast cancer cells. However, little is known regarding the mechanism by which BIG3 inhibits the nuclear translocation of PHB2 in breast cancer cells. Here, we report that BIG3 blocks the estrogen (E2)-dependent nuclear import of PHB2 via the karyopherin alpha (KPNA) family in breast cancer cells. We found that overexpressed PHB2 interacted with KPNA1, KPNA5, and KPNA6, thereby leading to the E2-dependent translocation of PHB2 into the nuclei of breast cancer cells. More importantly, knockdown of each endogenous KPNA by siRNA caused a significant inhibition of the E2-dependent translocation of PHB2 in BIG3-depleted breast cancer cells, thereby enhancing activation of estrogen receptor alpha (ERα). These data indicated that BIG3 may block the KPNA (KPNA1, KPNA5, and KPNA6) binding region(s) of PHB2, thereby leading to inhibition of KPNA-mediated PHB2 nuclear translocation in the presence of E2 in breast cancer cells. Understanding this regulation of PHB2 nuclear import may provide therapeutic strategies for controlling E2/ERα signals in breast cancer cells.

  13. Big Data Revisited

    DEFF Research Database (Denmark)

    Kallinikos, Jannis; Constantiou, Ioanna

    2015-01-01

We elaborate on key issues of our paper New games, new rules: big data and the changing context of strategy as a means of addressing some of the concerns raised by the paper's commentators. We initially deal with the issue of social data and the role it plays in the current data revolution. The massive involvement of lay publics as instrumented by social media breaks with the strong expert cultures that have underlain the production and use of data in modern organizations. It also sets apart the interactive and communicative processes by which social data is produced from sensor data and the technological recording of facts. We further discuss the significance of the very mechanisms by which big data is produced as distinct from the very attributes of big data, often discussed in the literature. In the final section of the paper, we qualify the alleged importance of algorithms and claim...

  14. Big Data and Peacebuilding

    Directory of Open Access Journals (Sweden)

    Sanjana Hattotuwa

    2013-11-01

Full Text Available Any peace process is an exercise in the negotiation of big data. From centuries-old communal hagiography to the reams of official texts, media coverage and social media updates, peace negotiations generate data. Peacebuilding and peacekeeping today are informed by, often respond to, and contribute to big data. This is no easy task. As recently as a few years ago, before the term big data embraced the virtual on the web, what informed peace process design and implementation was in the physical domain – from contested borders and resources to background information in the form of text. The move from analogue, face-to-face negotiations to online, asynchronous, web-mediated negotiations – which can still include real-world meetings – has profound implications for how peace is strengthened in fragile democracies.

  15. Big data in biomedicine.

    Science.gov (United States)

    Costa, Fabricio F

    2014-04-01

    The increasing availability and growth rate of biomedical information, also known as 'big data', provides an opportunity for future personalized medicine programs that will significantly improve patient care. Recent advances in information technology (IT) applied to biomedicine are changing the landscape of privacy and personal information, with patients getting more control of their health information. Conceivably, big data analytics is already impacting health decisions and patient care; however, specific challenges need to be addressed to integrate current discoveries into medical practice. In this article, I will discuss the major breakthroughs achieved in combining omics and clinical health data in terms of their application to personalized medicine. I will also review the challenges associated with using big data in biomedicine and translational science. PMID:24183925

  16. Big Bear Exploration Ltd. 1998 annual report

    International Nuclear Information System (INIS)

    During the first quarter of 1998, Big Bear completed a purchase of additional light oil assets in the Rainbow Lake area of Alberta; the purchase was financed with new equity and bank debt. The business plan was to immediately exploit these light oil assets, the result of which would be increased reserves, production and cash flow. Although drilling results on the Rainbow Lake properties in the first quarter were mixed, oil prices started to free-fall and drilling costs were much higher than expected. As a result, the company completed a reduced program, which yielded less incremental production and cash flow than it had budgeted for. On April 29, 1998, Big Bear entered into an agreement with Belco Oil and Gas Corp. and Moan Investments Ltd. for the issuance of convertible preferred shares at a gross value of $15,750,000; these shares were eventually converted to common equity at 70 cents per share. As a result of the continued plunge in oil prices, the lending value of the company's assets continued to fall, requiring it to take action in order to meet its financial commitments. Late in the third quarter, Big Bear issued equity for proceeds of $11,032,000, which further reduced the company's debt. Although the company had been extremely active in identifying and pursuing acquisition opportunities, it became evident that Belco Oil and Gas Corp. and Big Bear did not share common criteria for acquisitions, which resulted in the restructuring of their relationship in the fourth quarter. With the future of oil prices in question, Big Bear decided to change its focus to natural gas and to refocus its efforts on acquiring natural gas assets to fuel its growth. The purchase of Blue Range put Big Bear in a difficult position in terms of the latter's growth. In summary, what started as a difficult year ended in disappointment.

  17. Primordial Big Bang Nucleosynthesis

    OpenAIRE

    Olive, Keith A.

    1999-01-01

    Big Bang Nucleosynthesis is the theory of the production of the light element isotopes D, He3, He4, and Li7. After a brief review of the essential elements of the standard Big Bang model at a temperature of about 1 MeV, the theoretical input and predictions of BBN are discussed. The theory is tested by observational determinations of the light element abundances, and the current status of these observations is reviewed. Concordance of the standard model and the related observations is f...

  18. Networks & big data

    OpenAIRE

    Litvak, Nelly; van der Meulen, P.

    2015-01-01

    Once a year, the NWO cluster Stochastics – Theoretical and Applied Research (STAR) organises a STAR Outreach Day, a one-day event around a theme that is of a broad interest to the stochastics community in the Netherlands. The last Outreach Day took place at Eurandom on 12 December 2014. The theme of the day was ‘Networks & Big Data’. The topic is very timely. The Vision document 2025 of the PlatformWiskunde Nederland (PWN) mentions big data as one of the six “major societal and scientific tre...

  19. Baryon symmetric big bang cosmology

    Science.gov (United States)

    Stecker, F. W.

    1978-01-01

    Both quantum theory and Einstein's theory of special relativity lead to the supposition that matter and antimatter were produced in equal quantities during the big bang. It is noted that local matter/antimatter asymmetries may be reconciled with universal symmetry by assuming (1) a slight imbalance of matter over antimatter in the early universe, annihilation, and a subsequent remainder of matter; (2) localized regions of excess for one or the other type of matter as an initial condition; and (3) an extremely dense, high-temperature state with zero net baryon number, i.e., matter/antimatter symmetry. Attention is given to the third assumption, which is the simplest and the most in keeping with current knowledge of the cosmos, especially as pertains to the universality of the 3 K background radiation. Mechanisms of galaxy formation are discussed, whereby matter and antimatter might have collided and annihilated each other, or have coexisted (and continue to coexist) at vast distances. It is pointed out that baryon symmetric big bang cosmology could probably be proved if an antinucleus could be detected in cosmic radiation.

  20. Physical training and weight loss in dogs lead to transcriptional changes in genes involved in the glucose-transport pathway in muscle and adipose tissues.

    Science.gov (United States)

    Herrera Uribe, Juber; Vitger, Anne D; Ritz, Christian; Fredholm, Merete; Bjørnvad, Charlotte R; Cirera, Susanna

    2016-02-01

    Obesity is a worldwide problem in humans and domestic animals. Interventions, including a combination of dietary management and exercise, have proven to be effective for inducing weight loss in humans. In companion animals, the role of exercise in the management of obesity has received relatively little attention. The aim of the present study was to investigate changes in the transcriptome of key energy metabolism genes in muscle and adipose tissues in response to diet-induced weight loss alone, or combined with exercise, in dogs. Overweight pet dogs were enrolled on a weight loss programme based on calorie restriction and physical training (FD group, n = 5) or calorie restriction alone (DO group, n = 7). mRNA expression of 12 genes and six microRNAs was investigated using quantitative real-time PCR (qPCR). In the FD group, FOXO1 and RAC1 were expressed at lower levels in adipose tissue, whereas ESRRA and AKT2 were more highly expressed in muscle, when compared with the DO group. Comparing expression before and after the intervention, in the DO group, nine genes and three microRNAs showed significantly altered expression in adipose tissue (PPARG, ADIPOQ and FOXO1; P muscle. Thus, calorie restriction causes regulation of several metabolic genes in both tissues. The mild exercise incorporated into this study design was sufficient to elicit transcriptional changes in adipose and muscle tissues, suggesting a positive effect on glucose metabolism. The study findings support the inclusion of exercise in the management of canine obesity. PMID:26701817
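    The record above quantifies gene expression by quantitative real-time PCR. The paper's exact normalization scheme is not stated in this abstract; as an illustration only, the sketch below computes a relative fold change with the widely used 2^-ΔΔCt method, using hypothetical Ct values and a hypothetical function name.

    ```python
    def fold_change_ddct(ct_target_treated, ct_ref_treated,
                         ct_target_control, ct_ref_control):
        """Relative expression by the 2^-ddCt method: normalize the target
        gene's Ct to a reference gene, then compare treated vs control."""
        d_ct_treated = ct_target_treated - ct_ref_treated
        d_ct_control = ct_target_control - ct_ref_control
        dd_ct = d_ct_treated - d_ct_control
        return 2.0 ** (-dd_ct)

    # Hypothetical Ct values: the target amplifies ~2 cycles later after the
    # intervention, relative to the reference gene -> ~4-fold down-regulation.
    print(fold_change_ddct(24.0, 18.0, 22.0, 18.0))  # 0.25
    ```

    A fold change below 1 indicates down-regulation after the intervention; above 1, up-regulation.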

  1. Differential changes in serum uric acid concentrations in sibutramine promoted weight loss in diabetes: results from four weeks of the lead-in period of the SCOUT trial

    DEFF Research Database (Denmark)

    Andersson, Charlotte; Weeke, Peter; Brendorp, Bente;

    2009-01-01

    . METHODS AND RESULTS: Data from a four week single-blind lead-in period of the Sibutramine Cardiovascular OUTcomes (SCOUT) study were analyzed. 2584 patients (24%) had diabetes mellitus (DM) only, 1748 (16%) had cardiovascular disease (CVD) only and 6397 (60%) had both DM + CVD. Uric acid concentrations...

  2. Big Data ethics

    NARCIS (Netherlands)

    Zwitter, Andrej

    2014-01-01

    The speed of development in Big Data and associated phenomena, such as social media, has surpassed the capacity of the average consumer to understand his or her actions and their knock-on effects. We are moving towards changes in how ethics has to be perceived: away from individual decisions with sp

  3. Space big book

    CERN Document Server

    Homer, Charlene

    2007-01-01

    Our Combined resource includes all necessary areas of Space for grades five to eight. Get the big picture about the Solar System, Galaxies and the Universe as your students become fascinated by the interesting information about the Sun, Earth, Moon, Comets, Asteroids Meteoroids, Stars and Constellations. Also, thrill your young astronomers as they connect Earth and space cycles with their daily life.

  4. The Big Bang

    CERN Multimedia

    Moods, Patrick

    2006-01-01

    How did the Universe begin? The favoured theory is that everything - space, time, matter - came into existence at the same moment, around 13.7 thousand million years ago. This event was scornfully referred to as the "Big Bang" by Sir Fred Hoyle, who did not believe in it and maintained that the Universe had always existed.

  5. A Big Bang Lab

    Science.gov (United States)

    Scheider, Walter

    2005-01-01

    The February 2005 issue of The Science Teacher (TST) reminded everyone that by learning how scientists study stars, students gain an understanding of how science measures things that can not be set up in lab, either because they are too big, too far away, or happened in a very distant past. The authors of "How Far are the Stars?" show how the…

  6. Governing Big Data

    Directory of Open Access Journals (Sweden)

    Andrej J. Zwitter

    2014-04-01

    Full Text Available 2.5 quintillion bytes of data are created every day through pictures, messages, GPS data, etc. "Big Data" is seen simultaneously as the new Philosopher's Stone and as Pandora's box: a source of great knowledge and power, but equally the root of serious problems.

  7. The big bang

    Science.gov (United States)

    Silk, Joseph

    Our universe was born billions of years ago in a hot, violent explosion of elementary particles and radiation - the big bang. What do we know about this ultimate moment of creation, and how do we know it? Drawing upon the latest theories and technology, this new edition of The big bang is a sweeping, lucid account of the event that set the universe in motion. Joseph Silk begins his story with the first microseconds of the big bang, on through the evolution of stars, galaxies, clusters of galaxies, quasars, and into the distant future of our universe. He also explores the fascinating evidence for the big bang model and recounts the history of cosmological speculation. Revised and updated, this new edition features all the most recent astronomical advances, including: photos and measurements from the Hubble Space Telescope, Cosmic Background Explorer Satellite (COBE), and Infrared Space Observatory; the latest estimates of the age of the universe; new ideas in string and superstring theory; recent experiments on neutrino detection; new theories about the presence of dark matter in galaxies; new developments in the theory of the formation and evolution of galaxies; the latest ideas about black holes, wormholes, quantum foam, and multiple universes.

  8. Big Java late objects

    CERN Document Server

    Horstmann, Cay S

    2012-01-01

    Big Java: Late Objects is a comprehensive introduction to Java and computer programming, which focuses on the principles of programming, software engineering, and effective learning. It is designed for a two-semester first course in programming for computer science students.

  9. Big is beautiful

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, J.P.

    2007-06-08

    Although big solar systems are both effective and architecturally pleasing, they are still not widespread in Germany. Recently, politicians reacted by improving funding conditions. In order to prevent planning errors, planners and fitters must be better trained, and standardisation of systems must be enhanced. (orig.)

  10. Big ideas: innovation policy

    OpenAIRE

    Van Reenen, John

    2011-01-01

    In the last CentrePiece, John Van Reenen stressed the importance of competition and labour market flexibility for productivity growth. His latest in CEP's 'big ideas' series describes the impact of research on how policy-makers can influence innovation more directly - through tax credits for business spending on research and development.

  11. The over-expression of an Arabidopsis B3 transcription factor, ABS2/NGAL1, leads to the loss of flower petals.

    Science.gov (United States)

    Shao, Jingxia; Liu, Xiayan; Wang, Rui; Zhang, Gaisheng; Yu, Fei

    2012-01-01

    Transcriptional regulation is involved in many aspects of plant development and is mainly achieved through the actions of transcription factors (TFs). To investigate the mechanisms of plant development, we carried out genetic screens for mutants with abnormal shoot development. Taking an activation tagging approach, we isolated a gain-of-function mutant abs2-1D (abnormal shoot 2-1D). abs2-1D showed pleiotropic growth defects at both the vegetative and reproductive developmental stages. We cloned ABS2, which encodes a member of the RAV sub-family of plant B3-type transcription factors. Phylogenetic analysis showed that ABS2 is closely related to the NGATHA (NGA) genes involved in flower development, and it was previously named NGATHA-Like 1 (NGAL1). NGAL1 was expressed mainly in the root and, within flower tissues, in the filament of the stamen, and a sub-cellular localization assay revealed that NGAL1 accumulates in the nucleus. Interestingly, over-expression of NGAL1 driven by the constitutive 35S promoter led to transgenic plants with conspicuous flower defects, particularly a loss-of-petal phenotype. A loss-of-function ngal1-1 mutant did not show an obvious phenotype, suggesting the existence of redundant activities and also the utility of gain-of-function genetic screens. Our results show that over-expression of NGAL1 is capable of altering flower petal development, as well as shoot development. PMID:23185464

  12. Business and Science - Big Data, Big Picture

    Science.gov (United States)

    Rosati, A.

    2013-12-01

    Data Science is more than the creation, manipulation, and transformation of data. It is more than Big Data. The business world seems to have a hold on the term 'data science' and, for now, they define what it means. But business is very different than science. In this talk, I address how large datasets, Big Data, and data science are conceptually different in business and science worlds. I focus on the types of questions each realm asks, the data needed, and the consequences of findings. Gone are the days of datasets being created or collected to serve only one purpose or project. The trick with data reuse is to become familiar enough with a dataset to be able to combine it with other data and extract accurate results. As a Data Curator for the Advanced Cooperative Arctic Data and Information Service (ACADIS), my specialty is communication. Our team enables Arctic sciences by ensuring datasets are well documented and can be understood by reusers. Previously, I served as a data community liaison for the North American Regional Climate Change Assessment Program (NARCCAP). Again, my specialty was communicating complex instructions and ideas to a broad audience of data users. Before entering the science world, I was an entrepreneur. I have a bachelor's degree in economics and a master's degree in environmental social science. I am currently pursuing a Ph.D. in Geography. Because my background has embraced both the business and science worlds, I would like to share my perspectives on data, data reuse, data documentation, and the presentation or communication of findings. My experiences show that each can inform and support the other.

  13. Big Data and Chemical Education

    Science.gov (United States)

    Pence, Harry E.; Williams, Antony J.

    2016-01-01

    The amount of computerized information that organizations collect and process is growing so large that the term Big Data is commonly being used to describe the situation. Accordingly, Big Data is defined by a combination of the Volume, Variety, Velocity, and Veracity of the data being processed. Big Data tools are already having an impact in…

  14. Identifying Dwarfs Workloads in Big Data Analytics

    OpenAIRE

    Gao, Wanling; Luo, Chunjie; Zhan, Jianfeng; Ye, Hainan; He, Xiwen; Wang, Lei; Zhu, Yuqing; Tian, Xinhui

    2015-01-01

    Big data benchmarking is particularly important and provides applicable yardsticks for evaluating booming big data systems. However, wide coverage and great complexity of big data computing impose big challenges on big data benchmarking. How can we construct a benchmark suite using a minimum set of units of computation to represent diversity of big data analytics workloads? Big data dwarfs are abstractions of extracting frequently appearing operations in big data computing. One dwarf represen...

  15. Increased expression of the dopamine transporter leads to loss of dopamine neurons, oxidative stress and l-DOPA reversible motor deficits.

    Science.gov (United States)

    Masoud, S T; Vecchio, L M; Bergeron, Y; Hossain, M M; Nguyen, L T; Bermejo, M K; Kile, B; Sotnikova, T D; Siesser, W B; Gainetdinov, R R; Wightman, R M; Caron, M G; Richardson, J R; Miller, G W; Ramsey, A J; Cyr, M; Salahpour, A

    2015-02-01

    The dopamine transporter is a key protein responsible for regulating dopamine homeostasis. Its function is to transport dopamine from the extracellular space into the presynaptic neuron. Studies have suggested that accumulation of dopamine in the cytosol can trigger oxidative stress and neurotoxicity. Previously, ectopic expression of the dopamine transporter was shown to cause damage in non-dopaminergic neurons due to their inability to handle cytosolic dopamine. However, it is unknown whether increasing dopamine transporter activity will be detrimental to dopamine neurons that are inherently capable of storing and degrading dopamine. To address this issue, we characterized transgenic mice that over-express the dopamine transporter selectively in dopamine neurons. We report that dopamine transporter over-expressing (DAT-tg) mice display spontaneous loss of midbrain dopamine neurons that is accompanied by increases in oxidative stress markers, 5-S-cysteinyl-dopamine and 5-S-cysteinyl-DOPAC. In addition, metabolite-to-dopamine ratios are increased and VMAT2 protein expression is decreased in the striatum of these animals. Furthermore, DAT-tg mice also show fine motor deficits on challenging beam traversal that are reversed with l-DOPA treatment. Collectively, our findings demonstrate that even in neurons that routinely handle dopamine, increased uptake of this neurotransmitter through the dopamine transporter results in oxidative damage, neuronal loss and l-DOPA reversible motor deficits. In addition, DAT over-expressing animals are highly sensitive to MPTP-induced neurotoxicity. The effects of increased dopamine uptake in these transgenic mice could shed light on the unique vulnerability of dopamine neurons in Parkinson's disease. PMID:25447236

  16. Large scale and big data processing and management

    CERN Document Server

    Sakr, Sherif

    2014-01-01

    Large Scale and Big Data: Processing and Management provides readers with a central source of reference on the data management techniques currently available for large-scale data processing. Presenting chapters written by leading researchers, academics, and practitioners, it addresses the fundamental challenges associated with Big Data processing tools and techniques across a range of computing environments.The book begins by discussing the basic concepts and tools of large-scale Big Data processing and cloud computing. It also provides an overview of different programming models and cloud-bas

  17. Research on Privacy Protection in Big Data Environment

    Directory of Open Access Journals (Sweden)

    Gang Zeng

    2015-05-01

    Full Text Available Big data has now become a hot topic in academia and industry; it is affecting modes of thinking and working, and daily life. But there are many security risks in data collection, storage and use. Privacy leakage causes serious problems for users, and false data will lead to erroneous results in big data analysis. This paper first introduces the security problems faced by big data, analyzes the causes of privacy problems, and discusses principles for solving them. Finally, it discusses technical means for privacy protection.
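    The abstract above points to technical means for privacy protection without naming them in this excerpt. One standard example (chosen here purely for illustration, not taken from the paper) is a differentially private count that adds Laplace noise calibrated to the query's sensitivity; the function name and parameters below are hypothetical.

    ```python
    import math
    import random

    def dp_count(values, predicate, epsilon):
        """Release a count with Laplace(1/epsilon) noise, the standard
        differential-privacy mechanism for a sensitivity-1 counting query."""
        true_count = sum(1 for v in values if predicate(v))
        # Sample Laplace(0, 1/epsilon) noise via the inverse-CDF transform.
        u = random.random() - 0.5
        scale = 1.0 / epsilon
        noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
        return true_count + noise

    # Smaller epsilon -> more noise -> stronger privacy, lower accuracy.
    noisy = dp_count(range(100), lambda v: v < 40, epsilon=1.0)
    ```

    Each individual's presence changes the true count by at most 1, so noise of scale 1/epsilon masks any single record's contribution.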

  18. Big Data: Philosophy, emergence, crowdledge, and science education

    Directory of Open Access Journals (Sweden)

    Renato P. dos Santos

    2016-02-01

    Full Text Available Big Data has already passed out of hype and is now a field that deserves serious academic investigation, and natural scientists should also become familiar with Analytics. On the other hand, there is little empirical evidence that any science taught in school is helping people to lead happier, more prosperous, or more politically well-informed lives. In this work, we seek support in the Philosophy and Constructionism literatures to discuss the realm of the concepts of Big Data and its philosophy, the notions of ‘emergence’ and crowdledge, and how we see learning-with-Big-Data as a promising new way to learn Science.

  19. Effects of Fastac 50 EC on bumble bee Bombus terrestris L. respiration: DGE disappearance does not lead to increasing water loss.

    Science.gov (United States)

    Muljar, Riin; Karise, Reet; Viik, Eneli; Kuusik, Aare; Williams, Ingrid; Metspalu, Luule; Hiiesaar, Külli; Must, Anne; Luik, Anne; Mänd, Marika

    2012-11-01

    Sublethal effects of pesticides in insects can be observed through physiological changes, which are commonly estimated by metabolic rate and respiratory patterns, more precisely by the patterns of discontinuous gas-exchange (DGE) cycles. The aim of the present research was to study the effect of low concentrations of Fastac 50 EC on the cycles of CO2 release and respiratory water loss rates (WLR) in bumble bee Bombus terrestris L. foragers. Bumble bees were dipped into 0.004% and 0.002% Fastac 50 EC solutions. Flow-through respirometry was used to record respiration and WLR for 3 h before and after the treatment. The respirometry was combined with infrared actography to enable simultaneous recording of abdominal movements. Our results show that Fastac 50 EC has an after-effect on bumble bee respiratory rhythms and muscle activity but does not affect WLR. Treatment with 0.004% Fastac 50 EC solution resulted in disappearance of the respiration cycles; the lifespan of treated bumble bees was also significantly shorter. Treatment with 0.002% Fastac 50 EC solution had no significant effect on respiration patterns or longevity. We found no evidence for the DGE cycles functioning as a water-saving mechanism. PMID:22960306

  20. ANALYTICS OF BIG DATA

    Directory of Open Access Journals (Sweden)

    Asst. Prof. Shubhada Talegaon

    2014-10-01

    Full Text Available Big Data analytics has started to impact all types of organizations, as it carries the potential power to extract embedded knowledge from big amounts of data and react to it in real time. Current technology enables us to efficiently store and query large datasets; the focus is now on techniques that make use of the complete data set, instead of sampling. This has tremendous implications in areas like machine learning, pattern recognition and classification, sentiment analysis and social network analysis, to name a few. Therefore, there are a number of requirements for moving beyond standard data mining techniques. The purpose of this paper is to survey various techniques for analyzing data.

  2. Big Data as Governmentality

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Madsen, Anders Koed; Rasche, Andreas

    data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data’s impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects......This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big...... of the international development agenda to algorithms that synthesize large-scale data, (3) novel ways of rationalizing knowledge claims that underlie development policies, and (4) shifts in professional and organizational identities of those concerned with producing and processing data for development. Our discussion...

  3. Big Data as Governmentality

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Klinkby Madsen, Anders; Rasche, Andreas

    data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data’s impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects......This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big...... of international development agendas to algorithms that synthesize large-scale data, (3) novel ways of rationalizing knowledge claims that underlie development efforts, and (4) shifts in professional and organizational identities of those concerned with producing and processing data for development. Our discussion...

  4. Risk management using big real time data

    OpenAIRE

    Cheng, Jie

    2014-01-01

    This thesis focuses on risk management in the area of flight delay using big real-time data. It proposes two different prediction models: one is called the General Long Term Departure Prediction Model and the other is named the Improved Real Time Arrival Prediction Model. By studying the main factors that lead to flight delay, this thesis takes weather, carrier, National Aviation System, security and previous late aircraft as analysis factors. By utilizing our models, one can do not only long time b...

  5. Big Bang Nucleosynthesis constraints on new physics

    International Nuclear Information System (INIS)

    Primordial Nucleosynthesis provides a probe of the physics of the early Universe, when the temperature and particle densities were high. The Cosmic Nuclear Reactor may thereby lead to constraints on new physics which may be inaccessible to current accelerators. Current Big Bang Nucleosynthesis (BBN) bounds on the existence and/or properties of new particles are reviewed and used to constrain physics 'beyond the standard model.' (orig.)

  6. Big Data Refinement

    OpenAIRE

    Boiten, Eerke Albert

    2016-01-01

    "Big data" has become a major area of research and associated funding, as well as a focus of utopian thinking. In the still growing research community, one of the favourite optimistic analogies for data processing is that of the oil refinery, extracting the essence out of the raw data. Pessimists look for their imagery to the other end of the petrol cycle, and talk about the "data exhausts" of our society. Obviously, the refinement community knows how to do "refining". This paper explores...

  7. DARPA's Big Mechanism program

    Science.gov (United States)

    Cohen, Paul R.

    2015-07-01

    Reductionist science produces causal models of small fragments of complicated systems. Causal models of entire systems can be hard to construct because what is known of them is distributed across a vast amount of literature. The Big Mechanism program aims to have machines read the literature and assemble the causal fragments found in individual papers into huge causal models, automatically. The current domain of the program is cell signalling associated with Ras-driven cancers.

  8. Canonical Big Operators

    OpenAIRE

    Bertot, Yves; Gonthier, Georges; Ould Biha, Sidi; Pasca, Ioana

    2008-01-01

    In this paper, we present an approach to describe uniformly iterated “big” operations and to provide lemmas that encapsulate all the commonly used reasoning steps on these constructs. We show that these iterated operations can be handled generically using the syntactic notation and canonical structure facilities provided by the Coq system. We then show how these canonical big operations played a crucial enabling role in the study of various parts of linear algebra and multi-dimensional real a...
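    The record above concerns a Coq formalization, but the underlying idea of one generic iterated operator instantiated by different monoids can be sketched in Python (an analogy only, not the paper's Coq development; the `big` helper and its argument names are invented for illustration):

    ```python
    from functools import reduce

    def big(op, idx, r, P, F):
        """Generic iterated operator: fold `op` (with identity `idx`) over
        F(i) for every i in r satisfying predicate P -- a Python analogue
        of the \\big[op/idx]_(i <- r | P i) F i notation."""
        return reduce(op, (F(i) for i in r if P(i)), idx)

    # Big sum of odd i in 1..5, and big product of all i in 1..5:
    # the two instances share one definition, so generic lemmas about
    # `big` (e.g. splitting the range) apply to both.
    total = big(lambda a, b: a + b, 0, range(1, 6), lambda i: i % 2 == 1, lambda i: i)
    prod = big(lambda a, b: a * b, 1, range(1, 6), lambda i: True, lambda i: i)
    print(total, prod)  # 9 120
    ```

    In the Coq setting, canonical structures let the system infer which monoid laws (associativity, identity) justify each rewriting step on such expressions.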

  9. Big Bang 6

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, and then moves from the fundamentals to the applications, from the simple to the complicated. Throughout, the language stays simple, everyday-oriented and belletristic. Volume 6 RG covers gravitation, oscillations and waves, thermodynamics, and an introduction to electricity, using everyday examples and cross-connections to other disciplines.

  10. Big Bang 7

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, and then moves from the fundamentals to the applications, from the simple to the complicated. Throughout, the language stays simple, everyday-oriented and belletristic. In Volume 7, besides an introduction, many current aspects of quantum mechanics (e.g. 'beaming'/teleportation) and electrodynamics (e.g. electrosmog) are treated, as well as the climate problem and chaos theory.

  11. Big Bang 8

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, and then moves from the fundamentals to the applications, from the simple to the complicated. Throughout, the language stays simple, everyday-oriented and belletristic. Volume 8 conveys, in an understandable way, relativity theory, nuclear and particle physics (and their applications in cosmology and astrophysics), nanotechnology, and bionics.

  12. Big Bang 5

    CERN Document Server

    Apolin, Martin

    2007-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, and then moves from the fundamentals to the applications, from the simple to the complicated. Throughout, the language stays simple, everyday-oriented and belletristic. Volume 5 RG covers the fundamentals (systems of units, orders of magnitude) and mechanics (translation, rotation, force, conservation laws).

  13. Think Small Go Big

    Institute of Scientific and Technical Information of China (English)

    汤维维

    2006-01-01

    Before it was founded, Vepoo went through three entrepreneurial pivots. In their own words, moving from "think big go small" to "think small go big" took a year. During that period they exhausted their initial seed funding; fortunately, at the last moment, they saw the first light of dawn.

  14. Phylogeography of postglacial range expansion in Juglans mandshurica (Juglandaceae) reveals no evidence of bottleneck, loss of genetic diversity, or isolation by distance in the leading-edge populations.

    Science.gov (United States)

    Wang, Wen-Ting; Xu, Bing; Zhang, Da-Yong; Bai, Wei-Ning

    2016-09-01

    Past studies of postglacial recolonization patterns in high latitude regions have revealed a significant role of dispersal capacity in shaping the genetic diversity and population structure of temperate trees. However, most of these studies have focused on species with long-distance dispersal followed by exponential population growth and were therefore unable to reveal the patterns in the case of a gradual expansion. Here we studied the impacts of postglacial range expansions on the distribution of genetic diversity in the Manchurian walnut (Juglans mandshurica), a common tree of East Asian cool-temperate deciduous forests that apparently lacks long-distance seed dispersal ability. The genetic diversity and structure of 19 natural walnut populations in Northeast China and the Korean Peninsula were examined using 17 nuclear simple sequence repeat (SSR) loci. Potential habitats under current and past climatic conditions were predicted using the ecological niche modelling (ENM) method. Bayesian clustering analysis revealed three groups, which were inferred to have diverged through multiple glacial-interglacial cycles in multiple refugia during the Quaternary Period. ENM estimated a southward range shift at the Last Glacial Maximum (LGM), but high suitability scores still occurred in the western parts of the Changbai Mountains (Northeast China), the Korean peninsula and the exposed seafloor of the Yellow Sea. In contrast to most other cool-temperate trees co-occurring in the same region, the Manchurian walnut did not show any evidence of a population bottleneck, loss of genetic diversity or isolation by distance during the postglacial expansion. Our study clearly indicates that current northern populations originated from one glacial lineage, and recolonization via a gradually advancing front, due to the lack of a long-distance seed dispersal mechanism, led to no latitudinal decrease in genetic diversity. PMID:27346642

  15. Characterization and Architectural Implications of Big Data Workloads

    OpenAIRE

    Wang, Lei; Zhan, Jianfeng; Zhen JIA; Han, Rui

    2015-01-01

Big data areas are expanding fast in terms of both workloads and runtime systems, a situation that poses a serious challenge to workload characterization, which is the foundation of innovative system and architecture design. Previous major efforts on big data benchmarking either propose a comprehensive but very large set of workloads, or select only a few workloads according to so-called popularity, which may lead to partial or even biased observations. In this paper, o...

  16. From Big Bang to Big Crunch and Beyond

    OpenAIRE

    Elitzur, S.; Giveon, A.; Kutasov, D.; Rabinovici, E.

    2002-01-01

We study a quotient Conformal Field Theory, which describes a 3+1 dimensional cosmological spacetime. Part of this spacetime is the Nappi-Witten (NW) universe, which starts at a ``big bang'' singularity, expands and then contracts to a ``big crunch'' singularity at a finite time. The gauged WZW model contains a number of copies of the NW spacetime, with each copy connected to the preceding one and to the next one at the respective big bang/big crunch singularities. The sequence of NW spaceti...

  17. How Big is Earth?

    Science.gov (United States)

    Thurber, Bonnie B.

    2015-08-01

How Big is Earth celebrates the Year of Light. Using only the sunlight striking the Earth and a wooden dowel, students meet each other and then measure the circumference of the earth. Eratosthenes did it over 2,000 years ago. In Cosmos, Carl Sagan shared the process by which Eratosthenes measured the angle of the shadow cast at local noon when sunlight strikes a stick positioned perpendicular to the ground. By comparing his measurement to another made a distance away, Eratosthenes was able to calculate the circumference of the earth. How Big is Earth provides an online learning environment where students do science the same way Eratosthenes did. A notable precursor was The Eratosthenes Project, conducted in 2005 as part of the World Year of Physics; in fact, we will be drawing on the teacher's guide developed by that project. How Big Is Earth? expands on the Eratosthenes Project through an online learning environment provided by the iCollaboratory, www.icollaboratory.org, where teachers and students from Sweden, China, Nepal, Russia, Morocco, and the United States collaborate, share data, and reflect on their learning of science and astronomy. They share their information and discuss their ideas, brainstorming solutions in a discussion forum. There is an ongoing database of student measurements and another database to collect data on both teacher and student learning from surveys, discussions, and self-reflection done online. We will share our research about the kinds of learning that take place only in global collaborations. The entrance address for the iCollaboratory is http://www.icollaboratory.org.
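The arithmetic behind Eratosthenes' measurement is simple enough to sketch in a few lines of Python (the 7.2-degree shadow-angle difference and the roughly 800 km Alexandria-Syene distance are the traditional textbook figures, used here purely as illustrative assumptions):

```python
# Eratosthenes' method: the Sun's rays are effectively parallel, so the
# difference in shadow angle between two sites equals the angle that the
# arc between them subtends at Earth's centre.  The full circle is 360
# degrees, so: circumference = (360 / angle_difference) * arc_distance.
def circumference_from_shadows(angle_deg, arc_distance_km):
    return 360.0 / angle_deg * arc_distance_km

# Traditional textbook figures: a 7.2-degree angle difference between
# Alexandria and Syene, which are about 800 km apart.
print(circumference_from_shadows(7.2, 800))  # approximately 40000 km
```

Students in the project perform the same proportion with their own paired shadow measurements and the distance between their schools.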

  18. Big Bang Nucleosynthesis Calculation

    CERN Document Server

    Kurki-Suonio, H

    2001-01-01

    I review standard big bang nucleosynthesis and some versions of nonstandard BBN. The abundances of the primordial isotopes D, He-3, and Li-7 produced in standard BBN can be calculated as a function of the baryon density with an accuracy of about 10%. For He-4 the accuracy is better than 1%. The calculated abundances agree fairly well with observations, but the baryon density of the universe cannot be determined with high precision. Possibilities for nonstandard BBN include inhomogeneous and antimatter BBN and nonzero neutrino chemical potentials.

  19. A Matrix Big Bang

    OpenAIRE

    Craps, Ben; Sethi, Savdeep; Verlinde, Erik

    2005-01-01

    The light-like linear dilaton background represents a particularly simple time-dependent 1/2 BPS solution of critical type IIA superstring theory in ten dimensions. Its lift to M-theory, as well as its Einstein frame metric, are singular in the sense that the geometry is geodesically incomplete and the Riemann tensor diverges along a light-like subspace of codimension one. We study this background as a model for a big bang type singularity in string theory/M-theory. We construct the dual Matr...

  20. Big and little OER

    OpenAIRE

    Weller, Martin

    2010-01-01

    Much of the attention around OERs has been on institutional projects which make explicit learning content available. These can be classified as ‘big OER’, but another form of OER is that of small scale, individually produced resources using web 2.0 type services, which are classified as ‘little OER’. This paper examines some of the differences between the use of these two types of OER to highlight issues in open education. These include attitudes towards reputation, the intentionality of the ...

  1. Big Red Telephone, Gone

    Institute of Scientific and Technical Information of China (English)

    Toni Piech

    2006-01-01

The Chinese big red telephones looked exactly as I imagined the ones servicing the direct emergency line between the Kremlin and the White House during the cold-war era would have looked. But here in China, every kiosk seemed to have such a device in the 1990s, and anyone could use it for just 0.2 yuan. The government did not just install public phones on street corners but let small-business owners participate in telecommunication. Supply and demand were juggled by a kind of Hutong capitalism.

  2. Big Data Challenges

    Directory of Open Access Journals (Sweden)

    Alexandru Adrian TOLE

    2013-10-01

Full Text Available The amount of data traveling across the internet today is not only large but complex as well. Companies, institutions, healthcare systems and so on all use piles of data, which are further used for creating reports in order to ensure continuity of the services that they have to offer. The process behind the results that these entities request represents a challenge for software developers and companies that provide IT infrastructure. The challenge is how to manipulate an impressive volume of data that has to be securely delivered through the internet and reach its destination intact. This paper treats the challenges that Big Data creates.

  3. Privacy and Big Data

    CERN Document Server

    Craig, Terence

    2011-01-01

    Much of what constitutes Big Data is information about us. Through our online activities, we leave an easy-to-follow trail of digital footprints that reveal who we are, what we buy, where we go, and much more. This eye-opening book explores the raging privacy debate over the use of personal data, with one undeniable conclusion: once data's been collected, we have absolutely no control over who uses it or how it is used. Personal data is the hottest commodity on the market today-truly more valuable than gold. We are the asset that every company, industry, non-profit, and government wants. Pri

  4. Bigness in compatible systems

    OpenAIRE

    Snowden, Andrew; Wiles, Andrew

    2009-01-01

    Clozel, Harris and Taylor have recently proved a modularity lifting theorem of the following general form: if rho is an l-adic representation of the absolute Galois group of a number field for which the residual representation rho-bar comes from a modular form then so does rho. This theorem has numerous hypotheses; a crucial one is that the image of rho-bar must be "big," a technical condition on subgroups of GL(n). In this paper we investigate this condition in compatible systems. Our main r...

  5. Type I interferon-dependent activation of NK cells by rAd28 or rAd35, but not rAd5, leads to loss of vector-insert expression.

    Science.gov (United States)

    Johnson, Matthew J; Björkström, Niklas K; Petrovas, Constantinos; Liang, Frank; Gall, Jason G D; Loré, Karin; Koup, Richard A

    2014-02-01

    Vaccines constructed from rare-serotype recombinant adenovirus vectors (rAd) such as rAd serotype 28 (rAd28) and rAd35 are currently being explored as alternatives to rAd5-based vaccines because they circumvent the problems with pre-existing immunity that complicate the effectiveness of rAd5 vaccines. However, previous work has demonstrated that the immunogenicity of rAd28 and rAd35 is substantially lower than rAd5. Here we show that rAd28 and rAd35 increase apoptosis of antigen presenting cells (APCs), such as monocytes, relative to rAd5 and mock infected controls. APCs undergoing apoptosis showed an increased loss of vector-insert expression. Loss of vector-insert expression correlated with activation of NK cells, which resulted in apoptosis of co-cultured monocytes. Finally, we show that activation of NK cells is dependent on IFNα which is produced by exposure to rAd28 or rAd35, but not to rAd5. Taken together, these data demonstrate that IFNα-induced activation of NK cells leads to increased monocyte apoptosis and subsequent vector-insert loss. This may be a possible mechanism that results in reduced immunogenicity of rAd28 and rAd35-based vectors. PMID:24325826

  6. Big bang nucleosynthesis

    International Nuclear Information System (INIS)

    We present an overview of the standard model of big bang nucleosynthesis (BBN), which describes the production of the light elements in the early universe. The theoretical prediction for the abundances of D, 3He, 4He, and 7Li is discussed. We emphasize the role of key nuclear reactions and the methods by which experimental cross section uncertainties are propagated into uncertainties in the predicted abundances. The observational determination of the light nuclides is also discussed. Particular attention is given to the comparison between the predicted and observed abundances, which yields a measurement of the cosmic baryon content. The spectrum of anisotropies in the cosmic microwave background (CMB) now independently measures the baryon density to high precision; we show how the CMB data test BBN, and find that the CMB and the D and 4He observations paint a consistent picture. This concordance stands as a major success of the hot big bang. On the other hand, 7Li remains discrepant with the CMB-preferred baryon density; possible explanations are reviewed. Finally, moving beyond the standard model, primordial nucleosynthesis constraints on early universe and particle physics are also briefly discussed

  7. Big Rock Point

    International Nuclear Information System (INIS)

    The Big Rock Point Nuclear Plant is the second oldest operating nuclear power plant in the United States. Its 25-yr history is an embodiment of the history of commercial nuclear power. In some respects, its situation today - 5 yr past the midpoint of its design life - can provide operators of other nuclear plants a glimpse of where they will be in another decade. Construction on Big Rock Point began in 1960. It was completed just 2 1/2 yr later at a cost of $27 million. The plant is a General Electric (GE)-designed boiling water direct cycle, forced circulation, high power density reactor. Its construction was undertaken by Consumers Power under the third round of the U.S. Atomic Energy Commission's (AEC's) Power Demonstration Reactor Program. It was an advanced version of GE's Vallecitos boiling water reactor. The plant's fuel was GE's responsibility and, under contract with the AEC, it conducted a fuel research and development (RandD) program involving the plant. Although the plant was designed for research - its original electrical capacity was set at 50 MW(electric) - the unit was subsequently uprated to 69 MW(net electric). The original plant staff included only 44 people and minimal security. Mirroring the industry experience, the number of people on-site had quadrupled

  8. Pre-big bang geometric extensions of inflationary cosmologies

    CERN Document Server

    Klein, David

    2016-01-01

Robertson-Walker cosmologies within a large class are geometrically extended to larger spacetimes that include spacetime points with zero and negative cosmological times. In the extended spacetimes, the big bang is lightlike, and though singular, it inherits some geometric structure from the original spacetime. Spacelike geodesics are continuous across the cosmological time zero submanifold, which is parameterized by the radius of Fermi space slices, i.e., by the proper distances along spacelike geodesics from a comoving observer to the big bang. The continuous extension of the metric, and the continuously differentiable extension of the leading Fermi metric coefficient g_{ττ} of the observer, restrict the geometry of spacetime points with pre-big bang cosmological time coordinates. In our extensions the big bang is two-dimensional in a certain sense, consistent with some findings in quantum gravity.

  9. Differential Privacy Preserving in Big Data Analytics for Connected Health.

    Science.gov (United States)

    Lin, Chi; Song, Zihao; Song, Houbing; Zhou, Yanhong; Wang, Yi; Wu, Guowei

    2016-04-01

In Body Area Networks (BANs), big data collected by wearable sensors usually contain sensitive information, which must be appropriately protected. Previous methods neglected the privacy protection issue, leading to privacy exposure. In this paper, a differential privacy protection scheme for big data in body sensor networks is developed. Compared with previous methods, this scheme provides privacy protection with higher availability and reliability. We introduce the concept of dynamic noise thresholds, which makes our scheme more suitable for processing big data. Experimental results demonstrate that, even when the attacker has full background knowledge, the proposed scheme can still provide enough interference to big sensitive data so as to preserve privacy. PMID:26872779
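The abstract does not spell out the scheme's mechanics, but ε-differential-privacy schemes of this kind are typically built on the Laplace mechanism, which can be sketched as follows (the function names, sensor readings, and ε value are illustrative assumptions, not taken from the paper):

```python
import math
import random

def laplace_noise(scale, rng):
    # Inverse-CDF sampling from a Laplace(0, scale) distribution.
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_sum(values, epsilon, sensitivity=1.0, rng=None):
    # Release a sum with epsilon-differential privacy: noise is scaled to
    # sensitivity / epsilon, so a smaller epsilon (stronger privacy)
    # yields a noisier released value.
    rng = rng or random.Random()
    return sum(values) + laplace_noise(sensitivity / epsilon, rng)

# Hypothetical BAN readings (heart-rate samples); epsilon is an assumption.
readings = [72.0, 75.0, 71.0, 74.0]
noisy = dp_sum(readings, epsilon=0.5, rng=random.Random(42))
```

The privacy/utility trade-off is visible in the scale parameter: halving ε doubles the expected magnitude of the added noise.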

  10. New Framework for Improving Big Data Analysis Using Mobile Agent

    Directory of Open Access Journals (Sweden)

    Youssef M. ESSA

    2014-01-01

Full Text Available The rising number of applications serving millions of users and dealing with terabytes of data needs faster processing paradigms. Recently, there is growing enthusiasm for the notion of big data analysis. Big data analysis has become a very important aspect of productivity growth, reliability and quality of service (QoS). Processing big data on a single powerful machine is not an efficient solution, so companies have focused on using Hadoop for big data analysis. This is because Hadoop is designed to support parallel and distributed data processing. Hadoop provides a distributed file processing system that stores and processes data at large scale. It enables fault tolerance by replicating data on three or more machines to avoid data loss. Hadoop is based on a client-server model and uses a single master machine called the NameNode. However, Hadoop has several drawbacks affecting its performance and reliability in big data analysis. In this paper, a new framework is proposed to improve big data analysis and overcome these drawbacks, namely task replication, the centralized NameNode, and node failures. The proposed framework is called MapReduce Agent Mobility (MRAM). MRAM is developed using mobile agents and the MapReduce paradigm under the Java Agent Development Framework (JADE).

  11. Passport to the Big Bang

    CERN Multimedia

    De Melis, Cinzia

    2013-01-01

On 2 June 2013 CERN launches a scientific tourist trail through the Pays de Gex and the Canton of Geneva known as the Passport to the Big Bang, inaugurating the project at a big public event. Poster and programme.

  12. IZVEDBENI ELEMENTI U BIG BROTHERU

    OpenAIRE

    Radman, Korana

    2009-01-01

Big Brother offers its audience the "ultimate reality" secured by round-the-clock television-camera surveillance, which has been a subject of debate since it first aired in Europe and worldwide. With this in mind, this paper approaches Big Brother from the perspective of performance studies, attempting to recognize in it some of the possible performances.

  13. The Big Read: Case Studies

    Science.gov (United States)

    National Endowment for the Arts, 2009

    2009-01-01

    The Big Read evaluation included a series of 35 case studies designed to gather more in-depth information on the program's implementation and impact. The case studies gave readers a valuable first-hand look at The Big Read in context. Both formal and informal interviews, focus groups, attendance at a wide range of events--all showed how…

  14. Resources: Building Big (and Small)

    OpenAIRE

    Kelley, Todd R.

    2007-01-01

The article offers a set of videos and web resources for elementary teachers to help them explore five different structures, including bridges, domes, skyscrapers, dams, and tunnels, that have been built big to meet human needs and wants. It includes the miniseries video "Building Big" by David Macaulay and the website www.pbs.org/buildingbig.com.

  15. Big Data: Leveraging Hadoop platform to process Semi and Unstructured data

    OpenAIRE

    Pratiba D; Dr.Shobha G; Vishwas C N

    2015-01-01

The use of the internet has led to the generation of large amounts of data, which is termed big data. Hadoop is a tool widely used for processing big data. Big data comprises complex data which may be in a completely unstructured or semi-structured format. In this paper we discuss how semi-structured data can be processed in Hadoop. We discuss the approaches involved and also how the MapReduce model helps in this processing.

  16. Big Data: Leveraging Hadoop platform to process Semi and Unstructured data

    Directory of Open Access Journals (Sweden)

    Pratiba D

    2015-12-01

Full Text Available The use of the internet has led to the generation of large amounts of data, which is termed big data. Hadoop is a tool widely used for processing big data. Big data comprises complex data which may be in a completely unstructured or semi-structured format. In this paper we discuss how semi-structured data can be processed in Hadoop. We discuss the approaches involved and also how the MapReduce model helps in this processing.
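The MapReduce model these records rely on can be illustrated with a minimal in-memory sketch; this shows the programming model only (map, shuffle, reduce), not Hadoop's actual distributed implementation, and all names here are illustrative:

```python
from collections import defaultdict

def map_phase(records, mapper):
    # Apply the user's mapper to every input record, emitting (key, value) pairs.
    for record in records:
        yield from mapper(record)

def shuffle(pairs):
    # Group all values by key, as the framework does between map and reduce.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups, reducer):
    # Apply the user's reducer once per key.
    return {key: reducer(key, values) for key, values in groups.items()}

# Word count over semi-structured text lines (the canonical MapReduce example):
lines = ["big data on hadoop", "hadoop processes big data"]
word_mapper = lambda line: [(word, 1) for word in line.split()]
count_reducer = lambda key, values: sum(values)
counts = reduce_phase(shuffle(map_phase(lines, word_mapper)), count_reducer)
print(counts["hadoop"], counts["big"])  # → 2 2
```

In Hadoop the same mapper and reducer contracts apply, but the shuffle runs across machines and the framework handles replication and failure recovery.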

  17. Capture reactions on C-14 in nonstandard big bang nucleosynthesis

    Science.gov (United States)

    Wiescher, Michael; Gorres, Joachim; Thielemann, Friedrich-Karl

    1990-01-01

Nonstandard big bang nucleosynthesis leads to the production of C-14. The further reaction path depends on the depletion of C-14 by either photon, alpha, or neutron capture reactions. The nucleus C-14 is of particular importance in these scenarios because it forms a bottleneck for the production of heavier nuclei with A greater than 14. The reaction rates of all three capture reactions under big bang conditions are discussed, and it is shown that the resulting reaction path, leading to the production of heavier elements, is dominated by the (p, gamma) and (n, gamma) rates, contrary to earlier suggestions.

  18. Big Bounce Genesis

    CERN Document Server

    Li, Changhong; Cheung, Yeuk-Kwan E

    2014-01-01

    We report on the possibility to use dark matter mass and its interaction cross section as a smoking gun signal of the existence of a big bounce at the early stage in the evolution of our currently observed universe. A model independent study of dark matter production in the contraction and expansion phases of the bounce universe reveals a new venue for achieving the observed relic abundance in which a significantly smaller amount of dark matter--compared to the standard cosmology--is produced and survives until today, diluted only by the cosmic expansion since the radiation dominated era. Once DM mass and its interaction strength with ordinary matter are determined by experiments, this alternative route becomes a signature of the bounce universe scenario.

  19. A matrix big bang

    International Nuclear Information System (INIS)

    The light-like linear dilaton background represents a particularly simple time-dependent 1/2 BPS solution of critical type-IIA superstring theory in ten dimensions. Its lift to M-theory, as well as its Einstein frame metric, are singular in the sense that the geometry is geodesically incomplete and the Riemann tensor diverges along a light-like subspace of codimension one. We study this background as a model for a big bang type singularity in string theory/M-theory. We construct the dual Matrix theory description in terms of a (1+1)-d supersymmetric Yang-Mills theory on a time-dependent world-sheet given by the Milne orbifold of (1+1)-d Minkowski space. Our model provides a framework in which the physics of the singularity appears to be under control

  20. A Matrix Big Bang

    CERN Document Server

    Craps, B; Verlinde, E; Craps, Ben; Sethi, Savdeep; Verlinde, Erik

    2005-01-01

    The light-like linear dilaton background represents a particularly simple time-dependent 1/2 BPS solution of critical type IIA superstring theory in ten dimensions. Its lift to M-theory, as well as its Einstein frame metric, are singular in the sense that the geometry is geodesically incomplete and the Riemann tensor diverges along a light-like subspace of codimension one. We study this background as a model for a big bang type singularity in string theory/M-theory. We construct the dual Matrix theory description in terms of a (1+1)-d supersymmetric Yang-Mills theory on a time-dependent world-sheet given by the Milne orbifold of (1+1)-d Minkowski space. Our model provides a framework in which the physics of the singularity appears to be under control.

  1. Big nuclear accidents

    International Nuclear Information System (INIS)

    Much of the debate on the safety of nuclear power focuses on the large number of fatalities that could, in theory, be caused by extremely unlikely but imaginable reactor accidents. This, along with the nuclear industry's inappropriate use of vocabulary during public debate, has given the general public a distorted impression of the safety of nuclear power. The way in which the probability and consequences of big nuclear accidents have been presented in the past is reviewed and recommendations for the future are made including the presentation of the long-term consequences of such accidents in terms of 'reduction in life expectancy', 'increased chance of fatal cancer' and the equivalent pattern of compulsory cigarette smoking. (author)

  2. LEADING WITH LEADING INDICATORS

    International Nuclear Information System (INIS)

    This paper documents Fluor Hanford's use of Leading Indicators, management leadership, and statistical methodology in order to improve safe performance of work. By applying these methods, Fluor Hanford achieved a significant reduction in injury rates in 2003 and 2004, and the improvement continues today. The integration of data, leadership, and teamwork pays off with improved safety performance and credibility with the customer. The use of Statistical Process Control, Pareto Charts, and Systems Thinking and their effect on management decisions and employee involvement are discussed. Included are practical examples of choosing leading indicators. A statistically based color coded dashboard presentation system methodology is provided. These tools, management theories and methods, coupled with involved leadership and employee efforts, directly led to significant improvements in worker safety and health, and environmental protection and restoration at one of the nation's largest nuclear cleanup sites

  3. LEADING WITH LEADING INDICATORS

    Energy Technology Data Exchange (ETDEWEB)

    PREVETTE, S.S.

    2005-01-27

    This paper documents Fluor Hanford's use of Leading Indicators, management leadership, and statistical methodology in order to improve safe performance of work. By applying these methods, Fluor Hanford achieved a significant reduction in injury rates in 2003 and 2004, and the improvement continues today. The integration of data, leadership, and teamwork pays off with improved safety performance and credibility with the customer. The use of Statistical Process Control, Pareto Charts, and Systems Thinking and their effect on management decisions and employee involvement are discussed. Included are practical examples of choosing leading indicators. A statistically based color coded dashboard presentation system methodology is provided. These tools, management theories and methods, coupled with involved leadership and employee efforts, directly led to significant improvements in worker safety and health, and environmental protection and restoration at one of the nation's largest nuclear cleanup sites.

  4. DPF Big One

    International Nuclear Information System (INIS)

At its latest venue at Fermilab from 10-14 November, the American Physical Society's Division of Particles and Fields meeting entered a new dimension. These regular meetings, which allow younger researchers to communicate with their peers, have been gaining popularity over the years (this was the seventh in the series), but nobody had expected almost a thousand participants and nearly 500 requests to give talks. Thus Fermilab's 800-seat auditorium had to be supplemented with another room with a video hookup, while the parallel sessions were organized into nine bewildering streams covering fourteen major physics topics. With the conventionality of the Standard Model virtually unchallenged, physics does not move fast these days. While most of the physics results had already been covered in principle at the International Conference on High Energy Physics held in Dallas in August (October, page 1), the Fermilab DPF meeting had a very different atmosphere. Major international meetings like Dallas attract big names from far and wide, and it is difficult in such an august atmosphere for young researchers to find a receptive audience. This was not the case at the DPF parallel sessions. The meeting also adopted a novel approach, with the parallel sessions sandwiched between an initial day of plenaries to set the scene and a final day of summaries. With the whole world waiting for the sixth ('top') quark to be discovered at Fermilab's Tevatron proton-antiproton collider, the meeting began with updates from Avi Yagil and Ronald Madaras from the big detectors, CDF and D0 respectively. Although rumours flew thick and fast, the Tevatron has not yet reached the top, although Yagil could show one intriguing event of a type expected from the heaviest quark

  5. Mining “Big Data” using Big Data Services

    OpenAIRE

    2014-01-01

While many colleagues within science are fed up with the "big data fad", empirical analyses we conducted for the current editorial actually show an inconsistent picture: we use big data services to determine whether there really is an increase in writing about big data or even widespread use of the term. Google Correlate (http://www.google.com/trends/correlate/), the first free tool we are presenting here, doesn't list the term, showing that the number of searches for it is below an absolute min...

  6. Big bang and big crunch in matrix string theory

    International Nuclear Information System (INIS)

    Following the holographic description of linear dilaton null cosmologies with a big bang in terms of matrix string theory put forward by Craps, Sethi, and Verlinde, we propose an extended background describing a universe including both big bang and big crunch singularities. This belongs to a class of exact string backgrounds and is perturbative in the string coupling far away from the singularities, both of which can be resolved using matrix string theory. We provide a simple theory capable of describing the complete evolution of this closed universe

  7. Partnership between small biotech and big pharma.

    Science.gov (United States)

    Wiederrecht, Gregory J; Hill, Raymond G; Beer, Margaret S

    2006-08-01

    The process involved in the identification and development of novel breakthrough medicines at big pharma has recently undergone significant changes, in part because of the extraordinary complexity that is associated with tackling diseases of high unmet need, and also because of the increasingly demanding requirements that have been placed on the pharmaceutical industry by investors and regulatory authorities. In addition, big pharma no longer have a monopoly on the tools and enabling technologies that are required to identify and discover new drugs, as many biotech companies now also have these capabilities. As a result, researchers at biotech companies are able to identify credible drug leads, as well as compounds that have the potential to become marketed medicinal products. This diversification of companies that are involved in drug discovery and development has in turn led to increased partnering interactions between the biotech sector and big pharma. This article examines how Merck and Co Inc, which has historically relied on a combination of internal scientific research and licensed products, has poised itself to become further engaged in partnering with biotech companies, as well as academic institutions, to increase the probability of success associated with identifying novel medicines to treat unmet medical needs--particularly in areas such as central nervous system disorders, obesity/metabolic diseases, atheroma and cancer, and also to cultivate its cardiovascular, respiratory, arthritis, bone, ophthalmology and infectious disease franchises. PMID:16871465

  8. Managing Research Data in Big Science

    CERN Document Server

    Gray, Norman; Woan, Graham

    2012-01-01

    The project which led to this report was funded by JISC in 2010--2011 as part of its 'Managing Research Data' programme, to examine the way in which Big Science data is managed, and produce any recommendations which may be appropriate. Big science data is different: it comes in large volumes, and it is shared and exploited in ways which may differ from other disciplines. This project has explored these differences using as a case-study Gravitational Wave data generated by the LSC, and has produced recommendations intended to be useful variously to JISC, the funding council (STFC) and the LSC community. In Sect. 1 we define what we mean by 'big science', describe the overall data culture there, laying stress on how it necessarily or contingently differs from other disciplines. In Sect. 2 we discuss the benefits of a formal data-preservation strategy, and the cases for open data and for well-preserved data that follow from that. This leads to our recommendations that, in essence, funders should adopt rather lig...

  9. Compressor leading edges

    OpenAIRE

    Goodhand, Martin

    2011-01-01

    Compressor blades often have a small 'spike' in the surface pressure distribution at the leading edge. This may result from blade erosion, manufacture defects or compromises made in the original design process. In this thesis it is shown that these spikes will increase the loss generated by a blade only when they become large enough to initiate boundary layer transition at the leading edge through a separation bubble; this process increases profile loss by about 30%. A criterion is presen...

  10. BigOP: Generating Comprehensive Big Data Workloads as a Benchmarking Framework

    OpenAIRE

    Zhu, Yuqing; Zhan, Jianfeng; Weng, Chuliang; Nambiar, Raghunath; Zhang, Jinchao; Chen, Xingzhen; Wang, Lei

    2014-01-01

Big Data is considered a proprietary asset of companies, organizations, and even nations. Turning big data into real treasure requires the support of big data systems. A variety of commercial and open source products have been unleashed for big data storage and processing. While big data users are facing the choice of which system best suits their needs, big data system developers are facing the question of how to evaluate their systems with regard to general big data processing needs. System b...

  11. From big bang to big crunch and beyond

    International Nuclear Information System (INIS)

    We study a quotient Conformal Field Theory, which describes a 3+1 dimensional cosmological spacetime. Part of this spacetime is the Nappi-Witten (NW) universe, which starts at a 'big bang' singularity, expands and then contracts to a 'big crunch' singularity at a finite time. The gauged WZW model contains a number of copies of the NW spacetime, with each copy connected to the preceding one and to the next one at the respective big bang/big crunch singularities. The sequence of NW spacetimes is further connected at the singularities to a series of non-compact static regions with closed timelike curves. These regions contain boundaries, on which the observables of the theory live. This suggests a holographic interpretation of the physics. (author)

  12. The challenges of big data.

    Science.gov (United States)

    Mardis, Elaine R

    2016-05-01

    The largely untapped potential of big data analytics is a feeding frenzy that has been fueled by the production of many next-generation-sequencing-based data sets that are seeking to answer long-held questions about the biology of human diseases. Although these approaches are likely to be a powerful means of revealing new biological insights, there are a number of substantial challenges that currently hamper efforts to harness the power of big data. This Editorial outlines several such challenges as a means of illustrating that the path to big data revelations is paved with perils that the scientific community must overcome to pursue this important quest. PMID:27147249

  13. Big Data Mining: Tools & Algorithms

    Directory of Open Access Journals (Sweden)

    Adeel Shiraz Hashmi

    2016-03-01

    Full Text Available We are now in Big Data era, and there is a growing demand for tools which can process and analyze it. Big data analytics deals with extracting valuable information from that complex data which can’t be handled by traditional data mining tools. This paper surveys the available tools which can handle large volumes of data as well as evolving data streams. The data mining tools and algorithms which can handle big data have also been summarized, and one of the tools has been used for mining of large datasets using distributed algorithms.

  14. Google BigQuery analytics

    CERN Document Server

    Tigani, Jordan

    2014-01-01

    How to effectively use BigQuery, avoid common mistakes, and execute sophisticated queries against large datasets Google BigQuery Analytics is the perfect guide for business and data analysts who want the latest tips on running complex queries and writing code to communicate with the BigQuery API. The book uses real-world examples to demonstrate current best practices and techniques, and also explains and demonstrates streaming ingestion, transformation via Hadoop in Google Compute engine, AppEngine datastore integration, and using GViz with Tableau to generate charts of query results. In addit

  15. The challenges of big data

    Science.gov (United States)

    2016-01-01

    The largely untapped potential of big data analytics is a feeding frenzy that has been fueled by the production of many next-generation-sequencing-based data sets that are seeking to answer long-held questions about the biology of human diseases. Although these approaches are likely to be a powerful means of revealing new biological insights, there are a number of substantial challenges that currently hamper efforts to harness the power of big data. This Editorial outlines several such challenges as a means of illustrating that the path to big data revelations is paved with perils that the scientific community must overcome to pursue this important quest. PMID:27147249

  16. Big Data: present and future

    Directory of Open Access Journals (Sweden)

    Mircea Raducu TRIFU

    2014-05-01

    Full Text Available The paper explains the importance of the Big Data concept, a concept that even now, after years of development, is for most companies just a cool keyword. The paper also describes the actual level of big data development, what it can do now, and what it will be able to do in the near future. The paper focuses on explaining to non-technical and non-database-related technical specialists what big data basically is, presents the three most important V's as well as the newer ones, the most important solutions used by companies like Google or Amazon, and some interesting perceptions on this subject.

  17. Tax Expert Offers Ideas for Monitoring Big Spending on College Sports

    Science.gov (United States)

    Sander, Libby

    2009-01-01

    The federal government could take a cue from its regulation of charitable organizations in monitoring the freewheeling fiscal habits of big-time college athletics, a leading tax lawyer says. The author reports on the ideas offered by John D. Colombo, a professor at the University of Illinois College of Law, for monitoring big spending on college…

  18. Big data algorithms, analytics, and applications

    CERN Document Server

    Li, Kuan-Ching; Yang, Laurence T; Cuzzocrea, Alfredo

    2015-01-01

    Data are generated at an exponential rate all over the world. Through advanced algorithms and analytics techniques, organizations can harness this data, discover hidden patterns, and use the findings to make meaningful decisions. Containing contributions from leading experts in their respective fields, this book bridges the gap between the vastness of big data and the appropriate computational methods for scientific and social discovery. It also explores related applications in diverse sectors, covering technologies for media/data communication, elastic media/data storage, cross-network media/

  19. Detecting and understanding big events in big cities

    OpenAIRE

    Furletti, Barbara; Trasarti, Roberto; Gabrielli, Lorenzo; Smoreda, Zbigniew; Vanhoof, Maarten; Ziemlicki, Cezary

    2015-01-01

    Recent studies have shown the great potential of big data, such as mobile phone location data, to model human behavior. Big data allow one to analyze people's presence in a territory in a fast and effective way compared with classical surveys (diaries or questionnaires). One of the drawbacks of these collection systems is the incompleteness of the users' traces; people are localized only when they are using their phones. In this work we define a data mining method for identifying people presence an...

  20. Antigravity and the big crunch/big bang transition

    OpenAIRE

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.(Princeton Center for Theoretical Science, Princeton University, Princeton, NJ, 08544, USA); Turok, Neil

    2011-01-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition...

  1. Quantum Fields in a Big Crunch/Big Bang Spacetime

    OpenAIRE

    Tolley, Andrew J.; Turok, Neil

    2002-01-01

    We consider quantum field theory on a spacetime representing the Big Crunch/Big Bang transition postulated in the ekpyrotic or cyclic cosmologies. We show via several independent methods that an essentially unique matching rule holds connecting the incoming state, in which a single extra dimension shrinks to zero, to the outgoing state in which it re-expands at the same rate. For free fields in our construction there is no particle production from the incoming adiabatic vacuum. When interacti...

  2. Sailing through the big crunch-big bang transition

    OpenAIRE

    Bars, Itzhak; Steinhardt, Paul; Turok, Neil

    2013-01-01

    In a recent series of papers, we have shown that theories with scalar fields coupled to gravity (e.g., the standard model) can be lifted to a Weyl-invariant equivalent theory in which it is possible to unambiguously trace the classical cosmological evolution through the transition from big crunch to big bang. The key was identifying a sufficient number of finite, Weyl-invariant conserved quantities to uniquely match the fundamental cosmological degrees of freedom across the transition. In so ...

  3. Hey, big spender

    International Nuclear Information System (INIS)

    Business to business electronic commerce is looming large in the future of the oil industry. It is estimated that by adopting e-commerce the industry could achieve bottom line savings of between $1.8 and $3.4 billion a year on annual gross revenues in excess of $30 billion. At present there are several teething problems to overcome, such as inter-operability standards, which are at least two or three years away. Tying in electronically with specific suppliers is also an expensive proposition, although the big benefits are in fact in doing business with the same suppliers on a continuing basis. Despite these problems, 14 of the world's largest energy and petrochemical companies joined forces in mid-April to create a single Internet procurement marketplace for the industry's complex supply chain. The exchange was designed by B2B (business-to-business) software provider Commerce One Inc.; it will leverage the buying clout of these industry giants (BP Amoco, Royal Dutch Shell Group, Conoco, Occidental Petroleum, Phillips Petroleum, Unocal Corporation and Statoil among them), who currently spend about $125 billion on procurement per year; they hope to save between 5 and 30 per cent depending on the product and the region involved. Other similar schemes, such as Chevron and partners' Petrocosm Marketplace and Network Oil, a Houston-based Internet portal aimed at smaller petroleum companies, are also doing business in the $10 billion per annum range. e-Energy, a cooperative project between IBM, Ericsson and Telus Advertising, is another neutral, virtual marketplace targeted at the oil and gas sector. PetroTRAX, a Calgary-based website, plans to take online procurement and auction sales a big step forward by establishing a portal to handle any oil company's asset management needs. There are also a number of websites targeting specific needs: IndigoPool.com (acquisitions and divestitures) and WellBid.com (products related to upstream oil and gas operators) are just two examples. All in ...

  4. Trends in IT Innovation to Build a Next Generation Bioinformatics Solution to Manage and Analyse Biological Big Data Produced by NGS Technologies

    Science.gov (United States)

    de Brevern, Alexandre G.; Meyniel, Jean-Philippe; Fairhead, Cécile; Neuvéglise, Cécile; Malpertuy, Alain

    2015-01-01

    Sequencing the human genome began in 1994, and 10 years of work were necessary in order to provide a nearly complete sequence. Nowadays, NGS technologies allow sequencing of a whole human genome in a few days. This deluge of data challenges scientists in many ways, as they are faced with data management issues and analysis and visualization drawbacks due to the limitations of current bioinformatics tools. In this paper, we describe how the NGS Big Data revolution changes the way of managing and analysing data. We present how biologists are confronted with an abundance of methods, tools, and data formats. To overcome these problems, we focus on Big Data information technology innovations from the web and business intelligence. We underline the interest of NoSQL databases, which are much more efficient than relational databases. Since Big Data leads to the loss of interactivity with data during analysis due to high processing time, we describe solutions from business intelligence that allow one to regain interactivity whatever the volume of data. We illustrate this point with a focus on the Amadea platform. Finally, we discuss visualization challenges posed by Big Data and present the latest innovations with JavaScript graphic libraries. PMID:26125026

  5. A view on big data and its relation to Informetrics

    Institute of Scientific and Technical Information of China (English)

    Ronald; ROUSSEAU

    2012-01-01

    Purpose: Big data offer a huge challenge. Their very existence leads to the contradiction that the more data we have the less accessible they become, as the particular piece of information one is searching for may be buried among terabytes of other data. In this contribution we discuss the origin of big data and point to three challenges when big data arise: data storage, data processing and generating insights. Design/methodology/approach: Computer-related challenges can be expressed by the CAP theorem, which states that it is only possible to simultaneously provide any two of the three following properties in distributed applications: consistency (C), availability (A) and partition tolerance (P). As an aside we mention Amdahl's law and its application for scientific collaboration. We further discuss data mining in large databases and knowledge representation for handling the results of data mining exercises. We further offer a short informetric study of the field of big data, and point to the ethical dimension of the big data phenomenon. Findings: There still are serious problems to overcome before the field of big data can deliver on its promises. Implications and limitations: This contribution offers a personal view, focusing on the information science aspects, but much more can be said about software aspects. Originality/value: We express the hope that information scientists, including librarians, will be able to play their full role within the knowledge discovery, data mining and big data communities, leading to exciting developments, the reduction of scientific bottlenecks and really innovative applications.

  6. Big bang darkleosynthesis

    Directory of Open Access Journals (Sweden)

    Gordan Krnjaic

    2015-12-01

    Full Text Available In a popular class of models, dark matter comprises an asymmetric population of composite particles with short range interactions arising from a confined nonabelian gauge group. We show that coupling this sector to a well-motivated light mediator particle yields efficient darkleosynthesis, a dark-sector version of big-bang nucleosynthesis (BBN), in generic regions of parameter space. Dark matter self-interaction bounds typically require the confinement scale to be above ΛQCD, which generically yields large (≫MeV/dark-nucleon) binding energies. These bounds further suggest the mediator is relatively weakly coupled, so repulsive forces between dark-sector nuclei are much weaker than Coulomb repulsion between standard-model nuclei, which results in an exponential barrier-tunneling enhancement over standard BBN. Thus, darklei are easier to make and harder to break than visible species with comparable mass numbers. This process can efficiently yield a dominant population of states with masses significantly greater than the confinement scale and, in contrast to dark matter that is a fundamental particle, may allow the dominant form of dark matter to have high spin (S≫3/2), whose discovery would be smoking gun evidence for dark nuclei.

  7. The BigBOSS Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schelgel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; et al. (APC Paris; Brookhaven; IRFU Saclay; CPPM Marseille; CPT Marseille; Durham U.; IEU Seoul; Fermilab; IAA Granada; IAC La Laguna)

    2011-01-01

    BigBOSS will obtain observational constraints that will bear on three of the four 'science frontier' questions identified by the Astro2010 Cosmology and Fundamental Physics Panel of the Decadal Survey: Why is the universe accelerating? What is dark matter? What are the properties of neutrinos? Indeed, the BigBOSS project was recommended for substantial immediate R and D support by the PASAG report. The second highest ground-based priority from the Astro2010 Decadal Survey was the creation of a funding line within the NSF to support a 'Mid-Scale Innovations' program, and it used BigBOSS as a 'compelling' example for support. This choice was the result of the Decadal Survey's Program Prioritization panels reviewing 29 mid-scale projects and recommending BigBOSS 'very highly'.

  8. Big Lake Dam Inspection Report

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This report summarizes an inspection of the Big Lake Dam that was done in September of 1983. The inspection did not reveal any conditions that constitute and...

  9. Le Big Bang en laboratoire

    CERN Multimedia

    Roy, Christelle

    2006-01-01

    Physicists have been dreaming of it for 30 years; thanks to huge particle accelerators, they were able to observe matter as it was just instants after the Big Bang (three different articles in 10 pages)

  10. Big Data Analytics in Healthcare

    OpenAIRE

    Ashwin Belle; Raghuram Thiagarajan; S. M. Reza Soroushmehr; Fatemeh Navidi; Daniel A Beard; Kayvan Najarian

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is sti...

  11. Big Data and Ambulatory Care

    OpenAIRE

    Thorpe, Jane Hyatt; Gray, Elizabeth Alexandra

    2014-01-01

    Big data is heralded as having the potential to revolutionize health care by making large amounts of data available to support care delivery, population health, and patient engagement. Critics argue that big data's transformative potential is inhibited by privacy requirements that restrict health information exchange. However, there are a variety of permissible activities involving use and disclosure of patient information that support care delivery and management. This article presents an ov...

  12. The role of big laboratories

    International Nuclear Information System (INIS)

    This paper presents the role of big laboratories in their function as research infrastructures. Starting from the general definition and features of big laboratories, the paper goes on to present the key ingredients and issues, based on scientific excellence, for the successful realization of large-scale science projects at such facilities. The paper concludes by taking the example of scientific research in the field of particle physics and describing the structures and methods required to be implemented for the way forward. (paper)

  13. The role of big laboratories

    CERN Document Server

    Heuer, Rolf-Dieter

    2013-01-01

    This paper presents the role of big laboratories in their function as research infrastructures. Starting from the general definition and features of big laboratories, the paper goes on to present the key ingredients and issues, based on scientific excellence, for the successful realization of large-scale science projects at such facilities. The paper concludes by taking the example of scientific research in the field of particle physics and describing the structures and methods required to be implemented for the way forward.

  14. Pre-big bang in M-theory

    OpenAIRE

    Cavaglia, Marco

    2001-01-01

    We discuss a simple cosmological model derived from M-theory. Three assumptions lead naturally to a pre-big bang scenario: (a) 11-dimensional supergravity describes the low-energy world; (b) non-gravitational fields live on a three-dimensional brane; and (c) asymptotically past triviality.

  15. Comments on Thomas Wartenberg's "Big Ideas for Little Kids"

    Science.gov (United States)

    Goering, Sara

    2012-01-01

    This short commentary offers praise for Tom Wartenberg's book "Big Ideas for Little Kids" and raises questions about who is best qualified to lead a philosophy discussion with children, and how we are to assess the benefits of doing philosophy with children.

  16. Human Neuroimaging as a “Big Data” Science

    OpenAIRE

    Van Horn, John Darrell; Toga, Arthur W.

    2014-01-01

    The maturation of in vivo neuroimaging has led to incredible quantities of digital information about the human brain. While much is made of the data deluge in science, neuroimaging represents the leading edge of this onslaught of “big data”. A range of neuroimaging databasing approaches has streamlined the transmission, storage, and dissemination of data from such brain imaging studies. Yet few, if any, common solutions exist to support the science of neuroimaging. In this article, we discus...

  17. Big data for bipolar disorder.

    Science.gov (United States)

    Monteith, Scott; Glenn, Tasha; Geddes, John; Whybrow, Peter C; Bauer, Michael

    2016-12-01

    The delivery of psychiatric care is changing with a new emphasis on integrated care, preventative measures, population health, and the biological basis of disease. Fundamental to this transformation are big data and advances in the ability to analyze these data. The impact of big data on the routine treatment of bipolar disorder today and in the near future is discussed, with examples that relate to health policy, the discovery of new associations, and the study of rare events. The primary sources of big data today are electronic medical records (EMR), claims, and registry data from providers and payers. In the near future, data created by patients from active monitoring, passive monitoring of Internet and smartphone activities, and from sensors may be integrated with the EMR. Diverse data sources from outside of medicine, such as government financial data, will be linked for research. Over the long term, genetic and imaging data will be integrated with the EMR, and there will be more emphasis on predictive models. Many technical challenges remain when analyzing big data that relates to size, heterogeneity, complexity, and unstructured text data in the EMR. Human judgement and subject matter expertise are critical parts of big data analysis, and the active participation of psychiatrists is needed throughout the analytical process. PMID:27068058

  18. [Big data in official statistics].

    Science.gov (United States)

    Zwick, Markus

    2015-08-01

    The concept of "big data" stands to change the face of official statistics over the coming years, having an impact on almost all aspects of data production. The tasks of future statisticians will not necessarily be to produce new data, but rather to identify and make use of existing data to adequately describe social and economic phenomena. Until big data can be used correctly in official statistics, a lot of questions need to be answered and problems solved: the quality of data, data protection, privacy, and the sustainable availability are some of the more pressing issues to be addressed. The essential skills of official statisticians will undoubtedly change, and this implies a number of challenges to be faced by statistical education systems, in universities, and inside the statistical offices. The national statistical offices of the European Union have concluded a concrete strategy for exploring the possibilities of big data for official statistics, by means of the Big Data Roadmap and Action Plan 1.0. This is an important first step and will have a significant influence on implementing the concept of big data inside the statistical offices of Germany. PMID:26077871

  19. Dual of Big-bang and Big-crunch

    OpenAIRE

    Bak, Dongsu

    2006-01-01

    Starting from the Janus solution and its gauge theory dual, we obtain the dual gauge theory description of the cosmological solution by the procedure of double analytic continuation. The coupling is driven either to zero or to infinity at the big-bang and big-crunch singularities, which are shown to be related by the S-duality symmetry. In the dual Yang-Mills theory description, these are not singular at all, as the coupling goes to zero in the N=4 Super Yang-Mills theory. The cosmological sing...

  20. Turning big bang into big bounce: II. Quantum dynamics

    International Nuclear Information System (INIS)

    We analyze the big bounce transition of the quantum Friedmann-Robertson-Walker model in the setting of the nonstandard loop quantum cosmology (LQC). Elementary observables are used to quantize composite observables. The spectrum of the energy density operator is bounded and continuous. The spectrum of the volume operator is bounded from below and discrete. It has equally spaced levels, defining a quantum of the volume. The discreteness may imply a foamy structure of spacetime at a semiclassical level which may be detectable in astrophysical and cosmological observations. The nonstandard LQC method has a free parameter that should be fixed in some way to specify the big bounce transition.

  1. Can companies benefit from Big Science? Science and Industry

    CERN Document Server

    Autio, Erkko; Bianchi-Streit, M

    2003-01-01

    Several studies have indicated that there are significant returns on financial investment via "Big Science" centres. Financial multipliers ranging from 2.7 (ESA) to 3.7 (CERN) have been found, meaning that each Euro invested in industry by Big Science generates a two- to fourfold return for the supplier. Moreover, laboratories such as CERN are proud of their record in technology transfer, where research developments lead to applications in other fields - for example, with particle accelerators and detectors. Less well documented, however, is the effect of the experience that technological firms gain through working in the arena of Big Science. Indeed, up to now there has been no explicit empirical study of such benefits. Our findings reveal a variety of outcomes, which include technological learning, the development of new products and markets, and impact on the firm's organization. The study also demonstrates the importance of technologically challenging projects for staff at CERN. Together, these findings i...

  2. Big data challenges: impact, potential responses and research needs

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2016-01-01

    Although reports on big data success stories have been accumulating in the media, most organizations dealing with high-volume, high-velocity and high-variety information assets still face challenges. Only a thorough understanding of these challenges puts organizations into a position in which they...... can make an informed decision for or against big data, and, if the decision is positive, overcome the challenges smoothly. The combination of a series of interviews with leading experts from enterprises, associations and research institutions, and focused literature reviews allowed not only...... framework are also relevant. For large enterprises and startups specialized in big data, it is typically easier to overcome the challenges than it is for other enterprises and public administration bodies....

  3. Hearing Loss

    Science.gov (United States)


  4. Lead Poisoning

    Science.gov (United States)

    Lead is a metal that occurs naturally in the earth's crust. Lead can be found in all parts of our ... from human activities such as mining and manufacturing. Lead used to be in paint; older houses may ...

  5. The use of big data in transfusion medicine.

    Science.gov (United States)

    Pendry, K

    2015-06-01

    'Big data' refers to the huge quantities of digital information now available that describe much of human activity. The science of data management and analysis is rapidly developing to enable organisations to convert data into useful information and knowledge. Electronic health records and new developments in Pathology Informatics now support the collection of 'big laboratory and clinical data', and these digital innovations are now being applied to transfusion medicine. To use big data effectively, we must address concerns about confidentiality and the need for a change in culture and practice, remove barriers to adopting common operating systems and data standards and ensure the safe and secure storage of sensitive personal information. In the UK, the aim is to formulate a single set of data and standards for communicating test results and so enable pathology data to contribute to national datasets. In transfusion, big data has been used for benchmarking, detection of transfusion-related complications, determining patterns of blood use and definition of blood order schedules for surgery. More generally, rapidly available information can monitor compliance with key performance indicators for patient blood management and inventory management leading to better patient care and reduced use of blood. The challenges of enabling reliable systems and analysis of big data and securing funding in the restrictive financial climate are formidable, but not insurmountable. The promise is that digital information will soon improve the implementation of best practice in transfusion medicine and patient blood management globally. PMID:26178303

  6. CLOUD COMPUTING WITH BIG DATA: A REVIEW

    OpenAIRE

    Anjali; Er. Amandeep Kaur; Mrs. Shakshi

    2016-01-01

    Big data is a collection of huge quantities of data, and big data analytics is the process of examining those large amounts of data. Big data and cloud computing are hot issues in information technology. Big data is one of the main problems nowadays. Researchers are focusing on how to handle huge amounts of data with cloud computing and how to achieve adequate security for big data in cloud computing. To handle the big data problem, the Hadoop framework is used, in which data is fragmented and executed in parallel....
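    The fragment-then-execute-in-parallel pattern that the abstract attributes to Hadoop can be sketched with Python's standard library alone. This is a minimal illustration of the idea, not Hadoop itself: the function names (`count_words`, `word_count`) and the default four-way split are illustrative assumptions.

```python
from collections import Counter
from concurrent.futures import ThreadPoolExecutor

def count_words(fragment):
    """Map step: count word occurrences within one data fragment."""
    return Counter(fragment.split())

def word_count(text, n_fragments=4):
    """Fragment the input, process the fragments in parallel, then merge."""
    lines = text.splitlines() or [""]
    step = max(1, len(lines) // n_fragments)
    fragments = ["\n".join(lines[i:i + step])
                 for i in range(0, len(lines), step)]
    with ThreadPoolExecutor(max_workers=len(fragments)) as pool:
        partials = pool.map(count_words, fragments)
    # Reduce step: merge the per-fragment counts into one result.
    total = Counter()
    for partial in partials:
        total.update(partial)
    return total

if __name__ == "__main__":
    sample = "big data\nbig cloud\ndata data\ncloud"
    print(word_count(sample).most_common(1))  # [('data', 3)]
```

    Hadoop applies the same map/merge structure at cluster scale, with fragments (input splits) distributed across machines rather than threads.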

  7. Big Data Analytics in Healthcare

    Directory of Open Access Journals (Sweden)

    Ashwin Belle

    2015-01-01

    Full Text Available The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined.

  8. Big Data Analytics in Healthcare.

    Science.gov (United States)

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S M Reza; Navidi, Fatemeh; Beard, Daniel A; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined. PMID:26229957

  9. Big Data: Astronomical or Genomical?

    Science.gov (United States)

    Stephens, Zachary D; Lee, Skylar Y; Faghri, Faraz; Campbell, Roy H; Zhai, Chengxiang; Efron, Miles J; Iyer, Ravishankar; Schatz, Michael C; Sinha, Saurabh; Robinson, Gene E

    2015-07-01

    Genomics is a Big Data science and is going to get much bigger, very soon, but it is not known whether the needs of genomics will exceed other Big Data domains. Projecting to the year 2025, we compared genomics with three other major generators of Big Data: astronomy, YouTube, and Twitter. Our estimates show that genomics is a "four-headed beast"--it is either on par with or the most demanding of the domains analyzed here in terms of data acquisition, storage, distribution, and analysis. We discuss aspects of new technologies that will need to be developed to rise up and meet the computational challenges that genomics poses for the near future. Now is the time for concerted, community-wide planning for the "genomical" challenges of the next decade. PMID:26151137
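The record's 2025 comparison rests on simple exponential projections of acquisition rates. A minimal sketch of that kind of projection (the baseline count and the doubling time below are illustrative assumptions, not figures quoted in the abstract):

```python
# Exponential projection of cumulative sequencing capacity to 2025.
# Baseline and doubling time are illustrative assumptions only.

def project(baseline, years, doubling_time_years):
    """Growth under repeated doubling: baseline * 2 ** (years / doubling_time)."""
    return baseline * 2 ** (years / doubling_time_years)

# Assume ~250,000 genomes sequenced by 2015 and a 12-month doubling time:
genomes_2025 = project(250_000, years=10, doubling_time_years=1.0)
print(f"Projected genomes by 2025: {genomes_2025:.2e}")  # 2.56e+08
```

Under these assumptions the count grows by three orders of magnitude in a decade, which is the kind of growth behind the record's concerns about acquisition, storage, distribution, and analysis.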

  10. Big Data: Astronomical or Genomical?

    Directory of Open Access Journals (Sweden)

    Zachary D Stephens

    2015-07-01

    Full Text Available Genomics is a Big Data science and is going to get much bigger, very soon, but it is not known whether the needs of genomics will exceed other Big Data domains. Projecting to the year 2025, we compared genomics with three other major generators of Big Data: astronomy, YouTube, and Twitter. Our estimates show that genomics is a "four-headed beast"--it is either on par with or the most demanding of the domains analyzed here in terms of data acquisition, storage, distribution, and analysis. We discuss aspects of new technologies that will need to be developed to rise up and meet the computational challenges that genomics poses for the near future. Now is the time for concerted, community-wide planning for the "genomical" challenges of the next decade.

  11. Historical Trauma, Substance Use, and Indigenous Peoples: Seven Generations of Harm From a "Big Event".

    Science.gov (United States)

    Nutton, Jennifer; Fast, Elizabeth

    2015-01-01

    Indigenous peoples the world over have and continue to experience the devastating effects of colonialism including loss of life, land, language, culture, and identity. Indigenous peoples suffer disproportionately across many health risk factors including an increased risk of substance use. We use the term "Big Event" to describe the historical trauma attributed to colonial policies as a potential pathway to explain the disparity in rates of substance use among many Indigenous populations. We present "Big Solutions" that have the potential to buffer the negative effects of the Big Event, including: (1) decolonizing strategies, (2) identity development, and (3) culturally adapted interventions. Study limitations are noted and future needed research is suggested. PMID:26158749

  12. Transcriptome characterization and polymorphism detection between subspecies of big sagebrush (Artemisia tridentata)

    OpenAIRE

    Cronn Richard C; Price Jared C; Richardson Bryce A; Bajgain Prabin; Udall Joshua A

    2011-01-01

Abstract Background Big sagebrush (Artemisia tridentata) is one of the most widely distributed and ecologically important shrub species in western North America. This species serves as a critical habitat and food resource for many animals and invertebrates. Habitat loss due to a combination of disturbances followed by establishment of invasive plant species is a serious threat to big sagebrush ecosystem sustainability. Lack of genomic data has limited our understanding of the evolutionary history...

  13. Big Book of Windows Hacks

    CERN Document Server

    Gralla, Preston

    2008-01-01

    Bigger, better, and broader in scope, the Big Book of Windows Hacks gives you everything you need to get the most out of your Windows Vista or XP system, including its related applications and the hardware it runs on or connects to. Whether you want to tweak Vista's Aero interface, build customized sidebar gadgets and run them from a USB key, or hack the "unhackable" screensavers, you'll find quick and ingenious ways to bend these recalcitrant operating systems to your will. The Big Book of Windows Hacks focuses on Vista, the new bad boy on Microsoft's block, with hacks and workarounds that

  14. Social customer relationship management and big data

    OpenAIRE

    Toivonen, Topi-Antti

    2015-01-01

This thesis examines social customer relationship management and the benefits that big data can bring to it. Social customer relationship management is a new term, unfamiliar to many. The study is motivated by the scarcity of research on the topic, the complete absence of Finnish-language research, and the potentially essential role social customer relationship management may play in companies' operations in the future. Studies of big data often concentrate on its technical side rather than its applica...

  15. AAPOR Report on Big Data

    OpenAIRE

    Task Force Members Include: Lilli Japec; Frauke Kreuter; Marcus Berg; Paul Biemer; Paul Decker; Cliff Lampe; Julia Lane; Cathy O'Neil; Abe Usher

    2015-01-01

In recent years we have seen an increase in the amount of statistics in society describing different phenomena based on so-called Big Data. As explained in the report, the term Big Data covers a variety of data, much of it characterized not just by its large volume, but also by its variety and velocity, the organic way in which it is created, and the new types of processes needed to analyze it and draw inferences from it. The change in the nature of these new types of data, their...

  16. Towards a big crunch dual

    Energy Technology Data Exchange (ETDEWEB)

    Hertog, Thomas E-mail: hertog@vulcan2.physics.ucsb.edu; Horowitz, Gary T

    2004-07-01

    We show there exist smooth asymptotically anti-de Sitter initial data which evolve to a big crunch singularity in a low energy supergravity limit of string theory. This opens up the possibility of using the dual conformal field theory to obtain a fully quantum description of the cosmological singularity. A preliminary study of this dual theory suggests that the big crunch is an endpoint of evolution even in the full string theory. We also show that any theory with scalar solitons must have negative energy solutions. The results presented here clarify our earlier work on cosmic censorship violation in N=8 supergravity. (author)

  17. [Big Data- challenges and risks].

    Science.gov (United States)

    Krauß, Manuela; Tóth, Tamás; Hanika, Heinrich; Kozlovszky, Miklós; Dinya, Elek

    2015-12-01

The term "Big Data" is commonly used to describe the rapidly growing mass of information being created today. New conclusions can be drawn and new services developed through the connection, processing, and analysis of this information. This affects all aspects of life, including health and medicine. The authors review the application areas of Big Data and present examples from health and other fields. However, effective use of these opportunities has several preconditions: proper infrastructure and a well-defined regulatory environment, with particular emphasis on data protection and privacy. These issues, and current actions toward solutions, are also presented. PMID:26614539

  18. The BigBOSS Experiment

    OpenAIRE

Schlegel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Prieto, C. Allende; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; Bebek, C.; Becerril, S.; Blanton, M.; Bolton, A.; Bromley, B.

    2011-01-01

    BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra...

  19. Release plan for Big Pete

    International Nuclear Information System (INIS)

    This release plan is to provide instructions for the Radiological Control Technician (RCT) to conduct surveys for the unconditional release of ''Big Pete,'' which was used in the removal of ''Spacers'' from the N-Reactor. Prior to performing surveys on the rear end portion of ''Big Pete,'' it shall be cleaned (i.e., free of oil, grease, caked soil, heavy dust). If no contamination is found, the vehicle may be released with the permission of the area RCT Supervisor. If contamination is found by any of the surveys, contact the cognizant Radiological Engineer for decontamination instructions

  20. Big society, big data. The radicalisation of the network society

    NARCIS (Netherlands)

    Frissen, V.

    2011-01-01

During the British election campaign of 2010, David Cameron produced the idea of the ‘Big Society’ as a cornerstone of his political agenda. At the core of the idea is a stronger civil society and local community coupled with a more withdrawn government. Although many commentators have dismissed this...

  1. A high CO2 -driven decrease in plant transpiration leads to perturbations in the hydrological cycle and may link terrestrial and marine loss of biodiversity: deep-time evidence.

    Science.gov (United States)

    Steinthorsdottir, Margret; Woodward, F. Ian; Surlyk, Finn; McElwain, Jennifer C.

    2013-04-01

CO2 is taken up through plant stomata, while water vapor is simultaneously transpired through the same pores, driving the water uptake of roots. Stomata are key elements of the Earth's hydrological cycle, since a large part of the evapotranspiration from the surface to the atmosphere takes place via stomatal pores. Plants exercise stomatal control by adjusting stomatal size and/or density in order to conserve water while maintaining carbon uptake for photosynthesis. A global decrease in stomatal density and/or size causes a decrease in transpiration and has the potential to increase global runoff. Here we show, from 91 fossil leaf cuticle specimens from the Triassic/Jurassic boundary transition (Tr-J) of East Greenland, that both stomatal size and density decreased dramatically during the Tr-J, coinciding with mass extinctions, major environmental upheaval and a negative C-isotope excursion. We estimate that these developmental and structural changes in stomata resulted in a 50-60% drop in stomatal and canopy transpiration, as calibrated using a stomatal model based on empirical measurements and adjusted for fossil plants. We additionally present new field evidence indicating a change to increased erosion and badland formation at the Tr-J. We hypothesize that plant physiological responses to high carbon dioxide concentrations at the Tr-J may have increased runoff at the local and perhaps even regional scale. Increased runoff may result in an increased flux of nutrients from land to oceans, leading to eutrophication, anoxia and ultimately loss of marine biodiversity. High-CO2-driven changes in stomatal and canopy transpiration therefore provide a possible mechanistic link between the terrestrial ecological crisis and the marine mass extinction at the Tr-J.
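The multiplicative effect of simultaneous changes in stomatal density and size can be illustrated with a toy proportionality. This is not the authors' calibrated stomatal model; it simply assumes transpiration scales with stomatal density times per-pore conductance, and the two factors below are hypothetical:

```python
# Toy illustration (NOT the authors' calibrated model): treat canopy
# transpiration as proportional to stomatal density times per-pore
# conductance, so fractional changes combine multiplicatively.

def relative_transpiration(density_factor, pore_factor):
    """Transpiration relative to baseline after scaling density and pore conductance."""
    return density_factor * pore_factor

# Hypothetical example: a 30% drop in density combined with a 40% drop in
# per-pore conductance yields a ~58% drop in transpiration.
drop = 1 - relative_transpiration(0.7, 0.6)
print(f"Transpiration reduced by {drop:.0%}")  # 58%
```

Under this kind of proportionality, moderate reductions in each factor compound into a drop of the magnitude (50-60%) reported in the record.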

  2. Early universe and big bang nucleosynthesis

    International Nuclear Information System (INIS)

    This is a series of six one-hour lectures tuned to the level of a graduate course covering basically the background required for understanding the phenomenon of the big bang nucleosynthesis. It begins with a brief introduction to the geometry, dynamics and thermodynamics of the universe as a whole, followed by one lecture on the discovery, properties and implications of the 3 K microwave background radiation. Then we move on to the thermodynamical properties of the early universe, effects of pair annihilation, the role of the weak interactions in creating a neutrino background and freezing the ratio of the available free neutrons to protons. In the fourth lecture, we describe the process of the big bang nucleosynthesis leading to the formation of deuterium, helium and lithium. The methods of the observational estimations of these primordial abundances are discussed in the fifth lecture, and finally in the sixth, their comparison with the predictions of the standard model and the inadequacy of the standard model, if any. It is in this respect that primordial nucleosynthesis provides a testing ground for one of the possible cosmological consequences of the quark-hadron phase transition in the early universe. (orig.)
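The central quantity in these lectures, the primordial helium abundance fixed by the frozen neutron-to-proton ratio, can be checked with one line of arithmetic. A sketch assuming the textbook value n/p of about 1/7 at the onset of nucleosynthesis and that essentially all free neutrons end up bound in helium-4:

```python
# Back-of-the-envelope primordial helium-4 mass fraction.
# Assumes essentially all free neutrons end up bound in 4He, with the
# textbook neutron-to-proton ratio n/p ~ 1/7 at the start of nucleosynthesis.

def helium_mass_fraction(n_over_p):
    """Y_p = 2(n/p) / (1 + n/p): mass in 4He over total baryon mass."""
    return 2 * n_over_p / (1 + n_over_p)

print(f"Y_p ~ {helium_mass_fraction(1/7):.2f}")  # 0.25
```

The value of roughly 0.25 is the prediction that observational abundance estimates, discussed in the fifth lecture, are compared against.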

  3. Do Big Bottles Kickstart Infant Weight Issues?

    Science.gov (United States)

(HealthDay News) -- Feeding babies formula from a big bottle might put them at higher risk for ...

  4. Big data and data science

    OpenAIRE

    Cavique, Luís

    2014-01-01

This article presents the basic concepts of Big Data and the new field it has given rise to, Data Science. Within Data Science, the notion of dimensionality reduction of data is discussed and illustrated with examples.

  5. Do big gods cause anything?

    DEFF Research Database (Denmark)

    Geertz, Armin W.

    2014-01-01

This is a contribution to a review symposium on Ara Norenzayan's book Big Gods: How Religion Transformed Cooperation and Conflict (Princeton University Press, 2013). The book is fascinating but problematic with respect to causality, atheism, and stereotypes about hunter-gatherers.

  6. China: Big Changes Coming Soon

    Science.gov (United States)

    Rowen, Henry S.

    2011-01-01

    Big changes are ahead for China, probably abrupt ones. The economy has grown so rapidly for many years, over 30 years at an average of nine percent a year, that its size makes it a major player in trade and finance and increasingly in political and military matters. This growth is not only of great importance internationally, it is already having…

  7. YOUNG CITY,BIG PARTY

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

The Shenzhen Universiade united the world's young people through sports. With none of the usual hoopla (no fireworks, no grand performances by celebrities and superstars), the Shenzhen Summer Universiade lowered the curtain on a big party for youth and college students on August 23.

  8. Characterizing and Subsetting Big Data Workloads

    OpenAIRE

    Jia, Zhen; Zhan, Jianfeng; Wang, Lei; Han, Rui; Mckee, Sally A.; Yang, Qiang; Luo, Chunjie; Li, Jingwei

    2014-01-01

    Big data benchmark suites must include a diversity of data and workloads to be useful in fairly evaluating big data systems and architectures. However, using truly comprehensive benchmarks poses great challenges for the architecture community. First, we need to thoroughly understand the behaviors of a variety of workloads. Second, our usual simulation-based research methods become prohibitively expensive for big data. As big data is an emerging field, more and more software stacks are being p...

  9. Big Graph Mining: Frameworks and Techniques

    OpenAIRE

    Aridhi, Sabeur; Nguifo, Engelbert Mephu

    2016-01-01

Big graph mining is an important research area that has attracted considerable attention. It makes it possible to process, analyze, and extract meaningful information from large amounts of graph data. Big graph mining has been motivated not only by the tremendously increasing size of graphs but also by its huge number of applications, including bioinformatics, chemoinformatics and social networks. One of the most challenging tasks in big graph mining is pattern mining in big graphs...

  10. The BigBoss Experiment

    Energy Technology Data Exchange (ETDEWEB)

Schlegel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; Bebek, C.; Becerril, S.; Blanton, M.; Bolton, A.; Bromley, B.; Cahn, R.; Carton, P.-H.; Cervantes-Cota, J.L.; Chu, Y.; Cortes, M.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna / /IAC, Mexico / / /Madrid, IFT /Marseille, Lab. Astrophys. / / /New York U. /Valencia U.

    2012-06-07

    BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra over a wavelength range from 340 nm to 1060 nm, with a resolution R = {lambda}/{Delta}{lambda} = 3000-4800. Using data from imaging surveys that are already underway, spectroscopic targets are selected that trace the underlying dark matter distribution. In particular, targets include luminous red galaxies (LRGs) up to z = 1.0, extending the BOSS LRG survey in both redshift and survey area. To probe the universe out to even higher redshift, BigBOSS will target bright [OII] emission line galaxies (ELGs) up to z = 1.7. In total, 20 million galaxy redshifts are obtained to measure the BAO feature, trace the matter power spectrum at smaller scales, and detect redshift space distortions. BigBOSS will provide additional constraints on early dark energy and on the curvature of the universe by measuring the Ly-alpha forest in the spectra of over 600,000 2.2 < z < 3.5 quasars. BigBOSS galaxy BAO measurements combined with an analysis of the broadband power, including the Ly-alpha forest in BigBOSS quasar spectra, achieves a FOM of 395 with Planck plus Stage III priors. This FOM is based on conservative assumptions for the analysis of broad band power (k{sub max} = 0.15), and could grow to over 600 if current work allows us to push the analysis to higher wave numbers (k{sub max} = 0.3). BigBOSS will also place constraints on theories of modified gravity and inflation, and will measure the sum of neutrino masses to 0.024 eV accuracy.
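The quoted resolving power R = λ/Δλ translates directly into the smallest wavelength interval the spectrograph can separate. A quick arithmetic check on the figures given in the record:

```python
# Wavelength resolution Delta-lambda = lambda / R for the BigBOSS band,
# using the wavelength range and resolving power quoted in the record.

def delta_lambda(wavelength_nm, resolving_power):
    """Smallest resolvable wavelength interval at a given wavelength."""
    return wavelength_nm / resolving_power

# Blue end of the band at the low end of the quoted resolution...
print(f"{delta_lambda(340, 3000):.3f} nm")   # 0.113 nm
# ...and the red end at the high end of the quoted resolution.
print(f"{delta_lambda(1060, 4800):.3f} nm")  # 0.221 nm
```

So across the 340-1060 nm band the instrument resolves features of roughly a tenth to a fifth of a nanometer, which is what makes BAO and Ly-alpha forest measurements from these spectra feasible.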

  11. Judging Big Deals: Challenges, Outcomes, and Advice

    Science.gov (United States)

    Glasser, Sarah

    2013-01-01

    This article reports the results of an analysis of five Big Deal electronic journal packages to which Hofstra University's Axinn Library subscribes. COUNTER usage reports were used to judge the value of each Big Deal. Limitations of usage statistics are also discussed. In the end, the author concludes that four of the five Big Deals are good…

  12. A SWOT Analysis of Big Data

    Science.gov (United States)

    Ahmadi, Mohammad; Dileepan, Parthasarati; Wheatley, Kathleen K.

    2016-01-01

    This is the decade of data analytics and big data, but not everyone agrees with the definition of big data. Some researchers see it as the future of data analysis, while others consider it as hype and foresee its demise in the near future. No matter how it is defined, big data for the time being is having its glory moment. The most important…

  13. "Big Data" - Grosse Daten, viel Wissen?

    OpenAIRE

    Hothorn, Torsten

    2015-01-01

For some years now, the term Big Data has described technologies for extracting knowledge from data. Applications of Big Data and their consequences are also increasingly discussed in the mass media. Because medicine is an empirical science, we discuss the meaning of Big Data and its potential for future medical research.

  14. The BigBoss Experiment

    International Nuclear Information System (INIS)

BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra over a wavelength range from 340 nm to 1060 nm, with a resolution R = λ/Δλ = 3000-4800. Using data from imaging surveys that are already underway, spectroscopic targets are selected that trace the underlying dark matter distribution. In particular, targets include luminous red galaxies (LRGs) up to z = 1.0, extending the BOSS LRG survey in both redshift and survey area. To probe the universe out to even higher redshift, BigBOSS will target bright [OII] emission line galaxies (ELGs) up to z = 1.7. In total, 20 million galaxy redshifts are obtained to measure the BAO feature, trace the matter power spectrum at smaller scales, and detect redshift space distortions. BigBOSS will provide additional constraints on early dark energy and on the curvature of the universe by measuring the Ly-alpha forest in the spectra of over 600,000 2.2 < z < 3.5 quasars. BigBOSS galaxy BAO measurements combined with an analysis of the broadband power, including the Ly-alpha forest in BigBOSS quasar spectra, achieves a FOM of 395 with Planck plus Stage III priors. This FOM is based on conservative assumptions for the analysis of broad band power (kmax = 0.15), and could grow to over 600 if current work allows us to push the analysis to higher wave numbers (kmax = 0.3). BigBOSS will also place constraints on theories of modified gravity and inflation, and will measure the sum of neutrino masses to 0.024 eV accuracy.

  15. Lead poisoning

    Science.gov (United States)

    ... free solder, lead is still found in some modern faucets. Soil contaminated by decades of car exhaust ... NOT store wine, spirits, or vinegar-based salad dressings in lead crystal decanters for long periods of ...

  16. Lead Toxicity

    Science.gov (United States)

    ... in children over time may lead to reduced IQ, slow learning, Attention Deficit Hyperactivity Disorder (ADHD), or ... avoid exposure to soil. Is there a medical test for lead exposure? • Blood samples can be tested ...

  17. Relational Leading

    DEFF Research Database (Denmark)

    Larsen, Mette Vinther; Rasmussen, Jørgen Gulddahl

    2015-01-01

This first chapter presents the exploratory and curious approach to leading as relational processes – an approach that pervades the entire book. We explore leading from a perspective that emphasises the unpredictable challenges and triviality of everyday life, which we consider an interesting, relevant and realistic way to examine leading. The chapter brings up a number of concepts and contexts as formulated by researchers within the field, and in this way seeks to construct a first understanding of relational leading.

  18. Global Fluctuation Spectra in Big Crunch/Big Bang String Vacua

    OpenAIRE

    Craps, Ben; Ovrut, Burt A.

    2003-01-01

    We study Big Crunch/Big Bang cosmologies that correspond to exact world-sheet superconformal field theories of type II strings. The string theory spacetime contains a Big Crunch and a Big Bang cosmology, as well as additional ``whisker'' asymptotic and intermediate regions. Within the context of free string theory, we compute, unambiguously, the scalar fluctuation spectrum in all regions of spacetime. Generically, the Big Crunch fluctuation spectrum is altered while passing through the bounce...

  19. Hidden loss

    DEFF Research Database (Denmark)

    Kieffer-Kristensen, Rikke; Johansen, Karen Lise Gaardsvig

    2013-01-01

finding indicates that the children experienced numerous losses, many of which were often suppressed or neglected by the children to protect the ill parents. CONCLUSIONS: The findings indicated that the children seemed to make a special effort to hide their feelings of loss and grief in order to protect the ill parents.

  20. Big Bang Day : The Great Big Particle Adventure - 3. Origins

    CERN Multimedia

    2008-01-01

    In this series, comedian and physicist Ben Miller asks the CERN scientists what they hope to find. If the LHC is successful, it will explain the nature of the Universe around us in terms of a few simple ingredients and a few simple rules. But the Universe now was forged in a Big Bang where conditions were very different, and the rules were very different, and those early moments were crucial to determining how things turned out later. At the LHC they can recreate conditions as they were billionths of a second after the Big Bang, before atoms and nuclei existed. They can find out why matter and antimatter didn't mutually annihilate each other to leave behind a Universe of pure, brilliant light. And they can look into the very structure of space and time - the fabric of the Universe

  1. Perspectives on Big Data and Big Data Analytics

    Directory of Open Access Journals (Sweden)

    Elena Geanina ULARU

    2012-12-01

Full Text Available Nowadays companies are starting to realize the importance of using more data in order to support decisions about their strategies. It has been said, and proved through case studies, that “more data usually beats better algorithms”. With this in mind, companies have started to realize that they can choose to invest in processing larger sets of data rather than in expensive algorithms. A large quantity of data is better used as a whole because of the possible correlations across the larger amount, correlations that can never be found if the data is analyzed in separate or smaller sets. A larger amount of data gives better output, but working with it can become a challenge due to processing limitations. This article intends to define the concept of Big Data and stress the importance of Big Data Analytics.

  2. Antigravity and the big crunch/big bang transition

    International Nuclear Information System (INIS)

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  3. Solution of a braneworld big crunch/big bang cosmology

    International Nuclear Information System (INIS)

    We solve for the cosmological perturbations in a five-dimensional background consisting of two separating or colliding boundary branes, as an expansion in the collision speed V divided by the speed of light c. Our solution permits a detailed check of the validity of four-dimensional effective theory in the vicinity of the event corresponding to the big crunch/big bang singularity. We show that the four-dimensional description fails at the first nontrivial order in (V/c)2. At this order, there is nontrivial mixing of the two relevant four-dimensional perturbation modes (the growing and decaying modes) as the boundary branes move from the narrowly separated limit described by Kaluza-Klein theory to the well-separated limit where gravity is confined to the positive-tension brane. We comment on the cosmological significance of the result and compute other quantities of interest in five-dimensional cosmological scenarios

  4. Antigravity and the big crunch/big bang transition

    Energy Technology Data Exchange (ETDEWEB)

    Bars, Itzhak [Department of Physics and Astronomy, University of Southern California, Los Angeles, CA 90089-2535 (United States); Chen, Shih-Hung [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada); Department of Physics and School of Earth and Space Exploration, Arizona State University, Tempe, AZ 85287-1404 (United States); Steinhardt, Paul J., E-mail: steinh@princeton.edu [Department of Physics and Princeton Center for Theoretical Physics, Princeton University, Princeton, NJ 08544 (United States); Turok, Neil [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada)

    2012-08-29

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  5. Antigravity and the big crunch/big bang transition

    Science.gov (United States)

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.; Turok, Neil

    2012-08-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  6. Antigravity and the big crunch/big bang transition

    CERN Document Server

    Bars, Itzhak; Steinhardt, Paul J; Turok, Neil

    2011-01-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  7. Small risk, big price

    International Nuclear Information System (INIS)

A conference held in the United Kingdom on the harmful effects of low frequency electromagnetic (EM) fields, such as those emitted by powerlines, is reported. It was sponsored by solicitors acting on behalf of families taking legal action on the issue of power lines and health risks, and the delegates ranged from leading cancer specialists to campaigning groups. The view of the National Grid Company was expressed that, since no cause-and-effect relationship has been established, it would be premature to take astronomically expensive measures to shield substations and house underground cables in steel pipes in order to achieve the very low field levels acceptable to campaign groups. The possibility of a cancer link with exposure to EM fields could not be ruled out, however. On behalf of one of the pressure groups it was argued that, faced with a suspected hazard for which there is statistically significant evidence of association but incomplete evidence of cause, the electricity companies should take some positive action. In the view of an epidemiologist, the evidence is sufficiently unclear as to allow people to arrive at differing conclusions, and he called for a policy response which was something less than panic but something greater than negligence. A solicitor's view was that some form of self-regulation, adopted before conclusive proof either way is found, would ease public concern, and that any such code of practice should specify that new lines be placed at least 50 m from houses. (UK)

  8. Web Science Big Wins: Information Big Bang & Fundamental Constants

    OpenAIRE

    Carr, Les

    2010-01-01

    We take for granted a Web that provides free and unrestricted information exchange, but the Web is under pressure to change in order to respond to issues of security, commerce, criminality, privacy. Web Science needs to explain how the Web impacts society and predict the outcomes of proposed changes to Web infrastructure on business and society. Using the analogy of the Big Bang, this presentation describes how the Web spread the conditions of its initial creation throughout the whole of soci...

  9. Big Data – Big Deal for Organization Design?

    OpenAIRE

    Janne J. Korhonen

    2014-01-01

    Analytics is an increasingly important source of competitive advantage. It has even been posited that big data will be the next strategic emphasis of organizations and that analytics capability will be manifested in organizational structure. In this article, I explore how analytics capability might be reflected in organizational structure using the notion of  “requisite organization” developed by Jaques (1998). Requisite organization argues that a new strategic emphasis requires the addition ...

  10. Nástroje pro Big Data Analytics

    OpenAIRE

    Miloš, Marek

    2013-01-01

    The thesis covers the term for specific data analysis called Big Data. The thesis firstly defines the term Big Data and the need for its creation because of the rising need for deeper data processing and analysis tools and methods. The thesis also covers some of the technical aspects of Big Data tools, focusing on Apache Hadoop in detail. The later chapters contain Big Data market analysis and describe the biggest Big Data competitors and tools. The practical part of the thesis presents a way...

  11. ISSUES, CHALLENGES, AND SOLUTIONS: BIG DATA MINING

    Directory of Open Access Journals (Sweden)

    Jaseena K.U.

    2014-12-01

    Full Text Available Data has become an indispensable part of every economy, industry, organization, business function and individual. Big Data is a term used to describe datasets whose size is beyond the ability of typical database software tools to store, manage and analyze. Big Data introduces unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation and measurement errors. These challenges are distinctive and require new computational and statistical paradigms. This paper presents a literature review of Big Data mining and its issues and challenges, with emphasis on the distinguishing features of Big Data. It also discusses some methods to deal with big data.
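    The spurious-correlation challenge mentioned in this abstract is easy to demonstrate: with many purely random features and few samples, some feature will correlate strongly with the target by chance alone. A minimal sketch (the sample and feature counts are arbitrary illustrative choices):

    ```python
    import numpy as np

    # Sketch of the "spurious correlation" challenge: among many purely random
    # features, some correlate strongly with the target by chance alone.
    rng = np.random.default_rng(0)
    n_samples, n_features = 50, 2000
    X = rng.normal(size=(n_samples, n_features))   # noise features
    y = rng.normal(size=n_samples)                 # noise target

    # Pearson correlation of every feature with the target
    corr = (X - X.mean(0)).T @ (y - y.mean()) / (n_samples * X.std(0) * y.std())
    max_spurious = float(np.abs(corr).max())       # large despite pure noise
    ```

    With 2000 noise features and only 50 samples, the strongest chance correlation is typically above 0.4, which is exactly why naive feature screening on big, wide data is unreliable.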

  12. Big data is not a monolith

    CERN Document Server

    Sugimoto, Cassidy R; Mattioli, Michael

    2016-01-01

    Big data is ubiquitous but heterogeneous. Big data can be used to tally clicks and traffic on web pages, find patterns in stock trades, track consumer preferences, identify linguistic correlations in large corpuses of texts. This book examines big data not as an undifferentiated whole but contextually, investigating the varied challenges posed by big data for health, science, law, commerce, and politics. Taken together, the chapters reveal a complex set of problems, practices, and policies. The advent of big data methodologies has challenged the theory-driven approach to scientific knowledge in favor of a data-driven one. Social media platforms and self-tracking tools change the way we see ourselves and others. The collection of data by corporations and government threatens privacy while promoting transparency. Meanwhile, politicians, policy makers, and ethicists are ill-prepared to deal with big data's ramifications. The contributors look at big data's effect on individuals as it exerts social control throu...

  13. Big Numbers in String Theory

    CERN Document Server

    Schellekens, A N

    2016-01-01

    This paper contains some personal reflections on several computational contributions to what is now known as the "String Theory Landscape". It consists of two parts. The first part concerns the origin of big numbers, and especially the number $10^{1500}$ that appeared in work on the covariant lattice construction (with W. Lerche and D. Luest). This part contains some new results. I correct a huge but inconsequential error, discuss some more accurate estimates, and compare with the counting for free fermion constructions. In particular I prove that the latter only provide an exponentially small fraction of all even self-dual lattices for large lattice dimensions. The second part of the paper concerns dealing with big numbers, and contains some lessons learned from various vacuum scanning projects.

  14. George and the big bang

    CERN Document Server

    Hawking, Lucy; Parsons, Gary

    2012-01-01

    George has problems. He has twin baby sisters at home who demand his parents’ attention. His beloved pig Freddy has been exiled to a farm, where he’s miserable. And worst of all, his best friend, Annie, has made a new friend whom she seems to like more than George. So George jumps at the chance to help Eric with his plans to run a big experiment in Switzerland that seeks to explore the earliest moment of the universe. But there is a conspiracy afoot, and a group of evildoers is planning to sabotage the experiment. Can George repair his friendship with Annie and piece together the clues before Eric’s experiment is destroyed forever? This engaging adventure features essays by Professor Stephen Hawking and other eminent physicists about the origins of the universe and ends with a twenty-page graphic novel that explains how the Big Bang happened—in reverse!

  15. The big wheels of ATLAS

    CERN Multimedia

    2006-01-01

    The ATLAS cavern is filling up at an impressive rate. The installation of the first of the big wheels of the muon spectrometer, a thin gap chamber (TGC) wheel, was completed in September. The muon spectrometer will include four big moving wheels at each end, each measuring 25 metres in diameter. Of the eight wheels in total, six will be composed of thin gap chambers for the muon trigger system and the other two will consist of monitored drift tubes (MDTs) to measure the position of the muons (see Bulletin No. 13/2006). The installation of the 688 muon chambers in the barrel is progressing well, with three-quarters of them already installed between the coils of the toroid magnet.

  16. Big data and ophthalmic research.

    Science.gov (United States)

    Clark, Antony; Ng, Jonathon Q; Morlet, Nigel; Semmens, James B

    2016-01-01

    Large population-based health administrative databases, clinical registries, and data linkage systems are a rapidly expanding resource for health research. Ophthalmic research has benefited from the use of these databases in expanding the breadth of knowledge in areas such as disease surveillance, disease etiology, health services utilization, and health outcomes. Furthermore, the quantity of data available for research has increased exponentially in recent times, particularly as e-health initiatives come online in health systems across the globe. We review some big data concepts, the databases and data linkage systems used in eye research-including their advantages and limitations, the types of studies previously undertaken, and the future direction for big data in eye research. PMID:26844660

  17. Big Bounce in Dipole Cosmology

    OpenAIRE

    Battisti, Marco Valerio; Marciano, Antonino

    2010-01-01

    We derive the cosmological Big Bounce scenario from the dipole approximation of Loop Quantum Gravity. We show that a non-singular evolution takes place for any matter field and that, by considering a massless scalar field as a relational clock for the dynamics, the semi-classical properties of an initial state are preserved on the other side of the bounce. This model thus enhances the relation between Loop Quantum Cosmology and the full theory.

  18. BIG Data – A Review.

    OpenAIRE

    Anuradha Bhatia; Gaurav Vaswani

    2013-01-01

    As more data becomes available from an abundance of sources both within and outside, organizations are seeking to use those abundant resources to increase innovation, retain customers, and increase operational efficiency. At the same time, organizations are challenged by their end users, who are demanding greater capability and integration to mine and analyze burgeoning new sources of information. Big Data provides opportunities for business users to ask questions they never were able to ask ...

  19. Big data processing with Hadoop

    OpenAIRE

    Wu, Shiqi

    2015-01-01

    Computing technology has changed the way we work, study, and live. The distributed data processing technology is one of the popular topics in the IT field. It provides a simple and centralized computing platform by reducing the cost of the hardware. The characteristics of distributed data processing technology have changed the whole industry. Hadoop, as the open source project of Apache foundation, is the most representative platform of distributed big data processing. The Hadoop distribu...
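    The canonical Hadoop example is a MapReduce word count. The sketch below imitates the Hadoop-streaming mapper/reducer contract locally, with `sorted` standing in for the shuffle/sort phase (a toy illustration, not tied to this thesis itself):

    ```python
    from itertools import groupby

    # Hadoop-streaming-style word count: the mapper emits key<TAB>value lines and
    # the reducer consumes them grouped by key. Here both are wired together
    # locally instead of being run via `hadoop jar`.
    def mapper(lines):
        for line in lines:
            for word in line.split():
                yield f"{word}\t1"

    def reducer(sorted_pairs):
        keyed = (pair.split("\t") for pair in sorted_pairs)
        for word, group in groupby(keyed, key=lambda kv: kv[0]):
            yield word, sum(int(count) for _, count in group)

    # `sorted` plays the role of Hadoop's shuffle/sort phase
    lines = ["big data big compute", "big wins"]
    counts = dict(reducer(sorted(mapper(lines))))
    # counts == {"big": 3, "compute": 1, "data": 1, "wins": 1}
    ```

    On a real cluster the same mapper and reducer run on different machines, which is the point of the distributed processing model the abstract describes.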

  20. Big Bang Nucleosynthesis: An Update

    OpenAIRE

    Olive, Keith A.; Scully, Sean T.

    1995-01-01

    The current status of big bang nucleosynthesis is reviewed with an emphasis on the comparison between the observational determination of the light element abundances of D, 3He, 4He and 7Li and the predictions from theory. In particular, we present new analyses for 4He and 7Li. Implications for physics beyond the standard model are also discussed. Limits on the effective number of neutrino flavors are also updated.

  1. Industrialization and the Big Push

    OpenAIRE

    1988-01-01

    This paper explores Rosenstein-Rodman's (1943) idea that simultaneous industrialization of many sectors of the economy can be profitable for all of them, even when no sector can break even industrializing alone. We analyze this idea in the context of an imperfectly competitive economy with aggregate demand spillovers, and interpret the big push into industrialization as a move from a bad to a good equilibrium. We show that for two equilibria to exist, it must be the case that an industrializi...

  2. Pragmatic Interaction between Big Powers

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    It is very difficult to summarize the relationship among big powers in 2004. Looking west, there existed a "small cold war", as some media named it, between Europe and Russia and between the United States and Russia; with regard to the "orange revolution" in Ukraine at the end of the year, a rival show was displayed between America, Europe and Russia. Looking east, a fresh scent seems to fill the air.

  3. Research on Information Analysis and Data Mining in the Age of Big Data: Analysis of Customer Loss in Telecom

    Institute of Scientific and Technical Information of China (English)

    王晓佳; 杨善林; 陈志强

    2013-01-01

    Information in the age of big data is characterized by large scale, high complexity and fast update, so mining the intelligence that users need from such data is far harder than before. To seize opportunities and gain competitive advantage in the age of big data, the original approach to information analysis must be upgraded to match the intelligence attributes of the data. After introducing big data and the reasons why the meaning of intelligence is changing in a big data environment, this paper proposes a modeling mechanism for information analysis and mining under big data: first, a conceptual model of intelligence task decomposition is built with MapReduce; then each decomposed single-task data table is preprocessed and mined, with mathematical models, artificial intelligence and other methods used to construct new approaches to information analysis and data mining in the age of big data. Finally, a simulation experiment verifies the feasibility and rationality of this approach.
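    The task-decomposition step described above can be sketched as a local MapReduce-style pipeline: map emits key/value pairs per record, shuffle groups them by key, and reduce mines each single-task table independently. The field names and the churn-rate reducer below are hypothetical illustrations, not the paper's actual model:

    ```python
    from collections import defaultdict

    # Hypothetical telecom records; "churned" flags a lost customer.
    records = [
        {"region": "east", "churned": 1},
        {"region": "east", "churned": 0},
        {"region": "west", "churned": 1},
    ]

    def map_phase(recs):
        # map: emit (key, value) pairs, one per record
        for r in recs:
            yield r["region"], r["churned"]

    def shuffle(pairs):
        # shuffle: group values by key, producing one "single task" per key
        groups = defaultdict(list)
        for k, v in pairs:
            groups[k].append(v)
        return groups

    def reduce_phase(groups):
        # reduce: mine each single-task table (here, a per-region churn rate)
        return {k: sum(v) / len(v) for k, v in groups.items()}

    churn_rate = reduce_phase(shuffle(map_phase(records)))
    # churn_rate == {"east": 0.5, "west": 1.0}
    ```

    Each reduce task sees only its own key's data, which is what allows the per-task preprocessing and mining to be parallelized across a cluster.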

  4. The BigBOSS Experiment

    CERN Document Server

    Schlegel, D; Abraham, T; Ahn, C; Prieto, C Allende; Annis, J; Aubourg, E; Azzaro, M; Baltay, S Bailey C; Baugh, C; Bebek, C; Becerril, S; Blanton, M; Bolton, A; Bromley, B; Cahn, R; Carton, P -H; Cervantes-Cota, J L; Chu, Y; Cortes, M; Dawson, K; Dey, A; Dickinson, M; Diehl, H T; Doel, P; Ealet, A; Edelstein, J; Eppelle, D; Escoffier, S; Evrard, A; Faccioli, L; Frenk, C; Geha, M; Gerdes, D; Gondolo, P; Gonzalez-Arroyo, A; Grossan, B; Heckman, T; Heetderks, H; Ho, S; Honscheid, K; Huterer, D; Ilbert, O; Ivans, I; Jelinsky, P; Jing, Y; Joyce, D; Kennedy, R; Kent, S; Kieda, D; Kim, A; Kim, C; Kneib, J -P; Kong, X; Kosowsky, A; Krishnan, K; Lahav, O; Lampton, M; LeBohec, S; Brun, V Le; Levi, M; Li, C; Liang, M; Lim, H; Lin, W; Linder, E; Lorenzon, W; de la Macorra, A; Magneville, Ch; Malina, R; Marinoni, C; Martinez, V; Majewski, S; Matheson, T; McCloskey, R; McDonald, P; McKay, T; McMahon, J; Menard, B; Miralda-Escude, J; Modjaz, M; Montero-Dorta, A; Morales, I; Mostek, N; Newman, J; Nichol, R; Nugent, P; Olsen, K; Padmanabhan, N; Palanque-Delabrouille, N; Park, I; Peacock, J; Percival, W; Perlmutter, S; Peroux, C; Petitjean, P; Prada, F; Prieto, E; Prochaska, J; Reil, K; Rockosi, C; Roe, N; Rollinde, E; Roodman, A; Ross, N; Rudnick, G; Ruhlmann-Kleider, V; Sanchez, J; Sawyer, D; Schimd, C; Schubnell, M; Scoccimaro, R; Seljak, U; Seo, H; Sheldon, E; Sholl, M; Shulte-Ladbeck, R; Slosar, A; Smith, D S; Smoot, G; Springer, W; Stril, A; Szalay, A S; Tao, C; Tarle, G; Taylor, E; Tilquin, A; Tinker, J; Valdes, F; Wang, J; Wang, T; Weaver, B A; Weinberg, D; White, M; Wood-Vasey, M; Yang, J; Yeche, X Yang Ch; Zakamska, N; Zentner, A; Zhai, C; Zhang, P

    2011-01-01

    BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra over a wavelength range from 340 nm to 1060 nm, with a resolution R = 3000-4800. Using data from imaging surveys that are already underway, spectroscopic targets are selected that trace the underlying dark matter distribution. In particular, targets include luminous red galaxies (LRGs) up to z = 1.0, extending the BOSS LRG survey in both redshift and survey area. To probe the universe out to even higher redshift, BigBOSS will target bright [OII] emission line galaxies (ELGs) up to z = 1.7. In total, 20 million galaxy red...

  5. Big data: the management revolution.

    Science.gov (United States)

    McAfee, Andrew; Brynjolfsson, Erik

    2012-10-01

    Big data, the authors write, is far more powerful than the analytics of the past. Executives can measure and therefore manage more precisely than ever before. They can make better predictions and smarter decisions. They can target more-effective interventions in areas that so far have been dominated by gut and intuition rather than by data and rigor. The differences between big data and analytics are a matter of volume, velocity, and variety: More data now cross the internet every second than were stored in the entire internet 20 years ago. Nearly real-time information makes it possible for a company to be much more agile than its competitors. And that information can come from social networks, images, sensors, the web, or other unstructured sources. The managerial challenges, however, are very real. Senior decision makers have to learn to ask the right questions and embrace evidence-based decision making. Organizations must hire scientists who can find patterns in very large data sets and translate them into useful business information. IT departments have to work hard to integrate all the relevant internal and external sources of data. The authors offer two success stories to illustrate how companies are using big data: PASSUR Aerospace enables airlines to match their actual and estimated arrival times. Sears Holdings directly analyzes its incoming store data to make promotions much more precise and faster. PMID:23074865

  6. Memory loss

    Science.gov (United States)

    A person with memory loss needs a lot of support. It helps to show the person familiar objects and photos, or to play familiar music. Write down when the person should take any medicine or do other ...

  7. Big Data Analytics Platforms analyze from startups to traditional database players

    OpenAIRE

    Ionut TARANU

    2015-01-01

    Big data analytics enables organizations to analyze a mix of structured, semi-structured and unstructured data in search of valuable business information and insights. The analytical findings can lead to more effective marketing, new revenue opportunities, better customer service, improved operational efficiency, competitive advantages over rival organizations and other business benefits. With so many emerging trends around big data and analytics, IT organizations need to create conditions th...

  8. Performance of the lift-pump with the lead-bismuth cooled fast reactor. Experimental study on bubble distribution and circulation flow rate

    International Nuclear Information System (INIS)

    Recently, the use of a lift pump has been examined for a small reactor cooled by lead-bismuth eutectic. Experiments on void behavior and lift-pump performance in three risers (1124 mm in height, with inside diameters of 69.3 mm, 106.3 mm and 155.2 mm) were performed using lead-bismuth eutectic. The main results are as follows: (1) The local void fraction varies across the horizontal plane in the large-diameter riser. (2) The lead-bismuth circulating flow rate evaluated by the present design method is lower than that measured in the medium- and small-diameter risers. The method can still serve as a rough evaluation for these cases, considering the accuracy with which the pressure loss of the test section is evaluated in the calculation. (3) In the large-diameter riser, the present design method overestimates the lead-bismuth circulating flow rate. The expected circulation head does not develop there because the void rises unevenly in the horizontal plane, whereas the design method is a one-dimensional model. A separator that divides the riser into flow paths of about 10 cm diameter, with the void fed uniformly to each path, would help obtain the appropriate circulation head. (author)

  9. Big Bang–Big Crunch Optimization Algorithm for Linear Phase Fir Digital Filter Design

    OpenAIRE

    Ms. Rashmi Singh; Dr. H. K. Verma

    2012-01-01

    The Big Bang–Big Crunch (BB–BC) optimization algorithm is a new optimization method that relies on the Big Bang and Big Crunch theory, one of the theories of the evolution of the universe. In this paper, a Big Bang–Big Crunch algorithm is used for the design of linear phase finite impulse response (FIR) filters. The fitness function is based on the mean squared error between the actual and the ideal filter response. This paper presents the plot of magnitude response ...
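    For readers unfamiliar with BB-BC, the core loop alternates a "Big Bang" scatter of candidates around the current center with a "Big Crunch" contraction to a fitness-weighted center of mass. A minimal sketch, minimizing a simple sphere function rather than the paper's FIR mean-squared-error objective (population size, iteration count and the 1/(k+1) shrink schedule are illustrative choices):

    ```python
    import numpy as np

    def big_bang_big_crunch(fitness, dim, bounds, pop=50, iters=100, seed=0):
        """Minimize `fitness` with the Big Bang-Big Crunch heuristic (toy sketch)."""
        rng = np.random.default_rng(seed)
        lo, hi = bounds
        center = rng.uniform(lo, hi, dim)          # initial "center of mass"
        for k in range(iters):
            # Big Bang: scatter candidates around the center, shrinking over time
            spread = (hi - lo) / (k + 1)
            cand = np.clip(center + rng.normal(0.0, spread, (pop, dim)), lo, hi)
            # Big Crunch: contract to the fitness-weighted center of mass
            f = np.array([fitness(x) for x in cand])
            w = 1.0 / (f + 1e-12)                  # lower fitness -> higher weight
            center = (w[:, None] * cand).sum(axis=0) / w.sum()
        return center

    # sphere function: the global minimum is at the origin
    best = big_bang_big_crunch(lambda x: float(np.sum(x ** 2)), dim=3,
                               bounds=(-5.0, 5.0))
    ```

    For the FIR design in the paper, `fitness` would instead evaluate the mean squared error between the candidate filter's frequency response and the ideal response.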

  10. Lead poisoning

    Science.gov (United States)

    ... help if this information is not immediately available. Poison Control If someone has severe symptoms from possible ... be caused by lead poisoning, call your local poison control center. Your local poison center can be ...

  11. Lead Test

    Science.gov (United States)

    ... Other potential lead sources include imported foods, candy, cosmetics, costume jewelry, brass keys, and toys or household ... Health Professionals ©2001 - by American Association for Clinical Chemistry • Contact Us | Terms of Use | Privacy We comply ...

  12. Lead Poisoning

    Science.gov (United States)

    ... has also been associated with juvenile delinquency and criminal behavior. In adults, lead can increase blood pressure ... and-forth manner, but rather from left to right (or vice versa), or from the top of ...

  13. Acute effect of weight loss on levels of total bilirubin in obese, cardiovascular high-risk patients: an analysis from the lead-in period of the Sibutramine Cardiovascular Outcome trial

    DEFF Research Database (Denmark)

    Andersson, Charlotte; Weeke, Peter; Fosbøl, Emil Loldrup;

    2009-01-01

    Low levels of bilirubin are associated with an increased risk of cardiovascular adverse events. Weight reduction is known to reduce several cardiovascular risk factors, but effects on bilirubin levels have not been reported. We studied the response of weight loss therapy with sibutramine and life...

  14. Bmal1 and Beta cell clock are required for adaptation to circadian disruption, and their loss of function leads to oxidative stress-induced Beta cell failure in mice

    Science.gov (United States)

    Circadian disruption has deleterious effects on metabolism. Global deletion of Bmal1, a core clock gene, results in Beta cell dysfunction and diabetes. However, it is unknown if this is due to loss of cell-autonomous function of Bmal1 in Beta cells. To address this, we generated mice with Beta cell ...

  15. An Overview of Big Data Privacy Issues

    OpenAIRE

    Patrick Hung

    2013-01-01

    Big data is the term for a collection of large and complex datasets from different sources that is difficult to process using traditional data management and processing applications. In these datasets, some information must be kept secret from others. On the other hand, some information has to be released for acquainting information or big data analytical services. The research challenge is how to protect the private information in the context of big data. Privacy is described by the ability ...

  16. Social Big Data and Privacy Awareness

    OpenAIRE

    Sang, Lin

    2015-01-01

    Based on the rapid development of Big Data, data from online social networks has become a major part of it. Big data has made social networks data-oriented rather than social-oriented. Taking this into account, this dissertation presents a qualitative study of how the data-oriented social network affects its users' privacy management today. Within this dissertation, an overview of Big Data and privacy issues on social networks is presented as a background study. ...

  17. EAARL-B Topography-Big Thicket National Preserve: Big Sandy Creek Corridor Unit, Texas, 2014

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — A first-surface topography Digital Elevation Model (DEM) mosaic for the Big Sandy Creek Corridor Unit of Big Thicket National Preserve in Texas was produced from...

  18. EAARL-B Topography-Big Thicket National Preserve: Big Sandy Creek Unit, Texas, 2014

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — A bare-earth topography digital elevation model (DEM) mosaic for the Big Sandy Creek Unit of Big Thicket National Preserve in Texas, was produced from remotely...

  19. EAARL-B Topography-Big Thicket National Preserve: Big Sandy Creek Unit, Texas, 2014

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — A first-surface topography digital elevation model (DEM) mosaic for the Big Sandy Creek Unit of Big Thicket National Preserve in Texas, was produced from remotely...

  20. EAARL-B Topography-Big Thicket National Preserve: Big Sandy Creek Corridor Unit, Texas, 2014

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — A bare-earth topography Digital Elevation Model (DEM) mosaic for the Big Sandy Creek Corridor Unit of Big Thicket National Preserve in Texas was produced from...

  1. Big Data in Health: a Literature Review from the Year 2005.

    Science.gov (United States)

    de la Torre Díez, Isabel; Cosgaya, Héctor Merino; Garcia-Zapirain, Begoña; López-Coronado, Miguel

    2016-09-01

    The information stored in healthcare systems has increased over the last ten years, leading it to be considered Big Data. There is a wealth of health information ready to be analysed. However, the sheer volume raises a challenge for traditional methods. The aim of this article is to conduct a cutting-edge study on Big Data in healthcare from 2005 to the present. This literature review will help researchers to know how Big Data has developed in the health industry and open up new avenues for research. Information searches have been made on various scientific databases such as Pubmed, Science Direct, Scopus and Web of Science for Big Data in healthcare. The search criteria were "Big Data" and "health" with a date range from 2005 to the present. A total of 9724 articles were found on the databases. 9515 articles were discarded as duplicates or for not having a title of interest to the study. 209 articles were read, with the resulting decision that 46 were useful for this study. 52.6 % of the articles used were found in Science Direct, 23.7 % in Pubmed, 22.1 % through Scopus and the remaining 2.6 % through the Web of Science. Big Data has undergone extremely high growth since 2011 and its use is becoming compulsory in developed nations and in an increasing number of developing nations. Big Data is a step forward and a cost reducer for public and private healthcare. PMID:27520614

  2. Fitting ERGMs on big networks.

    Science.gov (United States)

    An, Weihua

    2016-09-01

    The exponential random graph model (ERGM) has become a valuable tool for modeling social networks. In particular, ERGM provides great flexibility to account for both covariate effects on tie formations and endogenous network formation processes. However, there are both conceptual and computational issues for fitting ERGMs on big networks. This paper describes a framework and a series of methods (based on existing algorithms) to address these issues. It also outlines the advantages and disadvantages of the methods and the conditions to which they are most applicable. Selected methods are illustrated through examples. PMID:27480375
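    One widely used workaround for fitting ERGMs on large networks is maximum pseudolikelihood estimation, which reduces the fit to a logistic regression on dyad-level change statistics. The toy sketch below (random network, edge + shared-partner statistics, illustrative learning rate) is a generic illustration of that idea, not the paper's specific framework:

    ```python
    import numpy as np

    # Maximum pseudolikelihood for a two-parameter ERGM (edges + shared partners):
    # P(y_ij = 1 | rest) = logistic(theta . change_stats(i, j)), fitted here by
    # plain gradient ascent on the logistic (pseudo)likelihood.
    rng = np.random.default_rng(1)
    n = 20
    y = np.triu(rng.random((n, n)) < 0.2, k=1).astype(float)
    y = y + y.T                                   # symmetric adjacency matrix

    X, t = [], []
    for i in range(n):
        for j in range(i + 1, n):
            # change statistics: +1 edge, +1 per shared partner of i and j
            X.append([1.0, float(np.dot(y[i], y[j]))])
            t.append(y[i, j])
    X, t = np.array(X), np.array(t)

    theta = np.zeros(2)
    for _ in range(2000):                         # gradient ascent
        p = 1.0 / (1.0 + np.exp(-X @ theta))
        theta += 0.1 * X.T @ (t - p) / len(t)
    ```

    Because each dyad is treated as conditionally independent, this scales far better than MCMC-based maximum likelihood, at the cost of biased standard errors on networks with strong dependence.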

  3. Big deformation in 17C

    International Nuclear Information System (INIS)

    Reaction and interaction cross sections of 17C on a carbon target have been re-analyzed using the modified Glauber model. The analysis with a deformed Woods-Saxon density/potential suggests a big deformation structure for 17C. The existence of a tail in the density distribution supports the possibility of it being a one-neutron halo structure. Under a deformed core plus a single-particle assumption, analysis shows a dominant d-wave of the valence neutron in 17C. (authors)

  4. Big bang nucleosynthesis: An update

    International Nuclear Information System (INIS)

    An update on the standard model of big bang nucleosynthesis (BBN) is presented. With the value of the baryon-to-photon ratio determined to high precision by WMAP, standard BBN is a parameter-free theory. In this context, the theoretical prediction for the abundances of D, 4He, and 7Li is discussed and compared to their observational determination. While concordance for D and 4He is satisfactory, the prediction for 7Li exceeds the observational determination by a factor of about four. Possible solutions to this problem are discussed

  5. Big Five -persoonallisuuspiirteiden yhteydet unettomuuteen (Associations between the Big Five personality traits and insomnia)

    OpenAIRE

    Aronen, Aino

    2015-01-01

    The aim of the study was to examine whether the Big Five personality traits (neuroticism, extraversion, conscientiousness, openness to experience, and agreeableness) are associated with symptoms of insomnia, namely difficulty falling asleep, frequent awakenings, difficulty staying asleep, and waking up tired after sleep of normal length. According to theories of insomnia, high neuroticism, low extraversion, low conscientiousness, and low agreeableness can...

  6. Traffic information computing platform for big data

    International Nuclear Information System (INIS)

    The big data environment creates the data conditions for improving the quality of traffic information services. The aim of this article is to construct a traffic information computing platform for the big data environment. Through in-depth analysis of the connotation and technological characteristics of big data and traffic information services, a distributed traffic atomic information computing platform architecture is proposed. In the big data environment, this traffic atomic information computing architecture helps to guarantee safe and efficient traffic operation, and more intelligent and personalized traffic information services can be provided to traffic information users

  7. Big Data Analytics Using Cloud and Crowd

    OpenAIRE

    Allahbakhsh, Mohammad; Arbabi, Saeed; Motahari-Nezhad, Hamid-Reza; Benatallah, Boualem

    2016-01-01

    The increasing application of social and human-enabled systems in people's daily life from one side and from the other side the fast growth of mobile and smart phones technologies have resulted in generating tremendous amount of data, also referred to as big data, and a need for analyzing these data, i.e., big data analytics. Recently a trend has emerged to incorporate human computing power into big data analytics to solve some shortcomings of existing big data analytics such as dealing with ...

  8. Big data optimization recent developments and challenges

    CERN Document Server

    2016-01-01

    The main objective of this book is to provide the necessary background to work with big data by introducing some novel optimization algorithms and codes capable of working in the big data setting, as well as some applications of big data optimization, for both academics and interested practitioners, and to benefit society, industry, academia, and government. Presenting applications in a variety of industries, this book will be useful for researchers aiming to analyse large-scale data. Several optimization algorithms for big data, including convergent parallel algorithms, the limited memory bundle algorithm, the diagonal bundle method, network analytics, and many more, are explored in this book.

  9. Big data analytics with R and Hadoop

    CERN Document Server

    Prajapati, Vignesh

    2013-01-01

    Big Data Analytics with R and Hadoop is a tutorial style book that focuses on all the powerful big data tasks that can be achieved by integrating R and Hadoop.This book is ideal for R developers who are looking for a way to perform big data analytics with Hadoop. This book is also aimed at those who know Hadoop and want to build some intelligent applications over Big data with R packages. It would be helpful if readers have basic knowledge of R.

  10. Urgent Call for Nursing Big Data.

    Science.gov (United States)

    Delaney, Connie W

    2016-01-01

    The purpose of this panel is to expand internationally a National Action Plan for sharable and comparable nursing data for quality improvement and big data science. There is an urgent need to assure that nursing has sharable and comparable data for quality improvement and big data science. A national collaborative - Nursing Knowledge and Big Data Science includes multi-stakeholder groups focused on a National Action Plan toward implementing and using sharable and comparable nursing big data. Panelists will share accomplishments and future plans with an eye toward international collaboration. This presentation is suitable for any audience attending the NI2016 conference. PMID:27332330

  11. Traffic information computing platform for big data

    Energy Technology Data Exchange (ETDEWEB)

    Duan, Zongtao, E-mail: ztduan@chd.edu.cn; Li, Ying, E-mail: ztduan@chd.edu.cn; Zheng, Xibin, E-mail: ztduan@chd.edu.cn; Liu, Yan, E-mail: ztduan@chd.edu.cn; Dai, Jiting, E-mail: ztduan@chd.edu.cn; Kang, Jun, E-mail: ztduan@chd.edu.cn [Chang' an University School of Information Engineering, Xi' an, China and Shaanxi Engineering and Technical Research Center for Road and Traffic Detection, Xi' an (China)

    2014-10-06

    The big data environment creates the conditions for improving the quality of traffic information services. The goal of this article is to construct a traffic information computing platform for the big data environment. Through in-depth analysis of the characteristics of big data and of traffic information services, a distributed traffic atomic information computing platform architecture is proposed. Under the big data environment, this architecture helps to guarantee traffic safety and efficient operation, and enables more intelligent and personalized traffic information services for traffic information users.

  12. BigDataBench: a Big Data Benchmark Suite from Internet Services

    OpenAIRE

    Wang, Lei; Zhan, Jianfeng; Luo, Chunjie; Zhu, Yuqing; Yang, Qiang; He, Yongqiang; Gao, Wanling; Jia, Zhen; Shi, Yingjie; Zhang, Shujie; Zheng, Chen; Lu, Gang; Zhan, Kent; Li, Xiaona; Qiu, Bizhu

    2014-01-01

    As architecture, systems, and data management communities pay greater attention to innovative big data systems and architectures, the pressure of benchmarking and evaluating these systems rises. Considering the broad use of big data systems, big data benchmarks must include diversity of data and workloads. Most of the state-of-the-art big data benchmarking efforts target evaluating specific types of applications or system software stacks, and hence they are not qualified for serving the purpo...

  13. Big Data; A Management Revolution : The emerging role of big data in businesses

    OpenAIRE

    Blasiak, Kevin

    2014-01-01

    Big data is a term that was coined in 2012 and has since emerged as one of the top trends in business and technology. Big data is an agglomeration of different technologies resulting in data processing capabilities that were previously unreachable. Big data is generally characterized by three factors: volume, velocity and variety. These three factors distinguish it from traditional data use. The possibilities to utilize this technology are vast. Big data technology has touch points in differ...

  14. CloudJet4BigData: Streamlining Big Data via an accelerated socket interface

    OpenAIRE

    Frank Z.Wang

    2014-01-01

    Big data needs to feed users with fresh processing results and cloud platforms can be used to speed up big data applications. This paper describes a new data communication protocol (CloudJet) for long distance and large volume big data accessing operations to alleviate the large latencies encountered in sharing big data resources in the clouds. It encapsulates a dynamic multi-stream/multi-path engine at the socket level, which conforms to Portable Operating System Interface (POSIX) and thereb...
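
    The multi-stream idea described in the abstract (striping one large transfer across several parallel POSIX TCP sockets to hide per-connection latency) can be illustrated with a small sketch. This is a toy reconstruction, not the CloudJet implementation; the function names, framing and threading model below are our own assumptions:

    ```python
    import socket
    import threading

    def recv_exact(conn, n):
        """Read exactly n bytes from a connection."""
        buf = b""
        while len(buf) < n:
            chunk = conn.recv(n - len(buf))
            if not chunk:
                raise ConnectionError("stream closed early")
            buf += chunk
        return buf

    def start_striped_receiver(host, port, n_streams):
        """Accept n_streams connections, each carrying one indexed slice of the
        payload, and reassemble the slices in order.
        Returns (receiver_thread, result_list, bound_port)."""
        srv = socket.create_server((host, port))
        result = []

        def run():
            parts = {}
            for _ in range(n_streams):
                conn, _addr = srv.accept()
                with conn:
                    idx = int.from_bytes(recv_exact(conn, 4), "big")
                    size = int.from_bytes(recv_exact(conn, 4), "big")
                    parts[idx] = recv_exact(conn, size)
            srv.close()
            result.append(b"".join(parts[i] for i in range(n_streams)))

        t = threading.Thread(target=run)
        t.start()
        return t, result, srv.getsockname()[1]

    def send_striped(host, port, data, n_streams):
        """Stripe data into contiguous slices and push each slice over its own
        TCP connection, all connections running concurrently."""
        slice_len = -(-len(data) // n_streams)  # ceiling division

        def push(i):
            part = data[i * slice_len:(i + 1) * slice_len]
            with socket.create_connection((host, port)) as c:
                # 4-byte slice index + 4-byte length header, then the slice
                c.sendall(i.to_bytes(4, "big") + len(part).to_bytes(4, "big") + part)

        workers = [threading.Thread(target=push, args=(i,)) for i in range(n_streams)]
        for w in workers:
            w.start()
        for w in workers:
            w.join()
    ```

    Starting the receiver, calling `send_striped`, and joining the receiver thread yields the original payload reassembled from its slices; a real multi-path protocol would additionally schedule slices across network paths according to their measured bandwidth and latency.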

  15. CloudJet4BigData: Streamlining Big Data via an Accelerated Socket Interface

    OpenAIRE

    Wang, Frank Zhigang; Dimitrakos, Theo; Helian, Na; Wu, Sining; Li, Ling; Yates, Rodric

    2014-01-01

    Big data needs to feed users with fresh processing results and cloud platforms can be used to speed up big data applications. This paper describes a new data communication protocol (CloudJet) for long distance and large volume big data accessing operations to alleviate the large latencies encountered in sharing big data resources in the clouds. It encapsulates a dynamic multi-stream/multi-path engine at the socket level, which conforms to Portable Operating System Interface (POSIX) and thereb...

  16. Improving the Success of Strategic Management Using Big Data.

    Science.gov (United States)

    Desai, Sapan S; Wilkerson, James; Roberts, Todd

    2016-01-01

    Strategic management involves determining organizational goals, implementing a strategic plan, and properly allocating resources. Poor access to pertinent and timely data misidentifies clinical goals, prevents effective resource allocation, and generates waste from inaccurate forecasting. Loss of operational efficiency diminishes the value stream, adversely impacts the quality of patient care, and hampers effective strategic management. We have pioneered an approach using big data to create competitive advantage by identifying trends in clinical practice, accurately anticipating future needs, and strategically allocating resources for maximum impact. PMID:27180477

  17. 77 FR 27245 - Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN

    Science.gov (United States)

    2012-05-09

    ... Register (73 FR 76677) on December 17, 2008. For more about the initial process and the history of this... Fish and Wildlife Service Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN... comprehensive conservation plan (CCP) and environmental assessment (EA) for Big Stone National Wildlife...

  18. Comparative validity of brief to medium-length Big Five and Big Six personality questionnaires

    NARCIS (Netherlands)

    A.G. Thalmayer; G. Saucier; A. Eigenhuis

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five

  19. Comparative Validity of Brief to Medium-Length Big Five and Big Six Personality Questionnaires

    Science.gov (United States)

    Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are…

  20. Kasner solutions, climbing scalars and big-bang singularity

    International Nuclear Information System (INIS)

    We elaborate on a recently discovered phenomenon where a scalar field close to the big bang is forced to climb a steep potential by its dynamics. We analyze the phenomenon in more general terms by writing the leading-order equations of motion near the singularity. We formulate the conditions for climbing to exist in the case of several scalars and after inclusion of higher-derivative corrections, and we apply our results to some models of moduli stabilization. We analyze an example with a steep stabilizing potential and notice again a related critical behavior: for a potential steepness above a critical value, going backwards towards the big bang, the scalar undergoes wilder oscillations, with the steep potential pushing it back at every passage and not allowing the scalar to escape to infinity. Whereas it was pointed out earlier that there are possible implications of the climbing phase for the CMB, we point out here another potential application, to the issue of initial conditions in inflation.

  1. Developmental immunotoxicology of lead

    International Nuclear Information System (INIS)

    The heavy metal, lead, is a known developmental immunotoxicant that has been shown to produce immune alterations in humans as well as other species. Unlike many compounds that exert adverse immune effects, lead exposure at low to moderate levels does not produce widespread loss of immune cells. In contrast, changes resulting from lead exposure are subtle at the immune cell population level but, nevertheless, can be functionally dramatic. A hallmark of lead-induced immunotoxicity is a pronounced shift in the balance in T helper cell function toward T helper 2 responses at the expense of T helper 1 functions. This bias alters the nature and range of immune responses that can be produced thereby influencing host susceptibility to various diseases. Immunotoxic responses to lead appear to differ across life stages not only quantitatively with regard to dose response, but also qualitatively in terms of the spectrum of immune alterations. Experimental studies in several lab animal species suggest the latter stages of gestation are a period of considerable sensitivity for lead-induced immunotoxicity. This review describes the basic characteristics of lead-induced immunotoxicity emphasizing experimental animal results. It also provides a framework for the consideration of toxicant exposure effects across life stages. The existence of and probable basis for developmental windows of immune hyper-susceptibility are presented. Finally, the potential for lead to serve as a perinatal risk factor for childhood asthma as well as other diseases is considered

  2. Assessment of Soil Erosion Severities and Conservation Effects on Reduction of Soil Loss and Sediment Yield by Using the Caesium-137 and Excess Lead-210 Tracing Techniques in the Upper Yangtze River Basin, China

    International Nuclear Information System (INIS)

    In one of the most eroded regions of China, the Upper Yangtze River Basin, 137Cs and 210Pbex tracing techniques have been used to identify sediment sources and assess erosion rates and to evaluate soil conservation benefits and impacts of deforestation and reforestation on soil erosion. Dating of reservoir deposits by 137Cs concentration variations in profiles provided reliable information on sediment yields and average soil losses in a catchment in the Hilly Sichuan Basin and the Three Gorges Region. The collected data indicated that specific sediment yields ranged between 566 t km-2 a-1 and 1869 t km-2 a-1. Further investigation of sediment sources by comparison of the 137Cs and 210Pbex concentrations in reservoir sediment with those in surface soils provided relative sediment contributions of 18%, 46% and 36% from steep forest slopes, gentle cultivated terraces and bare slopes, respectively, in a micro-catchment located in the Hilly Sichuan Basin. Additionally, two forest and shrub fires, which occurred in 1960 and 1998, respectively, in the catchment, were identified from the variations of 137Cs and 210Pbex concentrations in a deposit profile in the Jiulongdian Reservoir (Yunnan Plateau). Dating of these reservoir deposits showed that sediment yields are highly responsive to vegetation changes in the catchment. Finally, assessment of soil losses from 137Cs and 210Pbex measurements on the sloping cultivated land in the Sichuan Hilly Basin showed that the traditional and centuries-old soil conservation measures (a drainage system combined with the 'Tiaoshamiantu' cultivation) have reduced soil losses by up to 35% compared to losses on sloping land without conservation measures monitored by runoff plots. (author)

  3. Astronomical surveys and big data

    Science.gov (United States)

    Mickaelian, Areg M.

    Recent all-sky and large-area astronomical surveys and their catalogued data over the whole range of the electromagnetic spectrum, from γ-rays to radio waves, are reviewed, including Fermi-GLAST and INTEGRAL in γ-ray, ROSAT, XMM and Chandra in X-ray, GALEX in UV, SDSS and several POSS I and POSS II-based catalogues (APM, MAPS, USNO, GSC) in the optical range, 2MASS in NIR, WISE and AKARI IRC in MIR, IRAS and AKARI FIS in FIR, NVSS and FIRST in the radio range, and many others, as well as the most important surveys giving optical images (DSS I and II, SDSS, etc.), proper motions (Tycho, USNO, Gaia), variability (GCVS, NSVS, ASAS, Catalina, Pan-STARRS), and spectroscopic data (FBS, SBS, Case, HQS, HES, SDSS, CALIFA, GAMA). An overall understanding of the coverage along the whole wavelength range and comparisons between various surveys are given: galaxy redshift surveys, QSO/AGN, radio, Galactic structure, and Dark Energy surveys. Astronomy has entered the Big Data era, with Astrophysical Virtual Observatories and Computational Astrophysics playing an important role in using and analyzing big data for new discoveries.

  4. Georges et le big bang

    CERN Document Server

    Hawking, Lucy; Parsons, Gary

    2011-01-01

    Georges and Annie, his best friend, are about to witness one of the most important scientific experiments of all time: exploring the first moments of the Universe, the Big Bang! Thanks to Cosmos, their super-computer, and to the Large Hadron Collider created by Eric, Annie's father, they will finally be able to answer the essential question: why do we exist? But Georges and Annie discover that a diabolical plot is underway. Worse, all of scientific research is in peril! Drawn into incredible adventures, Georges will travel to the far reaches of the galaxy to save his friends... A thrilling plunge into the heart of the Big Bang, with the very latest theories of Stephen Hawking and today's greatest scientists.

  5. THE 2H(α,γ)6Li REACTION AT LUNA AND BIG BANG NUCLEOSYNTHESIS

    Directory of Open Access Journals (Sweden)

    Carlo Gustavino

    2013-12-01

    Full Text Available The 2H(α,γ)6Li reaction is the leading process for the production of 6Li in standard Big Bang Nucleosynthesis. Recent observations of lithium abundance in metal-poor halo stars suggest that there might be a 6Li plateau, similar to the well-known Spite plateau of 7Li. This calls for a re-investigation of the standard production channel for 6Li. As the 2H(α,γ)6Li cross section drops steeply at low energy, it has never before been studied directly at Big Bang energies. For the first time the reaction has been studied directly at Big Bang energies at the LUNA accelerator. The preliminary data and their implications for Big Bang nucleosynthesis and the purported 6Li problem will be shown.

  6. Supporting diagnosis and treatment in medical care based on Big Data processing.

    Science.gov (United States)

    Lupşe, Oana-Sorina; Crişan-Vida, Mihaela; Stoicu-Tivadar, Lăcrămioara; Bernard, Elena

    2014-01-01

    With information and data in all domains growing every day, it is difficult to manage and extract useful knowledge for specific situations. This paper presents an integrated system architecture to support the activity in Ob-Gyn departments, with further developments in using new technology to manage Big Data processing - using Google BigQuery - in the medical domain. The data collected and processed with Google BigQuery come from different sources: two Obstetrics & Gynaecology Departments, the TreatSuggest application - an application for suggesting treatments - and a home foetal surveillance system. Data is uploaded to Google BigQuery from Bega Hospital Timişoara, Romania. The analysed data is useful for medical staff, researchers and statisticians from the public health domain. The current work describes the technological architecture and its processing possibilities, which in the future will be validated against quality criteria, leading to a better decision process in diagnosis and public health. PMID:24743079

  7. Does loop quantum cosmology replace the big rip singularity by a non-singular bounce?

    Energy Technology Data Exchange (ETDEWEB)

    Haro, Jaume de, E-mail: jaime.haro@upc.edu [Departament de Matemàtica Aplicada I, Universitat Politècnica de Catalunya, Diagonal 647, 08028 Barcelona (Spain)

    2012-11-01

    It is stated that holonomy corrections in loop quantum cosmology introduce a modification in Friedmann's equation which prevents the big rip singularity. Recently in [1] it has been proved that this modified Friedmann equation is obtained in an inconsistent way, which means that the results deduced from it, in particular the big rip singularity avoidance, are not justified. The problem is that holonomy corrections modify the gravitational part of the Hamiltonian of the system, leading, after a Legendre transformation, to a non-covariant Lagrangian, in contradiction with one of the main principles of General Relativity. A more consistent way to deal with big rip avoidance is to disregard the modification of the gravitational part of the Hamiltonian and to consider only inverse-volume effects [2]. In this case we will see that, unlike the big bang singularity, the big rip singularity survives in loop quantum cosmology. Another way to address big rip avoidance is to take into account the geometric quantum effects given by the Wheeler-DeWitt equation. In that case, even though the wave packets spread, the expectation values satisfy the same equations as their classical analogues. Then, following the viewpoint adopted in loop quantum cosmology, one can conclude that the big rip singularity survives when these quantum effects are taken into account. However, the spreading of the wave packets prevents the recovery of the semiclassical time, and thus one might conclude that the classical evolution of the universe comes to an end before the big rip is reached. This is not conclusive because, as we will see, there always exist other external times that allow us to define the classical and quantum evolution of the universe up to the big rip singularity.
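
    For context, the holonomy-corrected effective Friedmann equation at issue in this debate has the standard LQC form (quoted from the general literature, not from this paper):

    ```latex
    H^{2} \;=\; \frac{8\pi G}{3}\,\rho\left(1-\frac{\rho}{\rho_{c}}\right),
    \qquad \rho_{c}\sim\rho_{\mathrm{Planck}},
    ```

    so that H vanishes as ρ approaches the critical density ρ_c. This is the bounce mechanism whose derivation the abstract argues is inconsistent.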

  8. Leading men

    DEFF Research Database (Denmark)

    Bekker-Nielsen, Tønnes

    2016-01-01

    Through a systematic comparison of c. 50 careers leading to the koinarchate or high priesthood of Asia, Bithynia, Galatia, Lycia, Macedonia and coastal Pontus, as described in funeral or honorary inscriptions of individual koinarchs, it is possible to identify common denominators but also...

  9. Lead grids

    CERN Multimedia

    1974-01-01

    One of the 150 lead grids used in the multiwire proportional chamber γ-ray detector. The 0.75 mm diameter holes are spaced 1 mm centre to centre. The grids were made by chemical cutting techniques in the Godet Workshop of the SB Physics.

  10. Lead Poisoning

    Science.gov (United States)

    ... Lead poisoning is still one of the most important health issues in the United States ... in housing built before 1946 have elevated blood lead levels. These ...

  11. Why Big Data Is a Big Deal (Ⅱ)

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    A new group of data mining technologies promises to change forever the way we sift through our vast stores of data,making it faster and cheaper.Some of the technologies are actively being used by people on the bleeding edge who need the technology now,like those involved in creating Web-based services that are driven by social media.They're also heavily contributing to these projects.In other vertical industries,businesses are realizing that much more of their value proposition is informationbased than they had previously thought,which will allow big data technologies to gain traction quickly,Olofson says.Couple that with affordable hardware and software,and enterprises find themselves in a perfect storm of business transformation opportunities.

  12. The ethics of Big data: analytical survey

    OpenAIRE

    GIBER L.; KAZANTSEV N.

    2015-01-01

    The number of recent publications on the matter of ethical challenges of the implementation of Big Data has signified the growing interest to all the aspects of this issue. The proposed study specifically aims at analyzing ethical issues connected with Big Data.

  13. A New Look at Big History

    Science.gov (United States)

    Hawkey, Kate

    2014-01-01

    The article sets out a "big history" which resonates with the priorities of our own time. A globalizing world calls for new spacial scales to underpin what the history curriculum addresses, "big history" calls for new temporal scales, while concern over climate change calls for a new look at subject boundaries. The article…

  14. The Big Sleep in the Woods

    Institute of Scientific and Technical Information of China (English)

    王玉峰

    2002-01-01

    Now it's the time of the big sleep for the bees and the bears. Even the buds of the plants whose leaves fall off share in it. But the intensity of this winter sleep, or hibernation, depends on who's doing it.The big sleep of the bears ,for instance ,would probably be thought of as a

  15. Big Science and Long-tail Science

    CERN Multimedia

    2008-01-01

    Jim Downing and I were privileged to be the guests of Salvatore Mele at CERN yesterday and to see the ATLAS detector of the Large Hadron Collider. This is a wow experience - although I knew it was big, I hadn't realised how big.

  16. An embedding for the big bang

    Science.gov (United States)

    Wesson, Paul S.

    1994-01-01

    A cosmological model is given that has good physical properties for the early and late universe but is a hypersurface in a flat five-dimensional manifold. The big bang can therefore be regarded as an effect of a choice of coordinates in a truncated higher-dimensional geometry. Thus the big bang is in some sense a geometrical illusion.

  17. Big Red: A Development Environment for Bigraphs

    DEFF Research Database (Denmark)

    Faithfull, Alexander John; Perrone, Gian David; Hildebrandt, Thomas

    2013-01-01

    We present Big Red, a visual editor for bigraphs and bigraphical reactive systems, based upon Eclipse. The editor integrates with several existing bigraph tools to permit simulation and model-checking of bigraphical models. We give a brief introduction to the bigraphs formalism, and show how these...... concepts manifest within the tool using a small motivating example bigraphical model developed in Big Red....

  18. Hom-Big Brackets: Theory and Applications

    OpenAIRE

    Cai, Liqiang; Sheng, Yunhe

    2015-01-01

    In this paper, we introduce the notion of hom-big brackets, which is a generalization of Kosmann-Schwarzbach's big brackets. We show that it gives rise to a graded hom-Lie algebra. Thus, it is a useful tool to study hom-structures. In particular, we use it to describe hom-Lie bialgebras and hom-Nijenhuis operators.

  19. Big system: Interactive graphics for the engineer

    Science.gov (United States)

    Quenneville, C. E.

    1975-01-01

    The BCS Interactive Graphics System (BIG System) approach to graphics was presented, along with several significant engineering applications. The BIG System precompiler, the graphics support library, and the function requirements of graphics applications are discussed. It was concluded that graphics standardization and a device independent code can be developed to assure maximum graphic terminal transferability.

  20. What is beyond the big five?

    Science.gov (United States)

    Saucier, G; Goldberg, L R

    1998-08-01

    Previous investigators have proposed that various kinds of person-descriptive content--such as differences in attitudes or values, in sheer evaluation, in attractiveness, or in height and girth--are not adequately captured by the Big Five Model. We report on a rather exhaustive search for reliable sources of Big Five-independent variation in data from person-descriptive adjectives. Fifty-three candidate clusters were developed in a college sample using diverse approaches and sources. In a nonstudent adult sample, clusters were evaluated with respect to a minimax criterion: minimum multiple correlation with factors from Big Five markers and maximum reliability. The most clearly Big Five-independent clusters referred to Height, Girth, Religiousness, Employment Status, Youthfulness and Negative Valence (or low-base-rate attributes). Clusters referring to Fashionableness, Sensuality/Seductiveness, Beauty, Masculinity, Frugality, Humor, Wealth, Prejudice, Folksiness, Cunning, and Luck appeared to be potentially beyond the Big Five, although each of these clusters demonstrated Big Five multiple correlations of .30 to .45, and at least one correlation of .20 and over with a Big Five factor. Of all these content areas, Religiousness, Negative Valence, and the various aspects of Attractiveness were found to be represented by a substantial number of distinct, common adjectives. Results suggest directions for supplementing the Big Five when one wishes to extend variable selection outside the domain of personality traits as conventionally defined. PMID:9728415
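
    The minimax criterion described above (minimum multiple correlation with factors from Big Five markers, maximum reliability) can be sketched numerically. The function below is our own illustration of those two statistics, assuming a subjects-by-items rating matrix; it is not the authors' code:

    ```python
    import numpy as np

    def minimax_stats(items, big_five):
        """items: (n_subjects, n_items) ratings forming one candidate cluster.
        big_five: (n_subjects, 5) factor scores from Big Five markers.
        Returns (R, alpha): the multiple correlation of the cluster score with
        the Big Five factors, and Cronbach's alpha of the cluster."""
        score = items.mean(axis=1)

        # Multiple correlation R: regress the cluster score on the five factors
        # (plus intercept) and take sqrt of the explained variance proportion.
        X = np.column_stack([np.ones(len(score)), big_five])
        beta, *_ = np.linalg.lstsq(X, score, rcond=None)
        resid = score - X @ beta
        ss_res = (resid ** 2).sum()
        ss_tot = ((score - score.mean()) ** 2).sum()
        R = float(np.sqrt(max(0.0, 1.0 - ss_res / ss_tot)))

        # Cronbach's alpha: internal consistency of the k cluster items.
        k = items.shape[1]
        alpha = k / (k - 1) * (1.0 - items.var(axis=0, ddof=1).sum()
                               / items.sum(axis=1).var(ddof=1))
        return R, float(alpha)
    ```

    Under this criterion, a candidate cluster such as Religiousness or Height would be retained if it shows low `R` (little overlap with the Big Five) together with high `alpha` (reliable measurement).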

  1. Kansen voor Big data – WPA Vertrouwen

    NARCIS (Netherlands)

    Broek, T.A. van den; Roosendaal, A.P.C.; Veenstra, A.F.E. van; Nunen, A.M. van

    2014-01-01

    Big data is expected to become a driver for economic growth, but this can only be achieved when services based on (big) data are accepted by citizens and consumers. In a recent policy brief, the Cabinet Office mentions trust as one of the three pillars (the others being transparency and control) for

  2. Big Food, Food Systems, and Global Health

    OpenAIRE

    Stuckler, David; Nestle, Marion

    2012-01-01

    In an article that forms part of the PLoS Medicine series on Big Food, guest editors David Stuckler and Marion Nestle lay out why more examination of the food industry is necessary, and offer three competing views on how public health professionals might engage with Big Food.

  3. Probing the pre-big bang universe

    International Nuclear Information System (INIS)

    Superstring theory suggests a new cosmology whereby a long inflationary phase preceded a non-singular big bang-like event. After discussing how pre-big bang inflation naturally arises from an almost trivial initial state of the Universe, I will describe how present or near-future experiments can provide sensitive probes of how the Universe behaved in the pre-bang era.

  4. How Important Is Surgeon's Skill for Weight-Loss Surgery Outcomes?

    Science.gov (United States)

    How Important Is Surgeon's Skill for Weight-Loss Surgery Outcomes? Study saw no ... WEDNESDAY, April 13, 2016 (HealthDay News) -- A surgeon's skill level does not seem to have a big ...

  5. BIG Data – A Review.

    Directory of Open Access Journals (Sweden)

    Anuradha Bhatia

    2013-08-01

    Full Text Available As more data becomes available from an abundance of sources both within and outside the organization, organizations are seeking to use those abundant resources to increase innovation, retain customers, and increase operational efficiency. At the same time, organizations are challenged by their end users, who are demanding greater capability and integration to mine and analyze burgeoning new sources of information. Big Data provides opportunities for business users to ask questions they never were able to ask before. How can a financial organization find better ways to detect fraud? How can an insurance company gain a deeper insight into its customers to see who may be the least economical to insure? How does a software company find its most at-risk customers, those who are about to deploy a competitive product? They need to integrate Big Data techniques with their current enterprise data to gain that competitive advantage. Heterogeneity, scale, timeliness, complexity, and privacy problems with Big Data impede progress at all phases of the pipeline that can create value from data. The problems start right away during data acquisition, when the data tsunami requires us to make decisions, currently in an ad hoc manner, about what data to keep and what to discard, and how to store what we keep reliably with the right metadata. Much data today is not natively in structured format; for example, tweets and blogs are weakly structured pieces of text, while images and video are structured for storage and display, but not for semantic content and search: transforming such content into a structured format for later analysis is a major challenge. The value of data explodes when it can be linked with other data, thus data integration is a major creator of value. Since most data is directly generated in digital format today, we have the opportunity and the challenge both to influence the creation to facilitate later linkage and to automatically link previously created data

  6. Cloud Based Big Data Infrastructure: Architectural Components and Automated Provisioning

    OpenAIRE

    Demchenko, Yuri; Turkmen, Fatih; Blanchet, Christophe; Loomis, Charles; Laat, Caees de

    2016-01-01

    This paper describes the general architecture and functional components of the cloud based Big Data Infrastructure (BDI). The proposed BDI architecture is based on the analysis of the emerging Big Data and data intensive technologies and supported by the definition of the Big Data Architecture Framework (BDAF) that defines the following components of the Big Data technologies: Big Data definition, Data Management including data lifecycle and data structures, Big Data Infrastructure (generical...

  7. Evidence of the Big Fix

    CERN Document Server

    Hamada, Yuta; Kawana, Kiyoharu

    2014-01-01

    We give evidence of the Big Fix. The theory of wormholes and the multiverse suggests that the parameters of the Standard Model are fixed in such a way that the total entropy at the late stage of the universe is maximized, which we call the maximum entropy principle. In this paper, we discuss how it can be confirmed by experimental data, and we show that it is indeed true for the Higgs vacuum expectation value $v_{h}$. We assume that the baryon number is produced by the sphaleron process, and that the current quark masses, the gauge couplings and the Higgs self-coupling are fixed when we vary $v_{h}$. It turns out that the existence of atomic nuclei plays a crucial role in maximizing the entropy. This is reminiscent of the anthropic principle; however, in our case it is required by the fundamental law.

  8. Big data ja yrityksen markkinointi

    OpenAIRE

    Perolainen, Pekka

    2014-01-01

    The aim of this thesis was to study how big data can be utilized in a company's sales and marketing. Companies can use data collected from their own or external sources to make their operations more efficient. A company's own data consists mainly of transaction data, loyalty-card data, logistics data and sensor data. Camera recordings are also part of the data companies collect; under the legislation such data counts as personal-register data. Companies are able to collect, process and ...

  9. Spinoffs of big nuclear projects

    International Nuclear Information System (INIS)

    So far, spinoffs have been discussed only in connection with space travel. It is well worth investigating whether big nuclear projects, such as the advanced reactor lines or the nuclear fuel cycle, also produce technical spinoffs. One misunderstanding should be cleared up right at the beginning: man did not travel to the moon to invent the teflon-coated frying pan. Nor is nuclear spinoff the actual purpose of the exercise. The high temperature reactor and the fast breeder reactor, or the closing of the nuclear fuel cycle, are justified independent goals of energy policy. However, if the overall benefit of nuclear high technology to the national economy is to be evaluated, the question of technical spinoff must also be considered. (orig.)

  10. Was the Big Bang hot?

    Science.gov (United States)

    Wright, E. L.

    1983-01-01

    Techniques for verifying the spectrum defined by Woody and Richards (WR, 1981), which serves as a base for dust-distorted models of the 3 K background, are discussed. WR detected a sharp deviation from the Planck curve in the 3 K background. The absolute intensity of the background may be determined by the frequency dependence of the dipole anisotropy of the background or the frequency dependence effect in galactic clusters. Both methods involve the Doppler shift; analytical formulae are defined for characterization of the dipole anisotropy. The measurement of the 30-300 GHz spectra of cold galactic dust may reveal the presence of significant amounts of needle-shaped grains, which would in turn support a theory of a cold Big Bang.

  11. Advancements in Big Data Processing

    CERN Document Server

    Vaniachine, A; The ATLAS collaboration

    2012-01-01

    The ever-increasing volumes of scientific data present new challenges for Distributed Computing and Grid-technologies. The emerging Big Data revolution drives new discoveries in scientific fields including nanotechnology, astrophysics, high-energy physics, biology and medicine. New initiatives are transforming data-driven scientific fields by pushing Big Data limits, enabling massive data analysis in new ways. In petascale data processing scientists deal with datasets, not individual files. As a result, a task (comprised of many jobs) became a unit of petascale data processing on the Grid. Splitting of a large data processing task into jobs enabled fine-granularity checkpointing analogous to the splitting of a large file into smaller TCP/IP packets during data transfers. Transferring large data in small packets achieves reliability through automatic re-sending of the dropped TCP/IP packets. Similarly, transient job failures on the Grid can be recovered by automatic re-tries to achieve reliable Six Sigma produc...
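    The automatic re-try mechanism described above can be sketched as follows; the job runner and its 20% transient failure rate are hypothetical stand-ins for illustration, not part of any actual Grid middleware:

```python
import random

def run_job(job_id):
    """Hypothetical stand-in for a Grid job: fails transiently 20% of the time."""
    return random.random() > 0.2

def process_task(jobs, max_retries=3):
    """Run every job of a task, re-trying transient failures automatically,
    much as TCP re-sends dropped packets."""
    permanently_failed = []
    for job_id in jobs:
        for _attempt in range(max_retries):
            if run_job(job_id):
                break
        else:  # all retries exhausted
            permanently_failed.append(job_id)
    return permanently_failed

# With a 20% transient failure rate and 3 attempts, only about 0.8%
# of jobs (0.2**3) fail permanently.
print(len(process_task(range(1000))))
```

    Re-trying each job independently is what makes the split into many small jobs pay off: a transient failure costs one job's work, not the whole task's.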

  12. Big Book of Apple Hacks

    CERN Document Server

    Seibold, Chris

    2008-01-01

    Bigger in size, longer in length, broader in scope, and even more useful than our original Mac OS X Hacks, the new Big Book of Apple Hacks offers a grab bag of tips, tricks and hacks to get the most out of Mac OS X Leopard, as well as the new line of iPods, iPhone, and Apple TV. With 125 entirely new hacks presented in step-by-step fashion, this practical book is for serious Apple computer and gadget users who really want to take control of these systems. Many of the hacks take you under the hood and show you how to tweak system preferences, alter or add keyboard shortcuts, mount drives and

  13. Big Bang nucleosynthesis in crisis?

    International Nuclear Information System (INIS)

    A new evaluation of the constraint on the number of light neutrino species (Nν) from big bang nucleosynthesis suggests a discrepancy between the predicted light element abundances and those inferred from observations, unless the inferred primordial 4He abundance has been underestimated by 0.014±0.004 (1σ) or less than 10% (95% C.L.) of 3He survives stellar processing. With the quoted systematic errors in the observed abundances and a conservative chemical evolution parametrization, the best fit to the combined data is Nν=2.1±0.3 (1σ), and the standard model (Nν=3) is excluded at the 98.6% C.L. copyright 1995 The American Physical Society

  14. The safety of big workmanship

    International Nuclear Information System (INIS)

    This book brings together the contributions of a colloquium given in memory of Pierre Londe (1922-1999) and dealing with the safety of big workmanship. The main topics concern: 3-D water flow under pressure inside fractured environments; Rion-Antirion bridge: reliability and earthquake-resistant design of foundations; Rion-Antirion bridge: design and realization; geology and safety of dams; risk assessment; salt storage cavities: evaluation of tightness; safety of tunnel supports in deformed rock massifs: application to the El Achir tunnel; instability risk of rock formations on the natural slopes of the Alps; safety approach applied to the civil engineering of nuclear facilities; lessons learnt from the accidents of offshore platforms; the engineer in front of natural hazards; science and regulation. (J.S.)

  15. Exploring Relationships in Big Data

    Science.gov (United States)

    Mahabal, A.; Djorgovski, S. G.; Crichton, D. J.; Cinquini, L.; Kelly, S.; Colbert, M. A.; Kincaid, H.

    2015-12-01

    Big Data are characterized by several different 'V's: Volume, Veracity, Volatility, Value, and so on. For many datasets, Volumes inflated by redundant features often make the data noisier and harder to extract Value from. This is especially true if one is comparing or combining different datasets and the metadata are diverse. We have been exploring ways to exploit such datasets through a variety of statistical machinery and visualization. We show how we have applied it to time-series from large astronomical sky-surveys. This was done in the Virtual Observatory framework. More recently we have been doing similar work for a completely different domain, viz. biology/cancer. The methodology reuse involves application to diverse datasets gathered through the various centers associated with the Early Detection Research Network (EDRN) for cancer, an initiative of the National Cancer Institute (NCI). Application to Geo datasets is a natural extension.

  16. Big-bang nucleosynthesis - observational aspects

    International Nuclear Information System (INIS)

    Extrapolation of observational data on the abundances of D, 3He, 4He and 7Li in various astrophysical objects to derive their primordial values leads to results in good accordance with calculations from Standard Big Bang nucleosynthesis theory over 9 orders of magnitude in abundance and has led to the following predictions: There are not more than 3 light neutrino species or other particles contributing relativistic degrees of freedom at temperatures of a few MeV; the neutron half-life is less than 10.4 minutes; and baryonic dark matter exists, but not in sufficient quantities to close the universe. (The first two of these predictions have been confirmed by laboratory experiments). Searches for a primordial component in the abundance of any other element heavier than hydrogen - such as might have resulted from inhomogeneities due to phase transitions in the early universe, notably the quark-hadron transition - have so far proved completely negative. The primordial helium abundance is found from observations of extragalactic ionized hydrogen clouds to be close to 0.230 by mass, a little lower than predicted, but the difference does not exceed likely errors. (orig.)

  17. Mass loss from stars

    International Nuclear Information System (INIS)

    This article discusses the different mass-loss processes of stars and how mass-loss rates determine the fate of stars in advanced stages of stellar evolution. Upper main sequence stars have their atmospheric structure dominated by radiation pressure. The pressure exerted by energetic photons is sufficient to drive gases off into space. This process can impart enormous turbulence to the local interstellar medium. Evolutionary effects keep these stars from fully evaporating, but the very course of their evolution is determined by this mass-shedding process. Lower main sequence stars, like the sun, have a turbulent atmosphere enveloped in hot, thin coronal gas, blowing off a light stellar breeze. As the main sequence star evolves to a giant, its corona dissipates and the breeze turns into a strong stellar wind. Intermittent sputters combined with pulsational instabilities can lead to partial ejection of the atmosphere and envelope of a red giant, i.e. a planetary nebula results. The mass-loss from stars through planetary nebulae combined with other mass-loss processes such as stellar winds returns a substantial amount of material to the interstellar environment. Mass-loss in binary systems is also discussed

  18. Quantum Big Bang without fine-tuning in a toy-model

    International Nuclear Information System (INIS)

    The question of possible physics before the Big Bang (or after the Big Crunch) is addressed via a schematic non-covariant simulation of the loss of observability of the Universe. Our model is drastically simplified by the reduction of its degrees of freedom to a mere finite number. The Hilbert space of states is then allowed to be time-dependent and singular at the critical time t = tc. This option circumvents several traditional theoretical difficulties in a way illustrated via solvable examples. In particular, the unitary evolution of our toy-model quantum Universe is shown to be interruptible, without any fine-tuning, at the instant of its bang or collapse t = tc.

  19. Application of Diamond Based Beam Loss Monitors

    OpenAIRE

    Hempel, Maria

    2013-01-01

    The LHC has an operational stored energy of 130 MJ per beam. Only a small percentage of beam losses in the LHC equipment can damage material or lead to magnet quenches. Therefore, it is important to monitor different types of beam losses, e.g. scattering on residual gas particles, UFOs, collisions and injection losses. A detailed understanding of beam loss mechanisms is necessary to reduce them and ensure safe operation. Two different beam loss monitors are installed in the LHC tunnel: ionizat...

  20. Oil Churning losses in Automatic Transmission

    OpenAIRE

    Desai, Kaushik

    2013-01-01

    Lately the discussion of how big an impact vehicles have on the environment has grown. Vehicle manufacturers are building cars, trucks, wheel loaders, etc. that get more fuel efficient every year. But where do the engineers at these companies make the improvements? The drive train is one area of the vehicle where many different sources of losses are found. The gearbox is one example, where the losses are both dependent on and independent of the torque tran...

  1. Big questions, big science: meeting the challenges of global ecology.

    Science.gov (United States)

    Schimel, David; Keller, Michael

    2015-04-01

    Ecologists are increasingly tackling questions that require significant infrastructure, large experiments, networks of observations, and complex data and computation. Key hypotheses in ecology increasingly require more investment, and larger data sets, than can be collected by a single investigator's or a group of investigators' labs, sustained for longer than a typical grant. Large-scale projects are expensive, so their scientific return on the investment has to justify the opportunity cost: the science foregone because resources were expended on a large project rather than supporting a number of individual projects. In addition, their management must be accountable and efficient in the use of significant resources, requiring the use of formal systems engineering and project management to mitigate the risk of failure. Mapping the scientific method into formal project management requires both scientists able to work in that context and a project implementation team sensitive to the unique requirements of ecology. Sponsoring agencies, subject to external and internal forces, experience many pressures that push them towards counterproductive project management, but a scientific community aware of and experienced in large-project science can mitigate these tendencies. For big ecology to result in great science, ecologists must become informed, aware, and engaged in the advocacy and governance of large ecological projects. PMID:25680334

  2. Big data and the electronic health record.

    Science.gov (United States)

    Peters, Steve G; Buntrock, James D

    2014-01-01

    The electronic medical record has evolved from a digital representation of individual patient results and documents to information of large scale and complexity. Big Data refers to new technologies providing management and processing capabilities, targeting massive and disparate data sets. For an individual patient, techniques such as Natural Language Processing allow the integration and analysis of textual reports with structured results. For groups of patients, Big Data offers the promise of large-scale analysis of outcomes, patterns, temporal trends, and correlations. The evolution of Big Data analytics moves us from description and reporting to forecasting, predictive modeling, and decision optimization. PMID:24887521

  3. BLENDING IOT AND BIG DATA ANALYTICS

    OpenAIRE

    Tulasi.B*; Girish J Vemulkar

    2016-01-01

    Internet is continuously evolving and changing. Internet of Things (IoT) can be considered as the future of Internet applications which involves machine to machine learning (M2M). The actionable intelligence can be derived through fusion of Big Data and real time analytics with IoT. Big Data and IoT can be viewed as two sides of a coin. With the connection between Big Data and the objects on Internet benefits of IoT can be easily reaped. The applications of IoT spread across various domains l...

  4. Processing Solutions for Big Data in Astronomy

    Science.gov (United States)

    Fillatre, L.; Lepiller, D.

    2016-09-01

    This paper gives a simple introduction to processing solutions applied to massive amounts of data. It proposes a general presentation of the Big Data paradigm. The Hadoop framework, which is considered as the pioneering processing solution for Big Data, is described together with YARN, the integrated Hadoop tool for resource allocation. This paper also presents the main tools for the management of both the storage (NoSQL solutions) and computing capacities (MapReduce parallel processing schema) of a cluster of machines. Finally, more recent processing solutions like Spark are discussed. Big Data frameworks are now able to run complex applications while keeping the programming simple and greatly improving the computing speed.
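    The MapReduce parallel processing schema mentioned above can be illustrated with the canonical word-count example, written here as a minimal in-memory Python sketch of the programming model only (not of Hadoop's distributed runtime, which shards the map and reduce phases across a cluster):

```python
from collections import defaultdict

def map_phase(documents):
    """Map: emit a (word, 1) pair for every word in every input document."""
    for doc in documents:
        for word in doc.split():
            yield word.lower(), 1

def reduce_phase(pairs):
    """Reduce: group the emitted pairs by key and sum the counts
    (grouping plays the role of the shuffle step)."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

docs = ["big data big compute", "data processing"]
print(reduce_phase(map_phase(docs)))
# {'big': 2, 'data': 2, 'compute': 1, 'processing': 1}
```

    Because each map call touches one record and each reduce call touches one key, both phases parallelize trivially, which is what lets frameworks like Hadoop and Spark scale the same program from one machine to thousands.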

  5. M theory model of a big crunch/big bang transition

    International Nuclear Information System (INIS)

    We consider a picture in which the transition from a big crunch to a big bang corresponds to the collision of two empty orbifold planes approaching each other at a constant nonrelativistic speed in a locally flat background space-time, a situation relevant to recently proposed cosmological models. We show that p-brane states which wind around the extra dimension propagate smoothly and unambiguously across the orbifold plane collision. In particular we calculate the quantum mechanical production of winding M2-branes extending from one orbifold to the other. We find that the resulting density is finite and that the resulting gravitational backreaction is small. These winding states, which include the string theory graviton, can be propagated smoothly across the transition using a perturbative expansion in the membrane tension, an expansion which from the point of view of string theory is an expansion in inverse powers of α'. The conventional description of a crunch based on Einstein general relativity, involving Kasner or mixmaster behavior is misleading, we argue, because general relativity is only the leading order approximation to string theory in an expansion in positive powers of α'. In contrast, in the M theory setup we argue that interactions should be well behaved because of the smooth evolution of the fields combined with the fact that the string coupling tends to zero at the crunch. The production of massive Kaluza-Klein states should also be exponentially suppressed for small collision speeds. We contrast this good behavior with that found in previous studies of strings in Lorentzian orbifolds

  6. A novel autosomal recessive TERT T1129P mutation in a dyskeratosis congenita family leads to cellular senescence and loss of CD34+ hematopoietic stem cells not reversible by mTOR-inhibition.

    Science.gov (United States)

    Stockklausner, Clemens; Raffel, Simon; Klermund, Julia; Bandapalli, Obul Reddy; Beier, Fabian; Brümmendorf, Tim H; Bürger, Friederike; Sauer, Sven W; Hoffmann, Georg F; Lorenz, Holger; Tagliaferri, Laura; Nowak, Daniel; Hofmann, Wolf-Karsten; Buergermeister, Rebecca; Kerber, Carolin; Rausch, Tobias; Korbel, Jan O; Luke, Brian; Trumpp, Andreas; Kulozik, Andreas E

    2015-11-01

    The TERT gene encodes for the reverse transcriptase activity of the telomerase complex and mutations in TERT can lead to dysfunctional telomerase activity resulting in diseases such as dyskeratosis congenita (DKC). Here, we describe a novel TERT mutation at position T1129P leading to DKC with progressive bone marrow (BM) failure in homozygous members of a consanguineous family. BM hematopoietic stem cells (HSCs) of an affected family member were 300-fold reduced associated with a significantly impaired colony forming capacity in vitro and impaired repopulation activity in mouse xenografts. Recent data in yeast suggested improved cellular checkpoint controls by mTOR inhibition preventing cells with short telomeres or DNA damage from dividing. To evaluate a potential therapeutic option for the patient, we treated her primary skin fibroblasts and BM HSCs with the mTOR inhibitor rapamycin. This led to prolonged survival and decreased levels of senescence in T1129P mutant fibroblasts. In contrast, the impaired HSC function could not be improved by mTOR inhibition, as colony forming capacity and multilineage engraftment potential in xenotransplanted mice remained severely impaired. Thus, rapamycin treatment did not rescue the compromised stem cell function of TERTT1129P mutant patient HSCs and outlines limitations of a potential DKC therapy based on rapamycin. PMID:26546739

  7. A Perplexed Economist Confronts 'too Big to Fail'

    Directory of Open Access Journals (Sweden)

    Scherer, F. M.

    2010-12-01

    Full Text Available This paper examines premises and data underlying the assertion that some financial institutions in the U.S. economy were "too big to fail" and hence warranted government bailout. It traces the merger histories enhancing the dominance of six leading firms in the U.S. banking industry and the sharp increases in the concentration of financial institution assets accompanying that merger wave. Financial institution profits are found to have soared in tandem with rising concentration. The paper advances hypotheses why these phenomena might be related and surveys relevant empirical literature on the relationships between market concentration, interest rates received and charged by banks, and economies of scale in banking.

  8. Cosmological BCS mechanism and the big bang singularity

    International Nuclear Information System (INIS)

    We provide a novel mechanism that resolves the big bang singularity present in Friedmann-Lemaître-Robertson-Walker space-times without the need for ghost fields. Building on the fact that a four-fermion interaction arises in general relativity when fermions are covariantly coupled, we show that at early times the decrease in scale factor enhances the correlation between pairs of fermions. This enhancement leads to a BCS-like condensation of the fermions, opens a gap that dynamically drives the Hubble parameter H to zero, and results in a nonsingular bounce, at least in some special cases.

  9. Special Issue: Big data and predictive computational modeling

    Science.gov (United States)

    Koutsourelakis, P. S.; Zabaras, N.; Girolami, M.

    2016-09-01

    The motivation for this special issue stems from the symposium on "Big Data and Predictive Computational Modeling" that took place at the Institute for Advanced Study, Technical University of Munich, during May 18-21, 2015. With a mindset firmly grounded in computational discovery, but a polychromatic set of viewpoints, several leading scientists, from physics and chemistry, biology, engineering, applied mathematics, scientific computing, neuroscience, statistics and machine learning, engaged in discussions and exchanged ideas for four days. This special issue contains a subset of the presentations. Video and slides of all the presentations are available on the TUM-IAS website http://www.tum-ias.de/bigdata2015/.

  10. European lead cooled system (ELSY)

    International Nuclear Information System (INIS)

    The international Generation IV (GEN IV) initiative has once more highlighted that fast reactors are indispensable for a sustainable development of nuclear energy. Europe historically has large experience in the field of sodium-cooled fast reactors and has recently made a big effort in the development of Lead-Bismuth Eutectic (LBE) technology for use in sub-critical reactors, starting from the Russian technology for the submarine propulsion programme. The evolution from LBE technology towards pure lead technology is a natural and logical step because lead is less expensive, less corrosive and of lesser radiological concern. Lead has chemical and neutronic characteristics which are unique for a safe fast reactor: molten lead operates at low pressure and high temperature and is relatively inert to air and water. The ELSY consortium intends to design a Lead-cooled Fast Reactor (LFR) system that complies with all GEN IV goals and gives assurance of investment protection. The EC FP6-ELSY project aims to demonstrate that it is possible to design a competitive and safe fast critical reactor using simple engineered technical features. ELSY is a 36-month project (starting September 1, 2006) partially funded by the European Commission as a Specific Targeted Research Project

  11. BIG DATA, BIG CONSEQUENCES? EEN VERKENNING NAAR PRIVACY EN BIG DATA GEBRUIK BINNEN DE OPSPORING, VERVOLGING EN RECHTSPRAAK

    OpenAIRE

    Lodder, A.R.; Meulen, van der, N.; Wisman, T.H.A.; Meij, Lisette; Zwinkels, C.M.M.

    2014-01-01

    This exploratory study addresses the privacy aspects of Big Data analysis within the domain of Security and Justice. Applications within the judiciary are discussed, such as the prediction of rulings and use in court cases. With regard to criminal investigation, topics covered include predictive policing and online investigation. Following an exposition of the privacy norms and the possible applications, the following six principles for Big Data applications are proposed: 7 A.R. Lodder e.a. ‐ Bi...

  12. NOAA Big Data Partnership RFI

    Science.gov (United States)

    de la Beaujardiere, J.

    2014-12-01

    In February 2014, the US National Oceanic and Atmospheric Administration (NOAA) issued a Big Data Request for Information (RFI) from industry and other organizations (e.g., non-profits, research laboratories, and universities) to assess capability and interest in establishing partnerships to position a copy of NOAA's vast data holdings in the Cloud, co-located with easy and affordable access to analytical capabilities. This RFI was motivated by a number of concerns. First, NOAA's data facilities do not necessarily have sufficient network infrastructure to transmit all available observations and numerical model outputs to all potential users, or sufficient infrastructure to support simultaneous computation by many users. Second, the available data are distributed across multiple services and data facilities, making it difficult to find and integrate data for cross-domain analysis and decision-making. Third, large datasets require users to have substantial network, storage, and computing capabilities of their own in order to fully interact with and exploit the latent value of the data. Finally, there may be commercial opportunities for value-added products and services derived from our data. Putting a working copy of data in the Cloud outside of NOAA's internal networks and infrastructures should reduce demands and risks on our systems, and should enable users to interact with multiple datasets and create new lines of business (much like the industries built on government-furnished weather or GPS data). The NOAA Big Data RFI therefore solicited information on technical and business approaches regarding possible partnership(s) that -- at no net cost to the government and minimum impact on existing data facilities -- would unleash the commercial potential of its environmental observations and model outputs. NOAA would retain the master archival copy of its data. Commercial partners would not be permitted to charge fees for access to the NOAA data they receive, but

  13. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    Energy Technology Data Exchange (ETDEWEB)

    Susan M. Capalbo

    2005-01-31

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, is comprised of research institutions, public entities and private sectors organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership in Phase I fall into four areas: evaluation of sources and carbon sequestration sinks that will be used to determine the location of pilot demonstrations in Phase II; development of GIS-based reporting framework that links with national networks; designing an integrated suite of monitoring, measuring, and verification technologies and assessment frameworks; and initiating a comprehensive education and outreach program. The groundwork is in place to provide an assessment of storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks), that would complement the ongoing DOE research. Efforts are underway to showcase the architecture of the GIS framework and initial results for sources and sinks. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other western DOE partnerships. The Partnership recognizes the critical importance of measurement, monitoring, and verification technologies to support not only carbon trading but all policies and programs that DOE and other agencies may want to pursue in support of GHG mitigation. The efforts in developing and implementing MMV technologies for geological sequestration reflect this concern. Research is

  14. Big Bang–Big Crunch Optimization Algorithm for Linear Phase Fir Digital Filter Design

    Directory of Open Access Journals (Sweden)

    Ms. Rashmi Singh Dr. H. K. Verma

    2012-02-01

    Full Text Available The Big Bang–Big Crunch (BB–BC) optimization algorithm is a new optimization method that relies on the Big Bang and Big Crunch theory, one of the theories of the evolution of the universe. In this paper, a Big Bang–Big Crunch algorithm has been used for the design of linear phase finite impulse response (FIR) filters. The fitness function used is based on the mean squared error between the actual and the ideal filter response. This paper presents the plot of the magnitude response of the FIR filters and the error graph. The BB–BC algorithm seems to be a promising tool for FIR filter design, especially in a dynamic environment where filter coefficients have to be adapted and fast convergence is of importance.
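    The mean-squared-error fitness described above can be sketched as follows. This is an illustrative objective function only: the frequency-grid size, the ideal low-pass band edge, and the candidate coefficients are assumed for the example, and the BB–BC search loop itself (randomly "banging" candidates around the population's center of mass, then "crunching" them back) is omitted:

```python
import numpy as np

def fir_fitness(coeffs, ideal, n_points=128):
    """Mean squared error between the magnitude response of an FIR
    filter with impulse response `coeffs` and an `ideal` response
    sampled on a uniform grid of `n_points` frequencies in [0, pi]."""
    w = np.linspace(0, np.pi, n_points)
    k = np.arange(len(coeffs))
    # Frequency response H(e^{jw}) = sum_k h[k] * e^{-jwk}
    H = np.abs(np.exp(-1j * np.outer(w, k)) @ coeffs)
    return float(np.mean((H - ideal) ** 2))

# Assumed ideal low-pass response with passband edge at 0.4*pi.
w = np.linspace(0, np.pi, 128)
ideal = (w <= 0.4 * np.pi).astype(float)
h = np.ones(11) / 11.0          # crude moving-average candidate
print(fir_fitness(h, ideal))    # smaller is fitter
```

    In a BB–BC (or any population-based) search, this function would score each candidate coefficient vector, and the best-scoring candidates would steer the next generation.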

  15. Computing seismic damage estimates for buildings within a big city. Bucharest case study.

    Science.gov (United States)

    Toma-Danila, Dragos; Armas, Iuliana

    2016-04-01

    The seismic risk analysis of big cities is a very demanding yet necessary task; the modeling of such complex systems requires first of all insightful input data at good resolution, referring to local effects, buildings and socio-economic aspects. Seismic risk estimation methods with good confidence levels are also needed. Until recently, these requirements were not fulfilled for Bucharest, one of the most earthquake-endangered capital cities in Europe. Based on 2011 and 2002 census data, standardized according to the framework of the Near-real time System for Estimating the Seismic Damage in Romania (SeisDaRo) through a unique approach, and on relevant hazard scenarios, we estimate for the first time the building damage within the city, divided into more than 120 areas. The methodology applied relies on 48 vulnerability curves for buildings, on the Improved Displacement Coefficient Analytical Method included in the SELENA software for computing damage probabilities, and on multiple seismic hazard scenarios, including the maximum possible. In order to compare results with real losses we use a scenario based on the 4 March 1977 Vrancea earthquake (7.4 moment magnitude) that led to 1424 deaths in Bucharest. By using overlay analysis with satellite imagery and a new methodology integrated in GIS we show how results can be enhanced, reflecting even more local characteristics. Best practices for seismic risk mapping are also presented. Results are promising and contribute to the mitigation efforts in Bucharest.
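    The damage-probability step in methodologies of this kind is commonly computed from building vulnerability (fragility) curves modeled as lognormal CDFs of a demand measure. A minimal sketch follows; the median and dispersion values are illustrative placeholders, not SeisDaRo or SELENA parameters:

```python
from math import erf, log, sqrt

def damage_probability(demand, median, beta):
    """P(damage state is reached or exceeded | demand), using a lognormal
    fragility curve: Phi(ln(demand/median)/beta), where Phi is the
    standard normal CDF, `median` the demand at 50% probability, and
    `beta` the lognormal dispersion."""
    z = log(demand / median) / beta
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# Illustrative curve: median spectral displacement 5 cm, dispersion 0.6.
for sd in (2.0, 5.0, 10.0):
    print(sd, round(damage_probability(sd, median=5.0, beta=0.6), 3))
```

    At the median demand the probability is exactly 0.5 by construction; summing such probabilities over a building inventory, weighted by building counts per area, yields city-wide damage estimates of the sort reported above.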

  16. BDGS: A Scalable Big Data Generator Suite in Big Data Benchmarking

    OpenAIRE

    Ming, Zijian; Luo, Chunjie; Gao, Wanling; Han, Rui; Yang, Qiang; Wang, Lei; Zhan, Jianfeng

    2014-01-01

    Data generation is a key issue in big data benchmarking that aims to generate application-specific data sets to meet the 4V requirements of big data. Specifically, big data generators need to generate scalable data (Volume) of different types (Variety) under controllable generation rates (Velocity) while keeping the important characteristics of raw data (Veracity). This gives rise to various new challenges about how we design generators efficiently and successfully. To date, most existing tec...

  17. HOW BIG ARE ’BIG FOUR’ COMPANIES – EVIDENCE FROM ROMANIA

    OpenAIRE

    SORIN ROMULUS BERINDE

    2013-01-01

    The audit market is divided between two main categories of auditors: Big Four auditors and non-Big Four auditors. The generally accepted opinion is that the former cover most audit services. The objective of the study is to quantify the share covered by Big Four auditors at the level of the Romanian market. To this end, data were collected and processed from the audited companies in the North-West Region of Romania, which is considered representative for extrapolating the results at nat...

  18. BigDataBench: a Big Data Benchmark Suite from Web Search Engines

    OpenAIRE

    Gao, Wanling; Zhu, Yuqing; Jia, Zhen; Luo, Chunjie; Wang, Lei; Li, Zhiguo; Zhan, Jianfeng; Qi, Yong; He, Yongqiang; Gong, Shiming; Li, Xiaona; Zhang, Shujie; Qiu, Bizhu

    2013-01-01

    This paper presents our joint research efforts on big data benchmarking with several industrial partners. Considering the complexity, diversity, workload churns, and rapid evolution of big data systems, we take an incremental approach in big data benchmarking. For the first step, we pay attention to search engines, which are the most important domain in Internet services in terms of the number of page views and daily visitors. However, search engine service providers treat data, applications,...

  19. 6 Top Tools for Taming Big Data

    Institute of Scientific and Technical Information of China (English)

    Jakob Björklund

    2012-01-01

    The industry now has a buzzword, "big data," for how we're going to do something with the huge amount of information piling up. "Big data" is replacing "business intelligence," which subsumed "reporting," which put a nicer gloss on "spreadsheets," which beat out the old-fashioned "printouts." Managers who long ago studied printouts are now hiring mathematicians who claim to be big data specialists to help them solve the same old problem: What's selling and why?

  20. 'Big bang' of quantum universe

    International Nuclear Information System (INIS)

    The reparametrization-invariant generating functional for the unitary and causal perturbation theory in general relativity in a finite space-time is obtained. The classical cosmology of a Universe and the Faddeev-Popov-DeWitt functional correspond to different orders of decomposition of this functional over the inverse 'mass' of a Universe. It is shown that the invariant content of general relativity as a constrained system can be covered by two 'equivalent' unconstrained systems: the 'dynamic' (with 'dynamic' time as the cosmic scale factor and conformal field variables) and 'geometric' (given by the Levi-Civita type canonical transformation to the action-angle variables which determine initial cosmological states with the arrow of the proper time measured by the watch of an observer in the comoving frame). 'Big Bang', the Hubble evolution, and creation of 'dynamic' particles by the 'geometric' vacuum are determined by 'relations' between the dynamic and geometric systems as pure relativistic phenomena, like the Lorentz-type 'relation' between the rest and comoving frames in special relativity

  1. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    Energy Technology Data Exchange (ETDEWEB)

    Susan M. Capalbo

    2004-06-01

    The Big Sky Partnership, led by Montana State University, is comprised of research institutions, public entities and private sectors organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts during the second performance period fall into four areas: evaluation of sources and carbon sequestration sinks; development of GIS-based reporting framework; designing an integrated suite of monitoring, measuring, and verification technologies; and initiating a comprehensive education and outreach program. At the first two Partnership meetings the groundwork was put in place to provide an assessment of capture and storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks), that would complement the ongoing DOE research. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other western DOE partnerships. Efforts are also being made to find funding to include Wyoming in the coverage areas for both geological and terrestrial sinks and sources. The Partnership recognizes the critical importance of measurement, monitoring, and verification technologies to support not only carbon trading but all policies and programs that DOE and other agencies may want to pursue in support of GHG mitigation. The efforts begun in developing and implementing MMV technologies for geological sequestration reflect this concern. Research is also underway to identify and validate best management practices for

  2. "Big Science" exhibition at Balexert

    CERN Multimedia

    2008-01-01

    CERN is going out to meet those members of the general public who were unable to attend the recent Open Day. The Laboratory will be taking its "Big Science" exhibition from the Globe of Science and Innovation to the Balexert shopping centre from 19 to 31 May 2008. The exhibition, which shows the LHC and its experiments through the eyes of a photographer, features around thirty spectacular photographs measuring 4.5 metres high and 2.5 metres wide. Welcomed and guided around the exhibition by CERN volunteers, shoppers at Balexert will also have the opportunity to discover LHC components on display and watch films. "Fun with Physics" workshops will be held at certain times of the day. Main hall of the Balexert shopping centre, ground floor, from 9.00 a.m. to 7.00 p.m. Monday to Friday and from 10 a.m. to 6 p.m. on the two Saturdays. Call for volunteers All members of the CERN personnel are invited to enrol as volunteers to help welcom...

  3. The NOAA Big Data Project

    Science.gov (United States)

    de la Beaujardiere, J.

    2015-12-01

    The US National Oceanic and Atmospheric Administration (NOAA) is a Big Data producer, generating tens of terabytes per day from hundreds of sensors on satellites, radars, aircraft, ships, and buoys, and from numerical models. These data are of critical importance and value for NOAA's mission to understand and predict changes in climate, weather, oceans, and coasts. In order to facilitate extracting additional value from this information, NOAA has established Cooperative Research and Development Agreements (CRADAs) with five Infrastructure-as-a-Service (IaaS) providers — Amazon, Google, IBM, Microsoft, Open Cloud Consortium — to determine whether hosting NOAA data in publicly-accessible Clouds alongside on-demand computational capability stimulates the creation of new value-added products and services and lines of business based on the data, and if the revenue generated by these new applications can support the costs of data transmission and hosting. Each IaaS provider is the anchor of a "Data Alliance" which organizations or entrepreneurs can join to develop and test new business or research avenues. This presentation will report on progress and lessons learned during the first 6 months of the 3-year CRADAs.

  4. Big-bang nucleosynthesis revisited

    Science.gov (United States)

    Olive, Keith A.; Schramm, David N.; Steigman, Gary; Walker, Terry P.

    1989-01-01

    The homogeneous big-bang nucleosynthesis yields of D, He-3, He-4, and Li-7 are computed taking into account recent measurements of the neutron mean-life as well as updates of several nuclear reaction rates which primarily affect the production of Li-7. The extraction of primordial abundances from observation and the likelihood that the primordial mass fraction of He-4, Y_p, is ≤ 0.24 are discussed. Using the primordial abundances of D + He-3 and Li-7 we limit the baryon-to-photon ratio (η in units of 10^-10) to 2.6 ≤ η_10 ≤ 4.3, which we use to argue that baryons contribute between 0.02 and 0.11 to the critical energy density of the universe. An upper limit to Y_p of 0.24 constrains the number of light neutrinos to N_ν ≤ 3.4, in excellent agreement with the LEP and SLC collider results. We turn this argument around to show that the collider limit of 3 neutrino species can be used to bound the primordial abundance of He-4: 0.235 ≤ Y_p ≤ 0.245.

  5. Astronomical Surveys and Big Data

    CERN Document Server

    Mickaelian, A M

    2015-01-01

    Recent all-sky and large-area astronomical surveys and their catalogued data over the whole range of electromagnetic spectrum are reviewed, from Gamma-ray to radio, such as Fermi-GLAST and INTEGRAL in Gamma-ray, ROSAT, XMM and Chandra in X-ray, GALEX in UV, SDSS and several POSS I and II based catalogues (APM, MAPS, USNO, GSC) in optical range, 2MASS in NIR, WISE and AKARI IRC in MIR, IRAS and AKARI FIS in FIR, NVSS and FIRST in radio and many others, as well as most important surveys giving optical images (DSS I and II, SDSS, etc.), proper motions (Tycho, USNO, Gaia), variability (GCVS, NSVS, ASAS, Catalina, Pan-STARRS) and spectroscopic data (FBS, SBS, Case, HQS, HES, SDSS, CALIFA, GAMA). An overall understanding of the coverage along the whole wavelength range and comparisons between various surveys are given: galaxy redshift surveys, QSO/AGN, radio, Galactic structure, and Dark Energy surveys. Astronomy has entered the Big Data era. Astrophysical Virtual Observatories and Computational Astrophysics play a...

  6. Deuterium and big bang nucleosynthesis

    International Nuclear Information System (INIS)

    Measurements of deuterium absorption in high redshift quasar absorption systems provide a direct inference of the deuterium abundance produced by big bang nucleosynthesis (BBN). With measurements and limits from five independent absorption systems, we place strong constraints on the primordial ratio of deuterium to hydrogen, (D/H)_p = (3.4 ± 0.3) × 10^-5 [1,2]. We employ a direct numerical treatment to improve the estimates of critical reaction rates and reduce the uncertainties in BBN predictions of D/H and ^7Li/H by a factor of three [3] over previous efforts [4]. Using our measurements of (D/H)_p and new BBN predictions, we find at 95% confidence the baryon density ρ_b = (3.6 ± 0.4) × 10^-31 g cm^-3 (Ω_b h_65^2 = 0.045 ± 0.006 in units of the critical density), and cosmological baryon-photon ratio η = (5.1 ± 0.6) × 10^-10
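As a quick sanity check on the quoted numbers (not part of the record itself), the baryon density follows from the baryon-to-photon ratio via ρ_b ≈ η · n_γ · m_p, where n_γ ≈ 411 photons cm^-3 is the present-day CMB photon number density. The constants below are standard round values, not taken from the paper:

```python
# Hedged back-of-the-envelope check: rho_b ~ eta * n_gamma * m_p
ETA = 5.1e-10          # baryon-to-photon ratio quoted in the abstract
N_GAMMA = 411.0        # CMB photon number density in cm^-3 (T = 2.725 K)
M_P = 1.6726e-24       # proton mass in grams

rho_b = ETA * N_GAMMA * M_P   # baryon mass density, g cm^-3
print(f"rho_b ≈ {rho_b:.2e} g cm^-3")
```

The result lands within the abstract's quoted (3.6 ± 0.4) × 10^-31 g cm^-3, confirming the numbers are mutually consistent.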

  7. Tick-Borne Diseases: The Big Two

    Science.gov (United States)

    ... Ticks and Diseases Tick-borne Diseases: The Big Two Past Issues / Spring - Summer 2010 Table of Contents ... muscle pain. The red-spotted rash usually happens 2 to 5 days after the fever begins. Antibiotics ...

  8. ARC Code TI: BigView

    Data.gov (United States)

    National Aeronautics and Space Administration — BigView allows for interactive panning and zooming of images of arbitrary size on desktop PCs running linux. Additionally, it can work in a multi-screen environment...

  9. Heat Waves Pose Big Health Threats

    Science.gov (United States)

    ... https://medlineplus.gov/news/fullstory_159744.html Heat Waves Pose Big Health Threats Kids, elderly among those ... can be inherently dangerous, but the initial heat waves every summer can be particularly perilous to those ...

  10. Scaling big data with Hadoop and Solr

    CERN Document Server

    Karambelkar, Hrishikesh Vijay

    2015-01-01

    This book is aimed at developers, designers, and architects who would like to build big data enterprise search solutions for their customers or organizations. No prior knowledge of Apache Hadoop and Apache Solr/Lucene technologies is required.

  11. Cosmic relics from the big bang

    International Nuclear Information System (INIS)

    A brief introduction to the big bang picture of the early universe is given. Dark matter is discussed; particularly its implications for elementary particle physics. A classification scheme for dark matter relics is given. 21 refs., 11 figs., 1 tab

  12. Argentine physicists will reproduce the Big Bang

    CERN Multimedia

    De Ambrosio, Martin

    2008-01-01

    Two groups of Argentine physicists, from the La Plata and Buenos Aires universities, are working on a series of experiments that will recreate the conditions of the big explosion at the origin of the universe. (1 page)

  13. Soft computing in big data processing

    CERN Document Server

    Park, Seung-Jong; Lee, Jee-Hyong

    2014-01-01

    Big data is an essential key to building a smart world, covering the streaming, continuous integration of large-volume, high-velocity data from all sources to final destinations. Big data ranges over data mining, data analysis and decision making, drawing statistical rules and mathematical patterns through systematic or automatic reasoning. Big data helps serve our life better, clarify our future and deliver greater value. We can discover how to capture and analyze data. Readers will be guided through processing system integrity and implementing intelligent systems. With intelligent systems, we deal with the fundamental data management and visualization challenges in effective management of dynamic and large-scale data, and efficient processing of real-time and spatio-temporal data. Advanced intelligent systems have led to managing data monitoring, data processing and decision-making in a realistic and effective way. Considering a big size of data, variety of data and frequent chan...

  14. Big Fish and Prized Trees Gain Protection

    Institute of Scientific and Technical Information of China (English)

    Fred Pearce; 吴敏

    2004-01-01

    Decisions made at a key conservation meeting are good news for big and quirky fish and commercially prized trees. Several species will enjoy extra protection against trade following rulings made at the Convention on International Trade in Endangered Species (CITES).

  15. Hunting Plan : Big Stone National Wildlife Refuge

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — The Big Stone National Wildlife Refuge Hunting Plan provides guidance for the management of hunting on the refuge. Hunting program objectives include providing a...

  16. Conjecture on Avoidance of Big Crunch

    Institute of Scientific and Technical Information of China (English)

    SUN Cheng-Yi; ZHANG De-Hai

    2006-01-01

    By conjecturing the physics at the Planck scale, we modify the definition of the Hawking temperature and modify the Friedmann equation. It is found that we can avoid the singularity of the big crunch and obtain a bouncing cosmological model.

  17. 76 FR 7837 - Big Rivers Electric Corporation; Notice of Filing

    Science.gov (United States)

    2011-02-11

    ... Energy Regulatory Commission Big Rivers Electric Corporation; Notice of Filing Take notice that on February 4, 2011, Big Rivers Electric Corporation (Big Rivers) filed a notice of cancellation of its Second Revised and Restated Open Access Transmission Tariff. Big Rivers also requests waiver of the...

  18. From data quality to big data quality

    OpenAIRE

    Batini, C; Rula, A; Scannapieco, M; Viscusi, G

    2015-01-01

    This article investigates the evolution of data quality issues from traditional structured data managed in relational databases to Big Data. In particular, the paper examines the nature of the relationship between Data Quality and several research coordinates that are relevant in Big Data, such as the variety of data types, data sources and application domains, focusing on maps, semi-structured texts, linked open data, sensor & sensor networks and official statistics. Consequently a set of str...

  19. Adapting bioinformatics curricula for big data

    OpenAIRE

    Greene, Anna C.; Giffin, Kristine A.; Greene, Casey S; Jason H Moore

    2015-01-01

    Modern technologies are capable of generating enormous amounts of data that measure complex biological systems. Computational biologists and bioinformatics scientists are increasingly being asked to use these data to reveal key systems-level properties. We review the extent to which curricula are changing in the era of big data. We identify key competencies that scientists dealing with big data are expected to possess across fields, and we use this information to propose courses to meet these...

  20. Mining Big Data to Predicting Future

    OpenAIRE

    Tyagi, Amit K.; Priya, R.

    2015-01-01

    Due to technological advances, vast data sets (e.g. big data) are growing nowadays. "Big Data", a new term, is used to identify these collected datasets. But due to their large size and complexity, we cannot manage them with our current methodologies or data mining software tools. Such datasets provide us with unparalleled opportunities for modelling and predicting the future, along with new challenges. So as an awareness of this and weaknesses as well as the possibilit...

  1. Scientific Big Data Analytics by HPC

    OpenAIRE

    Lippert, Thomas; Mallmann, Daniel; Riedel, Morris

    2016-01-01

    Storing, managing, sharing, curating and especially analysing huge amounts of data face an immense visibility and importance in industry and economy as well as in science and research. Industry and economy exploit "Big Data" for predictive analysis, to increase the efficiency of infrastructures, customer segmentation, and tailored services. In science, Big Data allows for addressing problems with complexities that were impossible to deal with so far. The amounts of data are growing exponentially i...

  2. Dark energy, wormholes, and the Big Rip

    OpenAIRE

    Faraoni, Valerio; Israel, Werner

    2005-01-01

    The time evolution of a wormhole in a Friedmann universe approaching the Big Rip is studied. The wormhole is modeled by a thin spherical shell accreting the superquintessence fluid - two different models are presented. Contrary to recent claims that the wormhole overtakes the expansion of the universe and engulfs it before the Big Rip is reached, it is found that the wormhole becomes asymptotically comoving with the cosmic fluid and the future evolution of the universe is fully causal.

  3. COBE looks back to the Big Bang

    Science.gov (United States)

    Mather, John C.

    1993-01-01

    An overview is presented of NASA-Goddard's Cosmic Background Explorer (COBE), the first NASA satellite designed to observe the primeval explosion of the universe. The spacecraft carries three extremely sensitive IR and microwave instruments designed to measure the faint residual radiation from the Big Bang and to search for the formation of the first galaxies. COBE's far IR absolute spectrophotometer has shown that the Big Bang radiation has a blackbody spectrum, proving that there was no large energy release after the explosion.

  4. Cincinnati Big Area Additive Manufacturing (BAAM)

    Energy Technology Data Exchange (ETDEWEB)

    Duty, Chad E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Love, Lonnie J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-03-04

    Oak Ridge National Laboratory (ORNL) worked with Cincinnati Incorporated (CI) to demonstrate Big Area Additive Manufacturing which increases the speed of the additive manufacturing (AM) process by over 1000X, increases the size of parts by over 10X and shows a cost reduction of over 100X. ORNL worked with CI to transition the Big Area Additive Manufacturing (BAAM) technology from a proof-of-principle (TRL 2-3) demonstration to a prototype product stage (TRL 7-8).

  5. The Death of the Big Men

    DEFF Research Database (Denmark)

    Martin, Keir

    2010-01-01

    Recently Tolai people of Papua New Guinea have adopted the term 'Big Shot' to describe an emerging post-colonial political elite. The emergence of the term is a negative moral evaluation of new social possibilities that have arisen as a consequence of the Big Shots' privileged position within a...... example of a global process in which key lexical categories contest, trace and shape how global historical change is experienced and constituted through linguistic categories.

  6. Data Confidentiality Challenges in Big Data Applications

    Energy Technology Data Exchange (ETDEWEB)

    Yin, Jian; Zhao, Dongfang

    2015-12-15

    In this paper, we address the problem of data confidentiality in big data analytics. In many fields, many useful patterns can be extracted by applying machine learning techniques to big data. However, data confidentiality must be protected. In many scenarios, data confidentiality could well be a prerequisite for data to be shared. We present a scheme to provide provably secure data confidentiality and discuss various techniques to optimize performance of such a system.
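The record does not describe the scheme itself. As one illustrative (hypothetical, not the paper's) approach to computing on confidential data, additive secret sharing splits each value into random-looking shares that can be aggregated without exposing individual records:

```python
import random

MODULUS = 2**61 - 1  # large Mersenne prime; all arithmetic is modulo this value

def share(value: int) -> tuple[int, int]:
    """Split a value into two additive shares; either share alone reveals nothing."""
    r = random.randrange(MODULUS)
    return r, (value - r) % MODULUS

def reconstruct(s1: int, s2: int) -> int:
    """Recombine the two shares into the original value."""
    return (s1 + s2) % MODULUS

# Shares of different values can be summed component-wise, so an aggregate
# (e.g. a sum feeding a learning algorithm) is computable without any single
# party seeing individual records.
values = [42, 17, 99]
shares = [share(v) for v in values]
agg1 = sum(s[0] for s in shares) % MODULUS
agg2 = sum(s[1] for s in shares) % MODULUS
print(reconstruct(agg1, agg2))  # 158, i.e. sum(values)
```

Real systems combine such sharing with secure multi-party protocols or homomorphic encryption; this sketch only shows why masked data can still support aggregate analytics.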

  7. Main Issues in Big Data Security

    Directory of Open Access Journals (Sweden)

    Julio Moreno

    2016-09-01

    Full Text Available Data is currently one of the most important assets for companies in every field. The continuous growth in the importance and volume of data has created a new problem: it cannot be handled by traditional analysis techniques. This problem was, therefore, solved through the creation of a new paradigm: Big Data. However, Big Data originated new issues related not only to the volume or the variety of the data, but also to data security and privacy. In order to obtain a full perspective of the problem, we decided to carry out an investigation with the objective of highlighting the main issues regarding Big Data security, and also the solutions proposed by the scientific community to solve them. In this paper, we explain the results obtained after applying a systematic mapping study to security in the Big Data ecosystem. It is almost impossible to carry out detailed research into the entire topic of security, and the outcome of this research is, therefore, a big picture of the main problems related to security in a Big Data system, along with the principal solutions to them proposed by the research community.

  8. Big data: survey, technologies, opportunities, and challenges.

    Science.gov (United States)

    Khan, Nawsher; Yaqoob, Ibrar; Hashem, Ibrahim Abaker Targio; Inayat, Zakira; Ali, Waleed Kamaleldin Mahmoud; Alam, Muhammad; Shiraz, Muhammad; Gani, Abdullah

    2014-01-01

    Big Data has gained much attention from the academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy stage, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in Big Data domination. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data. PMID:25136682

  9. Big Data: Survey, Technologies, Opportunities, and Challenges

    Directory of Open Access Journals (Sweden)

    Nawsher Khan

    2014-01-01

    Full Text Available Big Data has gained much attention from the academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy stage, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in Big Data domination. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data.

  10. Soil biogeochemistry in the age of big data

    Science.gov (United States)

    Cécillon, Lauric; Barré, Pierre; Coissac, Eric; Plante, Alain; Rasse, Daniel

    2015-04-01

    Data is becoming one of the key resources of the 21st century. Soil biogeochemistry is not spared by this new movement. The conservation of soils and their services recently came onto the political agenda. However, clear knowledge on the links between soil characteristics and the various processes ensuring the provision of soil services is rare at the molecular or the plot scale, and does not exist at the landscape scale. This split between society's expectations on its natural capital, and scientific knowledge on the most complex material on earth, has led to an increasing number of studies on soils, using an increasing number of techniques of increasing complexity, with increasing spatial and temporal coverage. From data scarcity with a basic data management system, soil biogeochemistry is now facing a proliferation of data, with few quality controls from data collection to publication and few skills to deal with them. Based on this observation, here we (1) address how big data could help in making sense of all these soil biogeochemical data, and (2) point out several shortcomings of big data that most biogeochemists will experience in their future careers. Massive storage of data is now common, and recent opportunities for cloud storage enable data sharing among researchers all over the world. The need for integrative and collaborative computational databases in soil biogeochemistry is emerging through pioneering initiatives in this direction (molTERdb; earthcube), following soil microbiologists (GenBank). We expect that a series of data storage and management systems will rapidly revolutionize the way of accessing raw biogeochemical data, published or not. Data mining techniques combined with cluster or cloud computing hold significant promise for facilitating the use of complex analytical methods, and for revealing new insights previously hidden in complex data on soil mineralogy, organic matter and biodiversity. Indeed, important scientific advances have

  11. Initial conditions and the structure of the singularity in pre-big-bang cosmology

    NARCIS (Netherlands)

    A. Feinstein; K.E. Kunze; M.A. Vazquez-Mozo

    2000-01-01

    We propose a picture, within the pre-big-bang approach, in which the universe emerges from a bath of plane gravitational and dilatonic waves. The waves interact gravitationally breaking the exact plane symmetry and lead generically to gravitational collapse resulting in a singularity with the Kasner

  12. Bid to recreate the Big Bang and unlock the secrets of life hits a

    CERN Multimedia

    Morgan, James

    2007-01-01

    "It was not the kind of "big band" they were hopint for - but the explosion at the new £6.81 bn particle accelerator in Switzerland on Saturday, was "not a major setback", says a British scientist who is leading the project." (1 page)

  13. Occurrence of an iterative exponential function in cosmology without the big bang singularity

    International Nuclear Information System (INIS)

    Application of the 5-dimensional Projective Unified Field Theory of the author to a homogeneous isotropic and spherical-symmetric cosmological model leads to a regular solution of the field equations. On the way to this non-big-bang model iterative exponential functions occur, having never been met in this field of research. (authors)

  14. Analyzing Big Data with the Hybrid Interval Regression Methods

    OpenAIRE

    Chia-Hui Huang; Keng-Chieh Yang; Han-Ying Kao

    2014-01-01

    Big data is a new trend at present, forcing significant impacts on information technologies. In big data applications, one of the most pressing issues is dealing with large-scale data sets that often require computation resources provided by public cloud services. How to analyze big data efficiently becomes a big challenge. In this paper, we combine interval regression with the smooth support vector machine (SSVM) to analyze big data. Recently, the smooth support vector machine (SSVM...
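For readers unfamiliar with the SSVM mentioned in this record: its key device is replacing the non-differentiable plus function max(0, x) in the SVM objective with the smooth approximation p(x, α) = x + (1/α)·ln(1 + e^(−αx)), which converges to max(0, x) as α grows and makes Newton-type solvers applicable. A minimal sketch (illustrative, not the paper's code):

```python
import math

def smooth_plus(x: float, alpha: float = 5.0) -> float:
    """Smooth approximation of max(0, x) used by the smooth SVM (SSVM).

    Numerically stable rewriting of x + log(1 + exp(-alpha*x)) / alpha:
    max(0, x) + log1p(exp(-alpha*|x|)) / alpha avoids overflow for large |x|.
    As alpha grows, the approximation converges uniformly to max(0, x).
    """
    return max(0.0, x) + math.log1p(math.exp(-alpha * abs(x))) / alpha

# The smooth function tracks max(0, x) closely even for moderate alpha:
for x in (-2.0, 0.0, 2.0):
    print(x, max(0.0, x), round(smooth_plus(x, alpha=20.0), 4))
```

Because p is twice differentiable everywhere, the whole SVM objective becomes smooth, which is what lets SSVM-based methods scale to larger data than subgradient approaches.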

  15. Detection of Equipment Faults Before Beam Loss

    CERN Document Server

    Galambos, J

    2016-01-01

    High-power hadron accelerators have strict limits on fractional beam loss. In principle, once a high-quality beam is set up in an acceptable state, beam loss should remain steady. However, in practice, there are many trips in operational machines, owing to excessive beam loss. This paper deals with monitoring equipment health to identify precursor signals that indicate an issue with equipment that will lead to unacceptable beam loss. To this end, a variety of equipment and beam signal measurements are described. In particular, several operational examples from the Spallation Neutron Source (SNS) of deteriorating equipment functionality leading to beam loss are reported.

  16. Boosting Big National Lab Data

    Energy Technology Data Exchange (ETDEWEB)

    Kleese van Dam, Kerstin [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2013-02-21

    Introduction: Big data. Love it or hate it, solving the world’s most intractable problems requires the ability to make sense of huge and complex sets of data and do it quickly. Speeding up the process – from hours to minutes or from weeks to days – is key to our success. One major source of such big data are physical experiments. As many will know, these physical experiments are commonly used to solve challenges in fields such as energy security, manufacturing, medicine, pharmacology, environmental protection and national security. Experiments use different instruments and sensor types to research, for example, the validity of new drugs, the base cause of diseases, more efficient energy sources, new materials for everyday goods, effective methods for environmental cleanup, the optimal ingredient composition for chocolate, or determine how to preserve valuable antiques. This is done by experimentally determining the structure, properties and processes that govern biological systems, chemical processes and materials. The speed and quality at which we can acquire new insights from experiments directly influences the rate of scientific progress, industrial innovation and competitiveness. And gaining new groundbreaking insights, faster, is key to the economic success of our nations. Recent years have seen incredible advances in sensor technologies, from house-sized detector systems in large experiments such as the Large Hadron Collider and the ‘Eye of Gaia’ billion pixel camera detector to high throughput genome sequencing. These developments have led to an exponential increase in data volumes, rates and variety produced by instruments used for experimental work. This increase is coinciding with a need to analyze the experimental results at the time they are collected. This speed is required to optimize the data taking and quality, and also to enable new adaptive experiments, where the sample is manipulated as it is observed, e.g. a substance is injected into a

  17. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    Energy Technology Data Exchange (ETDEWEB)

    Susan M. Capalbo

    2004-10-31

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, is comprised of research institutions, public entities and private sectors organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership fall into four areas: evaluation of sources and carbon sequestration sinks; development of GIS-based reporting framework; designing an integrated suite of monitoring, measuring, and verification technologies; and initiating a comprehensive education and outreach program. At the first two Partnership meetings the groundwork was put in place to provide an assessment of capture and storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks), that would complement the ongoing DOE research. During the third quarter, planning efforts are underway for the next Partnership meeting which will showcase the architecture of the GIS framework and initial results for sources and sinks, discuss the methods and analysis underway for assessing geological and terrestrial sequestration potentials. The meeting will conclude with an ASME workshop. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other western DOE partnerships. Efforts are also being made to find funding to include Wyoming in the coverage areas for both geological and terrestrial sinks and sources. The Partnership recognizes the critical importance of measurement, monitoring, and verification

  18. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    Energy Technology Data Exchange (ETDEWEB)

    Susan M. Capalbo

    2004-06-30

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, comprises research institutions, public entities, private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership fall into four areas: evaluation of sources and carbon sequestration sinks; development of a GIS-based reporting framework; design of an integrated suite of monitoring, measuring, and verification technologies; and initiation of a comprehensive education and outreach program. At the first two Partnership meetings the groundwork was put in place to provide an assessment of capture and storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks), which would complement the ongoing DOE research. During the third quarter, planning efforts were underway for the next Partnership meeting, which will showcase the architecture of the GIS framework and initial results for sources and sinks, and discuss the methods and analyses underway for assessing geological and terrestrial sequestration potentials. The meeting will conclude with an ASME workshop (see attached agenda). The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other western DOE partnerships. Efforts are also being made to find funding to include Wyoming in the coverage areas for both geological and terrestrial sinks and sources. The Partnership recognizes the critical importance of measurement

  19. Big Computing in Astronomy: Perspectives and Challenges

    Science.gov (United States)

    Pankratius, Victor

    2014-06-01

    Hardware progress in recent years has led to astronomical instruments gathering large volumes of data. In radio astronomy, for instance, the current generation of antenna arrays produces data at terabits per second, and forthcoming instruments will expand these rates much further. As instruments increasingly become software-based, astronomers will be more exposed to computer science. This talk therefore outlines key challenges that arise at the intersection of computer science and astronomy and presents perspectives on how both communities can collaborate to overcome them. Major problems are emerging because data rates are growing much faster than storage and transmission capacity, and because humans are cognitively overwhelmed when attempting to opportunistically scan through Big Data. As a consequence, the generation of scientific insight will become more dependent on automation and algorithmic instrument control. Intelligent data reduction will have to be considered across the entire acquisition pipeline. In this context, the presentation will outline the enabling role of machine learning and parallel computing. Bio: Victor Pankratius is a computer scientist who joined MIT Haystack Observatory following his passion for astronomy. He is currently leading efforts to advance astronomy through cutting-edge computer science and parallel computing. Victor is also involved in projects such as ALMA Phasing, to enhance the ALMA Observatory with Very Long Baseline Interferometry capabilities; the Event Horizon Telescope; and the Radio Array of Portable Interferometric Detectors (RAPID), to create an analysis environment using parallel computing in the cloud. He has an extensive track record of research in parallel multicore systems and software engineering, with contributions to auto-tuning, debugging, and empirical experiments studying programmers. Victor has worked with major industry partners such as Intel, Sun Labs, and Oracle. 
He holds

  20. Small government or big government?

    Directory of Open Access Journals (Sweden)

    MATEO SPAHO

    2015-03-01

    Full Text Available Since the beginning of the twentieth century, economists and philosophers have been polarized over the role that government should play in the economy. On one hand, John Maynard Keynes argued, within the framework of a market economy, that the state should intervene in the economy to sustain aggregate demand and employment, without hesitating to run budget deficits and expand public debt. This applies especially in moments when the domestic economy and global economic trends show weak growth or recession, and it implies heavy intervention in the economy, with higher revenue but also high expenditure relative to GDP. On the other side, liberals and neoliberals led by Friedrich Hayek advocated the withdrawal of government from economic activity not just in moments of economic growth but also during crises, believing that the market has self-regulating mechanisms of its own. The government, as a result, would have a smaller dimension, with lower revenue and lower expenditure relative to the country's GDP. We examine the South-Eastern European countries, distinguishing those with a "Big Government" from those with a "Small Government", and analyze their economic performance during the global crisis (2007-2014). In which countries did public debt grow less? Which countries managed to attract more investment, and which preserved the purchasing power of their consumers? We shall see whether, during the economic crisis in Eastern Europe, the "Big Government" model or the liberal "Small Government" one was the more successful.

  1. Big bang nucleosynthesis: Present status

    Science.gov (United States)

    Cyburt, Richard H.; Fields, Brian D.; Olive, Keith A.; Yeh, Tsung-Han

    2016-01-01

    Big bang nucleosynthesis (BBN) describes the production of the lightest nuclides via a dynamic interplay among the four fundamental forces during the first seconds of cosmic time. A brief overview of the essentials of this physics is given, and new calculations are presented of light-element abundances through 6Li and 7Li, with updated nuclear reactions and uncertainties including those in the neutron lifetime. Fits are provided for these results as a function of baryon density and of the number of neutrino flavors Nν. Recent developments in BBN are reviewed, particularly new precision Planck cosmic microwave background (CMB) measurements that now probe the baryon density, helium content, and the effective number of degrees of freedom Neff. These measurements allow for a tight test of BBN and cosmology using CMB data alone. Our likelihood analysis convolves the 2015 Planck data chains with our BBN output and observational data. Adding astronomical measurements of light elements strengthens the power of BBN. A new determination of the primordial helium abundance is included in our likelihood analysis. New D/H observations are now more precise than the corresponding theoretical predictions and are consistent with the standard model and the Planck baryon density. Moreover, D/H now provides a tight measurement of Nν when combined with the CMB baryon density, yielding a 2σ upper limit on Nν. In contrast with D/H and 4He, 7Li predictions continue to disagree with observations, perhaps pointing to new physics. This paper concludes with a look at future directions including key nuclear reactions, astronomical observations, and theoretical issues.

  2. Big Sky Carbon Sequestration Partnership

    Energy Technology Data Exchange (ETDEWEB)

    Susan Capalbo

    2005-12-31

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, comprises research institutions, public entities, private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership in Phase I are organized into four areas: (1) Evaluation of sources and carbon sequestration sinks that will be used to determine the location of pilot demonstrations in Phase II; (2) Development of a GIS-based reporting framework that links with national networks; (3) Design of an integrated suite of monitoring, measuring, and verification technologies, market-based opportunities for carbon management, and an economic/risk assessment framework (referred to below as the Advanced Concepts component of the Phase I efforts); and (4) Initiation of a comprehensive education and outreach program. As a result of the Phase I activities, the groundwork is in place to provide an assessment of storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks), which complements the ongoing DOE research agenda in carbon sequestration. The geology of the Big Sky Carbon Sequestration Partnership region is favorable for the potential sequestration of enormous volumes of CO{sub 2}. The United States Geological Survey (USGS 1995) identified 10 geologic provinces and 111 plays in the region. These provinces and plays include both sedimentary rock types characteristic of oil, gas, and coal production as well as large areas of mafic volcanic rocks. Of the 10 provinces and 111 plays, 1 province and 4 plays are located within Idaho. The remaining 9 provinces and 107 plays are dominated by sedimentary rocks and located in the states of Montana and Wyoming. The potential sequestration capacity of the 9 sedimentary provinces within the region ranges from 25,000 to almost 900,000 million metric tons of CO{sub 2}. 
Overall every sedimentary formation investigated

  3. Big Sky Carbon Sequestration Partnership

    Energy Technology Data Exchange (ETDEWEB)

    Susan M. Capalbo

    2005-11-01

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, comprises research institutions, public entities, private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership in Phase I fall into four areas: evaluation of sources and carbon sequestration sinks that will be used to determine the location of pilot demonstrations in Phase II; development of a GIS-based reporting framework that links with national networks; design of an integrated suite of monitoring, measuring, and verification technologies and assessment frameworks; and initiation of a comprehensive education and outreach program. The groundwork is in place to provide an assessment of storage capabilities for CO2 utilizing the resources found in the Partnership region (both geological and terrestrial sinks), which would complement the ongoing DOE research agenda in carbon sequestration. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other DOE regional partnerships. The Partnership recognizes the critical importance of measurement, monitoring, and verification technologies to support not only carbon trading but all policies and programs that DOE and other agencies may want to pursue in support of GHG mitigation. The efforts in developing and implementing MMV technologies for geological sequestration reflect this concern. Research is also underway to identify and validate best management practices for soil C in the

  4. How Big Should Government Be?

    OpenAIRE

    Feldstein, Martin

    1997-01-01

    The appropriate size and role of government depends on the deadweight burden caused by incremental transfers of funds from the private sector. The magnitude of that burden depends on the increases in tax rates required to raise incremental revenue and on the deadweight loss that results from higher tax rates. Both components depend on the full range of behavioral responses of taxpayers to increases in tax rates. The first part of this paper explains why the official method of revenue estimati...

  5. Occurrence and transport of nitrogen in the Big Sunflower River, northwestern Mississippi, October 2009-June 2011

    Science.gov (United States)

    Barlow, Jeannie R.B.; Coupe, Richard H.

    2014-01-01

    The Big Sunflower River Basin, located within the Yazoo River Basin, is subject to large annual inputs of nitrogen from agriculture, atmospheric deposition, and point sources. Understanding how nutrients are transported in, and downstream from, the Big Sunflower River is key to quantifying their eutrophying effects on the Gulf of Mexico. Recent results from two Spatially Referenced Regressions on Watershed attributes (SPARROW) models, which include the Big Sunflower River, indicate minimal losses of nitrogen in stream reaches typical of the main channels of major river systems. If SPARROW assumptions of relatively conservative transport of nitrogen are correct and surface-water losses through the bed of the Big Sunflower River are negligible, then options for managing nutrient loads to the Gulf of Mexico may be limited. Simply put, if every pound of nitrogen entering the Delta is eventually delivered to the Gulf, then the only effective nutrient management option in the Delta is to reduce inputs. If, on the other hand, it can be shown that processes within river channels of the Mississippi Delta act to reduce the mass of nitrogen in transport, other hydrologic approaches may be designed to further limit nitrogen transport. Direct validation of existing SPARROW models for the Delta is a first step in assessing the assumptions underlying those models. In order to characterize spatial and temporal variability of nitrogen in the Big Sunflower River Basin, water samples were collected at four U.S. Geological Survey gaging stations located on the Big Sunflower River between October 1, 2009, and June 30, 2011. Nitrogen concentrations were generally highest at each site during the spring of the 2010 water year and the fall and winter of the 2011 water year. Additionally, the dominant form of nitrogen varied between sites. 
For example, in samples collected from the most upstream site (Clarksdale), the concentration of organic nitrogen was generally higher than the concentrations of

  6. Human Neuroimaging as a “Big Data” Science

    Science.gov (United States)

    Van Horn, John Darrell; Toga, Arthur W.

    2013-01-01

    The maturation of in vivo neuroimaging has led to incredible quantities of digital information about the human brain. While much is made of the data deluge in science, neuroimaging represents the leading edge of this onslaught of “big data”. A range of neuroimaging databasing approaches has streamlined the transmission, storage, and dissemination of data from such brain imaging studies. Yet few, if any, common solutions exist to support the science of neuroimaging. In this article, we discuss how modern neuroimaging research represents a multifactorial and broad-ranging data challenge, involving the growing size of the data being acquired; sociological and logistical sharing issues; infrastructural challenges for multi-site, multi-datatype archiving; and the means by which to explore and mine these data. As neuroimaging advances further, e.g., into aging, genetics, and age-related disease, new vision is needed to manage and process this information while marshalling these resources into novel results. Thus, “big data” can become “big” brain science. PMID:24113873

  7. Autoimmunity in visual loss.

    Science.gov (United States)

    Petzold, Axel; Wong, Sui; Plant, Gordon T

    2016-01-01

    A number of autoimmune disorders can affect visual function. Although a very large number of mechanisms in the visual pathway could potentially be targets of autoimmune attack, in practice it is the retina and the anterior visual pathway (optic nerve and chiasm) that are recognised as being affected in autoimmune disorders. Multiple Sclerosis is one of the commonest causes of visual loss in young adults because of the frequency of attacks of optic neuritis in that condition; however, the basis of the inflammation in Multiple Sclerosis and the confirmation of autoimmunity are lacking. The immune process is known to be highly unusual in that it is not systemic but confined to the CNS compartment. Previously an enigmatic partner to Multiple Sclerosis, Neuromyelitis Optica is now established to be autoimmune, and two antibodies - to Aquaporin-4 and to Myelin Oligodendrocyte Glycoprotein - have been implicated in the pathogenesis. The term Chronic Relapsing Inflammatory Optic Neuropathy is applied to those cases of optic neuritis which require long-term immunosuppression and hence are presumed to be autoimmune, but where no autoimmune pathogenesis has been confirmed. Optic neuritis occurring post-infection or post-vaccination, and conditions such as Systemic Lupus Erythematosus and various vasculitides, may cause direct autoimmune attack on visual structures or indirect damage through occlusive vasculopathy. Chronic granulomatous disorders such as Sarcoidosis commonly affect vision through a variety of mechanisms; whether and how these fit into the autoimmune panoply is unknown. As far as the retina is concerned, Cancer Associated Retinopathy and Melanoma Associated Retinopathy are well characterised clinically, but a candidate autoantibody (recoverin) has been described only in the former disorder. Other, usually monophasic, focal retinal inflammatory disorders (Idiopathic Big Blind Spot Syndrome, Acute Zonal Occult Outer Retinopathy and Acute Macular

  8. On Subtitle Translation of Sitcoms-A Case Study of The Big Bang Theory

    Institute of Scientific and Technical Information of China (English)

    杨雯婷

    2013-01-01

    As we all know, exquisite subtitle translation of foreign films and television series is a vital element in their spread among Chinese audiences. This article draws on Eugene Nida's "Functional Equivalence" principle, together with three characteristics of sitcom subtitles, to study the type, form, and features of The Big Bang Theory, leading to conclusions about the characteristics of sitcom subtitles. This allows its subtitles to be analyzed from six aspects. As a result, the author draws conclusions about translation tactics for The Big Bang Theory that could help the subtitle translation of similar sitcoms.

  9. Extract Five Categories CPIVW from the 9V’s Characteristics of the Big Data

    OpenAIRE

    Suhail Sami Owais; Nada Sael Hussein

    2016-01-01

    There is an exponential growth in the amount of data from different fields around the world, and this is known as Big Data. It demands more data management, analysis, and accessibility. This leads to an increase in the number of systems around the world that manage and manipulate data in different places at any time. Big Data is systematically analysed data that depends on the existence of complex processes, devices, and resources. Data are no longer stored in traditional database st...

  10. Kasner asymptotics of mixmaster Horava-Witten and pre-big-bang cosmologies

    International Nuclear Information System (INIS)

    We discuss various superstring effective actions and, in particular, their common sector, which leads to the so-called pre-big-bang cosmology (cosmology in a weak coupling limit of heterotic superstring theory). Using the conformal relationship between these two theories, we present Kasner asymptotic solutions of Bianchi type IX geometries within these theories and make predictions about the possible emergence of chaos. Finally, we present a possible method of generating Horava-Witten cosmological solutions out of the well-known general relativistic or pre-big-bang solutions

  11. Superhorizon curvaton amplitude in inflation and pre-big bang cosmology

    DEFF Research Database (Denmark)

    Sloth, Martin Snoager

    2002-01-01

    We follow the evolution of the curvaton on superhorizon scales and check that the spectral tilt of the curvaton perturbations is unchanged as the curvaton becomes non-relativistic. Both inflation and pre-big bang cosmology can be treated, since the curvaton mechanism within the two scenarios works the same way. We also discuss the amplitude of the density perturbations, which leads to some interesting constraints on the pre-big bang scenario. It is shown that within a SL(3,R) non-linear sigma model one of the three axions has the right coupling to the dilaton and moduli to yield a flat spectrum...

  12. Five Big, Big Five Issues : Rationale, Content, Structure, Status, and Crosscultural Assessment

    NARCIS (Netherlands)

    De Raad, Boele

    1998-01-01

    This article discusses the rationale, content, structure, status, and crosscultural assessment of the Big Five trait factors, focusing on topics of dispute and misunderstanding. Taxonomic restrictions of the original Big Five forerunner, the "Norman Five," are discussed, and criticisms regarding the

  13. Big Data, Big Changes (大数据,大变革)

    Institute of Scientific and Technical Information of China (English)

    梁爽

    2014-01-01

    Big data is happening all around us; the big data era has arrived. This paper describes the characteristics of big data and analyzes the current state of big data research at home and abroad as well as future directions for its application. Only by understanding big data anew, changing our thinking about it, adapting business models to its changes, innovating big data management, strengthening institutional construction, enhancing legal awareness, and ensuring personal and national security can we continuously promote the healthy development of big data.

  15. Big data analytics a management perspective

    CERN Document Server

    Corea, Francesco

    2016-01-01

    This book is about innovation, big data, and data science seen from a business perspective. Big data is a buzzword nowadays, and there is a growing need among practitioners to understand the phenomenon better, starting from a clearly stated definition. This book aims to be a first read for executives who want (and need) to keep pace with the technological breakthrough introduced by new analytical techniques and piles of data. Common myths about big data are explained, and a series of different strategic approaches is provided. By browsing the book, it will be possible to learn how to implement a big data strategy and how to use a maturity framework to monitor the progress of the data science team, as well as how to move forward from one stage to the next. Crucial challenges related to big data are discussed, some of them general - such as ethics, privacy, and ownership - while others concern more specific business situations (e.g., initial public offering, growth st...

  16. Vision Loss, Sudden

    Science.gov (United States)

    Fragmentary record; recoverable headings: "Spotlight on Aging: Vision Loss in Older People" and a table "Some Causes and Features of Sudden Loss of Vision" (columns: Cause, Common Features, Tests).

  17. Big Data Analytics Platforms analyze from startups to traditional database players

    Directory of Open Access Journals (Sweden)

    Ionut TARANU

    2015-07-01

    Full Text Available Big data analytics enables organizations to analyze a mix of structured, semi-structured and unstructured data in search of valuable business information and insights. The analytical findings can lead to more effective marketing, new revenue opportunities, better customer service, improved operational efficiency, competitive advantages over rival organizations, and other business benefits. With so many emerging trends around big data and analytics, IT organizations need to create conditions that will allow analysts and data scientists to experiment. "You need a way to evaluate, prototype and eventually integrate some of these technologies into the business," says Chris Curran[1]. In this paper we review the top 10 big data analytics platforms and compare their key features.

  18. Big sized players on the European Union’s financial advisory market

    Directory of Open Access Journals (Sweden)

    Nicolae, C.

    2013-06-01

    Full Text Available The paper presents the activity and the objectives of the "Big Four" group of financial advisory firms. The "Big Four" are the four largest international professional services networks in accountancy and professional services, offering audit, assurance, tax, consulting, advisory, actuarial, corporate finance, and legal services. They handle the vast majority of audits for publicly traded companies as well as many private companies, creating an oligopoly in the auditing of large companies. It is reported that the Big Four audit all but one of the companies that constitute the FTSE 100, and 240 of the companies in the FTSE 250, an index of the leading mid-cap listed companies.

  19. HADOOP+Big Data: Analytics Using Series Queue with Blocking Model

    Directory of Open Access Journals (Sweden)

    S. Koteeswaran

    2014-07-01

    Full Text Available Big data deals with very large volumes of data, which traditional data mining techniques cannot readily manage. Technology is now pervasive and continues its tremendous growth, so coordinating such amounts of data in a linear way is difficult. We therefore propose a new scheme for drawing and transforming data in large databases, and we extend our work in Hadoop (one of the big data management tools). Our model is based entirely on data aggregation and data modelling, and it leads to high-end data transformation for big data processing. We obtained our analytical results by applying the model to two Hadoop clusters with four nodes and 25 MapReduce jobs.
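Record 19's title refers to a series queue with blocking. As a hedged illustration of that queueing idea, and not a reproduction of the authors' actual model, the standard two-stage tandem-queue recursion with blocking-after-service and a finite intermediate buffer can be sketched as follows (the function name and parameters are illustrative assumptions):

```python
def tandem_blocking(arrivals, s1, s2, buffer):
    """Departure times for a two-stage series queue with a finite buffer.

    Tandem-queue recursion with blocking-after-service: stage 1 cannot
    release job i until job i - buffer - 1 has left stage 2, freeing a
    buffer slot.
    """
    n = len(arrivals)
    d1 = [0.0] * n  # departure times from stage 1 (into the buffer)
    d2 = [0.0] * n  # departure times from stage 2 (out of the system)
    for i in range(n):
        # Service completion at stage 1: wait for arrival and the server.
        c1 = max(arrivals[i], d1[i - 1] if i > 0 else 0.0) + s1[i]
        # Blocking: wait until a downstream slot is free.
        slot_free = d2[i - buffer - 1] if i - buffer - 1 >= 0 else 0.0
        d1[i] = max(c1, slot_free)
        # Stage 2 starts when the job arrives and its server is free.
        d2[i] = max(d1[i], d2[i - 1] if i > 0 else 0.0) + s2[i]
    return d2

# Deterministic example: four jobs arriving at time 0, stage-1 service
# time 1.0, stage-2 service time 2.0, one buffer slot between stages.
deps = tandem_blocking([0.0] * 4, [1.0] * 4, [2.0] * 4, buffer=1)
# → [3.0, 5.0, 7.0, 9.0]: stage 2 is the bottleneck, and from job 3
# onward stage 1 is blocked waiting for buffer space.
```

Feeding random (e.g. exponential) service times into the same recursion gives a simple simulation of the throughput loss that blocking introduces.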

  20. Big Bang-Big Crunch Algorithm for Voltage Stability Limit Improvement by Coordinated Control of SVC Settings

    Directory of Open Access Journals (Sweden)

    S. Sakthivel

    2013-07-01

    Full Text Available Modern power system networks are operated under highly stressed conditions, and there is a risk of voltage instability owing to increased load demand. A power system needs a sufficient voltage stability margin for secure operation. In this study, the SVC parameters of location and size, along with generator bus voltages and transformer tap settings, are taken as control parameters for improving the voltage stability limit by minimizing loss and voltage deviation. The control parameters are varied in a coordinated manner for better results. The line-based LQP voltage stability indicator is used for voltage stability assessment. The nature-inspired metaheuristic Big Bang-Big Crunch (BB-BC) algorithm is exploited for optimization of the control variables, and its performance is compared with that of the PSO algorithm. The effectiveness of the proposed algorithm is tested on the standard IEEE 30-bus system under normal and N-1 line-outage contingency conditions. The simulation results confirm the performance of the new algorithm.
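The BB-BC heuristic used in record 20 alternates a random "big bang" scatter with a "big crunch" contraction toward a fitness-weighted centre of mass, shrinking the scatter over iterations. A minimal generic sketch on a toy objective follows; the function name, its parameters, and the sphere objective are illustrative assumptions, not the paper's power-system formulation (which optimizes SVC placement against the LQP index):

```python
import random

def bb_bc_minimize(f, bounds, pop_size=30, iters=100, beta=1.0, seed=0):
    """Minimal Big Bang-Big Crunch (BB-BC) sketch for continuous minimization.

    Big Bang: scatter random candidates. Big Crunch: contract to the
    fitness-weighted centre of mass. Repeat with spread shrinking as 1/k.
    """
    rng = random.Random(seed)
    dim = len(bounds)
    # Initial "big bang": uniform random population inside the bounds.
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    best, best_val = None, float("inf")
    for k in range(1, iters + 1):
        vals = [f(x) for x in pop]
        for x, v in zip(pop, vals):
            if v < best_val:
                best, best_val = list(x), v
        # Big Crunch: centre of mass, weighting lower-cost points more.
        weights = [1.0 / (v + 1e-12) for v in vals]
        total = sum(weights)
        centre = [sum(w * x[d] for w, x in zip(weights, pop)) / total
                  for d in range(dim)]
        # Next Big Bang: normal scatter around the centre, clamped to bounds,
        # with spread decreasing as 1/k.
        pop = [[min(hi, max(lo, centre[d] + beta * (hi - lo) * rng.gauss(0, 1) / k))
                for d, (lo, hi) in enumerate(bounds)]
               for _ in range(pop_size)]
    return best, best_val

# Toy usage: minimise the 2-D sphere function on [-5, 5]^2.
x, v = bb_bc_minimize(lambda p: sum(t * t for t in p), [(-5, 5), (-5, 5)])
```

In the paper's setting the candidate vector would encode SVC location/size, generator voltages, and tap settings, with the objective combining loss and voltage deviation.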