WorldWideScience

Sample records for big losses lead

  1. Big losses lead to irrational decision-making in gambling situations: relationship between deliberation and impulsivity.

    Directory of Open Access Journals (Sweden)

    Yuji Takano

    In gambling situations, we found a paradoxical reinforcing effect of high-risk decision-making after repeated big monetary losses. The computerized version of the Iowa Gambling Task (Bechara et al., 2000), which contained six big loss cards in deck B', was conducted on normal healthy college students. The results indicated that the total number of selections from deck A' and deck B' decreased across trials. However, there was no decrease in selections from deck B'. Detailed analysis of the card selections revealed that some people persisted in selecting from the "risky" deck B' as the number of big losses increased. This tendency was prominent in self-rated deliberative people. However, they were implicitly impulsive, as revealed by the matching familiar figure test. These results suggest that the gap between explicit deliberation and implicit impulsivity drew them into pathological gambling.

  2. Plant-Based Diets Score Big for Healthy Weight Loss

    Science.gov (United States)

    ... page: https://medlineplus.gov/news/fullstory_162874.html Plant-Based Diets Score Big for Healthy Weight Loss ... row, U.S. News & World Report has named the plant-based eating plan as the best choice overall, ...

  3. Blood lead levels and chronic blood loss

    Energy Technology Data Exchange (ETDEWEB)

    Manci, E.A.; Cabaniss, M.L.; Boerth, R.C.; Blackburn, W.R.

    1986-03-01

    Over 90% of lead in blood is bound to the erythrocytes. This high affinity of lead for red cells may mean that chronic blood loss is a significant means for excretion of lead. This study sought correlations between blood lead levels and clinical conditions involving chronic blood loss. During May, June and July, 146 patients with normal hematocrits and red cell indices were identified from the hospital and clinic populations. For each patient, age, race, sex and medical history were noted, and a whole blood sample was analyzed by flameless atomic absorption spectrophotometry. Age- and race-matched pairs showed a significant correlation of chronic blood loss with lead levels. Patients with the longest history of blood loss (menstruating women) had the lowest level (mean 6.13 µg/dl, range 3.6-10.3 µg/dl). Post-menopausal women had levels (7.29 µg/dl, 1.2-14 µg/dl) comparable to men with peptic ulcer disease, or colon carcinoma (7.31 µg/dl, 5.3-8.6 µg/dl). The highest levels were among men who had no history of bleeding problems (12.39 µg/dl, 2.08-39.35 µg/dl). Chronic blood loss may be a major factor responsible for sexual differences in blood lead levels. Since tissue deposition of environmental pollutants is implicated in diseases, menstruation may represent a survival advantage for women.

  4. Weight Loss Leads to Strong Increase in Appetite

    Science.gov (United States)

    ... News Releases Media Advisory Friday, October 14, 2016 Weight loss leads to strong increase in appetite Study with ... changes in caloric expenditure that typically accompany weight loss — and weight loss plateau. Findings from the analyses suggest that ...

  5. [Overall digitalization: leading innovation of endodontics in big data era].

    Science.gov (United States)

    Ling, J Q

    2016-04-09

    In the big data era, digital technologies bring great challenges and opportunities to modern stomatology. The applications of digital technologies, such as cone-beam CT (CBCT), computer-aided design (CAD) and computer-aided manufacture (CAM), 3D printing and digital approaches for education, provide new concepts and patterns for the treatment and study of endodontic diseases. This review provides an overview of the application and prospect of commonly used digital technologies in the development of endodontics.

  6. Lead exposure in free-flying turkey vultures is associated with big game hunting in California.

    Directory of Open Access Journals (Sweden)

    Terra R Kelly

    Predatory and scavenging birds are at risk of lead exposure when they feed on animals injured or killed by lead ammunition. While lead ammunition has been banned from waterfowl hunting in North America for almost two decades, lead ammunition is still widely used for hunting big game and small game animals. In this study, we evaluated the association between big game hunting and blood lead concentration in an avian scavenger species that feeds regularly on large mammals in California. We compared blood lead concentration in turkey vultures within and outside of the deer hunting season, and in areas with varying wild pig hunting intensity. Lead exposure in turkey vultures was significantly higher during the deer hunting season compared to the off-season, and blood lead concentration was positively correlated with increasing wild pig hunting intensity. Our results link lead exposure in turkey vultures to deer and wild pig hunting activity at these study sites, and we provide evidence that spent lead ammunition in carrion poses a significant risk of lead exposure to scavengers.

  7. Lead exposure in bald eagles from big game hunting, the continental implications and successful mitigation efforts.

    Science.gov (United States)

    Bedrosian, Bryan; Craighead, Derek; Crandall, Ross

    2012-01-01

    Studies suggest hunter-discarded viscera of big game animals (i.e., offal) is a source of lead available to scavengers. We investigated the incidence of lead exposure in bald eagles in Wyoming during the big game hunting season, the influx of eagles into our study area during the hunt, the geographic origins of eagles exposed to lead, and the efficacy of using non-lead rifle ammunition to reduce lead in eagles. We tested 81 blood samples from bald eagles before, during and after the big game hunting seasons in 2005-2010, excluding 2008, and found eagles had significantly higher lead levels during the hunt. We found 24% of eagles tested had levels indicating at least clinical exposure (>60 µg/dL) during the hunt, while no birds did during the non-hunting seasons. We performed driving surveys in 2009-2010 to measure eagle abundance and found evidence to suggest that eagles are attracted to the study area during the hunt. We fitted 10 eagles captured during the hunt with satellite transmitters, and all migrated south after the cessation of the hunt. One returned to our study area while the remaining nine traveled north to summer/breed in Canada. The following fall, 80% returned to our study area for the hunting season, indicating that offal provides a seasonal attractant for eagles. We fitted three local breeding eagles with satellite transmitters and none left their breeding territories to feed on offal during the hunt, indicating that lead ingestion may be affecting migrants to a greater degree. During the 2009 and 2010 hunting seasons we provided non-lead rifle ammunition to local hunters and recorded that 24% and 31% of successful hunters used non-lead ammunition, respectively. We found the use of non-lead ammunition significantly reduced lead exposure in eagles, suggesting this is a viable solution to reduce lead exposure in eagles.

  8. "small problems, Big Trouble": An Art and Science Collaborative Exhibition Reflecting Seemingly small problems Leading to Big Threats

    Science.gov (United States)

    Waller, J. L.; Brey, J. A.

    2014-12-01

    disasters continues to inspire new chapters in their "Layers: Places in Peril" exhibit! A slide show includes images of paintings for "small problems, Big Trouble". Brey and Waller will lead a discussion on their process of incorporating broader collaboration with geoscientists and others in an educational art exhibition.

  9. Dielectric relaxation losses in lead chloride and lead bromide: Localized dipoles

    NARCIS (Netherlands)

    Brom, W.E. van den; Volger, J.

    1974-01-01

    A further analysis of previous reported measurements of dielectric relaxation losses in lead chloride and lead bromide crystals shows that the dipoles may occupy several energetically different positions, giving rise to localization of the dipoles and anomalous behaviour of the susceptibility. This

  10. Costs of IQ Loss from Leaded Aviation Gasoline Emissions.

    Science.gov (United States)

    Wolfe, Philip J; Giang, Amanda; Ashok, Akshay; Selin, Noelle E; Barrett, Steven R H

    2016-09-06

    In the United States, general aviation piston-driven aircraft are now the largest source of lead emitted to the atmosphere. Elevated lead concentrations impair children's IQ and can lead to lower earnings potentials. This study is the first assessment of the nationwide annual costs of IQ losses from aircraft lead emissions. We develop a general aviation emissions inventory for the continental United States and model its impact on atmospheric concentrations using the community multi-scale air quality model (CMAQ). We use these concentrations to quantify the impacts of annual aviation lead emissions on the U.S. population using two methods: through static estimates of cohort-wide IQ deficits and through dynamic economy-wide effects using a computational general equilibrium model. We also examine the sensitivity of these damage estimates to different background lead concentrations, showing the impact of lead controls and regulations on marginal costs. We find that aircraft-attributable lead contributes to $1.06 billion 2006 USD ($0.01-$11.6) in annual damages from lifetime earnings reductions, and that dynamic economy-wide methods result in damage estimates that are 54% larger. Because the marginal costs of lead are dependent on background concentration, the costs of piston-driven aircraft lead emissions are expected to increase over time as regulations on other emissions sources are tightened.

  11. Modeling of Blood Lead Levels in Astronauts Exposed to Lead from Microgravity-Accelerated Bone Loss

    Science.gov (United States)

    Garcia, H.; James, J.; Tsuji, J.

    2014-01-01

    Human exposure to lead has been associated with toxicity to multiple organ systems. Studies of various population groups with relatively low blood lead concentrations have reported adverse effects in adults. Cognitive effects are considered by regulatory agencies to be the most sensitive endpoint at low doses. Although 95% of the body burden of lead is stored in the bones, the adverse effects of lead correlate with the concentration of lead in the blood better than with that in the bones. NASA has found that prolonged exposure to microgravity during spaceflight results in a significant loss of bone minerals, the extent of which varies from individual to individual and from bone to bone, but generally averages about 0.5% per month. During such bone loss, lead that had been stored in bones would be released along with calcium. The effects on the concentration of lead in the blood (PbB) of various concentrations of lead in drinking water (PbW) and of lead released from bones due to accelerated osteoporosis in microgravity, as well as changes in exposure to environmental lead before, during, and after spaceflight, were evaluated using a physiologically based pharmacokinetic (PBPK) model that incorporated exposure to environmental lead both on earth and in flight and included temporarily increased rates of osteoporosis during spaceflight.
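
    The record above describes a physiologically based pharmacokinetic (PBPK) treatment of bone-to-blood lead transfer during spaceflight. The sketch below is not that model; it is a deliberately minimal two-compartment (bone/blood) mass balance in Python that only illustrates the qualitative mechanism in the abstract: mobilising roughly 0.5% of the bone lead pool per month during flight raises blood lead above its pre-flight steady state. All parameter values (bone lead store, intake, clearance, blood volume) are illustrative assumptions.

```python
import numpy as np

def toy_blood_lead(months_preflight=12, months_flight=6, months_postflight=12,
                   bone_pb_ug=20000.0,           # assumed total bone lead store (ug)
                   flight_release_frac=0.005,    # ~0.5% of bone pool mobilised per month in flight
                   ground_release_frac=0.0005,   # assumed slow bone turnover on Earth
                   intake_ug_per_month=30.0,     # assumed absorbed environmental intake (ug/month)
                   blood_clear_frac=0.5,         # assumed fraction of blood Pb cleared per month
                   blood_volume_dl=50.0):
    """Monthly mass balance for a toy bone/blood lead model; returns blood Pb in ug/dL."""
    blood_pb_ug = 5.0 * blood_volume_dl           # arbitrary starting blood lead (~5 ug/dL)
    history = []
    total_months = months_preflight + months_flight + months_postflight
    for month in range(total_months):
        in_flight = months_preflight <= month < months_preflight + months_flight
        release_frac = flight_release_frac if in_flight else ground_release_frac
        released = bone_pb_ug * release_frac      # Pb mobilised from bone this month
        bone_pb_ug -= released
        blood_pb_ug += released + intake_ug_per_month
        blood_pb_ug *= (1.0 - blood_clear_frac)   # first-order clearance from blood
        history.append(blood_pb_ug / blood_volume_dl)
    return np.array(history)

if __name__ == "__main__":
    pbb = toy_blood_lead()
    print(f"pre-flight steady state: {pbb[11]:.2f} ug/dL, in-flight peak: {pbb.max():.2f} ug/dL")
```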

  12. Circadian Disruption Leads to Loss of Homeostasis and Disease

    Directory of Open Access Journals (Sweden)

    Carolina Escobar

    2011-01-01

    The relevance of a synchronized temporal order for adaptation and homeostasis is discussed in this review. We present evidence suggesting that an altered temporal order between the biological clock and external temporal signals leads to disease. Evidence mainly based on a rodent model of “night work” using forced activity during the sleep phase suggests that altered activity and feeding schedules, out of phase from the light/dark cycle, may be the main cause for the loss of circadian synchrony and disease. It is proposed that by avoiding food intake during sleep hours the circadian misalignment and adverse consequences can be prevented. This review does not attempt to present a thorough revision of the literature, but instead it aims to highlight the association between circadian disruption and disease with special emphasis on the contribution of feeding schedules in circadian synchrony.

  13. Loss of Nfkb1 leads to early onset aging.

    Science.gov (United States)

    Bernal, Giovanna M; Wahlstrom, Joshua S; Crawley, Clayton D; Cahill, Kirk E; Pytel, Peter; Liang, Hua; Kang, Shijun; Weichselbaum, Ralph R; Yamini, Bakhtiar

    2014-11-01

    NF-κB is a major regulator of age-dependent gene expression and the p50/NF-κB1 subunit is an integral modulator of NF-κB signaling. Here, we examined Nfkb1-/- mice to investigate the relationship between this subunit and aging. Although Nfkb1-/- mice appear similar to littermates at six months of age, by 12 months they have a higher incidence of several observable age-related phenotypes. In addition, aged Nfkb1-/- animals have increased kyphosis, decreased cortical bone, increased brain GFAP staining and a decrease in overall lifespan compared to Nfkb1+/+. In vitro, serially passaged primary Nfkb1-/- MEFs have more senescent cells than comparable Nfkb1+/+ MEFs. Also, Nfkb1-/- MEFs have greater amounts of phospho-H2AX foci and lower levels of spontaneous apoptosis than Nfkb1+/+, findings that are mirrored in the brains of Nfkb1-/- animals compared to Nfkb1+/+. Finally, in wildtype animals a substantial decrease in p50 DNA binding is seen in aged tissue compared to young. Together, these data show that loss of Nfkb1 leads to early animal aging that is associated with reduced apoptosis and increased cellular senescence. Moreover, loss of p50 DNA binding is a prominent feature of aged mice relative to young. These findings support the strong link between the NF-κB pathway and mammalian aging.

  14. Chronic Conductive Hearing Loss Leads to Cochlear Degeneration.

    Directory of Open Access Journals (Sweden)

    M Charles Liberman

    Synapses between cochlear nerve terminals and hair cells are the most vulnerable elements in the inner ear in both noise-induced and age-related hearing loss, and this neuropathy is exacerbated in the absence of efferent feedback from the olivocochlear bundle. If age-related loss is dominated by a lifetime of exposure to environmental sounds, reduction of acoustic drive to the inner ear might improve cochlear preservation throughout life. To test this, we removed the tympanic membrane unilaterally in one group of young adult mice, removed the olivocochlear bundle in another group and compared their cochlear function and innervation to age-matched controls one year later. Results showed that tympanic membrane removal, and the associated threshold elevation, was counterproductive: cochlear efferent innervation was dramatically reduced, especially the lateral olivocochlear terminals to the inner hair cell area, and there was a corresponding reduction in the number of cochlear nerve synapses. This loss led to a decrease in the amplitude of the suprathreshold cochlear neural responses. Similar results were seen in two cases with conductive hearing loss due to chronic otitis media. Outer hair cell death was increased only in ears lacking medial olivocochlear innervation following olivocochlear bundle cuts. Results suggest the novel ideas that (1) the olivocochlear efferent pathway has a dramatic use-dependent plasticity even in the adult ear and (2) a component of the lingering auditory processing disorder seen in humans after persistent middle-ear infections is cochlear in origin.

  15. Delayed Interval Delivery following Early Loss of the Leading Twin

    Directory of Open Access Journals (Sweden)

    P. C. Udealor

    2015-01-01

    This was a case of a nulliparous woman with reduced chance of conception following unilateral salpingectomy and years of infertility. She eventually conceived following ovulation induction, resulting in a twin pregnancy. She had a miscarriage that led to the loss of one of the twins at 17 weeks of gestational age. The pregnancy was however continued for 116 days following meticulous management, with eventual delivery of a live female baby with good outcome.

  16. Association of defects in lead chloride and lead bromide: Ionic conductivity and dielectric loss measurements

    NARCIS (Netherlands)

    Brom, W.E. van den; Schoonman, J.; Wit, J.H.W. de

    1972-01-01

    The ionic conductivity data of pure and doped lead bromide without associated defects are used in order to explain the anomalous conductivity behaviour of copper (I) bromide and lead oxide-doped lead-bromide crystals. In these crystals precipitated dopant and associated defects are present. The asso

  17. Wildfire and forest disease interaction lead to greater loss of soil nutrients and carbon.

    Science.gov (United States)

    Cobb, Richard C; Meentemeyer, Ross K; Rizzo, David M

    2016-09-01

    Fire and forest disease have significant ecological impacts, but the interactions of these two disturbances are rarely studied. We measured soil C, N, Ca, P, and pH in forests of the Big Sur region of California impacted by the exotic pathogen Phytophthora ramorum, cause of sudden oak death, and the 2008 Basin wildfire complex. In Big Sur, overstory tree mortality following P. ramorum invasion has been extensive in redwood and mixed evergreen forests, where the pathogen kills true oaks and tanoak (Notholithocarpus densiflorus). Sampling was conducted across a full-factorial combination of disease/no disease and burned/unburned conditions in both forest types. Forest floor organic matter and associated nutrients were greater in unburned redwood compared to unburned mixed evergreen forests. Post-fire element pools were similar between forest types, but lower in burned-invaded compared to burned-uninvaded plots. We found evidence that disease-generated fuels led to increased loss of forest floor C, N, Ca, and P. The same effects were associated with lower %C and higher PO4-P in the mineral soil. Fire-disease interactions were linear functions of pre-fire host mortality, which was similar between the forest types. Our analysis suggests that these effects increased forest floor C loss by as much as 24.4% and 21.3% in redwood and mixed evergreen forests, respectively, with similar maximum losses for the other forest floor elements. Accumulation of sudden oak death-generated fuels has the potential to increase fire-related loss of soil nutrients at the regional scale of this disease, and similar patterns are likely in other forests where fire and disease overlap.

  18. Lead exposure through consumption of big game meat in Quebec, Canada: risk assessment and perception.

    Science.gov (United States)

    Fachehoun, Richard Coovi; Lévesque, Benoit; Dumas, Pierre; St-Louis, Antoine; Dubé, Marjolaine; Ayotte, Pierre

    2015-01-01

    Game meat from animals killed by lead ammunition may expose consumers to lead. We assessed the risk related to lead intake from meat consumption of white-tailed deer and moose killed by lead ammunition and documented the perception of hunters and butchers regarding this potential contamination. Information on cervid meat consumption and risk perception was collected using a mailed self-administered questionnaire addressed to a random sample of Quebec hunters. In parallel, 72 samples of white-tailed deer (n = 35) and moose (n = 37) meat were collected from voluntary hunters and analysed for lead content using inductively coupled plasma-mass spectrometry. A risk assessment for people consuming lead-shot game meat was performed using Monte Carlo simulations. Mean lead levels in white-tailed deer and moose killed by lead ammunition were 0.28 and 0.17 mg/kg, respectively. Risk assessment based on declared cervid meat consumption revealed that 1.7% of the surveyed hunters would exceed the dose associated with a 1 mmHg increase in systolic blood pressure (SBP). For consumers of moose meat once, twice or three times a week, simulations predicted that 0.5%, 0.9% and 1.5% of adults would be exposed to a dose associated with a 1 mmHg increase in SBP, whereas 0.9%, 1.9% and 3.3% of children would be exposed to a dose associated with a 1-point intelligence quotient (IQ) decrease, respectively. For consumers of deer meat once, twice or three times a week, the proportions were 1.6%, 2.9% and 4% for adults and 2.9%, 5.8% and 7.7% for children, respectively. The consumption of meat from cervids killed with lead ammunition may increase lead exposure and its associated health risks. It would be important to inform the population, particularly hunters, about this potential risk and promote the use of lead-free ammunition.
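
    As a rough, hedged illustration of the kind of Monte Carlo intake simulation mentioned above (not the authors' model), the Python sketch below draws meat lead concentration, portion size, and body weight from assumed distributions and estimates the weekly lead dose per kilogram of body weight. Only the mean meat concentrations (0.28 and 0.17 mg/kg) come from the record; the spreads, portion sizes, body weights, and the 1.3 µg/kg per week comparison value are invented assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000  # Monte Carlo iterations

def weekly_dose_ug_per_kg(mean_pb_mg_kg, meals_per_week):
    """Simulated weekly lead dose (ug per kg body weight) from cervid meat consumption."""
    # Lognormal meat concentration centred on the reported mean; the spread is an assumption.
    conc_mg_kg = rng.lognormal(mean=np.log(mean_pb_mg_kg), sigma=0.6, size=N)
    portion_g = rng.normal(150.0, 40.0, size=N).clip(min=20.0)    # assumed portion size (g)
    body_kg = rng.normal(70.0, 12.0, size=N).clip(min=40.0)       # assumed adult body weight (kg)
    dose_mg = conc_mg_kg * (portion_g / 1000.0) * meals_per_week  # mg Pb ingested per week
    return 1000.0 * dose_mg / body_kg                             # ug Pb per kg bw per week

for species, mean_conc in (("deer", 0.28), ("moose", 0.17)):
    dose = weekly_dose_ug_per_kg(mean_conc, meals_per_week=2)
    # 1.3 ug/kg bw per week is an assumed comparison value, not a value from the study.
    print(f"{species}: median {np.median(dose):.2f} ug/kg/week, "
          f"fraction above 1.3 ug/kg/week = {(dose > 1.3).mean():.1%}")
```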

  19. Application of Lead Viscoelastic Dampers to Wind Vibration Control on Big-Span Power Transmission Tower

    Institute of Scientific and Technical Information of China (English)

    LIANG Zheng-ping; LI Li; YIN Peng; DUAN Song-tao

    2008-01-01

    To study the wind vibration response of a power transmission tower, lead viscoelastic dampers (LVDs) were applied to a cup tower. Using the time history analysis method, the displacement, velocity, acceleration and force responses of the tower were calculated and analyzed. The results show that lead viscoelastic dampers provide effective vibration control, and the damping ratio can reach 20% or more when they are applied to the tower head.
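
    The record reports that lead viscoelastic dampers raise the effective damping ratio of the tower head to about 20%. The Python sketch below is only a generic single-degree-of-freedom time-history illustration of why that matters: it integrates the same gusty load with 2% and 20% damping and compares peak displacements. The modal mass, stiffness, and load are invented round numbers, not properties of the studied tower.

```python
import numpy as np
from scipy.integrate import solve_ivp

m, k = 5.0e4, 2.0e6                  # assumed modal mass (kg) and stiffness (N/m)
wn = np.sqrt(k / m)                  # natural circular frequency (rad/s)

def wind_load(t):
    """A crude two-component 'gusty' load (N); purely illustrative."""
    return 1.0e4 * (np.sin(0.9 * wn * t) + 0.5 * np.sin(0.3 * wn * t))

def peak_displacement(zeta, t_end=60.0):
    """Integrate the damped SDOF equation m*x'' + c*x' + k*x = F(t) and return max |x|."""
    c = 2.0 * zeta * np.sqrt(k * m)
    def rhs(t, y):
        x, v = y
        return [v, (wind_load(t) - c * v - k * x) / m]
    sol = solve_ivp(rhs, (0.0, t_end), [0.0, 0.0], max_step=0.01)
    return np.abs(sol.y[0]).max()

for zeta in (0.02, 0.20):            # bare tower vs. tower with dampers (assumed values)
    print(f"damping ratio {zeta:.0%}: peak displacement {1000.0 * peak_displacement(zeta):.1f} mm")
```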

  20. Concentration trends for lead and calcium-normalized lead in fish fillets from the Big River, a mining-contaminated stream in southeastern Missouri USA

    Science.gov (United States)

    Schmitt, Christopher J.; McKee, Michael J.

    2016-01-01

    Lead (Pb) and calcium (Ca) concentrations were measured in fillet samples of longear sunfish (Lepomis megalotis) and redhorse suckers (Moxostoma spp.) collected in 2005–2012 from the Big River, which drains a historical mining area in southeastern Missouri and where a consumption advisory is in effect due to elevated Pb concentrations in fish. Lead tends to accumulate in Ca-rich tissues such as bone and scale. Concentrations of Pb in fish muscle are typically low, but can become elevated in fillets from Pb-contaminated sites, depending in part on how much bone, scale, and skin is included in the sample. We used analysis-of-covariance to normalize Pb concentration to the geometric mean Ca concentration (415 µg/g wet weight, ww), which reduced variation between taxa, sites, and years, as did the number of samples that exceeded the Missouri consumption advisory threshold (300 ng/g ww). Concentrations of Pb in 2005–2012 were lower than in the past, especially after Ca-normalization, but the consumption advisory is still warranted because concentrations were >300 ng/g ww in samples of both taxa from contaminated sites. For monitoring purposes, a simple linear regression model is proposed for estimating Ca-normalized Pb concentrations in fillets from Pb:Ca molar ratios as a way of reducing the effects of differing preparation methods on fillet Pb variation.
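
    The record proposes estimating Ca-normalized fillet Pb from the Pb:Ca molar ratio with a simple linear regression. The Python sketch below shows the mechanics of that idea under stated assumptions: the calibration data are invented and the fitted coefficients are not the authors'; only the 300 ng/g ww advisory threshold and the idea of predicting Pb at a reference Ca level come from the record.

```python
import numpy as np

PB_MOLAR_MASS, CA_MOLAR_MASS = 207.2, 40.08  # g/mol

def pb_ca_molar_ratio(pb_ng_g, ca_ug_g):
    """Molar Pb:Ca ratio from fillet concentrations (Pb in ng/g, Ca in ug/g, wet weight)."""
    return (pb_ng_g * 1e-3 / PB_MOLAR_MASS) / (ca_ug_g / CA_MOLAR_MASS)

# Invented calibration set: measured fillet Pb (ng/g), measured Ca (ug/g), and the
# Ca-normalized Pb (ng/g) that an ANCOVA-style adjustment would assign to each sample.
pb_meas = np.array([120.0, 450.0, 900.0, 200.0, 60.0, 1500.0])
ca_meas = np.array([300.0, 900.0, 1500.0, 500.0, 350.0, 2500.0])
pb_norm = np.array([150.0, 260.0, 340.0, 190.0, 75.0, 380.0])

slope, intercept = np.polyfit(pb_ca_molar_ratio(pb_meas, ca_meas), pb_norm, 1)

def estimate_normalized_pb(pb_ng_g, ca_ug_g):
    """Estimate the fillet Pb expected at the reference Ca level for one new sample."""
    return intercept + slope * pb_ca_molar_ratio(pb_ng_g, ca_ug_g)

estimate = estimate_normalized_pb(500.0, 800.0)
print(f"estimated Ca-normalized Pb: {estimate:.0f} ng/g (advisory threshold: 300 ng/g ww)")
```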

  1. Alveolar bone loss associated to periodontal disease in lead intoxicated rats under environmental hypoxia.

    Science.gov (United States)

    Terrizzi, Antonela R; Fernandez-Solari, Javier; Lee, Ching M; Bozzini, Clarisa; Mandalunis, Patricia M; Elverdin, Juan C; Conti, María Ines; Martínez, María Pilar

    2013-10-01

    Previously reported studies from this laboratory revealed that chronic intoxication with lead (Pb) under hypoxic conditions (HX) impaired growth parameters in rats and induced damage to femoral and mandibular bones predisposing to fractures. We also described periodontal inflammatory processes under such experimental conditions. Periodontitis is characterised by inflammation of the supporting tissues of the teeth that results in alveolar bone loss. The existence of populations living at high altitudes and exposed to lead contamination led us to establish the macroscopic, biochemical and histological parameters consistent with a periodontal disease in the same rat model with or without experimental periodontitis (EP). Sixty female rats were divided into: Control; Pb (1000 ppm of lead acetate in drinking water); HX (506 mbar) and PbHX (both treatments simultaneously). EP was induced by placing ligatures around the molars of half of the rats during the 14 days previous to the autopsy. Hemi-mandibles were extracted to evaluate bone loss by histomorphometrical techniques. TNFα plasmatic concentration was greater (p < 0.05), and EP produced alveolar bone loss, while Pb showed spontaneous bone loss also. In conclusion, these results show that lead intoxication under a hypoxic environment enhanced not only alveolar bone loss but also systemic and oral tissue inflammatory parameters, which could aggravate the physiopathological alterations produced by periodontal disease.

  2. Does my step look big in this? A visual illusion leads to safer stepping behaviour.

    Directory of Open Access Journals (Sweden)

    David B Elliott

    BACKGROUND: Tripping is a common factor in falls, and a typical safety strategy to avoid tripping on steps or stairs is to increase foot clearance over the step edge. In the present study we asked whether the perceived height of a step could be increased using a visual illusion and whether this would lead to the adoption of a safer stepping strategy, in terms of greater foot clearance over the step edge. The study also addressed the controversial question of whether motor actions are dissociated from visual perception. METHODOLOGY/PRINCIPAL FINDINGS: 21 young, healthy subjects perceived the step to be higher in a configuration of the horizontal-vertical illusion compared to a reverse configuration (p = 0.01). During a simple stepping task, maximum toe elevation changed by an amount corresponding to the size of the visual illusion (p < 0.001). Linear regression analyses showed highly significant associations between perceived step height and maximum toe elevation for all conditions. CONCLUSIONS/SIGNIFICANCE: The perceived height of a step can be manipulated using a simple visual illusion, leading to the adoption of a safer stepping strategy in terms of greater foot clearance over a step edge. In addition, the strong link found between perception of a visual illusion and visuomotor action provides additional support to the view that the original, controversial proposal by Goodale and Milner (1992) of two separate and distinct visual streams for perception and visuomotor action should be re-evaluated.

  3. Connexin26 (GJB2) deficiency reduces active cochlear amplification leading to late-onset hearing loss.

    Science.gov (United States)

    Zhu, Y; Chen, J; Liang, C; Zong, L; Chen, J; Jones, R O; Zhao, H-B

    2015-01-22

    Connexin26 (Cx26, GJB2) mutations account for >50% of nonsyndromic hearing loss. The deafness is not always congenital. A large group of these patients (∼30%) demonstrate a late-onset hearing loss, starting in childhood. They have normal hearing early in life and are therefore good candidates for applying protective and therapeutic interventions. However, the underlying deafness mechanism is unclear. In this study, we used a time-controlled, inducible gene knockout technique to knockout Cx26 expression in the cochlea after birth. We found that deletion of Cx26 after postnatal day 5 (P5) in mice could lead to late-onset hearing loss. Similar to clinical observations, the mice demonstrated progressive, mild to moderate hearing loss. The hearing loss initiated at high frequencies and then extended to the middle- and low-frequency range. The cochlea showed normal development and had no apparent hair cell loss. However, distortion product otoacoustic emission (DPOAE) was reduced. The reduction was also progressive and large at high-frequencies. Consistent with DPOAE reduction, we found that outer hair cell electromotility-associated nonlinear capacitance was shifted to the right and the slope of voltage dependence was reduced. The endocochlear potential was reduced in Cx26 conditional knockout (cKO) mice but the reduction was not associated with progressive hearing loss. These data suggest that Cx26 deficiency may impair active cochlear amplification leading to late-onset hearing loss. Our study also helps develop newer protective and therapeutic interventions to this common nonsyndromic hearing loss.

  4. Theoretical and experimental loss and efficiency studies of a magnetic lead screw

    DEFF Research Database (Denmark)

    Berg, Nick Ilsø; Holm, Rasmus Koldborg; Rasmussen, Peter Omand

    2013-01-01

    This paper investigates mechanical and magnetic losses in a magnetic lead screw (MLS). The MLS converts a low speed high force linear motion of a translator into a high speed low torque rotational motion of a rotor through helically shaped magnets. Initial tests performed with a novel 17 k...

  5. Theoretical and Experimental Loss and Efficiency Studies of a Magnetic Lead Screw

    DEFF Research Database (Denmark)

    Berg, Nick Ilsø; Holm, Rasmus Koldborg; Rasmussen, Peter Omand

    2015-01-01

    This paper investigates mechanical and magnetic losses in a magnetic lead screw (MLS). The MLS converts a low-speed high-force linear motion of a translator into a high-speed low-torque rotational motion of a rotor through helically shaped magnets. Initial tests performed with a novel 17-kN demon...

  6. Parkin loss leads to PARIS-dependent declines in mitochondrial mass and respiration

    Science.gov (United States)

    Stevens, Daniel A.; Lee, Yunjong; Kang, Ho Chul; Lee, Byoung Dae; Lee, Yun-Il; Bower, Aaron; Jiang, Haisong; Kang, Sung-Ung; Andrabi, Shaida A.; Dawson, Valina L.; Shin, Joo-Ho; Dawson, Ted M.

    2015-01-01

    Mutations in parkin lead to early-onset autosomal recessive Parkinson’s disease (PD) and inactivation of parkin is thought to contribute to sporadic PD. Adult knockout of parkin in the ventral midbrain of mice leads to an age-dependent loss of dopamine neurons that is dependent on the accumulation of parkin interacting substrate (PARIS), zinc finger protein 746 (ZNF746), and its transcriptional repression of PGC-1α. Here we show that adult knockout of parkin in mouse ventral midbrain leads to decreases in mitochondrial size, number, and protein markers consistent with a defect in mitochondrial biogenesis. This decrease in mitochondrial mass is prevented by short hairpin RNA knockdown of PARIS. PARIS overexpression in mouse ventral midbrain leads to decreases in mitochondrial number and protein markers and PGC-1α–dependent deficits in mitochondrial respiration. Taken together, these results suggest that parkin loss impairs mitochondrial biogenesis, leading to declining function of the mitochondrial pool and cell death. PMID:26324925

  7. Preventing Mitochondrial Fission Impairs Mitochondrial Function and Leads to Loss of Mitochondrial DNA

    OpenAIRE

    Parone, Philippe A.; Sandrine Da Cruz; Daniel Tondera; Yves Mattenberger; James, Dominic I.; Pierre Maechler; François Barja; Jean-Claude Martinou

    2008-01-01

    Mitochondria form a highly dynamic tubular network, the morphology of which is regulated by frequent fission and fusion events. However, the role of mitochondrial fission in homeostasis of the organelle is still unknown. Here we report that preventing mitochondrial fission, by down-regulating expression of Drp1 in mammalian cells leads to a loss of mitochondrial DNA and a decrease of mitochondrial respiration coupled to an increase in the levels of cellular reactive oxygen species (ROS). At t...

  8. Loss of FTO antagonises Wnt signaling and leads to developmental defects associated with ciliopathies.

    Directory of Open Access Journals (Sweden)

    Daniel P S Osborn

    Common intronic variants in the human fat mass and obesity-associated gene (FTO) are found to be associated with an increased risk of obesity. Overexpression of FTO correlates with increased food intake and obesity, whilst loss-of-function results in lethality and severe developmental defects. Despite intense scientific discussions around the role of FTO in energy metabolism, the function of FTO during development remains undefined. Here, we show that loss of Fto leads to developmental defects such as growth retardation, craniofacial dysmorphism and aberrant neural crest cell migration in zebrafish. We find that the important developmental pathway, Wnt, is compromised in the absence of FTO, both in vivo (zebrafish) and in vitro (Fto(-/-) MEFs and HEK293T). Canonical Wnt signalling is downregulated by abrogated β-Catenin translocation to the nucleus, whilst the non-canonical Wnt/Ca(2+) pathway is activated via its key signal mediators CaMKII and PKCδ. Moreover, we demonstrate that loss of Fto results in short, absent or disorganised cilia, leading to situs inversus, renal cystogenesis, neural crest cell defects and microcephaly in zebrafish. Congruently, Fto knockout mice display aberrant tissue-specific cilia. These data identify FTO as a protein-regulator of the balanced activation between canonical and non-canonical branches of the Wnt pathway. Furthermore, we present the first evidence that FTO plays a role in development and cilia formation/function.

  9. Research and Manufacture of a Big-lead Variable-pitch Screw%大导程、变螺距螺杆的研制

    Institute of Scientific and Technical Information of China (English)

    揭晓; 覃岭

    2011-01-01

    A process analysis was carried out for a big-lead, variable-pitch screw, a suitable NC program was written, and a qualified part was produced, meeting the customer's requirements.

  10. Loss of interleukin-21 leads to atrophic germinal centers in multicentric Castleman's disease.

    Science.gov (United States)

    Yajima, Hidetaka; Yamamoto, Motohisa; Shimizu, Yui; Sakurai, Nodoka; Suzuki, Chisako; Naishiro, Yasuyoshi; Imai, Kohzoh; Shinomura, Yasuhisa; Takahashi, Hiroki

    2016-01-01

    Both multicentric Castleman's disease (MCD) and immunoglobulin (Ig)G4-related disease (IgG4-RD) are systemic diseases, presenting with hypergammaglobulinemia and elevated serum levels of IgG4. However, with regard to histopathological findings, MCD shows atrophic germinal centers. On the other hand, expanded germinal centers are detected in IgG4-RD. We extracted germinal centers from specimens of each disorder by microdissection and analyzed the expression of mRNAs by real-time polymerase chain reaction to clarify the mechanisms underlying atrophied germinal centers in MCD. This analysis disclosed loss of interleukin (IL)-21 and B cell lymphoma (Bcl)-6 in the germinal centers of MCD. Loss of IL-21 is considered to be involved in the disappearance of Bcl-6 and leads to atrophied germinal centers in MCD.

  11. Preventing mitochondrial fission impairs mitochondrial function and leads to loss of mitochondrial DNA.

    Directory of Open Access Journals (Sweden)

    Philippe A Parone

    Mitochondria form a highly dynamic tubular network, the morphology of which is regulated by frequent fission and fusion events. However, the role of mitochondrial fission in homeostasis of the organelle is still unknown. Here we report that preventing mitochondrial fission, by down-regulating expression of Drp1 in mammalian cells, leads to a loss of mitochondrial DNA and a decrease of mitochondrial respiration coupled to an increase in the levels of cellular reactive oxygen species (ROS). At the cellular level, mitochondrial dysfunction resulting from the lack of fission leads to a drop in the levels of cellular ATP, an inhibition of cell proliferation and an increase in autophagy. In conclusion, we propose that mitochondrial fission is required for preservation of mitochondrial function and thereby for maintenance of cellular homeostasis.

  12. Loss of ATF2 function leads to cranial motoneuron degeneration during embryonic mouse development.

    Directory of Open Access Journals (Sweden)

    Julien Ackermann

    The AP-1 family transcription factor ATF2 is essential for development and tissue maintenance in mammals. In particular, ATF2 is highly expressed and activated in the brain, and previous studies using mouse knockouts have confirmed its requirement in the cerebellum as well as in vestibular sense organs. Here we present the analysis of the requirement for ATF2 in CNS development in mouse embryos, specifically in the brainstem. We discovered that neuron-specific inactivation of ATF2 leads to significant loss of motoneurons of the hypoglossal, abducens and facial nuclei. While the generation of ATF2 mutant motoneurons appears normal during early development, they undergo caspase-dependent and -independent cell death during later embryonic and foetal stages. The loss of these motoneurons correlates with increased levels of stress-activated MAP kinases, JNK and p38, as well as aberrant accumulation of phosphorylated neurofilament proteins, NF-H and NF-M, known substrates for these kinases. This, together with other neuropathological phenotypes, including aberrant vacuolisation and lipid accumulation, indicates that deficiency in ATF2 leads to neurodegeneration of subsets of somatic and visceral motoneurons of the brainstem. It also confirms that ATF2 has a critical role in limiting the activities of the stress kinases JNK and p38, which are potent inducers of cell death in the CNS.

  13. Environmental lead pollution and its possible influence on tooth loss and hard dental tissue lesions

    Directory of Open Access Journals (Sweden)

    Cenić-Milošević Desanka

    2013-01-01

    Background/Aim. Environmental lead (Pb) pollution is a global problem. Hard dental tissue is capable of accumulating lead and other heavy metals from the environment. The aim of this study was to investigate any correlation between the concentration of lead in teeth extracted from inhabitants of Pančevo and Belgrade, Serbia, belonging to different age groups and the occurrence of tooth loss, caries and non-carious lesions. Methods. A total of 160 volunteers were chosen consecutively from Pančevo (the experimental group) and Belgrade (the control group) and divided into 5 age subgroups of 32 subjects each. Clinical examination consisted of caries and hard dental tissue diagnostics. The Decayed Missing Filled Teeth (DMFT) Index and Significant Caries Index were calculated. Extracted teeth were freed of any organic residue by UV digestion and subjected to voltammetric analysis for the content of lead. Results. The average DMFT scores in Pančevo (20.41) were higher than in Belgrade (16.52); in the patients aged 31-40 and 41-50 years the difference was significant (p < 0.05) and highly significant in the patients aged 51-60 (23.69 vs 18.5, p < 0.01). Non-carious lesions were diagnosed in 71 (44%) patients from Pančevo and 39 (24%) patients from Belgrade. The concentrations of Pb in extracted teeth in all the groups from Pančevo were statistically significantly (p < 0.05) higher than in all the groups from Belgrade. In the patients from Pančevo, correlations between Pb concentration in extracted teeth and the number of extracted teeth, the number of carious lesions and the number of non-carious lesions were statistically significant (p < 0.001, p < 0.01 and p < 0.001, respectively). Conclusion. According to the correlations between lead concentration and the number of extracted teeth, carious lesions and non-carious lesions found in the patients living in Pančevo, one possible cause of tooth loss and hard dental tissue damage could be long-term exposure to environmental lead.

  14. Knockdown of cytosolic glutaredoxin 1 leads to loss of mitochondrial membrane potential: implication in neurodegenerative diseases.

    Directory of Open Access Journals (Sweden)

    Uzma Saeed

    Mitochondrial dysfunction, including that caused by oxidative stress, has been implicated in the pathogenesis of neurodegenerative diseases. Glutaredoxin 1 (Grx1), a cytosolic thiol disulfide oxido-reductase, reduces glutathionylated proteins to protein thiols and helps maintain the redox status of proteins during oxidative stress. Grx1 downregulation aggravates mitochondrial dysfunction in animal models of neurodegenerative diseases, such as Parkinson's and motor neuron disease. We examined the mechanism underlying the regulation of mitochondrial function by Grx1. Downregulation of Grx1 by shRNA results in loss of mitochondrial membrane potential (MMP), which is prevented by the thiol antioxidant, alpha-lipoic acid, or by cyclosporine A, an inhibitor of mitochondrial permeability transition. The thiol groups of voltage dependent anion channel (VDAC), an outer membrane protein in mitochondria, but not adenosine nucleotide translocase (ANT), an inner membrane protein, are oxidized when Grx1 is downregulated. We then examined the effect of beta-N-oxalyl amino-L-alanine (L-BOAA), an excitatory amino acid implicated in neurolathyrism (a type of motor neuron disease), that causes mitochondrial dysfunction. Exposure of cells to L-BOAA resulted in loss of MMP, which was prevented by overexpression of Grx1. Grx1 expression is regulated by estrogen in the CNS, and treatment of SH-SY5Y cells with estrogen upregulated Grx1 and protected from L-BOAA mediated MMP loss. Our studies demonstrate that Grx1, a cytosolic oxido-reductase, helps maintain mitochondrial integrity and prevents MMP loss caused by oxidative insult. Further, downregulation of Grx1 leads to mitochondrial dysfunction through oxidative modification of the outer membrane protein, VDAC, providing support for the critical role of Grx1 in maintenance of MMP.

  15. Thermodynamics of the ytterbium-lead system by simultaneous weight-loss-mass-spectrometry Knudsen effusion

    Energy Technology Data Exchange (ETDEWEB)

    Schiffman, R.A.

    1982-09-16

    The ytterbium-lead system was investigated in the temperature range 750 to 1381 K by measurements of vapor pressures with a simultaneous weight-loss-mass-spectrometric (WLMS) Knudsen effusion technique. A congruently vaporizing phase was located at YbPb1.04. The nonstoichiometric composition ranges of the intermetallic compounds were determined, and a Yb-Pb phase diagram was constructed. The heats of formation at 298 K calculated from the thermodynamic measurements were as follows (kJ/mol): Yb2Pb, -174.6; Yb5Pb3, -461.6; YbPb, -116.9; YbPb3, -152.4. 15 figures.
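
    For readers unfamiliar with weight-loss Knudsen effusion, the Python sketch below evaluates the standard Knudsen equation, p = (dm/dt)/(A·W)·sqrt(2πRT/M), which converts a measured effusion mass-loss rate into a vapor pressure. The numerical example (mass loss, orifice size, temperature) is illustrative only and is not data from the study.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def knudsen_pressure(mass_loss_kg, time_s, orifice_area_m2, temperature_k,
                     molar_mass_kg_mol, clausing_factor=1.0):
    """Vapor pressure (Pa) from effusion: p = (dm/dt) / (A * W) * sqrt(2*pi*R*T / M)."""
    rate = mass_loss_kg / time_s
    return rate / (orifice_area_m2 * clausing_factor) * math.sqrt(
        2.0 * math.pi * R * temperature_k / molar_mass_kg_mol)

# Illustrative numbers only: 2 mg lost over one hour through a 1 mm diameter orifice at 1000 K,
# assuming the effusing vapor is monatomic Yb (molar mass ~0.173 kg/mol).
p = knudsen_pressure(mass_loss_kg=2.0e-6, time_s=3600.0,
                     orifice_area_m2=math.pi * (0.5e-3) ** 2,
                     temperature_k=1000.0, molar_mass_kg_mol=0.173)
print(f"estimated vapor pressure: {p:.3f} Pa")
```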

  16. Big Data: Big Confusion? Big Challenges?

    Science.gov (United States)

    2015-05-01

    Presentation at the 12th Annual Acquisition Research Symposium: "Big Data: Big Confusion? Big Challenges?" The talk notes that 90% of the data in the world today was created in the last two years.

  17. Lead

    Science.gov (United States)

    ... found? Who is at risk? What are the health effects of lead? Topics include educational material about lead, certification as a Lead Abatement Worker or other abatement discipline, lead in drinking water, lead air pollution, testing your child, and checking and maintaining your home ...

  18. Lipoprotein(a) and inflammation: A dangerous duet leading to endothelial loss of integrity.

    Science.gov (United States)

    Pirro, Matteo; Bianconi, Vanessa; Paciullo, Francesco; Mannarino, Massimo R; Bagaglia, Francesco; Sahebkar, Amirhossein

    2017-02-07

    Lipoprotein(a) [Lp(a)] is an enigmatic lipoprotein whose ancestral useful properties have been gradually obscured by its adverse pro-atherogenic and pro-thrombotic effects, which culminate in an increased risk of ischemic cardiovascular events. Although plasma Lp(a) levels are largely determined on a genetic basis, multiple factors have been reported to interfere with its plasma levels. Inflammation is one of these factors and it is believed to promote pro-atherogenic and pro-thrombotic changes leading to increased cardiovascular disease risk. The influence of inflammation on plasma Lp(a) levels is variable, with studies reporting either increased, reduced or unchanged Lp(a) expression and plasma concentrations following exposure to pro-inflammatory stimuli. The complex association between inflammation and Lp(a) is further amplified by additional findings showing that Lp(a) may promote the expression of a plethora of pro-inflammatory cytokines and induce the endothelium to switch into an activated status, which results in adhesion molecule expression and inflammatory cell invasion into the arterial wall. From this picture, it emerges that increased plasma Lp(a) levels and inflammation may coexist and their coexistence may exert a deleterious impact on endothelial integrity at both a functional and a structural level. Also, the detrimental duet of inflammation and Lp(a) may interfere with the physiological endothelial repair response, thus further amplifying endothelial loss of integrity and protective functions. A fundamental understanding of the interaction between Lp(a) and inflammation is critical for our comprehension of the mechanisms leading to the derangement of endothelial homeostasis and vascular dysfunction.

  19. Estrogen Deficiency Leads to Further Bone Loss in the Mandible of CKD Mice.

    Directory of Open Access Journals (Sweden)

    Yuchen Guo

    Chronic kidney disease (CKD) has been regarded as a grave public health problem. Estrogen is a critical factor for both renal protection and bone remodeling. Our previous study demonstrated that CKD impairs the healing of titanium implants. The aim of this study was to investigate the effects of estrogen deficiency on the mandibular bone in CKD mice. Forty eleven-week-old female C57BL mice were used in this study. Uremia and estrogen deficiency were induced by 5/6 nephrectomy and ovariectomy (OVX), respectively. After 8 weeks, the mice were sacrificed, and their mandibles were collected for micro-CT analysis and histological examination. All the mice survived the experimental period. Serum measurements confirmed a significant increase in BUN in the CKD group that was further increased by OVX. OVX led to significant decreases in both the BV/TV and cortical thickness of the mandibular bone in CKD mice. In summary, our findings indicate that estrogen deficiency leads to further mandibular bone loss in CKD mice.

  20. Transgenic n-3 PUFAs enrichment leads to weight loss via modulating neuropeptides in hypothalamus.

    Science.gov (United States)

    Ma, Shuangshuang; Ge, Yinlin; Gai, Xiaoying; Xue, Meilan; Li, Ning; Kang, Jingxuan; Wan, Jianbo; Zhang, Jinyu

    2016-01-12

    Body weight is related to fat mass, which is associated with obesity. Our study explored the effect of the fat-1 gene on body weight in fat-1 transgenic mice. In the present study, we observed that the weight/length ratio of fat-1 transgenic mice was lower than that of wild-type mice. The serum levels of triglycerides (TG), cholesterol (CT), high-density lipoprotein cholesterol (HDL-c), low-density lipoprotein cholesterol (LDL-c) and blood glucose (BG) in fat-1 transgenic mice were all decreased. The weights of peri-bowel fat, perirenal fat and peri-testicular fat in fat-1 transgenic mice were reduced. We hypothesized that an increase in n-3 PUFAs might alter the expression of hypothalamic neuropeptide genes and lead to loss of body weight in fat-1 transgenic mice. Therefore, we measured mRNA levels of the appetite neuropeptides Neuropeptide Y (NPY), Agouti-related peptide (AgRP), Proopiomelanocortin (POMC), Cocaine- and amphetamine-regulated transcript (CART), ghrelin and nesfatin-1 in the hypothalamus by real-time PCR. Compared with wild-type mice, the mRNA levels of CART, POMC and ghrelin were higher, while the mRNA levels of NPY, AgRP and nesfatin-1 were lower in fat-1 transgenic mice. The results indicate that the fat-1 gene, or n-3 PUFAs, participates in the regulation of body weight, and that the mechanism involves the expression of appetite neuropeptides and lipoproteins in fat-1 transgenic mice.

  1. Big-data-oriented Cloud Logistics Leads the Logistics Mode Change

    Institute of Scientific and Technical Information of China (English)

    梁红波

    2014-01-01

    Big data provides enterprise marketing with scientific, quick and reliable data analysis and suggestions. Developing cloud logistics on the basis of big data technology can efficiently integrate logistics resources, reduce logistics costs at each node of the supply chain, and improve the value-added service level of logistics enterprises. Led by cloud logistics and big data, new logistics modes such as logistics enterprise alliances, virtual dry ports and integrated supply-chain logistics are being popularized and applied, and logistics enterprises are undergoing change, transformation and upgrading in the cloud-logistics environment.

  2. Systematic Evaluation of Dissolved Lead Sorption Losses to Particulate Syringe Filter Materials

    Science.gov (United States)

    Distinguishing between soluble and particulate lead in drinking water is useful in understanding the mechanism of lead release and identifying remedial action. Typically, particulate lead is defined as the amount of lead removed by a 0.45 µm filter. Unfortunately, there is little...

  3. Continuous feedings of fortified human milk lead to nutrient losses of fat, calcium, and phosphorous

    Science.gov (United States)

    Substantial losses of nutrients may occur during tube (gavage) feeding of fortified human milk. Our objective was to compare the losses of key macronutrients and minerals based on method of fortification, and gavage feeding method. We used clinically available gavage feeding systems and measured pre...

  4. The association between low levels of lead in blood and occupational noise-induced hearing loss in steel workers

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Yaw-Huei [Institute of Occupational Medicine and Industrial Hygiene, College of Public Health, National Taiwan University, Taipei, Taiwan, ROC Rm. 735, 17, Xu-Zhou Rd., Taipei, Taiwan, ROC (China); Department of Public Health, College of Public Health, National Taiwan University, Taipei, Taiwan, ROC Rm. 735, 17, Xu-Zhou Rd., Taipei, Taiwan, ROC (China); Chiang, Han-Yueh [Institute of Occupational Medicine and Industrial Hygiene, College of Public Health, National Taiwan University, Taipei, Taiwan, ROC Rm. 735, 17, Xu-Zhou Rd., Taipei, Taiwan, ROC (China); Yen-Jean, Mei-Chu [Division of Family Medicine, E-Da Hospital, Taiwan, ROC 1, E-Da Rd., Jiau-Shu Tsuen, Yan-Chau Shiang, Kaohsiung County, Taiwan, ROC (China); I-Shou University, Kaohsiung County, Taiwan, ROC 1, Sec. 1, Syuecheng Rd., Da-Shu Shiang, Kaohsiung County, Taiwan, ROC (China); Wang, Jung-Der, E-mail: jdwang@ntu.edu.tw [Institute of Occupational Medicine and Industrial Hygiene, College of Public Health, National Taiwan University, Taipei, Taiwan, ROC Rm. 735, 17, Xu-Zhou Rd., Taipei, Taiwan, ROC (China); Department of Public Health, College of Public Health, National Taiwan University, Taipei, Taiwan, ROC Rm. 735, 17, Xu-Zhou Rd., Taipei, Taiwan, ROC (China); Department of Internal Medicine, National Taiwan University Hospital and National Taiwan University College of Medicine, Taipei, Taiwan, ROC No. 1, Chang-Teh St., Taipei, Taiwan, ROC (China)

    2009-12-15

    As the use of leaded gasoline has ceased in the last decade, background lead exposure has generally been reduced. The aim of this study was to examine the effect of low-level lead exposure on human hearing loss. This study was conducted in a steel plant and 412 workers were recruited from all over the plant. Personal information such as demographics and work history was obtained through a questionnaire. All subjects took part in an audiometric examination of hearing thresholds, for both ears, with air-conducted pure tones at frequencies of 500, 1000, 2000, 3000, 4000, 6000 and 8000 Hz. Subjects' blood samples were collected and analyzed for levels of manganese, copper, zinc, arsenic, cadmium and lead with inductively coupled plasma-mass spectrometry. Meanwhile, noise levels in different working zones were determined using a sound level meter with an A-weighting network. Only subjects with a hearing loss difference of no more than 15 dB between the two ears and no congenital abnormalities were included in further data analysis. Lead was the only metal in blood found to be significantly correlated with hearing loss for most tested sound frequencies (p < 0.05 to p < 0.0001). After adjustment for age and noise level, logistic regression analysis indicated that elevated blood lead over 7 µg/dL was significantly associated with hearing loss at the sound frequencies of 3000 through 8000 Hz, with odds ratios ranging from 3.06 to 6.26 (p < 0.05 to p < 0.005). We concluded that elevated blood lead at levels below 10 µg/dL might enhance noise-induced hearing loss. Future research needs to further explore the detailed mechanism.
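
    A minimal sketch of the type of analysis described above, not the authors' data or exact model: logistic regression of a hearing-loss indicator on a blood-lead indicator (>7 µg/dL), adjusted for age and noise level, reporting adjusted odds ratios. The data are simulated in Python with statsmodels; only the sample size (412) and the 7 µg/dL cut-point echo the record.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 412  # same number of workers as the study; everything below is simulated

age = rng.uniform(25.0, 60.0, n)                      # years
noise = rng.uniform(75.0, 95.0, n)                    # workplace noise (dBA)
blood_pb = rng.lognormal(np.log(5.0), 0.5, n)         # blood lead (ug/dL), assumed distribution
high_pb = (blood_pb > 7.0).astype(float)              # indicator for blood lead > 7 ug/dL

# Simulated outcome: hearing-loss probability rises with age, noise and elevated lead.
linpred = -14.0 + 0.06 * age + 0.12 * noise + 1.1 * high_pb
hearing_loss = rng.binomial(1, 1.0 / (1.0 + np.exp(-linpred)))

X = sm.add_constant(np.column_stack([age, noise, high_pb]))
fit = sm.Logit(hearing_loss, X).fit(disp=False)

for name, beta in zip(["intercept", "age", "noise (dBA)", "blood Pb > 7 ug/dL"], fit.params):
    print(f"{name:>20s}: adjusted OR = {np.exp(beta):.2f}")
```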

  5. Wildlife and Wildlife Habitat Loss Assessment at Detroit Big Cliff Dam and Reservoir Project, North Santiam River, Oregon, 1985 Final Report.

    Energy Technology Data Exchange (ETDEWEB)

    Noyes, J.H.

    1985-02-01

    A habitat based assessment was conducted of the US Army Corps of Engineers' Detroit/Big Cliff Dam and Reservoir Project (Detroit Project) on the North Santiam River, Oregon, to determine losses or gains resulting from the development and operation of the hydroelectric-related components of the project. Preconstruction, postconstruction, and recent vegetation cover types at the project site were mapped based on aerial photographs from 1939, 1956, and 1979, respectively. Vegetation cover types were identified within the affected area and acreages of each type at each time period were determined. Ten wildlife target species were selected to represent a cross-section of species groups affected by the project. An interagency team evaluated the suitability of the habitat to support the target species at each time period. An evaluation procedure which accounted for both the quantity and quality of habitat was used to aid in assessing impacts resulting from the project. The Detroit Project extensively altered or affected 6324 acres of land and river in the North Santiam River drainage. Impacts to wildlife centered around the loss of 1,608 acres of conifer forest and 620 acres of riparian habitat. Impacts resulting from the Detroit Project included the loss of winter range for black-tailed deer and Roosevelt elk, and the loss of year-round habitat for deer, river otter, beaver, ruffed grouse, pileated woodpecker, spotted owl, and many other wildlife species. Bald eagle and osprey were benefited by an increase in foraging habitat. The potential of the affected area to support wildlife was greatly altered as a result of the Detroit Project. Losses or gains in the potential of the habitat to support wildlife will exist over the life of the project.

  6. Continuous feedings of fortified human milk lead to nutrient losses of fat, calcium and phosphorous.

    Science.gov (United States)

    Rogers, Stefanie P; Hicks, Penni D; Hamzo, Maria; Veit, Lauren E; Abrams, Steven A

    2010-03-01

    Substantial losses of nutrients may occur during tube (gavage) feeding of fortified human milk. Our objective was to compare the losses of key macronutrients and minerals based on method of fortification and gavage feeding method. We used clinically available gavage feeding systems and measured pre- and post-feeding (end-point) nutrient content of calcium (Ca), phosphorus (Phos), protein, and fat. Comparisons were made between continuous, gravity bolus, and 30-minute infusion pump feeding systems, as well as human milk fortified with donor human milk-based and bovine milk-based human milk fortifier, using an in vitro model. Feeding method was significantly associated with fat and Ca losses, with increased losses in continuous feeds. Fat losses in continuous feeds were substantial, with 40 ± 3% of initial fat lost during the feeding process. After correction for feeding method, human milk fortified with donor milk-based fortifier was associated with significantly less loss of Ca (8 ± 4% vs. 28 ± 4%) than human milk fortified with a bovine milk-based fortifier (mean ± SEM).

  7. Big data, big governance

    NARCIS (Netherlands)

    Reep, Frans van der

    2016-01-01

    "Of course it is nice that my refrigerator orders milk on its own on the basis of data-related patterns. Deep learning based on big data holds great promise," says Frans van der Reep of Inholland. No wonder that this will be a main theme at the Hannover Messe during ScienceGuide's Wissenstag...

  8. Big Egos in Big Science

    DEFF Research Database (Denmark)

    Jeppesen, Jacob; Vaarst Andersen, Kristina; Lauto, Giancarlo

    In this paper we investigate the micro-mechanisms governing the structural evolution of a scientific collaboration. Empirical evidence indicates that we have transcended into a new paradigm with a new modus operandi, where scientific discovery is not led by so-called lone "stars", or big egos..., but instead by a group of people from a multitude of institutions, having a diverse knowledge set and capable of operating more and more complex instrumentation. Using a dataset consisting of full bibliometric coverage from a Large Scale Research Facility, we utilize a stochastic actor oriented model...

  9. Uptake and loss of mercury, cadmium and lead in marine organisms

    Digital Repository Service at National Institute of Oceanography (India)

    Kureishy, T.W.; DeSilva, C.

    Experiments were conducted on Tilapia mosambica, Perna viridis and Villorita cyprinoides on the uptake and loss of Hg, Cd and Pb in various tissues of these organisms. Hg was found to be highly toxic to the clams as compared to the fish and mussels...

  10. Loss of BAP1 function leads to EZH2-dependent transformation

    Science.gov (United States)

    LaFave, Lindsay M.; Béguelin, Wendy; Koche, Richard; Teater, Matt; Spitzer, Barbara; Chramiec, Alan; Papalexi, Efthymia; Keller, Matthew D.; Hricik, Todd; Konstantinoff, Katerina; Micol, Jean-Baptiste; Durham, Benjamin; Knutson, Sarah K.; Campbell, John E.; Blum, Gil; Shi, Xinxu; Doud, Emma H.; Krivtsov, Andrei V.; Chung, Young Rock; Khodos, Inna; de Stanchina, Elisa; Ouerfelli, Ouathek; Adusumilli, Prasad S.; Thomas, Paul M.; Kelleher, Neil L.; Luo, Minkui; Keilhack, Heike; Abdel-Wahab, Omar; Melnick, Ari; Armstrong, Scott A.

    2015-01-01

    BAP1 and ASXL1 interact to form a polycomb deubiquitinase complex that removes monoubiquitin from histone H2A lysine 119 (H2AK119Ub). However, BAP1 and ASXL1 are mutated in distinct cancer types, consistent with independent roles in regulating epigenetic state and malignant transformation. Here we demonstrate that Bap1 loss results in increased trimethylated histone H3 lysine 27 (H3K27me3), elevated Ezh2 expression, and enhanced repression of Polycomb Repressive Complex 2 (PRC2) targets. These findings contrast with the reduction in H3K27me3 seen with Asxl1 loss. Conditional deletion of Bap1 and Ezh2 in vivo abrogates the myeloid progenitor expansion induced by Bap1 loss alone. Loss of Bap1 results in a marked decrease in H4K20 monomethylation (H4K20me1). Consistent with a role for H4K20me1 in EZH2 transcriptional regulation, expression of SETD8, the H4K20me1 methyltransferase, reduces EZH2 expression and abrogates the proliferation of BAP1-mutant cells. Further, mesothelioma cells that lack BAP1 are sensitive to EZH2 pharmacologic inhibition, suggesting a novel therapeutic approach for BAP1-mutant malignancies. PMID:26437366

  11. Dual Loss of Rb1 and Trp53 in the Adrenal Medulla Leads to Spontaneous Pheochromocytoma

    Directory of Open Access Journals (Sweden)

    Ian D. Tonks

    2010-03-01

    Full Text Available Using a Cre/loxP system, we have determined the phenotypic consequences attributable to in vivo deletion of both Rb1 and Trp53 in the mouse adrenal medulla. The coablation of these two tumor suppressor genes during embryogenesis did not disrupt adrenal gland development but resulted in the neoplastic transformation of the neural crest-derived adrenal medulla, yielding pheochromocytomas (PCCs) that developed with complete penetrance and were inevitably bilateral. Despite their typically benign status, these PCCs had profound ramifications on mouse vitality, with affected mice having a median survival of only 121 days. Evaluation of these PCCs by both immunohistochemistry and electron microscopy revealed that most Rb1-/-:Trp53-/- chromaffin cells possessed atypical chromagenic vesicles that did not seem capable of appropriately storing synthesized catecholamines. The structural remodeling of the heart in mice harboring Rb1-/-:Trp53-/- PCCs suggests that the mortality of these mice may be attributable to the inappropriate release of catecholamines from the mutated adrenal chromaffin cells. On the basis of the collective data from Rb1 and Trp53 knockout mouse models, it seems that the conversion of Rb1 loss-driven adrenal medulla hyperplasia to PCC can be greatly enhanced by the compound loss of Trp53, whereas the loss of Trp53 alone is generally ineffectual on adrenal chromaffin cell homeostasis. Consequently, the Trp53 tumor suppressor gene is an efficient genetic modifier of Rb1 loss in the development of PCC, and their compound loss in the adrenal medulla has a profound impact on both cellular homeostasis and animal vitality.

  12. Calculation of losses in a HTS current lead with the help of the dimensional analysis

    Energy Technology Data Exchange (ETDEWEB)

    Douine, B.; Leveque, J.; Netter, D.; Rezzoug, A

    2003-12-01

    The calculation of losses is essential to the design of any superconducting device, and an analytical approach is best suited to parameter analysis. Bean's model is based on the assumption that the resistive transition is abrupt, which is more suitable for low critical temperature superconductors. For ceramics the transition is smoother, so the variation of the electric field E with current density is well approximated by a function of the form kJ^n. Using this kind of function and a dimensional analysis, the authors propose a new analytic formula to calculate the losses in the case of incomplete penetration of current. Calculated results are compared to measured ones and the validity limit is shown.
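    The smooth E-J characteristic mentioned above replaces Bean's abrupt transition with a power law, often written E = Ec (J/Jc)^n. The sketch below is only a crude numerical illustration of how dissipation can be evaluated under such a law for a sinusoidal transport current below full penetration; it ignores the spatial field distribution that the paper's dimensional analysis handles, and all parameter values (Ec, Jc, n, current amplitude) are assumptions.

        # Highly simplified sketch: local dissipation for a superconductor described by a
        # smooth power-law E(J), contrasted with Bean's abrupt transition (the n -> infinity limit).
        import numpy as np

        Ec = 1e-4        # V/m, common electric-field criterion (assumed)
        Jc = 1e8         # A/m^2, critical current density (assumed)
        n = 20           # transition sharpness (assumed)

        def e_field(J):
            """Smooth power-law E-J characteristic used for HTS ceramics."""
            return Ec * (np.abs(J) / Jc) ** n * np.sign(J)

        # Dissipated power density p = E * J over one cycle of a sinusoidal current,
        # a crude stand-in for the per-cycle loss the analytical formula estimates.
        t = np.linspace(0.0, 1.0, 2001)           # one period, normalised time
        J = 0.8 * Jc * np.sin(2 * np.pi * t)      # incomplete penetration regime (|J| < Jc)
        p = e_field(J) * J                        # W/m^3, instantaneous dissipation
        loss_per_cycle = np.trapz(p, t)           # J/m^3 per (normalised) cycle

        print(f"dissipation: {loss_per_cycle:.3e} J/m^3 per cycle")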

  13. High humidity leads to loss of infectious influenza virus from simulated coughs.

    Directory of Open Access Journals (Sweden)

    John D Noti

    Full Text Available BACKGROUND: The role of relative humidity in the aerosol transmission of influenza was examined in a simulated examination room containing coughing and breathing manikins. METHODS: Nebulized influenza was coughed into the examination room and bioaerosol samplers collected size-fractionated aerosols (with aerodynamic diameters above and below 4 µm) adjacent to the breathing manikin's mouth and also at other locations within the room. At constant temperature, the relative humidity was varied from 7% to 73% and infectivity was assessed by the viral plaque assay. RESULTS: Total virus collected for 60 minutes retained 70.6-77.3% infectivity at relative humidity ≤23% but only 14.6-22.2% at relative humidity ≥43%. Analysis of the individual aerosol fractions showed a similar loss in infectivity among the fractions. Time interval analysis showed that most of the loss in infectivity within each aerosol fraction occurred 0-15 minutes after coughing. Thereafter, losses in infectivity continued up to 5 hours after coughing; however, the rate of decline at 45% relative humidity was not statistically different from that at 20%, regardless of the aerosol fraction analyzed. CONCLUSION: At low relative humidity, influenza retains maximal infectivity, and inactivation of the virus at higher relative humidity occurs rapidly after coughing. Although the virus is carried on small aerosol particles, maintaining indoor relative humidity above 40% will significantly reduce the infectivity of aerosolized virus.

  14. Premature capacity loss in lead/acid batteries: a discussion of the antimony-free effect and related phenomena

    Science.gov (United States)

    Hollenkamp, A. F.

    Instances of severe capacity loss in apparently healthy lead/acid batteries have been reported over a period of many years, and are still common today. In most cases, these phenomena are linked to the use of antimony-free positive grids and are invoked by repetitive deep-discharge duties. This situation represents probably the greatest barrier to the expansion of markets for lead/acid batteries. To date, research has focused on several possible explanations for capacity loss; notably, degradation of the positive active mass (e.g., relaxable insufficient mass utilization) and the development of electrical barriers around the grid. Although much of the evidence gathered is circumstantial, it does point to the key issues that must be addressed in future work.

  15. Big data, big responsibilities

    Directory of Open Access Journals (Sweden)

    Primavera De Filippi

    2014-01-01

    Full Text Available Big data refers to the collection and aggregation of large quantities of data produced by and about people, things or the interactions between them. With the advent of cloud computing, specialised data centres with powerful computational hardware and software resources can be used for processing and analysing a humongous amount of aggregated data coming from a variety of different sources. The analysis of such data is all the more valuable to the extent that it allows for specific patterns to be found and new correlations to be made between different datasets, so as to eventually deduce or infer new information, as well as to potentially predict behaviours or assess the likelihood for a certain event to occur. This article will focus specifically on the legal and moral obligations of online operators collecting and processing large amounts of data, to investigate the potential implications of big data analysis on the privacy of individual users and on society as a whole.

  16. Loss of Vps54 Function Leads to Vesicle Traffic Impairment, Protein Mis-Sorting and Embryonic Lethality

    OpenAIRE

    Karlsson, Páll; Droce, Aida; Moser, Jakob; Cuhlmann, Simon; Padilla, Carolina; Heimann, Peter; Bartsch, Jörg; Füchtbauer, Annette; Füchtbauer, Ernst-Martin; Schmitt-John, Thomas

    2013-01-01

    The identification of the mutation causing the phenotype of the amyotrophic lateral sclerosis (ALS) model mouse, wobbler, has linked motor neuron degeneration with retrograde vesicle traffic. The wobbler mutation affects protein stability of Vps54, a ubiquitously expressed vesicle-tethering factor and leads to partial loss of Vps54 function. Moreover, the Vps54 null mutation causes embryonic lethality, which is associated with extensive membrane blebbing in the neural tube and is most likely ...

  17. Effects of historical lead-zinc mining on riffle-dwelling benthic fish and crayfish in the Big River of southeastern Missouri, USA.

    Science.gov (United States)

    Allert, A L; DiStefano, R J; Fairchild, J F; Schmitt, C J; McKee, M J; Girondo, J A; Brumbaugh, W G; May, T W

    2013-04-01

    The Big River (BGR) drains much of the Old Lead Belt mining district (OLB) in southeastern Missouri, USA, which was historically among the largest producers of lead-zinc (Pb-Zn) ore in the world. We sampled benthic fish and crayfish in riffle habitats at eight sites in the BGR and conducted 56-day in situ exposures to the woodland crayfish (Orconectes hylas) and golden crayfish (Orconectes luteus) in cages at four sites affected to differing degrees by mining. Densities of fish and crayfish, physical habitat and water quality, and the survival and growth of caged crayfish were examined at sites with no known upstream mining activities (i.e., reference sites) and at sites downstream of mining areas (i.e., mining and downstream sites). Lead, zinc, and cadmium were analyzed in surface and pore water, sediment, detritus, fish, crayfish, and other benthic macro-invertebrates. Metals concentrations in all materials analyzed were greater at mining and downstream sites than at reference sites. Ten species of fish and four species of crayfish were collected. Fish and crayfish densities were significantly greater at reference than mining or downstream sites, and densities were greater at downstream than mining sites. Survival of caged crayfish was significantly lower at mining sites than reference sites; downstream sites were not tested. Chronic toxic-unit scores and sediment probable effects quotients indicated significant risk of toxicity to fish and crayfish, and metals concentrations in crayfish were sufficiently high to represent a risk to wildlife at mining and downstream sites. Collectively, the results provided direct evidence that metals associated with historical mining activities in the OLB continue to affect aquatic life in the BGR.
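    The sediment probable effects quotients mentioned above are typically computed by dividing each measured metal concentration by a consensus probable-effect concentration (PEC) and averaging the quotients. The sketch below illustrates that arithmetic only; the PEC benchmarks shown are commonly cited consensus values, and the sediment concentrations are hypothetical, not data from the Big River study.

        # Illustrative sketch: a mean probable-effects quotient (PEQ) for sediment metals.
        # PEC benchmarks are commonly cited consensus values (assumed here); the sample
        # concentrations are invented for illustration.
        pec = {"Pb": 128.0, "Zn": 459.0, "Cd": 4.98}          # mg/kg dry weight (assumed benchmarks)
        sediment = {"Pb": 1400.0, "Zn": 900.0, "Cd": 12.0}    # hypothetical mining-site sample

        quotients = {metal: sediment[metal] / pec[metal] for metal in pec}
        mean_peq = sum(quotients.values()) / len(quotients)

        print("per-metal quotients:", {m: round(q, 2) for m, q in quotients.items()})
        print("mean PEQ:", round(mean_peq, 2), "- values well above 1 indicate likely toxicity")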

  18. Cascading impacts of anthropogenically driven habitat loss: deforestation, flooding, and possible lead poisoning in howler monkeys (Alouatta pigra).

    Science.gov (United States)

    Serio-Silva, Juan Carlos; Olguín, Eugenia J; Garcia-Feria, Luis; Tapia-Fierro, Karla; Chapman, Colin A

    2015-01-01

    To construct informed conservation plans, researchers must go beyond understanding readily apparent threats such as habitat loss and bush-meat hunting. They must predict subtle and cascading effects of anthropogenic environmental modifications. This study considered a potential cascading effect of deforestation on the howler monkeys (Alouatta pigra) of Balancán, Mexico. Deforestation intensifies flooding. Thus, we predicted that increased flooding of the Usumacinta River, which creates large bodies of water that slowly evaporate, would produce increased lead content in the soils and plants, resulting in lead exposure in the howler monkeys. The average lead levels were 18.18 ± 6.76 ppm in the soils and 5.85 ± 4.37 ppm in the plants. However, the average lead content of the hair of 13 captured howler monkeys was 24.12 ± 5.84 ppm. The lead levels in the animals were correlated with 2 of 15 blood traits (lactate dehydrogenase and total bilirubin) previously documented to be associated with exposure to lead. Our research illustrates the urgent need to set reference values indicating when adverse impacts of high environmental lead levels occur, whether anthropogenic or natural, and the need to evaluate possible cascading effects of deforestation on primates.

  19. Cytological evidence for assortment mitosis leading to loss of heterozygosity in rice.

    Science.gov (United States)

    Wang, Richard R-C; Li, Xiaomei; Chatterton, N Jerry

    2006-05-01

    In the root meristem cells of the rice line AMR, which causes loss of heterozygosity in its hybrids, both normal and assortment mitoses were observed. During normal mitosis, chromosomes did not form homologous pairs at metaphase; all chromosomes lined up at the equatorial plate and 2 chromatids of each chromosome disjoined at the centromere and moved toward opposite poles. During assortment mitosis, varying numbers of paired homologues were observed at mitotic metaphase. Two groups of 12 chromosomes separated and moved towards the opposite poles of daughter cells with few chromosomes having their chromatids separated at anaphase. These observations support the proposed mechanism that is responsible for early genotype fixation in rice hybrids involving AMR.

  20. Loss of P2X7 nucleotide receptor function leads to abnormal fat distribution in mice.

    Science.gov (United States)

    Beaucage, Kim L; Xiao, Andrew; Pollmann, Steven I; Grol, Matthew W; Beach, Ryan J; Holdsworth, David W; Sims, Stephen M; Darling, Mark R; Dixon, S Jeffrey

    2014-01-01

    The P2X7 receptor is an ATP-gated cation channel expressed by a number of cell types. We have shown previously that disruption of P2X7 receptor function results in downregulation of osteogenic markers and upregulation of adipogenic markers in calvarial cell cultures. In the present study, we assessed whether loss of P2X7 receptor function results in changes to adipocyte distribution and lipid accumulation in vivo. Male P2X7 loss-of-function (KO) mice exhibited significantly greater body weight and epididymal fat pad mass than wild-type (WT) mice at 9 months of age. Fat pad adipocytes did not differ in size, consistent with adipocyte hyperplasia rather than hypertrophy. Histological examination revealed ectopic lipid accumulation in the form of adipocytes and/or lipid droplets in several non-adipose tissues of older male KO mice (9-12 months of age). Ectopic lipid was observed in kidney, extraorbital lacrimal gland and pancreas, but not in liver, heart or skeletal muscle. Specifically, lacrimal gland and pancreas from 12-month-old male KO mice had greater numbers of adipocytes in perivascular, periductal and acinar regions. As well, lipid droplets accumulated in the renal tubular epithelium and lacrimal acinar cells. Blood plasma analyses revealed diminished total cholesterol levels in 9- and 12-month-old male KO mice compared with WT controls. Interestingly, no differences were observed in female mice. Moreover, there were no significant differences in food consumption between male KO and WT mice. Taken together, these data establish novel in vivo roles for the P2X7 receptor in regulating adipogenesis and lipid metabolism in an age- and sex-dependent manner.

  1. FGF23 deficiency leads to mixed hearing loss and middle ear malformation in mice.

    Directory of Open Access Journals (Sweden)

    Andrew C Lysaght

    Full Text Available Fibroblast growth factor 23 (FGF23) is a circulating hormone important in phosphate homeostasis. Abnormal serum levels of FGF23 result in systemic pathologies in humans and mice, including renal phosphate wasting diseases and hyperphosphatemia. We sought to uncover the role FGF23 plays in the auditory system due to shared molecular mechanisms and genetic pathways between ear and kidney development, the critical roles multiple FGFs play in auditory development, and the known hearing phenotype in mice deficient in klotho (KL), a critical co-factor for FGF23 signaling. Using functional assessments of hearing, we demonstrate that Fgf23-/- mice are profoundly deaf. Fgf23+/- mice have moderate hearing loss above 20 kHz, consistent with mixed conductive and sensorineural pathology of both middle and inner ear origin. Histology and high-voltage X-ray computed tomography of Fgf23-/- mice demonstrate dysplastic bulla and ossicles; Fgf23+/- mice have near-normal morphology. The cochleae of mutant mice appear nearly normal on gross and microscopic inspection. In wild type mice, FGF23 is ubiquitously expressed throughout the cochlea. Measurements from Fgf23-/- mice do not match the auditory phenotype of Kl-/- mice, suggesting that loss of FGF23 activity impacts the auditory system via mechanisms at least partially independent of KL. Given the extensive middle ear malformations and the overlap of initiation of FGF23 activity and Eustachian tube development, this work suggests a possible role for FGF23 in otitis media.

  2. Rhabdomyolysis-Associated Mutations in Human LPIN1 Lead to Loss of Phosphatidic Acid Phosphohydrolase Activity.

    Science.gov (United States)

    Schweitzer, George G; Collier, Sara L; Chen, Zhouji; Eaton, James M; Connolly, Anne M; Bucelli, Robert C; Pestronk, Alan; Harris, Thurl E; Finck, Brian N

    2015-01-01

    Rhabdomyolysis is an acute syndrome due to extensive injury of skeletal muscle. Recurrent rhabdomyolysis is often caused by inborn errors in intermediary metabolism, and recent work has suggested that mutations in the human gene encoding lipin 1 (LPIN1) may be a common cause of recurrent rhabdomyolysis in children. Lipin 1 dephosphorylates phosphatidic acid to form diacylglycerol (phosphatidic acid phosphohydrolase; PAP) and acts as a transcriptional regulatory protein to control metabolic gene expression. Herein, a 3-year-old boy with severe recurrent rhabdomyolysis was determined to be a compound heterozygote for a novel c.1904T>C (p.Leu635Pro) substitution and a previously reported genomic deletion of exons 18-19 (E766-S838_del) in LPIN1. Western blotting with patient muscle biopsy lysates demonstrated a marked reduction in lipin 1 protein, while immunohistochemical staining for lipin 1 showed abnormal subcellular localization. We cloned cDNAs to express recombinant lipin 1 proteins harboring pathogenic mutations and showed that the E766-S838_del allele was not expressed at the RNA or protein level. Lipin 1 p.Leu635Pro was expressed, but the protein was less stable, was aggregated in the cytosol, and was targeted for proteasomal degradation. Another pathogenic single amino acid substitution, lipin 1 p.Arg725His, was well expressed and retained its transcriptional regulatory function. However, both p.Leu635Pro and p.Arg725His proteins were found to be deficient in PAP activity. Kinetic analyses demonstrated a loss of catalysis rather than diminished substrate binding. These data suggest that loss of lipin 1-mediated PAP activity may be involved in the pathogenesis of rhabdomyolysis in lipin 1 deficiency.

  3. The path to host extinction can lead to loss of generalist parasites.

    Science.gov (United States)

    Farrell, Maxwell J; Stephens, Patrick R; Berrang-Ford, Lea; Gittleman, John L; Davies, T Jonathan

    2015-07-01

    Host extinction can alter disease transmission dynamics, influence parasite extinction and ultimately change the nature of host-parasite systems. While theory predicts that single-host parasites are among the parasite species most susceptible to extinction following declines in their hosts, documented parasite extinctions are rare. Using a comparative approach, we investigate how the richness of single-host and multi-host parasites is influenced by extinction risk among ungulate and carnivore hosts. Host-parasite associations for free-living carnivores (order Carnivora) and terrestrial ungulates (orders Perissodactyla + Cetartiodactyla minus cetaceans) were merged with host trait data and IUCN Red List status to explore the distribution of single-host and multi-host parasites among threatened and non-threatened hosts. We find that threatened ungulates harbour a higher proportion of single-host parasites compared to non-threatened ungulates, which is explained by decreases in the richness of multi-host parasites. However, among carnivores threat status is not a significant predictor of the proportion of single-host parasites, or the richness of single-host or multi-host parasites. The loss of multi-host parasites from threatened ungulates may be explained by decreased cross-species contact as hosts decline and habitats become fragmented. Among carnivores, threat status may not be important in predicting patterns of parasite specificity because host decline results in equal losses of both single-host parasites and multi-host parasites through reduction in average population density and frequency of cross-species contact. Our results contrast with current models of parasite coextinction and highlight the need for updated theories that are applicable across host groups and account for both inter- and intraspecific contact.

  4. Big Data Era Leads the Transformation of Financial Reporting

    Institute of Scientific and Technical Information of China (English)

    管萍; 宋良荣

    2014-01-01

    The development of financial reporting is inseparable from advances in information technology. Extensible Business Reporting Language (XBRL) promoted the development of financial reporting in the internet age, and the technological and conceptual changes of the big data era will again lead to new breakthroughs in financial reporting, with significant impacts on the reconstruction of the financial reporting model, the improvement of the information disclosure system, and internet-based financial reporting.

  5. Loss of Function of Evc2 in Dental Mesenchyme Leads to Hypomorphic Enamel.

    Science.gov (United States)

    Zhang, H; Takeda, H; Tsuji, T; Kamiya, N; Kunieda, T; Mochida, Y; Mishina, Y

    2017-04-01

    Ellis-van Creveld (EvC) syndrome is an autosomal-recessive skeletal dysplasia, characterized by short stature and postaxial polydactyly. A series of dental abnormalities, including hypomorphic enamel formation, has been reported in patients with EvC. Despite previous studies that attempted to uncover the mechanism leading to abnormal tooth development, little is known regarding how hypomorphic enamel is formed in patients with EvC. In the current study, using Evc2/Limbin mutant mice we recently generated, we analyzed enamel formation in the mouse incisor. Consistent with symptoms in human patients, we observed that Evc2 mutant mice had smaller incisors with enamel hypoplasia. Histologic observations coupled with ameloblast marker analyses suggested that Evc2 mutant preameloblasts were capable of differentiating to secretory ameloblasts; this process, however, was apparently delayed, due to delayed odontoblast differentiation, mediated by a limited number of dental mesenchymal stem cells in Evc2 mutant mice. This concept was further supported by the observation that dental mesenchymal-specific deletion of Evc2 phenocopied the tooth abnormalities in Evc2 mutants. Overall, our findings suggest that mutations in Evc2 affect dental mesenchymal stem cell homeostasis, which further leads to hypomorphic enamel formation.

  6. Predicting short-term weight loss using four leading health behavior change theories

    Directory of Open Access Journals (Sweden)

    Barata José T

    2007-04-01

    Full Text Available Background This study was conceived to analyze how exercise and weight management psychosocial variables, derived from several health behavior change theories, predict weight change in a short-term intervention. The theories under analysis were the Social Cognitive Theory, the Transtheoretical Model, the Theory of Planned Behavior, and Self-Determination Theory. Methods Subjects were 142 overweight and obese women (BMI = 30.2 ± 3.7 kg/m2; age = 38.3 ± 5.8 y), participating in a 16-week University-based weight control program. Body weight and a comprehensive psychometric battery were assessed at baseline and at program's end. Results Weight decreased significantly (-3.6 ± 3.4%). Conclusion The present models were able to predict 20-30% of variance in short-term weight loss and changes in weight management self-efficacy accounted for a large share of the predictive power. As expected from previous studies, exercise variables were only moderately associated with short-term outcomes; they are expected to play a larger explanatory role in longer-term results.
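    The figure of 20-30% of variance predicted corresponds to the R^2 of a regression of weight change on the psychosocial predictors. The sketch below shows, with synthetic data, how such an R^2 is computed; the variable names, effect sizes and noise level are assumptions, not the study's measurements.

        # Illustrative sketch (synthetic data): percent of variance in weight change
        # explained by psychosocial predictors, via ordinary least squares and R^2.
        import numpy as np

        rng = np.random.default_rng(42)
        n = 142                                        # sample size reported above
        self_efficacy_change = rng.normal(0, 1, n)     # standardized change score (assumed)
        exercise_motivation = rng.normal(0, 1, n)      # standardized score (assumed)
        weight_change = (-1.5 * self_efficacy_change
                         - 0.4 * exercise_motivation
                         + rng.normal(0, 2.5, n))      # synthetic outcome, % body weight

        X = np.column_stack([np.ones(n), self_efficacy_change, exercise_motivation])
        beta, *_ = np.linalg.lstsq(X, weight_change, rcond=None)
        pred = X @ beta
        ss_res = np.sum((weight_change - pred) ** 2)
        ss_tot = np.sum((weight_change - weight_change.mean()) ** 2)
        r2 = 1 - ss_res / ss_tot

        print(f"R^2 = {r2:.2f}  (the study reports roughly 0.20-0.30 for its models)")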

  7. Loss of arylformamidase with reduced thymidine kinase expression leads to impaired glucose tolerance

    Directory of Open Access Journals (Sweden)

    Alison J. Hugill

    2015-11-01

    Full Text Available Tryptophan metabolites have been linked in observational studies with type 2 diabetes, cognitive disorders, inflammation and immune system regulation. A rate-limiting enzyme in tryptophan conversion is arylformamidase (Afmid), and a double knockout of this gene and thymidine kinase (Tk) has been reported to cause renal failure and abnormal immune system regulation. In order to further investigate possible links between abnormal tryptophan catabolism and diabetes and to examine the effect of single Afmid knockout, we have carried out metabolic phenotyping of an exon 2 Afmid gene knockout. These mice exhibit impaired glucose tolerance, although their insulin sensitivity is unchanged in comparison to wild-type animals. This phenotype results from a defect in glucose stimulated insulin secretion and these mice show reduced islet mass with age. No evidence of a renal phenotype was found, suggesting that this published phenotype resulted from loss of Tk expression in the double knockout. However, despite specifically removing only exon 2 of Afmid in our experiments we also observed some reduction of Tk expression, possibly due to a regulatory element in this region. In summary, our findings support a link between abnormal tryptophan metabolism and diabetes and highlight beta cell function for further mechanistic analysis.

  8. Lamina specific loss of inhibition may lead to distinct neuropathic manifestations: a computational modeling approach

    Directory of Open Access Journals (Sweden)

    Erick Javier Argüello Prada

    Full Text Available Introduction It has been reported that inhibitory control at the superficial dorsal horn (SDH) can act in a regionally distinct manner, which suggests that regionally specific subpopulations of SDH inhibitory neurons may prevent one specific neuropathic condition. Methods In an attempt to address this issue, we provide an alternative approach by integrating neuroanatomical information provided by different studies to construct a network-model of the SDH. We use Neuroids to simulate each neuron included in that model by adapting available experimental evidence. Results Simulations suggest that the maintenance of the proper level of pain sensitivity may be attributed to lamina II inhibitory neurons and, therefore, hyperalgesia may be elicited by suppression of the inhibitory tone at that lamina. In contrast, lamina III inhibitory neurons are more likely to be responsible for keeping the nociceptive pathway from the mechanoreceptive pathway, so loss of inhibitory control in that region may result in allodynia. The SDH network-model is also able to replicate non-linearities associated with pain processing, such as Aβ-fiber mediated analgesia and frequency-dependent increase of the neural response. Discussion By incorporating biophysical accuracy and newer experimental evidence, the SDH network-model may become a valuable tool for assessing the contribution of specific SDH connectivity patterns to noxious transmission in both physiological and pathological conditions.

  9. Loss of GSNOR1 Function Leads to Compromised Auxin Signaling and Polar Auxin Transport.

    Science.gov (United States)

    Shi, Ya-Fei; Wang, Da-Li; Wang, Chao; Culler, Angela Hendrickson; Kreiser, Molly A; Suresh, Jayanti; Cohen, Jerry D; Pan, Jianwei; Baker, Barbara; Liu, Jian-Zhong

    2015-09-01

    Cross talk between phytohormones, nitric oxide (NO), and auxin has been implicated in the control of plant growth and development. Two recent reports indicate that NO promoted auxin signaling but inhibited auxin transport probably through S-nitrosylation. However, genetic evidence for the effect of S-nitrosylation on auxin physiology has been lacking. In this study, we used a genetic approach to understand the broader role of S-nitrosylation in auxin physiology in Arabidopsis. We compared auxin signaling and transport in Col-0 and gsnor1-3, a loss-of-function GSNOR1 mutant defective in protein de-nitrosylation. Our results showed that auxin signaling was impaired in the gsnor1-3 mutant as revealed by significantly reduced DR5-GUS/DR5-GFP accumulation and compromised degradation of AXR3NT-GUS, a useful reporter in interrogating auxin-mediated degradation of Aux/IAA by auxin receptors. In addition, polar auxin transport was compromised in gsnor1-3, which was correlated with universally reduced levels of PIN or GFP-PIN proteins in the roots of the mutant in a manner independent of transcription and 26S proteasome degradation. Our results suggest that S-nitrosylation and GSNOR1-mediated de-nitrosylation contribute to auxin physiology, and impaired auxin signaling and compromised auxin transport are responsible for the auxin-related morphological phenotypes displayed by the gsnor1-3 mutant.

  10. Sirolimus Therapy for Patients With Lymphangioleiomyomatosis Leads to Loss of Chylous Ascites and Circulating LAM Cells.

    Science.gov (United States)

    Harari, Sergio; Elia, Davide; Torre, Olga; Bulgheroni, Elisabetta; Provasi, Elena; Moss, Joel

    2016-08-01

    A young woman received a diagnosis of abdominal, sporadic lymphangioleiomyomatosis (LAM) and multiple abdominal lymphangioleiomyomas and was referred for recurrent chylous ascites responding only to a fat-free diet. On admission, pulmonary function test (PFT) results showed a moderate reduction in the transfer factor for carbon monoxide with normal exercise performance. The serum vascular endothelial growth factor D (VEGF-D) level was 2,209 pg/mL. DNA sequences, amplified at loci kg8, D16S3395, D16S3024, D16S521, and D16S291 on chromosome 16p13.3, showed a loss of heterozygosity (LOH) only for kg8. Fat-free total parenteral nutrition in association with sirolimus (2 mg po daily) was initiated. Serum sirolimus levels were maintained at concentrations between 5 and 15 ng/mL. After 1 month, reintroduction of a low-fat oral feeding was achieved without recurrence of ascites. PFT results were stable. Interestingly, clinical improvement was associated with a reduction in the VEGF-D serum level (1,558 pg/mL). LOH at the kg8 biomarker in blood LAM cells was no longer detected.

  11. DENTAL CARIES LEADING TO PREMATURE LOSS OF BABY TEETH- IMPLICATIONS AND MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Rachana Bahuguna

    2011-12-01

    Full Text Available Dental caries is a destructive process causing decalcification of the tooth enamel and leading to continued destruction of enamel and dentin, and cavitation of the tooth. Dental caries can occur soon after eruption of the primary teeth, starting at 6 months of age. Primary teeth are present for a reason. One key reason is that they save space for the permanent tooth, which will erupt into its position when the deciduous/primary tooth is lost normally. If a primary tooth (baby or milk tooth) has to be removed early due to, say, an abscess, which is mostly a result of dental caries, a space maintainer may be recommended to save the space. If the space is not preserved, the other teeth may drift, causing difficult-to-treat crowding and orthodontic problems. These "spacers" are placed temporarily and are not permanent. They are removed when the new tooth (usually a bicuspid) erupts or the abutment teeth get loose.

  12. An analysis of cross-sectional differences in big and non-big public accounting firms' audit programs

    NARCIS (Netherlands)

    Blokdijk, J.H. (Hans); Drieenhuizen, F.; Stein, M.T.; Simunic, D.A.

    2006-01-01

    A significant body of prior research has shown that audits by the Big 5 (now Big 4) public accounting firms are quality differentiated relative to non-Big 5 audits. This result can be derived analytically by assuming that Big 5 and non-Big 5 firms face different loss functions for "audit failures" a

  13. Human impact on atolls leads to coral loss and community homogenisation: a modeling study.

    Directory of Open Access Journals (Sweden)

    Bernhard M Riegl

    Full Text Available We explore impacts on pristine atolls subjected to anthropogenic near-field (human habitation) and far-field (climate and environmental change) pressure. Using literature data of human impacts on reefs, we parameterize forecast models to evaluate trajectories in coral cover under impact scenarios that primarily act via recruitment and increased mortality of larger corals. From surveys across the Chagos, we investigate the regeneration dynamics of coral populations distant from human habitation after natural disturbances. Using a size-based mathematical model based on a time-series of coral community and population data from 1999-2006, we provide hind- and forecast data for coral population dynamics within lagoons and on ocean-facing reefs, verified against monitoring from 1979-2009. Environmental data (currents, temperatures) were used for calibration. The coral community was simplified into growth typologies: branching and encrusting, arborescent, and massive corals. Community patterns observed in the field were influenced by bleaching-related mortality, most notably in 1998. Survival had been highest in deep lagoonal settings, which suggests a refuge. Recruitment levels were higher in lagoons than on ocean-facing reefs. When stress from direct human pressure and from climate and environmental change is added as increased disturbance frequency and modified recruitment and mortality levels (due to eutrophication, overfishing, pollution, heat, acidification, etc.), the models suggest steep declines in coral populations and loss of community diversification among habitats. We found it likely that degradation of lagoonal coral populations would impact the regeneration potential of all coral populations, also on ocean-facing reefs, thus decreasing reef resilience on the entire atoll.
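    A size-based projection of the kind described above tracks colonies through size classes and applies survival, growth, recruitment and mortality rates each year. The following is a deliberately tiny, illustrative sketch of that structure only; the size classes and transition rates are invented and are not the parameters fitted in the study.

        # Illustrative sketch: a small size-structured projection of coral colonies,
        # contrasting low and high anthropogenic pressure via recruitment and mortality.
        # All rates are invented for illustration.
        import numpy as np

        def project(recruitment, mortality, years=30):
            # Three size classes: recruits, medium colonies, large colonies.
            growth = np.array([[0.0, 0.0, 0.0],
                               [0.4, 0.6, 0.0],
                               [0.0, 0.3, 0.9]])      # survival/growth transitions per year
            n = np.array([50.0, 30.0, 20.0])          # initial colonies per class
            for _ in range(years):
                n = growth @ n * (1.0 - mortality)    # disturbance/press mortality
                n[0] += recruitment * n[2]            # recruits produced by large colonies
            return n.sum()

        print("low pressure :", round(project(recruitment=0.8, mortality=0.05)))
        print("high pressure:", round(project(recruitment=0.3, mortality=0.20)))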

  14. Loss of runt-related transcription factor 3 expression leads hepatocellular carcinoma cells to escape apoptosis

    Directory of Open Access Journals (Sweden)

    Nakamura Shinichiro

    2011-01-01

    Full Text Available Background Runt-related transcription factor 3 (RUNX3) is known as a tumor suppressor gene for gastric cancer and other cancers, and this gene may be involved in the development of hepatocellular carcinoma (HCC). Methods RUNX3 expression was analyzed by immunoblot and immunohistochemistry in HCC cells and tissues, respectively. Hep3B cells, lacking endogenous RUNX3, were introduced with RUNX3 constructs. Cell proliferation was measured using the MTT assay and apoptosis was evaluated using DAPI staining. Apoptosis signaling was assessed by immunoblot analysis. Results RUNX3 protein expression was frequently inactivated in the HCC cell lines (91%) and tissues (90%). RUNX3 expression inhibited 90 ± 8% of cell growth at 72 h in serum-starved Hep3B cells. After 48 hours of serum starvation, the percentage of apoptotic cells reached 31 ± 4% and 4 ± 1% in RUNX3-expressing Hep3B and control cells, respectively. Apoptotic activity was increased by Bim expression and caspase-3 and caspase-9 activation. Conclusion RUNX3 expression enhanced serum starvation-induced apoptosis in HCC cell lines. RUNX3 is deleted or weakly expressed in HCC, which leads to tumorigenesis by escaping apoptosis.

  15. Comparison of Inventory Systems with Service, Positive Lead-Time, Loss, and Retrial of Customers

    Directory of Open Access Journals (Sweden)

    A. Krishnamoorthy

    2007-01-01

    Full Text Available We analyze and compare three (s,S) inventory systems with positive service time and retrial of customers. In all of these systems, arrivals of customers form a Poisson process and service times are exponentially distributed. When the inventory level depletes to s due to services, an order of replenishment is placed. The lead-time follows an exponential distribution. In model I, an arriving customer, finding the inventory dry or the server busy, proceeds to an orbit with probability γ and is lost forever with probability (1−γ). A retrial customer in the orbit, finding the inventory dry or the server busy, returns to the orbit with probability δ and is lost forever with probability (1−δ). In addition to the description in model I, we provide a buffer of varying (finite) capacity equal to the current inventory level for model II and another having capacity equal to the maximum inventory level S for model III. In models II and III, an arriving customer, finding the buffer full, proceeds to an orbit with probability γ and is lost forever with probability (1−γ). A retrial customer in the orbit, finding the buffer full, returns to the orbit with probability δ and is lost forever with probability (1−δ). In all these models, the inter-retrial times are exponentially distributed with linear rate. Using the matrix-analytic method, we study these inventory models. Some measures of the system performance in the steady state are derived. A suitable cost function is defined for all three cases and analyzed using graphical illustrations.
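    As a complement to the matrix-analytic treatment described above, the following is a minimal discrete-event simulation sketch of model I only, useful for sanity-checking steady-state performance measures; all parameter values (the arrival, service, lead-time and retrial rates, and the probabilities γ and δ) are assumptions chosen for illustration, not values from the paper.

        # Minimal sketch of "model I": (s,S) inventory, Poisson arrivals, exponential
        # service and lead time, an orbit of retrying customers with linear retrial rate.
        import math
        import random

        def simulate_model_I(s=2, S=5, lam=1.0, mu=1.5, beta=0.5, theta=0.4,
                             gamma=0.7, delta=0.8, horizon=200_000.0, seed=1):
            """Event-driven simulation with assumed parameter values."""
            rng = random.Random(seed)
            exp = lambda rate: rng.expovariate(rate) if rate > 0 else math.inf

            t, inv, busy, orbit, ordered = 0.0, S, False, 0, False
            served = lost = 0
            t_arrival, t_service, t_restock, t_retrial = exp(lam), math.inf, math.inf, math.inf

            while t < horizon:
                t = min(t_arrival, t_service, t_restock, t_retrial)
                if t == t_arrival:                      # primary (Poisson) arrival
                    if inv == 0 or busy:
                        if rng.random() < gamma:
                            orbit += 1                  # joins the orbit
                        else:
                            lost += 1                   # lost forever
                    else:
                        busy, t_service = True, t + exp(mu)
                    t_arrival = t + exp(lam)
                elif t == t_service:                    # service completion uses one item
                    busy, inv, served, t_service = False, inv - 1, served + 1, math.inf
                    if inv == s and not ordered:        # reorder point reached
                        ordered, t_restock = True, t + exp(beta)
                elif t == t_restock:                    # replenishment brings level to S
                    inv, ordered, t_restock = S, False, math.inf
                else:                                   # retrial from the orbit
                    if inv == 0 or busy:
                        if rng.random() >= delta:       # leaves the orbit, lost forever
                            orbit, lost = orbit - 1, lost + 1
                    else:
                        orbit, busy, t_service = orbit - 1, True, t + exp(mu)
                # exponential clocks are memoryless, so the retrial clock can be redrawn
                t_retrial = t + exp(theta * orbit)

            return served / horizon, lost / horizon, orbit

        print("throughput, loss rate, orbit size at end:", simulate_model_I())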

  16. Big Data Leads Us into the Intelligent Era

    Institute of Scientific and Technical Information of China (English)

    张振兴; 牟如玲

    2014-01-01

    With the coming of the big data era, all kinds of intelligent new applications are emerging. By examining what big data really means, analyzing its current state and the difficulties it faces, and summarizing the significance of big data analysis, we offer suggestions on the techniques and methods of big data analysis and on how big data work should be carried out.

  17. Lack of autophagy in the hematopoietic system leads to loss of hematopoietic stem cell function and dysregulated myeloid proliferation.

    Science.gov (United States)

    Mortensen, Monika; Watson, Alexander Scarth; Simon, Anna Katharina

    2011-09-01

    The regulated lysosomal degradation pathway of autophagy prevents cellular damage and thus protects from malignant transformation. Autophagy is also required for the maturation of various hematopoietic lineages, namely the erythroid and lymphoid ones, yet its role in adult hematopoietic stem cells (HSCs) remained unexplored. While normal HSCs sustain life-long hematopoiesis, malignant transformation of HSCs or early progenitors leads to leukemia. Mechanisms protecting HSCs from cellular damage are therefore essential to prevent hematopoietic malignancies. By conditionally deleting the essential autophagy gene Atg7 in the hematopoietic system, we found that autophagy is required for the maintenance of true HSCs and therefore also of downstream hematopoietic progenitors. Loss of autophagy in HSCs leads to the expansion of a progenitor cell population in the bone marrow, giving rise to a severe, invasive myeloproliferation, which strongly resembles human acute myeloid leukemia (AML).

  18. Big universe, big data

    DEFF Research Database (Denmark)

    Kremer, Jan; Stensbo-Smidt, Kristoffer; Gieseke, Fabian Cristian

    2017-01-01

    Astrophysics and cosmology are rich with data. The advent of wide-area digital cameras on large aperture telescopes has led to ever more ambitious surveys of the sky. Data volumes of entire surveys a decade ago can now be acquired in a single night and real-time analysis is often desired. Thus, modern astronomy requires big data know-how; in particular it demands highly efficient machine learning and image analysis algorithms. But scalability is not the only challenge: astronomy applications touch several current machine learning research questions, such as learning from biased data and dealing with label and measurement noise. We argue that this makes astronomy a great domain for computer science research, as it pushes the boundaries of data analysis. In the following, we will present this exciting application area for data scientists. We will focus on exemplary results, discuss main challenges...

  19. Big Data Analytics

    Indian Academy of Sciences (India)

    2016-08-01

    The volume and variety of data being generated using computers is doubling every two years. It is estimated that in 2015, 8 zettabytes (zetta = 10^21) were generated, which consisted mostly of unstructured data such as emails, blogs, Twitter, Facebook posts, images, and videos. This is called big data. It is possible to analyse such huge data collections with clusters of thousands of inexpensive computers to discover patterns in the data that have many applications. But analysing massive amounts of data available in the Internet has the potential of impinging on our privacy. Inappropriate analysis of big data can lead to misleading conclusions. In this article, we explain what big data is, how it is analysed, and give some case studies illustrating the potentials and pitfalls of big data analytics.
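    The cluster-based analysis described above typically follows a map-reduce pattern: each machine summarises its own chunk of the data, and the partial results are then merged. The toy sketch below illustrates that pattern with a local process pool standing in for a cluster; the data and the word-count task are invented purely for illustration.

        # Toy sketch of the map-reduce pattern behind much cluster-based big-data analysis:
        # map each chunk to partial counts, then reduce (merge) the partials.
        from collections import Counter
        from multiprocessing import Pool

        def map_chunk(lines):
            """Count word occurrences in one chunk of the data (the 'map' step)."""
            counts = Counter()
            for line in lines:
                counts.update(line.lower().split())
            return counts

        def reduce_counts(partials):
            """Merge partial counts from all workers (the 'reduce' step)."""
            total = Counter()
            for partial in partials:
                total.update(partial)
            return total

        if __name__ == "__main__":
            # Stand-in for a distributed dataset of posts, logs or tweets.
            data = ["big data needs big clusters", "data about data is metadata",
                    "patterns emerge from big data"] * 1000
            chunks = [data[i::4] for i in range(4)]      # partition across 4 workers
            with Pool(4) as pool:
                partials = pool.map(map_chunk, chunks)
            print(reduce_counts(partials).most_common(3))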

  20. Big Data

    Directory of Open Access Journals (Sweden)

    Prachi More

    2013-05-01

    Full Text Available The demand for, and spurt in, the collection and accumulation of data has coined the new term "Big Data". Accidentally, incidentally and through the interaction of people, information, so-called data, is massively generated. This big data is to be used smartly and effectively. Computer scientists, physicists, economists, mathematicians, political scientists, bio-informaticists, sociologists and a wide variety of other intelligentsia debate over the potential benefits and costs of analysing information from Twitter, Google, Facebook, Wikipedia and every space where large groups of people leave digital traces and deposit data. Given the rise of Big Data as both a phenomenon and a methodological persuasion, it is time to start critically interrogating this phenomenon, its assumptions and its biases. Big data, which refers to data sets that are too big to be handled using existing database management tools, is emerging in many important applications, such as Internet search, business informatics, social networks, social media, genomics, and meteorology. Big data presents a grand challenge for database and data analytics research. This paper is a blend of non-technical and introductory-level technical detail, ideal for the novice. We conclude with some technical challenges as well as solutions that can be used to address these challenges. Big Data differs from other data in five characteristics: volume, variety, value, velocity and complexity. The article will focus on some current and future cases and causes for Big Data.

  1. Big data

    DEFF Research Database (Denmark)

    Madsen, Anders Koed; Flyverbom, Mikkel; Hilbert, Martin

    2016-01-01

    The claim that big data can revolutionize strategy and governance in the context of international relations is increasingly hard to ignore. Scholars of international political sociology have mainly discussed this development through the themes of security and surveillance. The aim of this paper...... is to outline a research agenda that can be used to raise a broader set of sociological and practice-oriented questions about the increasing datafication of international relations and politics. First, it proposes a way of conceptualizing big data that is broad enough to open fruitful investigations...... into the emerging use of big data in these contexts. This conceptualization includes the identification of three moments contained in any big data practice. Second, it suggests a research agenda built around a set of subthemes that each deserve dedicated scrutiny when studying the interplay between big data...

  2. Big data

    DEFF Research Database (Denmark)

    Madsen, Anders Koed; Ruppert, Evelyn; Flyverbom, Mikkel

    2016-01-01

    The claim that big data can revolutionize strategy and governance in the context of international relations is increasingly hard to ignore. Scholars of international political sociology have mainly discussed this development through the themes of security and surveillance. The aim of this paper...... is to outline a research agenda that can be used to raise a broader set of sociological and practice-oriented questions about the increasing datafication of international relations and politics. First, it proposes a way of conceptualizing big data that is broad enough to open fruitful investigations...... into the emerging use of big data in these contexts. This conceptualization includes the identification of three moments contained in any big data practice. Second, it suggests a research agenda built around a set of sub-themes that each deserve dedicated scrutiny when studying the interplay between big...

  3. Loss of cargo binding in the human myosin VI deafness mutant (R1166X) leads to increased actin filament binding.

    Science.gov (United States)

    Arden, Susan D; Tumbarello, David A; Butt, Tariq; Kendrick-Jones, John; Buss, Folma

    2016-10-01

    Mutations in myosin VI have been associated with autosomal-recessive (DFNB37) and autosomal-dominant (DFNA22) deafness in humans. Here, we characterise a myosin VI nonsense mutation (R1166X) that was identified in a family with hereditary hearing loss in Pakistan. This mutation leads to the deletion of the C-terminal 120 amino acids of the myosin VI cargo-binding domain, which includes the WWY-binding motif for the adaptor proteins LMTK2, Tom1 as well as Dab2. Interestingly, compromising myosin VI vesicle-binding ability by expressing myosin VI with the R1166X mutation or with single point mutations in the adaptor-binding sites leads to increased F-actin binding of this myosin in vitro and in vivo. As our results highlight the importance of cargo attachment for regulating actin binding to the motor domain, we perform a detailed characterisation of adaptor protein binding and identify single amino acids within myosin VI required for binding to cargo adaptors. We not only show that the adaptor proteins can directly interact with the cargo-binding tail of myosin VI, but our in vitro studies also suggest that multiple adaptor proteins can bind simultaneously to non-overlapping sites in the myosin VI tail. In conclusion, our characterisation of the human myosin VI deafness mutant (R1166X) suggests that defects in cargo binding may leave myosin VI in a primed/activated state with an increased actin-binding ability.

  4. Big Egos in Big Science

    DEFF Research Database (Denmark)

    Jeppesen, Jacob; Vaarst Andersen, Kristina; Lauto, Giancarlo;

    and locations, having a diverse knowledge set and capable of tackling more and more complex problems. This poses the question of whether Big Egos continue to dominate in this rising paradigm of big science. Using a dataset consisting of full bibliometric coverage from a Large Scale Research Facility, we utilize...

  5. Big and Small

    CERN Document Server

    Ekers, R D

    2010-01-01

    Technology leads discovery in astronomy, as in all other areas of science, so growth in technology leads to the continual stream of new discoveries which makes our field so fascinating. Derek de Solla Price had analysed the discovery process in science in the 1960s and he introduced the terms 'Little Science' and 'Big Science' as part of his discussion of the role of exponential growth in science. I will show how the development of astronomical facilities has followed this same trend from 'Little Science' to 'Big Science' as a field matures. We can see this in the discoveries resulting in Nobel Prizes in astronomy. A more detailed analysis of discoveries in radio astronomy shows the same effect. I include a digression to look at how science progresses, comparing the roles of prediction, serendipity, measurement and explanation. Finally I comment on the differences between the 'Big Science' culture in Physics and in Astronomy.

  6. Big Data

    OpenAIRE

    2013-01-01

    The demand for, and spurt in, the collection and accumulation of data has coined the new term "Big Data". Accidentally, incidentally and through the interaction of people, information, so-called data, is massively generated. This big data is to be used smartly and effectively. Computer scientists, physicists, economists, mathematicians, political scientists, bio-informaticists, sociologists and a wide variety of other intelligentsia debate over the potential benefits and costs of analysing information from Twitter, Google,...

  7. Loss of vps54 function leads to vesicle traffic impairment, protein mis-sorting and embryonic lethality.

    Science.gov (United States)

    Karlsson, Páll; Droce, Aida; Moser, Jakob M; Cuhlmann, Simon; Padilla, Carolina Ortiz; Heimann, Peter; Bartsch, Jörg W; Füchtbauer, Annette; Füchtbauer, Ernst-Martin; Schmitt-John, Thomas

    2013-01-01

    The identification of the mutation causing the phenotype of the amyotrophic lateral sclerosis (ALS) model mouse, wobbler, has linked motor neuron degeneration with retrograde vesicle traffic. The wobbler mutation affects protein stability of Vps54, a ubiquitously expressed vesicle-tethering factor and leads to partial loss of Vps54 function. Moreover, the Vps54 null mutation causes embryonic lethality, which is associated with extensive membrane blebbing in the neural tube and is most likely a consequence of impaired vesicle transport. Investigation of cells derived from wobbler and Vps54 null mutant embryos demonstrates impaired retrograde transport of the Cholera-toxin B subunit to the trans-Golgi network and mis-sorting of mannose-6-phosphate receptors and cargo proteins dependent on retrograde vesicle transport. Endocytosis assays demonstrate no difference between wobbler and wild type cells, indicating that the retrograde vesicle traffic to the trans-Golgi network, but not endocytosis, is affected in Vps54 mutant cells. The results obtained on wobbler cells were extended to test the use of cultured skin fibroblasts from human ALS patients to investigate the retrograde vesicle traffic. Analysis of skin fibroblasts of ALS patients will support the investigation of the critical role of the retrograde vesicle transport in ALS pathogenesis and might yield a diagnostic prospect.

  8. Overexpression of galectin-7 in mouse epidermis leads to loss of cell junctions and defective skin repair.

    Directory of Open Access Journals (Sweden)

    Gaëlle Gendronneau

    Full Text Available The proteins of the galectin family are implicated in many cellular processes, including cell interactions, polarity, intracellular trafficking, and signal transduction. In human and mouse, galectin-7 is almost exclusively expressed in stratified epithelia, notably in the epidermis. Galectin-7 expression is also altered in several human tumors of epithelial origin. This study aimed at dissecting the consequences of galectin-7 overexpression on epidermis structure and functions in vivo. We established transgenic mice specifically overexpressing galectin-7 in the basal epidermal keratinocytes and analyzed the consequences on untreated skin and after UVB irradiation or mechanical injury. The intercellular cohesion of the epidermis is impaired in transgenic animals, with gaps developing between adjacent keratinocytes, associated with loss of adherens junctions. The epidermal architecture is aberrant with perturbations in the multilayered cellular organisation of the tissue, and structural defects in the basement membrane. These transgenic animals displayed a reduced re-epithelialisation potential following superficial wound, due to a defective collective migration of keratinocytes. Finally, a single mild dose of UVB induced an abnormal apoptotic response in the transgenic epidermis. These results indicate that an excess of galectin-7 leads to a destabilisation of adherens junctions associated with defects in epidermal repair. As this phenotype shares similarities with that of galectin-7 null mutant mice, we conclude that a critical level of this protein is required for maintaining proper epidermal homeostasis. This study brings new insight into the mode of action of galectins in normal and pathological situations.

  9. Big data challenges

    DEFF Research Database (Denmark)

    Bachlechner, Daniel; Leimbach, Timo

    2016-01-01

    Although reports on big data success stories have been accumulating in the media, most organizations dealing with high-volume, high-velocity and high-variety information assets still face challenges. Only a thorough understanding of these challenges puts organizations into a position in which...... they can make an informed decision for or against big data, and, if the decision is positive, overcome the challenges smoothly. The combination of a series of interviews with leading experts from enterprises, associations and research institutions, and focused literature reviews allowed not only...... framework are also relevant. For large enterprises and startups specialized in big data, it is typically easier to overcome the challenges than it is for other enterprises and public administration bodies....

  10. Mitotic defects lead to pervasive aneuploidy and accompany loss of RB1 activity in mouse LmnaDhe dermal fibroblasts.

    Directory of Open Access Journals (Sweden)

    C Herbert Pratt

    Full Text Available BACKGROUND: Lamin A (LMNA) is a component of the nuclear lamina and is mutated in several human diseases, including Emery-Dreifuss muscular dystrophy (EDMD; OMIM ID# 181350) and the premature aging syndrome Hutchinson-Gilford progeria syndrome (HGPS; OMIM ID# 176670). Cells from progeria patients exhibit cell cycle defects in both interphase and mitosis. Mouse models with loss of LMNA function have reduced Retinoblastoma protein (RB1) activity, leading to aberrant cell cycle control in interphase, but how mitosis is affected by LMNA is not well understood. RESULTS: We examined the cell cycle and structural phenotypes of cells from mice with the Lmna allele Disheveled hair and ears (LmnaDhe). We found that dermal fibroblasts from heterozygous LmnaDhe (LmnaDhe/+) mice exhibit many phenotypes of human laminopathy cells. These include severe perturbations to the nuclear shape and lamina, increased DNA damage, and slow growth rates due to mitotic delay. Interestingly, LmnaDhe/+ fibroblasts also had reduced levels of hypophosphorylated RB1 and of the non-SMC condensin II subunit D3 (NCAP-D3), a mitosis-specific centromere condensin subunit that depends on RB1 activity. Mitotic checkpoint control by mitotic arrest deficient-like 1 (MAD2L1) was also perturbed in LmnaDhe/+ cells. LmnaDhe/+ fibroblasts were consistently aneuploid and had higher levels of micronuclei and anaphase bridges than normal fibroblasts, consistent with chromosome segregation defects. CONCLUSIONS: These data indicate that RB1 may be a key regulator of cellular phenotype in laminopathy-related cells, and suggest that the effects of LMNA on RB1 include both interphase and mitotic cell cycle control.

  11. Big Surveys, Big Data Centres

    Science.gov (United States)

    Schade, D.

    2016-06-01

    Well-designed astronomical surveys are powerful and have consistently been keystones of scientific progress. The Byurakan Surveys, using a Schmidt telescope with an objective prism, produced a list of about 3000 UV-excess Markarian galaxies, but these objects have stimulated an enormous amount of further study and appear in over 16,000 publications. The CFHT Legacy Surveys used a wide-field imager to cover thousands of square degrees, and those surveys are mentioned in over 1100 publications since 2002. Both ground- and space-based astronomy have been increasing their investments in survey work. Survey instrumentation strives toward fair samples and large sky coverage and therefore tends to produce massive datasets. Thus we are faced with the "big data" problem in astronomy. Survey datasets require specialized approaches to data management, and big data places additional challenging requirements on it. If the term "big data" is defined as data collections that are too large to move, then there are profound implications for the infrastructure that supports big data science. The current model of data centres is obsolete. In the era of big data, the central problem is how to create architectures that effectively manage the relationship between data collections, networks, processing capabilities, and software, given the science requirements of the projects that need to be executed. A stand-alone data silo cannot support big data science. I'll describe the current efforts of the Canadian community to deal with this situation and our successes and failures. I'll talk about how we are planning in the next decade to try to create a workable and adaptable solution to support big data science.

  12. Big Opportunities and Big Concerns of Big Data in Education

    Science.gov (United States)

    Wang, Yinying

    2016-01-01

    Against the backdrop of the ever-increasing influx of big data, this article examines the opportunities and concerns over big data in education. Specifically, this article first introduces big data, followed by delineating the potential opportunities of using big data in education in two areas: learning analytics and educational policy. Then, the…

  13. Big Dreams

    Science.gov (United States)

    Benson, Michael T.

    2015-01-01

    The Keen Johnson Building is symbolic of Eastern Kentucky University's historic role as a School of Opportunity. It is a place that has inspired generations of students, many from disadvantaged backgrounds, to dream big dreams. The construction of the Keen Johnson Building was inspired by a desire to create a student union facility that would not…

  14. Human motor neuron progenitor transplantation leads to endogenous neuronal sparing in 3 models of motor neuron loss.

    Science.gov (United States)

    Wyatt, Tanya J; Rossi, Sharyn L; Siegenthaler, Monica M; Frame, Jennifer; Robles, Rockelle; Nistor, Gabriel; Keirstead, Hans S

    2011-01-01

    Motor neuron loss is characteristic of many neurodegenerative disorders and results in rapid loss of muscle control, paralysis, and eventual death in severe cases. In order to investigate the neurotrophic effects of a motor neuron lineage graft, we transplanted human embryonic stem cell-derived motor neuron progenitors (hMNPs) and examined their histopathological effect in three animal models of motor neuron loss. Specifically, we transplanted hMNPs into rodent models of SMA (Δ7SMN), ALS (SOD1 G93A), and spinal cord injury (SCI). The transplanted cells survived and differentiated in all models. In addition, we have also found that hMNPs secrete physiologically active growth factors in vivo, including NGF and NT-3, which significantly enhanced the number of spared endogenous neurons in all three animal models. The ability to maintain dying motor neurons by delivering motor neuron-specific neurotrophic support represents a powerful treatment strategy for diseases characterized by motor neuron loss.

  15. 数字技术开辟牙体牙髓创新之路%Overall digitalization: leading innovation of endodontics in big data era

    Institute of Scientific and Technical Information of China (English)

    凌均棨

    2016-01-01

    In big data era,digital technologies bring great challenges and opportunities to modern stomatology.The applications of digital technologies,such as cone-beam CT(CBCT),computer aided design,(CAD) and computer aided manufacture(CAM),3D printing and digital approaches for education,provide new concepts and patterns to the treatment and study of endodontic diseases.This review provides an overview of the application and prospect of commonly used digital technologies in the development of endodontics.%在大数据时代,数字技术为现代口腔医学带来了新的挑战和机遇.锥形束CT、计算机辅助设计与制作、3D打印、显微CT和数字化教学等现代数字技术的应用,为牙体牙髓病诊疗和研究提供了新的理念和模式,概述常用数字技术在牙体牙髓病学科发展中的应用与展望.

  16. Activation of the canonical Wnt pathway leads to loss of hematopoietic stem cell repopulation and multilineage differentiation block

    DEFF Research Database (Denmark)

    Kirstetter, Peggy; Anderson, Kristina; Porse, Bo T

    2006-01-01

    Wnt signaling increases hematopoietic stem cell self-renewal and is activated in both myeloid and lymphoid malignancies, indicating involvement in both normal and malignant hematopoiesis. We report here activated canonical Wnt signaling in the hematopoietic system through conditional expression of a stable form of beta-catenin. This enforced expression led to hematopoietic failure associated with loss of myeloid lineage commitment at the granulocyte-macrophage progenitor stage; blocked erythrocyte differentiation; disruption of lymphoid development; and loss of repopulating stem cell activity. Loss of hematopoietic stem cell function was associated with decreased expression of Cdkn1a (encoding the cell cycle inhibitor p21(cdk)), Sfpi1, Hoxb4 and Bmi1 (encoding the transcription factors PU.1, HoxB4 and Bmi-1, respectively) and altered integrin expression in Lin(-)Sca-1(+)c-Kit(+) cells, whereas PU.1…

  17. Big Data: Implications for Health System Pharmacy.

    Science.gov (United States)

    Stokes, Laura B; Rogers, Joseph W; Hertig, John B; Weber, Robert J

    2016-07-01

    Big Data refers to datasets that are so large and complex that traditional methods and hardware for collecting, sharing, and analyzing them are inadequate. Big Data that is accurate leads to more confident decision making, improved operational efficiency, and reduced costs. The rapid growth of health care information results in Big Data around health services, treatments, and outcomes, and Big Data can be used to analyze the benefit of health system pharmacy services. The goal of this article is to provide a perspective on how Big Data can be applied to health system pharmacy. It will define Big Data, describe the impact of Big Data on population health, review specific implications of Big Data in health system pharmacy, and describe an approach for pharmacy leaders to effectively use Big Data. A few strategies involved in managing Big Data in health system pharmacy include identifying potential opportunities for Big Data, prioritizing those opportunities, addressing privacy concerns, promoting data transparency, and communicating outcomes. As health care information expands in its content and becomes more integrated, Big Data can enhance the development of patient-centered pharmacy services.

  18. Human Motor Neuron Progenitor Transplantation Leads to Endogenous Neuronal Sparing in 3 Models of Motor Neuron Loss

    Directory of Open Access Journals (Sweden)

    Tanya J. Wyatt

    2011-01-01

    Full Text Available Motor neuron loss is characteristic of many neurodegenerative disorders and results in rapid loss of muscle control, paralysis, and eventual death in severe cases. In order to investigate the neurotrophic effects of a motor neuron lineage graft, we transplanted human embryonic stem cell-derived motor neuron progenitors (hMNPs and examined their histopathological effect in three animal models of motor neuron loss. Specifically, we transplanted hMNPs into rodent models of SMA (Δ7SMN, ALS (SOD1 G93A, and spinal cord injury (SCI. The transplanted cells survived and differentiated in all models. In addition, we have also found that hMNPs secrete physiologically active growth factors in vivo, including NGF and NT-3, which significantly enhanced the number of spared endogenous neurons in all three animal models. The ability to maintain dying motor neurons by delivering motor neuron-specific neurotrophic support represents a powerful treatment strategy for diseases characterized by motor neuron loss.

  19. Loss of lysosomal ion channel transient receptor potential channel mucolipin-1 (TRPML1) leads to cathepsin B-dependent apoptosis.

    Science.gov (United States)

    Colletti, Grace A; Miedel, Mark T; Quinn, James; Andharia, Neel; Weisz, Ora A; Kiselyov, Kirill

    2012-03-09

    Mucolipidosis type IV (MLIV) is a lysosomal storage disease caused by mutations in the gene MCOLN1, which codes for the transient receptor potential family ion channel TRPML1. MLIV has an early onset and is characterized by developmental delays, motor and cognitive deficiencies, gastric abnormalities, retinal degeneration, and corneal cloudiness. The degenerative aspects of MLIV have been attributed to cell death, whose mechanisms remain to be delineated in MLIV and in most other storage diseases. Here we report that an acute siRNA-mediated loss of TRPML1 specifically causes a leak of lysosomal protease cathepsin B (CatB) into the cytoplasm. CatB leak is associated with apoptosis, which can be prevented by CatB inhibition. Inhibition of the proapoptotic protein Bax prevents TRPML1 KD-mediated apoptosis but does not prevent cytosolic release of CatB. This is the first evidence of a mechanistic link between acute TRPML1 loss and cell death.

  20. Bacteriophage selection against a plasmid-encoded sex apparatus leads to the loss of antibiotic-resistance plasmids

    OpenAIRE

    Jalasvuori, Matti; Friman, Ville-Petri; Nieminen, Anne; Bamford, Jaana K.H.; Buckling, Angus

    2011-01-01

    Antibiotic-resistance genes are often carried by conjugative plasmids, which spread within and between bacterial species. It has long been recognized that some viruses of bacteria (bacteriophage; phage) have evolved to infect and kill plasmid-harbouring cells. This raises a question: can phages cause the loss of plasmid-associated antibiotic resistance by selecting for plasmid-free bacteria, or can bacteria or plasmids evolve resistance to phages in other ways? Here, we show that multiple ant...

  1. Networking for big data

    CERN Document Server

    Yu, Shui; Misic, Jelena; Shen, Xuemin (Sherman)

    2015-01-01

    Networking for Big Data supplies an unprecedented look at cutting-edge research on the networking and communication aspects of Big Data. Starting with a comprehensive introduction to Big Data and its networking issues, it offers deep technical coverage of both theory and applications.The book is divided into four sections: introduction to Big Data, networking theory and design for Big Data, networking security for Big Data, and platforms and systems for Big Data applications. Focusing on key networking issues in Big Data, the book explains network design and implementation for Big Data. It exa

  2. Big Data and reality

    Directory of Open Access Journals (Sweden)

    Ryan Shaw

    2015-11-01

    Full Text Available DNA sequencers, Twitter, MRIs, Facebook, particle accelerators, Google Books, radio telescopes, Tumblr: what do these things have in common? According to the evangelists of “data science,” all of these are instruments for observing reality at unprecedentedly large scales and fine granularities. This perspective ignores the social reality of these very different technological systems, ignoring how they are made, how they work, and what they mean in favor of an exclusive focus on what they generate: Big Data. But no data, big or small, can be interpreted without an understanding of the process that generated them. Statistical data science is applicable to systems that have been designed as scientific instruments, but is likely to lead to confusion when applied to systems that have not. In those cases, a historical inquiry is preferable.

  3. Really big numbers

    CERN Document Server

    Schwartz, Richard Evan

    2014-01-01

    In the American Mathematical Society's first-ever book for kids (and kids at heart), mathematician and author Richard Evan Schwartz leads math lovers of all ages on an innovative and strikingly illustrated journey through the infinite number system. By means of engaging, imaginative visuals and endearing narration, Schwartz manages the monumental task of presenting the complex concept of Big Numbers in fresh and relatable ways. The book begins with small, easily observable numbers before building up to truly gigantic ones, like a nonillion, a tredecillion, a googol, and even ones too huge for names! Any person, regardless of age, can benefit from reading this book. Readers will find themselves returning to its pages for a very long time, perpetually learning from and growing with the narrative as their knowledge deepens. Really Big Numbers is a wonderful enrichment for any math education program and is enthusiastically recommended to every teacher, parent and grandparent, student, child, or other individual i...

  4. Natural experiment demonstrates that bird loss leads to cessation of dispersal of native seeds from intact to degraded forests.

    Science.gov (United States)

    Caves, Eleanor M; Jennings, Summer B; Hillerislambers, Janneke; Tewksbury, Joshua J; Rogers, Haldre S

    2013-01-01

    In healthy forests, vertebrate frugivores move seeds from intact to degraded forests, aiding in the passive regeneration of degraded forests. Yet vertebrate frugivores are declining around the world, and little is known about the impact of this loss on regeneration of degraded areas. Here, we use a unique natural experiment to assess how complete vertebrate frugivore loss affects native seed rain in degraded forest. All native vertebrate frugivores (which were primarily avian frugivores) have been functionally extirpated from the island of Guam by the invasive brown tree snake (Boiga irregularis), whereas the nearby island of Saipan has a relatively intact vertebrate frugivore community. We captured seed rain along transects extending from intact into degraded forest and compared the species richness, density and condition of the seed rain from native bird-dispersed tree species between the two islands. Considering seeds from native bird-dispersed species, approximately 1.66 seeds landed per 26 days in each square meter of degraded forest on Saipan, whereas zero seeds landed per 26 days per square meter in degraded forest on Guam. Additionally, on Saipan, 69% of native bird-dispersed seeds in intact forest and 77% of seeds in degraded forest lacked fleshy fruit pulp, suggesting ingestion by birds, compared to 0% of all seeds on Guam. Our results show an absence of seed rain in degraded forests on Guam, correlated with the absence of birds, whereas on Saipan, frugivorous birds regularly disperse seeds into degraded forests, providing a mechanism for re-colonization by native plants. These results suggest that loss of frugivores will slow regeneration of degraded forests on Guam.

  5. Mountain Hike North of Big Cottonwood Canyon Road, Beginning at the S-Turn at Mill B., Near Hidden Falls, and Taking Trail Leading to Mt. Raymond and Other Interesting Places.

    Science.gov (United States)

    McDonald, Keith L.

    2004-11-01

    Our first objective is to leave the highway via Mill B North Fork by taking the Big Cottonwood Canyon trail that leads to Maxfield Basin, where 3 trails intersect, just s. of Mount Raymond (Elev. 10,241 ft.) the n. trail takes us down to the Mill Creek Canyon Road, at about 1 mi. (+) east of intersection with Church Park Picnic Ground road. At Maxfield Basin, again, the east trail skirts around Mt. Raymond and has another intersection with a trail running n. thru the area of Gobblers Knob (elev. 10,246 ft.), to White Fir Pass and turns w. at Bowman Fk. until it connects with Porter Fork and then the Mill Creek Road. The remaining trail at Mill A Basin, just e. of Mount Raymond, long before Gobblers Knob is seen, runs east past a spring, and connects to Butler Fork (which begins at 3.775 mi., measured along highway from Mill B, North Fork), which leads directly to Dog Lake. Evidently both Dog Lake and Lake Desolation (changing U.S. Geological Survey maps from Mount Aire, Utah to Park City West, Utah) have connected outlets, at least during certain times of the year. Following the trail s. e. (down) that follows near Summit Co. and Salt Lake County, we pass by the radio transmitters shown on Park City, West, Utah, map and finally enter the Brighton, Utah map with Scott Hill, Scott Pass, the important highway leading to Midway Reservoir, and beyond, Bloods Lake ( 9500 ft.), Clayton Peak (10,721 ft.) and Lake Lackawaxen ( 9980 ft.), our final destination showing through. One may easily walk the distance to lake Lackawaxen from Bloods Lake by staying south of the ridgecrest and by following the hollow down for a while. This completes our destination. Recall that the main roadway here was already passed over about 1/2 mile n. of Bloods Lake; this thoroughfare has its beginning at about 0.4 miles below (or North) of the Brighton Loop, where the road to city of Midway leaves the main Big Cottonwood Highway going n. and runs e., on the average, going past Midway Reservoir

  6. Rewiring yeast acetate metabolism through MPC1 loss of function leads to mitochondrial damage and decreases chronological lifespan

    Directory of Open Access Journals (Sweden)

    Ivan Orlandi

    2014-11-01

    Full Text Available During growth on fermentable substrates, such as glucose, pyruvate, which is the end-product of glycolysis, can be used to generate acetyl-CoA in the cytosol via acetaldehyde and acetate, or in mitochondria by direct oxidative decarboxylation. In the latter case, the mitochondrial pyruvate carrier (MPC is responsible for pyruvate transport into mitochondrial matrix space. During chronological aging, yeast cells which lack the major structural subunit Mpc1 display a reduced lifespan accompanied by an age-dependent loss of autophagy. Here, we show that the impairment of pyruvate import into mitochondria linked to Mpc1 loss is compensated by a flux redirection of TCA cycle intermediates through the malic enzyme-dependent alternative route. In such a way, the TCA cycle operates in a “branched” fashion to generate pyruvate and is depleted of intermediates. Mutant cells cope with this depletion by increasing the activity of glyoxylate cycle and of the pathway which provides the nucleocytosolic acetyl-CoA. Moreover, cellular respiration decreases and ROS accumulate in the mitochondria which, in turn, undergo severe damage. These acquired traits in concert with the reduced autophagy restrict cell survival of the mpc1∆ mutant during chronological aging. Conversely, the activation of the carnitine shuttle by supplying acetyl-CoA to the mitochondria is sufficient to abrogate the short-lived phenotype of the mutant.

  7. Bacteriophage selection against a plasmid-encoded sex apparatus leads to the loss of antibiotic-resistance plasmids.

    Science.gov (United States)

    Jalasvuori, Matti; Friman, Ville-Petri; Nieminen, Anne; Bamford, Jaana K H; Buckling, Angus

    2011-12-23

    Antibiotic-resistance genes are often carried by conjugative plasmids, which spread within and between bacterial species. It has long been recognized that some viruses of bacteria (bacteriophage; phage) have evolved to infect and kill plasmid-harbouring cells. This raises a question: can phages cause the loss of plasmid-associated antibiotic resistance by selecting for plasmid-free bacteria, or can bacteria or plasmids evolve resistance to phages in other ways? Here, we show that plasmids containing multiple antibiotic-resistance genes are stably maintained in both Escherichia coli and Salmonella enterica in the absence of phages, while the plasmid-dependent phage PRD1 causes a dramatic reduction in the frequency of antibiotic-resistant bacteria. The loss of antibiotic resistance in cells initially harbouring the RP4 plasmid was shown to result from evolution of phage resistance whereby bacterial cells expelled their plasmid (and hence the suitable receptor for phages). Phages also selected for a low frequency of plasmid-containing, phage-resistant bacteria, presumably as a result of modification of the plasmid-encoded receptor. However, these double-resistant mutants had a growth cost compared with phage-resistant but antibiotic-susceptible mutants and were unable to conjugate. These results suggest that bacteriophages could play a significant role in restricting the spread of plasmid-encoded antibiotic resistance.

  8. Loss of neurogenesis in Hydra leads to compensatory regulation of neurogenic and neurotransmission genes in epithelial cells.

    Science.gov (United States)

    Wenger, Y; Buzgariu, W; Galliot, B

    2016-01-05

    Hydra continuously differentiates a sophisticated nervous system made of mechanosensory cells (nematocytes) and sensory-motor and ganglionic neurons from interstitial stem cells. However, this dynamic adult neurogenesis is dispensable for morphogenesis. Indeed animals depleted of their interstitial stem cells and interstitial progenitors lose their active behaviours but maintain their developmental fitness, and regenerate and bud when force-fed. To characterize the impact of the loss of neurogenesis in Hydra, we first performed transcriptomic profiling at five positions along the body axis. We found neurogenic genes predominantly expressed along the central body column, which contains stem cells and progenitors, and neurotransmission genes predominantly expressed at the extremities, where the nervous system is dense. Next, we performed transcriptomics on animals depleted of their interstitial cells by hydroxyurea, colchicine or heat-shock treatment. By crossing these results with cell-type-specific transcriptomics, we identified epithelial genes up-regulated upon loss of neurogenesis: transcription factors (Dlx, Dlx1, DMBX1/Manacle, Ets1, Gli3, KLF11, LMX1A, ZNF436, Shox1), epitheliopeptides (Arminins, PW peptide), neurosignalling components (CAMK1D, DDCl2, Inx1), ligand-ion channel receptors (CHRNA1, NaC7), G-Protein Coupled Receptors and FMRFRL. Hence epitheliomuscular cells seemingly enhance their sensing ability when neurogenesis is compromised. This unsuspected plasticity might reflect the extended multifunctionality of epithelial-like cells in early eumetazoan evolution.

  9. Unraveling dielectric and electrical properties of ultralow-loss lead magnesium niobate titanate pyrochlore dielectric thin films for capacitive applications

    Science.gov (United States)

    Zhu, X. H.; Defaÿ, E.; Suhm, A.; Fribourg-Blanc, E.; Aïd, M.; Zhu, J. L.; Xiao, D. Q.; Zhu, J. G.

    2010-05-01

    PbO-MgO-Nb2O5-TiO2 (PMNT) pyrochlore thin films were prepared on Pt-coated silicon substrates by radio-frequency magnetron sputtering followed by post-deposition annealing. Very interestingly, these pyrochlore-structured PMNT thin films exhibited ultralow dielectric losses, with a typical loss tangent as low as 0.001, and relatively high dielectric constants, typically ε_r ≈ 170. It was found that the relative permittivity slightly but continuously increased upon cooling without any signature of a structural phase transition, displaying a quantum-paraelectric-like behavior; meanwhile, the PMNT pyrochlore thin films did not show any noticeable dielectric dispersion in the real part of the permittivity over a wide temperature range (77-400 K). Their dielectric responses could, however, be efficiently tuned by applying a dc electric field. A maximum applied bias field of 1 MV/cm resulted in a ≈20% tunability of the dielectric permittivity, giving rise to a fairly large coefficient of dielectric nonlinearity, ≈2.5×10^9 J C^-4 m^-5. Moreover, the PMNT pyrochlore films exhibited superior electrical insulation properties, with a relatively high breakdown field (E_breakdown ≈ 1.5 MV/cm) and a very low leakage current density of about 8.2×10^-7 A/cm^2 obtained at an electric field intensity as high as 500 kV/cm.
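
    For orientation, dielectric tunability is conventionally defined relative to the zero-bias permittivity; assuming that standard convention (the article may use a slightly different one), the quoted ≈20% figure at E_max = 1 MV/cm together with ε_r(0) ≈ 170 corresponds to the following worked relation:

\[
n_r \;=\; \frac{\varepsilon_r(0)-\varepsilon_r(E_{\max})}{\varepsilon_r(0)} \approx 0.20
\quad\Rightarrow\quad
\varepsilon_r(E_{\max}) \approx (1-0.20)\times 170 \approx 136 .
\]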

  10. Big Data

    DEFF Research Database (Denmark)

    Aaen, Jon; Nielsen, Jeppe Agger

    2016-01-01

    Big Data presents itself as one of the most hyped technological innovations of our time, proclaimed to hold the seed of new, valuable operational insights for private companies and public organisations. While the optimistic announcements are many, research on Big Data in the public sector has so far been limited. This article examines how the public health sector can reuse and exploit an ever-growing amount of data while taking public values into account. The article builds on a case study of the use of large amounts of health data in the Danish General Practice Database (Dansk AlmenMedicinsk Database, DAMD). The analysis shows that (re)use of data in new contexts involves a multifaceted trade-off, not only between economic rationales and quality considerations, but also concerning control over sensitive personal data and the ethical implications for the citizen. In the DAMD case, data are used on the one hand "in the service of the good cause" to…

  11. Big Man

    Institute of Scientific and Technical Information of China (English)

    郑秀文

    2012-01-01

    梁炳 ("Edmond") says that after his concert he will go travelling with his wife. No matter where on earth the plane touches down, having a companion by your side is happiness. His concert is called Big Man; at first I misread it as a Big Mac concert and wondered why anyone would hold a giant-hamburger concert. Ha! Only later did I realise I had misread it. Thinking about it, though, on the road to growing up who has not lived like a silly loaf of bread, a lump of dough exposed to the wide world? Time and life's experiences of every kind are the yeast, and over the years you and I ferment and grow. Friendship, too, is a yeast that spurs each other's growth; seeing that he has long since grown from a boy into a man, I realise I can no longer call myself a "girl" either. In my eyes his change has been a big one: the playful, outgoing personality has narrowed. As for the two of us now,…

  12. The influence of zinc on the uptake and loss of cadmium and lead in the woodlouse, Porcellio scaber (Isopoda, Oniscidea).

    Science.gov (United States)

    Witzel, B

    2000-09-01

    Uptake of cadmium, lead, and zinc was studied in juvenile Porcellio scaber in feeding experiments over 5 months. The metals were offered separately and in different combinations and concentrations in the food. The ability of P. scaber to eliminate the accumulated metals was studied subsequently for 3 months on uncontaminated food. Characteristic patterns of accumulation are described for the three metals. The combination of lead and zinc resulted in only minor differences in these patterns. On the other hand, the combination of zinc and cadmium at high concentrations completely changed the accumulation patterns for both metals. Not only cadmium but also zinc was excreted by P. scaber exclusively when the animals had been contaminated with both metals. In contrast both metals were stored permanently when offered separately. Possible reasons for the interactions of cadmium and zinc are discussed.

  13. Big data analytics turning big data into big money

    CERN Document Server

    Ohlhorst, Frank J

    2012-01-01

    Unique insights to implement big data analytics and reap big returns to your bottom line Focusing on the business and financial value of big data analytics, respected technology journalist Frank J. Ohlhorst shares his insights on the newly emerging field of big data analytics in Big Data Analytics. This breakthrough book demonstrates the importance of analytics, defines the processes, highlights the tangible and intangible values and discusses how you can turn a business liability into actionable material that can be used to redefine markets, improve profits and identify new business opportuni

  14. Plakophilin3 loss leads to an increase in PRL3 levels promoting K8 dephosphorylation, which is required for transformation and metastasis.

    Directory of Open Access Journals (Sweden)

    Nileema Khapare

    Full Text Available The desmosome anchors keratin filaments in epithelial cells, leading to the formation of a tissue-wide IF network. Loss of the desmosomal plaque protein plakophilin3 (PKP3) in HCT116 cells leads to an increase in neoplastic progression and metastasis, which was accompanied by an increase in K8 levels. The increase in K8 levels was due to an increase in the protein levels of the Phosphatase of Regenerating Liver 3 (PRL3), which results in a decrease in phosphorylation on K8. The increase in PRL3 and K8 protein levels could be reversed by introduction of an shRNA-resistant PKP3 cDNA. Inhibition of K8 expression in the PKP3 knockdown clone S10 led to a decrease in cell migration and lamellipodia formation. Further, the K8 PKP3 double knockdown clones showed a decrease in colony formation in soft agar and decreased tumorigenesis and metastasis in nude mice. These results suggest that a stabilisation of K8 filaments leading to an increase in migration and transformation may be one mechanism by which PKP3 loss leads to tumor progression and metastasis.

  15. Loss of zebrafish lgi1b leads to hydrocephalus and sensitization to pentylenetetrazol induced seizure-like behavior.

    Directory of Open Access Journals (Sweden)

    Yong Teng

    Full Text Available Mutations in the LGI1 gene predispose to a hereditary epilepsy syndrome, and LGI1 was the first gene associated with this disease that does not encode an ion channel protein. In zebrafish, there are two paralogs of the LGI1 gene, lgi1a and lgi1b. Knockdown of lgi1a results in a seizure-like hyperactivity phenotype with associated developmental abnormalities characterized by cellular loss in the eyes and brain. We have now generated knockdown morphants for the lgi1b gene, which also show developmental abnormalities but do not show a seizure-like behavior. Instead, the most striking phenotype involves significant enlargement of the ventricles (hydrocephalus). As shown for the lgi1a morphants, however, lgi1b morphants are also sensitized to PTZ-induced hyperactivity. The different phenotypes of the two lgi1 morphants support a subfunctionalization model for the two paralogs.

  16. Activation of PKA leads to mesenchymal-to-epithelial transition and loss of tumor-initiating ability.

    Science.gov (United States)

    Pattabiraman, Diwakar R; Bierie, Brian; Kober, Katharina Isabelle; Thiru, Prathapan; Krall, Jordan A; Zill, Christina; Reinhardt, Ferenc; Tam, Wai Leong; Weinberg, Robert A

    2016-03-04

    The epithelial-to-mesenchymal transition enables carcinoma cells to acquire malignancy-associated traits and the properties of tumor-initiating cells (TICs). TICs have emerged in recent years as important targets for cancer therapy, owing to their ability to drive clinical relapse and enable metastasis. Here, we propose a strategy to eliminate mesenchymal TICs by inducing their conversion to more epithelial counterparts that have lost tumor-initiating ability. We report that increases in intracellular levels of the second messenger, adenosine 3',5'-monophosphate, and the subsequent activation of protein kinase A (PKA) induce a mesenchymal-to-epithelial transition (MET) in mesenchymal human mammary epithelial cells. PKA activation triggers epigenetic reprogramming of TICs by the histone demethylase PHF2, which promotes their differentiation and loss of tumor-initiating ability. This study provides proof-of-principle for inducing an MET as differentiation therapy for TICs and uncovers a role for PKA in enforcing and maintaining the epithelial state.

  17. Genetic substitution of Cdk1 by Cdk2 leads to embryonic lethality and loss of meiotic function of Cdk2.

    Science.gov (United States)

    Satyanarayana, Ande; Berthet, Cyril; Lopez-Molina, Javier; Coppola, Vincenzo; Tessarollo, Lino; Kaldis, Philipp

    2008-10-01

    It was believed that Cdk2-cyclin E complexes are essential to drive cells through the G1-S phase transition. However, it was discovered recently that the mitotic kinase Cdk1 (Cdc2a) compensates for the loss of Cdk2. In the present study, we tested whether Cdk2 can compensate for the loss of Cdk1. We generated a knockin mouse in which the Cdk2 cDNA was knocked into the Cdk1 locus (Cdk1Cdk2KI). Substitution of both copies of Cdk1 by Cdk2 led to early embryonic lethality, even though Cdk2 was expressed from the Cdk1 locus. In addition, we generated Cdk2-/- Cdk1+/Cdk2KI mice in which one copy of Cdk2 and one copy of Cdk1 were expressed from the Cdk1 locus and the Cdk2 gene was deleted from the endogenous Cdk2 locus. We found that both male and female Cdk2-/- Cdk1+/Cdk2KI mice were sterile, similar to Cdk2-/- mice, even though they expressed the Cdk2 protein from the Cdk1 locus in testes. The translocational and cell cycle properties of knockin Cdk2 in Cdk2-/- Cdk1+/Cdk2KI cells were comparable to those of endogenous Cdk2, but we detected premature transcriptional activation of Cdk1 during liver regeneration in the absence of Cdk2. This study provides evidence of the molecular differences between Cdk2 and Cdk1 and highlights that the timing of transcriptional activation and the genetic locus play important roles in determining the function of Cdk proteins in vivo.

  18. DKK1 mediated inhibition of Wnt signaling in postnatal mice leads to loss of TEC progenitors and thymic degeneration.

    Directory of Open Access Journals (Sweden)

    Masako Osada

    Full Text Available BACKGROUND: Thymic epithelial cell (TEC) microenvironments are essential for the recruitment of T cell precursors from the bone marrow, as well as the subsequent expansion and selection of thymocytes resulting in a mature self-tolerant T cell repertoire. The molecular mechanisms, which control both the initial development and subsequent maintenance of these critical microenvironments, are poorly defined. Wnt signaling has been shown to be important to the development of several epithelial tissues and organs. Regulation of Wnt signaling has also been shown to impact both early thymocyte and thymic epithelial development. However, early blocks in thymic organogenesis or death of the mice have prevented analysis of a role of canonical Wnt signaling in the maintenance of TECs in the postnatal thymus. METHODOLOGY/PRINCIPAL FINDINGS: Here we demonstrate that tetracycline-regulated expression of the canonical Wnt inhibitor DKK1 in TECs localized in both the cortex and medulla of adult mice results in rapid thymic degeneration characterized by a loss of DeltaNP63(+), Foxn1(+) and Aire(+) TECs, loss of K5K8DP TECs thought to represent or contain an immature TEC progenitor, decreased TEC proliferation and the development of cystic structures, similar to an aged thymus. Removal of DKK1 from DKK1-involuted mice results in full recovery, suggesting that canonical Wnt signaling is required for the differentiation or proliferation of TEC populations needed for maintenance of properly organized adult thymic epithelial microenvironments. CONCLUSIONS/SIGNIFICANCE: Taken together, the results of this study demonstrate that canonical Wnt signaling within TECs is required for the maintenance of epithelial microenvironments in the postnatal thymus, possibly through effects on TEC progenitor/stem cell populations. Downstream targets of Wnt signaling, which are responsible for maintenance of these TEC progenitors, may provide useful targets for therapies aimed at…

  19. Big Data Knowledge Mining

    Directory of Open Access Journals (Sweden)

    Huda Umar Banuqitah

    2016-11-01

    Full Text Available The Big Data (BD) era has arrived. Big data applications have reached the point where information accumulation has grown beyond the ability of present software tools to capture, manage and process within a tolerably short time. Volume is not the only characteristic that defines big data; velocity, variety and value matter as well. Many resources contain BD that should be processed. The biomedical research literature is one among many domains that hides rich knowledge. MEDLINE is a huge biomedical research database which remains a significantly underutilized source of biological information. Discovering useful knowledge from such a huge corpus raises many problems related to the type of information, such as the domain concepts of the texts and the semantic relationships associated with them. In this paper, a two-level agent-based system for self-supervised relation extraction from MEDLINE using the Unified Medical Language System (UMLS) knowledge base is proposed. The model uses a self-supervised approach for relation extraction (RE) by constructing enhanced training examples using information from UMLS with hybrid text features. The model incorporates the Apache Spark and HBase big data technologies with multiple data mining and machine learning techniques within a multi-agent system (MAS). The system shows better results than the current state of the art and a naïve approach in terms of accuracy, precision, recall and F-score.
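
    The record names Apache Spark as one of the big data technologies in the pipeline but gives no implementation detail. The sketch below is a minimal, hypothetical PySpark fragment (not from the paper) of one early step such a system might perform: splitting MEDLINE-style abstracts into sentences and keeping candidate relation sentences. The file name, column names and keyword cues are illustrative assumptions; a real pipeline would map concepts with UMLS rather than match keywords.

```python
# Hypothetical sketch (not from the paper): a coarse candidate-filtering step
# for relation extraction over MEDLINE-style abstracts with Apache Spark.
# File name, column names and keyword cues are assumptions for illustration.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("medline-relation-candidates").getOrCreate()

# Assume each JSON record carries a "pmid" and a free-text "abstract" field.
abstracts = spark.read.json("medline_records.json")

# Split each abstract into sentences and keep only sentences containing a
# crude relation cue; these would feed the training-example construction step.
candidates = (
    abstracts
    .select("pmid", F.explode(F.split(F.col("abstract"), r"\.\s+")).alias("sentence"))
    .filter(F.col("sentence").rlike("(?i)(inhibit|activate|bind|regulate)"))
)

candidates.show(10, truncate=False)
spark.stop()
```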

  20. Activating mutation in a mucolipin transient receptor potential channel leads to melanocyte loss in varitint-waddler mice.

    Science.gov (United States)

    Xu, Haoxing; Delling, Markus; Li, Linyu; Dong, Xianping; Clapham, David E

    2007-11-13

    Transient receptor potential (TRP) genes of the mucolipin subfamily (TRPML1-3 and MCOLN1-3) are presumed to encode ion channel proteins of intracellular endosomes and lysosomes. Mutations in human TRPML1 (mucolipin 1/MCOLN1) result in mucolipidosis type IV, a severe inherited neurodegenerative disease associated with defective lysosomal biogenesis and trafficking. A mutation in mouse TRPML3 (A419P; TRPML3(Va)) results in the varitint-waddler (Va) phenotype. Va mice are deaf, exhibit circling behavior due to vestibular defects, and have variegated/dilute coat color as a result of pigmentation defects. Prior electrophysiological studies of presumed TRPML plasma membrane channels are contradictory and inconsistent with known TRP channel properties. Here, we report that the Va mutation produces a gain-of-function that allows TRPML1 and TRPML3 to be measured and identified as inwardly rectifying, proton-impermeant, Ca(2+)-permeant cation channels. TRPML3 is highly expressed in normal melanocytes. Melanocyte markers are lost in the Va mouse, suggesting that their variegated and hypopigmented fur is caused by severe alteration of melanocyte function or cell death. TRPML3(Va) expression in melanocyte cell lines results in high resting Ca(2+) levels, rounded, poorly adherent cells, and loss of membrane integrity. We conclude that the Va phenotype is caused by mutation-induced TRPML3 gain-of-function, resulting in cell death.

  1. Loss of Lkb1 and Pten Leads to Lung Squamous Cell Carcinoma with Elevated PD-L1 Expression

    Science.gov (United States)

    Xu, Chunxiao; Fillmore, Christine M.; Koyama, Shohei; Wu, Hongbo; Zhao, Yanqiu; Chen, Zhao; Herter-Sprie, Grit S.; Akbay, Esra A.; Tchaicha, Jeremy H.; Altabef, Abigail; Reibel, Jacob B.; Walton, Zandra; Ji, Hongbin; Watanabe, Hideo; Jänne, Pasi A.; Castrillon, Diego H.; Rustgi, Anil K.; Bass, Adam J.; Freeman, Gordon J.; Padera, Robert F.; Dranoff, Glenn; Hammerman, Peter S.; Kim, Carla F.; Wong, Kwok-Kin

    2014-01-01

    SUMMARY Lung squamous cell carcinoma (SCC) is a deadly disease for which current treatments are inadequate. We demonstrate that biallelic inactivation of Lkb1 and Pten in the mouse lung leads to SCC that recapitulates the histology, gene expression, and microenvironment found in human disease. Lkb1;Pten null (LP) tumors expressed the squamous markers KRT5, p63 and SOX2, and transcriptionally resembled the basal subtype of human SCC. In contrast to mouse adenocarcinomas, the LP tumors contained immune populations enriched for tumor-associated neutrophils. SCA1+NGFR+ fractions were enriched for tumor-propagating cells (TPCs) that could serially transplant the disease in orthotopic assays. TPCs in the LP model and NGFR+ cells in human SCCs highly expressed Pd-ligand-1 (PD-L1), suggesting a mechanism of immune escape for TPCs. PMID:24794706

  2. Impaired mechanical response of an EDMD mutation leads to motility phenotypes that are repaired by loss of prenylation.

    Science.gov (United States)

    Zuela, Noam; Zwerger, Monika; Levin, Tal; Medalia, Ohad; Gruenbaum, Yosef

    2016-05-01

    There are roughly 14 distinct heritable autosomal dominant diseases associated with mutations in lamins A/C, including Emery-Dreifuss muscular dystrophy (EDMD). The mechanical model proposes that the lamin mutations change the mechanical properties of muscle nuclei, leading to cell death and tissue deterioration. Here, we developed an experimental protocol that analyzes the effect of disease-linked lamin mutations on the response of nuclei to mechanical strain in living Caenorhabditis elegans. We found that the EDMD mutation L535P disrupts the nuclear mechanical response specifically in muscle nuclei. Inhibiting lamin prenylation rescued the mechanical response of the EDMD nuclei, reversed the muscle phenotypes and led to normal motility. The LINC complex and emerin were also required to regulate the mechanical response of C. elegans nuclei. This study provides evidence to support the mechanical model and offers a potential future therapeutic approach towards curing EDMD.

  3. Loss of cyclin-dependent kinase 5 from parvalbumin interneurons leads to hyperinhibition, decreased anxiety, and memory impairment.

    Science.gov (United States)

    Rudenko, Andrii; Seo, Jinsoo; Hu, Ji; Su, Susan C; de Anda, Froylan Calderon; Durak, Omer; Ericsson, Maria; Carlén, Marie; Tsai, Li-Huei

    2015-02-11

    Perturbations in fast-spiking parvalbumin (PV) interneurons are hypothesized to be a major component of various neuropsychiatric disorders; however, the mechanisms regulating PV interneurons remain mostly unknown. Recently, cyclin-dependent kinase 5 (Cdk5) has been shown to function as a major regulator of synaptic plasticity. Here, we demonstrate that genetic ablation of Cdk5 in PV interneurons in mouse brain leads to an increase in GABAergic neurotransmission and impaired synaptic plasticity. PVCre;fCdk5 mice display a range of behavioral abnormalities, including decreased anxiety and memory impairment. Our results reveal a central role of Cdk5 expressed in PV interneurons in gating inhibitory neurotransmission and underscore the importance of such regulation during behavioral tasks. Our findings suggest that Cdk5 can be considered a promising therapeutic target in a variety of conditions attributed to inhibitory interneuronal dysfunction, such as epilepsy, anxiety disorders, and schizophrenia.

  4. Challenges of Big Data Analysis.

    Science.gov (United States)

    Fan, Jianqing; Han, Fang; Liu, Han

    2014-06-01

    Big Data bring new opportunities to modern society and challenges to data scientists. On the one hand, Big Data hold great promise for discovering subtle population patterns and heterogeneities that are not possible with small-scale data. On the other hand, the massive sample size and high dimensionality of Big Data introduce unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation, incidental endogeneity, and measurement errors. These challenges are distinctive and require new computational and statistical paradigms. This article gives an overview of the salient features of Big Data and how these features drive a paradigm change in statistical and computational methods as well as computing architectures. We also provide various new perspectives on Big Data analysis and computation. In particular, we emphasize the viability of the sparsest solution in a high-confidence set and point out that the exogeneity assumptions in most statistical methods for Big Data cannot be validated due to incidental endogeneity; this can lead to wrong statistical inferences and consequently wrong scientific conclusions.
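
    The spurious-correlation challenge mentioned in this record is easy to see in a small simulation. The sketch below is purely illustrative (not from the article): with a fixed sample size, the largest sample correlation between a response and a growing pool of pure-noise features keeps rising even though every true correlation is zero.

```python
# Illustrative simulation (not from the article) of spurious correlation in
# high dimensions: the maximum sample correlation between y and p independent
# noise features grows with p, although the true correlation is always zero.
import numpy as np

rng = np.random.default_rng(0)
n = 100                                    # sample size
y = rng.standard_normal(n)                 # response, unrelated to the features

for p in (10, 1_000, 100_000):
    X = rng.standard_normal((n, p))        # p pure-noise features
    Xc = (X - X.mean(axis=0)) / X.std(axis=0)
    yc = (y - y.mean()) / y.std()
    corr = Xc.T @ yc / n                   # sample correlation of each feature with y
    print(f"p = {p:>7}: max |correlation| with pure noise = {np.abs(corr).max():.3f}")
```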

  5. Organizational Design Challenges Resulting From Big Data

    Directory of Open Access Journals (Sweden)

    Jay R. Galbraith

    2014-04-01

    Full Text Available Business firms and other types of organizations are feverishly exploring ways of taking advantage of the big data phenomenon. This article discusses firms that are at the leading edge of developing a big data analytics capability. Firms that are currently enjoying the most success in this area are able to use big data not only to improve their existing businesses but to create new businesses as well. Putting a strategic emphasis on big data requires adding an analytics capability to the existing organization. This transformation process results in power shifting to analytics experts and in decisions being made in real time.

  6. Loss of keratin 8 phosphorylation leads to increased tumor progression and correlates with clinico-pathological parameters of OSCC patients.

    Directory of Open Access Journals (Sweden)

    Hunain Alam

    Full Text Available BACKGROUND: Keratins are cytoplasmic intermediate filament proteins expressed in a tissue-specific and differentiation-dependent manner. Keratins 8 and 18 (K8 and K18) are predominantly expressed in simple epithelial tissues and perform both mechanical and regulatory functions. Aberrant expression of K8 and K18 is associated with neoplastic progression, invasion and poor prognosis in human oral squamous cell carcinomas (OSCCs). K8 and K18 undergo several post-translational modifications including phosphorylation, which are known to regulate their functions in various cellular processes. Although K8 and K18 phosphorylation is known to regulate the cell cycle, cell growth and apoptosis, its significance in cell migration and/or neoplastic progression is largely unknown. In the present study we have investigated the role of K8 phosphorylation in cell migration and/or neoplastic progression in OSCC. METHODOLOGY AND PRINCIPAL FINDINGS: To understand the role of K8 phosphorylation in neoplastic progression of OSCC, shRNA-resistant K8 phospho-mutants of Ser73 and Ser431 were overexpressed in K8-knockdown human AW13516 cells (derived from SCC of tongue; generated previously). Wound-healing assays and tumor growth in NOD-SCID mice were performed to analyze cell motility and tumorigenicity, respectively, in the overexpressing clones. The K8 phospho-mutant-overexpressing clones showed a significant increase in cell migration and tumorigenicity as compared with K8 wild-type clones. Furthermore, loss of K8 Ser73 and Ser431 phosphorylation was also observed in human OSCC tissues analyzed by immunohistochemistry, where their dephosphorylation significantly correlated with size, lymph node metastasis and stage of the tumor. CONCLUSION AND SIGNIFICANCE: Our results provide the first evidence of a potential role of K8 phosphorylation in cell migration and/or tumorigenicity in OSCC. Moreover, correlation studies of K8 dephosphorylation with clinico-pathological parameters of OSCC…

  7. Loss of Lrig1 leads to expansion of Brunner glands followed by duodenal adenomas with gastric metaplasia.

    Science.gov (United States)

    Wang, Yang; Shi, Chanjuan; Lu, Yuanyuan; Poulin, Emily J; Franklin, Jeffery L; Coffey, Robert J

    2015-04-01

    Leucine-rich repeats and immunoglobulin-like domains 1 (LRIG1) is a pan-ErbB negative regulator and intestinal stem cell marker down-regulated in many malignancies. We previously reported that 14 of 16 Lrig1-CreERT2/CreERT2 (Lrig1(-/-)) mice developed duodenal adenomas, providing the first in vivo evidence that Lrig1 acts as a tumor suppressor. We extended this study to a larger cohort and found that 49 of 54 Lrig1(-/-) mice develop duodenal adenomas beginning at 3 months. Most adenomas were histologically low grade and overlaid expanded Brunner glands. There was morphologic and biochemical blurring of the boundary between the epithelium and Brunner glands with glandular coexpression of ErbB2, which is normally restricted to the epithelium, and the Brunner gland marker Mucin6. Some adenomas were high grade with reduced Brunner glands. At age 4 to 5 weeks, before adenoma formation, we observed enhanced proliferation in Brunner glands and, at 2 months, an increase in the size of the Brunner gland compartment. Elevated expression of the epidermal growth factor receptor (Egfr) ligands amphiregulin and β-cellulin, as well as Egfr and phosphorylated Egfr, was detected in adenomas compared with adjacent normal tissue. These adenomas expressed the gastric-specific genes gastrokine1 and mucin5ac, indicating gastric metaplasia. Moreover, we found that a subset of human duodenal tumors exhibited features of LRIG1(-/-) adenomas, including loss of LRIG1, gastric metaplasia (MUCIN5AC and MUCIN6), and increased amphiregulin and Egfr activity.

  8. Arsenite binding-induced zinc loss from PARP-1 is equivalent to zinc deficiency in reducing PARP-1 activity, leading to inhibition of DNA repair

    Energy Technology Data Exchange (ETDEWEB)

    Sun, Xi; Zhou, Xixi [Department of Pharmaceutical Sciences, College of Pharmacy, University of New Mexico Health Sciences Center, Albuquerque, NM 87131 (United States); Du, Libo [Center for Molecular Science, Institute of Chemistry, Chinese Academy of Sciences, Beijing 100190 (China); Liu, Wenlan [Department of Pharmaceutical Sciences, College of Pharmacy, University of New Mexico Health Sciences Center, Albuquerque, NM 87131 (United States); Liu, Yang [Center for Molecular Science, Institute of Chemistry, Chinese Academy of Sciences, Beijing 100190 (China); Hudson, Laurie G. [Department of Pharmaceutical Sciences, College of Pharmacy, University of New Mexico Health Sciences Center, Albuquerque, NM 87131 (United States); Liu, Ke Jian, E-mail: kliu@salud.unm.edu [Department of Pharmaceutical Sciences, College of Pharmacy, University of New Mexico Health Sciences Center, Albuquerque, NM 87131 (United States)

    2014-01-15

    Inhibition of DNA repair is a recognized mechanism for arsenic enhancement of ultraviolet radiation-induced DNA damage and carcinogenesis. Poly(ADP-ribose) polymerase-1 (PARP-1), a zinc finger DNA repair protein, has been identified as a sensitive molecular target for arsenic. The zinc finger domains of PARP-1 protein function as a critical structure in DNA recognition and binding. Since cellular poly(ADP-ribosyl)ation capacity has been positively correlated with zinc status in cells, we hypothesize that arsenite binding-induced zinc loss from PARP-1 is equivalent to zinc deficiency in reducing PARP-1 activity, leading to inhibition of DNA repair. To test this hypothesis, we compared the effects of arsenite exposure with zinc deficiency, created by using the membrane-permeable zinc chelator TPEN, on 8-OHdG formation, PARP-1 activity and zinc binding to PARP-1 in HaCaT cells. Our results show that arsenite exposure and zinc deficiency had similar effects on PARP-1 protein, whereas supplemental zinc reversed these effects. To investigate the molecular mechanism of zinc loss induced by arsenite, ICP-AES, near UV spectroscopy, fluorescence, and circular dichroism spectroscopy were utilized to examine arsenite binding and occupation of a peptide representing the first zinc finger of PARP-1. We found that arsenite binding as well as zinc loss altered the conformation of zinc finger structure which functionally leads to PARP-1 inhibition. These findings suggest that arsenite binding to PARP-1 protein created similar adverse biological effects as zinc deficiency, which establishes the molecular mechanism for zinc supplementation as a potentially effective treatment to reverse the detrimental outcomes of arsenic exposure. - Highlights: • Arsenite binding is equivalent to zinc deficiency in reducing PARP-1 function. • Zinc reverses arsenic inhibition of PARP-1 activity and enhancement of DNA damage. • Arsenite binding and zinc loss alter the conformation of zinc finger

  9. Loss of GCN5 leads to increased neuronal apoptosis by upregulating E2F1- and Egr-1-dependent BH3-only protein Bim.

    Science.gov (United States)

    Wu, Yanna; Ma, Shanshan; Xia, Yong; Lu, Yangpeng; Xiao, Shiyin; Cao, Yali; Zhuang, Sidian; Tan, Xiangpeng; Fu, Qiang; Xie, Longchang; Li, Zhiming; Yuan, Zhongmin

    2017-01-26

    Cellular acetylation homeostasis is a kinetic balance precisely controlled by histone acetyl-transferase (HAT) and histone deacetylase (HDAC) activities. The loss of the counterbalancing function of basal HAT activity alters the HAT:HDAC balance towards enhanced histone deacetylation, resulting in a loss of acetylation homeostasis, which is closely associated with neuronal apoptosis. However, the critical HAT member whose activity loss contributes to neuronal apoptosis remains to be identified. In this study, we found that inactivation of GCN5, either with pharmacological inhibitors such as CPTH2 and MB-3 or with siRNAs, leads to typical apoptosis in cultured cerebellar granule neurons. Mechanistically, the BH3-only protein Bim is transcriptionally upregulated by activated Egr-1 and E2F1 and mediates apoptosis following GCN5 inhibition. Furthermore, in the activity withdrawal- or glutamate-evoked neuronal apoptosis models, GCN5 loses its activity, in contrast to Bim induction. Adenovirus-mediated overexpression of GCN5 suppresses Bim induction and apoptosis. Interestingly, the loss of GCN5 activity and the induction of Egr-1, E2F1 and Bim are involved in the early brain injury (EBI) following subarachnoid haemorrhage (SAH) in rats. HDAC inhibition not only significantly rescues Bim expression and apoptosis induced by either potassium deprivation or GCN5 inactivation but also ameliorates these events and EBI in SAH rats. Taken together, our results highlight a new mechanism by which the loss of GCN5 activity promotes neuronal apoptosis through the transcriptional upregulation of Bim, which is probably a critical event in triggering neuronal death when cellular acetylation homeostasis is impaired.

  10. On Big Data Benchmarking

    OpenAIRE

    Han, Rui; Lu, Xiaoyi

    2014-01-01

    Big data systems address the challenges of capturing, storing, managing, analyzing, and visualizing big data. Within this context, developing benchmarks to evaluate and compare big data systems has become an active topic for both research and industry communities. To date, most of the state-of-the-art big data benchmarks are designed for specific types of systems. Based on our experience, however, we argue that considering the complexity, diversity, and rapid evolution of big data systems, fo...

  11. "Big Data" : big gaps of knowledge in the field of internet science

    OpenAIRE

    Snijders, CCP Chris; Matzat, U Uwe; Reips, UD

    2012-01-01

    Research on so-called 'Big Data' has received a considerable momentum and is expected to grow in the future. One very interesting stream of research on Big Data analyzes online networks. Many online networks are known to have some typical macro-characteristics, such as 'small world' properties. Much less is known about underlying micro-processes leading to these properties. The models used by Big Data researchers usually are inspired by mathematical ease of exposition. We propose to follow in...

  12. Big data=Big marketing?!

    Institute of Scientific and Technical Information of China (English)

    肖明超

    2012-01-01

    When the internet had just emerged, a saying was popular: "On the internet, nobody knows you're a dog." Today, more than twenty years later, that saying has long since been thrown into the dustbin of history. Driven by technology, and with the rapid development of mobile internet, social networks and e-commerce, consumers' "whereabouts" have become ever easier to track: their attention, behavioural traces, conversations, preferences, purchase histories and more on the internet can all be captured, and consumers have entered an almost transparent existence in the "Age of Big Data". Not only is data becoming more usable; advances in artificial intelligence (AI), including natural language processing, pattern recognition and machine learning, are making data ever easier for computers to understand,…

  13. Loss of hypoxia-inducible factor 2 alpha in the lung alveolar epithelium of mice leads to enhanced eosinophilic inflammation in cobalt-induced lung injury.

    Science.gov (United States)

    Proper, Steven P; Saini, Yogesh; Greenwood, Krista K; Bramble, Lori A; Downing, Nathaniel J; Harkema, Jack R; Lapres, John J

    2014-02-01

    Hard metal lung disease (HMLD) is an occupational lung disease specific to inhalation of cobalt-containing particles whose mechanism is largely unknown. Cobalt is a known hypoxia mimic and stabilizer of the alpha subunits of hypoxia-inducible factors (HIFs). Previous work revealed that though HIF1α contributes to cobalt toxicity in vitro, loss of HIF1α in the alveolar epithelial cells does not provide in vivo protection from cobalt-induced lung inflammation. HIF1α and HIF2α show unique tissue expression profiles, and HIF2α is known to be the predominant HIF mRNA isoform in the adult lung. Thus, if HIF2α activation by cobalt contributes to the pathophysiology of HMLD, we hypothesized that loss of HIF2α in lung epithelium would provide protection from cobalt-induced inflammation. Mice with HIF2α-deficiency in Club and alveolar type II epithelial cells (ATIIs) (HIF2α(Δ/Δ)) were exposed to cobalt (60 µg/day) or saline using a subacute occupational exposure model. Bronchoalveolar lavage cellularity, cytokines, qRT-PCR, and histopathology were analyzed. Results show that loss of HIF2α leads to enhanced eosinophilic inflammation and increased goblet cell metaplasia. Additionally, control mice demonstrated a mild recovery from cobalt-induced lung injury compared with HIF2α(Δ/Δ) mice, suggesting a role for epithelial HIF2α in repair mechanisms. The expression of important cytokines, such as interleukin (IL)-5 and IL-10, displayed significant differences following cobalt exposure when HIF2α(Δ/Δ) and control mice were compared. In summary, our data suggest that although loss of HIF2α does not afford protection from cobalt-induced lung inflammation, epithelial HIF2α signaling does play an important role in modulating the inflammatory and repair response in the lung.

  14. Early-life lead exposure recapitulates the selective loss of parvalbumin-positive GABAergic interneurons and subcortical dopamine system hyperactivity present in schizophrenia.

    Science.gov (United States)

    Stansfield, K H; Ruby, K N; Soares, B D; McGlothan, J L; Liu, X; Guilarte, T R

    2015-03-10

    Environmental factors have been associated with psychiatric disorders and recent epidemiological studies suggest an association between prenatal lead (Pb(2+)) exposure and schizophrenia (SZ). Pb(2+) is a potent antagonist of the N-methyl-D-aspartate receptor (NMDAR) and converging evidence indicates that NMDAR hypofunction has a key role in the pathophysiology of SZ. The glutamatergic hypothesis of SZ posits that NMDAR hypofunction results in the loss of parvalbumin (PV)-positive GABAergic interneurons (PVGI) in the brain. Loss of PVGI inhibitory control over pyramidal cells alters the excitatory drive to midbrain dopamine neurons, increasing subcortical dopaminergic activity. We hypothesized that if Pb(2+) exposure in early life is an environmental risk factor for SZ, it should recapitulate the loss of PVGI and reproduce subcortical dopaminergic hyperactivity. We report that on postnatal day 50 (PN50), adolescent rats chronically exposed to Pb(2+) from gestation through adolescence exhibit loss of PVGI in SZ-relevant brain regions. PV and glutamic acid decarboxylase 67 kDa (GAD67) protein were significantly decreased in Pb(2+)-exposed rats, with no apparent change in calretinin or calbindin protein levels, suggesting a selective effect on the PV phenotype of GABAergic interneurons. We also show that Pb(2+)-exposed animals exhibit a heightened locomotor response to cocaine and express significantly higher levels of dopamine metabolites and D2-dopamine receptors relative to controls, indicative of subcortical dopaminergic hyperactivity. Our results show that developmental Pb(2+) exposure reproduces specific neuropathology and functional dopamine system changes present in SZ. We propose that exposure to environmental toxins that produce NMDAR hypofunction during critical periods of brain development may contribute significantly to the etiology of mental disorders.

  15. Official statistics and Big Data

    Directory of Open Access Journals (Sweden)

    Peter Struijs

    2014-07-01

    Full Text Available The rise of Big Data changes the context in which organisations producing official statistics operate. Big Data provides opportunities, but in order to make optimal use of Big Data, a number of challenges have to be addressed. This stimulates increased collaboration between National Statistical Institutes, Big Data holders, businesses and universities. In time, this may lead to a shift in the role of statistical institutes in the provision of high-quality and impartial statistical information to society. In this paper, the changes in context, the opportunities, the challenges and the way to collaborate are addressed. The collaboration between the various stakeholders will involve each partner building on and contributing different strengths. For national statistical offices, traditional strengths include, on the one hand, the ability to collect data and combine data sources with statistical products and, on the other hand, their focus on quality, transparency and sound methodology. In the Big Data era of competing and multiplying data sources, they continue to have a unique knowledge of official statistical production methods. And their impartiality and respect for privacy as enshrined in law uniquely position them as a trusted third party. Based on this, they may advise on the quality and validity of information of various sources. By thus positioning themselves, they will be able to play their role as key information providers in a changing society.

  16. Structural study of the effects of mutations in proteins to identify the molecular basis of the loss of local structural fluidity leading to the onset of autoimmune diseases.

    Science.gov (United States)

    Ali, Ananya; Ghosh, Semanti; Bagchi, Angshuman

    2017-02-26

    Protein-protein interactions (PPIs) are crucial in most biological processes, and PPI dysfunctions are known to be associated with the onset of various diseases. One such group is the autoimmune diseases, which are among the less studied groups of diseases and carry very high mortality rates. We therefore tried to correlate the appearance of mutations with the probable biochemical basis of the molecular mechanisms leading to the onset of the disease phenotypes. We compared the effects of Single Amino Acid Variants (SAVs) in wild-type and mutated proteins to identify any structural deformities that might lead to altered PPIs and ultimately to disease onset. For this we used Relative Solvent Accessibility (RSA) as a spatial parameter to compare the structural perturbation in mutated and wild-type proteins. We observed that the mutations were capable of increasing intra-chain PPIs, whereas inter-chain PPIs remained mostly unaltered. This might lead to more intra-molecular friction, causing a deleterious alteration of the protein's normal function. A Lyapunov exponent analysis, using the RSA values altered by polymorphic and disease-causing mutations, revealed that polymorphic mutations have a positive mean Lyapunov exponent while disease-causing mutations have a negative mean value. Thus, local spatial stochasticity is lost due to disease-causing mutations, indicating a loss of structural fluidity. The amino acid conversion plot also showed a clearly different conversion propensity for altered surface-patch residues compared with polymorphic conversions. So far, this is the first report that compares the effects of different kinds of mutations (disease-causing and non-disease-causing polymorphic mutations) on the onset of autoimmune diseases.

  17. Reduced skeletal muscle mitochondrial respiration and improved glucose metabolism in nondiabetic obese women during a very low calorie dietary intervention leading to rapid weight loss

    DEFF Research Database (Denmark)

    Rabøl, Rasmus; Svendsen, Pernille F; Skovbro, Mette

    2009-01-01

    Reduced oxidative capacity of skeletal muscle has been proposed to lead to accumulation of intramyocellular triglyceride (IMTG) and insulin resistance. We have measured mitochondrial respiration before and after a 10% low-calorie-induced weight loss in young obese women to examine the relationship between mitochondrial function, IMTG, and insulin resistance. Nine obese women (age, 32.3 years [SD, 3.0]; body mass index, 33.4 kg/m(2) [SD, 2.6]) completed a 53-day (SE, 3.8) very low calorie diet (VLCD) of 500 to 600 kcal/d without altering physical activity. The target of the intervention was a 10% weight loss ... 0.79 (SE, 0.02) (P ... calorie diet; but mitochondrial function decreased, and IMTG remained unchanged. Our results do not support a direct relationship between mitochondrial function and insulin...

  18. Deletion of the glycosyltransferase bgsB of Enterococcus faecalis leads to a complete loss of glycolipids from the cell membrane and to impaired biofilm formation

    Directory of Open Access Journals (Sweden)

    Grohmann Elisabeth

    2011-04-01

    Full Text Available Abstract Background: Deletion of the glycosyltransferase bgsA in Enterococcus faecalis leads to loss of diglucosyldiacylglycerol from the cell membrane and accumulation of its precursor monoglucosyldiacylglycerol, associated with impaired biofilm formation and reduced virulence in vivo. Here we analyzed the function of a putative glucosyltransferase EF2890 designated biofilm-associated glycolipid synthesis B (bgsB) immediately downstream of bgsA. Results: A deletion mutant was constructed by targeted mutagenesis in E. faecalis strain 12030. Analysis of cell membrane extracts revealed a complete loss of glycolipids from the cell membrane. Cell walls of 12030ΔbgsB contained approximately fourfold more LTA, and 1H-nuclear magnetic resonance (NMR) spectroscopy suggested that the higher content of cellular LTA was due to increased length of the glycerol-phosphate polymer of LTA. 12030ΔbgsB was not altered in growth, cell morphology, or autolysis. However, attachment to Caco-2 cells was reduced to 50% of wild-type levels, and biofilm formation on polystyrene was highly impaired. Despite normal resistance to cationic antimicrobial peptides, complement and antibody-mediated opsonophagocytic killing in vitro, 12030ΔbgsB was cleared more rapidly from the bloodstream of mice than wild-type bacteria. Overall, the phenotype resembles the respective deletion mutant in the bgsA gene. Our findings suggest that loss of diglucosyldiacylglycerol or the altered structure of LTA in both mutants accounts for the phenotypic changes observed. Conclusions: In summary, BgsB is a glucosyltransferase that synthesizes monoglucosyldiacylglycerol. Its inactivation profoundly affects cell membrane composition and has secondary effects on LTA biosynthesis. Both cell-membrane amphiphiles are critical for biofilm formation and virulence of E. faecalis.

  19. The ethics of biomedical big data

    CERN Document Server

    Mittelstadt, Brent Daniel

    2016-01-01

    This book presents cutting edge research on the new ethical challenges posed by biomedical Big Data technologies and practices. ‘Biomedical Big Data’ refers to the analysis of aggregated, very large datasets to improve medical knowledge and clinical care. The book describes the ethical problems posed by aggregation of biomedical datasets and re-use/re-purposing of data, in areas such as privacy, consent, professionalism, power relationships, and ethical governance of Big Data platforms. Approaches and methods are discussed that can be used to address these problems to achieve the appropriate balance between the social goods of biomedical Big Data research and the safety and privacy of individuals. Seventeen original contributions analyse the ethical, social and related policy implications of the analysis and curation of biomedical Big Data, written by leading experts in the areas of biomedical research, medical and technology ethics, privacy, governance and data protection. The book advances our understan...

  20. Big Game Reporting Stations

    Data.gov (United States)

    Vermont Center for Geographic Information — Point locations of big game reporting stations. Big game reporting stations are places where hunters can legally report harvested deer, bear, or turkey. These are...

  1. Social big data mining

    CERN Document Server

    Ishikawa, Hiroshi

    2015-01-01

    Social Media. Big Data and Social Data. Hypotheses in the Era of Big Data. Social Big Data Applications. Basic Concepts in Data Mining. Association Rule Mining. Clustering. Classification. Prediction. Web Structure Mining. Web Content Mining. Web Access Log Mining, Information Extraction and Deep Web Mining. Media Mining. Scalability and Outlier Detection.

  2. Five Big Ideas

    Science.gov (United States)

    Morgan, Debbie

    2012-01-01

    Designing quality continuing professional development (CPD) for those teaching mathematics in primary schools is a challenge. If the CPD is to be built on the scaffold of five big ideas in mathematics, what might be these five big ideas? Might it just be a case of, if you tell me your five big ideas, then I'll tell you mine? Here, there is…

  3. Big data computing

    CERN Document Server

    Akerkar, Rajendra

    2013-01-01

    Due to market forces and technological evolution, Big Data computing is developing at an increasing rate. A wide variety of novel approaches and tools have emerged to tackle the challenges of Big Data, creating both more opportunities and more challenges for students and professionals in the field of data computation and analysis. Presenting a mix of industry cases and theory, Big Data Computing discusses the technical and practical issues related to Big Data in intelligent information management. Emphasizing the adoption and diffusion of Big Data tools and technologies in industry, the book i

  4. Microsoft big data solutions

    CERN Document Server

    Jorgensen, Adam; Welch, John; Clark, Dan; Price, Christopher; Mitchell, Brian

    2014-01-01

    Tap the power of Big Data with Microsoft technologies Big Data is here, and Microsoft's new Big Data platform is a valuable tool to help your company get the very most out of it. This timely book shows you how to use HDInsight along with HortonWorks Data Platform for Windows to store, manage, analyze, and share Big Data throughout the enterprise. Focusing primarily on Microsoft and HortonWorks technologies but also covering open source tools, Microsoft Big Data Solutions explains best practices, covers on-premises and cloud-based solutions, and features valuable case studies. Best of all,

  5. Big data governance an emerging imperative

    CERN Document Server

    Soares, Sunil

    2012-01-01

    Written by a leading expert in the field, this guide focuses on the convergence of two major trends in information management-big data and information governance-by taking a strategic approach oriented around business cases and industry imperatives. With the advent of new technologies, enterprises are expanding and handling very large volumes of data; this book, nontechnical in nature and geared toward business audiences, encourages the practice of establishing appropriate governance over big data initiatives and addresses how to manage and govern big data, highlighting the relevant processes,

  6. Loss-of-function mutation in the X-linked TBX22 promoter disrupts an ETS-1 binding site and leads to cleft palate.

    Science.gov (United States)

    Fu, Xiazhou; Cheng, Yibin; Yuan, Jia; Huang, Chunhua; Cheng, Hanhua; Zhou, Rongjia

    2015-02-01

    Cleft palate only (CPO) is a common congenital defect with complex etiology in humans, and its molecular etiology remains unknown. Here, we report a loss-of-function mutation in the X-linked TBX22 gene (T-box 22) in a six-generation family with CPO showing obvious phenotypes of both cleft palate and hyper-nasal speech. We identify a functional -73G>A mutation in the promoter of TBX22, located at the core-binding site of the transcription factor ETS-1 (v-ets avian erythroblastosis virus E26 oncogene homolog 1). Phylogenetic analysis showed that the sequence around the -73G>A mutation site is specific to primates. The mutation was detected in all five affected male members, cosegregating with the affected phenotype, and heterozygosity occurred only in some unaffected females of the family, suggesting X-linked transmission of the mutation. The -73G>A variant is a novel single nucleotide mutation. Cell co-transfections indicated that ETS-1 could activate the TBX22 promoter. Moreover, EMSA and ChIP assays demonstrated that the A allele disrupts the binding site of ETS-1 and thus markedly decreases the activity of the TBX22 promoter, which is likely to lead to the birth defect of CPO without ankyloglossia. These results suggest that a loss-of-function mutation in the X-linked TBX22 promoter may cause cleft palate through disruption of the TBX22-ETS-1 pathway.

  7. Characterizing Big Data Management

    Directory of Open Access Journals (Sweden)

    Rogério Rossi

    2015-06-01

    Full Text Available Big data management is a reality for an increasing number of organizations in many areas and represents a set of challenges involving big data modeling, storage and retrieval, analysis and visualization. However, technological resources, people and processes are crucial to facilitate the management of big data in any kind of organization, allowing information and knowledge from a large volume of data to support decision-making. Big data management can be supported by these three dimensions: technology, people and processes. Hence, this article discusses these dimensions: the technological dimension that is related to storage, analytics and visualization of big data; the human aspects of big data; and, in addition, the process management dimension that involves in a technological and business approach the aspects of big data management.

  8. BIG DATA IN BUSINESS ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Logica BANICA

    2015-06-01

    Full Text Available In recent years, dealing with large amounts of data originating from social media sites and mobile communications, alongside data from business environments and institutions, has led to the definition of a new concept, known as Big Data. The economic impact of the sheer amount of data produced in the last two years has increased rapidly. It is necessary to aggregate all types of data (structured and unstructured) in order to improve current transactions, to develop new business models, to provide a real image of supply and demand and thereby generate market advantages. Thus, the companies that turn to Big Data have a competitive advantage over other firms. From the perspective of IT organizations, they must accommodate the storage and processing of Big Data and provide analysis tools that are easily integrated into business processes. This paper aims to discuss aspects of the Big Data concept and the principles for building, organizing and analysing huge datasets in the business environment, offering a three-layer architecture based on actual software solutions. The article also refers to graphical tools for exploring and representing unstructured data, Gephi and NodeXL.

  9. Expression of HIV-1 Vpu leads to loss of the viral restriction factor CD317/Tetherin from lipid rafts and its enhanced lysosomal degradation.

    Directory of Open Access Journals (Sweden)

    Ruth Rollason

    Full Text Available CD317/tetherin (aka BST2 or HM1.24 antigen) is an interferon-inducible membrane protein present in regions of the lipid bilayer enriched in sphingolipids and cholesterol (often termed lipid rafts). It has been implicated in an eclectic mix of cellular processes including, most notably, the retention of fully formed viral particles at the surface of cells infected with HIV and other enveloped viruses. Expression of the HIV viral accessory protein Vpu has been shown to lead to intracellular sequestration and degradation of tetherin, thereby counteracting the inhibition of viral release. There is evidence that tetherin interacts directly with Vpu, but it remains unclear where in the cell this interaction occurs or if Vpu expression affects the lipid raft localisation of tetherin. We have addressed these points using biochemical and cell imaging approaches focused on endogenous rather than ectopically over-expressed tetherin. We find (i) no evidence for an interaction between Vpu and endogenous tetherin at the cell surface, (ii) the vast majority of endogenous tetherin that is at the cell surface in control cells is in lipid rafts, (iii) internalised tetherin is present in non-raft fractions, (iv) expression of Vpu in cells expressing endogenous tetherin leads to the loss of tetherin from lipid rafts, (v) internalised tetherin enters early endosomes, and late endosomes, in both control cells and cells expressing Vpu, but the proportion of tetherin molecules destined for degradation rather than recycling is increased in cells expressing Vpu, and (vi) lysosomes are the primary site for degradation of endogenous tetherin in cells expressing Vpu. Our studies underline the importance of studying endogenous tetherin and let us propose a model in which Vpu intercepts newly internalised tetherin and diverts it for lysosomal destruction rather than recycling to the cell surface.

  10. Loss-of-Function Mutations in YY1AP1 Lead to Grange Syndrome and a Fibromuscular Dysplasia-Like Vascular Disease.

    Science.gov (United States)

    Guo, Dong-Chuan; Duan, Xue-Yan; Regalado, Ellen S; Mellor-Crummey, Lauren; Kwartler, Callie S; Kim, Dong; Lieberman, Kenneth; de Vries, Bert B A; Pfundt, Rolph; Schinzel, Albert; Kotzot, Dieter; Shen, Xuetong; Yang, Min-Lee; Bamshad, Michael J; Nickerson, Deborah A; Gornik, Heather L; Ganesh, Santhi K; Braverman, Alan C; Grange, Dorothy K; Milewicz, Dianna M

    2017-01-05

    Fibromuscular dysplasia (FMD) is a heterogeneous group of non-atherosclerotic and non-inflammatory arterial diseases that primarily involves the renal and cerebrovascular arteries. Grange syndrome is an autosomal-recessive condition characterized by severe and early-onset vascular disease similar to FMD and variable penetrance of brachydactyly, syndactyly, bone fragility, and learning disabilities. Exome-sequencing analysis of DNA from three affected siblings with Grange syndrome identified compound heterozygous nonsense variants in YY1AP1, and homozygous nonsense or frameshift YY1AP1 variants were subsequently identified in additional unrelated probands with Grange syndrome. YY1AP1 encodes yin yang 1 (YY1)-associated protein 1 and is an activator of the YY1 transcription factor. We determined that YY1AP1 localizes to the nucleus and is a component of the INO80 chromatin remodeling complex, which is responsible for transcriptional regulation, DNA repair, and replication. Molecular studies revealed that loss of YY1AP1 in vascular smooth muscle cells leads to cell cycle arrest with decreased proliferation and increased levels of the cell cycle regulator p21/WAF/CDKN1A and disrupts TGF-β-driven differentiation of smooth muscle cells. Identification of YY1AP1 mutations as a cause of FMD indicates that this condition can result from underlying genetic variants that significantly alter the phenotype of vascular smooth muscle cells.

  11. Loss of extracellular superoxide dismutase leads to acute lung damage in the presence of ambient air: a potential mechanism underlying adult respiratory distress syndrome.

    Science.gov (United States)

    Gongora, Maria Carolina; Lob, Heinrich E; Landmesser, Ulf; Guzik, Tomasz J; Martin, W David; Ozumi, Kiyoski; Wall, Susan M; Wilson, David Scott; Murthy, Niren; Gravanis, Michael; Fukai, Tohru; Harrison, David G

    2008-10-01

    The extracellular superoxide dismutase 3 (SOD3) is highly expressed in both blood vessels and lungs. In different models of pulmonary injury, SOD3 is reduced; however, it is unclear whether this contributes to lung injury. To study the role of acute SOD3 reduction in lung injury, the SOD3 gene was deleted in adult mice by using the Cre-Lox technology. Acute reduction of SOD3 led to a fivefold increase in lung superoxide, marked inflammatory cell infiltration, a threefold increase in the arterial-alveolar gradient, respiratory acidosis, histological changes similar to those observed in adult respiratory distress syndrome, and 85% mortality. Treatment with the SOD mimetic MnTBAP and intranasal administration of SOD-containing polyketal microparticles reduced mortality, prevented the histological alterations, and reduced lung superoxide levels. To understand how mice with the SOD3 embryonic deletion survived without lung injury, gene array analysis was performed. These data demonstrated the up-regulation of 37 genes and down-regulation of nine genes, including those involved in cell signaling, inflammation, and gene transcription in SOD3-/- mice compared with either mice with acute SOD3 reduction or wild-type controls. These studies show that SOD3 is essential for survival in the presence of ambient oxygen and that acute loss of this enzyme can lead to severe lung damage. Strategies either to prevent SOD3 inactivation or to augment its levels might prove useful in the treatment of acute lung injury.

  12. Transient loss of the Y-chromosome in an elderly man with anemia and lead poisoning: chance occurrence or a clonal marker of the underlying hematological abnormality?

    Science.gov (United States)

    Mantadakis, Elpis; Boula, Anna M; Girakis, George; Xilouri, Irini M; Foudoulakis, Andreas M; Samonis, George

    2006-08-01

    One of the most important environmental and occupational pollutants is lead. Cytogenetic damage is known to occur in many individuals exposed to lead, e.g., outdoor and car painters, traffic policemen, gasoline station attendants, etc. [1] Chronic lead exposure affects many organ systems, leading to a gradual decline in the so-called safe blood lead levels over time.

  13. HARNESSING BIG DATA VOLUMES

    Directory of Open Access Journals (Sweden)

    Bogdan DINU

    2014-04-01

    Full Text Available Big Data can revolutionize humanity. Hidden within the huge amounts and variety of the data we are creating, we may find information, facts, social insights and benchmarks that were once virtually impossible to find or simply did not exist. Large volumes of data allow organizations to tap in real time the full potential of all the internal or external information they possess. Big data calls for quick decisions and innovative ways to assist customers and society as a whole. Big data platforms and product portfolios will help customers harness the full value of big data volumes. This paper deals with technical and technological issues related to handling big data volumes in the Big Data environment.

  14. Summary big data

    CERN Document Server

    2014-01-01

    This work offers a summary of the book "Big Data: A Revolution That Will Transform How We Live, Work, and Think" by Viktor Mayer-Schönberger and Kenneth Cukier. The summary of the ideas in Mayer-Schönberger's and Cukier's book explains that big data is where we use huge quantities of data to make better predictions based on patterns we identify in the data, rather than trying to understand the underlying causes in more detail. This summary highlights that big data will be a source of new economic value and innovation in the future. Moreover, it shows that it will

  15. Weld manufacturing of big heat-exchangers

    Energy Technology Data Exchange (ETDEWEB)

    Braeutigam, M.; Huppertz, P.H.

    1986-06-24

    The topic of this article is big heat exchangers, which are developed and constructed to minimize energy losses in process engineering plants. Some welding specifications are discussed in detail. Some design details, such as materials selection and vibration safety, complete this contribution.

  16. Mutations in the HLA class II genes leading to loss of expression of HLA-DR and HLA-DQ in diffuse large B-cell lymphoma

    NARCIS (Netherlands)

    Jordanova, ES; Philippo, K; Giphart, MJ; Schuuring, E; Kluin, PM

    2003-01-01

    Loss of expression of human leukocyte antigen (HLA) class II molecules on tumor cells affects the onset and modulation of the immune response through lack of activation of CD4(+) T lymphocytes. Previously, we showed that the frequent loss of expression of HLA class II in diffuse large B-cell lymphom

  17. Bliver big data til big business?

    DEFF Research Database (Denmark)

    Ritter, Thomas

    2015-01-01

    Denmark has a digital infrastructure, a culture of record-keeping, and IT-competent employees and customers, which make a leading position possible, but only if companies get themselves ready for the next big data wave.

  18. Big Boss Interval Games

    NARCIS (Netherlands)

    Alparslan-Gok, S.Z.; Brânzei, R.; Tijs, S.H.

    2008-01-01

    In this paper big boss interval games are introduced and various characterizations are given. The structure of the core of a big boss interval game is explicitly described and plays an important role relative to interval-type bi-monotonic allocation schemes for such games. Specifically, each element

  19. Big Ideas in Art

    Science.gov (United States)

    Day, Kathleen

    2008-01-01

    In this article, the author shares how she was able to discover some big ideas about art education. She relates how she found great ideas to improve her teaching from the book "Rethinking Curriculum in Art." She also shares how she designed a "Big Idea" unit in her class.

  20. "Big Data": Big Knowledge Gaps in the Field of Internet Science

    Directory of Open Access Journals (Sweden)

    Ulf-Dietrich Reips

    2012-01-01

    Full Text Available Research on so-called ‘Big Data’ has received a considerable momentum and is expected to grow in the future. One very interesting stream of research on Big Data analyzes online networks. Many online networks are known to have some typical macro-characteristics, such as ‘small world’ properties. Much less is known about underlying micro-processes leading to these properties. The models used by Big Data researchers usually are inspired by mathematical ease of exposition. We propose to follow in addition a different strategy that leads to knowledge about micro-processes that match with actual online behavior. This knowledge can then be used for the selection of mathematically-tractable models of online network formation and evolution. Insight from social and behavioral research is needed for pursuing this strategy of knowledge generation about micro-processes. Accordingly, our proposal points to a unique role that social scientists could play in Big Data research. ...

  1. Big data for health.

    Science.gov (United States)

    Andreu-Perez, Javier; Poon, Carmen C Y; Merrifield, Robert D; Wong, Stephen T C; Yang, Guang-Zhong

    2015-07-01

    This paper provides an overview of recent developments in big data in the context of biomedical and health informatics. It outlines the key characteristics of big data and how medical and health informatics, translational bioinformatics, sensor informatics, and imaging informatics will benefit from an integrated approach of piecing together different aspects of personalized information from a diverse range of data sources, both structured and unstructured, covering genomics, proteomics, metabolomics, as well as imaging, clinical diagnosis, and long-term continuous physiological sensing of an individual. It is expected that recent advances in big data will expand our knowledge for testing new hypotheses about disease management from diagnosis to prevention to personalized treatment. The rise of big data, however, also raises challenges in terms of privacy, security, data ownership, data stewardship, and governance. This paper discusses some of the existing activities and future opportunities related to big data for health, outlining some of the key underlying issues that need to be tackled.

  2. Big Data Analytics for Disaster Preparedness and Response of Mobile Communication Infrastructure during Natural Hazards

    Science.gov (United States)

    Zhong, L.; Takano, K.; Ji, Y.; Yamada, S.

    2015-12-01

    The disruption of telecommunications is one of the most critical disasters during natural hazards. As mobile communications expand rapidly, the mobile communication infrastructure plays a fundamental role in disaster response and recovery activities. For this reason, its disruption will lead to loss of life and property due to information delays and errors. Therefore, disaster preparedness and response of the mobile communication infrastructure itself is quite important. In many past disasters, the disruption of mobile communication networks was caused by network congestion and subsequent long-term power outages. In order to reduce this disruption, knowledge of communication demands during disasters is necessary, and big data analytics provides a very promising way to predict communication demands by analyzing the large amount of operational data of mobile users in a large-scale mobile network. Under the US-Japan collaborative project on 'Big Data and Disaster Research (BDD)' supported by the Japan Science and Technology Agency (JST) and the National Science Foundation (NSF), we are going to investigate the application of big data techniques in the disaster preparedness and response of mobile communication infrastructure. Specifically, in this research, we have considered exploiting the large amount of operational information of mobile users to predict communication needs at different times and locations. By incorporating other data, such as the shake distribution of an estimated major earthquake and the power outage map, we are able to predict where people may be stranded, unable to confirm their safety or ask for help due to network disruption. In addition, this result could further help network operators assess the vulnerability of their infrastructure and make suitable decisions for disaster preparedness and response. In this presentation, we are going to introduce the
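    The record above describes this analysis only at a high level. As a purely hypothetical illustration of the kind of processing it implies (aggregating mobile-network operational logs into per-cell communication demand and overlaying a power-outage map to flag areas where users are likely stranded), the following Python sketch may help; the table layouts, column names, and threshold are invented for illustration and are not taken from the project.

    ```python
    import pandas as pd

    # Hypothetical operational log: one row per mobile connection event.
    events = pd.DataFrame({
        "cell_id": ["A", "A", "B", "B", "B", "C"],
        "hour":    [14, 14, 14, 15, 15, 15],
        "user_id": [1, 2, 3, 4, 5, 6],
    })

    # Hypothetical outage map: which cells lost power after the earthquake.
    outage = pd.DataFrame({
        "cell_id":      ["A", "B", "C"],
        "power_outage": [True, False, True],
    })

    # Estimate communication demand as distinct active users per cell and hour.
    demand = (events.groupby(["cell_id", "hour"])["user_id"]
                    .nunique()
                    .reset_index(name="active_users"))

    # Overlay the outage map to flag cells where recently active users are likely
    # to be unreachable (demand present, but power and hence the network is down).
    risk = demand.merge(outage, on="cell_id")
    risk["likely_stranded"] = risk["power_outage"] & (risk["active_users"] >= 2)
    print(risk)
    ```

    In practice the demand estimate would come from far larger call-detail records and the outage map from utility or remote-sensing data, but the join-and-flag structure of the analysis would stay the same.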

  3. Big data, big knowledge: big data for personalized healthcare.

    Science.gov (United States)

    Viceconti, Marco; Hunter, Peter; Hose, Rod

    2015-07-01

    The idea that the purely phenomenological knowledge that we can extract by analyzing large amounts of data can be useful in healthcare seems to contradict the desire of VPH researchers to build detailed mechanistic models for individual patients. But in practice no model is ever entirely phenomenological or entirely mechanistic. We propose in this position paper that big data analytics can be successfully combined with VPH technologies to produce robust and effective in silico medicine solutions. In order to do this, big data technologies must be further developed to cope with some specific requirements that emerge from this application. Such requirements are: working with sensitive data; analytics of complex and heterogeneous data spaces, including nontextual information; distributed data management under security and performance constraints; specialized analytics to integrate bioinformatics and systems biology information with clinical observations at tissue, organ and organisms scales; and specialized analytics to define the "physiological envelope" during the daily life of each patient. These domain-specific requirements suggest a need for targeted funding, in which big data technologies for in silico medicine becomes the research priority.

  4. Differential Privacy Preserving in Big Data Analytics for Connected Health.

    Science.gov (United States)

    Lin, Chi; Song, Zihao; Song, Houbing; Zhou, Yanhong; Wang, Yi; Wu, Guowei

    2016-04-01

    In Body Area Networks (BANs), big data collected by wearable sensors usually contain sensitive information, which must be appropriately protected. Previous methods neglected the privacy protection issue, leading to privacy exposure. In this paper, a differential privacy protection scheme for big data in body sensor networks is developed. Compared with previous methods, this scheme provides privacy protection with higher availability and reliability. We introduce the concept of dynamic noise thresholds, which makes our scheme more suitable for processing big data. Experimental results demonstrate that, even when the attacker has full background knowledge, the proposed scheme can still provide enough interference to big sensitive data so as to preserve privacy.
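    The abstract does not spell out the mechanism, so the sketch below is only a generic illustration of the core idea behind differential privacy (calibrated Laplace noise added to an aggregate query), not the paper's actual scheme; the function names, bounds, and the heart-rate example are hypothetical.

    ```python
    import numpy as np


    def laplace_mechanism(true_value, sensitivity, epsilon, rng=None):
        """Return a differentially private version of a numeric query result.

        Noise is drawn from Laplace(0, sensitivity / epsilon); a smaller epsilon
        means stronger privacy and therefore more noise.
        """
        if rng is None:
            rng = np.random.default_rng()
        return true_value + rng.laplace(loc=0.0, scale=sensitivity / epsilon)


    def private_mean(samples, lower, upper, epsilon):
        """Differentially private mean of bounded sensor readings.

        Readings are clipped to [lower, upper], so the sensitivity of the mean
        over n readings is (upper - lower) / n.
        """
        clipped = np.clip(samples, lower, upper)
        sensitivity = (upper - lower) / len(clipped)
        return laplace_mechanism(clipped.mean(), sensitivity, epsilon)


    # Hypothetical example: heart-rate readings from a wearable sensor.
    heart_rate = np.array([72, 75, 71, 90, 88, 76, 80], dtype=float)
    print(private_mean(heart_rate, lower=40.0, upper=180.0, epsilon=0.5))
    ```

    The paper's dynamic noise thresholds presumably adapt the noise level to the incoming data stream; the fixed epsilon here is only meant to show the basic mechanism.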

  5. Big data a primer

    CERN Document Server

    Bhuyan, Prachet; Chenthati, Deepak

    2015-01-01

    This book is a collection of chapters written by experts on various aspects of big data. The book aims to explain what big data is and how it is stored and used. The book starts from  the fundamentals and builds up from there. It is intended to serve as a review of the state-of-the-practice in the field of big data handling. The traditional framework of relational databases can no longer provide appropriate solutions for handling big data and making it available and useful to users scattered around the globe. The study of big data covers a wide range of issues including management of heterogeneous data, big data frameworks, change management, finding patterns in data usage and evolution, data as a service, service-generated data, service management, privacy and security. All of these aspects are touched upon in this book. It also discusses big data applications in different domains. The book will prove useful to students, researchers, and practicing database and networking engineers.

  6. Big Data in industry

    Science.gov (United States)

    Latinović, T. S.; Preradović, D. M.; Barz, C. R.; Latinović, M. T.; Petrica, P. P.; Pop-Vadean, A.

    2016-08-01

    The amount of data at the global level has grown exponentially. Along with this phenomenon, we need new units of measure such as the exabyte, zettabyte, and yottabyte, the last of which describes the current scale of data. The growth of data creates a situation in which the classic systems for the collection, storage, processing, and visualization of data are losing the battle against the volume, speed, and variety of data that is generated continuously. Much of this data is created by the Internet of Things, IoT (cameras, satellites, cars, GPS navigation, etc.). It is our challenge to come up with new technologies and tools for the management and exploitation of these large amounts of data. Big Data has been a hot topic in IT circles in recent years. However, Big Data is also recognized in the business world, and increasingly in public administration. This paper proposes an ontology of big data analytics and examines how to enhance business intelligence through big data analytics as a service by presenting a big data analytics service-oriented architecture. This paper also discusses the interrelationship between business intelligence and big data analytics. The proposed approach in this paper might facilitate the research and development of business analytics, big data analytics, and business intelligence as well as intelligent agents.

  7. Recht voor big data, big data voor recht

    NARCIS (Netherlands)

    Lafarre, Anne

    2016-01-01

    Big data is a phenomenon that can no longer be ignored in our society. It is past the hype cycle, and the first implementations of big data techniques are being carried out. But what exactly is big data? What do the five V's that are so often mentioned in relation to big data involve? By way of introduction to this...

  8. Assessing Big Data

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2015-01-01

    In recent years, big data has been one of the most controversially discussed technologies in terms of its possible positive and negative impact. Therefore, the need for technology assessments is obvious. This paper first provides, based on the results of a technology assessment study, an overview of the potential and challenges associated with big data and then describes the problems experienced during the study as well as methods found helpful to address them. The paper concludes with reflections on how the insights from the technology assessment study may have an impact on the future governance of big data.

  9. Inhomogeneous Big Bang Cosmology

    CERN Document Server

    Wagh, S M

    2002-01-01

    In this letter, we outline an inhomogeneous model of the Big Bang cosmology. For the inhomogeneous spacetime used here, the universe originates in the infinite past as the one dominated by vacuum energy and ends in the infinite future as the one consisting of "hot and relativistic" matter. The spatial distribution of matter in the considered inhomogeneous spacetime is arbitrary. Hence, observed structures can arise in this cosmology from suitable "initial" density contrast. Different problems of the standard model of Big Bang cosmology are also resolved in the present inhomogeneous model. This inhomogeneous model of the Big Bang Cosmology predicts "hot death" for the universe.

  10. Big data for dummies

    CERN Document Server

    Hurwitz, Judith; Halper, Fern; Kaufman, Marcia

    2013-01-01

    Find the right big data solution for your business or organization Big data management is one of the major challenges facing business, industry, and not-for-profit organizations. Data sets such as customer transactions for a mega-retailer, weather patterns monitored by meteorologists, or social network activity can quickly outpace the capacity of traditional data management tools. If you need to develop or manage big data solutions, you'll appreciate how these four experts define, explain, and guide you through this new and often confusing concept. You'll learn what it is, why it m

  11. Chronic dietary n-6 PUFA deprivation leads to conservation of arachidonic acid and more rapid loss of DHA in rat brain phospholipids.

    Science.gov (United States)

    Lin, Lauren E; Chen, Chuck T; Hildebrand, Kayla D; Liu, Zhen; Hopperton, Kathryn E; Bazinet, Richard P

    2015-02-01

    To determine how the level of dietary n-6 PUFA affects the rate of loss of arachidonic acid (ARA) and DHA in brain phospholipids, male rats were fed either a deprived or adequate n-6 PUFA diet for 15 weeks postweaning, and then subjected to an intracerebroventricular infusion of (3)H-ARA or (3)H-DHA. Brains were collected at fixed times over 128 days to determine half-lives and the rates of loss from brain phospholipids (J out). Compared with the adequate n-6 PUFA rats, the deprived n-6-PUFA rats had a 15% lower concentration of ARA and an 18% higher concentration of DHA in their brain total phospholipids. Loss half-lives of ARA in brain total phospholipids and fractions (except phosphatidylserine) were longer in the deprived n-6 PUFA rats, whereas the J out was decreased. In the deprived versus adequate n-6 PUFA rats, the J out of DHA was higher. In conclusion, chronic n-6 PUFA deprivation decreases the rate of loss of ARA and increases the rate of loss of DHA in brain phospholipids. Thus, a low n-6 PUFA diet can be used to target brain ARA and DHA metabolism.
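    The half-lives and rates of loss (J out) reported in this record come from following labelled fatty acids in brain phospholipids over time. As a generic, hypothetical illustration of that kind of kinetic analysis (not the authors' actual procedure or data), one can fit a mono-exponential decay to tracer measurements and derive the half-life and an estimated rate of loss from the fitted rate constant.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit


    def mono_exponential(t, a0, k):
        """Tracer activity remaining at time t, given initial value a0 and rate constant k."""
        return a0 * np.exp(-k * t)


    # Hypothetical tracer data: days after infusion vs. phospholipid radioactivity (%).
    days = np.array([0.0, 4.0, 8.0, 16.0, 32.0, 64.0, 128.0])
    activity = np.array([100.0, 92.0, 85.0, 73.0, 55.0, 31.0, 10.0])

    (a0, k), _ = curve_fit(mono_exponential, days, activity, p0=(100.0, 0.01))
    half_life = np.log(2) / k  # days
    print(f"rate constant k = {k:.4f} per day, half-life = {half_life:.1f} days")

    # With a measured pool size (e.g. an ARA concentration in nmol/g), the rate of
    # loss J_out can be estimated as k times that pool size, a common assumption
    # in steady-state kinetic analyses.
    concentration = 11.0               # hypothetical concentration, nmol/g
    j_out = k * concentration          # nmol per g per day
    print(f"estimated J out = {j_out:.3f} nmol/g/day")
    ```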

  12. Big Data in der Cloud

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2014-01-01

    Technology assessment of big data, in particular cloud based big data services, for the Office for Technology Assessment at the German federal parliament (Bundestag).

  13. Big data opportunities and challenges

    CERN Document Server

    2014-01-01

    This ebook aims to give practical guidance for all those who want to understand big data better and learn how to make the most of it. Topics range from big data analysis, mobile big data and managing unstructured data to technologies, governance and intellectual property and security issues surrounding big data.

  14. Reframing Open Big Data

    DEFF Research Database (Denmark)

    Marton, Attila; Avital, Michel; Jensen, Tina Blegind

    2013-01-01

    Recent developments in the techniques and technologies of collecting, sharing and analysing data are challenging the field of information systems (IS) research, let alone the boundaries of organizations and the established practices of decision-making. Coined ‘open data’ and ‘big data’, these developments introduce an unprecedented level of societal and organizational engagement with the potential of computational data to generate new insights and information. Based on the commonalities shared by open data and big data, we develop a research framework that we refer to as open big data (OBD) by employing the dimensions of ‘order’ and ‘relationality’. We argue that these dimensions offer a viable approach for IS research on open and big data because they address one of the core value propositions of IS; i.e. how to support organizing with computational data. We contrast these dimensions with two...

  15. Big Data Revisited

    DEFF Research Database (Denmark)

    Kallinikos, Jannis; Constantiou, Ioanna

    2015-01-01

    We elaborate on key issues of our paper ‘New games, new rules: big data and the changing context of strategy’ as a means of addressing some of the concerns raised by the paper’s commentators. We initially deal with the issue of social data and the role it plays in the current data revolution and the technological recording of facts. We further discuss the significance of the very mechanisms by which big data is produced as distinct from the very attributes of big data, often discussed in the literature. In the final section of the paper, we qualify the alleged importance of algorithms and claim that the structures of data capture and the architectures in which data generation is embedded are fundamental to the phenomenon of big data.

  16. Big Data as Governmentality

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Klinkby Madsen, Anders; Rasche, Andreas

    This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data’s impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects ... shows that big data problematizes selected aspects of traditional ways to collect and analyze data for development (e.g. via household surveys). We also demonstrate that using big data analyses to address development challenges raises a number of questions that can deteriorate its impact...

  17. Boarding to Big data

    Directory of Open Access Journals (Sweden)

    Oana Claudia BRATOSIN

    2016-05-01

    Full Text Available Today Big data is an emerging topic, as the quantity of information grows exponentially, laying the foundation for its main challenge: the value of the information. The information value is defined not only by extracting value from huge data sets as fast and optimally as possible, but also by extracting value from uncertain and inaccurate data in an innovative manner using Big data analytics. At this point, the main challenge for businesses that use Big data tools is to clearly define the scope and the necessary output of the business so that real value can be gained. This article aims to explain the Big data concept, its various classification criteria and architecture, as well as its impact on worldwide processes.

  18. The Big Bang Singularity

    Science.gov (United States)

    Ling, Eric

    The big bang theory is a model of the universe which makes the striking prediction that the universe began a finite amount of time in the past at the so called "Big Bang singularity." We explore the physical and mathematical justification of this surprising result. After laying down the framework of the universe as a spacetime manifold, we combine physical observations with global symmetrical assumptions to deduce the FRW cosmological models which predict a big bang singularity. Next we prove a couple theorems due to Stephen Hawking which show that the big bang singularity exists even if one removes the global symmetrical assumptions. Lastly, we investigate the conditions one needs to impose on a spacetime if one wishes to avoid a singularity. The ideas and concepts used here to study spacetimes are similar to those used to study Riemannian manifolds, therefore we compare and contrast the two geometries throughout.
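    For readers unfamiliar with the FRW models mentioned above, the standard textbook equations below (not taken from the thesis itself) show where the big bang singularity comes from: for ordinary matter the scale factor a(t), traced backwards in time, reaches zero at a finite time.

    ```latex
    % FRW line element for a spatially homogeneous and isotropic universe
    ds^2 = -dt^2 + a(t)^2\left[\frac{dr^2}{1-kr^2} + r^2\,d\Omega^2\right]

    % Friedmann equations in units with 8\pi G = c = 1
    \left(\frac{\dot{a}}{a}\right)^2 = \frac{\rho}{3} - \frac{k}{a^2},
    \qquad
    \frac{\ddot{a}}{a} = -\frac{1}{6}\,(\rho + 3p)

    % For matter with \rho + 3p > 0 (the strong energy condition in this setting),
    % \ddot{a} < 0, so an expanding universe traced backwards reaches a(t) = 0
    % at a finite time in the past: the big bang singularity.
    ```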

  19. Big Creek Pit Tags

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The BCPITTAGS database is used to store data from an Oncorhynchus mykiss (steelhead/rainbow trout) population dynamics study in Big Creek, a coastal stream along the...

  20. Toward effective software solutions for big biology

    NARCIS (Netherlands)

    Prins, Pjotr; de Ligt, Joep; Tarasov, Artem; Jansen, Ritsert C; Cuppen, Edwin; Bourne, Philip E

    2015-01-01

    Leading scientists tell us that the problem of large data and data integration, referred to as 'big data', is acute and hurting research. Recently, Snijder et al. [1] suggested a culture change in which scientists would aim to share high-dimensional data among laboratories. It is important to realize t

  1. Conociendo Big Data

    Directory of Open Access Journals (Sweden)

    Juan José Camargo-Vega

    2014-12-01

    Full Text Available Given the importance the term Big Data has acquired, this research set out to study and analyze exhaustively the state of the art of Big Data; in addition, as a second objective, it analyzed the characteristics, tools, technologies, models, and standards related to Big Data; and finally, it sought to identify the most relevant characteristics in the management of Big Data, so that everything concerning the central topic of the research can be understood. The methodology used included reviewing the state of the art of Big Data and presenting its current situation; examining Big Data technologies; presenting some of the NoSQL databases, which are those that allow data with unstructured formats to be processed; and showing data models and the technologies for analyzing them, ending with some benefits of Big Data. The methodological design used for the research was non-experimental, since no variables were manipulated, and exploratory, because this research is a first step towards getting to know the Big Data environment.

  2. Minsky on "Big Government"

    Directory of Open Access Journals (Sweden)

    Daniel de Santana Vasconcelos

    2014-03-01

    Full Text Available The objective of this paper is to assess, in light of Minsky's main works, his view and analysis of what he called the "Big Government": that huge institution which, in parallel with the "Big Bank", was capable of ensuring stability in the capitalist system and regulating its inherently unstable financial system in the mid-20th century. In this work, we analyze how Minsky proposes an active role for the government in a complex economic system flawed by financial instability.

  3. ANALYTICS OF BIG DATA

    OpenAIRE

    Asst. Prof. Shubhada Talegaon

    2014-01-01

    Big Data analytics has started to impact all types of organizations, as it carries the potential to extract embedded knowledge from large amounts of data and react to it in real time. Current technology enables us to efficiently store and query large datasets; the focus is now on techniques that make use of the complete data set, instead of sampling. This has tremendous implications in areas like machine learning, pattern recognition and classification, senti...

  4. Big data need big theory too.

    Science.gov (United States)

    Coveney, Peter V; Dougherty, Edward R; Highfield, Roger R

    2016-11-13

    The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, machine learning appears to provide a shortcut to reveal correlations of arbitrary complexity between processes at the atomic, molecular, meso- and macroscales. Here, we point out the weaknesses of pure big data approaches with particular focus on biology and medicine, which fail to provide conceptual accounts for the processes to which they are applied. No matter their 'depth' and the sophistication of data-driven methods, such as artificial neural nets, in the end they merely fit curves to existing data. Not only do these methods invariably require far larger quantities of data than anticipated by big data aficionados in order to produce statistically reliable results, but they can also fail in circumstances beyond the range of the data used to train them because they are not designed to model the structural characteristics of the underlying system. We argue that it is vital to use theory as a guide to experimental design for maximal efficiency of data collection and to produce reliable predictive models and conceptual knowledge. Rather than continuing to fund, pursue and promote 'blind' big data projects with massive budgets, we call for more funding to be allocated to the elucidation of the multiscale and stochastic processes controlling the behaviour of complex systems, including those of life, medicine and healthcare.This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'.

  5. Big data need big theory too

    OpenAIRE

    Coveney, Peter V.; Dougherty, Edward R; Highfield, Roger R.

    2016-01-01

    The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, ma...

  6. Big data need big theory too.

    OpenAIRE

    Coveney, P. V.; Dougherty, E. R.; Highfield, R. R.

    2016-01-01

    The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, ma...

  7. Big data need big theory too

    Science.gov (United States)

    Dougherty, Edward R.; Highfield, Roger R.

    2016-01-01

    The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, machine learning appears to provide a shortcut to reveal correlations of arbitrary complexity between processes at the atomic, molecular, meso- and macroscales. Here, we point out the weaknesses of pure big data approaches with particular focus on biology and medicine, which fail to provide conceptual accounts for the processes to which they are applied. No matter their ‘depth’ and the sophistication of data-driven methods, such as artificial neural nets, in the end they merely fit curves to existing data. Not only do these methods invariably require far larger quantities of data than anticipated by big data aficionados in order to produce statistically reliable results, but they can also fail in circumstances beyond the range of the data used to train them because they are not designed to model the structural characteristics of the underlying system. We argue that it is vital to use theory as a guide to experimental design for maximal efficiency of data collection and to produce reliable predictive models and conceptual knowledge. Rather than continuing to fund, pursue and promote ‘blind’ big data projects with massive budgets, we call for more funding to be allocated to the elucidation of the multiscale and stochastic processes controlling the behaviour of complex systems, including those of life, medicine and healthcare. This article is part of the themed issue ‘Multiscale modelling at the physics–chemistry–biology interface’. PMID:27698035

  8. ALICE: Simulated lead-lead collision

    CERN Document Server

    2003-01-01

    This track is an example of simulated data modelled for the ALICE detector on the Large Hadron Collider (LHC) at CERN, which will begin taking data in 2008. ALICE will focus on the study of collisions between nuclei of lead, a heavy element that produces many different particles when collided. It is hoped that these collisions will produce a new state of matter known as the quark-gluon plasma, which existed billionths of a second after the Big Bang.

  9. Activation of the prefrontal cortex by unilateral transcranial direct current stimulation leads to an asymmetrical effect on risk preference in frames of gain and loss.

    Science.gov (United States)

    Ye, Hang; Huang, Daqiang; Wang, Siqi; Zheng, Haoli; Luo, Jun; Chen, Shu

    2016-10-01

    Previous brain imaging and brain stimulation studies have suggested that the dorsolateral prefrontal cortex may be critical in regulating risk-taking behavior, although its specific causal effect on people's risk preference remains controversial. This paper studied the independent modulation of the activity of the right and left dorsolateral prefrontal cortex using various configurations of transcranial direct current stimulation. We designed a risk-measurement table and adopted a within-subject design to compare the same participant's risk preference before and after unilateral stimulation when presented with different frames of gain and loss. The results confirmed a hemispheric asymmetry and indicated that the right dorsolateral prefrontal cortex has an asymmetric effect on risk preference regarding frames of gain and loss. Enhancing the activity of the right dorsolateral prefrontal cortex significantly decreased the participants' degree of risk aversion in the gain frame, whereas it increased the participants' degree of risk aversion in the loss frame. Our findings provide important information regarding the impact of transcranial direct current stimulation on the risk preference of healthy participants. The effects observed in our experiment compared with those of previous studies provide further evidence of the effects of hemispheric and frame-dependent asymmetry. These findings may be helpful in understanding the neural basis of risk preference in humans, especially when faced with decisions involving possible gain or loss relative to the status quo.

  10. Lack of dystrophin leads to the selective loss of superior cervical ganglion neurons projecting to muscular targets in genetically dystrophic mdx mice.

    Science.gov (United States)

    De Stefano, M Egle; Leone, Lucia; Lombardi, Loredana; Paggi, Paola

    2005-12-01

    Autonomic imbalance is a pathological aspect of Duchenne muscular dystrophy. Here, we show that the sympathetic superior cervical ganglion (SCG) of mdx mice, which lack dystrophin (Dp427), has 36% fewer neurons than that of wild-type animals. Cell loss occurs around P10 and affects those neurons innervating muscular targets (heart and iris), which, differently from the submandibular gland (non-muscular target), are precociously damaged by the lack of Dp427. In addition, although we reveal altered axonal defasciculation in the submandibular gland and reduced terminal sprouting in all SCG target organs, poor adrenergic innervation is observed only in the heart and iris. These alterations, detected as early as P5, when neuronal loss has not yet occurred, suggest that in mdx mice the absence of Dp427 directly impairs the axonal growth and terminal sprouting of sympathetic neurons. However, when these intrinsic alterations combine with structural and/or functional damages of muscular targets, neuronal death occurs.

  11. Big Data in Caenorhabditis elegans: quo vadis?

    Science.gov (United States)

    Hutter, Harald; Moerman, Donald

    2015-11-05

    A clear definition of what constitutes "Big Data" is difficult to identify, but we find it most useful to define Big Data as a data collection that is complete. By this criterion, researchers on Caenorhabditis elegans have a long history of collecting Big Data, since the organism was selected with the idea of obtaining a complete biological description and understanding of development. The complete wiring diagram of the nervous system, the complete cell lineage, and the complete genome sequence provide a framework to phrase and test hypotheses. Given this history, it might be surprising that the number of "complete" data sets for this organism is actually rather small--not because of lack of effort, but because most types of biological experiments are not currently amenable to complete large-scale data collection. Many are also not inherently limited, so that it becomes difficult to even define completeness. At present, we only have partial data on mutated genes and their phenotypes, gene expression, and protein-protein interaction--important data for many biological questions. Big Data can point toward unexpected correlations, and these unexpected correlations can lead to novel investigations; however, Big Data cannot establish causation. As a result, there is much excitement about Big Data, but there is also a discussion on just what Big Data contributes to solving a biological problem. Because of its relative simplicity, C. elegans is an ideal test bed to explore this issue and at the same time determine what is necessary to build a multicellular organism from a single cell.
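    The point that Big Data can surface unexpected correlations without establishing causation can be made concrete with a small sketch (ours, in Python with NumPy; sample sizes are arbitrary): when many unrelated variables are measured on few samples, some pairs will correlate strongly purely by chance.

        import numpy as np

        # Toy illustration: with many independent variables and few observations,
        # strong pairwise correlations appear by chance alone.
        rng = np.random.default_rng(1)
        n_samples, n_vars = 20, 500
        data = rng.normal(size=(n_samples, n_vars))   # pure noise, no causal structure

        corr = np.corrcoef(data, rowvar=False)        # n_vars x n_vars correlation matrix
        np.fill_diagonal(corr, 0.0)                   # ignore trivial self-correlations
        i, j = np.unravel_index(np.abs(corr).argmax(), corr.shape)
        print(f"strongest 'finding': variables {i} and {j}, r = {corr[i, j]:.2f}")
        # |r| above 0.7 is common here even though every variable is independent noise.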

  12. Big Data Issues: Performance, Scalability, Availability

    Directory of Open Access Journals (Sweden)

    Laura Matei

    2014-03-01

    Full Text Available Nowadays, Big Data is probably one of the most discussed topics not only in the area of data analysis but, I believe, in the whole realm of information technology. Simply typing the words "big data" into an online search engine like Google retrieves approximately 1,660,000,000 results. With such a buzz gathered around the term, I could not help but wonder what this phenomenon means. The ever greater portion of our lives occupied by the combination of the Internet, cloud computing and mobile devices leads to an ever increasing amount of data that must be captured, communicated, aggregated, stored, and analyzed. These sets of data that we are generating are called Big Data.

  13. Acute effect of weight loss on levels of total bilirubin in obese, cardiovascular high-risk patients: an analysis from the lead-in period of the Sibutramine Cardiovascular Outcome trial

    DEFF Research Database (Denmark)

    Andersson, Charlotte; Weeke, Peter; Fosbøl, Emil Loldrup;

    2009-01-01

    Low levels of bilirubin are associated with an increased risk of cardiovascular adverse events. Weight reduction is known to reduce several cardiovascular risk factors, but effects on bilirubin levels have not been reported. We studied the response of weight loss therapy with sibutramine...... and lifestyle change on levels of total bilirubin in an overweight or obese, cardiovascular high-risk population. Data from the first 4 weeks of the lead-in period of the Sibutramine Cardiovascular Outcome study were analyzed. A total of 10 198 patients provided body weight measurements before and after 4 weeks......, respectively. At screening, bilirubin concentrations were similar between weight loss groups (around 11 micromol/L, P = .7) and increased linearly as a function of weight loss. The effect was significantly more pronounced in men compared with women (P for interaction = .003). Adjusted for multiple variables...

  14. Physical training and weight loss in dogs lead to transcriptional changes in genes involved in the glucose-transport pathway in muscle and adipose tissues.

    Science.gov (United States)

    Herrera Uribe, Juber; Vitger, Anne D; Ritz, Christian; Fredholm, Merete; Bjørnvad, Charlotte R; Cirera, Susanna

    2016-02-01

    Obesity is a worldwide problem in humans and domestic animals. Interventions, including a combination of dietary management and exercise, have proven to be effective for inducing weight loss in humans. In companion animals, the role of exercise in the management of obesity has received relatively little attention. The aim of the present study was to investigate changes in the transcriptome of key energy metabolism genes in muscle and adipose tissues in response to diet-induced weight loss alone, or combined with exercise in dogs. Overweight pet dogs were enrolled on a weight loss programme, based on calorie restriction and physical training (FD group, n = 5) or calorie restriction alone (DO group, n = 7). mRNA expression of 12 genes and six microRNAs were investigated using quantitative real-time PCR (qPCR). In the FD group, FOXO1 and RAC1 were expressed at lower levels in adipose tissue, whereas ESRRA and AKT2 were more highly expressed in muscle, when compared with the DO group. Comparing expression before and after the intervention, in the DO group, nine genes and three microRNAs showed significant altered expression in adipose tissue (PPARG, ADIPOQ and FOXO1; P ESRRA, AKT2, PGC1a and mir-23; P < 0.001) in muscle. Thus, calorie restriction causes regulation of several metabolic genes in both tissues. The mild exercise, incorporated into this study design, was sufficient to elicit transcriptional changes in adipose and muscle tissues, suggesting a positive effect on glucose metabolism. The study findings support inclusion of exercise in management of canine obesity.
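    The record reports expression changes measured by quantitative real-time PCR (qPCR) but does not spell out the quantification step. A common approach, assumed here purely for illustration, is relative quantification by the 2^(-ΔΔCt) method; the sketch below (Python, with invented Ct values and a hypothetical reference gene) shows the arithmetic.

        # Minimal sketch of relative quantification by the 2^(-ddCt) method.
        # All Ct values are invented; the study's actual reference genes and
        # measurements are not given in this record.

        def fold_change(ct_target_after, ct_ref_after, ct_target_before, ct_ref_before):
            """Fold change of a target gene relative to a reference gene."""
            d_ct_after = ct_target_after - ct_ref_after        # normalize to reference
            d_ct_before = ct_target_before - ct_ref_before
            dd_ct = d_ct_after - d_ct_before                   # after vs. before intervention
            return 2.0 ** (-dd_ct)

        # Hypothetical example: a target gene in adipose tissue, after vs. before.
        fc = fold_change(ct_target_after=26.5, ct_ref_after=20.1,
                         ct_target_before=25.2, ct_ref_before=20.0)
        print(f"fold change = {fc:.2f}")   # < 1 indicates lower expression after intervention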

  15. Downregulation of L-type Ca2+ channel in rat mesenteric arteries leads to loss of smooth muscle contractile phenotype and inward hypertrophic remodeling.

    Science.gov (United States)

    Kudryavtseva, Olga; Herum, Kate Møller; Dam, Vibeke Secher; Straarup, Marthe Simonsen; Kamaev, Dmitry; Briggs Boedtkjer, Donna M; Matchkov, Vladimir V; Aalkjær, Christian

    2014-05-01

    L-type Ca(2+) channels (LTCCs) are important for vascular smooth muscle cell (VSMC) contraction, as well as VSMC differentiation, as indicated by loss of LTCCs during VSMC dedifferentiation. However, it is not clear whether loss of LTCCs is a primary event underlying phenotypic modulation or whether loss of LTCCs has significance for vascular structure. We used small interference RNA (siRNA) transfection in vivo to investigate the role of LTCCs in VSMC phenotypic expression and structure of rat mesenteric arteries. siRNA reduced LTCC mRNA and protein expression in rat mesenteric arteries 3 days after siRNA transfection to 12.7 ± 0.7% and 47.3 ± 13%, respectively: this was associated with an increased resting intracellular Ca(2+) concentration ([Ca(2+)]i). Despite the high [Ca(2+)]i, the contractility was reduced (tension development to norepinephrine was 3.5 ± 0.2 N/m and 0.8 ± 0.2 N/m for sham-transfected and downregulated arteries respectively; P arteries downregulated for LTCCs. Phenotypic changes were associated with a 45% increase in number of VSMCs and a consequent increase of media thickness and media area. Ten days after siRNA transfection arterial structure was again normalized. The contractile responses of LTCC-siRNA transfected arteries were elevated in comparison with matched controls 10 days after transfection. The study provides strong evidence for causal relationships between LTCC expression and VSMC contractile phenotype, as well as novel data addressing the complex relationship between VSMC contractility, phenotype, and vascular structure. These findings are relevant for understanding diseases, associated with phenotype changes of VSMC and vascular remodeling, such as atherosclerosis and hypertension.

  16. The contribution of leading diseases and risk factors to excess losses of healthy life in eastern Europe: burden of disease study

    Directory of Open Access Journals (Sweden)

    Vander Hoorn Stephen

    2005-11-01

    Full Text Available Abstract Background The East/West gradient in health across Europe has been described often, but not using metrics as comprehensive and comparable as those of the Global Burden of Disease 2000 and Comparative Risk Assessment studies. Methods Comparisons are made across 3 epidemiological subregions of the WHO region for Europe – A (very low child and adult mortality), B (low child and low adult mortality) and C (low child and high adult mortality) – with populations in 2000 of 412, 218 and 243 millions respectively, and using the following measures: 1. Probabilities of death by sex and causal group across 7 age intervals; 2. Loss of healthy life (DALYs) to diseases and injuries per thousand population; 3. Loss of healthy life (DALYs) attributable to selected risk factors across 3 age ranges. Results Absolute differences in mortality are most marked in males and in younger adults, and for deaths from vascular diseases and from injuries. Dominant contributions to east-west differences come from the nutritional/physiological group of risk factors (blood pressure, cholesterol concentration, body mass index, low fruit and vegetable consumption and inactivity) contributing to vascular disease and from the legal drugs – tobacco and alcohol. Conclusion The main requirements for reducing excess health losses in the east of Europe are: 1) favorable shifts in all amenable vascular risk factors (irrespective of their current levels) by population-wide and personal measures; 2) intensified tobacco control; 3) reduced alcohol consumption and injury control strategies (for example, for road traffic injuries). Cost effective strategies are broadly known but local institutional support for them needs strengthening.
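    Healthy-life losses are reported in disability-adjusted life years (DALYs). As a reminder of how that metric combines mortality and morbidity, the sketch below (Python) works through the basic arithmetic, DALY = YLL + YLD, with invented numbers and without the age-weighting and discounting options used in the GBD 2000 framework.

        # Illustrative DALY arithmetic (all numbers invented, not from the study).
        deaths = 100                       # deaths from a condition in one year
        life_expectancy_at_death = 20.0    # average remaining life expectancy at age of death

        cases = 1_000                      # non-fatal incident cases
        disability_weight = 0.2            # 0 = full health, 1 = equivalent to death
        avg_duration_years = 3.0           # average duration of the disability

        yll = deaths * life_expectancy_at_death                # years of life lost
        yld = cases * disability_weight * avg_duration_years   # years lived with disability
        daly = yll + yld

        population = 50_000
        print(f"DALYs = {daly:.0f} ({1000 * daly / population:.1f} per thousand population)")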

  17. Differential changes in serum uric acid concentrations in sibutramine promoted weight loss in diabetes: results from four weeks of the lead-in period of the SCOUT trial

    DEFF Research Database (Denmark)

    Andersson, Charlotte; Weeke, Peter; Brendorp, Bente;

    2009-01-01

    , but greater weight loss and diabetes were associated with smaller falls in blood uric acid levels; decreasing fasting and urinary glucose concentrations in diabetes were associated with increases in uric acid levels. CONCLUSION: A four week daily intake of sibutramine and life style changes was associated...... (mean +/- standard deviation) at screening were significantly higher among patients with CVD compared to patients without CVD (p ... with significant reductions in mean uric acid levels. Changes in renal glucose load in diabetes seem to counteract a potential uricosuric effect of sibutramine. TRIAL REGISTRATION: The trial is registered at ClinicalTrial.gov number: NCT00234832....

  18. Analysis of a loss-of-excitation protection operation caused by generator leading-phase testing

    Institute of Scientific and Technical Information of China (English)

    陈和龙

    2012-01-01

    An incident in which a generator leading-phase (under-excited) test caused the loss-of-excitation protection to operate is described. On this basis, the extent to which different impedance-circle settings of the loss-of-excitation protection restrict the permissible depth of leading-phase operation is analysed and compared. A method is proposed in which the protection settings are coordinated with the generator's under-excitation limiter curve, so that the loss-of-excitation protection does not misoperate during leading-phase operation.
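    To make the comparison of impedance-circle settings concrete, the sketch below (ours, in Python; the characteristic shape, per-unit settings and measured impedance are illustrative assumptions, not values from the article) checks whether the apparent impedance seen at the generator terminals falls inside an offset mho circle, which is how a loss-of-excitation element typically decides to operate. A smaller circle reaches less far into the under-excited region and therefore tolerates a deeper leading-phase test without tripping.

        # Minimal sketch: does the apparent impedance seen by a loss-of-excitation
        # relay fall inside an offset mho circle?  All settings and values are invented.

        def inside_offset_mho(z_apparent: complex, offset: float, diameter: float) -> bool:
            """Offset mho circle on the negative reactance axis:
            top of the circle at -j*offset, bottom at -j*(offset + diameter)."""
            center = -1j * (offset + diameter / 2.0)
            return abs(z_apparent - center) <= diameter / 2.0

        # Two illustrative setting choices (per-unit values):
        setting_a = dict(offset=0.1, diameter=1.0)   # larger circle, e.g. diameter near Xd
        setting_b = dict(offset=0.1, diameter=0.5)   # smaller circle, tolerates deeper leading phase

        z_test = 0.05 - 0.8j   # hypothetical apparent impedance during a deep leading-phase test
        print(inside_offset_mho(z_test, **setting_a))  # True  -> this setting would operate
        print(inside_offset_mho(z_test, **setting_b))  # False -> this setting stays secure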

  19. Focus : big data, little questions?

    OpenAIRE

    Uprichard, Emma

    2013-01-01

    Big data. Little data. Deep data. Surface data. Noisy, unstructured data. Big. The world of data has gone from being analogue and digital, qualitative and quantitative, transactional and a by-product, to, simply, BIG. It is as if we couldn’t quite deal with its omnipotence and just ran out of adjectives. BIG. With all the data power it is supposedly meant to entail, one might have thought that a slightly better descriptive term might have been latched onto. But, no. BIG. Just BIG.

  20. Selective deletion of the membrane-bound colony stimulating factor 1 isoform leads to high bone mass but does not protect against estrogen-deficiency bone loss.

    Science.gov (United States)

    Yao, Gang-Qing; Wu, Jian-Jun; Troiano, Nancy; Zhu, Mei-Ling; Xiao, Xiao-Yan; Insogna, Karl

    2012-07-01

    To better define the biologic function of membrane-bound CSF1 (mCSF1) in vivo, we have generated mCSF1 knockout (k/o) mice. Spinal bone density (BMD) was 15.9% higher in k/o mice compared to wild-type (wt) controls (P bone marrow isolated from mCSF1 k/o mice was reduced compared to wt marrow. There were no defects in osteoblast number or function suggesting that the basis for the high bone mass phenotype was reduced resorption. In addition to a skeletal phenotype, k/o mice had significantly elevated serum triglyceride levels (123 ± 7 vs. 88 ± 3.2 mg/dl; k/o vs. wt, P bone loss following ovariectomy (OVX). OVX induced a significant fourfold increase in the expression of the soluble CSF1 isoform (sCSF1) in the bones of wt mice while expression of mCSF1 was unchanged. These findings indicate that mCSF1 is essential for normal bone remodeling since, in its absence, BMD is increased. Membrane-bound CSF1 does not appear to be required for estrogen-deficiency bone loss while in contrast; our data suggest that sCSF1 could play a key role in this pathologic process. The reasons why mCSF1 k/o mice have hypertriglyceridemia are currently under study.

  1. Age-Associated Methylation Suppresses SPRY1, Leading to a Failure of Re-quiescence and Loss of the Reserve Stem Cell Pool in Elderly Muscle

    Directory of Open Access Journals (Sweden)

    Anne Bigot

    2015-11-01

    Full Text Available The molecular mechanisms by which aging affects stem cell number and function are poorly understood. Murine data have implicated cellular senescence in the loss of muscle stem cells with aging. Here, using human cells and by carrying out experiments within a strictly pre-senescent division count, we demonstrate an impaired capacity for stem cell self-renewal in elderly muscle. We link aging to an increased methylation of the SPRY1 gene, a known regulator of muscle stem cell quiescence. Replenishment of the reserve cell pool was modulated experimentally by demethylation or siRNA knockdown of SPRY1. We propose that suppression of SPRY1 by age-associated methylation in humans inhibits the replenishment of the muscle stem cell pool, contributing to a decreased regenerative response in old age. We further show that aging does not affect muscle stem cell senescence in humans.

  2. Age-Associated Methylation Suppresses SPRY1, Leading to a Failure of Re-quiescence and Loss of the Reserve Stem Cell Pool in Elderly Muscle.

    Science.gov (United States)

    Bigot, Anne; Duddy, William J; Ouandaogo, Zamalou G; Negroni, Elisa; Mariot, Virginie; Ghimbovschi, Svetlana; Harmon, Brennan; Wielgosik, Aurore; Loiseau, Camille; Devaney, Joe; Dumonceaux, Julie; Butler-Browne, Gillian; Mouly, Vincent; Duguez, Stéphanie

    2015-11-10

    The molecular mechanisms by which aging affects stem cell number and function are poorly understood. Murine data have implicated cellular senescence in the loss of muscle stem cells with aging. Here, using human cells and by carrying out experiments within a strictly pre-senescent division count, we demonstrate an impaired capacity for stem cell self-renewal in elderly muscle. We link aging to an increased methylation of the SPRY1 gene, a known regulator of muscle stem cell quiescence. Replenishment of the reserve cell pool was modulated experimentally by demethylation or siRNA knockdown of SPRY1. We propose that suppression of SPRY1 by age-associated methylation in humans inhibits the replenishment of the muscle stem cell pool, contributing to a decreased regenerative response in old age. We further show that aging does not affect muscle stem cell senescence in humans.

  3. Baryon symmetric big bang cosmology

    Science.gov (United States)

    Stecker, F. W.

    1978-01-01

    Both the quantum theory and Einstein's theory of special relativity lead to the supposition that matter and antimatter were produced in equal quantities during the big bang. It is noted that local matter/antimatter asymmetries may be reconciled with universal symmetry by assuming (1) a slight imbalance of matter over antimatter in the early universe, annihilation, and a subsequent remainder of matter; (2) localized regions of excess for one or the other type of matter as an initial condition; and (3) an extremely dense, high temperature state with zero net baryon number; i.e., matter/antimatter symmetry. Attention is given to the third assumption, which is the simplest and the most in keeping with current knowledge of the cosmos, especially as pertains to the universality of the 3 K background radiation. Mechanisms of galaxy formation are discussed, whereby matter and antimatter might have collided and annihilated each other, or have coexisted (and continue to coexist) at vast distances. It is pointed out that baryon symmetric big bang cosmology could probably be proved if an antinucleus could be detected in cosmic radiation.

  4. How should we do the history of Big Data?

    Directory of Open Access Journals (Sweden)

    David Beer

    2016-04-01

    Full Text Available Taking its lead from Ian Hacking’s article ‘How should we do the history of statistics?’, this article reflects on how we might develop a sociologically informed history of Big Data. It argues that within the history of social statistics we have a relatively well developed history of the material phenomenon of Big Data. Yet this article argues that we now need to take the concept of ‘Big Data’ seriously: there is a pressing need to explore the type of work that is being done by that concept. The article suggests a programme for work that explores the emergence of the concept of Big Data so as to track the institutional, organisational, political and everyday adoption of this term. It argues that the term Big Data has the effect of making-up data and, as such, is powerful in framing our understanding of those data and the possibilities that they afford.

  5. Big Data and Cycling

    NARCIS (Netherlands)

    Romanillos, Gustavo; Zaltz Austwick, Martin; Ettema, Dick; De Kruijf, Joost

    2016-01-01

    Big Data has begun to create significant impacts in urban and transport planning. This paper covers the explosion in data-driven research on cycling, most of which has occurred in the last ten years. We review the techniques, objectives and findings of a growing number of studies we have classified

  6. Big data in history

    CERN Document Server

    Manning, Patrick

    2013-01-01

    Big Data in History introduces the project to create a world-historical archive, tracing the last four centuries of historical dynamics and change. Chapters address the archive's overall plan, how to interpret the past through a global archive, the missions of gathering records, linking local data into global patterns, and exploring the results.

  7. The big bang

    Science.gov (United States)

    Silk, Joseph

    Our universe was born billions of years ago in a hot, violent explosion of elementary particles and radiation - the big bang. What do we know about this ultimate moment of creation, and how do we know it? Drawing upon the latest theories and technology, this new edition of The Big Bang is a sweeping, lucid account of the event that set the universe in motion. Joseph Silk begins his story with the first microseconds of the big bang, on through the evolution of stars, galaxies, clusters of galaxies, quasars, and into the distant future of our universe. He also explores the fascinating evidence for the big bang model and recounts the history of cosmological speculation. Revised and updated, this new edition features all the most recent astronomical advances, including: photos and measurements from the Hubble Space Telescope, Cosmic Background Explorer satellite (COBE), and Infrared Space Observatory; the latest estimates of the age of the universe; new ideas in string and superstring theory; recent experiments on neutrino detection; new theories about the presence of dark matter in galaxies; new developments in the theory of the formation and evolution of galaxies; the latest ideas about black holes, worm holes, quantum foam, and multiple universes.

  8. The Big Bang

    CERN Multimedia

    Moods, Patrick

    2006-01-01

    How did the Universe begin? The favoured theory is that everything - space, time, matter - came into existence at the same moment, around 13.7 thousand million years ago. This event was scornfully referred to as the "Big Bang" by Sir Fred Hoyle, who did not believe in it and maintained that the Universe had always existed.

  9. The Big Sky inside

    Science.gov (United States)

    Adams, Earle; Ward, Tony J.; Vanek, Diana; Marra, Nancy; Hester, Carolyn; Knuth, Randy; Spangler, Todd; Jones, David; Henthorn, Melissa; Hammill, Brock; Smith, Paul; Salisbury, Rob; Reckin, Gene; Boulafentis, Johna

    2009-01-01

    The University of Montana (UM)-Missoula has implemented a problem-based program in which students perform scientific research focused on indoor air pollution. The Air Toxics Under the Big Sky program (Jones et al. 2007; Adams et al. 2008; Ward et al. 2008) provides a community-based framework for understanding the complex relationship between poor…

  10. Big Java late objects

    CERN Document Server

    Horstmann, Cay S

    2012-01-01

    Big Java: Late Objects is a comprehensive introduction to Java and computer programming, which focuses on the principles of programming, software engineering, and effective learning. It is designed for a two-semester first course in programming for computer science students.

  11. Big Data ethics

    NARCIS (Netherlands)

    Zwitter, Andrej

    2014-01-01

    The speed of development in Big Data and associated phenomena, such as social media, has surpassed the capacity of the average consumer to understand his or her actions and their knock-on effects. We are moving towards changes in how ethics has to be perceived: away from individual decisions with sp

  12. A Big Bang Lab

    Science.gov (United States)

    Scheider, Walter

    2005-01-01

    The February 2005 issue of The Science Teacher (TST) reminded everyone that by learning how scientists study stars, students gain an understanding of how science measures things that cannot be set up in the lab, either because they are too big, too far away, or happened in the very distant past. The authors of "How Far are the Stars?" show how the…

  13. Space big book

    CERN Document Server

    Homer, Charlene

    2007-01-01

    Our Combined resource includes all necessary areas of Space for grades five to eight. Get the big picture about the Solar System, Galaxies and the Universe as your students become fascinated by the interesting information about the Sun, Earth, Moon, Comets, Asteroids Meteoroids, Stars and Constellations. Also, thrill your young astronomers as they connect Earth and space cycles with their daily life.

  14. Chronologically scheduled snacking with high-protein products within the habitual diet in type-2 diabetes patients leads to a fat mass loss: a longitudinal study

    Directory of Open Access Journals (Sweden)

    Martínez J Alfredo

    2011-07-01

    Full Text Available Abstract Background Obesity is the most relevant overnutrition disease worldwide and is associated to different metabolic disorders such as insulin resistance and type-2 diabetes. Low glycemic load foods and diets and moderately high protein intake have been shown to reduce body weight and fat mass, exerting also beneficial effects on LDL-cholesterol, triglyceride concentrations, postprandial glucose curve and HDL-cholesterol levels. The present study aimed at studying the potential functionality of a series of low glycemic index products with moderately high protein content, as possible coadjuvants in the control of type-2 diabetes and weight management following a chronologically planned snacking offer (morning and afternoon. Methods The current trial followed a single group, sequential, longitudinal design, with two consecutive periods of 4 weeks each. A total of 17 volunteers participated in the study. The first period was a free living period, with volunteers' habitual ad libitum dietary pattern, while the second period was a free-living period with structured meal replacements at breakfast, morning snack and afternoon snack, which were exchanged by specific products with moderately high protein content and controlled low glycemic index, following a scheduled temporal consumption. Blood extractions were performed at the beginning and at the end of each period (free-living and intervention. Parameters analysed were: fasting glucose, insulin, glycosylated hemoglobin, total-, HDL- and LDL-cholesterol, triglyceride, C - reactive protein and Homocysteine concentrations. Postprandial glucose and insulin were also measured. Anthropometrical parameters were monitored each 2 weeks during the whole study. Results A modest but significant (p = 0.002 reduction on body weight (1 kg was observed during the intervention period, mainly due to the fat mass loss (0.8 kg, p = 0.02. This weight reduction was observed without apparently associated changes in

  15. Identifying Dwarfs Workloads in Big Data Analytics

    OpenAIRE

    Gao, Wanling; Luo, Chunjie; Zhan, Jianfeng; Ye, Hainan; He, Xiwen; Wang, Lei; Zhu, Yuqing; Tian, Xinhui

    2015-01-01

    Big data benchmarking is particularly important and provides applicable yardsticks for evaluating booming big data systems. However, wide coverage and great complexity of big data computing impose big challenges on big data benchmarking. How can we construct a benchmark suite using a minimum set of units of computation to represent diversity of big data analytics workloads? Big data dwarfs are abstractions of extracting frequently appearing operations in big data computing. One dwarf represen...
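    As a concrete, illustrative example of the kind of small "unit of computation" that recurs across big data analytics workloads (ours, in Python; the paper's own dwarf set is not reproduced here), the sketch below expresses a word count in the map/shuffle/reduce style that many benchmark suites include.

        from collections import Counter
        from itertools import chain

        # Illustrative "unit of computation": word count in a map/shuffle/reduce style.
        documents = [
            "big data benchmarking provides yardsticks for evaluating big data systems",
            "dwarfs are abstractions of frequently appearing operations in big data computing",
        ]

        mapped = (((word, 1) for word in doc.split()) for doc in documents)   # map phase
        counts = Counter()
        for word, one in chain.from_iterable(mapped):                         # shuffle + reduce
            counts[word] += one

        print(counts.most_common(3))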

  16. Big Data and Chemical Education

    Science.gov (United States)

    Pence, Harry E.; Williams, Antony J.

    2016-01-01

    The amount of computerized information that organizations collect and process is growing so large that the term Big Data is commonly being used to describe the situation. Accordingly, Big Data is defined by a combination of the Volume, Variety, Velocity, and Veracity of the data being processed. Big Data tools are already having an impact in…

  17. Loss of MeCP2 in Parvalbumin-and Somatostatin-Expressing Neurons in Mice Leads to Distinct Rett Syndrome-like Phenotypes.

    Science.gov (United States)

    Ito-Ishida, Aya; Ure, Kerstin; Chen, Hongmei; Swann, John W; Zoghbi, Huda Y

    2015-11-18

    Inhibitory neurons are critical for proper brain function, and their dysfunction is implicated in several disorders, including autism, schizophrenia, and Rett syndrome. These neurons are heterogeneous, and it is unclear which subtypes contribute to specific neurological phenotypes. We deleted Mecp2, the mouse homolog of the gene that causes Rett syndrome, from the two most populous subtypes, parvalbumin-positive (PV+) and somatostatin-positive (SOM+) neurons. Loss of MeCP2 partially impairs the affected neuron, allowing us to assess the function of each subtype without profound disruption of neuronal circuitry. We found that mice lacking MeCP2 in either PV+ or SOM+ neurons have distinct, non-overlapping neurological features: mice lacking MeCP2 in PV+ neurons developed motor, sensory, memory, and social deficits, whereas those lacking MeCP2 in SOM+ neurons exhibited seizures and stereotypies. Our findings indicate that PV+ and SOM+ neurons contribute complementary aspects of the Rett phenotype and may have modular roles in regulating specific behaviors.

  18. Oxygen vacancies lead to loss of domain order, particle fracture, and rapid capacity fade in lithium manganospinel (LiMn₂O₄) batteries.

    Science.gov (United States)

    Hao, Xiaoguang; Lin, Xianke; Lu, Wei; Bartlett, Bart M

    2014-07-23

    Spinel-structured lithium manganese oxide (LiMn2O4) has attracted much attention because of its high energy density, low cost, and environmental impact. In this article, structural analysis methods such as powder neutron diffraction (PND), X-ray diffraction (XRD), and high-resolution transmission and scanning electron microscopies (TEM & SEM) reveal the capacity fading mechanism of LiMn2O4 as it relates to the mechanical degradation of the material. Micro-fractures form after the first charge (to 4.45 V vs. Li(+/0)) of a commercial lithium manganese oxide phase, best represented by the formula LiMn2O3.88. Diffraction methods show that the grain size decreases and multiple phases form after 850 electrochemical cycles at 0.2 C current. The microfractures are directly observed through microscopy studies as particle cracks propagate along the (1 1 1) planes, with clear lattice twisting observed along this direction. Long-term galvanostatic cycling results in increased charge-transfer resistance and capacity loss. Upon preparing samples with controlled oxygen contents, LiMn2O4.03 and LiMn2O3.87, the mechanical failure of the lithium manganese oxide can be correlated to the oxygen vacancies in the materials, providing guidance for better synthesis methods.

  19. The loss of Gnai2 and Gnai3 in B cells eliminates B lymphocyte compartments and leads to a hyper-IgM like syndrome.

    Directory of Open Access Journals (Sweden)

    Il-Young Hwang

    Full Text Available B lymphocytes are compartmentalized within lymphoid organs. The organization of these compartments depends upon signaling initiated by G-protein linked chemoattractant receptors. To address the importance of the G-proteins Gαi2 and Gαi3 in chemoattractant signaling we created mice lacking both proteins in their B lymphocytes. While bone marrow B cell development and egress is grossly intact; mucosal sites, splenic marginal zones, and lymph nodes essentially lack B cells. There is a partial block in splenic follicular B cell development and a 50-60% reduction in splenic B cells, yet normal numbers of splenic T cells. The absence of Gαi2 and Gαi3 in B cells profoundly disturbs the architecture of lymphoid organs with loss of B cell compartments in the spleen, thymus, lymph nodes, and gastrointestinal tract. This results in a severe disruption of B cell function and a hyper-IgM like syndrome. Beyond the pro-B cell stage, B cells are refractory to chemokine stimulation, and splenic B cells are poorly responsive to antigen receptor engagement. Gαi2 and Gαi3 are therefore critical for B cell chemoattractant receptor signaling and for normal B cell function. These mice provide a worst case scenario of the consequences of losing chemoattractant receptor signaling in B cells.

  20. Effects of Fastac 50 EC on bumble bee Bombus terrestris L. respiration: DGE disappearance does not lead to increasing water loss.

    Science.gov (United States)

    Muljar, Riin; Karise, Reet; Viik, Eneli; Kuusik, Aare; Williams, Ingrid; Metspalu, Luule; Hiiesaar, Külli; Must, Anne; Luik, Anne; Mänd, Marika

    2012-11-01

    Sublethal effects of pesticides in insects can be observed through physiological changes, which are commonly estimated by metabolic rate and respiratory patterns, more precisely by the patterns of discontinuous gas-exchange (DGE) cycles. The aim of the present research was to study the effect of some low concentrations of Fastac 50 EC on the cycles of CO(2) release and respiratory water loss rates (WLR) in bumble bee Bombus terrestris L. foragers. Bumble bees were dipped into 0.004% and 0.002% Fastac 50 EC solution. Flow-through respirometry was used to record the respiration and WLR 3h before and after the treatment. The respirometry was combined with infrared actography to enable simultaneous recording of abdominal movements. Our results show that Fastac 50 EC has an after-effect on bumble bee respiratory rhythms and muscle activity but does not affect WLR. Treatment with 0.004% Fastac 50 EC solution resulted in disappearance of the respiration cycles; also the lifespan of treated bumble bees was significantly shorter. Treatment with 0.002% Fastac 50 EC solution had no significant effect on respiration patterns or longevity. We found no evidence for the DGE cycles functioning as a water saving mechanism.

  1. Skipping breakfast leads to weight loss but also elevated cholesterol compared with consuming daily breakfasts of oat porridge or frosted cornflakes in overweight individuals: a randomised controlled trial.

    Science.gov (United States)

    Geliebter, Allan; Astbury, Nerys M; Aviram-Friedman, Roni; Yahav, Eric; Hashim, Sami

    2014-01-01

    Eating breakfast may reduce appetite, body weight and CVD risk factors, but the breakfast type that produces the greatest health benefits remains unclear. We compared the effects of consuming a high-fibre breakfast, a non-fibre breakfast, or no-breakfast control on body weight, CVD risk factors and appetite. A total of thirty-six overweight participants (eighteen men and eighteen women) (mean age 33·9 (sd 7·5) years, mean BMI 32·8 (sd 4·7) kg/m(2)) were randomly assigned to consume oat porridge (n = 12), frosted cornflakes (n = 12) or a water control (n = 12) breakfast daily for 4 weeks. Appetite ratings were collected on the first day and weekly thereafter. Before and after the intervention, body weight, composition, blood pressure and resting energy expenditure (REE) were measured and a fasting blood sample was collected. Across the 4 weeks, fullness was higher and hunger was lower in the oat porridge group compared with the control group (P skipping breakfast led to weight loss, it also resulted in increased total cholesterol concentrations compared with eating either oat porridge or frosted cornflakes for breakfast.

  2. Business and Science - Big Data, Big Picture

    Science.gov (United States)

    Rosati, A.

    2013-12-01

    Data Science is more than the creation, manipulation, and transformation of data. It is more than Big Data. The business world seems to have a hold on the term 'data science' and, for now, they define what it means. But business is very different than science. In this talk, I address how large datasets, Big Data, and data science are conceptually different in business and science worlds. I focus on the types of questions each realm asks, the data needed, and the consequences of findings. Gone are the days of datasets being created or collected to serve only one purpose or project. The trick with data reuse is to become familiar enough with a dataset to be able to combine it with other data and extract accurate results. As a Data Curator for the Advanced Cooperative Arctic Data and Information Service (ACADIS), my specialty is communication. Our team enables Arctic sciences by ensuring datasets are well documented and can be understood by reusers. Previously, I served as a data community liaison for the North American Regional Climate Change Assessment Program (NARCCAP). Again, my specialty was communicating complex instructions and ideas to a broad audience of data users. Before entering the science world, I was an entrepreneur. I have a bachelor's degree in economics and a master's degree in environmental social science. I am currently pursuing a Ph.D. in Geography. Because my background has embraced both the business and science worlds, I would like to share my perspectives on data, data reuse, data documentation, and the presentation or communication of findings. My experiences show that each can inform and support the other.

  3. Natural Loss of eyeless/Pax6 Expression in Eyes of Bicyclus anynana Adult Butterflies Likely Leads to Exponential Decrease of Eye Fluorescence in Transgenics

    Science.gov (United States)

    Monteiro, Antónia

    2015-01-01

    Commonly used visible markers for transgenesis use fluorescent proteins expressed at the surface of the body, such as in eyes. One commonly used marker is the 3xP3-EGFP cassette containing synthetic binding sites for the eyeless/Pax6 conserved transcription factor. This marker cassette leads to fluorescent eyes in a variety of animals tested so far. Here we show that upon reaching adulthood, transgenic Bicyclus anynana butterflies containing this marker cassette exponentially lose fluorescence in their eyes. After 12 days, transgenic individuals are no longer distinguishable from wild type individuals. The decreased eye fluorescence is likely due to significantly decreased or halted eyeless/Pax6 expression observed in wild type animals upon adult emergence. Implications from these findings include care in screening transgenic animals before these reach adulthood, or shortly thereafter, and in using adult animals of the same age for quantitative screening of likely homozygote and heterozygote individuals. PMID:26173066

  4. Natural Loss of eyeless/Pax6 Expression in Eyes of Bicyclus anynana Adult Butterflies Likely Leads to Exponential Decrease of Eye Fluorescence in Transgenics.

    Science.gov (United States)

    Das Gupta, Mainak; Chan, Sam Kok Sim; Monteiro, Antónia

    2015-01-01

    Commonly used visible markers for transgenesis use fluorescent proteins expressed at the surface of the body, such as in eyes. One commonly used marker is the 3xP3-EGFP cassette containing synthetic binding sites for the eyeless/Pax6 conserved transcription factor. This marker cassette leads to fluorescent eyes in a variety of animals tested so far. Here we show that upon reaching adulthood, transgenic Bicyclus anynana butterflies containing this marker cassette exponentially lose fluorescence in their eyes. After 12 days, transgenic individuals are no longer distinguishable from wild type individuals. The decreased eye fluorescence is likely due to significantly decreased or halted eyeless/Pax6 expression observed in wild type animals upon adult emergence. Implications from these findings include care in screening transgenic animals before these reach adulthood, or shortly thereafter, and in using adult animals of the same age for quantitative screening of likely homozygote and heterozygote individuals.
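    The "exponential decrease" in eye fluorescence can be pictured with a minimal first-order decay model (ours, in Python; the initial signal, background level and rate constant are invented, chosen only so that the signal approaches background near the 12-day mark mentioned in the abstract).

        import math

        # Illustrative exponential-decay model of eye fluorescence after adult emergence:
        # F(t) = F0 * exp(-k * t).  All parameter values are invented.
        f0 = 100.0          # fluorescence (arbitrary units) at adult emergence
        background = 2.0    # autofluorescence of wild-type, non-transgenic eyes
        k = 0.35            # per-day decay constant

        for day in (0, 3, 6, 9, 12):
            f = f0 * math.exp(-k * day)
            note = "  <- indistinguishable from wild type" if f <= background else ""
            print(f"day {day:2d}: F = {f:6.2f}{note}")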

  5. Phylogeography of postglacial range expansion in Juglans mandshurica (Juglandaceae) reveals no evidence of bottleneck, loss of genetic diversity, or isolation by distance in the leading-edge populations.

    Science.gov (United States)

    Wang, Wen-Ting; Xu, Bing; Zhang, Da-Yong; Bai, Wei-Ning

    2016-09-01

    The past studies of postglacial recolonization patterns in high latitude regions have revealed a significant role of dispersal capacity in shaping the genetic diversity and population structure of temperate trees. However, most of these studies have focused on species with long-distance dispersal followed by exponential population growth and were therefore unable to reveal the patterns in the case of a gradual expansion. Here we studied the impacts of postglacial range expansions on the distribution of genetic diversity in the Manchurian walnut (Juglans mandshurica), a common tree of East Asian cool-temperate deciduous forests that apparently lacks long-distance seed dispersal ability. The genetic diversity and structure of 19 natural walnut populations in Northeast China and the Korean Peninsula were examined using 17 nuclear simple sequence repeat (SSR) loci. Potential habitats under current and past climatic conditions were predicted using the ecological niche modelling (ENM) method. Bayesian clustering analysis revealed three groups, which were inferred to have diverged through multiple glacial-interglacial cycles in multiple refugia during the Quaternary Period. ENM estimated a southward range shift at the LGM, but high suitability scores still occurred in the western parts of the Changbai Mountains (Northeast China), the Korean peninsula and the exposed seafloor of the Yellow Sea. In contrast to most other cool-temperate trees co-occurring in the same region, the Manchurian walnut did not show any evidence of a population bottleneck, loss of genetic diversity or isolation by distance during the postglacial expansion. Our study clearly indicates that current northern populations originated from one glacial lineage and recolonization via a gradually advancing front due to the lack of a long-distance seed dispersal mechanism led to no latitudinal decrease in genetic diversity.

  6. Long-term nitrogen addition leads to loss of species richness due to litter accumulation and soil acidification in a temperate steppe.

    Directory of Open Access Journals (Sweden)

    Ying Fang

    Full Text Available BACKGROUND: Although community structure and species richness are known to respond to nitrogen fertilization dramatically, little is known about the mechanisms underlying specific species replacement and richness loss. In an experiment in semiarid temperate steppe of China, manipulative N addition with five treatments was conducted to evaluate the effect of N addition on the community structure and species richness. METHODOLOGY/PRINCIPAL FINDINGS: Species richness and biomass of community in each plot were investigated in a randomly selected quadrat. Root element, available and total phosphorus (AP, TP in rhizospheric soil, and soil moisture, pH, AP, TP and inorganic N in the soil were measured. The relationship between species richness and the measured factors was analyzed using bivariate correlations and stepwise multiple linear regressions. The two dominant species, a shrub Artemisia frigida and a grass Stipa krylovii, responded differently to N addition such that the former was gradually replaced by the latter. S. krylovii and A. frigida had highly-branched fibrous and un-branched tap root systems, respectively. S. krylovii had higher height than A. frigida in both control and N added plots. These differences may contribute to the observed species replacement. In addition, the analysis on root element and AP contents in rhizospheric soil suggests that different calcium acquisition strategies, and phosphorus and sodium responses of the two species may account for the replacement. Species richness was significantly reduced along the five N addition levels. Our results revealed a significant relationship between species richness and soil pH, litter amount, soil moisture, AP concentration and inorganic N concentration. CONCLUSIONS/SIGNIFICANCE: Our results indicate that litter accumulation and soil acidification accounted for 52.3% and 43.3% of the variation in species richness, respectively. These findings would advance our knowledge on the

  7. Loss of αT-catenin alters the hybrid adhering junctions in the heart and leads to dilated cardiomyopathy and ventricular arrhythmia following acute ischemia.

    Science.gov (United States)

    Li, Jifen; Goossens, Steven; van Hengel, Jolanda; Gao, Erhe; Cheng, Lan; Tyberghein, Koen; Shang, Xiying; De Rycke, Riet; van Roy, Frans; Radice, Glenn L

    2012-02-15

    It is generally accepted that the intercalated disc (ICD) required for mechano-electrical coupling in the heart consists of three distinct junctional complexes: adherens junctions, desmosomes and gap junctions. However, recent morphological and molecular data indicate a mixing of adherens junctional and desmosomal components, resulting in a 'hybrid adhering junction' or 'area composita'. The α-catenin family member αT-catenin, part of the N-cadherin-catenin adhesion complex in the heart, is the only α-catenin that interacts with the desmosomal protein plakophilin-2 (PKP2). Thus, it has been postulated that αT-catenin might serve as a molecular integrator of the two adhesion complexes in the area composita. To investigate the role of αT-catenin in the heart, gene targeting technology was used to delete the Ctnna3 gene, encoding αT-catenin, in the mouse. The αT-catenin-null mice are viable and fertile; however, the animals exhibit progressive cardiomyopathy. Adherens junctional and desmosomal proteins were unaffected by loss of αT-catenin, with the exception of the desmosomal protein PKP2. Immunogold labeling at the ICD demonstrated in the αT-catenin-null heart a preferential reduction of PKP2 at the area composita compared with the desmosome. Furthermore, gap junction protein Cx43 was reduced at the ICD, including its colocalization with N-cadherin. Gap junction remodeling in αT-catenin-knockout hearts was associated with an increased incidence of ventricular arrhythmias after acute ischemia. This novel animal model demonstrates for the first time how perturbation in αT-catenin can affect both PKP2 and Cx43 and thereby highlights the importance of understanding the crosstalk between the junctional proteins of the ICD and its implications for arrhythmogenic cardiomyopathy.

  8. How Big Are "Martin's Big Words"? Thinking Big about the Future.

    Science.gov (United States)

    Gardner, Traci

    "Martin's Big Words: The Life of Dr. Martin Luther King, Jr." tells of King's childhood determination to use "big words" through biographical information and quotations. In this lesson, students in grades 3 to 5 explore information on Dr. King to think about his "big" words, then they write about their own…

  9. Research on Privacy Protection in Big Data Environment

    Directory of Open Access Journals (Sweden)

    Gang Zeng

    2015-05-01

    Full Text Available Big data has now become a hot topic in academia and industry, and it is affecting how we think, work and live. However, there are many security risks in data collection, storage and use: privacy leakage causes serious problems for users, and false data leads to erroneous results in big data analysis. This paper first introduces the security problems faced by big data, analyzes the causes of the privacy problems and discusses the principles for solving them. Finally, it discusses technical means for privacy protection.

  10. How should we do the history of big data?

    OpenAIRE

    Beer, David Gareth

    2016-01-01

    Taking its lead from Ian Hacking’s article ‘How should we do the history of statistics?’, this article reflects on how we might develop a sociologically informed history of Big Data. It argues that within the history of social statistics we have a relatively well developed history of the material phenomenon of Big Data. Yet this article argues that we now need to take the concept of ‘Big Data’ seriously: there is a pressing need to explore the type of work that is being done by that concept. ...

  11. Large scale and big data processing and management

    CERN Document Server

    Sakr, Sherif

    2014-01-01

    Large Scale and Big Data: Processing and Management provides readers with a central source of reference on the data management techniques currently available for large-scale data processing. Presenting chapters written by leading researchers, academics, and practitioners, it addresses the fundamental challenges associated with Big Data processing tools and techniques across a range of computing environments.The book begins by discussing the basic concepts and tools of large-scale Big Data processing and cloud computing. It also provides an overview of different programming models and cloud-bas

  12. Big Data: Philosophy, emergence, crowdledge, and science education

    Directory of Open Access Journals (Sweden)

    Renato P. dos Santos

    2016-02-01

    Full Text Available Big Data has already moved past the hype and is now a field that deserves serious academic investigation, and natural scientists should also become familiar with analytics. On the other hand, there is little empirical evidence that any science taught in school is helping people to lead happier, more prosperous, or more politically well-informed lives. In this work, we seek support in the Philosophy and Constructionism literatures to discuss the concept of Big Data and its philosophy, the notions of ‘emergence’ and crowdledge, and how we see learning-with-Big-Data as a promising new way to learn Science.

  13. Biallelic Mutations in PDE10A Lead to Loss of Striatal PDE10A and a Hyperkinetic Movement Disorder with Onset in Infancy

    Science.gov (United States)

    Diggle, Christine P.; Sukoff Rizzo, Stacey J.; Popiolek, Michael; Hinttala, Reetta; Schülke, Jan-Philip; Kurian, Manju A.; Carr, Ian M.; Markham, Alexander F.; Bonthron, David T.; Watson, Christopher; Sharif, Saghira Malik; Reinhart, Veronica; James, Larry C.; Vanase-Frawley, Michelle A.; Charych, Erik; Allen, Melanie; Harms, John; Schmidt, Christopher J.; Ng, Joanne; Pysden, Karen; Strick, Christine; Vieira, Päivi; Mankinen, Katariina; Kokkonen, Hannaleena; Kallioinen, Matti; Sormunen, Raija; Rinne, Juha O.; Johansson, Jarkko; Alakurtti, Kati; Huilaja, Laura; Hurskainen, Tiina; Tasanen, Kaisa; Anttila, Eija; Marques, Tiago Reis; Howes, Oliver; Politis, Marius; Fahiminiya, Somayyeh; Nguyen, Khanh Q.; Majewski, Jacek; Uusimaa, Johanna; Sheridan, Eamonn; Brandon, Nicholas J.

    2016-01-01

    Deficits in the basal ganglia pathways modulating cortical motor activity underlie both Parkinson disease (PD) and Huntington disease (HD). Phosphodiesterase 10A (PDE10A) is enriched in the striatum, and animal data suggest that it is a key regulator of this circuitry. Here, we report on germline PDE10A mutations in eight individuals from two families affected by a hyperkinetic movement disorder due to homozygous mutations c.320A>G (p.Tyr107Cys) and c.346G>C (p.Ala116Pro). Both mutations lead to a reduction in PDE10A levels in recombinant cellular systems, and critically, positron-emission-tomography (PET) studies with a specific PDE10A ligand confirmed that the p.Tyr107Cys variant also reduced striatal PDE10A levels in one of the affected individuals. A knock-in mouse model carrying the homologous p.Tyr97Cys variant had decreased striatal PDE10A and also displayed motor abnormalities. Striatal preparations from this animal had an impaired capacity to degrade cyclic adenosine monophosphate (cAMP) and a blunted pharmacological response to PDE10A inhibitors. These observations highlight the critical role of PDE10A in motor control across species. PMID:27058446

  14. Loss of HGF/c-Met signaling in pancreatic β-cells leads to incomplete maternal β-cell adaptation and gestational diabetes mellitus.

    Science.gov (United States)

    Demirci, Cem; Ernst, Sara; Alvarez-Perez, Juan C; Rosa, Taylor; Valle, Shelley; Shridhar, Varsha; Casinelli, Gabriella P; Alonso, Laura C; Vasavada, Rupangi C; García-Ocana, Adolfo

    2012-05-01

    Hepatocyte growth factor (HGF) is a mitogen and insulinotropic agent for the β-cell. However, whether HGF/c-Met has a role in maternal β-cell adaptation during pregnancy is unknown. To address this issue, we characterized glucose and β-cell homeostasis in pregnant mice lacking c-Met in the pancreas (PancMet KO mice). Circulating HGF and islet c-Met and HGF expression were increased in pregnant mice. Importantly, PancMet KO mice displayed decreased β-cell replication and increased β-cell apoptosis at gestational day (GD)15. The decreased β-cell replication was associated with reductions in islet prolactin receptor levels, STAT5 nuclear localization and forkhead box M1 mRNA, and upregulation of p27. Furthermore, PancMet KO mouse β-cells were more sensitive to dexamethasone-induced cytotoxicity, whereas HGF protected human β-cells against dexamethasone in vitro. These detrimental alterations in β-cell proliferation and death led to incomplete maternal β-cell mass expansion in PancMet KO mice at GD19 and early postpartum periods. The decreased β-cell mass was accompanied by increased blood glucose, decreased plasma insulin, and impaired glucose tolerance. PancMet KO mouse islets failed to upregulate GLUT2 and pancreatic duodenal homeobox-1 mRNA, insulin content, and glucose-stimulated insulin secretion during gestation. These studies indicate that HGF/c-Met signaling is essential for maternal β-cell adaptation during pregnancy and that its absence/attenuation leads to gestational diabetes mellitus.

  15. Big data challenges: Impact, potential responses and research needs

    OpenAIRE

    Bachlechner, Daniel; Leimbach, Timo

    2016-01-01

    Although reports on big data success stories have been accumulating in the media, most organizations dealing with high-volume, high-velocity and high-variety information assets still face challenges. Only a thorough understanding of these challenges puts organizations into a position in which they can make an informed decision for or against big data, and, if the decision is positive, overcome the challenges smoothly. The combination of a series of interviews with leading experts from enterpr...

  16. Big Data-Survey

    Directory of Open Access Journals (Sweden)

    P.S.G. Aruna Sri

    2016-03-01

    Full Text Available Big data is the term for any collection of data sets so large and complex that it becomes difficult to process them using traditional data-processing applications. The challenges include analysis, capture, curation, search, sharing, storage, transfer, visualization, and privacy violations. To spot business trends, prevent diseases, combat crime and so on, we need larger data sets than before. Big data is difficult to work with using most relational database management systems and desktop statistics and visualization packages, requiring instead massively parallel software running on tens, hundreds, or even thousands of servers. This paper gives an overview of the Hadoop architecture, the different tools used for big data, and its security issues.

  17. Big Data as Governmentality

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Madsen, Anders Koed; Rasche, Andreas

    This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data’s impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects of the international development agenda to algorithms that synthesize large-scale data, (3) novel ways of rationalizing knowledge claims that underlie development policies, and (4) shifts in professional and organizational identities of those concerned with producing and processing data for development. Our discussion...

  18. ANALYTICS OF BIG DATA

    Directory of Open Access Journals (Sweden)

    Prof. Shubhada Talegaon

    2015-10-01

    Full Text Available Big Data analytics has started to impact all types of organizations, as it carries the potential to extract embedded knowledge from large amounts of data and react to it in real time. Current technology enables us to efficiently store and query large datasets; the focus is now on techniques that make use of the complete data set instead of sampling. This has tremendous implications in areas like machine learning, pattern recognition and classification, sentiment analysis, and social network analysis, to name a few. Therefore, there are a number of requirements for moving beyond standard data mining techniques. The purpose of this paper is to review the various techniques used to analyse such data.

  19. Finding the big bang

    CERN Document Server

    Page, Lyman A; Partridge, R Bruce

    2009-01-01

    Cosmology, the study of the universe as a whole, has become a precise physical science, the foundation of which is our understanding of the cosmic microwave background radiation (CMBR) left from the big bang. The story of the discovery and exploration of the CMBR in the 1960s is recalled for the first time in this collection of 44 essays by eminent scientists who pioneered the work. Two introductory chapters put the essays in context, explaining the general ideas behind the expanding universe and fossil remnants from the early stages of the expanding universe. The last chapter describes how the confusion of ideas and measurements in the 1960s grew into the present tight network of tests that demonstrate the accuracy of the big bang theory. This book is valuable to anyone interested in how science is done, and what it has taught us about the large-scale nature of the physical universe.

  20. New 'bigs' in cosmology

    Energy Technology Data Exchange (ETDEWEB)

    Yurov, Artyom V. [I. Kant Russian State University, Theoretical Physics Department, 14 Al. Nevsky St., Kaliningrad 236041 (Russian Federation); Martin-Moruno, Prado [Colina de los Chopos, Centro de Fisica ' Miguel A. Catalan' , Instituto de Matematicas y Fisica Fundamental, Consejo Superior de Investigaciones Cientificas, Serrano 121, 28006 Madrid (Spain); Gonzalez-Diaz, Pedro F. [Colina de los Chopos, Centro de Fisica ' Miguel A. Catalan' , Instituto de Matematicas y Fisica Fundamental, Consejo Superior de Investigaciones Cientificas, Serrano 121, 28006 Madrid (Spain)]. E-mail: p.gonzalezdiaz@imaff.cfmac.csic.es

    2006-12-18

    This paper contains a detailed discussion of new cosmic solutions describing the early and late evolution of a universe that is filled with a kind of dark energy that may or may not satisfy the energy conditions. The main distinctive property of the resulting space-times is that the single singular events predicted by the corresponding quintessential (phantom) models appear twice, in a manner that can be made symmetric with respect to the origin of cosmic time. Thus, the big bang and big rip singularities are shown to take place twice, once on the positive branch of time and once on the negative one. We have also considered dark energy and phantom energy accretion onto black holes and wormholes in the context of these new cosmic solutions. It is seen that the space-times of these holes would then undergo swelling processes leading to big trip and big hole events taking place at distinct epochs along the evolution of the universe. In this way, the possibility is considered that the past and future may be connected in a non-paradoxical manner in the universes described by means of the new symmetric solutions.

  1. DARPA's Big Mechanism program.

    Science.gov (United States)

    Cohen, Paul R

    2015-07-16

    Reductionist science produces causal models of small fragments of complicated systems. Causal models of entire systems can be hard to construct because what is known of them is distributed across a vast amount of literature. The Big Mechanism program aims to have machines read the literature and assemble the causal fragments found in individual papers into huge causal models, automatically. The current domain of the program is cell signalling associated with Ras-driven cancers.

  2. DARPA's Big Mechanism program

    Science.gov (United States)

    Cohen, Paul R.

    2015-07-01

    Reductionist science produces causal models of small fragments of complicated systems. Causal models of entire systems can be hard to construct because what is known of them is distributed across a vast amount of literature. The Big Mechanism program aims to have machines read the literature and assemble the causal fragments found in individual papers into huge causal models, automatically. The current domain of the program is cell signalling associated with Ras-driven cancers.

  3. Big Bang 8

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang opens with a motivating overview and guiding questions and then moves from fundamentals to applications, from the simple to the complex. Throughout, the language stays simple, everyday-oriented and narrative. Volume 8 gives an accessible account of relativity, nuclear and particle physics (and their applications in cosmology and astrophysics), nanotechnology and bionics.

  4. Big Bang 6

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang opens with a motivating overview and guiding questions and then moves from fundamentals to applications, from the simple to the complex. Throughout, the language stays simple, everyday-oriented and narrative. Volume 6 RG covers gravitation, oscillations and waves, thermodynamics and an introduction to electricity, using everyday examples and cross-links to other disciplines.

  5. Big Bang 5

    CERN Document Server

    Apolin, Martin

    2007-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang opens with a motivating overview and guiding questions and then moves from fundamentals to applications, from the simple to the complex. Throughout, the language stays simple, everyday-oriented and narrative. Volume 5 RG covers the fundamentals (systems of units, orders of magnitude) and mechanics (translation, rotation, force, conservation laws).

  6. Big Bang 7

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang opens with a motivating overview and guiding questions and then moves from fundamentals to applications, from the simple to the complex. Throughout, the language stays simple, everyday-oriented and narrative. In addition to an introduction, Volume 7 treats many current aspects of quantum mechanics (e.g. quantum teleportation) and electrodynamics (e.g. electrosmog), as well as climate change and chaos theory.

  7. Disaggregating asthma: Big investigation versus big data.

    Science.gov (United States)

    Belgrave, Danielle; Henderson, John; Simpson, Angela; Buchan, Iain; Bishop, Christopher; Custovic, Adnan

    2017-02-01

    We are facing a major challenge in bridging the gap between identifying subtypes of asthma to understand causal mechanisms and translating this knowledge into personalized prevention and management strategies. In recent years, "big data" has been sold as a panacea for generating hypotheses and driving new frontiers of health care; the idea that the data must and will speak for themselves is fast becoming a new dogma. One of the dangers of ready accessibility of health care data and computational tools for data analysis is that the process of data mining can become uncoupled from the scientific process of clinical interpretation, understanding the provenance of the data, and external validation. Although advances in computational methods can be valuable for using unexpected structure in data to generate hypotheses, there remains a need for testing hypotheses and interpreting results with scientific rigor. We argue for combining data- and hypothesis-driven methods in a careful synergy, and the importance of carefully characterized birth and patient cohorts with genetic, phenotypic, biological, and molecular data in this process cannot be overemphasized. The main challenge on the road ahead is to harness bigger health care data in ways that produce meaningful clinical interpretation and to translate this into better diagnoses and properly personalized prevention and treatment plans. There is a pressing need for cross-disciplinary research with an integrative approach to data science, whereby basic scientists, clinicians, data analysts, and epidemiologists work together to understand the heterogeneity of asthma.

  8. Hearing loss

    Science.gov (United States)

    Decreased hearing; Deafness; Loss of hearing; Conductive hearing loss; Sensorineural hearing loss; Presbycusis ... Symptoms of hearing loss may include: Certain sounds seeming too ... conversations when two or more people are talking Difficulty ...

  9. How Big is Earth?

    Science.gov (United States)

    Thurber, Bonnie B.

    2015-08-01

    How Big is Earth celebrates the Year of Light. Using only the sunlight striking the Earth and a wooden dowel, students meet each other and then measure the circumference of the earth. Eratosthenes did it over 2,000 years ago. In Cosmos, Carl Sagan shared the process by which Eratosthenes measured the angle of the shadow cast at local noon when sunlight strikes a stick positioned perpendicular to the ground. By comparing his measurement to another made a distance away, Eratosthenes was able to calculate the circumference of the earth. How Big is Earth provides an online learning environment where students do science the same way Eratosthenes did. A notable project in which this was done was The Eratosthenes Project, conducted in 2005 as part of the World Year of Physics; in fact, we will be drawing on the teacher's guide developed by that project. How Big Is Earth? expands on the Eratosthenes Project through the online learning environment of the iCollaboratory, www.icollaboratory.org, where teachers and students from Sweden, China, Nepal, Russia, Morocco, and the United States collaborate, share data, and reflect on their learning of science and astronomy. They share their information and discuss their ideas, brainstorming solutions in a discussion forum. There is an ongoing database of student measurements and another database to collect data on both teacher and student learning from surveys, discussions, and self-reflection done online. We will share our research about the kinds of learning that takes place only in global collaborations. The entrance address for the iCollaboratory is http://www.icollaboratory.org.
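
    As a hedged illustration of the calculation this project recreates (the numbers below are made up for illustration, not data from the iCollaboratory), the following Python sketch shows how the difference in noon shadow angle at two sites, together with the north-south distance between them, yields an estimate of Earth's circumference in the way Eratosthenes reasoned.

        import math  # not strictly needed here, kept for further geometry

        def earth_circumference(angle_difference_deg, distance_km):
            """Estimate Earth's circumference from the difference in the sun's
            noon shadow angle at two sites and the distance between them.
            The angle difference is the fraction of a full 360-degree circle
            that the separation between the sites represents."""
            return 360.0 / angle_difference_deg * distance_km

        # Eratosthenes-style example: a ~7.2 degree difference over ~800 km
        # gives roughly 40,000 km.
        print(earth_circumference(7.2, 800))  # -> 40000.0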

  10. Privacy and Big Data

    CERN Document Server

    Craig, Terence

    2011-01-01

    Much of what constitutes Big Data is information about us. Through our online activities, we leave an easy-to-follow trail of digital footprints that reveal who we are, what we buy, where we go, and much more. This eye-opening book explores the raging privacy debate over the use of personal data, with one undeniable conclusion: once data's been collected, we have absolutely no control over who uses it or how it is used. Personal data is the hottest commodity on the market today-truly more valuable than gold. We are the asset that every company, industry, non-profit, and government wants. Pri

  11. Big Data Challenges

    Directory of Open Access Journals (Sweden)

    Alexandru Adrian TOLE

    2013-10-01

    Full Text Available The amount of data that is traveling across the internet today, not only that is large, but is complex as well. Companies, institutions, healthcare system etc., all of them use piles of data which are further used for creating reports in order to ensure continuity regarding the services that they have to offer. The process behind the results that these entities requests represents a challenge for software developers and companies that provide IT infrastructure. The challenge is how to manipulate an impressive volume of data that has to be securely delivered through the internet and reach its destination intact. This paper treats the challenges that Big Data creates.

  12. STRUCTURAL ASPECTS OF PLASTICITY LOWERING OF HIGH-STRENGTH WIRE AT BIG CUMULATIVE COMPRESSIONS

    Directory of Open Access Journals (Sweden)

    V. P. Fetisov

    2012-01-01

    Full Text Available It is shown that the decrease in plasticity of high-strength wire at large cumulative compressions is associated with a reduction in dislocation mobility in the substructure that forms when the lamellar pearlite structure is lost.

  13. Distributed Weighted Parameter Averaging for SVM Training on Big Data

    OpenAIRE

    Das, Ayan; Bhattacharya, Sourangshu

    2015-01-01

    Two popular approaches for distributed training of SVMs on big data are parameter averaging and ADMM. Parameter averaging is efficient but suffers from a loss of accuracy as the number of partitions increases, while ADMM in the feature space is accurate but suffers from slow convergence. In this paper, we report a hybrid approach called weighted parameter averaging (WPA), which optimizes the regularized hinge loss with respect to weights on the parameters. The problem is shown to be the same as solving...
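
    For orientation, a minimal sketch of plain (unweighted) parameter averaging, the baseline that WPA improves on, might look like the following Python code. It assumes scikit-learn's LinearSVC and pre-made data partitions; it is not the authors' implementation, and WPA would additionally learn weights for the per-partition parameters instead of averaging them uniformly.

        import numpy as np
        from sklearn.svm import LinearSVC

        def averaged_svm(X_parts, y_parts, C=1.0):
            """Train one linear SVM per data partition and average the parameters."""
            coefs, intercepts = [], []
            for X, y in zip(X_parts, y_parts):
                clf = LinearSVC(C=C).fit(X, y)   # local model on one partition
                coefs.append(clf.coef_)
                intercepts.append(clf.intercept_)
            w = np.mean(coefs, axis=0)           # averaged weight vector
            b = np.mean(intercepts, axis=0)      # averaged bias
            return w, b

        def predict(w, b, X):
            # Sign of the averaged linear decision function.
            return np.sign(X @ w.ravel() + b)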

  14. Can Pleasant Goat and Big Big Wolf Save China's Animation Industry?

    Institute of Scientific and Technical Information of China (English)

    Guo Liqin

    2009-01-01

    "My dreamed husband is big big wolf," claimed Miss Fang, a young lady who works in KPMG Beijing Office. This big big wolf is a lovely cartoon wolf appeared in a Pleasant Goat and Big Big Wolf produced independently by Chinese.

  15. New Framework for Improving Big Data Analysis Using Mobile Agent

    Directory of Open Access Journals (Sweden)

    Youssef M. ESSA

    2014-01-01

    Full Text Available The rising number of applications serving millions of users and dealing with terabytes of data calls for faster processing paradigms. Recently, there has been growing enthusiasm for the notion of big data analysis. Big data analysis has become very important for productivity growth, reliability and quality of service (QoS). Processing big data on a single powerful machine is not an efficient solution, so companies have focused on using Hadoop software for big data analysis. This is because Hadoop is designed to support parallel and distributed data processing. Hadoop provides a distributed file processing system that stores and processes data at large scale. It provides fault tolerance by replicating data on three or more machines to avoid data loss. Hadoop is based on a client-server model and uses a single master machine called the NameNode. However, Hadoop has several drawbacks affecting its performance and reliability for big data analysis. In this paper, a new framework is proposed to improve big data analysis and overcome these drawbacks of Hadoop, namely task replication, the centralized master node and node failures. The proposed framework is called MapReduce Agent Mobility (MRAM). MRAM is developed using mobile agents and the MapReduce paradigm under the Java Agent Development Framework (JADE).
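
    As a hedged illustration of the MapReduce paradigm that both Hadoop and MRAM build on (this is a toy single-process sketch, not MRAM or Hadoop code), the classic word-count example can be written with separate map, shuffle and reduce steps:

        from collections import defaultdict
        from itertools import chain

        def map_phase(document):
            # Map: emit (word, 1) pairs for a single input split.
            return [(word, 1) for word in document.split()]

        def shuffle(mapped_pairs):
            # Shuffle: group intermediate values by key.
            groups = defaultdict(list)
            for key, value in mapped_pairs:
                groups[key].append(value)
            return groups

        def reduce_phase(groups):
            # Reduce: aggregate the values for each key.
            return {key: sum(values) for key, values in groups.items()}

        splits = ["big data needs parallel processing",
                  "big clusters process big data"]
        counts = reduce_phase(shuffle(chain.from_iterable(map_phase(s) for s in splits)))
        print(counts["big"])  # -> 3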

  16. Asteroids Were Born Big

    CERN Document Server

    Morbidelli, Alessandro; Nesvorny, David; Levison, Harold F

    2009-01-01

    How big were the first planetesimals? We attempt to answer this question by conducting coagulation simulations in which the planetesimals grow by mutual collisions and form larger bodies and planetary embryos. The size frequency distribution (SFD) of the initial planetesimals is considered a free parameter in these simulations, and we search for the one that, at the end, produces objects with an SFD consistent with asteroid belt constraints. We find that, if the initial planetesimals were small (e.g. km-sized), the final SFD fails to fulfill these constraints. In particular, reproducing the bump observed at diameter D~100 km in the current SFD of the asteroids requires that the minimal size of the initial planetesimals was also ~100 km. This supports the idea that planetesimals formed big, namely that the size of solids in the proto-planetary disk "jumped" from sub-meter scale to multi-kilometer scale, without passing through intermediate values. Moreover, we find evidence that the initial planetesimals ...

  17. Knockdown of ornithine decarboxylase antizyme 1 causes loss of uptake regulation leading to increased N1, N11-bis(ethyl)norspermine (BENSpm) accumulation and toxicity in NCI H157 lung cancer cells.

    Science.gov (United States)

    Fraser, Alison V; Goodwin, Andrew C; Hacker-Prietz, Amy; Sugar, Elizabeth; Woster, Patrick M; Casero, Robert A

    2012-02-01

    Ornithine decarboxylase antizyme 1 (AZ1) is a major regulatory protein responsible for the regulation and degradation of ornithine decarboxylase (ODC). To better understand the role of AZ1 in polyamine metabolism and in modulating the response to anticancer polyamine analogues, a small interfering RNA strategy was used to create a series of stable clones in human H157 non-small cell lung cancer cells that expressed less than 5-10% of basal AZ1 levels. Antizyme 1 knockdown clones accumulated greater amounts of the polyamine analogue N1,N11-bis(ethyl)norspermine (BENSpm) and were more sensitive to analogue treatment. The possibility of a loss of polyamine uptake regulation in the knockdown clones was confirmed by polyamine uptake analysis. These results are consistent with the hypothesis that AZ1 knockdown leads to dysregulation of polyamine uptake, resulting in increased analogue accumulation and toxicity. Importantly, there appears to be little difference between AZ1 knockdown cells and cells with normal levels of AZ1 with respect to ODC regulation, suggesting that another regulatory protein, potentially AZ2, compensates for the loss of AZ1. The results of these studies are important both for understanding the regulation of polyamine homeostasis and for understanding the factors that regulate tumor cell sensitivity to the anti-tumor polyamine analogues.

  18. Passport to the Big Bang

    CERN Multimedia

    De Melis, Cinzia

    2013-01-01

    On 2 June 2013 CERN launched the Passport to the Big Bang, a scientific tourist trail through the Pays de Gex and the Canton of Geneva, with a major public event. Poster and programme.

  19. Fremtidens landbrug bliver big business

    DEFF Research Database (Denmark)

    Hansen, Henning Otte

    2016-01-01

    The external conditions and competitive environment of agriculture are changing, and this will necessitate a development towards "big business", in which farms become still larger, more industrialised and more concentrated. Big business will be a dominant development in Danish agriculture - but not the only one...

  20. Capture reactions on C-14 in nonstandard big bang nucleosynthesis

    Science.gov (United States)

    Wiescher, Michael; Gorres, Joachim; Thielemann, Friedrich-Karl

    1990-01-01

    Nonstandard big bang nucleosynthesis leads to the production of C-14. The further reaction path depends on the depletion of C-14 by either photon, alpha, or neutron capture reactions. The nucleus C-14 is of particular importance in these scenarios because it forms a bottleneck for the production of heavier nuclei with A greater than 14. The reaction rates of all three capture reactions under big bang conditions are discussed, and it is shown that the resulting reaction path, leading to the production of heavier elements, is dominated by the (p, gamma) and (n, gamma) rates, contrary to earlier suggestions.

  1. The Rise of Big Data in Neurorehabilitation.

    Science.gov (United States)

    Faroqi-Shah, Yasmeen

    2016-02-01

    In some fields, Big Data has been instrumental in analyzing, predicting, and influencing human behavior. However, Big Data approaches have so far been less central in speech-language pathology. This article introduces the concept of Big Data and provides examples of Big Data initiatives pertaining to adult neurorehabilitation. It also discusses the potential theoretical and clinical contributions that Big Data can make. The article also recognizes some impediments in building and using Big Data for scientific and clinical inquiry.

  2. BIG DATA AND STATISTICS

    Science.gov (United States)

    Rossell, David

    2016-01-01

    Big Data brings unprecedented power to address scientific, economic and societal issues, but also amplifies the possibility of certain pitfalls. These include using purely data-driven approaches that disregard understanding the phenomenon under study, aiming at a dynamically moving target, ignoring critical data collection issues, summarizing or preprocessing the data inadequately and mistaking noise for signal. We review some success stories and illustrate how statistical principles can help obtain more reliable information from data. We also touch upon current challenges that require active methodological research, such as strategies for efficient computation, integration of heterogeneous data, extending the underlying theory to increasingly complex questions and, perhaps most importantly, training a new generation of scientists to develop and deploy these strategies.

  3. Big Bounce Genesis

    CERN Document Server

    Li, Changhong; Cheung, Yeuk-Kwan E

    2014-01-01

    We report on the possibility of using the dark matter mass and its interaction cross section as a smoking gun signal of the existence of a big bounce at the early stage in the evolution of our currently observed universe. A model-independent study of dark matter production in the contraction and expansion phases of the bounce universe reveals a new avenue for achieving the observed relic abundance, in which a significantly smaller amount of dark matter, compared to standard cosmology, is produced and survives until today, diluted only by the cosmic expansion since the radiation dominated era. Once DM mass and its interaction strength with ordinary matter are determined by experiments, this alternative route becomes a signature of the bounce universe scenario.

  4. Big Hero 6

    Institute of Scientific and Technical Information of China (English)

    2015-01-01

    See how Big Hero 6 lets ordinary people become superheroes and save their city! Hiro Hamada, 14, lives in the future city of San Fransokyo. He has a robot friend, Baymax. Baymax is big and soft. His job is to nurse sick people. One day, a bad man wants to take control of San Fransokyo. Hiro hopes to save the city with Baymax. But Baymax is just a nursing robot. This is not a problem for Hiro, however. He knows a lot about robots. He makes a suit of armor for Baymax and turns him into a super robot!

  5. The Last Big Bang

    Energy Technology Data Exchange (ETDEWEB)

    McGuire, Austin D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Meade, Roger Allen [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-09-13

    As one of the very few people in the world to give the “go/no go” decision to detonate a nuclear device, Austin “Mac” McGuire holds a very special place in the history of both the Los Alamos National Laboratory and the world. As Commander of Joint Task Force Unit 8.1.1, on Christmas Island in the spring and summer of 1962, Mac directed the Los Alamos data collection efforts for twelve of the last atmospheric nuclear detonations conducted by the United States. Since data collection was at the heart of nuclear weapon testing, it fell to Mac to make the ultimate decision to detonate each test device. He calls his experience THE LAST BIG BANG, since these tests, part of Operation Dominic, were characterized by the dramatic displays of the heat, light, and sounds unique to atmospheric nuclear detonations – never, perhaps, to be witnessed again.

  6. Avoiding a Big Catastrophe

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Before last October, the South China tiger had almost slipped into mythical status, as it had been absent for so long from the public eye. In the previous 20-plus years, these tigers could not be found in the wild in China, and the number of those in captivity stood at only around 60. The species, a direct descendent of the earliest tigers thought to have originated in China 2 million years ago, is functionally extinct, according to experts. The big cat's return to the media spotlight was completely unexpected. On October 12, 2007, a digital picture showing a wild South China tiger

  7. Rotational inhomogeneities from pre-big bang?

    CERN Document Server

    Giovannini, Massimo

    2005-01-01

    The evolution of the rotational inhomogeneities is investigated in the specific framework of four-dimensional pre-big bang models. While minimal (dilaton-driven) scenarios do not lead to rotational fluctuations, in the case of non-minimal (string-driven) models, fluid sources are present in the pre-big bang phase. The rotational modes of the geometry, coupled to the divergenceless part of the velocity field, can then be amplified depending upon the value of the barotropic index of the perfect fluids. In the light of a possible production of rotational inhomogeneities, solutions describing the coupled evolution of the dilaton field and of the fluid sources are scrutinized in both the string and Einstein frames. In semi-realistic scenarios, where the curvature divergences are regularized by means of a non-local dilaton potential, the rotational inhomogeneities are amplified during the pre-big bang phase but they decay later on. Similar analyses can also be performed when a contraction occurs directly in the str...

  8. Managing Research Data in Big Science

    CERN Document Server

    Gray, Norman; Woan, Graham

    2012-01-01

    The project which led to this report was funded by JISC in 2010--2011 as part of its 'Managing Research Data' programme, to examine the way in which Big Science data is managed, and produce any recommendations which may be appropriate. Big science data is different: it comes in large volumes, and it is shared and exploited in ways which may differ from other disciplines. This project has explored these differences using as a case-study Gravitational Wave data generated by the LSC, and has produced recommendations intended to be useful variously to JISC, the funding council (STFC) and the LSC community. In Sect. 1 we define what we mean by 'big science', describe the overall data culture there, laying stress on how it necessarily or contingently differs from other disciplines. In Sect. 2 we discuss the benefits of a formal data-preservation strategy, and the cases for open data and for well-preserved data that follow from that. This leads to our recommendations that, in essence, funders should adopt rather lig...

  9. Tax Expert Offers Ideas for Monitoring Big Spending on College Sports

    Science.gov (United States)

    Sander, Libby

    2009-01-01

    The federal government could take a cue from its regulation of charitable organizations in monitoring the freewheeling fiscal habits of big-time college athletics, a leading tax lawyer says. The author reports on the ideas offered by John D. Colombo, a professor at the University of Illinois College of Law, for monitoring big spending on college…

  10. Natural regeneration processes in big sagebrush (Artemisia tridentata)

    Science.gov (United States)

    Schlaepfer, Daniel R.; Lauenroth, William K.; Bradford, John B.

    2014-01-01

    Big sagebrush, Artemisia tridentata Nuttall (Asteraceae), is the dominant plant species of large portions of semiarid western North America. However, much of historical big sagebrush vegetation has been removed or modified. Thus, regeneration is recognized as an important component for land management. Limited knowledge about key regeneration processes, however, represents an obstacle to identifying successful management practices and to gaining greater insight into the consequences of increasing disturbance frequency and global change. Therefore, our objective is to synthesize knowledge about natural big sagebrush regeneration. We identified and characterized the controls of big sagebrush seed production, germination, and establishment. The largest knowledge gaps and associated research needs include quiescence and dormancy of embryos and seedlings; variation in seed production and germination percentages; wet-thermal time model of germination; responses to frost events (including freezing/thawing of soils), CO2 concentration, and nutrients in combination with water availability; suitability of microsite vs. site conditions; competitive ability as well as seedling growth responses; and differences among subspecies and ecoregions. Potential impacts of climate change on big sagebrush regeneration could include that temperature increases may not have a large direct influence on regeneration due to the broad temperature optimum for regeneration, whereas indirect effects could include selection for populations with less stringent seed dormancy. Drier conditions will have direct negative effects on germination and seedling survival and could also lead to lighter seeds, which lowers germination success further. The short seed dispersal distance of big sagebrush may limit its tracking of suitable climate; whereas, the low competitive ability of big sagebrush seedlings may limit successful competition with species that track climate. An improved understanding of the

  11. Trends in IT Innovation to Build a Next Generation Bioinformatics Solution to Manage and Analyse Biological Big Data Produced by NGS Technologies

    Directory of Open Access Journals (Sweden)

    Alexandre G. de Brevern

    2015-01-01

    Full Text Available Sequencing the human genome began in 1994, and 10 years of work were necessary in order to provide a nearly complete sequence. Nowadays, NGS technologies allow sequencing of a whole human genome in a few days. This deluge of data challenges scientists in many ways, as they are faced with data management issues and analysis and visualization drawbacks due to the limitations of current bioinformatics tools. In this paper, we describe how the NGS Big Data revolution changes the way of managing and analysing data. We present how biologists are confronted with an abundance of methods, tools, and data formats. To overcome these problems, we focus on Big Data Information Technology innovations from the web and business intelligence. We underline the interest of NoSQL databases, which are much more efficient than relational databases. Since Big Data leads to the loss of interactivity with data during analysis due to high processing time, we describe solutions from Business Intelligence that allow one to regain interactivity whatever the volume of data is. We illustrate this point with a focus on the Amadea platform. Finally, we discuss visualization challenges posed by Big Data and present the latest innovations with JavaScript graphic libraries.

  12. Trends in IT Innovation to Build a Next Generation Bioinformatics Solution to Manage and Analyse Biological Big Data Produced by NGS Technologies.

    Science.gov (United States)

    de Brevern, Alexandre G; Meyniel, Jean-Philippe; Fairhead, Cécile; Neuvéglise, Cécile; Malpertuy, Alain

    2015-01-01

    Sequencing the human genome began in 1994, and 10 years of work were necessary in order to provide a nearly complete sequence. Nowadays, NGS technologies allow sequencing of a whole human genome in a few days. This deluge of data challenges scientists in many ways, as they are faced with data management issues and analysis and visualization drawbacks due to the limitations of current bioinformatics tools. In this paper, we describe how the NGS Big Data revolution changes the way of managing and analysing data. We present how biologists are confronted with an abundance of methods, tools, and data formats. To overcome these problems, we focus on Big Data Information Technology innovations from the web and business intelligence. We underline the interest of NoSQL databases, which are much more efficient than relational databases. Since Big Data leads to the loss of interactivity with data during analysis due to high processing time, we describe solutions from Business Intelligence that allow one to regain interactivity whatever the volume of data is. We illustrate this point with a focus on the Amadea platform. Finally, we discuss visualization challenges posed by Big Data and present the latest innovations with JavaScript graphic libraries.

  13. Big data algorithms, analytics, and applications

    CERN Document Server

    Li, Kuan-Ching; Yang, Laurence T; Cuzzocrea, Alfredo

    2015-01-01

    Data are generated at an exponential rate all over the world. Through advanced algorithms and analytics techniques, organizations can harness this data, discover hidden patterns, and use the findings to make meaningful decisions. Containing contributions from leading experts in their respective fields, this book bridges the gap between the vastness of big data and the appropriate computational methods for scientific and social discovery. It also explores related applications in diverse sectors, covering technologies for media/data communication, elastic media/data storage, cross-network media/

  14. Big Bang of Massenergy and Negative Big Bang of Spacetime

    Science.gov (United States)

    Cao, Dayong

    2017-01-01

    There is a balance between the Big Bang of massenergy and a negative Big Bang of spacetime in the universe. Some scientists have also considered that there was an anti-Big Bang, which could have produced antimatter. The paper supposes a structural balance between the Einstein field equation and a negative Einstein field equation, a balance between massenergy structure and spacetime structure, a balance between the energy of the nucleus of stellar matter and the dark energy of the nucleus of dark matter-dark energy, and a balance between particle and wave, that is, a balance system between massenergy (particle) and spacetime (wave). This should explain some of the problems of the Big Bang. http://meetings.aps.org/Meeting/APR16/Session/M13.8

  15. Google BigQuery analytics

    CERN Document Server

    Tigani, Jordan

    2014-01-01

    How to effectively use BigQuery, avoid common mistakes, and execute sophisticated queries against large datasets. Google BigQuery Analytics is the perfect guide for business and data analysts who want the latest tips on running complex queries and writing code to communicate with the BigQuery API. The book uses real-world examples to demonstrate current best practices and techniques, and also explains and demonstrates streaming ingestion, transformation via Hadoop in Google Compute engine, AppEngine datastore integration, and using GViz with Tableau to generate charts of query results. In addit
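
    For orientation only (the book's own examples are more involved), a minimal query against the BigQuery API with the Python client library might look like the sketch below; it assumes the google-cloud-bigquery package and default application credentials, and the public sample table and column names are used purely as an illustrative placeholder.

        from google.cloud import bigquery  # pip install google-cloud-bigquery

        client = bigquery.Client()  # picks up application default credentials

        # Aggregate over a large table without sampling; the table is a
        # commonly cited public sample dataset, used here only as an example.
        sql = """
            SELECT word, SUM(word_count) AS total
            FROM `bigquery-public-data.samples.shakespeare`
            GROUP BY word
            ORDER BY total DESC
            LIMIT 10
        """

        for row in client.query(sql).result():
            print(row.word, row.total)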

  16. Big Data: present and future

    Directory of Open Access Journals (Sweden)

    Mircea Raducu TRIFU

    2014-05-01

    Full Text Available The paper explains the importance of the Big Data concept, a concept that even now, after years of development, is for most companies just a cool keyword. The paper also describes the current level of big data development and the things it can do, as well as the things that can be done in the near future. The paper focuses on explaining to non-technical and non-database-related technical specialists what big data basically is, presents the three most important V's as well as the newer ones, the most important solutions used by companies like Google or Amazon, and some interesting perspectives on the subject.

  17. The challenges of big data.

    Science.gov (United States)

    Mardis, Elaine R

    2016-05-01

    The largely untapped potential of big data analytics is a feeding frenzy that has been fueled by the production of many next-generation-sequencing-based data sets that are seeking to answer long-held questions about the biology of human diseases. Although these approaches are likely to be a powerful means of revealing new biological insights, there are a number of substantial challenges that currently hamper efforts to harness the power of big data. This Editorial outlines several such challenges as a means of illustrating that the path to big data revelations is paved with perils that the scientific community must overcome to pursue this important quest.

  18. Big Data: present and future

    OpenAIRE

    Mircea Raducu TRIFU; Mihaela Laura IVAN

    2014-01-01

    The paper explains the importance of the Big Data concept, a concept that even now, after years of development, is for most companies just a cool keyword. The paper also describes the current level of big data development and the things it can do, as well as the things that can be done in the near future. The paper focuses on explaining to non-technical and non-database-related technical specialists what big data basically is, presents the three most important V's, as well as the new ...

  19. Big Data Mining: Tools & Algorithms

    Directory of Open Access Journals (Sweden)

    Adeel Shiraz Hashmi

    2016-03-01

    Full Text Available We are now in Big Data era, and there is a growing demand for tools which can process and analyze it. Big data analytics deals with extracting valuable information from that complex data which can’t be handled by traditional data mining tools. This paper surveys the available tools which can handle large volumes of data as well as evolving data streams. The data mining tools and algorithms which can handle big data have also been summarized, and one of the tools has been used for mining of large datasets using distributed algorithms.

  20. The challenges of big data

    Science.gov (United States)

    2016-01-01

    ABSTRACT The largely untapped potential of big data analytics is a feeding frenzy that has been fueled by the production of many next-generation-sequencing-based data sets that are seeking to answer long-held questions about the biology of human diseases. Although these approaches are likely to be a powerful means of revealing new biological insights, there are a number of substantial challenges that currently hamper efforts to harness the power of big data. This Editorial outlines several such challenges as a means of illustrating that the path to big data revelations is paved with perils that the scientific community must overcome to pursue this important quest. PMID:27147249

  1. Big Data is invading big places as CERN

    CERN Document Server

    CERN. Geneva

    2017-01-01

    Big Data technologies are becoming more popular with the constant growth of data generation in different fields such as social networks, the internet of things and laboratories like CERN. How is CERN making use of such technologies? How is machine learning applied at CERN with Big Data technologies? How much data do we move, and how is it analyzed? All these questions will be answered during the talk.

  2. A view on big data and its relation to Informetrics

    Institute of Scientific and Technical Information of China (English)

    Ronald; ROUSSEAU

    2012-01-01

    Purpose: Big data offer a huge challenge. Their very existence leads to the contradiction that the more data we have the less accessible they become, as the particular piece of information one is searching for may be buried among terabytes of other data. In this contribution we discuss the origin of big data and point to three challenges when big data arise: data storage, data processing and generating insights. Design/methodology/approach: Computer-related challenges can be expressed by the CAP theorem, which states that it is only possible to simultaneously provide any two of the three following properties in distributed applications: consistency (C), availability (A) and partition tolerance (P). As an aside we mention Amdahl's law and its application for scientific collaboration. We further discuss data mining in large databases and knowledge representation for handling the results of data mining exercises. We also offer a short informetric study of the field of big data, and point to the ethical dimension of the big data phenomenon. Findings: There still are serious problems to overcome before the field of big data can deliver on its promises. Implications and limitations: This contribution offers a personal view, focusing on the information science aspects, but much more can be said about software aspects. Originality/value: We express the hope that information scientists, including librarians, will be able to play their full role within the knowledge discovery, data mining and big data communities, leading to exciting developments, the reduction of scientific bottlenecks and really innovative applications.
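
    As a side note to the abstract's mention of Amdahl's law, the bound it refers to can be written down directly; the small Python sketch below is an illustration of the standard formula, not material from the paper, and evaluates the maximum speedup for a task whose parallelisable fraction is p when run on n processors.

        def amdahl_speedup(p, n):
            """Amdahl's law: maximum speedup on n processors when a fraction p
            of the work can be parallelised and (1 - p) remains serial."""
            return 1.0 / ((1.0 - p) + p / n)

        # Even with 1000 processors, a 5% serial fraction caps speedup near 20x.
        print(round(amdahl_speedup(0.95, 1000), 1))  # -> 19.6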

  3. The Big Chills

    Science.gov (United States)

    Bond, G. C.; Dwyer, G. S.; Bauch, H. A.

    2002-12-01

    At the end of the last glacial, the Earth's climate system abruptly shifted into the Younger Dryas, a 1500-year long cold snap known in the popular media as the Big Chill. Following an abrupt warming ending the Younger Dryas about 11,600 years ago, the climate system has remained in an interglacial state, thought to have been relatively stable and devoid, with possibly one or two exceptions, of abrupt climate change. A growing amount of evidence suggests that this benign view of interglacial climate is incorrect. High resolution records of North Atlantic ice rafted sediment, now regarded as evidence of extreme multiyear sea ice drift, reveal abrupt shifts on centennial and millennial time scales. These have been traced from the end of the Younger Dryas to the present, revealing evidence of significant climate variability through all of the last two millennia. Correlatives of these events have been found in drift ice records from the Arctic's Laptev Sea, in the isotopic composition of North Grip ice, and in dissolved K from the GISP2 ice core, attesting to their regional extent and imprint in proxies of very different origins. Measurements of Mg/Ca ratios in planktic foraminifera over the last two millennia in the eastern North Atlantic demonstrate that increases in drifting multiyear sea ice were accompanied by abrupt decreases in sea surface temperatures, especially during the Little Ice Age. Estimated rates of temperature change are on the order of two degrees centigrade, more than thirty percent of the regional glacial to interglacial change, within a few decades. When compared at the same resolution, these interglacial variations are as abrupt as the last glacial's Dansgaard-Oeschger cycles. The interglacial abrupt changes are especially striking because they occurred within the core of the warm North Atlantic Current. The changes may have been triggered by variations in solar irradiance, but if so their large magnitude and regional extent requires amplifying

  4. Big Data and Perioperative Nursing.

    Science.gov (United States)

    Westra, Bonnie L; Peterson, Jessica J

    2016-10-01

    Big data are large volumes of digital data that can be collected from disparate sources and are challenging to analyze. These data are often described with the five "Vs": volume, velocity, variety, veracity, and value. Perioperative nurses contribute to big data through documentation in the electronic health record during routine surgical care, and these data have implications for clinical decision making, administrative decisions, quality improvement, and big data science. This article explores methods to improve the quality of perioperative nursing data and provides examples of how these data can be combined with broader nursing data for quality improvement. We also discuss a national action plan for nursing knowledge and big data science and how perioperative nurses can engage in collaborative actions to transform health care. Standardized perioperative nursing data has the potential to affect care far beyond the original patient.

  5. Big, Fat World of Lipids

    Science.gov (United States)

    ... Science Home Page The Big, Fat World of Lipids By Emily Carlson Posted August 9, 2012 Cholesterol ... ways to diagnose and treat lipid-related conditions. Lipid Encyclopedia Just as genomics and proteomics spurred advances ...

  6. The BigBOSS Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schelgel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna

    2011-01-01

    BigBOSS will obtain observational constraints that will bear on three of the four 'science frontier' questions identified by the Astro2010 Cosmology and Fundamental Physics Panel of the Decadal Survey: Why is the universe accelerating? What is dark matter, and what are the properties of neutrinos? Indeed, the BigBOSS project was recommended for substantial immediate R and D support in the PASAG report. The second highest ground-based priority from the Astro2010 Decadal Survey was the creation of a funding line within the NSF to support a 'Mid-Scale Innovations' program, and it used BigBOSS as a 'compelling' example for support. This choice was the result of the Decadal Survey's Program Prioritization panels reviewing 29 mid-scale projects and recommending BigBOSS 'very highly'.

  7. Comments on Thomas Wartenberg's "Big Ideas for Little Kids"

    Science.gov (United States)

    Goering, Sara

    2012-01-01

    This short commentary offers praise for Tom Wartenberg's book "Big Ideas for Little Kids" and raises questions about who is best qualified to lead a philosophy discussion with children, and how we are to assess the benefits of doing philosophy with children.

  8. Big Bang Nucleosynthesis: 2015

    CERN Document Server

    Cyburt, Richard H; Olive, Keith A; Yeh, Tsung-Han

    2015-01-01

    Big-bang nucleosynthesis (BBN) describes the production of the lightest nuclides via a dynamic interplay among the four fundamental forces during the first seconds of cosmic time. We briefly overview the essentials of this physics, and present new calculations of light element abundances through 6Li and 7Li, with updated nuclear reactions and uncertainties including those in the neutron lifetime. We provide fits to these results as a function of baryon density and of the number of neutrino flavors, N_nu. We review recent developments in BBN, particularly new, precision Planck cosmic microwave background (CMB) measurements that now probe the baryon density, helium content, and the effective number of degrees of freedom, N_eff. These measurements allow for a tight test of BBN and of cosmology using CMB data alone. Our likelihood analysis convolves the 2015 Planck data chains with our BBN output and observational data. Adding astronomical measurements of light elements strengthens the power of BBN. We include a ...

  9. Big bang darkleosynthesis

    Directory of Open Access Journals (Sweden)

    Gordan Krnjaic

    2015-12-01

    Full Text Available In a popular class of models, dark matter comprises an asymmetric population of composite particles with short range interactions arising from a confined nonabelian gauge group. We show that coupling this sector to a well-motivated light mediator particle yields efficient darkleosynthesis, a dark-sector version of big-bang nucleosynthesis (BBN), in generic regions of parameter space. Dark matter self-interaction bounds typically require the confinement scale to be above ΛQCD, which generically yields large (≫MeV/dark-nucleon) binding energies. These bounds further suggest the mediator is relatively weakly coupled, so repulsive forces between dark-sector nuclei are much weaker than Coulomb repulsion between standard-model nuclei, which results in an exponential barrier-tunneling enhancement over standard BBN. Thus, darklei are easier to make and harder to break than visible species with comparable mass numbers. This process can efficiently yield a dominant population of states with masses significantly greater than the confinement scale and, in contrast to dark matter that is a fundamental particle, may allow the dominant form of dark matter to have high spin (S≫3/2), whose discovery would be smoking gun evidence for dark nuclei.

  10. Big Data Comes to School

    OpenAIRE

    Bill Cope; Mary Kalantzis

    2016-01-01

    The prospect of “big data” at once evokes optimistic views of an information-rich future and concerns about surveillance that adversely impacts our personal and private lives. This overview article explores the implications of big data in education, focusing by way of example on data generated by student writing. We have chosen writing because it presents particular complexities, highlighting the range of processes for collecting and interpreting evidence of learning in the era of computer-me...

  11. Big Data for Precision Medicine

    OpenAIRE

    Daniel Richard Leff; Guang-Zhong Yang

    2015-01-01

    This article focuses on the potential impact of big data analysis to improve health, prevent and detect disease at an earlier stage, and personalize interventions. The role that big data analytics may have in interrogating the patient electronic health record toward improved clinical decision support is discussed. We examine developments in pharmacogenetics that have increased our appreciation of the reasons why patients respond differently to chemotherapy. We also assess the expansion of onl...

  12. The role of big laboratories

    CERN Document Server

    Heuer, Rolf-Dieter

    2013-01-01

    This paper presents the role of big laboratories in their function as research infrastructures. Starting from the general definition and features of big laboratories, the paper goes on to present the key ingredients and issues, based on scientific excellence, for the successful realization of large-scale science projects at such facilities. The paper concludes by taking the example of scientific research in the field of particle physics and describing the structures and methods required to be implemented for the way forward.

  13. Lead Poisoning

    Science.gov (United States)

    Lead is a metal that occurs naturally in the earth's crust. Lead can be found in all parts of our ... from human activities such as mining and manufacturing. Lead used to be in paint; older houses may ...

  14. Powering Big Data for Nursing Through Partnership.

    Science.gov (United States)

    Harper, Ellen M; Parkerson, Sara

    2015-01-01

    The Big Data Principles Workgroup (Workgroup) was established with support of the Healthcare Information and Management Systems Society. Building on the Triple Aim challenge, the Workgroup sought to identify Big Data principles, barriers, and challenges to nurse-sensitive data inclusion into Big Data sets. The product of this pioneering partnership Workgroup was the "Guiding Principles for Big Data in Nursing-Using Big Data to Improve the Quality of Care and Outcomes."

  15. Lead Toxicity

    Science.gov (United States)

    ... including some imported jewelry. What are the health effects of lead? • More commonly, lower levels of lead in children over time may lead to reduced IQ, slow learning, Attention Deficit Hyperactivity Disorder (ADHD), or behavioral issues. • Lead also affects other ...

  16. Where Are the Logical Errors in the Theory of Big Bang?

    Science.gov (United States)

    Kalanov, Temur Z.

    2015-04-01

    A critical analysis of the foundations of the theory of the Big Bang is proposed. The unity of formal logic and of rational dialectics is the methodological basis of the analysis. It is argued that the starting point of the theory of the Big Bang contains three fundamental logical errors. The first error is the assumption that a macroscopic object (having qualitative determinacy) can have an arbitrarily small size and can be in the singular state (i.e., in the state that has no qualitative determinacy). This assumption implies that the transition (macroscopic object having qualitative determinacy) --> (singular state of matter that has no qualitative determinacy) leads to a loss of the information contained in the macroscopic object. The second error is the assumption that there exist a void and a boundary between matter and void. But if such a boundary existed, then it would mean that the void has dimensions and can be measured. The third error is the assumption that the singular state of matter can make a transition into the normal state without the existence of a program of qualitative and quantitative development of the matter, and without the controlling influence of another (independent) object. However, these assumptions conflict with practice and, consequently, with formal logic, rational dialectics, and cybernetics. Indeed, from the point of view of cybernetics, the transition (singular state of the Universe) --> (normal state of the Universe) would be possible only if there were a Managed Object outside the Universe that has full, complete, and detailed information about the Universe. Thus, the theory of the Big Bang is a scientific fiction.

  17. Hair Loss

    Science.gov (United States)

    ... loss at the scarred areas. These conditions include lichen planus, some types of lupus and sarcoidosis. Hair- ... increase your risk of hair loss, including: Family history Age Poor nutrition Certain medical conditions, such as ...

  18. Hearing Loss

    Science.gov (United States)

    ... effects on your hearing — ringing in the ear (tinnitus) or hearing loss — can occur if you take ... adults with hearing loss, commonly reported problems include: Depression Anxiety An often false sense that others are ...

  19. Big data challenges: impact, potential responses and research needs

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2016-01-01

    Although reports on big data success stories have been accumulating in the media, most organizations dealing with high-volume, high-velocity and high-variety information assets still face challenges. Only a thorough understanding of these challenges puts organizations into a position in which they can make an informed decision for or against big data, and, if the decision is positive, overcome the challenges smoothly. The combination of a series of interviews with leading experts from enterprises, associations and research institutions, and focused literature reviews allowed not only ... framework are also relevant. For large enterprises and startups specialized in big data, it is typically easier to overcome the challenges than it is for other enterprises and public administration bodies.

  20. Can companies benefit from Big Science? Science and Industry

    CERN Document Server

    Autio, Erkko; Bianchi-Streit, M

    2003-01-01

    Several studies have indicated that there are significant returns on financial investment via "Big Science" centres. Financial multipliers ranging from 2.7 (ESA) to 3.7 (CERN) have been found, meaning that each Euro invested in industry by Big Science generates a two- to fourfold return for the supplier. Moreover, laboratories such as CERN are proud of their record in technology transfer, where research developments lead to applications in other fields - for example, with particle accelerators and detectors. Less well documented, however, is the effect of the experience that technological firms gain through working in the arena of Big Science. Indeed, up to now there has been no explicit empirical study of such benefits. Our findings reveal a variety of outcomes, which include technological learning, the development of new products and markets, and impact on the firm's organization. The study also demonstrates the importance of technologically challenging projects for staff at CERN. Together, these findings i...

  1. Confinement loss scaling law analysis in tube lattice fibers for terahertz applications

    Science.gov (United States)

    Masruri, M.; Vincetti, L.; Molardi, C.; Coscelli, E.; Cucinotta, A.; Selleri, S.

    2014-03-01

    The development of low-loss, small-size and flexible waveguides is one of the most challenging issues in THz research, owing to the poor characteristics of both metals and dielectrics in this frequency range. Hollow core tube lattice fibers (HCTLFs) have recently been proposed and experimentally demonstrated to overcome this problem. However, they require a very large hollow core, leading to big and hardly flexible fibers. Scaling law analysis plays an important role in determining the best trade-off between low loss and small fiber diameter. The dependence of the confinement loss on frequency and core radius is numerically investigated here. Results show that the confinement loss exhibits a stronger dependence on core size and frequency than in other hollow core fibers proposed for THz waveguiding, such as Bragg, Tube, and Kagome fibers.
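
    A scaling-law analysis of this kind typically boils down to fitting a power law to simulated loss values. The sketch below (plain Python with NumPy, synthetic data, invented exponents) only illustrates how a confinement-loss scaling CL ~ A * f^a * R^b could be fitted on log-log axes; it does not reproduce the paper's fibers or results.

```python
# Illustrative only: fit a power-law scaling CL ~ A * f^a * R^b from synthetic
# "simulation" points. Frequencies, radii and exponents below are invented.
import numpy as np

rng = np.random.default_rng(1)
f = rng.uniform(0.5, 2.0, 200)      # frequency in THz (synthetic)
R = rng.uniform(0.5, 3.0, 200)      # core radius in mm (synthetic)
true_logA, true_a, true_b = 2.0, -3.0, -4.0
logCL = true_logA + true_a * np.log(f) + true_b * np.log(R) + rng.normal(0, 0.05, 200)

# Least-squares fit of log CL = log A + a*log f + b*log R
M = np.column_stack([np.ones_like(f), np.log(f), np.log(R)])
coeffs, *_ = np.linalg.lstsq(M, logCL, rcond=None)
print("fitted [logA, a, b]:", np.round(coeffs, 2))
```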

  2. The use of big data in transfusion medicine.

    Science.gov (United States)

    Pendry, K

    2015-06-01

    'Big data' refers to the huge quantities of digital information now available that describe much of human activity. The science of data management and analysis is rapidly developing to enable organisations to convert data into useful information and knowledge. Electronic health records and new developments in Pathology Informatics now support the collection of 'big laboratory and clinical data', and these digital innovations are now being applied to transfusion medicine. To use big data effectively, we must address concerns about confidentiality and the need for a change in culture and practice, remove barriers to adopting common operating systems and data standards and ensure the safe and secure storage of sensitive personal information. In the UK, the aim is to formulate a single set of data and standards for communicating test results and so enable pathology data to contribute to national datasets. In transfusion, big data has been used for benchmarking, detection of transfusion-related complications, determining patterns of blood use and definition of blood order schedules for surgery. More generally, rapidly available information can monitor compliance with key performance indicators for patient blood management and inventory management leading to better patient care and reduced use of blood. The challenges of enabling reliable systems and analysis of big data and securing funding in the restrictive financial climate are formidable, but not insurmountable. The promise is that digital information will soon improve the implementation of best practice in transfusion medicine and patient blood management globally.

  3. Neuroinflammation - using big data to inform clinical practice.

    Science.gov (United States)

    Dendrou, Calliope A; McVean, Gil; Fugger, Lars

    2016-12-01

    Neuroinflammation is emerging as a central process in many neurological conditions, either as a causative factor or as a secondary response to nervous system insult. Understanding the causes and consequences of neuroinflammation could, therefore, provide insight that is needed to improve therapeutic interventions across many diseases. However, the complexity of the pathways involved necessitates the use of high-throughput approaches to extensively interrogate the process, and appropriate strategies to translate the data generated into clinical benefit. Use of 'big data' aims to generate, integrate and analyse large, heterogeneous datasets to provide in-depth insights into complex processes, and has the potential to unravel the complexities of neuroinflammation. Limitations in data analysis approaches currently prevent the full potential of big data being reached, but some aspects of big data are already yielding results. The implementation of 'omics' analyses in particular is becoming routine practice in biomedical research, and neuroimaging is producing large sets of complex data. In this Review, we evaluate the impact of the drive to collect and analyse big data on our understanding of neuroinflammation in disease. We describe the breadth of big data that are leading to an evolution in our understanding of this field, exemplify how these data are beginning to be of use in a clinical setting, and consider possible future directions.

  4. From big data to deep insight in developmental science.

    Science.gov (United States)

    Gilmore, Rick O

    2016-01-01

    The use of the term 'big data' has grown substantially over the past several decades and is now widespread. In this review, I ask what makes data 'big' and what implications the size, density, or complexity of datasets have for the science of human development. A survey of existing datasets illustrates how existing large, complex, multilevel, and multimeasure data can reveal the complexities of developmental processes. At the same time, significant technical, policy, ethics, transparency, cultural, and conceptual issues associated with the use of big data must be addressed. Most big developmental science data are currently hard to find and cumbersome to access, the field lacks a culture of data sharing, and there is no consensus about who owns or should control research data. But, these barriers are dissolving. Developmental researchers are finding new ways to collect, manage, store, share, and enable others to reuse data. This promises a future in which big data can lead to deeper insights about some of the most profound questions in behavioral science.

  5. Big data for bipolar disorder.

    Science.gov (United States)

    Monteith, Scott; Glenn, Tasha; Geddes, John; Whybrow, Peter C; Bauer, Michael

    2016-12-01

    The delivery of psychiatric care is changing with a new emphasis on integrated care, preventative measures, population health, and the biological basis of disease. Fundamental to this transformation are big data and advances in the ability to analyze these data. The impact of big data on the routine treatment of bipolar disorder today and in the near future is discussed, with examples that relate to health policy, the discovery of new associations, and the study of rare events. The primary sources of big data today are electronic medical records (EMR), claims, and registry data from providers and payers. In the near future, data created by patients from active monitoring, passive monitoring of Internet and smartphone activities, and from sensors may be integrated with the EMR. Diverse data sources from outside of medicine, such as government financial data, will be linked for research. Over the long term, genetic and imaging data will be integrated with the EMR, and there will be more emphasis on predictive models. Many technical challenges remain when analyzing big data that relates to size, heterogeneity, complexity, and unstructured text data in the EMR. Human judgement and subject matter expertise are critical parts of big data analysis, and the active participation of psychiatrists is needed throughout the analytical process.

  6. Considerations on Geospatial Big Data

    Science.gov (United States)

    LIU, Zhen; GUO, Huadong; WANG, Changlin

    2016-11-01

    Geospatial data, as a significant portion of big data, has recently gained the full attention of researchers. However, few researchers focus on the evolution of geospatial data and its scientific research methodologies. When entering into the big data era, fully understanding the changing research paradigm associated with geospatial data will definitely benefit future research on big data. In this paper, we look deep into these issues by examining the components and features of geospatial big data, reviewing relevant scientific research methodologies, and examining the evolving pattern of geospatial data in the scope of the four ‘science paradigms’. This paper proposes that geospatial big data has significantly shifted the scientific research methodology from ‘hypothesis to data’ to ‘data to questions’ and it is important to explore the generality of growing geospatial data ‘from bottom to top’. Particularly, four research areas that mostly reflect data-driven geospatial research are proposed: spatial correlation, spatial analytics, spatial visualization, and scientific knowledge discovery. It is also pointed out that privacy and quality issues of geospatial data may require more attention in the future. Also, some challenges and thoughts are raised for future discussion.

  7. Improving the Success of Strategic Management Using Big Data.

    Science.gov (United States)

    Desai, Sapan S; Wilkerson, James; Roberts, Todd

    2016-01-01

    Strategic management involves determining organizational goals, implementing a strategic plan, and properly allocating resources. Poor access to pertinent and timely data misidentifies clinical goals, prevents effective resource allocation, and generates waste from inaccurate forecasting. Loss of operational efficiency diminishes the value stream, adversely impacts the quality of patient care, and hampers effective strategic management. We have pioneered an approach using big data to create competitive advantage by identifying trends in clinical practice, accurately anticipating future needs, and strategically allocating resources for maximum impact.

  8. CLOUD COMPUTING WITH BIG DATA: A REVIEW

    OpenAIRE

    Anjali; Er. Amandeep Kaur; Mrs. Shakshi

    2016-01-01

    Big data is a collection of huge quantities of data, and big data analysis is the process of examining such large amounts of data. Big data and cloud computing are hot topics in information technology, and big data is one of the main problems today. Researchers are investigating how to handle huge amounts of data with cloud computing and how to secure big data in the cloud. To handle the big data problem, the Hadoop framework is used, in which the data is fragmented and processed in parallel....
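
    The abstract's mention of fragmenting data and executing it in parallel is the classic map/reduce pattern. The sketch below is a minimal illustration of that pattern in plain Python (multiprocessing), not Hadoop itself; the word-count task and the toy corpus are invented for the example.

```python
# Minimal sketch of the map/shuffle/reduce pattern: the dataset is fragmented,
# each fragment is processed in parallel, and the partial results are merged.
from collections import Counter
from multiprocessing import Pool


def map_fragment(lines):
    """Count words in one fragment of the data (the 'map' step)."""
    counts = Counter()
    for line in lines:
        counts.update(line.split())
    return counts


def reduce_counts(partial_counts):
    """Merge the partial counts from all fragments (the 'reduce' step)."""
    total = Counter()
    for c in partial_counts:
        total.update(c)
    return total


if __name__ == "__main__":
    corpus = ["big data needs big storage", "cloud computing handles big data"] * 1000
    n = 4
    fragments = [corpus[i::n] for i in range(n)]   # fragment the dataset
    with Pool(processes=n) as pool:
        partials = pool.map(map_fragment, fragments)
    print(reduce_counts(partials).most_common(3))
```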

  9. Big Data: Astronomical or Genomical?

    Science.gov (United States)

    Stephens, Zachary D; Lee, Skylar Y; Faghri, Faraz; Campbell, Roy H; Zhai, Chengxiang; Efron, Miles J; Iyer, Ravishankar; Schatz, Michael C; Sinha, Saurabh; Robinson, Gene E

    2015-07-01

    Genomics is a Big Data science and is going to get much bigger, very soon, but it is not known whether the needs of genomics will exceed other Big Data domains. Projecting to the year 2025, we compared genomics with three other major generators of Big Data: astronomy, YouTube, and Twitter. Our estimates show that genomics is a "four-headed beast"--it is either on par with or the most demanding of the domains analyzed here in terms of data acquisition, storage, distribution, and analysis. We discuss aspects of new technologies that will need to be developed to rise up and meet the computational challenges that genomics poses for the near future. Now is the time for concerted, community-wide planning for the "genomical" challenges of the next decade.

  10. Big Data: Astronomical or Genomical?

    Directory of Open Access Journals (Sweden)

    Zachary D Stephens

    2015-07-01

    Full Text Available Genomics is a Big Data science and is going to get much bigger, very soon, but it is not known whether the needs of genomics will exceed other Big Data domains. Projecting to the year 2025, we compared genomics with three other major generators of Big Data: astronomy, YouTube, and Twitter. Our estimates show that genomics is a "four-headed beast"--it is either on par with or the most demanding of the domains analyzed here in terms of data acquisition, storage, distribution, and analysis. We discuss aspects of new technologies that will need to be developed to rise up and meet the computational challenges that genomics poses for the near future. Now is the time for concerted, community-wide planning for the "genomical" challenges of the next decade.

  11. Big Data Analytics in Healthcare.

    Science.gov (United States)

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S M Reza; Navidi, Fatemeh; Beard, Daniel A; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined.

  12. 淀粉Big Bang!

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    Big Bang, also known as "the great explosion", refers to the process of continuous expansion of the universe from the extremely dense and extremely hot primordial state at the time of its birth. In other words, starting from the Big Bang, our present universe gradually took shape. OK, starting from this issue, "少电" will set off a Big Bang on Weibo: a starch explosion! How exactly will it explode? I suspect that, having seen the layout of this page, you already have a pretty good idea.

  13. Multiwavelength astronomy and big data

    Science.gov (United States)

    Mickaelian, A. M.

    2016-09-01

    Two major characteristics of modern astronomy are multiwavelength (MW) studies (fromγ-ray to radio) and big data (data acquisition, storage and analysis). Present astronomical databases and archives contain billions of objects observed at various wavelengths, both galactic and extragalactic, and the vast amount of data on them allows new studies and discoveries. Astronomers deal with big numbers. Surveys are the main source for discovery of astronomical objects and accumulation of observational data for further analysis, interpretation, and achieving scientific results. We review the main characteristics of astronomical surveys, compare photographic and digital eras of astronomical studies (including the development of wide-field observations), describe the present state of MW surveys, and discuss the Big Data in astronomy and related topics of Virtual Observatories and Computational Astrophysics. The review includes many numbers and data that can be compared to have a possibly overall understanding on the Universe, cosmic numbers and their relationship to modern computational facilities.

  14. Big Data Analytics in Healthcare

    Directory of Open Access Journals (Sweden)

    Ashwin Belle

    2015-01-01

    Full Text Available The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined.

  15. Towards a big crunch dual

    Energy Technology Data Exchange (ETDEWEB)

    Hertog, Thomas E-mail: hertog@vulcan2.physics.ucsb.edu; Horowitz, Gary T

    2004-07-01

    We show there exist smooth asymptotically anti-de Sitter initial data which evolve to a big crunch singularity in a low energy supergravity limit of string theory. This opens up the possibility of using the dual conformal field theory to obtain a fully quantum description of the cosmological singularity. A preliminary study of this dual theory suggests that the big crunch is an endpoint of evolution even in the full string theory. We also show that any theory with scalar solitons must have negative energy solutions. The results presented here clarify our earlier work on cosmic censorship violation in N=8 supergravity. (author)

  16. Was There A Big Bang?

    CERN Document Server

    Soberman, Robert K

    2008-01-01

    The big bang hypothesis is widely accepted despite numerous physics conflicts. It rests upon two experimental supports, galactic red shift and the cosmic microwave background. Both are produced by dark matter, shown here to be hydrogen-dominated aggregates with a few percent of helium nodules. Scattering from these non-radiating intergalactic masses produces a red shift that normally correlates with distance. Warmed by our galaxy to an Eigenvalue of 2.735 K, drawn near the Earth, these bodies, kept cold by ablation, resonance radiate the Planckian microwave signal. Several tests are proposed that will distinguish between this model and the big bang.

  17. [Big Data- challenges and risks].

    Science.gov (United States)

    Krauß, Manuela; Tóth, Tamás; Hanika, Heinrich; Kozlovszky, Miklós; Dinya, Elek

    2015-12-01

    The term "Big Data" is commonly used to describe the growing mass of information being created recently. New conclusions can be drawn and new services can be developed through the connection, processing and analysis of this information. This affects all aspects of life, including health and medicine. The authors review the application areas of Big Data and present examples from health and other areas. However, there are several preconditions for the effective use of these opportunities: proper infrastructure and a well-defined regulatory environment, with particular emphasis on data protection and privacy. These issues and the current actions toward their solution are also presented.

  18. Do big gods cause anything?

    DEFF Research Database (Denmark)

    Geertz, Armin W.

    2014-01-01

    This is a contribution to a review symposium on Ara Norenzayan's book Big Gods: How Religion Transformed Cooperation and Conflict (Princeton University Press 2013). The book is fascinating but problematic with regard to causality, atheism, and stereotypes about hunter-gatherers.

  19. Big society, big data. The radicalisation of the network society

    NARCIS (Netherlands)

    Frissen, V.

    2011-01-01

    During the British election campaign of 2010, David Cameron produced the idea of the ‘Big Society’ as a cornerstone of his political agenda. At the core of the idea is a stronger civil society and local community coupled with a more withdrawn government. Although many commentators have dismissed thi

  20. Little Science to Big Science: Big Scientists to Little Scientists?

    Science.gov (United States)

    Simonton, Dean Keith

    2010-01-01

    This article presents the author's response to Hisham B. Ghassib's essay entitled "Where Does Creativity Fit into a Productivist Industrial Model of Knowledge Production?" Professor Ghassib's (2010) essay presents a provocative portrait of how the little science of the Babylonians, Greeks, and Arabs became the Big Science of the modern industrial…

  1. Clearing the Big Smog

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Beijing is trying to clean up its sky with a new standard for vehicle emissions. From March 1, Beijing has implemented a new, stricter vehicle emission standard that could lead to cleaner air, but also force thousands of cars off the road. Standard IV, the latest in a series of measures aimed at clearing the persistent smog, will match the current standard of the European Union. All new light petrol vehicles on sale in the Beijing market shall

  2. Data Partitioning View of Mining Big Data

    OpenAIRE

    Zhang, Shichao

    2016-01-01

    There are two main approximate approaches to mining big data in memory. One is to partition a big dataset into several subsets, so that each subset can be mined in memory; global patterns can then be obtained by synthesizing all the local patterns discovered from these subsets. The other is the statistical sampling method. This indicates that data partitioning should be an important strategy for mining big data. This paper recalls our work on mining big data with data partitioning and shows some interesti...
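
    The partition-then-synthesize idea described above can be illustrated with a toy example: mine a simple local pattern (frequent items) in each in-memory subset, then merge the local results into a global pattern. This is only a sketch under invented data and thresholds; real partition-based miners (and the authors' own method) also need a verification pass so that globally frequent patterns are not missed.

```python
# Illustrative sketch (not the authors' algorithm) of the partitioning idea:
# split a dataset into subsets, mine a local pattern from each subset, then
# synthesize the local results into a global pattern.
from collections import Counter
from typing import Iterable, List


def mine_local(transactions: Iterable[List[str]], min_support: int) -> Counter:
    """Return items that are frequent within one partition."""
    counts = Counter()
    for t in transactions:
        counts.update(set(t))
    return Counter({item: c for item, c in counts.items() if c >= min_support})


def synthesize(local_patterns: List[Counter], min_global_support: int) -> Counter:
    """Combine local patterns into a global pattern by summing their supports."""
    combined = Counter()
    for p in local_patterns:
        combined.update(p)
    return Counter({i: c for i, c in combined.items() if c >= min_global_support})


data = [["milk", "bread"], ["milk", "eggs"], ["bread", "eggs"], ["milk", "bread"]] * 50
partitions = [data[i::4] for i in range(4)]            # 4 subsets that fit in memory
local_patterns = [mine_local(p, min_support=20) for p in partitions]
print(synthesize(local_patterns, min_global_support=80))
```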

  3. 77 FR 27245 - Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN

    Science.gov (United States)

    2012-05-09

    ... Fish and Wildlife Service Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN... comprehensive conservation plan (CCP) and environmental assessment (EA) for Big Stone National Wildlife Refuge...: r3planning@fws.gov . Include ``Big Stone Draft CCP/ EA'' in the subject line of the message. Fax:...

  4. The International Big History Association

    Science.gov (United States)

    Duffy, Michael; Duffy, D'Neil

    2013-01-01

    IBHA, the International Big History Association, was organized in 2010 and "promotes the unified, interdisciplinary study and teaching of history of the Cosmos, Earth, Life, and Humanity." This is the vision that Montessori embraced long before the discoveries of modern science fleshed out the story of the evolving universe. "Big…

  5. The Big European Bubble Chamber

    CERN Multimedia

    1977-01-01

    The 3.70 metre Big European Bubble Chamber (BEBC) was dismantled on 9 August 1984. During operation it was one of the biggest detectors in the world, producing direct visual recordings of particle tracks. 6.3 million photos of interactions were taken with the chamber in the course of its existence.

  6. YOUNG CITY,BIG PARTY

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    The Shenzhen Universiade united the world's young people through sports. With none of the usual hoopla (no fireworks, no grand performances by celebrities and superstars), the Shenzhen Summer Universiade lowered the curtain on a big party for youth and college students on August 23.

  7. True Randomness from Big Data

    Science.gov (United States)

    Papakonstantinou, Periklis A.; Woodruff, David P.; Yang, Guang

    2016-09-01

    Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as in astronomics, genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle large enough sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests.
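
    As a deliberately naive illustration of the "big source in, near-uniform bits out" idea, the sketch below condenses blocks of a large, weakly random byte stream with a cryptographic hash. The paper's extractors come with formal guarantees that plain hashing does not; the source data here is synthetic and stands in for the multi-gigabyte sources discussed in the abstract.

```python
# Naive illustration only: condense each block of a weakly random source into
# 32 output bytes with SHA-256. This carries no extractor-style proof.
import hashlib


def extract_bits(source: bytes, block_size: int = 4096) -> bytes:
    """Hash fixed-size blocks of a large, imperfect source into output bytes."""
    out = bytearray()
    for start in range(0, len(source) - block_size + 1, block_size):
        block = source[start:start + block_size]
        out.extend(hashlib.sha256(block).digest())
    return bytes(out)


# Synthetic "sensor" data standing in for a very large source.
noisy_source = bytes((i * 31 + 7) % 251 for i in range(1 << 16))
random_bits = extract_bits(noisy_source)
print(len(random_bits), random_bits[:8].hex())
```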

  8. True Randomness from Big Data

    Science.gov (United States)

    Papakonstantinou, Periklis A.; Woodruff, David P.; Yang, Guang

    2016-01-01

    Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as in astronomics, genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle large enough sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests. PMID:27666514

  9. The BigBoss Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schelgel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; Bebek, C.; Becerril, S.; Blanton, M.; Bolton, A.; Bromley, B.; Cahn, R.; Carton, P.-H.; Cervanted-Cota, J.L.; Chu, Y.; Cortes, M.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna / /IAC, Mexico / / /Madrid, IFT /Marseille, Lab. Astrophys. / / /New York U. /Valencia U.

    2012-06-07

    BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra over a wavelength range from 340 nm to 1060 nm, with a resolution R = lambda/Delta(lambda) = 3000-4800. Using data from imaging surveys that are already underway, spectroscopic targets are selected that trace the underlying dark matter distribution. In particular, targets include luminous red galaxies (LRGs) up to z = 1.0, extending the BOSS LRG survey in both redshift and survey area. To probe the universe out to even higher redshift, BigBOSS will target bright [OII] emission line galaxies (ELGs) up to z = 1.7. In total, 20 million galaxy redshifts are obtained to measure the BAO feature, trace the matter power spectrum at smaller scales, and detect redshift space distortions. BigBOSS will provide additional constraints on early dark energy and on the curvature of the universe by measuring the Ly-alpha forest in the spectra of over 600,000 2.2 < z < 3.5 quasars. BigBOSS galaxy BAO measurements combined with an analysis of the broadband power, including the Ly-alpha forest in BigBOSS quasar spectra, achieves a FOM of 395 with Planck plus Stage III priors. This FOM is based on conservative assumptions for the analysis of broad band power (k_max = 0.15), and could grow to over 600 if current work allows us to push the analysis to higher wave numbers (k_max = 0.3). BigBOSS will also place constraints on theories of modified gravity and inflation, and will measure the sum of neutrino masses to 0.024 eV accuracy.
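
    To make the quoted resolving power concrete: a resolution R = lambda/Delta(lambda) of 3000-4800 over the 340-1060 nm range corresponds to resolved wavelength elements of roughly 0.07-0.35 nm. The snippet below simply evaluates Delta(lambda) = lambda/R at a few sample wavelengths from the abstract.

```python
# Evaluate the resolved wavelength element Delta(lambda) = lambda / R at a few
# sample wavelengths from the 340-1060 nm range quoted in the abstract.
for wavelength_nm in (340, 700, 1060):
    for R in (3000, 4800):
        print(f"lambda = {wavelength_nm} nm, R = {R}: "
              f"Delta(lambda) = {wavelength_nm / R:.3f} nm")
```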

  10. A SWOT Analysis of Big Data

    Science.gov (United States)

    Ahmadi, Mohammad; Dileepan, Parthasarati; Wheatley, Kathleen K.

    2016-01-01

    This is the decade of data analytics and big data, but not everyone agrees with the definition of big data. Some researchers see it as the future of data analysis, while others consider it as hype and foresee its demise in the near future. No matter how it is defined, big data for the time being is having its glory moment. The most important…

  11. Data, Data, Data : Big, Linked & Open

    NARCIS (Netherlands)

    Folmer, E.J.A.; Krukkert, D.; Eckartz, S.M.

    2013-01-01

    The entire business and IT world is currently talking about Big Data, a trend that overtook Cloud Computing in mid-2013 (based on Google Trends). Policy makers are also actively engaged with Big Data. Neelie Kroes, vice-president of the European Commission, speaks of the 'Big Data Revolutio

  12. Big sagebrush seed bank densities following wildfires

    Science.gov (United States)

    Big sagebrush (Artemisia spp.) is a critical shrub for many wildlife species including sage grouse (Centrocercus urophasianus), mule deer (Odocoileus hemionus), and pygmy rabbit (Brachylagus idahoensis). Big sagebrush is killed by wildfires, and big sagebrush seed is generally short-lived and does not s...

  13. A survey of big data research

    OpenAIRE

    2015-01-01

    Big data create values for business and research, but pose significant challenges in terms of networking, storage, management, analytics and ethics. Multidisciplinary collaborations from engineers, computer scientists, statisticians and social scientists are needed to tackle, discover and understand big data. This survey presents an overview of big data initiatives, technologies and research in industries and academia, and discusses challenges and potential solutions.

  14. Study on Yield Loss of Summer Maize Due to Lodging at the Big Flare Stage and Grain Filling Stage%大喇叭口及灌浆期倒伏对夏玉米产量损失的研究

    Institute of Scientific and Technical Information of China (English)

    李树岩; 马玮; 彭记永; 陈忠民

    2015-01-01

    [Objective] The objective of this study was to analyze the effects of lodging on dry matter, yield components and final yield of summer maize, and further to evaluate the yield losses due to different types of lodging at different growing stages. [Method] The field experiments were conducted during two growing seasons of summer maize with the variety Xundan 20 from 2011 to 2012 at Zhengzhou Agro-meteorological Experiment Station. The lodging was implemented by kicking down the maize plants artificially at two growing stages, the big flare stage (B) and the grain filling stage (F). At each stage, the lodging included four types: light lodging of roots (R1, in which the angle between the lodged stem and the ground was between 30 and 60 degrees), serious lodging of roots (R2, in which the angle between the lodged stem and the ground was lower ... [Result] ... the difference was not significant; the FSH (2011) treatment had the highest barren-tip rate, reaching 27.4%. Lodging affected both the kernel number per ear and the 100-kernel weight, which both tended to decrease after lodging, with correlation coefficients with yield of 0.729 and 0.842, respectively (P<0.01). Owing to the differing experimental conditions in the two years, the kernel number per ear of all lodging treatments was significantly lower than the control in 2011, whereas in 2012 only stem lodging at the grain filling stage differed significantly from the control. Lodging reduced the 100-kernel weight, most markedly for the grain filling stage treatments (P<0.05). Lodging significantly reduced yield: except for light root lodging at the big flare stage (BR1), the yield of all lodging treatments was significantly lower than the control (P<0.05). For lodging at the big flare stage, the two-year average yield loss rates of BR2, BSL and BSH were 13.9%, 27.9% and 27.1%; for lodging at the grain filling stage, the two-year average yield loss rates of FR1, FR2, FSL and FSH were 29.0%, 38.4%, 45.0% and 48.3%. [Conclusion] For the same lodging type, lodging at the grain filling stage had a greater impact than lodging at the big flare stage; within the same growing stage, stem lodging had a more pronounced effect than root lodging, but the difference in yield loss between high- and low-node stem lodging treatments was not significant. Among all lodging treatments, the yield loss was highest for stem lodging at the grain filling stage and lowest for root lodging at the big flare stage.

  15. Lead Test

    Science.gov (United States)

    ... months, and at 3, 4, 5, and 6 years of age. A blood lead level test should be done only if the risk ... recommended if the person is symptomatic at any level below 70 mcg/dL. Because lead will pass through the blood to an unborn child, pregnant ...

  16. Big Bang Day : The Great Big Particle Adventure - 3. Origins

    CERN Multimedia

    2008-01-01

    In this series, comedian and physicist Ben Miller asks the CERN scientists what they hope to find. If the LHC is successful, it will explain the nature of the Universe around us in terms of a few simple ingredients and a few simple rules. But the Universe now was forged in a Big Bang where conditions were very different, and the rules were very different, and those early moments were crucial to determining how things turned out later. At the LHC they can recreate conditions as they were billionths of a second after the Big Bang, before atoms and nuclei existed. They can find out why matter and antimatter didn't mutually annihilate each other to leave behind a Universe of pure, brilliant light. And they can look into the very structure of space and time - the fabric of the Universe

  17. Antigravity and the big crunch/big bang transition

    Science.gov (United States)

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.; Turok, Neil

    2012-08-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  18. Antigravity and the big crunch/big bang transition

    Energy Technology Data Exchange (ETDEWEB)

    Bars, Itzhak [Department of Physics and Astronomy, University of Southern California, Los Angeles, CA 90089-2535 (United States); Chen, Shih-Hung [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada); Department of Physics and School of Earth and Space Exploration, Arizona State University, Tempe, AZ 85287-1404 (United States); Steinhardt, Paul J., E-mail: steinh@princeton.edu [Department of Physics and Princeton Center for Theoretical Physics, Princeton University, Princeton, NJ 08544 (United States); Turok, Neil [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada)

    2012-08-29

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  19. Perspectives on Big Data and Big Data Analytics

    Directory of Open Access Journals (Sweden)

    Elena Geanina ULARU

    2012-12-01

    Full Text Available Nowadays companies are starting to realize the importance of using more data in order to support decisions about their strategies. It has been said, and demonstrated through case studies, that "more data usually beats better algorithms". With this in mind, companies have started to realize that they can choose to invest in processing larger sets of data rather than in expensive algorithms. A large quantity of data is better used as a whole because of the possible correlations across the larger amount, correlations that can never be found if the data is analyzed in separate or smaller sets. A larger amount of data gives better output, but working with it can also become a challenge due to processing limitations. This article intends to define the concept of Big Data and stress the importance of Big Data Analytics.

  20. Antigravity and the big crunch/big bang transition

    CERN Document Server

    Bars, Itzhak; Steinhardt, Paul J; Turok, Neil

    2011-01-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  1. Solution of a Braneworld Big Crunch/Big Bang Cosmology

    CERN Document Server

    McFadden, P; Turok, N G; Fadden, Paul Mc; Steinhardt, Paul J.; Turok, Neil

    2005-01-01

    We solve for the cosmological perturbations in a five-dimensional background consisting of two separating or colliding boundary branes, as an expansion in the collision speed V divided by the speed of light c. Our solution permits a detailed check of the validity of four-dimensional effective theory in the vicinity of the event corresponding to the big crunch/big bang singularity. We show that the four-dimensional description fails at the first nontrivial order in (V/c)^2. At this order, there is nontrivial mixing of the two relevant four-dimensional perturbation modes (the growing and decaying modes) as the boundary branes move from the narrowly-separated limit described by Kaluza-Klein theory to the well-separated limit where gravity is confined to the positive-tension brane. We comment on the cosmological significance of the result and compute other quantities of interest in five-dimensional cosmological scenarios.

  2. The ethics of big data in big agriculture

    Directory of Open Access Journals (Sweden)

    Isabelle M. Carbonell

    2016-03-01

    Full Text Available This paper examines the ethics of big data in agriculture, focusing on the power asymmetry between farmers and large agribusinesses like Monsanto. Following the recent purchase of Climate Corp., Monsanto is currently the most prominent biotech agribusiness to buy into big data. With wireless sensors on tractors monitoring or dictating every decision a farmer makes, Monsanto can now aggregate large quantities of previously proprietary farming data, enabling a privileged position with unique insights on a field-by-field basis into a third or more of the US farmland. This power asymmetry may be rebalanced through open-sourced data, and publicly-funded data analytic tools which rival Climate Corp. in complexity and innovation for use in the public domain.

  3. The Obstacles in Big Data Process

    Directory of Open Access Journals (Sweden)

    Rasim M. Alguliyev

    2017-04-01

    Full Text Available The increasing amount of data, and the need to analyze it in a timely manner for multiple purposes, has created a serious barrier in the big data analysis process. This article describes the challenges that big data creates at each step of the big data analysis process. These include typical analytical problems as well as less common challenges that are specific to big data. The article breaks down the problems for each step of the big data analysis process and discusses them separately at each stage. It also offers some simple ways to solve these problems.

  4. ISSUES, CHALLENGES, AND SOLUTIONS: BIG DATA MINING

    Directory of Open Access Journals (Sweden)

    Jaseena K.U.

    2014-12-01

    Full Text Available Data has become an indispensable part of every economy, industry, organization, business function and individual. Big Data is a term used to identify datasets whose size is beyond the ability of typical database software tools to store, manage and analyze. Big Data introduces unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation and measurement errors. These challenges are distinctive and require new computational and statistical paradigms. This paper presents a literature review of big data mining and its issues and challenges, with emphasis on the distinguishing features of Big Data. It also discusses some methods to deal with big data.

  5. Particle physics catalysis of thermal big bang nucleosynthesis.

    Science.gov (United States)

    Pospelov, Maxim

    2007-06-08

    We point out that the existence of metastable, tau > 10^3 s, negatively charged electroweak-scale particles (X^-) alters the predictions for lithium and other primordial elemental abundances for A > 4 via the formation of bound states with nuclei during big bang nucleosynthesis. In particular, we show that the bound states of X^- with helium, formed at temperatures of about T = 10^8 K, lead to the catalytic enhancement of ^6Li production, which is 8 orders of magnitude more efficient than the standard channel. In particle physics models where subsequent decay of X^- does not lead to large nonthermal big bang nucleosynthesis effects, this directly translates to the level of sensitivity to the number density of long-lived X^- particles (tau > 10^5 s) relative to entropy, n_{X^-}/s <~ 3x10^{-17}, which is one of the most stringent probes of electroweak-scale remnants known to date.

  6. Big data is not a monolith

    CERN Document Server

    Ekbia, Hamid R; Mattioli, Michael

    2016-01-01

    Big data is ubiquitous but heterogeneous. Big data can be used to tally clicks and traffic on web pages, find patterns in stock trades, track consumer preferences, identify linguistic correlations in large corpuses of texts. This book examines big data not as an undifferentiated whole but contextually, investigating the varied challenges posed by big data for health, science, law, commerce, and politics. Taken together, the chapters reveal a complex set of problems, practices, and policies. The advent of big data methodologies has challenged the theory-driven approach to scientific knowledge in favor of a data-driven one. Social media platforms and self-tracking tools change the way we see ourselves and others. The collection of data by corporations and government threatens privacy while promoting transparency. Meanwhile, politicians, policy makers, and ethicists are ill-prepared to deal with big data's ramifications. The contributors look at big data's effect on individuals as it exerts social control throu...

  7. Memory loss

    Science.gov (United States)

    A person with memory loss needs a lot of support. It helps to show the person familiar objects and photos, or to play familiar music. Write down when the person should take any medicine or do other ...

  8. Hair Loss

    Science.gov (United States)

    ... enough protein from non-meat sources. And some athletes are at higher risk for hair loss because they may be more likely to develop iron-deficiency anemia. Disruption of the hair growth cycle. ...

  9. Hair Loss

    Science.gov (United States)

    ... Situations Pets and Animals myhealthfinder Food and Nutrition Healthy Food Choices Weight Loss and Diet Plans Nutrients and Nutritional Info Sugar and Sugar Substitutes Exercise and Fitness Exercise Basics Sports Safety Injury Rehabilitation Emotional Well- ...

  10. Experiencing Loss

    DEFF Research Database (Denmark)

    Kristiansen, Maria; Younis, Tarek; Hassani, Amani

    2015-01-01

    In this article, we explore how Islam, minority status and refugee experiences intersect in shaping meaning-making processes following bereavement. We do this through a phenomenological analysis of a biographical account of personal loss told by Aisha, a Muslim Palestinian refugee living in Denmark, who narrates her experience of losing her husband to lung cancer. By drawing on a religious framework, Aisha creates meaning from her loss, which enables her to incorporate this loss into her life history and sustain agency. Her narrative invites wider audiences to witness her tale of overcoming loss..., thus highlighting the complex way in which religious beliefs, minority status and migration history come together in shaping meaning-making processes, and the importance of reciprocity in narrative studies.

  11. Losses in Ferroelectric Materials.

    Science.gov (United States)

    Liu, Gang; Zhang, Shujun; Jiang, Wenhua; Cao, Wenwu

    2015-03-01

    Ferroelectric materials are the best dielectric and piezoelectric materials known today. Since the discovery of barium titanate in the 1940s, lead zirconate titanate ceramics in the 1950s and relaxor-PT single crystals (such as lead magnesium niobate-lead titanate and lead zinc niobate-lead titanate) in the 1980s and 1990s, perovskite ferroelectric materials have been the dominating piezoelectric materials for electromechanical devices, and are widely used in sensors, actuators and ultrasonic transducers. Energy losses (or energy dissipation) in ferroelectrics are one of the most critical issues for high power devices, such as therapeutic ultrasonic transducers, large displacement actuators, SONAR projectors, and high frequency medical imaging transducers. The losses of ferroelectric materials have three distinct types, i.e., elastic, piezoelectric and dielectric losses. People have been investigating the mechanisms of these losses and are trying hard to control and minimize them so as to reduce performance degradation in electromechanical devices. There are impressive progresses made in the past several decades on this topic, but some confusions still exist. Therefore, a systematic review to define related concepts and clear up confusions is urgently in need. With this objective in mind, we provide here a comprehensive review on the energy losses in ferroelectrics, including related mechanisms, characterization techniques and collections of published data on many ferroelectric materials to provide a useful resource for interested scientists and engineers to design electromechanical devices and to gain a global perspective on the complex physical phenomena involved. More importantly, based on the analysis of available information, we proposed a general theoretical model to describe the inherent relationships among elastic, dielectric, piezoelectric and mechanical losses. For multi-domain ferroelectric single crystals and ceramics, intrinsic and extrinsic energy

  12. Lead Poisoning

    Science.gov (United States)

    ... menopause.) Once the lead is released from the mother's bones, it re-enters the blood stream and ... drinks. Avoid eating off any colorfully painted ceramic plates, and avoid drinking from any ceramic mugs unless ...

  13. Lead Poisoning

    Science.gov (United States)

    ... Topics Environment & Health Healthy Living Pollution Reduce, Reuse, Recycle Science – How It Works The Natural World Games ... OTHERS: Lead has recently been found in some plastic mini-blinds and vertical blinds which were made ...

  14. 大数据时代下的情报分析与挖掘技术研究——电信客户流失情况分析%Research on Information Analysis and Data Mining in the Age of Big Data:Analysis of Customer Loss in Telecom

    Institute of Scientific and Technical Information of China (English)

    王晓佳; 杨善林; 陈志强

    2013-01-01

    Information in the age of big data is large in volume, highly complex, and rapidly updated, which makes it considerably harder than before to mine the intelligence users need from such data. To seize the opportunity and gain competitive advantage in the big data era, the original approaches to information analysis must be upgraded so that the data can serve its intelligence function. After introducing big data and the reasons why the meaning of intelligence changes in a big data environment, this paper proposes a modeling mechanism for information analysis and mining under big data: first, a conceptual model of intelligence task decomposition is built with MapReduce; then, preprocessing and data mining are carried out on a single-task data table obtained from the decomposition, using mathematical models, artificial intelligence and other methods to construct a new approach to information analysis and data mining in the age of big data. Finally, a simulation experiment, a case study of customer churn in telecom, is used to verify the feasibility and rationality of this approach.
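
    The "preprocess one task table, then mine it" step described above can be sketched with a churn-like table. Everything below is hypothetical: the columns (monthly fee, call minutes, complaints), the synthetic labels, and the choice of scikit-learn logistic regression all stand in for whatever data and models the authors actually used.

```python
# Hypothetical sketch: preprocess a single-task churn table, then fit a simple
# classifier. Features and labels are synthetic; this is not the paper's model.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 2000
X = np.column_stack([
    rng.normal(50, 15, n),        # monthly_fee
    rng.normal(300, 100, n),      # call_minutes
    rng.poisson(1.0, n),          # complaints
])
# Synthetic rule: expensive plans, low usage and many complaints raise churn risk.
logit = 0.04 * X[:, 0] - 0.01 * X[:, 1] + 0.8 * X[:, 2] - 0.5
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
scaler = StandardScaler().fit(X_train)                 # preprocessing step
model = LogisticRegression().fit(scaler.transform(X_train), y_train)
print("held-out accuracy:", model.score(scaler.transform(X_test), y_test))
```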

  15. The big wheels of ATLAS

    CERN Multimedia

    2006-01-01

    The ATLAS cavern is filling up at an impressive rate. The installation of the first of the big wheels of the muon spectrometer, a thin gap chamber (TGC) wheel, was completed in September. The muon spectrometer will include four big moving wheels at each end, each measuring 25 metres in diameter. Of the eight wheels in total, six will be composed of thin gap chambers for the muon trigger system and the other two will consist of monitored drift tubes (MDTs) to measure the position of the muons (see Bulletin No. 13/2006). The installation of the 688 muon chambers in the barrel is progressing well, with three-quarters of them already installed between the coils of the toroid magnet.

  16. Big Numbers in String Theory

    CERN Document Server

    Schellekens, A N

    2016-01-01

    This paper contains some personal reflections on several computational contributions to what is now known as the "String Theory Landscape". It consists of two parts. The first part concerns the origin of big numbers, and especially the number $10^{1500}$ that appeared in work on the covariant lattice construction (with W. Lerche and D. Luest). This part contains some new results. I correct a huge but inconsequential error, discuss some more accurate estimates, and compare with the counting for free fermion constructions. In particular I prove that the latter only provide an exponentially small fraction of all even self-dual lattices for large lattice dimensions. The second part of the paper concerns dealing with big numbers, and contains some lessons learned from various vacuum scanning projects.

  17. Big Data for Precision Medicine

    Directory of Open Access Journals (Sweden)

    Daniel Richard Leff

    2015-09-01

    Full Text Available This article focuses on the potential impact of big data analysis to improve health, prevent and detect disease at an earlier stage, and personalize interventions. The role that big data analytics may have in interrogating the patient electronic health record toward improved clinical decision support is discussed. We examine developments in pharmacogenetics that have increased our appreciation of the reasons why patients respond differently to chemotherapy. We also assess the expansion of online health communications and the way in which this data may be capitalized on in order to detect public health threats and control or contain epidemics. Finally, we describe how a new generation of wearable and implantable body sensors may improve wellbeing, streamline management of chronic diseases, and improve the quality of surgical implants.

  18. Big data and ophthalmic research.

    Science.gov (United States)

    Clark, Antony; Ng, Jonathon Q; Morlet, Nigel; Semmens, James B

    2016-01-01

    Large population-based health administrative databases, clinical registries, and data linkage systems are a rapidly expanding resource for health research. Ophthalmic research has benefited from the use of these databases in expanding the breadth of knowledge in areas such as disease surveillance, disease etiology, health services utilization, and health outcomes. Furthermore, the quantity of data available for research has increased exponentially in recent times, particularly as e-health initiatives come online in health systems across the globe. We review some big data concepts, the databases and data linkage systems used in eye research-including their advantages and limitations, the types of studies previously undertaken, and the future direction for big data in eye research.

  19. George and the big bang

    CERN Document Server

    Hawking, Lucy; Parsons, Gary

    2012-01-01

    George has problems. He has twin baby sisters at home who demand his parents’ attention. His beloved pig Freddy has been exiled to a farm, where he’s miserable. And worst of all, his best friend, Annie, has made a new friend whom she seems to like more than George. So George jumps at the chance to help Eric with his plans to run a big experiment in Switzerland that seeks to explore the earliest moment of the universe. But there is a conspiracy afoot, and a group of evildoers is planning to sabotage the experiment. Can George repair his friendship with Annie and piece together the clues before Eric’s experiment is destroyed forever? This engaging adventure features essays by Professor Stephen Hawking and other eminent physicists about the origins of the universe and ends with a twenty-page graphic novel that explains how the Big Bang happened—in reverse!

  20. From big data to deep insight in developmental science

    Science.gov (United States)

    2016-01-01

    The use of the term ‘big data’ has grown substantially over the past several decades and is now widespread. In this review, I ask what makes data ‘big’ and what implications the size, density, or complexity of datasets have for the science of human development. A survey of existing datasets illustrates how existing large, complex, multilevel, and multimeasure data can reveal the complexities of developmental processes. At the same time, significant technical, policy, ethics, transparency, cultural, and conceptual issues associated with the use of big data must be addressed. Most big developmental science data are currently hard to find and cumbersome to access, the field lacks a culture of data sharing, and there is no consensus about who owns or should control research data. But, these barriers are dissolving. Developmental researchers are finding new ways to collect, manage, store, share, and enable others to reuse data. This promises a future in which big data can lead to deeper insights about some of the most profound questions in behavioral science. WIREs Cogn Sci 2016, 7:112–126. doi: 10.1002/wcs.1379 For further resources related to this article, please visit the WIREs website. PMID:26805777

  1. Chromosomal aberration leads to recurrent pregnancy loss and partial trisomy of 5p12-15.3 in the offspring: report of a Syrian couple and review of the literature .

    Science.gov (United States)

    Al-Achkar, Walid; Moassass, Faten; Al-Ablog, Ayman; Liehr, Thomas; Fan, Xiaobo; Wafa, Abdulsamad

    2015-03-01

    Here we describe a Syrian couple having recurrent pregnancy loss in the first trimester, fetal malformations, and/or neonatal death. The father had a balanced chromosomal translocation t(5;15), an sY125 microdeletion of locus b in the azoospermia factor (AZF) gene, and an MTHFR C677T homozygous polymorphism with normal phenotype. Interestingly, his healthy wife had another MTHFR A1298C homozygous polymorphism. The couple experienced two pregnancy losses and had two stillborn children with severe malformations due to partial trisomy of the short arm of chromosome 5. The couple does not have any living offspring after 10 years of marriage.

  2. Statistical Inference: The Big Picture.

    Science.gov (United States)

    Kass, Robert E

    2011-02-01

    Statistics has moved beyond the frequentist-Bayesian controversies of the past. Where does this leave our ability to interpret results? I suggest that a philosophy compatible with statistical practice, labelled here statistical pragmatism, serves as a foundation for inference. Statistical pragmatism is inclusive and emphasizes the assumptions that connect statistical models with observed data. I argue that introductory courses often mis-characterize the process of statistical inference and I propose an alternative "big picture" depiction.

  3. Big data: the management revolution.

    Science.gov (United States)

    McAfee, Andrew; Brynjolfsson, Erik

    2012-10-01

    Big data, the authors write, is far more powerful than the analytics of the past. Executives can measure and therefore manage more precisely than ever before. They can make better predictions and smarter decisions. They can target more-effective interventions in areas that so far have been dominated by gut and intuition rather than by data and rigor. The differences between big data and analytics are a matter of volume, velocity, and variety: More data now cross the internet every second than were stored in the entire internet 20 years ago. Nearly real-time information makes it possible for a company to be much more agile than its competitors. And that information can come from social networks, images, sensors, the web, or other unstructured sources. The managerial challenges, however, are very real. Senior decision makers have to learn to ask the right questions and embrace evidence-based decision making. Organizations must hire scientists who can find patterns in very large data sets and translate them into useful business information. IT departments have to work hard to integrate all the relevant internal and external sources of data. The authors offer two success stories to illustrate how companies are using big data: PASSUR Aerospace enables airlines to match their actual and estimated arrival times. Sears Holdings directly analyzes its incoming store data to make promotions much more precise and faster.

  4. The BigBOSS Experiment

    CERN Document Server

    Schlegel, D; Abraham, T; Ahn, C; Prieto, C Allende; Annis, J; Aubourg, E; Azzaro, M; Baltay, S Bailey C; Baugh, C; Bebek, C; Becerril, S; Blanton, M; Bolton, A; Bromley, B; Cahn, R; Carton, P -H; Cervantes-Cota, J L; Chu, Y; Cortes, M; Dawson, K; Dey, A; Dickinson, M; Diehl, H T; Doel, P; Ealet, A; Edelstein, J; Eppelle, D; Escoffier, S; Evrard, A; Faccioli, L; Frenk, C; Geha, M; Gerdes, D; Gondolo, P; Gonzalez-Arroyo, A; Grossan, B; Heckman, T; Heetderks, H; Ho, S; Honscheid, K; Huterer, D; Ilbert, O; Ivans, I; Jelinsky, P; Jing, Y; Joyce, D; Kennedy, R; Kent, S; Kieda, D; Kim, A; Kim, C; Kneib, J -P; Kong, X; Kosowsky, A; Krishnan, K; Lahav, O; Lampton, M; LeBohec, S; Brun, V Le; Levi, M; Li, C; Liang, M; Lim, H; Lin, W; Linder, E; Lorenzon, W; de la Macorra, A; Magneville, Ch; Malina, R; Marinoni, C; Martinez, V; Majewski, S; Matheson, T; McCloskey, R; McDonald, P; McKay, T; McMahon, J; Menard, B; Miralda-Escude, J; Modjaz, M; Montero-Dorta, A; Morales, I; Mostek, N; Newman, J; Nichol, R; Nugent, P; Olsen, K; Padmanabhan, N; Palanque-Delabrouille, N; Park, I; Peacock, J; Percival, W; Perlmutter, S; Peroux, C; Petitjean, P; Prada, F; Prieto, E; Prochaska, J; Reil, K; Rockosi, C; Roe, N; Rollinde, E; Roodman, A; Ross, N; Rudnick, G; Ruhlmann-Kleider, V; Sanchez, J; Sawyer, D; Schimd, C; Schubnell, M; Scoccimaro, R; Seljak, U; Seo, H; Sheldon, E; Sholl, M; Shulte-Ladbeck, R; Slosar, A; Smith, D S; Smoot, G; Springer, W; Stril, A; Szalay, A S; Tao, C; Tarle, G; Taylor, E; Tilquin, A; Tinker, J; Valdes, F; Wang, J; Wang, T; Weaver, B A; Weinberg, D; White, M; Wood-Vasey, M; Yang, J; Yeche, X Yang Ch; Zakamska, N; Zentner, A; Zhai, C; Zhang, P

    2011-01-01

    BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra over a wavelength range from 340 nm to 1060 nm, with a resolution R = 3000-4800. Using data from imaging surveys that are already underway, spectroscopic targets are selected that trace the underlying dark matter distribution. In particular, targets include luminous red galaxies (LRGs) up to z = 1.0, extending the BOSS LRG survey in both redshift and survey area. To probe the universe out to even higher redshift, BigBOSS will target bright [OII] emission line galaxies (ELGs) up to z = 1.7. In total, 20 million galaxy red...

  5. Big Data Comes to School

    Directory of Open Access Journals (Sweden)

    Bill Cope

    2016-03-01

    Full Text Available The prospect of “big data” at once evokes optimistic views of an information-rich future and concerns about surveillance that adversely impacts our personal and private lives. This overview article explores the implications of big data in education, focusing by way of example on data generated by student writing. We have chosen writing because it presents particular complexities, highlighting the range of processes for collecting and interpreting evidence of learning in the era of computer-mediated instruction and assessment as well as the challenges. Writing is significant not only because it is central to the core subject area of literacy; it is also an ideal medium for the representation of deep disciplinary knowledge across a number of subject areas. After defining what big data entails in education, we map emerging sources of evidence of learning that separately and together have the potential to generate unprecedented amounts of data: machine assessments, structured data embedded in learning, and unstructured data collected incidental to learning activity. Our case is that these emerging sources of evidence of learning have significant implications for the traditional relationships between assessment and instruction. Moreover, for educational researchers, these data are in some senses quite different from traditional evidentiary sources, and this raises a number of methodological questions. The final part of the article discusses implications for practice in an emerging field of education data science, including publication of data, data standards, and research ethics.

  6. Ecotoxicology: Lead

    Science.gov (United States)

    Scheuhammer, A.M.; Beyer, W.N.; Schmitt, C.J.; Jorgensen, Sven Erik; Fath, Brian D.

    2008-01-01

    Lead (Pb) is a naturally occurring metallic element; trace concentrations are found in all environmental media and in all living things. However, certain human activities, especially base metal mining and smelting; combustion of leaded gasoline; the use of Pb in hunting, target shooting, and recreational angling; the use of Pb-based paints; and the uncontrolled disposal of Pb-containing products such as old vehicle batteries and electronic devices have resulted in increased environmental levels of Pb, and have created risks for Pb exposure and toxicity in invertebrates, fish, and wildlife in some ecosystems.

  7. How do big rivers come to be different?

    Science.gov (United States)

    Ashworth, Philip J.; Lewin, John

    2012-08-01

    Big rivers dominate the world's continental surface, yet we are still learning about how they operate and whether they are explicably different, not only from each other, but also from smaller rivers. This paper uses global satellite imagery and ground field-experience to explain and illustrate why and how big rivers are strongly differentiated. At the largest scale, trans-continent sized rivers do not possess unified valley systems created by fluvial erosion but instead involve chains of interlinked domains with contrasted fluvial functions. Alluvial settings are dependent on mainstream and tributary inputs of water and sediment, but big river channel pattern variety is determined by contrasts in sediment feed-rates and differences in the rates and routes of sediment exchange. Four modes of alluvial exchange are recognised: (i) deposition on the floodplain (e.g., levees, infilled palaeochannels and floodbasins), (ii) exchanges involving main channels (e.g., bank erosion and accretion), (iii) deposition within main channels (e.g. bedforms from metres to 10s of kilometres in size), and (iv) material input from tributaries (sediment-rich or sediment-poor). Different combinations of sedimentation activity lead to floodplain morphologies for big rivers that can be classified into four types: (i) lacustrine-dominated, (ii) mainstream-dominated, (iii) tributary or accessory-stream dominated, and (iv) confined or bedrock-dominated. Channel patterning involves a range of main-channel, branch and floodplain styles promoted by variable sediment feeds, complex bed sediment transfers, variable lateral sediment exchanges, plural channel systems and incomplete mineral sedimentation of the hydraulic corridors set by tectonics and prior-valley trenching. In some of the world's largest rivers it is accessory and tributary channels, rather than main-river branches, which determine patterns of floodplain morphology. In some big rivers, but certainly not all, ponded lacustrine

  8. Lead grids

    CERN Multimedia

    1974-01-01

    One of the 150 lead grids used in the multiwire proportional chamber γ-ray detector. The 0.75 mm diameter holes are spaced 1 mm centre to centre. The grids were made by chemical cutting techniques in the Godet Workshop of the SB Physics.

  9. Leading men

    DEFF Research Database (Denmark)

    Bekker-Nielsen, Tønnes

    2016-01-01

    Through a systematic comparison of c. 50 careers leading to the koinarchate or high priesthood of Asia, Bithynia, Galatia, Lycia, Macedonia and coastal Pontus, as described in funeral or honorary inscriptions of individual koinarchs, it is possible to identify common denominators but also...

  10. The research and application of the power big data

    Science.gov (United States)

    Zhang, Suxiang; Zhang, Dong; Zhang, Yaping; Cao, Jinping; Xu, Huiming

    2017-01-01

    Facing a growing environmental crisis, improving energy efficiency is an important problem, and power big data is a main supporting tool for realizing demand-side management and response. With the promotion of smart power consumption, distributed clean energy and electric vehicles are being widely adopted; meanwhile, the continuous development of Internet of Things technology means that ever more applications connect to endpoints in the grid, so large numbers of electrical terminal devices and new energy sources access the smart grid and produce massive, heterogeneous, multi-state electricity data. These data are the power grid enterprise's precious wealth: its power big data. How to transform them into valuable knowledge and effective operations is an important problem, and it requires interoperation within the smart grid. In this paper, we study various applications of power big data that integrate cloud computing and big data technology, including online monitoring of electricity consumption, short-term power load forecasting, and energy efficiency analysis. Based on Hadoop, HBase, Hive and related tools, we realize the ETL and OLAP functions; we also adopt a parallel computing framework for the power load forecasting algorithms and propose a parallel locally weighted linear regression model. We further study an energy efficiency rating model to comprehensively evaluate the energy consumption level of electricity users, which allows users to understand their real-time energy consumption, adjust their electricity behaviour to reduce consumption, and gives them a basis for decision-making. Electricity management is demonstrated with an intelligent industrial park as an example. In the future, power big data will therefore provide decision-support tools for energy conservation and emissions reduction.
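
    The parallel locally weighted linear regression model is only named in this record, not specified. As a rough, generic illustration of the underlying technique, the following Python sketch fits a locally weighted linear regression to a small synthetic load series; the data, the bandwidth tau and the single time feature are assumptions made for the example, not the paper's implementation.

    import numpy as np

    def lwlr_predict(x_query, X, y, tau=1.5):
        """Locally weighted linear regression: fit a weighted least-squares line
        around x_query and return its prediction there."""
        w = np.exp(-((X - x_query) ** 2) / (2.0 * tau ** 2))   # Gaussian kernel weights
        Xb = np.column_stack([np.ones_like(X), X])             # add an intercept column
        W = np.diag(w)
        theta = np.linalg.solve(Xb.T @ W @ Xb, Xb.T @ W @ y)   # weighted normal equations
        return np.array([1.0, x_query]) @ theta

    # Illustrative hourly load values (MW) over one day -- synthetic numbers only
    hours = np.arange(24, dtype=float)
    load = 50 + 20 * np.sin((hours - 6) * np.pi / 12) + np.random.default_rng(0).normal(0, 2, 24)

    # Short-term forecast for the next hour based on the local trend
    print("predicted load at hour 24:", lwlr_predict(24.0, hours, load))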

  11. Empowering Personalized Medicine with Big Data and Semantic Web Technology: Promises, Challenges, and Use Cases.

    Science.gov (United States)

    Panahiazar, Maryam; Taslimitehrani, Vahid; Jadhav, Ashutosh; Pathak, Jyotishman

    2014-10-01

    In healthcare, big data tools and technologies have the potential to create significant value by improving outcomes while lowering costs for each individual patient. Diagnostic images, genetic test results and biometric information are increasingly generated and stored in electronic health records presenting us with challenges in data that is by nature high volume, variety and velocity, thereby necessitating novel ways to store, manage and process big data. This presents an urgent need to develop new, scalable and expandable big data infrastructure and analytical methods that can enable healthcare providers access knowledge for the individual patient, yielding better decisions and outcomes. In this paper, we briefly discuss the nature of big data and the role of semantic web and data analysis for generating "smart data" which offer actionable information that supports better decision for personalized medicine. In our view, the biggest challenge is to create a system that makes big data robust and smart for healthcare providers and patients that can lead to more effective clinical decision-making, improved health outcomes, and ultimately, managing the healthcare costs. We highlight some of the challenges in using big data and propose the need for a semantic data-driven environment to address them. We illustrate our vision with practical use cases, and discuss a path for empowering personalized medicine using big data and semantic web technology.

  12. Big Data in Health: a Literature Review from the Year 2005.

    Science.gov (United States)

    de la Torre Díez, Isabel; Cosgaya, Héctor Merino; Garcia-Zapirain, Begoña; López-Coronado, Miguel

    2016-09-01

    The information stored in healthcare systems has increased over the last ten years, leading it to be considered Big Data. There is a wealth of health information ready to be analysed. However, the sheer volume raises a challenge for traditional methods. The aim of this article is to conduct a cutting-edge study on Big Data in healthcare from 2005 to the present. This literature review will help researchers to know how Big Data has developed in the health industry and open up new avenues for research. Information searches have been made on various scientific databases such as Pubmed, Science Direct, Scopus and Web of Science for Big Data in healthcare. The search criteria were "Big Data" and "health" with a date range from 2005 to the present. A total of 9724 articles were found on the databases. 9515 articles were discarded as duplicates or for not having a title of interest to the study. 209 articles were read, with the resulting decision that 46 were useful for this study. 52.6 % of the articles used were found in Science Direct, 23.7 % in Pubmed, 22.1 % through Scopus and the remaining 2.6 % through the Web of Science. Big Data has undergone extremely high growth since 2011 and its use is becoming compulsory in developed nations and in an increasing number of developing nations. Big Data is a step forward and a cost reducer for public and private healthcare.

  13. Recurrent Pregnancy Loss

    Directory of Open Access Journals (Sweden)

    Véronique Piroux

    1997-01-01

    Full Text Available Antiphospholipid antibodies (APA) are associated with thrombosis, thrombocytopenia and fetal loss but they occur in a variety of diseases. Despite many efforts, a correlation between the specificity of particular subgroups of APA and particular clinical situations remains to be established. The antigens at the origin of APA remain to be identified. We discuss here the possible links between cell apoptosis or necrosis, leading to plasma membrane alterations, and the occurrence of APA in response to sustained stimulation. The pathogenic potential of APA is also considered with respect to recurrent pregnancy loss.

  14. EAARL-B Topography-Big Thicket National Preserve: Big Sandy Creek Corridor Unit, Texas, 2014

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — A first-surface topography Digital Elevation Model (DEM) mosaic for the Big Sandy Creek Corridor Unit of Big Thicket National Preserve in Texas was produced from...

  15. EAARL-B Topography-Big Thicket National Preserve: Big Sandy Creek Corridor Unit, Texas, 2014

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — A bare-earth topography Digital Elevation Model (DEM) mosaic for the Big Sandy Creek Corridor Unit of Big Thicket National Preserve in Texas was produced from...

  16. EAARL-B Topography-Big Thicket National Preserve: Big Sandy Creek Unit, Texas, 2014

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — A first-surface topography digital elevation model (DEM) mosaic for the Big Sandy Creek Unit of Big Thicket National Preserve in Texas, was produced from remotely...

  17. EAARL-B Topography-Big Thicket National Preserve: Big Sandy Creek Unit, Texas, 2014

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — A bare-earth topography digital elevation model (DEM) mosaic for the Big Sandy Creek Unit of Big Thicket National Preserve in Texas, was produced from remotely...

  18. Computing the bounds on the loss rates

    OpenAIRE

    Fourneau J.-M.; Mokdad L.; Pekergin N.

    2002-01-01

    We consider an example network for which we compute bounds on the cell loss rates. Deriving stochastic bounds for these loss rates with simple arguments leads to models that are easier to solve. Using stochastic orders, we prove that the loss rates of these simpler models are indeed bounds for the loss rates of the original model. For ill-balanced configurations these models give good estimates of the loss rates.
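
    The bounding models themselves are not given in this record. As a generic illustration of bounding a loss rate with a simpler expression, the Python sketch below compares the exact blocking probability of an M/M/1/K queue with the easily computed upper bound rho**K taken from the infinite-buffer queue; the load values and buffer size are invented for the example and are not taken from the paper.

    def mm1k_loss(rho, K):
        """Exact blocking (loss) probability of an M/M/1/K queue."""
        if rho == 1.0:
            return 1.0 / (K + 1)
        return (1 - rho) * rho ** K / (1 - rho ** (K + 1))

    def tail_bound(rho, K):
        """Simple upper bound: probability that the infinite M/M/1 queue holds >= K customers."""
        return rho ** K

    K = 10                              # buffer size (assumed for the example)
    for rho in (0.5, 0.8, 0.95):        # offered loads (assumed for the example)
        print(f"rho={rho}: exact={mm1k_loss(rho, K):.3e}  bound={tail_bound(rho, K):.3e}")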

  19. Big data optimization recent developments and challenges

    CERN Document Server

    2016-01-01

    The main objective of this book is to provide the necessary background for working with big data by introducing novel optimization algorithms and codes capable of working in the big data setting, as well as applications of big data optimization, for interested academics and practitioners and to the benefit of society, industry, academia, and government. Presenting applications in a variety of industries, this book will be useful for researchers aiming to analyse large-scale data. Several optimization algorithms for big data, including convergent parallel algorithms, the limited memory bundle algorithm, the diagonal bundle method, network analytics, and many more, are explored in this book.

  20. Medical big data: promise and challenges.

    Science.gov (United States)

    Lee, Choong Ho; Yoon, Hyung-Jin

    2017-03-01

    The concept of big data, commonly characterized by volume, variety, velocity, and veracity, goes far beyond the data type and includes aspects of data analysis, such as hypothesis-generating rather than hypothesis-testing approaches. Big data focuses on the temporal stability of an association rather than on causal relationships, and underlying probability distribution assumptions are frequently not required. Medical big data as material to be analyzed has various features that are not only distinct from big data of other disciplines, but also distinct from traditional clinical epidemiology. Big data technology has many areas of application in healthcare, such as predictive modeling and clinical decision support, disease or safety surveillance, public health, and research. Big data analytics frequently exploits analytic methods developed in data mining, including classification, clustering, and regression. Medical big data analyses are complicated by many technical issues, such as missing values, the curse of dimensionality, and bias control, and they share the inherent limitations of observational studies, namely the inability to test causality because of residual confounding and reverse causation. Recently, propensity score analysis and instrumental variable analysis have been introduced to overcome these limitations, and they have accomplished a great deal. Many challenges, such as the absence of evidence of practical benefits of big data, methodological issues including legal and ethical issues, and clinical integration and utility issues, must be overcome to realize the promise of medical big data as the fuel of a continuous learning healthcare system that will improve patient outcomes and reduce waste in areas including nephrology.
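
    Propensity score analysis is only mentioned in this record, not described. To make the idea concrete, the sketch below shows one common variant, nearest-neighbour matching on a logistic-regression propensity score, applied to a synthetic observational dataset; the covariates, effect size and matching scheme are assumptions for the illustration, not part of the cited work.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(42)
    n = 1000
    age = rng.normal(60, 10, n)                       # confounder
    severity = rng.normal(0, 1, n)                    # confounder
    # Treatment assignment depends on the confounders, as in observational data
    p_treat = 1 / (1 + np.exp(-(0.05 * (age - 60) + 0.8 * severity)))
    treated = rng.binomial(1, p_treat)
    # Outcome depends on the confounders plus a true treatment effect of -2.0
    outcome = 5 + 0.1 * age + 1.5 * severity - 2.0 * treated + rng.normal(0, 1, n)

    X = np.column_stack([age, severity])
    ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]   # propensity scores

    # 1:1 nearest-neighbour matching (with replacement) of treated units to controls
    t_idx, c_idx = np.where(treated == 1)[0], np.where(treated == 0)[0]
    matches = [c_idx[np.argmin(np.abs(ps[c_idx] - ps[i]))] for i in t_idx]
    print("naive difference:", outcome[t_idx].mean() - outcome[c_idx].mean())
    print("matched estimate:", np.mean(outcome[t_idx] - outcome[matches]))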

  1. The big de Rham–Witt complex

    DEFF Research Database (Denmark)

    Hesselholt, Lars

    2015-01-01

    This paper gives a new and direct construction of the multi-prime big de Rham–Witt complex, which is defined for every commutative and unital ring; the original construction by Madsen and myself relied on the adjoint functor theorem and accordingly was very indirect. The construction given here....... It is the existence of these divided Frobenius operators that makes the new construction of the big de Rham–Witt complex possible. It is further shown that the big de Rham–Witt complex behaves well with respect to étale maps, and finally, the big de Rham–Witt complex of the ring of integers is explicitly evaluated....

  2. Big data analytics with R and Hadoop

    CERN Document Server

    Prajapati, Vignesh

    2013-01-01

    Big Data Analytics with R and Hadoop is a tutorial style book that focuses on all the powerful big data tasks that can be achieved by integrating R and Hadoop.This book is ideal for R developers who are looking for a way to perform big data analytics with Hadoop. This book is also aimed at those who know Hadoop and want to build some intelligent applications over Big data with R packages. It would be helpful if readers have basic knowledge of R.

  3. Traffic information computing platform for big data

    Energy Technology Data Exchange (ETDEWEB)

    Duan, Zongtao, E-mail: ztduan@chd.edu.cn; Li, Ying, E-mail: ztduan@chd.edu.cn; Zheng, Xibin, E-mail: ztduan@chd.edu.cn; Liu, Yan, E-mail: ztduan@chd.edu.cn; Dai, Jiting, E-mail: ztduan@chd.edu.cn; Kang, Jun, E-mail: ztduan@chd.edu.cn [Chang' an University School of Information Engineering, Xi' an, China and Shaanxi Engineering and Technical Research Center for Road and Traffic Detection, Xi' an (China)

    2014-10-06

    A big data environment creates the data conditions for improving the quality of traffic information services. The goal of this article is to construct a traffic information computing platform for the big data environment. Through an in-depth analysis of the connotation and technological characteristics of big data and traffic information services, a distributed traffic atomic information computing platform architecture is proposed. Under the big data environment, this type of traffic atomic information computing architecture helps to guarantee safe and efficient traffic operation, and more intelligent and personalized traffic information services can be provided to traffic information users.

  4. Black Hole Blows Big Bubble

    Science.gov (United States)

    2010-07-01

    astronomers understand the similarity between small black holes formed from exploded stars and the supermassive black holes at the centres of galaxies. Very powerful jets have been seen from supermassive black holes, but are thought to be less frequent in the smaller microquasar variety. The new discovery suggests that many of them may simply have gone unnoticed so far. The gas-blowing black hole is located 12 million light-years away, in the outskirts of the spiral galaxy NGC 7793 (eso0914b). From the size and expansion velocity of the bubble the astronomers have found that the jet activity must have been ongoing for at least 200 000 years. Notes [1] Astronomers do not have yet any means of measuring the size of the black hole itself. The smallest stellar black hole discovered so far has a radius of about 15 km. An average stellar black hole of about 10 solar masses has a radius of about 30 km, while a "big" stellar black hole may have a radius of up to 300 km. This is still much smaller than the jets, which extend out to several hundreds light years on each side of the black hole, or about several thousand million million km! More information This result appears in a paper published in this week's issue of the journal Nature (A 300 parsec long jet-inflated bubble around a powerful microquasar in the galaxy NGC 7793, by Manfred W. Pakull, Roberto Soria and Christian Motch). ESO, the European Southern Observatory, is the foremost intergovernmental astronomy organisation in Europe and the world's most productive astronomical observatory. It is supported by 14 countries: Austria, Belgium, the Czech Republic, Denmark, France, Finland, Germany, Italy, the Netherlands, Portugal, Spain, Sweden, Switzerland and the United Kingdom. ESO carries out an ambitious programme focused on the design, construction and operation of powerful ground-based observing facilities enabling astronomers to make important scientific discoveries. ESO also plays a leading role in promoting and organising

  5. Big Data; A Management Revolution : The emerging role of big data in businesses

    OpenAIRE

    Blasiak, Kevin

    2014-01-01

    Big data is a term that was coined in 2012 and has since emerged as one of the top trends in business and technology. Big data is an agglomeration of different technologies resulting in data processing capabilities that were previously unattainable. Big data is generally characterized by volume, velocity and variety. These three factors distinguish it from traditional data use. The possibilities for utilizing this technology are vast. Big data technology has touch points in differ...

  6. Big Brother Has Bigger Say

    Institute of Scientific and Technical Information of China (English)

    Yang Wei

    2009-01-01

    156 delegates from all walks of life in Guangdong province composed the Guangdong delegation for the NPC this year. The import and export value of Guangdong makes up one-third of the national total value, and accounts for one-eighth of national economic growth. Guangdong province has maintained its top spot in import and export value among China's many provinces and cities for several years, and is commonly referred to as "Big Brother". At the same time, it is the region where the global financial crisis has hit China hardest.

  7. Parkinson’s Brain Disease Prediction Using Big Data Analytics

    Directory of Open Access Journals (Sweden)

    N. Shamli

    2016-06-01

    Full Text Available In healthcare industries, the demand for maintaining large amounts of patient data is steadily growing due to a rising population, which has resulted in an increase in details about clinical and laboratory tests, imaging, prescriptions and medication. These data can be called "Big Data" because of their size, complexity and diversity. Big data analytics aims at improving patient care, identifying preventive measures proactively, saving lives and recommending lifestyle changes for a peaceful and healthier life at low cost. The proposed predictive analytics framework is a combination of Decision Tree, Support Vector Machine and Artificial Neural Network classifiers, which is used to gain insights from patient data. The Parkinson's disease voice dataset from the UCI Machine Learning Repository is used as input. The experimental results show that early detection of the disease will facilitate clinical monitoring of elderly people, increase the chances of a longer life span and support an improved, more peaceful lifestyle.
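
    A minimal sketch of the kind of combined classifier the abstract describes, using scikit-learn's decision tree, support vector machine and multilayer perceptron in a soft-voting ensemble on the UCI Parkinson's voice data. The file path and the 'name'/'status' column names follow the usual distribution of that dataset but should be checked against a local copy, and the hyperparameters are illustrative only, not the authors' settings.

    import pandas as pd
    from sklearn.model_selection import train_test_split
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.svm import SVC
    from sklearn.neural_network import MLPClassifier
    from sklearn.ensemble import VotingClassifier
    from sklearn.metrics import accuracy_score

    # Assumed local copy of the UCI Parkinson's voice dataset
    df = pd.read_csv("parkinsons.data")
    X = df.drop(columns=["name", "status"])    # acoustic voice features
    y = df["status"]                           # 1 = Parkinson's, 0 = healthy

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=0, stratify=y)

    ensemble = VotingClassifier(
        estimators=[
            ("tree", DecisionTreeClassifier(max_depth=5, random_state=0)),
            ("svm", make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))),
            ("mlp", make_pipeline(StandardScaler(),
                                  MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000,
                                                random_state=0))),
        ],
        voting="soft")                         # average the predicted probabilities

    ensemble.fit(X_train, y_train)
    print("test accuracy:", accuracy_score(y_test, ensemble.predict(X_test)))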

  8. Big Data en surveillance, deel 1 : Definities en discussies omtrent Big Data

    NARCIS (Netherlands)

    Timan, Tjerk

    2016-01-01

    Following a (fairly short) lecture on surveillance and Big Data, I was asked to look somewhat more deeply into the theme, the definitions, and the various questions related to big data. In this first part I will try to set out the main points concerning Big Data theory and terminology.

  9. Comparative validity of brief to medium-length Big Five and Big Six personality questionnaires

    NARCIS (Netherlands)

    Thalmayer, A.G.; Saucier, G.; Eigenhuis, A.

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five

  10. Comparative Validity of Brief to Medium-Length Big Five and Big Six Personality Questionnaires

    Science.gov (United States)

    Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are…

  11. Dextran sodium sulfate-induced colitis leads to bone loss in mice

    Institute of Scientific and Technical Information of China (English)

    魏凡华; 胡志华

    2014-01-01

    To establish a dextran sodium sulfate (DSS)-induced colitis model and analyse the bone loss that occurs in colitic mice, colitis was induced in C57BL/6 mice by oral administration of DSS solution for 2 weeks: 2% DSS solution during the first week and 1% DSS solution during the second week. C57BL/6 mice drinking distilled water served as controls. Body weight, stool and blood scores were recorded for each group, and femoral bone loss was examined by micro computed tomography (μCT). The results demonstrate that DSS-treated mice exhibited lower bone mass and decreased trabecular numbers compared with the controls. Collectively, the DSS-induced colitis model can be used to study the bone loss caused by colitis and pharmacological interventions against it.

  12. THE 2H(ALPHA, GAMMA)6LI REACTION AT LUNA AND BIG BANG NUCLEOSYNTHESIS

    Directory of Open Access Journals (Sweden)

    Carlo Gustavino

    2013-12-01

    Full Text Available The 2H(α,γ)6Li reaction is the leading process for the production of 6Li in standard Big Bang Nucleosynthesis. Recent observations of lithium abundance in metal-poor halo stars suggest that there might be a 6Li plateau, similar to the well-known Spite plateau of 7Li. This calls for a re-investigation of the standard production channel for 6Li. As the 2H(α,γ)6Li cross section drops steeply at low energy, it has never before been studied directly at Big Bang energies. For the first time the reaction has been studied directly at Big Bang energies at the LUNA accelerator. The preliminary data and their implications for Big Bang nucleosynthesis and the purported 6Li problem will be shown.

  13. Advances in mobile cloud computing and big data in the 5G era

    CERN Document Server

    Mastorakis, George; Dobre, Ciprian

    2017-01-01

    This book reports on the latest advances on the theories, practices, standards and strategies that are related to the modern technology paradigms, the Mobile Cloud computing (MCC) and Big Data, as the pillars and their association with the emerging 5G mobile networks. The book includes 15 rigorously refereed chapters written by leading international researchers, providing the readers with technical and scientific information about various aspects of Big Data and Mobile Cloud Computing, from basic concepts to advanced findings, reporting the state-of-the-art on Big Data management. It demonstrates and discusses methods and practices to improve multi-source Big Data manipulation techniques, as well as the integration of resources availability through the 3As (Anywhere, Anything, Anytime) paradigm, using the 5G access technologies.

  14. Recent Development in Big Data Analytics for Business Operations and Risk Management.

    Science.gov (United States)

    Choi, Tsan-Ming; Chan, Hing Kai; Yue, Xiaohang

    2017-01-01

    "Big data" is an emerging topic and has attracted the attention of many researchers and practitioners in industrial systems engineering and cybernetics. Big data analytics would definitely lead to valuable knowledge for many organizations. Business operations and risk management can be a beneficiary as there are many data collection channels in the related industrial systems (e.g., wireless sensor networks, Internet-based systems, etc.). Big data research, however, is still in its infancy. Its focus is rather unclear and related studies are not well amalgamated. This paper aims to present the challenges and opportunities of big data analytics in this unique application domain. Technological development and advances for industrial-based business systems, reliability and security of industrial systems, and their operational risk management are examined. Important areas for future research are also discussed and revealed.

  15. What's in a Name: History and Meanings of the Term "Big Bang"

    CERN Document Server

    Kragh, Helge

    2013-01-01

    The name "big bang" introduced by Fred Hoyle in 1949 is one of the most successful scientific neologisms ever. How did the name originate and how was it received by physicists and astronomers in the period leading up to the hot big bang consensus model in the late 1960s? How did it reflect the meanings of the big bang, a concept that predates the name by nearly two decades? The paper gives a detailed account of names and concepts associated with finite-age cosmological models from the 1920s to the 1970s. It turns out that Hoyle's celebrated name has a richer and more surprising history than commonly assumed and also that the literature on modern cosmology and its history includes many common mistakes and errors. By following the story of "big bang" a new dimension is added to the historical understanding of the emergence of modern cosmology.

  16. Hidden loss

    DEFF Research Database (Denmark)

    Kieffer-Kristensen, Rikke; Johansen, Karen Lise Gaardsvig

    2013-01-01

    to participate. RESULTS: All children were affected by their parents' ABI and the altered family situation. The children's expressions led the authors to identify six themes, including fear of losing the parent, distress and estrangement, chores and responsibilities, hidden loss, coping and support. The main...... the ill parent. These findings contribute to a deeper understanding of the traumatic process of parental ABI that some children experience and emphasize the importance of family-centred interventions that include the children....

  17. Big Challenges and Big Opportunities: The Power of "Big Ideas" to Change Curriculum and the Culture of Teacher Planning

    Science.gov (United States)

    Hurst, Chris

    2014-01-01

    Mathematical knowledge of pre-service teachers is currently "under the microscope" and the subject of research. This paper proposes a different approach to teacher content knowledge based on the "big ideas" of mathematics and the connections that exist within and between them. It is suggested that these "big ideas"…

  18. Tumor suppression function of the Big-h3 gene in radiation carcinogenesis

    Science.gov (United States)

    Zhao, Y.; Piao, C.; Hei, T.

    Interaction between cells and the extracellular matrix (ECM) plays a crucial role in tumor invasiveness and metastasis. Using an immortalized human bronchial epithelial (BEP2D) cell model, we show here that expression of the Big-h3 gene, a secreted adhesion molecule induced by transforming growth factor-beta (TGF-beta), is markedly decreased in independently generated, high-LET radiation-induced tumor cell lines (TL1-TL5) relative to parental BEP2D cells. Expression of this gene was restored to the control level in fusion cell lines between the tumorigenic and parental BEP2D cells that were no longer tumorigenic in nude mice. Transfection of the Big-h3 gene into tumor cells resulted in a significant reduction of tumor growth. While integrin receptor alpha 5/beta 1 was overexpressed in tumor cells, its expression was corrected to the level of control BEP2D cells after Big-h3 transfection. These data suggest that Big-h3 is involved in tumor progression by regulating integrin receptor alpha 5/beta 1. We further show that downregulation of Big-h3 results from loss of expression of TGF-beta1 in tumor cells. These findings provide strong evidence that the Big-h3 gene has a tumor suppressor function in radiation-induced tumorigenic human bronchial epithelial cells and suggest a potential target for interventional therapy.

  19. Big data in oncologic imaging.

    Science.gov (United States)

    Regge, Daniele; Mazzetti, Simone; Giannini, Valentina; Bracco, Christian; Stasi, Michele

    2016-09-13

    Cancer is a complex disease and unfortunately understanding how the components of the cancer system work does not help understand the behavior of the system as a whole. In the words of the Greek philosopher Aristotle "the whole is greater than the sum of parts." To date, thanks to improved information technology infrastructures, it is possible to store data from each single cancer patient, including clinical data, medical images, laboratory tests, and pathological and genomic information. Indeed, medical archive storage constitutes approximately one-third of total global storage demand and a large part of the data are in the form of medical images. The opportunity is now to draw insight on the whole to the benefit of each individual patient. In the oncologic patient, big data analysis is at the beginning but several useful applications can be envisaged including development of imaging biomarkers to predict disease outcome, assessing the risk of X-ray dose exposure or of renal damage following the administration of contrast agents, and tracking and optimizing patient workflow. The aim of this review is to present current evidence of how big data derived from medical images may impact on the diagnostic pathway of the oncologic patient.

  20. Georges et le big bang

    CERN Document Server

    Hawking, Lucy; Parsons, Gary

    2011-01-01

    Georges and Annie, his best friend, are about to witness one of the most important scientific experiments of all time: exploring the very first moments of the Universe, the Big Bang! Thanks to Cosmos, their super computer, and to the Large Hadron Collider built by Eric, Annie's father, they will finally be able to answer the essential question: why do we exist? But Georges and Annie discover that a diabolical plot is afoot. Worse still, all of scientific research is in peril! Swept up in incredible adventures, Georges will travel to the far reaches of the galaxy to save his friends... A thrilling plunge into the heart of the Big Bang, featuring the very latest theories of Stephen Hawking and today's leading scientists.

  1. Why Big Data Is a Big Deal (Ⅱ)

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    A new group of data mining technologies promises to change forever the way we sift through our vast stores of data, making it faster and cheaper. Some of the technologies are actively being used by people on the bleeding edge who need the technology now, such as those involved in creating Web-based services driven by social media, and they are also contributing heavily to these projects. In other vertical industries, businesses are realizing that much more of their value proposition is information-based than they had previously thought, which will allow big data technologies to gain traction quickly, Olofson says. Couple that with affordable hardware and software, and enterprises find themselves in a perfect storm of business transformation opportunities.

  2. Big Data – Big Deal for Organization Design?

    Directory of Open Access Journals (Sweden)

    Janne J. Korhonen

    2014-04-01

    Full Text Available Analytics is an increasingly important source of competitive advantage. It has even been posited that big data will be the next strategic emphasis of organizations and that analytics capability will be manifested in organizational structure. In this article, I explore how analytics capability might be reflected in organizational structure using the notion of "requisite organization" developed by Jaques (1998). Requisite organization argues that a new strategic emphasis requires the addition of a new stratum in the organization, resulting in greater organizational complexity. Requisite organization could serve as an objective, verifiable criterion for what qualifies as a genuine new strategic emphasis. Such a criterion is necessary for research on the co-evolution of strategy and structure.

  3. Kansen voor Big data – WPA Vertrouwen

    NARCIS (Netherlands)

    Broek, T.A. van den; Roosendaal, A.P.C.; Veenstra, A.F.E. van; Nunen, A.M. van

    2014-01-01

    Big data is expected to become a driver for economic growth, but this can only be achieved when services based on (big) data are accepted by citizens and consumers. In a recent policy brief, the Cabinet Office mentions trust as one of the three pillars (the others being transparency and control) for

  4. The Death of the Big Men

    DEFF Research Database (Denmark)

    Martin, Keir

    2010-01-01

    Recently Tolai people of Papua New Guinea have adopted the term 'Big Shot' to describe an emerging post-colonial political elite. The emergence of the term is a negative moral evaluation of new social possibilities that have arisen as a consequence of the Big Shots' privileged position within a glo...

  5. The Big Sleep in the Woods

    Institute of Scientific and Technical Information of China (English)

    王玉峰

    2002-01-01

    Now it's the time of the big sleep for the bees and the bears. Even the buds of the plants whose leaves fall off share in it. But the intensity of this winter sleep, or hibernation, depends on who's doing it. The big sleep of the bears, for instance, would probably be thought of as a

  6. A New Look at Big History

    Science.gov (United States)

    Hawkey, Kate

    2014-01-01

    The article sets out a "big history" which resonates with the priorities of our own time. A globalizing world calls for new spatial scales to underpin what the history curriculum addresses, "big history" calls for new temporal scales, while concern over climate change calls for a new look at subject boundaries. The article…

  7. The Revolutionary BIG BANG

    Institute of Scientific and Technical Information of China (English)

    刘岩

    2015-01-01

    During the boom years of Ordos I met one of the city's "opinion leaders". Having returned from the United States and seen the outside world, he had a broad knowledge of, and a distinctive taste in, luxury goods. He led the shopping trends of a small circle in that mysterious city of wealth, and its members bought Big Bang watches one after another. At the time I did not quite understand why they were so obsessed with this watch, until repeated visits to the Baselworld watch fair taught me, again and again, about the imagination behind the Big Bang. Yes, the Big Bang is indeed full of charm. A brief history of the Big Bang: in 2005 the Big Bang collection was born; in 2006 the Big Bang All Black appeared, whose "all black" concept made the Big Bang purer and more restrained. From case to dial, the Big Bang All Black combines a seamless matte texture with multiple layers of black in different materials, embodying the Zen idea of "the invisible made visible".

  8. An embedding for the big bang

    Science.gov (United States)

    Wesson, Paul S.

    1994-01-01

    A cosmological model is given that has good physical properties for the early and late universe but is a hypersurface in a flat five-dimensional manifold. The big bang can therefore be regarded as an effect of a choice of coordinates in a truncated higher-dimensional geometry. Thus the big bang is in some sense a geometrical illusion.

  9. Structuring the Curriculum around Big Ideas

    Science.gov (United States)

    Alleman, Janet; Knighton, Barbara; Brophy, Jere

    2010-01-01

    This article provides an inside look at Barbara Knighton's classroom teaching. She uses big ideas to guide her planning and instruction and gives other teachers suggestions for adopting the big idea approach and ways for making the approach easier. This article also represents a "small slice" of a dozen years of collaborative research,…

  10. Cloud Based Big Data Infrastructure: Architectural Components and Automated Provisioning

    OpenAIRE

    Demchenko, Yuri; Turkmen, Fatih; Blanchet, Christophe; Loomis, Charles; Laat, Caees de

    2016-01-01

    This paper describes the general architecture and functional components of the cloud based Big Data Infrastructure (BDI). The proposed BDI architecture is based on the analysis of the emerging Big Data and data intensive technologies and supported by the definition of the Big Data Architecture Framework (BDAF) that defines the following components of the Big Data technologies: Big Data definition, Data Management including data lifecycle and data structures, Big Data Infrastructure (generical...

  11. Ethics and Epistemology in Big Data Research.

    Science.gov (United States)

    Lipworth, Wendy; Mason, Paul H; Kerridge, Ian; Ioannidis, John P A

    2017-03-20

    Biomedical innovation and translation are increasingly emphasizing research using "big data." The hope is that big data methods will both speed up research and make its results more applicable to "real-world" patients and health services. While big data research has been embraced by scientists, politicians, industry, and the public, numerous ethical, organizational, and technical/methodological concerns have also been raised. With respect to technical and methodological concerns, there is a view that these will be resolved through sophisticated information technologies, predictive algorithms, and data analysis techniques. While such advances will likely go some way towards resolving technical and methodological issues, we believe that the epistemological issues raised by big data research have important ethical implications and raise questions about the very possibility of big data research achieving its goals.

  12. 2nd INNS Conference on Big Data

    CERN Document Server

    Manolopoulos, Yannis; Iliadis, Lazaros; Roy, Asim; Vellasco, Marley

    2017-01-01

    The book offers a timely snapshot of neural network technologies as a significant component of big data analytics platforms. It promotes new advances and research directions in efficient and innovative algorithmic approaches to analyzing big data (e.g. deep networks, nature-inspired and brain-inspired algorithms); implementations on different computing platforms (e.g. neuromorphic, graphics processing units (GPUs), clouds, clusters); and big data analytics applications to solve real-world problems (e.g. weather prediction, transportation, energy management). The book, which reports on the second edition of the INNS Conference on Big Data, held on October 23–25, 2016, in Thessaloniki, Greece, depicts an interesting collaborative adventure of neural networks with big data and other learning technologies.

  13. Big data analytics methods and applications

    CERN Document Server

    Rao, BLS; Rao, SB

    2016-01-01

    This book has a collection of articles written by Big Data experts to describe some of the cutting-edge methods and applications from their respective areas of interest, and provides the reader with a detailed overview of the field of Big Data Analytics as it is practiced today. The chapters cover technical aspects of key areas that generate and use Big Data such as management and finance; medicine and healthcare; genome, cytome and microbiome; graphs and networks; Internet of Things; Big Data standards; bench-marking of systems; and others. In addition to different applications, key algorithmic approaches such as graph partitioning, clustering and finite mixture modelling of high-dimensional data are also covered. The varied collection of themes in this volume introduces the reader to the richness of the emerging field of Big Data Analytics.

  14. Review Study of Mining Big Data

    Directory of Open Access Journals (Sweden)

    Mohammad Misagh Javaherian

    2016-06-01

    Full Text Available Big data is a term for extensive and complex data sets that include both structured and unstructured information. The data can come from everywhere: sensors collecting environmental data, social networking sites, digital images and recordings, and so on; this information is known as big data. Valuable knowledge can be extracted from big data using data mining, a method for finding interesting patterns and logical models in data on a very large scale. This article presents, in chart form, the types of big data and the future problems of working with extensive information, and it analyses the issues of data-centred models in the context of big data.

  15. Toxicity of methyl parathion to bats: Mortality and coordination loss

    Science.gov (United States)

    Clark, D.R.

    1986-01-01

    The 24-h oral LD50 of methyl parathion (phosphorothioic acid O,O-dimethyl O-(4-nitrophenyl) ester) to little brown bats (Myotis lucifugus) (372 mg/kg) was 8.5 times the LD50 for mice (Mus musculus) (44 mg/kg). However, orally dosed mice either died or appeared behaviorally normal after 2 to 3 h, whereas many dosed bats, although alive at 24 h, could not right themselves when placed on their backs. The oral dose estimated to cause this loss of coordination in 50% of a sample of big brown bats (Eptesicus fuscus) was one-third or less the LD50 of this species. Cholinesterase activity depression in brains of little brown bats was similar whether dosage was oral or dermal. With death as the criterion, bats proved relatively insensitive to methyl parathion in 24-h tests, but considerations of the chemical's potential to cause coordination loss, leading to capture and death by predators, coupled with bats' naturally low reproductive rates, suggest possible injury to exposed bat populations.
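
    The LD50 values above come from standard dose-response testing; as a generic illustration of how an LD50 can be estimated from dose-mortality data, the sketch below fits a logistic curve on log-dose to entirely synthetic numbers. The doses and mortality fractions are invented for the example and are not the study's data.

    import numpy as np
    from scipy.optimize import curve_fit

    def logistic(log_dose, a, b):
        """Probability of death as a logistic function of log10(dose)."""
        return 1.0 / (1.0 + np.exp(-(a + b * log_dose)))

    # Hypothetical dose groups (mg/kg) and observed mortality fractions
    doses = np.array([50.0, 100.0, 200.0, 400.0, 800.0])
    mortality = np.array([0.0, 0.1, 0.3, 0.6, 0.9])

    (a, b), _ = curve_fit(logistic, np.log10(doses), mortality, p0=[0.0, 1.0])
    ld50 = 10 ** (-a / b)          # dose at which the fitted mortality is 50%
    print(f"estimated LD50: {ld50:.0f} mg/kg")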

  16. Advancements in Big Data Processing

    CERN Document Server

    Vaniachine, A; The ATLAS collaboration

    2012-01-01

    The ever-increasing volumes of scientific data present new challenges for Distributed Computing and Grid technologies. The emerging Big Data revolution drives new discoveries in scientific fields including nanotechnology, astrophysics, high-energy physics, biology and medicine. New initiatives are transforming data-driven scientific fields by pushing Big Data limits, enabling massive data analysis in new ways. In petascale data processing, scientists deal with datasets, not individual files. As a result, a task (comprised of many jobs) became the unit of petascale data processing on the Grid. Splitting a large data processing task into jobs enables fine-granularity checkpointing analogous to the splitting of a large file into smaller TCP/IP packets during data transfers. Transferring large data in small packets achieves reliability through automatic re-sending of the dropped TCP/IP packets. Similarly, transient job failures on the Grid can be recovered by automatic re-tries to achieve reliable Six Sigma produc...
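
    The automatic re-try mechanism is described only by analogy in this record; a minimal, generic sketch of the pattern (not the actual workload-management code) is shown below, with the job function, retry count and back-off schedule chosen purely for illustration.

    import random
    import time

    def run_with_retries(job, max_retries=3, base_delay=1.0):
        """Run a job, re-submitting it after transient failures with exponential back-off."""
        for attempt in range(1, max_retries + 1):
            try:
                return job()
            except RuntimeError as err:             # treat RuntimeError as a transient failure
                if attempt == max_retries:
                    raise
                delay = base_delay * 2 ** (attempt - 1)
                print(f"attempt {attempt} failed ({err}); retrying in {delay:.1f}s")
                time.sleep(delay)

    def flaky_job():
        """Stand-in for one job of a large processing task; fails half the time."""
        if random.random() < 0.5:
            raise RuntimeError("transient grid failure")
        return "job output"

    print(run_with_retries(flaky_job))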

  17. Evidence of the Big Fix

    CERN Document Server

    Hamada, Yuta; Kawana, Kiyoharu

    2014-01-01

    We give an evidence of the Big Fix. The theory of wormholes and multiverse suggests that the parameters of the Standard Model are fixed in such a way that the total entropy at the late stage of the universe is maximized, which we call the maximum entropy principle. In this paper, we discuss how it can be confirmed by the experimental data, and we show that it is indeed true for the Higgs vacuum expectation value $v_{h}$. We assume that the baryon number is produced by the sphaleron process, and that the current quark masses, the gauge couplings and the Higgs self coupling are fixed when we vary $v_{h}$. It turns out that the existence of the atomic nuclei plays a crucial role to maximize the entropy. This is reminiscent of the anthropic principle, however it is required by the fundamental law in our case.

  18. Evidence of the big fix

    Science.gov (United States)

    Hamada, Yuta; Kawai, Hikaru; Kawana, Kiyoharu

    2014-06-01

    We give an evidence of the Big Fix. The theory of wormholes and multiverse suggests that the parameters of the Standard Model are fixed in such a way that the total entropy at the late stage of the universe is maximized, which we call the maximum entropy principle. In this paper, we discuss how it can be confirmed by the experimental data, and we show that it is indeed true for the Higgs vacuum expectation value vh. We assume that the baryon number is produced by the sphaleron process, and that the current quark masses, the gauge couplings and the Higgs self-coupling are fixed when we vary vh. It turns out that the existence of the atomic nuclei plays a crucial role to maximize the entropy. This is reminiscent of the anthropic principle, however it is required by the fundamental law in our case.

  19. Big Book of Apple Hacks

    CERN Document Server

    Seibold, Chris

    2008-01-01

    Bigger in size, longer in length, broader in scope, and even more useful than our original Mac OS X Hacks, the new Big Book of Apple Hacks offers a grab bag of tips, tricks and hacks to get the most out of Mac OS X Leopard, as well as the new line of iPods, iPhone, and Apple TV. With 125 entirely new hacks presented in step-by-step fashion, this practical book is for serious Apple computer and gadget users who really want to take control of these systems. Many of the hacks take you under the hood and show you how to tweak system preferences, alter or add keyboard shortcuts, mount drives and

  20. Was the Big Bang hot?

    Science.gov (United States)

    Wright, E. L.

    1983-01-01

    Techniques for verifying the spectrum defined by Woody and Richards (WR, 1981), which serves as a base for dust-distorted models of the 3 K background, are discussed. WR detected a sharp deviation from the Planck curve in the 3 K background. The absolute intensity of the background may be determined by the frequency dependence of the dipole anisotropy of the background or the frequency dependence effect in galactic clusters. Both methods involve the Doppler shift; analytical formulae are defined for characterization of the dipole anisotropy. The measurement of the 30-300 GHz spectra of cold galactic dust may reveal the presence of significant amounts of needle-shaped grains, which would in turn support a theory of a cold Big Bang.

  1. Big ideas for psychotherapy training.

    Science.gov (United States)

    Fauth, James; Gates, Sarah; Vinca, Maria Ann; Boles, Shawna; Hayes, Jeffrey A

    2007-12-01

    Research indicates that traditional psychotherapy training practices are ineffective in durably improving the effectiveness of psychotherapists. In addition, the quantity and quality of psychotherapy training research has also been limited in several ways. Thus, based on extant scholarship and personal experience, we offer several suggestions for improving on this state of affairs. Specifically, we propose that future psychotherapy trainings focus on a few "big ideas," target psychotherapist meta-cognitive skills, and attend more closely to the organizational/treatment context in which the training takes place. In terms of future training research, we recommend that researchers include a wider range of intermediate outcomes in their studies, examine the nature of trainee skill development, and investigate the role that organizational/treatment culture plays in terms of the retention of changes elicited by psychotherapy training. (PsycINFO Database Record (c) 2010 APA, all rights reserved).

  2. Microsystems - The next big thing

    Energy Technology Data Exchange (ETDEWEB)

    STINNETT,REGAN W.

    2000-05-11

    Micro-Electro-Mechanical Systems (MEMS) is a big name for tiny devices that will soon make big changes in everyday life and the workplace. These and other types of Microsystems range in size from a few millimeters to a few microns, much smaller than a human hair. These Microsystems have the capability to enable new ways to solve problems in commercial applications ranging from automotive, aerospace, telecommunications, manufacturing equipment, medical diagnostics to robotics, and in national security applications such as nuclear weapons safety and security, battlefield intelligence, and protection against chemical and biological weapons. This broad range of applications of Microsystems reflects the broad capabilities of future Microsystems to provide the ability to sense, think, act, and communicate, all in a single integrated package. Microsystems have been called the next silicon revolution, but like many revolutions, they incorporate more elements than their predecessors. Microsystems do include MEMS components fabricated from polycrystalline silicon processed using techniques similar to those used in the manufacture of integrated electrical circuits. They also include optoelectronic components made from gallium arsenide and other semiconducting compounds from the III-V groups of the periodic table. Microsystems components are also being made from pure metals and metal alloys using the LIGA process, which utilizes lithography, etching, and casting at the micron scale. Generically, Microsystems are micron scale, integrated systems that have the potential to combine the ability to sense light, heat, pressure, acceleration, vibration, and chemicals with the ability to process the collected data using CMOS circuitry, execute an electrical, mechanical, or photonic response, and communicate either optically or with microwaves.

  3. Big Bang 6Li nucleosynthesis studied deep underground (LUNA collaboration)

    Science.gov (United States)

    Trezzi, D.; Anders, M.; Aliotta, M.; Bellini, A.; Bemmerer, D.; Boeltzig, A.; Broggini, C.; Bruno, C. G.; Caciolli, A.; Cavanna, F.; Corvisiero, P.; Costantini, H.; Davinson, T.; Depalo, R.; Elekes, Z.; Erhard, M.; Ferraro, F.; Formicola, A.; Fülop, Zs.; Gervino, G.; Guglielmetti, A.; Gustavino, C.; Gyürky, Gy.; Junker, M.; Lemut, A.; Marta, M.; Mazzocchi, C.; Menegazzo, R.; Mossa, V.; Pantaleo, F.; Prati, P.; Rossi Alvarez, C.; Scott, D. A.; Somorjai, E.; Straniero, O.; Szücs, T.; Takacs, M.

    2017-03-01

    The correct prediction of the abundances of the light nuclides produced during the epoch of Big Bang Nucleosynthesis (BBN) is one of the main topics of modern cosmology. For many of the nuclear reactions that are relevant for this epoch, direct experimental cross section data are available, ushering in the so-called "age of precision". The present work addresses an exception to this current status: the 2H(α,γ)6Li reaction that controls 6Li production in the Big Bang. Recent controversial observations of 6Li in metal-poor stars have heightened the interest in understanding primordial 6Li production. If confirmed, these observations would lead to a second cosmological lithium problem, in addition to the well-known 7Li problem. In the present work, the direct experimental cross section data on 2H(α,γ)6Li in the BBN energy range are reported. The measurement has been performed deep underground at the LUNA (Laboratory for Underground Nuclear Astrophysics) 400 kV accelerator in the Laboratori Nazionali del Gran Sasso, Italy. The cross section has been directly measured at the energies of interest for Big Bang Nucleosynthesis for the first time, at Ecm = 80, 93, 120, and 133 keV. Based on the new data, the 2H(α,γ)6Li thermonuclear reaction rate has been derived. Our rate is even lower than previously reported, thus increasing the discrepancy between predicted Big Bang 6Li abundance and the amount of primordial 6Li inferred from observations.

  4. What makes Big Data, Big Data? Exploring the ontological characteristics of 26 datasets

    Directory of Open Access Journals (Sweden)

    Rob Kitchin

    2016-02-01

    Full Text Available Big Data has been variously defined in the literature. In the main, definitions suggest that Big Data possess a suite of key traits: volume, velocity and variety (the 3Vs), but also exhaustivity, resolution, indexicality, relationality, extensionality and scalability. However, these definitions lack ontological clarity, with the term acting as an amorphous, catch-all label for a wide selection of data. In this paper, we consider the question ‘what makes Big Data, Big Data?’, applying Kitchin’s taxonomy of seven Big Data traits to 26 datasets drawn from seven domains, each of which is considered in the literature to constitute Big Data. The results demonstrate that only a handful of datasets possess all seven traits, and some do not possess either volume and/or variety. Instead, there are multiple forms of Big Data. Our analysis reveals that the key definitional boundary markers are the traits of velocity and exhaustivity. We contend that Big Data as an analytical category needs to be unpacked, with the genus of Big Data further delineated and its various species identified. It is only through such ontological work that we will gain conceptual clarity about what constitutes Big Data, formulate how best to make sense of it, and identify how it might be best used to make sense of the world.

  5. [Big data in medicine and healthcare].

    Science.gov (United States)

    Rüping, Stefan

    2015-08-01

    Healthcare is one of the business fields with the highest Big Data potential. According to the prevailing definition, Big Data refers to the fact that data today is often too large and heterogeneous and changes too quickly to be stored, processed, and transformed into value by previous technologies. Technological trends drive Big Data: business processes are increasingly executed electronically, consumers produce more and more data themselves - e.g. in social networks - and digitalization keeps increasing. Currently, several new trends towards new data sources and innovative data analysis appear in medicine and healthcare. From the research perspective, omics-research is one clear Big Data topic. In practice, the electronic health records, free open data and the "quantified self" offer new perspectives for data analytics. Regarding analytics, significant advances have been made in the information extraction from text data, which unlocks a lot of data from clinical documentation for analytics purposes. At the same time, medicine and healthcare are lagging behind in the adoption of Big Data approaches. This can be traced to particular problems regarding data complexity and organizational, legal, and ethical challenges. The growing uptake of Big Data in general and first best-practice examples in medicine and healthcare in particular indicate that innovative solutions will be coming. This paper gives an overview of the potentials of Big Data in medicine and healthcare.

  6. Compromised data from social media to big data

    CERN Document Server

    Redden, Joanna; Langlois, Ganaele

    2015-01-01

    There has been a data rush in the past decade brought about by online communication and, in particular, social media (Facebook, Twitter, Youtube, among others), which promises a new age of digital enlightenment. But social data is compromised: it is being seized by specific economic interests, it leads to a fundamental shift in the relationship between research and the public good, and it fosters new forms of control and surveillance. Compromised Data: From Social Media to Big Data explores how we perform critical research within a compromised social data framework. The expert, international l

  7. Special Issue: Big data and predictive computational modeling

    Science.gov (United States)

    Koutsourelakis, P. S.; Zabaras, N.; Girolami, M.

    2016-09-01

    The motivation for this special issue stems from the symposium on "Big Data and Predictive Computational Modeling" that took place at the Institute for Advanced Study, Technical University of Munich, during May 18-21, 2015. With a mindset firmly grounded in computational discovery, but a polychromatic set of viewpoints, several leading scientists, from physics and chemistry, biology, engineering, applied mathematics, scientific computing, neuroscience, statistics and machine learning, engaged in discussions and exchanged ideas for four days. This special issue contains a subset of the presentations. Video and slides of all the presentations are available on the TUM-IAS website http://www.tum-ias.de/bigdata2015/.

  8. A Perplexed Economist Confronts 'too Big to Fail'

    Directory of Open Access Journals (Sweden)

    Scherer, F. M.

    2010-12-01

    Full Text Available This paper examines premises and data underlying the assertion that some financial institutions in the U.S. economy were "too big to fail" and hence warranted government bailout. It traces the merger histories enhancing the dominance of six leading firms in the U.S. banking industry and the sharp increases in the concentration of financial institution assets accompanying that merger wave. Financial institution profits are found to have soared in tandem with rising concentration. The paper advances hypotheses about why these phenomena might be related and surveys relevant empirical literature on the relationships between market concentration, interest rates received and charged by banks, and economies of scale in banking.

  9. Big data as governmentality in international development

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Madsen, Anders Koed; Rasche, Andreas

    2017-01-01

    Statistics have long shaped the field of visibility for the governance of development projects. The introduction of big data has altered the field of visibility. Employing Dean's “analytics of government” framework, we analyze two cases—malaria tracking in Kenya and monitoring of food prices...... in Indonesia. Our analysis shows that big data introduces a bias toward particular types of visualizations. What problems are being made visible through big data depends to some degree on how the underlying data is visualized and who is captured in the visualizations. It is also influenced by technical factors...

  10. Processing Solutions for Big Data in Astronomy

    Science.gov (United States)

    Fillatre, L.; Lepiller, D.

    2016-09-01

    This paper gives a simple introduction to processing solutions applied to massive amounts of data. It proposes a general presentation of the Big Data paradigm. The Hadoop framework, which is considered as the pioneering processing solution for Big Data, is described together with YARN, the integrated Hadoop tool for resource allocation. This paper also presents the main tools for the management of both the storage (NoSQL solutions) and computing capacities (MapReduce parallel processing schema) of a cluster of machines. Finally, more recent processing solutions like Spark are discussed. Big Data frameworks are now able to run complex applications while keeping the programming simple and greatly improving the computing speed.
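
    The record above names the MapReduce schema and Spark without showing what such a job looks like, so a minimal, hedged sketch of the classic word-count pattern is given below using the PySpark API; the application name and input path are hypothetical, and this is not code from the cited paper.

    ```python
    # Minimal MapReduce-style word count expressed with the PySpark API.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("wordcount-sketch").getOrCreate()

    lines = spark.sparkContext.textFile("hdfs:///data/sample.txt")   # hypothetical input path
    counts = (lines.flatMap(lambda line: line.split())               # "map": one record per word
                   .map(lambda word: (word, 1))
                   .reduceByKey(lambda a, b: a + b))                  # "reduce": sum counts per key

    for word, count in counts.take(10):
        print(word, count)

    spark.stop()
    ```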

  11. BLENDING IOT AND BIG DATA ANALYTICS

    OpenAIRE

    Tulasi.B*; Girish J Vemulkar

    2016-01-01

    Internet is continuously evolving and changing. Internet of Things (IoT) can be considered as the future of Internet applications which involves machine to machine learning (M2M). The actionable intelligence can be derived through fusion of Big Data and real time analytics with IoT. Big Data and IoT can be viewed as two sides of a coin. With the connection between Big Data and the objects on Internet benefits of IoT can be easily reaped. The applications of IoT spread across various domains l...

  12. Big data and the electronic health record.

    Science.gov (United States)

    Peters, Steve G; Buntrock, James D

    2014-01-01

    The electronic medical record has evolved from a digital representation of individual patient results and documents to information of large scale and complexity. Big Data refers to new technologies providing management and processing capabilities, targeting massive and disparate data sets. For an individual patient, techniques such as Natural Language Processing allow the integration and analysis of textual reports with structured results. For groups of patients, Big Data offers the promise of large-scale analysis of outcomes, patterns, temporal trends, and correlations. The evolution of Big Data analytics moves us from description and reporting to forecasting, predictive modeling, and decision optimization.
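
    As a toy illustration of the integration of textual reports with structured results mentioned above (this is not a production NLP pipeline; the report text, field names and regular expression are hypothetical), a free-text finding can be merged into a structured record roughly like this:

    ```python
    # Extract a numeric finding from a free-text report and attach it to structured results.
    import re

    structured = {"patient_id": 42, "ldl_mg_dl": 131}   # hypothetical structured results
    report = "Impression: mild cardiomegaly. Ejection fraction estimated at 45%."

    match = re.search(r"ejection fraction (?:estimated )?at (\d+)\s*%", report, re.IGNORECASE)
    if match:
        structured["ejection_fraction_pct"] = int(match.group(1))

    print(structured)  # {'patient_id': 42, 'ldl_mg_dl': 131, 'ejection_fraction_pct': 45}
    ```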

  13. Big Data and historical social science

    Directory of Open Access Journals (Sweden)

    Peter Bearman

    2015-11-01

    Full Text Available “Big Data” can revolutionize historical social science if it arises from substantively important contexts and is oriented towards answering substantively important questions. Such data may be especially important for answering previously largely intractable questions about the timing and sequencing of events, and of event boundaries. That said, “Big Data” makes no difference for social scientists and historians whose accounts rest on narrative sentences. Since such accounts are the norm, the effects of Big Data on the practice of historical social science may be more limited than one might wish.

  14. Big questions, big science: meeting the challenges of global ecology.

    Science.gov (United States)

    Schimel, David; Keller, Michael

    2015-04-01

    Ecologists are increasingly tackling questions that require significant infrastructure, large experiments, networks of observations, and complex data and computation. Key hypotheses in ecology increasingly require more investment, and larger data sets, than can be collected by a single investigator's or a group of investigators' labs, sustained for longer than a typical grant. Large-scale projects are expensive, so their scientific return on the investment has to justify the opportunity cost: the science foregone because resources were expended on a large project rather than supporting a number of individual projects. In addition, their management must be accountable and efficient in the use of significant resources, requiring the use of formal systems engineering and project management to mitigate risk of failure. Mapping the scientific method into formal project management requires both scientists able to work in the context, and a project implementation team sensitive to the unique requirements of ecology. Sponsoring agencies, under pressure from external and internal forces, experience many pressures that push them towards counterproductive project management, but a scientific community aware of and experienced in large project science can mitigate these tendencies. For big ecology to result in great science, ecologists must become informed, aware and engaged in the advocacy and governance of large ecological projects.

  15. Computing seismic damage estimates for buildings within a big city. Bucharest case study.

    Science.gov (United States)

    Toma-Danila, Dragos; Armas, Iuliana

    2016-04-01

    The seismic risk analysis of big cities is a very demanding yet necessary task; the modeling of such complex systems requires first of all insightful input data at good resolution, referring to local effects, buildings and socio-economic aspects. Also, seismic risk estimation methods with good confidence levels are needed. Until recently, these requirements were not fulfilled for Bucharest, one of the European capital cities most endangered by earthquakes. Based on 2011 and 2002 census data, standardized according to the framework of the Near-real time System for Estimating the Seismic Damage in Romania (SeisDaRo) through a unique approach and on relevant hazard scenarios, we estimate for the first time the building damage within the city, divided into more than 120 areas. The methodology applied relies on 48 vulnerability curves for buildings, on the Improved Displacement Coefficient Analytical Method included in the SELENA software for computing damage probabilities and on multiple seismic hazard scenarios, including the maximum possible. In order to compare results with real losses we use a scenario based on the 4 March 1977 Vrancea earthquake (moment magnitude 7.4) that led to 1,424 deaths in Bucharest. By using overlay analysis with satellite imagery and a new methodology integrated in GIS we show how results can be enhanced, reflecting even more local characteristics. Best practices for seismic risk mapping are also presented. Results are promising and contribute to the mitigation efforts in Bucharest.
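
    The damage-probability step described above can be illustrated with a generic sketch (this is not the SELENA/Improved Displacement Coefficient implementation; the fragility medians, dispersions and displacement demand below are hypothetical): for each damage state, the probability of exceedance is read from a lognormal fragility curve, and adjacent curves are differenced to obtain discrete damage-state probabilities.

    ```python
    # Generic lognormal fragility-curve evaluation for one building class.
    import math

    def p_exceed(sd, median, beta):
        """P(damage >= state | spectral displacement sd) for a lognormal fragility curve."""
        z = math.log(sd / median) / beta
        return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))   # standard normal CDF

    # Hypothetical fragility parameters: (state, median displacement in cm, log-std beta).
    states = [("slight", 1.2, 0.7), ("moderate", 2.5, 0.7),
              ("extensive", 5.0, 0.7), ("complete", 9.0, 0.7)]

    sd = 3.0   # displacement demand from one hazard scenario, cm (hypothetical)
    exceed = [p_exceed(sd, m, b) for _, m, b in states]

    probs = {"none": 1.0 - exceed[0]}                        # discrete damage-state probabilities
    for i, (name, _, _) in enumerate(states):
        upper = exceed[i + 1] if i + 1 < len(states) else 0.0
        probs[name] = exceed[i] - upper

    print(probs)
    ```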

  16. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    Energy Technology Data Exchange (ETDEWEB)

    Susan M. Capalbo

    2004-06-01

    The Big Sky Partnership, led by Montana State University, is comprised of research institutions, public entities and private sectors organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts during the second performance period fall into four areas: evaluation of sources and carbon sequestration sinks; development of GIS-based reporting framework; designing an integrated suite of monitoring, measuring, and verification technologies; and initiating a comprehensive education and outreach program. At the first two Partnership meetings the groundwork was put in place to provide an assessment of capture and storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks), that would complement the ongoing DOE research. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other western DOE partnerships. Efforts are also being made to find funding to include Wyoming in the coverage areas for both geological and terrestrial sinks and sources. The Partnership recognizes the critical importance of measurement, monitoring, and verification technologies to support not only carbon trading but all policies and programs that DOE and other agencies may want to pursue in support of GHG mitigation. The efforts begun in developing and implementing MMV technologies for geological sequestration reflect this concern. Research is also underway to identify and validate best management practices for

  17. Big Bang–Big Crunch Optimization Algorithm for Linear Phase Fir Digital Filter Design

    Directory of Open Access Journals (Sweden)

    Ms. Rashmi Singh; Dr. H. K. Verma

    2012-02-01

    Full Text Available The Big Bang–Big Crunch (BB–BC) optimization algorithm is a new optimization method that relies on the Big Bang and Big Crunch theory, one of the theories of the evolution of the universe. In this paper, a Big Bang–Big Crunch algorithm has been used for the design of linear phase finite impulse response (FIR) filters. The fitness function used in the experiments is based on the mean squared error between the actual and the ideal filter response. The paper presents plots of the magnitude response of the designed FIR filters and of the error. The BB–BC algorithm seems to be a promising tool for FIR filter design, especially in a dynamic environment where filter coefficients have to be adapted and fast convergence is of importance.
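
    A compact sketch of how such a design loop can be set up is given below; it assumes a symmetric (linear-phase) low-pass prototype, and the population size, iteration count and cutoff are hypothetical, so it illustrates the BB–BC idea and the mean-squared-error fitness rather than reproducing the authors' implementation.

    ```python
    # Big Bang-Big Crunch style search for linear-phase FIR coefficients (illustrative only).
    import numpy as np

    N_TAPS, HALF = 21, 11                       # odd-length symmetric filter: optimize half the taps
    W = np.linspace(0, np.pi, 128)              # frequency grid
    IDEAL = (W <= 0.4 * np.pi).astype(float)    # ideal low-pass magnitude response (hypothetical cutoff)

    def magnitude(half):
        h = np.concatenate([half, half[-2::-1]])            # impose even symmetry (linear phase)
        n = np.arange(N_TAPS)
        return np.abs(np.exp(-1j * np.outer(W, n)) @ h)     # |H(e^jw)| on the grid

    def fitness(half):
        return np.mean((magnitude(half) - IDEAL) ** 2)      # mean squared error vs. ideal response

    rng = np.random.default_rng(0)
    pop = rng.uniform(-0.5, 0.5, size=(60, HALF))           # Big Bang: random initial population
    best_f, best = np.inf, None
    for it in range(1, 201):
        f = np.array([fitness(p) for p in pop])
        if f.min() < best_f:
            best_f, best = f.min(), pop[f.argmin()].copy()
        w = 1.0 / (f + 1e-12)
        center = (w[:, None] * pop).sum(axis=0) / w.sum()   # Big Crunch: fitness-weighted centre of mass
        pop = center + (0.5 / it) * rng.standard_normal((60, HALF))   # new Big Bang around the centre

    print("best MSE:", best_f)
    ```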

  18. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    Energy Technology Data Exchange (ETDEWEB)

    Susan M. Capalbo

    2005-01-31

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, is comprised of research institutions, public entities and private sectors organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership in Phase I fall into four areas: evaluation of sources and carbon sequestration sinks that will be used to determine the location of pilot demonstrations in Phase II; development of GIS-based reporting framework that links with national networks; designing an integrated suite of monitoring, measuring, and verification technologies and assessment frameworks; and initiating a comprehensive education and outreach program. The groundwork is in place to provide an assessment of storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks), that would complement the ongoing DOE research. Efforts are underway to showcase the architecture of the GIS framework and initial results for sources and sinks. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other western DOE partnerships. The Partnership recognizes the critical importance of measurement, monitoring, and verification technologies to support not only carbon trading but all policies and programs that DOE and other agencies may want to pursue in support of GHG mitigation. The efforts in developing and implementing MMV technologies for geological sequestration reflect this concern. Research is

  19. NOAA Big Data Partnership RFI

    Science.gov (United States)

    de la Beaujardiere, J.

    2014-12-01

    In February 2014, the US National Oceanic and Atmospheric Administration (NOAA) issued a Big Data Request for Information (RFI) from industry and other organizations (e.g., non-profits, research laboratories, and universities) to assess capability and interest in establishing partnerships to position a copy of NOAA's vast data holdings in the Cloud, co-located with easy and affordable access to analytical capabilities. This RFI was motivated by a number of concerns. First, NOAA's data facilities do not necessarily have sufficient network infrastructure to transmit all available observations and numerical model outputs to all potential users, or sufficient infrastructure to support simultaneous computation by many users. Second, the available data are distributed across multiple services and data facilities, making it difficult to find and integrate data for cross-domain analysis and decision-making. Third, large datasets require users to have substantial network, storage, and computing capabilities of their own in order to fully interact with and exploit the latent value of the data. Finally, there may be commercial opportunities for value-added products and services derived from our data. Putting a working copy of data in the Cloud outside of NOAA's internal networks and infrastructures should reduce demands and risks on our systems, and should enable users to interact with multiple datasets and create new lines of business (much like the industries built on government-furnished weather or GPS data). The NOAA Big Data RFI therefore solicited information on technical and business approaches regarding possible partnership(s) that -- at no net cost to the government and minimum impact on existing data facilities -- would unleash the commercial potential of its environmental observations and model outputs. NOAA would retain the master archival copy of its data. Commercial partners would not be permitted to charge fees for access to the NOAA data they receive, but

  20. 6 Top Tools for Taming Big Data

    Institute of Scientific and Technical Information of China (English)

    Jakob Björklund

    2012-01-01

    The industry now has a buzzword, "big data," for how we're going to do something with the huge amount of information piling up. "Big data" is replacing "business intelligence," which subsumed "reporting," which put a nicer gloss on "spreadsheets," which beat out the old-fashioned "printouts." Managers who long ago studied printouts are now hiring mathematicians who claim to be big data specialists to help them solve the same old problem: What's selling and why?

  1. BDGS: A Scalable Big Data Generator Suite in Big Data Benchmarking

    OpenAIRE

    Ming, Zijian; Luo, Chunjie; Gao, Wanling; Han, Rui; Yang, Qiang; Wang, Lei; Zhan, Jianfeng

    2014-01-01

    Data generation is a key issue in big data benchmarking that aims to generate application-specific data sets to meet the 4V requirements of big data. Specifically, big data generators need to generate scalable data (Volume) of different types (Variety) under controllable generation rates (Velocity) while keeping the important characteristics of raw data (Veracity). This gives rise to various new challenges about how we design generators efficiently and successfully. To date, most existing tec...

  2. 76 FR 7837 - Big Rivers Electric Corporation; Notice of Filing

    Science.gov (United States)

    2011-02-11

    ... December 1, 2010, the date that Big Rivers integrated its transmission facilities with the Midwest... Energy Regulatory Commission Big Rivers Electric Corporation; Notice of Filing Take notice that on February 4, 2011, Big Rivers Electric Corporation (Big Rivers) filed a notice of cancellation of its...

  3. Soft computing in big data processing

    CERN Document Server

    Park, Seung-Jong; Lee, Jee-Hyong

    2014-01-01

    Big data is an essential key to building a smart world, understood as the streaming, continuous integration of large-volume, high-velocity data from all sources to final destinations. Big data work ranges from data mining and data analysis to decision making, drawing out statistical rules and mathematical patterns through systematic or automatic reasoning. Big data helps serve our lives better, clarify our future and deliver greater value. We can discover how to capture and analyze data. Readers will be guided through processing system integrity and implementing intelligent systems. With intelligent systems, we deal with the fundamental data management and visualization challenges in effective management of dynamic and large-scale data, and efficient processing of real-time and spatio-temporal data. Advanced intelligent systems have led to managing data monitoring, data processing and decision-making in a realistic and effective way. Considering the big size of data, the variety of data and frequent chan...

  4. Scaling big data with Hadoop and Solr

    CERN Document Server

    Karambelkar, Hrishikesh Vijay

    2015-01-01

    This book is aimed at developers, designers, and architects who would like to build big data enterprise search solutions for their customers or organizations. No prior knowledge of Apache Hadoop and Apache Solr/Lucene technologies is required.

  5. Neutrino oscillations and Big Bang Nucleosynthesis

    OpenAIRE

    Bell, Nicole F.

    2001-01-01

    We outline how relic neutrino asymmetries may be generated in the early universe via active-sterile neutrino oscillations. We discuss possible consequences for big bang nucleosynthesis, within the context of a particular 4-neutrino model.

  6. Tick-Borne Diseases: The Big Two

    Science.gov (United States)

    ... on the skin where there has been a tick bite. Lyme disease ...

  7. Big Fish and Prized Trees Gain Protection

    Institute of Scientific and Technical Information of China (English)

    Fred Pearce; 吴敏

    2004-01-01

    Decisions made at a key conservation meeting are good news for big and quirky fish and commercially prized trees. Several species will enjoy extra protection against trade following rulings made at the Convention on International Trade in Endangered Species (CITES).

  8. Big Data for Business Ecosystem Players

    Directory of Open Access Journals (Sweden)

    Perko Igor

    2016-06-01

    Full Text Available In the provided research, some of the most promising Big Data usage domains are connected with distinguished player groups found in the business ecosystem. Literature analysis is used to identify the state of the art of Big Data related research in the major domains of its use, namely individual marketing, health treatment, work opportunities, financial services, and security enforcement. System theory was used to identify the major types of business ecosystem players disrupted by Big Data: individuals, small and mid-sized enterprises, large organizations, information providers, and regulators. Relationships between the domains and players were explained through new Big Data opportunities and threats and by players’ responsive strategies. System dynamics was used to visualize relationships in the provided model.

  9. ARC Code TI: BigView

    Data.gov (United States)

    National Aeronautics and Space Administration — BigView allows for interactive panning and zooming of images of arbitrary size on desktop PCs running linux. Additionally, it can work in a multi-screen environment...

  10. Big Data in food and agriculture

    Directory of Open Access Journals (Sweden)

    Kelly Bronson

    2016-06-01

    Full Text Available Farming is undergoing a digital revolution. Our review of current Big Data applications in the agri-food sector has revealed several collection and analytics tools that may have implications for relationships of power between players in the food system (e.g. between farmers and large corporations). For example, who retains ownership of the data generated by applications like Monsanto Corporation's Weed I.D. “app”? Are there privacy implications with the data gathered by John Deere's precision agricultural equipment? Systematically tracing the digital revolution in agriculture, and charting the affordances as well as the limitations of Big Data applied to food and agriculture, should be a broad research goal for Big Data scholarship. Such a goal brings data scholarship into conversation with food studies and it allows for a focus on the material consequences of big data in society.

  11. Quantum nature of the big bang.

    Science.gov (United States)

    Ashtekar, Abhay; Pawlowski, Tomasz; Singh, Parampreet

    2006-04-14

    Some long-standing issues concerning the quantum nature of the big bang are resolved in the context of homogeneous isotropic models with a scalar field. Specifically, the known results on the resolution of the big-bang singularity in loop quantum cosmology are significantly extended as follows: (i) the scalar field is shown to serve as an internal clock, thereby providing a detailed realization of the "emergent time" idea; (ii) the physical Hilbert space, Dirac observables, and semiclassical states are constructed rigorously; (iii) the Hamiltonian constraint is solved numerically to show that the big bang is replaced by a big bounce. Thanks to the nonperturbative, background independent methods, unlike in other approaches the quantum evolution is deterministic across the deep Planck regime.

  12. Big Data Components for Business Process Optimization

    Directory of Open Access Journals (Sweden)

    Mircea Raducu TRIFU

    2016-01-01

    Full Text Available These days, more and more people talk about Big Data, Hadoop, NoSQL and so on, but very few technical people have the necessary expertise and knowledge to work with those concepts and technologies. The present paper explains one of the concepts that stands behind two of those keywords: the MapReduce concept. The MapReduce model is what makes Big Data and Hadoop so powerful, fast, and diverse for business process optimization. MapReduce is a programming model with an implementation built to process and generate large data sets. In addition, the benefits of integrating Hadoop in the context of Business Intelligence and Data Warehousing applications are presented. The concepts and technologies behind big data let organizations reach a variety of objectives. Like other new information technologies, the most important objective of big data technology is to bring dramatic cost reduction.
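
    To make the MapReduce model concrete, here is a toy, single-process illustration (a real Hadoop job distributes the phases across a cluster; the order records and field names below are hypothetical): each record is mapped to a key-value pair, pairs are grouped by key, and each group is reduced to an aggregate.

    ```python
    # Toy map -> shuffle -> reduce over hypothetical order records.
    from collections import defaultdict
    from functools import reduce

    orders = [
        {"region": "EU", "amount": 120.0},
        {"region": "US", "amount": 80.5},
        {"region": "EU", "amount": 42.0},
        {"region": "APAC", "amount": 230.0},
    ]

    mapped = [(o["region"], o["amount"]) for o in orders]        # map: record -> (key, value)

    groups = defaultdict(list)                                   # shuffle: group values by key
    for key, value in mapped:
        groups[key].append(value)

    totals = {k: reduce(lambda a, b: a + b, v) for k, v in groups.items()}   # reduce: aggregate per key
    print(totals)   # {'EU': 162.0, 'US': 80.5, 'APAC': 230.0}
    ```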

  13. Cosmic relics from the big bang

    Energy Technology Data Exchange (ETDEWEB)

    Hall, L.J.

    1988-12-01

    A brief introduction to the big bang picture of the early universe is given. Dark matter is discussed; particularly its implications for elementary particle physics. A classification scheme for dark matter relics is given. 21 refs., 11 figs., 1 tab.

  14. Big Data and Analytics in Healthcare.

    Science.gov (United States)

    Tan, S S-L; Gao, G; Koch, S

    2015-01-01

    This editorial is part of the Focus Theme of Methods of Information in Medicine on "Big Data and Analytics in Healthcare". The amount of data being generated in the healthcare industry is growing at a rapid rate. This has generated immense interest in leveraging the availability of healthcare data (and "big data") to improve health outcomes and reduce costs. However, the nature of healthcare data, and especially big data, presents unique challenges in processing and analyzing big data in healthcare. This Focus Theme aims to disseminate some novel approaches to address these challenges. More specifically, approaches ranging from efficient methods of processing large clinical data to predictive models that could generate better predictions from healthcare data are presented.

  15. Fisicos argentinos reproduciran el Big Bang

    CERN Multimedia

    De Ambrosio, Martin

    2008-01-01

    Two groups of Argentine physicists, from the Universities of La Plata and Buenos Aires, are working on a series of experiments that will recreate the conditions of the big explosion at the origin of the universe. (1 page)

  16. Superhorizon curvaton amplitude in inflation and pre-big bang cosmology

    DEFF Research Database (Denmark)

    Sloth, Martin Snoager

    2002-01-01

    the same way. We also discuss the amplitude of the density perturbations, which leads to some interesting constrains on the pre-big bang scenario. It is shown that within a SL(3,R) non-linear sigma model one of the three axions has the right coupling to the dilaton and moduli to yield a flat spectrum...

  17. Initial conditions and the structure of the singularity in pre-big-bang cosmology

    NARCIS (Netherlands)

    Feinstein, A.; Kunze, K.E.; Vazquez-Mozo, M.A.

    2000-01-01

    We propose a picture, within the pre-big-bang approach, in which the universe emerges from a bath of plane gravitational and dilatonic waves. The waves interact gravitationally breaking the exact plane symmetry and lead generically to gravitational collapse resulting in a singularity with the Kasner

  18. Bid to recreate the Big Bang and unlock the secrets of life hits a

    CERN Multimedia

    Morgan, James

    2007-01-01

    "It was not the kind of "big band" they were hopint for - but the explosion at the new £6.81 bn particle accelerator in Switzerland on Saturday, was "not a major setback", says a British scientist who is leading the project." (1 page)

  19. A rare functional haplotype of the P2RX4 and P2RX7 genes leads to loss of innate phagocytosis and confers increased risk of age-related macular degeneration.

    Science.gov (United States)

    Gu, Ben J; Baird, Paul N; Vessey, Kirstan A; Skarratt, Kristen K; Fletcher, Erica L; Fuller, Stephen J; Richardson, Andrea J; Guymer, Robyn H; Wiley, James S

    2013-04-01

    Age-related macular degeneration (AMD) is a leading cause of blindness in Western countries and is diagnosed by the clinical appearance of yellow subretinal deposits called drusen. Genetic changes in immune components are clearly implicated in the pathology of this disease. We have previously shown that the purinergic receptor P2X7 can act as a scavenger receptor, mediating phagocytosis of apoptotic cells and insoluble debris. We performed a genetic association study of functional polymorphisms in the P2RX7 and P2RX4 genes in a cohort of 744 patients with AMD and 557 age-matched Caucasian control subjects. The P2X4 Tyr315Cys variant was 2-fold more frequent in patients with AMD compared to control subjects, with the minor allele predicting susceptibility to disease. Pairwise linkage disequilibrium was observed between Tyr315Cys in the P2RX4 gene and Gly150Arg in the P2RX7 gene, and these two minor alleles formed a rare haplotype that was overrepresented in patients with AMD (n=17) compared with control subjects (n=3) (odds ratio 4.05, P=0.026). Expression of P2X7 (wild type or variant 150Arg) in HEK293 cells conferred robust phagocytosis toward latex beads, whereas coexpression of the P2X7 150Arg with P2X4 315Cys variants almost completely inhibited phagocytic capacity. Fresh human monocytes harboring this heterozygous 150Arg-315Cys haplotype showed 40% reduction in bead phagocytosis. In the primate eye, immunohistochemistry indicated that P2X7 and P2X4 receptors were coexpressed on microglia and macrophages, but neither receptor was seen on retinal pigment epithelial cells. These results demonstrate that a haplotype including two rare variants in P2RX7 and P2RX4 confers a functional interaction between these two variant receptors that impairs the normal scavenger function of macrophages and microglia. Failure of this P2X7-mediated phagocytic pathway may impair removal of subretinal deposits and predispose individuals toward AMD.

  20. Detection of Equipment Faults Before Beam Loss

    CERN Document Server

    Galambos, J

    2016-01-01

    High-power hadron accelerators have strict limits on fractional beam loss. In principle, once a high-quality beam is set up in an acceptable state, beam loss should remain steady. However, in practice, there are many trips in operational machines, owing to excessive beam loss. This paper deals with monitoring equipment health to identify precursor signals that indicate an issue with equipment that will lead to unacceptable beam loss. To this end, a variety of equipment and beam signal measurements are described. In particular, several operational examples from the Spallation Neutron Source (SNS) of deteriorating equipment functionality leading to beam loss are reported.
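
    As a purely generic illustration of the kind of precursor monitoring described above (this is not the SNS diagnostic system; the window lengths, threshold and synthetic signal are hypothetical), a slow drift in an equipment readback can be flagged by comparing a short-term mean against a longer baseline:

    ```python
    # Flag samples where a short-term mean departs from a long-term baseline.
    import numpy as np

    def drift_alarms(signal, short=20, long=200, n_sigma=4.0):
        alarms = []
        for i in range(long, len(signal)):
            baseline = signal[i - long:i - short]
            recent = signal[i - short:i]
            mu, sigma = baseline.mean(), baseline.std() + 1e-12
            if abs(recent.mean() - mu) > n_sigma * sigma / np.sqrt(short):
                alarms.append(i)
        return alarms

    # Synthetic readback: stable operation followed by a slow upward drift.
    rng = np.random.default_rng(1)
    readback = np.concatenate([rng.normal(1.0, 0.01, 2000),
                               1.0 + np.linspace(0.0, 0.05, 1000) + rng.normal(0.0, 0.01, 1000)])
    alarms = drift_alarms(readback)
    print("first alarm at sample:", alarms[0] if alarms else None)
    ```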

  1. Big bang nucleosynthesis: Present status

    Science.gov (United States)

    Cyburt, Richard H.; Fields, Brian D.; Olive, Keith A.; Yeh, Tsung-Han

    2016-01-01

    Big bang nucleosynthesis (BBN) describes the production of the lightest nuclides via a dynamic interplay among the four fundamental forces during the first seconds of cosmic time. A brief overview of the essentials of this physics is given, and new calculations presented of light-element abundances through 6Li and 7Li, with updated nuclear reactions and uncertainties including those in the neutron lifetime. Fits are provided for these results as a function of baryon density and of the number of neutrino flavors Nν. Recent developments are reviewed in BBN, particularly new, precision Planck cosmic microwave background (CMB) measurements that now probe the baryon density, helium content, and the effective number of degrees of freedom Neff. These measurements allow for a tight test of BBN and cosmology using CMB data alone. Our likelihood analysis convolves the 2015 Planck data chains with our BBN output and observational data. Adding astronomical measurements of light elements strengthens the power of BBN. A new determination of the primordial helium abundance is included in our likelihood analysis. New D/H observations are now more precise than the corresponding theoretical predictions and are consistent with the standard model and the Planck baryon density. Moreover, D/H now provides a tight measurement of Nν when combined with the CMB baryon density and provides a 2σ upper limit on Nν, pointing to new physics. This paper concludes with a look at future directions including key nuclear reactions, astronomical observations, and theoretical issues.

  2. "Big Science" exhibition at Balexert

    CERN Multimedia

    2008-01-01

    CERN is going out to meet those members of the general public who were unable to attend the recent Open Day. The Laboratory will be taking its "Big Science" exhibition from the Globe of Science and Innovation to the Balexert shopping centre from 19 to 31 May 2008. The exhibition, which shows the LHC and its experiments through the eyes of a photographer, features around thirty spectacular photographs measuring 4.5 metres high and 2.5 metres wide. Welcomed and guided around the exhibition by CERN volunteers, shoppers at Balexert will also have the opportunity to discover LHC components on display and watch films. "Fun with Physics" workshops will be held at certain times of the day. Main hall of the Balexert shopping centre, ground floor, from 9.00 a.m. to 7.00 p.m. Monday to Friday and from 10 a.m. to 6 p.m. on the two Saturdays. Call for volunteers All members of the CERN personnel are invited to enrol as volunteers to help welcom...

  4. Big Sky Carbon Sequestration Partnership

    Energy Technology Data Exchange (ETDEWEB)

    Susan Capalbo

    2005-12-31

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, is comprised of research institutions, public entities and private sectors organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership in Phase I are organized into four areas: (1) Evaluation of sources and carbon sequestration sinks that will be used to determine the location of pilot demonstrations in Phase II; (2) Development of GIS-based reporting framework that links with national networks; (3) Design of an integrated suite of monitoring, measuring, and verification technologies, market-based opportunities for carbon management, and an economic/risk assessment framework; (referred to below as the Advanced Concepts component of the Phase I efforts) and (4) Initiation of a comprehensive education and outreach program. As a result of the Phase I activities, the groundwork is in place to provide an assessment of storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks), that complements the ongoing DOE research agenda in Carbon Sequestration. The geology of the Big Sky Carbon Sequestration Partnership Region is favorable for the potential sequestration of enormous volume of CO{sub 2}. The United States Geological Survey (USGS 1995) identified 10 geologic provinces and 111 plays in the region. These provinces and plays include both sedimentary rock types characteristic of oil, gas, and coal productions as well as large areas of mafic volcanic rocks. Of the 10 provinces and 111 plays, 1 province and 4 plays are located within Idaho. The remaining 9 provinces and 107 plays are dominated by sedimentary rocks and located in the states of Montana and Wyoming. The potential sequestration capacity of the 9 sedimentary provinces within the region ranges from 25,000 to almost 900,000 million metric tons of CO{sub 2}. Overall every sedimentary formation investigated

  5. Astronomical Surveys and Big Data

    CERN Document Server

    Mickaelian, A M

    2015-01-01

    Recent all-sky and large-area astronomical surveys and their catalogued data over the whole range of the electromagnetic spectrum are reviewed, from Gamma-ray to radio, such as Fermi-GLAST and INTEGRAL in Gamma-ray, ROSAT, XMM and Chandra in X-ray, GALEX in UV, SDSS and several POSS I and II based catalogues (APM, MAPS, USNO, GSC) in the optical range, 2MASS in NIR, WISE and AKARI IRC in MIR, IRAS and AKARI FIS in FIR, NVSS and FIRST in radio and many others, as well as the most important surveys giving optical images (DSS I and II, SDSS, etc.), proper motions (Tycho, USNO, Gaia), variability (GCVS, NSVS, ASAS, Catalina, Pan-STARRS) and spectroscopic data (FBS, SBS, Case, HQS, HES, SDSS, CALIFA, GAMA). An overall understanding of the coverage along the whole wavelength range and comparisons between various surveys are given: galaxy redshift surveys, QSO/AGN, radio, Galactic structure, and Dark Energy surveys. Astronomy has entered the Big Data era. Astrophysical Virtual Observatories and Computational Astrophysics play a...

  6. Big-bang nucleosynthesis revisited

    Science.gov (United States)

    Olive, Keith A.; Schramm, David N.; Steigman, Gary; Walker, Terry P.

    1989-01-01

    The homogeneous big-bang nucleosynthesis yields of D, He-3, He-4, and Li-7 are computed taking into account recent measurements of the neutron mean-life as well as updates of several nuclear reaction rates which primarily affect the production of Li-7. The extraction of primordial abundances from observation and the likelihood that the primordial mass fraction of He-4, Yp, is less than or equal to 0.24 are discussed. Using the primordial abundances of D + He-3 and Li-7 we limit the baryon-to-photon ratio to 2.6 ≤ η10 ≤ 4.3 (η10 is η in units of 10^-10), which we use to argue that baryons contribute between 0.02 and 0.11 to the critical energy density of the universe. An upper limit to Yp of 0.24 constrains the number of light neutrinos to Nν ≤ 3.4, in excellent agreement with the LEP and SLC collider results. We turn this argument around to show that the collider limit of 3 neutrino species can be used to bound the primordial abundance of He-4: 0.235 ≤ Yp ≤ 0.245.
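
    For orientation only, the quoted range of η10 can be translated into a baryon density using the commonly quoted relation between the baryon-to-photon ratio and the baryon density parameter (the coefficient below is the standard present-day value and is not taken from this paper):

    ```latex
    \Omega_b h^2 \simeq \frac{\eta_{10}}{274},
    \qquad 2.6 \le \eta_{10} \le 4.3
    \;\Longrightarrow\; 0.009 \lesssim \Omega_b h^2 \lesssim 0.016 .
    ```

    The contribution to the critical density, Ω_b, then follows once a value of the Hubble parameter h is assumed.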

  7. The NOAA Big Data Project

    Science.gov (United States)

    de la Beaujardiere, J.

    2015-12-01

    The US National Oceanic and Atmospheric Administration (NOAA) is a Big Data producer, generating tens of terabytes per day from hundreds of sensors on satellites, radars, aircraft, ships, and buoys, and from numerical models. These data are of critical importance and value for NOAA's mission to understand and predict changes in climate, weather, oceans, and coasts. In order to facilitate extracting additional value from this information, NOAA has established Cooperative Research and Development Agreements (CRADAs) with five Infrastructure-as-a-Service (IaaS) providers — Amazon, Google, IBM, Microsoft, Open Cloud Consortium — to determine whether hosting NOAA data in publicly-accessible Clouds alongside on-demand computational capability stimulates the creation of new value-added products and services and lines of business based on the data, and if the revenue generated by these new applications can support the costs of data transmission and hosting. Each IaaS provider is the anchor of a "Data Alliance" which organizations or entrepreneurs can join to develop and test new business or research avenues. This presentation will report on progress and lessons learned during the first 6 months of the 3-year CRADAs.

  8. Big Data Empowered Self Organized Networks

    OpenAIRE

    Baldo, Nicola; Giupponi, Lorenza; Mangues-Bafalluy, Josep

    2014-01-01

    Mobile networks are generating a huge amount of data in the form of network measurements as well as network control and management interactions, and 5G is expected to make it even bigger. In this paper, we discuss the different approaches according to which this information could be leveraged using a Big Data approach. In particular, we focus on Big Data Empowered Self Organized Networks, discussing its most peculiar traits, its potential, and the relevant related work, as well as analysing s...

  9. Congenital malalignment of the big toe nail.

    Science.gov (United States)

    Wagner, Gunnar; Sachse, Michael Max

    2012-05-01

    Congenital malalignment of the big toe nail is based on a lateral deviation of the nail plate. This longitudinal axis shift is due to a deviation of the nail matrix, possibly caused by increased traction of the hypertrophic extensor tendon of the hallux. Congenital malalignment of the big toe nail is typically present at birth. Ingrown toenails and onychogryphosis are among the most common complications. Depending on the degree of deviation, conservative or surgical treatment may be recommended.

  10. ISSUES, CHALLENGES, AND SOLUTIONS: BIG DATA MINING

    OpenAIRE

    Jaseena K.U,; Julie M. David

    2014-01-01

    Data has become an indispensable part of every economy, industry, organization, business function and individual. Big Data is a term used to identify datasets whose size is beyond the ability of typical database software tools to store, manage and analyze. Big Data introduces unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation and measurement errors. These challenges are distinguishe...

  11. Cincinnati Big Area Additive Manufacturing (BAAM)

    Energy Technology Data Exchange (ETDEWEB)

    Duty, Chad E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Love, Lonnie J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-03-04

    Oak Ridge National Laboratory (ORNL) worked with Cincinnati Incorporated (CI) to demonstrate Big Area Additive Manufacturing which increases the speed of the additive manufacturing (AM) process by over 1000X, increases the size of parts by over 10X and shows a cost reduction of over 100X. ORNL worked with CI to transition the Big Area Additive Manufacturing (BAAM) technology from a proof-of-principle (TRL 2-3) demonstration to a prototype product stage (TRL 7-8).

  12. Data Confidentiality Challenges in Big Data Applications

    Energy Technology Data Exchange (ETDEWEB)

    Yin, Jian; Zhao, Dongfang

    2015-12-15

    In this paper, we address the problem of data confidentiality in big data analytics. In many fields, many useful patterns can be extracted by applying machine learning techniques to big data. However, data confidentiality must be protected. In many scenarios, data confidentiality could well be a prerequisite for data to be shared. We present a scheme to provide provably secure data confidentiality and discuss various techniques to optimize the performance of such a system.

  13. COBE looks back to the Big Bang

    Science.gov (United States)

    Mather, John C.

    1993-01-01

    An overview is presented of NASA-Goddard's Cosmic Background Explorer (COBE), the first NASA satellite designed to observe the primeval explosion of the universe. The spacecraft carries three extremely sensitive IR and microwave instruments designed to measure the faint residual radiation from the Big Bang and to search for the formation of the first galaxies. COBE's far IR absolute spectrophotometer has shown that the Big Bang radiation has a blackbody spectrum, proving that there was no large energy release after the explosion.

  14. Harnessing the Heart of Big Data

    OpenAIRE

    Scruggs, Sarah B; Watson, Karol; Su, Andrew I.; Hermjakob, Henning; Yates, John R.; Lindsey, Merry L.; Ping, Peipei

    2015-01-01

    The exponential increase in Big Data generation combined with limited capitalization on the wealth of information embedded within Big Data have prompted us to revisit our scientific discovery paradigms. A successful transition into this digital era of medicine holds great promise for advancing fundamental knowledge in biology, innovating human health and driving personalized medicine, however, this will require a drastic shift of research culture in how we conceptualize science and use data. ...

  15. Soil biogeochemistry in the age of big data

    Science.gov (United States)

    Cécillon, Lauric; Barré, Pierre; Coissac, Eric; Plante, Alain; Rasse, Daniel

    2015-04-01

    Data is becoming one of the key resources of the 21st century. Soil biogeochemistry is not spared by this new movement. The conservation of soils and their services recently came onto the political agenda. However, clear knowledge on the links between soil characteristics and the various processes ensuring the provision of soil services is rare at the molecular or the plot scale, and does not exist at the landscape scale. This split between society's expectations on its natural capital and scientific knowledge on the most complex material on earth has led to an increasing number of studies on soils, using an increasing number of techniques of increasing complexity, with an increasing spatial and temporal coverage. From data scarcity with a basic data management system, soil biogeochemistry is now facing a proliferation of data, with few quality controls from data collection to publication and few skills to deal with them. Based on this observation, here we (1) address how big data could help in making sense of all these soil biogeochemical data, (2) point out several shortcomings of big data that most biogeochemists will experience in their future careers. Massive storage of data is now common and recent opportunities for cloud storage enable data sharing among researchers all over the world. The need for integrative and collaborative computational databases in soil biogeochemistry is emerging through pioneering initiatives in this direction (molTERdb; earthcube), following soil microbiologists (GenBank). We expect that a series of data storage and management systems will rapidly revolutionize the way of accessing raw biogeochemical data, published or not. Data mining techniques combined with cluster or cloud computing hold significant promise for facilitating the use of complex analytical methods, and for revealing new insights previously hidden in complex data on soil mineralogy, organic matter and biodiversity. Indeed, important scientific advances have

  16. Big Data: Survey, Technologies, Opportunities, and Challenges

    Directory of Open Access Journals (Sweden)

    Nawsher Khan

    2014-01-01

    Full Text Available Big Data has gained much attention from the academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy stage, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in Big Data domination. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data.

  17. Perspectives on making big data analytics work for oncology.

    Science.gov (United States)

    El Naqa, Issam

    2016-12-01

    Oncology, with its unique combination of clinical, physical, technological, and biological data, provides an ideal case study for applying big data analytics to improve cancer treatment safety and outcomes. An oncology treatment course such as chemoradiotherapy can generate a large pool of information carrying the 5Vs hallmarks of big data. These data comprise a heterogeneous mixture of patient demographics, radiation/chemo dosimetry, multimodality imaging features, and biological markers generated over a treatment period that can span a few days to several weeks. Efforts using commercial and in-house tools are underway to facilitate data aggregation, ontology creation, sharing, visualization and varying analytics in a secure environment. However, open questions related to proper data structure representation and effective analytics tools to support oncology decision-making need to be addressed. It is recognized that oncology data constitute a mix of structured (tabulated) and unstructured (electronic document) data that need to be processed to facilitate searching and subsequent knowledge discovery from relational or NoSQL databases. In this context, methods based on advanced analytics and image feature extraction for oncology applications will be discussed. On the other hand, the classical p (variables) ≫ n (samples) inference problem of statistical learning is challenged in the Big data realm, and this is particularly true for oncology applications where p-omics is witnessing exponential growth while the number of cancer incidences has generally plateaued over the past 5 years, leading to a quasi-linear growth in samples per patient. Within the Big data paradigm, this kind of phenomenon may yield undesirable effects such as echo chamber anomalies, Yule-Simpson reversal paradox, or misleading ghost analytics. In this work, we will present these effects as they pertain to oncology and engage small thinking methodologies to counter these effects ranging from
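
    The p ≫ n problem mentioned above can be made concrete with a generic sketch (not from the paper; the sizes and penalty strength are hypothetical): with far more features than samples, ordinary least squares is ill-posed, whereas a regularized estimator such as ridge regression remains well defined and can still pick out the informative features.

    ```python
    # Ridge regression in the p >> n regime, solved through its n x n dual form.
    import numpy as np

    rng = np.random.default_rng(0)
    n, p = 50, 2000                             # far more "omics" features than patients
    X = rng.standard_normal((n, p))
    true_w = np.zeros(p)
    true_w[:5] = 1.0                            # only a handful of features truly matter
    y = X @ true_w + 0.1 * rng.standard_normal(n)

    lam = 10.0                                  # ridge penalty (hypothetical)
    alpha = np.linalg.solve(X @ X.T + lam * np.eye(n), y)   # dual variables (n x n solve)
    w_ridge = X.T @ alpha                                    # primal coefficients

    print("top recovered features:", np.argsort(-np.abs(w_ridge))[:5])
    ```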

  18. Autoimmunity in visual loss.

    Science.gov (United States)

    Petzold, Axel; Wong, Sui; Plant, Gordon T

    2016-01-01

    A number of autoimmune disorders can affect visual function, and there is a very large number of mechanisms in the visual pathway that could potentially be targets of autoimmune attack. In practice it is the retina and the anterior visual pathway (optic nerve and chiasm) that are recognised as being affected in autoimmune disorders. Multiple Sclerosis is one of the commonest causes of visual loss in young adults because of the frequency of attacks of optic neuritis in that condition; however, the basis of the inflammation in Multiple Sclerosis and the confirmation of autoimmunity are lacking. The immune process is known to be highly unusual in that it is not systemic but confined to the CNS compartment. Previously an enigmatic partner to Multiple Sclerosis, Neuromyelitis Optica is now established to be autoimmune, and two antibodies - to Aquaporin4 and to Myelin Oligodendrocyte Glycoprotein - have been implicated in its pathogenesis. The term Chronic Relapsing Inflammatory Optic Neuropathy is applied to those cases of optic neuritis which require long-term immunosuppression and hence are presumed to be autoimmune, but where no autoimmune pathogenesis has been confirmed. Optic neuritis occurring post-infection or post-vaccination, and conditions such as Systemic Lupus Erythematosus and various vasculitides, may cause direct autoimmune attack on visual structures or indirect damage through occlusive vasculopathy. Chronic granulomatous disorders such as Sarcoidosis commonly affect vision by a variety of mechanisms; whether and how these fit into the autoimmune panoply is unknown. As far as the retina is concerned, Cancer Associated Retinopathy and Melanoma Associated Retinopathy are well characterised clinically, but a candidate autoantibody (against recoverin) has been described only in the former disorder. Other, usually monophasic, focal retinal inflammatory disorders (Idiopathic Big Blind Spot Syndrome, Acute Zonal Occult Outer Retinopathy and Acute Macular

  19. Occurrence and transport of nitrogen in the Big Sunflower River, northwestern Mississippi, October 2009-June 2011

    Science.gov (United States)

    Barlow, Jeannie R.B.; Coupe, Richard H.

    2014-01-01

    The Big Sunflower River Basin, located within the Yazoo River Basin, is subject to large annual inputs of nitrogen from agriculture, atmospheric deposition, and point sources. Understanding how nutrients are transported in, and downstream from, the Big Sunflower River is key to quantifying their eutrophying effects on the Gulf. Recent results from two Spatially Referenced Regressions on Watershed attributes (SPARROW models), which include the Big Sunflower River, indicate minimal losses of nitrogen in stream reaches typical of the main channels of major river systems. If SPARROW assumptions of relatively conservative transport of nitrogen are correct and surface-water losses through the bed of the Big Sunflower River are negligible, then options for managing nutrient loads to the Gulf of Mexico may be limited. Simply put, if every pound of nitrogen entering the Delta is eventually delivered to the Gulf, then the only effective nutrient management option in the Delta is to reduce inputs. If, on the other hand, it can be shown that processes within river channels of the Mississippi Delta act to reduce the mass of nitrogen in transport, other hydrologic approaches may be designed to further limit nitrogen transport. Direct validation of existing SPARROW models for the Delta is a first step in assessing the assumptions underlying those models. In order to characterize spatial and temporal variability of nitrogen in the Big Sunflower River Basin, water samples were collected at four U.S. Geological Survey gaging stations located on the Big Sunflower River between October 1, 2009, and June 30, 2011. Nitrogen concentrations were generally highest at each site during the spring of the 2010 water year and the fall and winter of the 2011 water year. Additionally, the dominant form of nitrogen varied between sites. For example, in samples collected from the most upstream site (Clarksdale), the concentration of organic nitrogen was generally higher than the concentrations of

  20. Big Computing in Astronomy: Perspectives and Challenges

    Science.gov (United States)

    Pankratius, Victor

    2014-06-01

    Hardware progress in recent years has led to astronomical instruments gathering large volumes of data. In radio astronomy, for instance, the current generation of antenna arrays produces data at terabits per second, and forthcoming instruments will expand these rates much further. As instruments increasingly become software-based, astronomers will become more exposed to computer science. This talk therefore outlines key challenges that arise at the intersection of computer science and astronomy and presents perspectives on how both communities can collaborate to overcome these challenges. Major problems are emerging because data rates are growing much faster than storage and transmission capacity, and because humans are cognitively overwhelmed when attempting to opportunistically scan through Big Data. As a consequence, the generation of scientific insight will become more dependent on automation and algorithmic instrument control. Intelligent data reduction will have to be considered across the entire acquisition pipeline. In this context, the presentation will outline the enabling role of machine learning and parallel computing. Bio: Victor Pankratius is a computer scientist who joined MIT Haystack Observatory following his passion for astronomy. He is currently leading efforts to advance astronomy through cutting-edge computer science and parallel computing. Victor is also involved in projects such as ALMA Phasing to enhance the ALMA Observatory with Very-Long Baseline Interferometry capabilities, the Event Horizon Telescope, as well as in the Radio Array of Portable Interferometric Detectors (RAPID) to create an analysis environment using parallel computing in the cloud. He has an extensive track record of research in parallel multicore systems and software engineering, with contributions to auto-tuning, debugging, and empirical experiments studying programmers. Victor has worked with major industry partners such as Intel, Sun Labs, and Oracle. He holds

  1. On Subtitle Translation of Sitcoms-A Case Study of The Big Bang Theory

    Institute of Scientific and Technical Information of China (English)

    杨雯婷

    2013-01-01

    Full Text Available Exquisite subtitle translation of foreign films and television series is a vital element in their spread among Chinese audiences. This article applies Eugene Nida's "Functional Equivalence" principle, together with three characteristics of sitcom subtitles, to study the type, form and features of The Big Bang Theory, leading to conclusions about the characteristics of sitcom subtitles. It analyzes the show's subtitles from six aspects. As a result, the author draws conclusions about translation tactics for The Big Bang Theory, which could help the subtitle translation of similar sitcoms.

  2. Boosting Big National Lab Data

    Energy Technology Data Exchange (ETDEWEB)

    Kleese van Dam, Kerstin [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2013-02-21

    Introduction: Big data. Love it or hate it, solving the world’s most intractable problems requires the ability to make sense of huge and complex sets of data and do it quickly. Speeding up the process – from hours to minutes or from weeks to days – is key to our success. One major source of such big data is physical experiments. As many will know, these physical experiments are commonly used to solve challenges in fields such as energy security, manufacturing, medicine, pharmacology, environmental protection and national security. Experiments use different instruments and sensor types to research, for example, the validity of new drugs, the root cause of diseases, more efficient energy sources, new materials for everyday goods, effective methods for environmental cleanup, or the optimal ingredient composition for chocolate, or to determine how to preserve valuable antiques. This is done by experimentally determining the structure, properties and processes that govern biological systems, chemical processes and materials. The speed and quality at which we can acquire new insights from experiments directly influences the rate of scientific progress, industrial innovation and competitiveness. And gaining new groundbreaking insights, faster, is key to the economic success of our nations. Recent years have seen incredible advances in sensor technologies, from house-sized detector systems in large experiments such as the Large Hadron Collider and the ‘Eye of Gaia’ billion-pixel camera detector to high-throughput genome sequencing. These developments have led to an exponential increase in the data volumes, rates and variety produced by instruments used for experimental work. This increase is coinciding with a need to analyze the experimental results at the time they are collected. This speed is required to optimize the data taking and quality, and also to enable new adaptive experiments, where the sample is manipulated as it is observed, e.g. a substance is injected into a

  3. Big Sky Carbon Sequestration Partnership

    Energy Technology Data Exchange (ETDEWEB)

    Susan M. Capalbo

    2005-11-01

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, comprises research institutions, public entities and private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership in Phase I fall into four areas: evaluation of sources and carbon sequestration sinks that will be used to determine the location of pilot demonstrations in Phase II; development of a GIS-based reporting framework that links with national networks; designing an integrated suite of monitoring, measuring, and verification technologies and assessment frameworks; and initiating a comprehensive education and outreach program. The groundwork is in place to provide an assessment of storage capabilities for CO2 utilizing the resources found in the Partnership region (both geological and terrestrial sinks) that would complement the ongoing DOE research agenda in Carbon Sequestration. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other DOE regional partnerships. The Partnership recognizes the critical importance of measurement, monitoring, and verification technologies to support not only carbon trading but all policies and programs that DOE and other agencies may want to pursue in support of GHG mitigation. The efforts in developing and implementing MMV technologies for geological sequestration reflect this concern. Research is also underway to identify and validate best management practices for soil C in the

  4. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    Energy Technology Data Exchange (ETDEWEB)

    Susan M. Capalbo

    2004-10-31

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, comprises research institutions, public entities and private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership fall into four areas: evaluation of sources and carbon sequestration sinks; development of a GIS-based reporting framework; designing an integrated suite of monitoring, measuring, and verification technologies; and initiating a comprehensive education and outreach program. At the first two Partnership meetings the groundwork was put in place to provide an assessment of capture and storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks) that would complement the ongoing DOE research. During the third quarter, planning efforts are underway for the next Partnership meeting, which will showcase the architecture of the GIS framework and initial results for sources and sinks, and discuss the methods and analysis underway for assessing geological and terrestrial sequestration potentials. The meeting will conclude with an ASME workshop. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other western DOE partnerships. Efforts are also being made to find funding to include Wyoming in the coverage areas for both geological and terrestrial sinks and sources. The Partnership recognizes the critical importance of measurement, monitoring, and verification

  5. Computing the bounds on the loss rates

    Directory of Open Access Journals (Sweden)

    Fourneau J.-M.

    2002-01-01

    Full Text Available We consider an example network for which we compute bounds on the cell loss rates. Deriving stochastic bounds for these loss rates using simple arguments leads to models that are easier to solve. Using stochastic orders, we prove that the loss rates of these simpler models are indeed bounds for those of the original model. For ill-balanced configurations these bounding models give good estimates of the loss rates.
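
    As a hedged illustration (this is not the authors' bounding technique), the closed-form blocking probability of a single M/M/1/K queue is the kind of easily solved quantity that such bounding models reduce to; the Python sketch below, with assumed arrival rate, service rate and buffer size, computes it directly.

        def mm1k_loss_probability(lam, mu, K):
            """Blocking (loss) probability of an M/M/1/K queue with arrival rate
            lam, service rate mu, and room for K cells in total."""
            rho = lam / mu
            if abs(rho - 1.0) < 1e-12:
                return 1.0 / (K + 1)
            return (1.0 - rho) * rho ** K / (1.0 - rho ** (K + 1))

        # Example (assumed figures): a link loaded at 90% with room for 16 cells.
        print(mm1k_loss_probability(lam=0.9, mu=1.0, K=16))   # ~0.022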

  6. Decoherence in Josephson Qubits from Dielectric Loss

    OpenAIRE

    Martinis, John M.; Cooper, K. B.; McDermott, R.; Steffen, Matthias; Ansmann, Markus; Osborn, K; Cicak, K.; Oh, S.; Pappas, D. P.; Simmonds, R. W.; Yu, Clare C

    2005-01-01

    Dielectric loss from two-level states is shown to be a dominant decoherence source in superconducting quantum bits. Depending on the qubit design, dielectric loss from insulating materials or the tunnel junction can lead to short coherence times. We show that a variety of microwave and qubit measurements are well modeled by loss from resonant absorption of two-level defects. Our results demonstrate that this loss can be significantly reduced by using better dielectrics and fabricating junctio...

  7. Small government or big government?

    Directory of Open Access Journals (Sweden)

    MATEO SPAHO

    2015-03-01

    Full Text Available Since the beginning of the twentieth century, economists and philosophers have been polarized over the role that government should have in the economy. On one hand, John Maynard Keynes represented, within the framework of a market economy, a position in which the state should intervene in the economy to maintain aggregate demand and employment in the country, without hesitating to create budget deficits and expand public debt. This approach applies especially in moments when the domestic economy and global economic trends show weak growth or a recession. It implies heavy interference in the economy, with higher revenue but also high expenditure relative to GDP. On the other side, Liberals and Neoliberals led by Friedrich Hayek advocated a withdrawal of the government from economic activity, not just in moments of economic growth but also during crises, believing that the market has self-regulating mechanisms within itself. The government, as a result, will have a smaller dimension, with lower revenue and also lower expenditure relative to the GDP of the country. We look at the South-Eastern European countries, distinguishing those with a "Big Government" from those with a "Small Government", and analyze their economic performance during the global crisis (2007-2014). In which countries did public debt grow less? Which countries managed to attract more investment, and which preserved the purchasing power of their consumers? We examine whether, during the economic crisis in Eastern Europe, big government or the liberal, "small" one has been the more successful model.

  8. Big Sky Carbon Sequestration Partnership

    Energy Technology Data Exchange (ETDEWEB)

    Susan M. Capalbo

    2005-11-01

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, comprises research institutions, public entities and private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership in Phase I fall into four areas: evaluation of sources and carbon sequestration sinks that will be used to determine the location of pilot demonstrations in Phase II; development of a GIS-based reporting framework that links with national networks; designing an integrated suite of monitoring, measuring, and verification technologies and assessment frameworks; and initiating a comprehensive education and outreach program. The groundwork is in place to provide an assessment of storage capabilities for CO2 utilizing the resources found in the Partnership region (both geological and terrestrial sinks) that would complement the ongoing DOE research agenda in Carbon Sequestration. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other DOE regional partnerships. The Partnership recognizes the critical importance of measurement, monitoring, and verification technologies to support not only carbon trading but all policies and programs that DOE and other agencies may want to pursue in support of GHG mitigation. The efforts in developing and implementing MMV technologies for geological sequestration reflect this concern. Research is also underway to identify and validate best management practices for soil C in the

  9. Big Sky Carbon Sequestration Partnership

    Energy Technology Data Exchange (ETDEWEB)

    Susan Capalbo

    2005-12-31

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, comprises research institutions, public entities and private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership in Phase I are organized into four areas: (1) Evaluation of sources and carbon sequestration sinks that will be used to determine the location of pilot demonstrations in Phase II; (2) Development of a GIS-based reporting framework that links with national networks; (3) Design of an integrated suite of monitoring, measuring, and verification technologies, market-based opportunities for carbon management, and an economic/risk assessment framework; (referred to below as the Advanced Concepts component of the Phase I efforts) and (4) Initiation of a comprehensive education and outreach program. As a result of the Phase I activities, the groundwork is in place to provide an assessment of storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks) that complements the ongoing DOE research agenda in Carbon Sequestration. The geology of the Big Sky Carbon Sequestration Partnership Region is favorable for the potential sequestration of enormous volumes of CO{sub 2}. The United States Geological Survey (USGS 1995) identified 10 geologic provinces and 111 plays in the region. These provinces and plays include both sedimentary rock types characteristic of oil, gas, and coal production as well as large areas of mafic volcanic rocks. Of the 10 provinces and 111 plays, 1 province and 4 plays are located within Idaho. The remaining 9 provinces and 107 plays are dominated by sedimentary rocks and located in the states of Montana and Wyoming. The potential sequestration capacity of the 9 sedimentary provinces within the region ranges from 25,000 to almost 900,000 million metric tons of CO{sub 2}. Overall every sedimentary formation investigated

  10. Big Data Big Changes%大数据,大变革

    Institute of Scientific and Technical Information of China (English)

    梁爽

    2014-01-01

    Big data is happening around people all the time; the big data era has arrived. This paper describes the characteristics of big data and analyzes the current state of big data research at home and abroad as well as its future application directions. Only by rethinking big data, changing the way big data is understood, adapting business models to the changes big data brings, innovating big data management models, strengthening institutional development, raising legal awareness, and safeguarding personal and national security can the healthy development of big data be continuously promoted.

  11. Benchmarking Big Data Systems and the BigData Top100 List.

    Science.gov (United States)

    Baru, Chaitanya; Bhandarkar, Milind; Nambiar, Raghunath; Poess, Meikel; Rabl, Tilmann

    2013-03-01

    "Big data" has become a major force of innovation across enterprises of all sizes. New platforms with increasingly more features for managing big datasets are being announced almost on a weekly basis. Yet, there is currently a lack of any means of comparability among such platforms. While the performance of traditional database systems is well understood and measured by long-established institutions such as the Transaction Processing Performance Council (TCP), there is neither a clear definition of the performance of big data systems nor a generally agreed upon metric for comparing these systems. In this article, we describe a community-based effort for defining a big data benchmark. Over the past year, a Big Data Benchmarking Community has become established in order to fill this void. The effort focuses on defining an end-to-end application-layer benchmark for measuring the performance of big data applications, with the ability to easily adapt the benchmark specification to evolving challenges in the big data space. This article describes the efforts that have been undertaken thus far toward the definition of a BigData Top100 List. While highlighting the major technical as well as organizational challenges, through this article, we also solicit community input into this process.

  12. Five Big, Big Five Issues : Rationale, Content, Structure, Status, and Crosscultural Assessment

    NARCIS (Netherlands)

    De Raad, Boele

    1998-01-01

    This article discusses the rationale, content, structure, status, and crosscultural assessment of the Big Five trait factors, focusing on topics of dispute and misunderstanding. Taxonomic restrictions of the original Big Five forerunner, the "Norman Five," are discussed, and criticisms regarding the

  13. Vision Loss, Sudden

    Science.gov (United States)

    Consumer health reference on sudden vision loss, covering age-related macular degeneration, vision loss in older people ("Spotlight on Aging"), and a table of some causes and features of sudden loss of vision (cause, common features, tests).

  14. HADOOP+Big Data: Analytics Using Series Queue with Blocking Model

    Directory of Open Access Journals (Sweden)

    S. Koteeswaran

    2014-07-01

    Full Text Available Big data deals with very large volumes of data, and managing data at this scale is beyond the reach of traditional data mining techniques. Technology is pervasive and continues to grow tremendously, so coordinating such amounts of data in a linear way is difficult. We therefore propose a new scheme for data extraction and data transformation in large databases, and we implement it on HADOOP (one of the big data management tools). Our model is based on data aggregation and data modelling, and it leads to high-end data transformation for big data processing. We obtained our analytical results by applying the model on 2 HADOOP clusters with 4 nodes and 25 jobs using the MR (MapReduce) functionality.
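
    To make the aggregation idea concrete, here is a minimal, hypothetical Hadoop Streaming job written in Python; it is not the authors' code, only a sketch of the kind of MapReduce aggregation such jobs perform. The field index, file name and submission options are assumptions for illustration.

        #!/usr/bin/env python3
        # aggregate.py -- run as "aggregate.py map" for the mapper and
        # "aggregate.py reduce" for the reducer under Hadoop Streaming.
        import sys

        def mapper():
            # Emit "key<TAB>1" for the first comma-separated field of each record.
            for line in sys.stdin:
                fields = line.rstrip("\n").split(",")
                if fields and fields[0]:
                    print(f"{fields[0]}\t1")

        def reducer():
            # Input arrives sorted by key; sum the counts per key.
            current_key, count = None, 0
            for line in sys.stdin:
                key, value = line.rstrip("\n").split("\t")
                if key == current_key:
                    count += int(value)
                else:
                    if current_key is not None:
                        print(f"{current_key}\t{count}")
                    current_key, count = key, int(value)
            if current_key is not None:
                print(f"{current_key}\t{count}")

        if __name__ == "__main__":
            mapper() if (len(sys.argv) > 1 and sys.argv[1] == "map") else reducer()

    A job of this shape would be submitted with the hadoop-streaming jar, roughly as -mapper "aggregate.py map" -reducer "aggregate.py reduce" plus the input and output paths.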

  15. Big Data Analytics Platforms analyze from startups to traditional database players

    Directory of Open Access Journals (Sweden)

    Ionut TARANU

    2015-07-01

    Full Text Available Big data analytics enables organizations to analyze a mix of structured, semi-structured and unstructured data in search of valuable business information and insights. The analytical findings can lead to more effective marketing, new revenue opportunities, better customer service, improved operational efficiency, competitive advantages over rival organizations and other business benefits. With so many emerging trends around big data and analytics, IT organizations need to create conditions that will allow analysts and data scientists to experiment. "You need a way to evaluate, prototype and eventually integrate some of these technologies into the business," says Chris Curran[1]. In this paper we review the top 10 big data analytics platforms and compare their key features.

  16. Big sized players on the European Union’s financial advisory market

    Directory of Open Access Journals (Sweden)

    Nicolae, C.

    2013-06-01

    Full Text Available The paper presents the activity and the objectives of "The Big Four" Group of Financial Advisory Firms. The "Big Four" are the four largest international professional services networks in accountancy and professional services, offering audit, assurance, tax, consulting, advisory, actuarial, corporate finance and legal services. They handle the vast majority of audits for publicly traded companies as well as many private companies, creating an oligopoly in auditing large companies. It is reported that the Big Four audit all but one of the companies that constitute the FTSE 100, and 240 of the companies in the FTSE 250, an index of the leading mid-cap listed companies.

  17. Chronic lead poisoning in horses

    Energy Technology Data Exchange (ETDEWEB)

    Knight, H.D.; Burau, R.G.

    1973-05-01

    Chronic lead poisoning in horses was manifested as anorexia, loss of body weight, muscular weakness, anemia, laryngeal hemiplegia, and, terminally, inhalation pneumonia. Some deaths were sudden and unexplained. The lead content in liver specimens from 10 horses was greater than that considered indicative of lead intoxication; however, the lead content of blood was equivocal. The most conclusive laboratory finding was increased urine lead concentration after chelation therapy. The concentration of lead in a sample of vegetation considered to be representative of what a horse would eat if he was grazing in the area sampled was 325 ppm (oven-dry basis). It was determined that a 450-kg horse grazing grass of this lead content would consume 2.9 g of lead daily (6.4 mg/kg of body weight), an amount considered toxic for horses. Leaching lowered the calcium content of the forage but failed to reduce the lead concentration of the plants significantly, thus opening the possibility that winter rains might have influenced the onset of poisoning. Airborne fallout from a nearby lead smelter was proposed as the primary mode of pasture contamination.
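
    As a quick arithmetic check added here (not part of the original record), the reported daily intake is consistent with the stated dose rate; the second figure, which assumes the 325 ppm forage was the sole source of lead, gives the implied daily dry-forage intake and is an inference rather than a reported value.

        \[
          \frac{2.9\ \mathrm{g/day}}{450\ \mathrm{kg}}
            = \frac{2900\ \mathrm{mg/day}}{450\ \mathrm{kg}}
            \approx 6.4\ \mathrm{mg/kg\ of\ body\ weight\ per\ day},
          \qquad
          \frac{2900\ \mathrm{mg/day}}{325\ \mathrm{mg/kg\ dry\ forage}}
            \approx 8.9\ \mathrm{kg\ dry\ forage\ per\ day}.
        \]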

  18. Big Bang-Big Crunch Algorithm for Voltage Stability Limit Improvement by Coordinated Control of SVC Settings

    Directory of Open Access Journals (Sweden)

    S. Sakthivel

    2013-07-01

    Full Text Available Modern power system networks are operated under highly stressed conditions, and there is a risk of voltage instability problems owing to increased load demand. A power system needs a sufficient voltage stability margin for secure operation. In this study, the SVC location and size, along with generator bus voltages and transformer tap settings, are considered as control parameters for voltage stability limit improvement by minimizing loss and voltage deviation. The control parameters are varied in a coordinated manner for better results. The line-based LQP voltage stability indicator is used for voltage stability assessment. The nature-inspired metaheuristic Big Bang-Big Crunch (BB-BC) algorithm is exploited for optimization of the control variables, and its performance is compared with that of the PSO algorithm. The effectiveness of the proposed algorithm is tested on the standard IEEE 30-bus system under normal and N-1 line outage contingency conditions. The simulation results demonstrate the good performance of the new algorithm.
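
    For readers unfamiliar with the metaheuristic, the sketch below shows the generic Big Bang-Big Crunch loop in Python on a toy objective. It is only an illustration under assumed parameter values: the study's actual objective (loss plus voltage deviation evaluated through a power flow) and its control variables (SVC location and size, generator voltages, tap settings) are not reproduced here.

        import numpy as np

        def big_bang_big_crunch(objective, lower, upper, pop_size=50,
                                iterations=100, alpha=1.0, seed=0):
            """Minimize `objective` over a box using the Big Bang-Big Crunch loop."""
            rng = np.random.default_rng(seed)
            lower, upper = np.asarray(lower, float), np.asarray(upper, float)
            dim = lower.size
            # Initial Big Bang: random candidates spread over the search box.
            pop = rng.uniform(lower, upper, size=(pop_size, dim))
            best_x, best_f = None, np.inf
            for k in range(1, iterations + 1):
                cost = np.array([objective(x) for x in pop])
                if cost.min() < best_f:
                    best_f, best_x = cost.min(), pop[cost.argmin()].copy()
                # Big Crunch: contract the population to its centre of mass,
                # weighting each candidate by the inverse of its cost.
                weights = 1.0 / (cost + 1e-12)
                centre = (weights[:, None] * pop).sum(axis=0) / weights.sum()
                # Next Big Bang: scatter new candidates around the centre with a
                # radius that shrinks as the iteration counter grows.
                spread = alpha * (upper - lower) / k
                pop = np.clip(centre + rng.standard_normal((pop_size, dim)) * spread,
                              lower, upper)
            return best_x, best_f

        # Toy usage: minimize a 4-dimensional sphere function.
        x_best, f_best = big_bang_big_crunch(lambda v: float(np.sum(v ** 2)),
                                             lower=[-5] * 4, upper=[5] * 4)
        print(x_best, f_best)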

  19. Transcriptome marker diagnostics using big data.

    Science.gov (United States)

    Han, Henry; Liu, Ying

    2016-02-01

    Big omics data are challenging translational bioinformatics in an unprecedented way through their complexity and volume. How to employ big omics data to achieve reproducible disease diagnosis that rivals clinical practice from a systems approach is an urgent problem to be solved in translational bioinformatics and machine learning. In this study, the authors propose a novel transcriptome marker diagnosis to tackle this problem using big RNA-seq data by systematically viewing the whole transcriptome as a profile marker. The systems diagnosis not only avoids the reproducibility issue of existing gene-/network-marker-based diagnostic methods, but also achieves diagnostic results rivalling clinical practice by extracting true signals from big RNA-seq data. The method is a better fit for personalised diagnostics than competing methods, attaining exceptional diagnostic performance by using systems information, and is a good candidate for clinical usage. To the best of the authors' knowledge, it is the first study on this topic, and it will inspire more investigations in big omics data diagnostics.
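
    The record does not describe the authors' pipeline, but the "whole transcriptome as a profile marker" idea amounts to fitting a regularized classifier on all genes at once rather than on a pre-selected marker panel. The scikit-learn sketch below is a minimal illustration on random stand-in data; the sample size, gene count and model choice are assumptions, not the authors' method.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        # Stand-in data: 100 samples x 5,000 genes of log-transformed expression.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(100, 5_000))
        y = rng.integers(0, 2, size=100)

        # Whole-transcriptome "profile marker": every gene enters the model,
        # with L2 regularization handling the p >> n setting.
        clf = make_pipeline(StandardScaler(),
                            LogisticRegression(penalty="l2", C=1.0, max_iter=1000))
        print(cross_val_score(clf, X, y, cv=5).mean())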

  20. Conceptualization and theorization of the Big Data

    Directory of Open Access Journals (Sweden)

    Marcos Mazzieri

    2016-06-01

    Full Text Available The term Big Data is being used widely by companies and researchers who consider its functionalities or applications relevant to creating value and business innovation. However, some questions arise about what this phenomenon is and, more precisely, how it occurs and under what conditions it can create value and innovation in business. In our view, the lack of depth concerning the principles involved in Big Data, and the very absence of a conceptual definition, have made it difficult to answer these questions, which have been the basis for our research. To answer them we carried out a bibliometric study and an extensive literature review. The bibliometric study was based on articles and citations from the Web of Knowledge database. The main result of our research is a conceptual definition for the term Big Data. We also indicate which of the principles identified can contribute to other research aimed at value creation through Big Data. Finally, we propose examining value creation through Big Data using the Resource-Based View as the main theory for discussing this theme.