WorldWideScience

Sample records for analyses predict a20

  1. Comparative analyses of genetic risk prediction methods reveal ...

    Indian Academy of Sciences (India)

    Home; Journals; Journal of Genetics; Volume 94; Issue 1. Comparative analyses of genetic risk prediction methods reveal extreme diversity of genetic predisposition to nonalcoholic fatty liver disease (NAFLD) among ethnic populations of India. Ankita Chatterjee Analabha Basu Abhijit Chowdhury Kausik Das Neeta ...

  2. Baseline Depressive Symptoms Predict Subsequent Heart Disease; A 20-Year Cohort

    Directory of Open Access Journals (Sweden)

    Maryam Moghani Lankarani

    2016-03-01

    Full Text Available Depression is common among patients with heart disease and is associated with worse outcomes in these patients. Fewer studies have examined whether baseline depressive symptoms predict subsequent heart disease in the general population.

  3. Neonatal Sleep-Wake Analyses Predict 18-month Neurodevelopmental Outcomes.

    Science.gov (United States)

    Shellhaas, Renée A; Burns, Joseph W; Hassan, Fauziya; Carlson, Martha D; Barks, John D E; Chervin, Ronald D

    2017-11-01

    The neurological examination of critically ill neonates is largely limited to reflexive behavior. The exam often ignores sleep-wake physiology that may reflect brain integrity and influence long-term outcomes. We assessed whether polysomnography and concurrent cerebral near-infrared spectroscopy (NIRS) might improve prediction of 18-month neurodevelopmental outcomes. Term newborns with suspected seizures underwent standardized neurologic examinations to generate Thompson scores and had 12-hour bedside polysomnography with concurrent cerebral NIRS. For each infant, the distribution of sleep-wake stages and electroencephalogram (EEG) delta power were computed. NIRS-derived fractional tissue oxygen extraction (FTOE) was calculated across sleep-wake stages. At age 18-22 months, surviving participants were evaluated with the Bayley Scales of Infant Development, 3rd edition (Bayley-III). Twenty-nine participants completed the Bayley-III. Increased newborn time in quiet sleep predicted worse 18-month cognitive and motor scores (robust regression models, adjusted r2 = 0.22, p = .007, and 0.27, p = .004, respectively). Decreased 0.5-2 Hz EEG power during quiet sleep predicted worse 18-month language and motor scores (adjusted r2 = 0.25, p = .0005, and 0.33, p = .001, respectively). Predictive values remained significant after adjustment for neonatal Thompson scores or exposure to phenobarbital. Similarly, an attenuated difference in FTOE between neonatal wakefulness and quiet sleep predicted worse 18-month cognitive, language, and motor scores in adjusted analyses. Disturbed neonatal sleep, as quantified by increased time in quiet sleep, lower EEG delta power during that stage, and muted differences in FTOE between quiet sleep and wakefulness, may improve prediction of adverse long-term outcomes for newborns with neurological dysfunction. © Sleep Research Society 2017. Published by Oxford University Press on behalf of the Sleep Research Society. All rights reserved.

  4. Genome sequencing of Sulfolobus sp. A20 from Costa Rica and comparative analyses of the putative pathways of carbon, nitrogen and sulfur metabolism in various Sulfolobus strains

    Directory of Open Access Journals (Sweden)

    Xin Dai

    2016-11-01

    Full Text Available The genome of Sulfolobus sp. A20, isolated from a hot spring in Costa Rica, was sequenced. The circular genome of the strain is 2,688,317 bp in size and 34.8% in G+C content, and contains 2,591 open reading frames (ORFs). Strain A20 shares ~95.6% identity at the 16S rRNA gene sequence level and less than 30% DNA-DNA hybridization (DDH) values with the most closely related known Sulfolobus species (i.e., S. islandicus and S. solfataricus), suggesting that it represents a novel Sulfolobus species. Comparison of the genome of strain A20 with those of the type strains of S. solfataricus, S. acidocaldarius, S. islandicus and S. tokodaii, which were isolated from geographically separated areas, identified 1,801 genes conserved among all Sulfolobus species analyzed (core genes). Comparative genome analyses show that central carbon metabolism in Sulfolobus is highly conserved, and enzymes involved in the Entner-Doudoroff pathway, the tricarboxylic acid cycle and the CO2 fixation pathways are predominantly encoded by the core genes. All Sulfolobus species encode genes required for the conversion of ammonium into glutamate/glutamine. Some Sulfolobus strains have gained the ability to utilize additional nitrogen sources such as nitrate (i.e., S. islandicus strains REY15A, LAL14/1, M14.25 and M16.27) or urea (i.e., S. islandicus HEV10/4, S. tokodaii strain 7 and S. metallicus DSM 6482). The strategies for sulfur metabolism are the most diverse and least understood. S. tokodaii encodes a sulfur oxygenase/reductase (SOR), whereas both S. islandicus and S. solfataricus contain genes for a sulfur reductase (SRE). However, neither SOR nor SRE genes exist in the genome of strain A20, raising the possibility that an unknown pathway for the utilization of elemental sulfur may be present in the strain. The ability of Sulfolobus to utilize nitrate or sulfur is encoded by a gene cluster flanked by IS elements or their remnants. These clusters appear to have become fixed at a specific

  5. Climate Prediction Center (CPC) US daily temperature analyses

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The U.S. daily temperature analyses are maps depicting various temperature quantities utilizing daily maximum and minimum temperature data across the US. Maps are...

  6. Comparative analyses of genetic risk prediction methods reveal ...

    Indian Academy of Sciences (India)

    2015-03-12

    Mar 12, 2015 ... where it is related to modern lifestyle with additional complication due to rising incidence of type 2 diabetes mellitus (DM) and obesity (Angulo and ... is rapidly becoming a health burden in western and developing countries. In this study we defined a model of disease risk score prediction for different ...

  7. Analysing earthquake slip models with the spatial prediction comparison test

    KAUST Repository

    Zhang, L.

    2014-11-10

    Earthquake rupture models inferred from inversions of geophysical and/or geodetic data exhibit remarkable variability due to uncertainties in modelling assumptions, the use of different inversion algorithms, or variations in data selection and data processing. A robust statistical comparison of different rupture models obtained for a single earthquake is needed to quantify the intra-event variability, both for benchmark exercises and for real earthquakes. The same approach may be useful to characterize (dis-)similarities in events that are typically grouped into a common class of events (e.g. moderate-size crustal strike-slip earthquakes or tsunamigenic large subduction earthquakes). For this purpose, we examine the performance of the spatial prediction comparison test (SPCT), a statistical test developed to compare spatial (random) fields by means of a chosen loss function that describes an error relation between a 2-D field (‘model’) and a reference model. We implement and calibrate the SPCT approach for a suite of synthetic 2-D slip distributions, generated as spatial random fields with various characteristics, and then apply the method to results of a benchmark inversion exercise with known solution. We find the SPCT to be sensitive to different spatial correlation lengths, and different heterogeneity levels of the slip distributions. The SPCT approach proves to be a simple and effective tool for ranking the slip models with respect to a reference model.
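
    The core idea of a loss-based spatial comparison can be sketched in a few lines. The version below is a deliberately naive caricature, assuming a pointwise squared-error loss and independent grid points; the actual SPCT additionally accounts for the spatial correlation of the loss-differential field when estimating the variance of the test statistic, which this toy version omits.

```python
import numpy as np

def loss_differential_test(model_a, model_b, reference):
    """Naive loss-differential comparison of two 2-D fields against a
    reference: pointwise squared-error loss, then a z-score on the mean
    loss differential assuming i.i.d. grid points (the real SPCT models
    the spatial correlation of this field instead)."""
    loss_a = (model_a - reference) ** 2
    loss_b = (model_b - reference) ** 2
    d = (loss_a - loss_b).ravel()          # loss-differential field
    mean_d = d.mean()
    se = d.std(ddof=1) / np.sqrt(d.size)   # naive (i.i.d.) standard error
    z = mean_d / se if se > 0 else 0.0
    return mean_d, z

# Synthetic check: a field close to the reference should beat a noisy one.
rng = np.random.default_rng(0)
ref = rng.standard_normal((50, 50))
close = ref + 0.1 * rng.standard_normal((50, 50))
far = ref + 1.0 * rng.standard_normal((50, 50))
mean_d, z = loss_differential_test(far, close, ref)
print(mean_d > 0)   # the "far" model incurs the larger loss
```

On smooth slip distributions the i.i.d. assumption badly overstates significance, which is exactly why the SPCT's treatment of spatial correlation matters.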

  8. Mineralogical and electron microscopy analysis of clayey soil samples from the A20 near Châteauroux

    OpenAIRE

    DUC, Myriam; MALOULA, Aurélie

    2008-01-01

    Mineralogical analyses, semi-quantitative elemental composition and scanning electron microscopy observations of several soil samples from the Châteauroux area, following damage that occurred on the A20 (shrink-swell behavior of clays suspected).

  9. HHV Predicting Correlations for Torrefied Biomass Using Proximate and Ultimate Analyses

    Directory of Open Access Journals (Sweden)

    Daya Ram Nhuchhen

    2017-01-01

    Full Text Available Many correlations are available in the literature to predict the higher heating value (HHV of raw biomass using the proximate and ultimate analyses. Studies on biomass torrefaction are growing tremendously, which suggest that the fuel characteristics, such as HHV, proximate analysis and ultimate analysis, have changed significantly after torrefaction. Such changes may cause high estimation errors if the existing HHV correlations were to be used in predicting the HHV of torrefied biomass. No study has been carried out so far to verify this. Therefore, this study seeks answers to the question: “Can the existing correlations be used to determine the HHV of the torrefied biomass”? To answer this, the existing HHV predicting correlations were tested using torrefied biomass data points. Estimation errors were found to be significantly high for the existing HHV correlations, and thus, they are not suitable for predicting the HHV of the torrefied biomass. New correlations were then developed using data points of torrefied biomass. The ranges of reported data for HHV, volatile matter (VM, fixed carbon (FC, ash (ASH, carbon (C, hydrogen (H and oxygen (O contents were 14.90 MJ/kg–33.30 MJ/kg, 13.30%–88.57%, 11.25%–82.74%, 0.08%–47.62%, 35.08%–86.28%, 0.53%–7.46% and 4.31%–44.70%, respectively. Correlations with the minimum mean absolute errors and having all components of proximate and ultimate analyses were selected for future use. The selected new correlations have a good accuracy of prediction when they are validated using another set of data (26 samples. Thus, these new and more accurate correlations can be useful in modeling different thermochemical processes, including combustion, pyrolysis and gasification processes of torrefied biomass.
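
    The correlation-fitting step described above is ordinary least squares on measured fuel properties. The sketch below illustrates the procedure on purely synthetic data spanning the reported proximate-analysis ranges; the coefficients and noise level are made up for the demonstration and are not the paper's correlations.

```python
import numpy as np

# Hypothetical illustration: fit a linear proximate-analysis correlation
#   HHV = a0 + a1*FC + a2*VM + a3*ASH   (MJ/kg)
# by least squares on synthetic data drawn from the ranges in the abstract.
rng = np.random.default_rng(1)
n = 40
FC = rng.uniform(11.25, 82.74, n)     # fixed carbon, %
VM = rng.uniform(13.30, 88.57, n)     # volatile matter, %
ASH = rng.uniform(0.08, 47.62, n)     # ash, % (drawn independently; demo only)
true = np.array([1.0, 0.30, 0.15, -0.05])   # made-up coefficients
HHV = true[0] + true[1] * FC + true[2] * VM + true[3] * ASH \
      + 0.2 * rng.standard_normal(n)        # add measurement noise

X = np.column_stack([np.ones(n), FC, VM, ASH])
coef, *_ = np.linalg.lstsq(X, HHV, rcond=None)
mae = np.abs(X @ coef - HHV).mean()   # mean absolute error of the fit
print(mae < 0.5)
```

Selecting among candidate correlations by minimum mean absolute error, as the study does, amounts to repeating this fit for each functional form and validating the winner on held-out samples.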

  10. Predictability of the monthly North Atlantic Oscillation index based on fractal analyses and dynamic system theory

    Science.gov (United States)

    Martínez, M. D.; Lana, X.; Burgueño, A.; Serra, C.

    2010-03-01

    The predictability of the monthly North Atlantic Oscillation, NAO, index is analysed from the point of view of different fractal concepts and dynamical systems theory, such as lacunarity, rescaled range analysis (Hurst exponent) and the reconstruction theorem (embedding and correlation dimensions, Kolmogorov entropy and Lyapunov exponents). The main results point out evident signs of randomness and the necessity of stochastic models to represent the time evolution of the NAO index. The results also show that the monthly NAO index behaves as a white-noise Gaussian process. The high minimum number of nonlinear equations needed to describe the physical process governing the NAO index fluctuations is evidence of its complexity. A notable predictive instability is indicated by the positive Lyapunov exponents. Besides corroborating the complex time behaviour of the NAO index, the present results suggest that random Cantor sets would be an interesting tool to model the lacunarity and time evolution of the NAO index.
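
    The rescaled-range procedure behind the Hurst exponent can be sketched compactly. The window sizes and the white-noise surrogate series below are illustrative assumptions, not the study's NAO data; for an uncorrelated Gaussian series the estimate should sit near 0.5, consistent with the white-noise behaviour the abstract reports.

```python
import numpy as np

def hurst_rs(x, window_sizes):
    """Estimate the Hurst exponent by rescaled-range (R/S) analysis:
    for each window size n, average R/S over non-overlapping windows,
    then fit log(R/S) ~ H * log(n)."""
    log_n, log_rs = [], []
    for n in window_sizes:
        rs_vals = []
        for start in range(0, len(x) - n + 1, n):
            w = x[start:start + n]
            z = np.cumsum(w - w.mean())    # cumulative deviation from the mean
            r = z.max() - z.min()          # range of the cumulative series
            s = w.std(ddof=1)              # standard deviation of the window
            if s > 0:
                rs_vals.append(r / s)
        log_n.append(np.log(n))
        log_rs.append(np.log(np.mean(rs_vals)))
    H, _ = np.polyfit(log_n, log_rs, 1)    # slope of the log-log fit
    return H

rng = np.random.default_rng(2)
noise = rng.standard_normal(4096)          # white-noise surrogate series
H = hurst_rs(noise, [16, 32, 64, 128, 256])
print(0.35 < H < 0.75)                     # near 0.5, i.e. no persistence
```

Small-window R/S estimates are biased slightly above 0.5; serious analyses apply finite-size corrections before interpreting the slope.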

  11. Predictability of the monthly North Atlantic Oscillation index based on fractal analyses and dynamic system theory

    Directory of Open Access Journals (Sweden)

    M. D. Martínez

    2010-03-01

    Full Text Available The predictability of the monthly North Atlantic Oscillation, NAO, index is analysed from the point of view of different fractal concepts and dynamical systems theory, such as lacunarity, rescaled range analysis (Hurst exponent) and the reconstruction theorem (embedding and correlation dimensions, Kolmogorov entropy and Lyapunov exponents). The main results point out evident signs of randomness and the necessity of stochastic models to represent the time evolution of the NAO index. The results also show that the monthly NAO index behaves as a white-noise Gaussian process. The high minimum number of nonlinear equations needed to describe the physical process governing the NAO index fluctuations is evidence of its complexity. A notable predictive instability is indicated by the positive Lyapunov exponents. Besides corroborating the complex time behaviour of the NAO index, the present results suggest that random Cantor sets would be an interesting tool to model the lacunarity and time evolution of the NAO index.

  12. Computational approaches to analyse and predict small molecule transport and distribution at cellular and subcellular levels.

    Science.gov (United States)

    Min, Kyoung Ah; Zhang, Xinyuan; Yu, Jing-yu; Rosania, Gus R

    2014-01-01

    Quantitative structure-activity relationship (QSAR) studies and mechanistic mathematical modeling approaches have been independently employed for analysing and predicting the transport and distribution of small molecule chemical agents in living organisms. Both of these computational approaches have been useful for interpreting experiments measuring the transport properties of small molecule chemical agents, in vitro and in vivo. Nevertheless, mechanistic cell-based pharmacokinetic models have been especially useful to guide the design of experiments probing the molecular pathways underlying small molecule transport phenomena. Unlike QSAR models, mechanistic models can be integrated from microscopic to macroscopic levels, to analyse the spatiotemporal dynamics of small molecule chemical agents from intracellular organelles to whole organs, well beyond the experiments and training data sets upon which the models are based. Because they are based on differential equations, mechanistic models can also be integrated with other differential-equation-based systems biology models of biochemical networks or signaling pathways. Although mathematical modeling approaches aimed at predicting drug transport and distribution originated and evolved independently from systems biology, we propose that the incorporation of mechanistic cell-based computational models of drug transport and distribution into a systems biology modeling framework is a logical next step for the advancement of systems pharmacology research. Copyright © 2013 John Wiley & Sons, Ltd.
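
    A minimal sketch of the mechanistic, differential-equation-based style of model the abstract contrasts with QSAR: passive permeation of a small molecule across a cell membrane, integrated with forward Euler. All parameter values here are assumed round numbers for illustration, not taken from any specific study.

```python
import numpy as np

# One-compartment cellular uptake model (hypothetical parameters):
#   dC_in/dt = (P * A / V) * (C_out - C_in)
P = 1e-5      # membrane permeability, cm/s       (assumed)
A = 1e-5      # cell surface area, cm^2           (assumed)
V = 1e-9      # cell volume, cm^3 (about 1 pL)    (assumed)
C_out = 10.0  # extracellular concentration, uM, held constant

k = P * A / V                  # first-order uptake rate constant, 1/s
dt, T = 0.01, 60.0             # Euler time step and horizon, s
C_in = 0.0
for _ in range(int(T / dt)):
    C_in += dt * k * (C_out - C_in)

# Compare against the closed-form solution C_out * (1 - exp(-k*t)).
C_exact = C_out * (1.0 - np.exp(-k * T))
print(abs(C_in - C_exact) < 0.05)
```

Because the model is a plain ODE, it composes directly with other ODE-based systems biology models, which is the integration argument the abstract makes.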

  13. Transport and stability analyses supporting disruption prediction in high beta KSTAR plasmas

    Science.gov (United States)

    Ahn, J.-H.; Sabbagh, S. A.; Park, Y. S.; Berkery, J. W.; Jiang, Y.; Riquezes, J.; Lee, H. H.; Terzolo, L.; Scott, S. D.; Wang, Z.; Glasser, A. H.

    2017-10-01

    KSTAR plasmas have reached high stability parameters in dedicated experiments, with normalized beta βN exceeding 4.3 at relatively low plasma internal inductance li (βN/li>6). Transport and stability analyses have begun on these plasmas to best understand a disruption-free path toward the design target of βN = 5 while aiming to maximize the non-inductive fraction of these plasmas. Initial analysis using the TRANSP code indicates that the non-inductive current fraction in these plasmas has exceeded 50 percent. The advent of KSTAR kinetic equilibrium reconstructions now allows more accurate computation of the MHD stability of these plasmas. Attention is placed on code validation of mode stability using the PEST-3 and resistive DCON codes. Initial evaluation of these analyses for disruption prediction is made using the disruption event characterization and forecasting (DECAF) code. The present global mode kinetic stability model in DECAF developed for low aspect ratio plasmas is evaluated to determine modifications required for successful disruption prediction of KSTAR plasmas. Work supported by U.S. DoE under contract DE-SC0016614.

  14. Functional enrichment analyses and construction of functional similarity networks with high confidence function prediction by PFP

    Directory of Open Access Journals (Sweden)

    Kihara Daisuke

    2010-05-01

    Full Text Available Abstract Background A new paradigm of biological investigation takes advantage of technologies that produce large high-throughput datasets, including genome sequences, interactions of proteins, and gene expression. The ability of biologists to analyze and interpret such data relies on functional annotation of the included proteins, but even in highly characterized organisms many proteins can lack the functional evidence necessary to infer their biological relevance. Results Here we have applied high confidence function predictions from our automated prediction system, PFP, to three genome sequences, Escherichia coli, Saccharomyces cerevisiae, and Plasmodium falciparum (malaria). The number of annotated genes is increased by PFP to over 90% for all of the genomes. Using the large coverage of the function annotation, we introduced functional similarity networks which represent the functional space of the proteomes. Four different functional similarity networks are constructed for each proteome: one each by considering similarity in a single Gene Ontology (GO) category, i.e. Biological Process, Cellular Component, and Molecular Function, and another by considering overall similarity with the funSim score. The functional similarity networks are shown to have higher modularity than the protein-protein interaction network. Moreover, the funSim score network is distinct from the single GO-score networks in showing a higher clustering degree exponent value and thus has a higher tendency to be hierarchical. In addition, examining function assignments to the protein-protein interaction network and local regions of genomes has identified numerous cases where subnetworks or local regions have functionally coherent proteins. These results will help interpret interactions of proteins and gene orders in a genome. Several examples of both analyses are highlighted. Conclusion The analyses demonstrate that applying high confidence predictions from PFP

  15. Interactions between risk factors in the prediction of onset of eating disorders: Exploratory hypothesis generating analyses.

    Science.gov (United States)

    Stice, Eric; Desjardins, Christopher D

    2018-06-01

    Because no study has tested for interactions between risk factors in the prediction of future onset of each eating disorder, this exploratory study addressed this lacuna to generate hypotheses to be tested in future confirmatory studies. Data from three prevention trials that targeted young women at high risk for eating disorders due to body dissatisfaction (N = 1271; M age 18.5, SD 4.2) and collected diagnostic interview data over 3-year follow-up were combined to permit sufficient power to predict onset of anorexia nervosa (AN), bulimia nervosa (BN), binge eating disorder (BED), and purging disorder (PD) using classification tree analyses, an analytic technique uniquely suited to detecting interactions. Low BMI was the most potent predictor of AN onset, and body dissatisfaction amplified this relation. Overeating was the most potent predictor of BN onset, and positive expectancies for thinness and body dissatisfaction amplified this relation. Body dissatisfaction was the most potent predictor of BED onset, and overeating, low dieting, and thin-ideal internalization amplified this relation. Dieting was the most potent predictor of PD onset, and negative affect and positive expectancies for thinness amplified this relation. Results provided evidence of amplifying interactions between risk factors suggestive of cumulative risk processes that were distinct for each disorder; future confirmatory studies should test the interactive hypotheses generated by these analyses. If hypotheses are confirmed, results may allow interventionists to target ultra high-risk subpopulations with more intensive prevention programs that are uniquely tailored for each eating disorder, potentially improving the yield of prevention efforts. Copyright © 2018 Elsevier Ltd. All rights reserved.
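
    The elementary operation a classification tree repeats is finding the predictor threshold that minimizes child-node impurity; interactions emerge when a second split within one child isolates a high-risk subgroup. The sketch below demonstrates this on synthetic data with a made-up amplifying interaction (low BMI plus high body dissatisfaction); the variables, thresholds, and rates are all hypothetical, not the study's.

```python
import numpy as np

def gini(y):
    """Gini impurity of a binary outcome array."""
    if len(y) == 0:
        return 0.0
    p = y.mean()
    return 2.0 * p * (1.0 - p)

def best_split(x, y):
    """Threshold on one predictor minimizing weighted child Gini impurity,
    the step classification tree analysis applies recursively."""
    best_t, best_imp = None, np.inf
    for t in np.unique(x):
        left, right = y[x <= t], y[x > t]
        imp = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
        if imp < best_imp:
            best_t, best_imp = t, imp
    return best_t, best_imp

# Synthetic cohort: onset risk jumps only when low BMI co-occurs with
# high dissatisfaction -- an interaction a single split cannot express
# but two nested splits can.
rng = np.random.default_rng(3)
n = 2000
bmi = rng.normal(22, 3, n)
dissat = rng.uniform(0, 10, n)
p = 0.02 + 0.5 * ((bmi < 20) & (dissat > 6))
onset = rng.random(n) < p

t, _ = best_split(bmi, onset.astype(float))          # first split: BMI
low = bmi <= t
t2, _ = best_split(dissat[low], onset[low].astype(float))  # nested split
rate_hi = onset[low & (dissat > t2)].mean()
rate_lo = onset[low & (dissat <= t2)].mean()
print(rate_hi > rate_lo)   # dissatisfaction amplifies risk within low BMI
```

This is why the authors call the method "uniquely suited to detecting interactions": each branch conditions later splits on earlier ones, so amplification appears as a deep, high-rate leaf rather than a product term that must be pre-specified.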

  16. Tumor Microvessel Density as a Potential Predictive Marker for Bevacizumab Benefit: GOG-0218 Biomarker Analyses.

    Science.gov (United States)

    Bais, Carlos; Mueller, Barbara; Brady, Mark F; Mannel, Robert S; Burger, Robert A; Wei, Wei; Marien, Koen M; Kockx, Mark M; Husain, Amreen; Birrer, Michael J

    2017-11-01

    Combining bevacizumab with frontline chemotherapy statistically significantly improved progression-free survival (PFS) but not overall survival (OS) in the phase III GOG-0218 trial. Evaluation of candidate biomarkers was an exploratory objective. Patients with stage III (incompletely resected) or IV ovarian cancer were randomly assigned to receive six chemotherapy cycles with placebo or bevacizumab followed by single-agent placebo or bevacizumab. Five candidate tumor biomarkers were assessed by immunohistochemistry. The biomarker-evaluable population was categorized into high or low biomarker-expressing subgroups using median and quartile cutoffs. Associations between biomarker expression and efficacy were analyzed. All statistical tests were two-sided. The biomarker-evaluable population (n = 980) comprising 78.5% of the intent-to-treat population had representative baseline characteristics and efficacy outcomes. Neither prognostic nor predictive associations were seen for vascular endothelial growth factor (VEGF) receptor-2, neuropilin-1, or MET. Higher microvessel density (MVD; measured by CD31) showed predictive value for PFS (hazard ratio [HR] for bevacizumab vs placebo = 0.40, 95% confidence interval [CI] = 0.29 to 0.54, vs 0.80, 95% CI = 0.59 to 1.07, for high vs low MVD, respectively, P interaction = .003) and OS (HR = 0.67, 95% CI = 0.51 to 0.88, vs 1.10, 95% CI = 0.84 to 1.44, P interaction = .02). Tumor VEGF-A was not predictive for PFS but showed potential predictive value for OS using a third-quartile cutoff for high VEGF-A expression. These retrospective tumor biomarker analyses suggest a positive association between density of vascular endothelial cells (the predominant cell type expressing VEGF receptors) and tumor VEGF-A levels and magnitude of bevacizumab effect in ovarian cancer. The potential predictive value of MVD (CD31) and tumor VEGF-A is consistent with a mechanism of action driven by VEGF-A signaling blockade.

  17. Using stability analyses to predict dynamic behaviour of self-oscillating polymer gels

    Science.gov (United States)

    Palkar, Vaibhav; Srivastava, Gaurav; Kuksenok, Olga; Balazs, Anna C.; Dayal, Pratyush

    2015-03-01

    Use of chemo-mechanical transduction to produce locomotion is one of the significant characteristics of biological systems. Polymer gels intrinsically powered by the oscillatory Belousov-Zhabotinsky (BZ) reaction are biomimetic materials that exhibit rhythmic, self-sustained mechanical oscillations through chemo-mechanical transduction. Via simulations based on the 3D gel lattice spring model, we have successfully captured the dynamic behaviour of BZ gels. We have demonstrated that it is possible to direct the movement of BZ gels along complex paths, guiding them to bend, reorient and turn. From a mathematical perspective, the oscillations in BZ gels occur when the gel's steady states lose stability by virtue of Hopf bifurcations (HB). Through the use of stability analyses, we predict the conditions under which the gel switches from stationary to oscillatory mode and vice versa. In addition, we characterize the nature of the HB and also identify other types of bifurcations that play a critical role in governing the dynamic behaviour of BZ gels. We also successfully predict the frequency of the chemo-mechanical oscillations and characterize its dependence on the model parameters. Our approach not only allows us to establish optimal conditions for the motion of BZ gels, but can also be used to design other dynamical systems. We thank IIT Gandhinagar and DST-SERB for funding.
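
    The linear stability analysis behind detecting a Hopf bifurcation can be illustrated with the classic Brusselator oscillator as a stand-in for the BZ kinetics; the actual gel model couples reaction to mechanics, so this chemistry-only caricature is only a sketch of the eigenvalue test, not the authors' model.

```python
import numpy as np

# Brusselator: dx/dt = A - (B+1)x + x^2 y,  dy/dt = Bx - x^2 y.
# Steady state: (x*, y*) = (A, B/A). A Hopf bifurcation occurs where the
# Jacobian's complex eigenvalues cross the imaginary axis, at B = 1 + A^2;
# at onset the oscillation frequency is Im(lambda) = A.

def max_real_eig(A, B):
    """Largest real part of the Jacobian eigenvalues at the steady state."""
    J = np.array([[B - 1.0, A * A],
                  [-B, -A * A]])
    return np.linalg.eigvals(J).real.max()

A = 1.0                            # Hopf threshold: B = 1 + A^2 = 2
print(max_real_eig(A, 1.8) < 0)    # below threshold: steady state stable
print(max_real_eig(A, 2.2) > 0)    # above threshold: oscillations set in
```

Sweeping a control parameter and watching the sign of the largest real part is exactly the "stationary vs oscillatory" prediction the abstract describes, applied here to a two-variable toy system.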

  18. Predictive Engineering Tools for Injection-Molded Long-Carbon-Fiber Thermoplastic Composites: Weight and Cost Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen, Ba Nghiep [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fifield, Leonard S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Gandhi, Umesh N. [Toyota Research Inst. North America, Ann Arbor, MI (United States); Mori, Steven [MAGNA Exteriors and Interiors Corporation, Aurora, ON (Canada); Wollan, Eric J. [PlastiComp, Inc., Winona, MN (United States)

    2016-08-01

    This project proposed to integrate, optimize and validate the fiber orientation and length distribution models previously developed and implemented in the Autodesk Simulation Moldflow Insight (ASMI) package for injection-molded long-carbon-fiber (LCF) thermoplastic composites into a cohesive prediction capability. The current effort focused on rendering the developed models more robust and efficient for automotive industry part design to enable weight savings and cost reduction. The project goal has been achieved by optimizing the developed models, improving and integrating their implementations in ASMI, and validating them for a complex 3D LCF thermoplastic automotive part (Figure 1). Both polypropylene (PP) and polyamide 6,6 (PA66) were used as resin matrices. After validating ASMI predictions of fiber orientation and fiber length for this complex part against the corresponding measured data, PNNL, in collaboration with Toyota and Magna, developed a method using the predictive engineering tool to assess LCF/PA66 complex part design in terms of stiffness performance. Structural three-point bending analyses of the complex part and of similar parts in steel were then performed for this purpose, and the team then demonstrated the use of stiffness-based complex part design assessment to evaluate weight savings relative to the body system target (≥ 35%) set in Table 2 of DE-FOA-0000648 (AOI #1). In addition, starting from the part-to-part analysis, the PE tools enabled an estimated weight reduction for the vehicle body system using 50 wt% LCF/PA66 parts relative to the current steel system. Also, from this analysis an estimate of the manufacturing cost, including the material cost, for making the equivalent part in steel was determined and compared to the costs for making the LCF/PA66 part to determine the cost per “saved” pound.

  19. Considerations when loading spinal finite element models with predicted muscle forces from inverse static analyses.

    Science.gov (United States)

    Zhu, Rui; Zander, Thomas; Dreischarf, Marcel; Duda, Georg N; Rohlmann, Antonius; Schmidt, Hendrik

    2013-04-26

    Mostly, simplified loads have been used in biomechanical finite element (FE) studies of the spine because of a lack of data on muscular physiological loading. Inverse static (IS) models allow the prediction of muscle forces for predefined postures. A combination of both mechanical approaches - FE and IS - appears to allow more realistic modeling. However, it is unknown what deviations are to be expected when muscle forces calculated for models with rigid vertebrae and fixed centers of rotation, as generally found in IS models, are applied to an FE model with elastic vertebrae and discs. The aim of this study was to determine the effects of these disagreements. Muscle forces were estimated for 20° flexion and 10° extension in an IS model and transferred to an FE model. The effects of the elasticity of bony structures (rigid vs. elastic) and the definition of the center of rotation (fixed vs. non-fixed) were quantified using the deviation between the actual intervertebral rotation (IVR) of the FE model and the targeted IVR from the IS model. For extension, the elasticity of the vertebrae had only a minor effect on IVRs, whereas a non-fixed center of rotation increased the IVR deviation on average by 0.5° per segment. For flexion, a combination of the two parameters increased the IVR deviation on average by 1° per segment. When loading FE models with predicted muscle forces from IS analyses, the main limitations of the IS model - the rigidity of the segments and the fixed centers of rotation - must be considered. Copyright © 2013 Elsevier Ltd. All rights reserved.

  20. Predicting behavior change from persuasive messages using neural representational similarity and social network analyses.

    Science.gov (United States)

    Pegors, Teresa K; Tompson, Steven; O'Donnell, Matthew Brook; Falk, Emily B

    2017-08-15

    Neural activity in medial prefrontal cortex (MPFC), identified as engaging in self-related processing, predicts later health behavior change. However, it is unknown to what extent individual differences in neural representation of content and lived experience influence this brain-behavior relationship. We examined whether the strength of content-specific representations during persuasive messaging relates to later behavior change, and whether these relationships change as a function of individuals' social network composition. In our study, smokers viewed anti-smoking messages while undergoing fMRI and we measured changes in their smoking behavior one month later. Using representational similarity analyses, we found that the degree to which message content (i.e. health, social, or valence information) was represented in a self-related processing MPFC region was associated with later smoking behavior, with increased representations of negatively valenced (risk) information corresponding to greater message-consistent behavior change. Furthermore, the relationship between representations and behavior change depended on social network composition: smokers who had proportionally fewer smokers in their network showed increases in smoking behavior when social or health content was strongly represented in MPFC, whereas message-consistent behavior (i.e., less smoking) was more likely for those with proportionally more smokers in their social network who represented social or health consequences more strongly. These results highlight the dynamic relationship between representations in MPFC and key outcomes such as health behavior change; a complete understanding of the role of MPFC in motivation and action should take into account individual differences in neural representation of stimulus attributes and social context variables such as social network composition. Copyright © 2017 Elsevier Inc. All rights reserved.
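
    The representational similarity computation at the heart of this kind of analysis reduces to correlating condition-wise activity patterns. The sketch below builds a representational dissimilarity matrix (RDM) from synthetic "voxel" patterns; the four conditions and their labels are hypothetical stand-ins for the message categories in the study.

```python
import numpy as np

def rdm(patterns):
    """Representational dissimilarity matrix: 1 - Pearson correlation
    between every pair of condition patterns (rows = conditions,
    columns = voxels)."""
    return 1.0 - np.corrcoef(patterns)

# Hypothetical data: 4 message conditions x 50 voxels. Two "health"
# messages share a common pattern; the others are unrelated.
rng = np.random.default_rng(4)
base = rng.standard_normal(50)
patterns = np.stack([
    base + 0.2 * rng.standard_normal(50),   # health message A
    base + 0.2 * rng.standard_normal(50),   # health message B (similar)
    rng.standard_normal(50),                # social message
    rng.standard_normal(50),                # valence control
])
d = rdm(patterns)
# The two health messages evoke more similar patterns (lower
# dissimilarity) than a health/social pair.
print(d[0, 1] < d[0, 2])
```

In the study, the strength with which a content category is "represented" in MPFC is read off from structure like this, computed within a region of interest rather than on synthetic vectors.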

  1. A20 Inhibits β-Cell Apoptosis by Multiple Mechanisms and Predicts Residual β-Cell Function in Type 1 Diabetes

    DEFF Research Database (Denmark)

    Fukaya, Makiko; Brorsson, Caroline A; Meyerovich, Kira

    2016-01-01

    signaling via v-akt murine thymoma viral oncogene homolog (Akt) and consequently inhibition of the intrinsic apoptotic pathway. Finally, in a cohort of T1D children, we observed that the risk allele of the rs2327832 single nucleotide polymorphism of TNFAIP3 predicted lower C-peptide and higher hemoglobin A1...

  2. CLPX-Model: Local Analysis and Prediction System: 4-D Atmospheric Analyses

    Data.gov (United States)

    National Aeronautics and Space Administration — The Local Analysis and Prediction System (LAPS), run by the NOAA's Forecast Systems Laboratory (FSL), combines numerous observed meteorological data sets into a...

  3. CLPX-Model: Local Analysis and Prediction System: 4-D Atmospheric Analyses, Version 1

    Data.gov (United States)

    National Aeronautics and Space Administration — The Local Analysis and Prediction System (LAPS), run by the NOAA's Forecast Systems Laboratory (FSL), combines numerous observed meteorological data sets into a...

  4. Analyses of deep mammalian sequence alignments and constraint predictions for 1% of the human genome

    OpenAIRE

    Margulies, Elliott H.; Cooper, Gregory M.; Asimenos, George; Thomas, Daryl J.; Dewey, Colin N.; Siepel, Adam; Birney, Ewan; Keefe, Damian; Schwartz, Ariel S.; Hou, Minmei; Taylor, James; Nikolaev, Sergey; Montoya-Burgos, Juan I.; Löytynoja, Ari; Whelan, Simon

    2007-01-01

    A key component of the ongoing ENCODE project involves rigorous comparative sequence analyses for the initially targeted 1% of the human genome. Here, we present orthologous sequence generation, alignment, and evolutionary constraint analyses of 23 mammalian species for all ENCODE targets. Alignments were generated using four different methods; comparisons of these methods reveal large-scale consistency but substantial differences in terms of small genomic rearrangements, sensitivity (sequenc...

  5. Analyses and predictions of the thermodynamic properties and phase diagrams of silicate systems

    Energy Technology Data Exchange (ETDEWEB)

    Blander, M. [Argonne National Lab., IL (United States); Pelton, A.; Eriksson, G. [Ecole Polytechnique, Montreal, PQ (Canada). Dept. of Metallurgy and Materials Engineering

    1992-07-01

    Molten silicates are ordered solutions which cannot be well represented by the usual polynomial representation of deviations from ideal solution behavior (i.e., excess free energies of mixing). An adaptation of quasichemical theory which is capable of describing the properties of ordered solutions represents the measured properties of binary silicates over broad ranges of composition and temperature. For simple silicates such as the MgO-FeO-SiO{sub 2} ternary system, in which silica is the only acid component, a combining rule generally leads to good predictions of ternary solutions from those of the binaries. In basic solutions, these predictions are consistent with those of the conformal ionic solution theory. Our results indicate that our approach could provide a powerful tool for representing and predicting the properties of multicomponent molten silicates.

  6. Analyses and predictions of the thermodynamic properties and phase diagrams of silicate systems

    Energy Technology Data Exchange (ETDEWEB)

    Blander, M. (Argonne National Lab., IL (United States)); Pelton, A.; Eriksson, G. (Ecole Polytechnique, Montreal, PQ (Canada). Dept. of Metallurgy and Materials Engineering)

    1992-01-01

    Molten silicates are ordered solutions which cannot be well represented by the usual polynomial representation of deviations from ideal solution behavior (i.e., excess free energies of mixing). An adaptation of quasichemical theory which is capable of describing the properties of ordered solutions represents the measured properties of binary silicates over broad ranges of composition and temperature. For simple silicates such as the MgO-FeO-SiO{sub 2} ternary system, in which silica is the only acid component, a combining rule generally leads to good predictions of ternary solutions from those of the binaries. In basic solutions, these predictions are consistent with those of the conformal ionic solution theory. Our results indicate that our approach could provide a powerful tool for representing and predicting the properties of multicomponent molten silicates.

  7. Analysing the Relevance of Experience Partitions to the Prediction of Players’ Self-Reports of Affect

    DEFF Research Database (Denmark)

    Martínez, Héctor Pérez; Yannakakis, Georgios N.

    2011-01-01

    A common practice in modeling affect from physiological signals consists of reducing the signals to a set of statistical features that feed predictors of self-reported emotions. This paper analyses the impact of various time-windows, used for the extraction of physiological features, on the accuracy...

  8. Combining Results from Distinct MicroRNA Target Prediction Tools Enhances the Performance of Analyses.

    Science.gov (United States)

    Oliveira, Arthur C; Bovolenta, Luiz A; Nachtigall, Pedro G; Herkenhoff, Marcos E; Lemke, Ney; Pinhal, Danillo

    2017-01-01

    Target prediction is generally the first step toward recognition of bona fide microRNA (miRNA)-target interactions in living cells. Several target prediction tools are now available, which use distinct criteria and stringency to provide the best set of candidate targets for a single miRNA or a subset of miRNAs. However, there are many false-negative predictions, and consensus about the optimum strategy to select and use the output information provided by the target prediction tools is lacking. We compared the performance of four tools cited in the literature: TargetScan (TS), miRanda-mirSVR (MR), Pita, and RNA22 (R22). We determined the most effective approach for analyzing target prediction data (individual, union, or intersection) by calculating the sensitivity, specificity, precision, and correlation of these approaches using 10 miRNAs (miR-1-3p, miR-17-5p, miR-21-5p, miR-24-3p, miR-29a-3p, miR-34a-5p, miR-124-3p, miR-125b-5p, miR-145-5p, and miR-155-5p) and 1,400 genes (700 validated and 700 non-validated) as targets of these miRNAs. The four tools provided a subset of high-quality predictions and returned few false-positive predictions; however, they could not identify several known true targets. We demonstrate that the union of TS/MR and of TS/MR/R22 enhanced the quality of in silico prediction analysis of miRNA targets. We conclude that the union rather than the intersection of the aforementioned tools is the best strategy for maximizing performance while minimizing the loss of time and resources in subsequent in vivo and in vitro experiments for functional validation of miRNA-target interactions.
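The union-versus-intersection comparison at the heart of this study reduces to set operations plus the standard confusion-matrix metrics. A minimal sketch follows; the tool names match the abstract, but the gene identifiers and set contents are invented for illustration.

```python
# Combining target-prediction tool outputs by union vs. intersection.
# Gene sets below are invented toy data, not the paper's benchmark.

def metrics(predicted, validated, universe):
    """Sensitivity, specificity, and precision of a predicted gene set."""
    tp = len(predicted & validated)
    fp = len(predicted - validated)
    fn = len(validated - predicted)
    tn = len(universe - predicted - validated)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    precision = tp / (tp + fp) if predicted else 0.0
    return sensitivity, specificity, precision

universe = {f"g{i}" for i in range(1, 11)}   # 10 candidate genes
validated = {"g1", "g2", "g3", "g4", "g5"}   # experimentally confirmed targets
ts = {"g1", "g2", "g6"}                      # TargetScan-style output (toy)
mr = {"g2", "g3", "g7"}                      # miRanda-mirSVR-style output (toy)

union = ts | mr          # tends to maximize sensitivity
intersection = ts & mr   # tends to maximize precision

sens_u, spec_u, prec_u = metrics(union, validated, universe)
sens_i, spec_i, prec_i = metrics(intersection, validated, universe)
```

Even in this toy setting the trade-off the paper describes appears: the union recovers more true targets, while the intersection is more precise.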

  9. Simulation, prediction, and genetic analyses of daily methane emissions in dairy cattle.

    Science.gov (United States)

    Yin, T; Pinent, T; Brügemann, K; Simianer, H; König, S

    2015-08-01

    This study presents an approach combining phenotypes from novel traits, deterministic equations from cattle nutrition, and stochastic simulation techniques from animal breeding to generate test-day methane emissions (MEm) of dairy cows. Data included test-day production traits (milk yield, fat percentage, protein percentage, milk urea nitrogen), conformation traits (wither height, hip width, body condition score), female fertility traits (days open, calving interval, stillbirth), and health traits (clinical mastitis) from 961 first-lactation Brown Swiss cows kept on 41 low-input farms in Switzerland. Test-day MEm were predicted based on the traits from the current data set and 2 deterministic prediction equations, resulting in the traits labeled MEm1 and MEm2. Stochastic simulations were used to assign individual concentrate intake depending on farm-type specifications (a requirement when calculating MEm2). Genetic parameters for MEm1 and MEm2 were estimated using random regression models. Predicted MEm had moderate heritabilities over lactation, ranging from 0.15 to 0.37, with the highest heritabilities around DIM 100. Genetic correlations between MEm1 and MEm2 ranged between 0.91 and 0.94. Antagonistic genetic correlations in the range from 0.70 to 0.92 were found for the associations between MEm2 and milk yield. Genetic correlations between MEm with days open and with calving interval increased from 0.10 at the beginning to 0.90 at the end of lactation. Genetic relationships between MEm2 and stillbirth were negative (0 to -0.24) from the beginning to the peak phase of lactation. Positive genetic relationships in the range from 0.02 to 0.49 were found between MEm2 and clinical mastitis. Interpretation of genetic (co)variance components should also consider the limitations of data generated by prediction equations. Prediction functions only describe the part of MEm that is dependent on the factors and effects included in the function. With high ...

  10. FUN3D Analyses in Support of the Second Aeroelastic Prediction Workshop

    Science.gov (United States)

    Chwalowski, Pawel; Heeg, Jennifer

    2016-01-01

    This paper presents the computational aeroelastic results generated in support of the second Aeroelastic Prediction Workshop for the Benchmark Supercritical Wing (BSCW) configurations and compares them to the experimental data. The computational results are obtained using FUN3D, an unstructured-grid Reynolds-Averaged Navier-Stokes solver developed at NASA Langley Research Center. The analysis results include aerodynamic coefficients and surface pressures obtained for steady-state, static aeroelastic equilibrium, and unsteady flow due to a pitching wing, as well as flutter prediction. Frequency response functions of the pressure coefficients with respect to the angular displacement are computed and compared with the experimental data. The effects of spatial and temporal convergence on the computational results are examined.

  11. Aeromechanics and Aeroacoustics Predictions of the Boeing-SMART Rotor Using Coupled-CFD/CSD Analyses

    Science.gov (United States)

    Bain, Jeremy; Sim, Ben W.; Sankar, Lakshmi; Brentner, Ken

    2010-01-01

    This paper will highlight helicopter aeromechanics and aeroacoustics prediction capabilities developed by Georgia Institute of Technology, the Pennsylvania State University, and Northern Arizona University under the Helicopter Quieting Program (HQP) sponsored by the Tactical Technology Office of the Defense Advanced Research Projects Agency (DARPA). First initiated in 2004, the goal of the HQP was to develop high fidelity, state-of-the-art computational tools for designing advanced helicopter rotors with reduced acoustic perceptibility and enhanced performance. A critical step towards achieving this objective is the development of rotorcraft prediction codes capable of assessing a wide range of helicopter configurations and operations for future rotorcraft designs. This includes novel next-generation rotor systems that incorporate innovative passive and/or active elements to meet future challenging military performance and survivability goals.

  12. Analysing the Relevance of Experience Partitions to the Prediction of Players’ Self-Reports of Affect

    DEFF Research Database (Denmark)

    Martínez, Héctor Pérez; Yannakakis, Georgios N.

    2011-01-01

    A common practice in modeling affect from physiological signals consists of reducing the signals to a set of statistical features that feed predictors of self-reported emotions. This paper analyses the impact of various time-windows, used for the extraction of physiological features, on the accuracy... anxiety and frustration appear not to be localized in a specific time interval but rather dependent on particular game stimuli.

  13. Predicting prostate cancer: analysing the clinical efficacy of prostate cancer risk calculators in a referral population.

    Science.gov (United States)

    Foley, R W; Lundon, D J; Murphy, K; Murphy, T B; Galvin, D J; Watson, R W G

    2015-09-01

    The decision to proceed to biopsy for the diagnosis of prostate cancer in clinical practice is a difficult one. Prostate cancer risk calculators allow for a systematic approach to the use of patient information to predict a patient's likelihood of prostate cancer. In this paper, we validate the two leading prostate cancer risk calculators, the Prostate Cancer Prevention Trial (PCPT) and the European Randomized Study of Screening for Prostate Cancer (ERSPC), in an Irish population. Data were collected for 337 men referred to one tertiary referral center in Ireland. Calibration analysis, ROC analysis, and decision curve analysis were undertaken to ascertain the performance of the PCPT and ERSPC risk calculators in this cohort. Of 337 consecutive biopsies, cancer was subsequently diagnosed in 146 men (43%), 98 (67%) of which were high grade. The AUC for the PCPT and ERSPC risk calculators were 0.68 and 0.66, respectively, for the prediction of prostate cancer. Each calculator was sufficiently calibrated in this cohort. Decision curve analysis demonstrated a net benefit from the use of the PCPT and ERSPC risk calculators in the diagnosis of prostate cancer. The PCPT and ERSPC risk calculators achieve a statistically significant prediction of prostate cancer in this Irish population. This study provides external validation for these calculators, and therefore these tools can be used to aid clinical decision making.
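Decision curve analysis rests on the net-benefit formula NB = TP/n - (FP/n) * pt/(1 - pt), where pt is the threshold probability at which a patient would opt for biopsy. A toy sketch with invented risk estimates and outcomes (not the study's data):

```python
# Net-benefit computation for decision curve analysis.
# Risk estimates, outcomes, and the threshold are invented toy values.

def net_benefit(probs, outcomes, pt):
    """Net benefit of biopsying everyone whose predicted risk is >= pt."""
    n = len(probs)
    tp = sum(1 for p, y in zip(probs, outcomes) if p >= pt and y == 1)
    fp = sum(1 for p, y in zip(probs, outcomes) if p >= pt and y == 0)
    return tp / n - (fp / n) * pt / (1 - pt)

probs = [0.8, 0.6, 0.3, 0.2, 0.7, 0.1]   # calculator risk estimates (toy)
outcomes = [1, 1, 0, 0, 1, 0]            # biopsy-confirmed cancer (toy)

nb_model = net_benefit(probs, outcomes, pt=0.25)
nb_all = net_benefit([1.0] * 6, outcomes, pt=0.25)  # "biopsy everyone" strategy
```

A calculator shows clinical value at a threshold when its net benefit exceeds both the biopsy-everyone and biopsy-no-one (NB = 0) strategies, which is what the abstract reports for PCPT and ERSPC.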

  14. Accuracy of Fall Prediction in Parkinson Disease: Six-Month and 12-Month Prospective Analyses

    Directory of Open Access Journals (Sweden)

    Ryan P. Duncan

    2012-01-01

    Introduction. We analyzed the ability of four balance assessments to predict falls in people with Parkinson Disease (PD) prospectively over six and 12 months. Materials and Methods. The BESTest, Mini-BESTest, Functional Gait Assessment (FGA), and Berg Balance Scale (BBS) were administered to 80 participants with idiopathic PD at baseline. Falls were then tracked for 12 months. The ability of each test to predict falls at six and 12 months was assessed using ROC curves and likelihood ratios (LR). Results. Twenty-seven percent of the sample had fallen at six months, and 32% had fallen at 12 months. At six months, areas under the ROC curve (AUC) for the tests ranged from 0.8 (FGA) to 0.89 (BESTest), with LR+ of 3.4 (FGA) to 5.8 (BESTest). At 12 months, AUCs ranged from 0.68 (BESTest, BBS) to 0.77 (Mini-BESTest), with LR+ of 1.8 (BESTest) to 2.4 (BBS, FGA). Discussion. The various balance tests were effective in predicting falls at six months; all were relatively ineffective at 12 months. Conclusion. This pilot study suggests that people with PD should be assessed biannually for fall risk.
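The two statistics reported above have simple closed forms: LR+ = sensitivity / (1 - specificity), and the AUC equals the probability that a randomly chosen faller outscores a randomly chosen non-faller. A brief numeric illustration with invented scores (not the study's data):

```python
# Positive likelihood ratio and rank-based AUC.
# The score lists below are invented toy values.

def lr_positive(sensitivity, specificity):
    """Positive likelihood ratio: how much a positive test raises the odds."""
    return sensitivity / (1 - specificity)

def auc(scores_pos, scores_neg):
    """AUC as the probability that a faller's balance-risk score exceeds a
    non-faller's (ties count half)."""
    wins = sum((p > n) + 0.5 * (p == n) for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

fallers = [0.9, 0.8, 0.7]       # risk scores of participants who fell (toy)
non_fallers = [0.6, 0.5, 0.8]   # risk scores of participants who did not (toy)

area = auc(fallers, non_fallers)
lr = lr_positive(0.9, 0.7)
```

An AUC near 0.5 means the score ranks fallers no better than chance, which is why the 12-month values around 0.68-0.77 indicate weaker prediction than the six-month values.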

  15. Analyses of deep mammalian sequence alignments and constraint predictions for 1% of the human genome.

    Science.gov (United States)

    Margulies, Elliott H; Cooper, Gregory M; Asimenos, George; Thomas, Daryl J; Dewey, Colin N; Siepel, Adam; Birney, Ewan; Keefe, Damian; Schwartz, Ariel S; Hou, Minmei; Taylor, James; Nikolaev, Sergey; Montoya-Burgos, Juan I; Löytynoja, Ari; Whelan, Simon; Pardi, Fabio; Massingham, Tim; Brown, James B; Bickel, Peter; Holmes, Ian; Mullikin, James C; Ureta-Vidal, Abel; Paten, Benedict; Stone, Eric A; Rosenbloom, Kate R; Kent, W James; Bouffard, Gerard G; Guan, Xiaobin; Hansen, Nancy F; Idol, Jacquelyn R; Maduro, Valerie V B; Maskeri, Baishali; McDowell, Jennifer C; Park, Morgan; Thomas, Pamela J; Young, Alice C; Blakesley, Robert W; Muzny, Donna M; Sodergren, Erica; Wheeler, David A; Worley, Kim C; Jiang, Huaiyang; Weinstock, George M; Gibbs, Richard A; Graves, Tina; Fulton, Robert; Mardis, Elaine R; Wilson, Richard K; Clamp, Michele; Cuff, James; Gnerre, Sante; Jaffe, David B; Chang, Jean L; Lindblad-Toh, Kerstin; Lander, Eric S; Hinrichs, Angie; Trumbower, Heather; Clawson, Hiram; Zweig, Ann; Kuhn, Robert M; Barber, Galt; Harte, Rachel; Karolchik, Donna; Field, Matthew A; Moore, Richard A; Matthewson, Carrie A; Schein, Jacqueline E; Marra, Marco A; Antonarakis, Stylianos E; Batzoglou, Serafim; Goldman, Nick; Hardison, Ross; Haussler, David; Miller, Webb; Pachter, Lior; Green, Eric D; Sidow, Arend

    2007-06-01

    A key component of the ongoing ENCODE project involves rigorous comparative sequence analyses for the initially targeted 1% of the human genome. Here, we present orthologous sequence generation, alignment, and evolutionary constraint analyses of 23 mammalian species for all ENCODE targets. Alignments were generated using four different methods; comparisons of these methods reveal large-scale consistency but substantial differences in terms of small genomic rearrangements, sensitivity (sequence coverage), and specificity (alignment accuracy). We describe the quantitative and qualitative trade-offs concomitant with alignment method choice and the levels of technical error that need to be accounted for in applications that require multisequence alignments. Using the generated alignments, we identified constrained regions using three different methods. While the different constraint-detecting methods are in general agreement, there are important discrepancies relating to both the underlying alignments and the specific algorithms. However, by integrating the results across the alignments and constraint-detecting methods, we produced constraint annotations that were found to be robust based on multiple independent measures. Analyses of these annotations illustrate that most classes of experimentally annotated functional elements are enriched for constrained sequences; however, large portions of each class (with the exception of protein-coding sequences) do not overlap constrained regions. The latter elements might not be under primary sequence constraint, might not be constrained across all mammals, or might have expendable molecular functions. Conversely, 40% of the constrained sequences do not overlap any of the functional elements that have been experimentally identified. Together, these findings demonstrate and quantify how many genomic functional elements await basic molecular characterization.

  16. The PRO-ACT database: design, initial analyses, and predictive features.

    Science.gov (United States)

    Atassi, Nazem; Berry, James; Shui, Amy; Zach, Neta; Sherman, Alexander; Sinani, Ervin; Walker, Jason; Katsovskiy, Igor; Schoenfeld, David; Cudkowicz, Merit; Leitner, Melanie

    2014-11-04

    To pool data from completed amyotrophic lateral sclerosis (ALS) clinical trials and create an open-access resource that enables greater understanding of the phenotype and biology of ALS. Clinical trials data were pooled from 16 completed phase II/III ALS clinical trials and one observational study. Over 8 million de-identified longitudinally collected data points from over 8,600 individuals with ALS were standardized across trials and merged to create the Pooled Resource Open-Access ALS Clinical Trials (PRO-ACT) database. This database includes demographics, family histories, and longitudinal clinical and laboratory data. Mixed effects models were used to describe the rate of disease progression measured by the Revised ALS Functional Rating Scale (ALSFRS-R) and vital capacity (VC). Cox regression models were used to describe survival data. Implementing Bonferroni correction, the critical p value for 15 different tests was p = 0.003. The ALSFRS-R rate of decline was 1.02 (±2.3) points per month and the VC rate of decline was 2.24% of predicted (±6.9) per month. Higher levels of uric acid at trial entry were predictive of a slower drop in ALSFRS-R (p = 0.01) and VC (p < 0.0001), and longer survival (p = 0.01). Finally, higher body mass index (BMI) at baseline was associated with longer survival (p < 0.0001). The PRO-ACT database is the largest publicly available repository of merged ALS clinical trials data. We report that baseline levels of creatinine and uric acid, as well as baseline BMI, are strong predictors of disease progression and survival. © 2014 American Academy of Neurology.

  17. Prediction of venous thromboembolism using semantic and sentiment analyses of clinical narratives.

    Science.gov (United States)

    Sabra, Susan; Mahmood Malik, Khalid; Alobaidi, Mazen

    2018-03-01

    Venous thromboembolism (VTE) is the third most common cardiovascular disorder. It affects people of both genders at ages as young as 20 years. The increased number of VTE cases, with a high fatality rate of 25% at first occurrence, makes preventive measures essential. Clinical narratives are a rich source of knowledge and should be included in the diagnosis and treatment processes, as they may contain critical information on risk factors. It is very important to make such narrative blocks of information usable for searching, health analytics, and decision-making. This paper proposes a Semantic Extraction and Sentiment Assessment of Risk Factors (SESARF) framework. Unlike traditional machine-learning approaches, SESARF, which consists of two main algorithms, namely ExtractRiskFactor and FindSeverity, prepares a feature vector as the input to a support vector machine (SVM) classifier to make a diagnosis. SESARF matches and maps the concepts of VTE risk factors and finds adjectives and adverbs that reflect their levels of severity. SESARF uses a semantic- and sentiment-based approach to analyze clinical narratives of electronic health records (EHR) and then predict a diagnosis of VTE. We use a dataset of 150 clinical narratives, 80% of which are used to train our SVM prediction classifier, with the remaining 20% used for testing. Semantic extraction and sentiment analysis results yielded precisions of 81% and 70%, respectively. Using the support vector machine, prediction of patients with VTE yielded precision and recall values of 54.5% and 85.7%, respectively. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.

  18. When Bitcoin encounters information in an online forum: Using text mining to analyse user opinions and predict value fluctuation.

    Directory of Open Access Journals (Sweden)

    Young Bin Kim

    Bitcoin is an online currency that is used worldwide to make online payments. It has consequently become an investment vehicle in itself and is traded in a way similar to other open currencies. The ability to predict the price fluctuation of Bitcoin would therefore facilitate future investment and payment decisions. In order to predict the price fluctuation of Bitcoin, we analyse the comments posted in the Bitcoin online forum. Unlike most research on Bitcoin-related online forums, which is limited to simple sentiment analysis and does not pay sufficient attention to note-worthy user comments, our approach involved extracting keywords from Bitcoin-related user comments posted on the online forum with the aim of analytically predicting the price and extent of transaction fluctuation of the currency. The effectiveness of the proposed method is validated based on Bitcoin online forum data ranging over a period of 2.8 years from December 2013 to September 2016.

  19. When Bitcoin encounters information in an online forum: Using text mining to analyse user opinions and predict value fluctuation.

    Science.gov (United States)

    Kim, Young Bin; Lee, Jurim; Park, Nuri; Choo, Jaegul; Kim, Jong-Hyun; Kim, Chang Hun

    2017-01-01

    Bitcoin is an online currency that is used worldwide to make online payments. It has consequently become an investment vehicle in itself and is traded in a way similar to other open currencies. The ability to predict the price fluctuation of Bitcoin would therefore facilitate future investment and payment decisions. In order to predict the price fluctuation of Bitcoin, we analyse the comments posted in the Bitcoin online forum. Unlike most research on Bitcoin-related online forums, which is limited to simple sentiment analysis and does not pay sufficient attention to note-worthy user comments, our approach involved extracting keywords from Bitcoin-related user comments posted on the online forum with the aim of analytically predicting the price and extent of transaction fluctuation of the currency. The effectiveness of the proposed method is validated based on Bitcoin online forum data ranging over a period of 2.8 years from December 2013 to September 2016.
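A bare-bones sketch of the keyword idea: count a keyword's daily frequency in forum comments and compare "buzzy" days against next-day price direction. The paper's actual feature extraction and validation are far more elaborate; the comments, keyword, and prices below are all invented.

```python
# Toy keyword-frequency vs. price-direction comparison (all data invented).

def keyword_counts(daily_comments, keyword):
    """Occurrences of `keyword` across each day's list of comments."""
    return [sum(c.lower().count(keyword) for c in day) for day in daily_comments]

def same_direction_rate(counts, prices):
    """Fraction of days where at-or-above-median keyword buzz coincides with a
    next-day price rise (and below-median buzz with a non-rise)."""
    median = sorted(counts)[len(counts) // 2]
    hits = total = 0
    for i in range(len(prices) - 1):
        total += 1
        rise = prices[i + 1] > prices[i]
        buzzy = counts[i] >= median
        hits += (rise == buzzy)
    return hits / total

daily_comments = [["buy buy the dip", "hodl strong"],
                  ["time to sell"],
                  ["buy now", "buy"]]
prices = [100, 110, 105, 112]  # daily closing prices (toy)

rate = same_direction_rate(keyword_counts(daily_comments, "buy"), prices)
```

A rate well above 0.5 on held-out data would suggest the keyword carries directional information, which is the kind of association the paper evaluates at scale.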

  20. Prediction and validation of gene-disease associations using methods inspired by social network analyses.

    Directory of Open Access Journals (Sweden)

    U Martin Singh-Blom

    Correctly identifying associations of genes with diseases has long been a goal in biology. With the emergence of large-scale gene-phenotype association datasets in biology, we can leverage statistical and machine learning methods to help us achieve this goal. In this paper, we present two methods for predicting gene-disease associations based on functional gene associations and gene-phenotype associations in model organisms. The first method, the Katz measure, is motivated by its success in social network link prediction, and is very closely related to some of the recent methods proposed for gene-disease association inference. The second method, called Catapult (Combining dATa Across species using Positive-Unlabeled Learning Techniques), is a supervised machine learning method that uses a biased support vector machine where the features are derived from walks in a heterogeneous gene-trait network. We study the performance of the proposed methods and related state-of-the-art methods using two different evaluation strategies, on two distinct data sets, namely OMIM phenotypes and drug-target interactions. Finally, by measuring the performance of the methods using two different evaluation strategies, we show that even though both methods perform very well, the Katz measure is better at identifying associations between traits and poorly studied genes, whereas Catapult is better suited to correctly identifying gene-trait associations overall [corrected].
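The Katz measure sums over walks of increasing length between two nodes, down-weighting longer walks: S = sum over k of beta^k * (A^k). A toy sketch on an invented three-node gene-phenotype graph follows; Catapult's biased SVM is not reproduced here.

```python
# Katz-measure link prediction on a tiny invented gene-phenotype graph.

def mat_mul(a, b):
    """Multiply two square matrices given as nested lists."""
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def katz(adj, beta=0.1, max_len=4):
    """Katz score matrix: sum over walk lengths k of beta**k * (A**k)."""
    n = len(adj)
    score = [[0.0] * n for _ in range(n)]
    power = [[float(i == j) for j in range(n)] for i in range(n)]  # A**0 = I
    for k in range(1, max_len + 1):
        power = mat_mul(power, adj)
        for i in range(n):
            for j in range(n):
                score[i][j] += beta ** k * power[i][j]
    return score

# Nodes: 0 = gene A, 1 = gene B, 2 = phenotype; edges A-B and B-phenotype.
adj = [[0, 1, 0],
       [1, 0, 1],
       [0, 1, 0]]
s = katz(adj)
```

The score s[0][2] is nonzero even though gene A and the phenotype share no edge, because two- and four-step walks connect them through gene B; this is how the measure surfaces candidate gene-disease links.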

  1. Prediction of Epileptic Seizure by Analysing Time Series EEG Signal Using k-NN Classifier

    Directory of Open Access Journals (Sweden)

    Md. Kamrul Hasan

    2017-01-01

    Electroencephalographic signal is a representative signal that contains information about brain activity, which is used for the detection of epilepsy, since epileptic seizures are caused by a disturbance in the electrophysiological activity of the brain. The prediction of epileptic seizure usually requires a detailed and experienced analysis of EEG. In this paper, we have introduced a statistical analysis of EEG signal that is capable of recognizing epileptic seizure with a high degree of accuracy and helps to provide automatic detection of epileptic seizure for different ages of epilepsy. To accomplish the target research, we extract various epileptic features, namely approximate entropy (ApEn), standard deviation (SD), standard error (SE), modified mean absolute value (MMAV), roll-off (R), and zero crossing (ZC), from the epileptic signal. The k-nearest neighbours (k-NN) algorithm is used for the classification of epilepsy; then regression analysis is used for the prediction of the epilepsy level at different ages of the patients. Using the statistical parameters and regression analysis, a prototype mathematical model is proposed which helps to find the epileptic randomness with respect to the age of different subjects. The accuracy of this prototype equation depends on proper analysis of the dynamic information from the epileptic EEG.
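The classification step can be illustrated with a bare-bones k-NN over feature vectors such as standard deviation and zero-crossing counts; the feature values and labels below are invented, not taken from the study's EEG data.

```python
# Minimal k-NN classification of EEG-style feature vectors (toy data).

def knn_predict(train, query, k=3):
    """train: list of (feature_vector, label); majority vote among k nearest."""
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    nearest = sorted(train, key=lambda item: dist(item[0], query))[:k]
    votes = {}
    for _, label in nearest:
        votes[label] = votes.get(label, 0) + 1
    return max(votes, key=votes.get)

# (standard deviation, zero-crossing count) per EEG segment, with labels (toy).
train = [((0.5, 12), "normal"),  ((0.6, 10), "normal"),  ((0.4, 14), "normal"),
         ((2.1, 40), "seizure"), ((2.4, 38), "seizure"), ((1.9, 44), "seizure")]

label = knn_predict(train, (2.0, 41))
```

In practice the features would be standardized first, since a raw zero-crossing count dominates the Euclidean distance over a small standard-deviation feature.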

  2. Phylogenomic analyses predict sistergroup relationship of nucleariids and Fungi and paraphyly of zygomycetes with significant support

    Directory of Open Access Journals (Sweden)

    Steenkamp Emma

    2009-01-01

    Background: Resolving the evolutionary relationships among Fungi remains challenging because of their highly variable evolutionary rates and lack of a close phylogenetic outgroup. Nucleariida, an enigmatic group of amoeboids, have been proposed to emerge close to the fungal-metazoan divergence and might fulfill this role. Yet, published phylogenies with up to five genes are without compelling statistical support, and genome-level data should be used to resolve this question with confidence. Results: Our analyses with nuclear (118 proteins) and mitochondrial (13 proteins) data now robustly associate Nucleariida and Fungi as neighbors, an assemblage that we term 'Holomycota'. With Nucleariida as an outgroup, we revisit unresolved deep fungal relationships. Conclusion: Our phylogenomic analysis provides significant support for the paraphyly of the traditional taxon Zygomycota, and contradicts a recent proposal to include Mortierella in a phylum Mucoromycotina. We further question the introduction of separate phyla for Glomeromycota and Blastocladiomycota, whose phylogenetic positions relative to other phyla remain unresolved even with genome-level datasets. Our results motivate broad sampling of additional genome sequences from these phyla.

  3. Predicting Geomorphic and Hydrologic Risks after Wildfire Using Harmonic and Stochastic Analyses

    Science.gov (United States)

    Mikesell, J.; Kinoshita, A. M.; Florsheim, J. L.; Chin, A.; Nourbakhshbeidokhti, S.

    2017-12-01

    Wildfire is a landscape-scale disturbance that often alters hydrological processes and sediment flux during subsequent storms. Vegetation loss from wildfire induces changes to sediment supply, such as channel erosion and sedimentation, and to streamflow magnitude and flooding. These changes enhance downstream hazards, threatening human populations and physical aquatic habitat over various time scales. Using Williams Canyon, a basin burned by the Waldo Canyon Fire (2012), as a case study, we utilize deterministic and statistical modeling methods (Fourier series and first-order Markov chain) to assess pre- and post-fire geomorphic and hydrologic characteristics, including precipitation, enhanced vegetation index (EVI, a satellite-based proxy of vegetation biomass), streamflow, and sediment flux. Local precipitation, terrestrial Light Detection and Ranging (LiDAR) scanning, and satellite-based products are used for these time series analyses. We present a framework to assess variability of periodic and nonperiodic climatic and multivariate trends to inform development of a post-wildfire risk assessment methodology. To establish the extent to which a wildfire affects hydrologic and geomorphic patterns, a Fourier series was used to fit pre- and post-fire geomorphic and hydrologic characteristics to yearly temporal cycles and subcycles of 6, 4, 3, and 2.4 months. These cycles were analyzed using least-squares estimates of the harmonic coefficients, or amplitudes, of each subcycle's contribution to the overall behavior of the Fourier series. The stochastic variances of these characteristics were analyzed by composing first-order Markov models and probabilistic analysis through direct likelihood estimates. Preliminary results highlight an increased dependence of monthly post-fire hydrologic characteristics on 12- and 6-month temporal cycles. This statistical and probabilistic analysis provides a basis to determine the impact of wildfires on the temporal dependence of ...
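For evenly spaced samples, the least-squares harmonic amplitudes described above reduce to the discrete Fourier coefficients of the series. A sketch with a synthetic monthly streamflow series follows; the data and the two-harmonic choice are invented for illustration.

```python
# Least-squares harmonic (Fourier-series) fit to a monthly series (toy data).
import math

def harmonic_coeffs(series, cycles_per_year=(1, 2)):
    """Amplitudes of the 12-month (k=1) and 6-month (k=2) harmonics.

    For N evenly spaced monthly samples the least-squares estimates reduce to
    the discrete Fourier coefficients a_k, b_k."""
    n = len(series)
    out = {}
    for k in cycles_per_year:
        a = (2 / n) * sum(x * math.cos(2 * math.pi * k * t / 12)
                          for t, x in enumerate(series))
        b = (2 / n) * sum(x * math.sin(2 * math.pi * k * t / 12)
                          for t, x in enumerate(series))
        out[k] = (a * a + b * b) ** 0.5  # amplitude of the k-cycles-per-year term
    return out

# Two years of monthly streamflow with a clear annual cycle (synthetic).
flow = [5 + 3 * math.cos(2 * math.pi * t / 12) for t in range(24)]
amps = harmonic_coeffs(flow)
```

Comparing pre- and post-fire amplitudes at each harmonic is one way to quantify the increased dependence on 12- and 6-month cycles that the abstract reports.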

  4. Circulating biomarkers for predicting cardiovascular disease risk; a systematic review and comprehensive overview of meta-analyses.

    Directory of Open Access Journals (Sweden)

    Thijs C van Holten

    BACKGROUND: Cardiovascular disease is one of the major causes of death worldwide. Assessing the risk for cardiovascular disease is an important aspect of clinical decision making and setting a therapeutic strategy, and the use of serological biomarkers may improve this. Despite an overwhelming number of studies and meta-analyses on biomarkers and cardiovascular disease, there are no comprehensive studies comparing the relevance of each biomarker. We performed a systematic review of meta-analyses on levels of serological biomarkers for atherothrombosis to compare the relevance of the most commonly studied biomarkers. METHODS AND FINDINGS: Medline and Embase were screened using search terms related to "arterial ischemic events" and "meta-analyses". The meta-analyses were sorted by patient groups: without pre-existing cardiovascular disease, with cardiovascular disease, and heterogeneous groups concerning general populations, groups with and without cardiovascular disease, or miscellaneous. These were subsequently sorted by end-point for cardiovascular disease or stroke and summarized in tables. We identified 85 relevant full-text articles, with 214 meta-analyses. Markers for primary cardiovascular events include, from high to low result: C-reactive protein, fibrinogen, cholesterol, apolipoprotein B, the apolipoprotein A/apolipoprotein B ratio, high density lipoprotein, and vitamin D. Markers for secondary cardiovascular events include, from high to low result: cardiac troponins I and T, C-reactive protein, serum creatinine, and cystatin C. For primary stroke, fibrinogen and serum uric acid are strong risk markers. Limitations reside in that there is no acknowledged search strategy for prognostic studies or meta-analyses. CONCLUSIONS: For primary cardiovascular events, markers with strong predictive potential are mainly associated with lipids. For secondary cardiovascular events, markers are more associated with ischemia. Fibrinogen is a ...

  5. Condition based maintenance of a 20 kV-PE/XLPE-insulated cable network using the IRC-Analysis; Zustandsorientierte Instandhaltung eines polymerisolierten 20-kV-Kabelnetzes mit der IRC-Analyse. Moderne Diagnostik reduziert Stoerungsrate

    Energy Technology Data Exchange (ETDEWEB)

    Hoff, G.; Kranz, H.G. [BUGH Wuppertal (Germany). Labs. fuer Hochspannungstechnik; Beigert, M.; Petzold, F. [Seba Dynatronic Mess- und Ortungstechnik GmbH, Baunach (Germany); Kneissl, C. [Bereich Konzeption und Messtechnik, Lech Elektrizitaetswerke AG, Augsburg (Germany)

    2001-10-22

    For preventive maintenance of a polymer-insulated cable network, a non-destructive condition assessment of the buried PE/XLPE cables is needed. This contribution presents a condition-based maintenance concept based on the IRC analysis. With this concept, a major German utility was able to drastically reduce the failure rate in part of its 20-kV cable network, breaking the previous trend of steadily increasing faults. (orig.)

  6. Benchmark of SCALE (SAS2H) isotopic predictions of depletion analyses for San Onofre PWR MOX fuel

    Energy Technology Data Exchange (ETDEWEB)

    Hermann, O.W.

    2000-02-01

    The isotopic composition of mixed-oxide (MOX) fuel, fabricated with both uranium and plutonium, after discharge from reactors is of significant interest to the Fissile Materials Disposition Program. The validation of the SCALE (SAS2H) depletion code for use in the prediction of isotopic compositions of MOX fuel, similar to previous validation studies on uranium-only fueled reactors, has corresponding significance. The EEI-Westinghouse Plutonium Recycle Demonstration Program examined the use of MOX fuel in the San Onofre PWR, Unit 1, during cycles 2 and 3. Isotopic analyses of the MOX spent fuel were conducted on 13 actinides and (148)Nd by either mass or alpha spectrometry. Six fuel pellet samples were taken from four different fuel pins of an irradiated MOX assembly. The measured actinide inventories from those samples have been used to benchmark SAS2H for MOX fuel applications. The average percentage differences between the code results and the measurements were -0.9% for (235)U and 5.2% for (239)Pu. The differences for most of the isotopes were significantly larger than in the cases of uranium-only fueled reactors. In general, comparisons of code results with alpha spectrometer data showed extreme differences, although the differences relative to the mass spectrometer analyses were not much larger than those for uranium-only fueled reactors. This benchmark study should be useful in estimating uncertainties of inventory, criticality and dose calculations of MOX spent fuel.

  7. Tumor marker analyses in patients with brain metastases: patterns of practice and implications for survival prediction research.

    Science.gov (United States)

    Nieder, Carsten; Dalhaug, Astrid; Haukland, Ellinor; Mannsåker, Bård; Pawinski, Adam

    2015-08-01

    This study aims to explore patterns of practice of tumor marker analyses and the potential prognostic impact of abnormal markers in patients with brain metastases from solid tumors. Previously, lactate dehydrogenase (LDH) and albumin were identified as relevant biomarkers. We performed a retrospective analysis of 120 patients with known LDH and albumin treated with whole-brain radiotherapy (WBRT) in two different situations: (1) brain metastases detected at initial cancer diagnosis (n = 46) and (2) brain metastases at later time points (n = 74, median interval 13 months). Twenty-six patients (57%) from group 1 had at least one tumor marker analyzed, and 11 patients (24%) had abnormal results. Twenty-two patients (30%) from group 2 had at least one tumor marker analyzed, and 16 patients (22%) had abnormal results. When assuming that LDH and albumin would be standard tests before WBRT, additional potential biomarkers were found in 36% of patients with normal LDH and albumin. Marker positivity rates were, for example, 80% for carcinoembryonic antigen (CEA) in colorectal cancer and 79% for CA 15-3 in breast cancer. Abnormal markers were associated with the presence of liver metastases. CA 15-3 values above the median predicted shorter survival in patients with breast cancer (median 1.9 vs. 13.8 months, p = 0.1). Comparable trends were not observed for various markers in other tumor types. In conclusion, only a minority of patients had undergone tumor marker analyses. Final group sizes were too small to perform multivariate analyses or draw definitive conclusions. We hypothesize that CA 15-3 could be a promising biomarker that should be studied further.

  8. ATOP - The Advanced Taiwan Ocean Prediction System Based on the mpiPOM. Part 1: Model Descriptions, Analyses and Results

    Directory of Open Access Journals (Sweden)

    Leo Oey

    2013-01-01

    Full Text Available A data-assimilated Taiwan Ocean Prediction (ATOP) system is being developed at the National Central University, Taiwan. The model simulates sea-surface height, three-dimensional currents, temperature and salinity, and turbulent mixing. The model has options for tracer and particle-tracking algorithms, as well as for wave-induced Stokes drift and wave-enhanced mixing and bottom drag. Two different forecast domains have been tested: a large-grid domain that encompasses the entire North Pacific Ocean at 0.1° × 0.1° horizontal resolution and 41 vertical sigma levels, and a smaller western North Pacific domain which at present also has the same horizontal resolution. In both domains, 25-year spin-up runs from 1988 to 2011 were first conducted, forced by six-hourly Cross-Calibrated Multi-Platform (CCMP) and NCEP reanalysis Global Forecast System (GFS) winds. The results are then used as initial conditions to conduct ocean analyses from January 2012 through February 2012, when updated hindcasts and real-time forecasts begin using the GFS winds. This paper describes the ATOP system and compares the forecast results against satellite altimetry data for assessing model skills. The model results are also shown to compare well with observations of (i) the Kuroshio intrusion in the northern South China Sea and (ii) the Subtropical Countercurrent. A review of, and comparison with, other models of (i) in the literature is also given.

  9. Analyses of Potential Predictive Markers and Response to Targeted Therapy in Patients with Advanced Clear-cell Renal Cell Carcinoma

    Directory of Open Access Journals (Sweden)

    Yan Song

    2015-01-01

    Full Text Available Background: Vascular endothelial growth factor-targeted agents are standard treatments in advanced clear-cell renal cell carcinoma (ccRCC), but biomarkers of activity are lacking. The aim of this study was to investigate the association of Von Hippel-Lindau (VHL) gene status and vascular endothelial growth factor receptor (VEGFR) or stem cell factor receptor (KIT) expression with the characteristics and clinical outcome of advanced ccRCC. Methods: A total of 59 patients who received targeted treatment with sunitinib or pazopanib at the Cancer Hospital and Institute, Chinese Academy of Medical Sciences, between January 2010 and November 2012 were evaluated. Paraffin-embedded tumor samples were collected, and the status of the VHL gene and the expression of VEGFR and KIT were determined by VHL sequence analysis and immunohistochemistry. Clinical-pathological features were collected, and efficacy outcomes such as response rate, median progression-free survival (PFS), and overall survival (OS) were calculated and compared by expression status. The Chi-square test, the Kaplan-Meier method, and the log-rank test were used for statistical analyses. Results: Of 59 patients, objective responses were observed in 28 patients (47.5%). The median PFS was 13.8 months and median OS was 39.9 months. There was an improved PFS in patients with the following clinical features: male gender, two or fewer metastatic sites, VEGFR-2 positivity, or KIT positivity. Eleven patients (18.6%) had evidence of VHL mutation, with an objective response rate of 45.5%, which did not differ from that of patients without VHL mutation (47.9%). VHL mutation status did not correlate with either overall response rate (P = 0.938) or PFS (P = 0.277). The PFS was 17.6 months and 22.2 months in VEGFR-2-positive patients and KIT-positive patients, respectively, significantly longer than that of VEGFR-2- or KIT-negative patients (P = 0.026 and P = 0.043, respectively). Conclusion

  10. Visual Versus Fully Automated Analyses of 18F-FDG and Amyloid PET for Prediction of Dementia Due to Alzheimer Disease in Mild Cognitive Impairment.

    Science.gov (United States)

    Grimmer, Timo; Wutz, Carolin; Alexopoulos, Panagiotis; Drzezga, Alexander; Förster, Stefan; Förstl, Hans; Goldhardt, Oliver; Ortner, Marion; Sorg, Christian; Kurz, Alexander

    2016-02-01

    Biomarkers of Alzheimer disease (AD) can be imaged in vivo and can be used for diagnostic and prognostic purposes in people with cognitive decline and dementia. Indicators of amyloid deposition such as (11)C-Pittsburgh compound B ((11)C-PiB) PET are primarily used to identify or rule out brain diseases that are associated with amyloid pathology but have also been deployed to forecast the clinical course. Indicators of neuronal metabolism including (18)F-FDG PET demonstrate the localization and severity of neuronal dysfunction and are valuable for differential diagnosis and for predicting the progression from mild cognitive impairment (MCI) to dementia. It is a matter of debate whether to analyze these images visually or using automated techniques. Therefore, we compared the usefulness of both imaging methods and both analyzing strategies to predict dementia due to AD. In MCI participants, a baseline examination, including clinical and imaging assessments, and a clinical follow-up examination after a planned interval of 24 mo were performed. Of 28 MCI patients, 9 developed dementia due to AD, 2 developed frontotemporal dementia, and 1 developed moderate dementia of unknown etiology. The positive and negative predictive values and the accuracy of visual and fully automated analyses of (11)C-PiB for the prediction of progression to dementia due to AD were 0.50, 1.00, and 0.68, respectively, for the visual and 0.53, 1.00, and 0.71, respectively, for the automated analyses. Positive predictive value, negative predictive value, and accuracy of fully automated analyses of (18)F-FDG PET were 0.37, 0.78, and 0.50, respectively. Results of visual analyses were highly variable between raters but were superior to automated analyses. Both (18)F-FDG and (11)C-PiB imaging appear to be of limited use for predicting the progression from MCI to dementia due to AD in short-term follow-up, irrespective of the strategy of analysis. 
On the other hand, amyloid PET is extremely useful to

  11. Within What Distance Does "Greenness" Best Predict Physical Health? A Systematic Review of Articles with GIS Buffer Analyses across the Lifespan.

    Science.gov (United States)

    Browning, Matthew; Lee, Kangjae

    2017-06-23

    Is the amount of "greenness" within a 250-m, 500-m, 1000-m or a 2000-m buffer surrounding a person's home a good predictor of their physical health? The evidence is inconclusive. We reviewed Web of Science articles that used geographic information system buffer analyses to identify trends between physical health, greenness, and distance within which greenness is measured. Our inclusion criteria were: (1) use of buffers to estimate residential greenness; (2) statistical analyses that calculated significance of the greenness-physical health relationship; and (3) peer-reviewed articles published in English between 2007 and 2017. To capture multiple findings from a single article, we selected our unit of inquiry as the analysis, not the article. Our final sample included 260 analyses in 47 articles. All aspects of the review were in accordance with PRISMA guidelines. Analyses were independently judged as more, less, or least likely to be biased based on the inclusion of objective health measures and income/education controls. We found evidence that larger buffer sizes, up to 2000 m, better predicted physical health than smaller ones. We recommend that future analyses use nested rather than overlapping buffers to evaluate to what extent greenness not immediately around a person's home (i.e., within 1000-2000 m) predicts physical health.
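The review's recommendation of nested rather than overlapping buffers can be illustrated with a small sketch. The Python function below and the NDVI grid it consumes are invented for illustration, not taken from the article: it averages greenness within concentric, non-overlapping rings around a home location.

```python
import math

def ring_means(cells, home, radii):
    """Mean greenness in nested, non-overlapping rings around `home`
    (e.g. 0-250 m, 250-500 m, 500-1000 m). `cells` maps (x, y)
    coordinates in metres to a greenness value such as NDVI."""
    rings = [[] for _ in radii]
    for (x, y), ndvi in cells.items():
        d = math.dist((x, y), home)
        for i, r in enumerate(radii):
            # a cell belongs to the first ring whose outer radius contains it
            if d <= r and (i == 0 or d > radii[i - 1]):
                rings[i].append(ndvi)
    return [sum(v) / len(v) if v else None for v in rings]

# three hypothetical grid cells at 100 m, 300 m and 900 m from home
cells = {(0, 100): 0.8, (0, 300): 0.5, (0, 900): 0.2}
print(ring_means(cells, (0, 0), [250, 500, 1000]))  # [0.8, 0.5, 0.2]
```

Because the rings are disjoint, the contribution of greenness at 1000-2000 m can be estimated separately from greenness immediately around the home.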

  12. Prediction of beef color using time-domain nuclear magnetic resonance (TD-NMR) relaxometry data and multivariate analyses.

    Science.gov (United States)

    Moreira, Luiz Felipe Pompeu Prado; Ferrari, Adriana Cristina; Moraes, Tiago Bueno; Reis, Ricardo Andrade; Colnago, Luiz Alberto; Pereira, Fabíola Manhas Verbi

    2016-05-19

    Time-domain nuclear magnetic resonance and chemometrics were used to predict color parameters, such as lightness (L*), redness (a*), and yellowness (b*), of beef (Longissimus dorsi muscle) samples. By analyzing the relaxation decays with multivariate models built with partial least-squares regression, color quality parameters were predicted. The partial least-squares models showed low errors independent of the sample size, indicating the potential of the method. Mincing and weighing were not necessary to improve the predictive performance of the models. The reduction of the transverse relaxation time (T2), measured by the Carr-Purcell-Meiboom-Gill pulse sequence, in darker beef compared with lighter beef can be explained by the lower relaxivity of Fe2+, present in deoxymyoglobin and oxymyoglobin (red beef), relative to the higher relaxivity of Fe3+, present in metmyoglobin (brown beef). These results indicate that time-domain nuclear magnetic resonance spectroscopy can become a useful tool for quality assessment of beef cattle on the bulk of the sample and through packaging, because this technique is also widely applied to measure sensorial parameters, such as flavor, juiciness and tenderness, and physicochemical parameters, such as cooking loss, fat and moisture content, and instrumental tenderness using Warner-Bratzler shear force. Copyright © 2016 John Wiley & Sons, Ltd.
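The study's partial least-squares models operate on whole relaxation decays; as a minimal univariate stand-in (not the authors' method), one can fit lightness against a single transverse relaxation time by ordinary least squares. The T2 and L* values below are made up for illustration.

```python
def fit_line(x, y):
    """Ordinary least squares fit y = a*x + b: a univariate stand-in
    for the multivariate PLS models used in the study."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx

# hypothetical T2 times (ms) and measured lightness L* values
t2 = [35.0, 40.0, 45.0, 50.0]
L = [30.0, 33.0, 36.0, 39.0]
a, b = fit_line(t2, L)
print(a, b)  # slope 0.6, intercept 9.0
```

A real PLS model would instead project the full decay curve onto a few latent variables before regressing, which is what makes it robust to the strongly correlated points of a relaxation decay.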

  13. A novel bioinformatics strategy for function prediction of poorly-characterized protein genes obtained from metagenome analyses.

    Science.gov (United States)

    Abe, Takashi; Kanaya, Shigehiko; Uehara, Hiroshi; Ikemura, Toshimichi

    2009-10-01

    As a result of remarkable progress in DNA sequencing technology, vast quantities of genomic sequences have been decoded. Homology search for amino acid sequences, such as BLAST, has become a basic tool for assigning functions of genes/proteins when genomic sequences are decoded. Although homology search has clearly been a powerful and irreplaceable method, the functions of only 50% or fewer of genes can be predicted when a novel genome is decoded. A prediction method independent of homology search is urgently needed. By analyzing oligonucleotide compositions in genomic sequences, we previously developed a modified Self-Organizing Map, 'BLSOM', that clustered genomic fragments according to phylotype with no advance knowledge of phylotype. Using BLSOM for di-, tri- and tetrapeptide compositions, we developed a system to enable separation (self-organization) of proteins by function. Analyzing oligopeptide frequencies in proteins previously classified into COGs (clusters of orthologous groups of proteins), BLSOMs could faithfully reproduce the COG classifications. This indicated that proteins whose functions are unknown because of a lack of significant sequence similarity with function-known proteins can be related to function-known proteins based on similarity in oligopeptide composition. BLSOM was applied to predict functions of vast quantities of proteins derived from mixed genomes in environmental samples.
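The fixed-length inputs a BLSOM clusters on are oligopeptide frequency vectors; a minimal sketch of the dipeptide case follows (the sequence and function name are illustrative, and the map-training step itself is omitted).

```python
from collections import Counter
from itertools import product

AMINO = "ACDEFGHIKLMNPQRSTVWY"  # the 20 standard amino acids

def dipeptide_composition(seq):
    """Frequency vector over all 400 possible dipeptides in a protein
    sequence: the kind of fixed-length input a BLSOM-style
    self-organizing map can cluster by function."""
    counts = Counter(seq[i:i + 2] for i in range(len(seq) - 1))
    total = max(len(seq) - 1, 1)
    return {a + b: counts[a + b] / total for a, b in product(AMINO, repeat=2)}

vec = dipeptide_composition("MKKLVAAG")  # arbitrary toy sequence
print(vec["KK"], vec["AA"])  # each 1/7 of the 7 overlapping dipeptides
```

Tri- and tetrapeptide compositions follow the same recipe with window sizes 3 and 4, at the cost of 8000- and 160 000-dimensional vectors.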

  14. Prediction of size-fractionated airborne particle-bound metals using MLR, BP-ANN and SVM analyses.

    Science.gov (United States)

    Leng, Xiang'zi; Wang, Jinhua; Ji, Haibo; Wang, Qin'geng; Li, Huiming; Qian, Xin; Li, Fengying; Yang, Meng

    2017-08-01

    Size-fractionated heavy metal concentrations were observed in airborne particulate matter (PM) samples collected from 2014 to 2015 (spanning all four seasons) from suburban (Xianlin) and industrial (Pukou) areas in Nanjing, a megacity of southeast China. Rapid prediction models of size-fractionated metals were established based on multiple linear regression (MLR), back-propagation artificial neural network (BP-ANN) and support vector machine (SVM), using meteorological factors and PM concentrations as input parameters. About 38% and 77% of PM2.5 concentrations in Xianlin and Pukou, respectively, were beyond the Chinese National Ambient Air Quality Standard limit of 75 μg/m³. Nearly all elements had higher concentrations in industrial areas, and in winter among the four seasons. Anthropogenic elements such as Pb, Zn, Cd and Cu showed larger percentages in the fine fraction (ø ≤ 2.5 μm), whereas the crustal elements including Al, Ba, Fe, Ni, Sr and Ti showed larger percentages in the coarse fraction (ø > 2.5 μm). SVM showed a higher training correlation coefficient (R), and lower mean absolute error (MAE) as well as lower root mean square error (RMSE), than MLR and BP-ANN for most metals. All three methods showed better prediction results for Ni, Al, V, Cd and As, whereas results for Cr and Fe were relatively poor. The daily airborne metal concentrations in 2015 were then predicted by the fully trained SVM models, and the results showed that the heaviest pollution of airborne heavy metals occurred in December and January, whereas the lightest pollution occurred in June and July. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Can the lifetime of the superheater tubes be predicted according to the fuel analyses? Assessment from field and laboratory data

    Energy Technology Data Exchange (ETDEWEB)

    Salmenoja, K. [Kvaerner Pulping Oy, Tampere (Finland)

    1998-12-31

    The lifetime of the superheaters in different power boilers is still largely a mystery. This is especially true when firing biomass-based fuels (biofuels), such as bark, forest residues, and straw. Due to the inhomogeneous nature of biofuels, the lifetime of the superheaters may vary from case to case. Sometimes the lifetime is significantly shorter than originally expected; sometimes no corrosion is observed even in the hottest tubes. This is one of the main reasons why boiler operators often demand better predictability of the corrosion resistance of the materials, to avoid unscheduled shutdowns. (orig.) 9 refs.

  16. Pan-genome Analyses of the Species Salmonella enterica, and Identification of Genomic Markers Predictive for Species, Subspecies, and Serovar

    Science.gov (United States)

    Laing, Chad R.; Whiteside, Matthew D.; Gannon, Victor P. J.

    2017-01-01

    Food safety is a global concern, with upward of 2.2 million deaths due to enteric disease every year. Current whole-genome sequencing platforms allow routine sequencing of enteric pathogens for surveillance and during outbreaks; however, a remaining challenge is the identification of genomic markers that are predictive of strain groups that pose the most significant health threats to humans, or that can persist in specific environments. We have previously developed the software program Panseq, which identifies the pan-genome among a group of sequences, and the SuperPhy platform, which utilizes this pan-genome information to identify biomarkers that are predictive of groups of bacterial strains. In this study, we examined the pan-genome of 4893 genomes of Salmonella enterica, an enteric pathogen responsible for the loss of more disability-adjusted life years than any other enteric pathogen. We identified a pan-genome of 25.3 Mbp, a strict core of 1.5 Mbp present in all genomes, and a conserved core of 3.2 Mbp found in at least 96% of these genomes. We also identified 404 genomic regions of 1000 bp that were specific to the species S. enterica. These species-specific regions were found to encode mostly hypothetical proteins, effectors, and other proteins related to virulence. For each of the six S. enterica subspecies, markers unique to each were identified. No serovar had pan-genome regions that were present in all of its genomes and absent in all other serovars; however, each serovar did have genomic regions that were universally present among all constituent members, and statistically predictive of the serovar. The phylogeny based on SNPs within the conserved core genome was found to be highly concordant with that produced by a phylogeny using the presence/absence of 1000 bp regions of the entire pan-genome. 
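The pan-genome/strict-core/conserved-core distinction reduces to set operations over gene (or 1000-bp region) presence per genome; a toy sketch with hypothetical region IDs, not actual Salmonella data:

```python
def pan_core(genomes, threshold=0.96):
    """Given genome -> set of gene/region IDs, return the pan-genome
    (union of all regions), the strict core (present in every genome)
    and the conserved core (present in >= `threshold` of genomes)."""
    pan = set().union(*genomes.values())
    strict = set.intersection(*genomes.values())
    n = len(genomes)
    conserved = {g for g in pan
                 if sum(g in s for s in genomes.values()) >= threshold * n}
    return pan, strict, conserved

# three toy genomes with hypothetical region IDs
genomes = {"g1": {"a", "b", "c"}, "g2": {"a", "b"}, "g3": {"a", "c"}}
pan, strict, conserved = pan_core(genomes, threshold=0.66)
print(sorted(pan), sorted(strict), sorted(conserved))
```

Tools such as Panseq do this over sequence fragments with homology-based matching rather than exact IDs, but the core/accessory bookkeeping is the same.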
Future studies could use these predictive regions as components of a vaccine to prevent salmonellosis, as well as in simple and rapid diagnostic tests for both

  17. Basic Diagnosis and Prediction of Persistent Contrail Occurrence using High-resolution Numerical Weather Analyses/Forecasts and Logistic Regression. Part II: Evaluation of Sample Models

    Science.gov (United States)

    Duda, David P.; Minnis, Patrick

    2009-01-01

    Previous studies have shown that probabilistic forecasting may be a useful method for predicting persistent contrail formation. A probabilistic forecast of contrail formation over the contiguous United States (CONUS) is created using hourly meteorological analyses from the Advanced Regional Prediction System (ARPS) and from the Rapid Update Cycle (RUC), as well as GOES water vapor channel measurements, combined with surface and satellite observations of contrails. Two groups of logistic models were created. The first group of models (SURFACE models) is based on surface-based contrail observations supplemented with satellite observations of contrail occurrence. The second group of models (OUTBREAK models) is derived from a selected subgroup of satellite-based observations of widespread persistent contrails. The mean accuracies for both the SURFACE and OUTBREAK models typically exceeded 75 percent when based on the RUC or ARPS analysis data, but decreased when the logistic models were derived from ARPS forecast data.
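A logistic model of contrail occurrence maps a linear combination of meteorological predictors to a probability; the sketch below uses invented predictors, weights, and bias, not the fitted coefficients from the study.

```python
import math

def logistic_prob(features, weights, bias):
    """Logistic-regression probability of persistent contrail
    occurrence from a linear combination of predictors."""
    z = bias + sum(w * f for w, f in zip(weights, features))
    return 1 / (1 + math.exp(-z))

# e.g. predictors: RHi (%), temperature (deg C), vertical velocity (Pa/s);
# the weights and bias here are made up for illustration
p = logistic_prob([110.0, -55.0, -0.2], [0.05, -0.04, 1.0], -7.0)
print(round(p, 3))
```

Converting such a probability into a yes/no forecast requires choosing a critical threshold, which is exactly the choice studied in Part I of this series.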

  18. A novel approach to predict sudden cardiac death (SCD) using nonlinear and time-frequency analyses from HRV signals.

    Directory of Open Access Journals (Sweden)

    Elias Ebrahimzadeh

    Full Text Available Investigations show that millions of people all around the world die as the result of sudden cardiac death (SCD). These deaths can be reduced by using medical equipment, such as defibrillators, after detection. We need to propose suitable ways to assist doctors in predicting sudden cardiac death with a high level of accuracy. To do this, linear, time-frequency (TF) and nonlinear features have been extracted from the HRV of the ECG signal. Finally, healthy people and people at risk of SCD are classified by k-Nearest Neighbor (k-NN) and Multilayer Perceptron Neural Network (MLP) classifiers. To evaluate, we have compared the classification rates for both separate and combined nonlinear and TF features. The results show that HRV signals have special features in the vicinity of the occurrence of SCD that have the ability to distinguish between patients prone to SCD and normal people. We found that the combination of time-frequency and nonlinear features has a better ability to achieve higher accuracy. The experimental results show that the combination of features can predict SCD with accuracies of 99.73%, 96.52%, 90.37% and 83.96% for the first, second, third and fourth one-minute intervals, respectively, before SCD occurrence.
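The k-NN classification step can be sketched in a few lines; the two-dimensional "HRV feature" vectors below are invented for illustration (real feature vectors would combine the TF and nonlinear measures described above).

```python
import math
from collections import Counter

def knn_predict(train, labels, x, k=3):
    """Classify feature vector x by majority vote among its k nearest
    training vectors under Euclidean distance."""
    nearest = sorted(range(len(train)),
                     key=lambda i: math.dist(train[i], x))
    votes = Counter(labels[i] for i in nearest[:k])
    return votes.most_common(1)[0][0]

# toy 2-D feature vectors (e.g. two HRV indices), labelled SCD vs normal
train = [(0.1, 0.2), (0.2, 0.1), (0.9, 0.8), (0.8, 0.9)]
labels = ["normal", "normal", "SCD", "SCD"]
print(knn_predict(train, labels, (0.85, 0.85)))  # SCD
```

The MLP alternative replaces the vote with a trained feed-forward network, but consumes the same feature vectors.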

  19. Evaluation of multivariate statistical analyses for monitoring and prediction of processes in a seawater reverse osmosis desalination plant

    Energy Technology Data Exchange (ETDEWEB)

    Kolluri, Srinivas Sahan; Esfahani, Iman Janghorban; Garikiparthy, Prithvi Sai Nadh; Yoo, Chang Kyoo [Kyung Hee University, Yongin (Korea, Republic of)

    2015-08-15

    Our aim was to analyze, monitor, and predict the outcomes of processes in a full-scale seawater reverse osmosis (SWRO) desalination plant using multivariate statistical techniques. Multivariate analysis of variance (MANOVA) was used to investigate the performance and efficiencies of two SWRO processes, namely, pore controllable fiber filter-reverse osmosis (PCF-SWRO) and sand filtration-ultra filtration-reverse osmosis (SF-UF-SWRO). Principal component analysis (PCA) was applied to monitor the two SWRO processes. PCA monitoring revealed that the SF-UF-SWRO process could be analyzed reliably with a low number of outliers and disturbances. Partial least squares (PLS) analysis was then conducted to predict which of the seven input parameters of feed flow rate, PCF/SF-UF filtrate flow rate, temperature of feed water, feed turbidity, pH, reverse osmosis (RO) flow rate, and pressure had a significant effect on the outcome variables of permeate flow rate and concentration. Root mean squared errors (RMSEs) of the PLS models for permeate flow rates were 31.5 and 28.6 for the PCF-SWRO process and SF-UF-SWRO process, respectively, while RMSEs of permeate concentrations were 350.44 and 289.4, respectively. These results indicate that the SF-UF-SWRO process can be modeled more accurately than the PCF-SWRO process, because the RMSE values of permeate flow rate and concentration obtained using a PLS regression model of the SF-UF-SWRO process were lower than those obtained for the PCF-SWRO process.

  20. Landscaping analyses of the ROC predictions of discrete-slots and signal-detection models of visual working memory.

    Science.gov (United States)

    Donkin, Chris; Tran, Sophia Chi; Nosofsky, Robert

    2014-10-01

    A fundamental issue concerning visual working memory is whether its capacity limits are better characterized in terms of a limited number of discrete slots (DSs) or a limited amount of a shared continuous resource. Rouder et al. (2008) found that a mixed-attention, fixed-capacity, DS model provided the best explanation of behavior in a change detection task, outperforming alternative continuous signal detection theory (SDT) models. Here, we extend their analysis in two ways: first, with experiments aimed at better distinguishing between the predictions of the DS and SDT models, and second, using a model-based analysis technique called landscaping, in which the functional-form complexity of the models is taken into account. We find that the balance of evidence supports a DS account of behavior in change detection tasks but that the SDT model is best when the visual displays always consist of the same number of items. In our General Discussion section, we outline, but ultimately reject, a number of potential explanations for the observed pattern of results. We finish by describing future research that is needed to pinpoint the basis for this observed pattern of results.
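One standard discrete-slots formulation predicts hit and false-alarm rates in change detection from capacity k, set size N, and a guessing rate; a sketch of that textbook model follows (not necessarily the exact mixed-attention variant of Rouder et al., which adds an attention-lapse parameter).

```python
def discrete_slots_rates(k, set_size, guess):
    """Hit and false-alarm rates under a simple discrete-slots model:
    the probed item occupies one of k slots with probability
    d = min(k/N, 1); otherwise the observer guesses 'change' with
    probability `guess`."""
    d = min(k / set_size, 1.0)
    hit = d + (1 - d) * guess
    false_alarm = (1 - d) * guess
    return hit, false_alarm

# capacity of 3 slots, display of 6 items, guessing rate 0.5
print(discrete_slots_rates(3, 6, 0.5))  # (0.75, 0.25)
```

Sweeping the guessing rate traces out the model's ROC as straight line segments, which is what distinguishes it from the curved ROCs of continuous signal detection theory models in a landscaping analysis.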

  1. The GENOTEND chip: a new tool to analyse gene expression in muscles of beef cattle for beef quality prediction

    Directory of Open Access Journals (Sweden)

    Hocquette Jean-Francois

    2012-08-01

    validated in the groups of 30 Charolais young bulls slaughtered in year 2, and in the 21 Charolais steers slaughtered in year 1, but not in the group of 19 steers slaughtered in year 2, which differed from the reference group by two factors (gender and year). When the first three groups of animals were analysed together, this subset of genes explained a 4-fold higher proportion of the variability in tenderness than muscle biochemical traits. Conclusion: This study underlined the relevance of the GENOTEND chip to identify markers of beef quality, mainly by confirming previous results and by detecting other genes of the heat shock family as potential markers of beef quality. However, it was not always possible to extrapolate the relevance of these markers to all animal groups which differ by several factors (such as gender or environmental conditions of production) from the initial population of reference in which these markers were identified.

  2. Taxometric analyses and predictive accuracy of callous-unemotional traits regarding quality of life and behavior problems in non-conduct disorder diagnoses.

    Science.gov (United States)

    Herpers, Pierre C M; Klip, Helen; Rommelse, Nanda N J; Taylor, Mark J; Greven, Corina U; Buitelaar, Jan K

    2017-07-01

    Callous-unemotional (CU) traits have mainly been studied in relation to conduct disorder (CD), but can also occur in other disorder groups. However, it is unclear whether there is a clinically relevant cut-off value of levels of CU traits in predicting reduced quality of life (QoL) and clinical symptoms, and whether CU traits better fit a categorical (taxonic) or dimensional model. Parents of 979 youths referred to a child and adolescent psychiatric clinic rated their child's CU traits on the Inventory of Callous-Unemotional traits (ICU), QoL on the Kidscreen-27, and clinical symptoms on the Child Behavior Checklist. Experienced clinicians conferred DSM-IV-TR diagnoses of ADHD, ASD, anxiety/mood disorders and DBD-NOS/ODD. The ICU was also used to score the DSM-5 specifier 'with limited prosocial emotions' (LPE) of Conduct Disorder. Receiver operating characteristic (ROC) analyses revealed that the predictive accuracy of the ICU and LPE regarding QoL and clinical symptoms was poor to fair, and similar across diagnoses. A clinical cut-off point could not be defined. Taxometric analyses suggested that callous-unemotional traits on the ICU best reflect a dimension rather than a taxon. More research is needed on the impact of CU traits on the functional adaptation, course, and response to treatment of non-CD conditions. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.
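The ROC analyses above rest on the area under the ROC curve (AUC), which equals the probability that a randomly chosen positive case scores higher than a negative one; a direct pairwise sketch with invented scores:

```python
def roc_auc(scores_pos, scores_neg):
    """Area under the ROC curve, computed as the probability that a
    positive case outscores a negative one (ties count one half)."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# hypothetical ICU scores for cases with vs without the outcome
print(roc_auc([5, 7, 8], [3, 6, 4]))
```

AUC values near 0.5 correspond to the "poor" end of the poor-to-fair accuracy reported above, which is also why no usable clinical cut-off point emerged.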

  3. Basic Diagnosis and Prediction of Persistent Contrail Occurrence using High-resolution Numerical Weather Analyses/Forecasts and Logistic Regression. Part I: Effects of Random Error

    Science.gov (United States)

    Duda, David P.; Minnis, Patrick

    2009-01-01

    Straightforward application of the Schmidt-Appleman contrail formation criteria to diagnose persistent contrail occurrence from numerical weather prediction data is hindered by significant bias errors in the upper tropospheric humidity. Logistic models of contrail occurrence have been proposed to overcome this problem, but basic questions remain about how random measurement error may affect their accuracy. A set of 5000 synthetic contrail observations is created to study the effects of random error in these probabilistic models. The simulated observations are based on distributions of temperature, humidity, and vertical velocity derived from Advanced Regional Prediction System (ARPS) weather analyses. The logistic models created from the simulated observations were evaluated using two common statistical measures of model accuracy, the percent correct (PC) and the Hanssen-Kuipers discriminant (HKD). To convert the probabilistic results of the logistic models into a dichotomous yes/no choice suitable for the statistical measures, two critical probability thresholds are considered. The HKD scores are higher when the climatological frequency of contrail occurrence is used as the critical threshold, while the PC scores are higher when the critical probability threshold is 0.5. For both thresholds, typical random errors in temperature, relative humidity, and vertical velocity are found to be small enough to allow for accurate logistic models of contrail occurrence. The accuracy of the models developed from synthetic data is over 85 percent for both the prediction of contrail occurrence and non-occurrence, although in practice, larger errors would be anticipated.
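The two accuracy measures named above can be computed from a 2x2 forecast contingency table; a sketch with hypothetical counts:

```python
def percent_correct(hits, misses, false_alarms, correct_negatives):
    """Percent correct (PC): fraction of all forecasts that were right."""
    total = hits + misses + false_alarms + correct_negatives
    return (hits + correct_negatives) / total

def hanssen_kuipers(hits, misses, false_alarms, correct_negatives):
    """Hanssen-Kuipers discriminant (HKD), also known as the true skill
    statistic: hit rate minus false-alarm rate."""
    hit_rate = hits / (hits + misses)
    false_alarm_rate = false_alarms / (false_alarms + correct_negatives)
    return hit_rate - false_alarm_rate

# hypothetical counts for a contrail yes/no forecast
print(percent_correct(40, 10, 15, 35))   # 0.75
print(hanssen_kuipers(40, 10, 15, 35))   # approx. 0.5
```

Because HKD rewards hit rate relative to false alarms while PC rewards raw agreement, the two measures favor different probability thresholds, as the abstract notes.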

  4. Cost-effectiveness of insulin aspart versus human soluble insulin in type 2 diabetes in four European countries: subgroup analyses from the PREDICTIVE study.

    Science.gov (United States)

    Palmer, James L; Goodall, Gordon; Nielsen, Steffen; Kotchie, Robert W; Valentine, William J; Palmer, Andrew J; Roze, Stéphane

    2008-05-01

    To evaluate the long-term health economic outcomes associated with insulin aspart (IAsp) compared to human soluble insulin (HI) in type 2 diabetes patients on basal-bolus therapy in Sweden, Spain, Italy and Poland. A published computer simulation model of diabetes was used to predict life expectancy, quality-adjusted life expectancy and incidence of diabetes-related complications. Baseline cohort characteristics (age 61.6 years, duration of diabetes 13.2 years, 45.1% male, HbA1c 8.2%, BMI 29.8 kg/m²) and treatment effects were derived from the PREDICTIVE observational study. Country-specific complication costs were derived from published sources. The analyses were run over 35-year time horizons from third-party payer perspectives in Spain, Italy and Poland and from a societal perspective in Sweden. Future costs and clinical benefits were discounted at country-specific discount rates. Sensitivity analyses were performed. IAsp was associated with improvements in discounted life expectancy and quality-adjusted life expectancy, and a reduced incidence of most diabetes-related complications versus HI in all four settings. IAsp was associated with societal cost-savings in Sweden (SEK 2470), direct medical cost-savings in Sweden and Spain (SEK 8248 and €1382, respectively), but increased direct costs in Italy (€2235) and Poland (€743). IAsp was associated with improved quality-adjusted life expectancy in Sweden (0.077 QALYs), Spain (0.080 QALYs), Italy (0.120 QALYs) and Poland (0.003 QALYs). IAsp was dominant versus HI in both Sweden and Spain, would be considered cost-effective in Italy with an incremental cost-effectiveness ratio of €18,597 per QALY gained, but would not be considered cost-effective in Poland.
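
The headline numbers reduce to two pieces of arithmetic: discounting future costs and benefits to present value, and forming the incremental cost-effectiveness ratio (ICER). A sketch; the cash-flow stream is illustrative, and the published deltas are rounded, so the computed ratio only approximates the paper's ICER for Italy:

```python
# Discounting and ICER arithmetic for a cost-effectiveness comparison.
# "Dominant" is the standard term for an option that is both cheaper and
# more effective than its comparator.

def discounted_total(yearly_values, rate):
    """Present value of a yearly stream at a fixed annual discount rate."""
    return sum(v / (1 + rate) ** t for t, v in enumerate(yearly_values))

def icer(delta_cost, delta_qalys):
    """Incremental cost per QALY gained; 'dominant' = cheaper and better."""
    if delta_cost <= 0 and delta_qalys > 0:
        return "dominant"
    return delta_cost / delta_qalys
```

With the abstract's rounded deltas for Italy, icer(2235, 0.120) gives about 18,600 per QALY, close to the published €18,597; Spain's cost saving combined with a QALY gain is what makes IAsp dominant there.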

  5. Computational fluid dynamics analyses of lateral heat conduction, coolant azimuthal mixing and heat transfer predictions in a BR2 fuel assembly geometry

    International Nuclear Information System (INIS)

    Tzanos, C.P.; Dionne, B.

    2011-01-01

    To support the analyses related to the conversion of the BR2 core from highly-enriched (HEU) to low-enriched (LEU) fuel, the thermal-hydraulics codes PLTEMP and RELAP-3D are used to evaluate the safety margins during steady-state operation (PLTEMP), as well as after a loss-of-flow, loss-of-pressure, or loss-of-coolant event (RELAP). In the 1-D PLTEMP and RELAP simulations, conduction in the azimuthal and axial directions is not accounted for. The high thermal conductivity of the cladding and the fuel meat, combined with significant temperature gradients in the lateral (axial and azimuthal) directions, could lead to a heat flux distribution that is significantly different from the power distribution. To evaluate the significance of the lateral heat conduction, 3-D computational fluid dynamics (CFD) simulations, using the CFD code STAR-CD, were performed. Safety margin calculations are typically performed for a hot stripe, i.e., an azimuthal region of the fuel plates/coolant channel containing the power peak. In a RELAP model, for example, a channel between two plates could be divided into a number of RELAP channels (stripes) in the azimuthal direction. In a PLTEMP model, the effect of azimuthal power peaking could be taken into account by using engineering factors. However, if the thermal mixing in the azimuthal direction of a coolant channel is significant, a striping approach could be overly conservative by not taking this mixing into account. STAR-CD simulations were also performed to study the thermal mixing in the coolant. Section II of this document presents the results of the analyses of the lateral heat conduction and azimuthal thermal mixing in a coolant channel. Finally, PLTEMP and RELAP simulations rely on the use of correlations to determine heat transfer coefficients. Previous analyses showed that the Dittus-Boelter correlation gives significantly more conservative (lower) predictions than the correlations of Sieder-Tate and Petukhov. STAR-CD 3-D
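
The heat-transfer correlations compared in the final sentences have standard single-phase, turbulent-flow forms, sketched below. The Sieder-Tate wall-viscosity correction is included in its usual form; the property values in the usage note are illustrative, not BR2 conditions:

```python
# Standard Nusselt-number correlations for turbulent single-phase flow.
# Dittus-Boelter: Nu = 0.023 Re^0.8 Pr^n (n = 0.4 heating, 0.3 cooling).
# Sieder-Tate:    Nu = 0.027 Re^0.8 Pr^(1/3) (mu_bulk/mu_wall)^0.14.

def dittus_boelter(re, pr, heating=True):
    """Dittus-Boelter Nusselt number."""
    n = 0.4 if heating else 0.3
    return 0.023 * re**0.8 * pr**n

def sieder_tate(re, pr, mu_bulk, mu_wall):
    """Sieder-Tate Nusselt number with viscosity-ratio correction."""
    return 0.027 * re**0.8 * pr**(1 / 3) * (mu_bulk / mu_wall)**0.14
```

For a heated wall in a liquid (bulk viscosity above wall viscosity), Sieder-Tate returns a higher Nusselt number than Dittus-Boelter at the same Re and Pr, consistent with the abstract's observation that Dittus-Boelter is the more conservative (lower) prediction.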

  6. Genetically Predicted Body Mass Index and Breast Cancer Risk: Mendelian Randomization Analyses of Data from 145,000 Women of European Descent

    Science.gov (United States)

    Guo, Yan; Warren Andersen, Shaneda; Shu, Xiao-Ou; Michailidou, Kyriaki; Bolla, Manjeet K.; Wang, Qin; Garcia-Closas, Montserrat; Milne, Roger L.; Schmidt, Marjanka K.; Chang-Claude, Jenny; Dunning, Allison; Bojesen, Stig E.; Ahsan, Habibul; Aittomäki, Kristiina; Andrulis, Irene L.; Anton-Culver, Hoda; Beckmann, Matthias W.; Beeghly-Fadiel, Alicia; Benitez, Javier; Bogdanova, Natalia V.; Bonanni, Bernardo; Børresen-Dale, Anne-Lise; Brand, Judith; Brauch, Hiltrud; Brenner, Hermann; Brüning, Thomas; Burwinkel, Barbara; Casey, Graham; Chenevix-Trench, Georgia; Couch, Fergus J.; Cross, Simon S.; Czene, Kamila; Dörk, Thilo; Dumont, Martine; Fasching, Peter A.; Figueroa, Jonine; Flesch-Janys, Dieter; Fletcher, Olivia; Flyger, Henrik; Fostira, Florentia; Gammon, Marilie; Giles, Graham G.; Guénel, Pascal; Haiman, Christopher A.; Hamann, Ute; Hooning, Maartje J.; Hopper, John L.; Jakubowska, Anna; Jasmine, Farzana; Jenkins, Mark; John, Esther M.; Johnson, Nichola; Jones, Michael E.; Kabisch, Maria; Knight, Julia A.; Koppert, Linetta B.; Kosma, Veli-Matti; Kristensen, Vessela; Le Marchand, Loic; Lee, Eunjung; Li, Jingmei; Lindblom, Annika; Lubinski, Jan; Malone, Kathi E.; Mannermaa, Arto; Margolin, Sara; McLean, Catriona; Meindl, Alfons; Neuhausen, Susan L.; Nevanlinna, Heli; Neven, Patrick; Olson, Janet E.; Perez, Jose I. A.; Perkins, Barbara; Phillips, Kelly-Anne; Pylkäs, Katri; Rudolph, Anja; Santella, Regina; Sawyer, Elinor J.; Schmutzler, Rita K.; Seynaeve, Caroline; Shah, Mitul; Shrubsole, Martha J.; Southey, Melissa C.; Swerdlow, Anthony J.; Toland, Amanda E.; Tomlinson, Ian; Torres, Diana; Truong, Thérèse; Ursin, Giske; Van Der Luijt, Rob B.; Verhoef, Senno; Whittemore, Alice S.; Winqvist, Robert; Zhao, Hui; Zhao, Shilin; Hall, Per; Simard, Jacques; Kraft, Peter; Hunter, David; Easton, Douglas F.; Zheng, Wei

    2016-01-01

    Background Observational epidemiological studies have shown that high body mass index (BMI) is associated with a reduced risk of breast cancer in premenopausal women but an increased risk in postmenopausal women. It is unclear whether this association is mediated through shared genetic or environmental factors. Methods We applied Mendelian randomization to evaluate the association between BMI and risk of breast cancer occurrence using data from two large breast cancer consortia. We created a weighted BMI genetic score comprising 84 BMI-associated genetic variants to predict BMI. We evaluated genetically predicted BMI in association with breast cancer risk using individual-level data from the Breast Cancer Association Consortium (BCAC) (cases = 46,325, controls = 42,482). We further evaluated the association between genetically predicted BMI and breast cancer risk using summary statistics from 16,003 cases and 41,335 controls from the Discovery, Biology, and Risk of Inherited Variants in Breast Cancer (DRIVE) Project. Because most studies measured BMI after cancer diagnosis, we could not conduct a parallel analysis to adequately evaluate the association of measured BMI with breast cancer risk prospectively. Results In the BCAC data, genetically predicted BMI was found to be inversely associated with breast cancer risk (odds ratio [OR] = 0.65 per 5 kg/m² increase, 95% confidence interval [CI]: 0.56–0.75, p = 3.32 × 10⁻¹⁰). The associations were similar for both premenopausal (OR = 0.44, 95% CI: 0.31–0.62, p = 9.91 × 10⁻⁸) and postmenopausal breast cancer (OR = 0.57, 95% CI: 0.46–0.71, p = 1.88 × 10⁻⁸). This association was replicated in the data from the DRIVE consortium (OR = 0.72, 95% CI: 0.60–0.84, p = 1.64 × 10⁻⁷). Single marker analyses identified 17 of the 84 BMI-associated single nucleotide polymorphisms (SNPs) in association with breast cancer risk at p
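
The weighted genetic score described in the methods is a simple weighted sum: each SNP contributes its risk-allele dosage times its published per-allele effect size. A sketch; the dosages and weights in the assertions are made up, not the 84 GWAS betas:

```python
# Weighted genetic (risk) score: sum of per-SNP allele dosages (0-2)
# times per-allele effect-size weights. The resulting score serves as
# the instrument in a Mendelian randomization analysis.

def weighted_genetic_score(dosages, weights):
    """dosages: risk-allele counts per SNP; weights: per-allele betas."""
    if len(dosages) != len(weights):
        raise ValueError("one weight per SNP required")
    return sum(d * w for d, w in zip(dosages, weights))
```

In practice the weights would be the BMI-association betas for the 84 variants, and the score (or the variants jointly) is then related to case-control status to estimate the genetically predicted BMI effect.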

  7. Genetically Predicted Body Mass Index and Breast Cancer Risk: Mendelian Randomization Analyses of Data from 145,000 Women of European Descent.

    Directory of Open Access Journals (Sweden)

    Yan Guo

    2016-08-01

    Observational epidemiological studies have shown that high body mass index (BMI) is associated with a reduced risk of breast cancer in premenopausal women but an increased risk in postmenopausal women. It is unclear whether this association is mediated through shared genetic or environmental factors. We applied Mendelian randomization to evaluate the association between BMI and risk of breast cancer occurrence using data from two large breast cancer consortia. We created a weighted BMI genetic score comprising 84 BMI-associated genetic variants to predict BMI. We evaluated genetically predicted BMI in association with breast cancer risk using individual-level data from the Breast Cancer Association Consortium (BCAC) (cases = 46,325, controls = 42,482). We further evaluated the association between genetically predicted BMI and breast cancer risk using summary statistics from 16,003 cases and 41,335 controls from the Discovery, Biology, and Risk of Inherited Variants in Breast Cancer (DRIVE) Project. Because most studies measured BMI after cancer diagnosis, we could not conduct a parallel analysis to adequately evaluate the association of measured BMI with breast cancer risk prospectively. In the BCAC data, genetically predicted BMI was found to be inversely associated with breast cancer risk (odds ratio [OR] = 0.65 per 5 kg/m² increase, 95% confidence interval [CI]: 0.56-0.75, p = 3.32 × 10⁻¹⁰). The associations were similar for both premenopausal (OR = 0.44, 95% CI: 0.31-0.62, p = 9.91 × 10⁻⁸) and postmenopausal breast cancer (OR = 0.57, 95% CI: 0.46-0.71, p = 1.88 × 10⁻⁸). This association was replicated in the data from the DRIVE consortium (OR = 0.72, 95% CI: 0.60-0.84, p = 1.64 × 10⁻⁷). Single marker analyses identified 17 of the 84 BMI-associated single nucleotide polymorphisms (SNPs) in association with breast cancer risk at p < 0.05; for 16 of them, the

  8. Application of pathways analyses for site performance prediction for the Gas Centrifuge Enrichment Plant and Oak Ridge Central Waste Disposal Facility

    International Nuclear Information System (INIS)

    Pin, F.G.; Oblow, E.M.

    1984-01-01

    The suitability of the Gas Centrifuge Enrichment Plant and the Oak Ridge Central Waste Disposal Facility for shallow-land burial of low-level radioactive waste is evaluated using pathways analyses. The analyses rely on conservative scenarios to describe the generation and migration of contamination and the potential human exposure to the waste. Conceptual and numerical models are developed using data from comprehensive laboratory and field investigations and are used to simulate the long-term transport of contamination to man. Conservatism is built into the analyses when assumptions concerning future events have to be made or when uncertainties concerning site or waste characteristics exist. Maximum potential doses to man are calculated and compared to the appropriate standards. The sites are found to provide an adequate buffer to persons outside the DOE reservations. Conclusions concerning site capacity and site acceptability are drawn. In reaching these conclusions, some consideration is given to the uncertainties and conservatisms involved in the analyses. Analytical methods that quantitatively assess the probability that future events will occur, and the sensitivity of the results to data uncertainty, may prove useful in relaxing some of the conservatism built into the analyses. The applicability of such methods to pathways analyses is briefly discussed. 18 refs., 9 figs

  9. Using meta-analytic path analysis to test theoretical predictions in health behavior: An illustration based on meta-analyses of the theory of planned behavior.

    Science.gov (United States)

    Hagger, Martin S; Chan, Derwin K C; Protogerou, Cleo; Chatzisarantis, Nikos L D

    2016-08-01

    Synthesizing research on social cognitive theories applied to health behavior is an important step in the development of an evidence base of psychological factors as targets for effective behavioral interventions. However, few meta-analyses of research on social cognitive theories in health contexts have conducted simultaneous tests of theoretically stipulated pattern effects using path analysis. We argue that conducting path analyses of meta-analytic effects among constructs from social cognitive theories is important to test nomological validity, account for mediation effects, and evaluate unique effects of theory constructs independent of past behavior. We illustrate our points by conducting new analyses of two meta-analyses of a popular theory applied to health behaviors, the theory of planned behavior. We conducted meta-analytic path analyses of the theory in two behavioral contexts (alcohol and dietary behaviors) using data from the primary studies included in the original meta-analyses, augmented to include intercorrelations among constructs and relations with past behavior missing from the original analysis. Findings supported the nomological validity of the theory and its hypotheses for both behaviors, confirmed important model processes through mediation analysis, demonstrated the attenuating effect of past behavior on theory relations, and provided estimates of the unique effects of theory constructs independent of past behavior. Our analysis illustrates the importance of conducting a simultaneous test of theory-stipulated effects in meta-analyses of social cognitive theories applied to health behavior. We recommend researchers adopt this analytic procedure when synthesizing evidence across primary tests of social cognitive theories in health. Copyright © 2016 Elsevier Inc. All rights reserved.
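
The core computation in a meta-analytic path analysis is ordinary path estimation applied to the pooled correlation matrix. For the simplest two-predictor case the standardized betas have a closed form; the correlations below are invented, not the synthesized theory-of-planned-behavior estimates:

```python
# Standardized path coefficients for y ~ x1 + x2 computed from pairwise
# correlations, the building block of path analysis on a meta-analytic
# (pooled) correlation matrix.

def two_predictor_paths(r_y1, r_y2, r_12):
    """Return (beta1, beta2) for outcome y on predictors x1 and x2."""
    denom = 1.0 - r_12 ** 2
    beta1 = (r_y1 - r_y2 * r_12) / denom
    beta2 = (r_y2 - r_y1 * r_12) / denom
    return beta1, beta2
```

The attenuation the abstract describes is visible directly: adding a correlated second predictor such as past behavior shrinks the first predictor's unique beta below its zero-order correlation (e.g. with r_y1 = 0.50, r_y2 = 0.40, r_12 = 0.30, beta1 drops to about 0.42).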

  10. In silico and cell-based analyses reveal strong divergence between prediction and observation of T-cell-recognized tumor antigen T-cell epitopes.

    Science.gov (United States)

    Schmidt, Julien; Guillaume, Philippe; Dojcinovic, Danijel; Karbach, Julia; Coukos, George; Luescher, Immanuel

    2017-07-14

    Tumor exomes provide comprehensive information on mutated, overexpressed genes and aberrant splicing, which can be exploited for personalized cancer immunotherapy. Of particular interest are mutated tumor antigen T-cell epitopes, because neoepitope-specific T cells often are tumoricidal. However, identifying tumor-specific T-cell epitopes is a major challenge. A widely used strategy relies on initial prediction of human leukocyte antigen-binding peptides by in silico algorithms, but the predictive power of this approach is unclear. Here, we used the human tumor antigen NY-ESO-1 (ESO) and the human leukocyte antigen variant HLA-A*0201 (A2) as a model and predicted in silico the 41 highest-affinity, A2-binding 8-11-mer peptides and assessed their binding, kinetic complex stability, and immunogenicity in A2-transgenic mice and on peripheral blood mononuclear cells from ESO-vaccinated melanoma patients. We found that 19 of the peptides strongly bound to A2, 10 of which formed stable A2-peptide complexes and induced CD8+ T cells in A2-transgenic mice. However, only 5 of the peptides induced cognate T cells in humans; these peptides exhibited strong binding and complex stability and contained multiple large hydrophobic and aromatic amino acids. These results were not predicted by in silico algorithms and provide new clues to improving T-cell epitope identification. In conclusion, our findings indicate that only a small fraction of in silico-predicted A2-binding ESO peptides are immunogenic in humans, namely those that have high peptide-binding strength and complex stability. This observation highlights the need for improving in silico predictions of peptide immunogenicity. © 2017 by The American Society for Biochemistry and Molecular Biology, Inc.

  11. Molecular and clinical analyses of Greig cephalopolysyndactyly and Pallister-Hall syndromes: Robust phenotype prediction from the type and position of GLI3 mutations

    NARCIS (Netherlands)

    Johnston, Jennifer J.; Olivos-Glander, Isabelle; Killoran, Christina; Elson, Emma; Turner, Joyce T.; Peters, Kathryn F.; Abbott, Margaret H.; Aughton, David J.; Aylsworth, Arthur S.; Bamshad, Michael J.; Booth, Carol; Curry, Cynthia J.; David, Albert; Dinulos, Mary Beth; Flannery, David B.; Fox, Michelle A.; Graham, John M.; Grange, Dorothy K.; Guttmacher, Alan E.; Hannibal, Mark C.; Henn, Wolfram; Hennekam, Raoul C. M.; Holmes, Lewis B.; Hoyme, H. Eugene; Leppig, Kathleen A.; Lin, Angela E.; Macleod, Patrick; Manchester, David K.; Marcelis, Carlo; Mazzanti, Laura; McCann, Emma; McDonald, Marie T.; Mendelsohn, Nancy J.; Moeschler, John B.; Moghaddam, Billur; Neri, Giovanni; Newbury-Ecob, Ruth; Pagon, Roberta A.; Phillips, John A.; Sadler, Laurie S.; Stoler, Joan M.; Tilstra, David; Walsh Vockley, Catherine M.; Zackai, Elaine H.; Zadeh, Touran M.; Brueton, Louise; Black, Graeme Charles M.; Biesecker, Leslie G.

    2005-01-01

    Mutations in the GLI3 zinc-finger transcription factor gene cause Greig cephalopolysyndactyly syndrome (GCPS) and Pallister-Hall syndrome (PHS), which are variable but distinct clinical entities. We hypothesized that GLI3 mutations that predict a truncated functional repressor protein cause PHS and

  12. The Janus-faced nature of time spent on homework : Using latent profile analyses to predict academic achievement over a school year

    NARCIS (Netherlands)

    Flunger, Barbara; Trautwein, Ulrich; Nagengast, Benjamin; Lüdtke, Oliver; Niggli, Alois; Schnyder, Inge

    2015-01-01

    Homework time and achievement are only modestly associated, whereas homework effort has consistently been shown to positively predict later achievement. We argue that time spent on homework can be an important predictor of achievement when combined with measures of homework effort. Latent profile

  13. Consequences of kriging and land use regression for PM2.5 predictions in epidemiologic analyses: insights into spatial variability using high-resolution satellite data.

    Science.gov (United States)

    Alexeeff, Stacey E; Schwartz, Joel; Kloog, Itai; Chudnovsky, Alexandra; Koutrakis, Petros; Coull, Brent A

    2015-01-01

    Many epidemiological studies use predicted air pollution exposures as surrogates for true air pollution levels. These predicted exposures contain exposure measurement error, yet simulation studies have typically found negligible bias in resulting health effect estimates. However, previous studies typically assumed a statistical spatial model for air pollution exposure, which may be oversimplified. We address this shortcoming by assuming a realistic, complex exposure surface derived from fine-scale (1 km × 1 km) remote-sensing satellite data. Using simulation, we evaluate the accuracy of epidemiological health effect estimates in linear and logistic regression when using spatial air pollution predictions from kriging and land use regression models. We examined chronic (long-term) and acute (short-term) exposure to air pollution. Results varied substantially across different scenarios. Exposure models with low out-of-sample R² yielded severe biases in the health effect estimates of some models, ranging from 60% upward bias to 70% downward bias. One land use regression exposure model with out-of-sample R² > 0.9 yielded upward biases up to 13% for acute health effect estimates. Almost all models drastically underestimated the SEs. Land use regression models performed better in chronic effect simulations. These results can help researchers when interpreting health effect estimates in these types of studies.
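
A toy version of the paper's simulation design, assuming a linear health model and a predicted exposure that captures the truth only imperfectly; all settings (sample size, error variances, coefficients) are illustrative:

```python
# Simulate a "true" exposure, a predicted exposure with measurement error,
# and a linear health outcome driven by the truth; then regress the outcome
# on the prediction and inspect the bias in the fitted slope.
import random

random.seed(0)
n, true_beta = 5000, 1.0
x_true = [random.gauss(10, 2) for _ in range(n)]
# Imperfect prediction: systematic shrinkage plus classical noise.
x_pred = [0.8 * x + 2.0 + random.gauss(0, 1.5) for x in x_true]
y = [true_beta * x + random.gauss(0, 1) for x in x_true]

# Ordinary least squares slope of y on the *predicted* exposure.
mx = sum(x_pred) / n
my = sum(y) / n
beta_hat = (sum((xp - mx) * (yi - my) for xp, yi in zip(x_pred, y))
            / sum((xp - mx) ** 2 for xp in x_pred))
```

With these settings the fitted slope comes out well below the true coefficient of 1.0, the kind of exposure-measurement-error bias the paper quantifies across kriging and land use regression scenarios.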

  14. A systematic study of coordinate precision in X-ray structure analyses. Pt. 2. Predictive estimates of E.S.D.'s for the general-atom case

    International Nuclear Information System (INIS)

    Allen, F.H.; Cole, J.C.; Durham Univ.; Howard, J.A.K.; Durham Univ.

    1995-01-01

    The relationship between the mean isotropic e.s.d. σ̄(A)_o of any element type A in a crystal structure and the R factor and atomic constitution of that structure is explored for 124 905 element-type occurrences calculated from 33 955 entries in the Cambridge Structural Database. On the basis of the work of Cruickshank [Acta Cryst. (1960), 13, 774-777], it is shown that σ̄(A)_p values can be estimated by equations of the form σ̄(A)_p = K·R·N_c^(1/2)/Z_A, where N_c is taken as Σ Z_i²/Z_C², the Z_i are atomic numbers and the summation is over all atoms in the asymmetric unit. Values of K were obtained by regression techniques using the σ̄(A)_o as basis. The constant K_nc for noncentrosymmetric structures is found to be larger than K_c for centrosymmetric structures by a factor of ~2^(1/2), as predicted by Cruickshank (1960). Two predictive equations are generated, one for first-row elements and the second for elements with Z_A > 10. The relationship between the different constants K that arise in these two situations is linked to shape differentials in scattering-factor (f_i) curves for light and heavy atoms. It is found that predictive equations in which the Z_i are selectively replaced by f_i at a constant sinθ/λ of 0.30 Å⁻¹ generate closely similar values of K for the light-atom and heavy-atom subsets. The overall analysis indicates that atomic e.s.d.'s may be seriously underestimated in the more precise structure determinations, that e.s.d.'s for the heaviest atoms may be less reliable than those for lighter atoms and that e.s.d.'s in noncentrosymmetric structures may be less accurate than those in centrosymmetric structures. (orig.)
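
The predictive relation in the abstract, σ̄(A)_p = K·R·N_c^(1/2)/Z_A with N_c = Σ Z_i²/Z_C², codes directly. The value of K below is a placeholder, not one of the paper's fitted constants, and the atom list is illustrative:

```python
# Predicted mean isotropic e.s.d. for element type A from the R factor and
# the atomic constitution of the asymmetric unit.
from math import sqrt

def n_c(atomic_numbers, z_carbon=6):
    """Carbon-normalized scattering-power count: N_c = sum(Z_i^2)/Z_C^2."""
    return sum(z * z for z in atomic_numbers) / z_carbon**2

def predicted_esd(K, r_factor, atomic_numbers, z_a):
    """sigma_p(A) = K * R * sqrt(N_c) / Z_A."""
    return K * r_factor * sqrt(n_c(atomic_numbers)) / z_a
```

Per the abstract, the constant for noncentrosymmetric structures is larger by roughly sqrt(2), so the same call with K multiplied by sqrt(2) would give the noncentrosymmetric estimate.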

  15. Comparative Analyses of the Relative Effects of Various Mutations in Major Histocompatibility Complex I: A Way to Predict Protein-Protein Interactions.

    Science.gov (United States)

    Ali, Ananya; Biswas, Ria; Bhattacharjee, Sanchari; Nath, Prabahan; Pan, Sumanjit; Bagchi, Angshuman

    2016-09-01

    Protein-protein interactions (PPIs) play pivotal roles in most biological processes. PPI dysfunctions are therefore associated with disease situations. Mutations often lead to PPI dysfunctions, but certain other mutations do not cause any appreciable abnormalities; mutations of this second type are called polymorphic mutations. So far, there are many studies that deal with the identification of PPI sites, but clear-cut analyses of the involvement of mutations in PPI dysfunctions are few and far between. We therefore made an attempt to link the appearance of mutations and PPI disruptions. We used the major histocompatibility complex as our reference protein complex. We analyzed the mutations leading to the disease amyloidosis and also the other mutations that do not lead to disease conditions. We computed various biophysical parameters, such as relative solvent accessibility, to discriminate between the two different types of mutations. Our analyses offer, for the first time, a plausible explanation for the effects of different types of mutations in disease development. We plan to build tools that detect the effects of mutations on disease development through PPI disruption.

  16. Effect of the fermentation pH on the storage stability of Lactobacillus rhamnosus preparations and suitability of in vitro analyses of cell physiological functions to predict it.

    Science.gov (United States)

    Saarela, M H; Alakomi, H-L; Puhakka, A; Mättö, J

    2009-04-01

    To investigate how cell physiological functions can predict the stability of freeze-dried probiotics. In addition, the effect of the fermentation pH on the stability of probiotics was investigated. Fermenter-grown (pH 5.8 or 5.0) Lactobacillus rhamnosus cells were freeze-dried and their survival was evaluated during storage at 37 °C, in apple juice and during acid [hydrochloric acid (HCl) and malic acid] and bile exposure. Cells grown at pH 5.0 generally coped better with acid stress than cells grown at pH 5.8. Cells were more sensitive to malic acid than to HCl. Short-term stability results of Lact. rhamnosus cells in malic acid correlated well with the long-term stability results in apple juice, whereas the results of cell membrane integrity studies were in accordance with bile exposure results. Malic acid exposure can prove useful in evaluating the long-term stability of probiotic preparations in apple juice. Fermentation at reduced pH may ensure a better performance of Lact. rhamnosus cells during subsequent acid stress. The beneficial effect of a lowered fermentation pH on Lact. rhamnosus stability during storage in apple juice, and the usefulness of the malic acid test in predicting that stability, were shown.

  17. Biomimetic in vitro oxidation of lapachol: a model to predict and analyse the in vivo phase I metabolism of bioactive compounds.

    Science.gov (United States)

    Niehues, Michael; Barros, Valéria Priscila; Emery, Flávio da Silva; Dias-Baruffi, Marcelo; Assis, Marilda das Dores; Lopes, Norberto Peporine

    2012-08-01

    The bioactive naphthoquinone lapachol was studied in vitro by a biomimetic model with Jacobsen catalyst (manganese(III) salen) and iodosylbenzene as oxidizing agent. Eleven oxidation derivatives were thus identified and two competitive oxidation pathways postulated. Similar to Mn(III) porphyrins, Jacobsen catalyst mainly induced the formation of para-naphthoquinone derivatives of lapachol, but also of two ortho-derivatives. The oxidation products were used to develop a GC-MS (SIM mode) method for the identification of potential phase I metabolites in vivo. Plasma analysis of Wistar rats orally administered lapachol revealed two metabolites, α-lapachone and dehydro-α-lapachone. Hence, the biomimetic model with a manganese salen complex has proved a valuable tool to predict and elucidate the in vivo phase I metabolism of lapachol, and possibly also of other bioactive natural compounds. Copyright © 2012 Elsevier Masson SAS. All rights reserved.

  18. Generation of a predicted protein database from EST data and application to iTRAQ analyses in grape (Vitis vinifera cv. Cabernet Sauvignon) berries at ripening initiation

    Directory of Open Access Journals (Sweden)

    Smith Derek

    2009-01-01

    Background iTRAQ is a proteomics technique that uses isobaric tags for relative and absolute quantitation of tryptic peptides. In proteomics experiments, the detection and high confidence annotation of proteins and the significance of corresponding expression differences can depend on the quality and the species specificity of the tryptic peptide map database used for analysis of the data. For species for which finished genome sequence data are not available, identification of proteins relies on similarity to proteins from other species using comprehensive peptide map databases such as the MSDB. Results We were interested in characterizing ripening initiation ('veraison') in grape berries at the protein level in order to better define the molecular control of this important process for grape growers and wine makers. We developed a bioinformatic pipeline for processing EST data in order to produce a predicted tryptic peptide database specifically targeted to the wine grape cultivar, Vitis vinifera cv. Cabernet Sauvignon, and lacking truncated N- and C-terminal fragments. By searching iTRAQ MS/MS data generated from berry exocarp and mesocarp samples at ripening initiation, we determined that implementation of the custom database afforded a large improvement in high confidence peptide annotation in comparison to the MSDB. We used iTRAQ MS/MS in conjunction with custom peptide db searches to quantitatively characterize several important pathway components for berry ripening previously described at the transcriptional level and confirmed expression patterns for these at the protein level. Conclusion We determined that a predicted peptide database for MS/MS applications can be derived from EST data using advanced clustering and trimming approaches and successfully implemented for quantitative proteome profiling. Quantitative shotgun proteome profiling holds great promise for characterizing biological processes such as fruit ripening

  19. Generation of a predicted protein database from EST data and application to iTRAQ analyses in grape (Vitis vinifera cv. Cabernet Sauvignon) berries at ripening initiation

    Science.gov (United States)

    Lücker, Joost; Laszczak, Mario; Smith, Derek; Lund, Steven T

    2009-01-01

    Background iTRAQ is a proteomics technique that uses isobaric tags for relative and absolute quantitation of tryptic peptides. In proteomics experiments, the detection and high confidence annotation of proteins and the significance of corresponding expression differences can depend on the quality and the species specificity of the tryptic peptide map database used for analysis of the data. For species for which finished genome sequence data are not available, identification of proteins relies on similarity to proteins from other species using comprehensive peptide map databases such as the MSDB. Results We were interested in characterizing ripening initiation ('veraison') in grape berries at the protein level in order to better define the molecular control of this important process for grape growers and wine makers. We developed a bioinformatic pipeline for processing EST data in order to produce a predicted tryptic peptide database specifically targeted to the wine grape cultivar, Vitis vinifera cv. Cabernet Sauvignon, and lacking truncated N- and C-terminal fragments. By searching iTRAQ MS/MS data generated from berry exocarp and mesocarp samples at ripening initiation, we determined that implementation of the custom database afforded a large improvement in high confidence peptide annotation in comparison to the MSDB. We used iTRAQ MS/MS in conjunction with custom peptide db searches to quantitatively characterize several important pathway components for berry ripening previously described at the transcriptional level and confirmed expression patterns for these at the protein level. Conclusion We determined that a predicted peptide database for MS/MS applications can be derived from EST data using advanced clustering and trimming approaches and successfully implemented for quantitative proteome profiling. Quantitative shotgun proteome profiling holds great promise for characterizing biological processes such as fruit ripening initiation and may be further
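
The predicted tryptic peptide database rests on a simple cleavage rule; a minimal sketch, assuming the common "cleave C-terminal to K or R, except before P" convention (the sequence in the usage note is invented, not a grape protein):

```python
# Minimal in silico tryptic digest of the kind used to build a predicted
# peptide database from translated ESTs: split after K or R unless the
# next residue is proline, then drop peptides too short to be useful.
import re

def tryptic_peptides(protein, min_len=6):
    """Split a protein sequence at K/R (not before P); keep long peptides."""
    peptides = re.split(r"(?<=[KR])(?!P)", protein)
    return [p for p in peptides if len(p) >= min_len]
```

For example, tryptic_peptides("MAGICKTRYPTICRPEPTIDEK") keeps "MAGICK" and "YPTICRPEPTIDEK" (the internal R is followed by P, so no cleavage there) while the short "TR" fragment is discarded.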

  20. Structural Analysis and Test Comparison of a 20-Meter Inflation-Deployed Solar Sail

    Science.gov (United States)

    Sleight, David W.; Mann, Troy; Lichodziejewski, David; Derbes, Billy

    2006-01-01

    Under the direction of the NASA In-Space Propulsion Technology Office, the team of L'Garde, NASA Jet Propulsion Laboratory, Ball Aerospace, and NASA Langley Research Center has been developing a scalable solar sail configuration to address NASA's future space propulsion needs. Prior to a flight experiment of a full-scale solar sail, a comprehensive test program was implemented to advance the technology readiness level of the solar sail design. These tests consisted of solar sail component, subsystem, and sub-scale system ground tests that simulated aspects of the space environment, such as vacuum and thermal conditions. In July 2005, a 20-m four-quadrant solar sail system test article was tested in the NASA Glenn Research Center's Space Power Facility to measure its static and dynamic structural responses. Key to the maturation of solar sail technology is the development of validated finite element analysis (FEA) models that can be used for design and analysis of solar sails. A major objective of the program was to utilize the test data to validate the FEA models simulating the solar sail ground tests. The FEA software, ABAQUS, was used to perform the structural analyses to simulate the ground tests performed on the 20-m solar sail test article. This paper presents the details of the FEA modeling, the structural analyses simulating the ground tests, and a comparison of the pretest and post-test analysis predictions with the ground test results for the 20-m solar sail system test article. The structural responses that are compared in the paper include load-deflection curves and natural frequencies for the beam structural assembly and static shape, natural frequencies, and mode shapes for the solar sail membrane. The analysis predictions were in reasonable agreement with the test data. Factors that precluded better correlation of the analyses and the tests were unmeasured initial conditions in the test set-up.

  1. Structural and Biochemical Analyses of Swine Major Histocompatibility Complex Class I Complexes and Prediction of the Epitope Map of Important Influenza A Virus Strains.

    Science.gov (United States)

    Fan, Shuhua; Wu, Yanan; Wang, Song; Wang, Zhenbao; Jiang, Bo; Liu, Yanjie; Liang, Ruiying; Zhou, Wenzhong; Zhang, Nianzhi; Xia, Chun

    2016-08-01

    , the skewing of the α1 and α2 helixes is important in the different peptide conformations in SLA-3*hs0202. We also determined the fundamental motif for SLA-3*hs0202 to be X-(M/A/R)-(N/Q/R/F)-X-X-X-X-X-(V/I) based on a series of structural and biochemical analyses, and 28 SLA-3*hs0202-restricted epitope candidates were identified from important IAV strains. We believe our structure and analyses of pSLA-3*hs0202 can benefit vaccine development to control IAV in swine. Copyright © 2016, American Society for Microbiology. All Rights Reserved.

  2. Effects of pharmacists' interventions on appropriateness of prescribing and evaluation of the instruments' (MAI, STOPP and START's) ability to predict hospitalization--analyses from a randomized controlled trial.

    Directory of Open Access Journals (Sweden)

    Ulrika Gillespie

    Full Text Available Appropriateness of prescribing can be assessed by various measures and screening instruments. The aims of this study were to investigate the effects of pharmacists' interventions on appropriateness of prescribing in elderly patients, and to explore the relationship between these results and hospital care utilization during a 12-month follow-up period. The study population from a previous randomized controlled study, in which the effects of a comprehensive pharmacist intervention on re-hospitalization were investigated, was used. The criteria from the instruments MAI, STOPP and START were applied retrospectively to the 368 study patients (intervention group (I) n = 182, control group (C) n = 186). The assessments were done on admission and at discharge to detect differences over time and between the groups. Hospital care consumption was recorded and the association between scores for appropriateness and hospitalization was analysed. The number of Potentially Inappropriate Medicines (PIMs) per patient as identified by STOPP was reduced for I but not for C (1.42 to 0.93 vs. 1.46 to 1.66, respectively; p<0.01). The number of Potential Prescription Omissions (PPOs) per patient as identified by START was reduced for I but not for C (0.36 to 0.09 vs. 0.42 to 0.45, respectively; p<0.001). The summated score for MAI was reduced for I but not for C (8.5 to 5.0 vs. 8.7 to 10.0, respectively; p<0.001). There was a positive association between scores for MAI and STOPP and drug-related readmissions (RR 8-9% and 30-34%, respectively). No association was detected between the scores of the tools and total re-visits to hospital. The interventions significantly improved the appropriateness of prescribing for patients in the intervention group as evaluated by the instruments MAI, STOPP and START. High scores in MAI and STOPP were associated with a higher number of drug-related readmissions.

  3. A new tool for prediction and analysis of thermal comfort in steady and transient states; Un nouvel outil pour la prediction et l'analyse du confort thermique en regime permanent et variable

    Energy Technology Data Exchange (ETDEWEB)

    Megri, A.Ch. [Illinois Institute of Technology, Civil and Architectural Engineering Dept., Chicago, Illinois (United States); Megri, A.F. [Centre Universitaire de Tebessa, Dept. d' Electronique (Algeria); El Naqa, I. [Washington Univ., School of Medicine, Dept. of Radiation Oncology, Saint Louis, Missouri (United States); Achard, G. [Universite de Savoie, Lab. Optimisation de la Conception et Ingenierie de L' Environnement (LOCIE) - ESIGEC, 73 - Le Bourget du Lac (France)

    2006-02-15

    Thermal comfort is influenced by psychological as well as physiological factors. This paper proposes the use of support vector machine (SVM) learning for automated prediction of human thermal comfort in steady and transient states. The SVM is an artificial intelligence approach that can capture the input/output mapping from given data. Support vector machines were developed based on the Structural Risk Minimization principle. Different sets of representative experimental environmental factors that affect a homogeneous person's thermal balance were used for training the SVM. The SVM is a very efficient, fast, and accurate technique to identify thermal comfort. This technique permits the determination of thermal comfort indices for different sub-categories of people, such as the sick and elderly or those in extreme climatic conditions, when experimental data for such a sub-category are available. The experimental data have been used for the learning and testing processes. The results show a good correlation between SVM-predicted values and those obtained from conventional thermal comfort models, such as the Fanger and Gagge models. The 'trained machine' with representative data could be used easily and effectively in comparison with other conventional methods of estimating different indices. (author)
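
As a rough illustration of the learning setup described above - mapping measured environmental factors to a comfort index - the following numpy-only sketch uses kernel ridge regression with an RBF kernel as a lightweight stand-in for the paper's SVM. The features, training data and comfort scale are all synthetic assumptions, not the authors' data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical environmental factors: air temperature (C),
# relative humidity (%), air speed (m/s)
X = np.column_stack([rng.uniform(15, 35, 150),
                     rng.uniform(20, 80, 150),
                     rng.uniform(0.0, 1.0, 150)])
# Synthetic comfort vote on a PMV-like [-3, 3] scale (warmer -> higher)
y = np.clip(0.3 * (X[:, 0] - 24) - 0.8 * X[:, 2] + rng.normal(0, 0.1, 150), -3, 3)

def rbf(A, B, gamma=0.5):
    """RBF kernel matrix between rows of A and rows of B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# Standardise features so the kernel width is meaningful across units
mu, sd = X.mean(0), X.std(0)
Xs = (X - mu) / sd

# Closed-form kernel ridge fit: (K + lam*I) alpha = y
alpha = np.linalg.solve(rbf(Xs, Xs) + 1e-2 * np.eye(len(Xs)), y)

def predict(Xnew):
    return rbf((np.asarray(Xnew) - mu) / sd, Xs) @ alpha

warm = predict([[32.0, 50.0, 0.1]])[0]  # warm, still air
cool = predict([[18.0, 50.0, 0.1]])[0]  # cool, still air
```

Given enough labelled examples, `warm` comes out above `cool`, mirroring how a trained machine can reproduce the ordering of conventional comfort indices without an explicit heat-balance model.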

  4. Area under the curve predictions of dalbavancin, a new lipoglycopeptide agent, using the end of intravenous infusion concentration data point by regression analyses such as linear, log-linear and power models.

    Science.gov (United States)

    Bhamidipati, Ravi Kanth; Syed, Muzeeb; Mullangi, Ramesh; Srinivas, Nuggehally

    2018-02-01

    1. Dalbavancin, a lipoglycopeptide, is approved for treating gram-positive bacterial infections. The area under the plasma concentration versus time curve (AUCinf) of dalbavancin is a key parameter, and the AUCinf/MIC ratio is a critical pharmacodynamic marker. 2. Using the end of intravenous infusion concentration (i.e. Cmax), the Cmax versus AUCinf relationship for dalbavancin was established by regression analyses (i.e. linear, log-log, log-linear and power models) using 21 pairs of subject data. 3. Predictions of AUCinf were performed by applying the regression equations to published Cmax data. The quotient of observed/predicted values rendered the fold difference. The mean absolute error (MAE), root mean square error (RMSE) and correlation coefficient (r) were used in the assessment. 4. MAE and RMSE values for the various models were comparable. Cmax versus AUCinf exhibited excellent correlation (r > 0.9488). The internal data evaluation showed narrow confinement (0.84-1.14-fold difference). Regardless of the regression model, a single time point strategy of using Cmax (i.e. end of 30-min infusion) is amenable as a prospective tool for predicting AUCinf of dalbavancin in patients.
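
The single-time-point idea can be sketched with generic regression machinery: fit linear and power models to Cmax/AUCinf pairs, then judge predictions by fold difference, RMSE and correlation. The 21 subject pairs are not reproduced in the abstract, so the values below are synthetic stand-ins.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data: end-of-infusion concentration (Cmax) and AUCinf pairs
cmax = rng.uniform(200, 400, 21)                            # e.g. ug/mL
auc = 60.0 * cmax ** 1.02 * rng.lognormal(0.0, 0.05, 21)    # e.g. ug*h/mL

# Linear model: AUC = a*Cmax + b
a, b = np.polyfit(cmax, auc, 1)
pred_lin = a * cmax + b

# Power model: AUC = A * Cmax**k, fitted in log-log space
k, logA = np.polyfit(np.log(cmax), np.log(auc), 1)
pred_pow = np.exp(logA) * cmax ** k

def rmse(obs, pred):
    return float(np.sqrt(np.mean((obs - pred) ** 2)))

# Observed/predicted quotient = fold difference, as in the abstract
fold = auc / pred_pow
r = float(np.corrcoef(cmax, auc)[0, 1])
```

With tight multiplicative noise, both models keep the fold difference narrowly confined around 1, which is the property that makes a single Cmax sample usable as an AUCinf surrogate.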

  5. The transcription factor DREAM represses A20 and mediates inflammation

    OpenAIRE

    Tiruppathi, Chinnaswamy; Soni, Dheeraj; Wang, Dong-Mei; Xue, Jiaping; Singh, Vandana; Thippegowda, Prabhakar B.; Cheppudira, Bopaiah P.; Mishra, Rakesh K.; DebRoy, Auditi; Qian, Zhijian; Bachmaier, Kurt; Zhao, Youyang; Christman, John W.; Vogel, Stephen M.; Ma, Averil

    2014-01-01

    Here we show that the transcription repressor DREAM binds to the A20 promoter to repress the expression of A20, the deubiquitinase suppressing inflammatory NF-κB signaling. DREAM-deficient (Dream−/−) mice displayed persistent and unchecked A20 expression in response to endotoxin. DREAM functioned by transcriptionally repressing A20 through binding to downstream regulatory elements (DREs). In contrast, USF1 binding to the DRE-associated E-box domain activated A20 expression in response to inf...

  6. Environmental Volunteering and Health Outcomes over a 20-Year Period

    Science.gov (United States)

    Pillemer, Karl; Fuller-Rowell, Thomas E.; Reid, M. C.; Wells, Nancy M.

    2010-01-01

    Purpose: This study tested the hypothesis that volunteering in environmental organizations in midlife is associated with greater physical activity and improved mental and physical health over a 20-year period. Design and Methods: The study used data from two waves (1974 and 1994) of the Alameda County Study, a longitudinal study of health and mortality that has followed a cohort of 6,928 adults since 1965. Using logistic and multiple regression models, we examined the prospective association between environmental and other volunteerism and three outcomes (physical activity, self-reported health, and depression), with 1974 volunteerism predicting 1994 outcomes, controlling for a number of relevant covariates. Results: Midlife environmental volunteering was significantly associated with physical activity, self-reported health, and depressive symptoms. Implications: This population-based study offers the first epidemiological evidence for a significant positive relationship between environmental volunteering and health and well-being outcomes. Further research, including intervention studies, is needed to confirm and shed additional light on these initial findings. PMID:20172902

  7. Juvenile nasopharyngeal angiofibroma in a 20-year-old Nigerian male

    African Journals Online (AJOL)

    This paper presents the misdiagnosis of a 20-year-old male with juvenile nasopharyngeal angiofibroma (JNA). Methods: The case record of a 20-year-old male who presented with recurrent spontaneous profuse epistaxis, progressive nasal obstruction, hyponasality and conductive hearing loss with a mass in the post nasal space ...

  8. A20 Restrains Thymic Regulatory T Cell Development.

    Science.gov (United States)

    Fischer, Julius Clemens; Otten, Vera; Kober, Maike; Drees, Christoph; Rosenbaum, Marc; Schmickl, Martina; Heidegger, Simon; Beyaert, Rudi; van Loo, Geert; Li, Xian Chang; Peschel, Christian; Schmidt-Supprian, Marc; Haas, Tobias; Spoerl, Silvia; Poeck, Hendrik

    2017-10-01

    Maintaining immune tolerance requires the production of Foxp3-expressing regulatory T (Treg) cells in the thymus. Activation of NF-κB transcription factors is critically required for Treg cell development, partly via initiating Foxp3 expression. NF-κB activation is controlled by a negative feedback regulation through the ubiquitin-editing enzyme A20, which reduces proinflammatory signaling in myeloid cells and B cells. In naive CD4+ T cells, A20 prevents kinase RIPK3-dependent necroptosis. Using mice deficient for A20 in T lineage cells, we show that thymic and peripheral Treg cell compartments are quantitatively enlarged because of a cell-intrinsic developmental advantage of A20-deficient thymic Treg differentiation. A20-deficient thymic Treg cells exhibit reduced dependence on IL-2 but unchanged rates of proliferation and apoptosis. Activation of the NF-κB transcription factor RelA was enhanced, whereas nuclear translocation of c-Rel was decreased in A20-deficient thymic Treg cells. Furthermore, we found that the increase in Treg cells in T cell-specific A20-deficient mice was already observed in CD4+ single-positive CD25+GITR+Foxp3− thymic Treg cell progenitors. Treg cell precursors expressed high levels of the tumor necrosis factor receptor superfamily molecule GITR, whose stimulation is closely linked to thymic Treg cell development. A20-deficient Treg cells efficiently suppressed effector T cell-mediated graft-versus-host disease after allogeneic hematopoietic stem cell transplantation, suggesting normal suppressive function. Holding thymic production of natural Treg cells in check, A20 thus integrates Treg cell activity and increased effector T cell survival into an efficient CD4+ T cell response. Copyright © 2017 by The American Association of Immunologists, Inc.

  9. Network class superposition analyses.

    Directory of Open Access Journals (Sweden)

    Carl A B Pearson

    Full Text Available Networks are often used to understand a whole system by modeling the interactions among its pieces. Examples include biomolecules in a cell interacting to provide some primary function, or species in an environment forming a stable community. However, these interactions are often unknown; instead, the pieces' dynamic states are known, and network structure must be inferred. Because observed function may be explained by many different networks (e.g., ≈ 10^30 for the yeast cell cycle process), considering dynamics beyond this primary function means picking a single network or suitable sample: measuring over all networks exhibiting the primary function is computationally infeasible. We circumvent that obstacle by calculating the network class ensemble. We represent the ensemble by a stochastic matrix T, which is a transition-by-transition superposition of the system dynamics for each member of the class. We present concrete results for T derived from boolean time series dynamics on networks obeying the Strong Inhibition rule, by applying T to several traditional questions about network dynamics. We show that the distribution of the number of point attractors can be accurately estimated with T. We show how to generate Derrida plots based on T. We show that T-based Shannon entropy outperforms other methods at selecting experiments to further narrow the network structure. We also outline an experimental test of predictions based on T. We motivate all of these results in terms of a popular molecular biology boolean network model for the yeast cell cycle, but the methods and analyses we introduce are general. We conclude with open questions for T, for example, application to other models, computational considerations when scaling up to larger systems, and other potential analyses.
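
The superposition idea can be illustrated in a few lines: build the deterministic state-transition matrix of each network in a class, average them into T, and read off class-level quantities such as the expected number of point attractors (the trace of T) and the per-state Shannon entropy. The two 3-node boolean networks below are invented toy examples, not the yeast cell-cycle class.

```python
import numpy as np
from itertools import product

states = list(product([0, 1], repeat=3))          # the 8 states of a 3-node net
index = {s: i for i, s in enumerate(states)}

def net_a(s):                                     # toy boolean update rule A
    x, y, z = s
    return (y, x and z, x or y)

def net_b(s):                                     # toy boolean update rule B
    x, y, z = s
    return (y, x, x or y)

def transition_matrix(update):
    """Deterministic state-transition matrix of one network."""
    M = np.zeros((8, 8))
    for s in states:
        M[index[s], index[update(s)]] = 1.0
    return M

# T: transition-by-transition superposition over the (two-member) class
T = (transition_matrix(net_a) + transition_matrix(net_b)) / 2.0

# Expected number of point attractors across the class = trace of T
expected_fixed_points = float(np.trace(T))

# Per-state Shannon entropy of T: 0 bits where the class agrees on the successor
row_entropy = np.array([-(r[r > 0] * np.log2(r[r > 0])).sum() for r in T])
```

States where the class members disagree on the successor carry nonzero row entropy, which is exactly the signal the abstract exploits for choosing experiments that best narrow the network structure.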

  10. Cruise>Climate Variability and Predictability (CLIVAR) A22,A20 (AT20, EM122)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The hydrographic surveys will consist of approximately 180 full water column CTD/LADCP casts along the trackline. Each cast will acquire up to 36 water samples on...

  11. Conceptual Nuclear Design of a 20 MW Multipurpose Research Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Chul Gyo; Kim, Hak Sung; Park, Cheol [KAERI, Daejeon (Korea, Republic of); Nghiem, Huynh Ton; Vinh, Le Vinh; Dang, Vo Doan Hai [Dalat Nuclear Research Reactor, Hanoi (Viet Nam)

    2007-08-15

    A conceptual nuclear design of a 20 MW multi-purpose research reactor for Vietnam has been jointly done by the KAERI and the DNRI (VAEC). The AHR reference core in this report is a light water cooled and heavy water reflected open-tank-in-pool type multipurpose research reactor with 20 MW. The rod type fuel of a dispersed U3Si2-Al with a density of 4.0 gU/cc is used as a fuel. The core consists of fourteen 36-element assemblies and four 18-element assemblies, and has three in-core irradiation sites. The reflector tank filled with heavy water surrounds the core and provides rooms for various irradiation holes. Major analyses have been done for the relevant nuclear design parameters such as the neutron flux and power distributions, reactivity coefficients, control rod worths, etc. For the analysis, the MCNP, MVP, and HELIOS codes were used by KAERI and DNRI (VAEC). The results by MCNP (KAERI) and MVP (DNRI) showed good agreement and can be summarized as follows. For a clean, unperturbed core condition such that the fuels are all fresh and there are no irradiation holes in the reflector region, the fast neutron flux (En >= 1.0 MeV) reaches 1.47x10^14 n/cm^2·s and the maximum thermal neutron flux (En <= 0.625 eV) reaches 4.43x10^14 n/cm^2·s in the core region. In the reflector region, the thermal neutron peak occurs about 28 cm from the core center and the maximum thermal neutron flux is estimated to be 4.09x10^14 n/cm^2·s. For the analysis of the equilibrium cycle core, the irradiation facilities in the reflector region were considered. The cycle length was estimated to be 38 days with a refueling scheme of replacing three 36-element fuel assemblies, or two 36-element and one 18-element fuel assemblies. The excess reactivity at a BOC was 103.4 mk, and a minimum of 24.6 mk was reserved at an EOC. The assembly average discharge burnup was 54.6% of the initial U-235 loading.
For the proposed fuel management

  12. Cardiovascular and autonomic response induced by a 20-week ...

    African Journals Online (AJOL)

    DOI:10.7196/SAJSM.564. Cardiovascular and autonomic response induced by a 20-week military training programme in young healthy South African males ... processed foods) and rural urbanisation in sub-Saharan Africa has seen a shift in the underlying population distribution of risk factors for cardiovascular disease.

  14. Cambio estructural a 20 años del TLCAN

    Directory of Open Access Journals (Sweden)

    Margarita Camarena Luhrs

    2017-01-01

    Full Text Available Text read at the presentation of the book A 20 años del TLC, México, by Eugenia Correa and Antonio Gazol (coordinators) (2016), Academia Mexicana de Economía Política/Facultad de Economía UNAM, 22/11/2016.

  15. Novel A20-gene-eluting stent inhibits carotid artery restenosis in a porcine model

    Directory of Open Access Journals (Sweden)

    Zhou ZH

    2016-08-01

    Full Text Available Background: Carotid artery stenosis is a major risk factor for ischemic stroke. Although carotid angioplasty and stenting using an embolic protection device has been introduced as a less invasive carotid revascularization approach, in-stent restenosis limits its long-term efficacy and safety. The objective of this study was to test the anti-restenosis effects of local stent-mediated delivery of the A20 gene in a porcine carotid artery model. Materials and methods: The pCDNA3.1EHA20 was firmly attached onto stents that had been collagen coated and treated with N-succinimidyl-3-(2-pyridyldithio)propionate solution and anti-DNA immunoglobulin fixation. Anti-restenosis effects of modified vs control (bare-metal and pCDNA3.1 void vector) stents were assessed by Western blot and scanning electron microscopy, as well as by morphological and inflammatory reaction analyses. Results: Stent-delivered A20 gene was locally expressed in porcine carotids in association with a significantly greater extent of re-endothelialization at day 14 and of neointimal hyperplasia inhibition at 3 months than stenting without A20 gene expression. Conclusion: The A20-gene-eluting stent inhibits neointimal hyperplasia while promoting re-endothelialization and therefore constitutes a novel potential alternative to prevent restenosis while minimizing complications. Keywords: restenosis, A20, gene therapy, stent, endothelialization

  16. Expression of the nuclear factor-κB inhibitor A20 is altered in the cystic fibrosis epithelium.

    Science.gov (United States)

    Kelly, Catriona; Williams, Mark T; Mitchell, Kathryn; Elborn, J Stuart; Ennis, Madeleine; Schock, Bettina C

    2013-06-01

    A20 is a lipopolysaccharide (LPS)-inducible, cytoplasmic zinc finger protein, which inhibits Toll-like receptor-activated nuclear factor (NF)-κB signalling by deubiquitinating tumour necrosis factor receptor-associated factor (TRAF)-6. The action of A20 is facilitated by complex formation with ring finger protein (RNF)-11, Itch and TAX-1 binding protein-1 (TAX1BP1). This study investigated whether the expression of A20 is altered in the chronically inflamed cystic fibrosis (CF) airway epithelium. Nasal epithelial cells from CF patients (F508del homozygous), non-CF controls and immortalised epithelial cells (16HBE14o- and CFBE41o-) were stimulated with LPS. Cytoplasmic expression of A20 and expression of NF-κB subunits were analysed. Formation of the A20 ubiquitin editing complex was also investigated. In CFBE41o-, peak LPS-induced A20 expression was delayed compared with 16HBE14o- and fell significantly below basal levels 12-24 h after LPS stimulation. This was confirmed in primary CF airway cells. Additionally, a significant inverse relationship between A20 and p65 expression was observed. Inhibitor studies showed that A20 does not undergo proteasomal degradation in CFBE41o-. A20 interacted with TAX1BP1, RNF11 and TRAF6 in 16HBE14o- cells, but these interactions were not observed in CFBE41o-. The expression of A20 is significantly altered in CF, and important interactions with complex members and target proteins are lost, which may contribute to the state of chronic NF-κB-driven inflammation.

  17. Uncertainty and Sensitivity Analyses Plan

    International Nuclear Information System (INIS)

    Simpson, J.C.; Ramsdell, J.V. Jr.

    1993-04-01

    Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project
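
The generic workflow such a plan prescribes - propagate parameter uncertainty through the model by Monte Carlo, then rank inputs by their influence on the output - can be sketched as follows. The toy dose model and the parameter distributions are made up for illustration; they are not the HEDRIC models or their actual inputs.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Hypothetical uncertain inputs of a toy dose model (not the HEDRIC codes)
release = rng.lognormal(mean=0.0, sigma=0.5, size=n)   # source term
dispersion = rng.uniform(0.1, 1.0, size=n)             # transport factor
intake = rng.normal(1.0, 0.1, size=n)                  # exposure factor

dose = release * dispersion * intake                   # toy dose estimate

# Uncertainty: spread of the output distribution
dose_p05, dose_p95 = np.percentile(dose, [5, 95])

def rank_corr(x, y):
    """Spearman rank correlation via Pearson on ranks (numpy only, no ties)."""
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    return float(np.corrcoef(rx, ry)[0, 1])

# Sensitivity: rank inputs by monotone association with the output
sens = {name: rank_corr(v, dose)
        for name, v in [("release", release),
                        ("dispersion", dispersion),
                        ("intake", intake)]}
```

The widest input distributions dominate the ranking, which is the basic logic behind a hierarchical sensitivity analysis: effort goes into pinning down the parameters with the largest output influence.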

  18. Anti-inflammatory and anti-osteoclastogenic effects of zinc finger protein A20 overexpression in human periodontal ligament cells.

    Science.gov (United States)

    Hong, J-Y; Bae, W-J; Yi, J-K; Kim, G-T; Kim, E-C

    2016-08-01

    Although overexpression of the nuclear factor κB inhibitory and ubiquitin-editing enzyme A20 is thought to be involved in the pathogenesis of inflammatory diseases, its function in periodontal disease remains unknown. The aims of the present study were to evaluate A20 expression in patients with periodontitis and to study the effects of A20 overexpression, using a recombinant adenovirus encoding A20 (Ad-A20), on the inflammatory response and on osteoclastic differentiation in lipopolysaccharide (LPS)- and nicotine-stimulated human periodontal ligament cells (hPDLCs). The concentration of prostaglandin E2 was measured by radioimmunoassay. Reverse transcription-polymerase chain reactions and western blot analyses were used to measure mRNA and protein levels, respectively. Osteoclastic differentiation was assessed in mouse bone marrow-derived macrophages using conditioned medium from LPS- and nicotine-treated hPDLCs. A20 was upregulated in the gingival tissues and neutrophils from patients with periodontitis and in LPS- and nicotine-exposed hPDLCs. Pretreatment with A20 overexpression by Ad-A20 markedly attenuated LPS- and nicotine-induced production of prostaglandin E2, as well as expression of cyclooxygenase-2 and proinflammatory cytokines. Moreover, A20 overexpression inhibited the number and size of tartrate-resistant acid phosphatase-stained osteoclasts, and downregulated osteoclast-specific gene expression. LPS- and nicotine-induced p38 phosphorylation and nuclear factor κB activation were blocked by Ad-A20. Ad-A20 inhibited the effects of nicotine and LPS on the activation of pan-protein kinase C, Akt, GSK-3β and protein kinase Cα. This study is the first to demonstrate that A20 overexpression has anti-inflammatory effects and blocks osteoclastic differentiation in a nicotine- and LPS-stimulated hPDLC model. Thus, A20 overexpression may be a potential therapeutic target in inflammatory bone loss diseases, such as periodontal disease. © 2015 John Wiley

  19. Periodic safety analyses

    International Nuclear Information System (INIS)

    Gouffon, A.; Zermizoglou, R.

    1990-12-01

    The IAEA Safety Guide 50-SG-S8, devoted to 'Safety Aspects of Foundations of Nuclear Power Plants', indicates that the operator of an NPP should establish a program of inspection of safe operation during construction, start-up and the service life of the plant, to obtain the data needed for estimating the lifetime of structures and components. At the same time, the program should ensure that the safety margins remain appropriate. Periodic safety analyses are an important part of the safety inspection program. Periodic safety reporting is a method for testing the whole safety system, or a part of it, against precise criteria. Periodic safety analyses are not meant for qualification of the plant components. Separate analyses are devoted to start-up, qualification of components and materials, and aging. All these analyses are described in this presentation. The last chapter describes the experience obtained for the PWR-900 and PWR-1300 units from 1986-1989.

  20. Laser Beam Focus Analyser

    DEFF Research Database (Denmark)

    Nielsen, Peter Carøe; Hansen, Hans Nørgaard; Olsen, Flemming Ove

    2007-01-01

    the obtainable features in direct laser machining as well as heat affected zones in welding processes. This paper describes the development of a measuring unit capable of analysing beam shape and diameter of lasers to be used in manufacturing processes. The analyser is based on the principle of a rotating mechanical wire being swept through the laser beam at varying Z-heights. The reflected signal is analysed and the resulting beam profile determined. The development comprised the design of a flexible fixture capable of providing both rotation and Z-axis movement, control software including data capture, and finally data analysis based on the ISO approach. The device was calibrated and tested on commercially available laser systems. It showed good reproducibility. The target was to be able to measure CW lasers with a power of up to 200 W, focused down to spot diameters in the range of 10 µm. In order...

  1. Antiferromagnetism in a 20% Ho-80% Tb alloy single crystal

    DEFF Research Database (Denmark)

    Lebech, Bente

    1968-01-01

    20% Ho-80% Tb exhibits two magnetic phases, similar to those of Tb. The spiral turn angle varies from 31.1° to 21.4°. A minimum effective spin for the occurrence of a stable simple ferromagnetic structure at low temperatures is predicted....

  2. IMI - Oral biopharmaceutics tools project - Evaluation of bottom-up PBPK prediction success part 3: Identifying gaps in system parameters by analysing In Silico performance across different compound classes.

    Science.gov (United States)

    Darwich, Adam S; Margolskee, Alison; Pepin, Xavier; Aarons, Leon; Galetin, Aleksandra; Rostami-Hodjegan, Amin; Carlert, Sara; Hammarberg, Maria; Hilgendorf, Constanze; Johansson, Pernilla; Karlsson, Eva; Murphy, Dónal; Tannergren, Christer; Thörn, Helena; Yasin, Mohammed; Mazuir, Florent; Nicolas, Olivier; Ramusovic, Sergej; Xu, Christine; Pathak, Shriram M; Korjamo, Timo; Laru, Johanna; Malkki, Jussi; Pappinen, Sari; Tuunainen, Johanna; Dressman, Jennifer; Hansmann, Simone; Kostewicz, Edmund; He, Handan; Heimbach, Tycho; Wu, Fan; Hoft, Carolin; Pang, Yan; Bolger, Michael B; Huehn, Eva; Lukacova, Viera; Mullin, James M; Szeto, Ke X; Costales, Chester; Lin, Jian; McAllister, Mark; Modi, Sweta; Rotter, Charles; Varma, Manthena; Wong, Mei; Mitra, Amitava; Bevernage, Jan; Biewenga, Jeike; Van Peer, Achiel; Lloyd, Richard; Shardlow, Carole; Langguth, Peter; Mishenzon, Irina; Nguyen, Mai Anh; Brown, Jonathan; Lennernäs, Hans; Abrahamsson, Bertil

    2017-01-01

    Three Physiologically Based Pharmacokinetic software packages (GI-Sim, Simcyp® Simulator, and GastroPlus™) were evaluated as part of the Innovative Medicine Initiative Oral Biopharmaceutics Tools project (OrBiTo) during a blinded "bottom-up" anticipation of human pharmacokinetics. After data analysis of the predicted vs. measured pharmacokinetic parameters, it was found that oral bioavailability (Foral) was underpredicted for compounds with low permeability, suggesting improper estimates of intestinal surface area, colonic absorption and/or lack of intestinal transporter information. Foral was also underpredicted for acidic compounds, suggesting overestimation of the impact of ionisation on permeation, lack of information on intestinal transporters, or underestimation of the solubilisation of weak acids due to less than optimal intestinal model pH settings or underestimation of the bile micelle contribution. Foral was overpredicted for weak bases, suggesting inadequate models for precipitation or lack of in vitro precipitation information to build informed models. Relative bioavailability was underpredicted for both high logP compounds and poorly water-soluble compounds, suggesting inadequate models for solubility/dissolution, underperforming bile enhancement models and/or lack of biorelevant solubility measurements. These results indicate areas for improvement in model software, modelling approaches, and generation of applicable input data. However, caution is required when interpreting the impact of drug-specific properties in this exercise, as the availability of input parameters was heterogeneous and highly variable, and the modellers generally used the data "as is" in this blinded bottom-up prediction approach. Copyright © 2016 Elsevier B.V. All rights reserved.
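
Evaluations of this kind are typically summarized with fold-error statistics. The sketch below computes two common generic metrics - the absolute average fold error (AAFE) and the fraction of predictions within two-fold of observation - on made-up Foral values; these are illustrative metrics, not OrBiTo's actual acceptance criteria or data.

```python
import numpy as np

# Hypothetical observed vs. bottom-up predicted oral bioavailability (Foral)
observed  = np.array([0.12, 0.35, 0.60, 0.25, 0.80, 0.45])
predicted = np.array([0.05, 0.30, 0.70, 0.10, 0.75, 0.50])

log_fold = np.log10(predicted / observed)

# Absolute average fold error: 1.0 means perfect agreement
aafe = 10 ** np.abs(log_fold).mean()

# Fraction of predictions within two-fold of the observation
within_twofold = float(np.mean(np.abs(log_fold) <= np.log10(2)))
```

Working in log space treats a 2x overprediction and a 2x underprediction symmetrically, which is why systematic under- or overprediction of subgroups (acids, weak bases) shows up cleanly in such summaries.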

  3. Contesting Citizenship: Comparative Analyses

    DEFF Research Database (Denmark)

    Siim, Birte; Squires, Judith

    2007-01-01

    importance of particularized experiences and multiple inequality agendas). These developments shape the way citizenship is both practiced and analysed. Mapping neat citizenship models onto distinct nation-states and evaluating these in relation to formal equality is no longer an adequate approach. Comparative citizenship analyses need to be considered in relation to multiple inequalities and their intersections and to multiple governance and trans-national organising. This, in turn, suggests that comparative citizenship analysis needs to consider new spaces in which struggles for equal citizenship occur...

  4. Dialogisk kommunikationsteoretisk analyse

    DEFF Research Database (Denmark)

    Phillips, Louise Jane

    2018-01-01

    analysis method that has been developed within dialogic communication research - The Integrated Framework for Analysing Dialogic Knowledge Production and Communication (IFADIA). The IFADIA method builds on a combination of Bakhtin's dialogue theory and Foucault's theory of power/knowledge and discourse. The method is intended...

  5. Probabilistic safety analyses (PSA)

    International Nuclear Information System (INIS)

    1997-01-01

    The guide shows how probabilistic safety analyses (PSA) are used in the design, construction and operation of light water reactor plants in order to ensure, for their part, that the safety of the plant is adequate in all plant operational states.

  6. Meta-analyses

    NARCIS (Netherlands)

    Hendriks, Maria A.; Luyten, Johannes W.; Scheerens, Jaap; Sleegers, P.J.C.; Scheerens, J

    2014-01-01

    In this chapter results of a research synthesis and quantitative meta-analyses of three facets of time effects in education are presented, namely time at school during regular lesson hours, homework, and extended learning time. The number of studies for these three facets of time that could be used

  7. Analysing Access Control Specifications

    DEFF Research Database (Denmark)

    Probst, Christian W.; Hansen, René Rydhof

    2009-01-01

    . Recent events have revealed intimate knowledge of surveillance and control systems on the side of the attacker, making it often impossible to deduce the identity of an inside attacker from logged data. In this work we present an approach that analyses the access control configuration to identify the set...

  8. Wavelet Analyses and Applications

    Science.gov (United States)

    Bordeianu, Cristian C.; Landau, Rubin H.; Paez, Manuel J.

    2009-01-01

    It is shown how a modern extension of Fourier analysis known as wavelet analysis is applied to signals containing multiscale information. First, a continuous wavelet transform is used to analyse the spectrum of a nonstationary signal (one whose form changes in time). The spectral analysis of such a signal gives the strength of the signal in each…
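As a rough illustration of the approach described (not the authors' code), the following sketch implements a minimal continuous wavelet transform with a complex Morlet wavelet directly in NumPy and applies it to a nonstationary signal whose frequency jumps from 5 Hz to 20 Hz; all function names and parameter values are illustrative assumptions:

```python
import numpy as np

def morlet(t, scale, w0=6.0):
    """Complex Morlet wavelet sampled at times t and dilated by `scale`."""
    x = t / scale
    return np.exp(1j * w0 * x) * np.exp(-x**2 / 2) / np.sqrt(scale)

def cwt(signal, dt, scales):
    """Continuous wavelet transform via direct convolution at each scale."""
    n = len(signal)
    t = (np.arange(n) - n // 2) * dt          # wavelet support centred on 0
    rows = [np.convolve(signal, np.conj(morlet(t, s))[::-1], mode="same") * dt
            for s in scales]
    return np.array(rows)

# Nonstationary test signal: 5 Hz for the first 2 s, then 20 Hz
fs = 200.0
t = np.arange(0, 4, 1 / fs)
sig = np.where(t < 2, np.sin(2 * np.pi * 5 * t), np.sin(2 * np.pi * 20 * t))

# Scales tuned to 5 Hz and 20 Hz (centre frequency ~ w0 / (2*pi*scale))
scales = [6.0 / (2 * np.pi * 5.0), 6.0 / (2 * np.pi * 20.0)]
power = np.abs(cwt(sig, 1 / fs, scales)) ** 2

# The 5 Hz row dominates the early samples, the 20 Hz row the late samples,
# showing how the CWT localises the strength of each component in time
print(power[0, 200] > power[1, 200], power[1, 600] > power[0, 600])
```

The scalogram `power` carries exactly the time-resolved spectral strength the abstract describes: each row tracks one frequency band as the signal's form changes in time.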

  9. Filmstil - teori og analyse

    DEFF Research Database (Denmark)

    Hansen, Lennard Højbjerg

Film style decisively shapes our experience of a film. Yet film style, the way the moving images organise the narrative, receives far less attention than the film's plot when we talk about film. Filmstil - teori og analyse is a richly exemplified presentation, critique and further development of...

  10. Risico-analyse brandstofpontons

    NARCIS (Netherlands)

    Uijt de Haag P; Post J; LSO

    2001-01-01

To determine the risks posed by fuel pontoons in a marina, a generic risk analysis was carried out. A reference system was defined, consisting of a concrete fuel pontoon with a relatively large capacity and throughput. The pontoon was assumed to be located in a

  11. Hydrophysical conditions and periphyton in natural rivers. Analysis and predictive modelling of periphyton by changed regulations; Hydrofysiske forhold og begroing i naturlige elver. Analyse og prediktiv modellering av begroing ved reguleringsendringer

    Energy Technology Data Exchange (ETDEWEB)

    Stokseth, S.

    1994-10-01

The objective of this thesis has been to examine the interaction between hydrodynamical and physical factors and the temporal and spatial dynamics of periphyton in natural steep rivers. The study strategy has been to work with quantitative system variables in order to evaluate the potential usability of a predictive model for periphyton changes as a response to river regulations. The thesis comprises a theoretical and an empirical study. The theoretical study presents a conceptual model of the relevant factors based on an analysis of published studies; effort has been made to evaluate and present the background material in a structured way. To handle the spatial and temporal dynamics of periphyton concurrently, a new method for data collection has been developed, together with a procedure for quantifying the photo registrations. The simple hydrodynamical parameters were estimated from a set of standard formulas, whereas the complex parameters were estimated from a three-dimensional simulation model called SSIIM. The main conclusion from the analysis is that flood events are the major controlling factors with respect to periphyton biomass and that water temperature is of major importance for periphyton resistance: low temperature clearly increases the periphyton's resistance to erosion. Thus, to model or control the temporal dynamics of river periphyton, the water temperature and the frequency and size of floods should be regarded as the most significant controlling factors. The data in this study have been collected from a river with a stable water quality and frequent floods. 109 refs., 41 figs., 34 tabs.

  12. Childhood neoplasms presenting at autopsy: A 20-year experience.

    Science.gov (United States)

    Bryant, Victoria A; Booth, John; Palm, Liina; Ashworth, Michael; Jacques, Thomas S; Sebire, Neil J

    2017-09-01

    The aims of the review are to establish the number of undiagnosed neoplasms presenting at autopsy in a single centre and to determine the incidence and most common causes of sudden unexpected death due to neoplasia in infancy and childhood (SUDNIC). Retrospective observational study of paediatric autopsies performed on behalf of Her Majesty's Coroner over a 20-year period (1996-2015; n = 2,432). Neoplasms first diagnosed at autopsy were identified from an established database and cases meeting the criteria for sudden unexpected death were further categorised. Thirteen previously undiagnosed neoplasms were identified, including five haematological malignancies, two medulloblastomas, two neuroblastomas, two cardiac tumours and two malignancies of renal origin. Eight cases met the criteria for SUDNIC (0.33% of autopsies), the commonest group of which were haematological malignancies (n = 3). Neoplasms presenting as unexpected death in infancy and childhood and diagnosed at autopsy are rare. The findings suggest that haematological malignancies are the commonest cause of SUDNIC and highlight the importance of specialist autopsy in cases of sudden unexpected death. © 2017 Wiley Periodicals, Inc.

  13. Description of a 20 Kilohertz power distribution system

    Science.gov (United States)

    Hansen, I. G.

    1986-01-01

A single phase, 440 VRMS, 20 kHz power distribution system with a regulated sinusoidal wave form is discussed. A single phase power system minimizes the wiring, sensing, and control complexities required in a multi-sourced redundantly distributed power system. The single phase addresses only the distribution link; techniques for accommodating multiphase lower-frequency inputs and outputs are described. While the 440 V operating potential was initially selected for aircraft operating below 50,000 ft, this potential also appears suitable for space power systems. This voltage choice recognizes a reasonable upper limit for semiconductor ratings, yet permits direct synthesis of 220 V, 3-phase power. A 20 kHz operating frequency was selected to be above the range of audibility and to minimize the weight of reactive components, while still allowing the construction of single power stages of 25 to 30 kW. The regulated sinusoidal distribution system has several advantages. With a regulated voltage, most ac/dc conversions involve rather simple transformer-rectifier applications. A sinusoidal distribution system, when used in conjunction with zero-crossing switching, represents a minimal source of EMI. The present state of 20 kHz power technology includes computer control of voltage and/or frequency, low-inductance cable, current-limiting circuit protection, bi-directional power flow, and motor/generator operation using standard induction machines. A status update and description of each of these items and their significance is presented.

  14. Biomass feedstock analyses

    Energy Technology Data Exchange (ETDEWEB)

    Wilen, C.; Moilanen, A.; Kurkela, E. [VTT Energy, Espoo (Finland). Energy Production Technologies

    1996-12-31

The overall objectives of the project "Feasibility of electricity production from biomass by pressurized gasification systems" within the EC Research Programme JOULE II were to evaluate the potential of advanced power production systems based on biomass gasification and to study the technical and economic feasibility of these new processes with different types of biomass feedstocks. This report was prepared as part of this R and D project. The objectives of this task were to perform fuel analyses of potential woody and herbaceous biomasses with specific regard to the gasification properties of the selected feedstocks. The analyses of 15 Scandinavian and European biomass feedstocks included density, proximate and ultimate analyses, trace compounds, ash composition and fusion behaviour in oxidizing and reducing atmospheres. The wood-derived fuels, such as whole-tree chips, forest residues, bark and to some extent willow, can be expected to have good gasification properties. Difficulties caused by ash fusion and sintering in straw combustion and gasification are generally known. The ash and alkali metal contents of the European biomasses harvested in Italy resembled those of the Nordic straws, and they are expected to behave largely like straw in gasification. No direct relation between the ash fusion behaviour (determined according to the standard method) and, for instance, the alkali metal content was found in the laboratory determinations. A more profound characterisation of the fuels would require gasification experiments in a thermobalance and a PDU (Process Development Unit) rig. (orig.) (10 refs.)

  15. Possible future HERA analyses

    International Nuclear Information System (INIS)

    Geiser, Achim

    2015-12-01

A variety of possible future analyses of HERA data in the context of the HERA data preservation programme is collected, motivated, and commented upon. The focus is placed on possible future analyses of the existing ep collider data and their physics scope. Comparisons to the original scope of the HERA programme are made, and cross references to topics also covered by other participants of the workshop are given. This includes topics on QCD, proton structure, diffraction, jets, hadronic final states, heavy flavours, electroweak physics, and the application of related theory and phenomenology topics like NNLO QCD calculations, low-x related models, nonperturbative QCD aspects, and electroweak radiative corrections. Synergies with other collider programmes are also addressed. In summary, the range of physics topics which can still be uniquely covered using the existing data is very broad and of considerable physics interest, often matching the interest of results from colliders currently in operation. Due to well-established data and MC sets, calibrations, and analysis procedures, the manpower and expertise needed for a particular analysis is often very much smaller than that needed for an ongoing experiment. Since centrally funded manpower to carry out such analyses is not available any longer, this contribution not only targets experienced self-funded experimentalists, but also theorists and master-level students who might wish to carry out such an analysis.

  16. Some Tools for Robustifying Econometric Analyses

    NARCIS (Netherlands)

    V. Hoornweg (Victor)

    2013-01-01

We use automated algorithms to update and evaluate ad hoc judgments that are made in applied econometrics. Such an application of automated algorithms robustifies empirical econometric analyses, it achieves lower and more consistent prediction errors, and it helps to

  17. AMS analyses at ANSTO

    International Nuclear Information System (INIS)

    Lawson, E.M.

    1998-01-01

The major use of ANTARES is Accelerator Mass Spectrometry (AMS), with 14C being the most commonly analysed radioisotope - presently about 35% of the available beam time on ANTARES is used for 14C measurements. The accelerator measurements are supported by, and dependent on, a strong sample preparation section. The ANTARES AMS facility supports a wide range of investigations into fields such as global climate change, ice cores, oceanography, dendrochronology, anthropology, and classical and Australian archaeology. Described here are some examples of the ways in which AMS has been applied to support research into the archaeology, prehistory and culture of this continent's indigenous Aboriginal peoples. (author)

  18. Uncertainty Analyses and Strategy

    International Nuclear Information System (INIS)

    Kevin Coppersmith

    2001-01-01

The DOE identified a variety of uncertainties, arising from different sources, during its assessment of the performance of a potential geologic repository at the Yucca Mountain site. In general, the number and detail of process models developed for the Yucca Mountain site, and the complex coupling among those models, make the direct incorporation of all uncertainties difficult. The DOE has addressed these issues in a number of ways using an approach to uncertainties that is focused on producing a defensible evaluation of the performance of a potential repository. The treatment of uncertainties oriented toward defensible assessments has led to analyses and models with so-called "conservative" assumptions and parameter bounds, where conservative implies lower performance than might be demonstrated with a more realistic representation. The varying maturity of the analyses and models, and uneven level of data availability, result in total system level analyses with a mix of realistic and conservative estimates (for both probabilistic representations and single values). That is, some inputs have realistically represented uncertainties, and others are conservatively estimated or bounded. However, this approach is consistent with the "reasonable assurance" approach to compliance demonstration, which was called for in the U.S. Nuclear Regulatory Commission's (NRC) proposed 10 CFR Part 63 regulation (64 FR 8640 [DIRS 101680]). A risk analysis that includes conservatism in the inputs will result in conservative risk estimates. Therefore, the approach taken for the Total System Performance Assessment for the Site Recommendation (TSPA-SR) provides a reasonable representation of processes and conservatism for purposes of site recommendation. However, mixing unknown degrees of conservatism in models and parameter representations reduces the transparency of the analysis and makes the development of coherent and consistent probability statements about projected repository

  19. Micromechanical Analyses of Sturzstroms

    Science.gov (United States)

    Imre, Bernd; Laue, Jan; Springman, Sarah M.

    2010-05-01

Sturzstroms are very fast landslides of very large initial volume. As type features they display extreme runout paired with intensive fragmentation of the involved blocks of rock within a collisional flow. The inherent danger to the growing communities in alpine valleys below future potential sites of sturzstroms must be examined, and the resulting predictions of endangered zones can inform the planning processes in these areas. This calls for the ability to make Type A predictions, according to Lambe (1973), which are made before an event. But Type A predictions are only possible if sufficient understanding of the mechanisms involved in a process is available. The motivation of the doctoral thesis research project presented is therefore to reveal the mechanics of sturzstroms in more detail in order to contribute to the development of a Type A runout prediction model. It is obvious that a sturzstrom represents a highly dynamic collisional granular regime. Thus particles not only collide but will eventually crush each other. Erismann and Abele (2001) describe this process as dynamic disintegration, where kinetic energy is the main driver for fragmenting the rock mass. In this case an approach combining the type features long runout and fragmentation within a single hypothesis is represented by the dynamic fragmentation-spreading model (Davies and McSaveney, 2009; McSaveney and Davies, 2009). Unfortunately, sturzstroms, and fragmentation within sturzstroms, cannot be observed directly in a real event because of their long recurrence time and the obvious difficulties in placing measuring devices within such a rock flow. Therefore, rigorous modelling is required, in particular of the transition from static to dynamic behaviour, to achieve better knowledge of the mechanics of sturzstroms and to provide empirical evidence to confirm the dynamic fragmentation-spreading model.
Within this study, fragmentation and its effects on the mobility of sturzstroms

  20. EEG analyses with SOBI.

    Energy Technology Data Exchange (ETDEWEB)

    Glickman, Matthew R.; Tang, Akaysha (University of New Mexico, Albuquerque, NM)

    2009-02-01

    The motivating vision behind Sandia's MENTOR/PAL LDRD project has been that of systems which use real-time psychophysiological data to support and enhance human performance, both individually and of groups. Relevant and significant psychophysiological data being a necessary prerequisite to such systems, this LDRD has focused on identifying and refining such signals. The project has focused in particular on EEG (electroencephalogram) data as a promising candidate signal because it (potentially) provides a broad window on brain activity with relatively low cost and logistical constraints. We report here on two analyses performed on EEG data collected in this project using the SOBI (Second Order Blind Identification) algorithm to identify two independent sources of brain activity: one in the frontal lobe and one in the occipital. The first study looks at directional influences between the two components, while the second study looks at inferring gender based upon the frontal component.
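SOBI separates sources by jointly diagonalizing several time-lagged covariance matrices of whitened sensor data. As a rough illustration (not the project's code), the sketch below applies the single-lag special case, often called AMUSE, to a synthetic two-channel mixture; the sources, mixing matrix, noise level, and lag are all made-up assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two synthetic sources with different temporal structure
n = 5000
t = np.arange(n)
s1 = np.sin(2 * np.pi * t / 50)                  # slow oscillation
s2 = np.sign(np.sin(2 * np.pi * t / 13))         # faster square wave
S = np.vstack([s1, s2])

A = np.array([[1.0, 0.6], [0.4, 1.0]])           # unknown mixing matrix
X = A @ S + 0.01 * rng.standard_normal((2, n))   # observed "sensor" data

# 1) Whiten the observations
X = X - X.mean(axis=1, keepdims=True)
C0 = (X @ X.T) / n
d, E = np.linalg.eigh(C0)
W = E @ np.diag(d ** -0.5) @ E.T                 # whitening matrix
Z = W @ X

# 2) Diagonalize one time-lagged covariance of the whitened data (AMUSE)
tau = 2
C_tau = (Z[:, :-tau] @ Z[:, tau:].T) / (n - tau)
C_tau = (C_tau + C_tau.T) / 2                    # symmetrize
_, V = np.linalg.eigh(C_tau)
Y = V.T @ Z                                      # recovered sources

# Each recovered component should correlate strongly with one true source
corr = np.corrcoef(np.vstack([Y, S]))[:2, 2:]
print(np.round(np.abs(corr), 2))
```

Full SOBI diagonalizes many lags jointly, which makes the separation robust when two sources happen to share a similar autocovariance at any single lag.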

  1. A simple beam analyser

    International Nuclear Information System (INIS)

    Lemarchand, G.

    1977-01-01

(ee'p) experiments measure the missing energy distribution as well as the momentum distribution of the extracted proton in the nucleus versus the missing energy. Such experiments are presently conducted on SACLAY's A.L.S. 300 Linac. Electrons and protons are respectively analysed by two spectrometers and detected in their focal planes. Counting rates are usually low and include time coincidences and accidentals. The signal-to-noise ratio depends on the physics of the experiment and the resolution of the coincidence; it is therefore essential to obtain a beam current distribution that is as flat as possible. New technologies have made it possible to monitor the behaviour of the beam pulse in real time and to determine when the duty cycle can be considered good against a numerical criterion.

  2. Analyse af elbilers forbrug

    DEFF Research Database (Denmark)

    Andersen, Ove; Krogh, Benjamin Bjerre; Torp, Kristian

    2014-01-01

This report examines the GPS and CAN bus data collected while driving electric cars and analyses the cars' energy consumption. The analyses are based on roughly 133 million GPS and CAN bus measurements collected from 164 electric cars (Citroen C-Zero, Mitsubishi iMiev and Peugeot Ion) during the calendar year 2012.... Regarding the data, it can be concluded that substantial but simple improvements are needed to make it easier to use GPS/CAN bus data from electric cars in other analyses in the future. The use of electric cars is compared with that of conventional fuel cars, and the conclusion is that electric cars generally drive 10-15 km/h slower on...

  3. Analyse de "La banlieue"

    Directory of Open Access Journals (Sweden)

    Nelly Morais

    2006-11-01

Full Text Available 1. Preamble - Conditions under which this analysis was carried out. A group of first-year Master's students of FLE (French as a Foreign Language) at Université Paris 3 (that is, students of language didactics preparing to teach FLE) examined the product during a module on ICT (Information and Communication Technologies) and language didactics. A discussion then took place on the forum of a distance-learning platform, starting from a few questions posed by the teacher...

  4. Fusion plasma experiments on TFTR: A 20 year retrospective*

    Energy Technology Data Exchange (ETDEWEB)

    Hawryluk, R. J.; Batha, S.; Blanchard, W.; Beer, M.; Bell, M. G.; Bell, R. E.; Berk, H.; Bernabei, S.; Bitter, M.; Breizman, B.; Bretz, N. L; Budny, R.; Bush, C. E.; Callen, J.; Camp, R.; Cauffman, S.; Chang, Z.; Cheng, C. Z.; Darrow, D. S.; Dendy, R. O.; Dorland, W.; Duong, H.; Efthimion, P. C.; Ernst, D.; Fisch, N. J.; Fisher, R.; Fonck, R. J.; Fredrickson, E. D.; Fu, G. Y.; Furth, H. P.; Gorelenkov, N. N.; Grek, B.; Grisham, L. R.; Hammett, G. W.; Hanson, G. R.; Herrmann, H. W.; Herrmann, M. C.; Hill, K. W.; Hogan, J.; Hosea, J. C.; Houlberg, W. A.; Hughes, M.; Hulse, R. A.; Jassby, D. L.; Jobes, F. C.; Johnson, D. W.; Kaita, R.; Kaye, S.; Kim, J. S.; Kissick, M.; Krasilnikov, A. V.; Kugel, H.; Kumar, A.; Leblanc, B.; Levinton, F. M.; Ludescher, C.; Majeski, R. P.; Manickam, J.; Mansfield, D. K.; Mazzucato, E.; McChesney, J.; McCune, D. C.; McGuire, K. M.; Meade, D. M.; Medley, S. S.; Mika, R.; Mikkelsen, D. R.; Mirnov, S. V.; Mueller, D.; Nagy, A.; Navratil, G. A.; Nazikian, R.; Okabayashi, M.; Park, H. K.; Park, W.; Paul, S. F.; Pearson, G.; Petrov, M. P.; Phillips, C. K.; Phillips, M.; Ramsey, A. T.; Redi, M. H.; Rewoldt, G.; Reznik, S.; Roquemore, A. L.; Rogers, J.; Ruskov, E.; Sabbagh, S. A.; Sasao, M.; Schilling, G.; Schivell, J.; Schmidt, G. L.; Scott, S. D.; Semenov, I.; Skinner, C. H.; Stevenson, T.; Stratton, B. C.; Strachan, J. D.; Stodiek, W.; Synakowski, E.; Takahashi, H.; Tang, W.; Taylor, G.; Thompson, M. E.; Von Goeler, S.; Von Halle, A.; Walters, R. T.; White, R.; Wieland, R. M.; Williams, M.; Wilson, J. R.; Wong, K. L.; Wurden, G. A.; Yamada, M.; Yavorski, V.; Young, K. M.; Zakharov, L.; Zarnstorff, M. C.; Zweben, S. J. [Princeton Plasma Physics Lab. (PPPL), Princeton, NJ (United States)

    1998-01-01

    The Tokamak Fusion Test Reactor (TFTR) (R. J. Hawryluk, to be published in Rev. Mod. Phys.) experiments on high-temperature plasmas, that culminated in the study of deuterium–tritium D–T plasmas containing significant populations of energetic alpha particles, spanned over two decades from conception to completion. During the design of TFTR, the key physics issues were magnetohydrodynamic (MHD) equilibrium and stability, plasma energy transport, impurity effects, and plasma reactivity. Energetic particle physics was given less attention during this phase because, in part, of the necessity to address the issues that would create the conditions for the study of energetic particles and also the lack of diagnostics to study the energetic particles in detail. The worldwide tokamak program including the contributions from TFTR made substantial progress during the past two decades in addressing the fundamental issues affecting the performance of high-temperature plasmas and the behavior of energetic particles. The progress has been the result of the construction of new facilities, which enabled the production of high-temperature well-confined plasmas, development of sophisticated diagnostic techniques to study both the background plasma and the resulting energetic fusion products, and computational techniques to both interpret the experimental results and to predict the outcome of experiments. © 1998 American Institute of Physics.

  5. Seismic fragility analyses

    International Nuclear Information System (INIS)

    Kostov, Marin

    2000-01-01

In the last two decades an increasing number of probabilistic seismic risk assessments have been performed. The basic ideas of the procedure for performing a Probabilistic Safety Analysis (PSA) of critical structures (NUREG/CR-2300, 1983) could be used also for normal industrial and residential buildings, dams or other structures. The general formulation of the risk assessment procedure applied in this investigation is presented in Franzini, et al., 1984. The probability of failure of a structure for an expected lifetime (for example 50 years) can be obtained from the annual frequency of failure, β_E, determined by the relation: β_E = ∫ |dβ(x)/dx| P(f|x) dx. Here β(x) is the annual frequency of exceedance of load level x (for example, the variable x may be peak ground acceleration), and P(f|x) is the conditional probability of structure failure at a given seismic load level x. The problem leads to the assessment of the seismic hazard β(x) and the fragility P(f|x). The seismic hazard curves are obtained by probabilistic seismic hazard analysis. The fragility curves are obtained after the response of the structure is defined probabilistically and its capacity and the associated uncertainties are assessed. Finally the fragility curves are combined with the seismic loading to estimate the frequency of failure for each critical scenario. The frequency of failure due to a seismic event is represented by the scenario with the highest frequency. The tools usually applied for probabilistic safety analyses of critical structures could relatively easily be adapted to ordinary structures. The key problems are the seismic hazard definitions and the fragility analyses. The fragility could be derived either based on scaling procedures or on the base of generation. Both approaches have been presented in the paper. After the seismic risk (in terms of failure probability) is assessed there are several approaches for risk reduction. Generally the methods could be classified in two groups. The
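The annual failure frequency obtained by combining a hazard curve with a fragility curve can be evaluated numerically. The sketch below uses an entirely hypothetical power-law hazard and lognormal fragility (all parameter values invented for illustration, not taken from the paper):

```python
import math

def hazard(x):
    """Annual frequency of exceeding PGA x [g] (hypothetical power law)."""
    return 1e-3 * (x / 0.1) ** -2.5

def fragility(x, median=0.6, beta=0.5):
    """P(failure | PGA = x): lognormal CDF with median capacity 0.6 g."""
    return 0.5 * (1 + math.erf(math.log(x / median) / (beta * math.sqrt(2))))

def annual_failure_frequency(x_lo=0.05, x_hi=3.0, n=2000):
    """Integrate |d beta(x)/dx| * P(f|x) dx over the load range."""
    dx = (x_hi - x_lo) / n
    total = 0.0
    for i in range(n):
        x0, x1 = x_lo + i * dx, x_lo + (i + 1) * dx
        slope = (hazard(x0) - hazard(x1)) / dx     # -d beta/dx >= 0
        total += slope * fragility((x0 + x1) / 2) * dx
    return total

print(annual_failure_frequency())   # a small annual frequency, order 1e-5
```

Intuitively, the integral weights the hazard curve's slope (how often each load level occurs) by the chance that that load breaks the structure.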

  6. Severe Accident Recriticality Analyses (SARA)

    Energy Technology Data Exchange (ETDEWEB)

Frid, W. [Swedish Nuclear Power Inspectorate, Stockholm (Sweden); Hoejerup, F. [Risoe National Lab. (Denmark); Lindholm, I.; Miettinen, J.; Puska, E.K. [VTT Energy, Helsinki (Finland); Nilsson, Lars [Studsvik Eco and Safety AB, Nykoeping (Sweden); Sjoevall, H. [Teollisuuden Voima Oy (Finland)

    1999-11-01

Recriticality in a BWR has been studied for a total loss of electric power accident scenario. In a BWR, the B4C control rods would melt and relocate from the core before the fuel during core uncovery and heat-up. If electric power returns during this time-window, unborated water from ECCS systems will start to reflood the partly control-rod-free core. Recriticality might take place, for which the only mitigating mechanisms are the Doppler effect and void formation. In order to assess the impact of recriticality on reactor safety, including accident management measures, the following issues have been investigated in the SARA project: 1. the energy deposition in the fuel during super-prompt power burst, 2. the quasi steady-state reactor power following the initial power burst and 3. containment response to elevated quasi steady-state reactor power. The approach was to use three computer codes and to further develop and adapt them for the task. The codes were SIMULATE-3K, APROS and RECRIT. Recriticality analyses were carried out for a number of selected reflooding transients for the Oskarshamn 3 plant in Sweden with SIMULATE-3K and for the Olkiluoto 1 plant in Finland with all three codes. The core state initial and boundary conditions prior to recriticality have been studied with the severe accident codes SCDAP/RELAP5, MELCOR and MAAP4. The results of the analyses show that all three codes predict recriticality - both superprompt power bursts and quasi steady-state power generation - for the studied range of parameters, i.e. with core uncovery and heat-up to maximum core temperatures around 1800 K and water flow rates of 45 kg/s to 2000 kg/s injected into the downcomer. Since the recriticality takes place in a small fraction of the core, the power densities are high, which results in large energy deposition in the fuel during power burst in some accident scenarios.
The highest value, 418 cal/g, was obtained with SIMULATE-3K for an Oskarshamn 3 case with reflooding

  7. Website-analyse

    DEFF Research Database (Denmark)

    Thorlacius, Lisbeth

    2009-01-01

or dead ends when visiting the site. Studies in the design and analysis of the visual and aesthetic aspects of planning and using websites have, however, only to a limited extent been subject to reflective treatment. That is the background for this chapter, which opens with a review of aesthetics... The website is increasingly the preferred medium for information seeking, corporate presentation, e-commerce, entertainment, education and social contact. In step with this growing diversity of communication activities on the net, more focus has been placed on optimising the design and... planning of the functional and content-related aspects of websites. There is a large body of theory and method books specialising in the technical issues of interaction and navigation, as well as the linguistic content of websites. The Danish HCI (Human Computer Interaction...

  8. A channel profile analyser

    International Nuclear Information System (INIS)

    Gobbur, S.G.

    1983-01-01

It is well understood that, due to the wide-band noise present in a nuclear analog-to-digital converter, events at the boundaries of adjacent channels are shared. It is a difficult and laborious process to find out exactly the shape of the channels at the boundaries. A simple scheme has been developed for the direct display of the channel shape of any type of ADC on a cathode ray oscilloscope. This has been accomplished by sequentially incrementing the reference voltage of a precision pulse generator by a fraction of a channel and storing the ADC data in alternate memory locations of a multichannel pulse height analyser. Alternate channels are needed because of the sharing at the boundaries of channels. In the flat region of the profile, alternate memory locations are channels with zero counts and channels with full-scale counts. At the boundaries, all memory locations have counts; their shape is a direct display of the channel boundaries. (orig.)
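The boundary-sharing effect described here is easy to reproduce in simulation. The sketch below (all parameters hypothetical) sweeps a pulse amplitude across one channel of an idealised noisy ADC in sub-channel steps and counts how many events land in that channel, producing a flat-topped profile with smeared edges:

```python
import random

random.seed(1)

def adc(v, noise_rms=0.08):
    """Ideal 1-unit-wide channels plus wide-band input noise."""
    return int(v + random.gauss(0.0, noise_rms))

def channel_profile(channel=50, steps_per_channel=20, pulses=2000):
    """Counts landing in `channel` as the pulser sweeps across it."""
    profile = []
    for i in range(-steps_per_channel, 2 * steps_per_channel + 1):
        v = channel + i / steps_per_channel      # sweep amplitude 49.0 .. 52.0
        hits = sum(adc(v) == channel for _ in range(pulses))
        profile.append(hits)
    return profile

p = channel_profile()
# Flat top in mid-channel; smeared edges where neighbouring channels
# share events, which is exactly the displayed channel-boundary shape
print(p[len(p) // 2], p[20], p[0])
```

The counts near the channel edges fall off over a width set by the noise RMS, so the recorded profile is a direct picture of the boundary sharing.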

  9. Comparative analyses of genetic risk prediction methods reveal ...

    Indian Academy of Sciences (India)

    2015-03-12

Abstract. Nonalcoholic fatty liver disease (NAFLD) is a distinct pathologic condition characterized by a disease spectrum ranging from simple steatosis to steato-hepatitis, cirrhosis and hepatocellular carcinoma. Prevalence of NAFLD varies in different ethnic groups, ranging from 12% in Chinese to 45% in ...

  10. Comparison of analyses to predict ruminal fibre degradability and ...

    African Journals Online (AJOL)

    The objective of this study was to compare the ruminal degradability of neutral detergent fibre (NDF) and indigestible NDF (INDF) between silages (n = 24) that originated from three different temperate grass species, i.e. Dactylis glomerata L., Festuca arundinacea L. and hybrid, Felina – Lolium multiflorum L. × Festuca ...

  11. Analysis of K-net and Kik-net data: implications for ground motion prediction - acceleration time histories, response spectra and nonlinear site response; Analyse des donnees accelerometriques de K-net et Kik-net: implications pour la prediction du mouvement sismique - accelerogrammes et spectres de reponse - et la prise en compte des effets de site non-lineaire

    Energy Technology Data Exchange (ETDEWEB)

    Pousse, G

    2005-10-15

This thesis intends to characterize ground motion during earthquakes. This work is based on two Japanese networks. It deals with databases of shallow events (depth less than 25 km) with magnitudes between 4.0 and 7.3. The analysis of K-net data allows a spectral ground-motion prediction equation to be computed and the shape of the Eurocode 8 design spectra to be reviewed. We show the larger amplification at short periods for the Japanese data and bring to light the soil amplification that takes place at long periods. In addition, we develop a new empirical model for simulating synthetic stochastic nonstationary acceleration time histories. By specifying magnitude, distance and site effect, this model can produce the many time histories that a seismic event is liable to produce at the place of interest. Furthermore, the study of near-field borehole records from Kik-net allows us to explore the validity domain of predictive equations and to explain what occurs when ground motion predictions are extrapolated. Finally, we show that nonlinearity reduces the dispersion of ground motion at the surface. (author)
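A ground-motion prediction equation of the kind derived from K-net data is, at its core, a regression of log spectral amplitude on magnitude, distance and site terms. The sketch below fits a deliberately simplified, hypothetical functional form to synthetic records (coefficients, noise level and ranges are invented, not values from this thesis):

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic "records": magnitude 4-7.3, distance 5-200 km, soft-soil flag
n = 400
M = rng.uniform(4.0, 7.3, n)
R = rng.uniform(5.0, 200.0, n)
soil = rng.integers(0, 2, n).astype(float)

# Hypothetical true model: ln PGA = a + b*M + c*ln(R) + d*soil + noise
a_t, b_t, c_t, d_t = -3.0, 1.1, -1.3, 0.4
ln_pga = (a_t + b_t * M + c_t * np.log(R) + d_t * soil
          + 0.5 * rng.standard_normal(n))

# Recover the GMPE coefficients by ordinary least squares
X = np.column_stack([np.ones(n), M, np.log(R), soil])
coef, *_ = np.linalg.lstsq(X, ln_pga, rcond=None)
print(np.round(coef, 2))   # should roughly recover (-3.0, 1.1, -1.3, 0.4)
```

Real GMPE derivations add magnitude saturation, anelastic attenuation and mixed-effects terms, but the regression skeleton is the same.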

  12. 17 CFR 240.13a-20 - Plain English presentation of specified information.

    Science.gov (United States)

    2010-04-01

    ... 17 Commodity and Securities Exchanges 3 2010-04-01 2010-04-01 false Plain English presentation of specified information. 240.13a-20 Section 240.13a-20 Commodity and Securities Exchanges SECURITIES AND... Regulations Under the Securities Exchange Act of 1934 Other Reports § 240.13a-20 Plain English presentation of...

  13. Semen analysis and prediction of natural conception

    NARCIS (Netherlands)

    Leushuis, Esther; van der Steeg, Jan Willem; Steures, Pieternel; Repping, Sjoerd; Bossuyt, Patrick M. M.; Mol, Ben Willem J.; Hompes, Peter G. A.; van der Veen, Fulco

    2014-01-01

    Do two semen analyses predict natural conception better than a single semen analysis and will adding the results of repeated semen analyses to a prediction model for natural pregnancy improve predictions? A second semen analysis does not add helpful information for predicting natural conception

  14. ENSO Prediction and Predictability

    Science.gov (United States)

    Cane, M. A.

    2016-12-01

    In 1986 there was one dynamical forecasting model for ENSO and a small handful of statistical and hybrid schemes. Now there are more than 40 models in the IRI-CPC forecast plume, many of them coupled GCMs. Why hasn't forecasting improved more in 30 years? In a 1982 landmark paper, Rasmussen and Carpenter created the canonical El Niño, transforming the inchoate view of the time. Now much research focuses on ENSO diversity. Is our understanding deeper as a result? Has this work properly incorporated the ENSO life cycle of the canonical event? Some eras are more predictable than others. Why? Is it random, or is there a systematic difference in background state? Do we have any idea how predictable ENSO is? Why is the ENSO band 2-7 years? What will become of ENSO in the next century?

  15. WALS Prediction

    NARCIS (Netherlands)

    Magnus, J.R.; Wang, W.; Zhang, Xinyu

    2012-01-01

    Abstract: Prediction under model uncertainty is an important and difficult issue. Traditional prediction methods (such as pretesting) are based on model selection followed by prediction in the selected model, but the reported prediction and the reported prediction variance ignore the uncertainty

  16. A20 prevents chronic liver inflammation and cancer by protecting hepatocytes from death

    Science.gov (United States)

    Catrysse, L; Farhang Ghahremani, M; Vereecke, L; Youssef, S A; Mc Guire, C; Sze, M; Weber, A; Heikenwalder, M; de Bruin, A; Beyaert, R; van Loo, G

    2016-01-01

    An important regulator of inflammatory signalling is the ubiquitin-editing protein A20 that acts as a brake on nuclear factor-κB (NF-κB) activation, but also exerts important cytoprotective functions. A20 knockout mice are cachectic and die prematurely due to excessive multi-organ inflammation. To establish the importance of A20 in liver homeostasis and pathology, we developed a novel mouse line lacking A20 specifically in liver parenchymal cells. These mice spontaneously develop chronic liver inflammation but no fibrosis or hepatocellular carcinomas, illustrating an important role for A20 in normal liver tissue homeostasis. Hepatocyte-specific A20 knockout mice show sustained NF-κB-dependent gene expression in the liver upon tumor necrosis factor (TNF) or lipopolysaccharide injection, as well as hepatocyte apoptosis and lethality upon challenge with sublethal doses of TNF, demonstrating an essential role for A20 in the protection of mice against acute liver failure. Finally, chronic liver inflammation and enhanced hepatocyte apoptosis in hepatocyte-specific A20 knockout mice were associated with increased susceptibility to chemically or high fat-diet-induced hepatocellular carcinoma development. Together, these studies establish A20 as a crucial hepatoprotective factor. PMID:27253414

  17. A20 restricts wnt signaling in intestinal epithelial cells and suppresses colon carcinogenesis.

    Directory of Open Access Journals (Sweden)

    Ling Shao

    Full Text Available Colon carcinogenesis consists of a multistep process during which a series of genetic and epigenetic adaptations occur that lead to malignant transformation. Here, we have studied the role of A20 (also known as TNFAIP3), a ubiquitin-editing enzyme that restricts NF-κB and cell death signaling, in intestinal homeostasis and tumorigenesis. We have found that A20 expression is consistently reduced in human colonic adenomas compared with normal colonic tissues. To further investigate A20's potential roles in regulating colon carcinogenesis, we generated mice lacking A20 specifically in intestinal epithelial cells and interbred these with mice harboring a mutation in the adenomatous polyposis coli gene (APC(min)). While A20(FL/FL) villin-Cre mice exhibit uninflamed intestines without polyps, A20(FL/FL) villin-Cre APC(min/+) mice develop far more numerous and larger colonic polyps than control APC(min) mice. We find that A20 binds to the β-catenin destruction complex and restricts canonical wnt signaling by supporting ubiquitination and degradation of β-catenin in intestinal epithelial cells. Moreover, acute deletion of A20 from intestinal epithelial cells in vivo leads to enhanced expression of the β-catenin-dependent genes cyclinD1 and c-myc, known promoters of colon cancer. Taken together, these findings demonstrate new roles for A20 in restricting β-catenin signaling and preventing colon tumorigenesis.

  18. Predictive Testing

    Science.gov (United States)

    Predictive genetic testing searches for genetic changes, or ...

  19. Incidence of Hepatitis C Infection among Prisoners by Routine Laboratory Values during a 20-Year Period

    Science.gov (United States)

    Marco, Andrés; Gallego, Carlos; Caylà, Joan A.

    2014-01-01

    Background To estimate the incidence of Hepatitis C virus (HCV) and the predictive factors through repeated routine laboratory analyses. Methods An observational cohort study was carried out in Quatre Camins Prison, Barcelona. The study included subjects with an initial negative HCV result and routine laboratory analyses containing HCV serology from 1992 to 2011. The incidence of infection was calculated for the study population and for sub-groups by 100 person-years of follow-up (100 py). The predictive factors were determined through Kaplan-Meier curves and a Cox regression. Hazard ratios (HR) and 95% confidence intervals (CI) were calculated. Results A total of 2,377 prisoners were included with a median follow-up time of 1,540.9 days per patient. Among the total population, 117 HCV seroconversions were detected (incidence of 1.17/100 py). The incidence was higher between 1992 and 1995 (2.57/100 py), among cases with HIV co-infection (8.34/100 py) and among intravenous drug users (IDU) without methadone treatment (MT) during follow-up (6.66/100 py). The incidence rate of HCV seroconversion among cases with a history of IDU and current MT was 1.35/100 py, which is close to that of the total study population. The following variables had a positive predictive value for HCV infection: IDU (p<0.001; HR = 7.30; CI: 4.83–11.04), Spanish ethnicity (p = 0.009; HR = 2.03; CI: 1.93–3.44) and HIV infection (p = 0.015; HR = 1.97; CI: 1.14–3.39). Conclusion The incidence of HCV infection among prisoners was higher during the first part of the study and among IDU during the entire study period. Preventative programs should be directed toward this sub-group of the prison population. PMID:24587394
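
    The incidence figures above are simple rate computations: events divided by accumulated follow-up, scaled to 100 person-years (117 seroconversions at 1.17/100 py implies roughly 10,000 person-years in total). A minimal sketch, with the person-years total inferred for illustration rather than taken from the paper:

```python
def incidence_per_100py(events: int, person_years: float) -> float:
    """Incidence rate expressed per 100 person-years of follow-up."""
    return 100.0 * events / person_years

# 117 seroconversions over an inferred ~10,000 person-years
rate = incidence_per_100py(117, 10000.0)  # 1.17 per 100 py, matching the abstract
```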

  20. Multiple Imputation for Network Analyses

    NARCIS (Netherlands)

    Krause, Robert; Huisman, Mark; Steglich, Christian; Snijders, Thomas

    2016-01-01

    Missing data on network ties is a fundamental problem for network analyses. The biases induced by missing edge data, even when missing completely at random (MCAR), are widely acknowledged and problematic for network analyses (Kossinets, 2006; Huisman & Steglich, 2008; Huisman, 2009). Although

  1. Analyses of cavitation instabilities in ductile metals

    DEFF Research Database (Denmark)

    Tvergaard, Viggo

    2007-01-01

    Cavitation instabilities have been predicted for a single void in a ductile metal stressed under high triaxiality conditions. In experiments for a ceramic reinforced by metal particles, a single dominant void has been observed on the fracture surface of some of the metal particles bridging a crack, and tests for a thin ductile metal layer bonding two ceramic blocks have also indicated rapid void growth. Analyses for these material configurations are discussed here. When the void radius is very small, a nonlocal plasticity model is needed to account for observed size-effects, and recent analyses of the influence of such size-effects on cavitation instabilities are presented. When a metal contains a distribution of micro voids, and the void spacing compared to void size is not extremely large, the surrounding voids may affect the occurrence of a cavitation instability at one of the voids. This has been...

  2. Analyses and characterization of double shell tank

    International Nuclear Information System (INIS)

    1994-01-01

    Evaporator candidate feed from tank 241-AP-108 (108-AP) was sampled under prescribed protocol. Physical, inorganic, and radiochemical analyses were performed on tank 108-AP. Characterization of evaporator feed tank waste is needed primarily for an evaluation of its suitability to be safely processed through the evaporator. Such analyses should provide sufficient information regarding the waste composition to confidently determine whether constituent concentrations are within not only safe operating limits, but should also be relevant to functional limits for operation of the evaporator. Characterization of tank constituent concentrations should provide data which enable a prediction of where the types and amounts of environmentally hazardous waste are likely to occur in the evaporator product streams

  3. A 20-year analysis of implant treatment in an Australian public dental clinic.

    Science.gov (United States)

    Duong, Anne; Dudley, James

    2018-02-03

    There is limited information regarding the profile and outcomes of implant treatment provided in the Australian public sector. This retrospective cohort study reviewed dental implant treatment completed at the Adelaide Dental Hospital over a 20-year period. The database of implant treatment completed between 1996 and 2015 was analysed for patient, implant, prosthesis and operator specifics together with known implant status. Three hundred and twenty patients (mean age = 51.50 years) were treated with 527 implants. One hundred and eighty-four (57.50%) female patients received 296 (56.17%) implants and 136 (42.50%) males received 231 (43.84%) implants. Three hundred (56.93%) implants were restored with single crowns, 147 (27.89%) implants were restored with 63 mandibular implant overdentures, five (0.95%) implants were restored with two maxillary implant overdentures, and 67 (12.71%) implants were restored with 20 full arch fixed prostheses. The overall known implant survival rate was 87.67%. Mandibular implant overdentures had a risk of implant failure four times that of single implant-retained crowns (odds ratio = 4.0, 95% CI: 1.4, 11.4), a difference that was statistically significant (comparison P-value = 0.0100). Implant treatment completed in this public sector clinic using finite resources and a defined system of patient and restorative selection criteria demonstrated a high known implant survival rate. Utilising a structured and maintained patient recall protocol, it would be ideal to investigate further parameters of interest, particularly those that could improve treatment delivery and longevity. This article is protected by copyright. All rights reserved.
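
    The quoted odds ratio and 95% CI follow the standard 2x2-table computation with a Woolf (log-odds) interval; a minimal sketch with hypothetical failure counts, not the study's raw data:

```python
import math

def odds_ratio_ci(a: int, b: int, c: int, d: int):
    """Odds ratio for a 2x2 table [[a, b], [c, d]] with a Woolf 95% CI."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # standard error of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical counts: failed/surviving implants in two prosthesis groups
or_, lo, hi = odds_ratio_ci(8, 92, 20, 880)
```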

  4. A20 inhibits the motility of HCC cells induced by TNF-α

    Science.gov (United States)

    Xiao, Ying; Li, Na; Guo, Chun; Zhang, Lining; Shi, Yongyu

    2016-01-01

    Metastasis of hepatocellular carcinoma (HCC) can be facilitated by TNF-α, a prototypical inflammatory cytokine in the HCC microenvironment. A20 is a negative regulator of the NF-κB signaling pathway. In the present study we asked whether A20 plays a role in HCC metastasis. Using immunochemical methods on 89 tissue samples from patients with HCC, we found that A20 expression was downregulated in the invasive cells of microvascular invasions (MVI) compared with the noninvasive cells. Overexpression of A20 in HCC cell lines inhibited their motility induced by TNF-α. Furthermore, the overexpression of A20 inhibited epithelial-mesenchymal transition (EMT), FAK activation and RAC1 activity. By contrast, knockdown of A20 in one HCC cell line resulted in the converse. In addition, the overexpression of A20 restrained the formation of MVI in HCC xenografts in nude mice treated with TNF-α. All these results suggest that A20 functions as a negative regulator of the TNF-α-induced motility of HCC cells. PMID:26909601

  5. A20--an omnipotent protein in the liver: prometheus myth resolved?

    Science.gov (United States)

    da Silva, Cleide Gonçalves; Cervantes, Jesus Revuelta; Studer, Peter; Ferran, Christiane

    2014-01-01

    The contribution of the NF-kappaB inhibitory and ubiquitin-editing protein A20 (tnfaip3) to the liver's protective response to injury, particularly to its anti-inflammatory armamentarium, is exemplified by the dramatic phenotype of A20 knockout mice, which die prematurely of unfettered inflammation predominantly in the liver. A number of additional studies originating from our laboratory and others clearly demonstrate that A20 is part of the liver's response to injury and resection. Upregulation of A20 in hepatocytes serves a broad hepatoprotective goal through combined anti-inflammatory, anti-apoptotic, anti-oxidant and pro-regenerative functions. The molecular bases for A20's hepatoprotective functions have been partially resolved and include blockade of NF-kappaB activation in support of its anti-inflammatory function, inhibition of pro-caspase 8 cleavage in support of its anti-apoptotic function, increased Peroxisome Proliferator Activated Receptor alpha (PPARalpha) expression in support of its anti-oxidant function, and decreased Cyclin Dependent Kinase Inhibitor p21 together with boosted IL-6/STAT3 proliferative signals as part of its pro-regenerative function. In experimental animal models, overexpression of A20 in the liver protects from radical acute fulminant toxic hepatitis, lethal hepatectomy, and severe liver ischemia reperfusion injury (IRI), and allows successful engraftment of marginal liver grafts. Conversely, partial loss of A20, as in A20 heterozygote mice, significantly impairs liver regeneration and worsens liver damage, conferring high lethality to an otherwise safe procedure, i.e., 2/3 partial hepatectomy. This is the ultimate proof of the physiologic role of A20 in liver regeneration and repair. In recent work, A20's functions in the liver have expanded to encompass regulation of lipid and glucose metabolism, unlocking a whole new set of metabolic diseases that could be affected by A20.
In this chapter we review all available data regarding A20's physiologic role in the liver, and

  6. Cholera Toxin Suppresses Expression of Ubiquitin Editing Enzyme A20 and Enhances Transcytosis

    Directory of Open Access Journals (Sweden)

    Ming-Yang Li

    2013-03-01

    Full Text Available Background: The ubiquitin editing enzyme A20 plays an important role in maintaining homeostasis in the body. Microbe-derived adjuvants are commonly used in animal models of intestinal allergy. Objective: This study aims to investigate the role of cholera toxin-induced A20 suppression in compromising intestinal barrier function. Methods: Human intestinal epithelial cells were cultured into monolayers as an in vitro epithelial barrier model. Ovalbumin (OVA) was used as a specific allergen to test the capacity of intestinal epithelial cells to degrade endocytosed allergens. The fusion of endosomes and lysosomes in epithelial cells was observed by immunocytochemistry. The antigenicity of OVA was tested by T cell proliferation assay. Results: A20 was detectable in the intestinal cell lines and in mouse intestinal epithelium. A20 was required for the degradation of endocytosed allergens in HT-29 cells. The allergen OVA could pass through an A20-deficient HT-29 monolayer barrier. Exposure to the microbial adjuvant cholera toxin suppressed the expression of A20 in HT-29 cells, which compromised the epithelial barrier function. Conclusion: The microbial product cholera toxin interferes with the expression of A20 in intestinal epithelial cells, which compromises the intestinal epithelial barrier function.

  7. Climate prediction and predictability

    Science.gov (United States)

    Allen, Myles

    2010-05-01

    Climate prediction is generally accepted to be one of the grand challenges of the Geophysical Sciences. What is less widely acknowledged is that fundamental issues have yet to be resolved concerning the nature of the challenge, even after decades of research in this area. How do we verify or falsify a probabilistic forecast of a singular event such as anthropogenic warming over the 21st century? How do we determine the information content of a climate forecast? What does it mean for a modelling system to be "good enough" to forecast a particular variable? How will we know when models and forecasting systems are "good enough" to provide detailed forecasts of weather at specific locations or, for example, of the risks associated with global geo-engineering schemes? This talk will provide an overview of these questions in the light of recent developments in multi-decade climate forecasting, drawing on concepts from information theory, machine learning and statistics. I will draw extensively but not exclusively from the experience of the climateprediction.net project, running multiple versions of climate models on personal computers.

  8. [Prediction in cerebrovascular diseases].

    Science.gov (United States)

    Hamann, G F

    2014-10-01

    Prediction of the outcome of cerebrovascular diseases or of the effects and complications of various forms of treatment are essential components of all stroke treatment regimens. This review focuses on the prediction of the stroke risk in primary prevention, the prediction of the risk of secondary stroke following a transient ischemic attack (TIA), the estimation of the outcome following manifest stroke and the treatment effects, the prediction of secondary cerebrovascular events and the prediction of vascular cognitive impairment following stroke. All predictive activities in cerebrovascular disease are hindered by the translation of predictive results from studies and patient populations to the individual patient. Future efforts in genetic analyses may be able to overcome this barrier and to enable individual prediction in the area of so-called personalized medicine. In all the various fields of prediction in cerebrovascular diseases, three major variables are always important: age of the patient, severity and subtype of the stroke. Increasing age, more severe stroke symptoms and the cardioembolic stroke subtype predict a poor outcome regarding both survival and permanent disability. This finding is somewhat banal and will therefore never replace the well experienced clinician judging the chances of a patient and taking into account the personal situation of this patient, e.g. for initiation of a rehabilitation program. Besides the individualized prediction, in times of restricted economic resources and increasing tendency to clarify questions of medical treatment in court, it seems unavoidable to use prediction in economic and medicolegal interaction with clinical medicine. This tendency will be accompanied by difficult ethical problems which neurologists must be aware of. Improved prediction should not be used to allocate or restrict resources or to restrict medically indicated treatment.

  9. A20 modulates lipid metabolism and energy production to promote liver regeneration.

    Directory of Open Access Journals (Sweden)

    Scott M Damrauer

    2011-03-01

    Full Text Available Liver regeneration is clinically of major importance in the setting of liver injury, resection or transplantation. We have demonstrated that the NF-κB inhibitory protein A20 significantly improves recovery of liver function and mass following extended liver resection (LR) in mice. In this study, we explored the Systems Biology modulated by A20 following extended LR in mice. We performed transcriptional profiling using Affymetrix-Mouse 430.2 arrays on liver mRNA retrieved from recombinant adenovirus A20 (rAd.A20) and rAd.βgalactosidase treated livers, before and 24 hours after 78% LR. A20 overexpression impacted 1595 genes that were enriched for biological processes related to inflammatory and immune responses, cellular proliferation, energy production, oxidoreductase activity, and lipid and fatty acid metabolism. These pathways were modulated by A20 in a manner that favored decreased inflammation, heightened proliferation, and optimized metabolic control and energy production. Promoter analysis identified several transcription factors that implemented the effects of A20, including NF-κB, CEBPA, OCT-1, OCT-4 and EGR1. Interactive scale-free network analysis captured the key genes that delivered the specific functions of A20. Most of these genes were affected at basal level and after resection. We validated a number of A20's target genes by real-time PCR, including p21, the mitochondrial solute carriers SLC25a10 and SLC25a13, and the fatty acid metabolism regulator, peroxisome proliferator activated receptor alpha. This resulted in greater energy production in A20-expressing livers following LR, as demonstrated by increased enzymatic activity of cytochrome c oxidase, or mitochondrial complex IV. This Systems Biology-based analysis unravels novel mechanisms supporting the pro-regenerative function of A20 in the liver, by optimizing energy production through improved lipid/fatty acid metabolism, and down-regulated inflammation. These findings

  10. CADDIS Volume 4. Data Analysis: Advanced Analyses - Controlling for Natural Variability: SSD Plot Diagrams

    Science.gov (United States)

    Methods for controlling natural variability: predicting environmental conditions from biological observations, biological trait data, species sensitivity distributions, and propensity scores; with references for the Advanced Analyses portion of Data Analysis.

  11. Data for decay Heat Predictions

    International Nuclear Information System (INIS)

    1987-01-01

    These proceedings of a specialists' meeting on data for decay heat predictions cover fission product yields, delayed neutrons, and comparative evaluations of evaluated and experimental data for thermal and fast fission. Fourteen contributions were analysed.

  12. Análise da estabilidade e previsibilidade da qualidade fisiológica de sementes de soja produzidas em Cristalina, Goiás = Stability and predictability analyses of the physiological quality of soybean seeds produced in Cristalina, Goiás (Brazil)

    Directory of Open Access Journals (Sweden)

    Éder Matsuo

    2008-04-01

    emergence speed and stability analyses were tested through the methods proposed by Lin and Binns (1988) and Annicchiarico (1992). The germination percentage averages, the emergence of plants and the emergence speed index were compared through Tukey's test at 5% probability. In the evaluation of the seeds' physiological quality, genotype 7B1454170 was identified as the best, and genotype 9B1459189 as the worst. The genotypes Emgopa 313, 7B1454170, 11B145341 and DM339 were classified as offering high stability in physiological quality, while genotypes 3B1346193 and 9B1459189 offered low predictability. The estimation methods used were efficient and coherent with each other, and allowed the identification, among the evaluated genotypes, of those that offered greater stability and predictability.
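
    The Lin and Binns (1988) method cited above ranks genotypes by a superiority index: the mean squared distance between a genotype's response and the best response observed in each environment. A minimal sketch under that standard definition, with invented germination values (not the study's data):

```python
def lin_binns_pi(responses, best_per_env):
    """Lin & Binns (1988) superiority index for one genotype:
    sum of squared distances to the best response in each
    environment, divided by twice the number of environments."""
    n = len(responses)
    return sum((x - m) ** 2 for x, m in zip(responses, best_per_env)) / (2 * n)

# Hypothetical germination percentages across three environments
genotype = [90.0, 85.0, 88.0]
best = [92.0, 90.0, 95.0]  # best-performing genotype in each environment
pi = lin_binns_pi(genotype, best)  # lower Pi = closer to the best, more stable
```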

  13. Ecosystem Development after Mangrove Wetland Creation: Plant-Soil Change across a 20-year Chronosequence

    Science.gov (United States)

    Mangrove wetland restoration and creation efforts are increasingly proposed as mechanisms to compensate for mangrove wetland loss. However, ecosystem development and functional equivalence in restored and created mangrove wetlands are poorly understood. We compared a 20-yr chronosequence ...

  14. An extensible analysable system model

    DEFF Research Database (Denmark)

    Probst, Christian W.; Hansen, Rene Rydhof

    2008-01-01

    Analysing real-world systems for vulnerabilities with respect to security and safety threats is a difficult undertaking, not least due to a lack of availability of formalisations for those systems. While both formalisations and analyses can be found for artificial systems such as software, this does not hold for real physical systems. Approaches such as threat modelling try to target the formalisation of the real-world domain, but still are far from the rigid techniques available in security research. Many currently available approaches to assurance of critical infrastructure security are based on (quite successful) ad-hoc techniques. We believe they can be significantly improved beyond the state-of-the-art by pairing them with static analysis techniques. In this paper we present an approach to both formalising those real-world systems and providing an underlying semantics, which...

  15. Sample preparation in foodomic analyses.

    Science.gov (United States)

    Martinović, Tamara; Šrajer Gajdošik, Martina; Josić, Djuro

    2018-04-16

    Representative sampling and adequate sample preparation are key factors for successful performance of further steps in foodomic analyses, as well as for correct data interpretation. Incorrect sampling and improper sample preparation can be sources of severe bias in foodomic analyses. It is well known that both wrong sampling and sample treatment cannot be corrected anymore. These, in the past frequently neglected facts, are now taken into consideration, and the progress in sampling and sample preparation in foodomics is reviewed here. We report the use of highly sophisticated instruments for both high-performance and high-throughput analyses, as well as miniaturization and the use of laboratory robotics in metabolomics, proteomics, peptidomics and genomics. This article is protected by copyright. All rights reserved.

  16. Mitogenomic analyses from ancient DNA

    DEFF Research Database (Denmark)

    Paijmans, Johanna L. A.; Gilbert, Tom; Hofreiter, Michael

    2013-01-01

    The analysis of ancient DNA is playing an increasingly important role in conservation genetic, phylogenetic and population genetic analyses, as it allows extinct species to be incorporated into DNA sequence trees and adds time depth to population genetics studies. For many years, these types of DNA analyses (whether using modern or ancient DNA) were largely restricted to the analysis of short fragments of the mitochondrial genome. However, due to many technological advances during the past decade, a growing number of studies have explored the power of complete mitochondrial genome sequences (mitogenomes). Such studies were initially limited to analyses of extant organisms, but developments in both DNA sequencing technologies and general methodological aspects related to working with degraded DNA have resulted in complete mitogenomes becoming increasingly popular for ancient DNA studies as well...

  17. Analysing and Comparing Encodability Criteria

    Directory of Open Access Journals (Sweden)

    Kirstin Peters

    2015-08-01

    Full Text Available Encodings, or proofs of their absence, are the main way to compare process calculi. To analyse the quality of encodings and to rule out trivial or meaningless encodings, they are augmented with quality criteria. There exist many different criteria, and different variants of those criteria, for reasoning in different settings. This leads to incomparable results. Moreover, it is not always clear whether the criteria used to obtain a result in a particular setting do indeed fit that setting. We show how to formally reason about and compare encodability criteria by mapping them onto requirements on a relation between source and target terms that is induced by the encoding function. In particular, we analyse the common criteria of full abstraction, operational correspondence, divergence reflection, success sensitiveness, and respect of barbs; e.g. we analyse the exact nature of the simulation relation (coupled simulation versus bisimulation) that is induced by different variants of operational correspondence. This way we reduce the problem of analysing or comparing encodability criteria to the better-understood problem of comparing relations on processes.

  18. Descriptive Analyses of Mechanical Systems

    DEFF Research Database (Denmark)

    Andreasen, Mogens Myrup; Hansen, Claus Thorp

    2003-01-01

    Foreword: Product analysis and technology analysis can be carried out with a broad socio-technical scope, with a view to understanding cultural, sociological, design-related, business-related and many other aspects. One sub-area within this is the systemic analysis and description of products and systems. The present compend...

  19. Millifluidic droplet analyser for microbiology

    NARCIS (Netherlands)

    Baraban, L.; Bertholle, F.; Salverda, M.L.M.; Bremond, N.; Panizza, P.; Baudry, J.; Visser, de J.A.G.M.; Bibette, J.

    2011-01-01

    We present a novel millifluidic droplet analyser (MDA) for precisely monitoring the dynamics of microbial populations over multiple generations in numerous (≈10³) aqueous emulsion droplets (100 nL). As a first application, we measure the growth rate of a bacterial strain and determine the minimal
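
    Extracting a growth rate from repeated population counts, as the droplet measurements above require, reduces in the simplest case to fitting exponential growth between two time points; a sketch with invented numbers, not the paper's measurements:

```python
import math

def growth_rate(n0: float, n1: float, dt_hours: float) -> float:
    """Exponential growth rate r (per hour), assuming n1 = n0 * exp(r * dt)."""
    return math.log(n1 / n0) / dt_hours

# Hypothetical counts: population doubles three times in 3 h
r = growth_rate(1e3, 8e3, 3.0)  # equals ln(2) per hour
```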

  20. Imiquimod induces apoptosis of squamous cell carcinoma (SCC) cells via regulation of A20.

    Directory of Open Access Journals (Sweden)

    Kyung-Cheol Sohn

    Full Text Available Imiquimod, a nucleoside analogue of the imidazoquinoline family, is being used to treat various cutaneous cancers including squamous cell carcinoma (SCC). Imiquimod activates anti-tumor immunity via Toll-like receptor 7 (TLR7) in macrophages and other immune cells. Imiquimod can also affect tumor cells directly, regardless of its impact on the immune system. In this study, we demonstrated that imiquimod induced apoptosis of SCC cells (SCC12) and that A20 was involved in this process. When A20 was overexpressed, imiquimod-induced apoptosis was markedly inhibited. Conversely, knockdown of A20 potentiated imiquimod-induced apoptosis. Interestingly, A20 counteracted activation of c-Jun N-terminal kinase (JNK), suggesting that A20-regulated JNK activity is a possible mechanism underlying imiquimod-induced apoptosis of SCC12 cells. Finally, imiquimod-induced apoptosis of SCC12 cells took place in a TLR7-independent manner. Our data provide new insight into the mechanism underlying the effect of imiquimod in cutaneous cancer treatment.

  1. A20 (Tnfaip3) deficiency in myeloid cells protects against influenza A virus infection.

    Directory of Open Access Journals (Sweden)

    Jonathan Maelfait

    Full Text Available The innate immune response provides the first line of defense against viruses and other pathogens by responding to specific microbial molecules. Influenza A virus (IAV) produces double-stranded RNA as an intermediate during the replication life cycle, which activates the intracellular pathogen recognition receptor RIG-I and induces the production of proinflammatory cytokines and antiviral interferon. Understanding the mechanisms that regulate innate immune responses to IAV and other viruses is of key importance to develop novel therapeutic strategies. Here we used myeloid cell specific A20 knockout mice to examine the role of the ubiquitin-editing protein A20 in the response of myeloid cells to IAV infection. A20 deficient macrophages were hyperresponsive to double stranded RNA and IAV infection, as illustrated by enhanced NF-κB and IRF3 activation, concomitant with increased production of proinflammatory cytokines, chemokines and type I interferon. In vivo this was associated with an increased number of alveolar macrophages and neutrophils in the lungs of IAV infected mice. Surprisingly, myeloid cell specific A20 knockout mice are protected against lethal IAV infection. These results challenge the general belief that an excessive host proinflammatory response is associated with IAV-induced lethality, and suggest that under certain conditions inhibition of A20 might be of interest in the management of IAV infections.

  2. Workload analysis of an assembling process

    Science.gov (United States)

    Ghenghea, L. D.

    2015-11-01

    The workload is the most important indicator for managers responsible for industrial technological processes, whether these are automated, mechanized or simply manual; in each case, machines or workers will be the focus of workload measurements. The paper deals with workload analyses of a largely manual assembly technology for a roller-bearing assembly process, carried out in a large company with integrated bearing manufacturing processes. In these analyses the delay (work) sampling technique was used to identify and classify all bearing assemblers' activities, and to determine how much of the 480-minute working day the workers devote to each activity. The study shows some ways to increase process productivity without additional investment, and also indicates that process automation could be the solution for achieving maximum productivity.
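    The delay (work) sampling technique described above apportions the 480-minute working day across activities in proportion to random spot-check tallies. A minimal sketch, with hypothetical observation counts (not data from the study):

```python
# Work (delay) sampling sketch: estimate minutes per activity from random
# spot observations over a 480-minute shift. Counts are hypothetical.
SHIFT_MINUTES = 480

observations = {          # tallies from random spot checks of one assembler
    "assembling": 132,
    "handling parts": 36,
    "inspection": 18,
    "idle/delay": 14,
}

total = sum(observations.values())
minutes = {act: SHIFT_MINUTES * n / total for act, n in observations.items()}

for act, m in sorted(minutes.items(), key=lambda kv: -kv[1]):
    print(f"{act:15s} {m:6.1f} min ({100 * m / SHIFT_MINUTES:4.1f}%)")
```

    The estimated minutes always sum back to the shift length; the precision of each estimate depends only on the total number of spot checks taken.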

  3. Safety analyses of surface facilities

    International Nuclear Information System (INIS)

    Anspach, W.; Baran, A.; Dorst, H.J.; Eifert, B.; Gruen, M.; Behrendt, V.; Berkhan, W.; Dincklage, R.D. v.; Doehler, J.; Bruecher, H.

    1981-01-01

    The investigations were carried out using the example of the Gorleben waste disposal center and the planning documents established for this center. The safety analyses refer to the transport of spent fuel elements, the water-cooled interim storage and the reprocessing stage. Regarding the risk analysis of the technical systems, a methodical development allows the dynamics of incident sequences to be better taken into account. (DG) [de

  4. Technical center for transportation analyses

    International Nuclear Information System (INIS)

    Foley, J.T.

    1978-01-01

    A description is presented of an information search/retrieval/research activity of Sandia Laboratories which provides technical environmental information which may be used in transportation risk analyses, environmental impact statements, development of design and test criteria for packaging of energy materials, and transportation mode research studies. General activities described are: (1) history of center development; (2) environmental information storage/retrieval system; (3) information searches; (4) data needs identification; and (5) field data acquisition system and applications

  5. Methodology of cost benefit analyses

    International Nuclear Information System (INIS)

    Patrik, M.; Babic, P.

    2000-10-01

    The report addresses financial aspects of proposed investments and other steps which are intended to contribute to nuclear safety. The aim is to provide introductory insight into the procedures and potential of cost-benefit analyses as a routine guide when making decisions on costly provisions as one of the tools to assess whether a particular provision is reasonable. The topic is applied to the nuclear power sector. (P.A.)

  6. Response surface use in safety analyses

    International Nuclear Information System (INIS)

    Prosek, A.

    1999-01-01

    When thousands of complex computer code runs related to nuclear safety are needed for statistical analysis, a response surface is used to replace the computer code. The main purpose of the study was to develop and demonstrate a tool called the optimal statistical estimator (OSE), intended for response surface generation for complex and non-linear phenomena. The performance of the optimal statistical estimator was tested against the results of 59 different RELAP5/MOD3.2 code calculations of a small-break loss-of-coolant accident in a two-loop pressurized water reactor. The results showed that the OSE adequately predicted the response surface for the peak cladding temperature. Some favourable characteristics of the OSE, such as monotonic behaviour between two neighbouring points and independence of the number of output parameters, suggest that the OSE can be used for response surface generation of any safety or system parameter in thermal-hydraulic safety analyses. (author)
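    The OSE algorithm itself is not specified in the abstract; as a generic illustration of the response-surface idea it describes (replacing expensive code runs with a cheap surrogate that can be sampled thousands of times), a quadratic fit to a handful of hypothetical peak-cladding-temperature results might look like:

```python
import numpy as np

# Response-surface sketch: replace an expensive code with a cheap surrogate.
# Hypothetical data: peak cladding temperature (K) from a few code runs,
# indexed by a single input parameter (e.g. normalized break size).
x = np.array([0.1, 0.3, 0.5, 0.7, 0.9])
pct = np.array([950.0, 1010.0, 1080.0, 1120.0, 1135.0])

coeffs = np.polyfit(x, pct, deg=2)   # least-squares quadratic fit
surface = np.poly1d(coeffs)

# The surrogate can now be evaluated for free at untried inputs,
# e.g. inside a Monte Carlo uncertainty analysis.
print(round(float(surface(0.4)), 1))
```

    A real tool such as the OSE adds constraints (e.g. monotonicity between neighbouring points) that a plain polynomial fit does not guarantee.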

  7. Accurate renormalization group analyses in neutrino sector

    Energy Technology Data Exchange (ETDEWEB)

    Haba, Naoyuki [Graduate School of Science and Engineering, Shimane University, Matsue 690-8504 (Japan); Kaneta, Kunio [Kavli IPMU (WPI), The University of Tokyo, Kashiwa, Chiba 277-8568 (Japan); Takahashi, Ryo [Graduate School of Science and Engineering, Shimane University, Matsue 690-8504 (Japan); Yamaguchi, Yuya [Department of Physics, Faculty of Science, Hokkaido University, Sapporo 060-0810 (Japan)

    2014-08-15

    We investigate accurate renormalization group analyses in the neutrino sector between the ν-oscillation and seesaw energy scales. We consider the decoupling effects of the top quark and the Higgs boson on the renormalization group equations of the light neutrino mass matrix. Since the decoupling effects arise at the standard model scale and are independent of high-energy physics, our method can in principle be applied to any model beyond the standard model. We find that the decoupling effects of the Higgs boson are negligible, while those of the top quark are not. In particular, the decoupling effects of the top quark affect the neutrino mass eigenvalues, which are important for analyzing predictions such as mass squared differences and neutrinoless double beta decay in an underlying theory existing at a high energy scale.

  8. Comprehensive immunoproteogenomic analyses of malignant pleural mesothelioma.

    Science.gov (United States)

    Lee, Hyun-Sung; Jang, Hee-Jin; Choi, Jong Min; Zhang, Jun; de Rosen, Veronica Lenge; Wheeler, Thomas M; Lee, Ju-Seog; Tu, Thuydung; Jindra, Peter T; Kerman, Ronald H; Jung, Sung Yun; Kheradmand, Farrah; Sugarbaker, David J; Burt, Bryan M

    2018-04-05

    We generated a comprehensive atlas of the immunologic cellular networks within human malignant pleural mesothelioma (MPM) using mass cytometry. Data-driven analyses of these high-resolution single-cell data identified 2 distinct immunologic subtypes of MPM with vastly different cellular composition, activation states, and immunologic function; mass spectrometry demonstrated differential abundance of MHC-I and -II neopeptides directly identified between these subtypes. The clinical relevance of this immunologic subtyping was investigated with a discriminatory molecular signature derived through comparison of the proteomes and transcriptomes of these 2 immunologic MPM subtypes. This molecular signature, representative of a favorable intratumoral cell network, was independently associated with improved survival in MPM and predicted response to immune checkpoint inhibitors in patients with MPM and melanoma. These data additionally suggest a potentially novel mechanism of response to checkpoint blockade: requirement for high measured abundance of neopeptides in the presence of high expression of MHC proteins specific for these neopeptides.

  9. Uncertainty in Operational Atmospheric Analyses and Re-Analyses

    Science.gov (United States)

    Langland, R.; Maue, R. N.

    2016-12-01

    This talk will describe uncertainty in atmospheric analyses of wind and temperature produced by operational forecast models and in re-analysis products. Because the "true" atmospheric state cannot be precisely quantified, there is necessarily error in every atmospheric analysis, and this error can be estimated by computing differences (variance and bias) between analysis products produced at various centers (e.g., ECMWF, NCEP, U.S. Navy, etc.) that use independent data assimilation procedures, somewhat different sets of atmospheric observations, and forecast models with different resolutions, dynamical equations, and physical parameterizations. These estimates of analysis uncertainty provide a useful proxy for actual analysis error. For this study, we use a unique multi-year and multi-model data archive developed at NRL-Monterey. It will be shown that current uncertainty in atmospheric analyses is closely correlated with the geographic distribution of assimilated in-situ atmospheric observations, especially those provided by high-accuracy radiosonde and commercial aircraft observations. The lowest atmospheric analysis uncertainty is found over North America, Europe and Eastern Asia, which have the largest numbers of radiosonde and commercial aircraft observations. Analysis uncertainty is substantially larger (by factors of two to three) over most of the Southern Hemisphere, the North Pacific Ocean, and under-developed nations of Africa and South America, where there are few radiosonde or commercial aircraft data. It appears that in regions where atmospheric analyses depend primarily on satellite radiance observations, analysis uncertainty of both temperature and wind remains relatively high compared to values found over North America and Europe.
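    The approach of estimating analysis error from inter-centre differences can be sketched numerically: difference two analyses of the same field on a common grid, then report the mean (bias) and variance of that difference. A minimal example with synthetic stand-in fields (the offsets and noise levels are assumptions, not values from the talk):

```python
import numpy as np

# Sketch: estimate analysis uncertainty as the bias and variance of the
# difference between two centres' temperature analyses on a common grid.
rng = np.random.default_rng(0)
truth = 280.0 + rng.normal(0.0, 5.0, size=(90, 180))   # hypothetical "state"
center_a = truth + rng.normal(0.2, 0.8, size=truth.shape)   # centre A errors
center_b = truth + rng.normal(-0.1, 1.1, size=truth.shape)  # centre B errors

diff = center_a - center_b
bias = diff.mean()              # systematic offset between the analyses
variance = diff.var(ddof=1)     # spread: proxy for combined analysis error

print(f"bias = {bias:+.3f} K, variance = {variance:.3f} K^2")
```

    Because the true state cancels in the difference, the statistics reflect only the combined analysis errors of the two centres, which is exactly why such differences serve as a proxy for (unobservable) analysis error.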

  10. Para-prosthetic leaks following mitral valve replacement: Case analysis over a 20-year period.

    Science.gov (United States)

    Dziubek, Melvin; Pierrakos, Charalampos; Chebli, Louis; Demanet, Helene; Sanoussi, Ahmed; Wauthy, Pierre

    2017-11-09

    Mitral para-prosthetic leaks are rare but major complications of mitral heart valve replacement. When they must be re-operated, they carry high mortality rates. We propose to review our surgical experience in terms of approach and type of operation carried out. Demographic, preoperative, intraoperative and postoperative characteristics of 34 patients who underwent surgical treatment of a mitral paravalvular leak at the Brugmann University Hospital between 1996 and 2016 were gathered retrospectively. We analysed the data to identify risk factors of postoperative mortality. We then compared the data according to the approach and the type of surgical treatment in order to compare morbidity and mortality. The postoperative mortality rate was 11.7%. The presence of endocarditis and an increase in lactate dehydrogenase were predictive factors of mortality. Cardiac complications and acute kidney failure were significantly more common among the deceased patients. Direct mitral paravalvular leak suturing was more frequently performed for early-presenting, anterior and isolated leaks, whereas a mitral heart valve replacement was most often performed to treat active primary endocarditis. The incidence of complications and the mortality rates were identical regardless of the approach and the type of operation performed. A mitral para-prosthetic leak recurrence was observed in 33% of the cases. Surgical treatment of mitral para-prosthetic leaks is accompanied by a high mortality rate. The operative strategy plays a major role and can influence the morbidity and mortality encountered in these patients.

  11. Providing traceability for neuroimaging analyses.

    Science.gov (United States)

    McClatchey, Richard; Branson, Andrew; Anjum, Ashiq; Bloodsworth, Peter; Habib, Irfan; Munir, Kamran; Shamdasani, Jetendr; Soomro, Kamran

    2013-09-01

    With the increasingly digital nature of biomedical data and as the complexity of analyses in medical research increases, the need for accurate information capture, traceability and accessibility has become crucial to medical researchers in the pursuit of their research goals. Grid- or Cloud-based technologies, often based on so-called Service Oriented Architectures (SOA), are increasingly being seen as viable solutions for managing distributed data and algorithms in the biomedical domain. For neuroscientific analyses, especially those centred on complex image analysis, traceability of processes and datasets is essential, but up to now this has not been captured in a manner that facilitates collaborative study. Few examples exist of deployed medical systems based on Grids that provide the traceability of research data needed to facilitate complex analyses, and none have been evaluated in practice. Over the past decade, we have been working with mammographers, paediatricians and neuroscientists in three generations of projects to provide the data management and provenance services now required for 21st century medical research. This paper outlines the findings of a requirements study and a resulting system architecture for the production of services to support neuroscientific studies of biomarkers for Alzheimer's disease. The paper proposes a software infrastructure and services that provide the foundation for such support. It introduces the use of the CRISTAL software to provide provenance management as one of a number of services delivered on a SOA, deployed to manage neuroimaging projects that have been studying biomarkers for Alzheimer's disease. In the neuGRID and N4U projects a Provenance Service has been delivered that captures and reconstructs the workflow information needed to facilitate researchers in conducting neuroimaging analyses. The software enables neuroscientists to track the evolution of workflows and datasets. It also tracks the outcomes of

  12. Earthquake prediction

    International Nuclear Information System (INIS)

    Ward, P.L.

    1978-01-01

    The state of the art of earthquake prediction is summarized, the possible responses to such prediction are examined, and some needs in the present prediction program and in research related to use of this new technology are reviewed. Three basic aspects of earthquake prediction are discussed: location of the areas where large earthquakes are most likely to occur, observation within these areas of measurable changes (earthquake precursors) and determination of the area and time over which the earthquake will occur, and development of models of the earthquake source in order to interpret the precursors reliably. 6 figures

  13. Comparison of elastic and inelastic analyses

    International Nuclear Information System (INIS)

    Ammerman, D.J.; Heinstein, M.W.; Wellman, G.W.

    1992-01-01

    The use of inelastic analysis methods instead of the traditional elastic analysis methods in the design of radioactive material (RAM) transport packagings leads to a better understanding of the response of the package to mechanical loadings. Thus, a better assessment of the containment, thermal protection, and shielding integrity of the package after a structural accident event can be made. A more accurate prediction of the package response can lead to enhanced safety and also allow for a more efficient use of materials, possibly leading to a package with higher capacity or lower weight. This paper discusses the advantages and disadvantages of using inelastic analysis in the design of RAM shipping packages. The use of inelastic analysis presents several problems to the package designer. When using inelastic analysis, the entire nonlinear response of the material must be known, including the effects of temperature changes and strain rate. Another problem is that there currently is no acceptance criterion for this type of analysis that is approved by regulatory agencies. Inelastic analysis acceptance criteria based on failure stress, failure strain, or plastic energy density could be developed. For both elastic and inelastic analyses it is also important to include other sources of stress in the analyses, such as fabrication stresses, thermal stresses, stresses from bolt preloading, and contact stresses at material interfaces. Offsetting these added difficulties is the improved knowledge of the package behavior. This allows for incorporation of a more uniform margin of safety, which can result in weight savings and a higher level of confidence in the post-accident configuration of the package. In this paper, comparisons between elastic and inelastic analyses are made for a simple ring structure and for a package to transport a large quantity of RAM by rail (rail cask) with lead gamma shielding, to illustrate the differences in the two analysis techniques.

  14. Finite element analyses of wood laminated composite poles

    Science.gov (United States)

    Cheng Piao; Todd F. Shupe; R.C. Tang; Chung Y. Hse

    2005-01-01

    Finite element analyses using ANSYS were conducted on orthotropic, polygonal, wood laminated composite poles subjected to a body force and a concentrated load at the free end. Deflections and stress distributions of small-scale and full-size composite poles were analyzed and compared to the results obtained in an experimental study. The predicted deflection for both...

  15. Methodological challenges in carbohydrate analyses

    Directory of Open Access Journals (Sweden)

    Mary Beth Hall

    2007-07-01

    Full Text Available Carbohydrates can provide up to 80% of the dry matter in animal diets, yet their specific evaluation for research and diet formulation is only now becoming a focus in the animal sciences. Partitioning of dietary carbohydrates for nutritional purposes should reflect differences in digestion and fermentation characteristics and effects on animal performance. Key challenges to designating nutritionally important carbohydrate fractions include classifying the carbohydrates in terms of nutritional characteristics, and selecting analytical methods that describe the desired fraction. The relative lack of information on digestion characteristics of various carbohydrates and their interactions with other fractions in diets means that fractions will not soon be perfectly established. Developing a system of carbohydrate analysis that could be used across animal species could enhance the utility of analyses and the amount of data we can obtain on dietary effects of carbohydrates. Based on quantities present in diets and apparent effects on animal performance, some nutritionally important classes of carbohydrates that may be valuable to measure include sugars, starch, fructans, insoluble fiber, and soluble fiber. Essential to the selection of methods for these fractions is agreement on precisely what carbohydrates should be included in each. Each of these fractions has analyses that could potentially be used to measure them, but most of the available methods have weaknesses that must be evaluated to see if they are fatal and the assay is unusable, or if the assay still may be made workable. Factors we must consider as we seek to analyze carbohydrates to describe diets: Does the assay accurately measure the desired fraction? Is the assay for research, regulatory, or field use (this affects considerations of acceptable costs and throughput)? What are acceptable accuracy and variability of measures? Is the assay robust (this enhances accuracy of values)?
For some carbohydrates, we

  16. Predictive medicine

    NARCIS (Netherlands)

    Boenink, Marianne; ten Have, Henk

    2015-01-01

    In the last part of the twentieth century, predictive medicine has gained currency as an important ideal in biomedical research and health care. Research in the genetic and molecular basis of disease suggested that the insights gained might be used to develop tests that predict the future health

  17. Theorising and Analysing Academic Labour

    Directory of Open Access Journals (Sweden)

    Thomas Allmer

    2018-01-01

    Full Text Available The aim of this article is to contextualise universities historically within capitalism and to analyse academic labour and the deployment of digital media theoretically and critically. It argues that the post-war expansion of the university can be considered as medium and outcome of informational capitalism and as a dialectical development of social achievement and advanced commodification. The article strives to identify the class position of academic workers, introduces the distinction between academic work and labour, discusses the connection between academic, information and cultural work, and suggests a broad definition of university labour. It presents a theoretical model of working conditions that helps to systematically analyse the academic labour process and to provide an overview of working conditions at universities. The paper furthermore argues for the need to consider the development of education technologies as a dialectics of continuity and discontinuity, discusses the changing nature of the forces and relations of production, and the impact on the working conditions of academics in the digital university. Based on Erik Olin Wright’s inclusive approach of social transformation, the article concludes with the need to bring together anarchist, social democratic and revolutionary strategies for establishing a socialist university in a commons-based information society.

  18. Incorporating nonlinearity into mediation analyses.

    Science.gov (United States)

    Knafl, George J; Knafl, Kathleen A; Grey, Margaret; Dixon, Jane; Deatrick, Janet A; Gallo, Agatha M

    2017-03-21

    Mediation is an important issue considered in the behavioral, medical, and social sciences. It addresses situations where the effect of a predictor variable X on an outcome variable Y is explained to some extent by an intervening, mediator variable M. Methods for addressing mediation have been available for some time. While these methods continue to undergo refinement, the relationships underlying mediation are commonly treated as linear in the outcome Y, the predictor X, and the mediator M. These relationships, however, can be nonlinear. Methods are needed for assessing when mediation relationships can be treated as linear and for estimating them when they are nonlinear. Existing adaptive regression methods based on fractional polynomials are extended here to address nonlinearity in mediation relationships, but assuming those relationships are monotonic as would be consistent with theories about directionality of such relationships. Example monotonic mediation analyses are provided assessing linear and monotonic mediation of the effect of family functioning (X) on a child's adaptation (Y) to a chronic condition by the difficulty (M) for the family in managing the child's condition. Example moderated monotonic mediation and simulation analyses are also presented. Adaptive methods provide an effective way to incorporate possibly nonlinear monotonicity into mediation relationships.
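    For the linear case the abstract describes, the classic product-of-coefficients decomposition (indirect effect a·b via M = a·X, direct effect c′ in Y = b·M + c′·X) can be sketched with ordinary least squares; the simulated data and coefficients below are illustrative, not from the paper:

```python
import numpy as np

# Minimal linear mediation sketch: the effect of X on Y is partly carried
# by mediator M. Data are simulated with known coefficients so the
# recovered indirect effect a*b can be checked.
rng = np.random.default_rng(42)
n = 2000
x = rng.normal(size=n)
m = 0.6 * x + rng.normal(scale=0.5, size=n)            # a = 0.6
y = 0.8 * m + 0.3 * x + rng.normal(scale=0.5, size=n)  # b = 0.8, c' = 0.3

def ols(design, target):
    """Least-squares coefficients with an intercept column prepended."""
    X = np.column_stack([np.ones_like(target), *design])
    return np.linalg.lstsq(X, target, rcond=None)[0]

a = ols([x], m)[1]                # regress M on X
b, c_prime = ols([m, x], y)[1:]   # regress Y on M and X
print(f"indirect effect a*b = {a * b:.3f}, direct effect c' = {c_prime:.3f}")
```

    The paper's contribution is precisely that these relationships need not be linear; the adaptive fractional-polynomial approach replaces the straight-line fits above with monotonic nonlinear ones.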

  19. Incorporating nonlinearity into mediation analyses

    Directory of Open Access Journals (Sweden)

    George J. Knafl

    2017-03-01

    Full Text Available Abstract Background Mediation is an important issue considered in the behavioral, medical, and social sciences. It addresses situations where the effect of a predictor variable X on an outcome variable Y is explained to some extent by an intervening, mediator variable M. Methods for addressing mediation have been available for some time. While these methods continue to undergo refinement, the relationships underlying mediation are commonly treated as linear in the outcome Y, the predictor X, and the mediator M. These relationships, however, can be nonlinear. Methods are needed for assessing when mediation relationships can be treated as linear and for estimating them when they are nonlinear. Methods Existing adaptive regression methods based on fractional polynomials are extended here to address nonlinearity in mediation relationships, but assuming those relationships are monotonic as would be consistent with theories about directionality of such relationships. Results Example monotonic mediation analyses are provided assessing linear and monotonic mediation of the effect of family functioning (X) on a child's adaptation (Y) to a chronic condition by the difficulty (M) for the family in managing the child's condition. Example moderated monotonic mediation and simulation analyses are also presented. Conclusions Adaptive methods provide an effective way to incorporate possibly nonlinear monotonicity into mediation relationships.

  20. Venous thromboembolism and subsequent hospitalisation due to acute arterial cardiovascular events: a 20-year cohort study

    DEFF Research Database (Denmark)

    Sørensen, Henrik Toft; Horvath-Puho, Erzsebet; Pedersen, Lars

    2007-01-01

    to investigate the risk of arterial cardiovascular events in patients who were diagnosed with venous thromboembolism. METHODS: We undertook a 20-year population-based cohort study using data from nationwide Danish medical databases. After excluding those with known cardiovascular disease, we assessed the risk...

  1. Fracturing and brittleness index analyses of shales

    Science.gov (United States)

    Barnhoorn, Auke; Primarini, Mutia; Houben, Maartje

    2016-04-01

    The formation of a fracture network in rocks has a crucial control on the flow behaviour of fluids. In addition, an existing network of fractures influences the propagation of new fractures during e.g. hydraulic fracturing or during a seismic event. Understanding the type and characteristics of the fracture network that will be formed during e.g. hydraulic fracturing is thus crucial to better predict the outcome of a hydraulic fracturing job. For this, knowledge of the rock properties is crucial. The brittleness index is often used as a rock property to predict the fracturing behaviour of a rock, e.g. for hydraulic fracturing of shales. Various definitions of the brittleness index (BI1, BI2 and BI3) exist, based on elastic constants, stress-strain behaviour and mineralogy (Jin et al., 2014; Jarvie et al., 2007; Holt et al., 2011). A maximum brittleness index of 1 predicts very good and efficient fracturing behaviour, while a minimum brittleness index of 0 predicts much more ductile shale behaviour. Here, we have performed systematic petrophysical, acoustic and geomechanical analyses on a set of shale samples from Whitby (UK) and determined the three brittleness indices for each sample. We show that the three indices are very different for the same sample, and as such it can be concluded that the brittleness index is not a good predictor of the fracturing behaviour of shales. The brittleness index based on the acoustic data (BI1) lies around values of 0.5, the brittleness index based on the stress-strain data (BI2) gives an average around 0.75, whereas the mineralogy-based brittleness index (BI3) predicts values below 0.2. This shows that different estimates of the brittleness index can lead to different decisions for hydraulic fracturing. If we were to rely on the mineralogy (BI3), the Whitby mudstone is not a suitable
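    Of the three definitions, the mineralogy-based index has a simple closed form; one common variant (after Jarvie et al., 2007) treats quartz as the brittle component relative to quartz, carbonates and clays. A sketch with a hypothetical clay-rich composition, chosen to illustrate why clay-rich shales score low on this index (the fractions are not measured values from the Whitby samples):

```python
# Mineralogy-based brittleness index (one common form, after Jarvie et al., 2007):
# BI3 = quartz / (quartz + carbonates + clays), from weight fractions.
def brittleness_index(quartz, carbonates, clays):
    total = quartz + carbonates + clays
    if total <= 0:
        raise ValueError("mineral fractions must sum to a positive value")
    return quartz / total

# Hypothetical clay-rich mudstone composition.
bi3 = brittleness_index(quartz=0.12, carbonates=0.10, clays=0.78)
print(f"BI3 = {bi3:.2f}")   # low value -> ductile behaviour expected
```

    The abstract's point is that this single number can disagree strongly with the acoustic (BI1) and stress-strain (BI2) indices for the same sample.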

  2. TNF-α-induced protein 3 (TNFAIP3)/A20 acts as a master switch in TNF-α blockade-driven IL-17A expression.

    Science.gov (United States)

    Urbano, Paulo C M; Aguirre-Gamboa, Raúl; Ashikov, Angel; van Heeswijk, Bennie; Krippner-Heidenreich, Anja; Tijssen, Henk; Li, Yang; Azevedo, Valderilio F; Smits, Lisa J T; Hoentjen, Frank; Joosten, Irma; Koenen, Hans J P M

    2017-12-14

    Anti-TNF inhibitors successfully improve the quality of life of patients with inflammatory disease. Unfortunately, not all patients respond to anti-TNF therapy, and some patients show paradoxical immune side effects, which are poorly understood. Surprisingly, anti-TNF agents were shown to promote IL-17A production with as yet unknown clinical implications. We sought to investigate the molecular mechanism underlying anti-TNF-driven IL-17A expression and the clinical implications of this phenomenon. Fluorescence-activated cell sorting, RNA sequencing, quantitative real-time PCR, Western blotting, small interfering RNA interference, and kinase inhibitors were used to study the molecular mechanisms in isolated human CD4+ T cells from healthy donors. The clinical implication was studied in blood samples of patients with inflammatory bowel disease (IBD) receiving anti-TNF therapy. Here we show that anti-TNF treatment results in inhibition of the anti-inflammatory molecule TNF-α-induced protein 3 (TNFAIP3)/A20 in memory CD4+ T cells. We found an inverse relationship between TNFAIP3/A20 expression levels and IL-17A production. Inhibition of TNFAIP3/A20 promotes kinase activity of p38 mitogen-activated protein kinase and protein kinase C, which drives IL-17A expression. Regulation of TNFAIP3/A20 expression and cognate IL-17A production in T cells are specifically mediated through TNF receptor 2 signaling. Ex vivo, in patients with IBD treated with anti-TNF, we found further evidence for an inverse relationship between TNFAIP3/A20 expression levels and IL-17A-producing T cells. Anti-TNF treatment interferes in the TNFAIP3/A20-mediated anti-inflammatory feedback loop in CD4+ T cells and promotes kinase activity. This puts TNFAIP3/A20, combined with IL-17A expression, on the map as a potential tool for predicting therapy responsiveness or side effects of anti-TNF therapy. 
Moreover, it provides novel targets related to TNFAIP3/A20 activity for superior therapeutic regimens

  3. Hydrogen Analyses in the EPR

    International Nuclear Information System (INIS)

    Worapittayaporn, S.; Eyink, J.; Movahed, M.

    2008-01-01

    In severe accidents with core melting large amounts of hydrogen may be released into the containment. The EPR provides a combustible gas control system to prevent hydrogen combustion modes with the potential to challenge the containment integrity due to excessive pressure and temperature loads. This paper outlines the approach for the verification of the effectiveness and efficiency of this system. Specifically, the justification is a multi-step approach. It involves the deployment of integral codes, lumped parameter containment codes and CFD codes and the use of the sigma criterion, which provides the link to the broad experimental data base for flame acceleration (FA) and deflagration to detonation transition (DDT). The procedure is illustrated with an example. The performed analyses show that hydrogen combustion at any time does not lead to pressure or temperature loads that threaten the containment integrity of the EPR. (authors)

  4. Analysing performance through value creation

    Directory of Open Access Journals (Sweden)

    Adrian TRIFAN

    2015-12-01

    Full Text Available This paper draws a parallel between measuring financial performance in two variants: the first using data offered by accounting, which lays emphasis on maximizing profit, and the second, which aims to create value. The traditional approach to performance is based on indicators derived from accounting data: ROI, ROE, EPS. Traditional management, based on analysing accounting data, has shown its limits, and a new approach is needed, based on creating value. The evaluation of value-based performance tries to avoid the errors due to accounting data by using other specific indicators: EVA, MVA, TSR, CVA. The main objective is shifted from maximizing income to maximizing the value created for shareholders. The theoretical part is accompanied by a practical analysis regarding the creation of value and an analysis of the main indicators which evaluate this concept.
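    Of the value-based indicators listed, EVA has the simplest closed form: residual income after charging for the capital employed, EVA = NOPAT − WACC × invested capital. A minimal sketch with hypothetical figures (not from the paper's practical analysis):

```python
# Economic Value Added sketch: EVA = NOPAT - WACC * invested capital.
# All figures are hypothetical, for illustration only.
def eva(nopat, wacc, invested_capital):
    """Residual income after charging the cost of capital employed."""
    return nopat - wacc * invested_capital

value_created = eva(nopat=1_200_000, wacc=0.09, invested_capital=10_000_000)
print(f"EVA = {value_created:,.0f}")  # positive -> value created for shareholders
```

    A positive EVA means the business earned more than its cost of capital, which is exactly the shift in objective the paper describes: from maximizing accounting profit to maximizing value created.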

  5. HGCal Simulation Analyses for CMS

    CERN Document Server

    Bruno, Sarah Marie

    2015-01-01

    This summer, I approached the topic of fast-timing detection of photons from Higgs decays via simulation analyses, working under the supervision of Dr. Adolf Bornheim of the California Institute of Technology. My specific project focused on simulating the high granularity calorimeter for the Compact Muon Solenoid (CMS) experiment. CMS detects particles using calorimeters. The Electromagnetic Calorimeter (ECal) is arranged cylindrically to form a barrel section and two “endcaps.” Previously, both the barrel and endcap have employed lead tungstate crystal detectors, known as the “shashlik” design. The crystal detectors, however, rapidly degrade from exposure to radiation. This effect is most pronounced in the endcaps. To avoid the high expense of frequently replacing degraded detectors, it was recently decided to eliminate the endcap crystals in favor of an arrangement of silicon detectors known as the “High Granularity Calorimeter” (HGCal), while leaving the barrel detector technology unchanged. T...

  6. Severe accident recriticality analyses (SARA)

    DEFF Research Database (Denmark)

    Frid, W.; Højerup, C.F.; Lindholm, I.

    2001-01-01

    Recriticality in a BWR during reflooding of an overheated partly degraded core, i.e. with relocated control rods, has been studied for a total loss of electric power accident scenario. In order to assess the impact of recriticality on reactor safety, including accident management strategies......, the following issues have been investigated in the SARA project: (1) the energy deposition in the fuel during super-prompt power burst; (2) the quasi steady-state reactor power following the initial power burst; and (3) containment response to elevated quasi steady-state reactor power. The approach was to use......, which results in large energy deposition in the fuel during power burst in some accident scenarios. The highest value, 418 cal g(-1), was obtained with SIMULATE-3K for an Oskarshamn 3 case with reflooding rate of 2000 kg s(-1). In most cases, however, the predicted energy deposition was smaller, below...

  7. Isotopic signatures by bulk analyses

    International Nuclear Information System (INIS)

    Efurd, D.W.; Rokop, D.J.

    1997-01-01

    Los Alamos National Laboratory has developed a series of measurement techniques for identification of nuclear signatures by analyzing bulk samples. Two specific applications for isotopic fingerprinting to identify the origin of anthropogenic radioactivity in bulk samples are presented. The first example is the analyses of environmental samples collected in the US Arctic to determine the impact of dumping of radionuclides in this polar region. Analyses of sediment and biota samples indicate that for the areas sampled the anthropogenic radionuclide content of sediments was predominantly the result of the deposition of global fallout. The anthropogenic radionuclide concentrations in fish, birds and mammals were very low. It can be surmised that marine food chains are presently not significantly affected. The second example is isotopic fingerprinting of water and sediment samples from the Rocky Flats Facility (RFP). The largest source of anthropogenic radioactivity presently affecting surface-waters at RFP is the sediments that are currently residing in the holding ponds. One gram of sediment from a holding pond contains approximately 50 times more plutonium than 1 liter of water from the pond. Essentially 100% of the uranium in Ponds A-1 and A-2 originated as depleted uranium. The largest source of radioactivity in the terminal Ponds A-4, B-5 and C-2 was naturally occurring uranium and its decay product radium. The uranium concentrations in the waters collected from the terminal ponds contained 0.05% or less of the interim standard calculated derived concentration guide for uranium in waters available to the public. All of the radioactivity observed in soil, sediment and water samples collected at RFP was naturally occurring, the result of processes at RFP or the result of global fallout. No extraneous anthropogenic alpha, beta or gamma activities were detected. The plutonium concentrations in Pond C-2 appear to vary seasonally

  8. Proteins analysed as virtual knots

    Science.gov (United States)

    Alexander, Keith; Taylor, Alexander J.; Dennis, Mark R.

    2017-02-01

    Long, flexible physical filaments are naturally tangled and knotted, from macroscopic string down to long-chain molecules. The existence of knotting in a filament naturally affects its configuration and properties, and may be very stable or disappear rapidly under manipulation and interaction. Knotting has been previously identified in protein backbone chains, for which these mechanical constraints are of fundamental importance to their molecular functionality, despite their being open curves in which the knots are not mathematically well defined; knotting can only be identified by closing the termini of the chain somehow. We introduce a new method for resolving knotting in open curves using virtual knots, which are a wider class of topological objects that do not require a classical closure and so naturally capture the topological ambiguity inherent in open curves. We describe the results of analysing proteins in the Protein Data Bank by this new scheme, recovering and extending previous knotting results, and identifying topological interest in some new cases. The statistics of virtual knots in protein chains are compared with those of open random walks and Hamiltonian subchains on cubic lattices, identifying a regime of open curves in which the virtual knotting description is likely to be important.

  9. Digital image analyser for autoradiography

    International Nuclear Information System (INIS)

    Muth, R.A.; Plotnick, J.

    1985-01-01

    The most critical parameter in quantitative autoradiography for assay of tissue concentrations of tracers is the ability to obtain precise and accurate measurements of optical density of the images. Existing high precision systems for image analysis, rotating drum densitometers, are expensive, suffer from mechanical problems and are slow. More moderately priced and reliable video camera based systems are available, but their outputs generally do not have the uniformity and stability necessary for high resolution quantitative autoradiography. The authors have designed and constructed an image analyser optimized for quantitative single and multiple tracer autoradiography which the authors refer to as a memory-mapped charge-coupled device scanner (MM-CCD). The input is from a linear array of CCDs which is used to optically scan the autoradiograph. Images are digitized into 512 x 512 picture elements with 256 gray levels and the data is stored in buffer video memory in less than two seconds. Images can then be transferred to RAM memory by direct memory-mapping for further processing. Arterial blood curve data and optical density-calibrated standards data can be entered and the optical density images can be converted automatically to tracer concentration or functional images. In double tracer studies, images produced from both exposures can be stored and processed in RAM to yield ''pure'' individual tracer concentration or functional images. Any processed image can be transmitted back to the buffer memory to be viewed on a monitor and processed for region of interest analysis.
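The conversion of optical-density images to concentration images via calibrated standards can be sketched roughly as follows (the linear calibration curve, the standards, and the toy image are all assumptions for illustration, not the MM-CCD system's actual procedure):

```python
import numpy as np

# Hypothetical calibrated standards: optical density vs. known tracer
# concentration (arbitrary units). A linear fit stands in for whatever
# calibration curve the actual system used.
od_standards = np.array([0.10, 0.25, 0.50, 0.80])
conc_standards = np.array([5.0, 12.5, 25.0, 40.0])
slope, intercept = np.polyfit(od_standards, conc_standards, 1)

# A toy 2x2 "optical density image" in place of the real 512 x 512,
# 256-gray-level scan.
od_image = np.array([[0.1, 0.2],
                     [0.4, 0.8]])
conc_image = slope * od_image + intercept   # pixel-wise OD -> concentration
print(conc_image)
```

Region-of-interest analysis then reduces to indexing and averaging over `conc_image`.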

  10. Quality assurance for Chinese herbal formulae: standardization of IBS-20, a 20-herb preparation

    OpenAIRE

    Ip, Siu-Po; Zhao, Ming; Xian, Yanfang; Chen, Mengli; Zong, Yuying; Tjong, Yung-Wui; Tsai, Sam-Hip; Sung, Joseph JY; Bensoussan, Alan; Berman, Brian; Fong, Harry HS; Che, Chun-Tao

    2010-01-01

    Abstract Background The employment of well characterized test samples prepared from authenticated, high quality medicinal plant materials is key to reproducible herbal research. The present study aims to demonstrate a quality assurance program covering the acquisition, botanical validation, chemical standardization and good manufacturing practices (GMP) production of IBS-20, a 20-herb Chinese herbal formula under study as a potential agent for the treatment of irritable bowel syndrome. Method...

  11. NKT sublineage specification and survival requires the ubiquitin-modifying enzyme TNFAIP3/A20.

    Science.gov (United States)

    Drennan, Michael B; Govindarajan, Srinath; Verheugen, Eveline; Coquet, Jonathan M; Staal, Jens; McGuire, Conor; Taghon, Tom; Leclercq, Georges; Beyaert, Rudi; van Loo, Geert; Lambrecht, Bart N; Elewaut, Dirk

    2016-09-19

    Natural killer T (NKT) cells are innate lymphocytes that differentiate into NKT1, NKT2, and NKT17 sublineages during development. However, the signaling events that control NKT sublineage specification and differentiation remain poorly understood. Here, we demonstrate that the ubiquitin-modifying enzyme TNFAIP3/A20, an upstream regulator of T cell receptor (TCR) signaling in T cells, is an essential cell-intrinsic regulator of NKT differentiation. A20 is differentially expressed during NKT cell development, regulates NKT cell maturation, and specifically controls the differentiation and survival of NKT1 and NKT2, but not NKT17, sublineages. Remaining A20-deficient NKT1 and NKT2 thymocytes are hyperactivated in vivo and secrete elevated levels of Th1 and Th2 cytokines after TCR ligation in vitro. Defective NKT development was restored by compound deficiency of MALT1, a key downstream component of TCR signaling in T cells. These findings therefore show that negative regulation of TCR signaling during NKT development controls the differentiation and survival of NKT1 and NKT2 cells. © 2016 Drennan et al.

  12. Ergonomic analyses of downhill skiing.

    Science.gov (United States)

    Clarys, J P; Publie, J; Zinzen, E

    1994-06-01

    The purpose of this study was to provide electromyographic feedback for (1) pedagogical advice in motor learning, (2) the ergonomics of materials choice and (3) competition. For these purposes: (1) EMG data were collected for the Stem Christie, the Stem Turn and the Parallel Christie (three basic ski initiation drills) and verified for the complexity of patterns; (2) integrated EMG (iEMG) and linear envelopes (LEs) were analysed from standardized positions, motions and slopes using compact, soft and competition skis; (3) in a simulated 'parallel special slalom', the muscular activity pattern and intensity of excavated and flat snow conditions were compared. The EMG data from the three studies were collected on location in the French Alps (Tignes). The analog raw EMG was recorded on the slopes with a portable seven-channel FM recorder (TEAC MR30) and with pre-amplified bipolar surface electrodes supplied with a precision instrumentation amplifier (AD 524, Analog Devices, Norwood, USA). The raw signal was full-wave rectified and enveloped using a moving average principle. This linear envelope was normalized according to the highest peak amplitude procedure per subject and was integrated in order to obtain a reference of muscular intensity. In the three studies and for all subjects (elite skiers: n = 25 in studies 1 and 2, n = 6 in study 3), we found a high level of co-contractions in the lower limb extensors and flexors, especially during the extension phase of the ski movement. The Stem Christie and the Parallel Christie showed higher levels of rhythmic movement (92 and 84%, respectively).(ABSTRACT TRUNCATED AT 250 WORDS)

  13. Analysing Interplanetary Probe Guidance Accuracy

    Directory of Open Access Journals (Sweden)

    S. V. Sukhova

    2016-01-01

    Full Text Available The paper presents a guidance accuracy analysis and estimates the delta-v budget required for the trajectory correction maneuvers of direct interplanetary flights (without midcourse gravity assists). The analysis takes into consideration the orbital hyperbolic injection errors (which depend on the selected launch vehicle and ascent trajectory) and the uncertainties of midcourse correction maneuvers. The calculation algorithm is based on Monte Carlo simulation and Danby’s matrix method (the matrizant of keplerian motion). Danby’s method establishes a link between the errors of the spacecraft state vectors at different flight times using the reference keplerian orbit matrizant. Utilizing the nominal trajectory parameters and the covariance matrix of launch vehicle injection errors, random perturbed orbits are generated and the required velocity corrections are calculated. The next step is to simulate midcourse maneuver performance uncertainty using the midcourse maneuver covariance matrix. The obtained trajectory correction impulses and spacecraft position errors are statistically processed to compute the required delta-v budget and dispersion ellipse parameters for different prediction intervals. As an example, a guidance accuracy analysis has been conducted for a 2022 mission to Mars and a Venus mission in 2026. The paper considers one and two midcourse correction options, as well as utilization of two different launch vehicles. The presented algorithm based on Monte Carlo simulation and Danby’s method provides a preliminary evaluation of the midcourse correction delta-v budget and spacecraft position error. The only data required for this guidance accuracy analysis are a reference keplerian trajectory and a covariance matrix of the injection errors. Danby’s matrix method also allows other factors affecting the trajectory to be taken into account, thereby increasing the accuracy of the analysis.
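The Monte Carlo core of such an analysis can be sketched as follows, with a constant matrix standing in for Danby's matrizant and all covariances and gains chosen purely for illustration (none of the numbers are mission values):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative 3x3 position-error covariance at injection (km^2) and a toy
# state-transition ("matrizant") block mapping injection errors to arrival errors.
P_inj = np.diag([4.0, 4.0, 1.0])
Phi = np.array([[1.2, 0.1, 0.0],
                [0.0, 1.1, 0.2],
                [0.0, 0.0, 1.3]])

N = 10_000
inj_errors = rng.multivariate_normal(np.zeros(3), P_inj, size=N)  # sample injection errors
arrival_errors = inj_errors @ Phi.T                               # propagate linearly

# A toy correction law: delta-v proportional to the predicted miss
# (the gain of 0.01 (km/s)/km is an assumption, not a flight value).
dv = 0.01 * np.linalg.norm(arrival_errors, axis=1)

print(f"99th-percentile delta-v budget: {np.percentile(dv, 99):.3f} km/s")
print("arrival dispersion covariance:\n", np.cov(arrival_errors.T))
```

Statistically processing `dv` and `arrival_errors` over many samples is what yields the delta-v budget and dispersion ellipse parameters described in the abstract.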

  14. Automatic incrementalization of Prolog based static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Kahl, Matthias; Saha, Diptikalyan

    2007-01-01

    Modern development environments integrate various static analyses into the build process. Analyses that analyze the whole project whenever the project changes are impractical in this context. We present an approach to automatic incrementalization of analyses that are specified as tabled logic...... incrementalizing a broad range of static analyses....

  15. Detonation Performance Analyses for Recent Energetic Molecules

    Science.gov (United States)

    Stiel, Leonard; Samuels, Philip; Spangler, Kimberly; Iwaniuk, Daniel; Cornell, Rodger; Baker, Ernest

    2017-06-01

    Detonation performance analyses were conducted for a number of evolving and potential high explosive materials. The calculations were completed for theoretical maximum densities of the explosives using the Jaguar thermo-chemical equation of state computer programs for performance evaluations and JWL/JWLB equations of state parameterizations. A number of recently synthesized materials were investigated for performance characterizations and comparisons to existing explosives, including TNT, RDX, HMX, and CL-20. The analytic cylinder model was utilized to establish cylinder and Gurney velocities as functions of the radial expansions of the cylinder for each explosive. The densities and heats of formation utilized in the calculations are primarily experimental values from Picatinny Arsenal and other sources. Several of the new materials considered were predicted to have enhanced detonation characteristics compared to conventional explosives. In order to confirm the accuracy of the Jaguar and analytic cylinder model results, available experimental detonation and Gurney velocities for representative energetic molecules and their formulations were compared with the corresponding calculated values. Close agreement was obtained with most of the data. Presently at NATO.
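The analytic cylinder model builds on Gurney's energy balance. As a rough illustration, the standard cylindrical Gurney formula can be evaluated directly (the sqrt(2E) value below is a commonly quoted figure for HMX-class explosives, an assumption here rather than an output of the Jaguar analyses):

```python
import math

def gurney_velocity_cylinder(sqrt_2E: float, m_over_c: float) -> float:
    """Metal velocity (km/s) for a cylindrical charge:
    V = sqrt(2E) / sqrt(M/C + 1/2), with M/C the metal-to-charge mass ratio."""
    return sqrt_2E / math.sqrt(m_over_c + 0.5)

# sqrt(2E) ~ 2.97 km/s is a commonly quoted Gurney constant for HMX
# (illustrative; real analyses use values fitted to cylinder test data).
for m_over_c in (0.5, 1.0, 2.0):
    print(m_over_c, round(gurney_velocity_cylinder(2.97, m_over_c), 2))
```

Heavier confinement (larger M/C) yields lower wall velocity, which is why the model reports Gurney velocity as a function of the expansion and loading geometry.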

  16. Social Media Analyses for Social Measurement

    Science.gov (United States)

    Schober, Michael F.; Pasek, Josh; Guggenheim, Lauren; Lampe, Cliff; Conrad, Frederick G.

    2016-01-01

    Demonstrations that analyses of social media content can align with measurement from sample surveys have raised the question of whether survey research can be supplemented or even replaced with less costly and burdensome data mining of already-existing or “found” social media content. But just how trustworthy such measurement can be—say, to replace official statistics—is unknown. Survey researchers and data scientists approach key questions from starting assumptions and analytic traditions that differ on, for example, the need for representative samples drawn from frames that fully cover the population. New conversations between these scholarly communities are needed to understand the potential points of alignment and non-alignment. Across these approaches, there are major differences in (a) how participants (survey respondents and social media posters) understand the activity they are engaged in; (b) the nature of the data produced by survey responses and social media posts, and the inferences that are legitimate given the data; and (c) practical and ethical considerations surrounding the use of the data. Estimates are likely to align to differing degrees depending on the research topic and the populations under consideration, the particular features of the surveys and social media sites involved, and the analytic techniques for extracting opinions and experiences from social media. Traditional population coverage may not be required for social media content to effectively predict social phenomena to the extent that social media content distills or summarizes broader conversations that are also measured by surveys. PMID:27257310

  17. Thermal and hydraulic analyses of the System 81 cold traps

    Energy Technology Data Exchange (ETDEWEB)

    Kim, K.

    1977-06-15

    Thermal and hydraulic analyses of the System 81 Type I and II cold traps were completed except for the thermal transient analysis. Results are evaluated, discussed, and reported. Analytical models were developed to determine the physical dimensions of the cold traps and to predict the performance. The FFTF cold trap crystallizer performances were simulated using the thermal model. This simulation shows that the analytical model developed predicts reasonably conservative temperatures. Pressure drop and sodium residence time calculations indicate that the present design will meet the requirements specified in the E-Specification. Steady state temperature data for the critical regions were generated to assess the magnitude of the thermal stress.

  18. Prediction Markets

    DEFF Research Database (Denmark)

    Horn, Christian Franz; Ivens, Bjørn Sven; Ohneberg, Michael

    2014-01-01

    In recent years, Prediction Markets gained growing interest as a forecasting tool among researchers as well as practitioners, which resulted in an increasing number of publications. In order to track the latest development of research, comprising the extent and focus of research, this article...... works, articles of theoretical nature, application-oriented studies and articles dealing with the topic of law and policy. The analysis of the research results reveals that more than half of the literature pool deals with the application and actual function tests of Prediction Markets. The results...

  19. Efficient ALL vs. ALL collision risk analyses

    Science.gov (United States)

    Escobar, D.; Paskowitz, M.; Agueda, A.; Garcia, G.; Molina, M.

    2011-09-01

    In recent years, space debris has gained a lot of attention due to the increasing amount of uncontrolled man-made objects orbiting the Earth. This population poses a significant and constantly growing threat to operational satellites. In order to face this threat in an independent manner, ESA has launched an initiative for the development of a European SSA System where GMV is participating via several activities. Apart from those activities financed by ESA, GMV has developed closeap, a tool for efficient conjunction assessment and collision probability prediction. ESA's NAPEOS has been selected as computational engine and numerical propagator to be used in the tool, which can be considered as an add-on to the standard NAPEOS package. closeap makes use of the same orbit computation, conjunction assessment and collision risk algorithms implemented in CRASS, but at the same time both systems are completely independent. Moreover, the implementation in closeap has been validated against CRASS with excellent results. This paper describes the performance improvements implemented in closeap at algorithm level to ensure that the most time demanding scenarios (e.g., all catalogued objects are analysed against each other - all vs. all scenarios -) can be analysed in a reasonable amount of time with commercial-off-the-shelf hardware. However, the amount of space debris increases steadily due to human activities. Thus, the number of objects involved in a full collision assessment is expected to increase notably and, consequently, the computational cost, which scales as the square of the number of objects, will increase as well. Additionally, orbit propagation algorithms that are computationally expensive might be needed to predict more accurately the trajectories of the space debris. In order to cope with such computational needs, the next natural step in the development of collision assessment tools is the use of parallelization techniques.
In this paper we investigate
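The quadratic growth of an all-vs-all screening, and the cheap pre-filters used to tame it, can be sketched as follows (the apogee/perigee band filter shown is a standard pre-filter in conjunction assessment generally, not necessarily the one implemented in closeap):

```python
import itertools

# Toy catalogue: (object id, perigee altitude km, apogee altitude km).
# Values are illustrative, not real catalogue entries.
catalogue = [("A", 700, 900), ("B", 850, 1100), ("C", 300, 400), ("D", 950, 1200)]

def altitude_bands_overlap(o1, o2, margin_km=50.0):
    """Apogee/perigee pre-filter: two objects whose altitude bands (padded by
    a safety margin) do not overlap can never come close, so the expensive
    close-approach analysis is skipped for that pair."""
    _, per1, apo1 = o1
    _, per2, apo2 = o2
    return per1 - margin_km <= apo2 and per2 - margin_km <= apo1

# All-vs-all is O(n^2) pairs; the filter prunes most of them cheaply before
# any numerical propagation is done.
candidates = [(o1[0], o2[0])
              for o1, o2 in itertools.combinations(catalogue, 2)
              if altitude_bands_overlap(o1, o2)]
print(candidates)  # -> [('A', 'B'), ('A', 'D'), ('B', 'D')]
```

Only the surviving pairs proceed to the propagation and collision-probability stages, which is where parallelization across pairs then pays off.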

  20. Pawnee Nation Energy Option Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Matlock, M.; Kersey, K.; Riding In, C.

    2009-07-21

    Pawnee Nation of Oklahoma Energy Option Analyses In 2003, the Pawnee Nation leadership identified the need for the tribe to comprehensively address its energy issues. During a strategic energy planning workshop a general framework was laid out and the Pawnee Nation Energy Task Force was created to work toward further development of the tribe’s energy vision. The overarching goals of the “first steps” project were to identify the most appropriate focus for its strategic energy initiatives going forward, and to provide information necessary to take the next steps in pursuit of the “best fit” energy options. Description of Activities Performed The research team reviewed existing data pertaining to the availability of biomass (focusing on woody biomass, agricultural biomass/bio-energy crops, and methane capture), solar, wind and hydropower resources on the Pawnee-owned lands. Using these data, combined with assumptions about costs and revenue streams, the research team performed preliminary feasibility assessments for each resource category. The research team also reviewed available funding resources and made recommendations to Pawnee Nation highlighting those resources with the greatest potential for financially-viable development, both in the near-term and over a longer time horizon. Findings and Recommendations Due to a lack of financial incentives for renewable energy, particularly at the state level, combined with mediocre renewable energy resources, renewable energy development opportunities are limited for Pawnee Nation. However, near-term potential exists for development of solar hot water at the gym, and an exterior wood-fired boiler system at the tribe’s main administrative building. Pawnee Nation should also explore options for developing LFGTE resources in collaboration with the City of Pawnee. Significant potential may also exist for development of bio-energy resources within the next decade. Pawnee Nation representatives should closely monitor

  1. A20 Functional Domains Regulate Subcellular Localization and NF-Kappa B Activation

    Science.gov (United States)

    2013-08-15

    demonstrate that mutation or elimination of the κB elements leads to a loss of induction by TNF-α in a Jurkat T-lymphocyte model. In addition...elements with nuclear extract from TNF-α treated or untreated Jurkat T-cells leads to the formation of a specific TNF-α-inducible DNA-protein complex...elements were able to compete with A20 κB elements for binding with the nuclear extracts derived from TNF-α treated Jurkat T-cells. These studies gave

  2. Dynamic characteristics of a 20 kHz resonant power system - Fault identification and fault recovery

    Science.gov (United States)

    Wasynczuk, O.

    1988-01-01

    A detailed simulation of a dc inductor resonant driver and receiver is used to demonstrate the transient characteristics of a 20 kHz resonant power system during fault and overload conditions. The simulated system consists of a dc inductor resonant inverter (driver), a 50-meter transmission cable, and a dc inductor resonant receiver load. Of particular interest are the driver and receiver performance during fault and overload conditions and on the recovery characteristics following removal of the fault. The information gained from these studies sets the stage for further work in fault identification and autonomous power system control.

  3. Proliferative myositis of the latissimus dorsi presenting in a 20-year-old male athlete

    LENUS (Irish Health Repository)

    Mc Hugh, N

    2017-08-01

    We describe the case of a 20-year-old rower presenting with an uncommon condition, Proliferative Myositis (PM), affecting the Latissimus Dorsi (LD). PM is a rare, benign tumour infrequently developing in the upper back. Its rapid growth and firm consistency may lead it to be mistaken for sarcoma at presentation. Therefore, careful multidisciplinary work-up is crucial, and should involve appropriate radiological and histopathological investigations. Here, we propose the aetiology of LD PM to be persistent myotrauma induced by repetitive rowing motions. Symptoms and rate of progression ultimately determine the management, which includes surveillance and/or conservative resection. There have been no documented cases of recurrence or malignant transformation.

  4. A retrospective analysis of mathieu and tip urethroplasty techniques for distal hypospadias repair; A 20 year experience.

    Science.gov (United States)

    Oztorun, Kenan; Bagbanci, Sahin; Dadali, Mumtaz; Emir, Levent; Karabulut, Ayhan

    2017-09-01

    We aimed to identify the changes in the application rates of two surgical techniques for distal hypospadias repair over the years, and to compare the two most popular surgical repair techniques for distal hypospadias in terms of surgical outcomes and the factors that affect the outcomes, as performed over a 20-year period. In this study, the records of 492 consecutive patients who had undergone an operation for distal hypospadias in the urology clinic of Ankara between May 1990 and December 2010, using either the Mathieu or TIPU surgical technique, were reviewed retrospectively. The patients who had glanular, coronal, and subcoronal meatus were accepted as distal hypospadias cases. Among the 492 examined medical records, it was revealed that 331 and 161 surgical interventions were performed using the Mathieu urethroplasty technique (Group-1) and the TIP urethroplasty technique (Group-2), respectively. Group-1 was divided into two subgroups, namely Group-1a (patients with primary hypospadias) and Group-1b (patients with a previous hypospadias operation). Likewise, Group-2 was divided into two subgroups, namely Group-2a and Group-2b. The patients' ages, number of previous urethroplasty operations, localization of the external urethral meatus prior to the operation, chordee state, length of the newly formed urethra, whether urinary diversion was done or not, post-operative complications and data regarding the follow-up period were evaluated, and the effects of these variables on the surgical outcome were investigated via statistical analyses. The primary objective of this study is to identify the changes in the application rates of the two surgical techniques in distal hypospadias repair over a 20-year period, and the secondary objectives are to compare the two most popular surgical repair techniques for distal hypospadias in terms of surgical outcomes, and the factors affecting the outcomes. The independent samples t test and Pearson's Chi-square test were used for statistical

  5. Superconductor design and loss analysis for a 20 MJ induction heating coil

    International Nuclear Information System (INIS)

    Walker, M.S.; Declercq, J.G.; Zeitlin, B.A.

    1980-01-01

    The design of a 50 kA conductor for use in a 20 MJ Induction Heating Coil is described. The conductor is a wide flat cable of 36 subcables, each of which contains six NbTi strands around a stainless steel core strand. The 2.04 mm (0.080'') diameter monolithic strands allow bubble clearing for cryostable operation at a pool boiling heat transfer from the unoccluded strand surface of 0.26 Watts/cm². A thin, tough polyester amide-imide (Westinghouse Omega) insulation provides a rugged coating that will resist flaking and chipping during the cabling and compaction operations and provide (1) a reliable adherent surface for enhanced heat transfer, and (2) a low voltage standoff preventing interstrand coupling losses. The strands are uniquely configured using CuNi elements to provide low ac losses with NbTi filaments in an all-copper matrix. AC losses are expected to be approximately 0.3% of 20 MJ for a -7.5 T to 7.5 T one-second 1/2-cosinusoidal bipolar operation in a 20 MJ coil. They will be approximately 0.1% of 100 MJ for 1.8 second -8 T and +8 T ramped operation in a 100 MJ coil. The design is firmly based on the results of tests performed on prototype strands and subcables.

  6. [TRENDS OF PERMANENT PACEMAKER IMPLANTATION IN A SINGLE CENTER OVER A 20-YEAR PERIOD].

    Science.gov (United States)

    Antonelli, Dante; Ilan, Limor Bushar; Freedberg, Nahum A; Feldman, Alexander; Turgeman, Yoav

    2015-05-01

    To review the changes in permanent pacemaker implantation indications, pacing modes and patients' demographics over a 20-year period. We retrospectively retrieved data on patients who underwent first implantation of the pacemaker between 1-1-1991 and 31-12-2010. One thousand and nine (1,009) patients underwent a first pacemaker implantation during that period; 535 were men (53%), their mean age was 74.6±19.5 years; the highest rate of implanted pacemaker was in patients ranging in age from 70-79 years, however there was an increasing number of patients aged over 80 years. The median survival time after initial pacemaker implantation was 8 years. Syncope was the most common symptom (62.5%) and atrioventricular block was the most common electrocardiographic indication (56.4%) leading to pacemaker implantation. There was increased utilization of dual chamber and rate responsive pacemakers over the years. There was no difference regarding mode selection between genders. Pacemaker implantation rates have increased over a 20-year period. Dual chamber replaced most of the single ventricular chamber pacemaker and rate responsive pacemakers became the norm. The data of a small volume center are similar to those reported in pacemaker surveys of high volume pacemaker implantation centers. They confirm adherence to the published guidelines for pacing.

  7. Hydrodynamic phonon drift and second sound in a (20,20) single-wall carbon nanotube

    International Nuclear Information System (INIS)

    Lee, Sangyeop; Lindsay, Lucas

    2017-01-01

    Here, two hydrodynamic features of phonon transport, phonon drift and second sound, in a (20,20) single wall carbon nanotube (SWCNT) are discussed using lattice dynamics calculations employing an optimized Tersoff potential for atomic interactions. We formally derive a formula for the contribution of drift motion of phonons to total heat flux at steady state. It is found that the drift motion of phonons carries more than 70% and 90% of heat at 300 K and 100 K, respectively, indicating that phonon flow can be reasonably approximated as hydrodynamic if the SWCNT is long enough to avoid ballistic phonon transport. The dispersion relation of second sound is derived from the Peierls-Boltzmann transport equation with Callaway's scattering model and quantifies the speed of second sound and its relaxation. The speed of second sound is around 4000 m/s in a (20,20) SWCNT and the second sound can propagate more than 10 m in an isotopically pure (20,20) SWCNT for frequency around 1 GHz at 100 K.
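The textbook form of the second-sound dispersion relation obtained from a Callaway-type model (a standard result stated here for context; the paper's exact expression may differ) is:

```latex
% Damped-wave equation for the temperature field under Callaway's model,
% with second-sound speed v_{ss} and resistive relaxation time \tau_R:
\partial_t^2 T + \frac{1}{\tau_R}\,\partial_t T = v_{ss}^2 \nabla^2 T
% A plane-wave ansatz T \propto e^{i(kx - \omega t)} then gives the
% dispersion relation of second sound:
\omega^2 + \frac{i\,\omega}{\tau_R} = v_{ss}^2 k^2
```

The imaginary term sets the damping: second sound propagates as a weakly attenuated wave only when the oscillation frequency satisfies ωτ_R ≫ 1, which is why the abstract quotes propagation distances at a specific frequency (around 1 GHz) and temperature.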

  8. Evaluating the Predictive Value of Growth Prediction Models

    Science.gov (United States)

    Murphy, Daniel L.; Gaertner, Matthew N.

    2014-01-01

    This study evaluates four growth prediction models--projection, student growth percentile, trajectory, and transition table--commonly used to forecast (and give schools credit for) middle school students' future proficiency. Analyses focused on vertically scaled summative mathematics assessments, and two performance standards conditions (high…

  9. Analyses of bundle experiment data using MATRA-h

    Energy Technology Data Exchange (ETDEWEB)

    Lim, In Cheol; Chea, Hee Taek [Korea Atomic Energy Research Institute, Taejon (Korea)

    1998-06-01

    When the construction and operation license for HANARO was renewed in 1995, a 25% CHF penalty was imposed. The reason for this was that the validation work related to the CHF design calculation was not sufficient to assure the CHF margin. As part of the work to recover this CHF penalty, MATRA-h was developed by implementing new correlations for heat transfer, CHF prediction, and subcooled void into MATRA-a, which is KAERI's modified version of COBRA-IV-I. Using MATRA-h, subchannel analyses of the bundle experiment data were performed. From the comparison of the code predictions with the experimental results, it was found that the code gives conservative predictions as far as the CHF in the bundle geometry is concerned. (author). 12 refs., 25 figs., 16 tabs.

  10. The effects of HIV/AIDS on rural communities in East Africa: a 20-year perspective.

    Science.gov (United States)

    Seeley, Janet; Dercon, Stefan; Barnett, Tony

    2010-03-01

    Much of the research on implications of the HIV epidemic for individual households and broader rural economies in the 1980s and early 1990s predicted progressive declines in agricultural production, with dire consequences for rural livelihoods. Restudies in Tanzania and Uganda show that from 1986 to the present, HIV and AIDS have sometimes thrown households into disarray and poverty, but more often have reduced development. The progressive and systematic decline predicted in earlier work has not come to pass. However, poverty remains, as does endemic HIV disease.

  11. Techniques for Analysing Problems in Engineering Projects

    DEFF Research Database (Denmark)

    Thorsteinsson, Uffe

    1998-01-01

    Description of how CPM networks can be used for analysing complex problems in engineering projects.

  12. Study of proton and two-proton emission from light neutron-deficient nuclei around A=20

    Energy Technology Data Exchange (ETDEWEB)

    Zerguerras, T

    2001-09-01

    Proton and two-proton emission from light neutron-deficient nuclei around A=20 have been studied. A radioactive beam of {sup 18}Ne, {sup 17}F and {sup 20}Mg, produced at the Grand Accelerateur National d'Ions Lourds by fragmentation of a {sup 24}Mg primary beam at 95 MeV/A, bombarded a {sup 9}Be target to form unbound states. Protons and nuclei from the decay were detected in the MUST array and the SPEG spectrometer, respectively. From energy and angle measurements, the invariant mass of the decaying nucleus could be reconstructed. Double-coincidence events between a proton and {sup 17}F, {sup 16}O, {sup 15}O, {sup 14}O and {sup 18}Ne were registered to obtain excitation energy spectra of {sup 18}Ne, {sup 17}F, {sup 16}F, {sup 15}F and {sup 19}Na. Generally, the measured masses are in agreement with previous experiments. In the case of {sup 18}Ne, excitation energy and angular distributions agree well with the predictions of a break-up model calculation. From {sup 17}Ne-proton coincidences, a first experimental measurement of the ground-state mass excess of {sup 18}Na has been obtained, yielding 24.19(0.15) MeV. Two-proton emission from {sup 17}Ne and {sup 18}Ne excited states and the {sup 19}Mg ground state was studied through triple coincidences between two protons and {sup 15}O, {sup 16}O and {sup 17}Ne, respectively. In the first case, the proton-proton relative-angle distribution in the center of mass has been compared with model calculations. Sequential emission from excited states of {sup 17}Ne, above the proton emission threshold, through {sup 16}F is dominant, but a {sup 2}He decay channel could not be excluded. No {sup 2}He emission from the 1.288 MeV {sup 17}Ne state, or from the 6.15 MeV {sup 18}Ne state, has been observed. Only one coincidence event between {sup 17}Ne and two protons was registered, the value of the one-neutron stripping reaction cross section of {sup 20}Mg being much lower than predicted. (author)
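
The invariant-mass reconstruction described above rests on the standard relativistic relation M^2 = (sum E)^2 - |sum p|^2 in natural units. A minimal sketch of that kinematic step (the function name, units, and input layout are illustrative assumptions, not taken from the thesis):

```python
import math

def invariant_mass(particles):
    """Invariant mass of a decaying system from its decay products.

    particles: iterable of (E, px, py, pz) four-momenta in MeV
    (natural units, c = 1). Returns the invariant mass in MeV:
        M^2 = (sum E)^2 - |sum p|^2
    """
    E = sum(p[0] for p in particles)
    px = sum(p[1] for p in particles)
    py = sum(p[2] for p in particles)
    pz = sum(p[3] for p in particles)
    return math.sqrt(E * E - (px * px + py * py + pz * pz))
```

Applied to a detected proton (or proton pair) and the coincident fragment, subtracting the fragment and proton rest masses from this invariant mass then yields the decay energy from which excitation-energy spectra are built.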

  13. Improving word coverage using unsupervised morphological analyser

    Indian Academy of Sciences (India)

    Institute of Information Technology (IIIT), Hyderabad. The study was made in various directions to compare the results of the system we developed with the output of the Hindi Analyser developed by IIIT, Hyderabad. 3.1 Comparison of rule-based morphological analyser with unsupervised morphological analyser. 3.1a Calculation of ...

  14. Changes in mood status and neurotic levels during a 20-day bed rest

    Science.gov (United States)

    Ishizaki, Yuko; Ishizaki, Tatsuro; Fukuoka, Hideoki; Kim, Chang-Sun; Fujita, Masayo; Maegawa, Yuko; Fujioka, Hiroshi; Katsura, Taisaku; Suzuki, Yoji; Gunji, Atsuaki

    2002-04-01

    This study evaluated changes in mood status and depressive and neurotic levels in nine young male subjects during a 20-day 6° head-down-tilt bed rest and examined whether exercise training modified these changes. Participants were asked to complete psychometric inventories before, during, and after the bed rest experiment. Depressive and neurotic levels were elevated during the bed rest period according to the Japanese version of Zung's Self-rating Depression Scale and the Japanese version of the General Health Questionnaire. The mood state "vigor" was impaired and "confusion" was increased during the bed rest and recumbent control periods compared to the pre-bed rest and ambulatory control periods according to the Japanese version of the Profile of Mood States, whereas the moods "tension-anxiety", "depression-dejection", "anger-hostility" and "fatigue" were relatively stable during the experiment. Isometric exercise training did not modify these results. Microgravity, along with confinement to bed and isolation from familiar environments, induced impairment of mental status.

  15. Interaction cross-sections and matter radii of A = 20 isobars

    International Nuclear Information System (INIS)

    Chulkov, L.; Bochkarev, O.; Geissel, H.; Golovkov, M.; Janas, Z.; Keller, H.; Kobayashi, T.; Muenzenberg, G.; Nickel, F.; Ogloblin, A.; Patra, S.; Piechaczek, A.; Roeckl, E.; Schwab, W.; Suemmerer, K.; Suzuki, T.; Tanihata, I.; Yoshida, K.

    1995-11-01

    High-energy interaction cross-sections of A=20 nuclei ( 20 N, 20 O, 20 F, 20 Ne, 20 Na, 20 Mg) on carbon were measured with accuracies of ∼1%. The nuclear matter rms radii derived from the measured cross-sections show an irregular dependence on isospin projection. The largest difference in radii, which amounts to approximately 0.2 fm, has been obtained for the mirror nuclei 20 O and 20 Mg. The influence of nuclear deformation and binding energy on the radii is discussed. By evaluating the difference in rms radii of neutron and proton distributions, evidence has been found for the existence of a proton skin for 20 Mg and of a neutron skin for 20 N. (orig.)

  16. Environmental Assessment for Construction of a 20-Slip Boat Dock Structure MacDill AFB, Florida

    Science.gov (United States)

    2008-01-01

    polychaete worms, pink shrimp, blue crab (Callinectes sapidus), seahorses (Hippocampus spp.), snapper, mullet (Mugilidae spp.), and bonefish (FNAI and... sinking or damage from high winds, this plan fails to predict how the dock would handle this many large boats in high winds or surge. This is an open

  17. The Cardiff dental study: a 20-year critical evaluation of the psychological health gain from orthodontic treatment.

    Science.gov (United States)

    Kenealy, Pamela M; Kingdon, Anne; Richmond, Stephen; Shaw, William C

    2007-02-01

    Despite the widespread belief that orthodontics improves psychological well-being and self-esteem, there is little objective evidence to support this (Kenealy et al., 1989a; Shaw, O'Brien, Richmond, & Brook, 1991). A 20-year follow-up study compared the dental and psychosocial status of individuals who received, or did not receive, orthodontics as teenagers. A prospective longitudinal cohort design with four studies of the effect of orthodontic treatment was used. Secondary analysis of outcome data incorporated orthodontic need at baseline and treatment received in a 2 x 2 factorial design. A multidisciplinary research programme studied a cohort of 1,018 11-12-year-old participants in 1981. Extensive assessment of dental health and psychosocial well-being was conducted; facial and dental photographs and plaster casts of dentition were obtained and rated for attractiveness and pre-treatment need. No recommendations about orthodontic treatment were made, and an observational approach was adopted. At the third follow-up, 337 participants (now 30-31 years old) were re-examined in 2001. Participants with a prior need for orthodontic treatment as children who obtained treatment demonstrated better tooth alignment and satisfaction. However, when self-esteem at baseline was controlled for, orthodontics had little positive impact on psychological health and quality of life in adulthood. Lack of orthodontic treatment where there was a prior need did not lead to psychological difficulties in later life. Dental status alone was a weak predictor of self-esteem at outcome, explaining 8% of the variance. Self-esteem in adulthood was more strongly predicted (65% of the variance) by psychological variables at outcome: perception of quality of life, life satisfaction, self-efficacy, depression, social anxiety, emotional health, and self-perception of attractiveness. Longitudinal analysis revealed that the observed effect of orthodontic treatment on self-esteem at outcome was accounted for by self-esteem at

  18. Prediction, Regression and Critical Realism

    DEFF Research Database (Denmark)

    Næss, Petter

    2004-01-01

    This paper considers the possibility of prediction in land use planning, and the use of statistical research methods in analyses of relationships between urban form and travel behaviour. Influential writers within the tradition of critical realism reject the possibility of predicting social phenomena. This position is fundamentally problematic to public planning. Without at least some ability to predict the likely consequences of different proposals, the justification for public sector intervention into market mechanisms will be frail. Statistical methods like regression analyses are commonly seen as necessary in order to identify aggregate level effects of policy measures, but are questioned by many advocates of critical realist ontology. Using research into the relationship between urban structure and travel as an example, the paper discusses relevant research methods and the kinds...

  19. Genetic Prediction.

    Science.gov (United States)

    Turkheimer, Eric

    2015-01-01

    The fundamental reason that the genetics of behavior has remained so controversial for so long is that the layer of theory between data and their interpretation is thicker and more opaque than in more established areas of science. The finding that variations in tiny snippets of DNA have small but detectable relations to variation in behavior surprises no one, at least no one who was paying attention to the twin studies. How such snippets of DNA are related to differences in behavior-known as the gene-to-behavior pathway-is the great theoretical problem of modern behavioral genetics. Given that intentional human breeding is a horrific prospect, what kind of technology might we want (or fear) out of human behavioral genetics? One possibility is a technology that could predict important behavioral characteristics of humans based on their genomes alone. A moment's thought suggests significant benefits and risks that might be associated with such a possibility, but for the moment, just consider how convincing it would be if on the day of a baby's birth we could make meaningful predictions about whether he or she would become a concert pianist or an alcoholic. This article will consider where we are right now as regards that possibility, using human height and intelligence as the primary examples. © 2015 The Hastings Center.

  20. Predicting gangrenous cholecystitis.

    Science.gov (United States)

    Wu, Bin; Buddensick, Thomas J; Ferdosi, Hamid; Narducci, Dusty Marie; Sautter, Amanda; Setiawan, Lisa; Shaukat, Haroon; Siddique, Mustafa; Sulkowski, Gisela N; Kamangar, Farin; Kowdley, Gopal C; Cunningham, Steven C

    2014-09-01

    Gangrenous cholecystitis (GC) is often challenging to treat. The objectives of this study were to determine the accuracy of pre-operative diagnosis, to assess the rate of post-cholecystectomy complications and to assess models to predict GC. A retrospective single-institution review identified patients undergoing a cholecystectomy. Logistic regression models were used to examine the association of variables with GC and to build risk-assessment models. Of 5812 patients undergoing a cholecystectomy, 2219 had acute cholecystitis, 4837 chronic cholecystitis and 351 GC. Surgeons diagnosed GC pre-operatively in only 9% of cases. Patients with GC had more complications, including bile-duct injury, increased estimated blood loss (EBL) and more frequent open cholecystectomies. In unadjusted analyses, variables significantly associated with GC included: age >45 years, male gender, heart rate (HR) >90, white blood cell count (WBC) >13,000/mm(3), gallbladder wall thickening (GBWT) ≥ 4 mm, pericholecystic fluid (PCCF) and American Society of Anesthesiology (ASA) >2. In adjusted analyses, age, WBC, GBWT and HR, but not gender, PCCF or ASA, remained statistically significant. A 5-point scoring system was created: 0 points gave a 2% probability of GC and 5 points a 63% probability. Use of such models can improve the pre-operative diagnosis of GC. A pre-operative prediction of GC may allow surgeons to be better prepared for a difficult operation. © 2014 International Hepato-Pancreato-Biliary Association.
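
A point-based score of this kind can be illustrated with a minimal sketch. The thresholds below are those reported in the abstract's unadjusted analysis (age >45 years, HR >90, WBC >13,000/mm3, GBWT ≥4 mm); the exact criteria, weights, and intermediate probabilities of the published 5-point system are not given in the abstract, so this one-point-per-factor scheme is an assumption for illustration only:

```python
def gc_points(age_years, heart_rate, wbc_per_mm3, gbwt_mm):
    """Hypothetical point-based gangrenous-cholecystitis risk score:
    one point per risk factor present (thresholds taken from the
    abstract's unadjusted analysis; the published 5-point model may
    define and weight its criteria differently).
    """
    criteria = [
        age_years > 45,          # age > 45 years
        heart_rate > 90,         # HR > 90 bpm
        wbc_per_mm3 > 13_000,    # WBC > 13,000/mm^3
        gbwt_mm >= 4,            # gallbladder wall thickening >= 4 mm
    ]
    return sum(criteria)
```

In the study, 0 points corresponded to a 2% probability of GC and 5 points to a 63% probability; mapping intermediate point totals to probabilities would require the full published model.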

  1. Preschool Personality Antecedents of Narcissism in Adolescence and Emergent Adulthood: A 20-Year Longitudinal Study

    OpenAIRE

    Carlson, Kevin S.; Gjerde, Per F.

    2009-01-01

    This prospective study examined relations between preschool personality attributes and narcissism during adolescence and emerging adulthood. We created five a priori preschool scales anticipated to foretell future narcissism. Independent assessors evaluated the participants' personality at ages 14, 18, and 23. Based upon these evaluations, we generated observer-based narcissism scales for each of these three ages. All preschool scales predicted subsequent narcissism, except Interpersonal Anta...

  2. Predicting supersymmetry

    Energy Technology Data Exchange (ETDEWEB)

    Heinemeyer, S. [Instituto de Fisica de Cantabria (CSIC-UC), Santander (Spain); Weiglein, G. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)

    2010-07-15

    We review the result of SUSY parameter fits based on frequentist analyses of experimental constraints from electroweak precision data, (g-2){sub {mu}}, B physics and cosmological data. We investigate the parameters of the constrained MSSM (CMSSM) with universal soft supersymmetry-breaking mass parameters, and a model with common non-universal Higgs mass parameters in the superpotential (NUHM1). Shown are the results for the SUSY and Higgs spectrum of the models. Many sparticle masses are highly correlated in both the CMSSM and NUHM1, and parts of the regions preferred at the 68% C.L. are accessible to early LHC running. The best-fit points could be tested even with 1 fb{sup -1} at {radical}(s)=7 TeV. (orig.)

  3. Social Network Analyses and Nutritional Behavior: An Integrated Modeling Approach

    Directory of Open Access Journals (Sweden)

    Alistair McNair Senior

    2016-01-01

    Animals have evolved complex foraging strategies to obtain a nutritionally balanced diet and associated fitness benefits. Recent advances in nutrition research, combining state-space models of nutritional geometry with agent-based models of systems biology, show how nutrient-targeted foraging behavior can also influence animal social interactions, ultimately affecting collective dynamics and group structures. Here we demonstrate how social network analyses can be integrated into such a modeling framework and provide a tangible and practical analytical tool to compare experimental results with theory. We illustrate our approach by examining the case of nutritionally mediated dominance hierarchies. First, we show how nutritionally explicit agent-based models that simulate the emergence of dominance hierarchies can be used to generate social networks. Importantly, the structural properties of our simulated networks bear similarities to dominance networks of real animals (where conflicts are not always directly related to nutrition). Finally, we demonstrate how metrics from social network analyses can be used to predict the fitness of agents in these simulated competitive environments. Our results highlight the potential importance of nutritional mechanisms in shaping dominance interactions in a wide range of social and ecological contexts. Nutrition likely influences social interaction in many species, and yet a theoretical framework for exploring these effects is currently lacking. Combining social network analyses with computational models from nutritional ecology may bridge this divide, representing a pragmatic approach for generating theoretical predictions for nutritional experiments.
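
As a concrete illustration of summarizing a simulated conflict network with a dominance metric, here is a minimal pure-Python sketch. The data layout and the choice of metric (the win-proportion component of David's score, a standard dominance-hierarchy measure) are illustrative assumptions, not the specific models used in the article:

```python
def dominance_scores(wins):
    """wins[i][j]: number of simulated conflicts agent i won against j.

    Returns, for each agent, the sum over opponents of its proportion
    of wins in that dyad (the 'w' component of David's score, a common
    dominance-hierarchy metric in social network analyses). Dyads with
    no recorded conflicts contribute nothing.
    """
    n = len(wins)
    scores = []
    for i in range(n):
        w = 0.0
        for j in range(n):
            if i == j:
                continue
            dyad_total = wins[i][j] + wins[j][i]
            if dyad_total:
                w += wins[i][j] / dyad_total
        scores.append(w)
    return scores
```

Ranking agents by such a score gives a hierarchy that can be compared across simulation runs, or between simulated and empirically observed conflict networks.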

  4. Successfully breaking a 20-year cycle of hospitalizations with recovery-oriented cognitive therapy for schizophrenia.

    Science.gov (United States)

    Grant, Paul M; Reisweber, Jarrod; Luther, Lauren; Brinen, Aaron P; Beck, Aaron T

    2014-05-01

    Individuals with severe and persistent schizophrenia can present challenges (e.g., difficulties sustaining motivation and conducting information processing tasks) to the implementation of recovery-oriented care. We present a successful application of recovery-oriented cognitive therapy (CT-R), a fusion of the spirit and principles of the recovery movement with the evidence base and know-how of cognitive therapy, that helped an individual with schizophrenia move along her recovery path by overcoming specific obstacles, including a 20-year cycle of hospitalizations (five per year), daily phone calls to local authorities, threatening and berating "voices," the belief that she would be killed at any moment, and social isolation. Building on strengths, treatment included collaboratively identifying meaningful personal goals that were broken down into successfully accomplishable tasks (e.g., making coffee) that disconfirmed negative beliefs and replaced the phone calling. At the end of treatment and at a 6-month follow-up, the phone calls had ceased, psychosocial functioning and neurocognitive performance had increased, and avolition and positive symptoms had decreased. She was not hospitalized once in 24 months. Results suggest that individuals with schizophrenia have untapped potential for recovery that can be mobilized through individualized, goal-focused psychosocial interventions. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  5. Conceptual design of a 20 Tesla pulsed solenoid for a laser solenoid fusion reactor

    International Nuclear Information System (INIS)

    Nolan, J.J.; Averill, R.J.

    1977-01-01

    Design considerations are described for a strip-wound solenoid which is pulsed to 20 tesla while immersed in a 20 tesla bias field, so as to achieve within the bore of the pulsed solenoid a net field sequence starting at 20 tesla, going first down to zero, then up to 40 tesla, and finally back to 20 tesla in a period of about 5 x 10{sup -3} seconds. The important parameters of the solenoid, e.g., aperture, build, turns, stored and dissipated energy, field intensity and powering circuit, are given. A numerical example for a specific design is presented. Mechanical stresses in the solenoid and the subsequent choice of materials for coil construction are discussed. Several possible design difficulties (uniformity of field, long-term stability of insulation under neutron bombardment, and the choice of structural materials with appropriate tensile strength and elasticity to withstand the magnetic forces developed) are not discussed in this preliminary report of a conceptual magnet design; these questions are addressed in detail in the complete design report and in part in reference one. Furthermore, the authors feel that the problems encountered in this conceptual design are surmountable and are not a hindrance to the construction of such a magnet system.

  6. Antiproton production and accumulation for a 20-TeV anti pp collider

    International Nuclear Information System (INIS)

    Lambertson, G.R.; Leemann, C.W.

    1982-10-01

    Stochastic cooling is capable of providing a flux of 3 x 10{sup 8} anti p/second, adequate to fill a 20 TeV collider in 12 hours for operation at luminosity L = 10{sup 32} cm{sup -2}s{sup -1}. This represents an order of magnitude improvement over the FNAL-source design goal. It is made possible mainly by higher bandwidth (4 GHz vs 1 GHz) and a lower psi{sub max}/psi{sub max} ratio (350 vs 10{sup 4}). Systems operating in the 4 to 8 GHz band are necessary, a technology which goes beyond the 2 to 4 GHz systems presently under development but appears within reach with a moderate R and D effort. Stochastic cooling in the 8 to 16 GHz range holds the promise of small transverse emittance (less than or equal to 1 μm normalized), allowing even shorter filling times or the use of lower-field, larger-circumference colliders, should such devices prove more cost-effective.

  7. Natural course of posttraumatic stress disorder: a 20-month prospective study of Turkish earthquake survivors.

    Science.gov (United States)

    Karamustafalioglu, Oguz K; Zohar, Joseph; Güveli, Mustafa; Gal, Gilad; Bakim, Bahadir; Fostick, Leah; Karamustafalioglu, Nesrin; Sasson, Yehuda

    2006-06-01

    A 20-month prospective follow-up of survivors of the severe earthquake in Turkey in 1999 examined the natural course of posttraumatic stress disorder (PTSD) and the contribution of different symptom clusters to the emergence of PTSD. Subjects were randomly sampled in a suburb of Istanbul that was severely affected by the earthquake. A total of 464 adults were assessed with a self-report instrument for PTSD symptoms on 3 consecutive surveys that were administered 1 to 3, 6 to 10, and 18 to 20 months following the earthquake. The prevalence of PTSD was 30.2% on the first survey and decreased to 26.9% and 10.6% on the second and third surveys, respectively. Female subjects showed initially higher (34.8%) PTSD rates compared with male subjects (19.1%). However, gender differences disappeared by the time of the third survey due to high spontaneous remission rates in female subjects. Low levels of chronic and delayed-onset PTSD were observed. A major contribution of the avoidance symptoms to PTSD diagnosis was identified by statistical analysis. Initial PTSD following an earthquake may be as prevalent as in other natural disasters, but high rates of spontaneous remission lead to low prevalence 1.5 years following the earthquake. Initial avoidance characteristics play a major role in the emergence of PTSD.

  8. [Anne Arold. Kontrastive Analyse...] / Paul Alvre

    Index Scriptorium Estoniae

    Alvre, Paul, 1921-2008

    2001-01-01

    Arvustus: Arold, Anne. Kontrastive analyse der Wortbildungsmuster im Deutschen und im Estnischen (am Beispiel der Aussehensadjektive). Tartu, 2000. (Dissertationes philologiae germanicae Universitatis Tartuensis)

  9. Genome-Facilitated Analyses of Geomicrobial Processes

    Energy Technology Data Exchange (ETDEWEB)

    Kenneth H. Nealson

    2012-05-02

    This project had the goal(s) of understanding the mechanism(s) of extracellular electron transport (EET) in the microbe Shewanella oneidensis MR-1, and a number of other strains and species in the genus Shewanella. The major accomplishments included sequencing, annotation, and analysis of more than 20 Shewanella genomes. The comparative genomics enabled the beginning of a systems biology approach to this genus. Another major contribution involved the study of gene regulation, primarily in the model organism, MR-1. As part of this work, we took advantage of special facilities at the DOE: e.g., the synchrotron radiation facility at ANL, where we successfully used this system for elemental characterization of single cells in different metabolic states (1). We began work with purified enzymes, and identification of partially purified enzymes, leading to initial characterization of several of the 42 c-type cytochromes from MR-1 (2). As the genome became annotated, we began experiments on transcriptome analysis under different conditions of growth, the first step towards systems biology (3,4). Conductive appendages of Shewanella, called bacterial nanowires were identified and characterized during this work (5, 11, 20,21). For the first time, it was possible to measure the electron transfer rate between single cells and a solid substrate (20), a rate that has been confirmed by several other laboratories. We also showed that MR-1 cells preferentially attach to cells at a given charge, and are not attracted, or even repelled by other charges. The interaction with the charged surfaces begins with a stimulation of motility (called electrokinesis), and eventually leads to attachment and growth. One of the things that genomics allows is the comparative analysis of the various Shewanella strains, which led to several important insights. 
First, while the genomes predicted that none of the strains looked like they should be able to degrade N-acetyl glucosamine (NAG), the monomer

  10. Validation of a 20-year forecast of US childhood lead poisoning: Updated prospects for 2010

    International Nuclear Information System (INIS)

    Jacobs, David E.; Nevin, Rick

    2006-01-01

    We forecast childhood lead poisoning and residential lead paint hazard prevalence for 1990-2010, based on a previously unvalidated model that combines national blood lead data with three different housing data sets. The housing data sets, which describe trends in housing demolition, rehabilitation, window replacement, and lead paint, are the American Housing Survey, the Residential Energy Consumption Survey, and the National Lead Paint Survey. Blood lead data are principally from the National Health and Nutrition Examination Survey. New data now make it possible to validate the midpoint of the forecast time period. For the year 2000, the model predicted 23.3 million pre-1960 housing units with lead paint hazards, compared to an empirical HUD estimate of 20.6 million units. Further, the model predicted 498,000 children with elevated blood lead levels (EBL) in 2000, compared to a CDC empirical estimate of 434,000. The model predictions were well within 95% confidence intervals of empirical estimates for both residential lead paint hazard and blood lead outcome measures. The model shows that window replacement explains a large part of the dramatic reduction in lead poisoning that occurred from 1990 to 2000. Here, the construction of the model is described and updated through 2010 using new data. Further declines in childhood lead poisoning are achievable, but the goal of eliminating children's blood lead levels ≥10 μg/dL by 2010 is unlikely to be achieved without additional action. A window replacement policy will yield multiple benefits of lead poisoning prevention, increased home energy efficiency, decreased power plant emissions, improved housing affordability, and other previously unrecognized benefits. Finally, combining housing and health data could be applied to forecasting other housing-related diseases and injuries

  11. Can NGOs Make a Difference? Revisiting and Reframing a 20-year Debate

    DEFF Research Database (Denmark)

    Opoku-Mensah, Paul Yaw

    2007-01-01

    The article seeks to connect the vibrant debates in the Nordic region on NGOs and the aid system with the international comparative debates on NGOs and development alternatives. It argues for a reformulation of the international debate on NGOs and development alternatives to address the foundational questions related to the formative role and structural impact of the international aid system on NGOs and their roles. This reformulation moves the discussions further and enables analyses that provide understanding of the actual and potential role of NGOs in transforming development processes.

  12. Diagnostic Comparison of Meteorological Analyses during the 2002 Antarctic Winter

    Science.gov (United States)

    Manney, Gloria L.; Allen, Douglas R.; Kruger, Kirstin; Naujokat, Barbara; Santee, Michelle L.; Sabutis, Joseph L.; Pawson, Steven; Swinbank, Richard; Randall, Cora E.; Simmons, Adrian J.

    2005-01-01

    Several meteorological datasets, including U.K. Met Office (MetO), European Centre for Medium-Range Weather Forecasts (ECMWF), National Centers for Environmental Prediction (NCEP), and NASA's Goddard Earth Observation System (GEOS-4) analyses, are being used in studies of the 2002 Southern Hemisphere (SH) stratospheric winter and Antarctic major warming. Diagnostics are compared to assess how these studies may be affected by the meteorological data used. While the overall structure and evolution of temperatures, winds, and wave diagnostics in the different analyses provide a consistent picture of the large-scale dynamics of the SH 2002 winter, several significant differences may affect detailed studies. The NCEP-NCAR reanalysis (REAN) and NCEP-Department of Energy (DOE) reanalysis-2 (REAN-2) datasets are not recommended for detailed studies, especially those related to polar processing, because of lower-stratospheric temperature biases that result in underestimates of polar processing potential, and because their winds and wave diagnostics show increasing differences from other analyses between ∼30 and 10 hPa (their top level). Southern Hemisphere polar stratospheric temperatures in the ECMWF 40-Yr Re-analysis (ERA-40) show unrealistic vertical structure, so this long-term reanalysis is also unsuited for quantitative studies. The NCEP/Climate Prediction Center (CPC) objective analyses give an inferior representation of the upper-stratospheric vortex. Polar vortex transport barriers are similar in all analyses, but there is large variation in the amount, patterns, and timing of mixing, even among the operational assimilated datasets (ECMWF, MetO, and GEOS-4). The higher-resolution GEOS-4 and ECMWF assimilations provide significantly better representation of filamentation and small-scale structure than the other analyses, even when fields gridded at reduced resolution are studied.
The choice of which analysis to use is most critical for detailed transport

  13. Family cohesion and posttraumatic intrusion and avoidance among war veterans: a 20-year longitudinal study.

    Science.gov (United States)

    Zerach, Gadi; Solomon, Zahava; Horesh, Danny; Ein-Dor, Tsachi

    2013-02-01

    The bi-directional relationships between combat-induced posttraumatic symptoms and family relations are yet to be understood. The present study assesses the longitudinal interrelationship of posttraumatic intrusion and avoidance and family cohesion among 208 Israeli combat veterans from the 1982 Lebanon War. Two groups of veterans were assessed with self-report questionnaires 1, 3 and 20 years after the war: a combat stress reaction (CSR) group and a matched non-CSR control group. Latent Trajectories Modeling showed that veterans of the CSR group reported higher intrusion and avoidance than non-CSR veterans at all three time points. With time, there was a decline in these symptoms in both groups, but the decline was more salient among the CSR group. The latter also reported lower levels of family cohesion. Furthermore, an increase in family cohesion levels was found in both groups over the years. Most importantly, Autoregressive Cross-Lagged Modeling among CSR and non-CSR veterans revealed that CSR veterans' posttraumatic symptoms in 1983 predicted lower family cohesion in 1985, and lower family cohesion, in turn, predicted posttraumatic symptoms in 2002. The findings suggest that psychological breakdown on the battlefield is a marker for future family cohesion difficulties. Our results lend further support for the bi-directional mutual effects of posttraumatic symptoms and family cohesion over time.

  14. Towards a 20 kA high temperature superconductor current lead module using REBCO tapes

    Science.gov (United States)

    Heller, R.; Bagrets, N.; Fietz, W. H.; Gröner, F.; Kienzler, A.; Lange, C.; Wolf, M. J.

    2018-01-01

    Most of the large fusion devices presently under construction or in operation consisting of superconducting magnets like EAST, Wendelstein 7-X (W7-X), JT-60SA, and ITER, use high temperature superconductor (HTS) current leads (CL) to reduce the cryogenic load and operational cost. In all cases, the 1st generation HTS material Bi-2223 is used which is embedded in a low-conductivity matrix of AgAu. In the meantime, industry worldwide concentrates on the production of the 2nd generation HTS REBCO material because of the better field performance in particular at higher temperature. As the new material can only be produced in a multilayer thin-film structure rather than as a multi-filamentary tape, the technology developed for Bi-2223-based current leads cannot be transferred directly to REBCO. Therefore, several laboratories are presently investigating the design of high current HTS current leads made of REBCO. Karlsruhe Institute of Technology is developing a 20 kA HTS current lead using brass-stabilized REBCO tapes—as a further development to the Bi-2223 design used in the JT-60SA current leads. The same copper heat exchanger module as in the 20 kA JT-60SA current lead will be used for simplicity, which will allow a comparison of the newly developed REBCO CL with the earlier produced and investigated CL for JT-60SA. The present paper discusses the design and accompanying test of single tape and stack REBCO mock-ups. Finally, the fabrication of the HTS module using REBCO stacks is described.

  15. Phosphorus retention in a 20-year-old septic system filter bed.

    Science.gov (United States)

    Robertson, W D

    2012-01-01

    Septic systems in lakeshore environments often occur where thin soils overlie bedrock and, consequently, filter beds may be constructed of imported filter sand. The objective of this study was to assess the mobility of wastewater phosphorus (P) in such a potentially vulnerable setting by examining a 20-yr-old domestic septic system located near Parry Sound, ON, Canada, where the filter bed is constructed of imported noncalcareous sand. The groundwater plume is acidic (pH 6.0) and has a zone of elevated PO4-P (up to 3.1 ± 1.7 mg/L) below the tile lines, but no elevated PO4-P is present beyond 5 m from the tile lines. Elevated concentrations of desorbable P (up to 137 mg/kg) and acid-extractable P (up to 3210 mg/kg) occur in the filter sand within 1 m below four of the seven tile lines present, and the total mass of excess acid-extractable P (39 kg) is similar to the estimated total lifetime P loading to the system (33 kg). Microprobe images reveal abundant Fe- and Al-rich authigenic mineral coatings on the sand grains that are increasingly P rich (up to 10% w/w P) near the tile lines. Additionally, 6 yr of monitoring data show that groundwater PO4 concentrations are not increasing. This indicates that mineral precipitation, not adsorption, dominates P immobilization at this site. This example of robust long-term P retention opens up the possibility of improving P removal in on-site treatment systems by prescribing specific sand types for filter bed construction. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
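
    The mass-balance comparison above (excess acid-extractable P in the sand versus lifetime P loading) reduces to a simple concentration-times-sediment-mass calculation. The sketch below illustrates that arithmetic; the bed dimensions, bulk density, and mean excess concentration are invented for illustration and are not values reported by the study.

```python
# Hedged sketch: estimating the mass of excess P stored in a filter bed from an
# excess acid-extractable concentration. All numeric inputs are illustrative
# assumptions, not measurements from the Parry Sound site.

def excess_p_mass_kg(excess_p_mg_per_kg, bulk_density_kg_m3, volume_m3):
    """Excess P mass (kg) = concentration (mg/kg) * sediment mass (kg) / 1e6."""
    sediment_mass_kg = bulk_density_kg_m3 * volume_m3
    return excess_p_mg_per_kg * sediment_mass_kg / 1e6

# Illustrative: a 1 m-thick enriched zone with a 15 m3 volume, bulk density
# 1600 kg/m3, and a mean excess acid-extractable P of 1600 mg/kg.
mass = excess_p_mass_kg(1600, 1600, 15.0)
print(f"excess P stored: {mass:.1f} kg")
```

    Comparing such a stored-mass estimate against a lifetime loading estimate (per-capita P excretion times occupancy times years) is what lets the authors argue that most of the input P was retained.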

  16. Prenatal Diagnosis of Transposition of the Great Arteries over a 20-Year Period: Improved but Imperfect

    Science.gov (United States)

    Escobar-Diaz, Maria C; Freud, Lindsay R; Bueno, Alejandra; Brown, David W; Friedman, Kevin; Schidlow, David; Emani, Sitaram; del Nido, Pedro; Tworetzky, Wayne

    2015-01-01

    Objective To evaluate temporal trends in prenatal diagnosis of transposition of the great arteries with intact ventricular septum (TGA/IVS) and its impact on neonatal morbidity and mortality. Methods Newborns with TGA/IVS referred for surgical management to our center over a 20-year period (1992 – 2011) were included. The study time was divided into 5 four-year periods, and the primary outcome was rate of prenatal diagnosis. Secondary outcomes included neonatal pre-operative status and perioperative survival. Results Of the 340 patients, 81 (24%) had a prenatal diagnosis. Prenatal diagnosis increased over the study period from 6% to 41% (p<0.001). Prenatally diagnosed patients underwent a balloon atrial septostomy (BAS) earlier than postnatally diagnosed patients (0 vs. 1 day, p<0.001) and fewer required mechanical ventilation (56% vs. 69%, p=0.03). There were no statistically significant differences in pre-operative acidosis (16% vs. 26%, p=0.1) and need for preoperative ECMO (2% vs. 3%, p=1.0). There was also no significant mortality difference (1 pre-operative and no post-operative deaths among prenatally diagnosed patients, as compared to 4 pre-operative and 6 post-operative deaths among postnatally diagnosed patients). Conclusion The prenatal detection rate of TGA/IVS has improved but still remains below 50%, suggesting the need for strategies to increase detection rates. The mortality rate was not statistically different between pre- and postnatally diagnosed patients; however, there were significant pre-operative differences with regard to earlier BAS and less mechanical ventilation. Ongoing study is required to elucidate whether prenatal diagnosis confers long-term benefit. PMID:25484180

  17. Socioeconomic factors and mortality in emergency general surgery: trends over a 20-year period.

    Science.gov (United States)

    Armenia, Sarah J; Pentakota, Sri Ram; Merchant, Aziz M

    2017-05-15

    Socioeconomic factors such as race, insurance, and income quartiles have been identified as independent risk factors in emergency general surgery (EGS), but this impact has not been studied over time. We sought to identify trends in disparities in EGS-related operative mortality over a 20-y period. The National Inpatient Sample was used to identify patient encounters coded for EGS in 1993, 2003, and 2013. Logistic regression models were used to examine the adjusted relationship between race, primary payer status, and median income quartiles and in-hospital mortality after adjusting for patients' age, gender, Elixhauser comorbidity score, and hospital region, size, and location-cum-teaching status. We identified 391,040 patient encounters. In 1993, Black race was associated with higher odds of in-hospital mortality (odds ratio [95% confidence interval]: 1.35 [1.20-1.53]) than White race, although this difference dissipated in subsequent years. Medicare, Medicaid, and underinsured patients had higher odds of mortality than those with private insurance for the entire 20-y period; only the disparity in the underinsured decreased over time (1993, 1.63 [1.35-1.98]; 2013, 1.41 [1.20-1.67]). In 2003 (1.23 [1.10-1.38]) and 2013 (1.23 [1.11-1.37]), patients from the lowest income quartile were more likely to die after EGS than patients from the highest income quartile. Socioeconomic disparities in EGS-related operative mortality followed inconsistent trends. Over time, while gaps in in-hospital mortality among Blacks and Whites have narrowed, disparities among patients belonging to the lowest income quartile have worsened. Medicare and Medicaid beneficiaries continued to experience higher odds of in-hospital mortality relative to those with private insurance. Copyright © 2017 Elsevier Inc. All rights reserved.
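
    The odds ratios with 95% confidence intervals quoted above come from adjusted logistic regression; the unadjusted analogue of such an estimate can be computed directly from a 2x2 table with a Wald interval. The sketch below shows that calculation with invented counts, purely to illustrate the statistic, not to reproduce any figure from the study.

```python
# Hedged sketch: an odds ratio with a Wald 95% CI from a 2x2 mortality table.
# The counts used in the example are invented for illustration.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a, b = deaths/survivors in the exposed group; c, d = in the reference group."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Illustrative counts: 135 of 2,000 exposed patients died vs. 100 of 2,000 referents.
or_, lo, hi = odds_ratio_ci(135, 1865, 100, 1900)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

    The study's reported estimates additionally adjust for age, gender, comorbidity score, and hospital characteristics, which requires a fitted multivariable model rather than this single-table calculation.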

  18. Quality assurance for Chinese herbal formulae: standardization of IBS-20, a 20-herb preparation

    Directory of Open Access Journals (Sweden)

    Bensoussan Alan

    2010-02-01

    Full Text Available Abstract Background The employment of well characterized test samples prepared from authenticated, high quality medicinal plant materials is key to reproducible herbal research. The present study aims to demonstrate a quality assurance program covering the acquisition, botanical validation, chemical standardization and good manufacturing practices (GMP production of IBS-20, a 20-herb Chinese herbal formula under study as a potential agent for the treatment of irritable bowel syndrome. Methods Purity and contaminant tests for the presence of toxic metals, pesticide residues, mycotoxins and microorganisms were performed. Qualitative chemical fingerprint analysis and quantitation of marker compounds of the herbs, as well as that of the IBS-20 formula was carried out with high-performance liquid chromatography (HPLC. Extraction and manufacture of the 20-herb formula were carried out under GMP. Chemical standardization was performed with liquid chromatography-mass spectrometry (LC-MS analysis. Stability of the formula was monitored with HPLC in real time. Results Quality component herbs, purchased from a GMP supplier were botanically and chemically authenticated and quantitative HPLC profiles (fingerprints of each component herb and of the composite formula were established. An aqueous extract of the mixture of the 20 herbs was prepared and formulated into IBS-20, which was chemically standardized by LC-MS, with 20 chemical compounds serving as reference markers. The stability of the formula was monitored and shown to be stable at room temperature. Conclusion A quality assurance program has been developed for the preparation of a standardized 20-herb formulation for use in the clinical studies for the treatment of irritable bowel syndrome (IBS. The procedures developed in the present study will serve as a protocol for other poly-herbal Chinese medicine studies.

  19. Quality assurance for Chinese herbal formulae: standardization of IBS-20, a 20-herb preparation.

    Science.gov (United States)

    Ip, Siu-Po; Zhao, Ming; Xian, Yanfang; Chen, Mengli; Zong, Yuying; Tjong, Yung-Wui; Tsai, Sam-Hip; Sung, Joseph J Y; Bensoussan, Alan; Berman, Brian; Fong, Harry H S; Che, Chun-Tao

    2010-02-22

    The employment of well characterized test samples prepared from authenticated, high quality medicinal plant materials is key to reproducible herbal research. The present study aims to demonstrate a quality assurance program covering the acquisition, botanical validation, chemical standardization and good manufacturing practices (GMP) production of IBS-20, a 20-herb Chinese herbal formula under study as a potential agent for the treatment of irritable bowel syndrome. Purity and contaminant tests for the presence of toxic metals, pesticide residues, mycotoxins and microorganisms were performed. Qualitative chemical fingerprint analysis and quantitation of marker compounds of the herbs, as well as that of the IBS-20 formula was carried out with high-performance liquid chromatography (HPLC). Extraction and manufacture of the 20-herb formula were carried out under GMP. Chemical standardization was performed with liquid chromatography-mass spectrometry (LC-MS) analysis. Stability of the formula was monitored with HPLC in real time. Quality component herbs, purchased from a GMP supplier were botanically and chemically authenticated and quantitative HPLC profiles (fingerprints) of each component herb and of the composite formula were established. An aqueous extract of the mixture of the 20 herbs was prepared and formulated into IBS-20, which was chemically standardized by LC-MS, with 20 chemical compounds serving as reference markers. The stability of the formula was monitored and shown to be stable at room temperature. A quality assurance program has been developed for the preparation of a standardized 20-herb formulation for use in the clinical studies for the treatment of irritable bowel syndrome (IBS). The procedures developed in the present study will serve as a protocol for other poly-herbal Chinese medicine studies.

  20. Treating a 20 mm Hg gradient alleviates myocardial hypertrophy in experimental aortic coarctation.

    Science.gov (United States)

    Wendell, David C; Friehs, Ingeborg; Samyn, Margaret M; Harmann, Leanne M; LaDisa, John F

    2017-10-01

    Children with coarctation of the aorta (CoA) can have a hyperdynamic and remodeled left ventricle (LV) from increased afterload. Literature from an experimental model suggests the putative 20 mm Hg blood pressure gradient (BPG) treatment guideline frequently implemented in CoA studies may permit irreversible vascular changes. LV remodeling from pressure overload has been studied, but data are limited following correction and using a clinically representative BPG. Rabbits underwent CoA at 10 weeks to induce a 20 mm Hg BPG using permanent or dissolvable suture thereby replicating untreated and corrected CoA, respectively. Cardiac function was evaluated at 32 weeks by magnetic resonance imaging using a spoiled cine GRE sequence (TR/TE/FA 8/2.9/20), 14 × 14-cm FOV, and 3-mm slice thickness. Images (20 frames/cycle) were acquired in 6-8 short axis views from the apex to the mitral valve annulus. LV volume, ejection fraction (EF), and mass were quantified. LV mass was elevated for CoA (5.2 ± 0.55 g) versus control (3.6 ± 0.16 g) and corrected (4.0 ± 0.44 g) rabbits, resulting in increased LV mass/volume ratio for CoA rabbits. A trend toward increased EF and stroke volume was observed but did not reach significance. Elevated EF by volumetric analysis in CoA rabbits was supported by concomitant increases in total aortic flow by phase-contrast magnetic resonance imaging. The indices quantified trended toward a persistent hyperdynamic LV despite correction, but differences were not statistically significant versus control rabbits. These findings suggest the current putative 20 mm Hg BPG for treatment may be reasonable from the LV's perspective. Copyright © 2017 Elsevier Inc. All rights reserved.
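
    The cine-MRI indices above (ejection fraction and the LV mass/volume ratio used as a remodeling index) reduce to simple ratios once end-diastolic volume, end-systolic volume, and LV mass are segmented. The sketch below shows those two definitions; the volumes are invented, rabbit-scale illustrations, and only the 5.2 g mass figure echoes the abstract.

```python
# Hedged sketch of the two derived LV indices named above. Volumes below are
# illustrative assumptions, not measurements from the study.

def ejection_fraction(edv_ml, esv_ml):
    """EF = stroke volume / end-diastolic volume."""
    return (edv_ml - esv_ml) / edv_ml

def mass_volume_ratio(lv_mass_g, edv_ml):
    """LV mass / end-diastolic volume: an index of concentric remodeling."""
    return lv_mass_g / edv_ml

ef = ejection_fraction(4.0, 1.4)      # illustrative rabbit-scale volumes (mL)
mv = mass_volume_ratio(5.2, 4.0)      # 5.2 g is the CoA-group mass from the abstract
print(f"EF = {ef:.0%}, mass/volume = {mv:.2f} g/mL")
```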

  1. Approximate analyses of inelastic effects in pipework

    International Nuclear Information System (INIS)

    Jobson, D.A.

    1983-01-01

    This presentation shows figures concerned with analyses of inelastic effects in pipework, as follows: comparison of experimental and calculated simplified-analysis results for free end rotation and for circumferential strain; interrupted stress relaxation; regenerated relaxation caused by reversed yield; buckling of straight pipe under combined bending and torsion; results of fatigue tests of pipe bends

  2. Roux: Phonetic data and phonological analyses ...

    African Journals Online (AJOL)

    This paper is basically concerned with the relationship between phonetic data and phonological analyses. It will be shown that phonological analyses based on unverified phonetic data tend to accommodate ad hoc, unmotivated, and even phonetically implausible phonological rules. On the other hand, it will be ...

  3. Diversity of primary care systems analysed.

    NARCIS (Netherlands)

    Kringos, D.; Boerma, W.; Bourgueil, Y.; Cartier, T.; Dedeu, T.; Hasvold, T.; Hutchinson, A.; Lember, M.; Oleszczyk, M.; Pavlick, D.R.

    2015-01-01

    This chapter analyses differences between countries and explains why countries differ regarding the structure and process of primary care. The components of primary care strength that are used in the analyses are health policy-making, workforce development, and the care process itself.

  4. Random error in cardiovascular meta-analyses

    DEFF Research Database (Denmark)

    Albalawi, Zaina; McAlister, Finlay A; Thorlund, Kristian

    2013-01-01

    BACKGROUND: Cochrane reviews are viewed as the gold standard in meta-analyses given their efforts to identify and limit systematic error which could cause spurious conclusions. The potential for random error to cause spurious conclusions in meta-analyses is less well appreciated. METHODS: We exam...

  5. Novel Algorithms for Astronomical Plate Analyses

    Indian Academy of Sciences (India)

    2016-01-27

    Powerful computers and dedicated software allow effective data mining and scientific analyses in astronomical plate archives. We give and discuss examples of newly developed algorithms for astronomical plate analyses, e.g., searches for optical transients, as well as for major spectral and brightness ...

  6. Involvement of Ubiquitin-Editing Protein A20 in Modulating Inflammation in Rat Cochlea Associated with Silver Nanoparticle-Induced CD68 Upregulation and TLR4 Activation

    Science.gov (United States)

    Feng, Hao; Pyykkö, Ilmari; Zou, Jing

    2016-05-01

    Silver nanoparticles (AgNPs) were shown to temporarily impair the biological barriers in the skin of the external ear canal, mucosa of the middle ear, and inner ear, causing partially reversible hearing loss after delivery into the middle ear. The current study aimed to elucidate the molecular mechanism, emphasizing the TLR signaling pathways in association with the potential recruitment of macrophages in the cochlea and the modulation of inflammation by ubiquitin-editing protein A20. Molecules potentially involved in these signaling pathways were thoroughly analysed using immunohistochemistry in the rat cochlea exposed to AgNPs at various concentrations through intratympanic injection. The results showed that 0.4 % AgNPs but not 0.02 % AgNPs upregulated the expressions of CD68, TLR4, MCP1, A20, and RNF11 in the strial basal cells, spiral ligament fibrocytes, and non-sensory supporting cells of Corti's organ. 0.4 % AgNPs had no effect on CD44, TLR2, MCP2, Rac1, myosin light chain, VCAM1, Erk1/2, JNK, p38, IL-1β, TNF-α, TNFR1, TNFR2, IL-10, or TGF-β. This study suggested that AgNPs might confer macrophage-like functions on the strial basal cells and spiral ligament fibrocytes and enhance the immune activities of non-sensory supporting cells of Corti's organ through the upregulation of CD68, which might be involved in TLR4 activation. A20 and RNF11 played roles in maintaining cochlear homeostasis via negative regulation of the expressions of inflammatory cytokines.

  7. Frequency and changes in trends of leading risk factors of coronary heart disease in women in the city of Novi Sad during a 20-year period

    Directory of Open Access Journals (Sweden)

    Rakić Dušica

    2012-01-01

    Full Text Available Background/Aim. From 1984 to 2004 the city of Novi Sad participated through its Health Center "Novi Sad" in the international Multinational MONItoring of Trends and Determinants in CArdiovascular Disease (MONICA) project, as one of the 38 research centers in 21 countries around the world. The aim of this study was to determine the frequency of and changes in trends of the leading risk factors of coronary heart disease (CHD) and to analyze the trend of coronary events in women in Novi Sad during a 20-year period. Methods. In 2004, the fourth survey within the MONICA project was conducted in the city of Novi Sad. The representative sample included 1,041 women between the ages of 25 and 74. The prevalence of risk factors for CHD such as smoking, high blood pressure, elevated blood cholesterol, elevated blood glucose and obesity was determined. Also, indicators of risk factors and rates of coronary events in women were compared with the results from the MONICA project obtained in the previous three screens, as well as with the results from other research centres. The χ2-test, linear trend and correlation coefficient were used in the statistical analysis of the results obtained. Results. It was observed that during the 20-year period covered by the study, the prevalence of the leading risk factors for the development of CHD in the surveyed women increased significantly and was in positive correlation with the values of the linear trend. Also, the increases in morbidity and mortality rates of coronary events were in positive correlation. A decrease was only recorded in the period from 1985-1989 (the implementation of the intervention programme). Conclusion. Upon analysing the increase in prevalence of the leading risk factors of CHD and the significant increase in the rates of coronary events, we can conclude that the health status of women in Novi Sad deteriorated during the 20-year period.

  8. The potential impact of a 20% tax on sugar-sweetened beverages on obesity in South African adults: a mathematical model.

    Directory of Open Access Journals (Sweden)

    Mercy Manyema

    Full Text Available BACKGROUND/OBJECTIVES: The prevalence of obesity in South Africa has risen sharply, as has the consumption of sugar-sweetened beverages (SSBs). Research shows that consumption of SSBs leads to weight gain in both adults and children, and reducing SSBs will significantly impact the prevalence of obesity and its related diseases. We estimated the effect of a 20% tax on SSBs on the prevalence of obesity among adults in South Africa. METHODS: A mathematical simulation model was constructed to estimate the effect of a 20% SSB tax on the prevalence of obesity. We used consumption data from the 2012 SA National Health and Nutrition Examination Survey and a previous meta-analysis of studies on own- and cross-price elasticities of SSBs to estimate the shift in daily energy consumption expected of increased prices of SSBs, and energy balance equations to estimate shifts in body mass index. The population distribution of BMI by age and sex was modelled by fitting measured data from the SA National Income Dynamics Survey 2012 to the lognormal distribution and shifting the mean values. Uncertainty was assessed with Monte Carlo simulations. RESULTS: A 20% tax is predicted to reduce energy intake by about 36 kJ per day (95% CI: 9-68 kJ). Obesity is projected to reduce by 3.8% (95% CI: 0.6%-7.1%) in men and 2.4% (95% CI: 0.4%-4.4%) in women. The number of obese adults would decrease by over 220 000 (95% CI: 24 197-411 759). CONCLUSIONS: Taxing SSBs could impact the burden of obesity in South Africa, particularly in young adults, as one component of a multi-faceted effort to prevent obesity.

  9. The potential impact of a 20% tax on sugar-sweetened beverages on obesity in South African adults: a mathematical model.

    Science.gov (United States)

    Manyema, Mercy; Veerman, Lennert J; Chola, Lumbwe; Tugendhaft, Aviva; Sartorius, Benn; Labadarios, Demetre; Hofman, Karen J

    2014-01-01

    The prevalence of obesity in South Africa has risen sharply, as has the consumption of sugar-sweetened beverages (SSBs). Research shows that consumption of SSBs leads to weight gain in both adults and children, and reducing SSBs will significantly impact the prevalence of obesity and its related diseases. We estimated the effect of a 20% tax on SSBs on the prevalence of obesity among adults in South Africa. A mathematical simulation model was constructed to estimate the effect of a 20% SSB tax on the prevalence of obesity. We used consumption data from the 2012 SA National Health and Nutrition Examination Survey and a previous meta-analysis of studies on own- and cross-price elasticities of SSBs to estimate the shift in daily energy consumption expected of increased prices of SSBs, and energy balance equations to estimate shifts in body mass index. The population distribution of BMI by age and sex was modelled by fitting measured data from the SA National Income Dynamics Survey 2012 to the lognormal distribution and shifting the mean values. Uncertainty was assessed with Monte Carlo simulations. A 20% tax is predicted to reduce energy intake by about 36 kJ per day (95% CI: 9-68 kJ). Obesity is projected to reduce by 3.8% (95% CI: 0.6%-7.1%) in men and 2.4% (95% CI: 0.4%-4.4%) in women. The number of obese adults would decrease by over 220 000 (95% CI: 24 197-411 759). Taxing SSBs could impact the burden of obesity in South Africa particularly in young adults, as one component of a multi-faceted effort to prevent obesity.
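
    The modelling chain described in this abstract (price elasticity → daily energy shift → BMI shift → obesity prevalence on a lognormal BMI distribution, with Monte Carlo uncertainty) can be sketched end to end in a few lines. Every parameter value below (elasticity, SSB energy intake, height, the 32 MJ/kg energy-balance rule, and the BMI distribution) is an illustrative assumption, not an input or result of the study.

```python
# Hedged sketch of a tax-to-obesity simulation chain. All parameters are
# illustrative assumptions; this is not the study's calibrated model.
import math
import random

random.seed(1)

def energy_cut_kj(price_rise, elasticity, ssb_kj_per_day):
    """Daily energy reduction (kJ) implied by an own-price elasticity."""
    return -elasticity * price_rise * ssb_kj_per_day

def bmi_shift(cut_kj_per_day, height_m, kj_per_kg=32000.0):
    """Crude steady-state rule: ~32 MJ of cumulative deficit per kg of weight lost."""
    return (cut_kj_per_day * 365.0 / kj_per_kg) / height_m ** 2

def obesity_prev(mu, sigma, threshold=30.0):
    """P(BMI > threshold) when BMI ~ lognormal(mu, sigma)."""
    z = (math.log(threshold) - mu) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2.0))

cut = energy_cut_kj(0.20, -1.2, 180.0)     # 20% price rise; illustrative inputs
mu, sigma = math.log(27.0), 0.18           # illustrative BMI distribution
mean_bmi = math.exp(mu + sigma ** 2 / 2.0)

def post_tax_prev(elasticity):
    d_bmi = bmi_shift(energy_cut_kj(0.20, elasticity, 180.0), 1.65)
    return obesity_prev(mu + math.log((mean_bmi - d_bmi) / mean_bmi), sigma)

before = obesity_prev(mu, sigma)
after = post_tax_prev(-1.2)
# Monte Carlo over elasticity uncertainty (illustrative normal prior).
draws = sorted(post_tax_prev(random.gauss(-1.2, 0.3)) for _ in range(5000))
print(f"energy cut {cut:.0f} kJ/day; obesity {before:.1%} -> {after:.1%}")
print(f"95% uncertainty interval: {draws[125]:.1%} - {draws[4874]:.1%}")
```

    The actual study draws elasticities from a published meta-analysis and fits the BMI distribution to survey data by age and sex; the structure above only mirrors the order of the steps.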

  10. Cost-Benefit Analyses of Transportation Investments

    DEFF Research Database (Denmark)

    Næss, Petter

    2006-01-01

    This paper discusses the practice of cost-benefit analyses of transportation infrastructure investment projects from the meta-theoretical perspective of critical realism. Such analyses are based on a number of untenable ontological assumptions about social value, human nature and the natural ... -to-pay investigations. Accepting the ontological and epistemological assumptions of cost-benefit analysis involves an implicit acceptance of the ethical and political values favoured by these assumptions. Cost-benefit analyses of transportation investment projects tend to neglect long-term environmental consequences ... and needs among population groups with a low ability to pay. Instead of cost-benefit analyses, impact analyses evaluating the likely effects of project alternatives against a wide range of societal goals are recommended, with quantification and economic valorisation only for impact categories where this can ...

  11. Vitamin E γ-Tocotrienol Inhibits Cytokine-Stimulated NF-κB Activation by Induction of Anti-Inflammatory A20 via Stress Adaptive Response Due to Modulation of Sphingolipids.

    Science.gov (United States)

    Wang, Yun; Park, Na-Young; Jang, Yumi; Ma, Averil; Jiang, Qing

    2015-07-01

    NF-κB plays a central role in pathogenesis of inflammation and cancer. Many phytochemicals, including γ-tocotrienol (γTE), a natural form of vitamin E, have been shown to inhibit NF-κB activation, but the underlying mechanism has not been identified. In this study, we show that γTE inhibited cytokine-triggered activation of NF-κB and its upstream regulator TGF-β-activated kinase-1 in murine RAW 264.7 macrophages and primary bone marrow-derived macrophages. In these cells, γTE induced upregulation of A20, an inhibitor of NF-κB. Knockout of A20 partially diminished γTE's anti-NF-κB effect, but γTE increased another NF-κB inhibitor, Cezanne, in A20(-/-) cells. In search of the reason for A20 upregulation, we found that γTE treatment increased phosphorylation of translation initiation factor 2, IκBα, and JNK, indicating induction of endoplasmic reticulum stress. Liquid chromatography-tandem mass spectrometry analyses revealed that γTE modulated sphingolipids, including enhancement of intracellular dihydroceramides, sphingoid bases in the de novo sphingolipid synthesis pathway. Chemical inhibition of de novo sphingolipid synthesis partially reversed γTE's induction of A20 and the anti-NF-κB effect. The importance of the dihydroceramide increase is further supported by the observation that C8-dihydroceramide mimicked γTE in upregulating A20, enhancing endoplasmic reticulum stress, and attenuating TNF-triggered NF-κB activation. Our study identifies a novel anti-NF-κB mechanism where A20 is induced by a stress-induced adaptive response as a result of modulation of sphingolipids, and it demonstrates an immunomodulatory role of dihydroceramides. Copyright © 2015 by The American Association of Immunologists, Inc.

  12. Secondary structural analyses of ITS1 in Paramecium.

    Science.gov (United States)

    Hoshina, Ryo

    2010-01-01

    The nuclear ribosomal RNA gene operon is interrupted by internal transcribed spacer (ITS) 1 and ITS2. Although the secondary structure of ITS2 has been widely investigated, less is known about ITS1 and its structure. In this study, the secondary structure of ITS1 sequences from Paramecium and other ciliates was predicted. Each Paramecium ITS1 forms an open loop with three helices, A through C. Helix B was highly conserved among Paramecium, and similar helices were found in other ciliates. A phylogenetic analysis using the ITS1 sequences showed high resolution, implying that ITS1 is a good tool for species-level analyses.
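
    Secondary-structure prediction of spacers like ITS1 is typically done with thermodynamic folding software (e.g. mfold or RNAfold); as a self-contained illustration of the underlying idea, the sketch below implements the classic Nussinov dynamic program, which maximizes the number of nested base pairs in an RNA sequence. This is a simplification of energy-based folding, and the toy sequence is not ITS1 data.

```python
# Hedged sketch: Nussinov base-pair maximization, a simplified stand-in for the
# thermodynamic folding used in studies like the one above.

PAIRS = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"), ("G", "U"), ("U", "G")}

def nussinov_pairs(seq, min_loop=3):
    """Maximum number of nested base pairs, with hairpin loops of >= min_loop bases."""
    n = len(seq)
    dp = [[0] * n for _ in range(n)]
    for span in range(min_loop + 1, n):        # widen the subsequence i..j
        for i in range(n - span):
            j = i + span
            best = dp[i + 1][j]                # case: base i left unpaired
            if (seq[i], seq[j]) in PAIRS:
                best = max(best, dp[i + 1][j - 1] + 1)   # case: i pairs with j
            for k in range(i + 1, j):          # case: bifurcation at k
                best = max(best, dp[i][k] + dp[k + 1][j])
            dp[i][j] = best
    return dp[0][n - 1] if n else 0

print(nussinov_pairs("GGGAAAUCCC"))  # a toy hairpin: three G-C stem pairs
```

    Real ITS1 predictions also score stacking energies and loop penalties, which is why dedicated folding tools rather than pair counting are used in practice.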

  13. Predictable Medea

    Directory of Open Access Journals (Sweden)

    Elisabetta Bertolino

    2010-01-01

    Full Text Available By focusing on the tragedy of the 'unpredictable' infanticide perpetrated by Medea, the paper speculates on the possibility of a non-violent ontological subjectivity for women victims of gendered violence and on whether it is possible to respond to violent actions in non-violent ways; it argues that Medea did not act in an unpredictable way, but rather through the very predictable subject of resentment and violence. 'Medea' represents the story of all of us who require justice as retribution against any wrong. The presupposition is that the empowered female subjectivity of women's rights contains the same desire to master others as the current masculine legal and philosophical subject. The subject of women's rights is grounded in the emotions of resentment and retribution and refuses the categories of the private by appropriating those of the righteous, masculine and public subject. The essay opposes the essentialised stereotypes of the feminine and the maternal with an ontological approach to people as singular, corporeal, vulnerable and dependent. There is therefore an emphasis on the excluded categories of the private. Forgiveness is considered as a category of the private and a possibility of responding to violence with newness. A violent act is seen in relation to the community of human beings rather than through an isolated setting, as in the case of the individual of human rights. In this context, forgiveness allows one to risk again and to be with others. The result is also a rethinking of feminist actions, feminine subjectivity and the maternal. Overall, the paper opens up the Arendtian categories of action and forgiveness and the Cavarerian unique and corporeal ontology of selfhood beyond gendered stereotypes.

  14. Ginkgo Biloba Extract and Long-Term Cognitive Decline: A 20-Year Follow-Up Population-Based Study

    Science.gov (United States)

    Amieva, Hélène; Meillon, Céline; Helmer, Catherine; Barberger-Gateau, Pascale; Dartigues, Jean François

    2013-01-01

    Background Numerous studies have looked at the potential benefits of various nootropic drugs such as Ginkgo biloba extract (EGb761®; Tanakan®) and piracetam (Nootropyl®) on age-related cognitive decline often leading to inconclusive results due to small sample sizes or insufficient follow-up duration. The present study assesses the association between intake of EGb761® and cognitive function of elderly adults over a 20-year period. Methods and Findings The data were gathered from the prospective community-based cohort study ‘Paquid’. Within the study sample of 3612 non-demented participants aged 65 and over at baseline, three groups were compared: 589 subjects reporting use of EGb761® at at least one of the ten assessment visits, 149 subjects reporting use of piracetam at one of the assessment visits and 2874 subjects not reporting use of either EGb761® or piracetam. Decline on MMSE, verbal fluency and visual memory over the 20-year follow-up was analysed with a multivariate mixed linear effects model. A significant difference in MMSE decline over the 20-year follow-up was observed in the EGb761® and piracetam treatment groups compared to the ‘neither treatment’ group. These effects were in opposite directions: the EGb761® group declined less rapidly than the ‘neither treatment’ group, whereas the piracetam group declined more rapidly (β = −0.6). Regarding verbal fluency and visual memory, no difference was observed between the EGb761® group and the ‘neither treatment’ group (respectively, β = 0.21 and β = −0.03), whereas the piracetam group declined more rapidly (respectively, β = −1.40 and β = −0.44). When comparing the EGb761® and piracetam groups directly, a different decline was observed for the three tests (respectively β = −1.07, β = −1.61 and β = −0.41). Conclusion Cognitive decline in a non-demented elderly population was lower in subjects who reported using EGb761® than in

  15. Ginkgo biloba extract and long-term cognitive decline: a 20-year follow-up population-based study.

    Directory of Open Access Journals (Sweden)

    Hélène Amieva

    Full Text Available Numerous studies have looked at the potential benefits of various nootropic drugs such as Ginkgo biloba extract (EGb761®; Tanakan®) and piracetam (Nootropyl®) on age-related cognitive decline, often leading to inconclusive results due to small sample sizes or insufficient follow-up duration. The present study assesses the association between intake of EGb761® and cognitive function of elderly adults over a 20-year period. The data were gathered from the prospective community-based cohort study 'Paquid'. Within the study sample of 3612 non-demented participants aged 65 and over at baseline, three groups were compared: 589 subjects reporting use of EGb761® at at least one of the ten assessment visits, 149 subjects reporting use of piracetam at one of the assessment visits and 2874 subjects not reporting use of either EGb761® or piracetam. Decline on MMSE, verbal fluency and visual memory over the 20-year follow-up was analysed with a multivariate mixed linear effects model. A significant difference in MMSE decline over the 20-year follow-up was observed in the EGb761® and piracetam treatment groups compared to the 'neither treatment' group. These effects were in opposite directions: the EGb761® group declined less rapidly than the 'neither treatment' group, whereas the piracetam group declined more rapidly (β = -0.6). Regarding verbal fluency and visual memory, no difference was observed between the EGb761® group and the 'neither treatment' group (respectively, β = 0.21 and β = -0.03), whereas the piracetam group declined more rapidly (respectively, β = -1.40 and β = -0.44). When comparing the EGb761® and piracetam groups directly, a different decline was observed for the three tests (respectively β = -1.07, β = -1.61 and β = -0.41). Cognitive decline in a non-demented elderly population was lower in subjects who reported using EGb761® than in those who did not. This effect may be a specific medication

  16. A 20-year experience of ocular herpes virus detection using immunofluorescence and polymerase chain reaction.

    Science.gov (United States)

    Satpathy, Gita; Behera, Himansu S; Sharma, Anjana; Mishra, Abhisek K; Mishra, Deepanshi; Sharma, Namrata; Tandon, Radhika; Agarwal, Tushar; Titiyal, Jeewan S

    2018-03-06

    To detect the presence of herpes virus in corneal scrapings/corneal grafts of suspected herpetic keratitis patients attending the outpatient department/casualty of the Dr Rajendra Prasad Centre for Ophthalmic Sciences, All India Institute of Medical Sciences, New Delhi for the past 20 years with immunofluorescence assay and to analyse the efficacy of polymerase chain reaction over immunofluorescence for routine laboratory diagnosis in some of the specimens. Corneal scrapings and corneal grafts were collected by the ophthalmologists from 1,926 suspected herpetic keratitis patients between 1996 and 2015, among whom 1,863 patients were processed with immunofluorescence assay and 302 patients were processed with polymerase chain reaction assay for the detection of herpes virus. Of the 302 patients, clinical specimens from 239 patients were analysed by both polymerase chain reaction and immunofluorescence assay. Of the 1,863 suspected herpetic keratitis patients diagnosed with immunofluorescence assay, 277 (14.9 per cent) were found positive for herpes simplex virus 1 antigen. Similarly, of the 302 suspected herpetic keratitis patients diagnosed by polymerase chain reaction, 70 (23.2 per cent) were found positive for herpes simplex virus DNA. Of the 239 patients diagnosed by both polymerase chain reaction and immunofluorescence assay, 35 (14.6 per cent) were found positive with immunofluorescence assay, 59 (24.7 per cent) were found positive with polymerase chain reaction, 30 (12.5 per cent) were positive with both immunofluorescence and polymerase chain reaction assay. Efficacy and accuracy of the polymerase chain reaction assay was greater compared to the immunofluorescence assay for detection of herpes virus in corneal scrapings/corneal grafts of suspected herpetic keratitis patients. Although the immunofluorescence assay is a rapid test for the detection of herpes virus in suspected herpetic keratitis patients, a combination of polymerase chain reaction with
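As a sketch of the assay-comparison arithmetic behind these figures (the counts below are derived directly from the abstract; the variable names are our own), the 2×2 table for the 239 specimens tested by both methods can be reconstructed as follows:

```python
# Reconstruct the 2x2 immunofluorescence (IF) vs PCR table for the
# 239 specimens tested by both assays, from the counts in the abstract:
# 35 IF-positive, 59 PCR-positive, 30 positive by both.
n = 239
both_pos = 30
if_only = 35 - both_pos                        # IF+ / PCR-
pcr_only = 59 - both_pos                       # PCR+ / IF-
both_neg = n - both_pos - if_only - pcr_only   # negative by both

# Overall percent agreement between the two assays:
agreement = (both_pos + both_neg) / n
print(f"IF-only: {if_only}, PCR-only: {pcr_only}, agreement: {agreement:.1%}")
```

The imbalance between PCR-only and IF-only discordant specimens (29 versus 5) is what drives the abstract's conclusion that PCR detects more cases than immunofluorescence alone.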

  17. Juvenile dermatomyositis: a 20-year retrospective analysis of treatment and clinical outcomes.

    Science.gov (United States)

    Sun, Chi; Lee, Jyh-Hong; Yang, Yao-Hsu; Yu, Hsin-Hui; Wang, Li-Chieh; Lin, Yu-Tsan; Chiang, Bor-Luen

    2015-02-01

    Juvenile dermatomyositis is a rare childhood multisystem autoimmune disease involving primarily the skin and muscles, and it may lead to long-term disability. This study aimed to describe the clinical course of juvenile dermatomyositis and determine if any early clinical or laboratory features could predict outcome. Medical charts of patients aged ≤18 years and diagnosed with juvenile dermatomyositis (according to the criteria of Bohan and Peter) at the Pediatric Department, National Taiwan University Hospital, between 1989 and 2009 were reviewed. The endpoints for disease assessment were complete clinical response and complete clinical remission. Cox's proportional hazards model was fitted to identify important predictors of complete clinical remission. A total of 39 patients with juvenile dermatomyositis were reviewed. Two-thirds were females, and the mean age at disease onset was 81.97 ± 46.63 months. The most common initial presentations were Gottron's papule (82.1%) and muscle weakness (82.1%). After excluding one patient with an incomplete record, the remaining 31 patients who had muscle weakness were analyzed; among them, 22 (70.97%) achieved complete clinical response, but only six (19.4%) achieved complete clinical remission. Multivariate analysis showed that female sex, negative Gowers' sign at disease onset, and positive photosensitivity at disease onset were favorable factors to achieve complete clinical remission. Moreover, covariate-adjusted survival curves were drawn for making predictions of complete clinical remission. Only 13 (33.33%) patients were symptom free at the end of follow up, whereas the other 26 suffered from different kinds of complications. None of them developed malignancy, but two (5.13%) patients died during the follow-up period. Factors such as male sex and Gowers' sign were unlikely to favor the achievement of complete clinical remission in juvenile dermatomyositis. Certain complications cannot be avoided, and thus more

  18. Publication of noninferiority clinical trials: changes over a 20-year interval.

    Science.gov (United States)

    Suda, Katie J; Hurley, Anne M; McKibbin, Trevor; Motl Moroney, Susannah E

    2011-09-01

    The primary objective was to evaluate the change in publication rate of noninferiority trials over a 20-year interval (1989-2009). Secondary objectives were to analyze the frequency of noninferiority trials by therapeutic category, the frequency of noninferiority trial publication by journal, the impact factors of the publishing journals, any potential special advantages of the study drug over the control, the funding sources of the trials, pharmaceutical industry affiliation of the authors, and the use of ghostwriters in the creation of manuscripts. Retrospective literature review of 583 articles. PubMed (January 1989-December 2009) and EMBASE (first quarter 1989-fourth quarter 2009) databases. A total of 583 articles of the results of randomized controlled clinical trials with a noninferiority study design that evaluated drug therapies, published in English, between 1989 and 2009, were included in the analysis. A consistent increase was noted in their yearly publication rates, with no trials published in 1989 versus 133 in 2009. One hundred twenty-six articles (21.6%) were in the therapeutic category of infectious diseases, followed by 78 (13.4%) in cardiology. Among the journals identified, The New England Journal of Medicine had the highest publication rate of trials with a noninferiority design, with 29 (5.0%) of the identified trials published in this journal. The median impact factor of the journals publishing noninferiority trials was 4.807 (interquartile range 3.064-7.5). The most common advantage of the study drug over the control was reduced duration of treatment or reduced pill burden (80 studies [22.9%]). A total of 425 trials (72.9%) listed the pharmaceutical industry as the only funding source. Among 369 trials with authors employed by the pharmaceutical industry, 101 (17.3%) disclosed an acknowledgment to an individual, other than those listed as authors, who contributed to writing the manuscript and who was affiliated with a medical information

  19. Middle School Injuries: A 20-Year (1988–2008) Multisport Evaluation

    Science.gov (United States)

    Beachy, Glenn; Rauh, Mitchell

    2014-01-01

    Context: Data on the incidence of injury in middle school sports are limited. Objective: To describe overall, practice, and game injury rate patterns in 29 middle school sports. Design: Descriptive epidemiology study. Setting: Injury data collected over a 20-year period (1988–2008) at a single school. Patients or Other Participants: Boy (n = 8078) and girl (n = 5960) athletes participating in 14 and 15 middle school sports, respectively. Main Outcome Measure(s): Injury status and athlete-exposures (AEs) were collected by certified athletic trainers. Incidence rates per 1000 AEs (injuries/AEs) were calculated for overall incidence, practices and games, injury location, injury type, and injury severity (time lost from participation). Rate ratios (RRs) and 95% confidence intervals (CIs) were used to compare injury rates for sex-matched sports. Results: Football had the highest injury rate for all injuries (16.03/1000 AEs) and for time-loss injuries (8.486/1000 AEs). In matched middle school sports, girls exhibited a higher injury rate for all injuries (7.686/1000 AEs, RR = 1.15, 95% CI = 1.1, 1.2) and time-loss injuries (2.944/1000 AEs, RR = 1.09, 95% CI = 1.0, 1.2) than boys (all injuries: 6.684/1000 AEs, time-loss injuries: 2.702/1000 AEs). Girls had a higher injury rate during practices (3.30/1000 AEs) than games (1.67/1000 AEs, RR = 1.97, 95% CI = 1.7, 2.4) for all sports. Only gymnastics (RR = 0.96, 95% CI = 0.3, 3.8) had a higher game injury rate for girls. Practice and game injury rates were nearly identical for boys in all sports (RR = 0.99, 95% CI = 0.9, 1.1). Only football (RR = 0.49, 95% CI = 0.4, 0.6) and boys' wrestling (RR = 0.50, 95% CI = 0.3, 0.8) reported higher game injury rates. Tendinitis injuries accounted for 19.1% of all middle school injuries. Conclusions: The risk for sport-related injury at the middle school level was greater during practices than games and greater for girls than boys in sex-matched sports. Conditioning programs may be

  20. Middle school injuries: a 20-year (1988-2008) multisport evaluation.

    Science.gov (United States)

    Beachy, Glenn; Rauh, Mitchell

    2014-01-01

    Data on the incidence of injury in middle school sports are limited. To describe overall, practice, and game injury rate patterns in 29 middle school sports. Descriptive epidemiology study. Injury data collected over a 20-year period (1988-2008) at a single school. Boy (n = 8078) and girl (n = 5960) athletes participating in 14 and 15 middle school sports, respectively. Injury status and athlete-exposures (AEs) were collected by certified athletic trainers. Incidence rates per 1000 AEs (injuries/AEs) were calculated for overall incidence, practices and games, injury location, injury type, and injury severity (time lost from participation). Rate ratios (RRs) and 95% confidence intervals (CIs) were used to compare injury rates for sex-matched sports. Football had the highest injury rate for all injuries (16.03/1000 AEs) and for time-loss injuries (8.486/1000 AEs). In matched middle school sports, girls exhibited a higher injury rate for all injuries (7.686/1000 AEs, RR = 1.15, 95% CI = 1.1, 1.2) and time-loss injuries (2.944/1000 AEs, RR = 1.09, 95% CI = 1.0, 1.2) than boys (all injuries: 6.684/1000 AEs, time-loss injuries: 2.702/1000 AEs). Girls had a higher injury rate during practices (3.30/1000 AEs) than games (1.67/1000 AEs, RR = 1.97, 95% CI = 1.7, 2.4) for all sports. Only gymnastics (RR = 0.96, 95% CI = 0.3, 3.8) had a higher game injury rate for girls. Practice and game injury rates were nearly identical for boys in all sports (RR = 0.99, 95% CI = 0.9, 1.1). Only football (RR = 0.49, 95% CI = 0.4, 0.6) and boys' wrestling (RR = 0.50, 95% CI = 0.3, 0.8) reported higher game injury rates. Tendinitis injuries accounted for 19.1% of all middle school injuries. The risk for sport-related injury at the middle school level was greater during practices than games and greater for girls than boys in sex-matched sports. Conditioning programs may be needed to address the high rate of tendinitis injuries.
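The incidence-rate and rate-ratio arithmetic used throughout this record can be sketched as below. This is a minimal illustration with hypothetical injury and exposure counts chosen to reproduce the reported girls-vs-boys rates (the study's raw counts are not given in the abstract); the confidence interval uses the standard log-normal approximation.

```python
import math

def injury_rate(injuries, exposures):
    """Incidence rate per 1000 athlete-exposures (AEs)."""
    return 1000.0 * injuries / exposures

def rate_ratio_ci(inj_a, ae_a, inj_b, ae_b, z=1.96):
    """Rate ratio of group A vs group B with a 95% CI
    (log-normal approximation; SE = sqrt(1/inj_a + 1/inj_b))."""
    rr = (inj_a / ae_a) / (inj_b / ae_b)
    se = math.sqrt(1.0 / inj_a + 1.0 / inj_b)
    return rr, (rr * math.exp(-z * se), rr * math.exp(z * se))

# Hypothetical counts chosen to reproduce the reported per-1000-AE rates:
girls_inj, girls_ae = 769, 100_000   # ~7.69 per 1000 AEs
boys_inj, boys_ae = 668, 100_000     # ~6.68 per 1000 AEs
rr, (lo, hi) = rate_ratio_ci(girls_inj, girls_ae, boys_inj, boys_ae)
print(f"RR = {rr:.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")
```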

  1. A20 is critical for the induction of Pam3CSK4-tolerance in monocytic THP-1 cells.

    Directory of Open Access Journals (Sweden)

    Jinyue Hu

    Full Text Available A20 functions to terminate Toll-like receptor (TLR)-induced immune responses, and plays important roles in the induction of lipopolysaccharide (LPS) tolerance. However, the molecular mechanism of Pam3CSK4-tolerance is uncertain. Here we report that the TLR1/2 ligand Pam3CSK4 induced tolerance in monocytic THP-1 cells. The pre-treatment of THP-1 cells with Pam3CSK4 down-regulated the induction of pro-inflammatory cytokines induced by Pam3CSK4 re-stimulation. Pam3CSK4 pre-treatment also down-regulated the signaling transduction of JNK, p38 and NF-κB induced by Pam3CSK4 re-stimulation. The activation of TLR1/2 induced a rapid and robust up-regulation of A20, suggesting that A20 may contribute to the induction of Pam3CSK4-tolerance. This hypothesis was supported by the observations that over-expression of A20 by gene transfer down-regulated Pam3CSK4-induced inflammatory responses, and that down-regulation of A20 by RNA interference inhibited the induction of tolerance. Moreover, LPS induced a significant up-regulation of A20, which contributed to the induction of cross-tolerance between LPS and Pam3CSK4. A20 was also induced by the treatment of THP-1 cells with TNF-α and IL-1β. Pre-treatment with TNF-α and IL-1β partly down-regulated Pam3CSK4-induced activation of MAPKs. Furthermore, pharmacologic inhibition of GSK3 signaling down-regulated Pam3CSK4-induced A20 expression, up-regulated Pam3CSK4-induced inflammatory responses, and partly reversed Pam3CSK4 pre-treatment-induced tolerance, suggesting that GSK3 is involved in TLR1/2-induced tolerance via up-regulation of A20 expression. Taken together, these results indicate that A20 is a critical regulator of TLR1/2-induced pro-inflammatory responses.

  2. Automatic incrementalization of Prolog based static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Kahl, Matthias; Saha, Diptikalyan

    2007-01-01

    Modern development environments integrate various static analyses into the build process. Analyses that analyze the whole project whenever the project changes are impractical in this context. We present an approach to automatic incrementalization of analyses that are specified as tabled logic...... programs and evaluated using incremental tabled evaluation, a technique for efficiently updating memo tables in response to changes in facts and rules. The approach has been implemented and integrated into the Eclipse IDE. Our measurements show that this technique is effective for automatically...
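The idea of keeping analysis results in memo tables and refreshing them when base facts change can be illustrated with a deliberately coarse sketch (real incremental tabled evaluation, as in the Prolog systems this work builds on, repairs individual answers in the memo table rather than discarding it wholesale):

```python
# Toy sketch of tabling for a derived relation (graph reachability):
# answers are memoised per query, and the table is invalidated when
# base facts (edges) change. Incremental tabled evaluation would
# instead update only the affected answers.

class TabledReachability:
    def __init__(self, edges):
        self.edges = set(edges)   # base facts
        self.table = {}           # memo table: node -> set of reachable nodes

    def reachable(self, start):
        if start not in self.table:          # table miss: compute and memoise
            seen, stack = set(), [start]
            while stack:
                node = stack.pop()
                for a, b in self.edges:
                    if a == node and b not in seen:
                        seen.add(b)
                        stack.append(b)
            self.table[start] = seen
        return self.table[start]             # table hit: no recomputation

    def add_edge(self, a, b):
        self.edges.add((a, b))
        self.table.clear()                   # coarse invalidation

g = TabledReachability({("a", "b"), ("b", "c")})
print(sorted(g.reachable("a")))   # ['b', 'c']
g.add_edge("c", "d")
print(sorted(g.reachable("a")))   # ['b', 'c', 'd']
```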

  3. Neurological abnormalities predict disability

    DEFF Research Database (Denmark)

    Poggesi, Anna; Gouw, Alida; van der Flier, Wiesje

    2014-01-01

    To investigate the role of neurological abnormalities and magnetic resonance imaging (MRI) lesions in predicting global functional decline in a cohort of initially independent-living elderly subjects. The Leukoaraiosis And DISability (LADIS) Study, involving 11 European centres, was primarily aimed...... at evaluating age-related white matter changes (ARWMC) as an independent predictor of the transition to disability (according to Instrumental Activities of Daily Living scale) or death in independent elderly subjects that were followed up for 3 years. At baseline, a standardized neurological examination.......0 years, 45 % males), 327 (51.7 %) presented at the initial visit with ≥1 neurological abnormality and 242 (38 %) reached the main study outcome. Cox regression analyses, adjusting for MRI features and other determinants of functional decline, showed that the baseline presence of any neurological...

  4. Anthocyanin analyses of Vaccinium fruit dietary supplements

    Science.gov (United States)

    Vaccinium fruit ingredients within dietary supplements were identified by comparisons with anthocyanin analyses of known Vaccinium profiles (demonstration of anthocyanin fingerprinting). Available Vaccinium supplements were purchased and analyzed; their anthocyanin profiles (based on HPLC separation...

  5. Understanding Human Error Based on Automated Analyses

    Data.gov (United States)

    National Aeronautics and Space Administration — This is a report on a continuing study of automated analyses of experiential textual reports to gain insight into the causal factors of human errors in aviation...

  6. Analyse of Maintenance Cost in ST

    CERN Document Server

    Jenssen, B W

    2001-01-01

    An analysis has been carried out in ST of the total costs for the division. Although the target was maintenance costs in ST, the overall budget has been analysed, since there is a close relation between investment & consolidation and the required level of maintenance. The purpose of the analysis was to examine maintenance cost in ST as a ratio of total maintenance costs over the replacement value of the equipment, and to make some comparisons with other industries and laboratories. Families of equipment have been defined and their corresponding ratios calculated. This first approach gives us some "quantitative" measurements. This analysis should be combined with performance indicators (more "qualitative" measurements) that tell us how well we are performing. This will help us defend our budget and set better priorities, and will satisfy the requirements of our external auditors.

  7. 7 CFR 94.102 - Analyses available.

    Science.gov (United States)

    2010-01-01

    ... analyses for total ash, fat by acid hydrolysis, moisture, salt, protein, beta-carotene, catalase... glycol, SLS, and zeolex. There are also tests for starch, total sugars, sugar profile, whey, standard...

  8. Publication Trends in Acupuncture Research: A 20-Year Bibliometric Analysis Based on PubMed.

    Science.gov (United States)

    Ma, Yan; Dong, Ming; Zhou, Kehua; Mita, Carol; Liu, Jianping; Wayne, Peter M

    2016-01-01

    Acupuncture has become popular and widely practiced in many countries around the world. Despite the large amount of acupuncture-related literature that has been published, broader trends in the prevalence and scope of acupuncture research remain underexplored. The current study quantitatively analyzes trends in acupuncture research publications in the past 20 years. A bibliometric approach was used to search PubMed for all acupuncture-related research articles including clinical and animal studies. Inclusion criteria were articles published between 1995 and 2014 with sufficient information for bibliometric analyses. Rates and patterns of acupuncture publication within the 20 year observational period were estimated, and compared with broader publication rates in biomedicine. Identified eligible publications were further analyzed with respect to study type/design, clinical condition addressed, country of origin, and journal impact factor. A total of 13,320 acupuncture-related publications were identified using our search strategy and eligibility criteria. Regression analyses indicated an exponential growth in publications over the past two decades, with a mean annual growth rate of 10.7%. This compares to a mean annual growth rate of 4.5% in biomedicine. A striking trend was an observed increase in the proportion of randomized clinical trials (RCTs), from 7.4% in 1995 to 20.3% in 2014, exceeding the 4.5% proportional growth of RCTs in biomedicine. Over the 20 year period, pain was consistently the most common focus of acupuncture research (37.9% of publications). Other top rankings with respect to medical focus were arthritis, neoplasms/cancer, pregnancy or labor, mood disorders, stroke, nausea/vomiting, sleep, and paralysis/palsy. Acupuncture research was conducted in 60 countries, with the top 3 contributors being China (47.4%), United States (17.5%), and United Kingdom (8.2%). Retrieved articles were published mostly in complementary and alternative medicine (CAM
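The "mean annual growth rate" reported here is a compound rate. Under the assumption of a log-linear (exponential) fit, as the abstract's regression analyses describe, it can be recovered as follows; the yearly counts below are synthetic stand-ins, not the study's data:

```python
import math

def annual_growth_rate(years, counts):
    """Least-squares fit of log(count) = a + b*year; growth = e^b - 1."""
    n = len(years)
    xbar = sum(years) / n
    ybar = sum(math.log(c) for c in counts) / n
    b = (sum((x - xbar) * (math.log(c) - ybar) for x, c in zip(years, counts))
         / sum((x - xbar) ** 2 for x in years))
    return math.exp(b) - 1.0

# Synthetic yearly counts growing at 10.7%/year, the rate reported above:
years = list(range(1995, 2015))
counts = [150 * 1.107 ** (y - 1995) for y in years]
print(f"estimated annual growth: {annual_growth_rate(years, counts):.1%}")
```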

  9. Comparative sequence analyses of genome and transcriptome ...

    Indian Academy of Sciences (India)

    Keywords. Asian elephant; comparative genomics; gene prediction; transcriptome. Abstract. The Asian elephant Elephas maximus and the African elephant Loxodonta africana that diverged 5-7 million years ...

  10. Predictable earthquakes?

    Science.gov (United States)

    Martini, D.

    2002-12-01

    acceleration) and the global number of earthquakes for this period from the published literature, which gives us a broad picture of these dynamical geophysical phenomena. Methodology: Computing linear correlation coefficients gives us a chance to characterise the relations among the data series quantitatively, if we suppose a linear dependence as a first step. The correlation coefficients among the Earth's rotational acceleration, the Z-orbit acceleration (perpendicular to the ecliptic plane) and the global number of earthquakes were compared. The results clearly demonstrate a common feature of both the Earth's rotation and the Earth's Z-acceleration around the Sun, and also between the Earth's rotational acceleration and the earthquake number. This might mean a strong relation among these phenomena. The rather strong correlation (r = 0.75) and the 29-year period (Saturn's synodic period) were clearly shown in the computed cross-correlation function, which gives the dynamical characteristic of the correlation, between the Earth's orbital (Z-direction) and rotational acceleration. This basic 29-year period was also obvious in the earthquake number data sets, with clear common features in time. Conclusion: The core, which is involved in the secular variation of the Earth's magnetic field, is the only sufficiently mobile part of the Earth with sufficient mass to modify the rotation, which probably affects the global time distribution of earthquakes. This might mean that the secular variation of earthquakes is inseparable from changes in the Earth's magnetic field, i.e. that the interior processes of the Earth's core belong to the dynamical state of the solar system. If this idea is real, the global distribution of earthquakes in time is therefore predictable.
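The correlation machinery the abstract describes (Pearson r between two series, and r as a function of time lag) can be sketched in a few lines. The 29-year sinusoids below are synthetic stand-ins for the geophysical series, not the study's data:

```python
import math

def pearson_r(x, y):
    """Linear (Pearson) correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def cross_correlation(x, y, max_lag):
    """r between x[t] and y[t + lag] for each lag in [-max_lag, max_lag]."""
    out = {}
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            xs, ys = x[:len(x) - lag], y[lag:]
        else:
            xs, ys = x[-lag:], y[:len(y) + lag]
        out[lag] = pearson_r(xs, ys)
    return out

# Two synthetic series sharing a 29-year period, one delayed by 5 years:
x = [math.sin(2 * math.pi * i / 29) for i in range(120)]
y = [math.sin(2 * math.pi * (i - 5) / 29) for i in range(120)]
cc = cross_correlation(x, y, max_lag=10)
print(max(cc, key=cc.get))  # lag with the strongest correlation -> 5
```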

  11. Risk analyses of nuclear power plants

    International Nuclear Information System (INIS)

    Jehee, J.N.T.; Seebregts, A.J.

    1991-02-01

    Probabilistic risk analyses of nuclear power plants are carried out by systematically analyzing the possible consequences of a broad spectrum of causes of accidents. The risk can be expressed in the probabilities for melt down, radioactive releases, or harmful effects for the environment. Following risk policies for chemical installations as expressed in the mandatory nature of External Safety Reports (EVRs) or, e.g., the publication ''How to deal with risks'', probabilistic risk analyses are required for nuclear power plants

  12. A History of Rotorcraft Comprehensive Analyses

    Science.gov (United States)

    Johnson, Wayne

    2013-01-01

    A history of the development of rotorcraft comprehensive analyses is presented. Comprehensive analyses are digital computer programs that calculate the aeromechanical behavior of the rotor and aircraft, bringing together the most advanced models of the geometry, structure, dynamics, and aerodynamics available in rotary wing technology. The development of the major codes of the last five decades from industry, government, and universities is described. A number of common themes observed in this history are discussed.

  13. Thermal Analyses of Cross-Linked Polyethylene

    Directory of Open Access Journals (Sweden)

    Radek Polansky

    2007-01-01

    Full Text Available The paper summarizes results obtained from structural analysis measurements (Differential Scanning Calorimetry (DSC), Thermogravimetry (TG), Thermomechanical Analysis (TMA) and Fourier transform infrared spectroscopy (FT-IR)). Samples of cross-linked polyethylene cable insulation were tested via these analyses. The DSC and TG were carried out using a TA Instruments SDT Q600 simultaneous thermal analyzer coupled to a Nicolet 380 Fourier transform infrared spectrometer. Thermomechanical analysis was carried out with a TA Instruments TMA Q400EM apparatus.

  14. Acute abdomen in pregnancy requiring surgical management: a 20-case series.

    Science.gov (United States)

    Unal, Aysun; Sayharman, Sema Etiz; Ozel, Leyla; Unal, Ethem; Aka, Nurettin; Titiz, Izzet; Kose, Gultekin

    2011-11-01

    The obstetrician often has a difficult task in diagnosing and managing the acute abdomen in pregnancy. A reluctance to operate during pregnancy adds unnecessary delay, which may increase morbidity for both mother and fetus. In this study, we present our experience with pregnant patients with acute abdomen. Pregnant patients with acute abdomen requiring surgical exploration were enrolled from 2007 to 2010. Demographics, gestational age, symptoms, fetal loss, preterm delivery, imaging studies, operative results, postoperative complications and histopathologic evaluations were recorded. Ultrasound (US) and magnetic resonance (MR) imaging studies were evaluated. Data analyses were performed with Microsoft Excel, and statistical evaluations were done using Student's t-test. There were 20 patients with a mean age of 32 years. The rate of emergency surgery was significantly higher in the second trimester (pacute abdomen (30% and 15%, respectively). All patients tolerated surgery well, and postoperative complications included wound infection (10%), preterm labor (5%), and prolonged paralytic ileus (5%). One patient died from advanced gastric carcinoma, and the only fetal death occurred in this case. Prompt diagnosis and appropriate therapy are crucial in pregnant patients with acute abdomen. The use of US may be limited, and CT is not desirable due to fetal irradiation; MR has thus become increasingly popular in the evaluation of such patients. Adhesive small bowel obstruction should be kept in mind as an important etiology. Copyright © 2011. Published by Elsevier Ireland Ltd.

  15. Sex determination from the calcaneus in a 20th century Greek population using discriminant function analysis.

    Science.gov (United States)

    Peckmann, Tanya R; Orr, Kayla; Meek, Susan; Manolis, Sotiris K

    2015-12-01

    The skull and post-cranium have been used for the determination of sex for unknown human remains. However, in forensic cases where skeletal remains often exhibit postmortem damage and taphonomic changes the calcaneus may be used for the determination of sex as it is a preservationally favored bone. The goal of the present research was to derive discriminant function equations from the calcaneus for estimation of sex from a contemporary Greek population. Nine parameters were measured on 198 individuals (103 males and 95 females), ranging in age from 20 to 99 years old, from the University of Athens Human Skeletal Reference Collection. The statistical analyses showed that all variables were sexually dimorphic. Discriminant function score equations were generated for use in sex determination. The average accuracy of sex classification ranged from 70% to 90% for the univariate analysis, 82.9% to 87.5% for the direct method, and 86.2% for the stepwise method. Comparisons to other populations were made. Overall, the cross-validated accuracies ranged from 48.6% to 56.1% with males most often identified correctly and females most often misidentified. The calcaneus was shown to be useful for sex determination in the twentieth century Greek population. Copyright © 2015 The Chartered Society of Forensic Sciences. Published by Elsevier Ireland Ltd. All rights reserved.
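For a single measurement, the discriminant-function approach reduces to a sectioning point midway between the two group means. The sketch below uses illustrative means, not the study's published coefficients, which are not reproduced in the abstract:

```python
def discriminant_classify(value, male_mean, female_mean):
    """Univariate sectioning-point rule: assign the sex whose group mean
    is nearer. A stand-in for the study's discriminant functions."""
    cutoff = (male_mean + female_mean) / 2.0
    if male_mean > female_mean:
        return "male" if value > cutoff else "female"
    return "male" if value < cutoff else "female"

# Illustrative calcaneal maximum lengths in mm (hypothetical group means):
print(discriminant_classify(83.0, male_mean=85.0, female_mean=76.0))  # male
```

Classification accuracy is then estimated by applying the rule to held-out individuals, which is what the cross-validated accuracies quoted above measure.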

  16. TALC: a new deployable concept for a 20m far-infrared space telescope

    Science.gov (United States)

    Durand, Gilles; Sauvage, Marc; Bonnet, Aymeric; Rodriguez, Louis; Ronayette, Samuel; Chanial, Pierre; Scola, Loris; Révéret, Vincent; Aussel, Hervé; Carty, Michael; Durand, Matthis; Durand, Lancelot; Tremblin, Pascal; Pantin, Eric; Berthe, Michel; Martignac, Jérôme; Motte, Frédérique; Talvard, Michel; Minier, Vincent; Bultel, Pascal

    2014-08-01

    TALC, the Thin Aperture Light Collector, is a 20 m space observatory project exploring some unconventional optical solutions (between the single dish and the interferometer) allowing the resolving power of a classical 27 m telescope. With TALC, the principle is to remove the central part of the prime mirror dish, cut the remaining ring into 24 sectors and store them on top of one another. The aim of this far-infrared telescope is to explore the 600 μm to 100 μm region. With this approach we have shown that we can store a ring telescope of 20 m outer diameter and 3 m ring thickness inside the fairing of Ariane 5 or Ariane 6. The general structure is that of a bicycle wheel: the inner sides of the segments are in compression against each other and play the role of a rim. The segments are linked to each other using a pantograph scissor system that lets the segments extend from a pile of dishes to a parabolic ring while keeping high stiffness at all times during the deployment. The inner corners of the segments are linked to a central axis using spokes, as in a bicycle wheel. The secondary mirror and the instrument box are built as a solid unit fixed at the extremity of the main axis. The tensegrity analysis of this structure shows a very high stiffness-to-mass ratio, resulting in a 3 Hz eigenfrequency. The segments will consist of two composite skins and a honeycomb CFRP structure built by a replica process. Solid segments will be compared to deformable segments using controlled shear of the rear surface. Adjusting the length of the spokes and the relative position of the sides of neighbouring segments controls the phasing of the entire primary mirror. The telescope is cooled by natural radiation. It is protected from solar radiation by a large inflatable solar screen, loosely linked to the telescope. The orientation is performed by inertia wheels. This telescope carries a wide-field bolometer camera using a 0.3 K cryocooler as one of the main instruments. This
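The "27 m equivalent" resolving-power claim can be sanity-checked against the standard diffraction limit θ ≈ 1.22 λ/D. This back-of-the-envelope check is ours, not from the paper:

```python
import math

def resolution_arcsec(wavelength_m, diameter_m):
    """Diffraction-limited angular resolution, 1.22*lambda/D, in arcsec."""
    return math.degrees(1.22 * wavelength_m / diameter_m) * 3600

# A 27 m aperture at a 100 micron observing wavelength:
print(f"{resolution_arcsec(100e-6, 27.0):.2f} arcsec")  # ~0.93 arcsec
```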

  17. PBFA Z: A 20-MA z-pinch driver for plasma radiation sources

    International Nuclear Information System (INIS)

    Spielman, R.B.; Breeze, S.F.; Deeney, C.

    1996-01-01

    Sandia National Laboratories is completing a major modification to the PBFA-II facility. PBFA Z will be a z-pinch driver capable of delivering up to 20 MA to a z-pinch load. It optimizes the electrical coupling to the implosion energy of z pinches at implosion velocities of ∼ 40 cm/μs. Design constraints resulted in an accelerator with a 0.12-Ω impedance, a 10.25-nH inductance, and a 120-ns pulse width. The design required new water transmission lines, insulator stack, and vacuum power feeds. Current is delivered to the z-pinch load through four self-magnetically-insulated vacuum transmission lines and a double post-hole convolute. A variety of design codes are used to model the power flow. These predict a peak current of 20 MA to a z-pinch load having a 2-cm length, a 2-cm radius, and a 15-mg mass, coupling 1.5 MJ into kinetic energy. We present 2-D Rad-Hydro calculations showing MJ x-ray outputs from tungsten wire-array z pinches

  18. PBFA Z: A 20-MA Z-pinch driver for plasma radiation sources

    International Nuclear Information System (INIS)

    Spielman, R.B.; Breeze, S.F.; Deeney, C.

    1996-01-01

    Sandia National Laboratories is completing a major modification to the PBFA-II facility. PBFA Z will be capable of delivering up to 20 MA to a z-pinch load. It optimizes the electrical coupling to the implosion energy of z pinches at implosion velocities of ∼ 40 cm/μs. Design constraints resulted in an accelerator with a 0.12-Ω impedance, a 10.25-nH inductance, and a 120-ns pulse width. The design required new water transmission lines, insulator stack, and vacuum power feeds. Current is delivered to the z-pinch load through four self-magnetically-insulated vacuum transmission lines and a double post-hole convolute. A variety of design codes are used to model the power flow. These predict a peak current of 20 MA to a z-pinch load having a 2-cm length, a 2-cm radius, and a 15-mg mass, coupling 1.5 MJ into kinetic energy. Calculations are presented showing MJ x-ray outputs from tungsten wire-array z pinches. (author). 4 figs., 14 refs
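The quoted coupling of 1.5 MJ into kinetic energy is roughly consistent with ½mv² for the stated load parameters; an order-of-magnitude check (ours, not from the paper):

```python
# KE = 1/2 m v^2 for a 15-mg load imploding at ~40 cm/us:
m = 15e-6           # load mass in kg (15 mg)
v = 40e-2 / 1e-6    # implosion velocity in m/s (40 cm per microsecond)
ke_mj = 0.5 * m * v ** 2 / 1e6
print(f"kinetic energy ≈ {ke_mj:.1f} MJ")  # ~1.2 MJ, near the quoted 1.5 MJ
```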

  19. Nuclear criticality predictability

    International Nuclear Information System (INIS)

    Briggs, J.B.

    1999-01-01

    As a result of extensive effort, a large portion of the tedious and redundant research and processing of critical experiment data has been eliminated. The necessary step in criticality safety analyses of validating computer codes with benchmark critical data is greatly streamlined, and valuable criticality safety experimental data are preserved. Criticality safety personnel in 31 different countries are now using the 'International Handbook of Evaluated Criticality Safety Benchmark Experiments'. Much has been accomplished by the work of the ICSBEP. However, evaluation and documentation represent only one element of a successful Nuclear Criticality Safety Predictability Program, and this element exists as a separate entity only because the work was not completed in conjunction with the experimentation process. I believe, however, that the work of the ICSBEP has also served to unify the other elements of nuclear criticality predictability. All elements are interrelated, but for a time it seemed that communication between these elements was not adequate. The ICSBEP has highlighted gaps in data, has retrieved lost data, has helped to identify errors in cross section processing codes, and has helped bring the international criticality safety community together in a common cause as true friends and colleagues. It has been a privilege to associate with those who work so diligently to make the project a success. (J.P.N.)

  20. Finite element analyses of tool stresses in metal cutting processes

    Energy Technology Data Exchange (ETDEWEB)

    Kistler, B.L. [Sandia National Labs., Livermore, CA (United States)

    1997-01-01

    In this report, we analytically predict and examine stresses in tool tips used in high speed orthogonal machining operations. Specifically, one analysis was compared to an existing experimental measurement of stresses in a sapphire tool tip cutting 1020 steel at slow speeds. In addition, two analyses were done of a carbide tool tip in a machining process at higher cutting speeds, in order to compare to experimental results produced as part of this study. The metal being cut was simulated using a Sandia developed damage plasticity material model, which allowed the cutting to occur analytically without prespecifying the line of cutting/failure. The latter analyses incorporated temperature effects on the tool tip. Calculated tool forces and peak stresses matched experimental data to within 20%. Stress contours generally agreed between analysis and experiment. This work could be extended to investigate/predict failures in the tool tip, which would be of great interest to machining shops in understanding how to optimize cost/retooling time.

  1. Specific recognition of linear polyubiquitin by A20 zinc finger 7 is involved in NF-κB regulation

    Science.gov (United States)

    Tokunaga, Fuminori; Nishimasu, Hiroshi; Ishitani, Ryuichiro; Goto, Eiji; Noguchi, Takuya; Mio, Kazuhiro; Kamei, Kiyoko; Ma, Averil; Iwai, Kazuhiro; Nureki, Osamu

    2012-01-01

    LUBAC (linear ubiquitin chain assembly complex) activates the canonical NF-κB pathway through linear polyubiquitination of NEMO (NF-κB essential modulator, also known as IKKγ) and RIP1. However, the regulatory mechanism of LUBAC-mediated NF-κB activation remains elusive. Here, we show that A20 suppresses LUBAC-mediated NF-κB activation by binding linear polyubiquitin via the C-terminal seventh zinc finger (ZF7), whereas CYLD suppresses it through deubiquitinase (DUB) activity. We determined the crystal structures of A20 ZF7 in complex with linear diubiquitin at 1.70–1.98 Å resolutions. The crystal structures revealed that A20 ZF7 simultaneously recognizes the Met1-linked proximal and distal ubiquitins, and that genetic mutations associated with B cell lymphomas map to the ubiquitin-binding sites. Our functional analysis indicated that the binding of A20 ZF7 to linear polyubiquitin contributes to the recruitment of A20 into a TNF receptor (TNFR) signalling complex containing LUBAC and IκB kinase (IKK), which results in NF-κB suppression. These findings provide new insight into the regulation of immune and inflammatory responses. PMID:23032187

  2. Hybrid approaches to physiologic modeling and prediction

    Science.gov (United States)

    Olengü, Nicholas O.; Reifman, Jaques

    2005-05-01

    This paper explores how the accuracy of a first-principles physiological model can be enhanced by integrating data-driven, "black-box" models with the original model to form a "hybrid" model system. Both linear (autoregressive) and nonlinear (neural network) data-driven techniques are separately combined with a first-principles model to predict human body core temperature. Rectal core temperature data from nine volunteers, subject to four 30/10-minute cycles of moderate exercise/rest regimen in both CONTROL and HUMID environmental conditions, are used to develop and test the approach. The results show significant improvements in prediction accuracy, with average improvements of up to 30% for prediction horizons of 20 minutes. The models developed from one subject's data are also used in the prediction of another subject's core temperature. Initial results for this approach for a 20-minute horizon show no significant improvement over the first-principles model by itself.
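A minimal sketch of the hybrid idea, assuming an AR(1) residual corrector (the paper's actual model orders and interfaces are not given here; the function names are ours): the first-principles forecast is corrected by an autoregressive model fitted to the model's past residuals (measured minus predicted core temperature).

```python
# Illustrative hybrid sketch: first-principles forecast + AR(1) residual model.

def fit_ar1(residuals):
    """Least-squares AR(1) coefficient a for r[t] ~ a * r[t-1]."""
    num = sum(residuals[t] * residuals[t - 1] for t in range(1, len(residuals)))
    den = sum(r * r for r in residuals[:-1])
    return num / den if den else 0.0

def hybrid_predict(fp_forecast, last_residual, a, horizon_steps):
    """First-principles forecast plus the AR-propagated residual correction."""
    return fp_forecast + (a ** horizon_steps) * last_residual
```

For example, if the model's residuals decay geometrically by a factor of 0.8 per step, `fit_ar1` recovers that factor exactly, and the correction applied at a 2-step horizon is scaled by 0.8².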

  3. Three distinct suppressors of RNA silencing encoded by a 20-kb viral RNA genome

    Science.gov (United States)

    Lu, Rui; Folimonov, Alexey; Shintaku, Michael; Li, Wan-Xiang; Falk, Bryce W.; Dawson, William O.; Ding, Shou-Wei

    2004-11-01

Viral infection in both plant and invertebrate hosts requires a virus-encoded function to block the RNA silencing antiviral defense. Here, we report the identification and characterization of three distinct suppressors of RNA silencing encoded by the 20-kb plus-strand RNA genome of citrus tristeza virus (CTV). When introduced by genetic crosses into plants carrying a silencing transgene, both p20 and p23, but not coat protein (CP), restored expression of the transgene. Although none of the CTV proteins prevented DNA methylation of the transgene, export of the silencing signal (capable of mediating intercellular silencing spread) was detected only from the F1 plants expressing p23 and not from the CP- or p20-expressing F1 plants, demonstrating suppression of intercellular silencing by CP and p20 but not by p23. Thus, intracellular and intercellular silencing are each targeted by a CTV protein, whereas the third, p20, inhibits silencing at both levels. Notably, CP suppresses intercellular silencing without interfering with intracellular silencing. This novel property of CP suggests a mechanism distinct from that of p20 and all other viral suppressors known to interfere with intercellular silencing, and that this class of viral suppressors may not be consistently identified by Agrobacterium coinfiltration, because coinfiltration also induces RNA silencing against the infiltrated suppressor transgene. Our analyses reveal a sophisticated viral counter-defense strategy that targets the silencing antiviral pathway at multiple steps and may be essential for protecting CTV, with its large RNA genome, from antiviral silencing in the perennial tree host. RNA interference | citrus tristeza virus | virus synergy | antiviral immunity

  4. Insights into a 20-ha multi-contaminated brownfield megasite: An environmental forensics approach

    Energy Technology Data Exchange (ETDEWEB)

    Gallego, J.R., E-mail: jgallego@uniovi.es; Rodríguez-Valdés, E.; Esquinas, N.; Fernández-Braña, A.; Afif, E.

    2016-09-01

    Here we addressed the contamination of soils in an abandoned brownfield located in an industrial area. Detailed soil and waste characterisation guided by historical information about the site revealed pyrite ashes (a residue derived from the roasting of pyrite ores) as the main environmental risk. In fact, the disposal of pyrite ashes and the mixing of these ashes with soils have affected a large area of the site, thereby causing heavy metal(loid) pollution (As and Pb levels reaching several thousands of ppm). A full characterisation of the pyrite ashes was thus performed. In this regard, we determined the bioavailable metal species present and their implications, grain-size distribution, mineralogy, and Pb isotopic signature in order to obtain an accurate conceptual model of the site. We also detected significant concentrations of pyrogenic benzo(a)pyrene and other PAHs, and studied the relation of these compounds with the pyrite ashes. In addition, we examined other waste and spills of minor importance within the study site. The information gathered offered an insight into pollution sources, unravelled evidence from the industrial processes that took place decades ago, and identified the co-occurrence of contaminants by means of multivariate statistics. The environmental forensics study carried out provided greater information than conventional analyses for risk assessment purposes and for the selection of clean-up strategies adapted to future land use. - Highlights: • Complex legacy of contamination afflicts 20-ha brownfield • As and Pb highest soil pollutants • Forensic study reveals main waste and spills. • Comprehensive study of pyrite ashes (multi-point source of pollution) • Co-occurrence of PAH also linked to pyrite ashes.

  5. The stability of parental bonding reports: a 20-year follow-up.

    Science.gov (United States)

    Murphy, Eleanor; Wickramaratne, Priya; Weissman, Myrna

    2010-09-01

The need to address the long-term reliability of retrospectively assessed parenting is underscored by the well-documented association between parenting behaviors and mood disorders in offspring. The rarity of longitudinal research with follow-up periods exceeding 10 years creates a need for additional studies. 134 offspring of depressed and non-depressed parents were assessed on Parental Bonding Instrument (PBI) scores, lifetime major depression (MDD), and current depressive symptoms at four waves across 20 years. PBI rank order and mean level stability, individual trajectories, and the impact of baseline age, gender, and lifetime MDD on stability, were obtained using multiple regression and linear mixed model analyses. Apart from paternal overprotection, which showed an average decrease of 1.6 points, the PBI domains showed no significant mean-level change over 20 years. However, there was significant individual variation for all PBI domains. Lifetime MDD and age did not significantly impact retest correlations; older age at baseline was associated with higher average paternal overprotection. Sons had lower retest correlations than daughters, but did not differ from daughters on mean level stability. Current depressive symptoms were associated with PBI scores, but did not impact the effect of lifetime MDD, gender or age on mean level stability and individual trajectories. Small sample sizes and measuring lifetime MDD as present or absent may have restricted our ability to detect effects of MDD history on PBI stability. The PBI is a robust measure of an important environmental risk for depressive disorders, and can be variably sensitive to sample characteristics, the passage of time and mood fluctuations. However, this sensitivity does not appear to significantly bias the long-term stability of this instrument. 2010 Elsevier B.V. All rights reserved.

  6. Insights into a 20-ha multi-contaminated brownfield megasite: An environmental forensics approach

    International Nuclear Information System (INIS)

    Gallego, J.R.; Rodríguez-Valdés, E.; Esquinas, N.; Fernández-Braña, A.; Afif, E.

    2016-01-01

    Here we addressed the contamination of soils in an abandoned brownfield located in an industrial area. Detailed soil and waste characterisation guided by historical information about the site revealed pyrite ashes (a residue derived from the roasting of pyrite ores) as the main environmental risk. In fact, the disposal of pyrite ashes and the mixing of these ashes with soils have affected a large area of the site, thereby causing heavy metal(loid) pollution (As and Pb levels reaching several thousands of ppm). A full characterisation of the pyrite ashes was thus performed. In this regard, we determined the bioavailable metal species present and their implications, grain-size distribution, mineralogy, and Pb isotopic signature in order to obtain an accurate conceptual model of the site. We also detected significant concentrations of pyrogenic benzo(a)pyrene and other PAHs, and studied the relation of these compounds with the pyrite ashes. In addition, we examined other waste and spills of minor importance within the study site. The information gathered offered an insight into pollution sources, unravelled evidence from the industrial processes that took place decades ago, and identified the co-occurrence of contaminants by means of multivariate statistics. The environmental forensics study carried out provided greater information than conventional analyses for risk assessment purposes and for the selection of clean-up strategies adapted to future land use. - Highlights: • Complex legacy of contamination afflicts 20-ha brownfield • As and Pb highest soil pollutants • Forensic study reveals main waste and spills. • Comprehensive study of pyrite ashes (multi-point source of pollution) • Co-occurrence of PAH also linked to pyrite ashes

  7. Tumors of the parapharyngeal space: the VU University Medical Center experience over a 20-year period.

    Science.gov (United States)

    van Hees, Thijs; van Weert, Stijn; Witte, Birgit; René Leemans, C

    2018-04-01

Tumors of the parapharyngeal space (PPS) are rare, accounting for 0.5-1.5% of all head and neck tumors. The anatomy of the PPS gives rise to a wide variety of tumors. This series of 99 PPS tumors provides an overview of their clinical course and management. This retrospective study included clinical data from patients treated for PPS tumors from 1991 to 2012 (warranting at least a 4-year follow-up) at the VU University Medical Center, Amsterdam, The Netherlands. Fifty percent were salivary gland tumors, 41% were neurogenic and 9% had a different origin; 18.2% of the PPS tumors were malignant. The most reported symptom at presentation was swelling of the neck and throat. In 14%, the PPS tumor was an incidental finding on imaging performed for other diagnostic reasons. Cytology showed an accuracy rate of 73.1% (19/26). The positive predictive value of a malignant cytology result was 86% (95% CI 42.1-99.6%). Surgery was performed in 55 patients (56%). The most frequently performed approach (56%) was the cervical-transparotid approach, followed by the cervical (25%), transmandibular (16%) and transoral (2%) approaches. Nine patients died of the disease: seven had a malignant salivary gland tumor, one had a pleomorphic adenoma at first diagnosis that degenerated into carcinoma ex pleomorphic adenoma, and one died of metastatic renal cell carcinoma. This large single-centre report on PPS tumors shows that careful diagnostic work-up and proper surgical planning are important in this specific and rare group of head and neck tumors. Surgery was the main treatment (56%) for parapharyngeal tumors. Management of parapharyngeal neurogenic neoplasms generally consists of active surveillance because of the peri-operative risk of permanent cranial nerve damage. The histopathological diagnoses were consistent with previous reports.

  8. MicroRNA-125b-5p suppresses Brucella abortus intracellular survival via control of A20 expression.

    Science.gov (United States)

    Liu, Ning; Wang, Lin; Sun, Changjiang; Yang, Li; Sun, Wanchun; Peng, Qisheng

    2016-07-29

Brucella may establish chronic infection by regulating the expression of miRNAs. However, the role of miRNAs in modulating the intracellular growth of Brucella remains unclear. In this study, we show that Brucella abortus infection leads to downregulation of miR-125b-5p in macrophages. We establish that miR-125b-5p targets A20, an inhibitor of NF-κB activation. Additionally, expression of miR-125b-5p decreases A20 expression in B. abortus-infected macrophages and leads to NF-κB activation and increased production of TNFα. Furthermore, B. abortus survival is attenuated in the presence of miR-125b-5p. These results uncover a role for miR-125b-5p in the regulation of B. abortus intracellular survival via the control of A20 expression.

  9. Has life satisfaction in Norway increased over a 20-year period? Exploring age and gender differences in a prospective longitudinal study, HUNT.

    Science.gov (United States)

    Lysberg, Frode; Gjerstad, PåL; Småstuen, Milada Cvancarova; Innstrand, Siw Tone; Høie, Magnhild Mjåvatn; Arild Espnes, Geir

    2018-02-01

    The aim of the present study was to investigate the change in overall life satisfaction for different age groups and between genders over a 20-year period. Data from 1984 to 2008 were extracted from a large prospective longitudinal health study of Nord-Trøndelag (HUNT), Norway. The study included more than 176,000 participants ranging from 20 to 70+ years of age. Data were analysed using logistic regression and adjusted for gender. The analyses revealed an increase in life satisfaction for all age groups from 1984-1986 (HUNT 1) to 1995-1997 (HUNT 2), with the highest levels being reached at 2006-2008 (HUNT 3). For all age groups, the data showed an increase of about 20% for the period from 1984-1986 (HUNT 1) to 1995-1997 (HUNT 2). From 1995-1997 (HUNT 2) to 2006-2008 (HUNT 3), the increase in overall life satisfaction was 16% for the younger age groups, and about 32% for the older age groups (40-69 and 70+ years). Women's scores for overall life satisfaction were higher for nearly all age groups when compared to men using HUNT 3 as a reference. These findings suggest an increase in life satisfaction for all age groups from 1984 to 2008, especially for the older age group (40-69 and 70+ years). The data indicate that women score higher on life satisfaction for most age groups as compared to men.

  10. The use of a 20-gauge valved cannula during pars plana phacofragmentation with a 23-gauge ultrasonic fragmatome.

    Science.gov (United States)

    Kim, Jee Taek; Eom, Youngsub; Ahn, Jaemoon; Kim, Seong-Woo; Huh, Kuhl

    2014-01-01

To evaluate the usefulness of a 20-gauge cannula to maintain a self-sealing sclerotomy wound after 23-gauge phacofragmentation. This retrospective study compared the suture rates after 23-gauge phacofragmentation when the 23-gauge cannula was temporarily replaced with a 20-gauge valved metal cannula versus when the 23-gauge fragmatome was inserted at the sclerotomy site without a cannula. Whereas a sclerotomy was sutured in all 31 eyes in the without-cannula group, only one eye of 14 in the cannula group required a sclerotomy suture. A self-sealing sclerotomy wound can thus be maintained with the 20-gauge valved metal cannula, but fragmatome tip fracture can occur during fragmentation. Copyright 2014, SLACK Incorporated.

  11. Pegasys: software for executing and integrating analyses of biological sequences.

    Science.gov (United States)

    Shah, Sohrab P; He, David Y M; Sawkins, Jessica N; Druce, Jeffrey C; Quon, Gerald; Lett, Drew; Zheng, Grace X Y; Xu, Tao; Ouellette, B F Francis

    2004-04-19

We present Pegasys--a flexible, modular and customizable software system that facilitates the execution of heterogeneous biological sequence analysis tools and the integration of their data. The Pegasys system includes numerous tools for pair-wise and multiple sequence alignment, ab initio gene prediction, RNA gene detection, masking repetitive sequences in genomic DNA, as well as filters for database formatting and processing raw output from various analysis tools. We introduce a novel data structure for creating workflows of sequence analyses and a unified data model to store its results. The software allows users to dynamically create analysis workflows at run-time by manipulating a graphical user interface. All non-serial dependent analyses are executed in parallel on a compute cluster for efficiency of data generation. The uniform data model and backend relational database management system of Pegasys allow results of the heterogeneous programs included in the workflow to be integrated and exported into General Feature Format for further analyses in GFF-dependent tools, or GAME XML for import into the Apollo genome editor. The modularity of the design allows new tools to be added to the system with little programmer overhead. The database application programming interface allows programmatic access to the data stored in the backend through SQL queries. The Pegasys system enables biologists and bioinformaticians to create and manage sequence analysis workflows. The software is released under the Open Source GNU General Public License. All source code and documentation are available for download at http://bioinformatics.ubc.ca/pegasys/.
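The parallel execution of all non-serial-dependent analyses can be sketched as a topological layering of the workflow DAG: every analysis whose prerequisites have completed runs in the same batch. The task names below are hypothetical, not actual Pegasys modules.

```python
# Sketch of DAG-based workflow scheduling (task names are illustrative).

def parallel_stages(deps):
    """deps: {task: set of prerequisite tasks}. Returns batches of tasks
    that can run concurrently, in dependency order."""
    done, stages = set(), []
    pending = dict(deps)
    while pending:
        ready = sorted(t for t, d in pending.items() if d <= done)
        if not ready:
            raise ValueError("cycle in workflow")
        stages.append(ready)
        done.update(ready)
        for t in ready:
            del pending[t]
    return stages
```

For a hypothetical workflow where repeat masking precedes both gene prediction and a BLAST search, whose outputs are then merged into GFF, the two middle analyses land in the same parallel batch.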

  12. Multivariate differential analyses of adolescents' experiences of aggression in families

    Directory of Open Access Journals (Sweden)

    Chris Myburgh

    2011-01-01

Aggression is part of South African society and has implications for the mental health of persons living in South Africa. If parents are aggressive, adolescents are also likely to be aggressive, and that will impact negatively on their mental health. In this article the nature and extent of adolescents' experiences of aggression and aggressive behaviour in the family are investigated. A deductive explorative quantitative approach was followed. Aggression is reasoned to be dependent on aspects such as self-concept, moral reasoning, communication, frustration tolerance and family relationships. To analyse the data from questionnaires of 101 families (95 adolescents, 95 mothers and 91 fathers), Cronbach Alpha, various consecutive first- and second-order factor analyses, correlations, multiple regression, MANOVA, ANOVA and Scheffé/Dunnett tests were used. It was found that aggression correlated negatively with the independent variables, and the correlations between adolescents and their parents were significant. Regression analyses indicated that different predictors predicted aggression. Furthermore, the differences in experienced levels of aggression between adolescents and their parents were small. Implications for education are given.

  13. Regional Scale Analyses of Climate Change Impacts on Agriculture

    Science.gov (United States)

    Wolfe, D. W.; Hayhoe, K.

    2006-12-01

New statistically downscaled climate modeling techniques provide an opportunity for improved regional analysis of climate change impacts on agriculture. Climate modeling outputs can often simultaneously meet the needs of those studying impacts on natural as well as managed ecosystems. Climate outputs can be used to drive existing forest or crop models, or livestock models (e.g., temperature-humidity index model predicting dairy milk production) for improved information on regional impact. High spatial resolution climate forecasts, combined with knowledge of seasonal temperatures or rainfall constraining species ranges, can be used to predict shifts in suitable habitat for invasive weeds, insects, and pathogens, as well as cash crops. Examples of climate thresholds affecting species range and species composition include: minimum winter temperature, duration of winter chilling (vernalization) hours (e.g., hours below 7.2 °C), frost-free period, and frequency of high temperature stress days in summer. High resolution climate outputs can also be used to drive existing integrated pest management models predicting crop insect and disease pressure. Collectively, these analyses can be used to test hypotheses or provide insight into the impact of future climate change scenarios on species range shifts and threat from invasives, shifts in crop production zones, and timing and regional variation in economic impacts.
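The climate thresholds named above translate directly into simple aggregations over downscaled temperature series. A minimal sketch follows; the 7.2 °C chilling threshold is from the text, while the 35 °C heat-stress level and the function names are our assumptions.

```python
# Minimal sketch of threshold metrics for downscaled temperature series.

def chilling_hours(hourly_temps_c, threshold_c=7.2):
    """Winter chilling: count of hours below the vernalization threshold."""
    return sum(1 for t in hourly_temps_c if t < threshold_c)

def heat_stress_days(daily_max_temps_c, threshold_c=35.0):
    """Days whose maximum exceeds an (assumed) high-temperature stress level."""
    return sum(1 for t in daily_max_temps_c if t > threshold_c)

def frost_free_period(daily_min_temps_c):
    """Longest run of consecutive days with minimum above 0 deg C."""
    best = run = 0
    for t in daily_min_temps_c:
        run = run + 1 if t > 0 else 0
        best = max(best, run)
    return best
```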

  14. The descriptive epidemiology of sitting. A 20-country comparison using the International Physical Activity Questionnaire (IPAQ).

    Science.gov (United States)

    Bauman, Adrian; Ainsworth, Barbara E; Sallis, James F; Hagströmer, Maria; Craig, Cora L; Bull, Fiona C; Pratt, Michael; Venugopal, Kamalesh; Chau, Josephine; Sjöström, Michael

    2011-08-01

Recent epidemiologic evidence points to health risks of prolonged sitting that are independent of physical activity, but few papers have reported the descriptive epidemiology of sitting in population studies with adults. This paper reports the prevalence of "high sitting time" and its correlates in an international study in 20 countries. Representative population samples from 20 countries were collected 2002-2004, and a question was asked on usual weekday hours spent sitting. This question was part of the International Prevalence Study, using the International Physical Activity Questionnaire (IPAQ). The sitting measure has acceptable reliability and validity. Daily sitting time was compared among countries, and by age group, gender, educational attainment, and physical activity. Data were available for 49,493 adults aged 18-65 years from 20 countries. The median reported sitting time was 300 minutes/day, with an interquartile range of 180-480 minutes. Countries reporting the lowest amount of sitting included Portugal, Brazil, and Colombia (medians ≤180 min/day), whereas adults in Taiwan, Norway, Hong Kong, Saudi Arabia, and Japan reported the highest sitting times (medians ≥360 min/day). In adjusted analyses, adults aged 40-65 years were significantly less likely to be in the highest quintile for sitting than adults aged 18-39 years (AOR=0.796), and those with postschool education had higher sitting times compared with those with high school or less education (OR=1.349). Physical activity showed an inverse relationship, with those reporting low activity on the IPAQ three times more likely to be in the highest-sitting quintile compared to those reporting high physical activity. Median sitting time varied widely across countries. Assessing sitting time is an important new area for preventive medicine, in addition to assessing physical activity and sedentary behaviors. Population surveys that monitor lifestyle behaviors should add measures of sitting time to

  15. A theoretical framework for analysing preschool teaching

    DEFF Research Database (Denmark)

    Chaiklin, Seth

    2014-01-01

This article introduces a theoretical framework for analysing preschool teaching as a historically-grounded societal practice. The aim is to present a unified framework that can be used to analyse and compare both historical and contemporary examples of preschool teaching practice within and across national traditions. The framework has two main components, an analysis of preschool teaching as a practice, formed in relation to societal needs, and an analysis of the categorical relations which necessarily must be addressed in preschool teaching activity. The framework is introduced and illustrated through the analysis of one of the earliest recorded examples of preschool education (initiated by J. F. Oberlin in northeastern France in 1767). The general idea of societal need is elaborated as a way of analysing practices, and a general analytic schema is presented for characterising preschool teaching.

  16. Advanced Toroidal Facility vacuum vessel stress analyses

    International Nuclear Information System (INIS)

    Hammonds, C.J.; Mayhall, J.A.

    1987-01-01

The complex geometry of the Advanced Toroidal Facility (ATF) vacuum vessel required special analysis techniques in investigating the structural behavior of the design. The response of a large-scale finite element model was found for transportation and operational loading. Several computer codes and systems, including the National Magnetic Fusion Energy Computer Center Cray machines, were implemented in accomplishing these analyses. The work combined complex methods that taxed the limits of both the codes and the computer systems involved. Using MSC/NASTRAN cyclic-symmetry solutions permitted using only 1/12 of the vessel geometry to mathematically analyze the entire vessel. This allowed the greater detail and accuracy demanded by the complex geometry of the vessel. Critical buckling-pressure analyses were performed with the same model. The development, results, and problems encountered in performing these analyses are described. 5 refs., 3 figs

  17. Advanced Toroidal Facility vacuum vessel stress analyses

    International Nuclear Information System (INIS)

    Hammonds, C.J.; Mayhall, J.A.

    1987-01-01

    The complex geometry of the Advanced Toroidal Facility (ATF) vacuum vessel required special analysis techniques in investigating the structural behavior of the design. The response of a large-scale finite element model was found for transportation and operational loading. Several computer codes and systems, including the National Magnetic Fusion Energy Computer Center Cray machines, were implemented in accomplishing these analyses. The work combined complex methods that taxed the limits of both the codes and the computer systems involved. Using MSC/NASTRAN cyclic-symmetry solutions permitted using only 1/12 of the vessel geometry to mathematically analyze the entire vessel. This allowed the greater detail and accuracy demanded by the complex geometry of the vessel. Critical buckling-pressure analyses were performed with the same model. The development, results, and problems encountered in performing these analyses are described

  18. Proteomic Analyses of the Vitreous Humour

    Directory of Open Access Journals (Sweden)

    Martina Angi

    2012-01-01

The human vitreous humour (VH) is a transparent, highly hydrated gel, which occupies the posterior segment of the eye between the lens and the retina. Physiological and pathological conditions of the retina are reflected in the protein composition of the VH, which can be sampled as part of routine surgical procedures. Historically, many studies have investigated levels of individual proteins in VH from healthy and diseased eyes. In the last decade, proteomics analyses have been performed to characterise the proteome of the human VH and explore networks of functionally related proteins, providing insight into the aetiology of diabetic retinopathy and proliferative vitreoretinopathy. Recent proteomic studies on the VH from animal models of autoimmune uveitis have identified new signalling pathways associated with autoimmune triggers and intravitreal inflammation. This paper aims to guide biological scientists through the different proteomic techniques that have been used to analyse the VH and presents future perspectives for the study of intravitreal inflammation using proteomic analyses.

  19. A 256-channel pulse-height analyser

    International Nuclear Information System (INIS)

    Berset, J.C.; Delavallade, G.; Lindsay, J.

    1975-01-01

    The design, construction, and testing of a small, low-cost 256-channel pulse-height analyser is briefly discussed. The analyser, intended for use in the setting up of experiments in high-energy physics, is fully compatible with the CERN/NIM nucleonic instrumentation. It has a digital display of channel and content as well as outputs for printing, plotting, and binary transfer. The logic circuitry is made with TTL integrated circuits and has a static random-access MOS memory. Logic and timing diagrams are given. Detailed specifications are also included. (Author)

  20. Power System Oscillatory Behaviors: Sources, Characteristics, & Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Follum, James D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Tuffner, Francis K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Dosiek, Luke A. [Union College, Schenectady, NY (United States); Pierre, John W. [Univ. of Wyoming, Laramie, WY (United States)

    2017-05-17

    This document is intended to provide a broad overview of the sources, characteristics, and analyses of natural and forced oscillatory behaviors in power systems. These aspects are necessarily linked. Oscillations appear in measurements with distinguishing characteristics derived from the oscillation’s source. These characteristics determine which analysis methods can be appropriately applied, and the results from these analyses can only be interpreted correctly with an understanding of the oscillation’s origin. To describe oscillations both at their source within a physical power system and within measurements, a perspective from the boundary between power system and signal processing theory has been adopted.

  1. Fundamental data analyses for measurement control

    International Nuclear Information System (INIS)

    Campbell, K.; Barlich, G.L.; Fazal, B.; Strittmatter, R.B.

    1987-02-01

A set of measurement control data analyses was selected for use by analysts responsible for maintaining the measurement quality of nuclear materials accounting instrumentation. The analyses consist of control charts for bias and precision and statistical tests used as analytic supplements to the control charts. They provide the desired detection sensitivity and yet can be interpreted locally, quickly, and easily. The control charts provide for visual inspection of data and enable an alert reviewer to spot problems, possibly before statistical tests detect them. The statistical tests are useful for automating the detection of departures from the controlled state or from the underlying assumptions (such as normality). 8 refs., 3 figs., 5 tabs
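A minimal sketch of the control-chart element, assuming conventional 3-sigma Shewhart limits (the report's actual limit rules and chart layouts are not specified here):

```python
# Sketch of a bias control chart: new measurements are flagged when they
# fall outside 3-sigma limits derived from in-control historical data.

def control_limits(history):
    """Lower and upper 3-sigma Shewhart limits from historical bias values."""
    n = len(history)
    mean = sum(history) / n
    sd = (sum((x - mean) ** 2 for x in history) / (n - 1)) ** 0.5
    return mean - 3 * sd, mean + 3 * sd

def out_of_control(history, new_points):
    """Return the new measurements that fall outside the control limits."""
    lo, hi = control_limits(history)
    return [x for x in new_points if x < lo or x > hi]
```

In practice such charts are supplemented, as the abstract notes, with statistical tests (e.g. for normality) that automate detection of departures the eye might miss.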

  2. Assessment of protein disorder region predictions in CASP10

    KAUST Repository

    Monastyrskyy, Bohdan

    2013-11-22

    The article presents the assessment of disorder region predictions submitted to CASP10. The evaluation is based on the three measures tested in previous CASPs: (i) balanced accuracy, (ii) the Matthews correlation coefficient for the binary predictions, and (iii) the area under the curve in the receiver operating characteristic (ROC) analysis of predictions using probability annotation. We also performed new analyses such as comparison of the submitted predictions with those obtained with a Naïve disorder prediction method and with predictions from the disorder prediction databases D2P2 and MobiDB. On average, the methods participating in CASP10 demonstrated slightly better performance than those in CASP9.
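
    The two binary-prediction measures named in the assessment can be sketched directly from the confusion matrix (the toy labels below are illustrative, not CASP data):

    ```python
    # Balanced accuracy and Matthews correlation coefficient (MCC)
    # for binary disorder/order predictions, computed from the confusion matrix.
    import math

    def confusion(y_true, y_pred):
        tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
        tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
        fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
        fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
        return tp, tn, fp, fn

    def balanced_accuracy(y_true, y_pred):
        tp, tn, fp, fn = confusion(y_true, y_pred)
        return 0.5 * (tp / (tp + fn) + tn / (tn + fp))

    def mcc(y_true, y_pred):
        tp, tn, fp, fn = confusion(y_true, y_pred)
        denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
        return (tp * tn - fp * fn) / denom if denom else 0.0

    y_true = [1, 1, 1, 0, 0, 0, 0, 1]
    y_pred = [1, 1, 0, 0, 0, 1, 0, 1]
    ```

    Balanced accuracy averages sensitivity and specificity, so it is not dominated by the majority (ordered) class; MCC additionally penalizes both error types symmetrically.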

  3. Prediction of heart disease using apache spark analysing decision trees and gradient boosting algorithm

    Science.gov (United States)

    Chugh, Saryu; Arivu Selvan, K.; Nadesh, RK

    2017-11-01

    Numerous harmful factors influence the functioning of the human body, such as hypertension, smoking, obesity and inappropriate medication use, which cause many different diseases such as diabetes, thyroid disorders, strokes and coronary disease. Poor environmental conditions also contribute to coronary disease. Analysing such data requires gathering it at scale, for which Apache Spark is well suited: it is fast because it uses in-memory processing, it runs on a distributed environment, and it splits the data into batches, giving a high throughput rate. The use of data mining techniques in the diagnosis of coronary disease has been examined exhaustively, showing acceptable levels of precision. Decision trees, neural networks and the gradient boosting algorithm are among the Apache Spark capabilities that help in analysing the information.
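
    The record uses Apache Spark's implementations; as a stand-in to illustrate the idea behind gradient boosting, the sketch below fits decision stumps to residuals on a toy single-feature risk dataset (feature values, labels and the learning rate are all hypothetical):

    ```python
    # Minimal gradient boosting with decision stumps (squared loss), 1-D feature.
    def fit_stump(x, residuals):
        """Best single threshold split minimising squared error."""
        best = None
        for thr in sorted(set(x)):
            left = [r for xi, r in zip(x, residuals) if xi <= thr]
            right = [r for xi, r in zip(x, residuals) if xi > thr]
            lm = sum(left) / len(left) if left else 0.0
            rm = sum(right) / len(right) if right else 0.0
            err = sum((r - lm) ** 2 for r in left) + sum((r - rm) ** 2 for r in right)
            if best is None or err < best[0]:
                best = (err, thr, lm, rm)
        return best[1:]

    def boost(x, y, rounds=20, lr=0.5):
        """Additively fit stumps to the residuals of the running prediction."""
        pred = [0.0] * len(x)
        stumps = []
        for _ in range(rounds):
            resid = [yi - pi for yi, pi in zip(y, pred)]
            thr, lm, rm = fit_stump(x, resid)
            stumps.append((thr, lm, rm))
            pred = [p + lr * (lm if xi <= thr else rm) for xi, p in zip(x, pred)]
        return stumps, pred

    # toy data: risk label rises with a single feature (e.g., blood pressure)
    x = [110, 120, 130, 150, 160, 170]
    y = [0, 0, 0, 1, 1, 1]
    stumps, pred = boost(x, y)
    labels = [1 if p >= 0.5 else 0 for p in pred]
    ```

    Spark's GBTClassifier follows the same additive scheme, with full decision trees, log-loss and distributed training in place of this toy loop.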

  4. Behavioral and Physiological Neural Network Analyses: A Common Pathway toward Pattern Recognition and Prediction

    Science.gov (United States)

    Ninness, Chris; Lauter, Judy L.; Coffee, Michael; Clary, Logan; Kelly, Elizabeth; Rumph, Marilyn; Rumph, Robin; Kyle, Betty; Ninness, Sharon K.

    2012-01-01

    Using 3 diversified datasets, we explored the pattern-recognition ability of the Self-Organizing Map (SOM) artificial neural network as applied to diversified nonlinear data distributions in the areas of behavioral and physiological research. Experiment 1 employed a dataset obtained from the UCI Machine Learning Repository. Data for this study…
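
    A minimal sketch of the Self-Organizing Map training loop used in such studies (a 1-D map of four units on toy 2-D data; all sizes, rates and the data are assumptions, not from the article):

    ```python
    # Minimal 1-D Self-Organizing Map (SOM): find the best-matching unit,
    # then pull it and its neighbours toward each input, with decaying
    # learning rate and neighbourhood radius.
    import math
    import random

    def train_som(data, n_units=4, epochs=300, lr0=0.5, radius0=2.0, seed=0):
        rng = random.Random(seed)
        dim = len(data[0])
        weights = [[rng.random() for _ in range(dim)] for _ in range(n_units)]
        for epoch in range(epochs):
            frac = epoch / epochs
            lr = lr0 * (1 - frac)
            radius = max(radius0 * (1 - frac), 0.5)
            for x in data:
                # best-matching unit by squared Euclidean distance
                bmu = min(range(n_units),
                          key=lambda u: sum((w - xi) ** 2
                                            for w, xi in zip(weights[u], x)))
                for u in range(n_units):
                    h = math.exp(-((u - bmu) ** 2) / (2 * radius ** 2))
                    weights[u] = [w + lr * h * (xi - w)
                                  for w, xi in zip(weights[u], x)]
        return weights

    # two toy clusters; after training, different units should win for each
    data = [(0.0, 0.0), (0.1, 0.0), (1.0, 1.0), (0.9, 1.0)]
    weights = train_som(data)
    ```

    After training, mapping each input to its best-matching unit gives the pattern classification the study describes.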

  5. Predicting an optimal function for diagnostic and prognostic analyses with gene expression data

    NARCIS (Netherlands)

    Jong, V.L.

    2017-01-01

    The completion of the human genome and the advancement of high-throughput technologies have enabled the quantification of thousands of genes for precision medicine. The problem with gene expression data is that the number of genes (as variables) greatly exceeds the number of samples, thereby

  6. Variability in spectrophotometric pyruvate analyses for predicting onion pungency and nutraceutical value.

    Science.gov (United States)

    Beretta, Vanesa H; Bannoud, Florencia; Insani, Marina; Galmarini, Claudio R; Cavagnaro, Pablo F

    2017-06-01

    Onion pyruvate concentration is used as a predictor of flavor intensity and nutraceutical value. The protocol of Schwimmer and Weston (SW) (1961) is the most widespread methodology for estimating onion pyruvate. Anthon and Barrett (AB) (2003) proposed modifications to this procedure. Here, we compared these spectrophotometry-based procedures for pyruvate analysis using a diverse collection of onion cultivars. The SW method always led to over-estimation of pyruvate levels in colored, but not in white, onions, by up to 65%. Identification of light-absorbance interfering compounds was performed by spectrophotometry and HPLC analysis. Interference by quercetin and anthocyanins jointly accounted for more than 90% of the over-estimation of pyruvate. Pyruvate determinations according to AB significantly reduced absorbance interference from compounds other than pyruvate. This study provides evidence about the mechanistic basis underlying differences between the SW and AB methods for indirect assessment of onion flavor and nutraceutical value.

  7. Identifying the role of Wilms tumor 1 associated protein in cancer prediction using integrative genomic analyses.

    Science.gov (United States)

    Wu, Li-Sheng; Qian, Jia-Yi; Wang, Minghai; Yang, Haiwei

    2016-09-01

    The Wilms tumor suppressor WT1 was first identified due to its essential role in the normal development of the human genitourinary system. Wilms tumor 1 associated protein (WTAP) was subsequently revealed to interact with WT1 using yeast two-hybrid screening. The present study identified 44 complete WTAP genes in the genomes of vertebrates, including fish, amphibians, birds and mammals. The vertebrate WTAP proteins clustered into the primate, rodent and teleost lineages in phylogenetic tree analysis. From 1,347 available SNPs in the human WTAP gene, 19 were identified to cause missense mutations. WTAP was expressed in bladder, blood, brain, breast, colorectal, esophagus, eye, head and neck, lung, ovarian, prostate, skin and soft tissue cancers. A total of 17 out of 328 microarrays demonstrated an association between WTAP gene expression and cancer prognosis. However, the association between WTAP gene expression and prognosis varied in distinct types of cancer, and even in identical types of cancer from separate microarray databases. By searching the Catalogue of Somatic Mutations in Cancer database, 65 somatic mutations were identified in the human WTAP gene from the cancer tissue samples. These results suggest that the function of WTAP in tumor formation may be multidimensional. Furthermore, signal transducer and activator of transcription 1, forkhead box protein O1, interferon regulatory factor 1, glucocorticoid receptor and peroxisome proliferator-activated receptor γ transcription factor binding sites were identified in the upstream (promoter) region of the human WTAP gene, suggesting that these transcription factors may be involved in WTAP functions in tumor formation.

  8. Sequence analyses and 3D structure prediction of two Type III ...

    African Journals Online (AJOL)

    Internet

    2012-04-17

    Apr 17, 2012 ... the usefulness of protein structure models for molecular replacement. Bioinformatics, 21(2): 72-76. Gomez JM, Loir M, Le Gac F (1998). Growth hormone receptors in testis and liver during the spermatogenetic cycle in rainbow trout. (Oncorhynchus mykiss). Biol. Reprod. 58: 483-491. Harvey S, Scanes CG.

  9. Meta-analyses on viral hepatitis

    DEFF Research Database (Denmark)

    Gluud, Lise L; Gluud, Christian

    2009-01-01

    This article summarizes the meta-analyses of interventions for viral hepatitis A, B, and C. Some of the interventions assessed are described in small trials with unclear bias control. Other interventions are supported by large, high-quality trials. Although attempts have been made to adjust...

  10. Good Governance Analysing Performance of Economic Community ...

    African Journals Online (AJOL)

    Good Governance Analysing Performance of Economic Community of West African States and Southern African Development Community Members on Mo Ibrahim Index of ... The Index is important, significant and appropriate because it outlines criteria and conditions deemed essential for Africans to live meaningful lives.

  11. Regression og geometrisk data analyse (2. del)

    DEFF Research Database (Denmark)

    Brinkkjær, Ulf

    2010-01-01

    The article seeks to show how regression analysis and geometric data analysis can be integrated. This is interesting because these methods are often presented as opposites, for example as an opposition between descriptive and explanatory methods. The first part of the article appeared in Praktiske Grunde 3-4 / 2007....

  12. Heritability estimates derived from threshold analyses for ...

    African Journals Online (AJOL)

    Unknown

    Abstract. The object of this study was to estimate heritabilities and sire breeding values for stayability and reproductive traits in a composite multibreed beef cattle herd using a threshold model. A GFCAT set of programmes was used to analyse reproductive data. Heritabilities and product-moment correlations between.

  13. Phonetic data and phonological analyses | Roux | Stellenbosch ...

    African Journals Online (AJOL)

    Stellenbosch Papers in Linguistics, Vol 1 (1978). Phonetic data and phonological analyses. JC Roux. Abstract.

  14. En kvantitativ metode til analyse af radio

    Directory of Open Access Journals (Sweden)

    Christine Lejre

    2014-06-01

    Full Text Available In the Danish as well as the international literature on radio, proposed methods for analysing the radio medium are sparse. This is presumably because radio is difficult to analyse: it is a medium that is not visualised in the form of images or supported by printed text. The purpose of this article is to describe a new quantitative method for analysing radio that takes particular account of the modality of the radio medium, namely sound structured as a linear progression in time. The method thus supports the radio medium both as a medium in time and as a blind medium. The method was developed in connection with a comparative analysis of cultural programmes on P1 and Radio24syv carried out for Danmarks Radio. The article argues that the method is well suited to analysing not only radio but also other media platforms as well as various journalistic subject areas.

  15. Cosmetology: Task Analyses. Competency-Based Education.

    Science.gov (United States)

    Henrico County Public Schools, Glen Allen, VA. Virginia Vocational Curriculum Center.

    These task analyses are designed to be used in combination with the "Trade and Industrial Education Service Area Resource" in order to implement competency-based education in the cosmetology program in Virginia. The task analysis document contains the task inventory, suggested task sequence lists, and content outlines for the secondary…

  16. Chromosomal evolution and phylogenetic analyses in Tayassu ...

    Indian Academy of Sciences (India)

    The phylogenetic relationships among the tayassuids are unclear and have instigated debate over the ... [Adega F., Chaves R. and Guedes-Pinto H. 2007 Chromosomal evolution and phylogenetic analyses in Tayassu pecari and Pecari tajacu. (Tayassuidae): tales ..... Chromosome banding in Amphibia. XXV. Karyotype ...

  17. Analyses of hydraulic performance of velocity caps

    DEFF Research Database (Denmark)

    Christensen, Erik Damgaard; Degn Eskesen, Mark Chr.; Buhrkall, Jeppe

    2014-01-01

    The hydraulic performance of a velocity cap has been investigated. Velocity caps are often used in connection with offshore intakes. CFD (computational fluid dynamics) was used to examine the flow through the cap openings and further down into the intake pipes. This was combined with dimension analyses...

  18. Quantitative analyses of shrinkage characteristics of neem ...

    African Journals Online (AJOL)

    Quantitative analyses of shrinkage characteristics of neem (Azadirachta indica A. Juss.) wood were carried out. Forty five wood specimens were prepared from the three ecological zones of north eastern Nigeria, viz: sahel savanna, sudan savanna and guinea savanna for the research. The results indicated that the wood ...

  19. Analysing User Lifetime in Voluntary Online Collaboration

    DEFF Research Database (Denmark)

    McHugh, Ronan; Larsen, Birger

    2010-01-01

    This paper analyses persuasion in online collaboration projects. It introduces a set of heuristics that can be applied to such projects and combines these with a quantitative analysis of user activity over time. Two example sites are studies, Open Street Map and The Pirate Bay. Results show...

  20. 10 CFR 61.13 - Technical analyses.

    Science.gov (United States)

    2010-01-01

    ... must clearly identify and differentiate between the roles performed by the natural disposal site... and segregation requirements will be met and that adequate barriers to inadvertent intrusion will be... need for ongoing active maintenance after closure must be based upon analyses of active natural...

  1. Chemical Analyses of Silicon Aerogel Samples

    Energy Technology Data Exchange (ETDEWEB)

    van der Werf, I.; Palmisano, F.; De Leo, Raffaele; Marrone, Stefano

    2008-04-01

    After five years of operation, two Aerogel counters, A1 and A2, taking data in Hall A at Jefferson Lab, suffered a loss of performance. In this note, possible causes of the degradation have been studied. In particular, various chemical and physical analyses have been carried out on several Aerogel tiles and on adhesive tape in order to reveal the presence of contaminants.

  2. Uncertainty quantification approaches for advanced reactor analyses.

    Energy Technology Data Exchange (ETDEWEB)

    Briggs, L. L.; Nuclear Engineering Division

    2009-03-24

    The original approach to nuclear reactor design or safety analyses was to make very conservative modeling assumptions so as to ensure meeting the required safety margins. Traditional regulation, as established by the U.S. Nuclear Regulatory Commission, required conservatisms which have subsequently been shown to be excessive. The Commission has therefore moved away from excessively conservative evaluations and has determined best-estimate calculations to be an acceptable alternative to conservative models, provided the best-estimate results are accompanied by an uncertainty evaluation which can demonstrate that, when a set of analysis cases which statistically account for uncertainties of all types are generated, there is a 95% probability that at least 95% of the cases meet the safety margins. To date, nearly all published work addressing uncertainty evaluations of nuclear power plant calculations has focused on light water reactors and on large-break loss-of-coolant accident (LBLOCA) analyses. However, there is nothing in the uncertainty evaluation methodologies that is limited to a specific type of reactor or to specific types of plant scenarios. These same methodologies can be equally well applied to analyses for high-temperature gas-cooled reactors and to liquid metal reactors, and they can be applied to steady-state calculations, operational transients, or severe accident scenarios. This report reviews and compares both statistical and deterministic uncertainty evaluation approaches. Recommendations are given for selection of an uncertainty methodology and for considerations to be factored into the process of evaluating uncertainties for advanced reactor best-estimate analyses.
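
    The 95/95 criterion described above has a standard order-statistics consequence (Wilks' formula): the smallest number of random best-estimate runs such that, with the stated confidence, the largest observed output bounds the stated population quantile. A sketch (illustrative, not from the report):

    ```python
    # First-order, one-sided Wilks sample size: smallest n with
    # 1 - coverage**n >= confidence, i.e. the probability that at least one
    # of n random runs exceeds the `coverage` quantile is >= `confidence`.
    def wilks_sample_size(coverage=0.95, confidence=0.95):
        n = 1
        while 1.0 - coverage ** n < confidence:
            n += 1
        return n

    n = wilks_sample_size()  # classic result: 59 runs for one-sided 95/95
    ```

    This is why 59 code runs are often quoted as the minimum set of analysis cases for a one-sided 95/95 statement.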

  3. Secundaire analyses organisatiebeleid psychosociale arbeidsbelasting (PSA)

    NARCIS (Netherlands)

    Kraan, K.O.; Houtman, I.L.D.

    2016-01-01

    The central questions in this study are what organisational policy on psychosocial workload (PSA) looked like in 2014 and how it relates to other policies and to outcome measures. The results of these in-depth analyses can benefit the ongoing campaign 'Check je

  4. Hybrid Logical Analyses of the Ambient Calculus

    DEFF Research Database (Denmark)

    Bolander, Thomas; Hansen, Rene Rydhof

    2010-01-01

    In this paper, hybrid logic is used to formulate three control flow analyses for Mobile Ambients, a process calculus designed for modelling mobility. We show that hybrid logic is very well-suited to express the semantic structure of the ambient calculus and how features of hybrid logic can...

  5. Comparison of elastic and inelastic analyses

    International Nuclear Information System (INIS)

    Ammerman, D.J.; Heinstein, M.W.; Wellman, G.W.

    1993-01-01

    In this paper, comparisons between elastic and inelastic analyses are made for a simple ring structure and for a package to transport a large quantity of RAM by rail (rail cask) with lead gamma shielding to illustrate the differences in the two analysis techniques. (J.P.N.)

  6. Microbiological And Physicochemical Analyses Of Oil Contaminated ...

    African Journals Online (AJOL)

    Michael Horsfall

    The physicochemical properties of the soil samples analysed show the pH ... Keywords: oil contaminated soil, microbial isolates, mechanical workshops and physicochemical parameters. Pollution of the environment by petroleum ... strains capable of degrading polyaromatic hydrocarbons have been isolated from soil and.

  7. Pecheries maritimes artisanales Togolaises : analyse des ...

    African Journals Online (AJOL)

    Pecheries maritimes artisanales Togolaises : analyse des debarquements et de la valeur commerciale des captures. K.M. Sedzro, E.D. Fiogbe, E.B. Guerra. Abstract. Subject description: Scientific knowledge of the pressure exerted by artisanal fisheries on Togolese marine resources is necessary in order to ...

  8. Multivariate differential analyses of adolescents' experiences of ...

    African Journals Online (AJOL)

    Aggression is reasoned to be dependent on aspects such as self-concept, moral reasoning, communication, frustration tolerance and family relationships. To analyse the data from questionnaires of 101 families (95 adolescents, 95 mothers and 91 fathers) Cronbach Alpha, various consecutive first and second order factor ...

  9. The Economic Cost of Homosexuality: Multilevel Analyses

    Science.gov (United States)

    Baumle, Amanda K.; Poston, Dudley, Jr.

    2011-01-01

    This article builds on earlier studies that have examined "the economic cost of homosexuality," by using data from the 2000 U.S. Census and by employing multilevel analyses. Our findings indicate that partnered gay men experience a 12.5 percent earnings penalty compared to married heterosexual men, and a statistically insignificant earnings…

  10. Photovoltaic System Modeling. Uncertainty and Sensitivity Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, Clifford W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Martin, Curtis E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    We report an uncertainty and sensitivity analysis for modeling AC energy from photovoltaic systems. Output from a PV system is predicted by a sequence of models. We quantify uncertainty in the output of each model using empirical distributions of each model's residuals. We propagate uncertainty through the sequence of models by sampling these distributions to obtain an empirical distribution of a PV system's output. We consider models that: (1) translate measured global horizontal, direct and global diffuse irradiance to plane-of-array irradiance; (2) estimate effective irradiance; (3) predict cell temperature; (4) estimate DC voltage, current and power; (5) reduce DC power for losses due to inefficient maximum power point tracking or mismatch among modules; and (6) convert DC to AC power. Our analysis considers a notional PV system comprising an array of FirstSolar FS-387 modules and a 250 kW AC inverter; we use measured irradiance and weather at Albuquerque, NM. We found the uncertainty in PV system output to be relatively small, on the order of 1% for daily energy. We found uncertainty in the models for POA irradiance and effective irradiance to be the dominant contributors to uncertainty in predicted daily energy. Our analysis indicates that efforts to reduce the uncertainty in PV system output predictions may yield the greatest improvements by focusing on the POA and effective irradiance models.
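
    The residual-sampling propagation described above can be sketched as follows; the model stages, residual values and additive-percent-error treatment are all toy assumptions, not the actual PNNL/Sandia models:

    ```python
    # Propagate uncertainty through a chain of models by sampling each
    # stage's empirical residual distribution (toy values, in percent).
    import random

    random.seed(1)

    poa_residuals = [-2.0, -1.0, 0.0, 1.0, 2.0]   # plane-of-array model errors
    temp_residuals = [-0.5, 0.0, 0.5]             # cell temperature model errors
    inverter_residuals = [-1.0, 0.0, 1.0]         # DC-to-AC model errors

    def simulate_output(base_energy=1000.0):
        """One Monte Carlo draw: add a sampled residual from each stage."""
        err = (random.choice(poa_residuals)
               + random.choice(temp_residuals)
               + random.choice(inverter_residuals))
        return base_energy * (1.0 + err / 100.0)

    samples = [simulate_output() for _ in range(10_000)]
    mean = sum(samples) / len(samples)
    spread = (max(samples) - min(samples)) / mean
    ```

    The empirical distribution of `samples` plays the role of the report's distribution of predicted PV output; sensitivity is then assessed by seeing which stage's residuals dominate the spread.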

  11. Grey literature in meta-analyses.

    Science.gov (United States)

    Conn, Vicki S; Valentine, Jeffrey C; Cooper, Harris M; Rantz, Marilyn J

    2003-01-01

    In meta-analysis, researchers combine the results of individual studies to arrive at cumulative conclusions. Meta-analysts sometimes include "grey literature" in their evidential base, which includes unpublished studies and studies published outside widely available journals. Because grey literature is a source of data that might not employ peer review, critics have questioned the validity of its data and the results of meta-analyses that include it. The aim here is to examine evidence regarding whether grey literature should be included in meta-analyses and to consider strategies for managing grey literature in quantitative synthesis. This article reviews evidence on whether the results of studies published in peer-reviewed journals are representative of results from broader samplings of research on a topic as a rationale for inclusion of grey literature. Strategies to enhance access to grey literature are addressed. The most consistent and robust difference between published and grey literature is that published research is more likely to contain results that are statistically significant. Effect size estimates of published research are about one-third larger than those of unpublished studies. Unfunded and small sample studies are less likely to be published. Yet, importantly, methodological rigor does not differ between published and grey literature. Meta-analyses that exclude grey literature likely (a) over-represent studies with statistically significant findings, (b) inflate effect size estimates, and (c) provide less precise effect size estimates than meta-analyses including grey literature. Meta-analyses should include grey literature to fully reflect the existing evidential base and should assess the impact of methodological variations through moderator analysis.
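
    The inflation effect described above can be illustrated with fixed-effect inverse-variance pooling; the effect sizes and variances below are hypothetical, chosen only to show how dropping grey-literature studies raises the pooled estimate:

    ```python
    # Fixed-effect inverse-variance pooling: each study is weighted by the
    # reciprocal of its effect-size variance.
    import math

    def pool(effects, variances):
        """Inverse-variance weighted mean effect and its standard error."""
        weights = [1.0 / v for v in variances]
        est = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
        se = math.sqrt(1.0 / sum(weights))
        return est, se

    # (effect, variance) pairs: published studies report larger effects
    published = [(0.45, 0.02), (0.50, 0.03), (0.40, 0.025)]
    grey = [(0.10, 0.05), (0.15, 0.04)]

    pub_est, _ = pool(*zip(*published))
    all_est, _ = pool(*zip(*(published + grey)))
    ```

    A moderator analysis, as the article recommends, would formally test whether publication status explains the difference between the two subgroup estimates.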

  12. Making detailed predictions makes (some) predictions worse

    Science.gov (United States)

    Kelly, Theresa F.

    In this paper, we investigate whether making detailed predictions about an event makes other predictions worse. Across 19 experiments, 10,895 participants, and 415,960 predictions about 724 professional sports games, we find that people who made detailed predictions about sporting events (e.g., how many hits each baseball team would get) made worse predictions about more general outcomes (e.g., which team would win). We rule out that this effect is caused by inattention or fatigue, thinking too hard, or a differential reliance on holistic information about the teams. Instead, we find that thinking about game-relevant details before predicting winning teams causes people to give less weight to predictive information, presumably because predicting details makes information that is relatively useless for predicting the winning team more readily accessible in memory and therefore incorporated into forecasts. Furthermore, we show that this differential use of information can be used to predict what kinds of games will and will not be susceptible to the negative effect of making detailed predictions.

  13. Disruption prediction at JET

    International Nuclear Information System (INIS)

    Milani, F.

    1998-12-01

    The sudden loss of plasma magnetic confinement, known as a disruption, is one of the major issues in a nuclear fusion machine such as JET (Joint European Torus). Disruptions pose very serious problems for the safety of the machine. The energy stored in the plasma is released to the machine structure in a few milliseconds, resulting in forces that at JET reach several meganewtons. The problem is even more severe in a nuclear fusion power station, where the forces are on the order of one hundred meganewtons. The events that occur during a disruption are still not well understood, even though some mechanisms that can lead to a disruption have been identified and can be used to predict them. Unfortunately, it is always a combination of these events that generates a disruption, and therefore it is not possible to use simple algorithms to predict it. This thesis analyses the possibility of using neural network algorithms to predict plasma disruptions in real time. This involves the determination of plasma parameters every few milliseconds. A plasma boundary reconstruction algorithm, XLOC, has been developed in collaboration with Dr. D. O'Brien and Dr. J. Ellis, capable of determining the plasma-wall distance every 2 milliseconds. The XLOC output has been used to develop a multilayer perceptron network to determine plasma parameters such as l_i and q_ψ, with which a machine operational space has been experimentally defined. If the limits of this operational space are breached, the disruption probability increases considerably. Another approach for predicting disruptions is to use neural network classification methods to define the JET operational space. Two methods have been studied. The first method uses a multilayer perceptron network with a softmax activation function for the output layer. This method can be used for classifying the input patterns into various classes. In this case the plasma input patterns have been divided between disrupting and safe patterns, giving the possibility of
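
    A minimal sketch of the classifier type described, a small multilayer perceptron with a softmax output layer separating "disrupting" from "safe" patterns; the toy 2-D inputs stand in for parameters like l_i and q_ψ, and the architecture and learning rate are assumptions, not those of the thesis:

    ```python
    # Tiny MLP with tanh hidden layer and softmax output, trained by
    # full-batch gradient descent on cross-entropy (toy separable data).
    import numpy as np

    rng = np.random.default_rng(0)

    X = np.array([[0.1, 0.2], [0.2, 0.1], [0.15, 0.25],
                  [0.8, 0.9], [0.9, 0.8], [0.85, 0.95]])
    y = np.array([0, 0, 0, 1, 1, 1])  # 0 = safe, 1 = disrupting

    W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)
    W2 = rng.normal(0, 0.5, (8, 2)); b2 = np.zeros(2)

    def softmax(z):
        z = z - z.max(axis=1, keepdims=True)
        e = np.exp(z)
        return e / e.sum(axis=1, keepdims=True)

    lr = 0.5
    for _ in range(500):
        h = np.tanh(X @ W1 + b1)              # hidden layer
        p = softmax(h @ W2 + b2)              # class probabilities
        grad = p.copy()
        grad[np.arange(len(y)), y] -= 1.0     # dL/dlogits for cross-entropy
        grad /= len(y)
        dW2 = h.T @ grad; db2 = grad.sum(0)
        dh = grad @ W2.T * (1 - h ** 2)       # backprop through tanh
        dW1 = X.T @ dh; db1 = dh.sum(0)
        W2 -= lr * dW2; b2 -= lr * db2
        W1 -= lr * dW1; b1 -= lr * db1

    pred = softmax(np.tanh(X @ W1 + b1) @ W2 + b2).argmax(axis=1)
    ```

    The softmax output gives per-class probabilities, so a disruption alarm can be raised when the "disrupting" probability crosses a chosen threshold rather than on a hard label.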

  14. Sorption analyses in materials science: selected oxides

    International Nuclear Information System (INIS)

    Fuller, E.L. Jr.; Condon, J.B.; Eager, M.H.; Jones, L.L.

    1981-01-01

    Physical adsorption studies have been shown to be extremely valuable in studying the chemistry and structure of dispersed materials. Many processes rely on the access to the large amount of surface made available by the high degree of dispersion. Conversely, there are many applications where consolidation of the dispersed solids is required. Several systems (silica gel, alumina catalysts, mineralogic alumino-silicates, and yttrium oxide plasters) have been studied to show the type and amount of chemical and structural information that can be obtained. Some review of current theories is given and additional concepts are developed based on statistical and thermodynamic arguments. The results are applied to sorption data to show that detailed sorption analyses are extremely useful and can provide valuable information that is difficult to obtain by any other means. Considerable emphasis has been placed on data analyses and interpretation of a nonclassical nature to show the potential of such studies that is often not recognized nor utilized

  15. DCH analyses using the CONTAIN code

    International Nuclear Information System (INIS)

    Hong, Sung Wan; Kim, Hee Dong

    1996-08-01

    This report describes CONTAIN analyses performed during participation in the project 'DCH issue resolution for ice condenser plants', which is sponsored by NRC at SNL. Even though the calculations were performed for the ice condenser plant, the CONTAIN code has been used for analyses of many phenomena in the PWR containment, and the DCH module can be applied to any plant type. The present ice condenser issue resolution effort is intended to provide guidance as to what might be needed to resolve DCH for ice condenser plants. It includes both a screening analysis and a scoping study if the screening analysis cannot provide a complete resolution. The following are the results concerning DCH loads, in descending order: 1. Availability of ignition sources prior to vessel breach; 2. Availability and effectiveness of ice in the ice condenser; 3. Loads modeling uncertainties related to co-ejected RPV water; 4. Other loads modeling uncertainties. 10 tabs., 3 figs., 14 refs. (Author)

  16. DCH analyses using the CONTAIN code

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Sung Wan; Kim, Hee Dong [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1996-08-01

    This report describes CONTAIN analyses performed during participation in the project `DCH issue resolution for ice condenser plants`, which is sponsored by NRC at SNL. Even though the calculations were performed for the ice condenser plant, the CONTAIN code has been used for analyses of many phenomena in the PWR containment, and the DCH module can be applied to any plant type. The present ice condenser issue resolution effort is intended to provide guidance as to what might be needed to resolve DCH for ice condenser plants. It includes both a screening analysis and a scoping study if the screening analysis cannot provide a complete resolution. The following are the results concerning DCH loads, in descending order: 1. Availability of ignition sources prior to vessel breach; 2. Availability and effectiveness of ice in the ice condenser; 3. Loads modeling uncertainties related to co-ejected RPV water; 4. Other loads modeling uncertainties. 10 tabs., 3 figs., 14 refs. (Author).

  17. Analysing organic transistors based on interface approximation

    International Nuclear Information System (INIS)

    Akiyama, Yuto; Mori, Takehiko

    2014-01-01

    Temperature-dependent characteristics of organic transistors are analysed thoroughly using interface approximation. In contrast to amorphous silicon transistors, it is characteristic of organic transistors that the accumulation layer is concentrated on the first monolayer, and it is appropriate to consider interface charge rather than band bending. On the basis of this model, observed characteristics of hexamethylenetetrathiafulvalene (HMTTF) and dibenzotetrathiafulvalene (DBTTF) transistors with various surface treatments are analysed, and the trap distribution is extracted. In turn, starting from a simple exponential distribution, we can reproduce the temperature-dependent transistor characteristics as well as the gate voltage dependence of the activation energy, so we can investigate various aspects of organic transistors self-consistently under the interface approximation. Small deviation from such an ideal transistor operation is discussed assuming the presence of an energetically discrete trap level, which leads to a hump in the transfer characteristics. The contact resistance is estimated by measuring the transfer characteristics up to the linear region

  18. Introduction à l'analyse fonctionnelle

    CERN Document Server

    Reischer, Corina; Hengartner, Walter

    1981-01-01

    The fruit of a collaboration between Professor Walter Hengartner of Université Laval and Marcel Lambert and Corina Reischer of the Université du Québec à Trois-Rivières, Introduction à l'analyse fonctionnelle stands out as much for the breadth of its content as for the accessibility of its presentation. Without conceding anything in rigour, it is perfectly suited to a first course in functional analysis. While intended first of all for students of mathematics, it will certainly also be useful to graduate students in science and engineering.

  19. Pratique de l'analyse fonctionelle

    CERN Document Server

    Tassinari, Robert

    1997-01-01

    Developing a product or service that is perfectly suited to the customer's needs and requirements is essential for a company. To leave nothing to chance, a rigorous methodology must be followed: that of functional analysis. This book defines this method precisely, together with its fields of application. It describes the most effective methods for product design and the pursuit of quality, and introduces the notion of internal functional analysis. A key book for optimising product design processes in one's company. -- Key ideas, by Business Digest

  20. New environmental metabarcodes for analysing soil DNA

    DEFF Research Database (Denmark)

    Epp, Laura S.; Boessenkool, Sanne; Bellemain, Eva P.

    2012-01-01

    Metabarcoding approaches use total and typically degraded DNA from environmental samples to analyse biotic assemblages and can potentially be carried out for any kinds of organisms in an ecosystem. These analyses rely on specific markers, here called metabarcodes, which should be optimized for taxonomic resolution, minimal bias in amplification of the target organism group and short sequence length. Using bioinformatic tools, we developed metabarcodes for several groups of organisms: fungi, bryophytes, enchytraeids, beetles and birds. The ability of these metabarcodes to amplify the target groups was systematically evaluated by (i) in silico PCRs using all standard sequences in the EMBL public database as templates, (ii) in vitro PCRs of DNA extracts from surface soil samples from a site in Varanger, northern Norway and (iii) in vitro PCRs of DNA extracts from permanently frozen sediment samples of late...

  1. BWR core melt progression phenomena: Experimental analyses

    International Nuclear Information System (INIS)

    Ott, L.J.

    1992-01-01

    In the BWR Core Melt Progression Phenomena Program, experimental results concerning severe fuel damage and core melt progression in BWR core geometry are used to evaluate existing models of the governing phenomena. These include control blade eutectic liquefaction and the subsequent relocation and attack on the channel box structure; oxidation heating and hydrogen generation; Zircaloy melting and relocation; and the continuing oxidation of zirconium with metallic blockage formation. Integral data have been obtained from the BWR DF-4 experiment in the ACRR and from BWR tests in the German CORA ex-reactor fuel-damage test facility. Additional integral data will be obtained from new CORA BWR tests, the full-length FLHT-6 BWR test in the NRU test reactor, and the new program of ex-reactor experiments at Sandia National Laboratories (SNL) on metallic melt relocation and blockage formation. An essential part of this activity is the interpretation and use of the results of the BWR tests. The Oak Ridge National Laboratory (ORNL) has developed experiment-specific models for analysis of the BWR experiments; to date, these models have permitted far more precise analyses of the conditions in these experiments than were previously available. These analyses have provided a basis for more accurate interpretation of the phenomena that the experiments are intended to investigate. The results of posttest analyses of BWR experiments are discussed and significant findings from these analyses are explained. The ORNL control blade/canister models, with materials-interaction, relocation and blockage models, are currently being implemented in SCDAP/RELAP5 as an optional structural component.

  2. Analysing qualitative research data using computer software.

    Science.gov (United States)

    McLafferty, Ella; Farley, Alistair H

    An increasing number of clinical nurses are choosing to undertake qualitative research. A number of computer software packages are available designed for the management and analysis of qualitative data. However, while it is claimed that the use of these programs is also increasing, this claim is not supported by a search of recent publications. This paper discusses the advantages and disadvantages of using computer software packages to manage and analyse qualitative data.

  3. En Billig GPS Data Analyse Platform

    DEFF Research Database (Denmark)

    Andersen, Ove; Christiansen, Nick; Larsen, Niels T.

    2011-01-01

    This article presents a complete software platform for analysing GPS data. The platform is built exclusively from open-source components. The individual components of the platform are described in detail. Advantages and disadvantages of using open source are discussed, including which IT-policy initiatives ... organisations with a digital road map and GPS data can begin to carry out traffic analyses on these data. It is a requirement that suitable IT competences are present in the organisation.

  4. Analysing customer behaviour in mobile app usage

    OpenAIRE

    Chen, Qianling; Zhang, Min; Zhao, Xiande

    2017-01-01

    Purpose – Big data produced by mobile apps contains valuable knowledge about customers and markets and has been viewed as productive resources. This study proposes a multiple methods approach to elicit intelligence and value from big data by analysing customer behaviour in mobile app usage. Design/methodology/approach – The big data analytical approach is developed using three data mining techniques: RFM (Recency, Frequency, Monetary) analysis, link analysis, and association rule learning. We...
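The RFM technique named in this abstract can be sketched briefly. The scoring below is a minimal illustration with made-up field names, thresholds and a 1-3 score per dimension; it is not the paper's actual implementation.

```python
from datetime import date

def rfm_scores(transactions, today):
    """transactions: list of (customer_id, date, amount).
    Returns {customer_id: (R, F, M)} with each score in 1..3."""
    per_customer = {}
    for cust, d, amount in transactions:
        rec = per_customer.setdefault(cust, {"last": d, "freq": 0, "money": 0.0})
        rec["last"] = max(rec["last"], d)   # most recent purchase
        rec["freq"] += 1                    # number of purchases
        rec["money"] += amount              # total spend

    def bucket(value, thresholds):
        # Map a raw value onto a 1..len(thresholds)+1 score.
        score = 1
        for t in thresholds:
            if value > t:
                score += 1
        return score

    scores = {}
    for cust, rec in per_customer.items():
        recency_days = (today - rec["last"]).days
        scores[cust] = (
            bucket(-recency_days, [-90, -30]),   # more recent -> higher R
            bucket(rec["freq"], [2, 5]),         # more purchases -> higher F
            bucket(rec["money"], [100.0, 500.0]) # bigger spend -> higher M
        )
    return scores

# Illustrative transactions, not data from the study.
txns = [
    ("a", date(2017, 1, 5), 40.0),
    ("a", date(2017, 3, 1), 80.0),
    ("b", date(2016, 6, 1), 700.0),
]
scores = rfm_scores(txns, today=date(2017, 3, 15))
```

In practice the thresholds would be derived from the data (e.g. quintiles) rather than fixed by hand.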

  5. Modelling and analysing oriented fibrous structures

    International Nuclear Information System (INIS)

    Rantala, M; Lassas, M; Siltanen, S; Sampo, J; Takalo, J; Timonen, J

    2014-01-01

    A mathematical model for fibrous structures using a direction-dependent scaling law is presented. The orientation of fibrous nets (e.g. paper) is analysed with a method based on the curvelet transform. The curvelet-based orientation analysis has been tested successfully on real data from paper samples: the major directions of fibre orientation can apparently be recovered. Similar results are achieved in tests on data simulated by the new model, allowing a comparison with ground truth.

  6. Steady State Thermal Analyses of SCEPTOR X-57 Wingtip Propulsion

    Science.gov (United States)

    Schnulo, Sydney L.; Chin, Jeffrey C.; Smith, Andrew D.; Dubois, Arthur

    2017-01-01

    Electric aircraft concepts enable advanced propulsion-airframe integration approaches that promise increased efficiency as well as reduced emissions and noise. NASA's fully electric Maxwell X-57, developed under the SCEPTOR program, features distributed propulsion across a high-aspect-ratio wing. There are 14 propulsors in all: 12 high-lift motors that are active only during takeoff and climb, and 2 larger motors positioned on the wingtips that operate over the entire mission. The power electronics involved in the wingtip propulsion are temperature sensitive and therefore require thermal management. This work focuses on the high- and low-fidelity heat transfer analysis methods performed to ensure that the wingtip motor inverters do not reach their temperature limits. It also explores the different geometry configurations involved in the X-57 development and any associated thermal concerns. All analyses presented are performed at steady state under stressful operating conditions, and therefore predict worst-case temperatures, keeping the results conservative.

  7. Spent fuel shipping costs for transportation logistics analyses

    International Nuclear Information System (INIS)

    Cole, B.M.; Cross, R.E.; Cashwell, J.W.

    1983-05-01

    Logistics analyses supplied to the nuclear waste management programs of the U.S. Department of Energy through the Transportation Technology Center (TTC) at Sandia National Laboratories are used to predict nuclear waste material logistics, transportation packaging demands, shipping and receiving rates, and transportation-related costs for alternative strategies. This study is an in-depth analysis of the problems and contingencies associated with the costs of shipping irradiated reactor fuel. These costs are extremely variable, however, and have changed frequently (sometimes monthly) during the past few years due to changes in capital, fuel, and labor costs. All costs and charges reported in this study are based on January 1982 data using existing transport cask systems and should be used as relative indices only. Actual shipping costs would be negotiable for each origin-destination combination.

  8. Gamma-ray spectrometric analyses of some Nigerian rock samples

    International Nuclear Information System (INIS)

    Uwah, E.J.; Ajakaiye, D.E.

    1990-01-01

    Approximate uranium concentrations in rock samples from the Sokoto Basin of Nigeria were predicted from γ-equivalent uranium and thorium (eU and eTh) measurements which made use of similar rock samples, previously analyzed by delayed neutron counting (DNC) and x-ray fluorescence (XRF) techniques, as reference materials. Comparison of the results of the 3 techniques shows that the eU values approximate the DNC results more than the XRF results do, with standard error of estimate of ±6.68 ppm eU and correlation coefficient of 0.984. Corresponding values for XRF analyses are ±32.53 ppm U and 0.730, respectively. (author)
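The two statistics quoted here, the standard error of estimate (SEE) and the correlation coefficient, can be computed from paired measurements as sketched below. The data are invented stand-ins for the eU/DNC pairs, not the Nigerian rock values.

```python
import math

def fit_stats(x, y):
    """Least-squares line y = a + b*x; returns (r, SEE)."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    syy = sum((yi - my) ** 2 for yi in y)
    b = sxy / sxx                      # slope
    a = my - b * mx                    # intercept
    r = sxy / math.sqrt(sxx * syy)     # Pearson correlation coefficient
    residuals = [yi - (a + b * xi) for xi, yi in zip(x, y)]
    # Standard error of estimate: residual spread about the fitted line.
    see = math.sqrt(sum(e * e for e in residuals) / (n - 2))
    return r, see

eU  = [5.0, 12.0, 20.0, 33.0, 41.0]   # gamma-equivalent U (ppm), made up
dnc = [5.5, 11.0, 21.5, 31.0, 42.0]   # delayed-neutron-counting U (ppm), made up
r, see = fit_stats(eU, dnc)
```

A high r together with a small SEE is what justifies the abstract's conclusion that the eU values track the DNC results closely.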

  9. Compilation of Sandia coal char combustion data and kinetic analyses

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, R.E.; Hurt, R.H.; Baxter, L.L.; Hardesty, D.R.

    1992-06-01

    An experimental project was undertaken to characterize the physical and chemical processes that govern the combustion of pulverized coal chars. The experimental endeavor establishes a database on the reactivities of coal chars as a function of coal type, particle size, particle temperature, gas temperature, and gas composition. The project also provides a better understanding of the mechanism of char oxidation, and yields quantitative information on the release rates of nitrogen- and sulfur-containing species during char combustion. An accurate predictive engineering model of the overall char combustion process under technologically relevant conditions is a primary product of this experimental effort. This document summarizes the experimental effort, the approach used to analyze the data, and individual compilations of data and kinetic analyses for each of the parent coals investigated.

  10. Contract Dynamics : Lessons from Empirical Analyses

    OpenAIRE

    Magali Chaudey

    2010-01-01

    Working paper GATE 2010-35; The recognition that contracts have a time dimension has given rise to a very abundant literature since the end of the 1980s. In such a dynamic context, the contract may take place over several periods and develop repeated interactions. Then, the principal topics of the analysis are commitment, reputation, memory and the renegotiation of the contract. Few papers have tried to apply the predictions of dynamic contract theory to data. The examples of applications int...

  11. Comparative genomic analyses of the Taylorellae.

    Science.gov (United States)

    Hauser, Heidi; Richter, Daniel C; van Tonder, Andries; Clark, Louise; Preston, Andrew

    2012-09-14

    Contagious equine metritis (CEM) is an important venereal disease of horses that is of concern to the thoroughbred industry. Taylorella equigenitalis is a causative agent of CEM but very little is known about it or its close relative Taylorella asinigenitalis. To reveal novel information about Taylorella biology, comparative genomic analyses were undertaken. Whole genome sequencing was performed for the T. equigenitalis type strain, NCTC11184. Draft genome sequences were produced for a second T. equigenitalis strain and for a strain of T. asinigenitalis. These genome sequences were analysed and compared to each other and the recently released genome sequence of T. equigenitalis MCE9. These analyses revealed that T. equigenitalis strains appear to be very similar to each other with relatively little strain-specific DNA content. A number of genes were identified that encode putative toxins and adhesins that are possibly involved in infection. Analysis of T. asinigenitalis revealed that it has a very similar gene repertoire to that of T. equigenitalis but shares surprisingly little DNA sequence identity with it. The generation of genome sequence information greatly increases knowledge of these poorly characterised bacteria and greatly facilitates study of them. Copyright © 2012 Elsevier B.V. All rights reserved.

  12. An analyser for power plant operations

    International Nuclear Information System (INIS)

    Rogers, A.E.; Wulff, W.

    1990-01-01

    Safe and reliable operation of power plants is essential. Power plant operators need a forecast of what the plant will do when its current state is disturbed. The in-line plant analyser provides precisely this information at relatively low cost. The plant analyser scheme uses a mathematical model of the dynamic behaviour of the plant to establish a numerical simulation. Over a period of time, the simulation is calibrated with measurements from the particular plant in which it is used. The analyser then provides a reference against which to evaluate the plant's current behaviour. It can be used to alert the operator to any atypical excursions or combinations of readings that indicate malfunction or off-normal conditions that, as the Three Mile Island event suggests, are not easily recognised by operators. In a look-ahead mode, it can forecast the behaviour resulting from an intended change in settings or operating conditions. Then, when such changes are made, the plant's behaviour can be tracked against the forecast in order to assure that the plant is behaving as expected. It can be used to investigate malfunctions that have occurred and test possible adjustments in operating procedures. Finally, it can be used to consider how far from the limits of performance the elements of the plant are operating. Then by adjusting settings, the required power can be generated with as little stress as possible on the equipment. (6 figures) (Author)

  13. Rotational distortion in conventional allometric analyses.

    Science.gov (United States)

    Packard, Gary C

    2011-08-01

    Three data sets from the recent literature were submitted to new analyses to illustrate the rotational distortion that commonly accompanies traditional allometric analyses and that often causes allometric equations to be inaccurate and misleading. The first investigation focused on the scaling of evaporative water loss to body mass in passerine birds; the second was concerned with the influence of body size on field metabolic rates of rodents; and the third addressed interspecific variation in kidney mass among primates. Straight lines were fitted to logarithmic transformations by Ordinary Least Squares and Generalized Linear Models, and the resulting equations then were re-expressed as two-parameter power functions in the original arithmetic scales. The re-expressed models were displayed on bivariate graphs together with tracings for equations fitted directly to untransformed data by nonlinear regression. In all instances, models estimated by back-transformation failed to describe major features of the arithmetic distribution whereas equations fitted by nonlinear regression performed quite well. The poor performance of equations based on models fitted to logarithms can be traced to the increased weight and leverage exerted in those analyses by observations for small species and to the decreased weight and leverage exerted by large ones. The problem of rotational distortion can be avoided by performing exploratory analysis on untransformed values and by validating fitted models in the scale of measurement. Copyright © 2011 Elsevier Inc. All rights reserved.
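The contrast the article draws, a back-transformed log-log OLS fit versus a fit made directly on the arithmetic scale, can be sketched on synthetic data. The grid search below merely stands in for proper nonlinear regression, and the data points are invented.

```python
import math

def ols_loglog(x, y):
    """OLS on log-transformed data, back-transformed to y = a * x**b."""
    lx = [math.log(v) for v in x]
    ly = [math.log(v) for v in y]
    n = len(x)
    mx, my = sum(lx) / n, sum(ly) / n
    b = (sum((u - mx) * (v - my) for u, v in zip(lx, ly))
         / sum((u - mx) ** 2 for u in lx))
    a = math.exp(my - b * mx)
    return a, b

def sse(a, b, x, y):
    """Sum of squared errors of y = a * x**b on the arithmetic scale."""
    return sum((yi - a * xi ** b) ** 2 for xi, yi in zip(x, y))

def grid_fit(x, y):
    """Coarse grid search on the arithmetic scale (stand-in for
    nonlinear regression): minimize SSE over a in 0.50..2.50,
    b in 0.10..1.50, step 0.01."""
    best = None
    for ia in range(50, 251):
        for ib in range(10, 151):
            a, b = ia / 100, ib / 100
            s = sse(a, b, x, y)
            if best is None or s < best[0]:
                best = (s, a, b)
    return best[1], best[2]

# Synthetic data: roughly y = 1.2 * x**0.55 with additive arithmetic-scale
# noise -- the situation in which back-transformation distorts the fit.
x = [1, 2, 4, 8, 16, 32, 64]
y = [1.4, 2.1, 2.6, 4.4, 6.1, 8.2, 11.9]
a_log, b_log = ols_loglog(x, y)
a_arith, b_arith = grid_fit(x, y)
```

With additive error in the original scale, the direct fit matches the arithmetic data at least as well as the back-transformed line, which over-weights the smallest observations.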

  14. Percutaneous nephrolithotomy vs. extracorporeal shockwave lithotripsy for treating a 20-30 mm single renal pelvic stone.

    Science.gov (United States)

    Hassan, Mohammed; El-Nahas, Ahmed R; Sheir, Khaled Z; El-Tabey, Nasr A; El-Assmy, Ahmed M; Elshal, Ahmed M; Shokeir, Ahmed A

    2015-09-01

    To compare the efficacy, safety and cost of extracorporeal shockwave lithotripsy (ESWL) and percutaneous nephrolithotomy (PNL) for treating a 20-30 mm single renal pelvic stone. The computerised records of patients who underwent PNL or ESWL for a 20-30 mm single renal pelvic stone between January 2006 and December 2012 were reviewed retrospectively. Patients aged ... PNL. The re-treatment rate (75% vs. 5%), the need for secondary procedures (25% vs. 4.7%) and the total number of procedures (three vs. one) were significantly higher in the ESWL group. The stone-free rate was significantly higher in the PNL group (95% vs. 75%), as was the cost of PNL (US$ 1120 vs. 490). PNL was more effective than ESWL for treating a single renal pelvic stone of 20-30 mm. However, ESWL was associated with fewer complications and a lower cost.

  15. ALBEDO PATTERN RECOGNITION AND TIME-SERIES ANALYSES IN MALAYSIA

    Directory of Open Access Journals (Sweden)

    S. A. Salleh

    2012-07-01

    Pattern recognition and time-series analyses enable one to evaluate and generate predictions of specific phenomena, and are particularly useful for monitoring climate conditions. This study was conducted to identify changes in Malaysia's albedo pattern. The patterns and changes recognised will be useful for a variety of environmental and climate monitoring studies, such as carbon budgeting and aerosol mapping. Ten years (2000-2009) of MODIS satellite images were used for the analyses and interpretation. These images were processed using ERDAS Imagine remote sensing software, ArcGIS 9.3, the 6S code for atmospheric calibration and several MODIS tools (MRT, HDF2GIS, Albedo tools). Several methods of time-series analysis were explored; this paper demonstrates trend and seasonal time-series analyses using the converted HDF-format MODIS MCD43A3 albedo land product. The results revealed significant changes in albedo percentages over the past 10 years, and in the pattern with regard to Malaysia's nebulosity index (NI) and aerosol optical depth (AOD). A noticeable trend can be identified with regard to the maximum and minimum values of the albedo. The rises and falls of the line graph show a trend similar to the daily observations; the difference lies in the magnitude of those rises and falls. It can thus be concluded that the temporal behaviour of land surface albedo in Malaysia is uniform with respect to the local monsoons.
However, although the average albedo shows a linear trend with the nebulosity index, the pattern changes of albedo with respect to the nebulosity index indicate that there are external factors influencing the albedo values, as the plotted sky conditions and diffusion do not have a uniform trend over the years, especially when the trend at 5-year intervals is examined; 2000 shows high...
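The trend and seasonal analyses mentioned above can be illustrated with a minimal decomposition: subtract each calendar month's mean (the seasonal component), then fit a linear trend to the remainder. The synthetic series below stands in for the MCD43A3 product; none of the numbers are from the study.

```python
def detrend_seasonal(series, period=12):
    """Split a monthly series into a mean seasonal cycle and the
    least-squares linear trend (per time step) of the deseasonalised rest."""
    n = len(series)
    seasonal = [0.0] * period
    counts = [0] * period
    for i, v in enumerate(series):
        seasonal[i % period] += v
        counts[i % period] += 1
    seasonal = [s / c for s, c in zip(seasonal, counts)]
    resid = [v - seasonal[i % period] for i, v in enumerate(series)]
    # Least-squares slope of the residual against the time index.
    mx = (n - 1) / 2.0
    my = sum(resid) / n
    slope = (sum((i - mx) * (r - my) for i, r in enumerate(resid))
             / sum((i - mx) ** 2 for i in range(n)))
    return seasonal, slope

# Ten years of synthetic monthly albedo: a seasonal cycle plus a slow decline.
cycle = [0.14, 0.15, 0.16, 0.17, 0.16, 0.15,
         0.14, 0.13, 0.13, 0.14, 0.15, 0.14]
series = [cycle[m] - 0.0002 * (12 * yr + m)
          for yr in range(10) for m in range(12)]
seasonal, slope = detrend_seasonal(series)
```

For this series the recovered slope is close to the injected decline of 0.0002 per month, and the seasonal component reproduces the annual cycle.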

  16. Predictability of blocking

    International Nuclear Information System (INIS)

    Tosi, E.; Ruti, P.; Tibaldi, S.; D'Andrea, F.

    1994-01-01

    Tibaldi and Molteni (1990, hereafter referred to as TM) had previously investigated operational blocking predictability by the ECMWF model and the possible relationships between model systematic error and blocking in the winter season of the Northern Hemisphere, using seven years of ECMWF operational archives of analyses and day 1 to 10 forecasts. They showed that fewer blocking episodes than in the real atmosphere were generally simulated by the model, and that this deficiency increased with increasing forecast time. As a consequence of this, a major contribution to the systematic error in the winter season was shown to derive from the inability of the model to properly forecast blocking. In this study, the analysis performed in TM for the first seven winter seasons of the ECMWF operational model is extended to the subsequent five winters, during which model development, reflecting both resolution increases and parametrisation modifications, continued unabated. In addition the objective blocking index developed by TM has been applied to the observed data to study the natural low frequency variability of blocking. The ability to simulate blocking of some climate models has also been tested
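The objective blocking index developed by TM can be sketched as gradients of 500 hPa geopotential height about a central latitude; the latitudes, offsets and thresholds below follow the commonly cited formulation and should be treated as illustrative rather than exact.

```python
def blocked(z500_at, lon, deltas=(-4.0, 0.0, 4.0)):
    """z500_at(lat, lon) -> 500 hPa geopotential height in metres.
    A longitude is flagged as blocked if, for some latitude offset delta,
    the southern gradient GHGS is positive (reversed flow) and the
    northern gradient GHGN is below -10 m per degree latitude."""
    for d in deltas:
        phi_n, phi_0, phi_s = 80.0 + d, 60.0 + d, 40.0 + d
        ghgs = (z500_at(phi_0, lon) - z500_at(phi_s, lon)) / (phi_0 - phi_s)
        ghgn = (z500_at(phi_n, lon) - z500_at(phi_0, lon)) / (phi_n - phi_0)
        if ghgs > 0.0 and ghgn < -10.0:
            return True
    return False

# Toy field: heights fall with latitude (zonal flow) except for a ridge
# near 60N around lon 0, which reverses the southern gradient there.
def toy_field(lat, lon):
    z = 5800.0 - 8.0 * (lat - 40.0)        # background: -8 m per degree
    if abs(lon) < 10.0 and 50.0 <= lat <= 70.0:
        z += 250.0                          # blocking ridge
    return z
```

With this toy field, `blocked(toy_field, 0.0)` flags the ridge longitude while an undisturbed longitude such as 90 degrees is not flagged.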

  17. Predicting responsiveness to intervention in dyslexia using dynamic assessment

    NARCIS (Netherlands)

    Aravena, S.; Tijms, J.; Snellings, P.; van der Molen, M.W.

    In the current study we examined the value of a dynamic test for predicting responsiveness to reading intervention for children diagnosed with dyslexia. The test consisted of a 20-minute training aimed at learning eight basic letter-speech sound correspondences within an artificial orthography, ...

  18. Analyses of the OSU-MASLWR Experimental Test Facility

    Directory of Open Access Journals (Sweden)

    F. Mascari

    2012-01-01

    Today, considering the sustainability of nuclear technology in the energy-mix policies of developing and developed countries, the international community has begun developing new advanced reactor designs. In this framework, Oregon State University (OSU) has constructed a system-level test facility to examine natural circulation phenomena of importance to the multi-application small light water reactor (MASLWR) design, a small modular pressurized water reactor (PWR) relying on natural circulation during both steady-state and transient operation. The aim of this paper is to review the main characteristics of the experimental facility, to analyse the main phenomena characterizing the tests already performed and the potential transients that could be investigated in the facility, and to describe the current IAEA International Collaborative Standard Problem being hosted at OSU, for which experimental data will be collected at the OSU-MASLWR test facility. A summary of the best-estimate thermal-hydraulic system code analyses already performed, to assess the codes' capability in predicting the phenomena typical of the MASLWR prototype as thermal-hydraulically characterized in the OSU-MASLWR facility, is presented as well.

  19. Progress Report on Computational Analyses of Water-Based NSTF

    Energy Technology Data Exchange (ETDEWEB)

    Lv, Q. [Argonne National Lab. (ANL), Argonne, IL (United States); Kraus, A. [Argonne National Lab. (ANL), Argonne, IL (United States); Hu, R. [Argonne National Lab. (ANL), Argonne, IL (United States); Bucknor, M. [Argonne National Lab. (ANL), Argonne, IL (United States); Lisowski, D. [Argonne National Lab. (ANL), Argonne, IL (United States); Nunez, D. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2017-08-01

    CFD analysis has focused on important component-level phenomena, using STAR-CCM+ to supplement the system analysis of integral system behavior. A notable area of interest was the cavity region, which is particularly suited to CFD analysis because of its multi-dimensional flow and complex heat transfer (thermal radiation heat transfer and natural convection), which are not simulated directly by RELAP5. CFD simulations allow for the estimation of the boundary heat flux distribution along the riser tubes, which is needed in the RELAP5 simulations. The CFD results can also provide additional data to help establish what level of modeling detail is necessary in RELAP5. It was found that the flow profiles in the cavity region are simpler for the water-based concept than for the air-cooled concept. The local heat flux increases noticeably in the axial direction and is higher in the fins than in the riser tubes. These results were utilized as boundary conditions in RELAP5 simulations, to provide better temperature predictions in the system-level analyses. It was also determined that temperatures were higher in the fins than in the riser tubes, but within design limits for thermal stresses. Higher temperatures were predicted in the edge fins, in part due to additional thermal radiation from the side cavity walls.

  20. ANALYSE THE PERFORMANCE OF ENSEMBLE CLASSIFIERS USING SAMPLING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    M. Balamurugan

    2016-07-01

    In ensemble classifiers, the combination of multiple prediction models is important for making progress on a variety of difficult prediction problems, and ensembles have proven able to reach higher accuracy than single classifiers. Even so, there is still a need to improve their performance, and many approaches are available for doing so. One of them is sampling, which plays a major role in improving the quality of an ensemble classifier, since it helps reduce bias in the ensemble's input data set. Sampling is the process of extracting a subset of samples from the original dataset. In this research work, sampling techniques for ensemble classifiers are analysed. In an ensemble classifier, a probability-based sampling technique is typically used: samples are gathered by a process that gives all individuals in the population an equal chance, so that sampling bias is removed. In this paper, we analyse the performance of ensemble classifiers using various sampling techniques and list their drawbacks.
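One probability-based sampling scheme of the kind discussed here, bootstrap sampling with an equal selection chance for every record, can be sketched in the context of a bagged ensemble. The "classifiers" below are trivial majority-label stand-ins, not a real learner.

```python
import random

def bootstrap_sample(dataset, rng):
    """Sampling with replacement: every record has an equal chance
    of selection on each draw."""
    return [rng.choice(dataset) for _ in dataset]

def bagged_majority(dataset, n_members, rng):
    """Each 'member' predicts the majority label of its own bootstrap
    sample; the ensemble combines the members by majority vote."""
    votes = []
    for _ in range(n_members):
        sample = bootstrap_sample(dataset, rng)
        labels = [label for _, label in sample]
        votes.append(max(set(labels), key=labels.count))
    return max(set(votes), key=votes.count)

# Illustrative (features, label) records, not data from the paper.
rng = random.Random(42)
data = [([0.1], "neg"), ([0.2], "neg"), ([0.3], "neg"),
        ([0.4], "neg"), ([0.5], "neg"), ([0.9], "pos")]
prediction = bagged_majority(data, n_members=25, rng=rng)
```

Because each draw is equally likely, no record is systematically over- or under-represented across the ensemble, which is the bias-reduction property the abstract refers to.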

  1. Pegasys: software for executing and integrating analyses of biological sequences

    Directory of Open Access Journals (Sweden)

    Lett Drew

    2004-04-01

    Background: We present Pegasys, a flexible, modular and customizable software system that facilitates the execution and data integration of analyses from heterogeneous biological sequence analysis tools. Results: The Pegasys system includes numerous tools for pair-wise and multiple sequence alignment, ab initio gene prediction, RNA gene detection and masking of repetitive sequences in genomic DNA, as well as filters for database formatting and processing raw output from various analysis tools. We introduce a novel data structure for creating workflows of sequence analyses and a unified data model to store its results. The software allows users to dynamically create analysis workflows at run-time by manipulating a graphical user interface. All non-serially dependent analyses are executed in parallel on a compute cluster for efficiency of data generation. The uniform data model and backend relational database management system of Pegasys allow results of the heterogeneous programs included in a workflow to be integrated and exported into General Feature Format for further analyses in GFF-dependent tools, or into GAME XML for import into the Apollo genome editor. The modularity of the design allows new tools to be added to the system with little programmer overhead. The database application programming interface allows programmatic access to the data stored in the backend through SQL queries. Conclusions: The Pegasys system enables biologists and bioinformaticians to create and manage sequence analysis workflows. The software is released under the Open Source GNU General Public License. All source code and documentation is available for download at http://bioinformatics.ubc.ca/pegasys/.

  2. NASA Cold Land Processes Experiment (CLPX 2002/03): Atmospheric analyses datasets

    Science.gov (United States)

    Glen E. Liston; Daniel L. Birkenheuer; Christopher A. Hiemstra; Donald W. Cline; Kelly Elder

    2008-01-01

    This paper describes the Local Analysis and Prediction System (LAPS) and the 20-km horizontal grid version of the Rapid Update Cycle (RUC20) atmospheric analyses datasets, which are available as part of the Cold Land Processes Field Experiment (CLPX) data archive. The LAPS dataset contains spatially and temporally continuous atmospheric and surface variables over...

  3. Multivariate Analyses of Predictors of Heavy Episodic Drinking and Drinking-Related Problems among College Students

    Science.gov (United States)

    Fenzel, L. Mickey

    2005-01-01

    The present study examines predictors of heavy drinking frequency and drinking-related problems among more than 600 college students. Controlling for high school drinking frequency, results of multiple regression analyses showed that more frequent heavy drinking was predicted by being male and risk factors of more frequent marijuana and tobacco…

  4. Systems Analyses Reveal Shared and Diverse Attributes of Oct4 Regulation in Pluripotent Cells

    DEFF Research Database (Denmark)

    Ding, Li; Paszkowski-Rogacz, Maciej; Winzi, Maria

    2015-01-01

    ... of Oct4, a key regulator of pluripotency. Our data signify that there are similarities, but also fundamental differences, in Oct4 regulation in EpiSCs versus embryonic stem cells (ESCs). Through multiparametric data analyses, we predict that Tox4 is associating with the Paf1C complex, which maintains cell...

  5. Analyses of expressed sequence tags from apple.

    Science.gov (United States)

    Newcomb, Richard D; Crowhurst, Ross N; Gleave, Andrew P; Rikkerink, Erik H A; Allan, Andrew C; Beuning, Lesley L; Bowen, Judith H; Gera, Emma; Jamieson, Kim R; Janssen, Bart J; Laing, William A; McArtney, Steve; Nain, Bhawana; Ross, Gavin S; Snowden, Kimberley C; Souleyre, Edwige J F; Walton, Eric F; Yauk, Yar-Khing

    2006-05-01

    The domestic apple (Malus domestica; also known as Malus pumila Mill.) has become a model fruit crop in which to study commercial traits such as disease and pest resistance, grafting, and flavor and health compound biosynthesis. To speed the discovery of genes involved in these traits, develop markers to map genes, and breed new cultivars, we have produced a substantial expressed sequence tag collection from various tissues of apple, focusing on fruit tissues of the cultivar Royal Gala. Over 150,000 expressed sequence tags have been collected from 43 different cDNA libraries representing 34 different tissues and treatments. Clustering of these sequences results in a set of 42,938 nonredundant sequences comprising 17,460 tentative contigs and 25,478 singletons, together representing what we predict are approximately one-half the expressed genes from apple. Many potential molecular markers are abundant in the apple transcripts. Dinucleotide repeats are found in 4,018 nonredundant sequences, mainly in the 5'-untranslated region of the gene, with a bias toward one repeat type (containing AG, 88%) and against another (repeats containing CG, 0.1%). Trinucleotide repeats are most common in the predicted coding regions and do not show a similar degree of sequence bias in their representation. Bi-allelic single-nucleotide polymorphisms are highly abundant with one found, on average, every 706 bp of transcribed DNA. Predictions of the numbers of representatives from protein families indicate the presence of many genes involved in disease resistance and the biosynthesis of flavor and health-associated compounds. Comparisons of some of these gene families with Arabidopsis (Arabidopsis thaliana) suggest instances where there have been duplications in the lineages leading to apple of biosynthetic and regulatory genes that are expressed in fruit. This resource paves the way for a concerted functional genomics effort in this important temperate fruit crop.
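The dinucleotide-repeat scan described in this abstract can be sketched with a backreference regular expression. The minimum repeat count and the toy sequence below are illustrative assumptions, not the paper's criteria.

```python
import re

def dinucleotide_repeats(seq, min_repeats=5):
    """Return (motif, start, run_length) for every run in which a
    two-base motif is tandemly repeated at least min_repeats times."""
    hits = []
    # Capture a 2-base motif, then require min_repeats-1 further copies.
    pattern = r"([ACGT]{2})\1{%d,}" % (min_repeats - 1)
    for m in re.finditer(pattern, seq):
        motif = m.group(1)
        if motif[0] != motif[1]:  # ignore homopolymer runs such as AAAA...
            hits.append((motif, m.start(), m.end() - m.start()))
    return hits

# Toy 5'-UTR-like sequence containing an (AG)7 and a (CA)6 repeat.
utr = "GCGT" + "AG" * 7 + "CCTTAAATTT" + "CA" * 6 + "GG"
hits = dinucleotide_repeats(utr)
```

A scan like this, run over every clustered transcript, is the kind of step that would surface the 5'-UTR-biased AG-rich repeats the abstract reports.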

  6. [Analyses of deaths can provide meaningful learning].

    Science.gov (United States)

    Jensen, Marie Rosenørn Hviid; Jørsboe, Hanne Blæhr

    2016-05-16

    Learning based on deceased patients has provided medicine with substantial knowledge and is still a source of new information. The basic learning approach has been autopsies, but focus has shifted towards analysis of registry data. This article evaluates different ways of analysing natural deaths, including autopsies, audits, clinical databases and hospital standardised mortality ratios, with regard to clinical learning. We argue that data-driven analysis cannot stand alone, and recommend that clinicians organise multidisciplinary, theoretically based audits in order to keep learning from the deceased.

  7. Fully Coupled FE Analyses of Buried Structures

    Directory of Open Access Journals (Sweden)

    James T. Baylot

    1994-01-01

    Current procedures for determining the response of buried structures to the effects of the detonation of buried high explosives recommend decoupling the free-field stress analysis from the structure-response analysis. A fully coupled (explosive-soil-structure) finite element analysis procedure was developed so that the accuracy of current decoupling procedures could be evaluated. Comparisons of the results of analyses performed using this procedure with scale-model experiments indicate that this finite element procedure can be used to effectively evaluate the accuracy of the methods currently being used to decouple the free-field stress analysis from the structure-response analysis.

  8. Implementing partnerships in nonreactor facility safety analyses

    International Nuclear Information System (INIS)

    Courtney, J.C.; Perry, W.H.; Phipps, R.D.

    1996-01-01

Faculty and students from LSU have been participating in nuclear safety analyses and radiation protection projects at ANL-W at INEL since 1973. A mutually beneficial relationship has evolved that has resulted in the generation of safety-related studies acceptable to Argonne and to DOE, NRC, and state regulatory groups. Most of the safety projects have involved the Hot Fuel Examination Facility or the Fuel Conditioning Facility; both are hot cells that receive spent fuel from EBR-II. A table shows some of the major projects at ANL-W that involved LSU students and faculty.

  9. Applications of neural network to numerical analyses

    International Nuclear Information System (INIS)

    Takeda, Tatsuoki; Fukuhara, Makoto; Ma, Xiao-Feng; Liaqat, Ali

    1999-01-01

Applications of a multi-layer neural network to numerical analyses are described. We are mainly concerned with computed tomography and the solution of differential equations. In both cases we employed residuals of the integral equation or the differential equations as the objective function for training the neural network. This differs from conventional neural network training, where the sum of the squared errors of the output values is adopted as the objective function. For model problems both methods gave satisfactory results, and the methods are considered promising for certain kinds of problems. (author)
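The residual-as-objective idea can be illustrated on a single ODE. A minimal sketch, not the authors' method: the equation y' = -y with y(0) = 1, the trial form, the tiny tanh network, and the finite-difference derivative are all illustrative assumptions:

```python
import numpy as np
from scipy.optimize import minimize

# Collocation points for the ODE y'(x) = -y(x), y(0) = 1, on [0, 1].
xs = np.linspace(0.0, 1.0, 25)

def net(params, x):
    # One-hidden-layer tanh network with 8 units; params is a flat vector.
    w1, b1, w2, b2 = np.split(params, [8, 16, 24])
    h = np.tanh(np.outer(x, w1) + b1)      # shape (n, 8)
    return h @ w2 + b2[0]                  # shape (n,)

def trial(params, x):
    # Trial solution y(x) = 1 + x * N(x) satisfies y(0) = 1 by construction.
    return 1.0 + x * net(params, x)

def residual_loss(params):
    # Objective = mean squared residual of the ODE, not an output error.
    eps = 1e-4
    y = trial(params, xs)
    dy = (trial(params, xs + eps) - trial(params, xs - eps)) / (2 * eps)
    return np.mean((dy + y) ** 2)          # residual of y' + y = 0

rng = np.random.default_rng(0)
res = minimize(residual_loss, rng.normal(scale=0.1, size=25), method="BFGS")
approx = trial(res.x, np.array([1.0]))[0]
print(abs(approx - np.exp(-1.0)))          # error against the exact solution e^{-1}
```

Because only the equation residual is minimized, no reference solution is needed for training, which is the point the abstract makes.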

  10. ORNL analyses of AVR performance and safety

    Energy Technology Data Exchange (ETDEWEB)

    Cleveland, J.C.

    1985-01-01

    Because of the high interest in modular High Temperature Reactor performance and safety, a cooperative project has been established involving the Oak Ridge National Laboratory (ORNL), Arbeitsgemeinschaft Versuchs Reaktor GmbH (AVR), and Kernforschungsanlage Juelich GmbH (KFA) in reactor physics, performance and safety. This paper presents initial results of ORNL's examination of a hypothetical depressurized core heatup accident and consideration of how a depressurized core heatup test might be conducted by AVR staff. Also presented are initial analyses of a test involving a reduction in core flow and of a test involving reactivity insertion via control rod withdrawal.

  12. Cost/benefit analyses of environmental impact

    International Nuclear Information System (INIS)

    Goldman, M.I.

    1974-01-01

    Various aspects of cost-benefit analyses are considered. Some topics discussed are: regulations of the National Environmental Policy Act (NEPA); statement of AEC policy and procedures for implementation of NEPA; Calvert Cliffs decision; AEC Regulatory Guide; application of risk-benefit analysis to nuclear power; application of the as low as practicable (ALAP) rule to radiation discharges; thermal discharge restrictions proposed by EPA under the 1972 Amendment to the Water Pollution Control Act; estimates of somatic and genetic insult per unit population exposure; occupational exposure; EPA Point Source Guidelines for Discharges from Steam Electric Power Plants; and costs of closed-cycle cooling using cooling towers. (U.S.)

  13. PREDICT: Satellite tracking and orbital prediction

    Science.gov (United States)

    Magliacane, John A.

    2011-12-01

PREDICT is an open-source, multi-user satellite tracking and orbital prediction program written under the Linux operating system. PREDICT provides real-time satellite tracking and orbital prediction information to users and client applications through the system console, the command line, a network socket, and the generation of audio speech. Data such as a spacecraft's sub-satellite point, azimuth and elevation headings, Doppler shift, path loss, slant range, orbital altitude, orbital velocity, footprint diameter, orbital phase (mean anomaly), squint angle, eclipse depth, the time and date of the next AOS (or LOS of the current pass), orbit number, and sunlight and visibility information are provided on a real-time basis. PREDICT can also track (or predict the position of) the Sun and Moon. PREDICT has the ability to control AZ/EL antenna rotators to maintain accurate orientation in the direction of communication satellites. As an aid in locating and tracking satellites through optical means, PREDICT can articulate tracking coordinates and visibility information as plain speech.
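Among the quantities listed, the Doppler shift follows directly from the range rate. A minimal first-order sketch, not PREDICT's own code (the 2 m-band downlink frequency and the range rate are illustrative values):

```python
C = 299_792_458.0  # speed of light, m/s

def doppler_shift_hz(downlink_hz, range_rate_ms):
    # First-order Doppler shift: a positive range rate (satellite receding)
    # lowers the received frequency, a negative one raises it.
    return -downlink_hz * range_rate_ms / C

# A 2 m-band downlink with a LEO-typical range rate (illustrative values):
print(doppler_shift_hz(145.8e6, 7000.0))   # roughly -3.4e3 Hz
```

A tracking program recomputes this continuously as the range rate swings from negative (approaching) through zero (closest approach) to positive (receding) during a pass.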

  14. Depressive disorder in the last phase of life in patients with cardiovascular disease, cancer, and COPD: data from a 20-year follow-up period in general practice.

    Science.gov (United States)

    Warmenhoven, Franca; Bor, Hans; Lucassen, Peter; Vissers, Kris; van Weel, Chris; Prins, Judith; Schers, Henk

    2013-05-01

Depression is assumed to be common in chronically ill patients during their last phase of life and is associated with poorer outcomes. The reported prevalence of depression varies widely across previous studies due to the use of different terminology, classifications, and assessment methods. To explore the reported incidence of depressive disorder, as registered in the last phase of life of patients who died from cardiovascular disease, cancer or COPD, in a sample of primary care patients. A historic cohort study, using a 20-year registration database of medical records in four Dutch general practices (a dynamic population based on the Continuous Morbidity Registration database). The medical history of the sample cohort was analysed for the diagnosis of a new episode of depressive disorder, and descriptive statistics were used. In total 982 patients were included, and 19 patients (1.9%) were diagnosed with a new depressive disorder in the last year of their life. The lifetime prevalence of depressive disorder in this sample was 8.2%. The incidence of depressive disorder in the last phase of life is remarkably low in this study. These data were derived from actual patient care in general practice. Psychiatric diagnoses were made by GPs in the context of both patient needs and delivered care. A broader concept of depression in general practice is recommended to improve the diagnosis and treatment of mood disorders in patients in the last phase of life.

  15. Determinants of Aortic Root Dilatation and Reference Values Among Young Adults Over a 20-Year Period: Coronary Artery Risk Development in Young Adults Study.

    Science.gov (United States)

    Teixido-Tura, Gisela; Almeida, Andre L C; Choi, Eui-Young; Gjesdal, Ola; Jacobs, David R; Dietz, Harry C; Liu, Kiang; Sidney, Stephen; Lewis, Cora E; Garcia-Dorado, David; Evangelista, Artur; Gidding, Samuel; Lima, João A C

    2015-07-01

Aortic size increases with age, but factors related to such dilatation in a healthy young adult population have not been studied. We aim to evaluate changes in aortic dimensions and their principal correlates among young adults over a 20-year time period. Reference values for aortic dimensions in young adults by echocardiography are also provided. Healthy Coronary Artery Risk Development in Young Adults (CARDIA) study participants aged 23 to 35 years in 1990-1991 (n=3051) were included after excluding 18 individuals with significant valvular dysfunction. Aortic root diameter (ARD) by M-mode echocardiography at year-5 (43.7% men; age, 30.2 ± 3.6 years) and year-25 CARDIA exams was obtained. Univariable and multivariable analyses were performed to assess associations of ARD with clinical data at years-5 and -25. ARD from year-5 was used to establish reference values of ARD in healthy young adults. ARD at year-25 was greater in men (33.3 ± 3.7 versus 28.7 ± 3.4 mm; Pyoung adulthood. Our study also provides reference values for ARD in young adults. © 2015 American Heart Association, Inc.

  16. CHEMICAL ANALYSES OF SODIUM SYSTEMS FINAL REPORT

    Energy Technology Data Exchange (ETDEWEB)

    Greenhalgh, W. O.; Yunker, W. H.; Scott, F. A.

    1970-06-01

BNWL-1407 summarizes information gained from the Chemical Analyses of Sodium Systems Program pursued by Battelle-Northwest over the period from July 1967 through June 1969. Tasks included feasibility studies for performing coulometric titration and polarographic determinations of oxygen in sodium, and the development of new separation techniques for sodium impurities and their subsequent analyses. The program was terminated ahead of schedule, so firm conclusions were not obtained in all areas of the work. At least 40 coulometric titrations were carried out and special test cells were developed for coulometric application. Data indicated that polarographic measurements are theoretically feasible, but practical application of the method was not verified. An emission spectrographic procedure for trace metal impurities was developed and published. Trace metal analysis by a neutron activation technique was shown to be feasible; key to the success of the activation technique was the application of a new ion exchange resin which provided a sodium separation factor of 10^11. Preliminary studies on direct scavenging of trace metals produced no conclusive results.

  17. Sensitivity in risk analyses with uncertain numbers.

    Energy Technology Data Exchange (ETDEWEB)

    Tucker, W. Troy; Ferson, Scott

    2006-06-01

Sensitivity analysis is a study of how changes in the inputs to a model influence the results of the model. Many techniques have recently been proposed for use when the model is probabilistic. This report considers the related problem of sensitivity analysis when the model includes uncertain numbers that can involve both aleatory and epistemic uncertainty and the method of calculation is Dempster-Shafer evidence theory or probability bounds analysis. Some traditional methods for sensitivity analysis generalize directly for use with uncertain numbers, but, in some respects, sensitivity analysis for these analyses differs from traditional deterministic or probabilistic sensitivity analyses. A case study of a dike reliability assessment illustrates several methods of sensitivity analysis, including traditional probabilistic assessment, local derivatives, and a "pinching" strategy that hypothetically reduces the epistemic uncertainty or aleatory uncertainty, or both, in an input variable to estimate the reduction of uncertainty in the outputs. The prospects for applying the methods to black box models are also considered.
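The "pinching" strategy can be sketched with a toy Monte Carlo model (the two-input model and its distributions are invented for illustration; a real analysis would pinch inputs inside a probability-bounds or Dempster-Shafer structure rather than a plain probabilistic one):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

def model(a, b):
    # Toy response: more sensitive to input `a` than to input `b`.
    return 3.0 * a + b ** 2

a = rng.normal(1.0, 0.5, n)   # uncertain input A
b = rng.normal(1.0, 0.5, n)   # uncertain input B
base_var = np.var(model(a, b))

# Pinch one input at a time: fix it at a point value (here its mean) and
# measure how much output variance disappears as a result.
reduction = {
    "a": 1.0 - np.var(model(np.full(n, 1.0), b)) / base_var,
    "b": 1.0 - np.var(model(a, np.full(n, 1.0))) / base_var,
}
print(reduction)   # pinching `a` removes roughly twice as much variance as `b`
```

Ranking inputs by the uncertainty reduction their pinching buys is what makes the method useful for prioritizing further data collection.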

  18. Hierarchical regression for analyses of multiple outcomes.

    Science.gov (United States)

    Richardson, David B; Hamra, Ghassan B; MacLehose, Richard F; Cole, Stephen R; Chu, Haitao

    2015-09-01

    In cohort mortality studies, there often is interest in associations between an exposure of primary interest and mortality due to a range of different causes. A standard approach to such analyses involves fitting a separate regression model for each type of outcome. However, the statistical precision of some estimated associations may be poor because of sparse data. In this paper, we describe a hierarchical regression model for estimation of parameters describing outcome-specific relative rate functions and associated credible intervals. The proposed model uses background stratification to provide flexible control for the outcome-specific associations of potential confounders, and it employs a hierarchical "shrinkage" approach to stabilize estimates of an exposure's associations with mortality due to different causes of death. The approach is illustrated in analyses of cancer mortality in 2 cohorts: a cohort of dioxin-exposed US chemical workers and a cohort of radiation-exposed Japanese atomic bomb survivors. Compared with standard regression estimates of associations, hierarchical regression yielded estimates with improved precision that tended to have less extreme values. The hierarchical regression approach also allowed the fitting of models with effect-measure modification. The proposed hierarchical approach can yield estimates of association that are more precise than conventional estimates when one wishes to estimate associations with multiple outcomes. © The Author 2015. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
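The shrinkage idea behind the hierarchical model can be sketched in its simplest empirical-Bayes form (the estimates, standard errors, and the crude moment estimate of the between-outcome variance are illustrative assumptions, not the paper's fully Bayesian model):

```python
import numpy as np

# Hypothetical cause-specific log relative rates and standard errors,
# as if fitted by a separate regression per cause of death.
beta = np.array([0.80, 0.10, 0.35, -0.20, 1.50])
se   = np.array([0.50, 0.15, 0.25, 0.40, 0.90])

# Precision-weighted grand mean and a crude moment estimate of the
# between-outcome variance tau^2.
w_prec = 1.0 / se ** 2
mu = np.sum(w_prec * beta) / np.sum(w_prec)
tau2 = max(np.var(beta, ddof=1) - np.mean(se ** 2), 1e-6)

# Each estimate is shrunk toward mu; noisier estimates are shrunk harder.
w = tau2 / (tau2 + se ** 2)
shrunk = w * beta + (1.0 - w) * mu
print(shrunk)   # the extreme, imprecise 1.50 moves furthest toward mu
```

This is the stabilizing effect the abstract describes: sparse-data outcomes borrow strength from the ensemble, so extreme and imprecise associations are pulled toward the common mean.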

  19. Used Fuel Management System Interface Analyses - 13578

    Energy Technology Data Exchange (ETDEWEB)

    Howard, Robert; Busch, Ingrid [Oak Ridge National Laboratory, P.O. Box 2008, Bldg. 5700, MS-6170, Oak Ridge, TN 37831 (United States); Nutt, Mark; Morris, Edgar; Puig, Francesc [Argonne National Laboratory (United States); Carter, Joe; Delley, Alexcia; Rodwell, Phillip [Savannah River National Laboratory (United States); Hardin, Ernest; Kalinina, Elena [Sandia National Laboratories (United States); Clark, Robert [U.S. Department of Energy (United States); Cotton, Thomas [Complex Systems Group (United States)

    2013-07-01

    Preliminary system-level analyses of the interfaces between at-reactor used fuel management, consolidated storage facilities, and disposal facilities, along with the development of supporting logistics simulation tools, have been initiated to provide the U.S. Department of Energy (DOE) and other stakeholders with information regarding the various alternatives for managing used nuclear fuel (UNF) generated by the current fleet of light water reactors operating in the United States. An important UNF management system interface consideration is the need for ultimate disposal of UNF assemblies contained in waste packages that are sized to be compatible with different geologic media. Thermal analyses indicate that waste package sizes for the geologic media under consideration by the Used Fuel Disposition Campaign may be significantly smaller than the canisters being used for on-site dry storage by the nuclear utilities. Therefore, at some point along the UNF disposition pathway, there could be a need to repackage fuel assemblies already loaded and being loaded into the dry storage canisters currently in use. The implications of where and when the packaging or repackaging of commercial UNF will occur are key questions being addressed in this evaluation. The analysis demonstrated that thermal considerations will have a major impact on the operation of the system and that acceptance priority, rates, and facility start dates have significant system implications. (authors)

  20. Autism and pain: a literature review

    Science.gov (United States)

    Dubois, Amandine; Rattaz, Cécile; Pry, René; Baghdadli, Amaria

    2010-01-01

This literature review takes stock of published work in the field of pain and autism. The article first covers published studies on the modes of pain expression observed in this population. Different hypotheses that may explain the expressive particularities of people with autism are then reviewed: an excess of endorphins, particularities in sensory processing, and socio-communicative deficits. The review ends with the question of how pain is assessed and taken into account in people with autism. The authors conclude that the results of published studies are not homogeneous and that further research is needed to reach consensual data in a field that remains little explored scientifically. At the clinical level, deeper knowledge in this area should make it possible to develop pain assessment tools and thus ensure better day-to-day pain management. PMID:20808970

  1. Special analyses reveal coke-deposit structure

    International Nuclear Information System (INIS)

    Albright, L.F.

    1988-01-01

A scanning electron microscope (SEM) and an energy dispersive X-ray analyzer (EDAX) have been used to obtain information that clarifies the three mechanisms of coke formation in ethylene furnaces, and to analyze the metal condition at the exit of the furnace. The results can be used to examine furnace operations and develop improved ethylene plant practices. In this first of four articles on the analyses of coke and metal samples, the coking mechanisms and coke deposits in a section of tube from an actual ethylene furnace (Furnace A) at a plant on the Texas Gulf Coast are discussed. The second article in the series will analyze the condition of the tube metal in the same furnace. To show how coke deposition and metal condition depend on the operating parameters of an ethylene furnace, the third article will examine the coke deposition in a Texas Gulf Coast furnace tube (Furnace B) that operated at shorter residence time. The fourth article discusses the metal condition in that furnace. Some recommendations, based on the analyses and findings, are offered in the fourth article that could help extend the life of ethylene furnace tubes and improve overall ethylene plant operations

  2. Hitchhikers’ guide to analysing bird ringing data

    Directory of Open Access Journals (Sweden)

    Harnos Andrea

    2015-12-01

Full Text Available Bird ringing datasets constitute possibly the largest source of temporal and spatial information on vertebrate taxa available on the globe. Initially, the method was invented to understand avian migration patterns; however, data deriving from bird ringing have since been used in an array of other disciplines, including population monitoring, changes in demography, conservation management, and studies of the effects of climate change, to name a few. Despite this widespread use and importance, there are no guidelines available specifically describing the practice of data management, preparation and analyses of ringing datasets. Here, we present the first of a series of comprehensive tutorials that may help fill this gap. We describe in detail, and through a real-life example, the intricacies of data cleaning and how to create a data table ready for analyses from raw ringing data in the R software environment. Moreover, we created and present here an R package, ringR, designed to carry out various specific tasks and plots related to bird ringing data. Most methods described here can also be applied to a wide range of capture-recapture type data based on individual marking, regardless of taxon or research question.

  3. Overview of cooperative international piping benchmark analyses

    International Nuclear Information System (INIS)

    McAfee, W.J.

    1982-01-01

This paper presents an overview of an effort initiated in 1976 by the International Working Group on Fast Reactors (IWGFR) of the International Atomic Energy Agency (IAEA) to evaluate detailed and simplified inelastic analysis methods for piping systems, with particular emphasis on piping bends. The procedure was to collect from participating IAEA member countries descriptions of tests and test results for piping systems or bends (with emphasis on high-temperature inelastic tests); to compile, evaluate, and issue a selected number of these problems for analysis; and to compile and make a preliminary evaluation of the analysis results. Of the problem descriptions submitted, three were selected to be used: a 90°-elbow at 600°C with an in-plane transverse force; a 90°-elbow with an in-plane moment; and a 180°-elbow at room temperature with a reversed, cyclic, in-plane transverse force. A variety of both detailed and simplified analysis solutions were obtained. A brief comparative assessment of the analyses is contained in this paper. 15 figures

  4. Fractal and multifractal analyses of bipartite networks

    Science.gov (United States)

    Liu, Jin-Long; Wang, Jian; Yu, Zu-Guo; Xie, Xian-Hua

    2017-03-01

    Bipartite networks have attracted considerable interest in various fields. Fractality and multifractality of unipartite (classical) networks have been studied in recent years, but there is no work to study these properties of bipartite networks. In this paper, we try to unfold the self-similarity structure of bipartite networks by performing the fractal and multifractal analyses for a variety of real-world bipartite network data sets and models. First, we find the fractality in some bipartite networks, including the CiteULike, Netflix, MovieLens (ml-20m), Delicious data sets and (u, v)-flower model. Meanwhile, we observe the shifted power-law or exponential behavior in other several networks. We then focus on the multifractal properties of bipartite networks. Our results indicate that the multifractality exists in those bipartite networks possessing fractality. To capture the inherent attribute of bipartite network with two types different nodes, we give the different weights for the nodes of different classes, and show the existence of multifractality in these node-weighted bipartite networks. In addition, for the data sets with ratings, we modify the two existing algorithms for fractal and multifractal analyses of edge-weighted unipartite networks to study the self-similarity of the corresponding edge-weighted bipartite networks. The results show that our modified algorithms are feasible and can effectively uncover the self-similarity structure of these edge-weighted bipartite networks and their corresponding node-weighted versions.

  5. Analysing wind farm efficiency on complex terrains

    International Nuclear Information System (INIS)

    Castellani, Francesco; Astolfi, Davide; Terzi, Ludovico; Hansen, Kurt Schaldemose; Rodrigo, Javier Sanz

    2014-01-01

Actual performance of onshore wind farms is deeply affected both by wake interactions and by terrain complexity, so monitoring how the efficiency varies with wind direction is a crucial task, and the polar efficiency plot is a useful tool for doing so. The approach deserves careful discussion for onshore wind farms, where orography and layout commonly affect performance assessment. The present work deals with three modern wind farms, owned by Sorgenia Green, located on hilly terrains with slopes from gentle to rough. Further, the onshore wind farm of Nørrekær Enge has been analysed as a reference case: its layout is similar to offshore wind farms and its efficiency is mainly driven by wakes. It is shown and justified that terrain complexity imposes a novel and more consistent way of defining polar efficiency. The dependency of efficiency on wind direction, farm layout and orography is analysed and discussed. Effects of atmospheric stability have also been investigated through MERRA reanalysis data from NASA satellites, with the Monin-Obukhov length used to discriminate climate regimes
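At its core, a polar efficiency plot is a farm-level efficiency metric binned by wind-direction sector. A minimal sketch on synthetic data (the wake-loss direction at 270° and all magnitudes are invented):

```python
import numpy as np

# Synthetic SCADA-like data: wind direction (deg) and a farm efficiency proxy,
# i.e. farm power divided by (turbine count x free-stream turbine power).
# A Gaussian wake-loss dip is planted at 270 degrees for illustration.
rng = np.random.default_rng(1)
wd = rng.uniform(0.0, 360.0, 5000)
eff = 0.95 - 0.25 * np.exp(-0.5 * ((wd - 270.0) / 15.0) ** 2)
eff += rng.normal(0.0, 0.02, wd.size)

# Polar efficiency: mean efficiency per 10-degree direction sector.
edges = np.arange(0, 370, 10)
sector = np.digitize(wd, edges) - 1
polar = np.array([eff[sector == k].mean() for k in range(36)])
worst = int(edges[np.argmin(polar)])   # lower edge of the most wake-affected sector
print(worst)                           # lands in the sector around 270 degrees
```

On complex terrain the same binning applies, but the free-stream reference itself varies with direction, which is why the abstract argues for a more careful definition of the per-sector efficiency.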

  6. Ethics of cost analyses in medical education.

    Science.gov (United States)

    Walsh, Kieran

    2013-11-01

Cost analyses in medical education are rarely straightforward, and rarely lead to clear-cut conclusions. Even when they do, some stakeholders will ask difficult but valid questions about what to do following cost analyses, specifically about distributive justice in the allocation of resources. At present there is little debate about these issues, and the rationing decisions taken in medical education are largely made subconsciously. Distributive justice 'concerns the nature of a socially just allocation of goods in a society'. Inevitably there is a large degree of subjectivity in the judgment as to whether an allocation is seen as socially just or ethical. There are different principles by which we can view distributive justice, and these affect the prism of subjectivity through which we see certain problems. For example, we might say that distributive justice at a certain institution, or in a certain medical education system, operates according to the principle that resources must be divided equally amongst learners. Another system may say that resources should be distributed according to the needs of learners, or even of patients. No ethical system or model is inherently right or wrong; each depends on the context in which the educator is working.

  7. Analyser Framework to Verify Software Components

    Directory of Open Access Journals (Sweden)

    Rolf Andreas Rasenack

    2009-01-01

Full Text Available Today it is important for software companies to build software systems in a short time interval, to reduce costs, and to achieve a good market position. Well organised and systematic development approaches are therefore required. Reusing well-tested software components can be a good way to develop software applications effectively: reuse is less expensive and less time consuming than development from scratch. But it is dangerous to assume that software components can be combined without any problems. Individual components may be well tested, yet problems still occur when they are composed, and most such problems are rooted in interaction and communication. To avoid such errors, a framework has to be developed for analysing software components; that framework determines the compatibility of corresponding components. The promising approach discussed here presents a novel technique for analysing software components by applying an Abstract Syntax Language Tree (ASLT). A supportive environment is designed that checks the compatibility of black-box software components. This article is concerned with the question of how coupled software components can be verified using an analyser framework, and it determines the usage of the ASLT. Black-box software components and the Abstract Syntax Language Tree are the basis for developing the proposed framework and are discussed here to provide the background knowledge. The practical implementation of this framework is discussed, and results are shown using a test environment.

  8. Ada Compiler Validation Summary Report: Certificate Number: 891126I1. 10221 Telesoft TeleGen2 Ada Development System for HP 9000/340 x MVME133a-20.

    Science.gov (United States)

    1989-11-26

Computer & Software Engineering Division, Institute for Defense Analyses, Alexandria, VA 22311. Ada Joint Program Office. ... are detected at compile time and the program executes to produce a PASSED message. Class 5 tests check that a compiler detects illegal language usage. ... by the following designations of hardware and software components: Host: Hewlett-Packard 9000/340 under HP/UX 6.5; Target: Motorola MVME133A-20

  9. Analysing lawyers’ attitude towards knowledge sharing

    Directory of Open Access Journals (Sweden)

    Wole M. Olatokun

    2012-02-01

Full Text Available Objectives: The study examined and identified the factors that affect lawyers' attitudes to knowledge sharing, and their knowledge sharing behaviour. Specifically, it investigated the relationship between the salient beliefs affecting lawyers' knowledge sharing attitudes, and applied a modified version of the Theory of Reasoned Action (TRA) in the knowledge sharing context to predict how these factors affect their knowledge sharing behaviour. Method: A field survey of 273 lawyers was carried out, using a questionnaire for data collection. Collected data on all variables were structured into grouped frequency distributions. Principal Component Factor Analysis was applied to reduce the constructs, and simple regression was applied to test the hypotheses at the 0.05 level of significance. Results: Expected associations and contributions were the major determinants of lawyers' attitudes towards knowledge sharing; expected reward was not significantly related. A positive attitude towards knowledge sharing was found to lead to a positive intention to share knowledge, although a positive intention to share knowledge did not significantly predict positive knowledge sharing behaviour. The level of Information Technology (IT) usage was also found to significantly affect the knowledge sharing behaviour of lawyers. Conclusion: It is recommended that law firms in the study area deploy more IT infrastructure and services that encourage effective knowledge sharing amongst lawyers.

  10. Monitoring and prediction of natural disasters

    International Nuclear Information System (INIS)

    Kondratyev, K. Ya; Krapivin, V. F.

    2004-01-01

The problems of predicting natural disasters, and of synthesising environmental monitoring systems to collect, store, and process the relevant information, are analysed. A three-level methodology is proposed for making decisions concerning natural disaster dynamics. The methodology is based on the assessment of environmental indicators and the use of numerical models of the environment

  11. Wrinkling prediction with adaptive mesh refinement

    NARCIS (Netherlands)

    Selman, A.; Meinders, Vincent T.; van den Boogaard, Antonius H.; Huetink, Han

    2000-01-01

An adaptive mesh refinement procedure for wrinkling prediction analyses is presented. First the critical values are determined using Hutchinson's bifurcation functional. A wrinkling risk factor is then defined and used to determine areas of potential wrinkling risk. Finally, the mesh is refined in these areas.

  12. Pathway analyses implicate glial cells in schizophrenia.

    Directory of Open Access Journals (Sweden)

    Laramie E Duncan

Full Text Available The quest to understand the neurobiology of schizophrenia and bipolar disorder is ongoing, with multiple lines of evidence indicating abnormalities of glia, mitochondria, and glutamate in both disorders. Despite high heritability estimates of 81% for schizophrenia and 75% for bipolar disorder, compelling links between findings from neurobiological studies and findings from large-scale genetic analyses are only beginning to emerge. Ten publicly available gene sets (pathways) related to glia, mitochondria, and glutamate were tested for association to schizophrenia and bipolar disorder using MAGENTA as the primary analysis method. To determine the robustness of associations, secondary analyses were performed with ALIGATOR, INRICH, and Set Screen. Data from the Psychiatric Genomics Consortium (PGC) were used for all analyses. There were 1,068,286 SNP-level p-values for schizophrenia (9,394 cases/12,462 controls) and 2,088,878 SNP-level p-values for bipolar disorder (7,481 cases/9,250 controls). The Glia-Oligodendrocyte pathway was associated with schizophrenia after correction for multiple tests, according to the primary analysis (MAGENTA p = 0.0005, 75% requirement for individual gene significance), and also achieved nominal levels of significance with INRICH (p = 0.0057) and ALIGATOR (p = 0.022). For bipolar disorder, Set Screen yielded nominally and method-wide significant associations to all three glial pathways, with the strongest association to the Glia-Astrocyte pathway (p = 0.002). Consistent with findings of white matter abnormalities in schizophrenia by other methods of study, the Glia-Oligodendrocyte pathway was associated with schizophrenia in our genomic study. These findings suggest that the abnormalities of myelination observed in schizophrenia are at least in part due to inherited factors, contrasted with the alternative of purely environmental causes (e.g. medication effects or lifestyle). While not the primary purpose of our study
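Gene-set methods such as those compared here test whether association signal concentrates in a pathway. The simplest cousin of such tests is hypergeometric over-representation, sketched below (the counts are invented, and MAGENTA itself works on GWAS p-values rather than a hard significant/non-significant gene list):

```python
from scipy.stats import hypergeom

M = 20_000   # genes in the genome (background)
K = 200      # genes in the pathway of interest
n = 500      # genes passing a significance cut genome-wide
k = 15       # of those, how many fall in the pathway (about 5 expected by chance)

# P(observing >= k pathway genes among the n significant ones by chance alone)
p = hypergeom.sf(k - 1, M, K, n)
print(p)
```

Methods like MAGENTA, INRICH, and ALIGATOR refine this idea by accounting for gene size, SNP density, and linkage disequilibrium rather than treating genes as exchangeable balls in an urn.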

  13. Predictive modeling of complications.

    Science.gov (United States)

    Osorio, Joseph A; Scheer, Justin K; Ames, Christopher P

    2016-09-01

    Predictive analytic algorithms are designed to identify patterns in the data that allow for accurate predictions without the need for a hypothesis. Therefore, predictive modeling can provide detailed and patient-specific information that can be readily applied when discussing the risks of surgery with a patient. There are few studies using predictive modeling techniques in the adult spine surgery literature. These types of studies represent the beginning of the use of predictive analytics in spine surgery outcomes. We will discuss the advancements in the field of spine surgery with respect to predictive analytics, the controversies surrounding the technique, and the future directions.

  14. Digital analyses of cartometric Fruska Gora guidelines

    Directory of Open Access Journals (Sweden)

    Živković Dragica

    2013-01-01

    Full Text Available Modern geomorphological research uses quantitative statistical and cartographic methods to analyse topographic relief features, and the connections between them, on the basis of good-quality numeric parameters. Such topographic features are important for a range of natural processes. The key morphological characteristics are slope angle, hypsometry, topographic exposition and so on. Even small, seemingly insignificant slope variations can deeply affect land configuration, hypsometry and topographic exposition. Exposition modifies light and heat and thereby a set of interconnected phenomena: soil and air temperature, soil disintegration, the length of the vegetation period, the intensity of photosynthesis, the yield of agricultural crops, the height of the snow limit, and so on. [Projekat Ministarstva nauke Republike Srbije, br. 176008 i br. III44006]

  15. Feasibility Analyses of Integrated Broiler Production

    Directory of Open Access Journals (Sweden)

    L. Komalasari

    2010-12-01

    Full Text Available The major obstacles in the development of broiler raising are the expensive price of feed and the fluctuating price of DOCs (day-old chicks). The cheap price of imported leg quarters reduces the competitiveness of local broilers. Therefore, an effort to increase production efficiency is needed through integration of broiler raising with corn farmers and feed producers (integrated farming). The purpose of this study is to analyze the feasibility of integrating broiler raising with corn cultivation and feed production. In addition, a simulation was conducted to analyze the effects of changes in DOC price, broiler price and production capacity. The analyses showed that integrated farming, like a mere combination of broiler raising with a feed factory at a 10,000-bird capacity, is not financially feasible. Increasing production to 25,000 broiler chickens makes integrated farming financially feasible. Unintegrated broiler raising is relatively sensitive to broiler price decreases and DOC price increases compared with integrated farming.

  16. Preserving the nuclear option: analyses and recommendations

    International Nuclear Information System (INIS)

    Anon.

    1985-01-01

    It is certain that a future role for nuclear power will depend on substantial changes in the management and regulation of the enterprise. It is widely believed that institutional, rather than technological, change is, at least in the short term, the key to resuscitating the nuclear option. Several recent analyses of the problems facing nuclear power, together with the current congressional hearings on the Nuclear Regulatory Commission's fiscal year 1986 budget request, have examined both the future of nuclear power and what can be done to address present institutional shortcomings. The congressional sessions have provided an indication of the views of both legislators and regulators, and this record, although mixed, generally shows continued optimism about the prospects of the nuclear option if needed reforms are accomplished.

  17. DEPUTY: analysing architectural structures and checking style

    International Nuclear Information System (INIS)

    Gorshkov, D.; Kochelev, S.; Kotegov, S.; Pavlov, I.; Pravilnikov, V.; Wellisch, J.P.

    2001-01-01

    The DepUty (dependencies utility) can be classified as a project and process management tool. The main goal of DepUty is to assist, by means of source code analysis and graphical representation using UML, in understanding dependencies of sub-systems and packages in CMS Object Oriented software, to understand architectural structure, and to schedule code release in modularised integration. It also allows a newcomer to more easily understand the global structure of CMS software, and to avoid circular dependencies up-front or re-factor the code in case it was already too close to the edge of non-maintainability. The authors will discuss the various views DepUty provides to analyse package dependencies, and illustrate both the metrics and style checking facilities it provides.

  18. Analysing Medieval Urban Space; a methodology

    Directory of Open Access Journals (Sweden)

    Marlous L. Craane MA

    2007-08-01

    Full Text Available This article has been written in reaction to recent developments in medieval history and archaeology, to study not only the buildings in a town but also the spaces that hold them together. It discusses a more objective and interdisciplinary approach for analysing urban morphology and use of space. It proposes a 'new' methodology by combining town plan analysis and space syntax. This methodology was trialled on the city of Utrecht in the Netherlands. By comparing the results of this 'new' methodology with the results of previous, more conventional, research, this article shows that space syntax can be applied successfully to medieval urban contexts. It does this by demonstrating a strong correlation between medieval economic spaces and the most integrated spaces, just as is found in the study of modern urban environments. It thus provides a strong basis for the use of this technique in future research of medieval urban environments.

  19. Spatial Analyses of Harappan Urban Settlements

    Directory of Open Access Journals (Sweden)

    Hirofumi Teramura

    2006-12-01

    Full Text Available The Harappan Civilization occupies a unique place among the early civilizations of the world with its well planned urban settlements, advanced handicraft and technology, and religious and trade activities. Using a Geographical Information System (GIS), this study presents spatial analyses that locate urban settlements on a digital elevation model (DEM) according to the three phases of early, mature and late. Understanding the relationship between the spatial distribution of Harappan sites and the change in factors such as topographic features, river passages or sea level will lead to an understanding of the dynamism of this civilization. It will also afford a glimpse of the factors behind the formation, development, and decline of the Harappan Civilization.

  20. An introduction to modern missing data analyses.

    Science.gov (United States)

    Baraldi, Amanda N; Enders, Craig K

    2010-02-01

    A great deal of recent methodological research has focused on two modern missing data analysis methods: maximum likelihood and multiple imputation. These approaches are advantageous over traditional techniques (e.g., deletion and mean imputation) because they require less stringent assumptions and mitigate the pitfalls of the traditional techniques. This article explains the theoretical underpinnings of missing data analyses, gives an overview of traditional missing data techniques, and provides accessible descriptions of maximum likelihood and multiple imputation. In particular, this article focuses on maximum likelihood estimation and presents two analysis examples from the Longitudinal Study of American Youth data. One of these examples includes a description of the use of auxiliary variables. Finally, the paper illustrates ways that researchers can use intentional, or planned, missing data to enhance their research designs.
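The variance-deflation pitfall of mean imputation that the article warns about can be shown in a few lines. This is a hedged NumPy sketch on synthetic data (not the LSAY data used by the authors):

```python
import numpy as np

# Illustrative sketch: compare two traditional missing-data fixes on a
# toy sample with ~30% of values missing completely at random.
rng = np.random.default_rng(0)
x = rng.normal(loc=50.0, scale=10.0, size=1000)
mask = rng.random(1000) < 0.3          # which values are "missing"
observed = x[~mask]

# Listwise deletion: analyse only the observed cases.
mean_deleted = observed.mean()
var_deleted = observed.var(ddof=1)

# Mean imputation: fill every gap with the observed mean.
imputed = x.copy()
imputed[mask] = mean_deleted
var_imputed = imputed.var(ddof=1)

# Mean imputation leaves the point estimate unchanged but artificially
# deflates the variance -- one of the pitfalls the article says maximum
# likelihood and multiple imputation avoid.
print(var_deleted, var_imputed)
```

Because every imputed value sits exactly at the mean, the sum of squared deviations is unchanged while the nominal sample size grows, so the estimated variance is always understated.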

  1. Reliability and safety analyses under fuzziness

    International Nuclear Information System (INIS)

    Onisawa, T.; Kacprzyk, J.

    1995-01-01

    Fuzzy theory, for example possibility theory, is compatible with probability theory. What has been shown so far is that probability theory need not be replaced by fuzzy theory, but rather that the former works much better in applications if it is combined with the latter. In fact, there are two essential uncertainties in the field of reliability and safety analyses: one is probabilistic uncertainty, which is more relevant for mechanical systems and the natural environment; the other is fuzziness (imprecision) caused by the presence of human beings in systems. Classical probability theory alone is therefore not sufficient to deal with uncertainties in humanistic systems. In this context, this collection of works marks a milestone in the debate between probability theory and fuzzy theory. The volume covers fault analysis, lifetime analysis, reliability, quality control, safety analysis and risk analysis. (orig./DG). 106 figs

  2. Project analysis and integration economic analyses summary

    Science.gov (United States)

    Macomber, H. L.

    1986-01-01

    An economic-analysis summary was presented for the manufacture of crystalline-silicon modules involving silicon ingot/sheet growth, slicing, cell manufacture, and module assembly. Economic analyses provided: useful quantitative aspects for complex decision-making to the Flat-plate Solar Array (FSA) Project; yardsticks for design and performance to industry; and a demonstration of how to evaluate and understand the worth of research and development both to JPL and to other government agencies and programs. It was concluded that future research and development funds for photovoltaics must be provided by the Federal Government because the solar industry today does not reap enough profits from its present-day sales of photovoltaic equipment.

  3. Thermal hydraulic reactor safety analyses and experiments

    International Nuclear Information System (INIS)

    Holmstroem, H.; Eerikaeinen, L.; Kervinen, T.; Kilpi, K.; Mattila, L.; Miettinen, J.; Yrjoelae, V.

    1989-04-01

    The report introduces the results of the thermal hydraulic reactor safety research performed in the Nuclear Engineering Laboratory of the Technical Research Centre of Finland (VTT) during the years 1972-1987. Practical applications, i.e. analyses for the safety authorities and power companies, are also presented. The emphasis is on description of the state-of-the-art know-how. The report describes VTT's most important computer codes, both those of foreign origin and those developed at VTT, and their assessment work, VTT's own experimental research, as well as international experimental projects and other forms of cooperation VTT has participated in. Appendix 8 contains a comprehensive list of the most important publications and technical reports produced. They present the content and results of the research in detail. (orig.)

  4. Precursor analyses for German nuclear power plants

    International Nuclear Information System (INIS)

    Babst, Siegfried; Gaenssmantel, Gerhard; Stueck, Reinhard

    2009-01-01

    Precursor analysis is an internationally recognized method for quantifying the safety relevance of operational events in nuclear power plants. Precursors are operational events which had no serious impact, but which could have led to serious consequences if additional malfunctions had occurred. Examples of such operational events are component failures or transients, for example the loss of main feedwater. On the basis of the probabilities for the occurrence of additional malfunctions or initiating events, precursor analyses determine the probability with which additional malfunctions, had they occurred during the event, would have led to core damage. This conditional probability is a measure of the safety relevance of the operational event. Events for which the conditional probability of core damage is > 10^-6 are internationally classified as ''precursors''. (orig.)
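The conditional-probability logic of precursor analysis can be sketched in a few lines. The failure probabilities below are invented for illustration and come from no real probabilistic safety assessment:

```python
# Toy precursor calculation: given that an initiating event (e.g. loss
# of main feedwater) has already occurred, the conditional core damage
# probability (CCDP) sums, over accident sequences, the probabilities
# of the *additional* failures each sequence would need.
p_aux_feedwater_fails = 1e-3   # hypothetical demand failure probability
p_feed_and_bleed_fails = 1e-2  # hypothetical backup-measure failure

# In this one-sequence toy event tree, core damage requires both
# backup measures to fail independently.
ccdp = p_aux_feedwater_fails * p_feed_and_bleed_fails

# Events with CCDP > 1e-6 are internationally classified as precursors.
is_precursor = ccdp > 1e-6
print(ccdp, is_precursor)
```

A real analysis would enumerate many sequences from plant-specific event and fault trees; the point here is only the conditional structure of the calculation.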

  5. The plant design analyser and its applications

    International Nuclear Information System (INIS)

    Whitmarsh-Everiss, M.J.

    1992-01-01

    Consideration is given to the history of computational methods for the non-linear dynamic analysis of plant behaviour. This is traced from analogue to hybrid computers. When these were phased out, simulation languages were used in batch mode and the interactive computational capabilities were lost. These have subsequently been recovered using mainframe computing architecture, in the context of small models, using the Prototype Plant Design Analyser. Given the development of parallel processing architectures, the restriction on model size can be lifted. This capability, together with the use of advanced workstations and graphics software, has enabled an advanced interactive design environment to be developed. The system is generic and can be used, with suitable graphics development, to study the dynamics and control behaviour of any plant or system for minimum cost. Examples of past and possible future uses are identified. (author)

  6. Modelling and Analysing Socio-Technical Systems

    DEFF Research Database (Denmark)

    Aslanyan, Zaruhi; Ivanova, Marieta Georgieva; Nielson, Flemming

    2015-01-01

    Modern organisations are complex, socio-technical systems consisting of a mixture of physical infrastructure, human actors, policies and processes. An increasing number of attacks on these organisations exploits vulnerabilities on all different levels, for example combining a malware attack with social engineering. Due to this combination of attack steps on technical and social levels, risk assessment in socio-technical systems is complex. Therefore, established risk assessment methods often abstract away the internal structure of an organisation and ignore human factors when modelling and assessing attacks. In our work we model all relevant levels of socio-technical systems, and propose evaluation techniques for analysing the security properties of the model. Our approach simplifies the identification of possible attacks and provides qualified assessment and ranking of attacks based...

  7. Antarctic observations available for IMS correlative analyses

    International Nuclear Information System (INIS)

    Rycroft, M.J.

    1982-01-01

    A review is provided of the wide-ranging observational programs of 25 stations operating on and around the continent of Antarctica during the International Magnetospheric Study (IMS). Attention is given to observations of geomagnetism, short period fluctuations of the earth's electromagnetic field, observations of the ionosphere and of whistler mode signals, observational programs in ionospheric and magnetospheric physics, upper atmosphere physics observations, details of magnetospheric programs conducted at Kerguelen, H-component magnetograms, magnetic field line oscillations, dynamic spectra of whistlers, and the variation of plasmapause position derived from recorded whistlers. The considered studies suggest that, in principle, if the level of magnetic activity is known, predictions can be made concerning the time at which the trough occurs, and the shape and the movement of the main trough.

  8. Analysing CMS transfers using Machine Learning techniques

    CERN Document Server

    Diotalevi, Tommaso

    2016-01-01

    LHC experiments transfer more than 10 PB/week between all grid sites using the FTS transfer service. In particular, CMS manages almost 5 PB/week of FTS transfers with PhEDEx (Physics Experiment Data Export). FTS sends metrics about each transfer (e.g. transfer rate, duration, size) to a central HDFS storage at CERN. The work done during these three months as a Summer Student involved the use of ML techniques, within a CMS framework called DCAFPilot, to process this new data and generate predictions of transfer latencies on all links between Grid sites. This analysis will provide, as a future service, the information needed to proactively identify, and possibly fix, latency-affected transfers over the WLCG.
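As a toy illustration of the kind of prediction described (not the actual DCAFPilot/ML pipeline), one can fit a linear model of transfer duration against transfer size on synthetic FTS-like metrics:

```python
import numpy as np

# Synthetic "FTS metrics": transfer size in GB and measured duration in
# seconds over a hypothetical link sustaining about 0.5 GB/s.
rng = np.random.default_rng(42)
size_gb = rng.uniform(1.0, 100.0, 200)
rate = 0.5                                            # GB/s, invented
duration = size_gb / rate + rng.normal(0.0, 2.0, 200)  # plus noise

# Ordinary least squares fit: duration ≈ a * size + b.
A = np.column_stack([size_gb, np.ones_like(size_gb)])
(a, b), *_ = np.linalg.lstsq(A, duration, rcond=None)

# Predicted latency of a new 50 GB transfer on this link.
predicted = a * 50.0 + b
print(a, b, predicted)
```

The fitted slope should land near 1/rate = 2.0 s/GB; a production service would of course use richer features (link, time of day, concurrent load) and a nonlinear model.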

  9. Isotropy analyses of the Planck convergence map

    Science.gov (United States)

    Marques, G. A.; Novaes, C. P.; Bernui, A.; Ferreira, I. S.

    2018-01-01

    The presence of matter in the path of relic photons causes distortions in the angular pattern of the cosmic microwave background (CMB) temperature fluctuations, modifying their properties in a slight but measurable way. Recently, the Planck Collaboration released the estimated convergence map, an integrated measure of the large-scale matter distribution that produced the weak gravitational lensing (WL) phenomenon observed in Planck CMB data. We perform exhaustive analyses of this convergence map, calculating the variance in small and large regions of the sky, but excluding the area masked due to Galactic contamination, and compare them with the features expected in the set of simulated convergence maps also released by the Planck Collaboration. Our goal is to search for sky directions or regions where the WL imprints anomalous signatures on the variance estimator, as revealed through a χ² analysis at a statistically significant level. In the local analysis of the Planck convergence map, we identified eight patches of the sky in disagreement, at more than 2σ, with what is observed in the average of the simulations. In contrast, in the large-regions analysis we found no statistically significant discrepancies but, interestingly, the regions with the highest χ² values surround the ecliptic poles. Thus, our results show a good agreement with the features expected in the Λ cold dark matter concordance model, as given by the simulations. Yet, the outlier regions found here could suggest that the data still contain residual contamination, like noise, due to over- or underestimation of systematic effects in the simulation data set.
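The local variance test can be sketched schematically. The following NumPy example uses synthetic Gaussian "patches" and "simulations" (not the Planck products) and flags patches whose variance deviates from the simulation ensemble by more than 2σ:

```python
import numpy as np

# Synthetic stand-ins: n_patches sky patches, each with pix_per_patch
# pixels, observed once ("data") and n_sims times ("simulations").
rng = np.random.default_rng(1)
n_patches, n_sims, pix_per_patch = 8, 300, 500

sims = rng.normal(0.0, 1.0, (n_sims, n_patches, pix_per_patch))
data = rng.normal(0.0, 1.0, (n_patches, pix_per_patch))
data[3] *= 1.5        # inject one anomalous patch by hand

var_data = data.var(axis=1)       # per-patch variance of the data
var_sims = sims.var(axis=2)       # per-patch variances across sims
mu, sigma = var_sims.mean(axis=0), var_sims.std(axis=0)

# Flag patches deviating by more than 2 sigma from the ensemble.
deviation = (var_data - mu) / sigma
anomalous = np.where(np.abs(deviation) > 2.0)[0]
print(anomalous)
```

The injected patch (index 3) is flagged strongly; with only a 2σ cut, the occasional chance fluctuation in a clean patch can be flagged too, which is why the paper quotes significance levels against the full simulation set.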

  10. GPU based framework for geospatial analyses

    Science.gov (United States)

    Cosmin Sandric, Ionut; Ionita, Cristian; Dardala, Marian; Furtuna, Titus

    2017-04-01

    Parallel processing on multiple CPU cores is already used at large scale in geocomputing, but parallel processing on graphics cards is just beginning. Being able to use a simple laptop with a dedicated graphics card for advanced and very fast geocomputation is an advantage that every scientist wants to have. The need for high-speed computation in the geosciences has increased in the last 10 years, mostly due to the increase in available datasets. These datasets are becoming more and more detailed and hence require more space to store and more time to process. Distributed computation on multicore CPUs and GPUs plays an important role by processing these big datasets in small parts. This way of computing speeds up the process because, instead of using just one process per dataset, the user can use all the cores of a CPU or up to hundreds of cores of a GPU. The framework provides end users with standalone tools for morphometric analyses at multiple scales. An important part of the framework is dedicated to uncertainty propagation in geospatial analyses. The uncertainty may come from data collection, may be induced by the model, or may have any number of other sources. These uncertainties play an important role when a spatial delineation of a phenomenon is modelled. Uncertainty propagation is implemented inside the GPU framework using Monte Carlo simulations. The GPU framework with its standalone tools has proved to be a reliable tool for modelling complex natural phenomena. The framework is based on NVidia CUDA technology and is written in the C++ programming language. The source code will be available on GitHub at https://github.com/sandricionut/GeoRsGPU Acknowledgement: GPU framework for geospatial analysis, Young Researchers Grant (ICUB-University of Bucharest) 2016, director Ionut Sandric
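The Monte Carlo approach to uncertainty propagation mentioned above can be illustrated with a minimal NumPy sketch (plain CPU code, not the CUDA implementation; the elevations and error figures are invented):

```python
import numpy as np

# Monte Carlo propagation of DEM elevation error into a derived
# morphometric quantity: the slope between two neighbouring cells.
rng = np.random.default_rng(7)
n_runs = 5000
cell_size = 30.0                 # metres, hypothetical DEM resolution
z1, z2 = 110.0, 100.0            # elevations of two adjacent cells
sigma_z = 1.5                    # assumed elevation error (1 sigma)

# Each run draws a plausible pair of elevations from the error model...
e1 = rng.normal(z1, sigma_z, n_runs)
e2 = rng.normal(z2, sigma_z, n_runs)
# ...and computes the derived slope (degrees) for that realisation.
slope = np.degrees(np.arctan((e1 - e2) / cell_size))

# The ensemble mean and spread quantify the propagated uncertainty.
print(slope.mean(), slope.std())
```

Each Monte Carlo run is independent, which is exactly why the method maps well onto hundreds of GPU cores.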

  11. The radiation analyses of ITER lower ports

    International Nuclear Information System (INIS)

    Petrizzi, L.; Brolatti, G.; Martin, A.; Loughlin, M.; Moro, F.; Villari, R.

    2010-01-01

    The ITER Vacuum Vessel has upper, equatorial, and lower ports used for equipment installation, diagnostics, heating and current drive systems, cryo-vacuum pumping, and access inside the vessel for maintenance. At the level of the divertor, the nine lower ports for remote handling, cryo-vacuum pumping and diagnostics are inclined downwards and toroidally spaced every 40°. The cryopump port additionally has a branch to accommodate a second cryopump. The ports, as openings in the Vacuum Vessel, permit radiation streaming out of the vessel, which affects the heating of components in the outer regions of the machine inside and outside the ports. Safety concerns are also raised with respect to the dose after shutdown at the cryostat behind the ports: in such zones the radiation dose level must be kept below the regulatory limit to allow personnel access for maintenance purposes. Neutronic analyses have been required to qualify the ITER project with respect to the lower ports. A 3-D model was used to take into account full details of the ports and the lower machine surroundings. MCNP version 5 1.40 has been used with the FENDL 2.1 nuclear data library. The ITER 40° model distributed by the ITER Organization was developed in the lower part to include the relevant details. The results of a first analysis, focused on the cryopump system only, were recently published. In this paper more complete data on the cryopump port and analyses for the remote handling port and the diagnostic rack are presented; the results of both analyses give a complete map of the radiation loads in the outer divertor ports. Nuclear heating, dpa, tritium production, and dose rates after shutdown are provided and the implications for the design are discussed.

  12. Risques naturels en montagne et analyse spatiale

    Directory of Open Access Journals (Sweden)

    Yannick Manche

    1999-06-01

    Full Text Available The concept of risk rests on two notions: the hazard, which represents the physical phenomenon through its magnitude and return period; and the vulnerability, which represents all of the assets and people that can be affected by a natural phenomenon. Risk is then defined as the intersection of these two notions. This theoretical view makes it possible to model hazards and vulnerability independently. This work focuses mainly on taking vulnerability into account in the management of natural risks. Its assessment necessarily involves a form of spatial analysis that takes into account human occupation and the different scales of land use. But the spatial assessment, whether of assets and people or of indirect effects, runs into numerous problems. The importance of land occupation has to be estimated. Moreover, processing the data involves constant changes of scale to pass from point elements to surfaces, which geographic information systems do not handle perfectly. Risk management imposes strong planning constraints; taking vulnerability into account makes it possible to better understand and manage the spatial constraints that natural risks imply. Keywords: hazard, spatial analysis, natural risks, GIS, vulnerability

  13. Analyses of tropistic responses using metabolomics.

    Science.gov (United States)

    Millar, Katherine D L; Kiss, John Z

    2013-01-01

    Characterization of phototropism and gravitropism has been pursued through gene expression studies, assessment of curvature responses, and protein expression experiments. To our knowledge, the current study is the first to determine how the metabolome, the complete set of small-molecule metabolites within a plant, is affected during these tropisms. We have determined the metabolic profile of plants during gravitropism and phototropism. Seedlings of Arabidopsis thaliana wild type (WT) and the phyB mutant were exposed to unidirectional light (red or blue) or reoriented to induce a tropistic response, and small-molecule metabolites were assayed and quantified. A subset of the WT samples was analyzed using microarray experiments to obtain gene profiling data. Analyses of the metabolomic data using principal component analysis showed a common profile in the WT during the different tropistic curvatures, but phyB mutants produced a distinctive profile for each tropism. Interestingly, the gravity treatment elicited the greatest changes in gene expression of the WT, followed by blue light, then by red light treatments. For all tropisms, we identified genes in carbohydrate metabolism and secondary metabolism that were downregulated by a large magnitude. These included ATCSLA15, CELLULOSE SYNTHASE-LIKE, and ATCHS/SHS/TT4, CHALCONE SYNTHASE. In addition, genes involved in amino acid biosynthesis were strongly upregulated, including THA1 (THREONINE ALDOLASE 1) and ASN1 (DARK INDUCIBLE asparagine synthase). We have established the first metabolic profile of tropisms in conjunction with transcriptomic analyses. This approach has been useful in characterizing the similarities and differences in the molecular mechanisms involved in phototropism and gravitropism.
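The principal component analysis step can be sketched with NumPy on a synthetic metabolite matrix (the data and group structure below are invented, not the study's measurements):

```python
import numpy as np

# Synthetic metabolite-abundance matrix: 12 samples x 40 metabolites,
# in two treatment groups whose first five metabolites differ, loosely
# mimicking two distinct tropism profiles.
rng = np.random.default_rng(3)
n_metabolites = 40
group_a = rng.normal(0.0, 1.0, (6, n_metabolites))
group_b = rng.normal(0.0, 1.0, (6, n_metabolites))
group_b[:, :5] += 4.0                     # shifted metabolites

X = np.vstack([group_a, group_b])
Xc = X - X.mean(axis=0)                   # centre each metabolite

# PCA via singular value decomposition of the centred matrix.
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * s                            # sample coordinates on PCs

# If the group shift dominates the variance, PC1 separates the groups.
pc1_a, pc1_b = scores[:6, 0], scores[6:, 0]
print(pc1_a.mean(), pc1_b.mean())
```

In the study's setting the samples would be the tropism treatments and the columns the quantified metabolites; the PCA projection is what reveals whether profiles cluster by treatment.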

  14. A 20-year-old man with large gastric lipoma--imaging, clinical symptoms, pathological findings and surgical treatment.

    Science.gov (United States)

    Kapetanakis, Stylianos; Papathanasiou, Jiannis; Fiska, Aliki; Ververidis, Athanasios; Dimitriou, Thespis; Hristov, Zheliazko; Paskalev, George

    2010-01-01

    A broad search of the available literature yielded no other report of a gastric lipoma of this size (13.5 x 6.5 x 4.5 cm) at such an early age. The patient (a 20-year-old man with a giant lipoma in the anterior gastric wall) presented with haematemesis and melena after excessive alcohol consumption. Gastric resection was performed. At 5-year follow-up the patient is healthy and doing well. The epidemiology of gastric lipoma, the differential diagnosis, and the means of diagnosis and treatment are discussed.

  15. Predicting outdoor sound

    CERN Document Server

    Attenborough, Keith; Horoshenkov, Kirill

    2014-01-01

    1. Introduction
    2. The Propagation of Sound Near Ground Surfaces in a Homogeneous Medium
    3. Predicting the Acoustical Properties of Outdoor Ground Surfaces
    4. Measurements of the Acoustical Properties of Ground Surfaces and Comparisons with Models
    5. Predicting Effects of Source Characteristics on Outdoor Sound
    6. Predictions, Approximations and Empirical Results for Ground Effect Excluding Meteorological Effects
    7. Influence of Source Motion on Ground Effect and Diffraction
    8. Predicting Effects of Mixed Impedance Ground
    9. Predicting the Performance of Outdoor Noise Barriers
    10. Predicting Effects of Vegetation, Trees and Turbulence
    11. Analytical Approximations including Ground Effect, Refraction and Turbulence
    12. Prediction Schemes
    13. Predicting Sound in an Urban Environment.

  16. The prediction of different experiences of longterm illness

    DEFF Research Database (Denmark)

    Blank, N; Diderichsen, Finn

    1996-01-01

    To analyse the role played by socioeconomic factors and self rated general health in the prediction of the reporting of severe longterm illness, and the extent to which these factors explain social class differences in the reporting of such illness.

  17. High performance liquid chromatography in pharmaceutical analyses.

    Science.gov (United States)

    Nikolin, Branko; Imamović, Belma; Medanhodzić-Vuk, Saira; Sober, Miroslav

    2004-05-01

    Over the last ten years, in pre-market testing of drugs, in their marketing and in their control, high performance liquid chromatography has replaced numerous spectroscopic methods and gas chromatography in quantitative and qualitative analysis. In the first period of HPLC application it was thought that it would become a complementary method to gas chromatography; today, however, it has nearly completely replaced gas chromatography in pharmaceutical analysis. The use of a liquid mobile phase, with the possibility of modifying its polarity during chromatography, and all other modifications of the mobile phase depending on the characteristics of the substance being tested, is a great advantage in the separation process in comparison with other methods. The wide choice of stationary phases is a further factor enabling good separation. The separation line is connected to specific and sensitive detector systems (spectrofluorimeter, diode-array detector, electrochemical detector) as well as hyphenated systems such as HPLC-MS and HPLC-NMR; these are the basic elements on which the wide and effective application of the HPLC method is based. The purpose of high performance liquid chromatography (HPLC) analysis of any drug is to confirm the identity of the drug and provide quantitative results, and also to monitor the progress of therapy of a disease. Figure 1 shows a chromatogram obtained for the plasma of depressed patients 12 h before oral administration of dexamethasone. HPLC may also be used to further our understanding of normal and disease processes in the human body through biomedical and therapeutic research during the investigations preceding drug registration. The analysis of drugs and metabolites in biological fluids, particularly plasma, serum or urine, is one of the most demanding but also one of the most common uses of high performance liquid chromatography. Blood, plasma or serum contains numerous endogenous

  18. Structural and mutational analyses of cis-acting sequences in the 5'-untranslated region of satellite RNA of bamboo mosaic potexvirus

    International Nuclear Information System (INIS)

    Annamalai, Padmanaban; Hsu, Y.-H.; Liu, Y.-P.; Tsai, C.-H.; Lin, N.-S.

    2003-01-01

    The satellite RNA of Bamboo mosaic virus (satBaMV) contains an open reading frame for a 20-kDa protein that is flanked by a 5'-untranslated region (UTR) of 159 nucleotides (nt) and a 3'-UTR of 129 nt. A secondary structure was predicted for the 5'-UTR of satBaMV RNA, which folds into a large stem-loop (LSL) and a small stem-loop. Enzymatic probing confirmed the existence of the LSL (nt 8-138) in the 5'-UTR. The essential cis-acting sequences in the 5'-UTR required for satBaMV RNA replication were determined by deletion and substitution mutagenesis. Replication efficiencies were analyzed in Nicotiana benthamiana protoplasts and Chenopodium quinoa plants coinoculated with helper BaMV RNA. All deletion mutants abolished the replication of satBaMV RNA, whereas mutations introduced into most of the loop regions and stems resulted in either no replication or a decreased replication efficiency. Mutations that affected positive-strand satBaMV RNA accumulation also affected the accumulation of negative-strand RNA; however, the accumulation of genomic and subgenomic RNAs of BaMV was not affected. Moreover, covariation analyses of natural satBaMV variants provide substantial evidence that the secondary structure in the 5'-UTR of satBaMV is necessary for efficient replication

  19. Analysing Scenarios of Cell Population System Development

    Directory of Open Access Journals (Sweden)

    M. S. Vinogradova

    2014-01-01

    Full Text Available The article considers an isolated population system consisting of two types of human stem cells, namely normal cells and cells with chromosomal abnormalities (abnormal ones). The system develops in the laboratory (in vitro). The article analyses possible scenarios of the population system development, which are realized for different values of its parameters. The investigated model of the cell population system takes limited resources into account. It is represented as a system of two nonlinear differential equations with a continuous right-hand side. The model is considered for non-negative values of the variables, and the domain is divided into four sets. A feature of the model is that in each set the right-hand side of the system of differential equations has a different form. The article analyses the character of the rest points of the system in each of the four sets. Analytical conditions are obtained for the number and character of the rest points that have at least one zero coordinate. It is shown that the population system under study cannot have more than two rest points with both coordinates positive (non-zero). Determining the character of such rest points as a function of the model parameters is difficult because of the complexity of the expressions defining the first-approximation systems recorded in a neighborhood of these rest points. Numerical results on the stability of these rest points are obtained, and phase portraits for specific values of the system parameters are demonstrated. The main scenarios for cell population development are presented. Analysis of the mathematical model shows that the cell population system may remain a system consisting of populations of normal and abnormal cells; it can degenerate into a population of abnormal cells, or it can perish. The scenario in which only the population of normal cells remains is not realized. The numerical simulation
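The two-population dynamics described above can be illustrated with a minimal sketch. The logistic-competition form, the growth rates and the shared capacity below are assumptions chosen for illustration, not the article's actual equations:

```python
def step(n, a, dt=0.01, r_n=1.0, r_a=0.8, K=1.0):
    """One Euler step of an assumed logistic competition model in which
    normal (n) and abnormal (a) cells share a limited resource capacity K."""
    growth = 1.0 - (n + a) / K   # shared resource limitation
    dn = r_n * n * growth
    da = r_a * a * growth
    return n + dt * dn, a + dt * da

n, a = 0.1, 0.01                 # initial fractions of normal and abnormal cells
for _ in range(5000):
    n, a = step(n, a)
# both populations persist and the total saturates near K (a coexistence scenario)
```

Depending on the rates and initial conditions, variants of this sketch reproduce the other scenarios the article lists, such as degeneration into one population or extinction.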

  20. Scanning electron microscopy and micro-analyses

    International Nuclear Information System (INIS)

    Brisset, F.; Repoux, L.; Ruste, J.; Grillon, F.; Robaut, F.

    2008-01-01

    Scanning electron microscopy (SEM) and the related micro-analyses are involved in extremely various domains, from academic environments to industrial ones. The overall theoretical bases, the main technical characteristics, and some complementary information about practical usage and maintenance are developed in this book. High-vacuum and controlled-vacuum electron microscopes are thoroughly presented, as well as the latest generation of EDS (energy dispersive spectrometer) and WDS (wavelength dispersive spectrometer) micro-analysers. Besides these main topics, other analysis or observation techniques are approached, such as EBSD (electron backscattering diffraction), 3-D imaging, FIB (focussed ion beams), Monte Carlo simulations, in-situ tests, etc. This book, in French, is the only one that treats this subject in such an exhaustive way. It is a fully updated version of a previous edition of 1979. It gathers the lectures given in 2006 at the summer school of Saint Martin d'Heres (France).
Content: 1 - electron-matter interactions; 2 - characteristic X-radiation, Bremsstrahlung; 3 - electron guns in SEM; 4 - elements of electronic optics; 5 - vacuum techniques; 6 - detectors used in SEM; 7 - image formation and optimization in SEM; 7a - SEM practical instructions for use; 8 - controlled pressure microscopy; 8a - applications; 9 - energy selection X-spectrometers (energy dispersive spectrometers - EDS); 9a - EDS analysis; 9b - X-EDS mapping; 10 - technological aspects of WDS; 11 - processing of EDS and WDS spectra; 12 - X-microanalysis quantifying methods; 12a - quantitative WDS microanalysis of very light elements; 13 - statistics: precision and detection limits in microanalysis; 14 - analysis of stratified samples; 15 - crystallography applied to EBSD; 16 - EBSD: history, principle and applications; 16a - EBSD analysis; 17 - Monte Carlo simulation; 18 - insulating samples in SEM and X-ray microanalysis; 18a - insulating

  1. Uncertainty Analyses for Back Projection Methods

    Science.gov (United States)

    Zeng, H.; Wei, S.; Wu, W.

    2017-12-01

    So far, few comprehensive error analyses for back projection methods have been conducted, although it is evident that high-frequency seismic waves can be easily affected by earthquake depth, focal mechanisms and the Earth's 3D structures. Here we perform 1D and 3D synthetic tests for two back projection methods, MUltiple SIgnal Classification (MUSIC) (Meng et al., 2011) and Compressive Sensing (CS) (Yao et al., 2011). We generate synthetics for both point sources and finite rupture sources with different depths and focal mechanisms, as well as 1D and 3D structures in the source region. The 3D synthetics are generated through a hybrid scheme combining the Direct Solution Method and the Spectral Element Method. We then back project the synthetic data using MUSIC and CS. The synthetic tests show that the depth phases can be back projected as artificial sources both in space and time. For instance, for a source depth of 10 km, back projection gives a strong signal 8 km away from the true source. Such bias increases with depth; e.g., the error in horizontal location can exceed 20 km for a depth of 40 km. If the array is located around the nodal direction of the direct P-waves, the teleseismic P-waves are dominated by the depth phases, and back projections are then actually imaging the reflection points of the depth phases rather than the rupture front. Besides depth phases, the strong and long-lasting coda waves caused by 3D effects near the trench can introduce additional complexities, which we also test here. The contrast in strength of different frequency contents in the rupture models also produces some variation in the back projection results. In the synthetic tests, MUSIC and CS derive consistent results. While MUSIC is more computationally efficient, CS works better for sparse arrays. In summary, our analyses indicate that the impact of the various factors mentioned above should be taken into consideration when interpreting back projection images, before we can use them to infer earthquake rupture physics.

  2. Prediction of postoperative pain: a systematic review of predictive experimental pain studies

    DEFF Research Database (Denmark)

    Werner, Mads Utke; Mjöbo, Helena N; Nielsen, Per R

    2010-01-01

    preoperative responses to experimental pain stimuli and clinical postoperative pain and demonstrates that the preoperative pain tests may predict 4-54% of the variance in postoperative pain experience depending on the stimulation methods and the test paradigm used. The predictive strength is much higher than...... previously reported for single factor analyses of demographics and psychologic factors. In addition, some of these studies indicate that an increase in preoperative pain sensitivity is associated with a high probability of development of sustained postsurgical pain....

  3. Rice A20/AN1 zinc-finger containing stress-associated proteins (SAP1/11) and a receptor-like cytoplasmic kinase (OsRLCK253) interact via A20 zinc-finger and confer abiotic stress tolerance in transgenic Arabidopsis plants.

    Science.gov (United States)

    Giri, Jitender; Vij, Shubha; Dansana, Prasant K; Tyagi, Akhilesh K

    2011-08-01

    • The inbuilt mechanisms of plant survival have been exploited for improving tolerance to abiotic stresses. Stress-associated proteins (SAPs), containing A20/AN1 zinc-finger domains, confer abiotic stress tolerance in different plants; however, their interacting partners and downstream targets remain to be identified. • In this study, we have investigated the subcellular interactions of rice SAPs and their interacting partner using yeast two-hybrid and fluorescence resonance energy transfer (FRET) approaches. Their efficacy in improving abiotic stress tolerance was analysed in transgenic Arabidopsis plants. Genome-wide microarray analysis of gene expression in the transgenics was used to identify downstream targets. • It was found that the A20 domain mediates the interaction of OsSAP1 with itself, its close homolog OsSAP11 and a rice receptor-like cytoplasmic kinase, OsRLCK253. Such interactions between OsSAP1/11 and with OsRLCK253 occur at the nuclear membrane, the plasma membrane and in the nucleus. Functionally, both OsSAP11 and OsRLCK253 could improve water-deficit and salt stress tolerance in transgenic Arabidopsis plants via a signaling pathway affecting the expression of several common endogenous genes. • Components of a novel stress-responsive pathway have been identified. Their stress-inducible expression provided protection against yield loss in transgenic plants, indicating the agronomic relevance of OsSAP11 and OsRLCK253 in conferring abiotic stress tolerance. © 2011 The Authors. New Phytologist © 2011 New Phytologist Trust.

  4. Network-Based and Binless Frequency Analyses.

    Directory of Open Access Journals (Sweden)

    Sybil Derrible

    Full Text Available We introduce and develop a new network-based and binless methodology to perform frequency analyses and produce histograms. In contrast with traditional frequency analysis techniques that use fixed intervals to bin values, we place a range ±ζ around each individual value in a data set and count the number of values within that range, which allows us to compare every single value of a data set with every other. In essence, the methodology is identical to the construction of a network, where two values are connected if they lie within a given range (±ζ). The value with the highest degree (i.e., most connections) is therefore taken as the mode of the distribution. To select an optimal range, we look at the stability of the proportion of nodes in the largest cluster. The methodology is validated by sampling 12 typical distributions, and it is applied to a number of real-world data sets with both spatial and temporal components. The methodology can be applied to any data set and provides a robust means to uncover meaningful patterns and trends. A free python script and a tutorial are also made available to facilitate the application of the method.
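The counting scheme described above is straightforward to sketch. This minimal version (not the authors' released script) builds the degree of each value directly from pairwise distances:

```python
import numpy as np

def binless_degrees(values, zeta):
    """For each value, count the other values lying within +/- zeta of it.
    This is the node degree in the network described in the abstract."""
    v = np.asarray(values, dtype=float)
    within = np.abs(v[:, None] - v[None, :]) <= zeta  # pairwise adjacency
    return within.sum(axis=1) - 1                     # drop the self-connection

data = [1.0, 1.1, 1.2, 5.0, 5.05, 5.1, 5.15, 9.0]
deg = binless_degrees(data, zeta=0.2)
mode = data[int(np.argmax(deg))]   # highest-degree value, taken as the mode
```

With ζ = 0.2 the dense cluster around 5 has the highest degrees, so the mode falls there; an isolated value such as 9.0 has degree zero.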

  5. Analyses of demand response in Denmark

    International Nuclear Information System (INIS)

    Moeller Andersen, F.; Grenaa Jensen, S.; Larsen, Helge V.; Meibom, P.; Ravn, H.; Skytte, K.; Togeby, M.

    2006-10-01

    Due to characteristics of the power system, costs of producing electricity vary considerably over short time intervals. Yet, many consumers do not experience corresponding variations in the price they pay for consuming electricity. The topic of this report is: are consumers willing and able to respond to short-term variations in electricity prices, and if so, what is the social benefit of consumers doing so? Taking Denmark and the Nord Pool market as a case, the report focuses on what is known as short-term consumer flexibility or demand response in the electricity market. With focus on market efficiency, efficient allocation of resources and security of supply, the report describes demand response from a micro-economic perspective and provides empirical observations and case studies. The report aims at evaluating benefits from demand response. However, only elements contributing to an overall value are presented. In addition, the analyses are limited to benefits for society, and costs of obtaining demand response are not considered. (au)

  6. Recent advances in cellular glycomic analyses.

    Science.gov (United States)

    Furukawa, Jun-Ichi; Fujitani, Naoki; Shinohara, Yasuro

    2013-02-21

    A large variety of glycans is intricately located on the cell surface, and the overall profile (the glycome, given the entire repertoire of glycoconjugate-associated sugars in cells and tissues) is believed to be crucial for the diverse roles of glycans, which are mediated by specific interactions that control cell-cell adhesion, immune response, microbial pathogenesis and other cellular events. The glycomic profile also reflects cellular alterations, such as development, differentiation and cancerous change. A glycoconjugate-based approach would therefore be expected to streamline discovery of novel cellular biomarkers. Development of such an approach has proven challenging, due to the technical difficulties associated with the analysis of various types of cellular glycomes; however, recent progress in the development of analytical methodologies and strategies has begun to clarify the cellular glycomics of various classes of glycoconjugates. This review focuses on recent advances in the technical aspects of cellular glycomic analyses of major classes of glycoconjugates, including N- and O-linked glycans, derived from glycoproteins, proteoglycans and glycosphingolipids. Articles that unveil the glycomics of various biologically important cells, including embryonic and somatic stem cells, induced pluripotent stem (iPS) cells and cancer cells, are discussed.

  7. Scleral topography analysed by optical coherence tomography.

    Science.gov (United States)

    Bandlitz, Stefan; Bäumer, Joachim; Conrad, Uwe; Wolffsohn, James

    2017-08-01

    A detailed evaluation of the corneo-scleral profile (CSP) is of particular relevance in fitting soft and scleral lenses. The aim of this study was to use optical coherence tomography (OCT) to analyse the profile of the limbal sclera and to evaluate the relationship between central corneal radii, corneal eccentricity and scleral radii. Using OCT (Optos OCT/SLO; Dunfermline, Scotland, UK), the limbal scleral radii (SR) of 30 subjects (11 male, 19 female; mean age 23.8 ± 2.0 years) were measured in eight meridians 45° apart. Central corneal radii (CR) and corneal eccentricity (CE) were evaluated using the Oculus Keratograph 4 (Oculus, Wetzlar, Germany). Differences between SR in the meridians and the associations between SR and corneal topography were assessed. Median SR measured along 45° (58.0 mm; interquartile range, 46.8-84.8 mm) differed significantly from the other meridians. SR was related to corneal topography and may provide additional data useful in fitting soft and scleral contact lenses. Copyright © 2017 British Contact Lens Association. Published by Elsevier Ltd. All rights reserved.

  8. Statistical analyses of extreme food habits

    International Nuclear Information System (INIS)

    Breuninger, M.; Neuhaeuser-Berthold, M.

    2000-01-01

    This report summarizes the results of the project ''Statistical analyses of extreme food habits'', which was commissioned by the National Office for Radiation Protection as a contribution to the amendment of the ''General Administrative Regulation to paragraph 45 of the Decree on Radiation Protection: determination of the radiation exposure caused by emission of radioactive substances from facilities of nuclear technology''. Its aim is to show whether the calculation of the radiation exposure received via food intake by 95% of the population, as planned in a provisional draft, overestimates the true exposure. If such an overestimation exists, its magnitude should be determined. It was possible to prove the existence of this overestimation, but its magnitude could only be roughly estimated. To identify its real extent, it is necessary to include the specific activities of the nuclides, which were not available for this investigation. In addition, the report shows how the amounts of food consumed from different food groups influence each other and which connections between these amounts should be taken into account in order to estimate the radiation exposure as precisely as possible. (orig.) [de

  9. Water channel structures analysed by electron crystallography.

    Science.gov (United States)

    Tani, Kazutoshi; Fujiyoshi, Yoshinori

    2014-05-01

    The mechanisms underlying water transport through aquaporin (AQP) have been debated for two decades. The water permeation phenomenon of AQP seems inexplicable because the Grotthuss mechanism does not allow for simultaneous fast water permeability and inhibition of proton transfer through the hydrogen bonds of water molecules. The AQP1 structure determined by electron crystallography provided the first insights into the proton exclusion mechanism despite fast water permeation. Although several studies have provided clues about the mechanism based on the AQP structure, each proposed mechanism remains incomplete. The present review is focused on AQP function and structure solved by electron crystallography in an attempt to fill the gaps between the findings in the absence and presence of lipids. Many AQP structures can be superimposed regardless of the determination method. The AQP fold is preserved even under conditions lacking lipids, but the water arrangement in the channel pore differs. The differences might be explained by dipole moments formed by the two short helices in the lipid bilayer. In addition, structure analyses of double-layered two-dimensional crystals of AQP suggest an array formation and cell adhesive function. Electron crystallography findings not only have contributed to resolve some of the water permeation mechanisms, but have also elucidated the multiple functions of AQPs in the membrane. The roles of AQPs in the brain remain obscure, but their multiple activities might be important in the regulation of brain and other biological functions. This article is part of a Special Issue entitled Aquaporins. © 2013.

  10. Interim Basis for PCB Sampling and Analyses

    International Nuclear Information System (INIS)

    BANNING, D.L.

    2001-01-01

    This document was developed as an interim basis for sampling and analysis of polychlorinated biphenyls (PCBs) and will be used until a formal data quality objective (DQO) document is prepared and approved. On August 31, 2000, the Framework Agreement for Management of Polychlorinated Biphenyls (PCBs) in Hanford Tank Waste was signed by the U.S. Department of Energy (DOE), the Environmental Protection Agency (EPA), and the Washington State Department of Ecology (Ecology) (Ecology et al. 2000). This agreement outlines the management of double shell tank (DST) waste as Toxic Substances Control Act (TSCA) PCB remediation waste based on a risk-based disposal approval option per Title 40 of the Code of Federal Regulations 761.61(c). The agreement calls for ''Quantification of PCBs in DSTs, single shell tanks (SSTs), and incoming waste to ensure that the vitrification plant and other ancillary facilities' PCB waste acceptance limits and the requirements of the anticipated risk-based disposal approval are met.'' Waste samples will be analyzed for PCBs to satisfy this requirement. This document describes the DQO process undertaken to assure appropriate data will be collected to support management of PCBs and is presented in a DQO format. The DQO process was implemented in accordance with the U.S. Environmental Protection Agency EPA QA/G-4, Guidance for the Data Quality Objectives Process (EPA 1994), and the Data Quality Objectives for Sampling and Analyses, HNF-IP-0842/Rev.1 A, Vol. IV, Section 4.16 (Banning 1999).

  11. Activation analyses for different fusion structural alloys

    International Nuclear Information System (INIS)

    Attaya, H.; Smith, D.

    1991-01-01

    The leading candidate structural materials, viz., the vanadium alloys, the nickel- or manganese-stabilized austenitic steels, and the ferritic steels, are analysed in terms of their induced activation in the TPSS fusion power reactor. The TPSS reactor has 1950 MW fusion power and inboard and outboard average neutron wall loadings of 3.75 and 5.35 MW/m2, respectively. The results show that, after one year of continuous operation, the vanadium alloys have the least radioactivity at reactor shutdown. The maximum difference between the induced radioactivity in the vanadium alloys and in the other iron-based alloys occurs at about 10 years after reactor shutdown. At this time, the total reactor radioactivity using the vanadium alloys is about two orders of magnitude less than the total reactor radioactivity utilizing any other alloy. The difference is even larger in the first wall: the FW-vanadium activation is 3 orders of magnitude less than the other alloys' FW activation. 2 refs., 7 figs

  12. Reproducibility of neuroimaging analyses across operating systems.

    Science.gov (United States)

    Glatard, Tristan; Lewis, Lindsay B; Ferreira da Silva, Rafael; Adalat, Reza; Beck, Natacha; Lepage, Claude; Rioux, Pierre; Rousseau, Marc-Etienne; Sherif, Tarek; Deelman, Ewa; Khalili-Mahani, Najmeh; Evans, Alan C

    2015-01-01

    Neuroimaging pipelines are known to generate different results depending on the computing platform where they are compiled and executed. We quantify these differences for brain tissue classification, fMRI analysis, and cortical thickness (CT) extraction, using three of the main neuroimaging packages (FSL, Freesurfer and CIVET) and different versions of GNU/Linux. We also identify some causes of these differences using library and system call interception. We find that these packages use mathematical functions based on single-precision floating-point arithmetic whose implementations in operating systems continue to evolve. While these differences have little or no impact on simple analysis pipelines such as brain extraction and cortical tissue classification, their accumulation creates important differences in longer pipelines such as subcortical tissue classification, fMRI analysis, and cortical thickness extraction. With FSL, most Dice coefficients between subcortical classifications obtained on different operating systems remain above 0.9, but values as low as 0.59 are observed. Independent component analyses (ICA) of fMRI data differ between operating systems in one third of the tested subjects, due to differences in motion correction. With Freesurfer and CIVET, in some brain regions we find an effect of build or operating system on cortical thickness. A first step to correct these reproducibility issues would be to use more precise representations of floating-point numbers in the critical sections of the pipelines. The numerical stability of pipelines should also be reviewed.
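The Dice coefficients quoted above compare binary segmentation masks produced on two platforms. A minimal sketch of the metric itself (not FSL's implementation):

```python
import numpy as np

def dice(a, b):
    """Dice coefficient between two binary masks: 2|A∩B| / (|A| + |B|)."""
    a = np.asarray(a, dtype=bool)
    b = np.asarray(b, dtype=bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * np.logical_and(a, b).sum() / denom

# e.g. the same subcortical label from two operating-system builds
os1 = [1, 1, 1, 0, 0, 0]
os2 = [1, 1, 0, 0, 0, 1]
agreement = dice(os1, os2)  # 2*2 / (3+3) = 0.666...
```

A value of 1.0 means the two builds produced identical masks; the study's observation of values as low as 0.59 indicates substantial platform-dependent divergence.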

  13. PRECLOSURE CONSEQUENCE ANALYSES FOR LICENSE APPLICATION

    Energy Technology Data Exchange (ETDEWEB)

    S. Tsai

    2005-01-12

    Radiological consequence analyses are performed for potential releases from normal operations in surface and subsurface facilities and from Category 1 and Category 2 event sequences during the preclosure period. Surface releases from normal repository operations are primarily from radionuclides released from opening a transportation cask during dry transfer operations of spent nuclear fuel (SNF) in Dry Transfer Facility 1 (DTF 1), Dry Transfer Facility 2 (DTF 2), the Canister Handling Facility (CHF), or the Fuel Handling Facility (FHF). Subsurface releases from normal repository operations are from resuspension of waste package surface contamination and neutron activation of ventilated air and silica dust from host rock in the emplacement drifts. The purpose of this calculation is to demonstrate that the preclosure performance objectives, specified in 10 CFR 63.111(a) and 10 CFR 63.111(b), have been met for the proposed design and operations in the geologic repository operations area. Preclosure performance objectives are discussed in Section 6.2.3 and are summarized in Tables 1 and 2.

  14. Reliability Analyses of Groundwater Pollutant Transport

    Energy Technology Data Exchange (ETDEWEB)

    Dimakis, Panagiotis

    1997-12-31

    This thesis develops a probabilistic finite element model for the analysis of groundwater pollution problems. Two computer codes were developed: (1) a finite element code that solves the two-dimensional steady-state equations of groundwater flow and pollutant transport, and (2) a first-order reliability method code that can perform a probabilistic analysis of any given analytical or numerical equation. The two codes were combined into one model, PAGAP (Probability Analysis of Groundwater And Pollution). PAGAP can be used to obtain (1) the probability that the concentration at a given point at a given time will exceed a specified value, (2) the probability that the maximum concentration at a given point will exceed a specified value, and (3) the probability that the residence time at a given point will exceed a specified period. PAGAP can be used as a tool for assessment and risk analyses, for instance to assess the efficiency of a proposed remediation technique or to study the effects of parameter distributions for a given problem (sensitivity study). The model has been applied to study the largest self-sustained, precipitation-controlled aquifer in Northern Europe, which underlies Oslo's new main airport. 92 refs., 187 figs., 26 tabs.
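The exceedance probabilities PAGAP computes with the first-order reliability method can be illustrated more simply by Monte Carlo sampling. The concentration model and the parameter distribution below are hypothetical stand-ins for illustration, not the thesis's equations:

```python
import numpy as np

rng = np.random.default_rng(0)

def concentration(K, x=5.0, t=365.0):
    """Hypothetical normalized concentration at distance x after time t,
    driven by an uncertain hydraulic conductivity K (illustrative only)."""
    v = 0.01 * K                 # assumed seepage velocity from conductivity
    return np.exp(-x / (v * t))

K = rng.lognormal(mean=0.0, sigma=0.5, size=100_000)  # uncertain parameter
c = concentration(K)
p_exceed = (c > 0.5).mean()      # P[concentration at the point exceeds 0.5]
```

Sampling trades the efficiency of the first-order reliability method for simplicity: it needs many model evaluations but makes no linearization of the limit-state function.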

  15. Evaluation of the Olympus AU-510 analyser.

    Science.gov (United States)

    Farré, C; Velasco, J; Ramón, F

    1991-01-01

    The selective multitest Olympus AU-510 analyser was evaluated according to the recommendations of the Comision de Instrumentacion de la Sociedad Española de Quimica Clinica and the European Committee for Clinical Laboratory Standards. The evaluation was carried out in two stages: an examination of the analytical units and then an evaluation under routine work conditions. The operational characteristics of the system were also studied. The first stage included a photometric study: depending on the absorbance, the inaccuracy varies from +0.5% to -0.6% at 405 nm and from -5.6% to 10.6% at 340 nm; the imprecision ranges between -0.22% and 0.56% at 405 nm and between 0.09% and 2.74% at 340 nm. Linearity was acceptable, apart from a very low absorbance for NADH at 340 nm, and the imprecision of the serum sample pipetter was satisfactory. Twelve serum analytes were studied under routine conditions: glucose, urea, urate, cholesterol, triglycerides, total bilirubin, creatinine, phosphate, iron, aspartate aminotransferase, alanine aminotransferase and gamma-glutamyl transferase. The within-run imprecision (CV%) ranged from 0.67% for phosphate to 2.89% for iron, and the between-run imprecision from 0.97% for total bilirubin to 7.06% for iron. There was no carry-over in a study of the serum sample pipetter. Carry-over studies with the reagent and sample pipetters showed some cross-contamination in the iron assay.
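The within-run and between-run figures above are coefficients of variation. The metric itself is a one-liner (a sketch of the statistic, not the evaluation protocol):

```python
import statistics

def cv_percent(replicates):
    """Coefficient of variation: 100 * sample SD / mean of replicate results."""
    mean = statistics.fmean(replicates)
    return 100.0 * statistics.stdev(replicates) / mean

# e.g. five within-run phosphate replicates (hypothetical values, mmol/L)
cv = cv_percent([1.01, 1.00, 1.02, 0.99, 1.00])
```

Within-run CV% uses replicates from a single analytical run; between-run CV% uses one result per run across many days, which is why the between-run figures above are consistently larger.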

  16. Genomic analyses of modern dog breeds.

    Science.gov (United States)

    Parker, Heidi G

    2012-02-01

    A rose may be a rose by any other name, but when you call a dog a poodle it becomes a very different animal than if you call it a bulldog. Both the poodle and the bulldog are examples of dog breeds of which there are >400 recognized worldwide. Breed creation has played a significant role in shaping the modern dog from the length of his leg to the cadence of his bark. The selection and line-breeding required to maintain a breed has also reshaped the genome of the dog, resulting in a unique genetic pattern for each breed. The breed-based population structure combined with extensive morphologic variation and shared human environments have made the dog a popular model for mapping both simple and complex traits and diseases. In order to obtain the most benefit from the dog as a genetic system, it is necessary to understand the effect structured breeding has had on the genome of the species. That is best achieved by looking at genomic analyses of the breeds, their histories, and their relationships to each other.

  17. Soil deflation analyses from wind erosion events

    Directory of Open Access Journals (Sweden)

    Lenka Lackóová

    2015-09-01

    Full Text Available There are various methods to assess soil erodibility for wind erosion. This paper focuses on aggregate analysis by a laser particle sizer ANALYSETTE 22 (FRITSCH GmbH), used to determine the size distribution of soil particles detached by wind (deflated particles). Ten soil samples, trapped along the same length of the erosion surface (150–155 m) but at different wind speeds, were analysed. The soil was sampled from a flat, smooth area without vegetation cover or soil crust, not affected by the impact of windbreaks or other barriers, from a depth of at most 2.5 cm. Prior to analysis the samples were prepared according to the relevant specifications. An experiment was also conducted using a device that enables characterisation of the vertical movement of the deflated material. The trapped samples showed no differences in particle size and the proportions of size fractions at different hourly average wind speeds. It was observed that most of the particles travelling in saltation mode (size 50–500 μm), 58–70% of them, moved vertically up to 26 cm above the soil surface. At greater heights, particles moving in suspension mode (floating in the air; size < 100 μm) accounted for up to 90% of the samples. This result suggests that the boundary between the two modes of vertical movement of deflated soil particles lies at about 25 cm above the soil surface.

  18. Comparative analyses of bidirectional promoters in vertebrates

    Directory of Open Access Journals (Sweden)

    Taylor James

    2008-05-01

    Full Text Available Abstract Background Orthologous genes with deep phylogenetic histories are likely to retain similar regulatory features. In this report we utilize orthology assignments for pairs of genes co-regulated by bidirectional promoters to map the ancestral history of the promoter regions. Results Our mapping of bidirectional promoters from humans to fish shows that many such promoters emerged after the divergence of chickens and fish. Furthermore, annotations of promoters in deep phylogenies enable detection of missing data or assembly problems present in higher vertebrates. The functional importance of bidirectional promoters is indicated by selective pressure to maintain the arrangement of genes regulated by the promoter over long evolutionary time spans. Characteristics unique to bidirectional promoters are further elucidated using a technique for unsupervised classification, known as ESPERR. Conclusion Results of these analyses will aid in our understanding of the evolution of bidirectional promoters, including whether the regulation of two genes evolved as a consequence of their proximity or if function dictated their co-regulation.

  19. Repository simulation system (REPSIMS) for design analyses

    International Nuclear Information System (INIS)

    Griesmeyer, J.M.; Dennis, A.W.

    1989-01-01

    The Repository Simulation System (REPSIMS) combines graphic programming and interactive simulation to facilitate early identification of acceptable design concepts for a nuclear waste repository. REPSIMS is an object-oriented, menu-driven, versatile computer modeling system that allows the facility designer to create visual models of proposed facilities, graphically define operations, and using simulation analyses, determine the efficiencies of proposed designs and their operations. Hierarchical representations of both physical facilities and operations allow REPSIMS to be used early in the evaluation of conceptual designs as well as for the analysis of mature designs. High-level models of conceptual designs can be used to identify critical facility layout and operation issues. These preliminary models can then be refined to investigate those issues and to incorporate additional information as it becomes available. REPSIMS thus supports the typical top-down design process in which general specifications for major systems and operations are successively refined as the design progresses. REPSIMS has been used to determine the impact of using robotic, manual contact, or master/slave operations on cask turnaround times, throughput, and equipment utilization, and to investigate the impact of the ratio between truck and rail shipments to the repository. An analysis of alternative designs for the waste-handling building at Yucca Mountain has begun

  20. Kinematic gait analyses in healthy Golden Retrievers

    Directory of Open Access Journals (Sweden)

    Gabriela C.A. Silva

    2014-12-01

    Full Text Available Kinematic analysis describes the relative movement between rigid bodies and finds application in gait analysis and other body movements; interpretation of its data, when changes are present, guides the choice of treatment to be instituted. The objective of this study was to standardize the gait of healthy Golden Retriever dogs to assist in the diagnosis and treatment of musculoskeletal disorders. A kinematic analysis system was used to analyse the gait of seven clinically normal female Golden Retrievers, aged between 2 and 4 years and weighing 21.5 to 28 kg. Flexion and extension were described for the shoulder, elbow, carpal, hip, femorotibial and tarsal joints. The gait was characterized from the lateral view, and the hypothesis of normality was accepted for all variables, except for the stance phase of the hip and elbow, at a confidence level of 95% (significance level α = 0.05). Variations were attributed to displacement of the markers during movement and to the duplicated number of evaluations. Kinematic analysis proved to be a consistent method for evaluating movement during canine gait, and the data can be used in the diagnosis and evaluation of canine gait, in comparison with other studies, and in the treatment of dogs with musculoskeletal disorders.

  1. Theoretical and computational analyses of LNG evaporator

    Science.gov (United States)

    Chidambaram, Palani Kumar; Jo, Yang Myung; Kim, Heuy Dong

    2017-04-01

    Theoretical and numerical analyses of the fluid flow and heat transfer inside an LNG evaporator are conducted in this work. Methane is used instead of LNG as the operating fluid because methane constitutes over 80% of natural gas. The analytical calculations are performed using simple mass and energy balance equations and are used to assess the pressure and temperature variations in the steam tube. Multiphase numerical simulations are performed by solving the governing equations (the basic flow equations of continuity, momentum, and energy) in a portion of the evaporator domain consisting of a single steam pipe. The flow equations are solved along with equations of species transport. Multiphase behavior is modeled using the VOF method. Liquid methane is the primary phase; it vaporizes into the secondary phase, gaseous methane. Steam is another secondary phase, which flows through the heating coils. Turbulence is modeled by a two-equation turbulence model. The theoretical and numerical predictions are seen to match well with each other. Further parametric studies are planned based on the current research.
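The steady-state energy balance mentioned above can be sketched in a few lines. This is a minimal illustration, not the paper's model: the heat duty is hypothetical, and the latent heat is an approximate literature value for methane near atmospheric pressure.

```python
def methane_vaporization_rate(q_watts, h_fg=510e3):
    """Mass flow of methane vaporized by a heat input q_watts (W), from the
    steady-state energy balance m_dot = Q / h_fg. The latent heat of
    vaporization of methane is roughly 510 kJ/kg near 1 atm (approximate)."""
    return q_watts / h_fg  # kg/s

# Hypothetical heat duty of 1.02 MW transferred from the steam side
m_dot = methane_vaporization_rate(q_watts=1.02e6)  # -> 2.0 kg/s
```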

  2. Computational Analyses for Transplant Biomarker Discovery

    Directory of Open Access Journals (Sweden)

    Anyou eWang

    2015-09-01

    Full Text Available Translational medicine offers a rich promise for improved diagnostics and drug discovery in biomedical transplantation research, where unmet diagnostic and therapeutic needs persist. The recent advent of genomic and proteomic profiling, collectively called omics, provides new resources for developing novel biomarkers for routine clinical use. Establishing such a marker system depends heavily on the appropriate application of computational algorithms and software, which are grounded in mathematical theories and models. Understanding these theories helps in applying the appropriate algorithms to ensure that a biomarker system is successful. Here, we review the key advances in theories and mathematical models relevant to transplant biomarker development. Advantages and limitations inherent in these models are discussed. The principles of key computational approaches for efficiently selecting the best subset of biomarkers from high-dimensional omics data are highlighted. Prediction models are introduced, and the integration of multiple microarray datasets is also discussed. Appreciating these key advances should help to accelerate the development of clinically reliable biomarker systems.

  3. Population Pharmacokinetic Analyses of Lithium: A Systematic Review.

    Science.gov (United States)

    Methaneethorn, Janthima

    2018-02-01

    Even though lithium has been used for the treatment of bipolar disorder for several decades, its toxicities are still being reported. The major limitation in the use of lithium is its narrow therapeutic window. Several methods have been proposed to predict the lithium doses essential to attain therapeutic levels. One of the methods used to guide lithium therapy is the population pharmacokinetic approach, which accounts for inter- and intra-individual variability in predicting lithium doses. Several population pharmacokinetic studies of lithium have been conducted. The objective of this review is to provide information on the population pharmacokinetics of lithium, focusing on the nonlinear mixed-effect modeling approach, and to summarize significant factors affecting lithium pharmacokinetics. A literature search of the PubMed database was conducted from inception to December 2016. Studies conducted in humans, using lithium as the study drug and providing population pharmacokinetic analyses of lithium by means of nonlinear mixed-effect modeling, were included in this review. Twenty-four articles were identified from the database; seventeen were excluded based on the inclusion and exclusion criteria, leaving a total of seven articles in this review. Of these, only one study reported a combined population pharmacokinetic-pharmacodynamic model of lithium. Lithium pharmacokinetics were explained using both one- and two-compartment models. The significant predictors of lithium clearance identified in most studies were renal function and body size. One study reported a significant effect of age on lithium clearance. The typical values of lithium clearance ranged from 0.41 to 9.39 L/h. The magnitude of inter-individual variability in lithium clearance ranged from 12.7 to 25.1%. Only two studies evaluated the models using external data sets. Model methodologies in each study are summarized and discussed in this review. For future perspective, a population pharmacokinetic
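As a minimal sketch of the one-compartment model mentioned in the review, the snippet below computes a plasma concentration after an intravenous bolus with first-order elimination. The dose, volume of distribution, and clearance are illustrative only (the clearance merely falls inside the 0.41-9.39 L/h range reported) and are not taken from any of the reviewed studies.

```python
import math

def concentration_one_cmt(dose_mg, v_L, cl_L_per_h, t_h):
    """Plasma concentration (mg/L) at time t_h after an IV bolus in a
    one-compartment model with first-order elimination:
    C(t) = (Dose / V) * exp(-(CL / V) * t)."""
    k = cl_L_per_h / v_L                       # elimination rate constant (1/h)
    return (dose_mg / v_L) * math.exp(-k * t_h)

# Illustrative values: 600 mg dose, 40 L volume, 1.5 L/h clearance
c_12h = concentration_one_cmt(dose_mg=600.0, v_L=40.0, cl_L_per_h=1.5, t_h=12.0)
```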

  4. Castor-1C spent fuel storage cask decay heat, heat transfer, and shielding analyses

    Energy Technology Data Exchange (ETDEWEB)

    Rector, D.R.; McCann, R.A.; Jenquin, U.P.; Heeb, C.M.; Creer, J.M.; Wheeler, C.L.

    1986-12-01

    This report documents the decay heat, heat transfer, and shielding analyses of the Gesellschaft fuer Nuklear Services (GNS) CASTOR-1C cask used in a spent fuel storage demonstration performed at Preussen Elektra's Wurgassen nuclear power plant. The demonstration was performed between March 1982 and January 1984, and resulted in cask and fuel temperature data and cask exterior surface gamma-ray and neutron radiation dose rate measurements. The purpose of the analyses reported here was to evaluate decay heat, heat transfer, and shielding computer codes. The analyses consisted of (1) performing pre-look predictions (predictions performed before the analysts were provided the test data), (2) comparing ORIGEN2 (decay heat), COBRA-SFS and HYDRA (heat transfer), and QAD and DOT (shielding) results to data, and (3) performing post-test analyses if appropriate. Even though two heat transfer codes were used to predict CASTOR-1C cask test data, no attempt was made to compare the two codes. The codes are being evaluated with other test data (single-assembly data and other cask data), and to compare the codes based on one set of data may be premature and lead to erroneous conclusions.

  5. Castor-1C spent fuel storage cask decay heat, heat transfer, and shielding analyses

    International Nuclear Information System (INIS)

    Rector, D.R.; McCann, R.A.; Jenquin, U.P.; Heeb, C.M.; Creer, J.M.; Wheeler, C.L.

    1986-12-01

    This report documents the decay heat, heat transfer, and shielding analyses of the Gesellschaft fuer Nuklear Services (GNS) CASTOR-1C cask used in a spent fuel storage demonstration performed at Preussen Elektra's Wurgassen nuclear power plant. The demonstration was performed between March 1982 and January 1984, and resulted in cask and fuel temperature data and cask exterior surface gamma-ray and neutron radiation dose rate measurements. The purpose of the analyses reported here was to evaluate decay heat, heat transfer, and shielding computer codes. The analyses consisted of (1) performing pre-look predictions (predictions performed before the analysts were provided the test data), (2) comparing ORIGEN2 (decay heat), COBRA-SFS and HYDRA (heat transfer), and QAD and DOT (shielding) results to data, and (3) performing post-test analyses if appropriate. Even though two heat transfer codes were used to predict CASTOR-1C cask test data, no attempt was made to compare the two codes. The codes are being evaluated with other test data (single-assembly data and other cask data), and to compare the codes based on one set of data may be premature and lead to erroneous conclusions

  6. Applied predictive control

    CERN Document Server

    Sunan, Huang; Heng, Lee Tong

    2002-01-01

    The presence of considerable time delays in the dynamics of many industrial processes, leading to difficult problems in the associated closed-loop control systems, is a well-recognized phenomenon. The performance achievable in conventional feedback control systems can be significantly degraded if an industrial process has a relatively large time delay compared with the dominant time constant. Under these circumstances, advanced predictive control is necessary to improve the performance of the control system significantly. The book is a focused treatment of the subject matter, including the fundamentals and some state-of-the-art developments in the field of predictive control. Three main schemes for advanced predictive control are addressed in this book: • Smith Predictive Control; • Generalised Predictive Control; • a form of predictive control based on Finite Spectrum Assignment. A substantial part of the book addresses application issues in predictive control, providing several interesting case studie...

  7. Seismic Soil-Structure Interaction Analyses of a Deeply Embedded Model Reactor – SASSI Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Nie J.; Braverman J.; Costantino, M.

    2013-10-31

    This report summarizes the SASSI analyses of a deeply embedded reactor model performed by BNL and CJC and Associates, as part of the seismic soil-structure interaction (SSI) simulation capability project for the NEAMS (Nuclear Energy Advanced Modeling and Simulation) Program of the Department of Energy. The SASSI analyses included three cases: 0.2 g, 0.5 g, and 0.9 g, all of which refer to nominal peak accelerations at the top of the bedrock. The analyses utilized the modified subtraction method (MSM) for performing the seismic SSI evaluations. Each case consisted of two analyses: input motion in one horizontal direction (X) and input motion in the vertical direction (Z), both of which utilized the same in-column input motion. Besides providing SASSI results for use in comparison with the time-domain SSI results obtained using the DIABLO computer code, this study also leads to the recognition that the frequency-domain method should be modernized so that it can better serve its mission-critical role for analysis and design of nuclear power plants.

  8. Summary of the analyses for recovery factors

    Science.gov (United States)

    Verma, Mahendra K.

    2017-07-17

    IntroductionIn order to determine the hydrocarbon potential of oil reservoirs within the U.S. sedimentary basins for which the carbon dioxide enhanced oil recovery (CO2-EOR) process has been considered suitable, the CO2 Prophet model was chosen by the U.S. Geological Survey (USGS) to be the primary source for estimating recovery-factor values for individual reservoirs. The choice was made because of the model’s reliability and the ease with which it can be used to assess a large number of reservoirs. The other two approaches—the empirical decline curve analysis (DCA) method and a review of published literature on CO2-EOR projects—were deployed to verify the results of the CO2 Prophet model. This chapter discusses the results from CO2 Prophet (chapter B, by Emil D. Attanasi, this report) and compares them with results from decline curve analysis (chapter C, by Hossein Jahediesfanjani) and those reported in the literature for selected reservoirs with adequate data for analyses (chapter D, by Ricardo A. Olea).To estimate the technically recoverable hydrocarbon potential for oil reservoirs where CO2-EOR has been applied, two of the three approaches—CO2 Prophet modeling and DCA—do not include analysis of economic factors, while the third approach—review of published literature—implicitly includes economics. For selected reservoirs, DCA has provided estimates of the technically recoverable hydrocarbon volumes, which, in combination with calculated amounts of original oil in place (OOIP), helped establish incremental CO2-EOR recovery factors for individual reservoirs.The review of published technical papers and reports has provided substantial information on recovery factors for 70 CO2-EOR projects that are either commercially profitable or classified as pilot tests. When comparing the results, it is important to bear in mind the differences and limitations of these three approaches.
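As an illustration of the decline-curve-analysis idea referenced above, the sketch below uses the standard Arps exponential decline model; the initial rate and decline constant are hypothetical, and the chapter's own DCA methodology is not reproduced here.

```python
import math

def arps_exponential(q_i, decline, t):
    """Arps exponential decline: rate q(t) = q_i * exp(-D t) and cumulative
    production Np(t) = (q_i - q(t)) / D, with consistent time units."""
    q = q_i * math.exp(-decline * t)
    np_cum = (q_i - q) / decline
    return q, np_cum

# Hypothetical reservoir: 100,000 bbl/yr initial rate, 15%/yr decline, 5 years
q5, np5 = arps_exponential(q_i=100_000.0, decline=0.15, t=5.0)
```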

  9. Molecular biomarker analyses using circulating tumor cells.

    Directory of Open Access Journals (Sweden)

    Elizabeth A Punnoose

    2010-09-01

    Full Text Available Evaluation of cancer biomarkers from blood could significantly enable biomarker assessment by providing a relatively non-invasive source of representative tumor material. Circulating tumor cells (CTCs) isolated from the blood of metastatic cancer patients hold significant promise in this regard. Using spiked tumor cells, we evaluated CTC capture on different CTC technology platforms, including CellSearch and two biochip platforms, and used the isolated CTCs to develop and optimize assays for molecular characterization of CTCs. We report similar performance for the various platforms tested in capturing CTCs, and find that capture efficiency is dependent on the level of EpCAM expression. We demonstrate that captured CTCs are amenable to biomarker analyses such as HER2 status, qRT-PCR for breast cancer subtype markers, KRAS mutation detection, and EGFR staining by immunofluorescence (IF). We quantify cell-surface expression of EGFR in metastatic lung cancer patient samples. In addition, we determined HER2 status by IF and FISH in CTCs from metastatic breast cancer patients. In the majority of patients (89%) we found concordance with HER2 status from patient tumor tissue, though in a subset of patients (11%), HER2 status in CTCs differed from that observed in the primary tumor. Surprisingly, we found CTC counts to be higher in ER+ patients than in HER2+ and triple-negative patients, which could be explained by low EpCAM expression and a more mesenchymal phenotype of tumors belonging to the basal-like molecular subtype of breast cancer. Our data suggest that molecular characterization from captured CTCs is possible and can potentially provide real-time information on biomarker status. In this regard, CTCs hold significant promise as a source of tumor material to facilitate clinical biomarker evaluation. However, limitations exist for a purely EpCAM-based capture system, and the addition of antibodies to mesenchymal markers could further improve CTC

  10. Transient Seepage for Levee Engineering Analyses

    Science.gov (United States)

    Tracy, F. T.

    2017-12-01

    Historically, steady-state seepage analyses have been a key tool for designing levees by practicing engineers. However, with the advances in computer modeling, transient seepage analysis has become a potentially viable tool. A complication is that the levees usually have partially saturated flow, and this is significantly more complicated in transient flow. This poster illustrates four elements of our research in partially saturated flow relating to the use of transient seepage for levee design: (1) a comparison of results from SEEP2D, SEEP/W, and SLIDE for a generic levee cross section common to the southeastern United States; (2) the results of a sensitivity study of varying saturated hydraulic conductivity, the volumetric water content function (as represented by van Genuchten), and volumetric compressibility; (3) a comparison of when soils do and do not exhibit hysteresis, and (4) a description of proper and improper use of transient seepage in levee design. The variables considered for the sensitivity and hysteresis studies are pore pressure beneath the confining layer at the toe, the flow rate through the levee system, and a levee saturation coefficient varying between 0 and 1. Getting results for SEEP2D, SEEP/W, and SLIDE to match proved more difficult than expected. After some effort, the results matched reasonably well. Differences in results were caused by various factors, including bugs, different finite element meshes, different numerical formulations of the system of nonlinear equations to be solved, and differences in convergence criteria. Varying volumetric compressibility affected the above test variables the most. The levee saturation coefficient was most affected by the use of hysteresis. The improper use of pore pressures from a transient finite element seepage solution imported into a slope stability computation was found to be the most grievous mistake in using transient seepage in the design of levees.
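The van Genuchten volumetric water content function named in the sensitivity study above can be sketched as follows; the retention parameters used here are illustrative placeholders, not values from the study.

```python
def van_genuchten_theta(psi, theta_r, theta_s, alpha, n):
    """Volumetric water content for suction head psi (m) under the
    van Genuchten (1980) retention model, with m = 1 - 1/n."""
    if psi <= 0.0:                            # at or below the water table: saturated
        return theta_s
    m = 1.0 - 1.0 / n
    se = (1.0 + (alpha * psi) ** n) ** (-m)   # effective saturation, 0..1
    return theta_r + (theta_s - theta_r) * se

# Illustrative silt-like parameters (alpha in 1/m); psi = 1 m of suction
theta = van_genuchten_theta(psi=1.0, theta_r=0.05, theta_s=0.45, alpha=2.0, n=1.6)
```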

  11. EP 1000 steam generator tube rupture analyses

    International Nuclear Information System (INIS)

    Saiu, G.; Frogheri, M.; Schulz, T.L.

    2001-01-01

    European electrical utility organizations, together with Westinghouse and Ansaldo, are participating in a program to use Westinghouse passive nuclear plant technology to develop a plant that meets the European Utility Requirements (EUR) and is expected to be licensable in Europe. The program was initiated in 1994, and the plant is designated EP1000. The EP1000 design is notable for the simplicity that comes from a reliance on passive safety systems to enhance plant safety. The use of passive safety systems has provided significant and measurable improvements in plant simplification, safety, reliability, investment protection, and plant costs. These systems use only natural forces such as gravity, natural circulation, and compressed gas to provide the driving forces needed to adequately cool the reactor core following an initiating event. The EP1000 builds on Westinghouse passive nuclear plant technology to enhance plant safety and to meet the European Utility Requirements and specific European national safety criteria. This paper summarizes the main results of the steam generator tube rupture (SGTR) analysis activity performed in Phase 2B of the European Passive Plant Program. The purpose of the study is to provide evidence that the passive safety systems deliver a significant improvement in safety, providing substantial margins to steam generator overfill and reducing the need for operator actions. The behavior of the EP1000 plant following SGTR accidents has been analyzed by means of the RELAP5/Mod3.2 code. Sensitivity cases were performed to address the impact of varying the number of ruptured steam generator tubes, and the potential adverse interactions that could result from operation of control systems (i.e., Chemical and Volume Control System, Startup Feedwater).
Analyses have also been performed to define and verify improved protection system logic to avoid possible steam generator safety valve challenges both in the

  12. Trend analyses with river sediment rating curves

    Science.gov (United States)

    Warrick, Jonathan A.

    2015-01-01

    Sediment rating curves, which are fitted relationships between river discharge (Q) and suspended-sediment concentration (C), are commonly used to assess patterns and trends in river water quality. In many of these studies it is assumed that rating curves have a power-law form (i.e., C = aQ^b, where a and b are fitted parameters). Two fundamental questions about the utility of these techniques are assessed in this paper: (i) How well do the parameters, a and b, characterize trends in the data? (ii) Are trends in rating curves diagnostic of changes to river water or sediment discharge? As noted in previous research, the offset parameter, a, is not an independent variable for most rivers, but rather strongly dependent on b and Q. Here it is shown that a is a poor metric for trends in the vertical offset of a rating curve, and a new parameter, â, as determined by the discharge-normalized power function [C = â(Q/Q_GM)^b], where Q_GM is the geometric mean of the Q values sampled, provides a better characterization of trends. However, these techniques must be applied carefully, because curvature in the relationship between log(Q) and log(C), which exists for many rivers, can produce false trends in â and b. Also, it is shown that trends in â and b are not uniquely diagnostic of river water or sediment supply conditions. For example, an increase in â can be caused by an increase in sediment supply, a decrease in water supply, or a combination of these conditions. Large changes in water and sediment supplies can occur without any change in the parameters â and b. Thus, trend analyses using sediment rating curves must include additional assessments of the time-dependent rates and trends of river water, sediment concentrations, and sediment discharge.
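The discharge-normalized fit described above can be obtained by least squares in log-log space. The snippet below is a sketch using synthetic, noise-free data, not the paper's code.

```python
import numpy as np

def fit_normalized_rating_curve(q, c):
    """Fit C = a_hat * (Q / Q_gm)**b by least squares in log-log space,
    where Q_gm is the geometric mean of the sampled discharges."""
    q = np.asarray(q, dtype=float)
    c = np.asarray(c, dtype=float)
    q_gm = np.exp(np.log(q).mean())            # geometric mean of Q
    slope, intercept = np.polyfit(np.log(q / q_gm), np.log(c), 1)
    return np.exp(intercept), slope, q_gm      # a_hat, b, Q_gm

# Synthetic noise-free record following C = 0.5 * Q**1.8
q = np.array([10.0, 50.0, 100.0, 500.0, 1000.0])
c = 0.5 * q ** 1.8
a_hat, b, q_gm = fit_normalized_rating_curve(q, c)  # recovers b = 1.8
```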

  13. On study design in neuroimaging heritability analyses

    Science.gov (United States)

    Koran, Mary Ellen; Li, Bo; Jahanshad, Neda; Thornton-Wells, Tricia A.; Glahn, David C.; Thompson, Paul M.; Blangero, John; Nichols, Thomas E.; Kochunov, Peter; Landman, Bennett A.

    2014-03-01

    Imaging genetics is an emerging methodology that combines genetic information with imaging-derived metrics to understand how genetic factors impact observable structural, functional, and quantitative phenotypes. Many of the most well-known genetic studies are based on Genome-Wide Association Studies (GWAS), which use large populations of related or unrelated individuals to associate traits and disorders with individual genetic factors. Merging imaging and genetics may potentially lead to improved power of association in GWAS because imaging traits may be more sensitive phenotypes, being closer to underlying genetic mechanisms, and their quantitative nature inherently increases power. We are developing SOLAR-ECLIPSE (SE) imaging genetics software which is capable of performing genetic analyses with both large-scale quantitative trait data and family structures of variable complexity. This program can estimate the contribution of genetic commonality among related subjects to a given phenotype, and essentially answer the question of whether or not the phenotype is heritable. This central factor of interest, heritability, offers bounds on the direct genetic influence over observed phenotypes. In order for a trait to be a good phenotype for GWAS, it must be heritable: at least some proportion of its variance must be due to genetic influences. A variety of family structures are commonly used for estimating heritability, yet the variability and biases for each as a function of the sample size are unknown. Herein, we investigate the ability of SOLAR to accurately estimate heritability models based on imaging data simulated using Monte Carlo methods implemented in R. We characterize the bias and the variability of heritability estimates from SOLAR as a function of sample size and pedigree structure (including twins, nuclear families, and nuclear families with grandparents).
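SOLAR estimates heritability by variance-components modeling over arbitrary pedigrees; as a far simpler stand-in for intuition only (this is Falconer's classical twin formula, not what SOLAR does), the idea can be sketched as:

```python
def falconer_h2(r_mz, r_dz):
    """Rough narrow-sense heritability from twin correlations using
    Falconer's formula: h^2 = 2 * (r_MZ - r_DZ)."""
    return 2.0 * (r_mz - r_dz)

# Hypothetical correlations for an imaging-derived phenotype
h2 = falconer_h2(r_mz=0.8, r_dz=0.5)
```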

  14. Analyses of Hypomethylated Oil Palm Gene Space

    Science.gov (United States)

    Jayanthi, Nagappan; Mohd-Amin, Ab Halim; Azizi, Norazah; Chan, Kuang-Lim; Maqbool, Nauman J.; Maclean, Paul; Brauning, Rudi; McCulloch, Alan; Moraga, Roger; Ong-Abdullah, Meilina; Singh, Rajinder

    2014-01-01

    Demand for palm oil has been increasing by an average of ∼8% over the past decade and currently accounts for about 59% of the world's vegetable oil market. This drives the need to increase palm oil production. Nevertheless, due to the increasing need for sustainable production, it is imperative to increase productivity rather than the area cultivated. Studies on the oil palm genome are essential to help identify genes or markers that are associated with important processes or traits, such as flowering, yield and disease resistance. To achieve this, 294,115 and 150,744 sequences from the hypomethylated or gene-rich regions of Elaeis guineensis and E. oleifera genome were sequenced and assembled into contigs. An additional 16,427 shot-gun sequences and 176 bacterial artificial chromosomes (BAC) were also generated to check the quality of libraries constructed. Comparison of these sequences revealed that although the methylation-filtered libraries were sequenced at low coverage, they still tagged at least 66% of the RefSeq supported genes in the BAC and had a filtration power of at least 2.0. A total of 33,752 microsatellites and 40,820 high-quality single nucleotide polymorphism (SNP) markers were identified. These represent the most comprehensive collection of microsatellites and SNPs to date and would be an important resource for genetic mapping and association studies. The gene models predicted from the assembled contigs were mined for genes of interest, and 242, 65 and 14 oil palm transcription factors, resistance genes and miRNAs were identified, respectively. Examples of the transcriptional factors tagged include those associated with floral development and tissue culture, such as homeodomain proteins, MADS, Squamosa and Apetala2. The E. guineensis and E. oleifera hypomethylated sequences provide an important resource to understand the molecular mechanisms associated with important agronomic traits in oil palm. PMID:24497974

  15. Quantitative DNA Analyses for Airborne Birch Pollen.

    Directory of Open Access Journals (Sweden)

    Isabell Müller-Germann

    Full Text Available Birch trees produce large amounts of highly allergenic pollen grains that are distributed by wind and impact human health by causing seasonal hay fever, pollen-related asthma, and other allergic diseases. Traditionally, pollen forecasts are based on conventional microscopic counting techniques that are labor-intensive and limited in the reliable identification of species. Molecular biological techniques provide an alternative approach that is less labor-intensive and enables identification of any species by its genetic fingerprint. A particularly promising method is quantitative real-time polymerase chain reaction (qPCR), which can be used to determine the number of DNA copies and thus pollen grains in air filter samples. During the birch pollination season in 2010 in Mainz, Germany, we collected air filter samples of fine (<3 μm) and coarse air particulate matter. These were analyzed by qPCR using two different primer pairs: one for a single-copy gene (BP8) and the other for a multi-copy gene (ITS). The BP8 gene was better suited for reliable qPCR results, and the qPCR results obtained for coarse particulate matter were well correlated with the birch pollen forecasting results of the regional air quality model COSMO-ART. As expected from the size of birch pollen grains (~23 μm), the concentration of DNA in fine particulate matter was lower than in the coarse particle fraction: for the ITS region the factor was 64, while for the single-copy gene BP8 it was only 51. The possible presence of so-called sub-pollen particles in the fine particle fraction is, however, interesting even at low concentrations. These particles are known to be highly allergenic, reach deep into the airways, and often cause severe health problems. In conclusion, the results of this exploratory study open up the possibility of predicting and quantifying the pollen concentration in the atmosphere more precisely in the future.
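Absolute quantification with qPCR conventionally reads copy numbers off a standard curve. A minimal sketch of that calculation follows; the slope, intercept, and Ct value are hypothetical, not figures from this study.

```python
def copies_from_ct(ct, slope, intercept):
    """Estimate template copy number from a qPCR cycle-threshold value,
    given a standard curve fitted as Ct = slope * log10(copies) + intercept."""
    return 10.0 ** ((ct - intercept) / slope)

# Hypothetical curve: slope -3.32 (~100% PCR efficiency), intercept 38
n_copies = copies_from_ct(ct=24.72, slope=-3.32, intercept=38.0)
```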

  16. Analyses of hypomethylated oil palm gene space.

    Directory of Open Access Journals (Sweden)

    Eng-Ti L Low

    Full Text Available Demand for palm oil has been increasing by an average of ∼8% over the past decade and currently accounts for about 59% of the world's vegetable oil market. This drives the need to increase palm oil production. Nevertheless, due to the increasing need for sustainable production, it is imperative to increase productivity rather than the area cultivated. Studies on the oil palm genome are essential to help identify genes or markers that are associated with important processes or traits, such as flowering, yield and disease resistance. To achieve this, 294,115 and 150,744 sequences from the hypomethylated or gene-rich regions of Elaeis guineensis and E. oleifera genome were sequenced and assembled into contigs. An additional 16,427 shot-gun sequences and 176 bacterial artificial chromosomes (BAC) were also generated to check the quality of libraries constructed. Comparison of these sequences revealed that although the methylation-filtered libraries were sequenced at low coverage, they still tagged at least 66% of the RefSeq supported genes in the BAC and had a filtration power of at least 2.0. A total of 33,752 microsatellites and 40,820 high-quality single nucleotide polymorphism (SNP) markers were identified. These represent the most comprehensive collection of microsatellites and SNPs to date and would be an important resource for genetic mapping and association studies. The gene models predicted from the assembled contigs were mined for genes of interest, and 242, 65 and 14 oil palm transcription factors, resistance genes and miRNAs were identified, respectively. Examples of the transcriptional factors tagged include those associated with floral development and tissue culture, such as homeodomain proteins, MADS, Squamosa and Apetala2. The E. guineensis and E. oleifera hypomethylated sequences provide an important resource to understand the molecular mechanisms associated with important agronomic traits in oil palm.

  17. Predictive modelling of noise level generated during sawing of rocks ...

    Indian Academy of Sciences (India)

    2016-08-26

    Aug 26, 2016 ... The influence of the operating variables and rock properties on the noise level is investigated and analysed. Statistical analyses are then employed and models are built for the prediction of noise levels depending on the operating variables and the rock properties. The derived models are validated through ...

  18. Spatial Ensemble Postprocessing of Precipitation Forecasts Using High Resolution Analyses

    Science.gov (United States)

    Lang, Moritz N.; Schicker, Irene; Kann, Alexander; Wang, Yong

    2017-04-01

    Ensemble prediction systems are designed to account for errors or uncertainties in the initial and boundary conditions, imperfect parameterizations, etc. However, due to sampling errors and underestimation of the model errors, these ensemble forecasts tend to be underdispersive, and to lack both reliability and sharpness. To overcome such limitations, statistical postprocessing methods are commonly applied to these forecasts. In this study, a full-distributional spatial post-processing method is applied to short-range precipitation forecasts over Austria using Standardized Anomaly Model Output Statistics (SAMOS). Following Stauffer et al. (2016), observation and forecast fields are transformed into standardized anomalies by subtracting a site-specific climatological mean and dividing by the climatological standard deviation. Because only a single regression model needs to be fitted for the whole domain, the SAMOS framework provides a computationally inexpensive method to create operationally calibrated probabilistic forecasts for any arbitrary location or for all grid points in the domain simultaneously. Taking advantage of the INCA system (Integrated Nowcasting through Comprehensive Analysis), high resolution analyses are used for the computation of the observed climatology and for model training. The INCA system operationally combines station measurements and remote sensing data into real-time objective analysis fields at 1 km horizontal resolution and 1 h temporal resolution. The precipitation forecast used in this study is obtained from a limited area model ensemble prediction system also operated by ZAMG. The so-called ALADIN-LAEF provides, by applying a multi-physics approach, a 17-member forecast at a horizontal resolution of 10.9 km and a temporal resolution of 1 hour. The SAMOS approach thus statistically combines the in-house high-resolution analysis and the ensemble prediction system.
The station-based validation of 6 hour precipitation sums
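The standardized-anomaly transformation at the core of SAMOS can be sketched in a few lines. This is a minimal illustration, not the operational implementation; the climatology values below are hypothetical 6-h precipitation sums for a single grid point.

```python
import statistics

def standardized_anomalies(values, climatology):
    """Transform raw values into standardized anomalies,
    (x - climatological mean) / climatological standard deviation,
    as described for the SAMOS approach above."""
    mu = statistics.mean(climatology)
    sigma = statistics.stdev(climatology)
    return [(x - mu) / sigma for x in values]

# Hypothetical site-specific climatology of 6-h precipitation sums (mm)
climatology = [1.2, 0.0, 3.4, 0.8, 2.1, 0.5, 1.9, 0.0, 2.7, 1.4]
forecast = [0.9, 4.0]
anoms = standardized_anomalies(forecast, climatology)
```

Because both observations and forecasts are mapped into the same anomaly space, one regression fitted on the anomalies applies across all sites, which is what makes the method cheap enough to run on every grid point.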

  19. Netherton's syndrome with a 20-year follow-up [Síndrome de Netherton com 20 anos de acompanhamento]

    Directory of Open Access Journals (Sweden)

    Rodrigo Pereira Duquia

    2006-12-01

    Netherton's syndrome is an autosomal recessive skin disease characterized by congenital erythroderma, a specific hair-shaft anomaly called trichorrhexis invaginata, and atopic manifestations. The authors describe a female patient followed for more than 20 years, with marked improvement of the hair alterations after use of oral acitretin.

  20. Assessing the gap in female authorship in the journal Emergency Radiology: trends over a 20-year period.

    Science.gov (United States)

    McKenzie, Kristopher; Ramonas, Milita; Patlas, Michael; Katz, Douglas S

    2017-12-01

    To examine trends in female authorship in the journal Emergency Radiology from January 1994 to December 2014. We obtained institutional review board approval for our study. We retrospectively reviewed a total of 1617 articles published in the journal Emergency Radiology over a 20-year period. Original articles, case reports, review articles, and pictorial essays were included. The gender of the first- and last-position authors was categorized as female or male. We analyzed trends by comparing the first- and last-position authors of original articles from the first and last year reviewed. We utilized the chi-square test for statistical analysis, with a p value < 0.05 considered statistically significant. Over the 20 years, there has been a statistically significant upward trend in female last-position authors publishing in the journal Emergency Radiology.
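A chi-square test of independence like the one reported above compares author-gender counts between two years. A minimal stdlib sketch for a 2x2 table (the counts below are hypothetical, not the study's data; for df = 1 the p-value follows from P(chi2 > x) = erfc(sqrt(x/2))):

```python
import math

def chi_square_2x2(table):
    """Pearson chi-square test of independence for a 2x2 table.
    Returns (statistic, p_value); the df=1 p-value uses the
    identity P(chi2 > x) = erfc(sqrt(x/2))."""
    (a, b), (c, d) = table
    n = a + b + c + d
    rows = [a + b, c + d]
    cols = [a + c, b + d]
    stat = 0.0
    for i, obs_row in enumerate(table):
        for j, obs in enumerate(obs_row):
            expected = rows[i] * cols[j] / n
            stat += (obs - expected) ** 2 / expected
    return stat, math.erfc(math.sqrt(stat / 2))

# Hypothetical female/male first-author counts for two years
stat, p = chi_square_2x2([[10, 40], [30, 20]])
```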

  1. Metabolism of [3H]Gibberellin A20 by plants of Bryophyllum daigremontianum under long- and short-day conditions

    International Nuclear Information System (INIS)

    Durley, R.C.; Pharis, R.P.; Zeevaart, J.A.D.

    1975-01-01

    [3H]Gibberellin A20 ([3H]GA20), a native gibberellin of this plant, was injected into mature leaves of Bryophyllum daigremontianum (Hamet et Perr.) Berg. under long- and short-day conditions. It was converted, in order of decreasing yields, to GA29, 3-epi-GA1 (pseudo-GA1), C/D-ring-rearranged GA20, and two minor, unidentified metabolites. Identifications were made by gas-liquid chromatography with radioactive monitoring using three different phases. Metabolism to 3-epi-GA1 was greater under short days, particularly in the treated leaf pair, although the absolute amount of GA29 was greater than that of 3-epi-GA1 under both photoperiods. The levels of radioactive metabolites in the shoots above the treated leaf pair gradually increased over a 51-day period, GA29 reaching 5 times the content of 3-epi-GA1.

  2. Immediately loaded blade implant retrieved after a 20-year loading period: a histologic and histomorphometric case report.

    Science.gov (United States)

    Di Stefano, Danilo; Iezzi, Giovanna; Scarano, Antonio; Perrotti, Vittoria; Piattelli, Adriano

    2006-01-01

    Immediate loading of root-form dental implants has shown promising results and offers treatment cost and convenience advantages to patients. Although blade implants have been immediately loaded for over 2 decades, the ability of this implant design to achieve osseointegration has been debated. The aim of the present study was to histologically evaluate the peri-implant tissues of an immediately loaded blade implant retrieved for abutment fracture after a 20-year loading period. Histologic samples were prepared and examined by light microscope. Compact, cortical, mature bone with well-formed osteons was present at the interface of the implant. Bone-to-implant contact was 51% +/- 6%. The histologic data showed that osseointegration was obtained in an immediately loaded blade implant inserted into the mandible, and that mineralized tissues were maintained at the interface over a long period (20 years).

  3. DMINDA: an integrated web server for DNA motif identification and analyses.

    Science.gov (United States)

    Ma, Qin; Zhang, Hanyuan; Mao, Xizeng; Zhou, Chuan; Liu, Bingqiang; Chen, Xin; Xu, Ying

    2014-07-01

    DMINDA (DNA motif identification and analyses) is an integrated web server for DNA motif identification and analyses, accessible at http://csbl.bmb.uga.edu/DMINDA/. The web site is freely available to all users, with no login requirement. The server provides a suite of cis-regulatory motif analysis functions on DNA sequences, which are important to the elucidation of the mechanisms of transcriptional regulation: (i) de novo motif finding for a given set of promoter sequences, along with statistical scores for the predicted motifs derived from information extracted from a control set; (ii) scanning for instances of a query motif in provided genomic sequences; (iii) motif comparison and clustering of identified motifs; and (iv) co-occurrence analyses of query motifs in given promoter sequences. The server is powered by a backend computer cluster with over 150 computing nodes and is particularly useful for motif prediction and analyses in prokaryotic genomes. We believe that DMINDA, as a new and comprehensive web server for cis-regulatory motif finding and analyses, will benefit the genomic research community in general and prokaryotic genome researchers in particular. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
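Motif-instance scanning of the kind offered by such servers is commonly done by sliding a position weight matrix (PWM) along the sequence and keeping windows whose log-odds score exceeds a threshold. The sketch below is a generic illustration of that idea, not DMINDA's actual algorithm; the 3-column PWM and sequence are hypothetical.

```python
import math

def pwm_scan(seq, pwm, background=0.25, threshold=0.0):
    """Score every window of a DNA sequence against a position
    weight matrix (log-odds vs. a uniform background) and return
    the windows scoring at or above the threshold."""
    w = len(pwm)
    hits = []
    for i in range(len(seq) - w + 1):
        window = seq[i:i + w]
        score = sum(math.log2(pwm[j][base] / background)
                    for j, base in enumerate(window))
        if score >= threshold:
            hits.append((i, window, round(score, 2)))
    return hits

# Hypothetical PWM strongly preferring the trinucleotide "TAT"
pwm = [
    {"A": 0.1, "C": 0.1, "G": 0.1, "T": 0.7},
    {"A": 0.7, "C": 0.1, "G": 0.1, "T": 0.1},
    {"A": 0.1, "C": 0.1, "G": 0.1, "T": 0.7},
]
hits = pwm_scan("GGTATCC", pwm, threshold=3.0)
```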

  4. A systematic review of the quality and impact of anxiety disorder meta-analyses.

    Science.gov (United States)

    Ipser, Jonathan C; Stein, Dan J

    2009-08-01

    Meta-analyses are seen as representing the pinnacle of a hierarchy of evidence used to inform clinical practice. Therefore, the potential importance of differences in the rigor with which they are conducted and reported warrants consideration. In this review, we use standardized instruments to describe the scientific and reporting quality of meta-analyses of randomized controlled trials of the treatment of anxiety disorders. We also use traditional and novel metrics of article impact to assess the influence of meta-analyses across a range of research fields in the anxiety disorders. Overall, although the meta-analyses that we examined had some flaws, their quality of reporting was generally acceptable. Neither the scientific nor reporting quality of the meta-analyses was predicted by any of the impact metrics. The finding that treatment meta-analyses were cited less frequently than quantitative reviews of studies in current "hot spots" of research (ie, genetics, imaging) points to the multifactorial nature of citation patterns. A list of the meta-analyses included in this review is available on an evidence-based website of anxiety and trauma-related disorders.

  5. Predictable or not predictable? The MOV question

    International Nuclear Information System (INIS)

    Thibault, C.L.; Matzkiw, J.N.; Anderson, J.W.; Kessler, D.W.

    1994-01-01

    Over the past 8 years, the nuclear industry has struggled to understand the dynamic phenomena experienced during motor-operated valve (MOV) operation under differing flow conditions. For some valves and designs, operational functionality has been found to be predictable; for others, unpredictable. Although much has been accomplished over this period, especially on modeling valve dynamics, the unpredictability of many valves and designs remains. A few valve manufacturers are focusing on improving design and fabrication techniques to enhance product reliability and predictability. However, this approach does not address these issues for installed, unpredictable valves. This paper presents some of the more promising techniques that Wyle Laboratories has explored with potential for transforming unpredictable valves into predictable ones and for retrofitting installed MOVs. These techniques include optimized valve tolerancing, surrogate material evaluation, and enhanced surface treatments.

  6. A genetic study of sex hormone-binding globulin measured before and after a 20-week endurance exercise training program: the HERITAGE Family Study.

    Science.gov (United States)

    An, P; Rice, T; Gagnon, J; Hong, Y; Leon, A S; Skinner, J S; Wilmore, J H; Bouchard, C; Rao, D C

    2000-08-01

    Familial aggregation and a major gene effect were assessed for baseline serum sex hormone-binding globulin (SHBG) levels and the response (post-training minus baseline) to a 20-week endurance training program in a selected sample of 428 non-obese nonhypertensive individuals from 99 white families who were sedentary at baseline in the HERITAGE Family Study. Baseline SHBG levels were not normally distributed, and were therefore logarithmically transformed prior to genetic analyses. In a sample without postmenopausal mothers, maximal (genetic and familial environmental) heritabilities were 50% averaged across sexes, 73% in men, 50% in women, and 31% in men versus women for the age-body mass index (BMI)-adjusted baseline. The estimate reached 64% when the baseline was further adjusted for the effects of estradiol, fasting insulin, and testosterone levels. For the response to training, no sex difference was found and the heritability reached about 25% to 32%. Segregation analysis was separately performed in the whole sample and in the sample without postmenopausal mothers. In addition to a multifactorial effect for both the baseline and the response to training, a major effect for the baseline appeared to be familial environmental in origin, whereas a major effect for the response to training was Mendelian in nature. The major gene effect for the response to training in the whole sample was undetectable in the sample without postmenopausal mothers, and it is therefore possible that the postmenopausal mothers, characterized by decreased sex hormones with or without estrogen replacement therapy for menopause, produced some confounding effects. In addition, the reduced sample size might also be a plausible candidate explanation. The novel finding in this study is that baseline SHBG levels and the response to training were influenced by a multifactorial effect with sex difference for the baseline. 
The response to training appeared to be additionally influenced by a single

  7. Does a 20-week aerobic exercise training programme increase our capabilities to buffer real-life stressors? A randomized, controlled trial using ambulatory assessment.

    Science.gov (United States)

    von Haaren, Birte; Ottenbacher, Joerg; Muenz, Julia; Neumann, Rainer; Boes, Klaus; Ebner-Priemer, Ulrich

    2016-02-01

    The cross-stressor adaptation hypothesis suggests that regular exercise leads to adaptations in the stress response systems that reduce physiological responses to psychological stressors. Even though an exercise intervention to buffer the detrimental effects of psychological stressors on health might be of utmost importance, empirical evidence is mixed, which may be explained by the use of cross-sectional designs and stressors that are not personally relevant. Using a randomized controlled trial, we hypothesized that a 20-week aerobic exercise training program reduces physiological stress responses to psychological real-life stressors in sedentary students. Sixty-one students were randomized to either a control group or an exercise training group. The academic examination period (end of the semester) served as a real-life stressor. We used ambulatory assessment methods to assess physiological stress reactivity of the autonomic nervous system (heart rate variability: LF/HF ratio, RMSSD), physical activity, and perceived stress during 2 days of everyday life, and multilevel models for data analyses. Aerobic capacity (VO2max) was assessed pre- and post-intervention via cardiopulmonary exercise testing to analyze the effectiveness of the intervention. During real-life stressors, the exercise training group showed significantly reduced LF/HF (β = -0.15, t = -2.59, p = .01) and increased RMSSD (β = 0.15, t = 2.34, p = .02) compared to the control group. Using a randomized controlled trial and a real-life stressor, we showed that exercise appears to be a useful preventive strategy to buffer the effects of stress on the autonomic nervous system, which might otherwise result in detrimental health outcomes.
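The RMSSD index used above is the root mean square of successive differences between inter-beat (RR) intervals, a standard time-domain measure of vagally mediated heart rate variability. A minimal sketch (the RR series below is hypothetical):

```python
import math

def rmssd(rr_ms):
    """Root mean square of successive differences of RR intervals
    (in ms) -- the time-domain HRV index reported in the study."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Hypothetical RR intervals (ms) from a short ECG segment
rr = [812, 845, 790, 860, 815]
value = rmssd(rr)
```

Higher RMSSD reflects greater parasympathetic influence, which is why the observed increase in the exercise group is interpreted as a buffered stress response.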

  8. Association of early adult modifiable cardiovascular risk factors with left atrial size over a 20-year follow-up period: the CARDIA study

    Science.gov (United States)

    Armstrong, Anderson C; Gidding, Samuel S; Colangelo, Laura A; Kishi, Satoru; Liu, Kiang; Sidney, Stephen; Konety, Suma; Lewis, Cora E; Correia, Luís C L; Lima, Joao A C

    2014-01-01

    Objectives We investigated how early adult levels of, and 20-year changes in, modifiable cardiovascular risk factors (MRF) predict left atrial dimension (LAD) at age 43-55 years. Methods The Coronary Artery Risk Development in Young Adults (CARDIA) study enrolled black and white adults (1985-1986). We included 2903 participants with echocardiography and MRF assessment at follow-up years 5 and 25. At years 5 and 25, LAD was assessed by M-mode echocardiography and then indexed to body surface area (BSA) or height. Blood pressure (BP), body mass index (BMI), heart rate (HR), smoking, alcohol use, diabetes, and physical activity were defined as MRF. Associations of MRF with LAD were assessed using multivariable regression adjusted for age, ethnicity, gender, and year-5 left atrial (LA) size. Results The participants were 30±4 years old; 55% white; 44% men. LAD and LAD/height were modestly but significantly higher over the follow-up period, whereas LAD/BSA decreased slightly. Higher baseline values of, and 20-year increases in, BP were related to enlargement of LAD and its indices. Higher baseline values of, and changes in, BMI were also related to higher LAD and LAD/height, but the opposite direction was found for LAD/BSA. Higher baseline HR was related to lower LAD but not the LAD indices when only baseline covariates were included in the model; however, baseline and 20-year changes in HR were significantly associated with LA size. Conclusions In a biracial cohort of young adults, the most robust predictors of LA enlargement over a 20-year follow-up period were higher BP and BMI. However, an inverse direction was found for the relationship between BMI and LAD/BSA. HR showed an inverse relation to LA size. PMID:24384901

  9. Proposed Testing to Assess the Accuracy of Glass-To-Metal Seal Stress Analyses.

    Energy Technology Data Exchange (ETDEWEB)

    Chambers, Robert S.; Emery, John M; Tandon, Rajan; Antoun, Bonnie R.; Stavig, Mark E.; Newton, Clay S.; Gibson, Cory S; Bencoe, Denise N.

    2014-09-01

    The material characterization tests conducted on 304L VAR stainless steel and Schott 8061 glass have provided higher-fidelity data for calibration of material models used in glass-to-metal (GTM) seal analyses. Specifically, a thermo-multi-linear elastic-plastic (thermo-MLEP) material model has been defined for SS304L, and the Simplified Potential Energy Clock nonlinear viscoelastic model has been calibrated for the S8061 glass. To assess the accuracy of finite element stress analyses of GTM seals, a suite of tests is proposed to provide data for comparison to model predictions.

  10. Nonlinear finite element analyses: advances and challenges in dental applications.

    Science.gov (United States)

    Wakabayashi, N; Ona, M; Suzuki, T; Igarashi, Y

    2008-07-01

    To discuss the development and current status of application of nonlinear finite element method (FEM) in dentistry. The literature was searched for original research articles with keywords such as nonlinear, finite element analysis, and tooth/dental/implant. References were selected manually or searched from the PUBMED and MEDLINE databases through November 2007. The nonlinear problems analyzed in FEM studies were reviewed and categorized into: (A) nonlinear simulations of the periodontal ligament (PDL), (B) plastic and viscoelastic behaviors of dental materials, (C) contact phenomena in tooth-to-tooth contact, (D) contact phenomena within prosthodontic structures, and (E) interfacial mechanics between the tooth and the restoration. The FEM in dentistry recently focused on simulation of realistic intra-oral conditions such as the nonlinear stress-strain relationship in the periodontal tissues and the contact phenomena in teeth, which could hardly be solved by the linear static model. The definition of contact area critically affects the reliability of the contact analyses, especially for implant-abutment complexes. To predict the failure risk of a bonded tooth-restoration interface, it is essential to assess the normal and shear stresses relative to the interface. The inclusion of viscoelasticity and plastic deformation to the program to account for the time-dependent, thermal sensitive, and largely deformable nature of dental materials would enhance its application. Further improvement of the nonlinear FEM solutions should be encouraged to widen the range of applications in dental and oral health science.

  11. Non-Statistical Methods of Analysing of Bankruptcy Risk

    Directory of Open Access Journals (Sweden)

    Pisula Tomasz

    2015-06-01

    The article focuses on assessing the effectiveness of a non-statistical approach to bankruptcy modelling in enterprises operating in the logistics sector. To describe the issue more comprehensively, the prediction of possible negative results of business operations was carried out for companies functioning in the Polish region of Podkarpacie and in Slovakia. The bankruptcy predictors selected for the assessment of companies operating in the logistics sector included 28 financial indicators characterizing these enterprises in terms of their financial standing and management effectiveness. The purpose of the study was to identify factors (models) describing the bankruptcy risk in enterprises in the context of their forecasting effectiveness over one-year and two-year time horizons. To assess their practical applicability, the models were carefully analysed and validated. The usefulness of the models was assessed in terms of their classification properties: the capacity to accurately identify both enterprises at risk of bankruptcy and healthy companies, as well as proper calibration of the models to the data from the training samples.
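The classification properties mentioned above boil down to confusion-matrix metrics: sensitivity (at-risk firms correctly flagged), specificity (healthy firms correctly cleared), and overall accuracy. A minimal sketch on hypothetical hold-out labels (not the study's data):

```python
def classification_summary(y_true, y_pred):
    """Sensitivity, specificity, and accuracy from binary labels,
    with 1 = bankrupt and 0 = healthy."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "accuracy": (tp + tn) / len(y_true),
    }

# Hypothetical hold-out sample: 1 = bankrupt, 0 = healthy
truth = [1, 1, 1, 0, 0, 0, 0, 0]
preds = [1, 1, 0, 0, 0, 0, 1, 0]
metrics = classification_summary(truth, preds)
```

Reporting sensitivity and specificity separately matters here because bankruptcy samples are typically imbalanced, so raw accuracy alone can hide a model that never flags at-risk firms.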

  12. Quantifying Fire Cycle from Dendroecological Records Using Survival Analyses

    Directory of Open Access Journals (Sweden)

    Dominic Cyr

    2016-06-01

    Quantifying fire regimes in the boreal forest ecosystem is crucial for understanding past and present dynamics, as well as for predicting future dynamics. Survival analyses have often been used to estimate the fire cycle in eastern Canada because they make it possible to take into account the censored information that is made prevalent by the typically long fire return intervals and the limited scope of the dendroecological methods used to quantify them. Here, we assess how the true length of the fire cycle, short-term temporal variations in fire activity, and sampling effort affect the accuracy and precision of estimates obtained from two types of parametric survival models, the Weibull and the exponential models, and one non-parametric model obtained with the Cox regression. We then apply those results to a case area located in eastern Canada. Our simulation experiment confirms some documented concerns regarding the detrimental effects of temporal variations in fire activity on parametric estimation of the fire cycle. The Cox regression appears to provide the most accurate and robust estimator, being by far the least affected by temporal variations in fire activity. The Cox-based estimate of the fire cycle for the last 300 years in the case study area is 229 years (CI95: 162-407), compared with the likely overestimated 319 years obtained with the commonly used exponential model.
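Under the exponential survival model mentioned above, the maximum-likelihood estimate of the fire cycle (the mean fire return interval) is simply total observed exposure time divided by the number of observed fires; censored intervals contribute exposure but no event. A sketch on hypothetical dendroecological records (not the study's data):

```python
def exponential_fire_cycle(intervals):
    """MLE of the fire cycle under an exponential survival model.
    `intervals` is a list of (time_since_last_fire, event) pairs,
    where event=1 marks an observed fire and event=0 a censored
    interval (no fire recorded up to that time). Censored records
    add exposure but no event, which is why dropping them would
    bias the estimated cycle low."""
    exposure = sum(t for t, _ in intervals)
    events = sum(e for _, e in intervals)
    return exposure / events

# Hypothetical records: (years observed, fire observed?)
records = [(120, 1), (85, 1), (240, 0), (60, 1), (300, 0)]
cycle = exponential_fire_cycle(records)
```

This also illustrates the article's point: if fire activity varies over time, a single exponential rate is misspecified, whereas the Cox model needs no such constant-hazard assumption.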

  13. Visualizing Confidence in Cluster-Based Ensemble Weather Forecast Analyses.

    Science.gov (United States)

    Kumpf, Alexander; Tost, Bianca; Baumgart, Marlene; Riemer, Michael; Westermann, Rudiger; Rautenhaus, Marc

    2018-01-01

    In meteorology, cluster analysis is frequently used to determine representative trends in ensemble weather predictions in a selected spatio-temporal region, e.g., to reduce a set of ensemble members to simplify and improve their analysis. Identified clusters (i.e., groups of similar members), however, can be very sensitive to small changes of the selected region, so that clustering results can be misleading and bias subsequent analyses. In this article, we, a team of visualization scientists and meteorologists, deliver visual analytics solutions to analyze the sensitivity of clustering results with respect to changes of a selected region. We propose an interactive visual interface that enables simultaneous visualization of (a) the variation in composition of identified clusters (i.e., their robustness), (b) the variability in cluster membership for individual ensemble members, and (c) the uncertainty in the spatial locations of identified trends. We demonstrate that our solution shows meteorologists how representative a clustering result is, and with respect to which changes in the selected region it becomes unstable. Furthermore, our solution helps to identify those ensemble members which stably belong to a given cluster and can thus be considered similar. In a real-world application case we show how our approach is used to analyze the clustering behavior of different regions in a forecast of "Tropical Cyclone Karl", guiding the user towards the cluster robustness information required for subsequent ensemble analysis.
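One common way to quantify the cluster-composition robustness discussed above is the Jaccard similarity between a cluster's member set before and after a perturbation of the selected region. This is a generic illustration, not the paper's specific measure; the member indices are hypothetical.

```python
def jaccard(members_a, members_b):
    """Jaccard similarity of two cluster member sets: size of the
    intersection over size of the union (1.0 = identical
    membership, 0.0 = disjoint)."""
    a, b = set(members_a), set(members_b)
    return len(a & b) / len(a | b)

# Hypothetical: ensemble members assigned to one cluster for the
# original region and for a slightly shifted region
original = {0, 3, 5, 7, 12, 14}
shifted = {0, 3, 5, 7, 14, 21}
stability = jaccard(original, shifted)
```

A stability near 1 across many small region perturbations marks members that "stably belong" to the cluster in the sense used in the abstract.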

  14. Genome-wide analyses of small noncoding RNAs in streptococci

    Directory of Open Access Journals (Sweden)

    Nadja Patenge

    2015-05-01

    Streptococci represent a diverse group of Gram-positive bacteria which colonize a wide range of animal and human hosts. Streptococcal species occur as commensal as well as pathogenic organisms. Many of the pathogenic species can cause severe, invasive infections in their hosts, leading to high morbidity and mortality. The consequence is tremendous suffering on the part of humans and livestock, besides the significant financial burden in the agricultural and healthcare sectors. An environmentally stimulated and tightly controlled expression of virulence factor genes is of fundamental importance for streptococcal pathogenicity. Bacterial small noncoding RNAs (sRNAs) modulate the expression of genes involved in stress response, sugar metabolism, surface composition, and other properties that are related to bacterial virulence. Even though the regulatory character is shared by this class of RNAs, variation at the molecular level results in a high diversity of functional mechanisms. Knowledge about the role of sRNAs in streptococci is still limited, but in recent years genome-wide screens for sRNAs have been conducted in an increasing number of species. Bioinformatics prediction approaches have been employed, as well as expression analyses by classical array techniques or next-generation sequencing. This review gives an overview of whole-genome screens for sRNAs in streptococci, with a focus on describing the different methods and comparing their outcomes with respect to sRNA conservation among species, functional similarities, and relevance for streptococcal infection.

  15. Comparative transcriptomic analyses of male and female adult Toxocara canis.

    Science.gov (United States)

    Zhou, Rong-Qiong; Ma, Guang-Xu; Korhonen, Pasi K; Luo, Yong-Li; Zhu, Hong-Hong; Luo, Yong-Fang; Gasser, Robin B; Xia, Qing-You

    2017-02-05

    Toxocariasis is an important, neglected zoonosis caused mainly by Toxocara canis. Although our knowledge of helminth molecular biology is improving through completed draft genome projects, there is limited detailed information on the molecular biology of Toxocara species. Here, transcriptomic sequencing of male and female adult T. canis and comparative analyses were conducted. For each sex, two-thirds (66-67%) of quality-filtered reads mapped to the gene set of T. canis, and at least five reads mapped to each of 16,196 (87.1%) of all 18,596 genes, and 321 genes were specifically transcribed in female and 1467 in male T. canis. Genes differentially transcribed between the two sexes were identified, enriched biological processes and pathways linked to these genes established, and molecules associated with reproduction and development predicted. In addition, small RNA pathways involved in reproduction were characterized, but there was no evidence for piwi RNA pathways in adult T. canis. The results of this transcriptomic study should provide a useful basis to support investigations of the reproductive biology of T. canis and related nematodes. Copyright © 2016 Elsevier B.V. All rights reserved.

  16. Experimental and numerical analyses of magnesium alloy hot workability

    Directory of Open Access Journals (Sweden)

    F. Abbassi

    2016-12-01

    Due to their hexagonal crystal structure, magnesium alloys have relatively low workability at room temperature. In this study, the hot workability behavior of cast-extruded AZ31B magnesium alloy is studied through hot compression testing, numerical modeling, and microstructural analyses. Hot deformation tests are performed at temperatures of 250 °C to 400 °C under strain rates of 0.01 to 1.0 s−1. Transmission electron microscopy is used to reveal the presence of dynamic recrystallization (DRX), dynamic recovery (DRY), cracks, and shear bands. To predict plastic instabilities during hot compression tests of AZ31B magnesium alloy, the authors use the Johnson–Cook damage model in a 3D finite element simulation. The optimal hot workability of the magnesium alloy is found at a temperature (T) of 400 °C and a strain rate (ε˙) of 0.01 s−1. Stability is found at the lower strain rate, and instability at the higher strain rate.
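For reference, the standard published form of the Johnson–Cook damage model cited above accumulates damage as the fraction of equivalent plastic strain relative to a failure strain that depends on stress triaxiality, strain rate, and temperature (the parameter values D1 to D5 are material-specific and are not given in the abstract):

```latex
D = \sum \frac{\Delta\bar{\varepsilon}_p}{\varepsilon_f}, \qquad
\varepsilon_f = \left[D_1 + D_2 \exp\!\left(D_3\,\sigma^{*}\right)\right]
                \left[1 + D_4 \ln \dot{\varepsilon}^{*}\right]
                \left[1 + D_5\, T^{*}\right]
```

Here \(\sigma^{*}\) is the stress triaxiality, \(\dot{\varepsilon}^{*}\) the dimensionless strain rate, and \(T^{*}\) the homologous temperature; an element is considered failed (and is typically deleted in the FE simulation) when \(D\) reaches 1.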

  17. Sensitivity analyses of the Peach Bottom turbine trip 2 experiment

    International Nuclear Information System (INIS)

    Bousbia Salah, A.; D'Auria, F.

    2003-01-01

    In light of the sustained development in computer technology, the possibilities for code calculations in predicting more realistic transient scenarios in nuclear power plants have been substantially enlarged. It has therefore become feasible to perform 'best-estimate' simulations through the incorporation of three-dimensional modeling of the reactor core into system codes. This method is particularly suited to complex transients that involve strong feedback effects between thermal-hydraulics and kinetics, as well as to transients involving local asymmetric effects. The Peach Bottom turbine trip test is characterized by a prompt core power excursion followed by a self-limiting power behavior. To emphasize and understand the feedback mechanisms involved during this transient, a series of sensitivity analyses was carried out. This should allow the characterization of discrepancies between measured and calculated trends and an assessment of the impact of the thermal-hydraulic and kinetic response of the models used. On the whole, the data comparison revealed a close dependency of the power excursion on the core feedback mechanisms. Thus, for a better best-estimate simulation of the transient, both the thermal-hydraulic and the kinetic models should be made more accurate. (author)

  18. Kuosheng Mark III containment analyses using GOTHIC

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Ansheng, E-mail: samuellin1999@iner.gov.tw; Chen, Yen-Shu; Yuann, Yng-Ruey

    2013-10-15

    Highlights: • The Kuosheng Mark III containment model is established using GOTHIC. • Containment pressure and temperature responses due to LOCA are presented. • The calculated results are all below the design values and are compared with the FSAR results. • The calculated results can serve as an analysis reference for an SPU project in the future. -- Abstract: Kuosheng nuclear power plant in Taiwan is a twin-unit BWR/6 plant, and both units utilize the Mark III containment. Currently, the plant is performing a stretch power uprate (SPU) project to increase the core thermal power to 103.7% OLTP (original licensed thermal power). However, the containment response in the Kuosheng Final Safety Analysis Report (FSAR) was completed more than twenty-five years ago. The purpose of this study is to establish a Kuosheng Mark III containment model using the containment program GOTHIC. The containment pressure and temperature responses under the design-basis accidents, which are the main steam line break (MSLB) and the recirculation line break (RCLB) accidents, are investigated. Short-term and long-term analyses are presented in this study. The short-term analysis calculates the drywell peak pressure and temperature, which occur in the early stage of the LOCAs; the long-term analysis calculates the peak pressure and temperature of the reactor building space. In the short-term analysis, the calculated peak drywell-to-wetwell differential pressure is 140.6 kPa for the MSLB, which is below the design value of 189.6 kPa. The calculated peak drywell temperature is 158 °C, which is still below the design value of 165.6 °C. In addition, in the long-term analysis, the calculated peak containment pressure is 47 kPa G, which is below the design value of 103.4 kPa G. The calculated peak containment temperature is 74.7 °C, which is lower than the design value of 93.3 °C. Therefore, the Kuosheng Mark III containment can maintain its integrity after

  19. Vibro-spring particle size distribution analyser

    International Nuclear Information System (INIS)

    Patel, Ketan Shantilal

    2002-01-01

    This thesis describes the design and development of an automated pre-production particle size distribution analyser for particles in the 20-2000 μm size range. This work is a follow-up to the vibro-spring particle sizer reported by Shaeri. In its most basic form, the instrument comprises a horizontally held, closed-coil helical spring that is partly filled with the test powder and sinusoidally vibrated in the transverse direction. Particle size distribution data are obtained by stretching the spring to known lengths and measuring the mass of the powder discharged from the spring's coils; the size of the particles, on the other hand, is determined from the spring's intercoil distance. The instrument developed by Shaeri had limited use due to its inability to measure sample mass directly. For the device reported here, modifications are made to the original configuration to establish a means of direct sample mass measurement. The feasibility of techniques for measuring the mass of powder retained within the spring is investigated in detail. Initially, the measurement of mass is executed in situ from the vibration characteristics, based on the spring's first-harmonic resonant frequency. This method is often erratic and unreliable due to particle-particle and particle-spring-wall interactions and the bending of the spring. A much more successful alternative is found in a more complicated arrangement in which the spring forms part of a stiff cantilever system pivoted along its main axis. Here, the sample mass is determined in the 'static mode' by monitoring the cantilever beam's deflection following the abrupt termination of vibration. The system performance has been optimised through variations of the mechanical design of the key components and the operating procedure, as well as by taking into account the effect of changes in ambient temperature on the system's response. The thesis also describes the design and development of the ancillary mechanisms. These include the pneumatic
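The in-situ mass measurement from the first-harmonic resonant frequency relies on the ideal single-degree-of-freedom relation f = (1/2π)√(k/m). The sketch below inverts that relation to estimate the retained powder mass from the frequency shift; the effective stiffness and frequencies are hypothetical, and the abstract itself notes this method proved erratic in practice.

```python
import math

def mass_from_resonance(f_loaded_hz, f_empty_hz, k_eff):
    """Estimate the powder mass retained in the spring from the
    shift in first-harmonic resonant frequency, using the ideal
    relation f = (1 / (2*pi)) * sqrt(k / m), i.e. m = k / (2*pi*f)^2.
    k_eff (N/m) and the single degree-of-freedom idealisation are
    assumptions for illustration only."""
    m_loaded = k_eff / (2 * math.pi * f_loaded_hz) ** 2
    m_empty = k_eff / (2 * math.pi * f_empty_hz) ** 2
    return m_loaded - m_empty

# Hypothetical numbers: effective stiffness 500 N/m,
# empty-spring resonance 25 Hz, loaded resonance 22 Hz
dm_kg = mass_from_resonance(22.0, 25.0, 500.0)
```

Adding mass lowers the resonant frequency, so the mass difference follows directly from the two measured frequencies once k_eff is known, which is precisely where the particle-spring interactions undermine the method.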

  20. Advanced core-analyses for subsurface characterization

    Science.gov (United States)

    Pini, R.

    2017-12-01

    The heterogeneity of geological formations varies over a wide range of length scales and represents a major challenge for predicting the movement of fluids in the subsurface. Although they are inherently limited in the accessible length-scale, laboratory measurements on reservoir core samples still represent the only way to make direct observations on key transport properties. Yet, properties derived on these samples are of limited use and should be regarded as sample-specific (or `pseudos'), if the presence of sub-core scale heterogeneities is not accounted for in data processing and interpretation. The advent of imaging technology has significantly reshaped the landscape of so-called Special Core Analysis (SCAL) by providing unprecedented insight on rock structure and processes down to the scale of a single pore throat (i.e. the scale at which all reservoir processes operate). Accordingly, improved laboratory workflows are needed that make use of such wealth of information by e.g., referring to the internal structure of the sample and in-situ observations, to obtain accurate parameterisation of both rock- and flow-properties that can be used to populate numerical models. We report here on the development of such workflow for the study of solute mixing and dispersion during single- and multi-phase flows in heterogeneous porous systems through a unique combination of two complementary imaging techniques, namely X-ray Computed Tomography (CT) and Positron Emission Tomography (PET). The experimental protocol is applied to both synthetic and natural porous media, and it integrates (i) macroscopic observations (tracer effluent curves), (ii) sub-core scale parameterisation of rock heterogeneities (e.g., porosity, permeability and capillary pressure), and direct 3D observation of (iii) fluid saturation distribution and (iv) the dynamic spreading of the solute plumes. 
Suitable mathematical models are applied to reproduce experimental observations, including both 1D and 3D

  1. First Super-Earth Atmosphere Analysed

    Science.gov (United States)

    2010-12-01

    The atmosphere around a super-Earth exoplanet has been analysed for the first time by an international team of astronomers using ESO's Very Large Telescope. The planet, which is known as GJ 1214b, was studied as it passed in front of its parent star and some of the starlight passed through the planet's atmosphere. We now know that the atmosphere is either mostly water in the form of steam or is dominated by thick clouds or hazes. The results will appear in the 2 December 2010 issue of the journal Nature. The planet GJ 1214b was confirmed in 2009 using the HARPS instrument on ESO's 3.6-metre telescope in Chile (eso0950) [1]. Initial findings suggested that this planet had an atmosphere, which has now been confirmed and studied in detail by an international team of astronomers, led by Jacob Bean (Harvard-Smithsonian Center for Astrophysics), using the FORS instrument on ESO's Very Large Telescope. "This is the first super-Earth to have its atmosphere analysed. We've reached a real milestone on the road toward characterising these worlds," said Bean. GJ 1214b has a radius of about 2.6 times that of the Earth and is about 6.5 times as massive, putting it squarely into the class of exoplanets known as super-Earths. Its host star lies about 40 light-years from Earth in the constellation of Ophiuchus (the Serpent Bearer). It is a faint star [2], but it is also small, which means that the size of the planet is large compared to the stellar disc, making it relatively easy to study [3]. The planet travels across the disc of its parent star once every 38 hours as it orbits at a distance of only two million kilometres: about seventy times closer than the Earth orbits the Sun. To study the atmosphere, the team observed the light coming from the star as the planet passed in front of it [4]. During these transits, some of the starlight passes through the planet's atmosphere and, depending on the chemical composition and weather on the planet, specific wavelengths of light are
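The reason a small host star makes the planet "relatively easy to study" is that the transit depth scales as the square of the planet-to-star radius ratio. A rough sketch, taking the abstract's 2.6 Earth-radius planet and an approximate ~0.21 solar-radius value for GJ 1214 (the stellar radius is an assumption, not stated in the abstract):

```python
# Transit depth ~ (R_planet / R_star)**2: a large planet-to-star size
# ratio makes the dip in starlight easier to measure.
R_EARTH_KM = 6371.0
R_SUN_KM = 695_700.0

r_planet = 2.6 * R_EARTH_KM   # from the abstract
r_star = 0.21 * R_SUN_KM      # approximate radius of the M-dwarf GJ 1214

depth = (r_planet / r_star) ** 2
print(f"transit depth ~ {depth * 100:.2f}% of the stellar flux")
```

For a Sun-sized star the same planet would block roughly 25 times less light, which is why transmission spectroscopy of super-Earths favours small stars.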

  2. Predictive systems ecology.

    Science.gov (United States)

    Evans, Matthew R; Bithell, Mike; Cornell, Stephen J; Dall, Sasha R X; Díaz, Sandra; Emmott, Stephen; Ernande, Bruno; Grimm, Volker; Hodgson, David J; Lewis, Simon L; Mace, Georgina M; Morecroft, Michael; Moustakas, Aristides; Murphy, Eugene; Newbold, Tim; Norris, K J; Petchey, Owen; Smith, Matthew; Travis, Justin M J; Benton, Tim G

    2013-11-22

    Human societies, and their well-being, depend to a significant extent on the state of the ecosystems that surround them. These ecosystems are changing rapidly, usually in response to anthropogenic changes in the environment. To determine the likely impact of environmental change on ecosystems and the best ways to manage them, it would be desirable to be able to predict their future states. We present a proposal to develop the paradigm of predictive systems ecology, explicitly to understand and predict the properties and behaviour of ecological systems. We discuss the necessary and desirable features of predictive systems ecology models. There are places where predictive systems ecology is already being practised and we summarize a range of terrestrial and marine examples. Significant challenges remain, but we suggest that ecology would benefit as a scientific discipline, and increase its impact in society, if it were to embrace the need to become more predictive.

  3. The tumor necrosis factor alpha-induced protein 3 (TNFAIP3, A20) imposes a brake on antitumor activity of CD8 T cells.

    Science.gov (United States)

    Giordano, Marilyn; Roncagalli, Romain; Bourdely, Pierre; Chasson, Lionel; Buferne, Michel; Yamasaki, Sho; Beyaert, Rudi; van Loo, Geert; Auphan-Anezin, Nathalie; Schmitt-Verhulst, Anne-Marie; Verdeil, Grégory

    2014-07-29

    The transcription factor NF-κB is central to inflammatory signaling and activation of innate and adaptive immune responses. Activation of the NF-κB pathway is tightly controlled by several negative feedback mechanisms, including A20, a ubiquitin-modifying enzyme encoded by the tnfaip3 gene. Mice with selective deletion of A20 in myeloid, dendritic, or B cells recapitulate some human inflammatory pathology. As we observed high expression of A20 transcripts in dysfunctional CD8 T cells in an autochthonous melanoma, we analyzed the role of A20 in regulation of CD8 T-cell functions, using mice in which A20 was selectively deleted in mature conventional T cells. These mice developed lymphadenopathy and some organ infiltration by T cells but no splenomegaly and no detectable pathology. A20-deleted CD8 T cells had increased sensitivity to antigen stimulation, with production of large amounts of IL-2 and IFNγ, correlated with sustained nuclear expression of the NF-κB components reticuloendotheliosis oncogene c-Rel and p65. Overexpression of A20 by retroviral transduction of CD8 T cells dampened their intratumor accumulation and antitumor activity. In contrast, relief from the A20 brake on NF-κB activation in adoptively transferred antitumor CD8 T cells led to improved control of melanoma growth. Tumor-infiltrating A20-deleted CD8 T cells had enhanced production of IFNγ and TNFα and reduced expression of the inhibitory receptor programmed cell death 1. As manipulation of A20 expression in CD8 T cells did not result in pathologic manifestations in the mice, we propose it as a candidate to be targeted to increase the antitumor efficiency of adoptive T-cell immunotherapy.

  4. Runtime and Pressurization Analyses of Propellant Tanks

    Science.gov (United States)

    Field, Robert E.; Ryan, Harry M.; Ahuja, Vineet; Hosangadi, Ashvin; Lee, Chung P.

    2007-01-01

    Multi-element unstructured CFD has been utilized at NASA SSC to carry out analyses of propellant tank systems in different modes of operation. The three regimes of interest at SSC include (a) tank chill down, (b) tank pressurization, and (c) runtime propellant draw-down and purge. While tank chill down is an important event that is best addressed with long time-scale heat transfer calculations, CFD can play a critical role in the tank pressurization and runtime modes of operation. In these situations, contamination of the propellant by pressurant gas drawn in from the ullage causes a deterioration of the quality of the propellant delivered to the test article. CFD can be used to help quantify the mixing and propellant degradation. During tank pressurization, under some circumstances, rapid mixing of relatively warm pressurant gas with cryogenic propellant can lead to rapid densification of the gas and loss of pressure in the tank. This phenomenon can cause serious problems during testing because of the resulting decrease in propellant flow rate. With proper physical models implemented, CFD can model the coupling between the propellant and pressurant, including heat transfer and phase change effects, and accurately capture the complex physics in the evolving flowfields. This holds the promise of allowing the specification of operational conditions and procedures that could minimize the undesirable mixing and heat transfer inherent in propellant tank operation. It should be noted that traditional CFD modeling is inadequate for such simulations because the fluids in the tank are in a range of different sub-critical and supercritical states, and elaborate phase change and mixing rules have to be developed to accurately model the interaction between the ullage gas and the propellant. We show a typical run-time simulation of a spherical propellant tank, containing RP-1 in this case, being pressurized with room-temperature nitrogen at 540 R. Nitrogen

  5. Sensitivity Analyses for Robust Causal Inference from Mendelian Randomization Analyses with Multiple Genetic Variants.

    Science.gov (United States)

    Burgess, Stephen; Bowden, Jack; Fall, Tove; Ingelsson, Erik; Thompson, Simon G

    2017-01-01

    Mendelian randomization investigations are becoming more powerful and simpler to perform, due to the increasing size and coverage of genome-wide association studies and the increasing availability of summarized data on genetic associations with risk factors and disease outcomes. However, when using multiple genetic variants from different gene regions in a Mendelian randomization analysis, it is highly implausible that all the genetic variants satisfy the instrumental variable assumptions. This means that a simple instrumental variable analysis alone should not be relied on to give a causal conclusion. In this article, we discuss a range of sensitivity analyses that will either support or question the validity of causal inference from a Mendelian randomization analysis with multiple genetic variants. We focus on sensitivity analyses of greatest practical relevance for ensuring robust causal inferences, and those that can be undertaken using summarized data. Aside from cases in which the justification of the instrumental variable assumptions is supported by strong biological understanding, a Mendelian randomization analysis in which no assessment of the robustness of the findings to violations of the instrumental variable assumptions has been made should be viewed as speculative and incomplete. In particular, Mendelian randomization investigations with large numbers of genetic variants without such sensitivity analyses should be treated with skepticism.
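One concrete example of an analysis that can be undertaken using summarized data is the standard inverse-variance weighted (IVW) causal estimate, computed directly from per-variant association coefficients. The sketch below uses synthetic variant effects invented for illustration, not data from any real study:

```python
import numpy as np

def ivw_estimate(bx, by, se_by):
    """Inverse-variance weighted (IVW) causal estimate from summarized
    genetic association data: equivalent to a weighted regression of the
    outcome associations (by) on the exposure associations (bx) through
    the origin, with weights 1/se_by**2."""
    w = 1.0 / se_by ** 2
    return np.sum(w * bx * by) / np.sum(w * bx ** 2)

# Synthetic summarized data for 5 hypothetical variants; true causal
# effect is set to 0.5.
rng = np.random.default_rng(0)
bx = rng.uniform(0.05, 0.2, size=5)
se_by = np.full(5, 0.01)
by = 0.5 * bx + rng.normal(0.0, 0.01, size=5)

print(f"IVW causal estimate: {ivw_estimate(bx, by, se_by):.3f}")
```

Sensitivity analyses of the kind the article discusses (e.g. MR-Egger or median-based estimators) perturb exactly this calculation to probe whether the estimate survives violations of the instrumental variable assumptions.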

  6. Modelling bankruptcy prediction models in Slovak companies

    Directory of Open Access Journals (Sweden)

    Kovacova Maria

    2017-01-01

    Full Text Available Intensive research by academics and practitioners has been devoted to models for bankruptcy prediction and credit risk management. In spite of numerous studies forecasting bankruptcy using traditional statistical techniques (e.g. discriminant analysis and logistic regression) and early artificial intelligence models (e.g. artificial neural networks), there is a trend towards machine learning models (support vector machines, bagging, boosting, and random forest) for predicting bankruptcy one year prior to the event. Comparing the performance of these unconventional approaches with results obtained by discriminant analysis, logistic regression, and neural networks, it has been found that bagging, boosting, and random forest models outperform the other techniques, and that prediction accuracy in the testing sample improves when additional variables are included. On the other hand, the prediction accuracy of the old and well-known bankruptcy prediction models is still quite high. Therefore, we aim to analyse these older models on a dataset of Slovak companies to validate their prediction ability under specific conditions. Furthermore, these models will be remodelled according to new trends by calculating the influence of eliminating selected variables on their overall prediction ability.
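The model comparison described above can be sketched with off-the-shelf classifiers on synthetic data. This is an illustrative setup only (assuming scikit-learn is available); it is not the paper's Slovak-company dataset, feature set, or tuning:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for one-year-ahead bankruptcy data: 10 "financial
# ratios", with roughly a 10% failure rate.
X, y = make_classification(n_samples=2000, n_features=10, n_informative=5,
                           weights=[0.9, 0.1], random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          stratify=y, random_state=42)

results = {}
for name, model in [("logistic regression", LogisticRegression(max_iter=1000)),
                    ("random forest", RandomForestClassifier(random_state=42))]:
    model.fit(X_tr, y_tr)
    results[name] = accuracy_score(y_te, model.predict(X_te))
    print(f"{name}: accuracy = {results[name]:.3f}")
```

On class-imbalanced data like this, raw accuracy flatters any model that favours the majority class, which is why bankruptcy studies usually also report AUC or type I/II error rates.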

  7. Solar Cycle Predictions

    Science.gov (United States)

    Pesnell, William Dean

    2012-01-01

    Solar cycle predictions are needed to plan long-term space missions, just as weather predictions are needed to plan a launch. Fleets of satellites circle the Earth collecting many types of science data, protecting astronauts, and relaying information. All of these satellites are sensitive at some level to solar cycle effects. Predictions of drag on LEO spacecraft are among the most important. Launching a satellite with less propellant can mean a higher orbit, but unanticipated solar activity and increased drag can make that a Pyrrhic victory as you consume the reduced propellant load more rapidly. Energetic events at the Sun can produce crippling radiation storms that endanger all assets in space. Solar cycle predictions also anticipate the shortwave emissions that cause degradation of solar panels. Testing solar dynamo theories by quantitative predictions of what will happen in 5-20 years is the next arena for solar cycle predictions. A summary and analysis of 75 predictions of the amplitude of the upcoming Solar Cycle 24 is presented. The current state of solar cycle predictions, and some thoughts on how those predictions could be made more accurate in the future, will be discussed.

  8. Is Time Predictability Quantifiable?

    DEFF Research Database (Denmark)

    Schoeberl, Martin

    2012-01-01

    Computer architects and researchers in the real-time domain have started to investigate processors and architectures optimized for real-time systems. Optimized for real-time systems means time predictable, i.e., architectures where it is possible to statically derive a tight bound on the worst-case execution time. To compare different approaches we would like to quantify time predictability; that means we need to measure time predictability. In this paper we discuss the different approaches for these measurements and conclude that time predictability is practically not quantifiable. We can only compare the worst-case execution time bounds of different architectures.

  9. Weld investigations by 3D analyses of Charpy V-notch specimens

    DEFF Research Database (Denmark)

    Tvergaard, Viggo; Needleman, Allan

    2005-01-01

    The Charpy impact test is a standard procedure for determining the ductile-brittle transition in welds. The predictions of such tests have been investigated by full three-dimensional transient analyses of Charpy V-notch specimens. The material response is characterised by an elastic-viscoplastic constitutive relation. Material parameters in the weld material differ from those in the base material, and the heat affected zone (HAZ) tends to be more brittle than the other material regions. The effect of weld strength undermatch or overmatch is an important issue. Some specimens, for which the notched surface is rotated relative...

  10. MCNP benchmark analyses of critical experiments for the Space Nuclear Thermal Propulsion program

    Science.gov (United States)

    Selcow, Elizabeth C.; Cerbone, Ralph J.; Ludewig, Hans; Mughabghab, Said F.; Schmidt, Eldon; Todosow, Michael; Parma, Edward J.; Ball, Russell M.; Hoovler, Gary S.

    1993-01-01

    Benchmark analyses have been performed of Particle Bed Reactor (PBR) critical experiments (CX) using the MCNP radiation transport code. The experiments have been conducted at the Sandia National Laboratory reactor facility in support of the Space Nuclear Thermal Propulsion (SNTP) program. The test reactor is a nineteen element water moderated and reflected thermal system. A series of integral experiments have been carried out to test the capabilities of the radiation transport codes to predict the performance of PBR systems. MCNP was selected as the preferred radiation analysis tool for the benchmark experiments. Comparison between experimental and calculational results indicate close agreement. This paper describes the analyses of benchmark experiments designed to quantify the accuracy of the MCNP radiation transport code for predicting the performance characteristics of PBR reactors.
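A common way to express the "close agreement" between calculation and experiment in criticality benchmarks is the calculated-to-experimental (C/E) ratio of the effective multiplication factor k_eff. The sketch below uses invented numbers, not the SNTP benchmark results:

```python
# C/E ratios of k_eff summarize code-vs-experiment agreement; the values
# below are hypothetical placeholders, not the PBR critical experiment data.
experiments = {
    "config A": {"k_exp": 1.0000, "k_mcnp": 0.9982},
    "config B": {"k_exp": 1.0000, "k_mcnp": 1.0014},
}

ce_ratios = {}
for name, r in experiments.items():
    ce_ratios[name] = r["k_mcnp"] / r["k_exp"]
    bias_pcm = (ce_ratios[name] - 1.0) * 1e5  # deviation in pcm (1e-5 in k)
    print(f"{name}: C/E = {ce_ratios[name]:.4f} ({bias_pcm:+.0f} pcm)")
```

Reporting the bias in pcm (per cent mille, 10⁻⁵ in k) is the conventional unit, since reactivity differences of a few hundred pcm are already operationally significant.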

  11. MCNP benchmark analyses of critical experiments for the Space Nuclear Thermal Propulsion program

    International Nuclear Information System (INIS)

    Selcow, E.C.; Cerbone, R.J.; Ludewig, H.; Mughabghab, S.F.; Schmidt, E.; Todosow, M.; Parma, E.J.; Ball, R.M.; Hoovler, G.S.

    1993-01-01

    Benchmark analyses have been performed of Particle Bed Reactor (PBR) critical experiments (CX) using the MCNP radiation transport code. The experiments have been conducted at the Sandia National Laboratory reactor facility in support of the Space Nuclear Thermal Propulsion (SNTP) program. The test reactor is a nineteen element water moderated and reflected thermal system. A series of integral experiments have been carried out to test the capabilities of the radiation transport codes to predict the performance of PBR systems. MCNP was selected as the preferred radiation analysis tool for the benchmark experiments. Comparison between experimental and calculational results indicate close agreement. This paper describes the analyses of benchmark experiments designed to quantify the accuracy of the MCNP radiation transport code for predicting the performance characteristics of PBR reactors

  12. FY01 Supplemental Science and Performance Analyses, Volume 1: Scientific Bases and Analyses, Rev 00

    Energy Technology Data Exchange (ETDEWEB)

    David Dobson

    2001-06-30

    The U.S. Department of Energy (DOE) is considering the possible recommendation of a site at Yucca Mountain, Nevada, for development as a geologic repository for the disposal of high-level radioactive waste and spent nuclear fuel. To facilitate public review and comment, in May 2001 the DOE released the Yucca Mountain Science and Engineering Report (S&ER) (DOE 2001 [DIRS 153849]), which presents technical information supporting the consideration of the possible site recommendation. The report summarizes the results of more than 20 years of scientific and engineering studies. A decision to recommend the site has not been made: the DOE has provided the S&ER and its supporting documents as an aid to the public in formulating comments on the possible recommendation. When the S&ER (DOE 2001 [DIRS 153849]) was released, the DOE acknowledged that technical and scientific analyses of the site were ongoing. Therefore, the DOE noted in the Federal Register Notice accompanying the report (66 FR 23 013 [DIRS 155009], p. 2) that additional technical information would be released before the dates, locations, and times for public hearings on the possible recommendation were announced. This information includes: (1) the results of additional technical studies of a potential repository at Yucca Mountain, contained in this FY01 Supplemental Science and Performance Analyses: Vol. 1, Scientific Bases and Analyses; and FY01 Supplemental Science and Performance Analyses: Vol. 2, Performance Analyses (McNeish 2001 [DIRS 155023]) (collectively referred to as the SSPA) and (2) a preliminary evaluation of the Yucca Mountain site's preclosure and postclosure performance against the DOE's proposed site suitability guidelines (10 CFR Part 963 [64 FR 67054] [DIRS 124754]). By making the large amount of information developed on Yucca Mountain available in stages, the DOE intends to provide the public and interested parties with time to review the available materials and to formulate and

  13. Microstructure Evolution and Flow Stress Model of a 20Mn5 Hollow Steel Ingot during Hot Compression.

    Science.gov (United States)

    Liu, Min; Ma, Qing-Xian; Luo, Jian-Bin

    2018-03-21

    20Mn5 steel is widely used in the manufacture of heavy hydro-generator shafts due to its good strength, toughness, and wear resistance. However, the hot deformation and recrystallization behaviors of 20Mn5 steel compressed at high temperature have not been studied. In this study, hot compression experiments at temperatures of 850-1200 °C and strain rates of 0.01/s-1/s were conducted using a Gleeble thermal and mechanical simulation machine, and the flow stress curves and the microstructures after hot compression were obtained. The effects of temperature and strain rate on microstructure are analyzed. Based on the classical stress-dislocation relation and the kinetics of dynamic recrystallization, a two-stage constitutive model is developed to predict the flow stress of 20Mn5 steel. Comparisons between experimental and predicted flow stresses show that the predicted values are in good agreement with the experimental ones, indicating that the proposed constitutive model is reliable and can be used for numerical simulation of hot forging of 20Mn5 hollow steel ingots.
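Constitutive models for hot deformation of this kind typically combine a sine-hyperbolic Arrhenius relation with the Zener-Hollomon parameter Z = ε̇·exp(Q/RT). A sketch of such a flow-stress relation with hypothetical material constants (the paper's fitted 20Mn5 constants are not reproduced here):

```python
import math

R = 8.314  # universal gas constant, J/(mol*K)

def peak_stress_mpa(strain_rate, temp_c, Q=320e3, A=1.2e12, alpha=0.012, n=4.8):
    """Sine-hyperbolic Arrhenius flow-stress relation commonly used for hot
    deformation: Z = eps_dot * exp(Q / (R*T)) and
    sigma = (1/alpha) * asinh((Z/A)**(1/n)).
    Q, A, alpha, n here are illustrative, NOT fitted 20Mn5 values."""
    T = temp_c + 273.15
    Z = strain_rate * math.exp(Q / (R * T))
    return (1.0 / alpha) * math.asinh((Z / A) ** (1.0 / n))

print(f"sigma(0.1/s, 1000 C) ~ {peak_stress_mpa(0.1, 1000.0):.1f} MPa")
```

The qualitative behaviour matches the experiments: flow stress rises with strain rate and falls with temperature, because both changes move Z.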

  14. Evaluation of high-resolution precipitation analyses using a dense station network

    OpenAIRE

    A. Kann; I. Meirold-Mautner; F. Schmid; G. Kirchengast; J. Fuchsberger; V. Meyer; L. Tüchler; B. Bica

    2015-01-01

    The ability of radar–rain gauge merging algorithms to precisely analyse convective precipitation patterns is of high interest for many applications, e.g. hydrological modelling, thunderstorm warnings, and, as a reference, to spatially validate numerical weather prediction models. However, due to drawbacks of methods like cross-validation and due to the limited availability of reference data sets on high temporal and spatial scales, an adequate validation is usually hardly po...

  15. Seismic criteria studies and analyses. Quarterly progress report No. 3. [LMFBR

    Energy Technology Data Exchange (ETDEWEB)

    1975-01-03

    Information is presented concerning the extent to which vibratory motions at the subsurface foundation level might differ from motions at the ground surface and the effects of the various subsurface materials on the overall Clinch River Breeder Reactor site response; seismic analyses of LMFBR type reactors to establish analytical procedures for predicting structure stresses and deformations; and aspects of the current technology regarding the representation of energy losses in nuclear power plants as equivalent viscous damping.

  16. Life satisfaction and frailty in community-based older adults: cross-sectional and prospective analyses.

    Science.gov (United States)

    St John, Philip D; Tyas, Suzanne L; Montgomery, Patrick R

    2013-10-01

    Frailty may be associated with reduced life satisfaction (LS). The objectives of this paper are to determine whether (1) frailty is associated with LS in community-dwelling older adults in cross-sectional analyses; (2) frailty predicts LS five years later; and (3) specific domains of LS are preferentially associated with frailty. This paper presents an analysis of an existing population-based cohort study of 1,751 persons aged 65+ who were assessed in 1991, with follow-up five years later. LS was measured using the terrible-delightful scale, which measures overall LS and LS in specific domains. Frailty was measured using the Brief Frailty Instrument. Analyses were adjusted for age, gender, education, and marital status. Frailty was associated with overall LS at time 1 and predicted overall LS at time 2. This was seen in unadjusted analyses and after adjusting for confounding factors. Frailty was associated with all domains of LS at time 1, and predicted LS at time 2 in all domains except housing and self-esteem. However, the effect was stronger for LS with health than with other domains at both time 1 and time 2. Frailty is associated with LS, and the effect is strongest for LS with health.

  17. A Rare Case of Charcot-Marie-Tooth Disease Type 2S in a 20-year-old Man

    Directory of Open Access Journals (Sweden)

    Natalia A. Shnayder

    2017-12-01

    Full Text Available Charcot-Marie-Tooth disease type 2S (CMT2S) is a rare form of Charcot-Marie-Tooth disease (CMT) characterized by mutation of the IGHMBP2 gene. This gene encodes a helicase superfamily member that binds a specific DNA sequence from the region of the immunoglobulin mu chain switch. Mutation of this gene leads to spinal muscular atrophy with respiratory distress type 1 and to CMT2S. This case report presents a 20-year-old male with genetically confirmed CMT2S who had clinical respiratory involvement and symmetrically involved lower extremities. DNA sequencing revealed a previously unknown heterozygous mutation in exon 2 of the IGHMBP2 gene leading to the replacement of the amino acid at position 46 of the protein (chr11q13.3: 68673587 G>C). These atypical features widen the clinical spectrum of CMT2S. In describing this clinical case, we also aim to improve diagnostic management and to raise the alertness of doctors across specialties towards neuromuscular diseases, including CMT.

  18. A 20-40 MHz low-power clock oscillator with open-loop frequency calibration and temperature compensation

    Science.gov (United States)

    Lee, Dongsoo; Kim, Hongjin; Lee, Kang-Yoon

    2014-05-01

    In this paper, a 20-40 MHz low-power clock oscillator is presented to provide the frequency reference in data interface applications. The frequency source is referenced to a frequency-calibrated and temperature-compensated 2.5 GHz LC VCO that is implemented with a bondwire inductor. A Class-C type VCO is adopted in order to improve the phase noise and reduce the current consumption. A fully digital frequency calibration circuit is proposed to cover the wide output frequency range while minimizing the frequency variation. An external crystal oscillator (REF_CLK) is used only for the absolute frequency calibration at the initial programming stage and is not needed afterwards. Temperature compensation, on the other hand, is performed in an analogue way by controlling the varactor in the LC VCO. The chip is fabricated in 0.18-µm CMOS with a lateral PNP transistor option; lateral PNP transistors, which can be implemented in a standard CMOS process, are used in the temperature compensation circuits. The power consumption is 4.8 mW from a 1.8 V supply. The frequency accuracy is ±58 ppm from -20°C to 80°C. The nominal phase noise at 1 MHz offset and the period jitter are -122 dBc/Hz and 2 ps, respectively, when the output frequency is 25 MHz.
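The quoted ±58 ppm accuracy translates into an absolute frequency error by simple arithmetic at the 25 MHz nominal output:

```python
# Converting a ppm (parts-per-million) tolerance into an absolute
# frequency deviation at the nominal output frequency.
f_nominal_hz = 25e6
ppm = 58

delta_hz = f_nominal_hz * ppm * 1e-6
print(f"+/-{ppm} ppm at {f_nominal_hz / 1e6:.0f} MHz = +/-{delta_hz:.0f} Hz")
```

So over the full -20°C to 80°C range the 25 MHz output stays within ±1450 Hz of nominal, which is what makes the oscillator usable as a crystal-less reference after the one-time calibration.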

  19. Trends in frequency and prevalence of oral cancer and oral squamous cell carcinoma in Mexicans. A 20 years retrospective study.

    Science.gov (United States)

    Gaitán-Cepeda, Luis-Alberto; Peniche-Becerra, Adriana-Graciela; Quezada-Rivera, Daniel

    2011-01-01

    To establish the time trends of the frequency and prevalence of oral cavity cancer with regard to age and gender in a 20-year (1989 - 2008) cohort of Mexicans, 13,235 head and neck biopsies from the archive of the Oral Pathology Laboratory, Dental School, National Autonomous University of Mexico were reviewed. The cases with a diagnosis of oral cancer were selected. Gender and age at diagnosis were obtained from medical records. The frequency and prevalence of oral cavity cancer and oral squamous cell carcinoma were assessed biannually with regard to the total population served by the oral pathology laboratory. The statistical significance of trends was established using the linear logistic regression (curve estimation) test (p ≤ 0.05). 298 cases (138 males; 160 females) of oral cancer were included; 167 (92 females; 75 males; female:male ratio 1.1:1) corresponded to oral squamous cell carcinoma. From 1989 to 2008 the prevalence of oral cancer and oral squamous cell carcinoma increased by 200% (p ≤ 0.05) and 100% (p < 0.001), respectively. The increase in frequency and prevalence was observed in both genders, but was significant only in females (p < 0.001). We did not identify changes in the age at diagnosis. Oral cancer, specifically oral squamous cell carcinoma, has increased in Mexican females over the last 20 years.

  20. NIR Color vs Launch Date: A 20-year Analysis of Space Weathering Effects on the Boeing 376 Spacecraft

    Science.gov (United States)

    Frith, J.; Anz-Meador, P.; Lederer, S.; Cowardin, H.; Buckalew, B.

    The Boeing HS-376 spin stabilized spacecraft was a popular design that was launched continuously into geosynchronous orbit starting in 1980 with the last launch occurring in 2002. Over 50 of the HS-376 buses were produced to fulfill a variety of different communication missions for countries all over the world. The design of the bus is easily approximated as a telescoping cylinder that is covered with solar cells and an Earth facing antenna that is despun at the top of the cylinder. The similarity in design and the number of spacecraft launched over a long period of time make the HS-376 a prime target for studying the effects of solar weathering on solar panels as a function of time. A selection of primarily non-operational HS-376 spacecraft launched over a 20 year time period were observed using the United Kingdom Infrared Telescope on Mauna Kea and multi-band near-infrared photometry produced. Each spacecraft was observed for an entire night cycling through ZYJHK filters and time-varying colors produced to compare near-infrared color as a function of launch date. The resulting analysis shown here may help in the future to set launch date constraints on the parent object of unidentified debris objects or other unknown spacecraft.