WorldWideScience

Sample records for analyses predict sistergroup

  1. Phylogenomic analyses predict sistergroup relationship of nucleariids and Fungi and paraphyly of zygomycetes with significant support

    Directory of Open Access Journals (Sweden)

    Steenkamp Emma

    2009-01-01

    Full Text Available Abstract Background Resolving the evolutionary relationships among Fungi remains challenging because of their highly variable evolutionary rates, and lack of a close phylogenetic outgroup. Nucleariida, an enigmatic group of amoeboids, have been proposed to emerge close to the fungal-metazoan divergence and might fulfill this role. Yet, published phylogenies with up to five genes are without compelling statistical support, and genome-level data should be used to resolve this question with confidence. Results Our analyses with nuclear (118 proteins) and mitochondrial (13 proteins) data now robustly associate Nucleariida and Fungi as neighbors, an assemblage that we term 'Holomycota'. With Nucleariida as an outgroup, we revisit unresolved deep fungal relationships. Conclusion Our phylogenomic analysis provides significant support for the paraphyly of the traditional taxon Zygomycota, and contradicts a recent proposal to include Mortierella in a phylum Mucoromycotina. We further question the introduction of separate phyla for Glomeromycota and Blastocladiomycota, whose phylogenetic positions relative to other phyla remain unresolved even with genome-level datasets. Our results motivate broad sampling of additional genome sequences from these phyla.

  2. The first record of a trans-oceanic sister-group relationship between obligate vertebrate troglobites.

    Directory of Open Access Journals (Sweden)

    Prosanta Chakrabarty

    Full Text Available We show, using the most complete phylogeny of one of the most species-rich orders of vertebrates (Gobiiformes) and calibrations from the rich fossil record of teleost fishes, that the genus Typhleotris, endemic to subterranean karst habitats in southwestern Madagascar, is the sister group to Milyeringa, endemic to similar subterranean systems in northwestern Australia. Both groups are eyeless, and our phylogenetic and biogeographic results show that these obligate cave fishes, now found on opposite ends of the Indian Ocean (separated by nearly 7,000 km), are each other's closest relatives and owe their origins to the breakup of the southern supercontinent, Gondwana, at the end of the Cretaceous period. Trans-oceanic sister-group relationships are otherwise unknown between blind, cave-adapted vertebrates, and our results provide an extraordinary case of Gondwanan vicariance.

  3. Comparative analyses of genetic risk prediction methods reveal ...

    Indian Academy of Sciences (India)

    Home; Journals; Journal of Genetics; Volume 94; Issue 1. Comparative analyses of genetic risk prediction methods reveal extreme diversity of genetic predisposition to nonalcoholic fatty liver disease (NAFLD) among ethnic populations of India. Ankita Chatterjee Analabha Basu Abhijit Chowdhury Kausik Das Neeta ...

  4. Neonatal Sleep-Wake Analyses Predict 18-month Neurodevelopmental Outcomes.

    Science.gov (United States)

    Shellhaas, Renée A; Burns, Joseph W; Hassan, Fauziya; Carlson, Martha D; Barks, John D E; Chervin, Ronald D

    2017-11-01

    The neurological examination of critically ill neonates is largely limited to reflexive behavior. The exam often ignores sleep-wake physiology that may reflect brain integrity and influence long-term outcomes. We assessed whether polysomnography and concurrent cerebral near-infrared spectroscopy (NIRS) might improve prediction of 18-month neurodevelopmental outcomes. Term newborns with suspected seizures underwent standardized neurologic examinations to generate Thompson scores and had 12-hour bedside polysomnography with concurrent cerebral NIRS. For each infant, the distribution of sleep-wake stages and electroencephalogram delta power were computed. NIRS-derived fractional tissue oxygen extraction (FTOE) was calculated across sleep-wake stages. At age 18-22 months, surviving participants were evaluated with the Bayley Scales of Infant Development, 3rd edition (Bayley-III). Twenty-nine participants completed the Bayley-III. Increased newborn time in quiet sleep predicted worse 18-month cognitive and motor scores (robust regression models, adjusted r2 = 0.22, p = .007, and 0.27, p = .004, respectively). Decreased 0.5-2 Hz electroencephalogram (EEG) power during quiet sleep predicted worse 18-month language and motor scores (adjusted r2 = 0.25, p = .0005, and 0.33, p = .001, respectively). Predictive values remained significant after adjustment for neonatal Thompson scores or exposure to phenobarbital. Similarly, an attenuated difference in FTOE between neonatal wakefulness and quiet sleep predicted worse 18-month cognitive, language, and motor scores in adjusted analyses (each p < .05). Measures of disturbed neonatal sleep, as quantified by increased time in quiet sleep, lower electroencephalogram delta power during that stage, and muted differences in FTOE between quiet sleep and wakefulness, may improve prediction of adverse long-term outcomes for newborns with neurological dysfunction. © Sleep Research Society 2017. Published by Oxford University Press on behalf of the Sleep Research Society. All rights reserved.

  5. The mitochondrial genome of Paraspadella gotoi is highly reduced and reveals that chaetognaths are a sister-group to protostomes

    Energy Technology Data Exchange (ETDEWEB)

    Helfenbein, Kevin G.; Fourcade, H. Matthew; Vanjani, Rohit G.; Boore, Jeffrey L.

    2004-05-01

    We report the first complete mitochondrial (mt) DNA sequence from a member of the phylum Chaetognatha (arrow worms). The Paraspadella gotoi mtDNA is highly unusual, missing 23 of the genes commonly found in animal mtDNAs, including atp6, which has otherwise been found universally to be present. Its 14 genes are unusually arranged into two groups, one on each strand. One group is punctuated by numerous non-coding intergenic nucleotides, while the other group is tightly packed, having no non-coding nucleotides, leading to speculation that there are two transcription units with differing modes of expression. The phylogenetic position of the Chaetognatha within the Metazoa has long been uncertain, with conflicting or equivocal results from various morphological analyses and rRNA sequence comparisons. Comparisons here of amino acid sequences from mitochondrially encoded proteins give a single most parsimonious tree that supports a position of Chaetognatha as sister to the protostomes studied here. From this, one can more clearly interpret the patterns of evolution of various developmental features, especially regarding the embryological fate of the blastopore.
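The small-parsimony scoring behind a "most parsimonious tree" can be sketched in a few lines with Fitch's algorithm, which counts the minimum number of character-state changes a fixed tree requires. The toy tree, taxa, and states below are invented for illustration and are not the study's data or software:

```python
# Minimal Fitch small-parsimony scoring for one character on a fixed
# binary tree; an illustrative sketch only.

def fitch_score(tree, leaf_states):
    """Return (state_set, changes) for a nested-tuple tree.

    `tree` is either a leaf name or a pair (left, right);
    `leaf_states` maps each leaf name to its character state.
    """
    if not isinstance(tree, tuple):          # leaf node
        return {leaf_states[tree]}, 0
    left_set, left_changes = fitch_score(tree[0], leaf_states)
    right_set, right_changes = fitch_score(tree[1], leaf_states)
    overlap = left_set & right_set
    if overlap:                              # agreement: no extra change
        return overlap, left_changes + right_changes
    return left_set | right_set, left_changes + right_changes + 1

# Hypothetical example: the arrow worm shares a state with protostomes.
tree = ((("snail", "arrow_worm"), "fly"), ("human", "fish"))
states = {"snail": "A", "arrow_worm": "A", "fly": "A",
          "human": "G", "fish": "G"}
sets_, changes = fitch_score(tree, states)
print(changes)   # minimum number of state changes implied by this tree
```

Real analyses repeat this scoring over every site and every candidate topology, keeping the topology with the lowest total.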

  6. Climate Prediction Center (CPC) US daily temperature analyses

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The U.S. daily temperature analyses are maps depicting various temperature quantities utilizing daily maximum and minimum temperature data across the US. Maps are...

  7. Comparative analyses of genetic risk prediction methods reveal ...

    Indian Academy of Sciences (India)

    2015-03-12

    Mar 12, 2015 ... where it is related to modern lifestyle with additional complication due to rising incidence of type 2 diabetes mellitus (DM) and obesity (Angulo and ... is rapidly becoming a health burden in western and developing countries. In this study we defined a model of disease risk score prediction for different ...

  8. Analysing earthquake slip models with the spatial prediction comparison test

    KAUST Repository

    Zhang, L.

    2014-11-10

    Earthquake rupture models inferred from inversions of geophysical and/or geodetic data exhibit remarkable variability due to uncertainties in modelling assumptions, the use of different inversion algorithms, or variations in data selection and data processing. A robust statistical comparison of different rupture models obtained for a single earthquake is needed to quantify the intra-event variability, both for benchmark exercises and for real earthquakes. The same approach may be useful to characterize (dis-)similarities in events that are typically grouped into a common class of events (e.g. moderate-size crustal strike-slip earthquakes or tsunamigenic large subduction earthquakes). For this purpose, we examine the performance of the spatial prediction comparison test (SPCT), a statistical test developed to compare spatial (random) fields by means of a chosen loss function that describes an error relation between a 2-D field (‘model’) and a reference model. We implement and calibrate the SPCT approach for a suite of synthetic 2-D slip distributions, generated as spatial random fields with various characteristics, and then apply the method to results of a benchmark inversion exercise with known solution. We find the SPCT to be sensitive to different spatial correlation lengths, and different heterogeneity levels of the slip distributions. The SPCT approach proves to be a simple and effective tool for ranking the slip models with respect to a reference model.
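The bookkeeping step at the heart of such a comparison, evaluating a pointwise loss for each candidate model against the reference and averaging the loss differential, can be sketched as below. The 2-D fields are made up, and the sketch omits what makes the SPCT a proper test: assessing the significance of the mean loss differential while accounting for spatial correlation.

```python
# Loss-differential sketch: compare two candidate slip models against a
# reference field via an absolute-error loss on each grid cell.

def loss_differential(model_a, model_b, reference):
    """Mean of |a - ref| - |b - ref| over a 2-D grid (lists of rows).

    Negative means model_a fits the reference better on average.
    """
    diffs = []
    for row_a, row_b, row_r in zip(model_a, model_b, reference):
        for a, b, r in zip(row_a, row_b, row_r):
            diffs.append(abs(a - r) - abs(b - r))
    return sum(diffs) / len(diffs)

reference = [[1.0, 2.0], [3.0, 4.0]]
model_a   = [[1.1, 2.1], [3.1, 4.1]]   # uniformly off by 0.1
model_b   = [[1.0, 2.0], [3.0, 5.0]]   # one cell off by 1.0
d = loss_differential(model_a, model_b, reference)
print(d)   # negative here: the uniformly-off model is closer on average
```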

  9. HHV Predicting Correlations for Torrefied Biomass Using Proximate and Ultimate Analyses

    Directory of Open Access Journals (Sweden)

    Daya Ram Nhuchhen

    2017-01-01

    Full Text Available Many correlations are available in the literature to predict the higher heating value (HHV) of raw biomass using the proximate and ultimate analyses. Studies on biomass torrefaction are growing tremendously, and they suggest that fuel characteristics such as HHV, proximate analysis and ultimate analysis change significantly after torrefaction. Such changes may cause high estimation errors if the existing HHV correlations were used to predict the HHV of torrefied biomass. No study has been carried out so far to verify this. Therefore, this study seeks answers to the question: “Can the existing correlations be used to determine the HHV of the torrefied biomass?” To answer this, the existing HHV predicting correlations were tested using torrefied biomass data points. Estimation errors were found to be significantly high for the existing HHV correlations, and thus, they are not suitable for predicting the HHV of torrefied biomass. New correlations were then developed using data points of torrefied biomass. The ranges of reported data for HHV, volatile matter (VM), fixed carbon (FC), ash (ASH), carbon (C), hydrogen (H) and oxygen (O) contents were 14.90 MJ/kg–33.30 MJ/kg, 13.30%–88.57%, 11.25%–82.74%, 0.08%–47.62%, 35.08%–86.28%, 0.53%–7.46% and 4.31%–44.70%, respectively. Correlations with the minimum mean absolute errors and having all components of the proximate and ultimate analyses were selected for future use. The selected new correlations have a good accuracy of prediction when validated against another set of data (26 samples). Thus, these new and more accurate correlations can be useful in modeling different thermochemical processes, including combustion, pyrolysis and gasification of torrefied biomass.
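How such correlations are built and scored can be sketched with a one-variable example: fit a linear form HHV = a + b·FC by least squares and report the mean absolute error used for model selection. The functional form here is deliberately simplified (the paper's selected correlations use all proximate and ultimate components), and the data points are invented for illustration:

```python
# Least-squares fit of HHV (MJ/kg) against fixed carbon (wt%), plus the
# mean absolute error (MAE) used to rank candidate correlations.

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

fc  = [20.0, 35.0, 50.0, 65.0, 80.0]   # fixed carbon, wt% (made up)
hhv = [16.0, 20.5, 24.8, 29.2, 33.0]   # HHV, MJ/kg (made up)
a, b = fit_line(fc, hhv)
mae = sum(abs(a + b * x - y) for x, y in zip(fc, hhv)) / len(fc)
print(round(a, 2), round(b, 3), round(mae, 3))
```

Candidate correlations with different component subsets would each be fit this way, and the one with the lowest MAE on held-out torrefied-biomass samples kept.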

  10. Predictability of the monthly North Atlantic Oscillation index based on fractal analyses and dynamic system theory

    Science.gov (United States)

    Martínez, M. D.; Lana, X.; Burgueño, A.; Serra, C.

    2010-03-01

    The predictability of the monthly North Atlantic Oscillation, NAO, index is analysed from the point of view of different fractal concepts and dynamic system theory such as lacunarity, rescaled analysis (Hurst exponent) and reconstruction theorem (embedding and correlation dimensions, Kolmogorov entropy and Lyapunov exponents). The main results point out evident signs of randomness and the necessity of stochastic models to represent time evolution of the NAO index. The results also show that the monthly NAO index behaves as a white-noise Gaussian process. The high minimum number of nonlinear equations needed to describe the physical process governing the NAO index fluctuations is evidence of its complexity. A notable predictive instability is indicated by the positive Lyapunov exponents. Besides corroborating the complex time behaviour of the NAO index, present results suggest that random Cantor sets would be an interesting tool to model lacunarity and time evolution of the NAO index.
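One ingredient of such fractal analyses, the rescaled-range (R/S) estimate of the Hurst exponent, can be sketched in a few lines. For an uncorrelated series the exponent should sit near 0.5 (small samples bias it slightly upward), consistent with the white-noise conclusion above. The series below is synthetic Gaussian noise, not the NAO index:

```python
# Rescaled-range (R/S) sketch of a Hurst-exponent estimate.  Production
# analyses average R/S over many window sizes and fit a full log-log
# regression; this uses just two scales for brevity.
import math
import random

def rs_statistic(x):
    """Rescaled range R/S of one series."""
    n = len(x)
    mean = sum(x) / n
    devs = [xi - mean for xi in x]
    cum, z = 0.0, []
    for d in devs:                 # cumulative deviation profile
        cum += d
        z.append(cum)
    r = max(z) - min(z)            # range of the profile
    s = math.sqrt(sum(d * d for d in devs) / n)
    return r / s

def hurst(x, n1, n2):
    """Slope of log(mean R/S) vs log(n) between two window sizes."""
    def mean_rs(m):
        chunks = [x[i:i + m] for i in range(0, len(x) - m + 1, m)]
        return sum(rs_statistic(c) for c in chunks) / len(chunks)
    return (math.log(mean_rs(n2)) - math.log(mean_rs(n1))) / \
           (math.log(n2) - math.log(n1))

random.seed(0)
noise = [random.gauss(0.0, 1.0) for _ in range(4096)]
print(round(hurst(noise, 64, 512), 2))   # roughly 0.5 for white noise
```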

  11. Predictability of the monthly North Atlantic Oscillation index based on fractal analyses and dynamic system theory

    Directory of Open Access Journals (Sweden)

    M. D. Martínez

    2010-03-01

    Full Text Available The predictability of the monthly North Atlantic Oscillation, NAO, index is analysed from the point of view of different fractal concepts and dynamic system theory such as lacunarity, rescaled analysis (Hurst exponent) and reconstruction theorem (embedding and correlation dimensions, Kolmogorov entropy and Lyapunov exponents). The main results point out evident signs of randomness and the necessity of stochastic models to represent time evolution of the NAO index. The results also show that the monthly NAO index behaves as a white-noise Gaussian process. The high minimum number of nonlinear equations needed to describe the physical process governing the NAO index fluctuations is evidence of its complexity. A notable predictive instability is indicated by the positive Lyapunov exponents. Besides corroborating the complex time behaviour of the NAO index, present results suggest that random Cantor sets would be an interesting tool to model lacunarity and time evolution of the NAO index.

  12. Computational approaches to analyse and predict small molecule transport and distribution at cellular and subcellular levels.

    Science.gov (United States)

    Min, Kyoung Ah; Zhang, Xinyuan; Yu, Jing-yu; Rosania, Gus R

    2014-01-01

    Quantitative structure-activity relationship (QSAR) studies and mechanistic mathematical modeling approaches have been independently employed for analysing and predicting the transport and distribution of small molecule chemical agents in living organisms. Both of these computational approaches have been useful for interpreting experiments measuring the transport properties of small molecule chemical agents, in vitro and in vivo. Nevertheless, mechanistic cell-based pharmacokinetic models have been especially useful to guide the design of experiments probing the molecular pathways underlying small molecule transport phenomena. Unlike QSAR models, mechanistic models can be integrated from microscopic to macroscopic levels, to analyse the spatiotemporal dynamics of small molecule chemical agents from intracellular organelles to whole organs, well beyond the experiments and training data sets upon which the models are based. Based on differential equations, mechanistic models can also be integrated with other differential equation-based systems biology models of biochemical networks or signaling pathways. Although the origin and evolution of mathematical modeling approaches aimed at predicting drug transport and distribution have occurred independently from systems biology, we propose that the incorporation of mechanistic cell-based computational models of drug transport and distribution into a systems biology modeling framework is a logical next step for the advancement of systems pharmacology research. Copyright © 2013 John Wiley & Sons, Ltd.
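A toy version of the mechanistic, differential-equation approach the abstract describes: two compartments (extracellular medium and cytosol) exchanging a small molecule with first-order rate constants, integrated with forward Euler. The compartments, rate values, and time step are invented for illustration, not drawn from any published model:

```python
# Two-compartment first-order exchange model, integrated with forward
# Euler.  Total mass is conserved; the system relaxes toward the steady
# state where k_in * c_out == k_out * c_in.

def simulate(c_out=1.0, c_in=0.0, k_in=0.3, k_out=0.1,
             dt=0.01, steps=2000):
    """Return (c_out, c_in) after `steps` Euler steps of size `dt`."""
    for _ in range(steps):
        flux = k_in * c_out - k_out * c_in   # net inward flux
        c_out -= flux * dt
        c_in += flux * dt
    return c_out, c_in

out, inside = simulate()
print(round(out, 3), round(inside, 3))
# steady state here: c_in -> k_in / (k_in + k_out) = 0.75 of total mass
```

Richer mechanistic models simply add compartments (organelles, tissues) and rate laws to this same ODE skeleton, which is what makes them composable with other differential-equation systems biology models.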

  13. Transport and stability analyses supporting disruption prediction in high beta KSTAR plasmas

    Science.gov (United States)

    Ahn, J.-H.; Sabbagh, S. A.; Park, Y. S.; Berkery, J. W.; Jiang, Y.; Riquezes, J.; Lee, H. H.; Terzolo, L.; Scott, S. D.; Wang, Z.; Glasser, A. H.

    2017-10-01

    KSTAR plasmas have reached high stability parameters in dedicated experiments, with normalized beta βN exceeding 4.3 at relatively low plasma internal inductance li (βN/li>6). Transport and stability analyses have begun on these plasmas to best understand a disruption-free path toward the design target of βN = 5 while aiming to maximize the non-inductive fraction of these plasmas. Initial analysis using the TRANSP code indicates that the non-inductive current fraction in these plasmas has exceeded 50 percent. The advent of KSTAR kinetic equilibrium reconstructions now allows more accurate computation of the MHD stability of these plasmas. Attention is placed on code validation of mode stability using the PEST-3 and resistive DCON codes. Initial evaluation of these analyses for disruption prediction is made using the disruption event characterization and forecasting (DECAF) code. The present global mode kinetic stability model in DECAF developed for low aspect ratio plasmas is evaluated to determine modifications required for successful disruption prediction of KSTAR plasmas. Work supported by U.S. DoE under contract DE-SC0016614.

  14. Functional enrichment analyses and construction of functional similarity networks with high confidence function prediction by PFP

    Directory of Open Access Journals (Sweden)

    Kihara Daisuke

    2010-05-01

    Full Text Available Abstract Background A new paradigm of biological investigation takes advantage of technologies that produce large high throughput datasets, including genome sequences, interactions of proteins, and gene expression. The ability of biologists to analyze and interpret such data relies on functional annotation of the included proteins, but even in highly characterized organisms many proteins can lack the functional evidence necessary to infer their biological relevance. Results Here we have applied high confidence function predictions from our automated prediction system, PFP, to three genome sequences, Escherichia coli, Saccharomyces cerevisiae, and Plasmodium falciparum (malaria). The percentage of annotated genes is increased by PFP to over 90% for all of the genomes. Using the large coverage of the function annotation, we introduced the functional similarity networks which represent the functional space of the proteomes. Four different functional similarity networks are constructed for each proteome, one each by considering similarity in a single Gene Ontology (GO) category, i.e. Biological Process, Cellular Component, and Molecular Function, and another one by considering overall similarity with the funSim score. The functional similarity networks are shown to have higher modularity than the protein-protein interaction network. Moreover, the funSim score network is distinct from the single GO-score networks by showing a higher clustering degree exponent value and thus has a higher tendency to be hierarchical. In addition, examining function assignments to the protein-protein interaction network and local regions of genomes has identified numerous cases where subnetworks or local regions have functionally coherent proteins. These results will help interpret interactions of proteins and gene orders in a genome. Several examples of both analyses are highlighted. Conclusion The analyses demonstrate that applying high confidence predictions from PFP

  15. Interactions between risk factors in the prediction of onset of eating disorders: Exploratory hypothesis generating analyses.

    Science.gov (United States)

    Stice, Eric; Desjardins, Christopher D

    2018-06-01

    Because no study has tested for interactions between risk factors in the prediction of future onset of each eating disorder, this exploratory study addressed this lacuna to generate hypotheses to be tested in future confirmatory studies. Data from three prevention trials that targeted young women at high risk for eating disorders due to body dissatisfaction (N = 1271; M age 18.5, SD 4.2) and collected diagnostic interview data over 3-year follow-up were combined to permit sufficient power to predict onset of anorexia nervosa (AN), bulimia nervosa (BN), binge eating disorder (BED), and purging disorder (PD) using classification tree analyses, an analytic technique uniquely suited to detecting interactions. Low BMI was the most potent predictor of AN onset, and body dissatisfaction amplified this relation. Overeating was the most potent predictor of BN onset, and positive expectancies for thinness and body dissatisfaction amplified this relation. Body dissatisfaction was the most potent predictor of BED onset, and overeating, low dieting, and thin-ideal internalization amplified this relation. Dieting was the most potent predictor of PD onset, and negative affect and positive expectancies for thinness amplified this relation. Results provided evidence of amplifying interactions between risk factors suggestive of cumulative risk processes that were distinct for each disorder; future confirmatory studies should test the interactive hypotheses generated by these analyses. If hypotheses are confirmed, results may allow interventionists to target ultra high-risk subpopulations with more intensive prevention programs that are uniquely tailored for each eating disorder, potentially improving the yield of prevention efforts. Copyright © 2018 Elsevier Ltd. All rights reserved.
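What a classification-tree analysis does at each node can be shown in miniature: scan candidate (feature, threshold) splits and keep the one with the lowest weighted Gini impurity. Interactions of the kind reported above emerge when a second feature is only selected inside one branch. The features, thresholds, and data rows below are invented, not the study's dataset:

```python
# Single-node split search by weighted Gini impurity, the core move of
# classification tree analysis.

def gini(labels):
    """Gini impurity of a binary label list (empty list -> 0)."""
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 2.0 * p * (1.0 - p)

def best_split(rows, labels):
    """rows: list of feature tuples.  Returns (feature_idx, threshold)."""
    best = (None, None, float("inf"))
    for f in range(len(rows[0])):
        for t in sorted({r[f] for r in rows}):
            left  = [y for r, y in zip(rows, labels) if r[f] <= t]
            right = [y for r, y in zip(rows, labels) if r[f] > t]
            score = (len(left) * gini(left) +
                     len(right) * gini(right)) / len(labels)
            if score < best[2]:
                best = (f, t, score)
    return best[0], best[1]

# features: (BMI, body-dissatisfaction score); label: onset yes/no
rows   = [(17, 2), (18, 5), (17, 6), (22, 3), (23, 6), (24, 2)]
labels = [1, 1, 1, 0, 0, 0]
print(best_split(rows, labels))   # -> (0, 18): low BMI separates onset
```

A full tree applies this search recursively to each branch, which is how amplifying combinations like "low BMI plus high body dissatisfaction" are detected.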

  16. Tumor Microvessel Density as a Potential Predictive Marker for Bevacizumab Benefit: GOG-0218 Biomarker Analyses.

    Science.gov (United States)

    Bais, Carlos; Mueller, Barbara; Brady, Mark F; Mannel, Robert S; Burger, Robert A; Wei, Wei; Marien, Koen M; Kockx, Mark M; Husain, Amreen; Birrer, Michael J

    2017-11-01

    Combining bevacizumab with frontline chemotherapy statistically significantly improved progression-free survival (PFS) but not overall survival (OS) in the phase III GOG-0218 trial. Evaluation of candidate biomarkers was an exploratory objective. Patients with stage III (incompletely resected) or IV ovarian cancer were randomly assigned to receive six chemotherapy cycles with placebo or bevacizumab followed by single-agent placebo or bevacizumab. Five candidate tumor biomarkers were assessed by immunohistochemistry. The biomarker-evaluable population was categorized into high or low biomarker-expressing subgroups using median and quartile cutoffs. Associations between biomarker expression and efficacy were analyzed. All statistical tests were two-sided. The biomarker-evaluable population (n = 980), comprising 78.5% of the intent-to-treat population, had representative baseline characteristics and efficacy outcomes. Neither prognostic nor predictive associations were seen for vascular endothelial growth factor (VEGF) receptor-2, neuropilin-1, or MET. Higher microvessel density (MVD; measured by CD31) showed predictive value for PFS (hazard ratio [HR] for bevacizumab vs placebo = 0.40, 95% confidence interval [CI] = 0.29 to 0.54, vs 0.80, 95% CI = 0.59 to 1.07, for high vs low MVD, respectively, P interaction = .003) and OS (HR = 0.67, 95% CI = 0.51 to 0.88, vs 1.10, 95% CI = 0.84 to 1.44, P interaction = .02). Tumor VEGF-A was not predictive for PFS but showed potential predictive value for OS using a third-quartile cutoff for high VEGF-A expression. These retrospective tumor biomarker analyses suggest a positive association between density of vascular endothelial cells (the predominant cell type expressing VEGF receptors) and tumor VEGF-A levels and magnitude of bevacizumab effect in ovarian cancer. The potential predictive value of MVD (CD31) and tumor VEGF-A is consistent with a mechanism of action driven by VEGF-A signaling blockade.

  17. Using stability analyses to predict dynamic behaviour of self-oscillating polymer gels

    Science.gov (United States)

    Palkar, Vaibhav; Srivastava, Gaurav; Kuksenok, Olga; Balazs, Anna C.; Dayal, Pratyush

    2015-03-01

    Use of chemo-mechanical transduction to produce locomotion is one of the significant characteristics of biological systems. Polymer gels, intrinsically powered by the oscillatory Belousov-Zhabotinsky (BZ) reaction, are biomimetic materials that exhibit rhythmic self-sustained mechanical oscillations by chemo-mechanical transduction. Via simulations based on the 3D gel lattice spring model, we have successfully captured the dynamic behaviour of BZ gels. We have demonstrated that it is possible to direct the movement of BZ gels along complex paths, guiding them to bend, reorient and turn. From a mathematical perspective, the oscillations in the BZ gels occur when the gel's steady states lose stability by virtue of Hopf bifurcations (HB). Through the use of stability analyses, we predict the conditions under which the gel switches from stationary to oscillatory mode and vice versa. In addition, we characterize the nature of HB and also identify other types of bifurcations that play a critical role in governing the dynamic behaviour of BZ gels. Also, we successfully predict the frequency of chemo-mechanical oscillations and characterize its dependency on the model parameters. Our approach not only allows us to establish optimal conditions for the motion of BZ gels, but also can be used to design other dynamical systems. We thank IIT Gandhinagar and DST-SERB for funding.
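The linear-stability reasoning behind a Hopf bifurcation can be sketched for a generic two-variable system: a steady state with a 2x2 Jacobian loses stability when the trace crosses zero while the determinant stays positive, turning a damped spiral into sustained oscillations. The Jacobian entries and control parameter below are generic placeholders, not the BZ-gel model's equations:

```python
# Classify a 2x2 Jacobian's fixed point from its trace and determinant,
# and sweep a control parameter mu through a Hopf bifurcation at mu = 0.

def classify(j11, j12, j21, j22):
    tr = j11 + j22
    det = j11 * j22 - j12 * j21
    if det <= 0:
        return "saddle or degenerate"
    disc = tr * tr - 4.0 * det
    if disc < 0:                         # complex eigenvalues: spiral
        if abs(tr) < 1e-12:
            return "Hopf point (center at linear order)"
        return ("stable spiral" if tr < 0
                else "unstable spiral (oscillatory)")
    return "stable node" if tr < 0 else "unstable node"

# Jacobian [[mu, -1], [1, 0]]: det = 1 > 0, trace = mu crosses zero.
for mu in (-0.5, 0.0, 0.5):
    print(mu, classify(mu, -1.0, 1.0, 0.0))
```

In the full gel model the Jacobian comes from linearizing the reaction-diffusion-elasticity equations at a steady state, but the stationary-versus-oscillatory verdict rests on the same eigenvalue criterion.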

  18. Predictive Engineering Tools for Injection-Molded Long-Carbon-Thermoplastic Composites: Weight and Cost Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen, Ba Nghiep [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fifield, Leonard S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Gandhi, Umesh N. [Toyota Research Inst. North America, Ann Arbor, MI (United States); Mori, Steven [MAGNA Exteriors and Interiors Corporation, Aurora, ON (Canada); Wollan, Eric J. [PlastiComp, Inc., Winona, MN (United States)

    2016-08-01

    This project proposed to integrate, optimize and validate the fiber orientation and length distribution models previously developed and implemented in the Autodesk Simulation Moldflow Insight (ASMI) package for injection-molded long-carbon-fiber (LCF) thermoplastic composites into a cohesive prediction capability. The current effort focused on rendering the developed models more robust and efficient for automotive industry part design to enable weight savings and cost reduction. The project goal has been achieved by optimizing the developed models, improving and integrating their implementations in ASMI, and validating them for a complex 3D LCF thermoplastic automotive part (Figure 1). Both PP and PA66 were used as resin matrices. After validating ASMI predictions for fiber orientation and fiber length for this complex part against the corresponding measured data, PNNL, in collaboration with Toyota and Magna, developed a method using the predictive engineering tool to assess LCF/PA66 complex part design in terms of stiffness performance. Structural three-point bending analyses of the complex part and similar parts in steel were then performed for this purpose, and the team then demonstrated the use of stiffness-based complex part design assessment to evaluate weight savings relative to the body system target (≥ 35%) set in Table 2 of DE-FOA-0000648 (AOI #1). In addition, starting from the part-to-part analysis, the PE tools enabled an estimate of the weight reduction for the vehicle body system using 50 wt% LCF/PA66 parts relative to the current steel system. Also, from this analysis, an estimate of the manufacturing cost, including the material cost, for making the equivalent part in steel was determined and compared to the costs for making the LCF/PA66 part to determine the cost per “saved” pound.

  19. Considerations when loading spinal finite element models with predicted muscle forces from inverse static analyses.

    Science.gov (United States)

    Zhu, Rui; Zander, Thomas; Dreischarf, Marcel; Duda, Georg N; Rohlmann, Antonius; Schmidt, Hendrik

    2013-04-26

    Simplified loads have mostly been used in biomechanical finite element (FE) studies of the spine because of a lack of data on physiological muscle loading. Inverse static (IS) models allow the prediction of muscle forces for predefined postures. A combination of both mechanical approaches, FE and IS, appears to allow more realistic modeling. However, it is unknown what deviations are to be expected when muscle forces calculated for models with rigid vertebrae and fixed centers of rotation, as generally found in IS models, are applied to a FE model with elastic vertebrae and discs. The aim of this study was to determine the effects of these disagreements. Muscle forces were estimated for 20° flexion and 10° extension in an IS model and transferred to a FE model. The effects of the elasticity of bony structures (rigid vs. elastic) and the definition of the center of rotation (fixed vs. non-fixed) were quantified using the deviation of the actual intervertebral rotation (IVR) of the FE model from the targeted IVR of the IS model. For extension, the elasticity of the vertebrae had only a minor effect on IVRs, whereas a non-fixed center of rotation increased the IVR deviation on average by 0.5° per segment. For flexion, a combination of the two parameters increased IVR deviation on average by 1° per segment. When loading FE models with predicted muscle forces from IS analyses, the main limitations in the IS model, the rigidity of the segments and the fixed centers of rotation, must be considered. Copyright © 2013 Elsevier Ltd. All rights reserved.

  20. Predicting behavior change from persuasive messages using neural representational similarity and social network analyses.

    Science.gov (United States)

    Pegors, Teresa K; Tompson, Steven; O'Donnell, Matthew Brook; Falk, Emily B

    2017-08-15

    Neural activity in medial prefrontal cortex (MPFC), identified as engaging in self-related processing, predicts later health behavior change. However, it is unknown to what extent individual differences in neural representation of content and lived experience influence this brain-behavior relationship. We examined whether the strength of content-specific representations during persuasive messaging relates to later behavior change, and whether these relationships change as a function of individuals' social network composition. In our study, smokers viewed anti-smoking messages while undergoing fMRI and we measured changes in their smoking behavior one month later. Using representational similarity analyses, we found that the degree to which message content (i.e. health, social, or valence information) was represented in a self-related processing MPFC region was associated with later smoking behavior, with increased representations of negatively valenced (risk) information corresponding to greater message-consistent behavior change. Furthermore, the relationship between representations and behavior change depended on social network composition: smokers who had proportionally fewer smokers in their network showed increases in smoking behavior when social or health content was strongly represented in MPFC, whereas message-consistent behavior (i.e., less smoking) was more likely for those with proportionally more smokers in their social network who represented social or health consequences more strongly. These results highlight the dynamic relationship between representations in MPFC and key outcomes such as health behavior change; a complete understanding of the role of MPFC in motivation and action should take into account individual differences in neural representation of stimulus attributes and social context variables such as social network composition. Copyright © 2017 Elsevier Inc. All rights reserved.
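The elementary computation underneath such analyses is a correlation between activity patterns. A bare-bones sketch: correlate the voxel pattern evoked by a message with a template pattern for a content category to get a per-message representation strength. Full representational similarity analysis goes further and compares entire pattern-by-pattern similarity matrices; the "voxel" values below are invented numbers, not fMRI data:

```python
# Pattern-to-template Pearson correlation, the building block of
# representational similarity measures.
import math

def pearson(u, v):
    """Pearson correlation between two equal-length vectors."""
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    num = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    den = math.sqrt(sum((a - mu) ** 2 for a in u) *
                    sum((b - mv) ** 2 for b in v))
    return num / den

risk_template   = [0.9, -0.2, 0.4, 1.1, -0.5]   # hypothetical pattern
message_pattern = [0.8, -0.1, 0.5, 1.0, -0.4]   # hypothetical pattern
print(round(pearson(risk_template, message_pattern), 3))
```

Per-message strengths like this can then enter a regression against later behavior change, with moderators such as social network composition.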

  1. CLPX-Model: Local Analysis and Prediction System: 4-D Atmospheric Analyses

    Data.gov (United States)

    National Aeronautics and Space Administration — The Local Analysis and Prediction System (LAPS), run by the NOAA's Forecast Systems Laboratory (FSL), combines numerous observed meteorological data sets into a...

  2. CLPX-Model: Local Analysis and Prediction System: 4-D Atmospheric Analyses, Version 1

    Data.gov (United States)

    National Aeronautics and Space Administration — The Local Analysis and Prediction System (LAPS), run by the NOAA's Forecast Systems Laboratory (FSL), combines numerous observed meteorological data sets into a...

  3. Analyses of deep mammalian sequence alignments and constraint predictions for 1% of the human genome

    OpenAIRE

    Margulies, Elliott H.; Cooper, Gregory M.; Asimenos, George; Thomas, Daryl J.; Dewey, Colin N.; Siepel, Adam; Birney, Ewan; Keefe, Damian; Schwartz, Ariel S.; Hou, Minmei; Taylor, James; Nikolaev, Sergey; Montoya-Burgos, Juan I.; Löytynoja, Ari; Whelan, Simon

    2007-01-01

    A key component of the ongoing ENCODE project involves rigorous comparative sequence analyses for the initially targeted 1% of the human genome. Here, we present orthologous sequence generation, alignment, and evolutionary constraint analyses of 23 mammalian species for all ENCODE targets. Alignments were generated using four different methods; comparisons of these methods reveal large-scale consistency but substantial differences in terms of small genomic rearrangements, sensitivity (sequenc...

  4. Analyses and predictions of the thermodynamic properties and phase diagrams of silicate systems

    Energy Technology Data Exchange (ETDEWEB)

    Blander, M. [Argonne National Lab., IL (United States); Pelton, A.; Eriksson, G. [Ecole Polytechnique, Montreal, PQ (Canada). Dept. of Metallurgy and Materials Engineering

    1992-07-01

    Molten silicates are ordered solutions which cannot be well represented by the usual polynomial representation of deviations from ideal solution behavior (i.e. excess free energies of mixing). An adaptation of quasichemical theory which is capable of describing the properties of ordered solutions represents the measured properties of binary silicates over broad ranges of composition and temperature. For simple silicates such as the MgO-FeO-SiO₂ ternary system, in which silica is the only acid component, a combining rule generally leads to good predictions of ternary solutions from those of the binaries. In basic solutions, these predictions are consistent with those of the conformal ionic solution theory. Our results indicate that our approach could provide a potentially powerful tool for representing and predicting the properties of multicomponent molten silicates.

  6. Analysing the Relevance of Experience Partitions to the Prediction of Players’ Self-Reports of Affect

    DEFF Research Database (Denmark)

    Martínez, Héctor Pérez; Yannakakis, Georgios N.

    2011-01-01

    A common practice in modeling affect from physiological signals consists of reducing the signals to a set of statistical features that feed predictors of self-reported emotions. This paper analyses the impact of various time-windows, used for the extraction of physiological features...
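
The feature-extraction step this record describes can be sketched as slicing a signal into time-windows and reducing each to summary statistics. The window sizes, the example trace, and the chosen statistics below are hypothetical stand-ins for the paper's physiological features:

```python
import statistics

def window_features(signal, window, step):
    """Slice a signal into (possibly overlapping) time-windows and
    reduce each window to simple statistical features."""
    feats = []
    for start in range(0, len(signal) - window + 1, step):
        w = signal[start:start + window]
        feats.append({"mean": statistics.fmean(w),
                      "sd": statistics.pstdev(w),
                      "range": max(w) - min(w)})
    return feats

# Hypothetical skin-conductance trace sampled at 4 Hz
trace = [0.1, 0.2, 0.1, 0.3, 0.9, 1.1, 1.0, 0.8]
feats = window_features(trace, window=4, step=4)  # two non-overlapping 1 s windows
```

Varying `window` and `step` is exactly the design dimension whose impact on prediction accuracy the paper analyses.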

  7. Combining Results from Distinct MicroRNA Target Prediction Tools Enhances the Performance of Analyses.

    Science.gov (United States)

    Oliveira, Arthur C; Bovolenta, Luiz A; Nachtigall, Pedro G; Herkenhoff, Marcos E; Lemke, Ney; Pinhal, Danillo

    2017-01-01

    Target prediction is generally the first step toward recognition of bona fide microRNA (miRNA)-target interactions in living cells. Several target prediction tools are now available, which use distinct criteria and stringency to provide the best set of candidate targets for a single miRNA or a subset of miRNAs. However, there are many false-negative predictions, and consensus about the optimum strategy to select and use the output information provided by the target prediction tools is lacking. We compared the performance of four tools cited in the literature: TargetScan (TS), miRanda-mirSVR (MR), PITA, and RNA22 (R22), and we determined the most effective approach for analyzing target prediction data (individual, union, or intersection). For this purpose, we calculated the sensitivity, specificity, precision, and correlation of these approaches using 10 miRNAs (miR-1-3p, miR-17-5p, miR-21-5p, miR-24-3p, miR-29a-3p, miR-34a-5p, miR-124-3p, miR-125b-5p, miR-145-5p, and miR-155-5p) and 1,400 genes (700 validated and 700 non-validated) as targets of these miRNAs. The four tools provided a subset of high-quality predictions and returned few false-positive predictions; however, they could not identify several known true targets. We demonstrate that union of TS/MR and TS/MR/R22 enhanced the quality of in silico prediction analysis of miRNA targets. We conclude that the union rather than the intersection of the aforementioned tools is the best strategy for maximizing performance while minimizing the loss of time and resources in subsequent in vivo and in vitro experiments for functional validation of miRNA-target interactions.
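
The union-versus-intersection trade-off the authors evaluate can be made concrete with set operations: the union of two tools' predictions maximizes sensitivity, the intersection maximizes precision. A sketch with hypothetical gene sets (the tool outputs and validation labels below are invented for illustration):

```python
def evaluate(predicted, validated, non_validated):
    """Precision and sensitivity of a predicted target set against
    validated (true) and non-validated (false) target genes."""
    tp = len(predicted & validated)
    fp = len(predicted & non_validated)
    fn = len(validated - predicted)
    precision = tp / (tp + fp) if tp + fp else 0.0
    sensitivity = tp / (tp + fn) if tp + fn else 0.0
    return precision, sensitivity

# Hypothetical predictions from two tools for one miRNA
ts = {"g1", "g2", "g3", "g4"}          # TargetScan-style output
mr = {"g2", "g3", "g5"}                # miRanda-style output
validated = {"g1", "g2", "g3", "g5"}   # experimentally confirmed targets
non_validated = {"g4", "g6"}

union = ts | mr          # broader coverage: higher sensitivity
intersection = ts & mr   # stricter consensus: higher precision
```

On this toy data the union recovers every validated target at the cost of one false positive, while the intersection is perfectly precise but misses half the true targets — the pattern behind the paper's recommendation of the union strategy.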

  8. Simulation, prediction, and genetic analyses of daily methane emissions in dairy cattle.

    Science.gov (United States)

    Yin, T; Pinent, T; Brügemann, K; Simianer, H; König, S

    2015-08-01

    This study presents an approach combining phenotypes from novel traits, deterministic equations from cattle nutrition, and stochastic simulation techniques from animal breeding to generate test-day methane emissions (MEm) of dairy cows. Data included test-day production traits (milk yield, fat percentage, protein percentage, milk urea nitrogen), conformation traits (wither height, hip width, body condition score), female fertility traits (days open, calving interval, stillbirth), and health traits (clinical mastitis) from 961 first lactation Brown Swiss cows kept on 41 low-input farms in Switzerland. Test-day MEm were predicted based on the traits from the current data set and 2 deterministic prediction equations, resulting in the traits labeled MEm1 and MEm2. Stochastic simulations were used to assign individual concentrate intake depending on farm-type specifications (a requirement when calculating MEm2). Genetic parameters for MEm1 and MEm2 were estimated using random regression models. Predicted MEm had moderate heritabilities over lactation, ranging from 0.15 to 0.37, with the highest heritabilities around DIM 100. Genetic correlations between MEm1 and MEm2 ranged between 0.91 and 0.94. Antagonistic genetic correlations in the range from 0.70 to 0.92 were found for the associations between MEm2 and milk yield. Genetic correlations between MEm with days open and with calving interval increased from 0.10 at the beginning to 0.90 at the end of lactation. Genetic relationships between MEm2 and stillbirth were negative (0 to -0.24) from the beginning to the peak phase of lactation. Positive genetic relationships in the range from 0.02 to 0.49 were found between MEm2 with clinical mastitis. Interpretation of genetic (co)variance components should also consider the limitations when using data generated by prediction equations. Prediction functions only describe that part of MEm which is dependent on the factors and effects included in the function. With high

  9. FUN3D Analyses in Support of the Second Aeroelastic Prediction Workshop

    Science.gov (United States)

    Chwalowski, Pawel; Heeg, Jennifer

    2016-01-01

    This paper presents the computational aeroelastic results generated in support of the second Aeroelastic Prediction Workshop for the Benchmark Supercritical Wing (BSCW) configurations and compares them to the experimental data. The computational results are obtained using FUN3D, an unstructured grid Reynolds-Averaged Navier-Stokes solver developed at NASA Langley Research Center. The analysis results include aerodynamic coefficients and surface pressures obtained for steady-state, static aeroelastic equilibrium, and unsteady flow due to a pitching wing or flutter prediction. Frequency response functions of the pressure coefficients with respect to the angular displacement are computed and compared with the experimental data. The effects of spatial and temporal convergence on the computational results are examined.

  10. Aeromechanics and Aeroacoustics Predictions of the Boeing-SMART Rotor Using Coupled-CFD/CSD Analyses

    Science.gov (United States)

    Bain, Jeremy; Sim, Ben W.; Sankar, Lakshmi; Brentner, Ken

    2010-01-01

    This paper will highlight helicopter aeromechanics and aeroacoustics prediction capabilities developed by Georgia Institute of Technology, the Pennsylvania State University, and Northern Arizona University under the Helicopter Quieting Program (HQP) sponsored by the Tactical Technology Office of the Defense Advanced Research Projects Agency (DARPA). First initiated in 2004, the goal of the HQP was to develop high fidelity, state-of-the-art computational tools for designing advanced helicopter rotors with reduced acoustic perceptibility and enhanced performance. A critical step towards achieving this objective is the development of rotorcraft prediction codes capable of assessing a wide range of helicopter configurations and operations for future rotorcraft designs. This includes novel next-generation rotor systems that incorporate innovative passive and/or active elements to meet future challenging military performance and survivability goals.

  11. Analysing the Relevance of Experience Partitions to the Prediction of Players’ Self-Reports of Affect

    DEFF Research Database (Denmark)

    Martínez, Héctor Pérez; Yannakakis, Georgios N.

    2011-01-01

    A common practice in modeling affect from physiological signals consists of reducing the signals to a set of statistical features that feed predictors of self-reported emotions. This paper analyses the impact of various time-windows, used for the extraction of physiological features, to the accur...... anxiety and frustration appear not to be localized in a specific time interval but rather dependent on particular game stimuli....

  12. Predicting prostate cancer: analysing the clinical efficacy of prostate cancer risk calculators in a referral population.

    Science.gov (United States)

    Foley, R W; Lundon, D J; Murphy, K; Murphy, T B; Galvin, D J; Watson, R W G

    2015-09-01

    The decision to proceed to biopsy for the diagnosis of prostate cancer in clinical practice is a difficult one. Prostate cancer risk calculators allow for a systematic approach to the use of patient information to predict a patient's likelihood of prostate cancer. In this paper, we validate the two leading prostate cancer risk calculators, the prostate cancer prevention trial (PCPT) and the European Randomized Study of Screening for Prostate Cancer (ERSPC), in an Irish population. Data were collected for 337 men referred to one tertiary referral center in Ireland. Calibration analysis, ROC analysis and decision curve analysis were undertaken to ascertain the performance of the PCPT and the ERSPC risk calculators in this cohort. Of 337 consecutive biopsies, cancer was subsequently diagnosed in 146 men (43 %), of whom 98 (67 %) had high-grade disease. The AUCs for the PCPT and ERSPC risk calculators were 0.68 and 0.66, respectively, for the prediction of prostate cancer. Each calculator was sufficiently calibrated in this cohort. Decision curve analysis demonstrated a net benefit via the use of the PCPT and ERSPC risk calculators in the diagnosis of prostate cancer. The PCPT and ERSPC risk calculators achieve a statistically significant prediction of prostate cancer in this Irish population. This study provides external validation for these calculators, and therefore these tools can be used to aid in clinical decision making.
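
AUC values like the 0.68 and 0.66 reported here can be computed without an ROC library: the AUC equals the Mann-Whitney probability that a randomly chosen case outscores a randomly chosen non-case, with ties counted as half. A sketch with hypothetical risk-calculator outputs (the scores below are invented, not study data):

```python
def auc(pos_scores, neg_scores):
    """AUC as the Mann-Whitney probability that a randomly chosen
    positive (cancer) risk score exceeds a negative one; ties score 1/2."""
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical calculator outputs for biopsy-positive vs biopsy-negative men
cancer = [0.62, 0.55, 0.71, 0.40]
no_cancer = [0.35, 0.50, 0.58, 0.30]
discrimination = auc(cancer, no_cancer)
```

A value of 0.5 means no discrimination and 1.0 perfect discrimination, which is the scale on which the PCPT and ERSPC calculators are being compared.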

  13. Accuracy of Fall Prediction in Parkinson Disease: Six-Month and 12-Month Prospective Analyses

    Directory of Open Access Journals (Sweden)

    Ryan P. Duncan

    2012-01-01

    Introduction. We analyzed the ability of four balance assessments to predict falls in people with Parkinson Disease (PD) prospectively over six and 12 months. Materials and Methods. The BESTest, Mini-BESTest, Functional Gait Assessment (FGA), and Berg Balance Scale (BBS) were administered to 80 participants with idiopathic PD at baseline. Falls were then tracked for 12 months. Ability of each test to predict falls at six and 12 months was assessed using ROC curves and likelihood ratios (LR). Results. Twenty-seven percent of the sample had fallen at six months, and 32% of the sample had fallen at 12 months. At six months, areas under the ROC curve (AUC) for the tests ranged from 0.8 (FGA) to 0.89 (BESTest), with LR+ of 3.4 (FGA) to 5.8 (BESTest). At 12 months, AUCs ranged from 0.68 (BESTest, BBS) to 0.77 (Mini-BESTest), with LR+ of 1.8 (BESTest) to 2.4 (BBS, FGA). Discussion. The various balance tests were effective in predicting falls at six months. All tests were relatively ineffective at 12 months. Conclusion. This pilot study suggests that people with PD should be assessed biannually for fall risk.
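
Positive likelihood ratios of the kind quoted in this record follow directly from a test's sensitivity and specificity: LR+ = sensitivity / (1 − specificity). A small sketch; the sensitivity and specificity values are hypothetical, since the abstract reports only the resulting LR+ values:

```python
def positive_likelihood_ratio(sensitivity, specificity):
    """LR+ = P(test positive | faller) / P(test positive | non-faller)."""
    return sensitivity / (1.0 - specificity)

# Hypothetical balance test: flags 80% of fallers while
# misclassifying 15% of non-fallers
lr_plus = positive_likelihood_ratio(0.80, 0.85)
```

An LR+ above roughly 5, like the BESTest's 5.8 at six months, substantially raises the post-test odds of being a faller; values near 2, as seen at 12 months, shift the odds only modestly.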

  14. Analyses of deep mammalian sequence alignments and constraint predictions for 1% of the human genome.

    Science.gov (United States)

    Margulies, Elliott H; Cooper, Gregory M; Asimenos, George; Thomas, Daryl J; Dewey, Colin N; Siepel, Adam; Birney, Ewan; Keefe, Damian; Schwartz, Ariel S; Hou, Minmei; Taylor, James; Nikolaev, Sergey; Montoya-Burgos, Juan I; Löytynoja, Ari; Whelan, Simon; Pardi, Fabio; Massingham, Tim; Brown, James B; Bickel, Peter; Holmes, Ian; Mullikin, James C; Ureta-Vidal, Abel; Paten, Benedict; Stone, Eric A; Rosenbloom, Kate R; Kent, W James; Bouffard, Gerard G; Guan, Xiaobin; Hansen, Nancy F; Idol, Jacquelyn R; Maduro, Valerie V B; Maskeri, Baishali; McDowell, Jennifer C; Park, Morgan; Thomas, Pamela J; Young, Alice C; Blakesley, Robert W; Muzny, Donna M; Sodergren, Erica; Wheeler, David A; Worley, Kim C; Jiang, Huaiyang; Weinstock, George M; Gibbs, Richard A; Graves, Tina; Fulton, Robert; Mardis, Elaine R; Wilson, Richard K; Clamp, Michele; Cuff, James; Gnerre, Sante; Jaffe, David B; Chang, Jean L; Lindblad-Toh, Kerstin; Lander, Eric S; Hinrichs, Angie; Trumbower, Heather; Clawson, Hiram; Zweig, Ann; Kuhn, Robert M; Barber, Galt; Harte, Rachel; Karolchik, Donna; Field, Matthew A; Moore, Richard A; Matthewson, Carrie A; Schein, Jacqueline E; Marra, Marco A; Antonarakis, Stylianos E; Batzoglou, Serafim; Goldman, Nick; Hardison, Ross; Haussler, David; Miller, Webb; Pachter, Lior; Green, Eric D; Sidow, Arend

    2007-06-01

    A key component of the ongoing ENCODE project involves rigorous comparative sequence analyses for the initially targeted 1% of the human genome. Here, we present orthologous sequence generation, alignment, and evolutionary constraint analyses of 23 mammalian species for all ENCODE targets. Alignments were generated using four different methods; comparisons of these methods reveal large-scale consistency but substantial differences in terms of small genomic rearrangements, sensitivity (sequence coverage), and specificity (alignment accuracy). We describe the quantitative and qualitative trade-offs concomitant with alignment method choice and the levels of technical error that need to be accounted for in applications that require multisequence alignments. Using the generated alignments, we identified constrained regions using three different methods. While the different constraint-detecting methods are in general agreement, there are important discrepancies relating to both the underlying alignments and the specific algorithms. However, by integrating the results across the alignments and constraint-detecting methods, we produced constraint annotations that were found to be robust based on multiple independent measures. Analyses of these annotations illustrate that most classes of experimentally annotated functional elements are enriched for constrained sequences; however, large portions of each class (with the exception of protein-coding sequences) do not overlap constrained regions. The latter elements might not be under primary sequence constraint, might not be constrained across all mammals, or might have expendable molecular functions. Conversely, 40% of the constrained sequences do not overlap any of the functional elements that have been experimentally identified. Together, these findings demonstrate and quantify how many genomic functional elements await basic molecular characterization.

  15. The PRO-ACT database: design, initial analyses, and predictive features.

    Science.gov (United States)

    Atassi, Nazem; Berry, James; Shui, Amy; Zach, Neta; Sherman, Alexander; Sinani, Ervin; Walker, Jason; Katsovskiy, Igor; Schoenfeld, David; Cudkowicz, Merit; Leitner, Melanie

    2014-11-04

    To pool data from completed amyotrophic lateral sclerosis (ALS) clinical trials and create an open-access resource that enables greater understanding of the phenotype and biology of ALS. Clinical trials data were pooled from 16 completed phase II/III ALS clinical trials and one observational study. Over 8 million de-identified longitudinally collected data points from over 8,600 individuals with ALS were standardized across trials and merged to create the Pooled Resource Open-Access ALS Clinical Trials (PRO-ACT) database. This database includes demographics, family histories, and longitudinal clinical and laboratory data. Mixed effects models were used to describe the rate of disease progression measured by the Revised ALS Functional Rating Scale (ALSFRS-R) and vital capacity (VC). Cox regression models were used to describe survival data. Implementing Bonferroni correction, the critical p value for 15 different tests was p = 0.003. The ALSFRS-R rate of decline was 1.02 (±2.3) points per month and the VC rate of decline was 2.24% of predicted (±6.9) per month. Higher levels of uric acid at trial entry were predictive of a slower drop in ALSFRS-R (p = 0.01) and VC (p < 0.0001), and longer survival (p = 0.01). Finally, higher body mass index (BMI) at baseline was associated with longer survival (p < 0.0001). The PRO-ACT database is the largest publicly available repository of merged ALS clinical trials data. We report that baseline levels of creatinine and uric acid, as well as baseline BMI, are strong predictors of disease progression and survival. © 2014 American Academy of Neurology.

  16. Prediction of venous thromboembolism using semantic and sentiment analyses of clinical narratives.

    Science.gov (United States)

    Sabra, Susan; Mahmood Malik, Khalid; Alobaidi, Mazen

    2018-03-01

    Venous thromboembolism (VTE) is the third most common cardiovascular disorder. It affects people of both genders at ages as young as 20 years. The increased number of VTE cases with a high fatality rate of 25% at first occurrence makes preventive measures essential. Clinical narratives are a rich source of knowledge and should be included in the diagnosis and treatment processes, as they may contain critical information on risk factors. It is very important to make such narrative blocks of information usable for searching, health analytics, and decision-making. This paper proposes a Semantic Extraction and Sentiment Assessment of Risk Factors (SESARF) framework. Unlike traditional machine-learning approaches, SESARF, which consists of two main algorithms, namely, ExtractRiskFactor and FindSeverity, prepares a feature vector as the input to a support vector machine (SVM) classifier to make a diagnosis. SESARF matches and maps the concepts of VTE risk factors and finds adjectives and adverbs that reflect their levels of severity. SESARF uses a semantic- and sentiment-based approach to analyze clinical narratives of electronic health records (EHR) and then predict a diagnosis of VTE. We use a dataset of 150 clinical narratives, 80% of which are used to train our support vector machine prediction classifier, with the remaining 20% used for testing. Semantic extraction and sentiment analysis results yielded precisions of 81% and 70%, respectively. Using a support vector machine, prediction of patients with VTE yielded precision and recall values of 54.5% and 85.7%, respectively. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.
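
The reported precision (54.5%) and recall (85.7%) come straight from a confusion matrix. The counts below are hypothetical values chosen only because they reproduce those percentages; they are not the study's actual confusion matrix:

```python
def precision_recall(tp, fp, fn):
    """Precision and recall from confusion-matrix counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return precision, recall

# Hypothetical counts: 6 true positives, 5 false positives, 1 false negative
p, r = precision_recall(6, 5, 1)
```

With these counts, precision is 6/11 ≈ 54.5% and recall 6/7 ≈ 85.7%: the classifier catches most true VTE cases but nearly half of its positive calls are false alarms.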

  17. When Bitcoin encounters information in an online forum: Using text mining to analyse user opinions and predict value fluctuation.

    Directory of Open Access Journals (Sweden)

    Young Bin Kim

    Bitcoin is an online currency that is used worldwide to make online payments. It has consequently become an investment vehicle in itself and is traded in a way similar to other open currencies. The ability to predict the price fluctuation of Bitcoin would therefore facilitate future investment and payment decisions. In order to predict the price fluctuation of Bitcoin, we analyse the comments posted in the Bitcoin online forum. Unlike most research on Bitcoin-related online forums, which is limited to simple sentiment analysis and does not pay sufficient attention to noteworthy user comments, our approach involved extracting keywords from Bitcoin-related user comments posted on the online forum with the aim of analytically predicting the price and extent of transaction fluctuation of the currency. The effectiveness of the proposed method is validated based on Bitcoin online forum data ranging over a period of 2.8 years from December 2013 to September 2016.

  18. When Bitcoin encounters information in an online forum: Using text mining to analyse user opinions and predict value fluctuation.

    Science.gov (United States)

    Kim, Young Bin; Lee, Jurim; Park, Nuri; Choo, Jaegul; Kim, Jong-Hyun; Kim, Chang Hun

    2017-01-01

    Bitcoin is an online currency that is used worldwide to make online payments. It has consequently become an investment vehicle in itself and is traded in a way similar to other open currencies. The ability to predict the price fluctuation of Bitcoin would therefore facilitate future investment and payment decisions. In order to predict the price fluctuation of Bitcoin, we analyse the comments posted in the Bitcoin online forum. Unlike most research on Bitcoin-related online forums, which is limited to simple sentiment analysis and does not pay sufficient attention to noteworthy user comments, our approach involved extracting keywords from Bitcoin-related user comments posted on the online forum with the aim of analytically predicting the price and extent of transaction fluctuation of the currency. The effectiveness of the proposed method is validated based on Bitcoin online forum data ranging over a period of 2.8 years from December 2013 to September 2016.

  19. Prediction and validation of gene-disease associations using methods inspired by social network analyses.

    Directory of Open Access Journals (Sweden)

    U Martin Singh-Blom

    Correctly identifying associations of genes with diseases has long been a goal in biology. With the emergence of large-scale gene-phenotype association datasets in biology, we can leverage statistical and machine learning methods to help us achieve this goal. In this paper, we present two methods for predicting gene-disease associations based on functional gene associations and gene-phenotype associations in model organisms. The first method, the Katz measure, is motivated from its success in social network link prediction, and is very closely related to some of the recent methods proposed for gene-disease association inference. The second method, called Catapult (Combining dATa Across species using Positive-Unlabeled Learning Techniques), is a supervised machine learning method that uses a biased support vector machine where the features are derived from walks in a heterogeneous gene-trait network. We study the performance of the proposed methods and related state-of-the-art methods using two different evaluation strategies, on two distinct data sets, namely OMIM phenotypes and drug-target interactions. Finally, by measuring the performance of the methods using two different evaluation strategies, we show that even though both methods perform very well, the Katz measure is better at identifying associations between traits and poorly studied genes, whereas Catapult is better suited to correctly identifying gene-trait associations overall [corrected].
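
The Katz measure scores a node pair by summing over all connecting walks, down-weighting longer walks by powers of an attenuation factor β. A minimal truncated-power-series sketch on a toy undirected network; the graph, β, and truncation length are illustrative, and the paper's actual networks are far larger and heterogeneous:

```python
def katz_scores(adj, beta=0.1, max_len=10):
    """Truncated Katz index: sum over walk lengths l of beta**l times the
    number of length-l walks between each node pair.
    adj: dict node -> set of neighbour nodes (undirected graph)."""
    nodes = sorted(adj)
    # walks[u][v] = number of walks of the current length from u to v
    walks = {u: {v: 1 if v in adj[u] else 0 for v in nodes} for u in nodes}
    score = {u: {v: beta * walks[u][v] for v in nodes} for u in nodes}
    for l in range(2, max_len + 1):
        # A^l[u][v] = sum over neighbours w of v of A^(l-1)[u][w]
        walks = {u: {v: sum(walks[u][w] for w in adj[v]) for v in nodes}
                 for u in nodes}
        for u in nodes:
            for v in nodes:
                score[u][v] += (beta ** l) * walks[u][v]
    return score

# Toy gene-trait network: a path a - b - c
scores = katz_scores({"a": {"b"}, "b": {"a", "c"}, "c": {"b"}})
```

Directly linked pairs (a, b) get a higher Katz score than pairs connected only through longer walks (a, c), which is how the measure ranks candidate gene-trait links.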

  20. Prediction of Epileptic Seizure by Analysing Time Series EEG Signal Using k-NN Classifier

    Directory of Open Access Journals (Sweden)

    Md. Kamrul Hasan

    2017-01-01

    Electroencephalographic signal is a representative signal that contains information about brain activity, which is used for the detection of epilepsy since epileptic seizures are caused by a disturbance in the electrophysiological activity of the brain. The prediction of epileptic seizure usually requires a detailed and experienced analysis of EEG. In this paper, we have introduced a statistical analysis of EEG signal that is capable of recognizing epileptic seizure with a high degree of accuracy and helps to provide automatic detection of epileptic seizure for different ages of epilepsy. To accomplish the target research, we extract various epileptic features namely approximate entropy (ApEn), standard deviation (SD), standard error (SE), modified mean absolute value (MMAV), roll-off (R), and zero crossing (ZC) from the epileptic signal. The k-nearest neighbours (k-NN) algorithm is used for the classification of epilepsy, then regression analysis is used for the prediction of the epilepsy level at different ages of the patients. Using the statistical parameters and regression analysis, a prototype mathematical model is proposed which helps to find the epileptic randomness with respect to the age of different subjects. The accuracy of this prototype equation depends on proper analysis of the dynamic information from the epileptic EEG.
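
The k-NN classification step described here reduces to a majority vote among the nearest training feature vectors. A sketch using two features per EEG segment; the training set and labels below are hypothetical stand-ins for the paper's six-feature vectors:

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify a feature vector by majority vote among the k nearest
    training samples under Euclidean distance.
    train: list of (feature_vector, label) pairs."""
    neighbours = sorted(train, key=lambda s: math.dist(s[0], query))[:k]
    votes = Counter(label for _, label in neighbours)
    return votes.most_common(1)[0][0]

# Hypothetical 2-D feature vectors, e.g. (ApEn, SD) per EEG segment
train = [((0.9, 1.2), "seizure"), ((1.0, 1.1), "seizure"),
         ((0.2, 0.3), "normal"), ((0.1, 0.4), "normal"),
         ((0.3, 0.2), "normal")]
label = knn_predict(train, (0.85, 1.0), k=3)
```

The query segment sits near the two seizure exemplars, so the 3-nearest-neighbour vote returns "seizure".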

  1. Predicting Geomorphic and Hydrologic Risks after Wildfire Using Harmonic and Stochastic Analyses

    Science.gov (United States)

    Mikesell, J.; Kinoshita, A. M.; Florsheim, J. L.; Chin, A.; Nourbakhshbeidokhti, S.

    2017-12-01

    Wildfire is a landscape-scale disturbance that often alters hydrological processes and sediment flux during subsequent storms. Vegetation loss from wildfires induces changes to sediment supply, such as channel erosion and sedimentation, and to streamflow magnitude or flooding. These changes enhance downstream hazards, threatening human populations and physical aquatic habitat over various time scales. Using Williams Canyon, a basin burned by the Waldo Canyon Fire (2012), as a case study, we utilize deterministic and statistical modeling methods (Fourier series and first-order Markov chain) to assess pre- and post-fire geomorphic and hydrologic characteristics, including precipitation, enhanced vegetation index (EVI, a satellite-based proxy of vegetation biomass), streamflow, and sediment flux. Local precipitation, terrestrial Light Detection and Ranging (LiDAR) scanning, and satellite-based products are used for these time series analyses. We present a framework to assess variability of periodic and nonperiodic climatic and multivariate trends to inform development of a post-wildfire risk assessment methodology. To establish the extent to which a wildfire affects hydrologic and geomorphic patterns, a Fourier series was used to fit pre- and post-fire geomorphic and hydrologic characteristics to yearly temporal cycles and subcycles of 6, 4, 3, and 2.4 months. These cycles were analyzed using least-squares estimates of the harmonic coefficients, or amplitudes, of each sub-cycle's contribution to the overall behavior of the Fourier series. The stochastic variances of these characteristics were analyzed by composing first-order Markov models and probabilistic analysis through direct likelihood estimates. Preliminary results highlight an increased dependence of monthly post-fire hydrologic characteristics on 12 and 6-month temporal cycles. This statistical and probabilistic analysis provides a basis to determine the impact of wildfires on the temporal dependence of
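
The harmonic fitting described above, with an annual cycle and sub-cycles of 6, 4, 3, and 2.4 months, can be sketched compactly: for evenly spaced monthly samples spanning whole years the cosine/sine basis is orthogonal, so the least-squares harmonic coefficients reduce to correlation sums. The synthetic series and amplitudes below are illustrative, not the study's data:

```python
import math

def harmonic_fit(y, harmonics=(1, 2, 3, 4, 5), period=12):
    """Least-squares amplitudes of annual sub-cycles (12, 6, 4, 3, 2.4 months)
    for a monthly series y whose length is a multiple of `period`.
    With evenly spaced samples the basis is orthogonal, so the least-squares
    coefficients reduce to discrete correlation sums."""
    n = len(y)
    mean = sum(y) / n
    amps = {}
    for h in harmonics:
        a = (2 / n) * sum(v * math.cos(2 * math.pi * h * t / period)
                          for t, v in enumerate(y))
        b = (2 / n) * sum(v * math.sin(2 * math.pi * h * t / period)
                          for t, v in enumerate(y))
        amps[period / h] = math.hypot(a, b)  # amplitude of that sub-cycle
    return mean, amps

# Synthetic 2-year monthly series: annual cycle plus a 6-month sub-cycle
y = [5 + 2 * math.cos(2 * math.pi * t / 12)
       + 0.5 * math.sin(2 * math.pi * 2 * t / 12) for t in range(24)]
mean, amps = harmonic_fit(y)
```

The recovered amplitudes (2 at 12 months, 0.5 at 6 months, ~0 elsewhere) show how the fit isolates each sub-cycle's contribution, which is what the post-fire comparison of cycle dependence relies on.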

  2. Circulating biomarkers for predicting cardiovascular disease risk; a systematic review and comprehensive overview of meta-analyses.

    Directory of Open Access Journals (Sweden)

    Thijs C van Holten

    BACKGROUND: Cardiovascular disease is one of the major causes of death worldwide. Assessing the risk for cardiovascular disease is an important aspect in clinical decision making and setting a therapeutic strategy, and the use of serological biomarkers may improve this. Despite an overwhelming number of studies and meta-analyses on biomarkers and cardiovascular disease, there are no comprehensive studies comparing the relevance of each biomarker. We performed a systematic review of meta-analyses on levels of serological biomarkers for atherothrombosis to compare the relevance of the most commonly studied biomarkers. METHODS AND FINDINGS: Medline and Embase were screened on search terms that were related to "arterial ischemic events" and "meta-analyses". The meta-analyses were sorted by patient groups without pre-existing cardiovascular disease, with cardiovascular disease and heterogeneous groups concerning general populations, groups with and without cardiovascular disease, or miscellaneous. These were subsequently sorted by end-point for cardiovascular disease or stroke and summarized in tables. We have identified 85 relevant full text articles, with 214 meta-analyses. Markers for primary cardiovascular events include, from high to low result: C-reactive protein, fibrinogen, cholesterol, apolipoprotein B, the apolipoprotein A/apolipoprotein B ratio, high density lipoprotein, and vitamin D. Markers for secondary cardiovascular events include, from high to low result: cardiac troponins I and T, C-reactive protein, serum creatinine, and cystatin C. For primary stroke, fibrinogen and serum uric acid are strong risk markers. Limitations reside in that there is no acknowledged search strategy for prognostic studies or meta-analyses. CONCLUSIONS: For primary cardiovascular events, markers with strong predictive potential are mainly associated with lipids. For secondary cardiovascular events, markers are more associated with ischemia. 
Fibrinogen is a

  3. Benchmark of SCALE (SAS2H) isotopic predictions of depletion analyses for San Onofre PWR MOX fuel

    Energy Technology Data Exchange (ETDEWEB)

    Hermann, O.W.

    2000-02-01

    The isotopic composition of mixed-oxide (MOX) fuel, fabricated with both uranium and plutonium, after discharge from reactors is of significant interest to the Fissile Materials Disposition Program. The validation of the SCALE (SAS2H) depletion code for predicting isotopic compositions of MOX fuel, similar to previous validation studies on uranium-only fueled reactors, has corresponding significance. The EEI-Westinghouse Plutonium Recycle Demonstration Program examined the use of MOX fuel in the San Onofre PWR, Unit 1, during cycles 2 and 3. Isotopic analyses of the MOX spent fuel were conducted on 13 actinides and ¹⁴⁸Nd by either mass or alpha spectrometry. Six fuel pellet samples were taken from four different fuel pins of an irradiated MOX assembly. The measured actinide inventories from those samples have been used to benchmark SAS2H for MOX fuel applications. The average percentage differences between the code results and the measurements were −0.9% for ²³⁵U and 5.2% for ²³⁹Pu. The differences for most of the isotopes were significantly larger than in the cases of uranium-only fueled reactors. In general, comparisons of code results with alpha spectrometer data showed extreme differences, although the differences relative to mass spectrometer analyses were not much larger than those for uranium-only fueled reactors. This benchmark study should be useful in estimating uncertainties of inventory, criticality, and dose calculations for MOX spent fuel.
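The benchmark metric described above is a simple calculated-versus-measured percentage difference. A minimal sketch of that arithmetic, with placeholder inventory values chosen only to reproduce the quoted −0.9% and 5.2% figures (not the report's actual data):

```python
# Percent difference between calculated (SAS2H) and measured isotopic
# inventories, the usual C/M-style depletion-benchmark metric. The input
# numbers below are illustrative placeholders, not the report's data.
def percent_difference(calculated, measured):
    """Return 100 * (C - M) / M."""
    return 100.0 * (calculated - measured) / measured

# Hypothetical inventories in arbitrary units (e.g., g per g initial HM).
samples = {
    "U-235":  (0.991, 1.000),
    "Pu-239": (1.052, 1.000),
}
for isotope, (calc, meas) in samples.items():
    print(f"{isotope}: {percent_difference(calc, meas):+.1f}%")
```
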

  4. Tumor marker analyses in patients with brain metastases: patterns of practice and implications for survival prediction research.

    Science.gov (United States)

    Nieder, Carsten; Dalhaug, Astrid; Haukland, Ellinor; Mannsåker, Bård; Pawinski, Adam

    2015-08-01

    This study aims to explore patterns of practice of tumor marker analyses and potential prognostic impact of abnormal markers in patients with brain metastases from solid tumors. Previously, lactate dehydrogenase (LDH) and albumin were identified as relevant biomarkers. We performed a retrospective analysis of 120 patients with known LDH and albumin treated with whole-brain radiotherapy (WBRT) in two different situations: (1) brain metastases detected at initial cancer diagnosis (n = 46) and (2) brain metastases at later time points (n = 74, median interval 13 months). Twenty-six patients (57 %) from group 1 had at least one tumor marker analyzed, and 11 patients (24 %) had abnormal results. Twenty-two patients (30 %) from group 2 had at least one tumor marker analyzed, and 16 patients (22 %) had abnormal results. When assuming that LDH and albumin would be standard tests before WBRT, additional potential biomarkers were found in 36 % of patients with normal LDH and albumin. Marker positivity rates were, for example, 80 % for carcinoembryonic antigen (CEA) in colorectal cancer and 79 % for CA 15-3 in breast cancer. Abnormal markers were associated with presence of liver metastases. CA 15-3 values above the median predicted shorter survival in patients with breast cancer (median 1.9 vs. 13.8 months, p = 0.1). Comparable trends were not observed for various markers in other tumor types. In conclusion, only a minority of patients had undergone tumor marker analyses. Final group sizes were too small to perform multivariate analyses or draw definitive conclusions. We hypothesize that CA 15-3 could be a promising biomarker that should be studied further.

  5. ATOP - The Advanced Taiwan Ocean Prediction System Based on the mpiPOM. Part 1: Model Descriptions, Analyses and Results

    Directory of Open Access Journals (Sweden)

    Leo Oey

    2013-01-01

    Full Text Available A data-assimilated Taiwan Ocean Prediction (ATOP) system is being developed at the National Central University, Taiwan. The model simulates sea-surface height, three-dimensional currents, temperature and salinity, and turbulent mixing. The model has options for tracer and particle-tracking algorithms, as well as for wave-induced Stokes drift and wave-enhanced mixing and bottom drag. Two different forecast domains have been tested: a large-grid domain that encompasses the entire North Pacific Ocean at 0.1° × 0.1° horizontal resolution and 41 vertical sigma levels, and a smaller western North Pacific domain which at present has the same horizontal resolution. In both domains, 25-year spin-up runs from 1988 - 2011 were first conducted, forced by six-hourly Cross-Calibrated Multi-Platform (CCMP) and NCEP reanalysis Global Forecast System (GFS) winds. The results were then used as initial conditions to conduct ocean analyses from January 2012 through February 2012, when updated hindcasts and real-time forecasts began using the GFS winds. This paper describes the ATOP system and compares the forecast results against satellite altimetry data for assessing model skills. The model results are also shown to compare well with observations of (i) the Kuroshio intrusion in the northern South China Sea, and (ii) the subtropical counter current. A review of, and comparison with, other models of (i) in the literature are also given.

  6. Analyses of Potential Predictive Markers and Response to Targeted Therapy in Patients with Advanced Clear-cell Renal Cell Carcinoma

    Directory of Open Access Journals (Sweden)

    Yan Song

    2015-01-01

    Full Text Available Background: Vascular endothelial growth factor-targeted agents are standard treatments in advanced clear-cell renal cell carcinoma (ccRCC), but biomarkers of activity are lacking. The aim of this study was to investigate the association of Von Hippel-Lindau (VHL) gene status and vascular endothelial growth factor receptor (VEGFR) or stem cell factor receptor (KIT) expression with the characteristics and clinical outcome of advanced ccRCC. Methods: A total of 59 patients who received targeted treatment with sunitinib or pazopanib were evaluated at the Cancer Hospital and Institute, Chinese Academy of Medical Sciences, between January 2010 and November 2012. Paraffin-embedded tumor samples were collected, and the status of the VHL gene and the expression of VEGFR and KIT were determined by VHL sequence analysis and immunohistochemistry. Clinical-pathological features were collected, and efficacy measures such as response rate, median progression-free survival (PFS), and overall survival (OS) were calculated and then compared based on expression status. The Chi-square test, the Kaplan-Meier method, and the log-rank test were used for statistical analyses. Results: Of 59 patients, objective responses were observed in 28 patients (47.5%). The median PFS was 13.8 months and median OS was 39.9 months. There was an improved PFS in patients with the following clinical features: male gender, number of metastatic sites 2 or less, VEGFR-2 positive, or KIT positive. Eleven patients (18.6%) had evidence of VHL mutation, with an objective response rate of 45.5%, which did not differ from that of patients with no VHL mutation (47.9%). VHL mutation status did not correlate with either overall response rate (P = 0.938) or PFS (P = 0.277). The PFS was 17.6 months and 22.2 months in VEGFR-2 positive patients and KIT positive patients, respectively, significantly longer than that of VEGFR-2 or KIT negative patients (P = 0.026 and P = 0.043, respectively).

  7. Visual Versus Fully Automated Analyses of 18F-FDG and Amyloid PET for Prediction of Dementia Due to Alzheimer Disease in Mild Cognitive Impairment.

    Science.gov (United States)

    Grimmer, Timo; Wutz, Carolin; Alexopoulos, Panagiotis; Drzezga, Alexander; Förster, Stefan; Förstl, Hans; Goldhardt, Oliver; Ortner, Marion; Sorg, Christian; Kurz, Alexander

    2016-02-01

    Biomarkers of Alzheimer disease (AD) can be imaged in vivo and can be used for diagnostic and prognostic purposes in people with cognitive decline and dementia. Indicators of amyloid deposition such as (11)C-Pittsburgh compound B ((11)C-PiB) PET are primarily used to identify or rule out brain diseases that are associated with amyloid pathology but have also been deployed to forecast the clinical course. Indicators of neuronal metabolism including (18)F-FDG PET demonstrate the localization and severity of neuronal dysfunction and are valuable for differential diagnosis and for predicting the progression from mild cognitive impairment (MCI) to dementia. It is a matter of debate whether to analyze these images visually or using automated techniques. Therefore, we compared the usefulness of both imaging methods and both analyzing strategies to predict dementia due to AD. In MCI participants, a baseline examination, including clinical and imaging assessments, and a clinical follow-up examination after a planned interval of 24 mo were performed. Of 28 MCI patients, 9 developed dementia due to AD, 2 developed frontotemporal dementia, and 1 developed moderate dementia of unknown etiology. The positive and negative predictive values and the accuracy of visual and fully automated analyses of (11)C-PiB for the prediction of progression to dementia due to AD were 0.50, 1.00, and 0.68, respectively, for the visual and 0.53, 1.00, and 0.71, respectively, for the automated analyses. Positive predictive value, negative predictive value, and accuracy of fully automated analyses of (18)F-FDG PET were 0.37, 0.78, and 0.50, respectively. Results of visual analyses were highly variable between raters but were superior to automated analyses. Both (18)F-FDG and (11)C-PiB imaging appear to be of limited use for predicting the progression from MCI to dementia due to AD in short-term follow-up, irrespective of the strategy of analysis. 
On the other hand, amyloid PET is extremely useful to
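The positive predictive value, negative predictive value, and accuracy quoted above come straight from a 2x2 confusion matrix. A minimal sketch of the definitions, using hypothetical counts that happen to be consistent with the reported visual (11)C-PiB values (9 progressors in 28 patients):

```python
# Predictive values for a dichotomous imaging test against later clinical
# outcome (progression to AD dementia). The counts below are hypothetical,
# chosen only to illustrate the definitions; they are not the study's raw
# confusion matrix.
def predictive_values(tp, fp, fn, tn):
    ppv = tp / (tp + fp)                 # positive predictive value
    npv = tn / (tn + fn)                 # negative predictive value
    acc = (tp + tn) / (tp + fp + fn + tn)
    return ppv, npv, acc

ppv, npv, acc = predictive_values(tp=9, fp=9, fn=0, tn=10)
print(f"PPV={ppv:.2f} NPV={npv:.2f} accuracy={acc:.2f}")
```
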

  8. Within What Distance Does "Greenness" Best Predict Physical Health? A Systematic Review of Articles with GIS Buffer Analyses across the Lifespan.

    Science.gov (United States)

    Browning, Matthew; Lee, Kangjae

    2017-06-23

    Is the amount of "greenness" within a 250-m, 500-m, 1000-m or a 2000-m buffer surrounding a person's home a good predictor of their physical health? The evidence is inconclusive. We reviewed Web of Science articles that used geographic information system buffer analyses to identify trends between physical health, greenness, and distance within which greenness is measured. Our inclusion criteria were: (1) use of buffers to estimate residential greenness; (2) statistical analyses that calculated significance of the greenness-physical health relationship; and (3) peer-reviewed articles published in English between 2007 and 2017. To capture multiple findings from a single article, we selected our unit of inquiry as the analysis, not the article. Our final sample included 260 analyses in 47 articles. All aspects of the review were in accordance with PRISMA guidelines. Analyses were independently judged as more, less, or least likely to be biased based on the inclusion of objective health measures and income/education controls. We found evidence that larger buffer sizes, up to 2000 m, better predicted physical health than smaller ones. We recommend that future analyses use nested rather than overlapping buffers to evaluate to what extent greenness not immediately around a person's home (i.e., within 1000-2000 m) predicts physical health.
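The core operation in the reviewed studies is averaging a greenness raster within buffers of increasing radius around a residence. A toy sketch of that idea on a synthetic raster (distances in raster cells rather than meters, and a uniform-random "greenness" stand-in for NDVI):

```python
import numpy as np

# Toy GIS buffer analysis: average a synthetic "greenness" raster within
# circular buffers of increasing radius around a residence cell. Real
# studies would use an NDVI raster and geographic distances instead.
rng = np.random.default_rng(6)
greenness = rng.uniform(0.0, 1.0, size=(200, 200))   # synthetic raster
home = (100, 100)                                     # residence cell

yy, xx = np.mgrid[0:200, 0:200]
dist = np.hypot(yy - home[0], xx - home[1])           # distance to home

for radius in (25, 50, 100):                          # nested buffers
    mean_green = float(greenness[dist <= radius].mean())
    print(f"buffer r={radius}: mean greenness = {mean_green:.3f}")
```

Nested (rather than overlapping) buffers, as the authors recommend, would instead average over the annulus `(dist > inner) & (dist <= outer)`.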

  9. Prediction of beef color using time-domain nuclear magnetic resonance (TD-NMR) relaxometry data and multivariate analyses.

    Science.gov (United States)

    Moreira, Luiz Felipe Pompeu Prado; Ferrari, Adriana Cristina; Moraes, Tiago Bueno; Reis, Ricardo Andrade; Colnago, Luiz Alberto; Pereira, Fabíola Manhas Verbi

    2016-05-19

    Time-domain nuclear magnetic resonance and chemometrics were used to predict color parameters, such as lightness (L*), redness (a*), and yellowness (b*), of beef (Longissimus dorsi muscle) samples. By analyzing the relaxation decays with multivariate models built with partial least-squares regression, color quality parameters were predicted. The partial least-squares models showed low errors independent of the sample size, indicating the potential of the method. Mincing and weighing were not necessary to improve the predictive performance of the models. The reduction of the transverse relaxation time (T2), measured by the Carr-Purcell-Meiboom-Gill pulse sequence, in darker beef compared with lighter beef can be explained by the lower relaxivity of Fe2+ present in deoxymyoglobin and oxymyoglobin (red beef) versus the higher relaxivity of Fe3+ present in metmyoglobin (brown beef). These results indicate that time-domain nuclear magnetic resonance spectroscopy can become a useful tool for quality assessment of beef cattle on the bulk of the sample and through packages, because this technique is also widely applied to measure sensorial parameters, such as flavor, juiciness and tenderness, and physicochemical parameters, such as cooking loss, fat and moisture content, and instrumental tenderness using Warner-Bratzler shear force. Copyright © 2016 John Wiley & Sons, Ltd.

  10. A novel bioinformatics strategy for function prediction of poorly-characterized protein genes obtained from metagenome analyses.

    Science.gov (United States)

    Abe, Takashi; Kanaya, Shigehiko; Uehara, Hiroshi; Ikemura, Toshimichi

    2009-10-01

    As a result of the remarkable progress of DNA sequencing technology, vast quantities of genomic sequences have been decoded. Homology search for amino acid sequences, such as BLAST, has become a basic tool for assigning functions to genes/proteins when genomic sequences are decoded. Although homology search has clearly been a powerful and irreplaceable method, the functions of only 50% or fewer of genes can be predicted when a novel genome is decoded. A prediction method independent of homology search is therefore urgently needed. By analyzing oligonucleotide compositions in genomic sequences, we previously developed a modified Self-Organizing Map, 'BLSOM', that clustered genomic fragments according to phylotype with no advance knowledge of phylotype. Using BLSOM for di-, tri- and tetrapeptide compositions, we developed a system to enable separation (self-organization) of proteins by function. Analyzing oligopeptide frequencies in proteins previously classified into COGs (clusters of orthologous groups of proteins), BLSOMs could faithfully reproduce the COG classifications. This indicated that proteins whose functions are unknown because of a lack of significant sequence similarity with function-known proteins can be related to function-known proteins based on similarity in oligopeptide composition. BLSOM was applied to predict functions of the vast quantities of proteins derived from mixed genomes in environmental samples.
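The feature-extraction step behind this approach is a composition vector over short peptides. A minimal sketch of the dipeptide case, on a made-up toy sequence (tri- and tetrapeptide compositions follow the same pattern with longer windows):

```python
from itertools import product

# Turn a protein sequence into a dipeptide-composition vector, the kind
# of input BLSOM clusters. The sequence below is an invented example.
AMINO = "ACDEFGHIKLMNPQRSTVWY"

def dipeptide_composition(seq):
    counts = {a + b: 0 for a, b in product(AMINO, repeat=2)}
    for i in range(len(seq) - 1):
        counts[seq[i:i + 2]] += 1
    total = max(len(seq) - 1, 1)
    return {k: v / total for k, v in counts.items()}

comp = dipeptide_composition("AKVLAAGLLK")    # toy sequence, 9 dipeptides
print(round(comp["LL"], 3))
```
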

  11. Prediction of size-fractionated airborne particle-bound metals using MLR, BP-ANN and SVM analyses.

    Science.gov (United States)

    Leng, Xiang'zi; Wang, Jinhua; Ji, Haibo; Wang, Qin'geng; Li, Huiming; Qian, Xin; Li, Fengying; Yang, Meng

    2017-08-01

    Size-fractionated heavy metal concentrations were observed in airborne particulate matter (PM) samples collected from 2014 to 2015 (spanning all four seasons) from suburban (Xianlin) and industrial (Pukou) areas in Nanjing, a megacity of southeast China. Rapid prediction models of size-fractionated metals were established based on multiple linear regression (MLR), back propagation artificial neural network (BP-ANN) and support vector machine (SVM), using meteorological factors and PM concentrations as input parameters. About 38% and 77% of PM2.5 concentrations in Xianlin and Pukou, respectively, were beyond the Chinese National Ambient Air Quality Standard limit of 75 μg/m³. Nearly all elements had higher concentrations in industrial areas, and in winter among the four seasons. Anthropogenic elements such as Pb, Zn, Cd and Cu showed larger percentages in the fine fraction (ø ≤ 2.5 μm), whereas the crustal elements including Al, Ba, Fe, Ni, Sr and Ti showed larger percentages in the coarse fraction (ø > 2.5 μm). SVM showed a higher training correlation coefficient (R), and lower mean absolute error (MAE) and root mean square error (RMSE), than MLR and BP-ANN for most metals. All three methods showed better prediction results for Ni, Al, V, Cd and As, whereas results for Cr and Fe were relatively poor. The daily airborne metal concentrations in 2015 were then predicted by the fully trained SVM models, and the results showed that the heaviest pollution of airborne heavy metals occurred in December and January, whereas the lightest pollution occurred in June and July. Copyright © 2017 Elsevier Ltd. All rights reserved.
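The model comparison described above can be sketched with off-the-shelf regressors: fit MLR and an SVM on the same inputs and compare MAE/RMSE. All data below are synthetic stand-ins for the meteorological and PM inputs, with an assumed nonlinear response so the kernel model has something to gain:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.svm import SVR
from sklearn.metrics import mean_absolute_error

# Predict a particle-bound metal concentration from meteorological inputs
# and PM levels with MLR and SVM, then compare MAE/RMSE. Synthetic data.
rng = np.random.default_rng(1)
n = 200
X = np.column_stack([
    rng.uniform(-5, 35, n),     # temperature, °C
    rng.uniform(20, 95, n),     # relative humidity, %
    rng.uniform(10, 250, n),    # PM2.5, µg/m³
])
# Assumed nonlinear metal-vs-PM response, plus measurement noise.
y = 0.05 * X[:, 2] + 0.002 * X[:, 2] ** 1.5 + rng.normal(0, 0.5, n)

for name, model in [("MLR", LinearRegression()),
                    ("SVM", SVR(kernel="rbf", C=10.0))]:
    pred = model.fit(X, y).predict(X)
    rmse = float(np.sqrt(np.mean((pred - y) ** 2)))
    print(f"{name}: MAE={mean_absolute_error(y, pred):.2f} RMSE={rmse:.2f}")
```
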

  12. Can the lifetime of the superheater tubes be predicted according to the fuel analyses? Assessment from field and laboratory data

    Energy Technology Data Exchange (ETDEWEB)

    Salmenoja, K. [Kvaerner Pulping Oy, Tampere (Finland)

    1998-12-31

    The lifetime of the superheaters in different power boilers is still more or less a mystery. This is especially true when firing biomass-based fuels (biofuels), such as bark, forest residues, and straw. Due to the inhomogeneous nature of biofuels, the lifetime of the superheaters may vary from case to case. Sometimes the lifetime is significantly shorter than originally expected; sometimes no corrosion is observed even in the hottest tubes. This is one of the main reasons why boiler operators often demand better predictability of the corrosion resistance of the materials, to avoid unscheduled shutdowns. (orig.) 9 refs.

  13. Pan-genome Analyses of the Species Salmonella enterica, and Identification of Genomic Markers Predictive for Species, Subspecies, and Serovar

    Science.gov (United States)

    Laing, Chad R.; Whiteside, Matthew D.; Gannon, Victor P. J.

    2017-01-01

    Food safety is a global concern, with upward of 2.2 million deaths due to enteric disease every year. Current whole-genome sequencing platforms allow routine sequencing of enteric pathogens for surveillance, and during outbreaks; however, a remaining challenge is the identification of genomic markers that are predictive of strain groups that pose the most significant health threats to humans, or that can persist in specific environments. We have previously developed the software program Panseq, which identifies the pan-genome among a group of sequences, and the SuperPhy platform, which utilizes this pan-genome information to identify biomarkers that are predictive of groups of bacterial strains. In this study, we examined the pan-genome of 4893 genomes of Salmonella enterica, an enteric pathogen responsible for the loss of more disability adjusted life years than any other enteric pathogen. We identified a pan-genome of 25.3 Mbp, a strict core of 1.5 Mbp present in all genomes, and a conserved core of 3.2 Mbp found in at least 96% of these genomes. We also identified 404 genomic regions of 1000 bp that were specific to the species S. enterica. These species-specific regions were found to encode mostly hypothetical proteins, effectors, and other proteins related to virulence. For each of the six S. enterica subspecies, markers unique to each were identified. No serovar had pan-genome regions that were present in all of its genomes and absent in all other serovars; however, each serovar did have genomic regions that were universally present among all constituent members, and statistically predictive of the serovar. The phylogeny based on SNPs within the conserved core genome was found to be highly concordant to that produced by a phylogeny using the presence/absence of 1000 bp regions of the entire pan-genome. 
Future studies could use these predictive regions as components of a vaccine to prevent salmonellosis, as well as in simple and rapid diagnostic tests for both
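The marker criterion described above — a region universally present among all members of a group and absent elsewhere — reduces to a filter over a presence/absence matrix. A toy sketch with invented genome and region names:

```python
# Toy version of the pan-genome marker search: find regions present in
# every genome of a target group and absent from all others. Genome and
# region names below are invented for illustration.
presence = {
    "region_A": {"Typhimurium_1": 1, "Typhimurium_2": 1, "Enteritidis_1": 0},
    "region_B": {"Typhimurium_1": 1, "Typhimurium_2": 0, "Enteritidis_1": 0},
    "region_C": {"Typhimurium_1": 1, "Typhimurium_2": 1, "Enteritidis_1": 1},
}
target = {"Typhimurium_1", "Typhimurium_2"}

def diagnostic_regions(presence, target):
    hits = []
    for region, genomes in presence.items():
        in_target = all(v for g, v in genomes.items() if g in target)
        out_target = any(v for g, v in genomes.items() if g not in target)
        if in_target and not out_target:       # universal and exclusive
            hits.append(region)
    return hits

print(diagnostic_regions(presence, target))    # only region_A qualifies
```
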

  14. Basic Diagnosis and Prediction of Persistent Contrail Occurrence using High-resolution Numerical Weather Analyses/Forecasts and Logistic Regression. Part II: Evaluation of Sample Models

    Science.gov (United States)

    Duda, David P.; Minnis, Patrick

    2009-01-01

    Previous studies have shown that probabilistic forecasting may be a useful method for predicting persistent contrail formation. A probabilistic forecast of contrail formation over the contiguous United States (CONUS) is created using hourly meteorological analyses from the Advanced Regional Prediction System (ARPS) and from the Rapid Update Cycle (RUC), as well as GOES water vapor channel measurements, combined with surface and satellite observations of contrails. Two groups of logistic models were created. The first group of models (SURFACE models) is based on surface-based contrail observations supplemented with satellite observations of contrail occurrence. The second group of models (OUTBREAK models) is derived from a selected subgroup of satellite-based observations of widespread persistent contrails. The mean accuracies for both the SURFACE and OUTBREAK models typically exceeded 75 percent when based on the RUC or ARPS analysis data, but decreased when the logistic models were derived from ARPS forecast data.
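A logistic model of contrail occurrence of the kind described can be sketched in a few lines: map upper-tropospheric predictors to an occurrence probability and score the dichotomized forecast. The two predictors and the threshold-based "truth" below are simplifying assumptions, not the study's setup:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Schematic probabilistic contrail forecast: logistic regression from
# temperature and ice-relative humidity to occurrence probability.
# Data are simulated; the study used ARPS/RUC analyses plus surface
# and satellite contrail observations.
rng = np.random.default_rng(2)
n = 1000
temp = rng.uniform(-70, -30, n)      # °C at flight level
rhi = rng.uniform(0, 160, n)         # relative humidity w.r.t. ice, %
# Assumed truth: persistence needs cold air and ice supersaturation.
occurred = ((temp < -40) & (rhi > 100)).astype(int)

X = np.column_stack([temp, rhi])
model = LogisticRegression(max_iter=1000).fit(X, occurred)
prob = model.predict_proba(X)[:, 1]
accuracy = float(np.mean((prob > 0.5) == occurred))
print(f"percent correct = {100 * accuracy:.1f}%")
```
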

  15. A novel approach to predict sudden cardiac death (SCD) using nonlinear and time-frequency analyses from HRV signals.

    Directory of Open Access Journals (Sweden)

    Elias Ebrahimzadeh

    Full Text Available Investigations show that millions of people all around the world die as the result of sudden cardiac death (SCD). These deaths can be reduced by using medical equipment, such as defibrillators, after detection. We need suitable ways to assist doctors in predicting sudden cardiac death with a high level of accuracy. To do this, linear, time-frequency (TF), and nonlinear features were extracted from the HRV of the ECG signal. Finally, healthy people and people at risk of SCD are classified by k-Nearest Neighbor (k-NN) and a Multilayer Perceptron Neural Network (MLP). To evaluate, we compared the classification rates for both separate and combined nonlinear and TF features. The results show that HRV signals have special features in the vicinity of the occurrence of SCD that can distinguish between patients prone to SCD and normal people. We found that the combination of time-frequency and nonlinear features has a better ability to achieve higher accuracy. The experimental results show that the combination of features can predict SCD with an accuracy of 99.73%, 96.52%, 90.37% and 83.96% for the first, second, third and fourth one-minute intervals, respectively, before SCD occurrence.
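The classification stage described above compares k-NN against an MLP on the same feature vectors. A bare-bones analogue with synthetic two-class Gaussian features standing in for the extracted HRV features:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier

# Separate "at risk of SCD" from "healthy" records with k-NN and an MLP.
# The feature vectors are synthetic Gaussians, not real linear, TF, or
# nonlinear HRV features.
rng = np.random.default_rng(3)
healthy = rng.normal(0.0, 1.0, size=(100, 4))
at_risk = rng.normal(2.5, 1.0, size=(100, 4))
X = np.vstack([healthy, at_risk])
y = np.array([0] * 100 + [1] * 100)

for name, clf in [("k-NN", KNeighborsClassifier(n_neighbors=5)),
                  ("MLP", MLPClassifier(hidden_layer_sizes=(16,),
                                        max_iter=2000, random_state=0))]:
    acc = clf.fit(X, y).score(X, y)        # training accuracy only
    print(f"{name}: training accuracy = {acc:.2f}")
```

A real evaluation would of course use held-out data (e.g., cross-validation) rather than training accuracy.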

  16. Evaluation of multivariate statistical analyses for monitoring and prediction of processes in a seawater reverse osmosis desalination plant

    Energy Technology Data Exchange (ETDEWEB)

    Kolluri, Srinivas Sahan; Esfahani, Iman Janghorban; Garikiparthy, Prithvi Sai Nadh; Yoo, Chang Kyoo [Kyung Hee University, Yongin (Korea, Republic of)

    2015-08-15

    Our aim was to analyze, monitor, and predict the outcomes of processes in a full-scale seawater reverse osmosis (SWRO) desalination plant using multivariate statistical techniques. Multivariate analysis of variance (MANOVA) was used to investigate the performance and efficiencies of two SWRO processes, namely, pore-controllable fiber filter-reverse osmosis (PCF-SWRO) and sand filtration-ultra filtration-reverse osmosis (SF-UF-SWRO). Principal component analysis (PCA) was applied to monitor the two SWRO processes. PCA monitoring revealed that the SF-UF-SWRO process could be analyzed reliably with a low number of outliers and disturbances. Partial least squares (PLS) analysis was then conducted to predict which of the seven input parameters of feed flow rate, PCF/SF-UF filtrate flow rate, temperature of feed water, feed turbidity, pH, reverse osmosis (RO) flow rate, and pressure had a significant effect on the outcome variables of permeate flow rate and concentration. Root mean squared errors (RMSEs) of the PLS models for permeate flow rates were 31.5 and 28.6 for the PCF-SWRO process and SF-UF-SWRO process, respectively, while RMSEs of permeate concentrations were 350.44 and 289.4, respectively. These results indicate that the SF-UF-SWRO process can be modeled more accurately than the PCF-SWRO process, because the RMSE values of permeate flow rate and concentration obtained using a PLS regression model of the SF-UF-SWRO process were lower than those obtained for the PCF-SWRO process.
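The PCA monitoring step mentioned above is commonly implemented by fitting PCA on normal-operation data and flagging samples whose Hotelling T² exceeds a control limit. A sketch on simulated process data (the variable count, limit choice, and fault are all assumptions for illustration):

```python
import numpy as np
from sklearn.decomposition import PCA

# PCA-based process monitoring: fit PCA on normal-operation data, then
# flag samples whose Hotelling T^2 exceeds a control limit as outliers
# or disturbances. Process data below are simulated.
rng = np.random.default_rng(5)
normal = rng.normal(0.0, 1.0, size=(300, 7))     # 7 process variables
normal[:, 0] *= 5.0                              # one dominant variable
pca = PCA(n_components=2).fit(normal)

def hotelling_t2(pca, X):
    scores = pca.transform(X)
    return np.sum(scores ** 2 / pca.explained_variance_, axis=1)

t2 = hotelling_t2(pca, normal)
limit = float(np.percentile(t2, 99))             # empirical 99% limit
fault = np.zeros((1, 7))
fault[0, 0] = 40.0                               # out-of-control sample
print(bool(hotelling_t2(pca, fault)[0] > limit)) # flagged as disturbance
```
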

  17. Landscaping analyses of the ROC predictions of discrete-slots and signal-detection models of visual working memory.

    Science.gov (United States)

    Donkin, Chris; Tran, Sophia Chi; Nosofsky, Robert

    2014-10-01

    A fundamental issue concerning visual working memory is whether its capacity limits are better characterized in terms of a limited number of discrete slots (DSs) or a limited amount of a shared continuous resource. Rouder et al. (2008) found that a mixed-attention, fixed-capacity, DS model provided the best explanation of behavior in a change detection task, outperforming alternative continuous signal detection theory (SDT) models. Here, we extend their analysis in two ways: first, with experiments aimed at better distinguishing between the predictions of the DS and SDT models, and second, using a model-based analysis technique called landscaping, in which the functional-form complexity of the models is taken into account. We find that the balance of evidence supports a DS account of behavior in change detection tasks but that the SDT model is best when the visual displays always consist of the same number of items. In our General Discussion section, we outline, but ultimately reject, a number of potential explanations for the observed pattern of results. We finish by describing future research that is needed to pinpoint the basis for this observed pattern of results.

  18. The GENOTEND chip: a new tool to analyse gene expression in muscles of beef cattle for beef quality prediction

    Directory of Open Access Journals (Sweden)

    Hocquette Jean-Francois

    2012-08-01

    validated in the groups of 30 Charolais young bulls slaughtered in year 2, and in the 21 Charolais steers slaughtered in year 1, but not in the group of 19 steers slaughtered in year 2, which differed from the reference group by two factors (gender and year). When the first three groups of animals were analysed together, this subset of genes explained a 4-fold higher proportion of the variability in tenderness than muscle biochemical traits. Conclusion: This study underlined the relevance of the GENOTEND chip for identifying markers of beef quality, mainly by confirming previous results and by detecting other genes of the heat shock family as potential markers of beef quality. However, it was not always possible to extrapolate the relevance of these markers to all animal groups which differed by several factors (such as gender or environmental conditions of production) from the initial population of reference in which these markers were identified.

  19. Taxometric analyses and predictive accuracy of callous-unemotional traits regarding quality of life and behavior problems in non-conduct disorder diagnoses.

    Science.gov (United States)

    Herpers, Pierre C M; Klip, Helen; Rommelse, Nanda N J; Taylor, Mark J; Greven, Corina U; Buitelaar, Jan K

    2017-07-01

    Callous-unemotional (CU) traits have mainly been studied in relation to conduct disorder (CD), but can also occur in other disorder groups. However, it is unclear whether there is a clinically relevant cut-off value of levels of CU traits in predicting reduced quality of life (QoL) and clinical symptoms, and whether CU traits better fit a categorical (taxonic) or dimensional model. Parents of 979 youths referred to a child and adolescent psychiatric clinic rated their child's CU traits on the Inventory of Callous-Unemotional traits (ICU), QoL on the Kidscreen-27, and clinical symptoms on the Child Behavior Checklist. Experienced clinicians conferred DSM-IV-TR diagnoses of ADHD, ASD, anxiety/mood disorders and DBD-NOS/ODD. The ICU was also used to score the DSM-5 specifier 'with limited prosocial emotions' (LPE) of Conduct Disorder. Receiver operating characteristic (ROC) analyses revealed that the predictive accuracy of the ICU and LPE regarding QoL and clinical symptoms was poor to fair, and similar across diagnoses. A clinical cut-off point could not be defined. Taxometric analyses suggested that callous-unemotional traits on the ICU best reflect a dimension rather than taxon. More research is needed on the impact of CU traits on the functional adaptation, course, and response to treatment of non-CD conditions. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.
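The ROC step described above — scoring a continuous trait measure against a binary outcome and looking for a usable cut-off — can be sketched as follows. The scores are simulated with heavy class overlap, mimicking the poor-to-fair accuracy reported; the Youden index is one common cut-off criterion (an assumption here, not necessarily the authors' choice):

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

# ROC analysis of a synthetic questionnaire total against a binary
# outcome; compute AUC and inspect the Youden index J = TPR - FPR to
# see whether any cut-off stands out. Data are simulated.
rng = np.random.default_rng(4)
scores = np.concatenate([rng.normal(20, 8, 300),    # outcome-negative
                         rng.normal(24, 8, 120)])   # outcome-positive
outcome = np.array([0] * 300 + [1] * 120)

fpr, tpr, thresholds = roc_curve(outcome, scores)
auc = roc_auc_score(outcome, scores)
j = tpr - fpr
best = thresholds[np.argmax(j)]
print(f"AUC = {auc:.2f}; best Youden cut-off = {best:.1f} (J = {j.max():.2f})")
```

With this much overlap the AUC sits well below the ~0.8 usually taken to indicate good discrimination, which is the situation where no clinically useful cut-off can be defined.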

  20. Basic Diagnosis and Prediction of Persistent Contrail Occurrence using High-resolution Numerical Weather Analyses/Forecasts and Logistic Regression. Part I: Effects of Random Error

    Science.gov (United States)

    Duda, David P.; Minnis, Patrick

    2009-01-01

    Straightforward application of the Schmidt-Appleman contrail formation criteria to diagnose persistent contrail occurrence from numerical weather prediction data is hindered by significant bias errors in the upper tropospheric humidity. Logistic models of contrail occurrence have been proposed to overcome this problem, but basic questions remain about how random measurement error may affect their accuracy. A set of 5000 synthetic contrail observations is created to study the effects of random error in these probabilistic models. The simulated observations are based on distributions of temperature, humidity, and vertical velocity derived from Advanced Regional Prediction System (ARPS) weather analyses. The logistic models created from the simulated observations were evaluated using two common statistical measures of model accuracy, the percent correct (PC) and the Hanssen-Kuipers discriminant (HKD). To convert the probabilistic results of the logistic models into a dichotomous yes/no choice suitable for the statistical measures, two critical probability thresholds are considered. The HKD scores are higher when the climatological frequency of contrail occurrence is used as the critical threshold, while the PC scores are higher when the critical probability threshold is 0.5. For both thresholds, typical random errors in temperature, relative humidity, and vertical velocity are found to be small enough to allow for accurate logistic models of contrail occurrence. The accuracy of the models developed from synthetic data is over 85 percent for both the prediction of contrail occurrence and non-occurrence, although in practice, larger errors would be anticipated.
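The two verification scores used above are computed from a 2x2 contingency table of forecasts against observations: hits (a), false alarms (b), misses (c), and correct negatives (d). A minimal sketch with invented counts:

```python
# Percent correct and Hanssen-Kuipers discriminant from a contingency
# table: hits (a), false alarms (b), misses (c), correct negatives (d).
# The counts below are invented for illustration.
def percent_correct(a, b, c, d):
    return (a + d) / (a + b + c + d)

def hanssen_kuipers(a, b, c, d):
    # HKD = POD - POFD = a/(a+c) - b/(b+d)
    return a / (a + c) - b / (b + d)

a, b, c, d = 80, 30, 20, 370      # hypothetical forecast counts
print(f"PC  = {percent_correct(a, b, c, d):.2f}")
print(f"HKD = {hanssen_kuipers(a, b, c, d):.2f}")
```

Unlike PC, the HKD is insensitive to the climatological frequency of the event, which is why the two scores favor different probability thresholds in the text above.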

  1. Cost-effectiveness of insulin aspart versus human soluble insulin in type 2 diabetes in four European countries: subgroup analyses from the PREDICTIVE study.

    Science.gov (United States)

    Palmer, James L; Goodall, Gordon; Nielsen, Steffen; Kotchie, Robert W; Valentine, William J; Palmer, Andrew J; Roze, Stéphane

    2008-05-01

    To evaluate the long-term health economic outcomes associated with insulin aspart (IAsp) compared to human soluble insulin (HI) in type 2 diabetes patients on basal-bolus therapy in Sweden, Spain, Italy and Poland. A published computer simulation model of diabetes was used to predict life expectancy, quality-adjusted life expectancy and incidence of diabetes-related complications. Baseline cohort characteristics (age 61.6 years, duration of diabetes 13.2 years, 45.1% male, HbA(1c) 8.2%, BMI 29.8 kg/m(2)) and treatment effects were derived from the PREDICTIVE observational study. Country-specific complication costs were derived from published sources. The analyses were run over 35-year time horizons from third-party payer perspectives in Spain, Italy and Poland and from a societal perspective in Sweden. Future costs and clinical benefits were discounted at country-specific discount rates. Sensitivity analyses were performed. IAsp was associated with improvements in discounted life expectancy and quality-adjusted life expectancy, and a reduced incidence of most diabetes-related complications versus HI in all four settings. IAsp was associated with societal cost-savings in Sweden (SEK 2470), direct medical cost-savings in Sweden and Spain (SEK 8248 and euro 1382, respectively), but increased direct costs in Italy (euro 2235) and Poland (euro 743). IAsp was associated with improved quality-adjusted life expectancy in Sweden (0.077 QALYs), Spain (0.080 QALYs), Italy (0.120 QALYs) and Poland (0.003 QALYs). IAsp was dominant versus HI in both Sweden and Spain, would be considered cost-effective in Italy with an incremental cost-effectiveness ratio of euro 18,597 per QALY gained, but would not be considered cost-effective in Poland.
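The cost-effectiveness arithmetic behind statements like "euro 18,597 per QALY gained" and "dominant" is the incremental cost-effectiveness ratio: incremental cost divided by incremental QALYs, with "dominant" meaning cheaper and more effective. A sketch with illustrative inputs (not the PREDICTIVE study's figures):

```python
# Incremental cost-effectiveness ratio (ICER). An intervention that is
# cheaper AND more effective than its comparator is "dominant". Inputs
# below are hypothetical.
def icer(cost_new, cost_old, qaly_new, qaly_old):
    d_cost = cost_new - cost_old
    d_qaly = qaly_new - qaly_old
    if d_cost <= 0 and d_qaly > 0:
        return "dominant"              # cheaper and more effective
    return d_cost / d_qaly

# Hypothetical: therapy costs 2232 euro more and adds 0.120 QALYs.
result = icer(cost_new=52232.0, cost_old=50000.0,
              qaly_new=5.120, qaly_old=5.000)
print(f"ICER = {result:,.0f} euro per QALY gained")
```
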

  2. Computational fluid dynamics analyses of lateral heat conduction, coolant azimuthal mixing and heat transfer predictions in a BR2 fuel assembly geometry

    International Nuclear Information System (INIS)

    Tzanos, C.P.; Dionne, B.

    2011-01-01

    To support the analyses related to the conversion of the BR2 core from highly-enriched (HEU) to low-enriched (LEU) fuel, the thermal-hydraulics codes PLTEMP and RELAP-3D are used to evaluate the safety margins during steady-state operation (PLTEMP), as well as after a loss-of-flow, loss-of-pressure, or loss-of-coolant event (RELAP). In the 1-D PLTEMP and RELAP simulations, conduction in the azimuthal and axial directions is not accounted for. The very good thermal conductivity of the cladding and the fuel meat, combined with significant temperature gradients in the lateral (axial and azimuthal) directions, could lead to a heat flux distribution that is significantly different from the power distribution. To evaluate the significance of the lateral heat conduction, 3-D computational fluid dynamics (CFD) simulations were performed using the CFD code STAR-CD. Safety margin calculations are typically performed for a hot stripe, i.e., an azimuthal region of the fuel plates/coolant channel containing the power peak. In a RELAP model, for example, a channel between two plates could be divided into a number of RELAP channels (stripes) in the azimuthal direction. In a PLTEMP model, the effect of azimuthal power peaking could be taken into account by using engineering factors. However, if the thermal mixing in the azimuthal direction of a coolant channel is significant, a striping approach could be overly conservative by not taking this mixing into account. STAR-CD simulations were also performed to study the thermal mixing in the coolant. Section II of this document presents the results of the analyses of the lateral heat conduction and azimuthal thermal mixing in a coolant channel. Finally, PLTEMP and RELAP simulations rely on the use of correlations to determine heat transfer coefficients. Previous analyses showed that the Dittus-Boelter correlation gives significantly more conservative (lower) predictions than the correlations of Sieder-Tate and Petukhov. STAR-CD 3-D

  3. Genetically Predicted Body Mass Index and Breast Cancer Risk: Mendelian Randomization Analyses of Data from 145,000 Women of European Descent

    Science.gov (United States)

    Guo, Yan; Warren Andersen, Shaneda; Shu, Xiao-Ou; Michailidou, Kyriaki; Bolla, Manjeet K.; Wang, Qin; Garcia-Closas, Montserrat; Milne, Roger L.; Schmidt, Marjanka K.; Chang-Claude, Jenny; Dunning, Allison; Bojesen, Stig E.; Ahsan, Habibul; Aittomäki, Kristiina; Andrulis, Irene L.; Anton-Culver, Hoda; Beckmann, Matthias W.; Beeghly-Fadiel, Alicia; Benitez, Javier; Bogdanova, Natalia V.; Bonanni, Bernardo; Børresen-Dale, Anne-Lise; Brand, Judith; Brauch, Hiltrud; Brenner, Hermann; Brüning, Thomas; Burwinkel, Barbara; Casey, Graham; Chenevix-Trench, Georgia; Couch, Fergus J.; Cross, Simon S.; Czene, Kamila; Dörk, Thilo; Dumont, Martine; Fasching, Peter A.; Figueroa, Jonine; Flesch-Janys, Dieter; Fletcher, Olivia; Flyger, Henrik; Fostira, Florentia; Gammon, Marilie; Giles, Graham G.; Guénel, Pascal; Haiman, Christopher A.; Hamann, Ute; Hooning, Maartje J.; Hopper, John L.; Jakubowska, Anna; Jasmine, Farzana; Jenkins, Mark; John, Esther M.; Johnson, Nichola; Jones, Michael E.; Kabisch, Maria; Knight, Julia A.; Koppert, Linetta B.; Kosma, Veli-Matti; Kristensen, Vessela; Le Marchand, Loic; Lee, Eunjung; Li, Jingmei; Lindblom, Annika; Lubinski, Jan; Malone, Kathi E.; Mannermaa, Arto; Margolin, Sara; McLean, Catriona; Meindl, Alfons; Neuhausen, Susan L.; Nevanlinna, Heli; Neven, Patrick; Olson, Janet E.; Perez, Jose I. A.; Perkins, Barbara; Phillips, Kelly-Anne; Pylkäs, Katri; Rudolph, Anja; Santella, Regina; Sawyer, Elinor J.; Schmutzler, Rita K.; Seynaeve, Caroline; Shah, Mitul; Shrubsole, Martha J.; Southey, Melissa C.; Swerdlow, Anthony J.; Toland, Amanda E.; Tomlinson, Ian; Torres, Diana; Truong, Thérèse; Ursin, Giske; Van Der Luijt, Rob B.; Verhoef, Senno; Whittemore, Alice S.; Winqvist, Robert; Zhao, Hui; Zhao, Shilin; Hall, Per; Simard, Jacques; Kraft, Peter; Hunter, David; Easton, Douglas F.; Zheng, Wei

    2016-01-01

    Background Observational epidemiological studies have shown that high body mass index (BMI) is associated with a reduced risk of breast cancer in premenopausal women but an increased risk in postmenopausal women. It is unclear whether this association is mediated through shared genetic or environmental factors. Methods We applied Mendelian randomization to evaluate the association between BMI and risk of breast cancer occurrence using data from two large breast cancer consortia. We created a weighted BMI genetic score comprising 84 BMI-associated genetic variants to predict BMI. We evaluated genetically predicted BMI in association with breast cancer risk using individual-level data from the Breast Cancer Association Consortium (BCAC) (cases = 46,325, controls = 42,482). We further evaluated the association between genetically predicted BMI and breast cancer risk using summary statistics from 16,003 cases and 41,335 controls from the Discovery, Biology, and Risk of Inherited Variants in Breast Cancer (DRIVE) Project. Because most studies measured BMI after cancer diagnosis, we could not conduct a parallel analysis to adequately evaluate the association of measured BMI with breast cancer risk prospectively. Results In the BCAC data, genetically predicted BMI was found to be inversely associated with breast cancer risk (odds ratio [OR] = 0.65 per 5 kg/m² increase, 95% confidence interval [CI]: 0.56–0.75, p = 3.32 × 10⁻¹⁰). The associations were similar for both premenopausal (OR = 0.44, 95% CI: 0.31–0.62, p = 9.91 × 10⁻⁸) and postmenopausal breast cancer (OR = 0.57, 95% CI: 0.46–0.71, p = 1.88 × 10⁻⁸). This association was replicated in the data from the DRIVE consortium (OR = 0.72, 95% CI: 0.60–0.84, p = 1.64 × 10⁻⁷). Single marker analyses identified 17 of the 84 BMI-associated single nucleotide polymorphisms (SNPs) in association with breast cancer risk at p < 0.05.
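The weighted-genetic-score step of a Mendelian randomization analysis like the one above can be sketched with synthetic data. Everything below (sample size, allele counts, weights, effect sizes) is hypothetical and only illustrates the mechanics of a weighted allele score and a crude Wald-type ratio estimate; it is not the BCAC/DRIVE analysis.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 5000, 84  # individuals, BMI-associated SNPs (84 as in the abstract)

# Hypothetical per-SNP data: allele counts 0/1/2 and published BMI weights.
genotypes = rng.integers(0, 3, size=(n, k)).astype(float)
weights = rng.normal(0.02, 0.01, size=k)  # per-allele effect on BMI (illustrative)

# Weighted genetic score: allele counts times published effect sizes, summed.
score = genotypes @ weights

# Simulate BMI driven partly by the score, and a binary case status whose
# log-odds decrease with BMI, mimicking the inverse association reported.
bmi = 27 + 2.0 * (score - score.mean()) / score.std() + rng.normal(0, 3, n)
logit = -0.08 * (bmi - bmi.mean())
case = rng.random(n) < 1 / (1 + np.exp(-logit))

# Wald-type ratio: (score -> outcome slope) / (score -> BMI slope) gives the
# per-unit-BMI effect; the simplest instrumental-variable estimator.
beta_zx = np.polyfit(score, bmi, 1)[0]
beta_zy = np.polyfit(score, case.astype(float), 1)[0]
print(beta_zy / beta_zx)  # crude per-kg/m^2 effect on case probability (negative here)
```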

  4. Genetically Predicted Body Mass Index and Breast Cancer Risk: Mendelian Randomization Analyses of Data from 145,000 Women of European Descent.

    Directory of Open Access Journals (Sweden)

    Yan Guo

    2016-08-01

    Full Text Available Observational epidemiological studies have shown that high body mass index (BMI) is associated with a reduced risk of breast cancer in premenopausal women but an increased risk in postmenopausal women. It is unclear whether this association is mediated through shared genetic or environmental factors. We applied Mendelian randomization to evaluate the association between BMI and risk of breast cancer occurrence using data from two large breast cancer consortia. We created a weighted BMI genetic score comprising 84 BMI-associated genetic variants to predict BMI. We evaluated genetically predicted BMI in association with breast cancer risk using individual-level data from the Breast Cancer Association Consortium (BCAC) (cases = 46,325, controls = 42,482). We further evaluated the association between genetically predicted BMI and breast cancer risk using summary statistics from 16,003 cases and 41,335 controls from the Discovery, Biology, and Risk of Inherited Variants in Breast Cancer (DRIVE) Project. Because most studies measured BMI after cancer diagnosis, we could not conduct a parallel analysis to adequately evaluate the association of measured BMI with breast cancer risk prospectively. In the BCAC data, genetically predicted BMI was found to be inversely associated with breast cancer risk (odds ratio [OR] = 0.65 per 5 kg/m² increase, 95% confidence interval [CI]: 0.56–0.75, p = 3.32 × 10⁻¹⁰). The associations were similar for both premenopausal (OR = 0.44, 95% CI: 0.31–0.62, p = 9.91 × 10⁻⁸) and postmenopausal breast cancer (OR = 0.57, 95% CI: 0.46–0.71, p = 1.88 × 10⁻⁸). This association was replicated in the data from the DRIVE consortium (OR = 0.72, 95% CI: 0.60–0.84, p = 1.64 × 10⁻⁷). Single marker analyses identified 17 of the 84 BMI-associated single nucleotide polymorphisms (SNPs) in association with breast cancer risk at p < 0.05; for 16 of them, the

  5. Application of pathways analyses for site performance prediction for the Gas Centrifuge Enrichment Plant and Oak Ridge Central Waste Disposal Facility

    International Nuclear Information System (INIS)

    Pin, F.G.; Oblow, E.M.

    1984-01-01

    The suitability of the Gas Centrifuge Enrichment Plant and the Oak Ridge Central Waste Disposal Facility for shallow-land burial of low-level radioactive waste is evaluated using pathways analyses. The analyses rely on conservative scenarios to describe the generation and migration of contamination and the potential human exposure to the waste. Conceptual and numerical models are developed using data from comprehensive laboratory and field investigations and are used to simulate the long-term transport of contamination to man. Conservatism is built into the analyses when assumptions concerning future events have to be made or when uncertainties concerning site or waste characteristics exist. Maximum potential doses to man are calculated and compared to the appropriate standards. The sites are found to provide an adequate buffer to persons outside the DOE reservations. Conclusions concerning site capacity and site acceptability are drawn. In reaching these conclusions, some consideration is given to the uncertainties and conservatisms involved in the analyses. Analytical methods to quantitatively assess the probability that future events will occur and the sensitivity of the results to data uncertainty may prove useful in relaxing some of the conservatism built into the analyses. The applicability of such methods to pathways analyses is briefly discussed. 18 refs., 9 figs

  6. Using meta-analytic path analysis to test theoretical predictions in health behavior: An illustration based on meta-analyses of the theory of planned behavior.

    Science.gov (United States)

    Hagger, Martin S; Chan, Derwin K C; Protogerou, Cleo; Chatzisarantis, Nikos L D

    2016-08-01

    Synthesizing research on social cognitive theories applied to health behavior is an important step in the development of an evidence base of psychological factors as targets for effective behavioral interventions. However, few meta-analyses of research on social cognitive theories in health contexts have conducted simultaneous tests of theoretically-stipulated pattern effects using path analysis. We argue that conducting path analyses of meta-analytic effects among constructs from social cognitive theories is important to test nomological validity, account for mediation effects, and evaluate unique effects of theory constructs independent of past behavior. We illustrate our points by conducting new analyses of two meta-analyses of a popular theory applied to health behaviors, the theory of planned behavior. We conducted meta-analytic path analyses of the theory in two behavioral contexts (alcohol and dietary behaviors) using data from the primary studies included in the original meta-analyses augmented to include intercorrelations among constructs and relations with past behavior missing from the original analysis. Findings supported the nomological validity of the theory and its hypotheses for both behaviors, confirmed important model processes through mediation analysis, demonstrated the attenuating effect of past behavior on theory relations, and provided estimates of the unique effects of theory constructs independent of past behavior. Our analysis illustrates the importance of conducting a simultaneous test of theory-stipulated effects in meta-analyses of social cognitive theories applied to health behavior. We recommend researchers adopt this analytic procedure when synthesizing evidence across primary tests of social cognitive theories in health. Copyright © 2016 Elsevier Inc. All rights reserved.
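Meta-analytic path analysis of the kind described above works on a pooled correlation matrix rather than raw data: standardized path coefficients are obtained by regressing each endogenous construct on its predictors' correlations. A minimal sketch, with an entirely hypothetical correlation matrix among the theory of planned behavior constructs (not values from the meta-analyses):

```python
import numpy as np

# HYPOTHETICAL pooled correlations among attitude, subjective norm, perceived
# behavioral control (PBC), intention, and behavior (placeholder values only).
R = np.array([
    [1.00, 0.35, 0.30, 0.50, 0.30],
    [0.35, 1.00, 0.25, 0.35, 0.20],
    [0.30, 0.25, 1.00, 0.40, 0.25],
    [0.50, 0.35, 0.40, 1.00, 0.45],
    [0.30, 0.20, 0.25, 0.45, 1.00],
])

def paths(R, predictors, outcome):
    """Standardized path coefficients: beta = Rxx^-1 rxy (OLS on correlations)."""
    Rxx = R[np.ix_(predictors, predictors)]
    rxy = R[np.ix_(predictors, [outcome])]
    return np.linalg.solve(Rxx, rxy).ravel()

# TPB structure: attitude, norm, PBC -> intention; intention, PBC -> behavior.
b_intention = paths(R, [0, 1, 2], 3)
b_behavior = paths(R, [3, 2], 4)
print(dict(zip(["attitude", "norm", "pbc"], b_intention.round(3))))
print(dict(zip(["intention", "pbc"], b_behavior.round(3))))
```

Mediation and past-behavior effects are tested the same way, by adding the extra construct's row/column to the matrix and re-solving.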

  7. In silico and cell-based analyses reveal strong divergence between prediction and observation of T-cell-recognized tumor antigen T-cell epitopes.

    Science.gov (United States)

    Schmidt, Julien; Guillaume, Philippe; Dojcinovic, Danijel; Karbach, Julia; Coukos, George; Luescher, Immanuel

    2017-07-14

    Tumor exomes provide comprehensive information on mutated, overexpressed genes and aberrant splicing, which can be exploited for personalized cancer immunotherapy. Of particular interest are mutated tumor antigen T-cell epitopes, because neoepitope-specific T cells often are tumoricidal. However, identifying tumor-specific T-cell epitopes is a major challenge. A widely used strategy relies on initial prediction of human leukocyte antigen-binding peptides by in silico algorithms, but the predictive power of this approach is unclear. Here, we used the human tumor antigen NY-ESO-1 (ESO) and the human leukocyte antigen variant HLA-A*0201 (A2) as a model and predicted in silico the 41 highest-affinity, A2-binding 8-11-mer peptides and assessed their binding, kinetic complex stability, and immunogenicity in A2-transgenic mice and on peripheral blood mononuclear cells from ESO-vaccinated melanoma patients. We found that 19 of the peptides strongly bound to A2, 10 of which formed stable A2-peptide complexes and induced CD8+ T cells in A2-transgenic mice. However, only 5 of the peptides induced cognate T cells in humans; these peptides exhibited strong binding and complex stability and contained multiple large hydrophobic and aromatic amino acids. These results were not predicted by in silico algorithms and provide new clues to improving T-cell epitope identification. In conclusion, our findings indicate that only a small fraction of in silico-predicted A2-binding ESO peptides are immunogenic in humans, namely those that have high peptide-binding strength and complex stability. This observation highlights the need for improving in silico predictions of peptide immunogenicity. © 2017 by The American Society for Biochemistry and Molecular Biology, Inc.

  8. Molecular and clinical analyses of Greig cephalopolysyndactyly and Pallister-Hall syndromes: Robust phenotype prediction from the type and position of GLI3 mutations

    NARCIS (Netherlands)

    Johnston, Jennifer J.; Olivos-Glander, Isabelle; Killoran, Christina; Elson, Emma; Turner, Joyce T.; Peters, Kathryn F.; Abbott, Margaret H.; Aughton, David J.; Aylsworth, Arthur S.; Bamshad, Michael J.; Booth, Carol; Curry, Cynthia J.; David, Albert; Dinulos, Mary Beth; Flannery, David B.; Fox, Michelle A.; Graham, John M.; Grange, Dorothy K.; Guttmacher, Alan E.; Hannibal, Mark C.; Henn, Wolfram; Hennekam, Raoul C. M.; Holmes, Lewis B.; Hoyme, H. Eugene; Leppig, Kathleen A.; Lin, Angela E.; Macleod, Patrick; Manchester, David K.; Marcelis, Carlo; Mazzanti, Laura; McCann, Emma; McDonald, Marie T.; Mendelsohn, Nancy J.; Moeschler, John B.; Moghaddam, Billur; Neri, Giovanni; Newbury-Ecob, Ruth; Pagon, Roberta A.; Phillips, John A.; Sadler, Laurie S.; Stoler, Joan M.; Tilstra, David; Walsh Vockley, Catherine M.; Zackai, Elaine H.; Zadeh, Touran M.; Brueton, Louise; Black, Graeme Charles M.; Biesecker, Leslie G.

    2005-01-01

    Mutations in the GLI3 zinc-finger transcription factor gene cause Greig cephalopolysyndactyly syndrome (GCPS) and Pallister-Hall syndrome (PHS), which are variable but distinct clinical entities. We hypothesized that GLI3 mutations that predict a truncated functional repressor protein cause PHS and

  9. The Janus-faced nature of time spent on homework : Using latent profile analyses to predict academic achievement over a school year

    NARCIS (Netherlands)

    Flunger, Barbara; Trautwein, Ulrich; Nagengast, Benjamin; Lüdtke, Oliver; Niggli, Alois; Schnyder, Inge

    2015-01-01

    Homework time and achievement are only modestly associated, whereas homework effort has consistently been shown to positively predict later achievement. We argue that time spent on homework can be an important predictor of achievement when combined with measures of homework effort. Latent profile

  10. Consequences of kriging and land use regression for PM2.5 predictions in epidemiologic analyses: insights into spatial variability using high-resolution satellite data.

    Science.gov (United States)

    Alexeeff, Stacey E; Schwartz, Joel; Kloog, Itai; Chudnovsky, Alexandra; Koutrakis, Petros; Coull, Brent A

    2015-01-01

    Many epidemiological studies use predicted air pollution exposures as surrogates for true air pollution levels. These predicted exposures contain exposure measurement error, yet simulation studies have typically found negligible bias in the resulting health effect estimates. However, previous studies typically assumed a statistical spatial model for air pollution exposure, which may be oversimplified. We address this shortcoming by assuming a realistic, complex exposure surface derived from fine-scale (1 km × 1 km) remote-sensing satellite data. Using simulation, we evaluate the accuracy of epidemiological health effect estimates in linear and logistic regression when using spatial air pollution predictions from kriging and land use regression models. We examined chronic (long-term) and acute (short-term) exposure to air pollution. Results varied substantially across different scenarios. Exposure models with low out-of-sample R² yielded severe biases in the health effect estimates of some models, ranging from 60% upward bias to 70% downward bias. One land use regression exposure model with out-of-sample R² > 0.9 yielded upward biases up to 13% for acute health effect estimates. Almost all models drastically underestimated the SEs. Land use regression models performed better in chronic effect simulations. These results can help researchers when interpreting health effect estimates in these types of studies.
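The bias mechanism described above (exposure models with lower out-of-sample R² yielding distorted health effect estimates) can be illustrated with a toy simulation. This sketch assumes simple classical measurement error on a synthetic exposure, not the paper's satellite-derived surface; with classical error, the slope attenuates roughly in proportion to the prediction R².

```python
import numpy as np

rng = np.random.default_rng(1)
n, beta_true = 20000, 0.5

true_pm = rng.gamma(shape=4.0, scale=2.5, size=n)        # synthetic "true" PM2.5
health = 10 + beta_true * true_pm + rng.normal(0, 5, n)  # linear health model

results = {}
for r2 in (0.9, 0.5):
    # Build a predicted exposure whose R^2 against the truth is ~r2
    # (classical error: noise added on top of the true exposure).
    noise_sd = np.sqrt(true_pm.var() * (1 - r2) / r2)
    predicted = true_pm + rng.normal(0, noise_sd, n)
    results[r2] = np.polyfit(predicted, health, 1)[0]    # fitted health slope
    print(f"R^2={r2}: beta_hat={results[r2]:.3f} (true {beta_true})")
```

The low-R² model gives the more attenuated slope, mirroring the downward biases reported for weak exposure models (the paper's scenarios are richer, including Berkson-type error and spatial structure).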

  11. A systematic study of coordinate precision in X-ray structure analyses. Pt. 2. Predictive estimates of E.S.D.'s for the general-atom case

    International Nuclear Information System (INIS)

    Allen, F.H.; Cole, J.C.; Durham Univ.; Howard, J.A.K.; Durham Univ.

    1995-01-01

    The relationship between the mean isotropic e.s.d. σ̄(A)_o of any element type A in a crystal structure and the R factor and atomic constitution of that structure is explored for 124,905 element-type occurrences calculated from 33,955 entries in the Cambridge Structural Database. On the basis of the work of Cruickshank [Acta Cryst. (1960), 13, 774-777], it is shown that σ̄(A)_p values can be estimated by equations of the form σ̄(A)_p = KRN_c^(1/2)/Z_A, where N_c is taken as ΣZ_i²/Z_C², the Z_i are atomic numbers and the summation is over all atoms in the asymmetric unit. Values of K were obtained by regression techniques using the σ̄(A)_o as basis. The constant K_nc for noncentrosymmetric structures is found to be larger than K_c for centrosymmetric structures by a factor of ∼2^(1/2), as predicted by Cruickshank (1960). Two predictive equations are generated, one for first-row elements and the second for elements with Z_A > 10. The relationship between the different constants K that arise in these two situations is linked to shape differentials in scattering-factor (f_i) curves for light and heavy atoms. It is found that predictive equations in which the Z_i are selectively replaced by f_i at a constant sin θ/λ of 0.30 Å⁻¹ generate closely similar values of K for the light-atom and heavy-atom subsets. The overall analysis indicates that atomic e.s.d.'s may be seriously underestimated in the more precise structure determinations, that e.s.d.'s for the heaviest atoms may be less reliable than those for lighter atoms and that e.s.d.'s in noncentrosymmetric structures may be less accurate than those in centrosymmetric structures. (orig.)
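The predictive equation above, σ̄(A)_p = KRN_c^(1/2)/Z_A with N_c = ΣZ_i²/Z_C², can be written as a small function. The K value below is an illustrative placeholder, not one of the paper's regression-fitted constants:

```python
import math

def predicted_esd(r_factor, atomic_numbers, z_a, k=1.0, z_c=6):
    """sigma_bar(A)_p = K * R * sqrt(N_c) / Z_A, N_c = sum(Z_i^2) / Z_C^2.

    atomic_numbers: Z_i for all atoms in the asymmetric unit (Z_C = 6, carbon).
    K is an ILLUSTRATIVE placeholder, not a fitted constant from the paper.
    """
    n_c = sum(z * z for z in atomic_numbers) / z_c**2
    return k * r_factor * math.sqrt(n_c) / z_a

# Example: a hypothetical small organic molecule (C10 N2 O2), predicted e.s.d.
# of a carbon atom (Z_A = 6) in a structure refined to R = 0.05.
atoms = [6] * 10 + [7] * 2 + [8] * 2
print(predicted_esd(0.05, atoms, z_a=6))  # ~0.034 (in K's units)
```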

  12. Comparative Analyses of the Relative Effects of Various Mutations in Major Histocompatibility Complex I: A Way to Predict Protein-Protein Interactions.

    Science.gov (United States)

    Ali, Ananya; Biswas, Ria; Bhattacharjee, Sanchari; Nath, Prabahan; Pan, Sumanjit; Bagchi, Angshuman

    2016-09-01

    Protein-protein interactions (PPIs) play pivotal roles in most biological processes, and PPI dysfunctions are therefore associated with disease. Mutations often lead to PPI dysfunctions, but certain other mutations do not cause any appreciable abnormalities; mutations of this second type are called polymorphic mutations. So far, many studies have dealt with the identification of PPI sites, but clear-cut analyses of the involvement of mutations in PPI dysfunction are few and far between. We therefore made an attempt to link the appearance of mutations to PPI disruptions, using the major histocompatibility complex as our reference protein complex. We analyzed the mutations leading to the disease amyloidosis as well as other mutations that do not lead to disease conditions. We computed various biophysical parameters, such as relative solvent accessibility, to discriminate between the two different types of mutations. Our analyses for the first time came up with a plausible explanation for the effects of different types of mutations on disease development. Our future plan is to build tools that detect the effects of mutations on disease development through PPI disruption.

  13. Effect of the fermentation pH on the storage stability of Lactobacillus rhamnosus preparations and suitability of in vitro analyses of cell physiological functions to predict it.

    Science.gov (United States)

    Saarela, M H; Alakomi, H-L; Puhakka, A; Mättö, J

    2009-04-01

    To investigate how cell physiological functions can predict the stability of freeze-dried probiotics, and to investigate the effect of the fermentation pH on the stability of probiotics. Fermenter-grown (pH 5.8 or 5.0) Lactobacillus rhamnosus cells were freeze-dried, and their survival was evaluated during storage at 37 °C, in apple juice, and during acid [hydrochloric acid (HCl) and malic acid] and bile exposure. Cells grown at pH 5.0 generally coped better with acid stress than cells grown at pH 5.8. Cells were more sensitive to malic acid than to HCl. Short-term stability results of Lact. rhamnosus cells in malic acid correlated well with the long-term stability results in apple juice, whereas the results of cell membrane integrity studies were in accordance with the bile exposure results. Malic acid exposure can prove useful in evaluating the long-term stability of probiotic preparations in apple juice. Fermentation at reduced pH may ensure a better performance of Lact. rhamnosus cells during subsequent acid stress. The beneficial effect of a lowered fermentation pH on Lact. rhamnosus stability during storage in apple juice, and the usefulness of the malic acid test in predicting that stability, were shown.

  14. Biomimetic in vitro oxidation of lapachol: a model to predict and analyse the in vivo phase I metabolism of bioactive compounds.

    Science.gov (United States)

    Niehues, Michael; Barros, Valéria Priscila; Emery, Flávio da Silva; Dias-Baruffi, Marcelo; Assis, Marilda das Dores; Lopes, Norberto Peporine

    2012-08-01

    The bioactive naphthoquinone lapachol was studied in vitro in a biomimetic model with the Jacobsen catalyst (manganese(III) salen) and iodosylbenzene as the oxidizing agent. Eleven oxidation derivatives were thus identified and two competitive oxidation pathways postulated. Similar to Mn(III) porphyrins, the Jacobsen catalyst mainly induced the formation of para-naphthoquinone derivatives of lapachol, but also of two ortho-derivatives. The oxidation products were used to develop a GC-MS (SIM mode) method for the identification of potential phase I metabolites in vivo. Plasma analysis of Wistar rats orally administered lapachol revealed two metabolites, α-lapachone and dehydro-α-lapachone. Hence, the biomimetic model with a manganese salen complex has proved a valuable tool to predict and elucidate the in vivo phase I metabolism of lapachol, and possibly also of other bioactive natural compounds. Copyright © 2012 Elsevier Masson SAS. All rights reserved.

  15. Generation of a predicted protein database from EST data and application to iTRAQ analyses in grape (Vitis vinifera cv. Cabernet Sauvignon) berries at ripening initiation

    Directory of Open Access Journals (Sweden)

    Smith Derek

    2009-01-01

    Full Text Available Abstract Background iTRAQ is a proteomics technique that uses isobaric tags for relative and absolute quantitation of tryptic peptides. In proteomics experiments, the detection and high confidence annotation of proteins and the significance of corresponding expression differences can depend on the quality and the species specificity of the tryptic peptide map database used for analysis of the data. For species for which finished genome sequence data are not available, identification of proteins relies on similarity to proteins from other species using comprehensive peptide map databases such as the MSDB. Results We were interested in characterizing ripening initiation ('veraison') in grape berries at the protein level in order to better define the molecular control of this important process for grape growers and wine makers. We developed a bioinformatic pipeline for processing EST data in order to produce a predicted tryptic peptide database specifically targeted to the wine grape cultivar, Vitis vinifera cv. Cabernet Sauvignon, and lacking truncated N- and C-terminal fragments. By searching iTRAQ MS/MS data generated from berry exocarp and mesocarp samples at ripening initiation, we determined that implementation of the custom database afforded a large improvement in high confidence peptide annotation in comparison to the MSDB. We used iTRAQ MS/MS in conjunction with custom peptide db searches to quantitatively characterize several important pathway components for berry ripening previously described at the transcriptional level and confirmed expression patterns for these at the protein level. Conclusion We determined that a predicted peptide database for MS/MS applications can be derived from EST data using advanced clustering and trimming approaches and successfully implemented for quantitative proteome profiling. Quantitative shotgun proteome profiling holds great promise for characterizing biological processes such as fruit ripening
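The core of a predicted tryptic peptide database is an in-silico trypsin digest: cleave after K or R unless the next residue is P, then keep peptides in a useful MS length range. A generic sketch (not the authors' EST clustering/trimming pipeline; the sequence is an arbitrary example):

```python
import re

def tryptic_peptides(protein: str, min_len: int = 6) -> list[str]:
    """In-silico trypsin digest: split after K/R not followed by P,
    keeping peptides long enough to be useful for MS/MS matching."""
    pieces = re.split(r"(?<=[KR])(?!P)", protein)
    return [p for p in pieces if len(p) >= min_len]

# Arbitrary example sequence (not from the grape EST data).
seq = "MKWVTFISLLFLFSSAYSRGVFRRDAHKSEVAHRFKDLGEENFK"
print(tryptic_peptides(seq))
# -> ['WVTFISLLFLFSSAYSR', 'SEVAHR', 'DLGEENFK']
```

A real pipeline would digest every predicted protein, drop the truncated N-/C-terminal fragments mentioned above, and index the surviving peptides for the search engine.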

  16. Generation of a predicted protein database from EST data and application to iTRAQ analyses in grape (Vitis vinifera cv. Cabernet Sauvignon) berries at ripening initiation

    Science.gov (United States)

    Lücker, Joost; Laszczak, Mario; Smith, Derek; Lund, Steven T

    2009-01-01

    Background iTRAQ is a proteomics technique that uses isobaric tags for relative and absolute quantitation of tryptic peptides. In proteomics experiments, the detection and high confidence annotation of proteins and the significance of corresponding expression differences can depend on the quality and the species specificity of the tryptic peptide map database used for analysis of the data. For species for which finished genome sequence data are not available, identification of proteins relies on similarity to proteins from other species using comprehensive peptide map databases such as the MSDB. Results We were interested in characterizing ripening initiation ('veraison') in grape berries at the protein level in order to better define the molecular control of this important process for grape growers and wine makers. We developed a bioinformatic pipeline for processing EST data in order to produce a predicted tryptic peptide database specifically targeted to the wine grape cultivar, Vitis vinifera cv. Cabernet Sauvignon, and lacking truncated N- and C-terminal fragments. By searching iTRAQ MS/MS data generated from berry exocarp and mesocarp samples at ripening initiation, we determined that implementation of the custom database afforded a large improvement in high confidence peptide annotation in comparison to the MSDB. We used iTRAQ MS/MS in conjunction with custom peptide db searches to quantitatively characterize several important pathway components for berry ripening previously described at the transcriptional level and confirmed expression patterns for these at the protein level. Conclusion We determined that a predicted peptide database for MS/MS applications can be derived from EST data using advanced clustering and trimming approaches and successfully implemented for quantitative proteome profiling. Quantitative shotgun proteome profiling holds great promise for characterizing biological processes such as fruit ripening initiation and may be further

  17. Structural and Biochemical Analyses of Swine Major Histocompatibility Complex Class I Complexes and Prediction of the Epitope Map of Important Influenza A Virus Strains.

    Science.gov (United States)

    Fan, Shuhua; Wu, Yanan; Wang, Song; Wang, Zhenbao; Jiang, Bo; Liu, Yanjie; Liang, Ruiying; Zhou, Wenzhong; Zhang, Nianzhi; Xia, Chun

    2016-08-01

    , the skewing of the α1 and α2 helixes is important in the different peptide conformations in SLA-3*hs0202. We also determined the fundamental motif for SLA-3*hs0202 to be X-(M/A/R)-(N/Q/R/F)-X-X-X-X-X-(V/I) based on a series of structural and biochemical analyses, and 28 SLA-3*hs0202-restricted epitope candidates were identified from important IAV strains. We believe our structure and analyses of pSLA-3*hs0202 can benefit vaccine development to control IAV in swine. Copyright © 2016, American Society for Microbiology. All Rights Reserved.

  18. Effects of pharmacists' interventions on appropriateness of prescribing and evaluation of the instruments' (MAI, STOPP and START) ability to predict hospitalization: analyses from a randomized controlled trial.

    Directory of Open Access Journals (Sweden)

    Ulrika Gillespie

    Full Text Available Appropriateness of prescribing can be assessed by various measures and screening instruments. The aims of this study were to investigate the effects of pharmacists' interventions on appropriateness of prescribing in elderly patients, and to explore the relationship between these results and hospital care utilization during a 12-month follow-up period.The study population from a previous randomized controlled study, in which the effects of a comprehensive pharmacist intervention on re-hospitalization was investigated, was used. The criteria from the instruments MAI, STOPP and START were applied retrospectively to the 368 study patients (intervention group (I n = 182, control group (C n = 186. The assessments were done on admission and at discharge to detect differences over time and between the groups. Hospital care consumption was recorded and the association between scores for appropriateness, and hospitalization was analysed.The number of Potentially Inappropriate Medicines (PIMs per patient as identified by STOPP was reduced for I but not for C (1.42 to 0.93 vs. 1.46 to 1.66 respectively, p<0.01. The number of Potential Prescription Omissions (PPOs per patient as identified by START was reduced for I but not for C (0.36 to 0.09 vs. 0.42 to 0.45 respectively, p<0.001. The summated score for MAI was reduced for I but not for C (8.5 to 5.0 and 8.7 to 10.0 respectively, p<0.001. There was a positive association between scores for MAI and STOPP and drug-related readmissions (RR 8-9% and 30-34% respectively. No association was detected between the scores of the tools and total re-visits to hospital.The interventions significantly improved the appropriateness of prescribing for patients in the intervention group as evaluated by the instruments MAI, STOPP and START. High scores in MAI and STOPP were associated with a higher number of drug-related readmissions.

  19. A new tool for prediction and analysis of thermal comfort in steady and transient states; Un nouvel outil pour la prediction et l'analyse du confort thermique en regime permanent et variable

    Energy Technology Data Exchange (ETDEWEB)

    Megri, A.Ch. [Illinois Institute of Technology, Civil and Architectural Engineering Dept., Chicago, Illinois (United States); Megri, A.F. [Centre Universitaire de Tebessa, Dept. d' Electronique (Algeria); El Naqa, I. [Washington Univ., School of Medicine, Dept. of Radiation Oncology, Saint Louis, Missouri (United States); Achard, G. [Universite de Savoie, Lab. Optimisation de la Conception et Ingenierie de L' Environnement (LOCIE) - ESIGEC, 73 - Le Bourget du Lac (France)

    2006-02-15

    Thermal comfort is influenced by psychological as well as physiological factors. This paper proposes the use of support vector machine (SVM) learning for automated prediction of human thermal comfort in steady and transient states. The SVM is an artificial intelligence approach that can capture the input/output mapping from the given data. Support vector machines were developed based on the Structural Risk Minimization principle. Different sets of representative experimental environmental factors that affect a homogeneous person's thermal balance were used for training the SVM. The SVM is a very efficient, fast, and accurate technique for identifying thermal comfort. This technique permits the determination of thermal comfort indices for different sub-categories of people, such as the sick and the elderly, or in extreme climatic conditions, when experimental data for such a sub-category are available. The experimental data have been used for the learning and testing processes. The results show a good correlation between SVM-predicted values and those obtained from conventional thermal comfort models, such as the Fanger and Gagge models. The 'trained machine' with representative data can be used easily and effectively in comparison with other conventional methods of estimating the various indices. (author)
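The regression idea can be sketched with a bare-bones linear model trained on the epsilon-insensitive loss that support vector regression uses. Everything below is a simplified stand-in: the environmental inputs and the comfort response are simulated, not the paper's experimental data, and a real SVM solver would also support kernels and a dual formulation.

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated inputs: air temperature [degC], relative humidity [-],
# air velocity [m/s]; NOT the paper's experimental dataset.
X = rng.uniform([18.0, 0.2, 0.0], [32.0, 0.8, 1.0], size=(200, 3))
# Hypothetical comfort response: warmer -> higher, draughtier -> lower.
y = 0.3 * (X[:, 0] - 25.0) - 0.5 * X[:, 2] + 0.1 * rng.standard_normal(200)

# Standardize features so one learning rate suits all dimensions.
mu, sd = X.mean(axis=0), X.std(axis=0)
Xs = (X - mu) / sd

def fit_linear_svr(X, y, epsilon=0.05, lam=1e-3, lr=0.01, epochs=2000):
    """Linear model under the epsilon-insensitive (SVR) loss,
    trained by plain sub-gradient descent."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        r = X @ w + b - y                           # residuals
        # Sub-gradient of max(0, |r| - epsilon): zero inside the tube.
        g = np.where(r > epsilon, 1.0, np.where(r < -epsilon, -1.0, 0.0))
        w -= lr * (X.T @ g / n + lam * w)
        b -= lr * g.mean()
    return w, b

w, b = fit_linear_svr(Xs, y)
pred = Xs @ w + b
rmse = float(np.sqrt(np.mean((pred - y) ** 2)))
```

The fitted weights recover the simulated structure (positive on temperature, negative on air velocity), which is the kind of input/output mapping the paper trains its SVM to capture from real data.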

  20. Area under the curve predictions of dalbavancin, a new lipoglycopeptide agent, using the end of intravenous infusion concentration data point by regression analyses such as linear, log-linear and power models.

    Science.gov (United States)

    Bhamidipati, Ravi Kanth; Syed, Muzeeb; Mullangi, Ramesh; Srinivas, Nuggehally

    2018-02-01

    1. Dalbavancin, a lipoglycopeptide, is approved for treating gram-positive bacterial infections. The area under the plasma concentration versus time curve (AUCinf) of dalbavancin is a key parameter, and the AUCinf/MIC ratio is a critical pharmacodynamic marker. 2. Using the end of intravenous infusion concentration (i.e. Cmax), the Cmax versus AUCinf relationship for dalbavancin was established by regression analyses (i.e. linear, log-log, log-linear and power models) using 21 pairs of subject data. 3. Predictions of AUCinf were performed by applying the regression equations to published Cmax data. The quotient of observed/predicted values rendered the fold difference. The mean absolute error (MAE), root mean square error (RMSE) and correlation coefficient (r) were used in the assessment. 4. MAE and RMSE values for the various models were comparable. Cmax versus AUCinf exhibited excellent correlation (r > 0.9488). The internal data evaluation showed narrow confinement (0.84-1.14-fold difference). In summary, with the regression models, a single time point strategy of using Cmax (i.e. end of 30-min infusion) is amenable as a prospective tool for predicting the AUCinf of dalbavancin in patients.
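The single-point workflow, fitting a regression of AUCinf on Cmax and then judging predictions by fold difference, MAE, RMSE and r, can be sketched as follows. The concentration data are synthetic stand-ins for the 21 subject pairs; the published dalbavancin values are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic (Cmax, AUCinf) pairs standing in for the 21 subjects.
cmax = rng.uniform(200.0, 400.0, 21)                       # e.g. mg/L
auc = 60.0 * cmax ** 0.95 * rng.lognormal(0.0, 0.05, 21)   # e.g. mg*h/L

# Linear model: AUC = a + b * Cmax
b_lin, a_lin = np.polyfit(cmax, auc, 1)
pred_lin = a_lin + b_lin * cmax

# Power model: AUC = a * Cmax^b, fitted as a line in log-log space
b_pow, log_a = np.polyfit(np.log(cmax), np.log(auc), 1)
pred_pow = np.exp(log_a) * cmax ** b_pow

def mae(obs, pred):
    return float(np.mean(np.abs(obs - pred)))

def rmse(obs, pred):
    return float(np.sqrt(np.mean((obs - pred) ** 2)))

fold = auc / pred_pow                       # observed / predicted
r = float(np.corrcoef(cmax, auc)[0, 1])     # correlation coefficient
```

With the model fitted, a single published Cmax value can be plugged into the regression equation to yield a prospective AUCinf estimate, which is the paper's single-time-point strategy.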

  1. Network class superposition analyses.

    Directory of Open Access Journals (Sweden)

    Carl A B Pearson

    Full Text Available Networks are often used to understand a whole system by modeling the interactions among its pieces. Examples include biomolecules in a cell interacting to provide some primary function, or species in an environment forming a stable community. However, these interactions are often unknown; instead, the pieces' dynamic states are known, and network structure must be inferred. Because observed function may be explained by many different networks (e.g., ≈ 10^30 for the yeast cell cycle process), considering dynamics beyond this primary function means picking a single network or suitable sample: measuring over all networks exhibiting the primary function is computationally infeasible. We circumvent that obstacle by calculating the network class ensemble. We represent the ensemble by a stochastic matrix T, which is a transition-by-transition superposition of the system dynamics for each member of the class. We present concrete results for T derived from boolean time series dynamics on networks obeying the Strong Inhibition rule, by applying T to several traditional questions about network dynamics. We show that the distribution of the number of point attractors can be accurately estimated with T. We show how to generate Derrida plots based on T. We show that T-based Shannon entropy outperforms other methods at selecting experiments to further narrow the network structure. We also outline an experimental test of predictions based on T. We motivate all of these results in terms of a popular molecular biology boolean network model for the yeast cell cycle, but the methods and analyses we introduce are general. We conclude with open questions for T, for example, application to other models, computational considerations when scaling up to larger systems, and other potential analyses.
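The superposition construction can be illustrated on a deliberately tiny network class (not the paper's yeast cell cycle model, and not the Strong Inhibition rule): two boolean nodes, each of which either copies or negates the other. Averaging the deterministic transition matrices of all four class members gives a row-stochastic T, and its diagonal directly yields the expected number of point attractors over the class.

```python
import numpy as np
from itertools import product

# Tiny illustrative class: x1' = f1(x2), x2' = f2(x1),
# with f1, f2 each either identity or NOT -> 4 networks, 4 states.
STATES = list(product([0, 1], repeat=2))   # (0,0), (0,1), (1,0), (1,1)
FUNCS = [lambda v: v, lambda v: 1 - v]     # identity, NOT

T = np.zeros((4, 4))
n_networks = 0
for f1, f2 in product(FUNCS, repeat=2):
    n_networks += 1
    for i, (x1, x2) in enumerate(STATES):
        nxt = (f1(x2), f2(x1))             # synchronous update
        T[i, STATES.index(nxt)] += 1.0
T /= n_networks                            # superposed, row-stochastic

# T[s, s] is the fraction of class members for which state s is a point
# attractor, so the trace is the expected attractor count over the class.
expected_fixed_points = float(np.trace(T))
```

For this class the trace equals 1.0: the copy-copy network fixes 00 and 11, the negate-negate network fixes 01 and 10, and the two mixed networks have no fixed points, averaging to one point attractor per network.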

  2. Uncertainty and Sensitivity Analyses Plan

    International Nuclear Information System (INIS)

    Simpson, J.C.; Ramsdell, J.V. Jr.

    1993-04-01

    Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project
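A generic Monte Carlo uncertainty and sensitivity workflow of the kind such a plan formalizes might look like the sketch below. The dose model and the input distributions are invented placeholders, not HEDR models or parameters; the sensitivity measure is a simple rank correlation between each uncertain input and the model output.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000
# Placeholder uncertain inputs (NOT HEDR parameters or distributions).
release    = rng.lognormal(mean=2.0, sigma=0.5, size=n)   # source term
dispersion = rng.lognormal(mean=0.0, sigma=0.3, size=n)   # transport factor
distance   = rng.uniform(1.0, 10.0, size=n)               # receptor distance

# Invented toy dose model, just to propagate the uncertainties.
dose = release * dispersion / distance

def rank_corr(x, y):
    """Spearman rank correlation via Pearson correlation of ranks."""
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    return float(np.corrcoef(rx, ry)[0, 1])

# Sensitivity: how strongly each input's rank tracks the output's rank.
sens = {name: abs(rank_corr(v, dose))
        for name, v in [("release", release),
                        ("dispersion", dispersion),
                        ("distance", distance)]}

# Uncertainty: an interval on the predicted dose.
p5, p95 = np.percentile(dose, [5, 95])
```

Ranking the inputs by `sens` identifies which unknown parameters dominate the prediction uncertainty, which is the practical payoff of a hierarchical sensitivity analysis plan.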

  3. Periodic safety analyses

    International Nuclear Information System (INIS)

    Gouffon, A.; Zermizoglou, R.

    1990-12-01

    The IAEA Safety Guide 50-SG-S8, devoted to 'Safety Aspects of Foundations of Nuclear Power Plants', indicates that the operator of an NPP should establish a programme of inspections of safe operation during construction, start-up and the service life of the plant, to obtain the data needed for estimating the lifetime of structures and components. At the same time, the programme should ensure that the safety margins remain appropriate. Periodic safety analyses are an important part of the safety inspection programme. Periodic safety review is a method for testing the whole safety system, or a part of it, against precise criteria. Periodic safety analyses are not meant for the qualification of plant components. Separate analyses are devoted to start-up, qualification of components and materials, and aging. All these analyses are described in this presentation. The last chapter describes the experience obtained for the PWR-900 and PWR-1300 units from 1986-1989

  4. Laser Beam Focus Analyser

    DEFF Research Database (Denmark)

    Nielsen, Peter Carøe; Hansen, Hans Nørgaard; Olsen, Flemming Ove

    2007-01-01

    the obtainable features in direct laser machining as well as heat affected zones in welding processes. This paper describes the development of a measuring unit capable of analysing beam shape and diameter of lasers to be used in manufacturing processes. The analyser is based on the principle of a rotating...... mechanical wire being swept through the laser beam at varying Z-heights. The reflected signal is analysed and the resulting beam profile determined. The development comprised the design of a flexible fixture capable of providing both rotation and Z-axis movement, control software including data capture...... and finally data analysis based on the ISO approach. The device was calibrated and tested on commercially available laser systems. It showed good reproducibility. It was the target to be able to measure CW lasers with a power up to 200 W, focused down to spot diameters in the range of 10µm. In order...

  5. IMI - Oral biopharmaceutics tools project - Evaluation of bottom-up PBPK prediction success part 3: Identifying gaps in system parameters by analysing In Silico performance across different compound classes.

    Science.gov (United States)

    Darwich, Adam S; Margolskee, Alison; Pepin, Xavier; Aarons, Leon; Galetin, Aleksandra; Rostami-Hodjegan, Amin; Carlert, Sara; Hammarberg, Maria; Hilgendorf, Constanze; Johansson, Pernilla; Karlsson, Eva; Murphy, Dónal; Tannergren, Christer; Thörn, Helena; Yasin, Mohammed; Mazuir, Florent; Nicolas, Olivier; Ramusovic, Sergej; Xu, Christine; Pathak, Shriram M; Korjamo, Timo; Laru, Johanna; Malkki, Jussi; Pappinen, Sari; Tuunainen, Johanna; Dressman, Jennifer; Hansmann, Simone; Kostewicz, Edmund; He, Handan; Heimbach, Tycho; Wu, Fan; Hoft, Carolin; Pang, Yan; Bolger, Michael B; Huehn, Eva; Lukacova, Viera; Mullin, James M; Szeto, Ke X; Costales, Chester; Lin, Jian; McAllister, Mark; Modi, Sweta; Rotter, Charles; Varma, Manthena; Wong, Mei; Mitra, Amitava; Bevernage, Jan; Biewenga, Jeike; Van Peer, Achiel; Lloyd, Richard; Shardlow, Carole; Langguth, Peter; Mishenzon, Irina; Nguyen, Mai Anh; Brown, Jonathan; Lennernäs, Hans; Abrahamsson, Bertil

    2017-01-01

    Three Physiologically Based Pharmacokinetic software packages (GI-Sim, Simcyp® Simulator, and GastroPlus™) were evaluated as part of the Innovative Medicine Initiative Oral Biopharmaceutics Tools project (OrBiTo) during a blinded "bottom-up" anticipation of human pharmacokinetics. After data analysis of the predicted vs. measured pharmacokinetics parameters, it was found that oral bioavailability (F oral ) was underpredicted for compounds with low permeability, suggesting improper estimates of intestinal surface area, colonic absorption and/or lack of intestinal transporter information. F oral was also underpredicted for acidic compounds, suggesting overestimation of impact of ionisation on permeation, lack of information on intestinal transporters, or underestimation of solubilisation of weak acids due to less than optimal intestinal model pH settings or underestimation of bile micelle contribution. F oral was overpredicted for weak bases, suggesting inadequate models for precipitation or lack of in vitro precipitation information to build informed models. Relative bioavailability was underpredicted for both high logP compounds as well as poorly water-soluble compounds, suggesting inadequate models for solubility/dissolution, underperforming bile enhancement models and/or lack of biorelevant solubility measurements. These results indicate areas for improvement in model software, modelling approaches, and generation of applicable input data. However, caution is required when interpreting the impact of drug-specific properties in this exercise, as the availability of input parameters was heterogeneous and highly variable, and the modellers generally used the data "as is" in this blinded bottom-up prediction approach. Copyright © 2016 Elsevier B.V. All rights reserved.

  6. Contesting Citizenship: Comparative Analyses

    DEFF Research Database (Denmark)

    Siim, Birte; Squires, Judith

    2007-01-01

    importance of particularized experiences and multiple inequality agendas). These developments shape the way citizenship is both practiced and analysed. Mapping neat citizenship models onto distinct nation-states and evaluating these in relation to formal equality is no longer an adequate approach....... Comparative citizenship analyses need to be considered in relation to multiple inequalities and their intersections, and to multiple governance and trans-national organising. This, in turn, suggests that comparative citizenship analysis needs to consider new spaces in which struggles for equal citizenship occur...

  7. Dialogisk kommunikationsteoretisk analyse

    DEFF Research Database (Denmark)

    Phillips, Louise Jane

    2018-01-01

    analysis method that has been developed within dialogic communication research - The Integrated Framework for Analysing Dialogic Knowledge Production and Communication (IFADIA). The IFADIA method builds on a combination of Bakhtin's dialogue theory and Foucault's theory of power/knowledge and discourse. The method is intended...

  8. Probabilistic safety analyses (PSA)

    International Nuclear Information System (INIS)

    1997-01-01

    The guide shows how probabilistic safety analyses (PSA) are used in the design, construction and operation of light water reactor plants to help ensure that the safety of the plant is adequate in all plant operational states

  9. Meta-analyses

    NARCIS (Netherlands)

    Hendriks, Maria A.; Luyten, Johannes W.; Scheerens, Jaap; Sleegers, P.J.C.; Scheerens, J

    2014-01-01

    In this chapter results of a research synthesis and quantitative meta-analyses of three facets of time effects in education are presented, namely time at school during regular lesson hours, homework, and extended learning time. The number of studies for these three facets of time that could be used

  10. Analysing Access Control Specifications

    DEFF Research Database (Denmark)

    Probst, Christian W.; Hansen, René Rydhof

    2009-01-01

    Recent events have revealed intimate knowledge of surveillance and control systems on the side of the attacker, making it often impossible to deduce the identity of an inside attacker from logged data. In this work we present an approach that analyses the access control configuration to identify the set...

  11. Wavelet Analyses and Applications

    Science.gov (United States)

    Bordeianu, Cristian C.; Landau, Rubin H.; Paez, Manuel J.

    2009-01-01

    It is shown how a modern extension of Fourier analysis known as wavelet analysis is applied to signals containing multiscale information. First, a continuous wavelet transform is used to analyse the spectrum of a nonstationary signal (one whose form changes in time). The spectral analysis of such a signal gives the strength of the signal in each…

  12. Filmstil - teori og analyse

    DEFF Research Database (Denmark)

    Hansen, Lennard Højbjerg

    Film style decisively shapes our experience of a film. Yet film style, the way the moving images organise the narrative, receives rather less attention than the film's plot when we talk about film. Filmstil - teori og analyse is a richly exemplified presentation, critique and further development of...

  13. Risico-analyse brandstofpontons

    NARCIS (Netherlands)

    Uijt de Haag P; Post J; LSO

    2001-01-01

    To determine the risks posed by fuel pontoons in a marina, a generic risk analysis was carried out. A reference system was defined, consisting of a concrete fuel pontoon with a relatively large capacity and throughput. It was assumed that the pontoon is located in a

  14. Hydrophysical conditions and periphyton in natural rivers. Analysis and predictive modelling of periphyton by changed regulations; Hydrofysiske forhold og begroing i naturlige elver. Analyse og prediktiv modellering av begroing ved reguleringsendringer

    Energy Technology Data Exchange (ETDEWEB)

    Stokseth, S.

    1994-10-01

    The objective of this thesis has been to examine the interaction between hydrodynamic and physical factors and the temporal and spatial dynamics of periphyton in natural steep rivers. The study strategy has been to work with quantitative system variables in order to evaluate the potential usability of a predictive model for periphyton changes as a response to river regulations. The thesis comprises a theoretical and an empirical study. The theoretical study aims at presenting a conceptual model of the relevant factors based on an analysis of published studies. Effort has been made to evaluate and present the background material in a structured way. To handle the spatial and temporal dynamics of periphyton concurrently, a new method for data collection has been developed, together with a procedure for quantifying the photo registrations. The simple hydrodynamic parameters were estimated from a set of standard formulas, whereas the complex parameters were estimated from a three-dimensional simulation model called SSIIM. The main conclusion from the analysis is that flood events are the major controlling factor with respect to periphyton biomass, and that water temperature is of major importance for periphyton resistance. Low temperature clearly increases the erosion resistance of periphyton. Thus, to model or control the temporal dynamics of river periphyton, the water temperature and the frequency and size of floods should be regarded as the most significant controlling factors. The data in this study have been collected from a river with a stable water quality and frequent floods. 109 refs., 41 figs., 34 tabs.

  15. Biomass feedstock analyses

    Energy Technology Data Exchange (ETDEWEB)

    Wilen, C.; Moilanen, A.; Kurkela, E. [VTT Energy, Espoo (Finland). Energy Production Technologies

    1996-12-31

    The overall objectives of the project 'Feasibility of electricity production from biomass by pressurized gasification systems' within the EC Research Programme JOULE II were to evaluate the potential of advanced power production systems based on biomass gasification and to study the technical and economic feasibility of these new processes with different types of biomass feedstock. This report was prepared as part of this R and D project. The objectives of this task were to perform fuel analyses of potential woody and herbaceous biomasses with specific regard to the gasification properties of the selected feedstocks. The analyses of 15 Scandinavian and European biomass feedstocks included density, proximate and ultimate analyses, trace compounds, ash composition and fusion behaviour in oxidizing and reducing atmospheres. The wood-derived fuels, such as whole-tree chips, forest residues, bark and to some extent willow, can be expected to have good gasification properties. The difficulties caused by ash fusion and sintering in straw combustion and gasification are generally known. The ash and alkali metal contents of the European biomasses harvested in Italy resembled those of the Nordic straws, and it is expected that they behave to a great extent like straw in gasification. No direct relation between the ash fusion behaviour (determined according to the standard method) and, for instance, the alkali metal content was found in the laboratory determinations. A more profound characterisation of the fuels would require gasification experiments in a thermobalance and a PDU (Process Development Unit) rig. (orig.) (10 refs.)

  16. Possible future HERA analyses

    International Nuclear Information System (INIS)

    Geiser, Achim

    2015-12-01

    A variety of possible future analyses of HERA data in the context of the HERA data preservation programme is collected, motivated, and commented. The focus is placed on possible future analyses of the existing ep collider data and their physics scope. Comparisons to the original scope of the HERA programme are made, and cross references to topics also covered by other participants of the workshop are given. This includes topics on QCD, proton structure, diffraction, jets, hadronic final states, heavy flavours, electroweak physics, and the application of related theory and phenomenology topics like NNLO QCD calculations, low-x related models, nonperturbative QCD aspects, and electroweak radiative corrections. Synergies with other collider programmes are also addressed. In summary, the range of physics topics which can still be uniquely covered using the existing data is very broad and of considerable physics interest, often matching the interest of results from colliders currently in operation. Due to well-established data and MC sets, calibrations, and analysis procedures the manpower and expertise needed for a particular analysis is often very much smaller than that needed for an ongoing experiment. Since centrally funded manpower to carry out such analyses is not available any longer, this contribution not only targets experienced self-funded experimentalists, but also theorists and master-level students who might wish to carry out such an analysis.

  17. Some Tools for Robustifying Econometric Analyses

    NARCIS (Netherlands)

    V. Hoornweg (Victor)

    2013-01-01

    We use automated algorithms to update and evaluate ad hoc judgments that are made in applied econometrics. Such an application of automated algorithms robustifies empirical econometric analyses, it achieves lower and more consistent prediction errors, and it helps to

  18. AMS analyses at ANSTO

    International Nuclear Information System (INIS)

    Lawson, E.M.

    1998-01-01

    The major use of ANTARES is Accelerator Mass Spectrometry (AMS), with 14C being the most commonly analysed radioisotope - presently about 35% of the available beam time on ANTARES is used for 14C measurements. The accelerator measurements are supported by, and dependent on, a strong sample preparation section. The ANTARES AMS facility supports a wide range of investigations into fields such as global climate change, ice cores, oceanography, dendrochronology, anthropology, and classical and Australian archaeology. Described here are some examples of the ways in which AMS has been applied to support research into the archaeology, prehistory and culture of this continent's indigenous Aboriginal peoples. (author)

  19. Uncertainty Analyses and Strategy

    International Nuclear Information System (INIS)

    Kevin Coppersmith

    2001-01-01

    The DOE identified a variety of uncertainties, arising from different sources, during its assessment of the performance of a potential geologic repository at the Yucca Mountain site. In general, the number and detail of process models developed for the Yucca Mountain site, and the complex coupling among those models, make the direct incorporation of all uncertainties difficult. The DOE has addressed these issues in a number of ways using an approach to uncertainties that is focused on producing a defensible evaluation of the performance of a potential repository. The treatment of uncertainties oriented toward defensible assessments has led to analyses and models with so-called 'conservative' assumptions and parameter bounds, where conservative implies lower performance than might be demonstrated with a more realistic representation. The varying maturity of the analyses and models, and uneven level of data availability, result in total system level analyses with a mix of realistic and conservative estimates (for both probabilistic representations and single values). That is, some inputs have realistically represented uncertainties, and others are conservatively estimated or bounded. However, this approach is consistent with the 'reasonable assurance' approach to compliance demonstration, which was called for in the U.S. Nuclear Regulatory Commission's (NRC) proposed 10 CFR Part 63 regulation (64 FR 8640 [DIRS 101680]). A risk analysis that includes conservatism in the inputs will result in conservative risk estimates. Therefore, the approach taken for the Total System Performance Assessment for the Site Recommendation (TSPA-SR) provides a reasonable representation of processes and conservatism for purposes of site recommendation. However, mixing unknown degrees of conservatism in models and parameter representations reduces the transparency of the analysis and makes the development of coherent and consistent probability statements about projected repository

  20. Micromechanical Analyses of Sturzstroms

    Science.gov (United States)

    Imre, Bernd; Laue, Jan; Springman, Sarah M.

    2010-05-01

    Sturzstroms are very fast landslides of very large initial volume. As type features they display extreme run out, paired with intensive fragmentation of the involved blocks of rock within a collisional flow. The inherent danger to the growing communities in alpine valleys below future potential sites of sturzstroms must be examined, and the resulting predictions of endangered zones can then inform the planning processes in these areas. This calls for the ability to make Type A predictions, according to Lambe (1973), which are made before an event. But Type A predictions are only possible if sufficient understanding of the mechanisms involved in a process is available. The motivation of the doctoral thesis research project presented here is therefore to reveal the mechanics of sturzstroms in more detail, in order to contribute to the development of a Type A run out prediction model. A sturzstrom evidently represents a highly dynamic collisional granular regime: particles do not only collide but will eventually crush each other. Erismann and Abele (2001) describe this process as dynamic disintegration, where kinetic energy is the main driver for fragmenting the rock mass. In this case, an approach combining the type features of long run out and fragmentation within a single hypothesis is represented by the dynamic fragmentation-spreading model (Davies and McSaveney, 2009; McSaveney and Davies, 2009). Unfortunately, sturzstroms, and fragmentation within sturzstroms, cannot be observed directly in a real event because of their long recurrence times and the obvious difficulties in placing measuring devices within such a rock flow. Therefore, rigorous modelling is required, in particular of the transition from static to dynamic behaviour, to achieve better knowledge of the mechanics of sturzstroms and to provide empirical evidence to confirm the dynamic fragmentation-spreading model. Within this study, fragmentation and its effects on the mobility of sturzstroms

  1. EEG analyses with SOBI.

    Energy Technology Data Exchange (ETDEWEB)

    Glickman, Matthew R.; Tang, Akaysha (University of New Mexico, Albuquerque, NM)

    2009-02-01

    The motivating vision behind Sandia's MENTOR/PAL LDRD project has been that of systems which use real-time psychophysiological data to support and enhance human performance, both individually and of groups. Relevant and significant psychophysiological data being a necessary prerequisite to such systems, this LDRD has focused on identifying and refining such signals. The project has focused in particular on EEG (electroencephalogram) data as a promising candidate signal because it (potentially) provides a broad window on brain activity with relatively low cost and logistical constraints. We report here on two analyses performed on EEG data collected in this project using the SOBI (Second Order Blind Identification) algorithm to identify two independent sources of brain activity: one in the frontal lobe and one in the occipital. The first study looks at directional influences between the two components, while the second study looks at inferring gender based upon the frontal component.
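The second-order idea behind SOBI can be conveyed with its simpler single-lag relative (often called AMUSE): sources with distinct temporal autocorrelations can be separated by whitening the observations and then eigen-decomposing one time-lagged covariance; full SOBI jointly diagonalizes many lags. The two synthetic signals below stand in for brain sources; they are not EEG data from the project.

```python
import numpy as np

rng = np.random.default_rng(7)
t = np.arange(2000) / 200.0                    # 10 s at 200 Hz
s1 = np.sin(2 * np.pi * 10 * t)                # alpha-band-like sinusoid
s2 = np.sign(np.sin(2 * np.pi * 3 * t))        # slow square-wave source
S = np.vstack([s1, s2])
A = np.array([[0.8, 0.3], [0.4, 0.9]])         # unknown mixing matrix
X = A @ S + 0.01 * rng.standard_normal(S.shape)

# 1) Whiten the observations (zero mean, identity covariance).
X = X - X.mean(axis=1, keepdims=True)
C0 = X @ X.T / X.shape[1]
d, E = np.linalg.eigh(C0)
W = E @ np.diag(d ** -0.5) @ E.T
Z = W @ X

# 2) Eigen-decompose one symmetrized time-lagged covariance; the lag is
#    chosen so the two sources' autocorrelations differ strongly.
tau = 10
C_tau = Z[:, :-tau] @ Z[:, tau:].T / (Z.shape[1] - tau)
C_tau = (C_tau + C_tau.T) / 2
_, U = np.linalg.eigh(C_tau)
Y = U.T @ Z              # recovered sources (up to order, sign, scale)

# Each recovered component should correlate strongly with one true source.
corr = np.abs(np.corrcoef(np.vstack([Y, S]))[:2, 2:])
```

This is the mechanism that lets SOBI pull spatially fixed, temporally structured components (such as the frontal and occipital sources mentioned above) out of multichannel recordings.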

  2. A simple beam analyser

    International Nuclear Information System (INIS)

    Lemarchand, G.

    1977-01-01

    (ee'p) experiments make it possible to measure the missing energy distribution, as well as the momentum distribution of the extracted proton in the nucleus versus the missing energy. Such experiments are presently conducted on SACLAY's A.L.S. 300 Linac. Electrons and protons are analysed by two spectrometers and detected in their respective focal planes. Counting rates are usually low and include both true time coincidences and accidentals. Since the signal-to-noise ratio depends on the physics of the experiment and the resolution of the coincidence, it is mandatory to obtain a beam current distribution that is as flat as possible. New technologies have made it possible to monitor the behaviour of the beam pulse in real time and to determine, on a numerical basis, when the duty cycle can be considered good

  3. Analyse af elbilers forbrug

    DEFF Research Database (Denmark)

    Andersen, Ove; Krogh, Benjamin Bjerre; Torp, Kristian

    2014-01-01

    This report examines the GPS and CAN bus data collected while driving electric cars and analyses the energy consumption of electric cars. The analyses are based on some 133 million GPS and CAN bus measurements collected from 164 electric cars (Citroen C-Zero, Mitsubishi iMiev and Peugeot Ion) during the calendar year 2012.... Regarding the data foundation, it can be concluded that substantial but simple tightening is needed to make it easier to use GPS/CAN bus data from electric cars in other analyses in the future. The use of electric cars is compared with that of fossil-fuel cars, and the conclusion is that electric cars generally drive 10-15 km/h slower on...

  4. Analyse de "La banlieue"

    Directory of Open Access Journals (Sweden)

    Nelly Morais

    2006-11-01

    Full Text Available 1. Preamble - conditions under which the present analysis was carried out. A group of first-year master's students in French as a Foreign Language (FLE) at Université Paris 3 (that is, students of language didactics preparing to teach FLE) examined the product during a module on ICT (Information and Communication Technologies) and language didactics. A discussion then developed on the forum of a distance-learning platform, starting from a few questions posed by the teach...

  5. Seismic fragility analyses

    International Nuclear Information System (INIS)

    Kostov, Marin

    2000-01-01

    In the last two decades an increasing number of probabilistic seismic risk assessments have been performed. The basic ideas of the procedure for performing a Probabilistic Safety Analysis (PSA) of critical structures (NUREG/CR-2300, 1983) can also be used for ordinary industrial and residential buildings, dams or other structures. The general formulation of the risk assessment procedure applied in this investigation is presented in Franzini, et al., 1984. The probability of failure of a structure over an expected lifetime (for example 50 years) can be obtained from the annual frequency of failure, β_E, determined by the relation: β_E = ∫ [dβ(x)/dx] P(f|x) dx. Here β(x) is the annual frequency of exceedance of load level x (for example, the variable x may be peak ground acceleration), and P(f|x) is the conditional probability of structure failure at a given seismic load level x. The problem leads to the assessment of the seismic hazard β(x) and the fragility P(f|x). The seismic hazard curves are obtained by probabilistic seismic hazard analysis. The fragility curves are obtained after the response of the structure is defined probabilistically and its capacity and the associated uncertainties are assessed. Finally, the fragility curves are combined with the seismic loading to estimate the frequency of failure for each critical scenario. The frequency of failure due to a seismic event is represented by the scenario with the highest frequency. The tools usually applied for probabilistic safety analyses of critical structures can relatively easily be adapted to ordinary structures. The key problems are the seismic hazard definition and the fragility analyses. The fragility can be derived either by scaling procedures or by generation. Both approaches are presented in the paper. After the seismic risk (in terms of failure probability) is assessed, there are several approaches to risk reduction. Generally the methods can be classified in two groups. The
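The risk integral β_E = ∫ |dβ(x)/dx| P(f|x) dx can be evaluated numerically once a hazard curve β(x) and a fragility curve P(f|x) are assumed. Both curves below (a power-law hazard and a lognormal fragility) are common functional forms chosen for illustration; the parameter values are invented and not site-specific.

```python
import numpy as np
from math import erf, sqrt, log

def hazard(x):
    """Assumed annual frequency of exceeding PGA x [g] (power-law form)."""
    return 1e-4 * (x / 0.1) ** -2.0

def fragility(x, median=0.6, beta_u=0.4):
    """Assumed lognormal fragility: P(failure | PGA = x)."""
    return 0.5 * (1.0 + erf(log(x / median) / (beta_u * sqrt(2.0))))

x = np.linspace(0.05, 3.0, 2000)               # PGA grid [g]
dbeta_dx = np.gradient(hazard(x), x)           # hazard slope (negative)
p_f = np.array([fragility(v) for v in x])

# Trapezoidal integration of |dbeta/dx| * P(f|x) over the load range.
integrand = np.abs(dbeta_dx) * p_f
beta_E = float(np.sum((integrand[1:] + integrand[:-1]) / 2 * np.diff(x)))
```

The result is an annual failure frequency, necessarily smaller than the hazard frequency at the lowest load level considered, since failure requires both the shaking and the structural capacity being exceeded.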

  6. Severe Accident Recriticality Analyses (SARA)

    Energy Technology Data Exchange (ETDEWEB)

    Frid, W. [Swedish Nuclear Power Inspectorate, Stockholm (Sweden); Hoejerup, F. [Risoe National Lab. (Denmark); Lindholm, I.; Miettinen, J.; Puska, E.K. [VTT Energy, Helsinki (Finland); Nilsson, Lars [Studsvik Eco and Safety AB, Nykoeping (Sweden); Sjoevall, H. [Teoliisuuden Voima Oy (Finland)

    1999-11-01

    Recriticality in a BWR has been studied for a total loss of electric power accident scenario. In a BWR, the B{sub 4}C control rods would melt and relocate from the core before the fuel during core uncovery and heat-up. If electric power returns during this time window, unborated water from the ECCS systems will start to reflood the partly control-rod-free core. Recriticality might then take place, for which the only mitigating mechanisms are the Doppler effect and void formation. In order to assess the impact of recriticality on reactor safety, including accident management measures, the following issues have been investigated in the SARA project: 1. the energy deposition in the fuel during a super-prompt power burst, 2. the quasi steady-state reactor power following the initial power burst, and 3. the containment response to elevated quasi steady-state reactor power. The approach was to use three computer codes and to further develop and adapt them for the task. The codes were SIMULATE-3K, APROS and RECRIT. Recriticality analyses were carried out for a number of selected reflooding transients for the Oskarshamn 3 plant in Sweden with SIMULATE-3K and for the Olkiluoto 1 plant in Finland with all three codes. The initial and boundary conditions of the core state prior to recriticality have been studied with the severe accident codes SCDAP/RELAP5, MELCOR and MAAP4. The results of the analyses show that all three codes predict recriticality - both super-prompt power bursts and quasi steady-state power generation - for the studied range of parameters, i.e. with core uncovery and heat-up to maximum core temperatures around 1800 K and water flow rates of 45 kg/s to 2000 kg/s injected into the downcomer. Since the recriticality takes place in a small fraction of the core, the power densities are high, which results in large energy deposition in the fuel during the power burst in some accident scenarios. The highest value, 418 cal/g, was obtained with SIMULATE-3K for an Oskarshamn 3 case with reflooding

  7. Website-analyse = Website analysis

    DEFF Research Database (Denmark)

    Thorlacius, Lisbeth

    2009-01-01

    The website is increasingly the preferred medium for information retrieval, company presentation, e-commerce, entertainment, teaching and social contact. In step with this growing diversity of communication activities on the web, more focus has been placed on optimizing the design and planning of the functional and content-related aspects of websites. There is a large body of theory and method books specializing in the technical issues of interaction and navigation, as well as the linguistic content of websites. The Danish HCI (Human Computer Interaction... ...or dead ends when he/she visits the site. Studies in the design and analysis of the visual and aesthetic aspects of the planning and use of websites have, however, only to a limited extent received reflective treatment. This is the background for this chapter, which opens with a review of aesthetic...

  8. A channel profile analyser

    International Nuclear Information System (INIS)

    Gobbur, S.G.

    1983-01-01

    It is well understood that, due to the wide-band noise present in a nuclear analog-to-digital converter, events at the boundaries of adjacent channels are shared. It is a difficult and laborious process to find the exact shape of the channels at the boundaries. A simple scheme has been developed for the direct display of the channel shape of any type of ADC on a cathode ray oscilloscope. This has been accomplished by sequentially incrementing the reference voltage of a precision pulse generator by a fraction of a channel and storing the ADC data in alternate memory locations of a multichannel pulse height analyser. Alternate channels are needed because of the sharing at the boundaries of channels. In the flat region of the profile, alternate memory locations are channels with zero counts and channels with the full-scale counts. At the boundaries, all memory locations will have counts; their shape is a direct display of the channel boundaries. (orig.)
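
    The scheme lends itself to a quick simulation. The sketch below is hypothetical (an ideal ADC with Gaussian input noise, values chosen for illustration) and shows how sweeping the pulse-generator reference voltage in fractional-channel steps traces out a channel boundary:

```python
import random

random.seed(42)

CHANNEL_WIDTH = 1.0            # one ADC channel, arbitrary voltage units
NOISE_SIGMA = 0.05             # wide-band noise at the ADC input (assumed)
STEPS_PER_CHANNEL = 10         # reference voltage increment: 1/10 channel
EVENTS_PER_STEP = 1000

def adc(v):
    """Ideal ADC: the channel number for input voltage v."""
    return int(v // CHANNEL_WIDTH)

# Sweep the reference voltage across the boundary between channels 4 and 5
# (at v = 5.0) and record the fraction of events landing in the upper
# channel -- this fraction as a function of v traces the channel edge.
profile = []
for i in range(STEPS_PER_CHANNEL + 1):
    v_ref = 4.5 + i / STEPS_PER_CHANNEL
    hits_upper = sum(adc(v_ref + random.gauss(0.0, NOISE_SIGMA)) >= 5
                     for _ in range(EVENTS_PER_STEP))
    profile.append((v_ref, hits_upper / EVENTS_PER_STEP))

for v, frac in profile:
    print(f"{v:4.2f}  {'#' * int(40 * frac)}")
```

    Far from the boundary all events land in one channel; near it the sharing fraction rises smoothly from 0 to 1, which is exactly the edge shape the analyser displays.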

  9. Comparative analyses of genetic risk prediction methods reveal ...

    Indian Academy of Sciences (India)

    2015-03-12

    Nonalcoholic fatty liver disease (NAFLD) is a distinct pathologic condition characterized by a disease spectrum ranging from simple steatosis to steatohepatitis, cirrhosis and hepatocellular carcinoma. Prevalence of NAFLD varies in different ethnic groups, ranging from 12% in Chinese to 45% in ...

  10. Comparison of analyses to predict ruminal fibre degradability and ...

    African Journals Online (AJOL)

    The objective of this study was to compare the ruminal degradability of neutral detergent fibre (NDF) and indigestible NDF (INDF) between silages (n = 24) that originated from three different temperate grass species, i.e. Dactylis glomerata L., Festuca arundinacea L. and hybrid, Felina – Lolium multiflorum L. × Festuca ...

  11. Analysis of K-net and Kik-net data: implications for ground motion prediction - acceleration time histories, response spectra and nonlinear site response; Analyse des donnees accelerometriques de K-net et Kik-net: implications pour la prediction du mouvement sismique - accelerogrammes et spectres de reponse - et la prise en compte des effets de site non-lineaire

    Energy Technology Data Exchange (ETDEWEB)

    Pousse, G

    2005-10-15

    This thesis aims to characterize ground motion during earthquakes. The work is based on two Japanese networks and deals with databases of shallow events (depth less than 25 km) with magnitudes between 4.0 and 7.3. The analysis of K-net data allows us to compute a spectral ground-motion prediction equation and to review the shape of the Eurocode 8 design spectra. We show the larger amplification at short periods for Japanese data and bring to light the soil amplification that takes place at long periods. In addition, we develop a new empirical model for simulating synthetic stochastic non-stationary acceleration time histories. By specifying magnitude, distance and site effect, this model can produce many of the time histories that a seismic event is liable to produce at the place of interest. Furthermore, the study of near-field borehole records from Kik-net allows us to explore the validity domain of predictive equations and to explain what occurs when ground motion predictions are extrapolated. Finally, we show that nonlinearity reduces the dispersion of ground motion at the surface. (author)

  12. Semen analysis and prediction of natural conception

    NARCIS (Netherlands)

    Leushuis, Esther; van der Steeg, Jan Willem; Steures, Pieternel; Repping, Sjoerd; Bossuyt, Patrick M. M.; Mol, Ben Willem J.; Hompes, Peter G. A.; van der Veen, Fulco

    2014-01-01

    Do two semen analyses predict natural conception better than a single semen analysis and will adding the results of repeated semen analyses to a prediction model for natural pregnancy improve predictions? A second semen analysis does not add helpful information for predicting natural conception

  13. ENSO Prediction and Predictability

    Science.gov (United States)

    Cane, M. A.

    2016-12-01

    In 1986 there was one dynamical forecasting model for ENSO and a small handful of statistical and hybrid schemes. Now there are more than 40 models in the IRI-CPC forecast plume, many of them coupled GCMs. Why hasn't forecasting improved more in 30 years? In a landmark 1982 paper, Rasmussen and Carpenter created the canonical El Niño, transforming the inchoate view of the time. Now much research focuses on ENSO diversity. Is our understanding deeper as a result? Has this work properly incorporated the ENSO life cycle of the canonical event? Some eras are more predictable than others. Why? Is it random, or is there a systematic difference in background state? Do we have any idea how predictable ENSO is? Why is the ENSO band 2-7 years? What will become of ENSO in the next century?

  14. WALS Prediction

    NARCIS (Netherlands)

    Magnus, J.R.; Wang, W.; Zhang, Xinyu

    2012-01-01

    Abstract: Prediction under model uncertainty is an important and difficult issue. Traditional prediction methods (such as pretesting) are based on model selection followed by prediction in the selected model, but the reported prediction and the reported prediction variance ignore the uncertainty

  15. Predictive Testing

    Science.gov (United States)

    Predictive genetic testing searches for genetic changes, or ...

  16. Multiple Imputation for Network Analyses

    NARCIS (Netherlands)

    Krause, Robert; Huisman, Mark; Steglich, Christian; Snijders, Thomas

    2016-01-01

    Missing data on network ties is a fundamental problem for network analyses. The biases induced by missing edge data, even when missing completely at random (MCAR), are widely acknowledged and problematic for network analyses (Kossinets, 2006; Huisman & Steglich, 2008; Huisman, 2009). Although

  17. Analyses of cavitation instabilities in ductile metals

    DEFF Research Database (Denmark)

    Tvergaard, Viggo

    2007-01-01

    Cavitation instabilities have been predicted for a single void in a ductile metal stressed under high triaxiality conditions. In experiments for a ceramic reinforced by metal particles a single dominant void has been observed on the fracture surface of some of the metal particles bridging a crack, and also tests for a thin ductile metal layer bonding two ceramic blocks have indicated rapid void growth. Analyses for these material configurations are discussed here. When the void radius is very small, a nonlocal plasticity model is needed to account for observed size-effects, and recent analyses for the influence of such size-effects on cavitation instabilities are presented. When a metal contains a distribution of micro voids, and the void spacing compared to void size is not extremely large, the surrounding voids may affect the occurrence of a cavitation instability at one of the voids. This has been...

  18. Analyses and characterization of double shell tank

    International Nuclear Information System (INIS)

    1994-01-01

    Evaporator candidate feed from tank 241-AP-108 (108-AP) was sampled under prescribed protocol. Physical, inorganic, and radiochemical analyses were performed on tank 108-AP. Characterization of evaporator feed tank waste is needed primarily for an evaluation of its suitability to be safely processed through the evaporator. Such analyses should provide sufficient information regarding the waste composition to confidently determine whether constituent concentrations are not only within safe operating limits but also within the functional limits for operation of the evaporator. Characterization of tank constituent concentrations should provide data which enable a prediction of where the types and amounts of environmentally hazardous waste are likely to occur in the evaporator product streams

  19. Climate prediction and predictability

    Science.gov (United States)

    Allen, Myles

    2010-05-01

    Climate prediction is generally accepted to be one of the grand challenges of the Geophysical Sciences. What is less widely acknowledged is that fundamental issues have yet to be resolved concerning the nature of the challenge, even after decades of research in this area. How do we verify or falsify a probabilistic forecast of a singular event such as anthropogenic warming over the 21st century? How do we determine the information content of a climate forecast? What does it mean for a modelling system to be "good enough" to forecast a particular variable? How will we know when models and forecasting systems are "good enough" to provide detailed forecasts of weather at specific locations or, for example, of the risks associated with global geo-engineering schemes? This talk will provide an overview of these questions in the light of recent developments in multi-decade climate forecasting, drawing on concepts from information theory, machine learning and statistics. I will draw extensively but not exclusively on the experience of the climateprediction.net project, running multiple versions of climate models on personal computers.

  20. [Prediction in cerebrovascular diseases].

    Science.gov (United States)

    Hamann, G F

    2014-10-01

    Prediction of the outcome of cerebrovascular diseases or of the effects and complications of various forms of treatment are essential components of all stroke treatment regimens. This review focuses on the prediction of the stroke risk in primary prevention, the prediction of the risk of secondary stroke following a transient ischemic attack (TIA), the estimation of the outcome following manifest stroke and the treatment effects, the prediction of secondary cerebrovascular events and the prediction of vascular cognitive impairment following stroke. All predictive activities in cerebrovascular disease are hindered by the translation of predictive results from studies and patient populations to the individual patient. Future efforts in genetic analyses may be able to overcome this barrier and to enable individual prediction in the area of so-called personalized medicine. In all the various fields of prediction in cerebrovascular diseases, three major variables are always important: age of the patient, severity and subtype of the stroke. Increasing age, more severe stroke symptoms and the cardioembolic stroke subtype predict a poor outcome regarding both survival and permanent disability. This finding is somewhat banal and will therefore never replace the well experienced clinician judging the chances of a patient and taking into account the personal situation of this patient, e.g. for initiation of a rehabilitation program. Besides the individualized prediction, in times of restricted economic resources and increasing tendency to clarify questions of medical treatment in court, it seems unavoidable to use prediction in economic and medicolegal interaction with clinical medicine. This tendency will be accompanied by difficult ethical problems which neurologists must be aware of. Improved prediction should not be used to allocate or restrict resources or to restrict medically indicated treatment.

  1. CADDIS Volume 4. Data Analysis: Advanced Analyses - Controlling for Natural Variability: SSD Plot Diagrams

    Science.gov (United States)

    Methods for controlling natural variability; predicting environmental conditions from biological observations; biological trait data; species sensitivity distributions; propensity scores; references for the Advanced Analyses section of Data Analysis.

  2. Data for decay Heat Predictions

    International Nuclear Information System (INIS)

    1987-01-01

    These proceedings of a specialists' meeting on data for decay heat predictions cover fission product yields, delayed neutrons, and comparative evaluations of evaluated and experimental data for thermal and fast fission. Fourteen contributions were analysed

  3. Análise da estabilidade e previsibilidade da qualidade fisiológica de sementes de soja produzidas em Cristalina, Goiás = Stability and predictability analyses of the physiological quality of soybean seeds produced in Cristalina, Goiás (Brazil)

    Directory of Open Access Journals (Sweden)

    Éder Matsuo

    2008-04-01

    emergence speed and stability analyses were tested through the methods proposed by Lin and Binns (1988) and Annicchiarico (1992). The germination percentage averages, the emergence of plants and the emergence speed index were compared through Tukey's test at 5% probability. In the evaluation of the seeds' physiological quality, genotype 7B1454170 was identified as the best, and genotype 9B1459189 as the worst. The genotypes Emgopa 313, 7B1454170, 11B145341 and DM339 were classified as offering high stability in physiological quality, and genotypes 3B1346193 and 9B1459189 offered low predictability. The estimation methods used were efficient, coherent with one another, and allowed the identification, among the evaluated genotypes, of those that offered greater stability and predictability.
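
    For readers unfamiliar with the cited stability method: a common formulation of the Lin and Binns (1988) superiority index is P_i = Σ_j (X_ij − M_j)² / (2n), where M_j is the best response observed in environment j and n is the number of environments. A minimal sketch with made-up genotype-by-environment data (the genotype names and values below are hypothetical, not from the study):

```python
# Lin & Binns (1988) superiority index: P_i = sum_j (X_ij - M_j)^2 / (2n),
# where M_j is the best (maximum) response observed in environment j.
# Genotype x environment table (made-up germination percentages).
data = {
    "G1": [92, 88, 95],
    "G2": [80, 75, 70],
    "G3": [90, 91, 94],
}

n_env = 3
best = [max(vals[j] for vals in data.values()) for j in range(n_env)]

superiority = {
    g: sum((x - m) ** 2 for x, m in zip(vals, best)) / (2 * n_env)
    for g, vals in data.items()
}

# A lower P_i means the genotype tracks the best performer more closely,
# i.e. higher stability/predictability in the Lin & Binns sense.
for g, p in sorted(superiority.items(), key=lambda kv: kv[1]):
    print(g, round(p, 2))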

  4. An extensible analysable system model

    DEFF Research Database (Denmark)

    Probst, Christian W.; Hansen, Rene Rydhof

    2008-01-01

    Analysing real-world systems for vulnerabilities with respect to security and safety threats is a difficult undertaking, not least due to a lack of availability of formalisations for those systems. While both formalisations and analyses can be found for artificial systems such as software, this does not hold for real physical systems. Approaches such as threat modelling try to target the formalisation of the real-world domain, but are still far from the rigid techniques available in security research. Many currently available approaches to assurance of critical infrastructure security are based on (quite successful) ad-hoc techniques. We believe they can be significantly improved beyond the state of the art by pairing them with static analysis techniques. In this paper we present an approach to both formalising those real-world systems and providing an underlying semantics, which...

  5. Sample preparation in foodomic analyses.

    Science.gov (United States)

    Martinović, Tamara; Šrajer Gajdošik, Martina; Josić, Djuro

    2018-04-16

    Representative sampling and adequate sample preparation are key factors for successful performance of the subsequent steps in foodomic analyses, as well as for correct data interpretation. Incorrect sampling and improper sample preparation can be sources of severe bias in foodomic analyses, and it is well known that errors introduced at these stages cannot be corrected later. These facts, frequently neglected in the past, are now taken into consideration, and the progress in sampling and sample preparation in foodomics is reviewed here. We report the use of highly sophisticated instruments for both high-performance and high-throughput analyses, as well as miniaturization and the use of laboratory robotics in metabolomics, proteomics, peptidomics and genomics. This article is protected by copyright. All rights reserved.

  6. Mitogenomic analyses from ancient DNA

    DEFF Research Database (Denmark)

    Paijmans, Johanna L. A.; Gilbert, Tom; Hofreiter, Michael

    2013-01-01

    The analysis of ancient DNA is playing an increasingly important role in conservation genetic, phylogenetic and population genetic analyses, as it allows incorporating extinct species into DNA sequence trees and adds time depth to population genetics studies. For many years, these types of DNA analyses (whether using modern or ancient DNA) were largely restricted to the analysis of short fragments of the mitochondrial genome. However, due to many technological advances during the past decade, a growing number of studies have explored the power of complete mitochondrial genome sequences (mitogenomes). Such studies were initially limited to analyses of extant organisms, but developments in both DNA sequencing technologies and general methodological aspects related to working with degraded DNA have resulted in complete mitogenomes becoming increasingly popular for ancient DNA studies as well...

  7. Analysing and Comparing Encodability Criteria

    Directory of Open Access Journals (Sweden)

    Kirstin Peters

    2015-08-01

    Encodings, or proofs of their absence, are the main way to compare process calculi. To analyse the quality of encodings and to rule out trivial or meaningless encodings, they are augmented with quality criteria. There exist many different criteria, and different variants of those criteria, for reasoning in different settings. This leads to incomparable results. Moreover, it is not always clear whether the criteria used to obtain a result in a particular setting do indeed fit that setting. We show how to formally reason about and compare encodability criteria by mapping them onto requirements on a relation between source and target terms that is induced by the encoding function. In particular we analyse the common criteria full abstraction, operational correspondence, divergence reflection, success sensitiveness, and respect of barbs; e.g. we analyse the exact nature of the simulation relation (coupled simulation versus bisimulation) that is induced by different variants of operational correspondence. In this way we reduce the problem of analysing or comparing encodability criteria to the better understood problem of comparing relations on processes.

  8. Descriptive Analyses of Mechanical Systems

    DEFF Research Database (Denmark)

    Andreasen, Mogens Myrup; Hansen, Claus Thorp

    2003-01-01

    Foreword: Product analysis and technology analysis can be carried out with a broad socio-technical aim, in order to understand cultural, sociological, design-related, business-related and many other aspects. One sub-area of this is the systemic analysis and description of products and systems. The present compendium...

  9. Millifluidic droplet analyser for microbiology

    NARCIS (Netherlands)

    Baraban, L.; Bertholle, F.; Salverda, M.L.M.; Bremond, N.; Panizza, P.; Baudry, J.; Visser, de J.A.G.M.; Bibette, J.

    2011-01-01

    We present a novel millifluidic droplet analyser (MDA) for precisely monitoring the dynamics of microbial populations over multiple generations in numerous (~10{sup 3}) aqueous emulsion droplets (100 nL). As a first application, we measure the growth rate of a bacterial strain and determine the minimal
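
    Estimating a growth rate from per-droplet measurements reduces to fitting the exponential model N(t) = N₀·e^(rt), i.e. a linear regression of log counts against time. A toy sketch with made-up counts (not data from the MDA paper):

```python
from math import log

# Hypothetical per-droplet cell counts over time; exponential growth
# N(t) = N0 * exp(r*t) is linear in log space: log N = log N0 + r*t.
times = [0, 1, 2, 3, 4]                     # hours
counts = [1e3, 2.1e3, 4.0e3, 8.5e3, 1.6e4]  # cells per droplet (made up)

logs = [log(c) for c in counts]
n = len(times)
mean_t = sum(times) / n
mean_y = sum(logs) / n

# Ordinary least-squares slope = growth rate r
r = sum((t - mean_t) * (y - mean_y) for t, y in zip(times, logs)) / \
    sum((t - mean_t) ** 2 for t in times)

print(f"growth rate r = {r:.3f} per hour, doubling time = {log(2) / r:.2f} h")
```

    With counts that double roughly every hour, the fitted rate comes out near ln(2) ≈ 0.69 per hour; repeating the fit per droplet gives the population-level growth-rate distribution.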

  10. Workload analyse of assembling process

    Science.gov (United States)

    Ghenghea, L. D.

    2015-11-01

    Workload is the most important indicator for managers responsible for industrial technological processes, no matter whether these are automated, mechanized or simply manual; in each case, machines or workers will be the focus of workload measurements. The paper deals with workload analyses of a largely manual assembly technology for a roller-bearing assembly process, carried out in a big company with integrated bearing manufacturing processes. In these analyses the delay sampling technique has been used to identify and categorize all bearing assemblers' activities, and to obtain information about the share of the 480-minute working day that workers devote to each activity. The study shows some ways to increase process productivity without additional investment, and also indicates that process automation could be the way to reach maximum productivity.
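
    The delay (work) sampling technique described above amounts to estimating, from observations taken at random instants, the proportion of the working day spent on each activity. A small sketch with hypothetical activity categories and counts (not the paper's data):

```python
from math import sqrt

# Random-instant observations of one assembler over several days
# (hypothetical activity categories and counts).
observations = {"assembling": 310, "handling": 95, "waiting": 50, "other": 25}
total = sum(observations.values())
DAY_MINUTES = 480

results = {}
for activity, count in observations.items():
    p = count / total
    # 95% confidence half-width for a proportion estimate
    half_width = 1.96 * sqrt(p * (1 - p) / total)
    results[activity] = (p, half_width, p * DAY_MINUTES)

for activity, (p, hw, minutes) in results.items():
    print(f"{activity:11s} {p:6.1%} +/- {hw:5.1%}  ~{minutes:5.1f} min/day")
```

    The half-width formula also answers the planning question in reverse: the rarer the activity and the tighter the desired confidence interval, the more random observations are needed.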

  11. Safety analyses of surface facilities

    International Nuclear Information System (INIS)

    Anspach, W.; Baran, A.; Dorst, H.J.; Eifert, B.; Gruen, M.; Behrendt, V.; Berkhan, W.; Dincklage, R.D. v.; Doehler, J.; Bruecher, H.

    1981-01-01

    The investigations were carried out using the example of the Gorleben waste disposal center and the planning documents established for this center. The safety analyses refer to the transport of spent fuel elements, the water-cooled interim storage and the reprocessing stage. Regarding the risk analysis of the technical systems, the dynamics of incident sequences can be better taken into account through methodological development. (DG) [de

  12. Technical center for transportation analyses

    International Nuclear Information System (INIS)

    Foley, J.T.

    1978-01-01

    A description is presented of an information search/retrieval/research activity of Sandia Laboratories which provides technical environmental information which may be used in transportation risk analyses, environmental impact statements, development of design and test criteria for packaging of energy materials, and transportation mode research studies. General activities described are: (1) history of center development; (2) environmental information storage/retrieval system; (3) information searches; (4) data needs identification; and (5) field data acquisition system and applications

  13. Methodology of cost benefit analyses

    International Nuclear Information System (INIS)

    Patrik, M.; Babic, P.

    2000-10-01

    The report addresses financial aspects of proposed investments and other steps which are intended to contribute to nuclear safety. The aim is to provide introductory insight into the procedures and potential of cost-benefit analyses as a routine guide when making decisions on costly provisions as one of the tools to assess whether a particular provision is reasonable. The topic is applied to the nuclear power sector. (P.A.)

  14. Response surface use in safety analyses

    International Nuclear Information System (INIS)

    Prosek, A.

    1999-01-01

    When thousands of complex computer code runs related to nuclear safety are needed for statistical analysis, a response surface is used to replace the computer code. The main purpose of the study was to develop and demonstrate a tool called the optimal statistical estimator (OSE), intended for response surface generation of complex and non-linear phenomena. The performance of the optimal statistical estimator was tested against the results of 59 different RELAP5/MOD3.2 code calculations of a small-break loss-of-coolant accident in a two-loop pressurized water reactor. The results showed that the OSE adequately predicted the response surface for the peak cladding temperature. Some good characteristics of the OSE, such as monotonic behaviour between two neighbouring points and independence of the number of output parameters, suggest that the OSE can be used for response surface generation of any safety or system parameter in thermal-hydraulic safety analyses. (author)
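
    The OSE algorithm itself is not given in the record, but the general idea of replacing an expensive code with a response surface can be sketched with an ordinary least-squares quadratic surrogate. Everything below is illustrative: the analytic "code" stand-in, the two input parameters and all numbers are assumptions, not RELAP5 results:

```python
import numpy as np

# Pretend "code runs": peak cladding temperature (K) as a function of two
# normalized input parameters; an analytic stand-in for the expensive code.
rng = np.random.default_rng(0)
x1 = rng.uniform(0.0, 1.0, 59)          # e.g. break size (normalized)
x2 = rng.uniform(0.0, 1.0, 59)          # e.g. ECCS flow (normalized)
y = 900 + 250 * x1 - 180 * x2 + 120 * x1 * x2 + rng.normal(0, 5, 59)

# Quadratic response surface:
# y ~ c0 + c1*x1 + c2*x2 + c3*x1^2 + c4*x2^2 + c5*x1*x2
A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def surface(u, v):
    """Cheap surrogate standing in for the full code."""
    return coef @ np.array([1.0, u, v, u * u, v * v, u * v])

print(surface(0.5, 0.5))   # surrogate prediction at a new input point
```

    Once fitted from a few dozen code runs, the surrogate can be evaluated millions of times for statistical analysis at negligible cost, which is the role the OSE plays in the study.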

  15. Accurate renormalization group analyses in neutrino sector

    Energy Technology Data Exchange (ETDEWEB)

    Haba, Naoyuki [Graduate School of Science and Engineering, Shimane University, Matsue 690-8504 (Japan); Kaneta, Kunio [Kavli IPMU (WPI), The University of Tokyo, Kashiwa, Chiba 277-8568 (Japan); Takahashi, Ryo [Graduate School of Science and Engineering, Shimane University, Matsue 690-8504 (Japan); Yamaguchi, Yuya [Department of Physics, Faculty of Science, Hokkaido University, Sapporo 060-0810 (Japan)

    2014-08-15

    We investigate accurate renormalization group analyses in the neutrino sector between the ν-oscillation and seesaw energy scales. We consider the decoupling effects of the top quark and Higgs boson on the renormalization group equations of the light neutrino mass matrix. Since the decoupling effects arise at the standard model scale and are independent of high-energy physics, our method can in principle be applied to any model beyond the standard model. We find that the decoupling effects of the Higgs boson are negligible, while those of the top quark are not. In particular, the decoupling effects of the top quark affect the neutrino mass eigenvalues, which are important for analyzing predictions such as mass squared differences and neutrinoless double beta decay in an underlying theory existing at a high energy scale.

  16. Comprehensive immunoproteogenomic analyses of malignant pleural mesothelioma.

    Science.gov (United States)

    Lee, Hyun-Sung; Jang, Hee-Jin; Choi, Jong Min; Zhang, Jun; de Rosen, Veronica Lenge; Wheeler, Thomas M; Lee, Ju-Seog; Tu, Thuydung; Jindra, Peter T; Kerman, Ronald H; Jung, Sung Yun; Kheradmand, Farrah; Sugarbaker, David J; Burt, Bryan M

    2018-04-05

    We generated a comprehensive atlas of the immunologic cellular networks within human malignant pleural mesothelioma (MPM) using mass cytometry. Data-driven analyses of these high-resolution single-cell data identified 2 distinct immunologic subtypes of MPM with vastly different cellular composition, activation states, and immunologic function; mass spectrometry demonstrated differential abundance of MHC-I and -II neopeptides directly identified between these subtypes. The clinical relevance of this immunologic subtyping was investigated with a discriminatory molecular signature derived through comparison of the proteomes and transcriptomes of these 2 immunologic MPM subtypes. This molecular signature, representative of a favorable intratumoral cell network, was independently associated with improved survival in MPM and predicted response to immune checkpoint inhibitors in patients with MPM and melanoma. These data additionally suggest a potentially novel mechanism of response to checkpoint blockade: requirement for high measured abundance of neopeptides in the presence of high expression of MHC proteins specific for these neopeptides.

  17. Uncertainty in Operational Atmospheric Analyses and Re-Analyses

    Science.gov (United States)

    Langland, R.; Maue, R. N.

    2016-12-01

    This talk will describe uncertainty in atmospheric analyses of wind and temperature produced by operational forecast models and in re-analysis products. Because the "true" atmospheric state cannot be precisely quantified, there is necessarily error in every atmospheric analysis, and this error can be estimated by computing differences (variance and bias) between analysis products produced at various centers (e.g., ECMWF, NCEP, U.S. Navy, etc.) that use independent data assimilation procedures, somewhat different sets of atmospheric observations, and forecast models with different resolutions, dynamical equations, and physical parameterizations. These estimates of analysis uncertainty provide a useful proxy for actual analysis error. For this study, we use a unique multi-year and multi-model data archive developed at NRL-Monterey. It will be shown that current uncertainty in atmospheric analyses is closely correlated with the geographic distribution of assimilated in-situ atmospheric observations, especially those provided by high-accuracy radiosonde and commercial aircraft observations. The lowest atmospheric analysis uncertainty is found over North America, Europe and Eastern Asia, which have the largest numbers of radiosonde and commercial aircraft observations. Analysis uncertainty is substantially larger (by factors of two to three) in most of the Southern Hemisphere, the North Pacific Ocean, and under-developed nations of Africa and South America, where there are few radiosonde or commercial aircraft data. It appears that in regions where atmospheric analyses depend primarily on satellite radiance observations, analysis uncertainty of both temperature and wind remains relatively high compared to values found over North America and Europe.
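
    The variance-and-bias comparison between centres reduces to simple gridpoint statistics on the difference field. A toy sketch with two synthetic temperature analyses standing in for products from two different centres (all values hypothetical):

```python
import numpy as np

# Two hypothetical 500 hPa temperature analyses (K) on the same lat/lon
# grid, standing in for products from two independent centres.
rng = np.random.default_rng(1)
truth = 250 + 10 * rng.standard_normal((90, 180))
analysis_a = truth + rng.normal(0.0, 0.8, truth.shape)         # small random error
analysis_b = truth + 0.3 + rng.normal(0.0, 1.2, truth.shape)   # biased + noisier

diff = analysis_a - analysis_b
bias = diff.mean()              # systematic offset between the two products
variance = diff.var(ddof=1)     # spread of the inter-analysis disagreement

# Inter-analysis spread is a proxy for analysis uncertainty: where
# independent systems disagree more, the true state is less constrained.
print(f"bias = {bias:+.2f} K, std = {variance ** 0.5:.2f} K")
```

    Mapping the same statistics per gridpoint, instead of globally, reproduces the geographic pattern discussed above: low spread where in-situ observations are dense, high spread where analyses rely mainly on satellite radiances.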

  18. Providing traceability for neuroimaging analyses.

    Science.gov (United States)

    McClatchey, Richard; Branson, Andrew; Anjum, Ashiq; Bloodsworth, Peter; Habib, Irfan; Munir, Kamran; Shamdasani, Jetendr; Soomro, Kamran

    2013-09-01

    With the increasingly digital nature of biomedical data and as the complexity of analyses in medical research increases, the need for accurate information capture, traceability and accessibility has become crucial to medical researchers in the pursuit of their research goals. Grid- or Cloud-based technologies, often based on so-called Service Oriented Architectures (SOA), are increasingly being seen as viable solutions for managing distributed data and algorithms in the bio-medical domain. For neuroscientific analyses, especially those centred on complex image analysis, traceability of processes and datasets is essential, but up to now this has not been captured in a manner that facilitates collaborative study. Few examples exist of deployed medical systems based on Grids that provide the traceability of research data needed to facilitate complex analyses, and none have been evaluated in practice. Over the past decade, we have been working with mammographers, paediatricians and neuroscientists in three generations of projects to provide the data management and provenance services now required for 21st century medical research. This paper outlines the findings of a requirements study and a resulting system architecture for the production of services to support neuroscientific studies of biomarkers for Alzheimer's disease. The paper proposes a software infrastructure and services that provide the foundation for such support. It introduces the use of the CRISTAL software to provide provenance management as one of a number of services delivered on an SOA, deployed to manage neuroimaging projects that have been studying biomarkers for Alzheimer's disease. In the neuGRID and N4U projects a Provenance Service has been delivered that captures and reconstructs the workflow information needed to facilitate researchers in conducting neuroimaging analyses. The software enables neuroscientists to track the evolution of workflows and datasets. It also tracks the outcomes of

  19. Earthquake prediction

    International Nuclear Information System (INIS)

    Ward, P.L.

    1978-01-01

    The state of the art of earthquake prediction is summarized, the possible responses to such prediction are examined, and some needs in the present prediction program and in research related to use of this new technology are reviewed. Three basic aspects of earthquake prediction are discussed: location of the areas where large earthquakes are most likely to occur, observation within these areas of measurable changes (earthquake precursors) and determination of the area and time over which the earthquake will occur, and development of models of the earthquake source in order to interpret the precursors reliably. 6 figures

  20. Comparison of elastic and inelastic analyses

    International Nuclear Information System (INIS)

    Ammerman, D.J.; Heinstein, M.W.; Wellman, G.W.

    1992-01-01

    The use of inelastic analysis methods instead of the traditional elastic analysis methods in the design of radioactive material (RAM) transport packagings leads to a better understanding of the response of the package to mechanical loadings. Thus, a better assessment of the containment, thermal protection, and shielding integrity of the package after a structural accident event can be made. A more accurate prediction of the package response can lead to enhanced safety and also allow for a more efficient use of materials, possibly leading to a package with higher capacity or lower weight. This paper discusses the advantages and disadvantages of using inelastic analysis in the design of RAM shipping packages. The use of inelastic analysis presents several problems to the package designer. When using inelastic analysis, the entire nonlinear response of the material must be known, including the effects of temperature changes and strain rate. Another problem is that there is currently no acceptance criterion for this type of analysis that is approved by regulatory agencies. Inelastic analysis acceptance criteria based on failure stress, failure strain, or plastic energy density could be developed. For both elastic and inelastic analyses it is also important to include other sources of stress in the analyses, such as fabrication stresses, thermal stresses, stresses from bolt preloading, and contact stresses at material interfaces. Offsetting these added difficulties is the improved knowledge of the package behavior. This allows for incorporation of a more uniform margin of safety, which can result in weight savings and a higher level of confidence in the post-accident configuration of the package. In this paper, comparisons between elastic and inelastic analyses are made for a simple ring structure and for a package to transport a large quantity of RAM by rail (rail cask) with lead gamma shielding, to illustrate the differences in the two analysis techniques.
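
    As a hedged illustration of one of the proposed acceptance quantities, the sketch below computes the plastic energy density for a simple bilinear elastic-plastic material under uniaxial strain. The material parameters are hypothetical and this one-dimensional model is far simpler than the constitutive laws used in actual package analyses.

```python
def bilinear_stress(strain, E, yield_stress, H):
    """Uniaxial stress for a bilinear elastic-plastic model.

    E: elastic modulus, H: hardening modulus (hypothetical values below).
    """
    yield_strain = yield_stress / E
    if strain <= yield_strain:
        return E * strain                                   # elastic branch
    return yield_stress + H * (strain - yield_strain)       # hardening branch

def plastic_energy_density(strain, E, yield_stress, H):
    """Dissipated (plastic) energy per unit volume: total area under the
    stress-strain curve minus the elastic energy recovered on unloading."""
    yield_strain = yield_stress / E
    if strain <= yield_strain:
        return 0.0
    stress = bilinear_stress(strain, E, yield_stress, H)
    total = (0.5 * yield_stress * yield_strain
             + 0.5 * (yield_stress + stress) * (strain - yield_strain))
    recoverable = 0.5 * stress ** 2 / E
    return total - recoverable

# Hypothetical steel-like parameters: E = 200 GPa, yield 250 MPa, H = 2 GPa
E, sy, H = 200e9, 250e6, 2e9
print(plastic_energy_density(0.01, E, sy, H))   # ≈ 2.24e6 J/m^3 dissipated
```

    An acceptance criterion of this kind would compare the computed plastic energy density against an allowable value established by testing.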

  1. Finite element analyses of wood laminated composite poles

    Science.gov (United States)

    Cheng Piao; Todd F. Shupe; R.C. Tang; Chung Y. Hse

    2005-01-01

    Finite element analyses using ANSYS were conducted on orthotropic, polygonal, wood laminated composite poles subjected to a body force and a concentrated load at the free end. Deflections and stress distributions of small-scale and full-size composite poles were analyzed and compared to the results obtained in an experimental study. The predicted deflection for both...
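
    Although the record's results come from ANSYS finite element models of orthotropic polygonal poles, the load case it describes (a concentrated load at the free end plus a body force) can be approximated for a prismatic cantilever with classical Euler-Bernoulli superposition. The sketch below uses hypothetical pole properties and a solid round section in place of the laminated polygonal one.

```python
import math

def tip_deflection(P, w, L, E, I):
    """Euler-Bernoulli tip deflection of a cantilever pole: concentrated
    load P at the free end plus distributed self-weight w per unit length."""
    return P * L**3 / (3 * E * I) + w * L**4 / (8 * E * I)

# Hypothetical pole: 0.2 m solid round section approximating the polygonal pole
d, L, E = 0.2, 6.0, 11e9              # diameter (m), length (m), modulus (Pa)
I = math.pi * d**4 / 64               # second moment of area of the section
rho, g = 550.0, 9.81                  # wood density (kg/m^3), gravity (m/s^2)
w = rho * g * math.pi * d**2 / 4      # self-weight per unit length (N/m)
print(tip_deflection(500.0, w, L, E, I))   # ≈ 0.073 m
```

    A finite element model is still needed for the orthotropic laminated case, but a closed-form estimate of this kind is a useful sanity check on predicted deflections.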

  2. Methodological challenges in carbohydrate analyses

    Directory of Open Access Journals (Sweden)

    Mary Beth Hall

    2007-07-01

    Full Text Available Carbohydrates can provide up to 80% of the dry matter in animal diets, yet their specific evaluation for research and diet formulation is only now becoming a focus in the animal sciences. Partitioning of dietary carbohydrates for nutritional purposes should reflect differences in digestion and fermentation characteristics and effects on animal performance. Key challenges to designating nutritionally important carbohydrate fractions include classifying the carbohydrates in terms of nutritional characteristics, and selecting analytical methods that describe the desired fraction. The relative lack of information on digestion characteristics of various carbohydrates and their interactions with other fractions in diets means that fractions will not soon be perfectly established. Developing a system of carbohydrate analysis that could be used across animal species could enhance the utility of analyses and the amount of data we can obtain on dietary effects of carbohydrates. Based on quantities present in diets and apparent effects on animal performance, some nutritionally important classes of carbohydrates that may be valuable to measure include sugars, starch, fructans, insoluble fiber, and soluble fiber. Essential to the selection of methods for these fractions is agreement on precisely what carbohydrates should be included in each. Each of these fractions has analyses that could potentially be used to measure them, but most of the available methods have weaknesses that must be evaluated to see if they are fatal and the assay is unusable, or if the assay may still be made workable. Factors we must consider as we seek to analyze carbohydrates to describe diets: Does the assay accurately measure the desired fraction? Is the assay for research, regulatory, or field use (this affects considerations of acceptable costs and throughput)? What are acceptable accuracy and variability of measures? Is the assay robust (this enhances the accuracy of values)? For some carbohydrates, we

  3. Predictive medicine

    NARCIS (Netherlands)

    Boenink, Marianne; ten Have, Henk

    2015-01-01

    In the last part of the twentieth century, predictive medicine has gained currency as an important ideal in biomedical research and health care. Research in the genetic and molecular basis of disease suggested that the insights gained might be used to develop tests that predict the future health

  4. Theorising and Analysing Academic Labour

    Directory of Open Access Journals (Sweden)

    Thomas Allmer

    2018-01-01

    Full Text Available The aim of this article is to contextualise universities historically within capitalism and to analyse academic labour and the deployment of digital media theoretically and critically. It argues that the post-war expansion of the university can be considered as medium and outcome of informational capitalism and as a dialectical development of social achievement and advanced commodification. The article strives to identify the class position of academic workers, introduces the distinction between academic work and labour, discusses the connection between academic, information and cultural work, and suggests a broad definition of university labour. It presents a theoretical model of working conditions that helps to systematically analyse the academic labour process and to provide an overview of working conditions at universities. The paper furthermore argues for the need to consider the development of education technologies as a dialectics of continuity and discontinuity, discusses the changing nature of the forces and relations of production, and the impact on the working conditions of academics in the digital university. Based on Erik Olin Wright’s inclusive approach of social transformation, the article concludes with the need to bring together anarchist, social democratic and revolutionary strategies for establishing a socialist university in a commons-based information society.

  5. Incorporating nonlinearity into mediation analyses.

    Science.gov (United States)

    Knafl, George J; Knafl, Kathleen A; Grey, Margaret; Dixon, Jane; Deatrick, Janet A; Gallo, Agatha M

    2017-03-21

    Mediation is an important issue considered in the behavioral, medical, and social sciences. It addresses situations where the effect of a predictor variable X on an outcome variable Y is explained to some extent by an intervening, mediator variable M. Methods for addressing mediation have been available for some time. While these methods continue to undergo refinement, the relationships underlying mediation are commonly treated as linear in the outcome Y, the predictor X, and the mediator M. These relationships, however, can be nonlinear. Methods are needed for assessing when mediation relationships can be treated as linear and for estimating them when they are nonlinear. Existing adaptive regression methods based on fractional polynomials are extended here to address nonlinearity in mediation relationships, but assuming those relationships are monotonic as would be consistent with theories about directionality of such relationships. Example monotonic mediation analyses are provided assessing linear and monotonic mediation of the effect of family functioning (X) on a child's adaptation (Y) to a chronic condition by the difficulty (M) for the family in managing the child's condition. Example moderated monotonic mediation and simulation analyses are also presented. Adaptive methods provide an effective way to incorporate possibly nonlinear monotonicity into mediation relationships.
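
    The adaptive fractional-polynomial method itself is beyond a short example, but the linear mediation decomposition the paper starts from can be sketched as a product-of-coefficients calculation on simulated data. The variable names and effect sizes below are illustrative only, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
X = rng.normal(size=n)                        # predictor, e.g. family functioning
M = 0.5 * X + rng.normal(size=n)              # mediator, e.g. management difficulty
Y = 0.3 * M + 0.2 * X + rng.normal(size=n)    # outcome, e.g. child adaptation

def ols_slopes(y, *xs):
    """Slope estimates from an OLS fit of y on the given regressors plus an intercept."""
    A = np.column_stack([np.ones(len(y))] + list(xs))
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef[1:]

a = ols_slopes(M, X)[0]             # path X -> M
b, c_prime = ols_slopes(Y, M, X)    # path M -> Y and the direct effect of X
indirect = a * b                    # mediated (indirect) effect of X on Y
total = ols_slopes(Y, X)[0]         # total effect; equals c_prime + a*b for OLS
print(indirect, c_prime, total)
```

    The nonlinear extension in the paper replaces these straight-line fits with monotonic fractional-polynomial fits; the total = direct + indirect decomposition then no longer holds exactly, which is part of what motivates the adaptive methods.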

  6. Incorporating nonlinearity into mediation analyses

    Directory of Open Access Journals (Sweden)

    George J. Knafl

    2017-03-01

    Full Text Available Abstract Background Mediation is an important issue considered in the behavioral, medical, and social sciences. It addresses situations where the effect of a predictor variable X on an outcome variable Y is explained to some extent by an intervening, mediator variable M. Methods for addressing mediation have been available for some time. While these methods continue to undergo refinement, the relationships underlying mediation are commonly treated as linear in the outcome Y, the predictor X, and the mediator M. These relationships, however, can be nonlinear. Methods are needed for assessing when mediation relationships can be treated as linear and for estimating them when they are nonlinear. Methods Existing adaptive regression methods based on fractional polynomials are extended here to address nonlinearity in mediation relationships, but assuming those relationships are monotonic as would be consistent with theories about directionality of such relationships. Results Example monotonic mediation analyses are provided assessing linear and monotonic mediation of the effect of family functioning (X) on a child's adaptation (Y) to a chronic condition by the difficulty (M) for the family in managing the child's condition. Example moderated monotonic mediation and simulation analyses are also presented. Conclusions Adaptive methods provide an effective way to incorporate possibly nonlinear monotonicity into mediation relationships.

  7. Fracturing and brittleness index analyses of shales

    Science.gov (United States)

    Barnhoorn, Auke; Primarini, Mutia; Houben, Maartje

    2016-04-01

    The formation of a fracture network in rocks has a crucial control on the flow behaviour of fluids. In addition, an existing network of fractures influences the propagation of new fractures during e.g. hydraulic fracturing or during a seismic event. Understanding the type and characteristics of the fracture network that will be formed during e.g. hydraulic fracturing is thus crucial to better predict the outcome of a hydraulic fracturing job. For this, knowledge of the rock properties is crucial. The brittleness index is often used as a rock property to predict the fracturing behaviour of a rock for e.g. hydraulic fracturing of shales. Various definitions of the brittleness index (BI1, BI2 and BI3) exist, based on mineralogy, elastic constants and stress-strain behaviour (Jin et al., 2014; Jarvie et al., 2007; Holt et al., 2011). A maximum brittleness index of 1 predicts very good and efficient fracturing behaviour, while a minimum brittleness index of 0 predicts much more ductile shale behaviour. Here, we have performed systematic petrophysical, acoustic and geomechanical analyses on a set of shale samples from Whitby (UK) and have determined the three different brittleness indices by performing all the analyses on each of the samples. We show that the three brittleness indices are very different for the same sample, and as such it can be concluded that the brittleness index is not a good predictor of the fracturing behaviour of shales. The brittleness indices based on the acoustic data (BI1) all lie around values of 0.5, the brittleness index based on the stress-strain data (BI2) gives an average of around 0.75, whereas the mineralogy brittleness index (BI3) predicts values below 0.2. This shows that different estimates of the brittleness index can lead to different decisions for hydraulic fracturing. If we were to rely on the mineralogy (BI3), the Whitby mudstone is not a suitable
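
    The exact index definitions are not given in the record, so the sketch below uses two commonly cited forms, a Jarvie-style mineralogical ratio and a Rickman-style elastic average, with hypothetical normalization ranges. It is meant only to show how different definitions yield different brittleness values for the same rock.

```python
def bi_mineralogy(quartz, carbonates, clays):
    """Jarvie-style mineralogical brittleness: brittle (quartz) fraction
    over the total of the listed minerals, all in weight fractions."""
    return quartz / (quartz + carbonates + clays)

def bi_elastic(E, nu, E_min=10.0, E_max=80.0, nu_min=0.15, nu_max=0.40):
    """Rickman-style elastic brittleness: average of Young's modulus (GPa)
    and Poisson's ratio, each normalized to assumed basin-wide ranges."""
    e_term = (E - E_min) / (E_max - E_min)
    nu_term = (nu_max - nu) / (nu_max - nu_min)
    return 0.5 * (e_term + nu_term)

# Hypothetical clay-rich, moderately stiff mudstone
print(bi_mineralogy(0.25, 0.10, 0.65))   # 0.25: ductile by mineralogy
print(bi_elastic(45.0, 0.25))            # 0.55: mid-range by elasticity
```

    Even on these made-up numbers the two definitions disagree by a factor of two, which is the qualitative point the study makes about BI1-BI3 on the Whitby samples.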

  8. Hydrogen Analyses in the EPR

    International Nuclear Information System (INIS)

    Worapittayaporn, S.; Eyink, J.; Movahed, M.

    2008-01-01

    In severe accidents with core melting large amounts of hydrogen may be released into the containment. The EPR provides a combustible gas control system to prevent hydrogen combustion modes with the potential to challenge the containment integrity due to excessive pressure and temperature loads. This paper outlines the approach for the verification of the effectiveness and efficiency of this system. Specifically, the justification is a multi-step approach. It involves the deployment of integral codes, lumped parameter containment codes and CFD codes and the use of the sigma criterion, which provides the link to the broad experimental data base for flame acceleration (FA) and deflagration to detonation transition (DDT). The procedure is illustrated with an example. The performed analyses show that hydrogen combustion at any time does not lead to pressure or temperature loads that threaten the containment integrity of the EPR. (authors)
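
    As a rough illustration of how the sigma criterion acts as a screen, the sketch below compares a mixture's expansion ratio against an assumed critical value. The threshold used here is illustrative only, not the value applied in EPR licensing analyses, where the critical sigma depends on mixture composition and temperature.

```python
def flame_acceleration_possible(expansion_ratio, sigma_critical=3.75):
    """Sigma-criterion screen used in hydrogen-combustion assessments:
    flame acceleration is considered credible only when the mixture's
    expansion ratio (unburnt/burnt density) exceeds a critical value.
    The 3.75 default is an assumed, illustrative threshold."""
    return expansion_ratio >= sigma_critical

print(flame_acceleration_possible(4.2))   # True: FA cannot be excluded
print(flame_acceleration_possible(2.9))   # False: slow deflagration regime
```

    In practice such screens are evaluated per containment compartment and per time step from lumped-parameter or CFD code output, which is the multi-step approach the abstract describes.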

  9. Analysing performance through value creation

    Directory of Open Access Journals (Sweden)

    Adrian TRIFAN

    2015-12-01

    Full Text Available This paper draws a parallel between measuring financial performance in two variants: the first using data offered by accounting, which lays emphasis on maximizing profit, and the second which aims to create value. The traditional approach to performance is based on indicators drawn from accounting data: ROI, ROE, EPS. Traditional management, based on analysing accounting data, has shown its limits, and a new approach is needed, based on creating value. The evaluation of value-based performance tries to avoid the errors due to accounting data by using other specific indicators: EVA, MVA, TSR, CVA. The main objective shifts from maximizing income to maximizing the value created for shareholders. The theoretical part is accompanied by a practical analysis regarding the creation of value and an analysis of the main indicators which evaluate this concept.
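
    Of the four value-based indicators listed, EVA has the simplest closed form: operating profit after tax minus a charge for the capital employed. The figures below are hypothetical.

```python
def eva(nopat, wacc, invested_capital):
    """Economic Value Added: net operating profit after tax minus the
    cost-of-capital charge on invested capital (all figures hypothetical)."""
    return nopat - wacc * invested_capital

# A firm earning 120 on 1000 of invested capital at a 9% cost of capital
print(eva(120.0, 0.09, 1000.0))   # ≈ 30: value created above the capital charge
```

    A positive EVA signals value creation for shareholders; a firm can be profitable in accounting terms yet destroy value if NOPAT falls below the capital charge, which is precisely the distinction the paper draws.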

  10. HGCal Simulation Analyses for CMS

    CERN Document Server

    Bruno, Sarah Marie

    2015-01-01

    This summer, I approached the topic of fast-timing detection of photons from Higgs decays via simulation analyses, working under the supervision of Dr. Adolf Bornheim of the California Institute of Technology. My specific project focused on simulating the high granularity calorimeter for the Compact Muon Solenoid (CMS) experiment. CMS detects particles using calorimeters. The Electromagnetic Calorimeter (ECal) is arranged cylindrically to form a barrel section and two “endcaps.” Previously, both the barrel and endcap have employed lead tungstate crystal detectors, known as the “shashlik” design. The crystal detectors, however, rapidly degrade from exposure to radiation. This effect is most pronounced in the endcaps. To avoid the high expense of frequently replacing degraded detectors, it was recently decided to eliminate the endcap crystals in favor of an arrangement of silicon detectors known as the “High Granularity Calorimeter” (HGCal), while leaving the barrel detector technology unchanged. T...

  11. Severe accident recriticality analyses (SARA)

    DEFF Research Database (Denmark)

    Frid, W.; Højerup, C.F.; Lindholm, I.

    2001-01-01

    Recriticality in a BWR during reflooding of an overheated partly degraded core, i.e. with relocated control rods, has been studied for a total loss of electric power accident scenario. In order to assess the impact of recriticality on reactor safety, including accident management strategies......, the following issues have been investigated in the SARA project: (1) the energy deposition in the fuel during super-prompt power burst; (2) the quasi steady-state reactor power following the initial power burst; and (3) containment response to elevated quasi steady-state reactor power. The approach was to use......, which results in large energy deposition in the fuel during power burst in some accident scenarios. The highest value, 418 cal g(-1), was obtained with SIMULATE-3K for an Oskarshamn 3 case with reflooding rate of 2000 kg s(-1). In most cases, however, the predicted energy deposition was smaller, below...

  12. Isotopic signatures by bulk analyses

    International Nuclear Information System (INIS)

    Efurd, D.W.; Rokop, D.J.

    1997-01-01

    Los Alamos National Laboratory has developed a series of measurement techniques for identification of nuclear signatures by analyzing bulk samples. Two specific applications for isotopic fingerprinting to identify the origin of anthropogenic radioactivity in bulk samples are presented. The first example is the analyses of environmental samples collected in the US Arctic to determine the impact of dumping of radionuclides in this polar region. Analyses of sediment and biota samples indicate that for the areas sampled the anthropogenic radionuclide content of sediments was predominantly the result of the deposition of global fallout. The anthropogenic radionuclide concentrations in fish, birds and mammals were very low. It can be surmised that marine food chains are presently not significantly affected. The second example is isotopic fingerprinting of water and sediment samples from the Rocky Flats Facility (RFP). The largest source of anthropogenic radioactivity presently affecting surface-waters at RFP is the sediments that are currently residing in the holding ponds. One gram of sediment from a holding pond contains approximately 50 times more plutonium than 1 liter of water from the pond. Essentially 100% of the uranium in Ponds A-1 and A-2 originated as depleted uranium. The largest source of radioactivity in the terminal Ponds A-4, B-5 and C-2 was naturally occurring uranium and its decay product radium. The uranium concentrations in the waters collected from the terminal ponds contained 0.05% or less of the interim standard calculated derived concentration guide for uranium in waters available to the public. All of the radioactivity observed in soil, sediment and water samples collected at RFP was naturally occurring, the result of processes at RFP or the result of global fallout. No extraneous anthropogenic alpha, beta or gamma activities were detected. The plutonium concentrations in Pond C-2 appear to vary seasonally

  13. Proteins analysed as virtual knots

    Science.gov (United States)

    Alexander, Keith; Taylor, Alexander J.; Dennis, Mark R.

    2017-02-01

    Long, flexible physical filaments are naturally tangled and knotted, from macroscopic string down to long-chain molecules. The existence of knotting in a filament naturally affects its configuration and properties, and may be very stable or disappear rapidly under manipulation and interaction. Knotting has been previously identified in protein backbone chains, for which these mechanical constraints are of fundamental importance to their molecular functionality, despite their being open curves in which the knots are not mathematically well defined; knotting can only be identified by closing the termini of the chain somehow. We introduce a new method for resolving knotting in open curves using virtual knots, which are a wider class of topological objects that do not require a classical closure and so naturally capture the topological ambiguity inherent in open curves. We describe the results of analysing proteins in the Protein Data Bank by this new scheme, recovering and extending previous knotting results, and identifying topological interest in some new cases. The statistics of virtual knots in protein chains are compared with those of open random walks and Hamiltonian subchains on cubic lattices, identifying a regime of open curves in which the virtual knotting description is likely to be important.

  14. Digital image analyser for autoradiography

    International Nuclear Information System (INIS)

    Muth, R.A.; Plotnick, J.

    1985-01-01

    The most critical parameter in quantitative autoradiography for assay of tissue concentrations of tracers is the ability to obtain precise and accurate measurements of optical density of the images. Existing high-precision systems for image analysis, rotating drum densitometers, are expensive, suffer from mechanical problems and are slow. More moderately priced and reliable video camera based systems are available, but their outputs generally do not have the uniformity and stability necessary for high resolution quantitative autoradiography. The authors have designed and constructed an image analyser optimized for quantitative single and multiple tracer autoradiography, which they refer to as a memory-mapped charge-coupled device scanner (MM-CCD). The input is from a linear array of CCDs which is used to optically scan the autoradiograph. Images are digitized into 512 x 512 picture elements with 256 gray levels and the data are stored in buffer video memory in less than two seconds. Images can then be transferred to RAM memory by direct memory-mapping for further processing. Arterial blood curve data and optical density-calibrated standards data can be entered, and the optical density images can be converted automatically to tracer concentration or functional images. In double tracer studies, images produced from both exposures can be stored and processed in RAM to yield ''pure'' individual tracer concentration or functional images. Any processed image can be transmitted back to the buffer memory to be viewed on a monitor and processed for region-of-interest analysis

  15. Ergonomic analyses of downhill skiing.

    Science.gov (United States)

    Clarys, J P; Publie, J; Zinzen, E

    1994-06-01

    The purpose of this study was to provide electromyographic feedback for (1) pedagogical advice in motor learning, (2) the ergonomics of materials choice and (3) competition. For these purposes: (1) EMG data were collected for the Stem Christie, the Stem Turn and the Parallel Christie (three basic ski initiation drills) and verified for the complexity of patterns; (2) integrated EMG (iEMG) and linear envelopes (LEs) were analysed from standardized positions, motions and slopes using compact, soft and competition skis; (3) in a simulated 'parallel special slalom', the muscular activity pattern and intensity of excavated and flat snow conditions were compared. The EMG data from the three studies were collected on location in the French Alps (Tignes). The analog raw EMG was recorded on the slopes with a portable seven-channel FM recorder (TEAC MR30) and with pre-amplified bipolar surface electrodes supplied with a precision instrumentation amplifier (AD 524, Analog Devices, Norwood, USA). The raw signal was full-wave rectified and enveloped using a moving average principle. This linear envelope was normalized according to the highest peak amplitude procedure per subject and was integrated in order to obtain a reference of muscular intensity. In the three studies and for all subjects (elite skiers: n = 25 in studies 1 and 2, n = 6 in study 3), we found a high level of co-contractions in the lower limb extensors and flexors, especially during the extension phase of the ski movement. The Stem Christie and the Parallel Christie showed higher levels of rhythmic movement (92 and 84%, respectively).(ABSTRACT TRUNCATED AT 250 WORDS)

  16. Analysing Interplanetary Probe Guidance Accuracy

    Directory of Open Access Journals (Sweden)

    S. V. Sukhova

    2016-01-01

    Full Text Available The paper presents a guidance accuracy analysis and estimates the delta-v budget required for the trajectory correction maneuvers of direct interplanetary flights (without midcourse gravity assists). The analysis takes into consideration the orbital hyperbolic injection errors (which depend on the selected launch vehicle and ascent trajectory) and the uncertainties of the midcourse correction maneuvers. The calculation algorithm is based on Monte Carlo simulation and Danby's matrix methods (the matrizant of keplerian motion). Danby's method establishes a link between the errors of the spacecraft state vectors at different flight times using the reference keplerian orbit matrizant. Utilizing the nominal trajectory parameters and the covariance matrix of launch vehicle injection errors, random perturbed orbits are generated and the required velocity corrections are calculated. The next step is to simulate midcourse maneuver performance uncertainty using the midcourse maneuver covariance matrix. The obtained trajectory correction impulses and spacecraft position errors are statistically processed to compute the required delta-v budget and dispersion ellipse parameters for different prediction intervals. As an example, a guidance accuracy analysis has been conducted for a 2022 mission to Mars and a Venus mission in 2026. The paper considers one and two midcourse correction options, as well as utilization of two different launch vehicles. The presented algorithm based on Monte Carlo simulation and Danby's methods provides a preliminary evaluation of the midcourse correction delta-v budget and spacecraft position error. The only data required for this guidance accuracy analysis are a reference keplerian trajectory and a covariance matrix of the injection errors. Danby's matrix method also allows other factors affecting the trajectory to be taken into account, thereby increasing the accuracy of the analysis.
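
    The Monte Carlo part of the algorithm can be sketched as follows, with a hypothetical injection-error covariance and the simplifying assumption that the sampled velocity error maps directly to the correction delta-v. The matrizant mapping of the full method, which propagates injection errors along the reference keplerian orbit, is omitted here.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical 3x3 covariance of hyperbolic injection velocity errors, (m/s)^2
cov = np.diag([4.0, 1.0, 1.0])
mean = np.zeros(3)

# One sample per perturbed orbit; the correction delta-v is the magnitude
# of the velocity error to be removed (identity mapping assumed here)
samples = rng.multivariate_normal(mean, cov, size=20000)
dv = np.linalg.norm(samples, axis=1)

# Size the budget at the 99th percentile, as in Monte Carlo guidance analyses
budget = np.percentile(dv, 99.0)
print(budget)
```

    The full analysis replaces the identity mapping with the keplerian matrizant, adds maneuver execution errors from their own covariance, and repeats the statistics for each prediction interval.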

  17. Automatic incrementalization of Prolog based static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Kahl, Matthias; Saha, Diptikalyan

    2007-01-01

    Modern development environments integrate various static analyses into the build process. Analyses that analyze the whole project whenever the project changes are impractical in this context. We present an approach to automatic incrementalization of analyses that are specified as tabled logic...... incrementalizing a broad range of static analyses....

  18. Detonation Performance Analyses for Recent Energetic Molecules

    Science.gov (United States)

    Stiel, Leonard; Samuels, Philip; Spangler, Kimberly; Iwaniuk, Daniel; Cornell, Rodger; Baker, Ernest

    2017-06-01

    Detonation performance analyses were conducted for a number of evolving and potential high explosive materials. The calculations were completed at the theoretical maximum densities of the explosives, using the Jaguar thermo-chemical equation of state computer programs for performance evaluations and for JWL/JWLB equation of state parameterizations. A number of recently synthesized materials were investigated for performance characterization and comparison to existing explosives, including TNT, RDX, HMX, and CL-20. The analytic cylinder model was utilized to establish cylinder and Gurney velocities as functions of the radial expansion of the cylinder for each explosive. The densities and heats of formation utilized in the calculations are primarily experimental values from Picatinny Arsenal and other sources. Several of the new materials considered were predicted to have enhanced detonation characteristics compared to conventional explosives. In order to confirm the accuracy of the Jaguar and analytic cylinder model results, available experimental detonation and Gurney velocities for representative energetic molecules and their formulations were compared with the corresponding calculated values. Close agreement was obtained with most of the data. Presently at NATO.
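
    The Gurney velocity mentioned above has a simple closed form for a cylindrical charge. The Gurney constant used below is a handbook-level TNT value, given purely for illustration; the study's values come from Jaguar calculations and experiment.

```python
import math

def gurney_velocity_cylinder(sqrt_2E, M_over_C):
    """Gurney velocity of the case of a cylindrical charge:
    v = sqrt(2E) / sqrt(M/C + 1/2), where sqrt(2E) is the explosive's
    Gurney constant and M/C is the metal-to-charge mass ratio."""
    return sqrt_2E / math.sqrt(M_over_C + 0.5)

# TNT-like Gurney constant sqrt(2E) ≈ 2.44 km/s (handbook-level value)
print(gurney_velocity_cylinder(2.44, 1.0))   # ≈ 1.99 km/s at M/C = 1
```

    The analytic cylinder model in the paper goes further by giving the case velocity as a function of radial expansion rather than only this asymptotic value.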

  19. Social Media Analyses for Social Measurement

    Science.gov (United States)

    Schober, Michael F.; Pasek, Josh; Guggenheim, Lauren; Lampe, Cliff; Conrad, Frederick G.

    2016-01-01

    Demonstrations that analyses of social media content can align with measurement from sample surveys have raised the question of whether survey research can be supplemented or even replaced with less costly and burdensome data mining of already-existing or “found” social media content. But just how trustworthy such measurement can be—say, to replace official statistics—is unknown. Survey researchers and data scientists approach key questions from starting assumptions and analytic traditions that differ on, for example, the need for representative samples drawn from frames that fully cover the population. New conversations between these scholarly communities are needed to understand the potential points of alignment and non-alignment. Across these approaches, there are major differences in (a) how participants (survey respondents and social media posters) understand the activity they are engaged in; (b) the nature of the data produced by survey responses and social media posts, and the inferences that are legitimate given the data; and (c) practical and ethical considerations surrounding the use of the data. Estimates are likely to align to differing degrees depending on the research topic and the populations under consideration, the particular features of the surveys and social media sites involved, and the analytic techniques for extracting opinions and experiences from social media. Traditional population coverage may not be required for social media content to effectively predict social phenomena to the extent that social media content distills or summarizes broader conversations that are also measured by surveys. PMID:27257310

  20. Thermal and hydraulic analyses of the System 81 cold traps

    Energy Technology Data Exchange (ETDEWEB)

    Kim, K.

    1977-06-15

    Thermal and hydraulic analyses of the System 81 Type I and II cold traps were completed, except for the thermal transient analysis. Results are evaluated, discussed, and reported. Analytical models were developed to determine the physical dimensions of the cold traps and to predict their performance. The FFTF cold trap crystallizer performances were simulated using the thermal model. This simulation shows that the analytical model developed predicts reasonably conservative temperatures. Pressure drop and sodium residence time calculations indicate that the present design will meet the requirements specified in the E-Specification. Steady-state temperature data for the critical regions were generated to assess the magnitude of the thermal stresses.
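
    The residence time calculation mentioned is, at its simplest, trap volume over volumetric flow rate. The numbers below are hypothetical, not the System 81 design values.

```python
def residence_time(trap_volume_m3, flow_m3_per_s):
    """Mean sodium residence time in the cold trap: packed volume over
    volumetric flow rate (both values hypothetical)."""
    return trap_volume_m3 / flow_m3_per_s

print(residence_time(0.5, 0.002))   # ≈ 250 s for a 0.5 m^3 trap at 2 L/s
```

    Longer residence at the saturation temperature gives the impurities more time to crystallize, which is why residence time appears alongside pressure drop in the design requirements.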

  1. Prediction Markets

    DEFF Research Database (Denmark)

    Horn, Christian Franz; Ivens, Bjørn Sven; Ohneberg, Michael

    2014-01-01

    In recent years, Prediction Markets gained growing interest as a forecasting tool among researchers as well as practitioners, which resulted in an increasing number of publications. In order to track the latest development of research, comprising the extent and focus of research, this article...... works, articles of theoretical nature, application-oriented studies and articles dealing with the topic of law and policy. The analysis of the research results reveals that more than half of the literature pool deals with the application and actual function tests of Prediction Markets. The results...

  2. Efficient ALL vs. ALL collision risk analyses

    Science.gov (United States)

    Escobar, D.; Paskowitz, M.; Agueda, A.; Garcia, G.; Molina, M.

    2011-09-01

In recent years, space debris has gained a lot of attention due to the increasing number of uncontrolled man-made objects orbiting the Earth. This population poses a significant and constantly growing threat to operational satellites. In order to face this threat in an independent manner, ESA has launched an initiative for the development of a European SSA System, in which GMV is participating via several activities. Apart from those activities financed by ESA, GMV has developed closeap, a tool for efficient conjunction assessment and collision probability prediction. ESA's NAPEOS has been selected as the computational engine and numerical propagator to be used in the tool, which can be considered an add-on to the standard NAPEOS package. closeap makes use of the same orbit computation, conjunction assessment and collision risk algorithms implemented in CRASS, but at the same time both systems are completely independent. Moreover, the implementation in closeap has been validated against CRASS with excellent results. This paper describes the performance improvements implemented in closeap at the algorithm level to ensure that the most time-demanding scenarios (e.g., all catalogued objects analysed against each other - all vs. all scenarios) can be analysed in a reasonable amount of time with commercial off-the-shelf hardware. However, the amount of space debris increases steadily due to human activity. Thus, the number of objects involved in a full collision assessment is expected to increase notably and, consequently, the computational cost, which scales as the square of the number of objects, will increase as well. Additionally, orbit propagation algorithms that are computationally expensive might be needed to predict the trajectories of space debris more accurately. In order to cope with such computational needs, the next natural step in the development of collision assessment tools is the use of parallelization techniques.
In this paper we investigate
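The quadratic scaling described in this abstract can be made concrete with a toy sketch. The following is purely illustrative (it is not the closeap or CRASS algorithm): an all-vs-all screen visits n(n-1)/2 pairs, so the work grows as the square of the catalogue size; the function name, coordinates and threshold are invented for the example.

```python
# Illustrative all-vs-all coarse screening: n objects -> n*(n-1)/2 pair checks,
# which is why the computational cost scales as the square of the catalogue size.
from itertools import combinations
from math import dist

def close_pairs(positions, threshold_km):
    """Return index pairs whose Euclidean separation is below threshold_km."""
    hits = []
    for i, j in combinations(range(len(positions)), 2):
        if dist(positions[i], positions[j]) < threshold_km:
            hits.append((i, j))
    return hits

# Two hypothetical nearby LEO objects and one distant GEO object (km).
objects = [(7000.0, 0.0, 0.0), (7001.0, 0.5, 0.0), (42164.0, 0.0, 0.0)]
print(close_pairs(objects, threshold_km=5.0))  # [(0, 1)]
```

In a real tool this inner loop is exactly what is parallelised across workers, since the pair checks are independent of one another.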

  3. Pawnee Nation Energy Option Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Matlock, M.; Kersey, K.; Riding In, C.

    2009-07-21

Pawnee Nation of Oklahoma Energy Option Analyses In 2003, the Pawnee Nation leadership identified the need for the tribe to comprehensively address its energy issues. During a strategic energy planning workshop a general framework was laid out and the Pawnee Nation Energy Task Force was created to work toward further development of the tribe’s energy vision. The overarching goals of the “first steps” project were to identify the most appropriate focus for its strategic energy initiatives going forward, and to provide information necessary to take the next steps in pursuit of the “best fit” energy options. Description of Activities Performed The research team reviewed existing data pertaining to the availability of biomass (focusing on woody biomass, agricultural biomass/bio-energy crops, and methane capture), solar, wind and hydropower resources on the Pawnee-owned lands. Using these data, combined with assumptions about costs and revenue streams, the research team performed preliminary feasibility assessments for each resource category. The research team also reviewed available funding resources and made recommendations to Pawnee Nation highlighting those resources with the greatest potential for financially-viable development, both in the near-term and over a longer time horizon. Findings and Recommendations Due to a lack of financial incentives for renewable energy, particularly at the state level, combined with mediocre renewable energy resources, renewable energy development opportunities are limited for Pawnee Nation. However, near-term potential exists for development of solar hot water at the gym, and an exterior wood-fired boiler system at the tribe’s main administrative building. Pawnee Nation should also explore options for developing LFGTE resources in collaboration with the City of Pawnee. Significant potential may also exist for development of bio-energy resources within the next decade. Pawnee Nation representatives should closely monitor

  4. Evaluating the Predictive Value of Growth Prediction Models

    Science.gov (United States)

    Murphy, Daniel L.; Gaertner, Matthew N.

    2014-01-01

    This study evaluates four growth prediction models--projection, student growth percentile, trajectory, and transition table--commonly used to forecast (and give schools credit for) middle school students' future proficiency. Analyses focused on vertically scaled summative mathematics assessments, and two performance standards conditions (high…

  5. Analyses of bundle experiment data using MATRA-h

    Energy Technology Data Exchange (ETDEWEB)

    Lim, In Cheol; Chea, Hee Taek [Korea Atomic Energy Research Institute, Taejon (Korea)

    1998-06-01

When the construction and operation license for HANARO was renewed in 1995, a 25% CHF penalty was imposed. The reason for this was that the validation work related to the CHF design calculation was not sufficient to assure the CHF margin. As part of the work to recover this CHF penalty, MATRA-h was developed by implementing new correlations for heat transfer, CHF prediction, and subcooled void into MATRA-a, which is the modified version of COBRA-IV-I developed by KAERI. Using MATRA-h, subchannel analyses of the bundle experiment data were performed. From the comparison of the code predictions with the experimental results, it was found that the code gives conservative predictions as far as CHF in the bundle geometry is concerned. (author). 12 refs., 25 figs., 16 tabs.

  6. Techniques for Analysing Problems in Engineering Projects

    DEFF Research Database (Denmark)

    Thorsteinsson, Uffe

    1998-01-01

Description of how CPM network can be used for analysing complex problems in engineering projects.

  7. Improving word coverage using unsupervised morphological analyser

    Indian Academy of Sciences (India)

Institute of Information Technology (IIIT), Hyderabad. The study was made in various directions to compare the results of our system with the output of the Hindi Analyser developed by IIIT, Hyderabad. 3.1 Comparison of rule-based morphological analyser with unsupervised morphological analyser. 3.1a Calculation of ...

  8. Prediction, Regression and Critical Realism

    DEFF Research Database (Denmark)

    Næss, Petter

    2004-01-01

    This paper considers the possibility of prediction in land use planning, and the use of statistical research methods in analyses of relationships between urban form and travel behaviour. Influential writers within the tradition of critical realism reject the possibility of predicting social...... phenomena. This position is fundamentally problematic to public planning. Without at least some ability to predict the likely consequences of different proposals, the justification for public sector intervention into market mechanisms will be frail. Statistical methods like regression analyses are commonly...... seen as necessary in order to identify aggregate level effects of policy measures, but are questioned by many advocates of critical realist ontology. Using research into the relationship between urban structure and travel as an example, the paper discusses relevant research methods and the kinds...

  9. Genetic Prediction.

    Science.gov (United States)

    Turkheimer, Eric

    2015-01-01

    The fundamental reason that the genetics of behavior has remained so controversial for so long is that the layer of theory between data and their interpretation is thicker and more opaque than in more established areas of science. The finding that variations in tiny snippets of DNA have small but detectable relations to variation in behavior surprises no one, at least no one who was paying attention to the twin studies. How such snippets of DNA are related to differences in behavior-known as the gene-to-behavior pathway-is the great theoretical problem of modern behavioral genetics. Given that intentional human breeding is a horrific prospect, what kind of technology might we want (or fear) out of human behavioral genetics? One possibility is a technology that could predict important behavioral characteristics of humans based on their genomes alone. A moment's thought suggests significant benefits and risks that might be associated with such a possibility, but for the moment, just consider how convincing it would be if on the day of a baby's birth we could make meaningful predictions about whether he or she would become a concert pianist or an alcoholic. This article will consider where we are right now as regards that possibility, using human height and intelligence as the primary examples. © 2015 The Hastings Center.

  10. Predicting gangrenous cholecystitis.

    Science.gov (United States)

    Wu, Bin; Buddensick, Thomas J; Ferdosi, Hamid; Narducci, Dusty Marie; Sautter, Amanda; Setiawan, Lisa; Shaukat, Haroon; Siddique, Mustafa; Sulkowski, Gisela N; Kamangar, Farin; Kowdley, Gopal C; Cunningham, Steven C

    2014-09-01

    Gangrenous cholecystitis (GC) is often challenging to treat. The objectives of this study were to determine the accuracy of pre-operative diagnosis, to assess the rate of post-cholecystectomy complications and to assess models to predict GC. A retrospective single-institution review identified patients undergoing a cholecystectomy. Logistic regression models were used to examine the association of variables with GC and to build risk-assessment models. Of 5812 patients undergoing a cholecystectomy, 2219 had acute, 4837 chronic and 351 GC. Surgeons diagnosed GC pre-operatively in only 9% of cases. Patients with GC had more complications, including bile-duct injury, increased estimated blood loss (EBL) and more frequent open cholecystectomies. In unadjusted analyses, variables significantly associated with GC included: age >45 years, male gender, heart rate (HR) >90, white blood cell count (WBC) >13,000/mm(3), gallbladder wall thickening (GBWT) ≥ 4 mm, pericholecystic fluid (PCCF) and American Society of Anesthesiology (ASA) >2. In adjusted analyses, age, WBC, GBWT and HR, but not gender, PCCF or ASA remained statistically significant. A 5-point scoring system was created: 0 points gave a 2% probability of GC and 5 points a 63% probability. Using models can improve a diagnosis of GC pre-operatively. A prediction of GC pre-operatively may allow surgeons to be better prepared for a difficult operation. © 2014 International Hepato-Pancreato-Biliary Association.
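The abstract gives the thresholds for the adjusted predictors (age > 45 years, WBC > 13,000/mm³, GBWT ≥ 4 mm, HR > 90) but not the paper's exact 5-point allocation, so the sketch below simply awards one point per threshold crossed; it is a hedged illustration of how such a threshold-based risk score works, not the published model.

```python
# Illustrative threshold-based risk score (NOT the paper's exact 5-point model):
# one point is awarded for each adjusted predictor above its cut-off.
def gc_risk_points(age, wbc, gbwt_mm, hr):
    points = 0
    points += age > 45        # age in years
    points += wbc > 13000     # white blood cell count per mm^3
    points += gbwt_mm >= 4    # gallbladder wall thickening in mm
    points += hr > 90         # heart rate in beats per minute
    return points

print(gc_risk_points(age=60, wbc=15000, gbwt_mm=5, hr=100))  # 4
```

In the paper, the score then maps to a probability of GC (0 points ≈ 2%, 5 points ≈ 63%); the mapping itself would come from the fitted logistic regression model.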

  11. Predicting supersymmetry

    Energy Technology Data Exchange (ETDEWEB)

    Heinemeyer, S. [Instituto de Fisica de Cantabria (CSIC-UC), Santander (Spain); Weiglein, G. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)

    2010-07-15

We review the results of SUSY parameter fits based on frequentist analyses of experimental constraints from electroweak precision data, (g−2)_μ, B physics and cosmological data. We investigate the parameters of the constrained MSSM (CMSSM) with universal soft supersymmetry-breaking mass parameters, and a model with common non-universal Higgs mass parameters in the superpotential (NUHM1). Shown are the results for the SUSY and Higgs spectrum of the models. Many sparticle masses are highly correlated in both the CMSSM and NUHM1, and parts of the regions preferred at the 68% C.L. are accessible to early LHC running. The best-fit points could be tested even with 1 fb⁻¹ at √s = 7 TeV. (orig.)

  12. Social Network Analyses and Nutritional Behavior: An Integrated Modeling Approach

    Directory of Open Access Journals (Sweden)

    Alistair McNair Senior

    2016-01-01

    Full Text Available Animals have evolved complex foraging strategies to obtain a nutritionally balanced diet and associated fitness benefits. Recent advances in nutrition research, combining state-space models of nutritional geometry with agent-based models of systems biology, show how nutrient targeted foraging behavior can also influence animal social interactions, ultimately affecting collective dynamics and group structures. Here we demonstrate how social network analyses can be integrated into such a modeling framework and provide a tangible and practical analytical tool to compare experimental results with theory. We illustrate our approach by examining the case of nutritionally mediated dominance hierarchies. First we show how nutritionally explicit agent-based models that simulate the emergence of dominance hierarchies can be used to generate social networks. Importantly the structural properties of our simulated networks bear similarities to dominance networks of real animals (where conflicts are not always directly related to nutrition. Finally, we demonstrate how metrics from social network analyses can be used to predict the fitness of agents in these simulated competitive environments. Our results highlight the potential importance of nutritional mechanisms in shaping dominance interactions in a wide range of social and ecological contexts. Nutrition likely influences social interaction in many species, and yet a theoretical framework for exploring these effects is currently lacking. Combining social network analyses with computational models from nutritional ecology may bridge this divide, representing a pragmatic approach for generating theoretical predictions for nutritional experiments.
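The kind of network metric the abstract alludes to can be sketched minimally. The following is a hypothetical illustration (not the paper's agent-based model): dominance interactions are stored as directed (winner, loser) edges, and a simple metric, net wins (out-degree minus in-degree), ranks the agents, as one might do before correlating rank with fitness.

```python
# Toy dominance network: rank agents by net wins (out-degree minus in-degree).
from collections import Counter

def dominance_rank(interactions):
    """Rank agents from most to least dominant by wins minus losses."""
    wins, losses = Counter(), Counter()
    for winner, loser in interactions:
        wins[winner] += 1
        losses[loser] += 1
    agents = set(wins) | set(losses)
    return sorted(agents, key=lambda a: wins[a] - losses[a], reverse=True)

# Hypothetical bouts between three agents.
bouts = [("A", "B"), ("A", "C"), ("B", "C"), ("A", "B"), ("C", "B")]
print(dominance_rank(bouts))  # ['A', 'C', 'B']
```

Richer metrics from social network analysis (e.g. David's score or eigenvector centrality) follow the same pattern: compute a per-node statistic from the interaction network, then test it as a predictor of fitness.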

  13. [Anne Arold. Kontrastive Analyse...] / Paul Alvre

    Index Scriptorium Estoniae

    Alvre, Paul, 1921-2008

    2001-01-01

Review of: Arold, Anne. Kontrastive Analyse der Wortbildungsmuster im Deutschen und im Estnischen (am Beispiel der Aussehensadjektive). Tartu, 2000. (Dissertationes philologiae germanicae Universitatis Tartuensis)

  14. Genome-Facilitated Analyses of Geomicrobial Processes

    Energy Technology Data Exchange (ETDEWEB)

    Kenneth H. Nealson

    2012-05-02

This project had the goal(s) of understanding the mechanism(s) of extracellular electron transport (EET) in the microbe Shewanella oneidensis MR-1, and in a number of other strains and species in the genus Shewanella. The major accomplishments included sequencing, annotation, and analysis of more than 20 Shewanella genomes. The comparative genomics enabled the beginning of a systems biology approach to this genus. Another major contribution involved the study of gene regulation, primarily in the model organism, MR-1. As part of this work, we took advantage of special facilities at the DOE: e.g., the synchrotron radiation facility at ANL, where we successfully used this system for elemental characterization of single cells in different metabolic states (1). We began work with purified enzymes, and identification of partially purified enzymes, leading to initial characterization of several of the 42 c-type cytochromes from MR-1 (2). As the genome became annotated, we began experiments on transcriptome analysis under different conditions of growth, the first step towards systems biology (3,4). Conductive appendages of Shewanella, called bacterial nanowires, were identified and characterized during this work (5, 11, 20, 21). For the first time, it was possible to measure the electron transfer rate between single cells and a solid substrate (20), a rate that has been confirmed by several other laboratories. We also showed that MR-1 cells preferentially attach to surfaces at a given charge, and are not attracted to, or are even repelled by, other charges. The interaction with the charged surfaces begins with a stimulation of motility (called electrokinesis), and eventually leads to attachment and growth. One of the things that genomics allows is the comparative analysis of the various Shewanella strains, which led to several important insights.
First, while the genomes predicted that none of the strains looked like they should be able to degrade N-acetyl glucosamine (NAG), the monomer

  15. Diagnostic Comparison of Meteorological Analyses during the 2002 Antarctic Winter

    Science.gov (United States)

Manney, Gloria L.; Allen, Douglas R.; Kruger, Kirstin; Naujokat, Barbara; Santee, Michelle L.; Sabutis, Joseph L.; Pawson, Steven; Swinbank, Richard; Randall, Cora E.; Simmons, Adrian J.

    2005-01-01

Several meteorological datasets, including U.K. Met Office (MetO), European Centre for Medium-Range Weather Forecasts (ECMWF), National Centers for Environmental Prediction (NCEP), and NASA's Goddard Earth Observation System (GEOS-4) analyses, are being used in studies of the 2002 Southern Hemisphere (SH) stratospheric winter and Antarctic major warming. Diagnostics are compared to assess how these studies may be affected by the meteorological data used. While the overall structure and evolution of temperatures, winds, and wave diagnostics in the different analyses provide a consistent picture of the large-scale dynamics of the SH 2002 winter, several significant differences may affect detailed studies. The NCEP-NCAR reanalysis (REAN) and NCEP-Department of Energy (DOE) reanalysis-2 (REAN-2) datasets are not recommended for detailed studies, especially those related to polar processing, because of lower-stratospheric temperature biases that result in underestimates of polar processing potential, and because their winds and wave diagnostics show increasing differences from other analyses between ~30 and 10 hPa (their top level). Southern Hemisphere polar stratospheric temperatures in the ECMWF 40-Yr Re-analysis (ERA-40) show unrealistic vertical structure, so this long-term reanalysis is also unsuited for quantitative studies. The NCEP/Climate Prediction Center (CPC) objective analyses give an inferior representation of the upper-stratospheric vortex. Polar vortex transport barriers are similar in all analyses, but there is large variation in the amount, patterns, and timing of mixing, even among the operational assimilated datasets (ECMWF, MetO, and GEOS-4). The higher-resolution GEOS-4 and ECMWF assimilations provide significantly better representation of filamentation and small-scale structure than the other analyses, even when fields gridded at reduced resolution are studied.
The choice of which analysis to use is most critical for detailed transport

  16. Approximate analyses of inelastic effects in pipework

    International Nuclear Information System (INIS)

    Jobson, D.A.

    1983-01-01

This presentation shows figures concerned with analyses of inelastic effects in pipework, as follows: comparison of experimental and calculated simplified-analysis results for free end rotation and for circumferential strain; interrupted stress relaxation; regenerated relaxation caused by reversed yield; buckling of straight pipe under combined bending and torsion; results of fatigue tests of a pipe bend

  17. Roux 105 PHONETIC DATA AND PHONOLOGICAL ANALYSES ...

    African Journals Online (AJOL)

    This paper is basically concerned with the relationship between phonetic data and phonological analyses. I) It will be shown that phonological analyses based on unverified phonetic data tend to accommodate ad hoc, unmotivated, and even phonetically implausible phonological rules. On the other hand, it will be ...

  18. Diversity of primary care systems analysed.

    NARCIS (Netherlands)

    Kringos, D.; Boerma, W.; Bourgueil, Y.; Cartier, T.; Dedeu, T.; Hasvold, T.; Hutchinson, A.; Lember, M.; Oleszczyk, M.; Pavlick, D.R.

    2015-01-01

    This chapter analyses differences between countries and explains why countries differ regarding the structure and process of primary care. The components of primary care strength that are used in the analyses are health policy-making, workforce development and in the care process itself (see Fig.

  19. Random error in cardiovascular meta-analyses

    DEFF Research Database (Denmark)

    Albalawi, Zaina; McAlister, Finlay A; Thorlund, Kristian

    2013-01-01

    BACKGROUND: Cochrane reviews are viewed as the gold standard in meta-analyses given their efforts to identify and limit systematic error which could cause spurious conclusions. The potential for random error to cause spurious conclusions in meta-analyses is less well appreciated. METHODS: We exam...

  20. Novel Algorithms for Astronomical Plate Analyses

    Indian Academy of Sciences (India)

    2016-01-27

Powerful computers and dedicated software allow effective data mining and scientific analyses in astronomical plate archives. We give and discuss examples of newly developed algorithms for astronomical plate analyses, e.g., searches for optical transients, as well as for major spectral and brightness ...

  1. Cost-Benefit Analyses of Transportation Investments

    DEFF Research Database (Denmark)

    Næss, Petter

    2006-01-01

    -to-pay investigations. Accepting the ontological and epistemological assumptions of cost-benefit analysis involves an implicit acceptance of the ethical and political values favoured by these assumptions. Cost-benefit analyses of transportation investment projects tend to neglect long-term environmental consequences...... and needs among population groups with a low ability to pay. Instead of cost-benefit analyses, impact analyses evaluating the likely effects of project alternatives against a wide range of societal goals is recommended, with quantification and economic valorisation only for impact categories where this can......This paper discusses the practice of cost-benefit analyses of transportation infrastructure investment projects from the meta-theoretical perspective of critical realism. Such analyses are based on a number of untenable ontological assumptions about social value, human nature and the natural...

  2. Secondary structural analyses of ITS1 in Paramecium.

    Science.gov (United States)

    Hoshina, Ryo

    2010-01-01

The nuclear ribosomal RNA gene operon is interrupted by internal transcribed spacer (ITS) 1 and ITS2. Although the secondary structure of ITS2 has been widely investigated, less is known about ITS1 and its structure. In this study, the secondary structure of ITS1 sequences for Paramecium and other ciliates was predicted. Each Paramecium ITS1 forms an open loop with three helices, A through C. Helix B was highly conserved among Paramecium, and similar helices were found in other ciliates. A phylogenetic analysis using the ITS1 sequences showed high resolution, implying that ITS1 is a good tool for species-level analyses.

  3. Predictable Medea

    Directory of Open Access Journals (Sweden)

    Elisabetta Bertolino

    2010-01-01

Full Text Available By focusing on the tragedy of the 'unpredictable' infanticide perpetrated by Medea, the paper speculates on the possibility of a non-violent ontological subjectivity for women victims of gendered violence and on whether it is possible to respond to violent actions in non-violent ways; it argues that Medea did not act in an unpredictable way, but rather through the very predictable subject of resentment and violence. 'Medea' represents the story of all of us who require justice as retribution against any wrong. The presupposition is that the empowered female subjectivity of women’s rights contains the same desire to master others as the current masculine legal and philosophical subject. The subject of women’s rights is grounded in the emotions of resentment and retribution and refuses the categories of the private by appropriating those of the righteous, masculine and public subject. The essay opposes the essentialised stereotypes of the feminine and the maternal with an ontological approach to people as singular, corporeal, vulnerable and dependent. There is therefore an emphasis on the excluded categories of the private. Forgiveness is taken into account as a category of the private and a possibility of responding to violence with newness. A violent act is seen in relation to the community of human beings rather than through an isolated setting, as in the case of the individual of human rights. In this context, forgiveness allows one to risk again and to be with others. The result is also a rethinking of feminist actions, feminine subjectivity and the maternal. Overall the paper opens up the Arendtian categories of action and forgiveness and the Cavarerian unique and corporeal ontology of selfhood beyond gendered stereotypes.

  4. Automatic incrementalization of Prolog based static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Kahl, Matthias; Saha, Diptikalyan

    2007-01-01

Modern development environments integrate various static analyses into the build process. Analyses that analyze the whole project whenever the project changes are impractical in this context. We present an approach to automatic incrementalization of analyses that are specified as tabled logic...... programs and evaluated using incremental tabled evaluation, a technique for efficiently updating memo tables in response to changes in facts and rules. The approach has been implemented and integrated into the Eclipse IDE. Our measurements show that this technique is effective for automatically...
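The core idea behind incremental tabled evaluation can be sketched in miniature (this is a toy, not the paper's Prolog machinery): memoise derived results in a table, record which base facts each entry read, and on a fact change invalidate only the dependent entries instead of recomputing everything. Names and the example query are invented for illustration.

```python
# Toy memo table with dependency tracking: changing a base fact invalidates
# only the cached results that actually read it.
class IncrementalTable:
    def __init__(self, facts):
        self.facts = dict(facts)
        self.table = {}   # query key -> cached result
        self.deps = {}    # fact name -> set of dependent query keys

    def query(self, key, compute):
        if key not in self.table:
            read = set()
            def get(name):          # tracked reader for base facts
                read.add(name)
                return self.facts[name]
            self.table[key] = compute(get)
            for name in read:
                self.deps.setdefault(name, set()).add(key)
        return self.table[key]

    def update_fact(self, name, value):
        self.facts[name] = value
        for key in self.deps.pop(name, set()):
            self.table.pop(key, None)   # invalidate only dependents

t = IncrementalTable({"x": 2, "y": 10})
print(t.query("double_x", lambda get: 2 * get("x")))  # 4 (computed, cached)
t.update_fact("x", 5)                                 # invalidates "double_x" only
print(t.query("double_x", lambda get: 2 * get("x")))  # 10 (recomputed)
```

Real incremental tabled evaluation additionally propagates changes through rule dependencies, but the invalidate-only-what-depended pattern is the same.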

  5. Neurological abnormalities predict disability

    DEFF Research Database (Denmark)

    Poggesi, Anna; Gouw, Alida; van der Flier, Wiesje

    2014-01-01

    To investigate the role of neurological abnormalities and magnetic resonance imaging (MRI) lesions in predicting global functional decline in a cohort of initially independent-living elderly subjects. The Leukoaraiosis And DISability (LADIS) Study, involving 11 European centres, was primarily aimed...... at evaluating age-related white matter changes (ARWMC) as an independent predictor of the transition to disability (according to Instrumental Activities of Daily Living scale) or death in independent elderly subjects that were followed up for 3 years. At baseline, a standardized neurological examination.......0 years, 45 % males), 327 (51.7 %) presented at the initial visit with ≥1 neurological abnormality and 242 (38 %) reached the main study outcome. Cox regression analyses, adjusting for MRI features and other determinants of functional decline, showed that the baseline presence of any neurological...

  6. Anthocyanin analyses of Vaccinium fruit dietary supplements

    Science.gov (United States)

    Vaccinium fruit ingredients within dietary supplements were identified by comparisons with anthocyanin analyses of known Vaccinium profiles (demonstration of anthocyanin fingerprinting). Available Vaccinium supplements were purchased and analyzed; their anthocyanin profiles (based on HPLC separation...

  7. Understanding Human Error Based on Automated Analyses

    Data.gov (United States)

    National Aeronautics and Space Administration — This is a report on a continuing study of automated analyses of experiential textual reports to gain insight into the causal factors of human errors in aviation...

  8. Analyse of Maintenance Cost in ST

    CERN Document Server

    Jenssen, B W

    2001-01-01

An analysis has been carried out in ST concerning the total costs for the division. Even though the target was the maintenance costs in ST, the overall budget has been analysed. This has been done since there is a close relation between investments & consolidation and the required level of maintenance. The purpose of the analysis was to focus on maintenance cost in ST as a ratio of total maintenance costs over the replacement value of the equipment, and to make some comparisons with other industries and laboratories. Families of equipment have been defined and their corresponding ratios calculated. This first approach gives us some "quantitative" measurements. This analysis should be combined with performance indicators (more "qualitative" measurements) that tell us how well we are performing. This will help us defend our budget, set better priorities, and satisfy the requirements of our external auditors.

  9. 7 CFR 94.102 - Analyses available.

    Science.gov (United States)

    2010-01-01

... analyses for total ash, fat by acid hydrolysis, moisture, salt, protein, beta-carotene, catalase... glycol, SLS, and zeolex. There are also tests for starch, total sugars, sugar profile, whey, standard...

  10. Comparative sequence analyses of genome and transcriptome ...

    Indian Academy of Sciences (India)

Keywords: Asian elephant; comparative genomics; gene prediction; transcriptome. Abstract: The Asian elephant Elephas maximus and the African elephant Loxodonta africana that diverged 5-7 million years ...

  11. Predictable earthquakes?

    Science.gov (United States)

    Martini, D.

    2002-12-01

acceleration) and the global number of earthquakes for this period from the published literature, which give a good picture of the dynamical geophysical phenomena. Methodology: Computing linear correlation coefficients gives us a chance to quantitatively characterise the relation among the data series, if we suppose a linear dependence as a first step. The correlation coefficients among the Earth's rotational acceleration, the Z-orbit acceleration (perpendicular to the ecliptic plane), and the global number of earthquakes were compared. The results clearly demonstrate a common feature of both the Earth's rotation and the Earth's Z-acceleration around the Sun, and also between the Earth's rotational acceleration and the earthquake number. This might indicate a strong relation among these phenomena. The mentioned rather strong correlation (r = 0.75) and the 29-year period (Saturn's synodic period) were clearly shown in the computed cross-correlation function, which gives the dynamical characteristic of the correlation between the Earth's orbital (Z-direction) and rotational acceleration. This basic period (29 years) was also obvious in the earthquake number data sets, with clear common features in time. Conclusion: The core, which is involved in the secular variation of the Earth's magnetic field, is the only sufficiently mobile part of the Earth with sufficient mass to modify the rotation, which probably affects the global time distribution of earthquakes. This might mean that the secular variation of earthquakes is inseparable from the changes in the Earth's magnetic field, i.e. the interior processes of the Earth's core belong to the dynamical state of the solar system. Therefore, if the described idea is correct, the global distribution of earthquakes in time is predictable.
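The method rests on linear (Pearson) correlation between time series. For reference, a self-contained version of that coefficient is sketched below; the geophysical data themselves are not reproduced, and the sample series are invented for the example.

```python
# Pearson correlation coefficient: covariance of the two series divided by
# the product of their standard deviations; r in [-1, 1].
from math import sqrt

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

a = [1.0, 2.0, 3.0, 4.0]
b = [2.1, 3.9, 6.2, 8.0]
print(pearson_r(a, b))  # close to 1 for a near-linear relation
```

The cross-correlation function mentioned in the conclusion is the same statistic evaluated at a range of time lags between the two series, which is how a shared period such as the 29-year signal shows up.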

  12. Risk analyses of nuclear power plants

    International Nuclear Information System (INIS)

    Jehee, J.N.T.; Seebregts, A.J.

    1991-02-01

    Probabilistic risk analyses of nuclear power plants are carried out by systematically analyzing the possible consequences of a broad spectrum of causes of accidents. The risk can be expressed in the probabilities for melt down, radioactive releases, or harmful effects for the environment. Following risk policies for chemical installations as expressed in the mandatory nature of External Safety Reports (EVRs) or, e.g., the publication ''How to deal with risks'', probabilistic risk analyses are required for nuclear power plants

  13. A History of Rotorcraft Comprehensive Analyses

    Science.gov (United States)

    Johnson, Wayne

    2013-01-01

    A history of the development of rotorcraft comprehensive analyses is presented. Comprehensive analyses are digital computer programs that calculate the aeromechanical behavior of the rotor and aircraft, bringing together the most advanced models of the geometry, structure, dynamics, and aerodynamics available in rotary wing technology. The development of the major codes of the last five decades from industry, government, and universities is described. A number of common themes observed in this history are discussed.

  14. Thermal Analyses of Cross-Linked Polyethylene

    Directory of Open Access Journals (Sweden)

    Radek Polansky

    2007-01-01

    Full Text Available The paper summarizes results obtained from structural analysis measurements: Differential Scanning Calorimetry (DSC), Thermogravimetry (TG), Thermomechanical Analysis (TMA) and Fourier transform infrared spectroscopy (FT-IR). Samples of cross-linked polyethylene cable insulation were tested via these analyses. The DSC and TG were carried out using a TA Instruments SDT Q600 simultaneous thermal analyzer coupled to a Nicolet 380 Fourier transform infrared spectrometer. Thermomechanical analysis was carried out with a TA Instruments TMA Q400EM apparatus.

  15. Nuclear criticality predictability

    International Nuclear Information System (INIS)

    Briggs, J.B.

    1999-01-01

    As a result of considerable effort, a large portion of the tedious and redundant research and processing of critical experiment data has been eliminated. The necessary step in criticality safety analyses of validating computer codes with benchmark critical data is greatly streamlined, and valuable criticality safety experimental data are preserved. Criticality safety personnel in 31 different countries are now using the 'International Handbook of Evaluated Criticality Safety Benchmark Experiments'. Much has been accomplished by the work of the ICSBEP. However, evaluation and documentation represent only one element of a successful Nuclear Criticality Safety Predictability Program, and this element exists as a separate entity only because this work was not completed in conjunction with the experimentation process. I believe, however, that the work of the ICSBEP has also served to unify the other elements of nuclear criticality predictability. All elements are interrelated, but for a time it seemed that communication between these elements was not adequate. The ICSBEP has highlighted gaps in data, has retrieved lost data, has helped to identify errors in cross section processing codes, and has helped bring the international criticality safety community together in a common cause as true friends and colleagues. It has been a privilege to associate with those who work so diligently to make the project a success. (J.P.N.)

  16. Finite element analyses of tool stresses in metal cutting processes

    Energy Technology Data Exchange (ETDEWEB)

    Kistler, B.L. [Sandia National Labs., Livermore, CA (United States)

    1997-01-01

    In this report, we analytically predict and examine stresses in tool tips used in high speed orthogonal machining operations. Specifically, one analysis was compared to an existing experimental measurement of stresses in a sapphire tool tip cutting 1020 steel at slow speeds. In addition, two analyses were done of a carbide tool tip in a machining process at higher cutting speeds, in order to compare to experimental results produced as part of this study. The metal being cut was simulated using a Sandia developed damage plasticity material model, which allowed the cutting to occur analytically without prespecifying the line of cutting/failure. The latter analyses incorporated temperature effects on the tool tip. Calculated tool forces and peak stresses matched experimental data to within 20%. Stress contours generally agreed between analysis and experiment. This work could be extended to investigate/predict failures in the tool tip, which would be of great interest to machining shops in understanding how to optimize cost/retooling time.

  17. Pegasys: software for executing and integrating analyses of biological sequences.

    Science.gov (United States)

    Shah, Sohrab P; He, David Y M; Sawkins, Jessica N; Druce, Jeffrey C; Quon, Gerald; Lett, Drew; Zheng, Grace X Y; Xu, Tao; Ouellette, B F Francis

    2004-04-19

    We present Pegasys--a flexible, modular and customizable software system that facilitates the execution and data integration from heterogeneous biological sequence analysis tools. The Pegasys system includes numerous tools for pair-wise and multiple sequence alignment, ab initio gene prediction, RNA gene detection, masking repetitive sequences in genomic DNA as well as filters for database formatting and processing raw output from various analysis tools. We introduce a novel data structure for creating workflows of sequence analyses and a unified data model to store its results. The software allows users to dynamically create analysis workflows at run-time by manipulating a graphical user interface. All non-serial dependent analyses are executed in parallel on a compute cluster for efficiency of data generation. The uniform data model and backend relational database management system of Pegasys allow for results of heterogeneous programs included in the workflow to be integrated and exported into General Feature Format for further analyses in GFF-dependent tools, or GAME XML for import into the Apollo genome editor. The modularity of the design allows for new tools to be added to the system with little programmer overhead. The database application programming interface allows programmatic access to the data stored in the backend through SQL queries. The Pegasys system enables biologists and bioinformaticians to create and manage sequence analysis workflows. The software is released under the Open Source GNU General Public License. All source code and documentation is available for download at http://bioinformatics.ubc.ca/pegasys/.
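
    The scheduling idea described above (all non-serially-dependent analyses run in parallel) can be sketched as a batched topological sort of the workflow DAG. The task names and dependencies below are invented for illustration; this is not the Pegasys API.

    ```python
    # Minimal sketch of parallel batching for an analysis workflow DAG, in the
    # spirit of "all non-serial dependent analyses are executed in parallel".
    # Task names and dependencies are invented; this is not Pegasys code.
    def parallel_batches(deps):
        """deps: task -> set of prerequisite tasks. Returns a list of batches;
        every task in a batch depends only on tasks in earlier batches."""
        remaining = {t: set(d) for t, d in deps.items()}
        batches = []
        while remaining:
            ready = sorted(t for t, d in remaining.items() if not d)
            if not ready:
                raise ValueError("cycle in workflow")
            batches.append(ready)
            for t in ready:
                del remaining[t]
            for d in remaining.values():
                d.difference_update(ready)
        return batches

    workflow = {
        "mask_repeats": set(),
        "gene_predict": {"mask_repeats"},
        "rna_detect":   {"mask_repeats"},
        "blast_align":  {"mask_repeats"},
        "merge_to_gff": {"gene_predict", "rna_detect", "blast_align"},
    }
    batches = parallel_batches(workflow)
    # batches[1] holds the three analyses eligible to run concurrently
    ```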

  18. Multivariate differential analyses of adolescents' experiences of aggression in families

    Directory of Open Access Journals (Sweden)

    Chris Myburgh

    2011-01-01

    Full Text Available Aggression is part of South African society and has implications for the mental health of persons living in South Africa. If parents are aggressive, adolescents are also likely to be aggressive, which will impact negatively on their mental health. In this article the nature and extent of adolescents' experiences of aggression and aggressive behaviour in the family are investigated. A deductive, explorative, quantitative approach was followed. Aggression is reasoned to be dependent on aspects such as self-concept, moral reasoning, communication, frustration tolerance and family relationships. To analyse the data from questionnaires of 101 families (95 adolescents, 95 mothers and 91 fathers), Cronbach's alpha, various consecutive first- and second-order factor analyses, correlations, multiple regression, MANOVA, ANOVA and Scheffé/Dunnett tests were used. It was found that aggression correlated negatively with the independent variables, and the correlations between adolescents and their parents were significant. Regression analyses indicated that different predictors predicted aggression. Furthermore, the differences in experienced levels of aggression between adolescents and their parents were small. Implications for education are given.
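
    As an illustration of one of the statistics listed above, here is a minimal computation of Cronbach's alpha on invented questionnaire data (the actual study used 101 families and several scales, not these numbers).

    ```python
    # Illustrative computation of Cronbach's alpha on invented Likert data
    # (rows of `items` are items, columns are respondents); not study data.
    from statistics import pvariance

    def cronbach_alpha(items):
        """items: list of per-item score lists, all the same length.
        alpha = k/(k-1) * (1 - sum(item variances) / variance of totals)."""
        k = len(items)
        item_vars = sum(pvariance(col) for col in items)
        totals = [sum(scores) for scores in zip(*items)]
        total_var = pvariance(totals)
        return (k / (k - 1)) * (1 - item_vars / total_var)

    # Four 5-point Likert items answered by six respondents.
    items = [
        [4, 3, 5, 2, 4, 3],
        [4, 2, 5, 3, 4, 2],
        [3, 3, 4, 2, 5, 3],
        [5, 3, 5, 2, 4, 2],
    ]
    alpha = cronbach_alpha(items)  # high alpha: items covary strongly
    ```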

  19. Regional Scale Analyses of Climate Change Impacts on Agriculture

    Science.gov (United States)

    Wolfe, D. W.; Hayhoe, K.

    2006-12-01

    New statistically downscaled climate modeling techniques provide an opportunity for improved regional analysis of climate change impacts on agriculture. Climate modeling outputs can often simultaneously meet the needs of those studying impacts on natural as well as managed ecosystems. Climate outputs can be used to drive existing forest or crop models, or livestock models (e.g., temperature-humidity index model predicting dairy milk production) for improved information on regional impact. High spatial resolution climate forecasts, combined with knowledge of seasonal temperatures or rainfall constraining species ranges, can be used to predict shifts in suitable habitat for invasive weeds, insects, and pathogens, as well as cash crops. Examples of climate thresholds affecting species range and species composition include: minimum winter temperature, duration of winter chilling (vernalization) hours (e.g., hours below 7.2 C), frost-free period, and frequency of high temperature stress days in summer. High resolution climate outputs can also be used to drive existing integrated pest management models predicting crop insect and disease pressure. Collectively, these analyses can be used to test hypotheses or provide insight into the impact of future climate change scenarios on species range shifts and threat from invasives, shifts in crop production zones, and timing and regional variation in economic impacts.

  20. A theoretical framework for analysing preschool teaching

    DEFF Research Database (Denmark)

    Chaiklin, Seth

    2014-01-01

    This article introduces a theoretical framework for analysing preschool teaching as a historically-grounded societal practice. The aim is to present a unified framework that can be used to analyse and compare both historical and contemporary examples of preschool teaching practice within and across national traditions. The framework has two main components: an analysis of preschool teaching as a practice, formed in relation to societal needs, and an analysis of the categorical relations which necessarily must be addressed in preschool teaching activity. The framework is introduced and illustrated through the analysis of one of the earliest recorded examples of preschool education (initiated by J. F. Oberlin in northeastern France in 1767). The general idea of societal need is elaborated as a way of analysing practices, and a general analytic schema is presented for characterising preschool...

  1. Advanced Toroidal Facility vacuum vessel stress analyses

    International Nuclear Information System (INIS)

    Hammonds, C.J.; Mayhall, J.A.

    1987-01-01

    The complex geometry of the Advanced Toroidal Facility (ATF) vacuum vessel required special analysis techniques in investigating the structural behavior of the design. The response of a large-scale finite element model was found for transportation and operational loading. Several computer codes and systems, including the National Magnetic Fusion Energy Computer Center Cray machines, were implemented in accomplishing these analyses. The work combined complex methods that taxed the limits of both the codes and the computer systems involved. Using MSC/NASTRAN cyclic-symmetry solutions permitted using only 1/12 of the vessel geometry to mathematically analyze the entire vessel. This allowed the greater detail and accuracy demanded by the complex geometry of the vessel. Critical buckling-pressure analyses were performed with the same model. The development, results, and problems encountered in performing these analyses are described. 5 refs., 3 figs

  3. Proteomic Analyses of the Vitreous Humour

    Directory of Open Access Journals (Sweden)

    Martina Angi

    2012-01-01

    Full Text Available The human vitreous humour (VH is a transparent, highly hydrated gel, which occupies the posterior segment of the eye between the lens and the retina. Physiological and pathological conditions of the retina are reflected in the protein composition of the VH, which can be sampled as part of routine surgical procedures. Historically, many studies have investigated levels of individual proteins in VH from healthy and diseased eyes. In the last decade, proteomics analyses have been performed to characterise the proteome of the human VH and explore networks of functionally related proteins, providing insight into the aetiology of diabetic retinopathy and proliferative vitreoretinopathy. Recent proteomic studies on the VH from animal models of autoimmune uveitis have identified new signalling pathways associated to autoimmune triggers and intravitreal inflammation. This paper aims to guide biological scientists through the different proteomic techniques that have been used to analyse the VH and present future perspectives for the study of intravitreal inflammation using proteomic analyses.

  4. A 256-channel pulse-height analyser

    International Nuclear Information System (INIS)

    Berset, J.C.; Delavallade, G.; Lindsay, J.

    1975-01-01

    The design, construction, and testing of a small, low-cost 256-channel pulse-height analyser is briefly discussed. The analyser, intended for use in the setting up of experiments in high-energy physics, is fully compatible with the CERN/NIM nucleonic instrumentation. It has a digital display of channel and content as well as outputs for printing, plotting, and binary transfer. The logic circuitry is made with TTL integrated circuits and has a static random-access MOS memory. Logic and timing diagrams are given. Detailed specifications are also included. (Author)

  5. Power System Oscillatory Behaviors: Sources, Characteristics, & Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Follum, James D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Tuffner, Francis K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Dosiek, Luke A. [Union College, Schenectady, NY (United States); Pierre, John W. [Univ. of Wyoming, Laramie, WY (United States)

    2017-05-17

    This document is intended to provide a broad overview of the sources, characteristics, and analyses of natural and forced oscillatory behaviors in power systems. These aspects are necessarily linked. Oscillations appear in measurements with distinguishing characteristics derived from the oscillation’s source. These characteristics determine which analysis methods can be appropriately applied, and the results from these analyses can only be interpreted correctly with an understanding of the oscillation’s origin. To describe oscillations both at their source within a physical power system and within measurements, a perspective from the boundary between power system and signal processing theory has been adopted.

  6. Fundamental data analyses for measurement control

    International Nuclear Information System (INIS)

    Campbell, K.; Barlich, G.L.; Fazal, B.; Strittmatter, R.B.

    1987-02-01

    A set of measurement control data analyses was selected for use by analysts responsible for maintaining the measurement quality of nuclear materials accounting instrumentation. The analyses consist of control charts for bias and precision and statistical tests used as analytic supplements to the control charts. They provide the desired detection sensitivity and yet can be interpreted locally, quickly, and easily. The control charts provide for visual inspection of data and enable an alert reviewer to spot problems, possibly before statistical tests detect them. The statistical tests are useful for automating the detection of departures from the controlled state or from the underlying assumptions (such as normality). 8 refs., 3 figs., 5 tabs
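
    A minimal sketch of the control-chart idea described above, using illustrative three-sigma limits on invented bias measurements; the report's actual charting rules and limits may differ.

    ```python
    # Shewhart-style control chart check for measurement bias; the baseline
    # data, limits (mean +/- 3 sigma) and new measurements are illustrative.
    from statistics import mean, stdev

    def control_limits(baseline, k=3.0):
        """Center line and +/- k-sigma limits from in-control baseline data."""
        m, s = mean(baseline), stdev(baseline)
        return m - k * s, m, m + k * s

    def out_of_control(values, lcl, ucl):
        """Indices of points falling outside the control limits."""
        return [i for i, v in enumerate(values) if not (lcl <= v <= ucl)]

    baseline = [0.1, -0.2, 0.0, 0.3, -0.1, 0.2, -0.3, 0.1, 0.0, -0.1]
    lcl, center, ucl = control_limits(baseline)
    new_measurements = [0.1, -0.2, 1.5, 0.0]  # third point is anomalous
    flagged = out_of_control(new_measurements, lcl, ucl)
    ```

    The chart supports the visual inspection described above, while the limit check automates detection of departures from the controlled state.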

  7. Assessment of protein disorder region predictions in CASP10

    KAUST Repository

    Monastyrskyy, Bohdan

    2013-11-22

    The article presents the assessment of disorder region predictions submitted to CASP10. The evaluation is based on the three measures tested in previous CASPs: (i) balanced accuracy, (ii) the Matthews correlation coefficient for the binary predictions, and (iii) the area under the curve in the receiver operating characteristic (ROC) analysis of predictions using probability annotation. We also performed new analyses such as comparison of the submitted predictions with those obtained with a Naïve disorder prediction method and with predictions from the disorder prediction databases D2P2 and MobiDB. On average, the methods participating in CASP10 demonstrated slightly better performance than those in CASP9.
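
    The binary-prediction measures named above can be computed from a confusion matrix. The sketch below uses invented labels and is not the CASP10 assessment code.

    ```python
    # Balanced accuracy and Matthews correlation coefficient for binary
    # disorder predictions; labels here are a toy example, not CASP data.
    import math

    def confusion(y_true, y_pred):
        tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
        tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
        fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
        fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
        return tp, tn, fp, fn

    def balanced_accuracy(y_true, y_pred):
        """Mean of sensitivity and specificity."""
        tp, tn, fp, fn = confusion(y_true, y_pred)
        return 0.5 * (tp / (tp + fn) + tn / (tn + fp))

    def mcc(y_true, y_pred):
        """Matthews correlation coefficient."""
        tp, tn, fp, fn = confusion(y_true, y_pred)
        denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
        return (tp * tn - fp * fn) / denom if denom else 0.0

    # 1 = disordered residue, 0 = ordered residue (toy example)
    truth = [1, 1, 1, 0, 0, 0, 0, 0]
    pred  = [1, 1, 0, 0, 0, 0, 1, 0]
    ```

    Balanced accuracy is preferred over raw accuracy here because disordered residues are a minority class in most targets.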

  8. Prediction of heart disease using apache spark analysing decision trees and gradient boosting algorithm

    Science.gov (United States)

    Chugh, Saryu; Arivu Selvan, K.; Nadesh, RK

    2017-11-01

    Numerous harmful factors influence the working of the human body, such as hypertension, smoking, obesity and inappropriate medication, which cause many different conditions such as diabetes, thyroid disorders, strokes and coronary disease. Unstable and unhealthy environmental conditions also contribute to coronary disease. Apache Spark is suited to this analysis because it handles the large volumes of gathered data quickly, using fast in-memory processing. Spark runs on a distributed environment and splits the data into batches, giving a high throughput rate. The use of data-mining techniques in the diagnosis of coronary disease has been examined thoroughly, showing acceptable levels of precision. Decision trees, neural networks and the gradient boosting algorithm are among the Apache Spark techniques that help in analysing the information.
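
    The gradient boosting algorithm mentioned above can be illustrated in miniature with depth-1 regression trees (stumps) fitted to residuals. This sketches the algorithm itself on invented single-feature data; it is not Spark MLlib code, and the feature and labels are hypothetical.

    ```python
    # Toy gradient boosting for binary risk prediction: stumps fitted to
    # squared-error residuals. Data (a single feature, e.g. blood pressure,
    # with 0/1 disease labels) are invented for illustration.
    def fit_stump(xs, residuals):
        """Best single-threshold split minimising squared error."""
        best = None
        for thr in sorted(set(xs)):
            left = [r for x, r in zip(xs, residuals) if x <= thr]
            right = [r for x, r in zip(xs, residuals) if x > thr]
            if not left or not right:
                continue
            lm, rm = sum(left) / len(left), sum(right) / len(right)
            err = (sum((r - lm) ** 2 for r in left)
                   + sum((r - rm) ** 2 for r in right))
            if best is None or err < best[0]:
                best = (err, thr, lm, rm)
        _, thr, lm, rm = best
        return lambda x: lm if x <= thr else rm

    def boost(xs, ys, rounds=20, lr=0.5):
        """Each round fits a stump to the current residuals."""
        pred = [0.0] * len(xs)
        stumps = []
        for _ in range(rounds):
            residuals = [y - p for y, p in zip(ys, pred)]
            stump = fit_stump(xs, residuals)
            stumps.append(stump)
            pred = [p + lr * stump(x) for p, x in zip(pred, xs)]
        return lambda x: sum(lr * s(x) for s in stumps)

    xs = [90, 100, 110, 120, 150, 160, 170, 180]   # hypothetical feature
    ys = [0, 0, 0, 0, 1, 1, 1, 1]                   # 1 = disease
    model = boost(xs, ys)
    labels = [1 if model(x) > 0.5 else 0 for x in xs]
    ```

    Spark's distributed implementation follows the same additive scheme, but fits each tree over partitioned data in parallel.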

  9. Behavioral and Physiological Neural Network Analyses: A Common Pathway toward Pattern Recognition and Prediction

    Science.gov (United States)

    Ninness, Chris; Lauter, Judy L.; Coffee, Michael; Clary, Logan; Kelly, Elizabeth; Rumph, Marilyn; Rumph, Robin; Kyle, Betty; Ninness, Sharon K.

    2012-01-01

    Using 3 diversified datasets, we explored the pattern-recognition ability of the Self-Organizing Map (SOM) artificial neural network as applied to diversified nonlinear data distributions in the areas of behavioral and physiological research. Experiment 1 employed a dataset obtained from the UCI Machine Learning Repository. Data for this study…
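
    A minimal 1-D Self-Organizing Map, illustrating the SOM technique the study applies; the map size, learning schedule and data below are invented, not those of the experiments.

    ```python
    # Minimal 1-D Self-Organizing Map sketch: units compete for each input
    # (best-matching unit) and neighbours are pulled along, with a decaying
    # learning rate and shrinking neighbourhood. All parameters are invented.
    import math
    import random

    def train_som(data, n_units=4, epochs=60, lr0=0.5, radius0=2.0, seed=0):
        rng = random.Random(seed)
        lo, hi = min(data), max(data)
        weights = [rng.uniform(lo, hi) for _ in range(n_units)]
        for epoch in range(epochs):
            frac = 1 - epoch / epochs
            lr = lr0 * frac                    # decaying learning rate
            radius = max(radius0 * frac, 0.5)  # shrinking neighbourhood
            for x in data:
                # best-matching unit: closest weight to the input
                bmu = min(range(n_units), key=lambda i: abs(weights[i] - x))
                for i in range(n_units):
                    h = math.exp(-((i - bmu) ** 2) / (2 * radius ** 2))
                    weights[i] += lr * h * (x - weights[i])
        return sorted(weights)

    # Two well-separated 1-D clusters; trained units spread across the data.
    data = [1.0, 1.2, 0.9, 1.1, 9.0, 9.2, 8.9, 9.1]
    units = train_som(data)
    ```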

  10. Predicting an optimal function for diagnostic and prognostic analyses with gene expression data

    NARCIS (Netherlands)

    Jong, V.L.

    2017-01-01

    The completion of the human genome and the advancement of high-throughput technologies have enabled the quantification of thousands of genes for precision medicine. The problem with gene expression data is that the number of genes (as variables) greatly exceeds the number of samples, thereby

  11. Variability in spectrophotometric pyruvate analyses for predicting onion pungency and nutraceutical value.

    Science.gov (United States)

    Beretta, Vanesa H; Bannoud, Florencia; Insani, Marina; Galmarini, Claudio R; Cavagnaro, Pablo F

    2017-06-01

    Onion pyruvate concentration is used as a predictor of flavor intensity and nutraceutical value. The protocol of Schwimmer and Weston (SW) (1961) is the most widespread methodology for estimating onion pyruvate. Anthon and Barrett (AB) (2003) proposed modifications to this procedure. Here, we compared these spectrophotometry-based procedures for pyruvate analysis using a diverse collection of onion cultivars. The SW method always led to over-estimation of pyruvate levels in colored, but not in white, onions by up to 65%. Identification of light-absorbance interfering compounds was performed by spectrophotometry and HPLC analysis. Interference by quercetin and anthocyanins jointly accounted for more than 90% of the over-estimation of pyruvate. Pyruvate determinations according to AB significantly reduced absorbance interference from compounds other than pyruvate. This study provides evidence about the mechanistic basis underlying differences between the SW and AB methods for indirect assessment of onion flavor and nutraceutical value. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. Identifying the role of Wilms tumor 1 associated protein in cancer prediction using integrative genomic analyses.

    Science.gov (United States)

    Wu, Li-Sheng; Qian, Jia-Yi; Wang, Minghai; Yang, Haiwei

    2016-09-01

    The Wilms tumor suppressor, WT1 was first identified due to its essential role in the normal development of the human genitourinary system. Wilms tumor 1 associated protein (WTAP) was subsequently revealed to interact with WT1 using yeast two-hybrid screening. The present study identified 44 complete WTAP genes in the genomes of vertebrates, including fish, amphibians, birds and mammals. The vertebrate WTAP proteins clustered into the primate, rodent and teleost lineages using phylogenetic tree analysis. From 1,347 available SNPs in the human WTAP gene, 19 were identified to cause missense mutations. WTAP was expressed in bladder, blood, brain, breast, colorectal, esophagus, eye, head and neck, lung, ovarian, prostate, skin and soft tissue cancers. A total of 17 out of 328 microarrays demonstrated an association between WTAP gene expression and cancer prognosis. However, the association between WTAP gene expression and prognosis varied in distinct types of cancer, and even in identical types of cancer from separate microarray databases. By searching the Catalogue of Somatic Mutations in Cancer database, 65 somatic mutations were identified in the human WTAP gene from the cancer tissue samples. These results suggest that the function of WTAP in tumor formation may be multidimensional. Furthermore, signal transducer and activator of transcription 1, forkhead box protein O1, interferon regulatory factor 1, glucocorticoid receptor and peroxisome proliferator-activated receptor γ transcription factor binding sites were identified in the upstream (promoter) region of the human WTAP gene, suggesting that these transcription factors may be involved in WTAP functions in tumor formation.

  13. Sequence analyses and 3D structure prediction of two Type III ...

    African Journals Online (AJOL)

    Internet

    2012-04-17


  14. Meta-analyses on viral hepatitis

    DEFF Research Database (Denmark)

    Gluud, Lise L; Gluud, Christian

    2009-01-01

    This article summarizes the meta-analyses of interventions for viral hepatitis A, B, and C. Some of the interventions assessed are described in small trials with unclear bias control. Other interventions are supported by large, high-quality trials. Although attempts have been made to adjust...

  15. Good Governance Analysing Performance of Economic Community ...

    African Journals Online (AJOL)

    Good Governance Analysing Performance of Economic Community of West African States and Southern African Development Community Members on Mo Ibrahim Index of ... The Index is important, significant and appropriate because it outlines criteria and conditions deemed essential for Africans to live meaningful lives.

  16. Regression og geometrisk data analyse (2. del)

    DEFF Research Database (Denmark)

    Brinkkjær, Ulf

    2010-01-01

    The article seeks to show how regression analysis and geometric data analysis can be integrated. This is interesting because these methods are often presented as opposites, e.g. as an opposition between descriptive and explanatory methods. The first part of the article appeared in Praktiske Grunde 3-4 / 2007....

  17. Heritability estimates derived from threshold analyses for ...

    African Journals Online (AJOL)

    Unknown

    Abstract. The object of this study was to estimate heritabilities and sire breeding values for stayability and reproductive traits in a composite multibreed beef cattle herd using a threshold model. A GFCAT set of programmes was used to analyse reproductive data. Heritabilities and product-moment correlations between.

  18. Phonetic data and phonological analyses | Roux | Stellenbosch ...

    African Journals Online (AJOL)

    Stellenbosch Papers in Linguistics, Vol 1 (1978). Phonetic data and phonological analyses. JC Roux. Abstract.

  19. En kvantitativ metode til analyse af radio

    Directory of Open Access Journals (Sweden)

    Christine Lejre

    2014-06-01

    Full Text Available In the Danish as well as the international radio literature, proposed methods for analysing the radio medium are sparse. This is presumably because radio is difficult to analyse: it is a medium that is not visualised in the form of images or supported by printed text. This article aims to describe a new quantitative method for analysing radio that takes particular account of the modality of the radio medium: sound structured as a linear progression in time. The method thus supports radio both as a medium in time and as a blind medium. The method was developed in connection with a comparative analysis of cultural programmes on P1 and Radio24syv carried out for Danmarks Radio. The article argues that the method is well suited to analysing not only radio but also other media platforms and various journalistic subject areas.

  20. Cosmetology: Task Analyses. Competency-Based Education.

    Science.gov (United States)

    Henrico County Public Schools, Glen Allen, VA. Virginia Vocational Curriculum Center.

    These task analyses are designed to be used in combination with the "Trade and Industrial Education Service Area Resource" in order to implement competency-based education in the cosmetology program in Virginia. The task analysis document contains the task inventory, suggested task sequence lists, and content outlines for the secondary…

  1. Chromosomal evolution and phylogenetic analyses in Tayassu ...

    Indian Academy of Sciences (India)

    The phylogenetic relationships among the tayassuids are unclear and have instigated debate over the ... [Adega F., Chaves R. and Guedes-Pinto H. 2007 Chromosomal evolution and phylogenetic analyses in Tayassu pecari and Pecari tajacu (Tayassuidae): tales ...]

  2. Analyses of hydraulic performance of velocity caps

    DEFF Research Database (Denmark)

    Christensen, Erik Damgaard; Degn Eskesen, Mark Chr.; Buhrkall, Jeppe

    2014-01-01

    The hydraulic performance of a velocity cap has been investigated. Velocity caps are often used in connection with offshore intakes. CFD (computational fluid dynamics) was used to examine the flow through the cap openings and further down into the intake pipes. This was combined with dimension analyses...

  3. Quantitative analyses of shrinkage characteristics of neem ...

    African Journals Online (AJOL)

    Quantitative analyses of shrinkage characteristics of neem (Azadirachta indica A. Juss.) wood were carried out. Forty five wood specimens were prepared from the three ecological zones of north eastern Nigeria, viz: sahel savanna, sudan savanna and guinea savanna for the research. The results indicated that the wood ...

  4. Analysing User Lifetime in Voluntary Online Collaboration

    DEFF Research Database (Denmark)

    McHugh, Ronan; Larsen, Birger

    2010-01-01

    This paper analyses persuasion in online collaboration projects. It introduces a set of heuristics that can be applied to such projects and combines these with a quantitative analysis of user activity over time. Two example sites are studied: OpenStreetMap and The Pirate Bay. Results show...

  5. 10 CFR 61.13 - Technical analyses.

    Science.gov (United States)

    2010-01-01

    ... must clearly identify and differentiate between the roles performed by the natural disposal site... and segregation requirements will be met and that adequate barriers to inadvertent intrusion will be... need for ongoing active maintenance after closure must be based upon analyses of active natural...

  6. Chemical Analyses of Silicon Aerogel Samples

    Energy Technology Data Exchange (ETDEWEB)

    van der Werf, I.; Palmisano, F.; De Leo, Raffaele; Marrone, Stefano

    2008-04-01

    After five years of operation, two Aerogel counters, A1 and A2, taking data in Hall A at Jefferson Lab, suffered a loss of performance. In this note possible causes of the degradation are studied. In particular, various chemical and physical analyses have been carried out on several Aerogel tiles and on adhesive tape in order to reveal the presence of contaminants.

  7. Uncertainty quantification approaches for advanced reactor analyses.

    Energy Technology Data Exchange (ETDEWEB)

    Briggs, L. L.; Nuclear Engineering Division

    2009-03-24

    The original approach to nuclear reactor design or safety analyses was to make very conservative modeling assumptions so as to ensure meeting the required safety margins. Traditional regulation, as established by the U.S. Nuclear Regulatory Commission, required conservatisms which have subsequently been shown to be excessive. The commission has therefore moved away from excessively conservative evaluations and has determined best-estimate calculations to be an acceptable alternative to conservative models, provided the best-estimate results are accompanied by an uncertainty evaluation which can demonstrate that, when a set of analysis cases which statistically account for uncertainties of all types are generated, there is a 95% probability that at least 95% of the cases meet the safety margins. To date, nearly all published work addressing uncertainty evaluations of nuclear power plant calculations has focused on light water reactors and on large-break loss-of-coolant accident (LBLOCA) analyses. However, there is nothing in the uncertainty evaluation methodologies that is limited to a specific type of reactor or to specific types of plant scenarios. These same methodologies can be equally well applied to analyses for high-temperature gas-cooled reactors and to liquid metal reactors, and they can be applied to steady-state calculations, operational transients, or severe accident scenarios. This report reviews and compares both statistical and deterministic uncertainty evaluation approaches. Recommendations are given for selection of an uncertainty methodology and for considerations to be factored into the process of evaluating uncertainties for advanced reactor best-estimate analyses.
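
    The 95/95 criterion described above is commonly met in practice via Wilks' nonparametric tolerance-limit formula. Assuming the standard first-order, one-sided form, the minimum number of random-sampled code runs is the smallest n with 1 - 0.95^n >= 0.95.

    ```python
    # Minimum sample size for a first-order, one-sided Wilks tolerance
    # bound: with n random-sampled runs, the largest observed value bounds
    # the `coverage` quantile with the given confidence once
    # 1 - coverage**n >= confidence.
    def wilks_first_order(coverage=0.95, confidence=0.95):
        n = 1
        while 1 - coverage ** n < confidence:
            n += 1
        return n

    runs_needed = wilks_first_order()  # classic 95/95 result: 59 runs
    ```

    This is why best-estimate-plus-uncertainty analyses are often quoted as requiring 59 (or, for higher-order bounds, more) code runs.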

  8. Secundaire analyses organisatiebeleid psychosociale arbeidsbelasting (PSA)

    NARCIS (Netherlands)

    Kraan, K.O.; Houtman, I.L.D.

    2016-01-01

    What organisational policy on psychosocial workload (PSA) looks like in 2014, and how it relates to other policies and outcome measures, are the central questions of this study. The results of these in-depth analyses can benefit the ongoing campaign 'Check je

  9. Hybrid Logical Analyses of the Ambient Calculus

    DEFF Research Database (Denmark)

    Bolander, Thomas; Hansen, Rene Rydhof

    2010-01-01

    In this paper, hybrid logic is used to formulate three control flow analyses for Mobile Ambients, a process calculus designed for modelling mobility. We show that hybrid logic is very well-suited to express the semantic structure of the ambient calculus and how features of hybrid logic can...

  10. Comparison of elastic and inelastic analyses

    International Nuclear Information System (INIS)

    Ammerman, D.J.; Heinstein, M.W.; Wellman, G.W.

    1993-01-01

    In this paper, comparisons between elastic and inelastic analyses are made for a simple ring structure and for a package to transport a large quantity of RAM by rail (rail cask) with lead gamma shielding to illustrate the differences in the two analysis techniques. (J.P.N.)

  11. Microbiological And Physicochemical Analyses Of Oil Contaminated ...

    African Journals Online (AJOL)

    Michael Horsfall

    The physicochemical properties of the soil samples analysed show the pH ... Keywords: Oil-contaminated soil, microbial isolates, mechanical workshops and physicochemical parameters. Pollution of the environment by petroleum ... strains capable of degrading polyaromatic hydrocarbons have been isolated from soil and.

  12. Pecheries maritimes artisanales Togolaises : analyse des ...

    African Journals Online (AJOL)

    Togolese artisanal marine fisheries: analysis of landings and of the commercial value of the catches. K.M. Sedzro, E.D. Fiogbe, E.B. Guerra. Abstract. Subject description: Scientific knowledge of the pressure that artisanal fisheries exert on Togolese marine resources is necessary for ...

  13. Multivariate differential analyses of adolescents' experiences of ...

    African Journals Online (AJOL)

    Aggression is reasoned to be dependent on aspects such as self-concept, moral reasoning, communication, frustration tolerance and family relationships. To analyse the data from questionnaires of 101 families (95 adolescents, 95 mothers and 91 fathers) Cronbach Alpha, various consecutive first and second order factor ...

  14. The Economic Cost of Homosexuality: Multilevel Analyses

    Science.gov (United States)

    Baumle, Amanda K.; Poston, Dudley, Jr.

    2011-01-01

    This article builds on earlier studies that have examined "the economic cost of homosexuality," by using data from the 2000 U.S. Census and by employing multilevel analyses. Our findings indicate that partnered gay men experience a 12.5 percent earnings penalty compared to married heterosexual men, and a statistically insignificant earnings…

  15. Photovoltaic System Modeling. Uncertainty and Sensitivity Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, Clifford W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Martin, Curtis E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    We report an uncertainty and sensitivity analysis for modeling AC energy from photovoltaic systems. Output from a PV system is predicted by a sequence of models. We quantify uncertainty in the output of each model using empirical distributions of each model's residuals. We propagate uncertainty through the sequence of models by sampling these distributions to obtain an empirical distribution of a PV system's output. We consider models that: (1) translate measured global horizontal, direct and global diffuse irradiance to plane-of-array irradiance; (2) estimate effective irradiance; (3) predict cell temperature; (4) estimate DC voltage, current and power; (5) reduce DC power for losses due to inefficient maximum power point tracking or mismatch among modules; and (6) convert DC to AC power. Our analysis considers a notional PV system comprising an array of FirstSolar FS-387 modules and a 250 kW AC inverter; we use measured irradiance and weather at Albuquerque, NM. We found the uncertainty in PV system output to be relatively small, on the order of 1% for daily energy. We found the uncertainties in the models for POA irradiance and effective irradiance to be the dominant contributors to uncertainty in predicted daily energy. Our analysis indicates that efforts to reduce the uncertainty in PV system output predictions may yield the greatest improvements by focusing on the POA and effective irradiance models.
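    The propagation scheme described, sampling each model's empirical residual distribution and compounding the draws through the model chain, can be sketched as below. The residual distributions, their magnitudes, and the additive percent-error treatment are illustrative stand-ins, not Sandia's actual data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical baseline daily energy (kWh) and per-model residual samples;
# in the report each model's residuals come from validation against data.
base_energy = 1500.0
residuals_poa = rng.normal(0.0, 0.8, size=200)   # plane-of-array model errors (%)
residuals_eff = rng.normal(0.0, 0.5, size=200)   # effective-irradiance errors (%)
residuals_inv = rng.normal(0.0, 0.3, size=200)   # inverter model errors (%)

def propagate(n_samples=10000):
    """Resample each empirical residual distribution and compound the errors."""
    draws = [rng.choice(r, size=n_samples) for r in
             (residuals_poa, residuals_eff, residuals_inv)]
    total_pct = np.sum(draws, axis=0)            # additive percent errors
    return base_energy * (1.0 + total_pct / 100.0)

energy = propagate()
print(round(float(np.std(energy) / base_energy * 100), 2))  # relative uncertainty, %
```

The spread of the resulting empirical distribution plays the role of the roughly 1% daily-energy uncertainty quoted in the abstract, and enlarging one residual distribution at a time gives the sensitivity ranking.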

  16. Grey literature in meta-analyses.

    Science.gov (United States)

    Conn, Vicki S; Valentine, Jeffrey C; Cooper, Harris M; Rantz, Marilyn J

    2003-01-01

    In meta-analysis, researchers combine the results of individual studies to arrive at cumulative conclusions. Meta-analysts sometimes include "grey literature" in their evidential base, which includes unpublished studies and studies published outside widely available journals. Because grey literature is a source of data that might not employ peer review, critics have questioned the validity of its data and the results of meta-analyses that include it. To examine evidence regarding whether grey literature should be included in meta-analyses and strategies to manage grey literature in quantitative synthesis. This article reviews evidence on whether the results of studies published in peer-reviewed journals are representative of results from broader samplings of research on a topic as a rationale for inclusion of grey literature. Strategies to enhance access to grey literature are addressed. The most consistent and robust difference between published and grey literature is that published research is more likely to contain results that are statistically significant. Effect size estimates of published research are about one-third larger than those of unpublished studies. Unfunded and small sample studies are less likely to be published. Yet, importantly, methodological rigor does not differ between published and grey literature. Meta-analyses that exclude grey literature likely (a) over-represent studies with statistically significant findings, (b) inflate effect size estimates, and (c) provide less precise effect size estimates than meta-analyses including grey literature. Meta-analyses should include grey literature to fully reflect the existing evidential base and should assess the impact of methodological variations through moderator analysis.

  17. Making detailed predictions makes (some) predictions worse

    Science.gov (United States)

    Kelly, Theresa F.

    In this paper, we investigate whether making detailed predictions about an event makes other predictions worse. Across 19 experiments, 10,895 participants, and 415,960 predictions about 724 professional sports games, we find that people who made detailed predictions about sporting events (e.g., how many hits each baseball team would get) made worse predictions about more general outcomes (e.g., which team would win). We rule out that this effect is caused by inattention or fatigue, thinking too hard, or a differential reliance on holistic information about the teams. Instead, we find that thinking about game-relevant details before predicting winning teams causes people to give less weight to predictive information, presumably because predicting details makes information that is relatively useless for predicting the winning team more readily accessible in memory and therefore incorporated into forecasts. Furthermore, we show that this differential use of information can be used to predict what kinds of games will and will not be susceptible to the negative effect of making detailed predictions.

  18. Disruption prediction at JET

    International Nuclear Information System (INIS)

    Milani, F.

    1998-12-01

    The sudden loss of plasma magnetic confinement, known as a disruption, is one of the major issues in a nuclear fusion machine such as JET (Joint European Torus). Disruptions pose very serious problems for the safety of the machine. The energy stored in the plasma is released to the machine structure in a few milliseconds, resulting in forces that at JET reach several meganewtons. The problem is even more severe in a nuclear fusion power station, where the forces are on the order of one hundred meganewtons. The events that occur during a disruption are still not well understood, even if some mechanisms that can lead to a disruption have been identified and can be used to predict them. Unfortunately, it is always a combination of these events that generates a disruption, and therefore it is not possible to use simple algorithms to predict it. This thesis analyses the possibility of using neural network algorithms to predict plasma disruptions in real time. This involves the determination of plasma parameters every few milliseconds. A plasma boundary reconstruction algorithm, XLOC, has been developed in collaboration with Dr. D. O'Brien and Dr. J. Ellis, capable of determining the plasma/wall distance every 2 milliseconds. The XLOC output has been used to develop a multilayer perceptron network to determine plasma parameters such as l_i and q_ψ, with which a machine operational space has been experimentally defined. If the limits of this operational space are breached, the disruption probability increases considerably. Another approach to predicting disruptions is to use neural network classification methods to define the JET operational space. Two methods have been studied. The first method uses a multilayer perceptron network with a softmax activation function for the output layer. This method can be used for classifying the input patterns into various classes. In this case the plasma input patterns have been divided between disrupting and safe patterns, giving the possibility of
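    The softmax output layer mentioned for the classification network maps the final-layer activations to class probabilities that sum to one, which is what lets the two outputs be read as disrupting/safe probabilities. A toy forward pass (the weights, layer sizes and the two plasma inputs are invented for illustration; the real network was trained on XLOC-derived parameters):

```python
import numpy as np

def softmax(z):
    """Normalized exponentials: outputs are positive and sum to 1."""
    e = np.exp(z - z.max())  # subtract max for numerical stability
    return e / e.sum()

# Toy forward pass: plasma parameters -> hidden layer -> {disrupting, safe}
rng = np.random.default_rng(42)
x = np.array([0.8, 2.9])             # illustrative l_i and q_psi values
W1, b1 = rng.normal(size=(4, 2)), np.zeros(4)
W2, b2 = rng.normal(size=(2, 4)), np.zeros(2)

hidden = np.tanh(W1 @ x + b1)
probs = softmax(W2 @ hidden + b2)
print(round(float(probs.sum()), 6))  # probabilities over the two classes sum to 1.0
```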

  19. Sorption analyses in materials science: selected oxides

    International Nuclear Information System (INIS)

    Fuller, E.L. Jr.; Condon, J.B.; Eager, M.H.; Jones, L.L.

    1981-01-01

    Physical adsorption studies have been shown to be extremely valuable in studying the chemistry and structure of dispersed materials. Many processes rely on the access to the large amount of surface made available by the high degree of dispersion. Conversely, there are many applications where consolidation of the dispersed solids is required. Several systems (silica gel, alumina catalysts, mineralogic alumino-silicates, and yttrium oxide plasters) have been studied to show the type and amount of chemical and structural information that can be obtained. Some review of current theories is given and additional concepts are developed based on statistical and thermodynamic arguments. The results are applied to sorption data to show that detailed sorption analyses are extremely useful and can provide valuable information that is difficult to obtain by any other means. Considerable emphasis has been placed on data analyses and interpretation of a nonclassical nature to show the potential of such studies that is often not recognized nor utilized

  20. DCH analyses using the CONTAIN code

    International Nuclear Information System (INIS)

    Hong, Sung Wan; Kim, Hee Dong

    1996-08-01

    This report describes CONTAIN analyses performed during participation in the project 'DCH issue resolution for ice condenser plants', which is sponsored by NRC at SNL. Even though the calculations were performed for the Ice Condenser plant, the CONTAIN code has been used for analyses of many phenomena in the PWR containment, and the DCH module can be applied to any plant type. The present ice condenser issue resolution effort is intended to provide guidance as to what might be needed to resolve DCH for ice condenser plants. It includes both a screening analysis and a scoping study if the screening analysis cannot provide a complete resolution. The following are the results concerning DCH loads, in descending order: 1. Availability of ignition sources prior to vessel breach; 2. Availability and effectiveness of ice in the ice condenser; 3. Loads modeling uncertainties related to co-ejected RPV water; 4. Other loads modeling uncertainties. 10 tabs., 3 figs., 14 refs. (Author)

  1. DCH analyses using the CONTAIN code

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Sung Wan; Kim, Hee Dong [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1996-08-01

    This report describes CONTAIN analyses performed during participation in the project 'DCH issue resolution for ice condenser plants', which is sponsored by NRC at SNL. Even though the calculations were performed for the Ice Condenser plant, the CONTAIN code has been used for analyses of many phenomena in the PWR containment, and the DCH module can be applied to any plant type. The present ice condenser issue resolution effort is intended to provide guidance as to what might be needed to resolve DCH for ice condenser plants. It includes both a screening analysis and a scoping study if the screening analysis cannot provide a complete resolution. The following are the results concerning DCH loads, in descending order: 1. Availability of ignition sources prior to vessel breach; 2. Availability and effectiveness of ice in the ice condenser; 3. Loads modeling uncertainties related to co-ejected RPV water; 4. Other loads modeling uncertainties. 10 tabs., 3 figs., 14 refs. (Author).

  2. Analysing organic transistors based on interface approximation

    International Nuclear Information System (INIS)

    Akiyama, Yuto; Mori, Takehiko

    2014-01-01

    Temperature-dependent characteristics of organic transistors are analysed thoroughly using interface approximation. In contrast to amorphous silicon transistors, it is characteristic of organic transistors that the accumulation layer is concentrated on the first monolayer, and it is appropriate to consider interface charge rather than band bending. On the basis of this model, observed characteristics of hexamethylenetetrathiafulvalene (HMTTF) and dibenzotetrathiafulvalene (DBTTF) transistors with various surface treatments are analysed, and the trap distribution is extracted. In turn, starting from a simple exponential distribution, we can reproduce the temperature-dependent transistor characteristics as well as the gate voltage dependence of the activation energy, so we can investigate various aspects of organic transistors self-consistently under the interface approximation. Small deviation from such an ideal transistor operation is discussed assuming the presence of an energetically discrete trap level, which leads to a hump in the transfer characteristics. The contact resistance is estimated by measuring the transfer characteristics up to the linear region

  3. Introduction à l'analyse fonctionnelle

    CERN Document Server

    Reischer, Corina; Hengartner, Walter

    1981-01-01

    The fruit of a collaboration between Professor Walter Hengartner of Université Laval and Marcel Lambert and Corina Reischer of the Université du Québec à Trois-Rivières, Introduction à l'analyse fonctionnelle stands out as much for the breadth of its content as for the accessibility of its presentation. Without conceding anything in rigour, it is perfectly suited to a first course in functional analysis. While intended first of all for students of mathematics, it will certainly be useful to graduate students in science and engineering.

  4. Pratique de l'analyse fonctionelle

    CERN Document Server

    Tassinari, Robert

    1997-01-01

    Developing a product or service that is perfectly suited to the client's needs and requirements is essential for a company. To leave nothing to chance, a rigorous methodology must be followed: that of functional analysis. This book defines the method precisely, along with its fields of application. It describes the most effective methods in terms of product design and the pursuit of quality, and introduces the notion of internal functional analysis. A key book for optimising product design processes within a company. -- Key ideas, by Business Digest

  5. New environmental metabarcodes for analysing soil DNA

    DEFF Research Database (Denmark)

    Epp, Laura S.; Boessenkool, Sanne; Bellemain, Eva P.

    2012-01-01

    Metabarcoding approaches use total and typically degraded DNA from environmental samples to analyse biotic assemblages and can potentially be carried out for any kind of organism in an ecosystem. These analyses rely on specific markers, here called metabarcodes, which should be optimized for taxonomic resolution, minimal bias in amplification of the target organism group and short sequence length. Using bioinformatic tools, we developed metabarcodes for several groups of organisms: fungi, bryophytes, enchytraeids, beetles and birds. The ability of these metabarcodes to amplify the target groups was systematically evaluated by (i) in silico PCRs using all standard sequences in the EMBL public database as templates, (ii) in vitro PCRs of DNA extracts from surface soil samples from a site in Varanger, northern Norway and (iii) in vitro PCRs of DNA extracts from permanently frozen sediment samples of late...

  6. BWR core melt progression phenomena: Experimental analyses

    International Nuclear Information System (INIS)

    Ott, L.J.

    1992-01-01

    In the BWR Core Melt Progression Phenomena Program, experimental results concerning severe fuel damage and core melt progression in BWR core geometry are used to evaluate existing models of the governing phenomena. These include control blade eutectic liquefaction and the subsequent relocation and attack on the channel box structure; oxidation heating and hydrogen generation; Zircaloy melting and relocation; and the continuing oxidation of zirconium with metallic blockage formation. Integral data have been obtained from the BWR DF-4 experiment in the ACRR and from BWR tests in the German CORA ex-reactor fuel-damage test facility. Additional integral data will be obtained from new CORA BWR tests, the full-length FLHT-6 BWR test in the NRU test reactor, and the new program of ex-reactor experiments at Sandia National Laboratories (SNL) on metallic melt relocation and blockage formation. An essential part of this activity is the interpretation and use of the results of the BWR tests. The Oak Ridge National Laboratory (ORNL) has developed experiment-specific models for analysis of the BWR experiments; to date, these models have permitted far more precise analyses of the conditions in these experiments than were previously available. These analyses have provided a basis for more accurate interpretation of the phenomena that the experiments are intended to investigate. The results of posttest analyses of BWR experiments are discussed and significant findings from these analyses are explained. The ORNL control blade/canister models, with materials interaction, relocation and blockage models, are currently being implemented in SCDAP/RELAP5 as an optional structural component

  7. Analysing qualitative research data using computer software.

    Science.gov (United States)

    McLafferty, Ella; Farley, Alistair H

    An increasing number of clinical nurses are choosing to undertake qualitative research. A number of computer software packages designed for the management and analysis of qualitative data are available. However, while it is claimed that the use of these programs is also increasing, this claim is not supported by a search of recent publications. This paper discusses the advantages and disadvantages of using computer software packages to manage and analyse qualitative data.

  8. En Billig GPS Data Analyse Platform

    DEFF Research Database (Denmark)

    Andersen, Ove; Christiansen, Nick; Larsen, Niels T.

    2011-01-01

    This article presents a complete software platform for the analysis of GPS data. The platform is built exclusively from open-source components. The individual components of the platform are described in detail. The advantages and disadvantages of using open source are discussed, including which IT policy initiatives ... organisations with a digital road map and GPS data can begin to carry out traffic analyses on these data. It is a requirement that suitable IT competencies are present in the organisation.

  9. Analysing customer behaviour in mobile app usage

    OpenAIRE

    Chen, Qianling; Zhang, Min; Zhao, Xiande

    2017-01-01

    Purpose – Big data produced by mobile apps contains valuable knowledge about customers and markets and has been viewed as productive resources. This study proposes a multiple methods approach to elicit intelligence and value from big data by analysing customer behaviour in mobile app usage. Design/methodology/approach – The big data analytical approach is developed using three data mining techniques: RFM (Recency, Frequency, Monetary) analysis, link analysis, and association rule learning. We...
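    Of the three data mining techniques named, RFM analysis is the most mechanical: each customer is ranked on how recently, how often, and how much they engaged. A toy scoring sketch over invented app-usage records (this 1–3 ranking scheme is a simplification of the quintile scoring often used in practice, and is not taken from the study):

```python
from datetime import date

# Hypothetical app-usage log: (user, last_active, sessions, in_app_spend)
usage = [
    ("u1", date(2017, 3, 28), 42, 19.99),
    ("u2", date(2017, 1, 5),  3,  0.00),
    ("u3", date(2017, 3, 30), 80, 54.50),
]

def rfm_scores(records, today=date(2017, 4, 1)):
    """Rank each user 1..n on Recency, Frequency and Monetary value (n = best)."""
    recency   = sorted(records, key=lambda r: (today - r[1]).days)  # fewest days = best
    frequency = sorted(records, key=lambda r: -r[2])                # most sessions = best
    monetary  = sorted(records, key=lambda r: -r[3])                # highest spend = best
    score = lambda ranking, rec: len(records) - ranking.index(rec)
    return {r[0]: (score(recency, r), score(frequency, r), score(monetary, r))
            for r in records}

print(rfm_scores(usage))  # {'u1': (2, 2, 2), 'u2': (1, 1, 1), 'u3': (3, 3, 3)}
```

Segments such as "high-value recent users" then fall out of the combined (R, F, M) triples, which can feed the link analysis and association-rule steps the abstract mentions.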

  10. Modelling and analysing oriented fibrous structures

    International Nuclear Information System (INIS)

    Rantala, M; Lassas, M; Siltanen, S; Sampo, J; Takalo, J; Timonen, J

    2014-01-01

    A mathematical model for fibrous structures using a direction-dependent scaling law is presented. The orientation of fibrous nets (e.g. paper) is analysed with a method based on the curvelet transform. The curvelet-based orientation analysis has been tested successfully on real data from paper samples: the major directions of fibre orientation can apparently be recovered. Similar results are achieved in tests on data simulated by the new model, allowing a comparison with ground truth

  11. Steady State Thermal Analyses of SCEPTOR X-57 Wingtip Propulsion

    Science.gov (United States)

    Schnulo, Sydney L.; Chin, Jeffrey C.; Smith, Andrew D.; Dubois, Arthur

    2017-01-01

    Electric aircraft concepts enable advanced propulsion airframe integration approaches that promise increased efficiency as well as reduced emissions and noise. NASA's fully electric Maxwell X-57, developed under the SCEPTOR program, features distributed propulsion across a high-aspect-ratio wing. There are 14 propulsors in all: 12 high-lift motors that are only active during takeoff and climb, and 2 larger motors positioned on the wingtips that operate over the entire mission. The power electronics involved in the wingtip propulsion are temperature sensitive and therefore require thermal management. This work focuses on the high- and low-fidelity heat transfer analysis methods performed to ensure that the wingtip motor inverters do not reach their temperature limits. It also explores different geometry configurations involved in the X-57 development and any associated thermal concerns. All analyses presented are performed at steady state under stressful operating conditions, therefore predicting worst-case temperatures so as to remain conservative.

  12. Spent fuel shipping costs for transportation logistics analyses

    International Nuclear Information System (INIS)

    Cole, B.M.; Cross, R.E.; Cashwell, J.W.

    1983-05-01

    Logistics analyses supplied to the nuclear waste management programs of the U.S. Department of Energy through the Transportation Technology Center (TTC) at Sandia National Laboratories are used to predict nuclear waste material logistics, transportation packaging demands, shipping and receiving rates and transportation-related costs for alternative strategies. This study is an in-depth analysis of the problems and contingencies associated with the costs of shipping irradiated reactor fuel. These costs are extremely variable, however, and have changed frequently (sometimes monthly) during the past few years due to changes in capital, fuel, and labor costs. All costs and charges reported in this study are based on January 1982 data using existing transport cask systems and should be used as relative indices only. Actual shipping costs would be negotiable for each origin-destination combination

  13. Gamma-ray spectrometric analyses of some Nigerian rock samples

    International Nuclear Information System (INIS)

    Uwah, E.J.; Ajakaiye, D.E.

    1990-01-01

    Approximate uranium concentrations in rock samples from the Sokoto Basin of Nigeria were predicted from γ-equivalent uranium and thorium (eU and eTh) measurements which made use of similar rock samples, previously analyzed by delayed neutron counting (DNC) and x-ray fluorescence (XRF) techniques, as reference materials. Comparison of the results of the 3 techniques shows that the eU values approximate the DNC results more than the XRF results do, with standard error of estimate of ±6.68 ppm eU and correlation coefficient of 0.984. Corresponding values for XRF analyses are ±32.53 ppm U and 0.730, respectively. (author)

  14. Compilation of Sandia coal char combustion data and kinetic analyses

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, R.E.; Hurt, R.H.; Baxter, L.L.; Hardesty, D.R.

    1992-06-01

    An experimental project was undertaken to characterize the physical and chemical processes that govern the combustion of pulverized coal chars. The experimental endeavor establishes a database on the reactivities of coal chars as a function of coal type, particle size, particle temperature, gas temperature, and gas composition. The project also provides a better understanding of the mechanism of char oxidation, and yields quantitative information on the release rates of nitrogen- and sulfur-containing species during char combustion. An accurate predictive engineering model of the overall char combustion process under technologically relevant conditions is a primary product of this experimental effort. This document summarizes the experimental effort, the approach used to analyze the data, and individual compilations of data and kinetic analyses for each of the parent coals investigated.

  15. Contract Dynamics : Lessons from Empirical Analyses

    OpenAIRE

    Magali Chaudey

    2010-01-01

    Working paper GATE 2010-35; The recognition that contracts have a time dimension has given rise to a very abundant literature since the end of the 1980s. In such a dynamic context, the contract may take place over several periods and develop repeated interactions. Then, the principal topics of the analysis are commitment, reputation, memory and the renegotiation of the contract. Few papers have tried to apply the predictions of dynamic contract theory to data. The examples of applications int...

  16. Comparative genomic analyses of the Taylorellae.

    Science.gov (United States)

    Hauser, Heidi; Richter, Daniel C; van Tonder, Andries; Clark, Louise; Preston, Andrew

    2012-09-14

    Contagious equine metritis (CEM) is an important venereal disease of horses that is of concern to the thoroughbred industry. Taylorella equigenitalis is a causative agent of CEM but very little is known about it or its close relative Taylorella asinigenitalis. To reveal novel information about Taylorella biology, comparative genomic analyses were undertaken. Whole genome sequencing was performed for the T. equigenitalis type strain, NCTC11184. Draft genome sequences were produced for a second T. equigenitalis strain and for a strain of T. asinigenitalis. These genome sequences were analysed and compared to each other and the recently released genome sequence of T. equigenitalis MCE9. These analyses revealed that T. equigenitalis strains appear to be very similar to each other with relatively little strain-specific DNA content. A number of genes were identified that encode putative toxins and adhesins that are possibly involved in infection. Analysis of T. asinigenitalis revealed that it has a very similar gene repertoire to that of T. equigenitalis but shares surprisingly little DNA sequence identity with it. The generation of genome sequence information greatly increases knowledge of these poorly characterised bacteria and greatly facilitates study of them. Copyright © 2012 Elsevier B.V. All rights reserved.

  17. An analyser for power plant operations

    International Nuclear Information System (INIS)

    Rogers, A.E.; Wulff, W.

    1990-01-01

    Safe and reliable operation of power plants is essential. Power plant operators need a forecast of what the plant will do when its current state is disturbed. The in-line plant analyser provides precisely this information at relatively low cost. The plant analyser scheme uses a mathematical model of the dynamic behaviour of the plant to establish a numerical simulation. Over a period of time, the simulation is calibrated with measurements from the particular plant in which it is used. The analyser then provides a reference against which to evaluate the plant's current behaviour. It can be used to alert the operator to any atypical excursions or combinations of readings that indicate malfunction or off-normal conditions that, as the Three Mile Island event suggests, are not easily recognised by operators. In a look-ahead mode, it can forecast the behaviour resulting from an intended change in settings or operating conditions. Then, when such changes are made, the plant's behaviour can be tracked against the forecast in order to assure that the plant is behaving as expected. It can be used to investigate malfunctions that have occurred and test possible adjustments in operating procedures. Finally, it can be used to consider how far from the limits of performance the elements of the plant are operating. Then by adjusting settings, the required power can be generated with as little stress as possible on the equipment. (6 figures) (Author)

  18. Rotational distortion in conventional allometric analyses.

    Science.gov (United States)

    Packard, Gary C

    2011-08-01

    Three data sets from the recent literature were submitted to new analyses to illustrate the rotational distortion that commonly accompanies traditional allometric analyses and that often causes allometric equations to be inaccurate and misleading. The first investigation focused on the scaling of evaporative water loss to body mass in passerine birds; the second was concerned with the influence of body size on field metabolic rates of rodents; and the third addressed interspecific variation in kidney mass among primates. Straight lines were fitted to logarithmic transformations by Ordinary Least Squares and Generalized Linear Models, and the resulting equations then were re-expressed as two-parameter power functions in the original arithmetic scales. The re-expressed models were displayed on bivariate graphs together with tracings for equations fitted directly to untransformed data by nonlinear regression. In all instances, models estimated by back-transformation failed to describe major features of the arithmetic distribution whereas equations fitted by nonlinear regression performed quite well. The poor performance of equations based on models fitted to logarithms can be traced to the increased weight and leverage exerted in those analyses by observations for small species and to the decreased weight and leverage exerted by large ones. The problem of rotational distortion can be avoided by performing exploratory analysis on untransformed values and by validating fitted models in the scale of measurement. Copyright © 2011 Elsevier Inc. All rights reserved.
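    The distortion described arises because least squares on logarithms minimizes relative rather than absolute error, so the back-transformed power function tracks the geometric (not arithmetic) mean of the response and is pulled by small-bodied species. A small numpy illustration on synthetic data (the exponent, coefficient and noise level are invented):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic allometry: y = 2 * x**0.75 with multiplicative lognormal noise
x = np.linspace(1.0, 1000.0, 50)
y = 2.0 * x ** 0.75 * rng.lognormal(mean=0.0, sigma=0.4, size=x.size)

# Traditional approach: ordinary least squares on the logarithms, then
# back-transform the straight line into a two-parameter power function
slope, intercept = np.polyfit(np.log(x), np.log(y), 1)
y_back = np.exp(intercept) * x ** slope

# The log-space fit targets the *geometric* mean of y, which by Jensen's
# inequality sits below the arithmetic mean a nonlinear fit would target
print(np.exp(np.mean(np.log(y))) < np.mean(y))  # True
```

Fitting the power function directly to the untransformed data by nonlinear regression, as the article recommends, removes this systematic underestimation and the rotation induced by the unequal leverage of small observations.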

  19. ALBEDO PATTERN RECOGNITION AND TIME-SERIES ANALYSES IN MALAYSIA

    Directory of Open Access Journals (Sweden)

    S. A. Salleh

    2012-07-01

    Full Text Available Pattern recognition and time-series analyses enable the evaluation and prediction of specific phenomena. Albedo pattern and time-series analyses are particularly useful for climate-condition monitoring. This study seeks to identify changes in Malaysia's albedo pattern; the recognized patterns and changes are useful for a variety of environmental and climate monitoring applications such as carbon budgeting and aerosol mapping. Ten years (2000–2009) of MODIS satellite images were used for the analyses and interpretation. These images were processed using ERDAS Imagine remote sensing software, ArcGIS 9.3, the 6S code for atmospheric calibration, and several MODIS tools (MRT, HDF2GIS, Albedo tools). Several methods of time-series analysis were explored; this paper demonstrates trend and seasonal time-series analyses using the HDF-format MODIS MCD43A3 albedo land product. The results revealed significant changes in albedo percentages over the past 10 years, and in the pattern with respect to Malaysia's nebulosity index (NI) and aerosol optical depth (AOD). A noticeable trend can be identified in the maximum and minimum albedo values, and the rises and falls of the line graph follow a similar trend in the daily observations; the differences lie in the magnitude of the rises and falls. It can thus be concluded that the temporal behaviour of land surface albedo in Malaysia is uniform and responds to the local monsoons. However, although the average albedo shows a linear trend with the nebulosity index, the pattern of albedo changes with respect to the nebulosity index indicates that external factors also influence the albedo values, as the plotted sky conditions and diffusion do not show a uniform trend over the years, especially when the 5-year-interval trend is examined: 2000 shows high
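The trend-and-seasonal decomposition the record mentions can be sketched on a synthetic monthly series; the 12-month moving average and month-by-month averaging below are a generic stand-in for the paper's MODIS MCD43A3 workflow, not its actual code:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic monthly "albedo" series for 2000-2009: a slow linear trend
# plus an annual (monsoon-like) cycle plus noise.
months = np.arange(120)
trend_true = 0.15 + 0.0002 * months
seasonal_true = 0.02 * np.sin(2 * np.pi * months / 12)
series = trend_true + seasonal_true + rng.normal(0, 0.002, months.size)

# Trend: 12-month moving average (the ends are left as NaN).
kernel = np.ones(12) / 12
trend = np.full(series.size, np.nan)
trend[6:-5] = np.convolve(series, kernel, mode="valid")

# Seasonal component: average the detrended series by calendar month.
detrended = series - trend
seasonal = np.array([np.nanmean(detrended[m::12]) for m in range(12)])
seasonal -= seasonal.mean()        # force the annual cycle to sum to zero

amplitude = seasonal.max() - seasonal.min()
print(f"estimated seasonal amplitude: {amplitude:.3f}")
```

The recovered seasonal cycle peaks in the month where the synthetic cycle peaks, and its amplitude matches the value built into the series.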

  20. Predictability of blocking

    International Nuclear Information System (INIS)

    Tosi, E.; Ruti, P.; Tibaldi, S.; D'Andrea, F.

    1994-01-01

    Tibaldi and Molteni (1990, hereafter referred to as TM) had previously investigated operational blocking predictability by the ECMWF model and the possible relationships between model systematic error and blocking in the winter season of the Northern Hemisphere, using seven years of ECMWF operational archives of analyses and day 1 to 10 forecasts. They showed that fewer blocking episodes than in the real atmosphere were generally simulated by the model, and that this deficiency increased with increasing forecast time. As a consequence of this, a major contribution to the systematic error in the winter season was shown to derive from the inability of the model to properly forecast blocking. In this study, the analysis performed in TM for the first seven winter seasons of the ECMWF operational model is extended to the subsequent five winters, during which model development, reflecting both resolution increases and parametrisation modifications, continued unabated. In addition the objective blocking index developed by TM has been applied to the observed data to study the natural low frequency variability of blocking. The ability to simulate blocking of some climate models has also been tested

  1. Analyses of the OSU-MASLWR Experimental Test Facility

    Directory of Open Access Journals (Sweden)

    F. Mascari

    2012-01-01

    Full Text Available Today, considering the sustainability of nuclear technology in the energy-mix policies of developing and developed countries, the international community has started the development of new advanced reactor designs. In this framework, Oregon State University (OSU) has constructed a system-level test facility to examine natural-circulation phenomena of importance to the multi-application small light water reactor (MASLWR) design, a small modular pressurized water reactor (PWR) relying on natural circulation during both steady-state and transient operation. The aim of this paper is to review the main characteristics of the experimental facility, to analyse the main phenomena characterizing the tests already performed and the potential transients that could be investigated in the facility, and to describe the current IAEA International Collaborative Standard Problem being hosted at OSU, for which experimental data will be collected at the OSU-MASLWR test facility. A summary of the best-estimate thermal-hydraulic system code analyses already performed to assess the codes' capability in predicting the phenomena typical of the MASLWR prototype, as thermal-hydraulically characterized in the OSU-MASLWR facility, is presented as well.

  2. Progress Report on Computational Analyses of Water-Based NSTF

    Energy Technology Data Exchange (ETDEWEB)

    Lv, Q. [Argonne National Lab. (ANL), Argonne, IL (United States); Kraus, A. [Argonne National Lab. (ANL), Argonne, IL (United States); Hu, R. [Argonne National Lab. (ANL), Argonne, IL (United States); Bucknor, M. [Argonne National Lab. (ANL), Argonne, IL (United States); Lisowski, D. [Argonne National Lab. (ANL), Argonne, IL (United States); Nunez, D. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2017-08-01

    CFD analysis has been focused on important component-level phenomena using STARCCM+ to supplement the system analysis of integral system behavior. A notable area of interest was the cavity region. This area is of particular interest for CFD analysis due to the multi-dimensional flow and complex heat transfer (thermal radiation heat transfer and natural convection), which are not simulated directly by RELAP5. CFD simulations allow for the estimation of the boundary heat flux distribution along the riser tubes, which is needed in the RELAP5 simulations. The CFD results can also provide additional data to help establish what level of modeling detail is necessary in RELAP5. It was found that the flow profiles in the cavity region are simpler for the water-based concept than for the air-cooled concept. The local heat flux noticeably increases axially, and is higher in the fins than in the riser tubes. These results were utilized in RELAP5 simulations as boundary conditions, to provide better temperature predictions in the system level analyses. It was also determined that temperatures were higher in the fins than the riser tubes, but within design limits for thermal stresses. Higher temperature predictions were identified in the edge fins, in part due to additional thermal radiation from the side cavity walls.

  3. ANALYSE THE PERFORMANCE OF ENSEMBLE CLASSIFIERS USING SAMPLING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    M. Balamurugan

    2016-07-01

    Full Text Available In ensemble classifiers, the combination of multiple prediction models is important for making progress on a variety of difficult prediction problems, and ensembles have proven capable of higher accuracy than single classifiers. Even so, there is still a need to improve their performance, and many approaches are available for doing so. One of them is sampling, which plays a major role in improving the quality of an ensemble classifier because it helps reduce bias in the ensemble's input data set. Sampling is the process of extracting a subset of samples from the original dataset. In this research work, sampling techniques for ensemble classifiers are analysed. Ensemble classifiers typically use probability-based sampling techniques: samples are gathered by a process that gives every individual in the population an equal chance of selection, so that sampling bias is removed. In this paper, we analyse the performance of ensemble classifiers under various sampling techniques and list their drawbacks.
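The role equal-chance (bootstrap-style) sampling plays in an ensemble can be sketched with decision stumps on a toy dataset; this is a generic bagging illustration, not the paper's experimental setup:

```python
import random

def train_stump(points):
    """Pick the threshold on x that best separates the two classes."""
    best = None
    for t in sorted({x for x, _ in points}):
        err = sum((x > t) != y for x, y in points)
        err = min(err, len(points) - err)          # allow either polarity
        if best is None or err < best[0]:
            pol = sum((x > t) == y for x, y in points) >= len(points) / 2
            best = (err, t, pol)
    _, t, pol = best
    return lambda x: (x > t) == pol

def bagged_ensemble(points, n_models, rng):
    """Train each stump on a bootstrap sample: drawing with replacement
    gives every record an equal chance, so no single draw biases the
    ensemble, and the stumps' majority vote smooths individual errors."""
    models = []
    for _ in range(n_models):
        sample = [rng.choice(points) for _ in points]
        models.append(train_stump(sample))
    return lambda x: sum(m(x) for m in models) > n_models / 2

rng = random.Random(42)
data = [(x, x >= 5) for x in range(10)]            # toy 1-D dataset
predict = bagged_ensemble(data, n_models=25, rng=rng)
print([predict(x) for x in (1, 9)])
```

Each stump sees a slightly different resample, yet the majority vote classifies points far from the decision boundary reliably.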

  4. Pegasys: software for executing and integrating analyses of biological sequences

    Directory of Open Access Journals (Sweden)

    Lett Drew

    2004-04-01

    Full Text Available Abstract Background We present Pegasys – a flexible, modular and customizable software system that facilitates the execution and data integration from heterogeneous biological sequence analysis tools. Results The Pegasys system includes numerous tools for pair-wise and multiple sequence alignment, ab initio gene prediction, RNA gene detection, masking repetitive sequences in genomic DNA as well as filters for database formatting and processing raw output from various analysis tools. We introduce a novel data structure for creating workflows of sequence analyses and a unified data model to store its results. The software allows users to dynamically create analysis workflows at run-time by manipulating a graphical user interface. All non-serial dependent analyses are executed in parallel on a compute cluster for efficiency of data generation. The uniform data model and backend relational database management system of Pegasys allow for results of heterogeneous programs included in the workflow to be integrated and exported into General Feature Format for further analyses in GFF-dependent tools, or GAME XML for import into the Apollo genome editor. The modularity of the design allows for new tools to be added to the system with little programmer overhead. The database application programming interface allows programmatic access to the data stored in the backend through SQL queries. Conclusions The Pegasys system enables biologists and bioinformaticians to create and manage sequence analysis workflows. The software is released under the Open Source GNU General Public License. All source code and documentation is available for download at http://bioinformatics.ubc.ca/pegasys/.
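The scheduling idea above, executing all non-serially-dependent analyses in parallel, amounts to walking the workflow DAG in dependency order. A minimal sketch with illustrative step names; no actual Pegasys API is used:

```python
# Illustrative analysis workflow: each step lists the steps whose
# output it depends on.
deps = {
    "mask_repeats": [],
    "gene_prediction": ["mask_repeats"],
    "rna_detection": ["mask_repeats"],
    "alignment": [],
    "merge_to_gff": ["gene_prediction", "rna_detection", "alignment"],
}

def parallel_waves(deps):
    """Group steps into 'waves': every step in a wave has all of its
    dependencies completed, so a wave can run in parallel on a cluster."""
    remaining = dict(deps)
    done, waves = set(), []
    while remaining:
        ready = [s for s, ds in remaining.items() if all(d in done for d in ds)]
        if not ready:
            raise ValueError("cycle in workflow")
        waves.append(sorted(ready))
        done.update(ready)
        for s in ready:
            del remaining[s]
    return waves

for i, wave in enumerate(parallel_waves(deps)):
    print(f"wave {i}: {wave}")
```

Steps with no unmet dependencies (here the repeat masking and the alignment) form the first wave; the GFF merge runs last because it depends on everything else.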

  5. NASA Cold Land Processes Experiment (CLPX 2002/03): Atmospheric analyses datasets

    Science.gov (United States)

    Glen E. Liston; Daniel L. Birkenheuer; Christopher A. Hiemstra; Donald W. Cline; Kelly Elder

    2008-01-01

    This paper describes the Local Analysis and Prediction System (LAPS) and the 20-km horizontal grid version of the Rapid Update Cycle (RUC20) atmospheric analyses datasets, which are available as part of the Cold Land Processes Field Experiment (CLPX) data archive. The LAPS dataset contains spatially and temporally continuous atmospheric and surface variables over...

  6. Multivariate Analyses of Predictors of Heavy Episodic Drinking and Drinking-Related Problems among College Students

    Science.gov (United States)

    Fenzel, L. Mickey

    2005-01-01

    The present study examines predictors of heavy drinking frequency and drinking-related problems among more than 600 college students. Controlling for high school drinking frequency, results of multiple regression analyses showed that more frequent heavy drinking was predicted by being male and risk factors of more frequent marijuana and tobacco…
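"Controlling for" a baseline variable in multiple regression simply means including it as a column in the design matrix. A sketch on synthetic data, with variable names that are illustrative rather than taken from the study:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 600

# Synthetic student records: heavy-drinking frequency depends on
# high-school drinking (the control), sex, and marijuana use.
hs_drinking = rng.normal(2.0, 1.0, n)
male = rng.integers(0, 2, n).astype(float)
marijuana = rng.normal(1.0, 0.5, n)
outcome = (0.6 * hs_drinking + 0.8 * male + 0.5 * marijuana
           + rng.normal(0, 0.5, n))

# Design matrix with an intercept column; OLS via least squares.
X = np.column_stack([np.ones(n), hs_drinking, male, marijuana])
beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)
for name, b in zip(["intercept", "hs_drinking", "male", "marijuana"], beta):
    print(f"{name:12s} {b:+.2f}")
```

With the baseline column included, the remaining coefficients estimate each predictor's association net of high-school drinking.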

  7. Systems Analyses Reveal Shared and Diverse Attributes of Oct4 Regulation in Pluripotent Cells

    DEFF Research Database (Denmark)

    Ding, Li; Paszkowski-Rogacz, Maciej; Winzi, Maria

    2015-01-01

    of Oct4, a key regulator of pluripotency. Our data signify that there are similarities, but also fundamental differences in Oct4 regulation in EpiSCs versus embryonic stem cells (ESCs). Through multiparametric data analyses, we predict that Tox4 is associating with the Paf1C complex, which maintains cell...

  8. Analyses of expressed sequence tags from apple.

    Science.gov (United States)

    Newcomb, Richard D; Crowhurst, Ross N; Gleave, Andrew P; Rikkerink, Erik H A; Allan, Andrew C; Beuning, Lesley L; Bowen, Judith H; Gera, Emma; Jamieson, Kim R; Janssen, Bart J; Laing, William A; McArtney, Steve; Nain, Bhawana; Ross, Gavin S; Snowden, Kimberley C; Souleyre, Edwige J F; Walton, Eric F; Yauk, Yar-Khing

    2006-05-01

    The domestic apple (Malus domestica; also known as Malus pumila Mill.) has become a model fruit crop in which to study commercial traits such as disease and pest resistance, grafting, and flavor and health compound biosynthesis. To speed the discovery of genes involved in these traits, develop markers to map genes, and breed new cultivars, we have produced a substantial expressed sequence tag collection from various tissues of apple, focusing on fruit tissues of the cultivar Royal Gala. Over 150,000 expressed sequence tags have been collected from 43 different cDNA libraries representing 34 different tissues and treatments. Clustering of these sequences results in a set of 42,938 nonredundant sequences comprising 17,460 tentative contigs and 25,478 singletons, together representing what we predict are approximately one-half the expressed genes from apple. Many potential molecular markers are abundant in the apple transcripts. Dinucleotide repeats are found in 4,018 nonredundant sequences, mainly in the 5'-untranslated region of the gene, with a bias toward one repeat type (containing AG, 88%) and against another (repeats containing CG, 0.1%). Trinucleotide repeats are most common in the predicted coding regions and do not show a similar degree of sequence bias in their representation. Bi-allelic single-nucleotide polymorphisms are highly abundant with one found, on average, every 706 bp of transcribed DNA. Predictions of the numbers of representatives from protein families indicate the presence of many genes involved in disease resistance and the biosynthesis of flavor and health-associated compounds. Comparisons of some of these gene families with Arabidopsis (Arabidopsis thaliana) suggest instances where there have been duplications in the lineages leading to apple of biosynthetic and regulatory genes that are expressed in fruit. This resource paves the way for a concerted functional genomics effort in this important temperate fruit crop.
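The dinucleotide-repeat counts reported above come from scanning transcript sequences for simple sequence repeats. A minimal regex-based scan; the sequence and the four-unit threshold are illustrative, not the study's actual pipeline:

```python
import re

def find_dinucleotide_repeats(seq, min_units=4):
    """Return (motif, start, length) for dinucleotide runs of at least
    `min_units` copies, skipping homopolymer runs such as AAAA...."""
    hits = []
    pattern = re.compile(r"([ACGT]{2})\1{%d,}" % (min_units - 1))
    for m in pattern.finditer(seq):
        motif = m.group(1)
        if motif[0] != motif[1]:           # drop homopolymers (AA, TT, ...)
            hits.append((motif, m.start(), len(m.group(0))))
    return hits

# Toy 5'-UTR-like sequence containing an AG and a TG microsatellite.
utr = "GGCAGAGAGAGAGAGTTTTTTTTCCTGTGTGTGTGTGTGAACGCGCG"
hits = find_dinucleotide_repeats(utr)
for motif, start, length in hits:
    print(f"{motif} x{length // 2} at position {start}")
```

The backreference `\1` extends each two-base motif greedily, so each run is reported once with its full length.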

  9. [Analyses of deaths can provide meaningful learning].

    Science.gov (United States)

    Jensen, Marie Rosenørn Hviid; Jørsboe, Hanne Blæhr

    2016-05-16

    Learning from deceased patients has provided medicine with substantial knowledge and is still a source of new information. The basic learning approach has been autopsies, but the focus has shifted towards analysis of registry data. This article evaluates different ways of analysing natural deaths, including autopsies, audits, clinical databases and hospital standardised mortality ratios, with regard to clinical learning. We claim that data-driven analysis cannot stand alone, and recommend that clinicians organise multidisciplinary, theoretically based audits in order to keep learning from the deceased.

  10. Fully Coupled FE Analyses of Buried Structures

    Directory of Open Access Journals (Sweden)

    James T. Baylot

    1994-01-01

    Full Text Available Current procedures for determining the response of buried structures to the effects of the detonation of buried high explosives recommend decoupling the free-field stress analysis from the structure response analysis. A fully coupled (explosive-soil-structure) finite element analysis procedure was developed so that the accuracies of current decoupling procedures could be evaluated. Comparisons of the results of analyses performed using this procedure with scale-model experiments indicate that this finite element procedure can be used to effectively evaluate the accuracies of the methods currently being used to decouple the free-field stress analysis from the structure response analysis.

  11. Implementing partnerships in nonreactor facility safety analyses

    International Nuclear Information System (INIS)

    Courtney, J.C.; Perry, W.H.; Phipps, R.D.

    1996-01-01

    Faculty and students from LSU have been participating in nuclear safety analyses and radiation protection projects at ANL-W at INEL since 1973. A mutually beneficial relationship has evolved that has resulted in generation of safety-related studies acceptable to Argonne and DOE, NRC, and state regulatory groups. Most of the safety projects have involved the Hot Fuel Examination Facility or the Fuel Conditioning Facility; both are hot cells that receive spent fuel from EBR-II. A table shows some of the major projects at ANL-W that involved LSU students and faculty

  12. Applications of neural network to numerical analyses

    International Nuclear Information System (INIS)

    Takeda, Tatsuoki; Fukuhara, Makoto; Ma, Xiao-Feng; Liaqat, Ali

    1999-01-01

    Applications of a multi-layer neural network to numerical analyses are described. We are mainly concerned with computed tomography and the solution of differential equations. In both cases, as the objective function for the training process of the neural network we employed the residuals of the integral equation or the differential equations. This differs from conventional neural network training, where the sum of the squared errors of the output values is adopted as the objective function. For model problems both methods gave satisfactory results, and they are considered promising for certain kinds of problems. (author)
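The key idea, training against the residual of the governing equation rather than against output errors, can be shown with a simplified stand-in: a polynomial trial solution in place of the paper's neural network, which makes the residual minimization a linear least-squares problem over collocation points:

```python
import numpy as np

# Model problem: u'(x) + u(x) = 0 on [0, 1], u(0) = 1  (exact: e^{-x}).
# Trial solution u(x) = 1 + sum_k c_k x^{k+1} satisfies u(0) = 1 exactly,
# so only the equation residual needs to be driven to zero.
K = 6                               # number of trainable coefficients
xs = np.linspace(0.0, 1.0, 25)      # collocation points

# Residual u' + u = 1 + sum_k c_k [(k+1) x^k + x^{k+1}], which is linear
# in the c_k, so least squares on the residual solves the "training".
A = np.stack([(k + 1) * xs**k + xs**(k + 1) for k in range(K)], axis=1)
b = -np.ones_like(xs)               # move the constant term to the rhs
c, *_ = np.linalg.lstsq(A, b, rcond=None)

u = lambda x: 1 + sum(ck * x**(k + 1) for k, ck in enumerate(c))
print(f"u(1) = {u(1.0):.6f}   exact e^-1 = {np.exp(-1.0):.6f}")
```

A neural network would make the same residual objective nonlinear in the parameters and require gradient-based training; the structure of the objective is unchanged.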

  13. ORNL analyses of AVR performance and safety

    Energy Technology Data Exchange (ETDEWEB)

    Cleveland, J.C.

    1985-01-01

    Because of the high interest in modular High Temperature Reactor performance and safety, a cooperative project has been established involving the Oak Ridge National Laboratory (ORNL), Arbeitsgemeinschaft Versuchs Reaktor GmbH (AVR), and Kernforschungsanlage Juelich GmbH (KFA) in reactor physics, performance and safety. This paper presents initial results of ORNL's examination of a hypothetical depressurized core heatup accident and consideration of how a depressurized core heatup test might be conducted by AVR staff. Also presented are initial analyses of a test involving a reduction in core flow and of a test involving reactivity insertion via control rod withdrawal.

  14. ORNL analyses of AVR performance and safety

    International Nuclear Information System (INIS)

    Cleveland, J.C.

    1985-01-01

    Because of the high interest in modular High Temperature Reactor performance and safety, a cooperative project has been established involving the Oak Ridge National Laboratory (ORNL), Arbeitsgemeinschaft Versuchs Reaktor GmbH (AVR), and Kernforschungsanlage Juelich GmbH (KFA) in reactor physics, performance and safety. This paper presents initial results of ORNL's examination of a hypothetical depressurized core heatup accident and consideration of how a depressurized core heatup test might be conducted by AVR staff. Also presented are initial analyses of a test involving a reduction in core flow and of a test involving reactivity insertion via control rod withdrawal

  15. Cost/benefit analyses of environmental impact

    International Nuclear Information System (INIS)

    Goldman, M.I.

    1974-01-01

    Various aspects of cost-benefit analyses are considered. Some topics discussed are: regulations of the National Environmental Policy Act (NEPA); statement of AEC policy and procedures for implementation of NEPA; Calvert Cliffs decision; AEC Regulatory Guide; application of risk-benefit analysis to nuclear power; application of the as low as practicable (ALAP) rule to radiation discharges; thermal discharge restrictions proposed by EPA under the 1972 Amendment to the Water Pollution Control Act; estimates of somatic and genetic insult per unit population exposure; occupational exposure; EPA Point Source Guidelines for Discharges from Steam Electric Power Plants; and costs of closed-cycle cooling using cooling towers. (U.S.)

  16. PREDICT: Satellite tracking and orbital prediction

    Science.gov (United States)

    Magliacane, John A.

    2011-12-01

    PREDICT is an open-source, multi-user satellite tracking and orbital prediction program written under the Linux operating system. PREDICT provides real-time satellite tracking and orbital prediction information to users and client applications through the system console, the command line, a network socket, and the generation of audio speech. Data such as a spacecraft's sub-satellite point, azimuth and elevation headings, Doppler shift, path loss, slant range, orbital altitude, orbital velocity, footprint diameter, orbital phase (mean anomaly), squint angle, eclipse depth, the time and date of the next AOS (or LOS of the current pass), orbit number, and sunlight and visibility information are provided on a real-time basis. PREDICT can also track (or predict the position of) the Sun and Moon. PREDICT has the ability to control AZ/EL antenna rotators to maintain accurate orientation in the direction of communication satellites. As an aid in locating and tracking satellites through optical means, PREDICT can articulate tracking coordinates and visibility information as plain speech.
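One of the quantities listed, Doppler shift, follows directly from the range rate between observer and spacecraft. A minimal sketch; the frequency and velocity values are illustrative and this is not PREDICT's own code:

```python
C = 299_792_458.0  # speed of light, m/s

def doppler_shift(f_hz, range_rate_ms):
    """Observed-minus-transmitted frequency for a given range rate
    (positive range rate = satellite receding, negative = approaching)."""
    return -f_hz * range_rate_ms / C

# A 145.800 MHz downlink with the satellite approaching at 6 km/s
# arrives a few kHz high.
shift = doppler_shift(145.8e6, -6_000.0)
print(f"{shift:+.0f} Hz")
```

An approaching satellite shifts the downlink upward in frequency; the sign flips as it passes overhead and begins to recede.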

  17. CHEMICAL ANALYSES OF SODIUM SYSTEMS FINAL REPORT

    Energy Technology Data Exchange (ETDEWEB)

    Greenhalgh, W. O.; Yunker, W. H.; Scott, F. A.

    1970-06-01

    BNWL-1407 summarizes information gained from the Chemical Analyses of Sodium Systems Program pursued by Battelle-Northwest over the period from July 1967 through June 1969. Tasks included feasibility studies for performing coulometric titration and polarographic determinations of oxygen in sodium, and the development of new separation techniques for sodium impurities and their subsequent analyses. The program was terminated ahead of schedule, so firm conclusions were not obtained in all areas of the work. At least 40 coulometric titrations were carried out and special test cells were developed for coulometric application. Data indicated that polarographic measurements are theoretically feasible, but practical application of the method was not verified. An emission spectrographic procedure for trace metal impurities was developed and published. Trace metal analysis by a neutron activation technique was shown to be feasible; key to the success of the activation technique was the application of a new ion exchange resin which provided a sodium separation factor of 10^11. Preliminary studies on direct scavenging of trace metals produced no conclusive results.

  18. Sensitivity in risk analyses with uncertain numbers.

    Energy Technology Data Exchange (ETDEWEB)

    Tucker, W. Troy; Ferson, Scott

    2006-06-01

    Sensitivity analysis is a study of how changes in the inputs to a model influence the results of the model. Many techniques have recently been proposed for use when the model is probabilistic. This report considers the related problem of sensitivity analysis when the model includes uncertain numbers that can involve both aleatory and epistemic uncertainty and the method of calculation is Dempster-Shafer evidence theory or probability bounds analysis. Some traditional methods for sensitivity analysis generalize directly for use with uncertain numbers, but, in some respects, sensitivity analysis for these analyses differs from traditional deterministic or probabilistic sensitivity analyses. A case study of a dike reliability assessment illustrates several methods of sensitivity analysis, including traditional probabilistic assessment, local derivatives, and a "pinching" strategy that hypothetically reduces the epistemic uncertainty or aleatory uncertainty, or both, in an input variable to estimate the reduction of uncertainty in the outputs. The prospects for applying the methods to black box models are also considered.
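The pinching strategy can be sketched with plain interval arithmetic: replace one uncertain input by a point value and measure how much the output interval narrows. The performance function and intervals below are illustrative, not from the dike case study:

```python
def interval_add(a, b):
    return (a[0] + b[0], a[1] + b[1])

def interval_mul(a, b):
    products = [x * y for x in a for y in b]
    return (min(products), max(products))

def width(iv):
    return iv[1] - iv[0]

def output(load, strength):
    # Toy performance function: a margin-like quantity strength - load,
    # written as strength + (-1)*load in interval arithmetic.
    neg_load = interval_mul((-1.0, -1.0), load)
    return interval_add(strength, neg_load)

load, strength = (1.0, 3.0), (4.0, 8.0)
base = width(output(load, strength))

# "Pinch" each input to its midpoint and see how much output
# uncertainty disappears -- a sensitivity measure for uncertain numbers.
for name, iv in [("load", load), ("strength", strength)]:
    pinched = (sum(iv) / 2, sum(iv) / 2)
    w = width(output(pinched if name == "load" else load,
                     strength if name == "load" else pinched))
    print(f"pinching {name}: width {base:.1f} -> {w:.1f} "
          f"({100 * (1 - w / base):.0f}% reduction)")
```

Pinching the wider input removes more of the output uncertainty, which is exactly the ranking such an analysis is after.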

  19. Hierarchical regression for analyses of multiple outcomes.

    Science.gov (United States)

    Richardson, David B; Hamra, Ghassan B; MacLehose, Richard F; Cole, Stephen R; Chu, Haitao

    2015-09-01

    In cohort mortality studies, there often is interest in associations between an exposure of primary interest and mortality due to a range of different causes. A standard approach to such analyses involves fitting a separate regression model for each type of outcome. However, the statistical precision of some estimated associations may be poor because of sparse data. In this paper, we describe a hierarchical regression model for estimation of parameters describing outcome-specific relative rate functions and associated credible intervals. The proposed model uses background stratification to provide flexible control for the outcome-specific associations of potential confounders, and it employs a hierarchical "shrinkage" approach to stabilize estimates of an exposure's associations with mortality due to different causes of death. The approach is illustrated in analyses of cancer mortality in 2 cohorts: a cohort of dioxin-exposed US chemical workers and a cohort of radiation-exposed Japanese atomic bomb survivors. Compared with standard regression estimates of associations, hierarchical regression yielded estimates with improved precision that tended to have less extreme values. The hierarchical regression approach also allowed the fitting of models with effect-measure modification. The proposed hierarchical approach can yield estimates of association that are more precise than conventional estimates when one wishes to estimate associations with multiple outcomes. © The Author 2015. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
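The "shrinkage" step can be sketched with a normal-normal model: each outcome-specific estimate is pulled toward the common mean in proportion to its imprecision. All numbers below are illustrative, and the prior parameters are fixed rather than estimated, as a full hierarchical analysis would do:

```python
import numpy as np

# Illustrative log-relative-rate estimates for several causes of death,
# with standard errors (sparse outcomes -> large SEs).
beta = np.array([0.40, -0.10, 1.20, 0.05, 0.80])
se   = np.array([0.10,  0.15, 0.60, 0.12, 0.50])

# Second-stage (prior) parameters: common mean and between-outcome
# variance tau^2, fixed here for the sketch.
mu, tau2 = beta.mean(), 0.05

# Normal-normal shrinkage: a precision-weighted compromise between the
# outcome-specific estimate and the common mean.
w = tau2 / (tau2 + se**2)           # weight on the outcome's own data
beta_shrunk = w * beta + (1 - w) * mu

for b, s, bs in zip(beta, se, beta_shrunk):
    print(f"raw {b:+.2f} (se {s:.2f}) -> shrunk {bs:+.2f}")
```

The imprecise estimates move the furthest toward the common mean, which is why shrinkage stabilizes the extreme values that sparse outcomes tend to produce.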

  20. Used Fuel Management System Interface Analyses - 13578

    Energy Technology Data Exchange (ETDEWEB)

    Howard, Robert; Busch, Ingrid [Oak Ridge National Laboratory, P.O. Box 2008, Bldg. 5700, MS-6170, Oak Ridge, TN 37831 (United States); Nutt, Mark; Morris, Edgar; Puig, Francesc [Argonne National Laboratory (United States); Carter, Joe; Delley, Alexcia; Rodwell, Phillip [Savannah River National Laboratory (United States); Hardin, Ernest; Kalinina, Elena [Sandia National Laboratories (United States); Clark, Robert [U.S. Department of Energy (United States); Cotton, Thomas [Complex Systems Group (United States)

    2013-07-01

    Preliminary system-level analyses of the interfaces between at-reactor used fuel management, consolidated storage facilities, and disposal facilities, along with the development of supporting logistics simulation tools, have been initiated to provide the U.S. Department of Energy (DOE) and other stakeholders with information regarding the various alternatives for managing used nuclear fuel (UNF) generated by the current fleet of light water reactors operating in the United States. An important UNF management system interface consideration is the need for ultimate disposal of UNF assemblies contained in waste packages that are sized to be compatible with different geologic media. Thermal analyses indicate that waste package sizes for the geologic media under consideration by the Used Fuel Disposition Campaign may be significantly smaller than the canisters being used for on-site dry storage by the nuclear utilities. Therefore, at some point along the UNF disposition pathway, there could be a need to repackage fuel assemblies already loaded and being loaded into the dry storage canisters currently in use. The implications of where and when the packaging or repackaging of commercial UNF will occur are key questions being addressed in this evaluation. The analysis demonstrated that thermal considerations will have a major impact on the operation of the system and that acceptance priority, rates, and facility start dates have significant system implications. (authors)

  1. Autism and pain – a literature review

    Science.gov (United States)

    Dubois, Amandine; Rattaz, Cécile; Pry, René; Baghdadli, Amaria

    2010-01-01

    This literature review aims to survey the published work in the field of pain and autism. The article first covers published studies on the modes of pain expression observed in this population. Various hypotheses proposed to explain the expressive particularities of people with autism are then reviewed: endorphin excess, particularities in sensory processing, and socio-communicative deficits. The review closes with the question of assessing and accounting for pain in people with autism. The authors conclude that the results of the published studies lack homogeneity and that further research is needed to reach consensus in a field still little explored scientifically. Clinically, deeper knowledge in this area should enable the development of pain-assessment tools and thereby ensure better day-to-day pain management. PMID:20808970

  2. Special analyses reveal coke-deposit structure

    International Nuclear Information System (INIS)

    Albright, L.F.

    1988-01-01

    A scanning electron microscope (SEM) and an energy-dispersive X-ray analyzer (EDAX) have been used to obtain information that clarifies the three mechanisms of coke formation in ethylene furnaces, and to analyze the metal condition at the exit of the furnace. The results can be used to examine furnace operations and develop improved ethylene plant practices. In this first of four articles on the analyses of coke and metal samples, the coking mechanisms and coke deposits in a section of tube from an actual ethylene furnace (Furnace A) at a plant on the Texas Gulf Coast are discussed. The second article in the series will analyze the condition of the tube metal in the same furnace. To show how coke deposition and metal condition depend on the operating parameters of an ethylene furnace, the third article in the series will show the coke deposition in a Texas Gulf Coast furnace tube (Furnace B) that operated at shorter residence time. The fourth article discusses the metal condition in that furnace. Some recommendations based on the analyses and findings are offered in the fourth article that could help extend the life of ethylene furnace tubes and also improve overall ethylene plant operations

  3. Hitchhikers’ guide to analysing bird ringing data

    Directory of Open Access Journals (Sweden)

    Harnos Andrea

    2015-12-01

    Full Text Available Bird ringing datasets constitute possibly the largest source of temporal and spatial information on vertebrate taxa available on the globe. Initially, the method was invented to understand avian migration patterns; however, data deriving from bird ringing have since been used in an array of other disciplines, including population monitoring, changes in demography, conservation management, and studies of the effects of climate change, to name a few. Despite this widespread usage and importance, there are no guidelines available specifically describing the practice of data management, preparation and analysis of ringing datasets. Here, we present the first of a series of comprehensive tutorials that may help fill this gap. We describe in detail, and through a real-life example, the intricacies of data cleaning and how to create a data table ready for analysis from raw ringing data in the R software environment. Moreover, we created and present here the R package ringR, designed to carry out various specific tasks and plots related to bird ringing data. Most methods described here can also be applied to a wide range of capture-recapture type data based on individual marking, regardless of taxon or research question.
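The data-preparation step, turning raw ring/recapture rows into per-individual encounter histories, can be sketched in Python (the tutorial itself works in R with the ringR package; the records below are invented):

```python
from collections import defaultdict

# Raw ringing records: (ring_id, year) rows, with a duplicate to clean.
records = [
    ("A001", 2010), ("A001", 2011), ("A001", 2011),   # duplicate row
    ("B417", 2010), ("B417", 2012),
    ("C220", 2012),
]

years = sorted({y for _, y in records})

# Deduplicate and collect the years each individual was encountered.
seen = defaultdict(set)
for ring, year in records:
    seen[ring].add(year)

# Encounter history: one 0/1 string per bird, one column per year,
# the usual input format for capture-recapture models.
histories = {ring: "".join("1" if y in ys else "0" for y in years)
             for ring, ys in sorted(seen.items())}
for ring, h in histories.items():
    print(ring, h)
```

Storing encounters in sets removes duplicate rows automatically, and the resulting 0/1 strings feed directly into standard capture-recapture software.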

  4. Overview of cooperative international piping benchmark analyses

    International Nuclear Information System (INIS)

    McAfee, W.J.

    1982-01-01

    This paper presents an overview of an effort initiated in 1976 by the International Working Group on Fast Reactors (IWGFR) of the International Atomic Energy Agency (IAEA) to evaluate detailed and simplified inelastic analysis methods for piping systems, with particular emphasis on piping bends. The procedure was to collect from participating IAEA member countries descriptions of tests and test results for piping systems or bends (with emphasis on high-temperature inelastic tests); to compile, evaluate, and issue a selected number of these problems for analysis; and to compile and make a preliminary evaluation of the analysis results. Of the problem descriptions submitted, three were selected to be used: a 90°-elbow at 600°C with an in-plane transverse force; a 90°-elbow with an in-plane moment; and a 180°-elbow at room temperature with a reversed, cyclic, in-plane transverse force. A variety of both detailed and simplified analysis solutions were obtained. A brief comparative assessment of the analyses is contained in this paper. 15 figures

  5. Fractal and multifractal analyses of bipartite networks

    Science.gov (United States)

    Liu, Jin-Long; Wang, Jian; Yu, Zu-Guo; Xie, Xian-Hua

    2017-03-01

    Bipartite networks have attracted considerable interest in various fields. The fractality and multifractality of unipartite (classical) networks have been studied in recent years, but there has been no work studying these properties for bipartite networks. In this paper, we try to unfold the self-similar structure of bipartite networks by performing fractal and multifractal analyses on a variety of real-world bipartite network data sets and models. First, we find fractality in some bipartite networks, including the CiteULike, Netflix, MovieLens (ml-20m) and Delicious data sets and the (u, v)-flower model, while we observe shifted power-law or exponential behavior in several other networks. We then focus on the multifractal properties of bipartite networks. Our results indicate that multifractality exists in those bipartite networks possessing fractality. To capture the inherent attributes of bipartite networks, which have two different types of nodes, we assign different weights to the nodes of the two classes and show the existence of multifractality in these node-weighted bipartite networks. In addition, for the data sets with ratings, we modify two existing algorithms for fractal and multifractal analyses of edge-weighted unipartite networks to study the self-similarity of the corresponding edge-weighted bipartite networks. The results show that our modified algorithms are feasible and can effectively uncover the self-similarity structure of these edge-weighted bipartite networks and their corresponding node-weighted versions.

  6. Analysing wind farm efficiency on complex terrains

    International Nuclear Information System (INIS)

    Castellani, Francesco; Astolfi, Davide; Terzi, Ludovico; Hansen, Kurt Schaldemose; Rodrigo, Javier Sanz

    2014-01-01

    Actual performances of onshore wind farms are deeply affected both by wake interactions and by terrain complexity; monitoring how efficiency varies with wind direction is therefore a crucial task, and the polar efficiency plot is a useful tool for it. The approach deserves careful discussion for onshore wind farms, where orography and layout commonly affect performance assessment. The present work deals with three modern wind farms, owned by Sorgenia Green, located on hilly terrains with slopes from gentle to rough. Further, the onshore wind farm of Nørrekær Enge has been analysed as a reference case: its layout is similar to that of offshore wind farms, and its efficiency is mainly driven by wakes. It is shown and justified that terrain complexity imposes a novel and more consistent way of defining polar efficiency. The dependency of efficiency on wind direction, farm layout and orography is analysed and discussed. Effects of atmospheric stability have also been investigated through MERRA reanalysis data from NASA; the Monin-Obukhov length has been used to discriminate climate regimes.
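    A rough sketch of the polar-efficiency idea described above (hypothetical function and variable names, not the authors' implementation): farm output is binned by wind-direction sector and normalised by the number of turbines times the power of an unwaked reference turbine:

```python
import numpy as np

def polar_efficiency(wind_dir_deg, farm_power, ref_turbine_power,
                     n_turbines, sector_width=30.0):
    """Mean farm efficiency per wind-direction sector, defined as
    farm power / (n_turbines * reference turbine power)."""
    sectors = np.arange(0.0, 360.0, sector_width)
    eff = np.full(sectors.size, np.nan)
    idx = (np.asarray(wind_dir_deg) // sector_width).astype(int) % sectors.size
    ratio = np.asarray(farm_power) / (n_turbines * np.asarray(ref_turbine_power))
    for k in range(sectors.size):
        if np.any(idx == k):
            eff[k] = ratio[idx == k].mean()
    return sectors, eff

# Hypothetical samples: direction (deg), farm power and reference power (MW)
sectors, eff = polar_efficiency([10, 15, 200], [9.0, 9.0, 5.0],
                                [1.0, 1.0, 1.0], n_turbines=10)
```

Plotting `eff` against `sectors` on a polar axis gives the polar efficiency plot discussed in the abstract.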

  7. Ethics of cost analyses in medical education.

    Science.gov (United States)

    Walsh, Kieran

    2013-11-01

    Cost analyses in medical education are rarely straightforward, and rarely lead to clear-cut conclusions. Occasionally they do lead to clear conclusions, but even when that happens, some stakeholders will ask difficult but valid questions about what to do following cost analyses, specifically about distributive justice in the allocation of resources. At present there is little or no debate about these issues, and the rationing decisions taken in medical education are largely made subconsciously. Distributive justice 'concerns the nature of a socially just allocation of goods in a society'. Inevitably there is a large degree of subjectivity in the judgment as to whether an allocation is seen as socially just or ethical. There are different principles by which we can view distributive justice, and these affect the prism of subjectivity through which we see certain problems. For example, we might say that distributive justice at a certain institution, or in a certain medical education system, operates according to the principle that resources must be divided equally amongst learners. Another system may hold that resources should be distributed according to the needs of learners, or even of patients. No ethical system or model is inherently right or wrong; each depends on the context in which the educator is working.

  8. Analyser Framework to Verify Software Components

    Directory of Open Access Journals (Sweden)

    Rolf Andreas Rasenack

    2009-01-01

    Full Text Available Today it is important for software companies to build software systems in a short time interval, to reduce costs and to maintain a good market position. Therefore well organized and systematic development approaches are required. Reusing well-tested software components can be a good way to develop software applications effectively: reuse is less expensive and less time consuming than development from scratch. But it is dangerous to assume that software components can be combined without any problems. Individual components are well tested, of course, but when they are composed, problems can occur, most of them rooted in interaction and communication. To avoid such errors, a framework has to be developed for analysing software components; this framework determines the compatibility of corresponding components. The promising approach discussed here presents a novel technique for analysing software components by applying an Abstract Syntax Language Tree (ASLT). A supportive environment will be designed that checks the compatibility of black-box software components. This article is concerned with the question of how coupled software components can be verified using an analyser framework, and it determines the usage of the ASLT. Black-box software components and the Abstract Syntax Language Tree are the basis for the proposed framework and are discussed here to provide the background knowledge. The practical implementation of this framework is discussed, and results are shown using a test environment.

  9. Analysing lawyers’ attitude towards knowledge sharing

    Directory of Open Access Journals (Sweden)

    Wole M. Olatokun

    2012-02-01

    Full Text Available Objectives: The study examined and identified the factors that affect lawyers' attitudes to knowledge sharing, and their knowledge sharing behaviour. Specifically, it investigated the relationship between the salient beliefs affecting the knowledge sharing attitude of lawyers, and applied a modified version of the Theory of Reasoned Action (TRA) in the knowledge sharing context to predict how these factors affect their knowledge sharing behaviour. Method: A field survey of 273 lawyers was carried out, using a questionnaire for data collection. Collected data on all variables were structured into grouped frequency distributions. Principal Component Factor Analysis was applied to reduce the constructs, and Simple Regression was applied to test the hypotheses at the 0.05 level of significance. Results: Results showed that expected associations and contributions were the major determinants of lawyers' attitudes towards knowledge sharing, while expected reward was not significantly related to those attitudes. A positive attitude towards knowledge sharing was found to lead to a positive intention to share knowledge, although a positive intention did not significantly predict a positive knowledge sharing behaviour. The level of Information Technology (IT) usage was also found to significantly affect the knowledge sharing behaviour of lawyers. Conclusion: It was recommended that law firms in the study area deploy more IT infrastructure and services that encourage effective knowledge sharing amongst lawyers.
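    The TRA path tests mentioned above reduce, in the simplest case, to a simple regression of one construct on another (e.g. attitude → intention). A minimal sketch with illustrative data, not the study's:

```python
import numpy as np

def simple_regression(x, y):
    """OLS intercept and slope for y = a + b*x, as in a simple-regression
    test of a single TRA path (e.g. attitude -> intention)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    b = np.cov(x, y, bias=True)[0, 1] / np.var(x)
    a = y.mean() - b * x.mean()
    return a, b

# Hypothetical attitude and intention scores on a 5-point scale
attitude = [1, 2, 3, 4, 5]
intention = [1.2, 1.9, 3.1, 4.0, 5.3]
a, b = simple_regression(attitude, intention)
```

A significance test on `b` (e.g. a t-test against zero) would then decide the hypothesis at the chosen significance level.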

  10. Monitoring and prediction of natural disasters

    International Nuclear Information System (INIS)

    Kondratyev, K. Ya; Krapivin, V. F.

    2004-01-01

    The problems of predicting natural disasters, and of synthesizing environmental monitoring systems to collect, store, and process the relevant information for their solution, are analysed. A three-level methodology is proposed for making decisions concerning natural disaster dynamics. The methodology is based on the assessment of environmental indicators and the use of numerical models of the environment

  11. Wrinkling prediction with adaptive mesh refinement

    NARCIS (Netherlands)

    Selman, A.; Meinders, Vincent T.; van den Boogaard, Antonius H.; Huetink, Han

    2000-01-01

    An adaptive mesh refinement procedure for wrinkling prediction analyses is presented. First the critical values are determined using Hutchinson's bifurcation functional. A wrinkling risk factor is then defined and used to determine areas of potential wrinkling risk. Finally, a mesh refinement is performed in those areas.

  12. Pathway analyses implicate glial cells in schizophrenia.

    Directory of Open Access Journals (Sweden)

    Laramie E Duncan

    Full Text Available The quest to understand the neurobiology of schizophrenia and bipolar disorder is ongoing, with multiple lines of evidence indicating abnormalities of glia, mitochondria, and glutamate in both disorders. Despite high heritability estimates of 81% for schizophrenia and 75% for bipolar disorder, compelling links between findings from neurobiological studies and findings from large-scale genetic analyses are only beginning to emerge. Ten publicly available gene sets (pathways) related to glia, mitochondria, and glutamate were tested for association to schizophrenia and bipolar disorder using MAGENTA as the primary analysis method. To determine the robustness of associations, secondary analyses were performed with ALIGATOR, INRICH, and Set Screen. Data from the Psychiatric Genomics Consortium (PGC) were used for all analyses. There were 1,068,286 SNP-level p-values for schizophrenia (9,394 cases/12,462 controls), and 2,088,878 SNP-level p-values for bipolar disorder (7,481 cases/9,250 controls). The Glia-Oligodendrocyte pathway was associated with schizophrenia after correction for multiple tests according to the primary analysis (MAGENTA p = 0.0005, 75% requirement for individual gene significance) and also achieved nominal levels of significance with INRICH (p = 0.0057) and ALIGATOR (p = 0.022). For bipolar disorder, Set Screen yielded nominally and method-wide significant associations to all three glial pathways, with the strongest association to the Glia-Astrocyte pathway (p = 0.002). Consistent with findings of white matter abnormalities in schizophrenia by other methods of study, the Glia-Oligodendrocyte pathway was associated with schizophrenia in our genomic study. These findings suggest that the abnormalities of myelination observed in schizophrenia are at least in part due to inherited factors, contrasted with the alternative of purely environmental causes (e.g. medication effects or lifestyle). While not the primary purpose of our study

  13. Predictive modeling of complications.

    Science.gov (United States)

    Osorio, Joseph A; Scheer, Justin K; Ames, Christopher P

    2016-09-01

    Predictive analytic algorithms are designed to identify patterns in the data that allow for accurate predictions without the need for a hypothesis. Therefore, predictive modeling can provide detailed and patient-specific information that can be readily applied when discussing the risks of surgery with a patient. There are few studies using predictive modeling techniques in the adult spine surgery literature. These types of studies represent the beginning of the use of predictive analytics in spine surgery outcomes. We will discuss the advancements in the field of spine surgery with respect to predictive analytics, the controversies surrounding the technique, and the future directions.

  14. Digital analyses of cartometric Fruska Gora guidelines

    Directory of Open Access Journals (Sweden)

    Živković Dragica

    2013-01-01

    Full Text Available Modern geomorphological research uses quantitative statistical and cartographic methods to analyse topographic relief features and their mutual connections on the basis of good-quality numeric parameters. Important morphological characteristics are, precisely, slope angle, hypsometry, topographic exposition and so on. Even small, easily overlooked relief slopes can deeply affect land configuration, hypsometry, topographic exposition etc. Exposition modifies light and heat and thereby a set of interconnected phenomena: soil and air temperature, soil disintegration, the length of the vegetation period, the intensity of photosynthesis, the yield of agricultural crops, the height of the snow line etc. [Projekat Ministarstva nauke Republike Srbije, br. 176008 i br. III44006]

  15. Feasibility Analyses of Integrated Broiler Production

    Directory of Open Access Journals (Sweden)

    L. Komalasari

    2010-12-01

    Full Text Available The major obstacles in the development of broiler raising are the expensive price of feed and the fluctuating price of day-old chicks (DOCs). The cheap price of imported leg quarters reduces the competitiveness of local broilers. Therefore, an effort to increase production efficiency is needed through integration between broiler raising, corn farmers and feed producers (integrated farming). The purpose of this study is to analyze the feasibility of integrating broiler raising with corn cultivation and feed production. In addition, a simulation was conducted to analyze the effects of changes in DOC price, broiler price and production capacity. The analyses showed that integrated farming, as well as a mere combination of broiler raising and a feed factory at a 10,000-bird capacity, is not financially feasible. Increasing production to 25,000 broiler chickens makes the integrated farming financially feasible. Unintegrated broiler raising is relatively sensitive to broiler price decreases and DOC price increases compared to integrated farming.
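    Financial feasibility in studies of this kind is typically judged by an indicator such as net present value (NPV); that the authors used exactly this criterion is an assumption here, and the cash flows below are hypothetical:

```python
def npv(rate, cashflows):
    """Net present value of a cash-flow series (index 0 = year 0).
    A venture is financially feasible when NPV > 0 at the discount rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical farm: initial outlay, then five equal annual net inflows
flows = [-100_000, 30_000, 30_000, 30_000, 30_000, 30_000]
feasible = npv(0.10, flows) > 0
```

The simulation described in the abstract would amount to recomputing this indicator while varying DOC price, broiler price and capacity.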

  16. Preserving the nuclear option: analyses and recommendations

    International Nuclear Information System (INIS)

    Anon.

    1985-01-01

    It is certain that a future role for nuclear power will depend on substantial changes in the management and regulation of the enterprise. It is widely believed that institutional, rather than technological, change is, at least in the short term, the key to resuscitating the nuclear option. Several recent analyses of the problems facing nuclear power, together with the current congressional hearing on the Nuclear Regulatory Commission's fiscal year 1986 budget request, have examined both the future of nuclear power and what can be done to address present institutional shortcomings. The congressional sessions have provided an indication of the views of both legislators and regulators, and this record, although mixed, generally shows continued optimism about the prospects of the nuclear option if needed reforms are accomplished

  17. DEPUTY: analysing architectural structures and checking style

    International Nuclear Information System (INIS)

    Gorshkov, D.; Kochelev, S.; Kotegov, S.; Pavlov, I.; Pravilnikov, V.; Wellisch, J.P.

    2001-01-01

    The DepUty (dependencies utility) can be classified as a project and process management tool. The main goal of DepUty is to assist, by means of source code analysis and graphical representation using UML, in understanding the dependencies of sub-systems and packages in CMS Object Oriented software, in understanding the architectural structure, and in scheduling code releases in modularised integration. It also allows a newcomer to more easily understand the global structure of CMS software, and to avoid circular dependencies up-front, or to re-factor the code in case it was already too close to the edge of non-maintainability. The authors discuss the various views DepUty provides to analyse package dependencies, and illustrate both the metrics and style checking facilities it provides

  18. Analysing Medieval Urban Space; a methodology

    Directory of Open Access Journals (Sweden)

    Marlous L. Craane MA

    2007-08-01

    Full Text Available This article has been written in reaction to recent developments in medieval history and archaeology, to study not only the buildings in a town but also the spaces that hold them together. It discusses a more objective and interdisciplinary approach for analysing urban morphology and use of space. It proposes a 'new' methodology by combining town plan analysis and space syntax. This methodology was trialled on the city of Utrecht in the Netherlands. By comparing the results of this 'new' methodology with the results of previous, more conventional, research, this article shows that space syntax can be applied successfully to medieval urban contexts. It does this by demonstrating a strong correlation between medieval economic spaces and the most integrated spaces, just as is found in the study of modern urban environments. It thus provides a strong basis for the use of this technique in future research of medieval urban environments.

  19. Spatial Analyses of Harappan Urban Settlements

    Directory of Open Access Journals (Sweden)

    Hirofumi Teramura

    2006-12-01

    Full Text Available The Harappan Civilization occupies a unique place among the early civilizations of the world, with its well planned urban settlements, advanced handicrafts and technology, and religious and trade activities. Using a Geographical Information System (GIS), this study presents spatial analyses that locate urban settlements on a digital elevation model (DEM) according to the three phases of early, mature and late. Understanding the relationship between the spatial distribution of Harappan sites and changes in factors such as topographic features, river passages or sea level will lead to an understanding of the dynamism of this civilization. It will also afford a glimpse of the factors behind the formation, development, and decline of the Harappan Civilization.

  20. An introduction to modern missing data analyses.

    Science.gov (United States)

    Baraldi, Amanda N; Enders, Craig K

    2010-02-01

    A great deal of recent methodological research has focused on two modern missing data analysis methods: maximum likelihood and multiple imputation. These approaches are advantageous over traditional techniques (e.g., deletion and mean imputation) because they require less stringent assumptions and mitigate the pitfalls of the traditional techniques. This article explains the theoretical underpinnings of missing data analyses, gives an overview of traditional missing data techniques, and provides accessible descriptions of maximum likelihood and multiple imputation. In particular, this article focuses on maximum likelihood estimation and presents two analysis examples from the Longitudinal Study of American Youth data, one of which includes a description of the use of auxiliary variables. Finally, the paper illustrates ways that researchers can use intentional, or planned, missing data to enhance their research designs.
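    A tiny illustration of why the traditional techniques fall short (not an example from the article): mean imputation preserves the mean of the observed data but biases the variance downward, one of the pitfalls motivating maximum likelihood and multiple imputation:

```python
import numpy as np

def mean_impute(x):
    """Replace NaNs by the observed mean -- the 'traditional' technique
    whose pitfall is artificially shrinking the variance."""
    x = np.asarray(x, float).copy()
    x[np.isnan(x)] = np.nanmean(x)
    return x

raw = np.array([1.0, 2.0, np.nan, 4.0, np.nan, 6.0])
filled = mean_impute(raw)
# The mean is preserved, but the variance is biased downward:
assert np.isclose(filled.mean(), np.nanmean(raw))
assert filled.var() < np.nanvar(raw)
```

Multiple imputation avoids this by drawing several plausible values per missing cell and pooling the analyses across the completed data sets.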

  1. Reliability and safety analyses under fuzziness

    International Nuclear Information System (INIS)

    Onisawa, T.; Kacprzyk, J.

    1995-01-01

    Fuzzy theory, for example possibility theory, is compatible with probability theory. What has been shown so far is that probability theory need not be replaced by fuzzy theory, but rather that the former works much better in applications if it is combined with the latter. In fact, there are two essential uncertainties in the field of reliability and safety analyses: one is probabilistic uncertainty, which is more relevant for mechanical systems and the natural environment, and the other is fuzziness (imprecision) caused by the presence of human beings in systems. Classical probability theory alone is therefore not sufficient to deal with uncertainties in humanistic systems. In this context, this collection of works marks a milestone in the debate between probability theory and fuzzy theory. The volume covers fault analysis, lifetime analysis, reliability, quality control, safety analysis and risk analysis. (orig./DG). 106 figs

  2. Project analysis and integration economic analyses summary

    Science.gov (United States)

    Macomber, H. L.

    1986-01-01

    An economic-analysis summary was presented for the manufacture of crystalline-silicon modules involving silicon ingot/sheet, growth, slicing, cell manufacture, and module assembly. Economic analyses provided: useful quantitative aspects for complex decision-making to the Flat-plate Solar Array (FSA) Project; yardsticks for design and performance to industry; and demonstration of how to evaluate and understand the worth of research and development both to JPL and other government agencies and programs. It was concluded that future research and development funds for photovoltaics must be provided by the Federal Government because the solar industry today does not reap enough profits from its present-day sales of photovoltaic equipment.

  3. Thermal hydraulic reactor safety analyses and experiments

    International Nuclear Information System (INIS)

    Holmstroem, H.; Eerikaeinen, L.; Kervinen, T.; Kilpi, K.; Mattila, L.; Miettinen, J.; Yrjoelae, V.

    1989-04-01

    The report introduces the results of the thermal hydraulic reactor safety research performed in the Nuclear Engineering Laboratory of the Technical Research Centre of Finland (VTT) during the years 1972-1987. Practical applications, i.e. analyses for the safety authorities and power companies, are also presented. The emphasis is on describing the state-of-the-art know-how. The report describes VTT's most important computer codes, both those of foreign origin and those developed at VTT, and their assessment work, VTT's own experimental research, as well as international experimental projects and other forms of cooperation VTT has participated in. Appendix 8 contains a comprehensive list of the most important publications and technical reports produced; they present the content and results of the research in detail. (orig.)

  4. Precursor analyses for German nuclear power plants

    International Nuclear Information System (INIS)

    Babst, Siegfried; Gaenssmantel, Gerhard; Stueck, Reinhard

    2009-01-01

    Precursor analysis is an internationally recognized method for quantifying the safety relevance of operational events in nuclear power plants. Precursors are operational events which had no serious impact, but which could have led to serious impacts if additional malfunctions had occurred. Examples of such operational events are component failures or transients, for example the loss of main feedwater. On the basis of the probabilities of occurrence of additional malfunctions or initiating events, precursor analyses determine the probability with which these additional malfunctions, had they occurred during the event, would have led to core damage. This conditional probability is a measure of the safety relevance of the operational event. Events for which the conditional probability of core damage is > 10⁻⁶ are internationally classified as "precursors". (orig.)
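    The core quantity of a precursor analysis is the conditional core damage probability (CCDP) given the observed event. A minimal sketch, assuming independent remaining safety functions (the probabilities and threshold handling are illustrative, not GRS's actual models):

```python
def precursor_ccdp(branch_failure_probs):
    """Conditional core damage probability given the observed event:
    core damage requires all remaining safety functions to fail, so for
    independent functions the CCDP is the product of their failure
    probabilities."""
    ccdp = 1.0
    for p in branch_failure_probs:
        ccdp *= p
    return ccdp

# Hypothetical event: two remaining barriers with failure probs 1e-2 and 5e-3
ccdp = precursor_ccdp([1e-2, 5e-3])
is_precursor = ccdp > 1e-6  # international classification threshold
```

Real analyses evaluate these branch probabilities with the plant's full PSA event trees rather than a simple product.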

  5. The plant design analyser and its applications

    International Nuclear Information System (INIS)

    Whitmarsh-Everiss, M.J.

    1992-01-01

    Consideration is given to the history of computational methods for the non-linear dynamic analysis of plant behaviour. This is traced from analogue to hybrid computers. When these were phased out simulation languages were used in the batch mode and the interactive computational capabilities were lost. These have subsequently been recovered using mainframe computing architecture in the context of small models using the Prototype Plant Design Analyser. Given the development of parallel processing architectures, the restriction on model size can be lifted. This capability and the use of advanced Work Stations and graphics software has enabled an advanced interactive design environment to be developed. This system is generic and can be used, with suitable graphics development, to study the dynamics and control behaviour of any plant or system for minimum cost. Examples of past and possible future uses are identified. (author)

  6. Modelling and Analysing Socio-Technical Systems

    DEFF Research Database (Denmark)

    Aslanyan, Zaruhi; Ivanova, Marieta Georgieva; Nielson, Flemming

    2015-01-01

    Modern organisations are complex, socio-technical systems consisting of a mixture of physical infrastructure, human actors, policies and processes. An increasing number of attacks on these organisations exploits vulnerabilities on all different levels, for example combining a malware attack with social engineering. Due to this combination of attack steps on technical and social levels, risk assessment in socio-technical systems is complex. Therefore, established risk assessment methods often abstract away the internal structure of an organisation and ignore human factors when modelling and assessing attacks. In our work we model all relevant levels of socio-technical systems, and propose evaluation techniques for analysing the security properties of the model. Our approach simplifies the identification of possible attacks and provides qualified assessment and ranking of attacks based...

  7. Antarctic observations available for IMS correlative analyses

    International Nuclear Information System (INIS)

    Rycroft, M.J.

    1982-01-01

    A review is provided of the wide-ranging observational programs of 25 stations operating on and around the continent of Antarctica during the International Magnetospheric Study (IMS). Attention is given to observations of geomagnetism, short period fluctuations of the earth's electromagnetic field, observations of the ionosphere and of whistler mode signals, observational programs in ionospheric and magnetospheric physics, upper atmosphere physics observations, details of magnetospheric programs conducted at Kerguelen, H-component magnetograms, magnetic field line oscillations, dynamic spectra of whistlers, and the variation of plasmapause position derived from recorded whistlers. The considered studies suggest that, in principle, if the level of magnetic activity is known, predictions can be made concerning the time at which the trough occurs, and the shape and the movement of the main trough

  8. Analysing CMS transfers using Machine Learning techniques

    CERN Document Server

    Diotalevi, Tommaso

    2016-01-01

    LHC experiments transfer more than 10 PB/week between all grid sites using the FTS transfer service. In particular, CMS manages almost 5 PB/week of FTS transfers with PhEDEx (Physics Experiment Data Export). FTS sends metrics about each transfer (e.g. transfer rate, duration, size) to a central HDFS storage at CERN. The work done during these three months as a Summer Student involved the use of ML techniques, through a CMS framework called DCAFPilot, to process these new data and generate predictions of transfer latencies on all links between Grid sites. This analysis will provide, as a future service, the necessary information to proactively identify, and possibly fix, latency-affected transfers over the WLCG.

  9. Isotropy analyses of the Planck convergence map

    Science.gov (United States)

    Marques, G. A.; Novaes, C. P.; Bernui, A.; Ferreira, I. S.

    2018-01-01

    The presence of matter in the path of relic photons causes distortions in the angular pattern of the cosmic microwave background (CMB) temperature fluctuations, modifying their properties in a slight but measurable way. Recently, the Planck Collaboration released the estimated convergence map, an integrated measure of the large-scale matter distribution that produced the weak gravitational lensing (WL) phenomenon observed in Planck CMB data. We perform exhaustive analyses of this convergence map, calculating the variance in small and large regions of the sky, excluding the area masked due to Galactic contamination, and compare them with the features expected in the set of simulated convergence maps also released by the Planck Collaboration. Our goal is to search for sky directions or regions where the WL imprints anomalous signatures on the variance estimator, revealed through a χ² analysis at a statistically significant level. In the local analysis of the Planck convergence map, we identified eight patches of the sky in disagreement, at more than 2σ, with what is observed in the average of the simulations. In contrast, in the large-region analysis we found no statistically significant discrepancies; interestingly, though, the regions with the highest χ² values surround the ecliptic poles. Thus, our results show good agreement with the features expected in the Λ cold dark matter concordance model, as given by the simulations. Yet the outlier regions found here could suggest that the data still contain residual contamination, like noise, due to over- or underestimation of systematic effects in the simulation data set.
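    A variance-based anomaly test of the kind described above can be sketched as follows (a simplified, hypothetical estimator, not the authors' pipeline): the variance measured in a sky patch is compared with the mean and scatter of the same estimator over the simulated maps:

```python
import numpy as np

def variance_chi2(data_var, sim_vars):
    """Chi-square-like statistic comparing the variance measured in a sky
    patch with the distribution of the same estimator over simulated maps;
    large values flag anomalous patches."""
    sim_vars = np.asarray(sim_vars, float)
    return (data_var - sim_vars.mean()) ** 2 / sim_vars.var()

# Hypothetical: patch variance 2.0 vs simulations centred on 1.0
chi2 = variance_chi2(2.0, [0.9, 1.0, 1.1])
```

In the actual analysis the threshold for "anomalous" would be calibrated against the empirical distribution of the statistic over the simulations themselves.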

  10. GPU based framework for geospatial analyses

    Science.gov (United States)

    Cosmin Sandric, Ionut; Ionita, Cristian; Dardala, Marian; Furtuna, Titus

    2017-04-01

    Parallel processing on multiple CPU cores is already used at large scale in geocomputing, but parallel processing on graphics cards is just at the beginning. Being able to use a simple laptop with a dedicated graphics card for advanced and very fast geocomputation is an advantage that every scientist wants to have. The need for high-speed computation in the geosciences has increased over the last 10 years, mostly due to the growth of the available datasets, which are becoming more and more detailed and hence require more space to store and more time to process. Distributed computation on multicore CPUs and GPUs plays an important role by processing these big datasets in small parts: instead of using a single process per dataset, all the cores of a CPU, or up to hundreds of cores of a GPU, can be used, which greatly speeds up the process. The framework provides the end user with standalone tools for morphometric analyses at multiscale level. An important part of the framework is dedicated to uncertainty propagation in geospatial analyses. Uncertainty may come from data collection, may be induced by the model, or may have many other sources, and it plays an important role when a spatial delineation of a phenomenon is modelled. Uncertainty propagation is implemented inside the GPU framework using Monte Carlo simulations. The GPU framework with the standalone tools proved to be a reliable tool for modelling complex natural phenomena. The framework is based on NVIDIA CUDA technology and is written in the C++ programming language. The source code will be available on github at https://github.com/sandricionut/GeoRsGPU Acknowledgement: GPU framework for geospatial analysis, Young Researchers Grant (ICUB-University of Bucharest) 2016, director Ionut Sandric
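    Monte Carlo uncertainty propagation of the kind the framework implements can be sketched on the CPU as follows (hypothetical function names and a toy one-dimensional "terrain"; the GeoRsGPU implementation itself runs per-pixel on the GPU in CUDA):

```python
import numpy as np

def mc_uncertainty(model, x, sigma, n_runs=10_000, seed=0):
    """Monte Carlo propagation of input uncertainty: perturb the input
    with Gaussian noise of standard deviation `sigma`, rerun the model,
    and report the mean and spread of the output."""
    rng = np.random.default_rng(seed)
    outputs = np.array([model(x + rng.normal(0.0, sigma, np.shape(x)))
                        for _ in range(n_runs)])
    return outputs.mean(), outputs.std()

# Toy 'terrain model': slope between two elevation samples 1 m apart,
# each with 0.1 m of elevation uncertainty
slope = lambda z: z[1] - z[0]
mean, spread = mc_uncertainty(slope, np.array([100.0, 101.0]), sigma=0.1)
```

Here `spread` approaches the analytic value 0.1·√2, illustrating how the simulations quantify the uncertainty of a DEM-derived morphometric variable.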

  11. The radiation analyses of ITER lower ports

    International Nuclear Information System (INIS)

    Petrizzi, L.; Brolatti, G.; Martin, A.; Loughlin, M.; Moro, F.; Villari, R.

    2010-01-01

    The ITER Vacuum Vessel has upper, equatorial, and lower ports used for equipment installation, diagnostics, heating and current drive systems, cryo-vacuum pumping, and access inside the vessel for maintenance. At the level of the divertor, the nine lower ports for remote handling, cryo-vacuum pumping and diagnostics are inclined downwards and toroidally located every 40°. The cryopump port additionally has a branch to accommodate a second cryopump. The ports, as openings in the Vacuum Vessel, permit radiation to stream out of the vessel, which affects the heating of components in the outer regions of the machine, inside and outside the ports. Safety concerns are also raised with respect to the shutdown dose at the cryostat behind the ports: in such zones the radiation dose level must be kept below the regulatory limit to allow personnel access for maintenance purposes. Neutronic analyses have been required to qualify the ITER project with regard to the lower ports. A 3-D model was used to take into account full details of the ports and the lower machine surroundings. MCNP5 version 1.40 has been used with the FENDL 2.1 nuclear data library. The ITER 40° model distributed by the ITER Organization was developed in its lower part to include the relevant details. The results of a first analysis, focused on the cryopump system only, were recently published. In this paper more complete data on the cryopump port, and analyses for the remote handling port and the diagnostic rack, are presented; the results of both analyses give a complete map of the radiation loads in the outer divertor ports. Nuclear heating, dpa, tritium production, and dose rates after shutdown are provided, and the implications for the design are discussed.

  12. Risques naturels en montagne et analyse spatiale

    Directory of Open Access Journals (Sweden)

    Yannick Manche

    1999-06-01

    Full Text Available The concept of risk rests on two notions: the hazard, which represents the physical phenomenon through its magnitude and return period; and the vulnerability, which represents the set of goods and people that can be affected by a natural phenomenon. Risk is then defined as the crossing of these two notions. This theoretical view makes it possible to model hazards and vulnerability independently. This work is mainly concerned with taking vulnerability into account in the management of natural hazards. Its assessment necessarily involves some spatial analysis that accounts for human occupation and the different scales of land use. But the spatial assessment, whether of goods and people or of indirect effects, runs into many problems. The importance of land occupation has to be estimated. Moreover, processing the data involves constant changes of scale to move from point elements to surfaces, which geographic information systems do not handle perfectly. Risk management imposes strong town-planning constraints; taking vulnerability into account makes it possible to better understand and manage the spatial constraints that natural hazards imply. hazard, spatial analysis, natural hazards, GIS, vulnerability
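    The crossing of hazard and vulnerability described above can be sketched as a cell-by-cell combination of two raster layers (a minimal NumPy illustration; the class values and the product rule are hypothetical, not taken from the article):

```python
import numpy as np

# Hypothetical ordinal classes 0 (none) to 3 (high), on a common grid.
hazard = np.array([[0, 1, 2],
                   [1, 2, 3],
                   [2, 3, 3]])
vulnerability = np.array([[3, 2, 1],
                          [0, 1, 2],
                          [1, 2, 3]])

# "Crossing" the two notions: risk exists only where a damaging phenomenon
# can occur AND exposed goods or people are present; here a simple product.
risk = hazard * vulnerability
risk[(hazard == 0) | (vulnerability == 0)] = 0  # no hazard or no stakes -> no risk
```

In a GIS the same operation is a raster map-algebra overlay, once both layers have been brought to a common resolution.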

  13. Analyses of tropistic responses using metabolomics.

    Science.gov (United States)

    Millar, Katherine D L; Kiss, John Z

    2013-01-01

    Characterization of phototropism and gravitropism has been through gene expression studies, assessment of curvature response, and protein expression experiments. To our knowledge, the current study is the first to determine how the metabolome, the complete set of small-molecule metabolites within a plant, is impacted during these tropisms. We have determined the metabolic profile of plants during gravitropism and phototropism. Seedlings of Arabidopsis thaliana wild type (WT) and phyB mutant were exposed to unidirectional light (red or blue) or reoriented to induce a tropistic response, and small-molecule metabolites were assayed and quantified. A subset of the WT was analyzed using microarray experiments to obtain gene profiling data. Analyses of the metabolomic data using principal component analysis showed a common profile in the WT during the different tropistic curvatures, but phyB mutants produced a distinctive profile for each tropism. Interestingly, the gravity treatment elicited the greatest changes in gene expression of the WT, followed by blue light, then by red light treatments. For all tropisms, we identified genes that were downregulated by a large magnitude in carbohydrate metabolism and secondary metabolism. These included ATCSLA15, CELLULOSE SYNTHASE-LIKE, and ATCHS/SHS/TT4, CHALCONE SYNTHASE. In addition, genes involved in amino acid biosynthesis were strongly upregulated, and these included THA1 (THREONINE ALDOLASE 1) and ASN1 (DARK INDUCIBLE asparagine synthase). We have established the first metabolic profile of tropisms in conjunction with transcriptomic analyses. This approach has been useful in characterizing the similarities and differences in the molecular mechanisms involved with phototropism and gravitropism.
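    The principal component analysis applied to the metabolomic data can be sketched with a plain SVD-based implementation (the sample-by-metabolite matrix below is synthetic; nothing here reproduces the study's actual data):

```python
import numpy as np

def pca(X, n_components=2):
    """Principal component analysis via SVD of the mean-centered data.

    X: (n_samples, n_features) matrix, e.g. samples x metabolite intensities.
    Returns (scores, explained_variance_ratio) for the leading components.
    """
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U[:, :n_components] * s[:n_components]
    evr = (s ** 2) / (s ** 2).sum()
    return scores, evr[:n_components]

rng = np.random.default_rng(1)
# Two synthetic "treatment" groups separated along one latent direction.
X = rng.normal(size=(20, 50))
X[:10] += 3.0  # group offset on every synthetic metabolite
scores, evr = pca(X)
```

Plotting the first two score columns is what produces the "common profile" versus "distinctive profile" pictures the abstract refers to.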

  14. Predicting outdoor sound

    CERN Document Server

    Attenborough, Keith; Horoshenkov, Kirill

    2014-01-01

    1. Introduction  2. The Propagation of Sound Near Ground Surfaces in a Homogeneous Medium  3. Predicting the Acoustical Properties of Outdoor Ground Surfaces  4. Measurements of the Acoustical Properties of Ground Surfaces and Comparisons with Models  5. Predicting Effects of Source Characteristics on Outdoor Sound  6. Predictions, Approximations and Empirical Results for Ground Effect Excluding Meteorological Effects  7. Influence of Source Motion on Ground Effect and Diffraction  8. Predicting Effects of Mixed Impedance Ground  9. Predicting the Performance of Outdoor Noise Barriers  10. Predicting Effects of Vegetation, Trees and Turbulence  11. Analytical Approximations including Ground Effect, Refraction and Turbulence  12. Prediction Schemes  13. Predicting Sound in an Urban Environment.

  15. The prediction of different experiences of longterm illness

    DEFF Research Database (Denmark)

    Blank, N; Diderichsen, Finn

    1996-01-01

    To analyse the role played by socioeconomic factors and self rated general health in the prediction of the reporting of severe longterm illness, and the extent to which these factors explain social class differences in the reporting of such illness.

  16. High performance liquid chromatography in pharmaceutical analyses.

    Science.gov (United States)

    Nikolin, Branko; Imamović, Belma; Medanhodzić-Vuk, Saira; Sober, Miroslav

    2004-05-01

    In the testing, pre-marketing and control of drugs over the last ten years, high performance liquid chromatography has replaced numerous spectroscopic methods and gas chromatography in quantitative and qualitative analysis. In the first period of HPLC application it was thought that it would become a method complementary to gas chromatography; today, however, it has nearly completely replaced gas chromatography in pharmaceutical analysis. The use of a liquid mobile phase, with the possibility of changing the mobile phase polarity during chromatography and all other modifications of the mobile phase depending upon the characteristics of the substance being tested, is a great advantage in the separation process in comparison to other methods. The wide choice of stationary phases is the next factor that enables good separation. The separation line is connected to specific and sensitive detector systems (spectrofluorimeter, diode-array detector, electrochemical detector) and other hyphenated systems such as HPLC-MS and HPLC-NMR; these are the basic elements on which the wide and effective application of the HPLC method is based. The purpose of HPLC analysis of any drug is to confirm the identity of the drug, to provide quantitative results, and also to monitor the progress of therapy of a disease. Figure 1 shows a chromatogram obtained for the plasma of depressed patients 12 h before oral administration of dexamethasone. HPLC may also be used to further our understanding of normal and disease processes in the human body through biomedical and therapeutic research during investigations before drug registration. The analysis of drugs and metabolites in biological fluids, particularly plasma, serum or urine, is one of the most demanding but also one of the most common uses of high performance liquid chromatography. Blood, plasma or serum contains numerous endogenous

  17. Analysing Scenarios of Cell Population System Development

    Directory of Open Access Journals (Sweden)

    M. S. Vinogradova

    2014-01-01

    Full Text Available The article considers an isolated population system consisting of two types of human stem cells, namely normal cells and cells with chromosomal abnormalities (abnormal ones). The system develops in the laboratory (in vitro). The article analyses possible scenarios of the population system development, which are realized for different values of its parameters. The investigated model of the cell population system takes the limited resources into account. It is represented as a system of two nonlinear differential equations with a continuous right-hand side. The model is considered for non-negative values of the variables; the domain is divided into four sets. The feature of the model is that in each set the right-hand side of the system of differential equations has a different form. The article analyses the character of the rest points of the system in each of the four sets. Analytical conditions for the number and character of the rest points with at least one zero coordinate are obtained. It is shown that the population system under study cannot have more than two rest points with both coordinates positive (non-zero). It is difficult to determine the character of such rest points as a function of the model parameters because of the complexity of the expressions defining the first-approximation systems written in a neighborhood of these rest points. Numerical results on the stability of these rest points are obtained, and phase portraits for specific values of the system parameters are demonstrated. The main scenarios for the cell population development are presented. Analysis of the mathematical model shows that the cell population system may remain a system consisting of populations of normal and abnormal cells; it can degenerate into a population of abnormal cells, or perish. The scenario in which only the population of normal cells remains is not realized. The numerical simulation
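    The abstract does not give the model equations, but the qualitative behaviour can be illustrated with a hypothetical logistic competition model for normal (N) and abnormal (A) cells sharing limited resources, integrated with a simple Euler scheme (a sketch only; the article's model has a piecewise right-hand side over four sets):

```python
def simulate(n0, a0, rn=1.0, ra=1.2, K=1.0, dt=0.01, steps=5000):
    """Explicit Euler integration of a hypothetical competition model:
        dN/dt = rn * N * (1 - (N + A) / K)
        dA/dt = ra * A * (1 - (N + A) / K)
    where K models the limited resources shared by both populations."""
    n, a = n0, a0
    for _ in range(steps):
        total = n + a  # evaluate the right-hand side at the current state
        n += dt * rn * n * (1 - total / K)
        a += dt * ra * a * (1 - total / K)
    return n, a

n, a = simulate(n0=0.1, a0=0.05)
```

With a higher growth rate for the abnormal cells, their share of the saturated population grows over time, mirroring the scenario in which the system drifts toward the abnormal population.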

  18. Scanning electron microscopy and micro-analyses

    International Nuclear Information System (INIS)

    Brisset, F.; Repoux, L.; Ruste, J.; Grillon, F.; Robaut, F.

    2008-01-01

    Scanning electron microscopy (SEM) and the related micro-analyses are involved in extremely varied domains, from academic environments to industrial ones. The overall theoretical bases, the main technical characteristics, and some complementary information about practical usage and maintenance are developed in this book. High-vacuum and controlled-vacuum electron microscopes are thoroughly presented, as well as the latest generation of EDS (energy dispersive spectrometer) and WDS (wavelength dispersive spectrometer) micro-analysers. Besides these main topics, other analysis or observation techniques are approached, such as EBSD (electron backscattering diffraction), 3-D imaging, FIB (focussed ion beams), Monte-Carlo simulations, in-situ tests, etc. This book, in French, is the only one that treats this subject in such an exhaustive way. It is the updated version of a previous edition of 1979. It gathers the lectures given in 2006 at the summer school of Saint Martin d'Heres (France).
Content: 1 - electron-matter interactions; 2 - characteristic X-radiation, Bremsstrahlung; 3 - electron guns in SEM; 4 - elements of electronic optics; 5 - vacuum techniques; 6 - detectors used in SEM; 7 - image formation and optimization in SEM; 7a - SEM practical instructions for use; 8 - controlled pressure microscopy; 8a - applications; 9 - energy selection X-spectrometers (energy dispersive spectrometers - EDS); 9a - EDS analysis; 9b - X-EDS mapping; 10 - technological aspects of WDS; 11 - processing of EDS and WDS spectra; 12 - X-microanalysis quantifying methods; 12a - quantitative WDS microanalysis of very light elements; 13 - statistics: precision and detection limits in microanalysis; 14 - analysis of stratified samples; 15 - crystallography applied to EBSD; 16 - EBSD: history, principle and applications; 16a - EBSD analysis; 17 - Monte Carlo simulation; 18 - insulating samples in SEM and X-ray microanalysis; 18a - insulating

  19. Uncertainty Analyses for Back Projection Methods

    Science.gov (United States)

    Zeng, H.; Wei, S.; Wu, W.

    2017-12-01

    So far, few comprehensive error analyses for back projection methods have been conducted, although it is evident that high-frequency seismic waves can be easily affected by earthquake depth, focal mechanisms and the Earth's 3D structures. Here we perform 1D and 3D synthetic tests for two back projection methods, MUltiple SIgnal Classification (MUSIC) (Meng et al., 2011) and Compressive Sensing (CS) (Yao et al., 2011). We generate synthetics for both point sources and finite rupture sources with different depths and focal mechanisms, as well as 1D and 3D structures in the source region. The 3D synthetics are generated through a hybrid scheme combining the Direct Solution Method and the Spectral Element Method. We then back project the synthetic data using MUSIC and CS. The synthetic tests show that the depth phases can be back projected as artificial sources both in space and time. For instance, for a source depth of 10 km, back projection gives a strong signal 8 km away from the true source. Such bias increases with depth; e.g., the error in horizontal location can be larger than 20 km for a depth of 40 km. If the array is located around the nodal direction of the direct P-waves, the teleseismic P-waves are dominated by the depth phases. In that case, back projections are actually imaging the reflection points of the depth phases rather than the rupture front. Besides depth phases, the strong and long-lasting coda waves caused by 3D effects near the trench add further complexity, as tested here. The strength contrast of different frequency contents in the rupture models also produces some variation in the back projection results. In the synthetic tests, MUSIC and CS derive consistent results. While MUSIC is more computationally efficient, CS works better for sparse arrays. In summary, our analyses indicate that the impact of the various factors mentioned above should be taken into consideration when interpreting back projection images, before we can use them to infer earthquake rupture physics.
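    MUSIC and CS are beyond a short sketch, but the underlying back projection principle, stacking station records at the travel times predicted for each candidate source, can be illustrated in one dimension (the geometry, wave speed and waveforms below are all hypothetical):

```python
import numpy as np

c = 3.0                                         # wave speed (km/s), hypothetical
stations = np.array([40.0, 55.0, 70.0, 90.0])   # station positions (km)
true_src = 10.0                                 # true source position (km)
t = np.arange(0.0, 60.0, 0.01)                  # time axis (s)

def pulse(t, t0, width=0.2):
    """Gaussian pulse centered at arrival time t0."""
    return np.exp(-((t - t0) / width) ** 2)

# Synthetic records: one direct arrival per station at distance / speed.
records = [pulse(t, abs(s - true_src) / c) for s in stations]

# Back projection: for every candidate source position, sample each record
# at the predicted travel time and stack; the stack peaks at the true source.
candidates = np.arange(0.0, 30.0, 0.1)
stack = np.array([
    sum(np.interp(abs(s - x) / c, t, r) for s, r in zip(stations, records))
    for x in candidates
])
best = candidates[np.argmax(stack)]
```

A depth phase would appear here as a second pulse per record with its own delay; stacking it coherently produces exactly the kind of artificial source the abstract describes.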

  20. Prediction of postoperative pain: a systematic review of predictive experimental pain studies

    DEFF Research Database (Denmark)

    Werner, Mads Utke; Mjöbo, Helena N; Nielsen, Per R

    2010-01-01

    preoperative responses to experimental pain stimuli and clinical postoperative pain and demonstrates that the preoperative pain tests may predict 4-54% of the variance in postoperative pain experience depending on the stimulation methods and the test paradigm used. The predictive strength is much higher than...... previously reported for single factor analyses of demographics and psychologic factors. In addition, some of these studies indicate that an increase in preoperative pain sensitivity is associated with a high probability of development of sustained postsurgical pain....

  1. Network-Based and Binless Frequency Analyses.

    Directory of Open Access Journals (Sweden)

    Sybil Derrible

    Full Text Available We introduce and develop a new network-based and binless methodology to perform frequency analyses and produce histograms. In contrast with traditional frequency analysis techniques that use fixed intervals to bin values, we place a range ±ζ around each individual value in a data set and count the number of values within that range, which allows us to compare every single value of a data set with one another. In essence, the methodology is identical to the construction of a network, where two values are connected if they lie within a given range (±ζ). The value with the highest degree (i.e., most connections) is therefore assimilated to the mode of the distribution. To select an optimal range, we look at the stability of the proportion of nodes in the largest cluster. The methodology is validated by sampling 12 typical distributions, and it is applied to a number of real-world data sets with both spatial and temporal components. The methodology can be applied to any data set and provides a robust means to uncover meaningful patterns and trends. A free Python script and a tutorial are also made available to facilitate the application of the method.
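    The method as described can be sketched directly in NumPy: connect two values when they lie within ±ζ of each other and take the value with the highest degree as the mode (this is an independent re-implementation from the abstract, not the authors' published script, and the sample data are synthetic):

```python
import numpy as np

def binless_mode(values, zeta):
    """Network-based, binless mode estimate: two values are connected when
    they lie within +/- zeta of each other; the value with the highest
    degree (most connections) is taken as the mode of the distribution."""
    v = np.asarray(values, dtype=float)
    # Pairwise adjacency |v_i - v_j| <= zeta, self-connections removed.
    adj = np.abs(v[:, None] - v[None, :]) <= zeta
    degree = adj.sum(axis=1) - 1
    return v[np.argmax(degree)], degree

rng = np.random.default_rng(0)
sample = rng.normal(loc=5.0, scale=1.0, size=500)
mode, degree = binless_mode(sample, zeta=0.25)
```

The pairwise matrix is O(n²) in memory, which is fine for thousands of values; larger data sets would call for a sorted-array or KD-tree neighbour count instead.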

  2. Analyses of demand response in Denmark

    International Nuclear Information System (INIS)

    Moeller Andersen, F.; Grenaa Jensen, S.; Larsen, Helge V.; Meibom, P.; Ravn, H.; Skytte, K.; Togeby, M.

    2006-10-01

    Due to characteristics of the power system, costs of producing electricity vary considerably over short time intervals. Yet, many consumers do not experience corresponding variations in the price they pay for consuming electricity. The topic of this report is: are consumers willing and able to respond to short-term variations in electricity prices, and if so, what is the social benefit of consumers doing so? Taking Denmark and the Nord Pool market as a case, the report focuses on what is known as short-term consumer flexibility or demand response in the electricity market. With focus on market efficiency, efficient allocation of resources and security of supply, the report describes demand response from a micro-economic perspective and provides empirical observations and case studies. The report aims at evaluating benefits from demand response. However, only elements contributing to an overall value are presented. In addition, the analyses are limited to benefits for society, and costs of obtaining demand response are not considered. (au)

  3. Recent advances in cellular glycomic analyses.

    Science.gov (United States)

    Furukawa, Jun-Ichi; Fujitani, Naoki; Shinohara, Yasuro

    2013-02-21

    A large variety of glycans is intricately located on the cell surface, and the overall profile (the glycome, given the entire repertoire of glycoconjugate-associated sugars in cells and tissues) is believed to be crucial for the diverse roles of glycans, which are mediated by specific interactions that control cell-cell adhesion, immune response, microbial pathogenesis and other cellular events. The glycomic profile also reflects cellular alterations, such as development, differentiation and cancerous change. A glycoconjugate-based approach would therefore be expected to streamline discovery of novel cellular biomarkers. Development of such an approach has proven challenging, due to the technical difficulties associated with the analysis of various types of cellular glycomes; however, recent progress in the development of analytical methodologies and strategies has begun to clarify the cellular glycomics of various classes of glycoconjugates. This review focuses on recent advances in the technical aspects of cellular glycomic analyses of major classes of glycoconjugates, including N- and O-linked glycans, derived from glycoproteins, proteoglycans and glycosphingolipids. Articles that unveil the glycomics of various biologically important cells, including embryonic and somatic stem cells, induced pluripotent stem (iPS) cells and cancer cells, are discussed.

  4. Scleral topography analysed by optical coherence tomography.

    Science.gov (United States)

    Bandlitz, Stefan; Bäumer, Joachim; Conrad, Uwe; Wolffsohn, James

    2017-08-01

    A detailed evaluation of the corneo-scleral profile (CSP) is of particular relevance in soft and scleral lens fitting. The aim of this study was to use optical coherence tomography (OCT) to analyse the profile of the limbal sclera and to evaluate the relationship between central corneal radii, corneal eccentricity and scleral radii. Using OCT (Optos OCT/SLO; Dunfermline, Scotland, UK), the limbal scleral radii (SR) of 30 subjects (11M, 19F; mean age 23.8 ± 2.0 years) were measured in eight meridians 45° apart. Central corneal radii (CR) and corneal eccentricity (CE) were evaluated using the Oculus Keratograph 4 (Oculus, Wetzlar, Germany). Differences between SR in the meridians and the associations between SR and corneal topography were assessed. Median SR measured along 45° (58.0 mm; interquartile range, 46.8-84.8 mm) was significantly (p < 0.05) associated with corneal topography and may provide additional data useful in fitting soft and scleral contact lenses. Copyright © 2017 British Contact Lens Association. Published by Elsevier Ltd. All rights reserved.

  5. Statistical analyses of extreme food habits

    International Nuclear Information System (INIS)

    Breuninger, M.; Neuhaeuser-Berthold, M.

    2000-01-01

    This report is a summary of the results of the project "Statistical analyses of extreme food habits", which was commissioned by the National Office for Radiation Protection as a contribution to the amendment of the "General Administrative Regulation to paragraph 45 of the Decree on Radiation Protection: determination of the radiation exposition by emission of radioactive substances from facilities of nuclear technology". Its aim is to show whether the calculation of the radiation ingested via food intake by 95% of the population, as planned in a provisional draft, overestimates the true exposure. If such an overestimation exists, its magnitude should be determined. It was possible to prove the existence of this overestimation, but its magnitude could only be roughly estimated. To identify its real extent, it is necessary to include the specific activities of the nuclides, which were not available for this investigation. In addition, the report shows how the consumption amounts of different food groups influence each other and which relationships between these amounts should be taken into account in order to estimate the radiation exposure as precisely as possible. (orig.)
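    The overestimation the report investigates can be illustrated numerically: for independent consumption amounts, assuming a 95th-percentile consumer of every food group at once yields a larger total than the true 95th percentile of total intake (the lognormal intake distributions below are hypothetical, not the project's data):

```python
import numpy as np

rng = np.random.default_rng(42)
# Hypothetical intake of two food groups (independent, right-skewed).
x = rng.lognormal(mean=0.0, sigma=0.8, size=100_000)
y = rng.lognormal(mean=0.5, sigma=0.6, size=100_000)

p95_sum = np.percentile(x + y, 95)                     # true 95th percentile of total intake
sum_p95 = np.percentile(x, 95) + np.percentile(y, 95)  # "extreme eater of everything"
# sum_p95 exceeds p95_sum: a consumer at the 95th percentile of every food
# group simultaneously is rarer than the 95th percentile of total intake.
```

The size of this gap grows with the number of food groups combined, which is the mechanism behind the overestimation the report describes.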

  6. Water channel structures analysed by electron crystallography.

    Science.gov (United States)

    Tani, Kazutoshi; Fujiyoshi, Yoshinori

    2014-05-01

    The mechanisms underlying water transport through aquaporin (AQP) have been debated for two decades. The water permeation phenomenon of AQP seems inexplicable because the Grotthuss mechanism does not allow for simultaneous fast water permeability and inhibition of proton transfer through the hydrogen bonds of water molecules. The AQP1 structure determined by electron crystallography provided the first insights into the proton exclusion mechanism despite fast water permeation. Although several studies have provided clues about the mechanism based on the AQP structure, each proposed mechanism remains incomplete. The present review is focused on AQP function and structure solved by electron crystallography in an attempt to fill the gaps between the findings in the absence and presence of lipids. Many AQP structures can be superimposed regardless of the determination method. The AQP fold is preserved even under conditions lacking lipids, but the water arrangement in the channel pore differs. The differences might be explained by dipole moments formed by the two short helices in the lipid bilayer. In addition, structure analyses of double-layered two-dimensional crystals of AQP suggest an array formation and cell adhesive function. Electron crystallography findings not only have contributed to resolve some of the water permeation mechanisms, but have also elucidated the multiple functions of AQPs in the membrane. The roles of AQPs in the brain remain obscure, but their multiple activities might be important in the regulation of brain and other biological functions. This article is part of a Special Issue entitled Aquaporins. © 2013.

  7. Interim Basis for PCB Sampling and Analyses

    International Nuclear Information System (INIS)

    BANNING, D.L.

    2001-01-01

    This document was developed as an interim basis for sampling and analysis of polychlorinated biphenyls (PCBs) and will be used until a formal data quality objective (DQO) document is prepared and approved. On August 31, 2000, the Framework Agreement for Management of Polychlorinated Biphenyls (PCBs) in Hanford Tank Waste was signed by the U.S. Department of Energy (DOE), the Environmental Protection Agency (EPA), and the Washington State Department of Ecology (Ecology) (Ecology et al. 2000). This agreement outlines the management of double-shell tank (DST) waste as Toxic Substances Control Act (TSCA) PCB remediation waste based on a risk-based disposal approval option per Title 40 of the Code of Federal Regulations 761.61(c). The agreement calls for "Quantification of PCBs in DSTs, single shell tanks (SSTs), and incoming waste to ensure that the vitrification plant and other ancillary facilities PCB waste acceptance limits and the requirements of the anticipated risk-based disposal approval are met." Waste samples will be analyzed for PCBs to satisfy this requirement. This document describes the DQO process undertaken to assure appropriate data will be collected to support management of PCBs and is presented in a DQO format. The DQO process was implemented in accordance with the U.S. Environmental Protection Agency EPA QA/G-4, Guidance for the Data Quality Objectives Process (EPA 1994), and the Data Quality Objectives for Sampling and Analyses, HNF-IP-0842/Rev.1 A, Vol. IV, Section 4.16 (Banning 1999)

  8. Activation analyses for different fusion structural alloys

    International Nuclear Information System (INIS)

    Attaya, H.; Smith, D.

    1991-01-01

    The leading candidate structural materials, viz., the vanadium alloys, the nickel- or manganese-stabilized austenitic steels, and the ferritic steels, are analysed in terms of their induced activation in the TPSS fusion power reactor. The TPSS reactor has 1950 MW fusion power and inboard and outboard average neutron wall loadings of 3.75 and 5.35 MW/m² respectively. The results show that, after one year of continuous operation, the vanadium alloys have the least radioactivity at reactor shutdown. The maximum difference between the induced radioactivity in the vanadium alloys and in the other iron-based alloys occurs at about 10 years after reactor shutdown. At this time, the total reactor radioactivity, using the vanadium alloys, is about two orders of magnitude less than the total reactor radioactivity utilizing any other alloy. The difference is even larger in the first wall: the first-wall activation of vanadium is 3 orders of magnitude less than that of the other alloys. 2 refs., 7 figs

  9. Reproducibility of neuroimaging analyses across operating systems.

    Science.gov (United States)

    Glatard, Tristan; Lewis, Lindsay B; Ferreira da Silva, Rafael; Adalat, Reza; Beck, Natacha; Lepage, Claude; Rioux, Pierre; Rousseau, Marc-Etienne; Sherif, Tarek; Deelman, Ewa; Khalili-Mahani, Najmeh; Evans, Alan C

    2015-01-01

    Neuroimaging pipelines are known to generate different results depending on the computing platform where they are compiled and executed. We quantify these differences for brain tissue classification, fMRI analysis, and cortical thickness (CT) extraction, using three of the main neuroimaging packages (FSL, Freesurfer and CIVET) and different versions of GNU/Linux. We also identify some causes of these differences using library and system call interception. We find that these packages use mathematical functions based on single-precision floating-point arithmetic whose implementations in operating systems continue to evolve. While these differences have little or no impact on simple analysis pipelines such as brain extraction and cortical tissue classification, their accumulation creates important differences in longer pipelines such as subcortical tissue classification, fMRI analysis, and cortical thickness extraction. With FSL, most Dice coefficients between subcortical classifications obtained on different operating systems remain above 0.9, but values as low as 0.59 are observed. Independent component analyses (ICA) of fMRI data differ between operating systems in one third of the tested subjects, due to differences in motion correction. With Freesurfer and CIVET, in some brain regions we find an effect of build or operating system on cortical thickness. A first step to correct these reproducibility issues would be to use more precise representations of floating-point numbers in the critical sections of the pipelines. The numerical stability of pipelines should also be reviewed.
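    The accumulation of single-precision floating-point differences described above can be seen in a tiny example: the same three numbers summed in different orders give different float32 results, because an intermediate rounding step loses a small term (exactly the kind of discrepancy that compounds across millions of operations in a long pipeline).

```python
import numpy as np

values = np.array([1e8, 1.0, -1e8], dtype=np.float32)

# Left-to-right: (1e8 + 1) rounds back to 1e8 in float32 (the ulp at 1e8
# is 8), so the 1.0 is lost and the final sum is 0.0.
forward = np.float32(0)
for v in values:
    forward = np.float32(forward + v)

# Reordered: (1e8 - 1e8) = 0 first, then + 1.0 survives.
reordered = np.float32(0)
for v in values[[0, 2, 1]]:
    reordered = np.float32(reordered + v)
# forward == 0.0, reordered == 1.0: same data, different results.
```

Different operating systems or math libraries need not change the order themselves; it is enough that they round transcendental functions slightly differently for the same cascade to begin.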

  10. PRECLOSURE CONSEQUENCE ANALYSES FOR LICENSE APPLICATION

    Energy Technology Data Exchange (ETDEWEB)

    S. Tsai

    2005-01-12

    Radiological consequence analyses are performed for potential releases from normal operations in surface and subsurface facilities and from Category 1 and Category 2 event sequences during the preclosure period. Surface releases from normal repository operations are primarily from radionuclides released from opening a transportation cask during dry transfer operations of spent nuclear fuel (SNF) in Dry Transfer Facility 1 (DTF 1), Dry Transfer Facility 2 (DTF 2), the Canister Handling Facility (CHF), or the Fuel Handling Facility (FHF). Subsurface releases from normal repository operations are from resuspension of waste package surface contamination and neutron activation of ventilated air and silica dust from host rock in the emplacement drifts. The purpose of this calculation is to demonstrate that the preclosure performance objectives, specified in 10 CFR 63.111(a) and 10 CFR 63.111(b), have been met for the proposed design and operations in the geologic repository operations area. Preclosure performance objectives are discussed in Section 6.2.3 and are summarized in Tables 1 and 2.

  11. Reliability Analyses of Groundwater Pollutant Transport

    Energy Technology Data Exchange (ETDEWEB)

    Dimakis, Panagiotis

    1997-12-31

    This thesis develops a probabilistic finite element model for the analysis of groundwater pollution problems. Two computer codes were developed: (1) a code using the finite element technique to solve the two-dimensional steady-state equations of groundwater flow and pollution transport, and (2) a first-order reliability method code that can perform a probabilistic analysis of any given analytical or numerical equation. The two codes were connected into one model, PAGAP (Probability Analysis of Groundwater And Pollution). PAGAP can be used to obtain (1) the probability that the concentration at a given point at a given time will exceed a specified value, (2) the probability that the maximum concentration at a given point will exceed a specified value, and (3) the probability that the residence time at a given point will exceed a specified period. PAGAP can be used as a tool for assessment purposes and risk analyses, for instance the assessment of the efficiency of a proposed remediation technique, or to study the effects of parameter distributions for a given problem (sensitivity study). The model has been applied to study the largest self-sustained, precipitation-controlled aquifer in northern Europe, which underlies Oslo's new major airport. 92 refs., 187 figs., 26 tabs.
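    A full first-order reliability method sketch would be long, but the quantity PAGAP estimates, the probability that the concentration at a point exceeds a limit given uncertain inputs, can be illustrated with plain Monte Carlo on a simple analytical transport expression (the 1D advection-decay model and all parameter values below are hypothetical, not taken from PAGAP):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200_000        # Monte Carlo sample size
x = 100.0          # distance to the receptor (m), hypothetical
c0 = 50.0          # source concentration (mg/L), hypothetical
limit = 10.0       # regulatory limit (mg/L), hypothetical

# Uncertain inputs: first-order decay rate k (1/d) and pore velocity v (m/d).
k = rng.lognormal(mean=np.log(0.05), sigma=0.3, size=n)
v = rng.lognormal(mean=np.log(2.0), sigma=0.3, size=n)

# Simple 1D steady-state advection-decay solution (illustrative only):
# C(x) = C0 * exp(-k * x / v)
conc = c0 * np.exp(-k * x / v)
p_exceed = np.mean(conc > limit)  # estimated exceedance probability
```

FORM approximates the same probability from a linearization at the most probable failure point, which is far cheaper than sampling when each model evaluation is an expensive finite element run.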

  12. Evaluation of the Olympus AU-510 analyser.

    Science.gov (United States)

    Farré, C; Velasco, J; Ramón, F

    1991-01-01

    The selective multitest Olympus AU-510 analyser was evaluated according to the recommendations of the Comision de Instrumentacion de la Sociedad Española de Quimica Clinica and the European Committee for Clinical Laboratory Standards. The evaluation was carried out in two stages: an examination of the analytical units and then an evaluation under routine work conditions. The operational characteristics of the system were also studied. The first stage included a photometric study: depending on the absorbance, the inaccuracy varies from +0.5% to -0.6% at 405 nm and from -5.6% to 10.6% at 340 nm; the imprecision ranges between -0.22% and 0.56% at 405 nm and between 0.09% and 2.74% at 340 nm. Linearity was acceptable, apart from a very low absorbance for NADH at 340 nm; and the imprecision of the serum sample pipetter was satisfactory. Twelve serum analytes were studied under routine conditions: glucose, urea, urate, cholesterol, triglycerides, total bilirubin, creatinine, phosphate, iron, aspartate aminotransferase, alanine aminotransferase and gamma-glutamyl transferase. The within-run imprecision (CV%) ranged from 0.67% for phosphate to 2.89% for iron, and the between-run imprecision from 0.97% for total bilirubin to 7.06% for iron. There was no carryover in a study of the serum sample pipetter. Carry-over studies with the reagent and sample pipetters showed some cross-contamination in the iron assay.
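    The within-run imprecision quoted above as CV% is simply the standard deviation of replicate measurements divided by their mean, times 100 (the glucose replicate values below are invented for illustration):

```python
import numpy as np

def cv_percent(replicates):
    """Coefficient of variation (%) = 100 * SD / mean, the imprecision
    measure quoted in the evaluation (sample SD, ddof=1)."""
    r = np.asarray(replicates, dtype=float)
    return 100.0 * r.std(ddof=1) / r.mean()

# Hypothetical within-run replicates of a glucose control (mmol/L).
glucose = [5.51, 5.48, 5.53, 5.47, 5.50, 5.52]
cv = cv_percent(glucose)
```

Within-run CV uses replicates from one analytical run; between-run CV applies the same formula to one result per run collected over many days, which is why it is typically larger.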

  13. Genomic analyses of modern dog breeds.

    Science.gov (United States)

    Parker, Heidi G

    2012-02-01

    A rose may be a rose by any other name, but when you call a dog a poodle it becomes a very different animal than if you call it a bulldog. Both the poodle and the bulldog are examples of dog breeds, of which there are >400 recognized worldwide. Breed creation has played a significant role in shaping the modern dog from the length of his leg to the cadence of his bark. The selection and line-breeding required to maintain a breed has also reshaped the genome of the dog, resulting in a unique genetic pattern for each breed. The breed-based population structure, combined with extensive morphologic variation and shared human environments, has made the dog a popular model for mapping both simple and complex traits and diseases. In order to obtain the most benefit from the dog as a genetic system, it is necessary to understand the effect structured breeding has had on the genome of the species. That is best achieved by looking at genomic analyses of the breeds, their histories, and their relationships to each other.

  14. Soil deflation analyses from wind erosion events

    Directory of Open Access Journals (Sweden)

    Lenka Lackóová

    2015-09-01

    Full Text Available There are various methods to assess soil erodibility for wind erosion. This paper focuses on aggregate analysis by a laser particle sizer ANALYSETTE 22 (FRITSCH GmbH), used to determine the size distribution of soil particles detached by wind (deflated particles). Ten soil samples, trapped along the same length of the erosion surface (150–155 m) but at different wind speeds, were analysed. The soil was sampled from a flat, smooth area without vegetation cover or soil crust, not affected by the impact of windbreaks or other barriers, from a maximum depth of 2.5 cm. Prior to analysis the samples were prepared according to the relevant specifications. An experiment was also conducted using a device that enables characterisation of the vertical movement of the deflated material. The trapped samples showed no differences in particle size or in the proportions of size fractions at different hourly average wind speeds. It was observed that most of the particles travelling in saltation mode (size 50–500 μm; 58–70% of the samples) moved vertically up to 26 cm above the soil surface. At greater heights, particles moving in suspension mode (floating in the air; size < 100 μm) accounted for up to 90% of the samples. This result suggests that the boundary between the two modes of vertical movement of deflated soil particles lies at about 25 cm above the soil surface.

  15. Comparative analyses of bidirectional promoters in vertebrates

    Directory of Open Access Journals (Sweden)

    Taylor James

    2008-05-01

    Full Text Available Abstract Background Orthologous genes with deep phylogenetic histories are likely to retain similar regulatory features. In this report we utilize orthology assignments for pairs of genes co-regulated by bidirectional promoters to map the ancestral history of the promoter regions. Results Our mapping of bidirectional promoters from humans to fish shows that many such promoters emerged after the divergence of chickens and fish. Furthermore, annotations of promoters in deep phylogenies enable detection of missing data or assembly problems present in higher vertebrates. The functional importance of bidirectional promoters is indicated by selective pressure to maintain the arrangement of genes regulated by the promoter over long evolutionary time spans. Characteristics unique to bidirectional promoters are further elucidated using a technique for unsupervised classification, known as ESPERR. Conclusion Results of these analyses will aid in our understanding of the evolution of bidirectional promoters, including whether the regulation of two genes evolved as a consequence of their proximity or if function dictated their co-regulation.

  16. Repository simulation system (REPSIMS) for design analyses

    International Nuclear Information System (INIS)

    Griesmeyer, J.M.; Dennis, A.W.

    1989-01-01

    The Repository Simulation System (REPSIMS) combines graphic programming and interactive simulation to facilitate early identification of acceptable design concepts for a nuclear waste repository. REPSIMS is an object-oriented, menu-driven, versatile computer modeling system that allows the facility designer to create visual models of proposed facilities, graphically define operations, and, using simulation analyses, determine the efficiencies of proposed designs and their operations. Hierarchical representations of both physical facilities and operations allow REPSIMS to be used early in the evaluation of conceptual designs as well as for the analysis of mature designs. High-level models of conceptual designs can be used to identify critical facility layout and operation issues. These preliminary models can then be refined to investigate those issues and to incorporate additional information as it becomes available. REPSIMS thus supports the typical top-down design process in which general specifications for major systems and operations are successively refined as the design progresses. REPSIMS has been used to determine the impact of using robotic, manual contact, or master/slave operations on cask turnaround times, throughput, and equipment utilization, and to investigate the impact of the ratio between truck and rail shipments to the repository. An analysis of alternative designs for the waste-handling building at Yucca Mountain has begun.

  17. Kinematic gait analyses in healthy Golden Retrievers

    Directory of Open Access Journals (Sweden)

    Gabriela C.A. Silva

    2014-12-01

    Full Text Available Kinematic analysis concerns the relative movement between rigid bodies and finds application in gait analysis and other body movements; when changes are present, interpretation of these data guides the choice of treatment to be instituted. The objective of this study was to standardize the gait of healthy Golden Retrievers to assist in the diagnosis and treatment of musculoskeletal disorders. We used a kinematic analysis system to analyse the gait of seven female Golden Retriever dogs, aged between 2 and 4 years, weighing 21.5 to 28 kg and clinically normal. Flexion and extension were described for the shoulder, elbow, carpal, hip, femorotibial and tarsal joints. The gait was characterized as lateral, and the hypothesis of normality was accepted for all variables except the stance phase of the hip and elbow, at a confidence level of 95% (significance level α = 0.05). The variations were attributed to displacement of the markers during movement and to the duplicated number of evaluations. Kinematic analysis proved to be a consistent method for evaluating movement during the canine gait, and the data can be used in the diagnosis and evaluation of canine gait in comparison with other studies and in the treatment of dogs with musculoskeletal disorders.

  18. Theoretical and computational analyses of LNG evaporator

    Science.gov (United States)

    Chidambaram, Palani Kumar; Jo, Yang Myung; Kim, Heuy Dong

    2017-04-01

    Theoretical and numerical analysis of the fluid flow and heat transfer inside a LNG evaporator is conducted in this work. Methane is used instead of LNG as the operating fluid because methane constitutes over 80% of natural gas. The analytical calculations are performed using simple mass and energy balance equations and are made to assess the pressure and temperature variations in the steam tube. Multiphase numerical simulations are performed by solving the governing equations (the basic flow equations of continuity, momentum and energy) in a portion of the evaporator domain consisting of a single steam pipe. The flow equations are solved along with equations of species transport. Multiphase modeling is incorporated using the VOF method. Liquid methane is the primary phase; it vaporizes into the secondary phase, gaseous methane. Steam is another secondary phase, which flows through the heating coils. Turbulence is modeled by a two-equation turbulence model. The theoretical and numerical predictions are seen to match each other well. Further parametric studies are planned based on the current research.

  19. Computational Analyses for Transplant Biomarker Discovery

    Directory of Open Access Journals (Sweden)

    Anyou Wang

    2015-09-01

    Full Text Available Translational medicine holds rich promise for improved diagnostics and drug discovery in biomedical research in the field of transplantation, where continued unmet diagnostic and therapeutic needs persist. The recent advent of genomics and proteomics profiling, collectively called omics, provides new resources for developing novel biomarkers for clinical routine. Establishing such a marker system depends heavily on the appropriate application of computational algorithms and software, which are ultimately based on mathematical theories and models. Understanding these theories helps in applying the appropriate algorithms to ensure that biomarker systems are successful. Here, we review the key advances in theories and mathematical models relevant to transplant biomarker development. The advantages and limitations inherent in these models are discussed. The principles of key computational approaches for efficiently selecting the best subset of biomarkers from high-dimensional omics data are highlighted. Prediction models are introduced, and the integration of multi-microarray data is also discussed. Appreciating these key advances should help to accelerate the development of clinically reliable biomarker systems.

  20. Population Pharmacokinetic Analyses of Lithium: A Systematic Review.

    Science.gov (United States)

    Methaneethorn, Janthima

    2018-02-01

    Even though lithium has been used for the treatment of bipolar disorder for several decades, its toxicities are still being reported. The major limitation in the use of lithium is its narrow therapeutic window. Several methods have been proposed to predict the lithium doses essential to attain therapeutic levels. One of the methods used to guide lithium therapy is the population pharmacokinetic approach, which accounts for inter- and intra-individual variability in predicting lithium doses. Several population pharmacokinetic studies of lithium have been conducted. The objective of this review is to provide information on the population pharmacokinetics of lithium, focusing on the nonlinear mixed-effect modeling approach, and to summarize significant factors affecting lithium pharmacokinetics. A literature search was conducted in the PubMed database from inception to December 2016. Studies conducted in humans, using lithium as the study drug and providing population pharmacokinetic analyses of lithium by means of nonlinear mixed-effect modeling, were included in this review. Twenty-four articles were identified from the database. Seventeen articles were excluded based on the inclusion and exclusion criteria. A total of seven articles were included in this review. Of these, only one study reported a combined population pharmacokinetic-pharmacodynamic model of lithium. Lithium pharmacokinetics were explained using both one- and two-compartment models. The significant predictors of lithium clearance identified in most studies were renal function and body size. One study reported a significant effect of age on lithium clearance. The typical values of lithium clearance ranged from 0.41 to 9.39 L/h. The magnitude of inter-individual variability in lithium clearance ranged from 12.7 to 25.1%. Only two studies evaluated the models using external data sets. The model methodologies of each study are summarized and discussed in this review. For future perspective, a population pharmacokinetic
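
The clearance-based reasoning behind such dosing predictions can be sketched with a one-compartment average steady-state formula, Css,avg = F·Dose/(CL·τ). The regimen and parameter values below are purely hypothetical and not drawn from any of the reviewed studies, though the clearance lies within the 0.41 to 9.39 L/h range reported above:

```python
def css_average(dose_mg, bioavailability, clearance_l_per_h, tau_h):
    """Average steady-state concentration for repeated oral dosing in a
    one-compartment model: Css,avg = F * Dose / (CL * tau)."""
    return bioavailability * dose_mg / (clearance_l_per_h * tau_h)

# Hypothetical regimen: 400 mg every 12 h, complete absorption (F = 1.0),
# clearance 1.5 L/h (inside the range reported in the reviewed studies).
print(round(css_average(400, 1.0, 1.5, 12), 1))  # 22.2 (mg/L)
```

Inverting the same relation (Dose = Css,target · CL · τ / F) is what lets a fitted clearance model, with renal function and body size as covariates, suggest a dose for a target level.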

  1. Castor-1C spent fuel storage cask decay heat, heat transfer, and shielding analyses

    Energy Technology Data Exchange (ETDEWEB)

    Rector, D.R.; McCann, R.A.; Jenquin, U.P.; Heeb, C.M.; Creer, J.M.; Wheeler, C.L.

    1986-12-01

    This report documents the decay heat, heat transfer, and shielding analyses of the Gesellschaft fuer Nuklear Services (GNS) CASTOR-1C cask used in a spent fuel storage demonstration performed at Preussen Elektra's Wurgassen nuclear power plant. The demonstration was performed between March 1982 and January 1984, and resulted in cask and fuel temperature data and cask exterior surface gamma-ray and neutron radiation dose rate measurements. The purpose of the analyses reported here was to evaluate decay heat, heat transfer, and shielding computer codes. The analyses consisted of (1) performing pre-look predictions (predictions performed before the analysts were provided the test data), (2) comparing ORIGEN2 (decay heat), COBRA-SFS and HYDRA (heat transfer), and QAD and DOT (shielding) results to data, and (3) performing post-test analyses if appropriate. Even though two heat transfer codes were used to predict CASTOR-1C cask test data, no attempt was made to compare the two codes. The codes are being evaluated with other test data (single-assembly data and other cask data), and to compare the codes based on one set of data may be premature and lead to erroneous conclusions.

  2. Castor-1C spent fuel storage cask decay heat, heat transfer, and shielding analyses

    International Nuclear Information System (INIS)

    Rector, D.R.; McCann, R.A.; Jenquin, U.P.; Heeb, C.M.; Creer, J.M.; Wheeler, C.L.

    1986-12-01

    This report documents the decay heat, heat transfer, and shielding analyses of the Gesellschaft fuer Nuklear Services (GNS) CASTOR-1C cask used in a spent fuel storage demonstration performed at Preussen Elektra's Wurgassen nuclear power plant. The demonstration was performed between March 1982 and January 1984, and resulted in cask and fuel temperature data and cask exterior surface gamma-ray and neutron radiation dose rate measurements. The purpose of the analyses reported here was to evaluate decay heat, heat transfer, and shielding computer codes. The analyses consisted of (1) performing pre-look predictions (predictions performed before the analysts were provided the test data), (2) comparing ORIGEN2 (decay heat), COBRA-SFS and HYDRA (heat transfer), and QAD and DOT (shielding) results to data, and (3) performing post-test analyses if appropriate. Even though two heat transfer codes were used to predict CASTOR-1C cask test data, no attempt was made to compare the two codes. The codes are being evaluated with other test data (single-assembly data and other cask data), and to compare the codes based on one set of data may be premature and lead to erroneous conclusions.

  3. Applied predictive control

    CERN Document Server

    Sunan, Huang; Heng, Lee Tong

    2002-01-01

    The presence of considerable time delays in the dynamics of many industrial processes, leading to difficult problems in the associated closed-loop control systems, is a well-recognized phenomenon. The performance achievable in conventional feedback control systems can be significantly degraded if an industrial process has a relatively large time delay compared with the dominant time constant. Under these circumstances, advanced predictive control is necessary to improve the performance of the control system significantly. The book is a focused treatment of the subject matter, including the fundamentals and some state-of-the-art developments in the field of predictive control. Three main schemes for advanced predictive control are addressed in this book: • Smith Predictive Control; • Generalised Predictive Control; • a form of predictive control based on Finite Spectrum Assignment. A substantial part of the book addresses application issues in predictive control, providing several interesting case studie...

  4. Seismic Soil-Structure Interaction Analyses of a Deeply Embedded Model Reactor – SASSI Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Nie J.; Braverman J.; Costantino, M.

    2013-10-31

    This report summarizes the SASSI analyses of a deeply embedded reactor model performed by BNL and CJC and Associates, as part of the seismic soil-structure interaction (SSI) simulation capability project for the NEAMS (Nuclear Energy Advanced Modeling and Simulation) Program of the Department of Energy. The SASSI analyses included three cases: 0.2 g, 0.5 g, and 0.9 g, all of which refer to nominal peak accelerations at the top of the bedrock. The analyses utilized the modified subtraction method (MSM) for performing the seismic SSI evaluations. Each case consisted of two analyses: input motion in one horizontal direction (X) and input motion in the vertical direction (Z), both of which utilized the same in-column input motion. Besides providing SASSI results for use in comparison with the time-domain SSI results obtained using the DIABLO computer code, this study also leads to the recognition that the frequency-domain method should be modernized so that it can better serve its mission-critical role in the analysis and design of nuclear power plants.

  5. Summary of the analyses for recovery factors

    Science.gov (United States)

    Verma, Mahendra K.

    2017-07-17

    Introduction: In order to determine the hydrocarbon potential of oil reservoirs within the U.S. sedimentary basins for which the carbon dioxide enhanced oil recovery (CO2-EOR) process has been considered suitable, the CO2 Prophet model was chosen by the U.S. Geological Survey (USGS) to be the primary source for estimating recovery-factor values for individual reservoirs. The choice was made because of the model’s reliability and the ease with which it can be used to assess a large number of reservoirs. The other two approaches—the empirical decline curve analysis (DCA) method and a review of published literature on CO2-EOR projects—were deployed to verify the results of the CO2 Prophet model. This chapter discusses the results from CO2 Prophet (chapter B, by Emil D. Attanasi, this report) and compares them with results from decline curve analysis (chapter C, by Hossein Jahediesfanjani) and those reported in the literature for selected reservoirs with adequate data for analyses (chapter D, by Ricardo A. Olea). To estimate the technically recoverable hydrocarbon potential for oil reservoirs where CO2-EOR has been applied, two of the three approaches—CO2 Prophet modeling and DCA—do not include analysis of economic factors, while the third approach—review of published literature—implicitly includes economics. For selected reservoirs, DCA has provided estimates of the technically recoverable hydrocarbon volumes, which, in combination with calculated amounts of original oil in place (OOIP), helped establish incremental CO2-EOR recovery factors for individual reservoirs. The review of published technical papers and reports has provided substantial information on recovery factors for 70 CO2-EOR projects that are either commercially profitable or classified as pilot tests. When comparing the results, it is important to bear in mind the differences and limitations of these three approaches.

  6. Molecular biomarker analyses using circulating tumor cells.

    Directory of Open Access Journals (Sweden)

    Elizabeth A Punnoose

    2010-09-01

    Full Text Available Evaluation of cancer biomarkers from blood could significantly enable biomarker assessment by providing a relatively non-invasive source of representative tumor material. Circulating Tumor Cells (CTCs) isolated from the blood of metastatic cancer patients hold significant promise in this regard. Using spiked tumor cells, we evaluated CTC capture on different CTC technology platforms, including CellSearch and two biochip platforms, and used the isolated CTCs to develop and optimize assays for molecular characterization of CTCs. We report similar performance for the various platforms tested in capturing CTCs, and find that capture efficiency is dependent on the level of EpCAM expression. We demonstrate that captured CTCs are amenable to biomarker analyses such as HER2 status, qRT-PCR for breast cancer subtype markers, KRAS mutation detection, and EGFR staining by immunofluorescence (IF). We quantify cell surface expression of EGFR in metastatic lung cancer patient samples. In addition, we determined HER2 status by IF and FISH in CTCs from metastatic breast cancer patients. In the majority of patients (89%) we found concordance with HER2 status from patient tumor tissue, though in a subset of patients (11%), HER2 status in CTCs differed from that observed in the primary tumor. Surprisingly, we found CTC counts to be higher in ER+ patients than in HER2+ and triple-negative patients, which could be explained by low EpCAM expression and a more mesenchymal phenotype of tumors belonging to the basal-like molecular subtype of breast cancer. Our data suggest that molecular characterization of captured CTCs is possible and can potentially provide real-time information on biomarker status. In this regard, CTCs hold significant promise as a source of tumor material to facilitate clinical biomarker evaluation. However, limitations exist with a purely EpCAM-based capture system, and the addition of antibodies to mesenchymal markers could further improve CTC

  7. Transient Seepage for Levee Engineering Analyses

    Science.gov (United States)

    Tracy, F. T.

    2017-12-01

    Historically, steady-state seepage analyses have been a key tool for designing levees by practicing engineers. However, with the advances in computer modeling, transient seepage analysis has become a potentially viable tool. A complication is that the levees usually have partially saturated flow, and this is significantly more complicated in transient flow. This poster illustrates four elements of our research in partially saturated flow relating to the use of transient seepage for levee design: (1) a comparison of results from SEEP2D, SEEP/W, and SLIDE for a generic levee cross section common to the southeastern United States; (2) the results of a sensitivity study of varying saturated hydraulic conductivity, the volumetric water content function (as represented by van Genuchten), and volumetric compressibility; (3) a comparison of when soils do and do not exhibit hysteresis, and (4) a description of proper and improper use of transient seepage in levee design. The variables considered for the sensitivity and hysteresis studies are pore pressure beneath the confining layer at the toe, the flow rate through the levee system, and a levee saturation coefficient varying between 0 and 1. Getting results for SEEP2D, SEEP/W, and SLIDE to match proved more difficult than expected. After some effort, the results matched reasonably well. Differences in results were caused by various factors, including bugs, different finite element meshes, different numerical formulations of the system of nonlinear equations to be solved, and differences in convergence criteria. Varying volumetric compressibility affected the above test variables the most. The levee saturation coefficient was most affected by the use of hysteresis. The improper use of pore pressures from a transient finite element seepage solution imported into a slope stability computation was found to be the most grievous mistake in using transient seepage in the design of levees.

  8. EP 1000 steam generator tube rupture analyses

    International Nuclear Information System (INIS)

    Saiu, G.; Frogheri, M.; Schulz, T.L.

    2001-01-01

    European electrical utility organizations, together with Westinghouse and Ansaldo, are participating in a program to use Westinghouse passive nuclear plant technology to develop a plant that meets the European Utility Requirements (EUR) and is expected to be licensable in Europe. The program was initiated in 1994 and the plant is designated EP1000. The EP1000 design is notable for the simplicity that comes from a reliance on passive safety systems to enhance plant safety. The use of passive safety systems has provided significant and measurable improvements in plant simplification, safety, reliability, investment protection and plant costs. These systems use only natural forces such as gravity, natural circulation, and compressed gas to provide the driving forces needed to adequately cool the reactor core following an initiating event. The EP1000 builds on Westinghouse passive nuclear plant technology to enhance plant safety and meet the European Utility Requirements and specific European national safety criteria. This paper summarizes the main results of the Steam Generator Tube Rupture (SGTR) analysis activity performed in Phase 2B of the European Passive Plant Program. The purpose of the study is to provide evidence that the passive safety systems deliver a significant improvement in safety, with substantial margins to steam generator overfilling and a reduced need for operator actions. The behavior of the EP1000 plant following SGTR accidents has been analyzed by means of the RELAP5/Mod3.2 code. Sensitivity cases were performed to address the impact of varying the number of ruptured steam generator tubes, and the potential adverse interactions that could result from operation of control systems (i.e., Chemical and Volume Control System, Startup Feedwater).
Analyses have also been performed to define and verify improved protection system logic to avoid possible steam generator safety valve challenges both in the

  9. Trend analyses with river sediment rating curves

    Science.gov (United States)

    Warrick, Jonathan A.

    2015-01-01

    Sediment rating curves, which are fitted relationships between river discharge (Q) and suspended-sediment concentration (C), are commonly used to assess patterns and trends in river water quality. In many of these studies it is assumed that rating curves have a power-law form (i.e., C = aQb, where a and b are fitted parameters). Two fundamental questions about the utility of these techniques are assessed in this paper: (i) How well do the parameters, a and b, characterize trends in the data? (ii) Are trends in rating curves diagnostic of changes to river water or sediment discharge? As noted in previous research, the offset parameter, a, is not an independent variable for most rivers, but rather strongly dependent on b and Q. Here it is shown that a is a poor metric for trends in the vertical offset of a rating curve, and a new parameter, â, as determined by the discharge-normalized power function [C = â (Q/QGM)b], where QGM is the geometric mean of the Q values sampled, provides a better characterization of trends. However, these techniques must be applied carefully, because curvature in the relationship between log(Q) and log(C), which exists for many rivers, can produce false trends in â and b. Also, it is shown that trends in â and b are not uniquely diagnostic of river water or sediment supply conditions. For example, an increase in â can be caused by an increase in sediment supply, a decrease in water supply, or a combination of these conditions. Large changes in water and sediment supplies can occur without any change in the parameters, â and b. Thus, trend analyses using sediment rating curves must include additional assessments of the time-dependent rates and trends of river water, sediment concentrations, and sediment discharge.
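
The discharge-normalized fit C = â(Q/QGM)^b described above can be sketched as a log-log least-squares regression. The data here are synthetic and noise-free, so the fit recovers the generating parameters exactly (`a_hat` stands in for â):

```python
import math

def fit_rating_curve(q, c):
    """Fit C = a_hat * (Q/Q_GM)**b by least squares in log-log space.

    Q_GM is the geometric mean of the sampled discharges, so a_hat is
    the concentration predicted at Q = Q_GM (the 'vertical offset').
    """
    n = len(q)
    q_gm = math.exp(sum(math.log(x) for x in q) / n)
    xs = [math.log(x / q_gm) for x in q]   # centered: mean(xs) == 0
    ys = [math.log(y) for y in c]
    b = sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)
    a_hat = math.exp(sum(ys) / n)          # intercept at Q = Q_GM
    return a_hat, b, q_gm

# Synthetic record generated from C = 5 * (Q/Q_GM)**1.8 with no noise,
# so the fit should recover a_hat = 5 and b = 1.8.
q = [10.0, 20.0, 50.0, 100.0, 200.0]
gm = math.exp(sum(math.log(x) for x in q) / len(q))
c = [5.0 * (x / gm) ** 1.8 for x in q]
a_hat, b, _ = fit_rating_curve(q, c)
print(round(a_hat, 3), round(b, 3))  # 5.0 1.8
```

Because the regressor is centered on log(QGM), the intercept estimate is decoupled from the slope, which is exactly why â tracks the curve's vertical offset better than the conventional parameter a.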

  10. On study design in neuroimaging heritability analyses

    Science.gov (United States)

    Koran, Mary Ellen; Li, Bo; Jahanshad, Neda; Thornton-Wells, Tricia A.; Glahn, David C.; Thompson, Paul M.; Blangero, John; Nichols, Thomas E.; Kochunov, Peter; Landman, Bennett A.

    2014-03-01

    Imaging genetics is an emerging methodology that combines genetic information with imaging-derived metrics to understand how genetic factors impact observable structural, functional, and quantitative phenotypes. Many of the most well-known genetic studies are based on Genome-Wide Association Studies (GWAS), which use large populations of related or unrelated individuals to associate traits and disorders with individual genetic factors. Merging imaging and genetics may potentially lead to improved power of association in GWAS because imaging traits may be more sensitive phenotypes, being closer to underlying genetic mechanisms, and their quantitative nature inherently increases power. We are developing SOLAR-ECLIPSE (SE) imaging genetics software which is capable of performing genetic analyses with both large-scale quantitative trait data and family structures of variable complexity. This program can estimate the contribution of genetic commonality among related subjects to a given phenotype, and essentially answer the question of whether or not the phenotype is heritable. This central factor of interest, heritability, offers bounds on the direct genetic influence over observed phenotypes. In order for a trait to be a good phenotype for GWAS, it must be heritable: at least some proportion of its variance must be due to genetic influences. A variety of family structures are commonly used for estimating heritability, yet the variability and biases for each as a function of the sample size are unknown. Herein, we investigate the ability of SOLAR to accurately estimate heritability models based on imaging data simulated using Monte Carlo methods implemented in R. We characterize the bias and the variability of heritability estimates from SOLAR as a function of sample size and pedigree structure (including twins, nuclear families, and nuclear families with grandparents).
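
As a back-of-envelope contrast to SOLAR's variance-components machinery, the classical twin design estimates heritability from monozygotic and dizygotic twin correlations via Falconer's formula. This is an illustrative simplification with hypothetical correlation values, not the method evaluated in the study:

```python
def falconer_h2(r_mz, r_dz):
    """Falconer's estimate of narrow-sense heritability:
    h2 = 2 * (r_MZ - r_DZ), where r_MZ and r_DZ are the phenotype
    correlations in monozygotic and dizygotic twin pairs."""
    return 2.0 * (r_mz - r_dz)

# Hypothetical imaging phenotype: MZ correlation 0.8, DZ correlation 0.5.
print(round(falconer_h2(0.8, 0.5), 2))  # 0.6
```

Variance-components methods such as SOLAR's generalize this idea to arbitrary pedigrees by partitioning phenotypic variance against a kinship matrix rather than relying on two fixed relatedness classes.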

  11. Analyses of Hypomethylated Oil Palm Gene Space

    Science.gov (United States)

    Jayanthi, Nagappan; Mohd-Amin, Ab Halim; Azizi, Norazah; Chan, Kuang-Lim; Maqbool, Nauman J.; Maclean, Paul; Brauning, Rudi; McCulloch, Alan; Moraga, Roger; Ong-Abdullah, Meilina; Singh, Rajinder

    2014-01-01

    Demand for palm oil has been increasing by an average of ∼8% over the past decade and currently accounts for about 59% of the world's vegetable oil market. This drives the need to increase palm oil production. Nevertheless, due to the increasing need for sustainable production, it is imperative to increase productivity rather than the area cultivated. Studies on the oil palm genome are essential to help identify genes or markers that are associated with important processes or traits, such as flowering, yield and disease resistance. To achieve this, 294,115 and 150,744 sequences from the hypomethylated or gene-rich regions of the Elaeis guineensis and E. oleifera genomes were sequenced and assembled into contigs. An additional 16,427 shot-gun sequences and 176 bacterial artificial chromosomes (BAC) were also generated to check the quality of the libraries constructed. Comparison of these sequences revealed that although the methylation-filtered libraries were sequenced at low coverage, they still tagged at least 66% of the RefSeq-supported genes in the BAC and had a filtration power of at least 2.0. A total of 33,752 microsatellites and 40,820 high-quality single nucleotide polymorphism (SNP) markers were identified. These represent the most comprehensive collection of microsatellites and SNPs to date and would be an important resource for genetic mapping and association studies. The gene models predicted from the assembled contigs were mined for genes of interest, and 242 oil palm transcription factors, 65 resistance genes and 14 miRNAs were identified, respectively. Examples of the transcription factors tagged include those associated with floral development and tissue culture, such as homeodomain proteins, MADS, Squamosa and Apetala2. The E. guineensis and E. oleifera hypomethylated sequences provide an important resource to understand the molecular mechanisms associated with important agronomic traits in oil palm. PMID:24497974

  12. Quantitative DNA Analyses for Airborne Birch Pollen.

    Directory of Open Access Journals (Sweden)

    Isabell Müller-Germann

    Full Text Available Birch trees produce large amounts of highly allergenic pollen grains that are distributed by wind and impact human health by causing seasonal hay fever, pollen-related asthma, and other allergic diseases. Traditionally, pollen forecasts are based on conventional microscopic counting techniques that are labor-intensive and limited in the reliable identification of species. Molecular biological techniques provide an alternative approach that is less labor-intensive and enables identification of any species by its genetic fingerprint. A particularly promising method is quantitative real-time polymerase chain reaction (qPCR), which can be used to determine the number of DNA copies and thus pollen grains in air filter samples. During the birch pollination season in 2010 in Mainz, Germany, we collected air filter samples of fine (<3 μm) and coarse air particulate matter. These were analyzed by qPCR using two different primer pairs: one for a single-copy gene (BP8) and the other for a multi-copy gene (ITS). The BP8 gene proved more suitable for reliable qPCR results, and the qPCR results obtained for coarse particulate matter correlated well with the birch pollen forecasts of the regional air quality model COSMO-ART. As expected from the size of birch pollen grains (~23 μm), the concentration of DNA in fine particulate matter was lower than in the coarse particle fraction: by a factor of 64 for the ITS region and 51 for the single-copy gene BP8. The possible presence of so-called sub-pollen particles in the fine particle fraction is, however, interesting even at low concentrations. Such particles are known to be highly allergenic, to reach deep into the airways, and to often cause severe health problems. In conclusion, the results of this exploratory study open up the possibility of predicting and quantifying the pollen concentration in the atmosphere more precisely in the future.
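
    The copy-number quantification behind such qPCR measurements rests on a log-linear standard curve relating cycle threshold (Ct) to copy number. A minimal sketch; the slope and intercept below are hypothetical placeholders, not values from this study:

```python
def ct_to_copies(ct, slope=-3.32, intercept=38.0):
    """Convert a qPCR cycle-threshold (Ct) value into a DNA copy number via
    the log-linear standard curve Ct = slope * log10(copies) + intercept.
    The defaults are illustrative only; a slope of -3.32 corresponds to
    ~100% amplification efficiency."""
    return 10 ** ((ct - intercept) / slope)

def coarse_to_fine_factor(ct_coarse, ct_fine):
    """Factor by which the coarse fraction carries more DNA than the fine one
    (lower Ct means more template)."""
    return ct_to_copies(ct_coarse) / ct_to_copies(ct_fine)
```

    With a calibrated curve, the coarse-to-fine factors reported above (64 and 51) follow directly from the Ct difference between the two filter fractions.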

  13. Analyses of hypomethylated oil palm gene space.

    Directory of Open Access Journals (Sweden)

    Eng-Ti L Low

    Full Text Available Demand for palm oil has been increasing by an average of ∼8% per year over the past decade and currently accounts for about 59% of the world's vegetable oil market. This drives the need to increase palm oil production. Nevertheless, due to the increasing need for sustainable production, it is imperative to increase productivity rather than the area cultivated. Studies on the oil palm genome are essential to help identify genes or markers that are associated with important processes or traits, such as flowering, yield and disease resistance. To achieve this, 294,115 and 150,744 sequences from the hypomethylated or gene-rich regions of the Elaeis guineensis and E. oleifera genomes were sequenced and assembled into contigs. An additional 16,427 shotgun sequences and 176 bacterial artificial chromosomes (BAC) were also generated to check the quality of the libraries constructed. Comparison of these sequences revealed that although the methylation-filtered libraries were sequenced at low coverage, they still tagged at least 66% of the RefSeq-supported genes in the BACs and had a filtration power of at least 2.0. A total of 33,752 microsatellites and 40,820 high-quality single nucleotide polymorphism (SNP) markers were identified. These represent the most comprehensive collection of microsatellites and SNPs to date and would be an important resource for genetic mapping and association studies. The gene models predicted from the assembled contigs were mined for genes of interest, and 242, 65 and 14 oil palm transcription factors, resistance genes and miRNAs were identified, respectively. Examples of the transcription factors tagged include those associated with floral development and tissue culture, such as homeodomain proteins, MADS, Squamosa and Apetala2. The E. guineensis and E. oleifera hypomethylated sequences provide an important resource to understand the molecular mechanisms associated with important agronomic traits in oil palm.

  14. Predictive modelling of noise level generated during sawing of rocks ...

    Indian Academy of Sciences (India)

    2016-08-26

    Aug 26, 2016 ... The influence of the operating variables and rock properties on the noise level is investigated and analysed. Statistical analyses are then employed and models are built for the prediction of noise levels depending on the operating variables and the rock properties. The derived models are validated through ...

  15. Spatial Ensemble Postprocessing of Precipitation Forecasts Using High Resolution Analyses

    Science.gov (United States)

    Lang, Moritz N.; Schicker, Irene; Kann, Alexander; Wang, Yong

    2017-04-01

    Ensemble prediction systems are designed to account for errors or uncertainties in the initial and boundary conditions, imperfect parameterizations, etc. However, due to sampling errors and underestimation of the model errors, these ensemble forecasts tend to be underdispersive and to lack both reliability and sharpness. To overcome such limitations, statistical postprocessing methods are commonly applied to these forecasts. In this study, a full-distributional spatial post-processing method is applied to short-range precipitation forecasts over Austria using Standardized Anomaly Model Output Statistics (SAMOS). Following Stauffer et al. (2016), observation and forecast fields are transformed into standardized anomalies by subtracting a site-specific climatological mean and dividing by the climatological standard deviation. Because only a single regression model needs to be fitted for the whole domain, the SAMOS framework provides a computationally inexpensive method to create operationally calibrated probabilistic forecasts for any arbitrary location or for all grid points in the domain simultaneously. Taking advantage of the INCA system (Integrated Nowcasting through Comprehensive Analysis), high resolution analyses are used for the computation of the observed climatology and for model training. The INCA system operationally combines station measurements and remote sensing data into real-time objective analysis fields at 1 km horizontal resolution and 1 h temporal resolution. The precipitation forecast used in this study is obtained from a limited area model ensemble prediction system also operated by ZAMG. The so-called ALADIN-LAEF provides, through a multi-physics approach, a 17-member forecast at a horizontal resolution of 10.9 km and a temporal resolution of 1 hour. The SAMOS approach thus statistically combines the in-house developed high resolution analysis and ensemble prediction systems. The station-based validation of 6-hour precipitation sums
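
    The standardized-anomaly transform at the heart of SAMOS (following Stauffer et al., 2016) is simple to state in code. A minimal sketch, using made-up climatology values rather than INCA data:

```python
from statistics import mean, pstdev

def standardized_anomaly(value, climatology):
    """Transform an observed or forecast value into a standardized anomaly:
    subtract the site-specific climatological mean and divide by the
    climatological standard deviation (the SAMOS preprocessing step)."""
    return (value - mean(climatology)) / pstdev(climatology)

def to_anomalies(field, climatology):
    """Apply the transform to every station or grid-point value in a field."""
    return [standardized_anomaly(v, climatology) for v in field]
```

    After this transform, one regression model can be fitted over the whole domain, since all sites share a common (anomaly) scale.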

  16. DMINDA: an integrated web server for DNA motif identification and analyses.

    Science.gov (United States)

    Ma, Qin; Zhang, Hanyuan; Mao, Xizeng; Zhou, Chuan; Liu, Bingqiang; Chen, Xin; Xu, Ying

    2014-07-01

    DMINDA (DNA motif identification and analyses) is an integrated web server for DNA motif identification and analyses, accessible at http://csbl.bmb.uga.edu/DMINDA/. The web site is freely available to all users, with no login requirement. The server provides a suite of cis-regulatory motif analysis functions on DNA sequences that are important for elucidating the mechanisms of transcriptional regulation: (i) de novo motif finding for a given set of promoter sequences, along with statistical scores for the predicted motifs derived from information extracted from a control set; (ii) scanning for instances of a query motif in provided genomic sequences; (iii) comparison and clustering of identified motifs; and (iv) co-occurrence analyses of query motifs in given promoter sequences. The server is powered by a backend computer cluster with over 150 computing nodes and is particularly useful for motif prediction and analyses in prokaryotic genomes. We believe that DMINDA, as a new and comprehensive web server for cis-regulatory motif finding and analyses, will benefit the genomic research community in general and prokaryotic genome researchers in particular. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
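
    Function (ii), scanning genomic sequence for instances of a query motif, is commonly implemented as a sliding log-odds scan against a position weight matrix. A toy illustration of that idea (not DMINDA's actual scoring code); the matrix, threshold, and uniform 0.25 background are assumptions:

```python
import math

def pwm_score(pwm, site):
    """Log-odds score of a candidate site under a position weight matrix.
    `pwm` maps each base to per-position probabilities; a uniform 0.25
    background is assumed."""
    return sum(math.log2(pwm[base][i] / 0.25) for i, base in enumerate(site))

def scan(pwm, sequence, threshold):
    """Return (offset, site, score) for every window scoring >= threshold."""
    w = len(next(iter(pwm.values())))  # motif width
    hits = []
    for i in range(len(sequence) - w + 1):
        site = sequence[i:i + w]
        score = pwm_score(pwm, site)
        if score >= threshold:
            hits.append((i, site, score))
    return hits
```
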

  17. A systematic review of the quality and impact of anxiety disorder meta-analyses.

    Science.gov (United States)

    Ipser, Jonathan C; Stein, Dan J

    2009-08-01

    Meta-analyses are seen as representing the pinnacle of a hierarchy of evidence used to inform clinical practice. Therefore, the potential importance of differences in the rigor with which they are conducted and reported warrants consideration. In this review, we use standardized instruments to describe the scientific and reporting quality of meta-analyses of randomized controlled trials of the treatment of anxiety disorders. We also use traditional and novel metrics of article impact to assess the influence of meta-analyses across a range of research fields in the anxiety disorders. Overall, although the meta-analyses that we examined had some flaws, their quality of reporting was generally acceptable. Neither the scientific nor reporting quality of the meta-analyses was predicted by any of the impact metrics. The finding that treatment meta-analyses were cited less frequently than quantitative reviews of studies in current "hot spots" of research (ie, genetics, imaging) points to the multifactorial nature of citation patterns. A list of the meta-analyses included in this review is available on an evidence-based website of anxiety and trauma-related disorders.

  18. Predictable or not predictable? The MOV question

    International Nuclear Information System (INIS)

    Thibault, C.L.; Matzkiw, J.N.; Anderson, J.W.; Kessler, D.W.

    1994-01-01

    Over the past 8 years, the nuclear industry has struggled to understand the dynamic phenomena experienced during motor-operated valve (MOV) operation under differing flow conditions. For some valves and designs, operational functionality has been found to be predictable; for others, unpredictable. Although much has been accomplished over this period, especially in modeling valve dynamics, the unpredictability of many valves and designs persists. A few valve manufacturers are focusing on improving design and fabrication techniques to enhance product reliability and predictability. However, this approach does not address these issues for installed and unpredictable valves. This paper presents some of the more promising techniques that Wyle Laboratories has explored with potential for transforming unpredictable valves into predictable ones and for retrofitting installed MOVs. These techniques include optimized valve tolerancing, surrogate material evaluation, and enhanced surface treatments

  19. Proposed Testing to Assess the Accuracy of Glass-To-Metal Seal Stress Analyses.

    Energy Technology Data Exchange (ETDEWEB)

    Chambers, Robert S.; Emery, John M; Tandon, Rajan; Antoun, Bonnie R.; Stavig, Mark E.; Newton, Clay S.; Gibson, Cory S; Bencoe, Denise N.

    2014-09-01

    The material characterization tests conducted on 304L VAR stainless steel and Schott 8061 glass have provided higher-fidelity data for calibration of the material models used in Glass-to-Metal (GTM) seal analyses. Specifically, a thermo-multi-linear elastic-plastic (thermo-MLEP) material model has been defined for SS304L, and the Simplified Potential Energy Clock nonlinear viscoelastic model has been calibrated for the S8061 glass. To assess the accuracy of finite element stress analyses of GTM seals, a suite of tests is proposed to provide data for comparison to model predictions.

  20. Nonlinear finite element analyses: advances and challenges in dental applications.

    Science.gov (United States)

    Wakabayashi, N; Ona, M; Suzuki, T; Igarashi, Y

    2008-07-01

    To discuss the development and current status of applications of the nonlinear finite element method (FEM) in dentistry. The literature was searched for original research articles with keywords such as nonlinear, finite element analysis, and tooth/dental/implant. References were selected manually or searched from the PUBMED and MEDLINE databases through November 2007. The nonlinear problems analyzed in FEM studies were reviewed and categorized into: (A) nonlinear simulations of the periodontal ligament (PDL), (B) plastic and viscoelastic behaviors of dental materials, (C) contact phenomena in tooth-to-tooth contact, (D) contact phenomena within prosthodontic structures, and (E) interfacial mechanics between the tooth and the restoration. Recent FEM work in dentistry has focused on simulating realistic intra-oral conditions, such as the nonlinear stress-strain relationship in the periodontal tissues and the contact phenomena in teeth, which could hardly be solved by the linear static model. The definition of the contact area critically affects the reliability of contact analyses, especially for implant-abutment complexes. To predict the failure risk of a bonded tooth-restoration interface, it is essential to assess the normal and shear stresses relative to the interface. Including viscoelasticity and plastic deformation in the programs, to account for the time-dependent, thermally sensitive, and largely deformable nature of dental materials, would enhance their application. Further improvement of nonlinear FEM solutions should be encouraged to widen the range of applications in dental and oral health science.

  1. Non-Statistical Methods of Analysing of Bankruptcy Risk

    Directory of Open Access Journals (Sweden)

    Pisula Tomasz

    2015-06-01

    Full Text Available The article focuses on assessing the effectiveness of a non-statistical approach to bankruptcy modelling in enterprises operating in the logistics sector. In order to describe the issue more comprehensively, the aforementioned prediction of the possible negative results of business operations was carried out for companies functioning in the Polish region of Podkarpacie and in Slovakia. The bankruptcy predictors selected for the assessment of companies operating in the logistics sector included 28 financial indicators characterizing these enterprises in terms of their financial standing and management effectiveness. The purpose of the study was to identify factors (models) describing the bankruptcy risk in enterprises in the context of their forecasting effectiveness over one-year and two-year time horizons. In order to assess their practical applicability, the models were carefully analysed and validated. The usefulness of the models was assessed in terms of their classification properties and their capacity to accurately identify enterprises at risk of bankruptcy and healthy companies, as well as the proper calibration of the models to the training sample data.
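
    The classification properties used for validation (correctly flagging at-risk firms versus correctly clearing healthy ones) reduce to sensitivity and specificity. A minimal sketch with hypothetical labels, not the study's data; 1 marks an at-risk firm, 0 a healthy one:

```python
def classification_report(y_true, y_pred):
    """Sensitivity (at-risk firms correctly flagged) and specificity
    (healthy firms correctly cleared) from binary labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity
```
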

  2. Quantifying Fire Cycle from Dendroecological Records Using Survival Analyses

    Directory of Open Access Journals (Sweden)

    Dominic Cyr

    2016-06-01

    Full Text Available Quantifying fire regimes in the boreal forest ecosystem is crucial for understanding its past and present dynamics, as well as for predicting its future dynamics. Survival analyses have often been used to estimate the fire cycle in eastern Canada because they can account for the censored information that is made prevalent by the typically long fire return intervals and the limited scope of the dendroecological methods used to quantify them. Here, we assess how the true length of the fire cycle, short-term temporal variations in fire activity, and the sampling effort affect the accuracy and precision of estimates obtained from two types of parametric survival models, the Weibull and the exponential models, and one non-parametric model obtained with the Cox regression. We then apply those results to a case area located in eastern Canada. Our simulation experiment confirms some documented concerns regarding the detrimental effects of temporal variations in fire activity on parametric estimation of the fire cycle. Cox regression appears to provide the most accurate and robust estimator, being by far the least affected by temporal variations in fire activity. The Cox-based estimate of the fire cycle for the last 300 years in the case study area is 229 years (CI95: 162–407), compared with the likely overestimated 319 years obtained with the commonly used exponential model.
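
    Under the exponential survival model mentioned above, the maximum-likelihood fire-cycle estimate reduces to total time-at-risk divided by the number of observed fires, with censored intervals contributing exposure but no event. A minimal sketch of that idea with illustrative data, not the paper's code:

```python
def fire_cycle_exponential(intervals):
    """MLE of the fire cycle under an exponential survival model.
    `intervals` is a list of (time, event) pairs: event=True marks an
    observed fire interval, event=False a censored record (no fire seen
    within the observation window). Censored records add exposure time but
    no event, so they lengthen the estimated cycle."""
    exposure = sum(t for t, _ in intervals)
    events = sum(1 for _, e in intervals if e)
    if events == 0:
        raise ValueError("no observed fires: cycle cannot be estimated")
    return exposure / events
```

    Ignoring the censored records (dropping them instead of counting their exposure) would bias the cycle downwards, which is why survival methods matter for long-interval regimes.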

  3. Visualizing Confidence in Cluster-Based Ensemble Weather Forecast Analyses.

    Science.gov (United States)

    Kumpf, Alexander; Tost, Bianca; Baumgart, Marlene; Riemer, Michael; Westermann, Rudiger; Rautenhaus, Marc

    2018-01-01

    In meteorology, cluster analysis is frequently used to determine representative trends in ensemble weather predictions in a selected spatio-temporal region, e.g., to reduce a set of ensemble members to simplify and improve their analysis. Identified clusters (i.e., groups of similar members), however, can be very sensitive to small changes of the selected region, so that clustering results can be misleading and bias subsequent analyses. In this article, we, a team of visualization scientists and meteorologists, deliver visual analytics solutions to analyze the sensitivity of clustering results with respect to changes of a selected region. We propose an interactive visual interface that enables simultaneous visualization of (a) the variation in composition of identified clusters (i.e., their robustness), (b) the variability in cluster membership for individual ensemble members, and (c) the uncertainty in the spatial locations of identified trends. We demonstrate that our solution shows meteorologists how representative a clustering result is, and with respect to which changes in the selected region it becomes unstable. Furthermore, our solution helps to identify those ensemble members which stably belong to a given cluster and can thus be considered similar. In a real-world application case we show how our approach is used to analyze the clustering behavior of different regions in a forecast of "Tropical Cyclone Karl", guiding the user towards the cluster robustness information required for subsequent ensemble analysis.
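
    One simple way to quantify point (a), the variation in cluster composition under a shifted region, is the Jaccard similarity between member sets of the reference clustering and the perturbed clustering. A sketch with hypothetical member IDs, not the authors' implementation:

```python
def jaccard(a, b):
    """Jaccard similarity of two sets of ensemble-member IDs."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 1.0

def cluster_robustness(reference, perturbed):
    """For each cluster from the reference region, the best-matching Jaccard
    similarity among clusters from a slightly shifted region. Values near 1
    indicate clusters insensitive to the region choice."""
    return [max(jaccard(ref, p) for p in perturbed) for ref in reference]
```
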

  4. Genome-wide analyses of small noncoding RNAs in streptococci

    Directory of Open Access Journals (Sweden)

    Nadja ePatenge

    2015-05-01

    Full Text Available Streptococci represent a diverse group of Gram-positive bacteria that colonize a wide range of animal and human hosts. Streptococcal species occur as commensal as well as pathogenic organisms. Many of the pathogenic species can cause severe, invasive infections in their hosts, leading to high morbidity and mortality. The consequence is tremendous suffering for humans and livestock, besides a significant financial burden in the agricultural and healthcare sectors. An environmentally stimulated and tightly controlled expression of virulence factor genes is of fundamental importance for streptococcal pathogenicity. Bacterial small noncoding RNAs (sRNAs) modulate the expression of genes involved in stress response, sugar metabolism, surface composition, and other properties that are related to bacterial virulence. Even though the regulatory character is shared by this class of RNAs, variation at the molecular level results in a high diversity of functional mechanisms. Knowledge about the role of sRNAs in streptococci is still limited, but in recent years, genome-wide screens for sRNAs have been conducted in an increasing number of species. Bioinformatics prediction approaches have been employed, as well as expression analyses by classical array techniques or next-generation sequencing. This review gives an overview of whole-genome screens for sRNAs in streptococci, with a focus on describing the different methods and comparing their outcomes with respect to sRNA conservation among species, functional similarities, and relevance for streptococcal infection.

  5. Comparative transcriptomic analyses of male and female adult Toxocara canis.

    Science.gov (United States)

    Zhou, Rong-Qiong; Ma, Guang-Xu; Korhonen, Pasi K; Luo, Yong-Li; Zhu, Hong-Hong; Luo, Yong-Fang; Gasser, Robin B; Xia, Qing-You

    2017-02-05

    Toxocariasis is an important, neglected zoonosis caused mainly by Toxocara canis. Although our knowledge of helminth molecular biology is improving through completed draft genome projects, there is limited detailed information on the molecular biology of Toxocara species. Here, transcriptomic sequencing of male and female adult T. canis and comparative analyses were conducted. For each sex, two-thirds (66-67%) of quality-filtered reads mapped to the gene set of T. canis, and at least five reads mapped to each of 16,196 (87.1%) of all 18,596 genes; 321 genes were transcribed exclusively in female and 1467 exclusively in male T. canis. Genes differentially transcribed between the two sexes were identified, the enriched biological processes and pathways linked to these genes were established, and molecules associated with reproduction and development were predicted. In addition, small RNA pathways involved in reproduction were characterized, but there was no evidence for piwi RNA pathways in adult T. canis. The results of this transcriptomic study should provide a useful basis to support investigations of the reproductive biology of T. canis and related nematodes. Copyright © 2016 Elsevier B.V. All rights reserved.

  6. Experimental and numerical analyses of magnesium alloy hot workability

    Directory of Open Access Journals (Sweden)

    F. Abbassi

    2016-12-01

    Full Text Available Due to their hexagonal crystal structure, magnesium alloys have relatively low workability at room temperature. In this study, the hot workability behavior of cast-extruded AZ31B magnesium alloy is studied through hot compression testing, numerical modeling and microstructural analyses. Hot deformation tests are performed at temperatures of 250 °C to 400 °C under strain rates of 0.01 to 1.0 s−1. Transmission electron microscopy is used to reveal the presence of dynamic recrystallization (DRX), dynamic recovery (DRV), cracks and shear bands. To predict plastic instabilities during hot compression tests of AZ31B magnesium alloy, the authors use the Johnson–Cook damage model in a 3D finite element simulation. The optimal hot workability of the magnesium alloy is found at a temperature (T) of 400 °C and a strain rate (ε̇) of 0.01 s−1. Stability is found at the lower strain rate, and instability at the higher strain rate.
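
    The Johnson–Cook damage model used in such simulations predicts the equivalent plastic strain at failure as a product of stress-triaxiality, strain-rate, and temperature terms. A sketch of the standard formula; the D1–D5 constants below are purely illustrative and would need to be fitted to AZ31B test data:

```python
import math

def jc_failure_strain(stress_triax, strain_rate, T, d,
                      eps0=1.0, T_room=25.0, T_melt=650.0):
    """Johnson-Cook failure strain:
    eps_f = [D1 + D2*exp(D3*sigma*)] * [1 + D4*ln(eps_dot*)] * [1 + D5*T*]
    where sigma* is stress triaxiality, eps_dot* = strain_rate/eps0 the
    normalized strain rate, and T* the homologous temperature.
    `d` holds the material constants (D1..D5); values are illustrative."""
    Tstar = (T - T_room) / (T_melt - T_room)
    return ((d[0] + d[1] * math.exp(d[2] * stress_triax))
            * (1 + d[3] * math.log(strain_rate / eps0))
            * (1 + d[4] * Tstar))
```

    In an FE simulation, damage accumulates as the ratio of plastic strain increments to this failure strain; an element fails when the accumulated ratio reaches one.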

  7. Sensitivity analyses of the peach bottom turbine trip 2 experiment

    International Nuclear Information System (INIS)

    Bousbia Salah, A.; D'Auria, F.

    2003-01-01

    In the light of sustained developments in computer technology, the possibilities for code calculations to predict more realistic transient scenarios in nuclear power plants have been enlarged substantially. It has therefore become feasible to perform best-estimate simulations by incorporating three-dimensional modeling of the reactor core into system codes. This method is particularly suited to complex transients that involve strong feedback effects between thermal-hydraulics and kinetics, as well as to transients involving local asymmetric effects. The Peach Bottom turbine trip test is characterized by a prompt core power excursion followed by a self-limiting power behavior. To emphasize and understand the feedback mechanisms involved during this transient, a series of sensitivity analyses were carried out. This allows the characterization of discrepancies between measured and calculated trends and an assessment of the impact of the thermal-hydraulic and kinetic responses of the models used. On the whole, the data comparison revealed a close dependency of the power excursion on the core feedback mechanisms. Thus, for a better best-estimate simulation of the transient, both the thermal-hydraulic and the kinetic models should be made more accurate. (author)

  8. Kuosheng Mark III containment analyses using GOTHIC

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Ansheng, E-mail: samuellin1999@iner.gov.tw; Chen, Yen-Shu; Yuann, Yng-Ruey

    2013-10-15

    Highlights: • The Kuosheng Mark III containment model is established using GOTHIC. • Containment pressure and temperature responses due to LOCA are presented. • The calculated results are all below the design values and compared with the FSAR results. • The calculated results can serve as an analysis reference for an SPU project in the future. -- Abstract: Kuosheng nuclear power plant in Taiwan is a twin-unit BWR/6 plant, and both units utilize the Mark III containment. Currently, the plant is performing a stretch power uprate (SPU) project to increase the core thermal power to 103.7% OLTP (original licensed thermal power). However, the containment response in the Kuosheng Final Safety Analysis Report (FSAR) was completed more than twenty-five years ago. The purpose of this study is to establish a Kuosheng Mark III containment model using the containment program GOTHIC. The containment pressure and temperature responses under the design-basis accidents, which are the main steam line break (MSLB) and the recirculation line break (RCLB) accidents, are investigated. Short-term and long-term analyses are presented in this study. The short-term analysis calculates the drywell peak pressure and temperature, which occur in the early stage of the LOCAs. The long-term analysis calculates the peak pressure and temperature of the reactor building space. In the short-term analysis, the calculated peak drywell-to-wetwell differential pressure is 140.6 kPa for the MSLB, which is below the design value of 189.6 kPa. The calculated peak drywell temperature is 158 °C, which is still below the design value of 165.6 °C. In addition, in the long-term analysis, the calculated peak containment pressure is 47 kPa G, which is below the design value of 103.4 kPa G. The calculated peak containment temperature is 74.7 °C, which is lower than the design value of 93.3 °C. Therefore, the Kuosheng Mark III containment can maintain the integrity after

  9. Vibro-spring particle size distribution analyser

    International Nuclear Information System (INIS)

    Patel, Ketan Shantilal

    2002-01-01

    This thesis describes the design and development of an automated pre-production particle size distribution analyser for particles in the 20 - 2000 μm size range. This work is a follow-up to the vibro-spring particle sizer reported by Shaeri. In its most basic form, the instrument comprises a horizontally held closed-coil helical spring that is partly filled with the test powder and sinusoidally vibrated in the transverse direction. Particle size distribution data are obtained by stretching the spring to known lengths and measuring the mass of the powder discharged from the spring's coils. The size of the particles, on the other hand, is determined from the spring 'intercoil' distance. The instrument developed by Shaeri had limited use due to its inability to measure sample mass directly. For the device reported here, modifications are made to the original configuration to establish means of direct sample mass measurement. The feasibility of techniques for measuring the mass of powder retained within the spring is investigated in detail. Initially, the measurement of mass is executed in-situ from the vibration characteristics, based on the spring's first harmonic resonant frequency. This method is often erratic and unreliable due to particle-particle and particle-spring-wall interactions and the bending of the spring. A much more successful alternative is found in a more complicated arrangement in which the spring forms part of a stiff cantilever system pivoted along its main axis. Here, the sample mass is determined in the 'static mode' by monitoring the cantilever beam's deflection following the wanton termination of vibration. The system performance has been optimised through variations of the mechanical design of the key components and the operating procedure, as well as by taking into account the effect of changes in ambient temperature on the system's response. The thesis also describes the design and development of the ancillary mechanisms. These include the pneumatic
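
    The in-situ mass measurement from the spring's first harmonic rests on the single-degree-of-freedom relation f = (1/2π)√(k/m). A minimal sketch of inverting it for the retained powder mass; the stiffness and tare-mass values in the test are illustrative, not from the thesis:

```python
import math

def mass_from_resonance(freq_hz, stiffness, empty_mass):
    """Infer the powder mass retained in the spring from the measured first
    resonant frequency f = (1/(2*pi)) * sqrt(k/m) of the spring-powder
    system: solve for the total vibrating mass, then subtract the tare
    (empty spring) mass."""
    total_mass = stiffness / (2 * math.pi * freq_hz) ** 2
    return total_mass - empty_mass
```

    As the abstract notes, in practice this estimate is degraded by particle-spring interactions, which is why the thesis moves to a static cantilever-deflection measurement.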

  10. Advanced core-analyses for subsurface characterization

    Science.gov (United States)

    Pini, R.

    2017-12-01

    The heterogeneity of geological formations varies over a wide range of length scales and represents a major challenge for predicting the movement of fluids in the subsurface. Although they are inherently limited in the accessible length scale, laboratory measurements on reservoir core samples still represent the only way to make direct observations of key transport properties. Yet, properties derived from these samples are of limited use and should be regarded as sample-specific (or 'pseudos') if the presence of sub-core-scale heterogeneities is not accounted for in data processing and interpretation. The advent of imaging technology has significantly reshaped the landscape of so-called Special Core Analysis (SCAL) by providing unprecedented insight into rock structure and processes down to the scale of a single pore throat (i.e. the scale at which all reservoir processes operate). Accordingly, improved laboratory workflows are needed that make use of this wealth of information by, e.g., referring to the internal structure of the sample and in-situ observations to obtain accurate parameterisation of both rock and flow properties that can be used to populate numerical models. We report here on the development of such a workflow for the study of solute mixing and dispersion during single- and multi-phase flows in heterogeneous porous systems through a unique combination of two complementary imaging techniques, namely X-ray Computed Tomography (CT) and Positron Emission Tomography (PET). The experimental protocol is applied to both synthetic and natural porous media, and it integrates (i) macroscopic observations (tracer effluent curves), (ii) sub-core-scale parameterisation of rock heterogeneities (e.g., porosity, permeability and capillary pressure), and direct 3D observation of (iii) fluid saturation distribution and (iv) the dynamic spreading of the solute plumes. Suitable mathematical models are applied to reproduce experimental observations, including both 1D and 3D
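
    For the 1D case, tracer effluent (breakthrough) curves like those in (i) are often modelled with the leading term of the Ogata-Banks solution to the advection-dispersion equation. A minimal forward model; the velocity and dispersion values in the test are illustrative, not fitted to the paper's data:

```python
import math

def breakthrough(x, t, v, D):
    """Leading-term Ogata-Banks solution of the 1D advection-dispersion
    equation for a step tracer input: relative concentration C/C0 at
    distance x and time t, given mean pore velocity v and dispersion
    coefficient D."""
    return 0.5 * math.erfc((x - v * t) / (2.0 * math.sqrt(D * t)))
```

    Fitting D to a measured effluent curve yields the core-scale dispersion coefficient; the PET observations then show how sub-core heterogeneity spreads the plume in 3D.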

  11. First Super-Earth Atmosphere Analysed

    Science.gov (United States)

    2010-12-01

    The atmosphere around a super-Earth exoplanet has been analysed for the first time by an international team of astronomers using ESO's Very Large Telescope. The planet, which is known as GJ 1214b, was studied as it passed in front of its parent star and some of the starlight passed through the planet's atmosphere. We now know that the atmosphere is either mostly water in the form of steam or is dominated by thick clouds or hazes. The results will appear in the 2 December 2010 issue of the journal Nature. The planet GJ 1214b was confirmed in 2009 using the HARPS instrument on ESO's 3.6-metre telescope in Chile (eso0950) [1]. Initial findings suggested that this planet had an atmosphere, which has now been confirmed and studied in detail by an international team of astronomers, led by Jacob Bean (Harvard-Smithsonian Center for Astrophysics), using the FORS instrument on ESO's Very Large Telescope. "This is the first super-Earth to have its atmosphere analysed. We've reached a real milestone on the road toward characterising these worlds," said Bean. GJ 1214b has a radius of about 2.6 times that of the Earth and is about 6.5 times as massive, putting it squarely into the class of exoplanets known as super-Earths. Its host star lies about 40 light-years from Earth in the constellation of Ophiuchus (the Serpent Bearer). It is a faint star [2], but it is also small, which means that the size of the planet is large compared to the stellar disc, making it relatively easy to study [3]. The planet travels across the disc of its parent star once every 38 hours as it orbits at a distance of only two million kilometres: about seventy times closer than the Earth orbits the Sun. To study the atmosphere, the team observed the light coming from the star as the planet passed in front of it [4]. During these transits, some of the starlight passes through the planet's atmosphere and, depending on the chemical composition and weather on the planet, specific wavelengths of light are
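
    Transit observations like these rest on simple geometry: the fractional dip in starlight equals (Rp/Rs)^2, and an atmosphere opaque over a few scale heights H adds roughly 2nRpH/Rs^2 to that depth. A back-of-envelope sketch of these standard order-of-magnitude relations, not the paper's retrieval:

```python
def transit_depth(r_planet, r_star):
    """Fraction of starlight blocked during transit: (Rp/Rs)^2."""
    return (r_planet / r_star) ** 2

def atmosphere_signal(r_planet, r_star, scale_height, n_scale_heights=2):
    """Approximate extra transit depth from an atmosphere that is opaque
    over n scale heights H: ~ 2 * n * Rp * H / Rs^2."""
    return 2 * n_scale_heights * r_planet * scale_height / r_star ** 2
```

    The small host star is what makes GJ 1214b favourable: for a fixed planet, a smaller Rs means a deeper transit and a larger atmospheric signal.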

  12. Predictive systems ecology.

    Science.gov (United States)

    Evans, Matthew R; Bithell, Mike; Cornell, Stephen J; Dall, Sasha R X; Díaz, Sandra; Emmott, Stephen; Ernande, Bruno; Grimm, Volker; Hodgson, David J; Lewis, Simon L; Mace, Georgina M; Morecroft, Michael; Moustakas, Aristides; Murphy, Eugene; Newbold, Tim; Norris, K J; Petchey, Owen; Smith, Matthew; Travis, Justin M J; Benton, Tim G

    2013-11-22

    Human societies, and their well-being, depend to a significant extent on the state of the ecosystems that surround them. These ecosystems are changing rapidly, usually in response to anthropogenic changes in the environment. To determine the likely impact of environmental change on ecosystems and the best ways to manage them, it would be desirable to be able to predict their future states. We present a proposal to develop the paradigm of predictive systems ecology, explicitly to understand and predict the properties and behaviour of ecological systems. We discuss the necessary and desirable features of predictive systems ecology models. There are places where predictive systems ecology is already being practised, and we summarize a range of terrestrial and marine examples. Significant challenges remain, but we suggest that ecology would both benefit as a scientific discipline and increase its impact in society if it were to embrace the need to become more predictive.

  13. Runtime and Pressurization Analyses of Propellant Tanks

    Science.gov (United States)

    Field, Robert E.; Ryan, Harry M.; Ahuja, Vineet; Hosangadi, Ashvin; Lee, Chung P.

    2007-01-01

    Multi-element unstructured CFD has been utilized at NASA SSC to carry out analyses of propellant tank systems in different modes of operation. The three regimes of interest at SSC include (a) tank chill down, (b) tank pressurization, and (c) runtime propellant draw-down and purge. While tank chill down is an important event that is best addressed with long time-scale heat transfer calculations, CFD can play a critical role in the tank pressurization and runtime modes of operation. In these situations, contamination of the propellant by inclusion of pressurant gas from the ullage causes a deterioration in the quality of the propellant delivered to the test article, and CFD can be used to help quantify the mixing and propellant degradation. During tank pressurization, under some circumstances, rapid mixing of relatively warm pressurant gas with cryogenic propellant can lead to rapid densification of the gas and loss of pressure in the tank. This phenomenon can cause serious problems during testing because of the resulting decrease in propellant flow rate. With proper physical models implemented, CFD can model the coupling between the propellant and pressurant, including heat transfer and phase change effects, and accurately capture the complex physics in the evolving flowfields. This holds the promise of allowing the specification of operational conditions and procedures that could minimize the undesirable mixing and heat transfer inherent in propellant tank operation. It should be noted that traditional CFD modeling is inadequate for such simulations because the fluids in the tank are in a range of different sub-critical and supercritical states, and elaborate phase change and mixing rules have to be developed to accurately model the interaction between the ullage gas and the propellant. We show a typical run-time simulation of a spherical propellant tank, containing RP-1 in this case, being pressurized with room-temperature nitrogen at 540 R. Nitrogen
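The pressure-collapse mechanism described above, warm pressurant densifying as it chills against the cryogenic propellant, can be illustrated with a toy lumped-parameter model. This is a sketch only, assuming ideal-gas nitrogen, a fixed ullage volume, constant propellant temperature, and invented tank numbers; it is not the multi-element CFD used in the paper:

```python
# Toy lumped-parameter ullage model. Assumes ideal nitrogen gas, a fixed
# ullage volume, constant propellant temperature, and invented numbers.
R_N2 = 296.8             # J/(kg K), specific gas constant of nitrogen
CP, CV = 1040.0, 743.0   # J/(kg K), assumed-constant heat capacities

def simulate_ullage(V=0.5, T_in=300.0, T_prop=100.0, mdot=0.01,
                    hA=50.0, t_end=60.0, dt=0.01):
    """Integrate mass and energy balances for a fixed ullage volume V [m^3].
    Warm pressurant (T_in [K]) enters at mdot [kg/s]; the gas loses heat to
    the cold propellant with overall conductance hA [W/K]. Returns the
    pressure history [Pa]."""
    m, T = 0.2, T_in                 # initial ullage mass [kg] and temperature
    pressures, t = [], 0.0
    while t < t_end:
        q_loss = hA * (T - T_prop)                  # heat flow to propellant
        dU = mdot * CP * T_in - q_loss              # inflow enthalpy minus losses
        m_new = m + mdot * dt
        T = (m * CV * T + dU * dt) / (m_new * CV)   # mixed-gas temperature
        m = m_new
        pressures.append(m * R_N2 * T / V)          # ideal-gas law
        t += dt
    return pressures

p_chilled = simulate_ullage()          # pressurant loses heat to propellant
p_adiabatic = simulate_ullage(hA=0.0)  # no heat transfer, for comparison
# Gas densification from chilling leaves the final pressure well below the
# adiabatic case, which is the pressure-collapse mechanism the abstract notes.
```

Varying `hA` and `mdot` in this sketch shows the same trade-off the abstract attributes to the full CFD: faster injection or weaker heat transfer keeps pressure up at the cost of more mixing.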

  14. Sensitivity Analyses for Robust Causal Inference from Mendelian Randomization Analyses with Multiple Genetic Variants.

    Science.gov (United States)

    Burgess, Stephen; Bowden, Jack; Fall, Tove; Ingelsson, Erik; Thompson, Simon G

    2017-01-01

    Mendelian randomization investigations are becoming more powerful and simpler to perform, due to the increasing size and coverage of genome-wide association studies and the increasing availability of summarized data on genetic associations with risk factors and disease outcomes. However, when using multiple genetic variants from different gene regions in a Mendelian randomization analysis, it is highly implausible that all the genetic variants satisfy the instrumental variable assumptions. This means that a simple instrumental variable analysis alone should not be relied on to give a causal conclusion. In this article, we discuss a range of sensitivity analyses that will either support or question the validity of causal inference from a Mendelian randomization analysis with multiple genetic variants. We focus on sensitivity analyses of greatest practical relevance for ensuring robust causal inferences, and those that can be undertaken using summarized data. Aside from cases in which the justification of the instrumental variable assumptions is supported by strong biological understanding, a Mendelian randomization analysis in which no assessment of the robustness of the findings to violations of the instrumental variable assumptions has been made should be viewed as speculative and incomplete. In particular, Mendelian randomization investigations with large numbers of genetic variants without such sensitivity analyses should be treated with skepticism.
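Two summarized-data sensitivity analyses commonly used in this setting, the inverse-variance weighted (IVW) estimate and MR-Egger regression (whose intercept probes directional pleiotropy), can be sketched directly from per-variant association estimates. The genetic association data below are simulated for illustration:

```python
# Sketch of two summarized-data Mendelian randomization estimators:
# the IVW estimate and MR-Egger regression. Data are simulated, not real.
import numpy as np

def ivw_estimate(bx, by, se_by):
    """IVW causal estimate: weighted regression of by on bx through the origin."""
    w = 1.0 / se_by**2
    return np.sum(w * bx * by) / np.sum(w * bx**2)

def mr_egger(bx, by, se_by):
    """Weighted regression of by on bx WITH an intercept.
    Returns (intercept, slope); the slope is the pleiotropy-adjusted
    estimate, and a nonzero intercept flags directional pleiotropy."""
    w = 1.0 / se_by**2
    X = np.column_stack([np.ones_like(bx), bx])
    WX = X * w[:, None]
    intercept, slope = np.linalg.solve(X.T @ WX, X.T @ (w * by))
    return intercept, slope

rng = np.random.default_rng(1)
bx = rng.uniform(0.1, 0.5, size=20)        # variant-exposure associations
se_by = np.full(20, 0.05)                  # SEs of variant-outcome associations
by = 0.3 * bx + rng.normal(0.0, 0.05, 20)  # simulated true causal effect 0.3

ivw = ivw_estimate(bx, by, se_by)
egger_intercept, egger_slope = mr_egger(bx, by, se_by)
```

Agreement between the IVW slope and the Egger slope, together with an intercept near zero, is the kind of robustness check the abstract argues should accompany any multi-variant analysis.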

  15. Modelling bankruptcy prediction models in Slovak companies

    Directory of Open Access Journals (Sweden)

    Kovacova Maria

    2017-01-01

    Full Text Available Intensive research by academics and practitioners has addressed models for bankruptcy prediction and credit risk management. In spite of numerous studies on forecasting bankruptcy using traditional statistical techniques (e.g. discriminant analysis and logistic regression) and early artificial intelligence models (e.g. artificial neural networks), there is a trend towards machine learning models (support vector machines, bagging, boosting, and random forest) for predicting bankruptcy one year prior to the event. Comparing the performance of these unconventional approaches with results obtained by discriminant analysis, logistic regression, and neural networks, it has been found that bagging, boosting, and random forest models outperform the other techniques, and that prediction accuracy in the testing sample improves when additional variables are included. On the other hand, the prediction accuracy of old and well-known bankruptcy prediction models is still quite high. Therefore, we aim to analyse these older models on a dataset of Slovak companies to validate their predictive ability under specific conditions. Furthermore, these models will be modified in line with new trends by calculating the influence of eliminating selected variables on their overall predictive ability.
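As an example of the "old and well known" class of models the abstract refers to, here is Altman's (1968) Z-score with its commonly quoted coefficients for public manufacturing firms; the balance-sheet figures are invented:

```python
# Altman (1968) Z-score, one of the classic bankruptcy prediction models.
# Coefficients are the commonly quoted originals; inputs are invented.
def altman_z(working_capital, retained_earnings, ebit,
             market_equity, sales, total_assets, total_liabilities):
    """Classic five-ratio Altman Z-score."""
    x1 = working_capital / total_assets      # liquidity
    x2 = retained_earnings / total_assets    # cumulative profitability
    x3 = ebit / total_assets                 # operating efficiency
    x4 = market_equity / total_liabilities   # leverage
    x5 = sales / total_assets                # asset turnover
    return 1.2 * x1 + 1.4 * x2 + 3.3 * x3 + 0.6 * x4 + 1.0 * x5

def zone(z):
    """Conventional interpretation bands for the original model."""
    return "safe" if z > 2.99 else ("grey" if z >= 1.81 else "distress")

z = altman_z(working_capital=150.0, retained_earnings=300.0, ebit=120.0,
             market_equity=1200.0, sales=1100.0, total_assets=1000.0,
             total_liabilities=600.0)
# z is about 3.3, placing this hypothetical firm in the "safe" zone.
```

"Calculating the influence of eliminating selected variables", as the abstract proposes, amounts to dropping one ratio at a time from such a score and re-measuring classification accuracy.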

  16. Solar Cycle Predictions

    Science.gov (United States)

    Pesnell, William Dean

    2012-01-01

    Solar cycle predictions are needed to plan long-term space missions, just as weather predictions are needed to plan a launch. Fleets of satellites circle the Earth collecting many types of science data, protecting astronauts, and relaying information. All of these satellites are sensitive at some level to solar cycle effects. Predictions of drag on LEO spacecraft are among the most important. Launching a satellite with less propellant can mean a higher orbit, but unanticipated solar activity and increased drag can make that a Pyrrhic victory as the reduced propellant load is consumed more rapidly. Energetic events at the Sun can produce crippling radiation storms that endanger all assets in space. Solar cycle predictions also anticipate the shortwave emissions that cause degradation of solar panels. Testing solar dynamo theories by quantitative predictions of what will happen in 5-20 years is the next arena for solar cycle predictions. A summary and analysis of 75 predictions of the amplitude of the upcoming Solar Cycle 24 is presented. The current state of solar cycle predictions, and how those predictions could be made more accurate in the future, is discussed.

  17. Is Time Predictability Quantifiable?

    DEFF Research Database (Denmark)

    Schoeberl, Martin

    2012-01-01

    Computer architects and researchers in the real-time domain have started to investigate processors and architectures optimized for real-time systems. Optimized for real-time systems means time predictable, i.e., architectures where it is possible to statically derive a tight bound on the worst-case execution time. To compare different approaches we would like to quantify time predictability; that means we need to measure it. In this paper we discuss the different approaches for these measurements and conclude that time predictability is practically not quantifiable. We can only compare the worst-case execution time bounds of different architectures.

  18. Weld investigations by 3D analyses of Charpy V-notch specimens

    DEFF Research Database (Denmark)

    Tvergaard, Viggo; Needleman, Allan

    2005-01-01

    The Charpy impact test is a standard procedure for determining the ductile-brittle transition in welds. The predictions of such tests have been investigated by full three-dimensional transient analyses of Charpy V-notch specimens. The material response is characterised by an elastic-viscoplastic … parameters in the weld material differ from those in the base material, and the heat affected zone (HAZ) tends to be more brittle than the other material regions. The effect of weld strength undermatch or overmatch is an important issue. Some specimens, for which the notched surface is rotated relative…

  19. MCNP benchmark analyses of critical experiments for the Space Nuclear Thermal Propulsion program

    Science.gov (United States)

    Selcow, Elizabeth C.; Cerbone, Ralph J.; Ludewig, Hans; Mughabghab, Said F.; Schmidt, Eldon; Todosow, Michael; Parma, Edward J.; Ball, Russell M.; Hoovler, Gary S.

    1993-01-01

    Benchmark analyses have been performed of Particle Bed Reactor (PBR) critical experiments (CX) using the MCNP radiation transport code. The experiments have been conducted at the Sandia National Laboratory reactor facility in support of the Space Nuclear Thermal Propulsion (SNTP) program. The test reactor is a nineteen element water moderated and reflected thermal system. A series of integral experiments have been carried out to test the capabilities of the radiation transport codes to predict the performance of PBR systems. MCNP was selected as the preferred radiation analysis tool for the benchmark experiments. Comparison between experimental and calculational results indicate close agreement. This paper describes the analyses of benchmark experiments designed to quantify the accuracy of the MCNP radiation transport code for predicting the performance characteristics of PBR reactors.

  20. MCNP benchmark analyses of critical experiments for the Space Nuclear Thermal Propulsion program

    International Nuclear Information System (INIS)

    Selcow, E.C.; Cerbone, R.J.; Ludewig, H.; Mughabghab, S.F.; Schmidt, E.; Todosow, M.; Parma, E.J.; Ball, R.M.; Hoovler, G.S.

    1993-01-01

    Benchmark analyses have been performed of Particle Bed Reactor (PBR) critical experiments (CX) using the MCNP radiation transport code. The experiments have been conducted at the Sandia National Laboratory reactor facility in support of the Space Nuclear Thermal Propulsion (SNTP) program. The test reactor is a nineteen element water moderated and reflected thermal system. A series of integral experiments have been carried out to test the capabilities of the radiation transport codes to predict the performance of PBR systems. MCNP was selected as the preferred radiation analysis tool for the benchmark experiments. Comparison between experimental and calculational results indicate close agreement. This paper describes the analyses of benchmark experiments designed to quantify the accuracy of the MCNP radiation transport code for predicting the performance characteristics of PBR reactors

  1. FY01 Supplemental Science and Performance Analyses, Volume 1: Scientific Bases and Analyses, Rev 00

    Energy Technology Data Exchange (ETDEWEB)

    David Dobson

    2001-06-30

    The U.S. Department of Energy (DOE) is considering the possible recommendation of a site at Yucca Mountain, Nevada, for development as a geologic repository for the disposal of high-level radioactive waste and spent nuclear fuel. To facilitate public review and comment, in May 2001 the DOE released the Yucca Mountain Science and Engineering Report (S&ER) (DOE 2001 [DIRS 153849]), which presents technical information supporting the consideration of the possible site recommendation. The report summarizes the results of more than 20 years of scientific and engineering studies. A decision to recommend the site has not been made: the DOE has provided the S&ER and its supporting documents as an aid to the public in formulating comments on the possible recommendation. When the S&ER (DOE 2001 [DIRS 153849]) was released, the DOE acknowledged that technical and scientific analyses of the site were ongoing. Therefore, the DOE noted in the Federal Register Notice accompanying the report (66 FR 23 013 [DIRS 155009], p. 2) that additional technical information would be released before the dates, locations, and times for public hearings on the possible recommendation were announced. This information includes: (1) the results of additional technical studies of a potential repository at Yucca Mountain, contained in this FY01 Supplemental Science and Performance Analyses: Vol. 1, Scientific Bases and Analyses; and FY01 Supplemental Science and Performance Analyses: Vol. 2, Performance Analyses (McNeish 2001 [DIRS 155023]) (collectively referred to as the SSPA) and (2) a preliminary evaluation of the Yucca Mountain site's preclosure and postclosure performance against the DOE's proposed site suitability guidelines (10 CFR Part 963 [64 FR 67054] [DIRS 124754]). By making the large amount of information developed on Yucca Mountain available in stages, the DOE intends to provide the public and interested parties with time to review the available materials and to formulate and

  2. Evaluation of high-resolution precipitation analyses using a dense station network

    OpenAIRE

    A. Kann; I. Meirold-Mautner; F. Schmid; G. Kirchengast; J. Fuchsberger; V. Meyer; L. Tüchler; B. Bica

    2015-01-01

    The ability of radar–rain gauge merging algorithms to precisely analyse convective precipitation patterns is of high interest for many applications, e.g. hydrological modelling, thunderstorm warnings, and, as a reference, to spatially validate numerical weather prediction models. However, due to drawbacks of methods like cross-validation and due to the limited availability of reference data sets on high temporal and spatial scales, an adequate validation is usually hardly po...

  3. Seismic criteria studies and analyses. Quarterly progress report No. 3. [LMFBR

    Energy Technology Data Exchange (ETDEWEB)

    1975-01-03

    Information is presented concerning the extent to which vibratory motions at the subsurface foundation level might differ from motions at the ground surface and the effects of the various subsurface materials on the overall Clinch River Breeder Reactor site response; seismic analyses of LMFBR type reactors to establish analytical procedures for predicting structure stresses and deformations; and aspects of the current technology regarding the representation of energy losses in nuclear power plants as equivalent viscous damping.

  4. Life satisfaction and frailty in community-based older adults: cross-sectional and prospective analyses.

    Science.gov (United States)

    St John, Philip D; Tyas, Suzanne L; Montgomery, Patrick R

    2013-10-01

    Frailty may be associated with reduced life satisfaction (LS). The objectives of this paper are to determine if (1) frailty is associated with LS in community-dwelling older adults in cross-sectional analyses; (2) frailty predicts LS five years later; and (3) specific domains of LS are preferentially associated with frailty. This paper presents analysis of an existing population-based cohort study of 1,751 persons aged 65+ who were assessed in 1991, with follow-up five years later. LS was measured using the terrible-delightful scale, which measures overall LS and LS in specific domains. Frailty was measured using the Brief Frailty Instrument. Analyses were adjusted for age, gender, education, and marital status. Frailty was associated with overall LS at time 1 and predicted overall LS at time 2. This was seen in unadjusted analyses and after adjusting for confounding factors. Frailty was associated with all domains of LS at time 1, and predicted LS at time 2 in all domains except housing and self-esteem. However, the effect was stronger for LS with health than with other domains for both times 1 and 2. Frailty is associated with LS, and the effect is strongest for LS with health.

  5. Archaeological predictive model set.

    Science.gov (United States)

    2015-03-01

    This report is the documentation for Task 7 of the Statewide Archaeological Predictive Model Set. The goal of this project is to : develop a set of statewide predictive models to assist the planning of transportation projects. PennDOT is developing t...

  6. Seismology for rockburst prediction.

    CSIR Research Space (South Africa)

    De Beer, W

    2000-02-01

    Full Text Available [Only table-of-contents fragments survive extraction: time-to-failure prediction algorithm; testing for deterministic components of time series of interest, noise reduction; deterministic components of seismic time series, noise reduction and limits of predictability; false nearest strands.]

  7. The Prediction Value

    NARCIS (Netherlands)

    Koster, M.; Kurz, S.; Lindner, I.; Napel, S.

    2013-01-01

    We introduce the prediction value (PV) as a measure of players’ informational importance in probabilistic TU games. The latter combine a standard TU game and a probability distribution over the set of coalitions. Player i’s prediction value equals the difference between the conditional expectations

  8. Predicting AD conversion

    DEFF Research Database (Denmark)

    Liu, Yawu; Mattila, Jussi; Ruiz, Miguel Ángel Muñoz

    2013-01-01

    To compare the accuracies of predicting AD conversion by using a decision support system (PredictAD tool) and current research criteria of prodromal AD as identified by combinations of episodic memory impairment of hippocampal type and visual assessment of medial temporal lobe atrophy (MTA) on MRI...

  9. Predictability of Stock Returns

    Directory of Open Access Journals (Sweden)

    Ahmet Sekreter

    2017-06-01

    Full Text Available Predictability of stock returns has been shown by empirical studies over time. This article collects the most important theories on forecasting stock returns and investigates the factors that affect the behaviour of stock prices and the market as a whole. Estimation of these factors, and the way they are estimated, are the key issues in the predictability of stock returns.

  10. 'Red Flag' Predictions

    DEFF Research Database (Denmark)

    Hallin, Carina Antonia; Andersen, Torben Juul; Tveterås, Sigbjørn

    …-generation prediction markets and outline its unique features as a third-generation prediction market. It is argued that frontline employees gain deep insights when they execute operational activities on an ongoing basis in the organization. The experiential learning from close interaction with internal and external...

  11. Cultural Resource Predictive Modeling

    Science.gov (United States)

    2017-10-01

    …refining formal, inductive predictive models is the quality of the archaeological and environmental data. To build models efficiently, relevant… geomorphology, and historic information. Lessons Learned: The original model was focused on the identification of prehistoric resources. This… system but uses predictive modeling informally. For example, there is no probability for buried archaeological deposits on the Burton Mesa, but there is…

  12. Protein Sorting Prediction

    DEFF Research Database (Denmark)

    Nielsen, Henrik

    2017-01-01

    Many computational methods are available for predicting protein sorting in bacteria. When comparing them, it is important to know that they can be grouped into three fundamentally different approaches: signal-based, global-property-based and homology-based prediction. In this chapter, the strengths...

  13. Predicting occupational lung diseases

    NARCIS (Netherlands)

    Suarthana, E.

    2008-01-01

    This thesis aims at demonstrating the development, validation, and application of prediction models for occupational lung diseases. Prediction models are developed to estimate an individual’s probability of the presence or future likelihood of occurrence of an outcome (i.e. disease of interest or

  14. Prediction methods environmental-effect reporting

    International Nuclear Information System (INIS)

    Jonker, R.J.; Koester, H.W.

    1987-12-01

    This report provides a survey of prediction methods that can be applied to the calculation of emissions in nuclear-reactor accidents, in the framework of environmental-effect reports (Dutch m.e.r.) or risk analyses. Emissions during normal operation are also important for m.e.r.; these can be derived from the measured emissions of power plants in operation, and data concerning the latter are reported. The report consists of: an introduction to reactor technology, including a description of some reactor types, the corresponding fuel cycle, and dismantling scenarios; a discussion of risk analyses for nuclear power plants and the physical processes that can play a role during accidents; a discussion of the prediction methods to be employed and the expected developments in this area; and some background information. (author). 145 refs.; 21 figs.; 20 tabs

  15. Predicting protein structure classes from function predictions

    DEFF Research Database (Denmark)

    Sommer, I.; Rahnenfuhrer, J.; de Lichtenberg, Ulrik

    2004-01-01

    We introduce a new approach to using the information contained in sequence-to-function prediction data in order to recognize protein template classes, a critical step in predicting protein structure. The data on which our method is based comprise probabilities of functional categories; for given query sequences these probabilities are obtained by a neural net that has previously been trained on a variety of functionally important features. On a training set of sequences we assess the relevance of individual functional categories for identifying a given structural family. Using a combination… membership. Even for structural families of small size, family members receive significantly higher scores. For some examples, we show that the relevant functional features identified by this method are biologically meaningful. The proposed approach can be used to improve existing sequence…

  16. Statistical power analyses using G*Power 3.1: tests for correlation and regression analyses.

    Science.gov (United States)

    Faul, Franz; Erdfelder, Edgar; Buchner, Axel; Lang, Albert-Georg

    2009-11-01

    G*Power is a free power analysis program for a variety of statistical tests. We present extensions and improvements of the version introduced by Faul, Erdfelder, Lang, and Buchner (2007) in the domain of correlation and regression analyses. In the new version, we have added procedures to analyze the power of tests based on (1) single-sample tetrachoric correlations, (2) comparisons of dependent correlations, (3) bivariate linear regression, (4) multiple linear regression based on the random predictor model, (5) logistic regression, and (6) Poisson regression. We describe these new features and provide a brief introduction to their scope and handling.
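G*Power itself is a GUI program, but its simplest correlation case can be roughly cross-checked in a few lines using the Fisher z transform, a standard large-sample approximation (not G*Power's exact routine):

```python
# Approximate power for the two-sided test of a bivariate correlation,
# via the Fisher z transform. A rough cross-check, not G*Power's algorithm.
from math import atanh, sqrt
from statistics import NormalDist

def power_correlation(r, n, alpha=0.05):
    """Approximate power of the two-sided test of H0: rho = 0 at sample
    size n, assuming true correlation r; SE of Fisher z is 1/sqrt(n-3)."""
    nd = NormalDist()
    z_crit = nd.inv_cdf(1.0 - alpha / 2.0)
    delta = atanh(r) * sqrt(n - 3)           # noncentrality on the z scale
    return nd.cdf(delta - z_crit) + nd.cdf(-delta - z_crit)

def required_n(r, power=0.80, alpha=0.05):
    """Smallest n whose approximate power reaches the target."""
    n = 4
    while power_correlation(r, n, alpha) < power:
        n += 1
    return n

n80 = required_n(0.3)   # sample size for ~80% power to detect r = 0.3
```

The approximate answer lands in the mid-eighties for r = 0.3, close to what exact routines report; the approximation degrades for small n or extreme r.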

  17. Evaluating prediction uncertainty

    International Nuclear Information System (INIS)

    McKay, M.D.

    1995-03-01

    The probability distribution of a model prediction is presented as a proper basis for evaluating the uncertainty in a model prediction that arises from uncertainty in input values. Determination of important model inputs and subsets of inputs is made through comparison of the prediction distribution with conditional prediction probability distributions. Replicated Latin hypercube sampling and variance ratios are used in estimation of the distributions and in construction of importance indicators. The assumption of a linear relation between model output and inputs is not necessary for the indicators to be effective. A sequential methodology which includes an independent validation step is applied in two analysis applications to select subsets of input variables which are the dominant causes of uncertainty in the model predictions. Comparison with results from methods which assume linearity shows how those methods may fail. Finally, suggestions for treating structural uncertainty for submodels are presented
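The Latin hypercube sampling and variance-ratio indicators mentioned above can be sketched minimally as follows. This is a simplified version under stated assumptions (uniform inputs on (0,1), a made-up toy model, a crude binned correlation-ratio indicator), not the paper's methodology:

```python
# Minimal Latin hypercube design plus a crude variance-ratio importance
# indicator: between-bin variance of the output over total variance.
import numpy as np

def latin_hypercube(n, d, rng):
    """n stratified samples in (0,1)^d: each column visits every one of the
    n strata exactly once, jittered within its stratum."""
    strata = np.stack([rng.permutation(n) for _ in range(d)], axis=1)
    return (strata + rng.random((n, d))) / n

def importance(model, d=3, n=1000, bins=10, seed=0):
    """Correlation-ratio indicator per input: variance of bin-conditional
    output means divided by total output variance."""
    rng = np.random.default_rng(seed)
    x = latin_hypercube(n, d, rng)
    y = model(x)
    edges = np.linspace(0.0, 1.0, bins + 1)[1:-1]
    eta = []
    for j in range(d):
        idx = np.digitize(x[:, j], edges)
        means = np.array([y[idx == b].mean() for b in range(bins)])
        counts = np.array([(idx == b).sum() for b in range(bins)])
        between = (counts * (means - y.mean()) ** 2).sum() / n
        eta.append(between / y.var())
    return np.array(eta)

# Toy model: output depends strongly on x0, weakly on x1, not at all on x2.
eta = importance(lambda x: 10.0 * x[:, 0] + 3.0 * x[:, 1] + 0.0 * x[:, 2])
```

No linearity assumption is needed for the indicator, matching the point made in the abstract, though this binned version is noisier than the replicated-sample estimators it describes.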

  18. Ground motion predictions

    International Nuclear Information System (INIS)

    Loux, P.C.

    1969-01-01

    Nuclear generated ground motion is defined and then related to the physical parameters that cause it. Techniques employed for prediction of ground motion peak amplitude, frequency spectra and response spectra are explored, with initial emphasis on the analysis of data collected at the Nevada Test Site (NTS). NTS postshot measurements are compared with pre-shot predictions. Applicability of these techniques to new areas, for example, Plowshare sites, must be questioned. Fortunately, the Atomic Energy Commission is sponsoring complementary studies to improve prediction capabilities primarily in new locations outside the NTS region. Some of these are discussed in the light of anomalous seismic behavior, and comparisons are given showing theoretical versus experimental results. In conclusion, current ground motion prediction techniques are applied to events off the NTS. Predictions are compared with measurements for the event Faultless and for the Plowshare events, Gasbuggy, Cabriolet, and Buggy I. (author)

  19. Thermal analyses of in vitro low frequency sonophoresis.

    Science.gov (United States)

    Peng, Han-Min; Zhu, Pan-Cheng; Chen, Zhi-Jun

    2017-03-01

    As a type of transdermal drug delivery method, low frequency sonophoresis (LFS) has been investigated during the last twenty years and is currently being attempted in a clinical setting. However, the safety of low frequency ultrasound on humans has not been completely guaranteed with high-intensity ultrasound. Thermal damage, one of the challenges in the LFS process, e.g., burns, epidermal detachment and necrosis of tissues, hinders its widespread applications. To predict and impede the overheating problems in LFS, an acoustic-flow-thermal finite element method (FEM) based on COMSOL Multiphysics software is proposed in this paper to achieve thermal analyses. The temperature distribution and its rising curves in in vitro LFS are obtained by the FEM method and experimental measurements. Both simulated and experimental maximum temperatures are larger than the safety value (e.g., 42°C on human tissues) when the driving voltage is higher than 40V (5.5W input electric power), which proves that the overheating problem really exists in high-intensity ultrasound. Furthermore, the results show that the calculated temperature rising curves in in vitro LFS correspond to the experimental results, proving the effectiveness of this FEM method. In addition, several potential thermal influence factors have been studied, including a duty ratio and amplitude of the driving voltage, and liquid height in the donor, which may be helpful in restraining the temperature increase to limit thermal damage. According to the calculated and experimental results, the former two factors are sensitive to the rise in temperature, but a small scale of liquid volume increase can enhance the permeation of Calcein without obvious temperature change. Hence, the above factors can be synthetically utilized to restrain the rise in temperature with little sacrifice of permeation ability. So this acoustic-flow-thermal FEM method could be applied to an optimized LFS system design and simulating the thermal

  20. Nurses' intention to leave: critically analyse the theory of reasoned action and organizational commitment model.

    Science.gov (United States)

    Liou, Shwu-Ru

    2009-01-01

    To systematically analyse the Organizational Commitment model and Theory of Reasoned Action and determine concepts that can better explain nurses' intention to leave their job. The Organizational Commitment model and Theory of Reasoned Action have been proposed and applied to understand intention to leave and turnover behaviour, which are major contributors to nursing shortage. However, the appropriateness of applying these two models in nursing was not analysed. Three main criteria of a useful model were used for the analysis: consistency in the use of concepts, testability and predictability. Both theories use concepts consistently. Concepts in the Theory of Reasoned Action are defined broadly whereas they are operationally defined in the Organizational Commitment model. Predictability of the Theory of Reasoned Action is questionable whereas the Organizational Commitment model can be applied to predict intention to leave. A model was proposed based on this analysis. Organizational commitment, intention to leave, work experiences, job characteristics and personal characteristics can be concepts for predicting nurses' intention to leave. Nursing managers may consider nurses' personal characteristics and experiences to increase their organizational commitment and enhance their intention to stay. Empirical studies are needed to test and cross-validate the re-synthesized model for nurses' intention to leave their job.

  1. Analysing the length of care episode after hip fracture: a nonparametric and a parametric Bayesian approach.

    Science.gov (United States)

    Riihimäki, Jaakko; Sund, Reijo; Vehtari, Aki

    2010-06-01

    Effective utilisation of limited resources is a challenge for health care providers. Accurate and relevant information extracted from the length of stay distributions is useful for management purposes. Patient care episodes can be reconstructed from the comprehensive health registers, and in this paper we develop a Bayesian approach to analyse the length of care episode after a fractured hip. We model the large scale data with a flexible nonparametric multilayer perceptron network and with a parametric Weibull mixture model. To assess the performances of the models, we estimate expected utilities using predictive density as a utility measure. Since the model parameters cannot be directly compared, we focus on observables, and estimate the relevances of patient explanatory variables in predicting the length of stay. To demonstrate how the use of the nonparametric flexible model is advantageous for this complex health care data, we also study joint effects of variables in predictions, and visualise nonlinearities and interactions found in the data.
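The parametric side of the comparison can be illustrated with a two-component Weibull mixture for length of stay; the weights and parameters below are invented (a short-stay and a long-stay subgroup), not the fitted values from the registry data:

```python
# Two-component Weibull mixture for length of stay: density evaluation and
# sampling by inverse CDF. All parameters are invented for illustration.
import math
import random

COMPS = [(0.7, 1.5, 10.0),   # (weight, shape, scale): shorter stays
         (0.3, 2.0, 60.0)]   # longer rehabilitation stays

def weibull_pdf(x, shape, scale):
    """f(x) = (k/lam) (x/lam)^(k-1) exp(-(x/lam)^k) for x > 0."""
    return (shape / scale) * (x / scale) ** (shape - 1) * math.exp(-(x / scale) ** shape)

def mixture_pdf(x, comps=COMPS):
    """Weighted sum of component densities; weights sum to 1."""
    return sum(w * weibull_pdf(x, k, lam) for w, k, lam in comps)

def sample_stay(rng, comps=COMPS):
    """Pick a component by weight, then invert the Weibull CDF."""
    u, acc = rng.random(), 0.0
    _, k, lam = comps[-1]                  # fallback against float rounding
    for w, ki, li in comps:
        acc += w
        if u < acc:
            k, lam = ki, li
            break
    return lam * (-math.log(1.0 - rng.random())) ** (1.0 / k)

rng = random.Random(0)
stays = [sample_stay(rng) for _ in range(20000)]
```

Comparing such a parametric predictive density against a flexible nonparametric one, via expected utility on held-out observables, is the model-assessment strategy the abstract describes.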

  2. The Effect of Scale Dependent Discretization on the Progressive Failure of Composite Materials Using Multiscale Analyses

    Science.gov (United States)

    Ricks, Trenton M.; Lacy, Thomas E., Jr.; Pineda, Evan J.; Bednarcyk, Brett A.; Arnold, Steven M.

    2013-01-01

A multiscale modeling methodology, which incorporates a statistical distribution of fiber strengths into coupled micromechanics/finite element analyses, is applied to unidirectional polymer matrix composites (PMCs) to analyze the effect of mesh discretization at both the micro- and macroscales on the predicted ultimate tensile strength (UTS) and failure behavior. The NASA code FEAMAC and the ABAQUS finite element solver were used to analyze the progressive failure of a PMC tensile specimen that initiates at the repeating unit cell (RUC) level. Three different finite element mesh densities were employed, each coupled with an appropriate RUC. Multiple simulations were performed in order to assess the effect of a statistical distribution of fiber strengths on the bulk composite failure and predicted strength. The coupled effects of both the micro- and macroscale discretizations were found to have a noticeable effect on the predicted UTS and the computational efficiency of the simulations.
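The abstract does not name the fiber-strength distribution; a Weibull form is the common assumption in this literature. A toy equal-load-sharing fiber-bundle sketch (not FEAMAC's actual micromechanics) shows how such a distribution propagates scatter into the predicted strength:

```python
import numpy as np

rng = np.random.default_rng(1)

def ruc_strength(n_fibers=100, weibull_m=5.0, sigma0=3500.0):
    """Strength of one repeating unit cell modelled as an equal-load-sharing
    fiber bundle with Weibull-distributed fiber strengths (MPa): the bundle
    fails at the maximum over k of s_(k) * (n - k) / n for sorted strengths
    s_(k). All parameter values here are illustrative assumptions."""
    s = np.sort(sigma0 * rng.weibull(weibull_m, n_fibers))
    k = np.arange(n_fibers)
    return float(np.max(s * (n_fibers - k) / n_fibers))

# Repeated realizations show the scatter that a fiber-strength
# distribution induces in the predicted ultimate strength.
samples = np.array([ruc_strength() for _ in range(200)])
print(f"mean={samples.mean():.0f} MPa, cv={samples.std() / samples.mean():.3f}")
```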

  3. Using System Dynamic Model and Neural Network Model to Analyse Water Scarcity in Sudan

    Science.gov (United States)

    Li, Y.; Tang, C.; Xu, L.; Ye, S.

    2017-07-01

Many parts of the world face water scarcity, and analysing it quantitatively is an important step towards solving the problem. Water scarcity in a region is gauged by the water scarcity index (WSI), which incorporates water supply and water demand. To obtain the WSI, a neural network model and a system dynamics model (SDM) are developed to depict how environmental and social factors affect water supply and demand. The uneven distribution of water resources and water demand across a region leads to an uneven distribution of WSI within that region. To predict future WSI, a logistic model, grey prediction, and statistical methods are applied to forecast the underlying variables. Sudan suffers from severe water scarcity, with a WSI of 1 in 2014 and unevenly distributed water resources. According to the modified model, Sudan’s water situation will improve after intervention.
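One common formulation of a water scarcity index is the demand-to-supply ratio, which is consistent with a reported WSI of 1 when demand equals supply; the ratio form, the logistic projection of demand, and every number below are illustrative assumptions, not the paper's data.

```python
import math

def wsi(demand_m3, supply_m3):
    """Water scarcity index as a demand-to-supply ratio (an assumed,
    common formulation; WSI = 1 when demand equals supply)."""
    return demand_m3 / supply_m3

def logistic(t, carrying_capacity, rate, initial):
    """Logistic growth, used here to project future water demand."""
    return carrying_capacity / (
        1.0 + (carrying_capacity / initial - 1.0) * math.exp(-rate * t)
    )

# Illustrative numbers only: demand grows logistically while supply stays flat.
supply = 30e9  # m^3/year
for year in (0, 10, 20):
    demand = logistic(year, carrying_capacity=45e9, rate=0.08, initial=28e9)
    print(year, round(wsi(demand, supply), 2))
```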

  4. "Predicting" parental longevity from offspring endophenotypes

    DEFF Research Database (Denmark)

    Yashin, Anatoli I; Arbeev, Konstantin G; Kulminski, Alexander

    2010-01-01

    , cognitive functioning and health/well-being among offspring predict longevity in parents. Good predictors can be used as endophenotypes for exceptional survival. Our analyses revealed significant associations between cumulative indices describing physiological state, as well as a number of offspring...... phenotypes, and parental lifespan, supporting both their familial basis and relevance to longevity. We conclude that the study of endophenotypes within families is a valid approach to the genetics of human longevity....

  5. Predictive modelling of evidence informed teaching

    OpenAIRE

    Zhang, Dell; Brown, C.

    2017-01-01

In this paper, we analyse the questionnaire survey data collected from 79 English primary schools about the situation of evidence informed teaching, where the evidence could come from research journals or conferences. Specifically, we build a predictive model to see what external factors could help to close the gap between teachers’ belief and behaviour in evidence informed teaching, which is the first of its kind to our knowledge. The major challenge, from the data mining perspective, is th...

  6. Structural prediction in aphasia

    Directory of Open Access Journals (Sweden)

    Tessa Warren

    2015-05-01

There is considerable evidence that young healthy comprehenders predict the structure of upcoming material, and that their processing is facilitated when they encounter material matching those predictions (e.g., Staub & Clifton, 2006; Yoshida, Dickey & Sturt, 2013). However, less is known about structural prediction in aphasia. There is evidence that lexical prediction may be spared in aphasia (Dickey et al., 2014; Love & Webb, 1977; cf. Mack et al., 2013). However, predictive mechanisms supporting facilitated lexical access may not necessarily support structural facilitation. Given that many people with aphasia (PWA) exhibit syntactic deficits (e.g., Goodglass, 1993), PWA with such impairments may not engage in structural prediction. However, recent evidence suggests that some PWA may indeed predict upcoming structure (Hanne, Burchert, De Bleser, & Vashishth, 2015). Hanne et al. tracked the eyes of PWA (n=8) with sentence-comprehension deficits while they listened to reversible subject-verb-object (SVO) and object-verb-subject (OVS) sentences in German, in a sentence-picture matching task. Hanne et al. manipulated case and number marking to disambiguate the sentences’ structure. Gazes to an OVS or SVO picture during the unfolding of a sentence were assumed to indicate prediction of the structure congruent with that picture. According to this measure, the PWA’s structural prediction was impaired compared to controls, but they did successfully predict upcoming structure when morphosyntactic cues were strong and unambiguous. Hanne et al.’s visual-world evidence is suggestive, but their forced-choice sentence-picture matching task places tight constraints on possible structural predictions. Clearer evidence of structural prediction would come from paradigms where the content of upcoming material is not as constrained. The current study used a self-paced reading paradigm to examine structural prediction among PWA in less constrained contexts. PWA (n=17) who

  7. Tooth width predictions in a sample of Black South Africans.

    Science.gov (United States)

    Khan, M I; Seedat, A K; Hlongwa, P

    2007-07-01

Space analysis during the mixed dentition requires prediction of the mesiodistal widths of the unerupted permanent canines and premolars, and prediction tables and equations may be used for this purpose. The Tanaka and Johnston prediction equations, which were derived from a North American White sample, are one widely used example. These equations may be inapplicable to other race groups due to racial tooth size variability. Therefore, the purpose of this study was to derive prediction equations that would be applicable to Black South African subjects. One hundred and ten pre-treatment study casts of Black South African subjects were analysed from the Department of Orthodontics' records at the University of Limpopo. The sample was equally divided by gender, with all subjects having a Class I molar relationship and relatively well aligned teeth. The mesiodistal widths of the maxillary and mandibular canines and premolars were measured with a digital vernier calliper and compared with the measurements predicted with the Tanaka and Johnston equations. The relationships between the measured and predicted values were analysed by correlation and regression analyses. The results indicated that the Tanaka and Johnston prediction equations were not fully applicable to the Black South African sample. The equations tended to underpredict in the male sample, while slight overprediction was observed in the female sample. Therefore, new equations were formulated and proposed that would be accurate for Black subjects.
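For reference, the Tanaka and Johnston estimates add half the summed mesiodistal widths of the four mandibular incisors to a constant (11.0 mm maxillary, 10.5 mm mandibular per quadrant), and a population-specific replacement can be derived by ordinary least squares; the regression pairs below are synthetic, not the study's measurements.

```python
import numpy as np

def tanaka_johnston(sum_mand_incisors_mm):
    """Tanaka-Johnston estimate of the canine + premolar segment width
    per quadrant: half the summed mesiodistal widths of the four
    mandibular incisors plus a constant."""
    half = sum_mand_incisors_mm / 2.0
    return {"maxillary": half + 11.0, "mandibular": half + 10.5}

print(tanaka_johnston(23.0))  # {'maxillary': 22.5, 'mandibular': 22.0}

# Deriving a population-specific equation instead: ordinary least squares
# on (incisor sum, measured segment width) pairs -- synthetic data here.
x = np.array([21.5, 22.0, 23.0, 23.5, 24.0, 25.0])
y = np.array([21.0, 21.4, 22.1, 22.3, 22.9, 23.5])
slope, intercept = np.polyfit(x, y, 1)
print(f"segment width ~ {intercept:.2f} + {slope:.2f} * incisor sum")
```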

  8. Integrated Waste Treatment Unit (IWTU) Input Coal Analyses and Off-Gass Filter (OGF) Content Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Jantzen, Carol M. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Missimer, David M. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Guenther, Chris P. [National Energy Technology Lab. (NETL), Morgantown, WV (United States); Shekhawat, Dushyant [National Energy Technology Lab. (NETL), Morgantown, WV (United States); VanEssendelft, Dirk T. [National Energy Technology Lab. (NETL), Morgantown, WV (United States); Means, Nicholas C. [AECOM Technology Corp., Oak Ridge, TN (United States)

    2015-04-23

in process piping and materials, in excessive off-gas absorbent loading, and in undesired process emissions. The ash content of the coal is important as the ash adds to the DMR and other vessel products, which affect the final waste product mass and composition. The amount and composition of the ash also affect the reaction kinetics. Thus, ash content and composition contribute to the mass balance. In addition, sodium, potassium, calcium, sulfur, and possibly silica and alumina in the ash may contribute to wall-scale formation. Sodium, potassium, and alumina in the ash will be overwhelmed by the sodium, potassium, and alumina from the feed, but the impact from the other ash components needs to be quantified. A maximum coal particle size is specified so the feed system does not plug, and a minimum particle size is specified to prevent excess elutriation from the DMR to the Process Gas Filter (PGF). A vendor specification was used to procure the calcined coal for IWTU processing. While the vendor supplied a composite analysis for the 22 tons of coal (Appendix A), this study compares independent analyses of the coal performed at the Savannah River National Laboratory (SRNL) and at the National Energy Technology Laboratory (NETL). Three supersacks were sampled at three different heights within the sack in order to determine within-bag and between-bag variability of the coal. These analyses were also compared to the vendor’s composite analyses and to the coal specification, as well as to historic data on Bestac coal analyses performed at Hazen Research Inc. (HRI) between 2004 and 2011.

  9. Reporting and methodological quality of meta-analyses in urological literature

    Directory of Open Access Journals (Sweden)

    Leilei Xia

    2017-04-01

Purpose: To assess the overall quality of published urological meta-analyses and identify predictive factors for high quality. Materials and Methods: We systematically searched PubMed to identify meta-analyses published from January 1st, 2011 to December 31st, 2015 in 10 predetermined major paper-based urology journals. The characteristics of the included meta-analyses were collected, and their reporting and methodological qualities were assessed by the PRISMA checklist (27 items) and the AMSTAR tool (11 items), respectively. Descriptive statistics were used for individual items as a measure of overall compliance, and PRISMA and AMSTAR scores were calculated as the sum of adequately reported domains. Logistic regression was used to identify predictive factors for high quality. Results: A total of 183 meta-analyses were included. The mean PRISMA and AMSTAR scores were 22.74 ± 2.04 and 7.57 ± 1.41, respectively. PRISMA item 5 (protocol and registration), items 15 and 22 (risk of bias across studies), and items 16 and 23 (additional analysis) had less than 50% adherence. AMSTAR item 1 (“a priori” design), item 5 (list of studies) and item 10 (publication bias) had less than 50% adherence. Logistic regression analyses showed that funding support and an “a priori” design were associated with superior reporting quality, while following the PRISMA guideline and having an “a priori” design were associated with superior methodological quality. Conclusions: Reporting and methodological qualities of recently published meta-analyses in major paper-based urology journals are generally good. Further improvement could potentially be achieved by strictly adhering to the PRISMA guideline and having an “a priori” protocol.

  10. Integrating and scheduling an open set of static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Mezini, Mira; Kloppenburg, Sven

    2006-01-01

    to keep the set of analyses open. We propose an approach to integrating and scheduling an open set of static analyses which decouples the individual analyses and coordinates the analysis executions such that the overall time and space consumption is minimized. The approach has been implemented...... for the Eclipse IDE and has been used to integrate a wide range of analyses such as finding bug patterns, detecting violations of design guidelines, or type system extensions for Java....

  11. Prediction of pork quality parameters by applying fractals and data mining on MRI

    DEFF Research Database (Denmark)

    Caballero, Daniel; Pérez-Palacios, Trinidad; Caro, Andrés

    2017-01-01

    Point Fractal Texture Algorithm, OPFTA) to analyse MRI in order to predict quality parameters of loin. In addition, the effect of the sequence acquisition of MRI (Gradient echo, GE; Spin echo, SE and Turbo 3D, T3D) and the predictive technique of data mining (Isotonic regression, IR and Multiple linear...... regression, MLR) were analysed. Both fractal algorithm, FTA and OPFTA are appropriate to analyse MRI of loins. The sequence acquisition, the fractal algorithm and the data mining technique seems to influence on the prediction results. For most physico-chemical parameters, prediction equations with moderate...

  12. Pan-Cancer Analyses of the Nuclear Receptor Superfamily

    Directory of Open Access Journals (Sweden)

    Mark D. Long

    2015-12-01

Nuclear receptors (NR) act as an integrated conduit for environmental and hormonal signals to govern genomic responses, which relate to cell fate decisions. We review how their integrated actions with each other, shared co-factors and other transcription factors are disrupted in cancer. Steroid hormone nuclear receptors are oncogenic drivers in breast and prostate cancer, and blockade of signaling is a major therapeutic goal. In contrast to receptor blockade, enhanced receptor function is attractive in other cancers, as illustrated initially with the targeting of retinoic acid receptors in leukemia. In the post-genomic era large consortia, such as The Cancer Genome Atlas, have developed a remarkable volume of genomic data with which to examine multiple aspects of nuclear receptor status in a pan-cancer manner. Therefore, to extend the review of NR function we have also undertaken bioinformatics analyses of NR expression in over 3000 tumors, spread across six different tumor types (bladder, breast, colon, head and neck, liver and prostate). Specifically, to ask how NR expression was distorted (altered expression, mutation and CNV), we applied bootstrapping approaches to simulate data for comparison, and also compared these NR findings to 12 other transcription factor families. Nuclear receptors were uniquely and uniformly downregulated across all six tumor types, more than predicted by chance. These approaches also revealed that each tumor type had a specific NR expression profile, with the profiles most similar between breast and prostate cancer. Some NRs were downregulated in at least five tumor types (e.g., NR3C2/MR and NR5A2/LRH-1), whereas others were uniquely downregulated in one tumor (e.g., NR1B3/RARG). The downregulation was not driven by copy number variation or mutation, and epigenetic mechanisms may be responsible for the altered nuclear receptor expression.

  13. Validation of HELIOS for ATR Core Follow Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Bays, Samuel E. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Swain, Emily T. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Crawford, Douglas S. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Nigg, David W. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-03-01

This work summarizes the validation analyses for the HELIOS code to support core design and safety assurance calculations of the Advanced Test Reactor (ATR). Past and current core safety assurance is performed by the PDQ-7 diffusion code, a reactor physics simulation tool from the nuclear industry's earlier days. Over the past twenty years, improvements in computational speed have enabled the use of modern neutron transport methodologies to replace the role of diffusion theory in the simulation of complex systems such as the ATR. More exact methodologies have enabled a paradigm shift away from highly tuned codes that force compliance with a bounding safety envelope, and towards codes regularly validated against routine measurements. To validate HELIOS, the 16 ATR operational cycles from late 2009 to present were modeled. The computed power distribution was compared against data collected by the ATR's on-line power surveillance system. It was found that the ATR's lobe powers could be determined with ±10% accuracy. Also, the ATR's cold startup shim configuration for each of these 16 cycles was estimated and compared against the reported critical position from the reactor log-book. HELIOS successfully predicted criticality within the tolerance set by the ATR startup procedure for 13 of the 16 cycles, compared to 12 for PDQ (without empirical adjustment). These findings, as well as other insights discussed in this report, suggest that HELIOS is well suited to replace PDQ for core safety assurance of the ATR. Furthermore, a modern verification and validation framework has been established that allows reactor and fuel performance data to be computed with a known degree of accuracy and stated uncertainty.

  14. Prediction of bull fertility.

    Science.gov (United States)

    Utt, Matthew D

    2016-06-01

Prediction of male fertility is an often sought-after endeavor for many species of domestic animals. This review will primarily focus on providing some examples of dependent and independent variables to stimulate thought about the approach and methodology of identifying the most appropriate of those variables to predict bull (bovine) fertility. Although the list of variables will continue to grow with advancements in science, the principles behind making predictions will likely not change significantly. The basic principle of prediction requires identifying a dependent variable that is an estimate of fertility and an independent variable or variables that may be useful in predicting the fertility estimate. Fertility estimates vary in which parts of the process leading to conception they reflect, in the amount of variation that influences the estimate, and in the uncertainty thereof. Potential independent variables can be divided into measures of sperm competence in bioassays and direct measurements of sperm attributes. A good prediction will use a sample population of bulls that is representative of the population to which an inference will be made. Both dependent and independent variables should have a dynamic range in their values. Careful selection of independent variables includes reasonable measurement repeatability and minimal correlation among variables. Proper estimation, and an appreciation of the degree of uncertainty, of dependent and independent variables are crucial for using predictions to make decisions regarding bull fertility. Copyright © 2016 Elsevier B.V. All rights reserved.

  15. Bootstrap prediction and Bayesian prediction under misspecified models

    OpenAIRE

    Fushiki, Tadayoshi

    2005-01-01

    We consider a statistical prediction problem under misspecified models. In a sense, Bayesian prediction is an optimal prediction method when an assumed model is true. Bootstrap prediction is obtained by applying Breiman's `bagging' method to a plug-in prediction. Bootstrap prediction can be considered to be an approximation to the Bayesian prediction under the assumption that the model is true. However, in applications, there are frequently deviations from the assumed model. In this paper, bo...

  16. Wind power prediction models

    Science.gov (United States)

    Levy, R.; Mcginness, H.

    1976-01-01

    Investigations were performed to predict the power available from the wind at the Goldstone, California, antenna site complex. The background for power prediction was derived from a statistical evaluation of available wind speed data records at this location and at nearby locations similarly situated within the Mojave desert. In addition to a model for power prediction over relatively long periods of time, an interim simulation model that produces sample wind speeds is described. The interim model furnishes uncorrelated sample speeds at hourly intervals that reproduce the statistical wind distribution at Goldstone. A stochastic simulation model to provide speed samples representative of both the statistical speed distributions and correlations is also discussed.
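A sketch of the interim-model idea described above: draw uncorrelated hourly speeds from an assumed Weibull distribution and convert them to available power with the cubic wind-power law. The shape, scale, rotor size and power coefficient are illustrative assumptions, not the fitted Goldstone statistics.

```python
import numpy as np

rng = np.random.default_rng(7)

def wind_power_w(v, rho=1.225, radius_m=10.0, cp=0.35):
    """Available power at wind speed v (m/s): P = 0.5 * rho * A * Cp * v^3."""
    area = np.pi * radius_m ** 2
    return 0.5 * rho * area * cp * v ** 3

# Interim-model stand-in: uncorrelated hourly speeds drawn from a
# Weibull distribution for one simulated year.
speeds = 7.0 * rng.weibull(2.0, size=24 * 365)
mean_power_kw = float(wind_power_w(speeds).mean()) / 1e3
print(f"mean available power ~ {mean_power_kw:.1f} kW")
```

A stochastic model that also reproduces the speed correlations would replace the independent hourly draws with a correlated process (e.g. an autoregressive one).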

  17. Some Observations on Damage Tolerance Analyses in Pressure Vessels

    Science.gov (United States)

    Raju, Ivatury S.; Dawicke, David S.; Hampton, Roy W.

    2017-01-01

    that for this loading, using Approach I and the initial detectable crack sizes at the (a/c) endpoints in 5009 specified for the ET and UT NDE methods, the smallest life is not at the two required limits of the (a/c) range, but rather is at an intermediate configuration in the range (a/c) of 0.4 to 0.6. Similar analyses using both Approach I and III with the initial detectable crack size at the (a/c) endpoints in 5009 for PT NDE showed the smallest life may be at an (a/c) endpoint or an intermediate (a/c), depending upon which Approach is used. As such, analyses that interrogate only the two (a/c) values of 0.2 and 1 may result in unconservative life predictions. The standard practice may need to be revised based on these results.

  18. Nonlinear piping damping and response predictions

    International Nuclear Information System (INIS)

    Severud, L.K.; Weiner, E.O.; Lindquist, M.R.; Anderson, M.J.; Wagner, S.E.

    1986-10-01

The high-level dynamic testing of four prototypic piping systems, used to provide benchmarks for analytical prediction comparisons, is overviewed. The pipes tested ranged from one inch to six inches in diameter and consisted of carbon steel or stainless steel. Failures of the tested systems involved progressive gross deformation or some combination of ratchetting and fatigue. Pretest failure predictions and post-test comparisons using simplified elastic and elasto-plastic methods are presented. Detailed nonlinear inelastic analyses are also shown, along with a typical ratchet-fatigue failure calculation. A simplified method for calculating modal equivalent viscous damping for snubbers and plastic hinges is also described. Conclusions are made regarding the applicability of the various analytical failure-prediction methods, and recommendations are made for future analytic and test efforts.

  19. Parsimony in personality: predicting sexual prejudice.

    Science.gov (United States)

    Miller, Audrey K; Wagner, Maverick M; Hunt, Amy N

    2012-01-01

    Extant research has established numerous demographic, personal-history, attitudinal, and ideological correlates of sexual prejudice, also known as homophobia. The present study investigated whether Five-Factor Model (FFM) personality domains, particularly Openness, and FFM facets, particularly Openness to Values, contribute independent and incremental variance to the prediction of sexual prejudice beyond these established correlates. Participants were 117 college students who completed a comprehensive FFM measure, measures of sexual prejudice, and a demographics, personal-history, and attitudes-and-ideologies questionnaire. Results of stepwise multiple regression analyses demonstrated that, whereas Openness domain score predicted only marginal incremental variance in sexual prejudice, Openness facet scores (particularly Openness to Values) predicted independent and substantial incremental variance beyond numerous other zero-order correlates of sexual prejudice. The importance of integrating FFM personality variables, especially facet-level variables, into conceptualizations of sexual prejudice is highlighted. Study strengths and weaknesses are discussed as are potential implications for prejudice-reduction interventions.
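The incremental-variance logic in this record can be sketched as a comparison of R² between nested least-squares models; the data, predictor names and effect sizes below are synthetic stand-ins, not the study's measurements.

```python
import numpy as np

rng = np.random.default_rng(3)

def r_squared(X, y):
    """R^2 of an ordinary least-squares fit with an intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return float(1.0 - resid.var() / y.var())

# Synthetic stand-ins: three established correlates plus an "Openness
# to Values" facet carrying unique signal (effect sizes are made up).
n = 117  # matches the study's sample size
correlates = rng.normal(size=(n, 3))
openness_values = rng.normal(size=n)
prejudice = correlates @ [0.4, 0.3, 0.2] - 0.5 * openness_values + rng.normal(size=n)

r2_base = r_squared(correlates, prejudice)
r2_full = r_squared(np.column_stack([correlates, openness_values]), prejudice)
print(f"incremental variance (delta R^2) = {r2_full - r2_base:.3f}")
```

With nested least-squares models fit to the same data, the fuller model's R² can never be smaller, so the delta isolates the facet's incremental contribution.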

  20. Accurate predictions for the LHC made easy

    CERN Multimedia

    CERN. Geneva

    2014-01-01

The data recorded by the LHC experiments are of very high quality. To get the most out of the data, precise theory predictions, including uncertainty estimates, are needed to reduce as much as possible the theoretical bias in the experimental analyses. Recently, significant progress has been made on Next-to-Leading Order (NLO) computations, including matching to the parton shower, that allow for these accurate, hadron-level predictions. I shall discuss one of these efforts, the MadGraph5_aMC@NLO program, which aims at the complete automation of predictions at NLO accuracy within the SM as well as New Physics theories. I’ll illustrate some of the theoretical ideas behind this program, show some selected applications to LHC physics, and describe the future plans.

  1. Integrating and scheduling an open set of static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Mezini, Mira; Kloppenburg, Sven

    2006-01-01

    To improve the productivity of the development process, more and more tools for static software analysis are tightly integrated into the incremental build process of an IDE. If multiple interdependent analyses are used simultaneously, the coordination between the analyses becomes a major obstacle...... to keep the set of analyses open. We propose an approach to integrating and scheduling an open set of static analyses which decouples the individual analyses and coordinates the analysis executions such that the overall time and space consumption is minimized. The approach has been implemented...

  2. Using a Log Analyser to Assist Research into Haptic Technology

    Science.gov (United States)

    Jónsson, Fannar Freyr; Hvannberg, Ebba Þóra

Usability evaluations collect subjective and objective measures; an example of the latter is time to complete a task. The paper describes use cases of a log analyser for haptic feedback. The log analyser reads a log file and extracts information such as the time of each practice and assessment session, analyses whether the user goes off curve, and measures the force applied. A case study using the analyser is performed with a PHANToM haptic learning environment application used to teach young visually impaired students the subject of polynomials. The paper answers six questions to illustrate further use cases of the log analyser.

  3. Strong earthquakes can be predicted: a multidisciplinary method for strong earthquake prediction

    Directory of Open Access Journals (Sweden)

    J. Z. Li

    2003-01-01

The imminent prediction of a group of strong earthquakes that occurred in Xinjiang, China in April 1997 is introduced in detail. The prediction was made on the basis of comprehensive analyses of the results obtained by multiple innovative methods, including measurements of crustal stress, observation of infrasonic waves in an ultra-low-frequency range, and recording of abnormal behavior of certain animals. Other successful examples of prediction are also enumerated. The statistics show that over 40% of the 20 total predictions jointly presented by J. Z. Li, Z. Q. Ren and others since 1995 can be regarded as effective. With the above methods, precursors of almost every strong earthquake around the world that occurred in recent years were recorded in our laboratory. However, the physical mechanisms of the observed precursors cannot yet be explained at this stage.

  4. Predictable return distributions

    DEFF Research Database (Denmark)

    Pedersen, Thomas Quistgaard

This paper provides detailed insights into predictability of the entire stock and bond return distribution through the use of quantile regression. This allows us to examine specific parts of the return distribution such as the tails or the center, and for a sufficiently fine grid of quantiles we can...... trace out the entire distribution. A univariate quantile regression model is used to examine stock and bond return distributions individually, while a multivariate model is used to capture their joint distribution. An empirical analysis on US data shows that certain parts of the return distributions...... are predictable as a function of economic state variables. The results are, however, very different for stocks and bonds. The state variables primarily predict only location shifts in the stock return distribution, while they also predict changes in higher-order moments in the bond return distribution. Out...
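Quantile regression for a given quantile tau minimizes the pinball (check) loss rather than squared error; a minimal sketch on synthetic return data (the state variable, coefficients and optimizer choice are illustrative, not the paper's setup):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(11)

# Synthetic stand-in: one economic state variable that shifts both the
# location and the scale of the return distribution.
n = 400
state = rng.normal(size=n)
returns = 0.2 * state + (1.0 + 0.5 * np.abs(state)) * rng.normal(size=n)

def pinball(beta, tau):
    """Check (pinball) loss of the linear quantile model a + b * state."""
    u = returns - (beta[0] + beta[1] * state)
    return float(np.mean(np.where(u >= 0, tau * u, (tau - 1.0) * u)))

# Fitting several quantiles traces out parts of the conditional distribution.
for tau in (0.1, 0.5, 0.9):
    beta = minimize(pinball, x0=[0.0, 0.0], args=(tau,), method="Nelder-Mead").x
    print(f"tau={tau}: intercept={beta[0]:+.2f}, slope={beta[1]:+.2f}")
```

Differing slopes across quantiles indicate that the state variable predicts more than a pure location shift.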

  5. Methane prediction in collieries

    CSIR Research Space (South Africa)

    Creedy, DP

    1999-06-01

    Full Text Available The primary aim of the project was to assess the current status of research on methane emission prediction for collieries in South Africa in comparison with methods used and advances achieved elsewhere in the world....

  6. Genomic Prediction in Barley

    DEFF Research Database (Denmark)

    Edriss, Vahid; Cericola, Fabio; Jensen, Jens D

    2015-01-01

Genomic prediction uses markers (SNPs) across the whole genome to predict individual breeding values at an early growth stage, potentially before large-scale phenotyping. One of the applications of genomic prediction in plant breeding is to identify the best individual candidate lines to contribute...... to next generation. The main goal of this study was to assess the potential of using genomic prediction in a commercial barley breeding program. The data used in this study was from the Nordic Seed company, located in Denmark. Around 350 advanced lines were genotyped with the 9K Barley chip from Illumina....... Traits used in this study were grain yield, plant height and heading date. Heading date is the number of days after 1st June for the plant to head. Heritabilities were 0.33, 0.44 and 0.48 for yield, height and heading, respectively, for the average of nine plots. The GBLUP model was used for genomic...
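A compact numpy sketch of GBLUP on synthetic genotypes (much smaller than the real 350-line, 9K-marker data; the VanRaden-style scaling and the variance ratio are assumptions made for illustration):

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic stand-in, smaller than the real data so the sketch runs fast.
n_lines, n_snps = 100, 500
Z = rng.integers(0, 3, size=(n_lines, n_snps)).astype(float)  # 0/1/2 genotypes
Z -= Z.mean(axis=0)                                           # centre each marker

true_g = Z @ rng.normal(0.0, 0.05, n_snps)   # true genetic values
y = true_g + rng.normal(0.0, 1.0, n_lines)   # phenotypes

# Genomic relationship matrix with a VanRaden-style scaling (an assumption).
G = Z @ Z.T / Z.var(axis=0).sum()

# GBLUP breeding values from the mixed-model solution
# g_hat = G (G + lambda I)^(-1) (y - mean), lambda = sigma_e^2 / sigma_g^2.
lam = 1.0
g_hat = G @ np.linalg.solve(G + lam * np.eye(n_lines), y - y.mean())

# Predictive accuracy: correlation between predicted and true genetic values.
print(round(float(np.corrcoef(g_hat, true_g)[0, 1]), 2))
```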

  7. CMAQ predicted concentration files

    Data.gov (United States)

    U.S. Environmental Protection Agency — CMAQ predicted ozone. This dataset is associated with the following publication: Gantt, B., G. Sarwar, J. Xing, H. Simon, D. Schwede, B. Hutzell, R. Mathur, and A....

  8. CMAQ predicted concentration files

    Data.gov (United States)

    U.S. Environmental Protection Agency — model predicted concentrations. This dataset is associated with the following publication: Muñiz-Unamunzaga, M., R. Borge, G. Sarwar, B. Gantt, D. de la Paz, C....

  9. Predicting the outcome of ankylosing spondylitis therapy

    Science.gov (United States)

    Vastesaeger, Nathan; van der Heijde, Désirée; Inman, Robert D; Wang, Yanxin; Deodhar, Atul; Hsu, Benjamin; Rahman, Mahboob U; Dijkmans, Ben; Geusens, Piet; Vander Cruyssen, Bert; Collantes, Eduardo; Sieper, Joachim; Braun, Jürgen

    2011-01-01

    Objectives To create a model that provides a potential basis for candidate selection for anti-tumour necrosis factor (TNF) treatment by predicting future outcomes relative to the current disease profile of individual patients with ankylosing spondylitis (AS). Methods ASSERT and GO–RAISE trial data (n=635) were analysed to identify baseline predictors for various disease-state and disease-activity outcome instruments in AS. Univariate, multivariate, receiver operator characteristic and correlation analyses were performed to select final predictors. Their associations with outcomes were explored. Matrix and algorithm-based prediction models were created using logistic and linear regression, and their accuracies were compared. Numbers needed to treat were calculated to compare the effect size of anti-TNF therapy between the AS matrix subpopulations. Data from registry populations were applied to study how a daily practice AS population is distributed over the prediction model. Results Age, Bath ankylosing spondylitis functional index (BASFI) score, enthesitis, therapy, C-reactive protein (CRP) and HLA-B27 genotype were identified as predictors. Their associations with each outcome instrument varied. However, the combination of these factors enabled adequate prediction of each outcome studied. The matrix model predicted outcomes as well as algorithm-based models and enabled direct comparison of the effect size of anti-TNF treatment outcome in various subpopulations. The trial populations reflected the daily practice AS population. Conclusion Age, BASFI, enthesitis, therapy, CRP and HLA-B27 were associated with outcomes in AS. Their combined use enables adequate prediction of outcome resulting from anti-TNF and conventional therapy in various AS subpopulations. This may help guide clinicians in making treatment decisions in daily practice. PMID:21402563

  10. UXO Burial Prediction Fidelity

    Science.gov (United States)

    2017-07-01

The Institute for Defense Analyses (IDA) is a not-for-profit company that operates three Federally Funded Research and Development Centers (FFRDCs). We perform scientific and technical analyses for the...

  11. Stuck pipe prediction

    KAUST Repository

    Alzahrani, Majed

    2016-03-10

    Disclosed are various embodiments for a prediction application to predict a stuck pipe. A linear regression model is generated from hook load readings at corresponding bit depths. A current hook load reading at a current bit depth is compared with a normal hook load reading from the linear regression model. A current hook load greater than a normal hook load for a given bit depth indicates the likelihood of a stuck pipe.
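The abstract describes the whole method: fit a linear model of hook load versus bit depth, then flag a current reading that sits above the normal trend. A minimal sketch with invented hook-load values and an assumed tolerance threshold (the disclosure does not specify one):

```python
import numpy as np

# Hypothetical hook-load log: readings (klbf) at increasing bit depths (ft).
depths = np.array([1000, 1500, 2000, 2500, 3000, 3500, 4000], dtype=float)
hook_loads = np.array([120, 132, 145, 156, 170, 181, 194], dtype=float)

# Fit the linear trend of normal hook load versus bit depth.
slope, intercept = np.polyfit(depths, hook_loads, deg=1)

def stuck_pipe_likely(current_depth, current_load, tolerance=10.0):
    """Flag a likely stuck pipe when the observed hook load exceeds the
    trend-predicted (normal) load by more than `tolerance` klbf."""
    normal_load = slope * current_depth + intercept
    return current_load - normal_load > tolerance

print(stuck_pipe_likely(4500, 210))  # near the trend: not flagged
print(stuck_pipe_likely(4500, 245))  # well above the trend: flagged
```

The tolerance would in practice come from the scatter of historical readings around the regression line.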

  12. Genomic prediction using subsampling

    OpenAIRE

    Xavier, Alencar; Xu, Shizhong; Muir, William; Rainey, Katy Martin

    2017-01-01

Background Genome-wide assisted selection is a critical tool for the genetic improvement of plants and animals. Whole-genome regression models in Bayesian framework represent the main family of prediction methods. Fitting such models with a large number of observations involves a prohibitive computational burden. We propose the use of subsampling bootstrap Markov chain in genomic prediction. Such method consists of fitting whole-genome regression models by subsampling observations in each rou...

  13. Predicting Ideological Prejudice

    OpenAIRE

    Brandt, Mark J.

    2017-01-01

A major shortcoming of current models of ideological prejudice is that although they can anticipate the direction of the association between participants' ideology and their prejudice against a range of target groups, they cannot predict the size of this association. I developed and tested models that can make specific size predictions for this association. A quantitative model that used the perceived ideology of the target group as the primary predictor of the ideology-prejudice relationship...

  14. Predicting Ideological Prejudice

    OpenAIRE

    Brandt, Mark

    2016-01-01

    A major shortcoming of current models of ideological prejudice is that although they can anticipate the direction of the association between participants’ ideology and their prejudice against a range of target groups, they cannot predict the size of this association. I developed and tested models that can make specific size predictions. A quantitative model that used the perceived ideology of the target group as the primary predictor of the ideology-prejudice relationship was developed with a...

  15. Predicting ideological prejudice

    OpenAIRE

    Brandt, M.J.

    2018-01-01

    A major shortcoming of current models of ideological prejudice is that although they can anticipate the direction of the association between participants’ ideology and their prejudice against a range of target groups, they cannot predict the size of this association. I developed and tested models that can make specific size predictions for this association. A quantitative model that used the perceived ideology of the target group as the primary predictor of the ideology-prejudice relationship...

  16. Multimodal Emoji Prediction

    OpenAIRE

    Barbieri, Francesco; Ballesteros, Miguel; Ronzano, Francesco; Saggion, Horacio

    2018-01-01

    Emojis are small images that are commonly included in social media text messages. The combination of visual and textual content in the same message builds up a modern way of communication, that automatic systems are not used to deal with. In this paper we extend recent advances in emoji prediction by putting forward a multimodal approach that is able to predict emojis in Instagram posts. Instagram posts are composed of pictures together with texts which sometimes include emojis. We show that ...

  17. Genomic prediction using subsampling.

    Science.gov (United States)

    Xavier, Alencar; Xu, Shizhong; Muir, William; Rainey, Katy Martin

    2017-03-24

Genome-wide assisted selection is a critical tool for the genetic improvement of plants and animals. Whole-genome regression models in a Bayesian framework represent the main family of prediction methods. Fitting such models with a large number of observations involves a prohibitive computational burden. We propose the use of a subsampling bootstrap Markov chain in genomic prediction. The method consists of fitting whole-genome regression models by subsampling observations in each round of a Markov chain Monte Carlo. We evaluated the effect of subsampling bootstrap on prediction and computational parameters. Across datasets, we observed an optimal subsampling proportion of observations around 50% with replacement, and around 33% without replacement. Subsampling provided a substantial decrease in computation time, reducing the time to fit the model by half. On average, losses on predictive properties imposed by subsampling were negligible, usually below 1%. For each dataset, an optimal subsampling point that improves prediction properties was observed, but the improvements were also negligible. Combining subsampling with Gibbs sampling yields an interesting ensemble algorithm. The investigation indicates that the subsampling bootstrap Markov chain algorithm substantially reduces the computational burden associated with model fitting, and it may slightly enhance prediction properties.
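The subsampling idea can be illustrated without the full Bayesian machinery. The sketch below uses simulated genotypes and a plain ridge-regression solver standing in for the authors' Gibbs sampler; the roughly 50%-with-replacement subsampling proportion is the one reported as optimal, and all data values are invented:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated genotypes (n observations x p markers) and phenotypes.
n, p = 200, 500
X = rng.integers(0, 3, size=(n, p)).astype(float)  # 0/1/2 allele counts
true_beta = np.zeros(p)
true_beta[:20] = rng.normal(0, 0.5, 20)            # 20 causal markers
y = X @ true_beta + rng.normal(0, 1.0, n)

lam = 50.0  # ridge (shrinkage) parameter, standing in for the Bayesian prior

def ridge_fit(Xs, ys):
    p_ = Xs.shape[1]
    return np.linalg.solve(Xs.T @ Xs + lam * np.eye(p_), Xs.T @ ys)

# Subsampling: each "round" fits the whole-genome model on ~50% of the
# observations drawn with replacement; marker effects are averaged over rounds.
rounds, frac = 50, 0.5
beta_acc = np.zeros(p)
for _ in range(rounds):
    idx = rng.choice(n, size=int(frac * n), replace=True)
    beta_acc += ridge_fit(X[idx], y[idx])
beta_hat = beta_acc / rounds

# The subsampled estimate should track the full-data fit closely, at roughly
# half the per-round fitting cost.
full_beta = ridge_fit(X, y)
corr = np.corrcoef(beta_hat, full_beta)[0, 1]
print(round(float(corr), 2))
```

Each round only touches half the observations, which is where the reported halving of fitting time comes from.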

  18. Empirical ground motion prediction

    Directory of Open Access Journals (Sweden)

    R. J. Archuleta

    1994-06-01

Full Text Available New methods of site-specific ground motion prediction in the time and frequency domains are presented. A large earthquake is simulated as a composite (linear combination) of observed small earthquakes (subevents) assuming Aki-Brune functional models of the source time functions (spectra). Source models incorporate basic scaling relations between source and spectral parameters. Ground motion predictions are consistent with the entire observed seismic spectrum from the lowest to the highest frequencies. These methods are designed to use all the available empirical Green’s functions (or any subset of observations) at a site. Thus a prediction is not biased by a single record, and different possible source-receiver paths are taken into account. Directivity is accounted for by adjusting the apparent source duration at each site. Our time-series prediction algorithm is based on determination of a non-uniform distribution of rupture times of subevents. By introducing a specific rupture velocity we avoid the major problem of deficiency of predictions around the main event's corner frequency. A novel notion of partial coherence allows us to sum subevents' amplitude spectra directly without using any information on their rupture times and phase histories. Predictions by this spectral method are not dependent on details of rupture nucleation and propagation, location of asperities and other predominantly phase-affecting factors, responsible for uncertainties in time-domain simulations.
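A toy version of the composite-source idea: a large event is built as a linear combination of delayed copies of an observed subevent record, with rupture times spread over the fault duration by a finite rupture velocity. The subevent waveform and every parameter below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
dt, nt = 0.01, 1024
t = np.arange(nt) * dt

# Stand-in "empirical Green's function": a damped oscillation at the site.
subevent = np.exp(-3 * t) * np.sin(2 * np.pi * 5 * t)

# Compose a large event as a linear combination of N delayed subevents.
# A finite rupture velocity spreads the rupture times over the rupture
# duration, which fills in energy around the main event's corner frequency.
n_sub, rupture_duration = 25, 2.0
rupture_times = np.sort(rng.uniform(0, rupture_duration, n_sub))

composite = np.zeros(nt)
for t0 in rupture_times:
    shift = int(round(t0 / dt))
    composite[shift:] += subevent[:nt - shift]

# At the lowest frequencies the subevents sum nearly coherently, so the
# composite's spectral level scales roughly with the number of subevents.
spec_sub = np.abs(np.fft.rfft(subevent))
spec_big = np.abs(np.fft.rfft(composite))
ratio = spec_big[1] / spec_sub[1]
print(round(float(ratio), 1))
```

A real application would use recorded subevent seismograms and scaling-law weights rather than identical copies.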

  19. Deep Visual Attention Prediction

    Science.gov (United States)

    Wang, Wenguan; Shen, Jianbing

    2018-05-01

In this work, we aim to predict human eye fixation with view-free scenes based on an end-to-end deep learning architecture. Although Convolutional Neural Networks (CNNs) have made substantial improvement on human attention prediction, there is still a need to improve CNN-based attention models by efficiently leveraging multi-scale features. Our visual attention network is proposed to capture hierarchical saliency information from deep, coarse layers with global saliency information to shallow, fine layers with local saliency response. Our model is based on a skip-layer network structure, which predicts human attention from multiple convolutional layers with various reception fields. Final saliency prediction is achieved via the cooperation of those global and local predictions. Our model is learned in a deep supervision manner, where supervision is directly fed into multi-level layers, instead of previous approaches of providing supervision only at the output layer and propagating this supervision back to earlier layers. Our model thus incorporates multi-level saliency predictions within a single network, which significantly decreases the redundancy of previous approaches of learning multiple network streams with different input scales. Extensive experimental analysis on various challenging benchmark datasets demonstrates that our method yields state-of-the-art performance with competitive inference time.

  20. Predictability of geomagnetic series

    Directory of Open Access Journals (Sweden)

    E. Bellanger

Full Text Available The aim of this paper is to present a practical, rational and rigorous approach concerning what can be done, based on the knowledge of magnetic series, in the field of prediction of extreme geomagnetic events. We compare the magnetic vector differential at different locations computed with different resolutions, from an entire day to minutes. We study the classical correlations and the simplest possible prediction scheme, and conclude that the magnetic vector variation shows a high level of predictability. The results obtained are far from random guessing: the error diagrams are either comparable with earthquake prediction studies or outperform them when the minute sampling is used in accounting for hourly magnetic vector variation. We demonstrate how magnetic extreme events can be predicted from the hourly value of the magnetic variation with a lead time of several hours. We compute the 2-D empirical distribution of consecutive values of the magnetic vector variation for the estimation of conditional probabilities of different types. The achieved results encourage further development of the approach to prediction of extreme geomagnetic events.

    Key words. Ionosphere (modeling and forecasting – Magnetospheric physics (storms and substorms
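The conditional-probability step can be sketched on a synthetic persistent series standing in for the hourly magnetic vector variation (the real input is observatory data, not the AR(1) noise assumed here, and the thresholds are illustrative quantiles):

```python
import numpy as np

rng = np.random.default_rng(2)

# Stand-in for an hourly magnetic-variation series: a persistent (AR(1))
# process, so large values tend to follow large values.
n = 20000
x = np.zeros(n)
for i in range(1, n):
    x[i] = 0.8 * x[i - 1] + rng.normal()

extreme = np.quantile(np.abs(x), 0.95)  # "extreme event" threshold
high = np.quantile(np.abs(x), 0.75)     # "currently disturbed" level

# 2-D empirical distribution of consecutive values, reduced to the
# conditional probability P(|x_{t+1}| extreme | current level of |x_t|).
a, b = np.abs(x[:-1]), np.abs(x[1:])
p_given_high = np.mean(b[a >= high] >= extreme)
p_given_quiet = np.mean(b[a < high] >= extreme)
print(p_given_high > p_given_quiet)  # persistence is what makes prediction work
```

An error-diagram evaluation would then trade off the false-alarm rate of an alarm raised whenever `|x_t|` exceeds the disturbed level against the fraction of missed extremes.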

  1. Integrative analyses shed new light on human ribosomal protein gene regulation.

    Science.gov (United States)

    Li, Xin; Zheng, Yiyu; Hu, Haiyan; Li, Xiaoman

    2016-06-27

Ribosomal protein genes (RPGs) are important house-keeping genes that are well-known for their coordinated expression. Previous studies on RPGs are largely limited to their promoter regions. Recent high-throughput studies provide an unprecedented opportunity to study how human RPGs are transcriptionally modulated and how such transcriptional regulation may contribute to coordinated gene expression in various tissues and cell types. By analyzing the DNase I hypersensitive sites under 349 experimental conditions, we predicted 217 RPG regulatory regions in the human genome. More than 86.6% of these computationally predicted regulatory regions were partially corroborated by independent experimental measurements. Motif analyses on these predicted regulatory regions identified 31 DNA motifs, including 57.1% of experimentally validated motifs in the literature that regulate RPGs. Interestingly, we observed that the majority of the predicted motifs were shared by the predicted distal and proximal regulatory regions of the same RPGs, a likely general mechanism for enhancer-promoter interactions. We also found that RPGs may be differently regulated in different cells, indicating that condition-specific RPG regulatory regions still need to be discovered and investigated. Our study advances the understanding of how RPGs are coordinately modulated, which sheds light on the general principles of gene transcriptional regulation in mammals.

  2. Cytomics in predictive medicine

    Science.gov (United States)

    Tarnok, Attila; Valet, Guenther K.

    2004-07-01

Predictive Medicine aims at the detection of changes in a patient's disease state prior to the manifestation of deterioration or improvement of the current status. Patient-specific, disease-course predictions with >95% or >99% accuracy during therapy would be highly valuable for everyday medicine. If these predictors were available, disease aggravation or progression, frequently accompanied by irreversible tissue damage or therapeutic side effects, could then potentially be avoided by early preventive therapy. The molecular analysis of heterogeneous cellular systems (Cytomics) by cytometry, in conjunction with pattern-oriented bioinformatic analysis of the multiparametric cytometric and other data, provides a promising approach to individualized or personalized medical treatment or disease management. Predictive medicine is best implemented by cell oriented measurements e.g. by flow or image cytometry. Cell oriented gene or protein arrays as well as bead arrays for the capture of solute molecules from serum, plasma, urine or liquor are equally of high value. Clinical applications of predictive medicine by Cytomics will include multi organ failure in sepsis or non infectious posttraumatic shock in intensive care, or the pretherapeutic identification of high risk patients in cancer cytostatic therapy. Early individualized therapy may provide better survival chances for the individual patient with concomitant cost containment. Predictive medicine guided early reduction or stop of therapy may lower or abrogate potential therapeutic side effects. Further important aspects of predictive medicine concern the preoperative identification of patients with a tendency for postoperative complications or coronary artery disease patients with an increased tendency for restenosis. As a consequence, better patient care and new forms of inductive scientific hypothesis development based on the interpretation of predictive data patterns are within reach.

  3. Evaluation of fracture mechanics analyses used in RPV integrity assessment regarding brittle fracture

    International Nuclear Information System (INIS)

    Moinereau, D.; Faidy, C.; Valeta, M.P.; Bhandari, S.; Guichard, D.

    1997-01-01

Electricite de France has conducted during these last years some experimental and numerical research programmes in order to evaluate fracture mechanics analyses used in nuclear reactor pressure vessel structural integrity assessment, regarding the risk of brittle fracture. These programmes included cleavage fracture tests on large scale cladded specimens containing subclad flaws, with their interpretations by 2D and 3D numerical computations, and validation of finite element codes for pressurized thermal shock analyses. Four cladded specimens made of ferritic steel A508 C13 with stainless steel cladding, and containing shallow subclad flaws, have been tested in four point bending at very low temperature in order to obtain cleavage failure. The specimen failure was obtained in each case in base metal by cleavage fracture. These tests have been interpreted by two-dimensional and three-dimensional finite element computations using different fracture mechanics approaches (elastic analysis with specific plasticity corrections, elastic-plastic analysis, local approach to cleavage fracture). The failure of the specimens is conservatively predicted by the different analyses. The comparison between the elastic analyses and elastic-plastic analyses shows the conservatism of the specific plasticity corrections used in French RPV elastic analyses. Numerous finite element calculations have also been performed between EDF, CEA and Framatome in order to compare and validate several fracture mechanics post-processors implemented in the finite element programmes used in pressurized thermal shock analyses. This work includes two-dimensional numerical computations on specimens with different geometries and loadings. The comparisons show a rather good agreement on the main results, allowing validation of the finite element codes and their post-processors. (author). 11 refs, 24 figs, 3 tabs

  4. Transionospheric propagation predictions

    Science.gov (United States)

    Klobucher, J. A.; Basu, S.; Basu, S.; Bernhardt, P. A.; Davies, K.; Donatelli, D. E.; Fremouw, E. J.; Goodman, J. M.; Hartmann, G. K.; Leitinger, R.

    1979-01-01

    The current status and future prospects of the capability to make transionospheric propagation predictions are addressed, highlighting the effects of the ionized media, which dominate for frequencies below 1 to 3 GHz, depending upon the state of the ionosphere and the elevation angle through the Earth-space path. The primary concerns are the predictions of time delay of signal modulation (group path delay) and of radio wave scintillation. Progress in these areas is strongly tied to knowledge of variable structures in the ionosphere ranging from the large scale (thousands of kilometers in horizontal extent) to the fine scale (kilometer size). Ionospheric variability and the relative importance of various mechanisms responsible for the time histories observed in total electron content (TEC), proportional to signal group delay, and in irregularity formation are discussed in terms of capability to make both short and long term predictions. The data base upon which predictions are made is examined for its adequacy, and the prospects for prediction improvements by more theoretical studies as well as by increasing the available statistical data base are examined.
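The group path delay mentioned here is proportional to TEC and to 1/f², which is why ionospheric effects dominate below a few GHz. A minimal calculation using the standard first-order relation delta_t = 40.3·TEC/(c·f²); the frequency and TEC values are illustrative:

```python
# Ionospheric group path delay for a signal of frequency f (Hz) traversing a
# total electron content TEC (electrons/m^2): delta_t = 40.3 * TEC / (c * f**2).
# 1 TEC unit (TECU) = 1e16 electrons/m^2.

C = 2.998e8  # speed of light, m/s

def group_delay_ns(tec_units, freq_hz):
    tec = tec_units * 1e16
    return 40.3 * tec / (C * freq_hz**2) * 1e9  # delay in nanoseconds

# GPS L1 (1.57542 GHz) through a moderate daytime ionosphere of 50 TECU:
print(round(group_delay_ns(50, 1.57542e9), 1))  # about 27 ns
```

The 1/f² dependence means the same ionosphere delays a 400 MHz signal roughly 15 times more than the L1 example, which is why TEC prediction matters most for lower-frequency Earth-space links.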

  5. Seasonal predictability of Kiremt rainfall in coupled general circulation models

    Science.gov (United States)

    Gleixner, Stephanie; Keenlyside, Noel S.; Demissie, Teferi D.; Counillon, François; Wang, Yiguo; Viste, Ellen

    2017-11-01

    The Ethiopian economy and population is strongly dependent on rainfall. Operational seasonal predictions for the main rainy season (Kiremt, June-September) are based on statistical approaches with Pacific sea surface temperatures (SST) as the main predictor. Here we analyse dynamical predictions from 11 coupled general circulation models for the Kiremt seasons from 1985-2005 with the forecasts starting from the beginning of May. We find skillful predictions from three of the 11 models, but no model beats a simple linear prediction model based on the predicted Niño3.4 indices. The skill of the individual models for dynamically predicting Kiremt rainfall depends on the strength of the teleconnection between Kiremt rainfall and concurrent Pacific SST in the models. Models that do not simulate this teleconnection fail to capture the observed relationship between Kiremt rainfall and the large-scale Walker circulation.
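The linear benchmark the dynamical models are compared against can be sketched as a leave-one-out regression of Kiremt rainfall on the Niño3.4 index. The data below are synthetic stand-ins with an assumed negative teleconnection (El Niño years tending to be dry), not the ERA/observational series used in the study:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical seasonal hindcast setup (21 seasons, cf. 1985-2005):
# predicted Nino3.4 index vs. observed Kiremt (Jun-Sep) rainfall anomaly.
years = 21
nino34 = rng.normal(0, 1, years)
rain = -0.6 * nino34 + rng.normal(0, 0.8, years)  # assumed teleconnection + noise

# Leave-one-out skill of the simple linear benchmark.
preds = []
for k in range(years):
    mask = np.arange(years) != k
    slope, intercept = np.polyfit(nino34[mask], rain[mask], 1)
    preds.append(slope * nino34[k] + intercept)
skill = np.corrcoef(preds, rain)[0, 1]
print(skill > 0)  # the bar any dynamical model must clear
```

A coupled model only adds value here if its hindcast correlation beats `skill`, which is the comparison the abstract reports none of the eleven models winning.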

  6. Methods and procedures for shielding analyses for the SNS

    International Nuclear Information System (INIS)

    Popova, I.; Ferguson, F.; Gallmeier, F.X.; Iverson, E.; Lu, Wei

    2011-01-01

In order to provide radiologically safe Spallation Neutron Source operation, shielding analyses are performed according to Oak Ridge National Laboratory internal regulations and to comply with the Code of Federal Regulations. An overview of on-going shielding work for the accelerator facility and neutron beam lines, the methods used for the analyses, and the associated procedures and regulations are presented. (author)

  7. Single crystal analyser accepting the narrowest neutron angular profile

    International Nuclear Information System (INIS)

    Abbas, Sohrab; Wagh, Apoorva G.; Strobl, Markus; Treimer, Wolfgang

    2007-01-01

We have designed, fabricated and tested a novel silicon single crystal analyser. It accepts a 0.21 arcsec (FWHM) wide angular profile of a monochromatic 5.24 Å neutron beam, in agreement with its design. This is the narrowest and sharpest acceptance angular profile attained to date in the world with a neutron analyser. This analyser will facilitate SUSANS experiments probing wave vector transfers Q ∼ 10⁻⁶ Å⁻¹. (author)

  8. Genomic Prediction of Gene Bank Wheat Landraces

    Directory of Open Access Journals (Sweden)

    José Crossa

    2016-07-01

Full Text Available This study examines genomic prediction within 8416 Mexican landrace accessions and 2403 Iranian landrace accessions stored in gene banks. The Mexican and Iranian collections were evaluated in separate field trials, including an optimum environment for several traits, and in two separate environments (drought, D, and heat, H) for the highly heritable traits, days to heading (DTH) and days to maturity (DTM). Analyses accounting and not accounting for population structure were performed. Genomic prediction models include genotype × environment interaction (G × E). Two alternative prediction strategies were studied: (1) random cross-validation of the data in 20% training (TRN) and 80% testing (TST) (TRN20-TST80) sets, and (2) two types of core sets, “diversity” and “prediction”, including 10% and 20%, respectively, of the total collections. Accounting for population structure decreased prediction accuracy by 15–20% as compared to prediction accuracy obtained when not accounting for population structure. Accounting for population structure gave prediction accuracies for traits evaluated in one environment for TRN20-TST80 that ranged from 0.407 to 0.677 for Mexican landraces, and from 0.166 to 0.662 for Iranian landraces. Prediction accuracy of the 20% diversity core set was similar to accuracies obtained for TRN20-TST80, ranging from 0.412 to 0.654 for Mexican landraces, and from 0.182 to 0.647 for Iranian landraces. The predictive core set gave similar prediction accuracy as the diversity core set for Mexican collections, but slightly lower for Iranian collections. Prediction accuracy when incorporating G × E for DTH and DTM for Mexican landraces for TRN20-TST80 was around 0.60, which is greater than without the G × E term. For Iranian landraces, accuracies were 0.55 for the G × E model with TRN20-TST80. Results show promising prediction accuracies for potential use in germplasm enhancement and rapid introgression of exotic germplasm.
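The TRN20-TST80 strategy can be sketched as follows, with simulated genotypes and ridge regression standing in for the genomic prediction model actually used; accuracy is measured, as is common, by the correlation between predicted and observed values:

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulated accessions: genotypes and a heritable trait (e.g. days to heading).
n, p = 300, 400
X = rng.integers(0, 3, (n, p)).astype(float)
beta = rng.normal(0, 0.3, p)
y = X @ beta + rng.normal(0, 2.0, n)

lam = 100.0  # shrinkage, standing in for the genomic model's prior

def fit_predict(tr, ts):
    b = np.linalg.solve(X[tr].T @ X[tr] + lam * np.eye(p), X[tr].T @ y[tr])
    return X[ts] @ b

# Random cross-validation with 20% training / 80% testing (TRN20-TST80).
accs = []
for _ in range(10):
    perm = rng.permutation(n)
    tr, ts = perm[: n // 5], perm[n // 5:]
    accs.append(np.corrcoef(fit_predict(tr, ts), y[ts])[0, 1])
mean_acc = float(np.mean(accs))
print(round(mean_acc, 2))
```

Training on only 20% of accessions mimics the gene-bank setting, where phenotyping the full collection is infeasible and a small characterized core must predict the rest.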

  9. Tide Predictions, California, 2014, NOAA

    Data.gov (United States)

    U.S. Environmental Protection Agency — The predictions from the web based NOAA Tide Predictions are based upon the latest information available as of the date of the user's request. Tide predictions...

  10. Analysing harmonic motions with an iPhone’s magnetometer

    Science.gov (United States)

    Yavuz, Ahmet; Kağan Temiz, Burak

    2016-05-01

    In this paper, we propose an experiment for analysing harmonic motion using an iPhone’s (or iPad’s) magnetometer. This experiment consists of the detection of magnetic field variations obtained from an iPhone’s magnetometer sensor. A graph of harmonic motion is directly displayed on the iPhone’s screen using the Sensor Kinetics application. Data from this application was analysed with Eureqa software to establish the equation of the harmonic motion. Analyses show that the use of an iPhone’s magnetometer to analyse harmonic motion is a practical and effective method for small oscillations and frequencies less than 15-20 Hz.
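A minimal version of the analysis step: given a magnetometer trace recorded near the oscillating object, the motion frequency can be read off the spectrum. The trace below is simulated (an assumed 4 Hz oscillation sampled at 100 Hz, well below the 15-20 Hz limit the authors report), whereas the paper fits the recorded data with Eureqa:

```python
import numpy as np

# Simulated magnetometer trace: the field reading oscillates as a magnet on a
# pendulum swings past the sensor; values in microtesla, sampled at 100 Hz.
fs, duration, f_true = 100.0, 10.0, 4.0
t = np.arange(0, duration, 1 / fs)
b = (45.0 + 6.0 * np.sin(2 * np.pi * f_true * t)
     + np.random.default_rng(5).normal(0, 0.5, t.size))

# Estimate the oscillation frequency from the spectrum (remove the DC offset
# of the ambient field first).
spec = np.abs(np.fft.rfft(b - b.mean()))
freqs = np.fft.rfftfreq(b.size, 1 / fs)
f_est = freqs[np.argmax(spec)]
print(f_est)  # -> 4.0
```

The 10 s record gives a 0.1 Hz frequency resolution; longer recordings sharpen the estimate the same way they would in the Sensor Kinetics export.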

  11. SENSITIVITY ANALYSIS FOR SALTSTONE DISPOSAL UNIT COLUMN DEGRADATION ANALYSES

    Energy Technology Data Exchange (ETDEWEB)

    Flach, G.

    2014-10-28

PORFLOW-related analyses supporting a sensitivity analysis for Saltstone Disposal Unit (SDU) column degradation were performed. Previous analyses (Flach and Taylor 2014) used a model in which the SDU columns degraded in a piecewise manner from the top and bottom simultaneously. The current analyses employ a model in which all pieces of the column degrade at the same time. Information was extracted from the analyses which may be useful in determining the distribution of Tc-99 in the various SDUs throughout time and in determining flow balances for the SDUs.

  12. Inverse and Predictive Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Syracuse, Ellen Marie [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-27

The LANL Seismo-Acoustic team has a strong capability in developing data-driven models that accurately predict a variety of observations. These models range from the simple – one-dimensional models that are constrained by a single dataset and can be used for quick and efficient predictions – to the complex – multidimensional models that are constrained by several types of data and result in more accurate predictions. Team members typically build models of geophysical characteristics of Earth and source distributions at scales of 1 to 1000s of km, but the techniques used are applicable to other types of physical characteristics at an even greater range of scales. The following cases provide a snapshot of some of the modeling work done by the Seismo-Acoustic team at LANL.

  13. Predicting tile drainage discharge

    DEFF Research Database (Denmark)

    Iversen, Bo Vangsø; Kjærgaard, Charlotte; Petersen, Rasmus Jes

More than 50% of Danish agricultural areas are expected to be artificially tile drained. Transport of water and nutrients through the tile drain system to the aquatic environment is expected to be significant. For different mitigation strategies such as constructed wetlands, an exact knowledge...... of the water load coming from the tile drainage system is therefore essential. This work aims at predicting tile drainage discharge using dynamic as well as statistical predictive models. A large dataset of historical tile drain discharge data, daily discharge values as well as yearly average values were...... used in the analysis. For the dynamic modelling, a simple linear reservoir model was used where different outlets in the model represented tile drain as well as groundwater discharge outputs. This modelling was based on daily measured tile drain discharge values. The statistical predictive model...
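A single-outlet linear reservoir, the building block of the dynamic model described above, can be sketched in a few lines. The paper's model uses separate outlets for tile-drain and groundwater discharge; the 20-day time constant and the recharge series here are assumptions:

```python
# Linear reservoir: storage S (mm) drains as Q = S / k, driven by daily
# recharge R (mm/day); k (days) is an assumed time constant.
k = 20.0
recharge = [5.0] * 10 + [0.0] * 50  # a 10-day wet spell, then dry weather

S, Q = 0.0, []
for R in recharge:
    S += R        # daily recharge fills the storage
    q = S / k     # tile-drain discharge (mm/day)
    S -= q
    Q.append(q)

# After recharge stops, discharge recedes geometrically by (1 - 1/k) per day.
print(round(Q[10] / Q[11], 3))
```

Fitting the model to observed drain discharge amounts to estimating `k` (and, with a second outlet, the split between fast tile flow and slower groundwater flow).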

  14. Predicting Ideological Prejudice.

    Science.gov (United States)

    Brandt, Mark J

    2017-06-01

    A major shortcoming of current models of ideological prejudice is that although they can anticipate the direction of the association between participants' ideology and their prejudice against a range of target groups, they cannot predict the size of this association. I developed and tested models that can make specific size predictions for this association. A quantitative model that used the perceived ideology of the target group as the primary predictor of the ideology-prejudice relationship was developed with a representative sample of Americans ( N = 4,940) and tested against models using the perceived status of and choice to belong to the target group as predictors. In four studies (total N = 2,093), ideology-prejudice associations were estimated, and these observed estimates were compared with the models' predictions. The model that was based only on perceived ideology was the most parsimonious with the smallest errors.

  15. Permeability prediction in chalks

    DEFF Research Database (Denmark)

    Alam, Mohammad Monzurul; Fabricius, Ida Lykke; Prasad, Manika

    2011-01-01

The velocity of elastic waves is the primary datum available for acquiring information about subsurface characteristics such as lithology and porosity. Cheap and quick (spatial coverage, ease of measurement) information on permeability can be achieved if sonic velocity is used for permeability...... prediction, so we have investigated the use of velocity data to predict permeability. The compressional velocity from wireline logs and core plugs of the chalk reservoir in the South Arne field, North Sea, has been used for this study. We compared various methods of permeability prediction from velocities.... The relationships between permeability and porosity from core data were first examined using Kozeny’s equation. The data were analyzed for any correlations to the specific surface of the grain, Sg, and to the hydraulic property defined as the flow zone indicator (FZI). These two methods use two different approaches...
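The two porosity-permeability quantities named above can be sketched numerically. The Kozeny coefficient and the sample values are assumptions (one common form of the equation is shown); the FZI expression is the standard RQI-based form with permeability in millidarcy:

```python
import math

# One common form of Kozeny's equation: k = c * phi**3 / S**2, with porosity
# phi, specific surface S (per unit bulk volume, 1/m) and Kozeny factor c
# (the often-quoted c ~ 0.25 for circular tubes; an assumption here).
def kozeny_permeability_m2(phi, S, c=0.25):
    return c * phi**3 / S**2

# Flow zone indicator from core porosity and permeability (k in mD):
# RQI = 0.0314 * sqrt(k / phi), phi_z = phi / (1 - phi), FZI = RQI / phi_z.
def fzi(phi, k_md):
    rqi = 0.0314 * math.sqrt(k_md / phi)
    return rqi * (1 - phi) / phi

# A chalk-like sample: 35% porosity, 2 mD.
print(round(fzi(0.35, 2.0), 3))
```

Samples with similar FZI fall on one porosity-permeability trend, which is why FZI (like Sg) can act as the link between a velocity-derived porosity and a permeability estimate.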

  16. Predicting the Sunspot Cycle

    Science.gov (United States)

    Hathaway, David H.

    2009-01-01

The 11-year sunspot cycle was discovered by an amateur astronomer in 1844. Visual and photographic observations of sunspots have been made by both amateurs and professionals over the last 400 years. These observations provide key statistical information about the sunspot cycle that allows for predictions of future activity. However, sunspots and the sunspot cycle are magnetic in nature. For the last 100 years these magnetic measurements have been acquired and used exclusively by professional astronomers to gain new information about the nature of the solar activity cycle. Recently, magnetic dynamo models have evolved to the stage where they can assimilate past data and provide predictions. With the advent of the Internet and open data policies, amateurs now have equal access to the same data used by professionals and equal opportunities to contribute (but, alas, without pay). This talk will describe some of the more useful prediction techniques and reveal what they say about the intensity of the upcoming sunspot cycle.

  17. [Analyses prognostic factors relevant to sudden sensorineural hearing loss].

    Science.gov (United States)

    Wang, Jun; Xiao, Shuifang; Zeng, Zhengang; Zhen, Zhen; Zhang, Xuexi; Lin, Feng; Dong, Mingmin; Lu, Wei; Qin, Zhaobing; Zuo, Bin; Bai, Xianfeng

    2015-06-01

To investigate the prognostic factors relevant to sudden sensorineural hearing loss. The internationally accepted standardized clinical research methods, unified design, and unified program were adopted to conduct this prospective multi-center clinical study. Sudden deafness patients between 18 and 65 years old, with a course of less than two weeks and without any prior medical treatment, were collected and then divided into four types according to the hearing curve: type A, acute sensorineural hearing loss in low tone frequencies; type B, acute sensorineural hearing loss in high tone frequencies; type C, acute sensorineural hearing loss in all frequencies; and type D, total deafness. The factors age, gender, type of initial audiogram, time delay before the first visit, and severity of hearing loss were included in the analyses. A total of 1 024 cases with single-side sudden deafness were collected in the study from 33 hospitals in China from August 2007 to October 2011, including 492 males (48.05%) and 532 females (51.95%). The average age was (41.2 ± 12.8) years old. There were 553 cases (54.00%) in the left ear and 471 cases (46.00%) in the right ear. The curative effects of the different types were as follows: the type in low tone frequencies had the highest effective rate, 90.73%; the type in all frequencies, 82.59%; the type of total deafness, 70.29%; and the type in high tone frequencies had the lowest rate, 65.96%. The difference in effective rate between types was significant (χ² = 231.58, P = 0.000). Age, time delay before first visit, and severity of initial hearing loss were significantly correlated with hearing improvement. The initial audiogram of SSNHL might predict hearing recovery. Young age and a short time delay before starting treatment are positive prognostic factors for hearing recovery in SSNHL. The initial severity of hearing loss is a negative prognostic factor for hearing recovery.

  18. A Compilation of MATLAB Scripts and Functions for MACGMC Analyses

    Science.gov (United States)

    Murthy, Pappu L. N.; Bednarcyk, Brett A.; Mital, Subodh K.

    2017-01-01

The primary aim of the current effort is to provide scripts that automate many of the repetitive pre- and post-processing tasks associated with composite materials analyses using the Micromechanics Analysis Code with the Generalized Method of Cells (MACGMC). This document consists of a compilation of hundreds of scripts that were developed in the MATLAB (The MathWorks, Inc., Natick, MA) programming language and consolidated into 16 MATLAB functions. MACGMC is a composite material and laminate analysis software code developed at NASA Glenn Research Center. The software package has been built around the generalized method of cells (GMC) family of micromechanics theories. The computer code is developed with a user-friendly framework, along with a library of local inelastic, damage, and failure models. Further, application of simulated thermo-mechanical loading, generation of output results, and selection of architectures to represent the composite material have been automated to increase the user friendliness, as well as to make it more robust in terms of input preparation and code execution. Finally, classical lamination theory has been implemented within the software, wherein GMC is used to model the composite material response of each ply. Thus, the full range of GMC composite material capabilities is available for analysis of arbitrary laminate configurations as well. The pre-processing tasks include generation of a multitude of different repeating unit cells (RUCs) for CMCs and PMCs, visualization of RUCs from MACGMC input and output files and generation of the RUC section of a MACGMC input file. The post-processing tasks include visualization of the predicted composite response, such as local stress and strain contours, damage initiation and progression, stress-strain behavior, and fatigue response. In addition to the above, several miscellaneous scripts have been developed that can be used to perform repeated Monte-Carlo simulations to enable probabilistic

  19. Multi-state models: metapopulation and life history analyses

    Directory of Open Access Journals (Sweden)

    Arnason, A. N.

    2004-06-01

    Full Text Available Multi-state models are designed to describe populations that move among a fixed set of categorical states. The obvious application is to population interchange among geographic locations such as breeding sites or feeding areas (e.g., Hestbeck et al., 1991; Blums et al., 2003; Cam et al., 2004), but they are increasingly used to address important questions of evolutionary biology and life history strategies (Nichols & Kendall, 1995). In these applications, the states include life history stages such as breeding states. The multi-state models, by permitting estimation of stage-specific survival and transition rates, can help assess trade-offs between life history mechanisms (e.g., Yoccoz et al., 2000). These trade-offs are also important in meta-population analyses where, for example, the pre- and post-breeding rates of transfer among sub-populations can be analysed in terms of target colony distance, density, and other covariates (e.g., Lebreton et al., 2003; Breton et al., in review). Further examples of the use of multi-state models in analysing dispersal and life-history trade-offs can be found in the session on Migration and Dispersal. In this session, we concentrate on applications that did not involve dispersal. These applications fall into two main categories: those that address life history questions using stage categories, and a more technical use of multi-state models to address problems arising from the violation of mark-recapture assumptions leading to the potential for seriously biased predictions or misleading insights from the models. Our plenary paper, by William Kendall (Kendall, 2004), gives an overview of the use of Multi-state Mark-Recapture (MSMR) models to address two such violations. The first is the occurrence of unobservable states that can arise, for example, from temporary emigration or by incomplete sampling coverage of a target population. Such states can also occur for life history reasons, such
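The mechanics behind such models can be illustrated with a minimal projection: stage-specific survival and transition probabilities move individuals among a fixed set of states. All rates below are hypothetical, chosen only to show the bookkeeping, not estimated from any data set.

```python
def project(state_counts, survival, transition, years):
    """Project state occupancy forward under a multi-state model.

    state_counts: dict state -> count
    survival: dict state -> annual survival probability
    transition: dict (from_state, to_state) -> probability; rows sum to 1
    """
    states = list(state_counts)
    counts = dict(state_counts)
    for _ in range(years):
        new = {s: 0.0 for s in states}
        for s in states:
            survivors = counts[s] * survival[s]  # mortality acts first
            for t in states:
                new[t] += survivors * transition[(s, t)]
        counts = new
    return counts

# Two hypothetical life-history stages: non-breeder (N) and breeder (B).
survival = {"N": 0.8, "B": 0.9}
transition = {("N", "N"): 0.6, ("N", "B"): 0.4,
              ("B", "N"): 0.1, ("B", "B"): 0.9}
after = project({"N": 1000.0, "B": 500.0}, survival, transition, 1)
# After one year: 800 surviving non-breeders split 60/40,
# 450 surviving breeders split 10/90.
```

MSMR estimation inverts this logic, inferring the survival and transition rates from capture histories rather than assuming them.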

  20. Predictive maintenance primer

    International Nuclear Information System (INIS)

    Flude, J.W.; Nicholas, J.R.

    1991-04-01

    This Predictive Maintenance Primer provides utility plant personnel with a single-source reference to predictive maintenance analysis methods and technologies used successfully by utilities and other industries. It is intended to be a ready reference to personnel considering starting, expanding or improving a predictive maintenance program. This Primer includes a discussion of various analysis methods and how they overlap and interrelate. Additionally, eighteen predictive maintenance technologies are discussed in sufficient detail for the user to evaluate the potential of each technology for specific applications. This document is designed to allow inclusion of additional technologies in the future. To gather the information necessary to create this initial Primer the Nuclear Maintenance Applications Center (NMAC) collected experience data from eighteen utilities plus other industry and government sources. NMAC also contacted equipment manufacturers for information pertaining to equipment utilization, maintenance, and technical specifications. The Primer includes a discussion of six methods used by analysts to study predictive maintenance data. These are: trend analysis; pattern recognition; correlation; test against limits or ranges; relative comparison data; and statistical process analysis. Following the analysis methods discussions are detailed descriptions for eighteen technologies analysts have found useful for predictive maintenance programs at power plants and other industrial facilities. Each technology subchapter has a description of the operating principles involved in the technology, a listing of plant equipment where the technology can be applied, and a general description of the monitoring equipment. Additionally, these descriptions include a discussion of results obtained from actual equipment users and preferred analysis techniques to be used on data obtained from the technology. 5 refs., 30 figs
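One of the six analysis methods listed above, trend analysis, can be sketched as fitting a least-squares line to periodic condition readings and extrapolating to an alarm limit. The readings and the limit below are hypothetical, purely for illustration.

```python
def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

def time_to_limit(xs, ys, limit):
    """Extrapolate the fitted trend to the alarm limit; None if no upward trend."""
    slope, intercept = fit_line(xs, ys)
    if slope <= 0:
        return None
    return (limit - intercept) / slope

# Hypothetical monthly vibration amplitude readings (mils):
months = [0, 1, 2, 3, 4]
vibration = [1.0, 1.2, 1.4, 1.6, 1.8]
crossing = time_to_limit(months, vibration, limit=3.0)
# trend of 0.2 mils/month from 1.0 projects a crossing at month 10
```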

  1. Zephyr - the prediction models

    DEFF Research Database (Denmark)

    Nielsen, Torben Skov; Madsen, Henrik; Nielsen, Henrik Aalborg

    2001-01-01

    utilities as partners and users. The new models are evaluated for five wind farms in Denmark as well as one wind farm in Spain. It is shown that the predictions based on conditional parametric models are superior to the predictions obtained by state-of-the-art parametric models.......This paper briefly describes new models and methods for predicting the wind power output from wind farms. The system is being developed in a project which has the research organization Risø and the department of Informatics and Mathematical Modelling (IMM) as the modelling team and all the Danish...

  2. Towards Predictive Association Theories

    DEFF Research Database (Denmark)

    Kontogeorgis, Georgios; Tsivintzelis, Ioannis; Michelsen, Michael Locht

    2011-01-01

    . We use the term predictive in two situations: (i) with no use of binary interaction parameters, and (ii) multicomponent calculations using binary interaction parameters based solely on binary data. It is shown that the CPA equation of state can satisfactorily predict CO2–water–glycols–alkanes VLE...... and water–MEG–aliphatic hydrocarbons LLE using interaction parameters obtained from the binary data alone. Moreover, it is demonstrated that the NRHB equation of state is a versatile tool which can be employed equally well to mixtures with pharmaceuticals and solvents, including mixed solvents, as well...

  3. Basis of predictive mycology.

    Science.gov (United States)

    Dantigny, Philippe; Guilmart, Audrey; Bensoussan, Maurice

    2005-04-15

    For over 20 years, predictive microbiology focused on food-pathogenic bacteria. Few studies concerned modelling fungal development. On one hand, most food mycologists are not familiar with modelling techniques; on the other hand, people involved in modelling are developing tools dedicated to bacteria. Therefore, there is a tendency to extend the use of models that were developed for bacteria to moulds. However, some mould specificities should be taken into account. The use of specific models for predicting germination and growth of fungi was advocated previously []. This paper provides a short review of fungal modelling studies.

  4. Linguistic Structure Prediction

    CERN Document Server

    Smith, Noah A

    2011-01-01

    A major part of natural language processing now depends on the use of text data to build linguistic analyzers. We consider statistical, computational approaches to modeling linguistic structure. We seek to unify across many approaches and many kinds of linguistic structures. Assuming a basic understanding of natural language processing and/or machine learning, we seek to bridge the gap between the two fields. Approaches to decoding (i.e., carrying out linguistic structure prediction) and supervised and unsupervised learning of models that predict discrete structures as outputs are the focus. W

  5. Can we predict shoulder dystocia?

    Science.gov (United States)

    Revicky, Vladimir; Mukhopadhyay, Sambit; Morris, Edward P; Nieto, Jose J

    2012-02-01

    To analyse the significance of risk factors and the possibility of prediction of shoulder dystocia. This was a retrospective cohort study. There were 9,767 vaginal deliveries at 37 or more weeks of gestation analysed during 2005-2007. The studied population included 234 deliveries complicated by shoulder dystocia. Shoulder dystocia was defined as a delivery that required additional obstetric manoeuvres to release the shoulders after gentle downward traction had failed. First, a univariate analysis was done to identify the factors that had a significant association with shoulder dystocia. Parity, age, gestation, induction of labour, epidural analgesia, birth weight, duration of second stage of labour and mode of delivery were the studied factors. All factors were then combined in a multivariate logistic regression analysis. Adjusted odds ratios (Adj. OR) with 95% confidence intervals (CI) were calculated. The incidence of shoulder dystocia was 2.4% (234/9,767). Only mode of delivery and birth weight were independent risk factors for shoulder dystocia. Parity, age, gestation, induction of labour, epidural analgesia and duration of second stage of labour were not independent risk factors. Ventouse delivery increases the risk of shoulder dystocia almost 3 times; forceps delivery, compared to ventouse delivery, increases the risk almost 3.4 times. The risk of shoulder dystocia is minimal with a birth weight of 3,000 g or less. It is difficult to foretell the exact birth weight and the mode of delivery; therefore, the occurrence of shoulder dystocia is highly unpredictable. Regular drills for shoulder dystocia and awareness of the increased incidence with instrumental deliveries are important to reduce fetal and maternal morbidity and mortality.
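The adjusted odds ratios and confidence intervals reported in such studies come from exponentiating logistic-regression coefficients. A minimal sketch of that step, with a hypothetical coefficient and standard error (not values from this study):

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Return (OR, lower, upper 95% CI) for a logistic-regression coefficient."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# A coefficient of ln(3) would correspond to the roughly threefold risk
# increase reported for ventouse delivery; the SE of 0.2 is hypothetical.
or_, lo, hi = odds_ratio_ci(math.log(3.0), se=0.2)
```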

  6. The STD/MHD codes - Comparison of analyses with experiments at AEDC/HPDE, Reynolds Metal Co., and Hercules, Inc. [for MHD generator flows

    Science.gov (United States)

    Vetter, A. A.; Maxwell, C. D.; Swean, T. F., Jr.; Demetriades, S. T.; Oliver, D. A.; Bangerter, C. D.

    1981-01-01

    Data from sufficiently well-instrumented, short-duration experiments at AEDC/HPDE, Reynolds Metal Co., and Hercules, Inc., are compared to analyses with multidimensional and time-dependent simulations with the STD/MHD computer codes. These analyses reveal detailed features of major transient events, severe loss mechanisms, and anomalous MHD behavior. In particular, these analyses predicted higher-than-design voltage drops, Hall voltage overshoots, and asymmetric voltage drops before the experimental data were available. The predictions obtained with these analyses are in excellent agreement with the experimental data and the failure predictions are consistent with the experiments. The design of large, high-interaction or advanced MHD experiments will require application of sophisticated, detailed and comprehensive computational procedures in order to account for the critical mechanisms which led to the observed behavior in these experiments.

  7. Analyses Of High Voltage Transmission Cables In South Western ...

    African Journals Online (AJOL)

    ... unit diameter) of the analysed underground cable is good compared with the standard conductor, the overhead cable has a poor load bearing capability. SEM and EDX analyses show that both the underground and the overhead cables contain impurities that are deleterious to the structure and their functional properties.

  8. Houdbaarheid en conservering van grondwatermonsters voor anorganische analyses

    NARCIS (Netherlands)

    Cleven RFMJ; Gast LFL; Boshuis-Hilverdink ME; LAC

    1995-01-01

    The storage life and the possibilities for preservation of inorganic analyses of groundwater samples have been investigated. Groundwater samples, with and without preservation with acid, from four locations in the Netherlands have been analysed ten times over a period of three months on six

  9. Uranium price trends for use in strategy analyses

    International Nuclear Information System (INIS)

    James, R.A.

    1979-09-01

    Long-term price forecasts for mined uranium are quoted. These will be used in Ontario Hydro's nuclear fuel cycle strategy analyses. They are, of necessity, speculative. The accuracy of the forecasts is considered adequate for long-term strategy analyses, but not for other purposes. (auth)

  10. Descriptive Analyses of Pediatric Food Refusal and Acceptance

    Science.gov (United States)

    Borrero, Carrie S. W.; Woods, Julia N.; Borrero, John C.; Masler, Elizabeth A.; Lesser, Aaron D.

    2010-01-01

    Functional analyses of inappropriate mealtime behavior typically include conditions to determine if the contingent delivery of attention, tangible items, or escape reinforce food refusal. In the current investigation, descriptive analyses were conducted for 25 children who had been admitted to a program for the assessment and treatment of food…

  11. Basic assumptions in statistical analyses of data in biomedical ...

    African Journals Online (AJOL)

    If one or more assumptions are violated, an alternative procedure must be used to obtain valid results. This article aims at highlighting some basic assumptions in statistical analyses of data in biomedical sciences. Keywords: samples, independence, non-parametric, parametric, statistical analyses. Int. J. Biol. Chem. Sci. Vol.

  12. Training Residential Staff to Conduct Trial-Based Functional Analyses

    Science.gov (United States)

    Lambert, Joseph M.; Bloom, Sarah E.; Kunnavatana, S. Shanun; Collins, Shawnee D.; Clay, Casey J.

    2013-01-01

    We taught 6 supervisors of a residential service provider for adults with developmental disabilities to train 9 house managers to conduct trial-based functional analyses. Effects of the training were evaluated with a nonconcurrent multiple baseline. Results suggest that house managers can be trained to conduct trial-based functional analyses with…

  13. The EADGENE and SABRE post-analyses workshop

    DEFF Research Database (Denmark)

    Jaffrezic, Florence; Hedegaard, Jakob; Sancristobal, Magali

    2009-01-01

    on the statistical analyses of a microarray experiment (i.e. getting a gene list), the subsequent analysis of the gene list is still an area of much confusion to many scientists. During a three-day workshop in November 2008, we discussed five aspects of these so-called post analyses of microarray data: 1) re...

  14. Discrete frequency identification using the HP 5451B Fourier analyser

    International Nuclear Information System (INIS)

    Holland, L.; Barry, P.

    1977-01-01

    The frequency analysis by the HP5451B discrete frequency Fourier analyser is studied. The advantages of cross correlation analysis to identify discrete frequencies in a background noise are discussed in conjunction with the elimination of aliasing and wraparound error. Discrete frequency identification is illustrated by a series of graphs giving the results of analysing 'electrical' and 'acoustical' white noise and sinusoidal signals

  15. Finite strain analyses of deformations in polymer specimens

    DEFF Research Database (Denmark)

    Tvergaard, Viggo

    2016-01-01

    Analyses of the stress and strain state in test specimens or structural components made of polymer are discussed. This includes the Izod impact test, based on full 3D transient analyses. Also a long thin polymer tube under internal pressure has been studied, where instabilities develop, such as b...

  16. 76 FR 63913 - Commercial Building Workforce Job/Task Analyses

    Science.gov (United States)

    2011-10-14

    ... of persons in the commercial building space. DATES: Comments on the six Job/Task Analyses must be.../Task Analyses. 2. Support increased workforce mobility up career ladders and across career lattices by... held. To help DOE better understand at which point in the commenter's career certain skills are...

  17. 36 CFR 228.102 - Leasing analyses and decisions.

    Science.gov (United States)

    2010-07-01

    ... 36 Parks, Forests, and Public Property 2 2010-07-01 2010-07-01 false Leasing analyses and... AGRICULTURE MINERALS Oil and Gas Resources Leasing § 228.102 Leasing analyses and decisions. (a) Compliance with the National Environmental Policy Act of 1969. In analyzing lands for leasing, the authorized...

  18. Aftaler om arbejdsmiljø - en analyse af udvalgte overenskomster

    DEFF Research Database (Denmark)

    Petersen, Jens Voxtrup; Wiegmann, Inger-Marie; Vogt-Nielsen, Karl

    An analysis of the significance of collective agreements for the working environment within industry, slaughterhouses, cleaning, the green sector, hotels and restaurants, and bus transport.

  19. Predicting Marital Therapy Dropouts.

    Science.gov (United States)

    Allgood, Scot M.; Crane, D. Russell

    1991-01-01

    Attempted to predict therapy dropouts using data gathered at marital therapy intake with 474 couples seeking marital therapy who attended at least 1 session. Significant predictors of dropping out included having fewer than two children, having a male intake clinician, and a presenting problem relating to only one spouse. (Author/ABL)

  20. Predicting Sustainable Work Behavior

    DEFF Research Database (Denmark)

    Hald, Kim Sundtoft

    2013-01-01

    Sustainable work behavior is an important issue for operations managers – it has implications for most outcomes of OM. This research explores the antecedents of sustainable work behavior. It revisits and extends the sociotechnical model developed by Brown et al. (2000) on predicting safe behavior...... condition influence their sustainable work behavior. A new definition of sustainable work behavior is proposed....

  1. Vertebral Fracture Prediction

    DEFF Research Database (Denmark)

    2008-01-01

    Vertebral Fracture Prediction A method of processing data derived from an image of at least part of a spine is provided for estimating the risk of a future fracture in vertebrae of the spine. Position data relating to at least four neighbouring vertebrae of the spine is processed. The curvature

  2. Predicting Intrinsic Motivation

    Science.gov (United States)

    Martens, Rob; Kirschner, Paul A.

    2004-01-01

    Intrinsic motivation can be predicted from participants' perceptions of the social environment and the task environment (Ryan & Deci, 2000) in terms of control, relatedness and competence. To determine the degree of independence of these factors 251 students in higher vocational education (physiotherapy and hotel management) indicated the…

  3. Gate valve performance prediction

    International Nuclear Information System (INIS)

    Harrison, D.H.; Damerell, P.S.; Wang, J.K.; Kalsi, M.S.; Wolfe, K.J.

    1994-01-01

    The Electric Power Research Institute is carrying out a program to improve the performance prediction methods for motor-operated valves. As part of this program, an analytical method to predict the stem thrust required to stroke a gate valve has been developed and has been assessed against data from gate valve tests. The method accounts for the loads applied to the disc by fluid flow and for the detailed mechanical interaction of the stem, disc, guides, and seats. To support development of the method, two separate-effects test programs were carried out. One test program determined friction coefficients for contacts between gate valve parts by using material specimens in controlled environments. The other test program investigated the interaction of the stem, disc, guides, and seat using a special fixture with full-sized gate valve parts. The method has been assessed against flow-loop and in-plant test data. These tests include valve sizes from 3 to 18 in. and cover a considerable range of flow, temperature, and differential pressure. Stem thrust predictions for the method bound measured results. In some cases, the bounding predictions are substantially higher than the stem loads required for valve operation, as a result of the bounding nature of the friction coefficients in the method
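The kind of stem-thrust estimate that such methods formalize can be sketched with a generic textbook simplification: packing drag, stem-rejection load, and disc friction under differential pressure. This is not the EPRI method itself, and all coefficients and dimensions below are hypothetical.

```python
def stem_thrust(dp, disc_area, stem_area, line_pressure,
                friction_coeff, packing_load):
    """Simplified closing-thrust estimate (lbf) for a gate valve.

    dp: differential pressure across the disc (psi)
    disc_area: mean disc area (in^2)
    stem_area: stem cross-section at the packing (in^2)
    """
    disc_friction = friction_coeff * dp * disc_area   # disc-to-seat sliding load
    stem_rejection = line_pressure * stem_area        # pressure pushing the stem out
    return packing_load + stem_rejection + disc_friction

# Hypothetical 6-in. valve at 1000 psi differential:
thrust = stem_thrust(dp=1000.0, disc_area=28.0, stem_area=3.0,
                     line_pressure=1200.0, friction_coeff=0.4,
                     packing_load=1000.0)
# 1000 + 1200*3 + 0.4*1000*28 = 15800 lbf
```

The EPRI method described in the record goes beyond this by modeling the detailed stem/disc/guide/seat interaction and using experimentally bounded friction coefficients.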

  4. Predicting Pilot Retention

    Science.gov (United States)

    2012-06-15

    over the last 20 years. Airbus predicted that these trends would continue as emerging economies, especially in Asia, were creating a fast growing...US economy, pay differential and hiring by the major airlines contributed most to the decision to separate from the Air Force (Fullerton, 2003: 354

  5. Predicting coronary heart disease

    DEFF Research Database (Denmark)

    Sillesen, Henrik; Fuster, Valentin

    2012-01-01

    Atherosclerosis is the leading cause of death and disabling disease. Whereas risk factors are well known and constitute therapeutic targets, they are not useful for prediction of risk of future myocardial infarction, stroke, or death. Therefore, methods to identify atherosclerosis itself have bee...

  6. Prediction method abstracts

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-12-31

    This conference was held December 4--8, 1994 in Asilomar, California. The purpose of this meeting was to provide a forum for exchange of state-of-the-art information concerning the prediction of protein structure. Attention is focused on the following: comparative modeling; sequence to fold assignment; and ab initio folding.

  7. Predicting visibility of aircraft.

    Science.gov (United States)

    Watson, Andrew; Ramirez, Cesar V; Salud, Ellen

    2009-05-20

    Visual detection of aircraft by human observers is an important element of aviation safety. To assess and ensure safety, it would be useful to be able to predict the visibility, to a human observer, of an aircraft of specified size, shape, distance, and coloration. Examples include assuring safe separation among aircraft and between aircraft and unmanned vehicles, design of airport control towers, and efforts to enhance or suppress the visibility of military and rescue vehicles. We have recently developed a simple metric of pattern visibility, the Spatial Standard Observer (SSO). In this report we examine whether the SSO can predict visibility of simulated aircraft images. We constructed a set of aircraft images from three-dimensional computer graphic models, and measured the luminance contrast threshold for each image from three human observers. The data were well predicted by the SSO. Finally, we show how to use the SSO to predict visibility range for aircraft of arbitrary size, shape, distance, and coloration.

  8. Predicting ideological prejudice

    NARCIS (Netherlands)

    Brandt, M.J.

    2018-01-01

    A major shortcoming of current models of ideological prejudice is that although they can anticipate the direction of the association between participants’ ideology and their prejudice against a range of target groups, they cannot predict the size of this association. I developed and tested models

  9. Predicting Reasoning from Memory

    Science.gov (United States)

    Heit, Evan; Hayes, Brett K.

    2011-01-01

    In an effort to assess the relations between reasoning and memory, in 8 experiments, the authors examined how well responses on an inductive reasoning task are predicted from responses on a recognition memory task for the same picture stimuli. Across several experimental manipulations, such as varying study time, presentation frequency, and the…

  10. Steering smog prediction

    NARCIS (Netherlands)

    R. van Liere (Robert); J.J. van Wijk (Jack)

    1997-01-01

    The use of computational steering for smog prediction is described. This application is representative of many underlying issues found in steering high performance applications: high computing times, large data sets, and many different input parameters. After a short description of the

  11. Predicting Lotto Numbers

    NARCIS (Netherlands)

    Jorgensen, C.B.; Suetens, S.; Tyran, J.R.

    2011-01-01

    We investigate the "law of small numbers" using a unique panel data set on lotto gambling. Because we can track individual players over time, we can measure how they react to outcomes of recent lotto drawings. We can therefore test whether they behave as if they believe they can predict lotto

  12. Genomic prediction of complex human traits: relatedness, trait architecture and predictive meta-models

    Science.gov (United States)

    Spiliopoulou, Athina; Nagy, Reka; Bermingham, Mairead L.; Huffman, Jennifer E.; Hayward, Caroline; Vitart, Veronique; Rudan, Igor; Campbell, Harry; Wright, Alan F.; Wilson, James F.; Pong-Wong, Ricardo; Agakov, Felix; Navarro, Pau; Haley, Chris S.

    2015-01-01

    We explore the prediction of individuals' phenotypes for complex traits using genomic data. We compare several widely used prediction models, including Ridge Regression, LASSO and Elastic Nets estimated from cohort data, and polygenic risk scores constructed using published summary statistics from genome-wide association meta-analyses (GWAMA). We evaluate the interplay between relatedness, trait architecture and optimal marker density, by predicting height, body mass index (BMI) and high-density lipoprotein level (HDL) in two data cohorts, originating from Croatia and Scotland. We empirically demonstrate that dense models are better when all genetic effects are small (height and BMI) and target individuals are related to the training samples, while sparse models predict better in unrelated individuals and when some effects have moderate size (HDL). For HDL sparse models achieved good across-cohort prediction, performing similarly to the GWAMA risk score and to models trained within the same cohort, which indicates that, for predicting traits with moderately sized effects, large sample sizes and familial structure become less important, though still potentially useful. Finally, we propose a novel ensemble of whole-genome predictors with GWAMA risk scores and demonstrate that the resulting meta-model achieves higher prediction accuracy than either model on its own. We conclude that although current genomic predictors are not accurate enough for diagnostic purposes, performance can be improved without requiring access to large-scale individual-level data. Our methodologically simple meta-model is a means of performing predictive meta-analysis for optimizing genomic predictions and can be easily extended to incorporate multiple population-level summary statistics or other domain knowledge. PMID:25918167
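The ensemble idea at the end of the record, combining a whole-genome predictor with a GWAMA risk score via a fitted meta-model, can be sketched as ordinary least squares on two input predictions (solved here with 2x2 normal equations). All data below are synthetic; this is a conceptual sketch, not the authors' pipeline.

```python
def fit_meta_weights(pred_a, pred_b, phenotype):
    """Least-squares weights (w_a, w_b) minimizing ||w_a*a + w_b*b - y||^2."""
    aa = sum(a * a for a in pred_a)
    bb = sum(b * b for b in pred_b)
    ab = sum(a * b for a, b in zip(pred_a, pred_b))
    ay = sum(a * y for a, y in zip(pred_a, phenotype))
    by = sum(b * y for b, y in zip(pred_b, phenotype))
    det = aa * bb - ab * ab  # nonzero when the two predictors are independent
    return ((ay * bb - by * ab) / det, (by * aa - ay * ab) / det)

# Synthetic check: phenotype constructed as 0.7*a + 0.3*b, so the fit
# should recover those weights exactly.
a = [1.0, 2.0, 3.0, 4.0]     # e.g. whole-genome predictor output
b = [4.0, 3.0, 2.0, 1.0]     # e.g. GWAMA polygenic risk score
y = [0.7 * ai + 0.3 * bi for ai, bi in zip(a, b)]
w_a, w_b = fit_meta_weights(a, b, y)
```

In practice the weights would be fitted on a held-out tuning set to avoid overfitting the ensemble to the training cohort.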

  13. Joint analyses model for total cholesterol and triglyceride in human serum with near-infrared spectroscopy

    Science.gov (United States)

    Yao, Lijun; Lyu, Ning; Chen, Jiemei; Pan, Tao; Yu, Jing

    2016-04-01

    The development of a small, dedicated near-infrared (NIR) spectrometer has promising potential applications, such as for joint analyses of total cholesterol (TC) and triglyceride (TG) in human serum for preventing and treating hyperlipidemia of a large population. The appropriate wavelength selection is a key technology for developing such a spectrometer. For this reason, a novel wavelength selection method, named the equidistant combination partial least squares (EC-PLS), was applied to the wavelength selection for the NIR analyses of TC and TG in human serum. A rigorous process based on the various divisions of calibration and prediction sets was performed to achieve modeling optimization with stability. By applying EC-PLS, a model set was developed, which consists of various models that were equivalent to the optimal model. The joint analyses model of the two indicators was further selected with only 50 wavelengths. The random validation samples excluded from the modeling process were used to validate the selected model. The root-mean-square errors, correlation coefficients and ratio of performance to deviation for the prediction were 0.197 mmol L-1, 0.985 and 5.6 for TC, and 0.101 mmol L-1, 0.992 and 8.0 for TG, respectively. The sensitivity and specificity for hyperlipidemia were 96.2% and 98.0%. These findings indicate high prediction accuracy and low model complexity. The proposed wavelength selection provided valuable references for the design of a small, dedicated spectrometer for hyperlipidemia. The methodological framework and optimization algorithm are universal, such that they can be applied to other fields.
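The three figures of merit quoted above are standard chemometric quantities: root-mean-square error of prediction (RMSEP), the Pearson correlation coefficient, and the ratio of performance to deviation (RPD = standard deviation of the reference values / RMSEP). A sketch with synthetic predicted-vs-reference values:

```python
import math

def rmse(pred, ref):
    """Root-mean-square error of prediction."""
    return math.sqrt(sum((p - r) ** 2 for p, r in zip(pred, ref)) / len(ref))

def pearson_r(pred, ref):
    """Pearson correlation coefficient between predictions and references."""
    n = len(ref)
    mp, mr = sum(pred) / n, sum(ref) / n
    cov = sum((p - mp) * (r - mr) for p, r in zip(pred, ref))
    sp = math.sqrt(sum((p - mp) ** 2 for p in pred))
    sr = math.sqrt(sum((r - mr) ** 2 for r in ref))
    return cov / (sp * sr)

def rpd(pred, ref):
    """Ratio of performance to deviation: SD(reference) / RMSEP."""
    n = len(ref)
    mr = sum(ref) / n
    sd = math.sqrt(sum((r - mr) ** 2 for r in ref) / (n - 1))
    return sd / rmse(pred, ref)

# Synthetic validation set (mmol/L); rmse(pred, ref) is 0.1 here.
ref = [4.0, 5.0, 6.0, 7.0]
pred = [4.1, 4.9, 6.1, 6.9]
```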

  14. Measurement of the analysing power in proton–proton elastic scattering at small angles

    Directory of Open Access Journals (Sweden)

    Z. Bagdasarian

    2014-12-01

    Full Text Available The proton analysing power in p→p elastic scattering has been measured at small angles at COSY-ANKE at 796 MeV and five other beam energies between 1.6 and 2.4 GeV using a polarised proton beam. The asymmetries obtained by detecting the fast proton in the ANKE forward detector or the slow recoil proton in a silicon tracking telescope are completely consistent. Although the analysing power results agree well with the many published data at 796 MeV, and also with the most recent partial wave solution at this energy, the ANKE data at the higher energies lie well above the predictions of this solution at small angles. An updated phase shift analysis that uses the ANKE results together with the World data leads to a much better description of these new measurements.
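How an analysing power is extracted from counting data can be shown in one line of algebra: with a beam of polarisation P, the left-right asymmetry is epsilon = (N_L - N_R) / (N_L + N_R) and A_y = epsilon / P. This is a generic textbook sketch, not the ANKE analysis chain, and the counts and polarisation below are hypothetical.

```python
def analysing_power(n_left, n_right, polarisation):
    """Analysing power from left/right scattering counts and beam polarisation."""
    epsilon = (n_left - n_right) / (n_left + n_right)
    return epsilon / polarisation

# Hypothetical counts: 6000 left, 4000 right, with 50% beam polarisation.
ay = analysing_power(n_left=6000, n_right=4000, polarisation=0.5)
# epsilon = 0.2, so A_y = 0.4
```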

  15. Loss of Normal Feedwater analyses for Krsko Full Scope Simulator verification

    International Nuclear Information System (INIS)

    Parzer, I.; Prosek, A.; Hrvatin, S.

    2000-01-01

    The purpose of these analyses was to perform calculations of a Loss of Normal (Main) Feedwater transient for Krsko NPP. The results of calculations were used for the verification of reactor coolant system thermal-hydraulic response predicted by Krsko Full Scope Simulator. To perform the thermal-hydraulic analyses, the RELAP5/MOD2 computer code and the NPP Krsko input card deck were used. In the presented paper two scenarios have been analyzed. Both of them started with a loss of normal feedwater event. Thus, a reduction or an interruption of the heat removal by the secondary system occurred. The first scenario assumed that auxiliary feedwater was available during the transient, while in the second scenario both normal and auxiliary feedwater were unavailable. The results showed that with auxiliary feedwater pumps unavailable additional operator actions would be needed to prevent overheating of the core. (author)

  16. Engineering design and exergy analyses for combustion gas turbine based power generation system

    International Nuclear Information System (INIS)

    Sue, D.-C.; Chuang, C.-C.

    2004-01-01

    This paper presents the engineering design and theoretical exergetic analyses of the plant for combustion gas turbine based power generation systems. Exergy analysis is performed based on the first and second laws of thermodynamics for power generation systems. The results show that the exergy analyses for a steam cycle system predict the plant efficiency more precisely. The plant efficiency for partial load operation is lower than for full load operation. Increasing the pinch points will decrease the combined cycle plant efficiency. The engineering design is based on inlet air-cooling and natural gas preheating for increasing the net power output and efficiency. To evaluate the energy utilization, one combined cycle unit and one cogeneration system, consisting of gas turbine generators, heat recovery steam generators, and one steam turbine generator with steam extracted for process use, have been analyzed. The analytical results are used for engineering design and component selection
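Exergy analyses of this kind rest on the standard specific flow-exergy relation, ex = (h - h0) - T0 * (s - s0), evaluated against dead-state conditions (T0, h0, s0). A minimal numeric sketch with hypothetical steam-cycle state values:

```python
def flow_exergy(h, s, h0, s0, t0):
    """Specific flow exergy (kJ/kg); h in kJ/kg, s in kJ/(kg K), t0 in K."""
    return (h - h0) - t0 * (s - s0)

# Hypothetical turbine-inlet state evaluated against a 298.15 K dead state:
ex = flow_exergy(h=3300.0, s=6.5, h0=104.9, s0=0.367, t0=298.15)
# roughly 1366.5 kJ/kg of the 3195.1 kJ/kg enthalpy difference is
# available as work; the rest is the entropy-weighted unavailable part
```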

  17. Preliminary analyses on hydrogen diffusion through small break of thermo-chemical IS process hydrogen plant

    International Nuclear Information System (INIS)

    Somolova, Marketa; Terada, Atsuhiko; Takegami, Hiroaki; Iwatsuki, Jin

    2008-12-01

    Japan Atomic Energy Agency has been conducting a conceptual design study of a nuclear hydrogen demonstration plant, that is, a thermo-chemical IS process hydrogen plant coupled with the High Temperature Engineering Test Reactor (HTTR-IS), which is planned to produce a large amount of hydrogen, up to 1000 m3/h. As part of the conceptual design work of the HTTR-IS system, preliminary analyses on a small break of a hydrogen pipeline in the IS process hydrogen plant were carried out as a first step of the safety analyses. This report presents analytical results of hydrogen diffusion behaviors predicted with a CFD code, in which a diffusion model focused on the turbulent Schmidt number was incorporated. By modifying the diffusion model, especially a constant accompanying the turbulent Schmidt number in the diffusion term, the analytical results were made to agree well with the experimental results. (author)
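The tuning described above centres on the turbulent Schmidt number Sc_t, which in standard eddy-viscosity modelling sets the turbulent mass diffusivity as D_t = nu_t / Sc_t. A minimal numeric sketch with hypothetical values (not those of the JAEA analysis):

```python
def turbulent_diffusivity(eddy_viscosity, schmidt_t):
    """Turbulent mass diffusivity D_t = nu_t / Sc_t (m^2/s)."""
    return eddy_viscosity / schmidt_t

nu_t = 1.0e-3  # m^2/s, hypothetical eddy viscosity from the turbulence model
d_default = turbulent_diffusivity(nu_t, schmidt_t=0.7)
d_tuned = turbulent_diffusivity(nu_t, schmidt_t=0.35)
# halving Sc_t doubles the modelled turbulent diffusion of hydrogen,
# which is the lever used to match the experimental concentration fields
```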

  18. Flutter and Forced Response Analyses of Cascades using a Two-Dimensional Linearized Euler Solver

    Science.gov (United States)

    Reddy, T. S. R.; Srivastava, R.; Mehmed, O.

    1999-01-01

    Flutter and forced response analyses for a cascade of blades in subsonic and transonic flow are presented. The structural model for each blade is a typical section with bending and torsion degrees of freedom. The unsteady aerodynamic forces due to bending and torsion motions, and due to a vortical gust disturbance, are obtained by solving the unsteady linearized Euler equations, which are derived by linearizing the unsteady nonlinear equations about the steady flow. The predicted unsteady aerodynamic forces include the effect of steady aerodynamic loading due to airfoil shape, thickness, and angle of attack. The aeroelastic equations are solved in the frequency domain by coupling the unsteady aerodynamic forces to the aeroelastic solver MISER. The present unsteady aerodynamic solver shows good correlation with published results for both flutter and forced response predictions. Further improvements are required before the unsteady aerodynamic solver can be used in a design cycle.
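    The structural side of such an analysis — a typical section with bending (plunge) and torsion (pitch) degrees of freedom — can be sketched as a 2×2 eigenproblem det(K − ω²M) = 0, solved here in closed form. The parameter values are illustrative assumptions, not the paper's blade properties, and the aerodynamic coupling to the linearized Euler forces is omitted.

    ```python
    import math

    # Coupled natural frequencies of a typical-section blade model with
    # plunge (bending) and pitch (torsion) DOFs. M = [[m, S], [S, I_a]]
    # (S = static unbalance), K = diag(k_h, k_a). Solves det(K - w^2 M) = 0
    # as a quadratic in w^2. All values illustrative, not from the paper.
    def typical_section_frequencies(m, I_a, S, k_h, k_a):
        a = m * I_a - S ** 2                  # > 0 for a physical mass matrix
        b = -(k_h * I_a + k_a * m)
        c = k_h * k_a
        disc = math.sqrt(b * b - 4.0 * a * c)
        w2 = sorted(((-b - disc) / (2 * a), (-b + disc) / (2 * a)))
        return [math.sqrt(v) for v in w2]     # rad/s, ascending

    # Assumed section properties: mass, pitch inertia, unbalance, stiffnesses
    freqs = typical_section_frequencies(m=1.0, I_a=0.25, S=0.05,
                                        k_h=1000.0, k_a=400.0)
    ```

    With S = 0 the modes decouple to the familiar √(k_h/m) and √(k_a/I_a); the full aeroelastic problem adds the frequency-dependent aerodynamic force matrix to K before the eigenvalue solve.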

  19. Experimental and Computational Modal Analyses for Launch Vehicle Models considering Liquid Propellant and Flange Joints

    Directory of Open Access Journals (Sweden)

    Chang-Hoon Sim

    2018-01-01

    Full Text Available In this research, modal tests and analyses are performed for a simplified and scaled first-stage model of a space launch vehicle using liquid propellant. This study aims to establish finite element modeling techniques for computational modal analyses by considering the liquid propellant and flange joints of launch vehicles. The modal tests measure the natural frequencies and mode shapes in the first and second lateral bending modes. As the liquid filling ratio increases, the measured frequencies decrease. In addition, as the number of flange joints increases, the measured natural frequencies increase. Computational modal analyses using the finite element method are conducted. The liquid is modeled by the virtual mass method, and the flange joints are modeled using one-dimensional spring elements along with the node-to-node connection. Comparison of the modal test results and predicted natural frequencies shows good or moderate agreement. The correlation between the modal tests and analyses establishes finite element modeling techniques for modeling the liquid propellant and flange joints of space launch vehicles.
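    The two trends reported above — natural frequency falling as liquid mass is added and rising as joints stiffen — follow directly from a one-degree-of-freedom sketch in which the flange joint is a spring in series with the shell stiffness and the liquid adds non-structural mass. All stiffness and mass values below are illustrative assumptions, not the test article's properties.

    ```python
    import math

    # 1-DOF sketch of the reported trends: flange joint modeled as a
    # 1-D spring in series with the shell stiffness; liquid propellant
    # modeled as added (virtual) mass. Values are illustrative only.
    def natural_frequency_hz(k_shell, k_joint, m_struct, m_liquid=0.0):
        k_eff = 1.0 / (1.0 / k_shell + 1.0 / k_joint)  # springs in series
        return math.sqrt(k_eff / (m_struct + m_liquid)) / (2.0 * math.pi)

    # Stiffer joint -> higher frequency (assumed N/m and kg values)
    f_soft = natural_frequency_hz(k_shell=2.0e7, k_joint=5.0e6, m_struct=50.0)
    f_stiff = natural_frequency_hz(k_shell=2.0e7, k_joint=5.0e7, m_struct=50.0)

    # More liquid -> lower frequency
    f_empty = natural_frequency_hz(2.0e7, 5.0e7, 50.0, m_liquid=0.0)
    f_full = natural_frequency_hz(2.0e7, 5.0e7, 50.0, m_liquid=30.0)
    ```

    The finite element model in the record captures the same physics in distributed form, with the virtual mass method for the liquid and node-to-node spring elements for the joints.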

  20. MELCOR analyses of severe accident scenarios in Oconee, a B&W PWR plant

    International Nuclear Information System (INIS)

    Madni, I.K.; Nimnual, S.; Foulds, R.

    1993-01-01

    This paper presents the results and insights gained from MELCOR analyses of two severe accident scenarios, a Loss of Coolant Accident (LOCA) and a Station Blackout (TMLB), in Oconee, a Babcock & Wilcox (B&W) designed PWR with a large dry containment, and compares them with Source Term Code Package (STCP) calculations of the same sequences. Results include predicted timing of key events, thermal-hydraulic response in the reactor coolant system and containment, and environmental releases of fission products. The paper also explores the impact of varying concrete type, vessel failure temperature, and break location on the accident progression, containment pressurization, and environmental releases of radionuclides.