WorldWideScience

Sample records for analyses predict sistergroup

  1. Phylogenomic analyses predict sistergroup relationship of nucleariids and Fungi and paraphyly of zygomycetes with significant support

    Directory of Open Access Journals (Sweden)

    Steenkamp Emma

    2009-01-01

    Full Text Available Abstract. Background: Resolving the evolutionary relationships among Fungi remains challenging because of their highly variable evolutionary rates and the lack of a close phylogenetic outgroup. Nucleariida, an enigmatic group of amoeboids, have been proposed to emerge close to the fungal-metazoan divergence and might fulfill this role. Yet, published phylogenies with up to five genes are without compelling statistical support, and genome-level data should be used to resolve this question with confidence. Results: Our analyses with nuclear (118 proteins) and mitochondrial (13 proteins) data now robustly associate Nucleariida and Fungi as neighbors, an assemblage that we term 'Holomycota'. With Nucleariida as an outgroup, we revisit unresolved deep fungal relationships. Conclusion: Our phylogenomic analysis provides significant support for the paraphyly of the traditional taxon Zygomycota, and contradicts a recent proposal to include Mortierella in a phylum Mucoromycotina. We further question the introduction of separate phyla for Glomeromycota and Blastocladiomycota, whose phylogenetic positions relative to other phyla remain unresolved even with genome-level datasets. Our results motivate broad sampling of additional genome sequences from these phyla.

  2. The sister-group relationships of the largest family of lichenized fungi, Parmeliaceae (Lecanorales, Ascomycota).

    Science.gov (United States)

    Singh, Garima; Divakar, Pradeep K; Dal Grande, Francesco; Otte, Jürgen; Parnmen, Sittiporn; Wedin, Mats; Crespo, Ana; Lumbsch, H Thorsten; Schmitt, Imke

    2013-10-01

    Parmeliaceae is the largest family of lichen-forming fungi. In spite of its importance for fungal diversity, its relationships with other families in Lecanorales remain poorly known. To better understand the evolutionary history of the diversification of lineages and species richness in Parmeliaceae it is important to know the phylogenetic relationships of the closest relatives of the family. A recent study based on two molecular loci suggested that either Protoparmelia s. str. or a group consisting of Gypsoplaca and Protoparmelia s. str. were the possible sister-group candidates of Parmeliaceae, but that study could not distinguish between these two alternatives. Here, we used a four-locus phylogeny (nuLSU, ITS, RPB1, MCM7) to reveal relationships of Parmeliaceae with other potential relatives in Lecanorales. Maximum likelihood and Bayesian analyses showed that Protoparmelia is polyphyletic, with Protoparmelia s. str. (including Protoparmelia badia and Protoparmelia picea) being most closely related to Parmeliaceae s. str., while the Protoparmelia atriseda-group formed the sister-group to Miriquidica. Gypsoplaca formed the sister-group to the Parmeliaceae s. str. + Protoparmelia s. str. clade. Monophyly of Protoparmelia as currently circumscribed, and Gypsoplaca as sister-group to Parmeliaceae s. str. were both significantly rejected by alternative hypothesis testing. PMID:24119410

  3. The first record of a trans-oceanic sister-group relationship between obligate vertebrate troglobites.

    Directory of Open Access Journals (Sweden)

    Prosanta Chakrabarty

    Full Text Available We show, using the most complete phylogeny of one of the most species-rich orders of vertebrates (Gobiiformes) and calibrations from the rich fossil record of teleost fishes, that the genus Typhleotris, endemic to subterranean karst habitats in southwestern Madagascar, is the sister group to Milyeringa, endemic to similar subterranean systems in northwestern Australia. Both groups are eyeless, and our phylogenetic and biogeographic results show that these obligate cave fishes, now found on opposite ends of the Indian Ocean (separated by nearly 7,000 km), are each other's closest relatives and owe their origins to the breakup of the southern supercontinent, Gondwana, at the end of the Cretaceous period. Trans-oceanic sister-group relationships are otherwise unknown between blind, cave-adapted vertebrates, and our results provide an extraordinary case of Gondwanan vicariance.

  4. Climate Prediction Center (CPC) US daily temperature analyses

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The U.S. daily temperature analyses are maps depicting various temperature quantities utilizing daily maximum and minimum temperature data across the US. Maps are...

  5. Analysing Twitter and web queries for flu trend prediction

    Science.gov (United States)

    2014-01-01

    Background Social media platforms encourage people to share diverse aspects of their daily life. Among these, shared health-related information might be used to infer health status and incidence rates for specific conditions or symptoms. In this work, we present an infodemiology study that evaluates the use of Twitter messages and search engine query logs to estimate and predict the incidence rate of influenza-like illness in Portugal. Results Based on a manually classified dataset of 2704 tweets from Portugal, we selected a set of 650 textual features to train a Naïve Bayes classifier to identify tweets mentioning flu or flu-like illness or symptoms. We obtained a precision of 0.78 and an F-measure of 0.83, based on cross-validation over the complete annotated set. Furthermore, we trained a multiple linear regression model to estimate the health-monitoring data from the Influenzanet project, using as predictors the relative frequencies obtained from the tweet classification results and from query logs, and achieved a correlation ratio of 0.89 (p < 0.001). These classification and regression models were also applied to estimate the flu incidence in the following flu season, achieving a correlation of 0.72. Conclusions Previous studies addressing the estimation of disease incidence based on user-generated content have mostly focused on the English language. Our results further validate those studies and show that, by changing the initial steps of data preprocessing and feature extraction and selection, the proposed approaches can be adapted to other languages. Additionally, we investigated whether the predictive model created can be applied to data from the subsequent flu season. In this case, although the prediction result was good, an initial phase to adapt the regression model could be necessary to achieve more robust results. PMID:25077431
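
    The two-stage approach this record describes (a tweet classifier whose output feeds a regression onto official incidence data) can be sketched briefly. The snippet below is a minimal illustration on invented data, not the study's code: the tweets, labels, weekly fractions and incidence values are placeholders, and the real system used 650 selected features and Influenzanet surveillance data.

        # Sketch of the two-stage pipeline: a Naive Bayes tweet classifier
        # followed by a linear regression onto incidence rates.
        import numpy as np
        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.naive_bayes import MultinomialNB
        from sklearn.linear_model import LinearRegression

        # Stage 1: classify tweets as flu-related or not (toy training data).
        tweets = ["estou com febre e tosse", "bom dia lisboa",
                  "gripe outra vez este ano", "novo filme hoje"]
        labels = [1, 0, 1, 0]  # 1 = mentions flu or flu-like symptoms
        vectorizer = CountVectorizer()
        clf = MultinomialNB().fit(vectorizer.fit_transform(tweets), labels)

        # Stage 2: regress official incidence on the weekly flu-tweet fraction.
        weekly_flu_fraction = np.array([[0.01], [0.03], [0.08], [0.05]])
        incidence_rate = np.array([12.0, 30.0, 85.0, 50.0])  # fake surveillance data
        reg = LinearRegression().fit(weekly_flu_fraction, incidence_rate)
        print(reg.predict([[0.04]]))  # estimated incidence for a new week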

  6. Analysing earthquake slip models with the spatial prediction comparison test

    KAUST Repository

    Zhang, L.

    2014-11-10

    Earthquake rupture models inferred from inversions of geophysical and/or geodetic data exhibit remarkable variability due to uncertainties in modelling assumptions, the use of different inversion algorithms, or variations in data selection and data processing. A robust statistical comparison of different rupture models obtained for a single earthquake is needed to quantify the intra-event variability, both for benchmark exercises and for real earthquakes. The same approach may be useful to characterize (dis-)similarities in events that are typically grouped into a common class of events (e.g. moderate-size crustal strike-slip earthquakes or tsunamigenic large subduction earthquakes). For this purpose, we examine the performance of the spatial prediction comparison test (SPCT), a statistical test developed to compare spatial (random) fields by means of a chosen loss function that describes an error relation between a 2-D field (‘model’) and a reference model. We implement and calibrate the SPCT approach for a suite of synthetic 2-D slip distributions, generated as spatial random fields with various characteristics, and then apply the method to results of a benchmark inversion exercise with known solution. We find the SPCT to be sensitive to different spatial correlation lengths and different heterogeneity levels of the slip distributions. The SPCT approach proves to be a simple and effective tool for ranking slip models with respect to a reference model.
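
    As a rough illustration of the loss-function idea behind the SPCT, the sketch below scores two synthetic slip models against a reference field with an absolute-error loss and tests the mean loss differential. This is a simplification on invented fields: the actual SPCT corrects the variance of the test statistic for spatial correlation, which the plain t-test used here ignores.

        # Loss-differential comparison of two candidate slip models against a
        # reference field; illustrative only, not the calibrated SPCT.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        reference = rng.gamma(2.0, 1.0, size=(50, 50))      # "true" slip distribution
        model_a = reference + rng.normal(0, 0.3, (50, 50))  # close model
        model_b = reference + rng.normal(0, 0.9, (50, 50))  # poorer model

        loss_a = np.abs(model_a - reference)                # loss field for model A
        loss_b = np.abs(model_b - reference)
        d = (loss_a - loss_b).ravel()                       # loss differential field

        t, p = stats.ttest_1samp(d, 0.0)                    # H0: equal predictive skill
        print(f"mean loss differential = {d.mean():.3f}, p = {p:.3g}")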

  7. Prediction formulas for individual opioid analgesic requirements based on genetic polymorphism analyses.

    Directory of Open Access Journals (Sweden)

    Kaori Yoshida

    Full Text Available The analgesic efficacy of opioids is well known to vary widely among individuals, and various factors related to individual differences in opioid sensitivity have been identified. However, a prediction model to calculate appropriate opioid analgesic requirements has not yet been established. The present study sought to construct prediction formulas for individual opioid analgesic requirements based on genetic polymorphisms and clinical data from patients who underwent cosmetic orthognathic surgery, and to validate the utility of the prediction formulas in patients who underwent major open abdominal surgery. To construct the prediction formulas, we performed multiple linear regression analyses using data from subjects who underwent cosmetic orthognathic surgery. The dependent variable was 24-h postoperative or perioperative fentanyl use, and the independent variables were age, gender, height, weight, pain perception latencies (PPL), and genotype data of five single-nucleotide polymorphisms (SNPs). To examine the utility of the prediction formulas, we performed simple linear regression analyses using subjects who underwent major open abdominal surgery. Actual 24-h postoperative or perioperative analgesic use and the predicted values that were calculated using the multiple regression equations were incorporated as dependent and independent variables, respectively. Multiple linear regression analyses showed that the four SNPs, PPL, and weight were retained as independent predictors of 24-h postoperative fentanyl use (R² = 0.145, P = 5.66 × 10⁻¹⁰) and the two SNPs and weight were retained as independent predictors of perioperative fentanyl use (R² = 0.185, P = 1.99 × 10⁻¹⁵). Simple linear regression analyses showed that the predicted values were retained as an independent predictor of actual 24-h postoperative analgesic use (R² = 0.033, P = 0.030) and perioperative analgesic use (R² = 0.100, P = 1.09 × 10⁻⁴), respectively. We constructed
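
    A minimal sketch of the regression construction described here, on synthetic data: clinical covariates plus SNP genotypes coded additively (0, 1 or 2 minor alleles) as predictors of fentanyl use. The variable names, coefficients and sample values are hypothetical, not the study's actual SNPs or fitted formulas.

        # Multiple linear regression of fentanyl use on covariates and genotypes.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        n = 100
        X = np.column_stack([
            rng.normal(24, 4, n),   # pain perception latency (PPL), hypothetical units
            rng.normal(55, 8, n),   # weight (kg)
            rng.integers(0, 3, n),  # SNP 1 genotype (minor-allele count)
            rng.integers(0, 3, n),  # SNP 2 genotype
        ])
        y = 0.5 * X[:, 1] + 8 * X[:, 2] + rng.normal(0, 10, n)  # fentanyl use

        model = sm.OLS(y, sm.add_constant(X)).fit()
        print(model.rsquared, model.pvalues)           # analogous to the reported R2 and P
        predicted = model.predict(sm.add_constant(X))  # values usable for later validation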

  8. Predictability of the monthly North Atlantic Oscillation index based on fractal analyses and dynamic system theory

    Directory of Open Access Journals (Sweden)

    M. D. Martínez

    2010-03-01

    Full Text Available The predictability of the monthly North Atlantic Oscillation (NAO) index is analysed from the point of view of different fractal concepts and dynamic system theory, such as lacunarity, rescaled range analysis (Hurst exponent) and the reconstruction theorem (embedding and correlation dimensions, Kolmogorov entropy and Lyapunov exponents). The main results point out evident signs of randomness and the necessity of stochastic models to represent the time evolution of the NAO index. The results also show that the monthly NAO index behaves as a white-noise Gaussian process. The high minimum number of nonlinear equations needed to describe the physical process governing the NAO index fluctuations is evidence of its complexity. A notable predictive instability is indicated by the positive Lyapunov exponents. Besides corroborating the complex time behaviour of the NAO index, the present results suggest that random Cantor sets would be an interesting tool to model lacunarity and the time evolution of the NAO index.
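
    Of the fractal diagnostics listed, the Hurst exponent via rescaled-range (R/S) analysis is the simplest to sketch. The toy implementation below, run on Gaussian white noise as a stand-in for the monthly index, should return a value near 0.5, consistent with the white-noise behaviour reported above; the window sizes are arbitrary choices.

        # Rescaled-range (R/S) estimate of the Hurst exponent.
        import numpy as np

        def hurst_rs(x, window_sizes=(8, 16, 32, 64, 128)):
            rs = []
            for w in window_sizes:
                vals = []
                for start in range(0, len(x) - w + 1, w):
                    seg = x[start:start + w]
                    dev = np.cumsum(seg - seg.mean())  # cumulative deviations
                    r = dev.max() - dev.min()          # range of deviations
                    s = seg.std(ddof=1)                # standard deviation
                    if s > 0:
                        vals.append(r / s)
                rs.append(np.mean(vals))
            # slope of log(R/S) against log(window size) estimates H
            h, _ = np.polyfit(np.log(window_sizes), np.log(rs), 1)
            return h

        nao_like = np.random.default_rng(2).normal(size=1024)  # stand-in monthly index
        print(hurst_rs(nao_like))  # approximately 0.5 for Gaussian white noise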

  9. A multigene phylogeny of the fly superfamily Asiloidea (Insecta): Taxon sampling and additional genes reveal the sister-group to all higher flies (Cyclorrhapha).

    Science.gov (United States)

    Trautwein, Michelle D; Wiegmann, Brian M; Yeates, David K

    2010-09-01

    Asiloidea are a group of 9 lower brachyceran fly families, considered to be the closest relatives of the large Metazoan radiation Eremoneura (Cyclorrhapha+Empidoidea). The evidence for asiloid monophyly is limited, and few characters define the relationships between the families of Asiloidea and Eremoneura. Additionally, the enigmatic genera Hilarimorpha and Apystomyia retain morphological characters of both asiloids and higher flies. We use the nuclear protein-coding gene CAD and 28S rDNA to test the monophyly of Asiloidea and to resolve its relationship to Eremoneura. We explore the effects of taxon sampling on support values and topological stability, the resolving power of additional genes, and hypothesis testing using four-cluster likelihood mapping. We find that: (1) the 'asiloid' genus Apystomyia is sister to Cyclorrhapha, (2) the remaining asiloids are monophyletic to the exclusion of the family Bombyliidae, and (3) our best estimate of relationships places the asiloid flies excluding Bombyliidae as the sister-group to Eremoneura, though high support is lacking. PMID:20399874

  10. The mitochondrial genome of Paraspadella gotoi is highly reduced and reveals that chaetognaths are a sister-group to protostomes

    Energy Technology Data Exchange (ETDEWEB)

    Helfenbein, Kevin G.; Fourcade, H. Matthew; Vanjani, Rohit G.; Boore, Jeffrey L.

    2004-05-01

    We report the first complete mitochondrial (mt) DNA sequence from a member of the phylum Chaetognatha (arrow worms). The Paraspadella gotoi mtDNA is highly unusual, missing 23 of the genes commonly found in animal mtDNAs, including atp6, which has otherwise been found universally to be present. Its 14 genes are unusually arranged into two groups, one on each strand. One group is punctuated by numerous non-coding intergenic nucleotides, while the other is tightly packed, having no non-coding nucleotides, leading to speculation that there are two transcription units with differing modes of expression. The phylogenetic position of the Chaetognatha within the Metazoa has long been uncertain, with conflicting or equivocal results from various morphological analyses and rRNA sequence comparisons. Comparisons here of amino acid sequences from mitochondrially encoded proteins give a single most parsimonious tree that supports a position of Chaetognatha as sister to the protostomes studied here. From this, one can more clearly interpret the patterns of evolution of various developmental features, especially regarding the embryological fate of the blastopore.
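
    The "single most parsimonious tree" statement rests on counting the minimum number of substitutions a tree requires. The toy Fitch small-parsimony routine below scores one site on a fixed tree; the tree shape and character states are invented for illustration only.

        # Fitch small-parsimony count for a single site on a fixed binary tree.
        def fitch(node, states):
            # node: leaf name or (left, right) tuple; returns (state set, changes)
            if isinstance(node, str):
                return {states[node]}, 0
            (sl, cl), (sr, cr) = fitch(node[0], states), fitch(node[1], states)
            inter = sl & sr
            if inter:
                return inter, cl + cr
            return sl | sr, cl + cr + 1  # empty intersection forces one change

        # hypothetical states under ((chaetognath, protostome), deuterostome)
        states = {"chaetognath": "A", "protostome": "A", "deuterostome": "G"}
        tree = (("chaetognath", "protostome"), "deuterostome")
        print(fitch(tree, states))  # root state set and minimum change count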

  11. Design and Antigenic Epitopes Prediction of a New Trial Recombinant Multiepitopic Rotaviral Vaccine: In Silico Analyses.

    Science.gov (United States)

    Jafarpour, Sima; Ayat, Hoda; Ahadi, Ali Mohammad

    2015-01-01

    Rotavirus is the major etiologic factor of severe diarrheal disease. Natural infection provides protection against subsequent rotavirus infection and diarrhea. This research presents a new vaccine designed based on computational models. In this study, three types of epitopes (linear, conformational, and combinational) are considered in a proposed model protein. Several studies on rotavirus vaccines have shown that the VP6 and VP4 proteins are good candidates for vaccine production. In the present study, a fusion protein was designed as a new generation of rotavirus vaccines by bioinformatics analyses. This model-based study using the ABCpred, BCPREDS, Bcepred, and Ellipro web servers showed that the peptide presented in this article has the necessary properties to act as a vaccine. Prediction of linear B-cell epitopes of peptides is helpful to investigate whether these peptides are able to activate humoral immunity. PMID:25965449

  12. PASMet: a web-based platform for prediction, modelling and analyses of metabolic systems.

    Science.gov (United States)

    Sriyudthsak, Kansuporn; Mejia, Ramon Francisco; Arita, Masanori; Hirai, Masami Yokota

    2016-07-01

    PASMet (Prediction, Analysis and Simulation of Metabolic networks) is a web-based platform for proposing and verifying mathematical models to understand the dynamics of metabolism. The advantages of PASMet include user-friendliness and accessibility, which enable biologists and biochemists to easily perform mathematical modelling. PASMet offers a series of user functions to handle time-series data of metabolite concentrations. The functions are organised into four steps: (i) Prediction of a probable metabolic pathway and its regulation; (ii) Construction of mathematical models; (iii) Simulation of metabolic behaviours; and (iv) Analysis of metabolic system characteristics. Each function contains various statistical and mathematical methods that can be used independently. Users who may not have enough knowledge of computing or programming can easily and quickly analyse their local data without software downloads, updates or installations. Users only need to upload their files in comma-separated values (CSV) format or enter their model equations directly into the website. Once the time-series data or mathematical equations are uploaded, PASMet automatically performs the computation server-side. Then, users can interactively view their results and download them directly to their local computers. PASMet is freely available with no login requirement at http://pasmet.riken.jp/ from major web browsers on Windows, Mac and Linux operating systems. PMID:27174940

  13. Prevalence and Predictive Factors of Sexual Dysfunction in Iranian Women: Univariate and Multivariate Logistic Regression Analyses

    Science.gov (United States)

    Direkvand-Moghadam, Ashraf; Suhrabi, Zainab; Akbari, Malihe

    2016-01-01

    Background Female sexual dysfunction, which can occur during any stage of a normal sexual activity, is a serious condition for individuals and couples. The present study aimed to determine the prevalence and predictive factors of female sexual dysfunction in women referred to health centers in Ilam, western Iran, in 2014. Methods In the present cross-sectional study, 444 women who attended health centers in Ilam were enrolled from May to September 2014. Participants were selected according to the simple random sampling method. Univariate and multivariate logistic regression analyses were used to predict the risk factors of female sexual dysfunction. Differences with an alpha error of 0.05 were regarded as statistically significant. Results Overall, 75.9% of the study population exhibited sexual dysfunction. Univariate logistic regression analysis demonstrated that there was a significant association between female sexual dysfunction and age, menarche age, gravidity, parity, and education (P < 0.05). Conclusions A majority of Iranian women suffer from sexual dysfunction. A lack of awareness of Iranian women's sexual pleasure and formal training on sexual function and its influencing factors, such as menarche age, gravidity, and level of education, may lead to a high prevalence of female sexual dysfunction. PMID:27688863
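
    A compact sketch of the univariate-then-multivariate logistic regression workflow on simulated data; the predictors mirror some of those reported (age, parity, education) but every value and coefficient below is fabricated for illustration.

        # Univariate screening followed by a multivariate logistic model.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(3)
        n = 444
        age = rng.normal(32, 8, n)
        parity = rng.integers(0, 5, n)
        education_years = rng.integers(5, 18, n)
        logit_p = -4 + 0.08 * age + 0.3 * parity - 0.05 * education_years
        dysfunction = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

        X = np.column_stack([age, parity, education_years])
        # univariate screen: one predictor at a time
        for j, name in enumerate(["age", "parity", "education"]):
            m = sm.Logit(dysfunction, sm.add_constant(X[:, j])).fit(disp=0)
            print(name, m.pvalues[1])
        # multivariate model with all predictors retained
        print(sm.Logit(dysfunction, sm.add_constant(X)).fit(disp=0).summary())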

  14. Prediction of metabolisable energy value of broiler diets and water excretion from dietary chemical analyses.

    Science.gov (United States)

    Carré, B; Lessire, M; Juin, H

    2013-08-01

    Thirty different pelleted diets were given to broilers (8 per diet) for in vivo measurements of dietary metabolisable energy (ME) value and digestibilities of proteins, lipids, starch and sugars from day 27 to day 31, with ad libitum feeding and total collection of excreta. Water excretion was also measured. Amino acid formulation of the diets was based on ratios to crude protein. Mean in vivo apparent ME values corrected to zero nitrogen retention (AMEn) were always lower than the AMEn values calculated for adult cockerels using predictive equations from the literature based on the chemical analyses of diets. The difference between mean in vivo AMEn values and these calculated AMEn values increased linearly with the increasing amount of wheat in diets (P = 0.0001). Mean digestibilities of proteins, lipids and starch were negatively related to wheat inclusion (P = 0.0001). The correlations between mean in vivo AMEn values and diet analytical parameters were highest for fibre-related parameters, such as water-insoluble cell walls (WICW) (r = -0.91) or Real Applied Viscosity (RAV) (r = -0.77). Thirteen multiple regression equations relating mean in vivo AMEn values to dietary analytical data were calculated, with R² values ranging from 0.859 to 0.966 (P = 0.0001). The highest R² values were obtained when the RAV parameter was included in the independent variables. The direct regression equations obtained with available components (proteins, lipids, starch, sucrose and oligosaccharides) and the indirect regression equations obtained with WICW and ash parameters showed similar R² values. Direct or indirect theoretical equations predicting AMEn values were established using the overall mean in vivo digestibility values. The principle of the indirect equations was based on the assumption that WICW and ashes act as diluters. Addition of RAV or wheat content to the variables improved the accuracy of the theoretical equations. Efficiencies of theoretical equations for predicting AMEn
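
    The contrast between the direct equations (energy summed over available components) and the indirect "diluter" equations (WICW and ash treated as inert material diluting an otherwise energy-rich fraction) can be made concrete as below. All coefficients are invented placeholders, not the paper's fitted values.

        # Direct vs indirect ("diluter") AMEn equations; coefficients hypothetical.
        def amen_direct(protein, lipid, starch, sugars):
            # direct form: energy contributed by digestible components (MJ/kg diet),
            # with contents given in g/kg
            return 0.015 * protein + 0.034 * lipid + 0.015 * starch + 0.013 * sugars

        def amen_indirect(wicw, ash, e_avail=17.5):
            # indirect form: WICW and ash (g/kg) act purely as diluters of an
            # otherwise constant-energy fraction of e_avail MJ/kg
            return e_avail * (1.0 - (wicw + ash) / 1000.0)

        print(amen_direct(protein=210, lipid=60, starch=400, sugars=40))
        print(amen_indirect(wicw=120, ash=55))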

  15. Analyses of Optimal Embedding Dimension and Delay for Local Linear Prediction Model

    Institute of Scientific and Technical Information of China (English)

    MENG Qing-Fang; PENG Yu-Hua; LIU Yun-Xia; SUN Wei-Feng

    2007-01-01

    In the reconstructed phase space, a novel local linear prediction model is proposed to predict chaotic time series. The parameters of the proposed model take values that differ from those of the phase space reconstruction. We propose a criterion based on prediction error to determine the optimal parameters of the proposed model. The simulation results show that the proposed model can effectively make one-step and multi-step predictions for chaotic time series, and that its one-step and multi-step prediction accuracy is superior to that of the traditional local linear prediction.
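
    A minimal version of a local linear predictor in a reconstructed phase space: embed the series with delay tau and dimension m, find the nearest neighbours of the current state, and fit a local affine map for one-step prediction. The embedding parameters and neighbour count below are arbitrary, not the optimal values the paper's criterion would select.

        # Local linear one-step prediction in a delay-embedded phase space.
        import numpy as np

        def local_linear_predict(x, m=3, tau=2, k=8):
            # delay vectors x_t = (x[t], x[t-tau], ..., x[t-(m-1)*tau])
            idx = np.arange((m - 1) * tau, len(x))
            emb = np.column_stack([x[idx - j * tau] for j in range(m)])
            targets = x[idx[:-1] + 1]              # next value of each past state
            query = emb[-1]                        # current state
            dists = np.linalg.norm(emb[:-1] - query, axis=1)
            nn = np.argsort(dists)[:k]             # k nearest past states
            A = np.column_stack([emb[:-1][nn], np.ones(k)])  # local affine model
            coef, *_ = np.linalg.lstsq(A, targets[nn], rcond=None)
            return np.append(query, 1.0) @ coef

        # logistic map as a test chaotic series
        x = np.empty(500)
        x[0] = 0.4
        for t in range(499):
            x[t + 1] = 3.9 * x[t] * (1 - x[t])
        print(local_linear_predict(x), 3.9 * x[-1] * (1 - x[-1]))  # prediction vs truth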

  16. Analysing the Relevance of Experience Partitions to the Prediction of Players’ Self-Reports of Affect

    DEFF Research Database (Denmark)

    Martínez, Héctor Pérez; Yannakakis, Georgios N.

    2011-01-01

    A common practice in modeling affect from physiological signals consists of reducing the signals to a set of statistical features that feed predictors of self-reported emotions. This paper analyses the impact of various time-windows, used for the extraction of physiological features, on the accuracy of predictors of self-reported affect.

  17. Measuring Usable Knowledge: Teachers' Analyses of Mathematics Classroom Videos Predict Teaching Quality and Student Learning

    Science.gov (United States)

    Kersting, Nicole B.; Givvin, Karen B.; Thompson, Belinda J.; Santagata, Rossella; Stigler, James W.

    2012-01-01

    This study explores the relationships between teacher knowledge, teaching practice, and student learning in mathematics. It extends previous work that developed and evaluated an innovative approach to assessing teacher knowledge based on teachers' analyses of classroom video clips. Teachers watched and commented on 13 fraction clips. These written…

  18. Map on predicted deposition of Cs-137 in Spanish soils from geostatistical analyses

    International Nuclear Information System (INIS)

    The knowledge of the distribution of 137Cs deposition over Spanish mainland soils, along with geographical, physical and morphological terrain information, enables us to know the 137Cs background content in soil. This could be useful as a tool in a hypothetical situation of an accident involving a radioactive discharge, or in soil erosion studies. A Geographic Information System (GIS) allows the gathering of all the mentioned information. In this work, gamma measurements of 137Cs on 34 Spanish mainland soils, rainfall data taken from 778 weather stations, soil types and geographical and physical terrain information were input into a GIS. Geostatistical techniques were applied to interpolate values of 137Cs activity at unsampled places, obtaining prediction maps of 137Cs deposition. Up to now, geostatistical methods have been used to model the spatial continuity of data. Through semivariance and cross-covariance functions the spatial correlation of such data can be studied and described. Ordinary and simple kriging techniques were carried out to map spatial patterns of 137Cs deposition, and ordinary and simple co-kriging were used to improve the prediction map through a second related variable, namely the rainfall. To choose the best prediction map of 137Cs deposition, the spatial dependence of the variable, the correlation coefficient and the prediction errors were evaluated using the different models previously mentioned. The best result for the 137Cs deposition map was obtained when applying the co-kriging techniques. Highlights: implementation of 137Cs activity data for Spanish soils in a GIS; prediction maps of 137Cs fallout produced with kriging techniques; more accurate prediction surfaces obtained using co-kriging; rainfall used as the second variable in co-kriging.
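
    Ordinary kriging is closely related to Gaussian-process regression, so a GP with an RBF kernel plus a noise term gives a compact stand-in for the interpolation step described above (true co-kriging with rainfall as a secondary variable needs a multivariate model and is not shown). The coordinates and activities below are synthetic placeholders for the 34 sampled soils.

        # GP-regression stand-in for ordinary kriging of 137Cs deposition.
        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        rng = np.random.default_rng(4)
        coords = rng.uniform(0, 700, size=(34, 2))  # easting/northing, km (fake)
        cs137 = 2.0 + 0.004 * coords[:, 1] + rng.normal(0, 0.3, 34)  # activity (fake)

        gp = GaussianProcessRegressor(kernel=RBF(100.0) + WhiteKernel(0.1),
                                      normalize_y=True).fit(coords, cs137)
        gx, gy = np.meshgrid(np.linspace(0, 700, 50), np.linspace(0, 700, 50))
        grid = np.column_stack([gx.ravel(), gy.ravel()])
        pred, std = gp.predict(grid, return_std=True)  # map surface + uncertainty
        print(pred.reshape(50, 50).max(), std.mean())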

  1. Finite Element Creep Damage Analyses and Life Prediction of P91 Pipe Containing Local Wall Thinning Defect

    Science.gov (United States)

    Xue, Jilin; Zhou, Changyu

    2016-03-01

    Creep continuum damage finite element (FE) analyses were performed for a P91 steel pipe containing a local wall thinning (LWT) defect subjected to monotonic internal pressure, monotonic bending moment, and combined internal pressure and bending moment, using an orthogonal experimental design method. The creep damage lives of the pipe under different load conditions were obtained, and creep damage life formulas were regressed from the FE results. At the same time, a skeletal point rupture stress was identified and used for life prediction, which was compared with the creep damage lives obtained from the continuum damage analyses. The results show that the failure lives of a pipe containing an LWT defect can be obtained accurately using the skeletal point rupture stress method. Finally, the influence of LWT defect geometry was analysed, indicating that relative defect depth is the most significant factor governing the creep damage life of a pipe containing an LWT defect.

  2. Aeromechanics and Aeroacoustics Predictions of the Boeing-SMART Rotor Using Coupled-CFD/CSD Analyses

    Science.gov (United States)

    Bain, Jeremy; Sim, Ben W.; Sankar, Lakshmi; Brentner, Ken

    2010-01-01

    This paper will highlight helicopter aeromechanics and aeroacoustics prediction capabilities developed by Georgia Institute of Technology, the Pennsylvania State University, and Northern Arizona University under the Helicopter Quieting Program (HQP) sponsored by the Tactical Technology Office of the Defense Advanced Research Projects Agency (DARPA). First initiated in 2004, the goal of the HQP was to develop high fidelity, state-of-the-art computational tools for designing advanced helicopter rotors with reduced acoustic perceptibility and enhanced performance. A critical step towards achieving this objective is the development of rotorcraft prediction codes capable of assessing a wide range of helicopter configurations and operations for future rotorcraft designs. This includes novel next-generation rotor systems that incorporate innovative passive and/or active elements to meet future challenging military performance and survivability goals.

  3. Computational Prediction and Biochemical Analyses of New Inverse Agonists for the CB1 Receptor.

    Science.gov (United States)

    Scott, Caitlin E; Ahn, Kwang H; Graf, Steven T; Goddard, William A; Kendall, Debra A; Abrol, Ravinder

    2016-01-25

    Human cannabinoid type 1 (CB1) G-protein coupled receptor is a potential therapeutic target for obesity. The previously predicted and experimentally validated ensemble of ligand-free conformations of CB1 [Scott, C. E. et al. Protein Sci. 2013, 22, 101-113; Ahn, K. H. et al. Proteins 2013, 81, 1304-1317] is used here to predict the binding sites for known CB1-selective inverse agonists including rimonabant and its seven known derivatives. This binding pocket, which differs significantly from previously published models, is used to identify 16 novel compounds expected to be CB1 inverse agonists by exploiting potential new interactions. We show experimentally that two of these compounds exhibit inverse agonist properties, including inhibition of basal and agonist-induced G-protein coupling activity, as well as an enhanced level of CB1 cell surface localization. This demonstrates the utility of using the predicted binding sites of an ensemble of CB1 receptor structures for designing new CB1 inverse agonists.

  4. Accuracy of Fall Prediction in Parkinson Disease: Six-Month and 12-Month Prospective Analyses

    Directory of Open Access Journals (Sweden)

    Ryan P. Duncan

    2012-01-01

    Full Text Available Introduction. We analyzed the ability of four balance assessments to predict falls in people with Parkinson Disease (PD) prospectively over six and 12 months. Materials and Methods. The BESTest, Mini-BESTest, Functional Gait Assessment (FGA), and Berg Balance Scale (BBS) were administered to 80 participants with idiopathic PD at baseline. Falls were then tracked for 12 months. The ability of each test to predict falls at six and 12 months was assessed using ROC curves and likelihood ratios (LR). Results. Twenty-seven percent of the sample had fallen at six months, and 32% of the sample had fallen at 12 months. At six months, areas under the ROC curve (AUC) for the tests ranged from 0.8 (FGA) to 0.89 (BESTest), with LR+ of 3.4 (FGA) to 5.8 (BESTest). At 12 months, AUCs ranged from 0.68 (BESTest, BBS) to 0.77 (Mini-BESTest), with LR+ of 1.8 (BESTest) to 2.4 (BBS, FGA). Discussion. The various balance tests were effective in predicting falls at six months. All tests were relatively ineffective at 12 months. Conclusion. This pilot study suggests that people with PD should be assessed biannually for fall risk.
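
    The reported metrics (AUC and positive likelihood ratio at a cut-off) can be computed as below on simulated stand-ins for balance scores and fall outcomes; the score distribution and cut-off rule are invented, not the study's data.

        # ROC AUC and positive likelihood ratio for a fall-prediction score.
        import numpy as np
        from sklearn.metrics import roc_auc_score, roc_curve

        rng = np.random.default_rng(5)
        fell = rng.random(80) < 0.3                    # simulated 6-month fall outcome
        score = 70 - 15 * fell + rng.normal(0, 8, 80)  # balance test (lower = worse)

        # lower scores indicate higher risk, so negate for ROC conventions
        auc = roc_auc_score(fell, -score)
        fpr, tpr, thresh = roc_curve(fell, -score)
        lr_plus = tpr / np.maximum(fpr, 1e-9)          # sensitivity / (1 - specificity)
        best = np.argmax(tpr - fpr)                    # Youden-optimal cut-off
        print(f"AUC = {auc:.2f}, LR+ at best cut-off = {lr_plus[best]:.1f}")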

  5. Predictability of Regional Climate: A Bayesian Approach to Analysing a WRF Model Ensemble

    Science.gov (United States)

    Bruyere, C. L.; Mesquita, M. D. S.; Paimazumder, D.

    2013-12-01

    This study investigates aspects of climate predictability with a focus on climatic variables and different characteristics of extremes over nine North American climatic regions and two selected Atlantic sectors. An ensemble of state-of-the-art Weather Research and Forecasting Model (WRF) simulations is used for the analysis. The ensemble comprises a combination of various physics schemes, initial conditions, domain sizes, boundary conditions and breeding techniques. The main objectives of this research are: 1) to increase our understanding of the ability of WRF to capture regional climate information, both for individual ensemble members and for the ensemble collectively; 2) to investigate the role of different members and their synergy in reproducing regional climate; and 3) to estimate the associated uncertainty. In this study, we propose a Bayesian framework to study the predictability of extremes and associated uncertainties in order to provide a wealth of knowledge about WRF reliability and provide further clarity and understanding of the sensitivities and optimal combinations. The choice of the Bayesian model, as opposed to standard methods, is made because: a) this method has a mean square error that is less than standard statistics, which makes it a more robust method; b) it allows for the use of small sample sizes, which are typical in high-resolution modeling; and c) it provides a probabilistic view of uncertainty, which is useful when making decisions concerning ensemble members.

  6. Street-based Topological Representations and Analyses for Predicting Traffic Flow in GIS

    CERN Document Server

    Jiang, Bin

    2007-01-01

    It is well received in the space syntax community that traffic flow is significantly correlated to a morphological property of streets, which are represented by axial lines forming a so-called axial map. The correlation coefficient (R-squared value) approaches 0.8, and even higher values, according to the space syntax literature. In this paper, we study the same issue using the Hong Kong street network and the Hong Kong Annual Average Daily Traffic (AADT) datasets, and find, surprisingly, that street-based topological representations (or street-street topologies) tend to be better representations than the axial map. In other words, vehicle flow is correlated to a morphological property of streets better than to that of axial lines. Based on this finding, we suggest street-based topological representations as an alternative GIS representation, and topological analyses as a new analytical means for geographic knowledge discovery.

  7. Development of a feasibility prediction tool for solar power plant installation analyses

    International Nuclear Information System (INIS)

    Highlights: an agglomerative hierarchical clustering tool is designed for renewable energy sources; the model uses the nearest neighbor approach as the clustering algorithm, with Euclidean, Manhattan, and Minkowski distance metrics as distance equations; the developed tool assists the knowledge domain expert in analysing extensive datasets; the tool clusters the given sample data efficiently and successfully with each distance metric; the clustering results are compared according to success rates. Abstract: Solar energy is a challenging area among renewable sources, since solar energy sources have the advantages of not causing pollution, having low maintenance cost, and not producing noise due to the absence of moving parts. Despite these advantages, the installation cost of a solar power plant is considerably high, so feasibility analyses play a great role before installation in determining the most appropriate power plant site. Although there are many methods used in feasibility analysis, this paper focuses on a new intelligent method based on an agglomerative hierarchical clustering approach. The solar irradiation and insolation parameters of the Central Anatolian Region of Turkey are evaluated utilizing the intelligent feasibility analysis tool developed in this study. The clustering operation in the tool is performed using the nearest neighbor algorithm. For determining the optimum hierarchical clustering results, Euclidean, Manhattan and Minkowski distance metrics are adapted to the tool. The clustering results based on the Minkowski distance metric provide the most feasible inferences to the knowledge domain expert, compared with the other distance metrics.
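
    The clustering step is straightforward to reproduce in outline: single-linkage (nearest-neighbour) agglomerative clustering under the three distance metrics named above. The site values below are invented stand-ins for the Central Anatolian irradiation and insolation data, and the cluster count is arbitrary.

        # Nearest-neighbour hierarchical clustering under three distance metrics.
        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster
        from scipy.spatial.distance import pdist

        rng = np.random.default_rng(6)
        sites = np.column_stack([rng.normal(1600, 120, 20),   # kWh/m2/yr irradiation
                                 rng.normal(7.2, 0.6, 20)])   # h/day insolation

        for metric, kw in (("euclidean", {}), ("cityblock", {}),  # cityblock = Manhattan
                           ("minkowski", {"p": 3})):
            d = pdist(sites, metric=metric, **kw)                 # pairwise distances
            z = linkage(d, method="single")                       # nearest-neighbour merges
            labels = fcluster(z, t=3, criterion="maxclust")       # cut into 3 clusters
            print(metric, labels)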

  8. CFD Analyses and Jet-Noise Predictions of Chevron Nozzles with Vortex Stabilization

    Science.gov (United States)

    Dippold, Vance

    2008-01-01

    The WIND computational fluid dynamics code was used to perform a series of analyses on a single-flow plug nozzle with chevrons. Air was injected from tubes tangent to the nozzle outer surface at three different points along the chevron at the nozzle exit: near the chevron notch, at the chevron mid-point, and near the chevron tip. Three injection pressures were used for each injection tube location (10, 30, and 50 psig), giving injection mass flow rates of 0.1, 0.2, and 0.3 percent of the nozzle mass flow. The results showed subtle changes in the jet plume's turbulence and vorticity structure in the region immediately downstream of the nozzle exit. Distinctive patterns in the plume structure emerged from each injection location, and these became more pronounced as the injection pressure was increased. However, no significant changes in centerline velocity decay or turbulent kinetic energy were observed in the jet plume as a result of flow injection. Furthermore, computational acoustics calculations performed with the JeNo code showed no real reduction in jet noise relative to the baseline chevron nozzle.

  9. Circulating biomarkers for predicting cardiovascular disease risk; a systematic review and comprehensive overview of meta-analyses.

    Directory of Open Access Journals (Sweden)

    Thijs C van Holten

    Full Text Available BACKGROUND: Cardiovascular disease is one of the major causes of death worldwide. Assessing the risk for cardiovascular disease is an important aspect of clinical decision making and setting a therapeutic strategy, and the use of serological biomarkers may improve this. Despite an overwhelming number of studies and meta-analyses on biomarkers and cardiovascular disease, there are no comprehensive studies comparing the relevance of each biomarker. We performed a systematic review of meta-analyses on levels of serological biomarkers for atherothrombosis to compare the relevance of the most commonly studied biomarkers. METHODS AND FINDINGS: Medline and Embase were screened on search terms related to "arterial ischemic events" and "meta-analyses". The meta-analyses were sorted into patient groups without pre-existing cardiovascular disease, patients with cardiovascular disease, and heterogeneous groups (general populations, mixed groups with and without cardiovascular disease, or miscellaneous). These were subsequently sorted by end-point for cardiovascular disease or stroke and summarized in tables. We identified 85 relevant full-text articles, with 214 meta-analyses. Markers for primary cardiovascular events include, ranked from strongest to weakest association: C-reactive protein, fibrinogen, cholesterol, apolipoprotein B, the apolipoprotein A/apolipoprotein B ratio, high density lipoprotein, and vitamin D. Markers for secondary cardiovascular events include, ranked from strongest to weakest association: cardiac troponins I and T, C-reactive protein, serum creatinine, and cystatin C. For primary stroke, fibrinogen and serum uric acid are strong risk markers. A limitation is that there is no established search strategy for prognostic studies or meta-analyses. CONCLUSIONS: For primary cardiovascular events, markers with strong predictive potential are mainly associated with lipids. For secondary cardiovascular events, markers are more associated with ischemia. Fibrinogen is a

  10. Accuracy of finite element analyses of CT scans in predictions of vertebral failure patterns under axial compression and anterior flexion.

    Science.gov (United States)

    Jackman, Timothy M; DelMonaco, Alex M; Morgan, Elise F

    2016-01-25

    Finite element (FE) models built from quantitative computed tomography (QCT) scans can provide patient-specific estimates of bone strength and fracture risk in the spine. While prior studies demonstrate accurate QCT-based FE predictions of vertebral stiffness and strength, the accuracy of the predicted failure patterns, i.e., the locations where failure occurs within the vertebra and the way in which the vertebra deforms as failure progresses, is less clear. This study used digital volume correlation (DVC) analyses of time-lapse micro-computed tomography (μCT) images acquired during mechanical testing (compression and anterior flexion) of thoracic spine segments (T7-T9, n=28) to measure displacements occurring throughout the T8 vertebral body at the ultimate point. These displacements were compared to those simulated by QCT-based FE analyses of T8. We hypothesized that the FE predictions would be more accurate when the boundary conditions are based on measurements of pressure distributions within intervertebral discs of similar level of disc degeneration vs. boundary conditions representing rigid platens. The FE simulations captured some of the general, qualitative features of the failure patterns; however, displacement errors ranged from 12% to 279%. Contrary to our hypothesis, no differences in displacement errors were found when using boundary conditions representing measurements of disc pressure vs. rigid platens. The smallest displacement errors were obtained using boundary conditions that were measured directly by DVC at the T8 endplates. These findings indicate that further work is needed to develop methods of identifying physiological loading conditions for the vertebral body, for the purpose of achieving robust, patient-specific FE analyses of failure mechanisms. PMID:26792288

  11. Computational Analyses in Support of Sub-scale Diffuser Testing for the A-3 Facility. Part 1; Steady Predictions

    Science.gov (United States)

    Allgood, Daniel C.; Graham, Jason S.; Ahuja, Vineet; Hosangadi, Ashvin

    2010-01-01

    Simulation technology can play an important role in rocket engine test facility design and development by assessing risks, providing analysis of dynamic pressure and thermal loads, identifying failure modes and predicting anomalous behavior of critical systems. Advanced numerical tools assume greater significance in supporting testing and design of high altitude testing facilities and plume induced testing environments of high thrust engines because of the greater inter-dependence and synergy in the functioning of the different sub-systems. This is especially true for facilities such as the proposed A-3 facility at NASA SSC because of a challenging operating envelope linked to variable throttle conditions at relatively low chamber pressures. Facility designs in this case will require a complex network of diffuser ducts, steam ejector trains, fast operating valves, cooling water systems and flow diverters that need to be characterized for steady state performance. In this paper, we demonstrate, through CFD analyses, an advanced capability to evaluate supersonic diffuser and steam ejector performance in a sub-scale A-3 facility at NASA Stennis Space Center (SSC), where extensive testing was performed. Furthermore, the focus in this paper relates to modeling of critical sub-systems and components used in facilities such as the A-3 facility. The work here addresses deficiencies in empirical models and current CFD analyses that are used for the design of supersonic diffusers/turning vanes/ejectors, as well as analyses for confined plumes and venting processes. The primary areas that are addressed are: (1) supersonic diffuser performance, including analyses of thermal loads; (2) accurate shock capturing in the diffuser duct; (3) effect of the turning duct on the performance of the facility; (4) prediction of mass flow rates and performance classification for steam ejectors; (5) comparisons with test data from sub-scale diffuser testing and assessment of confidence

  12. Benchmark of SCALE (SAS2H) isotopic predictions of depletion analyses for San Onofre PWR MOX fuel

    Energy Technology Data Exchange (ETDEWEB)

    Hermann, O.W.

    2000-02-01

    The isotopic composition of mixed-oxide (MOX) fuel, fabricated with both uranium and plutonium, after discharge from reactors is of significant interest to the Fissile Materials Disposition Program. The validation of the SCALE (SAS2H) depletion code for use in the prediction of isotopic compositions of MOX fuel, similar to previous validation studies on uranium-only fueled reactors, has corresponding significance. The EEI-Westinghouse Plutonium Recycle Demonstration Program examined the use of MOX fuel in the San Onofre PWR, Unit 1, during cycles 2 and 3. Isotopic analyses of the MOX spent fuel were conducted on 13 actinides and ¹⁴⁸Nd by either mass or alpha spectrometry. Six fuel pellet samples were taken from four different fuel pins of an irradiated MOX assembly. The measured actinide inventories from those samples have been used to benchmark SAS2H for MOX fuel applications. The average percentage differences between the code results and the measurements were −0.9% for ²³⁵U and 5.2% for ²³⁹Pu. The differences for most of the isotopes were significantly larger than in the cases of uranium-only fueled reactors. In general, comparisons of code results with alpha spectrometer data showed extreme differences, although the differences between the calculations and mass spectrometer analyses were not much larger than those for uranium-only fueled reactors. This benchmark study should be useful in estimating uncertainties of inventory, criticality and dose calculations of MOX spent fuel.

  13. Mitogenomic analyses of eutherian relationships.

    Science.gov (United States)

    Arnason, U; Janke, A

    2002-01-01

    Reasonably correct phylogenies are fundamental to the testing of evolutionary hypotheses. Here, we present phylogenetic findings based on analyses of 67 complete mammalian mitochondrial (mt) genomes. The analyses, irrespective of whether they were performed at the amino acid (aa) level or on nucleotides (nt) of first and second codon positions, placed Erinaceomorpha (hedgehogs and their kin) as the sister group of remaining eutherians. Thus, the analyses separated Erinaceomorpha from other traditional lipotyphlans (e.g., tenrecs, moles, and shrews), making traditional Lipotyphla polyphyletic. Both the aa and nt data sets identified the two order-rich eutherian clades, the Cetferungulata (comprising Pholidota, Carnivora, Perissodactyla, Artiodactyla, and Cetacea) and the African clade (Tenrecomorpha, Macroscelidea, Tubulidentata, Hyracoidea, Proboscidea, and Sirenia). The study corroborated recent findings that have identified a sister-group relationship between Anthropoidea and Dermoptera (flying lemurs), thereby making our own order, Primates, a paraphyletic assembly. Molecular estimates using paleontologically well-established calibration points placed the origin of most eutherian orders in Cretaceous times, 70-100 million years before present (MYBP). The same estimates place all primate divergences much earlier than traditionally believed. For example, the divergence between Homo and Pan is estimated to have taken place approximately 10 MYBP, a dating consistent with recent findings in primate paleontology. PMID:12438776

  14. ATOP - The Advanced Taiwan Ocean Prediction System Based on the mpiPOM. Part 1: Model Descriptions, Analyses and Results

    Directory of Open Access Journals (Sweden)

    Leo Oey

    2013-01-01

    Full Text Available A data-assimilated Taiwan Ocean Prediction (ATOP) system is being developed at the National Central University, Taiwan. The model simulates sea-surface height, three-dimensional currents, temperature and salinity, and turbulent mixing. The model has options for tracer and particle-tracking algorithms, as well as for wave-induced Stokes drift and wave-enhanced mixing and bottom drag. Two different forecast domains have been tested: a large-grid domain that encompasses the entire North Pacific Ocean at 0.1° × 0.1° horizontal resolution and 41 vertical sigma levels, and a smaller western North Pacific domain which at present also has the same horizontal resolution. In both domains, 25-year spin-up runs from 1988 - 2011 were first conducted, forced by six-hourly Cross-Calibrated Multi-Platform (CCMP) and NCEP-reanalysis Global Forecast System (GFS) winds. The results are then used as initial conditions to conduct ocean analyses from January 2012 through February 2012, when updated hindcasts and real-time forecasts begin using the GFS winds. This paper describes the ATOP system and compares the forecast results against satellite altimetry data for assessing model skill. The model results are also shown to compare well with observations of (i) the Kuroshio intrusion in the northern South China Sea, and (ii) the subtropical counter current. A review and comparison with other models in the literature are also given.

  15. Analyses of Potential Predictive Markers and Response to Targeted Therapy in Patients with Advanced Clear-cell Renal Cell Carcinoma

    Directory of Open Access Journals (Sweden)

    Yan Song; Jing Huang; Ling Shan; Hong-Tu Zhang

    2015-01-01

    Full Text Available Background: Vascular endothelial growth factor-targeted agents are standard treatments in advanced clear-cell renal cell carcinoma (ccRCC), but biomarkers of activity are lacking. The aim of this study was to investigate the association of Von Hippel-Lindau (VHL) gene status, vascular endothelial growth factor receptor (VEGFR) or stem cell factor receptor (KIT) expression, and their relationships with characteristics and clinical outcome of advanced ccRCC. Methods: A total of 59 patients who received targeted treatment with sunitinib or pazopanib were evaluated at the Cancer Hospital and Institute, Chinese Academy of Medical Sciences, between January 2010 and November 2012. Paraffin-embedded tumor samples were collected, and the status of the VHL gene and the expression of VEGFR and KIT were determined by VHL sequence analysis and immunohistochemistry. Clinical-pathological features were collected, and efficacy measures such as response rate, median progression-free survival (PFS) and overall survival (OS) were calculated and then compared based on expression status. The Chi-square test, the Kaplan-Meier method, and the log-rank test were used for statistical analyses. Results: Of 59 patients, objective responses were observed in 28 patients (47.5%). The median PFS was 13.8 months and median OS was 39.9 months. There was an improved PFS in patients with the following clinical features: male gender, number of metastatic sites 2 or less, VEGFR-2 positive or KIT positive. Eleven patients (18.6%) had evidence of VHL mutation, with an objective response rate of 45.5%, which showed no difference from patients with no VHL mutation (47.9%). VHL mutation status did not correlate with either overall response rate (P = 0.938) or PFS (P = 0.277). The PFS was 17.6 months and 22.2 months in VEGFR-2 positive patients and KIT positive patients, respectively, which was significantly longer than that of VEGFR-2 or KIT negative patients (P = 0.026 and P = 0.043). Conclusion: VHL mutation status could not predict
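
    The Kaplan-Meier and log-rank comparison of PFS in marker-positive versus marker-negative patients can be sketched with the lifelines package, as below, on simulated durations; the medians and censoring rates are fabricated, chosen only to echo the 17.6-month figure above.

        # Kaplan-Meier estimate and log-rank test for two marker groups.
        import numpy as np
        from lifelines import KaplanMeierFitter
        from lifelines.statistics import logrank_test

        rng = np.random.default_rng(7)
        pos = rng.exponential(17.6, 30)     # PFS months, VEGFR-2 positive (fake)
        neg = rng.exponential(8.0, 29)      # PFS months, VEGFR-2 negative (fake)
        obs_pos = rng.random(30) < 0.8      # True = progression observed
        obs_neg = rng.random(29) < 0.8

        kmf = KaplanMeierFitter()
        kmf.fit(pos, event_observed=obs_pos, label="VEGFR-2 positive")
        print(kmf.median_survival_time_)    # median PFS in the positive group

        res = logrank_test(pos, neg, event_observed_A=obs_pos, event_observed_B=obs_neg)
        print(res.p_value)                  # analogous to the reported P = 0.026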

  18. Analyses of Potential Predictive Markers and Response to Targeted Therapy in Patients with Advanced Clear-cell Renal Cell Carcinoma

    Institute of Scientific and Technical Information of China (English)

    Yan Song; Jing Huang; Ling Shan; Hong-Tu Zhang

    2015-01-01

    Background: Vascular endothelial growth factor-targeted agents are standard treatments in advanced clear-cell renal cell carcinoma (ccRCC), but biomarkers of activity are lacking. The aim of this study was to investigate the association of Von Hippel-Lindau (VHL) gene status, vascular endothelial growth factor receptor (VEGFR) or stem cell factor receptor (KIT) expression, and their relationships with characteristics and clinical outcome of advanced ccRCC. Methods: A total of 59 patients who received targeted treatment with sunitinib or pazopanib were evaluated at the Cancer Hospital and Institute, Chinese Academy of Medical Sciences between January 2010 and November 2012. Paraffin-embedded tumor samples were collected, and the status of the VHL gene and expression of VEGFR and KIT were determined by VHL sequence analysis and immunohistochemistry. Clinical-pathological features were collected, and efficacy measures such as response rate, median progression-free survival (PFS), and overall survival (OS) were calculated and then compared based on expression status. The Chi-square test, the Kaplan-Meier method, and the log-rank test were used for statistical analyses. Results: Of 59 patients, objective responses were observed in 28 patients (47.5%). The median PFS was 13.8 months and median OS was 39.9 months. There was an improved PFS in patients with the following clinical features: male gender, number of metastatic sites 2 or less, VEGFR-2 positive or KIT positive. Eleven patients (18.6%) had evidence of VHL mutation, with an objective response rate of 45.5%, which showed no difference from patients with no VHL mutation (47.9%). VHL mutation status did not correlate with either overall response rate (P = 0.938) or PFS (P = 0.277). The PFS was 17.6 months and 22.2 months in VEGFR-2 positive patients and KIT positive patients, respectively, which was significantly longer than that of VEGFR-2 or KIT negative patients (P = 0.026 and P = 0.043). Conclusion: VHL mutation status could not predict

  19. Palindrome analyser - A new web-based server for predicting and evaluating inverted repeats in nucleotide sequences.

    Science.gov (United States)

    Brázda, Václav; Kolomazník, Jan; Lýsek, Jiří; Hároníková, Lucia; Coufal, Jan; Št'astný, Jiří

    2016-09-30

    DNA cruciform structures play an important role in the regulation of natural processes including gene replication and expression, as well as nucleosome structure and recombination. They have also been implicated in the evolution and development of diseases such as cancer and neurodegenerative disorders. Cruciform structures are formed by inverted repeats, and their stability is enhanced by DNA supercoiling and protein binding. They have received broad attention because of their important roles in biology. Computational approaches to study inverted repeats have allowed detailed analysis of genomes. However, currently there are no easily accessible and user-friendly tools that can analyse inverted repeats, especially among long nucleotide sequences. We have developed a web-based server, Palindrome analyser, which is a user-friendly application for analysing inverted repeats in various DNA (or RNA) sequences including genome sequences and oligonucleotides. It allows users to search and retrieve desired gene/nucleotide sequence entries from the NCBI databases, and provides data on length, sequence, locations and energy required for cruciform formation. Palindrome analyser also features an interactive graphical data representation of the distribution of the inverted repeats, with options for sorting according to the length of inverted repeat, length of loop, and number of mismatches. Palindrome analyser can be accessed at http://bioinformatics.ibp.cz. PMID:27603574
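
    As a rough illustration of what such a tool computes (this is not Palindrome analyser's code; the arm length, loop limit and perfect-match criterion are illustrative choices), a minimal inverted-repeat scan in Python might look like:

```python
# Minimal sketch: scan a DNA sequence for inverted repeats -- an "arm"
# followed, after a short loop, by its own reverse complement.
COMPLEMENT = str.maketrans("ACGT", "TGCA")

def reverse_complement(seq: str) -> str:
    return seq.translate(COMPLEMENT)[::-1]

def find_inverted_repeats(seq, arm_len=6, max_loop=10):
    """Yield (start, arm, loop_len) for perfect inverted repeats."""
    seq = seq.upper()
    n = len(seq)
    for i in range(n - 2 * arm_len):
        arm = seq[i:i + arm_len]
        rc = reverse_complement(arm)
        for loop in range(max_loop + 1):
            j = i + arm_len + loop
            if j + arm_len > n:
                break
            if seq[j:j + arm_len] == rc:
                yield i, arm, loop

if __name__ == "__main__":
    for hit in find_inverted_repeats("AAGGCGCATTTATGCGCCTT"):
        print(hit)
```

    A real tool such as the one described above additionally tolerates mismatches within arms and estimates the free energy of cruciform formation.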

  20. Serial and panel analyses of biomarkers do not improve the prediction of bacteremia compared to one procalcitonin measurement

    NARCIS (Netherlands)

    Tromp, M.; Lansdorp, B.; Bleeker-Rovers, C.P.; Klein Gunnewiek, J.M.; Kullberg, B.J.; Pickkers, P.

    2012-01-01

    Objectives We evaluated the value of a single biomarker, biomarker panels, biomarkers combined with clinical signs of sepsis, and serial determinations of biomarkers in the prediction of bacteremia in patients with sepsis. Methods Adult patients visiting the emergency department because of a susp

  1. Serial and panel analyses of biomarkers do not improve the prediction of bacteremia compared to one procalcitonin measurement.

    NARCIS (Netherlands)

    Tromp, M.; Lansdorp, B.; Bleeker-Rovers, C.P.; Gunnewiek, J.M.; Kullberg, B.J.; Pickkers, P.

    2012-01-01

    OBJECTIVES: We evaluated the value of a single biomarker, biomarker panels, biomarkers combined with clinical signs of sepsis, and serial determinations of biomarkers in the prediction of bacteremia in patients with sepsis. METHODS: Adult patients visiting the emergency department because of a suspe

  2. Basic Diagnosis and Prediction of Persistent Contrail Occurrence using High-resolution Numerical Weather Analyses/Forecasts and Logistic Regression. Part II: Evaluation of Sample Models

    Science.gov (United States)

    Duda, David P.; Minnis, Patrick

    2009-01-01

    Previous studies have shown that probabilistic forecasting may be a useful method for predicting persistent contrail formation. A probabilistic forecast to accurately predict contrail formation over the contiguous United States (CONUS) is created by using meteorological data based on hourly meteorological analyses from the Advanced Regional Prediction System (ARPS) and from the Rapid Update Cycle (RUC) as well as GOES water vapor channel measurements, combined with surface and satellite observations of contrails. Two groups of logistic models were created. The first group of models (SURFACE models) is based on surface-based contrail observations supplemented with satellite observations of contrail occurrence. The second group of models (OUTBREAK models) is derived from a selected subgroup of satellite-based observations of widespread persistent contrails. The mean accuracies for both the SURFACE and OUTBREAK models typically exceeded 75 percent when based on the RUC or ARPS analysis data, but decreased when the logistic models were derived from ARPS forecast data.

  3. Can the lifetime of the superheater tubes be predicted according to the fuel analyses? Assessment from field and laboratory data

    Energy Technology Data Exchange (ETDEWEB)

    Salmenoja, K. [Kvaerner Pulping Oy, Tampere (Finland)

    1998-12-31

    The lifetime of the superheaters in different power boilers is still more or less a mystery. This is especially true when firing biomass-based fuels (biofuels) such as bark, forest residues, and straw. Due to the inhomogeneous nature of biofuels, the lifetime of the superheaters may vary from case to case. Sometimes the lifetime is significantly shorter than originally expected; sometimes no corrosion is observed even in the hottest tubes. This is one of the main reasons why boiler operators often demand better predictability of the corrosion resistance of the materials to avoid unscheduled shutdowns. (orig.) 9 refs.

  4. Evaluation of multivariate statistical analyses for monitoring and prediction of processes in a seawater reverse osmosis desalination plant

    Energy Technology Data Exchange (ETDEWEB)

    Kolluri, Srinivas Sahan; Esfahani, Iman Janghorban; Garikiparthy, Prithvi Sai Nadh; Yoo, Chang Kyoo [Kyung Hee University, Yongin (Korea, Republic of)

    2015-08-15

    Our aim was to analyze, monitor, and predict the outcomes of processes in a full-scale seawater reverse osmosis (SWRO) desalination plant using multivariate statistical techniques. Multivariate analysis of variance (MANOVA) was used to investigate the performance and efficiencies of two SWRO processes, namely, pore controllable fiber filter-reverse osmosis (PCF-SWRO) and sand filtration-ultra filtration-reverse osmosis (SF-UF-SWRO). Principal component analysis (PCA) was applied to monitor the two SWRO processes. PCA monitoring revealed that the SF-UF-SWRO process could be analyzed reliably with a low number of outliers and disturbances. Partial least squares (PLS) analysis was then conducted to predict which of the seven input parameters of feed flow rate, PCF/SF-UF filtrate flow rate, temperature of feed water, feed turbidity, pH, reverse osmosis (RO) flow rate, and pressure had a significant effect on the outcome variables of permeate flow rate and concentration. Root mean squared errors (RMSEs) of the PLS models for permeate flow rates were 31.5 and 28.6 for the PCF-SWRO process and SF-UF-SWRO process, respectively, while RMSEs of permeate concentrations were 350.44 and 289.4, respectively. These results indicate that the SF-UF-SWRO process can be modeled more accurately than the PCF-SWRO process, because the RMSE values of permeate flow rate and concentration obtained using a PLS regression model of the SF-UF-SWRO process were lower than those obtained for the PCF-SWRO process.
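
    A minimal sketch of the PLS step described above, assuming scikit-learn and synthetic placeholder data (seven inputs, two outputs, RMSE as the skill measure); it is not the authors' implementation:

```python
# Hedged sketch of PLS-based process prediction in the spirit of the SWRO
# study above; the data here are synthetic placeholders, not plant data.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 7))  # 7 process inputs (flow, temperature, pH, ...)
Y = X @ rng.normal(size=(7, 2)) + 0.1 * rng.normal(size=(500, 2))  # permeate flow, concentration

X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, random_state=0)
pls = PLSRegression(n_components=3).fit(X_tr, Y_tr)
rmse = np.sqrt(mean_squared_error(Y_te, pls.predict(X_te), multioutput="raw_values"))
print("RMSE per output:", rmse)
```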

  5. Standardized Software for Wind Load Forecast Error Analyses and Predictions Based on Wavelet-ARIMA Models - Applications at Multiple Geographically Distributed Wind Farms

    Energy Technology Data Exchange (ETDEWEB)

    Hou, Zhangshuan; Makarov, Yuri V.; Samaan, Nader A.; Etingov, Pavel V.

    2013-03-19

    Given the multi-scale variability and uncertainty of wind generation and forecast errors, it is a natural choice to use a time-frequency representation (TFR) as a view of the corresponding time series represented over both time and frequency. Here we use the wavelet transform (WT) to expand the signal in terms of wavelet functions which are localized in both time and frequency. Each WT component is more stationary and has a consistent auto-correlation pattern. We combined wavelet analyses with time series forecast approaches such as ARIMA, and tested the approach at three wind farms located far away from each other. The prediction capability is satisfactory: the day-ahead prediction of errors matches the original error values very well, including the patterns. The observations are well located within the predictive intervals. Integrating our wavelet-ARIMA (‘stochastic’) model with the weather forecast model (‘deterministic’) will significantly improve our ability to predict wind power generation and reduce predictive uncertainty.
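
    The wavelet-ARIMA idea can be sketched as follows, assuming the pywt and statsmodels packages; the series, wavelet, decomposition level and ARIMA orders are illustrative placeholders, not the report's configuration:

```python
# Sketch: split a forecast-error series into additive wavelet components,
# fit an ARIMA model to each (more stationary) component, and sum the
# one-step-ahead component forecasts to predict the next error value.
import numpy as np
import pywt
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(1)
errors = np.cumsum(rng.normal(size=256))       # surrogate forecast-error series

level = 3
coeffs = pywt.wavedec(errors, "db4", level=level)
components = []
for k in range(len(coeffs)):
    # Reconstruct component k alone by zeroing all other coefficient bands.
    sel = [c if i == k else np.zeros_like(c) for i, c in enumerate(coeffs)]
    components.append(pywt.waverec(sel, "db4")[: len(errors)])

prediction = sum(ARIMA(comp, order=(2, 0, 1)).fit().forecast(1)[0]
                 for comp in components)
print("one-step-ahead error prediction:", prediction)
```

    Because the wavelet transform is linear, the components sum back to the original series, so forecasting each sub-series separately and adding the results is a legitimate decomposition-based predictor.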

  6. Landscaping analyses of the ROC predictions of discrete-slots and signal-detection models of visual working memory.

    Science.gov (United States)

    Donkin, Chris; Tran, Sophia Chi; Nosofsky, Robert

    2014-10-01

    A fundamental issue concerning visual working memory is whether its capacity limits are better characterized in terms of a limited number of discrete slots (DSs) or a limited amount of a shared continuous resource. Rouder et al. (2008) found that a mixed-attention, fixed-capacity, DS model provided the best explanation of behavior in a change detection task, outperforming alternative continuous signal detection theory (SDT) models. Here, we extend their analysis in two ways: first, with experiments aimed at better distinguishing between the predictions of the DS and SDT models, and second, using a model-based analysis technique called landscaping, in which the functional-form complexity of the models is taken into account. We find that the balance of evidence supports a DS account of behavior in change detection tasks but that the SDT model is best when the visual displays always consist of the same number of items. In our General Discussion section, we outline, but ultimately reject, a number of potential explanations for the observed pattern of results. We finish by describing future research that is needed to pinpoint the basis for this observed pattern of results.

  7. The GENOTEND chip: a new tool to analyse gene expression in muscles of beef cattle for beef quality prediction

    Directory of Open Access Journals (Sweden)

    Hocquette Jean-Francois

    2012-08-01

    validated in the groups of 30 Charolais young bulls slaughtered in year 2, and in the 21 Charolais steers slaughtered in year 1, but not in the group of 19 steers slaughtered in year 2, which differs from the reference group by two factors (gender and year). When the first three groups of animals were analysed together, this subset of genes explained a 4-fold higher proportion of the variability in tenderness than muscle biochemical traits. Conclusion This study underlined the relevance of the GENOTEND chip to identify markers of beef quality, mainly by confirming previous results and by detecting other genes of the heat shock family as potential markers of beef quality. However, it was not always possible to extrapolate the relevance of these markers to all animal groups which differ by several factors (such as gender or environmental conditions of production) from the initial population of reference in which these markers were identified.

  8. Basic Diagnosis and Prediction of Persistent Contrail Occurrence using High-resolution Numerical Weather Analyses/Forecasts and Logistic Regression. Part I: Effects of Random Error

    Science.gov (United States)

    Duda, David P.; Minnis, Patrick

    2009-01-01

    Straightforward application of the Schmidt-Appleman contrail formation criteria to diagnose persistent contrail occurrence from numerical weather prediction data is hindered by significant bias errors in the upper tropospheric humidity. Logistic models of contrail occurrence have been proposed to overcome this problem, but basic questions remain about how random measurement error may affect their accuracy. A set of 5000 synthetic contrail observations is created to study the effects of random error in these probabilistic models. The simulated observations are based on distributions of temperature, humidity, and vertical velocity derived from Advanced Regional Prediction System (ARPS) weather analyses. The logistic models created from the simulated observations were evaluated using two common statistical measures of model accuracy, the percent correct (PC) and the Hanssen-Kuipers discriminant (HKD). To convert the probabilistic results of the logistic models into a dichotomous yes/no choice suitable for the statistical measures, two critical probability thresholds are considered. The HKD scores are higher when the climatological frequency of contrail occurrence is used as the critical threshold, while the PC scores are higher when the critical probability threshold is 0.5. For both thresholds, typical random errors in temperature, relative humidity, and vertical velocity are found to be small enough to allow for accurate logistic models of contrail occurrence. The accuracy of the models developed from synthetic data is over 85 percent for both the prediction of contrail occurrence and non-occurrence, although in practice, larger errors would be anticipated.
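
    For concreteness, the two skill measures named above can be computed from a 2x2 contingency table as follows (a generic sketch, not the study's code; the counts are invented):

```python
# Percent correct (PC) and Hanssen-Kuipers discriminant (HKD) for a
# dichotomous contrail / no-contrail forecast verified against observations.

def skill_scores(hits, misses, false_alarms, correct_negatives):
    n = hits + misses + false_alarms + correct_negatives
    pc = (hits + correct_negatives) / n
    # HKD = hit rate - false-alarm rate; 1 is perfect, 0 is no skill.
    hkd = hits / (hits + misses) - false_alarms / (false_alarms + correct_negatives)
    return pc, hkd

# A probabilistic forecast is first converted to yes/no with a critical
# probability threshold, then the four outcomes are counted.
print(skill_scores(hits=420, misses=80, false_alarms=120, correct_negatives=4380))
```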

  9. Computational fluid dynamics analyses of lateral heat conduction, coolant azimuthal mixing and heat transfer predictions in a BR2 fuel assembly geometry.

    Energy Technology Data Exchange (ETDEWEB)

    Tzanos, C. P.; Dionne, B. (Nuclear Engineering Division)

    2011-05-23

    To support the analyses related to the conversion of the BR2 core from highly-enriched (HEU) to low-enriched (LEU) fuel, the thermal-hydraulics codes PLTEMP and RELAP-3D are used to evaluate the safety margins during steady-state operation (PLTEMP), as well as after a loss-of-flow, loss-of-pressure, or loss-of-coolant event (RELAP). In the 1-D PLTEMP and RELAP simulations, conduction in the azimuthal and axial directions is not accounted for. The very good thermal conductivity of the cladding and the fuel meat and significant temperature gradients in the lateral (axial and azimuthal) directions could lead to a heat flux distribution that is significantly different from the power distribution. To evaluate the significance of the lateral heat conduction, 3-D computational fluid dynamics (CFD) simulations, using the CFD code STAR-CD, were performed. Safety margin calculations are typically performed for a hot stripe, i.e., an azimuthal region of the fuel plates/coolant channel containing the power peak. In a RELAP model, for example, a channel between two plates could be divided into a number of RELAP channels (stripes) in the azimuthal direction. In a PLTEMP model, the effect of azimuthal power peaking could be taken into account by using engineering factors. However, if the thermal mixing in the azimuthal direction of a coolant channel is significant, a striping approach could be overly conservative by not taking this mixing into account. STAR-CD simulations were also performed to study the thermal mixing in the coolant. Section II of this document presents the results of the analyses of the lateral heat conduction and azimuthal thermal mixing in a coolant channel. Finally, PLTEMP and RELAP simulations rely on the use of correlations to determine heat transfer coefficients. Previous analyses showed that the Dittus-Boelter correlation gives significantly more conservative (lower) predictions than the correlations of Sieder-Tate and Petukhov. STAR-CD 3-D

  10. Genetically Predicted Body Mass Index and Breast Cancer Risk: Mendelian Randomization Analyses of Data from 145,000 Women of European Descent

    Science.gov (United States)

    Guo, Yan; Warren Andersen, Shaneda; Shu, Xiao-Ou; Michailidou, Kyriaki; Bolla, Manjeet K.; Wang, Qin; Garcia-Closas, Montserrat; Milne, Roger L.; Schmidt, Marjanka K.; Chang-Claude, Jenny; Dunning, Allison; Bojesen, Stig E.; Ahsan, Habibul; Aittomäki, Kristiina; Andrulis, Irene L.; Anton-Culver, Hoda; Beckmann, Matthias W.; Beeghly-Fadiel, Alicia; Benitez, Javier; Bogdanova, Natalia V.; Bonanni, Bernardo; Børresen-Dale, Anne-Lise; Brand, Judith; Brauch, Hiltrud; Brenner, Hermann; Brüning, Thomas; Burwinkel, Barbara; Casey, Graham; Chenevix-Trench, Georgia; Couch, Fergus J.; Cross, Simon S.; Czene, Kamila; Dörk, Thilo; Dumont, Martine; Fasching, Peter A.; Figueroa, Jonine; Flesch-Janys, Dieter; Fletcher, Olivia; Flyger, Henrik; Fostira, Florentia; Gammon, Marilie; Giles, Graham G.; Guénel, Pascal; Haiman, Christopher A.; Hamann, Ute; Hooning, Maartje J.; Hopper, John L.; Jakubowska, Anna; Jasmine, Farzana; Jenkins, Mark; John, Esther M.; Johnson, Nichola; Jones, Michael E.; Kabisch, Maria; Knight, Julia A.; Koppert, Linetta B.; Kosma, Veli-Matti; Kristensen, Vessela; Le Marchand, Loic; Lee, Eunjung; Li, Jingmei; Lindblom, Annika; Lubinski, Jan; Malone, Kathi E.; Mannermaa, Arto; Margolin, Sara; McLean, Catriona; Meindl, Alfons; Neuhausen, Susan L.; Nevanlinna, Heli; Neven, Patrick; Olson, Janet E.; Perez, Jose I. A.; Perkins, Barbara; Phillips, Kelly-Anne; Pylkäs, Katri; Rudolph, Anja; Santella, Regina; Sawyer, Elinor J.; Schmutzler, Rita K.; Seynaeve, Caroline; Shah, Mitul; Shrubsole, Martha J.; Southey, Melissa C.; Swerdlow, Anthony J.; Toland, Amanda E.; Tomlinson, Ian; Torres, Diana; Truong, Thérèse; Ursin, Giske; Van Der Luijt, Rob B.; Verhoef, Senno; Whittemore, Alice S.; Winqvist, Robert; Zhao, Hui; Zhao, Shilin; Hall, Per; Simard, Jacques; Kraft, Peter; Hunter, David; Easton, Douglas F.; Zheng, Wei

    2016-01-01

    Background Observational epidemiological studies have shown that high body mass index (BMI) is associated with a reduced risk of breast cancer in premenopausal women but an increased risk in postmenopausal women. It is unclear whether this association is mediated through shared genetic or environmental factors. Methods We applied Mendelian randomization to evaluate the association between BMI and risk of breast cancer occurrence using data from two large breast cancer consortia. We created a weighted BMI genetic score comprising 84 BMI-associated genetic variants to predict BMI. We evaluated genetically predicted BMI in association with breast cancer risk using individual-level data from the Breast Cancer Association Consortium (BCAC) (cases = 46,325, controls = 42,482). We further evaluated the association between genetically predicted BMI and breast cancer risk using summary statistics from 16,003 cases and 41,335 controls from the Discovery, Biology, and Risk of Inherited Variants in Breast Cancer (DRIVE) Project. Because most studies measured BMI after cancer diagnosis, we could not conduct a parallel analysis to adequately evaluate the association of measured BMI with breast cancer risk prospectively. Results In the BCAC data, genetically predicted BMI was found to be inversely associated with breast cancer risk (odds ratio [OR] = 0.65 per 5 kg/m2 increase, 95% confidence interval [CI]: 0.56–0.75, p = 3.32 × 10−10). The associations were similar for both premenopausal (OR = 0.44, 95% CI: 0.31–0.62, p = 9.91 × 10−8) and postmenopausal breast cancer (OR = 0.57, 95% CI: 0.46–0.71, p = 1.88 × 10−8). This association was replicated in the data from the DRIVE consortium (OR = 0.72, 95% CI: 0.60–0.84, p = 1.64 × 10−7). Single marker analyses identified 17 of the 84 BMI-associated single nucleotide polymorphisms (SNPs) in association with breast cancer risk at p
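
    A toy sketch of the weighted genetic score idea (synthetic genotypes and effect sizes, not BCAC or DRIVE data; assumes scikit-learn): build the score as a dosage-weighted allele count, then estimate an odds ratio per standard deviation with logistic regression:

```python
# Weighted genetic risk score and its association with case/control status.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n, m = 2000, 84
dosages = rng.integers(0, 3, size=(n, m)).astype(float)  # 0/1/2 risk alleles
weights = rng.normal(0.02, 0.01, size=m)                 # per-allele BMI effects (placeholder)

grs = dosages @ weights                                  # genetically predicted BMI (up to scale)
# Simulate case status inversely related to the score, as the study reports.
p = 1 / (1 + np.exp(0.5 * (grs - grs.mean()) / grs.std()))
cases = rng.random(n) < p

# Odds ratio per 1-SD increase in the score, from logistic regression.
z = ((grs - grs.mean()) / grs.std()).reshape(-1, 1)
model = LogisticRegression().fit(z, cases)
print("OR per SD:", np.exp(model.coef_[0][0]))
```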

  11. Prediction

    CERN Document Server

    Sornette, Didier

    2010-01-01

    This chapter first presents a rather personal view of some different aspects of predictability, going in crescendo from simple linear systems to high-dimensional nonlinear systems with stochastic forcing, which exhibit emergent properties such as phase transitions and regime shifts. Then, a detailed correspondence between the phenomenology of earthquakes, financial crashes and epileptic seizures is offered. The presented statistical evidence provides the substance of a general phase diagram for understanding the many facets of the spatio-temporal organization of these systems. A key insight is to organize the evidence and mechanisms in terms of two summarizing measures: (i) amplitude of disorder or heterogeneity in the system and (ii) level of coupling or interaction strength among the system's components. On the basis of the recently identified remarkable correspondence between earthquakes and seizures, we present detailed information on a class of stochastic point processes that has been found to be particu...

  12. Using meta-analytic path analysis to test theoretical predictions in health behavior: An illustration based on meta-analyses of the theory of planned behavior

    OpenAIRE

    Hagger, Martin; Chan, Dervin K. C.; Protogerou, Cleo; Chatzisarantis, Nikos L. D.

    2016-01-01

    Objective Synthesizing research on social cognitive theories applied to health behavior is an important step in the development of an evidence base of psychological factors as targets for effective behavioral interventions. However, few meta-analyses of research on social cognitive theories in health contexts have conducted simultaneous tests of theoretically-stipulated pattern effects using path analysis. We argue that conducting path analyses of meta-analytic effects among constructs fr...

  13. The Janus-faced nature of time spent on homework : Using latent profile analyses to predict academic achievement over a school year

    NARCIS (Netherlands)

    Flunger, Barbara; Trautwein, Ulrich; Nagengast, Benjamin; Lüdtke, Oliver; Niggli, Alois; Schnyder, Inge

    2015-01-01

    Homework time and achievement are only modestly associated, whereas homework effort has consistently been shown to positively predict later achievement. We argue that time spent on homework can be an important predictor of achievement when combined with measures of homework effort. Latent profile an

  14. Logistic Regression Analyses for Predicting Clinically Important Differences in Motor Capacity, Motor Performance, and Functional Independence after Constraint-Induced Therapy in Children with Cerebral Palsy

    Science.gov (United States)

    Wang, Tien-ni; Wu, Ching-yi; Chen, Chia-ling; Shieh, Jeng-yi; Lu, Lu; Lin, Keh-chung

    2013-01-01

    Given the growing evidence for the effects of constraint-induced therapy (CIT) in children with cerebral palsy (CP), there is a need for investigating the characteristics of potential participants who may benefit most from this intervention. This study aimed to establish predictive models for the effects of pediatric CIT on motor and functional…

  15. A systematic study of coordinate precision in X-ray structure analyses. Pt. 2. Predictive estimates of E.S.D.'s for the general-atom case

    International Nuclear Information System (INIS)

    The relationship between the mean isotropic e.s.d. σ̄(A)_o of any element type A in a crystal structure and the R factor and atomic constitution of that structure is explored for 124 905 element-type occurrences calculated from 33 955 entries in the Cambridge Structural Database. On the basis of the work of Cruickshank [Acta Cryst. (1960), 13, 774-777], it is shown that σ̄(A)_p values can be estimated by equations of the form σ̄(A)_p = K·R·N_c^(1/2)/Z_A, where N_c is taken as Σ Z_i²/Z_C², the Z_i are atomic numbers and the summation is over all atoms in the asymmetric unit. Values of K were obtained by regression techniques using the σ̄(A)_o as basis. The constant K_nc for noncentrosymmetric structures is found to be larger than K_c for centrosymmetric structures by a factor of ∼2^(1/2), as predicted by Cruickshank (1960). Two predictive equations are generated, one for first-row elements and the second for elements with Z_A > 10. The relationship between the different constants K that arise in these two situations is linked to shape differentials in scattering-factor (f_i) curves for light and heavy atoms. It is found that predictive equations in which the Z_i are selectively replaced by f_i at a constant sinθ/λ of 0.30 Å⁻¹ generate closely similar values of K for the light-atom and heavy-atom subsets. The overall analysis indicates that atomic e.s.d.'s may be seriously underestimated in the more precise structure determinations, that e.s.d.'s for the heaviest atoms may be less reliable than those for lighter atoms and that e.s.d.'s in noncentrosymmetric structures may be less accurate than those in centrosymmetric structures. (orig.)
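
    Rendered as LaTeX, the reconstructed relation reads (symbols as defined in the abstract):

```latex
% Cruickshank-type predictive estimate of the mean isotropic e.s.d. of
% element type A, as reconstructed from the abstract above.
\begin{equation}
  \bar{\sigma}(A)_p \;=\; \frac{K \, R \, N_c^{1/2}}{Z_A},
  \qquad
  N_c \;=\; \frac{\sum_i Z_i^{2}}{Z_C^{2}},
\end{equation}
% with K_{nc} \approx \sqrt{2}\, K_{c} for noncentrosymmetric versus
% centrosymmetric structures.
```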

  16. Barriers to predicting changes in global terrestrial methane fluxes: analyses using CLM4Me, a methane biogeochemistry model integrated in CESM

    Directory of Open Access Journals (Sweden)

    W. J. Riley

    2011-07-01

    Full Text Available Terrestrial net CH4 surface fluxes often represent the difference between much larger gross production and consumption fluxes and depend on multiple physical, biological, and chemical mechanisms that are poorly understood and represented in regional- and global-scale biogeochemical models. To characterize uncertainties, study feedbacks between CH4 fluxes and climate, and to guide future model development and experimentation, we developed and tested a new CH4 biogeochemistry model (CLM4Me) integrated in the land component (Community Land Model; CLM4) of the Community Earth System Model (CESM1). CLM4Me includes representations of CH4 production, oxidation, aerenchyma transport, ebullition, aqueous and gaseous diffusion, and fractional inundation. As with most global models, CLM4 lacks important features for predicting current and future CH4 fluxes, including: vertical representation of soil organic matter, accurate subgrid-scale hydrology, realistic representation of inundated-system vegetation, anaerobic decomposition, thermokarst dynamics, and aqueous chemistry. We compared the seasonality and magnitude of predicted CH4 emissions to observations from 18 sites and three global atmospheric inversions. Simulated net CH4 emissions using our baseline parameter set were 270, 160, 50, and 70 Tg CH4 yr−1 globally, in the tropics, in the temperate zone, and north of 45° N, respectively; these values are within the range of previous estimates. We then used the model to characterize the sensitivity of regional and global CH4 emission estimates to uncertainties in model parameterizations. Of the parameters we tested, the temperature sensitivity of CH4 production, oxidation parameters, and aerenchyma properties had the largest impacts on net CH4 emissions, up to a factor of 4 and 10 at the regional and gridcell scales

  17. Barriers to predicting changes in global terrestrial methane fluxes: analyses using CLM4Me, a methane biogeochemistry model integrated in CESM

    Directory of Open Access Journals (Sweden)

    W. J. Riley

    2011-02-01

    Full Text Available Terrestrial net CH4 surface fluxes often represent the difference between much larger gross production and consumption fluxes and depend on multiple physical, biological, and chemical mechanisms that are poorly understood and represented in regional- and global-scale biogeochemical models. To characterize uncertainties, study feedbacks between CH4 fluxes and climate, and to guide future model development and experimentation, we developed and tested a new CH4 biogeochemistry model (CLM4Me) integrated in the land component (Community Land Model; CLM4) of the Community Earth System Model (CESM1). CLM4Me includes representations of CH4 production, oxidation, aerenchyma transport, ebullition, aqueous and gaseous diffusion, and fractional inundation. As with most global models, CLM4Me lacks important features for predicting current and future CH4 fluxes, including: vertical representation of soil organic matter, accurate subgrid-scale hydrology, realistic representation of inundated-system vegetation, anaerobic decomposition, thermokarst dynamics, and aqueous chemistry. We compared the seasonality and magnitude of predicted CH4 emissions to observations from 18 sites and three global atmospheric inversions. Simulated net CH4 emissions using our baseline parameter set were 270, 160, 50, and 70 Tg CH4 yr−1 globally, in the tropics, temperate zone, and north of 45° N, respectively; these values are within the range of previous estimates. We then used the model to characterize the sensitivity of regional and global CH4 emission estimates to uncertainties in model parameterizations. Of the parameters we tested, the temperature sensitivity of CH4 production, oxidation parameters, and aerenchyma properties had the largest impacts on net CH4 emissions, up to a factor of 4 and 10 at the regional and gridcell

  18. Generation of a predicted protein database from EST data and application to iTRAQ analyses in grape (Vitis vinifera cv. Cabernet Sauvignon berries at ripening initiation

    Directory of Open Access Journals (Sweden)

    Smith Derek

    2009-01-01

    Full Text Available Abstract Background iTRAQ is a proteomics technique that uses isobaric tags for relative and absolute quantitation of tryptic peptides. In proteomics experiments, the detection and high-confidence annotation of proteins and the significance of corresponding expression differences can depend on the quality and the species specificity of the tryptic peptide map database used for analysis of the data. For species for which finished genome sequence data are not available, identification of proteins relies on similarity to proteins from other species using comprehensive peptide map databases such as the MSDB. Results We were interested in characterizing ripening initiation ('veraison') in grape berries at the protein level in order to better define the molecular control of this important process for grape growers and wine makers. We developed a bioinformatic pipeline for processing EST data in order to produce a predicted tryptic peptide database specifically targeted to the wine grape cultivar Vitis vinifera cv. Cabernet Sauvignon, and lacking truncated N- and C-terminal fragments. By searching iTRAQ MS/MS data generated from berry exocarp and mesocarp samples at ripening initiation, we determined that implementation of the custom database afforded a large improvement in high-confidence peptide annotation in comparison to the MSDB. We used iTRAQ MS/MS in conjunction with custom peptide database searches to quantitatively characterize several important pathway components for berry ripening previously described at the transcriptional level and confirmed expression patterns for these at the protein level. Conclusion We determined that a predicted peptide database for MS/MS applications can be derived from EST data using advanced clustering and trimming approaches and successfully implemented for quantitative proteome profiling. Quantitative shotgun proteome profiling holds great promise for characterizing biological processes such as fruit ripening

  19. Comparative analyses of genetic risk prediction methods reveal extreme diversity of genetic predisposition to nonalcoholic fatty liver disease (NAFLD) among ethnic populations of India

    Indian Academy of Sciences (India)

    Ankita Chatterjee; Analabha Basu; Abhijit Chowdhury; Kausik Das; Neeta Sarkar-Roy; Partha P. Majumder; Priyadarshi Basu

    2015-03-01

    Nonalcoholic fatty liver disease (NAFLD) is a distinct pathologic condition characterized by a disease spectrum ranging from simple steatosis to steato-hepatitis, cirrhosis and hepatocellular carcinoma. Prevalence of NAFLD varies in different ethnic groups, ranging from 12% in Chinese to 45% in Hispanics. Among Indian populations, the diversity in prevalence is high, ranging from 9% in rural populations to 32% in urban populations, with geographic differences as well. Here, we wished to find out if this difference is reflected in their genetic makeup. To date, several candidate gene studies and a few genome-wide association studies (GWAS) have been carried out, and many associations between single nucleotide polymorphisms (SNPs) and NAFLD have been observed. In this study, the risk allele frequencies (RAFs) of NAFLD-associated SNPs in 20 Indian ethnic populations (376 individuals) were analysed. We used two different measures for calculating genetic risk scores and compared their performance. The correlation of additive risk scores of NAFLD for three HapMap populations with their weighted mean prevalence was found to be high (r² = 0.93). Later we used this method to compare NAFLD risk among ethnic Indian populations. Based on our observations, the Indian caste populations have high risk scores, comparable to those of Caucasians, who are often used as surrogates for Indian caste populations in disease gene association studies, and significantly higher than those of the Indian tribal populations.

  20. In Vitro and in Silico Analyses for Predicting Hepatic Cytochrome P450-Dependent Metabolic Potencies of Polychlorinated Biphenyls in the Baikal Seal.

    Science.gov (United States)

    Yoo, Jean; Hirano, Masashi; Mizukawa, Hazuki; Nomiyama, Kei; Agusa, Tetsuro; Kim, Eun-Young; Tanabe, Shinsuke; Iwata, Hisato

    2015-12-15

    The aim of this study was to understand the cytochrome P450 (CYP)-dependent metabolic pathway and potency of polychlorinated biphenyls (PCBs) in the Baikal seal (Pusa sibirica). In vitro metabolism of 62 PCB congener mixtures was investigated by using liver microsomes of this species. A decreased ratio of over 20% was observed for CB3, CB4, CB8, CB15, CB19, CB22, CB37, CB54, CB77, and CB105, suggesting the preferential metabolism of low-chlorinated PCBs by CYPs. The highly activated metabolic pathways in Baikal seals that were predicted from the decreased PCBs and detected hydroxylated PCBs (OH-PCBs) were CB22 to 4'OH-CB20 and CB77 to 4'OH-CB79. The total amount of OH-PCBs detected as identified and unidentified congeners accounted for only 3.8 ± 1.7 mol% of the loaded PCBs, indicating many unknown PCB metabolic pathways. To explore factors involved in CYP-dependent PCB metabolism, we examined the relationships among the structural and physicochemical properties of PCBs, the in silico PCB-CYP docking parameters, and the in vitro PCB decrease ratios by principal component analysis. Statistical analysis showed that the decrease ratio of a PCB was at least partly accounted for by its substituted chlorine number and the distance from the Cl-unsubstituted carbon of docked PCBs to the heme Fe in CYP2A and 2B. PMID:26579933

  1. Effects of pharmacists' interventions on appropriateness of prescribing and evaluation of the instruments' (MAI, STOPP and START) ability to predict hospitalization--analyses from a randomized controlled trial.

    Directory of Open Access Journals (Sweden)

    Ulrika Gillespie

    Full Text Available BACKGROUND: Appropriateness of prescribing can be assessed by various measures and screening instruments. The aims of this study were to investigate the effects of pharmacists' interventions on appropriateness of prescribing in elderly patients, and to explore the relationship between these results and hospital care utilization during a 12-month follow-up period. METHODS: The study population from a previous randomized controlled study, in which the effects of a comprehensive pharmacist intervention on re-hospitalization were investigated, was used. The criteria from the instruments MAI, STOPP and START were applied retrospectively to the 368 study patients (intervention group (I) n = 182, control group (C) n = 186). The assessments were done on admission and at discharge to detect differences over time and between the groups. Hospital care consumption was recorded, and the association between scores for appropriateness and hospitalization was analysed. RESULTS: The number of Potentially Inappropriate Medicines (PIMs) per patient as identified by STOPP was reduced for I but not for C (1.42 to 0.93 vs. 1.46 to 1.66, respectively; p<0.01). The number of Potential Prescription Omissions (PPOs) per patient as identified by START was reduced for I but not for C (0.36 to 0.09 vs. 0.42 to 0.45, respectively; p<0.001). The summated score for MAI was reduced for I but not for C (8.5 to 5.0 vs. 8.7 to 10.0, respectively; p<0.001). There was a positive association between scores for MAI and STOPP and drug-related readmissions (RR 8-9% and 30-34%, respectively). No association was detected between the scores of the tools and total re-visits to hospital. CONCLUSION: The interventions significantly improved the appropriateness of prescribing for patients in the intervention group, as evaluated by the instruments MAI, STOPP and START. High scores in MAI and STOPP were associated with a higher number of drug-related readmissions.

  2. A new tool for prediction and analysis of thermal comfort in steady and transient states

    Energy Technology Data Exchange (ETDEWEB)

    Megri, A.Ch. [Illinois Institute of Technology, Civil and Architectural Engineering Dept., Chicago, Illinois (United States); Megri, A.F. [Centre Universitaire de Tebessa, Dept. d' Electronique (Algeria); El Naqa, I. [Washington Univ., School of Medicine, Dept. of Radiation Oncology, Saint Louis, Missouri (United States); Achard, G. [Universite de Savoie, Lab. Optimisation de la Conception et Ingenierie de L' Environnement (LOCIE) - ESIGEC, 73 - Le Bourget du Lac (France)

    2006-02-15

    Thermal comfort is influenced by psychological as well as physiological factors. This paper proposes the use of support vector machine (SVM) learning for automated prediction of human thermal comfort in steady and transient states. The SVM is an artificial intelligence approach that can capture the input/output mapping from the given data. Support vector machines were developed based on the Structural Risk Minimization principle. Different sets of representative experimental environmental factors that affect a homogeneous person's thermal balance were used for training the SVM. The SVM is a very efficient, fast, and accurate technique to identify thermal comfort. This technique permits the determination of thermal comfort indices for different sub-categories of people (e.g., the sick and the elderly) in extreme climatic conditions, when the experimental data for such a sub-category are available. The experimental data have been used for the learning and testing processes. The results show a good correlation between SVM-predicted values and those obtained from conventional thermal comfort models, such as the Fanger and Gagge models. The 'trained machine' with representative data could be used easily and effectively in comparison with other conventional estimation methods of different indices. (author)
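
    The following sketch shows the flavor of the approach with scikit-learn's SVR on synthetic environmental data; the feature set, ranges and target formula are invented stand-ins for the experimental factors and comfort votes described above:

```python
# Learn a mapping from environmental factors to a PMV-like comfort index.
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
# Columns: air temperature (C), mean radiant temperature (C), air speed (m/s),
# relative humidity (%), metabolic rate (met), clothing insulation (clo).
X = rng.uniform([18, 18, 0.0, 20, 0.8, 0.3],
                [32, 32, 1.0, 80, 2.0, 1.5], size=(800, 6))
# Placeholder comfort index standing in for panel votes / Fanger's PMV.
y = (0.3 * (X[:, 0] - 24) + 0.1 * (X[:, 1] - 24) - 1.2 * X[:, 2]
     + 0.01 * (X[:, 3] - 50) + 0.5 * (X[:, 4] - 1.2) - 0.8 * (X[:, 5] - 0.7))

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
model.fit(X, y)
print("predicted comfort index:", model.predict([[25, 25, 0.2, 50, 1.2, 0.7]])[0])
```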

  3. Network class superposition analyses.

    Science.gov (United States)

    Pearson, Carl A B; Zeng, Chen; Simha, Rahul

    2013-01-01

    Networks are often used to understand a whole system by modeling the interactions among its pieces. Examples include biomolecules in a cell interacting to provide some primary function, or species in an environment forming a stable community. However, these interactions are often unknown; instead, the pieces' dynamic states are known, and network structure must be inferred. Because observed function may be explained by many different networks (e.g., ≈ 10^30 for the yeast cell cycle process), considering dynamics beyond this primary function means picking a single network or suitable sample: measuring over all networks exhibiting the primary function is computationally infeasible. We circumvent that obstacle by calculating the network class ensemble. We represent the ensemble by a stochastic matrix T, which is a transition-by-transition superposition of the system dynamics for each member of the class. We present concrete results for T derived from boolean time series dynamics on networks obeying the Strong Inhibition rule, by applying T to several traditional questions about network dynamics. We show that the distribution of the number of point attractors can be accurately estimated with T. We show how to generate Derrida plots based on T. We show that T-based Shannon entropy outperforms other methods at selecting experiments to further narrow the network structure. We also outline an experimental test of predictions based on T. We motivate all of these results in terms of a popular molecular biology boolean network model for the yeast cell cycle, but the methods and analyses we introduce are general. We conclude with open questions for T, for example, application to other models, computational considerations when scaling up to larger systems, and other potential analyses. PMID:23565141
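
    A toy version of the superposition construction (three-node Boolean networks standing in for the paper's yeast-cycle class; the class definition here is invented purely for illustration):

```python
# Average the deterministic transition matrices of every Boolean network in
# a small class into one stochastic matrix T, then query T directly.
import numpy as np
from itertools import product

n = 3                                   # 3-node Boolean networks: 8 states
n_states = 2 ** n

def transition_matrix(rule):
    """Deterministic state-transition matrix for one network (one rule set)."""
    T = np.zeros((n_states, n_states))
    for s in range(n_states):
        bits = [(s >> i) & 1 for i in range(n)]
        nxt = sum(rule(i, bits) << i for i in range(n))
        T[s, nxt] = 1.0
    return T

# A tiny "class": cyclic networks where each node copies or negates a neighbor.
def make_rule(flips):
    return lambda i, bits: bits[(i + 1) % n] ^ flips[i]

members = [transition_matrix(make_rule(f)) for f in product([0, 1], repeat=n)]
T = np.mean(members, axis=0)            # superposition over the whole class

# T is row-stochastic; e.g., Shannon entropy of the next-state distribution.
row = T[0]
print("H(next state | state 0) =", -(row[row > 0] * np.log2(row[row > 0])).sum())
```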

  4. Sproglig Metode og Analyse

    DEFF Research Database (Denmark)

    le Fevre Jakobsen, Bjarne

    The publication contains exercise materials, texts, PowerPoint presentations and handouts for the course Sproglig Metode og Analyse (Linguistic Method and Analysis) at BA level and as an elective in Danish/Nordic studies, 2010-2011.

  5. Uncertainty and Sensitivity Analyses Plan

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, J.C.; Ramsdell, J.V. Jr.

    1993-04-01

    Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project.
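
    The workflow the plan describes can be caricatured in a few lines (a toy dose model with invented distributions, not the HEDRIC codes; assumes NumPy and SciPy): sample the uncertain parameters, propagate them through the model, then rank each parameter's influence on the output.

```python
# Toy uncertainty propagation and sensitivity ranking for a dose estimate.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(5)
n = 10_000
release = rng.lognormal(mean=1.0, sigma=0.5, size=n)   # source term
dispersion = rng.uniform(0.1, 1.0, size=n)             # transport factor
uptake = rng.normal(0.5, 0.1, size=n).clip(0.0)        # exposure pathway

dose = release * dispersion * uptake                   # toy dose model

print("dose 5th-95th percentile:", np.percentile(dose, [5, 95]))
# Simple sensitivity measure: rank correlation of each input with the output.
for name, x in [("release", release), ("dispersion", dispersion), ("uptake", uptake)]:
    print(name, "Spearman rho = %.2f" % spearmanr(x, dose).correlation)
```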

  6. Uncertainty and Sensitivity Analyses Plan

    International Nuclear Information System (INIS)

    Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project

  7. Laser Beam Focus Analyser

    DEFF Research Database (Denmark)

    Nielsen, Peter Carøe; Hansen, Hans Nørgaard; Olsen, Flemming Ove;

    2007-01-01

    The quantitative and qualitative description of laser beam characteristics is important for process implementation and optimisation. In particular, a need for quantitative characterisation of beam diameter was identified when using fibre lasers for micro manufacturing. Here the beam diameter limits the obtainable features in direct laser machining as well as heat affected zones in welding processes. This paper describes the development of a measuring unit capable of analysing beam shape and diameter of lasers to be used in manufacturing processes. The analyser is based on the principle of a rotating mechanical wire being swept through the laser beam at varying Z-heights. The reflected signal is analysed and the resulting beam profile determined. The development comprised the design of a flexible fixture capable of providing both rotation and Z-axis movement, control software including data capture...

  8. Report sensory analyses veal

    OpenAIRE

    Veldman, M.; Schelvis-Smit, A.A.M.

    2005-01-01

    On behalf of a client of Animal Sciences Group, different varieties of veal were analyzed by both instrumental and sensory analyses. The sensory evaluation was performed with a sensory analytical panel in the period of 13th of May to 31st of May, 2005. The three varieties of veal were: young bull, pink veal and white veal. The sensory descriptive analyses show that the three groups (young bull, pink veal and white veal) differ significantly in red colour for the raw meat as well as the baked...

  9. Meta-analyses

    NARCIS (Netherlands)

    Hendriks, M.A.; Luyten, J.W.; Scheerens, J.; Sleegers, P.J.C.; Scheerens, J.

    2014-01-01

    In this chapter results of a research synthesis and quantitative meta-analyses of three facets of time effects in education are presented, namely time at school during regular lesson hours, homework, and extended learning time. The number of studies for these three facets of time that could be used

  10. Probabilistic safety analyses (PSA)

    International Nuclear Information System (INIS)

    The guide shows how probabilistic safety analyses (PSA) are used in the design, construction and operation of light water reactor plants in order to help ensure that the safety of the plant is adequate in all plant operational states.

  11. Wavelet Analyses and Applications

    Science.gov (United States)

    Bordeianu, Cristian C.; Landau, Rubin H.; Paez, Manuel J.

    2009-01-01

    It is shown how a modern extension of Fourier analysis known as wavelet analysis is applied to signals containing multiscale information. First, a continuous wavelet transform is used to analyse the spectrum of a nonstationary signal (one whose form changes in time). The spectral analysis of such a signal gives the strength of the signal in each…

  12. Report sensory analyses veal

    NARCIS (Netherlands)

    Veldman, M.; Schelvis-Smit, A.A.M.

    2005-01-01

    On behalf of a client of Animal Sciences Group, different varieties of veal were analyzed by both instrumental and sensory analyses. The sensory evaluation was performed with a sensory analytical panel in the period of 13th of May and 31st of May, 2005. The three varieties of veal were: young bull,

  13. Possible future HERA analyses

    CERN Document Server

    Geiser, Achim

    2015-01-01

    A variety of possible future analyses of HERA data in the context of the HERA data preservation programme is collected, motivated, and commented. The focus is placed on possible future analyses of the existing $ep$ collider data and their physics scope. Comparisons to the original scope of the HERA programme are made, and cross references to topics also covered by other participants of the workshop are given. This includes topics on QCD, proton structure, diffraction, jets, hadronic final states, heavy flavours, electroweak physics, and the application of related theory and phenomenology topics like NNLO QCD calculations, low-x related models, nonperturbative QCD aspects, and electroweak radiative corrections. Synergies with other collider programmes are also addressed. In summary, the range of physics topics which can still be uniquely covered using the existing data is very broad and of considerable physics interest, often matching the interest of results from colliders currently in operation. Due to well-e...

  14. Statistisk analyse med SPSS

    OpenAIRE

    Linnerud, Kristin; Oklevik, Ove; Slettvold, Harald

    2004-01-01

    This note has its origin in lectures and teaching for third-year students in economics and business administration at Sogn og Fjordane University College. It is aimed in particular at the SPSS teaching in the two courses "OR 685 Marknadsanalyse og merkevarestrategi" (Market Analysis and Brand Strategy) and "BD 616 Økonomistyring og analyse med programvare" (Management Accounting and Analysis with Software).

  15. Biomass feedstock analyses

    Energy Technology Data Exchange (ETDEWEB)

    Wilen, C.; Moilanen, A.; Kurkela, E. [VTT Energy, Espoo (Finland). Energy Production Technologies

    1996-12-31

    The overall objectives of the project 'Feasibility of electricity production from biomass by pressurized gasification systems' within the EC Research Programme JOULE II were to evaluate the potential of advanced power production systems based on biomass gasification and to study the technical and economic feasibility of these new processes with different types of biomass feedstocks. This report was prepared as part of this R&D project. The objectives of this task were to perform fuel analyses of potential woody and herbaceous biomasses with specific regard to the gasification properties of the selected feedstocks. The analyses of 15 Scandinavian and European biomass feedstocks included density, proximate and ultimate analyses, trace compounds, ash composition and fusion behaviour in oxidizing and reducing atmospheres. The wood-derived fuels, such as whole-tree chips, forest residues, bark and to some extent willow, can be expected to have good gasification properties. Difficulties caused by ash fusion and sintering in straw combustion and gasification are generally known. The ash and alkali metal contents of the European biomasses harvested in Italy resembled those of the Nordic straws, and it is expected that they behave to a great extent like straw in gasification. No direct relation between the ash fusion behaviour (determined according to the standard method) and, for instance, the alkali metal content was found in the laboratory determinations. A more profound characterisation of the fuels would require gasification experiments in a thermobalance and a PDU (Process Development Unit) rig. (orig.) (10 refs.)

  16. Possible future HERA analyses

    Energy Technology Data Exchange (ETDEWEB)

    Geiser, Achim

    2015-12-15

    A variety of possible future analyses of HERA data in the context of the HERA data preservation programme is collected, motivated, and commented. The focus is placed on possible future analyses of the existing ep collider data and their physics scope. Comparisons to the original scope of the HERA programme are made, and cross references to topics also covered by other participants of the workshop are given. This includes topics on QCD, proton structure, diffraction, jets, hadronic final states, heavy flavours, electroweak physics, and the application of related theory and phenomenology topics like NNLO QCD calculations, low-x related models, nonperturbative QCD aspects, and electroweak radiative corrections. Synergies with other collider programmes are also addressed. In summary, the range of physics topics which can still be uniquely covered using the existing data is very broad and of considerable physics interest, often matching the interest of results from colliders currently in operation. Due to well-established data and MC sets, calibrations, and analysis procedures the manpower and expertise needed for a particular analysis is often very much smaller than that needed for an ongoing experiment. Since centrally funded manpower to carry out such analyses is not available any longer, this contribution not only targets experienced self-funded experimentalists, but also theorists and master-level students who might wish to carry out such an analysis.

  17. THOR Turbulence Electron Analyser: TEA

    Science.gov (United States)

    Fazakerley, Andrew; Moore, Tom; Owen, Chris; Pollock, Craig; Wicks, Rob; Samara, Marilia; Rae, Jonny; Hancock, Barry; Kataria, Dhiren; Rust, Duncan

    2016-04-01

    Turbulence Heating ObserveR (THOR) is the first mission ever flown in space dedicated to plasma turbulence. The Turbulence Electron Analyser (TEA) will measure the plasma electron populations in the mission's Regions of Interest. It will collect a 3D electron velocity distribution with cadences as short as 5 ms. The instrument will be capable of measuring energies up to 30 keV. TEA consists of multiple electrostatic analyser heads arranged so as to measure electrons arriving from look directions covering the full sky, i.e. 4 pi solid angle. The baseline concept is similar to the successful FPI-DES instrument currently operating on the MMS mission. TEA is intended to have a similar angular resolution, but a larger geometric factor. In comparison to earlier missions, TEA improves on the measurement cadence. For example, MMS FPI-DES routinely operates at 30 ms cadence. The objective of measuring distributions at rates as fast as 5 ms is driven by the mission's scientific requirements to resolve electron gyroscale size structures, where plasma heating and fluctuation dissipation is predicted to occur. TEA will therefore be capable of making measurements of the evolution of distribution functions across thin (a few km) current sheets travelling past the spacecraft at up to 600 km/s, of the Power Spectral Density of fluctuations of electron moments and of distributions fast enough to match frequencies with waves expected to be dissipating turbulence (e.g. with 100 Hz whistler waves).

  18. Digital differential analysers

    CERN Document Server

    Shilejko, A V; Higinbotham, W

    1964-01-01

    Digital Differential Analysers presents the principles, operations, design, and applications of digital differential analyzers, machines with the ability to represent initial quantities and the possibility of dividing them into separate functional units performing a number of basic mathematical operations. The book discusses the theoretical principles underlying the operation of digital differential analyzers, such as the use of the delta-modulation method and function-generator units. Digital integration methods and the classes of digital differential analyzer designs are also reviewed. The te

  19. Wavelet analyses and applications

    Energy Technology Data Exchange (ETDEWEB)

    Bordeianu, Cristian C [Faculty of Physics, University of Bucharest, Bucharest, RO 077125 (Romania); Landau, Rubin H [Department of Physics, Oregon State University, Corvallis, OR 97331 (United States); Paez, Manuel J [Department of Physics, University of Antioquia, Medellin (Colombia)], E-mail: cristian.bordeianu@brahms.fizica.unibuc.ro, E-mail: rubin@science.oregonstate.edu, E-mail: mpaez@fisica.udea.edu.co

    2009-09-15

    It is shown how a modern extension of Fourier analysis known as wavelet analysis is applied to signals containing multiscale information. First, a continuous wavelet transform is used to analyse the spectrum of a nonstationary signal (one whose form changes in time). The spectral analysis of such a signal gives the strength of the signal in each frequency as a function of time. Next, the theory is specialized to discrete values of time and frequency, and the resulting discrete wavelet transform is shown to be useful for data compression. This paper is addressed to a broad community, from undergraduate to graduate students to general physicists and to specialists in other fields than wavelets.
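
    A minimal sketch of the compression use-case described above, assuming the third-party PyWavelets package (the paper does not specify an implementation): decompose a nonstationary signal, keep only the largest coefficients, reconstruct, and compare the error against the compression ratio.

    ```python
    # Discrete wavelet transform used for data compression, as in the paper's
    # second part. Implementation choices here (db4 wavelet, 5 levels, hard
    # thresholding at the 90th percentile) are illustrative assumptions.
    import numpy as np
    import pywt

    t = np.linspace(0, 1, 1024)
    signal = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 40 * t**2)  # nonstationary

    coeffs = pywt.wavedec(signal, 'db4', level=5)        # discrete wavelet transform
    flat = np.concatenate(coeffs)
    thresh = np.quantile(np.abs(flat), 0.90)             # keep ~10% of coefficients
    kept = [pywt.threshold(c, thresh, mode='hard') for c in coeffs]

    recon = pywt.waverec(kept, 'db4')[:signal.size]
    ratio = flat.size / np.count_nonzero(np.concatenate(kept))
    print(f"compression ~{ratio:.1f}x, "
          f"RMS error {np.sqrt(np.mean((signal - recon)**2)):.4f}")
    ```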

  20. Systemdynamisk analyse av vannkraftsystem

    OpenAIRE

    Rydning, Anja

    2007-01-01

    In this thesis, a dynamic analysis of the Fortun hydropower plant has been carried out. Three phenomena are considered in particular: surge oscillations between the surge shaft and the reservoir, pressure surge at the turbine caused by retardation pressure when the turbine discharge changes, and governing stability. The surge oscillations and pressure surges are computed analytically from the continuity and momentum equations. Models of the Fortun plant were built to compute the pressure surges and surge oscillations. A model ...

  1. AMS analyses at ANSTO

    Energy Technology Data Exchange (ETDEWEB)

    Lawson, E.M. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia). Physics Division

    1998-03-01

    The major use of ANTARES is Accelerator Mass Spectrometry (AMS) with {sup 14}C being the most commonly analysed radioisotope - presently about 35% of the available beam time on ANTARES is used for {sup 14}C measurements. The accelerator measurements are supported by, and dependent on, a strong sample preparation section. The ANTARES AMS facility supports a wide range of investigations into fields such as global climate change, ice cores, oceanography, dendrochronology, anthropology, and classical and Australian archaeology. Described here are some examples of the ways in which AMS has been applied to support research into the archaeology, prehistory and culture of this continent's indigenous Aboriginal peoples. (author)

  2. Analysis of K-net and Kik-net data: implications for ground motion prediction - acceleration time histories, response spectra and nonlinear site response; Analyse des donnees accelerometriques de K-net et Kik-net: implications pour la prediction du mouvement sismique - accelerogrammes et spectres de reponse - et la prise en compte des effets de site non-lineaire

    Energy Technology Data Exchange (ETDEWEB)

    Pousse, G

    2005-10-15

    This thesis aims to characterize ground motion during earthquakes. The work is based on two Japanese networks and deals with databases of shallow events (depth less than 25 km) with magnitudes between 4.0 and 7.3. The analysis of K-net data allows a spectral ground-motion prediction equation to be computed and the shape of the Eurocode 8 design spectra to be reviewed. We show the larger amplification at short periods for the Japanese data and bring to light the soil amplification that takes place at long periods. In addition, we develop a new empirical model for simulating synthetic stochastic nonstationary acceleration time histories. By specifying magnitude, distance and site effect, this model can produce many of the time histories that a seismic event is liable to generate at the place of interest. Furthermore, the study of near-field borehole records from Kik-net allows the validity domain of the predictive equations to be explored and clarifies what happens when ground-motion predictions are extrapolated. Finally, we show that nonlinearity reduces the dispersion of ground motion at the surface. (author)
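
    For orientation, a spectral ground-motion prediction equation of the general kind fitted to K-net data has the flavour sketched below; the functional form and all coefficient values are invented for illustration and are not the ones derived in the thesis.

    ```python
    # Generic illustration of a ground-motion prediction equation (GMPE):
    # log amplitude linear in magnitude, geometric spreading in distance.
    # Coefficients a, b, c and pseudo-depth h are placeholders, NOT fitted values.
    import math

    def ln_sa(magnitude: float, r_km: float,
              a=-1.0, b=1.2, c=-1.4, h=8.0) -> float:
        """ln of spectral acceleration for magnitude M at distance R (km)."""
        return a + b * magnitude + c * math.log(math.sqrt(r_km**2 + h**2))

    for m in (4.0, 5.5, 7.3):   # magnitude range quoted in the abstract
        print(m, [round(math.exp(ln_sa(m, r)), 4) for r in (10, 50, 200)])
    ```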

  3. EEG analyses with SOBI.

    Energy Technology Data Exchange (ETDEWEB)

    Glickman, Matthew R.; Tang, Akaysha (University of New Mexico, Albuquerque, NM)

    2009-02-01

    The motivating vision behind Sandia's MENTOR/PAL LDRD project has been that of systems which use real-time psychophysiological data to support and enhance human performance, both of individuals and of groups. Relevant and significant psychophysiological data being a necessary prerequisite to such systems, this LDRD has focused on identifying and refining such signals. The project has focused in particular on EEG (electroencephalogram) data as a promising candidate signal because it (potentially) provides a broad window on brain activity with relatively low cost and logistical constraints. We report here on two analyses performed on EEG data collected in this project using the SOBI (Second Order Blind Identification) algorithm to identify two independent sources of brain activity: one in the frontal lobe and one in the occipital lobe. The first study looks at directional influences between the two components, while the second study looks at inferring gender based upon the frontal component.
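
    SOBI itself jointly diagonalizes whitened covariance matrices at many time lags; the sketch below implements AMUSE, a single-lag second-order relative, to convey the idea. The demo data and mixing matrix are synthetic, and this is not Sandia's analysis pipeline.

    ```python
    # Minimal AMUSE-style sketch of second-order blind source separation:
    # whiten the data, then eigendecompose one symmetrized lagged covariance.
    import numpy as np

    def amuse(x: np.ndarray, lag: int = 1) -> np.ndarray:
        """x: (channels, samples) array. Returns estimated source time courses."""
        x = x - x.mean(axis=1, keepdims=True)
        d, e = np.linalg.eigh(np.cov(x))              # 1) whitening transform
        w = e @ np.diag(1.0 / np.sqrt(d)) @ e.T
        z = w @ x
        c_lag = z[:, :-lag] @ z[:, lag:].T / (z.shape[1] - lag)
        c_sym = (c_lag + c_lag.T) / 2.0               # 2) symmetrized lagged covariance
        _, v = np.linalg.eigh(c_sym)
        return v.T @ z                                # rotate whitened data to sources

    # Demo: two synthetic "sources" recovered from two mixed "channels"
    t = np.arange(0, 10, 0.01)
    s = np.vstack([np.sin(2 * np.pi * 1.0 * t),
                   np.sign(np.sin(2 * np.pi * 0.3 * t))])
    mixed = np.array([[1.0, 0.6], [0.4, 1.0]]) @ s
    sources = amuse(mixed)
    ```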

  4. Severe Accident Recriticality Analyses (SARA)

    Energy Technology Data Exchange (ETDEWEB)

    Frid, W. [Swedish Nuclear Power Inspectorate, Stockholm (Sweden); Hoejerup, F. [Risoe National Lab. (Denmark); Lindholm, I.; Miettinen, J.; Puska, E.K. [VTT Energy, Helsinki (Finland); Nilsson, Lars [Studsvik Eco and Safety AB, Nykoeping (Sweden); Sjoevall, H. [Teoliisuuden Voima Oy (Finland)

    1999-11-01

    Recriticality in a BWR has been studied for a total loss of electric power accident scenario. In a BWR, the B{sub 4}C control rods would melt and relocate from the core before the fuel during core uncovery and heat-up. If electric power returns during this time window, unborated water from the ECCS systems will start to reflood the partly control-rod-free core. Recriticality might then take place, for which the only mitigating mechanisms are the Doppler effect and void formation. In order to assess the impact of recriticality on reactor safety, including accident management measures, the following issues have been investigated in the SARA project: 1. the energy deposition in the fuel during the super-prompt power burst, 2. the quasi-steady-state reactor power following the initial power burst, and 3. the containment response to elevated quasi-steady-state reactor power. The approach was to use three computer codes and to further develop and adapt them for the task. The codes were SIMULATE-3K, APROS and RECRIT. Recriticality analyses were carried out for a number of selected reflooding transients for the Oskarshamn 3 plant in Sweden with SIMULATE-3K and for the Olkiluoto 1 plant in Finland with all three codes. The core state initial and boundary conditions prior to recriticality have been studied with the severe accident codes SCDAP/RELAP5, MELCOR and MAAP4. The results of the analyses show that all three codes predict recriticality - both super-prompt power bursts and quasi-steady-state power generation - for the studied range of parameters, i.e. with core uncovery and heat-up to maximum core temperatures around 1800 K and water flow rates of 45 kg/s to 2000 kg/s injected into the downcomer. Since the recriticality takes place in a small fraction of the core, the power densities are high, which results in large energy deposition in the fuel during the power burst in some accident scenarios. The highest value, 418 cal/g, was obtained with SIMULATE-3K for an Oskarshamn 3 case with reflooding

  5. The application analyses for primary spectrum pyrometer

    Institute of Scientific and Technical Information of China (English)

    FU; TaiRan

    2007-01-01

    In applications of primary spectrum pyrometry, application issues such as the measurement range and the measurement partition were investigated through theoretical analyses based on the dynamic range and the minimum sensitivity of the sensor. For a developed primary spectrum pyrometer, theoretical predictions of the measurement range and the distributions of the measurement partition were presented through numerical simulations. Measurement experiments with a high-temperature blackbody and a standard temperature lamp were then carried out to further verify the theoretical analyses and numerical results. The research in this paper therefore provides helpful support for the application of the primary spectrum pyrometer and other radiation pyrometers.
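
    As a hedged illustration of the basic inversion behind spectrum pyrometry (not the authors' algorithm), the sketch below recovers a blackbody temperature from a sampled spectrum using Wien's approximation, under which ln(L * lam**5) is linear in 1/lam with slope -C2/T.

    ```python
    # Recover a blackbody temperature from a simulated primary spectrum.
    # The Wien-approximation fit below is a textbook stand-in for whatever
    # inversion the developed pyrometer actually uses.
    import numpy as np

    C1 = 3.7418e-16   # W m^2  (first radiation constant)
    C2 = 1.4388e-2    # m K    (second radiation constant)

    def wien_radiance(lam_m: np.ndarray, temp_k: float) -> np.ndarray:
        return C1 * lam_m**-5 * np.exp(-C2 / (lam_m * temp_k))

    lam = np.linspace(0.5e-6, 0.9e-6, 50)          # visible/NIR band, metres
    spectrum = wien_radiance(lam, 1800.0)          # simulated 1800 K blackbody

    slope, _ = np.polyfit(1.0 / lam, np.log(spectrum * lam**5), 1)
    print("recovered T =", -C2 / slope, "K")       # ~1800 K
    ```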

  6. Website-analyse

    DEFF Research Database (Denmark)

    Thorlacius, Lisbeth

    2009-01-01

    ... planning of the functional and content-related aspects of websites. There is a large body of theory and method books specialising in the technical issues of interaction and navigation, as well as in the linguistic content of websites. The Danish HCI (Human Computer Interaction) ... hyperfunctional websites. The primary concern of the HCI experts is to produce websites that are user-friendly. According to their guidelines, websites must be built with fast and efficient navigation and interaction structures, so that users can obtain their information unhindered by long download times ... or dead ends when visiting the site. Studies of the design and analysis of the visual and aesthetic aspects of the planning and use of websites have, however, only to a limited extent received reflective treatment. That is the background for this chapter, which opens with a review of the aesthetic ...

  7. Predictable return distributions

    DEFF Research Database (Denmark)

    Pedersen, Thomas Quistgaard

    This paper provides detailed insights into predictability of the entire stock and bond return distribution through the use of quantile regression. This allows us to examine specific parts of the return distribution such as the tails or the center, and for a sufficiently fine grid of quantiles we can ... predictable as a function of economic state variables. The results are, however, very different for stocks and bonds. The state variables primarily predict only location shifts in the stock return distribution, while they also predict changes in higher-order moments in the bond return distribution. Out-of-sample analyses show that the relative accuracy of the state variables in predicting future returns varies across the distribution. A portfolio study shows that an investor with power utility can obtain economic gains by applying the empirical return distribution in portfolio decisions instead of imposing an ...

  8. Seismic stability analyses - embankment dams

    Energy Technology Data Exchange (ETDEWEB)

    Boudreau, Stephane; Boulanger, Pierre; Caron, Louis Philippe [BPR, Montreal, (Canada); Karray, Mourad [Sherbrooke University, Sherbrooke, (Canada)

    2010-07-01

    An understanding of the effects of earthquakes is necessary for the design of safe dams. A wide range of methods is currently used or being developed for analysing the dynamic slope stability of embankments and dams. This paper investigated the effects of the dynamic aspects (natural period, amplification and intensity of seismic loading) in the analysis of small dams. A procedure was developed to evaluate the performance of pseudo-static analyses by comparison with fully dynamic analyses. Static, pseudo-static, and dynamic analyses were performed using finite elements and Mohr-Coulomb shear strength criteria. The overall safety factor (FS) was compared using the reduction factor concept. The study examined two examples of small dams located in moderately and highly seismic regions of the province of Quebec. These examples illustrate the difference between pseudo-static and dynamic analyses. The study also investigated the values of the kh coefficient for Eastern Canada seismicity.
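
    For readers unfamiliar with the pseudo-static idea referred to above, the sketch below shows the textbook infinite-slope version with Mohr-Coulomb strength; the geometry, soil parameters and kh values are invented and are not those of the dams studied by the authors.

    ```python
    # Pseudo-static factor of safety of an infinite slope with Mohr-Coulomb
    # strength: the horizontal seismic coefficient kh adds a destabilising
    # inertial force and reduces the effective normal stress.
    import math

    def pseudo_static_fs(c_kpa, phi_deg, gamma, depth, beta_deg, kh):
        """c: cohesion (kPa), phi: friction angle, gamma: unit weight (kN/m3),
        depth: slip depth (m), beta: slope angle, kh: seismic coefficient."""
        b, p = math.radians(beta_deg), math.radians(phi_deg)
        w = gamma * depth                       # weight per unit plan area
        driving = w * (math.sin(b) * math.cos(b) + kh * math.cos(b) ** 2)
        normal = w * (math.cos(b) ** 2 - kh * math.sin(b) * math.cos(b))
        return (c_kpa + normal * math.tan(p)) / driving

    for kh in (0.0, 0.1, 0.2):   # static case, then increasing seismic loading
        print(f"kh={kh:.1f}: FS={pseudo_static_fs(10, 30, 19, 4, 25, kh):.2f}")
    ```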

  9. Feed analyses and their interpretation.

    Science.gov (United States)

    Hall, Mary Beth

    2014-11-01

    Compositional analysis is central to determining the nutritional value of feedstuffs for use in ration formulation. The utility of the values and how they should be used depends on how representative the feed subsample is, the nutritional relevance and analytical variability of the assays, and whether an analysis is suitable to be applied to a particular feedstuff. Commercial analyses presently available for carbohydrates, protein, and fats have improved nutritionally pertinent description of feed fractions. Factors affecting interpretation of feed analyses and the nutritional relevance and application of currently available analyses are discussed.

  10. STRATEGY PATTERNS PREDICTION MODEL

    OpenAIRE

    Aram Baruch Gonzalez Perez; Jorge Adolfo Ramirez Uresti

    2014-01-01

    Multi-agent systems are broadly known for being able to simulate real-life situations which require the interaction and cooperation of individuals. Opponent modeling can be used along with multi-agent systems to model complex situations such as competitions like soccer games. In this study, a model for predicting opponent moves based on their target is presented. The model is composed of an offline step (learning phase) and an online one (execution phase). The offline step gets and analyses p...

  11. Conjoint-Analyse und Marktsegmentierung

    OpenAIRE

    Steiner, Winfried J.; Baumgartner, Bernhard

    2003-01-01

    Market segmentation, along with new-product planning and pricing, is one of the principal fields of application of conjoint analysis. In addition to the traditionally used two-stage procedures, in which conjoint analysis and segmentation are carried out in two separate steps, newer developments such as clusterwise regression and mixture models, which permit simultaneous segmentation and preference estimation, are available today. The article gives an overview ...

  12. Prediction, Regression and Critical Realism

    DEFF Research Database (Denmark)

    Næss, Petter

    2004-01-01

    This paper considers the possibility of prediction in land use planning, and the use of statistical research methods in analyses of relationships between urban form and travel behaviour. Influential writers within the tradition of critical realism reject the possibility of predicting social phenomena. This position is fundamentally problematic for public planning. Without at least some ability to predict the likely consequences of different proposals, the justification for public sector intervention into market mechanisms will be frail. Statistical methods like regression analyses are commonly ... of prediction necessary and possible in spatial planning of urban development. Finally, the political implications of positions within theory of science rejecting the possibility of predictions about social phenomena are addressed.

  13. Mitogenomic analyses from ancient DNA

    DEFF Research Database (Denmark)

    Paijmans, Johanna L.A.; Gilbert, M Thomas P; Hofreiter, Michael

    2013-01-01

    ... analyses (whether using modern or ancient DNA) were largely restricted to the analysis of short fragments of the mitochondrial genome. However, due to many technological advances during the past decade, a growing number of studies have explored the power of complete mitochondrial genome sequences (mitogenomes). Such studies were initially limited to analyses of extant organisms, but developments in both DNA sequencing technologies and general methodological aspects related to working with degraded DNA have resulted in complete mitogenomes becoming increasingly popular for ancient DNA studies as well. To date, at least 124 partially or fully assembled mitogenomes from more than 20 species have been obtained, and, given the rapid progress in sequencing technology, this number is likely to increase dramatically in the future. The increased information content offered by analysing full mitogenomes has ...

  14. An extensible analysable system model

    DEFF Research Database (Denmark)

    Probst, Christian W.; Hansen, Rene Rydhof

    2008-01-01

    Analysing real-world systems for vulnerabilities with respect to security and safety threats is a difficult undertaking, not least due to a lack of availability of formalisations for those systems. While both formalisations and analyses can be found for artificial systems such as software ..., this does not hold for real physical systems. Approaches such as threat modelling try to target the formalisation of the real-world domain, but still are far from the rigid techniques available in security research. Many currently available approaches to assurance of critical infrastructure security ...

  15. Descriptive Analyses of Mechanical Systems

    DEFF Research Database (Denmark)

    Andreasen, Mogens Myrup; Hansen, Claus Thorp

    2003-01-01

    Foreword: Product analysis and technology analysis can be carried out with a broad socio-technical scope in order to understand cultural, sociological, design-related, business-related and many other aspects. One sub-area within this is the systemic analysis and description of products and systems. The present compendium ...

  16. Evaluation "Risk analyses of agroparks"

    NARCIS (Netherlands)

    Ge, L.

    2011-01-01

    This TransForum project focuses on the analysis of the uncertainties and opportunities of agroparks. It has resulted in a risk model that maps the qualitative and/or quantitative uncertainties of an agropark project. With this, measures and management strategies can be identified ...

  17. Accurate renormalization group analyses in neutrino sector

    Energy Technology Data Exchange (ETDEWEB)

    Haba, Naoyuki [Graduate School of Science and Engineering, Shimane University, Matsue 690-8504 (Japan); Kaneta, Kunio [Kavli IPMU (WPI), The University of Tokyo, Kashiwa, Chiba 277-8568 (Japan); Takahashi, Ryo [Graduate School of Science and Engineering, Shimane University, Matsue 690-8504 (Japan); Yamaguchi, Yuya [Department of Physics, Faculty of Science, Hokkaido University, Sapporo 060-0810 (Japan)

    2014-08-15

    We investigate accurate renormalization group analyses in the neutrino sector between the ν-oscillation and seesaw energy scales. We consider decoupling effects of the top quark and the Higgs boson on the renormalization group equations of the light neutrino mass matrix. Since the decoupling effects arise at the standard model scale and are independent of high-energy physics, our method can in principle be applied to any model beyond the standard model. We find that the decoupling effects of the Higgs boson are negligible, while those of the top quark are not. In particular, the decoupling effects of the top quark affect the neutrino mass eigenvalues, which are important for analyzing predictions such as mass squared differences and neutrinoless double beta decay in an underlying theory existing at a high energy scale.
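
    To make the decoupling idea concrete, here is a generic one-loop running sketch, not the paper's actual neutrino RGEs: a single coupling is integrated across a threshold at which the beta-function coefficient changes, the way the top quark drops out of the running below its mass. The coefficients shown happen to be one-loop QCD values for six and five flavours, used purely as plausible stand-ins.

    ```python
    # Generic one-loop running dg/dt = b g^3 / (16 pi^2), t = ln(mu), with a
    # decoupling threshold where the beta coefficient b changes. Illustration
    # of threshold matching only; values are NOT the paper's neutrino RGEs.
    import math

    def run_coupling(g0, mu0, mu1, b_above, b_below, mu_threshold, steps=10000):
        g, t = g0, math.log(mu0)
        dt = (math.log(mu1) - math.log(mu0)) / steps
        for _ in range(steps):
            b = b_above if math.exp(t) > mu_threshold else b_below  # decoupling switch
            g += b * g**3 / (16 * math.pi**2) * dt                  # Euler step
            t += dt
        return g

    # Run down from 1e12 GeV to 100 GeV with a threshold at 173 GeV:
    print(run_coupling(0.6, 1e12, 1e2,
                       b_above=-7.0, b_below=-23/3, mu_threshold=173.0))
    ```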

  18. Successful Predictions

    Science.gov (United States)

    Pierrehumbert, R.

    2012-12-01

    In an observational science, it is not possible to test hypotheses through controlled laboratory experiments. One can test parts of the system in the lab (as is done routinely with infrared spectroscopy of greenhouse gases), but the collective behavior cannot be tested experimentally because a star or planet cannot be brought into the lab; it must, instead, itself be the lab. In the case of anthropogenic global warming, this is all too literally true, and the experiment would be quite exciting if it weren't for the unsettling fact that we and all our descendants for the foreseeable future will have to continue making our home in the lab. There are nonetheless many routes through which the validity of a theory of the collective behavior can be determined. A convincing explanation must not be a "just-so" story, but must make additional predictions that can be verified against observations that were not originally used in formulating the theory. The field of Earth and planetary climate has racked up an impressive number of such predictions. I will also admit as "predictions" statements about things that happened in the past, provided that observations or proxies pinning down the past climate state were not available at the time the prediction was made. The basic prediction that burning of fossil fuels would lead to an increase of atmospheric CO2, and that this would in turn alter the Earth's energy balance so as to cause tropospheric warming, is one of the great successes of climate science. It began in the lineage of Fourier, Tyndall and Arrhenius, and was largely complete with the radiative-convective modeling work of Manabe in the 1960s -- all well before the expected warming had progressed far enough to be observable. Similarly, long before the increase in atmospheric CO2 could be detected, Bolin formulated a carbon cycle model and used it to predict atmospheric CO2 out to the year 2000; the actual values come in at the high end of his predicted range, for

  19. Analyse du discours et archive

    OpenAIRE

    Maingueneau, Dominique

    2007-01-01

    Research claiming the label of "discourse analysis" is undergoing considerable development throughout the world; by contrast, the "French school of discourse analysis" (AD) has been going through an identity crisis since the beginning of the 1980s. In this paper we explore the reasons for this crisis and then clarify the concept of the archive which, in our view, makes it possible to extend the path opened at the end of the 1960s. But this is only one of the possible paths, given that, as...

  20. Workload analyse of assembling process

    Science.gov (United States)

    Ghenghea, L. D.

    2015-11-01

    Workload is the most important indicator for managers responsible for industrial technological processes, whether these are automated, mechanized or simply manual; in each case, machines or workers will be the focus of workload measurements. The paper deals with workload analyses of a largely manual assembly technology for a roller bearing assembling process, carried out in a large company with integrated bearing manufacturing processes. In these analyses, the delay sample (work sampling) technique was used to identify and classify all bearing assemblers' activities and to obtain information about how much of the 480-minute working day workers devote to each activity. The study shows some ways to increase process productivity without supplementary investment, and also indicates that process automation could be the solution for achieving maximum productivity.

  1. Analysing Protocol Stacks for Services

    DEFF Research Database (Denmark)

    Gao, Han; Nielson, Flemming; Nielson, Hanne Riis

    2011-01-01

    We show an approach, CaPiTo, to model service-oriented applications using process algebras such that, on the one hand, we can achieve a certain level of abstraction without being overwhelmed by the underlying implementation details and, on the other hand, we respect the concrete industrial standards ... to the financial case study taken from Chapters 0-3. Finally, we develop a static analysis to analyse the security properties as they emerge at the level of concrete industrial protocols.

  2. Tematisk analyse af amerikansk hiphop

    OpenAIRE

    Tranberg-Hansen, Katrine; Bøgh Larsen, Cecilie; Jeppsson,Louise Emilie; Lindberg Kirkegaard, Nanna; Funch Madsen, Signe; Bülow Bach, Maria

    2013-01-01

    This paper examines the possible development in the function of American hiphop. It focuses on specific themes like the ghetto, freedom, rebellion, and racial discrimination in hiphop music. To investigate this possible development, two text analysis methods are used, a pragmatic and a stylistic text analysis, together with a historical method, source criticism. A minimal amount of literature has been published on how hiphop culture arose. These studies, however, make it possible to analyse...

  3. Learner as Statistical Units of Analyses

    Directory of Open Access Journals (Sweden)

    Vivek Venkatesh

    2011-01-01

    Educational psychologists have researched the generality and specificity of metacognitive monitoring in the context of college-level multiple-choice tests, but fairly little is known as to how learners monitor their performance on more complex academic tasks. Even less is known about how monitoring proficiencies such as discrimination and bias might be related to key self-regulatory processes associated with task understanding. This quantitative study explores the relationship between monitoring proficiencies and task understanding in 39 adult learners tackling ill-structured writing tasks for a graduate "theories of e-learning" course. Using the learner as the unit of analysis, the generality of monitoring is confirmed through intra-measure correlation analyses, while facets of its specificity stand out due to the absence of inter-measure correlations. Unsurprisingly, learner-based correlational and repeated-measures analyses did not reveal how monitoring proficiencies and task understanding might be related. However, using the essay as the unit of analysis, ordinal and multinomial regressions reveal how monitoring influences different levels of task understanding. Results are interpreted not only in light of novel procedures undertaken in calculating performance prediction capability but also in the application of essay-based, intra-sample statistical analyses that reveal heretofore unseen relationships between academic self-regulatory constructs.
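
    As a concrete, synthetic-data sketch of the essay-level analysis described above (the study's data and exact model specification are not reproduced here), a multinomial logistic regression can relate monitoring proficiencies to an ordered task-understanding level:

    ```python
    # Multinomial logistic regression predicting a 3-level task-understanding
    # score from two monitoring proficiencies. All data are synthetic.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 200
    discrimination = rng.normal(0, 1, n)          # monitoring proficiency 1
    bias = rng.normal(0, 1, n)                    # monitoring proficiency 2
    latent = 1.2 * discrimination - 0.8 * bias + rng.normal(0, 1, n)
    understanding = np.digitize(latent, [-0.5, 0.5])   # ordered levels 0, 1, 2

    X = np.column_stack([discrimination, bias])
    model = LogisticRegression(max_iter=500).fit(X, understanding)
    print(model.coef_)                            # one coefficient row per level
    print(model.predict_proba([[1.0, -1.0]]))     # P(level) for a strong monitor
    ```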

  4. Analyses of containment structures with corrosion damage

    Energy Technology Data Exchange (ETDEWEB)

    Cherry, J.L. [Sandia National Labs., Albuquerque, NM (United States)

    1997-01-01

    Corrosion damage that has been found in a number of nuclear power plant containment structures can degrade the pressure capacity of the vessel. This has prompted concerns regarding the capacity of corroded containments to withstand accident loadings. To address these concerns, finite element analyses have been performed for a typical PWR Ice Condenser containment structure. Using ABAQUS, the pressure capacity was calculated for a typical vessel with no corrosion damage. Multiple analyses were then performed with the location of the corrosion and the amount of corrosion varied in each analysis. Using a strain-based failure criterion, a "lower bound", "best estimate", and "upper bound" failure level was predicted for each case. These limits were established by: determining the amount of variability that exists in material properties of typical containments, estimating the amount of uncertainty associated with the level of modeling detail and modeling assumptions, and estimating the effect of corrosion on the material properties.

  5. Analyses of containment structures with corrosion damage

    International Nuclear Information System (INIS)

    Corrosion damage that has been found in a number of nuclear power plant containment structures can degrade the pressure capacity of the vessel. This has prompted concerns regarding the capacity of corroded containments to withstand accident loadings. To address these concerns, finite element analyses have been performed for a typical PWR Ice Condenser containment structure. Using ABAQUS, the pressure capacity was calculated for a typical vessel with no corrosion damage. Multiple analyses were then performed with the location of the corrosion and the amount of corrosion varied in each analysis. Using a strain-based failure criterion, a "lower bound", "best estimate", and "upper bound" failure level was predicted for each case. These limits were established by: determining the amount of variability that exists in material properties of typical containments, estimating the amount of uncertainty associated with the level of modeling detail and modeling assumptions, and estimating the effect of corrosion on the material properties.

  6. Prediction Markets

    DEFF Research Database (Denmark)

    Horn, Christian Franz; Ivens, Bjørn Sven; Ohneberg, Michael;

    2014-01-01

    In recent years, Prediction Markets gained growing interest as a forecasting tool among researchers as well as practitioners, which resulted in an increasing number of publications. In order to track the latest development of research, comprising the extent and focus of research, this article...

  7. Comparison of elastic and inelastic analyses

    International Nuclear Information System (INIS)

    The use of inelastic analysis methods instead of the traditional elastic analysis methods in the design of radioactive material (RAM) transport packagings leads to a better understanding of the response of the package to mechanical loadings. Better assessments can thus be made of the containment, thermal protection, and shielding integrity of the package after a structural accident event. A more accurate prediction of the package response can enhance safety and also allow for a more efficient use of materials, possibly leading to a package with higher capacity or lower weight. This paper discusses the advantages and disadvantages of using inelastic analysis in the design of RAM shipping packages. The use of inelastic analysis presents several problems to the package designer. When using inelastic analysis, the entire nonlinear response of the material must be known, including the effects of temperature changes and strain rate. Another problem is that there is currently no acceptance criterion for this type of analysis approved by regulatory agencies. Inelastic analysis acceptance criteria based on failure stress, failure strain, or plastic energy density could be developed. For both elastic and inelastic analyses it is also important to include other sources of stress in the analyses, such as fabrication stresses, thermal stresses, stresses from bolt preloading, and contact stresses at material interfaces. Offsetting these added difficulties is the improved knowledge of the package behavior. This allows for incorporation of a more uniform margin of safety, which can result in weight savings and a higher level of confidence in the post-accident configuration of the package. In this paper, comparisons between elastic and inelastic analyses are made for a simple ring structure and for a package designed to transport a large quantity of RAM by rail (rail cask) with lead gamma shielding, to illustrate the differences between the two analysis techniques

  8. STRATEGY PATTERNS PREDICTION MODEL

    Directory of Open Access Journals (Sweden)

    Aram Baruch Gonzalez Perez

    2014-01-01

    Multi-agent systems are broadly known for being able to simulate real-life situations which require the interaction and cooperation of individuals. Opponent modeling can be used along with multi-agent systems to model complex situations such as competitions like soccer games. In this study, a model for predicting opponent moves based on their target is presented. The model is composed of an offline step (learning phase) and an online one (execution phase). The offline step gathers and analyses previous experiences, while the online step uses the data generated by the offline analysis to predict opponent moves. This model is illustrated by an experiment with the RoboCup 2D Soccer Simulator. The proposed model was tested using 22 games to create the knowledge base, achieving an accuracy rate of over 80%.
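
    A schematic of the two-phase structure described above, with a nearest-neighbour stand-in for the learning component; the class name, state encoding and moves are invented, and this is not the authors' RoboCup implementation.

    ```python
    # Two-phase opponent-move predictor: an offline phase stores
    # (game state, observed move) pairs; the online phase predicts by
    # nearest-neighbour lookup over the stored experiences.
    import numpy as np

    class StrategyPatternPredictor:
        def __init__(self):
            self.states, self.moves = [], []

        def learn(self, state, move):            # offline step: accumulate experience
            self.states.append(np.asarray(state, dtype=float))
            self.moves.append(move)

        def predict(self, state):                # online step: nearest stored state
            state = np.asarray(state, dtype=float)
            dists = [np.linalg.norm(state - s) for s in self.states]
            return self.moves[int(np.argmin(dists))]

    p = StrategyPatternPredictor()
    p.learn([0.0, 0.0, 1.0], "advance")
    p.learn([0.9, 0.1, 0.0], "shoot")
    print(p.predict([0.8, 0.2, 0.1]))             # -> "shoot"
    ```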

  9. Analyses of a Virtual World

    CERN Document Server

    Holovatch, Yurij; Szell, Michael; Thurner, Stefan

    2016-01-01

    We present an overview of a series of results obtained from the analysis of human behavior in a virtual environment. We focus on the massive multiplayer online game (MMOG) Pardus which has a worldwide participant base of more than 400,000 registered players. We provide evidence for striking statistical similarities between social structures and human-action dynamics in the real and virtual worlds. In this sense MMOGs provide an extraordinary way for accurate and falsifiable studies of social phenomena. We further discuss possibilities to apply methods and concepts developed in the course of these studies to analyse oral and written narratives.

  10. Fracturing and brittleness index analyses of shales

    Science.gov (United States)

    Barnhoorn, Auke; Primarini, Mutia; Houben, Maartje

    2016-04-01

    The formation of a fracture network in rocks has a crucial control on the flow behaviour of fluids. In addition, an existing network of fractures influences the propagation of new fractures during e.g. hydraulic fracturing or during a seismic event. Understanding the type and characteristics of the fracture network that will be formed during e.g. hydraulic fracturing is thus crucial to better predict the outcome of a hydraulic fracturing job. For this, knowledge of the rock properties is essential. The brittleness index is often used as a rock property to predict the fracturing behaviour of a rock, e.g. for hydraulic fracturing of shales. Various definitions of the brittleness index (BI1, BI2 and BI3) exist, based on mineralogy, elastic constants and stress-strain behaviour (Jin et al., 2014, Jarvie et al., 2007 and Holt et al., 2011). A maximum brittleness index of 1 predicts very good and efficient fracturing behaviour, while a minimum brittleness index of 0 predicts much more ductile shale behaviour. Here, we have performed systematic petrophysical, acoustic and geomechanical analyses on a set of shale samples from Whitby (UK) and have determined the three different brittleness indices by performing all the analyses on each of the samples. We show that the three brittleness indices are very different for the same sample, and as such it can be concluded that the brittleness index is not a good predictor of the fracturing behaviour of shales. The brittleness indices based on the acoustic data (BI1) all lie around values of 0.5, the brittleness index based on the stress-strain data (BI2) gives an average around 0.75, whereas the mineralogy brittleness index (BI3) predicts values below 0.2. This shows that, depending on which estimate of the brittleness index is used, different decisions can be made for hydraulic fracturing. If we were to rely on the mineralogy (BI3), the Whitby mudstone is not a suitable
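
    The three index families can be written down compactly. The formulas below are common textbook variants of the three definitions cited above (the exact forms in Jin et al., Jarvie et al. and Holt et al. differ in detail), and the input values are invented; with mid-range inputs they reproduce the qualitative spread reported for the Whitby samples (BI1 near 0.5, BI2 near 0.75, BI3 below 0.2).

    ```python
    # Sketch of the three brittleness-index variants named in the abstract.
    def bi_mineralogy(quartz, carbonate, clay, toc):
        """BI3-style: fraction of 'brittle' minerals (Jarvie-type)."""
        return quartz / (quartz + carbonate + clay + toc)

    def bi_elastic(young_gpa, poisson, e_rng=(10.0, 80.0), nu_rng=(0.15, 0.40)):
        """BI1-style: normalized Young's modulus and Poisson ratio, averaged."""
        e_term = (young_gpa - e_rng[0]) / (e_rng[1] - e_rng[0])
        nu_term = (nu_rng[1] - poisson) / (nu_rng[1] - nu_rng[0])
        return 0.5 * (e_term + nu_term)

    def bi_stress_strain(irreversible_strain, total_strain):
        """BI2-style: share of strain recovered before failure (Holt-type)."""
        return 1.0 - irreversible_strain / total_strain

    print(bi_mineralogy(0.15, 0.05, 0.70, 0.10),   # clay-rich: BI3 below 0.2
          bi_elastic(35.0, 0.25),                  # BI1 near 0.5
          bi_stress_strain(0.2, 0.8))              # BI2 = 0.75
    ```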

  11. Experimental review on moment analyses

    CERN Document Server

    Calvi, M

    2003-01-01

    Moments of the photon energy spectrum in B->Xs gamma decays, and of the hadronic mass spectrum and lepton energy spectrum in B->Xc l nu decays, are sensitive to the masses of the heavy quarks as well as to the non-perturbative parameters of the heavy quark expansion. Several measurements have been performed both at the Upsilon(4S) resonance and at Z0 centre-of-mass energies. They provide constraints on the non-perturbative parameters, test the consistency of the theoretical predictions and their underlying assumptions, and allow the uncertainty in the |Vcb| extraction to be reduced.

  12. Analysing ESP Texts, but How?

    Directory of Open Access Journals (Sweden)

    Borza Natalia

    2015-03-01

    English as a second language (ESL) teachers instructing general English and English for specific purposes (ESP) in bilingual secondary schools face various challenges when it comes to choosing the main linguistic foci of language preparatory courses enabling non-native students to study academic subjects in English. ESL teachers intending to analyse English-language subject textbooks written for secondary school students, with the aim of learning what bilingual secondary school students need to know in terms of language to process academic textbooks, cannot avoid dealing with a dilemma: it needs to be decided which way is most appropriate to analyse the texts in question. Handbooks of English applied linguistics are not immensely helpful with regard to this problem, as they tend not to give recommendations as to which major text-analytical approaches are advisable to follow in a pre-college setting. The present theoretical research aims to address this lacuna. Accordingly, the purpose of this pedagogically motivated theoretical paper is to investigate two major approaches to ESP text analysis, register analysis and genre analysis, in order to find the more suitable one for exploring the language use of secondary school subject texts from the point of view of an English as a second language teacher. Comparing and contrasting the merits and limitations of the two approaches allows for a better understanding of the nature of the two different perspectives on text analysis. The study examines the goals, the scope of analysis, and the achievements of the register perspective and those of the genre approach alike. The paper also investigates and reviews in detail the starkly different methods of ESP text analysis applied by the two perspectives. Approaching text analysis from a theoretical and methodological angle supports a practical aspect of English teaching, namely making an informed choice when setting out to analyse

  13. Analysing performance through value creation

    Directory of Open Access Journals (Sweden)

    Adrian TRIFAN

    2015-12-01

    This paper draws a parallel between two variants of measuring financial performance: the first using data offered by accounting, which lays emphasis on maximizing profit, and the second, which aims at creating value. The traditional approach to performance is based on indicators derived from accounting data: ROI, ROE, EPS. Traditional management, based on analysing accounting data, has shown its limits, and a new approach based on value creation is needed. The evaluation of value-based performance tries to avoid the errors inherent in accounting data by using other specific indicators: EVA, MVA, TSR, CVA. The main objective shifts from maximizing income to maximizing the value created for shareholders. The theoretical part is accompanied by a practical analysis regarding the creation of value and an analysis of the main indicators used to evaluate this concept.
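
    A worked numerical illustration of two of the value-based indicators the paper names (all figures invented): EVA charges the profit for all capital employed, so a business can look healthy on ROI while creating little or no value.

    ```python
    # Economic Value Added and Market Value Added with invented figures.
    def eva(nopat, invested_capital, wacc):
        """Economic Value Added = NOPAT - capital charge."""
        return nopat - wacc * invested_capital

    def mva(market_value, invested_capital):
        """Market Value Added = market value - capital invested."""
        return market_value - invested_capital

    capital, nopat = 1_000_000, 120_000
    print("ROI:", nopat / capital)                  # 12% looks healthy...
    print("EVA @10% WACC:", eva(nopat, capital, 0.10))  # ...only 20k of real value
    print("EVA @14% WACC:", eva(nopat, capital, 0.14))  # negative: value destroyed
    print("MVA:", mva(1_150_000, capital))
    ```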

  14. Economical analyses in interventional radiology

    International Nuclear Information System (INIS)

    Considerations of the relation between benefit and cost are gaining importance in interventional radiology as well. This review provides a survey of the published data concerning economic analyses of some of the more frequently employed interventions in radiology, excluding neuroradiological and coronary interventions. Because of the relative scarcity of literature in this field, all identified articles (n=46) were included without selection for methodological quality. For a number of radiological interventions the cost-effectiveness has already been demonstrated, e.g., PTA of femoropopliteal and iliac artery stenoses, stenting of renal artery stenoses, placement of vena cava filters, and metal stents in malignant biliary and esophageal obstructions. Conflicting data exist for the treatment of abdominal aortic aneurysms. So far, no analysis could be found that directly compares bypass surgery versus PTA plus stenting in iliac arteries. (orig.)

  15. HGCal Simulation Analyses for CMS

    CERN Document Server

    Bruno, Sarah Marie

    2015-01-01

    This summer, I approached the topic of fast-timing detection of photons from Higgs decays via simulation analyses, working under the supervision of Dr. Adolf Bornheim of the California Institute of Technology. My specific project focused on simulating the high granularity calorimeter for the Compact Muon Solenoid (CMS) experiment. CMS detects particles using calorimeters. The Electromagnetic Calorimeter (ECal) is arranged cylindrically to form a barrel section and two “endcaps.” Previously, both the barrel and endcap have employed lead tungstate crystal detectors, known as the “shashlik” design. The crystal detectors, however, rapidly degrade from exposure to radiation. This effect is most pronounced in the endcaps. To avoid the high expense of frequently replacing degraded detectors, it was recently decided to eliminate the endcap crystals in favor of an arrangement of silicon detectors known as the “High Granularity Calorimeter” (HGCal), while leaving the barrel detector technology unchanged. T...

  16. Analyse des besoins des usagers

    OpenAIRE

    KHOUDOUR,L; LANGLAIS,A; Charpentier, C.; MOTTE,C; PIAN,C

    2002-01-01

    The aim is to extend video surveillance from the metro premises to the interior of the trains. The captured images record the events taking place inside the vehicles, in particular in order to improve the safety of the passengers being transported. It is possible to store the images of the few moments preceding a passenger incident, to analyse these images off-line, and to better understand in real time the behaviour of passengers faced with events or ...

  17. Isotopic signatures by bulk analyses

    International Nuclear Information System (INIS)

    Los Alamos National Laboratory has developed a series of measurement techniques for identification of nuclear signatures by analyzing bulk samples. Two specific applications for isotopic fingerprinting to identify the origin of anthropogenic radioactivity in bulk samples are presented. The first example is the analyses of environmental samples collected in the US Arctic to determine the impact of dumping of radionuclides in this polar region. Analyses of sediment and biota samples indicate that for the areas sampled the anthropogenic radionuclide content of sediments was predominantly the result of the deposition of global fallout. The anthropogenic radionuclide concentrations in fish, birds and mammals were very low. It can be surmised that marine food chains are presently not significantly affected. The second example is isotopic fingerprinting of water and sediment samples from the Rocky Flats Facility (RFP). The largest source of anthropogenic radioactivity presently affecting surface-waters at RFP is the sediments that are currently residing in the holding ponds. One gram of sediment from a holding pond contains approximately 50 times more plutonium than 1 liter of water from the pond. Essentially 100% of the uranium in Ponds A-1 and A-2 originated as depleted uranium. The largest source of radioactivity in the terminal Ponds A-4, B-5 and C-2 was naturally occurring uranium and its decay product radium. The uranium concentrations in the waters collected from the terminal ponds contained 0.05% or less of the interim standard calculated derived concentration guide for uranium in waters available to the public. All of the radioactivity observed in soil, sediment and water samples collected at RFP was naturally occurring, the result of processes at RFP or the result of global fallout. No extraneous anthropogenic alpha, beta or gamma activities were detected. The plutonium concentrations in Pond C-2 appear to vary seasonally

  18. Predicting supersymmetry

    Energy Technology Data Exchange (ETDEWEB)

    Heinemeyer, S. [Instituto de Fisica de Cantabria (CSIC-UC), Santander (Spain); Weiglein, G. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)

    2010-07-15

    We review the result of SUSY parameter fits based on frequentist analyses of experimental constraints from electroweak precision data, (g-2){sub {mu}}, B physics and cosmological data. We investigate the parameters of the constrained MSSM (CMSSM) with universal soft supersymmetry-breaking mass parameters, and a model with common non-universal Higgs mass parameters in the superpotential (NUHM1). Shown are the results for the SUSY and Higgs spectrum of the models. Many sparticle masses are highly correlated in both the CMSSM and NUHM1, and parts of the regions preferred at the 68% C.L. are accessible to early LHC running. The best-fit points could be tested even with 1 fb{sup -1} at {radical}(s)=7 TeV. (orig.)

  19. Automatic incrementalization of Prolog based static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Kahl, Matthias; Saha, Diptikalyan;

    2007-01-01

    Modern development environments integrate various static analyses into the build process. Analyses that analyze the whole project whenever the project changes are impractical in this context. We present an approach to automatic incrementalization of analyses that are specified as tabled logic programs ... incrementalizing a broad range of static analyses.

  20. Thermal and hydraulic analyses of the System 81 cold traps

    Energy Technology Data Exchange (ETDEWEB)

    Kim, K.

    1977-06-15

    Thermal and hydraulic analyses of the System 81 Type I and II cold traps were completed, except for the thermal transient analysis. The results are evaluated, discussed, and reported. Analytical models were developed to determine the physical dimensions of the cold traps and to predict their performance. The FFTF cold trap crystallizer performance was simulated using the thermal model. This simulation shows that the analytical model developed predicts reasonably conservative temperatures. Pressure drop and sodium residence time calculations indicate that the present design will meet the requirements specified in the E-Specification. Steady-state temperature data for the critical regions were generated to assess the magnitude of the thermal stress.

  1. NOx analyser interference from alkenes

    Science.gov (United States)

    Bloss, W. J.; Alam, M. S.; Lee, J. D.; Vazquez, M.; Munoz, A.; Rodenas, M.

    2012-04-01

    Nitrogen oxides (NO and NO2, collectively NOx) are critical intermediates in atmospheric chemistry. NOx abundance controls the levels of the primary atmospheric oxidants OH, NO3 and O3, and regulates the ozone production which results from the degradation of volatile organic compounds. NOx are also atmospheric pollutants in their own right, and NO2 is commonly included in air quality objectives and regulations. In addition to their role in controlling ozone formation, NOx levels affect the production of other pollutants such as the lachrymator PAN, and the nitrate component of secondary aerosol particles. Consequently, accurate measurement of nitrogen oxides in the atmosphere is of major importance for understanding our atmosphere. The most widely employed approach for the measurement of NOx is chemiluminescent detection of NO2* from the NO + O3 reaction, combined with NO2 reduction by either a heated catalyst or photoconvertor. The reaction between alkenes and ozone is also chemiluminescent; therefore alkenes may contribute to the measured NOx signal, depending upon the instrumental background subtraction cycle employed. This interference has been noted previously, and indeed the effect has been used to measure both alkenes and ozone in the atmosphere. Here we report the results of a systematic investigation of the response of a selection of NOx analysers, ranging from systems used for routine air quality monitoring to atmospheric research instrumentation, to a series of alkenes ranging from ethene to the biogenic monoterpenes, as a function of conditions (co-reactants, humidity). Experiments were performed in the European Photoreactor (EUPHORE) to ensure common calibration, a common sample for the monitors, and to unequivocally confirm the alkene (via FTIR) and NO2 (via DOAS) levels present. The instrument responses ranged from negligible levels up to 10 % depending upon the alkene present and conditions used. Such interferences may be of substantial importance

  2. Efficient ALL vs. ALL collision risk analyses

    Science.gov (United States)

    Escobar, D.; Paskowitz, M.; Agueda, A.; Garcia, G.; Molina, M.

    2011-09-01

    In recent years, space debris has gained a lot of attention due to the increasing amount of uncontrolled man-made objects orbiting the Earth. This population poses a significant and constantly growing threat to operational satellites. In order to face this threat in an independent manner, ESA has launched an initiative for the development of a European SSA System, in which GMV is participating via several activities. Apart from those activities financed by ESA, GMV has developed closeap, a tool for efficient conjunction assessment and collision probability prediction. ESA's NAPEOS has been selected as the computational engine and numerical propagator to be used in the tool, which can be considered an add-on to the standard NAPEOS package. closeap makes use of the same orbit computation, conjunction assessment and collision risk algorithms implemented in CRASS, but at the same time both systems are completely independent. Moreover, the implementation in closeap has been validated against CRASS with excellent results. This paper describes the performance improvements implemented in closeap at the algorithm level to ensure that the most time-demanding scenarios (e.g., all catalogued objects analysed against each other - all vs. all scenarios) can be processed in a reasonable amount of time with commercial off-the-shelf hardware. However, the amount of space debris increases steadily due to human activities. Thus, the number of objects involved in a full collision assessment is expected to increase notably and, consequently, the computational cost, which scales as the square of the number of objects, will increase as well. Additionally, orbit propagation algorithms that are computationally expensive might be needed to predict the trajectories of the space debris more accurately. In order to cope with such computational needs, the next natural step in the development of collision assessment tools is the use of parallelization techniques. In this paper we investigate
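
    The quadratic growth mentioned above, and the kind of cheap geometric prefilter that makes all-vs-all screening tractable, can be sketched as follows; this is an illustration only, not closeap's CRASS-derived algorithms.

    ```python
    # All-vs-all conjunction screening: the pair count grows as N*(N-1)/2,
    # i.e. quadratically in the catalogue size N, so cheap filters matter.
    import itertools
    import numpy as np

    rng = np.random.default_rng(1)
    n_objects = 2000
    # Crude stand-in for full orbits: perigee/apogee radii in km.
    perigee = 6700 + rng.uniform(0, 1500, n_objects)
    apogee = perigee + rng.uniform(0, 800, n_objects)

    candidates = [
        (i, j)
        for i, j in itertools.combinations(range(n_objects), 2)
        # Apogee/perigee filter: orbits whose radial shells do not overlap
        # (plus a safety margin) can never come close, so skip them.
        if not (perigee[j] > apogee[i] + 50 or perigee[i] > apogee[j] + 50)
    ]
    total = n_objects * (n_objects - 1) // 2
    print(f"{total} pairs total, {len(candidates)} left after the cheap filter")
    ```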

  3. Budget-Impact Analyses: A Critical Review of Published Studies

    OpenAIRE

    Ewa Orlewska; Laszlo Gulcsi

    2009-01-01

    This article reviews budget-impact analyses (BIAs) published to date in peer-reviewed bio-medical journals with reference to current best practice, and discusses where future research needs to be directed. Published BIAs were identified by conducting a computerized search on PubMed using the search term 'budget impact analysis'. The years covered by the search included January 2000 through November 2008. Only studies (i) named by authors as BIAs and (ii) predicting financial consequences of a...

  4. Partitioning Uncertainty for Non-Ergodic Probabilistic Seismic Hazard Analyses

    OpenAIRE

    Dawood, Haitham Mohamed Mahmoud Mousad

    2014-01-01

    Properly accounting for the uncertainties in predicting ground motion parameters is critical for Probabilistic Seismic Hazard Analyses (PSHA). This is particularly important for critical facilities that are designed for long return period motions. Non-ergodic PSHA is a framework that allows for this proper accounting of uncertainties. This, in turn, allows for more informed decisions by designers, owners and regulating agencies. The ergodic assumption implies that the standard deviation ...
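
    A compact illustration, on synthetic residuals, of the partitioning that non-ergodic PSHA relies on: total ground-motion variability sigma is split into a between-event part tau and a within-event part phi, with sigma^2 = tau^2 + phi^2. All numbers below are invented.

    ```python
    # Partition synthetic GMPE residuals into between-event (tau) and
    # within-event (phi) components.
    import numpy as np

    rng = np.random.default_rng(2)
    n_events, n_records = 50, 30
    event_terms = rng.normal(0.0, 0.3, n_events)              # true tau = 0.3
    residuals = event_terms[:, None] + rng.normal(0.0, 0.5, (n_events, n_records))

    tau = np.std(residuals.mean(axis=1))                      # between-event spread
    phi = np.std(residuals - residuals.mean(axis=1, keepdims=True))  # within-event
    print(f"tau={tau:.2f}, phi={phi:.2f}, sigma={np.hypot(tau, phi):.2f}")
    ```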

  5. Aerothermodynamic Analyses of Towed Ballutes

    Science.gov (United States)

    Gnoffo, Peter A.; Buck, Greg; Moss, James N.; Nielsen, Eric; Berger, Karen; Jones, William T.; Rudavsky, Rena

    2006-01-01

    A ballute (balloon-parachute) is an inflatable, aerodynamic drag device for application to planetary entry vehicles. Two challenging aspects of aerothermal simulation of towed ballutes are considered. The first challenge, simulation of a complete system including inflatable tethers and a trailing toroidal ballute, is addressed using the unstructured-grid, Navier-Stokes solver FUN3D. Auxiliary simulations of a semi-infinite cylinder using the rarefied flow, Direct Simulation Monte Carlo solver, DSV2, provide additional insight into limiting behavior of the aerothermal environment around tethers directly exposed to the free stream. Simulations reveal pressures higher than stagnation and corresponding large heating rates on the tether as it emerges from the spacecraft base flow and passes through the spacecraft bow shock. The footprint of the tether shock on the toroidal ballute is also subject to heating amplification. Design options to accommodate or reduce these environments are discussed. The second challenge addresses time-accurate simulation to detect the onset of unsteady flow interactions as a function of geometry and Reynolds number. Video of unsteady interactions measured in the Langley Aerothermodynamic Laboratory 20-Inch Mach 6 Air Tunnel and CFD simulations using the structured grid, Navier-Stokes solver LAURA are compared for flow over a rigid spacecraft-sting-toroid system. The experimental data provides qualitative information on the amplitude and onset of unsteady motion which is captured in the numerical simulations. The presence of severe unsteady fluid - structure interactions is undesirable and numerical simulation must be able to predict the onset of such motion.

  6. Measuring Quality Across Three Child Care Quality Rating and Improvement Systems: Findings from Secondary Analyses.

    OpenAIRE

    Lizabeth Malone; Gretchen Kirby; Pia Caronongan; Kimberly Boller; Kathryn Tout

    2011-01-01

    This report presents findings from an exploratory analysis of administrative data from three QRISs. The analyses examine the prevalence of quality components across centers and how they combine to result in an overall rating level and to predict observed quality.

  7. Nonlinear Analyses of the Dynamic Properties of Hydrostatic Bearing Systems

    Institute of Scientific and Technical Information of China (English)

    LIU Wei(刘伟); WU Xiujiang(吴秀江); V.A. Prokopenko

    2003-01-01

    Nonlinear analyses of hydrostatic bearing systems are necessary to adequately model the fluid-solid interaction. The dynamic properties of linear and nonlinear analytical models of hydrostatic bearings are compared in this paper. The analyses were based on the determination of the aperiodic border of transient processes with external step loads. The results show that the dynamic properties can be most effectively improved by increasing the hydrostatic bearing crosspiece width and additional pocket volume in a bearing can extend the load range for which the transient process is aperiodic, but an additional restrictor and capacitor (RC) chain must be introduced for increasing damping. The nonlinear analyses can also be used to predict typical design parameters for a hydrostatic bearing.

  8. Genetic Analyses in Health Laboratories: Current Status and Expectations

    Science.gov (United States)

    Finotti, Alessia; Breveglieri, Giulia; Borgatti, Monica; Gambari, Roberto

    Genetic analyses performed in health laboratories involve adult patients, newborns, embryos/fetuses, pre-implanted pre-embryos, pre-fertilized oocytes and should meet the major medical needs of hospitals and pharmaceutical companies. Recent data support the concept that, in addition to diagnosis and prognosis, genetic analyses might lead to development of personalized therapy. Novel frontiers in genetic testing involve the development of single cell analyses and non-invasive assays, including those able to predict outcome of cancer pathologies by looking at circulating tumor cells, DNA, mRNA and microRNAs. In this respect, PCR-free diagnostics appears to be one of the most interesting and appealing approaches.

  9. Pawnee Nation Energy Option Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Matlock, M.; Kersey, K.; Riding In, C.

    2009-07-21

    Pawnee Nation of Oklahoma Energy Option Analyses In 2003, the Pawnee Nation leadership identified the need for the tribe to comprehensively address its energy issues. During a strategic energy planning workshop a general framework was laid out and the Pawnee Nation Energy Task Force was created to work toward further development of the tribe’s energy vision. The overarching goals of the “first steps” project were to identify the most appropriate focus for its strategic energy initiatives going forward, and to provide information necessary to take the next steps in pursuit of the “best fit” energy options. Description of Activities Performed The research team reviewed existing data pertaining to the availability of biomass (focusing on woody biomass, agricultural biomass/bio-energy crops, and methane capture), solar, wind and hydropower resources on the Pawnee-owned lands. Using these data, combined with assumptions about costs and revenue streams, the research team performed preliminary feasibility assessments for each resource category. The research team also reviewed available funding resources and made recommendations to Pawnee Nation highlighting those resources with the greatest potential for financially-viable development, both in the near-term and over a longer time horizon. Findings and Recommendations Due to a lack of financial incentives for renewable energy, particularly at the state level, combined with mediocre renewable energy resources, renewable energy development opportunities are limited for Pawnee Nation. However, near-term potential exists for development of solar hot water at the gym, and an exterior wood-fired boiler system at the tribe’s main administrative building. Pawnee Nation should also explore options for developing LFGTE resources in collaboration with the City of Pawnee. Significant potential may also exist for development of bio-energy resources within the next decade. Pawnee Nation representatives should closely monitor

  10. Residual Strength Analyses of Monolithic Structures

    Science.gov (United States)

    Forth, Scott (Technical Monitor); Ambur, Damodar R. (Technical Monitor); Seshadri, B. R.; Tiwari, S. N.

    2003-01-01

    Finite-element fracture simulation methodology predicts the residual strength of damaged aircraft structures. The methodology uses the critical crack-tip-opening-angle (CTOA) fracture criterion to characterize the fracture behavior of the material. The CTOA fracture criterion assumes that stable crack growth occurs when the crack-tip angle reaches a constant critical value. The use of the CTOA criterion requires an elastic-plastic, finite-element analysis. The critical CTOA value is determined by simulating fracture behavior in laboratory specimens, such as a compact specimen, to obtain the angle that best fits the observed test behavior. The critical CTOA value appears to be independent of loading, crack length, and in-plane dimensions. However, it is a function of material thickness and local crack-front constraint. Modeling the local constraint requires either a three-dimensional analysis or a two-dimensional analysis with an approximation to account for the constraint effects. In recent times, as the aircraft industry has leaned towards monolithic structures with the intention of reducing part count and manufacturing cost, there has been a consistent effort at NASA Langley to extend the critical-CTOA-based numerical methodology to the analysis of integrally-stiffened panels. In this regard, a series of fracture tests were conducted on both flat and curved aluminum alloy integrally-stiffened panels. These flat panels were subjected to uniaxial tension and, during the tests, applied load-crack extension, out-of-plane displacements and local deformations around the crack-tip region were measured. Compact and middle-crack tension specimens were tested to determine the critical angle (ψc) using a three-dimensional code (ZIP3D) and the plane-strain core height (hc) using a two-dimensional code (STAGS). These values were then used in the STAGS analysis to predict the fracture behavior of the integrally-stiffened panels. The analyses modeled stable tearing, buckling, and crack
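
    As a minimal illustration of the CTOA criterion itself, the sketch below computes the crack-tip-opening angle from the crack-opening displacement measured a fixed distance behind the tip and applies a hypothetical critical value; the distance, displacements and critical angle are invented for illustration, not taken from the NASA tests.

        import math

        def ctoa_deg(delta, d=1.0e-3):
            """CTOA (degrees) from crack-opening displacement `delta` (m)
            measured a distance `d` (m) behind the crack tip."""
            return 2.0 * math.degrees(math.atan(delta / (2.0 * d)))

        CTOA_CRIT = 5.5  # degrees, hypothetical critical value
        for delta in (4.0e-5, 8.0e-5, 1.2e-4):
            grows = ctoa_deg(delta) >= CTOA_CRIT
            print(f"delta = {delta:.1e} m -> CTOA = {ctoa_deg(delta):.2f} deg, "
                  f"stable tearing: {grows}")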

  11. Uncertainty and Sensitivity Analyses of Duct Propagation Models

    Science.gov (United States)

    Nark, Douglas M.; Watson, Willie R.; Jones, Michael G.

    2008-01-01

    This paper presents results of uncertainty and sensitivity analyses conducted to assess the relative merits of three duct propagation codes. Results from this study are intended to support identification of a "working envelope" within which to use the various approaches underlying these propagation codes. This investigation considers a segmented liner configuration that models the NASA Langley Grazing Incidence Tube, for which a large set of measured data was available. For the uncertainty analysis, the selected input parameters (source sound pressure level, average Mach number, liner impedance, exit impedance, static pressure and static temperature) are randomly varied over a range of values. Uncertainty limits (95% confidence levels) are computed for the predicted values from each code, and are compared with the corresponding 95% confidence intervals in the measured data. Generally, the mean values of the predicted attenuation are observed to track the mean values of the measured attenuation quite well, and the predicted confidence intervals tend to be larger in the presence of mean flow. A two-level, six-factor sensitivity study is also conducted in which the six inputs are varied one at a time to assess their effect on the predicted attenuation. As expected, the results demonstrate the liner resistance and reactance to be the most important input parameters. They also indicate that the exit impedance is a significant contributor to uncertainty in the predicted attenuation.
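
    The uncertainty analysis described above can be sketched in miniature: sample the uncertain inputs, push them through a propagation model, and read off 95% limits from the output distribution. The surrogate attenuation function and input ranges below are invented stand-ins for a real duct propagation code.

        import numpy as np

        rng = np.random.default_rng(0)
        N = 10_000

        # Hypothetical input ranges (illustrative only)
        mach  = rng.uniform(0.0, 0.3, N)     # average Mach number
        res   = rng.uniform(0.5, 2.0, N)     # liner resistance (rho*c units)
        react = rng.uniform(-1.0, 1.0, N)    # liner reactance (rho*c units)

        def attenuation(mach, res, react):
            """Surrogate for a propagation code: peak attenuation near a
            resistance of 1 rho*c, degraded by mean flow (illustrative)."""
            return 30.0 / (1.0 + (res - 1.0) ** 2 + react ** 2) * (1.0 - 0.5 * mach)

        att = attenuation(mach, res, react)
        lo, hi = np.percentile(att, [2.5, 97.5])
        print(f"mean {att.mean():.1f} dB, 95% limits [{lo:.1f}, {hi:.1f}] dB")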

  12. VICTORIA-92 pretest analyses of PHEBUS-FPT0

    Energy Technology Data Exchange (ETDEWEB)

    Bixler, N.E.; Erickson, C.M.

    1994-01-01

    FPT0 is the first of six tests that are scheduled to be conducted in an experimental reactor in Cadarache, France. The test apparatus consists of an in-pile fuel bundle, an upper plenum, a hot leg, a steam generator, a cold leg, and a small containment. Thus, the test is integral in the sense that it attempts to simulate all of the processes that would be operative in a severe nuclear accident. In FPT0, the fuel will be trace-irradiated; in subsequent tests high burn-up fuel will be used. This report discusses separate pretest analyses of the FPT0 fuel bundle and primary circuit that were conducted using the USNRC's source term code, VICTORIA-92. Predictions for the release of fission product, control rod, and structural elements from the test section are compared with those given by CORSOR-M. In general, the releases predicted by VICTORIA-92 occur earlier than those predicted by CORSOR-M. The other notable difference is that U release is predicted to be on a par with that of the control rod elements; CORSOR-M predicts U release to be about 2 orders of magnitude greater.

  13. Social Network Analyses and Nutritional Behavior: An Integrated Modeling Approach

    Directory of Open Access Journals (Sweden)

    Alistair McNair Senior

    2016-01-01

    Full Text Available Animals have evolved complex foraging strategies to obtain a nutritionally balanced diet and associated fitness benefits. Recent advances in nutrition research, combining state-space models of nutritional geometry with agent-based models of systems biology, show how nutrient-targeted foraging behavior can also influence animal social interactions, ultimately affecting collective dynamics and group structures. Here we demonstrate how social network analyses can be integrated into such a modeling framework and provide a tangible and practical analytical tool to compare experimental results with theory. We illustrate our approach by examining the case of nutritionally mediated dominance hierarchies. First, we show how nutritionally explicit agent-based models that simulate the emergence of dominance hierarchies can be used to generate social networks. Importantly, the structural properties of our simulated networks bear similarities to dominance networks of real animals (where conflicts are not always directly related to nutrition). Finally, we demonstrate how metrics from social network analyses can be used to predict the fitness of agents in these simulated competitive environments. Our results highlight the potential importance of nutritional mechanisms in shaping dominance interactions in a wide range of social and ecological contexts. Nutrition likely influences social interaction in many species, and yet a theoretical framework for exploring these effects is currently lacking. Combining social network analyses with computational models from nutritional ecology may bridge this divide, representing a pragmatic approach for generating theoretical predictions for nutritional experiments.
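
    A toy version of this pipeline (simulate state-dependent contests, build a directed interaction network, extract a network metric) might look like the following; the agents, their nutritional states and the win rule are invented for illustration and are not the models used in the paper.

        import random
        import networkx as nx

        random.seed(1)
        agents = list(range(8))
        # Hypothetical nutritional state per agent; better-fed agents win more
        state = {a: random.random() for a in agents}

        G = nx.DiGraph()
        G.add_nodes_from(agents)
        for _ in range(300):
            a, b = random.sample(agents, 2)
            p_a = state[a] / (state[a] + state[b])
            winner, loser = (a, b) if random.random() < p_a else (b, a)
            w = G.get_edge_data(winner, loser, default={"weight": 0})["weight"]
            G.add_edge(winner, loser, weight=w + 1)

        # Simple dominance index: weighted out-degree (total wins)
        dominance = {a: G.out_degree(a, weight="weight") for a in agents}
        for a in sorted(agents, key=dominance.get, reverse=True):
            print(f"agent {a}: wins {dominance[a]}, state {state[a]:.2f}")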

  14. Genotype-phenotype analyses of classic neuronal ceroid lipofuscinosis (NCLs): genetic predictions from clinical and pathological findings

    Institute of Scientific and Technical Information of China (English)

    Weina JU; W. Ted BROWN; Nanbert ZHONG; Anetta WRONSKA; Dorota N. MOROZIEWICZ; Rocksheng ZHONG; Natalia WISNIEWSKI; Anna JURKIEWICZ; Michael FIORY; Krystyna E. WISNIEWSKI; Lance JOHNSTON

    2006-01-01

    Objective: Genotype-phenotype associations were studied in 517 subjects clinically affected by classical neuronal ceroid lipofuscinosis (NCL). Methods: Genetic loci CLN1-3 were analyzed in regard to age of onset, initial neurological symptoms, and electron microscope (EM) profiles. Results: The most common initial symptom leading to a clinical evaluation was developmental delay (30%) in NCL1, seizures (42.4%) in NCL2, and vision problems (53.5%) in NCL3. Eighty-two percent of NCL1 cases had granular osmiophilic deposits (GRODs) or mixed-GROD-containing EM profiles; 94% of NCL2 cases had curvilinear (CV) or mixed-CV-containing profiles; and 91% of NCL3 had fingerprint (FP) or mixed-FP-containing profiles. The mixed-type EM profile was found in approximately one-third of the NCL cases. DNA mutations within a specific CLN gene were further correlated with NCL phenotypes. Seizures were noticed to associate with common mutations 523G>A and 636C>T of CLN2 in NCL2 but not with common mutations 223G>A and 451C>T of CLN1 in NCL1. Vision loss was the initial symptom in all types of mutations in NCL3. Surprisingly, our data showed that the age of onset was atypical in 51.3% of NCL1 (infantile form) cases, 19.7% of NCL2 (late-infantile form) cases, and 42.8% of NCL3 (juvenile form) cases. Conclusion: Our data provide an overall picture regarding the clinical recognition of classical childhood NCLs. This may assist in the prediction and genetic identification of NCL1-3 via their characteristic clinical features.

  15. Economische analyse van de Nederlandse biotechnologiesector

    OpenAIRE

    Giessen, A.M. van der; Gijsbers, G.W.; Koops, R.; Zee, F.A. van der

    2014-01-01

    Commissioned by the Netherlands Commission on Genetic Modification (COGEM), TNO carried out a desk study entitled “Economische analyse van de Nederlandse biotechnologiesector” (Economic analysis of the Dutch biotechnology sector). This analysis is one of the preliminary studies that COGEM is having carried out in preparation for the Trendanalyse Biotechnologie (Trend Analysis Biotechnology), which is expected to be conducted in 2015. For this analysis, COGEM asked TNO to map out the developments, trends and opportunities of biotechnology anew, with an emphasis on econo...

  16. [Anne Arold. Kontrastive Analyse...] / Paul Alvre

    Index Scriptorium Estoniae

    Alvre, Paul, 1921-2008

    2001-01-01

    Review of: Arold, Anne. Kontrastive Analyse der Wortbildungsmuster im Deutschen und im Estnischen (am Beispiel der Aussehensadjektive). Tartu, 2000. (Dissertationes philologiae germanicae Universitatis Tartuensis)

  17. Predicting protein structure classes from function predictions

    DEFF Research Database (Denmark)

    Sommer, I.; Rahnenfuhrer, J.; de Lichtenberg, Ulrik;

    2004-01-01

    We introduce a new approach to using the information contained in sequence-to-function prediction data in order to recognize protein template classes, a critical step in predicting protein structure. The data on which our method is based comprise probabilities of functional categories; for given......-to-structure prediction methods....

  18. Star 48 solid rocket motor nozzle analyses and instrumented firings

    Science.gov (United States)

    Porter, R. L.

    1986-01-01

    The analyses and testing performed by NASA in support of an expanded and improved nozzle design database for use by the U.S. solid rocket motor industry are presented. A production nozzle with a history of one ground failure and two flight failures was selected for analysis and testing. The stress analysis was performed with the Champion computer code developed by the U.S. Navy. Several improvements were made to the code. Strain predictions were made and compared to test data. Two short-duration motor firings were conducted with highly instrumented nozzles. The first nozzle had 58 thermocouples, 66 strain gages, and 8 bondline pressure measurements. The second nozzle had 59 thermocouples, 68 strain measurements, and 8 bondline pressure measurements. Most of this instrumentation was on the nonmetallic parts, and provided significantly more thermal and strain data on the nonmetallic components of a nozzle than had been accumulated in any solid rocket motor test to date.

  19. Assessment of protein disorder region predictions in CASP10

    KAUST Repository

    Monastyrskyy, Bohdan

    2013-11-22

    The article presents the assessment of disorder region predictions submitted to CASP10. The evaluation is based on the three measures tested in previous CASPs: (i) balanced accuracy, (ii) the Matthews correlation coefficient for the binary predictions, and (iii) the area under the curve in the receiver operating characteristic (ROC) analysis of predictions using probability annotation. We also performed new analyses such as comparison of the submitted predictions with those obtained with a Naïve disorder prediction method and with predictions from the disorder prediction databases D2P2 and MobiDB. On average, the methods participating in CASP10 demonstrated slightly better performance than those in CASP9.
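
    The three CASP measures are easy to reproduce for a toy prediction; the sketch below uses scikit-learn, with residue labels and disorder probabilities invented purely for illustration.

        import numpy as np
        from sklearn.metrics import (balanced_accuracy_score,
                                     matthews_corrcoef, roc_auc_score)

        rng = np.random.default_rng(42)

        # Toy 30-residue chain: 1 = disordered (invented labels)
        y_true = (rng.random(30) < 0.3).astype(int)
        # Invented per-residue disorder probabilities, loosely tracking truth
        p_dis = np.clip(0.3 + 0.4 * y_true + rng.normal(0, 0.15, 30), 0, 1)
        y_pred = (p_dis >= 0.5).astype(int)

        print("balanced accuracy:", balanced_accuracy_score(y_true, y_pred))
        print("MCC:              ", matthews_corrcoef(y_true, y_pred))
        print("ROC AUC:          ", roc_auc_score(y_true, p_dis))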

  20. Economische analyse van de Nederlandse biotechnologiesector

    NARCIS (Netherlands)

    Giessen, A.M. van der; Gijsbers, G.W.; Koops, R.; Zee, F.A. van der

    2014-01-01

    Commissioned by the Netherlands Commission on Genetic Modification (COGEM), TNO carried out a desk study entitled “Economische analyse van de Nederlandse biotechnologiesector” (Economic analysis of the Dutch biotechnology sector). This analysis is one of the preliminary studies that COGEM is having carried out in preparation for the Trendanalyse Biotechnologie (Trend Analysis Biotechnology), which is expected to be conducted in 2015.

  1. New environmental metabarcodes for analysing soil DNA

    DEFF Research Database (Denmark)

    Epp, Laura S.; Boessenkool, Sanne; Bellemain, Eva P.;

    2012-01-01

    Metabarcoding approaches use total and typically degraded DNA from environmental samples to analyse biotic assemblages and can potentially be carried out for any kinds of organisms in an ecosystem. These analyses rely on specific markers, here called metabarcodes, which should be optimized for ta....... The metabarcoding approach has considerable potential for biodiversity screening of modern samples and also as a palaeoecological tool....

  2. Novel Algorithms for Astronomical Plate Analyses

    Indian Academy of Sciences (India)

    Rene Hudec; Lukas Hudec

    2011-03-01

    Powerful computers and dedicated software allow effective data mining and scientific analyses in astronomical plate archives. We give and discuss examples of newly developed algorithms for astronomical plate analyses, e.g., searches for optical transients, as well as for major spectral and brightness changes.

  3. Cost-Benefit Analyses of Transportation Investments

    DEFF Research Database (Denmark)

    Næss, Petter

    2006-01-01

    This paper discusses the practice of cost-benefit analyses of transportation infrastructure investment projects from the meta-theoretical perspective of critical realism. Such analyses are based on a number of untenable ontological assumptions about social value, human nature and the natural environment... willingness-to-pay investigations. Accepting the ontological and epistemological assumptions of cost-benefit analysis involves an implicit acceptance of the ethical and political values favoured by these assumptions. Cost-benefit analyses of transportation investment projects tend to neglect long-term environmental consequences... and needs among population groups with a low ability to pay. Instead of cost-benefit analyses, impact analyses evaluating the likely effects of project alternatives against a wide range of societal goals are recommended, with quantification and economic valorisation only for impact categories where this can...

  4. Prediction of coefficients of thermal expansion for unidirectional composites

    Science.gov (United States)

    Bowles, David E.; Tompkins, Stephen S.

    1989-01-01

    Several analyses for predicting the longitudinal, alpha(1), and transverse, alpha(2), coefficients of thermal expansion of unidirectional composites were compared with each other, and with experimental data on different graphite fiber reinforced resin, metal, and ceramic matrix composites. Analytical and numerical analyses that accurately accounted for Poisson restraining effects in the transverse direction were in consistently better agreement with experimental data for alpha(2), than the less rigorous analyses. All of the analyses predicted similar values of alpha(1), and were in good agreement with the experimental data. A sensitivity analysis was conducted to determine the relative influence of constituent properties on the predicted values of alpha(1), and alpha(2). As would be expected, the prediction of alpha(1) was most sensitive to longitudinal fiber properties and the prediction of alpha(2) was most sensitive to matrix properties.
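
    A back-of-the-envelope version of these predictions is shown below, using the stiffness-weighted rule of mixtures for alpha(1) and Schapery's approximation, which retains the Poisson restraint, for alpha(2); the constituent properties are generic graphite/epoxy placeholders, not values from the paper.

        # Simple micromechanics estimates for unidirectional composite CTEs
        Ef, Em = 230e9, 3.5e9      # fiber/matrix Young's moduli, Pa
        af, am = -0.5e-6, 55e-6    # fiber/matrix CTEs, 1/K
        nuf, num = 0.2, 0.35       # fiber/matrix Poisson's ratios
        Vf = 0.6                   # fiber volume fraction
        Vm = 1.0 - Vf

        # Longitudinal CTE: stiffness-weighted rule of mixtures (Schapery)
        alpha1 = (Ef * af * Vf + Em * am * Vm) / (Ef * Vf + Em * Vm)

        # Transverse CTE: Schapery's approximation, which accounts for the
        # Poisson restraining effect the abstract highlights
        alpha2 = ((1 + nuf) * af * Vf + (1 + num) * am * Vm
                  - alpha1 * (nuf * Vf + num * Vm))
        print(f"alpha1 = {alpha1:.2e} 1/K, alpha2 = {alpha2:.2e} 1/K")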

  5. The psychological status of phonological analyses

    Directory of Open Access Journals (Sweden)

    David Eddington

    2015-09-01

    Full Text Available This paper casts doubt on the psychological relevance of many phonological analyses. There are four reasons for this: (1) theoretical adequacy does not necessarily imply psychological significance; (2) most approaches are nonempirical in that they are not subject to potential spatiotemporal falsification; (3) phonological analyses are established with little or no recourse to the speakers of the language via experimental psychology; (4) the limited base of evidence on which most analyses are founded is further cause for skepticism.

  6. L’Analyse de discours des Sociologues

    OpenAIRE

    Demailly, Lise

    2013-01-01

    Sociologists use discourse analysis as a method of analysis. The research presented here examined this method, its specific features and its contributions to the teaching of expression techniques (T.E.). It emerges that the sociologist first produces discourses (through interviews and observation) and then analyses and processes them. These discourses are difficult to use in T.E., saturated as they are with theoretical and even ideological stakes.

  7. Predictability of blocking

    International Nuclear Information System (INIS)

    Tibaldi and Molteni (1990, hereafter referred to as TM) had previously investigated operational blocking predictability by the ECMWF model and the possible relationships between model systematic error and blocking in the winter season of the Northern Hemisphere, using seven years of ECMWF operational archives of analyses and day 1 to 10 forecasts. They showed that fewer blocking episodes than in the real atmosphere were generally simulated by the model, and that this deficiency increased with increasing forecast time. As a consequence of this, a major contribution to the systematic error in the winter season was shown to derive from the inability of the model to properly forecast blocking. In this study, the analysis performed in TM for the first seven winter seasons of the ECMWF operational model is extended to the subsequent five winters, during which model development, reflecting both resolution increases and parametrisation modifications, continued unabated. In addition the objective blocking index developed by TM has been applied to the observed data to study the natural low frequency variability of blocking. The ability to simulate blocking of some climate models has also been tested

  8. Anthocyanin analyses of Vaccinium fruit dietary supplements

    Science.gov (United States)

    Vaccinium fruit ingredients within dietary supplements were identified by comparisons with anthocyanin analyses of known Vaccinium profiles (demonstration of anthocyanin fingerprinting). Available Vaccinium supplements were purchased and analyzed; their anthocyanin profiles (based on HPLC separation...

  9. 49 CFR 1180.7 - Market analyses.

    Science.gov (United States)

    2010-10-01

    Department of Transportation, Rules of Practice: Railroad Acquisition, Control, Merger, Consolidation Project, Trackage Rights, and Lease Procedures; General Acquisition Procedures, § 1180.7 Market analyses. (a) For... the company's marketing plan and existing and potential competitive alternatives (inter- as well as...

  10. Understanding Human Error Based on Automated Analyses

    Data.gov (United States)

    National Aeronautics and Space Administration — This is a report on a continuing study of automated analyses of experiential textual reports to gain insight into the causal factors of human errors in aviation...

  11. Random error in cardiovascular meta-analyses

    DEFF Research Database (Denmark)

    Albalawi, Zaina; McAlister, Finlay A; Thorlund, Kristian;

    2013-01-01

    BACKGROUND: Cochrane reviews are viewed as the gold standard in meta-analyses given their efforts to identify and limit systematic error which could cause spurious conclusions. The potential for random error to cause spurious conclusions in meta-analyses is less well appreciated. METHODS: We examined all reviews approved and published by the Cochrane Heart Group in the 2012 Cochrane Library that included at least one meta-analysis with 5 or more randomized trials. We used trial sequential analysis to classify statistically significant meta-analyses as true positives if their pooled sample size and/or their cumulative Z-curve crossed the O'Brien-Fleming monitoring boundaries for detecting a RRR of at least 25%. We classified meta-analyses that did not achieve statistical significance as true negatives if their pooled sample size was sufficient to reject a RRR of 25%. RESULTS: Twenty three

  12. A History of Rotorcraft Comprehensive Analyses

    Science.gov (United States)

    Johnson, Wayne

    2013-01-01

    A history of the development of rotorcraft comprehensive analyses is presented. Comprehensive analyses are digital computer programs that calculate the aeromechanical behavior of the rotor and aircraft, bringing together the most advanced models of the geometry, structure, dynamics, and aerodynamics available in rotary wing technology. The development of the major codes of the last five decades from industry, government, and universities is described. A number of common themes observed in this history are discussed.

  13. Thermal Analyses of Cross-Linked Polyethylene

    Directory of Open Access Journals (Sweden)

    Radek Polansky

    2007-01-01

    Full Text Available The paper summarizes results obtained from structural analyses: differential scanning calorimetry (DSC), thermogravimetry (TG), thermomechanical analysis (TMA) and Fourier transform infrared spectroscopy (FT-IR). Samples of cross-linked polyethylene cable insulation were tested via these analyses. The DSC and TG measurements were carried out using a TA Instruments SDT Q600 simultaneous thermal analyzer coupled to a Nicolet 380 Fourier transform infrared spectrometer. Thermomechanical analysis was carried out with a TA Instruments TMA Q400EM apparatus.

  14. Finite element analyses of CCAT preliminary design

    Science.gov (United States)

    Sarawit, Andrew T.; Kan, Frank W.

    2014-07-01

    This paper describes the development of the CCAT telescope finite element model (FEM) and the analyses performed to support the preliminary design work. CCAT will be a 25 m diameter telescope operating in the 0.2 to 2 mm wavelength range. It will be located at an elevation of 5600 m on Cerro Chajnantor in Northern Chile, near ALMA. The telescope will be equipped with wide-field cameras and spectrometers mounted at the two Nasmyth foci. The telescope will be inside an enclosure to protect it from wind buffeting, direct solar heating, and bad weather. The main structures of the telescope include a steel Mount and a carbon-fiber-reinforced-plastic (CFRP) primary truss. The finite element model developed in this study was used to perform modal, frequency response, seismic response spectrum, stress, and deflection analyses of the telescope. Modal analyses were performed to compute the structure's natural frequencies and mode shapes and to obtain reduced-order modal output at selected locations in the telescope structure to support the design of the Mount control system. Modal frequency response analyses were also performed to compute transfer functions at these selected locations. Seismic response spectrum analyses of the telescope subject to the Maximum Likely Earthquake were performed to compute peak accelerations and seismic demand stresses. Stress analyses were performed for gravity load to obtain gravity demand stresses. Deflection analyses for gravity load, thermal load, and differential elevation drive torque were performed so that the CCAT Observatory can verify that the structures meet the stringent telescope surface and pointing error requirements.
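
    At its core, each modal analysis above solves a generalized eigenproblem on the model's stiffness and mass matrices. A minimal sketch on a toy two-degree-of-freedom system (matrices invented, not from the CCAT model) looks like this:

        import numpy as np
        from scipy.linalg import eigh

        # Toy 2-DOF mass and stiffness matrices (illustrative values only)
        M = np.diag([1.0e4, 5.0e3])              # kg
        K = np.array([[ 4.0e8, -1.0e8],
                      [-1.0e8,  1.0e8]])         # N/m

        # Generalized eigenproblem K phi = w^2 M phi
        w2, modes = eigh(K, M)
        freqs_hz = np.sqrt(w2) / (2.0 * np.pi)
        print("natural frequencies [Hz]:", freqs_hz.round(2))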

  15. Nonparametric bootstrap prediction

    OpenAIRE

    Fushiki, Tadayoshi; Komaki, Fumiyasu; Aihara, Kazuyuki

    2005-01-01

    Ensemble learning has recently been intensively studied in the field of machine learning. 'Bagging' is a method of ensemble learning that uses bootstrap data to construct various predictors. The required prediction is then obtained by averaging the predictors. Harris proposed using this technique with the parametric bootstrap predictive distribution to construct predictive distributions, and showed that the parametric bootstrap predictive distribution gives asymptotically better prediction tha...
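
    A minimal sketch of the bagging idea, averaging base predictors fitted to nonparametric bootstrap resamples, is shown below; the data, the base learner (a polynomial fit) and all settings are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(0)

        # Toy regression data (illustrative only)
        x = np.linspace(0.0, 1.0, 40)
        y = np.sin(2.0 * np.pi * x) + rng.normal(0.0, 0.3, x.size)

        def fit_predict(xb, yb, xq, deg=5):
            """One base predictor: a polynomial fitted to a resample."""
            return np.polyval(np.polyfit(xb, yb, deg), xq)

        # Bagging: average predictors trained on bootstrap resamples
        B = 200
        xq = np.linspace(0.0, 1.0, 11)
        preds = np.empty((B, xq.size))
        for b in range(B):
            idx = rng.integers(0, x.size, x.size)  # bootstrap resample
            preds[b] = fit_predict(x[idx], y[idx], xq)

        print("bagged prediction:", preds.mean(axis=0).round(2))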

  16. Integrative genomic analyses of a novel cytokine, interleukin-34 and its potential role in cancer prediction

    OpenAIRE

    Wang, Bo; Xu, Wenming; TAN, MIAOLIAN; Xiao, Yan; Yang, Haiwei; Xia, Tian-Song

    2014-01-01

    Interleukin-34 (IL-34) is a novel cytokine, which is composed of 222 amino acids and forms homodimers. It binds to the macrophage colony-stimulating factor (M-CSF) receptor and plays an important role in innate immunity and inflammatory processes. In the present study, we identified the completed IL-34 gene in 25 various mammalian genomes and found that IL-34 existed in all types of vertebrates, including fish, amphibians, birds and mammals. These species have a similar 7 exon/6 intron gene o...

  17. Use of CFD Analyses to Predict Disk Friction Loss of Centrifugal Compressor Impellers

    Science.gov (United States)

    Cho, Leesang; Lee, Seawook; Cho, Jinsoo

    To improve the total efficiency of centrifugal compressors, it is necessary to reduce the disk friction loss, which is expressed as a power loss. In this study, the disk friction loss due to the effects of axial clearance and surface roughness is analyzed, and methods to reduce it are proposed. The rotating reference frame technique in a commercial CFD tool (FLUENT) is used for steady-state analysis of the centrifugal compressor. Numerical results of the CFD analysis are compared with theoretical results from established experimental empirical equations. The disk friction loss of the impeller decreases with increasing axial clearance as long as the axial clearance between the impeller disk and the casing is smaller than the boundary layer thickness. In addition, the disk friction loss of the impeller increases with increasing surface roughness, in a pattern similar to that of existing empirical formulas. The disk friction loss of the impeller is affected more by surface roughness than by changes in axial clearance. To minimize the disk friction loss on a centrifugal compressor impeller, the axial clearance should be designed to equal the theoretical boundary layer thickness.

  18. Fertility prediction of frozen boar sperm using novel and conventional analyses

    Science.gov (United States)

    Frozen-thawed boar sperm is seldom used for artificial insemination (AI) because fertility is lower than fresh or cooled semen. Despite the many advantages of AI including reduced pathogen exposure and ease of semen transport, cryo-induced damage to sperm usually results in decreased litter sizes a...

  19. Analyses of the predicted changes of the global oceans under the increased greenhouse gases scenarios

    Institute of Scientific and Technical Information of China (English)

    MU Lin; WU Dexing; CHEN Xue'en; J Jungclaus

    2006-01-01

    A new climate model (ECHAM5/MPIOM1), developed for the fourth assessment report of the Intergovernmental Panel on Climate Change (IPCC) at the Max Planck Institute for Meteorology, is used to study climate changes under different increased-CO2 scenarios (B1, A1B and A2). Based on the model results, the sea surface temperature and salinity structure, the variations of the thermohaline circulation (THC) and the changes of sea ice in the northern hemisphere are analyzed. It is concluded that from 2000 to 2100, under the B1, A1B and A2 scenarios, the global mean sea surface temperature (SST) would increase by 2.5℃, 3.5℃ and 4.0℃ respectively; in the Arctic region the increase in SST would be above 10.0℃. The maximal negative value of the variation of the freshwater flux is located in the subtropical oceans, while the precipitation in the eastern tropical Pacific increases. The strength of the THC decreases under the B1, A1B and A2 scenarios, with reductions of about 20%, 25% and 25.1% of the present THC strength respectively. In the northern hemisphere, the area of sea ice cover would decrease by about 50% under the A1B scenario.

  20. ANALYSING URBAN EFFECTS IN BUDAPEST USING THE WRF NUMERICAL WEATHER PREDICTION MODEL

    Directory of Open Access Journals (Sweden)

    JÚLIA GÖNDÖCS

    2016-03-01

    Full Text Available Continuously growing cities significantly modify the environment through air pollution and modification of the land surface, resulting in an altered energy budget and altered land-atmosphere exchange processes over built-up areas. These effects appear mainly in cities and metropolitan areas, leading to the Urban Heat Island (UHI) phenomenon, which occurs due to the temperature difference between built-up areas and their cooler surroundings. The Weather Research and Forecasting (WRF) mesoscale model, coupled to a multilayer urban canopy parameterisation, is used to investigate this phenomenon for Budapest and its surroundings with actual land surface properties. In this paper the basic ideas of our research and the methodology are presented in brief. The simulation covers one week in summer 2015, with initial meteorological fields from Global Forecast System (GFS) outputs, under atmospheric conditions of weak wind and clear sky over the Pannonian Basin. Then, to improve the WRF model and its settings, the calculated skin temperature is compared to remotely sensed measurements from the Aqua and Terra satellites, and the temporal and spatial bias values are estimated.

  1. Behavioral and Physiological Neural Network Analyses: A Common Pathway toward Pattern Recognition and Prediction

    Science.gov (United States)

    Ninness, Chris; Lauter, Judy L.; Coffee, Michael; Clary, Logan; Kelly, Elizabeth; Rumph, Marilyn; Rumph, Robin; Kyle, Betty; Ninness, Sharon K.

    2012-01-01

    Using 3 diversified datasets, we explored the pattern-recognition ability of the Self-Organizing Map (SOM) artificial neural network as applied to diversified nonlinear data distributions in the areas of behavioral and physiological research. Experiment 1 employed a dataset obtained from the UCI Machine Learning Repository. Data for this study…

  2. Integrative genomic analyses of a novel cytokine, interleukin-34 and its potential role in cancer prediction.

    Science.gov (United States)

    Wang, Bo; Xu, Wenming; Tan, Miaolian; Xiao, Yan; Yang, Haiwei; Xia, Tian-Song

    2015-01-01

    Interleukin-34 (IL-34) is a novel cytokine, which is composed of 222 amino acids and forms homodimers. It binds to the macrophage colony-stimulating factor (M-CSF) receptor and plays an important role in innate immunity and inflammatory processes. In the present study, we identified the completed IL-34 gene in 25 various mammalian genomes and found that IL-34 existed in all types of vertebrates, including fish, amphibians, birds and mammals. These species have a similar 7 exon/6 intron gene organization. The phylogenetic tree indicated that the IL-34 genes from the primate lineage, rodent lineage and teleost lineage form species-specific clusters. It was found that mammalian IL-34 was under positive selection pressure, with 196Val identified as a positively selected site. Fifty-five functionally relevant single nucleotide polymorphisms (SNPs), including 32 SNPs causing missense mutations, 3 exonic splicing enhancer SNPs and 20 SNPs causing nonsense mutations, were identified from 2,141 available SNPs in the human IL-34 gene. IL-34 was expressed in various types of cancer, including blood, brain, breast, colorectal, eye, head and neck, lung, ovarian and skin cancer. A total of 5 out of 40 tests (1 blood cancer, 1 brain cancer, 1 colorectal cancer and 2 lung cancer) revealed an association between IL-34 gene expression and cancer prognosis. It was found that the association between the expression of IL-34 and cancer prognosis varied in different types of cancer, even in the same type of cancer from different databases. This suggests that the function of IL-34 in these tumors may be multidimensional. The upstream transcription factor 1 (USF1), regulatory factor X-1 (RFX1), Sp1 transcription factor, POU class 3 homeobox 2 (POU3F2) and forkhead box L1 (FOXL1) regulatory transcription factor binding sites were identified in the IL-34 gene upstream (promoter) region, which may be involved in the effects of IL-34 in tumors. PMID:25395235

  3. Gene expression array analyses predict increased proto-oncogene expression in MMTV induced mammary tumors.

    Science.gov (United States)

    Popken-Harris, Pamela; Kirchhof, Nicole; Harrison, Ben; Harris, Lester F

    2006-08-01

    Exogenous infection by milk-borne mouse mammary tumor viruses (MMTV) typically induces mouse mammary tumors in genetically susceptible mice at a rate of 90-95% by 1 year of age. In contrast to other transforming retroviruses, MMTV acts as an insertional mutagen and, under the influence of steroid hormones, induces oncogenic transformation after insertion into the host genome. As these events correspond with increases in adjacent proto-oncogene transcription, we used expression array profiling to determine which commonly associated MMTV insertion site proto-oncogenes were transcriptionally active in MMTV-induced mouse mammary tumors. To verify our gene expression array results we developed real-time quantitative RT-PCR assays for the common MMTV insertion site genes found in RIII/Sa mice (int-1/wnt-1, int-2/fgf-3, int-3/Notch 4, and fgf8/AIGF) as well as two genes that were consistently up-regulated (CCND1 and MAT-8) and two genes that were consistently down-regulated (FN1 and MAT-8) in the MMTV-induced tumors as compared to normal mammary gland. Finally, each tumor was also examined histopathologically. Our expression array findings support a model whereby just one or a few common MMTV insertions into the host genome set up a dominant cascade of events that leave a characteristic molecular signature.

  4. Monitoring and prediction of natural disasters

    International Nuclear Information System (INIS)

    The problems of predicting natural disasters, and of synthesizing environmental monitoring systems to collect, store, and process the information needed to solve them, are analysed. A three-level methodology is proposed for making decisions concerning natural disaster dynamics. The methodology is based on the assessment of environmental indicators and the use of numerical models of the environment

  5. Identifying, analysing and solving problems in practice.

    Science.gov (United States)

    Hewitt-Taylor, Jaqui

    When a problem is identified in practice, it is important to clarify exactly what it is and establish the cause before seeking a solution. This solution-seeking process should include input from those directly involved in the problematic situation, to enable individuals to contribute their perspective, appreciate why any change in practice is necessary and what will be achieved by the change. This article describes some approaches to identifying and analysing problems in practice so that effective solutions can be devised. It includes a case study and examples of how the Five Whys analysis, fishbone diagram, problem tree analysis, and Seven-S Model can be used to analyse a problem.

  6. TOGGLE: toolbox for generic NGS analyses

    OpenAIRE

    Monat, Cécile; Tranchant-Dubreuil, Christine; Kougbeadjo, Ayité; Farcy , Cédric; Ortega-Abboud, Enrique; Amanzougarene, Souhila; Ravel, Sébastien; Agbessi, Mawusse; Orjuela-Bouniol, Julie; Summo, Marilyne; Sabot, François

    2015-01-01

    Background: The explosion of NGS (Next Generation Sequencing) sequence data requires a huge effort in Bioinformatics methods and analyses. The creation of dedicated, robust and reliable pipelines able to handle dozens of samples from raw FASTQ data to relevant biological data is a time-consuming task in all projects relying on NGS. To address this, we created a generic and modular toolbox for developing such pipelines. Results: TOGGLE (TOolbox for Generic nGs anaLysEs) is a suite of tools abl...

  7. Analyse de discours et demande sociale

    OpenAIRE

    Cislaru, Georgeta; Garnier, Sylvie; Matras, Marie-Thérèse; Pugnière-Saavedra, Frédéric; Rousseau, Patrick; Sitri, Frédérique; Veniard, Marie

    2010-01-01

    What can discourse analysis reveal about societal practices and the discursive practices that underlie them? In questioning discourse, discourse analysis also questions the bodies that produce it: political, media and institutional bodies. It has thus engaged, over the past forty years, in a fruitful interdisciplinary dialogue. With five contributions from discourse analysts and two from child protection professionals, this issue of the Carnets du ...

  8. TOGGLE: toolbox for generic NGS analyses

    OpenAIRE

    Monat, Cécile; Tranchant-Dubreuil, Christine; Kougbeadjo, Ayité; Farcy, Cédric; Ortega-Abboud, Enrique; Amanzougarene, Souhila; Ravel, Sébastien; Agbessi, Mawusse; Orjuela-Bouniol , Julie; Summo, Marilyne; Sabot, François

    2015-01-01

    Background The explosion of NGS (Next Generation Sequencing) sequence data requires a huge effort in Bioinformatics methods and analyses. The creation of dedicated, robust and reliable pipelines able to handle dozens of samples from raw FASTQ data to relevant biological data is a time-consuming task in all projects relying on NGS. To address this, we created a generic and modular toolbox for developing such pipelines. Results TOGGLE (TOolbox for Generic nGs anaLysEs) is a suite of tools able ...

  9. Prosjektering og analyse av en spennarmert betongbru

    OpenAIRE

    Strand, Elin Holsten; Kaldbekkdalen, Ann-Kristin

    2014-01-01

    The purpose of this report is to carry out the analysis and design of a post-tensioned concrete bridge. Modelling and analysis were performed in NovaFrame 5. Part of the task was to determine the tendon system and the cross-section depth of the bridge. Six tendons were assumed in the span, and twelve over the supports. The cross-section depth was set to 1.3 metres. The design was carried out in accordance with the applicable Eurocodes, relevant documents and Håndbok 185, which was prepared...

  10. A theoretical framework for analysing preschool teaching

    DEFF Research Database (Denmark)

    Chaiklin, Seth

    2014-01-01

    This article introduces a theoretical framework for analysing preschool teaching as a historically-grounded societal practice. The aim is to present a unified framework that can be used to analyse and compare both historical and contemporary examples of preschool teaching practice within and across national traditions. The framework has two main components, an analysis of preschool teaching as a practice, formed in relation to societal needs, and an analysis of the categorical relations which necessarily must be addressed in preschool teaching activity. The framework is introduced and illustrated... teaching as a practice.

  11. Interferences in reactor neutron activation analyses

    International Nuclear Information System (INIS)

    It has been shown that interfering reactions may occur in neutron activation analyses of aluminum and zinc matrixes, which are commonly used in nuclear areas. The interferences analysed were Al-27(n,α)Na-24 and Zn-64(n,p)Cu-64. The method used was non-destructive neutron activation analysis, and the spectra were obtained in a 1024-channel multichannel system coupled with a Ge(Li) detector. Sodium was detected in aluminum samples from the reactor tank and pneumatic transfer system. The independence of the sodium concentration in samples in the range of 0-100 ppm is shown by the attenuation obtained with the samples encapsulated in cadmium. (Author)

  12. Predictive modeling of complications.

    Science.gov (United States)

    Osorio, Joseph A; Scheer, Justin K; Ames, Christopher P

    2016-09-01

    Predictive analytic algorithms are designed to identify patterns in the data that allow for accurate predictions without the need for a hypothesis. Therefore, predictive modeling can provide detailed and patient-specific information that can be readily applied when discussing the risks of surgery with a patient. There are few studies using predictive modeling techniques in the adult spine surgery literature. These types of studies represent the beginning of the use of predictive analytics in spine surgery outcomes. We will discuss the advancements in the field of spine surgery with respect to predictive analytics, the controversies surrounding the technique, and the future directions. PMID:27286683

  13. PREDICTING TURBINE STAGE PERFORMANCE

    Science.gov (United States)

    Boyle, R. J.

    1994-01-01

    This program was developed to predict turbine stage performance taking into account the effects of complex passage geometries. The method uses a quasi-3D inviscid-flow analysis iteratively coupled to calculated losses so that changes in losses result in changes in the flow distribution. In this manner the effects of both the geometry on the flow distribution and the flow distribution on losses are accounted for. The flow may be subsonic or shock-free transonic. The blade row may be fixed or rotating, and the blades may be twisted and leaned. This program has been applied to axial and radial turbines, and is helpful in the analysis of mixed flow machines. This program is a combination of the flow analysis programs MERIDL and TSONIC coupled to the boundary layer program BLAYER. The subsonic flow solution is obtained by a finite difference, stream function analysis. Transonic blade-to-blade solutions are obtained using information from the finite difference, stream function solution with a reduced flow factor. Upstream and downstream flow variables may vary from hub to shroud and provision is made to correct for loss of stagnation pressure. Boundary layer analyses are made to determine profile and end-wall friction losses. Empirical loss models are used to account for incidence, secondary flow, disc windage, and clearance losses. The total losses are then used to calculate stator, rotor, and stage efficiency. This program is written in FORTRAN IV for batch execution and has been implemented on an IBM 370/3033 under TSS with a central memory requirement of approximately 4.5 Megs of 8 bit bytes. This program was developed in 1985.
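
    The iterative coupling described above, in which calculated losses feed back into the flow solution until both converge, can be sketched as a simple fixed-point loop. The two functions below are invented stand-ins for the MERIDL/TSONIC flow solution and the BLAYER-plus-empirical loss models, not their actual internals.

        def flow_solution(total_loss):
            """Stand-in for the quasi-3D inviscid flow analysis: returns a
            crude velocity scale for a given total loss level."""
            return 200.0 * (1.0 - 0.5 * total_loss)

        def losses(velocity):
            """Stand-in for boundary-layer and empirical loss models."""
            profile = 1.0e-4 * velocity   # friction losses grow with velocity
            secondary = 0.01              # fixed empirical contribution
            return profile + secondary

        total_loss = 0.0
        for it in range(50):
            velocity = flow_solution(total_loss)
            new_loss = losses(velocity)
            if abs(new_loss - total_loss) < 1.0e-8:
                break
            total_loss = new_loss

        print(f"converged after {it + 1} iterations; total loss {total_loss:.4f}")
        print(f"stage efficiency estimate: {1.0 - total_loss:.4f}")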

  14. The prediction of different experiences of longterm illness

    DEFF Research Database (Denmark)

    Blank, N; Diderichsen, Finn

    1996-01-01

    To analyse the role played by socioeconomic factors and self rated general health in the prediction of the reporting of severe longterm illness, and the extent to which these factors explain social class differences in the reporting of such illness.

  15. Cosmetology: Task Analyses. Competency-Based Education.

    Science.gov (United States)

    Henrico County Public Schools, Glen Allen, VA. Virginia Vocational Curriculum Center.

    These task analyses are designed to be used in combination with the "Trade and Industrial Education Service Area Resource" in order to implement competency-based education in the cosmetology program in Virginia. The task analysis document contains the task inventory, suggested task sequence lists, and content outlines for the secondary courses…

  16. Chemical Analyses of Silicon Aerogel Samples

    CERN Document Server

    van der Werf, I; De Leo, R; Marrone, S

    2008-01-01

    After five years of operation, two aerogel counters, A1 and A2, taking data in Hall A at Jefferson Lab suffered a loss of performance. In this note possible causes of the degradation have been studied. In particular, various chemical and physical analyses have been carried out on several aerogel tiles and on adhesive tape in order to reveal the presence of contaminants.

  17. A gamma model for DNA mixture analyses

    OpenAIRE

    Cowell, R. G.; Lauritzen, S L; Mortera, J.

    2007-01-01

    We present a new methodology for analysing forensic identification problems involving DNA mixture traces where several individuals may have contributed to the trace. The model used for identification and separation of DNA mixtures is based on a gamma distribution for peak area values. In this paper we illustrate the gamma model and apply it on several real examples from forensic casework.
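
    As a small illustration of the distributional assumption, the sketch below fits a gamma distribution to invented allele peak-area data with SciPy; the shape and scale values are placeholders, not casework parameters.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)

        # Invented peak-area data for one contributor's alleles
        peak_areas = rng.gamma(shape=8.0, scale=150.0, size=40)

        # Fit a gamma distribution with the location fixed at zero,
        # as appropriate for non-negative peak areas
        shape, loc, scale = stats.gamma.fit(peak_areas, floc=0.0)
        print(f"fitted shape {shape:.2f}, scale {scale:.1f}, "
              f"mean area {shape * scale:.0f}")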

  18. Micromechanical photothermal analyser of microfluidic samples

    DEFF Research Database (Denmark)

    2014-01-01

    The present invention relates to a micromechanical photothermal analyser of microfluidic samples comprising an oblong micro-channel extending longitudinally from a support element, the micro-channel is made from at least two materials with different thermal expansion coefficients, wherein the...

  19. Amino acid analyses of Apollo 14 samples.

    Science.gov (United States)

    Gehrke, C. W.; Zumwalt, R. W.; Kuo, K.; Aue, W. A.; Stalling, D. L.; Kvenvolden, K. A.; Ponnamperuma, C.

    1972-01-01

    Detection limits were between 300 pg and 1 ng for different amino acids, in an analysis by gas-liquid chromatography of water extracts from Apollo 14 lunar fines in which amino acids were converted to their N-trifluoro-acetyl-n-butyl esters. Initial analyses of water and HCl extracts of sample 14240 and 14298 samples showed no amino acids above background levels.

  20. Meta-analyses on viral hepatitis

    DEFF Research Database (Denmark)

    Gluud, Lise L; Gluud, Christian

    2009-01-01

    This article summarizes the meta-analyses of interventions for viral hepatitis A, B, and C. Some of the interventions assessed are described in small trials with unclear bias control. Other interventions are supported by large, high-quality trials. Although attempts have been made to adjust...

  1. Analyses of hydraulic performance of velocity caps

    DEFF Research Database (Denmark)

    Christensen, Erik Damgaard; Degn Eskesen, Mark Chr.; Buhrkall, Jeppe;

    2014-01-01

    The hydraulic performance of a velocity cap has been investigated. Velocity caps are often used in connection with offshore intakes. CFD (computational fluid dynamics) was used to examine the flow through the cap openings and further down into the intake pipes. This was combined with dimension analyses...

  2. Comparing functional annotation analyses with Catmap

    Directory of Open Access Journals (Sweden)

    Krogh Morten

    2004-12-01

    Full Text Available Abstract Background Ranked gene lists from microarray experiments are usually analysed by assigning significance to predefined gene categories, e.g., based on functional annotations. Tools performing such analyses are often restricted to a category score based on a cutoff in the ranked list and a significance calculation based on random gene permutations as null hypothesis. Results We analysed three publicly available data sets, in each of which samples were divided in two classes and genes ranked according to their correlation to class labels. We developed a program, Catmap (available for download at http://bioinfo.thep.lu.se/Catmap), to compare different scores and null hypotheses in gene category analysis, using Gene Ontology annotations for category definition. When a cutoff-based score was used, results depended strongly on the choice of cutoff, introducing an arbitrariness in the analysis. Comparing results using random gene permutations and random sample permutations, respectively, we found that the assigned significance of a category depended strongly on the choice of null hypothesis. Compared to sample label permutations, gene permutations gave much smaller p-values for large categories with many coexpressed genes. Conclusions In gene category analyses of ranked gene lists, a cutoff-independent score is preferable. The choice of null hypothesis is very important; random gene permutations do not work well as an approximation to sample label permutations.
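
    The gene-permutation null hypothesis is easy to sketch with a cutoff-independent score such as the mean rank of the category's genes; the ranking and the category below are simulated placeholders, and a sample-label permutation null would additionally require the underlying expression matrix.

        import numpy as np

        rng = np.random.default_rng(7)

        n_genes, cat_size = 5000, 40
        category = rng.choice(n_genes, size=cat_size, replace=False)

        # Invented ranking in which half the category sits near the top
        ranks = rng.permutation(n_genes).astype(float)
        ranks[category[:20]] = rng.integers(0, 300, 20)

        score = ranks[category].mean()  # cutoff-independent category score

        # Gene-permutation null: scores of randomly drawn categories
        null = np.array([ranks[rng.choice(n_genes, cat_size, replace=False)].mean()
                         for _ in range(10_000)])
        p = (np.sum(null <= score) + 1) / (null.size + 1)
        print(f"mean rank {score:.0f}, gene-permutation p = {p:.4f}")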

  3. FAME: Software for analysing rock microstructures

    Science.gov (United States)

    Hammes, Daniel M.; Peternell, Mark

    2016-05-01

    Determination of rock microstructures leads to a better understanding of the formation and deformation of polycrystalline solids. Here, we present FAME (Fabric Analyser based Microstructure Evaluation), an easy-to-use MATLAB®-based software for processing datasets recorded by an automated fabric analyser microscope. FAME is provided as a MATLAB®-independent Windows® executable with an intuitive graphical user interface. Raw data from the fabric analyser microscope can be automatically loaded, filtered and cropped before analysis. Accurate and efficient rock microstructure analysis is based on an advanced user-controlled grain labelling algorithm. The preview and testing environments simplify the determination of appropriate analysis parameters. Various statistic and plotting tools allow a graphical visualisation of the results such as grain size, shape, c-axis orientation and misorientation. The FAME2elle algorithm exports fabric analyser data to an elle (modelling software)-supported format. FAME supports batch processing for multiple thin section analysis or large datasets that are generated for example during 2D in-situ deformation experiments. The use and versatility of FAME is demonstrated on quartz and deuterium ice samples.

  4. Uncertainty quantification approaches for advanced reactor analyses.

    Energy Technology Data Exchange (ETDEWEB)

    Briggs, L. L.; Nuclear Engineering Division

    2009-03-24

    The original approach to nuclear reactor design or safety analyses was to make very conservative modeling assumptions so as to ensure meeting the required safety margins. Traditional regulation, as established by the U.S. Nuclear Regulatory Commission, required conservatisms which have subsequently been shown to be excessive. The Commission has therefore moved away from excessively conservative evaluations and has determined best-estimate calculations to be an acceptable alternative to conservative models, provided the best-estimate results are accompanied by an uncertainty evaluation which can demonstrate that, when a set of analysis cases which statistically account for uncertainties of all types are generated, there is a 95% probability that at least 95% of the cases meet the safety margins. To date, nearly all published work addressing uncertainty evaluations of nuclear power plant calculations has focused on light water reactors and on large-break loss-of-coolant accident (LBLOCA) analyses. However, there is nothing in the uncertainty evaluation methodologies that is limited to a specific type of reactor or to specific types of plant scenarios. These same methodologies can be applied equally well to analyses for high-temperature gas-cooled reactors and liquid metal reactors, and they can be applied to steady-state calculations, operational transients, or severe accident scenarios. This report reviews and compares both statistical and deterministic uncertainty evaluation approaches. Recommendations are given for selecting an uncertainty methodology and for considerations to be factored into the process of evaluating uncertainties for advanced reactor best-estimate analyses.
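
    The 95/95 criterion quoted above has a classical nonparametric reading (Wilks' formula): if the largest of N independent best-estimate runs is taken as the bound, the confidence of covering 95% of the population is 1 - 0.95**N, which first reaches 0.95 at N = 59. A one-line check of that classical result:

        def wilks_n(coverage=0.95, confidence=0.95):
            """Smallest N such that the max of N runs is a one-sided
            coverage/confidence tolerance limit (first-order Wilks)."""
            n = 1
            while 1.0 - coverage ** n < confidence:
                n += 1
            return n

        print(wilks_n())  # -> 59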

  5. What's missing from avian global diversification analyses?

    Science.gov (United States)

    Reddy, Sushma

    2014-08-01

    The accumulation of vast numbers of molecular phylogenetic studies has contributed to huge knowledge gains in the evolutionary history of birds. This permits subsequent analyses of avian diversity, such as how and why diversification varies across the globe and among taxonomic groups. However, available genetic data for these meta-analyses are unevenly distributed across different geographic regions and taxonomic groups. To comprehend the impact of this variation on the interpretation of global diversity patterns, I examined the availability of genetic data for possible biases in geographic and taxonomic sampling of birds. I identified three main disparities of sampling that are geographically associated with latitude (temperate, tropical), hemispheres (East, West), and range size. Tropical regions, which host the vast majority of species, are substantially less studied. Moreover, Eastern regions, such as the Old World Tropics and Australasia, stand out as being disproportionately undersampled, with up to half of communities not being represented in recent studies. In terms of taxonomic discrepancies, a majority of genetically undersampled clades are exclusively found in tropical regions. My analysis identifies several disparities in the key regions of interest of global diversity analyses. Differential sampling can have considerable impacts on these global comparisons and call into question recent interpretations of latitudinal or hemispheric differences of diversification rates. Moreover, this review pinpoints understudied regions whose biota are in critical need of modern systematic analyses.

  6. The Economic Cost of Homosexuality: Multilevel Analyses

    Science.gov (United States)

    Baumle, Amanda K.; Poston, Dudley, Jr.

    2011-01-01

    This article builds on earlier studies that have examined "the economic cost of homosexuality," by using data from the 2000 U.S. Census and by employing multilevel analyses. Our findings indicate that partnered gay men experience a 12.5 percent earnings penalty compared to married heterosexual men, and a statistically insignificant earnings…

  7. Comparison of veterinary import risk analyses studies

    NARCIS (Netherlands)

    Vos-de Jong, de C.J.; Conraths, F.J.; Adkin, A.; Jones, E.M.; Hallgren, G.S.; Paisley, L.G.

    2011-01-01

    Twenty-two veterinary import risk analyses (IRAs) were audited: a) for inclusion of the main elements of risk analysis; b) between different types of IRAs; c) between reviewers' scores. No significant differences were detected between different types of IRAs, although quantitative IRAs and IRAs publ

  8. Nonlinear Combustion Instability Prediction

    Science.gov (United States)

    Flandro, Gary

    2010-01-01

    The liquid rocket engine stability prediction software (LCI) predicts combustion stability of systems using LOX-LH2 propellants. Both longitudinal and transverse mode stability characteristics are calculated. This software has the unique feature of being able to predict system limit amplitude.

  9. Spent fuel shipping costs for transportation logistics analyses

    International Nuclear Information System (INIS)

    Logistics analyses supplied to the nuclear waste management programs of the U.S. Department of Energy through the Transportation Technology Center (TTC) at Sandia National Laboratories are used to predict nuclear waste material logistics, transportation packaging demands, shipping and receiving rates, and transportation-related costs for alternative strategies. This study is an in-depth analysis of the problems and contingencies associated with the costs of shipping irradiated reactor fuel. These costs are extremely variable, however, and have changed frequently (sometimes monthly) during the past few years due to changes in capital, fuel, and labor costs. All costs and charges reported in this study are based on January 1982 data using existing transport cask systems and should be used as relative indices only. Actual shipping costs would be negotiable for each origin-destination combination

  10. Monte Carlo uncertainty analyses for integral beryllium experiments

    CERN Document Server

    Fischer, U; Tsige-Tamirat, H

    2000-01-01

    The novel Monte Carlo technique for calculating point detector sensitivities has been applied to two representative beryllium transmission experiments with the objective of investigating the sensitivity of important responses such as the neutron multiplication and of assessing the related uncertainties due to the underlying cross-section data uncertainties. As an important result, it has been revealed that the neutron multiplication power of beryllium can be predicted with good accuracy using state-of-the-art nuclear data evaluations. Severe discrepancies do exist for the spectral neutron flux distribution that would translate into significant uncertainties of the calculated neutron spectra and of the nuclear blanket performance in blanket design calculations. With regard to this, it is suggested to re-analyse the secondary energy and angle distribution data of beryllium by means of Monte Carlo based sensitivity and uncertainty calculations. Related code development work is underway.

  11. Photovoltaic System Modeling. Uncertainty and Sensitivity Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, Clifford W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Martin, Curtis E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    We report an uncertainty and sensitivity analysis for modeling AC energy from photovoltaic systems. Output from a PV system is predicted by a sequence of models. We quantify uncertainty in the output of each model using empirical distributions of each model's residuals. We propagate uncertainty through the sequence of models by sampling these distributions to obtain an empirical distribution of a PV system's output. We consider models that: (1) translate measured global horizontal, direct and global diffuse irradiance to plane-of-array irradiance; (2) estimate effective irradiance; (3) predict cell temperature; (4) estimate DC voltage, current and power; (5) reduce DC power for losses due to inefficient maximum power point tracking or mismatch among modules; and (6) convert DC to AC power. Our analysis considers a notional PV system comprising an array of FirstSolar FS-387 modules and a 250 kW AC inverter; we use measured irradiance and weather at Albuquerque, NM. We found the uncertainty in PV system output to be relatively small, on the order of 1% for daily energy. We found the uncertainty in the models for POA irradiance and effective irradiance to be the dominant contributors to uncertainty in predicted daily energy. Our analysis indicates that efforts to reduce the uncertainty in PV system output predictions may yield the greatest improvements by focusing on the POA and effective irradiance models.
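
    The residual-sampling scheme described above lends itself to a compact sketch. The sub-models and residual spreads below are illustrative stand-ins for the six-step chain named in the abstract, not the report's actual models:

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder sub-models for the chain described above (POA irradiance ->
# effective irradiance -> cell temperature -> DC power -> AC power).
def poa_irradiance(ghi):          return 1.1 * ghi
def effective_irradiance(poa):    return 0.95 * poa
def cell_temperature(poa, t_air): return t_air + 0.03 * poa
def dc_power(e_eff, t_cell):      return 0.2 * e_eff * (1 - 0.004 * (t_cell - 25.0))
def ac_power(p_dc):               return 0.96 * p_dc

# Hypothetical empirical residuals of each sub-model; in the study these
# were distributions of measured-minus-modeled values.
residuals = {
    "poa": rng.normal(0.0, 10.0, 1000),  # W/m^2
    "eff": rng.normal(0.0, 5.0, 1000),   # W/m^2
    "tc":  rng.normal(0.0, 1.0, 1000),   # deg C
}

def sample_output(ghi, t_air, n=5000):
    """Propagate uncertainty by adding a randomly drawn residual to each
    sub-model's output before passing it down the chain."""
    poa = poa_irradiance(ghi) + rng.choice(residuals["poa"], n)
    e_eff = effective_irradiance(poa) + rng.choice(residuals["eff"], n)
    t_cell = cell_temperature(poa, t_air) + rng.choice(residuals["tc"], n)
    return ac_power(dc_power(e_eff, t_cell))

p_ac = sample_output(ghi=800.0, t_air=20.0)
print(f"AC power: {p_ac.mean():.2f} +/- {p_ac.std():.2f} (arbitrary units)")
```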

  12. Testing earthquake predictions

    Science.gov (United States)

    Luen, Brad; Stark, Philip B.

    2008-01-01

    Statistical tests of earthquake predictions require a null hypothesis to model occasional chance successes. To define and quantify 'chance success' is knotty. Some null hypotheses ascribe chance to the Earth: Seismicity is modeled as random. The null distribution of the number of successful predictions - or any other test statistic - is taken to be its distribution when the fixed set of predictions is applied to random seismicity. Such tests tacitly assume that the predictions do not depend on the observed seismicity. Conditioning on the predictions in this way sets a low hurdle for statistical significance. Consider this scheme: When an earthquake of magnitude 5.5 or greater occurs anywhere in the world, predict that an earthquake at least as large will occur within 21 days and within an epicentral distance of 50 km. We apply this rule to the Harvard centroid-moment-tensor (CMT) catalog for 2000-2004 to generate a set of predictions. The null hypothesis is that earthquake times are exchangeable conditional on their magnitudes and locations and on the predictions - a common "nonparametric" assumption in the literature. We generate random seismicity by permuting the times of events in the CMT catalog. We consider an event successfully predicted only if (i) it is predicted and (ii) there is no larger event within 50 km in the previous 21 days. The P-value for the observed success rate is <0.001: The method successfully predicts about 5% of earthquakes, far better than 'chance' because the predictor exploits the clustering of earthquakes - occasional foreshocks - which the null hypothesis lacks. Rather than condition on the predictions and use a stochastic model for seismicity, it is preferable to treat the observed seismicity as fixed, and to compare the success rate of the predictions to the success rate of simple-minded predictions like those just described. If the proffered predictions do no better than a simple scheme, they have little value.
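
    The permutation test described above is easy to sketch. The catalog below is synthetic and the 50 km distance criterion is omitted, so this is only an illustration of the testing logic, not a reproduction of the study:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic catalog of 200 events: times in days and magnitudes >= 5.5.
# The real analysis used the Harvard CMT catalog and also required the
# predicting event to lie within 50 km; the distance test is omitted here.
times = np.sort(rng.uniform(0.0, 1800.0, 200))
mags = 5.5 + rng.exponential(0.4, 200)

def n_successes(times, mags, window=21.0):
    """Count events preceded, within `window` days, by an event of equal
    or smaller magnitude, i.e. events the rule successfully predicts."""
    n = 0
    for i in range(len(times)):
        in_win = (times < times[i]) & (times >= times[i] - window)
        if np.any(mags[in_win] <= mags[i]):
            n += 1
    return n

obs = n_successes(times, mags)

# Null hypothesis: event times are exchangeable given magnitudes, so
# shuffling the magnitude labels over the fixed times simulates 'random'
# seismicity without clustering.
null = np.array([n_successes(times, rng.permutation(mags))
                 for _ in range(199)])
p_value = (1 + np.sum(null >= obs)) / (1 + len(null))
print(f"observed successes: {obs}, p = {p_value:.3f}")
```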

  13. STACE: Source Term Analyses for Containment Evaluations of transport casks

    International Nuclear Information System (INIS)

    The development of the Source Term Analyses for Containment Evaluations (STACE) methodology provides a unique means for estimating the probability of cladding breach within transport casks, quantifying the amount of radioactive material released into the cask interior, and calculating the releasable radionuclide concentrations and corresponding maximum permissible leakage rates. Following the guidance of ANSI N14.5, the STACE methodology provides a technically defensible means for estimating maximum permissible leakage rates. These containment criteria attempt to reflect the true radiological hazard by performing a detailed examination of the spent fuel, CRUD, and residual contamination contributions to the releasable source term. The spent fuel contribution to the source term has been modeled fairly accurately using the STACE methodology. The structural model predicts the cask drop load history, the mechanical response of the fuel assembly, and the probability of cladding breach. These data are then used to predict the amount of fission gas, volatile species, and fuel fines that are releasable from the cask. In some areas where data are sparse or lacking, experimental validation is planned. Finally, the ANSI N14.5 recommendation that 3% and 100% of the fuel rods fail during normal and hypothetical accident conditions of transport, respectively, has been shown to be overly conservative by several orders of magnitude for these example analyses. Furthermore, the maximum permissible leakage rates for this example assembly under normal and hypothetical accident conditions are significantly higher than the leaktight requirements. By relaxing the maximum permissible leakage rates, the source term methodology is expected to significantly improve cask economics and safety

  14. Three-dimensional lake water quality modeling: sensitivity and uncertainty analyses.

    Science.gov (United States)

    Missaghi, Shahram; Hondzo, Miki; Melching, Charles

    2013-11-01

    Two sensitivity and uncertainty analysis methods are applied to a three-dimensional coupled hydrodynamic-ecological model (ELCOM-CAEDYM) of a morphologically complex lake. The primary goals of the analyses are to increase confidence in the model predictions, identify influential model parameters, quantify the uncertainty of model predictions, and explore the spatial and temporal variability of model predictions. The influence of model parameters on four model-predicted variables (model output) and the contributions of each of the model-predicted variables to the total variation in model output are presented. Predicted water temperature, dissolved oxygen, total phosphorus, and algal biomass contributed 3, 13, 26, and 58% of the total model output variance, respectively. The fraction of variance resulting from model parameter uncertainty was calculated by two methods and used for evaluation and ranking of the most influential model parameters. Nine out of the top 10 parameters identified by each method agreed, but their ranks were different. Spatial and temporal changes of model uncertainty were investigated and visualized. Model uncertainty appeared to be concentrated around specific water depths and dates that corresponded to significant storm events. The results suggest that spatial and temporal variations in the predicted water quality variables are sensitive to the hydrodynamics of physical perturbations such as those caused by stream inflows generated by storm events. The sensitivity and uncertainty analyses identified the mineralization of dissolved organic carbon, the sediment phosphorus release rate, the algal metabolic loss rate, the internal phosphorus concentration, and the phosphorus uptake rate as the most influential model parameters.
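
    A generic way to produce the kind of parameter ranking reported here is to regress standardized model output on standardized parameter samples; the toy model and parameter names below are placeholders, not the ELCOM-CAEDYM formulation:

```python
import numpy as np

rng = np.random.default_rng(1)

names = ["DOC_mineralization", "sed_P_release", "algal_loss", "P_uptake"]

def model(p):
    """Toy stand-in for the lake model: a scalar water quality output."""
    doc, p_rel, loss, uptake = p.T
    return 2.0 * doc + 0.5 * p_rel + 1.2 * loss * uptake

X = rng.uniform(0.0, 1.0, size=(2000, 4))  # sampled parameter sets
y = model(X)

# Standardized regression coefficients (SRCs): for near-linear responses,
# |SRC| ranks the influence of each parameter on the output variance.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
ys = (y - y.mean()) / y.std()
src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
for name, c in sorted(zip(names, src), key=lambda t: -abs(t[1])):
    print(f"{name:20s} SRC = {c:+.2f}")
```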

  15. Albedo Pattern Recognition and Time-Series Analyses in Malaysia

    Science.gov (United States)

    Salleh, S. A.; Abd Latif, Z.; Mohd, W. M. N. Wan; Chan, A.

    2012-07-01

    Pattern recognition and time-series analyses enable one to evaluate and generate predictions of specific phenomena. Albedo pattern and time-series analyses are particularly useful for climate condition monitoring. This study was conducted to identify changes in Malaysia's albedo pattern. The patterns and changes identified will be useful for a variety of environmental and climate monitoring studies, such as carbon budgeting and aerosol mapping. Ten years (2000-2009) of MODIS satellite images were used for the analyses and interpretation. The images were processed using ERDAS Imagine remote sensing software, ArcGIS 9.3, the 6S code for atmospheric calibration, and several MODIS tools (MRT, HDF2GIS, Albedo tools). Several methods for time-series analyses were explored; this paper demonstrates trend and seasonal time-series analyses using the converted HDF-format MODIS MCD43A3 albedo land product. The results revealed significant changes in albedo percentages over the past 10 years, and patterns related to Malaysia's nebulosity index (NI) and aerosol optical depth (AOD). A noticeable trend can be identified in the maximum and minimum values of the albedo. The rises and falls of the line graph show a similar trend to the daily observations, differing in the magnitude of the rises and falls of albedo. It can thus be concluded that the temporal behaviour of land surface albedo in Malaysia is uniform with respect to the local monsoons. However, although the average albedo shows a linear trend with the nebulosity index, the pattern changes of albedo with respect to the nebulosity index indicate that external factors also affect the albedo values, as the plotted sky conditions and diffusion do not have a uniform trend over the years, especially when the trend at 5-year intervals is examined; 2000 shows a high negative linear
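
    The trend and seasonal analyses mentioned above can be sketched with a standard additive decomposition; the monthly series below is synthetic, standing in for area-averaged MCD43A3 albedo values:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

rng = np.random.default_rng(7)

# Hypothetical monthly mean albedo for 2000-2009: a weak downward trend
# plus a monsoon-driven annual cycle and noise.
months = pd.date_range("2000-01", periods=120, freq="MS")
t = np.arange(120)
albedo = (0.14 - 0.00005 * t
          + 0.01 * np.sin(2.0 * np.pi * t / 12.0)
          + rng.normal(0.0, 0.002, 120))
series = pd.Series(albedo, index=months)

# Additive decomposition into trend, seasonal cycle and residual.
result = seasonal_decompose(series, model="additive", period=12)
print(result.trend.dropna().iloc[[0, -1]])  # trend at start vs. end
print(result.seasonal.iloc[:12].round(4))   # estimated annual cycle
```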

  16. Identifying, analysing and solving problems in practice.

    Science.gov (United States)

    Hewitt-Taylor, Jaqui

    When a problem is identified in practice, it is important to clarify exactly what it is and establish the cause before seeking a solution. This solution-seeking process should include input from those directly involved in the problematic situation, to enable individuals to contribute their perspective, appreciate why any change in practice is necessary and what will be achieved by the change. This article describes some approaches to identifying and analysing problems in practice so that effective solutions can be devised. It includes a case study and examples of how the Five Whys analysis, fishbone diagram, problem tree analysis, and Seven-S Model can be used to analyse a problem. PMID:22848969

  17. Analyses of cavitation instabilities in ductile metals

    DEFF Research Database (Denmark)

    Tvergaard, Viggo

    2007-01-01

    …, and also tests for a thin ductile metal layer bonding two ceramic blocks have indicated rapid void growth. Analyses for these material configurations are discussed here. When the void radius is very small, a nonlocal plasticity model is needed to account for observed size-effects, and recent analyses … analyzed for a material containing a periodic distribution of spherical voids with two different void sizes, where the stress fields around larger voids may accelerate the growth of smaller voids. Another approach has been an analysis of a unit cell model in which a central cavity is discretely represented, while the surrounding voids are represented by a porous ductile material model in terms of a field quantity that specifies the variation of the void volume fraction in the surrounding metal.

  18. Reliability of chemical analyses of water samples

    Energy Technology Data Exchange (ETDEWEB)

    Beardon, R.

    1989-11-01

    Ground-water quality investigations require reliable chemical analyses of water samples. Unfortunately, laboratory analytical results are often unreliable. The Uranium Mill Tailings Remedial Action (UMTRA) Project's solution to this problem was to establish a two-phase quality assurance program for the analysis of water samples. In the first phase, eight laboratories analyzed three solutions of known composition. The analytical accuracy of each laboratory was ranked and three laboratories were awarded contracts. The second phase consists of on-going monitoring of the reliability of the selected laboratories. The following conclusions are based on two years' experience with the UMTRA Project's Quality Assurance Program. The reliability of laboratory analyses should not be taken for granted. Analytical reliability may be independent of the prices charged by laboratories. Quality assurance programs benefit both the customer and the laboratory.

  19. Pratique de l'analyse fonctionelle

    CERN Document Server

    Tassinari, Robert

    1997-01-01

    Developing a product or service that is perfectly suited to the customer's needs and requirements is essential for a company. To leave nothing to chance, a rigorous methodology must be followed: that of functional analysis. This book precisely defines this method and its fields of application. It describes the most effective methods in terms of product design and the pursuit of quality, and introduces the notion of internal functional analysis. A key book for optimizing product design processes within one's company. -- Key ideas, by Business Digest

  20. Mikromechanische Analyse der Wirkungsmechanismen elektrischer Dehnungsmessstreifen

    OpenAIRE

    Stockmann, Martin

    2000-01-01

    Electrical strain measurement based on discrete strain gauges (DMS) is today one of the most important methods of experimental stress analysis. Precise measurements outside the calibration conditions, in particular at large deformations or with high transverse-strain components, require taking into account the non-linear relationships between the component strains to be determined and the change in resistance of the measuring grid. ...

  1. El Cours d’Analyse de Cauchy

    OpenAIRE

    Pérez, Javier; Aizpuru, Antonio

    1999-01-01

    In this article we present a contextualized study of Cauchy's Cours d'Analyse, analysing its significance and importance. We pay special attention to the degree of theoretical elaboration of limits, continuity, series, real numbers, functions and complete series, relating Cauchy's contributions to the conceptual level that preceded this work.

  2. Mass spectrometer for the analyses of gases

    International Nuclear Information System (INIS)

    A 6-in-radius, 60° magnetic-sector mass spectrometer (designated as the MS-200) has been constructed for the quantitative and qualitative analyses of fixed gases and volatile organics in the concentration range from 1 ppm (by volume) to 100%. A partial pressure of 1 × 10⁻⁶ torr in the inlet expansion volume is required to achieve a useful signal at an electron-multiplier gain of 10,000

  3. Ethics of cost analyses in medical education

    OpenAIRE

    Walsh, Kieran

    2013-01-01

    Cost analyses in medical education are rarely straightforward, and rarely lead to clear-cut conclusions. Occasionally they do lead to clear conclusions but even when that happens, some stakeholders will ask difficult but valid questions about what to do following cost analyses–specifically about distributive justice in the allocation of resources. At present there are few or no debates about these issues and rationing decisions that are taken in medical education are largely made subconscious...

  4. En Billig GPS Data Analyse Platform

    DEFF Research Database (Denmark)

    Andersen, Ove; Christiansen, Nick; Larsen, Niels T.;

    2011-01-01

    This article presents a complete software platform for the analysis of GPS data. The platform is built entirely from open-source components. The individual components of the platform are described in detail. Advantages and drawbacks of using open source are discussed, including which IT-policy initiatives … organisations with a digital road map and GPS data can begin to perform traffic analyses on these data. It is a requirement that suitable IT competencies are present in the organisation.

  5. Delvis drenert analyse av innvendig avstivet utgraving

    OpenAIRE

    Myhrvold, Michael F

    2013-01-01

    This master's thesis deals with analyses of the partially drained effects that can arise in internally braced excavations. The purpose of the thesis is to carry out a numerical study of the process that governs the time-dependent development of braced excavations in low-permeability soils. This makes it possible to assess the partially drained effects and the influence they have on this type of excavation. Since the behaviour of the soil at small …

  6. ANALYSING SPACE: ADAPTING AND EXTENDING MULTIMODAL SEMIOTICS

    Directory of Open Access Journals (Sweden)

    Louise J. Ravelli

    2015-07-01

    In the field of multimodal discourse analysis, one of the most exciting sites of application is that of 3D space: examining aspects of the built environment for its meaning-making potential. For the built environment – homes, offices, public buildings, parks, etc. – does indeed make meaning. These are spaces which speak – often their meanings are so familiar, we no longer hear what they say; sometimes, new and unusual sites draw attention to their meanings, and they are hotly contested. This chapter suggests ways of analyzing 3D texts, based on the framework of Kress and van Leeuwen (2006). This framework, developed primarily for the analysis of 2D images, has been successfully extended to a range of other multimodal texts. Extensions to the built environment include Pang (2004), O'Toole (1994), Ravelli (2006), Safeyton (2004), Stenglin (2004) and White (1994), whose studies inform the analyses presented here. This article identifies some of the key theoretical principles underlying this approach, including the notions of text, context and metafunction, and describes some of the main areas of analysis for 3D texts. Ways of bringing the analyses together are also considered. The analyses are demonstrated in relation to the Scientia building at the University of New South Wales, Australia.

  7. Sequencing and comparative analyses of the genomes of zoysiagrasses.

    Science.gov (United States)

    Tanaka, Hidenori; Hirakawa, Hideki; Kosugi, Shunichi; Nakayama, Shinobu; Ono, Akiko; Watanabe, Akiko; Hashiguchi, Masatsugu; Gondo, Takahiro; Ishigaki, Genki; Muguerza, Melody; Shimizu, Katsuya; Sawamura, Noriko; Inoue, Takayasu; Shigeki, Yuichi; Ohno, Naoki; Tabata, Satoshi; Akashi, Ryo; Sato, Shusei

    2016-04-01

    Zoysia is a warm-season turfgrass, which comprises 11 allotetraploid species (2n = 4x = 40), each possessing different morphological and physiological traits. To characterize the genetic systems of Zoysia plants and to analyse their structural and functional differences in individual species and accessions, we sequenced the genomes of Zoysia species using HiSeq and MiSeq platforms. As a reference sequence of Zoysia species, we generated a high-quality draft sequence of the genome of Z. japonica accession 'Nagirizaki' (334 Mb) in which 59,271 protein-coding genes were predicted. In parallel, draft genome sequences of Z. matrella 'Wakaba' and Z. pacifica 'Zanpa' were also generated for comparative analyses. To investigate the genetic diversity among the Zoysia species, genome sequence reads of three additional accessions, Z. japonica 'Kyoto', Z. japonica 'Miyagi' and Z. matrella 'Chiba Fair Green', were accumulated, and aligned against the reference genome of 'Nagirizaki' along with those from 'Wakaba' and 'Zanpa'. As a result, we detected 7,424,163 single-nucleotide polymorphisms and 852,488 short indels among these species. The information obtained in this study will be valuable for basic studies on zoysiagrass evolution and genetics as well as for the breeding of zoysiagrasses, and is made available in the 'Zoysia Genome Database' at http://zoysia.kazusa.or.jp. PMID:26975196

  8. Consumer brand choice: individual and group analyses of demand elasticity.

    Science.gov (United States)

    Oliveira-Castro, Jorge M; Foxall, Gordon R; Schrezenmaier, Teresa C

    2006-03-01

    Following the behavior-analytic tradition of analyzing individual behavior, the present research investigated demand elasticity of individual consumers purchasing supermarket products, and compared individual and group analyses of elasticity. Panel data from 80 UK consumers purchasing 9 product categories (i.e., baked beans, biscuits, breakfast cereals, butter, cheese, fruit juice, instant coffee, margarine and tea) during a 16-week period were used. Elasticity coefficients were calculated for individual consumers with data from all or only 1 product category (intra-consumer elasticities), and for each product category using all data points from all consumers (overall product elasticity) or 1 average data point per consumer (inter-consumer elasticity). In addition to this, split-sample elasticity coefficients were obtained for each individual with data from all product categories purchased during weeks 1 to 8 and 9 to 16. The results suggest that: 1) demand elasticity coefficients calculated for individual consumers purchasing supermarket food products are compatible with predictions from economic theory and behavioral economics; 2) overall product elasticities, typically employed in marketing and econometric research, include effects of inter-consumer and intra-consumer elasticities; 3) when comparing demand elasticities of different product categories, group and individual analyses yield similar trends; and 4) individual differences in demand elasticity are relatively consistent across time, but do not seem to be consistent across products. These results demonstrate the theoretical, methodological, and managerial relevance of investigating the behavior of individual consumers.
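
    An intra-consumer elasticity of the kind analysed here is simply the slope of a log-log regression of purchased quantity on price; the purchase records below are simulated for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical records for one consumer and one product category over a
# 16-week panel: unit price paid and quantity bought each week.
price = rng.uniform(0.8, 1.6, 16)
quantity = np.exp(0.5 - 1.2 * np.log(price)    # 'true' elasticity of -1.2
                  + rng.normal(0.0, 0.1, 16))  # week-to-week noise

# Demand elasticity: slope of log(quantity) against log(price).
slope, intercept = np.polyfit(np.log(price), np.log(quantity), 1)
print(f"estimated intra-consumer elasticity: {slope:.2f}")
```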

  9. COMPARATIVE ANALYSES OF MORPHOLOGICAL CHARACTERS IN SPHAERODORIDAE AND ALLIES (ANNELIDA) REVEALED BY AN INTEGRATIVE MICROSCOPICAL APPROACH

    Directory of Open Access Journals (Sweden)

    Conrad Helm

    2015-01-01

    Sphaerodoridae is a group of benthic marine worms (Annelida) characterized by the presence of spherical tubercles covering their whole surface. They are commonly considered to belong to Phyllodocida, although sister-group relationships are still far from being understood. Primary homology assessments of their morphological features are lacking, hindering the appraisal of evolutionary relationships between taxa. Our detailed morphological investigation therefore focuses on different Sphaerodoridae as well as on other members of Phyllodocida, using an integrative approach combining scanning electron microscopy (SEM) and immunohistochemistry with standard neuronal (anti-5-HT) and muscular (phalloidin-rhodamine) markers, followed by CLSM analysis of whole mounts and sections. Furthermore, we provide histological (HES) and light microscopical data to shed light on the structures and hypothetical function of key sphaerodorid morphological features. We provide fundamental insights into sphaerodorid morphology supporting a Phyllodocida ancestry of these enigmatic worms. However, the muscular arrangement and the presence of an axial muscular pharynx are similar to conditions observed in other members of the Errantia too. Furthermore, nervous system and muscle staining as well as SEM and histological observations of different types of tubercles indicate a homology of the so-called microtubercles, present in the long-bodied sphaerodorids, to the dorsal cirri of other Errantia. The macrotubercles seem to represent a sphaerodorid autapomorphy based on our investigations. Therefore, our results allow comparisons of morphological patterns between Sphaerodoridae and other Phyllodocida and constitute a starting point for further comparative investigations to reveal the evolution of the remarkable Sphaerodoridae.

  10. Department of Energy's team's analyses of Soviet-designed VVERs

    Energy Technology Data Exchange (ETDEWEB)

    1989-09-01

    This document provides Appendices A through K of this report. The topics discussed are, respectively: radiation-induced embrittlement and annealing of reactor pressure vessel steels; loss-of-coolant-accident blowdown analyses; LOCA blowdown response analyses; non-seismic structural response analyses; seismic analyses; S'' seal integrity; reactor transient analyses; fire protection; aircraft impacts; and boric acid induced corrosion. (FI).

  11. Predictive systems ecology.

    Science.gov (United States)

    Evans, Matthew R; Bithell, Mike; Cornell, Stephen J; Dall, Sasha R X; Díaz, Sandra; Emmott, Stephen; Ernande, Bruno; Grimm, Volker; Hodgson, David J; Lewis, Simon L; Mace, Georgina M; Morecroft, Michael; Moustakas, Aristides; Murphy, Eugene; Newbold, Tim; Norris, K J; Petchey, Owen; Smith, Matthew; Travis, Justin M J; Benton, Tim G

    2013-11-22

    Human societies, and their well-being, depend to a significant extent on the state of the ecosystems that surround them. These ecosystems are changing rapidly, usually in response to anthropogenic changes in the environment. To determine the likely impact of environmental change on ecosystems and the best ways to manage them, it would be desirable to be able to predict their future states. We present a proposal to develop the paradigm of predictive systems ecology, explicitly to understand and predict the properties and behaviour of ecological systems. We discuss the necessary and desirable features of predictive systems ecology models. There are places where predictive systems ecology is already being practised and we summarize a range of terrestrial and marine examples. Significant challenges remain, but we suggest that ecology would benefit, both as a scientific discipline and in its impact on society, if it were to embrace the need to become more predictive.

  12. Predictability of conversation partners

    CERN Document Server

    Takaguchi, Taro; Sato, Nobuo; Yano, Kazuo; Masuda, Naoki

    2011-01-01

    Recent developments in sensing technologies have enabled us to examine the nature of human social behavior in greater detail. By applying an information-theoretic method to the spatiotemporal data of cell-phone locations, Song et al. (2010) found that human mobility patterns are remarkably predictable. Inspired by their work, we address a similar predictability question in a different kind of human social activity: conversation events. The predictability in the sequence of one's conversation partners is defined as the degree to which one's next conversation partner can be predicted given the current partner. We quantify this predictability by using the mutual information. We examine the predictability of conversation events for each individual using the longitudinal data of face-to-face interactions collected from two company offices in Japan. Each subject wears a name tag equipped with an infrared sensor node, and conversation events are marked when signals are exchanged between close sensor nodes. We find t...

  13. Evaluation of Model Operational Analyses during DYNAMO

    Science.gov (United States)

    Ciesielski, Paul; Johnson, Richard

    2013-04-01

    A primary component of the observing system in the DYNAMO-CINDY2011-AMIE field campaign was an atmospheric sounding network comprised of two sounding quadrilaterals, one north and one south of the equator over the central Indian Ocean. During the experiment a major effort was undertaken to ensure the real-time transmission of these data onto the GTS (Global Telecommunication System) for dissemination to the operational centers (ECMWF, NCEP, JMA, etc.). Preliminary estimates indicate that ~95% of the soundings from the enhanced sounding network were successfully transmitted and potentially used in their data assimilation systems. Because of the wide use of operational and reanalysis products (e.g., in process studies, initializing numerical simulations, construction of large-scale forcing datasets for CRMs, etc.), their validity will be examined by comparing a variety of basic and diagnosed fields from two operational analyses (ECMWF and NCEP) to similar analyses based solely on sounding observations. Particular attention will be given to the vertical structures of apparent heating (Q1) and drying (Q2) from the operational analyses (OA), which are strongly influenced by cumulus parameterizations, a source of model infidelity. Preliminary results indicate that the OA products did a reasonable job at capturing the mean and temporal characteristics of convection during the DYNAMO enhanced observing period, which included the passage of two significant MJO events during the October-November 2011 period. For example, temporal correlations between Q2-budget derived rainfall from the OA products and that estimated from the TRMM satellite (i.e., the 3B42V7 product) were greater than 0.9 over the Northern Sounding Array of DYNAMO. However closer inspection of the budget profiles show notable differences between the OA products and the sounding-derived results in low-level (surface to 700 hPa) heating and drying structures. This presentation will examine these differences and

  14. Stable isotopic analyses in paleoclimatic reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Wigand, P.E. [Univ. and Community College System of Nevada, Reno, NV (United States)

    1995-09-01

    Most traditional paleoclimatic proxy data have inherent time lags between climatic input and system response that constrain their use in accurate reconstruction of paleoclimate chronology, scaling of its variability, and the elucidation of the processes that determine its impact on the biotic and abiotic environment. With the exception of dendroclimatology, and studies of short-lived organisms and pollen recovered from annually varved lacustrine sediments, significant periods of time ranging from years, to centuries, to millennia may intervene between climate change and its first manifestation in paleoclimatic proxy data records. Reconstruction of past climate through changes in plant community composition derived from pollen sequences and plant remains from ancient woodrat middens, wet environments and dry caves all suffer from these lags. However, stable isotopic analyses can provide more immediate indication of biotic response to climate change. Evidence of past physiological response of organisms to changes in effective precipitation as climate varies can be provided by analyses of the stable isotopic content of plant macrofossils from various contexts. These analyses consider variation in the stable isotopic (hydrogen, oxygen and carbon) content of plant tissues as it reflects (1) past global or local temperature through changes in meteoric (rainfall) water chemistry in the case of the first two isotopes, and (2) plant stress through changes in plant respiration/transpiration processes under differing water availability, and varying atmospheric CO2 composition (which itself may actually be a net result of biotic response to climate change). Studies currently being conducted in the Intermountain West indicate both long- and short-term responses that, when calibrated with modern analogue studies, have the potential of revealing not only the timing of climate events, but their direction, magnitude and rapidity.

  15. IDEA: Interactive Display for Evolutionary Analyses

    Directory of Open Access Journals (Sweden)

    Carlton Jane M

    2008-12-01

    Abstract Background The availability of complete genomic sequences for hundreds of organisms promises to make obtaining genome-wide estimates of substitution rates, selective constraints and other molecular evolution variables of interest an increasingly important approach to addressing broad evolutionary questions. Two of the programs most widely used for this purpose are codeml and baseml, parts of the PAML (Phylogenetic Analysis by Maximum Likelihood) suite. A significant drawback of these programs is their lack of a graphical user interface, which can limit their user base and considerably reduce their efficiency. Results We have developed IDEA (Interactive Display for Evolutionary Analyses), an intuitive graphical input and output interface which interacts with PHYLIP for phylogeny reconstruction and with codeml and baseml for molecular evolution analyses. IDEA's graphical input and visualization interfaces eliminate the need to edit and parse text input and output files, reducing the likelihood of errors and improving processing time. Further, its interactive output display gives the user immediate access to results. Finally, IDEA can process data in parallel on a local machine or computing grid, allowing genome-wide analyses to be completed quickly. Conclusion IDEA provides a graphical user interface that allows the user to follow a codeml or baseml analysis from parameter input through to the exploration of results. Novel options streamline the analysis process, and post-analysis visualization of phylogenies, evolutionary rates and selective constraint along protein sequences simplifies the interpretation of results. The integration of these functions into a single tool eliminates the need for lengthy data handling and parsing, significantly expediting access to global patterns in the data.

  16. Sivers function: SIDIS data, fits and predictions

    CERN Document Server

    Anselmino, M; D'Alesio, U; Kotzinian, A; Murgia, F; Prokudin, A

    2005-01-01

    The most recent data on the weighted transverse single spin asymmetry A_{UT}^{\sin(\phi_h-\phi_S)} from the HERMES and COMPASS collaborations are analysed within the LO parton model; all transverse motions are taken into account. An extraction of the Sivers function for u and d quarks is performed. Based on the extracted Sivers functions, predictions for A_{UT}^{\sin(\phi_h-\phi_S)} asymmetries at JLab are given; suggestions for further measurements at COMPASS, with a transversely polarized hydrogen target and selecting favourable kinematical ranges, are discussed. Predictions are also presented for Single Spin Asymmetries (SSA) in Drell-Yan processes at RHIC and GSI.

  17. FEM-ANALYSE AV INDUSTRIELL ALUMINIUMSPROFILEKSTRUDERING

    OpenAIRE

    Christenssen, Wenche

    2014-01-01

    This thesis was written to increase understanding and knowledge of material flow in the extrusion of complex, thin-walled aluminium profiles. It also covers how uneven material flow out of a die can be balanced by the use of a feeder chamber. The report describes the construction of models and simulations for two different profile geometries. The first profile is a U-profile that has previously been analysed using model material. This was done in a Diplom...

  18. Erregerspektrum bei tiefen Halsinfektionen: Eine retrospektive Analyse

    OpenAIRE

    Sömmer, C; Haid, M; Hommerich, C; Laskawi, R; Canis, M; Matthias, C

    2014-01-01

    Introduction: Deep neck infections are among the most dangerous diseases in otorhinolaryngology. This analysis gives an overview of the microbiology of deep neck infections and of factors that can lead to a change in the spectrum of pathogens. Methods: From January 2002 to December 2012, 63 patients with deep neck infections were treated at the ENT clinic of the University Medical Center Göttingen. Intraoperative swabs were taken. The incidence of the most common pathogens …

  19. Use of Geospatial Analyses for Semantic Reasoning

    OpenAIRE

    Karmacharya, Ashish; Cruz, Christophe; Boochs, Frank; Marzani, Franck

    2010-01-01

    This work focuses on the integration of spatial analyses for semantic reasoning in order to compute new axioms of an existing OWL ontology. To make this concrete, we have defined Spatial Built-ins, an extension of the existing Built-ins of the SWRL rule language. It permits running deductive rules with the help of a translation rule engine. Thus, the Spatial SWRL rules are translated to standard SWRL rules. Once the spatial functions of the Spatial SWRL rules are comput...

  1. Hybrid Logical Analyses of the Ambient Calculus

    DEFF Research Database (Denmark)

    Bolander, Thomas; Hansen, Rene Rydhof

    2010-01-01

    In this paper, hybrid logic is used to formulate three control flow analyses for Mobile Ambients, a process calculus designed for modelling mobility. We show that hybrid logic is very well-suited to express the semantic structure of the ambient calculus and how features of hybrid logic can be exploited to reduce the "administrative overhead" of the analysis specification and thus simplify it. Finally, we use HyLoTab, a fully automated theorem prover for hybrid logic, both as a convenient platform for a prototype implementation as well as to formally prove the correctness of the analysis.

  2. Analyses of containment structures with corrosion damage

    International Nuclear Information System (INIS)

    Corrosion damage to a nuclear power plant containment structure can degrade the pressure capacity of the vessel. For the low-carbon, low-strength steels used in containments, the effect of corrosion on material properties is discussed. Strain-to-failure tests, in uniaxial tension, have been performed on corroded material samples. Results were used to select strain-based failure criteria for corroded steel. Using the ABAQUS finite element analysis code, the capacity of a typical PWR Ice Condenser containment with corrosion damage has been studied. Multiple analyses were performed, with the locations of the corrosion in the containment and the amount of corrosion varied in each analysis

  3. Rod Ellis, Gary Barkhuizen, Analysing Learner Language

    OpenAIRE

    Narcy-Combes, Marie-Françoise

    2014-01-01

    This book comes at just the right time to complement the tools available to young researchers in applied linguistics and language didactics, as well as to practitioners in the field wishing to carry out action research. As is often the case with Rod Ellis's works, it is a comprehensive survey: a diachronic study of the tools used since the 1960s by researchers in language acquisition for the analysis of learners' written and oral productions. ...

  4. En analyse av Yoga-kundalini-upanisad

    OpenAIRE

    2006-01-01

    The thesis En analyse av Yoga-kundalini-upanisad is based on the Indian ascetic Narayanaswamy Aiyer's English translation of the Yoga-kundalini-upanisad, published in Thirty Minor Upanisad-s, Including the Yoga Upanisad-s (Oklahoma, Santarasa Publications, 1980). This Hindu text is described as one of the 21 yoga upanishads, the eighty-sixth of the 108 classical upanishads, and forms part of the Krsna-Yajurveda text corpus. The text serves as a manual of exercises from the disciplines of hathayoga, ...

  5. Introduction: Analysing Emotion and Theorising Affect

    Directory of Open Access Journals (Sweden)

    Peta Tait

    2016-08-01

    This discussion introduces ideas of emotion and affect for a volume of articles demonstrating the scope of approaches used in their study within the humanities and creative arts. The volume offers multiple perspectives on emotion and affect within 20th-century and 21st-century texts, arts and organisations and their histories. The discussion explains how emotion encompasses the emotions, emotional feeling, sensation and mood and how these can be analysed particularly in relation to literature, art and performance. It briefly summarises concepts of affect theory within recent approaches before introducing the articles.

  6. Fully Coupled FE Analyses of Buried Structures

    Directory of Open Access Journals (Sweden)

    James T. Baylot

    1994-01-01

    Current procedures for determining the response of buried structures to the effects of the detonation of buried high explosives recommend decoupling the free-field stress analysis from the structure response analysis. A fully coupled (explosive–soil–structure) finite element analysis procedure was developed so that the accuracies of current decoupling procedures could be evaluated. Comparisons of the results of analyses performed using this procedure with scale-model experiments indicate that this finite element procedure can be used to effectively evaluate the accuracies of the methods currently being used to decouple the free-field stress analysis from the structure response analysis.

  7. Large scale breeder reactor pump dynamic analyses

    International Nuclear Information System (INIS)

    The lateral natural frequency and vibration response analyses of the Large Scale Breeder Reactor (LSBR) primary pump were performed as part of the total dynamic analysis effort to obtain the fabrication release. The special features of pump modeling are outlined in this paper. The analysis clearly demonstrates the method of increasing the system natural frequency by reducing the generalized mass without significantly changing the generalized stiffness of the structure. Also, a method of computing the maximum relative and absolute steady state responses and associated phase angles at given locations is provided. This type of information is very helpful in generating response versus frequency and phase angle versus frequency plots
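
    The mass-stiffness trade-off described above follows from the single-mode relation f = sqrt(K*/M*)/(2*pi) between generalized stiffness K* and generalized mass M*; the values below are purely illustrative:

```python
import math

def natural_frequency(k_gen, m_gen):
    """Natural frequency (Hz) of a single mode with generalized
    stiffness k_gen (N/m) and generalized mass m_gen (kg)."""
    return math.sqrt(k_gen / m_gen) / (2.0 * math.pi)

f0 = natural_frequency(k_gen=5.0e7, m_gen=2.0e3)
f1 = natural_frequency(k_gen=5.0e7, m_gen=1.6e3)  # 20% less mass
print(f"{f0:.1f} Hz -> {f1:.1f} Hz after reducing the generalized mass")
```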

  8. Visualizing Risk Prediction Models

    OpenAIRE

    Vanya Van Belle; Ben Van Calster

    2015-01-01

    Objective Risk prediction models can assist clinicians in making decisions. To boost the uptake of these models in clinical practice, it is important that end-users understand how the model works and can efficiently communicate its results. We introduce novel methods for interpretable model visualization. Methods The proposed visualization techniques are applied to two prediction models from the Framingham Heart Study for the prediction of intermittent claudication and stroke after atrial fib...

  9. Pyroshock prediction procedures

    Science.gov (United States)

    Piersol, Allan G.

    2002-05-01

    Given sufficient effort, pyroshock loads can be predicted by direct analytical procedures using Hydrocodes that analytically model the details of the pyrotechnic explosion and its interaction with adjacent structures, including nonlinear effects. However, it is more common to predict pyroshock environments using empirical procedures based upon extensive studies of past pyroshock data. Various empirical pyroshock prediction procedures are discussed, including those developed by the Jet Propulsion Laboratory, Lockheed-Martin, and Boeing.

  10. Predictability of Conversation Partners

    Science.gov (United States)

    Takaguchi, Taro; Nakamura, Mitsuhiro; Sato, Nobuo; Yano, Kazuo; Masuda, Naoki

    2011-08-01

    Recent developments in sensing technologies have enabled us to examine the nature of human social behavior in greater detail. By applying an information-theoretic method to the spatiotemporal data of cell-phone locations, [C. Song et al., Science 327, 1018 (2010)] found that human mobility patterns are remarkably predictable. Inspired by their work, we address a similar predictability question in a different kind of human social activity: conversation events. The predictability in the sequence of one's conversation partners is defined as the degree to which one's next conversation partner can be predicted given the current partner. We quantify this predictability by using the mutual information. We examine the predictability of conversation events for each individual using the longitudinal data of face-to-face interactions collected from two company offices in Japan. Each subject wears a name tag equipped with an infrared sensor node, and conversation events are marked when signals are exchanged between sensor nodes in close proximity. We find that the conversation events are predictable to a certain extent; knowing the current partner decreases the uncertainty about the next partner by 28.4% on average. Much of the predictability is explained by long-tailed distributions of interevent intervals. However, a predictability also exists in the data, apart from the contribution of their long-tailed nature. In addition, an individual's predictability is correlated with the position of the individual in the static social network derived from the data. Individuals confined in a community – in the sense of an abundance of surrounding triangles – tend to have low predictability, and those bridging different communities tend to have high predictability.
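
    The mutual-information measure of predictability can be written down with a simple plug-in estimator; the partner sequence below is hypothetical:

```python
import numpy as np
from collections import Counter

# Hypothetical sequence of conversation partners for one subject.
partners = list("ABABCABACBABBACAB")

pair_counts = Counter(zip(partners[:-1], partners[1:]))
curr_counts = Counter(partners[:-1])
next_counts = Counter(partners[1:])
n = len(partners) - 1

# Mutual information I(current; next) in bits: how much knowing the
# current partner reduces uncertainty about the next one.
mi = 0.0
for (c, x), k in pair_counts.items():
    p_joint = k / n
    p_indep = (curr_counts[c] / n) * (next_counts[x] / n)
    mi += p_joint * np.log2(p_joint / p_indep)
print(f"I(current; next) = {mi:.3f} bits")
```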

  11. Is Time Predictability Quantifiable?

    DEFF Research Database (Denmark)

    Schoeberl, Martin

    2012-01-01

    Computer architects and researchers in the real-time domain start to investigate processors and architectures optimized for real-time systems. Optimized for real-time systems means time predictable, i.e., architectures where it is possible to statically derive a tight bound of the worst-case execution time. To compare different approaches we would like to quantify time predictability. That means we need to measure time predictability. In this paper we discuss the different approaches for these measurements and conclude that time predictability is practically not quantifiable. We can only compare the worst-case execution time bounds of different architectures.

  12. Hitchhikers’ guide to analysing bird ringing data

    Directory of Open Access Journals (Sweden)

    Harnos Andrea

    2015-12-01

    Bird ringing datasets constitute possibly the largest source of temporal and spatial information on vertebrate taxa available on the globe. Initially, the method was invented to understand avian migration patterns. However, data deriving from bird ringing have been used in an array of other disciplines including population monitoring, changes in demography, conservation management, and the study of the effects of climate change, to name a few. Despite this widespread usage and importance, there are no guidelines available specifically describing the practice of data management, preparation and analyses of ringing datasets. Here, we present the first of a series of comprehensive tutorials that may help fill this gap. We describe in detail, and through a real-life example, the intricacies of data cleaning and how to create a data table ready for analyses from raw ringing data in the R software environment. Moreover, we created and present here the R package ringR, designed to carry out various specific tasks and plots related to bird ringing data. Most methods described here can also be applied to a wide range of capture-recapture type data based on individual marking, regardless of taxa or research question.

  13. Transportation systems analyses: Volume 1: Executive Summary

    Science.gov (United States)

    1993-05-01

    The principal objective of this study is to accomplish a systems engineering assessment of the nation's space transportation infrastructure. This analysis addresses the necessary elements to perform man delivery and return, cargo transfer, cargo delivery, payload servicing, and the exploration of the Moon and Mars. Specific elements analyzed, but not limited to, include the Space Exploration Initiative (SEI), the National Launch System (NLS), the current expendable launch vehicle (ELV) fleet, ground facilities, the Space Station Freedom (SSF), and other civil, military and commercial payloads. The performance of this study entails maintaining a broad perspective on the large number of transportation elements that could potentially comprise the U.S. space infrastructure over the next several decades. To perform this systems evaluation, top-level trade studies are conducted to enhance our understanding of the relationships between elements of the infrastructure. This broad 'infrastructure-level perspective' permits the identification of preferred infrastructures. Sensitivity analyses are performed to assure the credibility and usefulness of study results. This executive summary of the transportation systems analyses (TSM) semi-annual report addresses the SSF logistics resupply. Our analysis parallels the ongoing NASA SSF redesign effort. Therefore, there could be no SSF design to drive our logistics analysis. Consequently, the analysis attempted to bound the reasonable SSF design possibilities (and the subsequent transportation implications). No other strategy really exists until after a final decision is rendered on the SSF configuration.

  14. Hierarchical regression for analyses of multiple outcomes.

    Science.gov (United States)

    Richardson, David B; Hamra, Ghassan B; MacLehose, Richard F; Cole, Stephen R; Chu, Haitao

    2015-09-01

    In cohort mortality studies, there often is interest in associations between an exposure of primary interest and mortality due to a range of different causes. A standard approach to such analyses involves fitting a separate regression model for each type of outcome. However, the statistical precision of some estimated associations may be poor because of sparse data. In this paper, we describe a hierarchical regression model for estimation of parameters describing outcome-specific relative rate functions and associated credible intervals. The proposed model uses background stratification to provide flexible control for the outcome-specific associations of potential confounders, and it employs a hierarchical "shrinkage" approach to stabilize estimates of an exposure's associations with mortality due to different causes of death. The approach is illustrated in analyses of cancer mortality in 2 cohorts: a cohort of dioxin-exposed US chemical workers and a cohort of radiation-exposed Japanese atomic bomb survivors. Compared with standard regression estimates of associations, hierarchical regression yielded estimates with improved precision that tended to have less extreme values. The hierarchical regression approach also allowed the fitting of models with effect-measure modification. The proposed hierarchical approach can yield estimates of association that are more precise than conventional estimates when one wishes to estimate associations with multiple outcomes. PMID:26232395
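
    A minimal sketch of the shrinkage idea, using a normal-normal empirical-Bayes model with hypothetical cause-specific estimates (the paper's full model also uses background stratification for confounder control):

```python
import numpy as np

# Hypothetical cause-specific estimates of a log relative rate per unit
# exposure, with larger standard errors where data are sparse.
beta = np.array([0.80, 0.10, -0.30, 1.20, 0.20])
se = np.array([0.20, 0.30, 0.40, 0.50, 0.20])

# Method-of-moments estimate of the between-outcome variance tau^2.
w = 1.0 / se**2
mu = np.sum(w * beta) / np.sum(w)     # precision-weighted common mean
q = np.sum(w * (beta - mu) ** 2)      # heterogeneity statistic
tau2 = max(1e-8, (q - (len(beta) - 1))
           / (np.sum(w) - np.sum(w**2) / np.sum(w)))

# Normal-normal shrinkage: each estimate is pulled toward the common
# mean, more strongly when its standard error is large.
shrunk = (beta / se**2 + mu / tau2) / (1.0 / se**2 + 1.0 / tau2)
print("common mean:", round(float(mu), 3))
print("shrunken estimates:", np.round(shrunk, 3))
```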

  15. Ethics of cost analyses in medical education.

    Science.gov (United States)

    Walsh, Kieran

    2013-11-01

    Cost analyses in medical education are rarely straightforward, and rarely lead to clear-cut conclusions. Occasionally they do lead to clear conclusions but even when that happens, some stakeholders will ask difficult but valid questions about what to do following cost analyses-specifically about distributive justice in the allocation of resources. At present there are few or no debates about these issues and rationing decisions that are taken in medical education are largely made subconsciously. Distributive justice 'concerns the nature of a socially just allocation of goods in a society'. Inevitably there is a large degree of subjectivity in the judgment as to whether an allocation is seen as socially just or ethical. There are different principles by which we can view distributive justice and which therefore affect the prism of subjectivity through which we see certain problems. For example, we might say that distributive justice at a certain institution or in a certain medical education system operates according to the principle that resources must be divided equally amongst learners. Another system may say that resources should be distributed according to the needs of learners or even of patients. No ethical system or model is inherently right or wrong, they depend on the context in which the educator is working. PMID:24203859

  16. Bioinformatics tools for analysing viral genomic data.

    Science.gov (United States)

    Orton, R J; Gu, Q; Hughes, J; Maabar, M; Modha, S; Vattipally, S B; Wilkie, G S; Davison, A J

    2016-04-01

    The field of viral genomics and bioinformatics is experiencing a strong resurgence due to high-throughput sequencing (HTS) technology, which enables the rapid and cost-effective sequencing and subsequent assembly of large numbers of viral genomes. In addition, the unprecedented power of HTS technologies has enabled the analysis of intra-host viral diversity and quasispecies dynamics in relation to important biological questions on viral transmission, vaccine resistance and host jumping. HTS also enables the rapid identification of both known and potentially new viruses from field and clinical samples, thus adding new tools to the fields of viral discovery and metagenomics. Bioinformatics has been central to the rise of HTS applications because new algorithms and software tools are continually needed to process and analyse the large, complex datasets generated in this rapidly evolving area. In this paper, the authors give a brief overview of the main bioinformatics tools available for viral genomic research, with a particular emphasis on HTS technologies and their main applications. They summarise the major steps in various HTS analyses, starting with quality control of raw reads and encompassing activities ranging from consensus and de novo genome assembly to variant calling and metagenomics, as well as RNA sequencing.
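
    The first step listed, quality control of raw reads, can be illustrated with a minimal FASTQ filter; the file name and quality threshold are hypothetical, and real pipelines would use dedicated QC tools:

```python
def mean_phred(quality_line, offset=33):
    """Mean Phred score of a read from its ASCII quality string."""
    return sum(ord(c) - offset for c in quality_line) / len(quality_line)

def filter_fastq(path, min_quality=25.0):
    """Yield (header, sequence, quality) for reads whose mean Phred
    score meets the threshold; assumes 4-line FASTQ records."""
    with open(path) as fh:
        while True:
            header = fh.readline().rstrip()
            if not header:
                break
            seq = fh.readline().rstrip()
            fh.readline()  # '+' separator line
            qual = fh.readline().rstrip()
            if mean_phred(qual) >= min_quality:
                yield header, seq, qual

# Example usage (assumes a file named 'reads.fastq' exists):
# kept = list(filter_fastq("reads.fastq"))
```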

  17. CHEMICAL ANALYSES OF SODIUM SYSTEMS FINAL REPORT

    Energy Technology Data Exchange (ETDEWEB)

    Greenhalgh, W. O.; Yunker, W. H.; Scott, F. A.

    1970-06-01

BNWL-1407 summarizes information gained from the Chemical Analyses of Sodium Systems Program pursued by Battelle- Northwest over the period from July 1967 through June 1969. Tasks included feasibility studies for performing coulometric titration and polarographic determinations of oxygen in sodium, and the development of new separation techniques for sodium impurities and their subsequent analyses. The program was terminated ahead of schedule so firm conclusions were not obtained in all areas of the work. At least 40 coulometric titrations were carried out and special test cells were developed for coulometric application. Data indicated that polarographic measurements are theoretically feasible, but practical application of the method was not verified. An emission spectrographic procedure for trace metal impurities was developed and published. Trace metal analysis by a neutron activation technique was shown to be feasible; key to the success of the activation technique was the application of a new ion exchange resin which provided a sodium separation factor of 10^11. Preliminary studies on direct scavenging of trace metals produced no conclusive results.

  18. ANALYSES ON SYSTEMATIC CONFRONTATION OF FIGHTER AIRCRAFT

    Institute of Scientific and Technical Information of China (English)

    HuaiJinpeng; WuZhe; HuangJun

    2002-01-01

Analyses of the systematic confrontation between two military forces are the highest hierarchy in studies of the operational effectiveness of weapon systems. A physical model for tactical many-on-many engagements of aerial warfare with heterogeneous fighter aircraft is established. On the basis of Lanchester multivariate equations of the square law, a mathematical model corresponding to the established physical model is given. A superiority parameter is then derived directly from the mathematical model. In view of the high-tech conditions of modern warfare, the concept of a superiority parameter that more truly reflects the essentials of an air-to-air engagement is further formulated. The attrition coefficients, which are key to the differential equations, are determined by using tactics of random target assignment and the air-to-air capability index of the fighter aircraft. Taking the mathematical model and the superiority parameter as cores, calculations and analyses of complicated systemic problems, such as evaluation of battle superiority, prognostication of the combat process and optimization of collocations, have been accomplished. Results indicate that classical combat theory, with its recent developments, has found new applications in military operations research for complicated confrontation analysis issues.
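
    A toy numerical instance of the square-law attrition dynamics underlying the model (two homogeneous forces only; the coefficients and initial strengths below are illustrative, not the paper's heterogeneous many-on-many formulation):

      # Lanchester square law: dx/dt = -a*y, dy/dt = -b*x, integrated by
      # simple Euler stepping until one side is annihilated.
      a, b = 0.08, 0.05        # attrition coefficients (Red on Blue, Blue on Red)
      x, y = 100.0, 120.0      # initial Blue and Red strengths
      dt, t = 0.01, 0.0

      while x > 0 and y > 0 and t < 200:
          x, y = x - dt * a * y, y - dt * b * x
          t += dt

      # Square-law superiority check: compare b*x0^2 with a*y0^2.
      print(f"b*x0^2 = {0.05 * 100**2:.0f}, a*y0^2 = {0.08 * 120**2:.0f}")
      print(f"t = {t:.1f}: Blue = {max(x, 0):.1f}, Red = {max(y, 0):.1f}")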

  19. Sensitivity in risk analyses with uncertain numbers.

    Energy Technology Data Exchange (ETDEWEB)

    Tucker, W. Troy; Ferson, Scott

    2006-06-01

Sensitivity analysis is a study of how changes in the inputs to a model influence the results of the model. Many techniques have recently been proposed for use when the model is probabilistic. This report considers the related problem of sensitivity analysis when the model includes uncertain numbers that can involve both aleatory and epistemic uncertainty and the method of calculation is Dempster-Shafer evidence theory or probability bounds analysis. Some traditional methods for sensitivity analysis generalize directly for use with uncertain numbers, but, in some respects, sensitivity analysis for these analyses differs from traditional deterministic or probabilistic sensitivity analyses. A case study of a dike reliability assessment illustrates several methods of sensitivity analysis, including traditional probabilistic assessment, local derivatives, and a "pinching" strategy that hypothetically reduces the epistemic uncertainty or aleatory uncertainty, or both, in an input variable to estimate the reduction of uncertainty in the outputs. The prospects for applying the methods to black box models are also considered.
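
    The "pinching" strategy can be caricatured with plain Monte Carlo sampling (not the Dempster-Shafer structures or probability bounds the report actually uses); the model and input ranges below are invented for illustration:

      import numpy as np

      rng = np.random.default_rng(1)
      n = 100_000

      def model(x1, x2):
          # Stand-in for the assessment model (e.g. a dike reliability margin).
          return x1 + 2.0 * x2

      # Baseline: both inputs uncertain.
      x1 = rng.normal(1.0, 0.5, n)          # aleatory scatter
      x2 = rng.uniform(0.0, 1.0, n)         # poorly known (epistemic) input
      base_width = np.std(model(x1, x2))

      # Pinch x2 to a point value and see how much output uncertainty shrinks;
      # the relative reduction ranks x2's contribution.
      pinched_width = np.std(model(x1, np.full(n, 0.5)))
      print(f"uncertainty reduction from pinching x2: {1 - pinched_width / base_width:.1%}")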

  20. Waste Stream Analyses for Nuclear Fuel Cycles

    Energy Technology Data Exchange (ETDEWEB)

    N. R. Soelberg

    2010-08-01

A high-level study was performed in Fiscal Year 2009 for the U.S. Department of Energy (DOE) Office of Nuclear Energy (NE) Advanced Fuel Cycle Initiative (AFCI) to provide information for a range of nuclear fuel cycle options (Wigeland 2009). At that time, some fuel cycle options could not be adequately evaluated since they were not well defined and lacked sufficient information. As a result, five families of these fuel cycle options are being studied during Fiscal Year 2010 by the Systems Analysis Campaign for the DOE NE Fuel Cycle Research and Development (FCRD) program. The quality and completeness of the data available to date for the fuel cycle options are insufficient to perform quantitative radioactive waste analyses using the recommended metrics. This study has therefore been limited thus far to qualitative analyses of waste streams from the candidate fuel cycle options, because quantitative data for wastes from the front end, fuel fabrication, reactor core structure, and used fuel for these options are generally not yet available.

  1. Les conditions de l’analyse qualitative

    Directory of Open Access Journals (Sweden)

    Pierre Paillé

    2011-07-01

Full Text Available The methods for analysing qualitative data and the world of computing were made to meet. Indeed, the question is topical, and the software tools numerous and advanced. This phenomenon is unlikely to fade, all the less so as secondary data analysis is at the same time undergoing significant developments. But the attraction of analysis software can become such that it is no longer quite clear on what grounds, and for what reasons, one might do without it. This article attempts to delineate a vision and a practice of qualitative analysis which, in essence, does not lend itself to the use of specialized computer tools. It situates its reflection within the framework of qualitative methodology (qualitative approach, qualitative research, qualitative analysis), more particularly at the level of qualitative fieldwork investigation.

  2. Machine learning algorithms for datasets popularity prediction

    CERN Document Server

    Kancys, Kipras

    2016-01-01

This report presents a continued study in which ML algorithms were used to predict dataset popularity. Three topics were covered. First, there was a discrepancy between the old and new meta-data collection procedures, so a reason for it had to be found. Secondly, different parameters were analysed and dropped to make the algorithms perform better. Third, it was decided to move the modelling part to Spark.
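
    A hedged sketch of the modelling step on synthetic data (the feature names, data, and the choice of a random forest are illustrative; the report's actual features come from CMS dataset meta-data, and the modelling was moved to Spark):

      import numpy as np
      from sklearn.ensemble import RandomForestRegressor
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)
      X = rng.random((500, 3))                     # e.g. past accesses, size, age
      y = 5 * X[:, 0] + rng.normal(0, 0.1, 500)    # popularity target

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
      model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_tr, y_tr)
      print("R^2 on held-out data:", model.score(X_te, y_te))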

  3. Improved nonlinear prediction method

    Science.gov (United States)

    Adenan, Nur Hamiza; Md Noorani, Mohd Salmi

    2014-06-01

The analysis and prediction of time series data have been addressed by researchers. Many techniques have been developed for application in various areas, such as weather forecasting, financial markets and hydrological phenomena involving data that are contaminated by noise. Therefore, various techniques to improve the method have been introduced to analyze and predict time series data. Given the importance of the analysis and the accuracy of the prediction results, a study was undertaken to test the effectiveness of the improved nonlinear prediction method for data that contain noise. The improved nonlinear prediction method involves the formation of composite serial data based on the successive differences of the time series. Then, phase space reconstruction is performed on the composite (one-dimensional) data to reconstruct a number of space dimensions. Finally, the local linear approximation method is employed to make a prediction based on the phase space. This improved method was tested with logistic map data series containing 0%, 5%, 10%, 20% and 30% noise. The results show that, using the improved method, the predictions were found to be in close agreement with the observed values. The correlation coefficient was close to one when the improved method was applied to data with up to 10% noise. Thus, an improved way to analyze and predict noisy time series data, without involving any noise reduction method, was introduced.
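
    A minimal sketch of the pipeline described above on noisy logistic-map data: successive differencing, delay-space reconstruction, then a local neighbourhood prediction (zeroth-order here, rather than the fully linear local approximation of the paper); the embedding dimension and neighbour count are illustrative:

      import numpy as np

      rng = np.random.default_rng(0)
      x = np.empty(600)
      x[0] = 0.4
      for i in range(599):
          x[i + 1] = 4.0 * x[i] * (1.0 - x[i])      # logistic map
      x += rng.normal(0.0, 0.05 * x.std(), x.size)  # add ~5% noise

      d = np.diff(x)                                # composite (differenced) series
      m = 3                                         # embedding dimension
      emb = np.column_stack([d[i:len(d) - m + 1 + i] for i in range(m)])

      query, hist = emb[-1], emb[:-1]               # current state vs. history
      targets = d[m:]                               # successor of each past state
      nearest = np.argsort(np.linalg.norm(hist - query, axis=1))[:10]
      d_next = targets[nearest].mean()              # local approximation
      print("predicted next value:", x[-1] + d_next)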

  4. Predicting AD conversion

    DEFF Research Database (Denmark)

Liu, Yawu; Mattila, Jussi; Ruiz, Miguel Ángel Muñoz;

    2013-01-01

    To compare the accuracies of predicting AD conversion by using a decision support system (PredictAD tool) and current research criteria of prodromal AD as identified by combinations of episodic memory impairment of hippocampal type and visual assessment of medial temporal lobe atrophy (MTA) on MRI...

  5. Prediction of Antibody Epitopes

    DEFF Research Database (Denmark)

    Nielsen, Morten; Marcatili, Paolo

    2015-01-01

    self-proteins. Given the sequence or the structure of a protein of interest, several methods exploit such features to predict the residues that are more likely to be recognized by an immunoglobulin.Here, we present two methods (BepiPred and DiscoTope) to predict linear and discontinuous antibody...

  6. Nuclear Analyses For ITER NB System

    International Nuclear Information System (INIS)

Full text: Detailed nuclear analyses for the latest ITER NB system are required to ensure that the NB design conforms to the nuclear regulations and licensing. A variety of nuclear analyses was conducted for the NB system, including the tokamak building and the area outside the building, by using the Monte Carlo code MCNP5.14, the activation code ACT-4 and the Fusion Evaluated Nuclear Data Library FENDL-2.1. A special “Direct 1-step Monte Carlo” method is adopted for the shutdown dose rate calculation. The NB system and the tokamak building are very complicated, and it is practically impossible to make geometry input data manually. We used the automatic converter code GEOMIT to convert CAD data to MCNP geometry input data. GEOMIT was improved for these analyses, and the conversion performance was drastically enhanced. Void cells in the MCNP input data were generated by subtracting solid cell data from simple rectangular void cells. The CAD data were successfully converted to MCNP geometry input data, and void data were also adequately produced with GEOMIT. The effective dose rates at external zones (non-controlled areas) should be less than 80 μSv/month according to French regulations. Shielding structures are under analysis to reduce the radiation streaming through the openings, and we are confirming that the criterion is satisfied for the NB system. The effective dose rate data in the NB cell after shutdown are necessary to check the dose rate during possible rad-works for maintenance. Dose rates for workers must be maintained as low as reasonably achievable, and dose rates at locations where hands-on maintenance is performed should be below a target of 100 μSv/h at 12 days after shutdown. We are specifying the adequate zoning and the areas where hands-on maintenance can be allowed, based on the analysis results. The cask design for transporting activated NB components is an important issue, and we are calculating the effective dose rates. The target of the effective dose rate from the activated NB components is less

  7. Analysing lawyers’ attitude towards knowledge sharing

    Directory of Open Access Journals (Sweden)

    Wole M. Olatokun

    2012-02-01

Full Text Available Objectives: The study examined and identified the factors that affect lawyers' attitudes to knowledge sharing, and their knowledge sharing behaviour. Specifically, it investigated the relationship between the salient beliefs affecting the knowledge sharing attitude of lawyers, and applied a modified version of the Theory of Reasoned Action (TRA) in the knowledge sharing context to predict how these factors affect their knowledge sharing behaviour. Method: A field survey of 273 lawyers was carried out, using a questionnaire for data collection. Collected data on all variables were structured into grouped frequency distributions. Principal Component Factor Analysis was applied to reduce the constructs, and Simple Regression was applied to test the hypotheses, at the 0.05 level of significance. Results: Results showed that expected associations and contributions were the major determinants of lawyers' attitudes towards knowledge sharing. Expected reward was not significantly related to lawyers' attitudes towards knowledge sharing. A positive attitude towards knowledge sharing was found to lead to a positive intention to share knowledge, although a positive intention to share knowledge did not significantly predict a positive knowledge sharing behaviour. The level of Information Technology (IT) usage was also found to significantly affect the knowledge sharing behaviour of lawyers. Conclusion: It was recommended that law firms in the study area should deploy more IT infrastructure and services that encourage effective knowledge sharing amongst lawyers.

  8. Castor-1C spent fuel storage cask decay heat, heat transfer, and shielding analyses

    International Nuclear Information System (INIS)

    This report documents the decay heat, heat transfer, and shielding analyses of the Gesellschaft fuer Nuklear Services (GNS) CASTOR-1C cask used in a spent fuel storage demonstration performed at Preussen Elektra's Wurgassen nuclear power plant. The demonstration was performed between March 1982 and January 1984, and resulted in cask and fuel temperature data and cask exterior surface gamma-ray and neutron radiation dose rate measurements. The purpose of the analyses reported here was to evaluate decay heat, heat transfer, and shielding computer codes. The analyses consisted of (1) performing pre-look predictions (predictions performed before the analysts were provided the test data), (2) comparing ORIGEN2 (decay heat), COBRA-SFS and HYDRA (heat transfer), and QAD and DOT (shielding) results to data, and (3) performing post-test analyses if appropriate. Even though two heat transfer codes were used to predict CASTOR-1C cask test data, no attempt was made to compare the two codes. The codes are being evaluated with other test data (single-assembly data and other cask data), and to compare the codes based on one set of data may be premature and lead to erroneous conclusions

  9. Evaluating prediction uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    McKay, M.D. [Los Alamos National Lab., NM (United States)

    1995-03-01

    The probability distribution of a model prediction is presented as a proper basis for evaluating the uncertainty in a model prediction that arises from uncertainty in input values. Determination of important model inputs and subsets of inputs is made through comparison of the prediction distribution with conditional prediction probability distributions. Replicated Latin hypercube sampling and variance ratios are used in estimation of the distributions and in construction of importance indicators. The assumption of a linear relation between model output and inputs is not necessary for the indicators to be effective. A sequential methodology which includes an independent validation step is applied in two analysis applications to select subsets of input variables which are the dominant causes of uncertainty in the model predictions. Comparison with results from methods which assume linearity shows how those methods may fail. Finally, suggestions for treating structural uncertainty for submodels are presented.
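
    The variance-ratio importance indicator can be illustrated with a simple binned estimate of Var(E[Y|X])/Var(Y) on a toy model (plain random sampling here, rather than the replicated Latin hypercube design the report uses; model and inputs are invented):

      import numpy as np

      rng = np.random.default_rng(2)
      n = 100_000
      x1, x2 = rng.normal(size=n), rng.normal(size=n)
      y = x1 + 0.3 * x2**2                      # a nonlinear model stands in here

      def importance(x, y, bins=50):
          # Var(E[Y|X]) / Var(Y), estimated by binning on x; no linearity assumed.
          edges = np.quantile(x, np.linspace(0, 1, bins + 1))
          which = np.clip(np.searchsorted(edges, x) - 1, 0, bins - 1)
          cond_means = np.array([y[which == b].mean() for b in range(bins)])
          counts = np.bincount(which, minlength=bins)
          return np.average((cond_means - y.mean())**2, weights=counts) / y.var()

      print("x1:", importance(x1, y), " x2:", importance(x2, y))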

  10. Trend Analyses of Nitrate in Danish Groundwater

    DEFF Research Database (Denmark)

    Hansen, B.; Thorling, L.; Dalgaard, Tommy;

    2012-01-01

This presentation assesses the long-term development in the oxic groundwater nitrate concentration and nitrogen (N) loss due to intensive farming in Denmark. Firstly, up to 20-year time-series from the national groundwater monitoring network enable a statistically systematic analysis...... two datasets. The development in the nitrate concentration of oxic groundwater clearly mirrors the development in the national agricultural N surplus, and a corresponding trend reversal is found in groundwater. Regulation and technical improvements in the intensive farming in Denmark have succeeded...... in decreasing the N surplus by 40% since the mid-1980s while at the same time maintaining crop yields and increasing the animal production of especially pigs. Trend analyses prove that the youngest (0-15 years old) oxic groundwater shows more pronounced significant downward nitrate trends (44%) than the oldest...

  11. Thermal hydraulic reactor safety analyses and experiments

    International Nuclear Information System (INIS)

The report introduces the results of the thermal hydraulic reactor safety research performed in the Nuclear Engineering Laboratory of the Technical Research Centre of Finland (VTT) during the years 1972-1987. Practical applications, i.e. analyses for the safety authorities and power companies, are also presented. The emphasis is on describing the state-of-the-art know-how. The report describes VTT's most important computer codes, both those of foreign origin and those developed at VTT, and their assessment work, VTT's own experimental research, as well as international experimental projects and other forms of cooperation in which VTT has participated. Appendix 8 contains a comprehensive list of the most important publications and technical reports produced. They present the content and results of the research in detail. (orig.)

  12. Reliability and safety analyses under fuzziness

    International Nuclear Information System (INIS)

Fuzzy theory, for example possibility theory, is compatible with probability theory. What has been shown so far is that probability theory need not be replaced by fuzzy theory, but rather that the former works much better in applications if it is combined with the latter. In fact, there are two essential uncertainties in the field of reliability and safety analyses: one is probabilistic uncertainty, which is more relevant for mechanical systems and the natural environment, and the other is fuzziness (imprecision) caused by the presence of human beings in systems. Classical probability theory alone is therefore not sufficient to deal with uncertainties in humanistic systems. In this context, this collection of works marks a milestone in the debate between probability theory and fuzzy theory. The volume covers fault analysis, lifetime analysis, reliability, quality control, safety analysis and risk analysis. (orig./DG). 106 figs

  13. Anthocyanin analyses of Vaccinium fruit dietary supplements.

    Science.gov (United States)

    Lee, Jungmin

    2016-09-01

    Vaccinium fruit ingredients within dietary supplements were identified by comparisons with anthocyanin analyses of known Vaccinium profiles (demonstration of anthocyanin fingerprinting). Available Vaccinium supplements were purchased and analyzed, their anthocyanin profiles (based on high-performance liquid chromatography [HPLC] separation) indicated if products' fruit origin listings were authentic. Over 30% of the Vaccinium fruit (cranberry, lingonberry, bilberry, and blueberry; 14 of 45) products available as dietary supplements did not contain the fruit listed as ingredients. Six supplements contained no anthocyanins. Five others had contents differing from labeled fruit (e.g., bilberry capsules containing Andean blueberry fruit). Of the samples that did contain the specified fruit (n = 27), anthocyanin content ranged from 0.04 to 14.37 mg per capsule, tablet, or teaspoon (5 g). Approaches to utilizing anthocyanins in assessment of sample authenticity, and a discussion of the challenges with anthocyanin profiles in quality control are both presented. PMID:27625778

  14. Feasibility Analyses of Integrated Broiler Production

    Directory of Open Access Journals (Sweden)

    L. Komalasari

    2010-12-01

Full Text Available The major obstacles in the development of broiler raising are the high price of feed and the fluctuating price of day-old chicks (DOCs). The low price of imported leg quarters reduces the competitiveness of local broilers. Therefore, an effort to increase production efficiency is needed through integration between broiler raising, corn farmers and feed producers (integrated farming). The purpose of this study is to analyze the feasibility of integrating broiler raising with corn cultivation and feed production. In addition, a simulation was conducted to analyze the effects of changes in DOC price, broiler price and production capacity. The analyses showed that integrated farming, and a mere combination of broiler raising with a feed factory of 10,000-bird capacity, is not financially feasible. Increasing production to 25,000 broiler chickens makes integrated farming financially feasible. Unintegrated broiler raising is relatively sensitive to broiler price decreases and DOC price increases compared to integrated farming.

  15. Genetic Analyses of Meiotic Recombination in Arabidopsis

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

Meiosis is essential for sexual reproduction, and recombination is a critical step required for normal meiosis. Understanding the underlying molecular mechanisms that regulate recombination is important for medical, agricultural and ecological reasons. Readily available molecular and cytological tools make Arabidopsis an excellent system to study meiosis. Here we review recent developments in molecular genetic analyses of meiotic recombination. These include studies on plant homologs of yeast and animal genes, as well as novel genes that were first identified in plants. The characterization of these genes has demonstrated essential functions, from the initiation of recombination by double-strand breaks to the repair of such breaks, and from the formation of double Holliday junctions to the possible resolution of these junctions, both of which are critical for crossover formation. These recent advances have ushered in a new era in plant meiosis, in which the combination of genetics, genomics, and molecular cytology can uncover important gene functions.

  16. Attitude stability analyses for small artificial satellites

    International Nuclear Information System (INIS)

The objective of this paper is to analyze the stability of the rotational motion of a symmetrical spacecraft in a circular orbit. The equilibrium points and regions of stability are established when components of the gravity gradient torque acting on the spacecraft are included in the equations of rotational motion, which are described by the Andoyer variables. The nonlinear stability of the equilibrium points of the rotational motion is analysed here by the Kovalev-Savchenko theorem, which makes it possible to verify whether they remain stable under the influence of the higher-order terms of the normal Hamiltonian. In this paper, numerical simulations are made for a small hypothetical artificial satellite. Several stable equilibrium points were determined, and regions around these points have been established by variations in the orbital inclination and in the spacecraft's principal moment of inertia. The present analysis can contribute directly to the maintenance of the spacecraft's attitude

  17. Spatial Analyses of Harappan Urban Settlements

    Directory of Open Access Journals (Sweden)

    Hirofumi Teramura

    2006-12-01

Full Text Available The Harappan Civilization occupies a unique place among the early civilizations of the world, with its well-planned urban settlements, advanced handicrafts and technology, and religious and trade activities. Using a Geographical Information System (GIS), this study presents spatial analyses that locate urban settlements on a digital elevation model (DEM) according to the three phases of early, mature and late. Understanding the relationship between the spatial distribution of Harappan sites and changes in factors such as topographic features, river passages or sea level will lead to an understanding of the dynamism of this civilization. It will also afford a glimpse of the factors behind the formation, development, and decline of the Harappan Civilization.

  18. Analysing weak orbital signals in Gaia data

    CERN Document Server

    Lucy, L B

    2014-01-01

Anomalous orbits are found when minimum-chi^2 estimation is applied to synthetic Gaia data for weak orbital signals - i.e., orbits whose astrometric signatures are comparable to the single-scan measurement error (Pourbaix 2002). These orbits are nearly parabolic, edge-on, and their major axes align with the line-of-sight to the observer. Such orbits violate the Copernican principle (CPr) and as such could be rejected. However, the preferred alternative is to develop a statistical technique that incorporates the CPr as a fundamental postulate. This can be achieved in the context of Bayesian estimation by defining a Copernican prior. With this development, Pourbaix's anomalous orbits no longer arise. Instead, orbits with a somewhat higher chi^2 but which do not violate the CPr are selected. Other areas of astronomy where the investigator must analyse data from 'imperfect experiments' might similarly benefit from appropriately-defined Copernican priors.

  19. Modelling and Analysing Socio-Technical Systems

    DEFF Research Database (Denmark)

    Aslanyan, Zaruhi; Ivanova, Marieta Georgieva; Nielson, Flemming;

    2015-01-01

Modern organisations are complex, socio-technical systems consisting of a mixture of physical infrastructure, human actors, policies and processes. An increasing number of attacks on these organisations exploits vulnerabilities on all different levels, for example combining a malware attack...... with social engineering. Due to this combination of attack steps on technical and social levels, risk assessment in socio-technical systems is complex. Therefore, established risk assessment methods often abstract away the internal structure of an organisation and ignore human factors when modelling...... and assessing attacks. In our work we model all relevant levels of socio-technical systems, and propose evaluation techniques for analysing the security properties of the model. Our approach simplifies the identification of possible attacks and provides qualified assessment and ranking of attacks based...

  20. Externalizing Behaviour for Analysing System Models

    DEFF Research Database (Denmark)

    Ivanova, Marieta Georgieva; Probst, Christian W.; Hansen, René Rydhof;

    2013-01-01

    System models have recently been introduced to model organisations and evaluate their vulnerability to threats and especially insider threats. Especially for the latter these models are very suitable, since insiders can be assumed to have more knowledge about the attacked organisation than outside...... attackers. Therefore, many attacks are considerably easier to be performed for insiders than for outsiders. However, current models do not support explicit specification of different behaviours. Instead, behaviour is deeply embedded in the analyses supported by the models, meaning that it is a complex......, if not impossible task to change behaviours. Especially when considering social engineering or the human factor in general, the ability to use different kinds of behaviours is essential. In this work we present an approach to make the behaviour a separate component in system models, and explore how to integrate...

  1. Deterministic analyses of severe accident issues

    International Nuclear Information System (INIS)

Severe accidents in light water reactors involve complex physical phenomena. In the past there has been a heavy reliance on simple assumptions regarding physical phenomena, alongside probability methods, to evaluate risks associated with severe accidents. Recently GE has developed realistic methodologies that permit deterministic evaluations of severe accident progression and of some of the associated phenomena in the case of Boiling Water Reactors (BWRs). These deterministic analyses indicate that with appropriate system modifications and operator actions, core damage can be prevented in most cases. Furthermore, in cases where core-melt is postulated, containment failure can either be prevented or significantly delayed to allow sufficient time for recovery actions to mitigate severe accidents

  2. Analysing Medieval Urban Space; a methodology

    Directory of Open Access Journals (Sweden)

    Marlous L. Craane MA

    2007-08-01

    Full Text Available This article has been written in reaction to recent developments in medieval history and archaeology, to study not only the buildings in a town but also the spaces that hold them together. It discusses a more objective and interdisciplinary approach for analysing urban morphology and use of space. It proposes a 'new' methodology by combining town plan analysis and space syntax. This methodology was trialled on the city of Utrecht in the Netherlands. By comparing the results of this 'new' methodology with the results of previous, more conventional, research, this article shows that space syntax can be applied successfully to medieval urban contexts. It does this by demonstrating a strong correlation between medieval economic spaces and the most integrated spaces, just as is found in the study of modern urban environments. It thus provides a strong basis for the use of this technique in future research of medieval urban environments.

  3. Cointegration Approach to Analysing Inflation in Croatia

    Directory of Open Access Journals (Sweden)

    Lena Malešević-Perović

    2009-06-01

    Full Text Available The aim of this paper is to analyse the determinants of inflation in Croatia in the period 1994:6-2006:6. We use a cointegration approach and find that increases in wages positively influence inflation in the long-run. Furthermore, in the period from June 1994 onward, the depreciation of the currency also contributed to inflation. Money does not explain Croatian inflation. This irrelevance of the money supply is consistent with its endogeneity to exchange rate targeting, whereby the money supply is determined by developments in the foreign exchange market. The value of inflation in the previous period is also found to be significant, thus indicating some inflation inertia.
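
    A hedged sketch of the core testing step, using statsmodels' Engle-Granger cointegration test on simulated stand-ins for the price and wage series (the paper's actual estimation uses the Croatian data and a fuller cointegrating regression):

      import numpy as np
      from statsmodels.tsa.stattools import coint

      rng = np.random.default_rng(3)
      common = np.cumsum(rng.normal(size=150))          # shared stochastic trend
      prices = common + rng.normal(0, 0.5, 150)
      wages = 0.8 * common + rng.normal(0, 0.5, 150)

      t_stat, p_value, _ = coint(prices, wages)
      print(f"t = {t_stat:.2f}, p = {p_value:.3f}")     # small p => cointegrated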

  4. Analysing Terrorism from a Systems Thinking Perspective

    Directory of Open Access Journals (Sweden)

    Lukas Schoenenberger

    2014-02-01

    Full Text Available Given the complexity of terrorism, solutions based on single factors are destined to fail. Systems thinking offers various tools for helping researchers and policy makers comprehend terrorism in its entirety. We have developed a semi-quantitative systems thinking approach for characterising relationships between variables critical to terrorism and their impact on the system as a whole. For a better understanding of the mechanisms underlying terrorism, we present a 16-variable model characterising the critical components of terrorism and perform a series of highly focused analyses. We show how to determine which variables are best suited for government intervention, describing in detail their effects on the key variable—the political influence of a terrorist network. We also offer insights into how to elicit variables that destabilise and ultimately break down these networks. Because we clarify our novel approach with fictional data, the primary importance of this paper lies in the new framework for reasoning that it provides.

  5. First international intercomparison of image analysers

    CERN Document Server

    Pálfalvi, J; Eoerdoegh, I

    1999-01-01

Image analyser systems used for evaluating solid state nuclear track detectors (SSNTD) were compared in order to establish the minimum hardware and software requirements and the methodology necessary in different fields of radiation dosimetry. For this purpose, CR-39 detectors (TASL, Bristol, U.K.) were irradiated with different (n,alpha) and (n,p) converters in a reference Pu-Be neutron field, in an underground laboratory with high radon concentration, and by different alpha sources at the Atomic Energy Research Institute (AERI) in Budapest, Hungary. Six sets of etched and pre-evaluated detectors, and a seventh without etching, were distributed among the 14 laboratories from 11 countries. The participants measured the different track parameters and statistically evaluated the results to determine the performance of their systems. The statistical analysis of results showed high deviations from the mean values in many cases. As the conclusion of the intercomparison, recommendations were given to fulfill those requirements ...

  6. Accounting for demand failures in Markovian analyses

    Energy Technology Data Exchange (ETDEWEB)

    Wood, A.P.

    1980-01-03

    Reliability has become a fundamental concern in the development of nuclear power plants. The use of Markov Analysis in reliability evaluations is progressing quickly since it offers methods for dealing with repair considerations, common cause failures, time-dependent failure rates, cyclic failures, and availability calculations. A Markov process is a stochastic or time-dependent process allowing time-dependent failure processes to be modelled easily. In redundant safety systems, however, there are often passive system components whose failure to start on demand is higher than their probability of failure during the time for which the safety system must function. It may be impossible to model this failure process as a system with standby failure rates because the actual failure may be caused by the demand itself. This paper deals with a method for extending Markov Analyses to include demand failures.
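
    A minimal numerical sketch of the effect such an extension captures, assuming a two-state (up/down) component with illustrative rates: the per-demand failure probability discounts the steady-state availability at the demand instant (the paper's full method embeds this in the Markov transition structure itself):

      # Rates and the demand-failure probability below are illustrative.
      lam, mu = 1e-3, 1e-1     # running failure rate and repair rate (per hour)
      p_demand = 5e-3          # probability the component fails to start on demand

      # Steady-state availability of the standby component between demands.
      A_standby = mu / (lam + mu)

      # Probability the component actually functions when demanded: it must be
      # 'up' at the demand instant and must also survive the demand itself.
      A_on_demand = A_standby * (1 - p_demand)
      print(f"availability at demand: {A_on_demand:.5f}")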

  7. Communication analyses of plant operator crews

    International Nuclear Information System (INIS)

    Elucidation of crew communication aspects is required to improve the man-man interface which supports operators' diagnoses and decisions. Experiments to clarify operator performance under abnormal condition were evaluated by protocol analyses, interviews, etc. using a training simulator. We had the working hypothesis, based on experimental observations, that operator performance can be evaluated by analysis of crew communications. The following four approaches were tried to evaluate operator performance. (1) Crew performance was quantitatively evaluated by the number of tasks undertaken by an operator crew. (2) The group thinking process was clarified by cognition-communication flow. (3) The group response process was clarified by movement flow. (4) Quantitative indexes for evaluating crew performance were considered to be represented by the amount of information effectively exchanged among operators. (author)

  8. Digital analyses of cartometric Fruska Gora guidelines

    Directory of Open Access Journals (Sweden)

    Živković Dragica

    2013-01-01

Full Text Available Modern geomorphological and topographic research has been using quantitative statistical and cartographic methods to analyse topographic relief features and the connections among them on the basis of good-quality numeric parameters. Topographic features are important for natural processes. Particularly important morphological characteristics are the slope angle of the topography, the hypsometry, the exposure of the topography, and so on. Even small, little-known relief slopes can deeply affect land configuration, hypsometry, topographic exposure, etc. Exposure modifies light and heat and thereby interconnected phenomena: soil and air temperature, soil disintegration, the length of the vegetation period, the complexity of photosynthesis, the yield of agricultural crops, the height of the snow line, etc. [Projekat Ministarstva nauke Republike Srbije, br. 176008 i br. III44006]

  9. Risques naturels en montagne et analyse spatiale

    Directory of Open Access Journals (Sweden)

    Yannick Manche

    1999-06-01

Full Text Available The concept of risk rests on two notions: the hazard, which represents the physical phenomenon through its magnitude and return period; and vulnerability, which represents all the goods and people liable to be affected by a natural phenomenon. Risk is then defined as the intersection of these two notions. This theoretical view makes it possible to model hazards and vulnerability independently. This work focuses on taking vulnerability into account in the management of natural risks. Its assessment necessarily involves a certain amount of spatial analysis, taking into account human occupation and the different scales of land use. But spatial assessment, whether of goods and people or of indirect effects, runs into numerous problems. The extent of land occupation must be estimated. Moreover, processing the data implies constant changes of scale to pass from point features to surfaces, which geographical information systems do not handle perfectly. Risk management imposes strong planning constraints; taking vulnerability into account makes it possible to better understand and manage the spatial constraints implied by natural risks. Keywords: hazard, spatial analysis, natural risks, GIS, vulnerability

  10. The radiation analyses of ITER lower ports

    Energy Technology Data Exchange (ETDEWEB)

    Petrizzi, L., E-mail: petrizzi@frascati.enea.it [Associazione EURATOM-ENEA sulla Fusione, Via Enrico Fermi 45, 00044 Frascati, Rome (Italy); Brolatti, G. [Associazione EURATOM-ENEA sulla Fusione, Via Enrico Fermi 45, 00044 Frascati, Rome (Italy); Martin, A.; Loughlin, M. [ITER Organization, Cadarache, 13108 St Paul-lez-Durance (France); Moro, F.; Villari, R. [Associazione EURATOM-ENEA sulla Fusione, Via Enrico Fermi 45, 00044 Frascati, Rome (Italy)

    2010-12-15

    The ITER Vacuum Vessel has upper, equatorial, and lower ports used for equipment installation, diagnostics, heating and current drive systems, cryo-vacuum pumping, and access inside the vessel for maintenance. At the level of the divertor, the nine lower ports for remote handling, cryo-vacuum pumping and diagnostics are inclined downwards and toroidally located every 40°. The cryopump port additionally has a branch to accommodate a second cryopump. The ports, as openings in the Vacuum Vessel, permit radiation streaming out of the vessel, which affects the heating of components in the outer regions of the machine, both inside and outside the ports. Safety concerns are also raised with respect to the dose after shutdown at the cryostat behind the ports: in such zones the radiation dose level must be kept below the regulatory limit to allow personnel access for maintenance purposes. Neutronic analyses have been required to qualify the ITER project with respect to the lower ports. A 3-D model was used to take into account full details of the ports and the lower machine surroundings. MCNP version 5 1.40 was used with the FENDL 2.1 nuclear data library. The ITER 40° model distributed by the ITER Organization was developed in the lower part to include the relevant details. The results of a first analysis, focused on the cryopump system only, were recently published. In this paper more complete data on the cryopump port and analyses for the remote handling port and the diagnostic rack are presented; together these give a complete map of the radiation loads in the outer divertor ports. Nuclear heating, dpa, tritium production, and dose rates after shutdown are provided, and the implications for the design are discussed.

  12. Structural prediction in aphasia

    Directory of Open Access Journals (Sweden)

    Tessa Warren

    2015-05-01

Full Text Available There is considerable evidence that young healthy comprehenders predict the structure of upcoming material, and that their processing is facilitated when they encounter material matching those predictions (e.g., Staub & Clifton, 2006; Yoshida, Dickey & Sturt, 2013). However, less is known about structural prediction in aphasia. There is evidence that lexical prediction may be spared in aphasia (Dickey et al., 2014; Love & Webb, 1977; cf. Mack et al., 2013). However, predictive mechanisms supporting facilitated lexical access may not necessarily support structural facilitation. Given that many people with aphasia (PWA) exhibit syntactic deficits (e.g. Goodglass, 1993), PWA with such impairments may not engage in structural prediction. However, recent evidence suggests that some PWA may indeed predict upcoming structure (Hanne, Burchert, De Bleser, & Vasishth, 2015). Hanne et al. tracked the eyes of PWA (n=8) with sentence-comprehension deficits while they listened to reversible subject-verb-object (SVO) and object-verb-subject (OVS) sentences in German, in a sentence-picture matching task. Hanne et al. manipulated case and number marking to disambiguate the sentences' structure. Gazes to an OVS or SVO picture during the unfolding of a sentence were assumed to indicate prediction of the structure congruent with that picture. According to this measure, the PWA's structural prediction was impaired compared to controls, but they did successfully predict upcoming structure when morphosyntactic cues were strong and unambiguous. Hanne et al.'s visual-world evidence is suggestive, but their forced-choice sentence-picture matching task places tight constraints on possible structural predictions. Clearer evidence of structural prediction would come from paradigms where the content of upcoming material is not as constrained. The current study used self-paced reading to examine structural prediction among PWA in less constrained contexts. PWA (n=17) who

  13. RAMA Surveillance Capsule and Component Activation Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Watkins, Kenneth E.; Jones, Eric N. [TransWare Enterprises Inc., 1565 Mediterranean Dr., Sycamore, IL 60178 (United States); Carter, Robert G. [Electric Power Research Institute, 1300 West W. T. Harris Blvd., Charlotte, NC 28262 (United States)

    2011-07-01

    This paper presents the calculated-to-measured ratios associated with the application of the RAMA Fluence Methodology software to light water reactor surveillance capsule and reactor component activation evaluations. Comparisons to measurements are performed for pressurized water reactor and boiling water reactor surveillance capsule activity specimens from seventeen operating light water reactors. Comparisons to measurements are also performed for samples removed from the core shroud, top guide, and jet pump brace pads from two reactors. In conclusion: The flexible geometry modeling capabilities provided by RAMA, combined with the detailed representation of operating reactor history and anisotropic scattering detail, produces accurate predictions of the fast neutron fluence and neutron activation for BWR and PWR surveillance capsule geometries. This allows best estimate RPV fluence to be determined without the need for multiplicative bias corrections. The three-dimensional modeling capability in RAMA provides an accurate estimate of the fast neutron fluence for regions far removed from the core mid-plane elevation. The comparisons to activation measurements for various core components indicate that the RAMA predictions are reasonable, and notably conservative (i.e., C/M ratios are consistently greater than unity). It should be noted that in the current evaluations, the top and bottom fuel regions are represented by six inch height nodes. As a result, the leakage-induced decrease in power near the upper and lower edges of the core are not well represented in the current models. More precise predictions of fluence for components that lie above and below the core boundaries could be obtained if the upper and lower fuel nodes were subdivided into multiple axial regions with assigned powers that reflect the neutron leakage at the top and bottom of the core. This use of additional axial sub-meshing at the top and bottom of the core is analogous to the use of pin

  14. Analysing CMS transfers using Machine Learning techniques

    CERN Document Server

    Diotalevi, Tommaso

    2016-01-01

LHC experiments transfer more than 10 PB/week between all grid sites using the FTS transfer service. In particular, CMS manages almost 5 PB/week of FTS transfers with PhEDEx (Physics Experiment Data Export). FTS sends metrics about each transfer (e.g. transfer rate, duration, size) to a central HDFS storage at CERN. The work done during these three months as a Summer Student involved the use of ML techniques, within a CMS framework called DCAFPilot, to process these data and generate predictions of transfer latencies on all links between Grid sites. This analysis will provide, as a future service, the information necessary to proactively identify and possibly fix latency-affected transfers over the WLCG.

  15. Failure analyses of composite bolted joints

    Science.gov (United States)

    Wilson, D. W.; Gillespie, J. W.; York, J. L.; Pipes, R. B.

    1980-01-01

The complex failure behavior exhibited by bolted joints of graphite epoxy (Hercules AS/3501) was investigated for the net tension, bearing and shearout failure modes using combined analytical and experimental techniques. Plane stress, linear elastic, finite element methods were employed to determine the two-dimensional state of stress resulting from a loaded hole in a finite-width, semi-infinite strip. The stresses predicted by the finite element method were verified by experiment to lend credence to the analysis. The influence of joint geometric parameters on the state of stress and resultant strength of the joint was also studied. The resulting functional relationships found to exist between bolted joint strength and the geometric parameters were applied in the formulation of semiempirical strength models for the basic failure modes. A point stress failure criterion was successfully applied as the failure criterion for the net tension and shearout failure modes.

  16. Bayesian Uncertainty Analyses Via Deterministic Model

    Science.gov (United States)

    Krzysztofowicz, R.

    2001-05-01

    Rational decision-making requires that the total uncertainty about a variate of interest (a predictand) be quantified in terms of a probability distribution, conditional on all available information and knowledge. Suppose the state-of-knowledge is embodied in a deterministic model, which is imperfect and outputs only an estimate of the predictand. Fundamentals are presented of three Bayesian approaches to producing a probability distribution of the predictand via any deterministic model. The Bayesian Processor of Output (BPO) quantifies the total uncertainty in terms of a posterior distribution, conditional on model output. The Bayesian Processor of Ensemble (BPE) quantifies the total uncertainty in terms of a posterior distribution, conditional on an ensemble of model output. The Bayesian Forecasting System (BFS) decomposes the total uncertainty into input uncertainty and model uncertainty, which are characterized independently and then integrated into a predictive distribution.
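
    A toy normal-linear instance of the BPO idea, assuming the deterministic output X has been linearly calibrated against the predictand W on past data (all numbers below are illustrative):

      import numpy as np

      m0, s0 = 10.0, 4.0          # prior mean and s.d. of the predictand W
      a, b, s = 1.0, 0.5, 2.0     # calibration: X | W ~ N(a*W + b, s^2)
      x = 13.0                    # today's deterministic model output

      # Conjugate update: the posterior of W given X = x is again normal.
      prec = 1 / s0**2 + a**2 / s**2
      mean = (m0 / s0**2 + a * (x - b) / s**2) / prec
      print(f"posterior: N({mean:.2f}, {np.sqrt(1 / prec):.2f}^2)")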

  17. Proposed Testing to Assess the Accuracy of Glass-To-Metal Seal Stress Analyses.

    Energy Technology Data Exchange (ETDEWEB)

    Chambers, Robert S.; Emery, John M; Tandon, Rajan; Antoun, Bonnie R.; Stavig, Mark E.; Newton, Clay S.; Gibson, Cory S; Bencoe, Denise N.

    2014-09-01

The material characterization tests conducted on 304L VAR stainless steel and Schott 8061 glass have provided higher-fidelity data for calibration of the material models used in Glass-To-Metal (GTM) seal analyses. Specifically, a Thermo-Multi-Linear Elastic Plastic (thermo-MLEP) material model has been defined for SS304L, and the Simplified Potential Energy Clock nonlinear viscoelastic model has been calibrated for the S8061 glass. To assess the accuracy of finite element stress analyses of GTM seals, a suite of tests is proposed to provide data for comparison to model predictions.

  18. Wind power prediction models

    Science.gov (United States)

    Levy, R.; Mcginness, H.

    1976-01-01

    Investigations were performed to predict the power available from the wind at the Goldstone, California, antenna site complex. The background for power prediction was derived from a statistical evaluation of available wind speed data records at this location and at nearby locations similarly situated within the Mojave desert. In addition to a model for power prediction over relatively long periods of time, an interim simulation model that produces sample wind speeds is described. The interim model furnishes uncorrelated sample speeds at hourly intervals that reproduce the statistical wind distribution at Goldstone. A stochastic simulation model to provide speed samples representative of both the statistical speed distributions and correlations is also discussed.
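
    A hedged sketch of what an interim-style simulator can look like: uncorrelated hourly speeds drawn from a Weibull distribution, converted to available wind power per unit area (the shape and scale values are illustrative, not the Goldstone estimates):

      import numpy as np

      rng = np.random.default_rng(6)
      shape, scale = 2.0, 7.0                   # Weibull k and c (m/s), illustrative
      hourly = scale * rng.weibull(shape, size=24)

      # Available wind power per unit swept area: P = 0.5 * rho * v^3.
      rho = 1.225                               # air density, kg/m^3
      power = 0.5 * rho * hourly**3
      print(power.round(1))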

  19. Zephyr - The prediction models

    Energy Technology Data Exchange (ETDEWEB)

    Nielsen, T.S.; Madsen, H.; Nielsen, H.Aa. [Informatics and Mathematical Modelling - DTU, Kgs. Lyngby (Denmark); Landberg, L.; Giebel, G. [Risoe National Lab., Roskilde (Denmark)

    2006-07-01

    This paper briefly describes new models and methods for predicting the wind power output from wind farms. The system is being developed in a project which has the research organization Risoe and the department of Informatics and Mathematical Modelling (IMM) as the modelling team and all the Danish utilities as partners and users. The new models are evaluated for five wind farms in Denmark as well as one wind farm in Spain. It is shown that the predictions based on conditional parametric models are superior to the predictions obtained by state-of-the-art parametric models. (au)

  20. On Prediction of EOP

    CERN Document Server

    Malkin, Z

    2009-01-01

    Two methods of prediction of the Pole coordinates and TAI-UTC were tested -- extrapolation of the deterministic components and ARIMA. It was found that each of these methods is most effective for certain length of prognosis. For short-time prediction ARIMA algorithm yields more accurate prognosis, and for long-time one extrapolation is preferable. So, the combined algorithm is being used in practice of IAA EOP Service. The accuracy of prognosis is close to accuracy of IERS algorithms. For prediction of nutation the program KSV-1996-1 by T. Herring is being used.
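
    The two styles of prognosis compared above can be sketched on a simulated series as follows (the ARIMA order, horizons, and the purely linear deterministic part are illustrative simplifications):

      import numpy as np
      from statsmodels.tsa.arima.model import ARIMA

      rng = np.random.default_rng(4)
      t = np.arange(400.0)
      series = 0.01 * t + np.sin(2 * np.pi * t / 365.25) + rng.normal(0, 0.05, 400)

      # Long-term prognosis: extrapolate the deterministic component
      # (here just a linear trend; EOP practice adds seasonal terms).
      coefs = np.polynomial.polynomial.polyfit(t, series, 1)
      long_term = np.polynomial.polynomial.polyval(t[-1] + 100, coefs)

      # Short-term prognosis: ARIMA fitted to the series itself.
      short_term = ARIMA(series, order=(2, 1, 1)).fit().forecast(steps=10)
      print(f"100-step extrapolation: {long_term:.3f}")
      print(f"10-step ARIMA forecast: {short_term[-1]:.3f}")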

  1. Bond return predictability in expansions and recessions

    DEFF Research Database (Denmark)

    Engsted, Tom; Møller, Stig Vinther; Jensen, Magnus David Sander

We document that over the period 1953-2011 US bond returns are predictable in expansionary periods but unpredictable during recessions. This result holds in both in-sample and out-of-sample analyses and using both univariate regressions and combination forecasting techniques. A simulation study...... shows that our tests have power to reject unpredictability in both expansions and recessions. To judge the economic significance of the results we compute utility gains for a mean-variance investor who takes the predictability patterns into account and show that utility gains are positive in expansions...... but negative in recessions. The results are also consistent with tests showing that the expectations hypothesis of the term structure holds in recessions but not in expansions. However, the results for bonds are in sharp contrast to results for stocks showing that stock returns are predictable in recessions...

  2. Differential AR algorithm for packet delay prediction

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

Different delay prediction algorithms have been applied in multimedia communication, among which linear prediction is attractive because of its low complexity. The AR (autoregressive) algorithm is a traditional one with low computation cost, while the NLMS (normalized least mean square) algorithm is more precise. In this paper, drawing on the ARIMA (autoregressive integrated moving average) model, a differential AR algorithm (DIAR) is proposed based on analyses of both the AR and NLMS algorithms. The prediction precision of the new algorithm is about 5-10 dB higher than that of the AR algorithm without increasing the computation complexity. Compared with the NLMS algorithm, its precision improves slightly, by 0.1 dB on average, but the algorithm complexity is reduced by more than 90%. Our simulations and tests also demonstrate that this method significantly improves the average end-to-end delay and packet loss ratio performance.
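
    A minimal sketch of the differential-AR idea: fit a least-squares AR model to the differenced delay series and integrate the one-step forecast back, ARIMA-style (the order p and the toy data are illustrative, not the paper's DIAR specification):

      import numpy as np

      def diar_predict(x, p=4):
          d = np.diff(x)                       # work on successive differences
          X = np.column_stack([d[i:len(d) - p + i] for i in range(p)])
          y = d[p:]                            # next difference for each window
          coef, *_ = np.linalg.lstsq(X, y, rcond=None)
          return x[-1] + d[-p:] @ coef         # integrate the predicted difference

      rng = np.random.default_rng(5)
      delays = 50 + np.cumsum(rng.normal(0, 1, 300))   # toy packet-delay series
      print("next-delay prediction:", diar_predict(delays))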

  3. Split Hopkinson pressure bar technique: Experiments, analyses and applications

    Science.gov (United States)

    Gama, Bazle Anwer

    A critical review of the Hopkinson bar experimental technique is performed to identify the validity and applicability of the classic one-dimensional theory. A finite element model of the Hopkinson bar experiment is developed in three-dimensions and is used in detailed numerical analyses. For a small diameter hard specimen, the bar-specimen interfaces are non-planar, which predicts higher specimen strain and, thus, lower initial modulus in the linear elastic phase of deformation. In such cases, the stress distribution in the specimen is not uni-axial and a chamfered specimen geometry is found to provide better uni-axial stress condition in the specimen. In addition, a new Hopkinson bar with transmission tube is found suitable for small strain measurement of small diameter specimens. A one-dimensional exact Hopkinson bar theory considering the stress wave propagation in an equal diameter specimen has been formulated which predicts physically meaningful results in all extreme cases as compared to classic theory. In light of the theoretical and numerical investigations, an experimental methodology for rate dependent modulus and strength is developed. Quasi-static and dynamic behavior of plain weave (15 x 15) S-2 glass/SC15 composites has been investigated. A new circular-rectangular prism specimen (C-RPS) geometry is found suitable for testing laminated composites in the in-plane directions. Rate sensitive strength, non-linear strain and elastic modulus parameters for plain-weave (15 x 15) S-2 glass/SC15 composites have been experimentally determined.
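
    For reference, the classic one-dimensional data reduction that the dissertation critiques can be written in a few lines; the bar properties and strain pulses below are illustrative:

      import numpy as np

      E, c0 = 200e9, 5000.0        # bar modulus (Pa) and wave speed (m/s)
      A_bar, A_spec, L_spec = 3.0e-4, 1.0e-4, 5.0e-3   # areas (m^2), length (m)

      t = np.linspace(0, 100e-6, 500)
      eps_r = -4e-4 * np.sin(np.pi * t / t[-1])        # reflected strain pulse
      eps_t = 8e-4 * np.sin(np.pi * t / t[-1])         # transmitted strain pulse

      stress = E * (A_bar / A_spec) * eps_t            # specimen stress
      strain_rate = -2 * c0 / L_spec * eps_r           # specimen strain rate
      strain = np.cumsum(strain_rate) * (t[1] - t[0])  # integrate over time
      print(f"peak stress {stress.max() / 1e6:.0f} MPa, peak strain {strain.max():.3f}")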

  4. An Illumination Modeling System for Human Factors Analyses

    Science.gov (United States)

    Huynh, Thong; Maida, James C.; Bond, Robert L. (Technical Monitor)

    2002-01-01

    Seeing is critical to human performance. Lighting is critical for seeing. Therefore, lighting is critical to human performance. This is common sense, and here on earth, it is easily taken for granted. However, on orbit, because the sun rises or sets every 45 minutes on average, humans working in space must cope with extremely dynamic lighting conditions, and contrast conditions of harsh shadowing and glare are also severe. The prediction of lighting conditions for critical operations is essential. Crew training can factor lighting into the lesson plans when necessary. Mission planners can determine whether low-light video cameras are required or whether additional luminaires need to be flown. Optimization of the quantity and quality of light is needed because of the effects on crew safety, on electrical power and on equipment maintainability. To address all of these issues, an illumination modeling system has been developed by the Graphics Research and Analyses Facility (GRAF) and the Lighting Environment Test Facility (LETF) in the Space Human Factors Laboratory at NASA Johnson Space Center. The system uses physically based ray tracing software (Radiance) developed at Lawrence Berkeley Laboratories, a human factors oriented geometric modeling system (PLAID) and an extensive database of humans and environments. Material reflectivity properties of major and critical surfaces are measured using a gonio-reflectometer. Luminaires (lights) are measured for beam spread distribution, color and intensity. Video cameras are measured for color response and light sensitivity. 3D geometric models of humans and the environment are combined with the material and light models to form a system capable of predicting lighting and visibility conditions in space.
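
    The system itself couples Radiance ray tracing with measured photometry, which cannot be reproduced here; purely as an illustration of the basic photometric relation underlying such predictions, direct illuminance from a point luminaire follows the inverse-square cosine law (the function and the example luminaire are invented):

        import numpy as np

        def point_illuminance(intensity_cd, source, surface_pt, surface_normal):
            """Direct illuminance (lux) at a surface point from a point
            source: E = I * cos(theta) / d^2 (inverse-square cosine law)."""
            v = np.asarray(source, float) - np.asarray(surface_pt, float)
            d2 = v @ v
            n = np.asarray(surface_normal, float)
            cos_theta = max((v / np.sqrt(d2)) @ (n / np.linalg.norm(n)), 0.0)
            return intensity_cd * cos_theta / d2

        # a 1200 cd luminaire 2 m directly above a horizontal surface point
        print(point_illuminance(1200.0, (0, 0, 2), (0, 0, 0), (0, 0, 1)))  # 300 lux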

  5. High performance liquid chromatography in pharmaceutical analyses.

    Science.gov (United States)

    Nikolin, Branko; Imamović, Belma; Medanhodzić-Vuk, Saira; Sober, Miroslav

    2004-05-01

    In the testing of drugs before marketing, and in their control over the last ten years, high-performance liquid chromatography has replaced numerous spectroscopic methods and gas chromatography in quantitative and qualitative analysis. In the first period of HPLC application it was thought that it would become a method complementary to gas chromatography; today, however, it has nearly completely replaced gas chromatography in pharmaceutical analysis. The use of a liquid mobile phase, whose polarity can be modified during chromatography along with other adjustments depending on the characteristics of the substance being tested, is a great advantage in the separation process compared with other methods. The wide choice of stationary phases is another factor enabling good separation. The separation column can be connected to specific and sensitive detector systems (spectrofluorimetric, diode-array and electrochemical detectors) and to hyphenated systems such as HPLC-MS and HPLC-NMR; these are the basic elements on which the wide and effective application of the HPLC method is based. The purpose of HPLC analysis of any drug is to confirm the identity of the drug, provide quantitative results and monitor the progress of therapy of a disease. Fig. 1 shows a chromatogram obtained from the plasma of depressed patients 12 h before oral administration of dexamethasone. HPLC may also be used to further our understanding of normal and disease processes in the human body through biomedical and therapeutic research during the investigations preceding drug registration. The analysis of drugs and metabolites in biological fluids, particularly plasma, serum or urine, is one of the most demanding but most common uses of high-performance liquid chromatography. Blood, plasma or serum contains numerous endogenous

  6. Hungarian approach to LOCA analyses for SARs

    International Nuclear Information System (INIS)

    The Hungarian AGNES project, in the period 1992-94, was performed with the aim of reassessing the safety of the Paks NPP using state-of-the-art techniques. The project comprised - among others - a complete design basis accident (DBA) analysis. The major part of the thermal-hydraulic analyses was performed with the RELAP5/mod2.5/V251 code version using a conservative approach. In the medium-size LOCA calculations and the PTS studies, the six reactor cooling loops of the WWER-440/213 system were modelled by three loops (a single, a double and a triple loop). In the further developed version of the input model, used in small break LOCA and other DBA analyses, the six loops were modelled separately. The nodalisation schemes of the reactor vessel and the pressurizer, as well as of the single primary loops, are identical in the two input models. For the six-loop input model the trip cards, general tables and control variables are generated using a RELAP5 object-oriented pre-processing interactive code, TROPIC 4.0, received from TRACTEBEL Belgium. The six-loop input model for the WWER-440/V213 system was verified against data from two operational transients measured at Paks NPP. The analysis of large break LOCAs, where the combined simultaneous upper plenum and downcomer injection results in a rather complicated process during the reflooding phase, was carried out using the ATHLET mod 1.1 Cycle code version (developed by GRS) in the framework of a bilateral German-Hungarian cooperation agreement, using a two-loop (1+5) input model. Later on, in our safety analysis activities, the application of best estimate methodology gained ground. In recent years AEKI, in the framework of different projects such as the US CAMP activity, EU PHARE and 5th Framework Programmes, as well as national projects to support plant operation, performed many further LOCA analyses, including primary-to-secondary leakages and feedwater and steam line breaks. These can be the preparation for a new DBA Analysis project

  7. Genomic Prediction in Barley

    DEFF Research Database (Denmark)

    Edriss, Vahid; Cericola, Fabio; Jensen, Jens D;

    2015-01-01

    Genomic prediction uses markers (SNPs) across the whole genome to predict individual breeding values at an early growth stage, potentially before large-scale phenotyping. One of the applications of genomic prediction in plant breeding is to identify the best individual candidate lines to contribute...... to the next generation. The main goal of this study was to assess the potential of using genomic prediction in a commercial barley breeding program. The data used in this study came from the Nordic Seed company, located in Denmark. Around 350 advanced lines were genotyped with the 9K barley chip from...... Illumina. The traits used in this study were grain yield, plant height and heading date. Heading date is the number of days it takes after 1st June for a plant to head. Heritabilities were 0.33, 0.44 and 0.48 for yield, height and heading, respectively, for the average of nine plots. The GBLUP model was used for genomic...
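
    The GBLUP model mentioned above predicts breeding values from a genomic relationship matrix built from the SNP markers. A minimal sketch of that idea (toy genotypes and phenotypes, an assumed heritability and no fixed effects; the actual study would estimate variance components from the breeding data):

        import numpy as np

        rng = np.random.default_rng(1)
        n, m = 350, 9000                      # lines x SNPs, echoing the abstract
        M = rng.integers(0, 3, (n, m)).astype(float)  # toy 0/1/2 genotype calls
        W = M - M.mean(axis=0)                # centre each marker column
        G = W @ W.T / m                       # genomic relationship matrix
        y = rng.normal(size=n)                # toy phenotypes (e.g. yield)
        h2 = 0.4                              # assumed heritability
        lam = (1.0 - h2) / h2                 # ratio sigma_e^2 / sigma_u^2
        # GBLUP breeding values: u_hat = G (G + lambda I)^-1 (y - mean)
        u_hat = G @ np.linalg.solve(G + lam * np.eye(n), y - y.mean())
        print(u_hat[:5])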

  8. Epitope prediction methods

    DEFF Research Database (Denmark)

    Karosiene, Edita

    Major histocompatibility complex (MHC) molecules, in humans called human leucocyte antigen (HLA) molecules, are encoded by extremely polymorphic genes on chromosome 6. Due to this polymorphism, thousands of different MHC molecules exist, making the experimental identification of peptide-MHC interactions a very costly procedure. This has primed the need for in silico peptide......-MHC prediction methods, and over the last decade several such methods have been successfully developed and used for epitope discovery purposes. My PhD project has been dedicated to improving methods for predicting peptide-MHC interactions by developing new strategies for training prediction algorithms based...... on machine learning techniques. Several MHC class I binding prediction algorithms have been developed, and due to their high accuracy they are used by many immunologists to facilitate the conventional experimental process of epitope discovery. However, the accuracy of these methods depends on data defining...
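
    As a toy illustration of such data-driven peptide-MHC prediction (not one of the thesis methods: the peptides below are known HLA-A2 binders, but the scores and the ridge model are invented), fixed-length 9-mer peptides can be one-hot encoded and fed to any standard regressor:

        import numpy as np
        from sklearn.linear_model import Ridge

        AA = "ACDEFGHIKLMNPQRSTVWY"

        def one_hot(peptide):
            """Flat one-hot encoding of a 9-mer peptide (9 x 20 features)."""
            x = np.zeros((9, len(AA)))
            for i, aa in enumerate(peptide):
                x[i, AA.index(aa)] = 1.0
            return x.ravel()

        peptides = ["ILKEPVHGV", "LLFGYPVYV", "GILGFVFTL", "KLVALGINA"]
        scores = [0.8, 0.9, 0.95, 0.3]        # made-up binding scores
        X = np.array([one_hot(p) for p in peptides])
        model = Ridge(alpha=1.0).fit(X, scores)
        print(model.predict([one_hot("ILKEPVHGV")]))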

  9. Predictable grammatical constructions

    DEFF Research Database (Denmark)

    Lucas, Sandra

    2015-01-01

    My aim in this paper is to provide evidence from diachronic linguistics for the view that some predictable units are entrenched in grammar and consequently in human cognition, in a way that makes them functionally and structurally equal to nonpredictable grammatical units, suggesting...... that these predictable units should be considered grammatical constructions on a par with the nonpredictable constructions. Frequency has usually been seen as the only possible argument speaking in favor of viewing some formally and semantically fully predictable units as grammatical constructions. However, this paper...... semantically and formally predictable. Despite this difference, [méllo INF], like the other future periphrases, seems to be highly entrenched in the cognition (and grammar) of Early Medieval Greek language users, and consequently a grammatical construction. The syntactic evidence speaking in favor of [méllo...

  10. Predicting toxicity of nanoparticles

    OpenAIRE

    BURELLO ENRICO; Worth, Andrew

    2011-01-01

    A statistical model based on a quantitative structure–activity relationship accurately predicts the cytotoxicity of various metal oxide nanoparticles, thus offering a way to rapidly screen nanomaterials and prioritize testing.

  11. Robust Distributed Online Prediction

    CERN Document Server

    Dekel, Ofer; Shamir, Ohad; Xiao, Lin

    2010-01-01

    The standard model of online prediction deals with serial processing of inputs by a single processor. However, in large-scale online prediction problems, where inputs arrive at a high rate, an increasingly common necessity is to distribute the computation across several processors. A non-trivial challenge is to design distributed algorithms for online prediction, which maintain good regret guarantees. In \\cite{DMB}, we presented the DMB algorithm, which is a generic framework to convert any serial gradient-based online prediction algorithm into a distributed algorithm. Moreover, its regret guarantee is asymptotically optimal for smooth convex loss functions and stochastic inputs. On the flip side, it is fragile to many types of failures that are common in distributed environments. In this companion paper, we present variants of the DMB algorithm, which are resilient to many types of network failures, and tolerant to varying performance of the computing nodes.
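
    A stripped-down simulation of the distributed-mini-batch idea (a least-squares loss on synthetic data, with nodes simulated by array splits; none of the paper's failure-resilience mechanisms are shown): each node computes a gradient on its local batch, the gradients are averaged, and one serial gradient step is applied.

        import numpy as np

        def dmb_step(w, X_shards, y_shards, lr):
            """One distributed-mini-batch step for least squares: local
            gradients per node, averaged, then a single serial update."""
            grads = [2.0 * Xs.T @ (Xs @ w - ys) / len(ys)
                     for Xs, ys in zip(X_shards, y_shards)]
            return w - lr * np.mean(grads, axis=0)

        rng = np.random.default_rng(0)
        X, w_true = rng.normal(size=(800, 5)), np.arange(5.0)
        y = X @ w_true + 0.1 * rng.normal(size=800)
        w = np.zeros(5)
        for _ in range(60):                               # repeated input stream
            for Xb, yb in zip(np.split(X, 8), np.split(y, 8)):
                # each round, the mini-batch is sharded across 4 "nodes"
                w = dmb_step(w, np.split(Xb, 4), np.split(yb, 4), lr=0.1)
        print(np.round(w, 2))                             # close to w_true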

  12. Trend Analyses of Nitrate in Danish Groundwater

    Science.gov (United States)

    Hansen, B.; Thorling, L.; Dalgaard, T.; Erlandsen, M.

    2012-04-01

    This presentation assesses the long-term development in the oxic groundwater nitrate concentration and the nitrogen (N) loss due to intensive farming in Denmark. Firstly, up to 20-year time series from the national groundwater monitoring network enable a statistically systematic analysis of distribution, trends and trend reversals in the groundwater nitrate concentration. Secondly, knowledge about the N surplus in Danish agriculture since 1950 is used as an indicator of the potential loss of N. Thirdly, groundwater recharge CFC (chlorofluorocarbon) age determination allows linking of the first two datasets. The development in the nitrate concentration of oxic groundwater clearly mirrors the development in the national agricultural N surplus, and a corresponding trend reversal is found in groundwater. Regulation and technical improvements in intensive farming in Denmark have succeeded in decreasing the N surplus by 40% since the mid-1980s, while at the same time maintaining crop yields and increasing animal production, especially of pigs. Trend analyses demonstrate that the youngest (0-15 years old) oxic groundwater shows more pronounced significant downward nitrate trends (44%) than the oldest (25-50 years old) oxic groundwater (9%). This amounts to clear evidence of the effect of reduced nitrate leaching on groundwater nitrate concentrations in Denmark. Is the Danish groundwater monitoring strategy optimal for the detection of nitrate trends? Will the nitrate concentrations in Danish groundwater continue to decrease, or are the Danish nitrate concentration levels now appropriate according to the Water Framework Directive?
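
    The record does not spell out the trend statistics used; one standard nonparametric choice for such monitoring series, shown here purely as an illustration with made-up yearly means, is the Mann-Kendall test:

        import numpy as np
        from scipy.stats import norm

        def mann_kendall(x):
            """Mann-Kendall trend test: S statistic and two-sided p-value
            (normal approximation, no correction for ties)."""
            x = np.asarray(x, float)
            n = len(x)
            s = sum(np.sign(x[j] - x[i])
                    for i in range(n - 1) for j in range(i + 1, n))
            var_s = n * (n - 1) * (2 * n + 5) / 18.0
            z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
            return s, 2.0 * (1.0 - norm.cdf(abs(z)))

        nitrate = [52, 55, 49, 47, 44, 46, 41, 40, 38, 36]  # invented mg/l means
        print(mann_kendall(nitrate))   # strongly negative S: downward trend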

  13. Network-Based and Binless Frequency Analyses.

    Directory of Open Access Journals (Sweden)

    Sybil Derrible

    We introduce and develop a new network-based and binless methodology to perform frequency analyses and produce histograms. In contrast with traditional frequency analysis techniques that use fixed intervals to bin values, we place a range ±ζ around each individual value in a data set and count the number of values within that range, which allows us to compare every single value of a data set with one another. In essence, the methodology is identical to the construction of a network, where two values are connected if they lie within a given range (±ζ). The value with the highest degree (i.e., most connections) is therefore taken as the mode of the distribution. To select an optimal range, we look at the stability of the proportion of nodes in the largest cluster. The methodology is validated by sampling 12 typical distributions, and it is applied to a number of real-world data sets with both spatial and temporal components. The methodology can be applied to any data set and provides a robust means to uncover meaningful patterns and trends. A free python script and a tutorial are also made available to facilitate the application of the method.
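
    The described procedure is simple enough to render literally at small scale (the authors' free python script is the reference implementation; the sample data and the choice of ζ below are arbitrary):

        import numpy as np

        def binless_mode(values, zeta):
            """For each value, count how many other values lie within
            +/- zeta (its degree); the highest-degree value estimates
            the mode of the distribution."""
            v = np.asarray(values, float)
            degree = (np.abs(v[:, None] - v[None, :]) <= zeta).sum(axis=1) - 1
            return v[degree.argmax()], degree

        data = np.random.default_rng(3).normal(10.0, 2.0, 500)
        mode, deg = binless_mode(data, zeta=0.25)
        print(mode)   # near 10 for this unimodal sample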

  14. Consumption patterns and perception analyses of hangwa.

    Science.gov (United States)

    Kwock, Chang Geun; Lee, Min A; Park, So Hyun

    2012-03-01

    Hangwa is a traditional food which, to match current consumption trends, needs marketing strategies to extend its consumption. Therefore, the purpose of this study was to analyze consumers' consumption patterns and perception of Hangwa to increase its consumption in the market. A questionnaire was sent to 250 consumers by e-mail from Oct 8 to 23, 2009, and the data from 231 respondents were analyzed in this study. Descriptive statistics, paired-samples t-tests, and importance-performance analyses were conducted using SPSS WIN 17.0. According to the results, Hangwa was purchased mainly 'for present' (39.8%), and the main reasons for buying it were 'traditional image' (33.3%) and 'taste' (22.5%). When the importance and performance of attributes considered in purchasing Hangwa were evaluated, performance was assessed to be lower than importance for all attributes. The attributes in the first quadrant, with high importance and high performance, were 'a sanitary process', 'a rigorous quality mark' and 'taste', which were related to the quality of the products. Those with high importance but low performance were 'popularization through advertisement', 'promotion through mass media', 'conversion of thought on traditional foods', 'a reasonable price' and 'a wide range of price'. In conclusion, Hangwa manufacturers need to diversify products and extend the expiration date based on technologies to promote its consumption. In terms of price, Hangwa should become more available by lowering the price barrier for consumers who are sensitive to price. PMID:24471065

  15. Analyses of demand response in Denmark

    International Nuclear Information System (INIS)

    Due to characteristics of the power system, the costs of producing electricity vary considerably over short time intervals. Yet, many consumers do not experience corresponding variations in the price they pay for consuming electricity. The topic of this report is whether consumers are willing and able to respond to short-term variations in electricity prices, and if so, what the social benefit of their doing so is. Taking Denmark and the Nord Pool market as a case, the report focuses on what is known as short-term consumer flexibility or demand response in the electricity market. With focus on market efficiency, efficient allocation of resources and security of supply, the report describes demand response from a micro-economic perspective and provides empirical observations and case studies. The report aims at evaluating the benefits from demand response; however, only the elements contributing to an overall value are presented. In addition, the analyses are limited to benefits for society, and the costs of obtaining demand response are not considered.

  16. Evaluation of the Olympus AU-510 analyser.

    Science.gov (United States)

    Farré, C; Velasco, J; Ramón, F

    1991-01-01

    The selective multitest Olympus AU-510 analyser was evaluated according to the recommendations of the Comision de Instrumentacion de la Sociedad Española de Quimica Clinica and the European Committee for Clinical Laboratory Standards. The evaluation was carried out in two stages: an examination of the analytical units and then an evaluation under routine work conditions. The operational characteristics of the system were also studied. The first stage included a photometric study: depending on the absorbance, the inaccuracy varies from +0.5% to -0.6% at 405 nm and from -5.6% to 10.6% at 340 nm; the imprecision ranges between -0.22% and 0.56% at 405 nm and between 0.09% and 2.74% at 340 nm. Linearity was acceptable, apart from a very low absorbance for NADH at 340 nm, and the imprecision of the serum sample pipetter was satisfactory. Twelve serum analytes were studied under routine conditions: glucose, urea, urate, cholesterol, triglycerides, total bilirubin, creatinine, phosphate, iron, aspartate aminotransferase, alanine aminotransferase and gamma-glutamyl transferase. The within-run imprecision (CV%) ranged from 0.67% for phosphate to 2.89% for iron, and the between-run imprecision from 0.97% for total bilirubin to 7.06% for iron. There was no carryover in a study of the serum sample pipetter. Carry-over studies with the reagent and sample pipetters showed some cross-contamination in the iron assay.

  17. Parametric analyses of fusion-fission systems

    International Nuclear Information System (INIS)

    After a short review of the nuclear reactions relevant to fusion-fission systems, the various types of blankets and characteristic model cases are presented. The fusion-fission system is modelled by its energy flow diagram. The system components and the system as a whole are characterized by 'component parameters' and 'system parameters', all of which are energy ratios. A cost estimate is given for the net energy delivered by the system, and a collection of formulas for the various energies flowing in the system, in terms of the thermal energy delivered by the fusion part, is presented. For sensitivity analysis four reference cases are defined, which combine two plasma confinement schemes (mirror and tokamak) with two fissile fuel cycles (thorium-uranium and uranium-plutonium). The sensitivity of the critical plasma energy multiplication, of the circulating energy fraction, and of the energy cost with respect to changes of the component parameters is analysed. For the mirror case only superconducting magnets are considered, whereas the two tokamak cases take into account both superconducting and normal-conducting coils. A section presenting relations between the plasma energy multiplication and the confinement parameter nτE of driven tokamak plasmas is added for reference. The conclusions summarize the results which could be obtained within the framework of energy balances, cost estimates and their parametric sensitivities. This is supplemented by a list of those issues which lie beyond this scope but have to be taken into account when assessments of fusion-fission systems are made.

  18. WIND SPEED AND ENERGY POTENTIAL ANALYSES

    Directory of Open Access Journals (Sweden)

    A. TOKGÖZLÜ

    2013-01-01

    This paper provides a case study on the application of wavelet techniques to analyze wind speed and energy (renewable and environmentally friendly energy). Solar and wind are main sources of energy that give farmers the potential to transfer the kinetic energy captured by a windmill to pumping water, drying crops, heating greenhouses, rural electrification or cooking. Larger wind turbines (over 1 MW) can pump enough water for small-scale irrigation. This study initiates the data-gathering process for wavelet analyses of different scale effects and their role in wind speed and direction variations. The wind data gathering system is mounted at latitude 37° 50' N, longitude 30° 33' E, at a height of 1200 m above mean sea level, on a hill near the Süleyman Demirel University campus. Ten-minute average values of wind speed and direction at two levels (10 m and 30 m above ground level) were recorded by a data logger between July 2001 and February 2002. Wind speed values ranged between 0 m/s and 54 m/s. The annual mean speed value is 4.5 m/s at the 10 m level. Prevalent wind
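
    As a hedged sketch of this kind of multi-scale decomposition (using PyWavelets rather than whatever toolchain the authors used, and a synthetic series standing in for the 10-minute records):

        import numpy as np
        import pywt

        rng = np.random.default_rng(4)
        t = np.arange(4096)
        # toy 10-min mean wind speeds (m/s): a daily cycle plus gusty noise;
        # 144 ten-minute intervals make up one day
        speed = 4.5 + 2.0 * np.sin(2 * np.pi * t / 144) + rng.normal(0, 1.0, t.size)
        coeffs = pywt.wavedec(speed, "db4", level=5)       # multi-level DWT
        energy = [float(np.sum(c ** 2)) for c in coeffs]   # energy per scale band
        print(energy)   # the large low-frequency share reflects the slow cycle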

  19. Comparative analyses of bidirectional promoters in vertebrates

    Directory of Open Access Journals (Sweden)

    Taylor James

    2008-05-01

    Background: Orthologous genes with deep phylogenetic histories are likely to retain similar regulatory features. In this report we utilize orthology assignments for pairs of genes co-regulated by bidirectional promoters to map the ancestral history of the promoter regions. Results: Our mapping of bidirectional promoters from humans to fish shows that many such promoters emerged after the divergence of chickens and fish. Furthermore, annotations of promoters in deep phylogenies enable detection of missing data or assembly problems present in higher vertebrates. The functional importance of bidirectional promoters is indicated by selective pressure to maintain the arrangement of genes regulated by the promoter over long evolutionary time spans. Characteristics unique to bidirectional promoters are further elucidated using a technique for unsupervised classification, known as ESPERR. Conclusion: Results of these analyses will aid in our understanding of the evolution of bidirectional promoters, including whether the regulation of two genes evolved as a consequence of their proximity or if function dictated their co-regulation.

  1. Field analyses of tritium at environmental levels

    Energy Technology Data Exchange (ETDEWEB)

    Hofstetter, K.J.; Cable, P.R.; Beals, D.M

    1999-02-11

    An automated, remote system to analyze tritium in aqueous solutions at environmental levels has been tested and has demonstrated laboratory-quality tritium analysis capability in near real time. The field deployable tritium analysis system (FDTAS) consists of a novel multi-port autosampler, an on-line water purification system, and a prototype stop-flow liquid scintillation counter (LSC) which can be remotely controlled for unmanned operation. Backgrounds of ∼1.5 counts/min in the tritium channel are routinely measured, with a tritium detection efficiency of ∼25% for the custom 11 ml cell. A detection limit of <0.3 pCi/ml has been achieved for 100-min counts using a 50:50 mixture of sample and cocktail. To assess the long-term performance characteristics of the FDTAS, a composite sampler was installed on the Savannah River, downstream of the Savannah River Site, and collected repetitive 12-hour composite samples over a 14-day period. The samples were analyzed using the FDTAS and, in the laboratory, using a standard bench-top LSC. The results of the tritium analyses by the FDTAS and by the laboratory LSC were consistent for comparable counting times at the typical river tritium background levels (∼1 pCi/ml)

  2. ANALYSES AND INFLUENCES OF GLAZED BUILDING ENVELOPES

    Directory of Open Access Journals (Sweden)

    Sabina Jordan

    2011-01-01

    The article presents the results of an analytical study of the functioning of glazing at two different yet interacting levels: at the level of the building as a whole, and at that of glazing as a building element. At the building level, analyses were performed on a sample of high-rise business buildings in Slovenia, where the glazing's share of the building envelope was calculated, and estimates of the proportion of shade provided by external blinds were made. It is shown that, especially in the case of modern buildings with large proportions of glazing and buildings with no shading devices, careful glazing design is needed, together with a sound knowledge of energy performance. In the second part of the article, the energy balance values relating to selected types of glazing are presented, including solar control glazing. The paper demonstrates the need for a holistic energy approach to glazing problems, as well as how different types of glazing can be methodically compared, thus improving the design of sustainability-orientated buildings.

  3. Statistical analyses of extreme food habits

    International Nuclear Information System (INIS)

    This report summarizes the results of the project ''Statistical analyses of extreme food habits'', which was commissioned by the National Office for Radiation Protection as a contribution to the amendment of the ''General Administrative Regulation to paragraph 45 of the Decree on Radiation Protection: determination of the radiation exposure due to emission of radioactive substances from facilities of nuclear technology''. Its aim is to show whether the calculation of the radiation ingested via food intake by 95% of the population, as planned in a provisional draft, overestimates the true exposure, and, if such an overestimation exists, to determine its magnitude. It was possible to prove the existence of this overestimation, but its magnitude could only be roughly estimated. To identify its real extent, it is necessary to include the specific activities of the nuclides, which were not available for this investigation. In addition, the report shows how the amounts of food consumed from different food groups influence each other, and which connections between these amounts should be taken into account in order to estimate the radiation exposure as precisely as possible.

  4. Kinematic gait analyses in healthy Golden Retrievers

    Directory of Open Access Journals (Sweden)

    Gabriela C.A. Silva

    2014-12-01

    Kinematic analysis concerns the relative movement between rigid bodies and finds application in gait analysis and other body movements; interpretation of its data when changes occur helps determine the choice of treatment to be instituted. The objective of this study was to standardize the gait of healthy Golden Retriever dogs in order to assist in the diagnosis and treatment of musculoskeletal disorders. We used a kinematic analysis system to analyse the gait of seven female Golden Retriever dogs, aged between 2 and 4 years, weighing 21.5 to 28 kg, and clinically normal. Flexion and extension were described for the shoulder, elbow, carpal, hip, femorotibial and tarsal joints. The gait was characterized as lateral, and the hypothesis of normality was accepted for all variables except the stance phase of the hip and elbow, considering a confidence level of 95% and a significance level of α = 0.05. Variations were attributed to displacement of the marker strips during movement and to the duplicated number of evaluations. Kinematic analysis proved to be a consistent method for evaluating movement during the canine gait, and the data can be used in the diagnosis and evaluation of canine gait, in comparison with other studies, and in the treatment of dogs with musculoskeletal disorders.

  5. Reliability Analyses of Groundwater Pollutant Transport

    Energy Technology Data Exchange (ETDEWEB)

    Dimakis, Panagiotis

    1997-12-31

    This thesis develops a probabilistic finite element model for the analysis of groundwater pollution problems. Two computer codes were developed: (1) a finite element code solving the two-dimensional steady-state equations of groundwater flow and pollutant transport, and (2) a first-order reliability method code that can perform a probabilistic analysis of any given analytical or numerical equation. The two codes were connected into one model, PAGAP (Probability Analysis of Groundwater And Pollution). PAGAP can be used to obtain (1) the probability that the concentration at a given point at a given time will exceed a specified value, (2) the probability that the maximum concentration at a given point will exceed a specified value and (3) the probability that the residence time at a given point will exceed a specified period. PAGAP can be used as a tool for assessment purposes and risk analyses, for instance to assess the efficiency of a proposed remediation technique or to study the effects of parameter distributions for a given problem (sensitivity study). The model has been applied to study the largest self-sustained, precipitation-controlled aquifer in northern Europe, which underlies Oslo's new major airport. 92 refs., 187 figs., 26 tabs.
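
    PAGAP couples the finite element solver with a first-order reliability method, which is not reproduced here; as a crude stand-in for question (1), the sketch below Monte Carlo samples a 1-D analytical advection-dispersion solution for the probability that a concentration threshold is exceeded (all parameter distributions are invented):

        import numpy as np
        from scipy.special import erfc

        def conc(x, t, v, D, c0=1.0):
            """1-D advection-dispersion, continuous source at x=0
            (leading erfc term of the standard solution)."""
            return 0.5 * c0 * erfc((x - v * t) / (2.0 * np.sqrt(D * t)))

        rng = np.random.default_rng(5)
        n = 20000
        v = rng.lognormal(np.log(0.5), 0.3, n)    # seepage velocity, m/day
        D = rng.lognormal(np.log(2.0), 0.5, n)    # dispersion coeff., m^2/day
        c = conc(x=100.0, t=365.0, v=v, D=D)      # at a well 100 m away, 1 year
        print((c > 0.1).mean())   # P(concentration exceeds 10% of the source)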

  6. PRECLOSURE CONSEQUENCE ANALYSES FOR LICENSE APPLICATION

    Energy Technology Data Exchange (ETDEWEB)

    S. Tsai

    2005-01-12

    Radiological consequence analyses are performed for potential releases from normal operations in surface and subsurface facilities and from Category 1 and Category 2 event sequences during the preclosure period. Surface releases from normal repository operations are primarily from radionuclides released from opening a transportation cask during dry transfer operations of spent nuclear fuel (SNF) in Dry Transfer Facility 1 (DTF 1), Dry Transfer Facility 2 (DTF 2), the Canister Handling facility (CHF), or the Fuel Handling Facility (FHF). Subsurface releases from normal repository operations are from resuspension of waste package surface contamination and neutron activation of ventilated air and silica dust from host rock in the emplacement drifts. The purpose of this calculation is to demonstrate that the preclosure performance objectives, specified in 10 CFR 63.111(a) and 10 CFR 63.111(b), have been met for the proposed design and operations in the geologic repository operations area. Preclosure performance objectives are discussed in Section 6.2.3 and are summarized in Tables 1 and 2.

  7. Seismic Soil-Structure Interaction Analyses of a Deeply Embedded Model Reactor – SASSI Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Nie J.; Braverman J.; Costantino, M.

    2013-10-31

    This report summarizes the SASSI analyses of a deeply embedded reactor model performed by BNL and CJC and Associates, as part of the seismic soil-structure interaction (SSI) simulation capability project for the NEAMS (Nuclear Energy Advanced Modeling and Simulation) Program of the Department of Energy. The SASSI analyses included three cases: 0.2 g, 0.5 g, and 0.9 g, all of which refer to nominal peak accelerations at the top of the bedrock. The analyses utilized the modified subtraction method (MSM) for performing the seismic SSI evaluations. Each case consisted of two analyses: input motion in one horizontal direction (X) and input motion in the vertical direction (Z), both of which utilized the same in-column input motion. Besides providing SASSI results for use in comparison with the time-domain SSI results obtained using the DIABLO computer code, this study also leads to the recognition that the frequency-domain method should be modernized so that it can better serve its mission-critical role in the analysis and design of nuclear power plants.

  8. 'Red Flag' Predictions

    DEFF Research Database (Denmark)

    Hallin, Carina Antonia; Andersen, Torben Juul; Tveterås, Sigbjørn

    -generation prediction markets and outline its unique features as a third-generation prediction market. It is argued that frontline employees gain deep insights when they execute operational activities on an ongoing basis in the organization. The experiential learning from close interaction with internal and external...... stakeholders provides unique insights not otherwise available to senior management. We outline a methodology to agglomerate these insights in a performance barometer as an important source for problem identification and innovation....

  9. Stuck pipe prediction

    KAUST Repository

    Alzahrani, Majed

    2016-03-10

    Disclosed are various embodiments for a prediction application to predict a stuck pipe. A linear regression model is generated from hook load readings at corresponding bit depths. A current hook load reading at a current bit depth is compared with a normal hook load reading from the linear regression model. A current hook load greater than a normal hook load for a given bit depth indicates the likelihood of a stuck pipe.
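
    A minimal sketch of the disclosed comparison (the data, units and the tolerance factor k below are invented; the patent text here does not specify them):

        import numpy as np

        def stuck_pipe_flag(depths, hook_loads, current_depth, current_load, k=1.1):
            """Fit hook load vs. bit depth with a linear regression and flag
            a current reading exceeding the predicted normal load by k."""
            slope, intercept = np.polyfit(depths, hook_loads, 1)
            normal = slope * current_depth + intercept
            return current_load > k * normal, normal

        depths = np.array([500.0, 800.0, 1100.0, 1400.0, 1700.0])  # m
        loads = np.array([80.0, 95.0, 110.0, 126.0, 140.0])        # tonnes
        print(stuck_pipe_flag(depths, loads, 1900.0, 185.0))  # (True, ~150)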

  10. Use of EBSD Data in Numerical Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Becker, R; Wiland, H

    2000-01-14

    obtained for comparison with the model predictions. More recent work has taken advantage of automated data collection on deformed specimens as a means of collecting detailed and spatially correlated data for model validation. Although it will not be discussed in detail here, another area in which EBSD data is having a great impact is on recrystallization modeling. EBSD techniques can be used to collect data for quantitative microstructural analysis. This data can be used to infer growth kinetics of specific orientations, and this information can be synthesized into more accurate grain growth or recrystallization models. Another role which EBSD techniques may play is in determining initial structures for recrystallization models. A realistic starting structure is vital for evaluating the models, and attempts at predicting realistic structures with finite element simulations are not yet successful. As methodologies and equipment resolution continue to improve, it is possible that measured structures will serve as input for recrystallization models. Simulations have already been run using information obtained manually from a TEM.

  11. Operational Dust Prediction

    Science.gov (United States)

    Benedetti, Angela; Baldasano, Jose M.; Basart, Sara; Benincasa, Francesco; Boucher, Olivier; Brooks, Malcolm E.; Chen, Jen-Ping; Colarco, Peter R.; Gong, Sunlin; Huneeus, Nicolas; Jones, Luke; Lu, Sarah; Menut, Laurent; Morcrette, Jean-Jacques; Mulcahy, Jane; Nickovic, Slobodan; Garcia-Pando, Carlos P.; Reid, Jeffrey S.; Sekiyama, Thomas T.; Tanaka, Taichu Y.; Terradellas, Enric; Westphal, Douglas L.; Zhang, Xiao-Ye; Zhou, Chun-Hong

    2014-01-01

    Over the last few years, numerical prediction of dust aerosol concentration has become prominent at several research and operational weather centres due to growing interest from diverse stakeholders, such as solar energy plant managers, health professionals, aviation and military authorities and policymakers. Dust prediction in numerical weather prediction-type models faces a number of challenges owing to the complexity of the system. At the centre of the problem is the vast range of scales required to fully account for all of the physical processes related to dust. Another limiting factor is the paucity of suitable dust observations available for model evaluation and assimilation. This chapter discusses in detail numerical prediction of dust with examples from systems that are currently providing dust forecasts in near real-time or are part of international efforts to establish daily provision of dust forecasts based on multi-model ensembles. The various models are introduced and described, along with an overview of the importance of dust prediction activities and a historical perspective. Assimilation and evaluation aspects of dust prediction are also discussed.

  12. Analyse textuelle des discours: Niveaux ou plans d'analyse

    Directory of Open Access Journals (Sweden)

    Jean-Michel Adam

    2012-12-01

    The article deals with the theory of Textual Discourse Analysis (ATD, from the French Analyse Textuelle des Discours), starting from a re-examination of the Brazilian translation of La linguistique textuelle: introduction à l'analyse textuelle des discours (Cortez, 2008). ATD is framed by three preliminary observations: text linguistics is one of the disciplines of discourse analysis; the text is the object of analysis of ATD; and as soon as there is a text, that is, recognition of the fact that a sequence of utterances forms a communicative whole, there is an effect of genericity, that is, the inscription of this sequence of utterances within a class of discourses. The theoretical model of ATD is clarified by revisiting its schema 4, in which eight levels of analysis are represented. ATD is approached from the angle of a double requirement (theoretical reasons as well as methodological and didactic ones lead to these levels), and the five planes or levels of textual analysis are detailed and illustrated. Finally, parts of the work are taken up and expanded, with further analyses in which new theoretical aspects are detailed.

  13. Genome-wide analyses of small noncoding RNAs in streptococci

    Directory of Open Access Journals (Sweden)

    Nadja Patenge

    2015-05-01

    Streptococci represent a diverse group of Gram-positive bacteria which colonize a wide range of hosts among animals and humans. Streptococcal species occur as commensal as well as pathogenic organisms. Many of the pathogenic species can cause severe, invasive infections in their hosts, leading to high morbidity and mortality. The consequence is tremendous suffering for humans and livestock, besides the significant financial burden in the agricultural and healthcare sectors. Environmentally stimulated and tightly controlled expression of virulence factor genes is of fundamental importance for streptococcal pathogenicity. Bacterial small noncoding RNAs (sRNAs) modulate the expression of genes involved in stress response, sugar metabolism, surface composition, and other properties that are related to bacterial virulence. Even though the regulatory character is shared by this class of RNAs, variation at the molecular level results in a high diversity of functional mechanisms. Knowledge about the role of sRNAs in streptococci is still limited, but in recent years genome-wide screens for sRNAs have been conducted in an increasing number of species. Bioinformatic prediction approaches have been employed, as well as expression analyses by classical array techniques or next-generation sequencing. This review will give an overview of whole-genome screens for sRNAs in streptococci, with a focus on describing the different methods and comparing their outcomes, considering sRNA conservation among species, functional similarities, and relevance for streptococcal infection.

  14. Rock penetration : finite element sensitivity and probabilistic modeling analyses.

    Energy Technology Data Exchange (ETDEWEB)

    Fossum, Arlo Frederick

    2004-08-01

    This report summarizes numerical analyses conducted to assess the relative importance on penetration depth calculations of rock constitutive model physics features representing the presence of microscale flaws such as porosity and networks of microcracks and rock mass structural features. Three-dimensional, nonlinear, transient dynamic finite element penetration simulations are made with a realistic geomaterial constitutive model to determine which features have the most influence on penetration depth calculations. A baseline penetration calculation is made with a representative set of material parameters evaluated from measurements made from laboratory experiments conducted on a familiar sedimentary rock. Then, a sequence of perturbations of various material parameters allows an assessment to be made of the main penetration effects. A cumulative probability distribution function is calculated with the use of an advanced reliability method that makes use of this sensitivity database, probability density functions, and coefficients of variation of the key controlling parameters for penetration depth predictions. Thus the variability of the calculated penetration depth is known as a function of the variability of the input parameters. This simulation modeling capability should impact significantly the tools that are needed to design enhanced penetrator systems, support weapons effects studies, and directly address proposed HDBT defeat scenarios.

  15. Microstructural and compositional analyses of GaN-based nanostructures

    Energy Technology Data Exchange (ETDEWEB)

    Pretorius, Angelika; Mueller, Knut; Rosenauer, Andreas [Section Electron Microscopy, Institute of Solid State Physics, University of Bremen, Otto-Hahn-Allee 1, 28359 Bremen (Germany); Schmidt, Thomas; Falta, Jens [Section Surface Physics, Institute of Solid State Physics, University of Bremen, Otto-Hahn-Allee 1, 28359 Bremen (Germany); Aschenbrenner, Timo; Yamaguchi, Tomohiro; Dartsch, Heiko; Hommel, Detlef [Section Semiconductor Epitaxy, Institute of Solid State Physics, University of Bremen, Otto-Hahn-Allee 1, 28359 Bremen (Germany); Kuebel, Christian [Institute of Nanotechnology, Karlsruher Institute of Technology, Hermann-von-Helmholtz-Platz 1, 76344 Eggenstein-Leopoldshafen (Germany)

    2011-08-15

    Composition and microstructure of GaN-based island structures and distributed Bragg reflectors (DBRs) were investigated with transmission electron microscopy (TEM). We analysed free-standing InGaN islands and islands capped with GaN. Growth of the islands performed by molecular beam epitaxy (MBE) and metal organic vapour phase epitaxy (MOVPE) resulted in different microstructures. The islands grown by MBE were plastically relaxed. Cap layer deposition resulted in a rapid dissolution of the islands already at early stages of cap layer growth. These findings are confirmed by grazing-incidence X-ray diffraction (GIXRD). In contrast, the islands grown by MOVPE relax only elastically. Strain state analysis (SSA) revealed that the indium concentration increases towards the tips of the islands. For an application as quantum dots, the islands must be embedded into DBRs. Structure and composition of Al(y)Ga(1-y)N/GaN Bragg reflectors on top of an AlGaN buffer layer and In(x)Al(1-x)N/GaN Bragg reflectors on top of a GaN buffer layer were investigated. Specifically, structural defects such as threading dislocations (TDs) and inversion domains (IDs) were studied, and we investigated thicknesses, interfaces and interface roughnesses of the layers. As the peak reflectivities of the investigated DBRs do not reach the theoretical predictions, possible reasons are discussed.

  16. Review of Approximate Analyses of Sheet Forming Processes

    Science.gov (United States)

    Weiss, Matthias; Rolfe, Bernard; Yang, Chunhui; de Souza, Tim; Hodgson, Peter

    2011-08-01

    Approximate models are often used for the following purposes: • in on-line control systems of metal forming processes where calculation speed is critical; • to obtain quick, quantitative information on the magnitude of the main variables in the early stages of process design; • to illustrate the role of the major variables in the process; • as an initial check on numerical modelling; and • as a basis for quick calculations on processes in teaching and training packages. The models often share many similarities; for example, an arbitrary geometric assumption of deformation giving a simplified strain distribution, simple material property descriptions—such as an elastic, perfectly plastic law—and mathematical short cuts such as a linear approximation of a polynomial expression. In many cases, the output differs significantly from experiment and performance or efficiency factors are developed by experience to tune the models. In recent years, analytical models have been widely used at Deakin University in the design of experiments and equipment and as a pre-cursor to more detailed numerical analyses. Examples that are reviewed in this paper include deformation of sandwich material having a weak, elastic core, load prediction in deep drawing, bending of strip (particularly of ageing steel where kinking may occur), process analysis of low-pressure hydroforming of tubing, analysis of the rejection rates in stamping, and the determination of constitutive models by an inverse method applied to bending tests.

  17. Non-Statistical Methods of Analysing of Bankruptcy Risk

    Directory of Open Access Journals (Sweden)

    Pisula Tomasz

    2015-06-01

    The article focuses on assessing the effectiveness of a non-statistical approach to bankruptcy modelling in enterprises operating in the logistics sector. In order to describe the issue more comprehensively, the aforementioned prediction of the possible negative results of business operations was carried out for companies functioning in the Polish region of Podkarpacie, and in Slovakia. The bankruptcy predictors selected for the assessment of companies operating in the logistics sector included 28 financial indicators characterizing these enterprises in terms of their financial standing and management effectiveness. The purpose of the study was to identify factors (models) describing the bankruptcy risk in enterprises in the context of their forecasting effectiveness in a one-year and two-year time horizon. In order to assess their practical applicability the models were carefully analysed and validated. The usefulness of the models was assessed in terms of their classification properties, and the capacity to accurately identify enterprises at risk of bankruptcy and healthy companies, as well as proper calibration of the models to the data from training sample sets.

  18. Quantifying Fire Cycle from Dendroecological Records Using Survival Analyses

    Directory of Open Access Journals (Sweden)

    Dominic Cyr

    2016-06-01

    Quantifying fire regimes in the boreal forest ecosystem is crucial for understanding its past and present dynamics, as well as for predicting its future dynamics. Survival analyses have often been used to estimate the fire cycle in eastern Canada because they make it possible to take into account the censored information that is made prevalent by the typically long fire return intervals and the limited scope of the dendroecological methods that are used to quantify them. Here, we assess how the true length of the fire cycle, short-term temporal variations in fire activity, and the sampling effort affect the accuracy and precision of estimates obtained from two types of parametric survival models, the Weibull and the exponential models, and one non-parametric model obtained with the Cox regression. We then apply those results to a case area located in eastern Canada. Our simulation experiment confirms some documented concerns regarding the detrimental effects of temporal variations in fire activity on parametric estimation of the fire cycle. Cox regressions appear to provide the most accurate and robust estimator, being by far the least affected by temporal variations in fire activity. The Cox-based estimate of the fire cycle for the last 300 years in the case study area is 229 years (CI95: 162-407), compared with the likely overestimated 319 years obtained with the commonly used exponential model.
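
    For the exponential model in particular, the censoring bookkeeping is simple enough to show in a few lines (the intervals below are made up, and the study's preferred Cox approach is not reproduced): the maximum-likelihood fire cycle is the total time at risk divided by the number of observed fires, with censored intervals contributing time but no event.

        import numpy as np

        def exponential_fire_cycle(intervals, observed):
            """MLE of the fire cycle (mean fire return interval) under an
            exponential model with right-censored intervals."""
            intervals = np.asarray(intervals, float)
            events = np.asarray(observed, bool)
            return intervals.sum() / events.sum()

        # toy time-since-fire records (years); observed=False means censored
        intervals = [120, 260, 75, 310, 180, 240, 90, 300]
        observed = [True, False, True, False, True, False, True, False]
        print(exponential_fire_cycle(intervals, observed))   # ~394 years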

  19. Taxometric analyses of paranoid and schizoid personality disorders.

    Science.gov (United States)

    Ahmed, Anthony Olufemi; Green, Bradley Andrew; Buckley, Peter Francis; McFarland, Megan Elizabeth

    2012-03-30

    There remains debate about whether personality disorders (PDs) are better conceptualized as categorical, reflecting discontinuity from normal personality; or dimensional, existing on a continuum of severity with normal personality traits. Evidence suggests that most PDs are dimensional but there is a lack of consensus about the structure of Cluster A disorders. Taxometric methods are adaptable to investigating the taxonic status of psychiatric disorders. The current study investigated the latent structure of paranoid and schizoid PDs in an epidemiological sample (N=43,093) drawn from the National Epidemiological Survey on Alcohol and Related Conditions (NESARC) using taxometric analyses. The current study used taxometric methods to analyze three indicators of paranoid PD - mistrust, resentment, and functional disturbance - and three indicators of schizoid PD - emotional detachment, social withdrawal, and functional disturbance - derived factor analytically. Overall, taxometrics supported a dimensional rather than taxonic structure for paranoid and schizoid PDs through examination of taxometric graphs and comparative curve fit indices. Dimensional models of paranoid and schizoid PDs better predicted social functioning, role-emotional, and mental health scales in the survey than categorical models. Evidence from the current study supports recent efforts to represent paranoid and schizoid PDs as well as other PDs along broad personality dimensions.

  20. Application of Polar Cap (PC) indices in analyses and forecasts of geophysical conditions

    Science.gov (United States)

    Stauning, Peter

    2016-07-01

    The Polar Cap (PC) indices can be considered to represent the input of power from the solar wind to the Earth's magnetosphere. The indices have been used to analyse interplanetary electric fields, effects of solar wind pressure pulses, cross polar cap voltages and polar cap diameter, ionospheric Joule heating, and other issues of polar cap dynamics. The PC indices have also been used to predict auroral electrojet intensities and global auroral power as well as ring current intensities. For specific space weather purposes the PC indices can be used to forecast substorm development and predict associated power line disturbances in the subauroral regions. The presentation outlines the general background for applying the PC indices in analyses or forecasts of solar wind-magnetosphere-ionosphere interactions and provides illustrative examples of the use of the Polar Cap indices in specific cases.

  1. Integrated Field Analyses of Thermal Springs

    Science.gov (United States)

    Shervais, K.; Young, B.; Ponce-Zepeda, M. M.; Rosove, S.

    2011-12-01

    A group of undergraduate researchers, through the SURE internship offered by the Southern California Earthquake Center (SCEC), have examined thermal springs in southern Idaho and northern Utah, as well as mud volcanoes in the Salton Sea, California. We used an integrated approach to estimate the setting and maximum temperature, including water chemistry, iPad-based image and database management, microbiology, and gas analyses with a modified Giggenbach sampler. All springs were characterized using GISRoam (tmCogent3D). We are performing geothermometry calculations as well as comparisons with temperature gradient data on the results, while also analyzing biological samples. Analyses include water temperature, pH, electrical conductivity, and TDS measured in the field. Each sample is sealed and chilled and delivered to a water lab within 12 hours. Temperatures are continuously monitored with the use of Solinst Levelogger Juniors. Through partnership with a local community college geology club, we receive results on a monthly basis and are able to process initial data earlier in order to evaluate data over a longer time span. The springs and mudpots contained microbial organisms, which were analyzed using methods of single colony isolation, polymerase chain reaction, and DNA sequencing, showing the impact of the organisms on the springs or vice versa. Soon we will collect gas samples at sites that show signs of gas. These will be taken using a hybrid of the Giggenbach method and our own methods. Drawing gas samples has proven a challenge; however, we devised a method to draw out gas samples utilizing the Giggenbach flask, transferring samples to glass blood sample tubes, replacing the NaOH in the Giggenbach flask, and evacuating it in the field for multiple samples using a vacuum pump. We also use a floating platform devised to carry and lower a levelogger, and an in-line fuel filter from a tractor in order to keep mud from contaminating the equipment. The use of raster

  2. On study design in neuroimaging heritability analyses

    Science.gov (United States)

    Koran, Mary Ellen; Li, Bo; Jahanshad, Neda; Thornton-Wells, Tricia A.; Glahn, David C.; Thompson, Paul M.; Blangero, John; Nichols, Thomas E.; Kochunov, Peter; Landman, Bennett A.

    2014-03-01

    Imaging genetics is an emerging methodology that combines genetic information with imaging-derived metrics to understand how genetic factors impact observable structural, functional, and quantitative phenotypes. Many of the most well-known genetic studies are based on Genome-Wide Association Studies (GWAS), which use large populations of related or unrelated individuals to associate traits and disorders with individual genetic factors. Merging imaging and genetics may potentially lead to improved power of association in GWAS because imaging traits may be more sensitive phenotypes, being closer to underlying genetic mechanisms, and their quantitative nature inherently increases power. We are developing SOLAR-ECLIPSE (SE) imaging genetics software which is capable of performing genetic analyses with both large-scale quantitative trait data and family structures of variable complexity. This program can estimate the contribution of genetic commonality among related subjects to a given phenotype, and essentially answer the question of whether or not the phenotype is heritable. This central factor of interest, heritability, offers bounds on the direct genetic influence over observed phenotypes. In order for a trait to be a good phenotype for GWAS, it must be heritable: at least some proportion of its variance must be due to genetic influences. A variety of family structures are commonly used for estimating heritability, yet the variability and biases for each as a function of the sample size are unknown. Herein, we investigate the ability of SOLAR to accurately estimate heritability models based on imaging data simulated using Monte Carlo methods implemented in R. We characterize the bias and the variability of heritability estimates from SOLAR as a function of sample size and pedigree structure (including twins, nuclear families, and nuclear families with grandparents).
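
    SOLAR estimates heritability by variance-components maximum likelihood, which is not reproduced here; a back-of-envelope analogue for the twin designs mentioned is Falconer's formula, h2 = 2(rMZ - rDZ), sketched below with simulated phenotypes (all numbers invented):

        import numpy as np

        rng = np.random.default_rng(6)

        def simulate_pairs(n, r):
            """Bivariate normal twin-pair phenotypes with correlation r."""
            return rng.multivariate_normal([0.0, 0.0], [[1.0, r], [r, 1.0]], size=n)

        def falconer_h2(mz, dz):
            """Falconer's estimate: h2 = 2 * (r_MZ - r_DZ)."""
            r_mz = np.corrcoef(mz[:, 0], mz[:, 1])[0, 1]
            r_dz = np.corrcoef(dz[:, 0], dz[:, 1])[0, 1]
            return 2.0 * (r_mz - r_dz)

        h2_true = 0.6
        mz = simulate_pairs(400, h2_true)         # MZ twins share all genes
        dz = simulate_pairs(400, 0.5 * h2_true)   # DZ twins share half on average
        print(falconer_h2(mz, dz))                # scatters around 0.6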

  4. Pipeline for macro- and microarray analyses.

    Science.gov (United States)

    Vicentini, R; Menossi, M

    2007-05-01

    The pipeline for macro- and microarray analyses (PMmA) is a set of scripts with a web interface developed to analyze DNA array data generated by array image quantification software. PMmA is designed for use with single- or double-color array data and to work as a pipeline in five classes (data format, normalization, data analysis, clustering, and array maps). It can also be used as a plugin in the BioArray Software Environment, an open-source database for array analysis, or used in a local version of the web service. All scripts in PMmA were developed in the PERL programming language and statistical analysis functions were implemented in the R statistical language. Consequently, our package is platform-independent software. Our algorithms can correctly select almost 90% of the differentially expressed genes, showing a superior performance compared to other methods of analysis. The pipeline software has been applied to public macroarray data of 1536 expressed sequence tags from sugarcane exposed to cold for 3 to 48 h. PMmA identified thirty cold-responsive genes previously unidentified in this public dataset. Fourteen genes were up-regulated, two had a variable expression, and the other fourteen were down-regulated in the treatments. These new findings are certainly a consequence of using a superior statistical analysis approach, since the original study did not take into account the dependence of data variability on the average signal intensity of each gene. The web interface, supplementary information, and the package source code are available, free, to non-commercial users at http://ipe.cbmeg.unicamp.br/pub/PMmA. PMID:17464422
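
    The key statistical point, that the variability of a gene's log-ratio depends on its average signal intensity, can be illustrated with a short sketch. This is a generic intensity-binned z-score, assumed here for illustration; it is not the actual PMmA implementation, which is written in PERL and R.

        import numpy as np

        def intensity_dependent_z(log_ratio, mean_intensity, n_bins=20):
            """Score each gene against the spread of genes with similar average
            intensity, instead of assuming one global variance for all genes."""
            order = np.argsort(mean_intensity)
            z = np.empty_like(log_ratio)
            for idx in np.array_split(order, n_bins):
                mu, sd = log_ratio[idx].mean(), log_ratio[idx].std(ddof=1)
                z[idx] = (log_ratio[idx] - mu) / sd
            return z

        # Toy data: noise shrinks as average intensity grows, as on real arrays.
        rng = np.random.default_rng(0)
        a = rng.uniform(6, 16, 1536)             # average log-intensity per EST
        m = rng.normal(0, 2.0 / np.sqrt(a))      # log-ratio, intensity-dependent noise
        print("candidate genes:", np.sum(np.abs(intensity_dependent_z(m, a)) > 2.5))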

  5. Seismic criteria studies and analyses. Quarterly progress report No. 3. [LMFBR

    Energy Technology Data Exchange (ETDEWEB)

    1975-01-03

    Information is presented concerning the extent to which vibratory motions at the subsurface foundation level might differ from motions at the ground surface and the effects of the various subsurface materials on the overall Clinch River Breeder Reactor site response; seismic analyses of LMFBR type reactors to establish analytical procedures for predicting structure stresses and deformations; and aspects of the current technology regarding the representation of energy losses in nuclear power plants as equivalent viscous damping.

  6. Evaluation of mixed dentition analyses in north Indian population: A comparative study

    OpenAIRE

    Ravi Kumar Goyal; Vijay P Sharma; Pradeep Tandon; Amit Nagar; Gyan P Singh

    2014-01-01

    Introduction: Mixed dentition regression-equation analyses (Moyers, Tanaka-Johnston) are based on European populations, so the reliability of these methods in other populations is questionable. Materials and Methods: The present study was conducted on a total of 260 study models, in two phases. In the first phase, linear regression equations were derived. In the second phase, the actual values of the sum of the mesiodistal widths of the canine and first and second premolars were compared with the predicte...

  7. Analyses of hypomethylated oil palm gene space.

    Science.gov (United States)

    Low, Eng-Ti L; Rosli, Rozana; Jayanthi, Nagappan; Mohd-Amin, Ab Halim; Azizi, Norazah; Chan, Kuang-Lim; Maqbool, Nauman J; Maclean, Paul; Brauning, Rudi; McCulloch, Alan; Moraga, Roger; Ong-Abdullah, Meilina; Singh, Rajinder

    2014-01-01

    Demand for palm oil has been increasing by an average of ∼8% annually over the past decade, and palm oil currently accounts for about 59% of the world's vegetable oil market. This drives the need to increase palm oil production. Nevertheless, due to the increasing need for sustainable production, it is imperative to increase productivity rather than the area cultivated. Studies on the oil palm genome are essential to help identify genes or markers that are associated with important processes or traits, such as flowering, yield and disease resistance. To achieve this, 294,115 and 150,744 sequences from the hypomethylated or gene-rich regions of the Elaeis guineensis and E. oleifera genomes were sequenced and assembled into contigs. An additional 16,427 shot-gun sequences and 176 bacterial artificial chromosomes (BAC) were also generated to check the quality of the libraries constructed. Comparison of these sequences revealed that although the methylation-filtered libraries were sequenced at low coverage, they still tagged at least 66% of the RefSeq-supported genes in the BACs and had a filtration power of at least 2.0. A total of 33,752 microsatellites and 40,820 high-quality single nucleotide polymorphism (SNP) markers were identified. These represent the most comprehensive collection of microsatellites and SNPs to date and are an important resource for genetic mapping and association studies. The gene models predicted from the assembled contigs were mined for genes of interest, and 242, 65 and 14 oil palm transcription factors, resistance genes and miRNAs, respectively, were identified. Examples of the transcription factors tagged include those associated with floral development and tissue culture, such as homeodomain proteins, MADS, Squamosa and Apetala2. The E. guineensis and E. oleifera hypomethylated sequences provide an important resource to understand the molecular mechanisms associated with important agronomic traits in oil palm. PMID:24497974

  8. Analyses of hypomethylated oil palm gene space.

    Directory of Open Access Journals (Sweden)

    Eng-Ti L Low

    Full Text Available Demand for palm oil has been increasing by an average of ∼8% annually over the past decade, and palm oil currently accounts for about 59% of the world's vegetable oil market. This drives the need to increase palm oil production. Nevertheless, due to the increasing need for sustainable production, it is imperative to increase productivity rather than the area cultivated. Studies on the oil palm genome are essential to help identify genes or markers that are associated with important processes or traits, such as flowering, yield and disease resistance. To achieve this, 294,115 and 150,744 sequences from the hypomethylated or gene-rich regions of the Elaeis guineensis and E. oleifera genomes were sequenced and assembled into contigs. An additional 16,427 shot-gun sequences and 176 bacterial artificial chromosomes (BAC) were also generated to check the quality of the libraries constructed. Comparison of these sequences revealed that although the methylation-filtered libraries were sequenced at low coverage, they still tagged at least 66% of the RefSeq-supported genes in the BACs and had a filtration power of at least 2.0. A total of 33,752 microsatellites and 40,820 high-quality single nucleotide polymorphism (SNP) markers were identified. These represent the most comprehensive collection of microsatellites and SNPs to date and are an important resource for genetic mapping and association studies. The gene models predicted from the assembled contigs were mined for genes of interest, and 242, 65 and 14 oil palm transcription factors, resistance genes and miRNAs, respectively, were identified. Examples of the transcription factors tagged include those associated with floral development and tissue culture, such as homeodomain proteins, MADS, Squamosa and Apetala2. The E. guineensis and E. oleifera hypomethylated sequences provide an important resource to understand the molecular mechanisms associated with important agronomic traits in oil palm.

  9. A conceptual DFT approach towards analysing toxicity

    Indian Academy of Sciences (India)

    U Sarkar; D R Roy; P K Chattaraj; R Parthasarathi; J Padmanabhan; V Subramanian

    2005-09-01

    The applicability of DFT-based descriptors for the development of toxicological structure-activity relationships is assessed. Emphasis in the present study is on the quality of DFT-based descriptors for the development of toxicological QSARs and, more specifically, on the potential of the electrophilicity concept in predicting the toxicity of benzidine derivatives and a series of polyaromatic hydrocarbons (PAH) expressed in terms of their biological activity data (pIC50). First, two benzidine derivatives, which act as electron-donating agents in their interactions with biomolecules, are considered. Overall toxicity in general and the most probable site of reactivity in particular are effectively described by the global and local electrophilicity parameters, respectively. Interaction of the two benzidine derivatives with nucleic acid (NA) bases/selected base pairs is determined using Parr's charge transfer formula. The experimental biological activity data (pIC50) for the family of PAH, namely polychlorinated dibenzofurans (PCDF), polyhalogenated dibenzo-p-dioxins (PHDD) and polychlorinated biphenyls (PCB), are taken as dependent variables, and the Hartree-Fock energy (E_HF), along with DFT-based global and local descriptors, viz., the electrophilicity index (ω) and the local electrophilic power (ω+), respectively, are taken as independent variables. Fairly good correlation is obtained, showing the significance of the selected descriptors in the QSAR of toxins that act as electron acceptors in the presence of biomolecules. Effects of population analysis schemes on the calculation of Fukui functions, as well as of solvation, are probed. Similarly, some electron-donor aliphatic amines are studied in the present work. We see that global and local electrophilicities along with the Hartree-Fock energy are adequate in explaining the toxicity of several substances
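
    For reference, the standard conceptual-DFT working equations behind these descriptors (due to Parr and co-workers), to which the abstract's garbled symbols presumably correspond; note that the hardness convention (with or without the factor 1/2) varies between papers:

        \mu \simeq -\tfrac{1}{2}(I + A), \qquad
        \eta \simeq \tfrac{1}{2}(I - A), \qquad
        \omega = \frac{\mu^{2}}{2\eta}, \qquad
        \omega_{k}^{+} = \omega \, f_{k}^{+}

    Here I and A are the vertical ionization potential and electron affinity, ω is the global electrophilicity index, and f_k+ is the Fukui function for nucleophilic attack at site k.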

  10. Quantitative DNA Analyses for Airborne Birch Pollen.

    Directory of Open Access Journals (Sweden)

    Isabell Müller-Germann

    Full Text Available Birch trees produce large amounts of highly allergenic pollen grains that are distributed by wind and impact human health by causing seasonal hay fever, pollen-related asthma, and other allergic diseases. Traditionally, pollen forecasts are based on conventional microscopic counting techniques that are labor-intensive and limited in the reliable identification of species. Molecular biological techniques provide an alternative approach that is less labor-intensive and enables identification of any species by its genetic fingerprint. A particularly promising method is quantitative real-time polymerase chain reaction (qPCR), which can be used to determine the number of DNA copies and thus pollen grains in air filter samples. During the birch pollination season in 2010 in Mainz, Germany, we collected air filter samples of fine (<3 μm) and coarse air particulate matter. These were analyzed by qPCR using two different primer pairs: one for a single-copy gene (BP8) and the other for a multi-copy gene (ITS). The BP8 gene was better suited for reliable qPCR results, and the qPCR results obtained for coarse particulate matter were well correlated with the birch pollen forecasting results of the regional air quality model COSMO-ART. As expected from the size of birch pollen grains (~23 μm), the concentration of DNA in fine particulate matter was lower than in the coarse particle fraction: for the ITS region the coarse-to-fine factor was 64, while for the single-copy gene BP8 it was only 51. The possible presence of so-called sub-pollen particles in the fine particle fraction is, however, interesting even at low concentrations. These particles are known to be highly allergenic, to penetrate deep into the airways, and often to cause severe health problems. In conclusion, the results of this exploratory study open up the possibility of predicting and quantifying the pollen concentration in the atmosphere more precisely in the future.
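
    The copy-number arithmetic behind absolute qPCR quantification is a straightforward standard-curve calculation; the sketch below shows the generic form, with a hypothetical slope and intercept rather than the calibration actually fitted for BP8 or ITS.

        import numpy as np

        def copies_from_cq(cq, slope, intercept):
            """Standard curve: Cq = slope * log10(copies) + intercept,
            hence copies = 10**((Cq - intercept) / slope)."""
            return 10 ** ((cq - intercept) / slope)

        def amplification_efficiency(slope):
            """A slope of -3.32 corresponds to 100% efficiency (perfect doubling)."""
            return 10 ** (-1.0 / slope) - 1.0

        slope, intercept = -3.4, 38.0          # illustrative calibration values
        for cq in (24.0, 28.0, 32.0):
            print(cq, f"{copies_from_cq(cq, slope, intercept):.2e} copies")
        print(f"efficiency: {amplification_efficiency(slope):.1%}")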

  11. Aircraft noise prediction

    Science.gov (United States)

    Filippone, Antonio

    2014-07-01

    This contribution addresses the state-of-the-art in the field of aircraft noise prediction, simulation and minimisation. The point of view taken in this context is that of comprehensive models that couple the various aircraft systems with the acoustic sources, the propagation and the flight trajectories. After an exhaustive review of the present predictive technologies in the relevant fields (airframe, propulsion, propagation, aircraft operations, trajectory optimisation), the paper addresses items for further research and development. Examples are shown for several airplanes, including the Airbus A319-100 (CFM engines), the Bombardier Dash8-Q400 (PW150 engines, Dowty R408 propellers) and the Boeing B737-800 (CFM engines). Predictions are done with the flight mechanics code FLIGHT. The transfer function between flight mechanics and the noise prediction is discussed in some detail, along with the numerical procedures for validation and verification. Some code-to-code comparisons are shown. It is contended that the field of aircraft noise prediction has not yet reached a sufficient level of maturity. In particular, some parametric effects cannot be investigated, issues of accuracy are not currently addressed, and validation standards are still lacking.

  12. Solar Cycle Prediction

    CERN Document Server

    Petrovay, K

    2010-01-01

    A review of solar cycle prediction methods and their performance is given, including forecasts for cycle 24 and focusing on aspects of the solar cycle prediction problem that have a bearing on dynamo theory. The scope of the review is further restricted to the issue of predicting the amplitude (and optionally the epoch) of an upcoming solar maximum no later than right after the start of the given cycle. Prediction methods form three main groups. Precursor methods rely on the value of some measure of solar activity or magnetism at a specified time to predict the amplitude of the following solar maximum. Their implicit assumption is that each numbered solar cycle is a consistent unit in itself, while solar activity seems to consist of a series of much less tightly intercorrelated individual cycles. Extrapolation methods, in contrast, are based on the premise that the physical process giving rise to the sunspot number record is statistically homogeneous, i.e., the mathematical regularities underlying its variati...

  13. A combined computational-experimental analyses of selected metabolic enzymes in Pseudomonas species

    Directory of Open Access Journals (Sweden)

    Deepak Perumal, Chu Sing Lim, Vincent T.K. Chow, Kishore R. Sakharkar, Meena K. Sakharkar

    2008-01-01

    Full Text Available Comparative genomic analysis has revolutionized our ability to predict the metabolic subsystems that occur in newly sequenced genomes, and to explore the functional roles of the set of genes within each subsystem. These computational predictions can considerably reduce the volume of experimental studies required to assess basic metabolic properties of multiple bacterial species. However, experimental validations are still required to resolve the apparent inconsistencies in the predictions by multiple resources. Here, we present combined computational-experimental analyses of eight completely sequenced Pseudomonas species. Comparative pathway analyses reveal that several pathways within the Pseudomonas species show high plasticity and versatility. Potential bypasses in 11 metabolic pathways were identified. We further confirmed the presence of the enzyme O-acetylhomoserine (thiol)-lyase (EC 2.5.1.49) in P. syringae pv. tomato, which revealed inconsistent annotations in KEGG and in the recently published SYSTOMONAS database. These analyses connect and integrate systematic data generation, computational data interpretation, and experimental validation, and represent a synergistic and powerful means for conducting biological research.

  14. Summary Report of Laboratory Critical Experiment Analyses Performed for the Disposal Criticality Analysis Methodology

    Energy Technology Data Exchange (ETDEWEB)

    J. Scaglione

    1999-09-09

    This report, "Summary Report of Laboratory Critical Experiment Analyses Performed for the Disposal Criticality Analysis Methodology", contains a summary of the laboratory critical experiment (LCE) analyses used to support the validation of the disposal criticality analysis methodology. The objective of this report is to present a summary of the LCE analyses' results. These results demonstrate the ability of MCNP to accurately predict the critical multiplication factor (keff) for fuel with different configurations. Results from the LCE evaluations will support the development and validation of the criticality models used in the disposal criticality analysis methodology. These models and their validation have been discussed in the "Disposal Criticality Analysis Methodology Topical Report" (CRWMS M&O 1998a).

  15. Mortality of atomic bomb survivors predicted from laboratory animals

    Science.gov (United States)

    Carnes, Bruce A.; Grahn, Douglas; Hoel, David

    2003-01-01

    Exposure, pathology and mortality data for mice, dogs and humans were examined to determine whether accurate interspecies predictions of radiation-induced mortality could be achieved. The analyses revealed that (1) days of life lost per unit dose can be estimated for a species even without information on radiation effects in that species, and (2) accurate predictions of age-specific radiation-induced mortality in beagles and the atomic bomb survivors can be obtained from a dose-response model for comparably exposed mice. These findings illustrate the value of comparative mortality analyses and the relevance of animal data to the study of human health effects.

  16. First Super-Earth Atmosphere Analysed

    Science.gov (United States)

    2010-12-01

    The atmosphere around a super-Earth exoplanet has been analysed for the first time by an international team of astronomers using ESO's Very Large Telescope. The planet, which is known as GJ 1214b, was studied as it passed in front of its parent star and some of the starlight passed through the planet's atmosphere. We now know that the atmosphere is either mostly water in the form of steam or is dominated by thick clouds or hazes. The results will appear in the 2 December 2010 issue of the journal Nature. The planet GJ 1214b was confirmed in 2009 using the HARPS instrument on ESO's 3.6-metre telescope in Chile (eso0950) [1]. Initial findings suggested that this planet had an atmosphere, which has now been confirmed and studied in detail by an international team of astronomers, led by Jacob Bean (Harvard-Smithsonian Center for Astrophysics), using the FORS instrument on ESO's Very Large Telescope. "This is the first super-Earth to have its atmosphere analysed. We've reached a real milestone on the road toward characterising these worlds," said Bean. GJ 1214b has a radius of about 2.6 times that of the Earth and is about 6.5 times as massive, putting it squarely into the class of exoplanets known as super-Earths. Its host star lies about 40 light-years from Earth in the constellation of Ophiuchus (the Serpent Bearer). It is a faint star [2], but it is also small, which means that the size of the planet is large compared to the stellar disc, making it relatively easy to study [3]. The planet travels across the disc of its parent star once every 38 hours as it orbits at a distance of only two million kilometres: about seventy times closer than the Earth orbits the Sun. To study the atmosphere, the team observed the light coming from the star as the planet passed in front of it [4]. During these transits, some of the starlight passes through the planet's atmosphere and, depending on the chemical composition and weather on the planet, specific wavelengths of light are

  17. Prediction model Perla

    International Nuclear Information System (INIS)

    Prediction model Perla is a tool for evaluating the ecological status of streams. It enables comparison with a standard, formed by a dataset of sites from the whole area of the Czech Republic that were influenced by human activity as little as possible. Eight variables were used for prediction (distance from source, elevation, stream width and depth, slope, substrate roughness, longitude and latitude), all of them statistically important for benthic communities. Results correspond not to ecoregions but rather to stream size (type). The indices B (EQItaxon), EQISi, EQIASPT and EQIH appear applicable for assessment using the prediction model and for differentiating natural and human stress, and limiting values of these indices for good ecological status are suggested. By contrast, the EQIEPT and EQIekoprof indices could be used only with difficulty. (authors)

  18. Partially predictable chaos

    CERN Document Server

    Wernecke, Hendrik; Gros, Claudius

    2016-01-01

    For a chaotic system pairs of initially close-by trajectories become eventually fully uncorrelated on the attracting set. This process of decorrelation is split into an initial decrease characterized by the maximal Lyapunov exponent and a subsequent diffusive process on the chaotic attractor causing the final loss of predictability. The time scales of both processes can be either of the same or of very different orders of magnitude. In the latter case the two trajectories linger within a finite but small distance (with respect to the overall size of the attractor) for exceedingly long times and therefore remain partially predictable. We introduce a 0-1 indicator for chaos capable of describing this scenario, arguing, in addition, that the chaotic closed braids found close to a period-doubling transition are generically partially predictable.
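
    The two-stage decorrelation is easy to reproduce numerically. The sketch below, a generic illustration not taken from the paper, follows two initially close trajectories of the Lorenz-63 system: their distance first grows exponentially at a rate set by the maximal Lyapunov exponent, then saturates near the attractor scale.

        import numpy as np

        def lorenz_step(u, dt=0.002, s=10.0, r=28.0, b=8.0 / 3.0):
            """One explicit Euler step of Lorenz-63 (adequate for a sketch)."""
            x, y, z = u
            return u + dt * np.array([s * (y - x), x * (r - z) - y, x * y - b * z])

        u = np.array([1.0, 1.0, 1.0])
        for _ in range(5000):                  # relax onto the attractor
            u = lorenz_step(u)
        v = u + np.array([1e-9, 0.0, 0.0])     # tiny initial separation

        for step in range(20001):
            if step % 2000 == 0:
                print(f"t = {step * 0.002:6.1f}   distance = {np.linalg.norm(u - v):.3e}")
            u, v = lorenz_step(u), lorenz_step(v)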

  19. Predicting the Sunspot Cycle

    Science.gov (United States)

    Hathaway, David H.

    2009-01-01

    The 11-year sunspot cycle was discovered by an amateur astronomer in 1844. Visual and photographic observations of sunspots have been made by both amateurs and professionals over the last 400 years. These observations provide key statistical information about the sunspot cycle that allows predictions of future activity. However, sunspots and the sunspot cycle are magnetic in nature. For the last 100 years these magnetic measurements have been acquired and used exclusively by professional astronomers to gain new information about the nature of the solar activity cycle. Recently, magnetic dynamo models have evolved to the stage where they can assimilate past data and provide predictions. With the advent of the Internet and open data policies, amateurs now have equal access to the same data used by professionals and equal opportunities to contribute (but, alas, without pay). This talk will describe some of the more useful prediction techniques and reveal what they say about the intensity of the upcoming sunspot cycle.

  20. Predictive Techniques for Spacecraft Cabin Air Quality Control

    Science.gov (United States)

    Perry, J. L.; Cromes, Scott D. (Technical Monitor)

    2001-01-01

    As assembly of the International Space Station (ISS) proceeds, predictive techniques are used to determine the best approach for handling a variety of cabin air quality challenges. These techniques use equipment offgassing data collected from each ISS module before flight to characterize the trace chemical contaminant load. Combined with crew metabolic loads, these data serve as input to a predictive model for assessing the capability of the onboard atmosphere revitalization systems to handle the overall trace contaminant load as station assembly progresses. The techniques for predicting in-flight air quality are summarized along with results from early ISS mission analyses. Results from ground-based analyses of in-flight air quality samples are compared to the predictions to demonstrate the technique's relative conservatism.
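
    The core of such predictive techniques is a trace-contaminant mass balance; a minimal single-volume, well-mixed sketch is shown below. The numbers are invented for illustration and are not ISS design values, and the real model tracks many contaminants and several removal devices.

        import numpy as np

        def cabin_concentration(t_hours, generation, removal_flow, volume, c0=0.0):
            """Well-mixed balance V dC/dt = G - Q*C, so
            C(t) = G/Q + (C0 - G/Q) * exp(-Q t / V)."""
            c_ss = generation / removal_flow
            return c_ss + (c0 - c_ss) * np.exp(-removal_flow * t_hours / volume)

        # Hypothetical: 0.5 mg/h combined offgassing and metabolic load into a
        # 100 m^3 module scrubbed at an effective 15 m^3/h.
        for t in (0.0, 6.0, 24.0, 96.0):
            c = cabin_concentration(t, generation=0.5, removal_flow=15.0, volume=100.0)
            print(f"{t:5.0f} h: {c:.4f} mg/m^3")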

  1. It's difficult, but important, to make negative predictions.

    Science.gov (United States)

    Williams, Richard V; Amberg, Alexander; Brigo, Alessandro; Coquin, Laurence; Giddings, Amanda; Glowienke, Susanne; Greene, Nigel; Jolly, Robert; Kemper, Ray; O'Leary-Steele, Catherine; Parenty, Alexis; Spirkl, Hans-Peter; Stalford, Susanne A; Weiner, Sandy K; Wichard, Joerg

    2016-04-01

    At the confluence of predictive and regulatory toxicologies, negative predictions may be the thin green line that prevents populations from being exposed to harm. Here, two novel approaches to making confident and robust negative in silico predictions for mutagenicity (as defined by the Ames test) have been evaluated. Analyses of 12 data sets containing >13,000 compounds showed that negative predictivity is high (∼90%) for the best approach, and features that reduce either the accuracy or the certainty of negative predictions (leading to misclassified or unclassified compounds, respectively) are identified. However, negative predictivity remains high (and in excess of the prevalence of non-mutagens) even in the presence of these features, indicating that they are not flags for mutagenicity. PMID:26785392
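
    The figures of merit involved are simple confusion-matrix ratios; the sketch below, with invented counts rather than the paper's data, shows why negative predictivity must be judged against the prevalence of non-mutagens.

        def negative_predictivity(tn, fn):
            """Fraction of negative predictions that are truly non-mutagenic."""
            return tn / (tn + fn)

        def prevalence_of_negatives(tp, fp, tn, fn):
            return (tn + fp) / (tp + fp + tn + fn)

        tp, fp, tn, fn = 4200, 600, 7800, 850      # illustrative counts only
        npv = negative_predictivity(tn, fn)
        base = prevalence_of_negatives(tp, fp, tn, fn)
        # A useful negative predictor must beat the base rate of non-mutagens.
        print(f"negative predictivity = {npv:.1%}, prevalence of non-mutagens = {base:.1%}")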

  2. The Strepsiptera-Odyssey: the history of the systematic placement of an enigmatic parasitic insect order

    Directory of Open Access Journals (Sweden)

    H. Pohl

    2013-09-01

    Full Text Available The history of the phylogenetic placement of the parasitic insect order Strepsiptera is outlined. The first species was described in 1793 by P. Rossi and assigned to the hymenopteran family Ichneumonidae. A position close to the cucujiform beetle family Rhipiphoridae was suggested by several earlier authors. Others proposed a close relationship with Diptera, or even a group Pupariata including Diptera, Strepsiptera and Coccoidea. A subordinate placement within the polyphagan series Cucujiformia, close to the wood-associated Lymexylidae, was favored by the coleopterist R.A. Crowson. W. Hennig considered a sistergroup relationship with Coleoptera the most likely hypothesis but emphasized the uncertainty. Cladistic analyses of morphological data sets yielded very different placements, alternatively as sistergroup of Coleoptera, of Antliophora, or of all other holometabolan orders. Results based on ribosomal genes suggested a sistergroup relationship with Diptera (the Halteria concept). A clade Coleopterida (Strepsiptera and Coleoptera) was supported in two studies based on different combinations of protein-coding nuclear genes. Analyses of data sets comprising seven or nine genes (seven single-copy nuclear genes, respectively) yielded either a subordinate placement within Coleoptera or a sistergroup relationship with Neuropterida. Several early hypotheses based on a typological approach (affinities with Diptera, Coleoptera, a coleopteran subgroup, or Neuropterida) were revived using either a Hennigian approach or formal analyses of morphological characters or different molecular data sets. A phylogenomic approach finally supported a sistergroup relationship with monophyletic Coleoptera.

  3. Linguistic Structure Prediction

    CERN Document Server

    Smith, Noah A

    2011-01-01

    A major part of natural language processing now depends on the use of text data to build linguistic analyzers. We consider statistical, computational approaches to modeling linguistic structure. We seek to unify across many approaches and many kinds of linguistic structures. Assuming a basic understanding of natural language processing and/or machine learning, we seek to bridge the gap between the two fields. Approaches to decoding (i.e., carrying out linguistic structure prediction) and supervised and unsupervised learning of models that predict discrete structures as outputs are the focus. W

  4. Atmospheric predictability revisited

    Directory of Open Access Journals (Sweden)

    Lizzie S. R. Froude

    2013-06-01

    Full Text Available This article examines the potential to improve numerical weather prediction (NWP by estimating upper and lower bounds on predictability by re-visiting the original study of Lorenz (1982 but applied to the most recent version of the European Centre for Medium Range Weather Forecasts (ECMWF forecast system, for both the deterministic and ensemble prediction systems (EPS. These bounds are contrasted with an older version of the same NWP system to see how they have changed with improvements to the NWP system. The computations were performed for the earlier seasons of DJF 1985/1986 and JJA 1986 and the later seasons of DJF 2010/2011 and JJA 2011 using the 500-hPa geopotential height field. Results indicate that for this field, we may be approaching the limit of deterministic forecasting so that further improvements might only be obtained by improving the initial state. The results also show that predictability calculations with earlier versions of the model may overestimate potential forecast skill, which may be due to insufficient internal variability in the model and because recent versions of the model are more realistic in representing the true atmospheric evolution. The same methodology is applied to the EPS to calculate upper and lower bounds of predictability of the ensemble mean forecast in order to explore how ensemble forecasting could extend the limits of the deterministic forecast. The results show that there is a large potential to improve the ensemble predictions, but for the increased predictability of the ensemble mean, there will be a trade-off in information as the forecasts will become increasingly smoothed with time. From around the 10-d forecast time, the ensemble mean begins to converge towards climatology. Until this point, the ensemble mean is able to predict the main features of the large-scale flow accurately and with high consistency from one forecast cycle to the next. By the 15-d forecast time, the ensemble mean has lost

  5. Zephyr - the prediction models

    DEFF Research Database (Denmark)

    Nielsen, Torben Skov; Madsen, Henrik; Nielsen, Henrik Aalborg;

    2001-01-01

    This paper briefly describes new models and methods for predicting the wind power output from wind farms. The system is being developed in a project which has the research organization Risø and the Department of Informatics and Mathematical Modelling (IMM) as the modelling team and all the Danish utilities as partners and users. The new models are evaluated for five wind farms in Denmark as well as one wind farm in Spain. It is shown that the predictions based on conditional parametric models are superior to the predictions obtained by state-of-the-art parametric models.

  6. RETAIL BANKRUPTCY PREDICTION

    Directory of Open Access Journals (Sweden)

    Johnny Pang

    2013-01-01

    Full Text Available This study reintroduces the famous discriminant functions from Edward Altman and from Begley, Ming and Watts (BMW) that were used to predict bankruptcies. We formulate three new discriminant functions which differ from Altman's models and from BMW's re-estimated Altman model. Altman's models, as well as Begley, Ming and Watts's re-estimated Altman model, apply to publicly traded industries, whereas the new models formulated in this study are based on retail companies. The three new functions provide better predictions of retail bankruptcy and minimize the chance of misclassification.
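
    For context, the classical function being re-estimated is Altman's 1968 Z-score for publicly traded manufacturers, sketched below; the paper's retail-specific functions re-fit coefficients of this kind to retail data, and those coefficients are not reproduced here.

        def altman_z(working_capital, retained_earnings, ebit,
                     market_equity, sales, total_assets, total_liabilities):
            """Altman (1968): Z = 1.2*X1 + 1.4*X2 + 3.3*X3 + 0.6*X4 + 1.0*X5."""
            x1 = working_capital / total_assets
            x2 = retained_earnings / total_assets
            x3 = ebit / total_assets
            x4 = market_equity / total_liabilities
            x5 = sales / total_assets
            return 1.2 * x1 + 1.4 * x2 + 3.3 * x3 + 0.6 * x4 + 1.0 * x5

        # Illustrative balance-sheet figures; classical zones: Z < 1.81 distress,
        # Z > 2.99 safe, with a gray zone in between.
        z = altman_z(1.2e6, 2.5e6, 0.9e6, 4.0e6, 9.0e6, 10.0e6, 5.0e6)
        print(f"Z = {z:.2f}")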

  7. The Effect of Scale Dependent Discretization on the Progressive Failure of Composite Materials Using Multiscale Analyses

    Science.gov (United States)

    Ricks, Trenton M.; Lacy, Thomas E., Jr.; Pineda, Evan J.; Bednarcyk, Brett A.; Arnold, Steven M.

    2013-01-01

    A multiscale modeling methodology, which incorporates a statistical distribution of fiber strengths into coupled micromechanics/finite element analyses, is applied to unidirectional polymer matrix composites (PMCs) to analyze the effect of mesh discretization at both the micro- and macroscales on the predicted ultimate tensile strength (UTS) and failure behavior. The NASA code FEAMAC and the ABAQUS finite element solver were used to analyze the progressive failure of a PMC tensile specimen that initiates at the repeating unit cell (RUC) level. Three different finite element mesh densities were employed, each coupled with an appropriate RUC. Multiple simulations were performed in order to assess the effect of a statistical distribution of fiber strengths on the bulk composite failure and predicted strength. The coupled effects of both the micro- and macroscale discretizations were found to have a noticeable effect on the predicted UTS and the computational efficiency of the simulations.
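
    The abstract does not name the strength distribution; a two-parameter Weibull law is the common choice for fiber strengths, and the sketch below shows how per-fiber random draws introduce the run-to-run scatter in predicted strength discussed above. The scale and modulus values are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(7)

        def sample_fiber_strengths(n_fibers, sigma0=4500.0, m=10.0):
            """Two-parameter Weibull draws: scale sigma0 in MPa, Weibull modulus m."""
            return sigma0 * rng.weibull(m, size=n_fibers)

        # Each simulation assigns fresh strengths to the fibers of every repeating
        # unit cell, so repeated runs scatter around a mean predicted UTS.
        for run in range(3):
            s = sample_fiber_strengths(10_000)
            print(f"run {run}: mean = {s.mean():.0f} MPa, weakest = {s.min():.0f} MPa")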

  8. Prediction method abstracts

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-12-31

    This conference was held December 4-8, 1994 in Asilomar, California. The purpose of this meeting was to provide a forum for the exchange of state-of-the-art information concerning the prediction of protein structure. Attention is focused on the following: comparative modeling; sequence-to-fold assignment; and ab initio folding.

  9. THE PREDICTION OF OVULATION

    Institute of Scientific and Technical Information of China (English)

    WANG Xin-Xing; ZHA Shu-Wei; WU Zhou-Ya

    1989-01-01

    The authors present their work on the prediction of ovulation in forty-five women with normal menstrual cycles, for a total of 72 cycles, by several indices, including ultrasonography, BBT graph, cervical mucus and mittelschmerz. LH peak values were also determined for reference in 20 cases (20 cycles). Results are as follows:

  10. Predicting coronary heart disease

    DEFF Research Database (Denmark)

    Sillesen, Henrik; Fuster, Valentin

    2012-01-01

    Atherosclerosis is the leading cause of death and disabling disease. Whereas risk factors are well known and constitute therapeutic targets, they are not useful for prediction of risk of future myocardial infarction, stroke, or death. Therefore, methods to identify atherosclerosis itself have bee...

  11. Predicting Lotto Numbers

    NARCIS (Netherlands)

    Jorgensen, C.B.; Suetens, S.; Tyran, J.R.

    2011-01-01

    We investigate the "law of small numbers" using a unique panel data set on lotto gambling. Because we can track individual players over time, we can measure how they react to outcomes of recent lotto drawings. We can therefore test whether they behave as if they believe they can predict lotto number

  12. Prediction of resonant oscillation

    DEFF Research Database (Denmark)

    2010-01-01

    The invention relates to methods for prediction of parametric rolling of vessels. The methods are based on frequency-domain and time-domain information in order to set up a detector able to trigger an alarm when parametric roll is likely to occur. The methods use measurements of e.g. pitch and roll...

  13. Predicting service life margins

    Science.gov (United States)

    Egan, G. F.

    1971-01-01

    Margins are developed for equipment susceptible to malfunction from excessive time or operating cycles, and for identifying limited-life equipment so that monitoring and replacement can be accomplished before hardware failure. The method applies to hardware for which the design service life is established and a reasonable prediction of expected usage can be made.

  14. Gate valve performance prediction

    International Nuclear Information System (INIS)

    The Electric Power Research Institute is carrying out a program to improve the performance prediction methods for motor-operated valves. As part of this program, an analytical method to predict the stem thrust required to stroke a gate valve has been developed and has been assessed against data from gate valve tests. The method accounts for the loads applied to the disc by fluid flow and for the detailed mechanical interaction of the stem, disc, guides, and seats. To support development of the method, two separate-effects test programs were carried out. One test program determined friction coefficients for contacts between gate valve parts by using material specimens in controlled environments. The other test program investigated the interaction of the stem, disc, guides, and seat using a special fixture with full-sized gate valve parts. The method has been assessed against flow-loop and in-plant test data. These tests include valve sizes from 3 to 18 in. and cover a considerable range of flow, temperature, and differential pressure. Stem thrust predictions for the method bound measured results. In some cases, the bounding predictions are substantially higher than the stem loads required for valve operation, as a result of the bounding nature of the friction coefficients in the method
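
    As a point of reference, a first-order textbook closing-thrust balance is sketched below: packing drag, plus the stem-rejection load, plus friction of the pressure-loaded disc sliding on its seat. This is not the EPRI method described above, which additionally models the detailed stem/disc/guide/seat interaction and experimentally grounded friction coefficients; all numbers are illustrative.

        def required_stem_thrust(dp, a_disc, a_stem, mu_seat, f_packing):
            """dp in Pa, areas in m^2, forces in N."""
            return f_packing + dp * a_stem + mu_seat * dp * a_disc

        dp = 7.0e6                     # roughly 1000 psi differential pressure
        thrust = required_stem_thrust(dp, a_disc=0.018, a_stem=0.0008,
                                      mu_seat=0.4, f_packing=4000.0)
        print(f"required stem thrust ~ {thrust / 1000:.0f} kN")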

  15. Predicting Classroom Success.

    Science.gov (United States)

    Kessler, Ronald P.

    A study was conducted at Rancho Santiago College (RSC) to identify personal and academic factors that are predictive of students' success in their courses. The study examined the following possible predictors of success: language and math test scores; background characteristics; length of time out of high school; high school background; college…

  16. Predicting Intrinsic Motivation

    Science.gov (United States)

    Martens, Rob; Kirschner, Paul A.

    2004-01-01

    Intrinsic motivation can be predicted from participants' perceptions of the social environment and the task environment (Ryan & Deci, 2000) in terms of control, relatedness and competence. To determine the degree of independence of these factors, 251 students in higher vocational education (physiotherapy and hotel management) indicated the extent to…

  17. Predictability of critical transitions

    Science.gov (United States)

    Zhang, Xiaozhu; Kuehn, Christian; Hallerberg, Sarah

    2015-11-01

    Critical transitions in multistable systems have been discussed as models for a variety of phenomena ranging from the extinctions of species to socioeconomic changes and climate transitions between ice ages and warm ages. From bifurcation theory we can expect certain critical transitions to be preceded by a decreased recovery from external perturbations. The consequences of this critical slowing down have been observed as an increase in variance and autocorrelation prior to the transition. However, especially in the presence of noise, it is not clear whether these changes in observation variables are statistically relevant such that they could be used as indicators for critical transitions. In this contribution we investigate the predictability of critical transitions in conceptual models. We study the quadratic integrate-and-fire model and the van der Pol model under the influence of external noise. We focus especially on the statistical analysis of the success of predictions and the overall predictability of the system. The performance of different indicator variables turns out to be dependent on the specific model under study and the conditions of accessing it. Furthermore, we study the influence of the magnitude of transitions on the predictive performance.
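
    The indicator variables in question can be computed in a few lines. The sketch below is a generic illustration, not the paper's models: a noisy linear relaxation whose restoring rate decays toward zero stands in for the approach to a bifurcation, and both rolling variance and lag-1 autocorrelation rise as the transition nears.

        import numpy as np

        rng = np.random.default_rng(1)

        # dx = -k(t) x dt + sigma dW, with k(t) ramping down toward zero.
        n, dt, sigma = 40000, 0.01, 0.1
        k = np.linspace(1.0, 0.02, n)
        x = np.zeros(n)
        for i in range(n - 1):
            x[i + 1] = x[i] - k[i] * x[i] * dt + sigma * np.sqrt(dt) * rng.standard_normal()

        def rolling_indicators(series, width=2000):
            """Rolling variance and lag-1 autocorrelation (early-warning signs)."""
            for start in range(0, len(series) - width, width):
                w = series[start:start + width]
                yield start, w.var(), np.corrcoef(w[:-1], w[1:])[0, 1]

        for start, var, ac1 in rolling_indicators(x):
            print(f"window at {start:6d}: variance = {var:.4f}, lag-1 autocorr = {ac1:.3f}")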

  18. Towards Predictive Association Theories

    DEFF Research Database (Denmark)

    Kontogeorgis, Georgios; Tsivintzelis, Ioannis; Michelsen, Michael Locht;

    2011-01-01

    Association equations of state like SAFT, CPA and NRHB have been previously applied to many complex mixtures. In this work we focus on two of these models, the CPA and the NRHB equations of state and the emphasis is on the analysis of their predictive capabilities for a wide range of applications...

  19. PREDICTION OF OVULATION

    Institute of Scientific and Technical Information of China (English)

    LIU Yong; CHEN Su-Ru; ZHOU Jin-Ting; LIU Ji-Ying

    1989-01-01

    The purpose of this research is: 1) to observe the secretory pattern of five reproductive hormones in Chinese women with normal menstrual cycles, especially in the pre-ovulatory period; 2) to study whether urinary LH measurement could be used instead of serum LH measurement; and 3) to evaluate the significance of the LH-EIA kit (Right-Day) for ovulation prediction.

  20. Prediction in OLAP Cube

    Directory of Open Access Journals (Sweden)

    Abdellah Sair

    2012-05-01

    Full Text Available Data warehouses now offer an adequate solution for managing large volumes of data. Online analytical processing (OLAP) supports data warehouses in the decision-support process, and visualization tools help structure and explore warehouse data. Data mining, on the other hand, allows the extraction of knowledge through techniques for description, classification, explanation and prediction. It is therefore possible to understand the data better by coupling on-line analysis with data mining in a unified analysis process. Continuing the work of R. Ben Messaoud, in which the coupling of on-line analysis and data mining addressed description, visualization, classification and explanation, we propose extending OLAP with prediction capabilities. To integrate prediction into the heart of OLAP, an approach based on machine learning with regression trees is proposed in order to predict the value of an aggregate or a measure. We illustrate our approach with data from an academic administration service, asking, for example, what average grade students would obtain if a new module were opened for a department meeting a certain criterion.
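
    A minimal sketch of the proposed idea, fitting a regression tree over cube dimensions to predict a measure, is shown below using scikit-learn; the dimension names, encodings and data are invented for illustration and are not the authors' implementation.

        import numpy as np
        from sklearn.tree import DecisionTreeRegressor

        rng = np.random.default_rng(3)
        n = 500
        department = rng.integers(0, 5, n)     # dimension members, label-encoded
        module = rng.integers(0, 8, n)
        semester = rng.integers(0, 4, n)
        # The measure (average grade) depends on the dimensions plus noise.
        grade = (10 + 0.6 * department - 0.3 * module + 0.2 * semester
                 + rng.normal(0, 0.5, n))

        X = np.column_stack([department, module, semester])
        tree = DecisionTreeRegressor(max_depth=4).fit(X, grade)

        # Predict the measure for a cube cell that does not exist yet,
        # e.g. a hypothetical new module (index 8) in department 2, semester 1.
        print(tree.predict([[2, 8, 1]]))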

  1. Can observers predict trustworthiness?

    NARCIS (Netherlands)

    M. Belot; V. Bhaskar; J. van de Ven

    2009-01-01

    We analyze experimental evidence on whether untrained subjects can predict how trustworthy an individual is. Two players on a TV show play a high stakes prisoner's dilemma with pre-play communication. Our subjects report probabilistic beliefs that each player cooperates, before and after communicati

  2. Chloride ingress prediction

    DEFF Research Database (Denmark)

    Frederiksen, Jens Mejer; Geiker, Mette Rica

    2008-01-01

    Prediction of chloride ingress into concrete is an important part of durability design of reinforced concrete structures exposed to chloride-containing environments. This paper presents experimentally based design parameters for Portland cement concretes with and without silica fume and fly ash in...
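
    Design models of this kind commonly build on the error-function solution of Fick's second law; a generic sketch follows, with illustrative parameter values rather than the experimentally derived ones the paper reports.

        import numpy as np
        from scipy.special import erf

        def chloride_profile(x_mm, t_years, c_surface, d_mm2_per_year):
            """C(x, t) = Cs * (1 - erf(x / (2 * sqrt(D * t))))."""
            return c_surface * (1.0 - erf(x_mm / (2.0 * np.sqrt(d_mm2_per_year * t_years))))

        # Illustrative: surface chloride 0.5% by mass of concrete, apparent
        # diffusivity 30 mm^2/year, checked at three cover depths after 50 years.
        for cover in (20.0, 40.0, 60.0):
            c = chloride_profile(cover, t_years=50.0, c_surface=0.5, d_mm2_per_year=30.0)
            print(f"cover {cover:.0f} mm: C = {c:.3f} % by mass")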

  3. Analyses of the early stages of star formation

    Science.gov (United States)

    Lintott, Christopher John

    This thesis presents a study of the physical and chemical properties of star forming regions, both in the Milky Way and in the distant Universe, building on the existing astrochemical models developed by the group at UCL. Observations of the nearby star-forming region L134A, which were carried out with the James Clerk Maxwell Telescope (JCMT) in Hawai'i, are compared to the predictions of a model of star formation from gas rich in atomic (rather than molecular) hydrogen. A similar model is used to investigate the effect of non-equilibrium chemistry on the derivation of the cosmic-ray ionization rate, an important parameter in controlling both the chemistry and the physics of star forming clumps. A collapse faster than free-fall is proposed as an explanation for differences between the distribution of CS and N2H+ in such regions. Moving beyond the Milky Way, JCMT observations of sulphur-bearing species in the nearby starburst galaxy M82 are presented and compared with existing molecular observations of similar systems. M82 is a local analogue for star forming systems in the early Universe, many of which have star formation rates several thousand times that of the Milky Way. A model which treats the molecular gas in such systems as an assembly of 'hot cores' (protostellar cores which have a distinctive chemical signature) has been developed, and is used to predict the abundance of many species. An application of this model is used to explain the observed deviation in the early Universe from the otherwise tight relation between infrared and HCN luminosity via relatively recent star formation from near-primordial gas. Many of the stars formed in the early Universe must now be in massive elliptical systems, and work on the structure of these systems is presented. Data from the Sloan Digital Sky Survey are analysed to show that such galaxies have cores dominated by baryons rather than dark matter, and the dark matter profile is constrained by adiabatic contraction.

  4. L'analyse qualitative comme approche multiple

    Directory of Open Access Journals (Sweden)

    Roberto Cipriani

    2009-11-01

    Full Text Available The example of historical inquiry, which aims to identify the characteristics of the birth and development of a science and of the readings it offers of social events, is among the most original. Any historical methodology not only results in a sheer mass of episodes and events, but is also a narration and a critical elaboration of those same facts. Michael Postan rightly writes that the complexity of historical data is nevertheless such, and the differences and similarities so difficult to pin down, that the efforts of historians and sociologists to construct explicit comparisons have, for the most part, ended in crude and naive attempts. The lesson of the Annales school has indeed helped build the idea of a history able to read and explain both what is uniform and what is singular. Nothing is more natural than the union of 'psychical beings', like the assembly of cells into an organism, into a new and different 'psychical being'. A turn is therefore needed towards broader and more rigorous empirical experimentation, in order to have adequate instruments capable of guaranteeing sufficient reliability to micro-level, qualitative and biographical methodology. Historical approach offers a relevant contribution in order to find the features of birth and development of a science which analyses social events. Historical methodology produces not only a lot of data but also a narrative, and an interpretation of facts. According to Michael Postan, history and sociology have made many efforts to compare data that are complex but similar and different at the same time. And the results seem to be naive. Thanks to the Annales school's suggestion it is possible to read and to explain what is uniform and what is singular. To put together "psychical beings", like organic cells, in a new

  5. Exchange Rate Predictions

    OpenAIRE

    Yablonskyy, Karen

    2012-01-01

    The aim of this thesis is to analyze foreign exchange currency forecasting techniques. The central idea is to develop a forecasting strategy by choosing indicators and techniques to make our own forecast for the currency pair EUR/USD. The thesis is a mixture of theoretical and practical analyses. The goal of the project was to study different types of forecasting techniques and to make our own forecast, practising forecasting and trading on a Forex platform, ba...

  6. Predictive role of the nighttime blood pressure

    DEFF Research Database (Denmark)

    Hansen, Tine W; Li, Yan; Boggia, José;

    2011-01-01

    Numerous studies addressed the predictive value of the nighttime blood pressure (BP) as captured by ambulatory monitoring. However, arbitrary cutoff limits in dichotomized analyses of continuous variables, data dredging across selected subgroups, extrapolation of cross-sectional studies... of conclusive evidence proving that nondipping is a reversible risk factor, the option whether or not to restore the diurnal blood pressure profile to a normal pattern should be left to the clinical judgment of doctors and should be individualized for each patient. Current guidelines on the interpretation... studies in hypertensive patients (n = 23 856) separately from those in individuals randomly recruited from populations (n = 9641). We pooled summary statistics and individual subject data, respectively. In both patients and populations, in analyses in which nighttime BP was additionally adjusted...

  7. Predicting Major Solar Eruptions

    Science.gov (United States)

    Kohler, Susanna

    2016-05-01

    Coronal mass ejections (CMEs) and solar flares are two examples of major explosions from the surface of the Sun, but they're not the same thing, and they don't have to happen at the same time. A recent study examines whether we can predict which solar flares will be closely followed by larger-scale CMEs. [Figure: Image of a solar flare from May 2013, as captured by NASA's Solar Dynamics Observatory. Credit: NASA/SDO] Flares as a Precursor? A solar flare is a localized burst of energy and X-rays, whereas a CME is an enormous cloud of magnetic flux and plasma released from the Sun. We know that some magnetic activity on the surface of the Sun triggers both a flare and a CME, whereas other activity only triggers a confined flare with no CME. But what makes the difference? Understanding this can help us learn about the underlying physical drivers of flares and CMEs. It might also help us to better predict when a CME, which can pose risks to astronauts, disrupt radio transmissions, and damage satellites, will occur. In a recent study, Monica Bobra and Stathis Ilonidis (Stanford University) attempt to improve our ability to make these predictions by using a machine-learning algorithm. Classification by Computer [Figure: Using a combination of 6 or more features results in much better predictive success, measured by the True Skill Statistic (higher positive value = better prediction), for whether a flare will be accompanied by a CME. Bobra & Ilonidis 2016] Bobra and Ilonidis used magnetic-field data from an instrument on the Solar Dynamics Observatory to build a catalog of solar flares, 56 of which were accompanied by a CME and 364 of which were not. The catalog includes information about 18 different features associated with the photospheric magnetic field of each flaring active region (for example, the mean gradient of the horizontal magnetic field). The authors apply a machine-learning algorithm known as a binary classifier to this catalog. This algorithm tries to predict, given a set of features
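
    The True Skill Statistic mentioned in the figure caption is simple to compute; the sketch below uses invented confusion-matrix counts shaped like the catalog (56 CME-producing versus 364 confined flares), not the paper's actual results.

        def true_skill_statistic(tp, fn, fp, tn):
            """TSS = hit rate - false-alarm rate; +1 is perfect, 0 is no skill.
            Unlike accuracy, it is insensitive to the CME/no-CME class imbalance."""
            return tp / (tp + fn) - fp / (fp + tn)

        print(true_skill_statistic(tp=45, fn=11, fp=40, tn=324))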

  8. Validation of HELIOS for ATR Core Follow Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Bays, Samuel E. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Swain, Emily T. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Crawford, Douglas S. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Nigg, David W. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-03-01

    This work summarizes the validation analyses for the HELIOS code to support core design and safety assurance calculations of the Advanced Test Reactor (ATR). Past and current core safety assurance is performed by the PDQ-7 diffusion code, a state-of-the-art reactor physics simulation tool from the nuclear industry's earlier days. Over the past twenty years, improvements in computational speed have enabled the use of modern neutron transport methodologies to replace the role of diffusion theory for simulation of complex systems such as the ATR. More exact methodologies have enabled a paradigm shift away from highly tuned codes that force compliance with a bounding safety envelope, and towards codes regularly validated against routine measurements. To validate HELIOS, the 16 ATR operational cycles from late-2009 to present were modeled. The computed power distribution was compared against data collected by the ATR's on-line power surveillance system. It was found that the ATR's lobe powers could be determined with ±10% accuracy. Also, the ATR's cold startup shim configuration for each of these 16 cycles was estimated and compared against the reported critical position from the reactor log-book. HELIOS successfully predicted criticality within the tolerance set by the ATR startup procedure for 13 out of the 16 cycles. This is compared to 12 times for PDQ (without empirical adjustment). These findings, as well as other insights discussed in this report, suggest that HELIOS is highly suited to replacing PDQ for core safety assurance of the ATR. Furthermore, a modern verification and validation framework has been established that allows reactor and fuel performance data to be computed with a known degree of accuracy and stated uncertainty.

  9. Pan-cancer analyses of the nuclear receptor superfamily

    Science.gov (United States)

    Long, Mark D.; Campbell, Moray J.

    2016-01-01

    Nuclear receptors (NR) act as an integrated conduit for environmental and hormonal signals to govern genomic responses, which relate to cell fate decisions. We review how their integrated actions with each other, shared co-factors and other transcription factors are disrupted in cancer. Steroid hormone nuclear receptors are oncogenic drivers in breast and prostate cancer and blockade of signaling is a major therapeutic goal. By contrast to blockade of receptors, in other cancers enhanced receptor function is attractive, as illustrated initially with targeting of retinoic acid receptors in leukemia. In the post-genomic era large consortia, such as The Cancer Genome Atlas, have developed a remarkable volume of genomic data with which to examine multiple aspects of nuclear receptor status in a pan-cancer manner. Therefore, to extend the review of NR function, we have also undertaken bioinformatics analyses of NR expression in over 3000 tumors, spread across six different tumor types (bladder, breast, colon, head and neck, liver and prostate). Specifically, to ask how the NR expression was distorted (altered expression, mutation and CNV), we have applied bootstrapping approaches to simulate data for comparison, and also compared these NR findings to 12 other transcription factor families. Nuclear receptors were uniquely and uniformly downregulated across all six tumor types, more than predicted by chance. These approaches also revealed that each tumor type had a specific NR expression profile, but these were most similar between breast and prostate cancer. Some NRs were down-regulated in at least five tumor types (e.g. NR3C2/MR and NR5A2/LRH-1), whereas others were uniquely down-regulated in one tumor (e.g. NR1B3/RARG). The downregulation was not driven by copy number variation or mutation, and epigenetic mechanisms may be responsible for the altered nuclear receptor expression. PMID:27200367
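
    The bootstrapping comparison described, asking how many members of a gene family would appear downregulated by chance, can be sketched as below; the gene counts and rates are invented for illustration and are not the study's data.

        import numpy as np

        rng = np.random.default_rng(5)

        def downregulation_null(calls, family_size, n_draws=10000):
            """Sample random gene sets of the family's size from genome-wide
            downregulation calls to build a null for the observed NR count."""
            return np.array([
                rng.choice(calls, size=family_size, replace=False).sum()
                for _ in range(n_draws)
            ])

        calls = rng.random(20000) < 0.15       # toy genome, 15% downregulated
        null = downregulation_null(calls, family_size=48)
        observed = 18                          # hypothetical NR genes called down
        print(f"chance expectation {null.mean():.1f}, "
              f"p(>= {observed}) = {(null >= observed).mean():.4f}")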

  10. Pan-Cancer Analyses of the Nuclear Receptor Superfamily

    Directory of Open Access Journals (Sweden)

    Mark D. Long

    2015-12-01

    Full Text Available Nuclear receptors (NR) act as an integrated conduit for environmental and hormonal signals to govern genomic responses, which relate to cell fate decisions. We review how their integrated actions with each other, shared co-factors and other transcription factors are disrupted in cancer. Steroid hormone nuclear receptors are oncogenic drivers in breast and prostate cancer and blockade of signaling is a major therapeutic goal. By contrast to blockade of receptors, in other cancers enhanced receptor function is attractive, as illustrated initially with targeting of retinoic acid receptors in leukemia. In the post-genomic era large consortia, such as The Cancer Genome Atlas, have developed a remarkable volume of genomic data with which to examine multiple aspects of nuclear receptor status in a pan-cancer manner. Therefore, to extend the review of NR function, we have also undertaken bioinformatics analyses of NR expression in over 3000 tumors, spread across six different tumor types (bladder, breast, colon, head and neck, liver and prostate). Specifically, to ask how the NR expression was distorted (altered expression, mutation and CNV), we have applied bootstrapping approaches to simulate data for comparison, and also compared these NR findings to 12 other transcription factor families. Nuclear receptors were uniquely and uniformly downregulated across all six tumor types, more than predicted by chance. These approaches also revealed that each tumor type had a specific NR expression profile, but these were most similar between breast and prostate cancer. Some NRs were down-regulated in at least five tumor types (e.g., NR3C2/MR and NR5A2/LRH-1), whereas others were uniquely down-regulated in one tumor (e.g., NR1B3/RARG). The downregulation was not driven by copy number variation or mutation, and epigenetic mechanisms may be responsible for the altered nuclear receptor expression.

  11. Integrating and scheduling an open set of static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Mezini, Mira; Kloppenburg, Sven;

    2006-01-01

    To improve the productivity of the development process, more and more tools for static software analysis are tightly integrated into the incremental build process of an IDE. If multiple interdependent analyses are used simultaneously, the coordination between the analyses becomes a major obstacle...... to keep the set of analyses open. We propose an approach to integrating and scheduling an open set of static analyses which decouples the individual analyses and coordinates the analysis executions such that the overall time and space consumption is minimized. The approach has been implemented...
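
    One natural realization of such coordination is dependency-aware scheduling: each analysis declares which derived information it consumes and produces, and a topological order guarantees producers run before consumers, with independent analyses batched together. The sketch below (Python's standard graphlib, with invented analysis names) illustrates the idea only and is not the authors' actual implementation.

        from graphlib import TopologicalSorter  # standard library, Python 3.9+

        # analysis -> set of analyses whose results it depends on
        analyses = {
            "parse_classfiles": set(),
            "build_call_graph": {"parse_classfiles"},
            "escape_analysis": {"build_call_graph"},
            "dead_code": {"build_call_graph"},
            "report": {"escape_analysis", "dead_code"},
        }

        ts = TopologicalSorter(analyses)
        ts.prepare()
        while ts.is_active():
            ready = list(ts.get_ready())   # all inputs available; may run in parallel
            print("run:", ready)
            ts.done(*ready)                # mark finished, unlocking dependents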

  12. Prediction of postoperative pain: a systematic review of predictive experimental pain studies

    DEFF Research Database (Denmark)

    Werner, Mads Utke; Mjöbo, Helena N; Nielsen, Per R;

    2010-01-01

    Quantitative testing of a patient's basal pain perception before surgery has the potential to be of clinical value if it can accurately predict the magnitude of pain and requirement of analgesics after surgery. This review includes 14 studies that have investigated the correlation between...... preoperative responses to experimental pain stimuli and clinical postoperative pain and demonstrates that the preoperative pain tests may predict 4-54% of the variance in postoperative pain experience depending on the stimulation methods and the test paradigm used. The predictive strength is much higher than...... previously reported for single factor analyses of demographics and psychologic factors. In addition, some of these studies indicate that an increase in preoperative pain sensitivity is associated with a high probability of development of sustained postsurgical pain....

  13. Integrated Waste Treatment Unit (IWTU) Input Coal Analyses and Off-Gas Filter (OGF) Content Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Jantzen, Carol M. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Missimer, David M. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Guenther, Chris P. [National Energy Technology Lab. (NETL), Morgantown, WV (United States); Shekhawat, Dushyant [National Energy Technology Lab. (NETL), Morgantown, WV (United States); VanEssendelft, Dirk T. [National Energy Technology Lab. (NETL), Morgantown, WV (United States); Means, Nicholas C. [AECOM Technology Corp., Oak Ridge, TN (United States)

    2015-04-23

    in process piping and materials, in excessive off-gas absorbent loading, and in undesired process emissions. The ash content of the coal is important as the ash adds to the DMR and other vessel products, which affect the final waste product mass and composition. The amount and composition of the ash also affect the reaction kinetics. Thus ash content and composition contribute to the mass balance. In addition, sodium, potassium, calcium, sulfur, and possibly silica and alumina in the ash may contribute to wall-scale formation. Sodium, potassium, and alumina in the ash will be overwhelmed by the sodium, potassium, and alumina from the feed, but the impact from the other ash components needs to be quantified. A maximum coal particle size is specified so the feed system does not plug, and a minimum particle size is specified to prevent excess elutriation from the DMR to the Process Gas Filter (PGF). A vendor specification was used to procure the calcined coal for IWTU processing. While the vendor supplied a composite analysis for the 22 tons of coal (Appendix A), this study compares independent analyses of the coal performed at the Savannah River National Laboratory (SRNL) and at the National Energy Technology Laboratory (NETL). Three supersacks were sampled at three different heights within each sack in order to determine within-bag and between-bag variability of the coal. These analyses were also compared to the vendor's composite analyses, to the coal specification, and to historic data on Bestac coal analyses performed at Hazen Research Inc. (HRI) between 2004 and 2011.

  14. Refining intra-protein contact prediction by graph analysis

    Directory of Open Access Journals (Sweden)

    Eyal Eran

    2007-05-01

    Full Text Available Abstract Background Accurate prediction of intra-protein residue contacts from sequence information will allow the prediction of protein structures. Basic predictions of such specific contacts can be further refined by jointly analyzing predicted contacts and by adding information on the relative positions of contacts in the protein primary sequence. Results We introduce a method for graph analysis refinement of intra-protein contacts, termed GARP. Our previously presented intra-contact prediction method by means of a pair-to-pair substitution matrix (P2PConPred) was used to test the GARP method. In our approach, the top contact predictions obtained by a basic prediction method were used as edges to create a weighted graph. The edges were scored by a mutual clustering coefficient that identifies highly connected graph regions, and by the density of edges between the sequence regions of the edge nodes. A test set of 57 proteins with known structures was used to determine contacts. GARP improves the accuracy of the P2PConPred basic prediction method in whole proteins from 12% to 18%. Conclusion Using a simple approach we increased the contact prediction accuracy of a basic method by 1.5 times. Our graph approach is simple to implement, can be used with various basic prediction methods, and can provide input for further downstream analyses.
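
    A sketch of the graph-refinement step: top-scoring predicted contacts become weighted edges, and each edge is boosted according to how connected its neighbourhood is. GARP itself scores edges with a mutual clustering coefficient and the density of edges between sequence regions; the Jaccard neighbourhood overlap and toy scores below are simplifying stand-ins (Python/networkx):

        import networkx as nx

        def neighbourhood_overlap(G, u, v):
            # Jaccard overlap of the two endpoints' neighbourhoods; a
            # stand-in for GARP's mutual clustering coefficient.
            nu, nv = set(G[u]) - {v}, set(G[v]) - {u}
            union = nu | nv
            return len(nu & nv) / len(union) if union else 0.0

        def rescore_contacts(predicted, top_k=100):
            # predicted: (residue_i, residue_j, basic_score) tuples,
            # sorted by the basic predictor's score.
            G = nx.Graph()
            G.add_weighted_edges_from(predicted[:top_k])
            rescored = [(u, v, w * (1.0 + neighbourhood_overlap(G, u, v)))
                        for u, v, w in G.edges(data="weight")]
            return sorted(rescored, key=lambda e: e[2], reverse=True)

        # Toy predictions over residues 1..5
        preds = [(1, 2, 0.9), (2, 3, 0.8), (1, 3, 0.7), (4, 5, 0.6), (3, 4, 0.5)]
        for u, v, s in rescore_contacts(preds):
            print(u, v, round(s, 3))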

  15. Final report on reliability and lifetime prediction.

    Energy Technology Data Exchange (ETDEWEB)

    Gillen, Kenneth T; Wise, Jonathan; Jones, Gary D.; Causa, Al G.; Terrill, Edward R.; Borowczak, Marc

    2012-12-01

    This document highlights the important results obtained from the subtask of the Goodyear CRADA devoted to better understanding reliability of tires and to developing better lifetime prediction methods. The overall objective was to establish the chemical and physical basis for the degradation of tires using standard as well as unique models and experimental techniques. Of particular interest was the potential application of our unique modulus profiling apparatus for assessing tire properties and for following tire degradation. During the course of this complex investigation, extensive relevant information was generated, including experimental results, data analyses and development of models and instruments. Detailed descriptions of the findings are included in this report.

  16. Educational Data Mining & Students’ Performance Prediction

    Directory of Open Access Journals (Sweden)

    Amjad Abu Saa

    2016-05-01

    Full Text Available It is important to study and analyse educational data, especially students’ performance. Educational Data Mining (EDM) is the field of study concerned with mining educational data to find interesting patterns and knowledge in educational organizations. This study is equally concerned with this subject, specifically the students’ performance. It explores multiple factors theoretically assumed to affect students’ performance in higher education, and finds a qualitative model which best classifies and predicts students’ performance based on related personal and social factors.

  17. Prediction of Wild-type Enzyme Characteristics

    DEFF Research Database (Denmark)

    Geertz-Hansen, Henrik Marcus

    of biotechnology, including enzyme discovery and characterization. This work presents two articles on sequence-based discovery and functional annotation of enzymes in environmental samples, and two articles on analysis and prediction of enzyme thermostability and cofactor requirements. The first article presents...... a sequence-based approach to discovery of proteolytic enzymes in metagenomes obtained from the Polar oceans. We show that microorganisms living in these extreme environments of constant low temperature harbour genes encoding novel proteolytic enzymes with potential industrial relevance. The second article...... presents a web server for the processing and annotation of functional metagenomics sequencing data, tailored to meet the requirements of non-bioinformaticians. The third article presents analyses of the molecular determinants of enzyme thermostability, and a feature-based prediction method of the melting...

  18. Comparing Spatial Predictions

    KAUST Repository

    Hering, Amanda S.

    2011-11-01

    Under a general loss function, we develop a hypothesis test to determine whether a significant difference in the spatial predictions produced by two competing models exists on average across the entire spatial domain of interest. The null hypothesis is that of no difference, and a spatial loss differential is created based on the observed data, the two sets of predictions, and the loss function chosen by the researcher. The test assumes only isotropy and short-range spatial dependence of the loss differential but does allow it to be non-Gaussian, non-zero-mean, and spatially correlated. Constant and nonconstant spatial trends in the loss differential are treated in two separate cases. Monte Carlo simulations illustrate the size and power properties of this test, and an example based on daily average wind speeds in Oklahoma is used for illustration. Supplemental results are available online. © 2011 American Statistical Association and the American Society for Quality.
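
    A sketch of the loss-differential construction underlying such a test (in the spirit of Diebold-Mariano). The paper's version replaces the iid variance estimate below with one that accounts for spatial correlation in the loss differential; the data here are simulated:

        import numpy as np

        def loss_differential_test(y, pred1, pred2, loss=lambda e: e**2):
            # H0: no average difference in prediction loss between models.
            d = loss(y - pred1) - loss(y - pred2)   # loss differential
            var = d.var(ddof=1) / len(d)            # iid variance; the paper
                                                    # corrects this for spatial
                                                    # correlation instead
            return d.mean(), d.mean() / np.sqrt(var)

        rng = np.random.default_rng(1)
        y = rng.normal(size=500)
        mean_d, t_stat = loss_differential_test(
            y, y + rng.normal(0, 0.5, 500), y + rng.normal(0, 0.6, 500))
        print(f"mean loss differential {mean_d:.3f}, t-statistic {t_stat:.2f}")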

  19. Predictive Hypothesis Identification

    CERN Document Server

    Hutter, Marcus

    2008-01-01

    While statistics focusses on hypothesis testing and on estimating (properties of) the true sampling distribution, in machine learning the performance of learning algorithms on future data is the primary issue. In this paper we bridge the gap with a general principle (PHI) that identifies hypotheses with best predictive performance. This includes predictive point and interval estimation, simple and composite hypothesis testing, (mixture) model selection, and others as special cases. For concrete instantiations we will recover well-known methods, variations thereof, and new ones. PHI nicely justifies, reconciles, and blends (a reparametrization invariant variation of) MAP, ML, MDL, and moment estimation. One particular feature of PHI is that it can genuinely deal with nested hypotheses.

  20. Predicting Lotto Numbers

    DEFF Research Database (Denmark)

    Jørgensen, Claus Bjørn; Suetens, Sigrid; Tyran, Jean-Robert

    We investigate the “law of small numbers” using a unique panel data set on lotto gambling. Because we can track individual players over time, we can measure how they react to outcomes of recent lotto drawings. We can therefore test whether they behave as if they believe they can predict lotto numbers based on recent drawings. While most players pick the same set of numbers week after week without regard to the numbers drawn or anything else, we find that those who do change act on average in the way predicted by the law of small numbers as formalized in recent behavioral theory. In particular, on average they move away from numbers that have recently been drawn, as suggested by the “gambler’s fallacy”, and move toward numbers that are on streak, i.e. have been drawn several weeks in a row, consistent with the “hot hand fallacy”.

  1. Predictability of Critical Transitions

    CERN Document Server

    Zhang, Xiaozhu; Hallerberg, Sarah

    2015-01-01

    Critical transitions in multistable systems have been discussed as models for a variety of phenomena ranging from the extinctions of species to socio-economic changes and climate transitions between ice-ages and warm-ages. From bifurcation theory we can expect certain critical transitions to be preceded by a decreased recovery from external perturbations. The consequences of this critical slowing down have been observed as an increase in variance and autocorrelation prior to the transition. However, especially in the presence of noise, it is not clear whether these changes in observation variables are statistically relevant such that they could be used as indicators for critical transitions. In this contribution we investigate the predictability of critical transitions in conceptual models. We study the quadratic integrate-and-fire model and the van der Pol model under the influence of external noise. We focus especially on the statistical analysis of the success of predictions and the overall predictabil...

  2. Predicting Bankruptcy in Pakistan

    Directory of Open Access Journals (Sweden)

    Abdul RASHID

    2011-09-01

    Full Text Available This paper aims to identify the financial ratios that are most significant in bankruptcy prediction for the non-financial sector of Pakistan, based on a sample of companies that became bankrupt over the period 1996-2006. Twenty-four financial ratios covering four important financial attributes, namely profitability, liquidity, leverage, and turnover, were examined for a five-year period prior to bankruptcy. The discriminant analysis produced a parsimonious model of three variables: sales to total assets, EBIT to current liabilities, and the cash flow ratio. Our estimates provide evidence that firms with a Z-value below zero fall into the “bankrupt” category, whereas firms with a Z-value above zero fall into the “non-bankrupt” category. The model achieved 76.9% prediction accuracy when applied to forecast bankruptcies in the underlying sample.
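
    A sketch of how such a discriminant score separates the two categories at the zero cut-off. The abstract does not report the estimated coefficients, so the weights, intercept and input ratios below are placeholders, not the fitted model:

        def z_score(sales_to_assets, ebit_to_cl, cash_flow_ratio,
                    w=(1.0, 1.0, 1.0), intercept=0.0):
            # Placeholder weights/intercept: the abstract does not report
            # the estimated discriminant coefficients.
            return (intercept + w[0] * sales_to_assets
                    + w[1] * ebit_to_cl + w[2] * cash_flow_ratio)

        z = z_score(sales_to_assets=1.2, ebit_to_cl=0.3, cash_flow_ratio=0.1)
        print("non-bankrupt" if z > 0 else "bankrupt", round(z, 2))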

  3. Chaos detection and predictability

    CERN Document Server

    Gottwald, Georg; Laskar, Jacques

    2016-01-01

    Distinguishing chaoticity from regularity in deterministic dynamical systems and specifying the subspace of the phase space in which instabilities are expected to occur is of utmost importance in areas as disparate as astronomy, particle physics and climate dynamics. To address these issues there exists a plethora of methods for chaos detection and predictability. The most commonly employed technique for investigating chaotic dynamics, i.e. the computation of Lyapunov exponents, may however suffer from a number of problems and drawbacks, for example when applied to noisy experimental data. In the last two decades, several novel methods have been developed for the fast and reliable determination of the regular or chaotic nature of orbits, aimed at overcoming the shortcomings of more traditional techniques. This set of lecture notes and tutorial reviews serves as an introduction to and overview of modern chaos detection and predictability techniques for graduate students and non-specialists. The book cover...

  4. Candidate Prediction Models and Methods

    DEFF Research Database (Denmark)

    Nielsen, Henrik Aalborg; Nielsen, Torben Skov; Madsen, Henrik;

    2005-01-01

    This document lists candidate prediction models for Work Package 3 (WP3) of the PSO project "Intelligent wind power prediction systems" (FU4101). The main focus is on the models transforming numerical weather predictions into predictions of power production. The document also outlines the...

  5. Nominal model predictive control

    OpenAIRE

    Grüne, Lars

    2013-01-01

    5 p., to appear in Encyclopedia of Systems and Control, Tariq Samad, John Baillieul (eds.). Model Predictive Control is a controller design method which synthesizes a sampled-data feedback controller from the iterative solution of open-loop optimal control problems. We describe the basic functionality of MPC controllers, their properties regarding feasibility, stability and performance, and the assumptions needed in order to rigorously ensure these properties in a nomina...
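
    A minimal receding-horizon sketch of the mechanism just described: at each sampling instant an open-loop optimal control problem is solved, only the first input is applied, and the optimization is repeated at the next sample. The scalar dynamics, horizon and cost weights are invented for illustration (Python/SciPy):

        import numpy as np
        from scipy.optimize import minimize

        # Hypothetical scalar system x+ = a*x + b*u, horizon N
        a, b, N = 0.9, 0.5, 10

        def cost(u, x0):
            # Simulate the candidate open-loop input sequence
            x, J = x0, 0.0
            for uk in u:
                J += x**2 + 0.1 * uk**2
                x = a * x + b * uk
            return J + 10.0 * x**2              # terminal penalty

        x = 5.0
        for t in range(20):
            res = minimize(cost, np.zeros(N), args=(x,))
            u0 = res.x[0]                       # apply only the first input,
            x = a * x + b * u0                  # then re-optimize next sample
            print(f"t={t:2d}  u={u0:+.3f}  x={x:+.3f}")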

  6. Multivariate respiratory motion prediction

    International Nuclear Information System (INIS)

    In extracranial robotic radiotherapy, tumour motion is compensated by tracking external and internal surrogates. To compensate system specific time delays, time series prediction of the external optical surrogates is used. We investigate whether the prediction accuracy can be increased by expanding the current clinical setup by an accelerometer, a strain belt and a flow sensor. Four previously published prediction algorithms are adapted to multivariate inputs—normalized least mean squares (nLMS), wavelet-based least mean squares (wLMS), support vector regression (SVR) and relevance vector machines (RVM)—and evaluated for three different prediction horizons. The measurement involves 18 subjects and consists of two phases, focusing on long term trends (M1) and breathing artefacts (M2). To select the most relevant and least redundant sensors, a sequential forward selection (SFS) method is proposed. Using a multivariate setting, the results show that the clinically used nLMS algorithm is susceptible to large outliers. In the case of irregular breathing (M2), the mean root mean square error (RMSE) of a univariate nLMS algorithm is 0.66 mm and can be decreased to 0.46 mm by a multivariate RVM model (best algorithm on average). To investigate the full potential of this approach, the optimal sensor combination was also estimated on the complete test set. The results indicate that a further decrease in RMSE is possible for RVM (to 0.42 mm). This motivates further research about sensor selection methods. Besides the optical surrogates, the sensors most frequently selected by the algorithms are the accelerometer and the strain belt. These sensors could be easily integrated in the current clinical setup and would allow a more precise motion compensation. (paper)
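
    A simplified sketch of a multivariate normalized least mean squares (nLMS) predictor of the kind compared above: a weight vector over all sensor channels is updated with a normalized gradient step to predict the target surrogate a fixed horizon ahead. The signals, step size and horizon are synthetic stand-ins for the clinical data:

        import numpy as np

        def nlms_predict(signals, horizon, mu=0.5, eps=1e-6):
            # signals: (n_samples, n_sensors); column 0 is the target.
            n, m = signals.shape
            w = np.zeros(m)
            preds = np.zeros(n)
            for t in range(n - horizon):
                x = signals[t]                       # multivariate input
                preds[t + horizon] = w @ x
                e = signals[t + horizon, 0] - w @ x  # target prediction error
                w += mu * e * x / (eps + x @ x)      # normalized update
            return preds

        rng = np.random.default_rng(0)
        t = np.arange(2000) * 0.02                   # 50 Hz sampling
        optical = np.sin(2 * np.pi * 0.25 * t)       # breathing surrogate
        accel = np.cos(2 * np.pi * 0.25 * t) + 0.05 * rng.normal(size=t.size)
        preds = nlms_predict(np.column_stack([optical, accel]), horizon=5)
        rmse = np.sqrt(np.mean((preds[100:] - optical[100:]) ** 2))
        print(f"RMSE over the evaluation window: {rmse:.4f}")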

  7. Predictive Game Theory

    Science.gov (United States)

    Wolpert, David H.

    2005-01-01

    Probability theory governs the outcome of a game; there is a distribution over mixed strategies, not a single "equilibrium". To predict a single mixed strategy, one must use a loss function (external to the game's players). This provides a quantification of any strategy's rationality. We prove that rationality falls as the cost of computation rises (for players who have not previously interacted). All of this extends to games with varying numbers of players.

  8. Time-predictable architectures

    CERN Document Server

    Rochange, Christine; Uhrig, Sascha

    2014-01-01

    Building computers that can be used to design embedded real-time systems is the subject of this title. Real-time embedded software requires increasingly higher performance. The authors therefore consider processors that implement advanced mechanisms such as pipelining, out-of-order execution, branch prediction, cache memories, multi-threading, multicore architectures, etc. The authors of this book investigate the time predictability of such schemes.

  9. Multivariate respiratory motion prediction

    Science.gov (United States)

    Dürichen, R.; Wissel, T.; Ernst, F.; Schlaefer, A.; Schweikard, A.

    2014-10-01

    In extracranial robotic radiotherapy, tumour motion is compensated by tracking external and internal surrogates. To compensate system specific time delays, time series prediction of the external optical surrogates is used. We investigate whether the prediction accuracy can be increased by expanding the current clinical setup by an accelerometer, a strain belt and a flow sensor. Four previously published prediction algorithms are adapted to multivariate inputs—normalized least mean squares (nLMS), wavelet-based least mean squares (wLMS), support vector regression (SVR) and relevance vector machines (RVM)—and evaluated for three different prediction horizons. The measurement involves 18 subjects and consists of two phases, focusing on long term trends (M1) and breathing artefacts (M2). To select the most relevant and least redundant sensors, a sequential forward selection (SFS) method is proposed. Using a multivariate setting, the results show that the clinically used nLMS algorithm is susceptible to large outliers. In the case of irregular breathing (M2), the mean root mean square error (RMSE) of a univariate nLMS algorithm is 0.66 mm and can be decreased to 0.46 mm by a multivariate RVM model (best algorithm on average). To investigate the full potential of this approach, the optimal sensor combination was also estimated on the complete test set. The results indicate that a further decrease in RMSE is possible for RVM (to 0.42 mm). This motivates further research about sensor selection methods. Besides the optical surrogates, the sensors most frequently selected by the algorithms are the accelerometer and the strain belt. These sensors could be easily integrated in the current clinical setup and would allow a more precise motion compensation.

  10. Thinking about Aid Predictability

    OpenAIRE

    Andrews, Matthew; Wilhelm, Vera

    2008-01-01

    Researchers are giving more attention to aid predictability. In part, this is because of increases in the number of aid agencies and aid dollars and the growing complexity of the aid community. A growing body of research is examining key questions: Is aid unpredictable? What causes unpredictability? What can be done about it? This note draws from a selection of recent literature to bring s...

  11. Predicting helpful product reviews

    OpenAIRE

    O'Mahony, Michael P.; Cunningham, Pádraig; Smyth, Barry

    2010-01-01

    Millions of users are today posting user-generated content online, expressing their opinions on all manner of goods and services, topics and social affairs. While undoubtedly useful, user-generated content presents consumers with significant challenges in terms of information overload and quality considerations. In this paper, we address these issues in the context of product reviews and present a brief survey of our work to date on predicting review helpfulness. In particular, the performa...

  12. The Predictive Audit Framework

    OpenAIRE

    Kuenkaikaew, Siripan; Vasarhelyi, Miklos A.

    2013-01-01

    Assurance is an essential part of the business process of the modern enterprise. Auditing is a widely used assurance method, made mandatory for public companies since 1934. The traditional (retroactive) audit provides after-the-fact audit reports and is of limited value in the ever-changing modern business environment because it is slow and backward-looking. Contemporary auditing and monitoring technologies could shorten the audit and assurance time frame. This paper proposes the predictive ...

  13. Predicting appointment breaking.

    Science.gov (United States)

    Bean, A G; Talaga, J

    1995-01-01

    The goal of physician referral services is to schedule appointments, but if too many patients fail to show up, the value of the service will be compromised. The authors found that appointment breaking can be predicted by the number of days to the scheduled appointment, the doctor's specialty, and the patient's age and gender. They also offer specific suggestions for modifying the marketing mix to reduce the incidence of no-shows. PMID:10142384

  14. Azimuthal angular distributions in EDDE as a spin-parity analyser and glueball filter for the LHC

    Energy Technology Data Exchange (ETDEWEB)

    Petrov, Vladimir Alexeevich [Institute for High Energy Physics, 142 281, Protvino (Russian Federation); Ryutin, Roman Anatolievich [Institute for High Energy Physics, 142 281, Protvino (Russian Federation); Sobol, Andrei E. [Institute for High Energy Physics, 142 281, Protvino (Russian Federation); Guillaud, Jean-Paul [LAPP, Annecy (France)

    2005-06-01

    Exclusive Double Diffractive Events (EDDE) are analysed as a source of information about the central system. The experimental possibilities for exotic particle searches are considered. From the reggeized tensor current picture, azimuthal angle dependences were obtained to fit the data from the WA102 experiment and to make predictions for the LHC collider.

  15. Eclipse prediction in Mesopotamia.

    Science.gov (United States)

    Steele, J. M.

    2000-02-01

    Among the many celestial phenomena observed in ancient Mesopotamia, eclipses, particularly eclipses of the Moon, were considered to be among the astrologically most significant events. In Babylon, by at least the middle of the seventh century BC, and probably as early as the middle of the eighth century BC, astronomical observations were being systematically conducted and recorded in a group of texts which we have come to call Astronomical Diaries. These Diaries contain many observations and predictions of eclipses. The predictions generally include the expected time of the eclipse, apparently calculated quite precisely. By the last three centuries BC, the Babylonian astronomers had developed highly advanced mathematical theories of the Moon and planets. This paper outlines the various methods which appear to have been formulated by the Mesopotamian astronomers to predict eclipses of the Sun and the Moon. It also considers the question of which of these methods were actually used in compiling the Astronomical Diaries, and speculates why these particular methods were used.

  16. Is Suicide Predictable?

    Directory of Open Access Journals (Sweden)

    S Asmaee

    2012-04-01

    Full Text Available Background: The current study aimed to test the hypothesis "Is suicide predictable?" and to classify the predictive factors in multiple suicide attempts. Methods: A cross-sectional study was administered to 223 multiple attempters, women who came to a medical poison centre after a suicide attempt. The participants were young, poor, and single. A logistic regression analysis was used to classify the predictive factors of suicide. Results: Women who had multiple suicide attempts exhibited a significant tendency to attempt suicide again. They had a history of more than two years of multiple suicide attempts, from three to as many as 18 times, plus mental illnesses such as depression and substance abuse. They also had a positive family history of mental illnesses. Conclusion: Results indicate that contributing factors for another suicide attempt include previous suicide attempts, mental illness (depression) or a positive family history of mental illnesses affecting them at a young age, and substance abuse.

  17. Improvements in Hanford TRU Program Utilizing Systems Modeling and Analyses

    International Nuclear Information System (INIS)

    Hanford's Transuranic (TRU) Program is responsible for certifying contact-handled (CH) TRU waste and shipping the certified waste to the Waste Isolation Pilot Plant (WIPP). Hanford's CH TRU waste includes material that is in retrievable storage as well as above-ground storage, and newly generated waste. Certifying a typical container entails retrieving and then characterizing it (Non-Destructive Examination [NDE], Non-Destructive Assay [NDA], and Head Space Gas Sampling [HSG]), validating records (data review and reconciliation), and designating the container for a payload. The certified payload is then shipped to WIPP. Systems modeling and analysis techniques were applied to Hanford's TRU Program to help streamline the certification process and increase shipping rates. The modeling and analysis yield several benefits: - Maintains visibility on system performance and predicts downstream consequences of production issues. - Predicts future system performance with higher confidence, based on tracking past performance. - Applies speculation analyses to determine the impact of proposed changes (e.g., an apparent shortage of feed should not be used as a basis to reassign personnel if more feed is coming in the queue). - Positively identifies the appropriate queue for all containers (e.g., discovered several containers that were not actively being worked because they were in the wrong 'physical' location - the method used previously for queuing up containers). - Identifies anomalies within the various data systems used to track inventory (e.g., dimensional differences for Standard Waste Boxes). A model of the TRU Program certification process was created using custom queries of the multiple databases for managing waste containers. The model was developed using a simplified process chart based on the expected path for a typical container. The process chart was augmented with the remediation path for containers that do not meet acceptance criteria for WIPP. Containers are sorted

  18. Predicting Community Evolution in Social Networks

    Directory of Open Access Journals (Sweden)

    Stanisław Saganowski

    2015-05-01

    Full Text Available Nowadays, sustained development of different social media can be observed worldwide. One of the relevant research domains intensively explored recently is the analysis of social communities existing in social media, as well as the prediction of their future evolution based on collected historical evolution chains. The evolution chains proposed in the paper contain group states in previous time frames and the historical transitions between them, identified using one of two methods: Stable Group Changes Identification (SGCI) and Group Evolution Discovery (GED). Based on the observed evolution chains of various lengths, structural network features are extracted, validated and selected, and then used to learn classification models. The experimental studies were performed on three real datasets with different profiles: DBLP, Facebook and the Polish blogosphere. The process of group prediction was analysed with respect to different classifiers as well as various descriptive feature sets extracted from evolution chains of different lengths. The results revealed that, in general, the longer the evolution chains, the better the predictive abilities of the classification models. However, chains of length 3 to 7 enabled the GED-based method to almost reach its maximum possible prediction quality; for SGCI, this level was reached with chains covering the last 3–5 periods.
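
    A sketch of the final classification step: structural features extracted from evolution chains are fed to an off-the-shelf classifier that predicts the next group event. The feature matrix and labels below are synthetic; in the study the features follow the SGCI/GED chain definitions (Python/scikit-learn):

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        # Hypothetical setup: each row holds structural features extracted
        # from one evolution chain (size, density, cohesion, ... per time
        # frame); the label is the group's next event (grow/shrink/...).
        rng = np.random.default_rng(0)
        X = rng.normal(size=(500, 12))        # 12 chain features
        y = rng.integers(0, 4, size=500)      # 4 possible next events

        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())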

  19. Odor Impression Prediction from Mass Spectra

    Science.gov (United States)

    Nakamoto, Takamichi

    2016-01-01

    The sense of smell arises from the perception of odors from chemicals. However, the relationship between the impression of odor and the numerous physicochemical parameters has yet to be understood owing to its complexity. As such, there is no established general method for predicting the impression of odor of a chemical only from its physicochemical properties. In this study, we designed a novel predictive model based on an artificial neural network with a deep structure for predicting odor impression utilizing the mass spectra of chemicals, and we conducted a series of computational analyses to evaluate its performance. Feature vectors extracted from the original high-dimensional space using two autoencoders equipped with both input and output layers in the model are used to build a mapping function from the feature space of mass spectra to the feature space of sensory data. The results of predictions obtained by the proposed new method have notable accuracy (R≅0.76) in comparison with a conventional method (R≅0.61). PMID:27326765

  20. Aeroacoustic Prediction Codes

    Science.gov (United States)

    Gliebe, P; Mani, R.; Shin, H.; Mitchell, B.; Ashford, G.; Salamah, S.; Connell, S.; Huff, Dennis (Technical Monitor)

    2000-01-01

    This report describes work performed on Contract NAS3-27720 AoI 13 as part of the NASA Advanced Subsonic Transport (AST) Noise Reduction Technology effort. Computer codes were developed to provide quantitative prediction, design, and analysis capability for several aircraft engine noise sources. The objective was to provide improved, physics-based tools for exploration of noise-reduction concepts and understanding of experimental results. Methods and codes focused on fan broadband and 'buzz saw' noise and on low-emissions combustor noise, and complement work done by other contractors under the NASA AST program to develop methods and codes for fan harmonic tone noise and jet noise. The methods and codes developed and reported herein employ a wide range of approaches, from the strictly empirical to the completely computational, with some being semiempirical, analytical, and/or analytical/computational. Emphasis was on capturing the essential physics while still considering method or code utility as a practical design and analysis tool for everyday engineering use. Codes and prediction models were developed for: (1) an improved empirical correlation model for fan rotor exit flow mean and turbulence properties, for use in predicting broadband noise generated by rotor exit flow turbulence interaction with downstream stator vanes; (2) fan broadband noise models for rotor and stator/turbulence interaction sources including 3D effects, noncompact-source effects, directivity modeling, and extensions to the rotor supersonic tip-speed regime; (3) fan multiple-pure-tone in-duct sound pressure prediction methodology based on computational fluid dynamics (CFD) analysis; and (4) low-emissions combustor prediction methodology and computer code based on CFD and actuator disk theory. In addition, the relative importance of dipole and quadrupole source mechanisms was studied using direct CFD source computation for a simple cascade/gust interaction problem, and an empirical combustor

  1. Predicting Alcohol, Cigarette, and Marijuana Use from Preferential Music Consumption

    Science.gov (United States)

    Oberle, Crystal D.; Garcia, Javier A.

    2015-01-01

    This study investigated whether use of alcohol, cigarettes, and marijuana may be predicted from preferential consumption of particular music genres. Undergraduates (257 women and 78 men) completed a questionnaire assessing these variables. Partial correlation analyses, controlling for sensation-seeking tendencies and behaviors, revealed that…

  2. Determinants of work ability and its predictive value for disability

    NARCIS (Netherlands)

    S.M. Alavinia; A.G.E.M. de Boer; J.C. van Duivenbooden; M.H.W. Frings-Dresen; A. Burdorf

    2009-01-01

    Background Maintaining the ability of workers to cope with physical and psychosocial demands at work becomes increasingly important in prolonging working life. Aims To analyse the effects of work-related factors and individual characteristics on work ability and to determine the predictive value of

  3. Predicting travel time variability for cost-benefit analysis

    NARCIS (Netherlands)

    S. Peer; C. Koopmans; E.T. Verhoef

    2010-01-01

    Unreliable travel times cause substantial costs to travelers. Nevertheless, they are not taken into account in many cost-benefit-analyses (CBA), or only in very rough ways. This paper aims at providing simple rules on how variability can be predicted, based on travel time data from Dutch highways. T

  4. Numerical prediction of the IJkDijk trial embankment failure

    NARCIS (Netherlands)

    N. Melnikova; D. Jordan; V. Krzhizhanovskaya; P. Sloot

    2015-01-01

    The paper analyses the experimental slope failure of a full-scale earthen dyke (levee) in Booneschans (Groningen, the Netherlands). The goals of the experiment were to develop efficient dyke-monitoring systems predicting various modes of failure well in advance of onset and to test the ability of nu

  5. Large-scale prediction of drug-target relationships

    DEFF Research Database (Denmark)

    Kuhn, Michael; Campillos, Mónica; González, Paula;

    2008-01-01

    also provides a more global view on drug-target relations. Here we review recent attempts to apply large-scale computational analyses to predict novel interactions of drugs and targets from molecular and cellular features. In this context, we quantify the family-dependent probability of two proteins to...

  6. Analysis and Prediction Techniques

    DEFF Research Database (Denmark)

    Heiselberg, Per

    In large enclosures, common ventilation strategies, such as complete mixing, require considerable amounts of energy to move and condition enormous amounts of air. The air flow pattern should therefore be well planned and controlled to ensure an acceptable indoor air quality in the occupied zone without...... the need for excessive air flow rates. This part of the summary report concentrates on describing methods for designing and analysing ventilation in large enclosures. It includes application of different mathematical models in the design process for simulation of temperature distribution, air motion...

  7. Integrative analyses shed new light on human ribosomal protein gene regulation.

    Science.gov (United States)

    Li, Xin; Zheng, Yiyu; Hu, Haiyan; Li, Xiaoman

    2016-01-01

    Ribosomal protein genes (RPGs) are important house-keeping genes that are well known for their coordinated expression. Previous studies of RPGs are largely limited to their promoter regions. Recent high-throughput studies provide an unprecedented opportunity to study how human RPGs are transcriptionally modulated and how such transcriptional regulation may contribute to coordinated gene expression in various tissues and cell types. By analyzing the DNase I hypersensitive sites under 349 experimental conditions, we predicted 217 RPG regulatory regions in the human genome. More than 86.6% of these computationally predicted regulatory regions were partially corroborated by independent experimental measurements. Motif analyses of these predicted regulatory regions identified 31 DNA motifs, including 57.1% of the experimentally validated motifs in the literature that regulate RPGs. Interestingly, we observed that the majority of the predicted motifs were shared by the predicted distal and proximal regulatory regions of the same RPGs, a likely general mechanism for enhancer-promoter interactions. We also found that RPGs may be differently regulated in different cells, indicating that condition-specific RPG regulatory regions still need to be discovered and investigated. Our study advances the understanding of how RPGs are coordinately modulated, which sheds light on the general principles of gene transcriptional regulation in mammals. PMID:27346035

  8. New dimension analyses with error analysis for quaking aspen and black spruce

    Science.gov (United States)

    Woods, K. D.; Botkin, D. B.; Feiveson, A. H.

    1987-01-01

    Dimension analyses for black spruce in wetland stands and for trembling aspen are reported, including new approaches to error analysis. Biomass estimates for sacrificed trees have standard errors of 1 to 3%; standard errors for leaf areas are 10 to 20%. Bole biomass estimation accounts for most of the error for biomass, while estimation of branch characteristics and area/weight ratios accounts for the leaf area error. Error analysis provides insight for cost-effective design of future analyses. Predictive equations for biomass and leaf area, with empirically derived estimators of prediction error, are given. Systematic prediction errors for small aspen trees and for leaf area of spruce from different site types suggest a need for different predictive models within species. The predictive equations are compared with published equations; significant differences may be due to species responses to regional or site differences. Proportional contributions of component biomass in aspen change in ways related to tree size and stand development. Spruce maintains comparatively constant proportions with size, but shows changes corresponding to site. This suggests greater morphological plasticity of aspen and, for spruce, the significance of nutrient conditions.
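
    Predictive equations of this kind are commonly log-log allometric fits of the form ln(biomass) = a + b ln(dbh); a sketch with invented diameter/biomass pairs:

        import numpy as np

        # Invented diameter (cm) / biomass (kg) pairs for illustration
        dbh = np.array([4.1, 6.3, 8.0, 11.5, 14.2, 18.9])
        biomass = np.array([3.2, 9.8, 19.5, 52.0, 96.4, 221.0])

        # Fit ln(biomass) = a + b * ln(dbh)
        b, a = np.polyfit(np.log(dbh), np.log(biomass), 1)
        pred = np.exp(a) * dbh**b
        rel_err = np.abs(pred - biomass) / biomass
        print(f"biomass ~ {np.exp(a):.3f} * dbh^{b:.2f}; "
              f"mean relative error {rel_err.mean():.1%}")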

  9. Prolonged grief and depression after unnatural loss: Latent class analyses and cognitive correlates.

    Science.gov (United States)

    Boelen, Paul A; Reijntjes, Albert; J Djelantik, A A A Manik; Smid, Geert E

    2016-06-30

    This study sought to identify (a) subgroups among people confronted with unnatural/violent loss characterized by different symptom profiles of prolonged grief disorder (PGD) and depression, and (b) socio-demographic, loss-related, and cognitive variables associated with subgroup membership. We used data from 245 individuals confronted with the death of a loved one due to an accident (47.3%), suicide (49%) or homicide (3.7%). Latent class analysis revealed three classes of participants: a Resilient class (25.3%), a predominantly PGD class (39.2%), and a combined PGD/Depression class (35.5%). Membership in the Resilient class was predicted by longer time since loss and lower age; membership in the combined class was predicted by lower education. Endorsement of negative cognitions about the self, life, the future, and one's own grief reactions was lowest in the Resilient class, intermediate in the PGD class, and highest in the combined PGD/Depression class. When all socio-demographic, loss-related, and cognitive variables were included in multinomial regression analyses predicting class membership, negative cognitions about one's grief was the only variable predicting membership of the PGD class, while negative cognitions about the self, life, and grief predicted membership of the combined PGD/Depression class. These findings provide valuable information for the development of interventions for different subgroups of bereaved individuals confronted with unnatural/violent loss. PMID:27138832

  10. The STD/MHD codes - Comparison of analyses with experiments at AEDC/HPDE, Reynolds Metal Co., and Hercules, Inc. [for MHD generator flows

    Science.gov (United States)

    Vetter, A. A.; Maxwell, C. D.; Swean, T. F., Jr.; Demetriades, S. T.; Oliver, D. A.; Bangerter, C. D.

    1981-01-01

    Data from sufficiently well-instrumented, short-duration experiments at AEDC/HPDE, Reynolds Metal Co., and Hercules, Inc., are compared to analyses with multidimensional and time-dependent simulations with the STD/MHD computer codes. These analyses reveal detailed features of major transient events, severe loss mechanisms, and anomalous MHD behavior. In particular, these analyses predicted higher-than-design voltage drops, Hall voltage overshoots, and asymmetric voltage drops before the experimental data were available. The predictions obtained with these analyses are in excellent agreement with the experimental data and the failure predictions are consistent with the experiments. The design of large, high-interaction or advanced MHD experiments will require application of sophisticated, detailed and comprehensive computational procedures in order to account for the critical mechanisms which led to the observed behavior in these experiments.

  11. Large-displacement structural durability analyses of simple bend specimen emulating rocket nozzle liners

    Science.gov (United States)

    Arya, Vinod K.; Halford, Gary R.

    1994-01-01

    Large-displacement elastic and elastic-plastic, finite-element stress-strain analyses of an oxygen-free high-conductivity (OFHC) copper plate specimen were performed using an updated Lagrangian formulation. The plate specimen is intended for low-cost experiments that emulate the most important thermomechanical loading and failure modes of a more complex rocket nozzle. The plate, which is loaded in bending at 593 C, contains a centrally located and internally pressurized channel. The cyclic crack initiation lives were estimated using the results from the analyses and isothermal strain-controlled low-cycle fatigue data for OFHC copper. A comparison of the predicted and experimental cyclic lives showed that the elastic analysis predicts a cyclic life longer than that observed in experiments by a factor greater than 4. The results from the elastic-plastic analysis of the plate bend specimen, however, predicted a cyclic life in close agreement with experiment, thus justifying the need for the more rigorous stress-strain analysis.

  12. Coal extraction - environmental prediction

    Energy Technology Data Exchange (ETDEWEB)

    C. Blaine Cecil; Susan J. Tewalt

    2002-08-01

    To predict and help minimize the impact of coal extraction in the Appalachian region, the U.S. Geological Survey (USGS) is addressing selected mine-drainage issues through the following four interrelated studies: spatial variability of deleterious materials in coal and coal-bearing strata; kinetics of pyrite oxidation; improved spatial geologic models of the potential for drainage from abandoned coal mines; and methodologies for the remediation of waters discharged from coal mines. As these goals are achieved, the recovery of coal resources will be enhanced. 2 figs.

  13. Mathematics of Predicting Growth

    OpenAIRE

    Nielsen, Ron W

    2015-01-01

    Abstract. Mathematical methods of analysis of data and of predicting growth are discussed. The starting point is the analysis of growth rates, which can be expressed as a function of time or as a function of the size of the growing entity. Application of these methods is illustrated using world economic growth, but they can be applied to any type of growth. Keywords: Growth rate, Differential equations, Gross Domestic Product, Economic growth. JEL: C01, C20, C50, C53, C60, C65, C80.
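
    A sketch of the second formulation, with the growth rate expressed as a function of the size S of the growing entity, dS/dt = r(S)S; the logistic-style rate function and its parameters are hypothetical:

        import numpy as np
        from scipy.integrate import solve_ivp

        # Growth rate as a function of the size S of the growing entity:
        # dS/dt = r(S) * S with a hypothetical, logistic-style rate r(S).
        def rate(S, r0=0.03, K=100.0):
            return r0 * (1.0 - S / K)

        sol = solve_ivp(lambda t, S: [rate(S[0]) * S[0]], (0, 200), [5.0],
                        t_eval=np.linspace(0, 200, 5))
        for t, S in zip(sol.t, sol.y[0]):
            print(f"t = {t:5.1f}   S = {S:6.2f}")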

  14. Foundations of predictive analytics

    CERN Document Server

    Wu, James

    2012-01-01

    Drawing on the authors' two decades of experience in applied modeling and data mining, Foundations of Predictive Analytics presents the fundamental background required for analyzing data and building models for many practical applications, such as consumer behavior modeling, risk and marketing analytics, and other areas. It also discusses a variety of practical topics that are frequently missing from similar texts. The book begins with the statistical and linear algebra/matrix foundation of modeling methods, from distributions to cumulant and copula functions to Cornish-Fisher expansion and o

  15. Asphalt pavement temperature prediction

    OpenAIRE

    Minhoto, Manuel; Pais, Jorge; Pereira, Paulo

    2006-01-01

    A 3-D finite element model (FEM) was developed to calculate the temperature of an asphalt rubber pavement located in the Northeast of Portugal. The goal of the case study presented in this paper is to show the good accuracy of the temperature prediction that can be obtained with this model when compared with the field pavement thermal condition observed during a year. Input data to the model are the hourly values for solar radiation and temperature and the mean daily value of wind speed obtained fr...

  16. Essays on Earnings Predictability

    DEFF Research Database (Denmark)

    Bruun, Mark

    This dissertation addresses the prediction of corporate earnings. The thesis aims to examine whether the degree of precision in earnings forecasts can be increased by basing them on historical financial ratios. Furthermore, the intent of the dissertation is to analyze whether accounting standards...... forecasts can be generated based on historical time-series patterns of financial ratios. This is done by modeling the return on equity and the growth rate in equity as two separate but correlated time-series processes which converge to a long-term, constant level. Empirical results suggest that these earnings...

  17. Predicting Lotto Numbers

    DEFF Research Database (Denmark)

    Suetens, Sigrid; Galbo-Jørgensen, Claus B.; Tyran, Jean-Robert Karl

    2016-01-01

    We investigate the ‘law of small numbers’ using a data set on lotto gambling that allows us to measure players’ reactions to draws. While most players pick the same set of numbers week after week, we find that those who do change react on average as predicted by the law of small numbers as formalized in recent behavioral theory. In particular, players tend to bet less on numbers that have been drawn in the preceding week, as suggested by the ‘gambler’s fallacy’, and bet more on a number if it was frequently drawn in the recent past, consistent with the ‘hot-hand fallacy’.

  18. Predicting Lotto Numbers

    DEFF Research Database (Denmark)

    Suetens, Sigrid; Galbo-Jørgensen, Claus B.; Tyran, Jean-Robert Karl

    2015-01-01

    We investigate the ‘law of small numbers’ using a data set on lotto gambling that allows us to measure players’ reactions to draws. While most players pick the same set of numbers week after week, we find that those who do change react on average as predicted by the law of small numbers as formalized in recent behavioral theory. In particular, players tend to bet less on numbers that have been drawn in the preceding week, as suggested by the ‘gambler’s fallacy’, and bet more on a number if it was frequently drawn in the recent past, consistent with the ‘hot-hand fallacy’.

  19. Vertebral Fracture Prediction

    DEFF Research Database (Denmark)

    2008-01-01

    Vertebral Fracture Prediction A method of processing data derived from an image of at least part of a spine is provided for estimating the risk of a future fracture in vertebrae of the spine. Position data relating to at least four neighbouring vertebrae of the spine are processed. The curvature of the spine at at least two of the neighbouring vertebrae is calculated. The different curvature values are computed to obtain a value representative of the degree of irregularity in curvature of the spine, and, using the degree of irregularity, an estimate of the risk of a future fracture in vertebrae
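
    A sketch of one way to realize the described computation: discrete (Menger) curvature at each interior vertebra from three consecutive centre positions, with the spread of the curvature values serving as a simple irregularity index. The positions are invented, and the patent's exact comparison of curvature values may differ:

        import numpy as np

        def menger_curvature(p, q, r):
            # Curvature of the circle through three consecutive vertebra
            # centres: 4 * triangle area / product of side lengths.
            u, v = q - p, r - q
            area = 0.5 * abs(u[0] * v[1] - u[1] * v[0])
            sides = (np.linalg.norm(u) * np.linalg.norm(v)
                     * np.linalg.norm(r - p))
            return 4.0 * area / sides

        def irregularity(points):
            # Spread of curvature along the spine as a simple index.
            k = [menger_curvature(points[i - 1], points[i], points[i + 1])
                 for i in range(1, len(points) - 1)]
            return np.std(k)

        # Hypothetical vertebra centre positions (x, y) from an image
        spine = np.array([[0.0, 0.0], [1.0, 2.1], [2.1, 4.0],
                          [3.0, 6.2], [4.2, 8.1]])
        print("irregularity index:", round(irregularity(spine), 4))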

  20. Predicting Sustainable Work Behavior

    DEFF Research Database (Denmark)

    Hald, Kim Sundtoft

    2013-01-01

    Sustainable work behavior is an important issue for operations managers – it has implications for most outcomes of OM. This research explores the antecedents of sustainable work behavior. It revisits and extends the sociotechnical model developed by Brown et al. (2000) on predicting safe behavior..... Employee characteristics and general attitudes towards safety and work condition are included in the extended model. A survey was handed out to 654 employees in Chinese factories. This research contributes by demonstrating how employee characteristics and general attitudes towards safety and work condition influence their sustainable work behavior. A new definition of sustainable work behavior is proposed....

  1. Neurological abnormalities predict disability

    DEFF Research Database (Denmark)

    Poggesi, Anna; Gouw, Alida; van der Flier, Wiesje;

    2014-01-01

    was performed. MRI assessment included age-related white matter changes (ARWMC) grading (mild, moderate, severe according to the Fazekas' scale), count of lacunar and non-lacunar infarcts, and global atrophy rating. Of the 633 (out of the 639 enrolled) patients with follow-up information (mean age 74.1 ± 5......, presence and number of neurological examination abnormalities predicted global functional decline independent of MRI lesions typical of the aging brain and other determinants of disability in the elderly. Systematically checking for neurological examination abnormalities in older patients may be cost...

  2. Predicting Lotto Numbers

    OpenAIRE

    Jorgensen, C.B.; Suetens, S.; Tyran, J.R.

    2011-01-01

    We investigate the “law of small numbers” using a unique panel data set on lotto gambling. Because we can track individual players over time, we can measure how they react to outcomes of recent lotto drawings. We can therefore test whether they behave as if they believe they can predict lotto numbers based on recent drawings. While most players pick the same set of numbers week after week without regard to the numbers drawn or anything else, we find that those who do change act on average in th...

  3. Methods and procedures for shielding analyses for the SNS

    International Nuclear Information System (INIS)

    In order to provide radiologically safe Spallation Neutron Source operation, shielding analyses are performed according to Oak Ridge National Laboratory internal regulations and to comply with the Code of Federal Regulations. An overview of on-going shielding work for the accelerator facility and neutrons beam lines, methods used for the analyses, and associated procedures and regulations are presented. Methods used to perform shielding analyses are described as well. (author)

  4. SENSITIVITY ANALYSIS FOR SALTSTONE DISPOSAL UNIT COLUMN DEGRADATION ANALYSES

    Energy Technology Data Exchange (ETDEWEB)

    Flach, G.

    2014-10-28

    PORFLOW-related analyses supporting a sensitivity analysis of Saltstone Disposal Unit (SDU) column degradation were performed. Previous analyses (Flach and Taylor 2014) used a model in which the SDU columns degraded in a piecewise manner from the top and bottom simultaneously. The current analysis employs a model in which all pieces of the column degrade at the same time. Information was extracted from the analyses that may be useful in determining the distribution of Tc-99 in the various SDUs over time and in determining flow balances for the SDUs.

  5. Analysing harmonic motions with an iPhone’s magnetometer

    Science.gov (United States)

    Yavuz, Ahmet; Kağan Temiz, Burak

    2016-05-01

    In this paper, we propose an experiment for analysing harmonic motion using an iPhone’s (or iPad’s) magnetometer. This experiment consists of the detection of magnetic field variations obtained from an iPhone’s magnetometer sensor. A graph of harmonic motion is directly displayed on the iPhone’s screen using the Sensor Kinetics application. Data from this application was analysed with Eureqa software to establish the equation of the harmonic motion. Analyses show that the use of an iPhone’s magnetometer to analyse harmonic motion is a practical and effective method for small oscillations and frequencies less than 15-20 Hz.
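
    The final fitting step can be reproduced with an ordinary least-squares fit in place of Eureqa; the simulated magnetometer trace below stands in for data exported from the Sensor Kinetics application (Python/SciPy):

        import numpy as np
        from scipy.optimize import curve_fit

        # Harmonic model for the magnetic field variation B(t)
        def harmonic(t, A, f, phi, B0):
            return A * np.sin(2 * np.pi * f * t + phi) + B0

        rng = np.random.default_rng(0)
        t = np.linspace(0, 5, 500)                          # 100 Hz for 5 s
        B = harmonic(t, 20.0, 1.3, 0.4, 45.0) + rng.normal(0, 0.5, t.size)

        (A, f, phi, B0), _ = curve_fit(harmonic, t, B, p0=[10, 1, 0, 40])
        print(f"B(t) = {A:.1f}*sin(2*pi*{f:.2f}*t + {phi:.2f}) + {B0:.1f} uT")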

  6. Nomogram prediction for overall survival of patients diagnosed with cervical cancer

    OpenAIRE

    Polterauer, S; Grimm, C.; Hofstetter, G; Concin, N; Natter, C; Sturdza, A.; R. Pötter; Marth, C.; Reinthaller, A.; Heinze, G.

    2012-01-01

    Background: Nomograms are predictive tools that are widely used for estimating cancer prognosis. The aim of this study was to develop a nomogram for the prediction of overall survival (OS) in patients diagnosed with cervical cancer. Methods: Cervical cancer databases of two large institutions were analysed. Overall survival was defined as the clinical endpoint and OS probabilities were estimated using the Kaplan–Meier method. Based on the results of survival analyses and previous studies, rel...

  7. Predicting Alloreactivity in Transplantation

    Directory of Open Access Journals (Sweden)

    Kirsten Geneugelijk

    2014-01-01

    Full Text Available Human Leukocyte Antigen (HLA) mismatching leads to severe complications after solid-organ transplantation and hematopoietic stem-cell transplantation. The alloreactive responses underlying the posttransplantation complications include both direct recognition of allogeneic HLA by HLA-specific alloantibodies and T cells, and indirect T-cell recognition. However, the immunogenicity of HLA mismatches is highly variable; some HLA mismatches lead to severe clinical B-cell- and T-cell-mediated alloreactivity, whereas others are well tolerated. Defining the permissibility of HLA mismatches prior to transplantation allows selection of donor-recipient combinations that have a reduced chance of developing deleterious host-versus-graft responses after solid-organ transplantation and graft-versus-host responses after hematopoietic stem-cell transplantation. Therefore, several methods have been developed to predict permissible HLA-mismatch combinations. In this review we aim to give a comprehensive overview of the current knowledge regarding HLA-directed alloreactivity and of several in vitro and in silico tools developed to predict direct and indirect alloreactivity.

  8. Compressor map prediction tool

    Science.gov (United States)

    Ravi, Arjun; Sznajder, Lukasz; Bennett, Ian

    2015-08-01

    Shell Global Solutions uses an in-house developed system for remote condition monitoring of centrifugal compressors. It requires field process data collected during operation to calculate and assess the machine's performance. Performance is assessed by comparing live results of polytropic head and efficiency against the design compressor curves provided by the manufacturer. Typically, these design curves are given for specific suction conditions; the further the conditions on site deviate from those prescribed at design, the less accurate the health assessment of the compressor becomes. To address this problem, a compressor map prediction tool is proposed. The original performance curves of polytropic head against volumetric flow for varying rotational speeds are used as an input to define a range of Mach numbers within which the non-dimensional invariant performance curve of head and volume flow coefficient is generated. The new performance curves of polytropic head versus flow for a desired set of inlet conditions are then back-calculated using the invariant non-dimensional curve. Within the range of Mach numbers calculated from design data, the proposed methodology can predict polytropic head curves at a new set of inlet conditions to within an estimated 3% accuracy. The presented methodology does not require knowledge of detailed impeller geometry such as throat areas, blade number, blade angles, thicknesses, or other aspects of the aerodynamic design - diffusion levels, flow angles, etc. The only required mechanical design feature is the first impeller tip diameter. The described method makes centrifugal compressor surveillance activities more accurate, enabling precise isolation of problems affecting the machine's performance.
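
    A minimal sketch of the re-rating procedure: the design curve is reduced to invariant head and flow coefficients using the first-impeller tip speed, then scaled back out at a new operating point. Dimensions, speeds and curve values are illustrative, and the Mach-number validity check described above is omitted:

        import numpy as np

        D = 0.40                              # first impeller tip diameter, m

        def tip_speed(rpm):
            return np.pi * D * rpm / 60.0

        # Design curve at 10,000 rpm: volume flow (m3/s) vs polytropic head (J/kg)
        Q_design = np.array([1.0, 1.5, 2.0, 2.5])
        H_design = np.array([42e3, 40e3, 35e3, 27e3])
        u0 = tip_speed(10000)
        phi = Q_design / (u0 * D**2)          # invariant flow coefficient
        psi = H_design / u0**2                # invariant head coefficient

        def curve_at(rpm):
            # Back-calculate the dimensional curve at a new speed; new
            # inlet conditions enter via the allowable Mach-number range,
            # which this sketch does not check.
            u = tip_speed(rpm)
            return phi * u * D**2, psi * u**2

        for q, h in zip(*curve_at(11500)):
            print(f"Q = {q:.2f} m3/s   Hp = {h / 1e3:.1f} kJ/kg")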

  9. A Prediction Contest

    DEFF Research Database (Denmark)

    Hallin, Carina Antonia; Andersen, Torben Juul; Tveterås, Sigbjørn

    The literature suggests that important strategic initiatives can derive from employees within the organization as they respond to needs and opportunities observed in daily operations. This seems to indicate that employees have a good sense of the firm's operational capabilities observed through … the conditions sensed by the employees. So, we arranged a contest between operational capabilities assessed by employees and executives and the relationship to subsequent firm performance. Based on more than 400 individual data points collected from two medium-sized organizations over a period of eighteen months …, advanced distributed lag time-series analyses show that the sensing of front-line employees (surprisingly) is a better medium-term predictor of organizational performance than executive judgments. These results have implications for the way organizations set up their management information …

  10. Multi-state models: metapopulation and life history analyses

    Directory of Open Access Journals (Sweden)

    Arnason, A. N.

    2004-06-01

    Full Text Available Multi-state models are designed to describe populations that move among a fixed set of categorical states. The obvious application is to population interchange among geographic locations such as breeding sites or feeding areas (e.g., Hestbeck et al., 1991; Blums et al., 2003; Cam et al., 2004), but they are increasingly used to address important questions of evolutionary biology and life history strategies (Nichols & Kendall, 1995). In these applications, the states include life history stages such as breeding states. The multi-state models, by permitting estimation of stage-specific survival and transition rates, can help assess trade-offs between life history mechanisms (e.g., Yoccoz et al., 2000). These trade-offs are also important in meta-population analyses where, for example, the pre- and post-breeding rates of transfer among sub-populations can be analysed in terms of target colony distance, density, and other covariates (e.g., Lebreton et al., 2003; Breton et al., in review). Further examples of the use of multi-state models in analysing dispersal and life-history trade-offs can be found in the session on Migration and Dispersal. In this session, we concentrate on applications that did not involve dispersal. These applications fall in two main categories: those that address life history questions using stage categories, and a more technical use of multi-state models to address problems arising from the violation of mark-recapture assumptions leading to the potential for seriously biased predictions or misleading insights from the models. Our plenary paper, by William Kendall (Kendall, 2004), gives an overview of the use of Multi-state Mark-Recapture (MSMR) models to address two such violations. The first is the occurrence of unobservable states that can arise, for example, from temporary emigration or by incomplete sampling coverage of a target population. Such states can also occur for life history reasons, such

  11. Integration of Engine, Plume, and CFD Analyses in Conceptual Design of Low-Boom Supersonic Aircraft

    Science.gov (United States)

    Li, Wu; Campbell, Richard; Geiselhart, Karl; Shields, Elwood; Nayani, Sudheer; Shenoy, Rajiv

    2009-01-01

    This paper documents an integration of engine, plume, and computational fluid dynamics (CFD) analyses in the conceptual design of low-boom supersonic aircraft, using a variable fidelity approach. In particular, the Numerical Propulsion Simulation System (NPSS) is used for propulsion system cycle analysis and nacelle outer mold line definition, and a low-fidelity plume model is developed for plume shape prediction based on NPSS engine data and nacelle geometry. This model provides a capability for the conceptual design of low-boom supersonic aircraft that accounts for plume effects. Then a newly developed process for automated CFD analysis is presented for CFD-based plume and boom analyses of the conceptual geometry. Five test cases are used to demonstrate the integrated engine, plume, and CFD analysis process based on a variable fidelity approach, as well as the feasibility of the automated CFD plume and boom analysis capability.

  12. Analyses of Acoustic Streaming Generated by Four Ultrasonic Vibrators in a Vessel

    Science.gov (United States)

    Nakagawa, Masafumi

    2004-05-01

    When ultrasonic waves are applied, the heat transfer at a heated surface in water increases markedly. The origin of this increase in heat transfer is thought to be the agitation effect of the microjets from cavitation and of acoustic streaming. Using four vibrators can further enhance heat transfer. This paper presents a method that uses four vibrators to eject an acoustic stream jet at a selected position in the vessel. Analyses of this method are performed to establish it theoretically and to compare it with a previously conducted experiment. The analyses presented in this research indicate that the behavior of the acoustic streaming generated by the four vibrators in the vessel can be correctly predicted, and they provide a foundation for developing this method for the enhancement of heat transfer.

  13. Static and Dynamic Mechanical Analyses for the Vacuum Vessel of EAST Superconducting Tokamak Device

    Science.gov (United States)

    Song, Yuntao; Yao, Damao; Du, Shijun; Wu, Songtao; Weng, Peide

    2006-03-01

    EAST (Experimental Advanced Superconducting Tokamak) is an advanced steady-state plasma physics experimental device, which is being constructed as the Chinese National Nuclear Fusion Research Project. During plasma operation the vacuum vessel, as one of the key components, will withstand the electromagnetic forces due to plasma disruption, the halo current and toroidal field coil quench, the pressure of borated water, and the thermal loads due to 250 °C baking by pressurized nitrogen gas. This paper reports the static and dynamic mechanical analyses of the vacuum vessel. First, the applied loads on the vacuum vessel were given and the static stress distributions under the gravitational, pressure, electromagnetic and thermal loads were investigated. Then a series of primary dynamic, buckling and fatigue life analyses were performed to predict the structure's dynamic behavior. A seismic analysis was also conducted.

  14. Fast reactor core thermal-hydraulic analyses during transition from forced to natural circulation

    International Nuclear Information System (INIS)

    A model of inter-subchannel mixing effects was presented to simulate the fast reactor transition from rated to natural circulation decay heat removal conditions, which is usually accompanied by all flow regimes: forced, mixed and natural convection. The model was constructed based on correlations for mixing and pressure drop coefficients developed at MIT. These correlations were originally proposed for steady-state subchannel analyses. In the present study, application of the mixing correlation was extended to unsteady multi-dimensional analyses by introducing a threshold function. The function enabled the correlations to be switched appropriately as the flow regime changed, depending on the Richardson number, which is an index of the buoyancy effect on the flow field. The modeling was validated through calculation of sodium experiments featuring 37-, 61- and 169-pin bundle subassemblies. Comparisons of the experimental and numerical results revealed that the modeling was capable of predicting the core thermal-hydraulic field under a wide spectrum of flow rate and heating conditions. (author)
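    The paper's actual threshold function is not reproduced in the abstract; the sketch below only illustrates the idea of switching mixing correlations by Richardson number, with the threshold values and the log-linear blend as assumptions.

    ```python
    import numpy as np

    def mixing_coefficient(ri, w_forced, w_natural, ri_lo=0.1, ri_hi=10.0):
        """Blend inter-subchannel mixing coefficients between the forced- and
        natural-convection correlations using a threshold function of the
        Richardson number Ri (buoyancy index). ri_lo, ri_hi and the
        log-linear interpolation are illustrative assumptions."""
        if ri <= ri_lo:       # buoyancy negligible: forced-convection regime
            return w_forced
        if ri >= ri_hi:       # buoyancy dominated: natural-convection regime
            return w_natural
        # mixed convection: interpolate smoothly in log(Ri)
        f = (np.log(ri) - np.log(ri_lo)) / (np.log(ri_hi) - np.log(ri_lo))
        return (1.0 - f) * w_forced + f * w_natural
    ```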

  15. Measurement of the analysing power in proton–proton elastic scattering at small angles

    Directory of Open Access Journals (Sweden)

    Z. Bagdasarian

    2014-12-01

    Full Text Available The proton analysing power in polarised proton-proton (p⃗p) elastic scattering has been measured at small angles at COSY-ANKE at 796 MeV and at five other beam energies between 1.6 and 2.4 GeV using a polarised proton beam. The asymmetries obtained by detecting the fast proton in the ANKE forward detector or the slow recoil proton in a silicon tracking telescope are completely consistent. Although the analysing power results agree well with the many published data at 796 MeV, and also with the most recent partial wave solution at this energy, the ANKE data at the higher energies lie well above the predictions of this solution at small angles. An updated phase shift analysis that uses the ANKE results together with the world data leads to a much better description of these new measurements.

  16. Fully plastic crack opening analyses of complex-cracked pipes for Ramberg-Osgood materials

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Jae Uk; Choi, Jae Boong [Sungkyunkwan University, Suwon (Korea, Republic of); Huh, Nam Su [Seoul National University, Seoul (Korea, Republic of); Kim, Yun Jae [Korea University, Seoul (Korea, Republic of)

    2016-04-15

    The plastic influence functions for calculating the fully plastic crack opening displacement (COD) of complex-cracked pipes were newly proposed based on systematic three-dimensional (3-D) elastic-plastic finite element (FE) analyses using the Ramberg-Osgood (R-O) relation, where global bending moment, axial tension and internal pressure are considered separately as loading conditions. Crack opening analyses were then performed within the GE/EPRI framework using the new plastic influence functions for complex-cracked pipes made of SA376 TP304 stainless steel, and the predicted CODs were compared with FE results based on the deformation plasticity theory of tensile material behavior. From the comparison, confidence in the proposed fully plastic crack opening solutions for complex-cracked pipes was gained. The proposed engineering scheme for COD estimation using the new plastic influence functions can therefore be utilized to estimate the leak rate of a complex-cracked pipe for an R-O material.
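    For orientation, the GE/EPRI-type estimation scheme referred to has the following generic form; this is a sketch only, in which the paper's new tabulated influence functions play the role of h2, P0 is a reference (limit) load, and the crack-dimension symbol a is a generic placeholder.

    ```latex
    % Ramberg-Osgood material law and the generic GE/EPRI fully plastic COD form
    \begin{align*}
      \frac{\varepsilon}{\varepsilon_0}
        &= \frac{\sigma}{\sigma_0}
           + \alpha \left( \frac{\sigma}{\sigma_0} \right)^{n}, \\
      \delta_p
        &= \alpha\, \varepsilon_0\, a\,
           h_2(\text{geometry},\, n)
           \left( \frac{P}{P_0} \right)^{n}, \\
      \delta &= \delta_e(P) + \delta_p(P).
    \end{align*}
    ```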

  17. Fracture mechanics analyses of partial crack closure in shell structures

    Science.gov (United States)

    Zhao, Jun

    2007-12-01

    This thesis presents the theoretical and finite element analyses of crack-face closure behavior in shells and its effect on the stress intensity factor under a bending load condition. Various shell geometries, such as a spherical shell, a cylindrical shell containing an axial crack, a cylindrical shell containing a circumferential crack and a shell with double curvatures, are all studied. In addition, the influence of material orthotropy on the crack closure effect in shells is also considered. The theoretical formulation is developed based on the shallow shell theory of Delale and Erdogan, incorporating the effect of crack-face closure at the compressive edges. The line-contact assumption, simulating the crack-face closure at the compressive edges, is employed so that the contact force at the closure edges is introduced, which can be translated to the mid-plane of the shell, accompanied by an additional distributed bending moment. The unknown contact force is computed by solving a mixed-boundary value problem iteratively; that is, along the crack length, either the normal displacement of the crack face at the compressive edges is equal to zero or the contact pressure is equal to zero. It is found that, due to curvature effects, crack closure may not always occur over the entire length of the crack, depending on the direction of the bending load and the geometry of the shell. The crack-face closure significantly influences the magnitude of the stress intensity factors; it increases the membrane component but decreases the bending component. The maximum stress intensity factor is reduced by the crack-face closure. The significant influence of geometry and material orthotropy on crack closure behavior in shells is also predicted based on the analytical solutions. Three-dimensional FEA is performed to validate the theoretical solutions. It demonstrates that the crack-face closure actually occurs over an area, not on a line, but the theoretical solutions of the stress intensity

  18. Joint analyses model for total cholesterol and triglyceride in human serum with near-infrared spectroscopy

    Science.gov (United States)

    Yao, Lijun; Lyu, Ning; Chen, Jiemei; Pan, Tao; Yu, Jing

    2016-04-01

    The development of a small, dedicated near-infrared (NIR) spectrometer has promising potential applications, such as the joint analysis of total cholesterol (TC) and triglyceride (TG) in human serum for preventing and treating hyperlipidemia in a large population. Appropriate wavelength selection is a key technology for developing such a spectrometer. For this reason, a novel wavelength selection method, named equidistant combination partial least squares (EC-PLS), was applied to wavelength selection for the NIR analyses of TC and TG in human serum. A rigorous process based on various divisions of the calibration and prediction sets was performed to achieve modeling optimization with stability. By applying EC-PLS, a model set was developed, consisting of various models that were equivalent to the optimal model. The joint analysis model for the two indicators was further selected with only 50 wavelengths. Random validation samples excluded from the modeling process were used to validate the selected model. The root-mean-square errors, correlation coefficients and ratios of performance to deviation for the prediction were 0.197 mmol L-1, 0.985 and 5.6 for TC, and 0.101 mmol L-1, 0.992 and 8.0 for TG, respectively. The sensitivity and specificity for hyperlipidemia were 96.2% and 98.0%. These findings indicate high prediction accuracy and low model complexity. The proposed wavelength selection provides valuable references for designing a small, dedicated spectrometer for hyperlipidemia. The methodological framework and optimization algorithm are universal, such that they can be applied to other fields.
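    A schematic of the EC-PLS search as described: equidistant wavelength subsets are scanned over spacing and offset and scored by cross-validated RMSE. The parameter ranges, the CV scheme and the helper name are assumptions for illustration, not the authors' exact procedure.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    def ec_pls(X, y, wavelengths, n_components=8, max_gap=30):
        """Equidistant combination PLS, sketched: for each spacing g and
        offset i (offsets beyond g repeat the same grid), fit PLS on the
        wavelength subset and keep the combination with the lowest
        cross-validated RMSE."""
        best_rmse, best_idx = np.inf, None
        for g in range(1, max_gap + 1):
            for i in range(g):
                idx = np.arange(i, X.shape[1], g)
                if len(idx) <= n_components:
                    continue
                pls = PLSRegression(n_components=n_components)
                y_cv = cross_val_predict(pls, X[:, idx], y, cv=5)
                rmse = np.sqrt(np.mean((y - y_cv.ravel()) ** 2))
                if rmse < best_rmse:
                    best_rmse, best_idx = rmse, idx
        return wavelengths[best_idx], best_rmse
    ```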

  19. What can we do about exploratory analyses in clinical trials?

    Science.gov (United States)

    Moyé, Lem

    2015-11-01

    The research community has alternately embraced and then repudiated exploratory analyses since the inception of clinical trials in the middle of the twentieth century. After a series of important but ultimately unreproducible findings, these non-prospectively declared evaluations were relegated to hypothesis generation. Since the majority of evaluations conducted in clinical trials, with their rich data sets, are exploratory, the absence of their persuasive power adds to the inefficiency of clinical trial analyses in an atmosphere of fiscal frugality. However, the principal argument against exploratory analyses is based not in statistical theory, but in pragmatism and observation. The absence of any theoretical treatment of exploratory analyses postpones the day when their statistical weaknesses might be repaired. Here, we introduce an examination of the characteristics of exploratory analyses from a probabilistic and statistical framework. Setting the obvious logistical concerns aside (i.e., the absence of planning produces poor precision), exploratory analyses do not appear to suffer from estimation theory weaknesses. The problem appears to be a difficulty in what is actually reported as the p-value. The use of Bayes' theorem provides p-values that are more in line with confirmatory analyses. This development may inaugurate a body of work that would lead to the readmission of exploratory analyses to a position of persuasive power in clinical trials. PMID:26390962
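    The abstract does not spell out the Bayesian correction; one widely used calibration that makes the point concrete is the Sellke-Bayarri-Berger bound, presented here as an assumed stand-in rather than the author's method.

    ```latex
    % For an observed p-value p < 1/e, a lower bound on the Bayes factor in
    % favour of the null hypothesis H_0 is
    \[
      B(p) \;\geq\; -e\,p\,\ln p,
    \]
    % so, with prior probability \pi_0 assigned to H_0, the posterior obeys
    \[
      P(H_0 \mid \text{data}) \;\geq\;
        \left[ 1 + \frac{1-\pi_0}{\pi_0} \cdot \frac{1}{-e\,p\,\ln p} \right]^{-1}.
    \]
    % Example: p = 0.05 with \pi_0 = 1/2 gives a posterior of at least about
    % 0.29, i.e. an exploratory "significant" finding is far weaker evidence
    % than the nominal p-value suggests.
    ```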

  20. Aftaler om arbejdsmiljø - en analyse af udvalgte overenskomster

    DEFF Research Database (Denmark)

    Petersen, Jens Voxtrup; Wiegmann, Inger-Marie; Vogt-Nielsen, Karl;

    An analysis of the significance of collective agreements for the working environment in industry, slaughterhouses, cleaning, the green sector, hotels and restaurants, and bus services.

  1. Training Residential Staff to Conduct Trial-Based Functional Analyses

    Science.gov (United States)

    Lambert, Joseph M.; Bloom, Sarah E.; Kunnavatana, S. Shanun; Collins, Shawnee D.; Clay, Casey J.

    2013-01-01

    We taught 6 supervisors of a residential service provider for adults with developmental disabilities to train 9 house managers to conduct trial-based functional analyses. Effects of the training were evaluated with a nonconcurrent multiple baseline. Results suggest that house managers can be trained to conduct trial-based functional analyses with…

  2. 46 CFR Appendix B to Part 154 - Stress Analyses Definitions

    Science.gov (United States)

    2010-10-01

    Appendix B to Part 154—Stress Analyses Definitions. The following are the standard definitions of stresses for the analysis of an independent tank type B: Normal stress means the component of stress normal to the plane of...

  3. Descriptive Analyses of Pediatric Food Refusal and Acceptance

    Science.gov (United States)

    Borrero, Carrie S. W.; Woods, Julia N.; Borrero, John C.; Masler, Elizabeth A.; Lesser, Aaron D.

    2010-01-01

    Functional analyses of inappropriate mealtime behavior typically include conditions to determine if the contingent delivery of attention, tangible items, or escape reinforce food refusal. In the current investigation, descriptive analyses were conducted for 25 children who had been admitted to a program for the assessment and treatment of food…

  5. Competition and stability analyses among emissions, energy, and economy: Application for Mexico

    International Nuclear Information System (INIS)

    In view of the limited natural resources on Earth, the linkages among environment, energy, and economy (3Es) have become important perspectives for sustainable development. This paper proposes using the Lotka-Volterra model for SUstainable Development (LV-SUD) to analyse the interspecific interactions, the equilibria and their stabilities among emissions, different types of energy consumption (renewable, nuclear, and fossil fuel), and real GDP, the main factors of 3Es issues. Modelling these interactions provides a useful multivariate framework for predicting outcomes. Interaction between the 3Es, namely competition, symbiosis, or predation, plays an important role in policy development to achieve a balanced use of energy resources and to strengthen the green economy. Applying LV-SUD to Mexico, an emerging-market country, the results show that there is mutualism between fossil fuel consumption and GDP; prey-predator relationships in which fossil fuel and GDP enhance the growth of emissions, but emissions inhibit the growth of the others; and commensalisms whereby GDP benefits from nuclear power and renewable power benefits from fossil fuel. It is suggested that national energy policies should remain committed to decoupling non-clean energy from GDP, to actively developing clean energy and thereby to properly reducing fossil fuel consumption and emissions without harming economic growth. - Highlights: • LV-SUD is used to analyse the competition between environment-energy-economy (3Es). • The competitions between renewable, nuclear, and fossil energy are analysed. • Competition between the 3Es plays an important role in policy development. • LV-SUD provides a useful multivariate framework for predicting outcomes. • An application to an emerging-market country, Mexico, is presented
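    A minimal sketch of the Lotka-Volterra machinery behind such a model: a three-variable generalized LV system whose interaction signs encode mutualism, predation or commensalism. All coefficients below are made up for illustration; the paper estimates them from Mexican data.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # x = [emissions, fossil-fuel consumption, real GDP]. Growth rates r and
    # interaction matrix A carry illustrative signs only, not fitted values.
    r = np.array([0.02, 0.03, 0.04])
    A = np.array([[-0.05,  0.02,  0.02],   # fuel and GDP feed emissions
                  [ 0.00, -0.04,  0.03],   # fuel <-> GDP: mutualism
                  [-0.01,  0.03, -0.02]])  # emissions inhibit GDP

    def glv(t, x):
        # dx_i/dt = x_i * (r_i + sum_j A_ij x_j)
        return x * (r + A @ x)

    x0 = np.array([1.0, 1.0, 1.0])
    sol = solve_ivp(glv, (0.0, 100.0), x0, dense_output=True)

    # Interior equilibrium x* solves A x* = -r (meaningful if all components
    # are positive); it is locally stable when all eigenvalues of
    # diag(x*) @ A have negative real parts.
    x_star = np.linalg.solve(A, -r)
    stable = np.all(np.linalg.eigvals(np.diag(x_star) @ A).real < 0)
    ```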

  6. Entropy analyses of spatiotemporal synchronizations in brain signals from patients with focal epilepsies

    CERN Document Server

    Tuncay, Caglar

    2010-01-01

    The electroencephalographic (EEG) data recorded intracerebrally from 20 epileptic humans with different brain origins of focal epilepsies or types of seizures, ages and sexes are investigated (nearly 700 million data points). Multi-channel univariate amplitude analyses are performed, and it is shown that time-dependent Shannon entropies can be used to predict focal epileptic seizure onsets in different epileptogenic brain zones of different patients. The formation and time evolution of synchronizations in the brain signals from epileptogenic or non-epileptogenic areas of the patients in the ictal or inter-ictal interval are further investigated employing spatial or temporal differences of the entropies.
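    A sketch of the kind of time-dependent Shannon entropy computation described, using a histogram estimate over sliding windows; the bin count and window sizes are assumptions.

    ```python
    import numpy as np

    def shannon_entropy(window, n_bins=64):
        """Shannon entropy of one EEG amplitude window (histogram estimate)."""
        counts, _ = np.histogram(window, bins=n_bins)
        p = counts[counts > 0] / counts.sum()
        return -np.sum(p * np.log2(p))

    def entropy_time_course(signal, fs, win_sec=2.0, step_sec=0.5):
        """Time-dependent entropy over sliding windows; a sustained drop in
        entropy (increasing regularity/synchronization) is the kind of
        feature used to flag an approaching seizure onset."""
        win, step = int(win_sec * fs), int(step_sec * fs)
        starts = range(0, len(signal) - win + 1, step)
        t = np.array([(s + win / 2) / fs for s in starts])
        h = np.array([shannon_entropy(signal[s:s + win]) for s in starts])
        return t, h
    ```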

  7. Review of methodologies used in safety analyses of nuclear waste disposal

    International Nuclear Information System (INIS)

    The methodologies used in safety analyses of nuclear waste disposal are reviewed. Particular reference is made to the KBS-3 proposals and to the WP-Cave concept. These methodologies are aimed mainly at reducing uncertainties in the predictions of health consequences from nuclear waste repositories, and quantifying the uncertainties that remain. The steps of a safety analysis are described in turn and the currently used techniques are discussed. Recommendations are made as to the use and development of the techniques and the need for new techniques. (author)

  8. Systems Analyses Reveal Shared and Diverse Attributes of Oct4 Regulation in Pluripotent Cells

    DEFF Research Database (Denmark)

    Ding, Li; Paszkowski-Rogacz, Maciej; Winzi, Maria;

    2015-01-01

    … Oct4, a key regulator of pluripotency. Our data signify that there are similarities, but also fundamental differences in Oct4 regulation in EpiSCs versus embryonic stem cells (ESCs). Through multiparametric data analyses, we predict that Tox4 is associating with the Paf1C complex, which maintains cell … identity in both cell types, and validate that this protein-protein interaction exists in ESCs and EpiSCs. We also identify numerous knockdowns that increase Oct4 expression in EpiSCs, indicating that, in stark contrast to ESCs, Oct4 is under active repressive control in EpiSCs. These studies provide a …

  9. Analysis of Tile-Reinforced Composite Armor. Part 1; Advanced Modeling and Strength Analyses

    Science.gov (United States)

    Davila, C. G.; Chen, Tzi-Kang; Baker, D. J.

    1998-01-01

    The results of an analytical and experimental study of the structural response and strength of tile-reinforced components of the Composite Armored Vehicle are presented. The analyses are based on specialized finite element techniques that properly account for the effects of the interaction between the armor tiles, the surrounding elastomers, and the glass-epoxy sublaminates. To validate the analytical predictions, tests were conducted with panels subjected to three-point bending loads. The sequence of progressive failure events for the laminates is described. This paper describes the results of Part 1 of a study of the response and strength of tile-reinforced composite armor.

  10. Cooling pond temperature prediction

    International Nuclear Information System (INIS)

    A model is described which predicts temperature responses in the environment that are associated with the operation of a natural-gas-fueled thermoelectric power generation station. The model is a piecewise computer simulation, limited at present to closed cooling water systems. However, the techniques developed should be applicable to a much larger class of cooling systems. The problem encountered consists of two parts: (1) data characterization and (2) modeling. An efficient characterization scheme for the environmental variables greatly simplifies the task of modeling. Methods borrowed from signal theory, but not yet applied to this field, are applicable to and greatly simplify the digital computer investigation of environmental data. An optimal data set, from the point of view of information per unit cost, is described for the model

  11. Permeability prediction in chalks

    DEFF Research Database (Denmark)

    Alam, Mohammad Monzurul; Fabricius, Ida Lykke; Prasad, Manika

    2011-01-01

    The velocity of elastic waves is the primary datum available for acquiring information about subsurface characteristics such as lithology and porosity. Cheap and quick (spatial coverage, ease of measurement) information on permeability can be achieved if sonic velocity is used for permeability … -permeability relationships were replaced by relationships between the velocity of elastic waves and permeability using laboratory data, and the relationships were then applied to well-log data. We found that the permeability prediction in chalk, and possibly other sediments with large surface areas, could be improved … significantly using the effective specific surface as the fluid-flow concept. The FZI unit is appropriate for highly permeable sedimentary rocks such as sandstones and limestones that have small surface areas.

  12. Motor degradation prediction methods

    Energy Technology Data Exchange (ETDEWEB)

    Arnold, J.R.; Kelly, J.F.; Delzingaro, M.J.

    1996-12-01

    Motor Operated Valve (MOV) squirrel cage AC motor rotors are susceptible to degradation under certain conditions. Premature failure can result due to high humidity/temperature environments, high running load conditions, extended periods at locked rotor conditions (i.e. > 15 seconds) or exceeding the motor's duty cycle by frequent starts or multiple valve stroking. Exposure to high heat and moisture due to packing leaks, pressure seal ring leakage or other causes can significantly accelerate the degradation. ComEd and Liberty Technologies have worked together to provide and validate a non-intrusive method using motor power diagnostics to evaluate MOV rotor condition and predict failure. These techniques have provided a quick, low radiation dose method to evaluate inaccessible motors, identify degradation and allow scheduled replacement of motors prior to catastrophic failures.

  14. Uncertainty and Sensitivity Analyses Plan. Draft for Peer Review: Hanford Environmental Dose Reconstruction Project

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, J.C.; Ramsdell, J.V. Jr.

    1993-04-01

    Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models, where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project.

  15. Plume rise predictions

    International Nuclear Information System (INIS)

    Anyone involved with diffusion calculations becomes well aware of the strong dependence of maximum ground concentrations on the effective stack height, h_e. For most conditions χ_max is approximately proportional to h_e^(-2), as has been recognized at least since 1936 (Bosanquet and Pearson). Making allowance for the gradual decrease in the ratio of vertical to lateral diffusion at increasing heights, the exponent is slightly larger, say χ_max ∝ h_e^(-2.3). In inversion breakup fumigation, the exponent is somewhat smaller; very crudely, χ_max ∝ h_e^(-1.5). In any case, for an elevated emission the dependence of χ_max on h_e is substantial. It is postulated that a really clever ignorant theoretician can disguise his ignorance with dimensionless constants. For most sources the effective stack height is considerably larger than the actual source height, h_s. For instance, for power plants with no downwash problems, h_e is more than twice h_s whenever the wind is less than 10 m/sec, which is most of the time. This is unfortunate for anyone who has to predict ground concentrations, for he is likely to have to calculate the plume rise, Δh. Using h_e = h_s + Δh instead of h_s may reduce χ_max by a factor of anywhere from 4 to infinity. Factors to be considered in making plume rise predictions are discussed

  16. Predicting the physics of particles

    International Nuclear Information System (INIS)

    A brief account is presented of the goals and methods of particle theorists, stressing the measurable quantities they would like to predict, the conventional starting points for such predictions, and some of the techniques used to arrive at a prediction. (author)

  17. Metagenomic Analyses Reveal That Energy Transfer Gene Abundances Can Predict the Syntrophic Potential of Environmental Microbial Communities.

    Science.gov (United States)

    Oberding, Lisa; Gieg, Lisa M

    2016-01-01

    Hydrocarbon compounds can be biodegraded by anaerobic microorganisms to form methane through an energetically interdependent metabolic process known as syntrophy. The microorganisms that perform this process, as well as the energy transfer mechanisms involved, are difficult to study and thus are still poorly understood, especially on an environmental scale. Here, metagenomic data were analyzed for specific clusters of orthologous groups (COGs) related to the key energy transfer genes identified thus far in syntrophic bacteria, and principal component analysis was used to determine whether potentially syntrophic environments could be distinguished using these syntroph-related COGs as opposed to universally present COGs. We found that COGs related to hydrogenase and formate dehydrogenase genes were able to distinguish known syntrophic consortia and environments with the potential for syntrophy from non-syntrophic environments, indicating that these COGs could be used as a tool to identify syntrophic hydrocarbon-biodegrading environments using metagenomic data. PMID:27681901
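    A sketch of the ordination step described: principal component analysis on a samples-by-COGs abundance matrix. The random matrix stands in for output of a real COG annotation pipeline, and the sample/COG dimensions are arbitrary.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    # Rows: metagenome samples; columns: normalized abundances of the
    # energy-transfer-related COGs (hydrogenase / formate dehydrogenase
    # families). Random data stands in for a real annotation pipeline.
    rng = np.random.default_rng(0)
    X = rng.lognormal(size=(30, 12))   # 30 samples x 12 selected COGs

    X_std = StandardScaler().fit_transform(X)
    pca = PCA(n_components=2)
    scores = pca.fit_transform(X_std)
    print("variance explained:", pca.explained_variance_ratio_)
    # Plotting `scores` coloured by environment type shows whether the
    # energy-transfer COG profile separates syntrophic consortia from
    # non-syntrophic communities, as in the study's analysis.
    ```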

  18. Intentions, planning, and self-efficacy predict physical activity in Chinese and Polish adolescents: Two moderated mediation analyses

    Directory of Open Access Journals (Sweden)

    Aleksandra Luszczynska

    2010-01-01

    Full Text Available Planning is thought to translate intentions into health behaviours. However, this process can fail when perceived self-efficacy is lacking: people do not tackle difficult tasks if they harbour self-doubts, even if they have formed a good action plan. The present two descriptive longitudinal studies were designed to examine the presumed moderating role of self-efficacy in the planning-behaviour relationship. In Study I (N = 534 Chinese adolescents), intentions were assessed at baseline, while self-efficacy and physical activity were measured four weeks later. In Study II, 620 Polish adolescents completed questionnaires assessing physical activity, intentions, planning and self-efficacy, with a physical activity follow-up at 10 weeks. A moderated mediation model was examined: planning was specified as a mediator between intentions and behaviour, while self-efficacy was specified as a moderator of the planning-behaviour relationship. The results confirm that levels of self-efficacy moderate the mediation process. The strength of the mediated effect (of intention on behaviour via planning) increased along with levels of self-efficacy. These results remained valid after accounting for baseline physical activity. For planning to mediate the intention-behaviour relationship, adolescents need sufficiently high levels of self-efficacy; otherwise, planning may be in vain. Implications for theory development and intervention design are discussed.

  19. Uncertainties on α_S in global PDF analyses and implications for predicted hadronic cross sections

    Energy Technology Data Exchange (ETDEWEB)

    Martin, A.D.; Watt, G. [University of Durham, Institute for Particle Physics Phenomenology, Durham (United Kingdom); Stirling, W.J. [University of Cambridge, Cavendish Laboratory, Cambridge (United Kingdom); Thorne, R.S. [University College London, Department of Physics and Astronomy, London (United Kingdom)

    2009-12-15

    We determine the uncertainty on the strong coupling α_S due to the experimental errors on the data fitted in global analyses of hard-scattering data, within the standard framework of leading-twist fixed-order collinear factorisation in the MS-bar scheme, finding that α_S(M_Z^2) = 0.1202 (+0.0012 / -0.0015) at next-to-leading order (NLO) and α_S(M_Z^2) = 0.1171 (+0.0014 / -0.0014) at next-to-next-to-leading order (NNLO). We do not address in detail the issue of the additional theory uncertainty on α_S(M_Z^2), but an estimate is ±0.003 at NLO and at most ±0.002 at NNLO. We investigate the interplay between uncertainties on α_S and uncertainties on parton distribution functions (PDFs). We show, for the first time, how both these sources of uncertainty can be accounted for simultaneously in calculations of cross sections, and we provide eigenvector PDF sets with different fixed α_S values to allow further studies by the general user. We illustrate the application of these PDF sets by calculating cross sections for W, Z, Higgs boson and inclusive jet production at the Tevatron and LHC. (orig.)

  20. Predictive Analyses of Biological Effects of Natural Products: From Plant Extracts to Biomolecular Laboratory and Computer Modeling

    Directory of Open Access Journals (Sweden)

    Roberto Gambari

    2011-01-01

    Full Text Available Year by year, the characterization of the biological activity of natural products is becoming more competitive and complex, with the involvement in this research area of experts belonging to different scientific fields, including chemistry, biochemistry, molecular biology, immunology and bioinformatics. These fields are becoming of great interest for several high-impact scientific journals, including eCAM. The available literature in general, and a survey of reviews and original articles recently published, establishes that natural products, including extracts from medicinal plants and essential oils, retain interesting therapeutic activities, including antitumor, antiviral, anti-inflammatory, pro-apoptotic and differentiating properties. In this commentary, we focus attention on interest in networks based on complementary activation and comparative evaluation of different experimental strategies applied to the discovery and characterization of bioactive natural products. A representative flow chart is shown in the paper.

  1. Integrative genomic analyses of secreted protein acidic and rich in cysteine and its role in cancer prediction.

    Science.gov (United States)

    Wang, Bo; Chen, Kai; Xu, Wenming; Chen, Di; Tang, Wei; Xia, Tian-Song

    2014-09-01

    Secreted protein acidic and rich in cysteine (SPARC), also termed osteonectin or basement-membrane-40 (BM-40), is a matrix-associated protein that elicits changes in cell shape, inhibits cell-cycle progression and affects the synthesis of extracellular matrix (ECM). The final mature SPARC protein has 286 amino acids with three distinct domains, including an NH2-terminal acidic domain (NT), a follistatin-like domain (FS) and a C-terminus domain (EC). The present study identified SPARC genes from 14 vertebrate genomes and revealed that SPARC exists in all types of vertebrates, including fish, amphibians, birds and mammals. In total, 21 single nucleotide polymorphisms (SNPs) causing missense mutations were identified, which may affect the formation of the truncated form of the SPARC protein. The human SPARC gene was found to be expressed in numerous tissues or organs, including the bone marrow, whole blood, lymph node, thymus, brain, cerebellum, retina, heart, smooth muscle, skeletal muscle, spinal cord, intestine, colon, adipocytes, kidney, liver, pancreas, thyroid, salivary gland, skin, ovary, uterus, placenta, cervix and prostate. When searched in the PrognoScan database, the human SPARC gene was also found to be expressed in bladder, blood, breast, glioma, esophageal, colorectal, head and neck, ovarian, lung and skin cancer tissues. It was revealed that the association between the expression of SPARC and prognosis varied in different types of cancer, and even in the same cancer across different databases. This implies that the function of SPARC in these tumors may be multidimensional, functioning not just as a tumor suppressor or oncogene. PMID:24938427

  2. [PK/PD Modeling as a Tool for Predicting Bacterial Resistance to Antibiotics: Alternative Analyses of Experimental Data].

    Science.gov (United States)

    Golikova, M V; Strukova, E N; Portnoy, Y A; Firsov, A A

    2015-01-01

    Postexposure number of mutants (N_M) is a conventional endpoint in bacterial resistance studies using in vitro dynamic models that simulate antibiotic pharmacokinetics. To compare N_M with a recently introduced integral parameter, AUBC_M (the area under the time course of resistant mutants), the enrichment of resistant Staphylococcus aureus was studied in vitro by simulation of mono- (daptomycin, doxycycline) and combined treatments (daptomycin + rifampicin, rifampicin + linezolid). Differences in the time courses of resistant S. aureus could be reflected by AUBC_M but not N_M. Moreover, unlike AUBC_M, N_M did not reflect the pronounced differences in the time courses of S. aureus mutants resistant to 2x, 4x, 8x and 16x MIC of doxycycline and rifampicin. The findings suggested that AUBC_M was a more appropriate endpoint of the amplification of resistant mutants than N_M.

  4. Prediction of neural differentiation fate of rat mesenchymal stem cells by quantitative morphological analyses using image processing techniques.

    Science.gov (United States)

    Kazemimoghadam, Mahdieh; Janmaleki, Mohsen; Fouani, Mohamad Hassan; Abbasi, Sara

    2015-02-01

    Differentiation of bone marrow mesenchymal stem cells (BMSCs) into neural cells has received significant attention in recent years. However, there is still no practical method to evaluate the differentiation process non-invasively. Cellular quality evaluation remains limited to conventional techniques based on extracting genes or proteins from the cells. These techniques are invasive, costly, time consuming, and must be performed by relevant experts in equipped laboratories. Moreover, they cannot anticipate the future status of cells. Recently, cell morphology has been introduced as a feasible way of monitoring cell behavior because of its relationship with cell proliferation, function and differentiation. In this study, rat BMSCs were induced to differentiate into neurons. Subsequently, phase contrast images of cells taken at certain intervals were subjected to a series of image processing steps and cell morphology features were calculated. To validate the viability of applying image-based approaches for estimating the quality of the differentiation process, neural-specific markers were measured experimentally throughout the induction. The strong correlation between the quantitative imaging metrics and the experimental outcomes revealed the capability of the proposed approach as an auxiliary method of assessing cell behavior during differentiation.
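    A sketch of the kind of morphological feature extraction involved: segment a phase-contrast image and compute simple per-cell shape descriptors. Otsu thresholding, the debris cutoff and the feature list are illustrative choices, not the study's exact pipeline.

    ```python
    import numpy as np
    from skimage import filters, measure

    def cell_shape_features(gray_image):
        """Segment cells in a pre-processed phase-contrast image and return
        simple morphological descriptors per cell."""
        # Threshold direction (bright cells on dark background) is assumed.
        mask = gray_image > filters.threshold_otsu(gray_image)
        labels = measure.label(mask)
        feats = []
        for region in measure.regionprops(labels):
            if region.area < 50:        # drop small debris (assumed cutoff)
                continue
            feats.append({
                "area": region.area,
                "perimeter": region.perimeter,
                "eccentricity": region.eccentricity,
                # circularity ~1 for round somata, lower for branched,
                # neurite-bearing cells
                "circularity": 4 * np.pi * region.area / region.perimeter**2,
            })
        return feats
    ```

    Tracking such descriptors over the induction time course is what lets image-based metrics be correlated against the experimentally measured neural-specific markers.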

  6. Integrative genomic analyses of the RNA-binding protein, RNPC1, and its potential role in cancer prediction.

    Science.gov (United States)

    Ding, Zhiming; Yang, Hai-Wei; Xia, Tian-Song; Wang, Bo; Ding, Qiang

    2015-08-01

    The RNA binding motif protein 38 (RBM38, also known as RNPC1) plays a pivotal role in regulating a wide range of biological processes, from cell proliferation and cell cycle arrest to cell myogenic differentiation. It was originally recognized as an oncogene, and was frequently found to be amplified in prostate, ovarian and colorectal cancer, chronic lymphocytic leukemia, colon carcinoma, esophageal cancer, dog lymphomas and breast cancer. In the present study, the complete RNPC1 gene was identified in a number of vertebrate genomes, suggesting that RNPC1 exists in all types of vertebrates, including fish, amphibians, birds and mammals. In the different genomes, the gene had a similar 4 exon/3 intron organization, and all the genetic loci were syntenically conserved. The phylogenetic tree demonstrated that the RNPC1 gene from the mammalian, bird, reptile and teleost lineage formed a species-specific cluster. A total of 34 functionally relevant single nucleotide polymorphisms (SNPs), including 14 SNPs causing missense mutations, 8 exonic splicing enhancer SNPs and 12 SNPs causing nonsense mutations, were identified in the human RNPC1 gene. RNPC1 was found to be expressed in bladder, blood, brain, breast, colorectal, eye, head and neck, lung, ovarian, skin and soft tissue cancer. In 14 of the 94 tests, an association between RNPC1 gene expression and cancer prognosis was observed. We found that the association between the expression of RNPC1 and prognosis varied in different types of cancer, and even in the same type of cancer from the different databases used. This suggests that the function of RNPC1 in these tumors may be multidimensional. The sex determining region Y (SRY)-box 5 (Sox5), runt-related transcription factor 3 (RUNX3), CCAAT displacement protein 1 (CUTL1), v-rel avian reticuloendotheliosis viral oncogene homolog (Rel)A, peroxisome proliferator-activated receptor γ isoform 2 (PPARγ2) and activating transcription factor 6 (ATF6) regulatory transcription factor binding sites were identified in the upstream (promoter) region of the RNPC1 gene, and may thus be involved in the effects of RNPC1 in tumors. PMID:26046131

  7. Measurement and prediction of pork colour.

    Science.gov (United States)

    Van Oeckel, M J; Warnants, N; Boucqué, C V

    1999-08-01

    The extent to which instrumental colour determinations by FOPu (light scattering), Göfo (reflectance) and Labscan II (CIE L*, CIE a* and CIE b*, hue and chroma) are related to the Japanese colour grades was studied. Additionally, four on-line methods: pH1, FOP1, PQM1 (conductivity) and DDLT (Double Density Light Transmission, analogous to Capteur Gras/Maigre), were evaluated for their ability to predict subjectively and objectively colour. One hundred and twenty samples of m. longissimus thoracis et lumborum, from animals of different genotypes, were analysed. Of the instrumental colour determinations, CIE L* (r=-0.82), FOPu (r=-0.70) and Göfo (r=0.70) were best correlated with the Japanese colour scores. The Japanese colour grades could be predicted by the on-line instruments, pH1, FOP1, PQM1 and DDLT, with determination coefficients between 15 and 28%. Ultimate meat colour, determined by Japanese colour standards, FOPu, Göfo and CIE L*, was better predicted by DDLT than by the classic on-line instruments: FOP1, pH1 and PQM1, although the standard error of the estimate was similar for all instruments. This means that DDLT, although originally designed for estimating lean meat percentage, can additionally give information about meat quality, in particular colour. However, it must be stressed that the colour estimate by DDLT refers to a population of animals, rather than to individual pigs, because of the number of erroneously assigned samples.

  8. Selecting Optimal Random Forest Predictive Models: A Case Study on Predicting the Spatial Distribution of Seabed Hardness.

    Science.gov (United States)

    Li, Jin; Tran, Maggie; Siwabessy, Justy

    2016-01-01

    Spatially continuous predictions of seabed hardness are important baseline environmental information for the sustainable management of Australia's marine jurisdiction. Seabed hardness is often inferred from multibeam backscatter data with unknown accuracy, and can be inferred from underwater video footage at limited locations. In this study, we classified the seabed into four classes based on two new seabed hardness classification schemes (i.e., hard90 and hard70). We developed optimal predictive models to predict seabed hardness using random forest (RF) based on the point data of hardness classes and spatially continuous multibeam data. Five feature selection (FS) methods, namely variable importance (VI), averaged variable importance (AVI), knowledge-informed AVI (KIAVI), Boruta and regularized RF (RRF), were tested based on predictive accuracy. The effects of highly correlated, important and unimportant predictors on the accuracy of RF predictive models were examined. Finally, spatial predictions generated using the most accurate models were visually examined and analysed. This study confirmed that: 1) hard90 and hard70 are effective seabed hardness classification schemes; 2) seabed hardness of four classes can be predicted with a high degree of accuracy; 3) the typical approach used to pre-select predictive variables by excluding highly correlated variables needs to be re-examined; 4) the identification of the important and unimportant predictors provides useful guidelines for further improving predictive models; 5) FS methods select the most accurate predictive model(s) instead of the most parsimonious ones, and AVI and Boruta are recommended for future studies; and 6) RF is an effective modelling method with high predictive accuracy for multi-level categorical data and can be applied to 'small p and large n' problems in environmental sciences. Additionally, automated computational programs for AVI need to be developed to increase its computational efficiency and
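    A compact sketch of the AVI idea: average RF variable importances over repeated fits to damp run-to-run noise, then eliminate the weakest predictor step by step, keeping the subset with the best score. scikit-learn stands in for the authors' workflow, and all settings (repeat count, OOB scoring, tree count) are illustrative.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def averaged_variable_importance(X, y, n_repeats=10):
        """AVI: average RF importances over fits with different seeds."""
        imp = np.zeros(X.shape[1])
        for seed in range(n_repeats):
            rf = RandomForestClassifier(n_estimators=500, random_state=seed)
            imp += rf.fit(X, y).feature_importances_
        return imp / n_repeats

    def backward_avi_selection(X, y, names, min_vars=2):
        """Drop the least important predictor step by step, tracking
        out-of-bag accuracy; return the best-scoring subset."""
        keep = list(range(X.shape[1]))
        best_score, best_set = -np.inf, list(keep)
        while len(keep) >= min_vars:
            rf = RandomForestClassifier(n_estimators=500, oob_score=True,
                                        random_state=0).fit(X[:, keep], y)
            if rf.oob_score_ > best_score:
                best_score, best_set = rf.oob_score_, list(keep)
            avi = averaged_variable_importance(X[:, keep], y)
            keep.pop(int(np.argmin(avi)))
        return [names[i] for i in best_set], best_score
    ```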

  9. HECTR [Hydrogen Event: Containment Transient Response] analyses of the Nevada Test Site (NTS) premixed combustion experiments

    International Nuclear Information System (INIS)

    The HECTR (Hydrogen Event: Containment Transient Response) computer code has been developed at Sandia National Laboratories to predict the transient pressure and temperature responses within reactor containments for hypothetical accidents involving the transport and combustion of hydrogen. Although HECTR was designed primarily to investigate these phenomena in LWRs, it may also be used to analyze hydrogen transport and combustion experiments. It is in this manner that HECTR is assessed and its empirical correlations, such as the combustion completeness and flame speed correlations of the hydrogen combustion model, are upgraded where necessary. In this report, we present HECTR analyses of the large-scale premixed hydrogen combustion experiments at the Nevada Test Site (NTS) and compare them with the test results. The existing correlations in HECTR version 1.0, under certain conditions, have difficulty accurately predicting the combustion completeness and burn time for the NTS experiments. By combining the combustion data obtained from the NTS experiments with other experimental data (FITS, VGES, ACUREX, and Whiteshell), a set of new and better combustion correlations was generated. The HECTR prediction of the containment responses, using a single-compartment model and EPRI-provided combustion completeness and burn time, compares reasonably well with the test results. However, the HECTR prediction of the containment responses using a multicompartment model does not compare well with the test results. This discrepancy shows the deficiency of the homogeneous burning model used in HECTR. To overcome this deficiency, a flame propagation model is highly recommended. 16 refs., 84 figs., 5 tabs

  10. Evaluation of residue-residue contact prediction in CASP10

    KAUST Repository

    Monastyrskyy, Bohdan

    2013-08-31

    We present the results of the assessment of the intramolecular residue-residue contact predictions from 26 prediction groups participating in the 10th round of the CASP experiment. The most recently developed direct coupling analysis methods did not take part in the experiment likely because they require a very deep sequence alignment not available for any of the 114 CASP10 targets. The performance of contact prediction methods was evaluated with the measures used in previous CASPs (i.e., prediction accuracy and the difference between the distribution of the predicted contacts and that of all pairs of residues in the target protein), as well as new measures, such as the Matthews correlation coefficient, the area under the precision-recall curve and the ranks of the first correctly and incorrectly predicted contact. We also evaluated the ability to detect interdomain contacts and tested whether the difficulty of predicting contacts depends upon the protein length and the depth of the family sequence alignment. The analyses were carried out on the target domains for which structural homologs did not exist or were difficult to identify. The evaluation was performed for all types of contacts (short, medium, and long-range), with emphasis placed on long-range contacts, i.e. those involving residues separated by at least 24 residues along the sequence. The assessment suggests that the best CASP10 contact prediction methods perform at approximately the same level, and comparably to those participating in CASP9.
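    As an illustration of the pairwise evaluation, here is a minimal computation of the Matthews correlation coefficient over predicted contact pairs, using the stated 24-residue separation convention for the long-range class; the set representation and helper name are assumptions.

    ```python
    import numpy as np

    def contact_mcc(pred_pairs, true_contacts, all_pairs):
        """Matthews correlation coefficient for a set of predicted contacts.
        pred_pairs / true_contacts are sets of residue-index pairs (i, j),
        e.g. restricted to |i - j| >= 24 for long-range contacts; all_pairs
        is every eligible pair. A plain confusion-matrix computation."""
        tp = len(pred_pairs & true_contacts)
        fp = len(pred_pairs - true_contacts)
        fn = len(true_contacts - pred_pairs)
        tn = len(all_pairs) - tp - fp - fn
        denom = np.sqrt(float((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn)))
        return (tp * tn - fp * fn) / denom if denom else 0.0
    ```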

  11. Update on protein structure prediction

    DEFF Research Database (Denmark)

    Hubbard, T; Tramontano, A; Barton, G;

    1996-01-01

    Computational tools for protein structure prediction are of great interest to molecular, structural and theoretical biologists due to a rapidly increasing number of protein sequences with no known structure. In October 1995, a workshop was held at IRBM to predict as much as possible about a number of proteins of biological interest using ab initio prediction or fold recognition methods. 112 protein sequences were collected via an open invitation for target submissions. 17 were selected for prediction during the workshop, and for 11 of these a prediction of some reliability could be made. We believe

  12. Prediction of IRI in short and long terms for flexible pavements: ANN and GMDH methods

    NARCIS (Netherlands)

    Ziari, H.; Sobhani, J.; Ayoubinejad, J.; Hartmann, T.

    2015-01-01

    Prediction of pavement condition is one of the most important issues in pavement management systems. In this paper, capabilities of artificial neural networks (ANNs) and group method of data handling (GMDH) methods in predicting flexible pavement conditions were analysed in three levels: in 1 year,

  13. Practices for predicting and preventing preterm birth in Ireland: a national survey.

    LENUS (Irish Health Repository)

    Smith, V

    2011-03-01

    Preterm birth can result in adverse outcomes for the neonate and/or his/her family. The accurate prediction and prevention of preterm birth is paramount. This study describes and critically analyses practices for predicting and preventing preterm birth in Ireland.

  14. Predicting the Creativity of Design Majors Based on the Interaction of Diverse Personality Traits

    Science.gov (United States)

    Chang, Chi-Cheng; Peng, Li-Pei; Lin, Ju-Sen; Liang, Chaoyun

    2015-01-01

    In this study, design majors were analysed to examine how diverse personality traits interact and influence student creativity. The study participants comprised 476 design majors. The results indicated that openness predicted the originality of creativity, whereas openness, conscientiousness and agreeableness predicted the usefulness of…

  15. Modeling and Prediction Overview

    Energy Technology Data Exchange (ETDEWEB)

    Ermak, D L

    2002-10-18

    Effective preparation for and response to the release of toxic materials into the atmosphere hinges on accurate predictions of the dispersion pathway, concentration, and ultimate fate of the chemical or biological agent. Of particular interest is the threat to civilian populations within major urban areas, which are likely targets for potential attacks. The goals of the CBNP Modeling and Prediction area are: (1) Development of a suite of validated, multi-scale, atmospheric transport and fate modeling capabilities for chemical and biological agent releases within the complex urban environment; (2) Integration of these models and related user tools into operational emergency response systems. Existing transport and fate models are being adapted to treat the complex atmospheric flows within and around structures (e.g., buildings, subway systems, urban areas) and over terrain. Relevant source terms and the chemical and physical behavior of gas- and particle-phase species (e.g., losses due to deposition, bio-agent viability, degradation) are also being developed and incorporated into the models. Model validation is performed using both laboratory and field data. CBNP is producing and testing a suite of models with differing levels of complexity and fidelity to address the full range of user needs and applications. Lumped-parameter transport models are being developed for subway systems and building interiors, supplemented by the use of computational fluid dynamics (CFD) models to describe the circulation within large, open spaces such as auditoriums. Both sophisticated CFD transport models and simpler fast-response models are under development to treat the complex flow around individual structures and arrays of buildings. Urban parameterizations are being incorporated into regional-scale weather forecast, meteorological data assimilation, and dispersion models for problems involving larger-scale urban and suburban areas. Source term and dose response models are being

  16. Earthquake prediction with electromagnetic phenomena

    Energy Technology Data Exchange (ETDEWEB)

    Hayakawa, Masashi, E-mail: hayakawa@hi-seismo-em.jp [Hayakawa Institute of Seismo Electomagnetics, Co. Ltd., University of Electro-Communications (UEC) Incubation Center, 1-5-1 Chofugaoka, Chofu Tokyo, 182-8585 (Japan); Advanced Wireless & Communications Research Center, UEC, Chofu Tokyo (Japan); Earthquake Analysis Laboratory, Information Systems Inc., 4-8-15, Minami-aoyama, Minato-ku, Tokyo, 107-0062 (Japan); Fuji Security Systems. Co. Ltd., Iwato-cho 1, Shinjyuku-ku, Tokyo (Japan)

    2016-02-01

    Short-term earthquake (EQ) prediction is defined as prospective prediction with the time scale of about one week, which is considered to be one of the most important and urgent topics for the human beings. If this short-term prediction is realized, casualty will be drastically reduced. Unlike the conventional seismic measurement, we proposed the use of electromagnetic phenomena as precursors to EQs in the prediction, and an extensive amount of progress has been achieved in the field of seismo-electromagnetics during the last two decades. This paper deals with the review on this short-term EQ prediction, including the impossibility myth of EQs prediction by seismometers, the reason why we are interested in electromagnetics, the history of seismo-electromagnetics, the ionospheric perturbation as the most promising candidate of EQ prediction, then the future of EQ predictology from two standpoints of a practical science and a pure science, and finally a brief summary.

  17. Earthquake prediction with electromagnetic phenomena

    International Nuclear Information System (INIS)

    Short-term earthquake (EQ) prediction is defined as prospective prediction with the time scale of about one week, which is considered to be one of the most important and urgent topics for the human beings. If this short-term prediction is realized, casualty will be drastically reduced. Unlike the conventional seismic measurement, we proposed the use of electromagnetic phenomena as precursors to EQs in the prediction, and an extensive amount of progress has been achieved in the field of seismo-electromagnetics during the last two decades. This paper deals with the review on this short-term EQ prediction, including the impossibility myth of EQs prediction by seismometers, the reason why we are interested in electromagnetics, the history of seismo-electromagnetics, the ionospheric perturbation as the most promising candidate of EQ prediction, then the future of EQ predictology from two standpoints of a practical science and a pure science, and finally a brief summary

  18. Chaotic time series. Part II. System Identification and Prediction

    Directory of Open Access Journals (Sweden)

    Bjørn Lillekjendlie

    1994-10-01

    Full Text Available This paper is the second in a series of two, and describes the current state of the art in modeling and prediction of chaotic time series. Sample data from deterministic non-linear systems may look stochastic when analysed with linear methods. However, the deterministic structure may be uncovered and non-linear models constructed that allow improved prediction. We give the background for such methods from a geometrical point of view, and briefly describe the following types of methods: global polynomials, local polynomials, multilayer perceptrons and semi-local methods including radial basis functions. Some illustrative examples from known chaotic systems are presented, emphasising the increase in prediction error with time. We compare some of the algorithms with respect to prediction accuracy and storage requirements, and list applications of these methods to real data from widely different areas.

  19. Analyses of beyond design basis accident homogeneous boron dilution scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Kereszturi, Andras; Hegyi, Gyoergy; Maraczy, Csaba; Trosztel, Istvan; Tota, Adam [Hungarian Academy of Sciences, Centre for Energy Research, Budapest (Hungary); Karsa, Zoltan [NUBIKI Nuclear Safety Research Institute, Ltd., Budapest (Hungary)

    2015-09-15

    Homogeneous boron dilution scenarios in a VVER-440 reactor were analyzed using the coupled KIKO3D-ATHLET code. The scenarios are named ''homogeneous'' because of the very slow dilution caused by a rupture in the heat exchanger of the makeup system. Without the presented analyses, a significant contribution of the homogeneous boron dilution to the Core Damage Frequency (CDF) had to be assumed in the Probabilistic Safety Analyses (PSA). According to the combined results of the presented deterministic and probabilistic analyses, the final conclusion is that boron dilution transients don't give significant contribution to the CDF for the investigated VVER-440 NPP.

  20. Analyse de la texture: Filtrage et Matrice de cooccurrence

    OpenAIRE

    Belhabri, Mohamed Amine; Aissaoui, Houssem Eddine

    2014-01-01

    L’analyse de la texture des images est un domaine de recherche qui, aujourd'hui, donne lieu à de nombreuses études. Analyser la texture, revient à exploiter les relations de voisinage des pixels. En ce sens, elle cherche à rehausser les structures significatives contenues dans l’image. Dans ce document, on s’est intéressé à l’une des approches d’analyse de texture à savoir la matrice de cooccurrence et au filtrage des images. Ces approches sont des outils qui peuvent permettre ...

  1. Alice Krieg-Planque, Analyser les discours institutionnels

    OpenAIRE

    Simon, Justine

    2012-01-01

    Comment amener des étudiants d’origines disciplinaires diverses à pratiquer l’analyse de discours et à s’en approprier les notions et concepts principaux, au service de l’analyse des productions discursives institutionnelles ? Telle est l’ambition centrale de l’ouvrage d’Alice Krieg-Planque, récemment paru chez Armand Colin. A la fois « guide pour l’analyse » et manuel visant à tisser un ensemble de compétences en matière d’approches discursives pour des étudiants au sein de formations plurid...

  2. Analyse des signaux électrocardiogrammes:Une approche fractale

    OpenAIRE

    SEDJELMACI, Ibticeme

    2009-01-01

    Notre travail consiste à traiter les signaux ECG de sujets de différents ages, normaux et présentant des pathologies par une nouvelle approche en traitement du signal appelée analyse fractale pour une identification de différentes pathologies et une éventuelle possibilité de trouver une signature fractale. L'utilisation d'ondelettes et d'outils d'analyse fractale est appropriée à l'analyse des signaux irréguliers. La caractérisation de la régularité (locale) est importante dans...

  3. The role of CFD computer analyses in hydrogen safety management

    Energy Technology Data Exchange (ETDEWEB)

    Komen, Ed M.J.; Visser, Dirk C.; Roelofs, Ferry [Nuclear Research and Consultancy Group (NRG), Petten (Netherlands); Te Lintelo, Jos G.T. [N.V. Elekticiteits-Productiemaatschappij Zuid-Nederland EPZ, Borssele (Netherlands)

    2015-11-15

    The risks of hydrogen release and combustion during a severe accident in a light water reactor have attracted considerable attention after the Fukushima accident in Japan. Reliable computer analyses are needed for the optimal design of hydrogen mitigation systems. In the last decade, significant progress has been made in the development, validation, and application of more detailed, three-dimensional Computational Fluid Dynamics (CFD) simulations for hydrogen safety analyses. The validation status and reliability of CFD code simulations will be illustrated by validation analyses performed for experiments executed in the PANDA, THAI, and ENACCEF facilities.

  4. Protein Chemical Shift Prediction

    CERN Document Server

    Larsen, Anders S

    2014-01-01

    The protein chemical shifts holds a large amount of information about the 3-dimensional structure of the protein. A number of chemical shift predictors based on the relationship between structures resolved with X-ray crystallography and the corresponding experimental chemical shifts have been developed. These empirical predictors are very accurate on X-ray structures but tends to be insensitive to small structural changes. To overcome this limitation it has been suggested to make chemical shift predictors based on quantum mechanical(QM) calculations. In this thesis the development of the QM derived chemical shift predictor Procs14 is presented. Procs14 is based on 2.35 million density functional theory(DFT) calculations on tripeptides and contains corrections for hydrogen bonding, ring current and the effect of the previous and following residue. Procs14 is capable at performing predictions for the 13CA, 13CB, 13CO, 15NH, 1HN and 1HA backbone atoms. In order to benchmark Procs14, a number of QM NMR calculatio...

  5. An exact prediction of

    International Nuclear Information System (INIS)

    We propose that the expectation value of a circular BPS-Wilson loop in N=4 supersymmetric Yang--Mills can be calculated exactly, to all orders in a 1/N expansion and to all orders in g2N. Using the AdS/CFT duality, this result yields a prediction of the value of the string amplitude with a circular boundary to all orders in α' and to all orders in gs. We then compare this result with string theory. We find that the gauge theory calculation, for large g2N and to all orders in the 1/N2 expansion, does agree with the leading string theory calculation, to all orders in gs and to lowest order in α'. We also find a relation between the expectation value of any closed smooth Wilson loop and the loop related to it by an inversion that takes a point along the loop to infinity, and compare this result, again successfully, with string theory

  6. Predictive Analysis for Social Processes II: Predictability and Warning Analysis

    CERN Document Server

    Colbaugh, Richard

    2009-01-01

    This two-part paper presents a new approach to predictive analysis for social processes. Part I identifies a class of social processes, called positive externality processes, which are both important and difficult to predict, and introduces a multi-scale, stochastic hybrid system modeling framework for these systems. In Part II of the paper we develop a systems theory-based, computationally tractable approach to predictive analysis for these systems. Among other capabilities, this analytic methodology enables assessment of process predictability, identification of measurables which have predictive power, discovery of reliable early indicators for events of interest, and robust, scalable prediction. The potential of the proposed approach is illustrated through case studies involving online markets, social movements, and protest behavior.

  7. SparSNP: Fast and memory-efficient analysis of all SNPs for phenotype prediction

    OpenAIRE

    Abraham Gad; Kowalczyk Adam; Zobel Justin; Inouye Michael

    2012-01-01

    Abstract Background A central goal of genomics is to predict phenotypic variation from genetic variation. Fitting predictive models to genome-wide and whole genome single nucleotide polymorphism (SNP) profiles allows us to estimate the predictive power of the SNPs and potentially develop diagnostic models for disease. However, many current datasets cannot be analysed with standard tools due to their large size. Results We introduce SparSNP, a tool for fitting lasso linear models for massive S...

  8. An application of Auto-regressive (AR) model in predicting Aeroelastic Effectsof Lekki Cable Stayed Bridge

    OpenAIRE

    Hassan Abba Musa; Dr. A. Mohammed

    2016-01-01

    In current practice, the predictive analysis of stochastic problems encompasses a variety of statistical techniques from modeling, machine, and data mining that analyse current and historical facts to make predictions about future. Therefore, this research uses an AR Model whose codes are incorporated in the MATLAB software to predict possible aero-elastic effects of Lekki Bridge based on its existing parametric data and the conditions around the bridge. It was seen that, the fluc...

  9. New insights into domestication of carrot from root transcriptome analyses

    NARCIS (Netherlands)

    Rong, J.; Lammers, Y.; Strasburg, J.L.; Schidlo, N.S.; Ariyurek, Y.; Jong, de T.J.; Klinkhamer, P.G.L.; Smulders, M.J.M.; Vrieling, K.

    2014-01-01

    Background - Understanding the molecular basis of domestication can provide insights into the processes of rapid evolution and crop improvement. Here we demonstrated the processes of carrot domestication and identified genes under selection based on transcriptome analyses. Results - The root transcr

  10. Systematic Derivation of Static Analyses for Software Product Lines

    DEFF Research Database (Denmark)

    Midtgaard, Jan; Brabrand, Claus; Wasowski, Andrzej

    2014-01-01

    is a classical framework for deriving static analyses in a compositional, step-by-step manner. We show how to take an analysis expressed as an abstract interpretation and lift each of the abstract interpretation steps to a family of programs. This includes schemes for how to lift domain types, and combinators......A recent line of work lifts particular verification and analysis methods to Software Product Lines (SPL). In an effort to generalize such case-by-case approaches, we develop a systematic methodology for lifting program analyses to SPLs using abstract interpretation. Abstract interpretation...... for lifting analyses and Galois connections. We prove that for analyses developed using our method, the soundness of lifting follows by construction. Finally, we discuss approximating variability in an analysis and we derive variational data-flow equations for an example analysis, a constant propagation...

  11. Multielement trace analyses of SINQ materials by ICP-OES

    Energy Technology Data Exchange (ETDEWEB)

    Keil, R.; Schwikowski, M. [Paul Scherrer Inst. (PSI), Villigen (Switzerland)

    1997-09-01

    Inductively Coupled Plasma Optical Emission Spectrometry was used to analyse 70 elements in various materials used for construction of the SINQ. Detection limits for individual elements depend strongly on the matrix and had to be determined separately. (author) 1 tab.

  12. Duurzaam concurreren in de Nederlandse melkveehouderij: een eerste verkennende analyse

    NARCIS (Netherlands)

    Bergevoet, R.H.M.; Calker, van K.J.; Goddijn, S.T.

    2006-01-01

    Dit rapport bevat het resultaat van een eerste verkennende analyse van de positie op het gebied van duurzaamheid van de Nederlandse melkveehouderij. Onderzocht zijn maatschappelijke en ecologische duurzaamheid van de Nederlandse melkveehouderij in vergelijking met de duurzaamheid van de melkveehoude

  13. Summary of Prometheus Radiation Shielding Nuclear Design Analyses , for information

    International Nuclear Information System (INIS)

    This report transmits a summary of radiation shielding nuclear design studies performed to support the Prometheus project. Together, the enclosures and references associated with this document describe NRPCT (KAPL and Bettis) shielding nuclear design analyses done for the project

  14. L’analyse de discours, mesures à l’appui

    OpenAIRE

    Fiala, Pierre

    2007-01-01

    Philippe Schepens : L’analyse de discours « à la française » reste marquée par les grands auteurs qui ont présidé à sa naissance : Foucault, Althusser, Pêcheux, pour n’en citer que trois. Dans la pratique de l’analyse de discours, une série de notions restent « des zones sensibles » sur le plan épistémologique et méthodologique. La toute première est celle de la visée qu’on donne à l’analyse, et celle de sa légitimité : ainsi Maingueneau pose-t-il cette question dans Le Dictionnaire d’analyse...

  15. The MAFLA (Mississippi, Alabama, Florida) Study, Grain Size Analyses

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The MAFLA (Mississippi, Alabama, Florida) Study was funded by NOAA as part of the Outer Continental Shelf Program. Dr. L.J. Doyle produced grain size analyses in...

  16. Finite element analyses for RF photoinjector gun cavities

    Energy Technology Data Exchange (ETDEWEB)

    Marhauser, F. [Berliner Elektronenspeicherring-Gesellschaft fuer Synchrotronstrahlung mbH (BESSY), Berlin (Germany)

    2006-07-01

    This paper details electromagnetical, thermal and structural 3D Finite Element Analyses (FEA) for normal conducting RF photoinjector gun cavities. The simulation methods are described extensively. Achieved results are presented. (orig.)

  17. An early "Atkins' Diet": RA Fisher analyses a medical "experiment".

    Science.gov (United States)

    Senn, Stephen

    2006-04-01

    A study on vitamin absorption which RA Fisher analysed for WRG Atkins and co-authored with him is critically examined. The historical background as well as correspondence between Atkins and Fisher is presented.

  18. Flood frequency analyses with annual and partial flood series

    Science.gov (United States)

    Bezak, N.; Brilly, M.; Sraj, M.

    2012-04-01

    The objective of the study was (1) to analyse the influence of time scale of the data on the results, (2) to analyse the relations between discharge, volume and time of flood waves of the Sava river at Litija (Slovenia), (3) to perform flood frequency analyses of peak discharges with annual and partial data series and compare the results and (4) to explore the influence of threshold value by POT method. Calculations and analyses were made for the period 1953-2010. Daily scale data sets (considering also local maximum) were used. The flood frequency analyses were based on anual and partial data series. The differences between daily and hourly time scale data sets were explored. Daily and hourly time scale hydrographs were compared and differences were analysed. Differences were adequately small. Daily time series with included maximums were logical choice because of the length of the daily time series and because hourly time series were not continuous due to gauging equipment failures. Important objective of the study was to analyse the relationship between discharge, volume and duration of flood waves. Baseflow was separated from continuous daily discharge measurements on simple and complex hydrographs. Simple graphical method with three points was used. Many different coefficients like base flow index were calculated and different combinations of correlation coefficient of wave components were examined. Annual maximum series were used to study the relationship between wave components. Flood frequency analyses were made with annual maximum series and partial duration series. Log-normal distribution, Pearson distribution type 3, log-Pearson distribution type 3, Gumbel distribution, exponential distribution, GEV distribution and GL distribution were used for annual maximum series. Simple equation of linear transformation was used to determine the design discharge and procedure which is proposed in Flood Estimation Handbook was used with GEV and GL distribution

  19. Imperfection insensitivity analyses of advanced composite tow-steered shells

    OpenAIRE

    Wu, Kin C; Farrokh, Babak; Stanford, Bret K.; Weaver, Paul M

    2016-01-01

    Two advanced composite tow-steered shells, one with tow overlaps and another without overlaps, were previously designed, fabricated and tested in end compression, both without cutouts, and with small and large cutouts. In each case, good agreement was observed between experimental buckling loads and supporting linear bifurcation buckling analyses. However, previous buckling tests and analyses have shown historically poor correlation, perhaps due to the presence of geometric imperfections that...

  20. Melanoma risk prediction models

    Directory of Open Access Journals (Sweden)

    Nikolić Jelena

    2014-01-01

    Full Text Available Background/Aim. The lack of effective therapy for advanced stages of melanoma emphasizes the importance of preventive measures and screenings of population at risk. Identifying individuals at high risk should allow targeted screenings and follow-up involving those who would benefit most. The aim of this study was to identify most significant factors for melanoma prediction in our population and to create prognostic models for identification and differentiation of individuals at risk. Methods. This case-control study included 697 participants (341 patients and 356 controls that underwent extensive interview and skin examination in order to check risk factors for melanoma. Pairwise univariate statistical comparison was used for the coarse selection of the most significant risk factors. These factors were fed into logistic regression (LR and alternating decision trees (ADT prognostic models that were assessed for their usefulness in identification of patients at risk to develop melanoma. Validation of the LR model was done by Hosmer and Lemeshow test, whereas the ADT was validated by 10-fold cross-validation. The achieved sensitivity, specificity, accuracy and AUC for both models were calculated. The melanoma risk score (MRS based on the outcome of the LR model was presented. Results. The LR model showed that the following risk factors were associated with melanoma: sunbeds (OR = 4.018; 95% CI 1.724- 9.366 for those that sometimes used sunbeds, solar damage of the skin (OR = 8.274; 95% CI 2.661-25.730 for those with severe solar damage, hair color (OR = 3.222; 95% CI 1.984-5.231 for light brown/blond hair, the number of common naevi (over 100 naevi had OR = 3.57; 95% CI 1.427-8.931, the number of dysplastic naevi (from 1 to 10 dysplastic naevi OR was 2.672; 95% CI 1.572-4.540; for more than 10 naevi OR was 6.487; 95%; CI 1.993-21.119, Fitzpatricks phototype and the presence of congenital naevi. Red hair, phototype I and large congenital naevi were