WorldWideScience

Sample records for analyses predict sistergroup

  1. Phylogenomic analyses predict sistergroup relationship of nucleariids and Fungi and paraphyly of zygomycetes with significant support

    Directory of Open Access Journals (Sweden)

    Steenkamp Emma

    2009-01-01

Full Text Available Abstract Background: Resolving the evolutionary relationships among Fungi remains challenging because of their highly variable evolutionary rates and the lack of a close phylogenetic outgroup. Nucleariida, an enigmatic group of amoeboids, have been proposed to emerge close to the fungal-metazoan divergence and might fulfill this role. Yet published phylogenies with up to five genes lack compelling statistical support, and genome-level data should be used to resolve this question with confidence. Results: Our analyses with nuclear (118 proteins) and mitochondrial (13 proteins) data now robustly associate Nucleariida and Fungi as neighbors, an assemblage that we term 'Holomycota'. With Nucleariida as an outgroup, we revisit unresolved deep fungal relationships. Conclusion: Our phylogenomic analysis provides significant support for the paraphyly of the traditional taxon Zygomycota and contradicts a recent proposal to include Mortierella in a phylum Mucoromycotina. We further question the introduction of separate phyla for Glomeromycota and Blastocladiomycota, whose phylogenetic positions relative to other phyla remain unresolved even with genome-level datasets. Our results motivate broad sampling of additional genome sequences from these phyla.

  2. The first record of a trans-oceanic sister-group relationship between obligate vertebrate troglobites.

    Directory of Open Access Journals (Sweden)

    Prosanta Chakrabarty

Full Text Available We show, using the most complete phylogeny of one of the most species-rich orders of vertebrates (Gobiiformes) and calibrations from the rich fossil record of teleost fishes, that the genus Typhleotris, endemic to subterranean karst habitats in southwestern Madagascar, is the sister group to Milyeringa, endemic to similar subterranean systems in northwestern Australia. Both groups are eyeless, and our phylogenetic and biogeographic results show that these obligate cave fishes, now found on opposite ends of the Indian Ocean (separated by nearly 7,000 km), are each other's closest relatives and owe their origins to the breakup of the southern supercontinent, Gondwana, at the end of the Cretaceous period. Trans-oceanic sister-group relationships are otherwise unknown between blind, cave-adapted vertebrates, and our results provide an extraordinary case of Gondwanan vicariance.

  3. The first record of a trans-oceanic sister-group relationship between obligate vertebrate troglobites.

    Science.gov (United States)

    Chakrabarty, Prosanta; Davis, Matthew P; Sparks, John S

    2012-01-01

We show, using the most complete phylogeny of one of the most species-rich orders of vertebrates (Gobiiformes) and calibrations from the rich fossil record of teleost fishes, that the genus Typhleotris, endemic to subterranean karst habitats in southwestern Madagascar, is the sister group to Milyeringa, endemic to similar subterranean systems in northwestern Australia. Both groups are eyeless, and our phylogenetic and biogeographic results show that these obligate cave fishes, now found on opposite ends of the Indian Ocean (separated by nearly 7,000 km), are each other's closest relatives and owe their origins to the breakup of the southern supercontinent, Gondwana, at the end of the Cretaceous period. Trans-oceanic sister-group relationships are otherwise unknown between blind, cave-adapted vertebrates, and our results provide an extraordinary case of Gondwanan vicariance.

  4. Analysing earthquake slip models with the spatial prediction comparison test

    KAUST Repository

    Zhang, L.

    2014-11-10

Earthquake rupture models inferred from inversions of geophysical and/or geodetic data exhibit remarkable variability due to uncertainties in modelling assumptions, the use of different inversion algorithms, or variations in data selection and data processing. A robust statistical comparison of different rupture models obtained for a single earthquake is needed to quantify the intra-event variability, both for benchmark exercises and for real earthquakes. The same approach may be useful to characterize (dis-)similarities in events that are typically grouped into a common class (e.g. moderate-size crustal strike-slip earthquakes or tsunamigenic large subduction earthquakes). For this purpose, we examine the performance of the spatial prediction comparison test (SPCT), a statistical test developed to compare spatial (random) fields by means of a chosen loss function that describes an error relation between a 2-D field (‘model’) and a reference model. We implement and calibrate the SPCT approach for a suite of synthetic 2-D slip distributions, generated as spatial random fields with various characteristics, and then apply the method to results of a benchmark inversion exercise with known solution. We find the SPCT to be sensitive to different spatial correlation lengths and different heterogeneity levels of the slip distributions. The SPCT approach proves to be a simple and effective tool for ranking slip models with respect to a reference model.
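The loss-function idea behind the SPCT can be illustrated with a minimal sketch on hypothetical synthetic data (the real test additionally inflates the variance of the test statistic to account for spatial correlation in the loss differential field; this sketch assumes independent cells):

```python
import numpy as np

rng = np.random.default_rng(0)

# Reference slip model and two candidate inversions on a 20 x 30 fault grid
# (synthetic stand-ins; the paper uses spatial random fields with
# prescribed correlation lengths).
ref = rng.random((20, 30))
model_a = ref + 0.05 * rng.standard_normal(ref.shape)   # close to reference
model_b = ref + 0.50 * rng.standard_normal(ref.shape)   # far from reference

def loss_field(model, reference):
    """Pointwise squared-error loss between a slip model and the reference."""
    return (model - reference) ** 2

# Loss differential field: negative values favour model A, positive model B.
d = loss_field(model_a, ref) - loss_field(model_b, ref)

# Naive test statistic on the mean loss differential (no spatial correction).
n = d.size
t_stat = d.mean() / (d.std(ddof=1) / np.sqrt(n))
print(f"mean loss differential = {d.mean():.4f}, t = {t_stat:.2f}")
```

A strongly negative statistic indicates model A is significantly closer to the reference under the chosen loss.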

  5. Prediction formulas for individual opioid analgesic requirements based on genetic polymorphism analyses.

    Directory of Open Access Journals (Sweden)

    Kaori Yoshida

Full Text Available The analgesic efficacy of opioids is well known to vary widely among individuals, and various factors related to individual differences in opioid sensitivity have been identified. However, a prediction model to calculate appropriate opioid analgesic requirements has not yet been established. The present study sought to construct prediction formulas for individual opioid analgesic requirements based on genetic polymorphisms and clinical data from patients who underwent cosmetic orthognathic surgery, and to validate the utility of the prediction formulas in patients who underwent major open abdominal surgery. To construct the prediction formulas, we performed multiple linear regression analyses using data from subjects who underwent cosmetic orthognathic surgery. The dependent variable was 24-h postoperative or perioperative fentanyl use, and the independent variables were age, gender, height, weight, pain perception latencies (PPL), and genotype data for five single-nucleotide polymorphisms (SNPs). To examine the utility of the prediction formulas, we performed simple linear regression analyses using subjects who underwent major open abdominal surgery. Actual 24-h postoperative or perioperative analgesic use and the predicted values calculated from the multiple regression equations were incorporated as dependent and independent variables, respectively. Multiple linear regression analyses showed that four SNPs, PPL, and weight were retained as independent predictors of 24-h postoperative fentanyl use (R² = 0.145, P = 5.66 × 10⁻¹⁰), and two SNPs and weight were retained as independent predictors of perioperative fentanyl use (R² = 0.185, P = 1.99 × 10⁻¹⁵). Simple linear regression analyses showed that the predicted values were retained as an independent predictor of actual 24-h postoperative analgesic use (R² = 0.033, P = 0.030) and perioperative analgesic use (R² = 0.100, P = 1.09 × 10⁻⁴), respectively. We constructed
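The model-construction step is ordinary multiple linear regression. A minimal sketch on simulated data (the predictor names, effect sizes, and sample size below are hypothetical, not the study's):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 120

# Hypothetical predictors standing in for the study's variables:
# body weight (kg) and minor-allele counts (0/1/2) at two SNPs.
weight = rng.normal(60, 10, n)
snp1 = rng.integers(0, 3, n)
snp2 = rng.integers(0, 3, n)

# Simulated 24-h fentanyl use with weight and genotype effects plus noise.
fentanyl = 2.0 * weight + 15.0 * snp1 - 10.0 * snp2 + rng.normal(0, 25, n)

# Design matrix with intercept; ordinary least squares, as in a standard
# multiple linear regression analysis.
X = np.column_stack([np.ones(n), weight, snp1, snp2])
beta, *_ = np.linalg.lstsq(X, fentanyl, rcond=None)
pred = X @ beta

# Coefficient of determination of the fitted formula.
ss_res = np.sum((fentanyl - pred) ** 2)
ss_tot = np.sum((fentanyl - fentanyl.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(f"R^2 = {r2:.3f}")
```

The fitted `beta` plays the role of the prediction formula; applying it to a new cohort and regressing actual use on `pred` mirrors the validation step.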

  6. Prediction Uncertainty Analyses for the Combined Physically-Based and Data-Driven Models

    Science.gov (United States)

    Demissie, Y. K.; Valocchi, A. J.; Minsker, B. S.; Bailey, B. A.

    2007-12-01

The unavoidable simplification associated with physically-based mathematical models can result in biased parameter estimates and correlated model calibration errors, which in return affect the accuracy of model predictions and the corresponding uncertainty analyses. In this work, a physically-based groundwater model (MODFLOW) together with error-correcting artificial neural networks (ANN) are used in a complementary fashion to obtain an improved prediction (i.e. prediction with reduced bias and error correlation). The associated prediction uncertainty of the coupled MODFLOW-ANN model is then assessed using three alternative methods. The first method estimates the combined model confidence and prediction intervals using first-order least-squares regression approximation theory. The second method uses Monte Carlo and bootstrap techniques for MODFLOW and ANN, respectively, to construct the combined model confidence and prediction intervals. The third method relies on a Bayesian approach that uses analytical or Monte Carlo methods to derive the intervals. The performance of these approaches is compared with Generalized Likelihood Uncertainty Estimation (GLUE) and Calibration-Constrained Monte Carlo (CCMC) intervals of the MODFLOW predictions alone. The results are demonstrated for a hypothetical case study developed based on a phytoremediation site at the Argonne National Laboratory. This case study comprises structural, parameter, and measurement uncertainties. The preliminary results indicate that the proposed three approaches yield comparable confidence and prediction intervals, thus making the computationally efficient first-order least-squares regression approach attractive for estimating the coupled model uncertainty. These results will be compared with GLUE and CCMC results.
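The bootstrap branch of the second method can be sketched for a toy model; the residual-resampling idea carries over to the ANN error-correction term (the linear stand-in model and all data here are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy calibration data standing in for head observations: a linear trend
# plus noise (MODFLOW itself is far richer; this only illustrates the
# residual-bootstrap construction of a prediction interval).
x = np.linspace(0, 10, 50)
y = 3.0 + 0.8 * x + rng.normal(0, 0.5, 50)

# Fit the "model" part: a least-squares line.
coef = np.polyfit(x, y, 1)
fitted = np.polyval(coef, x)
resid = y - fitted

# Bootstrap: resample residuals, refit, and collect predictions at x = 12.
x_new, n_boot = 12.0, 2000
preds = np.empty(n_boot)
for b in range(n_boot):
    y_b = fitted + rng.choice(resid, size=resid.size, replace=True)
    # Add one extra noise draw so this is a prediction interval,
    # not just a confidence interval on the mean response.
    preds[b] = np.polyval(np.polyfit(x, y_b, 1), x_new) + rng.choice(resid)

lo, hi = np.percentile(preds, [2.5, 97.5])
print(f"95% prediction interval at x={x_new}: [{lo:.2f}, {hi:.2f}]")
```

The same resampling loop, applied to ANN-corrected residuals, yields the coupled-model intervals the abstract compares against the regression-theory and Bayesian alternatives.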

  7. Quantitative Prediction of Coalbed Gas Content Based on Seismic Multiple-Attribute Analyses

    Directory of Open Access Journals (Sweden)

    Renfang Pan

    2015-09-01

Full Text Available Accurate prediction of gas planar distribution is crucial to selection and development of new CBM exploration areas. Based on seismic attributes, well logging and testing data, we found that seismic absorption attenuation, after eliminating the effects of burial depth, shows an evident correlation with CBM gas content; positive structure curvature has a negative correlation with gas content; and density has a negative correlation with gas content. It is feasible to use the hydrocarbon index (P*G) and pseudo-Poisson ratio attributes for detection of gas enrichment zones. Based on seismic multiple-attribute analyses, a multiple linear regression equation was established between the seismic attributes and gas content at the drilling wells. Application of this equation to the seismic attributes at locations other than the drilling wells yielded a quantitative prediction of planar gas distribution. Prediction calculations were performed for two different models, one using pre-stack inversion and the other disregarding pre-stack inversion. A comparison of the results indicates that both models predicted a similar trend for gas content distribution, except that the model using pre-stack inversion yielded a prediction result with considerably higher precision than the other model.

  8. HHV Predicting Correlations for Torrefied Biomass Using Proximate and Ultimate Analyses

    Directory of Open Access Journals (Sweden)

    Daya Ram Nhuchhen

    2017-01-01

Full Text Available Many correlations are available in the literature to predict the higher heating value (HHV) of raw biomass using the proximate and ultimate analyses. Studies on biomass torrefaction are growing tremendously, which suggests that fuel characteristics, such as HHV, proximate analysis and ultimate analysis, change significantly after torrefaction. Such changes may cause high estimation errors if the existing HHV correlations were to be used in predicting the HHV of torrefied biomass. No study has been carried out so far to verify this. Therefore, this study seeks answers to the question: “Can the existing correlations be used to determine the HHV of torrefied biomass?” To answer this, the existing HHV predicting correlations were tested using torrefied biomass data points. Estimation errors were found to be significantly high for the existing HHV correlations, and thus, they are not suitable for predicting the HHV of torrefied biomass. New correlations were then developed using data points of torrefied biomass. The ranges of reported data for HHV, volatile matter (VM), fixed carbon (FC), ash (ASH), carbon (C), hydrogen (H) and oxygen (O) contents were 14.90 MJ/kg–33.30 MJ/kg, 13.30%–88.57%, 11.25%–82.74%, 0.08%–47.62%, 35.08%–86.28%, 0.53%–7.46% and 4.31%–44.70%, respectively. Correlations with the minimum mean absolute errors and having all components of the proximate and ultimate analyses were selected for future use. The selected new correlations have a good accuracy of prediction when validated against another set of data (26 samples). Thus, these new and more accurate correlations can be useful in modeling different thermochemical processes, including combustion, pyrolysis and gasification of torrefied biomass.
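For orientation, the kind of correlation being tested can be illustrated with the widely used Channiwala-Parikh correlation for raw fuels (not one of the paper's new torrefied-biomass correlations); the composition below is a hypothetical torrefied sample within the ranges reported in the abstract:

```python
def hhv_channiwala_parikh(c, h, o, n=0.0, s=0.0, ash=0.0):
    """Channiwala-Parikh correlation for higher heating value (MJ/kg),
    developed for raw fuels; inputs are dry-basis mass percentages."""
    return (0.3491 * c + 1.1783 * h + 0.1005 * s
            - 0.1034 * o - 0.0151 * n - 0.0211 * ash)

# Hypothetical torrefied-biomass composition within the abstract's ranges
# (C 35.08-86.28%, H 0.53-7.46%, O 4.31-44.70%).
c, h, o, ash = 60.0, 5.0, 30.0, 5.0
hhv = hhv_channiwala_parikh(c, h, o, ash=ash)
print(f"predicted HHV = {hhv:.2f} MJ/kg")
```

Comparing such raw-fuel predictions against measured torrefied-biomass HHVs is exactly how the estimation errors in the abstract would be quantified.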

  9. Predictability of the monthly North Atlantic Oscillation index based on fractal analyses and dynamic system theory

    Directory of Open Access Journals (Sweden)

    M. D. Martínez

    2010-03-01

Full Text Available The predictability of the monthly North Atlantic Oscillation, NAO, index is analysed from the point of view of different fractal concepts and dynamic system theory, such as lacunarity, rescaled analysis (Hurst exponent) and the reconstruction theorem (embedding and correlation dimensions, Kolmogorov entropy and Lyapunov exponents). The main results point out evident signs of randomness and the necessity of stochastic models to represent time evolution of the NAO index. The results also show that the monthly NAO index behaves as a white-noise Gaussian process. The high minimum number of nonlinear equations needed to describe the physical process governing the NAO index fluctuations is evidence of its complexity. A notable predictive instability is indicated by the positive Lyapunov exponents. Besides corroborating the complex time behaviour of the NAO index, the present results suggest that random Cantor sets would be an interesting tool to model lacunarity and time evolution of the NAO index.
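The Hurst-exponent part of the analysis rests on rescaled-range (R/S) statistics; for a white-noise Gaussian process, as the abstract concludes for the NAO index, the estimate should fall near H = 0.5. A minimal sketch on synthetic white noise (series length and window sizes are arbitrary choices, not the study's):

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.standard_normal(4096)   # white-noise stand-in for the monthly NAO index

def hurst_rs(series, window_sizes):
    """Estimate the Hurst exponent by rescaled-range (R/S) analysis:
    the slope of log(R/S) against log(window size)."""
    log_n, log_rs = [], []
    for n in window_sizes:
        rs_vals = []
        for start in range(0, len(series) - n + 1, n):
            w = series[start:start + n]
            z = np.cumsum(w - w.mean())          # cumulative deviation profile
            r = z.max() - z.min()                # range of the profile
            s = w.std(ddof=1)                    # standard deviation
            if s > 0:
                rs_vals.append(r / s)
        log_n.append(np.log(n))
        log_rs.append(np.log(np.mean(rs_vals)))
    slope, _ = np.polyfit(log_n, log_rs, 1)
    return slope

h = hurst_rs(x, [16, 32, 64, 128, 256, 512])
print(f"estimated Hurst exponent H = {h:.2f}")
```

Values near 0.5 indicate no long-range persistence, consistent with the stochastic, white-noise character the abstract reports (small-sample R/S estimates are typically biased slightly above 0.5).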

  10. Design and Antigenic Epitopes Prediction of a New Trial Recombinant Multiepitopic Rotaviral Vaccine: In Silico Analyses.

    Science.gov (United States)

    Jafarpour, Sima; Ayat, Hoda; Ahadi, Ali Mohammad

    2015-01-01

Rotavirus is the major etiologic factor of severe diarrheal disease. Natural infection provides protection against subsequent rotavirus infection and diarrhea. This research presents a new vaccine designed based on computational models. In this study, three types of epitopes are considered in a proposed model protein: linear, conformational, and combinational. Several studies on rotavirus vaccines have shown that VP6 and VP4 proteins are good candidates for vaccine production. In the present study, a fusion protein was designed as a new generation of rotavirus vaccines by bioinformatics analyses. This model-based study using the ABCpred, BCPREDS, Bcepred, and Ellipro web servers showed that the peptide presented in this article has the necessary properties to act as a vaccine. Prediction of linear B-cell epitopes of peptides is helpful to investigate whether these peptides are able to activate humoral immunity.

  11. PASMet: a web-based platform for prediction, modelling and analyses of metabolic systems.

    Science.gov (United States)

    Sriyudthsak, Kansuporn; Mejia, Ramon Francisco; Arita, Masanori; Hirai, Masami Yokota

    2016-07-01

PASMet (Prediction, Analysis and Simulation of Metabolic networks) is a web-based platform for proposing and verifying mathematical models to understand the dynamics of metabolism. The advantages of PASMet include user-friendliness and accessibility, which enable biologists and biochemists to easily perform mathematical modelling. PASMet offers a series of user functions to handle time-series data of metabolite concentrations. The functions are organised into four steps: (i) Prediction of a probable metabolic pathway and its regulation; (ii) Construction of mathematical models; (iii) Simulation of metabolic behaviours; and (iv) Analysis of metabolic system characteristics. Each function contains various statistical and mathematical methods that can be used independently. Users who may not have enough knowledge of computing or programming can easily and quickly analyse their local data without software downloads, updates or installations. Users only need to upload their files in comma-separated values (CSV) format or enter their model equations directly into the website. Once the time-series data or mathematical equations are uploaded, PASMet automatically performs the computation server-side. Users can then interactively view their results and download them to their local computers. PASMet is freely available with no login requirement at http://pasmet.riken.jp/ from major web browsers on Windows, Mac and Linux operating systems.

  12. The mitochondrial genome of Paraspadella gotoi is highly reduced and reveals that chaetognaths are a sister-group to protostomes

    Energy Technology Data Exchange (ETDEWEB)

    Helfenbein, Kevin G.; Fourcade, H. Matthew; Vanjani, Rohit G.; Boore, Jeffrey L.

    2004-05-01

We report the first complete mitochondrial (mt) DNA sequence from a member of the phylum Chaetognatha (arrow worms). The Paraspadella gotoi mtDNA is highly unusual, missing 23 of the genes commonly found in animal mtDNAs, including atp6, which has otherwise been found universally to be present. Its 14 genes are unusually arranged into two groups, one on each strand. One group is punctuated by numerous non-coding intergenic nucleotides, while the other group is tightly packed, having no non-coding nucleotides, leading to speculation that there are two transcription units with differing modes of expression. The phylogenetic position of the Chaetognatha within the Metazoa has long been uncertain, with conflicting or equivocal results from various morphological analyses and rRNA sequence comparisons. Comparisons of amino acid sequences from mitochondrially encoded proteins give a single most parsimonious tree that supports a position of Chaetognatha as sister to the protostomes studied here. From this, one can more clearly interpret the patterns of evolution of various developmental features, especially regarding the embryological fate of the blastopore.
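Parsimony scoring of the kind used here can be illustrated with the classic Fitch algorithm on a single alignment column (the tree shapes and character states below are hypothetical, chosen only to show how one grouping can require fewer changes than another):

```python
def fitch_changes(tree, leaf_states):
    """Minimum number of state changes for one character on a rooted
    binary tree (Fitch parsimony). Leaves are names; internal nodes
    are (left, right) tuples."""
    changes = 0
    def post(node):
        nonlocal changes
        if isinstance(node, str):           # leaf: its observed state
            return {leaf_states[node]}
        left, right = node
        sl, sr = post(left), post(right)
        if sl & sr:                         # intersection: no change needed
            return sl & sr
        changes += 1                        # union: count one change
        return sl | sr
    post(tree)
    return changes

# One hypothetical alignment column for four taxa: the chaetognath shares
# the protostome state; the deuterostome shares the outgroup state.
states = {"chaetognath": "A", "protostome": "A",
          "deuterostome": "G", "outgroup": "G"}
sister_tree = (("chaetognath", "protostome"), ("deuterostome", "outgroup"))
alt_tree = (("chaetognath", "deuterostome"), ("protostome", "outgroup"))

print("chaetognath+protostome tree:", fitch_changes(sister_tree, states))
print("alternative grouping:      ", fitch_changes(alt_tree, states))
```

Summed over all columns, the tree minimizing such change counts is the most parsimonious tree the abstract refers to.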

  13. Statistical analyses on sandstones: Systematic approach for predicting petrographical and petrophysical properties

    Science.gov (United States)

    Stück, H. L.; Siegesmund, S.

    2012-04-01

Sandstones are a popular natural stone due to their wide occurrence and availability. The different applications for these stones have led to an increase in demand. From the viewpoint of conservation and the natural stone industry, an understanding of the material behaviour of this construction material is very important. Sandstones are a highly heterogeneous material. Based on statistical analyses with a sufficiently large dataset, a systematic approach to predicting the material behaviour should be possible. Since the literature already contains a large volume of data concerning the petrographical and petrophysical properties of sandstones, a large dataset could be compiled for the statistical analyses. The aim of this study is to develop constraints on the material behaviour and especially on the weathering behaviour of sandstones. Approximately 300 samples from historical and presently mined natural sandstones in Germany, as well as ones described worldwide, were included in the statistical approach. The mineralogical composition and fabric characteristics were determined from detailed thin section analyses and descriptions in the literature. Particular attention was paid to evaluating the compositional and textural maturity, grain contacts and contact thickness, type of cement, degree of alteration and the intergranular volume. Statistical methods were used to test for normal distributions and to calculate linear regressions of the basic petrophysical properties of density, porosity, water uptake and strength. The sandstones were classified into three different pore size distributions and evaluated against the other petrophysical properties. Weathering behaviour, such as hygric swelling and salt loading tests, was also included. To identify similarities between individual sandstones or to define groups of specific sandstone types, principal component analysis, cluster analysis and factor analysis were applied. Our results show that composition and porosity
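The dimensionality-reduction step (principal component analysis) can be sketched on a synthetic petrophysical table with the expected anticorrelations between porosity, density and strength built in (variable choices and effect sizes are hypothetical, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 300   # roughly the number of sandstone samples in the study

# Synthetic petrophysical table: porosity (%), bulk density (g/cm^3)
# and compressive strength (MPa), anticorrelated with porosity.
porosity = rng.uniform(2, 30, n)
density = 2.7 - 0.012 * porosity + rng.normal(0, 0.03, n)
strength = 120 - 3.0 * porosity + rng.normal(0, 10, n)
X = np.column_stack([porosity, density, strength])

# Principal component analysis via SVD of the standardized data matrix.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
explained = s**2 / np.sum(s**2)
print("variance explained by PC1-PC3:", np.round(explained, 3))
```

When the variables covary as strongly as here, a single component captures most of the variance, which is what makes PCA useful for grouping sandstone types.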

  14. The Prediction of Disruptive Behaviour Disorders in an Urban Community Sample: The Contribution of Person-Centred Analyses

    Science.gov (United States)

    Burt, Keith B.; Hay, Dale F.; Pawlby, Susan; Harold, Gordon; Sharp, Deborah

    2004-01-01

    Background: Variable- and person-centred analyses were used to examine prediction of middle childhood behaviour problems from earlier child and family measures. Method: A community sample of 164 families, initially recruited at antenatal clinics at two South London practices, was assessed for children's behaviour problems and cognitive ability,…

  15. Analysing the Relevance of Experience Partitions to the Prediction of Players’ Self-Reports of Affect

    DEFF Research Database (Denmark)

    Martínez, Héctor Pérez; Yannakakis, Georgios N.

    2011-01-01

    A common practice in modeling affect from physiological signals consists of reducing the signals to a set of statistical features that feed predictors of self-reported emotions. This paper analyses the impact of various time-windows, used for the extraction of physiological features...

  16. Measuring Usable Knowledge: Teachers' Analyses of Mathematics Classroom Videos Predict Teaching Quality and Student Learning

    Science.gov (United States)

    Kersting, Nicole B.; Givvin, Karen B.; Thompson, Belinda J.; Santagata, Rossella; Stigler, James W.

    2012-01-01

    This study explores the relationships between teacher knowledge, teaching practice, and student learning in mathematics. It extends previous work that developed and evaluated an innovative approach to assessing teacher knowledge based on teachers' analyses of classroom video clips. Teachers watched and commented on 13 fraction clips. These written…

  17. Aeromechanics and Aeroacoustics Predictions of the Boeing-SMART Rotor Using Coupled-CFD/CSD Analyses

    Science.gov (United States)

    Bain, Jeremy; Sim, Ben W.; Sankar, Lakshmi; Brentner, Ken

    2010-01-01

    This paper will highlight helicopter aeromechanics and aeroacoustics prediction capabilities developed by Georgia Institute of Technology, the Pennsylvania State University, and Northern Arizona University under the Helicopter Quieting Program (HQP) sponsored by the Tactical Technology Office of the Defense Advanced Research Projects Agency (DARPA). First initiated in 2004, the goal of the HQP was to develop high fidelity, state-of-the-art computational tools for designing advanced helicopter rotors with reduced acoustic perceptibility and enhanced performance. A critical step towards achieving this objective is the development of rotorcraft prediction codes capable of assessing a wide range of helicopter configurations and operations for future rotorcraft designs. This includes novel next-generation rotor systems that incorporate innovative passive and/or active elements to meet future challenging military performance and survivability goals.

  18. Accuracy of Fall Prediction in Parkinson Disease: Six-Month and 12-Month Prospective Analyses

    Directory of Open Access Journals (Sweden)

    Ryan P. Duncan

    2012-01-01

Full Text Available Introduction. We analyzed the ability of four balance assessments to predict falls in people with Parkinson Disease (PD) prospectively over six and 12 months. Materials and Methods. The BESTest, Mini-BESTest, Functional Gait Assessment (FGA), and Berg Balance Scale (BBS) were administered to 80 participants with idiopathic PD at baseline. Falls were then tracked for 12 months. The ability of each test to predict falls at six and 12 months was assessed using ROC curves and likelihood ratios (LR). Results. Twenty-seven percent of the sample had fallen at six months, and 32% had fallen at 12 months. At six months, areas under the ROC curve (AUC) for the tests ranged from 0.80 (FGA) to 0.89 (BESTest), with LR+ of 3.4 (FGA) to 5.8 (BESTest). At 12 months, AUCs ranged from 0.68 (BESTest, BBS) to 0.77 (Mini-BESTest), with LR+ of 1.8 (BESTest) to 2.4 (BBS, FGA). Discussion. The various balance tests were effective in predicting falls at six months; all tests were relatively ineffective at 12 months. Conclusion. This pilot study suggests that people with PD should be assessed biannually for fall risk.
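The reported quantities, AUC and the positive likelihood ratio, can be computed from raw test scores as in this sketch (scores, group sizes, and the cutoff are hypothetical, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical balance-test scores: fallers tend to score lower,
# mimicking a test such as the BESTest (real cutoffs come from the
# study's ROC analysis, not from these simulated values).
fallers = rng.normal(60, 10, 25)       # 25 fallers
non_fallers = rng.normal(75, 10, 55)   # 55 non-fallers

def auc(pos, neg):
    """Area under the ROC curve via the Mann-Whitney statistic:
    probability a random non-faller outscores a random faller."""
    wins = sum((n > p) + 0.5 * (n == p) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def pos_likelihood_ratio(pos, neg, cutoff):
    """LR+ = sensitivity / (1 - specificity) for the rule
    'score below cutoff => predicted faller'."""
    sens = np.mean(pos < cutoff)
    spec = np.mean(neg >= cutoff)
    return sens / (1 - spec)

a = auc(fallers, non_fallers)
lr = pos_likelihood_ratio(fallers, non_fallers, cutoff=68.0)
print(f"AUC = {a:.2f}, LR+ = {lr:.1f}")
```

An LR+ well above 1, like the 5.8 reported for the BESTest at six months, means a positive test substantially raises the post-test odds of falling.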

  19. Computational Prediction and Biochemical Analyses of New Inverse Agonists for the CB1 Receptor.

    Science.gov (United States)

    Scott, Caitlin E; Ahn, Kwang H; Graf, Steven T; Goddard, William A; Kendall, Debra A; Abrol, Ravinder

    2016-01-25

The human cannabinoid type 1 (CB1) G-protein-coupled receptor is a potential therapeutic target for obesity. The previously predicted and experimentally validated ensemble of ligand-free conformations of CB1 [Scott, C. E. et al. Protein Sci. 2013, 22, 101-113; Ahn, K. H. et al. Proteins 2013, 81, 1304-1317] is used here to predict the binding sites for known CB1-selective inverse agonists, including rimonabant and its seven known derivatives. This binding pocket, which differs significantly from previously published models, is used to identify 16 novel compounds expected to be CB1 inverse agonists by exploiting potential new interactions. We show experimentally that two of these compounds exhibit inverse agonist properties, including inhibition of basal and agonist-induced G-protein coupling activity, as well as an enhanced level of CB1 cell surface localization. This demonstrates the utility of using the predicted binding sites for an ensemble of CB1 receptor structures for designing new CB1 inverse agonists.

  20. Intrinsic disorder in Viral Proteins Genome-Linked: experimental and predictive analyses

    Directory of Open Access Journals (Sweden)

    Van Dorsselaer Alain

    2009-02-01

Full Text Available Abstract Background: VPgs are viral proteins linked to the 5' end of some viral genomes. Interactions between several VPgs and eukaryotic translation initiation factors eIF4Es are critical for plant infection. However, VPgs are not restricted to phytoviruses, being also involved in genome replication and protein translation of several animal viruses. To date, structural data are still limited to small picornaviral VPgs. Recently, three phytoviral VPgs were shown to be natively unfolded proteins. Results: In this paper, we report the bacterial expression, purification and biochemical characterization of two phytoviral VPgs, namely the VPgs of Rice yellow mottle virus (RYMV, genus Sobemovirus) and Lettuce mosaic virus (LMV, genus Potyvirus). Using far-UV circular dichroism and size exclusion chromatography, we show that RYMV and LMV VPgs are predominantly or partly unstructured in solution, respectively. Using several disorder predictors, we show that both proteins are predicted to possess disordered regions. We next extend these results to 14 VPgs representative of the viral diversity. Disordered regions were predicted in all VPg sequences, whatever the genus and the family. Conclusion: Based on these results, we propose that intrinsic disorder is a common feature of VPgs. The functional role of intrinsic disorder is discussed in light of the biological roles of VPgs.

  21. Prediction and validation of gene-disease associations using methods inspired by social network analyses.

    Directory of Open Access Journals (Sweden)

    U Martin Singh-Blom

Full Text Available Correctly identifying associations of genes with diseases has long been a goal in biology. With the emergence of large-scale gene-phenotype association datasets in biology, we can leverage statistical and machine learning methods to help us achieve this goal. In this paper, we present two methods for predicting gene-disease associations based on functional gene associations and gene-phenotype associations in model organisms. The first method, the Katz measure, is motivated by its success in social network link prediction, and is very closely related to some of the recent methods proposed for gene-disease association inference. The second method, called Catapult (Combining dATa Across species using Positive-Unlabeled Learning Techniques), is a supervised machine learning method that uses a biased support vector machine where the features are derived from walks in a heterogeneous gene-trait network. We study the performance of the proposed methods and related state-of-the-art methods using two different evaluation strategies, on two distinct data sets, namely OMIM phenotypes and drug-target interactions. Finally, by measuring the performance of the methods using two different evaluation strategies, we show that even though both methods perform very well, the Katz measure is better at identifying associations between traits and poorly studied genes, whereas Catapult is better suited to correctly identifying gene-trait associations overall.
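The Katz measure has a simple closed form, K = (I - βA)^(-1) - I, summing β-damped walks of all lengths between node pairs. A minimal sketch on a toy gene-trait network (the graph and β below are hypothetical):

```python
import numpy as np

# Toy undirected network: nodes 0-2 are genes, node 3 is a trait.
# Gene 2 is linked to the trait; genes 0 and 1 link only to gene 2.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

beta = 0.1   # damping factor; must satisfy beta < 1 / spectral_radius(A)

# Katz matrix: sum over walk lengths k of beta^k * A^k, in closed form.
n = A.shape[0]
K = np.linalg.inv(np.eye(n) - beta * A) - np.eye(n)

# Katz score of each gene for the trait (column 3): gene 2, directly
# connected, should outrank genes 0 and 1, which reach it only via walks.
scores = K[:3, 3]
print("gene->trait Katz scores:", np.round(scores, 4))
```

Ranking candidate genes by such scores is the link-prediction step; Catapult replaces the fixed damping with features learned from walks in the heterogeneous network.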

  22. Circulating biomarkers for predicting cardiovascular disease risk; a systematic review and comprehensive overview of meta-analyses.

    Directory of Open Access Journals (Sweden)

    Thijs C van Holten

Full Text Available BACKGROUND: Cardiovascular disease is one of the major causes of death worldwide. Assessing the risk for cardiovascular disease is an important aspect of clinical decision making and setting a therapeutic strategy, and the use of serological biomarkers may improve this. Despite an overwhelming number of studies and meta-analyses on biomarkers and cardiovascular disease, there are no comprehensive studies comparing the relevance of each biomarker. We performed a systematic review of meta-analyses on levels of serological biomarkers for atherothrombosis to compare the relevance of the most commonly studied biomarkers. METHODS AND FINDINGS: Medline and Embase were screened for search terms related to "arterial ischemic events" and "meta-analyses". The meta-analyses were sorted by patient groups without pre-existing cardiovascular disease, with cardiovascular disease, and heterogeneous groups concerning general populations, groups with and without cardiovascular disease, or miscellaneous. These were subsequently sorted by end-point for cardiovascular disease or stroke and summarized in tables. We identified 85 relevant full text articles, with 214 meta-analyses. Markers for primary cardiovascular events include, from high to low result: C-reactive protein, fibrinogen, cholesterol, apolipoprotein B, the apolipoprotein A/apolipoprotein B ratio, high density lipoprotein, and vitamin D. Markers for secondary cardiovascular events include, from high to low result: cardiac troponins I and T, C-reactive protein, serum creatinine, and cystatin C. For primary stroke, fibrinogen and serum uric acid are strong risk markers. A limitation is that there is no acknowledged search strategy for prognostic studies or meta-analyses. CONCLUSIONS: For primary cardiovascular events, markers with strong predictive potential are mainly associated with lipids. For secondary cardiovascular events, markers are more associated with ischemia. Fibrinogen is a

  23. Accuracy of finite element analyses of CT scans in predictions of vertebral failure patterns under axial compression and anterior flexion.

    Science.gov (United States)

    Jackman, Timothy M; DelMonaco, Alex M; Morgan, Elise F

    2016-01-25

    Finite element (FE) models built from quantitative computed tomography (QCT) scans can provide patient-specific estimates of bone strength and fracture risk in the spine. While prior studies demonstrate accurate QCT-based FE predictions of vertebral stiffness and strength, the accuracy of the predicted failure patterns, i.e., the locations where failure occurs within the vertebra and the way in which the vertebra deforms as failure progresses, is less clear. This study used digital volume correlation (DVC) analyses of time-lapse micro-computed tomography (μCT) images acquired during mechanical testing (compression and anterior flexion) of thoracic spine segments (T7-T9, n=28) to measure displacements occurring throughout the T8 vertebral body at the ultimate point. These displacements were compared to those simulated by QCT-based FE analyses of T8. We hypothesized that the FE predictions would be more accurate when the boundary conditions are based on measurements of pressure distributions within intervertebral discs with a similar level of disc degeneration, rather than boundary conditions representing rigid platens. The FE simulations captured some of the general, qualitative features of the failure patterns; however, displacement errors ranged from 12% to 279%. Contrary to our hypothesis, no differences in displacement errors were found when using boundary conditions representing measurements of disc pressure vs. rigid platens. The smallest displacement errors were obtained using boundary conditions that were measured directly by DVC at the T8 endplates. These findings indicate that further work is needed to develop methods of identifying physiological loading conditions for the vertebral body, for the purpose of achieving robust, patient-specific FE analyses of failure mechanisms.
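    The error measure compared in this record, the discrepancy between DVC-measured and FE-predicted displacements, can be sketched as a relative error in percent. The vectors below and the exact error definition are illustrative assumptions, not the paper's data:

```python
import math

def displacement_error_pct(measured, predicted):
    """Relative error (%) between a measured (DVC) and predicted (FE)
    displacement vector, normalized by the measured magnitude."""
    diff = math.dist(measured, predicted)
    ref = math.dist(measured, (0.0, 0.0, 0.0))
    return 100.0 * diff / ref

# Example: FE overpredicts the axial displacement of an endplate point.
dvc = (0.02, -0.01, -0.30)   # mm, measured by digital volume correlation
fe  = (0.03, -0.02, -0.45)   # mm, QCT-based FE prediction
err = displacement_error_pct(dvc, fe)
print(f"displacement error: {err:.0f}%")
```

    A per-point error of this kind, evaluated over the whole vertebral body, yields the 12-279% range quoted above.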

  4. Computational Analyses in Support of Sub-scale Diffuser Testing for the A-3 Facility. Part 1; Steady Predictions

    Science.gov (United States)

    Allgood, Daniel C.; Graham, Jason S.; Ahuja, Vineet; Hosangadi, Ashvin

    2010-01-01

    Simulation technology can play an important role in rocket engine test facility design and development by assessing risks, providing analysis of dynamic pressure and thermal loads, identifying failure modes and predicting anomalous behavior of critical systems. Advanced numerical tools assume greater significance in supporting testing and design of high altitude testing facilities and plume induced testing environments of high thrust engines because of the greater inter-dependence and synergy in the functioning of the different sub-systems. This is especially true for facilities such as the proposed A-3 facility at NASA SSC because of a challenging operating envelope linked to variable throttle conditions at relatively low chamber pressures. Facility designs in this case will require a complex network of diffuser ducts, steam ejector trains, fast operating valves, cooling water systems and flow diverters that need to be characterized for steady state performance. In this paper, we demonstrate the use of advanced CFD analyses to evaluate supersonic diffuser and steam ejector performance in a sub-scale A-3 facility at NASA Stennis Space Center (SSC), where extensive testing was performed. Furthermore, the focus in this paper relates to modeling of critical sub-systems and components used in facilities such as the A-3 facility. The work here addresses deficiencies in empirical models and current CFD analyses that are used for the design of supersonic diffusers/turning vanes/ejectors as well as analyses for confined plumes and venting processes. The primary areas that are addressed are: (1) supersonic diffuser performance, including analyses of thermal loads; (2) accurate shock capturing in the diffuser duct; (3) the effect of the turning duct on the performance of the facility; (4) prediction of mass flow rates and performance classification for steam ejectors; (5) comparisons with test data from sub-scale diffuser testing and assessment of confidence

  5. Benchmark of SCALE (SAS2H) isotopic predictions of depletion analyses for San Onofre PWR MOX fuel

    Energy Technology Data Exchange (ETDEWEB)

    Hermann, O.W.

    2000-02-01

    The isotopic composition of mixed-oxide (MOX) fuel, fabricated with both uranium and plutonium, after discharge from reactors is of significant interest to the Fissile Materials Disposition Program. The validation of the SCALE (SAS2H) depletion code for use in the prediction of isotopic compositions of MOX fuel, similar to previous validation studies on uranium-only fueled reactors, has corresponding significance. The EEI-Westinghouse Plutonium Recycle Demonstration Program examined the use of MOX fuel in the San Onofre PWR, Unit 1, during cycles 2 and 3. Isotopic analyses of the MOX spent fuel were conducted on 13 actinides and ¹⁴⁸Nd by either mass or alpha spectrometry. Six fuel pellet samples were taken from four different fuel pins of an irradiated MOX assembly. The measured actinide inventories from those samples have been used to benchmark SAS2H for MOX fuel applications. The average percentage differences between the code results and the measurements were −0.9% for ²³⁵U and 5.2% for ²³⁹Pu. The differences for most of the isotopes were significantly larger than in the cases of uranium-only fueled reactors. In general, comparisons of code results with alpha spectrometer data showed large differences, although the differences relative to mass spectrometer analyses were not much larger than those for uranium-only fueled reactors. This benchmark study should be useful in estimating uncertainties of inventory, criticality and dose calculations of MOX spent fuel.
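    The benchmark's percentage differences follow the usual signed definition, 100·(C − M)/M, of a calculated value C against a measured value M. A minimal sketch, with hypothetical normalized inventories chosen only to reproduce the quoted −0.9% and 5.2% averages:

```python
def percent_diff(calculated, measured):
    """Signed percentage difference of a code prediction relative to
    the measured isotopic inventory: 100 * (C - M) / M."""
    return 100.0 * (calculated - measured) / measured

# Hypothetical inventories (normalized, arbitrary units) for illustration
# only; the benchmark's actual values are in the report.
meas = {"U-235": 1.000, "Pu-239": 1.000}
calc = {"U-235": 0.991, "Pu-239": 1.052}
for iso in meas:
    print(iso, f"{percent_diff(calc[iso], meas[iso]):+.1f}%")
```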

  6. Mitogenomic analyses of eutherian relationships.

    Science.gov (United States)

    Arnason, U; Janke, A

    2002-01-01

    Reasonably correct phylogenies are fundamental to the testing of evolutionary hypotheses. Here, we present phylogenetic findings based on analyses of 67 complete mammalian mitochondrial (mt) genomes. The analyses, irrespective of whether they were performed at the amino acid (aa) level or on nucleotides (nt) of first and second codon positions, placed Erinaceomorpha (hedgehogs and their kin) as the sister group of remaining eutherians. Thus, the analyses separated Erinaceomorpha from other traditional lipotyphlans (e.g., tenrecs, moles, and shrews), making traditional Lipotyphla polyphyletic. Both the aa and nt data sets identified the two order-rich eutherian clades, the Cetferungulata (comprising Pholidota, Carnivora, Perissodactyla, Artiodactyla, and Cetacea) and the African clade (Tenrecomorpha, Macroscelidea, Tubulidentata, Hyracoidea, Proboscidea, and Sirenia). The study corroborated recent findings that have identified a sister-group relationship between Anthropoidea and Dermoptera (flying lemurs), thereby making our own order, Primates, a paraphyletic assembly. Molecular estimates using paleontologically well-established calibration points, placed the origin of most eutherian orders in Cretaceous times, 70-100 million years before present (MYBP). The same estimates place all primate divergences much earlier than traditionally believed. For example, the divergence between Homo and Pan is estimated to have taken place approximately 10 MYBP, a dating consistent with recent findings in primate paleontology.

  7. Analyses of potential predictive markers and survival data for a response to sunitinib in patients with metastatic renal cell carcinoma.

    Directory of Open Access Journals (Sweden)

    Juana Dornbusch

    Full Text Available BACKGROUND: Patients with metastatic clear cell renal cell carcinoma (ccRCC) are frequently treated with tyrosine kinase inhibitors (TKI) such as sunitinib. It inhibits angiogenic pathways by mainly targeting the receptors of VEGF and PDGF. In ccRCC, angiogenesis is characterized by the inactivation of the von Hippel-Lindau gene (VHL), which in turn leads to the induction of HIF-1α target genes such as CA9 and VEGF. Furthermore, the angiogenic phenotype of ccRCC is also reflected by endothelial markers (CD31, CD34) or other tumor-promoting factors like Ki67 or survivin. METHODS: Tissue microarrays from primary tumor specimens of 42 patients with metastatic ccRCC under sunitinib therapy were immunohistochemically stained for selected markers related to angiogenesis. The prognostic and predictive potential of these markers was assessed on the basis of the objective response rate, which was evaluated according to the RECIST criteria after 3, 6, and 9 months and after last report (12-54 months) of sunitinib treatment. Additionally, VHL copy number and mutation analyses were performed on DNA from cryo-preserved tumor tissues of 20 ccRCC patients. RESULTS: Immunostaining of HIF-1α, CA9, Ki67, CD31, pVEGFR1, VEGFR1 and -2, pPDGFRα and -β was significantly associated with the sunitinib response after 6 and 9 months as well as at last report under therapy. Furthermore, HIF-1α, CA9, CD34, VEGFR1 and -3 and PDGFRα showed significant associations with progression-free survival (PFS) and overall survival (OS). In multivariate Cox proportional hazards regression analyses, high CA9 membrane staining and a response after 9 months were independent prognostic factors for longer OS. Frequently observed copy number loss and mutation of the VHL gene lead to altered expression of VHL, HIF-1α, CA9, and VEGF. CONCLUSIONS: Immunoexpression of HIF-1α, CA9, Ki67, CD31, pVEGFR1, VEGFR1 and -2, pPDGFRα and -β in the primary tumors of metastatic ccRCC patients might support the
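    Objective response in this record is evaluated by the RECIST criteria. A simplified sketch of RECIST-style classification from the percentage change in the sum of target-lesion diameters (thresholds only: partial response at −30%, progression at +20%; the criteria's absolute-size and nodal rules are omitted, and the function name is ours):

```python
def recist_response(change_pct, new_lesions=False):
    """Classify objective response from the % change in the sum of
    target-lesion diameters (simplified RECIST-style thresholds)."""
    if new_lesions or change_pct >= 20.0:
        return "PD"   # progressive disease
    if change_pct <= -100.0:
        return "CR"   # complete response (all target lesions gone)
    if change_pct <= -30.0:
        return "PR"   # partial response
    return "SD"       # stable disease
```

    A responder in the study's sense (objective response) would be a patient classified CR or PR at the evaluation time point.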

  8. Gender-enriched transcripts in Haemonchus contortus--predicted functions and genetic interactions based on comparative analyses with Caenorhabditis elegans.

    Science.gov (United States)

    Campbell, Bronwyn E; Nagaraj, Shivashankar H; Hu, Min; Zhong, Weiwei; Sternberg, Paul W; Ong, Eng K; Loukas, Alex; Ranganathan, Shoba; Beveridge, Ian; McInnes, Russell L; Hutchinson, Gareth W; Gasser, Robin B

    2008-01-01

    In the present study, a bioinformatic-microarray approach was employed for the analysis of selected expressed sequence tags (ESTs) from Haemonchus contortus, a key parasitic nematode of small ruminants. Following a bioinformatic analysis of EST data using a semiautomated pipeline, 1885 representative ESTs (rESTs) were selected, to which oligonucleotides (three per EST) were designed and spotted on to a microarray. This microarray was hybridized with cyanine-dye labelled cRNA probes synthesized from RNA from female or male adults of H. contortus. Differential hybridisation was displayed for 301 of the 1885 rESTs (approximately 16%). Of these, 165 (55%) had significantly greater signal intensities for female cRNA and 136 (45%) for male cRNA. Of these, 113 with increased signals in female or male H. contortus had homologues in Caenorhabditis elegans, predicted to function in metabolism, information storage and processing, cellular processes and signalling, and embryonic and/or larval development. Of the rESTs with no known homologues in C. elegans, 24 (approximately 40%) had homologues in other nematodes, four had homologues in various other organisms and 30 (52%) had no homology to any sequence in current gene databases. A genetic interaction network was predicted for the C. elegans orthologues of the gender-enriched H. contortus genes, and a focused analysis of a subset revealed a tight network of molecules involved in amino acid, carbohydrate or lipid transport and metabolism, energy production and conversion, translation, ribosomal structure and biogenesis and, importantly, those associated with meiosis and/or mitosis in the germline during oogenesis or spermatogenesis. This study provides a foundation for the molecular, biochemical and functional exploration of selected molecules with differential transcription profiles in H. contortus, for further microarray analyses of transcription in different developmental stages of H. contortus, and for an extended

  9. ATOP - The Advanced Taiwan Ocean Prediction System Based on the mpiPOM. Part 1: Model Descriptions, Analyses and Results

    Directory of Open Access Journals (Sweden)

    Leo Oey

    2013-01-01

    Full Text Available A data-assimilated Taiwan Ocean Prediction (ATOP) system is being developed at the National Central University, Taiwan. The model simulates sea-surface height, three-dimensional currents, temperature and salinity, and turbulent mixing. The model has options for tracer and particle-tracking algorithms, as well as for wave-induced Stokes drift and wave-enhanced mixing and bottom drag. Two different forecast domains have been tested: a large-grid domain that encompasses the entire North Pacific Ocean at 0.1° × 0.1° horizontal resolution and 41 vertical sigma levels, and a smaller western North Pacific domain which at present also has the same horizontal resolution. In both domains, 25-year spin-up runs from 1988 - 2011 were first conducted, forced by six-hourly Cross-Calibrated Multi-Platform (CCMP) and NCEP reanalysis Global Forecast System (GFS) winds. The results are then used as initial conditions to conduct ocean analyses from January 2012 through February 2012, when updated hindcasts and real-time forecasts begin using the GFS winds. This paper describes the ATOP system and compares the forecast results against satellite altimetry data for assessing model skills. The model results are also shown to compare well with observations of (i) the Kuroshio intrusion in the northern South China Sea, and (ii) the subtropical counter current. A review of, and comparison with, other models in the literature on (i) are also given.

  10. Analyses of Potential Predictive Markers and Response to Targeted Therapy in Patients with Advanced Clear-cell Renal Cell Carcinoma

    Institute of Scientific and Technical Information of China (English)

    Yan Song; Jing Huang; Ling Shan; Hong-Tu Zhang

    2015-01-01

    Background: Vascular endothelial growth factor-targeted agents are standard treatments in advanced clear-cell renal cell carcinoma (ccRCC), but biomarkers of activity are lacking. The aim of this study was to investigate the association of Von Hippel-Lindau (VHL) gene status, vascular endothelial growth factor receptor (VEGFR) or stem cell factor receptor (KIT) expression, and their relationships with characteristics and clinical outcome of advanced ccRCC. Methods: A total of 59 patients who received targeted treatment with sunitinib or pazopanib were evaluated at the Cancer Hospital and Institute, Chinese Academy of Medical Sciences between January 2010 and November 2012. Paraffin-embedded tumor samples were collected, and the status of the VHL gene and the expression of VEGFR and KIT were determined by VHL sequence analysis and immunohistochemistry. Clinical-pathological features were collected, and efficacy measures such as response rate, median progression-free survival (PFS), and overall survival (OS) were calculated and then compared based on expression status. The Chi-square test, the Kaplan-Meier method, and the log-rank test were used for statistical analyses. Results: Of 59 patients, objective responses were observed in 28 patients (47.5%). The median PFS was 13.8 months and median OS was 39.9 months. There was an improved PFS in patients with the following clinical features: male gender, number of metastatic sites 2 or less, VEGFR-2 positive or KIT positive. Eleven patients (18.6%) had evidence of VHL mutation, with an objective response rate of 45.5%, which showed no difference from patients with no VHL mutation (47.9%). VHL mutation status did not correlate with either overall response rate (P = 0.938) or PFS (P = 0.277). The PFS was 17.6 months and 22.2 months in VEGFR-2 positive patients and KIT positive patients, respectively, which was significantly longer than that of VEGFR-2 or KIT negative patients (P = 0.026 and P = 0.043). Conclusion: VHL mutation status could not predict

  11. Analyses of Potential Predictive Markers and Response to Targeted Therapy in Patients with Advanced Clear-cell Renal Cell Carcinoma

    Directory of Open Access Journals (Sweden)

    Yan Song

    2015-01-01

    Full Text Available Background: Vascular endothelial growth factor-targeted agents are standard treatments in advanced clear-cell renal cell carcinoma (ccRCC), but biomarkers of activity are lacking. The aim of this study was to investigate the association of Von Hippel-Lindau (VHL) gene status, vascular endothelial growth factor receptor (VEGFR) or stem cell factor receptor (KIT) expression, and their relationships with characteristics and clinical outcome of advanced ccRCC. Methods: A total of 59 patients who received targeted treatment with sunitinib or pazopanib were evaluated at the Cancer Hospital and Institute, Chinese Academy of Medical Sciences between January 2010 and November 2012. Paraffin-embedded tumor samples were collected and the status of the VHL gene and expression of VEGFR and KIT were determined by VHL sequence analysis and immunohistochemistry. Clinical-pathological features were collected and efficacy measures such as response rate, median progression-free survival (PFS), and overall survival (OS) were calculated and then compared based on expression status. The Chi-square test, the Kaplan-Meier method, and the log-rank test were used for statistical analyses. Results: Of 59 patients, objective responses were observed in 28 patients (47.5%). The median PFS was 13.8 months and median OS was 39.9 months. There was an improved PFS in patients with the following clinical features: male gender, number of metastatic sites 2 or less, VEGFR-2 positive or KIT positive. Eleven patients (18.6%) had evidence of VHL mutation, with an objective response rate of 45.5%, which showed no difference from patients with no VHL mutation (47.9%). VHL mutation status did not correlate with either overall response rate (P = 0.938) or PFS (P = 0.277). The PFS was 17.6 months and 22.2 months in VEGFR-2 positive patients and KIT positive patients, respectively, which was significantly longer than that of VEGFR-2 or KIT negative patients (P = 0.026 and P = 0.043). Conclusion
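    The Kaplan-Meier method used in this record estimates survival as a running product over event times, S(t) = Π(1 − dᵢ/nᵢ). A minimal pure-Python sketch with toy PFS data (not the study's data):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate. `times` are follow-up times
    (months); `events` flags whether progression/death occurred (True)
    or the observation was censored (False)."""
    at_risk = len(times)
    surv, curve = 1.0, []
    for t, observed in sorted(zip(times, events)):
        if observed:
            surv *= 1.0 - 1.0 / at_risk   # step down only at event times
            curve.append((t, surv))
        at_risk -= 1                      # censored cases leave the risk set
    return curve

# Toy PFS data (months); True = progression observed, False = censored.
pfs   = [3, 6, 9, 13.8, 17, 22, 30]
event = [True, True, False, True, True, False, True]
for t, s in kaplan_meier(pfs, event):
    print(f"t={t:>5} months  S(t)={s:.2f}")
```

    The log-rank test mentioned alongside it then compares two such curves (e.g. VEGFR-2 positive vs. negative) for a significant difference.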

  12. Teachers' Analyses of Classroom Video Predict Student Learning of Mathematics: Further Explorations of a Novel Measure of Teacher Knowledge

    Science.gov (United States)

    Kersting, Nicole B.; Givvin, Karen B.; Sotelo, Francisco L.; Stigler, James W.

    2010-01-01

    This study explores the relationship between teacher knowledge and student learning in the area of mathematics by developing and evaluating an innovative approach to assessing teacher knowledge. This approach is based on teachers' analyses of classroom video clips. Teachers watched 13 video clips of classroom instruction and then provided written…

  13. Comparing direct image and wavelet transform-based approaches to analysing remote sensing imagery for predicting wildlife distribution

    NARCIS (Netherlands)

    Murwira, A.; Skidmore, A.K.

    2010-01-01

    In this study we tested the ability to predict the probability of elephant (Loxodonta africana) presence in an agricultural landscape of Zimbabwe based on three methods of measuring the spatial heterogeneity in vegetation cover, where vegetation cover was measured using the Landsat Thematic Mapper (

  14. Basic Diagnosis and Prediction of Persistent Contrail Occurrence using High-resolution Numerical Weather Analyses/Forecasts and Logistic Regression. Part II: Evaluation of Sample Models

    Science.gov (United States)

    Duda, David P.; Minnis, Patrick

    2009-01-01

    Previous studies have shown that probabilistic forecasting may be a useful method for predicting persistent contrail formation. A probabilistic forecast to accurately predict contrail formation over the contiguous United States (CONUS) is created by using meteorological data based on hourly meteorological analyses from the Advanced Regional Prediction System (ARPS) and from the Rapid Update Cycle (RUC) as well as GOES water vapor channel measurements, combined with surface and satellite observations of contrails. Two groups of logistic models were created. The first group of models (SURFACE models) is based on surface-based contrail observations supplemented with satellite observations of contrail occurrence. The second group of models (OUTBREAK models) is derived from a selected subgroup of satellite-based observations of widespread persistent contrails. The mean accuracies for both the SURFACE and OUTBREAK models typically exceeded 75 percent when based on the RUC or ARPS analysis data, but decreased when the logistic models were derived from ARPS forecast data.
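    A logistic contrail model of the kind described turns meteorological predictors into an occurrence probability, which is then thresholded into a yes/no forecast and scored. A sketch with made-up coefficients (not the fitted SURFACE/OUTBREAK models):

```python
import math

def logistic_prob(coeffs, intercept, x):
    """P(contrail) from a logistic model: sigmoid(b0 + b·x)."""
    z = intercept + sum(b * xi for b, xi in zip(coeffs, x))
    return 1.0 / (1.0 + math.exp(-z))

def percent_correct(probs, observed, threshold=0.5):
    """Percent correct after converting probabilities to yes/no
    at a critical probability threshold."""
    hits = sum((p >= threshold) == o for p, o in zip(probs, observed))
    return 100.0 * hits / len(probs)

# Illustrative (not fitted) coefficients on temperature (deg C),
# relative humidity with respect to ice (%), and vertical velocity.
b0, b = -10.0, (-0.10, 0.05, 1.5)
cases = [(-55.0, 110.0, 0.1), (-40.0, 60.0, -0.2)]
probs = [logistic_prob(b, b0, x) for x in cases]
obs   = [True, False]
print(f"PC = {percent_correct(probs, obs):.0f}%")
```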

  15. Can the lifetime of the superheater tubes be predicted according to the fuel analyses? Assessment from field and laboratory data

    Energy Technology Data Exchange (ETDEWEB)

    Salmenoja, K. [Kvaerner Pulping Oy, Tampere (Finland)

    1998-12-31

    The lifetime of superheaters in different power boilers is still more or less a mystery. This is especially true when firing biomass-based fuels (biofuels), such as bark, forest residues, and straw. Due to the inhomogeneous nature of biofuels, the lifetime of the superheaters may vary from case to case. Sometimes the lifetime is significantly shorter than originally expected; sometimes no corrosion is observed even in the hottest tubes. This is one of the main reasons why boiler operators often demand better predictability of the corrosion resistance of the materials, to avoid unscheduled shutdowns. (orig.) 9 refs.

  16. Evaluation of multivariate statistical analyses for monitoring and prediction of processes in a seawater reverse osmosis desalination plant

    Energy Technology Data Exchange (ETDEWEB)

    Kolluri, Srinivas Sahan; Esfahani, Iman Janghorban; Garikiparthy, Prithvi Sai Nadh; Yoo, Chang Kyoo [Kyung Hee University, Yongin (Korea, Republic of)

    2015-08-15

    Our aim was to analyze, monitor, and predict the outcomes of processes in a full-scale seawater reverse osmosis (SWRO) desalination plant using multivariate statistical techniques. Multivariate analysis of variance (MANOVA) was used to investigate the performance and efficiencies of two SWRO processes, namely, pore controllable fiber filter-reverse osmosis (PCF-SWRO) and sand filtration-ultra filtration-reverse osmosis (SF-UF-SWRO). Principal component analysis (PCA) was applied to monitor the two SWRO processes. PCA monitoring revealed that the SF-UF-SWRO process could be analyzed reliably with a low number of outliers and disturbances. Partial least squares (PLS) analysis was then conducted to predict which of the seven input parameters of feed flow rate, PCF/SF-UF filtrate flow rate, temperature of feed water, feed turbidity, pH, reverse osmosis (RO) flow rate, and pressure had a significant effect on the outcome variables of permeate flow rate and concentration. Root mean squared errors (RMSEs) of the PLS models for permeate flow rates were 31.5 and 28.6 for the PCF-SWRO process and SF-UF-SWRO process, respectively, while RMSEs of permeate concentrations were 350.44 and 289.4, respectively. These results indicate that the SF-UF-SWRO process can be modeled more accurately than the PCF-SWRO process, because the RMSE values of permeate flow rate and concentration obtained using a PLS regression model of the SF-UF-SWRO process were lower than those obtained for the PCF-SWRO process.
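    The RMSE used above to compare the PLS models is simply the root of the mean squared prediction error. A minimal sketch with hypothetical permeate flow rates (not the plant's data):

```python
import math

def rmse(predicted, observed):
    """Root mean squared error between model predictions and plant data."""
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(predicted, observed))
                     / len(observed))

# Hypothetical permeate flow rates (m3/h): PLS prediction vs. observation.
obs  = [100.0, 110.0, 95.0, 105.0]
pred = [102.0, 107.0, 97.0, 104.0]
print(f"RMSE = {rmse(pred, obs):.2f}")
```

    A lower RMSE on held-out data is what justifies the conclusion that the SF-UF-SWRO process is modeled more accurately.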

  17. Receptor site topographies for phencyclidine-like and sigma drugs: predictions from quantitative conformational, electrostatic potential, and radioreceptor analyses.

    Science.gov (United States)

    Manallack, D T; Wong, M G; Costa, M; Andrews, P R; Beart, P M

    1988-12-01

    Computer-assisted molecular modelling techniques and electrostatic analyses of a wide range of phencyclidine (PCP) and sigma ligands, in conjunction with radioreceptor studies, were used to determine the topographies of the PCP and sigma receptors. The PCP receptor model was defined using key molecules from the arylcyclohexylamine, benzomorphan, bridged benz[f]isoquinoline, and dibenzocycloalkenimine drug classes. Hypothetical receptor points (R1, R2) were constructed onto the aromatic ring of each compound to represent hydrophobic interactions with the receptor, along with an additional receptor point (R3) representing a hydrogen bond between the nitrogen atom and the receptor. The superimposition of these key molecules gave the coordinates of the receptor points and nitrogen defining the primary PCP pharmacophore as follows: R1 (0.00, 3.50, 0.00), R2 (0.00, -3.50, 0.00), R3 (6.66, -1.13, 0.00), and N (3.90, -1.46, -0.32). Additional analyses were used to describe secondary binding sites for an additional hydrogen bonding site and two lipophilic clefts. Similarly, the sigma receptor model was constructed from ligands of the benzomorphan, octahydrobenzo[f]quinoline, phenylpiperidine, and diphenylguanidine drug classes. Coordinates for the primary sigma pharmacophore are as follows: R1 (0.00, 3.50, 0.00), R2 (0.00, -3.50, 0.00), R3 (6.09, 2.09, 0.00), and N (4.9, -0.12, -1.25). Secondary binding sites for sigma ligands were proposed for the interaction of aromatic ring substituents and large N-substituted lipophilic groups with the receptor. The sigma receptor model differs from the PCP model in the position of the nitrogen atom, the direction of the nitrogen lone pair vector, and the secondary sigma binding sites. This study has thus demonstrated that the differing quantitative structure-activity relationships of PCP and sigma ligands allow the definition of discrete receptors. These models may be used in conjunction with rational drug design techniques to design novel PCP
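    The quoted pharmacophore coordinates allow simple geometric checks, e.g. the separation of the hydrophobic receptor points R1 and R2 or the nitrogen-to-hydrogen-bond-point distance. A sketch using the PCP coordinates given above (units assumed to be angstroms):

```python
import math

# Primary PCP pharmacophore coordinates quoted in the abstract.
pcp = {
    "R1": (0.00, 3.50, 0.00),
    "R2": (0.00, -3.50, 0.00),
    "R3": (6.66, -1.13, 0.00),
    "N":  (3.90, -1.46, -0.32),
}

def dist(a, b):
    """Euclidean distance between two named pharmacophore points."""
    return math.dist(pcp[a], pcp[b])

print(f"R1-R2: {dist('R1', 'R2'):.2f} Å")  # aromatic-ring anchor separation
print(f"N-R3:  {dist('N', 'R3'):.2f} Å")   # N to its hydrogen-bond point
```

    The same computation on the sigma coordinates makes the difference in nitrogen position between the two models explicit.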

  18. Landscaping analyses of the ROC predictions of discrete-slots and signal-detection models of visual working memory.

    Science.gov (United States)

    Donkin, Chris; Tran, Sophia Chi; Nosofsky, Robert

    2014-10-01

    A fundamental issue concerning visual working memory is whether its capacity limits are better characterized in terms of a limited number of discrete slots (DSs) or a limited amount of a shared continuous resource. Rouder et al. (2008) found that a mixed-attention, fixed-capacity, DS model provided the best explanation of behavior in a change detection task, outperforming alternative continuous signal detection theory (SDT) models. Here, we extend their analysis in two ways: first, with experiments aimed at better distinguishing between the predictions of the DS and SDT models, and second, using a model-based analysis technique called landscaping, in which the functional-form complexity of the models is taken into account. We find that the balance of evidence supports a DS account of behavior in change detection tasks but that the SDT model is best when the visual displays always consist of the same number of items. In our General Discussion section, we outline, but ultimately reject, a number of potential explanations for the observed pattern of results. We finish by describing future research that is needed to pinpoint the basis for this observed pattern of results.
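    Signal detection theory models of change detection characterize sensitivity by d′, the difference between the z-transformed hit rate and false-alarm rate. A minimal sketch (the rates below are illustrative, not the paper's data):

```python
from statistics import NormalDist

def d_prime(hit_rate, fa_rate):
    """Signal-detection sensitivity: z(hit rate) - z(false-alarm rate)."""
    z = NormalDist().inv_cdf   # inverse standard-normal CDF
    return z(hit_rate) - z(fa_rate)

print(f"d' = {d_prime(0.85, 0.20):.2f}")
```

    Discrete-slot models instead predict that performance is capped by the number of slots relative to set size, which is what the landscaping analysis pits against this continuous account.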

  19. The GENOTEND chip: a new tool to analyse gene expression in muscles of beef cattle for beef quality prediction

    Directory of Open Access Journals (Sweden)

    Hocquette Jean-Francois

    2012-08-01

    validated in the groups of 30 Charolais young bulls slaughtered in year 2, and in the 21 Charolais steers slaughtered in year 1, but not in the group of 19 steers slaughtered in year 2 which differ from the reference group by two factors (gender and year. When the first three groups of animals were analysed together, this subset of genes explained a 4-fold higher proportion of the variability in tenderness than muscle biochemical traits. Conclusion This study underlined the relevance of the GENOTEND chip to identify markers of beef quality, mainly by confirming previous results and by detecting other genes of the heat shock family as potential markers of beef quality. However, it was not always possible to extrapolate the relevance of these markers to all animal groups which differ by several factors (such as gender or environmental conditions of production from the initial population of reference in which these markers were identified.

  20. Basic Diagnosis and Prediction of Persistent Contrail Occurrence using High-resolution Numerical Weather Analyses/Forecasts and Logistic Regression. Part I: Effects of Random Error

    Science.gov (United States)

    Duda, David P.; Minnis, Patrick

    2009-01-01

    Straightforward application of the Schmidt-Appleman contrail formation criteria to diagnose persistent contrail occurrence from numerical weather prediction data is hindered by significant bias errors in the upper tropospheric humidity. Logistic models of contrail occurrence have been proposed to overcome this problem, but basic questions remain about how random measurement error may affect their accuracy. A set of 5000 synthetic contrail observations is created to study the effects of random error in these probabilistic models. The simulated observations are based on distributions of temperature, humidity, and vertical velocity derived from Advanced Regional Prediction System (ARPS) weather analyses. The logistic models created from the simulated observations were evaluated using two common statistical measures of model accuracy, the percent correct (PC) and the Hanssen-Kuipers discriminant (HKD). To convert the probabilistic results of the logistic models into a dichotomous yes/no choice suitable for the statistical measures, two critical probability thresholds are considered. The HKD scores are higher when the climatological frequency of contrail occurrence is used as the critical threshold, while the PC scores are higher when the critical probability threshold is 0.5. For both thresholds, typical random errors in temperature, relative humidity, and vertical velocity are found to be small enough to allow for accurate logistic models of contrail occurrence. The accuracy of the models developed from synthetic data is over 85 percent for both the prediction of contrail occurrence and non-occurrence, although in practice, larger errors would be anticipated.
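    The two accuracy measures named above, percent correct (PC) and the Hanssen-Kuipers discriminant (HKD), follow directly from a 2×2 contingency table of forecast vs. observed contrail occurrence. A sketch with made-up counts:

```python
def pc_and_hkd(a, b, c, d):
    """Percent correct and Hanssen-Kuipers discriminant from a 2x2
    contingency table: a hits, b false alarms, c misses, d correct
    negatives. HKD = hit rate - false-alarm rate."""
    pc  = 100.0 * (a + d) / (a + b + c + d)
    hkd = a / (a + c) - b / (b + d)
    return pc, hkd

# Made-up forecast verification counts, for illustration only.
pc, hkd = pc_and_hkd(a=40, b=10, c=5, d=45)
print(f"PC = {pc:.0f}%, HKD = {hkd:.2f}")
```

    Because HKD rewards hit rate relative to the false-alarm rate, a low climatological-frequency threshold can raise HKD while lowering PC, which is the trade-off the paper reports.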

  1. Evidence that changes in social cognitions predict changes in self-reported driver behavior: Causal analyses of two-wave panel data.

    Science.gov (United States)

    Elliott, Mark A; Thomson, James A; Robertson, Kirsty; Stephenson, Carry; Wicks, John

    2013-01-01

    Previous research on the theory of planned behavior (TPB) is characterized by cross-sectional tests of the model's proposed causal relationships. In the absence of effective experimental techniques for changing the TPB's cognitive antecedents, the present research aimed to provide a stronger non-experimental test of the model, using causal analyses of two-wave panel data. Two studies of driver behavior were conducted in which naturally occurring within-participant changes in TPB constructs were measured over time, and used to predict corresponding within-participant changes in both intentions and behavior. A two-wave panel design was used in both studies. Study 1 had a one-month gap between baseline and follow-up. At both waves, a convenience sample comprising predominantly university students (N=135) completed questionnaire measures of all TPB cognitions and behavior (compliance with speed limits in urban areas). Cross-lagged multiple regressions and bootstrapping procedures for testing multiple mediators supported all of the relationships proposed by the TPB. These findings were extended in study 2 using a large, non-student sample of speed limit offenders (N=1149), a six-month gap between baseline and follow-up, and a larger number of cognitive antecedents. Participants completed postal questionnaires at both waves to measure all cognitions proposed by the two-component TPB, along with moral norm, anticipated regret, self-identity and speeding on urban roads, country roads, and fast dual carriageways or motorways. Changes in instrumental and affective attitude, descriptive norm, self-efficacy, moral norm, anticipated regret and self-identity predicted changes in intention to speed. Changes in intention and self-efficacy predicted behavior-change. Injunctive norm and perceived controllability did not predict intention or behavior-change. Additionally, direct (unhypothesized) relationships with behavior were found for affective attitude, descriptive norm and

  2. Computational fluid dynamics analyses of lateral heat conduction, coolant azimuthal mixing and heat transfer predictions in a BR2 fuel assembly geometry.

    Energy Technology Data Exchange (ETDEWEB)

    Tzanos, C. P.; Dionne, B. (Nuclear Engineering Division)

    2011-05-23

    To support the analyses related to the conversion of the BR2 core from highly-enriched (HEU) to low-enriched (LEU) fuel, the thermal-hydraulics codes PLTEMP and RELAP-3D are used to evaluate the safety margins during steady-state operation (PLTEMP), as well as after a loss-of-flow, loss-of-pressure, or loss-of-coolant event (RELAP). In the 1-D PLTEMP and RELAP simulations, conduction in the azimuthal and axial directions is not accounted for. The very good thermal conductivity of the cladding and the fuel meat, together with significant temperature gradients in the lateral (axial and azimuthal) directions, could lead to a heat flux distribution that is significantly different from the power distribution. To evaluate the significance of the lateral heat conduction, 3-D computational fluid dynamics (CFD) simulations were performed using the CFD code STAR-CD. Safety margin calculations are typically performed for a hot stripe, i.e., an azimuthal region of the fuel plates/coolant channel containing the power peak. In a RELAP model, for example, a channel between two plates could be divided into a number of RELAP channels (stripes) in the azimuthal direction. In a PLTEMP model, the effect of azimuthal power peaking could be taken into account by using engineering factors. However, if the thermal mixing in the azimuthal direction of a coolant channel is significant, a striping approach could be overly conservative by not taking this mixing into account. STAR-CD simulations were also performed to study the thermal mixing in the coolant. Section II of this document presents the results of the analyses of the lateral heat conduction and azimuthal thermal mixing in a coolant channel. Finally, PLTEMP and RELAP simulations rely on the use of correlations to determine heat transfer coefficients. Previous analyses showed that the Dittus-Boelter correlation gives significantly more conservative (lower) predictions than the correlations of Sieder-Tate and Petukhov. STAR-CD 3-D

  3. Genetically Predicted Body Mass Index and Breast Cancer Risk: Mendelian Randomization Analyses of Data from 145,000 Women of European Descent

    Science.gov (United States)

    Guo, Yan; Warren Andersen, Shaneda; Shu, Xiao-Ou; Michailidou, Kyriaki; Bolla, Manjeet K.; Wang, Qin; Garcia-Closas, Montserrat; Milne, Roger L.; Schmidt, Marjanka K.; Chang-Claude, Jenny; Dunning, Allison; Bojesen, Stig E.; Ahsan, Habibul; Aittomäki, Kristiina; Andrulis, Irene L.; Anton-Culver, Hoda; Beckmann, Matthias W.; Beeghly-Fadiel, Alicia; Benitez, Javier; Bogdanova, Natalia V.; Bonanni, Bernardo; Børresen-Dale, Anne-Lise; Brand, Judith; Brauch, Hiltrud; Brenner, Hermann; Brüning, Thomas; Burwinkel, Barbara; Casey, Graham; Chenevix-Trench, Georgia; Couch, Fergus J.; Cross, Simon S.; Czene, Kamila; Dörk, Thilo; Dumont, Martine; Fasching, Peter A.; Figueroa, Jonine; Flesch-Janys, Dieter; Fletcher, Olivia; Flyger, Henrik; Fostira, Florentia; Gammon, Marilie; Giles, Graham G.; Guénel, Pascal; Haiman, Christopher A.; Hamann, Ute; Hooning, Maartje J.; Hopper, John L.; Jakubowska, Anna; Jasmine, Farzana; Jenkins, Mark; John, Esther M.; Johnson, Nichola; Jones, Michael E.; Kabisch, Maria; Knight, Julia A.; Koppert, Linetta B.; Kosma, Veli-Matti; Kristensen, Vessela; Le Marchand, Loic; Lee, Eunjung; Li, Jingmei; Lindblom, Annika; Lubinski, Jan; Malone, Kathi E.; Mannermaa, Arto; Margolin, Sara; McLean, Catriona; Meindl, Alfons; Neuhausen, Susan L.; Nevanlinna, Heli; Neven, Patrick; Olson, Janet E.; Perez, Jose I. A.; Perkins, Barbara; Phillips, Kelly-Anne; Pylkäs, Katri; Rudolph, Anja; Santella, Regina; Sawyer, Elinor J.; Schmutzler, Rita K.; Seynaeve, Caroline; Shah, Mitul; Shrubsole, Martha J.; Southey, Melissa C.; Swerdlow, Anthony J.; Toland, Amanda E.; Tomlinson, Ian; Torres, Diana; Truong, Thérèse; Ursin, Giske; Van Der Luijt, Rob B.; Verhoef, Senno; Whittemore, Alice S.; Winqvist, Robert; Zhao, Hui; Zhao, Shilin; Hall, Per; Simard, Jacques; Kraft, Peter; Hunter, David; Easton, Douglas F.; Zheng, Wei

    2016-01-01

    Background Observational epidemiological studies have shown that high body mass index (BMI) is associated with a reduced risk of breast cancer in premenopausal women but an increased risk in postmenopausal women. It is unclear whether this association is mediated through shared genetic or environmental factors. Methods We applied Mendelian randomization to evaluate the association between BMI and risk of breast cancer occurrence using data from two large breast cancer consortia. We created a weighted BMI genetic score comprising 84 BMI-associated genetic variants to predict BMI. We evaluated genetically predicted BMI in association with breast cancer risk using individual-level data from the Breast Cancer Association Consortium (BCAC) (cases = 46,325, controls = 42,482). We further evaluated the association between genetically predicted BMI and breast cancer risk using summary statistics from 16,003 cases and 41,335 controls from the Discovery, Biology, and Risk of Inherited Variants in Breast Cancer (DRIVE) Project. Because most studies measured BMI after cancer diagnosis, we could not conduct a parallel analysis to adequately evaluate the association of measured BMI with breast cancer risk prospectively. Results In the BCAC data, genetically predicted BMI was found to be inversely associated with breast cancer risk (odds ratio [OR] = 0.65 per 5 kg/m² increase, 95% confidence interval [CI]: 0.56–0.75, p = 3.32 × 10⁻¹⁰). The associations were similar for both premenopausal (OR = 0.44, 95% CI: 0.31–0.62, p = 9.91 × 10⁻⁸) and postmenopausal breast cancer (OR = 0.57, 95% CI: 0.46–0.71, p = 1.88 × 10⁻⁸). This association was replicated in the data from the DRIVE consortium (OR = 0.72, 95% CI: 0.60–0.84, p = 1.64 × 10⁻⁷). Single marker analyses identified 17 of the 84 BMI-associated single nucleotide polymorphisms (SNPs) in association with breast cancer risk at p
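
The weighted genetic score used here is, in essence, a dosage-weighted sum over the 84 variants. A minimal sketch of the scoring step (the toy dosages and per-allele weights are invented, not the study's actual SNP set or effect sizes):

```python
import numpy as np

# Rows: individuals; columns: BMI-increasing allele counts (0, 1 or 2) per SNP.
dosages = np.array([[0, 1, 2],
                    [2, 2, 0],
                    [1, 0, 1]])

# Per-allele effect sizes on BMI (kg/m^2), e.g. from an external GWAS.
weights = np.array([0.1, 0.2, 0.3])

# Weighted BMI genetic score: sum of effect-size-weighted allele counts.
scores = dosages @ weights
print(scores)  # [0.8 0.6 0.4]
```

In the Mendelian randomization step it is this genetically predicted BMI, rather than measured BMI, that is related to case-control status, which is why BMI measured after diagnosis does not distort the estimate.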

  4. Phylogenetic and genomewide analyses suggest a functional relationship between kayak, the Drosophila fos homolog, and fig, a predicted protein phosphatase 2c nested within a kayak intron.

    Science.gov (United States)

    Hudson, Stephanie G; Garrett, Matthew J; Carlson, Joseph W; Micklem, Gos; Celniker, Susan E; Goldstein, Elliott S; Newfeld, Stuart J

    2007-11-01

    A gene located within the intron of a larger gene is an uncommon arrangement in any species. Few of these nested gene arrangements have been explored from an evolutionary perspective. Here we report a phylogenetic analysis of kayak (kay) and fos intron gene (fig), a divergently transcribed gene located in a kay intron, utilizing 12 Drosophila species. The evolutionary relationship between these genes is of interest because kay is the homolog of the proto-oncogene c-fos whose function is modulated by serine/threonine phosphorylation and fig is a predicted PP2C phosphatase specific for serine/threonine residues. We found that, despite an extraordinary level of diversification in the intron-exon structure of kay (11 inversions and six independent exon losses), the nested arrangement of kay and fig is conserved in all species. A genomewide analysis of protein-coding nested gene pairs revealed that approximately 20% of nested pairs in D. melanogaster are also nested in D. pseudoobscura and D. virilis. A phylogenetic examination of fig revealed that there are three subfamilies of PP2C phosphatases in all 12 species of Drosophila. Overall, our phylogenetic and genomewide analyses suggest that the nested arrangement of kay and fig may be due to a functional relationship between them.

  5. Prediction

    CERN Document Server

    Sornette, Didier

    2010-01-01

    This chapter first presents a rather personal view of some different aspects of predictability, going in crescendo from simple linear systems to high-dimensional nonlinear systems with stochastic forcing, which exhibit emergent properties such as phase transitions and regime shifts. Then, a detailed correspondence between the phenomenology of earthquakes, financial crashes and epileptic seizures is offered. The presented statistical evidence provides the substance of a general phase diagram for understanding the many facets of the spatio-temporal organization of these systems. A key insight is to organize the evidence and mechanisms in terms of two summarizing measures: (i) amplitude of disorder or heterogeneity in the system and (ii) level of coupling or interaction strength among the system's components. On the basis of the recently identified remarkable correspondence between earthquakes and seizures, we present detailed information on a class of stochastic point processes that has been found to be particu...

  6. Predicting origins of passerines migrating through Canadian migration monitoring stations using stable-hydrogen isotope analyses of feathers: a new tool for bird conservation

    Directory of Open Access Journals (Sweden)

    Keith A. Hobson

    2015-06-01

    The Canadian Migration Monitoring Network (CMMN) consists of standardized observation and migration count stations located largely along Canada's southern border. A major purpose of CMMN is to detect population trends of migratory passerines that breed primarily in the boreal forest and are otherwise poorly monitored by the North American Breeding Bird Survey (BBS). A primary limitation of this approach to monitoring is that it is currently not clear which geographic regions of the boreal forest are represented by the trends generated for each bird species at each station or group of stations. Such information on "catchment areas" for CMMN will greatly enhance their value in contributing to understanding causes of population trends, as well as facilitating joint trend analysis for stations with similar catchments. It is now well established that naturally occurring concentrations of deuterium in feathers grown in North America can provide information on their approximate geographic origins, especially latitude. We used stable hydrogen isotope analyses of feathers (δ²Hf) from 15 species intercepted at 22 CMMN stations to assign approximate origins to populations moving through stations or groups of stations. We further constrained the potential catchment areas using prior information on potential longitudinal origins based upon bird migration trajectories predicted from band recovery data and known breeding distributions. We detected several cases of differences in catchment area of species passing through sites, and between seasons within species. We discuss the importance of our findings, and future directions for using this approach to assist conservation of migratory birds at continental scales.

  7. Kvalitative analyser

    DEFF Research Database (Denmark)

    Boolsen, Merete Watt

    The book explains the fundamental steps of the research process and applies them to selected qualitative analyses: content analysis, Grounded Theory, argumentation analysis and discourse analysis.

  8. Prediction models for short children born small for gestational age (SGA) covering the total growth phase. Analyses based on data from KIGS (Pfizer International Growth Database)

    Directory of Open Access Journals (Sweden)

    Lindberg Anders

    2011-06-01

    Abstract Background Mathematical models can be developed to predict growth in short children treated with growth hormone (GH). These models can serve to optimize and individualize treatment in terms of height outcomes and costs. The aims of this study were to compile existing prediction models for short children born small for gestational age (SGA), to develop new models and to validate the algorithms. Methods Existing models to predict height velocity (HV) for the first two and the fourth prepubertal years and during total pubertal growth (TPG) on GH were applied to SGA children from KIGS (Pfizer International Growth Database): 1st year: N = 2340; 2nd year: N = 1358; 4th year: N = 182; TPG: N = 59. A new prediction model was developed for the 3rd prepubertal year based upon 317 children by means of the all-possible-regression approach, using Mallows' Cp criterion. Results The comparison between the observed and predicted height velocities showed no significant difference when the existing prediction models were applied to new cohorts. A model for predicting HV during the 3rd year explained 33% of the variability with an error SD of 1.0 cm/year. The predictors were (in order of importance): HV in the previous year; chronological age; weight SDS; mid-parent height SDS; and GH dose. Conclusions Models to predict the growth response to GH from the prepubertal years to adult height are available for short children born SGA. The models utilize easily accessible predictors and are accurate. The overall explained variability in SGA is relatively low, due to the heterogeneity of the disorder. The models can be used to provide patients with a realistic expectation of treatment, and may help to identify compliance problems or other underlying causes of treatment failure.
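
The model-building step named in the abstract, an all-possible-regression search scored with Mallows' Cp, can be sketched on synthetic data. The candidate predictors and effect sizes below are invented for illustration; only the selection technique matches the abstract:

```python
from itertools import combinations

import numpy as np

rng = np.random.default_rng(1)
n = 300
# Four candidate predictors: only the first two truly drive the outcome.
X = rng.normal(size=(n, 4))
y = 2.0 * X[:, 0] + 3.0 * X[:, 1] + rng.normal(scale=1.0, size=n)

def sse(cols):
    """Residual sum of squares of an OLS fit on the given predictor columns."""
    A = np.column_stack([np.ones(n), X[:, cols]])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    r = y - A @ beta
    return r @ r

full_cols = (0, 1, 2, 3)
s2 = sse(full_cols) / (n - len(full_cols) - 1)  # error variance from the full model

def mallows_cp(cols):
    p = len(cols) + 1  # parameters, including the intercept
    return sse(cols) / s2 + 2 * p - n

# Enumerate every non-empty subset and keep the one with the smallest Cp.
subsets = [c for k in range(1, 5) for c in combinations(range(4), k)]
best = min(subsets, key=mallows_cp)
print(best)
```

Cp penalizes model size, so the search favors the smallest subset whose fit is close to the full model's, here the two genuinely predictive columns.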

  9. Logistic Regression Analyses for Predicting Clinically Important Differences in Motor Capacity, Motor Performance, and Functional Independence after Constraint-Induced Therapy in Children with Cerebral Palsy

    Science.gov (United States)

    Wang, Tien-ni; Wu, Ching-yi; Chen, Chia-ling; Shieh, Jeng-yi; Lu, Lu; Lin, Keh-chung

    2013-01-01

    Given the growing evidence for the effects of constraint-induced therapy (CIT) in children with cerebral palsy (CP), there is a need for investigating the characteristics of potential participants who may benefit most from this intervention. This study aimed to establish predictive models for the effects of pediatric CIT on motor and functional…

  10. Molecular and clinical analyses of Greig cephalopolysyndactyly and Pallister-Hall syndromes: robust phenotype prediction from the type and position of GLI3 mutations.

    NARCIS (Netherlands)

    Johnston, J.J.; Olivos-Glander, I.; Killoran, C.; Elson, E.; Turner, J.T.; Peters, K.F.; Abbott, M.H.; Aughton, D.J.; Aylsworth, A.S.; Bamshad, M.; Booth, C.; Curry, C.J.; David, A.; Dinulos, M.B.; Flannery, D.B.; Fox, M.A.; Graham, J.M.; Grange, D.K.; Guttmacher, A.E.; Hannibal, M.C.; Henn, W.; Hennekam, R.C.M.; Holmes, L.B.; Hoyme, H.E.; Leppig, K.A.; Lin, A.E.; Macleod, P.; Manchester, D.K.; Marcelis, C.L.M.; Mazzanti, L.; McCann, E.; McDonald, M.T.; Mendelsohn, N.J.; Moeschler, J.B.; Moghaddam, B.; Neri, G.; Newbury-Ecob, R.; Pagon, R.A.; Phillips, J.A.; Sadler, L.S.; Stoler, J.M.; Tilstra, D.; Walsh Vockley, C.M.; Zackai, E.H.; Zadeh, T.M.; Brueton, L.; Black, G.C.M.; Biesecker, L.G.

    2005-01-01

    Mutations in the GLI3 zinc-finger transcription factor gene cause Greig cephalopolysyndactyly syndrome (GCPS) and Pallister-Hall syndrome (PHS), which are variable but distinct clinical entities. We hypothesized that GLI3 mutations that predict a truncated functional repressor protein cause PHS and

  11. The Janus-faced nature of time spent on homework: Using latent profile analyses to predict academic achievement over a school year

    NARCIS (Netherlands)

    Flunger, Barbara; Trautwein, Ulrich; Nagengast, Benjamin; Lüdtke, Oliver; Niggli, Alois; Schnyder, Inge

    2015-01-01

    Homework time and achievement are only modestly associated, whereas homework effort has consistently been shown to positively predict later achievement. We argue that time spent on homework can be an important predictor of achievement when combined with measures of homework effort. Latent profile an

  12. Consequences of kriging and land use regression for PM2.5 predictions in epidemiologic analyses: insights into spatial variability using high-resolution satellite data.

    Science.gov (United States)

    Alexeeff, Stacey E; Schwartz, Joel; Kloog, Itai; Chudnovsky, Alexandra; Koutrakis, Petros; Coull, Brent A

    2015-01-01

    Many epidemiological studies use predicted air pollution exposures as surrogates for true air pollution levels. These predicted exposures contain exposure measurement error, yet simulation studies have typically found negligible bias in resulting health effect estimates. However, previous studies typically assumed a statistical spatial model for air pollution exposure, which may be oversimplified. We address this shortcoming by assuming a realistic, complex exposure surface derived from fine-scale (1 km × 1 km) remote-sensing satellite data. Using simulation, we evaluate the accuracy of epidemiological health effect estimates in linear and logistic regression when using spatial air pollution predictions from kriging and land use regression models. We examined chronic (long-term) and acute (short-term) exposure to air pollution. Results varied substantially across different scenarios. Exposure models with low out-of-sample R² yielded severe biases in the health effect estimates of some models, ranging from 60% upward bias to 70% downward bias. One land use regression exposure model with >0.9 out-of-sample R² yielded upward biases up to 13% for acute health effect estimates. Almost all models drastically underestimated the standard errors. Land use regression models performed better in chronic effect simulations. These results can help researchers when interpreting health effect estimates in these types of studies.
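
The mechanism under study, exposure predictions that stand in for true exposure and carry measurement error into the health model, can be illustrated with the classical-error case, where the error attenuates the health effect estimate toward zero. This is a stylized sketch, not the paper's satellite-based simulation:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 5000
beta_true = 1.0

# True exposure at each location, and a health outcome driven by it.
x_true = rng.normal(size=n)
y = beta_true * x_true + rng.normal(scale=1.0, size=n)

# Predicted exposure = truth plus classical measurement error
# (roughly what an exposure model with out-of-sample R^2 ~ 0.5 delivers).
x_pred = x_true + rng.normal(scale=1.0, size=n)

def slope(x, y):
    """OLS slope of y on x."""
    return np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)

# Classical error attenuates the slope by
# lambda = var(x_true) / (var(x_true) + var(error)) = 0.5 here.
print(slope(x_true, y), slope(x_pred, y))
```

The paper's point is that with realistic exposure surfaces the error is a mix of classical-like and Berkson-like components, so the bias can go in either direction rather than being pure attenuation.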

  13. Generation of a predicted protein database from EST data and application to iTRAQ analyses in grape (Vitis vinifera cv. Cabernet Sauvignon) berries at ripening initiation

    Directory of Open Access Journals (Sweden)

    Smith Derek

    2009-01-01

    Abstract Background iTRAQ is a proteomics technique that uses isobaric tags for relative and absolute quantitation of tryptic peptides. In proteomics experiments, the detection and high-confidence annotation of proteins and the significance of corresponding expression differences can depend on the quality and the species specificity of the tryptic peptide map database used for analysis of the data. For species for which finished genome sequence data are not available, identification of proteins relies on similarity to proteins from other species using comprehensive peptide map databases such as the MSDB. Results We were interested in characterizing ripening initiation ('veraison') in grape berries at the protein level in order to better define the molecular control of this important process for grape growers and wine makers. We developed a bioinformatic pipeline for processing EST data in order to produce a predicted tryptic peptide database specifically targeted to the wine grape cultivar Vitis vinifera cv. Cabernet Sauvignon, and lacking truncated N- and C-terminal fragments. By searching iTRAQ MS/MS data generated from berry exocarp and mesocarp samples at ripening initiation, we determined that implementation of the custom database afforded a large improvement in high-confidence peptide annotation in comparison to the MSDB. We used iTRAQ MS/MS in conjunction with custom peptide database searches to quantitatively characterize several important pathway components for berry ripening previously described at the transcriptional level, and confirmed expression patterns for these at the protein level. Conclusion We determined that a predicted peptide database for MS/MS applications can be derived from EST data using advanced clustering and trimming approaches and successfully implemented for quantitative proteome profiling. Quantitative shotgun proteome profiling holds great promise for characterizing biological processes such as fruit ripening

  14. Comparative analyses of genetic risk prediction methods reveal extreme diversity of genetic predisposition to nonalcoholic fatty liver disease (NAFLD) among ethnic populations of India

    Indian Academy of Sciences (India)

    Ankita Chatterjee; Analabha Basu; Abhijit Chowdhury; Kausik Das; Neeta Sarkar-Roy; Partha P. Majumder; Priyadarshi Basu

    2015-03-01

    Nonalcoholic fatty liver disease (NAFLD) is a distinct pathologic condition characterized by a disease spectrum ranging from simple steatosis to steato-hepatitis, cirrhosis and hepatocellular carcinoma. Prevalence of NAFLD varies in different ethnic groups, ranging from 12% in Chinese to 45% in Hispanics. Among Indian populations, the diversity in prevalence is high, ranging from 9% in rural populations to 32% in urban populations, with geographic differences as well. Here, we wished to find out if this difference is reflected in their genetic makeup. To date, several candidate genes and a few genomewide association studies (GWAS) have been carried out, and many associations between single nucleotide polymorphisms (SNPs) and NAFLD have been observed. In this study, the risk allele frequencies (RAFs) of NAFLD-associated SNPs in 20 Indian ethnic populations (376 individuals) were analysed. We used two different measures for calculating genetic risk scores and compared their performance. The correlation of additive risk scores of NAFLD for three HapMap populations with their weighted mean prevalence was found to be high (R² = 0.93). Later we used this method to compare NAFLD risk among ethnic Indian populations. Based on our observations, Indian caste populations have high risk scores compared to Caucasians, who are often used as surrogates for Indian caste populations in disease gene association studies, and significantly higher risk scores than Indian tribal populations.


  16. Effects of pharmacists' interventions on appropriateness of prescribing and evaluation of the instruments' (MAI, STOPP and START) ability to predict hospitalization--analyses from a randomized controlled trial.

    Directory of Open Access Journals (Sweden)

    Ulrika Gillespie

    BACKGROUND: Appropriateness of prescribing can be assessed by various measures and screening instruments. The aims of this study were to investigate the effects of pharmacists' interventions on appropriateness of prescribing in elderly patients, and to explore the relationship between these results and hospital care utilization during a 12-month follow-up period. METHODS: The study population from a previous randomized controlled study, in which the effects of a comprehensive pharmacist intervention on re-hospitalization were investigated, was used. The criteria from the instruments MAI, STOPP and START were applied retrospectively to the 368 study patients (intervention group (I) n = 182, control group (C) n = 186). The assessments were done on admission and at discharge to detect differences over time and between the groups. Hospital care consumption was recorded and the association between scores for appropriateness and hospitalization was analysed. RESULTS: The number of Potentially Inappropriate Medicines (PIMs) per patient as identified by STOPP was reduced for I but not for C (1.42 to 0.93 vs. 1.46 to 1.66, respectively; p<0.01). The number of Potential Prescription Omissions (PPOs) per patient as identified by START was reduced for I but not for C (0.36 to 0.09 vs. 0.42 to 0.45, respectively; p<0.001). The summated score for MAI was reduced for I but not for C (8.5 to 5.0 and 8.7 to 10.0, respectively; p<0.001). There was a positive association between scores for MAI and STOPP and drug-related readmissions (RR 8-9% and 30-34%, respectively). No association was detected between the scores of the tools and total re-visits to hospital. CONCLUSION: The interventions significantly improved the appropriateness of prescribing for patients in the intervention group, as evaluated by the instruments MAI, STOPP and START. High scores in MAI and STOPP were associated with a higher number of drug-related readmissions.

  17. A new tool for prediction and analysis of thermal comfort in steady and transient states; Un nouvel outil pour la prediction et l'analyse du confort thermique en regime permanent et variable

    Energy Technology Data Exchange (ETDEWEB)

    Megri, A.Ch. [Illinois Institute of Technology, Civil and Architectural Engineering Dept., Chicago, Illinois (United States); Megri, A.F. [Centre Universitaire de Tebessa, Dept. d' Electronique (Algeria); El Naqa, I. [Washington Univ., School of Medicine, Dept. of Radiation Oncology, Saint Louis, Missouri (United States); Achard, G. [Universite de Savoie, Lab. Optimisation de la Conception et Ingenierie de L' Environnement (LOCIE) - ESIGEC, 73 - Le Bourget du Lac (France)

    2006-02-15

    Thermal comfort is influenced by psychological as well as physiological factors. This paper proposes the use of support vector machine (SVM) learning for automated prediction of human thermal comfort in steady and transient states. The SVM is an artificial intelligence approach that can capture the input/output mapping from the given data. Support vector machines were developed based on the Structural Risk Minimization principle. Different sets of representative experimental environmental factors that affect a homogenous person's thermal balance were used for training the SVM. The SVM is a very efficient, fast, and accurate technique for identifying thermal comfort. This technique permits the determination of thermal comfort indices for different sub-categories of people (e.g., the sick and elderly, or people in extreme climatic conditions), when experimental data for such a sub-category are available. The experimental data have been used for the learning and testing processes. The results show a good correlation between SVM-predicted values and those obtained from conventional thermal comfort models, such as the Fanger and Gagge models. Once trained on representative data, the machine can be used more easily and effectively than other conventional methods for estimating the different indices. (author)
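
As a rough illustration of the approach, a linear SVM-style classifier can be trained on toy (temperature, humidity) points labeled comfortable/uncomfortable by sub-gradient descent on the regularized hinge loss. The study uses a full SVM formulation on measured environmental factors, so the data, comfort rule and hyperparameters below are entirely invented:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200

# Toy environment samples: air temperature (deg C) and relative humidity (%).
temp = rng.uniform(15, 35, n)
hum = rng.uniform(20, 80, n)
# Invented comfort rule: comfortable (+1) below a linear temp/humidity threshold.
y = np.where(temp + 0.1 * hum < 28, 1.0, -1.0)

# Standardize features so the sub-gradient steps are well scaled.
X = np.column_stack([temp, hum])
X = (X - X.mean(axis=0)) / X.std(axis=0)

# Sub-gradient descent on the L2-regularized hinge loss.
w, b, lam, lr = np.zeros(2), 0.0, 0.01, 0.05
for _ in range(300):
    for xi, yi in zip(X, y):
        if yi * (w @ xi + b) < 1:   # margin violated: hinge term is active
            w -= lr * (lam * w - yi * xi)
            b += lr * yi
        else:                       # only the regularizer pulls on w
            w -= lr * lam * w

acc = np.mean(np.sign(X @ w + b) == y)
print(acc)
```

A production SVM would use a solver with kernels (e.g. RBF) to capture the nonlinear comfort boundaries the paper deals with; the sketch only shows the margin-maximization idea.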

  18. Phylogenetic analyses of complete mitochondrial genome sequences suggest a basal divergence of the enigmatic rodent Anomalurus

    Directory of Open Access Journals (Sweden)

    Gissi Carmela

    2007-02-01

    Abstract Background Phylogenetic relationships between Lagomorpha, Rodentia and Primates and their allies (Euarchontoglires) have long been debated. While it is now generally agreed that Rodentia constitutes a monophyletic sister-group of Lagomorpha and that this clade (Glires) is sister to Primates and Dermoptera, higher-level relationships within Rodentia remain contentious. Results We have sequenced and performed extensive evolutionary analyses on the mitochondrial genome of the scaly-tailed flying squirrel Anomalurus sp., an enigmatic rodent whose phylogenetic affinities have been obscure and extensively debated. Our phylogenetic analyses of the coding regions of available complete mitochondrial genome sequences from Euarchontoglires suggest that Anomalurus is a sister taxon to the Hystricognathi, and that this clade represents the most basal divergence among sampled Rodentia. Bayesian dating methods incorporating a relaxed molecular clock provide divergence-time estimates which are consistently in agreement with the fossil record and which indicate a rapid radiation within Glires around 60 million years ago. Conclusion Taken together, the data presented provide a working hypothesis as to the phylogenetic placement of Anomalurus, underline the utility of mitochondrial sequences in the resolution of even relatively deep divergences and go some way to explaining the difficulty of conclusively resolving higher-level relationships within Glires with available data and methodologies.

  19. POMA analyses as new efficient bioinformatics' platform to predict and optimise bioactivity of synthesized 3a,4-dihydro-3H-indeno[1,2-c]pyrazole-2-carboxamide/carbothioamide analogues.

    Science.gov (United States)

    Ahsan, Mohamed Jawed; Govindasamy, Jeyabalan; Khalilullah, Habibullah; Mohan, Govind; Stables, James P

    2012-12-01

    A series of 43 3a,4-dihydro-3H-indeno[1,2-c]pyrazole-2-carboxamide/carbothioamide analogues (D01-D43) were analysed using Petra, Osiris, Molinspiration and ALOGPS (POMA) to identify pharmacophores and to predict toxicity, lipophilicity and bioactivity. All the compounds were evaluated for anti-HIV activity. 3-(4-Chlorophenyl)-N-(4-fluorophenyl)-6,7-dimethoxy-3a,4-dihydro-3H-indeno[1,2-c]pyrazole-2-carboxamide (D07) was found to be the most active, with IC50 > 4.83 μM and CC50 4.83 μM. 3-(4-Fluorophenyl)-6,7-dimethoxy-3a,4-dihydro-3H-indeno[1,2-c]pyrazole-2-carbothioamide (D41) was found to be the most active compound against bacterial strains, with an MIC of 4 μg/ml, comparable to the standard drug ciprofloxacin, while 3-(4-methoxyphenyl)-6,7-dimethoxy-3a,4-dihydro-3H-indeno[1,2-c]pyrazole-2-carboxamide (D38) was found to be the most active compound against fungal strains, with an MIC of 2-4 μg/ml, although less active than standard fluconazole. Toxicity predictions by Osiris were well supported and experimentally verified, with the exception of some compounds. In anticonvulsant screening, 3-(4-fluorophenyl)-N-(4-chlorophenyl)-6,7-dimethoxy-3a,4-dihydro-3H-indeno[1,2-c]pyrazole-2-carboxamide (D09) showed maximum activity, with 100% (4/4, 0.25-0.5 h) and 75% (3/4, 1.0 h) protection against the minimal clonic seizure test without any toxicity.

  20. Network class superposition analyses.

    Directory of Open Access Journals (Sweden)

    Carl A B Pearson

    Networks are often used to understand a whole system by modeling the interactions among its pieces. Examples include biomolecules in a cell interacting to provide some primary function, or species in an environment forming a stable community. However, these interactions are often unknown; instead, the pieces' dynamic states are known, and network structure must be inferred. Because observed function may be explained by many different networks (e.g., ≈ 10³⁰ for the yeast cell cycle process), considering dynamics beyond this primary function means picking a single network or suitable sample: measuring over all networks exhibiting the primary function is computationally infeasible. We circumvent that obstacle by calculating the network class ensemble. We represent the ensemble by a stochastic matrix T, which is a transition-by-transition superposition of the system dynamics for each member of the class. We present concrete results for T derived from boolean time series dynamics on networks obeying the Strong Inhibition rule, by applying T to several traditional questions about network dynamics. We show that the distribution of the number of point attractors can be accurately estimated with T. We show how to generate Derrida plots based on T. We show that T-based Shannon entropy outperforms other methods at selecting experiments to further narrow the network structure. We also outline an experimental test of predictions based on T. We motivate all of these results in terms of a popular molecular biology boolean network model for the yeast cell cycle, but the methods and analyses we introduce are general. We conclude with open questions for T, for example, application to other models, computational considerations when scaling up to larger systems, and other potential analyses.
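
The central object T is just the average of the deterministic transition matrices of every network in the class. A toy version with three hypothetical two-node boolean update rules (illustrative, not the paper's Strong Inhibition class):

```python
from itertools import product

import numpy as np

states = list(product((0, 1), repeat=2))  # the 4 states of a 2-node network

def transition_matrix(update):
    """0/1 matrix whose (i, j) entry marks the deterministic step i -> j."""
    M = np.zeros((4, 4))
    for i, s in enumerate(states):
        M[i, states.index(update(s))] = 1.0
    return M

# Three toy members of a network class (invented update rules).
rules = [
    lambda s: (s[1], s[0]),          # swap the two nodes
    lambda s: (s[1], 1 - s[0]),      # swap and negate node 1
    lambda s: (s[0] & s[1], s[0]),   # AND into node 0
]

# T: transition-by-transition superposition (uniform average) over the class.
T = sum(transition_matrix(r) for r in rules) / len(rules)

# Row-wise Shannon entropy of T quantifies how much the class members
# disagree about the successor of each state -- the basis for picking
# the most informative next experiment.
def row_entropy(p):
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

entropies = [row_entropy(T[i]) for i in range(4)]
print(T, entropies)
```

Each row of T is a probability distribution over successor states, so states whose rows have high entropy are exactly those where observing one more transition would discriminate best among the candidate networks.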

  1. Sproglig Metode og Analyse

    DEFF Research Database (Denmark)

    le Fevre Jakobsen, Bjarne

The publication contains exercise materials, texts, PowerPoint presentations and handouts for the course Sproglig Metode og Analyse (Linguistic Method and Analysis) at BA level and as an elective subject in Danish/Nordic Studies, 2010-2011.

  2. Uncertainty and Sensitivity Analyses Plan

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, J.C.; Ramsdell, J.V. Jr.

    1993-04-01

    Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project.
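The kind of uncertainty and sensitivity analysis described in this plan can be illustrated with a generic Monte Carlo sketch. This is not HEDR code; the toy "dose model" and its input distributions are invented for illustration. The recipe: sample the uncertain inputs, propagate them through the model, summarise the output spread, and rank the inputs by rank correlation with the output.

```python
# Generic Monte Carlo uncertainty propagation + rank-correlation sensitivity.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical uncertain inputs (distributions invented for illustration):
release = rng.lognormal(mean=0.0, sigma=0.5, size=n)     # source term
dispersion = rng.lognormal(mean=-2.0, sigma=0.8, size=n) # atmospheric factor
intake = rng.uniform(0.5, 1.5, size=n)                   # exposure factor

dose = release * dispersion * intake  # toy multiplicative dose model

# Uncertainty: central estimate with a 90 % probability interval.
p5, p50, p95 = np.percentile(dose, [5, 50, 95])

# Sensitivity: Spearman rank correlation of each input with the output.
def spearman(x, y):
    rx, ry = x.argsort().argsort(), y.argsort().argsort()
    return np.corrcoef(rx, ry)[0, 1]

sens = {name: spearman(v, dose)
        for name, v in [("release", release),
                        ("dispersion", dispersion),
                        ("intake", intake)]}
# The widest input distribution (dispersion) should dominate the ranking.
```

For a hierarchical model like HEDR's, the same machinery is applied per submodel, so that output uncertainty can be attributed to groups of parameters rather than only to individual ones.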

  3. Why looking at the whole hippocampus is not enough – a critical role for anteroposterior axis, subfield and activation analyses to enhance predictive value of hippocampal changes for Alzheimer’s disease diagnosis.

    Directory of Open Access Journals (Sweden)

    Aleksandra eMaruszak

    2014-03-01

Full Text Available The hippocampus is one of the earliest affected brain regions in Alzheimer's disease (AD), and its dysfunction is believed to underlie the core feature of the disease: memory impairment. Given that hippocampal volume is one of the best AD biomarkers, our review focuses on distinct subfields within the hippocampus, pinpointing regions that might enhance the predictive value of current diagnostic methods. Our review presents how changes in hippocampal volume, shape, symmetry and activation are reflected by cognitive impairment and how they are linked with alterations in neurogenesis. Moreover, we revisit the functional differentiation along the anteroposterior longitudinal axis of the hippocampus and discuss its relevance for AD diagnosis. Finally, we indicate that apart from hippocampal subfield volumetry, the characteristic pattern of hippocampal hyperactivation associated with seizures and neurogenesis changes is another promising candidate for an early AD biomarker that could also become a target for early interventions.

  4. Laser Beam Focus Analyser

    DEFF Research Database (Denmark)

    Nielsen, Peter Carøe; Hansen, Hans Nørgaard; Olsen, Flemming Ove

    2007-01-01

The quantitative and qualitative description of laser beam characteristics is important for process implementation and optimisation. In particular, a need for quantitative characterisation of beam diameter was identified when using fibre lasers for micro manufacturing. Here the beam diameter limits the obtainable features in direct laser machining as well as heat affected zones in welding processes. This paper describes the development of a measuring unit capable of analysing beam shape and diameter of lasers to be used in manufacturing processes. The analyser is based on the principle of a rotating mechanical wire being swept through the laser beam at varying Z-heights. The reflected signal is analysed and the resulting beam profile determined. The development comprised the design of a flexible fixture capable of providing both rotation and Z-axis movement, control software including data capture…

  5. Hydrophysical conditions and periphyton in natural rivers. Analysis and predictive modelling of periphyton by changed regulations; Hydrofysiske forhold og begroing i naturlige elver. Analyse og prediktiv modellering av begroing ved reguleringsendringer

    Energy Technology Data Exchange (ETDEWEB)

    Stokseth, S.

    1994-10-01

The objective of this thesis has been to examine the interaction between hydrodynamical and physical factors and the temporal and spatial dynamics of periphyton in natural steep rivers. The study strategy has been to work with quantitative system variables to be able to evaluate the potential usability of a predictive model for periphyton changes as a response to river regulations. The thesis is constituted by a theoretical and an empirical study. The theoretical study is aimed at presenting a conceptual model of the relevant factors based on an analysis of published studies. Effort has been made to evaluate and present the background material in a structured way. To concurrently handle the spatial and temporal dynamics of periphyton a new method for data collection has been developed, together with a procedure for quantifying the photo registrations. The simple hydrodynamical parameters were estimated from a set of standard formulas, whereas the complex parameters were estimated from a three-dimensional simulation model called SSIIM. The main conclusion from the analysis is that flood events are the major controlling factors with respect to periphyton biomass, and that water temperature is of major importance for periphyton resistance. Low temperature clearly increases the periphyton erosion resistance. Thus, to model or control the temporal dynamics of river periphyton, the water temperature and the frequency and size of floods should be regarded as the most significant controlling factors. The data in this study have been collected from a river with a stable water quality and frequent floods. 109 refs., 41 figs., 34 tabs.

  6. Wavelet Analyses and Applications

    Science.gov (United States)

    Bordeianu, Cristian C.; Landau, Rubin H.; Paez, Manuel J.

    2009-01-01

    It is shown how a modern extension of Fourier analysis known as wavelet analysis is applied to signals containing multiscale information. First, a continuous wavelet transform is used to analyse the spectrum of a nonstationary signal (one whose form changes in time). The spectral analysis of such a signal gives the strength of the signal in each…
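A continuous wavelet transform of a nonstationary signal can be sketched in pure NumPy (a minimal illustration, not the article's code): correlate the signal with a Ricker ("Mexican hat") wavelet at a range of scales and inspect where in time each scale responds.

```python
# Minimal continuous wavelet transform sketch (pure NumPy, Ricker wavelet).
import numpy as np

def ricker(points, a):
    """Ricker (Mexican hat) wavelet with width parameter a, in samples."""
    t = np.arange(points) - (points - 1) / 2.0
    return (2 / (np.sqrt(3 * a) * np.pi ** 0.25)
            * (1 - (t / a) ** 2) * np.exp(-t ** 2 / (2 * a ** 2)))

def cwt(signal, widths):
    """Correlate the signal with the wavelet at each width (scale)."""
    out = np.empty((len(widths), len(signal)))
    for i, a in enumerate(widths):
        w = ricker(min(10 * int(a), len(signal)), a)
        out[i] = np.convolve(signal, w, mode="same")
    return out

# A nonstationary signal: 5 Hz for the first second, 25 Hz for the second.
fs = 200
t = np.arange(0, 2, 1 / fs)
sig = np.where(t < 1, np.sin(2 * np.pi * 5 * t), np.sin(2 * np.pi * 25 * t))

coeffs = cwt(sig, widths=np.arange(1, 31))
# Small scales (narrow wavelets) respond in the high-frequency half of the
# signal; large scales respond in the low-frequency half.
```

A plain Fourier spectrum of this signal would show both frequencies but not when each occurs; the CWT coefficient matrix localises each component in time, which is the point made in the abstract.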

  7. Report sensory analyses veal

    NARCIS (Netherlands)

    Veldman, M.; Schelvis-Smit, A.A.M.

    2005-01-01

    On behalf of a client of Animal Sciences Group, different varieties of veal were analyzed by both instrumental and sensory analyses. The sensory evaluation was performed with a sensory analytical panel in the period of 13th of May and 31st of May, 2005. The three varieties of veal were: young bull,

  8. Contesting Citizenship: Comparative Analyses

    DEFF Research Database (Denmark)

    Siim, Birte; Squires, Judith

    2007-01-01

Comparative citizenship analyses need to be considered in relation to multiple inequalities and their intersections, and to multiple levels of governance and trans-national organising. This, in turn, suggests that comparative citizenship analysis needs to consider new spaces in which struggles for equal citizenship occur…

  9. Meta-analyses

    NARCIS (Netherlands)

    Hendriks, M.A.; Luyten, J.W.; Scheerens, J.; Sleegers, P.J.C.; Scheerens, J.

    2014-01-01

    In this chapter results of a research synthesis and quantitative meta-analyses of three facets of time effects in education are presented, namely time at school during regular lesson hours, homework, and extended learning time. The number of studies for these three facets of time that could be used

  10. Expression of lactate/H⁺ symporters MCT1 and MCT4 and their chaperone CD147 predicts tumor progression in clear cell renal cell carcinoma: immunohistochemical and The Cancer Genome Atlas data analyses.

    Science.gov (United States)

    Kim, Younghye; Choi, Jung-Woo; Lee, Ju-Han; Kim, Young-Sik

    2015-01-01

Clear cell renal cell carcinomas (ccRCCs) have inactivation of the von Hippel-Lindau protein, leading to the accumulation of hypoxia-inducible factor-α (HIF-α). HIF-1α induces aerobic glycolysis, the Warburg effect, whereas HIF-2α functions as an oncoprotein. Lactate transport through monocarboxylate transporters (MCTs) and the chaperone CD147 is essential for the survival of highly glycolytic cancer cells. To elucidate the clinical significance of MCT1, MCT4, and CD147 expression, we investigated their expression by immunohistochemistry in ccRCC specimens and validated the results by an open-access The Cancer Genome Atlas data analysis. Overexpression of MCT1, MCT4, and CD147 was observed in 49.4% (89/180), 39.4% (71/180), and 79.4% (143/180) of ccRCC patients, respectively. High MCT1 expression was associated with older age (P = .017), larger tumor size (P = .015), and advanced TNM stage (P = .012). However, MCT4 overexpression was not related to any variables. CD147 overexpression correlated with high grade (P = .005), tumor necrosis (P = .016), and larger tumor size (P = .038). In univariate analysis, high expression of MCT1 and of CD147 (P = .02) was linked to short progression-free survival. In multivariate analysis, high MCT1 expression was associated with worse progression-free survival (P = .001). In conclusion, high expression of MCT1 and CD147 is associated with poor prognostic factors. Overexpression of MCT1, MCT4, and CD147 predicts tumor progression. Reversing the Warburg effect by targeting the lactate transporters may be a useful strategy to prevent ccRCC progression.
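The progression-free survival comparisons in this record rest on survival-curve estimates; a minimal Kaplan-Meier estimator conveys the idea (the follow-up times and censoring flags below are invented toy data, not the study's patients):

```python
# Minimal Kaplan-Meier estimator sketch (NumPy only, toy data).
import numpy as np

def kaplan_meier(time, event):
    """Return (event times, survival probabilities) for right-censored data.

    event = 1 means progression observed; event = 0 means censored."""
    order = np.argsort(time)
    time, event = time[order], event[order]
    n_at_risk = len(time)
    surv, out_t, out_s = 1.0, [], []
    for u in np.unique(time):
        mask = time == u
        d = event[mask].sum()            # progressions at time u
        if d:
            surv *= 1 - d / n_at_risk    # product-limit update
            out_t.append(u)
            out_s.append(surv)
        n_at_risk -= mask.sum()          # events and censorings leave the risk set
    return np.array(out_t), np.array(out_s)

months = np.array([5.0, 8.0, 8.0, 12.0, 16.0, 20.0])
progressed = np.array([1, 1, 0, 1, 0, 1])   # 0 = censored follow-up
km_t, km_s = kaplan_meier(months, progressed)
# The curve drops only at observed progression times: 5, 8, 12, 20 months.
```

A univariate comparison of two such curves (e.g. high vs. low MCT1 expression) is then typically done with a log-rank test, and the multivariate adjustment with a Cox proportional-hazards model.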

  11. Analysing Access Control Specifications

    DEFF Research Database (Denmark)

    Probst, Christian W.; Hansen, René Rydhof

    2009-01-01

When prosecuting crimes, the main question to answer is often who had a motive and the possibility to commit the crime. When investigating cyber crimes, the question of possibility is often hard to answer, as in a networked system almost any location can be accessed from almost anywhere. Recent events have revealed intimate knowledge of surveillance and control systems on the side of the attacker, making it often impossible to deduce the identity of an inside attacker from logged data. In this work we present an approach that analyses the access control configuration to identify the set of credentials needed to reach a certain location in a system. This knowledge allows the identification of a set of (inside) actors who have the possibility to commit an insider attack at that location. This has immediate applications in analysing log files, but also nontechnical applications such as identifying possible…

  12. Possible future HERA analyses

    CERN Document Server

    Geiser, Achim

    2015-01-01

    A variety of possible future analyses of HERA data in the context of the HERA data preservation programme is collected, motivated, and commented. The focus is placed on possible future analyses of the existing $ep$ collider data and their physics scope. Comparisons to the original scope of the HERA programme are made, and cross references to topics also covered by other participants of the workshop are given. This includes topics on QCD, proton structure, diffraction, jets, hadronic final states, heavy flavours, electroweak physics, and the application of related theory and phenomenology topics like NNLO QCD calculations, low-x related models, nonperturbative QCD aspects, and electroweak radiative corrections. Synergies with other collider programmes are also addressed. In summary, the range of physics topics which can still be uniquely covered using the existing data is very broad and of considerable physics interest, often matching the interest of results from colliders currently in operation. Due to well-e...

  13. Possible future HERA analyses

    Energy Technology Data Exchange (ETDEWEB)

    Geiser, Achim

    2015-12-15

A variety of possible future analyses of HERA data in the context of the HERA data preservation programme is collected, motivated, and commented. The focus is placed on possible future analyses of the existing ep collider data and their physics scope. Comparisons to the original scope of the HERA programme are made, and cross references to topics also covered by other participants of the workshop are given. This includes topics on QCD, proton structure, diffraction, jets, hadronic final states, heavy flavours, electroweak physics, and the application of related theory and phenomenology topics like NNLO QCD calculations, low-x related models, nonperturbative QCD aspects, and electroweak radiative corrections. Synergies with other collider programmes are also addressed. In summary, the range of physics topics which can still be uniquely covered using the existing data is very broad and of considerable physics interest, often matching the interest of results from colliders currently in operation. Due to well-established data and MC sets, calibrations, and analysis procedures the manpower and expertise needed for a particular analysis is often very much smaller than that needed for an ongoing experiment. Since centrally funded manpower to carry out such analyses is not available any longer, this contribution not only targets experienced self-funded experimentalists, but also theorists and master-level students who might wish to carry out such an analysis.

  14. Biomass feedstock analyses

    Energy Technology Data Exchange (ETDEWEB)

    Wilen, C.; Moilanen, A.; Kurkela, E. [VTT Energy, Espoo (Finland). Energy Production Technologies

    1996-12-31

The overall objectives of the project 'Feasibility of electricity production from biomass by pressurized gasification systems' within the EC Research Programme JOULE II were to evaluate the potential of advanced power production systems based on biomass gasification and to study the technical and economic feasibility of these new processes with different types of biomass feed stocks. This report was prepared as part of this R and D project. The objectives of this task were to perform fuel analyses of potential woody and herbaceous biomasses with specific regard to the gasification properties of the selected feed stocks. The analyses of 15 Scandinavian and European biomass feed stocks included density, proximate and ultimate analyses, trace compounds, ash composition and fusion behaviour in oxidizing and reducing atmospheres. The wood-derived fuels, such as whole-tree chips, forest residues, bark and to some extent willow, can be expected to have good gasification properties. Difficulties caused by ash fusion and sintering in straw combustion and gasification are generally known. The ash and alkali metal contents of the European biomasses harvested in Italy resembled those of the Nordic straws, and they are expected to behave to a great extent like straw in gasification. No direct relation between the ash fusion behaviour (determined according to the standard method) and, for instance, the alkali metal content was found in the laboratory determinations. A more profound characterisation of the fuels would require gasification experiments in a thermobalance and a PDU (Process Development Unit) rig. (orig.) (10 refs.)

  15. THOR Turbulence Electron Analyser: TEA

    Science.gov (United States)

    Fazakerley, Andrew; Moore, Tom; Owen, Chris; Pollock, Craig; Wicks, Rob; Samara, Marilia; Rae, Jonny; Hancock, Barry; Kataria, Dhiren; Rust, Duncan

    2016-04-01

    Turbulence Heating ObserveR (THOR) is the first mission ever flown in space dedicated to plasma turbulence. The Turbulence Electron Analyser (TEA) will measure the plasma electron populations in the mission's Regions of Interest. It will collect a 3D electron velocity distribution with cadences as short as 5 ms. The instrument will be capable of measuring energies up to 30 keV. TEA consists of multiple electrostatic analyser heads arranged so as to measure electrons arriving from look directions covering the full sky, i.e. 4 pi solid angle. The baseline concept is similar to the successful FPI-DES instrument currently operating on the MMS mission. TEA is intended to have a similar angular resolution, but a larger geometric factor. In comparison to earlier missions, TEA improves on the measurement cadence. For example, MMS FPI-DES routinely operates at 30 ms cadence. The objective of measuring distributions at rates as fast as 5 ms is driven by the mission's scientific requirements to resolve electron gyroscale size structures, where plasma heating and fluctuation dissipation is predicted to occur. TEA will therefore be capable of making measurements of the evolution of distribution functions across thin (a few km) current sheets travelling past the spacecraft at up to 600 km/s, of the Power Spectral Density of fluctuations of electron moments and of distributions fast enough to match frequencies with waves expected to be dissipating turbulence (e.g. with 100 Hz whistler waves).

  16. Analyses of factors affecting prognosis of patients with sepsis and evaluation of their predicting values%脓毒症预后影响因素分析及预后价值评估

    Institute of Scientific and Technical Information of China (English)

    曾文美; 毛璞; 黄勇波; 庞晓清; 吴苏龙; 刘晓青; 黎毅敏

    2015-01-01

Objective To analyse the factors affecting the prognosis of patients with sepsis and to evaluate their value in predicting the disease outcome. Methods A prospective clinical study was conducted. Fifty-three septic patients admitted to the intensive care unit (ICU) of the First Affiliated Hospital of Guangzhou Medical University from October 17th, 2012 to August 8th, 2013 were enrolled, and 35 volunteers who had passed a physical check-up in the same period were assigned to the healthy control group. According to disease severity, the patients were divided into sepsis, severe sepsis and septic shock groups. Furthermore, based on their acute physiology and chronic health evaluation Ⅱ (APACHE Ⅱ) scores, the patients were divided into low-risk (APACHE Ⅱ scores 0.05). The IL-8 level of the non-coagulation-defect group was significantly lower than that of the adjusted group (ng/L: 24.67 vs. 27.23, P0.05). Conclusions The grade of sepsis severity, the APACHE Ⅱ score, the presence of coagulation dysfunction and whether it was corrected during the ICU stay, and the levels of blood lactate, PCT, IL-6 and IL-8 on the first ICU day are significantly correlated with the prognosis of septic patients. The presence of coagulation dysfunction, whether it was corrected, and the blood lactate level are independent prognostic factors in septic patients, and the plasma concentrations of IL-6 and IL-8 are independent factors affecting whether coagulation dysfunction occurs; they therefore have predictive value for the occurrence of coagulation dysfunction in septic patients.

  17. AMS analyses at ANSTO

    Energy Technology Data Exchange (ETDEWEB)

    Lawson, E.M. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia). Physics Division

    1998-03-01

The major use of ANTARES is Accelerator Mass Spectrometry (AMS) with {sup 14}C being the most commonly analysed radioisotope - presently about 35 % of the available beam time on ANTARES is used for {sup 14}C measurements. The accelerator measurements are supported by, and dependent on, a strong sample preparation section. The ANTARES AMS facility supports a wide range of investigations into fields such as global climate change, ice cores, oceanography, dendrochronology, anthropology, and classical and Australian archaeology. Described here are some examples of the ways in which AMS has been applied to support research into the archaeology, prehistory and culture of this continent's indigenous Aboriginal peoples. (author)

  18. Cost-Benefit Analyses of Transportation Investments

    DEFF Research Database (Denmark)

    Næss, Petter

    2006-01-01

This paper discusses the practice of cost-benefit analyses of transportation infrastructure investment projects from the meta-theoretical perspective of critical realism. Such analyses are based on a number of untenable ontological assumptions about social value, human nature and the natural environment. In addition, main input data are based on transport modelling analyses based on a misleading 'local ontology' among the model makers. The ontological misconceptions translate into erroneous epistemological assumptions about the possibility of precise predictions and the validity of willingness-to-pay investigations. Accepting the ontological and epistemological assumptions of cost-benefit analysis involves an implicit acceptance of the ethical and political values favoured by these assumptions. Cost-benefit analyses of transportation investment projects tend to neglect long-term environmental consequences…

  19. EEG analyses with SOBI.

    Energy Technology Data Exchange (ETDEWEB)

    Glickman, Matthew R.; Tang, Akaysha (University of New Mexico, Albuquerque, NM)

    2009-02-01

    The motivating vision behind Sandia's MENTOR/PAL LDRD project has been that of systems which use real-time psychophysiological data to support and enhance human performance, both individually and of groups. Relevant and significant psychophysiological data being a necessary prerequisite to such systems, this LDRD has focused on identifying and refining such signals. The project has focused in particular on EEG (electroencephalogram) data as a promising candidate signal because it (potentially) provides a broad window on brain activity with relatively low cost and logistical constraints. We report here on two analyses performed on EEG data collected in this project using the SOBI (Second Order Blind Identification) algorithm to identify two independent sources of brain activity: one in the frontal lobe and one in the occipital. The first study looks at directional influences between the two components, while the second study looks at inferring gender based upon the frontal component.
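The core of SOBI is second-order statistics: whiten the sensor mixtures, then diagonalise time-lagged covariance matrices to unmix sources with distinct temporal structure. A single-lag, AMUSE-style simplification of that idea is sketched below (this is not Sandia's implementation, and the two "sources" and the mixing matrix are synthetic).

```python
# AMUSE-style single-lag sketch of second-order blind source separation.
import numpy as np

rng = np.random.default_rng(1)
n = 5000
t = np.arange(n)

# Two synthetic sources with distinct temporal (autocorrelation) structure.
s1 = np.sin(2 * np.pi * t / 50)
s2 = np.sign(np.sin(2 * np.pi * t / 173))
S = np.vstack([s1, s2])

A = np.array([[1.0, 0.6],   # unknown mixing (e.g. scalp electrodes)
              [0.4, 1.0]])
X = A @ S + 0.01 * rng.standard_normal((2, n))
X = X - X.mean(axis=1, keepdims=True)

# 1. Whiten: make the zero-lag covariance the identity.
d, E = np.linalg.eigh(np.cov(X))
W = E @ np.diag(d ** -0.5) @ E.T
Z = W @ X

# 2. Diagonalise a symmetrised lagged covariance of the whitened data.
tau = 10
C = Z[:, :-tau] @ Z[:, tau:].T / (n - tau)
C = (C + C.T) / 2
_, V = np.linalg.eigh(C)

Y = V.T @ Z   # recovered sources (up to order, sign and scale)
```

Full SOBI jointly diagonalises covariances at many lags rather than one, which makes it more robust when two sources happen to share an autocorrelation value at the single chosen lag.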

  20. Micromechanical Analyses of Sturzstroms

    Science.gov (United States)

    Imre, Bernd; Laue, Jan; Springman, Sarah M.

    2010-05-01

Sturzstroms are very fast landslides of very large initial volume. As type features they display extreme run out, paired with intensive fragmentation of the involved blocks of rock within a collisional flow. The inherent danger to the growing communities in alpine valleys below future potential sites of sturzstroms must be examined, and the resulting predictions of endangered zones should inform the planning processes in these areas. This calls for the ability to make Type A predictions, according to Lambe (1973), which are made before an event. But Type A predictions are only possible if sufficient understanding of the mechanisms involved in a process is available. The motivation of the doctoral thesis research project presented is therefore to reveal the mechanics of sturzstroms in more detail in order to contribute to the development of a Type A run out prediction model. A sturzstrom clearly represents a highly dynamic collisional granular regime: particles do not only collide but eventually crush each other. Erismann and Abele (2001) describe this process as dynamic disintegration, where kinetic energy is the main driver for fragmenting the rock mass. An approach combining the type features long run out and fragmentation within a single hypothesis is represented by the dynamic fragmentation-spreading model (Davies and McSaveney, 2009; McSaveney and Davies, 2009). Unfortunately, sturzstroms, and fragmentation within sturzstroms, cannot be observed directly in a real event because of their long "reoccurrence time" and the obvious difficulties in placing measuring devices within such a rock flow. Therefore, rigorous modelling is required, in particular of the transition from static to dynamic behaviour, to achieve better knowledge of the mechanics of sturzstroms and to provide empirical evidence to confirm the dynamic fragmentation-spreading model. 
Within this study fragmentation and their effects on the mobility of sturzstroms

  1. Enhancing Technology-Mediated Communication: Tools, Analyses, and Predictive Models

    Science.gov (United States)

    2007-09-01

the home (see, for example, Nagel, Hudson, & Abowd, 2004), in social settings (see Kern, Antifakos, Schiele … on Computer Supported Cooperative Work (CSCW 2006), pp. 525-528, ACM Press. Kern, N., Antifakos, S., Schiele, B., & Schwaninger, A. (2004). A model

  2. Analysis of K-net and Kik-net data: implications for ground motion prediction - acceleration time histories, response spectra and nonlinear site response; Analyse des donnees accelerometriques de K-net et Kik-net: implications pour la prediction du mouvement sismique - accelerogrammes et spectres de reponse - et la prise en compte des effets de site non-lineaire

    Energy Technology Data Exchange (ETDEWEB)

    Pousse, G

    2005-10-15

This thesis intends to characterize ground motion during earthquakes. This work is based on two Japanese networks. It deals with databases of shallow events, with depths of less than 25 km and magnitudes between 4.0 and 7.3. The analysis of K-net makes it possible to compute a spectral ground-motion prediction equation and to review the shape of the Eurocode 8 design spectra. We show the larger amplification at short periods for Japanese data and bring to light the soil amplification that takes place at long periods. In addition, we develop a new empirical model for simulating synthetic stochastic nonstationary acceleration time histories. By specifying magnitude, distance and site effect, this model can produce the many time histories that a seismic event is liable to produce at the place of interest. Furthermore, the study of near-field borehole records of the Kik-net makes it possible to explore the validity domain of predictive equations and to examine what happens when ground-motion predictions are extrapolated. Finally, we show that nonlinearity reduces the dispersion of ground motion at the surface. (author)
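A spectral ground-motion prediction equation of the kind derived from K-net is a regression on magnitude, distance and site terms. The sketch below uses a common generic functional form with invented coefficients for illustration; these are not the values fitted in this thesis.

```python
# Generic GMPE functional form (all coefficient values are invented).
import math

def ln_psa(M, R_km, site_amp=0.0, a=-2.0, b=1.1, c=-1.3, h=8.0):
    """ln(spectral acceleration): linear in magnitude, geometric attenuation
    with distance (h is a fictitious focal depth term), plus a site term."""
    return a + b * M + c * math.log(math.sqrt(R_km ** 2 + h ** 2)) + site_amp

# Predicted shaking grows with magnitude and decays with distance:
assert ln_psa(6.5, 10) > ln_psa(5.0, 10)
assert ln_psa(6.0, 10) > ln_psa(6.0, 50)
```

In practice one such regression is fitted per spectral period, and the residual scatter about the fit is exactly the dispersion that the thesis examines at the surface versus in boreholes.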

  3. Severe Accident Recriticality Analyses (SARA)

    Energy Technology Data Exchange (ETDEWEB)

    Frid, W. [Swedish Nuclear Power Inspectorate, Stockholm (Sweden); Hoejerup, F. [Risoe National Lab. (Denmark); Lindholm, I.; Miettinen, J.; Puska, E.K. [VTT Energy, Helsinki (Finland); Nilsson, Lars [Studsvik Eco and Safety AB, Nykoeping (Sweden); Sjoevall, H. [Teoliisuuden Voima Oy (Finland)

    1999-11-01

    Recriticality in a BWR has been studied for a total loss of electric power accident scenario. In a BWR, the B{sub 4}C control rods would melt and relocate from the core before the fuel during core uncovery and heat-up. If electric power returns during this time-window unborated water from ECCS systems will start to reflood the partly control rod free core. Recriticality might take place for which the only mitigating mechanisms are the Doppler effect and void formation. In order to assess the impact of recriticality on reactor safety, including accident management measures, the following issues have been investigated in the SARA project: 1. the energy deposition in the fuel during super-prompt power burst, 2. the quasi steady-state reactor power following the initial power burst and 3. containment response to elevated quasi steady-state reactor power. The approach was to use three computer codes and to further develop and adapt them for the task. The codes were SIMULATE-3K, APROS and RECRIT. Recriticality analyses were carried out for a number of selected reflooding transients for the Oskarshamn 3 plant in Sweden with SIMULATE-3K and for the Olkiluoto 1 plant in Finland with all three codes. The core state initial and boundary conditions prior to recriticality have been studied with the severe accident codes SCDAP/RELAP5, MELCOR and MAAP4. The results of the analyses show that all three codes predict recriticality - both superprompt power bursts and quasi steady-state power generation - for the studied range of parameters, i. e. with core uncovery and heat-up to maximum core temperatures around 1800 K and water flow rates of 45 kg/s to 2000 kg/s injected into the downcomer. Since the recriticality takes place in a small fraction of the core the power densities are high which results in large energy deposition in the fuel during power burst in some accident scenarios. 
The highest value, 418 cal/g, was obtained with SIMULATE-3K for an Oskarshamn 3 case with reflooding

  4. The application analyses for primary spectrum pyrometer

    Institute of Scientific and Technical Information of China (English)

    FU; TaiRan

    2007-01-01

In applications of primary spectrum pyrometry, application issues such as the measurement range and the measurement partition were investigated through theoretical analyses, based on the dynamic range and the minimum sensitivity of the sensor. For a developed primary spectrum pyrometer, theoretical predictions of the measurement range and the distribution of measurement partitions were obtained through numerical simulations. Measurement experiments with a high-temperature blackbody and a standard temperature lamp were then performed to further verify the theoretical analyses and numerical results. The research in this paper therefore provides helpful support for the application of primary spectrum pyrometers and other radiation pyrometers.

  5. New Fossil Evidence on the Sister-Group of Mammals and Early Mesozoic Faunal Distributions

    Science.gov (United States)

    Shubin, Neil H.; Crompton, A. W.; Sues, Hans-Dieter; Olsen, Paul E.

    1991-03-01

    Newly discovered remains of highly advanced mammal-like reptiles (Cynodontia: Tritheledontidae) from the Early Jurassic of Nova Scotia, Canada, have revealed that aspects of the characteristic mammalian occlusal pattern are primitive. Mammals and tritheledontids share an homologous pattern of occlusion that is not seen in other cynodonts. The new tritheledontids represent the first definite record of this family from North America. The extreme similarity of North American and African tritheledontids supports the hypothesis that the global distribution of terrestrial tetrapods was homogeneous in the Early Jurassic. This Early Jurassic cosmopolitanism represents the continuation of a trend toward increased global homogeneity among terrestrial tetrapod communities that began in the late Paleozoic.

  6. Website-analyse

    DEFF Research Database (Denmark)

    Thorlacius, Lisbeth

    2009-01-01

…planning of the functional and content-related aspects of websites. There is a large body of theory and methodology books specialising in the technical problems of interaction and navigation, and in the linguistic content of websites. The Danish HCI (Human Computer Interaction… hyperfunctional websites. The primary concern of the HCI experts is to produce websites that are user-friendly. According to their guidelines, websites should be built with fast and efficient navigation and interaction structures, where users can obtain their information unimpeded by long download times… or dead ends when they visit the site. Studies of the design and analysis of the visual and aesthetic aspects of planning and using websites have, however, only been treated reflectively to a limited extent. That is the background for this chapter, which opens with a review of aesthetics…

  7. Prediction, Regression and Critical Realism

    DEFF Research Database (Denmark)

    Næss, Petter

    2004-01-01

    This paper considers the possibility of prediction in land use planning, and the use of statistical research methods in analyses of relationships between urban form and travel behaviour. Influential writers within the tradition of critical realism reject the possibility of predicting social...... of prediction necessary and possible in spatial planning of urban development. Finally, the political implications of positions within theory of science rejecting the possibility of predictions about social phenomena are addressed....

  8. Multiple Imputation for Network Analyses

    NARCIS (Netherlands)

    Krause, Robert; Huisman, Mark; Steglich, Christian; Snijders, Thomas

    2016-01-01

    Missing data on network ties is a fundamental problem for network analyses. The biases induced by missing edge data, even when missing completely at random (MCAR), are widely acknowledged and problematic for network analyses (Kossinets, 2006; Huisman & Steglich, 2008; Huisman, 2009). Although model-

  9. Olfactometric analyses or odors measurement by sensorial analyses; Analyses olfactometriques ou mesure des odeurs par analyse sensorielle

    Energy Technology Data Exchange (ETDEWEB)

    Gouronnec, A.M. [Institut de Radioprotection et de Surete Nucleaire (IRSN), 92 - Clamart (France)

    2004-06-15

The olfactometric analyses presented here are applied to industrial odours that can have harmful effects on people. The aim of olfactometric analyses is to quantify odours, to qualify them, or to attach a pleasant or unpleasant character to them (the notion of hedonic tone). The aim of this work is first to present the different measurements carried out, the different measurement methods used, and the current applications of each of the methods. (O.M.)

  10. Predicting future of predictive analysis

    OpenAIRE

    Piyush, Duggal

    2014-01-01

Enormous growth in analytical data, together with insight into the advantages of managing the future, brings predictive analysis into the picture. It has the potential to be called one of the most efficient and competitive technologies that give an edge to business operations. The possibility of predicting future market conditions and knowing customers' needs and behaviour in advance is an area of interest for every organization. Another area of interest is maintenance prediction, where we tend to predict when and where ...

  11. Mitogenomic analyses from ancient DNA

    DEFF Research Database (Denmark)

    Paijmans, Johanna L.A.; Gilbert, M Thomas P; Hofreiter, Michael

    2013-01-01

    analyses (whether using modern or ancient DNA) were largely restricted to the analysis of short fragments of the mitochondrial genome. However, due to many technological advances during the past decade, a growing number of studies have explored the power of complete mitochondrial genome sequences...... (mitogenomes). Such studies were initially limited to analyses of extant organisms, but developments in both DNA sequencing technologies and general methodological aspects related to working with degraded DNA have resulted in complete mitogenomes becoming increasingly popular for ancient DNA studies as well....... To date, at least 124 partially or fully assembled mitogenomes from more than 20 species have been obtained, and, given the rapid progress in sequencing technology, this number is likely to dramatically increase in the future. The increased information content offered by analysing full mitogenomes has...

  12. An extensible analysable system model

    DEFF Research Database (Denmark)

    Probst, Christian W.; Hansen, Rene Rydhof

    2008-01-01

Analysing real-world systems for vulnerabilities with respect to security and safety threats is a difficult undertaking, not least due to a lack of availability of formalisations for those systems. While both formalisations and analyses can be found for artificial systems such as software......, this does not hold for real physical systems. Approaches such as threat modelling try to target the formalisation of the real-world domain, but still are far from the rigid techniques available in security research. Many currently available approaches to assurance of critical infrastructure security......

  13. Predictive medicine

    NARCIS (Netherlands)

    Boenink, Marianne; Have, ten Henk

    2015-01-01

In the last part of the twentieth century, predictive medicine gained currency as an important ideal in biomedical research and health care. Research into the genetic and molecular basis of disease suggested that the insights gained might be used to develop tests that predict future health status

  14. Evaluation "Risk analyses of agroparks"

    NARCIS (Netherlands)

    Ge, L.

    2011-01-01

This TransForum project focuses on analysing the uncertainties and opportunities of agroparks. It has resulted in a risk model that maps the qualitative and/or quantitative uncertainties of an agropark project. With this model, measures and management strategies can be identified

  15. Descriptive Analyses of Mechanical Systems

    DEFF Research Database (Denmark)

    Andreasen, Mogens Myrup; Hansen, Claus Thorp

    2003-01-01

Preface. Product analysis and technology analysis can be carried out with a broad socio-technical scope, aiming to understand cultural, sociological, design-related, commercial and many other aspects. One sub-area of this is the systemic analysis and description of products and systems. The present compend...

  16. Accurate renormalization group analyses in neutrino sector

    Energy Technology Data Exchange (ETDEWEB)

    Haba, Naoyuki [Graduate School of Science and Engineering, Shimane University, Matsue 690-8504 (Japan); Kaneta, Kunio [Kavli IPMU (WPI), The University of Tokyo, Kashiwa, Chiba 277-8568 (Japan); Takahashi, Ryo [Graduate School of Science and Engineering, Shimane University, Matsue 690-8504 (Japan); Yamaguchi, Yuya [Department of Physics, Faculty of Science, Hokkaido University, Sapporo 060-0810 (Japan)

    2014-08-15

We investigate accurate renormalization group analyses in the neutrino sector between the ν-oscillation and seesaw energy scales. We consider the decoupling effects of the top quark and the Higgs boson on the renormalization group equations of the light neutrino mass matrix. Since the decoupling effects arise at the standard model scale and are independent of high energy physics, our method can in principle be applied to any model beyond the standard model. We find that the decoupling effects of the Higgs boson are negligible, while those of the top quark are not. In particular, the decoupling effects of the top quark affect the neutrino mass eigenvalues, which are important for analyzing predictions such as mass squared differences and neutrinoless double beta decay in an underlying theory existing at a high energy scale.

  17. Workload analyse of assembling process

    Science.gov (United States)

    Ghenghea, L. D.

    2015-11-01

Workload is the most important indicator for managers responsible for industrial technological processes; whether these are automated, mechanized or simply manual, in each case machines or workers will be the focus of workload measurements. The paper deals with workload analyses of a largely manual assembly technology for a roller-bearing assembly process, carried out in a big company with integrated bearing manufacturing processes. In these analyses the delay sample technique was used to identify and divide all bearing assemblers' activities, and to obtain information about how much of the 480-minute working day the workers devote to each activity. The study shows some ways to increase process productivity without supplementary investment, and also indicates that process automation could be the solution for gaining maximum productivity.
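The time-share computation behind the delay sample (work sampling) technique mentioned in this abstract can be sketched as follows; the activity names and observation counts are purely illustrative, not taken from the study:

```python
# Hypothetical work-sampling tally: random-moment observations per activity.
# Names and counts are illustrative only; the study does not report them.
observations = {
    "assembling": 210,
    "handling": 90,
    "waiting": 60,
    "inspection": 40,
}

SHIFT_MINUTES = 480  # one working day, as in the study
total = sum(observations.values())

# Each activity's share of observations estimates its share of shift time.
minutes_per_activity = {
    activity: SHIFT_MINUTES * count / total
    for activity, count in observations.items()
}

for activity, minutes in minutes_per_activity.items():
    print(f"{activity}: {minutes:.0f} min")
```

The estimated minutes sum back to the 480-minute shift by construction, which is what makes work sampling a cheap substitute for continuous time study.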

  18. Multivariate Evolutionary Analyses in Astrophysics

    CERN Document Server

    Fraix-Burnet, Didier

    2011-01-01

The large amount of data on galaxies, up to higher and higher redshifts, calls for sophisticated statistical approaches to build adequate classifications. Multivariate cluster analyses, which compare objects by their global similarities, are still seldom used in astrophysics, probably because their results are somewhat difficult to interpret. We believe that the missing key is an unavoidable characteristic of our Universe: evolution. Our approach, known as Astrocladistics, is based on the evolutionary nature of both galaxies and their properties. It gathers objects according to their "histories" and establishes an evolutionary scenario among groups of objects. In this presentation, I show two recent results on globular clusters and early-type galaxies to illustrate how the evolutionary concepts of Astrocladistics can also be useful for multivariate analyses such as K-means Cluster Analysis.

  19. Analysing ESP Texts, but How?

    OpenAIRE

    Borza Natalia

    2015-01-01

    English as a second language (ESL) teachers instructing general English and English for specific purposes (ESP) in bilingual secondary schools face various challenges when it comes to choosing the main linguistic foci of language preparatory courses enabling non-native students to study academic subjects in English. ESL teachers intending to analyse English language subject textbooks written for secondary school students with the aim of gaining information about what bilingual secondary schoo...

  20. DebriSat Laboratory Analyses

    Science.gov (United States)

    2015-01-05

Techniques applied: semiquantitative elemental composition with elemental mapping and line scans; Fourier Transform Infrared (FTIR) spectroscopy for identification of chemical species, using a Nicolet 6700 spectrometer with a Harrick Scientific "praying mantis" diffuse reflectance accessory; and qualitative UV-VIS-NIR spectroscopy. DebriSat Laboratory Analyses, 5 January 2015; Paul M. Adams, Zachary Lingley, Dianna Alaan (The Aerospace Corporation).

  1. Analyses of containment structures with corrosion damage

    Energy Technology Data Exchange (ETDEWEB)

    Cherry, J.L. [Sandia National Labs., Albuquerque, NM (United States)

    1997-01-01

Corrosion damage that has been found in a number of nuclear power plant containment structures can degrade the pressure capacity of the vessel. This has prompted concerns regarding the capacity of corroded containments to withstand accident loadings. To address these concerns, finite element analyses have been performed for a typical PWR Ice Condenser containment structure. Using ABAQUS, the pressure capacity was calculated for a typical vessel with no corrosion damage. Multiple analyses were then performed with the location of the corrosion and the amount of corrosion varied in each analysis. Using a strain-based failure criterion, a "lower bound", "best estimate", and "upper bound" failure level was predicted for each case. These limits were established by: determining the amount of variability that exists in material properties of typical containments, estimating the amount of uncertainty associated with the level of modeling detail and modeling assumptions, and estimating the effect of corrosion on the material properties.

  2. Analyses of a Virtual World

    CERN Document Server

    Holovatch, Yurij; Szell, Michael; Thurner, Stefan

    2016-01-01

    We present an overview of a series of results obtained from the analysis of human behavior in a virtual environment. We focus on the massive multiplayer online game (MMOG) Pardus which has a worldwide participant base of more than 400,000 registered players. We provide evidence for striking statistical similarities between social structures and human-action dynamics in the real and virtual worlds. In this sense MMOGs provide an extraordinary way for accurate and falsifiable studies of social phenomena. We further discuss possibilities to apply methods and concepts developed in the course of these studies to analyse oral and written narratives.

  3. Experimental review on moment analyses

    CERN Document Server

    Calvi, M

    2003-01-01

Moments of the photon energy spectrum in B->Xs gamma decays, and of the hadronic mass spectrum and lepton energy spectrum in B->Xc l nu decays, are sensitive to the masses of the heavy quarks as well as to the non-perturbative parameters of the heavy quark expansion. Several measurements have been performed both at the Upsilon(4S) resonance and at Z0 centre-of-mass energies. They provide constraints on the non-perturbative parameters, test the consistency of the theoretical predictions and of the underlying assumptions, and allow the uncertainty in the |Vcb| extraction to be reduced.

  4. Fracturing and brittleness index analyses of shales

    Science.gov (United States)

    Barnhoorn, Auke; Primarini, Mutia; Houben, Maartje

    2016-04-01

The formation of a fracture network in rocks has a crucial control on the flow behaviour of fluids. In addition, an existing network of fractures influences the propagation of new fractures during e.g. hydraulic fracturing or a seismic event. Understanding the type and characteristics of the fracture network that will be formed during e.g. hydraulic fracturing is thus crucial to better predict the outcome of a hydraulic fracturing job. For this, knowledge of the rock properties is crucial. The brittleness index is often used as a rock property to predict the fracturing behaviour of a rock for e.g. hydraulic fracturing of shales. Various formulations of the brittleness index (BI1, BI2 and BI3) exist, based on mineralogy, elastic constants and stress-strain behaviour (Jin et al., 2014; Jarvie et al., 2007; Holt et al., 2011). A maximum brittleness index of 1 predicts very good and efficient fracturing behaviour, while a minimum brittleness index of 0 predicts much more ductile shale behaviour. Here, we have performed systematic petrophysical, acoustic and geomechanical analyses on a set of shale samples from Whitby (UK) and determined the three brittleness indices for each sample. We show that the three brittleness indices are very different for the same sample, and as such it can be concluded that the brittleness index is not a good predictor of the fracturing behaviour of shales. The brittleness indices based on the acoustic data (BI1) all lie around values of 0.5, the brittleness indices based on the stress-strain data (BI2) average around 0.75, whereas the mineralogy brittleness index (BI3) predicts values below 0.2. This shows that different estimates of the brittleness index can lead to different decisions for hydraulic fracturing. If we were to rely on the mineralogy (BI3), the Whitby mudstone is not a suitable
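A mineralogy-based brittleness index of the kind cited in this abstract (after Jarvie et al., 2007) is commonly computed as the weight fraction of quartz relative to quartz plus carbonate plus clay. A minimal sketch, with illustrative mineral fractions rather than the measured Whitby values:

```python
def bi_mineralogy(quartz: float, carbonate: float, clay: float) -> float:
    """Mineralogy-based brittleness index in [0, 1].

    One common formulation (after Jarvie et al., 2007): the weight
    fraction of quartz relative to quartz + carbonate + clay.
    """
    return quartz / (quartz + carbonate + clay)

# Illustrative clay-rich composition (NOT measured Whitby data):
bi = bi_mineralogy(quartz=0.15, carbonate=0.10, clay=0.75)
print(round(bi, 2))  # a low BI, consistent with ductile behaviour
```

A clay-rich shale like this scores far below the 0.5 often taken as the threshold for efficient fracturing, which mirrors the abstract's point that the mineralogy index (BI3) can disagree sharply with acoustic and stress-strain estimates.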

  5. Analysing ESP Texts, but How?

    Directory of Open Access Journals (Sweden)

    Borza Natalia

    2015-03-01

English as a second language (ESL) teachers instructing general English and English for specific purposes (ESP) in bilingual secondary schools face various challenges when it comes to choosing the main linguistic foci of language preparatory courses enabling non-native students to study academic subjects in English. ESL teachers intending to analyse English language subject textbooks written for secondary school students, with the aim of gaining information about what bilingual secondary school students need to know in terms of language to process academic textbooks, cannot avoid dealing with a dilemma: it needs to be decided which way of analysing the texts in question is most appropriate. Handbooks of English applied linguistics are not immensely helpful with regard to this problem, as they tend not to give recommendations as to which major text-analytical approaches are advisable to follow in a pre-college setting. The present theoretical research aims to address this lacuna. Respectively, the purpose of this pedagogically motivated theoretical paper is to investigate two major approaches to ESP text analysis, register analysis and genre analysis, in order to find the one more suitable for exploring the language use of secondary school subject texts from the point of view of an English as a second language teacher. Comparing and contrasting the merits and limitations of the two approaches allows for a better understanding of the nature of the two different perspectives on text analysis. The study examines the goals, the scope of analysis, and the achievements of the register perspective and those of the genre approach alike. The paper also investigates and reviews in detail the starkly different methods of ESP text analysis applied by the two perspectives. Exploring text analysis from a theoretical and methodological angle supports a practical aspect of English teaching, namely making an informed choice when setting out to analyse

  6. Perturbation analyses of intermolecular interactions

    Science.gov (United States)

    Koyama, Yohei M.; Kobayashi, Tetsuya J.; Ueda, Hiroki R.

    2011-08-01

    Conformational fluctuations of a protein molecule are important to its function, and it is known that environmental molecules, such as water molecules, ions, and ligand molecules, significantly affect the function by changing the conformational fluctuations. However, it is difficult to systematically understand the role of environmental molecules because intermolecular interactions related to the conformational fluctuations are complicated. To identify important intermolecular interactions with regard to the conformational fluctuations, we develop herein (i) distance-independent and (ii) distance-dependent perturbation analyses of the intermolecular interactions. We show that these perturbation analyses can be realized by performing (i) a principal component analysis using conditional expectations of truncated and shifted intermolecular potential energy terms and (ii) a functional principal component analysis using products of intermolecular forces and conditional cumulative densities. We refer to these analyses as intermolecular perturbation analysis (IPA) and distance-dependent intermolecular perturbation analysis (DIPA), respectively. For comparison of the IPA and the DIPA, we apply them to the alanine dipeptide isomerization in explicit water. Although the first IPA principal components discriminate two states (the α state and PPII (polyproline II) + β states) for larger cutoff length, the separation between the PPII state and the β state is unclear in the second IPA principal components. On the other hand, in the large cutoff value, DIPA eigenvalues converge faster than that for IPA and the top two DIPA principal components clearly identify the three states. By using the DIPA biplot, the contributions of the dipeptide-water interactions to each state are analyzed systematically. Since the DIPA improves the state identification and the convergence rate with retaining distance information, we conclude that the DIPA is a more practical method compared with the

  7. Prediction Markets

    DEFF Research Database (Denmark)

    Horn, Christian Franz; Ivens, Bjørn Sven; Ohneberg, Michael

    2014-01-01

    In recent years, Prediction Markets gained growing interest as a forecasting tool among researchers as well as practitioners, which resulted in an increasing number of publications. In order to track the latest development of research, comprising the extent and focus of research, this article...

  8. Analysing performance through value creation

    Directory of Open Access Journals (Sweden)

    Adrian TRIFAN

    2015-12-01

This paper draws a parallel between two ways of measuring financial performance: the first uses data offered by accounting and lays emphasis on maximizing profit, while the second aims at creating value. The traditional approach to performance is based on indicators drawn from accounting data: ROI, ROE, EPS. Traditional management, based on analysing accounting data, has shown its limits, and a new approach is needed, based on creating value. The evaluation of value-based performance tries to avoid the errors due to accounting data by using other specific indicators: EVA, MVA, TSR, CVA. The main objective shifts from maximizing income to maximizing the value created for shareholders. The theoretical part is accompanied by a practical analysis regarding the creation of value and an analysis of the main indicators which evaluate this concept.
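Of the value-based indicators this abstract lists, EVA has the simplest standard definition: net operating profit after tax (NOPAT) minus a charge for the capital employed. A minimal sketch with purely illustrative figures (not drawn from the paper's practical analysis):

```python
def economic_value_added(nopat: float, wacc: float, invested_capital: float) -> float:
    """EVA = NOPAT minus the capital charge (WACC times invested capital).

    Positive EVA means the business earned more than its cost of capital,
    i.e. it created value for shareholders; negative EVA means it destroyed value.
    """
    return nopat - wacc * invested_capital

# Illustrative figures only: 120 NOPAT, 10% cost of capital, 1000 capital.
eva = economic_value_added(nopat=120.0, wacc=0.10, invested_capital=1000.0)
print(round(eva, 2))  # capital charge is 100, so value created is about 20
```

The example makes the paper's central contrast concrete: a firm can report a healthy accounting profit (NOPAT of 120) yet create only modest value once the cost of the capital tied up in the business is charged against it.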

9. Rhetorical Analysis of Historical Texts [Retorisk analyse af historiske tekster]

    DEFF Research Database (Denmark)

    Kock, Christian Erik J

    2014-01-01

    In recent years, rhetoric and the rhetorical tradition has attracted increasing interest from historians, such as, e.g., Quentin Skinner. The paper aims to explain and illustrate what may be understood by a rhetorical analysis (or “rhetorical criticism”) of historical documents, i.e., how those...... scholars who identify themselves as rhetoricians tend to define and conduct such an analysis. It is argued that while rhetoricians would sympathize with Skinner’s adoption of speech act theory in his reading of historical documents, they would generally extend their rhetorical readings of such documents...... to many more features than just the key concepts invoked in them. The paper discusses examples of rhetorical analyses done by prominent contemporary rhetoricians, including Edwin Black, Kenneth Burke, Maurice Charland, and Michael Leff. It relates its view of rhetorical documents to trends in current...

  10. Analysing Protocol Stacks for Services

    DEFF Research Database (Denmark)

    Gao, Han; Nielson, Flemming; Nielson, Hanne Riis

    2011-01-01

    We show an approach, CaPiTo, to model service-oriented applications using process algebras such that, on the one hand, we can achieve a certain level of abstraction without being overwhelmed by the underlying implementation details and, on the other hand, we respect the concrete industrial...... standards used for implementing the service-oriented applications. By doing so, we will be able to not only reason about applications at different levels of abstractions, but also to build a bridge between the views of researchers on formal methods and developers in industry. We apply our approach...... to the financial case study taken from Chapter 0-3. Finally, we develop a static analysis to analyse the security properties as they emerge at the level of concrete industrial protocols....

  11. HGCal Simulation Analyses for CMS

    CERN Document Server

    Bruno, Sarah Marie

    2015-01-01

    This summer, I approached the topic of fast-timing detection of photons from Higgs decays via simulation analyses, working under the supervision of Dr. Adolf Bornheim of the California Institute of Technology. My specific project focused on simulating the high granularity calorimeter for the Compact Muon Solenoid (CMS) experiment. CMS detects particles using calorimeters. The Electromagnetic Calorimeter (ECal) is arranged cylindrically to form a barrel section and two “endcaps.” Previously, both the barrel and endcap have employed lead tungstate crystal detectors, known as the “shashlik” design. The crystal detectors, however, rapidly degrade from exposure to radiation. This effect is most pronounced in the endcaps. To avoid the high expense of frequently replacing degraded detectors, it was recently decided to eliminate the endcap crystals in favor of an arrangement of silicon detectors known as the “High Granularity Calorimeter” (HGCal), while leaving the barrel detector technology unchanged. T...

  12. Analysing the Wrongness of Killing

    DEFF Research Database (Denmark)

    Di Nucci, Ezio

    2014-01-01

This article provides an in-depth analysis of the wrongness of killing by comparing different versions of three influential views: the traditional view that killing is always wrong; the liberal view that killing is wrong if and only if the victim does not want to be killed; and Don Marquis' future...... of value account of the wrongness of killing. In particular, I illustrate the advantages that a basic version of the liberal view and a basic version of the future of value account have over competing alternatives. Still, ultimately none of the views analysed here are satisfactory; but the different...... reasons why those competing views fail provide important insights into the ethics of killing....

  13. Hydrogen Analyses in the EPR

    Energy Technology Data Exchange (ETDEWEB)

    Worapittayaporn, S.; Eyink, J.; Movahed, M. [AREVA NP GmbH, P.O. Box 3220, D-91050 Erlangen (Germany)

    2008-07-01

    In severe accidents with core melting large amounts of hydrogen may be released into the containment. The EPR provides a combustible gas control system to prevent hydrogen combustion modes with the potential to challenge the containment integrity due to excessive pressure and temperature loads. This paper outlines the approach for the verification of the effectiveness and efficiency of this system. Specifically, the justification is a multi-step approach. It involves the deployment of integral codes, lumped parameter containment codes and CFD codes and the use of the sigma criterion, which provides the link to the broad experimental data base for flame acceleration (FA) and deflagration to detonation transition (DDT). The procedure is illustrated with an example. The performed analyses show that hydrogen combustion at any time does not lead to pressure or temperature loads that threaten the containment integrity of the EPR. (authors)

  14. Proteins analysed as virtual knots

    CERN Document Server

    Alexander, Keith; Dennis, Mark R

    2016-01-01

    Long, flexible physical filaments are naturally tangled and knotted, from macroscopic string down to long-chain molecules. The existence of knotting in a filament naturally affects its configuration and properties, and may be very stable or disappear rapidly under manipulation and interaction. Knotting has been previously identified in protein backbone chains, for which these mechanical constraints are of fundamental importance to their molecular functionality, despite their being open curves in which the knots are not mathematically well defined; knotting can only be identified by closing the termini of the chain somehow. We introduce a new method for resolving knotting in open curves using virtual knots, a wider class of topological objects that do not require a classical closure and so naturally capture the topological ambiguity inherent in open curves. We describe the results of analysing proteins in the Protein Data Bank by this new scheme, recovering and extending previous knotting results, and identify...

  15. Predicting Anthracycline Benefit

    DEFF Research Database (Denmark)

    Bartlett, John M S; McConkey, Christopher C; Munro, Alison F

    2015-01-01

as measured with a centromere enumeration probe (CEP17) predicted sensitivity to anthracyclines, we report here an individual patient-level pooled analysis of data from five trials comparing anthracycline-based chemotherapy with CMF (cyclophosphamide, methotrexate, and fluorouracil) as adjuvant chemotherapy.......6% (TOP2A) of 3,846 patient cases with available tissue. Both CEP17 and TOP2A treatment-by-marker interactions remained significant in adjusted analyses for recurrence-free and overall survival, whereas HER2 did not. A combined CEP17 and TOP2A-adjusted model predicted anthracycline benefit across all five...... trials for both recurrence-free (hazard ratio, 0.64; 95% CI, 0.51 to 0.82; P = .001) and overall survival (hazard ratio, 0.66; 95% CI, 0.51 to 0.85; P = .005). CONCLUSION: This prospectively planned individual-patient pooled analysis of patient cases from five adjuvant trials confirms that patients whose...

  16. Predicting Leakage in Labyrinth Seals

    Science.gov (United States)

    Morrison, G. L.; Rhode, D. L.; Cogan, K. C.; Chi, D.; Demko, J.

    1985-01-01

    Analytical and empirical methods evaluated. 264-page report presents comprehensive information on leakage in labyrinth seals. Summarizes previous analyses of leakage, reviews leakage tests conducted by authors and evaluates various analytical and experimental methods of determining leakage and discusses leakage prediction techniques.

  17. Proteins analysed as virtual knots

    Science.gov (United States)

    Alexander, Keith; Taylor, Alexander J.; Dennis, Mark R.

    2017-02-01

    Long, flexible physical filaments are naturally tangled and knotted, from macroscopic string down to long-chain molecules. The existence of knotting in a filament naturally affects its configuration and properties, and may be very stable or disappear rapidly under manipulation and interaction. Knotting has been previously identified in protein backbone chains, for which these mechanical constraints are of fundamental importance to their molecular functionality, despite their being open curves in which the knots are not mathematically well defined; knotting can only be identified by closing the termini of the chain somehow. We introduce a new method for resolving knotting in open curves using virtual knots, which are a wider class of topological objects that do not require a classical closure and so naturally capture the topological ambiguity inherent in open curves. We describe the results of analysing proteins in the Protein Data Bank by this new scheme, recovering and extending previous knotting results, and identifying topological interest in some new cases. The statistics of virtual knots in protein chains are compared with those of open random walks and Hamiltonian subchains on cubic lattices, identifying a regime of open curves in which the virtual knotting description is likely to be important.

  18. Social Media Analyses for Social Measurement.

    Science.gov (United States)

    Schober, Michael F; Pasek, Josh; Guggenheim, Lauren; Lampe, Cliff; Conrad, Frederick G

    2016-01-01

Demonstrations that analyses of social media content can align with measurement from sample surveys have raised the question of whether survey research can be supplemented or even replaced with less costly and burdensome data mining of already-existing or "found" social media content. But just how trustworthy such measurement can be, say, to replace official statistics, is unknown. Survey researchers and data scientists approach key questions from starting assumptions and analytic traditions that differ on, for example, the need for representative samples drawn from frames that fully cover the population. New conversations between these scholarly communities are needed to understand the potential points of alignment and non-alignment. Across these approaches, there are major differences in (a) how participants (survey respondents and social media posters) understand the activity they are engaged in; (b) the nature of the data produced by survey responses and social media posts, and the inferences that are legitimate given the data; and (c) practical and ethical considerations surrounding the use of the data. Estimates are likely to align to differing degrees depending on the research topic and the populations under consideration, the particular features of the surveys and social media sites involved, and the analytic techniques for extracting opinions and experiences from social media. Traditional population coverage may not be required for social media content to effectively predict social phenomena to the extent that social media content distills or summarizes broader conversations that are also measured by surveys.

  19. ITER Safety Analyses with ISAS

    Science.gov (United States)

    Gulden, W.; Nisan, S.; Porfiri, M.-T.; Toumi, I.; de Gramont, T. Boubée

    1997-06-01

    Detailed analyses of accident sequences for the International Thermonuclear Experimental Reactor (ITER), from an initiating event to the environmental release of activity, have involved in the past the use of different types of computer codes in a sequential manner. Since these codes were developed at different time scales in different countries, there is no common computing structure to enable automatic data transfer from one code to the other, and no possibility exists to model or to quantify the effect of coupled physical phenomena. To solve this problem, the Integrated Safety Analysis System of codes (ISAS) is being developed, which allows users to integrate existing computer codes in a coherent manner. This approach is based on the utilization of a command language (GIBIANE) acting as a “glue” to integrate the various codes as modules of a common environment. The present version of ISAS allows comprehensive (coupled) calculations of a chain of codes such as ATHENA (thermal-hydraulic analysis of transients and accidents), INTRA (analysis of in-vessel chemical reactions, pressure build-up, and distribution of reaction products inside the vacuum vessel and adjacent rooms), and NAUA (transport of radiological species within buildings and to the environment). In the near future, the integration of SAFALY (simultaneous analysis of plasma dynamics and thermal behavior of in-vessel components) is also foreseen. The paper briefly describes the essential features of ISAS development and the associated software architecture. It gives first results of a typical ITER accident sequence, a loss of coolant accident (LOCA) in the divertor cooling loop inside the vacuum vessel, amply demonstrating ISAS capabilities.
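    The coupling idea, a command layer that hands one code's output to the next code as input, can be sketched generically. Everything below (the `Coupler` class, module names, bus keys, and the toy physics) is an invented stand-in to illustrate the architecture, not the real GIBIANE/ISAS interfaces.

```python
from typing import Any, Callable, Dict

Module = Callable[[Dict[str, Any]], Dict[str, Any]]

class Coupler:
    """Toy 'glue' layer in the spirit of GIBIANE: analysis modules are
    registered and run as a chain, each reading from and writing to a
    shared data bus, so downstream codes see upstream results."""

    def __init__(self) -> None:
        self.modules: Dict[str, Module] = {}

    def register(self, name: str, fn: Module) -> None:
        self.modules[name] = fn

    def run_chain(self, names, bus):
        for name in names:
            bus.update(self.modules[name](bus))
        return bus

# Hypothetical stand-ins for an ATHENA -> INTRA -> NAUA style chain
def thermal_hydraulics(bus):   # LOCA releases a fraction of the coolant
    return {"steam_kg": bus["coolant_kg"] * 0.8}

def in_vessel_chemistry(bus):  # pressure build-up from the steam inventory
    return {"pressure_kpa": 100.0 + 0.5 * bus["steam_kg"]}

def release_transport(bus):    # source term scaled by overpressure
    return {"release_fraction": min(1.0, bus["pressure_kpa"] / 1e4)}

coupler = Coupler()
coupler.register("thermal", thermal_hydraulics)
coupler.register("chemistry", in_vessel_chemistry)
coupler.register("transport", release_transport)
result = coupler.run_chain(["thermal", "chemistry", "transport"],
                           {"coolant_kg": 500.0})
```

    The design point is the shared bus: because every module reads and writes a common environment, coupled effects can propagate along the chain without hand-carried file transfers between codes.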

  20. Predictive role of the nighttime blood pressure

    DEFF Research Database (Denmark)

    Hansen, Tine W; Li, Yan; Boggia, José;

    2011-01-01

    Numerous studies addressed the predictive value of the nighttime blood pressure (BP) as captured by ambulatory monitoring. However, arbitrary cutoff limits in dichotomized analyses of continuous variables, data dredging across selected subgroups, and extrapolation of cross-sectional studies to prospective...

  1. Aerothermodynamic Analyses of Towed Ballutes

    Science.gov (United States)

    Gnoffo, Peter A.; Buck, Greg; Moss, James N.; Nielsen, Eric; Berger, Karen; Jones, William T.; Rudavsky, Rena

    2006-01-01

    A ballute (balloon-parachute) is an inflatable, aerodynamic drag device for application to planetary entry vehicles. Two challenging aspects of aerothermal simulation of towed ballutes are considered. The first challenge, simulation of a complete system including inflatable tethers and a trailing toroidal ballute, is addressed using the unstructured-grid, Navier-Stokes solver FUN3D. Auxiliary simulations of a semi-infinite cylinder using the rarefied-flow, Direct Simulation Monte Carlo solver DSV2 provide additional insight into the limiting behavior of the aerothermal environment around tethers directly exposed to the free stream. Simulations reveal pressures higher than stagnation and correspondingly large heating rates on the tether as it emerges from the spacecraft base flow and passes through the spacecraft bow shock. The footprint of the tether shock on the toroidal ballute is also subject to heating amplification. Design options to accommodate or reduce these environments are discussed. The second challenge addresses time-accurate simulation to detect the onset of unsteady flow interactions as a function of geometry and Reynolds number. Video of unsteady interactions measured in the Langley Aerothermodynamic Laboratory 20-Inch Mach 6 Air Tunnel and CFD simulations using the structured-grid, Navier-Stokes solver LAURA are compared for flow over a rigid spacecraft-sting-toroid system. The experimental data provide qualitative information on the amplitude and onset of unsteady motion, which is captured in the numerical simulations. The presence of severe unsteady fluid-structure interactions is undesirable, and numerical simulation must be able to predict the onset of such motion.

  2. Nonlinear Analyses of the Dynamic Properties of Hydrostatic Bearing Systems

    Institute of Scientific and Technical Information of China (English)

    LIU Wei(刘伟); WU Xiujiang(吴秀江); V.A. Prokopenko

    2003-01-01

    Nonlinear analyses of hydrostatic bearing systems are necessary to adequately model the fluid-solid interaction. The dynamic properties of linear and nonlinear analytical models of hydrostatic bearings are compared in this paper. The analyses were based on the determination of the aperiodic border of transient processes with external step loads. The results show that the dynamic properties can be most effectively improved by increasing the hydrostatic bearing crosspiece width. Additional pocket volume in a bearing can extend the load range for which the transient process is aperiodic, but an additional restrictor and capacitor (RC) chain must be introduced to increase damping. The nonlinear analyses can also be used to predict typical design parameters for a hydrostatic bearing.

  3. Genetic Analyses in Health Laboratories: Current Status and Expectations

    Science.gov (United States)

    Finotti, Alessia; Breveglieri, Giulia; Borgatti, Monica; Gambari, Roberto

    Genetic analyses performed in health laboratories involve adult patients, newborns, embryos/fetuses, pre-implanted pre-embryos, pre-fertilized oocytes and should meet the major medical needs of hospitals and pharmaceutical companies. Recent data support the concept that, in addition to diagnosis and prognosis, genetic analyses might lead to development of personalized therapy. Novel frontiers in genetic testing involve the development of single cell analyses and non-invasive assays, including those able to predict outcome of cancer pathologies by looking at circulating tumor cells, DNA, mRNA and microRNAs. In this respect, PCR-free diagnostics appears to be one of the most interesting and appealing approaches.

  4. Pawnee Nation Energy Option Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Matlock, M.; Kersey, K.; Riding In, C.

    2009-07-21

    Pawnee Nation of Oklahoma Energy Option Analyses In 2003, the Pawnee Nation leadership identified the need for the tribe to comprehensively address its energy issues. During a strategic energy planning workshop a general framework was laid out and the Pawnee Nation Energy Task Force was created to work toward further development of the tribe’s energy vision. The overarching goals of the “first steps” project were to identify the most appropriate focus for its strategic energy initiatives going forward, and to provide information necessary to take the next steps in pursuit of the “best fit” energy options. Description of Activities Performed The research team reviewed existing data pertaining to the availability of biomass (focusing on woody biomass, agricultural biomass/bio-energy crops, and methane capture), solar, wind and hydropower resources on the Pawnee-owned lands. Using these data, combined with assumptions about costs and revenue streams, the research team performed preliminary feasibility assessments for each resource category. The research team also reviewed available funding resources and made recommendations to Pawnee Nation highlighting those resources with the greatest potential for financially-viable development, both in the near-term and over a longer time horizon. Findings and Recommendations Due to a lack of financial incentives for renewable energy, particularly at the state level, combined with mediocre renewable energy resources, renewable energy development opportunities are limited for Pawnee Nation. However, near-term potential exists for development of solar hot water at the gym, and an exterior wood-fired boiler system at the tribe’s main administrative building. Pawnee Nation should also explore options for developing LFGTE resources in collaboration with the City of Pawnee. Significant potential may also exist for development of bio-energy resources within the next decade. Pawnee Nation representatives should closely monitor

  5. Techniques for Analysing Problems in Engineering Projects

    DEFF Research Database (Denmark)

    Thorsteinsson, Uffe

    1998-01-01

    Description of how CPM networks can be used for analysing complex problems in engineering projects.

  6. Analyses of bundle experiment data using MATRA-h

    Energy Technology Data Exchange (ETDEWEB)

    Lim, In Cheol; Chea, Hee Taek [Korea Atomic Energy Research Institute, Taejon (Korea)

    1998-06-01

    When the construction and operation license for HANARO was renewed in 1995, a 25% CHF penalty was imposed. The reason for this was that the validation work related to the CHF design calculation was not sufficient to assure the CHF margin. As part of the work to recover this CHF penalty, MATRA-h was developed by implementing new correlations for heat transfer, CHF prediction, and subcooled void into MATRA-a, which is the modified version of COBRA-IV-I developed by KAERI. Using MATRA-h, subchannel analyses of the bundle experiment data were performed. From the comparison of the code predictions with the experimental results, it was found that the code gives conservative predictions as far as CHF in the bundle geometry is concerned. (author). 12 refs., 25 figs., 16 tabs.

  7. Automatic incrementalization of Prolog based static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Kahl, Matthias; Saha, Diptikalyan;

    2007-01-01

    Modern development environments integrate various static analyses into the build process. Analyses that analyze the whole project whenever the project changes are impractical in this context. We present an approach to automatic incrementalization of analyses that are specified as tabled logic prog...

  8. Predicting Group Evolution in the Social Network

    OpenAIRE

    Bródka, Piotr; Kazienko, Przemysław; Kołoszczyk, Bartosz

    2012-01-01

    Groups (social communities) are important components of entire societies, analysed by means of the social network concept. Their immanent feature is continuous evolution over time. If we know how groups in the social network have evolved, we can use this information and try to predict the next step in the given group's evolution. In the paper, a new approach for group evolution prediction is presented and examined. Experimental studies on four evolving social networks revealed that (i) the predict...

  9. Computational Analyses of Pressurization in Cryogenic Tanks

    Science.gov (United States)

    Ahuja, Vineet; Hosangadi, Ashvin; Lee, Chun P.; Field, Robert E.; Ryan, Harry

    2010-01-01

    A comprehensive numerical framework utilizing multi-element unstructured CFD and rigorous real fluid property routines has been developed to carry out analyses of propellant tank and delivery systems at NASA SSC. Traditionally CFD modeling of pressurization and mixing in cryogenic tanks has been difficult primarily because the fluids in the tank co-exist in different sub-critical and supercritical states with largely varying properties that have to be accurately accounted for in order to predict the correct mixing and phase change between the ullage and the propellant. For example, during tank pressurization under some circumstances, rapid mixing of relatively warm pressurant gas with cryogenic propellant can lead to rapid densification of the gas and loss of pressure in the tank. This phenomenon can cause serious problems during testing because of the resulting decrease in propellant flow rate. With proper physical models implemented, CFD can model the coupling between the propellant and pressurant including heat transfer and phase change effects and accurately capture the complex physics in the evolving flowfields. This holds the promise of allowing the specification of operational conditions and procedures that could minimize the undesirable mixing and heat transfer inherent in propellant tank operation. In our modeling framework, we incorporated two different approaches to real fluids modeling: (a) the first approach is based on the HBMS model developed by Hirschfelder, Beuler, McGee and Sutton and (b) the second approach is based on a cubic equation of state developed by Soave, Redlich and Kwong (SRK). Both approaches cover fluid properties and property variation spanning sub-critical gas and liquid states as well as the supercritical states. Both models were rigorously tested and properties for common fluids such as oxygen, nitrogen, hydrogen etc were compared against NIST data in both the sub-critical as well as supercritical regimes.
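    The SRK equation of state mentioned above is compact enough to state directly. The sketch below uses the textbook Soave-Redlich-Kwong formulation with approximate nitrogen critical constants; it is a stand-alone property-evaluation illustration, not the CFD framework's actual implementation.

```python
R = 8.314462618  # universal gas constant, J/(mol*K)

def srk_pressure(T, Vm, Tc, Pc, omega):
    """Soave-Redlich-Kwong pressure [Pa] at temperature T [K] and molar
    volume Vm [m^3/mol]: P = RT/(Vm - b) - a*alpha(T)/(Vm*(Vm + b))."""
    a = 0.42748 * R ** 2 * Tc ** 2 / Pc
    b = 0.08664 * R * Tc / Pc
    m = 0.480 + 1.574 * omega - 0.176 * omega ** 2
    alpha = (1.0 + m * (1.0 - (T / Tc) ** 0.5)) ** 2
    return R * T / (Vm - b) - a * alpha / (Vm * (Vm + b))

# Nitrogen critical constants (approximate literature values)
Tc_N2, Pc_N2, omega_N2 = 126.2, 3.3958e6, 0.0372

# Dilute-gas sanity check: near 1 atm the SRK pressure should sit
# within about 1% of the ideal-gas value RT/Vm
P = srk_pressure(300.0, 0.024, Tc_N2, Pc_N2, omega_N2)
```

    The same cubic form remains usable across the sub-critical and supercritical states the abstract describes, which is why an SRK-type model is attractive for tank ullage/propellant mixing where fluid properties vary so widely.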

  10. VICTORIA-92 pretest analyses of PHEBUS-FPT0

    Energy Technology Data Exchange (ETDEWEB)

    Bixler, N.E.; Erickson, C.M.

    1994-01-01

    FPT0 is the first of six tests that are scheduled to be conducted in an experimental reactor in Cadarache, France. The test apparatus consists of an in-pile fuel bundle, an upper plenum, a hot leg, a steam generator, a cold leg, and a small containment. Thus, the test is integral in the sense that it attempts to simulate all of the processes that would be operative in a severe nuclear accident. In FPT0, the fuel will be trace-irradiated; in subsequent tests high burn-up fuel will be used. This report discusses separate pretest analyses of the FPT0 fuel bundle and primary circuit that have been conducted using the USNRC's source term code, VICTORIA-92. Predictions for release of fission product, control rod, and structural elements from the test section are compared with those given by CORSOR-M. In general, the releases predicted by VICTORIA-92 occur earlier than those predicted by CORSOR-M. The other notable difference is that U release is predicted to be on a par with that of the control rod elements; CORSOR-M predicts U release to be about 2 orders of magnitude greater.

  11. Identifying null meta-analyses that are ripe for updating

    Directory of Open Access Journals (Sweden)

    Fang Manchun

    2003-07-01

    Background: As an increasingly large number of meta-analyses are published, quantitative methods are needed to help clinicians and systematic review teams determine when meta-analyses are not up to date. Methods: We propose new methods for determining when non-significant meta-analytic results might be overturned, based on a prediction of the number of participants required in new studies. To guide decision making, we introduce the "new participant ratio", the ratio of the actual number of participants in new studies to the predicted number required to obtain statistical significance. A simulation study was conducted to study the performance of our methods and a real meta-analysis provides further evidence. Results: In our three simulation configurations, our diagnostic test for determining whether a meta-analysis is out of date had sensitivity of 55%, 62%, and 49% with corresponding specificity of 85%, 80%, and 90%, respectively. Conclusions: Simulations suggest that our methods are able to detect out-of-date meta-analyses. These quick and approximate methods show promise for use by systematic review teams to help decide whether to commit the considerable resources required to update a meta-analysis. Further investigation and evaluation of the methods is required before they can be recommended for general use.
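    The abstract defines the "new participant ratio" but not the prediction step behind its denominator. Under a fixed-effect inverse-variance model and two strong simplifying assumptions (new data carry the current pooled effect, and contribute statistical information at the existing average rate per participant), the kind of calculation implied can be sketched as follows; this is an illustration of the idea, not the authors' published method.

```python
import math

def pooled(effects, variances):
    """Fixed-effect inverse-variance pooling: returns (estimate, variance)."""
    w = [1.0 / v for v in variances]
    est = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    return est, 1.0 / sum(w)

def predicted_new_participants(effects, variances, sizes, z_crit=1.96):
    """Participants needed in new studies before the pooled z-statistic
    could reach z_crit, assuming (simplifications) that new data carry
    the current pooled effect and add information at the existing
    average rate per participant."""
    est, var = pooled(effects, variances)
    if est == 0:
        return math.inf
    total_info = 1.0 / var                    # sum of inverse variances
    info_per_participant = total_info / sum(sizes)
    info_needed = (z_crit / est) ** 2         # z = est * sqrt(total info)
    extra_info = max(0.0, info_needed - total_info)
    return extra_info / info_per_participant

# Toy example: three small studies, pooled z ~ 1.51 (not yet significant)
effects, variances, sizes = [0.20, 0.15, 0.30], [0.04, 0.05, 0.09], [100, 80, 50]
needed = predicted_new_participants(effects, variances, sizes)
ratio = 120 / needed   # "new participant ratio" if 120 new participants accrued
```

    A ratio near or above 1 would then flag the null meta-analysis as ripe for updating, since the newly accrued evidence is comparable to what the prediction says is needed.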

  12. Social Network Analyses and Nutritional Behavior: An Integrated Modeling Approach

    Directory of Open Access Journals (Sweden)

    Alistair McNair Senior

    2016-01-01

    Animals have evolved complex foraging strategies to obtain a nutritionally balanced diet and associated fitness benefits. Recent advances in nutrition research, combining state-space models of nutritional geometry with agent-based models of systems biology, show how nutrient-targeted foraging behavior can also influence animal social interactions, ultimately affecting collective dynamics and group structures. Here we demonstrate how social network analyses can be integrated into such a modeling framework and provide a tangible and practical analytical tool to compare experimental results with theory. We illustrate our approach by examining the case of nutritionally mediated dominance hierarchies. First, we show how nutritionally explicit agent-based models that simulate the emergence of dominance hierarchies can be used to generate social networks. Importantly, the structural properties of our simulated networks bear similarities to dominance networks of real animals (where conflicts are not always directly related to nutrition). Finally, we demonstrate how metrics from social network analyses can be used to predict the fitness of agents in these simulated competitive environments. Our results highlight the potential importance of nutritional mechanisms in shaping dominance interactions in a wide range of social and ecological contexts. Nutrition likely influences social interaction in many species, and yet a theoretical framework for exploring these effects is currently lacking. Combining social network analyses with computational models from nutritional ecology may bridge this divide, representing a pragmatic approach for generating theoretical predictions for nutritional experiments.
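    One concrete way to extract a dominance ranking from a simulated conflict network is David's score, a standard index computed from pairwise win proportions. The sketch below is a generic illustration (the paper's own network metrics may differ); the toy win matrix is invented.

```python
def davids_score(wins):
    """David's score, a standard dominance index: P[i][j] is i's share of
    wins against j; DS_i = w_i + w2_i - l_i - l2_i, where w/l are summed
    win/loss proportions and w2/l2 weight them by opponents' scores."""
    n = len(wins)
    P = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            bouts = wins[i][j] + wins[j][i]
            if i != j and bouts > 0:
                P[i][j] = wins[i][j] / bouts
    w = [sum(P[i]) for i in range(n)]
    losses = [sum(P[j][i] for j in range(n)) for i in range(n)]
    w2 = [sum(P[i][j] * w[j] for j in range(n)) for i in range(n)]
    l2 = [sum(P[j][i] * losses[j] for j in range(n)) for i in range(n)]
    return [w[i] + w2[i] - losses[i] - l2[i] for i in range(n)]

# Perfectly transitive toy hierarchy: agent 0 beats 1 and 2, agent 1 beats 2
wins = [[0, 4, 4],
        [0, 0, 4],
        [0, 0, 0]]
scores = davids_score(wins)  # descending scores reproduce the hierarchy
```

    Scores of this kind can then be correlated with per-agent intake or fitness in the simulation, which is the sort of network-metric-to-fitness prediction the abstract describes.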

  13. [Anne Arold. Kontrastive Analyse...] / Paul Alvre

    Index Scriptorium Estoniae

    Alvre, Paul, 1921-2008

    2001-01-01

    Review of: Arold, Anne. Kontrastive Analyse der Wortbildungsmuster im Deutschen und im Estnischen (am Beispiel der Aussehensadjektive) [Contrastive analysis of word-formation patterns in German and Estonian, illustrated by adjectives of appearance]. Tartu, 2000. (Dissertationes philologiae germanicae Universitatis Tartuensis)

  14. 7 CFR 94.102 - Analyses available.

    Science.gov (United States)

    2010-01-01

    ... analyses for total ash, fat by acid hydrolysis, moisture, salt, protein, beta-carotene, catalase... monocytogenes, proteolytic count, psychrotrophic bacteria, Salmonella, Staphylococcus, thermoduric bacteria,...

  15. Green building performance prediction/assessment

    Energy Technology Data Exchange (ETDEWEB)

    Papamichael, Konstantinos

    2000-02-01

    To make decisions, building designers need to predict and assess the performance of their ideas with respect to various criteria, such as comfort, esthetics, energy, environmental impact, economics, etc. Performance prediction with respect to environmental impact requires complicated models and massive computations, which are usually possible only through computer-based tools. This paper focuses on the use of computer-based tools for predicting and assessing building performance with respect to environmental impact criteria for the design of green buildings. It contains analyses of green performance prediction/assessment and descriptions of available tools, along with discussions on their use by different types of users. Finally, it includes analyses of the cost and benefits of green performance prediction and assessment.

  16. Genotype-phenotype analyses of classic neuronal ceroid lipofuscinoses (NCLs): genetic predictions from clinical and pathological findings

    Institute of Scientific and Technical Information of China (English)

    Weina JU; W. Ted BROWN; Nanbert ZHONG; Anetta WRONSKA; Dorota N. MOROZIEWICZ; Rocksheng ZHONG; Natalia WISNIEWSKI; Anna JURKIEWICZ; Michael FIORY; Krystyna E. WISNIEWSKI; Lance JOHNSTON

    2006-01-01

    Objective: Genotype-phenotype associations were studied in 517 subjects clinically affected by classical neuronal ceroid lipofuscinosis (NCL). Methods: Genetic loci CLN1-3 were analyzed in regard to age of onset, initial neurological symptoms, and electron microscope (EM) profiles. Results: The most common initial symptom leading to a clinical evaluation was developmental delay (30%) in NCL1, seizures (42.4%) in NCL2, and vision problems (53.5%) in NCL3. Eighty-two percent of NCL1 cases had granular osmiophilic deposits (GRODs) or mixed-GROD-containing EM profiles; 94% of NCL2 cases had curvilinear (CV) or mixed-CV-containing profiles; and 91% of NCL3 had fingerprint (FP) or mixed-FP-containing profiles. The mixed-type EM profile was found in approximately one-third of the NCL cases. DNA mutations within a specific CLN gene were further correlated with NCL phenotypes. Seizures were noticed to associate with the common mutations 523G>A and 636C>T of CLN2 in NCL2 but not with the common mutations 223G>A and 451C>T of CLN1 in NCL1. Vision loss was the initial symptom in all types of mutations in NCL3. Surprisingly, our data showed that the age of onset was atypical in 51.3% of NCL1 (infantile form) cases, 19.7% of NCL2 (late-infantile form) cases, and 42.8% of NCL3 (juvenile form) cases. Conclusion: Our data provide an overall picture regarding the clinical recognition of classical childhood NCLs. This may assist in the prediction and genetic identification of NCL1-3 via their characteristic clinical features.

  17. Genome-Facilitated Analyses of Geomicrobial Processes

    Energy Technology Data Exchange (ETDEWEB)

    Kenneth H. Nealson

    2012-05-02

    This project had the goal(s) of understanding the mechanism(s) of extracellular electron transport (EET) in the microbe Shewanella oneidensis MR-1, and a number of other strains and species in the genus Shewanella. The major accomplishments included sequencing, annotation, and analysis of more than 20 Shewanella genomes. The comparative genomics enabled the beginning of a systems biology approach to this genus. Another major contribution involved the study of gene regulation, primarily in the model organism, MR-1. As part of this work, we took advantage of special facilities at the DOE: e.g., the synchrotron radiation facility at ANL, where we successfully used this system for elemental characterization of single cells in different metabolic states (1). We began work with purified enzymes, and identification of partially purified enzymes, leading to initial characterization of several of the 42 c-type cytochromes from MR-1 (2). As the genome became annotated, we began experiments on transcriptome analysis under different conditions of growth, the first step towards systems biology (3,4). Conductive appendages of Shewanella, called bacterial nanowires, were identified and characterized during this work (5, 11, 20, 21). For the first time, it was possible to measure the electron transfer rate between single cells and a solid substrate (20), a rate that has been confirmed by several other laboratories. We also showed that MR-1 cells preferentially attach to surfaces at a given charge, and are not attracted, or are even repelled, by other charges. The interaction with the charged surfaces begins with a stimulation of motility (called electrokinesis), and eventually leads to attachment and growth. One of the things that genomics allows is the comparative analysis of the various Shewanella strains, which led to several important insights. First, while the genomes predicted that none of the strains looked like they should be able to degrade N-acetyl glucosamine (NAG), the monomer

  18. Novel Algorithms for Astronomical Plate Analyses

    Indian Academy of Sciences (India)

    Rene Hudec; Lukas Hudec

    2011-03-01

    Powerful computers and dedicated software allow effective data mining and scientific analyses in astronomical plate archives. We give and discuss examples of newly developed algorithms for astronomical plate analyses, e.g., searches for optical transients, as well as for major spectral and brightness changes.

  19. Moving Crystal Slow-Neutron Wavelength Analyser

    DEFF Research Database (Denmark)

    Buras, B.; Kjems, Jørgen

    1973-01-01

    Experimental proof that a moving single crystal can serve as a slow-neutron wavelength analyser with special features is presented. When the crystal moves with a velocity h/(2md) (h: Planck constant, m: neutron mass, d: interplanar spacing) perpendicular to the diffracting plane and the analysed...
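    The required crystal speed h/(2md) is easy to evaluate. The example below uses CODATA constants and the (002) spacing of pyrolytic graphite purely as an illustrative choice of d; the abstract does not say which crystal was used.

```python
# Condition from the abstract: the crystal moves at v = h / (2 m d)
# perpendicular to the diffracting plane.
h = 6.62607015e-34       # Planck constant, J*s (CODATA)
m_n = 1.67492749804e-27  # neutron mass, kg (CODATA)
d = 3.354e-10            # graphite (002) interplanar spacing, m (example)

v = h / (2 * m_n * d)    # required crystal speed, m/s
```

    The result is on the order of a few hundred metres per second, i.e. comparable to slow-neutron speeds themselves, which is what makes a mechanically moving crystal a practical analyser in this regime.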

  20. Economische analyse van de Nederlandse biotechnologiesector

    NARCIS (Netherlands)

    Giessen, A.M. van der; Gijsbers, G.W.; Koops, R.; Zee, F.A. van der

    2014-01-01

    Commissioned by the Netherlands Commission on Genetic Modification (COGEM), TNO carried out a desk study entitled “Economische analyse van de Nederlandse biotechnologiesector” (Economic analysis of the Dutch biotechnology sector). This analysis is one of the preliminary studies that COGEM is having carried out in preparation for the Trendanalyse Biotechnologie (Biotechnology Trend Analysis), which...

  1. GIPSy: Genomic island prediction software.

    Science.gov (United States)

    Soares, Siomar C; Geyik, Hakan; Ramos, Rommel T J; de Sá, Pablo H C G; Barbosa, Eudes G V; Baumbach, Jan; Figueiredo, Henrique C P; Miyoshi, Anderson; Tauch, Andreas; Silva, Artur; Azevedo, Vasco

    2016-08-20

    Bacteria are highly diverse organisms that are able to adapt to a broad range of environments and hosts due to their high genomic plasticity. Horizontal gene transfer plays a pivotal role in this genome plasticity and in evolution by leaps through the incorporation of large blocks of genome sequences, ordinarily known as genomic islands (GEIs). GEIs may harbor genes encoding virulence, metabolism, antibiotic resistance and symbiosis-related functions, namely pathogenicity islands (PAIs), metabolic islands (MIs), resistance islands (RIs) and symbiotic islands (SIs). Although many software tools for the prediction of GEIs exist, they only focus on PAI prediction and present other limitations, such as complicated installation and inconvenient user interfaces. Here, we present GIPSy, the genomic island prediction software, a standalone and user-friendly tool for the prediction of GEIs, built on our previously developed pathogenicity island prediction software (PIPS). We also present four application cases in which we crosslink data from the literature to PAIs, MIs, RIs and SIs predicted by GIPSy. Briefly, GIPSy correctly predicted the following previously described GEIs: 13 PAIs larger than 30 kb in Escherichia coli CFT073; 1 MI for Burkholderia pseudomallei K96243, which seems to be a miscellaneous island; 1 RI of Acinetobacter baumannii AYE, named AbaR1; and 1 SI of Mesorhizobium loti MAFF303099 presenting a mosaic structure. GIPSy is the first life-style-specific genomic island prediction software to perform analyses of PAIs, MIs, RIs and SIs, opening a door for a better understanding of bacterial genome plasticity and the adaptation to new traits.
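    Horizontally acquired islands often betray themselves through atypical base composition. The sketch below flags windows whose G+C content deviates strongly from the genome-wide mean, one of several compositional signals GEI predictors combine; real tools such as GIPSy use much richer evidence (codon usage, flanking tRNAs, mobility genes), and the synthetic sequence here is invented for illustration.

```python
def gc_content(seq):
    s = seq.upper()
    return (s.count("G") + s.count("C")) / len(s)

def atypical_gc_windows(genome, window=5000, step=1000, z_thresh=2.0):
    """Return (start, gc) for windows whose G+C content deviates from
    the genome-wide window mean by more than z_thresh standard
    deviations -- a crude compositional-anomaly signal."""
    starts = list(range(0, len(genome) - window + 1, step))
    gcs = [gc_content(genome[s:s + window]) for s in starts]
    mean = sum(gcs) / len(gcs)
    sd = (sum((g - mean) ** 2 for g in gcs) / len(gcs)) ** 0.5 or 1e-9
    return [(s, g) for s, g in zip(starts, gcs)
            if abs(g - mean) / sd > z_thresh]

# Synthetic chromosome: AT-rich host with a GC-rich 5 kb insertion
genome = "AT" * 5000 + "GC" * 2500 + "AT" * 5000
flagged = atypical_gc_windows(genome)  # flags the window over the insertion
```

    In practice a predictor would intersect several such signals and then classify the island by its gene content (virulence, metabolism, resistance, symbiosis), which is the PAI/MI/RI/SI distinction GIPSy draws.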

  2. Assessment of protein disorder region predictions in CASP10

    KAUST Repository

    Monastyrskyy, Bohdan

    2013-11-22

    The article presents the assessment of disorder region predictions submitted to CASP10. The evaluation is based on the three measures tested in previous CASPs: (i) balanced accuracy, (ii) the Matthews correlation coefficient for the binary predictions, and (iii) the area under the curve in the receiver operating characteristic (ROC) analysis of predictions using probability annotation. We also performed new analyses such as comparison of the submitted predictions with those obtained with a Naïve disorder prediction method and with predictions from the disorder prediction databases D2P2 and MobiDB. On average, the methods participating in CASP10 demonstrated slightly better performance than those in CASP9.
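    The two binary-prediction measures named in the abstract are straightforward to compute from a confusion matrix. The sketch below uses invented toy labels (1 = disordered residue); it illustrates the standard definitions, not CASP's assessment pipeline.

```python
import math

def confusion(y_true, y_pred):
    """Counts (tp, tn, fp, fn) for binary labels."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    return tp, tn, fp, fn

def balanced_accuracy(y_true, y_pred):
    """Mean of sensitivity and specificity, robust to the strong
    order/disorder class imbalance in proteins."""
    tp, tn, fp, fn = confusion(y_true, y_pred)
    return 0.5 * (tp / (tp + fn) + tn / (tn + fp))

def mcc(y_true, y_pred):
    """Matthews correlation coefficient for binary predictions."""
    tp, tn, fp, fn = confusion(y_true, y_pred)
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return (tp * tn - fp * fn) / denom if denom else 0.0

# Toy per-residue disorder labels and binary predictions
y_true = [1, 1, 1, 0, 0, 0, 0, 0]
y_pred = [1, 1, 0, 0, 0, 0, 1, 0]
```

    The third CASP measure, area under the ROC curve, additionally requires per-residue disorder probabilities rather than hard 0/1 calls.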

  3. Level II Ergonomic Analyses, Dover AFB, DE

    Science.gov (United States)

    1999-02-01

    IERA-RS-BR-TR-1999-0002, United States Air Force IERA: Level II Ergonomic Analyses, Dover AFB, DE. Andrew Marcotte; Marilyn Joyce (The Joyce...). Contents: 1.0 Introduction; 1.1 Purpose of the Level II Ergonomic Analyses; 1.2 Approach; 1.2.1 Initial Shop Selection and Administration of the...

  4. Final report for confinement vessel analysis. Task 2, Safety vessel impact analyses

    Energy Technology Data Exchange (ETDEWEB)

    Murray, Y.D. [APTEK, Inc., Colorado Springs, CO (United States)

    1994-01-26

    This report describes two sets of finite element analyses performed under Task 2 of the Confinement Vessel Analysis Program. In each set of analyses, a charge is assumed to have detonated inside the confinement vessel, causing the confinement vessel to fail in either of two ways: locally around the weld line of a nozzle, or catastrophically into two hemispheres. High-pressure gases from the internal detonation pressurize the inside of the safety vessel and accelerate the fractured nozzle or hemisphere into the safety vessel. The first set of analyses examines the structural integrity of the safety vessel when impacted by the fractured nozzle. The objective of these calculations is to determine if the high-strength bolt heads attached to the nozzle penetrate or fracture the lower-strength safety vessel, thus allowing gaseous detonation products to escape to the atmosphere. The two-dimensional analyses predict partial penetration of the safety vessel beneath the tip of the penetrator. The analyses also predict maximum principal strains in the safety vessel which exceed the measured ultimate strain of the steel. The second set of analyses examines the containment capability of the safety vessel closure when impacted by half a confinement vessel (hemisphere). The predicted response is the formation of a 0.6-inch gap, caused by relative sliding and separation between the two halves of the safety vessel. Additional analyses with closure designs that prevent the gap formation are recommended.

  5. Anthocyanin analyses of Vaccinium fruit dietary supplements

    Science.gov (United States)

    Vaccinium fruit ingredients within dietary supplements were identified by comparisons with anthocyanin analyses of known Vaccinium profiles (demonstration of anthocyanin fingerprinting). Available Vaccinium supplements were purchased and analyzed; their anthocyanin profiles (based on HPLC separation...

  6. Analyse of Maintenance Cost in ST

    CERN Document Server

    Jenssen, B W

    2001-01-01

    An analysis has been carried out in ST concerning the total costs for the division. Even though the target was the maintenance costs in ST, the overall budget has been analysed, since there is a close relation between investment & consolidation and the required level of maintenance. The purpose of the analysis was to look at maintenance cost in ST as a ratio of total maintenance costs over the replacement value of the equipment, and to make some comparisons with other industries and laboratories. Families of equipment have been defined and their corresponding ratios calculated. This first approach gives us some "quantitative" measurements. This analysis should be combined with performance indicators (more "qualitative" measurements) that tell us how well we are performing. This will help us defend our budget, set better priorities, and satisfy the requirements of our external auditors.

  7. Understanding Human Error Based on Automated Analyses

    Data.gov (United States)

    National Aeronautics and Space Administration — This is a report on a continuing study of automated analyses of experiential textual reports to gain insight into the causal factors of human errors in aviation...

  8. Micromechanical photothermal analyser of microfluidic samples

    DEFF Research Database (Denmark)

    2014-01-01

    The present invention relates to a micromechanical photothermal analyser of microfluidic samples comprising an oblong micro-channel extending longitudinally from a support element, the micro-channel is made from at least two materials with different thermal expansion coefficients, wherein......, or infrared light, the specific light radiates into the channel through said light transparent material, the second material has a second thermal expansion coefficient being different from the first thermal expansion coefficient. The micromechanical photothermal analyser also comprises an irradiation source...

  9. En kritisk analyse af Leviathan & Samfundspagten

    OpenAIRE

    Bjelobrk, Aleksandar

    2006-01-01

This report examines the versions of contract theory put forward by the political philosophers Thomas Hobbes (1588-1679) and Jean-Jacques Rousseau (1712-1778). Taking as its starting point Leviathan part 1, Of Man, and part 2, Of Common-Wealth, by Thomas Hobbes (Hobbes 1991, originally published 1651) and The Social Contract (Samfundspagten) by Jean-Jacques Rousseau (Rousseau 1987, originally published 1762), inconsistencies in their theories are identified through two separate analyses. Based on the inconsistencies found, a comparative analysis is carried out,...

  10. Thermal Analyses of Cross-Linked Polyethylene

    Directory of Open Access Journals (Sweden)

    Radek Polansky

    2007-01-01

The paper summarizes results obtained from structural analysis measurements: differential scanning calorimetry (DSC), thermogravimetry (TG), thermomechanical analysis (TMA), and Fourier transform infrared spectroscopy (FT-IR). Samples of cross-linked polyethylene cable insulation were tested via these analyses. The DSC and TG were carried out using a TA Instruments SDT Q600 simultaneous thermal analyzer coupled to a Nicolet 380 Fourier transform infrared spectrometer. Thermomechanical analysis was carried out with a TA Instruments TMA Q400EM apparatus.

  11. Predictive mechanisms in idiom comprehension.

    Science.gov (United States)

    Vespignani, Francesco; Canal, Paolo; Molinaro, Nicola; Fonda, Sergio; Cacciari, Cristina

    2010-08-01

Prediction is pervasive in human cognition and plays a central role in language comprehension. At an electrophysiological level, this cognitive function contributes substantially to determining the amplitude of the N400. In fact, the amplitude of the N400 to words within a sentence has been shown to depend on how predictable those words are: The more predictable a word, the smaller the N400 elicited. However, predictive processing can be based on different sources of information that allow anticipation of upcoming constituents and integration in context. In this study, we investigated the ERPs elicited during the comprehension of idioms, that is, prefabricated multiword strings stored in semantic memory. When a reader recognizes a string of words as an idiom before the idiom ends, she or he can develop expectations concerning the incoming idiomatic constituents. We hypothesized that the expectations driven by the activation of an idiom might differ from those driven by discourse-based constraints. To this end, we compared the ERP waveforms elicited by idioms and two literal control conditions. The results showed that, in both cases, the literal conditions exhibited a more negative potential than the idiomatic condition. Our analyses suggest that before idiom recognition the effect is due to modulation of the N400 amplitude, whereas after idiom recognition a P300 for the idiomatic sentence has a fundamental role in the composition of the effect. These results suggest that two distinct predictive mechanisms are at work during language comprehension, based respectively on probabilistic information and on categorical template matching.

  12. Making detailed predictions makes (some) predictions worse

    Science.gov (United States)

    Kelly, Theresa F.

    In this paper, we investigate whether making detailed predictions about an event makes other predictions worse. Across 19 experiments, 10,895 participants, and 415,960 predictions about 724 professional sports games, we find that people who made detailed predictions about sporting events (e.g., how many hits each baseball team would get) made worse predictions about more general outcomes (e.g., which team would win). We rule out that this effect is caused by inattention or fatigue, thinking too hard, or a differential reliance on holistic information about the teams. Instead, we find that thinking about game-relevant details before predicting winning teams causes people to give less weight to predictive information, presumably because predicting details makes information that is relatively useless for predicting the winning team more readily accessible in memory and therefore incorporated into forecasts. Furthermore, we show that this differential use of information can be used to predict what kinds of games will and will not be susceptible to the negative effect of making detailed predictions.

  13. Search for single top production using multivariate analyses at CDF

    CERN Document Server

    Hirschbuehl, Dominic

    2007-01-01

This article reports on recent searches for single-top-quark production by the CDF collaboration at the Tevatron, using a data set that corresponds to an integrated luminosity of 955 pb^-1. Three different analysis techniques are employed: one using likelihood discriminants, one neural networks, and one matrix elements. The sensitivity to single-top production at the rate predicted by the standard model ranges from 2.1 to 2.6 sigma. While the first two analyses observe a deficit of single-top-like events compared to the expectation, the matrix element method observes an excess corresponding to a background fluctuation of 2.3 sigma. The null results of the likelihood and neural network analyses translate into upper limits on the cross section of 2.6 pb for the t-channel production mode and 3.7 pb for the s-channel mode at the 95% C.L. The matrix element result corresponds to a measurement of 2.7^{+1.5}_{-1.3} pb for the combined t- and s-channel single-top cross section. In addition, CDF has searched for non-stand...

  14. Random error in cardiovascular meta-analyses

    DEFF Research Database (Denmark)

    Albalawi, Zaina; McAlister, Finlay A; Thorlund, Kristian;

    2013-01-01

    BACKGROUND: Cochrane reviews are viewed as the gold standard in meta-analyses given their efforts to identify and limit systematic error which could cause spurious conclusions. The potential for random error to cause spurious conclusions in meta-analyses is less well appreciated. METHODS: We...... examined all reviews approved and published by the Cochrane Heart Group in the 2012 Cochrane Library that included at least one meta-analysis with 5 or more randomized trials. We used trial sequential analysis to classify statistically significant meta-analyses as true positives if their pooled sample size...... but infrequently recognized, even among methodologically robust reviews published by the Cochrane Heart Group. Meta-analysts and readers should incorporate trial sequential analysis when interpreting results....
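Trial sequential analysis, as used in the record above, compares a meta-analysis's pooled sample size against a required information size (RIS). A minimal sketch of a conventional RIS calculation for a binary outcome, ignoring heterogeneity adjustment; the control event rate and risk reduction below are invented for illustration and this is not the Cochrane Heart Group's own calculation:

```python
import math
from statistics import NormalDist

def required_information_size(p_control: float, risk_diff: float,
                              alpha: float = 0.05, power: float = 0.80) -> int:
    """Approximate total number of patients required for a two-arm
    meta-analysis with a binary outcome to detect an absolute risk
    difference `risk_diff`, using the standard sample-size formula."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # two-sided type I error
    z_beta = NormalDist().inv_cdf(power)            # type II error
    p_bar = p_control - risk_diff / 2               # average event proportion
    n = 4 * (z_alpha + z_beta) ** 2 * p_bar * (1 - p_bar) / risk_diff ** 2
    return math.ceil(n)

# e.g. 10% control event rate, 2% absolute risk reduction
print(required_information_size(0.10, 0.02))
```

A meta-analysis whose pooled sample size falls short of the RIS is the situation the review flags: an apparently significant result that may still be a false positive due to random error.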

  15. A theoretical framework for analysing preschool teaching

    DEFF Research Database (Denmark)

    Chaiklin, Seth

    2014-01-01

    This article introduces a theoretical framework for analysing preschool teaching as a historically-grounded societal practice. The aim is to present a unified framework that can be used to analyse and compare both historical and contemporary examples of preschool teaching practice within and across...... national traditions. The framework has two main components, an analysis of preschool teaching as a practice, formed in relation to societal needs, and an analysis of the categorical relations which necessarily must be addressed in preschool teaching activity. The framework is introduced and illustrated...... through the analysis of one of the earliest recorded examples of preschool education (initiated by J. F. Oberlin in northeastern France in 1767). The general idea of societal need is elaborated as a way of analysing practices, and a general analytic schema is presented for characterising preschool...

  16. Variability in spectrophotometric pyruvate analyses for predicting onion pungency and nutraceutical value.

    Science.gov (United States)

    Beretta, Vanesa H; Bannoud, Florencia; Insani, Marina; Galmarini, Claudio R; Cavagnaro, Pablo F

    2017-06-01

    Onion pyruvate concentration is used as a predictor of flavor intensity and nutraceutical value. The protocol of Schwimmer and Weston (SW) (1961) is the most widespread methodology for estimating onion pyruvate. Anthon and Barret (AB) (2003) proposed modifications to this procedure. Here, we compared these spectrophotometry-based procedures for pyruvate analysis using a diverse collection of onion cultivars. The SW method always led to over-estimation of pyruvate levels in colored, but not in white onions, by up to 65%. Identification of light-absorbance interfering compounds was performed by spectrophotometry and HPLC analysis. Interference by quercetin and anthocyanins, jointly, accounted for more than 90% of the over-estimation of pyruvate. Pyruvate determinations according to AB significantly reduced absorbance interference from compounds other than pyruvate. This study provides evidence about the mechanistic basis underlying differences between the SW and AB methods for indirect assessment of onion flavor and nutraceutical value.

  17. Behavioral and Physiological Neural Network Analyses: A Common Pathway toward Pattern Recognition and Prediction

    Science.gov (United States)

    Ninness, Chris; Lauter, Judy L.; Coffee, Michael; Clary, Logan; Kelly, Elizabeth; Rumph, Marilyn; Rumph, Robin; Kyle, Betty; Ninness, Sharon K.

    2012-01-01

    Using 3 diversified datasets, we explored the pattern-recognition ability of the Self-Organizing Map (SOM) artificial neural network as applied to diversified nonlinear data distributions in the areas of behavioral and physiological research. Experiment 1 employed a dataset obtained from the UCI Machine Learning Repository. Data for this study…

  18. Fertility prediction of frozen boar sperm using novel and conventional analyses

    Science.gov (United States)

    Frozen-thawed boar sperm is seldom used for artificial insemination (AI) because fertility is lower than fresh or cooled semen. Despite the many advantages of AI including reduced pathogen exposure and ease of semen transport, cryo-induced damage to sperm usually results in decreased litter sizes a...

  19. Analyses of the predicted changes of the global oceans under the increased greenhouse gases scenarios

    Institute of Scientific and Technical Information of China (English)

    MU Lin; WU Dexing; CHEN Xue'en; J Jungclaus

    2006-01-01

    A new climate model (ECHAM5/MPIOM1) developed for the fourth assessment report of the Intergovernmental Panel on Climate Change (IPCC) at Max-Planck Institute for Meteorology is used to study the climate changes under the different increased CO2 scenarios (B1, A1B and A2). Based on the corresponding model results, the sea surface temperature and salinity structure, the variations of the thermohaline circulation (THC) and the changes of sea ice in the northern hemisphere are analyzed. It is concluded that from the year of 2000 to 2100, under the B1, A1B and A2 scenarios, the global mean sea surface temperatures (SST) would increase by 2.5℃, 3.5℃ and 4.0℃ respectively, especially in the region of the Arctic, the increase of SST would be even above 10.0℃; the maximal negative value of the variation of the fresh water flux is located in the subtropical oceans, while the precipitation in the eastern tropical Pacific increases. The strength of THC decreases under the B1, A1B and A2 scenarios, and the reductions would be about 20%, 25% and 25.1% of the present THC strength respectively. In the northern hemisphere, the area of the sea ice cover would decrease by about 50% under the A1B scenario.

  20. Gene expression array analyses predict increased proto-oncogene expression in MMTV induced mammary tumors.

    Science.gov (United States)

    Popken-Harris, Pamela; Kirchhof, Nicole; Harrison, Ben; Harris, Lester F

    2006-08-01

Exogenous infection by milk-borne mouse mammary tumor viruses (MMTV) typically induces mouse mammary tumors in genetically susceptible mice at a rate of 90-95% by 1 year of age. In contrast to other transforming retroviruses, MMTV acts as an insertional mutagen and, under the influence of steroid hormones, induces oncogenic transformation after insertion into the host genome. As these events correspond with increases in adjacent proto-oncogene transcription, we used expression array profiling to determine which commonly associated MMTV insertion site proto-oncogenes were transcriptionally active in MMTV-induced mouse mammary tumors. To verify our gene expression array results we developed real-time quantitative RT-PCR assays for the common MMTV insertion site genes found in RIII/Sa mice (int-1/wnt-1, int-2/fgf-3, int-3/Notch 4, and fgf8/AIGF) as well as two genes that were consistently upregulated (CCND1 and MAT-8) and two genes that were consistently downregulated (FN1 and MAT-8) in the MMTV-induced tumors as compared to normal mammary gland. Finally, each tumor was also examined histopathologically. Our expression array findings support a model whereby just one or a few common MMTV insertions into the host genome set up a dominant cascade of events that leave a characteristic molecular signature.

  1. Eksakte metodar for analyse av tovegstabellar

    OpenAIRE

    Aaberge, Rolf

    1980-01-01

Most of the mathematical-statistical methods developed for the analysis of tables rest on the assumption that the number of observations in the table cells is "large". Haldorsen (1977a) and (1977b) describes methods that rely on this requirement. In this report we present exact methods for the analysis of two-way tables, i.e. methods that are valid even when the cell counts are small. In many studies the observations will often express what kind of ...

  2. Identifying, analysing and solving problems in practice.

    Science.gov (United States)

    Hewitt-Taylor, Jaqui

    When a problem is identified in practice, it is important to clarify exactly what it is and establish the cause before seeking a solution. This solution-seeking process should include input from those directly involved in the problematic situation, to enable individuals to contribute their perspective, appreciate why any change in practice is necessary and what will be achieved by the change. This article describes some approaches to identifying and analysing problems in practice so that effective solutions can be devised. It includes a case study and examples of how the Five Whys analysis, fishbone diagram, problem tree analysis, and Seven-S Model can be used to analyse a problem.

  3. Use of EBSD data in mesoscale numerical analyses

    Energy Technology Data Exchange (ETDEWEB)

    Becker, R; Wiland, H

    2000-03-30

EBSD and FEM studies will provide impetus for further development of microstructure models and theories of microstructure evolution. Early studies connecting EBSD data to detailed finite element models used manual measurements to define initial orientations for the simulations. In one study, manual measurements of the deformed structure were also obtained for comparison with the model predictions. More recent work has taken advantage of automated data collection on deformed specimens as a means of collecting detailed and spatially correlated data for FEM model validation. Although it will not be discussed here, EBSD data can also be incorporated in FEM analyses in a less direct manner that is suitable for simulations where the element size is much larger than the grain size. The purpose of such models is to account for the effects of evolving material anisotropy in macro-scale simulations. In these analyses, a polycrystal plasticity model (e.g., a Taylor model or a self-consistent model), or a yield surface constructed from a polycrystal plasticity model, is used to determine the constitutive response of each element. The initial orientations used in the polycrystal plasticity model can be obtained from EBSD analyses or by fitting distributions of discrete orientations to x-ray data. The use of EBSD data is advantageous in that it is easier to account for spatial gradients of orientation distribution within a part. Another area in which EBSD data is having a great impact is on recrystallization modeling. EBSD techniques can be used to collect data for quantitative microstructural analysis (Humphreys, 1998). This data can be used to infer growth kinetics of specific orientations, and this information can be synthesized into more accurate grain growth or recrystallization models (Vogel et al., 1996). A second role which EBSD techniques may play in recrystallization modeling is in determining initial structures for the models. A realistic starting structure is vital for

  4. Integrated genomic analyses of ovarian carcinoma

    NARCIS (Netherlands)

Bell, D.; Berchuck, A.; Birrer, M.; Chien, J.; Dao, F.; Dhir, R.; DiSaia, P.; Gabra, H.; Glenn, P.; Godwin, A. K.; Gross, J.; Hartmann, L.; Huang, M.; Huntsman, D. G.; Iacocca, M.; Imielinski, M.; Kalloger, S.; Karlan, B. Y.; Levine, D. A.; Mills, G. B.; Morrison, C.; Mutch, D.; Olvera, N.; Orsulic, S.; Park, K.; Petrelli, N.; Rabeno, B.; Rader, J. S.; Sikic, B. I.; Smith-McCune, K.; Sood, A. K.; Bowtell, D.; Penny, R.; Testa, J. R.; Chang, K.; Dinh, H. H.; Drummond, J. A.; Fowler, G.; Gunaratne, P.; Hawes, A. C.; Kovar, C. L.; Lewis, L. R.; Morgan, M. B.; Newsham, I. F.; Santibanez, J.; Reid, J. G.; Trevino, L. R.; Wu, Y. -Q.; Wang, M.; Muzny, D. M.; Wheeler, D. A.; Gibbs, R. A.; Getz, G.; Lawrence, M. S.; Cibulskis, K.; Sivachenko, A. Y.; Sougnez, C.; Voet, D.; Wilkinson, J.; Bloom, T.; Ardlie, K.; Fennell, T.; Baldwin, J.; Gabriel, S.; Lander, E. S.; Ding, L.; Fulton, R. S.; Koboldt, D. C.; McLellan, M. D.; Wylie, T.; Walker, J.; O'Laughlin, M.; Dooling, D. J.; Fulton, L.; Abbott, R.; Dees, N. D.; Zhang, Q.; Kandoth, C.; Wendl, M.; Schierding, W.; Shen, D.; Harris, C. C.; Schmidt, H.; Kalicki, J.; Delehaunty, K. D.; Fronick, C. C.; Demeter, R.; Cook, L.; Wallis, J. W.; Lin, L.; Magrini, V. J.; Hodges, J. S.; Eldred, J. M.; Smith, S. M.; Pohl, C. S.; Vandin, F.; Raphael, B. J.; Weinstock, G. M.; Mardis, R.; Wilson, R. K.; Meyerson, M.; Winckler, W.; Getz, G.; Verhaak, R. G. W.; Carter, S. L.; Mermel, C. H.; Saksena, G.; Nguyen, H.; Onofrio, R. C.; Lawrence, M. S.; Hubbard, D.; Gupta, S.; Crenshaw, A.; Ramos, A. H.; Ardlie, K.; Chin, L.; Protopopov, A.; Zhang, Juinhua; Kim, T. M.; Perna, I.; Xiao, Y.; Zhang, H.; Ren, G.; Sathiamoorthy, N.; Park, R. W.; Lee, E.; Park, P. J.; Kucherlapati, R.; Absher, D. M.; Waite, L.; Sherlock, G.; Brooks, J. D.; Li, J. Z.; Xu, J.; Myers, R. M.; Laird, P. W.; Cope, L.; Herman, J. G.; Shen, H.; Weisenberger, D. J.; Noushmehr, H.; Pan, F.; Triche, T.; Berman, B. P.; Van den Berg, D. J.; Buckley, J.; Baylin, S. B.; Spellman, P. T.; Purdom, E.; Neuvial, P.; Bengtsson, H.; Jakkula, L. R.; Durinck, S.; Han, J.; Dorton, S.; Marr, H.; Choi, Y. G.; Wang, V.; Wang, N. J.; Ngai, J.; Conboy, J. G.; Parvin, B.; Feiler, H. S.; Speed, T. P.; Gray, J. W.; Levine, D. A.; Socci, N. D.; Liang, Y.; Taylor, B. S.; Schultz, N.; Borsu, L.; Lash, A. E.; Brennan, C.; Viale, A.; Sander, C.; Ladanyi, M.; Hoadley, K. A.; Meng, S.; Du, Y.; Shi, Y.; Li, L.; Turman, Y. J.; Zang, D.; Helms, E. B.; Balu, S.; Zhou, X.; Wu, J.; Topal, M. D.; Hayes, D. N.; Perou, C. M.; Getz, G.; Voet, D.; Saksena, G.; Zhang, Junihua; Zhang, H.; Wu, C. J.; Shukla, S.; Cibulskis, K.; Lawrence, M. S.; Sivachenko, A.; Jing, R.; Park, R. W.; Liu, Y.; Park, P. J.; Noble, M.; Chin, L.; Carter, H.; Kim, D.; Karchin, R.; Spellman, P. T.; Purdom, E.; Neuvial, P.; Bengtsson, H.; Durinck, S.; Han, J.; Korkola, J. E.; Heiser, L. M.; Cho, R. J.; Hu, Z.; Parvin, B.; Speed, T. P.; Gray, J. W.; Schultz, N.; Cerami, E.; Taylor, B. S.; Olshen, A.; Reva, B.; Antipin, Y.; Shen, R.; Mankoo, P.; Sheridan, R.; Ciriello, G.; Chang, W. K.; Bernanke, J. A.; Borsu, L.; Levine, D. A.; Ladanyi, M.; Sander, C.; Haussler, D.; Benz, C. C.; Stuart, J. M.; Benz, S. C.; Sanborn, J. Z.; Vaske, C. J.; Zhu, J.; Szeto, C.; Scott, G. K.; Yau, C.; Hoadley, K. A.; Du, Y.; Balu, S.; Hayes, D. N.; Perou, C. M.; Wilkerson, M. D.; Zhang, N.; Akbani, R.; Baggerly, K. A.; Yung, W. K.; Mills, G. B.; Weinstein, J. N.; Penny, R.; Shelton, T.; Grimm, D.; Hatfield, M.; Morris, S.; Yena, P.; Rhodes, P.; Sherman, M.; Paulauskis, J.; Millis, S.; Kahn, A.; Greene, J. M.; Sfeir, R.; Jensen, M. A.; Chen, J.; Whitmore, J.; Alonso, S.; Jordan, J.; Chu, A.; Zhang, Jinghui; Barker, A.; Compton, C.; Eley, G.; Ferguson, M.; Fielding, P.; Gerhard, D. S.; Myles, R.; Schaefer, C.; Shaw, K. R. Mills; Vaught, J.; Vockley, J. B.; Good, P. J.; Guyer, M. S.; Ozenberger, B.; Peterson, J.; Thomson, E.; Cramer, D.W.

    2011-01-01

    A catalogue of molecular aberrations that cause ovarian cancer is critical for developing and deploying therapies that will improve patients' lives. The Cancer Genome Atlas project has analysed messenger RNA expression, microRNA expression, promoter methylation and DNA copy number in 489 high-grade

  5. Meta-analyses on viral hepatitis

    DEFF Research Database (Denmark)

    Gluud, Lise L; Gluud, Christian

    2009-01-01

    This article summarizes the meta-analyses of interventions for viral hepatitis A, B, and C. Some of the interventions assessed are described in small trials with unclear bias control. Other interventions are supported by large, high-quality trials. Although attempts have been made to adjust...

  6. The Economic Cost of Homosexuality: Multilevel Analyses

    Science.gov (United States)

    Baumle, Amanda K.; Poston, Dudley, Jr.

    2011-01-01

    This article builds on earlier studies that have examined "the economic cost of homosexuality," by using data from the 2000 U.S. Census and by employing multilevel analyses. Our findings indicate that partnered gay men experience a 12.5 percent earnings penalty compared to married heterosexual men, and a statistically insignificant earnings…

  7. Chemical Analyses of Silicon Aerogel Samples

    CERN Document Server

    van der Werf, I; De Leo, R; Marrone, S

    2008-01-01

After five years of operation, two aerogel counters, A1 and A2, taking data in Hall A at Jefferson Lab, suffered a loss of performance. In this note possible causes of the degradation are studied. In particular, various chemical and physical analyses have been carried out on several aerogel tiles and on adhesive tape in order to reveal the presence of contaminants.

  8. Antigen/Antibody Analyses in Leishmaniasis.

    Science.gov (United States)

    1983-09-01

    antibodies in human sera with antigens of protozoan parasites . It was found that enzyme substrate reactions had distinct advantages over typical...autoradiographic procedures. Analyses of various sera identified a number of antigens of protozoan parasites which may be useful in discriminating infections

  9. Cosmetology: Task Analyses. Competency-Based Education.

    Science.gov (United States)

    Henrico County Public Schools, Glen Allen, VA. Virginia Vocational Curriculum Center.

    These task analyses are designed to be used in combination with the "Trade and Industrial Education Service Area Resource" in order to implement competency-based education in the cosmetology program in Virginia. The task analysis document contains the task inventory, suggested task sequence lists, and content outlines for the secondary…

  10. Hybrid Logical Analyses of the Ambient Calculus

    DEFF Research Database (Denmark)

    Bolander, Thomas; Hansen, Rene Rydhof

    2010-01-01

    In this paper, hybrid logic is used to formulate three control flow analyses for Mobile Ambients, a process calculus designed for modelling mobility. We show that hybrid logic is very well-suited to express the semantic structure of the ambient calculus and how features of hybrid logic can...

  11. Comparison of veterinary import risk analyses studies

    NARCIS (Netherlands)

    Vos-de Jong, de C.J.; Conraths, F.J.; Adkin, A.; Jones, E.M.; Hallgren, G.S.; Paisley, L.G.

    2011-01-01

    Twenty-two veterinary import risk analyses (IRAs) were audited: a) for inclusion of the main elements of risk analysis; b) between different types of IRAs; c) between reviewers' scores. No significant differences were detected between different types of IRAs, although quantitative IRAs and IRAs publ

  12. Written Case Analyses and Critical Reflection.

    Science.gov (United States)

    Harrington, Helen L.; And Others

    1996-01-01

The study investigated the use of case-based pedagogy to develop critical reflection in prospective teachers. Analysis of students' written analyses of dilemma-based cases found patterns showing evidence of the students' open-mindedness, sense of professional responsibility, and wholeheartedness in their approach to teaching. (DB)

  13. Systematic review and meta-analyses

    DEFF Research Database (Denmark)

    Dreier, Julie Werenberg; Andersen, Anne-Marie Nybo; Berg-Beckhoff, Gabriele

    2014-01-01

    1990 were excluded. RESULTS: The available literature supported an increased risk of adverse offspring health in association with fever during pregnancy. The strongest evidence was available for neural tube defects, congenital heart defects, and oral clefts, in which meta-analyses suggested between a 1...

  14. What's missing from avian global diversification analyses?

    Science.gov (United States)

    Reddy, Sushma

    2014-08-01

    The accumulation of vast numbers of molecular phylogenetic studies has contributed to huge knowledge gains in the evolutionary history of birds. This permits subsequent analyses of avian diversity, such as how and why diversification varies across the globe and among taxonomic groups. However, available genetic data for these meta-analyses are unevenly distributed across different geographic regions and taxonomic groups. To comprehend the impact of this variation on the interpretation of global diversity patterns, I examined the availability of genetic data for possible biases in geographic and taxonomic sampling of birds. I identified three main disparities of sampling that are geographically associated with latitude (temperate, tropical), hemispheres (East, West), and range size. Tropical regions, which host the vast majority of species, are substantially less studied. Moreover, Eastern regions, such as the Old World Tropics and Australasia, stand out as being disproportionately undersampled, with up to half of communities not being represented in recent studies. In terms of taxonomic discrepancies, a majority of genetically undersampled clades are exclusively found in tropical regions. My analysis identifies several disparities in the key regions of interest of global diversity analyses. Differential sampling can have considerable impacts on these global comparisons and call into question recent interpretations of latitudinal or hemispheric differences of diversification rates. Moreover, this review pinpoints understudied regions whose biota are in critical need of modern systematic analyses.

  15. Analyses of hydraulic performance of velocity caps

    DEFF Research Database (Denmark)

    Christensen, Erik Damgaard; Degn Eskesen, Mark Chr.; Buhrkall, Jeppe;

    2014-01-01

    The hydraulic performance of a velocity cap has been investigated. Velocity caps are often used in connection with offshore intakes. CFD (computational fluid dynamics) examined the flow through the cap openings and further down into the intake pipes. This was combined with dimension analyses...

  16. Uncertainty quantification approaches for advanced reactor analyses.

    Energy Technology Data Exchange (ETDEWEB)

    Briggs, L. L.; Nuclear Engineering Division

    2009-03-24

    The original approach to nuclear reactor design or safety analyses was to make very conservative modeling assumptions so as to ensure meeting the required safety margins. Traditional regulation, as established by the U. S. Nuclear Regulatory Commission required conservatisms which have subsequently been shown to be excessive. The commission has therefore moved away from excessively conservative evaluations and has determined best-estimate calculations to be an acceptable alternative to conservative models, provided the best-estimate results are accompanied by an uncertainty evaluation which can demonstrate that, when a set of analysis cases which statistically account for uncertainties of all types are generated, there is a 95% probability that at least 95% of the cases meet the safety margins. To date, nearly all published work addressing uncertainty evaluations of nuclear power plant calculations has focused on light water reactors and on large-break loss-of-coolant accident (LBLOCA) analyses. However, there is nothing in the uncertainty evaluation methodologies that is limited to a specific type of reactor or to specific types of plant scenarios. These same methodologies can be equally well applied to analyses for high-temperature gas-cooled reactors and to liquid metal reactors, and they can be applied to steady-state calculations, operational transients, or severe accident scenarios. This report reviews and compares both statistical and deterministic uncertainty evaluation approaches. Recommendations are given for selection of an uncertainty methodology and for considerations to be factored into the process of evaluating uncertainties for advanced reactor best-estimate analyses.
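One standard way to meet the 95%/95% criterion described above is non-parametric (Wilks-type) order statistics: run enough randomly sampled cases that, with 95% confidence, the worst observed result bounds the 95th percentile of the output. A minimal sketch of the first-order sample-size calculation (the function name is ours, not from the report):

```python
import math

def wilks_runs(coverage: float = 0.95, confidence: float = 0.95) -> int:
    """Number of code runs n such that the maximum of n random samples
    bounds the `coverage` quantile with probability `confidence`
    (first-order Wilks criterion: 1 - coverage**n >= confidence)."""
    return math.ceil(math.log(1 - confidence) / math.log(coverage))

print(wilks_runs())  # 59 runs for the classic 95%/95% criterion
```

With 59 best-estimate runs, the largest computed value can be taken as a 95%/95% tolerance bound, which is why 59 appears so often in best-estimate-plus-uncertainty analyses.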

  17. Metodekombination med kritisk analyse

    DEFF Research Database (Denmark)

    Bilfeldt, Annette

    2007-01-01

    "Metodekombination med kritisk teoretisk analyse" fremlægger en kombinationsmetode, der bygges op om en problemstilling med et normativt grundlag for studiet af køn og arbejdsliv. Med fokus på rationalitet og humaniseringsbarrierer i arbejdslivet lægges der med inspiration fra Marie Jahoda op til...

  18. Compilation of Sandia coal char combustion data and kinetic analyses

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, R.E.; Hurt, R.H.; Baxter, L.L.; Hardesty, D.R.

    1992-06-01

An experimental project was undertaken to characterize the physical and chemical processes that govern the combustion of pulverized coal chars. The experimental endeavor establishes a database on the reactivities of coal chars as a function of coal type, particle size, particle temperature, gas temperature, and gas composition. The project also provides a better understanding of the mechanism of char oxidation, and yields quantitative information on the release rates of nitrogen- and sulfur-containing species during char combustion. An accurate predictive engineering model of the overall char combustion process under technologically relevant conditions is a primary product of this experimental effort. This document summarizes the experimental effort, the approach used to analyze the data, and individual compilations of data and kinetic analyses for each of the parent coals investigated.
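Kinetic analyses of char reactivity data of the kind described above commonly fit an Arrhenius expression, k = A exp(-E/RT), to rate measurements at several particle temperatures. A minimal sketch of such a fit by linear least squares on ln k versus 1/T; the rate data below are invented for illustration, not Sandia's measurements:

```python
import numpy as np

# Hypothetical char oxidation rates: particle temperature [K] and
# observed burning rate coefficient [g/cm^2/s/atm]
T = np.array([1250.0, 1400.0, 1550.0, 1700.0])
k = np.array([0.011, 0.032, 0.075, 0.152])

# Fit ln k = ln A - E/(R*T) by least squares in 1/T
R = 8.314  # gas constant, J/mol/K
slope, intercept = np.polyfit(1.0 / T, np.log(k), 1)
E = -slope * R          # apparent activation energy, J/mol
A = np.exp(intercept)   # pre-exponential factor

print(f"E = {E / 1000:.0f} kJ/mol, A = {A:.3g}")
```

The fitted E and A then parameterize the char oxidation rate in an engineering combustion model over the measured temperature range.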

  19. Intron analyses reveal multiple calmodulin copies in Littorina.

    Science.gov (United States)

    Simpson, R J; Wilding, C S; Grahame, J

    2005-04-01

    Intron 3 and the flanking exons of the calmodulin gene have been amplified, cloned, and sequenced from 18 members of the gastropod genus Littorina. From the 48 sequences, at least five different gene copies have been identified and their functionality characterized using a strategy based upon the potential protein product predicted from flanking exon data. The functionality analyses suggest that four of the genes code for functional copies of calmodulin. All five copies have been identified across a wide range of littorinid species although not ubiquitously. Using this novel approach based on intron sequences, we have identified an unprecedented number of potential calmodulin copies in Littorina, exceeding that reported for any other invertebrate. This suggests a higher number of, and more ancient, gene duplications than previously detected in a single genus.

  20. Monte Carlo uncertainty analyses for integral beryllium experiments

    CERN Document Server

    Fischer, U; Tsige-Tamirat, H

    2000-01-01

    The novel Monte Carlo technique for calculating point detector sensitivities has been applied to two representative beryllium transmission experiments with the objective of investigating the sensitivity of important responses such as the neutron multiplication and of assessing the related uncertainties due to the underlying cross-section data uncertainties. As an important result, it has been revealed that the neutron multiplication power of beryllium can be predicted with good accuracy using state-of-the-art nuclear data evaluations. Severe discrepancies do exist for the spectral neutron flux distribution that would translate into significant uncertainties of the calculated neutron spectra and of the nuclear blanket performance in blanket design calculations. With regard to this, it is suggested to re-analyse the secondary energy and angle distribution data of beryllium by means of Monte Carlo based sensitivity and uncertainty calculations. Related code development work is underway.

  1. Photovoltaic System Modeling. Uncertainty and Sensitivity Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, Clifford W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Martin, Curtis E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    We report an uncertainty and sensitivity analysis for modeling AC energy from photovoltaic systems. Output from a PV system is predicted by a sequence of models. We quantify uncertainty in the output of each model using empirical distributions of each model's residuals. We propagate uncertainty through the sequence of models by sampling these distributions to obtain an empirical distribution of a PV system's output. We consider models that: (1) translate measured global horizontal, direct and global diffuse irradiance to plane-of-array irradiance; (2) estimate effective irradiance; (3) predict cell temperature; (4) estimate DC voltage, current and power; (5) reduce DC power for losses due to inefficient maximum power point tracking or mismatch among modules; and (6) convert DC to AC power. Our analysis considers a notional PV system comprising an array of FirstSolar FS-387 modules and a 250 kW AC inverter; we use measured irradiance and weather at Albuquerque, NM. We found the uncertainty in PV system output to be relatively small, on the order of 1% for daily energy. We found the uncertainty in the models for POA irradiance and effective irradiance to be the dominant contributors to uncertainty in predicted daily energy. Our analysis indicates that efforts to reduce the uncertainty in PV system output predictions may yield the greatest improvements by focusing on the POA and effective irradiance models.
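
    The residual-sampling scheme described above can be sketched as follows; the model chain is reduced to three stages and the residual distributions are invented for illustration (the real analysis uses empirical residuals from each of the six models):

    ```python
    import random

    random.seed(42)

    # Hypothetical empirical residuals (fractional errors) for three stages:
    # POA irradiance, effective irradiance, and DC-to-AC conversion.
    residuals = {
        "poa":       [-0.02, -0.01, 0.0, 0.01, 0.02],
        "effective": [-0.01, 0.0, 0.0, 0.01],
        "dc_to_ac":  [-0.005, 0.0, 0.005],
    }

    def predicted_energy(base_kwh=1000.0):
        """One Monte Carlo draw: apply a sampled residual at each model stage."""
        e = base_kwh
        for stage in ("poa", "effective", "dc_to_ac"):
            e *= 1.0 + random.choice(residuals[stage])
        return e

    draws = [predicted_energy() for _ in range(10000)]
    mean = sum(draws) / len(draws)
    spread = (max(draws) - min(draws)) / mean
    print(f"mean daily energy: {mean:.1f} kWh, relative spread: {spread:.3f}")
    ```

    Sampling the residuals of each stage, rather than assuming a parametric error model, is what lets the empirical distribution of system output reflect the actual model errors.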

  2. New environmental metabarcodes for analysing soil DNA

    DEFF Research Database (Denmark)

    Epp, Laura S.; Boessenkool, Sanne; Bellemain, Eva P.

    2012-01-01

    Metabarcoding approaches use total and typically degraded DNA from environmental samples to analyse biotic assemblages and can potentially be carried out for any kinds of organisms in an ecosystem. These analyses rely on specific markers, here called metabarcodes, which should be optimized … was systematically evaluated by (i) in silico PCRs using all standard sequences in the EMBL public database as templates, (ii) in vitro PCRs of DNA extracts from surface soil samples from a site in Varanger, northern Norway, and (iii) in vitro PCRs of DNA extracts from permanently frozen sediment samples of late … in the environmental DNA, but we found large variation in the level of detection among the groups and between modern and ancient samples. Success rates for the Pleistocene samples were highest for fungal DNA, whereas bryophyte, beetle and bird sequences could also be retrieved, but to a much lesser degree …

  3. Reliability of chemical analyses of water samples

    Energy Technology Data Exchange (ETDEWEB)

    Beardon, R.

    1989-11-01

    Ground-water quality investigations require reliable chemical analyses of water samples. Unfortunately, laboratory analytical results are often unreliable. The Uranium Mill Tailings Remedial Action (UMTRA) Project's solution to this problem was to establish a two-phase quality assurance program for the analysis of water samples. In the first phase, eight laboratories analyzed three solutions of known composition. The analytical accuracy of each laboratory was ranked and three laboratories were awarded contracts. The second phase consists of on-going monitoring of the reliability of the selected laboratories. The following conclusions are based on two years' experience with the UMTRA Project's Quality Assurance Program. The reliability of laboratory analyses should not be taken for granted. Analytical reliability may be independent of the prices charged by laboratories. Quality assurance programs benefit both the customer and the laboratory.
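
    The phase-one ranking described above — scoring laboratories against solutions of known composition — can be sketched as follows; all analyte values and laboratory names are hypothetical:

    ```python
    # Score each laboratory by its mean relative error against a solution of
    # known composition, then rank (best first). All values are invented.

    known = {"Ca": 40.0, "Mn": 0.05, "Fe": 0.30}  # mg/L, known solution

    reported = {  # hypothetical laboratory reports for the same solution
        "Lab A": {"Ca": 41.0, "Mn": 0.06, "Fe": 0.29},
        "Lab B": {"Ca": 39.5, "Mn": 0.05, "Fe": 0.33},
        "Lab C": {"Ca": 48.0, "Mn": 0.02, "Fe": 0.10},
    }

    def mean_relative_error(measured, truth):
        errs = [abs(measured[a] - truth[a]) / truth[a] for a in truth]
        return sum(errs) / len(errs)

    ranking = sorted(reported, key=lambda lab: mean_relative_error(reported[lab], known))
    print(ranking)  # lowest mean relative error first
    ```

    Relative (rather than absolute) error keeps trace analytes such as manganese from being swamped by major ions in the score.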

  4. Direct amino acid analyses of mozzarella cheese.

    Science.gov (United States)

    Hoskins, M N

    1985-12-01

    The amino acid content of mozzarella (low moisture, part skim milk) and asadero cheeses was determined by the column chromatographic method. Data from the direct analyses of the mozzarella cheeses were compared with the calculated amino acid composition reported in tables in Agriculture Handbook No. 8-1. Phenylalanine and tyrosine contents were found to be higher in the direct analyses than in the calculated data in Handbook No. 8-1 (1.390 gm and 1.127 gm for phenylalanine, and 1.493 gm and 1.249 gm for tyrosine per 100 gm edible portion, respectively). That is of particular concern in the dietary management of phenylketonuria, in which accuracy in computing levels of phenylalanine and tyrosine is essential.

  5. Analyses of cavitation instabilities in ductile metals

    DEFF Research Database (Denmark)

    Tvergaard, Viggo

    2007-01-01

    , and also tests for a thin ductile metal layer bonding two ceramic blocks have indicated rapid void growth. Analyses for these material configurations are discussed here. When the void radius is very small, a nonlocal plasticity model is needed to account for observed size-effects, and recent analyses...... analyzed for a material containing a periodic distribution of spherical voids with two different void sizes, where the stress fields around larger voids may accelerate the growth of smaller voids. Another approach has been an analysis of a unit cell model in which a central cavity is discretely represented......, while the surrounding voids are represented by a porous ductile material model in terms of a field quantity that specifies the variation of the void volume fraction in the surrounding metal....

  6. Pratique de l'analyse fonctionelle

    CERN Document Server

    Tassinari, Robert

    1997-01-01

    Mettre au point un produit ou un service qui soit parfaitement adapté aux besoins et aux exigences du client est indispensable pour l'entreprise. Pour ne rien laisser au hasard, il s'agit de suivre une méthodologie rigoureuse : celle de l'analyse fonctionnelle. Cet ouvrage définit précisément cette méthode ainsi que ses champs d'application. Il décrit les méthodes les plus performantes en termes de conception de produit et de recherche de qualité et introduit la notion d'analyse fonctionnelle interne. Un ouvrage clé pour optimiser les processus de conception de produit dans son entreprise. -- Idées clés, par Business Digest

  7. Three-dimensional lake water quality modeling: sensitivity and uncertainty analyses.

    Science.gov (United States)

    Missaghi, Shahram; Hondzo, Miki; Melching, Charles

    2013-11-01

    Two sensitivity and uncertainty analysis methods are applied to a three-dimensional coupled hydrodynamic-ecological model (ELCOM-CAEDYM) of a morphologically complex lake. The primary goals of the analyses are to increase confidence in the model predictions, identify influential model parameters, quantify the uncertainty of model prediction, and explore the spatial and temporal variabilities of model predictions. The influence of model parameters on four model-predicted variables (model output) and the contributions of each of the model-predicted variables to the total variations in model output are presented. Predicted water temperature, dissolved oxygen, total phosphorus, and algal biomass contributed 3, 13, 26, and 58% of total model output variance, respectively. The fraction of variance resulting from model parameter uncertainty was calculated by two methods and used for evaluation and ranking of the most influential model parameters. Nine out of the top 10 parameters identified by each method agreed, but their ranks were different. Spatial and temporal changes of model uncertainty were investigated and visualized. Model uncertainty appeared to be concentrated around specific water depths and dates that corresponded to significant storm events. The results suggest that spatial and temporal variations in the predicted water quality variables are sensitive to the hydrodynamics of physical perturbations such as those caused by stream inflows generated by storm events. The sensitivity and uncertainty analyses identified the mineralization of dissolved organic carbon, sediment phosphorus release rate, algal metabolic loss rate, internal phosphorus concentration, and phosphorus uptake rate as the most influential model parameters.
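
    The apportioning of total output variance among the four predicted variables can be illustrated with a minimal sketch; the per-variable variances below are hypothetical, chosen only to reproduce the quoted percentages:

    ```python
    # Express each predicted variable's variance as a fraction of the total
    # model-output variance. The variance values are illustrative placeholders.

    variances = {
        "water temperature": 0.03,
        "dissolved oxygen":  0.13,
        "total phosphorus":  0.26,
        "algal biomass":     0.58,
    }

    total = sum(variances.values())
    contributions = {v: 100.0 * s / total for v, s in variances.items()}
    for name, pct in sorted(contributions.items(), key=lambda kv: kv[1]):
        print(f"{name}: {pct:.0f}% of total output variance")
    ```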

  8. Center for Naval Analyses Annual Report 1982.

    Science.gov (United States)

    1982-01-01

    … recipients, alike, was due mainly to temporary rather than permanent layoffs; they were unemployed for about the same length of time, and their post-layoff … equal to 70 percent of average weekly wages for 52 weeks in the two years following layoff) apparently encouraged workers to remain unemployed longer … Institute for Defense Analyses. William A. Nierenberg, Director of the Scripps Institution of Oceanography. Member, NASA Advisory Council. Member …

  9. En Billig GPS Data Analyse Platform

    DEFF Research Database (Denmark)

    Andersen, Ove; Christiansen, Nick; Larsen, Niels T.;

    2011-01-01

    This article presents a complete software platform for the analysis of GPS data. The platform is built entirely from open-source components. The individual components of the platform are described in detail. Advantages and disadvantages of using open source are discussed, including which IT-policy initiatives … organisations with a digital road map and GPS data can begin to perform traffic analyses on these data. It is a requirement that suitable IT competences are present in the organisation.

  10. ANALYSING SPACE: ADAPTING AND EXTENDING MULTIMODAL SEMIOTICS

    Directory of Open Access Journals (Sweden)

    Louise J. Ravelli

    2015-07-01

    Full Text Available In the field of multimodal discourse analysis, one of the most exciting sites of application is that of 3D space: examining aspects of the built environment for its meaning-making potential. For the built environment – homes, offices, public buildings, parks, etc. – does indeed make meaning. These are spaces which speak – often their meanings are so familiar, we no longer hear what they say; sometimes, new and unusual sites draw attention to their meanings, and they are hotly contested. This chapter will suggest ways of analyzing 3D texts, based on the framework of Kress and van Leeuwen (2006). This framework, developed primarily for the analysis of 2D images, has been successfully extended to a range of other multimodal texts. Extension to the built environment includes Pang (2004), O'Toole (1994), Ravelli (2006), Safeyton (2004), Stenglin (2004) and White (1994), whose studies will inform the analyses presented here. This article will identify some of the key theoretical principles which underlie this approach, including the notions of text, context and metafunction, and will describe some of the main areas of analysis for 3D texts. Also, ways of bringing the analyses together will be considered. The analyses will be demonstrated in relation to the Scientia building at the University of New South Wales, Australia.

  11. Predicting the unpredictable.

    Science.gov (United States)

    Bonabeau, Eric

    2002-03-01

    The collective behavior of people in crowds, markets, and organizations has long been a mystery. Why, for instance, do employee bonuses sometimes lead to decreases in productivity? Why do some products generate tremendous buzz, seemingly out of nowhere, while others languish despite multimillion-dollar marketing campaigns? How could a simple clerical error snowball into a catastrophic loss that bankrupts a financial institution? Traditional approaches like spreadsheet and regression analyses have failed to explain such "emergent phenomena," says Eric Bonabeau, because they work from the top down, trying to apply global equations and frameworks to a particular situation. But the behavior of emergent phenomena, contends Bonabeau, is formed from the bottom up--starting with the local interactions of individuals who alter their actions in response to other participants. Together, the myriad interactions result in a group behavior that can easily elude any top-down analysis. But now, thanks to "agent-based modeling," some companies are finding ways to analyze--and even predict--emergent phenomena. Macy's, for instance, has used the technology to investigate better ways to design its department stores. Hewlett-Packard has run agent-based simulations to anticipate how changes in its hiring strategy would affect its corporate culture. And Société Générale has used the technology to determine the operational risk of its asset management group. This article discusses emergent phenomena in detail and explains why they have become more prevalent in recent years. In addition to providing real-world examples of companies that have improved their business practices through agent-based modeling, Bonabeau also examines the future of this technology and points to several fields that may be revolutionized by its use.
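
    The bottom-up character of agent-based modeling that Bonabeau describes can be illustrated with a minimal sketch (not any of the company models mentioned): agents on a ring adopt a product through purely local neighbour interactions, and aggregate "buzz" emerges without any global equation:

    ```python
    import random

    # Minimal agent-based model: each agent may adopt once a neighbour on the
    # ring has adopted. All parameters are illustrative.
    random.seed(1)
    N = 100
    adopted = [False] * N
    for seed in random.sample(range(N), 5):   # a few initial adopters
        adopted[seed] = True

    def step(state):
        """One synchronous update: adoption spreads locally with probability 0.5."""
        new = state[:]
        for i in range(N):
            neighbours = [state[(i - 1) % N], state[(i + 1) % N]]
            if not state[i] and any(neighbours) and random.random() < 0.5:
                new[i] = True
        return new

    for _ in range(50):
        adopted = step(adopted)

    print(f"adoption after 50 steps: {sum(adopted)}/{N}")
    ```

    No agent knows the global adoption level, yet the population-level curve that emerges is exactly the kind of phenomenon a top-down regression would struggle to explain.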

  12. Analyses of the OSU-MASLWR Experimental Test Facility

    Directory of Open Access Journals (Sweden)

    F. Mascari

    2012-01-01

    Full Text Available Today, considering the sustainability of nuclear technology in the energy mix policy of developing and developed countries, the international community has started the development of new advanced reactor designs. In this framework, Oregon State University (OSU) has constructed a system-level test facility to examine natural circulation phenomena of importance to the multi-application small light water reactor (MASLWR) design, a small modular pressurized water reactor (PWR) relying on natural circulation during both steady-state and transient operation. The target of this paper is to give a review of the main characteristics of the experimental facility, to analyse the main phenomena characterizing the tests already performed and the potential transients that could be investigated in the facility, and to describe the current IAEA International Collaborative Standard Problem that is being hosted at OSU, for which experimental data will be collected at the OSU-MASLWR test facility. A summary of the best estimate thermal hydraulic system code analyses, already performed, to analyze the codes' capability in predicting the phenomena typical of the MASLWR prototype, thermal hydraulically characterized in the OSU-MASLWR facility, is presented as well.

  13. Consumer brand choice: individual and group analyses of demand elasticity.

    Science.gov (United States)

    Oliveira-Castro, Jorge M; Foxall, Gordon R; Schrezenmaier, Teresa C

    2006-03-01

    Following the behavior-analytic tradition of analyzing individual behavior, the present research investigated demand elasticity of individual consumers purchasing supermarket products, and compared individual and group analyses of elasticity. Panel data from 80 UK consumers purchasing 9 product categories (i.e., baked beans, biscuits, breakfast cereals, butter, cheese, fruit juice, instant coffee, margarine and tea) during a 16-week period were used. Elasticity coefficients were calculated for individual consumers with data from all or only 1 product category (intra-consumer elasticities), and for each product category using all data points from all consumers (overall product elasticity) or 1 average data point per consumer (inter-consumer elasticity). In addition to this, split-sample elasticity coefficients were obtained for each individual with data from all product categories purchased during weeks 1 to 8 and 9 to 16. The results suggest that: 1) demand elasticity coefficients calculated for individual consumers purchasing supermarket food products are compatible with predictions from economic theory and behavioral economics; 2) overall product elasticities, typically employed in marketing and econometric research, include effects of inter-consumer and intra-consumer elasticities; 3) when comparing demand elasticities of different product categories, group and individual analyses yield similar trends; and 4) individual differences in demand elasticity are relatively consistent across time, but do not seem to be consistent across products. These results demonstrate the theoretical, methodological, and managerial relevance of investigating the behavior of individual consumers.
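
    A demand-elasticity coefficient of the kind calculated above is conventionally estimated as the slope of a log-log regression of quantity on price; a minimal sketch with invented data:

    ```python
    import math

    # Toy purchase data (not the UK panel used in the study).
    prices     = [1.00, 1.10, 1.20, 1.35, 1.50]   # hypothetical unit prices
    quantities = [10.0,  9.2,  8.6,  7.8,  7.1]   # hypothetical amounts bought

    # Ordinary least squares on (log price, log quantity): the slope is the
    # demand elasticity — the % change in quantity per % change in price.
    x = [math.log(p) for p in prices]
    y = [math.log(q) for q in quantities]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = sum((xi - mx) ** 2 for xi in x)
    slope = num / den

    print(f"estimated demand elasticity: {slope:.2f}")
    ```

    A slope between 0 and -1 (inelastic demand) is the pattern economic theory predicts for staple food products.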

  14. On the accuracy of population analyses based on fitted densities.

    Science.gov (United States)

    de la Lande, Aurélien; Clavaguéra, Carine; Köster, Andreas

    2017-04-01

    Population analyses are part of the theoretical chemist's toolbox. They provide means to extract information about the repartition of the electronic density among molecules or solids. The values of atomic multipoles in a molecule can shed light on its electrostatic properties and may help to predict how different molecules could interact or to rationalize chemical reactivity, for instance. Not being physical observables to which a quantum mechanical operator can be associated, atomic charges and higher order atomic multipoles cannot be defined unambiguously in a molecule, and therefore several population schemes (PS) have been devised in the last decades. In the context of density functional theory (DFT), PS based on the electron density seem to be best grounded. In particular, some groups have proposed various iterative schemes the outcomes of which are very encouraging. Modern implementations of DFT that are, for example, based on density fitting techniques permit the investigation of molecular systems comprising hundreds of atoms. However, population analyses following iterative schemes may become very CPU time consuming for such large systems. In this article, we investigate whether the computationally less expensive analyses of the variationally fitted electronic densities can be safely carried out instead of the Kohn-Sham density. It is shown that as long as flexible auxiliary function sets including f and g functions are used, the multipoles extracted from the fitted densities are extremely close to those obtained from the KS density. We further assess whether the multipoles obtained through Hirshfeld's approach, in its standard or iterative form, can be a useful approach to calculate interaction energies in non-covalent complexes. Relative energies computed with the AMOEBA polarizable force field combined with iterative Hirshfeld multipoles are encouraging.

  15. Pegasys: software for executing and integrating analyses of biological sequences

    Directory of Open Access Journals (Sweden)

    Lett Drew

    2004-04-01

    Full Text Available Abstract Background We present Pegasys – a flexible, modular and customizable software system that facilitates the execution and data integration from heterogeneous biological sequence analysis tools. Results The Pegasys system includes numerous tools for pair-wise and multiple sequence alignment, ab initio gene prediction, RNA gene detection, masking repetitive sequences in genomic DNA as well as filters for database formatting and processing raw output from various analysis tools. We introduce a novel data structure for creating workflows of sequence analyses and a unified data model to store its results. The software allows users to dynamically create analysis workflows at run-time by manipulating a graphical user interface. All non-serial dependent analyses are executed in parallel on a compute cluster for efficiency of data generation. The uniform data model and backend relational database management system of Pegasys allow for results of heterogeneous programs included in the workflow to be integrated and exported into General Feature Format for further analyses in GFF-dependent tools, or GAME XML for import into the Apollo genome editor. The modularity of the design allows for new tools to be added to the system with little programmer overhead. The database application programming interface allows programmatic access to the data stored in the backend through SQL queries. Conclusions The Pegasys system enables biologists and bioinformaticians to create and manage sequence analysis workflows. The software is released under the Open Source GNU General Public License. All source code and documentation is available for download at http://bioinformatics.ubc.ca/pegasys/.
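
    The scheduling idea — executing all non-serially-dependent analyses in parallel — can be sketched as a topological layering of the workflow graph; the analysis names below are illustrative, not Pegasys's actual tool identifiers:

    ```python
    # Group workflow analyses into batches; every analysis in a batch has all
    # its dependencies satisfied, so the batch could run concurrently on a
    # compute cluster. The dependency graph is invented for illustration.

    deps = {                      # analysis -> analyses it depends on
        "mask_repeats": [],
        "gene_prediction": ["mask_repeats"],
        "rna_detection": ["mask_repeats"],
        "alignment": [],
        "integrate_gff": ["gene_prediction", "rna_detection", "alignment"],
    }

    def parallel_batches(deps):
        """Topologically layer the DAG into concurrently runnable batches."""
        done, batches = set(), []
        while len(done) < len(deps):
            ready = sorted(a for a in deps
                           if a not in done and all(d in done for d in deps[a]))
            if not ready:
                raise ValueError("cycle in workflow")
            batches.append(ready)
            done.update(ready)
        return batches

    print(parallel_batches(deps))
    ```

    Here masking and alignment run together in the first batch, the two gene-finding analyses in the second, and the integration step waits for everything it consumes.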

  16. COMPARATIVE ANALYSES OF MORPHOLOGICAL CHARACTERS IN SPHAERODORIDAE AND ALLIES (ANNELIDA REVEALED BY AN INTEGRATIVE MICROSCOPICAL APPROACH

    Directory of Open Access Journals (Sweden)

    Conrad eHelm

    2015-01-01

    Full Text Available Sphaerodoridae is a group of benthic marine worms (Annelida) characterized by the presence of spherical tubercles covering their whole surface. They are commonly considered as belonging to Phyllodocida, although sistergroup relationships are still far from being understood. Primary homology assessment of their morphological features is lacking, hindering the appraisal of evolutionary relationships between taxa. Therefore, our detailed morphological investigation focuses on different Sphaerodoridae as well as on other members of Phyllodocida using an integrative approach combining scanning electron microscopy (SEM) as well as immunohistochemistry with standard neuronal (anti-5-HT) and muscular (phalloidin-rhodamine) markers and subsequent CLSM analysis of whole mounts and sections. Furthermore, we provide histological (HES) and light microscopical data to shed light on the structures and hypothetical function of sphaerodorid key morphological features. We provide fundamental insights into sphaerodorid morphology supporting a Phyllodocida ancestry of these enigmatic worms. However, the muscular arrangement and the presence of an axial muscular pharynx are similar to conditions observed in other members of the Errantia too. Furthermore, nervous system and muscle staining as well as SEM and histological observations of different types of tubercles indicate a homology of the so-called microtubercles, present in the long-bodied sphaerodorids, to the dorsal cirri of other Errantia. The macrotubercles seem to represent a sphaerodorid autapomorphy based on our investigations. Therefore, our results allow comparisons concerning morphological patterns between Sphaerodoridae and other Phyllodocida and constitute a starting point for further comparative investigations to reveal the evolution of the remarkable Sphaerodoridae.

  18. Genomic Prediction in Barley

    DEFF Research Database (Denmark)

    Edriss, Vahid; Cericola, Fabio; Jensen, Jens D;

    2015-01-01

    Genomic prediction uses markers (SNPs) across the whole genome to predict individual breeding values at an early growth stage potentially before large scale phenotyping. One of the applications of genomic prediction in plant breeding is to identify the best individual candidate lines to contribut...

  19. How to Establish Clinical Prediction Models

    Directory of Open Access Journals (Sweden)

    Yong-ho Lee

    2016-03-01

    Full Text Available A clinical prediction model can be applied to several challenging clinical scenarios: screening high-risk individuals for asymptomatic disease, predicting future events such as disease or death, and assisting medical decision-making and health education. Despite the impact of clinical prediction models on practice, prediction modeling is a complex process requiring careful statistical analyses and sound clinical judgement. Although there is no definite consensus on the best methodology for model development and validation, a few recommendations and checklists have been proposed. In this review, we summarize five steps for developing and validating a clinical prediction model: preparation for establishing clinical prediction models; dataset selection; handling variables; model generation; and model evaluation and validation. We also review several studies that detail methods for developing clinical prediction models with comparable examples from real practice. After model development and vigorous validation in relevant settings, possibly with evaluation of utility/usability and fine-tuning, good models can be ready for use in practice. We anticipate that this framework will revitalize the use of predictive or prognostic research in endocrinology, leading to active applications in real clinical practice.
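
    The develop-then-validate cycle summarized above can be sketched on simulated data; this is a toy one-variable logistic model with a held-out validation split, not a recommendation for any particular clinical setting:

    ```python
    import math, random

    # Simulate a single continuous risk factor x and a binary outcome whose
    # true risk follows a logistic curve. Data are invented, not clinical.
    random.seed(0)
    data = [(x, 1 if random.random() < 1 / (1 + math.exp(-(x - 5))) else 0)
            for x in [random.uniform(0, 10) for _ in range(400)]]
    train, test = data[:300], data[300:]       # development vs validation split

    # Model generation: fit logistic regression by plain gradient descent.
    w, b = 0.0, 0.0
    for _ in range(2000):
        gw = gb = 0.0
        for x, y in train:
            p = 1 / (1 + math.exp(-(w * x + b)))
            gw += (p - y) * x
            gb += (p - y)
        w -= 0.001 * gw / len(train)
        b -= 0.001 * gb / len(train)

    # Model validation: discrimination on the held-out split (Mann-Whitney AUC).
    def auc(pairs):
        pos = [w * x + b for x, y in pairs if y == 1]
        neg = [w * x + b for x, y in pairs if y == 0]
        wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
        return wins / (len(pos) * len(neg))

    print(f"validation AUC: {auc(test):.2f}")
    ```

    In practice the validation step would also cover calibration and, ideally, an external dataset; this sketch shows only the split-and-discriminate core of the cycle.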

  20. Department of Energy's team's analyses of Soviet designed VVERs

    Energy Technology Data Exchange (ETDEWEB)

    1989-09-01

    This document provides Appendices A through K of this report. The topics discussed are, respectively: radiation-induced embrittlement and annealing of reactor pressure vessel steels; loss-of-coolant accident blowdown analyses; LOCA blowdown response analyses; non-seismic structural response analyses; seismic analyses; S'' seal integrity; reactor transient analyses; fire protection; aircraft impacts; and boric acid induced corrosion. (FI).

  1. Statistical analyses in disease surveillance systems.

    Science.gov (United States)

    Lescano, Andres G; Larasati, Ria Purwita; Sedyaningsih, Endang R; Bounlu, Khanthong; Araujo-Castillo, Roger V; Munayco-Escate, Cesar V; Soto, Giselle; Mundaca, C Cecilia; Blazes, David L

    2008-11-14

    The performance of disease surveillance systems is evaluated and monitored using a diverse set of statistical analyses throughout each stage of surveillance implementation. An overview of their main elements is presented, with a specific emphasis on syndromic surveillance directed to outbreak detection in resource-limited settings. Statistical analyses are proposed for three implementation stages: planning, early implementation, and consolidation. Data sources and collection procedures are described for each analysis. During the planning and pilot stages, we propose to estimate the average data collection, data entry and data distribution time. This information can be collected by surveillance systems themselves or through specially designed surveys. During the initial implementation stage, epidemiologists should study the completeness and timeliness of the reporting, and describe thoroughly the population surveyed and the epidemiology of the health events recorded. Additional data collection processes or external data streams are often necessary to assess reporting completeness and other indicators. Once data collection processes are operating in a timely and stable manner, analyses of surveillance data should expand to establish baseline rates and detect aberrations. External investigations can be used to evaluate whether abnormally increased case frequency corresponds to a true outbreak, and thereby establish the sensitivity and specificity of aberration detection algorithms. Statistical methods for disease surveillance have focused mainly on the performance of outbreak detection algorithms without sufficient attention to data quality and representativeness, two factors that are especially important in developing countries. It is important to assess data quality at each stage of implementation using a diverse mix of data sources and analytical methods. Careful, close monitoring of selected indicators is needed to evaluate whether systems are reaching their
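
    Once baseline rates are established, a simple aberration-detection rule of the kind discussed can be sketched as follows; the weekly counts are invented:

    ```python
    import statistics

    # Flag any week whose case count exceeds the historical mean by more than
    # three standard deviations. Counts below are illustrative only.
    history = [12, 15, 11, 14, 13, 16, 12, 14, 13, 15]  # weekly case counts

    baseline = statistics.mean(history)
    sd = statistics.stdev(history)
    threshold = baseline + 3 * sd

    def is_aberration(count):
        return count > threshold

    print(f"baseline {baseline:.1f}, threshold {threshold:.1f}")
    print(is_aberration(14), is_aberration(40))
    ```

    Real systems typically use more robust variants (seasonal baselines, moving windows such as the EARS C-family), but the structure — baseline, threshold, flag — is the same, and the flagged weeks are what external investigations then confirm or refute.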

  2. Stable isotopic analyses in paleoclimatic reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Wigand, P.E. [Univ. and Community College System of Nevada, Reno, NV (United States)

    1995-09-01

    Most traditional paleoclimatic proxy data have inherent time lags between climatic input and system response that constrain their use in accurate reconstruction of paleoclimate chronology, scaling of its variability, and the elucidation of the processes that determine its impact on the biotic and abiotic environment. With the exception of dendroclimatology, and studies of short-lived organisms and pollen recovered from annually varved lacustrine sediments, significant periods of time ranging from years, to centuries, to millennia may intervene between climate change and its first manifestation in paleoclimatic proxy data records. Reconstruction of past climate through changes in plant community composition derived from pollen sequences and plant remains from ancient woodrat middens, wet environments and dry caves all suffer from these lags. However, stable isotopic analyses can provide more immediate indication of biotic response to climate change. Evidence of past physiological response of organisms to changes in effective precipitation as climate varies can be provided by analyses of the stable isotopic content of plant macrofossils from various contexts. These analyses consider variation in the stable isotopic (hydrogen, oxygen and carbon) content of plant tissues as it reflects (1) past global or local temperature through changes in meteoric (rainfall) water chemistry in the case of the first two isotopes, and (2) plant stress through changes in plant respiration/transpiration processes under differing water availability, and varying atmospheric CO2 composition (which itself may actually be a net result of biotic response to climate change). Studies currently being conducted in the Intermountain West indicate both long- and short-term responses that, when calibrated with modern analogue studies, have the potential of revealing not only the timing of climate events, but their direction, magnitude and rapidity.

  3. Externalizing Behaviour for Analysing System Models

    DEFF Research Database (Denmark)

    Ivanova, Marieta Georgieva; Probst, Christian W.; Hansen, René Rydhof

    2013-01-01

    System models have recently been introduced to model organisations and evaluate their vulnerability to threats and especially insider threats. Especially for the latter these models are very suitable, since insiders can be assumed to have more knowledge about the attacked organisation than outside attackers. Therefore, many attacks are considerably easier to be performed for insiders than for outsiders. However, current models do not support explicit specification of different behaviours. Instead, behaviour is deeply embedded in the analyses supported by the models, meaning that it is a complex …

  4. Visuelle Analyse von Eye-Tracking-Daten

    OpenAIRE

    2011-01-01

    Eye tracking is one of the most frequently used techniques for analysing human-computer interaction and for studying perception. The captured eye-tracking data are usually analysed with heat maps or scan paths in order to determine the usability of the tested application or to draw conclusions about higher cognitive processes. The goal of this diploma thesis is the development of new visualization techniques for eye-tracking data, together with the development of a study concept …

  5. Environmental monitoring final report: groundwater chemical analyses

    Energy Technology Data Exchange (ETDEWEB)

    1984-02-01

    This report presents the results of analyses of groundwater quality at the SRC-I Demonstration Plant site in Newman, Kentucky. Samples were obtained from a network of 23 groundwater observation wells installed during previous studies. The groundwater was well within US EPA Interim Primary Drinking Water Standards for trace metals, radioactivity, and pesticides, but exceeded the standard for coliform bacteria. Several US EPA Secondary Drinking Water Standards were exceeded, namely, manganese, color, iron, and total dissolved solids. Based on the results, Dames and Moore recommend that all wells should be sterilized and those wells built in 1980 should be redeveloped. 1 figure, 6 tables.

  6. Analyses of containment structures with corrosion damage

    Energy Technology Data Exchange (ETDEWEB)

    Cherry, J.L.

    1996-12-31

    Corrosion damage to a nuclear power plant containment structure can degrade the pressure capacity of the vessel. For the low-carbon, low-strength steels used in containments, the effect of corrosion on material properties is discussed. Strain-to-failure tests, in uniaxial tension, have been performed on corroded material samples. Results were used to select strain-based failure criteria for corroded steel. Using the ABAQUS finite element analysis code, the capacity of a typical PWR Ice Condenser containment with corrosion damage has been studied. Multiple analyses were performed, with the locations of the corrosion on the containment and the amount of corrosion varied in each analysis.

  7. Introduction: Analysing Emotion and Theorising Affect

    Directory of Open Access Journals (Sweden)

    Peta Tait

    2016-08-01

    Full Text Available This discussion introduces ideas of emotion and affect for a volume of articles demonstrating the scope of approaches used in their study within the humanities and creative arts. The volume offers multiple perspectives on emotion and affect within 20th-century and 21st-century texts, arts and organisations and their histories. The discussion explains how emotion encompasses the emotions, emotional feeling, sensation and mood and how these can be analysed particularly in relation to literature, art and performance. It briefly summarises concepts of affect theory within recent approaches before introducing the articles.

  8. Transportation systems analyses: Volume 1: Executive Summary

    Science.gov (United States)

    1993-05-01

    The principal objective of this study is to accomplish a systems engineering assessment of the nation's space transportation infrastructure. This analysis addresses the necessary elements to perform man delivery and return, cargo transfer, cargo delivery, payload servicing, and the exploration of the Moon and Mars. Specific elements analyzed, but not limited to, include the Space Exploration Initiative (SEI), the National Launch System (NLS), the current expendable launch vehicle (ELV) fleet, ground facilities, the Space Station Freedom (SSF), and other civil, military and commercial payloads. The performance of this study entails maintaining a broad perspective on the large number of transportation elements that could potentially comprise the U.S. space infrastructure over the next several decades. To perform this systems evaluation, top-level trade studies are conducted to enhance our understanding of the relationships between elements of the infrastructure. This broad 'infrastructure-level perspective' permits the identification of preferred infrastructures. Sensitivity analyses are performed to assure the credibility and usefulness of study results. This executive summary of the transportation systems analyses (TSM) semi-annual report addresses the SSF logistics resupply. Our analysis parallels the ongoing NASA SSF redesign effort. Therefore, there could be no SSF design to drive our logistics analysis. Consequently, the analysis attempted to bound the reasonable SSF design possibilities (and the subsequent transportation implications). No other strategy really exists until after a final decision is rendered on the SSF configuration.

  9. ANALYSES ON SYSTEMATIC CONFRONTATION OF FIGHTER AIRCRAFT

    Institute of Scientific and Technical Information of China (English)

    Huai Jinpeng; Wu Zhe; Huang Jun

    2002-01-01

    Analyses of the systematic confrontation between two military forces are the highest hierarchy in operational effectiveness studies of weapon systems. The physical model for tactical many-on-many engagements of an aerial warfare with heterogeneous fighter aircraft is established. On the basis of the Lanchester multivariate equations of the square law, a mathematical model corresponding to the established physical model is given. A superiority parameter is then derived directly from the mathematical model. In view of the high-tech conditions of modern warfare, the concept of the superiority parameter, which more truly reflects the essence of an air-to-air engagement, is further formulated. The attrition coefficients, which are key to the differential equations, are determined by using tactics of random target assignment and the air-to-air capability index of the fighter aircraft. Hereby, taking the mathematical model and superiority parameter as cores, calculations and analyses of complicated systemic problems such as evaluation of battle superiority, prognostication of combat process and optimization of collocations have been accomplished. Results indicate that a classical combat theory, with certain recent developments, has received newer applications in military operations research for complicated confrontation analysis issues.
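
    The two-force square-law special case underlying such models can be sketched numerically; the attrition coefficients and force sizes below are illustrative inventions, not values from the paper:

```python
# Illustrative two-force Lanchester square-law engagement.
# dR/dt = -b * B,  dB/dt = -a * R, with attrition coefficients a, b.
# The square law conserves a*R^2 - b*B^2, so the side whose term is
# larger at the start is predicted to win.

def lanchester(R0, B0, a, b, dt=1e-3):
    """Euler-integrate until one force is annihilated; return survivors."""
    R, B = float(R0), float(B0)
    while R > 0 and B > 0:
        R, B = R - b * B * dt, B - a * R * dt
    return max(R, 0.0), max(B, 0.0)

a, b = 0.02, 0.01          # hypothetical attrition coefficients
R0, B0 = 100, 120          # initial force sizes
R, B = lanchester(R0, B0, a, b)
invariant = a * R0**2 - b * B0**2   # > 0 predicts the R side wins
print(invariant, round(R, 1), B)
```

Despite being outnumbered, the R side wins here because its attrition coefficient makes a\*R0^2 the larger term; the surviving strength is close to sqrt(invariant/a).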

  10. Autisme et douleur – analyse bibliographique

    Science.gov (United States)

    Dubois, Amandine; Rattaz, Cécile; Pry, René; Baghdadli, Amaria

    2010-01-01

    This literature review aims to take stock of published work in the field of pain and autism. The article first addresses published studies on the modes of pain expression observed in this population. Various hypotheses that may explain the expressive particularities of people with autism are then reviewed: endorphin excess, particularities in sensory processing, and socio-communicative deficit. The review ends with the question of assessing and taking pain into account in people with autism. The authors conclude that the results of published studies lack homogeneity and that further research is needed to reach consensual data in a field of study still little explored scientifically. On a clinical level, deepening knowledge in this field should make it possible to develop pain assessment tools and thus ensure better day-to-day management of pain. PMID:20808970

  11. Fractal and multifractal analyses of bipartite networks.

    Science.gov (United States)

    Liu, Jin-Long; Wang, Jian; Yu, Zu-Guo; Xie, Xian-Hua

    2017-03-31

    Bipartite networks have attracted considerable interest in various fields. Fractality and multifractality of unipartite (classical) networks have been studied in recent years, but these properties of bipartite networks have not yet been studied. In this paper, we try to unfold the self-similarity structure of bipartite networks by performing fractal and multifractal analyses for a variety of real-world bipartite network data sets and models. First, we find fractality in some bipartite networks, including the CiteULike, Netflix, MovieLens (ml-20m) and Delicious data sets and the (u, v)-flower model. Meanwhile, we observe shifted power-law or exponential behavior in several other networks. We then focus on the multifractal properties of bipartite networks. Our results indicate that multifractality exists in those bipartite networks possessing fractality. To capture the inherent attribute of bipartite networks with two different types of nodes, we give different weights to the nodes of the different classes, and show the existence of multifractality in these node-weighted bipartite networks. In addition, for the data sets with ratings, we modify the two existing algorithms for fractal and multifractal analyses of edge-weighted unipartite networks to study the self-similarity of the corresponding edge-weighted bipartite networks. The results show that our modified algorithms are feasible and can effectively uncover the self-similarity structure of these edge-weighted bipartite networks and their corresponding node-weighted versions.
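
    The box-covering idea behind such fractal analyses can be illustrated on a toy graph. The greedy ball-covering below is a generic sketch, not the authors' algorithm; on a 1-dimensional path graph the estimated box dimension comes out near 1 (slightly below, because the leftmost-seed greedy choice is suboptimal at the chain ends):

```python
# Greedy box covering: count boxes N_B(l) of diameter l (in hops) needed
# to cover all nodes; a power law N_B ~ l^(-d_B) indicates fractality.
import math
from collections import deque

def ball(adj, src, radius):
    """BFS: the set of nodes within `radius` hops of src."""
    seen, frontier = {src}, deque([(src, 0)])
    while frontier:
        node, d = frontier.popleft()
        if d == radius:
            continue
        for nxt in adj[node]:
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, d + 1))
    return seen

def box_count(adj, diameter):
    """Greedily cover the graph with balls of radius (diameter-1)//2."""
    radius = (diameter - 1) // 2
    uncovered, boxes = set(adj), 0
    while uncovered:
        seed = min(uncovered)          # deterministic seed choice
        uncovered -= ball(adj, seed, radius)
        boxes += 1
    return boxes

# A path graph: adjacency of a 1-dimensional chain of N nodes.
N = 2000
adj = {i: [j for j in (i - 1, i + 1) if 0 <= j < N] for i in range(N)}
sizes = [3, 5, 9, 17]
counts = [box_count(adj, l) for l in sizes]
slope = (math.log(counts[-1]) - math.log(counts[0])) / \
        (math.log(sizes[-1]) - math.log(sizes[0]))
d_B = -slope                           # estimated box dimension
print(counts, round(d_B, 2))
```

The same counting applied to a genuinely fractal network would yield a non-trivial d_B from the log-log slope over many box sizes.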

  12. Bioinformatics tools for analysing viral genomic data.

    Science.gov (United States)

    Orton, R J; Gu, Q; Hughes, J; Maabar, M; Modha, S; Vattipally, S B; Wilkie, G S; Davison, A J

    2016-04-01

    The field of viral genomics and bioinformatics is experiencing a strong resurgence due to high-throughput sequencing (HTS) technology, which enables the rapid and cost-effective sequencing and subsequent assembly of large numbers of viral genomes. In addition, the unprecedented power of HTS technologies has enabled the analysis of intra-host viral diversity and quasispecies dynamics in relation to important biological questions on viral transmission, vaccine resistance and host jumping. HTS also enables the rapid identification of both known and potentially new viruses from field and clinical samples, thus adding new tools to the fields of viral discovery and metagenomics. Bioinformatics has been central to the rise of HTS applications because new algorithms and software tools are continually needed to process and analyse the large, complex datasets generated in this rapidly evolving area. In this paper, the authors give a brief overview of the main bioinformatics tools available for viral genomic research, with a particular emphasis on HTS technologies and their main applications. They summarise the major steps in various HTS analyses, starting with quality control of raw reads and encompassing activities ranging from consensus and de novo genome assembly to variant calling and metagenomics, as well as RNA sequencing.

  13. Sensitivity in risk analyses with uncertain numbers.

    Energy Technology Data Exchange (ETDEWEB)

    Tucker, W. Troy; Ferson, Scott

    2006-06-01

    Sensitivity analysis is a study of how changes in the inputs to a model influence the results of the model. Many techniques have recently been proposed for use when the model is probabilistic. This report considers the related problem of sensitivity analysis when the model includes uncertain numbers that can involve both aleatory and epistemic uncertainty and the method of calculation is Dempster-Shafer evidence theory or probability bounds analysis. Some traditional methods for sensitivity analysis generalize directly for use with uncertain numbers, but, in some respects, sensitivity analysis for these analyses differs from traditional deterministic or probabilistic sensitivity analyses. A case study of a dike reliability assessment illustrates several methods of sensitivity analysis, including traditional probabilistic assessment, local derivatives, and a "pinching" strategy that hypothetically reduces the epistemic uncertainty or aleatory uncertainty, or both, in an input variable to estimate the reduction of uncertainty in the outputs. The prospects for applying the methods to black box models are also considered.
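
    The "pinching" strategy can be sketched with a toy Monte Carlo model; the dike-style margin function and the input distributions below are hypothetical, chosen only to show how pinching one input at a time ranks their contributions to output uncertainty:

```python
# "Pinch" an uncertain input to a fixed value and measure how much the
# output spread shrinks: the larger the shrinkage, the more sensitive
# the output is to that input.
import random
import statistics

def margin(load, strength):
    """Hypothetical safety margin of a structure."""
    return strength - load

def output_spread(pin_load=None, pin_strength=None, n=50_000, seed=1):
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        load = pin_load if pin_load is not None else rng.gauss(5.0, 2.0)
        strength = pin_strength if pin_strength is not None else rng.gauss(9.0, 0.5)
        out.append(margin(load, strength))
    return statistics.stdev(out)

base = output_spread()
pinched_load = output_spread(pin_load=5.0)          # freeze the load
pinched_strength = output_spread(pin_strength=9.0)  # freeze the strength
# Pinching the load removes far more output spread than pinching the
# strength, so the margin is most sensitive to the load.
print(round(base, 2), round(pinched_load, 2), round(pinched_strength, 2))
```

In the report's setting the pinched quantity is an uncertain number (an interval or p-box) rather than a point value, but the bookkeeping is the same: compare output uncertainty before and after pinching.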

  14. Hitchhikers’ guide to analysing bird ringing data

    Directory of Open Access Journals (Sweden)

    Harnos Andrea

    2015-12-01

    Full Text Available Bird ringing datasets constitute possibly the largest source of temporal and spatial information on vertebrate taxa available on the globe. Initially, the method was invented to understand avian migration patterns. However, data deriving from bird ringing have been used in an array of other disciplines including population monitoring, changes in demography, conservation management and the study of the effects of climate change, to name a few. Despite the widespread usage and importance, there are no guidelines available specifically describing the practice of data management, preparation and analyses of ringing datasets. Here, we present the first of a series of comprehensive tutorials that may help fill this gap. We describe in detail and through a real-life example the intricacies of data cleaning and how to create a data table ready for analyses from raw ringing data in the R software environment. Moreover, we created and present here the R package ringR, designed to carry out various specific tasks and plots related to bird ringing data. Most methods described here can also be applied to a wide range of capture-recapture type data based on individual marking, regardless of taxa or research question.

  15. Fractal and multifractal analyses of bipartite networks

    Science.gov (United States)

    Liu, Jin-Long; Wang, Jian; Yu, Zu-Guo; Xie, Xian-Hua

    2017-01-01

    Bipartite networks have attracted considerable interest in various fields. Fractality and multifractality of unipartite (classical) networks have been studied in recent years, but these properties of bipartite networks have not yet been studied. In this paper, we try to unfold the self-similarity structure of bipartite networks by performing fractal and multifractal analyses for a variety of real-world bipartite network data sets and models. First, we find fractality in some bipartite networks, including the CiteULike, Netflix, MovieLens (ml-20m) and Delicious data sets and the (u, v)-flower model. Meanwhile, we observe shifted power-law or exponential behavior in several other networks. We then focus on the multifractal properties of bipartite networks. Our results indicate that multifractality exists in those bipartite networks possessing fractality. To capture the inherent attribute of bipartite networks with two different types of nodes, we give different weights to the nodes of the different classes, and show the existence of multifractality in these node-weighted bipartite networks. In addition, for the data sets with ratings, we modify the two existing algorithms for fractal and multifractal analyses of edge-weighted unipartite networks to study the self-similarity of the corresponding edge-weighted bipartite networks. The results show that our modified algorithms are feasible and can effectively uncover the self-similarity structure of these edge-weighted bipartite networks and their corresponding node-weighted versions. PMID:28361962

  16. CHEMICAL ANALYSES OF SODIUM SYSTEMS FINAL REPORT

    Energy Technology Data Exchange (ETDEWEB)

    Greenhalgh, W. O.; Yunker, W. H.; Scott, F. A.

    1970-06-01

    BNWL-1407 summarizes information gained from the Chemical Analyses of Sodium Systems Program pursued by Battelle-Northwest over the period from July 1967 through June 1969. Tasks included feasibility studies for performing coulometric titration and polarographic determinations of oxygen in sodium, and the development of new separation techniques for sodium impurities and their subsequent analyses. The program was terminated ahead of schedule, so firm conclusions were not obtained in all areas of the work. At least 40 coulometric titrations were carried out and special test cells were developed for coulometric application. Data indicated that polarographic measurements are theoretically feasible, but practical application of the method was not verified. An emission spectrographic procedure for trace metal impurities was developed and published. Trace metal analysis by a neutron activation technique was shown to be feasible; key to the success of the activation technique was the application of a new ion exchange resin which provided a sodium separation factor of 10^11. Preliminary studies on direct scavenging of trace metals produced no conclusive results.

  17. Waste Stream Analyses for Nuclear Fuel Cycles

    Energy Technology Data Exchange (ETDEWEB)

    N. R. Soelberg

    2010-08-01

    A high-level study was performed in Fiscal Year 2009 for the U.S. Department of Energy (DOE) Office of Nuclear Energy (NE) Advanced Fuel Cycle Initiative (AFCI) to provide information for a range of nuclear fuel cycle options (Wigeland 2009). At that time, some fuel cycle options could not be adequately evaluated since they were not well defined and lacked sufficient information. As a result, five families of these fuel cycle options are being studied during Fiscal Year 2010 by the Systems Analysis Campaign for the DOE NE Fuel Cycle Research and Development (FCRD) program. The quality and completeness of data available to date for the fuel cycle options is insufficient to perform quantitative radioactive waste analyses using recommended metrics. This study has been limited thus far to qualitative analyses of waste streams from the candidate fuel cycle options, because quantitative data for wastes from the front end, fuel fabrication, reactor core structure, and used fuel for these options is generally not yet available.

  18. Applied predictive control

    CERN Document Server

    Sunan, Huang; Heng, Lee Tong

    2002-01-01

    The presence of considerable time delays in the dynamics of many industrial processes, leading to difficult problems in the associated closed-loop control systems, is a well-recognized phenomenon. The performance achievable in conventional feedback control systems can be significantly degraded if an industrial process has a relatively large time delay compared with the dominant time constant. Under these circumstances, advanced predictive control is necessary to improve the performance of the control system significantly. The book is a focused treatment of the subject matter, including the fundamentals and some state-of-the-art developments in the field of predictive control. Three main schemes for advanced predictive control are addressed in this book: • Smith Predictive Control; • Generalised Predictive Control; • a form of predictive control based on Finite Spectrum Assignment. A substantial part of the book addresses application issues in predictive control, providing several interesting case studies...
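
    The first of the three schemes, the Smith predictor, can be sketched in discrete time. The first-order plant, the delay length and the PI gains below are invented for illustration and assume a perfectly known plant model:

```python
# Smith predictor sketch: control a first-order plant with a d-step input
# delay by feeding back the model's *undelayed* output, so the controller
# effectively acts on a delay-free loop.
a, b, d = 0.9, 0.1, 5      # hypothetical plant: y[k+1] = a*y[k] + b*u[k-d]
Kp, Ki = 2.0, 0.5          # PI gains tuned for the delay-free model
r = 1.0                    # setpoint

y = 0.0                    # plant output
ym = 0.0                   # delay-free model output
buf = [0.0] * d            # input delay line
integ = 0.0                # integrator state
for _ in range(300):
    # With a perfect model, the delayed model output cancels the measured
    # y, so the effective feedback signal is just the delay-free ym.
    e = r - ym
    integ += e
    u = Kp * e + Ki * integ
    y = a * y + b * buf[0]         # plant sees the delayed input u[k-d]
    buf = buf[1:] + [u]            # shift the delay line
    ym = a * ym + b * u            # model driven by the undelayed input
print(round(y, 3), round(ym, 3))
```

Both the model output and the (delayed) plant output settle at the setpoint; with model mismatch a corrective term (measured y minus delayed model output) would be added to the feedback signal.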

  19. Les conditions de l’analyse qualitative

    Directory of Open Access Journals (Sweden)

    Pierre Paillé

    2011-07-01

    Full Text Available The methods for analysing qualitative data and the computer world were meant to meet. Indeed, the question is topical and the computer tools numerous and advanced. This phenomenon will not slow down, especially since secondary data analysis is at the same time undergoing significant developments. But the attraction of analysis software can become such that one no longer quite sees in what capacity, and for what reasons, one could do without it. This article attempts to define a vision and a practice of qualitative analysis that, in essence, does not lend itself to the use of specialized computer tools. It situates its reflection within the framework of qualitative methodology (qualitative approach, qualitative research, qualitative analysis), more particularly at the level of qualitative fieldwork investigation.

  20. Pathway analyses implicate glial cells in schizophrenia.

    Directory of Open Access Journals (Sweden)

    Laramie E Duncan

    Full Text Available BACKGROUND: The quest to understand the neurobiology of schizophrenia and bipolar disorder is ongoing, with multiple lines of evidence indicating abnormalities of glia, mitochondria, and glutamate in both disorders. Despite high heritability estimates of 81% for schizophrenia and 75% for bipolar disorder, compelling links between findings from neurobiological studies and findings from large-scale genetic analyses are only beginning to emerge. METHOD: Ten publicly available gene sets (pathways) related to glia, mitochondria, and glutamate were tested for association to schizophrenia and bipolar disorder using MAGENTA as the primary analysis method. To determine the robustness of associations, secondary analyses were performed with ALIGATOR, INRICH, and Set Screen. Data from the Psychiatric Genomics Consortium (PGC) were used for all analyses. There were 1,068,286 SNP-level p-values for schizophrenia (9,394 cases/12,462 controls), and 2,088,878 SNP-level p-values for bipolar disorder (7,481 cases/9,250 controls). RESULTS: The Glia-Oligodendrocyte pathway was associated with schizophrenia, after correction for multiple tests, according to the primary analysis (MAGENTA p = 0.0005, 75% requirement for individual gene significance) and also achieved nominal levels of significance with INRICH (p = 0.0057) and ALIGATOR (p = 0.022). For bipolar disorder, Set Screen yielded nominally and method-wide significant associations to all three glial pathways, with the strongest association to the Glia-Astrocyte pathway (p = 0.002). CONCLUSIONS: Consistent with findings of white matter abnormalities in schizophrenia by other methods of study, the Glia-Oligodendrocyte pathway was associated with schizophrenia in our genomic study. These findings suggest that the abnormalities of myelination observed in schizophrenia are at least in part due to inherited factors, contrasted with the alternative of purely environmental causes (e.g. medication effects or

  1. Analysing lawyers’ attitude towards knowledge sharing

    Directory of Open Access Journals (Sweden)

    Wole M. Olatokun

    2012-02-01

    Full Text Available Objectives: The study examined and identified the factors that affect lawyers' attitudes to knowledge sharing, and their knowledge sharing behaviour. Specifically, it investigated the relationship between the salient beliefs affecting the knowledge sharing attitude of lawyers, and applied a modified version of the Theory of Reasoned Action (TRA) in the knowledge sharing context, to predict how these factors affect their knowledge sharing behaviour. Method: A field survey of 273 lawyers was carried out, using a questionnaire for data collection. Collected data on all variables were structured into grouped frequency distributions. Principal Component Factor Analysis was applied to reduce the constructs and Simple Regression was applied to test the hypotheses. These were tested at the 0.05 level of significance. Results: Results showed that expected associations and contributions were the major determinants of lawyers' attitudes towards knowledge sharing. Expected reward was not significantly related to lawyers' attitudes towards knowledge sharing. A positive attitude towards knowledge sharing was found to lead to a positive intention to share knowledge, although a positive intention to share knowledge did not significantly predict a positive knowledge sharing behaviour. The level of Information Technology (IT) usage was also found to significantly affect the knowledge sharing behaviour of lawyers. Conclusion: It was recommended that law firms in the study area should deploy more IT infrastructure and services that encourage effective knowledge sharing amongst lawyers.

  2. ANALYSE OF PULSE WAVE PROPAGATION IN ARTERIES

    Institute of Scientific and Technical Information of China (English)

    PAN Yi-shan; JIA Xiao-bo; CUI Chang-kui; XIAO Xiao-chun

    2006-01-01

    Based on regarding the blood vessel as an elastic tube whose wall is constrained by the surrounding tissue, the rule of pulse wave propagation in the blood vessel was studied. The influences of the viscosity of blood, the elastic modulus of the blood vessel and the radius of the tube on pulse wave propagation were analyzed. Comparing the result that considered the viscosity of blood with the result that did not, we find that the influence of blood viscosity on pulse wave propagation cannot be neglected; as the elastic modulus increases, the propagation speed increases and the blood pressure rises; and when the diameter of the blood vessel decreases, the blood pressure also rises and the speed of the pulse wave increases. These results will contribute to making use of the information in the pulse wave to analyse, and to aid in diagnosing, some causes of human disease.
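
    The qualitative trends reported here (speed rising with elastic modulus, speed rising as diameter shrinks) match the classical Moens-Korteweg relation for an inviscid fluid in a thin-walled elastic tube, which can be checked numerically. The vessel parameters below are typical textbook values, not the authors' data:

```python
# Moens-Korteweg pulse wave velocity: c = sqrt(E * h / (rho * D)),
# with E the wall elastic modulus, h the wall thickness, D the inner
# diameter and rho the blood density.
import math

def pwv(E, h, D, rho=1060.0):
    """Pulse wave velocity in m/s (SI units throughout)."""
    return math.sqrt(E * h / (rho * D))

c = pwv(E=0.5e6, h=1.5e-3, D=25e-3)           # aorta-like values
c_stiffer = pwv(E=1.0e6, h=1.5e-3, D=25e-3)   # higher modulus
c_narrower = pwv(E=0.5e6, h=1.5e-3, D=15e-3)  # smaller diameter
print(round(c, 2), c_stiffer > c, c_narrower > c)
```

With these values the aorta-like speed comes out around 5 m/s, and both the stiffer and the narrower vessel propagate the wave faster, consistent with the abstract's conclusions.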

  3. Modelling and Analysing Socio-Technical Systems

    DEFF Research Database (Denmark)

    Aslanyan, Zaruhi; Ivanova, Marieta Georgieva; Nielson, Flemming

    2015-01-01

    Modern organisations are complex, socio-technical systems consisting of a mixture of physical infrastructure, human actors, policies and processes. An increasing number of attacks on these organisations exploits vulnerabilities on all different levels, for example combining a malware attack... with social engineering. Due to this combination of attack steps on technical and social levels, risk assessment in socio-technical systems is complex. Therefore, established risk assessment methods often abstract away the internal structure of an organisation and ignore human factors when modelling... and assessing attacks. In our work we model all relevant levels of socio-technical systems, and propose evaluation techniques for analysing the security properties of the model. Our approach simplifies the identification of possible attacks and provides qualified assessment and ranking of attacks based...

  4. Autisme et Douleur – Analyse Bibliographique

    OpenAIRE

    Amandine Dubois; Cécile Rattaz; René Pry; Amaria Baghdadli

    2010-01-01

    This literature review aims to take stock of published work in the field of pain and autism. The article first addresses published studies on the modes of pain expression observed in this population. Various hypotheses that may explain the expressive particularities of people with autism are then reviewed: endorphin excess, particularities in sensory processing, socio-communicative deficit...

  5. First international intercomparison of image analysers

    CERN Document Server

    Pálfalvi, J; Eoerdoegh, I

    1999-01-01

    Image analyser systems used for evaluating solid state nuclear track detectors (SSNTD) were compared in order to establish minimum hardware and software requirements and methodology necessary in different fields of radiation dosimetry. For the purpose, CR-39 detectors (TASL, Bristol, U.K.) were irradiated with different (n,alpha) and (n,p) converters in a reference Pu-Be neutron field, in an underground laboratory with high radon concentration and by different alpha sources at the Atomic Energy Research Institute (AERI) in Budapest, Hungary. 6 sets of etched and pre-evaluated detectors and the 7th one without etching were distributed among the 14 laboratories from 11 countries. The participants measured the different track parameters and statistically evaluated the results, to determine the performance of their system. The statistical analysis of results showed high deviations from the mean values in many cases. As the conclusion of the intercomparison recommendations were given to fulfill those requirements ...

  6. Cointegration Approach to Analysing Inflation in Croatia

    Directory of Open Access Journals (Sweden)

    Lena Malešević-Perović

    2009-06-01

    Full Text Available The aim of this paper is to analyse the determinants of inflation in Croatia in the period 1994:6-2006:6. We use a cointegration approach and find that increases in wages positively influence inflation in the long-run. Furthermore, in the period from June 1994 onward, the depreciation of the currency also contributed to inflation. Money does not explain Croatian inflation. This irrelevance of the money supply is consistent with its endogeneity to exchange rate targeting, whereby the money supply is determined by developments in the foreign exchange market. The value of inflation in the previous period is also found to be significant, thus indicating some inflation inertia.
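
    The Engle-Granger intuition behind such long-run relationships can be sketched on simulated data; the series below are synthetic stand-ins (a random walk plus a proportional partner), not Croatian data:

```python
# Two series are cointegrated if a linear combination of the individually
# non-stationary series is stationary. Sketch: x is a random walk (like a
# price level) and y = 2*x + stationary noise.
import random

rng = random.Random(42)
n = 5000
x, y = [], []
level = 0.0
for _ in range(n):
    level += rng.gauss(0, 1)              # random walk innovation
    x.append(level)
    y.append(2.0 * level + rng.gauss(0, 1))

# Step 1: OLS slope of y on x (closed form, no intercept for simplicity).
beta = sum(a * b for a, b in zip(x, y)) / sum(a * a for a in x)

# Step 2: the residuals of the long-run relation should be stationary
# (small, mean-reverting), unlike the random walk x itself.
resid = [b - beta * a for a, b in zip(x, y)]
var_resid = sum(e * e for e in resid) / n
var_x = sum(a * a for a in x) / n
print(round(beta, 2), round(var_resid, 2), round(var_x, 1))
```

A full Engle-Granger test would additionally run a unit-root (ADF) test on the residuals; here the contrast between the tiny residual variance and the large variance of the random walk conveys the idea.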

  7. Feasibility Analyses of Integrated Broiler Production

    Directory of Open Access Journals (Sweden)

    L. Komalasari

    2010-12-01

    Full Text Available The major obstacles to the development of broiler raising are the expensive price of feed and the fluctuating price of DOCs (day-old chicks). The cheap price of imported leg quarters reduces the competitiveness of local broilers. Therefore, an effort to increase production efficiency is needed through integration between broiler raising, corn farmers and feed producers (integrated farming). The purpose of this study is to analyze the feasibility of integrating broiler raising with corn cultivation and feed production. Besides that, a simulation was conducted to analyze the effects of DOC price changes, broiler price and production capacity. The analyses showed that integrated farming, and a mere combination of broiler raising with a feed factory of 10,000-bird capacity, is not financially feasible. Increasing production to 25,000 broiler chickens makes the integrated farming financially feasible. Unintegrated broiler raising is relatively sensitive to broiler price decreases and DOC price increases compared to integrated farming.

  8. Conceptualizing analyses of ecological momentary assessment data.

    Science.gov (United States)

    Shiffman, Saul

    2014-05-01

    Ecological momentary assessment (EMA) methods, which involve collection of real-time data in subjects' real-world environments, are particularly well suited to studying tobacco use. Analyzing EMA datasets can be challenging, as the datasets include a large and varied number of observations per subject and are relatively unstructured. This paper suggests that time is typically a key organizing principle in EMA data and that conceptualizing the data as a timeline of events, behaviors, and experiences can help define analytic approaches. EMA datasets lend themselves to answering a diverse array of research questions, and the research question must drive how data are arranged for analysis, and the kinds of statistical models that are applied. This is illustrated with brief examples of diverse analyses applied to answer different questions from an EMA study of tobacco use and relapse.
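
    The "timeline of events" framing can be sketched in code; the records, field names and derived covariate below are invented for illustration:

```python
# Arrange raw EMA records as a per-subject timeline, then derive a
# time-varying covariate (hours since the last cigarette) for analysis.
from datetime import datetime

records = [  # (subject, timestamp, event) in arbitrary arrival order
    ("s1", "2024-01-01 09:00", "smoke"),
    ("s1", "2024-01-01 12:30", "craving"),
    ("s1", "2024-01-01 08:00", "wake"),
    ("s2", "2024-01-01 10:15", "smoke"),
    ("s2", "2024-01-01 11:45", "craving"),
]

timelines = {}
for subj, ts, event in records:
    timelines.setdefault(subj, []).append((datetime.fromisoformat(ts), event))
for subj in timelines:
    timelines[subj].sort()               # time is the organizing principle

def hours_since(timeline, event_type, at):
    """Hours from the most recent `event_type` at or before `at`, else None."""
    prior = [t for t, e in timeline if e == event_type and t <= at]
    return (at - max(prior)).total_seconds() / 3600 if prior else None

craving_time = datetime.fromisoformat("2024-01-01 12:30")
print(hours_since(timelines["s1"], "smoke", craving_time))
```

Once events are on a common timeline, the same structure supports many of the questions the paper describes: each craving report can be paired with covariates computed from the subject's history up to that moment.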

  9. Genetic Analyses of Meiotic Recombination in Arabidopsis

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Meiosis is essential for sexual reproduction and recombination is a critical step required for normal meiosis. Understanding the underlying molecular mechanisms that regulate recombination is important for medical, agricultural and ecological reasons. Readily available molecular and cytological tools make Arabidopsis an excellent system to study meiosis. Here we review recent developments in molecular genetic analyses of meiotic recombination. These include studies on plant homologs of yeast and animal genes, as well as novel genes that were first identified in plants. The characterizations of these genes have demonstrated essential functions from the initiation of recombination by double-strand breaks to the repair of such breaks, and from the formation of double-Holliday junctions to the possible resolution of these junctions, both of which are critical for crossover formation. The recent advances have ushered in a new era in plant meiosis, in which the combination of genetics, genomics, and molecular cytology can uncover important gene functions.

  10. Analysing Medieval Urban Space; a methodology

    Directory of Open Access Journals (Sweden)

    Marlous L. Craane MA

    2007-08-01

    Full Text Available This article has been written in reaction to recent developments in medieval history and archaeology, to study not only the buildings in a town but also the spaces that hold them together. It discusses a more objective and interdisciplinary approach for analysing urban morphology and use of space. It proposes a 'new' methodology by combining town plan analysis and space syntax. This methodology was trialled on the city of Utrecht in the Netherlands. By comparing the results of this 'new' methodology with the results of previous, more conventional, research, this article shows that space syntax can be applied successfully to medieval urban contexts. It does this by demonstrating a strong correlation between medieval economic spaces and the most integrated spaces, just as is found in the study of modern urban environments. It thus provides a strong basis for the use of this technique in future research of medieval urban environments.

  11. Lagune de Salses - Leucate. I - Analyse bibliographique.

    OpenAIRE

    Ladagnous, Helene; Le Bec, Claude

    1997-01-01

The recent introduction of procedures aimed at improving the water quality of the Salses-Leucate lagoon (Schéma d'Aménagement et de Gestion de l'Eau, contrat d'étang) led us to compile a synthesis of the available knowledge on this area, a necessary preliminary step for any concerted action. This bibliographic review shows that many of the studies carried out are now obsolete from the perspective of integrated management of the site. The hydrogeological knowledge...

  12. Spatial Analyses of Harappan Urban Settlements

    Directory of Open Access Journals (Sweden)

    Hirofumi Teramura

    2006-12-01

Full Text Available The Harappan Civilization occupies a unique place among the early civilizations of the world, with its well-planned urban settlements, advanced handicrafts and technology, and religious and trade activities. Using a Geographic Information System (GIS), this study presents spatial analyses that locate urban settlements on a digital elevation model (DEM) according to the three phases of early, mature and late. Understanding the relationship between the spatial distribution of Harappan sites and changes in factors such as topographic features, river passages or sea level will lead to an understanding of the dynamism of this civilization. It will also afford a glimpse of the factors behind the formation, development, and decline of the Harappan Civilization.

  13. Digital analyses of cartometric Fruska Gora guidelines

    Directory of Open Access Journals (Sweden)

    Živković Dragica

    2013-01-01

Full Text Available Modern geomorphological research uses quantitative statistical and cartographic methods to analyse topographic relief features and the mutual connections among them on the basis of good-quality numeric parameters. Important morphological characteristics include the angle of slope, hypsometry and terrain exposition. Even small and seemingly negligible relief slopes can deeply affect land configuration, hypsometry, topographic exposition, etc. Exposition modifies light and heat and their interconnected phenomena: soil and air temperature, soil disintegration, the length of the vegetation period, the intensity of photosynthesis, the yield of agricultural crops, the height of the snow limit, etc. [Projekat Ministarstva nauke Republike Srbije, br. 176008 i br. III44006
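The slope and exposition parameters discussed above can be derived from a digital elevation model with central differences. The sketch below is illustrative only; the function name, the aspect convention (degrees clockwise from north) and the synthetic DEM are assumptions, not the study's data:

```python
import numpy as np

def slope_aspect(dem, cell_size):
    """Slope (degrees) and aspect (degrees clockwise from north) from a DEM
    grid using central differences -- the kind of quantitative relief
    parameters (inclination, exposition) the study works with."""
    dz_dy, dz_dx = np.gradient(dem, cell_size)  # axis 0 = north-south rows
    slope = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
    aspect = np.degrees(np.arctan2(-dz_dx, dz_dy)) % 360.0
    return slope, aspect

# Tiny synthetic hillside rising to the east by 1 m per 10 m cell.
dem = np.tile(np.arange(5, dtype=float), (5, 1))   # elevations in metres
slope, aspect = slope_aspect(dem, cell_size=10.0)
print(f"slope ~ {slope[2, 2]:.1f} deg, aspect ~ {aspect[2, 2]:.0f} deg")
```

For this east-rising surface the interior cells slope at about 5.7 degrees and face west (aspect near 270 degrees).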

  14. Analysing transfer phenomena in osmotic evaporation

    Directory of Open Access Journals (Sweden)

    Freddy Forero Longas

    2011-12-01

Full Text Available Osmotic evaporation is a membrane-based modification of traditional evaporation processes: by means of a vapour pressure differential, produced by a highly concentrated extraction solution, water is transferred through a hydrophobic membrane as vapour. This technique has many advantages over traditional processes, allowing work at atmospheric pressure and low temperatures, making it ideal for heat-sensitive products. This paper presents and synthetically analyses the heat and mass transfer phenomena which occur in the process and describes the models used for estimating the parameters of interest, such as flow, temperature, heat transfer rate and the relationships that exist amongst them when hollow fibre modules are used, providing a quick reference tool and specific information about this process.
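As a rough illustration of the driving force described above, the sketch below computes a transmembrane water flux from the vapour pressure difference between a dilute feed and a concentrated brine. The linear flux law, the Antoine constants for water and the mass-transfer coefficient `k_m` are illustrative assumptions, not values from the paper:

```python
import math

def water_vapor_pressure_kpa(temp_c):
    """Saturation vapour pressure of water via the Antoine equation
    (constants valid roughly for 1-100 degC; mmHg converted to kPa)."""
    a, b, c = 8.07131, 1730.63, 233.426
    return 10 ** (a - b / (c + temp_c)) * 0.133322

def osmotic_evaporation_flux(temp_feed_c, temp_brine_c, brine_activity, k_m):
    """Transmembrane water flux J = k_m * (P_feed - a_brine * P_brine).
    k_m is a hypothetical overall mass-transfer coefficient in
    kg/(m^2 h kPa); the feed is assumed to be nearly pure water (a ~ 1)."""
    dp = water_vapor_pressure_kpa(temp_feed_c) \
         - brine_activity * water_vapor_pressure_kpa(temp_brine_c)
    return k_m * dp

# Example: 30 degC feed against a concentrated brine (water activity ~0.7)
# at the same temperature, so the driving force is purely osmotic.
flux = osmotic_evaporation_flux(30.0, 30.0, 0.70, k_m=0.5)
print(f"flux ~ {flux:.3f} kg/(m2 h)")
```

Note that with equal temperatures on both sides the flux is driven entirely by the reduced water activity of the extraction solution, which is the defining feature of osmotic evaporation.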

  15. Risques naturels en montagne et analyse spatiale

    Directory of Open Access Journals (Sweden)

    Yannick Manche

    1999-06-01

Full Text Available The concept of risk rests on two notions: the hazard, which represents the physical phenomenon through its magnitude and return period; and vulnerability, which represents all the goods and people liable to be affected by a natural phenomenon. Risk is then defined as the intersection of these two notions. This theoretical view makes it possible to model hazards and vulnerability independently. This work is mainly concerned with taking vulnerability into account in the management of natural risks. Evaluating vulnerability necessarily involves some form of spatial analysis that takes into account human occupation and the different scales of land use. But spatial evaluation, whether of goods and people or of indirect effects, runs into many problems: the extent of land occupation must be estimated, and processing the data requires constant changes of scale to move from point elements to surfaces, which geographic information systems do not handle perfectly. Risk management imposes strong planning constraints; taking vulnerability into account makes it possible to better understand and manage the spatial constraints that natural risks imply. hazard, spatial analysis, natural risks, GIS, vulnerability

  16. Sivers function: SIDIS data, fits and predictions

    CERN Document Server

    Anselmino, M; D'Alesio, U; Kotzinian, A; Murgia, F; Prokudin, A

    2005-01-01

The most recent data on the weighted transverse single spin asymmetry A_{UT}^{\sin(\phi_h-\phi_S)} from the HERMES and COMPASS collaborations are analysed within the LO parton model; all transverse motions are taken into account. An extraction of the Sivers function for u and d quarks is performed. Based on the extracted Sivers functions, predictions for A_{UT}^{\sin(\phi_h-\phi_S)} asymmetries at JLab are given; suggestions for further measurements at COMPASS, with a transversely polarized hydrogen target and selecting favourable kinematical ranges, are discussed. Predictions are also presented for single spin asymmetries (SSA) in Drell-Yan processes at RHIC and GSI.

  17. RAMA Surveillance Capsule and Component Activation Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Watkins, Kenneth E.; Jones, Eric N. [TransWare Enterprises Inc., 1565 Mediterranean Dr., Sycamore, IL 60178 (United States); Carter, Robert G. [Electric Power Research Institute, 1300 West W. T. Harris Blvd., Charlotte, NC 28262 (United States)

    2011-07-01

    This paper presents the calculated-to-measured ratios associated with the application of the RAMA Fluence Methodology software to light water reactor surveillance capsule and reactor component activation evaluations. Comparisons to measurements are performed for pressurized water reactor and boiling water reactor surveillance capsule activity specimens from seventeen operating light water reactors. Comparisons to measurements are also performed for samples removed from the core shroud, top guide, and jet pump brace pads from two reactors. In conclusion: The flexible geometry modeling capabilities provided by RAMA, combined with the detailed representation of operating reactor history and anisotropic scattering detail, produces accurate predictions of the fast neutron fluence and neutron activation for BWR and PWR surveillance capsule geometries. This allows best estimate RPV fluence to be determined without the need for multiplicative bias corrections. The three-dimensional modeling capability in RAMA provides an accurate estimate of the fast neutron fluence for regions far removed from the core mid-plane elevation. The comparisons to activation measurements for various core components indicate that the RAMA predictions are reasonable, and notably conservative (i.e., C/M ratios are consistently greater than unity). It should be noted that in the current evaluations, the top and bottom fuel regions are represented by six inch height nodes. As a result, the leakage-induced decrease in power near the upper and lower edges of the core are not well represented in the current models. More precise predictions of fluence for components that lie above and below the core boundaries could be obtained if the upper and lower fuel nodes were subdivided into multiple axial regions with assigned powers that reflect the neutron leakage at the top and bottom of the core. This use of additional axial sub-meshing at the top and bottom of the core is analogous to the use of pin

  18. Prediction of sheep responses by near infrared reflectance spectroscopy.

    Science.gov (United States)

    Eckman, D D; Shenk, J S; Wangsness, P J; Westerhaus, M O

    1983-09-01

    Prediction of animal response from near infrared reflectance spectra of feeds was compared with predictions from chemical analyses. Sixty samples of pure and mixed forage-based diets were obtained from sheep intake and digestion trials. Sheep responses measured were digestible energy, dry matter intake, and calculated intake of digestible energy. Diets were analyzed chemically for protein, neutral detergent fiber, and in vitro dry matter disappearance. Coefficients of multiple determination and standard errors for fitting the sheep responses to these 60 diverse diets by regression equations developed from chemical analyses (.62 to .70) or spectra (.63 to .72) were similar. The 60 diets were divided into two sets of 30; one set was used to develop calibration equations for each sheep response, and the second set was used to test the equations. Calibration and errors of prediction were similar. When wavelengths chosen for each of the laboratory measurements were used to fit the sheep responses, standard errors were higher than when responses of sheep were predicted directly from spectra. The scanning instrument has the capability of predicting laboratory analyses and shows potential for predicting animal response as accurately as animal response can be predicted from laboratory analyses.
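The comparison the study makes, fitting an animal response both to chemical assays and directly to spectra by regression, can be sketched on synthetic data. Everything below (the data-generating process, variable names, number of wavelength bands) is a hypothetical illustration of the workflow, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the design: 60 diets, reflectance at a few NIR
# bands, and three chemical assays (protein, NDF, IVDMD) that themselves
# correlate with the spectra. The response is digestible energy.
n = 60
spectra = rng.normal(size=(n, 5))               # mean-centred log(1/R) bands
chemistry = spectra @ rng.normal(size=(5, 3)) + 0.3 * rng.normal(size=(n, 3))
digestible_energy = spectra @ rng.normal(size=5) + 0.2 * rng.normal(size=n)

def r_squared(x, y):
    """Fit y ~ x by ordinary least squares; return the coefficient of
    multiple determination, as reported in the abstract."""
    x1 = np.column_stack([np.ones(len(x)), x])
    coef, *_ = np.linalg.lstsq(x1, y, rcond=None)
    resid = y - x1 @ coef
    return 1.0 - resid.var() / y.var()

print(f"R^2 from chemical analyses: {r_squared(chemistry, digestible_energy):.2f}")
print(f"R^2 directly from spectra:  {r_squared(spectra, digestible_energy):.2f}")
```

The point mirrored here is that regressing the response directly on spectra can do at least as well as regressing on laboratory assays derived from those spectra.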

  19. Predictive systems ecology.

    Science.gov (United States)

    Evans, Matthew R; Bithell, Mike; Cornell, Stephen J; Dall, Sasha R X; Díaz, Sandra; Emmott, Stephen; Ernande, Bruno; Grimm, Volker; Hodgson, David J; Lewis, Simon L; Mace, Georgina M; Morecroft, Michael; Moustakas, Aristides; Murphy, Eugene; Newbold, Tim; Norris, K J; Petchey, Owen; Smith, Matthew; Travis, Justin M J; Benton, Tim G

    2013-11-22

    Human societies, and their well-being, depend to a significant extent on the state of the ecosystems that surround them. These ecosystems are changing rapidly usually in response to anthropogenic changes in the environment. To determine the likely impact of environmental change on ecosystems and the best ways to manage them, it would be desirable to be able to predict their future states. We present a proposal to develop the paradigm of predictive systems ecology, explicitly to understand and predict the properties and behaviour of ecological systems. We discuss the necessary and desirable features of predictive systems ecology models. There are places where predictive systems ecology is already being practised and we summarize a range of terrestrial and marine examples. Significant challenges remain but we suggest that ecology would benefit both as a scientific discipline and increase its impact in society if it were to embrace the need to become more predictive.

  20. Predictability of conversation partners

    CERN Document Server

    Takaguchi, Taro; Sato, Nobuo; Yano, Kazuo; Masuda, Naoki

    2011-01-01

    Recent developments in sensing technologies have enabled us to examine the nature of human social behavior in greater detail. By applying an information theoretic method to the spatiotemporal data of cell-phone locations, Song et al. (2010) found that human mobility patterns are remarkably predictable. Inspired by their work, we address a similar predictability question in a different kind of human social activity: conversation events. The predictability in the sequence of one's conversation partners is defined as the degree to which one's next conversation partner can be predicted given the current partner. We quantify this predictability by using the mutual information. We examine the predictability of conversation events for each individual using the longitudinal data of face-to-face interactions collected from two company offices in Japan. Each subject wears a name tag equipped with an infrared sensor node, and conversation events are marked when signals are exchanged between close sensor nodes. We find t...

  1. Distribution Free Prediction Bands

    CERN Document Server

    Lei, Jing

    2012-01-01

    We study distribution free, nonparametric prediction bands with a special focus on their finite sample behavior. First we investigate and develop different notions of finite sample coverage guarantees. Then we give a new prediction band estimator by combining the idea of "conformal prediction" (Vovk et al. 2009) with nonparametric conditional density estimation. The proposed estimator, called COPS (Conformal Optimized Prediction Set), always has finite sample guarantee in a stronger sense than the original conformal prediction estimator. Under regularity conditions the estimator converges to an oracle band at a minimax optimal rate. A fast approximation algorithm and a data driven method for selecting the bandwidth are developed. The method is illustrated first in simulated data. Then, an application shows that the proposed method gives desirable prediction intervals in an automatic way, as compared to the classical linear regression modeling.
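A minimal sketch of the split-conformal construction this line of work builds on (the basic conformal prediction idea of Vovk et al., not the COPS estimator itself): fit a working model on one half of the data and calibrate the band width from the held-out half's residuals. The polynomial working model and synthetic data are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic regression data.
x = rng.uniform(-3, 3, size=400)
y = np.sin(x) + 0.3 * rng.normal(size=400)

# Split: fit the working model on one half, calibrate on the other.
fit_idx, cal_idx = np.arange(0, 200), np.arange(200, 400)
coef = np.polyfit(x[fit_idx], y[fit_idx], deg=5)          # working model
resid = np.abs(y[cal_idx] - np.polyval(coef, x[cal_idx]))  # calibration scores

alpha = 0.1                                           # target 90% coverage
k = int(np.ceil((len(cal_idx) + 1) * (1 - alpha)))    # conformal quantile rank
q = np.sort(resid)[k - 1]

def predict_band(x_new):
    """Band with finite-sample marginal coverage >= 1 - alpha,
    regardless of how wrong the working model is."""
    center = np.polyval(coef, x_new)
    return center - q, center + q

lo, hi = predict_band(0.0)
print(f"90% prediction band at x=0: [{lo:.2f}, {hi:.2f}]")
```

The coverage guarantee is distribution-free but marginal; the conditional-density refinement discussed in the abstract is aimed at making such bands locally adaptive rather than constant-width.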

  2. Predictability of Conversation Partners

    Science.gov (United States)

    Takaguchi, Taro; Nakamura, Mitsuhiro; Sato, Nobuo; Yano, Kazuo; Masuda, Naoki

    2011-08-01

Recent developments in sensing technologies have enabled us to examine the nature of human social behavior in greater detail. By applying an information-theoretic method to the spatiotemporal data of cell-phone locations, [C. Song et al., Science 327, 1018 (2010)] found that human mobility patterns are remarkably predictable. Inspired by their work, we address a similar predictability question in a different kind of human social activity: conversation events. The predictability in the sequence of one's conversation partners is defined as the degree to which one's next conversation partner can be predicted given the current partner. We quantify this predictability by using the mutual information. We examine the predictability of conversation events for each individual using the longitudinal data of face-to-face interactions collected from two company offices in Japan. Each subject wears a name tag equipped with an infrared sensor node, and conversation events are marked when signals are exchanged between sensor nodes in close proximity. We find that the conversation events are predictable to a certain extent; knowing the current partner decreases the uncertainty about the next partner by 28.4% on average. Much of the predictability is explained by long-tailed distributions of interevent intervals. However, some predictability exists in the data beyond the contribution of their long-tailed nature. In addition, an individual's predictability is correlated with the position of the individual in the static social network derived from the data. Individuals confined in a community, in the sense of an abundance of surrounding triangles, tend to have low predictability, and those bridging different communities tend to have high predictability.
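The mutual-information measure of predictability described above can be sketched directly: estimate the joint distribution of (current partner, next partner) bigrams from a partner sequence and compute I(current; next). The toy sequences below are illustrative, not the study's sensor data:

```python
from collections import Counter
from math import log2

def partner_predictability_bits(sequence):
    """Empirical mutual information I(current partner; next partner), in
    bits, from the bigram distribution of a conversation-partner sequence."""
    pairs = list(zip(sequence, sequence[1:]))
    joint = Counter(pairs)
    cur = Counter(p for p, _ in pairs)
    nxt = Counter(q for _, q in pairs)
    n = len(pairs)
    mi = 0.0
    for (p, q), c in joint.items():
        p_joint = c / n
        mi += p_joint * log2(p_joint / ((cur[p] / n) * (nxt[q] / n)))
    return mi

# Strictly alternating partners are highly predictable (high MI)...
print(partner_predictability_bits("ABABABABAB"))
# ...while a scrambled sequence carries much less information.
print(partner_predictability_bits("ABBABAABBBAABABBAAAB"))
```

Empirical mutual information is never negative, so the second value only approaches zero; correcting such finite-sample bias is a standard concern when applying this estimator to short sequences.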

  3. Bayesian Uncertainty Analyses Via Deterministic Model

    Science.gov (United States)

    Krzysztofowicz, R.

    2001-05-01

    Rational decision-making requires that the total uncertainty about a variate of interest (a predictand) be quantified in terms of a probability distribution, conditional on all available information and knowledge. Suppose the state-of-knowledge is embodied in a deterministic model, which is imperfect and outputs only an estimate of the predictand. Fundamentals are presented of three Bayesian approaches to producing a probability distribution of the predictand via any deterministic model. The Bayesian Processor of Output (BPO) quantifies the total uncertainty in terms of a posterior distribution, conditional on model output. The Bayesian Processor of Ensemble (BPE) quantifies the total uncertainty in terms of a posterior distribution, conditional on an ensemble of model output. The Bayesian Forecasting System (BFS) decomposes the total uncertainty into input uncertainty and model uncertainty, which are characterized independently and then integrated into a predictive distribution.
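Under a simple normal-linear assumption (an illustration of the BPO idea, not necessarily the author's parameterization), the posterior distribution of the predictand given a single deterministic model output has a closed form:

```python
from math import sqrt

def bpo_posterior(prior_mean, prior_sd, a, b, model_sd, model_output):
    """Posterior of the predictand W given deterministic-model output x,
    assuming X | W=w ~ N(a*w + b, model_sd^2) and prior W ~ N(prior_mean,
    prior_sd^2). Standard conjugate normal-normal update."""
    prior_var, model_var = prior_sd ** 2, model_sd ** 2
    post_var = 1.0 / (1.0 / prior_var + a ** 2 / model_var)
    post_mean = post_var * (prior_mean / prior_var
                            + a * (model_output - b) / model_var)
    return post_mean, sqrt(post_var)

# Hypothetical example: climatological prior N(2.0, 1.0^2) for a river
# stage; a near-unbiased model (a=1, b=0) with error sd 0.5 outputs 3.1.
mean, sd = bpo_posterior(2.0, 1.0, 1.0, 0.0, 0.5, 3.1)
print(f"posterior: N({mean:.2f}, {sd:.2f}^2)")
```

The posterior mean lands between the prior mean and the (bias-corrected) model output, weighted by their precisions, and the posterior spread quantifies the total uncertainty the abstract calls for.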

  4. Analysing CMS transfers using Machine Learning techniques

    CERN Document Server

    Diotalevi, Tommaso

    2016-01-01

LHC experiments transfer more than 10 PB/week between grid sites using the FTS transfer service. In particular, CMS manages almost 5 PB/week of FTS transfers with PhEDEx (Physics Experiment Data Export). FTS sends metrics about each transfer (e.g. transfer rate, duration, size) to a central HDFS storage at CERN. The work done during these three months as a Summer Student involved the use of ML techniques, within a CMS framework called DCAFPilot, to process this new data and generate predictions of transfer latencies on all links between grid sites. This analysis will provide, as a future service, the information necessary to proactively identify and possibly fix latency-affected transfers over the WLCG.

  5. Time-Frequency Analyses of Tide-Gauge Sensor Data

    Directory of Open Access Journals (Sweden)

    Serdar Erol

    2011-04-01

    Full Text Available The real world phenomena being observed by sensors are generally non-stationary in nature. The classical linear techniques for analysis and modeling natural time-series observations are inefficient and should be replaced by non-linear techniques of whose theoretical aspects and performances are varied. In this manner adopting the most appropriate technique and strategy is essential in evaluating sensors’ data. In this study, two different time-series analysis approaches, namely least squares spectral analysis (LSSA and wavelet analysis (continuous wavelet transform, cross wavelet transform and wavelet coherence algorithms as extensions of wavelet analysis, are applied to sea-level observations recorded by tide-gauge sensors, and the advantages and drawbacks of these methods are reviewed. The analyses were carried out using sea-level observations recorded at the Antalya-II and Erdek tide-gauge stations of the Turkish National Sea-Level Monitoring System. In the analyses, the useful information hidden in the noisy signals was detected, and the common features between the two sea-level time series were clarified. The tide-gauge records have data gaps in time because of issues such as instrumental shortcomings and power outages. Concerning the difficulties of the time-frequency analysis of data with voids, the sea-level observations were preprocessed, and the missing parts were predicted using the neural network method prior to the analysis. In conclusion the merits and limitations of the techniques in evaluating non-stationary observations by means of tide-gauge sensors records were documented and an analysis strategy for the sequential sensors observations was presented.

  6. Time-frequency analyses of tide-gauge sensor data.

    Science.gov (United States)

    Erol, Serdar

    2011-01-01

    The real world phenomena being observed by sensors are generally non-stationary in nature. The classical linear techniques for analysis and modeling natural time-series observations are inefficient and should be replaced by non-linear techniques of whose theoretical aspects and performances are varied. In this manner adopting the most appropriate technique and strategy is essential in evaluating sensors' data. In this study, two different time-series analysis approaches, namely least squares spectral analysis (LSSA) and wavelet analysis (continuous wavelet transform, cross wavelet transform and wavelet coherence algorithms as extensions of wavelet analysis), are applied to sea-level observations recorded by tide-gauge sensors, and the advantages and drawbacks of these methods are reviewed. The analyses were carried out using sea-level observations recorded at the Antalya-II and Erdek tide-gauge stations of the Turkish National Sea-Level Monitoring System. In the analyses, the useful information hidden in the noisy signals was detected, and the common features between the two sea-level time series were clarified. The tide-gauge records have data gaps in time because of issues such as instrumental shortcomings and power outages. Concerning the difficulties of the time-frequency analysis of data with voids, the sea-level observations were preprocessed, and the missing parts were predicted using the neural network method prior to the analysis. In conclusion the merits and limitations of the techniques in evaluating non-stationary observations by means of tide-gauge sensors records were documented and an analysis strategy for the sequential sensors observations was presented.
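Least squares spectral analysis of a gappy record can be sketched with the classical Lomb-Scargle periodogram, which handles the uneven sampling that tide-gauge outages produce. The implementation and the synthetic "tidal" series below are illustrative only:

```python
import numpy as np

def lomb_scargle_power(t, y, freqs):
    """Classical Lomb-Scargle periodogram (a form of least squares
    spectral analysis) for an unevenly sampled series -- the kind of
    gappy record a tide gauge with outages produces."""
    y = y - y.mean()
    power = np.empty(len(freqs))
    for i, f in enumerate(freqs):
        w = 2.0 * np.pi * f
        tau = np.arctan2(np.sum(np.sin(2 * w * t)),
                         np.sum(np.cos(2 * w * t))) / (2 * w)
        c, s = np.cos(w * (t - tau)), np.sin(w * (t - tau))
        power[i] = 0.5 * ((y @ c) ** 2 / (c @ c) + (y @ s) ** 2 / (s @ s))
    return power

rng = np.random.default_rng(2)
t = np.sort(rng.uniform(0, 30, 300))     # irregular sampling with gaps
y = np.sin(2 * np.pi * 0.5 * t) + 0.2 * rng.normal(size=300)  # 0.5 cyc/unit
freqs = np.linspace(0.05, 2.0, 400)
peak = freqs[np.argmax(lomb_scargle_power(t, y, freqs))]
print(f"dominant frequency ~ {peak:.2f} cycles per time unit")
```

Unlike an FFT, this estimator needs no interpolation over data gaps, which is why LSSA is attractive for sensor records with outages; wavelet methods add time localization on top of this.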

  7. Solar Cycle Predictions

    Science.gov (United States)

    Pesnell, William Dean

    2012-01-01

    Solar cycle predictions are needed to plan long-term space missions; just like weather predictions are needed to plan the launch. Fleets of satellites circle the Earth collecting many types of science data, protecting astronauts, and relaying information. All of these satellites are sensitive at some level to solar cycle effects. Predictions of drag on LEO spacecraft are one of the most important. Launching a satellite with less propellant can mean a higher orbit, but unanticipated solar activity and increased drag can make that a Pyrrhic victory as you consume the reduced propellant load more rapidly. Energetic events at the Sun can produce crippling radiation storms that endanger all assets in space. Solar cycle predictions also anticipate the shortwave emissions that cause degradation of solar panels. Testing solar dynamo theories by quantitative predictions of what will happen in 5-20 years is the next arena for solar cycle predictions. A summary and analysis of 75 predictions of the amplitude of the upcoming Solar Cycle 24 is presented. The current state of solar cycle predictions and some anticipations how those predictions could be made more accurate in the future will be discussed.

  8. Is Time Predictability Quantifiable?

    DEFF Research Database (Denmark)

    Schoeberl, Martin

    2012-01-01

Computer architects and researchers in the real-time domain have started to investigate processors and architectures optimized for real-time systems. Optimized for real-time systems means time predictable, i.e., architectures where it is possible to statically derive a tight bound of the worst-case execution time. To compare different approaches we would like to quantify time predictability; that means we need to measure time predictability. In this paper we discuss the different approaches for these measurements and conclude that time predictability is practically not quantifiable. We can only compare the worst-case execution time bounds of different architectures.

  9. Proposed Testing to Assess the Accuracy of Glass-To-Metal Seal Stress Analyses.

    Energy Technology Data Exchange (ETDEWEB)

    Chambers, Robert S.; Emery, John M; Tandon, Rajan; Antoun, Bonnie R.; Stavig, Mark E.; Newton, Clay S.; Gibson, Cory S; Bencoe, Denise N.

    2014-09-01

The material characterization tests conducted on 304L VAR stainless steel and Schott 8061 glass have provided higher-fidelity data for calibration of material models used in glass-to-metal (GTM) seal analyses. Specifically, a thermo-multi-linear elastic plastic (thermo-MLEP) material model has been defined for SS304L, and the Simplified Potential Energy Clock nonlinear viscoelastic model has been calibrated for the S8061 glass. To assess the accuracy of finite element stress analyses of GTM seals, a suite of tests is proposed to provide data for comparison to model predictions.

  10. Split Hopkinson pressure bar technique: Experiments, analyses and applications

    Science.gov (United States)

    Gama, Bazle Anwer

    A critical review of the Hopkinson bar experimental technique is performed to identify the validity and applicability of the classic one-dimensional theory. A finite element model of the Hopkinson bar experiment is developed in three-dimensions and is used in detailed numerical analyses. For a small diameter hard specimen, the bar-specimen interfaces are non-planar, which predicts higher specimen strain and, thus, lower initial modulus in the linear elastic phase of deformation. In such cases, the stress distribution in the specimen is not uni-axial and a chamfered specimen geometry is found to provide better uni-axial stress condition in the specimen. In addition, a new Hopkinson bar with transmission tube is found suitable for small strain measurement of small diameter specimens. A one-dimensional exact Hopkinson bar theory considering the stress wave propagation in an equal diameter specimen has been formulated which predicts physically meaningful results in all extreme cases as compared to classic theory. In light of the theoretical and numerical investigations, an experimental methodology for rate dependent modulus and strength is developed. Quasi-static and dynamic behavior of plain weave (15 x 15) S-2 glass/SC15 composites has been investigated. A new circular-rectangular prism specimen (C-RPS) geometry is found suitable for testing laminated composites in the in-plane directions. Rate sensitive strength, non-linear strain and elastic modulus parameters for plain-weave (15 x 15) S-2 glass/SC15 composites have been experimentally determined.
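The classic one-dimensional data reduction the review starts from (strain rate from the reflected pulse, stress from the transmitted pulse) can be sketched as follows; the pulse values and bar properties are hypothetical:

```python
import numpy as np

def shpb_specimen_response(eps_r, eps_t, dt, bar_modulus, bar_wave_speed,
                           bar_area, spec_area, spec_length):
    """Classic 1-D split Hopkinson pressure bar data reduction:
    strain rate  e_dot = -2 c0 eps_r / L_s   (from the reflected pulse),
    stress       sigma = E (A_bar/A_s) eps_t (from the transmitted pulse).
    eps_r, eps_t are the reflected/transmitted strain-gauge pulses."""
    strain_rate = -2.0 * bar_wave_speed * np.asarray(eps_r) / spec_length
    strain = np.cumsum(strain_rate) * dt          # time-integrated strain
    stress = bar_modulus * (bar_area / spec_area) * np.asarray(eps_t)
    return strain, stress, strain_rate

# Hypothetical pulses: a flat reflected pulse gives a constant strain rate.
dt = 1e-6                        # sampling interval, s
eps_r = np.full(200, -1e-3)      # reflected pulse (compression negative)
eps_t = np.full(200, 5e-4)       # transmitted pulse
strain, stress, rate = shpb_specimen_response(
    eps_r, eps_t, dt, bar_modulus=200e9, bar_wave_speed=5000.0,
    bar_area=2.85e-4, spec_area=0.785e-4, spec_length=0.005)
print(f"strain rate ~ {rate[0]:.0f} 1/s, stress ~ {stress[0]/1e6:.0f} MPa")
```

This reduction assumes stress equilibrium and a uniaxial state in the specimen, exactly the assumptions the dissertation's three-dimensional finite element analysis puts to the test for small, hard specimens.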

  11. An Illumination Modeling System for Human Factors Analyses

    Science.gov (United States)

    Huynh, Thong; Maida, James C.; Bond, Robert L. (Technical Monitor)

    2002-01-01

Seeing is critical to human performance. Lighting is critical for seeing. Therefore, lighting is critical to human performance. This is common sense, and here on earth, it is easily taken for granted. However, on orbit, because the sun rises or sets every 45 minutes on average, humans working in space must cope with extremely dynamic lighting conditions. Contrast conditions, with harsh shadowing and glare, are also severe. The prediction of lighting conditions for critical operations is essential. Crew training can factor lighting into the lesson plans when necessary. Mission planners can determine whether low-light video cameras are required or whether additional luminaires need to be flown. The optimization of the quantity and quality of light is needed because of its effects on crew safety, on electrical power and on equipment maintainability. To address all of these issues, an illumination modeling system has been developed by the Graphics Research and Analyses Facility (GRAF) and the Lighting Environment Test Facility (LETF) in the Space Human Factors Laboratory at NASA Johnson Space Center. The system uses physically based ray tracing software (Radiance) developed at Lawrence Berkeley Laboratories, a human-factors-oriented geometric modeling system (PLAID) and an extensive database of humans and environments. Material reflectivity properties of major and critical surfaces are measured using a gonio-reflectometer. Luminaires (lights) are measured for beam spread distribution, color and intensity. Video camera performance is measured for color and light sensitivity. 3D geometric models of humans and the environment are combined with the material and light models to form a system capable of predicting lighting conditions and visibility conditions in space.

  12. Statistical prediction of Late Miocene climate

    Digital Repository Service at National Institute of Oceanography (India)

    Fernandes, A.A.; Gupta, S.M.

by the India Meteorological Department for forecasting the rainfall over India during the South West Monsoon. This problem is known as long-range prediction. The use of regression formulae is all-pervasive. Often we do not consciously realize that we... variables and assume that others are constant. The inter-relationships between behaviour and environment can be analysed as they exist in real life. Essentially the aim of factor analysis is to quantitatively summarise and reveal information hidden in a...

  13. Machine learning algorithms for datasets popularity prediction

    CERN Document Server

    Kancys, Kipras

    2016-01-01

This report represents a continued study in which ML algorithms were used to predict dataset popularity. Three topics were covered. First, there was a discrepancy between the old and new meta-data collection procedures, so a reason for it had to be found. Second, different parameters were analysed and dropped to make the algorithms perform better. Third, it was decided to move the modelling part to Spark.

  14. Reliability Analyses of Groundwater Pollutant Transport

    Energy Technology Data Exchange (ETDEWEB)

    Dimakis, Panagiotis

    1997-12-31

This thesis develops a probabilistic finite element model for the analysis of groundwater pollution problems. Two computer codes were developed: (1) one using the finite element technique to solve the two-dimensional steady state equations of groundwater flow and pollution transport, and (2) a first order reliability method code that can perform a probabilistic analysis of any given analytical or numerical equation. The two codes were connected into one model, PAGAP (Probability Analysis of Groundwater And Pollution). PAGAP can be used to obtain (1) the probability that the concentration at a given point at a given time will exceed a specified value, (2) the probability that the maximum concentration at a given point will exceed a specified value and (3) the probability that the residence time at a given point will exceed a specified period. PAGAP could be used as a tool for assessment purposes and risk analyses, for instance assessing the efficiency of a proposed remediation technique or studying the effects of parameter distributions for a given problem (sensitivity study). The model has been applied to study the greatest self-sustained, precipitation-controlled aquifer in northern Europe, which underlies Oslo's new major airport. 92 refs., 187 figs., 26 tabs.
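The kind of exceedance probability PAGAP computes can be illustrated with plain Monte Carlo over a simple analytical transport solution standing in for the finite element model. The Ogata-Banks-type formula and all parameter distributions below are assumptions for illustration, not values from the thesis:

```python
import math
import numpy as np

rng = np.random.default_rng(3)

def concentration(x, t, v, alpha_l, c0=1.0):
    """Simplified 1-D advection-dispersion solution for a continuous
    source: C/C0 = 0.5 * erfc((x - v t) / (2 sqrt(D t))), with
    D = alpha_l * v. A deliberately simple stand-in for the FE model."""
    d = alpha_l * v
    return 0.5 * c0 * math.erfc((x - v * t) / (2.0 * math.sqrt(d * t)))

# Uncertain inputs (hypothetical): lognormal seepage velocity and a
# uniformly distributed longitudinal dispersivity.
n = 20_000
v = rng.lognormal(mean=math.log(0.1), sigma=0.4, size=n)   # m/day
alpha = rng.uniform(1.0, 10.0, size=n)                     # m
c = np.array([concentration(x=100.0, t=900.0, v=vi, alpha_l=ai)
              for vi, ai in zip(v, alpha)])
p_exceed = float((c > 0.1).mean())
print(f"P(C > 0.1 C0 at x=100 m, t=900 d) ~ {p_exceed:.3f}")
```

A first order reliability method, as used in PAGAP, replaces this sampling with a linearization around the design point, trading some accuracy for far fewer model evaluations.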

  15. Abundance analyses of cool extreme helium stars

    CERN Document Server

    Pandey, G; Lambert, D L; Jeffery, C S; Asplund, M; Pandey, Gajendra; Lambert, David L.; Asplund, Martin

    2001-01-01

    Extreme helium stars (EHe) with effective temperatures from 8000K to 13000K are among the coolest EHe stars and overlap the hotter R CrB stars in effective temperature. The cool EHes may represent an evolutionary link between the hot EHes and the R CrBs. Abundance analyses of four cool EHes are presented. To test for an evolutionary connection, the chemical compositions of cool EHes are compared with those of hot EHes and R CrBs. Relative to Fe, the N abundance of these stars is intermediate between those of hot EHes and R CrBs. For the R CrBs, the metallicity M derived from the mean of Si and S appears to be more consistent with the kinematics than that derived from Fe. When metallicity M derived from Si and S replaces Fe, the observed N abundances of EHes and R CrBs fall at or below the upper limit corresponding to thorough conversion of initial C and O to N. There is an apparent difference between the composition of R CrBs and EHes; the former having systematically higher [N/M] ratios. The material present...

  16. A new modular chemiluminescence immunoassay analyser evaluated.

    Science.gov (United States)

    Ognibene, A; Drake, C J; Jeng, K Y; Pascucci, T E; Hsu, S; Luceri, F; Messeri, G

    2000-03-01

    Thyrotropin (TSH), free thyroxine (fT4) and testosterone assays were used as probes to evaluate the performance of a new modular chemiluminescence (CL) immunoassay analyser, the Abbott Architect 2000. The evaluation was run in parallel on other systems that use CL as the detection reaction: DPC Immulite, Chiron Diagnostics ACS-180 and ACS Centaur (TSH functional sensitivity only). TSH functional sensitivity was 0.0012, 0.009, 0.033 and 0.039 mU/l for the Architect, Immulite, ACS Centaur and ACS-180, respectively. Testosterone functional sensitivity was 0.38, 3.7 and 2.0 nmol/l for the Architect, Immulite and ACS-180, respectively. Good correlation was obtained between the ACS-180 and Architect for all assays. Immulite results did not agree well with the Architect or ACS-180 for fT4 and testosterone but were in good agreement for TSH. For fT4 and testosterone, equilibrium dialysis and isotope-dilution gas chromatography-mass spectrometry (GC-MS), respectively, were used as reference methods. For both within- and between-run precision, the Architect showed the best reproducibility for all three analytes (CV < 6%).

  17. Kinematic gait analyses in healthy Golden Retrievers

    Directory of Open Access Journals (Sweden)

    Gabriela C.A. Silva

    2014-12-01

    Kinematic analysis concerns the relative movement between rigid bodies and finds application in gait analysis and other body movements; when its data reveal a change, they help determine the choice of treatment to be instituted. The objective of this study was to characterize the normal gait of healthy Golden Retrievers to assist in the diagnosis and treatment of musculoskeletal disorders. A kinematic analysis system was used to analyse the gait of seven clinically normal female Golden Retrievers, aged between 2 and 4 years and weighing 21.5 to 28 kg. Flexion and extension were described for the shoulder, elbow, carpal, hip, femorotibial and tarsal joints. The gait was characterized laterally, and the hypothesis of normality was accepted for all variables except the stance phase of the hip and elbow, at a confidence level of 95% (significance level α = 0.05). Variations were attributed to displacement of the skin markers during movement and to the duplicated number of measurements. Kinematic analysis proved to be a consistent method for evaluating movement during canine gait, and the data can be used in the diagnosis and evaluation of canine gait in comparison with other studies and in the treatment of dogs with musculoskeletal disorders.

  18. Network-Based and Binless Frequency Analyses.

    Directory of Open Access Journals (Sweden)

    Sybil Derrible

    We introduce and develop a new network-based and binless methodology to perform frequency analyses and produce histograms. In contrast with traditional frequency analysis techniques that use fixed intervals to bin values, we place a range ±ζ around each individual value in a data set and count the number of values within that range, which allows us to compare every single value of a data set with one another. In essence, the methodology is identical to the construction of a network, where two values are connected if they lie within a given range (±ζ). The value with the highest degree (i.e., most connections) is therefore taken as the mode of the distribution. To select an optimal range, we look at the stability of the proportion of nodes in the largest cluster. The methodology is validated by sampling 12 typical distributions, and it is applied to a number of real-world data sets with both spatial and temporal components. The methodology can be applied to any data set and provides a robust means to uncover meaningful patterns and trends. A free Python script and a tutorial are also made available to facilitate the application of the method.
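A minimal sketch of the counting step described above: each value is "connected" to every other value within ±ζ, and the value with the highest degree is taken as the mode. The example data and ζ are made up for illustration; the cluster-stability criterion for choosing ζ is not shown.

```python
import bisect

def binless_mode(values, zeta):
    """Return (mode, degree): the value with the most neighbours within +/- zeta."""
    xs = sorted(values)
    best_val, best_deg = None, -1
    for x in xs:
        lo = bisect.bisect_left(xs, x - zeta)
        hi = bisect.bisect_right(xs, x + zeta)
        deg = hi - lo - 1          # neighbours in range, excluding the value itself
        if deg > best_deg:
            best_val, best_deg = x, deg
    return best_val, best_deg

# hypothetical data with a dense cluster near 5
data = [1.0, 1.1, 1.2, 5.0, 5.05, 5.1, 5.15, 9.0]
mode, deg = binless_mode(data, zeta=0.2)
# the cluster around 5 wins: mode 5.0 with degree 3
```

Sorting plus binary search keeps the degree computation at O(n log n) rather than the O(n²) of comparing every pair directly.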

  19. Field analyses of tritium at environmental levels

    Energy Technology Data Exchange (ETDEWEB)

    Hofstetter, K.J.; Cable, P.R.; Beals, D.M

    1999-02-11

    An automated, remote system to analyze tritium in aqueous solutions at environmental levels has been tested and has demonstrated laboratory-quality tritium analysis capability in near real time. The field deployable tritium analysis system (FDTAS) consists of a novel multi-port autosampler, an on-line water purification system, and a prototype stop-flow liquid scintillation counter (LSC) which can be remotely controlled for unmanned operation. Backgrounds of ≈1.5 counts/min in the tritium channel are routinely measured with a tritium detection efficiency of ≈25% for the custom 11 ml cell. A detection limit of <0.3 pCi/ml has been achieved for 100-min counts using a 50 : 50 mixture of sample and cocktail. To assess the long-term performance characteristics of the FDTAS, a composite sampler was installed on the Savannah River, downstream of the Savannah River Site, and collected repetitive 12-hour composite samples over a 14-day period. The samples were analyzed using the FDTAS and in the laboratory using a standard bench-top LSC. The results of the tritium analyses by the FDTAS and by the laboratory LSC were consistent for comparable counting times at the typical river tritium background levels (≈1 pCi/ml).
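The reported <0.3 pCi/ml detection limit can be roughly reproduced from the quoted background, efficiency, and counting time. The abstract does not state which formula was used; Currie's detection limit is assumed here, along with a sample volume of ~5.5 ml inferred from the 50:50 mix in the 11 ml cell.

```python
import math

def currie_mda(bkg_cpm, t_min, eff, sample_ml):
    """Minimum detectable activity (pCi/ml) via Currie's detection limit,
    L_D = 2.71 + 4.65*sqrt(B) counts, where B is the expected background counts."""
    B = bkg_cpm * t_min
    L_D = 2.71 + 4.65 * math.sqrt(B)   # net counts at the detection limit
    dpm = L_D / t_min / eff            # disintegrations per minute
    pci = dpm / 2.22                   # 2.22 dpm per pCi
    return pci / sample_ml

# values from the abstract: ~1.5 cpm background, 25% efficiency, 100-min count
mda = currie_mda(bkg_cpm=1.5, t_min=100, eff=0.25, sample_ml=5.5)
# comes out near 0.2 pCi/ml, consistent with the reported <0.3 pCi/ml
```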

  20. Evaluation of the Olympus AU-510 analyser.

    Science.gov (United States)

    Farré, C; Velasco, J; Ramón, F

    1991-01-01

    The selective multitest Olympus AU-510 analyser was evaluated according to the recommendations of the Comision de Instrumentacion de la Sociedad Española de Quimica Clinica and the European Committee for Clinical Laboratory Standards. The evaluation was carried out in two stages: an examination of the analytical units and then an evaluation in routine work conditions. The operational characteristics of the system were also studied. The first stage included a photometric study: depending on the absorbance, the inaccuracy varies from +0.5% to -0.6% at 405 nm and from -5.6% to 10.6% at 340 nm; the imprecision ranges between -0.22% and 0.56% at 405 nm and between 0.09% and 2.74% at 340 nm. Linearity was acceptable, apart from a very low absorbance for NADH at 340 nm, and the imprecision of the serum sample pipetter was satisfactory. Twelve serum analytes were studied under routine conditions: glucose, urea, urate, cholesterol, triglycerides, total bilirubin, creatinine, phosphate, iron, aspartate aminotransferase, alanine aminotransferase and gamma-glutamyl transferase. The within-run imprecision (CV%) ranged from 0.67% for phosphate to 2.89% for iron, and the between-run imprecision from 0.97% for total bilirubin to 7.06% for iron. There was no carry-over in a study of the serum sample pipetter. Carry-over studies with the reagent and sample pipetters showed some cross-contamination in the iron assay.

  1. Consumption patterns and perception analyses of hangwa.

    Science.gov (United States)

    Kwock, Chang Geun; Lee, Min A; Park, So Hyun

    2012-03-01

    Hangwa is a traditional Korean confection which, to match current consumption trends, needs marketing strategies to extend its consumption. Therefore, the purpose of this study was to analyze consumers' consumption patterns and perception of Hangwa to increase its consumption in the market. A questionnaire was sent to 250 consumers by e-mail from Oct 8-23, 2009, and the data from 231 respondents were analyzed in this study. Descriptive statistics, paired-samples t-tests, and importance-performance analyses were conducted using SPSS WIN 17.0. According to the results, Hangwa was purchased mainly 'as a present' (39.8%) and the main reasons for buying it were 'traditional image' (33.3%) and 'taste' (22.5%). When the importance and performance of attributes considered in purchasing Hangwa were evaluated, performance was assessed to be lower than importance for all attributes. The attributes in the first quadrant, with high importance and high performance, were 'a sanitary process', 'a rigorous quality mark' and 'taste', which were related to product quality. Those with high importance but low performance were 'popularization through advertisement', 'promotion through mass media', 'conversion of thought on traditional foods', 'a reasonable price' and 'a wide range of prices'. In conclusion, Hangwa manufacturers need to diversify products and extend the expiration date based on technologies to promote consumption. In terms of price, Hangwa should become more available by lowering the price barrier for consumers who are sensitive to price.

  2. Comparative analyses of bidirectional promoters in vertebrates

    Directory of Open Access Journals (Sweden)

    Taylor James

    2008-05-01

    Background: Orthologous genes with deep phylogenetic histories are likely to retain similar regulatory features. In this report we utilize orthology assignments for pairs of genes co-regulated by bidirectional promoters to map the ancestral history of the promoter regions. Results: Our mapping of bidirectional promoters from humans to fish shows that many such promoters emerged after the divergence of chickens and fish. Furthermore, annotations of promoters in deep phylogenies enable detection of missing data or assembly problems present in higher vertebrates. The functional importance of bidirectional promoters is indicated by selective pressure to maintain the arrangement of genes regulated by the promoter over long evolutionary time spans. Characteristics unique to bidirectional promoters are further elucidated using a technique for unsupervised classification, known as ESPERR. Conclusion: Results of these analyses will aid in our understanding of the evolution of bidirectional promoters, including whether the regulation of two genes evolved as a consequence of their proximity or if function dictated their co-regulation.

  3. ANALYSES AND INFLUENCES OF GLAZED BUILDING ENVELOPES

    Directory of Open Access Journals (Sweden)

    Sabina Jordan

    2011-01-01

    The article presents the results of an analytical study of the functioning of glazing at two different yet interacting levels: at the level of the building as a whole, and at that of glazing as a building element. At the building level, analyses were performed on a sample of high-rise business buildings in Slovenia, where the glazing's share of the building envelope was calculated and estimates of the proportion of shade provided by external blinds were made. It is shown that, especially in the case of modern buildings with large proportions of glazing and buildings with no shading devices, careful glazing design is needed, together with a sound knowledge of energy performance. In the second part of the article, the energy balance values relating to selected types of glazing are presented, including solar control glazing. The paper demonstrates the need for a holistic energy approach to glazing problems, as well as how different types of glazing can be methodically compared, thus improving the design of sustainability-oriented buildings.

  4. Statistical analyses of a screen cylinder wake

    Science.gov (United States)

    Mohd Azmi, Azlin; Zhou, Tongming; Zhou, Yu; Cheng, Liang

    2017-02-01

    The evolution of a screen cylinder wake was studied by analysing its statistical properties over a streamwise range of x/d = 10-60. The screen cylinder was made of a stainless steel screen mesh of 67% porosity. The experiments were conducted in a wind tunnel at a Reynolds number of 7000 using an X-probe. The results were compared with those obtained in the wake generated by a solid cylinder. It was observed that the evolution of the statistics in the wake of the screen cylinder was different from that of a solid cylinder, reflecting the differences in the formation of the organized large-scale vortices in both wakes. The streamwise evolution of the Reynolds stresses, energy spectra and cross-correlation coefficients indicated that there exists a critical location that differentiates the screen cylinder wake into two regions over the measured streamwise range. The formation of the fully formed large-scale vortices was delayed until this critical location. Comparison with existing results for screen strips showed that although the near-wake characteristics and the vortex formation mechanism were similar between the two wake generators, variation in the Strouhal frequencies was observed and the self-preservation states were non-universal, reconfirming the dependence of a wake on its initial condition.

  5. Genomic analyses of the CAM plant pineapple.

    Science.gov (United States)

    Zhang, Jisen; Liu, Juan; Ming, Ray

    2014-07-01

    The innovation of crassulacean acid metabolism (CAM) photosynthesis in arid and/or low CO2 conditions is a remarkable case of adaptation in flowering plants. As the most important crop that utilizes CAM photosynthesis, the genetic and genomic resources of pineapple have been developed over many years. Genetic diversity studies using various types of DNA markers led to the reclassification of the two genera Ananas and Pseudananas and nine species into one genus Ananas and two species, A. comosus and A. macrodontes, with five botanical varieties in A. comosus. Five genetic maps have been constructed using F1 or F2 populations, and high-density genetic maps generated by genotype sequencing are essential resources for sequencing and assembling the pineapple genome and for marker-assisted selection. There are abundant expression sequence tag resources but limited genomic sequences in pineapple. Genes involved in the CAM pathway have been analysed in several CAM plants, but only a few of them are from pineapple. A reference genome of pineapple is being generated and will accelerate genetic and genomic research in this major CAM crop. It will provide the foundation for studying the origin and regulatory mechanism of CAM photosynthesis, and the opportunity to evaluate the classification of Ananas species and botanical cultivars.

  6. WIND SPEED AND ENERGY POTENTIAL ANALYSES

    Directory of Open Access Journals (Sweden)

    A. TOKGÖZLÜ

    2013-01-01

    This paper provides a case study on the application of wavelet techniques to analyze wind speed and energy (a renewable and environmentally friendly energy source). Solar and wind are the main energy sources that give farmers the potential to transfer kinetic energy captured by the windmill for pumping water, drying crops, heating greenhouses, rural electrification or cooking. Larger wind turbines (over 1 MW) can pump enough water for small-scale irrigation. This study initiated a data gathering process for wavelet analyses of different scale effects and their role in wind speed and direction variations. The wind data gathering system is mounted at latitude 37° 50' N, longitude 30° 33' E, at a height of 1200 m above mean sea level, on a hill near the Süleyman Demirel University campus. Ten-minute average values of wind speed and direction at two levels (10 m and 30 m above ground level) were recorded by a data logger between July 2001 and February 2002. Wind speed values ranged between 0 m/s and 54 m/s. The annual mean speed value is 4.5 m/s at 10 m above ground level. Prevalent wind

  7. Trend Analyses of Nitrate in Danish Groundwater

    Science.gov (United States)

    Hansen, B.; Thorling, L.; Dalgaard, T.; Erlandsen, M.

    2012-04-01

    This presentation assesses the long-term development in the oxic groundwater nitrate concentration and nitrogen (N) loss due to intensive farming in Denmark. Firstly, up to 20-year time-series from the national groundwater monitoring network enable a statistically systematic analysis of distribution, trends and trend reversals in the groundwater nitrate concentration. Secondly, knowledge about the N surplus in Danish agriculture since 1950 is used as an indicator of the potential loss of N. Thirdly, groundwater recharge CFC (chlorofluorocarbon) age determination allows linking of the first two datasets. The development in the nitrate concentration of oxic groundwater clearly mirrors the development in the national agricultural N surplus, and a corresponding trend reversal is found in groundwater. Regulation and technical improvements in the intensive farming in Denmark have succeeded in decreasing the N surplus by 40% since the mid 1980s while at the same time maintaining crop yields and increasing the animal production of especially pigs. Trend analyses prove that the youngest (0-15 years old) oxic groundwater shows more pronounced significant downward nitrate trends (44%) than the oldest (25-50 years old) oxic groundwater (9%). This amounts to clear evidence of the effect of reduced nitrate leaching on groundwater nitrate concentrations in Denmark. Is the Danish groundwater monitoring strategy optimal for the detection of nitrate trends? Will the nitrate concentrations in Danish groundwater continue to decrease, or are the Danish nitrate concentration levels now appropriate according to the Water Framework Directive?
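The abstract does not name the trend test used; the non-parametric Mann-Kendall test is a common choice for groundwater quality time series, and a minimal version (ignoring ties and serial correlation) is sketched below with a hypothetical nitrate series.

```python
import math

def mann_kendall(x):
    """Mann-Kendall trend test (no-ties variance): returns (S, z).
    Negative z indicates a downward trend; |z| > 1.96 is significant at ~5%."""
    n = len(x)
    s = sum((x[j] > x[i]) - (x[j] < x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

# hypothetical annual nitrate concentrations (mg/l) in young oxic groundwater
nitrate = [52, 50, 49, 47, 46, 44, 43, 41, 40, 38]
s, z = mann_kendall(nitrate)
# strictly decreasing series: S = -45, z well below -1.96
```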

  8. Reproducibility of neuroimaging analyses across operating systems.

    Science.gov (United States)

    Glatard, Tristan; Lewis, Lindsay B; Ferreira da Silva, Rafael; Adalat, Reza; Beck, Natacha; Lepage, Claude; Rioux, Pierre; Rousseau, Marc-Etienne; Sherif, Tarek; Deelman, Ewa; Khalili-Mahani, Najmeh; Evans, Alan C

    2015-01-01

    Neuroimaging pipelines are known to generate different results depending on the computing platform where they are compiled and executed. We quantify these differences for brain tissue classification, fMRI analysis, and cortical thickness (CT) extraction, using three of the main neuroimaging packages (FSL, Freesurfer and CIVET) and different versions of GNU/Linux. We also identify some causes of these differences using library and system call interception. We find that these packages use mathematical functions based on single-precision floating-point arithmetic whose implementations in operating systems continue to evolve. While these differences have little or no impact on simple analysis pipelines such as brain extraction and cortical tissue classification, their accumulation creates important differences in longer pipelines such as subcortical tissue classification, fMRI analysis, and cortical thickness extraction. With FSL, most Dice coefficients between subcortical classifications obtained on different operating systems remain above 0.9, but values as low as 0.59 are observed. Independent component analyses (ICA) of fMRI data differ between operating systems in one third of the tested subjects, due to differences in motion correction. With Freesurfer and CIVET, in some brain regions we find an effect of build or operating system on cortical thickness. A first step to correct these reproducibility issues would be to use more precise representations of floating-point numbers in the critical sections of the pipelines. The numerical stability of pipelines should also be reviewed.
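The Dice coefficients quoted above for comparing subcortical classifications across operating systems are straightforward to compute; a sketch with hypothetical binary masks (the masks and values are illustrative, not from the study):

```python
def dice(a, b):
    """Dice coefficient between two binary label masks (sequences of 0/1).
    1.0 means identical segmentations; 0.0 means no overlap."""
    inter = sum(x and y for x, y in zip(a, b))
    size = sum(a) + sum(b)
    return 2.0 * inter / size if size else 1.0

# two hypothetical segmentations of the same structure from different OS builds
mask_build_a = [1, 1, 1, 0, 0, 1, 1, 0]
mask_build_b = [1, 1, 0, 0, 0, 1, 1, 1]
d = dice(mask_build_a, mask_build_b)
# 4 overlapping voxels out of 5 + 5 labelled -> Dice = 0.8
```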

  9. Seismic Soil-Structure Interaction Analyses of a Deeply Embedded Model Reactor – SASSI Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Nie J.; Braverman J.; Costantino, M.

    2013-10-31

    This report summarizes the SASSI analyses of a deeply embedded reactor model performed by BNL and CJC and Associates, as part of the seismic soil-structure interaction (SSI) simulation capability project for the NEAMS (Nuclear Energy Advanced Modeling and Simulation) Program of the Department of Energy. The SASSI analyses included three cases: 0.2 g, 0.5 g, and 0.9 g, all of which refer to nominal peak accelerations at the top of the bedrock. The analyses utilized the modified subtraction method (MSM) for performing the seismic SSI evaluations. Each case consisted of two analyses: input motion in one horizontal direction (X) and input motion in the vertical direction (Z), both of which utilized the same in-column input motion. Besides providing SASSI results for use in comparison with the time domain SSI results obtained using the DIABLO computer code, this study also leads to the recognition that the frequency-domain method should be modernized so that it can better serve its mission-critical role for analysis and design of nuclear power plants.

  10. Asteroid Bennu Temperature Maps for OSIRIS-REx Spacecraft and Instrument Thermal Analyses

    Science.gov (United States)

    Choi, Michael K.; Emery, Josh; Delbo, Marco

    2014-01-01

    A thermophysical model has been developed to generate asteroid Bennu surface temperature maps for OSIRIS-REx spacecraft and instrument thermal design and analyses at the Critical Design Review (CDR). Two-dimensional temperature maps for worst hot and worst cold cases are used in Thermal Desktop to assure adequate thermal design margins. To minimize the complexity of the Bennu geometry in Thermal Desktop, it is modeled as a sphere instead of the radar shape. The post-CDR updated thermal inertia and a modified approach show that the new surface temperature predictions are more benign. Therefore the CDR Bennu surface temperature predictions are conservative.

  11. Use of EBSD Data in Numerical Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Becker, R; Wiland, H

    2000-01-14

    obtained for comparison with the model predictions. More recent work has taken advantage of automated data collection on deformed specimens as a means of collecting detailed and spatially correlated data for model validation. Although it will not be discussed in detail here, another area in which EBSD data is having a great impact is on recrystallization modeling. EBSD techniques can be used to collect data for quantitative microstructural analysis. This data can be used to infer growth kinetics of specific orientations, and this information can be synthesized into more accurate grain growth or recrystallization models. Another role which EBSD techniques may play is in determining initial structures for recrystallization models. A realistic starting structure is vital for evaluating the models, and attempts at predicting realistic structures with finite element simulations are not yet successful. As methodologies and equipment resolution continue to improve, it is possible that measured structures will serve as input for recrystallization models. Simulations have already been run using information obtained manually from a TEM.

  12. Predicting protein structure classes from function predictions

    DEFF Research Database (Denmark)

    Sommer, I.; Rahnenfuhrer, J.; de Lichtenberg, Ulrik;

    2004-01-01

    We introduce a new approach to using the information contained in sequence-to-function prediction data in order to recognize protein template classes, a critical step in predicting protein structure. The data on which our method is based comprise probabilities of functional categories; for given query sequences these probabilities are obtained by a neural net that has previously been trained on a variety of functionally important features. On a training set of sequences we assess the relevance of individual functional categories for identifying a given structural family. Using a combination of the most relevant categories, the likelihood of a query sequence to belong to a specific family can be estimated. Results: The performance of the method is evaluated using cross-validation. For a fixed structural family and for every sequence, a score is calculated that measures the evidence for family...

  13. 'Red Flag' Predictions

    DEFF Research Database (Denmark)

    Hallin, Carina Antonia; Andersen, Torben Juul; Tveterås, Sigbjørn

    -generation prediction markets and outline its unique features as a third-generation prediction market. It is argued that frontline employees gain deep insights when they execute operational activities on an ongoing basis in the organization. The experiential learning from close interaction with internal and external...

  14. Improved nonlinear prediction method

    Science.gov (United States)

    Adenan, Nur Hamiza; Md Noorani, Mohd Salmi

    2014-06-01

    The analysis and prediction of time series data have been addressed by researchers. Many techniques have been developed to be applied in various areas, such as weather forecasting, financial markets and hydrological phenomena involving data that are contaminated by noise. Therefore, various techniques to improve the method have been introduced to analyze and predict time series data. Given the importance of analysis and the accuracy of prediction results, a study was undertaken to test the effectiveness of the improved nonlinear prediction method for data that contain noise. The improved nonlinear prediction method involves the formation of composite serial data based on the successive differences of the time series. Then, phase space reconstruction is performed on the composite (one-dimensional) data to reconstruct a number of space dimensions. Finally, the local linear approximation method is employed to make a prediction based on the phase space. This improved method was tested with logistic map data series containing 0%, 5%, 10%, 20% and 30% noise. The results show that by using the improved method, the predictions were found to be in close agreement with the observed values. The correlation coefficient was close to one when the improved method was applied to data with up to 10% noise. Thus, an improvement for analyzing and predicting noisy time series data, without involving any noise reduction method, was introduced.
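The three steps of the improved method (differencing, phase-space reconstruction, local prediction) can be sketched as follows. For brevity the last step uses a zero-order nearest-neighbour average rather than a full local linear fit, and the logistic-map series, embedding dimension, and neighbour count are illustrative assumptions.

```python
import numpy as np

def embed(x, dim, tau=1):
    """Delay-embed a 1-D series into dim-dimensional phase-space vectors."""
    n = len(x) - (dim - 1) * tau
    return np.array([x[i:i + (dim - 1) * tau + 1:tau] for i in range(n)])

def predict_next(x, dim=3, k=5):
    """Predict the next value of series x by averaging the successors of the
    k nearest phase-space neighbours of the last embedded point
    (a zero-order simplification of the local linear step)."""
    vecs = embed(np.asarray(x, float), dim)
    query, history = vecs[-1], vecs[:-1]
    dists = np.linalg.norm(history - query, axis=1)
    nearest = np.argsort(dists)[:k]
    return float(np.mean([x[i + dim] for i in nearest]))

# composite series: successive differences of a logistic-map trajectory
traj = [0.4]
for _ in range(500):
    traj.append(3.8 * traj[-1] * (1 - traj[-1]))
composite = np.diff(traj)
pred = predict_next(composite.tolist())
```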

  15. Predicting AD conversion

    DEFF Research Database (Denmark)

    Liu, Yawu; Mattila, Jussi; Ruiz, Miguel Ángel Muñoz

    2013-01-01

    To compare the accuracies of predicting AD conversion by using a decision support system (PredictAD tool) and current research criteria of prodromal AD as identified by combinations of episodic memory impairment of hippocampal type and visual assessment of medial temporal lobe atrophy (MTA) on MRI...

  16. The Prediction Value

    NARCIS (Netherlands)

    Koster, M.; Kurz, S.; Lindner, I.; Napel, S.

    2013-01-01

    We introduce the prediction value (PV) as a measure of players’ informational importance in probabilistic TU games. The latter combine a standard TU game and a probability distribution over the set of coalitions. Player i’s prediction value equals the difference between the conditional expectations

  17. Non-independence and sensitivity analyses in ecological and evolutionary meta-analyses.

    Science.gov (United States)

    Noble, Daniel W A; Lagisz, Malgorzata; O'dea, Rose E; Nakagawa, Shinichi

    2017-01-30

    Meta-analysis is an important tool for synthesizing research on a variety of topics in ecology and evolution, including molecular ecology, but can be susceptible to non-independence. Non-independence can affect two major interrelated components of a meta-analysis: 1) the calculation of effect size statistics and 2) the estimation of overall meta-analytic estimates and their uncertainty. While some solutions to non-independence exist at the statistical analysis stages, there is little advice on what to do when complex analyses are not possible, or when studies with non-independent experimental designs exist in the data. Here we argue that exploring the effects of procedural decisions in a meta-analysis (e.g., inclusion of different quality data, choice of effect size) and statistical assumptions (e.g., assuming no phylogenetic covariance) using sensitivity analyses are extremely important in assessing the impact of non-independence. Sensitivity analyses can provide greater confidence in results and highlight important limitations of empirical work (e.g., impact of study design on overall effects). Despite their importance, sensitivity analyses are seldom applied to problems of non-independence. To encourage better practice for dealing with non-independence in meta-analytic studies, we present accessible examples demonstrating the impact that ignoring non-independence can have on meta-analytic estimates. We also provide pragmatic solutions for dealing with non-independent study designs, and for analyzing dependent effect sizes. Additionally, we offer reporting guidelines that will facilitate disclosure of the sources of non-independence in meta-analyses, leading to greater transparency and more robust conclusions.
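One of the simplest sensitivity analyses in this spirit is re-estimating the pooled effect after dropping each study in turn (leave-one-out). A fixed-effect sketch with hypothetical effect sizes, where the last study is an outlier:

```python
def weighted_mean(effects, variances):
    """Fixed-effect meta-analytic mean with inverse-variance weights."""
    ws = [1.0 / v for v in variances]
    return sum(w * e for w, e in zip(ws, effects)) / sum(ws)

def leave_one_out(effects, variances):
    """Sensitivity analysis: recompute the pooled effect dropping each study."""
    out = []
    for i in range(len(effects)):
        es = effects[:i] + effects[i + 1:]
        vs = variances[:i] + variances[i + 1:]
        out.append(weighted_mean(es, vs))
    return out

effects = [0.30, 0.25, 0.40, 1.20]   # hypothetical effect sizes; last is an outlier
variances = [0.02, 0.03, 0.02, 0.02]
pooled = weighted_mean(effects, variances)
loo = leave_one_out(effects, variances)
# dropping the outlier (index 3) moves the pooled estimate the most
```

A large swing in any leave-one-out estimate flags a study whose (possibly non-independent) design deserves scrutiny before trusting the pooled result.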

  18. Analyse textuelle des discours: Niveaux ou plans d'analyse

    Directory of Open Access Journals (Sweden)

    Jean-Michel Adam

    2012-12-01

    The article deals with the theory of Textual Discourse Analysis (Analyse Textuelle des Discours, ATD), revisiting the Brazilian translation of La linguistique textuelle: introduction à l'analyse textuelle des discours (Cortez, 2008). ATD is framed by three preliminary observations: text linguistics is one of the disciplines of discourse analysis; the text is the object of analysis of ATD; and as soon as there is a text, that is, recognition that a sequence of utterances forms a communicative whole, there is an effect of genericity, i.e. the inscription of that sequence of utterances in a class of discourse. The theoretical model of ATD is clarified through a re-examination of its schema 4, which represents eight levels of analysis. ATD is approached in terms of a twofold requirement, the theoretical as well as the methodological and didactic reasons that lead to these levels, and the five planes or levels of textual analysis are detailed and illustrated. Finally, parts of the work are revisited and expanded with further analyses in which new theoretical aspects are detailed.

  19. Prediction by Compression

    CERN Document Server

    Ratsaby, Joel

    2010-01-01

    It is well known that text compression can be achieved by predicting the next symbol in the stream of text data based on the history seen up to the current symbol. The better the prediction, the more skewed the conditional probability distribution of the next symbol and the shorter the codeword that needs to be assigned to represent this next symbol. What about the opposite direction? Suppose we have a black box that can compress a text stream. Can it be used to predict the next symbol in the stream? We introduce a criterion based on the length of the compressed data and use it to predict the next symbol. We examine empirically the prediction error rate and its dependency on some compression parameters.
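The criterion can be sketched with an off-the-shelf compressor such as zlib: try each candidate symbol and predict the one whose appended history compresses shortest. This is an illustrative reconstruction, not the authors' exact criterion, and on short strings byte-granular compressed lengths often tie (ties resolve to the first candidate).

```python
import zlib

def predict_next_symbol(history, alphabet):
    """Predict the next symbol as the candidate whose appending yields the
    shortest compressed length (compression-as-prediction criterion)."""
    def clen(s):
        return len(zlib.compress(s.encode("utf-8"), 9))
    return min(alphabet, key=lambda c: clen(history + c))

text = "abc" * 200 + "ab"
guess = predict_next_symbol(text, "abc")
```

A stronger context-modelling compressor (e.g. PPM-style) would separate the candidates more sharply than DEFLATE does at this granularity.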

  20. Theoretical and computational analyses of LNG evaporator

    Science.gov (United States)

    Chidambaram, Palani Kumar; Jo, Yang Myung; Kim, Heuy Dong

    2017-04-01

    Theoretical and numerical analyses of the fluid flow and heat transfer inside an LNG evaporator are conducted in this work. Methane is used instead of LNG as the operating fluid, because methane constitutes over 80% of natural gas. The analytical calculations, performed using simple mass and energy balance equations, assess the pressure and temperature variations in the steam tube. Multiphase numerical simulations are performed by solving the governing equations (the basic flow equations of continuity, momentum and energy) in a portion of the evaporator domain consisting of a single steam pipe. The flow equations are solved along with equations of species transport. Multiphase modeling is incorporated using the VOF method. Liquid methane is the primary phase; it vaporizes into the secondary phase, gaseous methane. Steam is another secondary phase, which flows through the heating coils. Turbulence is modeled by a two-equation turbulence model. The theoretical and numerical predictions are seen to match each other well. Further parametric studies are planned based on the current research.
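A minimal version of the mass-and-energy-balance step might look like the following; the methane and steam property values are rough textbook numbers assumed for illustration, not values from the paper.

```python
def evaporator_duty(m_dot, h_fg=511e3, cp_gas=2220.0, superheat=20.0):
    """Steady-state energy balance for a methane evaporator, in watts:
    Q = m_dot * (latent heat of vaporization + sensible superheat).
    h_fg [J/kg] and cp_gas [J/(kg K)] are rough textbook values for methane."""
    return m_dot * (h_fg + cp_gas * superheat)

def steam_required(q_watts, h_fg_steam=2257e3):
    """Steam mass flow [kg/s] needed to supply duty Q, assuming saturated
    steam gives up only its latent heat as it condenses in the coils."""
    return q_watts / h_fg_steam

q = evaporator_duty(m_dot=2.0)   # vaporize 2 kg/s of methane
m_steam = steam_required(q)      # roughly 0.5 kg/s of condensing steam
```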

  1. Proteomic analyses of apoplastic proteins from germinating Arabidopsis thaliana pollen.

    Science.gov (United States)

    Ge, Weina; Song, Yun; Zhang, Cuijun; Zhang, Yafang; Burlingame, Alma L; Guo, Yi

    2011-12-01

    Pollen grains play important roles in the reproductive processes of flowering plants. The roles of apoplastic proteins in pollen germination and in pollen tube growth are comparatively less well understood. To investigate the functions of apoplastic proteins in pollen germination, the global apoplastic proteins of mature and germinated Arabidopsis thaliana pollen grains were prepared for differential analyses using 2-dimensional fluorescence difference gel electrophoresis (2-D DIGE) saturation labeling techniques. One hundred and three proteins were differentially expressed (p value ≤ 0.01) in pollen germinated for 6 h compared with ungerminated mature pollen, and 98 spots, representing 71 proteins, were identified by LC-MS/MS. By bioinformatics analysis, 50 proteins were identified as secreted proteins. These proteins are mainly involved in cell wall modification and remodeling, protein metabolism, and signal transduction. Three of the differentially expressed proteins were randomly selected to determine their subcellular localizations by transiently expressing YFP fusion proteins. The results of subcellular localization were identical with the bioinformatics prediction. Based on these data, we proposed a model for apoplastic proteins functioning in pollen germination and pollen tube growth. These results will lead to a better understanding of the mechanisms of pollen germination and pollen tube growth.

  2. Experimental and numerical analyses of magnesium alloy hot workability

    Directory of Open Access Journals (Sweden)

    F. Abbassi

    2016-12-01

    Full Text Available Due to their hexagonal crystal structure, magnesium alloys have relatively low workability at room temperature. In this study, the hot workability behavior of cast-extruded AZ31B magnesium alloy is studied through hot compression testing, numerical modeling and microstructural analyses. Hot deformation tests are performed at temperatures of 250 °C to 400 °C under strain rates of 0.01 to 1.0 s−1. Transmission electron microscopy is used to reveal the presence of dynamic recrystallization (DRX), dynamic recovery (DRV), cracks and shear bands. To predict plastic instabilities during hot compression tests of AZ31B magnesium alloy, the authors use a Johnson–Cook damage model in a 3D finite element simulation. The optimal hot workability of the magnesium alloy is found at a temperature (T) of 400 °C and strain rate (ε˙) of 0.01 s−1. Stability is found at the lower strain rate, and instability is found at the higher strain rate.
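
    The Johnson–Cook damage criterion named above combines stress triaxiality, strain rate and temperature into a fracture strain, against which plastic strain increments are accumulated until damage reaches one. A minimal sketch follows; the parameters D1–D5 are placeholders, not the calibrated AZ31B values from the study.

```python
import math

# Johnson-Cook damage accumulation (generic sketch; D1..D5 are
# illustrative placeholder coefficients, not the paper's calibration).
def jc_fracture_strain(triax, strain_rate_ratio, homologous_T,
                       D=(0.05, 0.2, -1.5, 0.01, 1.0)):
    """Fracture strain: [D1+D2*exp(D3*s*)]*[1+D4*ln(r)]*[1+D5*T*]."""
    D1, D2, D3, D4, D5 = D
    return ((D1 + D2 * math.exp(D3 * triax))
            * (1.0 + D4 * math.log(strain_rate_ratio))
            * (1.0 + D5 * homologous_T))

def accumulate_damage(increments):
    """Sum plastic-strain increments weighted by the current fracture
    strain; an element is predicted to fail when damage reaches 1."""
    damage = 0.0
    for dep, triax, rate, T in increments:
        damage += dep / jc_fracture_strain(triax, rate, T)
    return damage
```

In a finite element run, each integration point carries its own damage sum evaluated at its local triaxiality, rate and temperature.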

  3. Non-Statistical Methods of Analysing of Bankruptcy Risk

    Directory of Open Access Journals (Sweden)

    Pisula Tomasz

    2015-06-01

    Full Text Available The article focuses on assessing the effectiveness of a non-statistical approach to bankruptcy modelling in enterprises operating in the logistics sector. In order to describe the issue more comprehensively, the prediction of possible negative results of business operations was carried out for companies functioning in the Polish region of Podkarpacie and in Slovakia. The bankruptcy predictors selected for the assessment of companies operating in the logistics sector comprised 28 financial indicators characterizing these enterprises in terms of their financial standing and management effectiveness. The purpose of the study was to identify factors (models) describing the bankruptcy risk in enterprises in the context of their forecasting effectiveness over one-year and two-year time horizons. In order to assess their practical applicability, the models were carefully analysed and validated. The usefulness of the models was assessed in terms of their classification properties, the capacity to accurately identify enterprises at risk of bankruptcy as well as healthy companies, and the proper calibration of the models to the data from the training sample sets.

  4. Genome-wide analyses of small noncoding RNAs in streptococci

    Directory of Open Access Journals (Sweden)

    Nadja ePatenge

    2015-05-01

    Full Text Available Streptococci represent a diverse group of Gram-positive bacteria, which colonize a wide range of animal and human hosts. Streptococcal species occur as commensal as well as pathogenic organisms. Many of the pathogenic species can cause severe, invasive infections in their hosts, leading to high morbidity and mortality. The consequence is tremendous suffering for humans and livestock, as well as a significant financial burden in the agricultural and healthcare sectors. An environmentally stimulated and tightly controlled expression of virulence factor genes is of fundamental importance for streptococcal pathogenicity. Bacterial small noncoding RNAs (sRNAs) modulate the expression of genes involved in stress response, sugar metabolism, surface composition, and other properties that are related to bacterial virulence. Even though the regulatory character is shared by this class of RNAs, variation on the molecular level results in a high diversity of functional mechanisms. Knowledge about the role of sRNAs in streptococci is still limited, but in recent years, genome-wide screens for sRNAs have been conducted in an increasing number of species. Bioinformatics prediction approaches have been employed, as well as expression analyses by classical array techniques or next-generation sequencing. This review gives an overview of whole-genome screens for sRNAs in streptococci, with a focus on describing the different methods and comparing their outcomes with respect to sRNA conservation among species, functional similarities, and relevance for streptococcal infection.

  5. Taxometric analyses of paranoid and schizoid personality disorders.

    Science.gov (United States)

    Ahmed, Anthony Olufemi; Green, Bradley Andrew; Buckley, Peter Francis; McFarland, Megan Elizabeth

    2012-03-30

    There remains debate about whether personality disorders (PDs) are better conceptualized as categorical, reflecting discontinuity from normal personality, or dimensional, existing on a continuum of severity with normal personality traits. Evidence suggests that most PDs are dimensional, but there is a lack of consensus about the structure of Cluster A disorders. Taxometric methods are well suited to investigating the taxonic status of psychiatric disorders. The current study investigated the latent structure of paranoid and schizoid PDs in an epidemiological sample (N=43,093) drawn from the National Epidemiologic Survey on Alcohol and Related Conditions (NESARC) using taxometric analyses. Three indicators of paranoid PD (mistrust, resentment, and functional disturbance) and three indicators of schizoid PD (emotional detachment, social withdrawal, and functional disturbance), derived factor-analytically, were analyzed. Overall, the taxometric analyses supported a dimensional rather than taxonic structure for paranoid and schizoid PDs, based on examination of taxometric graphs and comparative curve fit indices. Dimensional models of paranoid and schizoid PDs better predicted social functioning, role-emotional, and mental health scales in the survey than categorical models. Evidence from the current study supports recent efforts to represent paranoid and schizoid PDs, as well as other PDs, along broad personality dimensions.

  6. Quantifying Fire Cycle from Dendroecological Records Using Survival Analyses

    Directory of Open Access Journals (Sweden)

    Dominic Cyr

    2016-06-01

    Full Text Available Quantifying fire regimes in the boreal forest ecosystem is crucial for understanding its past and present dynamics, as well as for predicting its future dynamics. Survival analyses have often been used to estimate the fire cycle in eastern Canada because they can take into account the censored observations that are prevalent owing to the typically long fire return intervals and the limited scope of the dendroecological methods used to quantify them. Here, we assess how the true length of the fire cycle, short-term temporal variations in fire activity, and sampling effort affect the accuracy and precision of estimates obtained from two types of parametric survival models, the Weibull and the exponential models, and one non-parametric model obtained with the Cox regression. We then apply those results in a case study area located in eastern Canada. Our simulation experiment confirms some documented concerns regarding the detrimental effects of temporal variations in fire activity on parametric estimation of the fire cycle. Cox regression appears to provide the most accurate and robust estimator, being by far the least affected by temporal variations in fire activity. The Cox-based estimate of the fire cycle for the last 300 years in the case study area is 229 years (CI95: 162–407), compared with the likely overestimated 319 years obtained with the commonly used exponential model.
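
    Under the exponential survival model discussed above, the maximum-likelihood fire-cycle estimate with right-censored stand records reduces to total time-at-risk divided by the number of observed fires. A minimal sketch with invented data (not the study's records):

```python
def exponential_fire_cycle(intervals):
    """MLE of the fire cycle under an exponential survival model.
    `intervals` is a list of (years_since_last_fire, observed) pairs,
    where observed=False marks a censored record (no fire seen within
    the dendroecological window). Estimate = total exposure / events."""
    exposure = sum(t for t, _ in intervals)           # total time at risk
    events = sum(1 for _, obs in intervals if obs)    # observed fires
    if events == 0:
        raise ValueError("no observed fires: cycle not identifiable")
    return exposure / events

# Two observed fire intervals plus two censored stands:
data = [(120, True), (250, False), (90, True), (300, False)]
cycle = exponential_fire_cycle(data)  # (120+250+90+300)/2 = 380 years
```

Because censored stands still contribute exposure time, dropping them (as a naive mean of observed intervals would) biases the cycle downward, which is why censoring-aware estimators matter here.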

  7. Pipeline for macro- and microarray analyses

    Directory of Open Access Journals (Sweden)

    R. Vicentini

    2007-05-01

    Full Text Available The pipeline for macro- and microarray analyses (PMmA) is a set of scripts with a web interface developed to analyze DNA array data generated by array image quantification software. PMmA is designed for use with single- or double-color array data and works as a pipeline with five classes (data format, normalization, data analysis, clustering, and array maps). It can also be used as a plugin in the BioArray Software Environment, an open-source database for array analysis, or in a local version of the web service. All scripts in PMmA were developed in the PERL programming language, and statistical analysis functions were implemented in the R statistical language; consequently, the package is platform-independent software. Our algorithms can correctly select almost 90% of the differentially expressed genes, showing a performance superior to other methods of analysis. The pipeline software has been applied to public macroarray data of 1536 sugarcane expressed sequence tags from plants exposed to cold for 3 to 48 h. PMmA identified thirty cold-responsive genes previously unidentified in this public dataset. Fourteen genes were up-regulated, two had variable expression, and the other fourteen were down-regulated in the treatments. These new findings certainly were a consequence of using a superior statistical analysis approach, since the original study did not take into account the dependence of data variability on the average signal intensity of each gene. The web interface, supplementary information, and the package source code are available, free, to non-commercial users at http://ipe.cbmeg.unicamp.br/pub/PMmA.

  8. Structural prediction in aphasia

    Directory of Open Access Journals (Sweden)

    Tessa Warren

    2015-05-01

    Full Text Available There is considerable evidence that young healthy comprehenders predict the structure of upcoming material, and that their processing is facilitated when they encounter material matching those predictions (e.g., Staub & Clifton, 2006; Yoshida, Dickey & Sturt, 2013). However, less is known about structural prediction in aphasia. There is evidence that lexical prediction may be spared in aphasia (Dickey et al., 2014; Love & Webb, 1977; cf. Mack et al., 2013). However, predictive mechanisms supporting facilitated lexical access may not necessarily support structural facilitation. Given that many people with aphasia (PWA) exhibit syntactic deficits (e.g. Goodglass, 1993), PWA with such impairments may not engage in structural prediction. However, recent evidence suggests that some PWA may indeed predict upcoming structure (Hanne, Burchert, De Bleser, & Vasishth, 2015). Hanne et al. tracked the eyes of PWA (n=8) with sentence-comprehension deficits while they listened to reversible subject-verb-object (SVO) and object-verb-subject (OVS) sentences in German, in a sentence-picture matching task. Hanne et al. manipulated case and number marking to disambiguate the sentences' structure. Gazes to an OVS or SVO picture during the unfolding of a sentence were assumed to indicate prediction of the structure congruent with that picture. According to this measure, the PWA's structural prediction was impaired compared to controls, but they did successfully predict upcoming structure when morphosyntactic cues were strong and unambiguous. Hanne et al.'s visual-world evidence is suggestive, but their forced-choice sentence-picture matching task places tight constraints on possible structural predictions. Clearer evidence of structural prediction would come from paradigms where the content of upcoming material is not as constrained. The current study used a self-paced reading study to examine structural prediction among PWA in less constrained contexts. PWA (n=17) who

  9. Application of Polar Cap (PC) indices in analyses and forecasts of geophysical conditions

    Science.gov (United States)

    Stauning, Peter

    2016-07-01

    The Polar Cap (PC) indices could be considered to represent the input of power from the solar wind to the Earth's magnetosphere. The indices have been used to analyse interplanetary electric fields, effects of solar wind pressure pulses, cross polar cap voltages and polar cap diameter, ionospheric Joule heating, and other issues of polar cap dynamics. The PC indices have also been used to predict auroral electrojet intensities and global auroral power as well as ring current intensities. For specific space weather purposes the PC indices could be used to forecast substorm development and predict associated power line disturbances in the subauroral regions. The presentation shall outline the general background for applying the PC indices in analyses or forecasts of solar wind-magnetosphere-ionosphere interactions and provide illustrative examples of the use of the Polar Cap indices in specific cases

  10. Weld investigations by 3D analyses of Charpy V-notch specimens

    DEFF Research Database (Denmark)

    Tvergaard, Viggo; Needleman, Allan

    2005-01-01

    The Charpy impact test is a standard procedure for determining the ductile-brittle transition in welds. The predictions of such tests have been investigated by full three-dimensional transient analyses of Charpy V-notch specimens. The material response is characterised by an elastic-viscoplastic constitutive relation. The parameters in the weld material differ from those in the base material, and the heat affected zone (HAZ) tends to be more brittle than the other material regions. The effect of weld strength undermatch or overmatch is an important issue. Some specimens, for which the notched surface is rotated relative

  11. Landslide forecasting and factors influencing predictability

    Science.gov (United States)

    Intrieri, Emanuele; Gigli, Giovanni

    2016-11-01

    Forecasting a catastrophic collapse is a key element in landslide risk reduction, but it is also a very difficult task owing to the scientific difficulties in predicting a complex natural event and also to the severe social repercussions caused by a false or missed alarm. A prediction is always affected by a certain error; however, when this error can imply evacuations or other severe consequences a high reliability in the forecast is, at least, desirable. In order to increase the confidence of predictions, a new methodology is presented here. In contrast to traditional approaches, this methodology iteratively applies several forecasting methods based on displacement data and, thanks to an innovative data representation, gives a valuation of the reliability of the prediction. This approach has been employed to back-analyse 15 landslide collapses. By introducing a predictability index, this study also contributes to the understanding of how geology and other factors influence the possibility of forecasting a slope failure. The results showed how kinematics, and all the factors influencing it, such as geomechanics, rainfall and other external agents, are key concerning landslide predictability.
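
    One classical displacement-based forecasting method of the kind the methodology above iterates over is Fukuzono's inverse-velocity method: a straight line is fitted to 1/velocity versus time, and the predicted failure time is its zero crossing. The sketch below is a generic illustration of that technique, not the authors' implementation.

```python
def inverse_velocity_failure_time(times, velocities):
    """Fit 1/v = slope*t + intercept by ordinary least squares and
    return the time at which 1/v extrapolates to zero (forecast of
    collapse for an accelerating slope)."""
    inv_v = [1.0 / v for v in velocities]
    n = len(times)
    mean_t = sum(times) / n
    mean_y = sum(inv_v) / n
    slope = (sum((t - mean_t) * (y - mean_y) for t, y in zip(times, inv_v))
             / sum((t - mean_t) ** 2 for t in times))
    intercept = mean_y - slope * mean_t
    if slope >= 0:
        raise ValueError("velocity not accelerating: no failure forecast")
    return -intercept / slope

# Synthetic accelerating slope: v = 1/(10 - t), so 1/v = 10 - t and the
# extrapolated failure time is t = 10 (arbitrary time units).
t_obs = [0, 2, 4, 6, 8]
v_obs = [1.0 / (10 - t) for t in t_obs]
t_failure = inverse_velocity_failure_time(t_obs, v_obs)
```

Applying several such methods iteratively as new displacement data arrive, and comparing their forecasts, is one way to attach a reliability estimate to the prediction.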

  12. Aging analyses of aircraft wire insulation

    Energy Technology Data Exchange (ETDEWEB)

    GILLEN,KENNETH T.; CLOUGH,ROGER LEE; CELINA,MATHIAS C.; AUBERT,JAMES H.; MALONE,G. MICHAEL

    2000-05-08

    Over the past two decades, Sandia has developed a variety of specialized analytical techniques for evaluating the long-term aging and stability of cable insulation and other related materials. These techniques have been applied to cable reliability studies involving numerous insulation types and environmental factors. This work has allowed the monitoring of the occurrence and progression of cable material deterioration in application environments, and has provided insights into material degradation mechanisms. It has also allowed development of more reliable lifetime prediction methodologies. As a part of the FAA program for intrusive inspection of aircraft wiring, they are beginning to apply a battery of techniques to assessing the condition of cable specimens removed from retired aircraft. It is anticipated that in a future part of this program, they may employ these techniques in conjunction with accelerated aging methodologies and models that the authors have developed and employed in the past to predict cable lifetimes. The types of materials to be assessed include 5 different wire types: polyimide, PVC/Glass/Nylon, extruded XL-polyalkene/PVDF, Poly-X, and XL-ETFE. This presentation provides a brief overview of the main techniques that will be employed in assessing the state of health of aircraft wire insulation. The discussion is illustrated with data from their prior cable aging studies, highlighting the methods used and their important conclusions. A few of the techniques that they employ are widely used in aging studies on polymers, but others are unique to Sandia. All of their techniques are non-proprietary, and may be of interest to others for application to aircraft wiring analysis. At the end of this report is a list of leading references to papers published in the open literature which provide more detailed information on the analytical techniques for elastomer aging studies. The first step in the

  13. Adolescent conscientiousness predicts lower lifetime unemployment.

    Science.gov (United States)

    Egan, Mark; Daly, Michael; Delaney, Liam; Boyce, Christopher J; Wood, Alex M

    2017-04-01

    Existing research on Big Five personality and unemployment has relied on personality measures elicited after the respondents had already spent years in the labor market, an experience that could change personality. We clarify the direction of influence by using the British Cohort Study (N = 4,206) to examine whether conscientiousness and other Big Five personality traits at age 16-17 predict unemployment between ages 16 and 42. Our hypothesis that higher conscientiousness in adolescence would predict lower unemployment was supported. In analyses controlling for intelligence, gender, and parental socioeconomic status, the less conscientious (-1 SD) had a predicted probability of unemployment twice as high (3.4% vs. 1.7%) as the highly conscientious (+1 SD), an effect size comparable to that of intelligence. Mediation analysis revealed that academic motivation and educational attainment explained only 8.9% of this association. Fostering conscientiousness in early life may be an effective way to reduce unemployment throughout adulthood. (PsycINFO Database Record

  14. Bond return predictability in expansions and recessions

    DEFF Research Database (Denmark)

    Engsted, Tom; Møller, Stig Vinther; Jensen, Magnus David Sander

    We document that over the period 1953-2011 US bond returns are predictable in expansionary periods but unpredictable during recessions. This result holds in both in-sample and out-of-sample analyses and using both univariate regressions and combination forecasting techniques. A simulation study shows that our tests have power to reject unpredictability in both expansions and recessions. To judge the economic significance of the results we compute utility gains for a mean-variance investor who takes the predictability patterns into account and show that utility gains are positive in expansions but negative in recessions. The results are also consistent with tests showing that the expectations hypothesis of the term structure holds in recessions but not in expansions. However, the results for bonds are in sharp contrast to results for stocks showing that stock returns are predictable in recessions...
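
    The mean-variance utility-gain comparison described above can be illustrated as follows; the certainty-equivalent return (CER) of a timing strategy is compared with a no-predictability benchmark. The portfolio weights, returns and risk aversion below are invented for the example, not taken from the paper.

```python
def cer(returns, weights, gamma=3.0):
    """Certainty-equivalent return of realized portfolio returns
    r_p = w * r (weight w in the risky bond each period):
    CER = mean(r_p) - 0.5 * gamma * var(r_p)."""
    rp = [w * r for w, r in zip(weights, returns)]
    mean = sum(rp) / len(rp)
    var = sum((x - mean) ** 2 for x in rp) / len(rp)
    return mean - 0.5 * gamma * var

bond_returns = [0.02, -0.01, 0.03, 0.01, -0.02, 0.04]
w_forecast = [1.0, 0.0, 1.0, 1.0, 0.0, 1.0]   # timing based on forecasts
w_constant = [0.5] * 6                        # no-predictability benchmark
utility_gain = cer(bond_returns, w_forecast) - cer(bond_returns, w_constant)
```

A positive gain means the investor would pay a fee to use the forecasts; computing it separately over expansion and recession subsamples reproduces the kind of split reported in the abstract.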

  15. On the Predictability of Hub Height Winds

    DEFF Research Database (Denmark)

    Draxl, Caroline

    grids. These systems require forecasts with temporal scales of tens of minutes to a few days in advance at wind farm locations. Traditionally these forecasts predict the wind at turbine hub heights; this information is then converted by transmission system operators and energy companies into predictions … of the importance of wind energy forecasts, this thesis continues with an analysis of wind speed predictions at hub height using the Weather Research and Forecasting (WRF) model. This analysis demonstrates the need for more detailed analyses of wind speeds, and it is shown that wind energy forecasting cannot … on the PBL scheme adopted and is different under varying atmospheric stability conditions, among other modeling factors. This has important implications for wind energy applications: shallow stable boundary layers can result in excessive wind shear, which is detrimental for wind energy applications …

  16. Differential AR algorithm for packet delay prediction

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Different delay prediction algorithms have been applied in multimedia communication, among which linear prediction is attractive because of its low complexity. The AR (autoregressive) algorithm is a traditional one with low computation cost, while the NLMS (normalized least mean square) algorithm is more precise. In this paper, referring to the ARIMA (autoregressive integrated moving average) model, a differential AR algorithm (DIAR) is proposed based on analyses of both the AR and NLMS algorithms. The prediction precision of the new algorithm is about 5-10 dB higher than that of the AR algorithm without increasing the computational complexity. Compared with the NLMS algorithm, its precision improves slightly, by 0.1 dB on average, while the algorithm complexity is reduced by more than 90%. Our simulations and tests also demonstrate that this method significantly improves the average end-to-end delay and packet loss ratio.
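
    A minimal sketch of the differential-AR idea: difference the delay series (the "I" in ARIMA), fit an AR model to the differences by ordinary least squares, predict the next difference, and integrate it back. The model order and data are illustrative assumptions, not the paper's DIAR specification.

```python
def _solve(A, b):
    """Gauss-Jordan elimination with partial pivoting for the small
    normal-equation system."""
    n = len(b)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(n):
        piv = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[piv] = M[piv], M[i]
        for r in range(n):
            if r != i:
                f = M[r][i] / M[i][i]
                M[r] = [a - f * c for a, c in zip(M[r], M[i])]
    return [M[i][n] / M[i][i] for i in range(n)]

def diar_predict(delays, order=2):
    """One-step delay prediction: AR(order) fit on first differences,
    then the predicted difference is added back to the last delay."""
    d = [delays[i + 1] - delays[i] for i in range(len(delays) - 1)]
    rows = [d[t - order:t][::-1] for t in range(order, len(d))]  # lags, newest first
    y = d[order:]
    p = order
    # Normal equations (X^T X) a = X^T y for the AR coefficients.
    A = [[sum(r[i] * r[j] for r in rows) for j in range(p)] for i in range(p)]
    b = [sum(r[i] * yt for r, yt in zip(rows, y)) for i in range(p)]
    coef = _solve(A, b)
    next_diff = sum(c * v for c, v in zip(coef, d[-1:-p - 1:-1]))
    return delays[-1] + next_diff
```

Working on differences removes the trend component of the delay series, which is what lets a low-order AR fit track non-stationary network delays.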

  17. The Strepsiptera-Odyssey: the history of the systematic placement of an enigmatic parasitic insect order

    Directory of Open Access Journals (Sweden)

    H. Pohl

    2013-09-01

    Full Text Available The history of the phylogenetic placement of the parasitic insect order Strepsiptera is outlined. The first species was described in 1793 by P. Rossi and assigned to the hymenopteran family Ichneumonidae. A position close to the cucujiform beetle family Rhipiphoridae was suggested by several earlier authors. Others proposed a close relationship with Diptera, or even a group Pupariata including Diptera, Strepsiptera and Coccoidea. A subordinate placement within the polyphagan series Cucujiformia, close to the wood-associated Lymexylidae, was favored by the coleopterist R.A. Crowson. W. Hennig considered a sistergroup relationship with Coleoptera the most likely hypothesis but emphasized the uncertainty. Cladistic analyses of morphological data sets yielded very different placements, alternatively as sistergroup of Coleoptera, Antliophora, or all other holometabolan orders. Results based on ribosomal genes suggested a sistergroup relationship with Diptera (Halteria concept). A clade Coleopterida (Strepsiptera and Coleoptera) was supported in two studies based on different combinations of protein-coding nuclear genes. Analyses of data sets comprising seven or nine genes (7 single-copy nuclear genes, respectively) yielded either a subordinate placement within Coleoptera or a sistergroup relationship with Neuropterida. Several early hypotheses based on a typological approach (affinities with Diptera, Coleoptera, a coleopteran subgroup, or Neuropterida) were revived using either a Hennigian approach or formal analyses of morphological characters or different molecular data sets. A phylogenomic approach finally supported a sistergroup relationship with monophyletic Coleoptera.

  18. Quantitative DNA Analyses for Airborne Birch Pollen.

    Directory of Open Access Journals (Sweden)

    Isabell Müller-Germann

    Full Text Available Birch trees produce large amounts of highly allergenic pollen grains that are distributed by wind and impact human health by causing seasonal hay fever, pollen-related asthma, and other allergic diseases. Traditionally, pollen forecasts are based on conventional microscopic counting techniques that are labor-intensive and limited in the reliable identification of species. Molecular biological techniques provide an alternative approach that is less labor-intensive and enables identification of any species by its genetic fingerprint. A particularly promising method is quantitative real-time polymerase chain reaction (qPCR), which can be used to determine the number of DNA copies and thus pollen grains in air filter samples. During the birch pollination season in 2010 in Mainz, Germany, we collected air filter samples of fine (<3 μm) and coarse air particulate matter. These were analyzed by qPCR using two different primer pairs: one for a single-copy gene (BP8) and the other for a multi-copy gene (ITS). The BP8 gene was better suited to reliable qPCR results, and the qPCR results obtained for coarse particulate matter were well correlated with the birch pollen forecasting results of the regional air quality model COSMO-ART. As expected due to the size of birch pollen grains (~23 μm), the concentration of DNA in fine particulate matter was lower than in the coarse particle fraction. For the ITS region the factor was 64, while for the single-copy gene BP8 it was only 51. The possible presence of so-called sub-pollen particles in the fine particle fraction is, however, interesting even in low concentrations. These particles are known to be highly allergenic, reach deep into the airways, and often cause severe health problems. In conclusion, the results of this exploratory study open up the possibility of predicting and quantifying the pollen concentration in the atmosphere more precisely in the future.

  19. Quantitative DNA Analyses for Airborne Birch Pollen.

    Science.gov (United States)

    Müller-Germann, Isabell; Vogel, Bernhard; Vogel, Heike; Pauling, Andreas; Fröhlich-Nowoisky, Janine; Pöschl, Ulrich; Després, Viviane R

    2015-01-01

    Birch trees produce large amounts of highly allergenic pollen grains that are distributed by wind and impact human health by causing seasonal hay fever, pollen-related asthma, and other allergic diseases. Traditionally, pollen forecasts are based on conventional microscopic counting techniques that are labor-intensive and limited in the reliable identification of species. Molecular biological techniques provide an alternative approach that is less labor-intensive and enables identification of any species by its genetic fingerprint. A particularly promising method is quantitative real-time polymerase chain reaction (qPCR), which can be used to determine the number of DNA copies and thus pollen grains in air filter samples. During the birch pollination season in 2010 in Mainz, Germany, we collected air filter samples of fine (<3 μm) and coarse air particulate matter. These were analyzed by qPCR using two different primer pairs: one for a single-copy gene (BP8) and the other for a multi-copy gene (ITS). The BP8 gene was better suited to reliable qPCR results, and the qPCR results obtained for coarse particulate matter were well correlated with the birch pollen forecasting results of the regional air quality model COSMO-ART. As expected due to the size of birch pollen grains (~23 μm), the concentration of DNA in fine particulate matter was lower than in the coarse particle fraction. For the ITS region the factor was 64, while for the single-copy gene BP8 it was only 51. The possible presence of so-called sub-pollen particles in the fine particle fraction is, however, interesting even in low concentrations. These particles are known to be highly allergenic, reach deep into the airways, and often cause severe health problems. In conclusion, the results of this exploratory study open up the possibility of predicting and quantifying the pollen concentration in the atmosphere more precisely in the future.
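
    The qPCR quantification step, converting cycle-threshold (Ct) values into DNA copy numbers via a standard curve, can be sketched as follows; the slope and intercept are hypothetical calibration values, not those of the BP8 or ITS assays.

```python
# Standard-curve quantification: Ct = SLOPE * log10(N0) + INTERCEPT.
# SLOPE near -3.32 corresponds to ~100% PCR efficiency (a tenfold DNA
# increase shifts Ct by log10(10)/log10(2) ~ 3.32 cycles). The values
# below are hypothetical calibration numbers for illustration.
SLOPE = -3.32
INTERCEPT = 38.0   # hypothetical Ct of a single starting copy

def copies_from_ct(ct: float) -> float:
    """Invert the standard curve to get the starting copy number N0."""
    return 10 ** ((ct - INTERCEPT) / SLOPE)

# A sample crossing threshold 10 cycles earlier than the single-copy
# intercept carries roughly 10**(10/3.32), i.e. on the order of 1e3 copies.
copies = copies_from_ct(28.0)
```

Dividing the copy number by the gene's copies per pollen grain (one for a single-copy gene such as BP8) and by the sampled air volume then yields an airborne pollen concentration.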

  20. A conceptual DFT approach towards analysing toxicity

    Indian Academy of Sciences (India)

    U Sarkar; D R Roy; P K Chattaraj; R Parthasarathi; J Padmanabhan; V Subramanian

    2005-09-01

    The applicability of DFT-based descriptors for the development of toxicological structure-activity relationships is assessed. Emphasis in the present study is on the quality of DFT-based descriptors for the development of toxicological QSARs and, more specifically, on the potential of the electrophilicity concept in predicting the toxicity of benzidine derivatives and a series of polyaromatic hydrocarbons (PAH) expressed in terms of their biological activity data (pIC50). First, two benzidine derivatives, which act as electron-donating agents in their interactions with biomolecules, are considered. Overall toxicity in general, and the most probable site of reactivity in particular, are effectively described by the global and local electrophilicity parameters, respectively. Interaction of the two benzidine derivatives with nucleic acid (NA) bases/selected base pairs is determined using Parr's charge transfer formula. The experimental biological activity data (pIC50) for the family of PAH, namely polychlorinated dibenzofurans (PCDF), polyhalogenated dibenzo-p-dioxins (PHDD) and polychlorinated biphenyls (PCB), are taken as dependent variables, and the HF energy (E), along with DFT-based global and local descriptors, viz., the electrophilicity index (ω) and the local electrophilic power (ω+) respectively, are taken as independent variables. Fairly good correlation is obtained, showing the significance of the selected descriptors in QSARs on toxins that act as electron acceptors in the presence of biomolecules. The effects of population analysis schemes on the calculation of Fukui functions, as well as that of solvation, are probed. Similarly, some electron-donating aliphatic amines are studied in the present work. We see that the global and local electrophilicities, along with the HF energy, are adequate in explaining the toxicity of several substances
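
    The global descriptors used in such conceptual-DFT QSAR studies follow Parr-type definitions in terms of the vertical ionization energy I and electron affinity A; a minimal sketch with illustrative input values (not from the paper):

```python
def global_descriptors(I: float, A: float):
    """Parr-type conceptual-DFT global descriptors from I and A (eV):
    chemical potential mu = -(I+A)/2, hardness eta = (I-A)/2, and the
    electrophilicity index omega = mu**2 / (2*eta)."""
    mu = -(I + A) / 2.0            # chemical potential (= -electronegativity)
    eta = (I - A) / 2.0            # chemical hardness
    omega = mu ** 2 / (2.0 * eta)  # electrophilicity index
    return mu, eta, omega

# Illustrative values of I and A in eV (hypothetical molecule):
mu, eta, omega = global_descriptors(I=9.0, A=1.0)
```

In practice I and A are often approximated from frontier orbital energies (Koopmans' theorem), and omega then serves as the regression descriptor against activities such as pIC50.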

  1. Seismic criteria studies and analyses. Quarterly progress report No. 3. [LMFBR

    Energy Technology Data Exchange (ETDEWEB)

    1975-01-03

    Information is presented concerning the extent to which vibratory motions at the subsurface foundation level might differ from motions at the ground surface and the effects of the various subsurface materials on the overall Clinch River Breeder Reactor site response; seismic analyses of LMFBR type reactors to establish analytical procedures for predicting structure stresses and deformations; and aspects of the current technology regarding the representation of energy losses in nuclear power plants as equivalent viscous damping.

  2. Kuosheng Mark III containment analyses using GOTHIC

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Ansheng, E-mail: samuellin1999@iner.gov.tw; Chen, Yen-Shu; Yuann, Yng-Ruey

    2013-10-15

    Highlights: • The Kuosheng Mark III containment model is established using GOTHIC. • Containment pressure and temperature responses due to LOCA are presented. • The calculated results are all below the design values and compared with the FSAR results. • The calculated results can serve as an analysis reference for an SPU project in the future. -- Abstract: Kuosheng nuclear power plant in Taiwan is a twin-unit BWR/6 plant, and both units utilize the Mark III containment. Currently, the plant is performing a stretch power uprate (SPU) project to increase the core thermal power to 103.7% OLTP (original licensed thermal power). However, the containment response in the Kuosheng Final Safety Analysis Report (FSAR) was completed more than twenty-five years ago. The purpose of this study is to establish a Kuosheng Mark III containment model using the containment program GOTHIC. The containment pressure and temperature responses under the design-basis accidents, which are the main steam line break (MSLB) and the recirculation line break (RCLB) accidents, are investigated. Short-term and long-term analyses are presented in this study. The short-term analysis calculates the drywell peak pressure and temperature, which occur in the early stage of the LOCAs. The long-term analysis calculates the peak pressure and temperature of the reactor building space. In the short-term analysis, the calculated peak drywell-to-wetwell differential pressure is 140.6 kPa for the MSLB, which is below the design value of 189.6 kPa. The calculated peak drywell temperature is 158 °C, which is still below the design value of 165.6 °C. In addition, in the long-term analysis, the calculated peak containment pressure is 47 kPa G, which is below the design value of 103.4 kPa G. The calculated peak containment temperature is 74.7 °C, which is lower than the design value of 93.3 °C. Therefore, the Kuosheng Mark III containment can maintain the integrity after

  3. Evolution prediction from tomography

    Science.gov (United States)

    Dominy, Jason M.; Venuti, Lorenzo Campos; Shabani, Alireza; Lidar, Daniel A.

    2017-03-01

    Quantum process tomography provides a means of measuring the evolution operator for a system at a fixed measurement time t. The problem of using that tomographic snapshot to predict the evolution operator at other times is generally ill-posed, since there are, in general, infinitely many distinct yet compatible solutions. We describe the prediction, in some "maximal ignorance" sense, of the evolution of a quantum system based on knowledge only of the evolution operator for finitely many times 0 < t1 < … < tn, and the resulting prediction of the evolution at times away from the measurement times. Even if the original evolution is unitary, the predicted evolution is described by a non-unitary, completely positive map.
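One way to see why the problem is ill-posed is to reconstruct a generator from a single unitary snapshot: the matrix logarithm has a branch ambiguity, so any eigenphase may be shifted by a multiple of 2π and still reproduce the same snapshot. The toy sketch below (my illustration, not the paper's maximal-ignorance construction) infers a Hamiltonian from one snapshot via the principal logarithm and propagates it; it recovers the true evolution here only because the eigenphases happen to lie in the principal branch.

```python
import numpy as np

def log_unitary(U):
    """Principal logarithm of a unitary matrix via its eigendecomposition."""
    w, V = np.linalg.eig(U)
    return V @ np.diag(np.log(w)) @ np.linalg.inv(V)

# True (hidden) qubit Hamiltonian and one tomographic snapshot U(t1)
H_true = np.array([[1.0, 0.3], [0.3, -1.0]])
t1 = 0.4
w, V = np.linalg.eigh(H_true)
U_t1 = V @ np.diag(np.exp(-1j * w * t1)) @ V.conj().T

# Infer a time-independent generator from the snapshot, then predict U at t2
H_est = 1j * log_unitary(U_t1) / t1
w2, V2 = np.linalg.eig(H_est)
t2 = 1.0
U_pred = V2 @ np.diag(np.exp(-1j * w2 * t2)) @ np.linalg.inv(V2)
U_true = V @ np.diag(np.exp(-1j * w * t2)) @ V.conj().T

err = np.abs(U_pred - U_true).max()
print(f"max |U_pred - U_true| at t2 = {err:.1e}")
```

Shifting any eigenphase of `U_t1` by 2π before taking the log would yield a different, equally compatible `H_est`, which is exactly the degeneracy the abstract refers to.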

  4. Wind power prediction models

    Science.gov (United States)

    Levy, R.; Mcginness, H.

    1976-01-01

    Investigations were performed to predict the power available from the wind at the Goldstone, California, antenna site complex. The background for power prediction was derived from a statistical evaluation of available wind speed data records at this location and at nearby locations similarly situated within the Mojave desert. In addition to a model for power prediction over relatively long periods of time, an interim simulation model that produces sample wind speeds is described. The interim model furnishes uncorrelated sample speeds at hourly intervals that reproduce the statistical wind distribution at Goldstone. A stochastic simulation model to provide speed samples representative of both the statistical speed distributions and correlations is also discussed.
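An interim simulator of the kind described, producing uncorrelated hourly speed samples drawn from a fixed distribution, can be sketched in a few lines. A Weibull law is a common model for wind-speed statistics; the scale and shape values and the turbine constants below are assumptions for illustration, not the Goldstone fit.

```python
import random
import statistics

random.seed(42)
SCALE, SHAPE = 6.0, 2.0            # assumed Weibull parameters (m/s, dimensionless)
RHO, AREA, CP = 1.225, 100.0, 0.4  # air density (kg/m^3), rotor area (m^2), power coeff.

def hourly_speeds(n_hours):
    """Uncorrelated hourly wind-speed samples from the assumed distribution."""
    return [random.weibullvariate(SCALE, SHAPE) for _ in range(n_hours)]

def power_kw(v):
    # P = 0.5 * rho * A * Cp * v^3, converted to kW
    return 0.5 * RHO * AREA * CP * v ** 3 / 1000.0

speeds = hourly_speeds(24 * 365)
mean_speed = statistics.fmean(speeds)
mean_power = statistics.fmean(power_kw(v) for v in speeds)
print(f"mean speed {mean_speed:.2f} m/s, mean power {mean_power:.2f} kW")
```

Because power goes as the cube of speed, the mean power is dominated by the distribution's upper tail, which is why a statistical model rather than the mean speed alone is needed for power prediction.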

  5. Wind Power Prediction Investigation

    Directory of Open Access Journals (Sweden)

    Yuanlong Liu

    2013-02-01

    Full Text Available Daily and real-time wind power forecasts are produced in this study using three methods: a Kalman filter model, a GARCH model and a time-series-based BP neural network model. Based on an evaluation of forecast accuracy and qualification rate, the time-series-based BP neural network model was selected as the best method because of its highest accuracy. The influence of wind turbine convergence on prediction error is also considered in the evaluation. Finally, suggestions for improving prediction accuracy are put forward based on a discussion of the factors limiting accuracy.
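A minimal version of the first of the three methods, a scalar Kalman filter treating the power signal as a random walk observed with noise, might look like this. The process and measurement variances are illustrative assumptions, not the study's tuning.

```python
import random

def kalman_1d(zs, q=0.5, r=4.0):
    """Filter noisy readings zs with a random-walk state model.
    q: process variance, r: measurement variance (both assumed)."""
    x, p = zs[0], 1.0          # state estimate and its variance
    out = [x]
    for z in zs[1:]:
        p += q                  # predict: random-walk state, variance grows
        k = p / (p + r)         # Kalman gain
        x += k * (z - x)        # update with the new measurement
        p *= (1 - k)
        out.append(x)
    return out

random.seed(0)
truth = [50 + 0.1 * t for t in range(200)]             # slowly rising power (kW)
obs = [v + random.gauss(0, 2.0) for v in truth]        # noisy readings
est = kalman_1d(obs)

rmse_raw = (sum((o - t) ** 2 for o, t in zip(obs, truth)) / len(truth)) ** 0.5
rmse_kf = (sum((e - t) ** 2 for e, t in zip(est, truth)) / len(truth)) ** 0.5
print(f"RMSE raw {rmse_raw:.2f}  filtered {rmse_kf:.2f}")
```

Comparing RMSE before and after filtering is the same kind of accuracy evaluation the study uses to rank its three methods.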

  6. Analyse énonciative du point du vue, narration et analyse de discours

    Directory of Open Access Journals (Sweden)

    Alain Rabatel

    2007-01-01

    Full Text Available This article shows that the enunciative analysis of point of view (POV), breaking with Genette's typology of focalizations, can partially renew narratology, provided that the immanentist approach to the narrative text is replaced by an interactional analysis of narration. The article first presents the enunciative approach to POV, drawing on Ducrot's theories, and on this basis proposes several modalities of POV (represented, recounted, asserted) that give substance to the point of view of the characters or of the narrator, thereby appreciably modifying Genette's analyses. In its second part, the article considers the role of POV in narration, notably in the re-evaluation of the cognitive and pragmatic dimensions of mimesis, then in inferential-interpretative mechanisms close to Jouve's system of sympathy, and finally in the revalorization of the role of the narrator, since the latter constructs himself at the same time as he constructs his characters.

  7. First Super-Earth Atmosphere Analysed

    Science.gov (United States)

    2010-12-01

    The atmosphere around a super-Earth exoplanet has been analysed for the first time by an international team of astronomers using ESO's Very Large Telescope. The planet, which is known as GJ 1214b, was studied as it passed in front of its parent star and some of the starlight passed through the planet's atmosphere. We now know that the atmosphere is either mostly water in the form of steam or is dominated by thick clouds or hazes. The results will appear in the 2 December 2010 issue of the journal Nature. The planet GJ 1214b was confirmed in 2009 using the HARPS instrument on ESO's 3.6-metre telescope in Chile (eso0950) [1]. Initial findings suggested that this planet had an atmosphere, which has now been confirmed and studied in detail by an international team of astronomers, led by Jacob Bean (Harvard-Smithsonian Center for Astrophysics), using the FORS instrument on ESO's Very Large Telescope. "This is the first super-Earth to have its atmosphere analysed. We've reached a real milestone on the road toward characterising these worlds," said Bean. GJ 1214b has a radius of about 2.6 times that of the Earth and is about 6.5 times as massive, putting it squarely into the class of exoplanets known as super-Earths. Its host star lies about 40 light-years from Earth in the constellation of Ophiuchus (the Serpent Bearer). It is a faint star [2], but it is also small, which means that the size of the planet is large compared to the stellar disc, making it relatively easy to study [3]. The planet travels across the disc of its parent star once every 38 hours as it orbits at a distance of only two million kilometres: about seventy times closer than the Earth orbits the Sun. To study the atmosphere, the team observed the light coming from the star as the planet passed in front of it [4]. During these transits, some of the starlight passes through the planet's atmosphere and, depending on the chemical composition and weather on the planet, specific wavelengths of light are

  8. Sensitivity analyses of spatial population viability analysis models for species at risk and habitat conservation planning.

    Science.gov (United States)

    Naujokaitis-Lewis, Ilona R; Curtis, Janelle M R; Arcese, Peter; Rosenfeld, Jordan

    2009-02-01

    Population viability analysis (PVA) is an effective framework for modeling species- and habitat-recovery efforts, but uncertainty in parameter estimates and model structure can lead to unreliable predictions. Integrating complex and often uncertain information into spatial PVA models requires that comprehensive sensitivity analyses be applied to explore the influence of spatial and nonspatial parameters on model predictions. We reviewed 87 analyses of spatial demographic PVA models of plants and animals to identify common approaches to sensitivity analysis in recent publications. In contrast to best practices recommended in the broader modeling community, sensitivity analyses of spatial PVAs were typically ad hoc, inconsistent, and difficult to compare. Most studies applied local approaches to sensitivity analyses, but few varied multiple parameters simultaneously. A lack of standards for sensitivity analysis and reporting in spatial PVAs has the potential to compromise the ability to learn collectively from PVA results, accurately interpret results in cases where model relationships include nonlinearities and interactions, prioritize monitoring and management actions, and ensure conservation-planning decisions are robust to uncertainties in spatial and nonspatial parameters. Our review underscores the need to develop tools for global sensitivity analysis and apply these to spatial PVA.
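The contrast the review draws, between local one-at-a-time perturbations and global analyses that vary multiple parameters simultaneously, can be illustrated on a toy growth model. The model and parameter ranges below are invented; real spatial PVAs would use structured demographic models and formal global methods such as Sobol indices or Morris screening.

```python
import random

# Toy population model: N_{t+1} = N_t * (survival + fecundity * juvenile_survival).
# Parameters and ranges are illustrative, not from any reviewed PVA.
def final_pop(surv, fec, juv, n0=100.0, years=20):
    n = n0
    for _ in range(years):
        n *= surv + fec * juv
    return n

base = {"surv": 0.7, "fec": 1.5, "juv": 0.2}

# Local one-at-a-time (OAT) sweep: perturb each parameter by +10% alone.
oat = {}
for p in base:
    hi = dict(base); hi[p] *= 1.1
    oat[p] = final_pop(**hi) / final_pop(**base)

# Global analysis: sample all parameters at once, correlate each with the output.
random.seed(1)
samples = [{p: v * random.uniform(0.8, 1.2) for p, v in base.items()}
           for _ in range(2000)]
outs = [final_pop(**s) for s in samples]

def corr(xs, ys):
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

global_rank = {p: corr([s[p] for s in samples], outs) for p in base}
print("OAT ratios:", oat)
print("global correlations:", global_rank)
```

The global ranking remains meaningful when parameters interact, whereas OAT results depend entirely on the chosen baseline, which is the review's argument for global sensitivity analysis.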

  9. FY01 Supplemental Science and Performance Analyses, Volume 1: Scientific Bases and Analyses, Rev 00

    Energy Technology Data Exchange (ETDEWEB)

    David Dobson

    2001-06-30

    The U.S. Department of Energy (DOE) is considering the possible recommendation of a site at Yucca Mountain, Nevada, for development as a geologic repository for the disposal of high-level radioactive waste and spent nuclear fuel. To facilitate public review and comment, in May 2001 the DOE released the Yucca Mountain Science and Engineering Report (S&ER) (DOE 2001 [DIRS 153849]), which presents technical information supporting the consideration of the possible site recommendation. The report summarizes the results of more than 20 years of scientific and engineering studies. A decision to recommend the site has not been made: the DOE has provided the S&ER and its supporting documents as an aid to the public in formulating comments on the possible recommendation. When the S&ER (DOE 2001 [DIRS 153849]) was released, the DOE acknowledged that technical and scientific analyses of the site were ongoing. Therefore, the DOE noted in the Federal Register Notice accompanying the report (66 FR 23 013 [DIRS 155009], p. 2) that additional technical information would be released before the dates, locations, and times for public hearings on the possible recommendation were announced. This information includes: (1) the results of additional technical studies of a potential repository at Yucca Mountain, contained in this FY01 Supplemental Science and Performance Analyses: Vol. 1, Scientific Bases and Analyses; and FY01 Supplemental Science and Performance Analyses: Vol. 2, Performance Analyses (McNeish 2001 [DIRS 155023]) (collectively referred to as the SSPA) and (2) a preliminary evaluation of the Yucca Mountain site's preclosure and postclosure performance against the DOE's proposed site suitability guidelines (10 CFR Part 963 [64 FR 67054] [DIRS 124754]). By making the large amount of information developed on Yucca Mountain available in stages, the DOE intends to provide the public and interested parties with time to review the available materials and to formulate and

  10. Chapter VII. Predicting Fertility

    Science.gov (United States)

    Section 2. Visual and Microscopic Approaches for Differentiating Unfertilized Germinal Discs and Early Dead Embryos from Pre-Incubated Blastoderms Section 3. Predicting the Duration of Fertility by Counting Sperm in the Outer Perivitelline Layer of Laid Eggs...

  11. Outcome predictability biases learning.

    Science.gov (United States)

    Griffiths, Oren; Mitchell, Chris J; Bethmont, Anna; Lovibond, Peter F

    2015-01-01

    Much of contemporary associative learning research is focused on understanding how and when the associative history of cues affects later learning about those cues. Very little work has investigated the effects of the associative history of outcomes on human learning. Three experiments extended the "learned irrelevance" paradigm from the animal conditioning literature to examine the influence of an outcome's prior predictability on subsequent learning of relationships between cues and that outcome. All 3 experiments found evidence for the idea that learning is biased by the prior predictability of the outcome. Previously predictable outcomes were readily associated with novel predictive cues, whereas previously unpredictable outcomes were more readily associated with novel nonpredictive cues. This finding highlights the importance of considering the associative history of outcomes, as well as cues, when interpreting multistage designs. Associative and cognitive explanations of this certainty matching effect are discussed.

  12. Predicting toxicity of nanoparticles

    OpenAIRE

    BURELLO ENRICO; Worth, Andrew

    2011-01-01

    A statistical model based on a quantitative structure–activity relationship accurately predicts the cytotoxicity of various metal oxide nanoparticles, thus offering a way to rapidly screen nanomaterials and prioritize testing.

  13. Highlights, predictions, and changes.

    Science.gov (United States)

    Jeang, Kuan-Teh

    2012-11-15

    Recent literature highlights at Retrovirology are described. Predictions are made regarding "hot" retrovirology research trends for the coming year, based on recent journal access statistics. Changes to the Retrovirology editorship and to the frequency of the Retrovirology Prize are announced.

  14. Predictable grammatical constructions

    DEFF Research Database (Denmark)

    Lucas, Sandra

    2015-01-01

    My aim in this paper is to provide evidence from diachronic linguistics for the view that some predictable units are entrenched in grammar, and consequently in human cognition, in a way that makes them functionally and structurally equal to nonpredictable grammatical units, suggesting that these predictable units should be considered grammatical constructions on a par with the nonpredictable constructions. Frequency has usually been seen as the only possible argument speaking in favor of viewing some formally and semantically fully predictable units as grammatical constructions. However, this paper … semantically and formally predictable. Despite this difference, [méllo INF], like the other future periphrases, seems to be highly entrenched in the cognition (and grammar) of Early Medieval Greek language users, and consequently a grammatical construction. The syntactic evidence speaking in favor of [méllo

  15. Predicted value of $0 \, \nu\beta\beta$

    CERN Document Server

    Maedan, Shinji

    2016-01-01

    Assuming that the lightest neutrino mass $ m_0 $ is measured, we study the influence of the error of the measured $ m_0 $ on the uncertainty of the predicted value of the neutrinoless double beta decay ($0 \, \nu\beta\beta$).

  16. Predicting Online Purchasing Behavior

    OpenAIRE

    W.R BUCKINX; D. VAN DEN POEL

    2003-01-01

    This empirical study investigates the contribution of different types of predictors to the purchasing behaviour at an online store. We use logit modelling to predict whether or not a purchase is made during the next visit to the website using both forward and backward variable-selection techniques, as well as Furnival and Wilson’s global score search algorithm to find the best subset of predictors. We contribute to the literature by using variables from four different categories in predicting...

  17. Nonparametric Predictive Regression

    OpenAIRE

    Ioannis Kasparis; Elena Andreou; Phillips, Peter C.B.

    2012-01-01

    A unifying framework for inference is developed in predictive regressions where the predictor has unknown integration properties and may be stationary or nonstationary. Two easily implemented nonparametric F-tests are proposed. The test statistics are related to those of Kasparis and Phillips (2012) and are obtained by kernel regression. The limit distribution of these predictive tests holds for a wide range of predictors including stationary as well as non-stationary fractional and near unit...

  18. Aircraft Noise Prediction

    OpenAIRE

    2014-01-01

    This contribution addresses the state-of-the-art in the field of aircraft noise prediction, simulation and minimisation. The point of view taken in this context is that of comprehensive models that couple the various aircraft systems with the acoustic sources, the propagation and the flight trajectories. After an exhaustive review of the present predictive technologies in the relevant fields (airframe, propulsion, propagation, aircraft operations, trajectory optimisation), the paper add...

  19. Stuck pipe prediction

    KAUST Repository

    Alzahrani, Majed

    2016-03-10

    Disclosed are various embodiments for a prediction application to predict a stuck pipe. A linear regression model is generated from hook load readings at corresponding bit depths. A current hook load reading at a current bit depth is compared with a normal hook load reading from the linear regression model. A current hook load greater than a normal hook load for a given bit depth indicates the likelihood of a stuck pipe.
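The disclosed check can be sketched directly: fit hook load against bit depth by least squares, then compare a current reading with the model's normal load at that depth. The readings and the anomaly margin below are invented for illustration.

```python
# Hypothetical historical readings: bit depth (ft) vs. hook load (klbf),
# following a roughly linear normal trend.
depths = [1000, 1500, 2000, 2500, 3000, 3500, 4000]
loads  = [120, 131, 140, 152, 161, 170, 181]

# Least-squares fit of load = intercept + slope * depth
n = len(depths)
mx, my = sum(depths) / n, sum(loads) / n
slope = sum((x - mx) * (y - my) for x, y in zip(depths, loads)) / \
        sum((x - mx) ** 2 for x in depths)
intercept = my - slope * mx

def stuck_pipe_risk(depth, hook_load, margin=5.0):
    """True if the hook load exceeds the regression's normal load by > margin."""
    normal = intercept + slope * depth
    return hook_load > normal + margin

print(stuck_pipe_risk(4200, 212))   # well above the trend: likely stuck
print(stuck_pipe_risk(4200, 186))   # on the trend: normal drag
```

In practice the regression would be refit over a sliding window of recent readings so the "normal" trend tracks changing hole conditions.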

  20. Runtime and Pressurization Analyses of Propellant Tanks

    Science.gov (United States)

    Field, Robert E.; Ryan, Harry M.; Ahuja, Vineet; Hosangadi, Ashvin; Lee, Chung P.

    2007-01-01

    Multi-element unstructured CFD has been utilized at NASA SSC to carry out analyses of propellant tank systems in different modes of operation. The three regimes of interest at SSC include (a) tank chill down (b) tank pressurization and (c) runtime propellant draw-down and purge. While tank chill down is an important event that is best addressed with long time-scale heat transfer calculations, CFD can play a critical role in the tank pressurization and runtime modes of operation. In these situations, problems with contamination of the propellant by inclusion of the pressurant gas from the ullage cause a deterioration of the quality of the propellant delivered to the test article. CFD can be used to help quantify the mixing and propellant degradation. During tank pressurization under some circumstances, rapid mixing of relatively warm pressurant gas with cryogenic propellant can lead to rapid densification of the gas and loss of pressure in the tank. This phenomenon can cause serious problems during testing because of the resulting decrease in propellant flow rate. With proper physical models implemented, CFD can model the coupling between the propellant and pressurant including heat transfer and phase change effects and accurately capture the complex physics in the evolving flowfields. This holds the promise of allowing the specification of operational conditions and procedures that could minimize the undesirable mixing and heat transfer inherent in propellant tank operation. It should be noted that traditional CFD modeling is inadequate for such simulations because the fluids in the tank are in a range of different sub-critical and supercritical states, and elaborate phase change and mixing rules have to be developed to accurately model the interaction between the ullage gas and the propellant. We show a typical run-time simulation of a spherical propellant tank, containing RP-1 in this case, being pressurized with room-temperature nitrogen at 540 R. Nitrogen

  1. Operational Dust Prediction

    Science.gov (United States)

    Benedetti, Angela; Baldasano, Jose M.; Basart, Sara; Benincasa, Francesco; Boucher, Olivier; Brooks, Malcolm E.; Chen, Jen-Ping; Colarco, Peter R.; Gong, Sunlin; Huneeus, Nicolas; Jones, Luke; Lu, Sarah; Menut, Laurent; Morcrette, Jean-Jacques; Mulcahy, Jane; Nickovic, Slobodan; Garcia-Pando, Carlos P.; Reid, Jeffrey S.; Sekiyama, Thomas T.; Tanaka, Taichu Y.; Terradellas, Enric; Westphal, Douglas L.; Zhang, Xiao-Ye; Zhou, Chun-Hong

    2014-01-01

    Over the last few years, numerical prediction of dust aerosol concentration has become prominent at several research and operational weather centres due to growing interest from diverse stakeholders, such as solar energy plant managers, health professionals, aviation and military authorities and policymakers. Dust prediction in numerical weather prediction-type models faces a number of challenges owing to the complexity of the system. At the centre of the problem is the vast range of scales required to fully account for all of the physical processes related to dust. Another limiting factor is the paucity of suitable dust observations available for model evaluation and assimilation. This chapter discusses in detail numerical prediction of dust with examples from systems that are currently providing dust forecasts in near real-time or are part of international efforts to establish daily provision of dust forecasts based on multi-model ensembles. The various models are introduced and described along with an overview on the importance of dust prediction activities and a historical perspective. Assimilation and evaluation aspects in dust prediction are also discussed.

  2. L'analyse qualitative comme approche multiple

    Directory of Open Access Journals (Sweden)

    Roberto Cipriani

    2009-11-01

    Full Text Available The example of historical inquiry, which seeks to identify the characteristics of the birth and development of a science and of the readings it gives of social events, is among the most original. Any historical methodology not only yields a sheer mass of episodes and events; it is also a narration and a critical elaboration of those same facts. Michael Postan rightly writes that the complexity of historical data is nevertheless such, and the differences and similarities so difficult to pin down, that the efforts of historians and sociologists to construct explicit comparisons have, for the most part, resulted in crude and naive attempts. The lesson of the Annales school has indeed helped build the idea of a history capable of reading and explaining both what is uniform and what is singular. Nothing is more natural than the union of "psychical beings", like the assembly of cells into an organism, into a new and different "psychical being". A turn toward broader and more rigorous empirical experimentation is therefore needed, in order to have adequate instruments capable of guaranteeing sufficient reliability for micro-level, qualitative and biographical methodology.

  3. MEDUSA (Martian Environmental DUst Systematic Analyser)

    Science.gov (United States)

    Battaglia, R.; Colangeli, L.; della Corte, V.; Esposito, F.; Ferrini, G.; Mazzotta Epifani, E.; Palomba, E.; Palumbo, P.; Panizza, A.; Rotundi, A.

    2003-04-01

    ) onboard the Mars Global Surveyor. Seasonal variations in the column abundance are due to the combined effect of the exchange of H_2O between the atmosphere and water reservoirs (i.e. polar caps, regolith) and atmospheric transport. Despite the low absolute water content (0.03% by volume), relative humidity can exceed 100%, leading to frosting phenomena, thanks to low Martian temperatures. The typical value of the pressure at the surface, close to the triple-point value of the water phase diagram, makes the persistence of liquid water at the surface of Mars highly improbable. This means that water is probably present exclusively in the gaseous and solid states at the surface level. Attempts to use space-borne and Earth-based observations to estimate quantitatively surface and near-surface sources and sinks of water vapour have had good but only partial success. The most important questions arising from present knowledge are how the atmospheric circulation of water vapour occurs and how to explain the difference in the hemispheric and seasonal behaviour of the water vapour. Although TES results showed that a percentage of the hemispheric "asymmetry" of the seasonal vapour abundance was probably due to the presence of two dust storms during the MAWD observations, an evident difference remains partially unexplained. In this context, it is extremely important to study the role of the different contributions to the production of atmospheric vapour from the main reservoirs and to the formation of water ice clouds, most probably catalysed by the atmospheric dust. At present, no in situ measurement of water vapour content has been performed. We discuss the possibility of using a new-concept instrument for extraterrestrial planetary environments, based on the past experience acquired for dust monitoring in space and on Earth and on new possible technologies for space applications.
The MEDUSA (Martian Environmental DUst Systematic Analyser) project is a multisensor and multistage instrument based on an optical detector of dust

  4. Cytomics in predictive medicine

    Science.gov (United States)

    Tarnok, Attila; Valet, Guenther K.

    2004-07-01

    Predictive Medicine aims at the detection of changes in a patient's disease state prior to the manifestation of deterioration or improvement of the current status. Patient-specific disease-course predictions with >95% or >99% accuracy during therapy would be highly valuable for everyday medicine. If such predictors were available, disease aggravation or progression, frequently accompanied by irreversible tissue damage or therapeutic side effects, could potentially be avoided by early preventive therapy. The molecular analysis of heterogeneous cellular systems (Cytomics) by cytometry, in conjunction with pattern-oriented bioinformatic analysis of the multiparametric cytometric and other data, provides a promising approach to individualized or personalized medical treatment or disease management. Predictive medicine is best implemented by cell-oriented measurements, e.g. by flow or image cytometry. Cell-oriented gene or protein arrays, as well as bead arrays for the capture of solute molecules from serum, plasma, urine or liquor, are equally of high value. Clinical applications of predictive medicine by Cytomics will include multi-organ failure in sepsis or non-infectious posttraumatic shock in intensive care, and the pretherapeutic identification of high-risk patients in cancer cytostatic therapy. Early individualized therapy may provide better survival chances for individual patients with concomitant cost containment. Predictive-medicine-guided early reduction or cessation of therapy may lower or abrogate potential therapeutic side effects. Further important aspects of predictive medicine concern the preoperative identification of patients with a tendency for postoperative complications, or of coronary artery disease patients with an increased tendency for restenosis. As a consequence, better patient care and new forms of inductive scientific hypothesis development, based on the interpretation of predictive data patterns, are within reach.

  5. Analyses of the early stages of star formation

    Science.gov (United States)

    Lintott, Christopher John

    This thesis presents a study of the physical and chemical properties of star-forming regions, both in the Milky Way and in the distant Universe, building on the existing astrochemical models developed by the group at UCL. Observations of the nearby star-forming region L134A, which were carried out with the James Clerk Maxwell Telescope (JCMT) in Hawai'i, are compared to the predictions of a model of star formation from gas rich in atomic (rather than molecular) hydrogen. A similar model is used to investigate the effect of non-equilibrium chemistry on the derivation of the cosmic-ray ionization rate, an important parameter in controlling both the chemistry and the physics of star-forming clumps. A collapse faster than free-fall is proposed as an explanation for differences between the distributions of CS and N2H+ in such regions. Moving beyond the Milky Way, JCMT observations of sulphur-bearing species in the nearby starburst galaxy M82 are presented and compared with existing molecular observations of similar systems. M82 is a local analogue for star-forming systems in the early Universe, many of which have star formation rates several thousand times that of the Milky Way. A model which treats the molecular gas in such systems as an assembly of 'hot cores' (protostellar cores which have a distinctive chemical signature) has been developed and is used to predict the abundances of many species. An application of this model is used to explain the observed deviation in the early Universe from the otherwise tight relation between infrared and HCN luminosity via relatively recent star formation from near-primordial gas. Many of the stars formed in the early Universe must now be in massive elliptical systems, and work on the structure of these systems is presented. Data from the Sloan Digital Sky Survey are analysed to show that such galaxies have cores dominated by baryons rather than dark matter, and the dark matter profile is constrained by adiabatic contraction.

  6. Non-destructive infrared analyses: a method for provenance analyses of sandstones

    Science.gov (United States)

    Bowitz, Jörg; Ehling, Angela

    2008-12-01

Infrared spectroscopy (IR spectroscopy) is commonly applied in the laboratory for mineral analyses in addition to XRD. Because such laboratory analyses are time-consuming and costly, we present an infrared-based mobile method for non-destructive mineral and provenance analyses of sandstones. IR spectroscopy is based on activating chemical bonds: by irradiating a mineral mixture, particular bonds are excited to vibrate depending on the bond energy (resonance vibration). Accordingly, the energy of the IR spectrum is reduced, generating an absorption spectrum. The positions of the absorption maxima within the spectral region indicate the type of bond and in many cases identify the minerals containing these bonds. The non-destructive reflection spectroscopy operates in the near-infrared region (NIR) and can detect all common clay minerals as well as sulfates, hydroxides and carbonates. The spectra produced have been interpreted by computer using digital mineral libraries compiled especially for sandstones. Comparison of all results with XRD, RFA and thin-section interpretations impressively demonstrates the accuracy and reliability of the method. Not only are different minerals detectable; differently ordered kaolinites and varieties of illite can also be identified by the shape and size of the absorption bands. Clay minerals and their varieties, in combination with their relative contents, form the characteristic spectra of sandstones. Other components such as limonite, hematite and amorphous silica also influence the spectra. Sandstones similar in colour and texture can often be distinguished by their characteristic reflectance spectra. Reference libraries with more than 60 spectra of important German sandstones have been created to enable entirely computerized interpretation and identification of these dimension stones. The analysis of infrared spectroscopy results is demonstrated with examples of different sandstones.
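The computerized identification against a reference library described in this record can be approximated by a simple similarity match. A minimal sketch, assuming cosine similarity as the matching criterion; the library entries and the five-channel "spectra" below are invented for illustration:

```python
import numpy as np

def best_match(spectrum, library):
    """Return the name of the library entry whose reference spectrum
    has the highest cosine similarity to the measured spectrum."""
    def cos_sim(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
    return max(library, key=lambda name: cos_sim(spectrum, library[name]))

# Invented 5-channel NIR reflectance "spectra" for two reference sandstones
library = {
    "sandstone_A": np.array([0.9, 0.4, 0.7, 0.2, 0.5]),
    "sandstone_B": np.array([0.3, 0.8, 0.2, 0.9, 0.6]),
}
measured = np.array([0.85, 0.45, 0.65, 0.25, 0.5])  # resembles sandstone_A
```

Real libraries would hold full-resolution spectra and use band positions and shapes, not raw channel correlation; this only illustrates the matching step.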

  7. Statistical power analyses using G*Power 3.1: tests for correlation and regression analyses.

    Science.gov (United States)

    Faul, Franz; Erdfelder, Edgar; Buchner, Axel; Lang, Albert-Georg

    2009-11-01

    G*Power is a free power analysis program for a variety of statistical tests. We present extensions and improvements of the version introduced by Faul, Erdfelder, Lang, and Buchner (2007) in the domain of correlation and regression analyses. In the new version, we have added procedures to analyze the power of tests based on (1) single-sample tetrachoric correlations, (2) comparisons of dependent correlations, (3) bivariate linear regression, (4) multiple linear regression based on the random predictor model, (5) logistic regression, and (6) Poisson regression. We describe these new features and provide a brief introduction to their scope and handling.
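For the correlation case, the power computation can be sketched with the Fisher z approximation. This is a simplified approximation, not G*Power's exact procedure, assuming a two-sided test of H0: ρ = 0:

```python
import math
from scipy.stats import norm

def correlation_power(r, n, alpha=0.05):
    """Approximate power of the two-sided test of H0: rho = 0 for a
    sample Pearson correlation of size n, via the Fisher z transform."""
    z_r = 0.5 * math.log((1 + r) / (1 - r))   # effect size on the z scale
    se = 1.0 / math.sqrt(n - 3)               # std. error of Fisher z
    z_crit = norm.ppf(1 - alpha / 2)
    # P(reject H0) under the alternative rho = r
    return norm.sf(z_crit - z_r / se) + norm.cdf(-z_crit - z_r / se)
```

With r = 0.3 and n = 84 this yields a power near 0.8, in line with the conventional sample-size rule of thumb for a medium-small correlation.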

  8. Prediction of Factors Determining Changes in Stability in Protein Mutants

    OpenAIRE

    Parthiban, Vijayarangakannan

    2006-01-01

Analysing the factors behind protein stability is a key research topic in molecular biology and has direct implications for protein structure prediction and protein-protein docking. Protein stability upon point mutation was analysed using a distance-dependent pair potential, representing mainly through-space interactions, and a torsion angle potential, representing neighbouring effects, as a basic statistical-mechanical setup for the analysis. The synergetic effect of accessible surface ...

  9. Prediction of cereal feed value using spectroscopy and chemometrics

    DEFF Research Database (Denmark)

    Jørgensen, Johannes Ravn; Gislum, René

    2009-01-01

    of EDOM, EDOMi, FEso and FEsv. The outcome of a successful NIRS calibration will be a relatively cheap tool to monitor, diversify and evaluate the quality of cereals for animal feed, a possible tool to assess the feed value of new varieties in the variety testing and a useful, cheap and rapid tool...... for cereal breeders. A collection of 1213 grain samples of wheat, triticale, barley and rye, and related chemical reference analyses to describe the feed value have been established. The samples originate from available field trials over a three-year period. The chemical reference analyses are dry matter...... value, the prediction error has to be compared with the error in the chemical analysis. Prediction error by NIRS prediction of feed value is above the error of the chemical measurement. The conclusion is that it is possible to predict the feed value in cereals with NIRS quickly and cheaply...

  10. Aircraft noise prediction

    Science.gov (United States)

    Filippone, Antonio

    2014-07-01

This contribution addresses the state-of-the-art in the field of aircraft noise prediction, simulation and minimisation. The point of view taken in this context is that of comprehensive models that couple the various aircraft systems with the acoustic sources, the propagation and the flight trajectories. After an exhaustive review of the present predictive technologies in the relevant fields (airframe, propulsion, propagation, aircraft operations, trajectory optimisation), the paper addresses items for further research and development. Examples are shown for several airplanes, including the Airbus A319-100 (CFM engines), the Bombardier Dash8-Q400 (PW150 engines, Dowty R408 propellers) and the Boeing B737-800 (CFM engines). Predictions are done with the flight mechanics code FLIGHT. The transfer function between flight mechanics and the noise prediction is discussed in some detail, along with the numerical procedures for validation and verification. Some code-to-code comparisons are shown. It is contended that the field of aircraft noise prediction has not yet reached a sufficient level of maturity. In particular, some parametric effects cannot be investigated, issues of accuracy are not currently addressed, and validation standards are still lacking.

  11. Validation of HELIOS for ATR Core Follow Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Bays, Samuel E. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Swain, Emily T. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Crawford, Douglas S. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Nigg, David W. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-03-01

This work summarizes the validation analyses for the HELIOS code to support core design and safety assurance calculations of the Advanced Test Reactor (ATR). Past and current core safety assurance is performed by the PDQ-7 diffusion code, a state-of-the-art reactor physics simulation tool from the nuclear industry’s earlier days. Over the past twenty years, improvements in computational speed have enabled the use of modern neutron transport methodologies to replace the role of diffusion theory for simulation of complex systems, such as the ATR. More exact methodologies have enabled a paradigm shift away from highly tuned codes that force compliance with a bounding safety envelope, and towards codes regularly validated against routine measurements. To validate HELIOS, the 16 ATR operational cycles from late 2009 to present were modeled. The computed power distribution was compared against data collected by the ATR’s on-line power surveillance system. It was found that the ATR’s lobe powers could be determined with ±10% accuracy. Also, the ATR’s cold startup shim configuration for each of these 16 cycles was estimated and compared against the reported critical position from the reactor log book. HELIOS successfully predicted criticality within the tolerance set by the ATR startup procedure for 13 of the 16 cycles, compared to 12 for PDQ (without empirical adjustment). These findings, as well as other insights discussed in this report, suggest that HELIOS is well suited to replace PDQ for core safety assurance of the ATR. Furthermore, a modern verification and validation framework has been established that allows reactor and fuel performance data to be computed with a known degree of accuracy and stated uncertainty.

  12. Pan-cancer analyses of the nuclear receptor superfamily

    Science.gov (United States)

    Long, Mark D.; Campbell, Moray J.

    2016-01-01

Nuclear receptors (NR) act as an integrated conduit for environmental and hormonal signals to govern genomic responses, which relate to cell fate decisions. We review how their integrated actions with each other, shared co-factors and other transcription factors are disrupted in cancer. Steroid hormone nuclear receptors are oncogenic drivers in breast and prostate cancer, and blockade of signaling is a major therapeutic goal. In contrast to blockade of receptors, in other cancers enhanced receptor function is attractive, as illustrated initially by targeting of retinoic acid receptors in leukemia. In the post-genomic era large consortia, such as The Cancer Genome Atlas, have developed a remarkable volume of genomic data with which to examine multiple aspects of nuclear receptor status in a pan-cancer manner. Therefore, to extend the review of NR function, we have also undertaken bioinformatics analyses of NR expression in over 3000 tumors, spread across six different tumor types (bladder, breast, colon, head and neck, liver and prostate). Specifically, to ask how NR expression was distorted (altered expression, mutation and CNV), we have applied bootstrapping approaches to simulate data for comparison, and also compared these NR findings to 12 other transcription factor families. Nuclear receptors were uniquely and uniformly downregulated across all six tumor types, more than predicted by chance. These approaches also revealed that each tumor type had a specific NR expression profile, but these were most similar between breast and prostate cancer. Some NRs were down-regulated in at least five tumor types (e.g. NR3C2/MR and NR5A2/LRH-1) whereas others were uniquely down-regulated in one tumor (e.g. NR1B3/RARG). The downregulation was not driven by copy number variation or mutation, and epigenetic mechanisms may be responsible for the altered nuclear receptor expression. PMID:27200367

  13. Pan-Cancer Analyses of the Nuclear Receptor Superfamily

    Directory of Open Access Journals (Sweden)

    Mark D. Long

    2015-12-01

Full Text Available Nuclear receptors (NR) act as an integrated conduit for environmental and hormonal signals to govern genomic responses, which relate to cell fate decisions. We review how their integrated actions with each other, shared co-factors and other transcription factors are disrupted in cancer. Steroid hormone nuclear receptors are oncogenic drivers in breast and prostate cancer, and blockade of signaling is a major therapeutic goal. In contrast to blockade of receptors, in other cancers enhanced receptor function is attractive, as illustrated initially by targeting of retinoic acid receptors in leukemia. In the post-genomic era large consortia, such as The Cancer Genome Atlas, have developed a remarkable volume of genomic data with which to examine multiple aspects of nuclear receptor status in a pan-cancer manner. Therefore, to extend the review of NR function, we have also undertaken bioinformatics analyses of NR expression in over 3000 tumors, spread across six different tumor types (bladder, breast, colon, head and neck, liver and prostate). Specifically, to ask how NR expression was distorted (altered expression, mutation and CNV), we have applied bootstrapping approaches to simulate data for comparison, and also compared these NR findings to 12 other transcription factor families. Nuclear receptors were uniquely and uniformly downregulated across all six tumor types, more than predicted by chance. These approaches also revealed that each tumor type had a specific NR expression profile, but these were most similar between breast and prostate cancer. Some NRs were down-regulated in at least five tumor types (e.g., NR3C2/MR and NR5A2/LRH-1) whereas others were uniquely down-regulated in one tumor (e.g., NR1B3/RARG). The downregulation was not driven by copy number variation or mutation, and epigenetic mechanisms may be responsible for the altered nuclear receptor expression.

  14. Singular vector decomposition for sensitivity analyses of tropospheric chemical scenarios

    Directory of Open Access Journals (Sweden)

    N. Goris

    2011-06-01

    Full Text Available Observations of the chemical state of the atmosphere typically provide only sparse snapshots of the state of the system due to their insufficient temporal and spatial density. Therefore the measurement configurations need to be optimised to get a best possible state estimate. One possibility to optimise the state estimate is provided by observation targeting of sensitive system states, to identify measurement configurations of best value for forecast improvements. In recent years, numerical weather prediction adapted singular vector analysis with respect to initial values as a novel method to identify sensitive states. In the present work, this technique is transferred from meteorological to chemical forecast. Besides initial values, emissions are investigated as controlling variables. More precisely uncertainties in the amplitude of the diurnal profile of emissions are analysed, yielding emission factors as target variables. Singular vector analysis is extended to allow for projected target variables not only at final time but also at initial time. Further, special operators are introduced, which consider the combined influence of groups of chemical species.

As a preparation for targeted observation calculations, the concept of adaptive observations is studied with a chemistry box model. For a set of six different scenarios, the VOC versus NOx limitation of the ozone formation is investigated. Results reveal that the singular vectors are strongly dependent on the start time and length of the simulation. As expected, singular vectors with initial values as target variables tend to be more sensitive to initial values, while emission factors as target variables are more sensitive to simulation length. Further, the particular importance of chemical compounds differs strongly between absolute and relative error growth.
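The singular-vector idea underlying this record can be illustrated on a toy linearized model: the leading right singular vector of the tangent-linear propagator is the initial perturbation that grows most over the forecast window, and its growth factor is the leading singular value. A minimal sketch; the 2×2 propagator matrix is invented for illustration:

```python
import numpy as np

# Invented 2x2 tangent-linear propagator M: maps an initial
# perturbation to the resulting perturbation at forecast time.
M = np.array([[0.9, 0.5],
              [0.1, 1.8]])

U, s, Vt = np.linalg.svd(M)
v1 = Vt[0]                        # leading right singular vector: most sensitive
                                  # initial perturbation (unit norm)
growth = np.linalg.norm(M @ v1)   # its amplification = leading singular value s[0]
```

In an atmospheric-chemistry application M would be the linearization of the chemical transport model around a trajectory, and the target-variable projections described above would be applied before the decomposition.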

  15. Integrated Waste Treatment Unit (IWTU) Input Coal Analyses and Off-Gass Filter (OGF) Content Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Jantzen, Carol M. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Missimer, David M. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Guenther, Chris P. [National Energy Technology Lab. (NETL), Morgantown, WV (United States); Shekhawat, Dushyant [National Energy Technology Lab. (NETL), Morgantown, WV (United States); VanEssendelft, Dirk T. [National Energy Technology Lab. (NETL), Morgantown, WV (United States); Means, Nicholas C. [AECOM Technology Corp., Oak Ridge, TN (United States)

    2015-04-23

in process piping and materials, in excessive off-gas absorbent loading, and in undesired process emissions. The ash content of the coal is important as the ash adds to the DMR and other vessel products which affect the final waste product mass and composition. The amount and composition of the ash also affect the reaction kinetics. Thus ash content and composition contribute to the mass balance. In addition, sodium, potassium, calcium, sulfur, and possibly silica and alumina in the ash may contribute to wall-scale formation. Sodium, potassium, and alumina in the ash will be overwhelmed by the sodium, potassium, and alumina from the feed, but the impact from the other ash components needs to be quantified. A maximum coal particle size is specified so the feed system does not plug, and a minimum particle size is specified to prevent excess elutriation from the DMR to the Process Gas Filter (PGF). A vendor specification was used to procure the calcined coal for IWTU processing. While the vendor supplied a composite analysis for the 22 tons of coal (Appendix A), this study compares independent analyses of the coal performed at the Savannah River National Laboratory (SRNL) and at the National Energy Technology Laboratory (NETL). Three supersacks were sampled at three different heights within each sack in order to determine within-bag and between-bag variability of the coal. These analyses were also compared to the vendor’s composite analyses and to the coal specification, as well as to historic data on Bestac coal analyses performed at Hazen Research Inc. (HRI) between 2004 and 2011.

  16. Prediction of postoperative pain: a systematic review of predictive experimental pain studies

    DEFF Research Database (Denmark)

    Werner, Mads Utke; Mjöbo, Helena N; Nielsen, Per R

    2010-01-01

    Quantitative testing of a patient's basal pain perception before surgery has the potential to be of clinical value if it can accurately predict the magnitude of pain and requirement of analgesics after surgery. This review includes 14 studies that have investigated the correlation between preoper...... previously reported for single factor analyses of demographics and psychologic factors. In addition, some of these studies indicate that an increase in preoperative pain sensitivity is associated with a high probability of development of sustained postsurgical pain....

  17. Some Observations on Damage Tolerance Analyses in Pressure Vessels

    Science.gov (United States)

    Raju, Ivatury S.; Dawicke, David S.; Hampton, Roy W.

    2017-01-01

    that for this loading, using Approach I and the initial detectable crack sizes at the (a/c) endpoints in 5009 specified for the ET and UT NDE methods, the smallest life is not at the two required limits of the (a/c) range, but rather is at an intermediate configuration in the range (a/c) of 0.4 to 0.6. Similar analyses using both Approach I and III with the initial detectable crack size at the (a/c) endpoints in 5009 for PT NDE showed the smallest life may be at an (a/c) endpoint or an intermediate (a/c), depending upon which Approach is used. As such, analyses that interrogate only the two (a/c) values of 0.2 and 1 may result in unconservative life predictions. The standard practice may need to be revised based on these results.

  18. Predicting tile drainage discharge

    DEFF Research Database (Denmark)

    Iversen, Bo Vangsø; Kjærgaard, Charlotte; Petersen, Rasmus Jes;

of the water load coming from the tile drainage system is therefore essential. This work aims at predicting tile drainage discharge using dynamic as well as statistical predictive models. A large dataset of historical tile drain discharge data, daily discharge values as well as yearly average values were......More than 50 % of Danish agricultural areas are expected to be artificially tile drained. Transport of water and nutrients through the tile drain system to the aquatic environment is expected to be significant. For different mitigation strategies such as constructed wetlands an exact knowledge...... used in the analysis. For the dynamic modelling, a simple linear reservoir model was used where different outlets in the model represented tile drain as well as groundwater discharge outputs. This modelling was based on daily measured tile drain discharge values. The statistical predictive model...
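A linear reservoir of the kind mentioned for the dynamic model can be sketched in a few lines. This is a generic single-outlet sketch, not the multi-outlet model of the study; the recession constant k and recharge series are illustrative:

```python
def linear_reservoir(recharge, k):
    """Daily linear-reservoir routing: storage S receives the day's
    recharge R, then releases Q = k * S as tile drain discharge."""
    s, discharge = 0.0, []
    for r in recharge:
        s += r          # add today's recharge to storage
        q = k * s       # outflow proportional to storage
        s -= q
        discharge.append(q)
    return discharge
```

Under constant recharge the outflow converges to the recharge rate, the steady-state mass balance one expects of any reservoir model; the study's version adds a second outlet for groundwater discharge.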

  19. Partially predictable chaos

    CERN Document Server

    Wernecke, Hendrik; Gros, Claudius

    2016-01-01

For a chaotic system, pairs of initially close-by trajectories eventually become fully uncorrelated on the attracting set. This process of decorrelation is split into an initial decrease characterized by the maximal Lyapunov exponent and a subsequent diffusive process on the chaotic attractor causing the final loss of predictability. The time scales of the two processes can be either of the same or of very different orders of magnitude. In the latter case the two trajectories linger within a finite but small distance (with respect to the overall size of the attractor) for exceedingly long times and therefore remain partially predictable. We introduce a 0-1 indicator for chaos capable of describing this scenario, arguing, in addition, that the chaotic closed braids found close to a period-doubling transition are generically partially predictable.
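The initial exponential decorrelation rate discussed in this record is the maximal Lyapunov exponent. For a one-dimensional map it can be estimated by averaging log-derivatives along an orbit; a sketch with the logistic map, whose exponent at r = 4 is known analytically to be ln 2 (the map choice and parameters are illustrative, not from the paper):

```python
import math

def lyapunov_logistic(r=4.0, n=100_000, x0=0.2):
    """Estimate the maximal Lyapunov exponent of the logistic map
    x -> r*x*(1-x) by averaging log|f'(x)| = log|r*(1-2x)| along an orbit."""
    x, acc = x0, 0.0
    for _ in range(n):
        acc += math.log(abs(r * (1.0 - 2.0 * x)))
        x = r * x * (1.0 - x)
    return acc / n
```

For higher-dimensional systems one instead tracks the divergence of a perturbed trajectory with periodic renormalization, which is also how the slow diffusive stage described above would be observed.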

  20. Basis of predictive mycology.

    Science.gov (United States)

    Dantigny, Philippe; Guilmart, Audrey; Bensoussan, Maurice

    2005-04-15

For over 20 years, predictive microbiology focused on food-pathogenic bacteria. Few studies have concerned the modelling of fungal development. On the one hand, most food mycologists are not familiar with modelling techniques; on the other hand, people involved in modelling are developing tools dedicated to bacteria. Therefore, there is a tendency to extend the use of models that were developed for bacteria to moulds. However, some mould specificities should be taken into account. The use of specific models for predicting germination and growth of fungi was advocated previously []. This paper provides a short review of fungal modelling studies.
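Mould-specific germination models of the kind alluded to here are often sigmoid curves of the percentage of germinated spores versus time. A sketch of one such logistic form; the functional form and all parameter values are illustrative assumptions, not taken from this paper:

```python
import math

def germination(t, p_max=100.0, tau=48.0, k=0.15):
    """Logistic germination curve: percentage of germinated spores at
    time t (hours). tau is the time to 50 % germination and k controls
    the steepness. All parameter values are illustrative only."""
    return p_max / (1.0 + math.exp(k * (tau - t)))
```

Fitting tau and k to observed germination counts, rather than reusing bacterial lag-phase models, is the kind of mould-specific treatment the record argues for.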

  1. Prediction of Antibody Epitopes

    DEFF Research Database (Denmark)

    Nielsen, Morten; Marcatili, Paolo

    2015-01-01

    Antibodies recognize their cognate antigens in a precise and effective way. In order to do so, they target regions of the antigenic molecules that have specific features such as large exposed areas, presence of charged or polar atoms, specific secondary structure elements, and lack of similarity...... to self-proteins. Given the sequence or the structure of a protein of interest, several methods exploit such features to predict the residues that are more likely to be recognized by an immunoglobulin.Here, we present two methods (BepiPred and DiscoTope) to predict linear and discontinuous antibody...

  2. Linguistic Structure Prediction

    CERN Document Server

    Smith, Noah A

    2011-01-01

A major part of natural language processing now depends on the use of text data to build linguistic analyzers. We consider statistical, computational approaches to modeling linguistic structure. We seek to unify across many approaches and many kinds of linguistic structures. Assuming a basic understanding of natural language processing and/or machine learning, we seek to bridge the gap between the two fields. Approaches to decoding (i.e., carrying out linguistic structure prediction) and supervised and unsupervised learning of models that predict discrete structures as outputs are the focus.

  3. Nuclear level density predictions

    Directory of Open Access Journals (Sweden)

    Bucurescu Dorel

    2015-01-01

Full Text Available Simple formulas depending only on nuclear masses were previously proposed for the parameters of the Back-Shifted Fermi Gas (BSFG) model and of the Constant Temperature (CT) model of the nuclear level density, respectively. They are now applied for the prediction of the level density parameters of all nuclei with available masses. Both masses from the new 2012 mass table and from different models are considered, and the predictions are discussed in connection with the nuclear regions most affected by shell corrections and nuclear structure effects and relevant for nucleosynthesis.
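For reference, the Back-Shifted Fermi Gas model mentioned in this record has the standard closed form ρ(E) = exp(2√(aU)) / (12√2 · σ · a^(1/4) · U^(5/4)) with U = E − E1. A sketch of that formula; the parameter values in the test are illustrative, not the paper's fitted values:

```python
import math

def bsfg_level_density(E, a, E1, sigma):
    """Back-Shifted Fermi Gas nuclear level density (standard closed form).
    E: excitation energy (MeV), a: level-density parameter (1/MeV),
    E1: back-shift energy (MeV), sigma: spin cut-off parameter."""
    U = E - E1                     # effective excitation energy
    return math.exp(2.0 * math.sqrt(a * U)) / (
        12.0 * math.sqrt(2.0) * sigma * a ** 0.25 * U ** 1.25)
```

The paper's contribution is expressing a, E1 (and the CT-model analogues) as simple functions of nuclear masses; the formula above is the common downstream use of those parameters.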

  4. Angular analyses in relativistic quantum mechanics; Analyses angulaires en mecanique quantique relativiste

    Energy Technology Data Exchange (ETDEWEB)

    Moussa, P. [Commissariat a l' Energie Atomique, 91 - Saclay (France). Centre d' Etudes Nucleaires

    1968-06-01

This work describes the angular analysis of reactions between particles with spin in a fully relativistic fashion. One-particle states are introduced, following Wigner's method, as representations of the inhomogeneous Lorentz group. In order to perform the angular analyses, the reduction of the product of two representations of the inhomogeneous Lorentz group is studied. Clebsch-Gordan coefficients are computed for the following couplings: l-s coupling, helicity coupling, multipolar coupling, and symmetric coupling for more than two particles. Massless and massive particles are handled simultaneously. Along the way we construct spinorial amplitudes and free fields, and we recall how to establish convergence theorems for angular expansions from analyticity hypotheses. Finally we substitute these hypotheses for the idea of a 'potential radius', which gives at low energy the usual 'centrifugal barrier' factors. The presence of such factors had never before been deduced from hypotheses compatible with relativistic invariance. (author)

  5. A Characterization of Prediction Errors

    OpenAIRE

    Meek, Christopher

    2016-01-01

Understanding prediction errors and determining how to fix them is critical to building effective predictive systems. In this paper, we delineate four types of prediction errors and demonstrate that these four types characterize all prediction errors. In addition, we describe potential remedies and tools that can be used to reduce the uncertainty when trying to determine the source of a prediction error and when trying to take action to remove it.

  6. Azimuthal angular distributions in EDDE as a spin-parity analyser and glueball filter for the LHC

    Energy Technology Data Exchange (ETDEWEB)

    Petrov, Vladimir Alexeevich [Institute for High Energy Physics, 142 281, Protvino (Russian Federation); Ryutin, Roman Anatolievich [Institute for High Energy Physics, 142 281, Protvino (Russian Federation); Sobol, Andrei E. [Institute for High Energy Physics, 142 281, Protvino (Russian Federation); Guillaud, Jean-Paul [LAPP, Annecy (France)

    2005-06-01

    Exclusive Double Diffractive Events (EDDE) are analysed as the source of the information about the central system. The experimental possibilities for the exotic particles searches are considered. From the reggeized tensor current picture some azimuthal angle dependences were obtained to fit the data from the WA102 experiment and to make predictions for the LHC collider.

  7. Prediction method abstracts

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-12-31

This conference was held December 4--8, 1994 in Asilomar, California. The purpose of this meeting was to provide a forum for exchange of state-of-the-art information concerning the prediction of protein structure. Attention is focused on the following: comparative modeling; sequence-to-fold assignment; and ab initio folding.

  8. Predictability of critical transitions

    Science.gov (United States)

    Zhang, Xiaozhu; Kuehn, Christian; Hallerberg, Sarah

    2015-11-01

Critical transitions in multistable systems have been discussed as models for a variety of phenomena, ranging from the extinction of species to socioeconomic changes and climate transitions between ice ages and warm ages. From bifurcation theory we can expect certain critical transitions to be preceded by a decreased recovery from external perturbations. The consequences of this critical slowing down have been observed as an increase in variance and autocorrelation prior to the transition. However, especially in the presence of noise, it is not clear whether these changes in observation variables are statistically significant enough to be used as indicators of critical transitions. In this contribution we investigate the predictability of critical transitions in conceptual models. We study the quadratic integrate-and-fire model and the van der Pol model under the influence of external noise. We focus especially on the statistical analysis of the success of predictions and the overall predictability of the system. The performance of different indicator variables turns out to depend on the specific model under study and the conditions of accessing it. Furthermore, we study the influence of the magnitude of transitions on the predictive performance.
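The critical-slowing-down signature described here (rising variance and lag-1 autocorrelation ahead of a transition) can be reproduced with a toy AR(1) process whose coefficient drifts toward 1. All values below are illustrative, not from the paper's models:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4000
phi = np.linspace(0.2, 0.97, n)   # AR(1) coefficient drifting toward 1:
                                  # recovery from perturbations slows down
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi[t] * x[t - 1] + rng.normal()

def lag1_autocorr(w):
    """Lag-1 autocorrelation of a window of the series."""
    return float(np.corrcoef(w[:-1], w[1:])[0, 1])

early, late = x[:500], x[-500:]   # windows far from / close to the transition
```

Both indicators rise in the late window, which is exactly the statistical signal whose reliability as a predictor the record investigates.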

  9. Predicting coronary heart disease

    DEFF Research Database (Denmark)

    Sillesen, Henrik; Fuster, Valentin

    2012-01-01

    Atherosclerosis is the leading cause of death and disabling disease. Whereas risk factors are well known and constitute therapeutic targets, they are not useful for prediction of risk of future myocardial infarction, stroke, or death. Therefore, methods to identify atherosclerosis itself have been...

  10. THE PREDICTION OF OVULATION

    Institute of Scientific and Technical Information of China (English)

WANG Xin-Xing; ZHA Shu-Wei; WU Zhou-Ya

    1989-01-01

The authors present their work on the prediction of ovulation in forty-five women with normal menstrual cycles, for a total of 72 cycles, by several indices including ultrasonography, BBT graph, cervical mucus and mittelschmerz. LH peak values were also determined for reference in 20 cases (20 cycles). Results are as follows:

  11. Space Weather Prediction

    Science.gov (United States)

    2014-10-31

    Mason University (Odstrcil), worked to modify the WSA-Enlil operational solar wind model so that it runs in a more realistic, time-dependent fashion...Ruždjak, D., Cliver, E., Svalgaard, L., and Roth , M., “On solar cycle predictions and reconstructions,” Astronomy & Astrophysics, 496, Mar 2009, pp

  12. Predicting Visibility of Aircraft

    Science.gov (United States)

    Watson, Andrew; Ramirez, Cesar V.; Salud, Ellen

    2009-01-01

Visual detection of aircraft by human observers is an important element of aviation safety. To assess and ensure safety, it would be useful to be able to predict the visibility, to a human observer, of an aircraft of specified size, shape, distance, and coloration. Examples include assuring safe separation among aircraft and between aircraft and unmanned vehicles, design of airport control towers, and efforts to enhance or suppress the visibility of military and rescue vehicles. We have recently developed a simple metric of pattern visibility, the Spatial Standard Observer (SSO). In this report we examine whether the SSO can predict visibility of simulated aircraft images. We constructed a set of aircraft images from three-dimensional computer graphic models, and measured the luminance contrast threshold for each image from three human observers. The data were well predicted by the SSO. Finally, we show how to use the SSO to predict visibility range for aircraft of arbitrary size, shape, distance, and coloration. PMID:19462007

  13. Predicting Reasoning from Memory

    Science.gov (United States)

    Heit, Evan; Hayes, Brett K.

    2011-01-01

    In an effort to assess the relations between reasoning and memory, in 8 experiments, the authors examined how well responses on an inductive reasoning task are predicted from responses on a recognition memory task for the same picture stimuli. Across several experimental manipulations, such as varying study time, presentation frequency, and the…

  14. Predictive models in urology.

    Science.gov (United States)

    Cestari, Andrea

    2013-01-01

Predictive modeling is emerging as an important knowledge-based technology in healthcare. The interest in its use reflects advances on several fronts: the availability of health information from increasingly complex databases and electronic health records, a better understanding of causal or statistical predictors of health, disease processes and multifactorial models of ill-health, and developments in nonlinear computer models using artificial intelligence or neural networks. These new computer-based forms of modeling are increasingly able to establish technical credibility in clinical contexts. It is still too early to know how this so-called 'machine intelligence' will evolve, and therefore how current, relatively sophisticated predictive models will develop in response to improvements in technology, which is advancing along a wide front. Predictive models in urology are gaining popularity not only for academic and scientific purposes but also in clinical practice, with the introduction of several nomograms dealing with the main fields of onco-urology.

  15. Vertebral Fracture Prediction

    DEFF Research Database (Denmark)

    2008-01-01

Vertebral Fracture Prediction A method of processing data derived from an image of at least part of a spine is provided for estimating the risk of a future fracture in vertebrae of the spine. Position data relating to at least four neighbouring vertebrae of the spine is processed. The curvature

  16. Highlights, predictions, and changes

    Directory of Open Access Journals (Sweden)

    Jeang Kuan-Teh

    2012-11-01

    Full Text Available Abstract Recent literature highlights at Retrovirology are described. Predictions are made regarding “hot” retrovirology research trends for the coming year based on recent journal access statistics. Changes in Retrovirology editor and the frequency of the Retrovirology Prize are announced.

  17. Highlights, predictions, and changes

    OpenAIRE

    Jeang Kuan-Teh

    2012-01-01

    Abstract Recent literature highlights at Retrovirology are described. Predictions are made regarding “hot” retrovirology research trends for the coming year based on recent journal access statistics. Changes in Retrovirology editor and the frequency of the Retrovirology Prize are announced.

  18. Predicting Intrinsic Motivation

    Science.gov (United States)

    Martens, Rob; Kirschner, Paul A.

    2004-01-01

Intrinsic motivation can be predicted from participants' perceptions of the social environment and the task environment (Ryan & Deci, 2000) in terms of control, relatedness and competence. To determine the degree of independence of these factors 251 students in higher vocational education (physiotherapy and hotel management) indicated the extent to…

  19. Neurological abnormalities predict disability

    DEFF Research Database (Denmark)

    Poggesi, Anna; Gouw, Alida; van der Flier, Wiesje

    2014-01-01

    To investigate the role of neurological abnormalities and magnetic resonance imaging (MRI) lesions in predicting global functional decline in a cohort of initially independent-living elderly subjects. The Leukoaraiosis And DISability (LADIS) Study, involving 11 European centres, was primarily aimed...

  20. Hypotheses and Inductive Predictions

    NARCIS (Netherlands)

    ROMEYN, J.-W.

    2008-01-01

    ABSTRACT. This paper studies the use of hypotheses schemes in generating inductive predictions. After discussing Carnap–Hintikka inductive logic, hypotheses schemes are defined and illustrated with two partitions. One partition results in the Carnapian continuum of inductive methods, the other resul

  1. Predicting Classroom Success.

    Science.gov (United States)

    Kessler, Ronald P.

    A study was conducted at Rancho Santiago College (RSC) to identify personal and academic factors that are predictive of students' success in their courses. The study examined the following possible predictors of success: language and math test scores; background characteristics; length of time out of high school; high school background; college…

  2. Predicting rainfall beyond tomorrow

    Science.gov (United States)

    NOAA’s Climate Prediction Center issues climate precipitation forecasts that offer potential support for water resource managers and farmers and ranchers in New Mexico, but the forecasts are frequently misunderstood and not widely used in practical decision making. The objectives of this newsletter ...

  3. Genetically optimizing weather predictions

    Science.gov (United States)

    Potter, S. B.; Staats, Kai; Romero-Colmenero, Encarni

    2016-07-01

humidity, air pressure, wind speed and wind direction) into a database. Built upon this database, we have developed a remarkably simple approach to derive a functional weather predictor. The aim is to provide up-to-the-minute local weather predictions in order to e.g. prepare dome environment conditions for night-time operations or to plan, prioritize and update weather-dependent observing queues. In order to predict the weather for the next 24 hours, we take the current live weather readings and search the entire archive for similar conditions. Predictions are made against an averaged, subsequent 24 hours of the closest matches for the current readings. We use an Evolutionary Algorithm to optimize our formula through weighted parameters. The accuracy of the predictor is routinely tested and tuned against the full, updated archive to account for seasonal trends and total climate shifts. The live (updated every 5 minutes) SALT weather predictor can be viewed here: http://www.saao.ac.za/sbp/suthweather_predict.html
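The archive-matching scheme this record describes (find past readings similar to the current ones, then average what followed them) can be sketched roughly as follows. All names, weights, and the toy archive are hypothetical; the real SALT predictor matches on more variables, averages full 24-hour windows, and tunes the weights with an evolutionary algorithm:

```python
# Hypothetical analog predictor: match current readings against an archive
# and average what followed the closest matches. Names and data are invented.

def weighted_distance(current, archived, weights):
    """Weighted Euclidean distance between two weather readings."""
    return sum(w * (current[k] - archived[k]) ** 2
               for k, w in weights.items()) ** 0.5

def predict_next(current, archive, weights, n_matches=3):
    """Average the subsequent temperature of the n closest archived readings."""
    ranked = sorted(archive,
                    key=lambda rec: weighted_distance(current, rec["reading"], weights))
    closest = ranked[:n_matches]
    return sum(rec["next_temperature"] for rec in closest) / len(closest)

archive = [
    {"reading": {"humidity": 60, "pressure": 1012}, "next_temperature": 15.0},
    {"reading": {"humidity": 62, "pressure": 1013}, "next_temperature": 16.0},
    {"reading": {"humidity": 90, "pressure": 995},  "next_temperature": 9.0},
    {"reading": {"humidity": 30, "pressure": 1025}, "next_temperature": 22.0},
]
weights = {"humidity": 1.0, "pressure": 2.0}  # would be tuned, e.g. evolutionarily
forecast = predict_next({"humidity": 61, "pressure": 1012}, archive, weights, n_matches=2)
print(forecast)  # mean of the two closest matches' next temperatures
```

The evolutionary-algorithm step described in the abstract would adjust `weights` to minimize historical prediction error; here they are fixed by hand.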

  4. PREDICTION OF OVULATION

    Institute of Scientific and Technical Information of China (English)

    LIUYong; CHENSu-Ru; ZHOUJin-Ting; LIUJi-Ying

    1989-01-01

The purpose of this research is: 1) to observe the secretory pattern of five reproductive hormones in Chinese women with normal menstrual cycles, especially at the pre-ovulatory period; 2) to study whether urinary LH measurement could be used instead of serum LH measurement; 3) to evaluate the significance of LH-EIA kit (Right-Day) for ovulation prediction.

  5. Integrative analyses shed new light on human ribosomal protein gene regulation

    Science.gov (United States)

    Li, Xin; Zheng, Yiyu; Hu, Haiyan; Li, Xiaoman

    2016-01-01

Ribosomal protein genes (RPGs) are important house-keeping genes that are well-known for their coordinated expression. Previous studies on RPGs are largely limited to their promoter regions. Recent high-throughput studies provide an unprecedented opportunity to study how human RPGs are transcriptionally modulated and how such transcriptional regulation may contribute to the coordinate gene expression in various tissues and cell types. By analyzing the DNase I hypersensitive sites under 349 experimental conditions, we predicted 217 RPG regulatory regions in the human genome. More than 86.6% of these computationally predicted regulatory regions were partially corroborated by independent experimental measurements. Motif analyses on these predicted regulatory regions identified 31 DNA motifs, including 57.1% of experimentally validated motifs in literature that regulate RPGs. Interestingly, we observed that the majority of the predicted motifs were shared by the predicted distal and proximal regulatory regions of the same RPGs, a likely general mechanism for enhancer-promoter interactions. We also found that RPGs may be differently regulated in different cells, indicating that condition-specific RPG regulatory regions still need to be discovered and investigated. Our study advances the understanding of how RPGs are coordinately modulated, which sheds light on the general principles of gene transcriptional regulation in mammals. PMID:27346035

  6. Prolonged grief and depression after unnatural loss: Latent class analyses and cognitive correlates.

    Science.gov (United States)

    Boelen, Paul A; Reijntjes, Albert; J Djelantik, A A A Manik; Smid, Geert E

    2016-06-30

This study sought to identify (a) subgroups among people confronted with unnatural/violent loss, characterized by different symptom profiles of prolonged grief disorder (PGD) and depression, and (b) socio-demographic, loss-related, and cognitive variables associated with subgroup membership. We used data from 245 individuals confronted with the death of a loved one due to an accident (47.3%), suicide (49%) or homicide (3.7%). Latent class analysis revealed three classes of participants: a resilient class (25.3%), a predominantly PGD class (39.2%), and a combined PGD/Depression class (35.5%). Membership in the resilient class was predicted by longer time since loss and lower age; membership in the combined class was predicted by lower education. Endorsement of negative cognitions about the self, life, the future, and one's own grief reactions was lowest in the resilient class, intermediate in the PGD class, and highest in the combined PGD/Depression class. When all socio-demographic, loss-related, and cognitive variables were included in multinomial regression analyses predicting class membership, it was found that negative cognitions about one's grief was the only variable predicting membership of the PGD class. Negative cognitions about the self, life, and grief predicted membership of the combined PGD/Depression class. These findings provide valuable information for the development of interventions for different subgroups of bereaved individuals confronted with unnatural/violent loss.

  7. Refining intra-protein contact prediction by graph analysis

    Directory of Open Access Journals (Sweden)

    Eyal Eran

    2007-05-01

Full Text Available Abstract Background Accurate prediction of intra-protein residue contacts from sequence information will allow the prediction of protein structures. Basic predictions of such specific contacts can be further refined by jointly analyzing predicted contacts, and by adding information on the relative positions of contacts in the protein primary sequence. Results We introduce a method for graph analysis refinement of intra-protein contacts, termed GARP. Our previously presented intra-contact prediction method by means of a pair-to-pair substitution matrix (P2PConPred) was used to test the GARP method. In our approach, the top contact predictions obtained by a basic prediction method were used as edges to create a weighted graph. The edges were scored by a mutual clustering coefficient that identifies highly connected graph regions, and by the density of edges between the sequence regions of the edge nodes. A test set of 57 proteins with known structures was used to determine contacts. GARP improves the accuracy of the P2PConPred basic prediction method in whole proteins from 12% to 18%. Conclusion Using a simple approach we increased the contact prediction accuracy of a basic method by 1.5 times. Our graph approach is simple to implement, can be used with various basic prediction methods, and can provide input for further downstream analyses.
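The edge re-scoring idea can be sketched in miniature: predicted contacts become edges of a graph, and each edge is re-scored by how interconnected the region around it is. The overlap measure below is a simplified Jaccard-style stand-in for GARP's mutual clustering coefficient, and the contact list is invented for illustration:

```python
# Toy re-scoring of predicted contacts by neighbourhood overlap. The contact
# list is invented; the overlap measure is a simplified stand-in for GARP's
# mutual clustering coefficient.
from collections import defaultdict

def build_graph(edges):
    """Undirected adjacency sets from an edge list."""
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    return adj

def overlap_score(adj, u, v):
    """Jaccard-style overlap of the endpoints' neighbourhoods (u, v excluded)."""
    common = (adj[u] & adj[v]) - {u, v}
    union = (adj[u] | adj[v]) - {u, v}
    return len(common) / len(union) if union else 0.0

predicted_contacts = [(1, 5), (1, 6), (5, 6), (2, 9), (1, 9), (5, 9)]
adj = build_graph(predicted_contacts)
# Edges inside densely interconnected regions float to the top; isolated ones sink.
rescored = sorted(predicted_contacts,
                  key=lambda e: overlap_score(adj, *e), reverse=True)
print(rescored[0], rescored[-1])
```

In GARP the re-scoring additionally weighs the density of edges between sequence regions; this sketch keeps only the graph-connectivity part.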

  8. Exchange Rate Predictions

    OpenAIRE

    Yablonskyy, Karen

    2012-01-01

The aim of this thesis is to analyze foreign exchange currency forecasting techniques. Moreover, the central idea behind the topic is to develop a forecasting strategy by choosing indicators and techniques to make our own forecast on the currency pair EUR/USD. This thesis work is a mixture of theoretical and practical analyses. The goal during the work on this project was to study different types of forecasting techniques and make our own forecast, practice forecasting and trading on a Forex platform, ba...

  9. Predictive Manufacturing: Classification of categorical data

    DEFF Research Database (Denmark)

    Khan, Abdul Rauf; Schiøler, Henrik; Kulahci, Murat;

    2017-01-01

    processes is high volume of information about the process dynamics. In this paper we present a methodology to deal with the categorical data streams from manufacturing processes, with an objective of predicting failures on the last stage of the process. A thorough examination of the behaviour...... and classification capabilities of our methodology (on different experimental settings) is done through a specially designed simulation experiment. Secondly, in order to demonstrate the applicability in a real life problem a data set from electronics component manufacturing is being analysed through our proposed...

  10. Final report on reliability and lifetime prediction.

    Energy Technology Data Exchange (ETDEWEB)

    Gillen, Kenneth T; Wise, Jonathan; Jones, Gary D.; Causa, Al G.; Terrill, Edward R.; Borowczak, Marc

    2012-12-01

    This document highlights the important results obtained from the subtask of the Goodyear CRADA devoted to better understanding reliability of tires and to developing better lifetime prediction methods. The overall objective was to establish the chemical and physical basis for the degradation of tires using standard as well as unique models and experimental techniques. Of particular interest was the potential application of our unique modulus profiling apparatus for assessing tire properties and for following tire degradation. During the course of this complex investigation, extensive relevant information was generated, including experimental results, data analyses and development of models and instruments. Detailed descriptions of the findings are included in this report.

  11. Educational Data Mining & Students’ Performance Prediction

    Directory of Open Access Journals (Sweden)

    Amjad Abu Saa

    2016-05-01

Full Text Available It is important to study and analyse educational data, especially students’ performance. Educational Data Mining (EDM) is the field of study concerned with mining educational data to find out interesting patterns and knowledge in educational organizations. This study is equally concerned with this subject, specifically the students’ performance. This study explores multiple factors theoretically assumed to affect students’ performance in higher education, and finds a qualitative model which best classifies and predicts the students’ performance based on related personal and social factors.

  12. Model Predictive Control of Sewer Networks

    Science.gov (United States)

    Pedersen, Einar B.; Herbertsson, Hannes R.; Niemann, Henrik; Poulsen, Niels K.; Falk, Anne K. V.

    2017-01-01

The developments in solutions for management of urban drainage are of vital importance, as the amount of sewer water from urban areas continues to increase due to the growth of the world’s population and the change in climate conditions. How a sewer network is structured, monitored and controlled has thus become an essential factor for efficient performance of waste water treatment plants. This paper examines methods for simplified modelling and control of a sewer network. A practical approach to the problem is taken by analysing a simplified design model, which is based on the Barcelona benchmark model. Due to the inherent constraints, the applied approach is based on Model Predictive Control.

  13. Prediction of Wild-type Enzyme Characteristics

    DEFF Research Database (Denmark)

    Geertz-Hansen, Henrik Marcus

    of biotechnology, including enzyme discovery and characterization. This work presents two articles on sequence-based discovery and functional annotation of enzymes in environmental samples, and two articles on analysis and prediction of enzyme thermostability and cofactor requirements. The first article presents...... a sequence-based approach to discovery of proteolytic enzymes in metagenomes obtained from the Polar oceans. We show that microorganisms living in these extreme environments of constant low temperature harbour genes encoding novel proteolytic enzymes with potential industrial relevance. The second article...... presents a web server for the processing and annotation of functional metagenomics sequencing data, tailored to meet the requirements of non-bioinformaticians. The third article presents analyses of the molecular determinants of enzyme thermostability, and a feature-based prediction method of the melting...

  14. SENSITIVITY ANALYSIS FOR SALTSTONE DISPOSAL UNIT COLUMN DEGRADATION ANALYSES

    Energy Technology Data Exchange (ETDEWEB)

    Flach, G.

    2014-10-28

PORFLOW-related analyses supporting a Sensitivity Analysis for Saltstone Disposal Unit (SDU) column degradation were performed. Previous analyses (Flach and Taylor 2014) used a model in which the SDU columns degraded in a piecewise manner from the top and bottom simultaneously. The current analyses employ a model in which all pieces of the column degrade at the same time. Information was extracted from the analyses which may be useful in determining the distribution of Tc-99 in the various SDUs throughout time and in determining flow balances for the SDUs.

  15. Analysing harmonic motions with an iPhone’s magnetometer

    Science.gov (United States)

    Yavuz, Ahmet; Kağan Temiz, Burak

    2016-05-01

    In this paper, we propose an experiment for analysing harmonic motion using an iPhone’s (or iPad’s) magnetometer. This experiment consists of the detection of magnetic field variations obtained from an iPhone’s magnetometer sensor. A graph of harmonic motion is directly displayed on the iPhone’s screen using the Sensor Kinetics application. Data from this application was analysed with Eureqa software to establish the equation of the harmonic motion. Analyses show that the use of an iPhone’s magnetometer to analyse harmonic motion is a practical and effective method for small oscillations and frequencies less than 15-20 Hz.

  16. Candidate Prediction Models and Methods

    DEFF Research Database (Denmark)

    Nielsen, Henrik Aalborg; Nielsen, Torben Skov; Madsen, Henrik

    2005-01-01

    This document lists candidate prediction models for Work Package 3 (WP3) of the PSO-project called ``Intelligent wind power prediction systems'' (FU4101). The main focus is on the models transforming numerical weather predictions into predictions of power production. The document also outlines...

  17. Quantifying data worth toward reducing predictive uncertainty

    Science.gov (United States)

    Dausman, A.M.; Doherty, J.; Langevin, C.D.; Sukop, M.C.

    2010-01-01

The present study demonstrates a methodology for optimization of environmental data acquisition. Based on the premise that the worth of data increases in proportion to its ability to reduce the uncertainty of key model predictions, the methodology can be used to compare the worth of different data types, gathered at different locations within study areas of arbitrary complexity. The method is applied to a hypothetical nonlinear, variable density numerical model of salt and heat transport. The relative utilities of temperature and concentration measurements at different locations within the model domain are assessed in terms of their ability to reduce the uncertainty associated with predictions of movement of the salt water interface in response to a decrease in fresh water recharge. In order to test the sensitivity of the method to nonlinear model behavior, analyses were repeated for multiple realizations of system properties. Rankings of observation worth were similar for all realizations, indicating robust performance of the methodology when employed in conjunction with a highly nonlinear model. The analysis showed that while concentration and temperature measurements can both aid in the prediction of interface movement, concentration measurements, especially when taken in proximity to the interface at locations where the interface is expected to move, are of greater worth than temperature measurements. Nevertheless, it was also demonstrated that pairs of temperature measurements, taken in strategic locations with respect to the interface, can also lead to more precise predictions of interface movement. Journal compilation © 2010 National Ground Water Association.

  18. Predicting Community Evolution in Social Networks

    Directory of Open Access Journals (Sweden)

    Stanisław Saganowski

    2015-05-01

Full Text Available Nowadays, sustained development of different social media can be observed worldwide. One of the relevant research domains intensively explored recently is the analysis of social communities existing in social media, as well as prediction of their future evolution taking into account collected historical evolution chains. These evolution chains proposed in the paper contain group states in the previous time frames and their historical transitions that were identified using one of two methods: Stable Group Changes Identification (SGCI) and Group Evolution Discovery (GED). Based on the observed evolution chains of various length, structural network features are extracted, validated and selected as well as used to learn classification models. The experimental studies were performed on three real datasets with different profile: DBLP, Facebook and Polish blogosphere. The process of group prediction was analysed with respect to different classifiers as well as various descriptive feature sets extracted from evolution chains of different length. The results revealed that, in general, the longer the evolution chains, the better the predictive abilities of the classification models. However, chains of length 3 to 7 enabled the GED-based method to almost reach its maximum possible prediction quality. For SGCI, this value was at the level of the 3–5 last periods.

  19. Genomic prediction using QTL derived from whole genome sequence data

    DEFF Research Database (Denmark)

    Brøndum, Rasmus Froberg; Su, Guosheng; Janss, Luc

    This study investigated the gain in accuracy of genomic prediction when a small number of significant variants from single marker analysis based on whole genome sequence data were added to the regular 54k SNP data. Analyses were performed for Nordic Holstein and Danish Jersey animals, using eithe...

  20. Determinants of work ability and its predictive value for disability

    NARCIS (Netherlands)

    S.M. Alavinia; A.G.E.M. de Boer; J.C. van Duivenbooden; M.H.W. Frings-Dresen; A. Burdorf

    2009-01-01

    Background Maintaining the ability of workers to cope with physical and psychosocial demands at work becomes increasingly important in prolonging working life. Aims To analyse the effects of work-related factors and individual characteristics on work ability and to determine the predictive value of

  1. Predicting travel time variability for cost-benefit analysis

    NARCIS (Netherlands)

    S. Peer; C. Koopmans; E.T. Verhoef

    2010-01-01

    Unreliable travel times cause substantial costs to travelers. Nevertheless, they are not taken into account in many cost-benefit-analyses (CBA), or only in very rough ways. This paper aims at providing simple rules on how variability can be predicted, based on travel time data from Dutch highways. T

  2. Predicting Alcohol, Cigarette, and Marijuana Use from Preferential Music Consumption

    Science.gov (United States)

    Oberle, Crystal D.; Garcia, Javier A.

    2015-01-01

    This study investigated whether use of alcohol, cigarettes, and marijuana may be predicted from preferential consumption of particular music genres. Undergraduates (257 women and 78 men) completed a questionnaire assessing these variables. Partial correlation analyses, controlling for sensation-seeking tendencies and behaviors, revealed that…

  3. Essays on Earnings Predictability

    DEFF Research Database (Denmark)

    Bruun, Mark

affect the accuracy of analysts’ earnings forecasts. Finally, the objective of the dissertation is to investigate how the stock market is affected by the accuracy of corporate earnings projections. The dissertation contributes to a deeper understanding of these issues. First, it is shown how earnings...... of analysts’ earnings forecasts. Furthermore, the dissertation shows how the stock market’s reaction to the disclosure of information about corporate earnings depends on how well corporate earnings can be predicted. The dissertation indicates that the stock market’s reaction to the disclosure of earnings...... forecasts are not more accurate than the simpler forecasts based on a historical time series of earnings. Secondly, the dissertation shows how accounting standards affect analysts’ earnings predictions. Accounting conservatism contributes to a more volatile earnings process, which lowers the accuracy...

  4. Predictive Hypothesis Identification

    CERN Document Server

    Hutter, Marcus

    2008-01-01

    While statistics focusses on hypothesis testing and on estimating (properties of) the true sampling distribution, in machine learning the performance of learning algorithms on future data is the primary issue. In this paper we bridge the gap with a general principle (PHI) that identifies hypotheses with best predictive performance. This includes predictive point and interval estimation, simple and composite hypothesis testing, (mixture) model selection, and others as special cases. For concrete instantiations we will recover well-known methods, variations thereof, and new ones. PHI nicely justifies, reconciles, and blends (a reparametrization invariant variation of) MAP, ML, MDL, and moment estimation. One particular feature of PHI is that it can genuinely deal with nested hypotheses.

  5. Predicting Lotto Numbers

    DEFF Research Database (Denmark)

    Jørgensen, Claus Bjørn; Suetens, Sigrid; Tyran, Jean-Robert

numbers based on recent drawings. While most players pick the same set of numbers week after week without regard to numbers drawn or anything else, we find that those who do change act on average in the way predicted by the law of small numbers as formalized in recent behavioral theory. In particular......We investigate the “law of small numbers” using a unique panel data set on lotto gambling. Because we can track individual players over time, we can measure how they react to outcomes of recent lotto drawings. We can therefore test whether they behave as if they believe they can predict lotto......, on average they move away from numbers that have recently been drawn, as suggested by the “gambler’s fallacy”, and move toward numbers that are on streak, i.e. have been drawn several weeks in a row, consistent with the “hot hand fallacy”....

  6. Chaos detection and predictability

    CERN Document Server

    Gottwald, Georg; Laskar, Jacques

    2016-01-01

    Distinguishing chaoticity from regularity in deterministic dynamical systems and specifying the subspace of the phase space in which instabilities are expected to occur is of utmost importance in as disparate areas as astronomy, particle physics and climate dynamics.   To address these issues there exists a plethora of methods for chaos detection and predictability. The most commonly employed technique for investigating chaotic dynamics, i.e. the computation of Lyapunov exponents, however, may suffer a number of problems and drawbacks, for example when applied to noisy experimental data.   In the last two decades, several novel methods have been developed for the fast and reliable determination of the regular or chaotic nature of orbits, aimed at overcoming the shortcomings of more traditional techniques. This set of lecture notes and tutorial reviews serves as an introduction to and overview of modern chaos detection and predictability techniques for graduate students and non-specialists.   The book cover...

  7. Prediction of the

    Directory of Open Access Journals (Sweden)

    Prasenjit Dey

    2016-06-01

Full Text Available The aerodynamic behavior of a square cylinder with rounded corner edges in the steady flow regime, in the range of Reynolds number (Re) 5–45, is predicted by an Artificial Neural Network (ANN) using MATLAB. The ANN was trained by the back-propagation algorithm. The ANN requires input and output data to train the network, which were obtained from the commercial Computational Fluid Dynamics (CFD) software FLUENT in the present study. In FLUENT, all the governing equations are discretized by the finite volume method. Results from the numerical simulation and the back-propagation-based ANN have been compared. It was found that the ANN predicts the aerodynamic behavior correctly within the given range of the training data. It is additionally observed that the back-propagation-based ANN is a more effective tool for forecasting the aerodynamic behavior than simulation, which has a much longer computational time.

  8. Predictability of Critical Transitions

    CERN Document Server

    Zhang, Xiaozhu; Hallerberg, Sarah

    2015-01-01

Critical transitions in multistable systems have been discussed as models for a variety of phenomena ranging from the extinctions of species to socio-economic changes and climate transitions between ice ages and warm ages. From bifurcation theory we can expect certain critical transitions to be preceded by a decreased recovery from external perturbations. The consequences of this critical slowing down have been observed as an increase in variance and autocorrelation prior to the transition. However, especially in the presence of noise, it is not clear whether these changes in observation variables are statistically relevant such that they could be used as indicators for critical transitions. In this contribution we investigate the predictability of critical transitions in conceptual models. We study the quadratic integrate-and-fire model and the van der Pol model under the influence of external noise. We focus especially on the statistical analysis of the success of predictions and the overall predictabil...

  9. Predicting Bankruptcy in Pakistan

    Directory of Open Access Journals (Sweden)

    Abdul RASHID

    2011-09-01

Full Text Available This paper aims to identify the financial ratios that are most significant in bankruptcy prediction for the non-financial sector of Pakistan, based on a sample of companies which became bankrupt over the period 1996-2006. Twenty-four financial ratios covering four important financial attributes, namely profitability, liquidity, leverage, and turnover ratios, were examined for a five-year period prior to bankruptcy. The discriminant analysis produced a parsimonious model of three variables, viz. sales to total assets, EBIT to current liabilities, and cash flow ratio. Our estimates provide evidence that the firms having a Z-value below zero fall into the “bankrupt” category, whereas the firms with a Z-value above zero fall into the “non-bankrupt” category. The model achieved 76.9% prediction accuracy when it is applied to forecast bankruptcies on the underlying sample.
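A minimal sketch of how such a three-ratio discriminant would be applied in practice: compute a linear Z-score from the ratios and classify by its sign. The coefficients and intercept below are invented for illustration and are not the paper's estimates:

```python
# Hypothetical application of a three-ratio bankruptcy discriminant.
# Coefficients and intercept are invented, not the paper's estimates.

COEFFICIENTS = {"sales_to_assets": 1.2, "ebit_to_curr_liab": 2.5, "cash_flow_ratio": 1.8}
INTERCEPT = -1.0  # hypothetical shift so that Z = 0 separates the classes

def z_score(ratios):
    """Linear discriminant score from the three financial ratios."""
    return INTERCEPT + sum(COEFFICIENTS[name] * value for name, value in ratios.items())

def classify(ratios):
    """Z above zero -> non-bankrupt, Z at or below zero -> bankrupt."""
    return "non-bankrupt" if z_score(ratios) > 0 else "bankrupt"

healthy = {"sales_to_assets": 1.1, "ebit_to_curr_liab": 0.4, "cash_flow_ratio": 0.3}
distressed = {"sales_to_assets": 0.3, "ebit_to_curr_liab": -0.2, "cash_flow_ratio": 0.05}
print(classify(healthy), classify(distressed))  # non-bankrupt bankrupt
```

In the actual study the coefficients come from fitting the discriminant to the 1996-2006 bankruptcy sample; only the decision rule (sign of Z) is reproduced here.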

  10. Urban pluvial flood prediction

    DEFF Research Database (Denmark)

    Thorndahl, Søren Liedtke; Nielsen, Jesper Ellerbæk; Jensen, David Getreuer

    2016-01-01

    historically and in real-time. There is a rather untested potential in real-time prediction of urban floods. In this paper radar data observations with different spatial and temporal resolution, radar nowcasts of 0–2 h lead time, and numerical weather models with lead times up to 24 h are used as inputs......Flooding produced by high-intensive local rainfall and drainage system capacity exceedance can have severe impacts in cities. In order to prepare cities for these types of flood events – especially in the future climate – it is valuable to be able to simulate these events numerically both...... to an integrated flood and drainage systems model in order to investigate the relative difference between different inputs in predicting future floods. The system is tested on a small town Lystrup in Denmark, which has been flooded in 2012 and 2014. Results show it is possible to generate detailed flood maps...

  11. Predictability of Solar Flares

    Science.gov (United States)

    Mares, Peter; Balasubramaniam, K. S.

    2009-05-01

Solar flares are significant drivers of space weather. With the availability of high cadence solar chromospheric and photospheric data from the USAF's Optical Solar PAtrol Network (OSPAN; photosphere and chromosphere imaging) Telescope and the Global Oscillations Network Group (GONG; photosphere magnetic imaging), at the National Solar Observatory, we have gained insights into potential uses of the data for solar flare prediction. We apply Principal Component Analysis (PCA) to parameterize the flaring system and extract consistent observables at solar chromospheric and photospheric layers that indicate a viable recognition of flaring activity. Rather than limiting ourselves to a few known indicators of solar activity, PCA helps us to characterize the entire system using several tens of variables for each observed layer. The components of the eigenvectors derived from PCA help us recognize and quantify innate characteristics of solar flares and compare them. We will present an analysis of these results to explore the viability of PCA to assist in predicting solar flares.
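The PCA step can be illustrated on synthetic data: centre the observation-by-parameter matrix, take an SVD, and read off the variance explained by the leading components. The data below is a made-up stand-in for the tens of image-derived solar-activity parameters the abstract mentions:

```python
# PCA on synthetic "observations": 100 samples of 5 correlated parameters
# generated from 2 latent factors, so two components should dominate.
import numpy as np

rng = np.random.default_rng(0)
latent = rng.normal(size=(100, 2))          # hidden drivers
mixing = rng.normal(size=(2, 5))            # how drivers map to parameters
X = latent @ mixing + 0.05 * rng.normal(size=(100, 5))  # observed matrix

Xc = X - X.mean(axis=0)                     # centre each parameter
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)             # variance fraction per component
scores = Xc @ Vt[:2].T                      # project onto the top 2 components
print(explained[:2].sum())                  # close to 1.0 for this rank-2 data
```

For flare prediction, the `scores` (and the eigenvector components themselves) would then be the compact features fed to whatever recognition step follows.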

  12. Predicting Lotto Numbers

    DEFF Research Database (Denmark)

    Jørgensen, Claus Bjørn; Suetens, Sigrid; Tyran, Jean-Robert

We investigate the “law of small numbers” using a unique panel data set on lotto gambling. Because we can track individual players over time, we can measure how they react to outcomes of recent lotto drawings. We can therefore test whether they behave as if they believe they can predict lotto...... numbers based on recent drawings. While most players pick the same set of numbers week after week without regard to numbers drawn or anything else, we find that those who do change act on average in the way predicted by the law of small numbers as formalized in recent behavioral theory. In particular......, on average they move away from numbers that have recently been drawn, as suggested by the “gambler’s fallacy”, and move toward numbers that are on streak, i.e. have been drawn several weeks in a row, consistent with the “hot hand fallacy”....

  13. Crystal structure and prediction.

    Science.gov (United States)

    Thakur, Tejender S; Dubey, Ritesh; Desiraju, Gautam R

    2015-04-01

    The notion of structure is central to the subject of chemistry. This review traces the development of the idea of crystal structure since the time when a crystal structure could be determined from a three-dimensional diffraction pattern and assesses the feasibility of computationally predicting an unknown crystal structure of a given molecule. Crystal structure prediction is of considerable fundamental and applied importance, and its successful execution is by no means a solved problem. The ease of crystal structure determination today has resulted in the availability of large numbers of crystal structures of higher-energy polymorphs and pseudopolymorphs. These structural libraries lead to the concept of a crystal structure landscape. A crystal structure of a compound may accordingly be taken as a data point in such a landscape.

  14. Comparing Spatial Predictions

    KAUST Repository

    Hering, Amanda S.

    2011-11-01

    Under a general loss function, we develop a hypothesis test to determine whether a significant difference in the spatial predictions produced by two competing models exists on average across the entire spatial domain of interest. The null hypothesis is that of no difference, and a spatial loss differential is created based on the observed data, the two sets of predictions, and the loss function chosen by the researcher. The test assumes only isotropy and short-range spatial dependence of the loss differential but does allow it to be non-Gaussian, non-zero-mean, and spatially correlated. Constant and nonconstant spatial trends in the loss differential are treated in two separate cases. Monte Carlo simulations illustrate the size and power properties of this test, and an example based on daily average wind speeds in Oklahoma is used for illustration. Supplemental results are available online. © 2011 American Statistical Association and the American Society for Quality.
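The test in this record is built on a loss differential between two competing prediction sets. Setting aside the spatial-correlation machinery that is the paper's actual contribution, the core quantity can be sketched in plain Python (iid case only; function names are mine):

```python
import math

def loss_differential_test(y, pred_a, pred_b,
                           loss=lambda yi, p: (yi - p) ** 2):
    """Mean loss differential between two prediction sets and its z statistic.

    d_i = loss(y_i, a_i) - loss(y_i, b_i); the null hypothesis is E[d] = 0.
    This simple version assumes independent d_i, unlike the spatially
    correlated case treated in the record above.
    """
    d = [loss(yi, a) - loss(yi, b) for yi, a, b in zip(y, pred_a, pred_b)]
    n = len(d)
    mean = sum(d) / n
    var = sum((di - mean) ** 2 for di in d) / (n - 1)   # sample variance
    return mean, mean / math.sqrt(var / n)
```

A significantly negative mean differential favours predictor A. Under spatial dependence, the variance in the denominator would have to be replaced by a long-run (e.g. HAC-type) estimate, which is what the record's test provides.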

  15. Predictive dynamic digital holography

    Science.gov (United States)

    Sulaiman, Sennan; Gibson, Steve; Spencer, Mark

    2016-09-01

    Digital holography has received recent attention for many imaging and sensing applications, including imaging through turbulent and turbid media, adaptive optics, three-dimensional projective display technology and optical tweezing. A significant obstacle for digital holography in real-time applications, such as wavefront sensing for high energy laser systems and high speed imaging for target tracking, is the fact that digital holography is computationally intensive; it requires iterative virtual wavefront propagation and hill-climbing to optimize some sharpness criteria. This paper demonstrates real-time methods for digital holography based on approaches developed recently at UCLA for optimal and adaptive identification, prediction, and control of optical wavefronts. The methods presented integrate minimum variance wavefront prediction into digital holography schemes to short-circuit the computationally intensive algorithms for iterative propagation of virtual wavefronts and hill-climbing for sharpness optimization.

  16. Multivariate respiratory motion prediction

    Science.gov (United States)

    Dürichen, R.; Wissel, T.; Ernst, F.; Schlaefer, A.; Schweikard, A.

    2014-10-01

    In extracranial robotic radiotherapy, tumour motion is compensated by tracking external and internal surrogates. To compensate system specific time delays, time series prediction of the external optical surrogates is used. We investigate whether the prediction accuracy can be increased by expanding the current clinical setup by an accelerometer, a strain belt and a flow sensor. Four previously published prediction algorithms are adapted to multivariate inputs—normalized least mean squares (nLMS), wavelet-based least mean squares (wLMS), support vector regression (SVR) and relevance vector machines (RVM)—and evaluated for three different prediction horizons. The measurement involves 18 subjects and consists of two phases, focusing on long term trends (M1) and breathing artefacts (M2). To select the most relevant and least redundant sensors, a sequential forward selection (SFS) method is proposed. Using a multivariate setting, the results show that the clinically used nLMS algorithm is susceptible to large outliers. In the case of irregular breathing (M2), the mean root mean square error (RMSE) of a univariate nLMS algorithm is 0.66 mm and can be decreased to 0.46 mm by a multivariate RVM model (best algorithm on average). To investigate the full potential of this approach, the optimal sensor combination was also estimated on the complete test set. The results indicate that a further decrease in RMSE is possible for RVM (to 0.42 mm). This motivates further research about sensor selection methods. Besides the optical surrogates, the sensors most frequently selected by the algorithms are the accelerometer and the strain belt. These sensors could be easily integrated in the current clinical setup and would allow a more precise motion compensation.
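The nLMS baseline that this record compares against is simple enough to sketch. The following is a generic normalized least-mean-squares predictor for a breathing-like trace, not the clinical implementation (tap count, step size and alignment are illustrative choices):

```python
def nlms_predict(signal, horizon=1, taps=4, mu=0.5, eps=1e-6):
    """Predict signal[t + horizon - 1] from the last `taps` samples with a
    normalized LMS adaptive filter; returns the prediction sequence."""
    w = [0.0] * taps
    preds = []
    for t in range(taps, len(signal) - horizon):
        u = signal[t - taps:t]                     # most recent samples
        y_hat = sum(wi * ui for wi, ui in zip(w, u))
        target = signal[t + horizon - 1]           # value to be predicted
        e = target - y_hat                         # prediction error
        norm = eps + sum(ui * ui for ui in u)      # input energy
        w = [wi + mu * e * ui / norm for wi, ui in zip(w, u)]
        preds.append(y_hat)
    return preds
```

On a clean periodic trace the filter converges quickly; the record's point is that on irregular breathing such univariate filters are outlier-prone, which is what motivates the multivariate RVM with additional sensors.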

  17. Predictive Game Theory

    Science.gov (United States)

    Wolpert, David H.

    2005-01-01

    Probability theory governs the outcome of a game; there is a distribution over mixed strategies, not a single "equilibrium". To predict a single mixed strategy, one must use a loss function external to the game's players. This provides a quantification of any strategy's rationality. We prove that rationality falls as the cost of computation rises (for players who have not previously interacted). All of this extends to games with varying numbers of players.

  18. Characterization of Mesoscale Predictability

    Science.gov (United States)

    2013-09-30

    assimilation to either create pairs of different initial conditions (Bei and Zhang 2007, Mapes et al. 2008) or to initialize a large ensemble (Durran et... curves over all wave numbers where the error had not yet saturated. Following the terminology suggested by Mapes et al. (2008), the evolution of... Mapes, B., S. Tulich, T. Nasuno, and M. Satoh, 2008: Predictability aspects of global aqua-planet simulations with explicit convection. J. Meteor. Soc

  19. Nominal Model Predictive Control

    OpenAIRE

    Grüne, Lars

    2014-01-01

    5 p., to appear in Encyclopedia of Systems and Control, Tariq Samad, John Baillieul (eds.); International audience; Model Predictive Control is a controller design method which synthesizes a sampled-data feedback controller from the iterative solution of open-loop optimal control problems. We describe the basic functionality of MPC controllers, their properties regarding feasibility, stability and performance, and the assumptions needed in order to rigorously ensure these properties in a nomina...

  20. Nominal model predictive control

    OpenAIRE

    Grüne, Lars

    2013-01-01

    5 p., to appear in Encyclopedia of Systems and Control, Tariq Samad, John Baillieul (eds.); International audience; Model Predictive Control is a controller design method which synthesizes a sampled-data feedback controller from the iterative solution of open-loop optimal control problems. We describe the basic functionality of MPC controllers, their properties regarding feasibility, stability and performance, and the assumptions needed in order to rigorously ensure these properties in a nomina...

  1. Predicting appointment breaking.

    Science.gov (United States)

    Bean, A G; Talaga, J

    1995-01-01

    The goal of physician referral services is to schedule appointments, but if too many patients fail to show up, the value of the service will be compromised. The authors found that appointment breaking can be predicted by the number of days to the scheduled appointment, the doctor's specialty, and the patient's age and gender. They also offer specific suggestions for modifying the marketing mix to reduce the incidence of no-shows.
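The record reports that no-shows can be predicted from days-to-appointment, specialty, age and gender, but gives no model details. A minimal, hypothetical logistic-regression sketch on synthetic data with a single lead-time predictor (everything below is illustrative, not the authors' model):

```python
import math

def fit_logistic(X, y, lr=0.5, epochs=3000):
    """Fit P(no-show | x) = sigmoid(w0 + w.x) by batch gradient descent."""
    n, m = len(X), len(X[0])
    w = [0.0] * (m + 1)                            # intercept first
    for _ in range(epochs):
        grad = [0.0] * (m + 1)
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            err = 1 / (1 + math.exp(-z)) - yi      # predicted minus actual
            grad[0] += err
            for j, xj in enumerate(xi):
                grad[j + 1] += err * xj
        w = [wj - lr * g / n for wj, g in zip(w, grad)]
    return w

def predict(w, xi):
    z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
    return 1 / (1 + math.exp(-z))

# Synthetic data: appointments booked further ahead are broken more often.
X = [[d / 30] for d in range(1, 31)]               # lead time, scaled
y = [1 if d > 15 else 0 for d in range(1, 31)]     # 1 = no-show
w = fit_logistic(X, y)
```

A positive fitted coefficient on lead time reproduces the record's finding that appointment breaking rises with the number of days to the scheduled appointment; the real model would add specialty, age and gender as further columns.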

  2. Prediction of Algebraic Instabilities

    Science.gov (United States)

    Zaretzky, Paula; King, Kristina; Hill, Nicole; Keithley, Kimberlee; Barlow, Nathaniel; Weinstein, Steven; Cromer, Michael

    2016-11-01

    A widely unexplored type of hydrodynamic instability is examined - large-time algebraic growth. Such growth occurs on the threshold of (exponentially) neutral stability. A new methodology is provided for predicting the algebraic growth rate of an initial disturbance, when applied to the governing differential equation (or dispersion relation) describing wave propagation in dispersive media. Several types of algebraic instabilities are explored in the context of both linear and nonlinear waves.

  3. Time-predictable architectures

    CERN Document Server

    Rochange, Christine; Uhrig, Sascha

    2014-01-01

    Building computers that can be used to design embedded real-time systems is the subject of this title. Real-time embedded software requires increasingly higher performance. The authors therefore consider processors that implement advanced mechanisms such as pipelining, out-of-order execution, branch prediction, cache memories, multi-threading, multicore architectures, etc. The authors of this book investigate the time predictability of such schemes.

  4. Kuiper Belt Occultation Predictions

    CERN Document Server

    Fraser, Wesley C; Trujillo, Chad; Stephens, Andrew W; Kavelaars, JJ; Brown, Michael E; Bianco, Federica B; Boyle, Richard P; Brucker, Melissa J; Hetherington, Nathan; Joner, Michael; Keel, William C; Langill, Phil P; Lister, Tim; McMillan, Russet J; Young, Leslie

    2013-01-01

    Here we present observations of 7 large Kuiper Belt Objects. From these observations, we extract a point source catalog with $\\sim0.01"$ precision, and astrometry of our target Kuiper Belt Objects with $0.04-0.08"$ precision within that catalog. We have developed a new technique to predict the future occurrence of stellar occultations by Kuiper Belt Objects. The technique makes use of a maximum likelihood approach which determines the best-fit adjustment to cataloged orbital elements of an object. Using simulations of a theoretical object, we discuss the merits and weaknesses of this technique compared to the commonly adopted ephemeris offset approach. We demonstrate that both methods suffer from separate weaknesses, and thus, together provide a fair assessment of the true uncertainty in a particular prediction. We present occultation predictions made by both methods for the 7 tracked objects, with dates as late as 2015. Finally, we discuss observations of three separate close passages of Quaoar to field star...

  5. Predicting Human Cooperation.

    Directory of Open Access Journals (Sweden)

    John J Nay

    Full Text Available The Prisoner's Dilemma has been a subject of extensive research due to its importance in understanding the ever-present tension between individual self-interest and social benefit. A strictly dominant strategy in a Prisoner's Dilemma (defection, when played by both players, is mutually harmful. Repetition of the Prisoner's Dilemma can give rise to cooperation as an equilibrium, but defection is as well, and this ambiguity is difficult to resolve. The numerous behavioral experiments investigating the Prisoner's Dilemma highlight that players often cooperate, but the level of cooperation varies significantly with the specifics of the experimental predicament. We present the first computational model of human behavior in repeated Prisoner's Dilemma games that unifies the diversity of experimental observations in a systematic and quantitatively reliable manner. Our model relies on data we integrated from many experiments, comprising 168,386 individual decisions. The model is composed of two pieces: the first predicts the first-period action using solely the structural game parameters, while the second predicts dynamic actions using both game parameters and history of play. Our model is successful not merely at fitting the data, but in predicting behavior at multiple scales in experimental designs not used for calibration, using only information about the game structure. We demonstrate the power of our approach through a simulation analysis revealing how to best promote human cooperation.

  6. Predictive Surface Complexation Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Sverjensky, Dimitri A. [Johns Hopkins Univ., Baltimore, MD (United States). Dept. of Earth and Planetary Sciences

    2016-11-29

    Surface complexation plays an important role in the equilibria and kinetics of processes controlling the compositions of soilwaters and groundwaters, the fate of contaminants in groundwaters, and the subsurface storage of CO2 and nuclear waste. Over the last several decades, many dozens of individual experimental studies have addressed aspects of surface complexation that have contributed to an increased understanding of its role in natural systems. However, there has been no previous attempt to develop a model of surface complexation that can be used to link all the experimental studies in order to place them on a predictive basis. Overall, my research has successfully integrated the results of the work of many experimentalists published over several decades. For the first time in studies of the geochemistry of the mineral-water interface, a practical predictive capability for modeling has become available. The predictive correlations developed in my research now enable extrapolations of experimental studies to provide estimates of surface chemistry for systems not yet studied experimentally and for natural and anthropogenically perturbed systems.

  7. Is Suicide Predictable?

    Directory of Open Access Journals (Sweden)

    S Asmaee

    2012-04-01

    Full Text Available Background: The current study aimed to test the hypothesis: is suicide predictable? And to classify the predictive factors in multiple suicide attempts. Methods: A cross-sectional study was administered to 223 multiple attempters, women who came to a medical poison centre after a suicide attempt. The participants were young, poor, and single. A logistic regression analysis was used to classify the predictive factors of suicide. Results: Women who had multiple suicide attempts exhibited a significant tendency to attempt suicide again. They had a history of multiple suicide attempts going back more than two years, from three to as many as 18 times, plus mental illnesses such as depression and substance abuse. They also had a positive family history of mental illness. Conclusion: Results indicate that contributing factors for another suicide attempt include previous suicide attempts, mental illness (depression), or a positive history of mental illnesses in the family affecting them at a young age, and substance abuse.

  8. Picturewise inter-view prediction selection for multiview video coding

    Science.gov (United States)

    Huo, Junyan; Chang, Yilin; Li, Ming; Yang, Haitao

    2010-11-01

    Inter-view prediction is introduced in multiview video coding (MVC) to exploit the inter-view correlation. Statistical analyses show that the coding gain obtained from inter-view prediction is unequal among pictures. On the basis of this observation, a picturewise inter-view prediction selection scheme is proposed. This scheme employs a novel inter-view prediction selection criterion to determine whether it is necessary to apply inter-view prediction to the current coding picture. This criterion is derived from the available coding information of the temporal reference pictures. Experimental results show that the proposed scheme can improve the performance of MVC with a comprehensive consideration of compression efficiency, computational complexity, and random access ability.

  9. Model-based uncertainty in species range prediction

    DEFF Research Database (Denmark)

    Pearson, R. G.; Thuiller, Wilfried; Bastos Araujo, Miguel;

    2006-01-01

    Aim Many attempts to predict the potential range of species rely on environmental niche (or 'bioclimate envelope') modelling, yet the effects of using different niche-based methodologies require further investigation. Here we investigate the impact that the choice of model can have on predictions...... day (using the area under the receiver operating characteristic curve (AUC) and kappa statistics) and by assessing consistency in predictions of range size changes under future climate (using cluster analysis). Results Our analyses show significant differences between predictions from different models......, with predicted changes in range size by 2030 differing in both magnitude and direction (e.g. from 92% loss to 322% gain). We explain differences with reference to two characteristics of the modelling techniques: data input requirements (presence/absence vs. presence-only approaches) and assumptions made by each...

  10. Multi-state models: metapopulation and life history analyses

    Directory of Open Access Journals (Sweden)

    Arnason, A. N.

    2004-06-01

    Full Text Available Multi-state models are designed to describe populations that move among a fixed set of categorical states. The obvious application is to population interchange among geographic locations such as breeding sites or feeding areas (e.g., Hestbeck et al., 1991; Blums et al., 2003; Cam et al., 2004), but they are increasingly used to address important questions of evolutionary biology and life history strategies (Nichols & Kendall, 1995). In these applications, the states include life history stages such as breeding states. The multi-state models, by permitting estimation of stage-specific survival and transition rates, can help assess trade-offs between life history mechanisms (e.g. Yoccoz et al., 2000). These trade-offs are also important in meta-population analyses where, for example, the pre- and post-breeding rates of transfer among sub-populations can be analysed in terms of target colony distance, density, and other covariates (e.g., Lebreton et al. 2003; Breton et al., in review). Further examples of the use of multi-state models in analysing dispersal and life-history trade-offs can be found in the session on Migration and Dispersal. In this session, we concentrate on applications that did not involve dispersal. These applications fall in two main categories: those that address life history questions using stage categories, and a more technical use of multi-state models to address problems arising from the violation of mark-recapture assumptions leading to the potential for seriously biased predictions or misleading insights from the models. Our plenary paper, by William Kendall (Kendall, 2004), gives an overview of the use of Multi-state Mark-Recapture (MSMR) models to address two such violations. The first is the occurrence of unobservable states that can arise, for example, from temporary emigration or by incomplete sampling coverage of a target population. Such states can also occur for life history reasons, such

  11. Fracture mechanics analyses of partial crack closure in shell structures

    Science.gov (United States)

    Zhao, Jun

    2007-12-01

    This thesis presents the theoretical and finite element analyses of crack-face closure behavior in shells and its effect on the stress intensity factor under a bending load condition. Various shell geometries, such as a spherical shell, a cylindrical shell containing an axial crack, a cylindrical shell containing a circumferential crack and a shell with double curvatures, are all studied. In addition, the influence of material orthotropy on the crack closure effect in shells is also considered. The theoretical formulation is developed based on the shallow shell theory of Delale and Erdogan, incorporating the effect of crack-face closure at the compressive edges. The line-contact assumption, simulating the crack-face closure at the compressive edges, is employed so that the contact force at the closure edges is introduced, which can be translated to the mid-plane of the shell, accompanied by an additional distributed bending moment. The unknown contact force is computed by solving a mixed-boundary value problem iteratively; that is, along the crack length, either the normal displacement of the crack face at the compressive edges is equal to zero or the contact pressure is equal to zero. It is found that due to the curvature effects crack closure may not always occur on the entire length of the crack, depending on the direction of the bending load and the geometry of the shell. The crack-face closure influences significantly the magnitude of the stress intensity factors; it increases the membrane component but decreases the bending component. The maximum stress intensity factor is reduced by the crack-face closure. The significant influence of geometry and material orthotropy on crack closure behavior in shells is also predicted based on the analytical solutions. Three-dimensional FEA is performed to validate the theoretical solutions. It demonstrates that the crack-face closure actually occurs over an area, not on a line, but the theoretical solutions of the stress intensity

  12. The cosmic dust analyser onboard cassini: ten years of discoveries

    Science.gov (United States)

    Srama, R.; Kempf, S.; Moragas-Klostermeyer, G.; Altobelli, N.; Auer, S.; Beckmann, U.; Bugiel, S.; Burton, M.; Economomou, T.; Fechtig, H.; Fiege, K.; Green, S. F.; Grande, M.; Havnes, O.; Hillier, J. K.; Helfert, S.; Horanyi, M.; Hsu, S.; Igenbergs, E.; Jessberger, E. K.; Johnson, T. V.; Khalisi, E.; Krüger, H.; Matt, G.; Mocker, A.; Lamy, P.; Linkert, G.; Lura, F.; Möhlmann, D.; Morfill, G. E.; Otto, K.; Postberg, F.; Roy, M.; Schmidt, J.; Schwehm, G. H.; Spahn, F.; Sterken, V.; Svestka, J.; Tschernjawski, V.; Grün, E.; Röser, H.-P.

    2011-12-01

    The interplanetary space probe Cassini/Huygens reached Saturn in July 2004 after 7 years of cruise phase. The German cosmic dust analyser (CDA) was developed under the leadership of the Max Planck Institute for Nuclear Physics in Heidelberg under the support of the DLR e.V. This instrument measures the interplanetary, interstellar and planetary dust in our solar system since 1999 and provided unique discoveries. In 1999, CDA detected interstellar dust in the inner solar system followed by the detection of electrical charges of interplanetary dust grains during the cruise phase between Earth and Jupiter. The instrument determined the composition of interplanetary dust and the nanometre-sized dust streams originating from Jupiter's moon Io. During the approach to Saturn in 2004, similar streams of submicron grains with speeds in the order of 100 km/s were detected from Saturn's inner and outer ring system and are released to the interplanetary magnetic field. Since 2004 CDA measured more than one million dust impacts characterising the dust environment of Saturn. The instrument is one of the three experiments which discovered the active ice geysers located at the south pole of Saturn's moon Enceladus in 2005. Later, a detailed compositional analysis of the water ice grains in Saturn's E ring system led to the discovery of large reservoirs of liquid water (oceans) below the icy crust of Enceladus. Finally, the determination of the dust-magnetosphere interaction and the discovery of the extended E ring (at least twice as large as predicted) allowed the definition of a dynamical dust model of Saturn's E ring describing the observed properties. This paper summarizes the discoveries of a 10-year story of success based on reliable measurements with the most advanced dust detector flown in space until today. This paper focuses on cruise results and findings achieved at Saturn with a focus on flux and density measurements. 
CDA discoveries related to the detailed dust stream

  13. Static and Dynamic Mechanical Analyses for the Vacuum Vessel of EAST Superconducting Tokamak Device

    Science.gov (United States)

    Song, Yuntao; Yao, Damao; Du, Shijun; Wu, Songtao; Weng, Peide

    2006-03-01

    EAST (Experimental Advanced Superconducting Tokamak) is an advanced steady-state plasma physics experimental device, which is being constructed as the Chinese National Nuclear Fusion Research Project. During plasma operation the vacuum vessel, as one of the key components, will withstand the electromagnetic forces due to plasma disruption, the halo current and toroidal field coil quench, the pressure of borated water, and the thermal load due to 250 °C baking by pressurized nitrogen gas. In this paper the static and dynamic mechanical analyses of the vacuum vessel are reported. First, the applied loads on the vacuum vessel are given and the static stress distributions under the gravitational, pressure, electromagnetic and thermal loads are investigated. Then a series of primary dynamic, buckling and fatigue life analyses are performed to predict the structure's dynamic behavior. A seismic analysis was also conducted.

  14. Fully plastic crack opening analyses of complex-cracked pipes for Ramberg-Osgood materials

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Jae Uk; Choi, Jae Boong [Sungkyunkwan University, Suwon (Korea, Republic of); Huh, Nam Su [Seoul National University, Seoul (Korea, Republic of); Kim, Yun Jae [Korea University, Seoul (Korea, Republic of)

    2016-04-15

    The plastic influence functions for calculating the fully plastic crack opening displacement (COD) of complex-cracked pipes were newly proposed based on systematic three-dimensional (3-D) elastic-plastic finite element (FE) analyses using the Ramberg-Osgood (R-O) relation, where global bending moment, axial tension and internal pressure are considered separately as loading conditions. Then, crack opening analyses were performed based on the GE/EPRI concept by using the new plastic influence functions for complex-cracked pipes made of SA376 TP304 stainless steel, and the predicted CODs were compared with FE results based on the deformation plasticity theory of tensile material behavior. From the comparison, confidence in the proposed fully plastic crack opening solutions for complex-cracked pipes was gained. Therefore, the proposed engineering scheme for COD estimation using the new plastic influence functions can be utilized to estimate the leak rate of a complex-cracked pipe for an R-O material.

  15. Measurement of the analysing power in proton–proton elastic scattering at small angles

    Directory of Open Access Journals (Sweden)

    Z. Bagdasarian

    2014-12-01

    Full Text Available The proton analysing power in p→p elastic scattering has been measured at small angles at COSY-ANKE at 796 MeV and five other beam energies between 1.6 and 2.4 GeV using a polarised proton beam. The asymmetries obtained by detecting the fast proton in the ANKE forward detector or the slow recoil proton in a silicon tracking telescope are completely consistent. Although the analysing power results agree well with the many published data at 796 MeV, and also with the most recent partial wave solution at this energy, the ANKE data at the higher energies lie well above the predictions of this solution at small angles. An updated phase shift analysis that uses the ANKE results together with the World data leads to a much better description of these new measurements.

  16. Sensitivity analyses of biodiesel thermo-physical properties under diesel engine conditions

    DEFF Research Database (Denmark)

    Cheng, Xinwei; Ng, Hoon Kiat; Gan, Suyin

    2016-01-01

    This reported work investigates the sensitivities of spray and soot developments to the change of thermo-physical properties for coconut and soybean methyl esters, using two-dimensional computational fluid dynamics fuel spray modelling. The choice of test fuels made was due to their contrasting...... saturation-unsaturation compositions. The sensitivity analyses for non-reacting and reacting sprays were carried out against a total of 12 thermo-physical properties, at an ambient temperature of 900 K and density of 22.8 kg/m3. For the sensitivity analyses, all the thermo-physical properties were set...... as the baseline case and each property was individually replaced by that of diesel. The significance of individual thermo-physical property was determined based on the deviations found in predictions such as liquid penetration, ignition delay period and peak soot concentration when compared to those of baseline...
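The sensitivity procedure described in this record is one-at-a-time substitution: each biodiesel thermo-physical property is individually replaced by the diesel value and the resulting change in a predicted output is recorded. A generic sketch of that screening loop (property names, values and the toy response model are invented for illustration, standing in for the CFD spray model):

```python
def one_at_a_time_sensitivity(baseline, reference, model):
    """For each property, swap in the reference (e.g. diesel) value while
    holding the rest at baseline, and record the change in model output."""
    base_out = model(baseline)
    deltas = {}
    for name in baseline:
        trial = dict(baseline)
        trial[name] = reference[name]          # replace one property only
        deltas[name] = model(trial) - base_out
    return deltas

# Toy response standing in for a spray/soot prediction (not the CFD model):
biodiesel = {"density": 870.0, "viscosity": 4.0}
diesel = {"density": 830.0, "viscosity": 2.5}
toy_model = lambda p: 100.0 / p["viscosity"] + 0.1 * p["density"]
deltas = one_at_a_time_sensitivity(biodiesel, diesel, toy_model)
```

Properties producing the largest deviations in outputs such as liquid penetration, ignition delay or peak soot concentration would then be flagged as the most influential, which is the comparison the record reports.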

  17. The predicted secretome of Lactobacillus plantarum WCFS1 sheds light on interactions with its environment

    NARCIS (Netherlands)

    Boekhorst, J.; Wels, M.; Kleerebezem, M.; Siezen, R.J.

    2006-01-01

    The predicted extracellular proteins of the bacterium Lactobacillus plantarum were analysed to gain insight into the mechanisms underlying interactions of this bacterium with its environment. Extracellular proteins play important roles in processes ranging from probiotic effects in the gastrointesti

  18. Agreements on the working environment - an analysis of selected collective agreements

    DEFF Research Database (Denmark)

    Petersen, Jens Voxtrup; Wiegmann, Inger-Marie; Vogt-Nielsen, Karl

    An analysis of the significance of collective agreements for the working environment in industry, slaughterhouses, cleaning, the green sector, hotels and restaurants, and bus service.

  19. 36 CFR 228.102 - Leasing analyses and decisions.

    Science.gov (United States)

    2010-07-01

    ... 36 Parks, Forests, and Public Property 2 2010-07-01 2010-07-01 false Leasing analyses and... AGRICULTURE MINERALS Oil and Gas Resources Leasing § 228.102 Leasing analyses and decisions. (a) Compliance with the National Environmental Policy Act of 1969. In analyzing lands for leasing, the...

  20. 41 CFR 101-27.208 - Inventory analyses.

    Science.gov (United States)

    2010-07-01

    ... 41 Public Contracts and Property Management 2 2010-07-01 2010-07-01 true Inventory analyses. 101... Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 27-INVENTORY MANAGEMENT 27.2-Management of Shelf-Life Materials § 101-27.208 Inventory analyses. (a) An inventory analysis shall...

  1. Treatment of Pica through Multiple Analyses of Its Reinforcing Functions.

    Science.gov (United States)

    Piazza, Cathleen C.; Fisher, Wayne W.; Hanley, Gregory P.; LeBlanc, Linda A.; Worsdell, April S.; And Others

    1998-01-01

    A study conducted functional analyses of the pica of three young children. The pica of one participant was maintained by automatic reinforcement; that of the other two was multiply-controlled by social and automatic reinforcement. Preference and treatment analyses were used to address the automatic function of the pica. (Author/CR)

  2. On the reproducibility of meta-analyses : six practical recommendations

    NARCIS (Netherlands)

    Lakens, D.; Hilgard, J.; Staaks, J.

    2016-01-01

    Meta-analyses play an important role in cumulative science by combining information across multiple studies and attempting to provide effect size estimates corrected for publication bias. Research on the reproducibility of meta-analyses reveals that errors are common, and the percentage of effect si

  3. Recent Trends in Conducting School-Based Experimental Functional Analyses

    Science.gov (United States)

    Carter, Stacy L.

    2009-01-01

    Demonstrations of school-based experimental functional analyses have received limited attention within the literature. School settings present unique practical and ethical concerns related to the implementation of experimental analyses which were originally developed within clinical settings. Recent examples have made definite contributions toward…

  4. What can we do about exploratory analyses in clinical trials?

    Science.gov (United States)

    Moyé, Lem

    2015-11-01

The research community has alternately embraced and then repudiated exploratory analyses since the inception of clinical trials in the middle of the twentieth century. After a series of important but ultimately unreproducible findings, these non-prospectively declared evaluations were relegated to hypothesis generation. Since the majority of evaluations conducted in clinical trials, with their rich data sets, are exploratory, the absence of their persuasive power adds to the inefficiency of clinical trial analyses in an atmosphere of fiscal frugality. However, the principal argument against exploratory analyses is not based in statistical theory, but in pragmatism and observation. The absence of any theoretical treatment of exploratory analyses postpones the day when their statistical weaknesses might be repaired. Here, we introduce an examination of the characteristics of exploratory analyses from a probabilistic and statistical framework. Setting the obvious logistical concerns aside (i.e., the absence of planning produces poor precision), exploratory analyses do not appear to suffer from estimation theory weaknesses. The problem appears to be a difficulty in what is actually reported as the p-value. The use of Bayes Theorem provides p-values that are more in line with confirmatory analyses. This development may inaugurate a body of work that would lead to the readmission of exploratory analyses to a position of persuasive power in clinical trials.
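The Bayesian recalibration of exploratory p-values that the abstract alludes to can be illustrated with the Sellke-Bayarri-Berger bound, which converts a p-value into a lower bound on the posterior probability of the null hypothesis. This is a generic sketch of that well-known bound, not the author's actual derivation:

```python
import math

def bayes_factor_lower_bound(p):
    """Sellke-Bayarri-Berger lower bound on the Bayes factor in favor
    of H0, valid for 0 < p < 1/e: BF(H0 vs H1) >= -e * p * ln(p)."""
    if not 0 < p < 1 / math.e:
        raise ValueError("bound applies for 0 < p < 1/e")
    return -math.e * p * math.log(p)

def posterior_prob_h0(p, prior_h0=0.5):
    """Lower bound on P(H0 | data) implied by the Bayes factor bound."""
    bf = bayes_factor_lower_bound(p)          # evidence in favor of H0
    prior_odds = prior_h0 / (1 - prior_h0)
    post_odds = prior_odds * bf
    return post_odds / (1 + post_odds)

# A 'significant' exploratory p = 0.05 still leaves a substantial
# posterior probability that H0 is true (about 0.29 at even prior odds):
print(round(posterior_prob_h0(0.05), 3))
```

This kind of calculation makes concrete why a nominal exploratory p-value overstates the evidence against the null.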

  5. Integrating and scheduling an open set of static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Mezini, Mira; Kloppenburg, Sven

    2006-01-01

    To improve the productivity of the development process, more and more tools for static software analysis are tightly integrated into the incremental build process of an IDE. If multiple interdependent analyses are used simultaneously, the coordination between the analyses becomes a major obstacle...

  6. The EADGENE and SABRE post-analyses workshop

    DEFF Research Database (Denmark)

    Jaffrezic, Florence; Hedegaard, Jakob; Sancristobal, Magali;

    2009-01-01

on the statistical analyses of a microarray experiment (i.e. getting a gene list), the subsequent analysis of the gene list is still an area of much confusion to many scientists. During a three-day workshop in November 2008, we discussed five aspects of these so-called post analyses of microarray data: 1) re...

  7. 9 CFR 590.580 - Laboratory tests and analyses.

    Science.gov (United States)

    2010-01-01

    ... 9 Animals and Animal Products 2 2010-01-01 2010-01-01 false Laboratory tests and analyses. 590.580... EGG PRODUCTS INSPECTION INSPECTION OF EGGS AND EGG PRODUCTS (EGG PRODUCTS INSPECTION ACT) Laboratory § 590.580 Laboratory tests and analyses. The official plant, at their expense, shall make tests...

  8. Training Residential Staff to Conduct Trial-Based Functional Analyses

    Science.gov (United States)

    Lambert, Joseph M.; Bloom, Sarah E.; Kunnavatana, S. Shanun; Collins, Shawnee D.; Clay, Casey J.

    2013-01-01

    We taught 6 supervisors of a residential service provider for adults with developmental disabilities to train 9 house managers to conduct trial-based functional analyses. Effects of the training were evaluated with a nonconcurrent multiple baseline. Results suggest that house managers can be trained to conduct trial-based functional analyses with…

  9. Restricted versus Unrestricted Learning: Synthesis of Recent Meta-Analyses

    Science.gov (United States)

    Johnson, Genevieve

    2007-01-01

Meta-analysis is a method of quantitatively summarizing the results of experimental research. This article summarizes four meta-analyses published since 2003 that compare the effect of DE and traditional education (TE) on student learning. Despite limitations, synthesis of these meta-analyses establishes, at the very least, equivalent learning…

  10. Integral analyses of fission product retention at mitigated thermally-induced SGTR using ARTIST experimental data

    Energy Technology Data Exchange (ETDEWEB)

    Rýdl, Adolf, E-mail: adolf.rydl@psi.ch; Lind, Terttaliisa, E-mail: terttaliisa.lind@psi.ch; Birchley, Jonathan, E-mail: jonathan.birchley@psi.ch

    2016-02-15

Highlights: • Source term analyses in a PWR of mitigated thermally-induced SGTR scenario performed. • Experimental ARTIST program results on aerosol scrubbing efficiency used in analyses. • Results demonstrate enhanced aerosol retention in a flooded steam generator. • High aerosol retention cannot be predicted by current theoretical scrubbing models. - Abstract: Integral source-term analyses are performed using MELCOR for a PWR Station Blackout (SBO) sequence leading to induced steam generator tube rupture (SGTR). In the absence of any mitigation measures, such a sequence can result in a containment bypass in which radioactive materials are released directly to the environment. In some SGTR scenarios, flooding of the faulted SG secondary side with water can mitigate both the accident escalation and the release of aerosol-borne and volatile radioactive materials. Data on the efficiency of aerosol scrubbing in an SG tube bundle were obtained in the international ARTIST project. In this paper, ARTIST data are used directly with parametric MELCOR analyses of a mitigated SGTR sequence to provide more realistic estimates of the releases to the environment in this and similar types of scenario. Comparison is made with predictions using the default scrubbing model in MELCOR, as a representative of the aerosol scrubbing models in current integral codes. Specifically, simulations are performed for an unmitigated sequence and two cases in which the SG secondary was refilled at different times after the tube rupture. The results, reflecting the experimental observations from ARTIST, demonstrate enhanced aerosol retention under the highly turbulent two-phase flow conditions caused by the complex geometry of the SG secondary side. This effect is not captured by any of the models currently available. The underlying physics remains only partly understood, indicating a need for further studies to support a more mechanistic treatment of the retention process.

  11. Association between Adult Height and Risk of Colorectal, Lung, and Prostate Cancer: Results from Meta-analyses of Prospective Studies and Mendelian Randomization Analyses

    Science.gov (United States)

    Khankari, Nikhil K.; Shu, Xiao-Ou; Wen, Wanqing; Kraft, Peter; Lindström, Sara; Peters, Ulrike; Schildkraut, Joellen; Schumacher, Fredrick; Bofetta, Paolo; Risch, Angela; Bickeböller, Heike; Amos, Christopher I.; Easton, Douglas; Gruber, Stephen B.; Haiman, Christopher A.; Hunter, David J.; Chanock, Stephen J.; Pierce, Brandon L.; Zheng, Wei

    2016-01-01

    Background Observational studies examining associations between adult height and risk of colorectal, prostate, and lung cancers have generated mixed results. We conducted meta-analyses using data from prospective cohort studies and further carried out Mendelian randomization analyses, using height-associated genetic variants identified in a genome-wide association study (GWAS), to evaluate the association of adult height with these cancers. Methods and Findings A systematic review of prospective studies was conducted using the PubMed, Embase, and Web of Science databases. Using meta-analyses, results obtained from 62 studies were summarized for the association of a 10-cm increase in height with cancer risk. Mendelian randomization analyses were conducted using summary statistics obtained for 423 genetic variants identified from a recent GWAS of adult height and from a cancer genetics consortium study of multiple cancers that included 47,800 cases and 81,353 controls. For a 10-cm increase in height, the summary relative risks derived from the meta-analyses of prospective studies were 1.12 (95% CI 1.10, 1.15), 1.07 (95% CI 1.05, 1.10), and 1.06 (95% CI 1.02, 1.11) for colorectal, prostate, and lung cancers, respectively. Mendelian randomization analyses showed increased risks of colorectal (odds ratio [OR] = 1.58, 95% CI 1.14, 2.18) and lung cancer (OR = 1.10, 95% CI 1.00, 1.22) associated with each 10-cm increase in genetically predicted height. No association was observed for prostate cancer (OR = 1.03, 95% CI 0.92, 1.15). Our meta-analysis was limited to published studies. The sample size for the Mendelian randomization analysis of colorectal cancer was relatively small, thus affecting the precision of the point estimate. 
Conclusions Our study provides evidence for a potential causal association of adult height with the risk of colorectal and lung cancers and suggests that certain genetic factors and biological pathways affecting adult height may also affect the
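Mendelian randomization estimates of the kind reported above are commonly obtained with an inverse-variance-weighted (IVW) combination of per-variant effects. The following is a generic sketch with invented summary statistics for three hypothetical height variants, not the study's 423-variant data:

```python
import numpy as np

def ivw_mr(beta_exposure, beta_outcome, se_outcome):
    """Inverse-variance-weighted MR estimate: a weighted regression of
    variant-outcome effects on variant-exposure effects through the origin."""
    bx = np.asarray(beta_exposure, float)
    by = np.asarray(beta_outcome, float)
    w = 1.0 / np.asarray(se_outcome, float) ** 2
    est = np.sum(w * bx * by) / np.sum(w * bx ** 2)   # causal effect per unit exposure
    se = np.sqrt(1.0 / np.sum(w * bx ** 2))           # first-order standard error
    return est, se

# Hypothetical per-variant summary statistics (3 height SNPs):
bx = [0.08, 0.05, 0.11]      # SNP -> height effect (per allele)
by = [0.036, 0.021, 0.052]   # SNP -> log-odds of disease
se = [0.010, 0.012, 0.009]   # standard errors of the outcome effects
est, se_est = ivw_mr(bx, by, se)
```

Exponentiating `est` (scaled to a 10-cm height increment) would give an odds ratio comparable to those quoted in the abstract.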

  12. A modified Lee-Carter model for analysing short-base-period data.

    Science.gov (United States)

    Zhao, Bojuan Barbara

    2012-03-01

    This paper introduces a new modified Lee-Carter model for analysing short-base-period mortality data, for which the original Lee-Carter model produces severely fluctuating predicted age-specific mortality. Approximating the unknown parameters in the modified model by linearized cubic splines and other additive functions, the model can be simplified into a logistic regression when fitted to binomial data. The expected death rate estimated from the modified model is smooth, not only over ages but also over years. The analysis of mortality data in China (2000-08) demonstrates the advantages of the new model over existing models.
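The original Lee-Carter model that this paper modifies decomposes log death rates as log m(x,t) = a_x + b_x·k_t, conventionally fitted by a rank-1 SVD. A minimal illustration on synthetic data follows; this is the classic fit, not the paper's modified spline-based model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic log death rates: ages x years, with a downward time trend
ages, years = 10, 15
a_true = np.linspace(-6.0, -2.0, ages)        # age pattern
k_true = np.linspace(1.0, -1.0, years)        # declining mortality index
b_true = np.full(ages, 1.0 / ages)            # age sensitivities
log_m = a_true[:, None] + b_true[:, None] * k_true[None, :]
log_m += rng.normal(0.0, 0.01, size=log_m.shape)   # observation noise

# Lee-Carter fit: a_x = row means; (b_x, k_t) from the rank-1 SVD of residuals
a_hat = log_m.mean(axis=1)
resid = log_m - a_hat[:, None]
U, s, Vt = np.linalg.svd(resid, full_matrices=False)
b_hat = U[:, 0]
k_hat = s[0] * Vt[0]

# Standard identification constraints: sum(b) = 1, sum(k) = 0
k_hat = k_hat * b_hat.sum()
b_hat = b_hat / b_hat.sum()
k_hat = k_hat - k_hat.mean()
```

Forecasting then reduces to extrapolating the single time index k_t, which is exactly where short base periods make the original model fluctuate.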

  13. Entropy analyses of spatiotemporal synchronizations in brain signals from patients with focal epilepsies

    CERN Document Server

    Tuncay, Caglar

    2010-01-01

The electroencephalographic (EEG) data recorded intracerebrally from 20 epileptic humans with different brain origins of focal epilepsies or types of seizures, ages and sexes are investigated (nearly 700 million data points). Multi-channel univariate amplitude analyses are performed, and it is shown that time-dependent Shannon entropies can be used to predict focal epileptic seizure onsets in different epileptogenic brain zones of different patients. The formation and time evolution of synchronizations in the brain signals from epileptogenic or non-epileptogenic areas of the patients in ictal or inter-ictal intervals are further investigated employing spatial or temporal differences of the entropies.
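Time-dependent Shannon entropy of the kind described above can be estimated by histogramming signal amplitudes in a sliding window. This is a generic sketch; the window length, step, and bin count are arbitrary choices, not the paper's settings:

```python
import numpy as np

def sliding_shannon_entropy(signal, win=256, step=128, bins=32):
    """Shannon entropy (bits) of the amplitude distribution in each
    sliding window of a 1-D signal."""
    signal = np.asarray(signal, float)
    out = []
    for start in range(0, len(signal) - win + 1, step):
        counts, _ = np.histogram(signal[start:start + win], bins=bins)
        p = counts / counts.sum()
        p = p[p > 0]                         # ignore empty bins
        out.append(float(-(p * np.log2(p)).sum()))
    return np.array(out)

# Entropy drops as the signal becomes more ordered: broadband noise has
# high amplitude entropy, a flat (constant) segment has zero entropy.
rng = np.random.default_rng(1)
noise = rng.normal(size=2048)
flat = np.zeros(2048)
h = sliding_shannon_entropy(np.concatenate([noise, flat]))
```

A sustained fall in this entropy trace is the sort of feature the paper exploits as a precursor of hypersynchronous seizure activity.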

  14. Uncertainty and Sensitivity Analyses Plan. Draft for Peer Review: Hanford Environmental Dose Reconstruction Project

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, J.C.; Ramsdell, J.V. Jr.

    1993-04-01

Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project.
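Uncertainty and sensitivity analysis of the kind such a plan documents can be sketched as Monte Carlo propagation of parameter distributions plus a correlation-based sensitivity ranking. The toy three-parameter dose model below is purely illustrative and bears no relation to the actual HEDRIC codes:

```python
import numpy as np

rng = np.random.default_rng(3)

def dose_model(release, dispersion, uptake):
    """Toy multiplicative stand-in for a dose-reconstruction model."""
    return release * dispersion * uptake

# Uncertainty analysis: propagate assumed parameter distributions
n = 10_000
release = rng.lognormal(mean=0.0, sigma=0.5, size=n)
dispersion = rng.uniform(0.1, 0.3, size=n)
uptake = rng.normal(1.0, 0.1, size=n)
doses = dose_model(release, dispersion, uptake)
lo, hi = np.percentile(doses, [5, 95])       # uncertainty band on the dose

# Simple sensitivity ranking: |correlation| of each input with the output
sens = {name: abs(np.corrcoef(x, doses)[0, 1])
        for name, x in [("release", release),
                        ("dispersion", dispersion),
                        ("uptake", uptake)]}
```

The input with the largest relative spread (here, the release term) dominates the output uncertainty, which is the basic insight a sensitivity ranking is meant to surface.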

  15. Flutter and Forced Response Analyses of Cascades using a Two-Dimensional Linearized Euler Solver

    Science.gov (United States)

    Reddy, T. S. R.; Srivastava, R.; Mehmed, O.

    1999-01-01

Flutter and forced response analyses for a cascade of blades in subsonic and transonic flow are presented. The structural model for each blade is a typical section with bending and torsion degrees of freedom. The unsteady aerodynamic forces due to bending and torsion motions, and due to a vortical gust disturbance, are obtained by solving the unsteady linearized Euler equations. The unsteady linearized equations are obtained by linearizing the unsteady nonlinear equations about the steady flow. The predicted unsteady aerodynamic forces include the effect of steady aerodynamic loading due to airfoil shape, thickness and angle of attack. The aeroelastic equations are solved in the frequency domain by coupling the unsteady aerodynamic forces to the aeroelastic solver MISER. The present unsteady aerodynamic solver showed good correlation with published results for both flutter and forced response predictions. Further improvements are required before the unsteady aerodynamic solver can be used in a design cycle.

  16. Application of Rapid Visco Analyser (RVA) viscograms and chemometrics for maize hardness characterisation.

    Science.gov (United States)

    Guelpa, Anina; Bevilacqua, Marta; Marini, Federico; O'Kennedy, Kim; Geladi, Paul; Manley, Marena

    2015-04-15

It has been established in this study that the Rapid Visco Analyser (RVA) can describe maize hardness, irrespective of the RVA profile, when used in association with appropriate multivariate data analysis techniques. The RVA can therefore complement or replace current and/or conventional methods as a hardness descriptor. Hardness modelling based on RVA viscograms was carried out using seven conventional hardness methods (hectoliter mass (HLM), hundred kernel mass (HKM), particle size index (PSI), percentage vitreous endosperm (%VE), protein content, percentage chop (%chop) and near infrared (NIR) spectroscopy) as references and three different RVA profiles (hard, soft and standard) as predictors. An approach using locally weighted partial least squares (LW-PLS) was followed to build the regression models. The resulting prediction errors (root mean square error of cross-validation (RMSECV) and root mean square error of prediction (RMSEP)) for the quantification of hardness values were always lower than, or of the same order as, the laboratory error of the reference method.
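RMSECV and RMSEP as used above are generic error measures. The sketch below computes a leave-one-out RMSECV for a plain least-squares model on invented calibration data; ordinary least squares is an illustrative stand-in, not the paper's locally weighted PLS:

```python
import numpy as np

def rmse(y, yhat):
    """Root mean square error."""
    return float(np.sqrt(np.mean((np.asarray(y) - np.asarray(yhat)) ** 2)))

def loo_rmsecv(X, y):
    """Leave-one-out RMSECV for an ordinary least-squares model:
    each sample is predicted by a model fitted on all other samples."""
    X, y = np.asarray(X, float), np.asarray(y, float)
    preds = np.empty_like(y)
    for i in range(len(y)):
        mask = np.arange(len(y)) != i
        coef, *_ = np.linalg.lstsq(X[mask], y[mask], rcond=None)
        preds[i] = X[i] @ coef
    return rmse(y, preds)

# Hypothetical calibration data: hardness index vs. two viscogram features
rng = np.random.default_rng(2)
X = np.column_stack([np.ones(40), rng.normal(size=(40, 2))])
y = X @ np.array([1.0, 0.5, -0.3]) + rng.normal(0.0, 0.1, size=40)
cv_err = loo_rmsecv(X, y)   # RMSECV; RMSEP would use a held-out test set
```

Comparing `cv_err` against the known reference-method error is exactly the acceptance criterion the abstract describes.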

  17. Predicting Lotto Numbers

    DEFF Research Database (Denmark)

    Suetens, Sigrid; Galbo-Jørgensen, Claus B.; Tyran, Jean-Robert Karl

    2016-01-01

We investigate the 'law of small numbers' using a data set on lotto gambling that allows us to measure players' reactions to draws. While most players pick the same set of numbers week after week, we find that those who do change react on average as predicted by the law of small numbers as formalized in recent behavioral theory. In particular, players tend to bet less on numbers that have been drawn in the preceding week, as suggested by the 'gambler's fallacy', and bet more on a number if it was frequently drawn in the recent past, consistent with the 'hot-hand fallacy'.

  18. Towards Predictive Association Theories

    DEFF Research Database (Denmark)

    Kontogeorgis, Georgios; Tsivintzelis, Ioannis; Michelsen, Michael Locht

    2011-01-01

    Association equations of state like SAFT, CPA and NRHB have been previously applied to many complex mixtures. In this work we focus on two of these models, the CPA and the NRHB equations of state and the emphasis is on the analysis of their predictive capabilities for a wide range of applications...... and water–MEG–aliphatic hydrocarbons LLE using interaction parameters obtained from the binary data alone. Moreover, it is demonstrated that the NRHB equation of state is a versatile tool which can be employed equally well to mixtures with pharmaceuticals and solvents, including mixed solvents, as well...

  19. Chloride ingress prediction

    DEFF Research Database (Denmark)

    Frederiksen, Jens Mejer; Geiker, Mette Rica

    2008-01-01

    Prediction of chloride ingress into concrete is an important part of durability design of reinforced concrete structures exposed to chloride containing environment. This paper presents experimentally based design parameters for Portland cement concretes with and without silica fume and fly ash...... in marine atmospheric and submersed South Scandinavian environment. The design parameters are based on sequential measurements of 86 chloride profiles taken over ten years from 13 different types of concrete. The design parameters provide the input for an analytical model for chloride profiles as function...

  20. Foundations of predictive analytics

    CERN Document Server

    Wu, James

    2012-01-01

Drawing on the authors' two decades of experience in applied modeling and data mining, Foundations of Predictive Analytics presents the fundamental background required for analyzing data and building models for many practical applications, such as consumer behavior modeling, risk and marketing analytics, and other areas. It also discusses a variety of practical topics that are frequently missing from similar texts. The book begins with the statistical and linear algebra/matrix foundation of modeling methods, from distributions to cumulant and copula functions to Cornish-Fisher expansion and o

  1. Predicting Sustainable Work Behavior

    DEFF Research Database (Denmark)

    Hald, Kim Sundtoft

    2013-01-01

Sustainable work behavior is an important issue for operations managers – it has implications for most outcomes of OM. This research explores the antecedents of sustainable work behavior. It revisits and extends the sociotechnical model developed by Brown et al. (2000) on predicting safe behavior....... Employee characteristics and general attitudes towards safety and work conditions are included in the extended model. A survey was handed out to 654 employees in Chinese factories. This research contributes by demonstrating how employee characteristics and general attitudes towards safety and work...... conditions influence their sustainable work behavior. A new definition of sustainable work behavior is proposed....

  2. Consciousness -- A Verifiable Prediction

    Science.gov (United States)

    Panchapakesan, N.

    2014-07-01

    Consciousness may or may not be completely within the realm of science. We have argued elsewhere that there is a high probability that it is not within the purview of science, just like humanities and arts are outside science. Even social sciences do not come under science when human interactions are involved. Here, we suggest a possible experiment to decide whether it is part of science. We suggest that a scientific signal may be available to investigate the prediction in the form of an electromagnetic brainwave background radiation.

  3. Predicting photothermal field performance

    Science.gov (United States)

    Gonzalez, C. C.; Ross, R. G., Jr.

    1984-01-01

Photothermal field performance of flat-plate solar collectors was predicted. An analytical model was developed that incorporates the measured dependence of transmittance loss on UV and temperature exposure levels. The model uses SOLMET weather data extrapolated to 30 years for various sites and module mounting configurations. It is concluded that temperature is the key driver of photothermally induced transmittance loss. The sensitivity of transmittance loss to UV level is nonlinear, with a minimum in the curve near one sun. The ethylene vinyl acetate (EVA) results are consistent with a 30-year life allocation.

  4. Predicting Alloreactivity in Transplantation

    Directory of Open Access Journals (Sweden)

    Kirsten Geneugelijk

    2014-01-01

Full Text Available Human Leukocyte Antigen (HLA) mismatching leads to severe complications after solid-organ transplantation and hematopoietic stem-cell transplantation. The alloreactive responses underlying the posttransplantation complications include both direct recognition of allogeneic HLA by HLA-specific alloantibodies and T cells and indirect T-cell recognition. However, the immunogenicity of HLA mismatches is highly variable; some HLA mismatches lead to severe clinical B-cell- and T-cell-mediated alloreactivity, whereas others are well tolerated. Defining the permissibility of HLA mismatches prior to transplantation allows selection of donor-recipient combinations that have a reduced chance of developing deleterious host-versus-graft responses after solid-organ transplantation and graft-versus-host responses after hematopoietic stem-cell transplantation. Therefore, several methods have been developed to predict permissible HLA-mismatch combinations. In this review we aim to give a comprehensive overview of the current knowledge regarding HLA-directed alloreactivity and of several in vitro and in silico tools developed to predict direct and indirect alloreactivity.

  5. Predictive Analyses of Biological Effects of Natural Products: From Plant Extracts to Biomolecular Laboratory and Computer Modeling

    Directory of Open Access Journals (Sweden)

    Roberto Gambari

    2011-01-01

    Full Text Available Year by year, the characterization of the biological activity of natural products is becoming more competitive and complex, with the involvement in this research area of experts belonging to different scientific fields, including chemistry, biochemistry, molecular biology, immunology and bioinformatics. These fields are becoming of great interest for several high-impact scientific journals, including eCAM. The available literature in general, and a survey of reviews and original articles recently published, establishes that natural products, including extracts from medicinal plants and essential oils, retain interesting therapeutic activities, including antitumor, antiviral, anti-inflammatory, pro-apoptotic and differentiating properties. In this commentary, we focus attention on interest in networks based on complementary activation and comparative evaluation of different experimental strategies applied to the discovery and characterization of bioactive natural products. A representative flow chart is shown in the paper.

  6. The effects of clinical and statistical heterogeneity on the predictive values of results from meta-analyses

    NARCIS (Netherlands)

    Melsen, W G; Rovers, M M; Bonten, M J M; Bootsma, M C J

    2014-01-01

Variance between studies in a meta-analysis will exist. This heterogeneity may be of clinical, methodological or statistical origin. The last of these is quantified by the I² statistic. We investigated, using simulated studies, the accuracy of I² in the assessment of heterogeneity and the effec
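The heterogeneity statistic discussed above is derived from Cochran's Q under a fixed-effect model. A minimal computation with invented effect sizes and variances:

```python
import numpy as np

def i_squared(effects, variances):
    """Cochran's Q and the I^2 heterogeneity statistic (in percent):
    I^2 = max(0, (Q - df) / Q) * 100, with df = k - 1 studies."""
    y = np.asarray(effects, float)
    w = 1.0 / np.asarray(variances, float)    # inverse-variance weights
    pooled = np.sum(w * y) / np.sum(w)        # fixed-effect pooled estimate
    Q = float(np.sum(w * (y - pooled) ** 2))
    df = len(y) - 1
    i2 = 0.0 if Q == 0 else max(0.0, (Q - df) / Q) * 100.0
    return Q, i2

# Four hypothetical studies: effect sizes and their within-study variances
Q, i2 = i_squared([0.30, 0.10, 0.45, 0.20], [0.01, 0.02, 0.015, 0.01])
```

Because Q is compared against its degrees of freedom, I² expresses the share of observed variability attributable to between-study heterogeneity rather than sampling error.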

  7. [PK/PD Modeling as a Tool for Predicting Bacterial Resistance to Antibiotics: Alternative Analyses of Experimental Data].

    Science.gov (United States)

    Golikova, M V; Strukova, E N; Portnoy, Y A; Firsov, A A

    2015-01-01

Postexposure number of mutants (N(M)) is a conventional endpoint in bacterial resistance studies using in vitro dynamic models that simulate antibiotic pharmacokinetics. To compare N(M) with a recently introduced integral parameter, AUBC(M) (the area under the time course of resistant mutants), the enrichment of resistant Staphylococcus aureus was studied in vitro by simulating mono- (daptomycin, doxycycline) and combined treatments (daptomycin + rifampicin, rifampicin + linezolid). Differences in the time courses of resistant S. aureus could be reflected by AUBC(M) but not N(M). Moreover, unlike AUBC(M), N(M) did not reflect the pronounced differences in the time courses of S. aureus mutants resistant to 2x, 4x, 8x and 16xMIC of doxycycline and rifampicin. The findings suggest that AUBC(M) is a more appropriate endpoint for the amplification of resistant mutants than N(M).

  8. Intentions, planning, and self-efficacy predict physical activity in Chinese and Polish adolescents: Two moderated mediation analyses

    Directory of Open Access Journals (Sweden)

    Aleksandra Luszczynska

    2010-01-01

Full Text Available Planning is believed to translate intentions into health behaviors. However, this translation may fail owing to a lack of perceived self-efficacy: people do not undertake difficult tasks if they harbor self-doubts, even when they have made a good action plan. The present two descriptive longitudinal studies were designed to examine the assumed moderating role of self-efficacy in the planning-behavior relationship. In Study I (N = 534 Chinese adolescents), intentions were assessed at baseline, while self-efficacy and physical activity were measured four weeks later. In Study II, 620 Polish adolescents completed questionnaires assessing physical activity, intentions, planning, and self-efficacy at a 10-week physical activity follow-up. A moderated mediation model was examined, in which planning was specified as a mediator between intentions and behavior, while self-efficacy was specified as a moderator of the planning-behavior relationship. The results confirm that levels of self-efficacy moderate the mediation process: the strength of the mediated effect (intention via planning on behavior) increased along with levels of self-efficacy. These results remained valid after accounting for baseline physical activity. For planning to mediate the intention-behavior relationship, adolescents need sufficiently high levels of self-efficacy; otherwise, planning may be done in vain. Implications for theory development and for interventions are discussed.

  9. Prediction of neural differentiation fate of rat mesenchymal stem cells by quantitative morphological analyses using image processing techniques.

    Science.gov (United States)

    Kazemimoghadam, Mahdieh; Janmaleki, Mohsen; Fouani, Mohamad Hassan; Abbasi, Sara

    2015-02-01

Differentiation of bone marrow mesenchymal stem cells (BMSCs) into neural cells has received significant attention in recent years. However, there is still no practical, non-invasive method to evaluate the differentiation process. Cellular quality evaluation is still limited to conventional techniques based on extracting genes or proteins from the cells. These techniques are invasive, costly, time consuming, and must be performed by relevant experts in equipped laboratories. Moreover, they cannot anticipate the future status of cells. Recently, cell morphology has been introduced as a feasible way of monitoring cell behavior because of its relationship with cell proliferation, function and differentiation. In this study, rat BMSCs were induced to differentiate into neurons. Phase contrast images of cells taken at certain intervals were then subjected to a series of image processing steps, and cell morphology features were calculated. To validate the viability of applying image-based approaches for estimating the quality of the differentiation process, neural-specific markers were measured experimentally throughout the induction. The strong correlation between quantitative imaging metrics and experimental outcomes revealed the capability of the proposed approach as an auxiliary method of assessing cell behavior during differentiation.

  10. The Predictive Utility of Narcissism among Children and Adolescents: Evidence for a Distinction between Adaptive and Maladaptive Narcissism

    Science.gov (United States)

    Barry, Christopher T.; Frick, Paul J.; Adler, Kristy K.; Grafeman, Sarah J.

    2007-01-01

    We examined the predictive utility of narcissism among a community sample of children and adolescents (N=98) longitudinally. Analyses focused on the differential utility between maladaptive and adaptive narcissism for predicting later delinquency. Maladaptive narcissism significantly predicted self-reported delinquency at one-, two-, and…

  11. The role of CFD computer analyses in hydrogen safety management

    Energy Technology Data Exchange (ETDEWEB)

    Komen, Ed M.J.; Visser, Dirk C.; Roelofs, Ferry [Nuclear Research and Consultancy Group (NRG), Petten (Netherlands); Te Lintelo, Jos G.T. [N.V. Elekticiteits-Productiemaatschappij Zuid-Nederland EPZ, Borssele (Netherlands)

    2015-11-15

    The risks of hydrogen release and combustion during a severe accident in a light water reactor have attracted considerable attention after the Fukushima accident in Japan. Reliable computer analyses are needed for the optimal design of hydrogen mitigation systems. In the last decade, significant progress has been made in the development, validation, and application of more detailed, three-dimensional Computational Fluid Dynamics (CFD) simulations for hydrogen safety analyses. The validation status and reliability of CFD code simulations will be illustrated by validation analyses performed for experiments executed in the PANDA, THAI, and ENACCEF facilities.

  12. Analyses of beyond design basis accident homogeneous boron dilution scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Kereszturi, Andras; Hegyi, Gyoergy; Maraczy, Csaba; Trosztel, Istvan; Tota, Adam [Hungarian Academy of Sciences, Centre for Energy Research, Budapest (Hungary); Karsa, Zoltan [NUBIKI Nuclear Safety Research Institute, Ltd., Budapest (Hungary)

    2015-09-15

Homogeneous boron dilution scenarios in a VVER-440 reactor were analyzed using the coupled KIKO3D-ATHLET code. The scenarios are termed "homogeneous" because of the very slow dilution caused by a rupture in the heat exchanger of the makeup system. Without the presented analyses, a significant contribution of homogeneous boron dilution to the Core Damage Frequency (CDF) had to be assumed in the Probabilistic Safety Analyses (PSA). According to the combined results of the presented deterministic and probabilistic analyses, the final conclusion is that boron dilution transients do not contribute significantly to the CDF for the investigated VVER-440 NPP.

  13. Quelques aspects de l'analyse des donnees symboliques

    OpenAIRE

    Diday, E.

    1993-01-01

CLOREC project; The aim of symbolic data analysis is to represent our knowledge through expressions that are both symbolic and numerical, and to manipulate and use these expressions in order to support decision-making and to better analyze, synthesize and organize our experience and observations. We first present "symbolic objects" (kinds of atoms of knowledge) and what distinguishes them from the classical objects of standard data analysis. ...

  14. Fracture Mechanics Prediction of Fatigue Life of Aluminum Highway Bridges

    DEFF Research Database (Denmark)

    Rom, Søren; Agerskov, Henning

    2015-01-01

    Fracture mechanics prediction of the fatigue life of aluminum highway bridges under random loading is studied. The fatigue life of welded joints has been determined from fracture mechanics analyses and the results obtained have been compared with results from experimental investigations....... The fatigue life of welded plate specimens has been investigated. Both the fracture mechanics analyses and the fatigue tests have been carried out using load histories, which correspond to one week's traffic loading, determined by means of strain gauge measurements on the deck structure of the Farø Bridges...... in Denmark. The results obtained from the fracture mechanics analyses show a significant difference between constant amplitude and variable amplitude results. Both the fracture mechanics analyses and the results of the fatigue tests carried out indicate that Miner's rule, which is normally used in the design...

  15. Theory use in social predictions.

    Science.gov (United States)

    Bazinger, Claudia; Kühberger, Anton

    2012-12-01

    In a commentary to our article on the role of theory and simulation in social predictions, Krueger (2012) argues that the role of theory is neglected in social psychology for a good reason. He considers evidence indicating that people readily generalize from themselves to others. In response, we stress the role of theoretical knowledge in predicting other people's behavior. Importantly, prediction by simulation and prediction by theory can lead to high as well as to low correlations between own and predicted behavior. This renders correlations largely useless for identifying the prediction strategy. We argue that prediction by theory is a serious alternative to prediction by simulation, and that reliance on correlation has led to a bias toward simulation.

  16. Theory use in social predictions

    OpenAIRE

    Bazinger, Claudia; Kühberger, Anton

    2012-01-01

    In a commentary to our article on the role of theory and simulation in social predictions, Krueger (2012) argues that the role of theory is neglected in social psychology for a good reason. He considers evidence indicating that people readily generalize from themselves to others. In response, we stress the role of theoretical knowledge in predicting other people’s behavior. Importantly, prediction by simulation and prediction by theory can lead to high as well as to low correlations between o...

  17. Duurzaam concurreren in de Nederlandse melkveehouderij: een eerste verkennende analyse [Sustainable competitiveness in Dutch dairy farming: a first exploratory analysis]

    NARCIS (Netherlands)

    Bergevoet, R.H.M.; Calker, van K.J.; Goddijn, S.T.

    2006-01-01

    This report presents the results of a first exploratory analysis of the sustainability position of the Dutch dairy farming sector. The societal and ecological sustainability of Dutch dairy farming was examined in comparison with the sustainability of the dairy…

  18. New insights into domestication of carrot from root transcriptome analyses

    NARCIS (Netherlands)

    Rong, J.; Lammers, Y.; Strasburg, J.L.; Schidlo, N.S.; Ariyurek, Y.; Jong, de T.J.; Klinkhamer, P.G.L.; Smulders, M.J.M.; Vrieling, K.

    2014-01-01

    Background - Understanding the molecular basis of domestication can provide insights into the processes of rapid evolution and crop improvement. Here we demonstrated the processes of carrot domestication and identified genes under selection based on transcriptome analyses. Results - The root transcr

  19. An early "Atkins' Diet": RA Fisher analyses a medical "experiment".

    Science.gov (United States)

    Senn, Stephen

    2006-04-01

    A study on vitamin absorption which RA Fisher analysed for WRG Atkins and co-authored with him is critically examined. The historical background as well as correspondence between Atkins and Fisher is presented.

  20. Multielement trace analyses of SINQ materials by ICP-OES

    Energy Technology Data Exchange (ETDEWEB)

    Keil, R.; Schwikowski, M. [Paul Scherrer Inst. (PSI), Villigen (Switzerland)

    1997-09-01

    Inductively Coupled Plasma Optical Emission Spectrometry was used to analyse 70 elements in various materials used for construction of the SINQ. Detection limits for individual elements depend strongly on the matrix and had to be determined separately. (author) 1 tab.

  1. The MAFLA (Mississippi, Alabama, Florida) Study, Grain Size Analyses

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The MAFLA (Mississippi, Alabama, Florida) Study was funded by NOAA as part of the Outer Continental Shelf Program. Dr. L.J. Doyle produced grain size analyses in the...

  2. Proteomic Analyses of NF1-Interacting Proteins in Keratinocytes

    Science.gov (United States)

    2015-04-01

    Award number: W81XWH-14-1-0070. Title: Proteomic Analyses of NF1-Interacting Proteins in Keratinocytes. Principal investigator: Shyni Varghese. … In the NF1 null epidermis, we analyzed NF1 expression in a mouse model of psoriasis (imiquimod-induced psoriasis-like skin inflammation) and

  3. Modelling longevity bonds: Analysing the Swiss Re Kortis bond

    OpenAIRE

    2015-01-01

    A key contribution to the development of the traded market for longevity risk was the issuance of the Kortis bond, the world's first longevity trend bond, by Swiss Re in 2010. We analyse the design of the Kortis bond, develop suitable mortality models to analyse its payoff and discuss the key risk factors for the bond. We also investigate how the design of the Kortis bond can be adapted and extended to further develop the market for longevity risk.

  4. Finite strain analyses of deformations in polymer specimens

    DEFF Research Database (Denmark)

    Tvergaard, Viggo

    2016-01-01

    Analyses of the stress and strain state in test specimens or structural components made of polymer are discussed. This includes the Izod impact test, based on full 3D transient analyses. Also a long thin polymer tube under internal pressure has been studied, where instabilities develop...... viscoplastic flow on the indentation response. Also, the ability of the simpler expanding spherical cavity model to reproduce the trends from the 3D finite element solutions has been assessed....

  5. Predicting Clinical Outcomes Using Molecular Biomarkers.

    Science.gov (United States)

    Burke, Harry B

    2016-01-01

    Over the past 20 years, there has been an exponential increase in the number of biomarkers. At the last count, there were 768,259 papers indexed in PubMed.gov directly related to biomarkers. Although many of these papers claim to report clinically useful molecular biomarkers, embarrassingly few are currently in clinical use. It is suggested that a failure to properly understand, clinically assess, and utilize molecular biomarkers has prevented their widespread adoption in treatment, in comparative benefit analyses, and their integration into individualized patient outcome predictions for clinical decision-making and therapy. A straightforward, general approach to understanding how to predict clinical outcomes using risk, diagnostic, and prognostic molecular biomarkers is presented. In the future, molecular biomarkers will drive advances in risk, diagnosis, and prognosis, they will be the targets of powerful molecular therapies, and they will individualize and optimize therapy. Furthermore, clinical predictions based on molecular biomarkers will be displayed on the clinician's screen during the physician-patient interaction, they will be an integral part of physician-patient-shared decision-making, and they will improve clinical care and patient outcomes.

  6. Measurement and prediction of pork colour.

    Science.gov (United States)

    Van Oeckel, M J; Warnants, N; Boucqué, C V

    1999-08-01

    The extent to which instrumental colour determinations by FOPu (light scattering), Göfo (reflectance) and Labscan II (CIE L*, CIE a* and CIE b*, hue and chroma) are related to the Japanese colour grades was studied. Additionally, four on-line methods: pH1, FOP1, PQM1 (conductivity) and DDLT (Double Density Light Transmission, analogous to Capteur Gras/Maigre), were evaluated for their ability to predict colour subjectively and objectively. One hundred and twenty samples of m. longissimus thoracis et lumborum, from animals of different genotypes, were analysed. Of the instrumental colour determinations, CIE L* (r=-0.82), FOPu (r=-0.70) and Göfo (r=0.70) were best correlated with the Japanese colour scores. The Japanese colour grades could be predicted by the on-line instruments, pH1, FOP1, PQM1 and DDLT, with determination coefficients between 15 and 28%. Ultimate meat colour, determined by Japanese colour standards, FOPu, Göfo and CIE L*, was better predicted by DDLT than by the classic on-line instruments: FOP1, pH1 and PQM1, although the standard error of the estimate was similar for all instruments. This means that DDLT, although originally designed for estimating lean meat percentage, can additionally give information about meat quality, in particular colour. However, it must be stressed that the colour estimate by DDLT refers to a population of animals, rather than to individual pigs, because of the number of erroneously assigned samples.
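
    The hue and chroma values cited above are derived quantities of the CIE a* and b* coordinates. A minimal sketch, with illustrative input values that are not from the study (function name is ours):

```python
import math

def chroma_hue(a_star, b_star):
    """CIE C*ab (chroma) and h_ab (hue angle, degrees) from a*, b*."""
    chroma = math.hypot(a_star, b_star)          # C*ab = sqrt(a*^2 + b*^2)
    hue = math.degrees(math.atan2(b_star, a_star)) % 360.0
    return chroma, hue

# Illustrative values for a reddish sample (assumed, not from the paper)
c, h = chroma_hue(8.0, 6.0)
print(round(c, 2), round(h, 1))  # 10.0 36.9
```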

  7. Epitope prediction methods

    DEFF Research Database (Denmark)

    Karosiene, Edita

    Major histocompatibility complex (MHC) molecules play a crucial role in adaptive immunity by sampling peptides from self and non-self proteins to be recognised by the immune system. MHC molecules present peptides on cell surfaces for recognition by CD8+ and CD4+ T lymphocytes that can initiate...... immune responses. Therefore, it is of great importance to be able to identify peptides that bind to MHC molecules, in order to understand the nature of immune responses and discover T cell epitopes useful for designing new vaccines and immunotherapies. MHC molecules in humans, referred to as human...... on machine learning techniques. Several MHC class I binding prediction algorithms have been developed and due to their high accuracy they are used by many immunologists to facilitate the conventional experimental process of epitope discovery. However, the accuracy of these methods depends on data defining...

  8. Permeability prediction in chalks

    DEFF Research Database (Denmark)

    Alam, Mohammad Monzurul; Fabricius, Ida Lykke; Prasad, Manika

    2011-01-01

    The velocity of elastic waves is the primary datum available for acquiring information about subsurface characteristics such as lithology and porosity. Cheap and quick (spatial coverage, ease of measurement) information on permeability can be achieved if sonic velocity is used for permeability....... The relationships between permeability and porosity from core data were first examined using Kozeny’s equation. The data were analyzed for any correlations to the specific surface of the grain, Sg, and to the hydraulic property defined as the flow zone indicator (FZI). These two methods use two different approaches...... to enhance permeability prediction from Kozeny’s equation. The FZI is based on a concept of a tortuous flow path in a granular bed. The Sg concept considers the pore space that is exposed to fluid flow and models permeability resulting from effective flow parallel to pressure drop. The porosity-permeability
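
    Kozeny's equation, referred to above, relates permeability to porosity and specific surface. One common textbook form is sketched below; the Kozeny constant, the exact form, and the input values are assumptions for illustration, not the paper's formulation:

```python
def kozeny_permeability(phi, Sg, c=0.2):
    """Kozeny-type estimate: k = c * phi^3 / (Sg^2 * (1 - phi)^2).
    phi: porosity (fraction); Sg: specific surface per unit grain volume;
    c: assumed Kozeny constant. Units of k follow from 1/Sg^2
    (e.g. Sg in 1/m gives k in m^2)."""
    return c * phi**3 / (Sg**2 * (1.0 - phi) ** 2)

# Illustrative chalk-like inputs (assumed): 30% porosity, Sg = 2e6 m^-1
k = kozeny_permeability(0.30, 2.0e6)
print(k)
```

    As the formula shows, predicted permeability grows steeply with porosity and drops with the square of specific surface, which is why Sg-based corrections matter in fine-grained chalk.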

  9. Chloride ingress prediction

    DEFF Research Database (Denmark)

    Frederiksen, Jens Mejer; Geiker, Mette Rica

    2008-01-01

    Prediction of chloride ingress into concrete is an important part of durability design of reinforced concrete structures exposed to chloride containing environment. This paper presents the state-of-the art: an analytical model which describes chloride profiles in concrete as function of depth...... makes physical sense for the design engineer, i.e. the achieved chloride diffusion coefficients at 1 year and 100 years, D1 and D100 respectively, and the corresponding achieved chloride concentrations at the exposed concrete surface, C1 and C100. Data from field exposure supports the assumption of time...... dependent surface chloride concentrations and the diffusion coefficients. Model parameters for Portland cement concretes with and without silica fume and fly ash in marine atmospheric and submerged South Scandinavian environment are suggested in a companion paper based on 10 years field exposure data....
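
    The analytical model described above generalizes the classical constant-parameter solution of Fick's second law by letting the diffusion coefficient and surface concentration vary with time (D1/D100, C1/C100). As a baseline only, the constant-parameter profile can be sketched as follows; units and values are illustrative assumptions:

```python
import math

def chloride_profile(x, t, D, Cs, Ci=0.0):
    """Constant-D solution of Fick's second law for a semi-infinite medium:
    C(x, t) = Ci + (Cs - Ci) * erfc(x / (2 * sqrt(D * t))).
    x: depth (mm), t: exposure time (years), D: diffusion coefficient
    (mm^2/year), Cs: surface concentration, Ci: initial concentration."""
    return Ci + (Cs - Ci) * math.erfc(x / (2.0 * math.sqrt(D * t)))

# Illustrative (assumed) values: D = 20 mm^2/year, Cs = 0.5% of binder mass
print(chloride_profile(x=0.0, t=10.0, D=20.0, Cs=0.5))   # surface: 0.5
print(chloride_profile(x=50.0, t=10.0, D=20.0, Cs=0.5))  # deeper: lower
```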

  10. Motor degradation prediction methods

    Energy Technology Data Exchange (ETDEWEB)

    Arnold, J.R.; Kelly, J.F.; Delzingaro, M.J.

    1996-12-01

    Motor Operated Valve (MOV) squirrel cage AC motor rotors are susceptible to degradation under certain conditions. Premature failure can result due to high humidity/temperature environments, high running load conditions, extended periods at locked rotor conditions (i.e. > 15 seconds) or exceeding the motor's duty cycle by frequent starts or multiple valve stroking. Exposure to high heat and moisture due to packing leaks, pressure seal ring leakage or other causes can significantly accelerate the degradation. ComEd and Liberty Technologies have worked together to provide and validate a non-intrusive method using motor power diagnostics to evaluate MOV rotor condition and predict failure. These techniques have provided a quick, low radiation dose method to evaluate inaccessible motors, identify degradation and allow scheduled replacement of motors prior to catastrophic failures.

  11. Evaluation of residue-residue contact prediction in CASP10

    KAUST Repository

    Monastyrskyy, Bohdan

    2013-08-31

    We present the results of the assessment of the intramolecular residue-residue contact predictions from 26 prediction groups participating in the 10th round of the CASP experiment. The most recently developed direct coupling analysis methods did not take part in the experiment likely because they require a very deep sequence alignment not available for any of the 114 CASP10 targets. The performance of contact prediction methods was evaluated with the measures used in previous CASPs (i.e., prediction accuracy and the difference between the distribution of the predicted contacts and that of all pairs of residues in the target protein), as well as new measures, such as the Matthews correlation coefficient, the area under the precision-recall curve and the ranks of the first correctly and incorrectly predicted contact. We also evaluated the ability to detect interdomain contacts and tested whether the difficulty of predicting contacts depends upon the protein length and the depth of the family sequence alignment. The analyses were carried out on the target domains for which structural homologs did not exist or were difficult to identify. The evaluation was performed for all types of contacts (short, medium, and long-range), with emphasis placed on long-range contacts, i.e. those involving residues separated by at least 24 residues along the sequence. The assessment suggests that the best CASP10 contact prediction methods perform at approximately the same level, and comparably to those participating in CASP9.
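
    Two of the evaluation measures above are easy to make concrete: the Matthews correlation coefficient over binary contact predictions, and the long-range criterion of at least 24 residues of sequence separation. A minimal sketch (function names and counts are illustrative, not CASP's code):

```python
import math

def mcc(tp, fp, tn, fn):
    """Matthews correlation coefficient for binary contact predictions."""
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return (tp * tn - fp * fn) / denom if denom else 0.0

def is_long_range(i, j, min_sep=24):
    """Long-range contact: residues separated by >= 24 positions in sequence."""
    return abs(i - j) >= min_sep

print(mcc(50, 10, 900, 40))   # mostly-correct predictor -> positive MCC
print(is_long_range(10, 40))  # separation 30 -> True
```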

  12. Predictive coarse-graining

    Science.gov (United States)

    Schöberl, Markus; Zabaras, Nicholas; Koutsourelakis, Phaedon-Stelios

    2017-03-01

    We propose a data-driven, coarse-graining formulation in the context of equilibrium statistical mechanics. In contrast to existing techniques which are based on a fine-to-coarse map, we adopt the opposite strategy by prescribing a probabilistic coarse-to-fine map. This corresponds to a directed probabilistic model where the coarse variables play the role of latent generators of the fine scale (all-atom) data. From an information-theoretic perspective, the framework proposed provides an improvement upon the relative entropy method [1] and is capable of quantifying the uncertainty due to the information loss that unavoidably takes place during the coarse-graining process. Furthermore, it can be readily extended to a fully Bayesian model where various sources of uncertainties are reflected in the posterior of the model parameters. The latter can be used to produce not only point estimates of fine-scale reconstructions or macroscopic observables, but more importantly, predictive posterior distributions on these quantities. Predictive posterior distributions reflect the confidence of the model as a function of the amount of data and the level of coarse-graining. The issues of model complexity and model selection are seamlessly addressed by employing a hierarchical prior that favors the discovery of sparse solutions, revealing the most prominent features in the coarse-grained model. A flexible and parallelizable Monte Carlo - Expectation-Maximization (MC-EM) scheme is proposed for carrying out inference and learning tasks. A comparative assessment of the proposed methodology is presented for a lattice spin system and the SPC/E water model.

  13. Protein docking prediction using predicted protein-protein interface

    Directory of Open Access Journals (Sweden)

    Li Bin

    2012-01-01

    Full Text Available Abstract Background Many important cellular processes are carried out by protein complexes. To provide physical pictures of interacting proteins, many computational protein-protein prediction methods have been developed in the past. However, it is still difficult to identify the correct docking complex structure within top ranks among alternative conformations. Results We present a novel protein docking algorithm that utilizes imperfect protein-protein binding interface prediction for guiding protein docking. Since the accuracy of protein binding site prediction varies depending on cases, the challenge is to develop a method which does not deteriorate but improves docking results by using a binding site prediction which may not be 100% accurate. The algorithm, named PI-LZerD (using Predicted Interface with Local 3D Zernike descriptor-based Docking algorithm), is based on a pairwise protein docking prediction algorithm, LZerD, which we developed earlier. PI-LZerD starts by performing docking prediction using the provided protein-protein binding interface prediction as constraints, followed by a second round of docking with updated docking interface information to further improve the docking conformation. Benchmark results on bound and unbound cases show that PI-LZerD consistently improves docking prediction accuracy compared with docking without binding site prediction or with the binding site prediction used only as post-filtering. Conclusion We have developed PI-LZerD, a pairwise docking algorithm which uses imperfect protein-protein binding interface prediction to improve docking accuracy. PI-LZerD consistently showed better prediction accuracy than alternative methods in a series of benchmark experiments, including docking using actual interface site predictions as well as unbound docking cases.

  14. Predicting the Creativity of Design Majors Based on the Interaction of Diverse Personality Traits

    Science.gov (United States)

    Chang, Chi-Cheng; Peng, Li-Pei; Lin, Ju-Sen; Liang, Chaoyun

    2015-01-01

    In this study, design majors were analysed to examine how diverse personality traits interact and influence student creativity. The study participants comprised 476 design majors. The results indicated that openness predicted the originality of creativity, whereas openness, conscientiousness and agreeableness predicted the usefulness of…

  15. Predicting Adolescent Sexual and Contraceptive Behavior: An Application and Test of the Fishbein Model.

    Science.gov (United States)

    Jorgensen, Stephen R.; Sonstegard, Janet S.

    1984-01-01

    Presents a test of the Fishbein model of behavior prediction applied to predict the pregnancy risk-taking behavior of adolescent females (N=244). Analyses of data showed that the Fishbein model of attitude-behavior consistency seems to be applicable to the fertility-related behavior of adolescent females. (LLL)

  16. Cancer predictive value of cytogenetic markers used in occupational health surveillance programs

    DEFF Research Database (Denmark)

    Hagmar, L; Bonassi, S; Strömberg, U;

    1998-01-01

    It has not previously been clear whether cytogenetic biomarkers in healthy subjects will predict cancer. Earlier analyses of a Nordic and an Italian cohort indicated predictivity for chromosomal aberrations (CAS) but not for sister chromatid exchanges (SCES). A pooled analysis of the updated...

  17. The Diagnostic Apathia Scale predicts the ability to return to work following depression or anxiety

    DEFF Research Database (Denmark)

    Hellström, Lc; Eplov, Lf; Nordentoft, M

    2014-01-01

    , tiredness/fatigue, insomnia, and reduced ability to work and engage in personal interests. The scale was analysed for psychometric validity (scalability) and for its ability to predict RTW. Finally, the predictive validity of the Diagnostic Apathia Scale regarding RTW was compared with scales measuring...

  18. Practices for predicting and preventing preterm birth in Ireland: a national survey.

    LENUS (Irish Health Repository)

    Smith, V

    2011-03-01

    Preterm birth can result in adverse outcomes for the neonate and/or his/her family. The accurate prediction and prevention of preterm birth is paramount. This study describes and critically analyses practices for predicting and preventing preterm birth in Ireland.

  19. Predictive modeling of respiratory tumor motion for real-time prediction of baseline shifts

    Science.gov (United States)

    Balasubramanian, A.; Shamsuddin, R.; Prabhakaran, B.; Sawant, A.

    2017-03-01

    Baseline shifts in respiratory patterns can result in significant spatiotemporal changes in patient anatomy (compared to that captured during simulation), in turn causing geometric and dosimetric errors in the administration of thoracic and abdominal radiotherapy. We propose predictive modeling of tumor motion trajectories for predicting a baseline shift ahead of its occurrence. The key idea is to use the features of the tumor motion trajectory over a 1 min window, and predict the occurrence of a baseline shift in the 5 s that immediately follow (lookahead window). In this study, we explored a preliminary trend-based analysis with multi-class annotations as well as a more focused binary classification analysis. In both analyses, a number of different inter-fraction and intra-fraction training strategies were studied, both offline and online, along with data sufficiency and skew compensation for class imbalances. The performance of the different training strategies was compared across multiple machine learning classification algorithms, including nearest neighbor, Naïve Bayes, linear discriminant and ensemble Adaboost. The prediction performance is evaluated using metrics such as accuracy, precision, recall and the area under the receiver operating characteristic curve (AUC). The key results of the trend-based analysis indicate that (i) intra-fraction training strategies achieve the highest prediction accuracies (90.5–91.4%); (ii) the predictive modeling yields the lowest accuracies (50–60%) when the training data do not include any information from the test patient; (iii) the prediction latencies are as low as a few hundred milliseconds, and thus conducive to real-time prediction. The binary classification performance is promising, indicated by high AUCs (0.96–0.98). It also confirms the utility of prior data from previous patients, and also the necessity of training the classifier on some initial data from the new patient for reasonable

  20. Advanced Analyses of Technology Innovations, Patents, and Intellectual Property

    Directory of Open Access Journals (Sweden)

    Amy J. C. Trappey

    2013-08-01

    conduct their hybrid analyses. Their findings provide recommendations for technology development strategies. The paper entitled “Ontology-based Patent Licensing and Litigation Strategic Knowledge System for the Light Emitting Diode Industry,” co-authored by Amy J.C. Trappey, Yu-Hui Wang, Charles V. Trappey, Chun-Yi Wu, and Tzu-Hsuan Lin, develops a scientific and analytical approach for strategic cross-licensing decision support, particularly for the extremely competitive global companies in the LED industry. Yu-Hui Wang and Benjamin Liu present their research in the paper entitled “Innovation Effect on Patent Pool Formation: Empirical Case of Philips’ Patents in Digital Versatile Disc 3C.” The paper evaluates both the quantity and quality of Philips’ patents in a leading DVD patent pool. The legitimacy of patent pools and their strategic value are discussed and investigated using the DVD 3C patent pool as the case example. In the paper “Application of the Honeybee Mating Optimization Algorithm to Patent Document Classification in Combination with the Support Vector Machine (SVM),” co-authors Chui-Yu Chiu and Pei-Ting Huang combine the honeybee mating optimization algorithm with SVM as a novel patent categorization approach with accurate results. The paper entitled “Exploring the Innovative Value of the RFID Industry,” co-authored by Pei-Shu Fan, Cheng-Chin Tsao and Yi-Ching Liaw, examines a number of patent indicators to predict and explain future technological developments in the RFID industry. The research identifies the relative regional technological advantages in RFID development using relevant patent indicators and clustering analysis.

  1. An Integrated Approach for Urban Earthquake Vulnerability Analyses

    Science.gov (United States)

    Düzgün, H. S.; Yücemen, M. S.; Kalaycioglu, H. S.

    2009-04-01

    The earthquake risk for urban areas has increased over the years due to the increasing complexity of urban environments. The main reasons are the location of major cities in hazard-prone areas, growth in urbanization and population, and rising wealth. In recent years the physical effects of these factors have been observed through the growing costs of major disasters in urban areas, which have stimulated a demand for in-depth evaluation of possible strategies to manage the large-scale damaging effects of earthquakes. Understanding and formulating urban earthquake risk requires consideration of a wide range of risk aspects, which can be handled by developing an integrated approach. In such an integrated approach, an interdisciplinary view should be incorporated into the risk assessment. Risk assessment for an urban area requires prediction of vulnerabilities related to elements at risk in the urban area and integration of the individual vulnerability assessments. However, due to the complex nature of an urban environment, estimating vulnerabilities and integrating them necessitates the development of integrated approaches in which the vulnerabilities of social, economic, structural (building stock and infrastructure), cultural and historical heritage are estimated for a given urban area over a given time period. In this study an integrated urban earthquake vulnerability assessment framework is proposed which considers the vulnerability of the urban environment in a holistic manner and performs the vulnerability assessment at the smallest administrative unit, namely the neighborhood scale. The main motivation behind this approach is the inability to implement existing vulnerability assessment methodologies in countries like Turkey, where the required data are usually missing or inadequate and decision makers seek prioritization of their limited resources for risk reduction in the administrative districts for which they are responsible. The methodology integrates socio

  2. Evaluation of Leymus chinensis quality using near-infrared reflectance spectroscopy with three different statistical analyses

    Directory of Open Access Journals (Sweden)

    Jishan Chen

    2015-12-01

    Full Text Available Due to a boom in the dairy industry in Northeast China, the hay industry has been developing rapidly. Thus, it is very important to evaluate hay quality with a rapid and accurate method. In this research, a novel technique that combines near infrared spectroscopy (NIRS) with three different statistical analyses (MLR, PCR and PLS) was used to predict the chemical quality of sheepgrass (Leymus chinensis) in Heilongjiang Province, China, including the concentrations of crude protein (CP), acid detergent fiber (ADF), and neutral detergent fiber (NDF). Firstly, linear partial least squares regression (PLS) was performed on the spectra and the predictions were compared to those with laboratory-based recorded spectra. The MLR evaluation method for CP has the potential to be used for industry requirements, as it needs less sophisticated and cheaper instrumentation using only a few wavelengths. Results show that, in terms of CP, ADF and NDF, (i) the prediction accuracy using PLS was clearly improved compared to the PCR algorithm, and comparable to or even better than results generated using the MLR algorithm; (ii) the predictions were worse compared to laboratory-based spectra with the MLR algorithm, and poor predictions (R2 0.62, RPD 0.9) were obtained using MLR for NDF; (iii) satisfactory accuracy was obtained with the PLS method, with R2 and RPD of 0.91 and 3.2 for CP, 0.89 and 3.1 for ADF, and 0.88 and 3.0 for NDF, respectively. Our results highlight that the combined NIRS-PLS method could be applied as a valuable technique to rapidly and accurately evaluate the quality of sheepgrass hay.
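
    The R2 and RPD statistics quoted above can be computed from observed and predicted concentrations. A minimal sketch with made-up data; RPD here is the standard deviation of the observations over an RMSE-type standard error of prediction, noting that published RPD definitions vary slightly:

```python
import statistics

def r_squared(obs, pred):
    """Coefficient of determination of predictions against observations."""
    mean = sum(obs) / len(obs)
    ss_res = sum((o - p) ** 2 for o, p in zip(obs, pred))
    ss_tot = sum((o - mean) ** 2 for o in obs)
    return 1.0 - ss_res / ss_tot

def rpd(obs, pred):
    """Ratio of performance to deviation: SD of observations / SEP (RMSE)."""
    sep = (sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs)) ** 0.5
    return statistics.stdev(obs) / sep

# Made-up CP concentrations (%DM) vs. hypothetical NIRS predictions
obs = [10.0, 12.0, 14.0, 16.0]
pred = [10.5, 11.5, 14.5, 15.5]
print(round(r_squared(obs, pred), 2), round(rpd(obs, pred), 2))  # 0.95 5.16
```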

  3. HostPhinder: A Phage Host Prediction Tool

    DEFF Research Database (Denmark)

    Villarroel, Julia; Kleinheinz, Kortine Annina; Jurtz, Vanessa Isabell

    2016-01-01

    The current dramatic increase of antibiotic resistant bacteria has revitalised the interest in bacteriophages as alternative antibacterial treatment. Meanwhile, the development of bioinformatics methods for analysing genomic data places high-throughput approaches for phage characterization within...... reach. Here, we present HostPhinder, a tool aimed at predicting the bacterial host of phages by examining the phage genome sequence. Using a reference database of 2196 phages with known hosts, HostPhinder predicts the host species of a query phage as the host of the most genomically similar reference...... phages. As a measure of genomic similarity the number of co-occurring k-mers (DNA sequences of length k) is used. Using an independent evaluation set, HostPhinder was able to correctly predict host genus and species for 81% and 74% of the phages respectively, giving predictions for more phages than BLAST...
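
    The genomic-similarity measure described above, the number of co-occurring k-mers, can be sketched as follows. The k value and the toy sequences are assumptions for illustration, not HostPhinder's actual parameters:

```python
def kmer_set(seq, k):
    """All distinct k-mers (substrings of length k) occurring in a sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def shared_kmers(query, reference, k=4):
    """Number of co-occurring k-mers between two sequences; HostPhinder ranks
    reference phages by a measure of this kind (k=4 is only for this toy)."""
    return len(kmer_set(query, k) & kmer_set(reference, k))

print(shared_kmers("ATGCATGC", "ATGCTTTT", k=4))  # 1 shared 4-mer (ATGC)
```

    The predicted host is then simply the known host of the reference phage scoring highest under such a measure.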

  4. Chaotic time series. Part II. System Identification and Prediction

    Directory of Open Access Journals (Sweden)

    Bjørn Lillekjendlie

    1994-10-01

    Full Text Available This paper is the second in a series of two, and describes the current state of the art in modeling and prediction of chaotic time series. Sample data from deterministic non-linear systems may look stochastic when analysed with linear methods. However, the deterministic structure may be uncovered and non-linear models constructed that allow improved prediction. We give the background for such methods from a geometrical point of view, and briefly describe the following types of methods: global polynomials, local polynomials, multilayer perceptrons and semi-local methods including radial basis functions. Some illustrative examples from known chaotic systems are presented, emphasising the increase in prediction error with time. We compare some of the algorithms with respect to prediction accuracy and storage requirements, and list applications of these methods to real data from widely different areas.

  5. Chaotic time series; 2, system identification and prediction

    CERN Document Server

    Lillekjendlie, B

    1994-01-01

    This paper is the second in a series of two, and describes the current state of the art in modelling and prediction of chaotic time series. Sampled data from deterministic non-linear systems may look stochastic when analysed with linear methods. However, the deterministic structure may be uncovered and non-linear models constructed that allow improved prediction. We give the background for such methods from a geometrical point of view, and briefly describe the following types of methods: global polynomials, local polynomials, multi layer perceptrons and semi-local methods including radial basis functions. Some illustrative examples from known chaotic systems are presented, emphasising the increase in prediction error with time. We compare some of the algorithms with respect to prediction accuracy and storage requirements, and list applications of these methods to real data from widely different areas.
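
    The local and semi-local methods surveyed in the two records above start from delay-coordinate embedding of the scalar series. A zeroth-order (nearest-neighbour) predictor, the simplest member of that family of local models, can be sketched as follows; function names and parameters are illustrative:

```python
def embed(series, dim=3, tau=1):
    """Delay-coordinate embedding: each point becomes a vector of dim
    samples spaced tau steps apart."""
    return [tuple(series[i + j * tau] for j in range(dim))
            for i in range(len(series) - (dim - 1) * tau)]

def nn_predict(series, dim=3, tau=1):
    """Predict the next value by finding the nearest earlier delay vector
    and returning its successor (local polynomials generalize this)."""
    vecs = embed(series, dim, tau)
    query = vecs[-1]
    best = min(range(len(vecs) - 1),
               key=lambda i: sum((a - b) ** 2 for a, b in zip(vecs[i], query)))
    return series[best + (dim - 1) * tau + 1]

series = [0, 1, 2, 0, 1, 2, 0, 1]  # toy periodic series
print(nn_predict(series))          # continues the pattern: 2
```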

  6. Prediction of cereal feed value by near infrared spectroscopy

    DEFF Research Database (Denmark)

    Jørgensen, Johannes Ravn

    and problems that crop variety choices and cropping practices have on feeding value of winter wheat, triticale and spring barley. A successful development of an EDOM, EDOMi, FEso and FEsv calibration to NIRS will be a relatively cheap tool to monitor, diversify and evaluate the quality of cereals for animal...... feed, a possible tool to assess the feed value of new varieties in the variety testing and a useful, cheap and rapid tool for cereal breeders. A bank of 1213 grain samples of wheat, triticale, barley and rye, and related chemical reference analyses to describe the feed value have been established...... with the error in the chemical analysis. Prediction error by NIRS prediction of feed value has been shown to be above the error of the chemical measurement. The conclusion is that it has proved possible to predict the feed value in cereals with NIRS quickly and cheaply, but prediction error with this method...

  7. Temporal prediction errors modulate cingulate-insular coupling.

    Science.gov (United States)

    Limongi, Roberto; Sutherland, Steven C; Zhu, Jian; Young, Michael E; Habib, Reza

    2013-05-01

    Prediction error (i.e., the difference between the expected and the actual outcome of an event) mediates adaptive behavior. Activity in the anterior mid-cingulate cortex (aMCC) and in the anterior insula (aINS) is associated with the commission of prediction errors under uncertainty. We propose a dynamic causal model of effective connectivity (i.e., neuronal coupling) between the aMCC, the aINS, and the striatum in which the task context drives activity in the aINS and temporal prediction errors modulate extrinsic cingulate-insular connections. With functional magnetic resonance imaging, we scanned 15 participants while they performed a temporal prediction task. They observed visual animations and predicted when a stationary ball would begin moving after being contacted by another moving ball. To induce uncertainty-driven prediction errors, we introduced spatial gaps and temporal delays between the balls. Classical and Bayesian fMRI analyses provided evidence that the aMCC-aINS system, along with the striatum, responds not only when humans predict whether a dynamic event will occur but also when it occurs. Our results reveal that the insula is the entry port of a three-region pathway involved in the processing of temporal predictions. Moreover, prediction errors, rather than attentional demands, task difficulty, or task duration, exert an influence on the aMCC-aINS system. Prediction errors weaken the effect of the aMCC on the aINS. Finally, our computational model provides a way forward to characterize the physiological parallel of temporal prediction errors elicited in dynamic tasks.

  8. Data-Based Predictive Control with Multirate Prediction Step

    Science.gov (United States)

    Barlow, Jonathan S.

    2010-01-01

    Data-based predictive control is an emerging control method that stems from Model Predictive Control (MPC). MPC computes the current control action based on a prediction of the system output a number of time steps into the future and is generally derived from a known model of the system. Data-based predictive control has the advantage of deriving predictive models and controller gains from input-output data. Thus, a controller can be designed from the outputs of complex simulation code or a physical system where no explicit model exists. If the output data happen to be corrupted by periodic disturbances, the designed controller will also have the built-in ability to reject these disturbances without the need to know them. When data-based predictive control is implemented online, it becomes a version of adaptive control. One challenge of MPC is that computational requirements increase with prediction horizon length. This paper develops a closed-loop dynamic output feedback controller that minimizes a multi-step-ahead receding-horizon cost function with multirate prediction step. One result is a reduced influence of prediction horizon and the number of system outputs on the computational requirements of the controller. Another result is an emphasis on portions of the prediction window that are sampled more frequently. A third result is the ability to include more outputs in the feedback path than in the cost function.
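
    The core idea of deriving a predictive model directly from input-output data can be sketched in miniature. The following is a hypothetical one-step ARX fit, far simpler than the paper's multirate receding-horizon formulation: fit y[k+1] = a·y[k] + b·u[k] by least squares over recorded data, then iterate the fitted model over a prediction horizon.

    ```python
    # Least-squares fit of a one-step ARX model from input-output data,
    # then multi-step prediction by iteration. A simplified stand-in for
    # the paper's data-based predictor; the model form is an assumption.

    def fit_arx(u, y):
        """Fit y[k+1] = a*y[k] + b*u[k] via the 2x2 normal equations."""
        syy = syu = suu = sy1y = sy1u = 0.0
        for k in range(len(y) - 1):
            syy += y[k] * y[k]
            syu += y[k] * u[k]
            suu += u[k] * u[k]
            sy1y += y[k + 1] * y[k]
            sy1u += y[k + 1] * u[k]
        det = syy * suu - syu * syu
        a = (sy1y * suu - sy1u * syu) / det
        b = (sy1u * syy - sy1y * syu) / det
        return a, b

    def predict(a, b, y0, inputs):
        """Roll the fitted model forward over a horizon of future inputs."""
        ys, y = [], y0
        for u in inputs:
            y = a * y + b * u
            ys.append(y)
        return ys
    ```

    With noiseless data the fit recovers the generating parameters exactly; in the full method, the gains of the receding-horizon controller are likewise derived from such data-identified predictors rather than from an explicit model.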

  9. Cochrane reviews compared with industry supported meta-analyses and other meta-analyses of the same drugs: systematic review

    DEFF Research Database (Denmark)

    Jørgensen, Anders W; Hilden, Jørgen; Gøtzsche, Peter C

    2006-01-01

    OBJECTIVE: To compare the methodological quality and conclusions in Cochrane reviews with those in industry supported meta-analyses and other meta-analyses of the same drugs. DESIGN: Systematic review comparing pairs of meta-analyses that studied the same two drugs in the same disease and were...... reviews had a meta-analysis that compared two drugs. Twenty four meta-analyses that matched the Cochrane reviews were found: eight were industry supported, nine had undeclared support, and seven had no support or were supported by non-industry sources. On a 0-7 scale, the median quality score was 7...... patients or studies. The seven industry supported reviews that had conclusions recommended the experimental drug without reservations, compared with none of the Cochrane reviews (P = 0.02), although the estimated treatment effect was similar on average (z = 0.46, P = 0.64). Reviews with undeclared support...

  10. Helium analyses of 1-mm beryllium microspheres from COBRA-1A2

    Energy Technology Data Exchange (ETDEWEB)

    Oliver, B.M. [Pacific Northwest National Lab., Richland, WA (United States)

    1998-03-01

    Multiple helium analyses on four beryllium microspheres irradiated in the Experimental Breeder Reactor-II (EBR-II) at Argonne National Laboratory-West (ANL-W) are reported. The purpose of the analyses was to determine the total helium content of the beryllium, and to determine the helium release characteristics of the beryllium as a function of time and temperature. For the helium release measurements, sequential helium analyses were conducted on two of the samples over a temperature range from 500 C to 1100 C in 100 C increments. Total helium measurements were conducted separately using the normal analysis method of vaporizing the material in a single analysis run. Observed helium release in the two beryllium samples was nonlinear with time at each temperature interval, with each step being characterized by a rather rapid initial release rate, followed by a gradual slowing of the rate over time. Sample Be-C03-1 released virtually all of its helium after approximately 30 minutes at 1000 C, reaching a final value of 2722 appm. Sample Be-D03-1, on the other hand, released only about 62% of its helium after about 1 hour at 1100 C, reaching a final value of 1519 appm. Combining these results with subsequent vaporization runs on the two samples yielded total helium concentrations of 2724 and 2459 appm. Corresponding helium concentrations measured in the two other C03 and D03 samples, by vaporization alone, were 2941 and 2574 appm. Both sets of concentrations are in reasonable agreement with predicted values of 2723 and 2662 appm. Helium-3 levels measured during the latter two vaporization runs were 2.80 appm for Be-C03-2, and 2.62 appm for Be-D03-2. Calculated {sup 3}He values are slightly lower at 2.55 and 2.50 appm, respectively, suggesting somewhat higher tritium levels in the beryllium than predicted.

  11. Earthquake prediction with electromagnetic phenomena

    Energy Technology Data Exchange (ETDEWEB)

    Hayakawa, Masashi, E-mail: hayakawa@hi-seismo-em.jp [Hayakawa Institute of Seismo Electomagnetics, Co. Ltd., University of Electro-Communications (UEC) Incubation Center, 1-5-1 Chofugaoka, Chofu Tokyo, 182-8585 (Japan); Advanced Wireless & Communications Research Center, UEC, Chofu Tokyo (Japan); Earthquake Analysis Laboratory, Information Systems Inc., 4-8-15, Minami-aoyama, Minato-ku, Tokyo, 107-0062 (Japan); Fuji Security Systems. Co. Ltd., Iwato-cho 1, Shinjyuku-ku, Tokyo (Japan)

    2016-02-01

    Short-term earthquake (EQ) prediction is defined as prospective prediction on a time scale of about one week, and it is considered one of the most important and urgent topics for human beings. If short-term prediction is realized, casualties will be drastically reduced. Unlike conventional seismic measurement, we have proposed the use of electromagnetic phenomena as precursors to EQs in prediction, and an extensive amount of progress has been achieved in the field of seismo-electromagnetics during the last two decades. This paper reviews this short-term EQ prediction, including the myth that EQ prediction by seismometers is impossible, the reason why we are interested in electromagnetics, the history of seismo-electromagnetics, the ionospheric perturbation as the most promising candidate for EQ prediction, the future of EQ predictology from the two standpoints of a practical science and a pure science, and finally a brief summary.

  12. Emerging approaches in predictive toxicology.

    Science.gov (United States)

    Zhang, Luoping; McHale, Cliona M; Greene, Nigel; Snyder, Ronald D; Rich, Ivan N; Aardema, Marilyn J; Roy, Shambhu; Pfuhler, Stefan; Venkatactahalam, Sundaresan

    2014-12-01

    Predictive toxicology plays an important role in the assessment of toxicity of chemicals and the drug development process. While there are several well-established in vitro and in vivo assays that are suitable for predictive toxicology, recent advances in high-throughput analytical technologies and model systems are expected to have a major impact on the field of predictive toxicology. This commentary provides an overview of the state of the current science and a brief discussion on future perspectives for the field of predictive toxicology for human toxicity. Computational models for predictive toxicology, needs for further refinement and obstacles to expand computational models to include additional classes of chemical compounds are highlighted. Functional and comparative genomics approaches in predictive toxicology are discussed with an emphasis on successful utilization of recently developed model systems for high-throughput analysis. The advantages of three-dimensional model systems and stem cells and their use in predictive toxicology testing are also described.

  13. Risk Factor Analyses for the Return of Spontaneous Circulation in the Asphyxiation Cardiac Arrest Porcine Model

    Directory of Open Access Journals (Sweden)

    Cai-Jun Wu

    2015-01-01

    Full Text Available Background: Animal models of asphyxiation cardiac arrest (ACA) are frequently used in basic research to mirror the clinical course of cardiac arrest (CA). The rates of return of spontaneous circulation (ROSC) in ACA animal models are lower than those from studies that have utilized ventricular fibrillation (VF) animal models. The purpose of this study was to characterize the factors associated with ROSC in the ACA porcine model. Methods: Forty-eight healthy miniature pigs underwent endotracheal tube clamping to induce CA. Once induced, CA was maintained untreated for a period of 8 min. Two minutes following the initiation of cardiopulmonary resuscitation (CPR), defibrillation was attempted until ROSC was achieved or the animal died. To assess the factors associated with ROSC in this CA model, logistic regression analyses were performed on gender, the time of preparation, the amplitude spectrum area (AMSA) at the beginning of CPR and the pH at the beginning of CPR. A receiver-operating characteristic (ROC) curve was used to evaluate the predictive value of AMSA for ROSC. Results: ROSC was achieved in only 52.1% of the animals in this ACA porcine model. The multivariate logistic regression analyses revealed that ROSC significantly depended on the time of preparation, AMSA at the beginning of CPR and pH at the beginning of CPR. The area under the ROC curve for AMSA at the beginning of CPR in predicting ROSC was 0.878 (95% confidence interval: 0.773∼0.983), and the optimum cut-off value was 15.62 (specificity 95.7% and sensitivity 80.0%). Conclusions: The time of preparation, AMSA and the pH at the beginning of CPR were associated with ROSC in this ACA porcine model. AMSA also predicted the likelihood of ROSC in this ACA animal model.
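
    The ROC analysis used above, an AUC for a continuous predictor plus an optimum cut-off, can be reproduced generically in a few lines. This is an illustrative sketch with made-up scores, not the study's data; the Youden index (sensitivity + specificity − 1) is a common way such cut-offs are chosen, though the paper does not state its criterion.

    ```python
    # ROC AUC (pairwise Mann-Whitney form) and a Youden-optimal cut-off
    # for a continuous predictor such as AMSA. Data below are made up.

    def roc_auc(pos, neg):
        """P(score_pos > score_neg), counting ties as 1/2."""
        wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
        return wins / (len(pos) * len(neg))

    def youden_cutoff(pos, neg):
        """Threshold maximising sensitivity + specificity - 1,
        predicting positive when score >= threshold."""
        best_t, best_j = None, -1.0
        for t in sorted(set(pos) | set(neg)):
            sens = sum(p >= t for p in pos) / len(pos)
            spec = sum(n < t for n in neg) / len(neg)
            j = sens + spec - 1
            if j > best_j:
                best_t, best_j = t, j
        return best_t, best_j
    ```

    Applied to AMSA values split by outcome (ROSC vs. no ROSC), this yields the AUC and the cut-off of the kind reported in the Results.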

  14. Genome-Wide Gene Expression Profile Analyses Identify CTTN as a Potential Prognostic Marker in Esophageal Cancer

    OpenAIRE

    2014-01-01

    Aim Esophageal squamous cell carcinoma (ESCC) is one of the most common fatal malignances of the digestive tract. Its prognosis is poor mainly due to the lack of reliable markers for early detection and prognostic prediction. Here we aim to identify the molecules involved in ESCC carcinogenesis and those as potential markers for prognosis and as new molecular therapeutic targets. Methods We performed genome-wide gene expression profile analyses of 10 primary ESCCs and their adjacent normal ti...

  15. Useful theories make predictions.

    Science.gov (United States)

    Howes, Andrew

    2012-01-01

    Stephen and Van Orden (this issue) propose that there is a complex system approach to cognitive science, and collectively the authors of the papers presented in this issue believe that this approach provides the means to drive a revolution in the science of the mind. Unfortunately, however illuminating, this explanation is absent and hyperbole is all too extensive. In contrast, I argue (1) that dynamic systems theory is not new to cognitive science and does not provide a basis for a revolution, (2) it is not necessary to reject cognitive science in order to explain the constraints imposed by the body and the environment, (3) it is not necessary, as Silberstein and Chemero (this issue) appear to do, to reject cognitive science in order to explain consciousness, and (4) our understanding of pragmatics is not advanced by Gibbs and Van Orden's (this issue) "self-organized criticality".? Any debate about the future of cognitive science could usefully focus on predictive adequacy. Unfortunately, this is not the approach taken by the authors of this issue.

  16. Protein Chemical Shift Prediction

    CERN Document Server

    Larsen, Anders S

    2014-01-01

    The protein chemical shifts holds a large amount of information about the 3-dimensional structure of the protein. A number of chemical shift predictors based on the relationship between structures resolved with X-ray crystallography and the corresponding experimental chemical shifts have been developed. These empirical predictors are very accurate on X-ray structures but tends to be insensitive to small structural changes. To overcome this limitation it has been suggested to make chemical shift predictors based on quantum mechanical(QM) calculations. In this thesis the development of the QM derived chemical shift predictor Procs14 is presented. Procs14 is based on 2.35 million density functional theory(DFT) calculations on tripeptides and contains corrections for hydrogen bonding, ring current and the effect of the previous and following residue. Procs14 is capable at performing predictions for the 13CA, 13CB, 13CO, 15NH, 1HN and 1HA backbone atoms. In order to benchmark Procs14, a number of QM NMR calculatio...

  17. Using Inequality Measures to Incorporate Environmental Justice into Regulatory Analyses

    Directory of Open Access Journals (Sweden)

    Jonathan I. Levy

    2013-08-01

    Full Text Available Formally evaluating how specific policy measures influence environmental justice is challenging, especially in the context of regulatory analyses in which quantitative comparisons are the norm. However, there is a large literature on developing and applying quantitative measures of health inequality in other settings, and these measures may be applicable to environmental regulatory analyses. In this paper, we provide information to assist policy decision makers in determining the viability of using measures of health inequality in the context of environmental regulatory analyses. We conclude that quantification of the distribution of inequalities in health outcomes across social groups of concern, considering both within-group and between-group comparisons, would be consistent with both the structure of regulatory analysis and the core definition of environmental justice. Appropriate application of inequality indicators requires thorough characterization of the baseline distribution of exposures and risks, leveraging data generally available within regulatory analyses. Multiple inequality indicators may be applicable to regulatory analyses, and the choice among indicators should be based on explicit value judgments regarding the dimensions of environmental justice of greatest interest.
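
    One widely used family of inequality indicators decomposes exactly into within-group and between-group terms, matching the paper's call to consider both comparisons. A minimal sketch with the Theil T index follows; the grouped exposure values are illustrative, and the paper does not endorse this particular indicator over others.

    ```python
    # Theil T index and its exact within-/between-group decomposition,
    # applied to illustrative grouped exposure values (not from the paper).
    from math import log

    def theil(xs):
        """Theil T index: (1/n) * sum((x/mu) * ln(x/mu)); 0 means equality."""
        mu = sum(xs) / len(xs)
        return sum((x / mu) * log(x / mu) for x in xs) / len(xs)

    def decompose(groups):
        """Return (within, between); their sum equals theil of pooled data."""
        pooled = [x for g in groups for x in g]
        mu, total = sum(pooled) / len(pooled), sum(pooled)
        within = sum((sum(g) / total) * theil(g) for g in groups)
        between = sum((sum(g) / total) * log((sum(g) / len(g)) / mu)
                      for g in groups)
        return within, between
    ```

    The exact additive decomposition is what makes this class of indicators attractive for regulatory analysis: a policy's effect can be attributed separately to disparities between social groups of concern and to dispersion within them.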

  18. Subgroup analyses of clinical effectiveness to support health technology assessments.

    Science.gov (United States)

    Paget, Marie-Ange; Chuang-Stein, Christy; Fletcher, Christine; Reid, Carol

    2011-01-01

    Subgroup analysis is an integral part of access and reimbursement dossiers, in particular health technology assessments (HTAs), and HTA recommendations are often limited to subpopulations. HTA recommendations for subpopulations are not always clear and are not free of controversy. In this paper, we review several HTA guidelines regarding subgroup analyses. We describe good statistical principles for subgroup analyses of clinical effectiveness to support HTAs and include case examples where HTA recommendations were given to subpopulations only. In contrast to their role in regulatory submissions, pharmaceutical statisticians in most companies have had limited involvement in the planning, design and preparation of HTA/payer submissions. We hope to change this by highlighting how pharmaceutical statisticians should contribute to payers' submissions. This includes early engagement in reimbursement strategy discussions to influence the design, analysis and interpretation of phase III randomized clinical trials as well as meta-analyses/network meta-analyses. The focus of this paper is on subgroup analyses relating to clinical effectiveness, as we believe this is the first key step of statistical involvement and influence in the preparation of HTA and reimbursement submissions.

  19. HLA region excluded by linkage analyses of early onset periodontitis

    Energy Technology Data Exchange (ETDEWEB)

    Sun, C.; Wang, S.; Lopez, N.

    1994-09-01

    Previous studies suggested that HLA genes may influence susceptibility to early-onset periodontitis (EOP). Segregation analyses indicate that EOP may be due to a single major gene. We conducted linkage analyses to assess possible HLA effects on EOP. Fifty families with two or more close relatives affected by EOP were ascertained in Virginia and Chile. A microsatellite polymorphism within the HLA region (at the tumor necrosis factor beta locus) was typed using PCR. Linkage analyses used a dominant model most strongly supported by previous studies. Assuming locus homogeneity, our results exclude a susceptibility gene within 10 cM on either side of our marker locus. This encompasses all of the HLA region. Analyses assuming alternative models gave qualitatively similar results. Allowing for locus heterogeneity, our data still provide no support for HLA-region involvement. However, our data do not statistically exclude (LOD <-2.0) hypotheses of disease-locus heterogeneity, including models where up to half of our families could contain an EOP disease gene located in the HLA region. This is due to the limited power of even our relatively large collection of families and the inherent difficulties of mapping genes for disorders that have complex and heterogeneous etiologies. Additional statistical analyses, recruitment of families, and typing of flanking DNA markers are planned to more conclusively address these issues with respect to the HLA region and other candidate locations in the human genome. Additional results for markers covering most of the human genome will also be presented.

  20. Sensitivity Analyses for Cross-Coupled Parameters in Automotive Powertrain Optimization

    Directory of Open Access Journals (Sweden)

    Pongpun Othaganont

    2014-06-01

    Full Text Available When vehicle manufacturers are developing new hybrid and electric vehicles, modeling and simulation are frequently used to predict the performance of the new vehicles from an early stage in the product lifecycle. Typically, models are used to predict the range, performance and energy consumption of a future planned production vehicle; they also allow the designer to optimize the vehicle’s configuration. Another use for the models is in performing sensitivity analysis, which helps us understand which parameters have the most influence on model predictions and real-world behaviors. There are various techniques for sensitivity analysis, some numerical, but the greatest insights are obtained analytically, with sensitivity defined in terms of partial derivatives. Existing methods in the literature give us a useful, quantified measure of parameter sensitivity, a first-order effect, but they do not consider second-order effects. Second-order effects could give us additional insights: for example, a first-order analysis might tell us that a limiting factor is the efficiency of the vehicle’s prime-mover; our new second-order analysis will tell us how quickly the efficiency of the powertrain will become of greater significance. In this paper, we develop a method based on formal optimization mathematics for rapid second-order sensitivity analyses and illustrate these through a case study on a C-segment electric vehicle.
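
    Since the paper defines sensitivities as partial derivatives, first- and second-order sensitivities can be illustrated numerically with central differences. The range model below is a made-up placeholder, not the paper's vehicle model, and the analytic method in the paper is more efficient than this finite-difference sketch.

    ```python
    # First- and second-order parameter sensitivity via central differences.
    # The vehicle_range model is a hypothetical placeholder.

    def d1(f, x, h=1e-5):
        """First-order sensitivity df/dx (central difference)."""
        return (f(x + h) - f(x - h)) / (2 * h)

    def d2(f, x, h=1e-4):
        """Second-order sensitivity d2f/dx2 (central second difference)."""
        return (f(x + h) - 2 * f(x) + f(x - h)) / (h * h)

    # Hypothetical model: driving range as a function of powertrain
    # efficiency eta, with battery energy and consumption held fixed.
    def vehicle_range(eta, battery_kwh=60.0, kwh_per_km=0.15):
        return battery_kwh * eta / kwh_per_km

    first = d1(vehicle_range, 0.9)   # km gained per unit of efficiency
    second = d2(vehicle_range, 0.9)  # how quickly that gain itself changes
    ```

    A non-zero second-order sensitivity is exactly the kind of signal the paper exploits: it indicates how rapidly a parameter's influence grows or shrinks as the design moves, which a first-order analysis alone cannot show.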