WorldWideScience

Sample records for model significantly predicted

  1. Does Statistical Significance Help to Evaluate Predictive Performance of Competing Models?

    Directory of Open Access Journals (Sweden)

    Levent Bulut

    2016-04-01

    Full Text Available In a Monte Carlo experiment with simulated data, we show that as a point-forecast criterion, Clark and West's (2006) unconditional test of mean squared prediction errors does not reflect the relative performance of a superior model over a relatively weaker one. The simulation results show that even though the mean squared prediction error of a constructed superior model is far below that of a weaker alternative, the Clark-West test does not reflect this in its test statistic. Therefore, studies that use this statistic to test the predictive accuracy of alternative exchange rate models, stock return predictability, inflation forecasting, and unemployment forecasting should not put too much weight on the magnitude of statistically significant Clark-West test statistics.
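
The Clark-West comparison discussed here is short enough to state in code. A minimal stdlib-Python sketch of the adjusted-MSPE t-statistic for nested forecasts; this is our own illustration of the published formula, not the authors' code, and the function name is invented:

```python
import math

def clark_west_stat(y, f_small, f_large):
    """t-statistic for the Clark-West (2006) adjusted MSPE comparison of
    nested forecasts; large positive values favour the larger model.

    The adjustment subtracts (f_small - f_large)^2 from the larger model's
    squared error to remove the noise penalty of estimating extra parameters.
    """
    adj = [(yt - a) ** 2 - ((yt - b) ** 2 - (a - b) ** 2)
           for yt, a, b in zip(y, f_small, f_large)]
    n = len(adj)
    mean = sum(adj) / n
    var = sum((d - mean) ** 2 for d in adj) / (n - 1)
    return mean / math.sqrt(var / n)
```

As the abstract argues, a large t-value here signals only that the larger model adds information, not how big its accuracy advantage is.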

  2. Predicting significant torso trauma.

    Science.gov (United States)

    Nirula, Ram; Talmor, Daniel; Brasel, Karen

    2005-07-01

    Identification of motor vehicle crash (MVC) characteristics associated with thoracoabdominal injury would advance the development of automatic crash notification systems (ACNS) by improving triage and response times. Our objective was to determine the relationships between MVC characteristics and thoracoabdominal trauma to develop a torso injury probability model. Drivers involved in crashes from 1993 to 2001 within the National Automotive Sampling System were reviewed. Relationships between torso injury and MVC characteristics were assessed using multivariate logistic regression. Receiver operating characteristic curves were used to compare the model to current ACNS models. There were a total of 56,466 drivers. Age, ejection, braking, avoidance, velocity, restraints, passenger-side impact, rollover, and vehicle weight and type were associated with injury (p < 0.05). The area under the receiver operating characteristic curve (83.9) was significantly greater than current ACNS models. We have developed a thoracoabdominal injury probability model that may improve patient triage when used with ACNS.
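
The ROC comparison described above rests on the area under the curve, which for a scored triage model is simply the probability that a randomly chosen injured driver outranks a randomly chosen uninjured one. A small stdlib-Python sketch of that computation (our own illustration, not the study's code):

```python
def roc_auc(pos_scores, neg_scores):
    """AUC as the Mann-Whitney probability that a positive case
    scores above a negative one (ties count half)."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))
```

Comparing two models then reduces to comparing the AUCs their risk scores achieve on the same cases.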

  3. Significantly improved HIV inhibitor efficacy prediction employing proteochemometric models generated from antivirogram data.

    Directory of Open Access Journals (Sweden)

    Gerard J P van Westen

    Full Text Available Infection with HIV cannot currently be cured; however, it can be controlled by combination treatment with multiple anti-retroviral drugs. Given different viral genotypes for virtually each individual patient, the question arises which drug combination to use to achieve effective treatment. With the availability of viral genotypic data and clinical phenotypic data, it has become possible to create computational models able to predict an optimal treatment regimen for an individual patient. Current models are based only on sequence data derived from viral genotyping; chemical similarity of drugs is not considered. To explore the added value of including chemical similarity we applied proteochemometric (PCM) models, combining chemical and protein target properties in a single bioactivity model. Our dataset was a large-scale clinical database of genotypic and phenotypic information (in total ca. 300,000 drug-mutant bioactivity data points; 4 (NNRTI), 8 (NRTI) or 9 (PI) drugs; and 10,700 (NNRTI), 10,500 (NRTI) or 27,000 (PI) mutants). Our models achieved a prediction error below 0.5 log fold change. Moreover, when directly compared with previously published models derived from sequence data alone, PCM performed better in resistance classification and prediction of log fold change (0.76 log units versus 0.91). Furthermore, we were able to confirm known resistance-conferring mutations and identify previously unpublished ones for HIV Reverse Transcriptase (e.g. K102Y, T216M) and HIV Protease (e.g. Q18N, N88G) from our dataset. Finally, we applied our models prospectively to the public HIV resistance database from Stanford University, obtaining a correct resistance prediction rate of 84% on the full set (compared to 80% in previous work on a high-quality subset). We conclude that proteochemometric models are able to accurately predict phenotypic resistance from genotypic data, even for novel mutants and mixtures. Furthermore, we add an applicability domain to
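
Proteochemometric modelling as described combines drug and target descriptors in one model; a common construction concatenates both descriptor blocks together with their pairwise cross-terms, so the fitted model can express drug-target interactions. A hedged stdlib-Python sketch of that feature construction (the descriptor values are invented; the paper's actual descriptors differ):

```python
def pcm_features(drug_desc, target_desc):
    """Proteochemometric feature vector: drug descriptors, target (mutant)
    descriptors, and their pairwise products as interaction cross-terms."""
    cross = [d * t for d in drug_desc for t in target_desc]
    return list(drug_desc) + list(target_desc) + cross
```

Any standard regression method can then be fitted on such vectors, one row per drug-mutant pair.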

  4. Significance of uncertainties derived from settling tank model structure and parameters on predicting WWTP performance - A global sensitivity analysis study

    DEFF Research Database (Denmark)

    Ramin, Elham; Sin, Gürkan; Mikkelsen, Peter Steen

    2011-01-01

    Uncertainty derived from one of the process models – such as one-dimensional secondary settling tank (SST) models – can impact the output of the other process models, e.g., biokinetic (ASM1), as well as the integrated wastewater treatment plant (WWTP) models. The model structure and parameter uncertainty of settler models can therefore propagate, and add to the uncertainties in prediction of any plant performance criteria. Here we present an assessment of the relative significance of secondary settling model performance in WWTP simulations. We perform a global sensitivity analysis (GSA) … The outcome of this study contributes to a better understanding of uncertainty in WWTPs, and explicitly demonstrates the significance of secondary settling processes that are crucial elements of model prediction under dry and wet-weather loading conditions.

  5. Serum NX-DCP as a New Noninvasive Model to Predict Significant Liver Fibrosis in Chronic Hepatitis C.

    Science.gov (United States)

    Saito, Masaya; Yano, Yoshihiko; Hirano, Hirotaka; Momose, Kenji; Yoshida, Masaru; Azuma, Takeshi

    2015-02-01

    Finding a noninvasive method to predict liver fibrosis using inexpensive and easy-to-use markers is important. We aimed to clarify whether NX-des-γ-carboxyprothrombin (NX-DCP) could become a new noninvasive model to predict liver fibrosis in hepatitis C virus (HCV)-related liver disease. We performed a prospective cohort study on a consecutive group of 101 patients who underwent liver biopsy for HCV-related liver disease at Kobe University Hospital. Laboratory measurements were performed on the same day as the biopsy. Factors associated with significant fibrosis (F3-4) were assessed by multivariate analyses. A comparison of predictive ability between the multivariate factors and established noninvasive models was also performed. An increase in serum NX-DCP was significantly related to an increase in fibrosis stage (P = 0.006). Moreover, NX-DCP was a multivariate factor associated with the presence of significant fibrosis F3-4 (median 21 in the F0-2 group vs. median 22 in the F3-4 group, P = 0.002). The AUC of NX-DCP showed no significant differences compared with those of the AST-to-platelet ratio index (APRI), modified APRI, the Göteborg University Cirrhosis Index (GUCI), the Lok index, the Hui score, the cirrhosis discriminating score (CDS) and the Pohl score (P > 0.05). NX-DCP correlated positively with fibrosis stage and could discriminate well between HCV-related patients with or without significant fibrosis. Moreover, NX-DCP had a similar predictive ability to the abovementioned models, and thereby could be a new noninvasive prediction tool for fibrosis.

  6. Surface tensions of multi-component mixed inorganic/organic aqueous systems of atmospheric significance: measurements, model predictions and importance for cloud activation predictions

    Directory of Open Access Journals (Sweden)

    D. O. Topping

    2006-11-01

    Full Text Available In order to predict the physical properties of aerosol particles, it is necessary to adequately capture the behaviour of the ubiquitous complex organic components. One of the key properties which may affect this behaviour is the contribution of the organic components to the surface tension of aqueous particles in the moist atmosphere. Whilst the qualitative effect of organic compounds on solution surface tensions has been widely reported, our quantitative understanding of mixed organic and mixed inorganic/organic systems is limited. Furthermore, it is unclear whether models that exist in the literature can reproduce the surface tension variability for binary and higher order multi-component organic and mixed inorganic/organic systems of atmospheric significance. The current study aims to resolve both issues to some extent. Surface tensions of single and multiple solute aqueous solutions were measured and compared with predictions from a number of model treatments. On comparison with binary organic systems, two predictive models found in the literature provided a range of values resulting from sensitivity to calculations of pure component surface tensions. Results indicate that a fitted model can capture the variability of the measured data very well, producing the lowest average percentage deviation for all compounds studied. The performance of the other models varies with compound and choice of model parameters. The behaviour of ternary mixed inorganic/organic systems was not reliably captured using a predictive scheme, and this was composition dependent. For more "realistic" higher order systems, entirely predictive schemes performed poorly. It was found that use of the binary data in a relatively simple mixing rule, or modification of an existing thermodynamic model with parameters derived from binary data, was able to accurately capture the surface tension variation with concentration. Thus, it would appear that in order to model
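
The "relatively simple mixing rule" mentioned above can be as basic as a mole-fraction-weighted average of pure-component (or binary-fitted) surface tensions. A hedged stdlib-Python illustration with invented values; real aqueous organic systems, especially surfactant-like solutes, deviate strongly from linearity:

```python
def mixture_surface_tension(mole_fractions, pure_sigmas):
    """Linear mixing rule: sigma_mix = sum_i x_i * sigma_i (e.g. in mN/m).

    A first approximation only; activity-based weighting or binary-fitted
    parameters are needed for strongly non-ideal mixtures.
    """
    if abs(sum(mole_fractions) - 1.0) > 1e-9:
        raise ValueError("mole fractions must sum to 1")
    return sum(x * s for x, s in zip(mole_fractions, pure_sigmas))
```

For example, an equimolar mix of components with surface tensions 72 and 30 mN/m gives 51 mN/m under this rule.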

  7. Surface tensions of multi-component mixed inorganic/organic aqueous systems of atmospheric significance: measurements, model predictions and importance for cloud activation predictions

    Directory of Open Access Journals (Sweden)

    D. O. Topping

    2007-01-01

    Full Text Available In order to predict the physical properties of aerosol particles, it is necessary to adequately capture the behaviour of the ubiquitous complex organic components. One of the key properties which may affect this behaviour is the contribution of the organic components to the surface tension of aqueous particles in the moist atmosphere. Whilst the qualitative effect of organic compounds on solution surface tensions has been widely reported, our quantitative understanding on mixed organic and mixed inorganic/organic systems is limited. Furthermore, it is unclear whether models that exist in the literature can reproduce the surface tension variability for binary and higher order multi-component organic and mixed inorganic/organic systems of atmospheric significance. The current study aims to resolve both issues to some extent. Surface tensions of single and multiple solute aqueous solutions were measured and compared with predictions from a number of model treatments. On comparison with binary organic systems, two predictive models found in the literature provided a range of values resulting from sensitivity to calculations of pure component surface tensions. Results indicate that a fitted model can capture the variability of the measured data very well, producing the lowest average percentage deviation for all compounds studied. The performance of the other models varies with compound and choice of model parameters. The behaviour of ternary mixed inorganic/organic systems was unreliably captured by using a predictive scheme and this was dependent on the composition of the solutes present. For more atmospherically representative higher order systems, entirely predictive schemes performed poorly. It was found that use of the binary data in a relatively simple mixing rule, or modification of an existing thermodynamic model with parameters derived from binary data, was able to accurately capture the surface tension variation with concentration. Thus

  8. The Real World Significance of Performance Prediction

    Science.gov (United States)

    Pardos, Zachary A.; Wang, Qing Yang; Trivedi, Shubhendu

    2012-01-01

    In recent years, the educational data mining and user modeling communities have been aggressively introducing models for predicting student performance on external measures such as standardized tests as well as within-tutor performance. While these models have brought statistically reliable improvement to performance prediction, the real world…

  9. Predicting missing links via significant paths

    CERN Document Server

    Zhu, Xuzhen; Cai, Shimin; Huang, Junming; Zhou, Tao

    2014-01-01

    Link prediction plays an important role in understanding the intrinsic evolving mechanisms of networks. With the belief that the likelihood of the existence of a link between two nodes is strongly related to their similarity, many methods have been proposed to calculate node similarity based on node attributes and/or topological structures. Among the large variety of methods that take into account paths connecting the target pair of nodes, most neglect the heterogeneity of those paths. Our hypothesis is that a path consisting of small-degree nodes provides strong evidence of similarity between its two ends; accordingly, we propose a so-called significant path index in this Letter to leverage intermediate nodes' degrees in the similarity calculation. Empirical experiments on twelve disparate real networks demonstrate that the proposed index outperforms mainstream link prediction baselines.
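
One plausible reading of such an index weights each simple path between the target pair by a length-decay factor while penalising high-degree intermediate nodes. The exact published weighting may differ, so treat this stdlib-Python sketch as illustrative only:

```python
def significant_path_score(adj, x, y, beta=0.5, max_len=3):
    """Similarity of (x, y): sum over simple paths of length <= max_len of
    beta**edges divided by the product of intermediate-node degrees, so
    paths through small-degree nodes contribute more."""
    score = 0.0
    stack = [(x, [x])]
    while stack:
        node, path = stack.pop()
        if node == y and len(path) > 1:
            w = beta ** (len(path) - 1)
            for u in path[1:-1]:          # intermediate nodes only
                w /= len(adj[u])          # degree penalty
            score += w
            continue                      # stop once the target is reached
        if len(path) - 1 == max_len:
            continue
        for nb in adj[node]:
            if nb not in path:            # simple paths: no revisits
                stack.append((nb, path + [nb]))
    return score
```

On a toy graph, a two-hop path through a degree-2 node scores higher than one through a degree-4 hub, matching the paper's hypothesis.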

  10. Analysis of significant factors for dengue fever incidence prediction.

    Science.gov (United States)

    Siriyasatien, Padet; Phumee, Atchara; Ongruk, Phatsavee; Jampachaisri, Katechan; Kesorn, Kraisak

    2016-04-16

    Many popular dengue forecasting techniques have been used by several researchers to extrapolate dengue incidence rates, including the K-H model, support vector machines (SVM), and artificial neural networks (ANN). The time series analysis methodology, particularly ARIMA and SARIMA, has been increasingly applied to the field of epidemiological research for dengue fever, dengue hemorrhagic fever, and other infectious diseases. The main drawback of these methods is that they do not consider other variables that are associated with the dependent variable. Additionally, new factors correlated to the disease are needed to enhance the prediction accuracy of the model when it is applied to areas of similar climates, where weather factors such as temperature, total rainfall, and humidity are not substantially different. Such drawbacks may consequently lower the predictive power for the outbreak. The predictive power of the forecasting model, assessed by Akaike's information criterion (AIC), the Bayesian information criterion (BIC), and the mean absolute percentage error (MAPE), is improved by including the new parameters for dengue outbreak prediction. This study's selected model outperforms all three other competing models with the lowest AIC, the lowest BIC, and a small MAPE value. The exclusive use of climate factors from similar locations decreases a model's prediction power. The multivariate Poisson regression, however, effectively forecasts even when climate variables are slightly different. Female mosquitoes and seasons were strongly correlated with dengue cases. Therefore, the dengue incidence trends provided by this model will assist the optimization of dengue prevention. The present work demonstrates the important roles of female mosquito infection rates from the previous season and climate factors (represented as seasons) in dengue outbreaks. Incorporating these two factors in the model significantly improves the predictive power of dengue hemorrhagic fever forecasting.
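
The three selection criteria named above have compact definitions, shown here in stdlib Python as a reference (our own illustration, not the study's code):

```python
import math

def mape(actual, predicted):
    """Mean absolute percentage error, in percent."""
    return 100.0 * sum(abs(a - p) / abs(a)
                       for a, p in zip(actual, predicted)) / len(actual)

def aic(log_likelihood, n_params):
    """Akaike's information criterion: 2k - 2 ln L (lower is better)."""
    return 2 * n_params - 2 * log_likelihood

def bic(log_likelihood, n_params, n_obs):
    """Bayesian information criterion: k ln n - 2 ln L (lower is better)."""
    return n_params * math.log(n_obs) - 2 * log_likelihood
```

Model selection as described then amounts to fitting each candidate and keeping the one with the lowest AIC/BIC and an acceptably small MAPE.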

  11. Significance of bioindicators to predict survival in irradiated minipigs.

    Science.gov (United States)

    Moroni, Maria; Port, Matthias; Koch, Amory; Gulani, Jatinder; Meineke, Viktor; Abend, Michael

    2014-06-01

    The minipig is emerging as a potential alternative non-rodent animal model. Several biological markers (e.g., blood counts, laboratory parameters, and clinical signs) have been proposed for rapid triage of radiation victims. Here, the authors focus on the significance of bio-indicators for prediction of survivors after irradiation and compare it with human data; the relationship between these biomarkers and radiation dose is not part of this study. Male Göttingen minipigs (age 4-5 mo, weight 9-10 kg) were irradiated (or sham-irradiated) bilaterally with gamma-photons (⁶⁰Co, 0.5-0.6 Gy min⁻¹) in the dose range of 1.6-12 Gy. Peripheral blood cell counts, laboratory parameters, and clinical symptoms were collected up to 10 d after irradiation and analyzed using logistic regression analysis and calculating ROC curves. In moribund pigs, parameters such as decreased lymphocyte/granulocyte counts, increased C-reactive protein, alkaline phosphatase values, as well as increased citrulline values and body temperature, significantly predicted non-survival (ROC > 0.8). However, most predictive within the first 3 d after exposure was a combination of decreased lymphocyte counts and increased body temperature observed as early as 3 h after radiation exposure (ROC: 0.93-0.96, p < 0.0001). Sham-irradiated animals (corresponding to "worried wells") could be easily discriminated from dying pigs, thus pointing to the diagnostic significance of this analysis. These data corroborate earlier findings from human radiation victims suffering from severe hematological syndrome and provide further evidence for the suitability of the minipig model as a potential alternative non-rodent animal model.

  12. Wind power prediction models

    Science.gov (United States)

    Levy, R.; Mcginness, H.

    1976-01-01

    Investigations were performed to predict the power available from the wind at the Goldstone, California, antenna site complex. The background for power prediction was derived from a statistical evaluation of available wind speed data records at this location and at nearby locations similarly situated within the Mojave desert. In addition to a model for power prediction over relatively long periods of time, an interim simulation model that produces sample wind speeds is described. The interim model furnishes uncorrelated sample speeds at hourly intervals that reproduce the statistical wind distribution at Goldstone. A stochastic simulation model to provide speed samples representative of both the statistical speed distributions and correlations is also discussed.
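
The stochastic simulation model described needs hourly samples that honour both the speed distribution and its hour-to-hour correlation; an AR(1) Gaussian driver, shifted and scaled, is the simplest such construction. A hedged stdlib-Python sketch (all parameters invented, not the Goldstone fit):

```python
import math, random

def simulate_speeds(n, rho, mean, sd, seed=0):
    """Hourly wind speeds from an AR(1) Gaussian process, shifted/scaled
    and floored at zero; rho sets the lag-1 autocorrelation."""
    rng = random.Random(seed)
    z = rng.gauss(0.0, 1.0)
    speeds = []
    for _ in range(n):
        z = rho * z + math.sqrt(1.0 - rho * rho) * rng.gauss(0.0, 1.0)
        speeds.append(max(0.0, mean + sd * z))
    return speeds
```

Setting rho = 0 recovers the uncorrelated "interim" model of the abstract; rho near 1 approximates persistent wind regimes. A faithful model would also map the Gaussian marginal onto the observed (e.g. Weibull-like) speed distribution.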

  13. Formalized prediction of clinically significant prostate cancer: is it possible?

    Institute of Scientific and Technical Information of China (English)

    Carvell T Nguyen; Michael W Kattan

    2012-01-01

    Greater understanding of the biology and epidemiology of prostate cancer in the last several decades has led to significant advances in its management. Prostate cancer is now detected in greater numbers at lower stages of disease and is amenable to multiple forms of efficacious treatment. However, there is a lack of conclusive data demonstrating a definitive mortality benefit from this earlier diagnosis and treatment of prostate cancer. This is likely due to the treatment of a large proportion of indolent cancers that would have had little adverse impact on health or lifespan if left alone. Due to this overtreatment phenomenon, active surveillance with delayed intervention is gaining traction as a viable management approach in contemporary practice. The ability to distinguish clinically insignificant cancers from those with a high risk of progression and/or lethality is critical to the appropriate selection of patients for surveillance protocols versus immediate intervention. This chapter will review the ability of various prediction models, including risk groupings and nomograms, to predict indolent disease and determine their role in the contemporary management of clinically localized prostate cancer.

  14. Melanoma risk prediction models

    Directory of Open Access Journals (Sweden)

    Nikolić Jelena

    2014-01-01

    Full Text Available Background/Aim. The lack of effective therapy for advanced stages of melanoma emphasizes the importance of preventive measures and screenings of population at risk. Identifying individuals at high risk should allow targeted screenings and follow-up involving those who would benefit most. The aim of this study was to identify the most significant factors for melanoma prediction in our population and to create prognostic models for identification and differentiation of individuals at risk. Methods. This case-control study included 697 participants (341 patients and 356 controls) that underwent extensive interview and skin examination in order to check risk factors for melanoma. Pairwise univariate statistical comparison was used for the coarse selection of the most significant risk factors. These factors were fed into logistic regression (LR) and alternating decision tree (ADT) prognostic models that were assessed for their usefulness in identification of patients at risk to develop melanoma. Validation of the LR model was done by the Hosmer-Lemeshow test, whereas the ADT was validated by 10-fold cross-validation. The achieved sensitivity, specificity, accuracy and AUC for both models were calculated. The melanoma risk score (MRS) based on the outcome of the LR model was presented. Results. The LR model showed that the following risk factors were associated with melanoma: sunbeds (OR = 4.018; 95% CI 1.724-9.366 for those that sometimes used sunbeds), solar damage of the skin (OR = 8.274; 95% CI 2.661-25.730 for those with severe solar damage), hair color (OR = 3.222; 95% CI 1.984-5.231 for light brown/blond hair), the number of common naevi (over 100 naevi had OR = 3.57; 95% CI 1.427-8.931), the number of dysplastic naevi (from 1 to 10 dysplastic naevi OR was 2.672; 95% CI 1.572-4.540; for more than 10 naevi OR was 6.487; 95% CI 1.993-21.119), Fitzpatrick's phototype and the presence of congenital naevi.
Red hair, phototype I and large congenital naevi were
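
A risk score of the kind described (MRS) is typically an additive sum of log odds ratios for the factors a person presents with. An illustrative stdlib-Python sketch using two of the odds ratios quoted above; the additive scheme here is our simplification, not the published MRS:

```python
import math

def melanoma_risk_score(odds_ratios, present_factors):
    """Additive risk score: sum of ln(OR) over the risk factors present.

    odds_ratios maps factor name -> odds ratio from a fitted LR model;
    present_factors lists the factors observed in the individual.
    """
    return sum(math.log(odds_ratios[f])
               for f in present_factors if f in odds_ratios)
```

For example, someone with both sunbed use (OR 4.018) and severe solar damage (OR 8.274) would score ln(4.018) + ln(8.274), about 3.5 log-odds units above baseline.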

  15. Predictive models in urology.

    Science.gov (United States)

    Cestari, Andrea

    2013-01-01

    Predictive modeling is emerging as an important knowledge-based technology in healthcare. The interest in the use of predictive modeling reflects advances on different fronts, such as the availability of health information from increasingly complex databases and electronic health records, a better understanding of causal or statistical predictors of health, disease processes and multifactorial models of ill-health, and developments in nonlinear computer models using artificial intelligence or neural networks. These new computer-based forms of modeling are increasingly able to establish technical credibility in clinical contexts. The current state of knowledge is still quite young in understanding the likely future direction of how this so-called 'machine intelligence' will evolve, and therefore how current relatively sophisticated predictive models will evolve in response to improvements in technology, which is advancing along a wide front. Predictive models in urology are gaining progressive popularity not only for academic and scientific purposes but also in clinical practice, with the introduction of several nomograms dealing with the main fields of onco-urology.

  16. Prediction of Extreme Significant Wave Height from Daily Maxima

    Institute of Scientific and Technical Information of China (English)

    刘德辅; 李华军; 温书勤; 宋艳; 王树青

    2001-01-01

    For prediction of the extreme significant wave height in ocean areas where long-term wave data are not available, the empirical method of extrapolating short-term data (1-3 years) is used in design practice. In this paper two methods are proposed to predict extreme significant wave height based on short-term daily maxima. According to the data recorded by the Oceanographic Station of Liaodong Bay at the Bohai Sea, it is supposed that daily maximum wave heights are statistically independent. The data show that daily maximum wave heights obey a log-normal distribution, and that the numbers of daily maxima vary from year to year, obeying a binomial distribution. Based on these statistical characteristics, the binomial-log-normal compound extremum distribution is derived for prediction of extreme significant wave heights (50-100 years). For examination of its accuracy and validity, the prediction of extreme wave heights is based on 12 years' data at this station, and on each 3 years' data respectively. The results show that with consideration of confidence intervals, the predicted wave heights based on 3 years' data are very close to those based on 12 years' data. The observed data in some ocean areas in the Atlantic Ocean and the North Sea show it is not correct to assume that daily maximum wave heights are statistically independent; they are subject to a Markov chain condition, obeying a log-normal distribution. In this paper an analytical method is derived to predict extreme wave heights in these cases. A comparison of the computations shows that the difference between the extreme wave heights based on the assumption that daily maxima are statistically independent and those based on the Markov chain condition is smaller than 10%.
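
The binomial-log-normal compound model described above can be checked by direct Monte Carlo: draw the number of recorded daily maxima in a year from a binomial law, draw each daily maximum from a log-normal, and take the yearly maximum. A hedged stdlib-Python sketch (all parameters invented, not fitted to the Liaodong Bay data):

```python
import random

def return_value(mu, sigma, p_storm, years_T, n_days=365, sims=3000, seed=1):
    """(1 - 1/T) quantile of the simulated annual-maximum significant wave
    height under a binomial-log-normal compound extremum model."""
    rng = random.Random(seed)
    annual_maxima = []
    for _ in range(sims):
        # number of daily maxima this year ~ Binomial(n_days, p_storm)
        k = sum(1 for _ in range(n_days) if rng.random() < p_storm)
        k = max(k, 1)  # guard against a year with no recorded daily maximum
        annual_maxima.append(max(rng.lognormvariate(mu, sigma)
                                 for _ in range(k)))
    annual_maxima.sort()
    return annual_maxima[int((1 - 1 / years_T) * sims) - 1]
```

The 50-year value is the 0.98 quantile of the annual-maximum distribution; as expected, it exceeds the 10-year value for the same parameters.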

  17. Predicting survival outcomes using subsets of significant genes in prognostic marker studies with microarrays

    Directory of Open Access Journals (Sweden)

    Matsui Shigeyuki

    2006-03-01

    Full Text Available Abstract Background Genetic markers hold great promise for refining our ability to establish precise prognostic prediction for diseases. The development of comprehensive gene expression microarray technology has allowed the selection of relevant marker genes from a large pool of candidate genes in early-phased, developmental prognostic marker studies. The primary analytical task in such studies is to select a small fraction of relevant genes, typically from a list of significant genes, for further investigation in subsequent studies. Results We develop a methodology for predicting survival outcomes using subsets of significant genes in prognostic marker studies with microarrays. Key components in this methodology include building prediction models, assessing predictive performance of prediction models, and assessing significance of prediction results. As particular specifications, we assume Cox proportional hazard models with a compound covariate. For assessing predictive accuracy, we propose to use the cross-validated log partial likelihood. To assess significance of prediction results, we apply permutation procedures in cross-validated prediction. As an additional key component peculiar to prognostic prediction, we also consider incorporation of standard prognostic factors. The methodology is evaluated using both simulated and real data. Conclusion The developed methodology for prognostic prediction using a subset of significant genes can provide new insights based on predictive capability, possibly incorporating standard prognostic factors, in selecting a fraction of relevant genes for subsequent studies.
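
Assessing significance of prediction results by permutation, as the methodology proposes, amounts to re-scoring under shuffled outcomes and asking how often chance matches the observed score. A generic stdlib-Python sketch; the paper's actual statistic is the cross-validated log partial likelihood of a Cox model, which we replace here with a user-supplied scoring function:

```python
import random

def permutation_pvalue(score_fn, x, y, n_perm=1000, seed=0):
    """One-sided permutation p-value: fraction of label shuffles whose
    score matches or beats the observed score (add-one smoothed)."""
    rng = random.Random(seed)
    observed = score_fn(x, y)
    y_perm = list(y)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(y_perm)
        if score_fn(x, y_perm) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)
```

In the paper's setting, score_fn would wrap the whole cross-validated model-building pipeline so the permutation accounts for gene selection as well.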

  18. MODEL PREDICTIVE CONTROL FUNDAMENTALS

    African Journals Online (AJOL)

    2012-07-02

    Jul 2, 2012 ... paper, we will present an introduction to the theory and application of MPC with Matlab codes written to ... model predictive control, linear systems, discrete-time systems, ... and then compute very rapidly for this open-loop control.
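
The open-loop computation referred to in the snippet is the heart of MPC: at each sampling instant, solve a finite-horizon optimal control problem from the current state and apply only the first move. For a scalar linear system with quadratic cost this reduces to a backward Riccati recursion; a stdlib-Python sketch (our own, not the paper's Matlab code):

```python
def mpc_control(x, a, b, q, r, horizon):
    """One MPC step for x+ = a*x + b*u with stage cost q*x^2 + r*u^2.

    Backward Riccati recursion over the horizon; returns the first
    control move of the optimal open-loop sequence.
    """
    p = q          # terminal cost weight
    k = 0.0
    for _ in range(horizon):
        k = (a * b * p) / (r + b * b * p)   # feedback gain at this stage
        p = q + a * a * p - a * b * k * p   # scalar Riccati update
    return -k * x  # gain from the stage farthest from the terminal time
```

Closing the loop (resolving at every step with the measured state) stabilises even an open-loop-unstable plant such as a = 1.2.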

  19. Nominal model predictive control

    OpenAIRE

    Grüne, Lars

    2013-01-01

    5 p., to appear in Encyclopedia of Systems and Control, Tariq Samad, John Baillieul (eds.); International audience; Model Predictive Control is a controller design method which synthesizes a sampled data feedback controller from the iterative solution of open loop optimal control problems. We describe the basic functionality of MPC controllers, their properties regarding feasibility, stability and performance and the assumptions needed in order to rigorously ensure these properties in a nomina...

  20. Nominal Model Predictive Control

    OpenAIRE

    Grüne, Lars

    2014-01-01

    5 p., to appear in Encyclopedia of Systems and Control, Tariq Samad, John Baillieul (eds.); International audience; Model Predictive Control is a controller design method which synthesizes a sampled data feedback controller from the iterative solution of open loop optimal control problems. We describe the basic functionality of MPC controllers, their properties regarding feasibility, stability and performance and the assumptions needed in order to rigorously ensure these properties in a nomina...

  1. Candidate Prediction Models and Methods

    DEFF Research Database (Denmark)

    Nielsen, Henrik Aalborg; Nielsen, Torben Skov; Madsen, Henrik

    2005-01-01

    This document lists candidate prediction models for Work Package 3 (WP3) of the PSO-project called ``Intelligent wind power prediction systems'' (FU4101). The main focus is on the models transforming numerical weather predictions into predictions of power production. The document also outlines the possibilities w.r.t. different numerical weather predictions actually available to the project.

  2. Predictive Surface Complexation Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Sverjensky, Dimitri A. [Johns Hopkins Univ., Baltimore, MD (United States). Dept. of Earth and Planetary Sciences

    2016-11-29

    Surface complexation plays an important role in the equilibria and kinetics of processes controlling the compositions of soilwaters and groundwaters, the fate of contaminants in groundwaters, and the subsurface storage of CO2 and nuclear waste. Over the last several decades, many dozens of individual experimental studies have addressed aspects of surface complexation that have contributed to an increased understanding of its role in natural systems. However, there has been no previous attempt to develop a model of surface complexation that can be used to link all the experimental studies in order to place them on a predictive basis. Overall, my research has successfully integrated the results of the work of many experimentalists published over several decades. For the first time in studies of the geochemistry of the mineral-water interface, a practical predictive capability for modeling has become available. The predictive correlations developed in my research now enable extrapolations of experimental studies to provide estimates of surface chemistry for systems not yet studied experimentally and for natural and anthropogenically perturbed systems.

  4. Predicting sulfotyrosine sites using the random forest algorithm with significantly improved prediction accuracy

    Directory of Open Access Journals (Sweden)

    Yang Zheng

    2009-10-01

    Full Text Available Abstract Background Tyrosine sulfation is one of the most important posttranslational modifications. Due to its relevance to various disease developments, tyrosine sulfation has become the target for drug design. In order to facilitate efficient drug design, accurate prediction of sulfotyrosine sites is desirable. A predictor published seven years ago has been very successful with claimed prediction accuracy of 98%. However, it has a particularly low sensitivity when predicting sulfotyrosine sites in some newly sequenced proteins. Results A new approach has been developed for predicting sulfotyrosine sites using the random forest algorithm after a careful evaluation of seven machine learning algorithms. Peptides are formed by consecutive residues symmetrically flanking tyrosine sites. They are then encoded using an amino acid hydrophobicity scale. This new approach has increased the sensitivity by 22%, the specificity by 3%, and the total prediction accuracy by 10% compared with the previous predictor using the same blind data. Meanwhile, both negative and positive predictive powers have been increased by 9%. In addition, the random forest model has an excellent feature for ranking the residues flanking tyrosine sites, hence providing more information for further investigating the tyrosine sulfation mechanism. A web tool has been implemented at http://ecsb.ex.ac.uk/sulfotyrosine for public use. Conclusion The random forest algorithm is able to deliver a better model compared with the Hidden Markov Model, the support vector machine, artificial neural networks, and others for predicting sulfotyrosine sites. The success shows that the random forest algorithm together with an amino acid hydrophobicity scale encoding can be a good candidate for peptide classification.
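
The encoding step described, residues symmetrically flanking a tyrosine mapped to a hydrophobicity scale, can be sketched in stdlib Python. We use the Kyte-Doolittle scale here; the padding of out-of-range positions with 0.0 is our own choice, not necessarily the paper's:

```python
# Kyte-Doolittle hydrophobicity scale (one-letter amino acid codes)
KD = {'I': 4.5, 'V': 4.2, 'L': 3.8, 'F': 2.8, 'C': 2.5, 'M': 1.9, 'A': 1.8,
      'G': -0.4, 'T': -0.7, 'S': -0.8, 'W': -0.9, 'Y': -1.3, 'P': -1.6,
      'H': -3.2, 'E': -3.5, 'Q': -3.5, 'D': -3.5, 'N': -3.5, 'K': -3.9,
      'R': -4.5}

def encode_tyrosine_site(seq, pos, flank=3):
    """Numeric feature window of 2*flank + 1 hydrophobicity values centred
    on a candidate tyrosine; positions beyond the sequence get 0.0."""
    if seq[pos] != 'Y':
        raise ValueError("position is not a tyrosine")
    return [KD[seq[i]] if 0 <= i < len(seq) else 0.0
            for i in range(pos - flank, pos + flank + 1)]
```

Such fixed-length windows, one per candidate site, are exactly the kind of input a random forest (or any of the other classifiers compared in the paper) can consume directly.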

  5. Significance of Classification Techniques in Prediction of Learning Disabilities

    CERN Document Server

    David, Julie M.; Balakrishnan, Kannan

    2010-01-01

    The aim of this study is to show the importance of two classification techniques, viz. decision tree and clustering, in prediction of learning disabilities (LD) of school-age children. LDs affect about 10 percent of all children enrolled in schools. The problems of children with specific learning disabilities have been a cause of concern to parents and teachers for some time. Decision trees and clustering are powerful and popular tools used for classification and prediction in data mining. Different rules extracted from the decision tree are used for prediction of learning disabilities. Clustering is the assignment of a set of observations into subsets, called clusters, which are useful in finding the different signs and symptoms (attributes) present in the LD-affected child. In this paper, the J48 algorithm is used for constructing the decision tree and the K-means algorithm is used for creating the clusters. By applying these classification techniques, LD in any child can be identified.
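    The clustering half of the approach above can be sketched as a plain Lloyd's-algorithm K-means. Note the paper uses Weka's K-means (and J48 for the tree); this stand-in only illustrates how symptom vectors are grouped, and the example points and initial centroids are invented.

```python
# Minimal K-means (Lloyd's algorithm): repeatedly assign each point to its
# nearest centroid, then move each centroid to the mean of its cluster.
def kmeans(points, centroids, iterations=10):
    for _ in range(iterations):
        clusters = [[] for _ in centroids]
        for p in points:
            nearest = min(
                range(len(centroids)),
                key=lambda j: sum((a - b) ** 2 for a, b in zip(p, centroids[j])),
            )
            clusters[nearest].append(p)
        # Empty clusters keep their previous centroid.
        centroids = [
            tuple(sum(dim) / len(cluster) for dim in zip(*cluster)) if cluster else centroids[j]
            for j, cluster in enumerate(clusters)
        ]
    return centroids
```

    With points such as attribute vectors of LD symptoms, the resulting clusters group children sharing similar sign/symptom profiles, as the abstract describes.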

  6. Predictive Models for Music

    OpenAIRE

    Paiement, Jean-François; Grandvalet, Yves; Bengio, Samy

    2008-01-01

    Modeling long-term dependencies in time series has proved very difficult to achieve with traditional machine learning methods. This problem occurs when considering music data. In this paper, we introduce generative models for melodies. We decompose melodic modeling into two subtasks. We first propose a rhythm model based on the distributions of distances between subsequences. Then, we define a generative model for melodies given chords and rhythms based on modeling sequences of Narmour featur...

  7. Predictive Maintenance (PdM) Centralization for Significant Energy Savings

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Dale

    2010-09-15

    Cost-effective predictive maintenance (PdM) technologies and basic energy calculations can mine energy savings from processes or maintenance activities. Centralizing and packaging this information correctly empowers facility maintenance and reliability professionals to build financial justification and support for strategies and personnel to weather global economic downturns and competition. Attendees will learn how to: Systematically build a 'pilot project' for applying PdM and tracking systems; Break down a typical electrical bill to calculate energy savings; Use return on investment (ROI) calculations to identify the best and highest value options, strategies and tips for substantiating your energy reduction maintenance strategies.

  8. Exploring the significance of human mobility patterns in social link prediction

    KAUST Repository

    Alharbi, Basma Mohammed

    2014-01-01

    Link prediction is a fundamental task in social networks. Recently, emphasis has been placed on forecasting new social ties using user mobility patterns, e.g., investigating physical and semantic co-locations as new proximity measures. This paper explores the effect of in-depth mobility patterns. Specifically, we study individuals' movement behavior, and quantify mobility on the basis of trip frequency, travel purpose and transportation mode. Our hybrid link prediction model is composed of two modules. The first module extracts mobility patterns, including travel purpose and mode, from raw trajectory data. The second module employs the extracted patterns for link prediction. We evaluate our method on two real data sets, GeoLife [15] and Reality Mining [5]. Experimental results show that our hybrid model significantly improves the accuracy of social link prediction, when compared with primary topology-based solutions. Copyright 2014 ACM.
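    One simple way the second module's idea can be sketched: score a candidate social tie by the similarity of the two users' extracted mobility patterns, here represented as per-purpose trip-frequency vectors. Cosine similarity is a stand-in for the paper's hybrid scoring, and the feature layout is an assumption.

```python
import math

def mobility_similarity(u, v):
    """Cosine similarity of two trip-frequency vectors (e.g., counts per
    travel purpose or transportation mode); higher suggests a likelier link."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0
```

    In a full pipeline such a mobility score would be combined with topology-based features (common neighbors, etc.) before ranking candidate links.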

  9. Zephyr - the prediction models

    DEFF Research Database (Denmark)

    Nielsen, Torben Skov; Madsen, Henrik; Nielsen, Henrik Aalborg

    2001-01-01

    This paper briefly describes new models and methods for predicting the wind power output from wind farms. The system is being developed in a project which has the research organization Risø and the department of Informatics and Mathematical Modelling (IMM) as the modelling team and all the Dani...

  10. Application of neural networks and support vector machine for significant wave height prediction

    Directory of Open Access Journals (Sweden)

    Jadran Berbić

    2017-07-01

    Full Text Available For the purposes of planning and operation of maritime activities, information about wave height dynamics is of great importance. In the paper, real-time prediction of significant wave heights for the following 0.5–5.5 h is provided, using information from 3 or more time points. In the first stage, predictions are made by varying the number of significant wave heights from previous time points, and various ways of using the data are discussed. Afterwards, in the best model according to the criteria of practicality and accuracy, the influence of wind is taken into account. Predictions are made using two machine learning methods – artificial neural networks (ANN) and support vector machine (SVM). The models were built using the built-in functions of the software Weka, developed by the University of Waikato, New Zealand.
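    The input construction described above can be sketched as follows: predict the significant wave height `horizon` steps ahead from the last `n_lags` observed heights. The resulting pairs would then be fed to an ANN or SVM regressor; appending wind speed to each input row would work the same way. Parameter defaults are illustrative, not the paper's.

```python
def make_supervised(series, n_lags=3, horizon=1):
    """Turn a wave-height series into (inputs, target) training pairs."""
    inputs, targets = [], []
    for t in range(n_lags - 1, len(series) - horizon):
        inputs.append(series[t - n_lags + 1 : t + 1])  # last n_lags heights
        targets.append(series[t + horizon])            # height to predict
    return inputs, targets
```

    Varying `n_lags` reproduces the paper's first-stage experiment of changing how many previous time points the model sees.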

  11. Confidence scores for prediction models

    DEFF Research Database (Denmark)

    Gerds, Thomas Alexander; van de Wiel, MA

    2011-01-01

    modelling strategy is applied to different training sets. For each modelling strategy we estimate a confidence score based on the same repeated bootstraps. A new decomposition of the expected Brier score is obtained, as well as the estimates of population average confidence scores. The latter can be used...... to distinguish rival prediction models with similar prediction performances. Furthermore, on the subject level a confidence score may provide useful supplementary information for new patients who want to base a medical decision on predicted risk. The ideas are illustrated and discussed using data from cancer...

  12. Modelling, controlling, predicting blackouts

    CERN Document Server

    Wang, Chengwei; Baptista, Murilo S

    2016-01-01

    The electric power system is one of the cornerstones of modern society. One of its most serious malfunctions is the blackout, a catastrophic event that may disrupt a substantial portion of the system, wreaking havoc on human life and causing great economic losses. Thus, understanding the mechanisms leading to blackouts and creating a reliable and resilient power grid has been a major issue, attracting the attention of scientists, engineers and stakeholders. In this paper, we study the blackout problem in power grids by considering a practical phase-oscillator model. This model allows one to simultaneously consider different types of power sources (e.g., traditional AC power plants and renewable power sources connected by DC/AC inverters) and different types of loads (e.g., consumers connected to distribution networks and consumers directly connected to power plants). We propose two new control strategies based on our model, one for traditional power grids, and another one for smart grids. The control strategie...
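    The kind of phase-oscillator dynamics this abstract refers to can be sketched with two coupled oscillators (say, a generator and a load bus): they phase-lock when the coupling K exceeds their natural-frequency gap, and lose synchrony otherwise. All numbers below are invented for illustration and are not the paper's model.

```python
import math

def simulate(omega1, omega2, K, dt=0.01, steps=5000):
    """Euler-integrate two coupled phase oscillators; return the final phase gap.
    Loss of phase-locking is the toy analogue of a desynchronization event."""
    th1, th2 = 0.0, 0.0
    for _ in range(steps):
        d = th2 - th1
        th1 += dt * (omega1 + K * math.sin(d))
        th2 += dt * (omega2 - K * math.sin(d))
    return th2 - th1
```

    For the gap dynamics d' = (omega2 - omega1) - 2K sin(d), locking requires |omega2 - omega1| <= 2K, so strong coupling settles at sin(d) = (omega2 - omega1) / (2K) while weak coupling lets the phases drift apart.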

  13. [Predictive significance of tissue eosinophilia for nasal polyp recurrence].

    Science.gov (United States)

    Wang, C S; Lou, H F; Meng, Y F; Piao, Y S; Zhang, L

    2016-04-07

    To evaluate the association between clinical parameters, especially tissue eosinophilia, and chronic sinusitis with nasal polyps (CRSwNP) recurrence. To identify optimal criteria of tissue eosinophilia as a predictor for recurrence. Two hundred and forty-eight CRSwNP patients were enrolled in this study. The demographic and clinical features were compared between recurrence and no-recurrence groups. Mucosal specimens were assessed for the presence of tissue inflammatory cells. Factors associated with polyp recurrence were analyzed by logistic regression analysis and the optimal cutoff point of the predictor for nasal polyp recurrence was determined by receiver operating characteristic curve. SPSS 19.0 software was used to analyze the data. The recurrence rate was 55.6% (138/248 patients) in this cohort. Tissue and peripheral eosinophilia, comorbid asthma, olfactory score and Lund-Mackay score were significantly correlated with polyp recurrence (all P<0.05). Tissue eosinophilia provides valuable information regarding polyp recurrence. Tissue eosinophil proportion equal to or over 27% may be regarded as the prognostic criterion for nasal polyp recurrence.
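    A cutoff like the 27% eosinophil proportion above is typically read off an ROC analysis by maximizing Youden's J = sensitivity + specificity - 1 over candidate thresholds, which can be sketched as follows. The toy scores and labels below are invented, not the study's data.

```python
def youden_cutoff(scores, labels):
    """Return (best_threshold, best_J), scanning thresholds taken from the scores.
    `labels` are 1 for recurrence, 0 for no recurrence."""
    pos = sum(labels)
    neg = len(labels) - pos
    best_t, best_j = None, float("-inf")
    for t in sorted(set(scores)):
        tpr = sum(1 for s, l in zip(scores, labels) if s >= t and l) / pos
        fpr = sum(1 for s, l in zip(scores, labels) if s >= t and not l) / neg
        if tpr - fpr > best_j:  # Youden's J = TPR - FPR
            best_t, best_j = t, tpr - fpr
    return best_t, best_j
```

    On real data the scores would be eosinophil proportions and the returned threshold the candidate prognostic criterion.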

  14. Melanoma Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing melanoma over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  15. Prediction models in complex terrain

    DEFF Research Database (Denmark)

    Marti, I.; Nielsen, Torben Skov; Madsen, Henrik

    2001-01-01

    are calculated using on-line measurements of power production as well as HIRLAM predictions as input thus taking advantage of the auto-correlation, which is present in the power production for shorter prediction horizons. Statistical models are used to describe the relationship between observed energy production......The objective of the work is to investigate the performance of HIRLAM in complex terrain when used as input to energy production forecasting models, and to develop a statistical model to adapt HIRLAM prediction to the wind farm. The features of the terrain, especially the topography, influence...... and HIRLAM predictions. The statistical models belong to the class of conditional parametric models. The models are estimated using local polynomial regression, but the estimation method is here extended to be adaptive in order to allow for slow changes in the system e.g. caused by the annual variations...

  16. Modelling vocal anatomy's significant effect on speech

    NARCIS (Netherlands)

    de Boer, B.

    2010-01-01

    This paper investigates the effect of larynx position on the articulatory abilities of a humanlike vocal tract. Previous work has investigated models that were built to resemble the anatomy of existing species or fossil ancestors. This has led to conflicting conclusions about the relation between

  18. Predictive models of forest dynamics.

    Science.gov (United States)

    Purves, Drew; Pacala, Stephen

    2008-06-13

    Dynamic global vegetation models (DGVMs) have shown that forest dynamics could dramatically alter the response of the global climate system to increased atmospheric carbon dioxide over the next century. But there is little agreement between different DGVMs, making forest dynamics one of the greatest sources of uncertainty in predicting future climate. DGVM predictions could be strengthened by integrating the ecological realities of biodiversity and height-structured competition for light, facilitated by recent advances in the mathematics of forest modeling, ecological understanding of diverse forest communities, and the availability of forest inventory data.

  19. Risk terrain modeling predicts child maltreatment.

    Science.gov (United States)

    Daley, Dyann; Bachmann, Michael; Bachmann, Brittany A; Pedigo, Christian; Bui, Minh-Thuy; Coffman, Jamye

    2016-12-01

    As indicated by research on the long-term effects of adverse childhood experiences (ACEs), maltreatment has far-reaching consequences for affected children. Effective prevention measures have been elusive, partly due to difficulty in identifying vulnerable children before they are harmed. This study employs Risk Terrain Modeling (RTM), an analysis of the cumulative effect of environmental factors thought to be conducive for child maltreatment, to create a highly accurate prediction model for future substantiated child maltreatment cases in the City of Fort Worth, Texas. The model is superior to commonly used hotspot predictions and more beneficial in aiding prevention efforts in a number of ways: 1) it identifies the highest risk areas for future instances of child maltreatment with improved precision and accuracy; 2) it aids the prioritization of risk-mitigating efforts by informing about the relative importance of the most significant contributing risk factors; 3) since predictions are modeled as a function of easily obtainable data, practitioners do not have to undergo the difficult process of obtaining official child maltreatment data to apply it; 4) the inclusion of a multitude of environmental risk factors creates a more robust model with higher predictive validity; and, 5) the model does not rely on a retrospective examination of past instances of child maltreatment, but adapts predictions to changing environmental conditions. The present study introduces and examines the predictive power of this new tool to aid prevention efforts seeking to improve the safety, health, and wellbeing of vulnerable children.
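    The cumulative-risk idea behind RTM described above can be sketched as overlaying several environmental risk layers on a common grid and ranking cells by combined weighted risk. The layers, weights, and cell values below are hypothetical, not the Fort Worth model's.

```python
def risk_terrain(layers, weights):
    """Combine equally-shaped grids of 0/1 environmental risk indicators
    into one weighted risk surface, returned as {cell_index: score}."""
    n_cells = len(layers[0])
    return {
        cell: sum(w * layer[cell] for layer, w in zip(layers, weights))
        for cell in range(n_cells)
    }

def top_cells(surface, k=2):
    """The k highest-risk cells, e.g., for prioritizing prevention resources."""
    return sorted(surface, key=surface.get, reverse=True)[:k]
```

    Because the layers are built from easily obtainable environmental data, the surface can be recomputed as conditions change, which is the adaptivity the abstract emphasizes.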

  20. Spatial Economics Model Predicting Transport Volume

    Directory of Open Access Journals (Sweden)

    Lu Bo

    2016-10-01

    Full Text Available It is extremely important to predict logistics requirements in a scientific and rational way. However, in recent years the improvement of prediction methods has not been very significant, and the traditional statistical prediction method has the defects of low precision and poor interpretability, which can neither guarantee the generalization ability of the prediction model theoretically nor explain the model effectively. Therefore, in combination with the theories of spatial economics, industrial economics, and neo-classical economics, and taking the city of Zhuanghe as the research object, the study identifies the leading industry that can produce a large number of cargoes, and further predicts the static logistics generation of Zhuanghe and its hinterland. By integrating various factors that can affect regional logistics requirements, this study established a logistics requirements potential model based on spatial economic principles, and expanded logistics requirements prediction from single statistical principles to a new area of spatial and regional economics.

  1. Evaluation of CASP8 model quality predictions

    KAUST Repository

    Cozzetto, Domenico

    2009-01-01

    The model quality assessment problem consists in the a priori estimation of the overall and per-residue accuracy of protein structure predictions. Over the past years, a number of methods have been developed to address this issue and CASP established a prediction category to evaluate their performance in 2006. In 2008 the experiment was repeated and its results are reported here. Participants were invited to infer the correctness of the protein models submitted by the registered automatic servers. Estimates could apply to both whole models and individual amino acids. Groups involved in the tertiary structure prediction categories were also asked to assign local error estimates to each predicted residue in their own models and their results are also discussed here. The correlation between the predicted and observed correctness measures was the basis of the assessment of the results. We observe that consensus-based methods still perform significantly better than those accepting single models, similarly to what was concluded in the previous edition of the experiment. © 2009 WILEY-LISS, INC.

  2. Predictive model for segmented poly(urea)

    Directory of Open Access Journals (Sweden)

    Frankl P.

    2012-08-01

    Full Text Available Segmented poly(urea) has been shown to be of significant benefit in protecting vehicles from blast and impact and there have been several experimental studies to determine the mechanisms by which this protective function might occur. One suggested route is by mechanical activation of the glass transition. In order to enable design of protective structures using this material a constitutive model and equation of state are needed for numerical simulation hydrocodes. Determination of such a predictive model may also help elucidate the beneficial mechanisms that occur in polyurea during high rate loading. The tool deployed to do this has been Group Interaction Modelling (GIM – a mean field technique that has been shown to predict the mechanical and physical properties of polymers from their structure alone. The structure of polyurea has been used to characterise the parameters in the GIM scheme without recourse to experimental data and the equation of state and constitutive model predicts response over a wide range of temperatures and strain rates. The shock Hugoniot has been predicted and validated against existing data. Mechanical response in tensile tests has also been predicted and validated.

  3. Modelling language evolution: Examples and predictions.

    Science.gov (United States)

    Gong, Tao; Shuai, Lan; Zhang, Menghan

    2014-06-01

    We survey recent computer modelling research of language evolution, focusing on a rule-based model simulating the lexicon-syntax coevolution and an equation-based model quantifying the language competition dynamics. We discuss four predictions of these models: (a) correlation between domain-general abilities (e.g. sequential learning) and language-specific mechanisms (e.g. word order processing); (b) coevolution of language and relevant competences (e.g. joint attention); (c) effects of cultural transmission and social structure on linguistic understandability; and (d) commonalities between linguistic, biological, and physical phenomena. All these contribute significantly to our understanding of the evolutions of language structures, individual learning mechanisms, and relevant biological and socio-cultural factors. We conclude the survey by highlighting three future directions of modelling studies of language evolution: (a) adopting experimental approaches for model evaluation; (b) consolidating empirical foundations of models; and (c) multi-disciplinary collaboration among modelling, linguistics, and other relevant disciplines.

  5. Mapping Soil Properties of Africa at 250 m Resolution: Random Forests Significantly Improve Current Predictions.

    Science.gov (United States)

    Hengl, Tomislav; Heuvelink, Gerard B M; Kempen, Bas; Leenaars, Johan G B; Walsh, Markus G; Shepherd, Keith D; Sila, Andrew; MacMillan, Robert A; Mendes de Jesus, Jorge; Tamene, Lulseged; Tondoh, Jérôme E

    2015-01-01

    80% of arable land in Africa has low soil fertility and suffers from physical soil problems. Additionally, significant amounts of nutrients are lost every year due to unsustainable soil management practices. This is partially the result of insufficient use of soil management knowledge. To help bridge the soil information gap in Africa, the Africa Soil Information Service (AfSIS) project was established in 2008. Over the period 2008-2014, the AfSIS project compiled two point data sets: the Africa Soil Profiles (legacy) database and the AfSIS Sentinel Site database. These data sets contain over 28 thousand sampling locations and represent the most comprehensive soil sample data sets of the African continent to date. Utilizing these point data sets in combination with a large number of covariates, we have generated a series of spatial predictions of soil properties relevant to the agricultural management--organic carbon, pH, sand, silt and clay fractions, bulk density, cation-exchange capacity, total nitrogen, exchangeable acidity, Al content and exchangeable bases (Ca, K, Mg, Na). We specifically investigate differences between two predictive approaches: random forests and linear regression. Results of 5-fold cross-validation demonstrate that the random forests algorithm consistently outperforms the linear regression algorithm, with average decreases of 15-75% in Root Mean Squared Error (RMSE) across soil properties and depths. Fitting and running random forests models takes an order of magnitude more time and the modelling success is sensitive to artifacts in the input data, but as long as quality-controlled point data are provided, an increase in soil mapping accuracy can be expected. Results also indicate that globally predicted soil classes (USDA Soil Taxonomy, especially Alfisols and Mollisols) help improve continental scale soil property mapping, and are among the most important predictors. This indicates a promising potential for transferring pedological
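    The 5-fold cross-validation comparison described above can be sketched with a generic fold evaluator: estimate a model's RMSE on held-out folds, then compare models on the same splits. The mean-predictor baseline below stands in for the paper's linear regression and random forest models, which require a real ML library; folds here are contiguous for simplicity, whereas a real evaluation would shuffle.

```python
import math

def kfold_rmse(xs, ys, fit, k=5):
    """Average held-out RMSE of `fit` over k contiguous folds.
    `fit(train_x, train_y)` must return a predictor function."""
    n = len(xs)
    fold_rmses = []
    for f in range(k):
        lo, hi = f * n // k, (f + 1) * n // k
        predict = fit(xs[:lo] + xs[hi:], ys[:lo] + ys[hi:])
        sq_errs = [(predict(x) - y) ** 2 for x, y in zip(xs[lo:hi], ys[lo:hi])]
        fold_rmses.append(math.sqrt(sum(sq_errs) / len(sq_errs)))
    return sum(fold_rmses) / k

def fit_mean(train_x, train_y):
    """Baseline model: always predict the training-set mean."""
    mean = sum(train_y) / len(train_y)
    return lambda x: mean
```

    The paper's "15-75% decrease in RMSE" figures come from exactly this kind of comparison: `kfold_rmse` run once per model per soil property and depth, on identical folds.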

  6. Atypia of undetermined significance and follicular lesions of undetermined significance: sonographic assessment for prediction of the final diagnosis.

    Science.gov (United States)

    Kamaya, Aya; Lewis, Gloria Huang; Liu, Yi; Akatsu, Haruko; Kong, Christina; Desser, Terry S

    2015-05-01

    To determine whether radiologic assessment of thyroid nodules can potentially help guide clinical management after a cytologic diagnosis of atypia of undetermined significance or a follicular lesion of undetermined significance. We identified 41 patients with 41 thyroid nodules initially diagnosed as atypia or follicular lesions of undetermined significance on fine-needle aspiration that were subsequently definitively diagnosed by either surgical resection or repeated fine-needle aspiration. All sonograms of nodules were reviewed by 2 blinded board-certified radiologists. Lesions were assessed in 3 ways: (1) Mayo pattern classification as benign, indeterminate, or worrisome for malignancy (Ultrasound Q 2005; 21:157-165); (2) thyroid imaging reporting and data system scores (scale of 1-5) based on 2 different previously published scoring criteria (Park et al [Thyroid 2009; 19:1257-1264] and Kwak et al [Radiology 2011; 260:892-899]); and (3) binary classification as benign or malignant. Of the 41 nodules, 25 had benign histologic findings, and 16 were malignant. Mayo pattern classification was 100% accurate for the benign score. Lesions with a Mayo score of indeterminate were malignant in 21% of cases (6 of 28) and benign in 79% (22 of 28). Lesions with a Mayo score of malignant were malignant in 91% of cases (10 of 11) and benign in 9% (1 of 11). Thyroid imaging reporting and data system scores had area under the receiver operating characteristic curve values of 0.827 for Park scores and 0.822 for Kwak scores. Radiologist binary classification of thyroid nodules showed 88% overall accuracy. Radiologist assessment of thyroid nodules in cases of atypia of undetermined significance or follicular lesions of undetermined significance is highly predictive of the final diagnosis and can help guide management of thyroid nodules of these pathologic types. © 2015 by the American Institute of Ultrasound in Medicine.

  7. PREDICT : model for prediction of survival in localized prostate cancer

    NARCIS (Netherlands)

    Kerkmeijer, Linda G W; Monninkhof, Evelyn M.; van Oort, Inge M.; van der Poel, Henk G.; de Meerleer, Gert; van Vulpen, Marco

    2016-01-01

    Purpose: Current models for prediction of prostate cancer-specific survival do not incorporate all present-day interventions. In the present study, a pre-treatment prediction model for patients with localized prostate cancer was developed.Methods: From 1989 to 2008, 3383 patients were treated with I

  8. Caries risk assessment models in caries prediction

    Directory of Open Access Journals (Sweden)

    Amila Zukanović

    2013-11-01

    Full Text Available Objective. The aim of this research was to assess the efficiency of different multifactor models in caries prediction. Material and methods. Data from the questionnaire and objective examination of 109 examinees was entered into the Cariogram, Previser and Caries-Risk Assessment Tool (CAT) multifactor risk assessment models. Caries risk was assessed with the help of all three models for each patient, classifying them as low, medium or high-risk patients. The development of new caries lesions over a period of three years [Decay Missing Filled Tooth (DMFT) increment = difference between Decay Missing Filled Tooth Surface (DMFTS) index at baseline and follow up], provided for examination of the predictive capacity concerning different multifactor models. Results. The data gathered showed that different multifactor risk assessment models give significantly different results (Friedman test: Chi square = 100.073, p=0.000). Cariogram is the model which identified the majority of examinees as medium risk patients (70%). The other two models were more radical in risk assessment, giving more unfavorable risk profiles for patients. In only 12% of the patients did the three multifactor models assess the risk in the same way. Previser and CAT gave the same results in 63% of cases – the Wilcoxon test showed that there is no statistically significant difference in caries risk assessment between these two models (Z = -1.805, p=0.071). Conclusions. Evaluation of three different multifactor caries risk assessment models (Cariogram, PreViser and CAT) showed that only the Cariogram can successfully predict new caries development in 12-year-old Bosnian children.

  9. Predictive Modeling of Cardiac Ischemia

    Science.gov (United States)

    Anderson, Gary T.

    1996-01-01

    The goal of the Contextual Alarms Management System (CALMS) project is to develop sophisticated models to predict the onset of clinical cardiac ischemia before it occurs. The system will continuously monitor cardiac patients and set off an alarm when they appear about to suffer an ischemic episode. The models take as inputs information from patient history and combine it with continuously updated information extracted from blood pressure, oxygen saturation and ECG lines. Expert system, statistical, neural network and rough set methodologies are then used to forecast the onset of clinical ischemia before it transpires, thus allowing early intervention aimed at preventing morbid complications from occurring. The models will differ from previous attempts by including combinations of continuous and discrete inputs. A commercial medical instrumentation and software company has invested funds in the project with a goal of commercialization of the technology. The end product will be a system that analyzes physiologic parameters and produces an alarm when myocardial ischemia is present. If proven feasible, a CALMS-based system will be added to existing heart monitoring hardware.

  10. Electrostatic ion thrusters - towards predictive modeling

    Energy Technology Data Exchange (ETDEWEB)

    Kalentev, O.; Matyash, K.; Duras, J.; Lueskow, K.F.; Schneider, R. [Ernst-Moritz-Arndt Universitaet Greifswald, D-17489 (Germany); Koch, N. [Technische Hochschule Nuernberg Georg Simon Ohm, Kesslerplatz 12, D-90489 Nuernberg (Germany); Schirra, M. [Thales Electronic Systems GmbH, Soeflinger Strasse 100, D-89077 Ulm (Germany)

    2014-02-15

    The development of electrostatic ion thrusters so far has mainly been based on empirical and qualitative know-how, and on evolutionary iteration steps. This resulted in considerable effort regarding prototype design, construction and testing and therefore in significant development and qualification costs and high time demands. For future developments it is anticipated to implement simulation tools which allow for quantitative prediction of ion thruster performance, long-term behavior and space craft interaction prior to hardware design and construction. Based on integrated numerical models combining self-consistent kinetic plasma models with plasma-wall interaction modules a new quality in the description of electrostatic thrusters can be reached. These open the perspective for predictive modeling in this field. This paper reviews the application of a set of predictive numerical modeling tools on an ion thruster model of the HEMP-T (High Efficiency Multi-stage Plasma Thruster) type patented by Thales Electron Devices GmbH. (copyright 2014 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim) (orig.)

  11. Numerical weather prediction model tuning via ensemble prediction system

    Science.gov (United States)

    Jarvinen, H.; Laine, M.; Ollinaho, P.; Solonen, A.; Haario, H.

    2011-12-01

    This paper discusses a novel approach to tune predictive skill of numerical weather prediction (NWP) models. NWP models contain tunable parameters which appear in parameterization schemes of sub-grid scale physical processes. Currently, numerical values of these parameters are specified manually. In a recent dual manuscript (QJRMS, revised) we developed a new concept and method for on-line estimation of the NWP model parameters. The EPPES ("Ensemble prediction and parameter estimation system") method requires only minimal changes to the existing operational ensemble prediction infrastructure and it seems very cost-effective because practically no new computations are introduced. The approach provides an algorithmic decision making tool for model parameter optimization in operational NWP. In EPPES, statistical inference about the NWP model tunable parameters is made by (i) generating each member of the ensemble of predictions using different model parameter values, drawn from a proposal distribution, and (ii) feeding back the relative merits of the parameter values to the proposal distribution, based on evaluation of a suitable likelihood function against verifying observations. In the presentation, the method is first illustrated in low-order numerical tests using a stochastic version of the Lorenz-95 model which effectively emulates the principal features of ensemble prediction systems. The EPPES method correctly detects the unknown and wrongly specified parameter values, and leads to an improved forecast skill. Second, results with an atmospheric general circulation model based ensemble prediction system show that the NWP model tuning capacity of EPPES scales up to realistic models and ensemble prediction systems. Finally, a global top-end NWP model tuning exercise with preliminary results is presented.
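    The two-step loop outlined above (sample parameters for the ensemble, then feed back likelihood weights into the proposal) can be sketched in heavily simplified scalar form. The "forecast model" here is just the parameter itself, and all numbers are invented for illustration; the real EPPES updates a full multivariate proposal inside an operational ensemble system.

```python
import math
import random

def eppes_step(mean, sigma, observation, n_members=200, seed=0):
    """One proposal update: (i) draw a parameter per ensemble member,
    (ii) weight each draw by a Gaussian likelihood against the verifying
    observation, then re-estimate the proposal mean from the weighted draws."""
    rng = random.Random(seed)
    draws = [rng.gauss(mean, sigma) for _ in range(n_members)]
    weights = [math.exp(-0.5 * (theta - observation) ** 2) for theta in draws]
    total = sum(weights)
    return sum(w * theta for w, theta in zip(weights, draws)) / total
```

    Iterating this step moves the proposal toward parameter values that verify well, without any extra forecast runs beyond the ensemble itself, which is the cost argument the abstract makes.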

  12. Using a Prediction Model to Manage Cyber Security Threats

    National Research Council Canada - National Science Library

    Jaganathan, Venkatesh; Cherurveettil, Priyesh; Muthu Sivashanmugam, Premapriya

    2015-01-01

    .... The cost impact due to worms, viruses, or other malicious software is significant. This paper proposes a mathematical model to predict the impact of an attack based on significant factors that influence cyber security...

  13. Predicting Intentions of a Familiar Significant Other Beyond the Mirror Neuron System

    Directory of Open Access Journals (Sweden)

    Stephanie Cacioppo

    2017-08-01

    Full Text Available Inferring the intentions of others is one of the most intriguing issues in interpersonal interaction. Theories of embodied cognition and simulation suggest that this mechanism takes place through a direct and automatic matching process between an observed action and past actions. This process occurs via the reactivation of past self-related sensorimotor experiences within the inferior frontoparietal network (including the mirror neuron system, MNS). The working model is that the anticipatory representations of others' behaviors require internal predictive models of actions formed from pre-established, shared representations between the observer and the actor. This model suggests that observers should be better at predicting intentions performed by a familiar actor than by a stranger. However, little is known about the modulations of the intention brain network as a function of the familiarity between the observer and the actor. Here, we combined functional magnetic resonance imaging (fMRI) with a behavioral intention inference task, in which participants were asked to predict intentions from three types of actors: a familiar actor (their significant other), themselves (another familiar actor), and a non-familiar actor (a stranger). Our results showed that the participants were better at inferring intentions performed by familiar actors than by non-familiar actors, and that this better performance was associated with greater activation within and beyond the inferior frontoparietal network, i.e., in brain areas related to familiarity (e.g., precuneus).
In addition, and in line with Hebbian principles of neural modulations, the more the participants reported being cognitively close to their partner, the less the brain areas associated with action self-other comparison (e.g., inferior parietal lobule, attention (e.g., superior parietal lobule, recollection (hippocampus, and pair bond (ventral tegmental area, VTA were recruited, suggesting that the

  14. A six-month longitudinal evaluation significantly improves accuracy of predicting incipient Alzheimer's disease in mild cognitive impairment.

    Science.gov (United States)

    Mubeen, Asim M; Asaei, Ali; Bachman, Alvin H; Sidtis, John J; Ardekani, Babak A

    2017-07-01

    Early prediction of incipient Alzheimer's disease (AD) dementia in individuals with mild cognitive impairment (MCI) is important for timely therapeutic intervention and for identifying participants for clinical trials at greater risk of developing AD. Methods to predict incipient AD in MCI have mostly utilized cross-sectional data. Longitudinal data enable estimation of the rate of change of variables, which, along with the variable levels, has been shown to improve prediction power. While some efforts have already been made in this direction, all previous longitudinal studies have been based on observation periods longer than one year, limiting their practical utility. It remains to be seen whether follow-up evaluations within shorter intervals can significantly improve the accuracy of prediction in this problem. Our aim was to determine the added value of incorporating 6-month longitudinal data for predicting progression from MCI to AD. Using 6-month longitudinal data from 247 participants with MCI, we trained two Random Forest classifiers to distinguish between progressive MCI (n=162) and stable MCI (n=85) cases. These models utilized structural MRI, neurocognitive assessments, and demographic information. The first model (cross-sectional) only used baseline data. The second model (longitudinal) used data from both baseline and a 6-month follow-up evaluation, allowing the model to additionally incorporate biomarkers' rates of change. The longitudinal model (AUC=0.87; accuracy=80.2%) performed significantly better (P<0.05) than the cross-sectional model (AUC=0.82; accuracy=71.7%). Short-term longitudinal assessments significantly enhance the performance of AD prediction models. Copyright © 2017 Elsevier Masson SAS. All rights reserved.
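    The longitudinal model's extra information is simply the per-biomarker rate of change estimable from two visits six months apart. A minimal sketch of the feature construction (the biomarker names and values are hypothetical, not the study's actual variables):

```python
def rate_of_change(baseline, followup, interval_years=0.5):
    # per-year slope estimated from baseline and 6-month follow-up values
    return (followup - baseline) / interval_years

def longitudinal_features(baseline_row, followup_row):
    # cross-sectional model: baseline levels only
    # longitudinal model: baseline levels plus per-biomarker slopes
    slopes = [rate_of_change(b, f) for b, f in zip(baseline_row, followup_row)]
    return list(baseline_row) + slopes

# e.g. a regional brain volume (cm^3) and a cognitive score at the two visits
feat = longitudinal_features([2.1, 30.0], [1.9, 28.0])
# feat ≈ [2.1, 30.0, -0.4, -4.0]: levels followed by annualized declines
```

    Both feature sets would then feed the respective Random Forest classifiers; only the second lets a model see each biomarker's trajectory.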

  15. Seasonal Predictability in a Model Atmosphere.

    Science.gov (United States)

    Lin, Hai

    2001-07-01

    The predictability of atmospheric mean-seasonal conditions in the absence of externally varying forcing is examined. A perfect-model approach is adopted, in which a global T21 three-level quasigeostrophic atmospheric model is integrated over 21 000 days to obtain a reference atmospheric orbit. The model is driven by a time-independent forcing, so that the only source of time variability is the internal dynamics. The forcing is set to perpetual winter conditions in the Northern Hemisphere (NH) and perpetual summer in the Southern Hemisphere. A significant temporal variability in the NH 90-day mean states is observed. The component of that variability associated with the higher-frequency motions, or climate noise, is estimated using a method developed by Madden. In the polar region, and to a lesser extent in the midlatitudes, the temporal variance of the winter means is significantly greater than the climate noise, suggesting some potential predictability in those regions. Forecast experiments are performed to see whether the presence of variance in the 90-day mean states in excess of the climate noise leads to some skill in the prediction of these states. Ensemble forecast experiments with nine members starting from slightly different initial conditions are performed for 200 different 90-day means along the reference atmospheric orbit. The serial correlation between the ensemble means and the reference orbit shows that there is skill in the 90-day mean predictions. The skill is concentrated in those regions of the NH that have the largest variance in excess of the climate noise. An EOF analysis shows that nearly all the predictive skill in the seasonal means is associated with one mode of variability with a strong axisymmetric component.

  16. Return Predictability, Model Uncertainty, and Robust Investment

    DEFF Research Database (Denmark)

    Lukas, Manuel

    Stock return predictability is subject to great uncertainty. In this paper we use the model confidence set approach to quantify uncertainty about expected utility from investment, accounting for potential return predictability. For monthly US data and six representative return prediction models, we...

  17. Statistical assessment of predictive modeling uncertainty

    Science.gov (United States)

    Barzaghi, Riccardo; Marotta, Anna Maria

    2017-04-01

    When the results of geophysical models are compared with data, the uncertainties of the model are typically disregarded. We propose a method for defining the uncertainty of a geophysical model based on a numerical procedure that estimates the empirical auto- and cross-covariances of model-estimated quantities. These empirical values are then fitted by proper covariance functions and used to compute the covariance matrix associated with the model predictions. The method is tested using a geophysical finite element model in the Mediterranean region. Using a novel χ2 analysis in which both data and model uncertainties are taken into account, the model's estimated tectonic strain pattern due to the Africa-Eurasia convergence in the area that extends from the Calabrian Arc to the Alpine domain is compared with that estimated from GPS velocities, taking into account the model uncertainty through its covariance structure and the covariance of the GPS estimates. The results indicate that including the estimated model covariance in the testing procedure leads to lower observed χ2 values with better statistical significance, and may help sharpen the identification of the best-fitting geophysical models.
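    The test statistic amounts to χ² = rᵀ(C_data + C_model)⁻¹ r for the residual r between data and model. A small numeric sketch with a two-component residual and illustrative covariances (all values are invented, not from the paper):

```python
def inv2(M):
    # inverse of a 2x2 matrix
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def chi2(data, model, C_data, C_model):
    # chi-square of the residual r = data - model against the combined
    # covariance C_data + C_model
    r = [data[i] - model[i] for i in range(2)]
    C = [[C_data[i][j] + C_model[i][j] for j in range(2)] for i in range(2)]
    Ci = inv2(C)
    return sum(r[i] * Ci[i][j] * r[j] for i in range(2) for j in range(2))

d = [1.0, 2.0]                        # hypothetical observed quantities
m = [0.8, 2.3]                        # corresponding model predictions
C_data = [[0.04, 0.0], [0.0, 0.04]]   # data (e.g. GPS) covariance
C_zero = [[0.0, 0.0], [0.0, 0.0]]
C_model = [[0.05, 0.01], [0.01, 0.05]]

chi2_ignoring_model = chi2(d, m, C_data, C_zero)   # model uncertainty disregarded
chi2_with_model = chi2(d, m, C_data, C_model)
# including C_model lowers the observed chi-square, as the abstract reports
```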

  18. Mitral annular calcification and aortic valve calcification may help in predicting significant coronary artery disease.

    Science.gov (United States)

    Acartürk, Esmeray; Bozkurt, Abdi; Cayli, Murat; Demir, Mesut

    2003-01-01

    Mitral annular calcification (MAC) and aortic valve calcification (AVC) are manifestations of atherosclerosis. To determine whether mitral annular calcification and aortic valve calcification detected by transthoracic echocardiography (TTE) might help in predicting significant coronary artery disease (CAD), 123 patients with significant CAD and 93 patients without CAD detected by coronary angiography were investigated. MAC identified CAD with a sensitivity of 60.2% and a specificity of 55.9%, and AVC with a sensitivity of 74.8% and a specificity of 52.7%; the corresponding negative and positive predictive values were 51.5% and 64.3% for MAC, and 61.3% and 67.6% for AVC. The positive predictive value of MAC was greater than that of gender, hypertension, and hypercholesterolemia. AVC showed a positive predictive value greater than that of gender, hypertension, family history, and hypercholesterolemia. The negative predictive values of MAC and AVC for CAD were greater than those of all risk factors except diabetes mellitus. In conclusion, the presence of MAC and AVC on TTE may help in predicting CAD and should be added to conventional risk factors. The absence of MAC and AVC is a stronger predictor of the absence of CAD than all conventional risk factors except diabetes mellitus. Patients with MAC and AVC should be taken into consideration for the presence of significant CAD and thereby for diagnostic and therapeutic interventions in order to improve the prognosis.
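    All four figures of merit come from a single 2x2 table. The sketch below uses hypothetical counts for MAC, chosen so that the 123 CAD / 93 non-CAD totals and the quoted MAC percentages are approximately reproduced (the exact counts are not given in the abstract):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    # standard definitions from the 2x2 confusion table
    return {
        "sensitivity": tp / (tp + fn),  # true positives among diseased
        "specificity": tn / (tn + fp),  # true negatives among non-diseased
        "ppv": tp / (tp + fp),          # disease probability given a positive test
        "npv": tn / (tn + fn),          # no-disease probability given a negative test
    }

# hypothetical reconstruction for MAC: 74+49 = 123 CAD, 41+52 = 93 non-CAD
mac = diagnostic_metrics(tp=74, fp=41, fn=49, tn=52)
# sensitivity ≈ 0.602, specificity ≈ 0.559, ppv ≈ 0.643, npv ≈ 0.515
```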

  19. Predictive Model Assessment for Count Data

    Science.gov (United States)

    2007-09-05

    critique count regression models for patent data, and assess the predictive performance of Bayesian age-period-cohort models for larynx cancer counts in Germany, 1998–2002. We consider a recent suggestion by Baker and...

  20. Predictability of the Indian Ocean Dipole in the coupled models

    Science.gov (United States)

    Liu, Huafeng; Tang, Youmin; Chen, Dake; Lian, Tao

    2017-03-01

    In this study, the Indian Ocean Dipole (IOD) predictability, measured by the Indian Dipole Mode Index (DMI), is comprehensively examined at the seasonal time scale, including its actual prediction skill and potential predictability, using the ENSEMBLES multi-model ensembles and the recently developed information-based theoretical framework of predictability. It was found that all model predictions have useful skill, normally defined as an anomaly correlation coefficient larger than 0.5, only at around 2-3 month leads. This is mainly because there are more false alarms in predictions as lead time increases. The DMI predictability has significant seasonal variation, and the predictions whose target seasons are boreal summer (JJA) and autumn (SON) are more reliable than those for other seasons. All of the models fail to predict the IOD onset before May and suffer from the winter (DJF) predictability barrier. The potential predictability study indicates that, with improvements in model development and initialization, the prediction of IOD onset is likely to improve, but the winter barrier cannot be overcome. The IOD predictability also has decadal variation, with high skill during the 1960s and the early 1990s, and low skill during the early 1970s and early 1980s, which is very consistent with the potential predictability. The main factors controlling the IOD predictability, including its seasonal and decadal variations, are also analyzed in this study.
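    The skill measure used here, the anomaly correlation coefficient (ACC), is the correlation between forecast and observed departures from climatology. A minimal sketch (the DMI-like values are invented for illustration):

```python
import math

def anomaly_correlation(forecast, observed, climatology):
    # correlate forecast and observed anomalies relative to climatology
    fa = [f - c for f, c in zip(forecast, climatology)]
    oa = [o - c for o, c in zip(observed, climatology)]
    num = sum(f * o for f, o in zip(fa, oa))
    den = math.sqrt(sum(f * f for f in fa) * sum(o * o for o in oa))
    return num / den

clim = [27.0, 27.5, 28.0, 28.5]   # hypothetical climatology
obs = [27.4, 27.9, 27.8, 28.2]    # hypothetical observed values
fcst = [27.3, 27.8, 27.9, 28.3]   # a forecast tracking the observed anomalies
skill = anomaly_correlation(fcst, obs, clim)
# skill > 0.5 would count as "useful" by the criterion above
```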

  1. Significance of High Resolution GHRSST on prediction of Indian Summer Monsoon

    KAUST Repository

    Jangid, Buddhi Prakash

    2017-02-24

    In this study, the Weather Research and Forecasting (WRF) model was used to assess the importance of very high resolution sea surface temperature (SST) on seasonal rainfall prediction. Two different SST datasets, available from the National Centers for Environmental Prediction (NCEP) global model analysis and the merged satellite product from the Group for High Resolution SST (GHRSST), are used as lower boundary conditions in the WRF model for the Indian Summer Monsoon (ISM) 2010. Before using NCEP SST and GHRSST for model simulation, an initial verification of NCEP SST and GHRSST against buoy measurements is performed. A root mean square difference (RMSD) of approximately 0.4 K is found for GHRSST and NCEP SST when compared with buoy observations available over the Indian Ocean during 01 May to 30 September 2010. Our analyses suggest that the use of GHRSST as the lower boundary condition in the WRF model improves low-level temperature, moisture, wind speed, and rainfall prediction over the ISM region. Moreover, the temporal evolution of surface parameters such as temperature, moisture, and wind speed forecasts associated with the monsoon is also improved with GHRSST forcing as a lower boundary condition. Interestingly, rainfall prediction is improved with the use of GHRSST over the Western Ghats, which is mostly not simulated in the NCEP SST based experiment.
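    The quoted 0.4 K figure is a root mean square difference against co-located buoy readings. A minimal sketch with hypothetical SST values in kelvin:

```python
import math

def rmsd(analysis, buoy):
    # root mean square difference between an SST analysis and buoy observations
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(analysis, buoy)) / len(buoy))

buoy_sst = [301.2, 301.0, 300.8, 301.5]   # hypothetical buoy readings
ghrsst   = [301.0, 301.3, 300.9, 301.1]   # co-located analysis values
print(round(rmsd(ghrsst, buoy_sst), 2))   # prints 0.27
```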

  2. Nonlinear chaotic model for predicting storm surges

    Directory of Open Access Journals (Sweden)

    M. Siek

    2010-09-01

    Full Text Available This paper addresses the use of the methods of nonlinear dynamics and chaos theory for building a predictive chaotic model from time series. The chaotic model predictions are made by adaptive local models based on the dynamical neighbors found in the reconstructed phase space of the observables. We implemented univariate and multivariate chaotic models with direct and multi-step prediction techniques and optimized these models using an exhaustive search method. The built models were tested on predicting storm surge dynamics for different stormy conditions in the North Sea, and are compared to neural network models. The results show that the chaotic models can generally provide reliable and accurate short-term storm surge predictions.
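    The core of such a model, time-delay embedding plus a local model over dynamical neighbours, fits in a few lines. A zeroth-order sketch (the paper's adaptive local models and exhaustive parameter search are richer than this):

```python
def embed(series, dim, tau):
    # reconstruct phase-space vectors by time-delay embedding
    return [tuple(series[i - k * tau] for k in range(dim))
            for i in range((dim - 1) * tau, len(series))]

def predict_next(series, dim=3, tau=1, k=4):
    vectors = embed(series, dim, tau)
    query = vectors[-1]            # current state in the reconstructed space

    def dist(v):
        return sum((a - b) ** 2 for a, b in zip(v, query))

    # dynamical neighbours: the closest past states (the query itself excluded)
    candidates = sorted(enumerate(vectors[:-1]), key=lambda iv: dist(iv[1]))
    neighbours = candidates[:k]
    # local zeroth-order model: average the neighbours' successors
    offset = (dim - 1) * tau
    successors = [series[offset + i + 1] for i, _ in neighbours]
    return sum(successors) / len(successors)

# a perfectly periodic toy "surge" signal: the next value is predictable
print(predict_next([0, 1, 0, -1] * 10))
```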

  3. Nonlinear chaotic model for predicting storm surges

    NARCIS (Netherlands)

    Siek, M.; Solomatine, D.P.

    This paper addresses the use of the methods of nonlinear dynamics and chaos theory for building a predictive chaotic model from time series. The chaotic model predictions are made by the adaptive local models based on the dynamical neighbors found in the reconstructed phase space of the observables.

  4. Parameter definition using vibration prediction software leads to significant drilling performance improvements

    Energy Technology Data Exchange (ETDEWEB)

    Amorim, Dalmo; Hanley, Chris Hanley; Fonseca, Isaac; Santos, Juliana [National Oilwell Varco, Houston TX (United States); Leite, Daltro J.; Borella, Augusto; Gozzi, Danilo [Petroleo Brasileiro S.A. (PETROBRAS), Rio de Janeiro, RJ (Brazil)

    2012-07-01

    The understanding and mitigation of downhole vibration has been a heavily researched subject in the oil industry, as it results in more expensive drilling operations: vibrations significantly diminish the amount of effective drilling energy available to the bit and generate forces that can push the bit or the Bottom Hole Assembly (BHA) off its concentric axis of rotation, producing high-magnitude impacts with the borehole wall. In order to drill ahead, a sufficient amount of energy must be supplied by the rig to overcome the resistance of the drilling system, including the reactive torque of the system, drag forces, fluid pressure losses, and energy dissipated by downhole vibrations, thereby providing the bit with the energy required to fail the rock. If the drill string enters resonant modes of vibration, it not only decreases the amount of energy available to drill, but also increases the potential for catastrophic failures of downhole equipment and drilling bits. In this sense, the mitigation of downhole vibrations will result in faster, smoother, and cheaper drilling operations. A software tool using Finite Element Analysis (FEA) has been developed to provide a better understanding of downhole vibration phenomena in drilling environments. The software tool calculates the response of the drilling system at various input conditions, based on the design of the wellbore along with the geometry of the Bottom Hole Assembly (BHA) and the drill string. It identifies where undesired levels of resonant vibration will be driven by certain combinations of specific drilling parameters, and also which combinations of drilling parameters will result in lower levels of vibration, so that the fewest shocks, the highest penetration rate, and the lowest cost per foot can be achieved. With the growing performance of personal computers, complex software systems modeling drilling vibrations using FEA have become accessible to a wider audience of field users, further complementing with real time

  5. EFFICIENT PREDICTIVE MODELLING FOR ARCHAEOLOGICAL RESEARCH

    OpenAIRE

    Balla, A.; Pavlogeorgatos, G.; Tsiafakis, D.; Pavlidis, G.

    2014-01-01

    The study presents a general methodology for designing, developing and implementing predictive modelling for identifying areas of archaeological interest. The methodology is based on documented archaeological data and geographical factors, geospatial analysis and predictive modelling, and has been applied to the identification of possible Macedonian tombs’ locations in Northern Greece. The model was tested extensively and the results were validated using a commonly used predictive gain,...
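    The "commonly used predictive gain" in archaeological predictive modelling is typically Kvamme's gain statistic, which compares the fraction of the landscape flagged as high-potential with the fraction of known sites it captures. A minimal sketch (the 20%/70% figures are invented for illustration):

```python
def kvamme_gain(area_fraction, site_fraction):
    # 1 - (proportion of study area flagged high-potential /
    #      proportion of known sites falling inside that area)
    return 1.0 - area_fraction / site_fraction

# e.g. a model flagging 20% of the landscape that contains 70% of known tombs
gain = kvamme_gain(area_fraction=0.20, site_fraction=0.70)
# gain ≈ 0.71; values near 1 mean a compact zone capturing most sites,
# values near 0 mean no better than chance
```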

  6. How to Establish Clinical Prediction Models

    Directory of Open Access Journals (Sweden)

    Yong-ho Lee

    2016-03-01

    Full Text Available A clinical prediction model can be applied to several challenging clinical scenarios: screening high-risk individuals for asymptomatic disease, predicting future events such as disease or death, and assisting medical decision-making and health education. Despite the impact of clinical prediction models on practice, prediction modeling is a complex process requiring careful statistical analyses and sound clinical judgement. Although there is no definite consensus on the best methodology for model development and validation, a few recommendations and checklists have been proposed. In this review, we summarize five steps for developing and validating a clinical prediction model: preparation for establishing clinical prediction models; dataset selection; handling variables; model generation; and model evaluation and validation. We also review several studies that detail methods for developing clinical prediction models with comparable examples from real practice. After model development and rigorous validation in relevant settings, possibly with evaluation of utility/usability and fine-tuning, good models can be ready for use in practice. We anticipate that this framework will revitalize the use of predictive or prognostic research in endocrinology, leading to active applications in real clinical practice.

  7. Sixth hour transcutaneous bilirubin predicting significant hyperbilirubinemia in ABO incompatible neonates

    Institute of Scientific and Technical Information of China (English)

    Ramesh Y Bhat; Pavan CG Kumar

    2014-01-01

    Background: Neonates with ABO hemolytic disease are at greater risk of developing significant hyperbilirubinemia. We aimed to determine whether sixth hour transcutaneous bilirubin (TcB) could predict such a risk. Methods: TcB measurements were obtained at the 6th hour of life in blood group A or B neonates born to blood group O, rhesus factor compatible mothers. Subsequent hyperbilirubinemia was monitored and considered significant if a neonate required phototherapy/exchange transfusion. The predictive role of sixth hour TcB was estimated. Results: Of 144 ABO incompatible neonates, 41 (O-A, 24; O-B, 17) had significant hyperbilirubinemia. Mean sixth hour TcB was significantly higher among neonates who developed significant hyperbilirubinemia than among those who did not (5.83±1.35 mg/dL vs. 3.65±0.96 mg/dL). A TcB cutoff of >4 mg/dL had the highest sensitivity (93.5%), and >6 mg/dL had the highest specificity (99%). The area under the receiver operating characteristic curve was 0.898. Conclusion: Sixth hour TcB predicts subsequent significant hyperbilirubinemia in ABO incompatible neonates.
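    An area under the ROC curve like the 0.898 reported here can be computed without tracing the curve, via its Mann-Whitney interpretation: the probability that a randomly chosen case scores higher than a randomly chosen control. A sketch with hypothetical TcB values (mg/dL), not the study's data:

```python
def auc(case_scores, control_scores):
    # Mann-Whitney form of the ROC area: P(case score > control score),
    # counting ties as one half
    wins = 0.0
    for c in case_scores:
        for d in control_scores:
            if c > d:
                wins += 1.0
            elif c == d:
                wins += 0.5
    return wins / (len(case_scores) * len(control_scores))

hyper = [5.9, 6.4, 4.8, 7.1]          # neonates who later needed phototherapy
no_hyper = [3.2, 4.1, 3.9, 2.8, 4.9]  # neonates who did not
print(auc(hyper, no_hyper))           # prints 0.95
```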

  8. Artificial Neural Network Model for Predicting Compressive

    Directory of Open Access Journals (Sweden)

    Salim T. Yousif

    2013-05-01

    Full Text Available Compressive strength of concrete is a commonly used criterion in evaluating concrete. Although testing of the compressive strength of concrete specimens is done routinely, it is performed on the 28th day after concrete placement. Therefore, strength estimation of concrete at an early age is highly desirable. This study presents the effort in applying neural network-based system identification techniques to predict the compressive strength of concrete based on concrete mix proportions, maximum aggregate size (MAS), and slump of fresh concrete. A back-propagation neural network model is successively developed, trained, and tested using actual data sets of concrete mix proportions gathered from the literature. Testing the model with unused data within the range of the input parameters shows that the maximum absolute error of the model is about 20%, and 88% of the output results have absolute errors less than 10%. The parametric study shows that the water/cement ratio (w/c) is the most significant factor affecting the output of the model. The results showed that neural networks have strong potential as a feasible tool for predicting the compressive strength of concrete.

  9. Comparison of Prediction-Error-Modelling Criteria

    DEFF Research Database (Denmark)

    Jørgensen, John Bagterp; Jørgensen, Sten Bay

    2007-01-01

    Single and multi-step prediction-error-methods based on the maximum likelihood and least squares criteria are compared. The prediction-error methods studied are based on predictions using the Kalman filter and Kalman predictors for a linear discrete-time stochastic state space model, which is a realization of a continuous-discrete multivariate stochastic transfer function model. The proposed prediction-error methods are demonstrated for a SISO system parameterized by the transfer functions with time delays of a continuous-discrete-time linear stochastic system. The simulations for this case suggest... computational resources. The identification method is suitable for predictive control.

  10. Model predictive control of MSMPR crystallizers

    Science.gov (United States)

    Moldoványi, Nóra; Lakatos, Béla G.; Szeifert, Ferenc

    2005-02-01

    A multi-input-multi-output (MIMO) control problem of isothermal continuous crystallizers is addressed in order to create an adequate model-based control system. The moment equation model of mixed suspension, mixed product removal (MSMPR) crystallizers that forms a dynamical system is used, the state of which is represented by a vector of six variables: the first four leading moments of the crystal size, the solute concentration, and the solvent concentration. Hence, the time evolution of the system occurs in a bounded region of the six-dimensional phase space. The controlled variables are the mean grain size and the crystal size distribution; the manipulated variables are the input concentration of the solute and the flow rate. The controllability and observability, as well as the coupling between the inputs and the outputs, were analyzed by simulation using the linearized model. It is shown that the crystallizer is a nonlinear MIMO system with strong coupling between the state variables. Considering the possibilities of model reduction, a third-order model was found quite adequate for the model estimation in model predictive control (MPC). The mean crystal size and the variance of the size distribution can be nearly separately controlled by the residence time and the inlet solute concentration, respectively. By seeding, the controllability of the crystallizer increases significantly, and the overshoots and oscillations become smaller. The results of the control study have shown that linear MPC is an adaptable and feasible controller for continuous crystallizers.
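    The receding-horizon idea behind MPC can be shown on a toy first-order plant (nothing crystallizer-specific here: the paper's moment-equation model is sixth order and nonlinear). At each sample a crude optimizer picks the input minimizing the predicted tracking cost over a short horizon, and only the first move is applied:

```python
A, B = 0.8, 0.5  # hypothetical discrete-time plant: x[k+1] = A*x[k] + B*u[k]

def predict(x, u, horizon):
    # predicted state trajectory under a constant input u
    traj = []
    for _ in range(horizon):
        x = A * x + B * u
        traj.append(x)
    return traj

def mpc_step(x, setpoint, horizon=5):
    # crude optimizer: grid search over candidate inputs; a real MPC would
    # solve a (possibly constrained) optimization over the input sequence
    candidates = [i / 10.0 for i in range(-20, 21)]

    def cost(u):
        return sum((xi - setpoint) ** 2 for xi in predict(x, u, horizon))

    return min(candidates, key=cost)

x, setpoint = 0.0, 1.0
for _ in range(30):          # receding horizon: re-optimize at every sample
    u = mpc_step(x, setpoint)
    x = A * x + B * u        # apply only the first move
print(round(x, 2))  # settles near the setpoint
```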

  11. Atypia of undetermined significance and follicular lesions of undetermined significance: sonographic assessment for prediction of the final diagnosis

    National Research Council Canada - National Science Library

    Kamaya, Aya; Lewis, Gloria Huang; Liu, Yi; Akatsu, Haruko; Kong, Christina; Desser, Terry S

    2015-01-01

    To determine whether radiologic assessment of thyroid nodules can potentially help guide clinical management after a cytologic diagnosis of atypia of undetermined significance or a follicular lesion...

  12. Case studies in archaeological predictive modelling

    NARCIS (Netherlands)

    Verhagen, Jacobus Wilhelmus Hermanus Philippus

    2007-01-01

    In this thesis, a collection of papers is put together dealing with various quantitative aspects of predictive modelling and archaeological prospection. Among the issues covered are the effects of survey bias on the archaeological data used for predictive modelling, and the complexities of testing p

  13. Childhood asthma prediction models: a systematic review.

    Science.gov (United States)

    Smit, Henriette A; Pinart, Mariona; Antó, Josep M; Keil, Thomas; Bousquet, Jean; Carlsen, Kai H; Moons, Karel G M; Hooft, Lotty; Carlsen, Karin C Lødrup

    2015-12-01

    Early identification of children at risk of developing asthma at school age is crucial, but the usefulness of childhood asthma prediction models in clinical practice is still unclear. We systematically reviewed all existing prediction models to identify preschool children with asthma-like symptoms at risk of developing asthma at school age. Studies were included if they developed a new prediction model or updated an existing model in children aged 4 years or younger with asthma-like symptoms, with assessment of asthma done between 6 and 12 years of age. 12 prediction models were identified in four types of cohorts of preschool children: those with health-care visits, those with parent-reported symptoms, those at high risk of asthma, or children in the general population. Four basic models included non-invasive, easy-to-obtain predictors only, notably family history, allergic disease comorbidities or precursors of asthma, and severity of early symptoms. Eight extended models included additional clinical tests, mostly specific IgE determination. Some models could better predict asthma development and other models could better rule out asthma development, but the predictive performance of no single model stood out in both aspects simultaneously. This finding suggests that there is a large proportion of preschool children with wheeze for which prediction of asthma development is difficult.

  14. Model predictive control classical, robust and stochastic

    CERN Document Server

    Kouvaritakis, Basil

    2016-01-01

    For the first time, a textbook that brings together classical predictive control with treatment of up-to-date robust and stochastic techniques. Model Predictive Control describes the development of tractable algorithms for uncertain, stochastic, constrained systems. The starting point is classical predictive control and the appropriate formulation of performance objectives and constraints to provide guarantees of closed-loop stability and performance. Moving on to robust predictive control, the text explains how similar guarantees may be obtained for cases in which the model describing the system dynamics is subject to additive disturbances and parametric uncertainties. Open- and closed-loop optimization are considered and the state of the art in computationally tractable methods based on uncertainty tubes presented for systems with additive model uncertainty. Finally, the tube framework is also applied to model predictive control problems involving hard or probabilistic constraints for the cases of multiplic...

  15. Validation of a tuber blight (Phytophthora infestans) prediction model

    Science.gov (United States)

    Potato tuber blight caused by Phytophthora infestans accounts for significant losses in storage. There is limited published quantitative data on predicting tuber blight. We validated a tuber blight prediction model developed in New York with cultivars Allegany, NY 101, and Katahdin using independent...

  17. A predictive fitness model for influenza

    Science.gov (United States)

    Łuksza, Marta; Lässig, Michael

    2014-03-01

    The seasonal human influenza A/H3N2 virus undergoes rapid evolution, which produces significant year-to-year sequence turnover in the population of circulating strains. Adaptive mutations respond to human immune challenge and occur primarily in antigenic epitopes, the antibody-binding domains of the viral surface protein haemagglutinin. Here we develop a fitness model for haemagglutinin that predicts the evolution of the viral population from one year to the next. Two factors are shown to determine the fitness of a strain: adaptive epitope changes and deleterious mutations outside the epitopes. We infer both fitness components for the strains circulating in a given year, using population-genetic data of all previous strains. From fitness and frequency of each strain, we predict the frequency of its descendent strains in the following year. This fitness model maps the adaptive history of influenza A and suggests a principled method for vaccine selection. Our results call for a more comprehensive epidemiology of influenza and other fast-evolving pathogens that integrates antigenic phenotypes with other viral functions coupled by genetic linkage.
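    The prediction step described here has a simple multiplicative form: each strain's frequency is propagated by the exponential of its inferred fitness and renormalised. A schematic sketch (frequencies and fitness values are invented; in the paper the fitness itself is inferred from epitope and non-epitope mutations):

```python
import math

def propagate(frequencies, fitnesses):
    # x_i(t+1) proportional to x_i(t) * exp(f_i), renormalised to sum to one
    weighted = [x * math.exp(f) for x, f in zip(frequencies, fitnesses)]
    total = sum(weighted)
    return [w / total for w in weighted]

# three circulating strains: one with an adaptive epitope change (f > 0),
# one neutral, one carrying deleterious mutations (f < 0)
x_next = propagate([0.2, 0.5, 0.3], [0.8, 0.0, -0.5])
# the fit strain's share grows, the deleterious strain's share shrinks
```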

  18. Energy based prediction models for building acoustics

    DEFF Research Database (Denmark)

    Brunskog, Jonas

    2012-01-01

    In order to reach robust and simplified yet accurate prediction models, energy based principle are commonly used in many fields of acoustics, especially in building acoustics. This includes simple energy flow models, the framework of statistical energy analysis (SEA) as well as more elaborated...... principles as, e.g., wave intensity analysis (WIA). The European standards for building acoustic predictions, the EN 12354 series, are based on energy flow and SEA principles. In the present paper, different energy based prediction models are discussed and critically reviewed. Special attention is placed...

  19. Massive Predictive Modeling using Oracle R Enterprise

    CERN Document Server

    CERN. Geneva

    2014-01-01

    R is fast becoming the lingua franca for analyzing data via statistics, visualization, and predictive analytics. For enterprise-scale data, R users have three main concerns: scalability, performance, and production deployment. Oracle's R-based technologies - Oracle R Distribution, Oracle R Enterprise, Oracle R Connector for Hadoop, and the R package ROracle - address these concerns. In this talk, we introduce Oracle's R technologies, highlighting how each enables R users to achieve scalability and performance while making production deployment of R results a natural outcome of the data analyst/scientist efforts. The focus then turns to Oracle R Enterprise with code examples using the transparency layer and embedded R execution, targeting massive predictive modeling. One goal behind massive predictive modeling is to build models per entity, such as customers, zip codes, simulations, in an effort to understand behavior and tailor predictions at the entity level. Predictions...
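
    The per-entity idea behind massive predictive modeling can be illustrated outside of R. The sketch below fits one tiny linear model per zip code in plain Python and NumPy; the data, entity keys, and model choice are invented, and Oracle R Enterprise's embedded R execution is what would distribute this pattern at database scale.

```python
import numpy as np
from collections import defaultdict

# Illustrative stand-in for "model per entity": group observations by entity
# (here, a hypothetical zip code key), then fit a small model per group.
rows = [  # (zip_code, x, y) observations, made up for the example
    ("10001", 1.0, 2.1), ("10001", 2.0, 4.0), ("10001", 3.0, 6.2),
    ("94105", 1.0, 0.9), ("94105", 2.0, 2.1), ("94105", 3.0, 2.9),
]

by_entity = defaultdict(list)
for zc, x, y in rows:
    by_entity[zc].append((x, y))

models = {}  # one (slope, intercept) model per entity
for zc, pts in by_entity.items():
    xs, ys = zip(*pts)
    slope, intercept = np.polyfit(xs, ys, 1)
    models[zc] = (slope, intercept)

print(models)
```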

  20. Gaussian mixture models as flux prediction method for central receivers

    Science.gov (United States)

    Grobler, Annemarie; Gauché, Paul; Smit, Willie

    2016-05-01

    Flux prediction methods are crucial to the design and operation of central receiver systems. Current methods such as the circular and elliptical (bivariate) Gaussian prediction methods are often used in field layout design and aiming strategies. For experimental or small central receiver systems, the flux profile of a single heliostat often deviates significantly from the circular and elliptical Gaussian models. Therefore a novel method of flux prediction was developed by incorporating the fitting of Gaussian mixture models onto flux profiles produced by flux measurement or ray tracing. A method was also developed to predict the Gaussian mixture model parameters of a single heliostat for a given time using image processing. Recording the predicted parameters in a database ensures that more accurate predictions are made in a shorter time frame.
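
    A minimal sketch of the mixture-fitting step, assuming a 1-D stand-in for a flux profile: a hand-rolled EM loop fits a two-component Gaussian mixture with NumPy. A real implementation would fit a 2-D mixture to measured or ray-traced flux maps (typically with a library GMM); the data and component count here are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic 1-D "flux profile" samples drawn from two Gaussian lobes, standing
# in for a heliostat flux map that deviates from a single circular Gaussian.
data = np.concatenate([rng.normal(0.0, 1.0, 400), rng.normal(5.0, 1.0, 400)])

# Minimal EM for a two-component 1-D Gaussian mixture.
w = np.array([0.5, 0.5])
mu = np.array([np.percentile(data, 25), np.percentile(data, 75)])
var = np.array([1.0, 1.0])

for _ in range(100):
    # E-step: responsibility of each component for each sample
    pdf = np.exp(-(data[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
    resp = w * pdf
    resp /= resp.sum(axis=1, keepdims=True)
    # M-step: re-estimate weights, means, and variances
    nk = resp.sum(axis=0)
    w = nk / len(data)
    mu = (resp * data[:, None]).sum(axis=0) / nk
    var = (resp * (data[:, None] - mu) ** 2).sum(axis=0) / nk

print(sorted(mu))
```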

  1. Using a Prediction Model to Manage Cyber Security Threats

    Directory of Open Access Journals (Sweden)

    Venkatesh Jaganathan

    2015-01-01

    Full Text Available Cyber-attacks are an important issue faced by all organizations. Securing information systems is critical. Organizations should be able to understand the ecosystem and predict attacks. Predicting attacks quantitatively should be part of risk management. The cost impact due to worms, viruses, or other malicious software is significant. This paper proposes a mathematical model to predict the impact of an attack based on significant factors that influence cyber security. This model also considers the environmental information required. It is generalized and can be customized to the needs of the individual organization.

  2. Using a Prediction Model to Manage Cyber Security Threats.

    Science.gov (United States)

    Jaganathan, Venkatesh; Cherurveettil, Priyesh; Muthu Sivashanmugam, Premapriya

    2015-01-01

    Cyber-attacks are an important issue faced by all organizations. Securing information systems is critical. Organizations should be able to understand the ecosystem and predict attacks. Predicting attacks quantitatively should be part of risk management. The cost impact due to worms, viruses, or other malicious software is significant. This paper proposes a mathematical model to predict the impact of an attack based on significant factors that influence cyber security. This model also considers the environmental information required. It is generalized and can be customized to the needs of the individual organization.
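
    The abstract describes a mathematical model without giving its form, so the sketch below is only a generic stand-in: a least-squares linear model mapping assumed risk factors (hypothetical names and numbers) to attack cost impact.

```python
import numpy as np

# Generic stand-in for a quantitative cyber-risk model: predict attack cost
# impact from assumed influencing factors. Factor names, values, and impacts
# are hypothetical; the cited paper's actual model is not reproduced here.
# Columns: [exposed_hosts, unpatched_fraction, detection_lag_days]
X = np.array([
    [100.0, 0.10, 1.0],
    [500.0, 0.30, 3.0],
    [200.0, 0.20, 2.5],
    [800.0, 0.50, 4.0],
])
impact = np.array([12.0, 55.0, 26.0, 95.0])  # e.g., cost in $k (made up)

A = np.column_stack([X, np.ones(len(X))])    # add an intercept column
coef, *_ = np.linalg.lstsq(A, impact, rcond=None)

new_scenario = np.array([300.0, 0.25, 2.0, 1.0])
predicted_impact = float(new_scenario @ coef)
print(predicted_impact)
```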

  3. Liver Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing liver cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  4. Colorectal Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing colorectal cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  5. Cervical Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing cervical cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  6. Prostate Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing prostate cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  7. Pancreatic Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing pancreatic cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  9. Bladder Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing bladder cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  10. Esophageal Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing esophageal cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  11. Lung Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing lung cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  12. Breast Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing breast cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  13. Ovarian Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing ovarian cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  14. Testicular Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing testicular cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  15. Analytic Methods for Predicting Significant Multi-Quanta Effects in Collisional Molecular Energy Transfer

    Science.gov (United States)

    Bieniek, Ronald J.

    1996-01-01

    Collision-induced transitions can significantly affect molecular vibrational-rotational populations and energy transfer in atmospheres and gaseous systems. This, in turn, can strongly influence convective heat transfer through dissociation and recombination of diatomics, and radiative heat transfer due to strong vibrational coupling. It is necessary to know state-to-state rates to predict engine performance and aerothermodynamic behavior of hypersonic flows, to analyze diagnostic radiative data obtained from experimental test facilities, and to design heat shields and other thermal protective systems. Furthermore, transfer rates between vibrational and translational modes can strongly influence energy flow in various 'disturbed' environments, particularly where the vibrational and translational temperatures are not equilibrated.

  16. Posterior Predictive Model Checking in Bayesian Networks

    Science.gov (United States)

    Crawford, Aaron

    2014-01-01

    This simulation study compared the utility of various discrepancy measures within a posterior predictive model checking (PPMC) framework for detecting different types of data-model misfit in multidimensional Bayesian network (BN) models. The investigated conditions were motivated by an applied research program utilizing an operational complex…

  17. A Course in... Model Predictive Control.

    Science.gov (United States)

    Arkun, Yaman; And Others

    1988-01-01

    Describes a graduate engineering course which specializes in model predictive control. Lists course outline and scope. Discusses some specific topics and teaching methods. Suggests final projects for the students. (MVL)

  18. Model-based uncertainty in species range prediction

    DEFF Research Database (Denmark)

    Pearson, R. G.; Thuiller, Wilfried; Bastos Araujo, Miguel;

    2006-01-01

    Aim Many attempts to predict the potential range of species rely on environmental niche (or 'bioclimate envelope') modelling, yet the effects of using different niche-based methodologies require further investigation. Here we investigate the impact that the choice of model can have on predictions...... day (using the area under the receiver operating characteristic curve (AUC) and kappa statistics) and by assessing consistency in predictions of range size changes under future climate (using cluster analysis). Results Our analyses show significant differences between predictions from different models......, with predicted changes in range size by 2030 differing in both magnitude and direction (e.g. from 92% loss to 322% gain). We explain differences with reference to two characteristics of the modelling techniques: data input requirements (presence/absence vs. presence-only approaches) and assumptions made by each...
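
    The AUC metric used above to validate agreement with present-day distributions can be computed directly from its probabilistic definition: the fraction of presence/absence pairs in which the presence site receives the higher predicted suitability. The scores below are invented for illustration.

```python
# Minimal AUC (area under the ROC curve) from its rank-based definition.
def auc(pos_scores, neg_scores):
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5  # ties count half
    return wins / (len(pos_scores) * len(neg_scores))

presence = [0.8, 0.6]          # model scores at sites where the species occurs
absence = [0.3, 0.7, 0.4]      # scores at absence sites
print(auc(presence, absence))  # 5 of 6 pairs ranked correctly
```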

  19. Equivalency and unbiasedness of grey prediction models

    Institute of Scientific and Technical Information of China (English)

    Bo Zeng; Chuan Li; Guo Chen; Xianjun Long

    2015-01-01

    In order to investigate the structural discrepancies and modeling mechanisms among different grey prediction models, the equivalence and unbiasedness of grey prediction models are analyzed and verified. The results show that all the grey prediction models that are strictly derived from x(0)(k)+az(1)(k)=b have the identical model structure and simulation precision. Moreover, unbiased simulation of a homogeneous exponential sequence can be accomplished. However, the models derived from dx(1)/dt+ax(1)=b are only close to those derived from x(0)(k)+az(1)(k)=b provided that |a| satisfies |a|<0.1; nor can unbiased simulation of a homogeneous exponential sequence be achieved. The above conclusions are proved and verified through some theorems and examples.
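
    A sketch of the classic GM(1,1) construction the abstract refers to, assuming the standard recipe: fit x(0)(k) + a z(1)(k) = b by least squares, then take the time response from the whitening equation dx(1)/dt + a x(1) = b. Run on a homogeneous exponential sequence, it reproduces the abstract's point that this whitening-equation route is close to, but not exactly unbiased for, exponential data.

```python
import numpy as np

def gm11(x0):
    """Classic GM(1,1): least-squares fit of x0(k) + a*z1(k) = b, with the
    time response taken from the whitening equation dx1/dt + a*x1 = b."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                    # 1-AGO (accumulated) sequence
    z1 = 0.5 * (x1[1:] + x1[:-1])         # background values z1(k)
    B = np.column_stack([-z1, np.ones(len(z1))])
    (a, b), *_ = np.linalg.lstsq(B, x0[1:], rcond=None)
    k = np.arange(len(x0))
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    return np.concatenate([[x0[0]], np.diff(x1_hat)])

x0 = 2.0 * 1.1 ** np.arange(5)            # homogeneous exponential sequence
x0_hat = gm11(x0)
rel_err = float(np.max(np.abs(x0_hat - x0) / x0))
print(rel_err)                             # small but nonzero simulation bias
```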

  20. Predictability of extreme values in geophysical models

    Directory of Open Access Journals (Sweden)

    A. E. Sterk

    2012-09-01

    Full Text Available Extreme value theory in deterministic systems is concerned with unlikely large (or small) values of an observable evaluated along evolutions of the system. In this paper we study the finite-time predictability of extreme values, such as convection, energy, and wind speeds, in three geophysical models. We study whether finite-time Lyapunov exponents are larger or smaller for initial conditions leading to extremes. General statements on whether extreme values are more or less predictable are not possible: the predictability of extreme values depends on the observable, the attractor of the system, and the prediction lead time.

  1. MULTI MODEL DATA MINING APPROACH FOR HEART FAILURE PREDICTION

    Directory of Open Access Journals (Sweden)

    Priyanka H U

    2016-09-01

    Full Text Available Developing predictive modelling solutions for risk estimation is extremely challenging in health-care informatics. Risk estimation involves integrating heterogeneous clinical sources that have different representations from different health-care providers, making the task increasingly complex. Such sources are typically voluminous, diverse, and change significantly over time. Therefore, distributed and parallel computing tools, collectively termed big data tools, are needed to synthesize this information and assist the physician in making the right clinical decisions. In this work we propose a multi-model predictive architecture, a novel approach for combining the predictive ability of multiple models for better prediction accuracy. We demonstrate the effectiveness and efficiency of the proposed work on data from the Framingham Heart Study. Results show that the proposed multi-model predictive architecture is able to provide better accuracy than the best single-model approach. By modelling the error of the predictive models we are able to choose a subset of models that yields accurate results. More information was modelled into the system by multi-level mining, which has resulted in enhanced predictive accuracy.
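
    The error-weighted combination idea can be sketched on synthetic data (this is not the Framingham data or the paper's architecture): two predictors of unequal quality are blended with weights inversely proportional to their validation error.

```python
import numpy as np

rng = np.random.default_rng(1)
# Toy stand-in for multi-model prediction: combine two imperfect predictors
# with weights based on their validation error. Data and models are synthetic.
x = rng.uniform(0, 10, 200)
y = 3.0 * x + 5.0 + rng.normal(0, 1.0, 200)
x_tr, y_tr, x_va, y_va = x[:100], y[:100], x[100:], y[100:]

# "Model A": degree-1 fit; "Model B": degree-0 fit (intentionally weaker).
pa = np.polyfit(x_tr, y_tr, 1)
pb = np.polyfit(x_tr, y_tr, 0)
pred_a, pred_b = np.polyval(pa, x_va), np.polyval(pb, x_va)

def mse(p):
    return float(np.mean((p - y_va) ** 2))

# Weight each model by the inverse of its validation error, then blend.
wa, wb = 1.0 / mse(pred_a), 1.0 / mse(pred_b)
pred_ens = (wa * pred_a + wb * pred_b) / (wa + wb)
print(mse(pred_a), mse(pred_b), mse(pred_ens))
```

    With these weights, the ensemble tracks the stronger model closely and stays far ahead of the weaker one; modelling each model's error is what makes the selective combination possible.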

  2. Hybrid modeling and prediction of dynamical systems

    Science.gov (United States)

    Lloyd, Alun L.; Flores, Kevin B.

    2017-01-01

    Scientific analysis often relies on the ability to make accurate predictions of a system’s dynamics. Mechanistic models, parameterized by a number of unknown parameters, are often used for this purpose. Accurate estimation of the model state and parameters prior to prediction is necessary, but may be complicated by issues such as noisy data and uncertainty in parameters and initial conditions. At the other end of the spectrum exist nonparametric methods, which rely solely on data to build their predictions. While these nonparametric methods do not require a model of the system, their performance is strongly influenced by the amount and noisiness of the data. In this article, we consider a hybrid approach to modeling and prediction which merges recent advancements in nonparametric analysis with standard parametric methods. The general idea is to replace a subset of a mechanistic model’s equations with their corresponding nonparametric representations, resulting in a hybrid modeling and prediction scheme. Overall, we find that this hybrid approach allows for more robust parameter estimation and improved short-term prediction in situations where there is a large uncertainty in model parameters. We demonstrate these advantages in the classical Lorenz-63 chaotic system and in networks of Hindmarsh-Rose neurons before application to experimentally collected structured population data. PMID:28692642

  3. The long sunspot cycle 23 predicts a significant temperature decrease in cycle 24

    CERN Document Server

    Solheim, Jan-Erik; Humlum, Ole

    2012-01-01

    Relations between the length of a sunspot cycle and the average temperature in the same and the next cycle are calculated for a number of meteorological stations in Norway and in the North Atlantic region. No significant trend is found between the length of a cycle and the average temperature in the same cycle, but a significant negative trend is found between the length of a cycle and the temperature in the next cycle. This provides a tool to predict an average temperature decrease of at least 1.0 °C from solar cycle 23 to 24 for the stations and areas analyzed. We find for the Norwegian local stations investigated that 25-56% of the temperature increase the last 150 years may be attributed to the Sun. For 3 North Atlantic stations we get 63-72% solar contribution. This points to the Atlantic currents as reinforcing a solar signal.

  4. Property predictions using microstructural modeling

    Energy Technology Data Exchange (ETDEWEB)

    Wang, K.G. [Department of Materials Science and Engineering, Rensselaer Polytechnic Institute, CII 9219, 110 8th Street, Troy, NY 12180-3590 (United States)]. E-mail: wangk2@rpi.edu; Guo, Z. [Sente Software Ltd., Surrey Technology Centre, 40 Occam Road, Guildford GU2 7YG (United Kingdom); Sha, W. [Metals Research Group, School of Civil Engineering, Architecture and Planning, The Queen's University of Belfast, Belfast BT7 1NN (United Kingdom); Glicksman, M.E. [Department of Materials Science and Engineering, Rensselaer Polytechnic Institute, CII 9219, 110 8th Street, Troy, NY 12180-3590 (United States); Rajan, K. [Department of Materials Science and Engineering, Rensselaer Polytechnic Institute, CII 9219, 110 8th Street, Troy, NY 12180-3590 (United States)

    2005-07-15

    Precipitation hardening in an Fe-12Ni-6Mn maraging steel during overaging is quantified. First, applying our recent kinetic model of coarsening [Phys. Rev. E, 69 (2004) 061507], and incorporating the Ashby-Orowan relationship, we link quantifiable aspects of the microstructures of these steels to their mechanical properties, including especially the hardness. Specifically, hardness measurements allow calculation of the precipitate size as a function of time and temperature through the Ashby-Orowan relationship. Second, calculated precipitate sizes and thermodynamic data determined with Thermo-Calc© are used with our recent kinetic coarsening model to extract diffusion coefficients during overaging from hardness measurements. Finally, employing more accurate diffusion parameters, we determined the hardness of these alloys independently from theory, and found agreement with experimental hardness data. Diffusion coefficients determined during overaging of these steels are notably higher than those found during the aging - an observation suggesting that precipitate growth during aging and precipitate coarsening during overaging are not controlled by the same diffusion mechanism.

  5. Parameter and Process Significance in Mechanistic Modeling of Cellulose Hydrolysis

    Science.gov (United States)

    Rotter, B.; Barry, A.; Gerhard, J.; Small, J.; Tahar, B.

    2005-12-01

    The rate of cellulose hydrolysis, and of associated microbial processes, is important in determining the stability of landfills and their potential impact on the environment, as well as associated time scales. To permit further exploration in this field, a process-based model of cellulose hydrolysis was developed. The model, which is relevant to both landfill and anaerobic digesters, includes a novel approach to biomass transfer between a cellulose-bound biofilm and biomass in the surrounding liquid. Model results highlight the significance of the bacterial colonization of cellulose particles by attachment through contact in solution. Simulations revealed that enhanced colonization, and therefore cellulose degradation, was associated with reduced cellulose particle size, higher biomass populations in solution, and increased cellulose-binding ability of the biomass. A sensitivity analysis of the system parameters revealed different sensitivities to model parameters for a typical landfill scenario versus that for an anaerobic digester. The results indicate that relative surface area of cellulose and proximity of hydrolyzing bacteria are key factors determining the cellulose degradation rate.
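
    A schematic version of the process model described above, with invented rate constants: suspended biomass colonizes cellulose by contact, the attached biofilm hydrolyzes the substrate, and part of the hydrolyzed mass becomes new biofilm. Only the qualitative structure follows the abstract; none of the parameters are from the cited work.

```python
# Explicit-Euler integration of a schematic colonization/hydrolysis model.
dt, steps = 0.01, 20000
C, Ba, Bs = 100.0, 0.1, 1.0   # cellulose, attached biofilm, suspended biomass
k_att, k_hyd, yield_b, k_dec = 0.002, 0.05, 0.3, 0.01  # invented constants

history = []
for _ in range(steps):
    attach = k_att * C * Bs               # colonization by contact in solution
    hydrolysis = min(k_hyd * Ba, C / dt)  # cannot degrade more than remains
    C += dt * (-hydrolysis)               # cellulose consumed by the biofilm
    Ba += dt * (attach + yield_b * hydrolysis - k_dec * Ba)
    Bs += dt * (-attach + k_dec * Ba)     # detachment returns cells to solution
    history.append(C)
print(history[-1])
```

    The qualitative behavior matches the abstract's observation: a larger suspended population or stronger binding (higher k_att) accelerates colonization and hence cellulose degradation.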

  6. Modeling and Prediction Using Stochastic Differential Equations

    DEFF Research Database (Denmark)

    Juhl, Rune; Møller, Jan Kloppenborg; Jørgensen, John Bagterp

    2016-01-01

    Pharmacokinetic/pharmakodynamic (PK/PD) modeling for a single subject is most often performed using nonlinear models based on deterministic ordinary differential equations (ODEs), and the variation between subjects in a population of subjects is described using a population (mixed effects) setup...... that describes the variation between subjects. The ODE setup implies that the variation for a single subject is described by a single parameter (or vector), namely the variance (covariance) of the residuals. Furthermore the prediction of the states is given as the solution to the ODEs and hence assumed...... deterministic and can predict the future perfectly. A more realistic approach would be to allow for randomness in the model due to e.g., the model be too simple or errors in input. We describe a modeling and prediction setup which better reflects reality and suggests stochastic differential equations (SDEs...
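
    The move from a deterministic ODE to an SDE can be illustrated with the simplest case: an Ornstein-Uhlenbeck process simulated by Euler-Maruyama (a generic numerical scheme, not necessarily the one used in the paper). The system-noise term is what lets the model absorb misspecification and input errors instead of forcing them into the residuals.

```python
import numpy as np

# Euler-Maruyama simulation of dX = -a*X dt + s dW. Parameters illustrative.
rng = np.random.default_rng(42)
a, s = 1.0, 1.0
dt, n = 0.01, 200_000

x = np.empty(n)
x[0] = 0.0
noise = rng.normal(0.0, np.sqrt(dt), n - 1)  # Brownian increments
for i in range(n - 1):
    x[i + 1] = x[i] - a * x[i] * dt + s * noise[i]

# Stationary variance of this Ornstein-Uhlenbeck process is s**2/(2*a) = 0.5
print(x[50_000:].var())
```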

  7. Precision Plate Plan View Pattern Predictive Model

    Institute of Scientific and Technical Information of China (English)

    ZHAO Yang; YANG Quan; HE An-rui; WANG Xiao-chen; ZHANG Yun

    2011-01-01

    According to the rolling features of plate mill, a 3D elastic-plastic FEM (finite element model) based on full restart method of ANSYS/LS-DYNA was established to study the inhomogeneous plastic deformation of multipass plate rolling. By analyzing the simulation results, the difference of head and tail ends predictive models was found and modified. According to the numerical simulation results of 120 different kinds of conditions, precision plate plan view pattern predictive model was established. Based on these models, the sizing MAS (mizushima automatic plan view pattern control system) method was designed and used on a 2 800 mm plate mill. Comparing the rolled plates with and without the PVPP (plan view pattern predictive) model, the reduced width deviation indicates that the plate plan view pattern predictive model is precise.

  8. A burnout prediction model based around char morphology

    Energy Technology Data Exchange (ETDEWEB)

    T. Wu; E. Lester; M. Cloke [University of Nottingham, Nottingham (United Kingdom). Nottingham Energy and Fuel Centre

    2005-07-01

    Poor burnout in a coal-fired power plant has marked penalties in the form of reduced energy efficiency and elevated waste material that cannot be utilized. The prediction of coal combustion behaviour in a furnace is of great significance in providing valuable information not only for process optimization but also for coal buyers in the international market. Coal combustion models have been developed that can make predictions about burnout behaviour and burnout potential. Most of these kinetic models require standard parameters such as volatile content, particle size and assumed char porosity in order to make a burnout prediction. This paper presents a new model called the Char Burnout Model (ChB) that also uses detailed information about char morphology in its prediction. The model can use data input from one of two sources. Both sources are derived from image analysis techniques. The first from individual analysis and characterization of real char types using an automated program. The second from predicted char types based on data collected during the automated image analysis of coal particles. Modelling results were compared with a different carbon burnout kinetic model and burnout data from re-firing the chars in a drop tube furnace operating at 1300°C and 5% oxygen across several residence times. An improved agreement between the ChB model and the DTF experimental data proved that the inclusion of char morphology in combustion models can improve model predictions. 27 refs., 4 figs., 4 tabs.

  9. Position Weight Matrix, Gibbs Sampler, and the Associated Significance Tests in Motif Characterization and Prediction

    Directory of Open Access Journals (Sweden)

    Xuhua Xia

    2012-01-01

    Full Text Available Position weight matrix (PWM is not only one of the most widely used bioinformatic methods, but also a key component in more advanced computational algorithms (e.g., Gibbs sampler for characterizing and discovering motifs in nucleotide or amino acid sequences. However, few generally applicable statistical tests are available for evaluating the significance of site patterns, PWM, and PWM scores (PWMS of putative motifs. Statistical significance tests of the PWM output, that is, site-specific frequencies, PWM itself, and PWMS, are in disparate sources and have never been collected in a single paper, with the consequence that many implementations of PWM do not include any significance test. Here I review PWM-based methods used in motif characterization and prediction (including a detailed illustration of the Gibbs sampler for de novo motif discovery, present statistical and probabilistic rationales behind statistical significance tests relevant to PWM, and illustrate their application with real data. The multiple comparison problem associated with the test of site-specific frequencies is best handled by false discovery rate methods. The test of PWM, due to the use of pseudocounts, is best done by resampling methods. The test of individual PWMS for each sequence segment should be based on the extreme value distribution.
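
    A minimal PWM with pseudocounts and a log-odds PWMS scan, illustrating the quantities whose significance tests are reviewed above. The binding sites and scanned sequence are toy examples, not data from the paper.

```python
import numpy as np

# Build a PWM from aligned sites, with pseudocounts, then scan a sequence for
# the highest-scoring window (the PWMS of the best candidate motif).
sites = ["TATAAT", "TATGAT", "TACAAT", "TATAAT", "TATACT"]
bases = "ACGT"
L = len(sites[0])

counts = np.zeros((4, L))
for site in sites:
    for j, ch in enumerate(site):
        counts[bases.index(ch), j] += 1

pseudo = 0.5
freqs = (counts + pseudo) / (len(sites) + 4 * pseudo)  # column frequencies
pwm = np.log2(freqs / 0.25)                            # log-odds vs uniform bg

def pwms(seq):
    """Score and position of the best length-L window in seq."""
    best, best_pos = -np.inf, -1
    for i in range(len(seq) - L + 1):
        s = sum(pwm[bases.index(seq[i + j]), j] for j in range(L))
        if s > best:
            best, best_pos = s, i
    return best, best_pos

score, pos = pwms("GGCGCTATAATGCCG")
print(score, pos)
```

    The scan recovers the planted consensus site; in practice each such score would then be assessed against the extreme value distribution, as the review describes.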

  10. NBC Hazard Prediction Model Capability Analysis

    Science.gov (United States)

    1999-09-01

    Puff (SCIPUFF) Model Verification and Evaluation Study, Air Resources Laboratory, NOAA, May 1998. Based on the NOAA review, the VLSTRACK developers...TO SUBSTANTIAL DIFFERENCES IN PREDICTIONS HPAC uses a transport and dispersion (T&D) model called SCIPUFF and an associated mean wind field model... SCIPUFF is a model for atmospheric dispersion that uses the Gaussian puff method - an arbitrary time-dependent concentration field is represented

  11. Oil cracking to gases: Kinetic modeling and geological significance

    Institute of Scientific and Technical Information of China (English)

    TIAN Hui; WANG Zhaoming; XIAO Zhongyao; LI Xianqing; XIAO Xianming

    2006-01-01

    A Triassic oil sample from LN14 of the Tarim Basin was pyrolyzed in sealed gold tubes at 200-620℃ under a constant pressure of 50 MPa. The gaseous and residual soluble hydrocarbons were analyzed. The results show that the cracking of oil to gas can be divided into two distinct stages: the primary generation of total C1-5 gases from liquid oil, characterized by the dominance of C2-5 hydrocarbons, and the secondary or further cracking of C2-5 gases to methane and carbon-rich matter, leading to the progressive dryness of the gases. Based on the experimental data, kinetic parameters were determined for the primary generation and secondary cracking of oil cracking gases and extrapolated to geological conditions to predict the thermal stability and cracking extent of crude oil. Finally, an evolution model for the thermal destruction of crude oil was proposed and its implications for the migration and accumulation of oil cracking gases were discussed.
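
    The extrapolation step rests on first-order Arrhenius kinetics. The sketch below uses placeholder kinetic parameters (Ea, A), not the values fitted from the LN14 pyrolysis data, to show how the conversion X = 1 - exp(-k t) responds to temperature.

```python
import math

# First-order Arrhenius conversion sketch; Ea and A are assumed placeholders.
R = 8.314          # J/(mol K)
Ea = 250e3         # J/mol, assumed activation energy
A = 1e14           # 1/s, assumed frequency factor

def conversion(T_celsius, t_seconds):
    """Extent of cracking X = 1 - exp(-k t) for a first-order reaction."""
    k = A * math.exp(-Ea / (R * (T_celsius + 273.15)))
    return 1.0 - math.exp(-k * t_seconds)

t = 72 * 3600.0    # a 72 h isothermal hold, as in laboratory pyrolysis
for T in (200.0, 400.0, 600.0):
    print(T, conversion(T, t))
```

    The steep temperature dependence is why laboratory rates measured at 200-620℃ must be extrapolated through fitted kinetics, rather than applied directly, to predict oil stability on geological timescales.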

  12. Assessment of performance of survival prediction models for cancer prognosis

    Directory of Open Access Journals (Sweden)

    Chen Hung-Chia

    2012-07-01

    Full Text Available Abstract Background Cancer survival studies are commonly analyzed using survival-time prediction models for cancer prognosis. A number of different performance metrics are used to ascertain the concordance between the predicted risk score of each patient and the actual survival time, but these metrics can sometimes conflict. Alternatively, patients are sometimes divided into two classes according to a survival-time threshold, and binary classifiers are applied to predict each patient’s class. Although this approach has several drawbacks, it does provide natural performance metrics such as positive and negative predictive values to enable unambiguous assessments. Methods We compare the survival-time prediction and survival-time threshold approaches to analyzing cancer survival studies. We review and compare common performance metrics for the two approaches. We present new randomization tests and cross-validation methods to enable unambiguous statistical inferences for several performance metrics used with the survival-time prediction approach. We consider five survival prediction models consisting of one clinical model, two gene expression models, and two models from combinations of clinical and gene expression models. Results A public breast cancer dataset was used to compare several performance metrics using five prediction models. (1) For some prediction models, the hazard ratio from fitting a Cox proportional hazards model was significant, but the two-group comparison was insignificant, and vice versa. (2) The randomization test and cross-validation were generally consistent with the p-values obtained from the standard performance metrics. (3) Binary classifiers highly depended on how the risk groups were defined; a slight change of the survival threshold for assignment of classes led to very different prediction results. Conclusions (1) Different performance metrics for evaluation of a survival prediction model may give different conclusions in
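
    One standard performance metric for the survival-time prediction approach, Harrell's concordance index, can be written out directly from its definition on toy right-censored data (not the breast cancer dataset): among comparable pairs, it is the fraction where the higher-risk patient fails earlier.

```python
# Harrell's C-index for right-censored survival data.
def c_index(times, events, risks):
    conc, comparable = 0.0, 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            # A pair is comparable if patient i failed before patient j's time.
            if events[i] == 1 and times[i] < times[j]:
                comparable += 1
                if risks[i] > risks[j]:
                    conc += 1.0
                elif risks[i] == risks[j]:
                    conc += 0.5  # ties in risk count half
    return conc / comparable

times = [2.0, 4.0, 6.0, 5.0]
events = [1, 1, 1, 0]          # last patient censored at t = 5
risks = [0.9, 0.4, 0.6, 0.2]   # predicted risk scores (toy values)
print(c_index(times, events, risks))  # 4 of 5 comparable pairs concordant
```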

  13. Significance of preoperative neutrophil–lymphocyte count ratio on predicting postoperative sepsis after percutaneous nephrolithotomy

    Directory of Open Access Journals (Sweden)

    Volkan Sen

    2016-10-01

    Full Text Available We evaluated the usefulness of preoperative neutrophil–lymphocyte count ratio (NLCR in predicting postoperative sepsis after percutaneous nephrolithotomy (PCNL. In total, 487 patients who underwent PCNL for renal stones were included in the present retrospective study. The stone burden, number of tracts and location, operation time, fluoroscopy time, presence of residual stones, and blood transfusion rates were postoperatively recorded in all patients. All patients were followed up for signs of systemic inflammatory response syndrome (SIRS and sepsis. The association of sepsis/SIRS with the risk factors of infectious complications, including NLCR, was evaluated. SIRS was detected in 91 (18.7% patients, 25 (5.1% of whom were diagnosed with sepsis. Stone burden, operation time, irrigation rate, previous surgery, nephrostomy time, access number, blood transfusion, residual stone, postoperative urinary culture, renal pelvis urinary culture, and stone culture were found to be predictive factors for SIRS and sepsis development. Receiver operating characteristic curve analysis revealed an NLCR cutoff of 2.50 for predicting the occurrence of SIRS/sepsis. We found that the incidence of sepsis was significantly higher in patients with NLCR ≥ 2.50 than in patients with NLCR < 2.50 (p = 0.006. Preoperative and postoperative urine culture positivity were associated with high NLCR (p = 0.039 and p = 0.003, respectively. We believe that preoperative NLCR may be a promising additive predictor of bacteremia and postoperative sepsis in patients who undergo PCNL for renal stones. This marker is simple, easily measured, and easy to use in daily practice without extra costs.
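
    A cutoff like the NLCR value of 2.50 is typically read off an ROC analysis; one common rule is to choose the threshold maximizing Youden's J (sensitivity + specificity - 1). The NLCR values below are invented for illustration, not the study's data.

```python
# Pick a biomarker cutoff by maximizing Youden's J over observed values.
def youden_cutoff(pos, neg):
    candidates = sorted(set(pos) | set(neg))
    best_t, best_j = None, -1.0
    for t in candidates:
        sens = sum(v >= t for v in pos) / len(pos)   # sensitivity at cutoff t
        spec = sum(v < t for v in neg) / len(neg)    # specificity at cutoff t
        j = sens + spec - 1.0
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j

sepsis = [3.1, 2.8, 4.0, 2.6]      # NLCR in patients who developed sepsis
no_sepsis = [1.2, 2.0, 2.4, 1.8]   # NLCR in patients who did not
t, j = youden_cutoff(sepsis, no_sepsis)
print(t, j)
```

    On this toy data the two groups separate perfectly, so the chosen cutoff sits at the lowest sepsis value; on real data the maximal J is below 1 and the cutoff trades sensitivity against specificity.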

  14. Modeling Seizure Self-Prediction: An E-Diary Study

    Science.gov (United States)

    Haut, Sheryl R.; Hall, Charles B.; Borkowski, Thomas; Tennen, Howard; Lipton, Richard B.

    2013-01-01

Purpose A subset of patients with epilepsy successfully self-predicted seizures in a paper diary study. We conducted an e-diary study to ensure that prediction precedes seizures, and to characterize the prodromal features and time windows that underlie self-prediction. Methods Subjects aged 18 or older with localization-related epilepsy (LRE) and ≥3 seizures/month maintained an e-diary, reporting AM/PM data daily, including mood, premonitory symptoms, and all seizures. Self-prediction was rated by asking, “How likely are you to experience a seizure [time frame]?” Five choices ranged from almost certain (>95% chance) to very unlikely. Relative odds of seizure (OR) within time frames were examined using Poisson models with log-normal random effects to adjust for multiple observations. Key Findings Nineteen subjects reported 244 eligible seizures. The OR for prediction choices within 6 hrs was as high as 9.31 (1.92, 45.23) for “almost certain”. Prediction was most robust within 6 hrs of diary entry, and remained significant up to 12 hrs. For the 9 best predictors, average sensitivity was 50%. Older age contributed to successful self-prediction, and self-prediction appeared to be driven by mood and premonitory symptoms. In multivariate modeling of seizure occurrence, self-prediction (2.84; 1.68, 4.81), favorable change in mood (0.82; 0.67, 0.99), and number of premonitory symptoms (1.11; 1.00, 1.24) were significant. Significance Some persons with epilepsy can self-predict seizures. In these individuals, the odds of a seizure following a positive prediction are high. Predictions were robust, not attributable to recall bias, and were related to self-awareness of mood and premonitory features. The 6-hour prediction window is suitable for the development of pre-emptive therapy. PMID:24111898

  15. Is flow velocity a significant parameter in flood damage modelling?

    Directory of Open Access Journals (Sweden)

    H. Kreibich

    2009-10-01

    Full Text Available Flow velocity is generally presumed to influence flood damage. However, this influence is hardly quantified and virtually no damage models take it into account. Therefore, the influences of flow velocity, water depth and combinations of these two impact parameters on various types of flood damage were investigated in five communities affected by the Elbe catchment flood in Germany in 2002. 2-D hydraulic models with high to medium spatial resolutions were used to calculate the impact parameters at the sites in which damage occurred. A significant influence of flow velocity on structural damage, particularly on roads, could be shown in contrast to a minor influence on monetary losses and business interruption. Forecasts of structural damage to road infrastructure should be based on flow velocity alone. The energy head is suggested as a suitable flood impact parameter for reliable forecasting of structural damage to residential buildings above a critical impact level of 2 m of energy head or water depth. However, general consideration of flow velocity in flood damage modelling, particularly for estimating monetary loss, cannot be recommended.
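The energy head suggested above combines depth and velocity into a single impact parameter; in open-channel hydraulics it is the water depth plus the velocity head, h_E = h + v²/(2g):

```python
def energy_head(depth_m, velocity_ms, g=9.81):
    """Energy head h_E = h + v^2 / (2g), in metres.

    depth_m     : water depth h (m)
    velocity_ms : flow velocity v (m/s)
    g           : gravitational acceleration (m/s^2)
    """
    return depth_m + velocity_ms**2 / (2 * g)
```

For example, a site with 1.8 m of water moving at 2 m/s has an energy head of about 2.00 m and so exceeds the 2 m critical level quoted above, even though depth alone does not.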

  16. Predicting Market Impact Costs Using Nonparametric Machine Learning Models.

    Directory of Open Access Journals (Sweden)

    Saerom Park

Full Text Available Market impact cost is the most significant portion of implicit transaction costs that can reduce the overall transaction cost, although it cannot be measured directly. In this paper, we employed state-of-the-art nonparametric machine learning models: neural networks, Bayesian neural networks, Gaussian processes, and support vector regression, to predict market impact cost accurately and to provide a predictive model that is versatile in the number of variables. We collected a large amount of real single-transaction data of the US stock market from the Bloomberg Terminal and generated three independent input variables. As a result, most nonparametric machine learning models outperformed a state-of-the-art benchmark parametric model, the I-star model, in four error measures. Although these models encounter certain difficulties in separating the permanent and temporary cost directly, nonparametric machine learning models can be good alternatives for reducing transaction costs by considerably improving prediction performance.
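As an illustration of one of the nonparametric models named above, the posterior mean of Gaussian-process regression with an RBF kernel and noisy observations fits in a few lines of NumPy. This is a generic sketch on synthetic data, not the Bloomberg transaction data or the paper's exact setup:

```python
import numpy as np

def gp_predict(X_train, y_train, X_test, length_scale=1.0, noise=0.1):
    """Posterior mean of Gaussian-process regression with an RBF kernel."""
    def rbf(A, B):
        # Squared Euclidean distances between rows of A and rows of B
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / length_scale**2)

    K = rbf(X_train, X_train) + noise**2 * np.eye(len(X_train))
    alpha = np.linalg.solve(K, y_train)          # (K + sigma^2 I)^-1 y
    return rbf(X_test, X_train) @ alpha          # posterior mean at X_test
```

In a market-impact setting, the rows of X would hold features such as normalized trade size, volatility, and liquidity, and y the observed impact cost per transaction.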

  18. Corporate prediction models, ratios or regression analysis?

    NARCIS (Netherlands)

    Bijnen, E.J.; Wijn, M.F.C.M.

    1994-01-01

The models developed in the literature with respect to the prediction of a company's failure are based on ratios. It has been shown before that these models should be rejected on theoretical grounds. Our study of industrial companies in the Netherlands shows that the ratios which are used in

  19. A systematic review of predictive modeling for bronchiolitis.

    Science.gov (United States)

    Luo, Gang; Nkoy, Flory L; Gesteland, Per H; Glasgow, Tiffany S; Stone, Bryan L

    2014-10-01

    Bronchiolitis is the most common cause of illness leading to hospitalization in young children. At present, many bronchiolitis management decisions are made subjectively, leading to significant practice variation among hospitals and physicians caring for children with bronchiolitis. To standardize care for bronchiolitis, researchers have proposed various models to predict the disease course to help determine a proper management plan. This paper reviews the existing state of the art of predictive modeling for bronchiolitis. Predictive modeling for respiratory syncytial virus (RSV) infection is covered whenever appropriate, as RSV accounts for about 70% of bronchiolitis cases. A systematic review was conducted through a PubMed search up to April 25, 2014. The literature on predictive modeling for bronchiolitis was retrieved using a comprehensive search query, which was developed through an iterative process. Search results were limited to human subjects, the English language, and children (birth to 18 years). The literature search returned 2312 references in total. After manual review, 168 of these references were determined to be relevant and are discussed in this paper. We identify several limitations and open problems in predictive modeling for bronchiolitis, and provide some preliminary thoughts on how to address them, with the hope to stimulate future research in this domain. Many problems remain open in predictive modeling for bronchiolitis. Future studies will need to address them to achieve optimal predictive models. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  20. Modelling Chemical Reasoning to Predict Reactions

    CERN Document Server

    Segler, Marwin H S

    2016-01-01

    The ability to reason beyond established knowledge allows Organic Chemists to solve synthetic problems and to invent novel transformations. Here, we propose a model which mimics chemical reasoning and formalises reaction prediction as finding missing links in a knowledge graph. We have constructed a knowledge graph containing 14.4 million molecules and 8.2 million binary reactions, which represents the bulk of all chemical reactions ever published in the scientific literature. Our model outperforms a rule-based expert system in the reaction prediction task for 180,000 randomly selected binary reactions. We show that our data-driven model generalises even beyond known reaction types, and is thus capable of effectively (re-) discovering novel transformations (even including transition-metal catalysed reactions). Our model enables computers to infer hypotheses about reactivity and reactions by only considering the intrinsic local structure of the graph, and because each single reaction prediction is typically ac...

  1. Spatially unassociated galaxies contribute significantly to the blended submillimetre galaxy population: predictions for follow-up observations of ALMA sources

    CERN Document Server

    Hayward, Christopher C; Somerville, Rachel S; Primack, Joel R; Moreno, Jorge; Wechsler, Risa H

    2013-01-01

There is anecdotal evidence that spatially and physically unassociated galaxies blended into a single submillimetre (submm) source contribute to the submm galaxy (SMG) population. However, the significance of this subpopulation has neither been observationally constrained nor theoretically predicted. This work is the first to theoretically predict the contribution of spatially unassociated components to the SMG population. We generate mock SMG catalogues using lightcones derived from the Bolshoi cosmological simulation; to assign submm flux densities to the mock galaxies, we use a fitting function previously derived from the results of dust radiative transfer performed on hydrodynamical simulations of isolated disc and merging galaxies. We then calculate submm number counts for different beam sizes and without blending. Our model suggests that there are a sufficient number of blended SMGs to account for the observed number counts of submm sources with 850-μm flux density S_850 ≳ 12 mJy. Furthermore, we pr...

  2. Significance of Bioindicators for Early Predictions on Diagnosis and Therapy of Irradiated Minipigs.

    Science.gov (United States)

    Moroni, Maria; Port, Matthias; Gulani, Jatinder; Chappell, Mark; Abend, Michael

    2016-08-01

Decisions on whether to start a therapeutic intervention for management of the Acute Radiation Syndrome (ARS) should be made early after exposure, and should be based on readily available clinical signs and laboratory parameters. Here, the authors use the minipig to assess whether early prediction of the later-developing clinical outcome and of the necessity of therapeutic interventions can be made within the first 3 d after exposure, and whether it is comparable to human data. Retrospective analysis of data accumulated in the period 2009-2012 was used. Male Göttingen minipigs (age 4-5 mo, weight 9-10 kg) were irradiated (or sham-irradiated) bilaterally with gamma-photons (⁶⁰Co, 0.5-0.6 Gy min⁻¹) in the dose range of 1.6-12 Gy. Complete blood counts, serum chemistry, and clinical symptoms were collected up to 60 d after irradiation in untreated minipigs. Changes in these early parameters (up to 3 d after exposure) were correlated with the later occurrence (10-60 d after irradiation) of (1) hematological severity scores, (2) severe thrombocytopenia, and (3) severe neutropenia, as well as the need for (4) therapeutic intervention, (5) administration of cytokines/antibiotics, or (6) thrombocyte transfusions. Binary endpoints were analyzed using logistic regression analysis and calculating receiver operating characteristic (ROC) curves. Most predictive were decreased lymphocyte counts and increases in body temperature at 3 h after irradiation. These data corroborate earlier findings on human radiation victims suffering from severe hematological syndrome and provide further evidence for the suitability of the minipig as a potential alternative non-rodent animal model.

  3. The prognostic significance of UCA1 for predicting clinical outcome in patients with digestive system malignancies.

    Science.gov (United States)

    Liu, Fang-Teng; Dong, Qing; Gao, Hui; Zhu, Zheng-Ming

    2017-06-20

Urothelial Carcinoma Associated 1 (UCA1) is a lncRNA originally identified in bladder cancer. Previous studies have reported that UCA1 plays a significant role in various types of cancer. This study aimed to clarify the prognostic value of UCA1 in digestive system cancers. The meta-analysis included 15 studies, comprising 1441 patients with digestive system cancers. The pooled results of 14 studies indicated that high expression of UCA1 was significantly associated with poorer OS in patients with digestive system cancers (HR: 1.89, 95% CI: 1.52-2.26). In addition, UCA1 could serve as an independent prognostic factor for predicting OS (HR: 1.85, 95% CI: 1.45-2.25). The pooled results of 3 studies indicated a significant association between UCA1 and DFS in patients with digestive system cancers (HR = 2.50; 95% CI = 1.30-3.69). Statistical significance was also observed in subgroup meta-analyses. Furthermore, the clinicopathological value of UCA1 was discussed in esophageal cancer, colorectal cancer and pancreatic cancer. A comprehensive retrieval was performed to find studies evaluating the prognostic value of UCA1 in digestive system cancers; the databases searched included PubMed, Web of Science, Embase, the Chinese National Knowledge Infrastructure and the Wanfang database. Quantitative meta-analysis was performed with standard statistical methods and the prognostic significance of UCA1 in digestive system cancers was quantified. An elevated level of UCA1 indicated a poor clinical outcome for patients with digestive system cancers. It may serve as a new biomarker related to prognosis in digestive system cancers.

  4. Genetic models of homosexuality: generating testable predictions

    OpenAIRE

    Gavrilets, Sergey; Rice, William R.

    2006-01-01

    Homosexuality is a common occurrence in humans and other species, yet its genetic and evolutionary basis is poorly understood. Here, we formulate and study a series of simple mathematical models for the purpose of predicting empirical patterns that can be used to determine the form of selection that leads to polymorphism of genes influencing homosexuality. Specifically, we develop theory to make contrasting predictions about the genetic characteristics of genes influencing homosexuality inclu...

  5. Wind farm production prediction - The Zephyr model

    Energy Technology Data Exchange (ETDEWEB)

    Landberg, L. [Risoe National Lab., Wind Energy Dept., Roskilde (Denmark); Giebel, G. [Risoe National Lab., Wind Energy Dept., Roskilde (Denmark); Madsen, H. [IMM (DTU), Kgs. Lyngby (Denmark); Nielsen, T.S. [IMM (DTU), Kgs. Lyngby (Denmark); Joergensen, J.U. [Danish Meteorologisk Inst., Copenhagen (Denmark); Lauersen, L. [Danish Meteorologisk Inst., Copenhagen (Denmark); Toefting, J. [Elsam, Fredericia (DK); Christensen, H.S. [Eltra, Fredericia (Denmark); Bjerge, C. [SEAS, Haslev (Denmark)

    2002-06-01

    This report describes a project - funded by the Danish Ministry of Energy and the Environment - which developed a next generation prediction system called Zephyr. The Zephyr system is a merging between two state-of-the-art prediction systems: Prediktor of Risoe National Laboratory and WPPT of IMM at the Danish Technical University. The numerical weather predictions were generated by DMI's HIRLAM model. Due to technical difficulties programming the system, only the computational core and a very simple version of the originally very complex system were developed. The project partners were: Risoe, DMU, DMI, Elsam, Eltra, Elkraft System, SEAS and E2. (au)

  6. Climate predictability and prediction skill on seasonal time scales over South America from CHFP models

    Science.gov (United States)

    Osman, Marisol; Vera, C. S.

    2016-11-01

This work presents an assessment of the predictability and skill of climate anomalies over South America. The study was made considering a multi-model ensemble of seasonal forecasts for surface air temperature, precipitation and regional circulation, from coupled global circulation models included in the Climate Historical Forecast Project. Predictability was evaluated through the estimation of the signal-to-total variance ratio, while prediction skill was assessed by computing anomaly correlation coefficients. Over the continent, both indicators show higher values in the tropics than in the extratropics for both surface air temperature and precipitation. Moreover, predictability and prediction skill for temperature are slightly higher in DJF than in JJA, while for precipitation they exhibit similar levels in both seasons. The largest values of predictability and skill for both variables and seasons are found over northwestern South America, while modest but still significant values are found for extratropical precipitation over southeastern South America and the extratropical Andes. The predictability levels of both variables in ENSO years are slightly higher, although with the same spatial distribution, than those obtained considering all years. Nevertheless, predictability at the tropics for both variables and seasons diminishes in both warm and cold ENSO years with respect to that in all years. The latter can be attributed to changes in signal rather than in the noise. Predictability and prediction skill for low-level winds and upper-level zonal winds over South America were also assessed. Maximum levels of predictability for low-level winds were found where the maximum mean values are observed, i.e., in the regions associated with the equatorial trade winds, the midlatitude westerlies and the South American Low-Level Jet. Predictability maxima for upper-level zonal winds are located where the subtropical jet peaks.
Seasonal changes in wind predictability are observed that seem to be related to

  7. Predicting clinically significant changes in motor and functional outcomes after robot-assisted stroke rehabilitation.

    Science.gov (United States)

    Hsieh, Yu-wei; Lin, Keh-chung; Wu, Ching-yi; Lien, Hen-yu; Chen, Jean-lon; Chen, Chih-chi; Chang, Wei-han

    2014-02-01

To investigate the predictors of minimal clinically important changes on outcome measures after robot-assisted therapy (RT). Observational cohort study. Outpatient rehabilitation clinics. A cohort of outpatients with stroke (N=55). Patients with stroke received RT for 90 to 105 min/d, 5 d/wk, for 4 weeks. Outcome measures, including the Fugl-Meyer Assessment (FMA) and Motor Activity Log (MAL), were measured before and after the intervention. Potential predictors included age, sex, side of lesion, time since stroke onset, finger extension, Box and Block Test (BBT) score, and FMA distal score. Statistical analysis showed that the BBT score (odds ratio [OR] = 1.06; P=.04) was a significant predictor of clinically important changes in the FMA. Being a woman (OR=3.9; P=.05) and the BBT score (OR=1.07; P=.02) were the 2 significant predictors of clinically significant changes in the MAL amount of use subscale. The BBT score was the significant predictor of an increased probability of achieving clinically important changes in the MAL quality of movement subscale (OR=1.07; P=.02). The R² values for the 3 logistic regression models were low (.114-.272). The results revealed that patients with stroke who had greater manual dexterity as measured by the BBT appear to have a higher probability of achieving clinically significant motor and functional outcomes after RT. Further studies are needed to evaluate other potential predictors to improve the models and validate the findings. Copyright © 2014 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
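Odds ratios like those above come from logistic regression, where the OR for a predictor is the exponential of its fitted coefficient. A compact NumPy sketch fitting a single predictor by Newton-Raphson on synthetic data (not the study's data):

```python
import numpy as np

def logistic_or(x, y, iters=25):
    """Fit logit P(y=1) = b0 + b1*x by Newton-Raphson; return OR = exp(b1)."""
    X = np.column_stack([np.ones_like(x, dtype=float), x])
    beta = np.zeros(2)
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ beta))   # current fitted probabilities
        W = p * (1 - p)                       # Bernoulli variance weights
        H = X.T @ (X * W[:, None])            # negative Hessian of log-likelihood
        beta += np.linalg.solve(H, X.T @ (y - p))
    return np.exp(beta[1])                    # odds ratio per 1-unit increase in x
```

For a binary predictor the fitted OR equals the sample odds ratio from the 2x2 table, which gives an easy correctness check.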

  8. PREDICTIVE CAPACITY OF ARCH FAMILY MODELS

    Directory of Open Access Journals (Sweden)

    Raphael Silveira Amaro

    2016-03-01

Full Text Available In the last decades, a remarkable number of models, variants of the Autoregressive Conditional Heteroscedastic family, have been developed and empirically tested, making the process of choosing a particular model extremely complex. This research aims to compare, using the Model Confidence Set procedure, the predictive capacity of five conditional heteroskedasticity models under eight different statistical probability distributions. The financial series used refer to the log-return series of the Bovespa index and the Dow Jones Industrial Index in the period between 27 October 2008 and 30 December 2014. The empirical evidence shows that, in general, the competing models have great homogeneity in making predictions, whether for a stock market of a developed country or for a stock market of a developing country. An equivalent result can be inferred for the statistical probability distributions that were used.
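The workhorse of this model family is GARCH(1,1), whose conditional-variance recursion is easy to state in code. A minimal sketch; the parameter values (omega, alpha, beta) are illustrative defaults, not estimates from the Bovespa or Dow Jones series:

```python
import numpy as np

def garch11_variance(returns, omega=0.05, alpha=0.1, beta=0.85):
    """Conditional variances from the GARCH(1,1) recursion
    sigma_t^2 = omega + alpha * r_{t-1}^2 + beta * sigma_{t-1}^2.

    Returns an array one element longer than `returns`; the last
    entry is the one-step-ahead variance forecast.
    """
    r = np.asarray(returns, float)
    sigma2 = np.empty(len(r) + 1)
    sigma2[0] = omega / (1 - alpha - beta)    # unconditional variance as seed
    for t in range(len(r)):
        sigma2[t + 1] = omega + alpha * r[t]**2 + beta * sigma2[t]
    return sigma2
```

A large shock (r² above the current variance) raises the next-step forecast, and the persistence alpha + beta controls how slowly that shock decays, which is what distinguishes competing models in this family.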

  9. Predictive QSAR modeling of phosphodiesterase 4 inhibitors.

    Science.gov (United States)

    Kovalishyn, Vasyl; Tanchuk, Vsevolod; Charochkina, Larisa; Semenuta, Ivan; Prokopenko, Volodymyr

    2012-02-01

    A series of diverse organic compounds, phosphodiesterase type 4 (PDE-4) inhibitors, have been modeled using a QSAR-based approach. 48 QSAR models were compared by following the same procedure with different combinations of descriptors and machine learning methods. QSAR methodologies used random forests and associative neural networks. The predictive ability of the models was tested through leave-one-out cross-validation, giving a Q² = 0.66-0.78 for regression models and total accuracies Ac=0.85-0.91 for classification models. Predictions for the external evaluation sets obtained accuracies in the range of 0.82-0.88 (for active/inactive classifications) and Q² = 0.62-0.76 for regressions. The method showed itself to be a potential tool for estimation of IC₅₀ of new drug-like candidates at early stages of drug development. Copyright © 2011 Elsevier Inc. All rights reserved.

  10. Signature prediction for model-based automatic target recognition

    Science.gov (United States)

    Keydel, Eric R.; Lee, Shung W.

    1996-06-01

The moving and stationary target recognition (MSTAR) model-based automatic target recognition (ATR) system utilizes a paradigm which matches features extracted from an unknown SAR target signature against predictions of those features generated from models of the sensing process and candidate target geometries. The candidate target geometry yielding the best match between predicted and extracted features defines the identity of the unknown target. MSTAR will extend the current model-based ATR state-of-the-art in a number of significant directions. These include: use of Bayesian techniques for evidence accrual, reasoning over target subparts, coarse-to-fine hypothesis search strategies, and explicit reasoning over target articulation, configuration, occlusion, and lay-over. These advances also imply significant technical challenges, particularly for the MSTAR feature prediction module (MPM). In addition to accurate electromagnetics, the MPM must provide traceback between input target geometry and output features, on-line target geometry manipulation, target subpart feature prediction, explicit models for local scene effects, and generation of sensitivity and uncertainty measures for the predicted features. This paper describes the MPM design which is being developed to satisfy these requirements. The overall module structure is presented, along with the specific design elements focused on MSTAR requirements. Particular attention is paid to design elements that enable on-line prediction of features within the time constraints mandated by model-driven ATR. Finally, the current status, development schedule, and further extensions in the module design are described.

  11. MJO prediction skill, predictability, and teleconnection impacts in the Beijing Climate Center Atmospheric General Circulation Model

    Science.gov (United States)

    Wu, Jie; Ren, Hong-Li; Zuo, Jinqing; Zhao, Chongbo; Chen, Lijuan; Li, Qiaoping

    2016-09-01

    This study evaluates performance of Madden-Julian oscillation (MJO) prediction in the Beijing Climate Center Atmospheric General Circulation Model (BCC_AGCM2.2). By using the real-time multivariate MJO (RMM) indices, it is shown that the MJO prediction skill of BCC_AGCM2.2 extends to about 16-17 days before the bivariate anomaly correlation coefficient drops to 0.5 and the root-mean-square error increases to the level of the climatological prediction. The prediction skill showed a seasonal dependence, with the highest skill occurring in boreal autumn, and a phase dependence with higher skill for predictions initiated from phases 2-4. The results of the MJO predictability analysis showed that the upper bounds of the prediction skill can be extended to 26 days by using a single-member estimate, and to 42 days by using the ensemble-mean estimate, which also exhibited an initial amplitude and phase dependence. The observed relationship between the MJO and the North Atlantic Oscillation was accurately reproduced by BCC_AGCM2.2 for most initial phases of the MJO, accompanied with the Rossby wave trains in the Northern Hemisphere extratropics driven by MJO convection forcing. Overall, BCC_AGCM2.2 displayed a significant ability to predict the MJO and its teleconnections without interacting with the ocean, which provided a useful tool for fully extracting the predictability source of subseasonal prediction.
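The bivariate anomaly correlation coefficient used to verify RMM-based MJO forecasts is, under the standard definition, computed jointly over both RMM components; a sketch (variable names are mine):

```python
import numpy as np

def bivariate_acc(f1, f2, o1, o2):
    """Bivariate anomaly correlation between forecast (f1, f2) and
    observed (o1, o2) RMM indices over a set of verification times."""
    f1, f2, o1, o2 = map(np.asarray, (f1, f2, o1, o2))
    num = np.sum(f1 * o1 + f2 * o2)
    den = np.sqrt(np.sum(f1**2 + f2**2)) * np.sqrt(np.sum(o1**2 + o2**2))
    return num / den
```

Skill horizons such as the 16-17 days quoted above are read off as the lead time at which this quantity, computed per lead over all forecast start dates, drops below 0.5.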

  12. Modelling the predictive performance of credit scoring

    Directory of Open Access Journals (Sweden)

    Shi-Wei Shen

    2013-02-01

    Full Text Available Orientation: The article discussed the importance of rigour in credit risk assessment.Research purpose: The purpose of this empirical paper was to examine the predictive performance of credit scoring systems in Taiwan.Motivation for the study: Corporate lending remains a major business line for financial institutions. However, in light of the recent global financial crises, it has become extremely important for financial institutions to implement rigorous means of assessing clients seeking access to credit facilities.Research design, approach and method: Using a data sample of 10 349 observations drawn between 1992 and 2010, logistic regression models were utilised to examine the predictive performance of credit scoring systems.Main findings: A test of Goodness of fit demonstrated that credit scoring models that incorporated the Taiwan Corporate Credit Risk Index (TCRI, micro- and also macroeconomic variables possessed greater predictive power. This suggests that macroeconomic variables do have explanatory power for default credit risk.Practical/managerial implications: The originality in the study was that three models were developed to predict corporate firms’ defaults based on different microeconomic and macroeconomic factors such as the TCRI, asset growth rates, stock index and gross domestic product.Contribution/value-add: The study utilises different goodness of fits and receiver operator characteristics during the examination of the robustness of the predictive power of these factors.

  13. Calibrated predictions for multivariate competing risks models.

    Science.gov (United States)

    Gorfine, Malka; Hsu, Li; Zucker, David M; Parmigiani, Giovanni

    2014-04-01

    Prediction models for time-to-event data play a prominent role in assessing the individual risk of a disease, such as cancer. Accurate disease prediction models provide an efficient tool for identifying individuals at high risk, and provide the groundwork for estimating the population burden and cost of disease and for developing patient care guidelines. We focus on risk prediction of a disease in which family history is an important risk factor that reflects inherited genetic susceptibility, shared environment, and common behavior patterns. In this work family history is accommodated using frailty models, with the main novel feature being allowing for competing risks, such as other diseases or mortality. We show through a simulation study that naively treating competing risks as independent right censoring events results in non-calibrated predictions, with the expected number of events overestimated. Discrimination performance is not affected by ignoring competing risks. Our proposed prediction methodologies correctly account for competing events, are very well calibrated, and easy to implement.
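The overestimation from treating competing events as independent censoring can be seen in a two-exponential toy model, where both quantities have closed forms (the rates below are illustrative, not from the paper):

```python
import math

def true_cif(t, lam1, lam2):
    """Cumulative incidence of cause 1 under exponential cause-specific
    hazards lam1 (cause of interest) and lam2 (competing cause)."""
    tot = lam1 + lam2
    return (lam1 / tot) * (1 - math.exp(-tot * t))

def naive_risk(t, lam1):
    """'Risk' obtained by treating competing events as independent
    censoring: the marginal exponential distribution 1 - exp(-lam1 * t)."""
    return 1 - math.exp(-lam1 * t)
```

At any t > 0 the naive estimate exceeds the true cumulative incidence, so the expected number of cause-1 events is overestimated, which is exactly the miscalibration the abstract describes.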

  14. Preoperative prediction model of outcome after cholecystectomy for symptomatic gallstones

    DEFF Research Database (Denmark)

    Borly, L; Anderson, I B; Bardram, Linda

    1999-01-01

    BACKGROUND: After cholecystectomy for symptomatic gallstone disease 20%-30% of the patients continue to have abdominal pain. The aim of this study was to investigate whether preoperative variables could predict the symptomatic outcome after cholecystectomy. METHODS: One hundred and two patients...... and sonography evaluated gallbladder motility, gallstones, and gallbladder volume. Preoperative variables in patients with or without postcholecystectomy pain were compared statistically, and significant variables were combined in a logistic regression model to predict the postoperative outcome. RESULTS: Eighty...

  15. Global Solar Dynamo Models: Simulations and Predictions

    Indian Academy of Sciences (India)

    Mausumi Dikpati; Peter A. Gilman

    2008-03-01

Flux-transport type solar dynamos have achieved considerable success in correctly simulating many solar cycle features, and are now being used for prediction of solar cycle timing and amplitude. We first define flux-transport dynamos and demonstrate how they work. The essential added ingredient in this class of models is meridional circulation, which governs the dynamo period and also plays a crucial role in determining the Sun’s memory about its past magnetic fields. We show that flux-transport dynamo models can explain many key features of solar cycles. Then we show that a predictive tool can be built from this class of dynamo that can be used to predict mean solar cycle features by assimilating magnetic field data from previous cycles.

  16. Quasi-degenerate Neutrino mass models and their significance: A model independent investigation

    CERN Document Server

    Roy, S

    2016-01-01

The prediction of the possible ordering of neutrino masses relies mostly on the model selected. Alienating the $\mu-\tau$ interchange symmetry from discrete flavour symmetry based models turns the neutrino mass matrix less predictive. But this inspires one to seek the answer from other phenomenological frameworks. We need a proper parametrization of the neutrino mass matrices concerning individual hierarchies. In the present work, we attempt to study the six different cases of Quasi-degenerate (QDN) neutrino models. The related mass matrices, $m_{LL}^{\

  17. Model Predictive Control of Sewer Networks

    Science.gov (United States)

    Pedersen, Einar B.; Herbertsson, Hannes R.; Niemann, Henrik; Poulsen, Niels K.; Falk, Anne K. V.

    2017-01-01

Developments in solutions for the management of urban drainage are of vital importance, as the amount of sewer water from urban areas continues to increase due to the growth of the world’s population and changing climate conditions. How a sewer network is structured, monitored and controlled has thus become an essential factor for efficient performance of waste water treatment plants. This paper examines methods for simplified modelling and control of a sewer network. A practical approach to the problem is taken by analysing a simplified design model, which is based on the Barcelona benchmark model. Because of the inherent constraints, the applied approach is based on Model Predictive Control.
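Model Predictive Control can be illustrated on a single storage tank with level dynamics x_{t+1} = x_t + u_t + d_t: at each step, optimize the pump commands over a finite horizon, apply only the first command, then re-optimize at the next step. A minimal unconstrained least-squares sketch; the Barcelona benchmark network is far richer, and all names and numbers here are illustrative:

```python
import numpy as np

def mpc_step(x0, d_forecast, x_ref=0.0, horizon=5):
    """One receding-horizon step for the tank x_{t+1} = x_t + u_t + d_t.

    Chooses pump commands u_0..u_{H-1} minimizing
    sum_k (x_k - x_ref)^2 + 0.1 * sum_k u_k^2, and returns only u_0.
    """
    H = horizon
    L = np.tril(np.ones((H, H)))                 # cumulative-sum operator
    x_free = x0 + L @ d_forecast[:H]             # predicted levels with no control
    # Stack state-tracking and control-effort terms into one least-squares fit
    A = np.vstack([L, np.sqrt(0.1) * np.eye(H)])
    b = np.concatenate([x_ref - x_free, np.zeros(H)])
    u = np.linalg.lstsq(A, b, rcond=None)[0]
    return u[0]                                  # apply only the first command
```

Closing the loop, a level above the reference produces a negative (pump-out) command, and repeated application drives the level toward the reference; a realistic sewer formulation adds overflow and actuator constraints, which is what makes MPC the natural choice.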

  18. Serum nucleosomes during neoadjuvant chemotherapy in patients with cervical cancer. Predictive and prognostic significance

    Directory of Open Access Journals (Sweden)

    Cetina Lucely

    2005-06-01

    Background: It has been shown that free DNA circulates in the serum of patients with cancer and that at least part of it is present in the form of oligo- and mononucleosomes, a marker of cell death. Preliminary data have shown a good correlation between the decrease of nucleosomes and response and prognosis. Here, we performed pre- and post-chemotherapy determinations of serum nucleosomes with an enzyme-linked immunosorbent assay (ELISA) method in a group of patients with cervical cancer receiving neoadjuvant chemotherapy. Methods: From December 2000 to June 2001, 41 patients with cervical cancer staged as FIGO stages IB2-IIIB received three 21-day courses of carboplatin and paclitaxel, both administered at day 1; then, patients underwent radical hysterectomy. Nucleosomes were measured the day before chemotherapy (baseline), at day seven of the first course, and at day seven of the third course. Values of nucleosomes were analyzed with regard to pathologic response and to progression-free and overall survival. Results: All patients completed chemotherapy, were evaluable for pathologic response, and had nucleosome levels determined. At a mean follow-up of 23 months (range, 7–26 months), projected progression-free and overall survival were 80.3 and 80.4%, respectively. Mean differential values of nucleosomes were lower in the third course than in the first course (p < 0.001). The decrease in the third course correlated with pathologic response (p = 0.041). Survival analysis showed statistically significantly better progression-free and overall survival in patients who showed lower levels at the third course (p = 0.0243 and p = 0.0260, respectively). Cox regression analysis demonstrated that a nucleosome increase in the third course raised the risk of death to 6.86 (95% confidence interval [CI], 0.84–56.0). Conclusion: Serum nucleosomes may have a predictive role for response and prognostic significance in patients with cervical cancer

  19. QSPR Models for Octane Number Prediction

    Directory of Open Access Journals (Sweden)

    Jabir H. Al-Fahemi

    2014-01-01

    Quantitative structure-property relationship (QSPR) modelling is performed as a means to predict the octane number of hydrocarbons by correlating the property with parameters calculated from molecular structure; such parameters are molecular mass M, hydration energy EH, boiling point BP, octanol/water distribution coefficient logP, molar refractivity MR, critical pressure CP, critical volume CV, and critical temperature CT. Principal component analysis (PCA) and the multiple linear regression technique (MLR) were performed to examine the relationship between these parameters and the octane number of hydrocarbons. The results of PCA explain the interrelationships between the octane number and the different variables; correlation coefficients were calculated using MS Excel. The data set was split into a training set of 40 hydrocarbons and a validation set of 25 hydrocarbons. The linear relationship between the selected descriptors and the octane number has a coefficient of determination R2 = 0.932, statistical significance F = 53.21, and standard error s = 7.7. The obtained QSPR model was applied to the validation set, giving RCV2 = 0.942 and s = 6.328.
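
The final step of such a QSPR workflow is fitting a linear model on a training set and checking the coefficient of determination on a held-out validation set. The tiny single-descriptor sketch below, on made-up numbers standing in for the paper's descriptor table, shows that step; the real study regresses on several descriptors at once.

```python
# Fit y = a + b*x on training data, then score R^2 on validation data.

def fit_line(xs, ys):
    """Ordinary least squares for one descriptor; returns (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def r_squared(xs, ys, a, b):
    """Coefficient of determination of predictions a + b*x against ys."""
    my = sum(ys) / len(ys)
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return 1 - ss_res / ss_tot

# Hypothetical descriptor (say, boiling point) vs. octane number.
train_x, train_y = [80, 100, 120, 140], [95, 90, 85, 80]
valid_x, valid_y = [90, 110, 130], [92.4, 87.6, 82.4]
a, b = fit_line(train_x, train_y)
r2 = r_squared(valid_x, valid_y, a, b)
```

Scoring R² on the held-out compounds, as the abstract does with RCV², guards against a fit that only memorizes the training set.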

  20. DKIST Polarization Modeling and Performance Predictions

    Science.gov (United States)

    Harrington, David

    2016-05-01

    Calibrating the Mueller matrices of large aperture telescopes and associated coude instrumentation requires astronomical sources and several modeling assumptions to predict the behavior of the system polarization with field of view, altitude, azimuth and wavelength. The Daniel K. Inouye Solar Telescope (DKIST) polarimetric instrumentation requires very high accuracy calibration of a complex coude path with an off-axis f/2 primary mirror, time-dependent optical configurations and a substantial field of view. Polarization predictions across a diversity of optical configurations, tracking scenarios, slit geometries and vendor coating formulations are critical to both construction and continued operations efforts. Recent daytime-sky-based polarization calibrations of the 4m AEOS telescope and HiVIS spectropolarimeter on Haleakala have provided system Mueller matrices over full telescope articulation for a 15-reflection coude system. AEOS and HiVIS are a DKIST analog with a many-fold coude optical feed and similar mirror coatings creating 100% polarization cross-talk with altitude, azimuth and wavelength. Polarization modeling predictions using Zemax have successfully matched the altitude-azimuth-wavelength dependence on HiVIS to within the few-percent amplitude limitations of several instrument artifacts. Polarization predictions for coude beam paths depend greatly on modeling the angle-of-incidence dependences in powered optics and the mirror coating formulations. A 6-month HiVIS daytime sky calibration plan has been analyzed for accuracy under a wide range of sky conditions and data analysis algorithms. Predictions of polarimetric performance for the DKIST first-light instrumentation suite have been created under a range of configurations. These new modeling tools and polarization predictions have substantial impact for the design, fabrication and calibration process in the presence of manufacturing issues, science use-case requirements and ultimate system calibration.

  1. Modelling Chemical Reasoning to Predict Reactions

    OpenAIRE

    Segler, Marwin H. S.; Waller, Mark P.

    2016-01-01

    The ability to reason beyond established knowledge allows Organic Chemists to solve synthetic problems and to invent novel transformations. Here, we propose a model which mimics chemical reasoning and formalises reaction prediction as finding missing links in a knowledge graph. We have constructed a knowledge graph containing 14.4 million molecules and 8.2 million binary reactions, which represents the bulk of all chemical reactions ever published in the scientific literature. Our model outpe...

  2. Predictive Modeling of the CDRA 4BMS

    Science.gov (United States)

    Coker, Robert; Knox, James

    2016-01-01

    Fully predictive models of the Four Bed Molecular Sieve of the Carbon Dioxide Removal Assembly on the International Space Station are being developed. This virtual laboratory will be used to help reduce mass, power, and volume requirements for future missions. In this paper we describe current and planned modeling developments in the area of carbon dioxide removal to support future crewed Mars missions as well as the resolution of anomalies observed in the ISS CDRA.

  3. Raman Model Predicting Hardness of Covalent Crystals

    OpenAIRE

    Zhou, Xiang-Feng; Qian, Quang-Rui; Sun, Jian; Tian, Yongjun; Wang, Hui-Tian

    2009-01-01

    Based on the fact that both hardness and vibrational Raman spectrum depend on the intrinsic property of chemical bonds, we propose a new theoretical model for predicting hardness of a covalent crystal. The quantitative relationship between hardness and vibrational Raman frequencies deduced from the typical zincblende covalent crystals is validated to be also applicable for the complex multicomponent crystals. This model enables us to nondestructively and indirectly characterize the hardness o...

  4. Predictive Modelling of Mycotoxins in Cereals

    NARCIS (Netherlands)

    Fels, van der H.J.; Liu, C.

    2015-01-01

    In this article, the summaries of the presentations given during the 30th meeting of the Fusarium Working Group are presented. The topics are: Predictive Modelling of Mycotoxins in Cereals; Microbial degradation of DON; Exposure to green leaf volatiles primes wheat against FHB but boosts

  5. Unreachable Setpoints in Model Predictive Control

    DEFF Research Database (Denmark)

    Rawlings, James B.; Bonné, Dennis; Jørgensen, John Bagterp

    2008-01-01

    steady state is established for terminal constraint model predictive control (MPC). The region of attraction is the steerable set. Existing analysis methods for closed-loop properties of MPC are not applicable to this new formulation, and a new analysis method is developed. It is shown how to extend...

  7. Prediction modelling for population conviction data

    NARCIS (Netherlands)

    Tollenaar, N.

    2017-01-01

    In this thesis, the possibilities of using prediction models for judicial penal case data are investigated. The development and refinement of a risk assessment scale based on these data is discussed. When false positives are weighted as severely as false negatives, 70% of cases can be classified correctly.

  8. A Predictive Model for MSSW Student Success

    Science.gov (United States)

    Napier, Angela Michele

    2011-01-01

    This study tested a hypothetical model for predicting both graduate GPA and graduation of University of Louisville Kent School of Social Work Master of Science in Social Work (MSSW) students entering the program during the 2001-2005 school years. The preexisting characteristics of demographics, academic preparedness and culture shock along with…

  9. Predictability of extreme values in geophysical models

    NARCIS (Netherlands)

    Sterk, A.E.; Holland, M.P.; Rabassa, P.; Broer, H.W.; Vitolo, R.

    2012-01-01

    Extreme value theory in deterministic systems is concerned with unlikely large (or small) values of an observable evaluated along evolutions of the system. In this paper we study the finite-time predictability of extreme values, such as convection, energy, and wind speeds, in three geophysical models

  10. A revised prediction model for natural conception

    NARCIS (Netherlands)

    Bensdorp, A.J.; Steeg, J.W. van der; Steures, P.; Habbema, J.D.; Hompes, P.G.; Bossuyt, P.M.; Veen, F. van der; Mol, B.W.; Eijkemans, M.J.; Kremer, J.A.M.; et al.,

    2017-01-01

    One of the aims in reproductive medicine is to differentiate between couples that have favourable chances of conceiving naturally and those that do not. Since the development of the prediction model of Hunault, characteristics of the subfertile population have changed. The objective of this analysis

  11. Distributed Model Predictive Control via Dual Decomposition

    DEFF Research Database (Denmark)

    Biegel, Benjamin; Stoustrup, Jakob; Andersen, Palle

    2014-01-01

    This chapter presents dual decomposition as a means to coordinate a number of subsystems coupled by state and input constraints. Each subsystem is equipped with a local model predictive controller while a centralized entity manages the subsystems via prices associated with the coupling constraints...

  13. Leptogenesis in minimal predictive seesaw models

    CERN Document Server

    Björkeroth, Fredrik; Varzielas, Ivo de Medeiros; King, Stephen F

    2015-01-01

    We estimate the Baryon Asymmetry of the Universe (BAU) arising from leptogenesis within a class of minimal predictive seesaw models involving two right-handed neutrinos and simple Yukawa structures with one texture zero. The two right-handed neutrinos are dominantly responsible for the "atmospheric" and "solar" neutrino masses with Yukawa couplings to $(\

  14. Technical note: A linear model for predicting δ13Cprotein.

    Science.gov (United States)

    Pestle, William J; Hubbe, Mark; Smith, Erin K; Stevenson, Joseph M

    2015-08-01

    Development of a model for the prediction of δ13Cprotein from δ13Ccollagen and Δ13Cap-co. Model-generated values could, in turn, serve as "consumer" inputs for multisource mixture modeling of paleodiet. Linear regression analysis of previously published controlled-diet data facilitated the development of a mathematical model for predicting δ13Cprotein (with an experimentally generated error term) from isotopic data routinely generated during the analysis of osseous remains (δ13Cco and Δ13Cap-co). Regression analysis resulted in a two-term linear model, δ13Cprotein (‰) = (0.78 × δ13Cco) − (0.58 × Δ13Cap-co) − 4.7, possessing a high R value of 0.93 (r2 = 0.86, P < 0.01) and an experimentally generated error term of ±1.9‰ for any predicted individual value of δ13Cprotein. This model was tested using isotopic data from Formative Period individuals from northern Chile's Atacama Desert. The model presented here appears to hold significant potential for predicting the carbon isotope signature of dietary protein using only data routinely generated in the course of stable isotope analysis of human osseous remains. These predicted values are ideal for use in multisource mixture modeling of dietary protein source contribution. © 2015 Wiley Periodicals, Inc.
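
The published two-term model can be applied directly; the coefficients below are those reported in the abstract, while the sample input values are hypothetical, invented purely for illustration.

```python
# delta13C_protein = 0.78 * delta13C_co - 0.58 * Delta13C_ap-co - 4.7 (per mil),
# with a reported prediction error of about +/- 1.9 per mil.

def predict_d13c_protein(d13c_collagen, delta13c_ap_co):
    """Predict the carbon isotope signature of dietary protein (per mil)."""
    return 0.78 * d13c_collagen - 0.58 * delta13c_ap_co - 4.7

# Hypothetical individual: collagen at -18.0 per mil,
# apatite-collagen spacing of 5.0 per mil.
value = predict_d13c_protein(-18.0, 5.0)
```

Any predicted value should be quoted with the model's ±1.9‰ error term before being fed into a mixture model.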

  15. Specialized Language Models using Dialogue Predictions

    CERN Document Server

    Popovici, C; Popovici, Cosmin; Baggia, Paolo

    1996-01-01

    This paper analyses language modeling in spoken dialogue systems for accessing a database. The use of several language models obtained by exploiting dialogue predictions gives better results than the use of a single model for the whole dialogue interaction. For this reason several models have been created, each one for a specific system question, such as the request or the confirmation of a parameter. The use of dialogue-dependent language models increases performance both at the recognition and at the understanding level, especially on answers to system requests. Moreover, other methods to increase performance, like automatic clustering of vocabulary words or the use of better acoustic models during recognition, do not affect the improvements given by dialogue-dependent language models. The system used in our experiments is Dialogos, the Italian spoken dialogue system used for accessing railway timetable information over the telephone. The experiments were carried out on a large corpus of dialogues coll...

  16. The ARIC predictive model reliably predicted risk of type II diabetes in Asian populations

    Directory of Open Access Journals (Sweden)

    Chin Calvin

    2012-04-01

    Background: Identification of high-risk individuals is crucial for effective implementation of type 2 diabetes mellitus prevention programs. Several studies have shown that multivariable predictive functions perform as well as the 2-hour post-challenge glucose in identifying these high-risk individuals. The performance of these functions in Asian populations, where the rise in prevalence of type 2 diabetes mellitus is expected to be the greatest in the next several decades, is relatively unknown. Methods: Using data from three Asian populations in Singapore, we compared the performance of three multivariate predictive models in terms of their discriminatory power and calibration quality: the San Antonio Health Study model, the Atherosclerosis Risk in Communities (ARIC) model, and the Framingham model. Results: The San Antonio Health Study and Atherosclerosis Risk in Communities models had better discriminative power than using only fasting plasma glucose or the 2-hour post-challenge glucose. However, the Framingham model did not perform significantly better than fasting glucose or the 2-hour post-challenge glucose. All published models suffered from poor calibration. After recalibration, the Atherosclerosis Risk in Communities model achieved good calibration, the San Antonio Health Study model showed a significant lack of fit in females, and the Framingham model showed a significant lack of fit in both females and males. Conclusions: We conclude that adoption of the ARIC model for Asian populations is feasible and highly recommended when local prospective data are unavailable.

  17. Disease prediction models and operational readiness.

    Directory of Open Access Journals (Sweden)

    Courtney D Corley

    The objective of this manuscript is to present a systematic review of biosurveillance models that operate on select agents and can forecast the occurrence of a disease event. We define a disease event to be a biological event with focus on the One Health paradigm. These events are characterized by evidence of infection and/or disease condition. We reviewed models that attempted to predict a disease event, not merely its transmission dynamics, and we considered models involving pathogens of concern as determined by the US National Select Agent Registry (as of June 2011). We searched commercial and government databases and harvested Google search results for eligible models, using terms and phrases provided by public health analysts relating to biosurveillance, remote sensing, risk assessments, spatial epidemiology, and ecological niche modeling. After removal of duplications and extraneous material, a core collection of 6,524 items was established, and these publications along with their abstracts are presented in a semantic wiki at http://BioCat.pnnl.gov. As a result, we systematically reviewed 44 papers, and the results are presented in this analysis. We identified 44 models, classified as one or more of the following: event prediction (4), spatial (26), ecological niche (28), diagnostic or clinical (6), spread or response (9), and reviews (3). The model parameters (e.g., etiology, climatic, spatial, cultural) and data sources (e.g., remote sensing, non-governmental organizations, expert opinion, epidemiological) were recorded and reviewed. A component of this review is the identification of verification and validation (V&V) methods applied to each model, if any V&V method was reported. All models were classified as either having undergone some verification or validation method, or no verification or validation. We close by outlining an initial set of operational readiness level guidelines for disease prediction models based upon established Technology

  18. Model output statistics applied to wind power prediction

    Energy Technology Data Exchange (ETDEWEB)

    Joensen, A.; Giebel, G.; Landberg, L. [Risoe National Lab., Roskilde (Denmark); Madsen, H.; Nielsen, H.A. [The Technical Univ. of Denmark, Dept. of Mathematical Modelling, Lyngby (Denmark)

    1999-03-01

    Being able to predict the output of a wind farm online for a day or two in advance has significant advantages for utilities, such as a better possibility to schedule fossil-fuelled power plants and a better position on electricity spot markets. In this paper prediction methods based on Numerical Weather Prediction (NWP) models are considered. The spatial resolution used in NWP models implies that these predictions are not valid locally at a specific wind farm. Furthermore, due to the non-stationary nature and complexity of the processes in the atmosphere, and occasional changes of NWP models, the deviation between the predicted and the measured wind will be time dependent. If observational data are available, and if the deviation between the predictions and the observations exhibits systematic behavior, this should be corrected for; if statistical methods are used, this approach is usually referred to as MOS (Model Output Statistics). The influence of atmospheric turbulence intensity, topography, prediction horizon length and auto-correlation of wind speed and power is considered, and to take the time variations into account, adaptive estimation methods are applied. Three estimation techniques are considered and compared: extended Kalman filtering, recursive least squares and a new modified recursive least squares algorithm. (au) EU-JOULE-3. 11 refs.
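
The MOS idea can be sketched as adaptively estimating a linear correction between the NWP forecast and the on-site measurement with recursive least squares and a forgetting factor, one of the estimation techniques the paper compares. The parameterization, forgetting factor, and synthetic "observations" below are illustrative assumptions, not the paper's actual setup.

```python
# Recursive least squares (RLS) with forgetting factor for y ~ a + b * nwp.

def rls_update(theta, P, x, y, lam=0.99):
    """One RLS step for the 2-parameter model y = theta[0] + theta[1]*x."""
    phi = [1.0, x]                       # regressor vector
    Pphi = [P[0][0] * phi[0] + P[0][1] * phi[1],
            P[1][0] * phi[0] + P[1][1] * phi[1]]
    denom = lam + phi[0] * Pphi[0] + phi[1] * Pphi[1]
    k = [Pphi[0] / denom, Pphi[1] / denom]          # gain vector
    err = y - (theta[0] * phi[0] + theta[1] * phi[1])
    theta = [theta[0] + k[0] * err, theta[1] + k[1] * err]
    P = [[(P[0][0] - k[0] * Pphi[0]) / lam, (P[0][1] - k[0] * Pphi[1]) / lam],
         [(P[1][0] - k[1] * Pphi[0]) / lam, (P[1][1] - k[1] * Pphi[1]) / lam]]
    return theta, P

theta, P = [0.0, 0.0], [[100.0, 0.0], [0.0, 100.0]]
# Synthetic site: measured wind relates to the NWP forecast as y = 0.5 + 0.9*x.
for nwp in [3.0, 5.0, 7.0, 4.0, 6.0, 8.0, 5.5, 6.5]:
    measured = 0.5 + 0.9 * nwp
    theta, P = rls_update(theta, P, nwp, measured)
```

The forgetting factor discounts old data exponentially, which is what lets the correction track the time-dependent NWP bias the abstract describes.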

  19. Model Predictive Control based on Finite Impulse Response Models

    DEFF Research Database (Denmark)

    Prasath, Guru; Jørgensen, John Bagterp

    2008-01-01

    We develop a regularized l2 finite impulse response (FIR) predictive controller with input and input-rate constraints. Feedback is based on a simple constant output disturbance filter. The performance of the predictive controller in the face of plant-model mismatch is investigated by simulations...

  20. ENSO Prediction using Vector Autoregressive Models

    Science.gov (United States)

    Chapman, D. R.; Cane, M. A.; Henderson, N.; Lee, D.; Chen, C.

    2013-12-01

    A recent comparison (Barnston et al., 2012, BAMS) shows the ENSO forecasting skill of dynamical models now exceeds that of statistical models, but the best statistical models are comparable to all but the very best dynamical models. In this comparison the leading statistical model is the one based on the Empirical Model Reduction (EMR) method. Here we report on experiments with multilevel Vector Autoregressive models using only sea surface temperatures (SSTs) as predictors. VAR(L) models generalize Linear Inverse Models (LIM), which are a VAR(1) method, as well as multilevel univariate autoregressive models. Optimal forecast skill is achieved using 12 to 14 months of prior state information (i.e., 12-14 levels), which allows SSTs alone to capture the effects of other variables such as heat content as well as seasonality. The use of multiple levels allows the model advancing one month at a time to perform at least as well for a 6-month forecast as a model constructed to explicitly forecast 6 months ahead. We infer that the multilevel model has fully captured the linear dynamics (cf. Penland and Magorian, 1993, J. Climate). Finally, while VAR(L) is equivalent to L-level EMR, we show in a 150-year cross-validated assessment that we can increase forecast skill by improving on the EMR initialization procedure. The greatest benefit of this change is in allowing the prediction to make effective use of information over many more months.
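
The lag structure behind an AR(L)/VAR(L) model can be sketched in a few lines: each row of the design matrix holds the previous L values, and one-step-ahead forecasts are iterated month by month to reach longer horizons. The coefficients below are fixed by assumption rather than estimated, to keep the sketch short; the univariate case stands in for the full multivariate VAR.

```python
# Lag-stacking and iterated one-step forecasting for an AR(L) model.

def make_lagged_rows(series, L):
    """Design rows [x_{t-1}, ..., x_{t-L}] paired with targets x_t."""
    rows, targets = [], []
    for t in range(L, len(series)):
        rows.append(series[t - L:t][::-1])  # most recent lag first
        targets.append(series[t])
    return rows, targets

def iterate_forecast(history, coeffs, steps):
    """Advance one step at a time using x_t = sum_i coeffs[i] * x_{t-1-i}."""
    buf = list(history)
    for _ in range(steps):
        lags = buf[-len(coeffs):][::-1]     # most recent lag first
        buf.append(sum(c * x for c, x in zip(coeffs, lags)))
    return buf[len(history):]

rows, targets = make_lagged_rows([1, 2, 3, 4, 5], 2)
forecast = iterate_forecast([1.0, 3.0], [0.5, 0.5], steps=3)
```

In a real VAR(L), the `rows` would be fed to a least-squares fit to estimate the coefficient matrices, and the same iteration would produce the 6-month ENSO forecasts the abstract discusses.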

  1. Gas explosion prediction using CFD models

    Energy Technology Data Exchange (ETDEWEB)

    Niemann-Delius, C.; Okafor, E. [RWTH Aachen Univ. (Germany); Buhrow, C. [TU Bergakademie Freiberg Univ. (Germany)

    2006-07-15

    A number of CFD models are currently available to model gaseous explosions in complex geometries. Some of these tools allow the representation of complex environments within hydrocarbon production plants. In certain explosion scenarios, a correction is usually made for the presence of buildings and other complexities by using crude approximations to obtain realistic estimates of explosion behaviour as can be found when predicting the strength of blast waves resulting from initial explosions. With the advance of computational technology, and greater availability of computing power, computational fluid dynamics (CFD) tools are becoming increasingly available for solving such a wide range of explosion problems. A CFD-based explosion code - FLACS can, for instance, be confidently used to understand the impact of blast overpressures in a plant environment consisting of obstacles such as buildings, structures, and pipes. With its porosity concept representing geometry details smaller than the grid, FLACS can represent geometry well, even when using coarse grid resolutions. The performance of FLACS has been evaluated using a wide range of field data. In the present paper, the concept of computational fluid dynamics (CFD) and its application to gas explosion prediction is presented. Furthermore, the predictive capabilities of CFD-based gaseous explosion simulators are demonstrated using FLACS. Details about the FLACS-code, some extensions made to FLACS, model validation exercises, application, and some results from blast load prediction within an industrial facility are presented. (orig.)

  2. Genetic models of homosexuality: generating testable predictions.

    Science.gov (United States)

    Gavrilets, Sergey; Rice, William R

    2006-12-22

    Homosexuality is a common occurrence in humans and other species, yet its genetic and evolutionary basis is poorly understood. Here, we formulate and study a series of simple mathematical models for the purpose of predicting empirical patterns that can be used to determine the form of selection that leads to polymorphism of genes influencing homosexuality. Specifically, we develop theory to make contrasting predictions about the genetic characteristics of genes influencing homosexuality including: (i) chromosomal location, (ii) dominance among segregating alleles and (iii) effect sizes that distinguish between the two major models for their polymorphism: the overdominance and sexual antagonism models. We conclude that the measurement of the genetic characteristics of quantitative trait loci (QTLs) found in genomic screens for genes influencing homosexuality can be highly informative in resolving the form of natural selection maintaining their polymorphism.

  3. Characterizing Attention with Predictive Network Models.

    Science.gov (United States)

    Rosenberg, M D; Finn, E S; Scheinost, D; Constable, R T; Chun, M M

    2017-04-01

    Recent work shows that models based on functional connectivity in large-scale brain networks can predict individuals' attentional abilities. While being some of the first generalizable neuromarkers of cognitive function, these models also inform our basic understanding of attention, providing empirical evidence that: (i) attention is a network property of brain computation; (ii) the functional architecture that underlies attention can be measured while people are not engaged in any explicit task; and (iii) this architecture supports a general attentional ability that is common to several laboratory-based tasks and is impaired in attention deficit hyperactivity disorder (ADHD). Looking ahead, connectivity-based predictive models of attention and other cognitive abilities and behaviors may potentially improve the assessment, diagnosis, and treatment of clinical dysfunction. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. A Study On Distributed Model Predictive Consensus

    CERN Document Server

    Keviczky, Tamas

    2008-01-01

    We investigate convergence properties of a proposed distributed model predictive control (DMPC) scheme, where agents negotiate to compute an optimal consensus point using an incremental subgradient method based on primal decomposition as described in Johansson et al. [2006, 2007]. The objective of the distributed control strategy is to agree upon and achieve an optimal common output value for a group of agents in the presence of constraints on the agent dynamics using local predictive controllers. Stability analysis using a receding horizon implementation of the distributed optimal consensus scheme is performed. Conditions are given under which convergence can be obtained even if the negotiations do not reach full consensus.

  5. Five-point Likert scaling on MRI predicts clinically significant prostate carcinoma.

    Science.gov (United States)

    Harada, Taisuke; Abe, Takashige; Kato, Fumi; Matsumoto, Ryuji; Fujita, Hiromi; Murai, Sachiyo; Miyajima, Naoto; Tsuchiya, Kunihiko; Maruyama, Satoru; Kudo, Kohsuke; Shinohara, Nobuo

    2015-09-04

    To clarify the relationship between the probability of prostate cancer scaled using a 5-point Likert system and the biological characteristics of the corresponding tumor foci. The present study involved 44 patients undergoing 3.0-Tesla multiparametric MRI before laparoscopic radical prostatectomy. Tracing based on pathological and MRI findings was performed. The relationship between the probability of cancer scaled using the 5-point Likert system and the biological characteristics of the corresponding tumor foci was evaluated. A total of 102 tumor foci were identified histologically from the 44 specimens. Of the 102 tumors, 55 were assigned a score based on MRI findings (score 1: n = 3; score 2: n = 3; score 3: n = 16; score 4: n = 11; score 5: n = 22), while 47 were not pointed out on MRI. The tracing study revealed that the proportion of >0.5 cm3 tumors increased with the Likert score (score 1 or 2: 33%; score 3: 68.8%; score 4 or 5: 90.9%; χ2 test). The proportion of tumors with Gleason score >7 also increased from scale 2 to scale 5 (scale 2: 0%; scale 3: 56.3%; scale 4: 72.7%; scale 5: 90.9%; χ2 test, p = 0.0001). On using score 3 or higher as the threshold of cancer detection on MRI, the detection rate markedly improved if the tumor volume exceeded 0.5 cm3. The Likert scale favorably reflected the corresponding tumor's volume and Gleason score. Our observations show that "score 3 or higher" could be a useful threshold to predict clinically significant carcinoma when considering treatment options.
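
The group comparisons in this abstract are chi-square tests on proportions. A minimal Pearson chi-square statistic on a made-up 2×3 table of counts (not the study's data) illustrates the computation.

```python
# Pearson chi-square statistic: sum over cells of (observed - expected)^2 / expected.

def chi_square(table):
    """Chi-square statistic for a contingency table given as rows of counts."""
    row_tot = [sum(r) for r in table]
    col_tot = [sum(c) for c in zip(*table)]
    total = sum(row_tot)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = row_tot[i] * col_tot[j] / total  # expected under independence
            stat += (obs - exp) ** 2 / exp
    return stat

# Hypothetical counts: small vs. large tumors across score groups 1-2, 3, 4-5.
stat = chi_square([[1, 11, 30],
                   [5, 5, 3]])
```

The statistic is then compared with a chi-square distribution with (rows−1)×(columns−1) degrees of freedom to obtain the p-values quoted in the abstract.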

  6. Using connectome-based predictive modeling to predict individual behavior from brain connectivity.

    Science.gov (United States)

    Shen, Xilin; Finn, Emily S; Scheinost, Dustin; Rosenberg, Monica D; Chun, Marvin M; Papademetris, Xenophon; Constable, R Todd

    2017-03-01

    Neuroimaging is a fast-developing research area in which anatomical and functional images of human brains are collected using techniques such as functional magnetic resonance imaging (fMRI), diffusion tensor imaging (DTI), and electroencephalography (EEG). Technical advances and large-scale data sets have allowed for the development of models capable of predicting individual differences in traits and behavior using brain connectivity measures derived from neuroimaging data. Here, we present connectome-based predictive modeling (CPM), a data-driven protocol for developing predictive models of brain-behavior relationships from connectivity data using cross-validation. This protocol includes the following steps: (i) feature selection, (ii) feature summarization, (iii) model building, and (iv) assessment of prediction significance. We also include suggestions for visualizing the most predictive features (i.e., brain connections). The final result should be a generalizable model that takes brain connectivity data as input and generates predictions of behavioral measures in novel subjects, accounting for a considerable amount of the variance in these measures. It has been demonstrated that the CPM protocol performs as well as or better than many of the existing approaches in brain-behavior prediction. As CPM focuses on linear modeling and a purely data-driven approach, neuroscientists with limited or no experience in machine learning or optimization will find it easy to implement these protocols. Depending on the volume of data to be processed, the protocol can take 10-100 min for model building, 1-48 h for permutation testing, and 10-20 min for visualization of results.
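
The four CPM steps can be sketched on toy data with leave-one-out cross-validation: (i) select features ("edges") correlated with behavior using only the training subjects, (ii) summarize each subject by summing the selected edges, (iii) fit a one-dimensional linear model, and (iv) predict the held-out subject. The toy connectivity matrix, behavior scores, and correlation threshold below are illustrative assumptions, not real neuroimaging data.

```python
# Connectome-based predictive modeling (CPM) skeleton with leave-one-out CV.

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs)
           * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den if den else 0.0

def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def cpm_loocv(edges, behavior, r_thresh=0.7):
    preds, n = [], len(behavior)
    for test in range(n):
        train = [i for i in range(n) if i != test]
        # (i) feature selection on training subjects only
        sel = [j for j in range(len(edges[0]))
               if abs(pearson_r([edges[i][j] for i in train],
                                [behavior[i] for i in train])) > r_thresh]
        # (ii) one summary value per subject: sum of selected edges
        summ = [sum(edges[i][j] for j in sel) for i in range(n)]
        # (iii) linear model fit on training subjects
        a, b = fit_line([summ[i] for i in train], [behavior[i] for i in train])
        # (iv) out-of-fold prediction for the held-out subject
        preds.append(a + b * summ[test])
    return preds

# Toy data: edge 0 tracks behavior perfectly, edge 1 is noise.
edges = [[1.0, 0.1], [2.0, -0.2], [3.0, 0.3],
         [4.0, -0.1], [5.0, 0.2], [6.0, -0.3]]
behavior = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
preds = cpm_loocv(edges, behavior)
```

Restricting feature selection to the training folds, as in step (i), is the detail that keeps the cross-validated estimate honest; the protocol's separate permutation test then assesses the significance of the prediction.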

  7. Development of Interpretable Predictive Models for BPH and Prostate Cancer

    Science.gov (United States)

    Bermejo, Pablo; Vivo, Alicia; Tárraga, Pedro J; Rodríguez-Montes, JA

    2015-01-01

    BACKGROUND Traditional methods for deciding whether to recommend a patient for a prostate biopsy are based on cut-off levels of stand-alone markers such as prostate-specific antigen (PSA) or any of its derivatives. However, in the last decade we have seen the increasing use of predictive models that combine, in a non-linear manner, several predictors that are better able to predict prostate cancer (PC), but these fail to help the clinician to distinguish between PC and benign prostate hyperplasia (BPH) patients. We construct two new models that are capable of predicting both PC and BPH. METHODS An observational study was performed on 150 patients with PSA ≥3 ng/mL and age >50 years. We built a decision tree and a logistic regression model, validated with the leave-one-out methodology, in order to predict PC or BPH, or reject both. RESULTS Statistical dependence with PC and BPH was found for prostate volume (P-value < 0.001), PSA (P-value < 0.001), international prostate symptom score (IPSS; P-value < 0.001), digital rectal examination (DRE; P-value < 0.001), age (P-value < 0.002), antecedents (P-value < 0.006), and meat consumption (P-value < 0.08). The two predictive models that were constructed selected a subset of these, namely volume, PSA, DRE, and IPSS, obtaining an area under the ROC curve (AUC) between 72% and 80% for both PC and BPH prediction. CONCLUSION PSA and volume together help to build predictive models that accurately distinguish among PC, BPH, and patients without either of these pathologies. Our decision tree and logistic regression models outperform the AUC obtained in the compared studies. Using these models as decision support, the number of unnecessary biopsies might be significantly reduced. PMID:25780348

  8. NONLINEAR MODEL PREDICTIVE CONTROL OF CHEMICAL PROCESSES

    Directory of Open Access Journals (Sweden)

    R. G. SILVA

    1999-03-01

    A new algorithm for model predictive control is presented. The algorithm uses a simultaneous solution and optimization strategy to solve the model's differential equations. The equations are discretized by equidistant collocation and, along with the algebraic model equations, included as constraints in a nonlinear programming (NLP) problem. This algorithm is compared with the algorithm that uses orthogonal collocation on finite elements. The equidistant collocation algorithm results in simpler equations, providing a decrease in computation time for the control moves. Simulation results are presented and show satisfactory performance of this algorithm.

  9. Performance model to predict overall defect density

    Directory of Open Access Journals (Sweden)

    J Venkatesh

    2012-08-01

    Management by metrics is what IT service providers are expected to practice in order to remain differentiated. Given a project and its associated parameters and dynamics, the behaviour and outcome need to be predicted. There is a lot of focus on the end state and on minimizing defect leakage as much as possible. In most cases, the actions taken are reactive and come too late in the life cycle: root cause analysis and corrective actions can be implemented only to the benefit of the next project. The focus has to shift left, towards the execution phase, rather than waiting for lessons to be learnt after implementation. How do we proactively predict defect metrics and put a preventive action plan in place? This paper illustrates a process performance model that predicts overall defect density based on data from projects in an organization.

  10. Neuro-fuzzy modeling in bankruptcy prediction

    Directory of Open Access Journals (Sweden)

    Vlachos D.

    2003-01-01

    For the past 30 years the problem of bankruptcy prediction has been thoroughly studied, yet from Altman's 1968 paper to the papers of the 1990s, progress in prediction accuracy has not been satisfactory. This paper investigates an alternative modeling of the system (firm), combining neural networks and fuzzy controllers, i.e. using neuro-fuzzy models. Classical modeling is based on mathematical models that describe the behavior of the firm under consideration. The main idea of fuzzy control, on the other hand, is to build a model of a human control expert who is capable of controlling the process without thinking in terms of a mathematical model. This control expert specifies his control action in the form of linguistic rules. These control rules are translated into the framework of fuzzy set theory, providing a calculus which can simulate the behavior of the control expert and enhance its performance. The accuracy of the model is studied using datasets from previous research papers.
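
    The core of such a neuro-fuzzy system is the translation of linguistic rules into fuzzy set operations. A minimal sketch (the membership functions, rule consequents, and the liquidity variable are all hypothetical; in the neuro-fuzzy setting a network would tune these parameters from data):

```python
def tri(x, a, b, c):
    """Triangular fuzzy membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Two linguistic rules on a (standardized) liquidity ratio:
#   R1: IF liquidity is LOW  THEN bankruptcy risk is HIGH (0.9)
#   R2: IF liquidity is HIGH THEN bankruptcy risk is LOW  (0.1)
def bankruptcy_risk(liquidity):
    mu_low = tri(liquidity, -0.5, 0.0, 1.0)
    mu_high = tri(liquidity, 0.0, 1.0, 1.5)
    # Sugeno-style weighted average of the rule consequents
    return (mu_low * 0.9 + mu_high * 0.1) / (mu_low + mu_high)

print(bankruptcy_risk(0.25))   # low liquidity -> risk closer to HIGH
```
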

  11. Pressure prediction model for compression garment design.

    Science.gov (United States)

    Leung, W Y; Yuen, D W; Ng, Sun Pui; Shi, S Q

    2010-01-01

    Based on the application of Laplace's law to compression garments, an equation for predicting garment pressure is developed, incorporating the body circumference, the cross-sectional area of the fabric, the applied strain (as a function of reduction factor), and the corresponding Young's modulus. Design procedures are presented for predicting garment pressure from these parameters in clinical applications. Compression garments have been widely used in treating burn scars, and fabricating a compression garment with the required pressure is important in the healing process. A systematic and scientific design method can enable the occupational therapist and the compression-garment manufacturer to custom-make a garment with a specific pressure. The objectives of this study are 1) to develop a pressure prediction model incorporating different design factors to estimate the pressure exerted by a compression garment before fabrication; and 2) to propose design procedures for clinical applications. Three kinds of fabrics cut at different bias angles were tested under uniaxial tension, as were samples made in a double-layered structure. Sets of nonlinear force-extension data were obtained for calculating the predicted pressure. Using the value at 0° bias angle as reference, the Young's modulus can vary by as much as 29% for fabric type P11117, 43% for fabric type PN2170, and even 360% for fabric type AP85120 at a reduction factor of 20%. When comparing the predicted pressure calculated from the single-layered and double-layered fabrics, the double-layered construction provides a larger range of target pressure at a particular strain. The anisotropic and nonlinear behaviors of the fabrics have thus been determined. Compression garments can be methodically designed using the proposed analytical pressure prediction model.
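
    The underlying relation is Laplace's law for a cylinder: pressure equals fabric tension divided by the radius of curvature. A sketch (the conversion constant is standard; the tension value in the example is hypothetical):

```python
import math

def garment_pressure(tension_n_per_m, circumference_m):
    """Laplace's law for a cylindrical limb: P = T / r, with r = C / (2*pi).

    tension_n_per_m: fabric tension per unit width (N/m), roughly
    Young's modulus x strain x fabric thickness at the applied strain.
    Returns (pressure in Pa, pressure in mmHg).
    """
    r = circumference_m / (2 * math.pi)
    p_pa = tension_n_per_m / r
    return p_pa, p_pa / 133.322        # 1 mmHg = 133.322 Pa

pa, mmhg = garment_pressure(100.0, 0.30)   # 100 N/m on a 30 cm circumference limb
print(round(mmhg, 1))                      # about 15.7 mmHg
```

    The anisotropy reported above enters through the tension term: the same strain gives a different tension depending on the bias angle at which the fabric was cut.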

  12. Embryo quality predictive models based on cumulus cells gene expression

    Directory of Open Access Journals (Sweden)

    Devjak R

    2016-06-01

    Since the introduction of in vitro fertilization (IVF) into the clinical practice of infertility treatment, indicators of high-quality embryos have been investigated. Cumulus cells (CC) have a specific gene expression profile that reflects the developmental potential of the oocyte they surround, so specific gene expression could serve as a biomarker. The aim of our study was to combine more than one biomarker and see whether the predictive value for embryo development improves. In this study, 58 CC samples from 17 IVF patients were analyzed. The study was approved by the Republic of Slovenia National Medical Ethics Committee. Gene expression analysis [quantitative real-time polymerase chain reaction (qPCR)] was performed for five genes, analyzed according to embryo quality level. Two models were tested for embryo quality prediction: a binary logistic model and a decision tree model. As the main outcome, the expression levels of the five genes were taken and the area under the curve (AUC) for the two prediction models was calculated. Among the tested genes, AMHR2 and LIF showed a significant expression difference between high-quality and low-quality embryos. These two genes were used for the construction of the two prediction models: the binary logistic model yielded an AUC of 0.72 ± 0.08 and the decision tree model an AUC of 0.73 ± 0.03. The two prediction models thus yielded similar power to differentiate high- and low-quality embryos. In terms of eventual clinical decision making, the decision tree model results in easy-to-interpret rules that are highly applicable in clinical practice.

  13. Land-ice modeling for sea-level prediction

    Energy Technology Data Exchange (ETDEWEB)

    Lipscomb, William H [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2010-06-11

    There has been major progress in ice sheet modeling since IPCC AR4. We will soon have efficient higher-order ice sheet models that can run at ~1 km resolution for entire ice sheets, either standalone or coupled to GCMs. These models should significantly reduce uncertainties in sea-level predictions. However, the least certain and potentially greatest contributions to 21st century sea-level rise may come from ice-ocean interactions, especially in West Antarctica. This is a coupled modeling problem that requires collaboration among ice, ocean and atmosphere modelers.

  14. Multiresolution wavelet-ANN model for significant wave height forecasting.

    Digital Repository Service at National Institute of Oceanography (India)

    Deka, P.C.; Mandal, S.; Prahlada, R.

    Hybrid wavelet artificial neural network (WLNN) has been applied in the present study to forecast significant wave heights (Hs). Here Discrete Wavelet Transformation is used to preprocess the time series data (Hs) prior to Artificial Neural Network...
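
    The preprocessing step, Discrete Wavelet Transformation, splits the series into low-frequency (approximation) and high-frequency (detail) parts that are then fed to the network. A one-level Haar transform is the simplest instance (a sketch; the published study may use a different wavelet and more decomposition levels):

```python
import numpy as np

def haar_dwt(x):
    """One-level Haar DWT: approximation and detail coefficients.

    Input length must be even; the transform is orthonormal, so the
    signal energy is preserved across the two coefficient sets.
    """
    x = np.asarray(x, float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2)   # smooth trend
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # local fluctuation
    return a, d

hs = np.array([1.0, 1.2, 2.0, 1.8, 0.9, 1.1, 1.5, 1.5])  # mock wave heights (m)
a, d = haar_dwt(hs)
print(a, d)
```
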

  15. A kinetic model for predicting biodegradation.

    Science.gov (United States)

    Dimitrov, S; Pavlov, T; Nedelcheva, D; Reuschenbach, P; Silvani, M; Bias, R; Comber, M; Low, L; Lee, C; Parkerton, T; Mekenyan, O

    2007-01-01

    Biodegradation plays a key role in the environmental risk assessment of organic chemicals. The need to assess biodegradability of a chemical for regulatory purposes supports the development of a model for predicting the extent of biodegradation at different time frames, in particular the extent of ultimate biodegradation within a '10 day window' criterion as well as estimating biodegradation half-lives. Conceptually this implies expressing the rate of catabolic transformations as a function of time. An attempt to correlate the kinetics of biodegradation with molecular structure of chemicals is presented. A simplified biodegradation kinetic model was formulated by combining the probabilistic approach of the original formulation of the CATABOL model with the assumption of first order kinetics of catabolic transformations. Nonlinear regression analysis was used to fit the model parameters to OECD 301F biodegradation kinetic data for a set of 208 chemicals. The new model allows the prediction of biodegradation multi-pathways, primary and ultimate half-lives and simulation of related kinetic biodegradation parameters such as biological oxygen demand (BOD), carbon dioxide production, and the nature and amount of metabolites as a function of time. The model may also be used for evaluating the OECD ready biodegradability potential of a chemical within the '10-day window' criterion.
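
    Under the first-order assumption used in the model, the extent of ultimate biodegradation and the half-life follow directly from the rate constant. A sketch (the rate constant in the example is hypothetical, chosen so that exactly 60% is degraded within the 10-day window):

```python
import math

def fraction_degraded(k_per_day, t_days):
    """First-order kinetics: fraction transformed after t days, 1 - e^(-k*t)."""
    return 1.0 - math.exp(-k_per_day * t_days)

def half_life_days(k_per_day):
    """Time for half the parent compound to be transformed."""
    return math.log(2) / k_per_day

k = -math.log(0.4) / 10          # ~0.0916/day: 60% degraded in 10 days
print(round(fraction_degraded(k, 10), 2), round(half_life_days(k), 1))
```
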

  16. Disease Prediction Models and Operational Readiness

    Energy Technology Data Exchange (ETDEWEB)

    Corley, Courtney D.; Pullum, Laura L.; Hartley, David M.; Benedum, Corey M.; Noonan, Christine F.; Rabinowitz, Peter M.; Lancaster, Mary J.

    2014-03-19

    INTRODUCTION: The objective of this manuscript is to present a systematic review of biosurveillance models that operate on select agents and can forecast the occurrence of a disease event. One of the primary goals of this research was to characterize the viability of biosurveillance models to provide operationally relevant information for decision makers and to identify areas for future research. Two critical characteristics differentiate this work from other infectious disease modeling reviews. First, we reviewed models that attempted to predict the disease event, not merely its transmission dynamics. Second, we considered models involving pathogens of concern as determined by the US National Select Agent Registry (as of June 2011). METHODS: We searched dozens of commercial and government databases and harvested Google search results for eligible models, using terms and phrases provided by public health analysts relating to biosurveillance, remote sensing, risk assessments, spatial epidemiology, and ecological niche modeling. The publication dates of the search results are bounded by the dates of coverage of each database and the date on which the search was performed; all searching was completed by December 31, 2010. This returned 13,767 webpages and 12,152 citations. After de-duplication and removal of extraneous material, a core collection of 6,503 items was established, and these publications along with their abstracts are presented in a semantic wiki at http://BioCat.pnnl.gov. Next, PNNL's IN-SPIRE visual analytics software was used to cross-correlate these publications with the definition of a biosurveillance model, resulting in the selection of 54 documents that matched the criteria. Ten of these documents, however, dealt purely with disease spread models, inactivation of bacteria, or the modeling of human immune system responses to pathogens rather than predicting disease events. As a result, we systematically reviewed 44 papers and the

  17. Nonlinear model predictive control theory and algorithms

    CERN Document Server

    Grüne, Lars

    2017-01-01

    This book offers readers a thorough and rigorous introduction to nonlinear model predictive control (NMPC) for discrete-time and sampled-data systems. NMPC schemes with and without stabilizing terminal constraints are detailed, and intuitive examples illustrate the performance of different NMPC variants. NMPC is interpreted as an approximation of infinite-horizon optimal control so that important properties like closed-loop stability, inverse optimality and suboptimality can be derived in a uniform manner. These results are complemented by discussions of feasibility and robustness. An introduction to nonlinear optimal control algorithms yields essential insights into how the nonlinear optimization routine—the core of any nonlinear model predictive controller—works. Accompanying software in MATLAB® and C++ (downloadable from extras.springer.com/), together with an explanatory appendix in the book itself, enables readers to perform computer experiments exploring the possibilities and limitations of NMPC. T...
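
    The receding-horizon principle behind NMPC can be shown on a linear toy problem, where the finite-horizon optimal control reduces to least squares (a sketch with a hypothetical unstable scalar plant; real NMPC replaces the least-squares step with a nonlinear optimization solve, the "core" routine mentioned above):

```python
import numpy as np

a, b, N, r = 1.2, 1.0, 10, 0.1   # unstable plant x+ = a*x + b*u, horizon, input weight

def mpc_first_move(x0):
    """Minimize sum(x_k^2) + r*sum(u_k^2) over the horizon; return only u_0."""
    # Predicted states: x_k = F[k] * x0 + (G @ u)[k]
    F = np.array([a ** k for k in range(1, N + 1)])
    G = np.array([[a ** (k - j) * b if j <= k else 0.0
                   for j in range(N)] for k in range(N)])
    # Unconstrained quadratic cost -> stacked least-squares problem
    A = np.vstack([G, np.sqrt(r) * np.eye(N)])
    rhs = np.concatenate([-F * x0, np.zeros(N)])
    u = np.linalg.lstsq(A, rhs, rcond=None)[0]
    return u[0]                  # receding horizon: apply the first move only

x = 5.0
for _ in range(20):              # closed loop: measure, re-solve, apply
    x = a * x + b * mpc_first_move(x)
print(abs(x))                    # the state is driven essentially to the origin
```

    Re-solving at every step is what gives MPC feedback; the open-loop plan beyond the first move is discarded.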

  18. Predictive Modeling in Actinide Chemistry and Catalysis

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Ping [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-16

    These are slides from a presentation on predictive modeling in actinide chemistry and catalysis. The following topics are covered in these slides: Structures, bonding, and reactivity (bonding can be quantified by optical probes and theory, and electronic structures and reaction mechanisms of actinide complexes); Magnetic resonance properties (transition metal catalysts with multi-nuclear centers, and NMR/EPR parameters); Moving to more complex systems (surface chemistry of nanomaterials, and interactions of ligands with nanoparticles); Path forward and conclusions.

  19. Predictions of titanium alloy properties using thermodynamic modeling tools

    Science.gov (United States)

    Zhang, F.; Xie, F.-Y.; Chen, S.-L.; Chang, Y. A.; Furrer, D.; Venkatesh, V.

    2005-12-01

    Thermodynamic modeling tools have become essential in understanding the effect of alloy chemistry on the final microstructure of a material. Implementation of such tools to improve titanium processing via parameter optimization has resulted in significant cost savings through the elimination of shop/laboratory trials and tests. In this study, a thermodynamic modeling tool developed at CompuTherm, LLC, is being used to predict β transus, phase proportions, phase chemistries, partitioning coefficients, and phase boundaries of multicomponent titanium alloys. This modeling tool includes Pandat, software for multicomponent phase equilibrium calculations, and PanTitanium, a thermodynamic database for titanium alloys. Model predictions are compared with experimental results for one α-β alloy (Ti-64) and two near-β alloys (Ti-17 and Ti-10-2-3). The alloying elements, especially the interstitial elements O, N, H, and C, have been shown to have a significant effect on the β transus temperature, and are discussed in more detail herein.

  20. Probabilistic prediction models for aggregate quarry siting

    Science.gov (United States)

    Robinson, G.R.; Larkins, P.M.

    2007-01-01

    Weights-of-evidence (WofE) and logistic regression techniques were used in a GIS framework to predict the spatial likelihood (prospectivity) of crushed-stone aggregate quarry development. The joint conditional probability models, based on geology, transportation network, and population density variables, were defined using quarry location and time of development data for the New England States, North Carolina, and South Carolina, USA. The Quarry Operation models describe the distribution of active aggregate quarries, independent of the date of opening. The New Quarry models describe the distribution of aggregate quarries when they open. Because of the small number of new quarries developed in the study areas during the last decade, independent New Quarry models have low parameter estimate reliability. The performance of parameter estimates derived for Quarry Operation models, defined by a larger number of active quarries in the study areas, were tested and evaluated to predict the spatial likelihood of new quarry development. Population density conditions at the time of new quarry development were used to modify the population density variable in the Quarry Operation models to apply to new quarry development sites. The Quarry Operation parameters derived for the New England study area, Carolina study area, and the combined New England and Carolina study areas were all similar in magnitude and relative strength. The Quarry Operation model parameters, using the modified population density variables, were found to be a good predictor of new quarry locations. Both the aggregate industry and the land management community can use the model approach to target areas for more detailed site evaluation for quarry location. The models can be revised easily to reflect actual or anticipated changes in transportation and population features. © International Association for Mathematical Geology 2007.
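
    The WofE part of the approach reduces to contrasting the log odds of a binary evidence layer inside and outside the known sites. A sketch with hypothetical counts (30 of 40 quarries falling inside an evidence pattern that covers 200 of 1,000 area cells):

```python
import math

def wofe_weights(sites_in_pattern, sites_total, pattern_cells, total_cells):
    """Weights of evidence for a binary evidence pattern B and site set D.

    W+ = ln[P(B|D) / P(B|not D)], W- = ln[P(not B|D) / P(not B|not D)];
    the contrast C = W+ - W- measures the strength of spatial association.
    """
    p_b_d = sites_in_pattern / sites_total
    p_b_nd = (pattern_cells - sites_in_pattern) / (total_cells - sites_total)
    w_plus = math.log(p_b_d / p_b_nd)
    w_minus = math.log((1 - p_b_d) / (1 - p_b_nd))
    return w_plus, w_minus, w_plus - w_minus

wp, wm, contrast = wofe_weights(30, 40, 200, 1000)
print(round(wp, 2), round(wm, 2), round(contrast, 2))
```

    Summing the weights of several (conditionally independent) evidence layers gives the posterior log odds map from which prospectivity is read off.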

  1. Predicting Footbridge Response using Stochastic Load Models

    DEFF Research Database (Denmark)

    Pedersen, Lars; Frier, Christian

    2013-01-01

    Walking parameters such as step frequency, pedestrian mass, dynamic load factor, etc. are basically stochastic, although it is quite common to adapt deterministic models for these parameters. The present paper considers a stochastic approach to modeling the action of pedestrians, but when doing s...... as it pinpoints which decisions to be concerned about when the goal is to predict footbridge response. The studies involve estimating footbridge responses using Monte-Carlo simulations and focus is on estimating vertical structural response to single person loading....

  2. Nonconvex Model Predictive Control for Commercial Refrigeration

    DEFF Research Database (Denmark)

    Hovgaard, Tobias Gybel; Larsen, Lars F.S.; Jørgensen, John Bagterp

    2013-01-01

    is to minimize the total energy cost, using real-time electricity prices, while obeying temperature constraints on the zones. We propose a variation on model predictive control to achieve this goal. When the right variables are used, the dynamics of the system are linear, and the constraints are convex. The cost...... the iterations, which is more than fast enough to run in real-time. We demonstrate our method on a realistic model, with a full year simulation and 15 minute time periods, using historical electricity prices and weather data, as well as random variations in thermal load. These simulations show substantial cost...

  3. Scalar Dark Matter Models with Significant Internal Bremsstrahlung

    CERN Document Server

    Giacchino, Federica; Tytgat, Michel H G

    2013-01-01

    There has been interest recently in particle physics models that may give rise to sharp gamma-ray spectral features from dark matter annihilation. Because dark matter is supposed to be electrically neutral, it is challenging to build weakly interacting massive particle models that can accommodate both a large cross section into gamma rays at, say, the Galactic center, and the right dark matter abundance. In this work, we consider the gamma-ray signatures of a class of scalar dark matter models that interact with the Standard Model dominantly through heavy vector-like fermions (the vector-like portal). We focus on a real scalar singlet S annihilating into lepton-antilepton pairs. Because this two-body final-state annihilation channel is d-wave suppressed in the chiral limit, we show that virtual internal bremsstrahlung emission of a gamma ray gives a large correction, both today and at the time of freeze-out. For the sake of comparison, we confront this scenario with the familiar case of a Majorana singlet annihilat...

  4. Predictive In Vivo Models for Oncology.

    Science.gov (United States)

    Behrens, Diana; Rolff, Jana; Hoffmann, Jens

    2016-01-01

    Experimental oncology research and preclinical drug development both require specific, clinically relevant in vitro and in vivo tumor models. The increasing knowledge about the heterogeneity of cancer has required a substantial restructuring of the test systems for the different stages of development. To cope with the complexity of the disease, larger panels of patient-derived tumor models have to be implemented and extensively characterized. Together with individual genetically engineered tumor models, and supported by core functions for expression profiling and data analysis, an integrated discovery process has been generated for predictive and personalized drug development. Improved “humanized” mouse models should help to overcome current limitations imposed by the xenogeneic barrier between humans and mice. Establishment of a functional human immune system and a corresponding human microenvironment in laboratory animals will strongly support further research. Drug discovery, systems biology, and translational research are moving closer together to address all the new hallmarks of cancer, increase the success rate of drug development, and increase the predictive value of preclinical models.

  5. Constructing predictive models of human running.

    Science.gov (United States)

    Maus, Horst-Moritz; Revzen, Shai; Guckenheimer, John; Ludwig, Christian; Reger, Johann; Seyfarth, Andre

    2015-02-06

    Running is an essential mode of human locomotion, during which ballistic aerial phases alternate with phases when a single foot contacts the ground. The spring-loaded inverted pendulum (SLIP) provides a starting point for modelling running, and generates ground reaction forces that resemble those of the centre of mass (CoM) of a human runner. Here, we show that while SLIP reproduces within-step kinematics of the CoM in three dimensions, it fails to reproduce stability and predict future motions. We construct SLIP control models using data-driven Floquet analysis, and show how these models may be used to obtain predictive models of human running with six additional states comprising the position and velocity of the swing-leg ankle. Our methods are general, and may be applied to any rhythmic physical system. We provide an approach for identifying an event-driven linear controller that approximates an observed stabilization strategy, and for producing a reduced-state model which closely recovers the observed dynamics. © 2014 The Author(s) Published by the Royal Society. All rights reserved.

  6. Statistical Seasonal Sea Surface based Prediction Model

    Science.gov (United States)

    Suarez, Roberto; Rodriguez-Fonseca, Belen; Diouf, Ibrahima

    2014-05-01

    The interannual variability of sea surface temperature (SST) plays a key role in the strongly seasonal rainfall regime of the West African region. The predictability of the seasonal cycle of rainfall is widely discussed by the scientific community, with results that fail to be satisfactory due to the difficulty dynamical models have in reproducing the behavior of the Inter-Tropical Convergence Zone (ITCZ). To tackle this problem, a statistical model based on oceanic predictors has been developed at the Universidad Complutense de Madrid (UCM) with the aim of complementing and enhancing the predictability of the West African Monsoon (WAM) as an alternative to the coupled models. The model, called S4CAST (SST-based Statistical Seasonal Forecast), is based on discriminant analysis techniques, specifically Maximum Covariance Analysis (MCA) and Canonical Correlation Analysis (CCA). Beyond the application of the model to the prediction of rainfall in West Africa, its use extends to a range of oceanic, atmospheric and health-related parameters influenced by sea surface temperature as a defining factor of variability.
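
    The MCA at the heart of such a model is a singular value decomposition of the covariance between two anomaly fields. A sketch on synthetic fields sharing one mode of variability (all data hypothetical):

```python
import numpy as np

rng = np.random.default_rng(2)
t, n_sst, n_rain = 100, 20, 15
shared = rng.normal(size=t)                      # common mode of variability
sst = np.outer(shared, rng.normal(size=n_sst)) + rng.normal(size=(t, n_sst))
rain = np.outer(shared, rng.normal(size=n_rain)) + rng.normal(size=(t, n_rain))
sst -= sst.mean(0)                               # work with anomalies
rain -= rain.mean(0)

C = sst.T @ rain / (t - 1)                       # cross-covariance matrix
U, s, Vt = np.linalg.svd(C, full_matrices=False) # MCA = SVD of C

# Expansion coefficients (time series) of the leading coupled mode
ec_sst, ec_rain = sst @ U[:, 0], rain @ Vt[0]
r = np.corrcoef(ec_sst, ec_rain)[0, 1]
print(round(abs(r), 2))                          # the shared mode is recovered
```

    In forecasting mode, the predictor field (SST) leads the predictand field (rainfall) in time, so the regression of the leading expansion coefficients becomes a prediction equation.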

  7. Mantis: Predicting System Performance through Program Analysis and Modeling

    CERN Document Server

    Chun, Byung-Gon; Lee, Sangmin; Maniatis, Petros; Naik, Mayur

    2010-01-01

    We present Mantis, a new framework that automatically predicts program performance with high accuracy. Mantis integrates techniques from programming languages and machine learning for performance modeling, and is a radical departure from traditional approaches. Mantis extracts program features, which capture information about program execution runs, through program instrumentation. It uses machine learning techniques to select the features relevant to performance and creates prediction models as a function of the selected features. Through program analysis, it then generates compact code slices that compute these feature values for prediction. Our evaluation shows that Mantis can achieve more than 93% accuracy with a training set of less than 10% of the data, which is a significant improvement over models that are oblivious to program features. The generated code slices make computing the feature values cheap.

  8. Significant increase of Echinococcus multilocularis prevalencein foxes, but no increased predicted risk for humans

    NARCIS (Netherlands)

    Maas, M.; Dam-Deisz, W.D.C.; Roon, van A.M.; Takumi, K.; Giessen, van der J.W.B.

    2014-01-01

    The emergence of the zoonotic tapeworm Echinococcus multilocularis, causative agent of alveolar echinococcosis (AE), poses a public health risk. A previously designed risk map model predicted a spread of E. multilocularis and increasing numbers of alveolar echinococcosis patients in the province of L

  9. Hierarchical Neural Regression Models for Customer Churn Prediction

    Directory of Open Access Journals (Sweden)

    Golshan Mohammadi

    2013-01-01

    As customers are the main assets of each industry, customer churn prediction is becoming a major task for companies to remain competitive. The literature reports the better applicability and efficiency of hierarchical data mining techniques. This paper considers three hierarchical models built by combining four different data mining techniques for churn prediction: backpropagation artificial neural networks (ANN), self-organizing maps (SOM), alpha-cut fuzzy c-means (α-FCM), and the Cox proportional hazards regression model. The hierarchical models are ANN + ANN + Cox, SOM + ANN + Cox, and α-FCM + ANN + Cox. In particular, the first component of each model clusters the data into churner and nonchurner groups and filters out unrepresentative data or outliers. The clustered data are then used by the second technique to assign customers to churner and nonchurner groups. Finally, the correctly classified data are used to create the Cox proportional hazards model. To evaluate the performance of the hierarchical models, an Iranian mobile dataset is considered. The experimental results show that the hierarchical models outperform the single Cox regression baseline model in terms of prediction accuracy, Type I and II errors, RMSE, and MAD metrics. In addition, the α-FCM + ANN + Cox model performs significantly better than the two other hierarchical models.

  10. Predictive modeling by the cerebellum improves proprioception.

    Science.gov (United States)

    Bhanpuri, Nasir H; Okamura, Allison M; Bastian, Amy J

    2013-09-04

    Because sensation is delayed, real-time movement control requires not just sensing, but also predicting limb position, a function hypothesized for the cerebellum. Such cerebellar predictions could contribute to perception of limb position (i.e., proprioception), particularly when a person actively moves the limb. Here we show that human cerebellar patients have proprioceptive deficits compared with controls during active movement, but not when the arm is moved passively. Furthermore, when healthy subjects move in a force field with unpredictable dynamics, they have active proprioceptive deficits similar to cerebellar patients. Therefore, muscle activity alone is likely insufficient to enhance proprioception and predictability (i.e., an internal model of the body and environment) is important for active movement to benefit proprioception. We conclude that cerebellar patients have an active proprioceptive deficit consistent with disrupted movement prediction rather than an inability to generally enhance peripheral proprioceptive signals during action and suggest that active proprioceptive deficits should be considered a fundamental cerebellar impairment of clinical importance.

  11. Community monitoring for youth violence surveillance: testing a prediction model.

    Science.gov (United States)

    Henry, David B; Dymnicki, Allison; Kane, Candice; Quintana, Elena; Cartland, Jenifer; Bromann, Kimberly; Bhatia, Shaun; Wisnieski, Elise

    2014-08-01

    Predictive epidemiology is an embryonic field that involves developing informative signatures for disorder and tracking them using surveillance methods. Such efforts can assist the planning and implementation of preventive interventions. Believing that certain minor crimes indicative of gang activity are informative signatures for the emergence of serious youth violence in communities, in this study we aim to predict outbreaks of violence in neighborhoods from pre-existing levels of, and changes in, reports of minor offenses. We develop a prediction equation that uses publicly available neighborhood-level data on disorderly conduct, vandalism, and weapons violations to predict neighborhoods likely to have increases in serious violent crime. Data for this study were taken from the Chicago Police Department ClearMap reporting system, which provided data on index and non-index crimes for each of the 844 Chicago census tracts. Data were available in three-month segments for a single year (fall 2009, winter, spring, and summer 2010). Predicted change in aggravated battery and overall violent crime correlated significantly with actual change. The model was evaluated by comparing alternative models using randomly selected training and test samples, producing favorable results with reference to overfitting, seasonal variation, and spatial autocorrelation. A prediction equation based on winter and spring levels of the predictors had area under the curve ranging from .65 to .71 for aggravated battery, and .58 to .69 for overall violent crime. We discuss future development of such a model and its potential usefulness in violence prevention and community policing.

  12. A prediction model for Clostridium difficile recurrence

    Directory of Open Access Journals (Sweden)

    Francis D. LaBarbera

    2015-02-01

    Background: Clostridium difficile infection (CDI) is a growing problem in the community and hospital setting. Its incidence has been on the rise over the past two decades, and it is quickly becoming a major concern for the health care system. A high rate of recurrence is one of the major hurdles in the successful treatment of C. difficile infection. Few studies have looked at patterns of recurrence; those available show a number of risk factors associated with C. difficile recurrence (CDR), but there is little consensus on the impact of most of the identified risk factors. Methods: Our study was a retrospective chart review of 198 patients diagnosed with CDI via polymerase chain reaction (PCR) from February 2009 to June 2013. We used a machine learning algorithm called the Random Forest (RF) to analyze all of the factors proposed to be associated with CDR. This model is capable of making predictions based on a large number of variables, and has outperformed numerous other models and statistical methods. Results: We arrived at a model that was able to predict CDR with a sensitivity of 83.3%, specificity of 63.1%, and area under the curve of 82.6%. These results are comparable to those of other studies that have used the RF model. Conclusions: We hope that in the future, machine learning algorithms such as the RF will see wider application.
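
    The reported figures (sensitivity, specificity, area under the curve) can be computed from any model's out-of-sample scores; the Mann-Whitney form of the AUC avoids tracing the full ROC curve. A sketch with a tiny hypothetical label/score set:

```python
import numpy as np

def sens_spec_auc(y_true, scores, threshold=0.5):
    """Sensitivity and specificity at a cutoff, plus the Mann-Whitney AUC."""
    y_true = np.asarray(y_true, bool)
    scores = np.asarray(scores, float)
    pred = scores >= threshold
    sens = (pred & y_true).sum() / y_true.sum()       # true positive rate
    spec = (~pred & ~y_true).sum() / (~y_true).sum()  # true negative rate
    # AUC = P(random positive scored above random negative), ties count half
    diff = scores[y_true][:, None] - scores[~y_true][None, :]
    auc = (diff > 0).mean() + 0.5 * (diff == 0).mean()
    return sens, spec, auc

sens, spec, auc = sens_spec_auc([1, 1, 1, 0, 0], [0.9, 0.8, 0.4, 0.6, 0.2])
print(sens, spec, round(auc, 3))
```
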

  13. Gamma-Ray Pulsars Models and Predictions

    CERN Document Server

    Harding, A K

    2001-01-01

    Pulsed emission from gamma-ray pulsars originates inside the magnetosphere, from radiation by charged particles accelerated near the magnetic poles or in the outer gaps. In polar cap models, the high energy spectrum is cut off by magnetic pair production above an energy that is dependent on the local magnetic field strength. While most young pulsars with surface fields in the range B = 10^{12} - 10^{13} G are expected to have high energy cutoffs around several GeV, the gamma-ray spectra of old pulsars having lower surface fields may extend to 50 GeV. Although the gamma-ray emission of older pulsars is weaker, detecting pulsed emission at high energies from nearby sources would be an important confirmation of polar cap models. Outer gap models predict more gradual high-energy turnovers at around 10 GeV, but also predict an inverse Compton component extending to TeV energies. Detection of pulsed TeV emission, which would not survive attenuation at the polar caps, is thus an important test of outer gap models. N...

  14. Ground Motion Prediction Models for Caucasus Region

    Science.gov (United States)

    Jorjiashvili, Nato; Godoladze, Tea; Tvaradze, Nino; Tumanova, Nino

    2016-04-01

    Ground motion prediction models (GMPMs) relate ground motion intensity measures to variables describing earthquake source, path, and site effects. Estimation of expected ground motion is a fundamental component of earthquake hazard assessment. The most commonly used parameters for attenuation relations are peak ground acceleration and spectral acceleration, because these parameters provide useful information for seismic hazard assessment. Development of the Georgian Digital Seismic Network began in 2003. In this study, new GMP models are obtained based on new data from the Georgian seismic network and also from neighboring countries. The models are estimated in the classical statistical way, by regression analysis. Site ground conditions are additionally considered, because the same earthquake recorded at the same distance may cause different damage depending on ground conditions. Empirical ground-motion prediction models (GMPMs) require adjustment to make them appropriate for site-specific scenarios; however, the process of making such adjustments remains a challenge. This work presents a holistic framework for the development of a peak ground acceleration (PGA) or spectral acceleration (SA) GMPE that is easily adjustable to different seismological conditions and does not suffer from the practical problems associated with adjustments in the response spectral domain.
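Classical regression for a GMPM typically fits the coefficients of a functional form such as ln PGA = c₀ + c₁M + c₂ ln R by least squares. A minimal sketch using synthetic records (the functional form and "true" coefficients here are illustrative assumptions, not the Georgian-network model), solving the normal equations directly:

```python
import math

def solve3(A, b):
    """Solve a 3x3 linear system by Gaussian elimination with partial pivoting."""
    n = 3
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in reversed(range(n)):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

# Synthetic records: magnitude M and hypocentral distance R (km), with
# ln PGA generated from assumed "true" coefficients (2.0, 1.1, -1.4).
records = [(4.5, 10), (5.0, 20), (5.5, 15), (6.0, 40), (6.5, 60), (7.0, 80)]
X = [[1.0, m, math.log(r)] for m, r in records]
y = [2.0 + 1.1 * m - 1.4 * math.log(r) for m, r in records]

# Normal equations: (X^T X) c = X^T y
XtX = [[sum(row[i] * row[j] for row in X) for j in range(3)] for i in range(3)]
Xty = [sum(X[k][i] * y[k] for k in range(len(X))) for i in range(3)]
c0, c1, c2 = solve3(XtX, Xty)
print(round(c0, 3), round(c1, 3), round(c2, 3))  # recovers ≈ 2.0, 1.1, -1.4
```

With noiseless synthetic data the regression recovers the generating coefficients exactly; with real records the residual scatter becomes the model's aleatory uncertainty.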

  15. Modeling and Prediction of Krueger Device Noise

    Science.gov (United States)

    Guo, Yueping; Burley, Casey L.; Thomas, Russell H.

    2016-01-01

    This paper presents the development of a noise prediction model for aircraft Krueger flap devices that are considered as alternatives to leading edge slotted slats. The prediction model decomposes the total Krueger noise into four components, generated by the unsteady flows, respectively, in the cove under the pressure side surface of the Krueger, in the gap between the Krueger trailing edge and the main wing, around the brackets supporting the Krueger device, and around the cavity on the lower side of the main wing. For each noise component, the modeling follows a physics-based approach that aims at capturing the dominant noise-generating features in the flow and developing correlations between the noise and the flow parameters that control the noise generation processes. The far field noise is modeled using each of the four noise component's respective spectral functions, far field directivities, Mach number dependencies, component amplitudes, and other parametric trends. Preliminary validations are carried out by using small scale experimental data, and two applications are discussed; one for conventional aircraft and the other for advanced configurations. The former focuses on the parametric trends of Krueger noise on design parameters, while the latter reveals its importance in relation to other airframe noise components.

  16. A generative model for predicting terrorist incidents

    Science.gov (United States)

    Verma, Dinesh C.; Verma, Archit; Felmlee, Diane; Pearson, Gavin; Whitaker, Roger

    2017-05-01

    A major concern in coalition peace-support operations is the incidence of terrorist activity. In this paper, we propose a generative model for the occurrence of terrorist incidents, and illustrate that an increase in diversity, as measured by the number of different social groups to which an individual belongs, is inversely correlated with the likelihood of a terrorist incident in the society. A generative model is one that can predict the likelihood of events in new contexts, as opposed to statistical models, which predict future incidents based on the history of incidents in an existing context. Generative models can be useful in planning for persistent Information Surveillance and Reconnaissance (ISR) since they allow an estimation of regions in the theater of operation where terrorist incidents may arise, and thus can be used to better allocate the assignment and deployment of ISR assets. In this paper, we present a taxonomy of terrorist incidents, identify factors related to the occurrence of terrorist incidents, and provide a mathematical analysis calculating the likelihood of occurrence of terrorist incidents in three common real-life scenarios arising in peace-keeping operations.

  17. The significance of genetics in pathophysiologic models of premature birth.

    Science.gov (United States)

    Uberos, Jose

    2017-05-31

    Prematurity is a major health problem in all countries; its higher incidence in certain ethnic groups and its increased recurrence imply the influence of genetic factors. Published genetic polymorphisms are identified in relation to the 4 pathophysiological models of prematurity described: chorioamniotic-decidual inflammation, the premature contraction pathway, decidual haemorrhage, and susceptibility to environmental toxins. A total of 240 articles were identified; 52 articles were excluded because they were not original, not written in English, or duplicated, and 125 articles were included in the qualitative analysis. This review aims to update recent knowledge about genes associated with premature birth.

  18. Optimal feedback scheduling of model predictive controllers

    Institute of Scientific and Technical Information of China (English)

    Pingfang ZHOU; Jianying XIE; Xiaolong DENG

    2006-01-01

    Model predictive control (MPC) could not previously be reliably applied to real-time control systems because its computation time is not well defined. Implemented as an anytime algorithm, an MPC task allows computation time to be traded for control performance, thus obtaining predictability in time. Optimal feedback scheduling (FS-CBS) of a set of MPC tasks is presented to maximize the global control performance subject to limited processor time. Each MPC task is assigned a constant bandwidth server (CBS), whose reserved processor time is adjusted dynamically. The constraints in the FS-CBS guarantee schedulability of the total task set and stability of each component. The FS-CBS is shown to be robust against variation in the execution time of MPC tasks at runtime. Simulation results illustrate its effectiveness.

  19. Objective calibration of numerical weather prediction models

    Science.gov (United States)

    Voudouri, A.; Khain, P.; Carmona, I.; Bellprat, O.; Grazzini, F.; Avgoustoglou, E.; Bettems, J. M.; Kaufmann, P.

    2017-07-01

    Numerical weather prediction (NWP) and climate models use parameterization schemes for physical processes, which often include free or poorly confined parameters. Model developers normally calibrate the values of these parameters subjectively to improve the agreement of forecasts with available observations, a procedure referred to as expert tuning. A practicable objective multivariate calibration method built on a quadratic meta-model (MM), which has been applied to a regional climate model (RCM), has been shown to be at least as good as expert tuning. Based on these results, an approach to applying the methodology to an NWP model is presented in this study. Challenges in transferring the methodology from RCM to NWP are not restricted to the use of higher resolution and different time scales: the sensitivity of NWP model quality with respect to the model parameter space has to be clarified, and the overall procedure has to be optimized in terms of the computing resources required for the calibration of an NWP model. Three free model parameters, affecting mainly the turbulence parameterization schemes, were originally selected with respect to their influence on variables associated with daily forecasts, such as daily minimum and maximum 2 m temperature as well as 24 h accumulated precipitation. Preliminary results indicate that the calibration is both affordable in terms of computer resources and meaningful in terms of improved forecast quality. In addition, the proposed methodology has the advantage of being a replicable procedure that can be applied when an updated model version is launched and/or to customize the same model implementation over different climatological areas.

  20. Prediction models from CAD models of 3D objects

    Science.gov (United States)

    Camps, Octavia I.

    1992-11-01

    In this paper we present a probabilistic prediction based approach for CAD-based object recognition. Given a CAD model of an object, the PREMIO system combines techniques of analytic graphics and physical models of lights and sensors to predict how features of the object will appear in images. In nearly 4,000 experiments on analytically-generated and real images, we show that in a semi-controlled environment, predicting the detectability of features of the image can successfully guide a search procedure to make informed choices of model and image features in its search for correspondences that can be used to hypothesize the pose of the object. Furthermore, we provide a rigorous experimental protocol that can be used to determine the optimal number of correspondences to seek so that the probability of failing to find a pose and of finding an inaccurate pose are minimized.

  1. Comparing predictions made by a prediction model, clinical score, and physicians: pediatric asthma exacerbations in the emergency department.

    Science.gov (United States)

    Farion, K J; Wilk, S; Michalowski, W; O'Sullivan, D; Sayyad-Shirabad, J

    2013-01-01

    Asthma exacerbations are one of the most common medical reasons for children to be brought to the hospital emergency department (ED). Various prediction models have been proposed to support diagnosis of exacerbations and evaluation of their severity. First, to evaluate prediction models constructed from data using machine learning techniques and to select the best performing model. Second, to compare predictions from the selected model with predictions from the Pediatric Respiratory Assessment Measure (PRAM) score and predictions made by ED physicians. A two-phase study conducted in the ED of an academic pediatric hospital. In phase 1, data collected prospectively using paper forms were used to construct and evaluate five prediction models, and the best performing model was selected. In phase 2, data collected prospectively using a mobile system were used to compare the predictions of the selected prediction model with those from PRAM and ED physicians. Area under the receiver operating characteristic curve and accuracy in phase 1; accuracy, sensitivity, specificity, positive and negative predictive values in phase 2. In phase 1, prediction models were derived from a data set of 240 patients and evaluated using 10-fold cross-validation. A naive Bayes (NB) model demonstrated the best performance and was selected for phase 2. Evaluation in phase 2 was conducted on data from 82 patients. Predictions made by the NB model were less accurate than the PRAM score and the physicians (accuracy of 70.7%, 73.2% and 78.0%, respectively); however, according to McNemar's test it is not possible to conclude that the differences between predictions are statistically significant. Both the PRAM score and the NB model were less accurate than physicians. The NB model can handle incomplete patient data and as such may complement the PRAM score. However, further research is required to improve its accuracy.
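McNemar's test, used above to compare paired predictions on the same patients, depends only on the discordant pairs: cases one predictor got right and the other got wrong. A sketch of the continuity-corrected statistic (the discordant counts below are illustrative, not the study's):

```python
def mcnemar_statistic(b, c):
    """Continuity-corrected McNemar chi-square statistic.
    b = cases only predictor A classified correctly,
    c = cases only predictor B classified correctly."""
    if b + c == 0:
        return 0.0
    return (abs(b - c) - 1) ** 2 / (b + c)

# Illustrative discordant counts for, e.g., NB model vs. physicians:
stat = mcnemar_statistic(4, 10)
print(stat)  # (|4-10| - 1)^2 / 14 = 25/14 ≈ 1.79, below the 3.84 cutoff
# (chi-square, 1 df, alpha = 0.05), so the difference is not significant.
```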

  2. An Anisotropic Hardening Model for Springback Prediction

    Science.gov (United States)

    Zeng, Danielle; Xia, Z. Cedric

    2005-08-01

    As more Advanced High-Strength Steels (AHSS) are heavily used for automotive body structures and closure panels, accurate springback prediction for these components becomes more challenging because of their rapid hardening characteristics and ability to sustain even higher stresses. In this paper, a modified Mroz hardening model is proposed to capture the realistic Bauschinger effect at reverse loading, such as when material passes through die radii or a drawbead during the sheet metal forming process. This model accounts for the material's anisotropic yield surface and nonlinear isotropic/kinematic hardening behavior. Material tension/compression test data are used to accurately represent the Bauschinger effect. The effectiveness of the model is demonstrated by comparison of numerical and experimental springback results for a DP600 straight U-channel test.

  3. Models and methods for wind effect prediction; Modeller og metoder til prediktion af vindeffekt

    Energy Technology Data Exchange (ETDEWEB)

    Joensen, A.

    1997-12-31

    In this report, methods and models for predicting the power produced by windmills are considered. Several methods are suggested and investigated on actual observations of wind speed and the corresponding power. In order to improve the predictions, meteorological forecasts are used in the formulation of the models. The methods applied cover non-parametric identification, least squares estimation and local regression. It was found that the meteorological forecasts significantly improved the predictions, and that a combination of non-parametric and parametric modelling proved to be successful. (au) 38 refs.

  4. Incorporating significant amino acid pairs and protein domains to predict RNA splicing-related proteins with functional roles

    Science.gov (United States)

    Hsu, Justin Bo-Kai; Huang, Kai-Yao; Weng, Tzu-Ya; Huang, Chien-Hsun; Lee, Tzong-Yi

    2014-01-01

    The machinery of pre-mRNA splicing is carried out through the interaction of RNA sequence elements and a variety of RNA splicing-related proteins (SRPs) (e.g. the spliceosome and splicing factors). Alternative splicing, which is an important post-transcriptional regulation in eukaryotes, gives rise to multiple mature mRNA isoforms, which encode proteins with functional diversity. However, the regulation of RNA splicing is not yet fully elucidated, partly because SRPs have not yet been exhaustively identified and their experimental identification is labor-intensive. Therefore, we are motivated to design a new method for identifying SRPs together with their functional roles in the regulation of RNA splicing. Experimentally verified SRPs were manually curated from research articles. According to the functional annotation of the Splicing Related Gene Database, the collected SRPs were further categorized into four functional groups: small nuclear Ribonucleoprotein, Splicing Factor, Splicing Regulation Factor and Novel Spliceosome Protein. The composition of amino acid pairs indicates that there are remarkable differences among the four functional groups of SRPs. Support vector machines (SVMs) were then utilized to learn predictive models for identifying SRPs as well as their functional roles. The cross-validation evaluation shows that SVM models trained with significant amino acid pairs and functional domains provide better predictive performance. In addition, independent testing demonstrates that the proposed method can accurately identify SRPs in mammals/plants as well as effectively distinguish between SRPs and RNA-binding proteins. This investigation provides a practical means of identifying potential SRPs and a perspective for exploring the regulation of RNA splicing.
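The amino-acid-pair (dipeptide) composition used as an SVM input feature can be computed by counting overlapping residue pairs and normalising by the number of pairs. A minimal sketch (the toy sequence is illustrative, not the curated SRP data):

```python
from collections import Counter

def aa_pair_composition(seq):
    """Frequency of each overlapping amino acid pair in a sequence."""
    pairs = [seq[i:i + 2] for i in range(len(seq) - 1)]
    counts = Counter(pairs)
    total = len(pairs)
    return {pair: n / total for pair, n in counts.items()}

comp = aa_pair_composition("MKVACACK")
print(comp["AC"])  # pairs: MK, KV, VA, AC, CA, AC, CK -> "AC" occurs 2/7
```

In practice each sequence would be mapped to a fixed-length vector over all 400 possible pairs (absent pairs set to zero) before being fed to the SVM.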

  6. Faculty Decisions on Serials Subscriptions Differ Significantly from Decisions Predicted by a Bibliometric Tool.

    Directory of Open Access Journals (Sweden)

    Sue F. Phelps

    2016-03-01

    of the faculty choices. The p-value for this relationship was less than 0.0001, also indicating that the result was not by chance. A quadratic model plotted alongside the previous linear model follows a similar pattern. The p-value of the comparison is 0.0002, which indicates the quadratic model’s fit cannot be explained by random chance. Main Results – The authors point out three notable findings. First, the match rate between faculty valuations and bibliometric scores for serials is 65%. This exceeds the 50% rate that would indicate random association, but also indicates a statistically significant difference between faculty and bibliometric valuations. Secondly, the match rate with the bibliometric scores for titles that faculty chose to keep (73%) was higher than for those they chose to cancel (54%). Thirdly, the match rate increased with higher bibliometric scores. Conclusions – Though the authors identify only a modest degree of similarity between faculty and bibliometric valuations of serials, it is noted that there is more agreement on the higher valued serials than the lower valued serials. With that in mind, librarians might focus faculty review on the lower scoring titles in the future, taking into consideration that unique faculty interests may drive selection at that level and would need to be balanced with the mission of the library.

  7. Predicting functional brain ROIs via fiber shape models.

    Science.gov (United States)

    Zhang, Tuo; Guo, Lei; Li, Kaiming; Zhu, Dajing; Cui, Guangbin; Liu, Tianming

    2011-01-01

    The study of the structural and functional connectivity of the human brain has received significant interest and effort recently. A fundamental question arises when attempting to measure the structural and/or functional connectivity of specific brain networks: how best to identify possible Regions of Interest (ROIs)? In this paper, we present a novel ROI prediction framework that localizes ROIs in individual brains based on fiber shape models learned from multimodal task-based fMRI and diffusion tensor imaging (DTI) data. In the training stage, ROIs are identified as activation peaks in task-based fMRI data. Then, shape models of the white matter fibers emanating from these functional ROIs are learned. In addition, a model of the ROIs' location distribution is learned to serve as an anatomical constraint. In the prediction stage, functional ROIs are predicted in individual brains based on DTI data. The ROI prediction is formulated and solved as an energy minimization problem, in which the two learned models are used as energy terms. Our experimental results show that the average ROI prediction error is 3.45 mm, in comparison with the benchmark data provided by working-memory task-based fMRI. Promising results were also obtained on the ADNI-2 longitudinal DTI dataset.

  8. Predictive modelling of ferroelectric tunnel junctions

    Science.gov (United States)

    Velev, Julian P.; Burton, John D.; Zhuravlev, Mikhail Ye; Tsymbal, Evgeny Y.

    2016-05-01

    Ferroelectric tunnel junctions combine the phenomena of quantum-mechanical tunnelling and switchable spontaneous polarisation of a nanometre-thick ferroelectric film into novel device functionality. Switching the ferroelectric barrier polarisation direction produces a sizable change in resistance of the junction—a phenomenon known as the tunnelling electroresistance effect. From a fundamental perspective, ferroelectric tunnel junctions and their version with ferromagnetic electrodes, i.e., multiferroic tunnel junctions, are testbeds for studying the underlying mechanisms of tunnelling electroresistance as well as the interplay between electric and magnetic degrees of freedom and their effect on transport. From a practical perspective, ferroelectric tunnel junctions hold promise for disruptive device applications. In a very short time, they have traversed the path from basic model predictions to prototypes for novel non-volatile ferroelectric random access memories with non-destructive readout. This remarkable progress is to a large extent driven by a productive cycle of predictive modelling and innovative experimental effort. In this review article, we outline the development of the ferroelectric tunnel junction concept and the role of theoretical modelling in guiding experimental work. We discuss a wide range of physical phenomena that control the functional properties of ferroelectric tunnel junctions and summarise the state-of-the-art achievements in the field.

  9. Simple predictions from multifield inflationary models.

    Science.gov (United States)

    Easther, Richard; Frazer, Jonathan; Peiris, Hiranya V; Price, Layne C

    2014-04-25

    We explore whether multifield inflationary models make unambiguous predictions for fundamental cosmological observables. Focusing on N-quadratic inflation, we numerically evaluate the full perturbation equations for models with 2, 3, and O(100) fields, using several distinct methods for specifying the initial values of the background fields. All scenarios are highly predictive, with the probability distribution functions of the cosmological observables becoming more sharply peaked as N increases. For N=100 fields, 95% of our Monte Carlo samples fall in the ranges ns∈(0.9455,0.9534), α∈(-9.741,-7.047)×10-4, r∈(0.1445,0.1449), and riso∈(0.02137,3.510)×10-3 for the spectral index, running, tensor-to-scalar ratio, and isocurvature-to-adiabatic ratio, respectively. The expected amplitude of isocurvature perturbations grows with N, raising the possibility that many-field models may be sensitive to postinflationary physics and suggesting new avenues for testing these scenarios.

  10. The Comparison Study of Short-Term Prediction Methods to Enhance the Model Predictive Controller Applied to Microgrid Energy Management

    Directory of Open Access Journals (Sweden)

    César Hernández-Hernández

    2017-06-01

    Full Text Available Electricity load forecasting, optimal power system operation and energy management play key roles that can bring significant operational advantages to microgrids. This paper studies how methods based on time series and neural networks can be used to predict energy demand and production, allowing them to be combined with model predictive control. Comparisons of different prediction methods and different optimal energy distribution scenarios are provided, permitting us to determine when short-term energy prediction models should be used. The proposed prediction models, in combination with the model predictive control strategy, appear to be a promising solution for energy management in microgrids. The controller has the task of managing the purchase and sale of electricity to the power grid, maximizing the use of renewable energy sources and managing the use of the energy storage system. Simulations were performed under different weather conditions of solar irradiation. The results obtained are encouraging for future practical implementation.

  11. Accuracy of pitch matching significantly improved by live voice model.

    Science.gov (United States)

    Granot, Roni Y; Israel-Kolatt, Rona; Gilboa, Avi; Kolatt, Tsafrir

    2013-05-01

    Singing is, undoubtedly, the most fundamental expression of our musical capacity, yet an estimated 10-15% of Western population sings "out-of-tune (OOT)." Previous research in children and adults suggests, albeit inconsistently, that imitating a human voice can improve pitch matching. In the present study, we focus on the potentially beneficial effects of the human voice and especially the live human voice. Eighteen participants varying in their singing abilities were required to imitate in singing a set of nine ascending and descending intervals presented to them in five different randomized blocked conditions: live piano, recorded piano, live voice using optimal voice production, recorded voice using optimal voice production, and recorded voice using artificial forced voice production. Pitch and interval matching in singing were much more accurate when participants repeated sung intervals as compared with intervals played to them on the piano. The advantage of the vocal over the piano stimuli was robust and emerged clearly regardless of whether piano tones were played live and in full view or were presented via recording. Live vocal stimuli elicited higher accuracy than recorded vocal stimuli, especially when the recorded vocal stimuli were produced in a forced vocal production. Remarkably, even those who would be considered OOT singers on the basis of their performance when repeating piano tones were able to pitch match live vocal sounds, with deviations well within the range of what is considered accurate singing (M=46.0, standard deviation=39.2 cents). In fact, those participants who were most OOT gained the most from the live voice model. Results are discussed in light of the dual auditory-motor encoding of pitch analogous to that found in speech. Copyright © 2013 The Voice Foundation. Published by Mosby, Inc. All rights reserved.

  12. Predictions of models for environmental radiological assessment

    Energy Technology Data Exchange (ETDEWEB)

    Peres, Sueli da Silva; Lauria, Dejanira da Costa, E-mail: suelip@ird.gov.br, E-mail: dejanira@irg.gov.br [Instituto de Radioprotecao e Dosimetria (IRD/CNEN-RJ), Servico de Avaliacao de Impacto Ambiental, Rio de Janeiro, RJ (Brazil); Mahler, Claudio Fernando [Coppe. Instituto Alberto Luiz Coimbra de Pos-Graduacao e Pesquisa de Engenharia, Universidade Federal do Rio de Janeiro (UFRJ) - Programa de Engenharia Civil, RJ (Brazil)

    2011-07-01

    In the field of environmental impact assessment, models are used for estimating the source term, environmental dispersion and transfer of radionuclides, exposure pathways, radiation dose and the risk for human beings. Although it is recognized that specific local data are important to improve the quality of dose assessment results, in practice obtaining them can be very difficult and expensive. Sources of uncertainty are numerous, among which we can cite: the subjectivity of modelers, exposure scenarios and pathways, the codes used, and general parameters. The various models available utilize different mathematical approaches with different complexities that can result in different predictions. Thus, for the same inputs, different models can produce very different outputs. This paper briefly presents the main advances in the field of environmental radiological assessment that aim to improve the reliability of the models used in the assessment of environmental radiological impact. A model intercomparison exercise supplied incompatible results for ¹³⁷Cs and ⁶⁰Co, highlighting the need to develop reference methodologies for environmental radiological assessment that allow dose estimates to be confronted on a common comparison basis. The results of the intercomparison exercise are presented briefly. (author)

  13. Predicting Protein Secondary Structure with Markov Models

    DEFF Research Database (Denmark)

    Fischer, Paul; Larsen, Simon; Thomsen, Claus

    2004-01-01

    The primary structure of a protein is the sequence of its amino acids. The secondary structure describes structural properties of the molecule such as which parts of it form sheets, helices or coils. Spatial and other properties are described by the higher order structures. The classification task we are considering here is to predict the secondary structure from the primary one. To this end we train a Markov model on training data and then use it to classify parts of unknown protein sequences as sheets, helices or coils. We show how to exploit the directional information contained...
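The approach described above can be sketched as one first-order Markov chain per structural class, trained from transition counts and used to score an unknown segment by log-likelihood; the segment is assigned the highest-scoring class. A toy sketch (the two-letter alphabet, class names and training strings are illustrative, not real protein data):

```python
import math
from collections import defaultdict

def train_markov(sequences, alphabet):
    """First-order transition probabilities with add-one smoothing."""
    counts = defaultdict(lambda: defaultdict(int))
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            counts[a][b] += 1
    probs = {}
    for a in alphabet:
        total = sum(counts[a].values()) + len(alphabet)
        probs[a] = {b: (counts[a][b] + 1) / total for b in alphabet}
    return probs

def log_likelihood(seq, probs):
    """Log-probability of the observed transitions under one class model."""
    return sum(math.log(probs[a][b]) for a, b in zip(seq, seq[1:]))

def classify(seq, models):
    """Assign the class whose Markov model gives the highest likelihood."""
    return max(models, key=lambda cls: log_likelihood(seq, models[cls]))

alphabet = "AB"
models = {
    "helix": train_markov(["ABABABAB", "ABAB"], alphabet),  # alternating residues
    "coil":  train_markov(["AAAAAAAA", "AAAA"], alphabet),  # repeating residues
}
print(classify("ABABAB", models))  # → helix
```

A real predictor would use the 20-letter amino acid alphabet and three classes (sheet, helix, coil), possibly with higher-order transitions.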

  14. A Modified Model Predictive Control Scheme

    Institute of Scientific and Technical Information of China (English)

    Xiao-Bing Hu; Wen-Hua Chen

    2005-01-01

    In implementations of MPC (Model Predictive Control) schemes, two issues need to be addressed. One is how to enlarge the stability region as much as possible. The other is how to guarantee stability when a computational time limitation exists. In this paper, a modified MPC scheme for constrained linear systems is described. An offline LMI-based iteration process is introduced to expand the stability region. At the same time, a database of feasible control sequences is generated offline so that stability can still be guaranteed in the case of computational time limitations. Simulation results illustrate the effectiveness of this new approach.

  15. Hierarchical Model Predictive Control for Resource Distribution

    DEFF Research Database (Denmark)

    Bendtsen, Jan Dimon; Trangbæk, K; Stoustrup, Jakob

    2010-01-01

    This paper deals with hierarchical model predictive control (MPC) of distributed systems. A three-level hierarchical approach is proposed, consisting of a high-level MPC controller, a second level of so-called aggregators controlled by an online MPC-like algorithm, and a lower level of autonomous... The proposed scheme facilitates plug-and-play addition of subsystems without redesign of any controllers. The method is supported by a number of simulations featuring a three-level smart-grid power control system for a small isolated power grid.

  16. Explicit model predictive control accuracy analysis

    OpenAIRE

    Knyazev, Andrew; Zhu, Peizhen; Di Cairano, Stefano

    2015-01-01

    Model Predictive Control (MPC) can efficiently control constrained systems in real-time applications. The MPC feedback law for a linear system with linear inequality constraints can be explicitly computed off-line, which results in an off-line partition of the state space into non-overlapping convex regions, with affine control laws associated with each region of the partition. An actual implementation of this explicit MPC in low-cost micro-controllers requires the data to be "quantized", i.e. repre...

  17. Lepton Flavor Violation in Predictive SUSY-GUT Models

    Energy Technology Data Exchange (ETDEWEB)

    Albright, Carl H.; /Northern Illinois U. /Fermilab; Chen, Mu-Chun; /UC, Irvine

    2008-02-01

    There have been many theoretical models constructed which aim to explain the neutrino masses and mixing patterns. While many of the models will be eliminated once more accurate determinations of the mixing parameters, especially sin²2θ₁₃, are obtained, charged lepton flavor violation (LFV) experiments are able to differentiate even further among the models. In this paper, they investigate various rare LFV processes, such as ℓ_i → ℓ_j + γ and μ-e conversion, in five predictive SUSY SO(10) models and their allowed soft SUSY breaking parameter space in the constrained minimal SUSY standard model (CMSSM). Utilizing the WMAP dark matter constraints, they obtain lower bounds on the branching ratios of these rare processes and find that at least three of the five models they consider give rise to predictions for μ → e + γ that will be tested by the MEG collaboration at PSI. In addition, the next-generation μ-e conversion experiment has sensitivity to the predictions of all five models, making it an even more robust way to test these models. While generic studies have emphasized the dependence of the branching ratios of these rare processes on the reactor neutrino angle, θ₁₃, and the mass of the heaviest right-handed neutrino, M₃, they find that a very massive M₃ is more significant than a large θ₁₃ in leading to branching ratios near the present upper limits.

  18. Critical conceptualism in environmental modeling and prediction.

    Science.gov (United States)

    Christakos, G

    2003-10-15

    Many important problems in environmental science and engineering are of a conceptual nature. Research and development, however, often becomes so preoccupied with technical issues, which are themselves fascinating, that it neglects essential methodological elements of conceptual reasoning and theoretical inquiry. This work suggests that valuable insight into environmental modeling can be gained by means of critical conceptualism which focuses on the software of human reason and, in practical terms, leads to a powerful methodological framework of space-time modeling and prediction. A knowledge synthesis system develops the rational means for the epistemic integration of various physical knowledge bases relevant to the natural system of interest in order to obtain a realistic representation of the system, provide a rigorous assessment of the uncertainty sources, generate meaningful predictions of environmental processes in space-time, and produce science-based decisions. No restriction is imposed on the shape of the distribution model or the form of the predictor (non-Gaussian distributions, multiple-point statistics, and nonlinear models are automatically incorporated). The scientific reasoning structure underlying knowledge synthesis involves teleologic criteria and stochastic logic principles which have important advantages over the reasoning method of conventional space-time techniques. Insight is gained in terms of real world applications, including the following: the study of global ozone patterns in the atmosphere using data sets generated by instruments on board the Nimbus 7 satellite and secondary information in terms of total ozone-tropopause pressure models; the mapping of arsenic concentrations in the Bangladesh drinking water by assimilating hard and soft data from an extensive network of monitoring wells; and the dynamic imaging of probability distributions of pollutants across the Kalamazoo river.

  19. Predictive Capability Maturity Model for computational modeling and simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy; Pilch, Martin M.

    2007-10-01

    The Predictive Capability Maturity Model (PCMM) is a new model that can be used to assess the level of maturity of computational modeling and simulation (M&S) efforts. The development of the model is based on both the authors' experience and their analysis of similar investigations in the past. The perspective taken in this report is one of judging the usefulness of a predictive capability that relies on the numerical solution to partial differential equations to better inform and improve decision making. The review of past investigations, such as the Software Engineering Institute's Capability Maturity Model Integration and the National Aeronautics and Space Administration and Department of Defense Technology Readiness Levels, indicates that a more restricted, more interpretable method is needed to assess the maturity of an M&S effort. The PCMM addresses six contributing elements to M&S: (1) representation and geometric fidelity, (2) physics and material model fidelity, (3) code verification, (4) solution verification, (5) model validation, and (6) uncertainty quantification and sensitivity analysis. For each of these elements, attributes are identified that characterize four increasing levels of maturity. Importantly, the PCMM is a structured method for assessing the maturity of an M&S effort that is directed toward an engineering application of interest. The PCMM does not assess whether the M&S effort, the accuracy of the predictions, or the performance of the engineering system satisfies or does not satisfy specified application requirements.

  20. The significance of the ProtDeform score for structure prediction and alignment.

    Directory of Open Access Journals (Sweden)

    Jairo Rocha

    Full Text Available BACKGROUND: When a researcher uses a program to align two proteins and gets a score, one of her main concerns is how often the program gives a similar score to pairs that are or are not in the same fold. This issue was analysed in detail recently for the program TM-align with its associated TM-score. It was shown that because the TM-score is length independent, it allows a P-value and a hit probability to be defined depending only on the score. Also, it was found that the TM-scores of gapless alignments closely follow an Extreme Value Distribution (EVD). The program ProtDeform for structural protein alignment was developed recently and is characterised by the ability to propose different transformations of different protein regions. Our goal is to analyse its associated score to give a researcher objective reasons to prefer one aligner over another, and to allow a better interpretation of the output. RESULTS: The study of the ProtDeform score reveals that it is length independent over a wider score range than TM-scores, and that PD-scores of gapless (random) alignments also approximately follow an EVD. On the CASP8 predictions, PD-scores and TM-scores, with respect to native structures, are highly correlated (0.95), and show that around a fifth of the predictions have a quality as low as 99.5% of the random scores. Using the Gold Standard benchmark, ProtDeform has lower probabilities of error than TM-align, at a similar speed. The analysis is extended to homology discrimination, showing that, again, ProtDeform offers higher hit probabilities than TM-align. Finally, we suggest using three different P-values according to three different contexts: gapless alignments, optimised alignments for fold discrimination, and optimised alignments for superfamily discrimination. In conclusion, PD-scores are at the very least as valuable for prediction scoring as TM-scores, and on the protein classification problem, even more reliable.
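The P-value machinery referred to in this record follows from the Gumbel form of the EVD: if random-alignment scores have location μ and scale β, then P(S ≥ s) = 1 − exp(−exp(−(s − μ)/β)). A minimal sketch with moment-based fitting; the random-alignment scores below are hypothetical, not from either paper:

```python
import math

def gumbel_pvalue(score, mu, beta):
    """P(S >= score) under a right-skewed Gumbel (EVD) null distribution."""
    z = (score - mu) / beta
    return 1.0 - math.exp(-math.exp(-z))

def fit_gumbel_moments(scores):
    """Method-of-moments Gumbel fit: beta from the variance, mu from the mean."""
    n = len(scores)
    mean = sum(scores) / n
    var = sum((s - mean) ** 2 for s in scores) / n
    beta = math.sqrt(6.0 * var) / math.pi
    mu = mean - 0.5772156649 * beta   # Euler-Mascheroni constant
    return mu, beta

# Hypothetical scores of random (gapless) alignments
mu, beta = fit_gumbel_moments([0.21, 0.18, 0.25, 0.19, 0.22, 0.20])
print(gumbel_pvalue(0.5, mu, beta))  # a score far above the null gives a tiny P-value
```

In practice μ and β would be fitted (often by maximum likelihood) per score regime, which is what makes length independence of the score so convenient: one null fit serves all lengths.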

  1. Additional Value of Transluminal Attenuation Gradient in CT Angiography to Predict Hemodynamic Significance of Coronary Artery Stenosis

    Science.gov (United States)

    Stuijfzand, Wynand J.; Danad, Ibrahim; Raijmakers, Pieter G.; Marcu, C. Bogdan; Heymans, Martijn W.; van Kuijk, Cornelis C.; van Rossum, Albert C.; Nieman, Koen; Min, James K.; Leipsic, Jonathon; van Royen, Niels; Knaapen, Paul

    2015-01-01

    OBJECTIVES The current study evaluates the incremental value of transluminal attenuation gradient (TAG), TAG with corrected contrast opacification (CCO), and TAG with exclusion of calcified coronary segments (ExC) over coronary computed tomography angiogram (CTA) alone, using fractional flow reserve (FFR) as the gold standard. BACKGROUND TAG is defined as the contrast opacification gradient along the length of a coronary artery on a coronary CTA. Preliminary data suggest that TAG provides additional functional information. Interpretation of TAG is hampered by multiple-heartbeat acquisition algorithms and coronary calcifications. Two correction models have been proposed, based on either dephasing of contrast delivery by relating coronary density to the corresponding descending aortic opacification (TAG-CCO) or excluding calcified coronary segments (TAG-ExC). METHODS Eighty-five patients with intermediate probability of coronary artery disease were prospectively included. All patients underwent step-and-shoot 256-slice coronary CTA. TAG, TAG-CCO, and TAG-ExC analyses were performed, followed by invasive coronary angiography in conjunction with FFR measurements of all major coronary branches. RESULTS Thirty-four patients (40%) were diagnosed with hemodynamically significant coronary artery disease (i.e., FFR ≤0.80). On a per-vessel basis (n = 253), 59 lesions (23%) were graded as hemodynamically significant, and the diagnostic accuracy of coronary CTA (diameter stenosis ≥50%) was 95%, 75%, 98%, and 54% for sensitivity, specificity, negative predictive value, and positive predictive value, respectively. TAG and TAG-ExC did not discriminate between vessels with or without hemodynamically significant lesions (−13.5 ± 17.1 HU [Hounsfield units]/10 mm vs. −11.6 ± 13.3 HU/10 mm, p = 0.36; and −13.1 ± 15.9 HU/10 mm vs. −11.4 ± 11.7 HU/10 mm, p = 0.77, respectively). TAG-CCO was lower in vessels with a hemodynamically significant lesion (−0

  2. Formalization of the model of the enterprise insolvency risk prediction

    Directory of Open Access Journals (Sweden)

    Elena V. Shirinkina

    2015-12-01

    Full Text Available Objective: to improve the conceptual apparatus and analytical procedures of insolvency risk identification. Methods: general scientific methods of systemic and comparative analysis; economic-statistical and dynamic analysis of economic processes and phenomena. Results: nowadays, managing the insolvency risk is relevant for any company regardless of the economic sector. Instability manifests itself through the uncertainty of the directions of external environment changes and their high frequency. Analysis of the economic literature showed that currently there is no single approach to the systematization of methods for insolvency risk prediction, which means that there is no objective view on the tools that can be used to monitor the insolvency risk. In this respect, the scientific and practical search for representative indicators for the formalization of models predicting insolvency is very important. Therefore, the study has solved the following tasks: defined the nature of the insolvency risk and its identification in the process of financial relations in the management system; proved the representativeness of the indicators in insolvency risk prediction; and formed the model of insolvency risk prediction. Scientific novelty: grounding the model of insolvency risk prediction. Practical significance: development of a theoretical framework to address issues arising in the diagnosis of insolvent enterprises, and application of the results obtained in the practice of the bankruptcy institution bodies. The presented model allows predicting the insolvency risk of the enterprise through the general development trend and the fluctuation boundaries of bankruptcy risk, determining the significance of each indicator-factor and its quantitative impact, and therefore avoiding the risk of enterprise insolvency.

  3. QSAR Models for the Prediction of Plasma Protein Binding

    Directory of Open Access Journals (Sweden)

    Zeshan Amin

    2013-02-01

    Full Text Available Introduction: The prediction of plasma protein binding (ppb) is of paramount importance in the pharmacokinetic characterization of drugs, as it causes significant changes in volume of distribution, clearance and drug half-life. This study utilized Quantitative Structure-Activity Relationships (QSAR) for the prediction of plasma protein binding. Methods: Protein binding values for 794 compounds were collated from the literature. The data were partitioned into a training set of 662 compounds and an external validation set of 132 compounds. Physicochemical and molecular descriptors were calculated for each compound using ACD labs/logD, MOE (Chemical Computing Group) and Symyx QSAR software packages. Several data mining tools were employed for the construction of models. These included stepwise regression analysis, Classification and Regression Trees (CART), boosted trees and Random Forest. Results: Several predictive models were identified; however, one model in particular produced significantly superior prediction accuracy for the external validation set as measured using mean absolute error and correlation coefficient. The selected model was a boosted regression tree model, with a mean absolute error of 13.25 for the training set and 14.96 for the validation set. Conclusion: Plasma protein binding can be modeled using simple regression trees or multiple linear regressions with reasonable model accuracies. These interpretable models were able to identify the governing molecular factors for high ppb, which included hydrophobicity, van der Waals surface area parameters, and aromaticity. On the other hand, the more complicated ensemble method of boosted regression trees produced the most accurate ppb estimations for the external validation set.

  4. Predictive significance of tissue eosinophilia for nasal polyp recurrence in the Chinese population.

    Science.gov (United States)

    Lou, Hongfei; Meng, Yifan; Piao, Yingshi; Wang, Chengshuo; Zhang, Luo; Bachert, Claus

    2015-01-01

    Chronic rhinosinusitis with nasal polyps (CRSwNP) remains a challenging clinical entity with its propensity for recurrence. Tissue eosinophilia is a hallmark of CRSwNP, and its role in polyp recurrence is a subject of much investigation. The aim of this study was to evaluate the association between clinical parameters, especially tissue eosinophilia, and polyp recurrence, and to identify the optimal cutoff value of tissue eosinophilia as a predictor of polyp recurrence in Chinese subjects. Overall, 387 patients with CRSwNP were enrolled in this retrospective analysis, and postoperative follow-up for polyp recurrence lasted >24 months (mean [standard deviation], 34.03 ± 4.95 months). The baseline demographic and clinical features and the preoperative computed tomography were compared, and mucosal specimens obtained at endoscopic sinus surgery were assessed for inflammatory cells by histocytologic staining. Predictive factors associated with polyp recurrence were analyzed by logistic regression analysis, and optimal cutoff points of the predictors were determined by receiver operating characteristic curves and the Youden index. A total of 55.3% of patients (214/387) experienced recurrence. Tissue eosinophilia markedly outweighed the other parameters in its correlation with polyp recurrence. Receiver operating characteristic curves indicated that a cutoff value of 27% tissue eosinophils predicted recurrence with 96.7% sensitivity and 92.5% specificity (area under the curve = 0.969). A tissue eosinophil proportion of >27% of total cells or a tissue eosinophil absolute count of >55 eosinophils per high power field may act as a reliable prognostic indicator for nasal polyp recurrence within 2 years after surgery.
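The cutoff-selection step used in this record, maximizing the Youden index J = sensitivity + specificity − 1 over candidate thresholds, can be sketched as follows (the eosinophil percentages and outcomes below are hypothetical, not the study's data):

```python
def youden_optimal_cutoff(values, labels):
    """Scan candidate cutoffs and return (J, cutoff) maximizing
    J = sensitivity + specificity - 1 (the Youden index)."""
    best = (-1.0, None)
    for c in sorted(set(values)):
        tp = sum(1 for v, y in zip(values, labels) if v >= c and y == 1)
        fn = sum(1 for v, y in zip(values, labels) if v < c and y == 1)
        tn = sum(1 for v, y in zip(values, labels) if v < c and y == 0)
        fp = sum(1 for v, y in zip(values, labels) if v >= c and y == 0)
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1.0
        if j > best[0]:
            best = (j, c)
    return best

# Hypothetical tissue eosinophil percentages; 1 marks recurrence
pct   = [5, 10, 15, 20, 30, 35, 40, 50]
recur = [0,  0,  0,  0,  1,  1,  1,  1]
print(youden_optimal_cutoff(pct, recur))  # → (1.0, 30)
```

On real data J is well below 1 and the chosen cutoff trades sensitivity against specificity, which is why the study reports both alongside the AUC.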

  5. A Predictive Maintenance Model for Railway Tracks

    DEFF Research Database (Denmark)

    Li, Rui; Wen, Min; Salling, Kim Bang

    2015-01-01

    For the modern railways, maintenance is critical for ensuring safety, train punctuality and overall capacity utilization. The cost of railway maintenance in Europe is high, on average between 30,000 – 100,000 Euro per km per year [1]. Aiming to reduce such maintenance expenditure, this paper...... presents a mathematical model based on Mixed Integer Programming (MIP) which is designed to optimize the predictive railway tamping activities for ballasted track for a time horizon of up to four years. The objective function is set up to minimize the actual costs for the tamping machine (measured by time...... recovery on the track quality after tamping operation and (5) Tamping machine operation factors. A Danish railway track between Odense and Fredericia with 57.2 km of length is applied for a time period of two to four years in the proposed maintenance model. The total cost can be reduced by up to 50...

  6. Predictive Model of Radiative Neutrino Masses

    CERN Document Server

    Babu, K S

    2013-01-01

    We present a simple and predictive model of radiative neutrino masses. It is a special case of the Zee model which introduces two Higgs doublets and a charged singlet. We impose a family-dependent Z₄ symmetry acting on the leptons, which reduces the number of parameters describing neutrino oscillations to four. A variety of predictions follow: the hierarchy of neutrino masses must be inverted; the lightest neutrino mass is extremely small and calculable; one of the neutrino mixing angles is determined in terms of the other two; the phase parameters take CP-conserving values with δ_CP = π; and the effective mass in neutrinoless double beta decay lies in a narrow range, m_ββ = (17.6 - 18.5) meV. The ratio of vacuum expectation values of the two Higgs doublets, tan β, is determined to be either 1.9 or 0.19 from neutrino oscillation data. Flavor-conserving and flavor-changing couplings of the Higgs doublets are also determined from neutrino data. The non-standard neutral Higgs bosons, if t...

  7. Hybrid multiscale modeling and prediction of cancer cell behavior.

    Science.gov (United States)

    Zangooei, Mohammad Hossein; Habibi, Jafar

    2017-01-01

    Understanding cancer development crossing several spatial-temporal scales is of great practical significance to better understand and treat cancers. It is difficult to tackle this challenge with pure biological means. Moreover, hybrid modeling techniques have been proposed that combine the advantages of the continuum and the discrete methods to model multiscale problems. In light of these problems, we have proposed a new hybrid vascular model to facilitate the multiscale modeling and simulation of cancer development with respect to the agent-based, cellular automata and machine learning methods. The purpose of this simulation is to create a dataset that can be used for prediction of cell phenotypes. By using a proposed Q-learning based on SVR-NSGA-II method, the cells have the capability to predict their phenotypes autonomously that is, to act on its own without external direction in response to situations it encounters. Computational simulations of the model were performed in order to analyze its performance. The most striking feature of our results is that each cell can select its phenotype at each time step according to its condition. We provide evidence that the prediction of cell phenotypes is reliable. Our proposed model, which we term a hybrid multiscale modeling of cancer cell behavior, has the potential to combine the best features of both continuum and discrete models. The in silico results indicate that the 3D model can represent key features of cancer growth, angiogenesis, and its related micro-environment and show that the findings are in good agreement with biological tumor behavior. To the best of our knowledge, this paper is the first hybrid vascular multiscale modeling of cancer cell behavior that has the capability to predict cell phenotypes individually by a self-generated dataset.
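The cells' ability to "predict their phenotypes autonomously" rests on reinforcement learning. The paper's SVR-NSGA-II coupling is beyond an abstract-level sketch, but the core tabular Q-learning update it builds on can be illustrated on a toy chain world (the environment, rewards and hyperparameters here are hypothetical, for illustration only):

```python
import random

def q_learning(n_states=5, n_actions=2, episodes=500, alpha=0.5, gamma=0.9, eps=0.1):
    """Tabular Q-learning on a toy chain: action 1 moves right, action 0 left;
    reaching the last state gives reward 1 and ends the episode."""
    q = [[0.0] * n_actions for _ in range(n_states)]
    for _ in range(episodes):
        s = 0
        while s != n_states - 1:
            if random.random() < eps:                       # epsilon-greedy exploration
                a = random.randrange(n_actions)
            else:
                a = max(range(n_actions), key=lambda i: q[s][i])
            s2 = min(s + 1, n_states - 1) if a == 1 else max(s - 1, 0)
            r = 1.0 if s2 == n_states - 1 else 0.0
            q[s][a] += alpha * (r + gamma * max(q[s2]) - q[s][a])  # Q-learning update
            s = s2
    return q

random.seed(0)
q = q_learning()
print([max(range(2), key=lambda i: q[s][i]) for s in range(4)])  # greedy policy per state
```

In the paper's setting the "state" would be a cell's micro-environment and the learned policy a phenotype choice; replacing the table with a fitted regressor (their SVR step) is what makes the approach scale beyond toy state spaces.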

  8. A predictive model for dimensional errors in fused deposition modeling

    DEFF Research Database (Denmark)

    Stolfi, A.

    2015-01-01

    This work concerns the effect of deposition angle (α) and layer thickness (L) on the dimensional performance of FDM parts, using a predictive model based on the geometrical description of the FDM filament profile. An experimental validation over the whole α range from 0° to 177° at 3° steps and two...

  9. Effect on Prediction when Modeling Covariates in Bayesian Nonparametric Models.

    Science.gov (United States)

    Cruz-Marcelo, Alejandro; Rosner, Gary L; Müller, Peter; Stewart, Clinton F

    2013-04-01

    In biomedical research, it is often of interest to characterize biologic processes giving rise to observations and to make predictions of future observations. Bayesian nonparametric methods provide a means for carrying out Bayesian inference making as few assumptions about restrictive parametric models as possible. There are several proposals in the literature for extending Bayesian nonparametric models to include dependence on covariates. Limited attention, however, has been directed to the following two aspects. In this article, we examine the effect on fitting and predictive performance of incorporating covariates in a class of Bayesian nonparametric models by one of two primary ways: either in the weights or in the locations of a discrete random probability measure. We show that different strategies for incorporating continuous covariates in Bayesian nonparametric models can result in big differences when used for prediction, even though they lead to otherwise similar posterior inferences. When one needs the predictive density, as in optimal design, and this density is a mixture, it is better to make the weights depend on the covariates. We demonstrate these points via a simulated data example and in an application in which one wants to determine the optimal dose of an anticancer drug used in pediatric oncology.

  10. Continuous-Discrete Time Prediction-Error Identification Relevant for Linear Model Predictive Control

    DEFF Research Database (Denmark)

    Jørgensen, John Bagterp; Jørgensen, Sten Bay

    2007-01-01

    A prediction-error method tailored for model-based predictive control is presented. The prediction-error methods studied are based on predictions using the Kalman filter and Kalman predictors for a linear discrete-time stochastic state space model. The linear discrete-time stochastic state space...... model is realized from a continuous-discrete-time linear stochastic system specified using transfer functions with time-delays. It is argued that the prediction-error criterion should be selected such that it is compatible with the objective function of the predictive controller in which the model......

  11. Using the IAT to predict ethnic and racial discrimination: small effect sizes of unknown societal significance.

    Science.gov (United States)

    Oswald, Frederick L; Mitchell, Gregory; Blanton, Hart; Jaccard, James; Tetlock, Philip E

    2015-04-01

    Greenwald, Banaji, and Nosek (2015) present a reanalysis of the meta-analysis by Oswald, Mitchell, Blanton, Jaccard, and Tetlock (2013) that examined the effect sizes of Implicit Association Tests (IATs) designed to predict racial and ethnic discrimination. We discuss points of agreement and disagreement with respect to methods used to synthesize the IAT studies, and we correct an error by Greenwald et al. that obscures a key contribution of our meta-analysis. In the end, all of the meta-analyses converge on the conclusion that, across diverse methods of coding and analyzing the data, IAT scores are not good predictors of ethnic or racial discrimination, and explain, at most, small fractions of the variance in discriminatory behavior in controlled laboratory settings. The thought experiments presented by Greenwald et al. go well beyond the lab to claim systematic IAT effects in noisy real-world settings, but these hypothetical exercises depend crucially on untested and, arguably, untenable assumptions.

  12. Patient-specific metrics of invasiveness reveal significant prognostic benefit of resection in a predictable subset of gliomas.

    Directory of Open Access Journals (Sweden)

    Anne L Baldock

    Full Text Available Malignant gliomas are incurable, primary brain neoplasms noted for their potential to extensively invade brain parenchyma. Current methods of clinical imaging do not elucidate the full extent of brain invasion, making it difficult to predict which, if any, patients are likely to benefit from gross total resection. Our goal was to apply a mathematical modeling approach to estimate the overall tumor invasiveness on a patient-by-patient basis and determine whether gross total resection would improve survival in patients with relatively less invasive gliomas. In 243 patients presenting with contrast-enhancing gliomas, estimates of the relative invasiveness of each patient's tumor, in terms of the ratio of the net proliferation rate of the glioma cells to their net dispersal rate, were derived by applying a patient-specific mathematical model to routine pretreatment MR imaging. The effect of varying degrees of extent of resection on overall survival was assessed for cohorts of patients grouped by tumor invasiveness. We demonstrate that patients with more diffuse tumors showed no survival benefit (P = 0.532) from gross total resection over subtotal resection/biopsy, while those with nodular (less diffuse) tumors showed a significant benefit (P = 0.00142), with a striking median survival benefit of over eight months compared to sub-totally resected tumors in the same cohort (an 80% improvement in survival time for GTR, seen only for nodular tumors). These results suggest that our patient-specific, model-based estimates of tumor invasiveness have clinical utility in surgical decision making. Quantification of relative invasiveness assessed from routinely obtained pre-operative imaging provides a practical predictor of the benefit of gross total resection.

  13. Two criteria for evaluating risk prediction models.

    Science.gov (United States)

    Pfeiffer, R M; Gail, M H

    2011-09-01

    We propose and study two criteria to assess the usefulness of models that predict risk of disease incidence for screening and prevention, or the usefulness of prognostic models for management following disease diagnosis. The first criterion, the proportion of cases followed, PCF(q), is the proportion of individuals who will develop disease who are included in the proportion q of individuals in the population at highest risk. The second criterion is the proportion needed to follow up, PNF(p), namely the proportion of the general population at highest risk that one needs to follow in order that a proportion p of those destined to become cases will be followed. PCF(q) assesses the effectiveness of a program that follows 100q% of the population at highest risk. PNF(p) assesses the feasibility of covering 100p% of cases by indicating how much of the population at highest risk must be followed. We show the relationship of these two criteria to the Lorenz curve and its inverse, and present distribution theory for estimates of PCF and PNF. We develop new methods, based on influence functions, for inference for a single risk model, and also for comparing the PCFs and PNFs of two risk models, both of which were evaluated in the same validation data.
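Both criteria have simple empirical estimates on a validation set: rank individuals by predicted risk; PCF(q) is then the fraction of cases inside the top-q fraction, and PNF(p) is the smallest top fraction containing a proportion p of the cases. A sketch on hypothetical data (the risk scores and case labels are invented for illustration):

```python
def pcf(risks, cases, q):
    """Proportion of cases captured in the top q fraction at highest risk."""
    order = sorted(range(len(risks)), key=lambda i: -risks[i])
    n_top = int(round(q * len(risks)))
    return sum(cases[i] for i in order[:n_top]) / sum(cases)

def pnf(risks, cases, p):
    """Smallest population fraction (by risk rank) containing a proportion p of cases."""
    order = sorted(range(len(risks)), key=lambda i: -risks[i])
    total, captured = sum(cases), 0
    for rank, i in enumerate(order, start=1):
        captured += cases[i]
        if captured >= p * total:
            return rank / len(risks)
    return 1.0

# Hypothetical predicted risks; 1 marks individuals who develop disease
risks = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2, 0.1, 0.05]
cases = [1,   1,   0,   1,   0,   0,   0,   0]
print(pcf(risks, cases, 0.25))  # → 0.666...: top quarter captures 2 of 3 cases
print(pnf(risks, cases, 1.0))   # → 0.5: half the population must be followed for all cases
```

These empirical versions are exactly the points of the Lorenz curve (and its inverse) that the paper's distribution theory concerns.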

  14. Methods for Handling Missing Variables in Risk Prediction Models

    NARCIS (Netherlands)

    Held, Ulrike; Kessels, Alfons; Aymerich, Judith Garcia; Basagana, Xavier; ter Riet, Gerben; Moons, Karel G. M.; Puhan, Milo A.

    2016-01-01

    Prediction models should be externally validated before being used in clinical practice. Many published prediction models have never been validated. Uncollected predictor variables in otherwise suitable validation cohorts are the main factor precluding external validation. We used individual patient

  15. Prediction of blast-induced air overpressure: a hybrid AI-based predictive model.

    Science.gov (United States)

    Jahed Armaghani, Danial; Hajihassani, Mohsen; Marto, Aminaton; Shirani Faradonbeh, Roohollah; Mohamad, Edy Tonnizam

    2015-11-01

    Blast operations in the vicinity of residential areas usually produce significant environmental problems which may cause severe damage to the nearby areas. Blast-induced air overpressure (AOp) is one of the most important environmental impacts of blast operations which needs to be predicted to minimize the potential risk of damage. This paper presents an artificial neural network (ANN) optimized by the imperialist competitive algorithm (ICA) for the prediction of AOp induced by quarry blasting. For this purpose, 95 blasting operations were precisely monitored in a granite quarry site in Malaysia and AOp values were recorded in each operation. Furthermore, the most influential parameters on AOp, including the maximum charge per delay and the distance between the blast-face and monitoring point, were measured and used to train the ICA-ANN model. Based on the generalized predictor equation and considering the measured data from the granite quarry site, a new empirical equation was developed to predict AOp. For comparison purposes, conventional ANN models were developed and compared with the ICA-ANN results. The results demonstrated that the proposed ICA-ANN model is able to predict blast-induced AOp more accurately than other presented techniques.
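The abstract does not give the form of its "generalized predictor equation"; a common baseline for AOp is the scaled-distance law AOp = a·(D/Q^(1/3))^(−b), with D the distance and Q the maximum charge per delay, which can be fitted by least squares on logarithms. The site constants and records below are hypothetical:

```python
import math

def fit_scaled_distance(distances, charges, aop):
    """Fit AOp = a * (D / Q**(1/3))**(-b) by ordinary least squares on logs."""
    xs = [math.log(d / q ** (1.0 / 3.0)) for d, q in zip(distances, charges)]
    ys = [math.log(p) for p in aop]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    b = -slope                       # log AOp = log a - b * log(scaled distance)
    a = math.exp(ybar + b * xbar)
    return a, b

# Synthetic monitoring records generated from a=180, b=1.2 (hypothetical site constants)
D = [100, 200, 300, 400, 500]        # blast-face to monitor distance, m
Q = [50, 60, 80, 100, 120]           # maximum charge per delay, kg
aop = [180.0 * (d / q ** (1 / 3)) ** -1.2 for d, q in zip(D, Q)]
a, b = fit_scaled_distance(D, Q, aop)
print(round(a), round(b, 2))  # → 180 1.2 (recovers the generating constants)
```

The paper's ICA-ANN improves on exactly this kind of two-parameter empirical fit by letting a network capture nonlinear interactions between the same two inputs.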

  16. The Prediction Model of Dam Uplift Pressure Based on Random Forest

    Science.gov (United States)

    Li, Xing; Su, Huaizhi; Hu, Jiang

    2017-09-01

    The prediction of dam uplift pressure is of great significance in dam safety monitoring. Based on comprehensive consideration of various factors, 18 parameters are selected as the main factors affecting uplift pressure. Using the actual monitoring data of uplift pressure as the evaluation basis, prediction models are built with the random forest algorithm and with support vector machines, and the predictive performance of the two models is compared and analyzed. At the same time, based on the established random forest model, the significance of each factor is analyzed, and the importance of each factor is calculated by the importance function. Results showed that: (1) the RF prediction model can quickly and accurately predict the uplift pressure value from the influence factors, with an average prediction accuracy above 96%; compared with the support vector machine (SVM) model, the random forest model has better robustness, better prediction precision and faster convergence, and is more robust to missing and unbalanced data. (2) The effect of water level on uplift pressure is the largest, and the influence of rainfall on uplift pressure is the smallest compared with the other factors.
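The factor-ranking idea can be illustrated without a full random forest: permutation importance, the rise in prediction error when one input column is shuffled, is the model-agnostic analogue of the importance function mentioned in the record. The dataset and the stand-in model below are hypothetical (a known linear relation in which water level dominates rainfall, mirroring the abstract's finding):

```python
import random

def mse(model, X, y):
    """Mean squared error of a callable model on rows X with targets y."""
    return sum((model(row) - yi) ** 2 for row, yi in zip(X, y)) / len(y)

def permutation_importance(model, X, y, feature, trials=20, rng=random.Random(0)):
    """Average increase in MSE when one feature column is shuffled:
    a larger increase means a more important feature."""
    base = mse(model, X, y)
    rises = []
    for _ in range(trials):
        col = [row[feature] for row in X]
        rng.shuffle(col)
        Xp = [row[:feature] + [v] + row[feature + 1:] for row, v in zip(X, col)]
        rises.append(mse(model, Xp, y) - base)
    return sum(rises) / trials

# Toy stand-in for a trained model: uplift pressure driven mainly by water level
rng = random.Random(1)
X = [[rng.random(), rng.random()] for _ in range(200)]   # [water_level, rainfall]
y = [3.0 * wl + 0.1 * rf for wl, rf in X]

def model(row):
    return 3.0 * row[0] + 0.1 * row[1]

wl_imp = permutation_importance(model, X, y, feature=0)
rf_imp = permutation_importance(model, X, y, feature=1)
print(wl_imp > rf_imp)  # → True: water level dominates, as in the abstract
```

Random forests report an analogous ranking internally (impurity-based or out-of-bag permutation importance), which is what the study uses to conclude that water level matters most and rainfall least.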

  17. Predictive modelling of contagious deforestation in the Brazilian Amazon.

    Directory of Open Access Journals (Sweden)

    Isabel M D Rosa

    Full Text Available Tropical forests are diminishing in extent due primarily to the rapid expansion of agriculture, but the magnitude and geographical distribution of future tropical deforestation are uncertain. Here, we introduce a dynamic and spatially-explicit model of deforestation that predicts the potential magnitude and spatial pattern of Amazon deforestation. Our model differs from previous models in three ways: (1) it is probabilistic and quantifies uncertainty around predictions and parameters; (2) the overall deforestation rate emerges "bottom up", as the sum of local-scale deforestation driven by local processes; and (3) deforestation is contagious, such that the local deforestation rate increases through time if adjacent locations are deforested. For the scenarios evaluated-pre- and post-PPCDAM ("Plano de Ação para Proteção e Controle do Desmatamento na Amazônia")-the parameter estimates confirmed that forests near roads and already deforested areas are significantly more likely to be deforested in the near future, and less likely in protected areas. Validation tests showed that our model correctly predicted the magnitude and spatial pattern of deforestation that accumulates over time, but that there is very high uncertainty surrounding the exact sequence in which pixels are deforested. The model predicts that under pre-PPCDAM conditions (assuming no change in parameter values due to, for example, changes in government policy), annual deforestation rates would halve by 2050 compared to 2002, although this partly reflects reliance on a static map of the road network. Consistent with other models, under the pre-PPCDAM scenario, states in the south and east of the Brazilian Amazon have a high predicted probability of losing nearly all forest outside of protected areas by 2050. This pattern is less strong in the post-PPCDAM scenario. Contagious spread along roads and through areas lacking formal protection could allow deforestation to reach the core, which is

  18. Predictive modelling of contagious deforestation in the Brazilian Amazon.

    Science.gov (United States)

    Rosa, Isabel M D; Purves, Drew; Souza, Carlos; Ewers, Robert M

    2013-01-01

    Tropical forests are diminishing in extent due primarily to the rapid expansion of agriculture, but the magnitude and geographical distribution of future tropical deforestation are uncertain. Here, we introduce a dynamic and spatially-explicit model of deforestation that predicts the potential magnitude and spatial pattern of Amazon deforestation. Our model differs from previous models in three ways: (1) it is probabilistic and quantifies uncertainty around predictions and parameters; (2) the overall deforestation rate emerges "bottom up", as the sum of local-scale deforestation driven by local processes; and (3) deforestation is contagious, such that the local deforestation rate increases through time if adjacent locations are deforested. For the scenarios evaluated-pre- and post-PPCDAM ("Plano de Ação para Proteção e Controle do Desmatamento na Amazônia")-the parameter estimates confirmed that forests near roads and already deforested areas are significantly more likely to be deforested in the near future, and less likely in protected areas. Validation tests showed that our model correctly predicted the magnitude and spatial pattern of deforestation that accumulates over time, but that there is very high uncertainty surrounding the exact sequence in which pixels are deforested. The model predicts that under pre-PPCDAM conditions (assuming no change in parameter values due to, for example, changes in government policy), annual deforestation rates would halve by 2050 compared to 2002, although this partly reflects reliance on a static map of the road network. Consistent with other models, under the pre-PPCDAM scenario, states in the south and east of the Brazilian Amazon have a high predicted probability of losing nearly all forest outside of protected areas by 2050. This pattern is less strong in the post-PPCDAM scenario. Contagious spread along roads and through areas lacking formal protection could allow deforestation to reach the core, which is currently

  19. The predictive significance of CD20 expression in B-cell lymphomas

    Directory of Open Access Journals (Sweden)

    Horvat Mateja

    2011-04-01

    Full Text Available Abstract Background In our recent study, we determined the cut-off value of CD20 expression at the level of 25 000 molecules of equivalent soluble fluorochrome (MESF) to be the predictor of response to rituximab-containing treatment in patients with B-cell lymphomas. In 17.5% of patients, who had the level of CD20 expression below the cut-off value, the response to rituximab-containing treatment was significantly worse than in the rest of the patients with the level of CD20 expression above the cut-off value. The proportion of patients with low CD20 expression who might not benefit from rituximab-containing treatment was not necessarily representative. Therefore, the aim of this study was to quantify the CD20 expression in a larger series of patients with B-cell lymphomas, which might allow us to determine more reliably the proportion of patients with the CD20 expression below the cut-off. Methods Cytological samples of 64 diffuse large B-cell lymphomas (DLBCL), 56 follicular lymphomas (FL), 31 chronic lymphocytic leukemias (CLL), 34 mantle cell lymphomas (MCL), 18 marginal zone lymphomas (MZL) and 15 unclassified B-cell lymphomas were analyzed for CD20 expression by quantitative four-color flow cytometric measurements using a FACSCalibur flow cytometer (BD Biosciences). Results The range of CD20 expression in different B-cell lymphomas was very broad, varying from 2 737 to 115 623 MESF in CLL and 3 549 to 679 577 MESF in DLBCL. However, when we compared the CD20 expression in the groups of patients with DLBCL, FL, MCL, MZL, CLL and unclassified B-cell lymphomas, it was found to be significantly lower (p = 0.002) only in CLL but did not significantly differ among the other lymphoma types (p = NS). Fifty-three out of 218 (24.3%) patients with B-cell lymphomas had the CD20 expression below the cut-off value. Conclusions The CD20 expression in CLL is significantly lower than in most histological types of mature B-cell lymphomas in which it appears to be comparable

  20. Estimating the magnitude of prediction uncertainties for the APLE model

    Science.gov (United States)

    Models are often used to predict phosphorus (P) loss from agricultural fields. While it is commonly recognized that model predictions are inherently uncertain, few studies have addressed prediction uncertainties using P loss models. In this study, we conduct an uncertainty analysis for the Annual P ...

  1. Regression Model to Predict Global Solar Irradiance in Malaysia

    Directory of Open Access Journals (Sweden)

    Hairuniza Ahmed Kutty

    2015-01-01

    Full Text Available A novel regression model is developed to estimate the monthly global solar irradiance in Malaysia. The model is developed based on different available meteorological parameters, including temperature, cloud cover, rain precipitation, relative humidity, wind speed, pressure, and gust speed, by implementing regression analysis. This paper reports the details of the analysis of the effect of each prediction parameter to identify the parameters that are relevant to estimating global solar irradiance. In addition, the proposed model is compared in terms of the root mean square error (RMSE), mean bias error (MBE), and the coefficient of determination (R2) with other models available from literature studies. Seven models based on single parameters (PM1 to PM7) and five multiple-parameter models (PM8 to PM12) are proposed. The new models perform well, with RMSE ranging from 0.429% to 1.774%, R2 ranging from 0.942 to 0.992, and MBE ranging from −0.1571% to 0.6025%. In general, cloud cover significantly affects the estimation of global solar irradiance. However, cloud cover in Malaysia lacks sufficient influence when included in multiple-parameter models, although it performs fairly well in single-parameter prediction models.
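The three comparison metrics named in this record (RMSE, MBE, and R2) can be computed directly from paired observed and predicted values. The small data set below is an illustrative placeholder, not the Malaysian irradiance data.

```python
import numpy as np

def rmse(obs, pred):
    return float(np.sqrt(np.mean((np.asarray(pred) - np.asarray(obs)) ** 2)))

def mbe(obs, pred):
    # positive MBE means the model over-predicts on average
    return float(np.mean(np.asarray(pred) - np.asarray(obs)))

def r2(obs, pred):
    obs, pred = np.asarray(obs), np.asarray(pred)
    ss_res = np.sum((obs - pred) ** 2)
    ss_tot = np.sum((obs - obs.mean()) ** 2)
    return float(1 - ss_res / ss_tot)

obs = np.array([4.8, 5.1, 5.6, 6.0, 5.4])    # illustrative monthly values
pred = np.array([4.9, 5.0, 5.5, 6.2, 5.3])
print(rmse(obs, pred), mbe(obs, pred), r2(obs, pred))
```

Note that RMSE penalizes scatter while MBE only captures systematic bias, which is why the record reports both alongside R2.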

  2. Estimating Predictive Variance for Statistical Gas Distribution Modelling

    Science.gov (United States)

    Lilienthal, Achim J.; Asadi, Sahar; Reggente, Matteo

    2009-05-01

    Recent publications in statistical gas distribution modelling have proposed algorithms that model the mean and variance of a distribution. This paper argues that estimating the predictive concentration variance is not merely a gradual improvement but a significant step to advance the field. First, such models much better fit the particular structure of gas distributions, which exhibit strong fluctuations with considerable spatial variations as a result of the intermittent character of gas dispersal. Second, estimating the predictive variance makes it possible to evaluate the model quality in terms of the data likelihood. This offers a solution to the problem of ground-truth evaluation, which has always been a critical issue for gas distribution modelling. It also enables solid comparisons of different modelling approaches, and provides the means to learn meta-parameters of the model, to determine when the model should be updated or re-initialised, or to suggest new measurement locations based on the current model. We also point out directions of related ongoing or potential future research work.

  3. A prediction model for progressive disease in systemic sclerosis

    Science.gov (United States)

    Meijs, Jessica; Schouffoer, Anne A; Ajmone Marsan, Nina; Stijnen, Theo; Putter, Hein; Ninaber, Maarten K; Huizinga, Tom W J; de Vries-Bouwstra, Jeska K

    2015-01-01

    Objective To develop a model that assesses the risk for progressive disease in patients with systemic sclerosis (SSc) over the short term, in order to guide clinical management. Methods Baseline characteristics and 1 year follow-up results of 163 patients with SSc referred to a multidisciplinary healthcare programme were evaluated. Progressive disease was defined as: death, ≥10% decrease in forced vital capacity, ≥15% decrease in diffusing capacity for carbon monoxide, ≥10% decrease in body weight, ≥30% decrease in estimated-glomerular filtration rate, ≥30% increase in modified Rodnan Skin Score (with Δ≥5) or ≥0.25 increase in Scleroderma Health Assessment Questionnaire. The number of patients with progressive disease was determined. Univariable and multivariable logistic regression analyses were used to assess the probability of progressive disease for each individual patient. Performance of the prediction model was evaluated using a calibration plot and area under the receiver operating characteristic curve. Results 63 patients had progressive disease, including 8 patients who died ≤18 months after first evaluation. Multivariable analysis showed that friction rubs, proximal muscular weakness and decreased maximum oxygen uptake as % predicted, adjusted for age, gender and use of immunosuppressive therapy at baseline, were significantly associated with progressive disease. Using the prediction model, the predicted chance for progressive disease increased from a pretest chance of 37% to 67–89%. Conclusions Using the prediction model, the chance for progressive disease for individual patients could be doubled. Friction rubs, proximal muscular weakness and maximum oxygen uptake as % predicted were identified as relevant parameters. PMID:26688749

  4. The predictive performance and stability of six species distribution models.

    Science.gov (United States)

    Duan, Ren-Yan; Kong, Xiao-Quan; Huang, Min-Yi; Fan, Wei-Yi; Wang, Zhi-Gao

    2014-01-01

    Predicting species' potential geographical range by species distribution models (SDMs) is central to understanding their ecological requirements. However, the effects of using different modeling techniques need further investigation. In order to improve the prediction effect, we need to assess the predictive performance and stability of different SDMs. We collected the distribution data of five common tree species (Pinus massoniana, Betula platyphylla, Quercus wutaishanica, Quercus mongolica and Quercus variabilis) and simulated their potential distribution area using 13 environmental variables and six widely used SDMs: BIOCLIM, DOMAIN, MAHAL, RF, MAXENT, and SVM. Each model run was repeated 100 times (trials). We compared the predictive performance by testing the consistency between observations and simulated distributions and assessed the stability by the standard deviation, coefficient of variation, and the 99% confidence interval of Kappa and AUC values. The mean values of AUC and Kappa from MAHAL, RF, MAXENT, and SVM trials were similar and significantly higher than those from BIOCLIM and DOMAIN trials (p<0.05), while the associated standard deviations and coefficients of variation were larger for BIOCLIM and DOMAIN trials, and the 99% confidence intervals for AUC and Kappa values were narrower for MAHAL, RF, MAXENT, and SVM. Compared to BIOCLIM and DOMAIN, other SDMs (MAHAL, RF, MAXENT, and SVM) had higher prediction accuracy, smaller confidence intervals, and were more stable and less affected by the random variable (randomly selected pseudo-absence points). According to the prediction performance and stability of SDMs, we can divide these six SDMs into two categories: a high performance and stability group including MAHAL, RF, MAXENT, and SVM, and a low performance and stability group consisting of BIOCLIM and DOMAIN. We highlight that choosing appropriate SDMs to address a specific problem is an important part of the modeling process.
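The stability summary used in this record (standard deviation, coefficient of variation, and a 99% confidence interval of AUC across repeated trials) can be sketched as follows. The 100 AUC values are simulated stand-ins, not the study's trial results.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
auc_trials = np.clip(rng.normal(0.88, 0.02, 100), 0, 1)   # 100 simulated trials

mean = auc_trials.mean()
sd = auc_trials.std(ddof=1)                # sample standard deviation
cv = sd / mean                             # coefficient of variation
# 99% CI for the mean AUC, using the t distribution across trials
half = stats.t.ppf(0.995, df=len(auc_trials) - 1) * sd / np.sqrt(len(auc_trials))
ci99 = (mean - half, mean + half)
print(round(mean, 3), round(cv, 4), tuple(round(x, 3) for x in ci99))
```

A narrower CI and smaller CV across trials is exactly what separates the record's "high stability" group from the "low stability" one.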

  5. The predictive performance and stability of six species distribution models.

    Directory of Open Access Journals (Sweden)

    Ren-Yan Duan

    Full Text Available Predicting species' potential geographical range by species distribution models (SDMs) is central to understanding their ecological requirements. However, the effects of using different modeling techniques need further investigation. In order to improve the prediction effect, we need to assess the predictive performance and stability of different SDMs. We collected the distribution data of five common tree species (Pinus massoniana, Betula platyphylla, Quercus wutaishanica, Quercus mongolica and Quercus variabilis) and simulated their potential distribution area using 13 environmental variables and six widely used SDMs: BIOCLIM, DOMAIN, MAHAL, RF, MAXENT, and SVM. Each model run was repeated 100 times (trials). We compared the predictive performance by testing the consistency between observations and simulated distributions and assessed the stability by the standard deviation, coefficient of variation, and the 99% confidence interval of Kappa and AUC values. The mean values of AUC and Kappa from MAHAL, RF, MAXENT, and SVM trials were similar and significantly higher than those from BIOCLIM and DOMAIN trials (p<0.05), while the associated standard deviations and coefficients of variation were larger for BIOCLIM and DOMAIN trials (p<0.05), and the 99% confidence intervals for AUC and Kappa values were narrower for MAHAL, RF, MAXENT, and SVM. Compared to BIOCLIM and DOMAIN, other SDMs (MAHAL, RF, MAXENT, and SVM) had higher prediction accuracy, smaller confidence intervals, and were more stable and less affected by the random variable (randomly selected pseudo-absence points). According to the prediction performance and stability of SDMs, we can divide these six SDMs into two categories: a high performance and stability group including MAHAL, RF, MAXENT, and SVM, and a low performance and stability group consisting of BIOCLIM and DOMAIN. We highlight that choosing appropriate SDMs to address a specific problem is an important part of the modeling process.

  6. Prediction of Catastrophes: an experimental model

    CERN Document Server

    Peters, Randall D; Pomeau, Yves

    2012-01-01

    Catastrophes of all kinds can be roughly defined as short duration-large amplitude events following and followed by long periods of "ripening". Major earthquakes surely belong to the class of 'catastrophic' events. Because of the space-time scales involved, an experimental approach is often difficult, not to say impossible, however desirable it could be. Described in this article is a "laboratory" setup that yields data of a type that is amenable to theoretical methods of prediction. Observations are made of a critical slowing down in the noisy signal of a solder wire creeping under constant stress. This effect is shown to be a fair signal of the forthcoming catastrophe in both of two dynamical models. The first is an "abstract" model in which a time dependent quantity drifts slowly but makes quick jumps from time to time. The second is a realistic physical model for the collective motion of dislocations (the Ananthakrishna set of equations for creep). Hope thus exists that similar changes in the response to ...

  7. Predictive modeling of low solubility semiconductor alloys

    Science.gov (United States)

    Rodriguez, Garrett V.; Millunchick, Joanna M.

    2016-09-01

    GaAsBi is of great interest for applications in high efficiency optoelectronic devices due to its highly tunable bandgap. However, the experimental growth of high Bi content films has proven difficult. Here, we model GaAsBi film growth using a kinetic Monte Carlo simulation that explicitly takes cation and anion reactions into account. The unique behavior of Bi droplets is explored, and a sharp decrease in Bi content upon Bi droplet formation is demonstrated. The high mobility of simulated Bi droplets on GaAsBi surfaces is shown to produce phase separated Ga-Bi droplets as well as depressions on the film surface. A phase diagram for a range of growth rates that predicts both Bi content and droplet formation is presented to guide the experimental growth of high Bi content GaAsBi films.

  8. Distributed model predictive control made easy

    CERN Document Server

    Negenborn, Rudy

    2014-01-01

    The rapid evolution of computer science, communication, and information technology has enabled the application of control techniques to systems beyond the possibilities of control theory just a decade ago. Critical infrastructures such as electricity, water, traffic and intermodal transport networks are now in the scope of control engineers. The sheer size of such large-scale systems requires the adoption of advanced distributed control approaches. Distributed model predictive control (MPC) is one of the promising control methodologies for control of such systems.   This book provides a state-of-the-art overview of distributed MPC approaches, while at the same time making clear directions of research that deserve more attention. The core and rationale of 35 approaches are carefully explained. Moreover, detailed step-by-step algorithmic descriptions of each approach are provided. These features make the book a comprehensive guide both for those seeking an introduction to distributed MPC as well as for those ...

  9. Leptogenesis in minimal predictive seesaw models

    Science.gov (United States)

    Björkeroth, Fredrik; de Anda, Francisco J.; de Medeiros Varzielas, Ivo; King, Stephen F.

    2015-10-01

    We estimate the Baryon Asymmetry of the Universe (BAU) arising from leptogenesis within a class of minimal predictive seesaw models involving two right-handed neutrinos and simple Yukawa structures with one texture zero. The two right-handed neutrinos are dominantly responsible for the "atmospheric" and "solar" neutrino masses with Yukawa couplings to (νe, νμ, ντ) proportional to (0, 1, 1) and (1, n, n - 2), respectively, where n is a positive integer. The neutrino Yukawa matrix is therefore characterised by two proportionality constants, with their relative phase providing a leptogenesis-PMNS link, enabling the lightest right-handed neutrino mass to be determined from neutrino data and the observed BAU. We discuss an SU(5) SUSY GUT example, where A4 vacuum alignment provides the required Yukawa structures with n = 3, while a Z_9 symmetry fixes the relative phase to be a ninth root of unity.

  10. Predictive modeling of respiratory tumor motion for real-time prediction of baseline shifts

    Science.gov (United States)

    Balasubramanian, A.; Shamsuddin, R.; Prabhakaran, B.; Sawant, A.

    2017-03-01

    Baseline shifts in respiratory patterns can result in significant spatiotemporal changes in patient anatomy (compared to that captured during simulation), in turn causing geometric and dosimetric errors in the administration of thoracic and abdominal radiotherapy. We propose predictive modeling of tumor motion trajectories for predicting a baseline shift ahead of its occurrence. The key idea is to use the features of the tumor motion trajectory over a 1 min window, and predict the occurrence of a baseline shift in the 5 s that immediately follow (lookahead window). In this study, we explored a preliminary trend-based analysis with multi-class annotations as well as a more focused binary classification analysis. In both analyses, a number of different inter-fraction and intra-fraction training strategies were studied, both offline as well as online, along with data sufficiency and skew compensation for class imbalances. The performance of the different training strategies was compared across multiple machine learning classification algorithms, including nearest neighbor, Naïve Bayes, linear discriminant and ensemble Adaboost. The prediction performance is evaluated using metrics such as accuracy, precision, recall and the area under the receiver operating characteristic (ROC) curve (AUC). The key results of the trend-based analysis indicate that (i) intra-fraction training strategies achieve the highest prediction accuracies (90.5-91.4%); (ii) the predictive modeling yields the lowest accuracies (50-60%) when the training data does not include any information from the test patient; (iii) the prediction latencies are as low as a few hundred milliseconds, and thus conducive to real-time prediction. The binary classification performance is promising, indicated by high AUCs (0.96-0.98). It also confirms the utility of prior data from previous patients, and also the necessity of training the classifier on some initial data from the new patient for reasonable
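A hedged sketch of the binary task described above: classify whether a lookahead window contains a baseline shift from features of the preceding trace, scored by AUC. The two window features and the labels below are invented toy stand-ins, and AdaBoost is just one of the classifiers the record lists.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(5)
n = 600
# two toy window features: drift of the trace mean, and trace variability
drift = rng.normal(0, 1, n)
var = rng.normal(0, 1, n)
y = (drift + 0.5 * var + rng.normal(0, 0.8, n) > 0.8).astype(int)  # shift label
X = np.column_stack([drift, var])

clf = AdaBoostClassifier(n_estimators=100, random_state=0).fit(X[:400], y[:400])
auc = roc_auc_score(y[400:], clf.predict_proba(X[400:])[:, 1])
print(round(auc, 3))
```

In practice the train/test split would be stratified per patient to mirror the inter- vs intra-fraction training strategies the record compares.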

  11. Earth-To-Space Improved Model for Rain Attenuation Prediction at Ku-Band

    Directory of Open Access Journals (Sweden)

    Mandeep S.J. Singh

    2006-01-01

    Full Text Available A model for predicting rain attenuation on earth-to-space paths was developed using measurement data obtained from a tropical and equatorial region. The proposed rain attenuation model uses the complete rainfall rate cumulative distribution as input data. It was shown that significant improvements in prediction error over existing attenuation models were obtained.

  12. Mesoscale Wind Predictions for Wave Model Evaluation

    Science.gov (United States)

    2016-06-07

    N0001400WX20041(B) http://www.nrlmry.navy.mil LONG TERM GOALS The long-term goal is to demonstrate the significance and importance of high...ocean waves by an appropriate wave model. OBJECTIVES The main objectives of this project are to: 1. Build the infrastructure to generate the...temperature for all COAMPS grids at the resolution of each of these grids. These analyses are important for the proper specification of the lower

  13. Phylogenomic analyses predict sistergroup relationship of nucleariids and Fungi and paraphyly of zygomycetes with significant support

    Directory of Open Access Journals (Sweden)

    Steenkamp Emma

    2009-01-01

    Full Text Available Abstract Background Resolving the evolutionary relationships among Fungi remains challenging because of their highly variable evolutionary rates, and lack of a close phylogenetic outgroup. Nucleariida, an enigmatic group of amoeboids, have been proposed to emerge close to the fungal-metazoan divergence and might fulfill this role. Yet, published phylogenies with up to five genes are without compelling statistical support, and genome-level data should be used to resolve this question with confidence. Results Our analyses with nuclear (118 proteins) and mitochondrial (13 proteins) data now robustly associate Nucleariida and Fungi as neighbors, an assemblage that we term 'Holomycota'. With Nucleariida as an outgroup, we revisit unresolved deep fungal relationships. Conclusion Our phylogenomic analysis provides significant support for the paraphyly of the traditional taxon Zygomycota, and contradicts a recent proposal to include Mortierella in a phylum Mucoromycotina. We further question the introduction of separate phyla for Glomeromycota and Blastocladiomycota, whose phylogenetic positions relative to other phyla remain unresolved even with genome-level datasets. Our results motivate broad sampling of additional genome sequences from these phyla.

  14. A background error covariance model of significant wave height employing Monte Carlo simulation

    Institute of Scientific and Technical Information of China (English)

    GUO Yanyou; HOU Yijun; ZHANG Chunmei; YANG Jie

    2012-01-01

    The quality of background error statistics is one of the key components for successful assimilation of observations in a numerical model. The background error covariance (BEC) of ocean waves is generally estimated under an assumption that it is stationary over a period of time and uniform over a domain. However, error statistics are in fact functions of the physical processes governing the meteorological situation and vary with the wave condition. In this paper, we simulated the BEC of the significant wave height (SWH) employing Monte Carlo methods. An interesting result is that the BEC varies consistently with the mean wave direction (MWD). In the model domain, the BEC of the SWH decreases significantly when the MWD changes abruptly. A new BEC model of the SWH based on the correlation between the BEC and MWD was then developed. A case study of regional data assimilation was performed, where the SWH observations of buoy 22001 were used to assess the SWH hindcast. The results show that the new BEC model benefits wave prediction and allows reasonable approximations of anisotropic and inhomogeneous errors.
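A toy version of the Monte Carlo BEC estimation this record describes: draw many spatially correlated SWH error samples and average their outer products to recover the covariance. The 5-point grid, error variance, and Gaussian correlation length are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
x = np.linspace(0, 200, 5)                       # grid positions (km)
# assumed "true" covariance: Gaussian correlation, length 60 km, variance 0.25 m^2
true_cov = 0.25 * np.exp(-(x[:, None] - x[None, :]) ** 2 / (2 * 60.0 ** 2))
L = np.linalg.cholesky(true_cov)

n_samples = 5000
perturbations = (L @ rng.normal(size=(5, n_samples))).T   # SWH errors (m)
bec = perturbations.T @ perturbations / n_samples          # Monte Carlo estimate
print(np.round(np.diag(bec), 3))
```

With enough samples the estimate converges to the generating covariance; in the record the same averaging is done over simulated wave-model errors, so the estimated BEC can vary with wave condition rather than being fixed.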

  15. Comparing model predictions for ecosystem-based management

    DEFF Research Database (Denmark)

    Jacobsen, Nis Sand; Essington, Timothy E.; Andersen, Ken Haste

    2016-01-01

    Ecosystem modeling is becoming an integral part of fisheries management, but there is a need to identify differences between predictions derived from models employed for scientific and management purposes. Here, we compared two models: a biomass-based food-web model (Ecopath with Ecosim (EwE)) and a size-structured fish community model. The models were compared with respect to predicted ecological consequences of fishing to identify commonalities and differences in model predictions for the California Current fish community. We compared the models regarding direct and indirect responses to fishing on one or more species. The size-based model predicted a higher fishing mortality needed to reach maximum sustainable yield than EwE for most species. The size-based model also predicted stronger top-down effects of predator removals than EwE. In contrast, EwE predicted stronger bottom-up effects

  16. Modeling and Prediction of Hot Deformation Flow Curves

    Science.gov (United States)

    Mirzadeh, Hamed; Cabrera, Jose Maria; Najafizadeh, Abbas

    2012-01-01

    The modeling of hot flow stress and prediction of flow curves for unseen deformation conditions are important in metal-forming processes because any feasible mathematical simulation needs accurate flow description. In the current work, in an attempt to summarize, generalize, and introduce efficient methods, the dynamic recrystallization (DRX) flow curves of a 17-4 PH martensitic precipitation hardening stainless steel, a medium carbon microalloyed steel, and a 304 H austenitic stainless steel were modeled and predicted using (1) a hyperbolic sine equation with strain-dependent constants, (2) a developed constitutive equation in a simple normalized stress-normalized strain form and its modified version, and (3) a feed-forward artificial neural network (ANN). These methods were critically discussed, and the ANN technique was found to be the best for modeling the available flow curves; however, the developed constitutive equation showed slightly better performance than the ANN and significantly better predicted values than the hyperbolic sine equation in the prediction of flow curves for unseen deformation conditions.
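The hyperbolic sine equation this record refers to is commonly written in the Sellars-Tegart form via the Zener-Hollomon parameter, Z = strain rate × exp(Q/RT) and sigma = (1/alpha)·asinh((Z/A)^(1/n)). The material constants below are assumed for illustration, not the fitted values for the steels in the study.

```python
import math

R = 8.314                        # gas constant, J/(mol K)
Q = 400e3                        # assumed apparent activation energy, J/mol
A, alpha, n = 1e14, 0.012, 4.5   # assumed material constants (alpha in 1/MPa)

def flow_stress(strain_rate, T_kelvin):
    # Zener-Hollomon parameter combines strain rate and temperature
    Z = strain_rate * math.exp(Q / (R * T_kelvin))
    # Sellars-Tegart hyperbolic sine law, result in MPa for alpha in 1/MPa
    return (1 / alpha) * math.asinh((Z / A) ** (1 / n))

print(round(flow_stress(0.1, 1273.0), 1))
```

The form behaves as expected: stress rises with strain rate and falls with temperature, which is the behavior the strain-dependent constants in the record's version capture along the whole flow curve.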

  17. PREDICTIVE SIGNIFICANCE OF ANTI-HLA AUTOANTIBODIES IN HEART TRANSPLANT RECIPIENTS

    Directory of Open Access Journals (Sweden)

    O. P. Shevchenko

    2013-01-01

    Full Text Available Aim. The aim of this study was to define the role of preformed anti-HLA antibodies (anti-HLA) in antibody-mediated rejection (AMR) and cardiac allograft vasculopathy (CAV) after heart transplantation. Materials and Methods. 140 heart transplant recipients were followed after heart transplantation performed for dilated (n = 106) or ischemic (n = 34) cardiomyopathy. Anti-HLA was determined before transplantation by ELISA. Results. Recipients were divided into two groups: anti-HLA positive (n = 45; 32.1%) and anti-HLA negative (n = 95; 67.9%). The incidence of AMR was 12 (26.67%) in the anti-HLA positive group and 11 (11.58%) in the anti-HLA negative group. The risk of AMR was significantly higher in anti-HLA positive recipients (RR 2.3; 95% CI 1.02-4.81; p = 0.03). During the first three years after transplantation, CAV was diagnosed in 9 (20%) of anti-HLA positive recipients and in 7 (6.8%) of patients without anti-HLA (RR 2.7; 95% CI 1.08-6.82; p = 0.03). Survival in freedom from CAV was much higher in anti-HLA negative than in anti-HLA positive recipients (0.89 ± 0.07 vs 0.72 ± 0.06, respectively; p = 0.02). Conclusions. The presence of preformed anti-HLA antibodies in candidates for heart transplantation increases the risk of AMR and CAV post transplantation 2.3-fold and 2.7-fold, respectively.
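The relative risk reported in this record can be reproduced from the quoted counts (12/45 AMR events in anti-HLA positive vs 11/95 in anti-HLA negative recipients). The sketch below uses the standard log-scale CI formula; small differences from the published lower limit can arise from the CI method used.

```python
import math

a, n1 = 12, 45          # events / total, anti-HLA positive group
b, n2 = 11, 95          # events / total, anti-HLA negative group
rr = (a / n1) / (b / n2)

# 95% CI on the log scale (Katz method)
se = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)
lo, hi = rr * math.exp(-1.96 * se), rr * math.exp(1.96 * se)
print(round(rr, 2), round(lo, 2), round(hi, 2))
```

The point estimate comes out at 2.30, matching the abstract's RR of 2.3.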

  18. Confronting species distribution model predictions with species functional traits.

    Science.gov (United States)

    Wittmann, Marion E; Barnes, Matthew A; Jerde, Christopher L; Jones, Lisa A; Lodge, David M

    2016-02-01

    Species distribution models are valuable tools in studies of biogeography, ecology, and climate change and have been used to inform conservation and ecosystem management. However, species distribution models typically incorporate only climatic variables and species presence data. Model development or validation rarely considers functional components of species traits or other types of biological data. We implemented a species distribution model (Maxent) to predict global climate habitat suitability for Grass Carp (Ctenopharyngodon idella). We then tested the relationship between the degree of climate habitat suitability predicted by Maxent and the individual growth rates of both wild (N = 17) and stocked (N = 51) Grass Carp populations using correlation analysis. The Grass Carp Maxent model accurately reflected the global occurrence data (AUC = 0.904). Observations of Grass Carp growth rate covered six continents and ranged from 0.19 to 20.1 g day(-1). Species distribution model predictions were correlated (r = 0.5, 95% CI (0.03, 0.79)) with observed growth rates for wild Grass Carp populations but were not correlated (r = -0.26, 95% CI (-0.5, 0.012)) with stocked populations. Further, a review of the literature indicates that the few studies for other species that have previously assessed the relationship between the degree of predicted climate habitat suitability and species functional traits have also discovered significant relationships. Thus, species distribution models may provide inferences beyond just where a species may occur, providing a useful tool to understand the linkage between species distributions and underlying biological mechanisms.
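The wild-population result in this record is a Pearson correlation with a 95% confidence interval; a common way to obtain the interval is the Fisher z-transform, sketched below. The 17 suitability/growth pairs are synthetic stand-ins for the study data.

```python
import math
from scipy import stats

suitability = [0.12, 0.25, 0.31, 0.38, 0.42, 0.47, 0.51, 0.55, 0.58,
               0.63, 0.66, 0.70, 0.74, 0.79, 0.83, 0.88, 0.93]
growth = [0.3, 2.1, 1.0, 4.2, 2.8, 6.0, 3.5, 7.9, 5.1,
          9.4, 6.8, 11.2, 8.0, 13.5, 10.1, 15.8, 12.0]   # g/day, illustrative

r, p = stats.pearsonr(suitability, growth)
# 95% CI via Fisher z-transform: atanh(r) is approximately normal
z = math.atanh(r)
half = 1.96 / math.sqrt(len(growth) - 3)
ci = (math.tanh(z - half), math.tanh(z + half))
print(round(r, 2), round(p, 4), tuple(round(c, 2) for c in ci))
```

A CI that excludes zero, as for the wild populations in the record, is what supports a real link between predicted suitability and the functional trait.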

  19. Simple clinical variables predict liver histology in hepatitis C: prospective validation of a clinical prediction model.

    Science.gov (United States)

    Romagnuolo, Joseph; Andrews, Christopher N; Bain, Vincent G; Bonacini, Maurizio; Cotler, Scott J; Ma, Mang; Sherman, Morris

    2005-11-01

    A recent single-center multivariate analysis of hepatitis C (HCV) patients showed that meeting any two of the criteria 1) ferritin > or = 200 microg/l, 2) spider nevi, and 3) low albumin predicted significant fibrosis; this study prospectively validated that clinical prediction model in an independent multicenter sample. Eighty-one patients with previously untreated active chronic HCV underwent physical examination, laboratory investigation, and liver biopsy. Biopsies were read, in blinded fashion, by a single pathologist, using a modified Hytiroglou (1995) scale. The clinical scoring system was correlated with histology; likelihood ratios (LRs), Fisher's exact p-values, and receiver operating characteristic (ROC) curves were calculated. Data recording was complete regarding fibrotic stage in 77 patients and inflammatory grade in 38 patients. For fibrosis, 3/3 patients with any three criteria (LR 17; positive predictive value (PPV) 100%), 4/5 patients with any two criteria (LR 5.1), and 15/47 with no criteria (LR 0.6; negative predictive value (NPV) 68%) had stage 2 or greater fibrosis on biopsy (p = 0.01). For inflammation, 5/5 patients with both criteria (LR 15; PPV 100%) and 8/19 patients with no criteria (LR 0.5; NPV 58%) had moderate-severe inflammation on liver biopsy (p = 0.036). When missing variables were assumed to be normal, recalculated LRs were almost identical. This independent data set validates our published model, which uses simple clinical variables to accurately and significantly predict hepatic fibrosis and inflammation in HCV patients.
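
The likelihood ratios and predictive values reported above all derive from a 2x2 table; a minimal sketch with illustrative counts (not the study's data):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, predictive values, and likelihood ratios."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    lr_pos = sens / (1 - spec)   # LR+: how much a positive test raises odds
    lr_neg = (1 - sens) / spec   # LR-: how much a negative test lowers odds
    return dict(sens=sens, spec=spec, ppv=ppv, npv=npv,
                lr_pos=lr_pos, lr_neg=lr_neg)

# Illustrative counts only (tp = criteria-positive with fibrosis, etc.)
m = diagnostic_metrics(tp=40, fp=10, fn=20, tn=30)
```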

  20. Validating predictions from climate envelope models

    Science.gov (United States)

    Watling, J.; Bucklin, D.; Speroterra, C.; Brandt, L.; Cabal, C.; Romañach, Stephanie S.; Mazzotti, Frank J.

    2013-01-01

    Climate envelope models are a potentially important conservation tool, but their ability to accurately forecast species’ distributional shifts using independent survey data has not been fully evaluated. We created climate envelope models for 12 species of North American breeding birds previously shown to have experienced poleward range shifts. For each species, we evaluated three different approaches to climate envelope modeling that differed in the way they treated climate-induced range expansion and contraction, using random forests and maximum entropy modeling algorithms. All models were calibrated using occurrence data from 1967–1971 (t1) and evaluated using occurrence data from 1998–2002 (t2). Model sensitivity (the ability to correctly classify species presences) was greater using the maximum entropy algorithm than the random forest algorithm. Although sensitivity did not differ significantly among approaches, for many species, sensitivity was maximized using a hybrid approach that assumed range expansion, but not contraction, in t2. Species for which the hybrid approach resulted in the greatest improvement in sensitivity have been reported from more land cover types than species for which there was little difference in sensitivity between hybrid and dynamic approaches, suggesting that habitat generalists may be buffered somewhat against climate-induced range contractions. Specificity (the ability to correctly classify species absences) was maximized using the random forest algorithm and was lowest using the hybrid approach. Overall, our results suggest cautious optimism for the use of climate envelope models to forecast range shifts, but also underscore the importance of considering non-climate drivers of species range limits. The use of alternative climate envelope models that make different assumptions about range expansion and contraction is a new and potentially useful way to help inform our understanding of climate change effects on species.

  1. Introducing Model Predictive Control for Improving Power Plant Portfolio Performance

    DEFF Research Database (Denmark)

    Edlund, Kristian Skjoldborg; Bendtsen, Jan Dimon; Børresen, Simon

    2008-01-01

    This paper introduces a model predictive control (MPC) approach for construction of a controller for balancing the power generation against consumption in a power system. The objective of the controller is to coordinate a portfolio consisting of multiple power plant units in the effort to perform...... reference tracking and disturbance rejection in an economically optimal way. The performance function is chosen as a mixture of the ℓ1-norm and a linear weighting to model the economics of the system. Simulations show a significant improvement of the performance of the MPC compared to the current...

  2. Remaining Useful Lifetime (RUL) - Probabilistic Predictive Model

    Directory of Open Access Journals (Sweden)

    Ephraim Suhir

    2011-01-01

    Full Text Available Reliability evaluations and assurances cannot be delayed until the device (system) is fabricated and put into operation. Reliability of an electronic product should be conceived at the early stages of its design; implemented during manufacturing; evaluated by electrical, optical and mechanical measurements and testing (considering customer requirements and the existing specifications); checked (screened) during manufacturing (fabrication); and, if necessary and appropriate, maintained in the field during the product's operation. A simple and physically meaningful probabilistic predictive model is suggested for the evaluation of the remaining useful lifetime (RUL) of an electronic device (system) after an appreciable deviation from its normal operation conditions has been detected, and the increase in the failure rate and the change in the configuration of the wear-out portion of the bathtub curve have been assessed. The general concepts are illustrated by numerical examples. The model can be employed, along with other PHM forecasting and inference tools and means, to evaluate and to maintain a high level of reliability (probability of non-failure) of a device (system) at the operation stage of its lifetime.
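
As one deliberately simple illustration of re-estimating remaining useful lifetime once an elevated failure rate has been detected, assume a constant-failure-rate (exponential) reliability model; the λ values below are invented, and the article's model is more general than this:

```python
import math

def rul_for_target_reliability(lam, r_target):
    """Time until reliability R(t) = exp(-lam * t) falls to r_target."""
    return -math.log(r_target) / lam

lam_nominal = 0.01    # failures per hour before the anomaly (assumed)
lam_degraded = 0.04   # elevated failure rate after the detected deviation
rul_before = rul_for_target_reliability(lam_nominal, 0.9)
rul_after = rul_for_target_reliability(lam_degraded, 0.9)
```

Quadrupling the failure rate cuts the time to any fixed reliability target by a factor of four.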

  3. A Predictive Model of Geosynchronous Magnetopause Crossings

    CERN Document Server

    Dmitriev, A; Chao, J -K

    2013-01-01

    We have developed a model predicting whether or not the magnetopause crosses geosynchronous orbit at a given location for a given solar wind pressure Psw, Bz component of the interplanetary magnetic field (IMF), and geomagnetic conditions characterized by the 1-min SYM-H index. The model is based on more than 300 geosynchronous magnetopause crossings (GMCs) and about 6000 minutes during which geosynchronous satellites of the GOES and LANL series were located in the magnetosheath (so-called MSh intervals) from 1994 to 2001. Minimizing the Psw required for GMCs and MSh intervals at various locations, Bz, and SYM-H allows describing both the effect of magnetopause dawn-dusk asymmetry and the saturation of the Bz influence for very large southward IMF. The asymmetry is strong for large negative Bz and almost disappears when Bz is positive. We found that the larger the amplitude of negative SYM-H, the lower the solar wind pressure required for GMCs. We attribute this effect to a depletion of the dayside magnetic field by a storm-time intensification of t...

  4. Predictive modeling for EBPC in EBDW

    Science.gov (United States)

    Zimmermann, Rainer; Schulz, Martin; Hoppe, Wolfgang; Stock, Hans-Jürgen; Demmerle, Wolfgang; Zepka, Alex; Isoyan, Artak; Bomholt, Lars; Manakli, Serdar; Pain, Laurent

    2009-10-01

    We demonstrate a flow for e-beam proximity correction (EBPC) in e-beam direct write (EBDW) wafer manufacturing processes, covering all steps from the generation of a test pattern for (experimental or virtual) measurement data creation, through e-beam model fitting and proximity effect correction (PEC), to verification of the results. We base our approach on a predictive, physical e-beam simulation tool, with the possibility to complement this with experimental data, and the goal of preparing the EBPC methods for the advent of high-volume EBDW tools. As an example, we apply and compare dose correction and geometric correction for low and high electron energies on 1D and 2D test patterns. In particular, we show some results of model-based geometric correction as it is typical for the optical case, but enhanced for the particularities of e-beam technology. The results are used to discuss PEC strategies, with respect to short and long range effects.

  5. Significance of settling model structures and parameter subsets in modelling WWTPs under wet-weather flow and filamentous bulking conditions.

    Science.gov (United States)

    Ramin, Elham; Sin, Gürkan; Mikkelsen, Peter Steen; Plósz, Benedek Gy

    2014-10-15

    Current research focuses on predicting and mitigating the impacts of high hydraulic loadings on centralized wastewater treatment plants (WWTPs) under wet-weather conditions. The maximum permissible inflow to WWTPs depends not only on the settleability of activated sludge in secondary settling tanks (SSTs) but also on the hydraulic behaviour of SSTs. The present study investigates the impacts of ideal and non-ideal flow (dry and wet weather) and settling (good settling and bulking) boundary conditions on the sensitivity of WWTP model outputs to uncertainties intrinsic to the one-dimensional (1-D) SST model structures and parameters. We identify the critical sources of uncertainty in WWTP models through global sensitivity analysis (GSA) using the Benchmark simulation model No. 1 in combination with first- and second-order 1-D SST models. The results obtained illustrate that the contribution of settling parameters to the total variance of the key WWTP process outputs significantly depends on the influent flow and settling conditions. The magnitude of the impact is found to vary, depending on which type of 1-D SST model is used. Therefore, we identify and recommend potential parameter subsets for WWTP model calibration, and propose an optimal choice of 1-D SST models under different flow and settling boundary conditions. Additionally, the hydraulic parameters in the second-order SST model are found significant under dynamic wet-weather flow conditions. These results highlight the importance of developing a more mechanistic, flow-dependent hydraulic sub-model in second-order 1-D SST models in the future.
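
For intuition about what a variance-based GSA computes, consider a linear model with independent inputs, where the first-order Sobol index has the closed form S_i = a_i^2 sigma_i^2 / Var(Y); a toy sketch, unrelated to the actual WWTP model:

```python
def first_order_sobol_linear(coeffs, variances):
    """First-order Sobol indices for Y = sum_i a_i * X_i, X_i independent.

    For a linear model the indices are exact: each input's share of the
    output variance is a_i^2 * Var(X_i) / Var(Y).
    """
    total_var = sum(a * a * v for a, v in zip(coeffs, variances))
    return [a * a * v / total_var for a, v in zip(coeffs, variances)]

# Toy example: three parameters with different sensitivities/uncertainties
S = first_order_sobol_linear(coeffs=[2.0, 1.0, 0.5],
                             variances=[1.0, 0.25, 1.0])
```

In real GSA studies the indices are estimated by Monte Carlo sampling of the full nonlinear model, but they answer the same question: which parameter's uncertainty drives the output variance.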

  6. Bayesian prediction of placebo analgesia in an instrumental learning model

    Science.gov (United States)

    Jung, Won-Mo; Lee, Ye-Seul; Wallraven, Christian; Chae, Younbyoung

    2017-01-01

    Placebo analgesia can be primarily explained by the Pavlovian conditioning paradigm in which a passively applied cue becomes associated with less pain. In contrast, instrumental conditioning employs an active paradigm that might be more similar to clinical settings. In the present study, an instrumental conditioning paradigm involving a modified trust game in a simulated clinical situation was used to induce placebo analgesia. Additionally, Bayesian modeling was applied to predict the placebo responses of individuals based on their choices. Twenty-four participants engaged in a medical trust game in which decisions to receive treatment from either a doctor (more effective with high cost) or a pharmacy (less effective with low cost) were made after receiving a reference pain stimulus. In the conditioning session, the participants received lower levels of pain following both choices, while high pain stimuli were administered in the test session even after making the decision. The choice-dependent pain in the conditioning session was modulated in terms of both intensity and uncertainty. Participants reported significantly less pain when they chose the doctor or the pharmacy for treatment compared to the control trials. The predicted pain ratings based on Bayesian modeling showed significant correlations with the actual reports from participants for both of the choice categories. The instrumental conditioning paradigm allowed for the active choice of optional cues and was able to induce the placebo analgesia effect. Additionally, Bayesian modeling successfully predicted pain ratings in a simulated clinical situation that fits well with placebo analgesia induced by instrumental conditioning. PMID:28225816
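
A common formalization of the Bayesian view of placebo analgesia is precision-weighted averaging of the expectation induced by the treatment choice and the nociceptive input; a generic Gaussian sketch (parameters invented, not the study's fitted model):

```python
def posterior_pain(mu_prior, var_prior, mu_sensory, var_sensory):
    """Posterior mean/variance for two Gaussian sources (conjugate update)."""
    w_prior = 1.0 / var_prior      # precision of the expectation
    w_sens = 1.0 / var_sensory     # precision of the nociceptive input
    mu = (w_prior * mu_prior + w_sens * mu_sensory) / (w_prior + w_sens)
    var = 1.0 / (w_prior + w_sens)
    return mu, var

# Treatment cue creates a low-pain expectation; the actual stimulus is high
mu, var = posterior_pain(mu_prior=3.0, var_prior=1.0,
                         mu_sensory=7.0, var_sensory=1.0)
```

The perceived pain (posterior mean) is pulled below the physical input toward the expectation, which is the placebo effect in this framework.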

  7. Preservation engineering assets developed from an oxidation predictive model

    Directory of Open Access Journals (Sweden)

    Coutelieris Frank A.

    2016-01-01

    Full Text Available A previously developed model which effectively predicts the probability of olive oil reaching the end of its shelf-life within a certain time frame was tested for its response when the convective diffusion of oxygen through the packaging material is taken into account. Darcy's law was used to correlate the packaging permeability with the oxygen flow through the packaging materials. Mass transport within the food-packaging system was considered transient, and the corresponding one-dimensional differential equations, along with appropriate initial and boundary conditions, were solved numerically. When the Peclet (Pe) number was used to validate the significance of the oxygen transport mechanism through the packaging, the model results confirmed the Arrhenius-type dependency of diffusion, with the slope of the line for each material indicating its -Ea/R. Furthermore, Pe could not be correlated to the hexanal produced in samples stored under light; photo-oxidation has a significant role in the oxidative degradation of olive oil, as confirmed by the shelf-life assessment test. The validity of our model for oxygen-diffusion-driven systems was also confirmed, and on that basis its predictive boundaries were set. The results indicated the value of applying a shelf-life assessment process via this model to confirm the packaging selection for oxygen-sensitive foods.
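
The Arrhenius-type dependency noted above (slope of ln D versus 1/T equal to -Ea/R) can be recovered with a simple linear fit; a sketch with synthetic diffusivities (D0 and Ea are invented values, not the paper's):

```python
import numpy as np

R = 8.314           # gas constant, J mol^-1 K^-1
Ea_true = 50_000.0  # assumed activation energy, J/mol
D0 = 1e-6           # assumed pre-exponential factor

T = np.array([283.0, 293.0, 303.0, 313.0, 323.0])  # temperatures, K
D = D0 * np.exp(-Ea_true / (R * T))                # Arrhenius diffusivity

# ln D = ln D0 - (Ea/R) * (1/T): the slope of the fit is -Ea/R
slope, intercept = np.polyfit(1.0 / T, np.log(D), 1)
Ea_fit = -slope * R
```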

  8. Interpolation techniques in robust constrained model predictive control

    Science.gov (United States)

    Kheawhom, Soorathep; Bumroongsri, Pornchai

    2017-05-01

    This work investigates interpolation techniques that can be employed in off-line robust constrained model predictive control for a discrete time-varying system. A sequence of feedback gains is determined by solving off-line a series of optimal control optimization problems. A sequence of nested corresponding robustly positive invariant sets, either ellipsoidal or polyhedral, is then constructed. At each sampling time, the smallest invariant set containing the current state is determined. If the current invariant set is the innermost set, the pre-computed gain associated with the innermost set is applied. Otherwise, the feedback gain is determined by linear interpolation of the pre-computed gains. The proposed algorithms are illustrated with case studies of a two-tank system. The simulation results show that the proposed interpolation techniques significantly improve the control performance of off-line robust model predictive control without sacrificing much on-line computational performance.
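
The interpolation idea can be sketched in a few lines: gains pre-computed off-line for nested invariant sets are blended according to where the current state lies between two sets (the gain matrices and the interpolation variable below are invented placeholders, not from the paper's two-tank study):

```python
import numpy as np

def interpolated_gain(K_inner, K_outer, lam):
    """Linear interpolation between pre-computed feedback gains.

    lam = 0 -> state lies in the innermost set (use K_inner);
    lam = 1 -> state lies at the boundary of the outer set (use K_outer).
    """
    return (1.0 - lam) * K_inner + lam * K_outer

K_inner = np.array([[1.2, 0.4]])   # aggressive gain, valid near the origin
K_outer = np.array([[0.6, 0.1]])   # cautious gain for the larger set
x = np.array([0.5, -0.2])          # current state (placeholder)

K = interpolated_gain(K_inner, K_outer, lam=0.25)
u = -K @ x                          # state feedback u = -Kx
```

In the actual scheme, lam would be computed on-line from which pair of nested invariant sets brackets the current state.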

  9. Model for predicting mountain wave field uncertainties

    Science.gov (United States)

    Damiens, Florentin; Lott, François; Millet, Christophe; Plougonven, Riwal

    2017-04-01

    Studying the propagation of acoustic waves through the troposphere requires knowledge of wind speed and temperature gradients from the ground up to about 10-20 km. Typical planetary boundary layer flows are known to present vertical low-level shears that can interact with mountain waves, thereby triggering small-scale disturbances. Resolving these fluctuations for long-range propagation problems is, however, not feasible because of computer memory/time restrictions, and thus they need to be parameterized. When the disturbances are small enough, these fluctuations can be described by linear equations. Previous works by the co-authors have shown that the critical layer dynamics that occur near the ground produce large horizontal flows and buoyancy disturbances that result in intense downslope winds and gravity wave breaking. While these phenomena manifest almost systematically for high Richardson numbers and when the boundary layer depth is relatively small compared to the mountain height, the process by which static stability affects downslope winds remains unclear. In the present work, new linear mountain gravity wave solutions are tested against numerical predictions obtained with the Weather Research and Forecasting (WRF) model. For Richardson numbers typically larger than unity, the mesoscale model is used to quantify the effect of neglected nonlinear terms on downslope winds and mountain wave patterns. At these regimes, the large downslope winds transport warm air, a so-called "Foehn" effect that can impact sound propagation properties. The sensitivity of small-scale disturbances to the Richardson number is quantified using two-dimensional spectral analysis. It is shown through a pilot study of subgrid-scale fluctuations of boundary layer flows over realistic mountains that the cross-spectrum of the mountain wave field is made up of the same components found in WRF simulations. The impact of each individual component on acoustic wave propagation is discussed in terms of

  10. Predictive modeling of nanomaterial exposure effects in biological systems

    Directory of Open Access Journals (Sweden)

    Liu X

    2013-09-01

    Full Text Available Xiong Liu,1 Kaizhi Tang,1 Stacey Harper,2 Bryan Harper,2 Jeffery A Steevens,3 Roger Xu1 1Intelligent Automation, Inc., Rockville, MD, USA; 2Department of Environmental and Molecular Toxicology, School of Chemical, Biological, and Environmental Engineering, Oregon State University, Corvallis, OR, USA; 3ERDC Environmental Laboratory, Vicksburg, MS, USA Background: Predictive modeling of the biological effects of nanomaterials is critical for industry and policymakers to assess the potential hazards resulting from the application of engineered nanomaterials. Methods: We generated an experimental dataset on the toxic effects experienced by embryonic zebrafish due to exposure to nanomaterials. Several nanomaterials were studied, such as metal nanoparticles, dendrimers, metal oxides, and polymeric materials. The embryonic zebrafish metric (EZ Metric) was used as a screening-level measurement representative of adverse effects. Using the dataset, we developed a data mining approach to model the toxic endpoints and the overall biological impact of nanomaterials. Data mining techniques, such as numerical prediction, can assist analysts in developing risk assessment models for nanomaterials. Results: We found several important attributes that contribute to the 24 hours post-fertilization (hpf) mortality, such as dosage concentration, shell composition, and surface charge. These findings concur with previous studies on nanomaterial toxicity using embryonic zebrafish. We conducted case studies on modeling the overall effect/impact of nanomaterials and specific toxic endpoints such as mortality, delayed development, and morphological malformations. The results show that we can achieve high prediction accuracy for certain biological effects, such as 24 hpf mortality, 120 hpf mortality, and 120 hpf heart malformation. 
The results also show that the weighting scheme for individual biological effects has a significant influence on modeling the overall impact of

  11. Reliable Estimation of Prediction Uncertainty for Physicochemical Property Models.

    Science.gov (United States)

    Proppe, Jonny; Reiher, Markus

    2017-07-11

    One of the major challenges in computational science is to determine the uncertainty of a virtual measurement, that is, of the prediction of an observable based on calculations. As highly accurate first-principles calculations are generally infeasible for most physical systems, one usually resorts to parametric property models of observables, which require calibration by incorporating reference data. The resulting predictions and their uncertainties are sensitive to systematic errors such as inconsistent reference data, parametric model assumptions, or inadequate computational methods. Here, we discuss the calibration of property models in the light of bootstrapping, a sampling method that can be employed for identifying systematic errors and for reliable estimation of the prediction uncertainty. We apply bootstrapping to assess a linear property model linking the (57)Fe Mössbauer isomer shift to the contact electron density at the iron nucleus for a diverse set of 44 molecular iron compounds. The contact electron density is calculated with 12 density functionals across Jacob's ladder (PWLDA, BP86, BLYP, PW91, PBE, M06-L, TPSS, B3LYP, B3PW91, PBE0, M06, TPSSh). We provide systematic-error diagnostics and reliable, locally resolved uncertainties for isomer-shift predictions. Pure and hybrid density functionals yield average prediction uncertainties of 0.06-0.08 mm s(-1) and 0.04-0.05 mm s(-1), respectively, the latter being close to the average experimental uncertainty of 0.02 mm s(-1). Furthermore, we show that both the model parameters and the prediction uncertainty depend significantly on the composition and number of reference data points. Accordingly, we suggest that rankings of density functionals based on performance measures (e.g., the squared coefficient of correlation, r(2), or the root-mean-square error, RMSE) should not be inferred from a single data set. This study presents the first statistically rigorous calibration analysis for theoretical M
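
The bootstrapping procedure advocated here can be illustrated on a toy linear calibration: resample the reference set with replacement, refit, and take the spread of the refitted predictions as the uncertainty estimate (the data below are synthetic, not the Mössbauer set):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "reference data": property y roughly linear in descriptor x
x = np.linspace(0.0, 1.0, 20)
y = 2.0 * x + 0.5 + rng.normal(0.0, 0.05, size=x.size)

def bootstrap_prediction(x, y, x_new, n_boot=1000):
    """Mean and spread of model predictions at x_new under resampling."""
    preds = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, x.size, size=x.size)  # resample with replacement
        slope, intercept = np.polyfit(x[idx], y[idx], 1)
        preds[b] = slope * x_new + intercept
    return preds.mean(), preds.std(ddof=1)

mean_pred, pred_unc = bootstrap_prediction(x, y, x_new=0.5)
```

The standard deviation of the bootstrap predictions plays the role of the "locally resolved" prediction uncertainty discussed in the abstract.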

  12. Predictive models for population performance on real biological fitness landscapes.

    Science.gov (United States)

    Rowe, William; Wedge, David C; Platt, Mark; Kell, Douglas B; Knowles, Joshua

    2010-09-01

    Directed evolution, in addition to its principal application of obtaining novel biomolecules, offers significant potential as a vehicle for obtaining useful information about the topologies of biomolecular fitness landscapes. In this article, we make use of a special type of model of fitness landscapes, based on finite state machines, which can be inferred from directed evolution experiments. Importantly, the model is constructed only from the fitness data and phylogeny, not sequence or structural information, which is often absent. The model, called a landscape state machine (LSM), has already been used successfully in the evolutionary computation literature to model the landscapes of artificial optimization problems. Here, we use the method for the first time to simulate a biological fitness landscape based on experimental evaluation. We demonstrate in this study that LSMs are capable not only of representing the structure of model fitness landscapes such as NK-landscapes, but also the fitness landscape of real DNA oligomers binding to a protein (allophycocyanin), data we derived from experimental evaluations on microarrays. The LSMs prove adept at modelling the progress of evolution as a function of various controlling parameters, as validated by evaluations on the real landscapes. Specifically, the ability of the model to 'predict' optimal mutation rates and other parameters of the evolution is demonstrated. A modification to the standard LSM also proves accurate at predicting the effects of recombination on the evolution.

  13. Neural Fuzzy Inference System-Based Weather Prediction Model and Its Precipitation Predicting Experiment

    Directory of Open Access Journals (Sweden)

    Jing Lu

    2014-11-01

    Full Text Available We propose a weather prediction model in this article based on a neural network and fuzzy inference system (NFIS-WPM), and then apply it to predict daily fuzzy precipitation given meteorological premises for testing. The model consists of two parts: the first part is the “fuzzy rule-based neural network”, which simulates sequential relations among fuzzy sets using an artificial neural network; and the second part is the “neural fuzzy inference system”, which is based on the first part but can learn new fuzzy rules from the previous ones according to the algorithm we propose. NFIS-WPM (High Pro) and NFIS-WPM (Ave) are improved versions of this model. The need for accurate weather prediction is apparent when considering its benefits. However, the excessive pursuit of accuracy in weather prediction makes some of the “accurate” prediction results meaningless, and numerical prediction models are often complex and time-consuming. By adapting this novel model to a precipitation prediction problem, we make the predicted outcomes of precipitation more accurate and the prediction methods simpler than using a complex numerical forecasting model that would occupy large computation resources, be time-consuming, and have a low predictive accuracy rate. Accordingly, we achieve more accurate predictive precipitation results than by using traditional artificial neural networks that have low predictive accuracy.

  14. RFI modeling and prediction approach for SATOP applications: RFI prediction models

    Science.gov (United States)

    Nguyen, Tien M.; Tran, Hien T.; Wang, Zhonghai; Coons, Amanda; Nguyen, Charles C.; Lane, Steven A.; Pham, Khanh D.; Chen, Genshe; Wang, Gang

    2016-05-01

    This paper describes a technical approach for the development of RFI prediction models using a carrier synchronization loop when calculating the Bit or Carrier SNR degradation due to interference, for (i) detecting narrow-band and wideband RFI signals, and (ii) estimating and predicting the behavior of the RFI signals. The paper presents analytical and simulation models and provides both analytical and simulation results on the performance of USB (Unified S-Band) waveforms in the presence of narrow-band and wideband RFI signals. The models presented in this paper will allow future USB command systems to detect the RFI presence, estimate the RFI characteristics, and predict the RFI behavior in real time for accurate assessment of the impacts of RFI on the command Bit Error Rate (BER) performance. The command BER degradation model presented in this paper also allows the ground system operator to estimate the optimum transmitted SNR to maintain a required command BER level in the presence of both friendly and unfriendly RFI sources.

  15. Regional differences in prediction models of lung function in Germany

    Directory of Open Access Journals (Sweden)

    Schäper Christoph

    2010-04-01

    Full Text Available Abstract Background Little is known about the influencing potential of specific characteristics on lung function in different populations. The aim of this analysis was to determine whether lung function determinants differ between subpopulations within Germany and whether prediction equations developed for one subpopulation are also adequate for another subpopulation. Methods Within three studies (KORA C, SHIP-I, ECRHS-I) in different areas of Germany, 4059 adults performed lung function tests. The available data consisted of forced expiratory volume in one second (FEV1), forced vital capacity (FVC) and peak expiratory flow rate (PEF). For each study, multivariate regression models were developed to predict lung function, and Bland-Altman plots were established to evaluate the agreement between predicted and measured values. Results The final regression equations for FEV1 and FVC showed adjusted r-square values between 0.65 and 0.75, and for PEF they were between 0.46 and 0.61. In all studies gender, age, height and pack-years were significant determinants, each with a similar effect size. Regarding other predictors there were some, although not statistically significant, differences between the studies. Bland-Altman plots indicated that the regression models for each individual study adequately predict medium (i.e. normal, but not extremely high or low) lung function values in the whole study population. Conclusions Simple models with gender, age and height explain a substantial part of lung function variance, whereas further determinants add less than 5% to the total explained r-squared, at least for FEV1 and FVC. Thus, for different adult subpopulations of Germany, one simple model for each lung function measure is still sufficient.
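
The Bland-Altman agreement check used in this study reduces to the bias (mean difference) and the 95% limits of agreement; a minimal sketch with invented measured/predicted values:

```python
import numpy as np

def bland_altman(measured, predicted):
    """Bias and 95% limits of agreement between two measurement methods."""
    diff = np.asarray(measured) - np.asarray(predicted)
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

measured = [3.1, 2.8, 4.0, 3.5, 2.9, 3.8]   # e.g. FEV1 in litres (invented)
predicted = [3.0, 3.0, 3.7, 3.6, 2.8, 3.9]
bias, (loa_lo, loa_hi) = bland_altman(measured, predicted)
```

Good agreement means a bias near zero and limits of agreement narrow enough to be clinically acceptable.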

  16. Prediction models : the right tool for the right problem

    NARCIS (Netherlands)

    Kappen, Teus H.; Peelen, Linda M.

    2016-01-01

    PURPOSE OF REVIEW: Perioperative prediction models can help to improve personalized patient care by providing individual risk predictions to both patients and providers. However, the scientific literature on prediction model development and validation can be quite technical and challenging to understand.

  17. Clinical significance of thrombocytosis before preoperative chemoradiotherapy in rectal cancer: predicting pathologic tumor response and oncologic outcome.

    Science.gov (United States)

    Kim, Hye Jin; Choi, Gyu-Seog; Park, Jun Seok; Park, SooYeun; Kawai, Kazushige; Watanabe, Toshiaki

    2015-02-01

    Thrombocytosis is considered an adverse prognostic factor in various malignancies. However, the clinical significance of thrombocytosis in rectal cancer patients is unknown. We investigated the predictive value of thrombocytosis for pathologic tumor response to preoperative chemoradiotherapy (CRT) and oncologic outcomes in patients with rectal cancer. A total of 314 patients who underwent preoperative CRT and subsequent rectal resection for rectal cancer were retrospectively evaluated at two tertiary institutions. Univariate and multivariate analyses of the clinical parameters were performed to identify markers predictive of a pathologic complete response (pCR). The Kaplan-Meier method was used to estimate 3-year disease-free and overall survival rates. Sixty-nine patients (22%) had thrombocytosis before CRT, which significantly correlated with a large tumor size and advanced tumor depth. Thirty-nine patients (12.4%) achieved a pCR. In the multivariate analyses, pretreatment platelet count was an independent predictor of pCR. Patients with thrombocytosis had lower 3-year disease-free (P = 0.037) and overall survival (P = 0.001) rates than patients with normal pretreatment platelet counts. Thrombocytosis is a negative predictive factor for a pCR and has an adverse impact on survival in rectal cancer. The predictive value of this easily available clinical factor should not be underestimated, and better therapeutic strategies for these tumors are required.

  18. Foundation Settlement Prediction Based on a Novel NGM Model

    Directory of Open Access Journals (Sweden)

    Peng-Yu Chen

    2014-01-01

    Full Text Available Prediction of foundation or subgrade settlement is very important during engineering construction. Given that many settlement-time sequences exhibit a nonhomogeneous index (exponential) trend, a novel grey forecasting model, called the NGM (1,1,k,c) model, is proposed in this paper. With an optimized whitenization differential equation, the proposed NGM (1,1,k,c) model has the property of white exponential law coincidence and can precisely predict a pure nonhomogeneous index sequence. We used two case studies to verify the predictive effect of the NGM (1,1,k,c) model for settlement prediction. The results show that this model achieves excellent prediction accuracy; thus, it is well suited to the simulation and prediction of approximately nonhomogeneous index sequences and has excellent application value in settlement prediction.
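
The NGM (1,1,k,c) model extends the classic GM(1,1) grey model. The base GM(1,1), sketched below in its standard textbook form (not the authors' NGM variant), already handles near-exponential settlement series; the input numbers are invented:

```python
import numpy as np

def gm11_forecast(x0, steps=1):
    """Classic GM(1,1) grey forecast of the next `steps` values of x0."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                     # accumulated (AGO) sequence
    z = 0.5 * (x1[1:] + x1[:-1])           # background values
    B = np.column_stack([-z, np.ones_like(z)])
    Y = x0[1:]
    (a, b), *_ = np.linalg.lstsq(B, Y, rcond=None)
    # Whitenization solution: x1_hat[k] = (x0[0] - b/a) * exp(-a*k) + b/a,
    # with k = 0 corresponding to the first observation
    k = np.arange(len(x0) + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    x0_hat = np.diff(x1_hat, prepend=0.0)  # inverse AGO restores the series
    return x0_hat[len(x0):]

# Settlement-like series with a near-exponential trend (invented numbers)
series = [100.0, 110.0, 121.0, 133.1, 146.41]
pred = gm11_forecast(series, steps=1)
```

On this purely geometric series (10% growth, true continuation 161.05) the one-step GM(1,1) forecast lands within a fraction of a percent; the NGM (1,1,k,c) variant adds terms to handle sequences that are exponential plus a linear/constant component.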

  19. Models for predicting objective function weights in prostate cancer IMRT

    Energy Technology Data Exchange (ETDEWEB)

    Boutilier, Justin J., E-mail: j.boutilier@mail.utoronto.ca; Lee, Taewoo [Department of Mechanical and Industrial Engineering, University of Toronto, 5 King’s College Road, Toronto, Ontario M5S 3G8 (Canada); Craig, Tim [Radiation Medicine Program, UHN Princess Margaret Cancer Centre, 610 University of Avenue, Toronto, Ontario M5T 2M9, Canada and Department of Radiation Oncology, University of Toronto, 148 - 150 College Street, Toronto, Ontario M5S 3S2 (Canada); Sharpe, Michael B. [Radiation Medicine Program, UHN Princess Margaret Cancer Centre, 610 University of Avenue, Toronto, Ontario M5T 2M9 (Canada); Department of Radiation Oncology, University of Toronto, 148 - 150 College Street, Toronto, Ontario M5S 3S2 (Canada); Techna Institute for the Advancement of Technology for Health, 124 - 100 College Street, Toronto, Ontario M5G 1P5 (Canada); Chan, Timothy C. Y. [Department of Mechanical and Industrial Engineering, University of Toronto, 5 King’s College Road, Toronto, Ontario M5S 3G8, Canada and Techna Institute for the Advancement of Technology for Health, 124 - 100 College Street, Toronto, Ontario M5G 1P5 (Canada)

    2015-04-15

    Purpose: To develop and evaluate the clinical applicability of advanced machine learning models that simultaneously predict multiple optimization objective function weights from patient geometry for intensity-modulated radiation therapy of prostate cancer. Methods: A previously developed inverse optimization method was applied retrospectively to determine optimal objective function weights for 315 treated patients. The authors used an overlap volume ratio (OV) of bladder and rectum for different PTV expansions and overlap volume histogram slopes (OVSR and OVSB for the rectum and bladder, respectively) as explanatory variables that quantify patient geometry. Using the optimal weights as ground truth, the authors trained and applied three prediction models: logistic regression (LR), multinomial logistic regression (MLR), and weighted K-nearest neighbor (KNN). The population average of the optimal objective function weights was also calculated. Results: The OV at 0.4 cm and OVSR at 0.1 cm features were found to be the most predictive of the weights. The authors observed comparable performance (i.e., no statistically significant difference) between LR, MLR, and KNN methodologies, with LR appearing to perform the best. All three machine learning models outperformed the population average by a statistically significant amount over a range of clinical metrics including bladder/rectum V53Gy, bladder/rectum V70Gy, and dose to the bladder, rectum, CTV, and PTV. When comparing the weights directly, the LR model predicted bladder and rectum weights that had, on average, a 73% and 74% relative improvement over the population average weights, respectively. The treatment plans resulting from the LR weights had, on average, a rectum V70Gy that was 35% closer to the clinical plan and a bladder V70Gy that was 29% closer, compared to the population average weights. Similar results were observed for all other clinical metrics. Conclusions: The authors demonstrated that the KNN and MLR
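    The weighted KNN predictor in the record can be sketched as a distance-weighted average of the neighbors' weight vectors, renormalized so the predicted objective-function weights sum to one. The feature tuples (standing in for OV/OVSR values) and all parameters below are illustrative, not the authors' implementation:

```python
import math

def knn_predict_weights(train_feats, train_weights, query, k=3):
    """Distance-weighted K-nearest-neighbor prediction of a weight vector.

    train_feats: feature tuples (e.g. overlap-volume metrics per patient),
    train_weights: objective-function weight vectors (each sums to 1),
    query: feature tuple for a new case.
    """
    dists = [math.dist(query, f) for f in train_feats]
    order = sorted(range(len(dists)), key=lambda i: dists[i])[:k]
    # inverse-distance weights (epsilon avoids division by zero)
    w = [1.0 / (dists[i] + 1e-9) for i in order]
    s = sum(w)
    dim = len(train_weights[0])
    blended = [sum(w[j] * train_weights[order[j]][d] for j in range(len(order))) / s
               for d in range(dim)]
    total = sum(blended)
    return [v / total for v in blended]  # renormalize so weights sum to 1
```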

  20. A grey NGM(1,1, k) self-memory coupling prediction model for energy consumption prediction.

    Science.gov (United States)

    Guo, Xiaojun; Liu, Sifeng; Wu, Lifeng; Tang, Lingling

    2014-01-01

    Energy consumption prediction is an important issue for governments, energy sector investors, and other related corporations. Although there are several prediction techniques, selection of the most appropriate technique is of vital importance. As for the approximate nonhomogeneous exponential data sequence often emerging in the energy system, a novel grey NGM(1,1, k) self-memory coupling prediction model is put forward in order to promote the predictive performance. It achieves organic integration of the self-memory principle of dynamic system and grey NGM(1,1, k) model. The traditional grey model's weakness as being sensitive to initial value can be overcome by the self-memory principle. In this study, total energy, coal, and electricity consumption of China is adopted for demonstration by using the proposed coupling prediction technique. The results show the superiority of NGM(1,1, k) self-memory coupling prediction model when compared with the results from the literature. Its excellent prediction performance lies in that the proposed coupling model can take full advantage of the systematic multitime historical data and catch the stochastic fluctuation tendency. This work also makes a significant contribution to the enrichment of grey prediction theory and the extension of its application span.

  2. Data on clinical significance of second trimester inflammatory biomarkers in the amniotic fluid in predicting preterm delivery

    Directory of Open Access Journals (Sweden)

    Assaad Kesrouani

    2016-12-01

    Data include ROC curves for glucose (Fig. 1), IL-6 (Fig. 2) and MMP-9 (Fig. 3), used to assess sensitivity and specificity in the prediction of premature delivery. Statistical analyses are performed with SPSS v20.0 software. Statistical significance is determined using the Mann–Whitney and one-way ANOVA tests. The association with preterm delivery is assessed using a two-proportions test. Correlations are measured using Pearson's coefficient. A p value < 0.05 is considered statistically significant. The data are presented in the figures provided. The data rely on a previous publication, “Prediction of preterm delivery by second trimester inflammatory biomarkers in the amniotic fluid” (A. Kesrouani, E. Chalhoub, E. El Rassy, M. Germanos, A. Khazzaka, J. Rizkallah, E. Attieh, N. Aouad, 2016) [1].
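    The ROC analysis behind figures like those in this record can be reproduced from raw scores and outcome labels with a few lines; this is a generic sketch (higher biomarker score assumed to indicate the positive class), not the SPSS procedure the authors used:

```python
def roc_points(scores, labels):
    """ROC curve points (FPR, TPR) for a biomarker, sweeping the threshold.

    A higher score is taken to indicate the positive class (e.g. preterm
    delivery); ties between scores are broken arbitrarily.
    """
    pairs = sorted(zip(scores, labels), key=lambda p: -p[0])
    pos = sum(labels)
    neg = len(labels) - pos
    tp = fp = 0
    pts = [(0.0, 0.0)]
    for _score, lab in pairs:
        if lab:
            tp += 1
        else:
            fp += 1
        pts.append((fp / neg, tp / pos))
    return pts

def auc(pts):
    """Trapezoidal area under the ROC curve."""
    return sum((x2 - x1) * (y1 + y2) / 2
               for (x1, y1), (x2, y2) in zip(pts, pts[1:]))
```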

  3. Nonconvex model predictive control for commercial refrigeration

    Science.gov (United States)

    Gybel Hovgaard, Tobias; Boyd, Stephen; Larsen, Lars F. S.; Bagterp Jørgensen, John

    2013-08-01

    We consider the control of a commercial multi-zone refrigeration system that consists of several cooling units sharing a common compressor and is used to cool multiple areas or rooms. In each time period we choose the cooling capacity of each unit and a common evaporation temperature. The goal is to minimise the total energy cost, using real-time electricity prices, while obeying temperature constraints on the zones. We propose a variation on model predictive control to achieve this goal. When the right variables are used, the dynamics of the system are linear and the constraints are convex. The cost function, however, is nonconvex due to the temperature dependence of thermodynamic efficiency. To handle this nonconvexity we propose a sequential convex optimisation method, which typically converges in 5 or fewer iterations. We employ a fast convex quadratic programming solver to carry out the iterations, which is more than fast enough to run in real time. We demonstrate our method on a realistic model, with a full-year simulation and 15-minute time periods, using historical electricity prices and weather data, as well as random variations in thermal load. These simulations show substantial cost savings, on the order of 30%, compared to a standard thermostat-based control system. Perhaps more importantly, we see that the method exhibits a sophisticated response to real-time variations in electricity prices. This demand response is critical to help balance real-time uncertainties in generation capacity associated with large penetration of intermittent renewable energy sources in a future smart grid.

  4. Leptogenesis in minimal predictive seesaw models

    Energy Technology Data Exchange (ETDEWEB)

    Björkeroth, Fredrik [School of Physics and Astronomy, University of Southampton,Southampton, SO17 1BJ (United Kingdom); Anda, Francisco J. de [Departamento de Física, CUCEI, Universidad de Guadalajara,Guadalajara (Mexico); Varzielas, Ivo de Medeiros; King, Stephen F. [School of Physics and Astronomy, University of Southampton,Southampton, SO17 1BJ (United Kingdom)

    2015-10-15

    We estimate the Baryon Asymmetry of the Universe (BAU) arising from leptogenesis within a class of minimal predictive seesaw models involving two right-handed neutrinos and simple Yukawa structures with one texture zero. The two right-handed neutrinos are dominantly responsible for the “atmospheric” and “solar” neutrino masses with Yukawa couplings to (ν{sub e},ν{sub μ},ν{sub τ}) proportional to (0,1,1) and (1,n,n−2), respectively, where n is a positive integer. The neutrino Yukawa matrix is therefore characterised by two proportionality constants with their relative phase providing a leptogenesis-PMNS link, enabling the lightest right-handed neutrino mass to be determined from neutrino data and the observed BAU. We discuss an SU(5) SUSY GUT example, where A{sub 4} vacuum alignment provides the required Yukawa structures with n=3, while a ℤ{sub 9} symmetry fixes the relative phase to be a ninth root of unity.

  5. Decadal prediction skill using a high-resolution climate model

    Science.gov (United States)

    Monerie, Paul-Arthur; Coquart, Laure; Maisonnave, Éric; Moine, Marie-Pierre; Terray, Laurent; Valcke, Sophie

    2017-02-01

    The ability of a high-resolution coupled atmosphere-ocean general circulation model (with a horizontal resolution of a quarter of a degree in the ocean and of about 0.5° in the atmosphere) to predict the annual means of temperature, precipitation, sea-ice volume and extent is assessed based on initialized hindcasts over the 1993-2009 period. Significant skill in predicting sea surface temperatures is obtained, especially over the North Atlantic, the tropical Atlantic and the Indian Ocean. Sea-ice extent and volume are also reasonably well predicted in winter (March) and summer (September). The model skill is mainly due to the external forcing associated with well-mixed greenhouse gases. A decrease in the global warming rate associated with a negative phase of the Pacific Decadal Oscillation is simulated by the model over a suite of 10-year periods when initialized from starting dates between 1999 and 2003. The model's ability to predict regional change is investigated by focusing on the mid-1990s Atlantic Ocean subpolar gyre warming. The model simulates the North Atlantic warming associated with a meridional heat transport increase, a strengthening of the North Atlantic Current and a deepening of the mixed layer over the Labrador Sea. The atmosphere plays a role in the warming through a modulation of the North Atlantic Oscillation: a negative sea level pressure anomaly located south of the subpolar gyre is associated with a wind speed decrease over the subpolar gyre. This leads to reduced oceanic heat loss and favors a northward displacement of anomalously warm and salty subtropical water, both of which contribute to the subpolar gyre warming. We finally conclude that the subpolar gyre warming is mainly triggered by ocean dynamics, with a possible contribution of atmospheric circulation favoring its persistence.

  6. Dinucleotide controlled null models for comparative RNA gene prediction

    Directory of Open Access Journals (Sweden)

    Gesell Tanja

    2008-05-01

    Full Text Available Abstract Background Comparative prediction of RNA structures can be used to identify functional noncoding RNAs in genomic screens. It was shown recently by Babak et al. [BMC Bioinformatics. 8:33] that RNA gene prediction programs can be biased by the genomic dinucleotide content, in particular those programs using a thermodynamic folding model including stacking energies. As a consequence, there is need for dinucleotide-preserving control strategies to assess the significance of such predictions. While there have been randomization algorithms for single sequences for many years, the problem has remained challenging for multiple alignments and there is currently no algorithm available. Results We present a program called SISSIz that simulates multiple alignments of a given average dinucleotide content. Meeting additional requirements of an accurate null model, the randomized alignments are on average of the same sequence diversity and preserve local conservation and gap patterns. We make use of a phylogenetic substitution model that includes overlapping dependencies and site-specific rates. Using fast heuristics and a distance based approach, a tree is estimated under this model which is used to guide the simulations. The new algorithm is tested on vertebrate genomic alignments and the effect on RNA structure predictions is studied. In addition, we directly combined the new null model with the RNAalifold consensus folding algorithm giving a new variant of a thermodynamic structure based RNA gene finding program that is not biased by the dinucleotide content. Conclusion SISSIz implements an efficient algorithm to randomize multiple alignments preserving dinucleotide content. It can be used to get more accurate estimates of false positive rates of existing programs, to produce negative controls for the training of machine learning based programs, or as standalone RNA gene finding program. Other applications in comparative genomics that require
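    SISSIz itself randomizes whole alignments under a phylogenetic model; as an illustrative single-sequence analogue of the control strategy the record describes, a dinucleotide-preserving shuffle can be built as a random Eulerian trail on the dinucleotide multigraph (this is a simple retry-based sketch, not the SISSIz or Altschul-Erickson algorithm):

```python
import random
from collections import Counter, defaultdict

def dinucleotide_shuffle(seq, max_tries=10000, rng=random):
    """Shuffle `seq` while preserving its exact dinucleotide counts.

    Walks a random trail on the dinucleotide multigraph starting from the
    original first character, retrying on dead ends; falls back to the
    input sequence if no complete trail is found within max_tries.
    """
    if len(seq) < 3:
        return seq
    for _ in range(max_tries):
        edges = defaultdict(list)
        for a, b in zip(seq, seq[1:]):
            edges[a].append(b)  # one edge per observed dinucleotide
        for v in edges:
            rng.shuffle(edges[v])
        out = [seq[0]]
        while edges[out[-1]]:
            out.append(edges[out[-1]].pop())
        if len(out) == len(seq):  # all edges consumed: counts preserved
            return "".join(out)
    return seq

def dinucleotide_counts(seq):
    return Counter(zip(seq, seq[1:]))
```

    Such shuffles provide the dinucleotide-controlled negative controls needed to assess the significance of thermodynamic structure predictions.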

  7. Infiltration under snow cover: Modeling approaches and predictive uncertainty

    Science.gov (United States)

    Meeks, Jessica; Moeck, Christian; Brunner, Philip; Hunkeler, Daniel

    2017-03-01

    Groundwater recharge from snowmelt represents a temporal redistribution of precipitation. This is extremely important because the rate and timing of snowpack drainage have substantial consequences for aquifer recharge patterns, which in turn affect groundwater availability throughout the rest of the year. The modeling methods developed to estimate drainage from a snowpack, which typically rely on temporally dense point measurements or temporally limited, spatially dispersed calibration data, range in complexity from the simple degree-day method to more complex and physically based energy balance approaches. While this gamut of snowmelt models is routinely used to aid in water resource management, a comparison of snowmelt models' predictive uncertainties had not previously been done. We therefore established a snowmelt model calibration dataset that is both temporally dense and represents the integrated snowmelt infiltration signal for the Vers Chez le Brandt research catchment, which functions as a rather unique natural lysimeter. We then evaluated the uncertainty associated with the predictions of the degree-day, modified degree-day and energy balance snowmelt models using the null-space Monte Carlo approach. All three melt models underestimate total snowpack drainage, underestimate the rate of early and midwinter drainage and overestimate spring snowmelt rates. The actual rate of snowpack water loss is more constant over the course of the entire winter season than the snowmelt models would imply, indicating that mid-winter melt can contribute as significantly as springtime snowmelt to groundwater recharge in low alpine settings. Further, actual groundwater recharge could be between 2 and 31% greater than snowmelt models suggest over the total winter season. This study shows that snowmelt model predictions can have considerable uncertainty, which may be reduced by the inclusion of more data that allows for the use of more complex approaches such as the energy balance
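    The degree-day method at the simple end of the spectrum described above reduces to one line of arithmetic per day; the melt factor and threshold below are illustrative values, not calibrated for the study catchment:

```python
def degree_day_melt(temps, swe0, ddf=3.0, t0=0.0):
    """Daily snowpack drainage by the simple degree-day method.

    temps: daily mean air temperatures (deg C); swe0: initial snow water
    equivalent (mm). ddf (mm per deg C per day) and t0 (melt threshold,
    deg C) are illustrative, uncalibrated parameters.
    """
    swe = swe0
    melt = []
    for t in temps:
        m = min(max(ddf * (t - t0), 0.0), swe)  # melt limited by the pack
        swe -= m
        melt.append(m)
    return melt, swe
```

    Energy balance approaches replace the `ddf * (t - t0)` term with a full surface energy budget, which is where the extra data requirements (and the reduced predictive uncertainty the study points to) come in.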

  8. Predictability in models of the atmospheric circulation.

    NARCIS (Netherlands)

    Houtekamer, P.L.

    1992-01-01

    It will be clear from the above discussions that skill forecasts are still in their infancy. Operational skill predictions do not exist. One is still struggling to prove that skill predictions, at any range, have any quality at all. It is not clear what the statistics of the analysis error are. The

  9. The effects of model and data complexity on predictions from species distributions models

    DEFF Research Database (Denmark)

    García-Callejas, David; Bastos, Miguel

    2016-01-01

    study contradicts the widely held view that the complexity of species distribution models has significant effects on their predictive ability, while the findings lend support to previous observations that the properties of species distribution data and their relationship with the environment are strong...

  10. Predicting life satisfaction of the Angolan elderly: a structural model.

    Science.gov (United States)

    Gutiérrez, M; Tomás, J M; Galiana, L; Sancho, P; Cebrià, M A

    2013-01-01

    Satisfaction with life is of particular interest in the study of old-age well-being because it has emerged as an important component of old age. A considerable amount of research has been done to explain life satisfaction in the elderly, and there is growing empirical evidence on the best predictors of life satisfaction. This research evaluates the predictive power of several aging-process variables on Angolan elderly people's life satisfaction, while including perceived health in the model. Data for this research come from a cross-sectional survey of elderly people living in the capital of Angola, Luanda. A total of 1003 Angolan elderly were surveyed on socio-demographic information, perceived health, active engagement, generativity, and life satisfaction. A Multiple Indicators Multiple Causes model was built to test the variables' predictive power on life satisfaction. The estimated theoretical model fitted the data well. The main predictors were those related to active engagement with others. Perceived health also had a significant and positive effect on life satisfaction. Several processes together may predict life satisfaction in the elderly population of Angola, and the variance accounted for is large enough to be considered relevant. The key factor associated with life satisfaction seems to be active engagement with others.

  11. Model Predictive Control-Based Fast Charging for Vehicular Batteries

    Directory of Open Access Journals (Sweden)

    Zhibin Song

    2011-08-01

    Full Text Available Battery fast charging is one of the most significant and difficult techniques affecting the commercialization of electric vehicles (EVs). In this paper, we propose a fast-charge framework based on model predictive control, with the aim of simultaneously reducing the charge duration, which represents the out-of-service time of vehicles, and the increase in temperature, which represents safety and energy efficiency during the charge process. An RC model is employed to predict the future State of Charge (SOC). A single-mode lumped-parameter thermal model and a neural network trained on real experimental data are applied to predict the future temperature in simulations and experiments, respectively. A genetic algorithm is then applied to find the best charge sequence under a specified fitness function, which consists of two objectives: minimizing the charging duration and minimizing the increase in temperature. Both simulation and experiment demonstrate that the Pareto front of the proposed method dominates that of the most popular constant-current constant-voltage (CCCV) charge method.
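    The genetic-algorithm search over charge sequences can be sketched with toy SOC and thermal models. Everything below, including the battery parameters, GA settings and the scalarized two-objective fitness, is illustrative and not taken from the paper (which maintains a Pareto front rather than a weighted sum):

```python
import random

# --- toy battery models (illustrative parameters, not from the paper) ---
CAP, R, COOL, N = 10.0, 0.05, 0.1, 20   # capacity scale, ohms, cooling rate, horizon

def simulate(currents, soc0=0.2):
    """Return (steps to reach 90% SOC or None, peak temperature rise)."""
    soc, dT, peak, done = soc0, 0.0, 0.0, None
    for k, i in enumerate(currents):
        soc = min(soc + i / CAP, 1.0)
        dT = dT * (1 - COOL) + R * i * i      # crude I^2*R lumped thermal model
        peak = max(peak, dT)
        if done is None and soc >= 0.9:
            done = k + 1
    return done, peak

def fitness(currents, alpha=2.0):
    """Weighted sum of charge duration and peak temperature rise (lower is better)."""
    done, peak = simulate(currents)
    return (done if done else 10 * N) + alpha * peak

def genetic_charge_search(pop_size=40, gens=60, imax=3.0, rng=random):
    """Evolve a charge-current sequence of length N with a basic GA."""
    pop = [[rng.uniform(0, imax) for _ in range(N)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        nxt = pop[:4]                          # elitism: keep the best four
        while len(nxt) < pop_size:
            a, b = rng.sample(pop[:20], 2)     # select among the better half
            cut = rng.randrange(1, N)
            child = a[:cut] + b[cut:]          # one-point crossover
            if rng.random() < 0.3:             # point mutation
                child[rng.randrange(N)] = rng.uniform(0, imax)
            nxt.append(child)
        pop = nxt
    return min(pop, key=fitness)
```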

  12. Improving active space telescope wavefront control using predictive thermal modeling

    Science.gov (United States)

    Gersh-Range, Jessica; Perrin, Marshall D.

    2015-01-01

    Active control algorithms for space telescopes are less mature than those for large ground telescopes due to differences in the wavefront control problems. Active wavefront control for space telescopes at L2, such as the James Webb Space Telescope (JWST), requires weighing control costs against the benefits of correcting wavefront perturbations that are a predictable byproduct of the observing schedule, which is known and determined in advance. To improve the control algorithms for these telescopes, we have developed a model that calculates the temperature and wavefront evolution during a hypothetical mission, assuming the dominant wavefront perturbations are due to changes in the spacecraft attitude with respect to the sun. Using this model, we show that the wavefront can be controlled passively by introducing scheduling constraints that limit the allowable attitudes for an observation based on the observation duration and the mean telescope temperature. We also describe the implementation of a predictive controller designed to prevent the wavefront error (WFE) from exceeding a desired threshold. This controller outperforms simpler algorithms even with substantial model error, achieving a lower WFE without requiring significantly more corrections. Consequently, predictive wavefront control based on known spacecraft attitude plans is a promising approach for JWST and other future active space observatories.

  13. Non-invasive prediction of hemodynamically significant coronary artery stenoses by contrast density difference in coronary CT angiography

    Energy Technology Data Exchange (ETDEWEB)

    Hell, Michaela M., E-mail: michaela.hell@uk-erlangen.de [Department of Cardiology, University of Erlangen (Germany); Dey, Damini [Department of Biomedical Sciences, Biomedical Imaging Research Institute, Cedars-Sinai Medical Center, Taper Building, Room A238, 8700 Beverly Boulevard, Los Angeles, CA 90048 (United States); Marwan, Mohamed; Achenbach, Stephan; Schmid, Jasmin; Schuhbaeck, Annika [Department of Cardiology, University of Erlangen (Germany)

    2015-08-15

    Highlights: • Overestimation of coronary lesions by coronary computed tomography angiography and subsequent unnecessary invasive coronary angiography and revascularization is a concern. • Differences in plaque characteristics and contrast density difference between hemodynamically significant and non-significant stenoses, as defined by invasive fractional flow reserve, were assessed. • At a threshold of ≥24%, contrast density difference predicted hemodynamically significant lesions with a specificity of 75%, sensitivity of 33%, PPV of 35% and NPV of 73%. • The determination of contrast density difference required less time than transluminal attenuation gradient measurement. - Abstract: Objectives: Coronary computed tomography angiography (CTA) allows the detection of obstructive coronary artery disease. However, its ability to predict the hemodynamic significance of stenoses is limited. We assessed differences in plaque characteristics and contrast density difference between hemodynamically significant and non-significant stenoses, as defined by invasive fractional flow reserve (FFR). Methods: Lesion characteristics of 59 consecutive patients (72 lesions) in whom invasive FFR was performed in at least one coronary artery with moderate to high-grade stenoses in coronary CTA were evaluated by two experienced readers. Coronary CTA data sets were acquired on a second-generation dual-source CT scanner using retrospectively ECG-gated spiral acquisition or prospectively ECG-triggered axial acquisition mode. Plaque volume and composition (non-calcified, calcified), remodeling index as well as contrast density difference (defined as the percentage decline in luminal CT attenuation/cross-sectional area over the lesion) were assessed using a semi-automatic software tool (Autoplaq). Additionally, the transluminal attenuation gradient (defined as the linear regression coefficient between intraluminal CT attenuation and length from the ostium) was determined
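    The contrast density difference statistic and the diagnostic metrics quoted in the highlights can be sketched as follows. The record's "attenuation/cross-sectional area" is read here as a ratio, and the clinical value comes from a semi-automatic tool (Autoplaq), so this is illustrative only:

```python
def contrast_density_difference(attenuations, areas):
    """Contrast density difference: percentage decline, over the lesion, of
    luminal CT attenuation normalized by cross-sectional area (one reading
    of the record's definition; illustrative, not the Autoplaq algorithm)."""
    densities = [hu / a for hu, a in zip(attenuations, areas)]
    return (max(densities) - min(densities)) / max(densities) * 100.0

def diagnostic_metrics(cdd_values, ffr_significant, threshold=24.0):
    """Sensitivity/specificity/PPV/NPV of CDD >= threshold against FFR."""
    pairs = list(zip(cdd_values, ffr_significant))
    tp = sum(c >= threshold and s for c, s in pairs)
    fp = sum(c >= threshold and not s for c, s in pairs)
    fn = sum(c < threshold and s for c, s in pairs)
    tn = sum(c < threshold and not s for c, s in pairs)
    return {"sensitivity": tp / (tp + fn), "specificity": tn / (tn + fp),
            "ppv": tp / (tp + fp), "npv": tn / (tn + fn)}
```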

  14. Significance of settling model structures and parameter subsets in modelling WWTPs under wet-weather flow and filamentous bulking conditions

    DEFF Research Database (Denmark)

    Ramin, Elham; Sin, Gürkan; Mikkelsen, Peter Steen;

    2014-01-01

    Current research focuses on predicting and mitigating the impacts of high hydraulic loadings on centralized wastewater treatment plants (WWTPs) under wet-weather conditions. The maximum permissible inflow to WWTPs depends not only on the settleability of activated sludge in secondary settling tanks...... (SSTs) but also on the hydraulic behaviour of SSTs. The present study investigates the impacts of ideal and non-ideal flow (dry and wet weather) and settling (good settling and bulking) boundary conditions on the sensitivity of WWTP model outputs to uncertainties intrinsic to the one-dimensional (1-D...... of settling parameters to the total variance of the key WWTP process outputs significantly depends on the influent flow and settling conditions. The magnitude of the impact is found to vary, depending on which type of 1-D SST model is used. Therefore, we identify and recommend potential parameter subsets...

  15. Weighted Feature Significance: A Simple, Interpretable Model of Compound Toxicity Based on the Statistical Enrichment of Structural Features

    OpenAIRE

    Huang, Ruili; Southall, Noel; Xia, Menghang; Cho, Ming-Hsuang; Jadhav, Ajit; Nguyen, Dac-Trung; Inglese, James; Tice, Raymond R.; Austin, Christopher P.

    2009-01-01

    In support of the U.S. Tox21 program, we have developed a simple and chemically intuitive model we call weighted feature significance (WFS) to predict the toxicological activity of compounds, based on the statistical enrichment of structural features in toxic compounds. We trained and tested the model on the following: (1) data from quantitative high–throughput screening cytotoxicity and caspase activation assays conducted at the National Institutes of Health Chemical Genomics Center, (2) dat...
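    The statistical-enrichment idea behind WFS can be sketched as smoothed log-odds of each structural feature in toxic versus all compounds, summed per compound. The smoothing constants and scoring form below are illustrative stand-ins, not the published WFS formula:

```python
import math
from collections import Counter

def feature_enrichment(toxic_compounds, all_compounds):
    """Per-feature enrichment: smoothed log-odds of a structural feature's
    occurrence in toxic vs. all compounds. Compounds are sets of feature
    identifiers; the +0.5/+1 smoothing is an illustrative choice."""
    tox = Counter(f for c in toxic_compounds for f in set(c))
    tot = Counter(f for c in all_compounds for f in set(c))
    nt, na = len(toxic_compounds), len(all_compounds)
    return {f: math.log(((tox[f] + 0.5) / (nt + 1)) / ((tot[f] + 0.5) / (na + 1)))
            for f in tot}

def wfs_score(compound, enrichment):
    """Toxicity score of a compound: summed enrichment of its features."""
    return sum(enrichment.get(f, 0.0) for f in set(compound))
```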

  16. The Significance of the Bystander Effect: Modeling, Experiments, and More Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Brenner, David J.

    2009-07-22

    -term models is needed. As an example of this novel approach, we integrated a stochastic short-term initiation/inactivation/repopulation model with a deterministic two-stage long-term model. Within this new formalism, the following assumptions are implemented: radiation initiates, promotes, or kills pre-malignant cells; a pre-malignant cell generates a clone, which, if it survives, quickly reaches a size limitation; the clone subsequently grows more slowly and can eventually generate a malignant cell; the carcinogenic potential of pre-malignant cells decreases with age. The effectiveness of high-LET radiation per unit dose increases as dose rate decreases. This “inverse dose rate effect” is seen in radon-induced lung carcinogenesis. We suggest a biologically-motivated mechanism based on radiation-induced direct and bystander-effect-related risks: During radon exposure, only a fraction of cells are traversed by alpha particles. These irradiated cells have an increased probability of being initiated into a pre-malignant state. They release signals, which convert some nearby unirradiated cells to an activated state. When already pre-malignant cells are activated, their proliferation (promotion) rate increases. If a radiation dose is sufficient to activate most susceptible cells, protracting the exposure does not substantially decrease the number of activated cells, but prolongs the activated state during which pre-malignant cell proliferation is accelerated. This mechanism is implemented in a low-dose-rate extension of our carcinogenesis model, which integrates both short- and long-term modeling approaches, and was applied to radiotherapy-induced second cancer risk estimation. Model predictions adequately describe the data on radon-induced lung carcinogenesis in humans and rats, using few adjustable parameters. Conclusions about the relative importance of promotion vs. initiation for radon carcinogenesis are similar to those reported with the two-stage clonal expansion model

  17. Allostasis: a model of predictive regulation.

    Science.gov (United States)

    Sterling, Peter

    2012-04-12

    The premise of the standard regulatory model, "homeostasis", is flawed: the goal of regulation is not to preserve constancy of the internal milieu. Rather, it is to continually adjust the milieu to promote survival and reproduction. Regulatory mechanisms need to be efficient, but homeostasis (error-correction by feedback) is inherently inefficient. Thus, although feedbacks are certainly ubiquitous, they could not possibly serve as the primary regulatory mechanism. A newer model, "allostasis", proposes that efficient regulation requires anticipating needs and preparing to satisfy them before they arise. The advantages: (i) errors are reduced in magnitude and frequency; (ii) response capacities of different components are matched -- to prevent bottlenecks and reduce safety factors; (iii) resources are shared between systems to minimize reserve capacities; (iv) errors are remembered and used to reduce future errors. This regulatory strategy requires a dedicated organ, the brain. The brain tracks multitudinous variables and integrates their values with prior knowledge to predict needs and set priorities. The brain coordinates effectors to mobilize resources from modest bodily stores and enforces a system of flexible trade-offs: from each organ according to its ability, to each organ according to its need. The brain also helps regulate the internal milieu by governing anticipatory behavior. Thus, an animal conserves energy by moving to a warmer place - before it cools, and it conserves salt and water by moving to a cooler one before it sweats. The behavioral strategy requires continuously updating a set of specific "shopping lists" that document the growing need for each key component (warmth, food, salt, water). These appetites funnel into a common pathway that employs a "stick" to drive the organism toward filling the need, plus a "carrot" to relax the organism when the need is satisfied. The stick corresponds broadly to the sense of anxiety, and the carrot broadly to

  18. Required Collaborative Work in Online Courses: A Predictive Modeling Approach

    Science.gov (United States)

    Smith, Marlene A.; Kellogg, Deborah L.

    2015-01-01

    This article describes a predictive model that assesses whether a student will have greater perceived learning in group assignments or in individual work. The model produces correct classifications 87.5% of the time. The research is notable in that it is the first in the education literature to adopt a predictive modeling methodology using data…

  19. A prediction model for assessing residential radon concentration in Switzerland

    NARCIS (Netherlands)

    Hauri, D.D.; Huss, A.; Zimmermann, F.; Kuehni, C.E.; Roosli, M.

    2012-01-01

    Indoor radon is regularly measured in Switzerland. However, a nationwide model to predict residential radon levels has not been developed. The aim of this study was to develop a prediction model to assess indoor radon concentrations in Switzerland. The model was based on 44,631 measurements from the

  20. Predicting aquifer response time for application in catchment modeling.

    Science.gov (United States)

    Walker, Glen R; Gilfedder, Mat; Dawes, Warrick R; Rassam, David W

    2015-01-01

    It is well established that changes in catchment land use can lead to significant impacts on water resources. Where land-use changes increase evapotranspiration, there is a resultant decrease in groundwater recharge, which in turn decreases groundwater discharge to streams. The response time of changes in groundwater discharge to a change in recharge is a key aspect of predicting impacts of land-use change on catchment water yield. Predicting these impacts across the large catchments relevant to water resource planning can require the estimation of groundwater response times from hundreds of aquifers. At this scale, detailed site-specific measured data are often absent, and available spatial data are limited. While numerical models can be applied, there is little advantage if there are no detailed data to parameterize them. Simple analytical methods are useful in this situation, as they allow the variability in groundwater response to be incorporated into catchment hydrological models, with minimal modeling overhead. This paper describes an analytical model which has been developed to capture some of the features of real, sloping aquifer systems. The derived groundwater response timescale can be used to parameterize a groundwater discharge function, allowing groundwater response to be predicted in relation to different broad catchment characteristics at a level of complexity which matches the available data. The results from the analytical model are compared to published field data and numerical model results, and provide an approach with broad application to inform water resource planning in other large, data-scarce catchments. © 2014, Commonwealth of Australia. Groundwater © 2014, National Ground Water Association.
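    The kind of analytical response timescale the record describes can be illustrated with the textbook flat-aquifer result (the slowest-decaying diffusion mode of a 1-D aquifer between a no-flow divide and a fixed-head stream); the record's sloping-aquifer model is more general, so this is a simplified stand-in:

```python
import math

def aquifer_response_time(storativity, length, transmissivity):
    """Characteristic groundwater response time tau = 4*S*L^2 / (pi^2*T)
    for an idealized 1-D aquifer of length L between a no-flow divide and
    a fixed-head stream (a textbook scaling, simpler than the sloping-
    aquifer model of the record). Units follow S (dimensionless), L and T.
    """
    return 4.0 * storativity * length ** 2 / (math.pi ** 2 * transmissivity)

def discharge_response(t, tau):
    """Fraction of a step change in recharge that has appeared as a change
    in groundwater discharge to the stream by time t (first-mode decay)."""
    return 1.0 - math.exp(-t / tau)
```

    With S = 0.1, L = 1000 m and T = 100 m²/day this gives a response time of roughly 400 days, i.e. the land-use signal takes on the order of a year to propagate to the stream.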

  1. Predicting future glacial lakes in Austria using different modelling approaches

    Science.gov (United States)

    Otto, Jan-Christoph; Helfricht, Kay; Prasicek, Günther; Buckel, Johannes; Keuschnig, Markus

    2017-04-01

    Glacier retreat is one of the most apparent consequences of temperature rise in the 20th and 21st centuries in the European Alps. In Austria, more than 240 new lakes have formed in glacier forefields since the Little Ice Age. A similar signal is reported from many mountain areas worldwide. Glacial lakes can have important environmental and socio-economic impacts on high mountain systems, including water resource management, sediment delivery, natural hazards, energy production and tourism. Their development significantly modifies the landscape configuration and visual appearance of high mountain areas. Knowledge of the location, number and extent of these future lakes can be used to assess potential impacts on high mountain geo-ecosystems and upland-lowland interactions. Information on new lakes is critical to appraise emerging threats and potential benefits for society. The recent development of regional ice thickness models and their combination with high-resolution glacier surface data allows predicting the topography below current glaciers by subtracting ice thickness from the glacier surface. Analyzing these modelled glacier bed surfaces reveals overdeepenings that represent potential locations for future lakes. In order to predict the location of future glacial lakes below recent glaciers in the Austrian Alps we apply different ice thickness models using high-resolution terrain data and glacier outlines. The results are compared and validated with ice thickness data from geophysical surveys. Additionally, we run the models on three different glacier extents provided by the Austrian Glacier Inventories from 1969, 1998 and 2006. Results of this historical glacier extent modelling are compared to existing glacier lakes and discussed with a focus on geomorphological impacts on lake evolution. We discuss model performance and observed differences in the results in order to assess the approach for a realistic prediction of future lake locations. The presentation delivers

  2. Distributional Analysis for Model Predictive Deferrable Load Control

    OpenAIRE

    Chen, Niangjun; Gan, Lingwen; Low, Steven H.; Wierman, Adam

    2014-01-01

    Deferrable load control is essential for handling the uncertainties associated with the increasing penetration of renewable generation. Model predictive control has emerged as an effective approach for deferrable load control, and has received considerable attention. In particular, previous work has analyzed the average-case performance of model predictive deferrable load control. However, to this point, distributional analysis of model predictive deferrable load control has been elusive. In ...

  3. Predicting lower mantle heterogeneity from 4-D Earth models

    Science.gov (United States)

    Flament, Nicolas; Williams, Simon; Müller, Dietmar; Gurnis, Michael; Bower, Dan J.

    2016-04-01

    The Earth's lower mantle is characterized by two large low-shear-velocity provinces (LLSVPs), ~15,000 km in diameter and 500-1000 km high, located under Africa and the Pacific Ocean. The spatial stability and chemical nature of these LLSVPs are debated. Here, we compare the lower mantle structure predicted by forward global mantle flow models constrained by tectonic reconstructions (Bower et al., 2015) to an analysis of five global tomography models. In the dynamic models, spanning 230 million years, slabs subducting deep into the mantle deform an initially uniform basal layer containing 2% of the volume of the mantle. Basal density, convective vigour (Rayleigh number Ra), mantle viscosity, absolute plate motions, and relative plate motions are varied in a series of model cases. We use cluster analysis to classify a set of equally-spaced points (average separation ~0.45°) on the Earth's surface into two groups of points with similar variations in present-day temperature between 1000-2800 km depth, for each model case. Below ~2400 km depth, this procedure reveals a high-temperature cluster in which mantle temperature is significantly larger than ambient and a low-temperature cluster in which mantle temperature is lower than ambient. The spatial extent of the high-temperature cluster is in first-order agreement with the outlines of the African and Pacific LLSVPs revealed by a similar cluster analysis of five tomography models (Lekic et al., 2012). Model success is quantified by computing the accuracy and sensitivity of the predicted temperature clusters in predicting the low-velocity cluster obtained from tomography (Lekic et al., 2012). In these cases, the accuracy varies between 0.61-0.80, where a value of 0.5 represents the random case, and the sensitivity ranges between 0.18-0.83. The largest accuracies and sensitivities are obtained for models with Ra ≈ 5 × 10⁷, no asthenosphere (or an asthenosphere restricted to the oceanic domain), and a
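
    The cluster-analysis step can be illustrated with a minimal two-cluster k-means over temperature-depth profiles (pure NumPy). The synthetic profiles below, with a +100 K "hot" offset in the deeper columns, are hypothetical stand-ins for illustration, not the paper's data:

```python
import numpy as np

def two_means(profiles, n_iter=50, seed=0):
    """Minimal k-means (k=2) over temperature-depth profiles.
    profiles: (n_points, n_depths) array. Returns 0/1 cluster labels."""
    rng = np.random.default_rng(seed)
    centers = profiles[rng.choice(len(profiles), 2, replace=False)]
    labels = np.zeros(len(profiles), dtype=int)
    for _ in range(n_iter):
        # Assign each profile to the nearest center, then update centers.
        d = np.linalg.norm(profiles[:, None, :] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        for k in (0, 1):
            if np.any(labels == k):
                centers[k] = profiles[labels == k].mean(axis=0)
    return labels

# Synthetic stand-in: 50 "ambient" and 50 "hot" profiles, the hot ones
# offset by +100 K in the deepest 8 of 20 depth bins.
rng = np.random.default_rng(1)
cold = rng.normal(0.0, 10.0, (50, 20))
hot = rng.normal(0.0, 10.0, (50, 20))
hot[:, 12:] += 100.0
labels = two_means(np.vstack([cold, hot]))
```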

  4. The significance of Bremsstrahlung SPECT/CT after yttrium-90 radioembolization treatment in the prediction of extrahepatic side effects

    Energy Technology Data Exchange (ETDEWEB)

    Ahmadzadehfar, Hojjat; Muckle, Marianne; Sabet, Amir; Biermann, Kim; Haslerud, Torjan; Biersack, Hans-Juergen; Ezziddin, Samer [University Hospital Bonn, Department of Nuclear Medicine, Bonn (Germany); Wilhelm, Kai [University Hospital Bonn, Department of Radiology, Bonn (Germany); Kuhl, Christiane [University Hospital Aachen, Department of Radiology, Aachen (Germany)

    2012-02-15

    Unwanted deposition of {sup 90}Y microspheres in organs other than the liver during radioembolization of liver tumours may cause severe side effects such as duodenal ulcer. The aim of this study was to evaluate the significance of posttherapy bremsstrahlung (BS) SPECT/CT images of the liver in comparison to planar and SPECT images in the prediction of radioembolization-induced extrahepatic side effects. A total of 188 radioembolization procedures were performed in 123 patients (50 women, 73 men) over a 2-year period. Planar, whole-body and BS SPECT/CT imaging were performed 24 h after treatment as a part of therapy work-up. Any focally increased extrahepatic accumulation was evaluated as suspicious. Clinical follow-up and gastroduodenoscopy served as reference standards. The studies were reviewed to evaluate whether BS SPECT/CT imaging was of benefit. In the light of anatomic data obtained from SPECT/CT, apparent extrahepatic BS in 43% of planar and in 52% of SPECT images proved to be in the liver and hence false-positive. The results of planar scintigraphy could not be analysed further since 12 images were not assessable due to high scatter artefacts. On the basis of the gastrointestinal (GI) complications and the results of gastroduodenoscopy, true-positive, true-negative, false-positive and false-negative results of BS SPECT and SPECT/CT imaging in the prediction of GI ulcers were determined. The sensitivity, specificity, positive and negative predictive values and the accuracy of SPECT and SPECT/CT in the prediction of GI ulcers were 13%, 88%, 8%, 92% and 82%, and 87%, 100%, 100%, 99% and 99%, respectively. Despite the low quality of BS images, BS SPECT/CT can be used as a reliable method to confirm the safe distribution of {sup 90}Y microspheres and in the prediction of GI side effects. (orig.)
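
    The reported sensitivity, specificity, predictive values and accuracy all follow directly from a 2×2 confusion table. A short sketch (the counts below are hypothetical, chosen only to illustrate the computation, not the study's actual tallies):

```python
def diagnostic_metrics(tp, tn, fp, fn):
    """Standard test-performance measures from a 2x2 confusion table."""
    return {
        "sensitivity": tp / (tp + fn),          # true-positive rate
        "specificity": tn / (tn + fp),          # true-negative rate
        "ppv": tp / (tp + fp),                  # positive predictive value
        "npv": tn / (tn + fn),                  # negative predictive value
        "accuracy": (tp + tn) / (tp + tn + fp + fn),
    }

# Hypothetical counts for a SPECT/CT-style reading
m = diagnostic_metrics(tp=13, tn=150, fp=0, fn=2)
print({k: round(v, 2) for k, v in m.items()})
```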

  5. Prediction for Major Adverse Outcomes in Cardiac Surgery: Comparison of Three Prediction Models

    Directory of Open Access Journals (Sweden)

    Cheng-Hung Hsieh

    2007-09-01

    Conclusion: The Parsonnet score performed as well as the logistic regression models in predicting major adverse outcomes. The Parsonnet score appears to be a very suitable model for clinicians to use in risk stratification of cardiac surgery.

  6. Algorithms and Software for Predictive and Perceptual Modeling of Speech

    CERN Document Server

    Atti, Venkatraman

    2010-01-01

    From the early pulse code modulation-based coders to some of the recent multi-rate wideband speech coding standards, the area of speech coding made several significant strides with an objective to attain high quality of speech at the lowest possible bit rate. This book presents some of the recent advances in linear prediction (LP)-based speech analysis that employ perceptual models for narrow- and wide-band speech coding. The LP analysis-synthesis framework has been successful for speech coding because it fits well the source-system paradigm for speech synthesis. Limitations associated with th

  7. On hydrological model complexity, its geometrical interpretations and prediction uncertainty

    NARCIS (Netherlands)

    Arkesteijn, E.C.M.M.; Pande, S.

    2013-01-01

    Knowledge of hydrological model complexity can aid selection of an optimal prediction model out of a set of available models. Optimal model selection is formalized as selection of the least complex model out of a subset of models that have lower empirical risk. This may be considered equivalent to

  8. The clinical significance of MiR-148a as a predictive biomarker in patients with advanced colorectal cancer.

    Directory of Open Access Journals (Sweden)

    Masanobu Takahashi

    Full Text Available AIM: Development of robust prognostic and/or predictive biomarkers in patients with colorectal cancer (CRC) is imperative for advancing treatment strategies for this disease. We aimed to determine whether expression status of certain miRNAs might have prognostic/predictive value in CRC patients treated with conventional cytotoxic chemotherapies. METHODS: We studied a cohort of 273 CRC specimens from stage II/III patients treated with 5-fluorouracil-based adjuvant chemotherapy and stage IV patients subjected to 5-fluorouracil and oxaliplatin-based chemotherapy. In a screening set (n = 44), 13 of 21 candidate miRNAs were successfully quantified by multiplex quantitative RT-PCR. In the validation set comprising the entire patient cohort, miR-148a expression status was assessed by quantitative RT-PCR, and its promoter methylation was quantified by bisulfite pyrosequencing. Lastly, we analyzed the associations between miR-148a expression and patient survival. RESULTS: Among the candidate miRNAs studied, miR-148a expression was most significantly down-regulated in advanced CRC tissues. In stage III and IV CRC, low miR-148a expression was associated with significantly shorter disease-free survival (DFS), a worse therapeutic response, and poor overall survival (OS). Furthermore, miR-148a methylation status correlated inversely with its expression, and was associated with worse survival in stage IV CRC. In multivariate analysis, miR-148a expression was an independent prognostic/predictive biomarker for advanced CRC patients (DFS in stage III, low vs. high expression, HR 2.11; OS in stage IV, HR 1.93). DISCUSSION: MiR-148a status has prognostic/predictive value in advanced CRC patients treated with conventional chemotherapy, which has important clinical implications for improving therapeutic strategies and personalized management of this malignancy.

  9. Probabilistic Modeling and Visualization for Bankruptcy Prediction

    DEFF Research Database (Denmark)

    Antunes, Francisco; Ribeiro, Bernardete; Pereira, Francisco Camara

    2017-01-01

    In accounting and finance domains, bankruptcy prediction is of great utility for all of the economic stakeholders. The challenge of accurately assessing business failure prediction, especially under scenarios of financial crisis, is known to be complicated. Although there have been many successful studies on bankruptcy detection, probabilistic approaches have seldom been carried out. In this paper we assume a probabilistic point of view by applying Gaussian Processes (GP) in the context of bankruptcy prediction, comparing it against Support Vector Machines (SVM) and Logistic Regression (LR). Using real-world bankruptcy data, an in-depth analysis is conducted showing that, in addition to a probabilistic interpretation, the GP can effectively improve bankruptcy prediction performance with high accuracy when compared to the other approaches. We additionally generate a complete graphical...

  10. Predictive modeling of dental pain using neural network.

    Science.gov (United States)

    Kim, Eun Yeob; Lim, Kun Ok; Rhee, Hyun Sill

    2009-01-01

    The mouth, used for ingesting food, is one of the most basic and important parts of the body. In this study, dental pain was predicted with a neural network model. The fitness of the resulting predictive model of dental pain factors was 80.0%. For people identified by the neural network model as likely to experience dental pain, preventive measures including proper eating habits, education on oral hygiene, and stress release must precede any dental treatment.

  11. Prediction of peptide bonding affinity: kernel methods for nonlinear modeling

    CERN Document Server

    Bergeron, Charles; Sundling, C Matthew; Krein, Michael; Katt, Bill; Sukumar, Nagamani; Breneman, Curt M; Bennett, Kristin P

    2011-01-01

    This paper presents regression models obtained from a process of blind prediction of peptide binding affinity from provided descriptors for several distinct datasets as part of the 2006 Comparative Evaluation of Prediction Algorithms (COEPRA) contest. This paper finds that kernel partial least squares, a nonlinear partial least squares (PLS) algorithm, outperforms PLS, and that the incorporation of transferable atom equivalent features improves predictive capability.

  12. Building predictive models of soil particle-size distribution

    Directory of Open Access Journals (Sweden)

    Alessandro Samuel-Rosa

    2013-04-01

    Full Text Available Is it possible to build predictive models (PMs) of soil particle-size distribution (psd) in a region with complex geology and a young and unstable land-surface? The main objective of this study was to answer this question. A set of 339 soil samples from a small slope catchment in Southern Brazil was used to build PMs of psd in the surface soil layer. Multiple linear regression models were constructed using terrain attributes (elevation, slope, catchment area, convergence index, and topographic wetness index). The PMs explained more than half of the data variance. This performance is similar to (or even better than) that of the conventional soil mapping approach. For some size fractions, the PM performance can reach 70%. The largest uncertainties were observed in geologically more complex areas. Therefore, significant improvements in the predictions can only be achieved if accurate geological data are made available. Meanwhile, PMs built on terrain attributes are efficient in predicting the particle-size distribution (psd) of soils in regions of complex geology.
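
    The core technique, ordinary least squares regression of a particle-size fraction on terrain attributes, can be sketched with NumPy. The data below are synthetic stand-ins (invented coefficients and noise, not the Brazilian samples):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 339  # sample size matching the study; the data themselves are synthetic
# Hypothetical terrain attributes: elevation, slope, catchment area,
# convergence index, topographic wetness index (standardized)
X = rng.normal(size=(n, 5))
clay = 30 + X @ np.array([2.0, -3.0, 1.5, 0.5, 2.5]) + rng.normal(0, 4, n)

# Ordinary least squares with an intercept column
A = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(A, clay, rcond=None)
pred = A @ beta
r2 = 1 - np.sum((clay - pred) ** 2) / np.sum((clay - clay.mean()) ** 2)
print(f"R^2 = {r2:.2f}")
```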

  13. Hologram QSAR model for the prediction of human oral bioavailability.

    Science.gov (United States)

    Moda, Tiago L; Montanari, Carlos A; Andricopulo, Adriano D

    2007-12-15

    A drug intended for use in humans should have an ideal balance of pharmacokinetics and safety, as well as potency and selectivity. Unfavorable pharmacokinetics can negatively affect the clinical development of many otherwise promising drug candidates. A variety of in silico ADME (absorption, distribution, metabolism, and excretion) models are receiving increased attention due to a better appreciation that pharmacokinetic properties should be considered in early phases of the drug discovery process. Human oral bioavailability is an important pharmacokinetic property, which is directly related to the amount of drug available in the systemic circulation to exert pharmacological and therapeutic effects. In the present work, hologram quantitative structure-activity relationships (HQSAR) were performed on a training set of 250 structurally diverse molecules with known human oral bioavailability. The most significant HQSAR model (q² = 0.70, r² = 0.93) was obtained using atoms, bond, connection, and chirality as fragment distinction. The predictive ability of the model was evaluated by an external test set containing 52 molecules not included in the training set, and the predicted values were in good agreement with the experimental values. The HQSAR model should be useful for the design of new drug candidates having increased bioavailability as well as in the process of chemical library design, virtual screening, and high-throughput screening.

  14. Testing 40 Predictions from the Transtheoretical Model Again, with Confidence

    Science.gov (United States)

    Velicer, Wayne F.; Brick, Leslie Ann D.; Fava, Joseph L.; Prochaska, James O.

    2013-01-01

    Testing Theory-based Quantitative Predictions (TTQP) represents an alternative to traditional Null Hypothesis Significance Testing (NHST) procedures and is more appropriate for theory testing. The theory generates explicit effect size predictions and these effect size estimates, with related confidence intervals, are used to test the predictions.…

  15. Comparisons of Faulting-Based Pavement Performance Prediction Models

    Directory of Open Access Journals (Sweden)

    Weina Wang

    2017-01-01

    Full Text Available Faulting prediction is the core of concrete pavement maintenance and design. Highway agencies are always faced with the problem of low prediction accuracy, which causes costly maintenance. Although many researchers have developed performance prediction models, prediction accuracy has remained a challenge. This paper reviews performance prediction models and JPCP faulting models that have been used in past research. Three models, multivariate nonlinear regression (MNLR), artificial neural network (ANN), and Markov chain (MC), are then tested and compared using a set of actual pavement survey data taken on interstate highways with varying design features, traffic, and climate data. It is found that the MNLR model needs further recalibration, while the ANN model needs more data for training the network. The MC model seems a good tool for pavement performance prediction when data are limited, but it is based on visual inspections and not explicitly related to quantitative physical parameters. This paper then suggests that a further direction for developing performance prediction models is to combine different models so that the advantages of one offset the disadvantages of another, to obtain better accuracy.
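
    Of the three, the Markov chain model is the simplest to sketch: the distribution over condition states is propagated by repeated multiplication with a transition matrix. The four-state scale and the transition probabilities below are hypothetical, for illustration only:

```python
import numpy as np

# Hypothetical 4-state condition scale (0 = good ... 3 = severe faulting).
# Row i gives the one-year transition probabilities out of state i;
# state 3 is absorbing (no repair in this sketch).
P = np.array([
    [0.85, 0.15, 0.00, 0.00],
    [0.00, 0.80, 0.20, 0.00],
    [0.00, 0.00, 0.75, 0.25],
    [0.00, 0.00, 0.00, 1.00],
])

state = np.array([1.0, 0.0, 0.0, 0.0])  # a new pavement section
for year in range(10):
    state = state @ P  # propagate one year
print("state distribution after 10 years:", np.round(state, 3))
```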

  16. Consistent CMT solutions from Harvard University before great earthquakes in Kurile Islands and its significance for earthquake prediction

    Institute of Scientific and Technical Information of China (English)

    WANG Jun-guo; DIAO Gui-ling

    2005-01-01

    In this paper, we use the centroid moment tensor (CMT) solutions produced by Harvard University for earthquakes that occurred in the Kurile Islands to analyze the consistency of focal mechanisms in the area, and propose the idea of making earthquake predictions based on the consistency parameter a of the focal mechanism and stress field. The results indicate that before MW≥7.5 earthquakes the consistency parameter a decreases, with the decrease beginning about 10-110 days and ending about 2-30 days before the great earthquakes. Although the phenomenon is not exactly the same for each earthquake, the differences are not large. Certainly, the reliability of the phenomenon should be tested over time. However, it should not be random that the focal mechanisms of MW≥5.3 earthquakes are successively consistent with the stress field over an area several hundred kilometers in length; this should be a phenomenon of predictive significance. When sufficient earthquake examples have been accumulated, uniform judgment criteria and prediction principles can then be stipulated.

  17. Prediction using patient comparison vs. modeling: a case study for mortality prediction.

    Science.gov (United States)

    Hoogendoorn, Mark; El Hassouni, Ali; Mok, Kwongyen; Ghassemi, Marzyeh; Szolovits, Peter

    2016-08-01

    Information in Electronic Medical Records (EMRs) can be used to generate accurate predictions for the occurrence of a variety of health states, which can contribute to more pro-active interventions. The very nature of EMRs does make the application of off-the-shelf machine learning techniques difficult. In this paper, we study two approaches to making predictions that have hardly been compared in the past: (1) extracting high-level (temporal) features from EMRs and building a predictive model, and (2) defining a patient similarity metric and predicting based on the outcome observed for similar patients. We analyze and compare both approaches on the MIMIC-II ICU dataset to predict patient mortality and find that the patient similarity approach does not scale well and results in a less accurate model (AUC of 0.68) compared to the modeling approach (0.84). We also show that mortality can be predicted within a median of 72 hours.
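
    The patient-similarity approach can be sketched as a k-nearest-neighbour estimate: the predicted risk is the observed mortality rate among the k most similar patients (Euclidean distance on standardized features here). The data and features below are synthetic stand-ins, not MIMIC-II:

```python
import numpy as np

def knn_mortality_risk(query, X, died, k=5):
    """Patient-similarity prediction: risk = death rate among the k
    patients closest to the query (Euclidean distance on standardized
    features). Note the O(n) distance scan per query, which hints at
    the scaling problem reported in the paper."""
    d = np.linalg.norm(X - query, axis=1)
    nearest = np.argsort(d)[:k]
    return died[nearest].mean()

# Synthetic stand-in cohort: 200 patients, 4 standardized features,
# with mortality loosely driven by the first feature.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
died = (X[:, 0] + rng.normal(0, 0.5, 200) > 1.2).astype(float)

risk = knn_mortality_risk(np.array([2.0, 0.0, 0.0, 0.0]), X, died)
```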

  18. Physiologically-based, predictive analytics using the heart-rate-to-Systolic-Ratio significantly improves the timeliness and accuracy of sepsis prediction compared to SIRS.

    Science.gov (United States)

    Danner, Omar K; Hendren, Sandra; Santiago, Ethel; Nye, Brittany; Abraham, Prasad

    2017-04-01

    Enhancing the efficiency of diagnosis and treatment of severe sepsis by using physiologically-based, predictive analytical strategies has not been fully explored. We hypothesized that assessment of the heart-rate-to-systolic ratio significantly increases the timeliness and accuracy of sepsis prediction after emergency department (ED) presentation. We evaluated the records of 53,313 ED patients from a large, urban teaching hospital between January and June 2015. The HR-to-systolic ratio was compared to SIRS criteria for sepsis prediction. There were 884 patients with discharge diagnoses of sepsis, severe sepsis, and/or septic shock. Variations in three presenting variables, heart rate, systolic BP, and temperature, were determined to be primary early predictors of sepsis, with 74% (654/884) accuracy compared to 34% (304/884) using SIRS criteria, a statistically significant difference. These findings support more timely sepsis identification via detection of variations in the HR-to-systolic ratio. This approach may lead to earlier sepsis workup and life-saving interventions. Copyright © 2017 Elsevier Inc. All rights reserved.
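
    The screening variable itself is a one-line computation, essentially the familiar shock index. A sketch (the 0.9 cutoff is a commonly cited shock-index threshold used here for illustration; the study's own threshold is not stated in the abstract):

```python
def hr_to_systolic_ratio(heart_rate, systolic_bp):
    """Heart-rate-to-systolic-BP ratio (the 'shock index')."""
    return heart_rate / systolic_bp

def flag_sepsis_risk(heart_rate, systolic_bp, threshold=0.9):
    """Flag a patient for early sepsis workup when the ratio exceeds a
    screening threshold. The 0.9 default is an illustrative cutoff,
    not the value used in this study."""
    return hr_to_systolic_ratio(heart_rate, systolic_bp) > threshold

print(flag_sepsis_risk(118, 95))   # tachycardic with borderline BP
print(flag_sepsis_risk(78, 130))   # unremarkable vital signs
```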

  19. Improving prediction of surgical site infection risk with multilevel modeling.

    Directory of Open Access Journals (Sweden)

    Lauren Saunders

    Full Text Available BACKGROUND: Surgical site infection (SSI) surveillance is a key factor in the elaboration of strategies to reduce SSI occurrence and in providing surgeons with appropriate data feedback (risk indicators, clinical prediction rules). AIM: To improve the predictive performance of an individual-based SSI risk model by considering a multilevel hierarchical structure. PATIENTS AND METHODS: Data were collected anonymously by the French SSI active surveillance system in 2011. An SSI diagnosis was made by the surgical teams and infection control practitioners following standardized criteria. A random 20% sample comprising 151 hospitals, 502 wards and 62,280 patients was used. Three-level (patient, ward, hospital) hierarchical logistic regression models were initially performed. Parameters were estimated using the simulation-based Markov chain Monte Carlo procedure. RESULTS: A total of 623 SSI were diagnosed (1%). The hospital level was discarded from the analysis as it did not contribute to variability of SSI occurrence (p = 0.32). Established individual risk factors (patient history, surgical procedure and hospitalization characteristics) were identified. A significant heterogeneity in SSI occurrence between wards was found (median odds ratio [MOR] 3.59, 95% credibility interval [CI] 3.03 to 4.33) after adjusting for patient-level variables. The effects of the follow-up duration varied between wards (p < 10⁻⁹), with an increased heterogeneity when follow-up was <15 days (MOR 6.92, 95% CI 5.31 to 9.07). The final two-level model significantly improved the discriminative accuracy compared to the single-level reference model (p < 10⁻⁹), with an area under the ROC curve of 0.84. CONCLUSION: This study sheds new light on the respective contributions of the patient, ward and hospital levels to SSI occurrence and demonstrates the significant impact of the ward level over and above risk factors present at the patient level (i.e., independently from patient case-mix).
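
    The median odds ratio quoted in this abstract is a standard transformation of the between-ward variance on the logit scale: MOR = exp(√(2σ²) · Φ⁻¹(0.75)). A one-function sketch (the variance value in the example is back-calculated for illustration, not taken from the paper):

```python
import math

def median_odds_ratio(var_between):
    """Median odds ratio for a two-level logistic model with
    between-cluster (e.g. ward) variance var_between on the logit scale:
    MOR = exp(sqrt(2 * var) * Phi^{-1}(0.75)), Phi^{-1}(0.75) ~ 0.6745."""
    return math.exp(math.sqrt(2.0 * var_between) * 0.674489750196)

# A ward-level variance of ~1.80 corresponds to an MOR of ~3.6,
# comparable in magnitude to the value reported in the abstract.
print(round(median_odds_ratio(1.80), 2))
```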

  20. Fuzzy predictive filtering in nonlinear economic model predictive control for demand response

    DEFF Research Database (Denmark)

    Santos, Rui Mirra; Zong, Yi; Sousa, Joao M. C.;

    2016-01-01

    The performance of a model predictive controller (MPC) is highly correlated with the model's accuracy. This paper introduces an economic model predictive control (EMPC) scheme based on a nonlinear model, which uses a branch-and-bound tree search for solving the inherent non-convex optimization problem. Moreover, to reduce the computation time and improve the controller's performance, a fuzzy predictive filter is introduced. With the purpose of testing the developed EMPC, a simulation controlling the temperature levels of an intelligent office building (PowerFlexHouse), with and without fuzzy...

  1. Predictive modeling and reducing cyclic variability in autoignition engines

    Energy Technology Data Exchange (ETDEWEB)

    Hellstrom, Erik; Stefanopoulou, Anna; Jiang, Li; Larimore, Jacob

    2016-08-30

    Methods and systems are provided for controlling a vehicle engine to reduce cycle-to-cycle combustion variation. A predictive model is applied to predict cycle-to-cycle combustion behavior of an engine based on observed engine performance variables. Conditions are identified, based on the predicted cycle-to-cycle combustion behavior, that indicate high cycle-to-cycle combustion variation. Corrective measures are then applied to prevent the predicted high cycle-to-cycle combustion variation.

  2. Intelligent predictive model of ventilating capacity of imperial smelt furnace

    Institute of Scientific and Technical Information of China (English)

    唐朝晖; 胡燕瑜; 桂卫华; 吴敏

    2003-01-01

    In order to assess the ventilating capacity of the imperial smelting furnace (ISF) and increase the output of lead, an intelligent modeling method based on gray theory and artificial neural networks (ANN) is proposed, in which the weight values in the integrated model are adjusted automatically. An intelligent predictive model of the ventilating capacity of the ISF is established and analyzed with this method. The simulation results and industrial applications demonstrate that the predictive model is close to the real plant: the relative prediction error is 0.72%, which is 50% less than that of a single model, leading to a notable increase in the output of lead.
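
    The abstract does not specify the weighting rule, but an inverse-error combination illustrates how the weights of the gray and ANN components of such an integrated model can adjust automatically as their recent accuracy changes (an assumed scheme, not the paper's exact method):

```python
def combine_forecasts(pred_gray, pred_ann, err_gray, err_ann):
    """Combine two component forecasts, weighting each by the inverse of
    its recent absolute error so the weights adapt automatically.
    (Illustrative scheme, not the paper's exact weighting rule.)"""
    w_gray = 1.0 / (err_gray + 1e-9)  # small epsilon guards division by zero
    w_ann = 1.0 / (err_ann + 1e-9)
    return (w_gray * pred_gray + w_ann * pred_ann) / (w_gray + w_ann)

# If the ANN has recently been twice as accurate, it gets twice the weight.
print(combine_forecasts(100.0, 106.0, err_gray=4.0, err_ann=2.0))
```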

  3. A Prediction Model of the Capillary Pressure J-Function

    Science.gov (United States)

    Xu, W. S.; Luo, P. Y.; Sun, L.; Lin, N.

    2016-01-01

    The capillary pressure J-function is a dimensionless measure of the capillary pressure of a fluid in a porous medium. The function was derived based on a capillary bundle model. However, the dependence of the J-function on the water saturation Sw is not well understood. A prediction model for it is presented based on a capillary pressure model; the resulting J-function prediction model is a power function rather than an exponential or polynomial function. Relative permeability is calculated with the J-function prediction model, resulting in an easier calculation and more representative results. PMID:27603701
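
    For reference, the classical Leverett J-function that such models target is J(Sw) = Pc(Sw)/(σ cos θ) · √(k/φ). A unit-consistent sketch (the example property values are hypothetical):

```python
import math

def leverett_j(pc, sigma, theta_deg, k, phi):
    """Leverett J-function: J = Pc / (sigma * cos(theta)) * sqrt(k / phi).
    pc in Pa, sigma (interfacial tension) in N/m, theta in degrees,
    k (permeability) in m^2, phi (porosity) dimensionless.
    The result is dimensionless."""
    return pc / (sigma * math.cos(math.radians(theta_deg))) * math.sqrt(k / phi)

# Example: Pc = 5 kPa, sigma = 0.025 N/m, theta = 0 deg,
# k = 100 mD (~9.87e-14 m^2), phi = 0.20
J = leverett_j(5e3, 0.025, 0.0, 9.87e-14, 0.20)
print(round(J, 3))
```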

  4. Adaptation of Predictive Models to PDA Hand-Held Devices

    Directory of Open Access Journals (Sweden)

    Lin, Edward J

    2008-01-01

    Full Text Available Prediction models using multiple logistic regression are appearing with increasing frequency in the medical literature. Problems associated with these models include the complexity of computations when applied in their pure form, and lack of availability at the bedside. Personal digital assistant (PDA) hand-held devices equipped with spreadsheet software offer the clinician a readily available and easily applied means of applying predictive models at the bedside. The purposes of this article are to briefly review regression as a means of creating predictive models and to describe a method of choosing and adapting logistic regression models to emergency department (ED) clinical practice.
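
    Once the coefficients are fitted, bedside evaluation reduces to the logistic formula p = 1/(1 + e^(-(b0 + Σ bi·xi))), which is exactly what a spreadsheet cell on a PDA has to compute. A sketch with invented coefficients (illustrative only, not a published model):

```python
import math

def logistic_probability(coefficients, intercept, values):
    """Evaluate a fitted logistic regression model at the bedside:
    p = 1 / (1 + exp(-(b0 + b1*x1 + ... + bk*xk)))."""
    z = intercept + sum(b * x for b, x in zip(coefficients, values))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical 3-predictor model (coefficients invented for illustration):
# age in years, risk factor present (0/1), prior admission (0/1)
p = logistic_probability([0.04, 1.1, -0.8], intercept=-5.0,
                         values=[65, 1, 0])
print(round(p, 3))
```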

  5. Predicting recycling behaviour: Comparison of a linear regression model and a fuzzy logic model.

    Science.gov (United States)

    Vesely, Stepan; Klöckner, Christian A; Dohnal, Mirko

    2016-03-01

    In this paper we demonstrate that fuzzy logic can provide a better tool for predicting recycling behaviour than the customarily used linear regression. To show this, we take a set of empirical data on recycling behaviour (N=664), which we randomly divide into two halves. The first half is used to estimate a linear regression model of recycling behaviour, and to develop a fuzzy logic model of recycling behaviour. As the first comparison, the fit of both models to the data included in estimation of the models (N=332) is evaluated. As the second comparison, predictive accuracy of both models for "new" cases (hold-out data not included in building the models, N=332) is assessed. In both cases, the fuzzy logic model significantly outperforms the regression model in terms of fit. To conclude, when accurate predictions of recycling and possibly other environmental behaviours are needed, fuzzy logic modelling seems to be a promising technique. Copyright © 2015 Elsevier Ltd. All rights reserved.

  6. A model to predict the power output from wind farms

    Energy Technology Data Exchange (ETDEWEB)

    Landberg, L. [Risø National Lab., Roskilde (Denmark)]

    1997-12-31

    This paper will describe a model that can predict the power output from wind farms. To give examples of input, the model is applied to a wind farm in Texas. The predictions are generated from forecasts from the NGM model of NCEP. These predictions are made valid at individual sites (wind farms) by applying a matrix calculated by the sub-models of WAsP (Wind Atlas Analysis and Application Program). The actual wind farm production is then calculated using the Risø PARK model. Because of the preliminary nature of the results, they will not be given here; however, similar results from Europe will be presented.

  7. Modelling microbial interactions and food structure in predictive microbiology

    NARCIS (Netherlands)

    Malakar, P.K.

    2002-01-01

    Keywords: modelling, dynamic models, microbial interactions, diffusion, microgradients, colony growth, predictive microbiology.

    Growth response of microorganisms in foods is a complex process. Innovations in food production and preservation techniques have resulted in adoption of new technologies

  9. Study of prognostic significance of antenatal ultrasonography and renin angiotensin system activation in predicting disease severity in posterior urethral valves

    Directory of Open Access Journals (Sweden)

    Divya Bhadoo

    2015-01-01

    Full Text Available Aims: To study the prognostic significance of antenatal ultrasonography and renin-angiotensin system activation in predicting disease severity in posterior urethral valves. Materials and Methods: Patients with antenatally diagnosed hydronephrosis were included. Postnatally, they were divided into two groups, posterior urethral valve (PUV) and non-PUV. The studied parameters were: gestational age at detection, surgical intervention, ultrasound findings, cord blood and follow-up plasma renin activity (PRA) values, vesico-ureteric reflux (VUR), renal scars, and glomerular filtration rate (GFR). Results: A total of 25 patients were included, 10 PUV and 15 non-PUV. All infants with PUV underwent primary valve incision. GFR was less than 60 ml/min/1.73 m² body surface area in 4 patients at last follow-up. Keyhole sign, oligoamnios, absent bladder cycling, and cortical cysts were not consistent findings on antenatal ultrasound in PUV. Cord blood PRA was significantly higher (P < 0.0001) in PUV compared to non-PUV patients. Gestational age at detection of hydronephrosis, cortical cysts, bladder wall thickness, and amniotic fluid index were not significantly correlated with GFR, while PRA could differentiate between poor- and better-prognosis cases with PUV. Conclusions: Ultrasound was neither uniformly useful in diagnosing PUV antenatally, nor in differentiating it from cases with non-PUV hydronephrosis. In congenital hydronephrosis, cord blood PRA was significantly higher in cases with PUV compared to non-PUV cases and fell significantly after valve ablation. Cord blood PRA could distinguish between poor- and better-prognosis cases with PUV.

  10. Estimating Model Prediction Error: Should You Treat Predictions as Fixed or Random?

    Science.gov (United States)

    Wallach, Daniel; Thorburn, Peter; Asseng, Senthold; Challinor, Andrew J.; Ewert, Frank; Jones, James W.; Rotter, Reimund; Ruane, Alexander

    2016-01-01

    Crop models are important tools for impact assessment of climate change, as well as for exploring management options under current climate. It is essential to evaluate the uncertainty associated with predictions of these models. We compare two criteria of prediction error: MSEP_fixed, which evaluates mean squared error of prediction for a model with fixed structure, parameters and inputs, and MSEP_uncertain(X), which evaluates mean squared error averaged over the distributions of model structure, inputs and parameters. Comparison of model outputs with data can be used to estimate the former. The latter has a squared bias term, which can be estimated using hindcasts, and a model variance term, which can be estimated from a simulation experiment. The separate contributions to MSEP_uncertain(X) can be estimated using a random effects ANOVA. It is argued that MSEP_uncertain(X) is the more informative uncertainty criterion, because it is specific to each prediction situation.
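
The distinction between the two criteria can be illustrated with a toy model. The sketch below is not from the paper; the "crop model", the parameter distribution, and all numbers are invented for illustration. Because mean squared error is convex in the parameter, averaging over parameter uncertainty (the MSEP_uncertain idea) can only add a variance term on top of the fixed-parameter error (the MSEP_fixed idea):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "crop model": y = a * x, while the true process is y = 1.5 * x + noise.
def crop_model(x, a):
    return a * x

x = rng.uniform(0, 10, 200)
y_obs = 1.5 * x + rng.normal(0, 1.0, 200)

# MSEP_fixed: squared prediction error of one model with fixed
# structure/parameters, estimated by comparing outputs with observations.
a_fixed = 1.4
msep_fixed = np.mean((y_obs - crop_model(x, a_fixed)) ** 2)

# MSEP_uncertain(X): squared error averaged over a distribution of
# parameter values, mimicking parameter uncertainty.
a_samples = rng.normal(1.4, 0.2, 500)
preds = np.array([crop_model(x, a) for a in a_samples])   # shape (500, 200)
msep_uncertain = np.mean((y_obs[None, :] - preds) ** 2)

print(round(msep_fixed, 2), round(msep_uncertain, 2))
```

The second number exceeds the first by roughly the model variance term, which is what the random-effects decomposition in the abstract separates out.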

  11. Predicting Career Advancement with Structural Equation Modelling

    Science.gov (United States)

    Heimler, Ronald; Rosenberg, Stuart; Morote, Elsa-Sofia

    2012-01-01

    Purpose: The purpose of this paper is to use the authors' prior findings concerning basic employability skills in order to determine which skills best predict career advancement potential. Design/methodology/approach: Utilizing survey responses of human resource managers, the employability skills showing the largest relationships to career…

  13. Modeling and prediction of surgical procedure times

    NARCIS (Netherlands)

    P.S. Stepaniak (Pieter); C. Heij (Christiaan); G. de Vries (Guus)

    2009-01-01

    Accurate prediction of medical operation times is of crucial importance for cost-efficient operating room planning in hospitals. This paper investigates the possible dependence of procedure times on surgeon factors like age, experience, gender, and team composition. The effect of these factors…

  14. Prediction Model of Sewing Technical Condition by Grey Neural Network

    Institute of Scientific and Technical Information of China (English)

    DONG Ying; FANG Fang; ZHANG Wei-yuan

    2007-01-01

    Grey system theory and artificial neural network technology were applied to predict the sewing technical condition. Representative parameters, such as needle and stitch, were selected. The prediction model was established based on the different fabrics' mechanical properties as measured by the KES instrument. Grey relevant degree analysis was applied to choose the input parameters of the neural network. The results showed that the prediction model has good precision: the average relative error was 4.08% for needle and 4.25% for stitch.

  15. Active diagnosis of hybrid systems - A model predictive approach

    OpenAIRE

    2009-01-01

    A method for active diagnosis of hybrid systems is proposed. The main idea is to predict the future output of both the normal and faulty models of the system; then at each time step an optimization problem is solved with the objective of maximizing the difference between the predicted normal and faulty outputs, constrained by tolerable performance requirements. As in standard model predictive control, the first element of the optimal input is applied to the system and the whole procedure is repeated…
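
The core step - pick the admissible input that best separates the predicted normal and faulty responses - can be sketched in a few lines. Everything here (the first-order dynamics, the gain fault, the candidate inputs, the magnitude constraint) is a made-up illustration, not the paper's system:

```python
# Normal and faulty models of a simple first-order system:
# x_next = 0.9*x + b*u, where the fault halves the input gain b.
def step(x, u, b):
    return 0.9 * x + b * u

def active_diagnosis_input(x_normal, x_faulty, candidates, u_max=1.0):
    """Pick the input maximizing the predicted normal/faulty output gap,
    subject to a simple magnitude constraint (tolerable performance)."""
    best_u, best_gap = None, -1.0
    for u in candidates:
        if abs(u) > u_max:
            continue  # constraint: keep the excitation tolerable
        gap = abs(step(x_normal, u, b=1.0) - step(x_faulty, u, b=0.5))
        if gap > best_gap:
            best_u, best_gap = u, gap
    return best_u, best_gap

# From identical initial states, the largest admissible |u| is the most
# informative excitation; its first element would be applied, then re-planned.
u, gap = active_diagnosis_input(0.0, 0.0, candidates=[-2.0, -1.0, 0.0, 0.5, 1.0])
print(u, gap)
```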

  16. Evaluation of Fast-Time Wake Vortex Prediction Models

    Science.gov (United States)

    Proctor, Fred H.; Hamilton, David W.

    2009-01-01

    Current fast-time wake models are reviewed and three basic types are defined. Predictions from several of the fast-time models are compared. Previous statistical evaluations of the APA-Sarpkaya and D2P fast-time models are discussed. Root Mean Square errors between fast-time model predictions and Lidar wake measurements are examined for a 24 hr period at Denver International Airport. Shortcomings in current methodology for evaluating wake errors are also discussed.

  17. Interpreting predictive maps of disease: highlighting the pitfalls of distribution models in epidemiology

    Directory of Open Access Journals (Sweden)

    Nicola A. Wardrop

    2014-11-01

    Full Text Available The application of spatial modelling to epidemiology has increased significantly over the past decade, delivering enhanced understanding of the environmental and climatic factors affecting disease distributions and providing spatially continuous representations of disease risk (predictive maps). These outputs provide significant information for disease control programmes, allowing spatial targeting and tailored interventions. However, several factors (e.g. sampling protocols or temporal disease spread) can influence predictive mapping outputs. This paper proposes a conceptual framework which defines several scenarios and their potential impact on resulting predictive outputs, using simulated data to provide an exemplar. It is vital that researchers recognise these scenarios and their influence on predictive models and their outputs, as a failure to do so may lead to inaccurate interpretation of predictive maps. As long as these considerations are kept in mind, predictive mapping will continue to contribute significantly to epidemiological research and disease control planning.

  18. Comparison of Simple Versus Performance-Based Fall Prediction Models

    Directory of Open Access Journals (Sweden)

    Shekhar K. Gadkaree BS

    2015-05-01

    Full Text Available Objective: To compare the predictive ability of standard falls prediction models based on physical performance assessments with more parsimonious prediction models based on self-reported data. Design: We developed a series of fall prediction models progressing in complexity and compared area under the receiver operating characteristic curve (AUC) across models. Setting: National Health and Aging Trends Study (NHATS), which surveyed a nationally representative sample of Medicare enrollees (age ≥65) at baseline (Round 1: 2011-2012) and 1-year follow-up (Round 2: 2012-2013). Participants: In all, 6,056 community-dwelling individuals participated in Rounds 1 and 2 of NHATS. Measurements: Primary outcomes were 1-year incidence of “any fall” and “recurrent falls.” Prediction models were compared and validated in development and validation sets, respectively. Results: A prediction model that included demographic information, self-reported problems with balance and coordination, and previous fall history was the most parsimonious model that optimized AUC for both any fall (AUC = 0.69, 95% confidence interval [CI] = [0.67, 0.71]) and recurrent falls (AUC = 0.77, 95% CI = [0.74, 0.79]) in the development set. Physical performance testing provided a marginal additional predictive value. Conclusion: A simple clinical prediction model that does not include physical performance testing could facilitate routine, widespread falls risk screening in the ambulatory care setting.
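
The model comparison here rests on the AUC, which has a simple rank interpretation: the probability that a randomly chosen faller receives a higher predicted risk than a randomly chosen non-faller. A minimal sketch (the scores and outcomes below are invented for illustration, not NHATS data):

```python
def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the probability that a random positive outscores a random negative,
    counting ties as half a win."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Self-reported factors (age, balance problems, prior falls) combined into
# a risk score; the numbers here are purely illustrative.
scores = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2]
falls  = [1,   1,   0,   1,   0,   0]
print(auc(scores, falls))  # 8 of 9 positive/negative pairs correctly ordered
```

An AUC of 0.5 is chance; the abstract's 0.69 and 0.77 sit between chance and perfect discrimination.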

  19. Testing and analysis of internal hardwood log defect prediction models

    Science.gov (United States)

    R. Edward. Thomas

    2011-01-01

    The severity and location of internal defects determine the quality and value of lumber sawn from hardwood logs. Models have been developed to predict the size and position of internal defects based on external defect indicator measurements. These models were shown to predict approximately 80% of all internal knots based on external knot indicators. However, the size...

  1. Refining the Committee Approach and Uncertainty Prediction in Hydrological Modelling

    NARCIS (Netherlands)

    Kayastha, N.

    2014-01-01

    Due to the complexity of hydrological systems, a single model may be unable to capture the full range of a catchment response and accurately predict the streamflows. The multi-modelling approach opens up possibilities for handling such difficulties and allows improving the predictive capability of models…

  5. Adding propensity scores to pure prediction models fails to improve predictive performance

    Directory of Open Access Journals (Sweden)

    Amy S. Nowacki

    2013-08-01

    Full Text Available Background. Propensity score usage seems to be growing in popularity, leading researchers to question the possible role of propensity scores in prediction modeling, despite the lack of a theoretical rationale. It is suspected that such requests are due to the lack of differentiation regarding the goals of predictive modeling versus causal inference modeling. Therefore, the purpose of this study is to formally examine the effect of propensity scores on predictive performance. Our hypothesis is that a multivariable regression model that adjusts for all covariates will perform as well as or better than those models utilizing propensity scores with respect to model discrimination and calibration. Methods. The most commonly encountered statistical scenarios for medical prediction (logistic and proportional hazards regression) were used to investigate this research question. Random cross-validation was performed 500 times to correct for optimism. The multivariable regression models adjusting for all covariates were compared with models that included adjustment for or weighting with the propensity scores. The methods were compared based on three predictive performance measures: (1) concordance indices; (2) Brier scores; and (3) calibration curves. Results. Multivariable models adjusting for all covariates had the highest average concordance index, the lowest average Brier score, and the best calibration. Propensity score adjustment and inverse probability weighting models without adjustment for all covariates performed worse than full models and failed to improve predictive performance with full covariate adjustment. Conclusion. Propensity score techniques did not improve prediction performance measures beyond multivariable adjustment. Propensity scores are not recommended if the analytical goal is pure prediction modeling.

  6. Impact of modellers' decisions on hydrological a priori predictions

    Science.gov (United States)

    Holländer, H. M.; Bormann, H.; Blume, T.; Buytaert, W.; Chirico, G. B.; Exbrayat, J.-F.; Gustafsson, D.; Hölzel, H.; Krauße, T.; Kraft, P.; Stoll, S.; Blöschl, G.; Flühler, H.

    2014-06-01

    In practice, the catchment hydrologist is often confronted with the task of predicting discharge without having the needed records for calibration. Here, we report the discharge predictions of 10 modellers - using the model of their choice - for the man-made Chicken Creek catchment (6 ha, northeast Germany, Gerwin et al., 2009b) and we analyse how well they improved their predictions in three steps, with information added prior to each following step. The modellers predicted the catchment's hydrological response in its initial phase without having access to the observed records. They used conceptually different physically based models and their modelling experience differed largely. Hence, they encountered two problems: (i) to simulate discharge for an ungauged catchment and (ii) to use models that were developed for catchments which are not in a state of landscape transformation. The prediction exercise was organized in three steps: (1) for the first prediction the modellers received a basic data set describing the catchment to a degree somewhat more complete than usually available for a priori predictions of ungauged catchments; they did not obtain information on stream flow, soil moisture, or groundwater response and therefore had to guess the initial conditions; (2) before the second prediction they inspected the catchment on-site and discussed their first prediction attempt; (3) for their third prediction they were offered additional data by charging them pro forma with the costs for obtaining this additional information. Holländer et al. (2009) discussed the range of predictions obtained in step (1). Here, we detail the modellers' assumptions and decisions in accounting for the various processes. We document the prediction progress as well as the learning process resulting from the availability of added information. For the second and third steps, the progress in prediction quality is evaluated in relation to individual modelling experience and costs of…

  7. Econometric models for predicting confusion crop ratios

    Science.gov (United States)

    Umberger, D. E.; Proctor, M. H.; Clark, J. E.; Eisgruber, L. M.; Braschler, C. B. (Principal Investigator)

    1979-01-01

    Results for both the United States and Canada show that econometric models can provide estimates of confusion crop ratios that are more accurate than historical ratios. Whether these models can support the LACIE 90/90 accuracy criterion is uncertain. In the United States, experimenting with additional model formulations could provide improved models in some CRDs, particularly in winter wheat. Improved models may also be possible for the Canadian CDs. The more aggregated province/state models outperformed individual CD/CRD models. This result was expected partly because acreage statistics are based on sampling procedures, and the sampling precision declines from the province/state to the CD/CRD level. Declining sampling precision and the need to substitute province/state data for the CD/CRD data introduced measurement error into the CD/CRD models.

  8. A computational model that predicts behavioral sensitivity to intracortical microstimulation

    Science.gov (United States)

    Kim, Sungshin; Callier, Thierri; Bensmaia, Sliman J.

    2017-02-01

    Objective. Intracortical microstimulation (ICMS) is a powerful tool to investigate the neural mechanisms of perception and can be used to restore sensation for patients who have lost it. While sensitivity to ICMS has previously been characterized, no systematic framework has been developed to summarize the detectability of individual ICMS pulse trains or the discriminability of pairs of pulse trains. Approach. We develop a simple simulation that describes the responses of a population of neurons to a train of electrical pulses delivered through a microelectrode. We then perform an ideal observer analysis on the simulated population responses to predict the behavioral performance of non-human primates in ICMS detection and discrimination tasks. Main results. Our computational model can predict behavioral performance across a wide range of stimulation conditions with high accuracy (R² = 0.97) and generalizes to novel ICMS pulse trains that were not used to fit its parameters. Furthermore, the model provides a theoretical basis for the finding that amplitude discrimination based on ICMS violates Weber’s law. Significance. The model can be used to characterize the sensitivity to ICMS across the range of perceptible and safe stimulation regimes. As such, it will be a useful tool for both neuroscience and neuroprosthetics.
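
The modeling recipe - simulate population spiking responses to a pulse train, then apply an ideal observer to predict detection - can be caricatured as follows. The firing-rate numbers, population size, and decision rule here are invented stand-ins, not the paper's fitted model:

```python
import numpy as np

rng = np.random.default_rng(1)

def population_response(amplitude_uA, n_neurons=50, n_trials=200):
    """Simulated summed spike counts of a neural population driven by an
    ICMS train; the rate grows with pulse amplitude (illustrative numbers)."""
    rate = 2.0 + 0.05 * amplitude_uA
    return rng.poisson(rate, size=(n_trials, n_neurons)).sum(axis=1)

def detection_performance(amplitude_uA):
    """Ideal-observer detection: compare every stimulus trial against every
    catch (no-stimulation) trial; chance is 0.5, perfect detection is 1.0."""
    stim = population_response(amplitude_uA)
    catch = population_response(0.0)
    wins = (stim[:, None] > catch[None, :]).mean()
    ties = (stim[:, None] == catch[None, :]).mean()
    return wins + 0.5 * ties

# Detection probability should increase with stimulation amplitude.
p_low, p_high = detection_performance(10), detection_performance(80)
print(round(p_low, 2), round(p_high, 2))
```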

  9. Usability Prediction & Ranking of SDLC Models Using Fuzzy Hierarchical Usability Model

    Science.gov (United States)

    Gupta, Deepak; Ahlawat, Anil K.; Sagar, Kalpna

    2017-06-01

    Evaluation of software quality is an important aspect of controlling and managing software. By such evaluation, improvements in the software process can be made. Software quality is significantly dependent on software usability. Many researchers have proposed a number of usability models. Each model considers a set of usability factors but does not cover all the usability aspects. Practical implementation of these models is still missing, as there is a lack of a precise definition of usability. Also, it is very difficult to integrate these models into current software engineering practices. In order to overcome these challenges, this paper aims to define the term `usability' using the proposed hierarchical usability model with its detailed taxonomy. The taxonomy considers generic evaluation criteria for identifying the quality components, which brings together factors, attributes and characteristics defined in various HCI and software models. For the first time, the usability model is also implemented to predict more accurate usability values. The proposed system is named the fuzzy hierarchical usability model and can be easily integrated into current software engineering practices. In order to validate the work, a dataset of six software development life cycle models is created and employed. These models are ranked according to their predicted usability values. This research also focuses on a detailed comparison of the proposed model with the existing usability models.

  10. PEEX Modelling Platform for Seamless Environmental Prediction

    Science.gov (United States)

    Baklanov, Alexander; Mahura, Alexander; Arnold, Stephen; Makkonen, Risto; Petäjä, Tuukka; Kerminen, Veli-Matti; Lappalainen, Hanna K.; Ezau, Igor; Nuterman, Roman; Zhang, Wen; Penenko, Alexey; Gordov, Evgeny; Zilitinkevich, Sergej; Kulmala, Markku

    2017-04-01

    The Pan-Eurasian EXperiment (PEEX) is a multidisciplinary, multi-scale research programme started in 2012 and aimed at resolving the major uncertainties in Earth System Science and global sustainability issues concerning the Arctic and boreal Northern Eurasian regions and China. Such challenges include climate change, air quality, biodiversity loss, chemicalization, food supply, and the use of natural resources by mining, industry, energy production and transport. The research infrastructure introduces the current state-of-the-art modeling platform and observation systems in the Pan-Eurasian region and presents the future baselines for coherent and coordinated research infrastructures in the PEEX domain. The PEEX modelling platform is characterized by a complex, seamless, integrated Earth System Modelling (ESM) approach, in combination with specific models of different processes and elements of the system acting on different temporal and spatial scales. The ensemble approach is taken for the integration of modeling results from different models, participants and countries. PEEX utilizes the full potential of a hierarchy of models: scenario analysis, inverse modeling, and modeling based on measurement needs and processes. The models are validated and constrained by available in-situ and remote sensing data of various spatial and temporal scales using data assimilation and top-down modeling. The analyses of the anticipated large volumes of data produced by the available models and sensors will be supported by a dedicated virtual research environment developed for these purposes.

  11. Prediction and setup of phytoplankton statistical model of Qiandaohu Lake

    Institute of Scientific and Technical Information of China (English)

    严力蛟; 全为民; 赵晓慧

    2004-01-01

    This research considers the mathematical relationship between the concentration of Chla and seven environmental factors, i.e. lake water temperature (T), Secchi depth (SD), pH, DO, CODMn, total nitrogen (TN), and total phosphorus (TP). Stepwise linear regression of 1997 to 1999 monitoring data at each sampling point of Qiandaohu Lake yielded the multivariate regression models presented in this paper. The concentration of Chla simulated for the year 2000 by the regression model was similar to the observed values. The suggested mathematical relationship could be used to predict changes in the lake water environment at any point in time. The results showed that SD, TP and pH were the most significant factors affecting Chla concentration.
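
The underlying technique is ordinary multivariate linear regression of Chla on the environmental factors. A sketch with synthetic data (the coefficients and observations below are invented; only the factor list follows the abstract):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical monitoring data: columns stand for T, SD, pH, DO, CODMn, TN, TP.
X = rng.normal(size=(60, 7))
true_beta = np.array([0.3, -0.8, 0.5, 0.1, 0.0, 0.2, 1.1])  # SD, pH, TP dominate
chla = X @ true_beta + rng.normal(0, 0.1, 60)

# Multivariate least-squares fit with an intercept, as in a stepwise model's
# final stage once the significant factors are chosen.
X1 = np.column_stack([np.ones(len(X)), X])
beta_hat, *_ = np.linalg.lstsq(X1, chla, rcond=None)

# Goodness of fit: R^2 close to 1 means the linear model explains the data.
pred = X1 @ beta_hat
r2 = 1 - np.sum((chla - pred) ** 2) / np.sum((chla - chla.mean()) ** 2)
print(round(r2, 3))
```

A real stepwise procedure would also add/drop factors by significance tests; this sketch only shows the regression step.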

  12. Models Predicting Success of Infertility Treatment: A Systematic Review

    Science.gov (United States)

    Zarinara, Alireza; Zeraati, Hojjat; Kamali, Koorosh; Mohammad, Kazem; Shahnazari, Parisa; Akhondi, Mohammad Mehdi

    2016-01-01

    Background: Infertile couples are faced with problems that affect their marital life. Infertility treatment is expensive, time consuming, and sometimes simply not possible. Prediction models for infertility treatment have been proposed, and prediction of treatment success is a new field in infertility treatment. Because prediction of treatment success is a new need for infertile couples, this paper reviewed previous studies to capture a general picture of the models' applicability. Methods: This study was conducted as a systematic review at Avicenna Research Institute in 2015. Six databases were searched based on WHO definitions and MeSH key words. Papers about prediction models in infertility were evaluated. Results: Eighty-one papers were eligible for the study. Papers covered years after 1986 and studies were designed retrospectively and prospectively. IVF prediction models have the largest share among the papers. The most common predictors were age, duration of infertility, and ovarian and tubal problems. Conclusion: A prediction model can be clinically applied if it can be statistically evaluated and has good validation for treatment success. To achieve better results, physicians and couples need an estimate of the treatment success rate based on history, examination, and clinical tests. Models must be checked for theoretical approach and appropriate validation. The benefits of applying prediction models are decreased cost and time, avoidance of painful treatment for patients, assessment of the treatment approach for physicians, and support for decision making by health managers. Careful selection of the approach for designing and using these models is essential. PMID:27141461

  13. The regional prediction model of PM10 concentrations for Turkey

    Science.gov (United States)

    Güler, Nevin; Güneri İşçi, Öznur

    2016-11-01

    This study aimed to predict a regional model for weekly PM10 concentrations measured at air pollution monitoring stations in Turkey. There are seven geographical regions in Turkey and numerous monitoring stations in each region. Predicting a model conventionally for each monitoring station requires a lot of labor and time, and it may lead to degradation in the quality of prediction when the number of measurements obtained from a monitoring station is small. Besides, prediction models obtained this way only reflect the air pollutant behavior of a small area. This study uses the Fuzzy C-Auto Regressive Model (FCARM) in order to find a prediction model that reflects the regional behavior of weekly PM10 concentrations. The advantage of FCARM is its ability to simultaneously consider PM10 concentrations measured at the monitoring stations in the specified region. Besides, it also works even if the number of measurements obtained from the monitoring stations is different or small. In order to evaluate the performance of FCARM, FCARM is executed for all regions in Turkey and prediction results are compared to statistical autoregressive (AR) models fitted for each station separately. According to the Mean Absolute Percentage Error (MAPE) criterion, it is observed that FCARM provides better predictions with a smaller number of models.
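
The per-station baseline in this comparison is a conventional autoregressive model. A minimal AR(1) fit by least squares on a synthetic weekly PM10-like series (all numbers invented), with MAPE as the evaluation criterion:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic weekly PM10-like series: x_t = c + phi * x_{t-1} + noise.
true_phi, n = 0.7, 300
pm10 = np.empty(n)
pm10[0] = 40.0
for t in range(1, n):
    pm10[t] = 12.0 + true_phi * pm10[t - 1] + rng.normal(0, 2.0)

# Least-squares AR(1): regress x_t on x_{t-1} with an intercept.
X = np.column_stack([np.ones(n - 1), pm10[:-1]])
(c_hat, phi_hat), *_ = np.linalg.lstsq(X, pm10[1:], rcond=None)

# One-step-ahead predictions and Mean Absolute Percentage Error (MAPE).
pred = c_hat + phi_hat * pm10[:-1]
mape = np.mean(np.abs((pm10[1:] - pred) / pm10[1:])) * 100
print(round(phi_hat, 2), round(mape, 1))
```

FCARM replaces the per-station coefficients with fuzzy-clustered regional ones; the fitting and MAPE machinery stay the same.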

  14. Characterizing climate predictability and model response variability from multiple initial condition and multi-model ensembles

    CERN Document Server

    Kumar, Devashish

    2016-01-01

    Climate models are thought to solve boundary value problems unlike numerical weather prediction, which is an initial value problem. However, climate internal variability (CIV) is thought to be relatively important at near-term (0-30 year) prediction horizons, especially at higher resolutions. The recent availability of significant numbers of multi-model (MME) and multi-initial condition (MICE) ensembles allows for the first time a direct sensitivity analysis of CIV versus model response variability (MRV). Understanding the relative agreement and variability of MME and MICE ensembles for multiple regions, resolutions, and projection horizons is critical for focusing model improvements, diagnostics, and prognosis, as well as impacts, adaptation, and vulnerability studies. Here we find that CIV (MICE agreement) is lower (higher) than MRV (MME agreement) across all spatial resolutions and projection time horizons for both temperature and precipitation. However, CIV dominates MRV over higher latitudes generally an...

  15. Application of Model Predictive Control to BESS for Microgrid Control

    Directory of Open Access Journals (Sweden)

    Thai-Thanh Nguyen

    2015-08-01

    Full Text Available Battery energy storage systems (BESSs) have been widely used for microgrid control. Generally, BESS control systems are based on proportional-integral (PI) control techniques, with the outer and inner control loops based on PI regulators. Recently, model predictive control (MPC) has attracted attention for application to future energy processing and control systems because it can easily deal with multivariable cases, system constraints, and nonlinearities. This study considers the application of MPC-based BESSs to microgrid control. Two types of MPC are presented in this study: MPC based on predictive power control (PPC), and MPC based on PI control in the outer and predictive current control (PCC) in the inner control loops. In particular, the effective application of MPC for microgrids with multiple BESSs should be considered because of the differences in their control performance. In this study, microgrids with two BESSs based on the two MPC techniques are considered as an example. The control performance of the MPC used for microgrid control is compared to that of the PI control. The proposed control strategy is investigated through simulations using MATLAB/Simulink software. The simulation results show that the response time, power and voltage ripples, and frequency spectrum could be improved significantly by using MPC.
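
Predictive current control over a finite set of converter voltage levels - the inner-loop MPC flavor mentioned here - reduces to simulating one step ahead for each candidate level and applying the one that brings the current closest to its reference. The plant parameters and voltage levels below are illustrative, not from the paper:

```python
# One-step predictive current control (PCC) for an R-L branch driven by a
# converter with a finite set of voltage levels. Illustrative parameters.
R, L, dt = 0.1, 1e-3, 1e-4
levels = [-400.0, -200.0, 0.0, 200.0, 400.0]

def predict(i, v):
    # Euler discretization of L di/dt = v - R*i
    return i + dt / L * (v - R * i)

def pcc_step(i, i_ref):
    # Choose the voltage level minimizing the predicted current error.
    return min(levels, key=lambda v: abs(predict(i, v) - i_ref))

# Track a 50 A reference from zero current.
i, i_ref, history = 0.0, 50.0, []
for _ in range(20):
    v = pcc_step(i, i_ref)
    i = predict(i, v)
    history.append(i)

print(round(history[-1], 1))  # the current settles near the 50 A reference
```

A PI inner loop would instead accumulate the current error; the predictive loop needs no tuning beyond the model, which is part of MPC's appeal in the abstract.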

  16. Simplified Predictive Models for CO2 Sequestration Performance Assessment

    Science.gov (United States)

    Mishra, Srikanta; RaviGanesh, Priya; Schuetter, Jared; Mooney, Douglas; He, Jincong; Durlofsky, Louis

    2014-05-01

    …simulations, the LHS-based meta-model yields a more robust predictive model, as verified by a k-fold cross-validation approach. In the third category (RMM), we use a reduced-order modeling procedure that combines proper orthogonal decomposition (POD) for reducing problem dimensionality with trajectory piecewise linearization (TPWL) for extrapolating system response at new control points from a limited number of trial runs ("snapshots"). We observe significant savings in computational time with very good accuracy from the POD-TPWL reduced order model - which could be important in the context of history matching, uncertainty quantification and optimization problems. The paper will present results from our ongoing investigations, and also discuss future research directions and likely outcomes. This work was supported by U.S. Department of Energy National Energy Technology Laboratory award DE-FE0009051 and Ohio Department of Development grant D-13-02.
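
POD itself is a standard construction: the dominant left singular vectors of a snapshot matrix form the reduced basis onto which new states are projected. A sketch with synthetic snapshots (dimensions and mode count are arbitrary choices for the example; TPWL is not shown):

```python
import numpy as np

rng = np.random.default_rng(3)

# Snapshots of a high-dimensional state that actually lives on a
# 3-dimensional subspace, as assumed in POD-based model reduction.
n_state, n_snapshots, n_modes = 500, 40, 3
modes = rng.normal(size=(n_state, n_modes))
coeffs = rng.normal(size=(n_modes, n_snapshots))
snapshots = modes @ coeffs

# POD: SVD of the snapshot matrix; keep the dominant left singular vectors.
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
basis = U[:, :n_modes]

# Project a new state into the reduced space and reconstruct it.
x_new = modes @ rng.normal(size=n_modes)
x_rec = basis @ (basis.T @ x_new)

rel_err = np.linalg.norm(x_new - x_rec) / np.linalg.norm(x_new)
print(rel_err)  # tiny: 3 modes capture the snapshot subspace
```

In a real reservoir model the subspace is only approximate, and the singular value spectrum tells how many modes to retain.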

  17. Nonlinear model predictive control of a packed distillation column

    Energy Technology Data Exchange (ETDEWEB)

    Patwardhan, A.A.; Edgar, T.F. (Univ. of Texas, Austin, TX (United States). Dept. of Chemical Engineering)

    1993-10-01

    A rigorous dynamic model based on fundamental chemical engineering principles was formulated for a packed distillation column separating a mixture of cyclohexane and n-heptane. This model was simplified to a form suitable for use in on-line model predictive control calculations. A packed distillation column was operated at several operating conditions to estimate two unknown model parameters in the rigorous and simplified models. The actual column response to step changes in the feed rate, distillate rate, and reboiler duty agreed well with dynamic model predictions. One unusual characteristic observed was that the packed column exhibited gain-sign changes, which are very difficult to treat using conventional linear feedback control. Nonlinear model predictive control was used to control the distillation column at an operating condition where the process gain changed sign. An on-line, nonlinear model-based scheme was used to estimate unknown/time-varying model parameters.

  18. A Building Model Framework for a Genetic Algorithm Multi-objective Model Predictive Control

    DEFF Research Database (Denmark)

    Arendt, Krzysztof; Ionesi, Ana; Jradi, Muhyiddine;

    Model Predictive Control (MPC) of building systems is a promising approach to optimize building energy performance. In contrast to traditional control strategies, which are reactive in nature, MPC optimizes the utilization of resources based on the predicted effects. It has been shown that the energy savings potential of this technique can reach up to 40% compared to conventional control strategies, depending on the particular building type. However, the effort needed to implement MPC in buildings is significant and often considered prohibitive, which is why, until now, fully-functional MPC has been...

  19. Application of Nonlinear Predictive Control Based on RBF Network Predictive Model in MCFC Plant

    Institute of Scientific and Technical Information of China (English)

    CHEN Yue-hua; CAO Guang-yi; ZHU Xin-jian

    2007-01-01

    This paper describes a nonlinear model predictive controller for regulating a molten carbonate fuel cell (MCFC). A detailed mechanism model of the output voltage of an MCFC is presented first. However, this model is too complicated to be used in a control system. Consequently, an offline radial basis function (RBF) network was introduced to build a nonlinear predictive model. The optimal control sequences were then obtained by applying the golden mean method. The models and controller have been realized in the MATLAB environment. Simulation results indicate that the proposed algorithm exhibits a satisfying control effect even when the current densities vary widely.
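The "golden mean method" named above is a one-dimensional golden-section search over the control input. A minimal sketch, with a hypothetical quadratic cost standing in for the MCFC predictive-model cost function:

```python
import math

def golden_section_min(f, a, b, tol=1e-6):
    """Minimize a unimodal function f on [a, b] by golden-section search."""
    invphi = (math.sqrt(5) - 1) / 2  # 1/phi, about 0.618
    c, d = b - invphi * (b - a), a + invphi * (b - a)
    while abs(b - a) > tol:
        if f(c) < f(d):        # minimum lies in [a, d]
            b, d = d, c
            c = b - invphi * (b - a)
        else:                  # minimum lies in [c, b]
            a, c = c, d
            d = a + invphi * (b - a)
    return (a + b) / 2

# Hypothetical control cost: quadratic penalty around a set-point of 0.75
u_opt = golden_section_min(lambda u: (u - 0.75) ** 2 + 0.1, 0.0, 1.0)
```

Each iteration shrinks the bracketing interval by the golden ratio, so the optimum is located to the requested tolerance in a few dozen cost evaluations, which is why it suits per-step control optimization.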

  20. Efficient Control of Nonlinear Noise-Corrupted Systems Using a Novel Model Predictive Control Framework

    OpenAIRE

    Weissel, Florian; Huber, Marco F.; Hanebeck, Uwe D.

    2007-01-01

    Model identification and measurement acquisition are always to some degree uncertain. Therefore, a framework for Nonlinear Model Predictive Control (NMPC) is proposed that explicitly considers the noise influence on nonlinear dynamic systems with continuous state spaces and a finite set of control inputs in order to significantly increase the control quality. Integral parts of NMPC are the prediction of system states over a finite horizon as well as the problem-specific modeling of reward func...

  1. A new ensemble model for short term wind power prediction

    DEFF Research Database (Denmark)

    Madsen, Henrik; Albu, Razvan-Daniel; Felea, Ioan;

    2012-01-01

    As the objective of this study, a non-linear ensemble system is used to develop a new model for predicting wind speed on a short-term time scale. Short-term wind power prediction has become an extremely important field of research for the energy sector. Despite recent advancements in prediction-model research, it has been observed that different models have different capabilities and that no single model is suitable under all situations. The idea behind EPS (ensemble prediction systems) is to take advantage of the unique features of each subsystem to capture the diverse patterns that exist in the dataset. The presented results show that the prediction errors can be decreased while the computation time is reduced.

  2. Improving Environmental Model Calibration and Prediction

    Science.gov (United States)

    2011-01-18


  3. Can multivariate models based on MOAKS predict OA knee pain? Data from the Osteoarthritis Initiative

    Science.gov (United States)

    Luna-Gómez, Carlos D.; Zanella-Calzada, Laura A.; Galván-Tejada, Jorge I.; Galván-Tejada, Carlos E.; Celaya-Padilla, José M.

    2017-03-01

    Osteoarthritis is the most common rheumatic disease in the world. Knee pain is its most disabling symptom, and the prediction of pain is one of the targets of preventive medicine; such prediction can be applied to new therapies or treatments. Using magnetic resonance imaging and the grading scales, a multivariate model based on genetic algorithms is presented. A predictive model can be useful to associate minor structural changes in the joint with future knee pain. Results suggest that multivariate models can be predictive of future chronic knee pain. All models (T0, T1 and T2) were statistically significant.

  4. LINEAR LAYER AND GENERALIZED REGRESSION COMPUTATIONAL INTELLIGENCE MODELS FOR PREDICTING SHELF LIFE OF PROCESSED CHEESE

    Directory of Open Access Journals (Sweden)

    S. Goyal

    2012-03-01

    Full Text Available This paper highlights the significance of computational intelligence models for predicting the shelf life of processed cheese stored at 7-8 °C. Linear Layer and Generalized Regression models were developed with soluble nitrogen, pH, standard plate count, yeast & mould count, and spores as input parameters, and sensory score as the output parameter. Mean Square Error, Root Mean Square Error, the Coefficient of Determination and the Nash-Sutcliffe Coefficient were used to compare the prediction ability of the models. The study revealed that Generalized Regression computational intelligence models are quite effective in predicting the shelf life of processed cheese stored at 7-8 °C.
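The error measures named in this abstract are straightforward to compute. A small sketch with hypothetical sensory-score data (MSE, RMSE, and the Nash-Sutcliffe coefficient, which in this observed-vs-predicted form is algebraically close to the coefficient of determination):

```python
import math

def mse(obs, pred):
    """Mean squared error between observed and predicted values."""
    return sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs)

def rmse(obs, pred):
    """Root mean squared error."""
    return math.sqrt(mse(obs, pred))

def nash_sutcliffe(obs, pred):
    """Nash-Sutcliffe efficiency: 1 is perfect, 0 means no better than the mean."""
    mean_o = sum(obs) / len(obs)
    ss_res = sum((o - p) ** 2 for o, p in zip(obs, pred))
    ss_tot = sum((o - mean_o) ** 2 for o in obs)
    return 1.0 - ss_res / ss_tot

# Hypothetical sensory scores over storage time (illustration only)
obs  = [6.2, 6.0, 5.8, 5.5, 5.1]
pred = [6.1, 6.0, 5.9, 5.4, 5.2]
nse = nash_sutcliffe(obs, pred)
```

Reporting several complementary measures, as the paper does, guards against a model that scores well on one metric by accident of scale.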

  5. Model Predictive Control for Smart Energy Systems

    DEFF Research Database (Denmark)

    Halvgaard, Rasmus

    load shifting capabilities of the units that adapt to the given price predictions. We furthermore evaluated control performance in terms of economic savings for different control strategies and forecasts. Chapter 5 describes and compares the proposed large-scale Aggregator control strategies. Aggregators are assumed to play an important role in the future Smart Grid and to coordinate a large portfolio of units. The developed economic MPC controllers interface each unit directly to an Aggregator. We developed several MPC-based aggregation strategies that coordinate the global behavior of a portfolio...

  6. Predictive significance of standardized uptake value parameters of FDG-PET in patients with non-small cell lung carcinoma

    Energy Technology Data Exchange (ETDEWEB)

    Duan, X-Y.; Wang, W.; Li, M.; Li, Y.; Guo, Y-M. [PET-CT Center, The First Affiliated Hospital of Xi' an, Jiaotong University, Xi' an, Shaanxi (China)

    2015-02-03

    {sup 18}F-fluoro-2-deoxyglucose (FDG) positron emission tomography (PET)/computed tomography (CT) is widely used to diagnose and stage non-small cell lung cancer (NSCLC). The aim of this retrospective study was to evaluate the predictive ability of different FDG standardized uptake values (SUVs) in 74 patients with newly diagnosed NSCLC. {sup 18}F-FDG PET/CT scans were performed, different SUV parameters (SUV{sub max}, SUV{sub avg}, SUV{sub T/L}, and SUV{sub T/A}) obtained, and their relationships with clinical characteristics investigated. Meanwhile, correlation and multiple stepwise regression analyses were performed to determine the primary predictor of SUVs for NSCLC. Age, gender, and tumor size significantly affected SUV parameters. The mean SUVs of squamous cell carcinoma were higher than those of adenocarcinoma. Poorly differentiated tumors exhibited higher SUVs than well-differentiated ones. Further analyses based on the pathologic type revealed that the SUV{sub max}, SUV{sub avg}, and SUV{sub T/L} of poorly differentiated adenocarcinoma tumors were higher than those of moderately or well-differentiated tumors. Among these four SUV parameters, SUV{sub T/L} was the primary predictor for tumor differentiation. However, in adenocarcinoma, SUV{sub max} was the determining factor for tumor differentiation. Our results showed that these four SUV parameters had predictive significance related to NSCLC tumor differentiation; SUV{sub T/L} appeared to be most useful overall, but SUV{sub max} was the best index for adenocarcinoma tumor differentiation.

  7. Comparing predictive validity of four ballistic swing phase models of human walking.

    Science.gov (United States)

    Selles, R W; Bussmann, J B; Wagenaar, R C; Stam, H J

    2001-09-01

    It is unclear to what extent ballistic walking models can be used to qualitatively predict the swing phase at comfortable walking speed. Different study findings regarding the accuracy of the predictions of the swing phase kinematics may have been caused by differences in (1) kinematic input, (2) model characteristics (e.g. the number of segments), and (3) evaluation criteria. In the present study, the predictive validity of four ballistic swing phase models was evaluated and compared, that is, (1) the ballistic walking model as originally introduced by Mochon and McMahon, (2) an extended version of this model in which heel-off of the stance leg is added, (3) a double pendulum model, consisting of a two-segment swing leg with a prescribed hip trajectory, and (4) a shank pendulum model consisting of a shank and rigidly attached foot with a prescribed knee trajectory. The predictive validity was evaluated by comparing the outcome of the model simulations with experimentally derived swing phase kinematics of six healthy subjects. In all models, statistically significant differences were found between model output and experimental data. All models underestimated swing time and step length. In addition, statistically significant differences were found between the output of the different models. The present study shows that although qualitative similarities exist between the ballistic models and normal gait at comfortable walking speed, these models cannot adequately predict swing phase kinematics.

  8. Combining logistic regression and neural networks to create predictive models.

    OpenAIRE

    Spackman, K. A.

    1992-01-01

    Neural networks are being used widely in medicine and other areas to create predictive models from data. The statistical method that most closely parallels neural networks is logistic regression. This paper outlines some ways in which neural networks and logistic regression are similar, shows how a small modification of logistic regression can be used in the training of neural network models, and illustrates the use of this modification for variable selection and predictive model building wit...
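The parallel drawn here between logistic regression and neural networks can be made concrete: logistic regression is exactly a single sigmoid neuron trained on the cross-entropy loss. A minimal sketch on hypothetical one-feature data:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_logistic(X, y, lr=0.5, epochs=2000):
    """Logistic regression as a single sigmoid neuron with gradient descent."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi   # gradient of the cross-entropy loss w.r.t. the logit
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

# Linearly separable toy data: class 1 roughly when x > 1
X = [[0.0], [0.5], [1.5], [2.0]]
y = [0, 0, 1, 1]
w, b = train_logistic(X, y)
p_low  = sigmoid(w[0] * 0.2 + b)   # should fall below 0.5
p_high = sigmoid(w[0] * 1.8 + b)   # should fall above 0.5
```

Replacing the single neuron with a hidden layer gives the neural-network generalization the paper discusses, while keeping the same output unit and loss.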

  9. Integrated hydro-bacterial modelling for predicting bathing water quality

    Science.gov (United States)

    Huang, Guoxian; Falconer, Roger A.; Lin, Binliang

    2017-03-01

    In recent years health risks associated with the non-compliance of bathing water quality have received increasing worldwide attention. However, it is particularly challenging to establish the source of any non-compliance, due to the complex nature of the sources of faecal indicator organisms, the fate and delivery processes, and the scarcity of field-measured data in many catchments and estuaries. In the current study an integrated hydro-bacterial model, linking catchment, 1-D and 2-D models, was developed to simulate the adsorption-desorption processes of faecal bacteria to and from sediment particles in river, estuarine and coastal waters, respectively. The model was then validated using hydrodynamic, sediment and faecal bacteria concentration data measured in 2012 in the Ribble river and estuary, and along the Fylde coast, UK. Particular emphasis has been placed on the mechanism of faecal bacteria transport and decay through the deposition and resuspension of suspended sediments. The results showed that by coupling the E. coli concentration with the sediment transport processes, the accuracy of the predicted E. coli levels was improved. A series of scenario runs were then carried out to investigate the impacts of different management scenarios on E. coli concentration levels at the coastal bathing water sites around Liverpool Bay, UK. The model results show that the level of compliance with the new EU bathing water standards can be improved significantly by extending outfalls and/or reducing urban sources by typically 50%.

  10. A thermodynamic model to predict wax formation in petroleum fluids

    Energy Technology Data Exchange (ETDEWEB)

    Coutinho, J.A.P. [Universidade de Aveiro (Portugal). Dept. de Quimica. Centro de Investigacao em Quimica]. E-mail: jcoutinho@dq.ua.pt; Pauly, J.; Daridon, J.L. [Universite de Pau et des Pays de l' Adour, Pau (France). Lab. des Fluides Complexes

    2001-12-01

    Some years ago the authors proposed a model for the non-ideality of the solid phase, based on the Predictive Local Composition concept. This was first applied to the Wilson equation and later extended to the NRTL and UNIQUAC models. Predictive UNIQUAC proved to be extraordinarily successful in predicting the behaviour of both model and real hydrocarbon fluids at low temperatures. This work illustrates the ability of Predictive UNIQUAC in the description of the low temperature behaviour of petroleum fluids. It will be shown that using Predictive UNIQUAC in the description of the solid phase non-ideality a complete prediction of the low temperature behaviour of synthetic paraffin solutions, fuels and crude oils is achieved. The composition of both liquid and solid phases, the amount of crystals formed and the cloud points are predicted within the accuracy of the experimental data. The extension of Predictive UNIQUAC to high pressures, by coupling it with an EOS/G{sup E} model based on the SRK EOS used with the LCVM mixing rule, is proposed and predictions of phase envelopes for live oils are compared with experimental data. (author)

  11. A THERMODYNAMIC MODEL TO PREDICT WAX FORMATION IN PETROLEUM FLUIDS

    Directory of Open Access Journals (Sweden)

    J.A.P. Coutinho

    2001-12-01

    Full Text Available Some years ago the authors proposed a model for the non-ideality of the solid phase, based on the Predictive Local Composition concept. This was first applied to the Wilson equation and later extended to the NRTL and UNIQUAC models. Predictive UNIQUAC proved to be extraordinarily successful in predicting the behaviour of both model and real hydrocarbon fluids at low temperatures. This work illustrates the ability of Predictive UNIQUAC in the description of the low temperature behaviour of petroleum fluids. It will be shown that using Predictive UNIQUAC in the description of the solid phase non-ideality a complete prediction of the low temperature behaviour of synthetic paraffin solutions, fuels and crude oils is achieved. The composition of both liquid and solid phases, the amount of crystals formed and the cloud points are predicted within the accuracy of the experimental data. The extension of Predictive UNIQUAC to high pressures, by coupling it with an EOS/G E model based on the SRK EOS used with the LCVM mixing rule, is proposed and predictions of phase envelopes for live oils are compared with experimental data.

  12. Prediction and assimilation of surf-zone processes using a Bayesian network: Part I: Forward models

    Science.gov (United States)

    Plant, Nathaniel G.; Holland, K. Todd

    2011-01-01

    Prediction of coastal processes, including waves, currents, and sediment transport, can be obtained from a variety of detailed geophysical-process models with many simulations showing significant skill. This capability supports a wide range of research and applied efforts that can benefit from accurate numerical predictions. However, the predictions are only as accurate as the data used to drive the models and, given the large temporal and spatial variability of the surf zone, inaccuracies in data are unavoidable such that useful predictions require corresponding estimates of uncertainty. We demonstrate how a Bayesian-network model can be used to provide accurate predictions of wave-height evolution in the surf zone given very sparse and/or inaccurate boundary-condition data. The approach is based on a formal treatment of a data-assimilation problem that takes advantage of significant reduction of the dimensionality of the model system. We demonstrate that predictions of a detailed geophysical model of the wave evolution are reproduced accurately using a Bayesian approach. In this surf-zone application, forward prediction skill was 83%, and uncertainties in the model inputs were accurately transferred to uncertainty in output variables. We also demonstrate that if modeling uncertainties were not conveyed to the Bayesian network (i.e., perfect data or model were assumed), then overly optimistic prediction uncertainties were computed. More consistent predictions and uncertainties were obtained by including model-parameter errors as a source of input uncertainty. Improved predictions (skill of 90%) were achieved because the Bayesian network simultaneously estimated optimal parameters while predicting wave heights.
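At its core, a Bayesian-network node update is the discrete Bayes rule: posterior proportional to prior times likelihood. A toy sketch with hypothetical wave-height bins (not the paper's actual network or data):

```python
def bayes_update(prior, likelihood):
    """Discrete Bayes rule: posterior[i] ∝ prior[i] * likelihood[i], renormalized."""
    post = [p * l for p, l in zip(prior, likelihood)]
    z = sum(post)
    return [p / z for p in post]

# Hypothetical wave-height bins (m) with a vague, uniform prior
bins = [0.5, 1.0, 1.5, 2.0]
prior = [0.25, 0.25, 0.25, 0.25]
# Likelihood of one noisy offshore observation under each bin (illustrative)
likelihood = [0.05, 0.20, 0.60, 0.15]
posterior = bayes_update(prior, likelihood)
```

The posterior spread, not just its peak, is what lets the network report the prediction uncertainties emphasized in the abstract; feeding in "perfect" likelihoods collapses that spread, matching the paper's warning about overly optimistic uncertainties.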

  13. Predicting and Modelling of Survival Data when Cox's Regression Model does not hold

    DEFF Research Database (Denmark)

    Scheike, Thomas H.; Zhang, Mei-Jie

    2002-01-01

    Aalen model; additive risk model; counting processes; competing risk; Cox regression; flexible modeling; goodness of fit; prediction of survival; survival analysis; time-varying effects

  14. Predictive error analysis for a water resource management model

    Science.gov (United States)

    Gallagher, Mark; Doherty, John

    2007-02-01

    Summary: In calibrating a model, a set of parameters is assigned to the model which will be employed for the making of all future predictions. If these parameters are estimated through solution of an inverse problem, formulated to be properly posed through either pre-calibration or mathematical regularisation, then solution of this inverse problem will, of necessity, lead to a simplified parameter set that omits the details of reality, while still fitting historical data acceptably well. Furthermore, estimates of parameters so obtained will be contaminated by measurement noise. Both of these phenomena will lead to errors in predictions made by the model, with the potential for error increasing with the hydraulic property detail on which the prediction depends. Integrity of model usage demands that model predictions be accompanied by some estimate of the possible errors associated with them. The present paper applies theory developed in a previous work to the analysis of predictive error associated with a real world, water resource management model. The analysis offers many challenges, including the fact that the model is a complex one that was partly calibrated by hand. Nevertheless, it is typical of models which are commonly employed as the basis for the making of important decisions, and for which such an analysis must be made. The potential errors associated with point-based and averaged water level and creek inflow predictions are examined, together with the dependence of these errors on the amount of averaging involved. Error variances associated with predictions made by the existing model are compared with "optimized error variances" that could have been obtained had calibration been undertaken in such a way as to minimize predictive error variance. The contributions by different parameter types to the overall error variance of selected predictions are also examined.

  15. Models for short term malaria prediction in Sri Lanka

    Directory of Open Access Journals (Sweden)

    Galappaththy Gawrie NL

    2008-05-01

    Full Text Available Abstract Background: Malaria in Sri Lanka is unstable and fluctuates in intensity both spatially and temporally. Although the case counts are dwindling at present, given the past history of resurgence of outbreaks despite effective control measures, the control programmes have to stay prepared. The availability of long time series of monitored/diagnosed malaria cases allows for the study of forecasting models, with the aim of developing a forecasting system which could assist in the efficient allocation of resources for malaria control. Methods: Exponentially weighted moving average models, autoregressive integrated moving average (ARIMA) models with seasonal components, and seasonal multiplicative autoregressive integrated moving average (SARIMA) models were compared on monthly time series of district malaria cases for their ability to predict the number of malaria cases one to four months ahead. The addition of covariates such as the number of malaria cases in neighbouring districts or rainfall was assessed for its ability to improve prediction of selected (seasonal) ARIMA models. Results: The best model for forecasting and the forecasting error varied strongly among the districts. The addition of rainfall as a covariate improved prediction of selected (seasonal) ARIMA models modestly in some districts but worsened prediction in others. Improvement from adding rainfall was more frequent at larger forecasting horizons. Conclusion: The heterogeneity of malaria patterns in Sri Lanka requires regionally specific prediction models. Prediction error was large, at a minimum of 22% (for one of the districts) for one-month-ahead predictions. The modest improvement made in short-term prediction by adding rainfall as a covariate to these prediction models may not be sufficient to merit investing in a forecasting system for which rainfall data are routinely processed.
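The simplest of the compared models, the exponentially weighted moving average, fits in a few lines. The monthly case counts below are hypothetical illustration data, not Sri Lankan surveillance figures:

```python
def ewma_forecast(series, alpha=0.3):
    """Simple exponential smoothing: returns the smoothed level after each month.

    The final level serves as the one-step-ahead forecast for the next month.
    """
    level = series[0]
    levels = [level]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
        levels.append(level)
    return levels

# Hypothetical monthly district case counts (illustration only)
cases = [120, 135, 150, 160, 140, 130]
next_month_forecast = ewma_forecast(cases)[-1]
```

The smoothing constant alpha trades responsiveness against noise suppression; the seasonal (S)ARIMA models in the study extend this idea with autoregressive, differencing, and seasonal terms.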

  16. Aggregate driver model to enable predictable behaviour

    Science.gov (United States)

    Chowdhury, A.; Chakravarty, T.; Banerjee, T.; Balamuralidhar, P.

    2015-09-01

    The categorization of driving styles, particularly in terms of aggressiveness and skill, is an emerging area of interest under the broader theme of intelligent transportation. There are two possible discriminatory techniques that can be applied for such categorization: a micro-scale (event based) model and a macro-scale (aggregate) model. It is believed that an aggregate model will reveal many interesting aspects of human-machine interaction; for example, we may be able to understand the propensities of individuals to carry out a given task over longer periods of time. A useful driver model may include the adaptive capability of the human driver, aggregated as the individual propensity to control speed/acceleration. Towards that objective, we carried out experiments by deploying a smartphone-based application for data collection by a group of drivers. Data were primarily collected from GPS measurements, including position and speed on a second-by-second basis, for a number of trips over a two-month period. Analysing the dataset, aggregate models for individual drivers were created and their natural aggressiveness was deduced. In this paper, we present the initial results for 12 drivers. It is shown that the higher-order moments of the acceleration profile are important parameters and identifiers of journey quality. It is also observed that the kurtosis of the acceleration profile carries major information about driving style. These observations lead to two different ranking systems based on acceleration data. Such driving behaviour models can be integrated with vehicle and road models and used to generate behavioural models for real traffic scenarios.
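The kurtosis statistic highlighted in this abstract can be computed directly from an acceleration trace. A sketch with two hypothetical traces, where the jerky trace's heavy tails yield the higher excess kurtosis:

```python
def kurtosis(xs):
    """Excess kurtosis of a signal (0 for a Gaussian, positive for heavy tails)."""
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n   # second central moment
    m4 = sum((x - mean) ** 4 for x in xs) / n   # fourth central moment
    return m4 / (m2 ** 2) - 3.0

# Hypothetical acceleration traces (m/s^2): smooth driving vs. rare hard events
smooth = [0.1, -0.1, 0.2, -0.2, 0.1, -0.1, 0.2, -0.2]
jerky  = [0.0, 0.0, 0.0, 3.5, 0.0, 0.0, -3.5, 0.0]
```

High kurtosis flags a driver whose acceleration is mostly calm but punctuated by harsh braking or surging, which is exactly the aggregate signature the paper uses for ranking.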

  17. Noncausal spatial prediction filtering based on an ARMA model

    Institute of Scientific and Technical Information of China (English)

    Liu Zhipeng; Chen Xiaohong; Li Jingye

    2009-01-01

    Conventional f-x prediction filtering methods are based on an autoregressive model. The error section is first computed as source noise but is removed as additive noise to obtain the signal, which results in an assumption inconsistency before and after filtering. In this paper, an autoregressive moving-average (ARMA) model is employed to avoid this inconsistency. Based on the ARMA model, a noncausal prediction filter is computed and a self-deconvolved projection filter is used for estimating additive noise in order to suppress random noise. The 1-D ARMA model is also extended to the 2-D spatial domain, which is the basis for noncausal spatial prediction filtering for random noise attenuation on 3-D seismic data. Synthetic and field data processing indicates that this method can suppress random noise more effectively while preserving the signal, and performs much better than other conventional prediction filtering methods.
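A prediction filter of the autoregressive kind discussed here reduces, in 1-D, to a least-squares fit of prediction coefficients. A minimal AR(2) sketch, where a noise-free sinusoid (which obeys an exact AR(2) recursion) stands in for a seismic trace:

```python
import math

def fit_ar2(x):
    """Least-squares fit of an AR(2) prediction filter x[t] ≈ a1*x[t-1] + a2*x[t-2].

    Solves the 2x2 normal equations directly by Cramer's rule.
    """
    s11 = s12 = s22 = b1 = b2 = 0.0
    for t in range(2, len(x)):
        s11 += x[t - 1] * x[t - 1]
        s12 += x[t - 1] * x[t - 2]
        s22 += x[t - 2] * x[t - 2]
        b1  += x[t] * x[t - 1]
        b2  += x[t] * x[t - 2]
    det = s11 * s22 - s12 * s12
    return (b1 * s22 - b2 * s12) / det, (b2 * s11 - b1 * s12) / det

# A sinusoid satisfies x[t] = 2cos(w)*x[t-1] - x[t-2] exactly
w = 0.3
x = [math.sin(w * t) for t in range(50)]
a1, a2 = fit_ar2(x)
```

On real data the residual of such a fit is the "error section" the abstract describes; the ARMA formulation then treats that residual consistently as additive noise rather than source noise.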

  18. Performance Predictable ServiceBSP Model for Grid Computing

    Institute of Scientific and Technical Information of China (English)

    TONG Weiqin; MIAO Weikai

    2007-01-01

    This paper proposes a performance prediction model for the grid computing model ServiceBSP to support developing high-quality applications in grid environments. In the ServiceBSP model, the agents carrying computing tasks are dispatched to the local domain of the selected computation services. Using an IP (integer programming) approach, the Service Selection Agent selects the computation services with globally optimized QoS (quality of service). The performance of a ServiceBSP application can be predicted according to the performance prediction model based on the QoS of the selected services. The performance prediction model can help users analyze their applications and improve them by optimizing the factors that affect performance. The experiment shows that the Service Selection Agent can provide ServiceBSP users with satisfactory application QoS.

  19. Two Predictions of a Compound Cue Model of Priming

    OpenAIRE

    Walenski, Matthew

    2003-01-01

    This paper examines two predictions of the compound cue model of priming (Ratcliff and McKoon, 1988). While this model has been used to provide an account of a wide range of priming effects, it may not actually predict priming in these or other circumstances. In order to predict priming effects, the compound cue model relies on an assumption that all items have the same number of associates. This assumption may be true in only a restricted number of cases. This paper demonstrates that when th...

  20. Predictive significance of DNA damage and repair biomarkers in triple-negative breast cancer patients treated with neoadjuvant chemotherapy: An exploratory analysis.

    Science.gov (United States)

    Vici, Patrizia; Di Benedetto, Anna; Ercolani, Cristiana; Pizzuti, Laura; Di Lauro, Luigi; Sergi, Domenico; Sperati, Francesca; Terrenato, Irene; Dattilo, Rosanna; Botti, Claudio; Fabi, Alessandra; Ramieri, Maria Teresa; Mentuccia, Lucia; Marinelli, Camilla; Iezzi, Laura; Gamucci, Teresa; Natoli, Clara; Vitale, Ilio; Barba, Maddalena; Mottolese, Marcella; De Maria, Ruggero; Maugeri-Saccà, Marcello

    2015-12-15

    Response of cancer cells to chemotherapy-induced DNA damage is regulated by the ATM-Chk2 and ATR-Chk1 pathways. We investigated the association between phosphorylated H2AX (γ-H2AX), a marker of DNA double-strand breaks that trigger the ATM-Chk2 cascade, and phosphorylated Chk1 (pChk1), with pathological complete response (pCR) in triple-negative breast cancer (TNBC) patients treated with neoadjuvant chemotherapy. γ-H2AX and pChk1 were retrospectively assessed by immunohistochemistry in a series of pretreatment biopsies related to 66 patients. In fifty-three tumors hormone receptor status was negative in both the diagnostic biopsies and residual cancers, whereas in 13 cases there was a slight hormone receptor expression that changed after chemotherapy. Internal validation was carried out. In the entire cohort elevated levels of γ-H2AX, but not pChk1, were associated with reduced pCR rate (p = 0.009). The association was significant in both uni- and multivariate logistic regression models (OR 4.51, 95% CI: 1.39-14.66, p = 0.012, and OR 5.07, 95% CI: 1.28-20.09, p = 0.021, respectively). Internal validation supported the predictive value of the model. The predictive ability of γ-H2AX was further confirmed in the multivariate model after exclusion of tumors that underwent changes in hormone receptor status during chemotherapy (OR 7.07, 95% CI: 1.39-36.02, p = 0.018). Finally, a significant decrease of γ-H2AX levels was observed in residual diseases. These findings suggest that γ-H2AX levels may predict pCR in TNBC and deserve larger, prospective studies.
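The odds ratios and confidence intervals reported above follow from exponentiating logistic-regression coefficients. In this sketch the coefficient and its standard error are back-solved from the reported OR 4.51 (95% CI approximately 1.39-14.66) purely for illustration; they are not taken from the study's data:

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Odds ratio and 95% CI from a logistic-regression coefficient and its SE."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Hypothetical values chosen so the CI roughly reproduces the reported interval
beta = math.log(4.51)   # coefficient on the log-odds scale
se = 0.60               # assumed standard error
or_, lo, hi = odds_ratio_ci(beta, se)
```

Because the interval is symmetric on the log scale, it is asymmetric around the odds ratio itself, which is why published CIs like 1.39-14.66 are skewed upward.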

  1. Aerodynamic Noise Prediction Using stochastic Turbulence Modeling

    Directory of Open Access Journals (Sweden)

    Arash Ahmadzadegan

    2008-01-01

    Full Text Available Amongst many approaches to determine the sound propagated from turbulent flows, hybrid methods, in which the turbulent noise source field is computed or modeled separately from the far-field calculation, are frequently used. For basic estimation of sound propagation, less computationally intensive methods can be developed using stochastic models of the turbulent fluctuations (the turbulent noise source field). A simple and easy-to-use stochastic model for generating turbulent velocity fluctuations, called the continuous filter white noise (CFWN) model, was used. This method is based on the use of the classical Langevin equation to model the details of the fluctuating field superimposed on averaged computed quantities. The resulting sound field due to the generated unsteady flow field was evaluated using Lighthill's acoustic analogy. A volume-integral method was used for evaluating the acoustic analogy. This formulation presents an advantage, as it confers the possibility to determine separately the contributions of the different integral terms, and of the different integration regions, to the radiated acoustic pressure. Our results were validated by comparing the directivity and the overall sound pressure level (OSPL) magnitudes with the available experimental results. Numerical results showed reasonable agreement with the experiments, both in maximum directivity and magnitude of the OSPL. This method presents a very suitable tool for the noise calculation of different engineering problems in the early stages of the design process, where rough estimates using cheaper methods are needed for different geometries.

  2. A Predictive Model of High Shear Thrombus Growth.

    Science.gov (United States)

    Mehrabadi, Marmar; Casa, Lauren D C; Aidun, Cyrus K; Ku, David N

    2016-08-01

    The ability to predict the timescale of thrombotic occlusion in stenotic vessels may improve patient risk assessment for thrombotic events. In blood contacting devices, thrombosis predictions can lead to improved designs to minimize thrombotic risks. We have developed and validated a model of high shear thrombosis based on empirical correlations between thrombus growth and shear rate. A mathematical model was developed to predict the growth of thrombus based on the hemodynamic shear rate. The model predicts thrombus deposition based on initial geometric and fluid mechanic conditions, which are updated throughout the simulation to reflect the changing lumen dimensions. The model was validated by comparing predictions against actual thrombus growth in six separate in vitro experiments: stenotic glass capillary tubes (diameter = 345 µm) at three shear rates, the PFA-100(®) system, two microfluidic channel dimensions (heights = 300 and 82 µm), and a stenotic aortic graft (diameter = 5.5 mm). Comparison of the predicted occlusion times to experimental results shows excellent agreement. The model is also applied to a clinical angiography image to illustrate the time course of thrombosis in a stenotic carotid artery after plaque cap rupture. Our model can accurately predict thrombotic occlusion time over a wide range of hemodynamic conditions.

  3. The application of modeling and prediction with MRA wavelet network

    Institute of Scientific and Technical Information of China (English)

    LU Shu-ping; YANG Xue-jing; ZHAO Xi-ren

    2004-01-01

    Since many real engineering systems are non-linear, research on the modelling and prediction of non-linear systems is important. Based on the multi-resolution analysis (MRA) of wavelet theory, this paper combined wavelet theory with neural networks and established an MRA wavelet network with the scaling function and wavelet function as its neurons. Analysis in the frequency domain indicated that the MRA wavelet network approximates signals better than other wavelet networks. An essential study was carried out on modelling and prediction with the MRA wavelet network for non-linear systems. Using lengthwise sway data from a ship-model experiment, an offline prediction model was established and applied to the short-time prediction of ship motion. The simulation results indicated that the forecasting model improved the prediction precision effectively, lengthened the forecasting time, and gave better predictions than an AR linear model. The research indicates that it is feasible to use the MRA wavelet network for short-time prediction of ship motion.

  4. Integrated analysis of microRNA and mRNA expression: Adding biological significance to microRNA target predictions

    NARCIS (Netherlands)

    M. van Iterson (Mat); S. Bervoets (Sander); E.J. de Meijer (Emile); H.P. Buermans (Henk); P.A.C. 't Hoen (Peter); R.X. Menezes (Renée); J.M. Boer (Judith)

    2013-01-01

    textabstractCurrent microRNA target predictions are based on sequence information and empirically derived rules but do not make use of the expression of microRNAs and their targets. This study aimed to improve microRNA target predictions in a given biological context, using in silico predictions, mi

  5. A COMPARISON BETWEEN THREE PREDICTIVE MODELS OF COMPUTATIONAL INTELLIGENCE

    Directory of Open Access Journals (Sweden)

    DUMITRU CIOBANU

    2013-12-01

    Full Text Available Time series prediction is an open problem and many researchers are trying to find new predictive methods and improvements for the existing ones. Lately, methods based on neural networks have been used extensively for time series prediction. Support vector machines have also solved some of the problems faced by neural networks and have begun to be widely used for time series prediction. The main drawback of these two methods is that they are global models, and in the case of a chaotic time series it is unlikely that such a model can be found. This paper presents a comparison between three predictive models from the computational intelligence field: one based on neural networks, one based on support vector machines, and one based on chaos theory. We show that the model based on chaos theory is an alternative to the other two methods.

  6. Stand diameter distribution modelling and prediction based on Richards function.

    Directory of Open Access Journals (Sweden)

    Ai-guo Duan

    Full Text Available The objective of this study was to introduce application of the Richards equation on modelling and prediction of stand diameter distribution. The long-term repeated measurement data sets, consisted of 309 diameter frequency distributions from Chinese fir (Cunninghamia lanceolata) plantations in the southern China, were used. Also, 150 stands were used as fitting data, the other 159 stands were used for testing. Nonlinear regression method (NRM) or maximum likelihood estimates method (MLEM) were applied to estimate the parameters of models, and the parameter prediction method (PPM) and parameter recovery method (PRM) were used to predict the diameter distributions of unknown stands. Four main conclusions were obtained: (1) R distribution presented a more accurate simulation than three-parametric Weibull function; (2) the parameters p, q and r of R distribution proved to be its scale, location and shape parameters, and have a deep relationship with stand characteristics, which means the parameters of R distribution have good theoretical interpretation; (3) the ordinate of inflection point of R distribution has significant relativity with its skewness and kurtosis, and the fitted main distribution range for the cumulative diameter distribution of Chinese fir plantations was 0.4∼0.6; (4) the goodness-of-fit test showed diameter distributions of unknown stands can be well estimated by applying R distribution based on PRM or the combination of PPM and PRM under the condition that only quadratic mean DBH or plus stand age are known, and the non-rejection rates were near 80%, which are higher than the 72.33% non-rejection rate of three-parametric Weibull function based on the combination of PPM and PRM.

  7. Stand diameter distribution modelling and prediction based on Richards function.

    Science.gov (United States)

    Duan, Ai-guo; Zhang, Jian-guo; Zhang, Xiong-qing; He, Cai-yun

    2013-01-01

    The objective of this study was to introduce application of the Richards equation on modelling and prediction of stand diameter distribution. The long-term repeated measurement data sets, consisted of 309 diameter frequency distributions from Chinese fir (Cunninghamia lanceolata) plantations in the southern China, were used. Also, 150 stands were used as fitting data, the other 159 stands were used for testing. Nonlinear regression method (NRM) or maximum likelihood estimates method (MLEM) were applied to estimate the parameters of models, and the parameter prediction method (PPM) and parameter recovery method (PRM) were used to predict the diameter distributions of unknown stands. Four main conclusions were obtained: (1) R distribution presented a more accurate simulation than three-parametric Weibull function; (2) the parameters p, q and r of R distribution proved to be its scale, location and shape parameters, and have a deep relationship with stand characteristics, which means the parameters of R distribution have good theoretical interpretation; (3) the ordinate of inflection point of R distribution has significant relativity with its skewness and kurtosis, and the fitted main distribution range for the cumulative diameter distribution of Chinese fir plantations was 0.4∼0.6; (4) the goodness-of-fit test showed diameter distributions of unknown stands can be well estimated by applying R distribution based on PRM or the combination of PPM and PRM under the condition that only quadratic mean DBH or plus stand age are known, and the non-rejection rates were near 80%, which are higher than the 72.33% non-rejection rate of three-parametric Weibull function based on the combination of PPM and PRM.
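A minimal sketch of fitting a Richards-type cumulative distribution to diameter-class data, using a coarse least-squares grid search as a stand-in for the NRM/MLEM estimators used in the paper; the parameterization and all numbers are illustrative.

```python
import math

def richards_cdf(d, m, r, q):
    """One common Richards (generalized logistic) cumulative form with
    location m, rate r and shape q (q = 1 reduces to the logistic CDF)."""
    return (1.0 + q * math.exp(-r * (d - m))) ** (-1.0 / q)

# synthetic "observed" cumulative diameter frequencies from known parameters
true = dict(m=12.0, r=0.45, q=0.8)
diam = list(range(4, 26, 2))                       # diameter classes (cm)
freq = [richards_cdf(d, **true) for d in diam]

# parameter estimation by coarse least-squares grid search over (m, r)
best, best_sse = None, float("inf")
for m in [8.0 + 0.5 * i for i in range(17)]:       # 8.0 .. 16.0
    for r in [0.2 + 0.05 * j for j in range(13)]:  # 0.20 .. 0.80
        sse = sum((richards_cdf(d, m, r, true["q"]) - f) ** 2
                  for d, f in zip(diam, freq))
        if sse < best_sse:
            best, best_sse = (m, r), sse
```

On this noiseless synthetic data the grid search recovers the generating location and rate exactly, which is the basic sanity check behind any parameter prediction or recovery scheme.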

  8. New Approaches for Channel Prediction Based on Sinusoidal Modeling

    Directory of Open Access Journals (Sweden)

    Ekman Torbjörn

    2007-01-01

    Full Text Available Long-range channel prediction is considered to be one of the most important enabling technologies to future wireless communication systems. The prediction of Rayleigh fading channels is studied in the frame of sinusoidal modeling in this paper. A stochastic sinusoidal model to represent a Rayleigh fading channel is proposed. Three different predictors based on the statistical sinusoidal model are proposed. These methods outperform the standard linear predictor (LP in Monte Carlo simulations, but underperform with real measurement data, probably due to nonstationary model parameters. To mitigate these modeling errors, a joint moving average and sinusoidal (JMAS prediction model and the associated joint least-squares (LS predictor are proposed. It combines the sinusoidal model with an LP to handle unmodeled dynamics in the signal. The joint LS predictor outperforms all the other sinusoidal LMMSE predictors in suburban environments, but still performs slightly worse than the standard LP in urban environments.
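The standard LP baseline mentioned above is exact for a single noiseless tone, which is why sinusoidal modeling is a natural frame for fading channels; a minimal sketch (real Rayleigh channels contain many tones plus noise, so the coefficients must be estimated, as in the paper):

```python
import math

# A noiseless sinusoid obeys the 2-tap recursion
#   x[n] = 2*cos(w) * x[n-1] - x[n-2],
# so an order-2 linear predictor is exact for a single tone.
w = 2.0 * math.pi * 0.05                  # normalized angular frequency (illustrative)
x = [math.cos(w * n + 0.3) for n in range(200)]

a1, a2 = 2.0 * math.cos(w), -1.0          # LP coefficients for one sinusoid
pred = [a1 * x[n - 1] + a2 * x[n - 2] for n in range(2, len(x))]
err = max(abs(p - t) for p, t in zip(pred, x[2:]))
```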

  9. Prediction model for spring dust weather frequency in North China

    Institute of Scientific and Technical Information of China (English)

    LANG XianMei

    2008-01-01

    It is of great social and scientific importance and also very difficult to make reliable prediction for dust weather frequency (DWF) in North China. In this paper, the correlation between spring DWF in Beijing and Tianjin observation stations, taken as examples in North China, and seasonally averaged surface air temperature, precipitation, Arctic Oscillation, Antarctic Oscillation, South Oscillation, near surface meridional wind and Eurasian westerly index is respectively calculated so as to construct a prediction model for spring DWF in North China by using these climatic factors. Two prediction models, i.e. model-I and model-II, are then set up respectively based on observed climate data and the 32-year (1970-2001) extra-seasonal hindcast experiment data as reproduced by the nine-level Atmospheric General Circulation Model developed at the Institute of Atmospheric Physics (IAP9L-AGCM). It is indicated that the correlation coefficient between the observed and predicted DWF reaches 0.933 in the model-I, suggesting a high prediction skill one season ahead. The corresponding value is high up to 0.948 for the subsequent model-II, which involves synchronous spring climate data reproduced by the IAP9L-AGCM relative to the model-I. The model-II can not only make more precise predictions but also bring forward the lead time of real-time prediction from model-I's one season to half a year. At last, the real-time predictability of the two models is evaluated. It follows that both models display high prediction skill for both the interannual variation and linear trend of spring DWF in North China, and each is also featured by different advantages. As for the model-II, the prediction skill is much higher than that of the original approach using the IAP9L-AGCM alone. Therefore, the prediction idea put forward here should be popularized in other regions in China where dust weather occurs frequently.

  10. Prediction model for spring dust weather frequency in North China

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    It is of great social and scientific importance and also very difficult to make reliable prediction for dust weather frequency (DWF) in North China. In this paper, the correlation between spring DWF in Beijing and Tianjin observation stations, taken as examples in North China, and seasonally averaged surface air temperature, precipitation, Arctic Oscillation, Antarctic Oscillation, South Oscillation, near surface meridional wind and Eurasian westerly index is respectively calculated so as to construct a prediction model for spring DWF in North China by using these climatic factors. Two prediction models, i.e. model-I and model-II, are then set up respectively based on observed climate data and the 32-year (1970-2001) extra-seasonal hindcast experiment data as reproduced by the nine-level Atmospheric General Circulation Model developed at the Institute of Atmospheric Physics (IAP9L-AGCM). It is indicated that the correlation coefficient between the observed and predicted DWF reaches 0.933 in the model-I, suggesting a high prediction skill one season ahead. The corresponding value is high up to 0.948 for the subsequent model-II, which involves synchronous spring climate data reproduced by the IAP9L-AGCM relative to the model-I. The model-II can not only make more precise predictions but also bring forward the lead time of real-time prediction from model-I's one season to half a year. At last, the real-time predictability of the two models is evaluated. It follows that both models display high prediction skill for both the interannual variation and linear trend of spring DWF in North China, and each is also featured by different advantages. As for the model-II, the prediction skill is much higher than that of the original approach using the IAP9L-AGCM alone. Therefore, the prediction idea put forward here should be popularized in other regions in China where dust weather occurs frequently.

  11. Model Predictive Control of Sewer Networks

    DEFF Research Database (Denmark)

    Pedersen, Einar B.; Herbertsson, Hannes R.; Niemann, Henrik;

    2016-01-01

    The developments in solutions for management of urban drainage are of vital importance, as the amount of sewer water from urban areas continues to increase due to the increase of the world’s population and the change in the climate conditions. How a sewer network is structured, monitored...... and controlled have thus become essential factors for efficient performance of waste water treatment plants. This paper examines methods for simplified modelling and controlling a sewer network. A practical approach to the problem is used by analysing simplified design model, which is based on the Barcelona...

  12. A burnout prediction model based around char morphology

    Energy Technology Data Exchange (ETDEWEB)

    Tao Wu; Edward Lester; Michael Cloke [University of Nottingham, Nottingham (United Kingdom). School of Chemical, Environmental and Mining Engineering

    2006-05-15

    Several combustion models have been developed that can make predictions about coal burnout and burnout potential. Most of these kinetic models require standard parameters such as volatile content and particle size to make a burnout prediction. This article presents a new model called the char burnout (ChB) model, which also uses detailed information about char morphology in its prediction. The input data to the model is based on information derived from two different image analysis techniques. One technique generates characterization data from real char samples, and the other predicts char types based on characterization data from image analysis of coal particles. The pyrolyzed chars in this study were created in a drop tube furnace operating at 1300°C, 200 ms, and 1% oxygen. Modeling results were compared with a different carbon burnout kinetic model as well as the actual burnout data from refiring the same chars in a drop tube furnace operating at 1300°C, 5% oxygen, and residence times of 200, 400, and 600 ms. A good agreement between the ChB model and experimental data indicates that the inclusion of char morphology in combustion models could well improve model predictions. 38 refs., 5 figs., 6 tabs.

  13. An evaluation of mathematical models for predicting skin permeability.

    Science.gov (United States)

    Lian, Guoping; Chen, Longjian; Han, Lujia

    2008-01-01

    A number of mathematical models have been proposed for predicting skin permeability, mostly empirical and very few deterministic. Early empirical models use simple lipophilicity parameters; the recent trend is to use more complicated molecular structure descriptors. There has been much debate on which models best predict skin permeability. This article evaluates various mathematical models using a comprehensive experimental dataset of skin permeability for 124 chemical compounds compiled from various sources. Of the seven models compared, the deterministic model of Mitragotri gives the best prediction. The simple quantitative structure permeability relationship (QSPR) model of Potts and Guy gives the second best prediction. The two models have many features in common: both assume the lipid matrix as the pathway of transdermal permeation, both use the octanol-water partition coefficient and molecular size, and even the mathematical formulae are similar. All other empirical QSPR models that use more complicated molecular structure descriptors fail to provide satisfactory prediction. The molecular structure descriptors in the more complicated QSPR models are empirically related to skin permeation; the mechanism by which these descriptors affect transdermal permeation is not clear. Mathematically, it is an ill-defined approach to use many collinearly related parameters rather than fewer independent parameters in multi-linear regression.
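The Potts and Guy QSPR can be written down in a few lines; the coefficients below are the widely quoted values from the 1992 paper and should be checked against the original source before being relied on.

```python
def potts_guy_log_kp(log_kow, mw):
    """Potts & Guy (1992) QSPR for skin permeability (kp in cm/h):
    log10 kp = 0.71*log Kow - 0.0061*MW - 2.72
    (coefficients as widely quoted; verify against the original paper)."""
    return 0.71 * log_kow - 0.0061 * mw - 2.72

# example trend the model encodes: a small lipophilic permeant is predicted
# to penetrate faster than a large polar one (illustrative inputs)
small_lipophilic = potts_guy_log_kp(log_kow=2.0, mw=150.0)
large_polar = potts_guy_log_kp(log_kow=-0.5, mw=400.0)
```

The two descriptors, partition coefficient and molecular size, are exactly the independent parameters the abstract argues are preferable to many collinear descriptors.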

  14. Bayesian Age-Period-Cohort Modeling and Prediction - BAMP

    Directory of Open Access Journals (Sweden)

    Volker J. Schmid

    2007-10-01

    Full Text Available The software package BAMP provides a method for analyzing incidence or mortality data on the Lexis diagram, using a Bayesian version of an age-period-cohort model. A hierarchical model is assumed, with a binomial model at the first stage. As smoothing priors for the age, period and cohort parameters, random walks of first and second order, with and without an additional unstructured component, are available. Unstructured heterogeneity can also be included in the model. In order to evaluate the model fit, posterior deviance, DIC and predictive deviances are computed. By projecting the random walk prior into the future, future death rates can be predicted.
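The second-order random-walk smoothing prior can be illustrated by its quadratic penalty on second differences; this is a sketch of the idea, not BAMP's actual implementation.

```python
def rw2_penalty(theta, precision=1.0):
    """Quadratic form of a second-order random-walk (RW2) smoothing prior:
    it penalizes curvature via second differences, so linear trends in the
    age/period/cohort effects are unpenalized while wiggles are shrunk."""
    return 0.5 * precision * sum(
        (theta[t] - 2.0 * theta[t - 1] + theta[t - 2]) ** 2
        for t in range(2, len(theta)))

linear = [0.5 * t for t in range(10)]      # no curvature -> zero penalty
wiggly = [(-1.0) ** t for t in range(10)]  # alternating sign -> heavy penalty
```

Because linear sequences cost nothing, extrapolating the prior into the future continues the latest trend, which is exactly how projected death rates are obtained.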

  15. Prediction of bypass transition with differential Reynolds stress models

    NARCIS (Netherlands)

    Westin, K.J.A.; Henkes, R.A.W.M.

    1998-01-01

    Boundary layer transition induced by high levels of free-stream turbulence (FST), so-called bypass transition, cannot be predicted with conventional stability calculations (e.g. the e^N method). The use of turbulence models for transition prediction has shown some success for this type of flow, and

  16. Prediction Models of Free-Field Vibrations from Railway Traffic

    DEFF Research Database (Denmark)

    Malmborg, Jens; Persson, Kent; Persson, Peter

    2017-01-01

    and railways close to where people work and live. Annoyance from traffic-induced vibrations and noise is expected to be a growing issue. To predict the level of vibration and noise in buildings caused by railway and road traffic, calculation models are needed. In the present paper, a simplified prediction...

  17. A new ensemble model for short term wind power prediction

    DEFF Research Database (Denmark)

    Madsen, Henrik; Albu, Razvan-Daniel; Felea, Ioan

    2012-01-01

    As the objective of this study, a non-linear ensemble system is used to develop a new model for predicting wind speed on a short-term time scale. Short-term wind power prediction has become an extremely important field of research for the energy sector. Regardless of the recent advancements in research...

  18. An assessment of a multi-model ensemble of decadal climate predictions

    Science.gov (United States)

    Bellucci, A.; Haarsma, R.; Gualdi, S.; Athanasiadis, P. J.; Caian, M.; Cassou, C.; Fernandez, E.; Germe, A.; Jungclaus, J.; Kröger, J.; Matei, D.; Müller, W.; Pohlmann, H.; Salas y Melia, D.; Sanchez, E.; Smith, D.; Terray, L.; Wyser, K.; Yang, S.

    2015-05-01

    A multi-model ensemble of decadal prediction experiments, performed in the framework of the EU-funded COMBINE (Comprehensive Modelling of the Earth System for Better Climate Prediction and Projection) project following the 5th Coupled Model Intercomparison Project protocol, is examined. The ensemble combines a variety of dynamical models, initialization and perturbation strategies, as well as data assimilation products employed to constrain the initial state of the system. Taking advantage of the multi-model approach, several aspects of decadal climate predictions are assessed, including predictive skill, the impact of the initialization strategy, and the level of uncertainty characterizing the predicted fluctuations of key climate variables. The present analysis adds to the growing evidence that the current generation of climate models, when adequately initialized, have significant skill in predicting years ahead not only the anthropogenic warming but also part of the internal variability of the climate system. An important finding is that the multi-model ensemble mean generally outperforms the individual forecasts, a well-documented result for seasonal forecasting, supporting the need to extend the multi-model framework to real-time decadal predictions in order to maximize the predictive capabilities of currently available decadal forecast systems. The multi-model perspective also allowed a more robust assessment of the impact of the initialization strategy on the quality of decadal predictions, providing hints of improved forecast skill under full-value (with respect to anomaly) initialization in the near-term range over the Indo-Pacific equatorial region. Finally, the consistency across the different model predictions was assessed. Specifically, different systems reveal general agreement in predicting the near-term evolution of surface temperatures, displaying positive correlations between different decadal hindcasts over most of the global domain.
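The finding that the ensemble mean outperforms individual members reflects a general variance-reduction property of averaging, which a small synthetic sketch (hypothetical forecast errors, not COMBINE data) makes concrete:

```python
import random

rng = random.Random(42)
truth = [rng.gauss(0.0, 1.0) for _ in range(500)]

# five synthetic "models": the truth plus independent model-specific errors
members = [[t + rng.gauss(0.0, 0.8) for t in truth] for _ in range(5)]
ens_mean = [sum(m[i] for m in members) / len(members) for i in range(len(truth))]

def mse(forecast):
    # mean squared error of a forecast series against the truth
    return sum((a - b) ** 2 for a, b in zip(forecast, truth)) / len(truth)

member_mses = [mse(m) for m in members]
ensemble_mse = mse(ens_mean)
```

By convexity, the MSE of the average can never exceed the average member MSE; when the member errors are partly independent, as here, it is strictly smaller than every individual member's.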

  19. Space Weather: Measurements, Models and Predictions

    Science.gov (United States)

    2014-03-21

    and record high levels of cosmic ray flux. There were broad-ranging terrestrial responses to this inactivity of the Sun. BC was involved in the...techniques for converting from one coordinate system (e.g., the invariant coordinate system used for the model) to another (e.g., the latitude-radius

  20. Monotone models for prediction in data mining

    NARCIS (Netherlands)

    Velikova, M.V.

    2006-01-01

    This dissertation studies the incorporation of monotonicity constraints as a type of domain knowledge into a data mining process. Monotonicity constraints are enforced at two stages: data preparation and data modeling. The main contributions of the research are a novel procedure to test the degree of

  1. Predicting Magazine Audiences with a Loglinear Model.

    Science.gov (United States)

    1987-07-01

    important use of e.d. estimates is in media selection (Aaker 1975; Lee 1962, 1963; Little and Lodish 1969). All advertising campaigns have a budget. It... References: Aaker, D.A. (1975), "ADMOD: An Advertising Decision Model," Journal of Marketing Research, February, 37-45

  2. Scanpath Based N-Gram Models for Predicting Reading Behavior

    DEFF Research Database (Denmark)

    Mishra, Abhijit; Bhattacharyya, Pushpak; Carl, Michael

    2013-01-01

    Predicting reading behavior is a difficult task. Reading behavior depends on various linguistic factors (e.g. sentence length, structural complexity, etc.) and other factors (e.g. an individual's reading style, age, etc.). Ideally, a reading model should be similar to a language model where the model i...

  3. Better predictions when models are wrong or underspecified

    NARCIS (Netherlands)

    Ommen, Matthijs van

    2015-01-01

    Many statistical methods rely on models of reality in order to learn from data and to make predictions about future data. By necessity, these models usually do not match reality exactly, but are either wrong (none of the hypotheses in the model provides an accurate description of reality) or undersp

  4. Hybrid Corporate Performance Prediction Model Considering Technical Capability

    Directory of Open Access Journals (Sweden)

    Joonhyuck Lee

    2016-07-01

    Full Text Available Many studies have tried to predict corporate performance and stock prices to enhance investment profitability using qualitative approaches such as the Delphi method. However, developments in data processing technology and machine-learning algorithms have resulted in efforts to develop quantitative prediction models in various managerial subject areas. We propose a quantitative corporate performance prediction model that applies the support vector regression (SVR algorithm to solve the problem of the overfitting of training data and can be applied to regression problems. The proposed model optimizes the SVR training parameters based on the training data, using the genetic algorithm to achieve sustainable predictability in changeable markets and managerial environments. Technology-intensive companies represent an increasing share of the total economy. The performance and stock prices of these companies are affected by their financial standing and their technological capabilities. Therefore, we apply both financial indicators and technical indicators to establish the proposed prediction model. Here, we use time series data, including financial, patent, and corporate performance information of 44 electronic and IT companies. Then, we predict the performance of these companies as an empirical verification of the prediction performance of the proposed model.

  5. A Multistep Chaotic Model for Municipal Solid Waste Generation Prediction.

    Science.gov (United States)

    Song, Jingwei; He, Jiaying

    2014-08-01

    In this study, a univariate local chaotic model is proposed to make one-step and multistep forecasts for daily municipal solid waste (MSW) generation in Seattle, Washington. For MSW generation prediction with long history data, this forecasting model was created based on a nonlinear dynamic method called phase-space reconstruction. Compared with other nonlinear predictive models, such as artificial neural network (ANN) and partial least square-support vector machine (PLS-SVM), and a commonly used linear seasonal autoregressive integrated moving average (sARIMA) model, this method has demonstrated better prediction accuracy from 1-step ahead prediction to 14-step ahead prediction assessed by both mean absolute percentage error (MAPE) and root mean square error (RMSE). Max error, MAPE, and RMSE show that chaotic models were more reliable than the other three models. As chaotic models do not involve random walk, their performance does not vary while ANN and PLS-SVM make different forecasts in each trial. Moreover, this chaotic model was less time consuming than ANN and PLS-SVM models.
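The phase-space reconstruction idea behind the local chaotic model can be sketched as a delay embedding plus a nearest-neighbour lookup; the logistic-map series below is an illustrative stand-in for the MSW generation data.

```python
def embed(series, m, tau=1):
    """Delay-coordinate (phase-space) reconstruction: each state is
    (x[t], x[t-tau], ..., x[t-(m-1)*tau])."""
    start = (m - 1) * tau
    return [tuple(series[t - k * tau] for k in range(m))
            for t in range(start, len(series))]

def local_predict(history, m=2, tau=1):
    """One-step local forecast: find the past state closest to the current
    state in the reconstructed phase space and return its observed successor."""
    pts = embed(history, m, tau)
    cur = pts[-1]
    best_i, best_d = None, float("inf")
    for i, p in enumerate(pts[:-1]):          # exclude the current state itself
        d = sum((a - b) ** 2 for a, b in zip(p, cur))
        if d < best_d:
            best_i, best_d = i, d
    return history[(m - 1) * tau + best_i + 1]  # successor of the matched state

# demo on a deterministic chaotic series (logistic map, r = 3.9)
x, series = 0.37, []
for _ in range(2000):
    x = 3.9 * x * (1.0 - x)
    series.append(x)
actual = series[-1]
pred = local_predict(series[:-1])
```

Unlike a neural network, this local method involves no random initialization, which matches the abstract's observation that the chaotic model gives the same forecast in every trial.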

  6. Using Pareto points for model identification in predictive toxicology.

    Science.gov (United States)

    Palczewska, Anna; Neagu, Daniel; Ridley, Mick

    2013-03-22

    Predictive toxicology is concerned with the development of models that are able to predict the toxicity of chemicals. A reliable prediction of toxic effects of chemicals in living systems is highly desirable in cosmetics, drug design or food protection to speed up the process of chemical compound discovery while reducing the need for lab tests. There is an extensive literature associated with the best practice of model generation and data integration but management and automated identification of relevant models from available collections of models is still an open problem. Currently, the decision on which model should be used for a new chemical compound is left to users. This paper intends to initiate the discussion on automated model identification. We present an algorithm, based on Pareto optimality, which mines model collections and identifies a model that offers a reliable prediction for a new chemical compound. The performance of this new approach is verified for two endpoints: IGC50 and LogP. The results show a great potential for automated model identification methods in predictive toxicology.
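A minimal sketch of the Pareto-optimality filtering idea: keep only the models that are not dominated on the chosen criteria. The criteria names and scores below are invented for illustration, not from the paper.

```python
def pareto_front(models, keys):
    """Return the models not dominated on the given criteria (all to be
    maximized): a dominates b if a is >= on every key and > on at least one."""
    def dominates(a, b):
        ge = all(a[k] >= b[k] for k in keys)
        gt = any(a[k] > b[k] for k in keys)
        return ge and gt
    return [m for m in models
            if not any(dominates(o, m) for o in models if o is not m)]

# toy model collection scored on accuracy and applicability-domain coverage
models = [
    {"name": "M1", "accuracy": 0.90, "coverage": 0.40},
    {"name": "M2", "accuracy": 0.80, "coverage": 0.80},
    {"name": "M3", "accuracy": 0.70, "coverage": 0.70},  # dominated by M2
    {"name": "M4", "accuracy": 0.60, "coverage": 0.95},
]
front = pareto_front(models, keys=("accuracy", "coverage"))
```

The front contains the candidate models among which a final choice for a new compound would then be made.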

  7. On the Predictiveness of Single-Field Inflationary Models

    CERN Document Server

    Burgess, C.P.; Trott, Michael

    2014-01-01

    We re-examine the predictiveness of single-field inflationary models and discuss how an unknown UV completion can complicate determining inflationary model parameters from observations, even from precision measurements. Besides the usual naturalness issues associated with having a shallow inflationary potential, we describe another issue for inflation, namely, unknown UV physics modifies the running of Standard Model (SM) parameters and thereby introduces uncertainty into the potential inflationary predictions. We illustrate this point using the minimal Higgs Inflationary scenario, which is arguably the most predictive single-field model on the market, because its predictions for $A_s$, $r$ and $n_s$ are made using only one new free parameter beyond those measured in particle physics experiments, and run up to the inflationary regime. We find that this issue can already have observable effects. At the same time, this UV-parameter dependence in the Renormalization Group allows Higgs Inflation to occur (in prin...

  8. A Composite Model Predictive Control Strategy for Furnaces

    Institute of Scientific and Technical Information of China (English)

    Hao Zang; Hongguang Li; Jingwen Huang; Jia Wang

    2014-01-01

    Tube furnaces are essential and primary energy-intensive facilities in petrochemical plants. Operational optimization of furnaces can not only help to improve product quality but also reduce energy consumption and exhaust emissions. Inspired by this idea, this paper presents a composite model predictive control (CMPC) strategy which, taking advantage of distributed model predictive control architectures, combines tracking nonlinear model predictive control and economic nonlinear model predictive control metrics to keep the process running smoothly and optimize operational conditions. The controllers, connected by two kinds of communication networks, are easy to organize and maintain, and robust to process interference. A fast solution algorithm combining interior-point solvers and Newton's method is accommodated in the CMPC realization, with reasonable CPU computing time suitable for online applications. Simulation of an industrial case demonstrates that the proposed approach can ensure stable operation of furnaces, improve heat efficiency, and reduce emissions effectively.

  9. Submission Form for Peer-Reviewed Cancer Risk Prediction Models

    Science.gov (United States)

    If you have information about a peer-reviewed cancer risk prediction model that you would like to be considered for inclusion on this list, submit as much information as possible through the form on this page.

  10. ACCIDENT PREDICTION MODELS FOR UNSIGNALISED URBAN JUNCTIONS IN GHANA

    Directory of Open Access Journals (Sweden)

    Mohammed SALIFU, MSc., PhD, MIHT, MGhIE

    2004-01-01

    The accident prediction models developed have a potentially wide area of application and their systematic use is likely to improve considerably the quality and delivery of the engineering aspects of accident mitigation and prevention in Ghana.

  11. Development of a multi-year climate prediction model | Alexander ...

    African Journals Online (AJOL)

    Development of a multi-year climate prediction model. ... The available water resources in Southern Africa are rapidly approaching the limits of economic exploitation. ... that could be attributed to climate change arising from human activities.

  12. Compensatory versus noncompensatory models for predicting consumer preferences

    Directory of Open Access Journals (Sweden)

    Anja Dieckmann

    2009-04-01

    Full Text Available Standard preference models in consumer research assume that people weigh and add all attributes of the available options to derive a decision, while there is growing evidence for the use of simplifying heuristics. Recently, a greedoid algorithm has been developed (Yee, Dahan, Hauser and Orlin, 2007; Kohli and Jedidi, 2007 to model lexicographic heuristics from preference data. We compare predictive accuracies of the greedoid approach and standard conjoint analysis in an online study with a rating and a ranking task. The lexicographic model derived from the greedoid algorithm was better at predicting ranking compared to rating data, but overall, it achieved lower predictive accuracy for hold-out data than the compensatory model estimated by conjoint analysis. However, a considerable minority of participants was better predicted by lexicographic strategies. We conclude that the new algorithm will not replace standard tools for analyzing preferences, but can boost the study of situational and individual differences in preferential choice processes.
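The contrast between lexicographic and compensatory (weighted-additive) rules can be made concrete with a toy choice set; the attributes, options and weights are invented for illustration.

```python
def lexicographic(options, attr_order):
    """Choose by the most important attribute first; later attributes are
    consulted only to break ties (one simple lexicographic heuristic)."""
    best = list(options)
    for attr in attr_order:
        top = max(o[attr] for o in best)
        best = [o for o in best if o[attr] == top]
        if len(best) == 1:
            break
    return best[0]["name"]

def weighted_additive(options, weights):
    """Compensatory rule: weigh and add all attributes, pick the max."""
    return max(options,
               key=lambda o: sum(w * o[a] for a, w in weights.items()))["name"]

laptops = [
    {"name": "A", "battery": 9, "speed": 2, "screen": 2},
    {"name": "B", "battery": 8, "speed": 9, "screen": 9},
]
lex_choice = lexicographic(laptops, ["battery", "speed", "screen"])
add_choice = weighted_additive(laptops, {"battery": 1.0, "speed": 1.0, "screen": 1.0})
```

Here the two rules disagree: the lexicographic rule picks the option that wins on the top attribute alone, while the additive rule lets strong secondary attributes compensate, which is the kind of individual difference the study measures.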

  13. Preoperative Metabolic Syndrome Is Predictive of Significant Gastric Cancer Mortality after Gastrectomy: The Fujian Prospective Investigation of Cancer (FIESTA) Study.

    Science.gov (United States)

    Hu, Dan; Peng, Feng; Lin, Xiandong; Chen, Gang; Zhang, Hejun; Liang, Binying; Ji, Kaida; Lin, Jinxiu; Chen, Lin-Feng; Zheng, Xiongwei; Niu, Wenquan

    2017-02-01

    Metabolic syndrome (MetS) has been shown to be associated with an increased risk of gastric cancer. However, the impact of MetS on gastric cancer mortality remains largely unknown. Here, we prospectively examined the prediction of preoperative MetS for gastric cancer mortality by analyzing a subset of data from the ongoing Fujian prospective investigation of cancer (FIESTA) study. This study was conducted among 3012 patients with gastric cancer who received radical gastrectomy between 2000 and 2010. The latest follow-up was completed in 2015. Blood/tissue specimens, demographic and clinicopathologic characteristics were collected at baseline. During 15-year follow-up, 1331 of 3012 patients died of gastric cancer. The median survival time (MST) of patients with MetS was 31.3 months, which was significantly shorter than that of MetS-free patients (157.1 months). The coexistence of MetS before surgery was associated with a 2.3-fold increased risk for gastric cancer mortality (P<0.001). The multivariate-adjusted hazard ratios (HRs) were increased with invasion depth T1/T2 (HR=2.78, P<0.001), regional lymph node metastasis N0 (HR=2.65, P<0.001), positive distant metastasis (HR=2.53, P<0.001), TNM stage I/II (HR=3.00, P<0.001), intestinal type (HR=2.96, P<0.001), negative tumor embolus (HR=2.34, P<0.001), and tumor size ≤4.5 cm (HR=2.49, P<0.001). Further survival tree analysis confirmed the top splitting role of TNM stage, followed by MetS or hyperglycemia with remarkable discrimination ability. In this large cohort study, preoperative MetS, especially hyperglycemia, was predictive of significant gastric cancer mortality in patients with radical gastrectomy, especially for early stage of gastric cancer. Copyright © 2016. Published by Elsevier B.V.

  14. Haskell financial data modeling and predictive analytics

    CERN Document Server

    Ryzhov, Pavel

    2013-01-01

    This book is a hands-on guide that teaches readers how to use Haskell's tools and libraries to analyze data from real-world sources in an easy-to-understand manner.This book is great for developers who are new to financial data modeling using Haskell. A basic knowledge of functional programming is not required but will be useful. An interest in high frequency finance is essential.

  15. Significance of radiation models in investigating the flow phenomena around a Jovian entry body

    Science.gov (United States)

    Tiwari, S. N.; Subramanian, S. V.

    1978-01-01

    Formulation is presented to demonstrate the significance of a simplified radiation model in investigating the flow-phenomena in the viscous radiating shock layer of a Jovian entry body. For this, a nongray absorption model for hydrogen-helium gas is developed which consists of 30 steps over the spectral range of 0-20 eV. By employing this model results were obtained for temperature, pressure, density, and radiative flux in the shock layer and along the body surface. These are compared with results of two sophisticated radiative transport models available in the literature. Use of the present radiation model results in significant reduction in computational time. Results of this model are found to be in general agreement with results of other models. It is concluded that use of the present model is justified in investigating the flow phenomena around a Jovian entry body because it is relatively simple, computationally fast, and yields fairly accurate results.

  16. Personalized Predictive Modeling and Risk Factor Identification using Patient Similarity.

    Science.gov (United States)

    Ng, Kenney; Sun, Jimeng; Hu, Jianying; Wang, Fei

    2015-01-01

    Personalized predictive models are customized for an individual patient and trained using information from similar patients. Compared to global models trained on all patients, they have the potential to produce more accurate risk scores and capture more relevant risk factors for individual patients. This paper presents an approach for building personalized predictive models and generating personalized risk factor profiles. A locally supervised metric learning (LSML) similarity measure is trained for diabetes onset and used to find clinically similar patients. Personalized risk profiles are created by analyzing the parameters of the trained personalized logistic regression models. A 15,000-patient data set, derived from electronic health records, is used to evaluate the approach. The predictive results show that the personalized models can outperform the global model. Cluster analysis of the risk profiles shows groups of patients with similar risk factors, differences in the top risk factors across groups, and differences between individual and global risk factors.
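
    The core loop of such an approach — find similar patients, then predict from them — can be sketched minimally. Plain Euclidean distance stands in for the paper's learned LSML metric, and a distance-weighted vote stands in for the personalized logistic regression; the patient records below are invented toy data.

```python
import math

def nearest_patients(query, patients, k):
    """Return the k patients most similar to the query features."""
    return sorted(patients, key=lambda p: math.dist(query, p["features"]))[:k]

def personalized_risk(query, patients, k=3):
    """Distance-weighted vote over the k most similar patients."""
    neighbors = nearest_patients(query, patients, k)
    weights = [1.0 / (1e-9 + math.dist(query, p["features"]))
               for p in neighbors]
    return sum(w * p["onset"] for w, p in zip(weights, neighbors)) / sum(weights)

# Toy cohort: features are normalized measurements, onset is 0/1.
patients = [
    {"features": (0.9, 0.8), "onset": 1},
    {"features": (0.8, 0.9), "onset": 1},
    {"features": (0.1, 0.2), "onset": 0},
    {"features": (0.2, 0.1), "onset": 0},
]
risk = personalized_risk((0.85, 0.85), patients, k=3)
# The query sits near the two onset cases, so the score is close to 1.
```

    A real implementation would learn the metric from labeled data and fit a full local model on the neighborhood, but the neighborhood-then-local-model structure is the same.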

  17. Model predictive control of P-time event graphs

    Science.gov (United States)

    Hamri, H.; Kara, R.; Amari, S.

    2016-12-01

    This paper deals with model predictive control of discrete event systems modelled by P-time event graphs. First, the model is obtained by using the dater evolution model written in the standard algebra. Then, for the control law, we used finite-horizon model predictive control; for the closed-loop control, we used infinite-horizon model predictive control (IH-MPC). The latter is an approach that calculates static feedback gains that ensure stability of the closed-loop system while respecting the constraints on the control vector. The IH-MPC problem is formulated as a linear convex program subject to a linear matrix inequality. Finally, the proposed methodology is applied to a transportation system.

  18. Prediction of cloud droplet number in a general circulation model

    Energy Technology Data Exchange (ETDEWEB)

    Ghan, S.J.; Leung, L.R. [Pacific Northwest National Lab., Richland, WA (United States)]

    1996-04-01

    We have applied the Colorado State University Regional Atmospheric Modeling System (RAMS) bulk cloud microphysics parameterization to the treatment of stratiform clouds in the National Center for Atmospheric Research Community Climate Model (CCM2). The RAMS predicts mass concentrations of cloud water, cloud ice, rain and snow, and the number concentration of ice. We have introduced the droplet number conservation equation to predict droplet number and its dependence on aerosols.

  19. Predictive vegetation modeling for conservation: impact of error propagation from digital elevation data.

    Science.gov (United States)

    Van Niel, Kimberly P; Austin, Mike P

    2007-01-01

    The effect of digital elevation model (DEM) error on environmental variables, and subsequently on predictive habitat models, has not been explored. Based on an error analysis of a DEM, multiple error realizations of the DEM were created and used to develop both direct and indirect environmental variables for input to predictive habitat models. The study explores the effects of DEM error and the resultant uncertainty of results on typical steps in the modeling procedure for prediction of vegetation species presence/absence. Results indicate that all of these steps and results, including the statistical significance of environmental variables, shapes of species response curves in generalized additive models (GAMs), stepwise model selection, coefficients and standard errors for generalized linear models (GLMs), prediction accuracy (Cohen's kappa and AUC), and spatial extent of predictions, were greatly affected by this type of error. Error in the DEM can affect the reliability of interpretations of model results and level of accuracy in predictions, as well as the spatial extent of the predictions. We suggest that the sensitivity of DEM-derived environmental variables to error in the DEM should be considered before including them in the modeling processes.
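
    The two prediction-accuracy measures named in this record, Cohen's kappa and AUC, can be hand-rolled for binary presence/absence data. The labels and scores below are made up for illustration, not habitat data from the study.

```python
def cohens_kappa(y_true, y_pred):
    """Agreement between binary maps, corrected for chance agreement."""
    n = len(y_true)
    observed = sum(t == p for t, p in zip(y_true, y_pred)) / n
    p_pos = (sum(y_true) / n) * (sum(y_pred) / n)
    p_neg = (1 - sum(y_true) / n) * (1 - sum(y_pred) / n)
    expected = p_pos + p_neg
    return (observed - expected) / (1 - expected)

def auc(y_true, scores):
    """AUC as the probability that a random presence outranks a
    random absence (rank-statistic formulation)."""
    pos = [s for t, s in zip(y_true, scores) if t == 1]
    neg = [s for t, s in zip(y_true, scores) if t == 0]
    wins = sum((p > q) + 0.5 * (p == q) for p in pos for q in neg)
    return wins / (len(pos) * len(neg))

y_true = [1, 1, 1, 0, 0, 0]          # observed presence/absence
y_pred = [1, 1, 0, 0, 0, 1]          # thresholded model output
scores = [0.9, 0.8, 0.4, 0.3, 0.2, 0.6]  # predicted probabilities

kappa = cohens_kappa(y_true, y_pred)  # 1/3: modest agreement beyond chance
area = auc(y_true, scores)            # 8/9: good ranking of sites
```

    Running either measure across the multiple DEM error realizations described above would show how much the reported accuracy itself varies with elevation error.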

  20. Mixed models for predictive modeling in actuarial science

    NARCIS (Netherlands)

    Antonio, K.; Zhang, Y.

    2012-01-01

    We start with a general discussion of mixed (also called multilevel) models and continue with illustrating specific (actuarial) applications of this type of models. Technical details on (linear, generalized, non-linear) mixed models follow: model assumptions, specifications, estimation techniques

  1. Time – Delay Simulated Artificial Neural Network Models for Predicting Shelf Life of Processed Cheese

    Directory of Open Access Journals (Sweden)

    Sumit Goyal

    2012-05-01

    Full Text Available This paper highlights the significance of time-delay ANN models for predicting the shelf life of processed cheese stored at 7-8 °C. The Bayesian regularization algorithm was selected as the training function. The number of neurons in single and multiple hidden layers was varied from 1 to 20, and the network was trained with up to 100 epochs. Mean square error, root mean square error, coefficient of determination and Nash-Sutcliffe coefficient were used to assess the prediction capability of the developed models. Time-delay ANN models with multiple hidden layers are quite efficient in predicting the shelf life of processed cheese stored at 7-8 °C.
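
    The four goodness-of-fit measures named in this record have standard definitions, sketched here in plain Python on invented observed/predicted shelf-life values (days), not data from the study.

```python
def mse(obs, pred):
    """Mean square error."""
    return sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs)

def rmse(obs, pred):
    """Root mean square error."""
    return mse(obs, pred) ** 0.5

def nash_sutcliffe(obs, pred):
    """1 - residual variance over observed variance; 1 is a perfect fit."""
    mean_obs = sum(obs) / len(obs)
    ss_res = sum((o - p) ** 2 for o, p in zip(obs, pred))
    ss_tot = sum((o - mean_obs) ** 2 for o in obs)
    return 1 - ss_res / ss_tot

def r_squared(obs, pred):
    """Coefficient of determination as squared Pearson correlation."""
    n = len(obs)
    mo, mp = sum(obs) / n, sum(pred) / n
    cov = sum((o - mo) * (p - mp) for o, p in zip(obs, pred))
    var_o = sum((o - mo) ** 2 for o in obs)
    var_p = sum((p - mp) ** 2 for p in pred)
    return cov ** 2 / (var_o * var_p)

observed = [30.0, 32.0, 35.0, 28.0]   # illustrative shelf life, days
predicted = [29.5, 31.0, 36.0, 27.0]

err = rmse(observed, predicted)
nse = nash_sutcliffe(observed, predicted)
r2 = r_squared(observed, predicted)
```

    MSE and RMSE measure absolute error in the units of the target, while R² and the Nash-Sutcliffe coefficient are unitless measures of explained variance, which is why papers typically report both kinds.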

  2. Webinar of paper 2013, Which method predicts recidivism best? A comparison of statistical, machine learning and data mining predictive models

    NARCIS (Netherlands)

    Tollenaar, N.; Van der Heijden, P.G.M.|info:eu-repo/dai/nl/073087998

    2013-01-01

    Using criminal population criminal conviction history information, prediction models are developed that predict three types of criminal recidivism: general recidivism, violent recidivism and sexual recidivism. The research question is whether prediction techniques from modern statistics, data mining

  4. Modeling and prediction of retardance in citric acid coated ferrofluid using artificial neural network

    Science.gov (United States)

    Lin, Jing-Fung; Sheu, Jer-Jia

    2016-06-01

    Citric acid coated (citrate-stabilized) magnetite (Fe3O4) magnetic nanoparticles have been synthesized and applied in the biomedical fields. Using Taguchi-based measured retardances as the training data, an artificial neural network (ANN) model was developed for the prediction of retardance in citric acid (CA) coated ferrofluid (FF). According to the ANN simulation results in the training stage, the correlation coefficient between predicted and measured retardances was as high as 0.9999998. Based on the well-trained ANN model, the retardance predicted at the optimal parameter combination from the Taguchi method showed a smaller error of 2.17% compared with a statistically significant multiple regression (MR) analysis. Meanwhile, parameter analysis around this optimum with the ANN model provided guidance for finding a possible combination yielding the maximum retardance. It is concluded that the proposed ANN model has a high ability for the prediction of retardance in CA coated FF.

  5. Accurate Holdup Calculations with Predictive Modeling & Data Integration

    Energy Technology Data Exchange (ETDEWEB)

    Azmy, Yousry [North Carolina State Univ., Raleigh, NC (United States). Dept. of Nuclear Engineering; Cacuci, Dan [Univ. of South Carolina, Columbia, SC (United States). Dept. of Mechanical Engineering

    2017-04-03

    In facilities that process special nuclear material (SNM) it is important to account accurately for the fissile material that enters and leaves the plant. Although there are many stages and processes through which materials must be traced and measured, the focus of this project is material that is “held-up” in equipment, pipes, and ducts during normal operation and that can accumulate over time into significant quantities. Accurately estimating the holdup is essential for proper SNM accounting (vis-à-vis nuclear non-proliferation), criticality and radiation safety, waste management, and efficient plant operation. Usually it is not possible to directly measure the holdup quantity and location, so these must be inferred from measured radiation fields, primarily gamma and less frequently neutrons. Current methods to quantify holdup, i.e. Generalized Geometry Holdup (GGH), primarily rely on simple source configurations and crude radiation transport models aided by ad hoc correction factors. This project seeks an alternate method of performing measurement-based holdup calculations using a predictive model that employs state-of-the-art radiation transport codes capable of accurately simulating such situations. Inverse and data assimilation methods use the forward transport model to search for a source configuration that best matches the measured data and simultaneously provide an estimate of the level of confidence in the correctness of such configuration. In this work the holdup problem is re-interpreted as an inverse problem that is under-determined, hence may permit multiple solutions. A probabilistic approach is applied to solving the resulting inverse problem. This approach rates possible solutions according to their plausibility given the measurements and initial information. This is accomplished through the use of Bayes’ Theorem that resolves the issue of multiple solutions by giving an estimate of the probability of observing each possible solution. To use

  6. Catalytic cracking models developed for predictive control purposes

    Directory of Open Access Journals (Sweden)

    Dag Ljungqvist

    1993-04-01

    Full Text Available The paper deals with state-space modeling issues in the context of model-predictive control, with application to catalytic cracking. Emphasis is placed on model establishment, verification and online adjustment. Both the Fluid Catalytic Cracking (FCC) and the Residual Catalytic Cracking (RCC) units are discussed. Catalytic cracking units involve complex interactive processes which are difficult to operate and control in an economically optimal way. The strong nonlinearities of the FCC process mean that the control calculation should be based on a nonlinear model with the relevant constraints included. However, the model can be simple compared to the complexity of the catalytic cracking plant. Model validity is ensured by a robust online model adjustment strategy. Model-predictive control schemes based on linear convolution models have been successfully applied to the supervisory dynamic control of catalytic cracking units, and the control can be further improved by the SSPC scheme.

  7. Evolutionary modeling and prediction of non-coding RNAs in Drosophila.

    Directory of Open Access Journals (Sweden)

    Robert K Bradley

    Full Text Available We performed benchmarks of phylogenetic grammar-based ncRNA gene prediction, experimenting with eight different models of structural evolution and two different programs for genome alignment. We evaluated our models using alignments of twelve Drosophila genomes. We find that ncRNA prediction performance can vary greatly between different gene predictors and subfamilies of ncRNA gene. Our estimates for false positive rates are based on simulations which preserve local islands of conservation; using these simulations, we predict a higher rate of false positives than previous computational ncRNA screens have reported. Using one of the tested prediction grammars, we provide an updated set of ncRNA predictions for D. melanogaster and compare them to previously-published predictions and experimental data. Many of our predictions show correlations with protein-coding genes. We found significant depletion of intergenic predictions near the 3' end of coding regions and furthermore depletion of predictions in the first intron of protein-coding genes. Some of our predictions are colocated with larger putative unannotated genes: for example, 17 of our predictions showing homology to the RFAM family snoR28 appear in a tandem array on the X chromosome; the 4.5 Kbp spanned by the predicted tandem array is contained within a FlyBase-annotated cDNA.

  8. The use of predictive lithostratigraphy to significantly improve the ability to forecast reservoir and source rocks? Final CRADA report.

    Energy Technology Data Exchange (ETDEWEB)

    Doctor, R. D.; Moore, T. L.; Energy Systems

    2010-06-29

    The purpose of this CRADA, which ended in 2003, was to make reservoir and source rock distribution significantly more predictable by quantifying the fundamental controls on stratigraphic heterogeneity. To do this, the relationships among insolation, climate, sediment supply, glacioeustasy, and reservoir and source rock occurrence were investigated in detail. Work in progress at the inception of the CRADA had uncovered previously unrecognized associations among these processes and properties that produce a phenomenon that, when properly analyzed, makes lithostratigraphic variability (including texture, porosity, and permeability) substantially more understandable. Computer climate simulations of selected time periods, compared with the global distribution of paleoclimatic indicators, documented spatial and temporal climate changes as a function of insolation and provided quantitative changes in runoff, lake level, and glacioeustasy. The effect of elevation and climate on sediment yield was assessed numerically by analyzing digital terrain and climate data. The phase relationships of climate, yield, and glacioeustatic cycles from the Gulf of Mexico and/or other sedimentary basins were assessed by using lacunarity, a statistical technique.

  9. Data on clinical significance of second trimester inflammatory biomarkers in the amniotic fluid in predicting preterm delivery.

    Science.gov (United States)

    Kesrouani, Assaad; Chalhoub, Elie; El Rassy, Elie; Germanos, Mirna; Khazzaka, Aline; Rizkallah, Jamale; Attieh, Elie; Aouad, Norma

    2016-12-01

    In this article, second trimester amniotic fluid biomarkers are measured for correlation with preterm delivery. One additional milliliter of amniotic fluid is collected during amniocentesis for dosages of IL-6, MMP-9, CRP and glucose levels, along with maternal serum CRP and glucose. MMP-9 and IL-6 levels were measured with the corresponding Human Quantikine(R) ELISA Kit (R&D Systems) according to the instructions provided by the manufacturer. Cut-off values for AF MMP-9 and IL-6 were fixed by the kit sensitivity thresholds. Data include ROC curves for glucose (Fig. 1), IL-6 (Fig. 2) and MMP-9 (Fig. 3), aiming to establish sensitivity and specificity in the prediction of premature delivery. Statistical analyses are performed with SPSS v20.0 software. Statistical significance is determined using the Mann-Whitney and one-way ANOVA tests. The association with preterm delivery is assessed using a two-proportions test. Correlations are measured using Pearson's coefficient. A p value < 0.05 is considered statistically significant. The data are discussed further in the related research article "…amniotic fluid" (A. Kesrouani, E. Chalhoub, E. El Rassy, M. Germanos, A. Khazzaka, J. Rizkallah, E. Attieh, N. Aouad, 2016) [1].

  10. Traffic Prediction Scheme based on Chaotic Models in Wireless Networks

    Directory of Open Access Journals (Sweden)

    Xiangrong Feng

    2013-09-01

    Full Text Available Based on the local support vector algorithm for chaotic time series analysis, the Hannan-Quinn information criterion and SAX symbolization are introduced, and a novel prediction algorithm is proposed and successfully applied to the prediction of wireless network traffic. For the problem of correctly predicting short-term flows with smaller data sets, the weaknesses of existing algorithms during model construction are analyzed through study of, and comparison with, the LDK prediction algorithm. It is verified that the Hannan-Quinn information criterion can be used to calculate the number of neighbor points, replacing the previous empirical method and yielding a more accurate prediction model. Finally, actual flow data are used to confirm the accuracy of the proposed algorithm, LSDHQ. Our experiments show that it also outperforms the LDK algorithm in adaptability.

  11. Toward a predictive model for elastomer seals

    Science.gov (United States)

    Molinari, Nicola; Khawaja, Musab; Sutton, Adrian; Mostofi, Arash

    Nitrile butadiene rubber (NBR) and hydrogenated-NBR (HNBR) are widely used elastomers, especially as seals in oil and gas applications. During exposure to well-hole conditions, ingress of gases causes degradation of performance, including mechanical failure. We use computer simulations to investigate this problem at two different length and time-scales. First, we study the solubility of gases in the elastomer using a chemically-inspired description of HNBR based on the OPLS all-atom force-field. Starting with a model of NBR, C=C double bonds are saturated with either hydrogen or intramolecular cross-links, mimicking the hydrogenation of NBR to form HNBR. We validate against trends for the mass density and glass transition temperature for HNBR as a function of cross-link density, and for NBR as a function of the fraction of acrylonitrile in the copolymer. Second, we study mechanical behaviour using a coarse-grained model that overcomes some of the length and time-scale limitations of an all-atom approach. Nanoparticle fillers added to the elastomer matrix to enhance mechanical response are also included. Our initial focus is on understanding the mechanical properties at the elevated temperatures and pressures experienced in well-hole conditions.

  12. Active diagnosis of hybrid systems - A model predictive approach

    DEFF Research Database (Denmark)

    Tabatabaeipour, Seyed Mojtaba; Ravn, Anders P.; Izadi-Zamanabadi, Roozbeh;

    2009-01-01

    A method for active diagnosis of hybrid systems is proposed. The main idea is to predict the future output of both the normal and faulty models of the system; then at each time step an optimization problem is solved with the objective of maximizing the difference between the predicted normal and faulty outputs, constrained by tolerable performance requirements. As in standard model predictive control, the first element of the optimal input is applied to the system and the whole procedure is repeated until the fault is detected by a passive diagnoser. It is demonstrated how the generated excitation signal

  13. Aero-acoustic noise of wind turbines. Noise prediction models

    Energy Technology Data Exchange (ETDEWEB)

    Maribo Pedersen, B. [ed.]

    1997-12-31

    Semi-empirical and CAA (Computational AeroAcoustics) noise prediction techniques are the subject of this expert meeting, which presents and discusses models and methods. The meeting may provide answers to the following questions: Which noise sources are the most important? How are the sources best modeled? What needs to be done to make better predictions? Does it boil down to correct prediction of the unsteady aerodynamics around the rotor? Or is the difficult part converting the aerodynamics into acoustics? (LN)

  14. Model Predictive Control of a Wave Energy Converter

    DEFF Research Database (Denmark)

    Andersen, Palle; Pedersen, Tom Søndergård; Nielsen, Kirsten Mølgaard;

    2015-01-01

    In this paper reactive control and Model Predictive Control (MPC) for a Wave Energy Converter (WEC) are compared. The analysis is based on a WEC from Wave Star A/S designed as a point absorber. The model predictive controller uses wave models based on the dominating sea states combined with a model connecting undisturbed wave sequences to sequences of torque. Losses in the conversion from mechanical to electrical power are taken into account in two ways. Conventional reactive controllers are tuned for each sea state with the assumption that the converter has the same efficiency back and forth. MPC
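
    Several records above share the receding-horizon idea: at each step, optimize an input sequence over a finite horizon, apply only the first input, then re-solve. A minimal sketch, assuming a scalar linear plant and a brute-force search over a small discrete input set (standing in for the real quadratic-program or torque optimizer):

```python
import itertools

a, b = 0.9, 0.5              # assumed plant: x' = a*x + b*u
inputs = [-1.0, 0.0, 1.0]    # small discrete input set
horizon = 3

def cost(x0, seq):
    """Sum of stage costs over the horizon for one input sequence."""
    x, total = x0, 0.0
    for u in seq:
        x = a * x + b * u
        total += x ** 2 + 0.1 * u ** 2   # penalize state error and effort
    return total

x = 5.0
trajectory = [x]
for _ in range(10):
    # Solve the finite-horizon problem by enumerating input sequences.
    best = min(itertools.product(inputs, repeat=horizon),
               key=lambda seq: cost(x, seq))
    x = a * x + b * best[0]              # apply only the first input
    trajectory.append(x)
# The state is driven from 5.0 toward the origin step by step.
```

    Real MPC replaces the enumeration with a convex solver and adds explicit constraints, but the apply-first-input-and-re-solve loop is exactly this.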

  15. Modelling and prediction of non-stationary optical turbulence behaviour

    Science.gov (United States)

    Doelman, Niek; Osborn, James

    2016-07-01

    There is a strong need to model the temporal fluctuations in turbulence parameters, for instance for scheduling, simulation and prediction purposes. This paper aims at modelling the dynamic behaviour of the turbulence coherence length r0, utilising measurement data from the Stereo-SCIDAR instrument installed at the Isaac Newton Telescope at La Palma. Based on an estimate of the power spectral density function, a low order stochastic model to capture the temporal variability of r0 is proposed. The impact of this type of stochastic model on the prediction of the coherence length behaviour is shown.

  16. Research on Drag Torque Prediction Model for the Wet Clutches

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Considering the surface tension effect and the centrifugal effect, a mathematical model based on the Reynolds equation for predicting the drag torque of disengaged wet clutches is presented. The model indicates that the equivalent radius is a function of clutch speed and flow rate. The drag torque reaches its peak at a critical speed; above this speed, drag torque drops due to the shrinking of the oil film. The model also captures the effects of viscosity and flow rate on drag torque. Experimental results indicate that the model is reasonable and that it performs well in predicting the drag torque peak.
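
    The classical starting point such models extend is the laminar Couette film between annular plates, T = π·μ·ω·(r2⁴ − r1⁴) / (2h), before surface-tension, centrifugal, or film-shrinkage corrections are added. The numbers below are illustrative, not values from the paper.

```python
import math

def couette_drag_torque(mu, omega, r1, r2, h):
    """Viscous drag torque of a full Newtonian oil film sheared in the
    clearance h between annular friction faces of radii r1..r2."""
    return math.pi * mu * omega * (r2 ** 4 - r1 ** 4) / (2 * h)

torque = couette_drag_torque(
    mu=0.05,       # Pa*s, ATF-like viscosity (assumed)
    omega=100.0,   # rad/s relative speed (assumed)
    r1=0.05,       # m, inner radius of friction face (assumed)
    r2=0.08,       # m, outer radius (assumed)
    h=0.2e-3,      # m, plate clearance (assumed)
)
# torque ≈ 1.36 N·m for these illustrative values
```

    The paper's contribution is precisely what this formula misses: at high speed the film shrinks, so the effective outer radius drops and the drag torque falls after a peak instead of growing with ω.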

  17. Development and application of chronic disease risk prediction models.

    Science.gov (United States)

    Oh, Sun Min; Stefani, Katherine M; Kim, Hyeon Chang

    2014-07-01

    Currently, non-communicable chronic diseases are a major cause of morbidity and mortality worldwide, and a large proportion of chronic diseases are preventable through risk factor management. However, the prevention efficacy at the individual level is not yet satisfactory. Chronic disease prediction models have been developed to assist physicians and individuals in clinical decision-making. A chronic disease prediction model assesses multiple risk factors together and estimates an absolute disease risk for the individual. Accurate prediction of an individual's future risk for a certain disease enables the comparison of benefits and risks of treatment, the costs of alternative prevention strategies, and selection of the most efficient strategy for the individual. A large number of chronic disease prediction models, especially targeting cardiovascular diseases and cancers, have been suggested, and some of them have been adopted in the clinical practice guidelines and recommendations of many countries. Although few chronic disease prediction tools have been suggested in the Korean population, their clinical utility is not as high as expected. This article reviews methodologies that are commonly used for developing and evaluating a chronic disease prediction model and discusses the current status of chronic disease prediction in Korea.

  18. Evaluation of burst pressure prediction models for line pipes

    Energy Technology Data Exchange (ETDEWEB)

    Zhu, Xian-Kui, E-mail: zhux@battelle.org [Battelle Memorial Institute, 505 King Avenue, Columbus, OH 43201 (United States); Leis, Brian N. [Battelle Memorial Institute, 505 King Avenue, Columbus, OH 43201 (United States)

    2012-01-15

    Accurate prediction of burst pressure plays a central role in engineering design and integrity assessment of oil and gas pipelines. Theoretical and empirical solutions for such prediction are evaluated in this paper relative to a burst pressure database comprising more than 100 tests covering a variety of pipeline steel grades and pipe sizes. Solutions considered include three based on plasticity theory for the end-capped, thin-walled, defect-free line pipe subjected to internal pressure in terms of the Tresca, von Mises, and ZL (or Zhu-Leis) criteria, one based on a cylindrical instability stress (CIS) concept, and a large group of analytical and empirical models previously evaluated by Law and Bowie (International Journal of Pressure Vessels and Piping, 84, 2007: 487-492). It is found that these models can be categorized into either a Tresca-family or a von Mises-family of solutions, except for those due to the Margetson and Zhu-Leis models. The viability of predictions is measured via statistical analyses in terms of a mean error and its standard deviation. Consistent with an independent parallel evaluation using another large database, the Zhu-Leis solution is found best for predicting burst pressure, including consideration of strain hardening effects, while the Tresca strength solutions including Barlow, Maximum shear stress, Turner, and the ASME boiler code provide reasonably good predictions for the class of line-pipe steels with intermediate strain hardening response. Highlights: (1) This paper evaluates different burst pressure prediction models for line pipes. (2) The existing models are categorized into two major groups of Tresca and von Mises solutions. (3) The prediction quality of each model is assessed statistically using a large full-scale burst test database. (4) The Zhu-Leis solution is identified as the best predictive model.
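
    For the thin-walled, end-capped, defect-free limit discussed in this record, the two solution families reduce to simple closed forms: Barlow/Tresca uses the hoop stress alone, while von Mises credits the biaxial stress state and so predicts 2/√3 (about 15%) more. The pipe dimensions and tensile strength below are assumed, X65-like illustrative values.

```python
import math

def burst_tresca(sigma_u, t, d):
    """Tresca-family (Barlow) estimate: P = 2*sigma_u*t/D."""
    return 2.0 * sigma_u * t / d

def burst_von_mises(sigma_u, t, d):
    """von Mises-family estimate: P = (4/sqrt(3))*sigma_u*t/D."""
    return 4.0 / math.sqrt(3.0) * sigma_u * t / d

# Illustrative line pipe: 530 MPa tensile strength, 12.7 mm wall,
# 610 mm outside diameter (assumed values, not from the database).
p_tresca = burst_tresca(530e6, 12.7e-3, 0.610)   # ≈ 22.1 MPa
p_mises = burst_von_mises(530e6, 12.7e-3, 0.610) # ≈ 25.5 MPa
```

    Tests show real line-pipe steels bursting between these two bounds depending on strain hardening, which is why the Zhu-Leis solution, sitting between the families, fits the database best.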

  19. Outcome Prediction in Mathematical Models of Immune Response to Infection.

    Directory of Open Access Journals (Sweden)

    Manuel Mai

    Full Text Available Clinicians need to predict patient outcomes with high accuracy as early as possible after disease inception. In this manuscript, we show that patient-to-patient variability sets a fundamental limit on outcome prediction accuracy for a general class of mathematical models for the immune response to infection. However, accuracy can be increased at the expense of delayed prognosis. We investigate several systems of ordinary differential equations (ODEs) that model the host immune response to a pathogen load. Advantages of systems of ODEs for investigating the immune response to infection include the ability to collect data on large numbers of 'virtual patients', each with a given set of model parameters, and obtain many time points during the course of the infection. We implement patient-to-patient variability v in the ODE models by randomly selecting the model parameters from distributions with coefficients of variation v that are centered on physiological values. We use logistic regression with one-versus-all classification to predict the discrete steady-state outcomes of the system. We find that the prediction algorithm achieves near 100% accuracy for v = 0, and the accuracy decreases with increasing v for all ODE models studied. The fact that multiple steady-state outcomes can be obtained for a given initial condition, i.e., the basins of attraction overlap in the space of initial conditions, limits the prediction accuracy for v > 0. Increasing the elapsed time of the variables used to train and test the classifier increases the prediction accuracy, while adding explicit external noise to the ODE models decreases the prediction accuracy. Our results quantify the competition between early prognosis and high prediction accuracy that is frequently encountered by clinicians.
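
    The virtual-patient setup can be illustrated with a deliberately minimal stand-in for the paper's ODEs: a single pathogen equation dP/dt = rP − kP (assumed form), with parameters drawn around physiological values at coefficient of variation v. At v = 0 every virtual patient lands in the same basin (clearance), so outcomes are perfectly predictable; larger v mixes the basins.

```python
import random

def simulate_outcome(r, k, p0=1.0, dt=0.01, steps=2000):
    """Forward-Euler integration of dP/dt = r*P - k*P.
    Returns 1 if the pathogen load shrinks (cleared), else 0."""
    p = p0
    for _ in range(steps):
        p += dt * (r * p - k * p)
    return 1 if p < p0 else 0

def virtual_cohort(v, n=200, seed=0):
    """Draw n virtual patients with parameter variability v."""
    rng = random.Random(seed)
    outcomes = []
    for _ in range(n):
        r = rng.gauss(1.0, v * 1.0)   # pathogen growth rate (assumed mean)
        k = rng.gauss(1.2, v * 1.2)   # immune clearance rate (assumed mean)
        outcomes.append(simulate_outcome(r, k))
    return outcomes

cleared_v0 = sum(virtual_cohort(0.0))   # all 200 clear: fully predictable
cleared_v05 = sum(virtual_cohort(0.5))  # mixed outcomes: prediction limited
```

    In the paper the classifier sees early time-series values rather than the parameters themselves, but the mechanism limiting its accuracy is the same basin overlap this toy model produces.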

  20. A Wake Model for the Prediction of Propeller Performance at Low Advance Ratios

    Directory of Open Access Journals (Sweden)

    Ye Tian

    2012-01-01

    Full Text Available A low order panel method is used to predict the performance of propellers. A wake alignment model based on a pseudounsteady scheme is proposed and implemented. The results from this full wake alignment (FWA model are correlated with available experimental data, and results from RANS for some propellers at design and low advance ratios. Significant improvements have been found in the predicted integrated forces and pressure distributions.

  1. Cross–Project Defect Prediction With Respect To Code Ownership Model: An Empirical Study

    Directory of Open Access Journals (Sweden)

    Marian Jureczko

    2015-06-01

    Full Text Available The paper presents an analysis of 83 versions of industrial, open-source and academic projects. We have empirically evaluated whether those project types constitute separate classes of projects with regard to defect prediction. Statistical tests proved that there exist significant differences between the models trained on the aforementioned project classes. This work makes the next step towards cross-project reusability of defect prediction models and facilitates their adoption, which has been very limited so far.

  2. Integrating a calibrated groundwater flow model with error-correcting data-driven models to improve predictions

    Science.gov (United States)

    Demissie, Yonas K.; Valocchi, Albert J.; Minsker, Barbara S.; Bailey, Barbara A.

    2009-01-01

    Physically-based groundwater models (PBMs), such as MODFLOW, contain numerous parameters which are usually estimated using statistically-based methods that assume the underlying error is white noise. However, because of the practical difficulties of representing all the natural subsurface complexity, numerical simulations are often prone to large uncertainties that can result in both random and systematic model error. The systematic errors can be attributed to conceptual, parameter, and measurement uncertainty, and it can often be difficult to determine their physical cause. In this paper, we have developed a framework to handle systematic error in physically-based groundwater flow model applications that uses error-correcting data-driven models (DDMs) in a complementary fashion. The data-driven models are separately developed to predict the MODFLOW head prediction errors, which are subsequently used to update the head predictions at existing and proposed observation wells. The framework is evaluated using a hypothetical case study developed based on a phytoremediation site at the Argonne National Laboratory. This case study includes structural, parameter, and measurement uncertainties. In terms of bias and prediction uncertainty range, the complementary modeling framework has shown substantial improvements (up to 64% reduction in RMSE and prediction error ranges) over the original MODFLOW model, in both the calibration and the verification periods. Moreover, the spatial and temporal correlations of the prediction errors are significantly reduced, thus resulting in reduced local biases and structures in the model prediction errors.
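    The complementary "physics model plus error-correcting data-driven model" idea can be sketched with synthetic data. The head field, the biased physics model, and the choice of a random forest are all stand-ins for illustration, not the paper's actual MODFLOW setup:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)

# Synthetic stand-in: a physically-based model (PBM) captures the regional
# head trend but has a systematic structural error (misses a local component).
x = rng.uniform(0.0, 10.0, 300).reshape(-1, 1)            # well locations
true_head = 50.0 - 2.0 * x[:, 0] + 1.5 * np.sin(x[:, 0])
observed = true_head + 0.1 * rng.standard_normal(300)     # noisy measurements
pbm_head = 50.0 - 2.0 * x[:, 0]                           # biased PBM output

# Data-driven model (DDM) trained to predict the PBM's head-prediction error.
residual = observed - pbm_head
ddm = RandomForestRegressor(n_estimators=100, random_state=0)
ddm.fit(x[:200], residual[:200])

# Complementary prediction: PBM output plus DDM error correction.
corrected = pbm_head[200:] + ddm.predict(x[200:])

def rmse(pred):
    return float(np.sqrt(np.mean((pred - observed[200:]) ** 2)))

print("PBM RMSE      :", round(rmse(pbm_head[200:]), 3))
print("corrected RMSE:", round(rmse(corrected), 3))
```

    Because the PBM's error is systematic (spatially structured) rather than white noise, a regressor on location can learn and remove much of it, which is the mechanism behind the reported RMSE reductions.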

  3. Model Predictive Control of Wind Turbines

    DEFF Research Database (Denmark)

    Henriksen, Lars Christian

    Wind turbines play a major role in the transformation from a fossil fuel based energy production to a more sustainable production of energy. Total-cost-of-ownership is an important parameter when investors decide in which energy technology they should place their capital. Modern wind turbines are controlled by pitching the blades and by controlling the electro-magnetic torque of the generator, thus slowing the rotation of the blades. Improved control of wind turbines, leading to reduced fatigue loads, can be exploited by using less materials in the construction of the wind turbine or by reducing the need for maintenance of the wind turbine. Either way, better total-cost-of-ownership for wind turbine operators can be achieved by improved control of the wind turbines. Wind turbine control can be improved in two ways: by improving the model on which the controller bases its design, or by improving…

  4. Numerical modeling capabilities to predict repository performance

    Energy Technology Data Exchange (ETDEWEB)

    1979-09-01

    This report presents a summary of current numerical modeling capabilities that are applicable to the design and performance evaluation of underground repositories for the storage of nuclear waste. The report includes codes that are available in-house, within Golder Associates and Lawrence Livermore Laboratories, as well as those that are generally available within the industry and universities. The first listing of programs covers in-house codes in the subject areas of hydrology, solute transport, thermal and mechanical stress analysis, and structural geology. The second listing of programs is divided by subject into the following categories: site selection, structural geology, mine structural design, mine ventilation, hydrology, and mine design/construction/operation. These programs are not specifically designed for use in the design and evaluation of an underground repository for nuclear waste, but several or most of them may be so used.

  5. Quantum mechanically based estimation of perturbed-chain polar statistical associating fluid theory parameters for analyzing their physical significance and predicting properties.

    Science.gov (United States)

    Nhu, Nguyen Van; Singh, Mahendra; Leonhard, Kai

    2008-05-08

    We have computed molecular descriptors for sizes, shapes, charge distributions, and dispersion interactions for 67 compounds using quantum chemical ab initio and density functional theory methods. For the same compounds, we have fitted the three perturbed-chain polar statistical associating fluid theory (PCP-SAFT) equation of state (EOS) parameters to experimental data and have performed a statistical analysis for relations between the descriptors and the EOS parameters. On this basis, an analysis of the physical significance of the parameters, the limits of the present descriptors, and the PCP-SAFT EOS has been performed. The result is a method that can be used to estimate the vapor pressure curve including the normal boiling point, the liquid volume, the enthalpy of vaporization, the critical data, mixture properties, and so on. When only two of the three parameters are predicted and one is adjusted to experimental normal boiling point data, excellent predictions of all investigated pure compound and mixture properties are obtained. We are convinced that the methodology presented in this work will lead to new EOS applications as well as improved EOS models whose predictive performance is likely to surpass that of most present quantum chemically based, quantitative structure-property relationship, and group contribution methods for a broad range of chemical substances.

  6. Significance of peak height velocity as a predictive factor for curve progression in patients with idiopathic scoliosis

    Science.gov (United States)

    2015-01-01

    …progression in patients with IS. Conclusions: These findings indicate that 31.5 degrees of spinal curvature when patients are at PHV is a significant predictive indicator for progression of the curve to a magnitude requiring surgery. We suggest that curve-progression risk assessment in patients with IS should include PHV, along with measures of skeletal and non-skeletal maturity. PMID:25815057

  7. Significance of Vestibular Testing on Distinguishing the Nerve of Origin for Vestibular Schwannoma and Predicting the Preservation of Hearing

    Institute of Scientific and Technical Information of China (English)

    Yu-Bo He; Chun-Jiang Yu; Hong-Ming Ji; Yan-Ming Qu; Ning Chen

    2016-01-01

    Background: Determining the nerve of origin for vestibular schwannoma (VS), as a method for predicting hearing prognosis, has not been systematically considered. The vestibular test can be used to investigate the function of the superior vestibular nerve (SVN) and the inferior vestibular nerve (IVN). This study aimed to preoperatively distinguish the nerve of origin for VS patients using the vestibular test, and to determine whether this correlated with hearing preservation. Methods: A total of 106 patients with unilateral VS were enrolled in this study prospectively. Each patient received a caloric test, a vestibular-evoked myogenic potential (VEMP) test, and a cochlear nerve function test (hearing) before the operation and 1 week, 3 months, and 6 months postoperatively. All patients underwent surgical removal of the VS using the suboccipital approach. During the operation, the nerve of tumor origin (SVN or IVN) was identified by the surgeon. Tumor size was measured by preoperative magnetic resonance imaging. Results: The nerve of tumor origin could not be unequivocally identified in 38 patients (38/106, 35.80%); these patients were not subsequently evaluated. In 26 patients (nine females, seventeen males), tumors arose from the SVN, and in 42 patients (18 females, 24 males), tumors arose from the IVN. The results of the caloric and VEMP tests were significantly different between tumors originating from the SVN and those originating from the IVN. Hearing was preserved in 16 of 26 patients (61.54%) with SVN-originating tumors, whereas hearing was preserved in only seven of 42 patients (16.67%) with IVN-originating tumors. Conclusions: Our data suggest that caloric and VEMP tests might help to identify whether VS tumors originate from the SVN or the IVN. These tests could also be used to evaluate the residual function of the nerves after surgery. Using this information, we might better predict the preservation of hearing for patients.

  8. Ensemble models on palaeoclimate to predict India's groundwater challenge

    Directory of Open Access Journals (Sweden)

    Partha Sarathi Datta

    2013-09-01

    Full Text Available In many parts of the world, the freshwater crisis is largely due to increasing water consumption and pollution by a rapidly growing population and aspirations for economic development, but it is usually ascribed to the climate. However, limited understanding of the factors controlling climate, knowledge gaps, and uncertainties in climate models make it difficult to assess the probable impacts on water availability in tropical regions. In this context, a review of ensemble models on δ18O and δD in rainfall and groundwater, 3H- and 14C- ages of groundwater and 14C- age of lake sediments helped to reconstruct palaeoclimate and long-term recharge in North-west India, and to predict future groundwater challenges. The annual mean temperature trend indicates both warming and cooling in different parts of India in the past and during 1901–2010. Neither the GCMs (Global Climate Models) nor the observational record indicates any significant change/increase in temperature and rainfall over the last century, or climate change during the last 1200 yrs BP. In much of the North-West region, deep groundwater renewal occurred under a past humid climate, and shallow groundwater renewal from limited modern recharge over the past decades. To make water management more responsive to climate change, the gaps in the science of climate change need to be bridged.

  10. Comparison of Linear Prediction Models for Audio Signals

    Directory of Open Access Journals (Sweden)

    van Waterschoot Toon

    2008-01-01

    Full Text Available While linear prediction (LP has become immensely popular in speech modeling, it does not seem to provide a good approach for modeling audio signals. This is somewhat surprising, since a tonal signal consisting of a number of sinusoids can be perfectly predicted based on an (all-pole LP model with a model order that is twice the number of sinusoids. We provide an explanation why this result cannot simply be extrapolated to LP of audio signals. If noise is taken into account in the tonal signal model, a low-order all-pole model appears to be only appropriate when the tonal components are uniformly distributed in the Nyquist interval. Based on this observation, different alternatives to the conventional LP model can be suggested. Either the model should be changed to a pole-zero, a high-order all-pole, or a pitch prediction model, or the conventional LP model should be preceded by an appropriate frequency transform, such as a frequency warping or downsampling. By comparing these alternative LP models to the conventional LP model in terms of frequency estimation accuracy, residual spectral flatness, and perceptual frequency resolution, we obtain several new and promising approaches to LP-based audio modeling.
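    The abstract's claim that a tonal signal of K sinusoids is perfectly predicted by an all-pole LP model of order 2K is easy to check numerically. A minimal least-squares LP sketch (the frequencies and phases below are arbitrary choices for the demonstration):

```python
import numpy as np

# A sum of K sinusoids satisfies an exact linear recursion of order 2K,
# so an LP model of order 2K predicts it to machine precision.
n = np.arange(2000)
freqs = (0.05, 0.11, 0.23)        # normalized frequencies (cycles/sample)
phases = (0.3, 1.1, 2.0)
x = sum(np.sin(2 * np.pi * f * n + p) for f, p in zip(freqs, phases))

def lp_residual(x, order):
    # Least-squares linear prediction: x[t] ~ sum_k a[k] * x[t-1-k]
    X = np.column_stack([x[order - 1 - k: len(x) - 1 - k] for k in range(order)])
    y = x[order:]
    a, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ a

err_exact = float(np.max(np.abs(lp_residual(x, 2 * len(freqs)))))  # order 2K
err_low = float(np.max(np.abs(lp_residual(x, 2))))                 # under-modelled
print(f"order {2 * len(freqs)} max |residual|: {err_exact:.2e}")
print(f"order 2 max |residual|: {err_low:.2e}")
```

    Adding noise to the signal is what breaks this picture, which is exactly the observation the paper builds its alternative LP models on.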

  11. Hidden Markov models for prediction of protein features

    DEFF Research Database (Denmark)

    Bystroff, Christopher; Krogh, Anders

    2008-01-01

    Hidden Markov Models (HMMs) are an extremely versatile statistical representation that can be used to model any set of one-dimensional discrete symbol data. HMMs can model protein sequences in many ways, depending on what features of the protein are represented by the Markov states. For protein structure prediction, states have been chosen to represent either homologous sequence positions, local or secondary structure types, or transmembrane locality. The resulting models can be used to predict common ancestry, secondary or local structure, or membrane topology by applying one of the two standard algorithms for comparing a sequence to a model. In this chapter, we review those algorithms and discuss how HMMs have been constructed and refined for the purpose of protein structure prediction.
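    One of the two standard algorithms for comparing a sequence to a model is the forward algorithm, which scores the sequence's likelihood under the HMM. A minimal sketch with a made-up two-state model (H: helix-like, C: coil) over a hydrophobic/polar alphabet — the probabilities are illustrative only:

```python
import numpy as np

start = np.array([0.5, 0.5])
trans = np.array([[0.9, 0.1],
                  [0.2, 0.8]])
emit = {"h": np.array([0.8, 0.3]),   # P('h' | H), P('h' | C)
        "p": np.array([0.2, 0.7])}   # P('p' | H), P('p' | C)

def forward_logprob(seq):
    """log P(seq | model), with per-step rescaling to avoid underflow."""
    alpha = start * emit[seq[0]]
    logp = np.log(alpha.sum())
    alpha = alpha / alpha.sum()
    for sym in seq[1:]:
        alpha = (alpha @ trans) * emit[sym]   # propagate, then weight by emission
        logp += np.log(alpha.sum())
        alpha = alpha / alpha.sum()
    return float(logp)

print(forward_logprob("hhhh"), forward_logprob("hphp"))
```

    The other standard algorithm, Viterbi decoding, returns the most likely state path instead of the total probability.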

  12. Modelling of physical properties - databases, uncertainties and predictive power

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    Physical and thermodynamic properties, in the form of raw data or estimated values for pure compounds and mixtures, are important pre-requisites for performing tasks such as process design, simulation and optimization; computer aided molecular/mixture (product) design; and product-process analysis… connectivity approach. The development of these models requires measured property data, based on which the regression of model parameters is performed. Although this class of models is empirical by nature, they do allow extrapolation from the regressed model parameters to predict properties of chemicals not included in the measured data-set. Therefore, they are also considered as predictive models. The paper will highlight different issues/challenges related to the role of the databases, the mathematical and thermodynamic consistency of the measured/estimated data, and the predictive nature of the developed…

  13. Modeling, Prediction, and Control of Heating Temperature for Tube Billet

    Directory of Open Access Journals (Sweden)

    Yachun Mao

    2015-01-01

    Full Text Available Annular furnaces have multivariate, nonlinear, large-time-lag, and cross-coupling characteristics. The prediction and control of the exit temperature of a tube billet are important but difficult. We establish a prediction model for the final temperature of a tube billet through the OS-ELM-DRPLS method. We address the complex production characteristics, integrate the advantages of the PLS and ELM algorithms in establishing linear and nonlinear models, and consider model updating and data lag. Based on the proposed model, we design a prediction control algorithm for the tube billet temperature. The algorithm is validated using practical production data of Baosteel Co., Ltd. Results show that the model achieves the precision required in industrial applications. The temperature of the tube billet can be controlled within the required range through a compensation control method.

  14. Predicting artificially drained areas by means of selective model ensemble

    DEFF Research Database (Denmark)

    Møller, Anders Bjørn; Beucher, Amélie; Iversen, Bo Vangsø

    …out since the mid-19th century, and it has been estimated that half of the cultivated area is artificially drained (Olesen, 2009). A number of machine learning approaches can be used to predict artificially drained areas in geographic space. However, instead of choosing the most accurate model… The study aims firstly to train a large number of models to predict the extent of artificially drained areas using various machine learning approaches. Secondly, the study will develop a method for selecting the models which give a good prediction of artificially drained areas when used in conjunction… The approaches employed include decision trees, discriminant analysis, regression models, neural networks and support vector machines, amongst others. Several models are trained with each method, using variously the original soil covariates and principal components of the covariates. With a large ensemble…

  15. Hidden markov model for the prediction of transmembrane proteins using MATLAB.

    Science.gov (United States)

    Chaturvedi, Navaneet; Shanker, Sudhanshu; Singh, Vinay Kumar; Sinha, Dhiraj; Pandey, Paras Nath

    2011-01-01

    Since membranous proteins play a key role in drug targeting, transmembrane protein prediction is an active and challenging area of the biological sciences. Location-based prediction of transmembrane proteins is significant for functional annotation of protein sequences. Hidden Markov model based methods have been widely applied for transmembrane topology prediction. Here we present a revised and more comprehensible model than an existing one for transmembrane protein prediction. MATLAB scripts were built and compiled for parameter estimation, and the model was applied to amino acid sequences to identify transmembrane regions and their adjacent locations. The estimated model of transmembrane topology was based on the TMHMM model architecture. Only 7 super-states are defined in the given dataset, which were converted to 96 states on the basis of their length in the sequence. The observed prediction accuracy of the model was about 74%, which is good in the area of transmembrane topology prediction. We therefore conclude that the hidden Markov model plays a crucial role in transmembrane helix prediction on the MATLAB platform, and it could also be useful for drug discovery strategies. The database is available for free at bioinfonavneet@gmail.com and vinaysingh@bhu.ac.in.
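    Topology prediction with such a model amounts to Viterbi decoding: finding the most likely state path for a residue sequence. A toy sketch with a 3-state model (i: inside, M: membrane, o: outside) over a hydrophobic/polar alphabet; all probabilities are invented for illustration and are not TMHMM's parameters:

```python
import numpy as np

states = "iMo"
start = np.log(np.array([0.45, 0.10, 0.45]))
trans = np.log(np.array([[0.95, 0.05, 1e-9],
                         [0.05, 0.90, 0.05],
                         [1e-9, 0.05, 0.95]]))
emit = {"h": np.log(np.array([0.3, 0.9, 0.3])),   # hydrophobic residue
        "p": np.log(np.array([0.7, 0.1, 0.7]))}   # polar residue

def viterbi(seq):
    """Most likely state path for seq (log-space Viterbi with backtracking)."""
    v = start + emit[seq[0]]
    back = []
    for sym in seq[1:]:
        scores = v[:, None] + trans        # scores[i, j]: best path ending i -> j
        back.append(scores.argmax(axis=0))
        v = scores.max(axis=0) + emit[sym]
    path = [int(v.argmax())]
    for b in reversed(back):
        path.append(int(b[path[-1]]))
    return "".join(states[s] for s in reversed(path))

print(viterbi("pppp" + "hhhhhhhh" + "pppp"))
```

    A hydrophobic run between polar stretches is decoded as a membrane-spanning segment; a real topology model would additionally encode segment-length constraints, which is what the 7-to-96 state expansion in the abstract accomplishes.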

  16. Experimental study on prediction model for maximum rebound ratio

    Institute of Scientific and Technical Information of China (English)

    LEI Wei-dong; TENG Jun; A.HEFNY; ZHAO Jian; GUAN Jiong

    2007-01-01

    The proposed prediction model for estimating the maximum rebound ratio was applied to a field explosion test, the Mandai test in Singapore. The estimated possible maximum peak particle velocities (PPVs) were compared with the field records. Three of the four available field-recorded PPVs lie exactly below the estimated possible maximum values, as expected, while the fourth lies close to and a bit higher than the estimated maximum possible PPV. The comparison results show that the predicted PPVs from the proposed prediction model for the maximum rebound ratio match the field-recorded PPVs better than those from two empirical formulae. The very good agreement between the estimated and field-recorded values validates the proposed prediction model for estimating PPV in a rock mass with a set of joints due to application of a two-dimensional compressional wave at the boundary of a tunnel or a borehole.

  17. Groundwater Level Prediction using M5 Model Trees

    Science.gov (United States)

    Nalarajan, Nitha Ayinippully; Mohandas, C.

    2015-01-01

    Groundwater is an important resource, readily available and having high economic value and social benefit. Recently, it has been considered a dependable source of uncontaminated water. During the past two decades, increased rates of extraction and other unsustainable human actions have resulted in a groundwater crisis, both qualitative and quantitative. Under prevailing circumstances, the availability of predicted groundwater levels increases the importance of this valuable resource, as an aid in the planning of groundwater resources. For this purpose, data-driven prediction models are widely used in the present-day world. The M5 model tree (MT) is a popular soft computing method emerging as a promising technique for numeric prediction, producing understandable models. The present study discusses groundwater level prediction using MT, employing only the historical groundwater levels from a groundwater monitoring well. The results showed that MT can be successfully used for forecasting groundwater levels.
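    Forecasting from historical levels alone means reframing the series as a supervised problem on lagged values. scikit-learn has no M5 model tree, so the sketch below uses a CART regression tree on synthetic monthly levels purely to illustrate the same tree-on-lags setup (M5 would additionally fit linear models in the leaves):

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(3)

# Synthetic monthly groundwater levels: seasonal cycle, slow decline, noise.
t = np.arange(360)
level = (25.0 + 2.0 * np.sin(2 * np.pi * t / 12) - 0.01 * t
         + 0.1 * rng.standard_normal(360))

def lagged(series, n_lags=12):
    # Predict this month's level from the previous 12 months.
    X = np.column_stack([series[i: len(series) - n_lags + i]
                         for i in range(n_lags)])
    return X, series[n_lags:]

X, y = lagged(level)
split = 300
tree = DecisionTreeRegressor(max_depth=6, random_state=0).fit(X[:split], y[:split])
pred = tree.predict(X[split:])
rmse = float(np.sqrt(np.mean((pred - y[split:]) ** 2)))
print("one-step-ahead RMSE:", round(rmse, 3))
```

    One caveat worth noting for monitoring-well data: plain regression trees cannot extrapolate beyond the level range seen in training, so long declining trends eventually degrade the forecast.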

  18. A Fusion Model for CPU Load Prediction in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Dayu Xu

    2013-11-01

    Full Text Available Load prediction plays a key role in cost-optimal resource allocation and datacenter energy saving. In this paper, we use real-world traces from a Cloud platform and propose a fusion model to forecast future CPU loads. First, the long CPU load time series is divided into short sequences of the same length on the basis of the cloud control cycle. Then we use a kernel fuzzy c-means clustering algorithm to put the subsequences into different clusters. For each cluster, with the current load sequence, a genetic-algorithm-optimized wavelet Elman neural network prediction model is exploited to predict the CPU load in the next time interval. Finally, we obtain the optimal CPU load prediction from the cluster and its corresponding predictor with minimum forecasting error. Experimental results show that our algorithm performs better than other models reported in previous works.
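    The cluster-then-predict structure can be sketched with simpler stand-ins: plain KMeans in place of kernel fuzzy c-means, and a ridge autoregression in place of the GA-optimized wavelet Elman network. The trace below is synthetic; only the pipeline shape matches the paper's idea:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import Ridge

rng = np.random.default_rng(7)

# Synthetic CPU load trace alternating between calm and bursty regimes.
load = np.concatenate([
    (0.2 if k % 2 == 0 else 0.7) + 0.05 * rng.standard_normal(20)
    for k in range(60)
])
load = np.clip(load, 0.0, 1.0)

w = 8                                 # window length, one "control cycle"
X = np.array([load[i:i + w] for i in range(len(load) - w)])
y = load[w:]
split = 1000

# Cluster the historical subsequences, then train one predictor per cluster.
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X[:split])
models = {c: Ridge().fit(X[:split][km.labels_ == c], y[:split][km.labels_ == c])
          for c in (0, 1)}

# At prediction time, route each current sequence to its cluster's predictor.
clusters = km.predict(X[split:])
pred = np.array([models[c].predict(xi[None, :])[0]
                 for c, xi in zip(clusters, X[split:])])
rmse = float(np.sqrt(np.mean((pred - y[split:]) ** 2)))
print("next-interval RMSE:", round(rmse, 3))
```

    Specializing a predictor per load regime is the point of the fusion design; a single global model has to average over calm and bursty behavior.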

  19. Modelling proteins' hidden conformations to predict antibiotic resistance

    Science.gov (United States)

    Hart, Kathryn M.; Ho, Chris M. W.; Dutta, Supratik; Gross, Michael L.; Bowman, Gregory R.

    2016-10-01

    TEM β-lactamase confers bacteria with resistance to many antibiotics and rapidly evolves activity against new drugs. However, functional changes are not easily explained by differences in crystal structures. We employ Markov state models to identify hidden conformations and explore their role in determining TEM's specificity. We integrate these models with existing drug-design tools to create a new technique, called Boltzmann docking, which better predicts TEM specificity by accounting for conformational heterogeneity. Using our MSMs, we identify hidden states whose populations correlate with activity against cefotaxime. To experimentally detect our predicted hidden states, we use rapid mass spectrometric footprinting and confirm our models' prediction that increased cefotaxime activity correlates with reduced Ω-loop flexibility. Finally, we design novel variants to stabilize the hidden cefotaximase states, and find their populations predict activity against cefotaxime in vitro and in vivo. Therefore, we expect this framework to have numerous applications in drug and protein design.

  20. A Hybrid Neural Network Prediction Model of Air Ticket Sales

    Directory of Open Access Journals (Sweden)

    Han-Chen Huang

    2013-11-01

    Full Text Available Air ticket sales revenue is an important source of revenue for travel agencies, and if future air ticket sales revenue can be accurately forecast, travel agencies will be able to procure a sufficient number of cost-effective tickets in advance. Therefore, this study applied an Artificial Neural Network (ANN) and Genetic Algorithms (GA) to establish a prediction model of travel agency air ticket sales revenue. Verification against empirical data showed that the established prediction model has accurate predictive power, with a MAPE (mean absolute percentage error) of only 9.11%. The established model can provide business operators with reliable and efficient prediction data as a reference for operational decisions.
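    The reported 9.11% figure is a MAPE, which is straightforward to compute; the revenue numbers below are hypothetical, chosen only to show the metric:

```python
import numpy as np

def mape(actual, predicted):
    """Mean absolute percentage error, in percent (actuals must be nonzero)."""
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return float(np.mean(np.abs((actual - predicted) / actual)) * 100.0)

# Hypothetical monthly ticket revenues vs. forecasts:
print(round(mape([100, 200, 400], [110, 180, 400]), 2))  # ≈ 6.67
```

    MAPE weights each period by its relative error, so it is scale-free across months of very different revenue, but it is undefined for zero actuals and penalizes over-forecasts of small values heavily.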

  1. E-commerce business model mining and prediction

    Institute of Scientific and Technical Information of China (English)

    Zhou-zhou HE; Zhong-fei ZHANG; Chun-ming CHEN; Zheng-gang WANG

    2015-01-01

    We study the problem of business model mining and prediction in the e-commerce context. Unlike most existing approaches, where this is typically formulated as a regression problem or a time-series prediction problem, we take a different formulation to this problem by noting that these existing approaches fail to consider the potential relationships both among the consumers (consumer influence) and among the shops (competitions or collaborations). Taking this observation into consideration, we propose a new method for e-commerce business model mining and prediction, called EBMM, which combines regression with community analysis. The challenge is that the links in the network are typically not directly observed, which is addressed by applying information diffusion theory through the consumer-shop network. Extensive evaluations using Alibaba Group e-commerce data demonstrate the promise and superiority of EBMM over the state-of-the-art methods in terms of business model mining and prediction.

  2. Model for Predicting Passage of Invasive Fish Species Through Culverts

    Science.gov (United States)

    Neary, V.

    2010-12-01

    Conservation efforts to promote or inhibit fish passage include the application of simple fish passage models to determine whether an open channel flow allows passage of a given fish species. Derivations of simple fish passage models for uniform and nonuniform flow conditions are presented. For uniform flow conditions, a model equation is developed that predicts the mean-current velocity threshold in a fishway, or velocity barrier, which causes exhaustion at a given maximum distance of ascent. The derivation of a simple expression for this exhaustion-threshold (ET) passage model is presented using kinematic principles coupled with fatigue curves for threatened and endangered fish species. Mean current velocities at or above the threshold predict failure to pass. Mean current velocities below the threshold predict successful passage. The model is therefore intuitive and easily applied to predict passage or exclusion. The ET model’s simplicity comes with limitations, however, including its application only to uniform flow, which is rarely found in the field. This limitation is addressed by deriving a model that accounts for nonuniform conditions, including backwater profiles and drawdown curves. Comparison of these models with experimental data from volitional swimming studies of fish indicates reasonable performance, but limitations are still present due to the difficulty in predicting fish behavior and passage strategies that can vary among individuals and different fish species.
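    For the uniform-flow case, the exhaustion-threshold idea reduces to simple kinematics: a fish swimming at speed U against a current U_w makes ground speed U − U_w and can sustain U only for an endurance E(U) given by a species-specific fatigue curve. A hedged sketch; the log-linear fatigue-curve coefficients below are invented, not taken from any real species:

```python
import numpy as np

A, B = 4.0, 1.2   # hypothetical fatigue curve: endurance E(U) = 10**(A - B*U) s

def max_ascent(u_water, u_grid=np.linspace(0.1, 3.0, 300)):
    """Maximum ascent distance (m) over candidate swim speeds U."""
    endurance = 10.0 ** (A - B * u_grid)          # time to exhaustion at U
    ground = u_grid - u_water                     # ground speed against current
    return float(np.max(np.where(ground > 0.0, ground * endurance, 0.0)))

def passes(culvert_length, u_water):
    """Passage predicted if some swim speed covers the barrier before exhaustion."""
    return max_ascent(u_water) >= culvert_length

print(passes(30.0, 0.5), passes(30.0, 2.0))
```

    The mean current velocity at which max_ascent drops below the culvert length is the exhaustion threshold: at or above it, the model predicts failure to pass. Nonuniform flow (backwater profiles, drawdown curves) requires integrating this balance along the culvert instead.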

  3. Significance of genetic information in risk assessment and individual classification using silicosis as a case model

    Energy Technology Data Exchange (ETDEWEB)

    McCanlies, E.; Landsittel, D.P.; Yucesoy, B.; Vallyathan, V.; Luster, M.L.; Sharp, D.S. [NIOSH, Morgantown, WV (United States)

    2002-06-01

    Over the last decade the role of genetic data in epidemiological research has expanded considerably. The authors recently published a case-control study that evaluated the interaction between silica exposure and minor variants in the genes coding for interleukin-1 alpha (IL-1alpha), interleukin-1 receptor antagonist (IL-1RA) and tumor necrosis factor alpha (TNFalpha) as risk factors associated with silicosis, a fibrotic lung disease. In contrast, this report uses data generated from these studies to illustrate the utility of genetic information for the purposes of risk assessment and clinical prediction. Specifically, this study addresses how, given a known exposure, genetic information affects the characterization of risk groups. Relative operating characteristic (ROC) curves were then used to determine the impact of genetic information on individual classification. Logistic regression modeling procedures were used to estimate the predicted probability of developing silicosis. This probability was then used to construct predicted risk deciles, first for a model with occupational exposure only and then for a model containing occupational exposure and genetic main effects and interactions. The results indicate that genetic information plays a valuable role in effectively characterizing risk groups and mechanisms of disease operating in a substantial proportion of the population. However, in the case of fibrotic lung disease caused by silica exposure, information about the presence or absence of the minor variants of IL-1alpha, IL-1RA and TNFalpha is unlikely to be a useful tool for individual classification.
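    The exposure-only versus exposure-plus-genetics comparison via ROC curves can be sketched on synthetic case-control data. All coefficients and prevalences below are hypothetical; the point is only the mechanics of fitting nested logistic models and comparing their AUCs:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(5)

# Synthetic cohort: disease risk depends on exposure, with a gene-environment
# interaction for carriers of a minor variant.
n = 2000
exposure = rng.normal(0.0, 1.0, n)
genotype = rng.binomial(1, 0.3, n)
logit = -2.0 + 1.0 * exposure + 1.5 * genotype * exposure
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

X1 = exposure.reshape(-1, 1)                                     # exposure only
X2 = np.column_stack([exposure, genotype, exposure * genotype])  # + genetics

auc1 = roc_auc_score(y, LogisticRegression().fit(X1, y).predict_proba(X1)[:, 1])
auc2 = roc_auc_score(y, LogisticRegression().fit(X2, y).predict_proba(X2)[:, 1])
print(f"AUC, exposure only: {auc1:.3f}   AUC, exposure + genotype: {auc2:.3f}")
```

    The report's conclusion corresponds to the case where the AUC gain from the genetic terms is too small to matter for classifying individuals, even when the interaction usefully stratifies group-level risk.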

  4. Some Remarks on CFD Drag Prediction of an Aircraft Model

    Science.gov (United States)

    Peng, S. H.; Eliasson, P.

    Some issues observed in CFD drag predictions for the DLR-F6 aircraft model with various configurations are addressed. The emphasis is placed on the effect of turbulence modeling and grid resolution. With several different turbulence models, the predicted flow feature around the aircraft is highlighted. It is shown that the prediction of the separation bubble in the wing-body junction is closely related to the inherent modeling mechanism of turbulence production. For the configuration with an additional fairing, which has effectively removed the separation bubble, it is illustrated that the drag prediction may be altered even for an attached turbulent boundary layer when different turbulence models are used. Grid sensitivity studies are performed with two groups of subsequently refined grids. It is observed that, in contrast to the lift, the drag prediction is rather sensitive to the grid refinement, as well as to the artificial diffusion added for solving the turbulence transport equation. It is demonstrated that an effective grid refinement should drive the predicted drag components monotonically and linearly converged to a finite value.

  5. Multi-model ensemble hydrologic prediction and uncertainties analysis

    Directory of Open Access Journals (Sweden)

    S. Jiang

    2014-09-01

    Full Text Available Modelling uncertainties (i.e. input errors, parameter uncertainties and model structural errors) inevitably exist in hydrological prediction. A lot of recent attention has focused on these, of which input error modelling, parameter optimization and multi-model ensemble strategies are the three most popular methods to demonstrate the impacts of modelling uncertainties. In this paper the Xinanjiang model, the Hybrid rainfall–runoff model and the HYMOD model were applied to the Mishui Basin, south China, for daily streamflow ensemble simulation and uncertainty analysis. The three models were first calibrated by two parameter optimization algorithms, namely the Shuffled Complex Evolution method (SCE-UA) and the Shuffled Complex Evolution Metropolis method (SCEM-UA); next, the input uncertainty was accounted for by introducing a normally distributed error multiplier; then, the simulation sets calculated from the three models were combined by Bayesian model averaging (BMA). The results show that both parameter optimization algorithms generate good streamflow simulations; in particular, the SCEM-UA can capture parameter uncertainty and give the posterior distribution of the parameters. Considering the precipitation input uncertainty does not improve the streamflow simulation precision very much, while the BMA combination not only improves the streamflow prediction precision but also gives quantitative uncertainty bounds for the simulation sets. The SCEM-UA calculated prediction interval is better than the SCE-UA calculated one. These results suggest that considering the model parameters' uncertainties and performing multi-model ensemble simulations are very practical for streamflow prediction and flood forecasting, from which more precise predictions and more reliable uncertainty bounds can be generated.
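    The BMA combination step can be sketched with a simplified weighting scheme: each model gets a weight proportional to its Gaussian likelihood over a calibration period (a full BMA would estimate weights and variances jointly by EM). The streamflow series and the three "models" below are synthetic stand-ins:

```python
import numpy as np

rng = np.random.default_rng(11)

# Synthetic "observed" streamflow and three imperfect model simulations.
t = np.arange(200)
obs = 10.0 + 3.0 * np.sin(2 * np.pi * t / 50) + 0.3 * rng.standard_normal(200)
sims = np.stack([
    obs + 0.5 * rng.standard_normal(200),         # good but noisy model
    obs + 1.5 * rng.standard_normal(200) + 0.5,   # noisier and biased model
    10.0 + 2.5 * np.sin(2 * np.pi * t / 50),      # model with structural error
])

cal, ver = slice(0, 100), slice(100, 200)

# Likelihood-based weights from the calibration period.
res = obs[cal] - sims[:, cal]
sigma = res.std(axis=1, ddof=1)
loglik = np.sum(-0.5 * np.log(2 * np.pi * sigma[:, None] ** 2)
                - res ** 2 / (2 * sigma[:, None] ** 2), axis=1)
w = np.exp(loglik - loglik.max())
w /= w.sum()

combined = (w[:, None] * sims[:, ver]).sum(axis=0)

def rmse(pred):
    return float(np.sqrt(np.mean((pred - obs[ver]) ** 2)))

print("weights:", np.round(w, 3))
print("single-model RMSEs:", [round(rmse(s), 2) for s in sims[:, ver]])
print("combined RMSE:", round(rmse(combined), 2))
```

    Beyond the point forecast, the per-model variances also yield a predictive mixture distribution, which is where BMA's quantitative uncertainty bounds come from.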

  6. Model predictive torque control with an extended prediction horizon for electrical drive systems

    Science.gov (United States)

    Wang, Fengxiang; Zhang, Zhenbin; Kennel, Ralph; Rodríguez, José

    2015-07-01

    This paper presents a model predictive torque control method for electrical drive systems. A two-step prediction horizon is adopted to reduce the torque ripple. The errors between the predicted and reference values of the electromagnetic torque and stator flux, together with an over-current protection term, are considered in the cost function design. The best voltage vector is selected by minimising the cost function, which aims to achieve a low torque ripple over two intervals. The study is carried out experimentally. The results show that the proposed method achieves good performance in both steady and transient states.
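
    Finite-control-set predictive torque control of the kind outlined above enumerates the inverter's discrete voltage vectors over the prediction horizon and applies the first vector of the cheapest sequence. The sketch below uses a hypothetical first-order surrogate for the torque and flux dynamics; the gains, time step and references are invented for illustration, and a real drive would use the machine's discrete-time model and add the over-current penalty to the cost.

```python
import itertools
import math

# Candidate voltage vectors of a two-level inverter (alpha-beta plane, per-unit)
VECTORS = [(0.0, 0.0)] + [
    (math.cos(k * math.pi / 3), math.sin(k * math.pi / 3)) for k in range(6)
]

def step(state, v, dt=1e-3):
    """Hypothetical first-order surrogate: torque and flux relax toward
    values driven by the applied voltage components (toy gains)."""
    torque, flux = state
    return (torque + dt * (20.0 * v[1] - 5.0 * torque),
            flux + dt * (8.0 * v[0] - 2.0 * flux))

def cost(state, refs, lam=0.5):
    torque, flux = state
    return (refs[0] - torque) ** 2 + lam * (refs[1] - flux) ** 2

def best_vector(state, refs):
    """Two-step horizon: evaluate every sequence of two vectors, sum the
    stage costs, and return the first vector of the cheapest sequence."""
    best_c, best_v = float("inf"), VECTORS[0]
    for v1, v2 in itertools.product(VECTORS, repeat=2):
        s1 = step(state, v1)
        s2 = step(s1, v2)
        c = cost(s1, refs) + cost(s2, refs)
        if c < best_c:
            best_c, best_v = c, v1
    return best_v

state, refs = (0.0, 0.0), (0.02, 0.004)
for _ in range(50):
    state = step(state, best_vector(state, refs))
```

    With a two-step horizon the controller weighs the immediate tracking error against the state reached one interval later, which is what reduces the torque ripple relative to a single-step cost.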

  7. Validation of Biomarker-based risk prediction models

    OpenAIRE

    Taylor, Jeremy M.G.; Ankerst, Donna P.; Andridge, Rebecca R.

    2008-01-01

    The increasing availability and use of predictive models to facilitate informed decision making highlights the need for careful assessment of the validity of these models. In particular, models involving biomarkers require careful validation for two reasons: issues with overfitting when complex models involve a large number of biomarkers, and inter-laboratory variation in assays used to measure biomarkers. In this paper we distinguish between internal and external statistical validation. Inte...

  8. Prediction error, ketamine and psychosis: An updated model.

    Science.gov (United States)

    Corlett, Philip R; Honey, Garry D; Fletcher, Paul C

    2016-11-01

    In 2007, we proposed an explanation of delusion formation as aberrant prediction error-driven associative learning. Further, we argued that the NMDA receptor antagonist ketamine provided a good model for this process. Subsequently, we validated the model in patients with psychosis, relating aberrant prediction error signals to delusion severity. During the ensuing period, we have developed these ideas, drawing on the simple principle that brains build a model of the world and refine it by minimising prediction errors, as well as using it to guide perceptual inferences. While previously we focused on the prediction error signal per se, an updated view takes into account its precision, as well as the precision of prior expectations. With this expanded perspective, we see several possible routes to psychotic symptoms, which may explain the heterogeneity of psychotic illness, as well as the fact that other drugs, with different pharmacological actions, can produce psychotomimetic effects. In this article, we review the basic principles of this model and highlight specific ways in which prediction errors can be perturbed, in particular considering the reliability and uncertainty of predictions. The expanded model explains hallucinations as perturbations of the uncertainty-mediated balance between expectation and prediction error. Here, expectations dominate and create perceptions by suppressing or ignoring actual inputs. Negative symptoms may arise due to poor reliability of predictions in service of action. By mapping from biology to belief and perception, the account proffers new explanations of psychosis. However, challenges remain. We attempt to address some of these concerns and suggest future directions, incorporating other symptoms into the model, building towards a better understanding of psychosis. © The Author(s) 2016.

  9. Settlement prediction model of slurry suspension based on sedimentation rate attenuation

    Directory of Open Access Journals (Sweden)

    Shuai-jie GUO

    2012-03-01

    Full Text Available This paper introduces a slurry suspension settlement prediction model for cohesive sediment in a still water environment. With no sediment input and a still water environment, the controlling forces between settling particles differ significantly as the sedimentation rate attenuates, and the settlement process comprises a free sedimentation stage, a log-linear attenuation stage, and a stable consolidation stage according to the sedimentation rate attenuation. Settlement equations for sedimentation height and time were established based on the sedimentation rate attenuation properties of the different sedimentation stages. Finally, a slurry suspension settlement prediction model based on slurry parameters was set up, in which the model parameters are determined from the basic parameters of the slurry. The results of the settlement prediction model show good agreement with those of the settlement column experiment and reflect the main characteristics of cohesive sediment. The model can be applied to the prediction of cohesive soil settlement in still water environments.

  10. Improvement of the Blast Furnace Viscosity Prediction Model Based on Discrete Points Data

    Science.gov (United States)

    Guo, Hongwei; Zhu, Mengyi; Li, Xinyu; Guo, Jian; Du, Shen; Zhang, Jianliang

    2015-02-01

    Viscosity is considered to be a significant indicator of the metallurgical properties of blast furnace slag. An improved model for viscosity prediction based on the Chou model is presented in this article. The updated model optimizes the selection strategy for the distance algorithm and the negative weights at the reference points, thereby ameliorating the poor extrapolation behaviour of the original model. The model predictions were compared with viscosity data for slags of compositions typical of BF operations obtained from a domestic steel plant. The results show that the approach can predict the viscosity with an average error of 9.23 pct and a mean standard deviation of 0.046 Pa s.

  11. Modeling bulk canopy resistance from climatic variables for predicting hourly evapotranspiration of maize and buckwheat

    Science.gov (United States)

    Yan, Haofang; Shi, Haibin; Hiroki, Oue; Zhang, Chuan; Xue, Zhu; Cai, Bin; Wang, Guoqing

    2015-06-01

    This study presents models for predicting hourly canopy resistance (r_c) and evapotranspiration (ET_c) based on the Penman-Monteith approach. The micrometeorological data and ET_c were observed during maize and buckwheat growing seasons in 2006 and 2009 in China and Japan, respectively. The proposed models of r_c were developed from a climatic resistance (r*) that depends on climatic variables, and non-linear relationships between r_c and r* were applied. ET_c measured with the Bowen ratio energy balance method was used for model validation. The statistical analysis showed no significant differences between the ET_c predicted by the proposed models and the measured ET_c for either maize or buckwheat. The model performed better for the maize field than for the buckwheat field, with coefficients of determination of 0.92 and 0.84, respectively. The study provides an easy way to apply the Penman-Monteith equation with only generally available meteorological data.
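
    The Penman-Monteith combination equation underlying these models can be written down compactly. The function below is a minimal sketch: the constant air density, the saturation-vapour-pressure formula and the example forcing values are textbook assumptions, not the calibrated values from the maize and buckwheat experiments.

```python
import math

def penman_monteith_et(rn, g, temp_c, rh, ra, rc, pressure=101.3):
    """Latent heat flux (W m^-2) from the Penman-Monteith equation.
    rn, g: net radiation and soil heat flux (W m^-2); temp_c: air temperature
    (deg C); rh: relative humidity (0-1); ra, rc: aerodynamic and canopy
    resistances (s m^-1); pressure: air pressure (kPa)."""
    cp = 1013.0                          # specific heat of air, J kg^-1 K^-1
    rho_a = 1.2                          # air density, kg m^-3 (assumed constant)
    gamma = 0.000665 * pressure * 1000   # psychrometric constant, Pa K^-1
    # Saturation vapour pressure (Pa) and slope of its curve (Pa K^-1)
    es = 610.8 * math.exp(17.27 * temp_c / (temp_c + 237.3))
    delta = 4098.0 * es / (temp_c + 237.3) ** 2
    ea = rh * es
    num = delta * (rn - g) + rho_a * cp * (es - ea) / ra
    den = delta + gamma * (1.0 + rc / ra)
    return num / den

# Example midday conditions (invented): a few hundred W m^-2 of latent heat
le = penman_monteith_et(rn=400, g=40, temp_c=25, rh=0.6, ra=50, rc=70)
```

    Raising the canopy resistance r_c lowers the predicted latent heat flux, which is why modelling r_c from climatic variables is the crux of the approach.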

  12. Predictive significance of residual ischemia proved by dobutamine stress: Echocardiography test in patients early after the first uncomplicated myocardial infarction

    Directory of Open Access Journals (Sweden)

    Karanović Nevena

    2004-01-01

    Full Text Available Background. To evaluate the long-term prognostic value of the dobutamine stress-echocardiography test for new coronary events (new episodes of angina pectoris, cardiac-related deaths, and reinfarctions) early after the first uncomplicated myocardial infarction. Methods. Dobutamine stress-echocardiography tests were performed in all of 104 patients 10-20 days after the first myocardial infarction. Patients were followed up for 36 (29 ± 7) months. Kaplan-Meier cumulative survival curves were tested by the Breslow test (log rank). Results. Two cardiac deaths (1.92%), nine nonfatal myocardial infarctions (8.65%), and three cases of recurrent angina pectoris (2.88%) occurred during the prospective follow-up. Cumulative survival curves showed that in patients with negative findings of the dobutamine stress-echocardiography test, survival time without significant events was 35.31 months, while in the group with positive findings it was 30.91 months (log rank 7.22; p<0.01). The prognostic value of the dobutamine stress-echocardiography test was analyzed by a Cox regression model and was 2.92, meaning that the risk of significant events was 2.92 times higher in the group of patients with positive findings. Conclusion. Patients with negative findings of the dobutamine stress-echocardiography test had a significantly higher probability of surviving without significant events than patients with positive findings. In combination with clinical signs and ECG results, the results of the dobutamine stress-echocardiography test improved prognostic value in patients with the first uncomplicated myocardial infarction, and thereby influenced the strategy of their further treatment.

  13. Predicting Solar Cycle 25 using Surface Flux Transport Model

    Science.gov (United States)

    Imada, Shinsuke; Iijima, Haruhisa; Hotta, Hideyuki; Shiota, Daiko; Kusano, Kanya

    2017-08-01

    It is thought that longer-term variations in solar activity may affect the Earth's climate. Predicting the next solar cycle is therefore crucial for forecasting the solar-terrestrial environment, and building prediction schemes for the next cycle is key to long-term space weather study. Recently, the relationship between the polar magnetic field at solar minimum and the activity of the next cycle has been intensively discussed. Because the polar magnetic field at solar minimum can be determined roughly 3 years before the next solar maximum, the next solar cycle can be discussed roughly 3 years in advance. Further, a longer-term (~5 year) prediction might be achieved by estimating the polar magnetic field with the Surface Flux Transport (SFT) model. We are now developing a prediction scheme based on the SFT model as part of PSTEP (Project for Solar-Terrestrial Environment Prediction) and applying it to the Cycle 25 prediction. The predicted polar field strength at the Cycle 24/25 minimum is several tens of percent smaller than at the Cycle 23/24 minimum, suggesting that the amplitude of Cycle 25 will be weaker than that of the current cycle. We also try to obtain the meridional flow, differential rotation, and turbulent diffusivity from recent modern observations (Hinode and the Solar Dynamics Observatory). These parameters will be used in the SFT model to predict the polar magnetic field strength at solar minimum. In this presentation, we will explain the outline of our strategy to predict the next solar cycle and discuss initial results for the Cycle 25 prediction.

  14. Predicting soil acidification trends at Plynlimon using the SAFE model

    Directory of Open Access Journals (Sweden)

    B. Reynolds

    1997-01-01

    Full Text Available The SAFE model has been applied to an acid grassland site, located on base-poor stagnopodzol soils derived from Lower Palaeozoic greywackes. The model predicts that acidification of the soil has occurred in response to increased acid deposition following the industrial revolution. Limited recovery is predicted following the decline in sulphur deposition during the mid to late 1970s. Reducing excess sulphur and NOx deposition in 1998 to 40% and 70% of 1980 levels results in further recovery, but soil chemical conditions (base saturation, soil water pH and ANC) do not return to values predicted in pre-industrial times. The SAFE model predicts that critical loads (expressed in terms of the (Ca+Mg+K):Al_crit ratio) for six vegetation species found in acid grassland communities are not exceeded despite the increase in deposited acidity following the industrial revolution. The relative growth response of selected vegetation species characteristic of acid grassland swards has been predicted using a damage function linking growth to the soil solution base cation to aluminium ratio. The results show that very small growth reductions can be expected for 'acid tolerant' plants growing in acid upland soils. For more sensitive species such as Holcus lanatus, SAFE predicts that growth would have been reduced by about 20% between 1951 and 1983, when acid inputs were greatest. Recovery to c. 90% of normal growth (under laboratory conditions) is predicted as acidic inputs decline.

  15. Predicting weed problems in maize cropping by species distribution modelling

    Directory of Open Access Journals (Sweden)

    Bürger, Jana

    2014-02-01

    Full Text Available Increasing maize cultivation and changed cropping practices promote the selection of typical maize weeds that may also profit strongly from climate change. Predicting potential weed problems is of high interest for plant production. Within the project KLIFF, experiments were combined with species distribution modelling for this task in the region of Lower Saxony, Germany. For our study, we modelled ecological and damage niches of nine weed species that are significant and widespread in maize cropping in a number of European countries. Species distribution models describe the ecological niche of a species, i.e. the environmental conditions under which a species can maintain a vital population. It is also possible to estimate a damage niche, i.e. the conditions under which a species causes damage in agricultural crops. For this, we combined occurrence data from European national databases with high-resolution climate, soil and land use data. Models were also projected to simulated climate conditions for the time horizon 2070-2100 in order to estimate climate change effects. Modelling results indicate favourable conditions for typical maize weed occurrence virtually all over the study region, but only a few species are important in maize cropping. This is in good accordance with the findings of an earlier maize weed monitoring. Reaction to changing climate conditions is species-specific: for some species it is neutral (E. crus-galli), while other species may gain (Polygonum persicaria) or lose (Viola arvensis) large areas of suitable habitat. All species with damage potential under present conditions will remain important in maize cropping, and some more species will gain regional importance (Calystegia sepium, Setaria viridis).

  16. Physics-Informed Machine Learning for Predictive Turbulence Modeling: A Priori Assessment of Prediction Confidence

    CERN Document Server

    Wu, Jin-Long; Xiao, Heng; Ling, Julia

    2016-01-01

    Although Reynolds-Averaged Navier-Stokes (RANS) equations are still the dominant tool for engineering design and analysis applications involving turbulent flows, standard RANS models are known to be unreliable in many flows of engineering relevance, including flows with separation, strong pressure gradients or mean flow curvature. With increasing amounts of 3-dimensional experimental data and high fidelity simulation data from Large Eddy Simulation (LES) and Direct Numerical Simulation (DNS), data-driven turbulence modeling has become a promising approach to increase the predictive capability of RANS simulations. Recently, a data-driven turbulence modeling approach via machine learning has been proposed to predict the Reynolds stress anisotropy of a given flow based on high fidelity data from closely related flows. In this work, the closeness of different flows is investigated to assess the prediction confidence a priori. Specifically, the Mahalanobis distance and the kernel density estimation (KDE) technique...
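
    The Mahalanobis distance mentioned in this abstract gives a cheap a priori extrapolation metric: a query flow whose mean-flow features lie far from the training feature cloud should be trusted less. The sketch below uses synthetic feature vectors (the three features and their distributions are invented for illustration; the KDE variant works analogously on the same features).

```python
import numpy as np

def mahalanobis_distance(train, query):
    """Distance of each query point from the training feature cloud,
    scaled by the cloud's covariance."""
    mu = train.mean(axis=0)
    inv_cov = np.linalg.inv(np.cov(train, rowvar=False))
    d = query - mu
    # Quadratic form d^T C^-1 d for each row of the query array
    return np.sqrt(np.einsum('ij,jk,ik->i', d, inv_cov, d))

rng = np.random.default_rng(1)
train = rng.normal(0.0, 1.0, size=(500, 3))  # mean-flow features of training flows
inside = np.zeros((1, 3))                    # a query inside the training cloud
outside = np.full((1, 3), 6.0)               # an extrapolation query far outside
```

    A large distance for the `outside` query flags low prediction confidence before any machine-learning correction is applied, which is the a priori assessment the abstract describes.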

  17. Extended Range Hydrological Predictions: Uncertainty Associated with Model Parametrization

    Science.gov (United States)

    Joseph, J.; Ghosh, S.; Sahai, A. K.

    2016-12-01

    The better understanding of various atmospheric processes has led to improved predictions of meteorological conditions at various temporal scales, ranging from short term (covering a period of up to 2 days) to long term (covering more than 10 days). Accurate prediction of hydrological variables can be made using these predicted meteorological conditions, which is helpful for the proper management of water resources. Extended range hydrological simulation involves predicting hydrological variables for a period of more than 10 days. The main sources of uncertainty in hydrological predictions are the initial conditions, the meteorological forcing and the model parametrization. In the present study, the Extended Range Prediction developed for the Indian monsoon by the Indian Institute of Tropical Meteorology (IITM), Pune, is used as meteorological forcing for the Variable Infiltration Capacity (VIC) model. Sensitive hydrological parameters, as derived from the literature, along with a few vegetation parameters, are assumed to be uncertain, and 1000 random values are generated within their prescribed ranges. Uncertainty bands are generated by performing Monte Carlo simulations (MCS) for the generated parameter sets and observed meteorological forcings. The basins with minimum human intervention within the Indian Peninsular region are identified, and the results are validated using observed gauge discharge. Further, uncertainty bands are generated for the extended range hydrological predictions by performing MCS for the same parameter sets and the extended range meteorological predictions. The results demonstrate the uncertainty associated with model parametrization in extended range hydrological simulations. Keywords: Extended Range Prediction, Variable Infiltration Capacity model, Monte Carlo Simulation.
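
    The Monte Carlo procedure described above (sample the uncertain parameters within their prescribed ranges, re-run the hydrological model, and take percentiles of the ensemble) can be sketched with a toy linear reservoir standing in for VIC; the parameter range, rainfall series and band width below are invented for the illustration.

```python
import random

def linear_reservoir(rain, k):
    """Toy daily runoff model: storage drains at recession rate k (d^-1)."""
    storage, flow = 0.0, []
    for p in rain:
        storage += p
        q = k * storage
        storage -= q
        flow.append(q)
    return flow

random.seed(42)
# Synthetic daily rainfall forcing (mm), intermittent events
rain = [random.expovariate(0.2) if random.random() < 0.3 else 0.0
        for _ in range(200)]

# Monte Carlo: sample the uncertain recession parameter within its range
runs = []
for _ in range(1000):
    k = random.uniform(0.05, 0.5)
    runs.append(linear_reservoir(rain, k))

# 90% uncertainty band for each day (5th and 95th percentile of 1000 runs)
bands = []
for day in zip(*runs):
    s = sorted(day)
    bands.append((s[50], s[949]))
```

    The same machinery, with VIC in place of the toy reservoir and the IITM extended range forecasts as forcing, yields the uncertainty bands discussed in the abstract.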

  18. Predictive modeling of coral disease distribution within a reef system.

    Directory of Open Access Journals (Sweden)

    Gareth J Williams

    Full Text Available Diseases often display complex and distinct associations with their environment due to differences in etiology, modes of transmission between hosts, and the shifting balance between pathogen virulence and host resistance. Statistical modeling has been underutilized in coral disease research to explore the spatial patterns that result from this triad of interactions. We tested the hypotheses that: (1) coral diseases show distinct associations with multiple environmental factors, (2) incorporating interactions (synergistic collinearities) among environmental variables is important when predicting coral disease spatial patterns, and (3) modeling overall coral disease prevalence (the prevalence of multiple diseases as a single proportion value) will increase predictive error relative to modeling the same diseases independently. Four coral diseases: Porites growth anomalies (PorGA), Porites tissue loss (PorTL), Porites trematodiasis (PorTrem), and Montipora white syndrome (MWS), and their interactions with 17 predictor variables were modeled using boosted regression trees (BRT) within a reef system in Hawaii. Each disease showed distinct associations with the predictors. The environmental predictors showing the strongest overall associations with the coral diseases were both biotic and abiotic. PorGA was optimally predicted by a negative association with turbidity, PorTL and MWS by declines in butterflyfish and juvenile parrotfish abundance respectively, and PorTrem by a modal relationship with Porites host cover. Incorporating interactions among predictor variables contributed to the predictive power of our models, particularly for PorTrem. Combining diseases (using overall disease prevalence as the model response) led to an average six-fold increase in cross-validation predictive deviance over modeling the diseases individually. We therefore recommend that coral diseases be modeled separately, unless known to have etiologies that respond in a similar manner to

  19. A CHAID Based Performance Prediction Model in Educational Data Mining

    Directory of Open Access Journals (Sweden)

    R. Bhaskaran

    2010-01-01

    Full Text Available The performance in higher secondary school education in India is a turning point in the academic lives of all students. As this academic performance is influenced by many factors, it is essential to develop a predictive data mining model for students' performance so as to identify slow learners and study the influence of the dominant factors on their academic performance. In the present investigation, a survey-cum-experimental methodology was adopted to generate a database, which was constructed from a primary and a secondary source. While the primary data were collected from the regular students, the secondary data were gathered from the school and the office of the Chief Educational Officer (CEO). A total of 1000 datasets of the year 2006 from five different schools in three different districts of Tamilnadu were collected. The raw data were preprocessed by filling in missing values, transforming values from one form into another, and selecting relevant attributes/variables. As a result, 772 student records remained, which were used for CHAID prediction model construction. A set of prediction rules was extracted from the CHAID prediction model, and the accuracy of the generated model was evaluated. The accuracy of the present model was compared with that of another model and found to be satisfactory.
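
    CHAID grows its tree by repeatedly splitting on the attribute most significantly associated with the outcome, judged by a chi-squared test on the contingency table. Below is a minimal sketch of that split-selection step; the student attributes and counts are invented, not the Tamilnadu dataset, and a full CHAID implementation would also merge categories and convert the statistic to a Bonferroni-adjusted p-value.

```python
from collections import Counter

def chi_square(records, attr, target):
    """Pearson chi-squared statistic of an attribute against the class label;
    CHAID splits on the attribute with the most significant association."""
    n = len(records)
    rows = Counter(r[attr] for r in records)
    cols = Counter(r[target] for r in records)
    joint = Counter((r[attr], r[target]) for r in records)
    stat = 0.0
    for a, na in rows.items():
        for c, nc in cols.items():
            expected = na * nc / n
            observed = joint.get((a, c), 0)
            stat += (observed - expected) ** 2 / expected
    return stat

# Hypothetical student records: attendance strongly predicts the outcome,
# locality does not.
data = (
    [{"attendance": "high", "locality": "urban", "result": "pass"}] * 40
    + [{"attendance": "high", "locality": "rural", "result": "pass"}] * 35
    + [{"attendance": "low", "locality": "urban", "result": "fail"}] * 30
    + [{"attendance": "low", "locality": "rural", "result": "fail"}] * 25
    + [{"attendance": "low", "locality": "urban", "result": "pass"}] * 10
)
split = max(["attendance", "locality"], key=lambda a: chi_square(data, a, "result"))
```

    The attribute chosen here becomes the root split; CHAID then recurses on each branch, which is how the prediction rules mentioned in the abstract are produced.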

  20. Significant uncertainty in global scale hydrological modeling from precipitation data errors

    NARCIS (Netherlands)

    Weiland, Frederiek C. Sperna; Vrugt, Jasper A.; van Beek, Rens (L. ) P. H.; Weerts, Albrecht H.; Bierkens, Marc F. P.

    2015-01-01

    In the past decades significant progress has been made in the fitting of hydrologic models to data. Most of this work has focused on simple, CPU-efficient, lumped hydrologic models using discharge, water table depth, soil moisture, or tracer data from relatively small river basins. In this paper, we

  1. Significant uncertainty in global scale hydrological modeling from precipitation data errors

    NARCIS (Netherlands)

    Sperna Weiland, F.; Vrugt, J.A.; Beek, van P.H.; Weerts, A.H.; Bierkens, M.F.P.

    2015-01-01

    In the past decades significant progress has been made in the fitting of hydrologic models to data. Most of this work has focused on simple, CPU-efficient, lumped hydrologic models using discharge, water table depth, soil moisture, or tracer data from relatively small river basins. In this paper, we

  2. A Physics-Informed Machine Learning Framework for RANS-based Predictive Turbulence Modeling

    Science.gov (United States)

    Xiao, Heng; Wu, Jinlong; Wang, Jianxun; Ling, Julia

    2016-11-01

    Numerical models based on the Reynolds-averaged Navier-Stokes (RANS) equations are widely used in turbulent flow simulations in support of engineering design and optimization. In these models, turbulence modeling introduces significant uncertainties in the predictions. In light of the decades-long stagnation encountered by the traditional approach of turbulence model development, data-driven methods have been proposed as a promising alternative. We will present a data-driven, physics-informed machine-learning framework for predictive turbulence modeling based on RANS models. The framework consists of three components: (1) prediction of discrepancies in RANS modeled Reynolds stresses based on machine learning algorithms, (2) propagation of improved Reynolds stresses to quantities of interests with a modified RANS solver, and (3) quantitative, a priori assessment of predictive confidence based on distance metrics in the mean flow feature space. Merits of the proposed framework are demonstrated in a class of flows featuring massive separations. Significant improvements over the baseline RANS predictions are observed. The favorable results suggest that the proposed framework is a promising path toward RANS-based predictive turbulence in the era of big data. (SAND2016-7435 A).

  3. Precise methods for conducted EMI modeling,analysis,and prediction

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Focusing on the state-of-the-art conducted EMI prediction, this paper presents a noise source lumped circuit modeling and identification method, an EMI modeling method based on multiple slope approximation of switching transitions, and a double Fourier integral method modeling PWM conversion units to achieve an accurate modeling of EMI noise source. Meanwhile, a new sensitivity analysis method, a general coupling model for steel ground loops, and a partial element equivalent circuit method are proposed to identify and characterize conducted EMI coupling paths. The EMI noise and propagation modeling provide an accurate prediction of conducted EMI in the entire frequency range (0–10 MHz) with good practicability and generality. Finally a new measurement approach is presented to identify the surface current of large dimensional metal shell. The proposed analytical modeling methodology is verified by experimental results.

  4. Precise methods for conducted EMI modeling,analysis, and prediction

    Institute of Scientific and Technical Information of China (English)

    MA WeiMing; ZHAO ZhiHua; MENG Jin; PAN QiJun; ZHANG Lei

    2008-01-01

    Focusing on the state-of-the-art conducted EMI prediction, this paper presents a noise source lumped circuit modeling and identification method, an EMI modeling method based on multiple slope approximation of switching transitions, and a double Fourier integral method modeling PWM conversion units to achieve an accurate modeling of EMI noise source. Meanwhile, a new sensitivity analysis method, a general coupling model for steel ground loops, and a partial element equivalent circuit method are proposed to identify and characterize conducted EMI coupling paths. The EMI noise and propagation modeling provide an accurate prediction of conducted EMI in the entire frequency range (0–10 MHz) with good practicability and generality. Finally a new measurement approach is presented to identify the surface current of large dimensional metal shell. The proposed analytical modeling methodology is verified by experimental results.

  5. Predictive Control, Competitive Model Business Planning, and Innovation ERP

    DEFF Research Database (Denmark)

    Nourani, Cyrus F.; Lauth, Codrina

    2015-01-01

    New optimality principles are put forth based on competitive model business planning. A Generalized MinMax local optimum dynamic programming algorithm is presented and applied to business model computing, where predictive techniques can determine local optima. The paper starts with introducing a systems approach to business modeling: based on a systems model, an enterprise is not viewed as the sum of its component elements, but the product of their interactions. A competitive business modeling technique, based on the author's planning techniques, is applied. Systemic decisions are based on common … loops applied to complex management decisions. Predictive modeling specifics are briefed. A preliminary optimal game modeling technique is presented in brief with applications to innovation and R&D management. Conducting gap and risk analysis can assist with this process. Example application areas …

  6. Predicting nucleosome positioning using a duration Hidden Markov Model

    Directory of Open Access Journals (Sweden)

    Widom Jonathan

    2010-06-01

    Full Text Available Abstract Background The nucleosome is the fundamental packing unit of DNAs in eukaryotic cells. Its detailed positioning on the genome is closely related to chromosome functions. Increasing evidence has shown that genomic DNA sequence itself is highly predictive of nucleosome positioning genome-wide. Therefore a fast software tool for predicting nucleosome positioning can help understanding how a genome's nucleosome organization may facilitate genome function. Results We present a duration Hidden Markov model for nucleosome positioning prediction by explicitly modeling the linker DNA length. The nucleosome and linker models trained from yeast data are re-scaled when making predictions for other species to adjust for differences in base composition. A software tool named NuPoP is developed in three formats for free download. Conclusions Simulation studies show that modeling the linker length distribution and utilizing a base composition re-scaling method both improve the prediction of nucleosome positioning regarding sensitivity and false discovery rate. NuPoP provides a user-friendly software tool for predicting the nucleosome occupancy and the most probable nucleosome positioning map for genomic sequences of any size. When compared with two existing methods, NuPoP shows improved performance in sensitivity.
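
    The core of a duration HMM for nucleosome calling is a Viterbi-style dynamic program in which the nucleosome state has a fixed footprint and the linker state an explicit (here geometric) length distribution. The sketch below is a heavily simplified stand-in for NuPoP: the 5-bp footprint, the GC-based emission score and the linker penalty are toy assumptions chosen so the example runs instantly.

```python
NUC_LEN = 5  # real nucleosomes cover ~147 bp; shortened for the toy example

def nuc_score(seq, i):
    """Hypothetical per-segment log score: favour G/C inside nucleosomes."""
    return sum(0.5 if b in "GC" else -0.5 for b in seq[i:i + NUC_LEN])

def viterbi_segmentation(seq, linker_logp=-0.2):
    """Duration HMM Viterbi sketch: nucleosomes have a fixed length, linkers
    pay a per-base penalty (equivalent to a geometric duration distribution)."""
    n = len(seq)
    best = [0.0] + [float("-inf")] * n
    back = [None] * (n + 1)
    for i in range(1, n + 1):
        # Option 1: extend a linker by one base
        if best[i - 1] + linker_logp > best[i]:
            best[i] = best[i - 1] + linker_logp
            back[i] = ("L", i - 1)
        # Option 2: close a nucleosome segment ending at position i
        if i >= NUC_LEN:
            s = best[i - NUC_LEN] + nuc_score(seq, i - NUC_LEN)
            if s > best[i]:
                best[i] = s
                back[i] = ("N", i - NUC_LEN)
    # Trace back the nucleosome start positions of the best segmentation
    calls, i = [], n
    while i > 0:
        state, j = back[i]
        if state == "N":
            calls.append(j)
        i = j
    return sorted(calls)

calls = viterbi_segmentation("ATATGCGCGATATATGGCCGAT")
```

    On this toy sequence the program places nucleosomes on the two GC-rich windows; NuPoP applies the same dynamic program with a 147-bp footprint, an explicit linker length distribution, and sequence models trained on yeast data.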

  7. Model structural uncertainty quantification and hydrologic parameter and prediction error analysis using airborne electromagnetic data

    DEFF Research Database (Denmark)

    Minsley, B. J.; Christensen, Nikolaj Kruse; Christensen, Steen

    Model structure, or the spatial arrangement of subsurface lithological units, is fundamental to the hydrological behavior of Earth systems. Knowledge of geological model structure is critically important in order to make informed hydrological predictions and management decisions. Model structure is never perfectly known, however, and incorrect assumptions can be a significant source of error when making model predictions. We describe a systematic approach for quantifying model structural uncertainty that is based on the integration of sparse borehole observations and large-scale airborne … indicator simulation, we produce many realizations of model structure that are consistent with observed datasets and prior knowledge. Given estimates of model structural uncertainty, we incorporate hydrologic observations to evaluate the hydrologic parameter and prediction errors that occur when …

  8. Accuracy of depolarization and delay spread predictions using advanced ray-based modeling in indoor scenarios

    Directory of Open Access Journals (Sweden)

    Mani Francesco

    2011-01-01

    Full Text Available Abstract This article investigates the prediction accuracy of an advanced deterministic propagation model in terms of channel depolarization and frequency selectivity for indoor wireless propagation. In addition to specular reflection and diffraction, the developed ray tracing tool considers penetration through dielectric blocks and/or diffuse scattering mechanisms. The sensitivity and prediction accuracy analysis is based on two measurement campaigns carried out in a warehouse and an office building. It is shown that implementing diffuse scattering in the ray tracing tool significantly increases the accuracy of the cross-polar discrimination prediction, whereas the delay-spread prediction is only marginally improved.

  9. Experience-based model predictive control using reinforcement learning

    NARCIS (Netherlands)

    Negenborn, R.R.; De Schutter, B.; Wiering, M.A.; Hellendoorn, J.

    2004-01-01

    Model predictive control (MPC) is becoming an increasingly popular method to select actions for controlling dynamic systems. Traditionally, MPC uses a model of the system to be controlled and a performance function to characterize the desired behavior of the system. The MPC agent finds actions over a

  10. Evaluation of Performance of Predictive Models for Deoxynivalenol in Wheat

    NARCIS (Netherlands)

    Fels, van der H.J.

    2014-01-01

    The aim of this study was to evaluate the performance of two predictive models for deoxynivalenol contamination of wheat at harvest in the Netherlands, including the use of weather forecast data and external model validation. Data were collected in a different year and from different wheat fields th

  11. Katz model prediction of Caenorhabditis elegans mutagenesis on STS-42

    Science.gov (United States)

    Cucinotta, Francis A.; Wilson, John W.; Katz, Robert; Badhwar, Gautam D.

    1992-01-01

    Response parameters that describe the production of recessive lethal mutations in C. elegans from ionizing radiation are obtained with the Katz track structure model. The authors used models of the space radiation environment and radiation transport to predict and discuss mutation rates for C. elegans on the IML-1 experiment aboard STS-42.

  12. A model to predict the sound reflection from forests

    NARCIS (Netherlands)

    Wunderli, J.M.; Salomons, E.M.

    2009-01-01

    A model is presented to predict the reflection of sound at forest edges. A single tree is modelled as a vertical cylinder. For the reflection at a cylinder an analytical solution is given based on the theory of scattering of spherical waves. The entire forest is represented by a line of cylinders...

  13. Atmospheric modelling for seasonal prediction at the CSIR

    CSIR Research Space (South Africa)

    Landman, WA

    2014-10-01

    Full Text Available ...by observed monthly sea-surface temperature (SST) and sea-ice fields. The AGCM is the conformal-cubic atmospheric model (CCAM) administered by the Council for Scientific and Industrial Research. Since the model is forced with observed rather than predicted...

  14. A model to predict the sound reflection from forests

    NARCIS (Netherlands)

    Wunderli, J.M.; Salomons, E.M.

    2009-01-01

    A model is presented to predict the reflection of sound at forest edges. A single tree is modelled as a vertical cylinder. For the reflection at a cylinder an analytical solution is given based on the theory of scattering of spherical waves. The entire forest is represented by a line of cylinders...

  15. Validation of a multi-objective, predictive urban traffic model

    NARCIS (Netherlands)

    Wilmink, I.R.; Haak, P. van den; Woldeab, Z.; Vreeswijk, J.

    2013-01-01

    This paper describes the results of the verification and validation of the ecoStrategic Model, which was developed, implemented and tested in the eCoMove project. The model uses real-time and historical traffic information to determine the current, predicted and desired state of traffic in a network...

  16. Prediction of speech intelligibility based on an auditory preprocessing model

    DEFF Research Database (Denmark)

    Christiansen, Claus Forup Corlin; Pedersen, Michael Syskind; Dau, Torsten

    2010-01-01

    A speech in noise experiment was used for training and an ideal binary mask experiment was used for evaluation. All three models were able to capture the trends in the speech in noise training data well, but the proposed model provides a better prediction of the binary mask test data, particularly when the binary masks degenerate to a noise vocoder.

  17. Evaluation of performance of Predictive Models for Deoxynivalenol in Wheat

    NARCIS (Netherlands)

    Fels, van der H.J.

    2014-01-01

    The aim of this study was to evaluate the performance of two predictive models for deoxynivalenol contamination of wheat at harvest in the Netherlands, including the use of weather forecast data and external model validation. Data were collected in a different year and from different wheat fields

  18. A Climate System Model, Numerical Simulation and Climate Predictability

    Institute of Scientific and Technical Information of China (English)

    ZENG Qingcun; WANG Huijun; LIN Zhaohui; ZHOU Guangqing; YU Yongqiang

    2007-01-01

    The implementation of the project has lasted for more than 20 years. As a result, the following key innovative achievements have been obtained, ranging from the basic theory of climate dynamics, numerical model development and its related computational theory to the dynamical climate prediction using the climate system models:

  19. Prediction horizon effects on stochastic modelling hints for neural networks

    Energy Technology Data Exchange (ETDEWEB)

    Drossu, R.; Obradovic, Z. [Washington State Univ., Pullman, WA (United States)

    1995-12-31

    The objective of this paper is to investigate the relationship between stochastic models and neural network (NN) approaches to time series modelling. Experiments on a complex real life prediction problem (entertainment video traffic) indicate that prior knowledge can be obtained through stochastic analysis both with respect to an appropriate NN architecture as well as to an appropriate sampling rate, in the case of a prediction horizon larger than one. An improvement of the obtained NN predictor is also proposed through a bias removal post-processing, resulting in much better performance than the best stochastic model.
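The bias-removal post-processing mentioned at the end of the abstract can be illustrated simply: estimate the predictor's mean residual on held-out data and subtract it from subsequent predictions. The synthetic data, split sizes, and variable names below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
truth = rng.normal(size=200)                              # target series
raw_pred = truth + 0.8 + rng.normal(scale=0.1, size=200)  # predictor with a systematic bias

# Estimate the bias on a validation split, then correct new predictions
bias = np.mean(raw_pred[:100] - truth[:100])
corrected = raw_pred[100:] - bias

rmse_raw = np.sqrt(np.mean((raw_pred[100:] - truth[100:]) ** 2))
rmse_corr = np.sqrt(np.mean((corrected - truth[100:]) ** 2))
```

Because the bias is systematic, removing its estimate shrinks the test RMSE roughly to the level of the predictor's random error.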

  20. Three-model ensemble wind prediction in southern Italy

    Science.gov (United States)

    Torcasio, Rosa Claudia; Federico, Stefano; Calidonna, Claudia Roberta; Avolio, Elenio; Drofa, Oxana; Landi, Tony Christian; Malguzzi, Piero; Buzzi, Andrea; Bonasoni, Paolo

    2016-03-01

    Quality of wind prediction is of great importance since a good wind forecast allows the prediction of available wind power, improving the penetration of renewable energies into the energy market. Here, a 1-year (1 December 2012 to 30 November 2013) three-model ensemble (TME) experiment for wind prediction is considered. The models employed, run operationally at National Research Council - Institute of Atmospheric Sciences and Climate (CNR-ISAC), are RAMS (Regional Atmospheric Modelling System), BOLAM (BOlogna Limited Area Model), and MOLOCH (MOdello LOCale in H coordinates). The area considered for the study is southern Italy and the measurements used for the forecast verification are those of the GTS (Global Telecommunication System). Comparison with observations is made every 3 h up to 48 h of forecast lead time. Results show that the three-model ensemble outperforms the forecast of each individual model. The RMSE improvement compared to the best model is between 22 and 30%, depending on the season. It is also shown that the three-model ensemble outperforms the IFS (Integrated Forecasting System) of the ECMWF (European Centre for Medium-Range Weather Forecasts) for the surface wind forecasts. Notably, the three-model ensemble forecast performs better than each unbiased model, showing the added value of the ensemble technique. Finally, the sensitivity of the three-model ensemble RMSE to the length of the training period is analysed.
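The added value of the ensemble technique described above comes from averaging forecasts whose errors are partly independent, so they cancel. A minimal sketch, using synthetic wind-speed data and RMSE verification (the model names follow the abstract; everything else is assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(42)
obs = 5.0 + rng.normal(scale=2.0, size=365)   # daily observed wind speed, m/s

# Each model's forecast = truth plus its own independent error
forecasts = {name: obs + rng.normal(scale=1.5, size=365)
             for name in ("RAMS", "BOLAM", "MOLOCH")}

# Three-model ensemble: simple (equal-weight) mean of the member forecasts
tme = np.mean(list(forecasts.values()), axis=0)

def rmse(f):
    return np.sqrt(np.mean((f - obs) ** 2))

best_single = min(rmse(f) for f in forecasts.values())
```

With independent member errors, the ensemble-mean error standard deviation drops by roughly a factor of sqrt(3), so the ensemble RMSE beats each individual model, mirroring the result reported in the abstract.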