WorldWideScience

Sample records for model predicts greater

  1. Hybrid equation/agent-based model of ischemia-induced hyperemia and pressure ulcer formation predicts greater propensity to ulcerate in subjects with spinal cord injury.

    Directory of Open Access Journals (Sweden)

    Alexey Solovyev

Full Text Available Pressure ulcers are costly and life-threatening complications for people with spinal cord injury (SCI). People with SCI also exhibit differential blood flow properties in non-ulcerated skin. We hypothesized that a computer simulation of the pressure ulcer formation process, informed by data regarding skin blood flow and reactive hyperemia in response to pressure, could provide insights into the pathogenesis and effective treatment of post-SCI pressure ulcers. Agent-Based Models (ABM) are useful in settings such as pressure ulcers, in which spatial realism is important. Ordinary Differential Equation-based (ODE) models are useful when modeling physiological phenomena such as reactive hyperemia. Accordingly, we constructed a hybrid model that combines ODEs related to blood flow with an ABM of skin injury, inflammation, and ulcer formation. The relationship between pressure and the course of ulcer formation, as well as several other important characteristic patterns of pressure ulcer formation, was demonstrated in this model. The ODE portion of this model was calibrated to data related to blood flow following experimental pressure responses in non-injured human subjects or to data from people with SCI. This model predicted a higher propensity to form ulcers in response to pressure in people with SCI vs. non-injured control subjects, and thus may serve as a novel diagnostic platform for post-SCI ulcer formation.
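
The coupling described above can be pictured with a short sketch: an ODE tracks perfusion under applied pressure (with a reactive-hyperemia rebound when the load is removed) and drives an agent-based grid in which loaded, ischemic cells accumulate and spread damage. This is only a minimal illustration of the hybrid ODE/ABM idea; all function names, rate constants, and thresholds are invented for the example and are not taken from the published model.

```python
import numpy as np

# Minimal sketch of a hybrid ODE/ABM scheme in the spirit of this record: an
# ODE tracks tissue perfusion under an applied pressure (with a reactive-
# hyperemia rebound when the load is removed), and an agent-based grid of skin
# cells accumulates and spreads damage wherever perfusion is ischemic.
# All names, rate constants, and thresholds are illustrative placeholders.

def dq_dt(q, pressure, k_occlude=0.08, k_recover=0.15, q_rest=1.0):
    """Perfusion falls while loaded and rebounds above resting flow when unloaded."""
    if pressure > 0:
        return -k_occlude * pressure * q
    return k_recover * (1.3 * q_rest - q)      # overshoot models reactive hyperemia

def step_abm(damage, q, loaded, dt, injury_rate=0.002, spread=0.05, q_crit=0.4):
    """Loaded, ischemic cells accumulate damage; damage spreads to neighbours."""
    if q < q_crit:
        damage = damage + injury_rate * dt * loaded
    pad = np.pad(damage, 1, mode="edge")
    laplacian = (pad[:-2, 1:-1] + pad[2:, 1:-1] +
                 pad[1:-1, :-2] + pad[1:-1, 2:] - 4 * damage)
    return np.clip(damage + spread * dt * laplacian, 0.0, 1.0)

def simulate(pressure_schedule, dt=0.1, grid=(40, 40)):
    q = 1.0                                    # normalized perfusion
    damage = np.zeros(grid)                    # 0 = healthy, 1 = ulcerated
    yy, xx = np.mgrid[:grid[0], :grid[1]]
    loaded = ((yy - 20) ** 2 + (xx - 20) ** 2 < 10 ** 2).astype(float)  # loaded patch
    for p in pressure_schedule:
        q = q + dt * dq_dt(q, p)               # explicit Euler for the ODE part
        damage = step_abm(damage, q, loaded * p, dt)
    return q, damage

# Example: a period of loading followed by off-loading (arbitrary time units).
schedule = [1.0] * 1200 + [0.0] * 600
q_end, damage_end = simulate(schedule)
print(f"final perfusion {q_end:.2f}, peak damage {damage_end.max():.2f}")
```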

  2. Disorganized attachment in infancy predicts greater amygdala volume in adulthood.

    Science.gov (United States)

    Lyons-Ruth, K; Pechtel, P; Yoon, S A; Anderson, C M; Teicher, M H

    2016-07-15

Early life stress in rodents is associated with increased amygdala volume in adulthood. In humans, the amygdala develops rapidly during the first two years of life. Thus, disturbed care during this period may be particularly important to amygdala development. In the context of a 30-year longitudinal study of impoverished, highly stressed families, we assessed whether disorganization of the attachment relationship in infancy was related to amygdala volume in adulthood. Amygdala volumes were assessed among 18 low-income young adults (8M/10F, 29.33±0.49 years) first observed in infancy (8.5±5.6 months) and followed longitudinally to age 29. In infancy (18.58±1.02 months), both disorganized infant attachment behavior and disrupted maternal communication were assessed in the standard Strange Situation Procedure (SSP). Increased left amygdala volume in adulthood was associated with both maternal and infant components of disorganized attachment interactions at 18 months of age (overall r=0.679, p < […] amygdala volume. Left amygdala volume was further associated with dissociation and limbic irritability in adulthood. Finally, left amygdala volume mediated the prediction from attachment disturbance in infancy to limbic irritability in adulthood. Results point to the likely importance of quality of early care for amygdala development in human children as well as in rodents. The long-term prediction found here suggests that the first two years of life may be an early sensitive period for amygdala development during which clinical intervention could have particularly important consequences for later child outcomes. Copyright © 2016 Elsevier B.V. All rights reserved.

  3. Predicting Greater Prairie-Chicken Lek Site Suitability to Inform Conservation Actions.

    Directory of Open Access Journals (Sweden)

    Torre J Hovick

Full Text Available The demands of a growing human population dictate that expansion of energy infrastructure, roads, and other development frequently takes place in native rangelands. Particularly, transmission lines and roads commonly divide rural landscapes and increase fragmentation. This has direct and indirect consequences on native wildlife that can be mitigated through thoughtful planning and proactive approaches to identifying areas of high conservation priority. We used nine years (2003-2011) of Greater Prairie-Chicken (Tympanuchus cupido) lek locations totaling 870 unique lek sites in Kansas and seven geographic information system (GIS) layers describing land cover, topography, and anthropogenic structures to model habitat suitability across the state. The models obtained had low omission rates (< […] 0.81), indicating high model performance and reliability of predicted habitat suitability for Greater Prairie-Chickens. We found that elevation was the most influential variable in predicting lek locations, contributing three times more predictive power than any other variable. However, models were improved by the addition of land cover and anthropogenic features (transmission lines, roads, and oil and gas structures). Overall, our analysis provides a hierarchical understanding of Greater Prairie-Chicken habitat suitability that is broadly based on geomorphological features followed by land cover suitability. We found that when land features and vegetation cover are suitable for Greater Prairie-Chickens, fragmentation by anthropogenic sources such as roadways and transmission lines is a concern. Therefore, it is our recommendation that future human development in Kansas avoid areas that our models identified as highly suitable for Greater Prairie-Chickens and focus development on land cover types that are of lower conservation concern.

  4. Migratory behaviour predicts greater parasite diversity in ungulates.

    Science.gov (United States)

    Teitelbaum, Claire S; Huang, Shan; Hall, Richard J; Altizer, Sonia

    2018-03-28

    Long-distance animal movements can increase exposure to diverse parasites, but can also reduce infection risk through escape from contaminated habitats or culling of infected individuals. These mechanisms have been demonstrated within and between populations in single-host/single-parasite interactions, but how long-distance movement behaviours shape parasite diversity and prevalence across host taxa is largely unknown. Using a comparative approach, we analyse the parasite communities of 93 migratory, nomadic and resident ungulate species. We find that migrants have higher parasite species richness than residents or nomads, even after considering other factors known to influence parasite diversity, such as body size and host geographical range area. Further analyses support a novel 'environmental tracking' hypothesis, whereby migration allows parasites to experience environments favourable to transmission year-round. In addition, the social aggregation and large group sizes that facilitate migration might increase infection risk for migrants. By contrast, we find little support for previously proposed hypotheses, including migratory escape and culling, in explaining the relationship between host movement and parasitism in mammals at this cross-species scale. Our findings, which support mechanistic links between long-distance movement and increased parasite richness at the species level, could help predict the effects of future environmental change on parasitism in migratory animals. © 2018 The Author(s).

  5. Inverse and Predictive Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Syracuse, Ellen Marie [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-27

The LANL Seismo-Acoustic team has a strong capability in developing data-driven models that accurately predict a variety of observations. These models range from the simple – one-dimensional models that are constrained by a single dataset and can be used for quick and efficient predictions – to the complex – multidimensional models that are constrained by several types of data and result in more accurate predictions. While team members typically build models of geophysical characteristics of the Earth and of source distributions at scales of 1 to 1000s of km, the techniques used are applicable to other types of physical characteristics at an even greater range of scales. The following cases provide a snapshot of some of the modeling work done by the Seismo-Acoustic team at LANL.
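
As a concrete illustration of the data-driven inverse-modeling theme, the sketch below recovers a simple one-dimensional model from noisy synthetic observations using damped least squares. It is a toy under stated assumptions (a random "ray-path" matrix, synthetic noise, an arbitrary damping parameter) and is not an implementation of the LANL team's methods.

```python
import numpy as np

# Toy illustration of data-driven inverse modeling: recover a 1-D slowness
# profile m from travel-time data d = G m using damped least squares.
# The geometry and regularization here are illustrative only.

rng = np.random.default_rng(0)
n_layers, n_obs = 20, 60
G = rng.random((n_obs, n_layers))            # ray-path lengths through layers
m_true = 1.0 + 0.2 * np.sin(np.linspace(0, 3, n_layers))
d = G @ m_true + rng.normal(0, 0.05, n_obs)  # noisy observed travel times

lam = 0.5                                    # damping (trade-off) parameter
A = np.vstack([G, lam * np.eye(n_layers)])
b = np.concatenate([d, np.zeros(n_layers)])
m_est, *_ = np.linalg.lstsq(A, b, rcond=None)

print("rms model error:", np.sqrt(np.mean((m_est - m_true) ** 2)))
# Predictions for new source-receiver geometries follow as d_new = G_new @ m_est
```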

  6. Modeling Phosphorus Transport and Cycling in the Greater Everglades Ecosystem

    Science.gov (United States)

    James, A. I.; Grace, K. A.; Jawitz, J. W.; Muller, S.; Munoz-Carpena, R.; Flaig, E. G.

    2005-12-01

A solute transport model was used to predict phosphorus mobility in the northern Everglades. Over the past several decades, agricultural drainage waters discharged into the northern Everglades have been enriched in phosphorus (P) relative to the historic rainfall-driven inputs. While methods of reducing total P concentrations in the discharge water have been actively pursued through implementation of agricultural Best Management Practices (BMPs), a major parallel effort has focused on the construction of a network of constructed wetlands for P removal before these waters enter the Everglades. This study describes the development of a water quality model for P transport and cycling and its application to a large constructed wetland: Stormwater Treatment Area 1 West (STA 1W), located southeast of Lake Okeechobee on the eastern perimeter of the Everglades Agricultural Area (EAA). In STA 1W, agricultural nutrients such as phosphorus (P) are removed from EAA runoff before entering the adjacent Water Conservation Areas (WCAs) and the Everglades. STA 1W is divided by levees into 4 cells, which are flooded for most of the year; thus the dominant mechanism for flow and transport is overland flow. P is removed either through deposition into sediments or through uptake by plants; in either case the soils end up significantly enriched in P. The model has been applied and calibrated to several years of water quality data from Cell 4 within STA 1W. Most existing P models have been applied to agricultural/upland systems, with only a few relevant to treatment wetlands such as STA 1W. To ensure sufficient flexibility in selecting appropriate system components and reactions, the model has been designed to incorporate a wide range of user-selectable mechanisms and parameters for P uptake and release between soils and inflowing water. The model can track a large number of mobile and nonmobile components and utilizes a Godunov-style operator-splitting technique for the transported components.
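
The operator-splitting idea named in the abstract can be sketched in a few lines: each time step first advects the dissolved P field with a first-order upwind (Godunov-type) scheme and then applies a local uptake/release exchange with the soil store. The grid, velocity, boundary value, and rate constants below are illustrative placeholders, not values from the STA 1W model.

```python
import numpy as np

# Sketch of operator splitting for advective P transport with uptake/release:
# within each time step, advect, then react. All parameters are illustrative.

def advect_upwind(c, u, dx, dt):
    """First-order upwind advection for a positive velocity u."""
    c_new = c.copy()
    c_new[1:] = c[1:] - u * dt / dx * (c[1:] - c[:-1])
    return c_new

def react(c, soil_p, dt, k_uptake=0.02, k_release=0.002):
    """Exchange between the water column and the soil/biomass P store."""
    transfer = (k_uptake * c - k_release * soil_p) * dt
    return c - transfer, soil_p + transfer

nx, dx, dt, u = 100, 10.0, 50.0, 0.05        # cells, m, s, m/s (CFL = 0.25)
c = np.zeros(nx); c[0] = 0.12                # inflow concentration (mg/L)
soil_p = np.zeros(nx)

for _ in range(2000):                        # split step: advection then reaction
    c = advect_upwind(c, u, dx, dt)
    c[0] = 0.12                              # constant-concentration inflow boundary
    c, soil_p = react(c, soil_p, dt)

print(f"outflow P: {c[-1]:.4f} mg/L, stored P near inlet: {soil_p[0]:.3f}")
```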

  7. Cultural Resource Predictive Modeling

    Science.gov (United States)

    2017-10-01

refining formal, inductive predictive models is the quality of the archaeological and environmental data. To build models efficiently, relevant ... geomorphology, and historic information. Lessons Learned: The original model was focused on the identification of prehistoric resources. This ... system but uses predictive modeling informally. For example, there is no probability for buried archaeological deposits on the Burton Mesa, but there is

  8. Predictive modeling of complications.

    Science.gov (United States)

    Osorio, Joseph A; Scheer, Justin K; Ames, Christopher P

    2016-09-01

    Predictive analytic algorithms are designed to identify patterns in the data that allow for accurate predictions without the need for a hypothesis. Therefore, predictive modeling can provide detailed and patient-specific information that can be readily applied when discussing the risks of surgery with a patient. There are few studies using predictive modeling techniques in the adult spine surgery literature. These types of studies represent the beginning of the use of predictive analytics in spine surgery outcomes. We will discuss the advancements in the field of spine surgery with respect to predictive analytics, the controversies surrounding the technique, and the future directions.

  9. Archaeological predictive model set.

    Science.gov (United States)

    2015-03-01

This report is the documentation for Task 7 of the Statewide Archaeological Predictive Model Set. The goal of this project is to develop a set of statewide predictive models to assist the planning of transportation projects. PennDOT is developing t...

  10. Wind power prediction models

    Science.gov (United States)

    Levy, R.; Mcginness, H.

    1976-01-01

    Investigations were performed to predict the power available from the wind at the Goldstone, California, antenna site complex. The background for power prediction was derived from a statistical evaluation of available wind speed data records at this location and at nearby locations similarly situated within the Mojave desert. In addition to a model for power prediction over relatively long periods of time, an interim simulation model that produces sample wind speeds is described. The interim model furnishes uncorrelated sample speeds at hourly intervals that reproduce the statistical wind distribution at Goldstone. A stochastic simulation model to provide speed samples representative of both the statistical speed distributions and correlations is also discussed.
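
A compact sketch of the two ideas in this record, long-run power estimation from a fitted wind-speed distribution and simulation of hourly sample speeds, is given below. The Weibull parameters and the generic turbine power curve are placeholders for illustration, not the Goldstone values.

```python
import numpy as np

# Sketch: (1) draw uncorrelated hourly sample speeds from a fitted Weibull
# distribution, (2) pass them through a turbine power curve to estimate the
# long-term mean available power. Parameters are illustrative only.

def power_curve(v, cut_in=3.0, rated_v=12.0, cut_out=25.0, rated_p=100.0):
    """Simple turbine curve: cubic ramp between cut-in and rated speed (kW)."""
    v = np.asarray(v, dtype=float)
    p = np.where((v >= cut_in) & (v < rated_v),
                 rated_p * ((v - cut_in) / (rated_v - cut_in)) ** 3, 0.0)
    return np.where((v >= rated_v) & (v < cut_out), rated_p, p)

rng = np.random.default_rng(1)
shape, scale = 2.0, 7.5                       # assumed Weibull fit to speed records
hourly_speeds = scale * rng.weibull(shape, size=24 * 365)

expected_power = power_curve(hourly_speeds).mean()
print(f"mean available power: {expected_power:.1f} kW")
```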

  11. Zephyr - the prediction models

    DEFF Research Database (Denmark)

    Nielsen, Torben Skov; Madsen, Henrik; Nielsen, Henrik Aalborg

    2001-01-01

This paper briefly describes new models and methods for predicting the wind power output from wind farms. The system is being developed in a project which has the research organization Risø and the department of Informatics and Mathematical Modelling (IMM) as the modelling team and all the Danish utilities as partners and users. The new models are evaluated for five wind farms in Denmark as well as one wind farm in Spain. It is shown that the predictions based on conditional parametric models are superior to the predictions obtained by state-of-the-art parametric models.

  12. Uncertainties in Tidally Adjusted Estimates of Sea Level Rise Flooding (Bathtub Model for the Greater London

    Directory of Open Access Journals (Sweden)

    Ali P. Yunus

    2016-04-01

Full Text Available Sea-level rise (SLR) from global warming may have severe consequences for coastal cities, particularly when combined with predicted increases in the strength of tidal surges. Predicting the regional impact of SLR flooding is strongly dependent on the modelling approach and accuracy of topographic data. Here, the areas under risk of sea water flooding for London boroughs were quantified based on the projected SLR scenarios reported in the Intergovernmental Panel on Climate Change (IPCC) fifth assessment report (AR5) and UK climatic projections 2009 (UKCP09) using a tidally-adjusted bathtub modelling approach. Medium- to very high-resolution digital elevation models (DEMs) are used to evaluate inundation extents as well as uncertainties. Depending on the SLR scenario and DEMs used, it is estimated that 3%–8% of the area of Greater London could be inundated by 2100. The boroughs with the largest areas at risk of flooding are Newham, Southwark, and Greenwich. The differences in inundation areas estimated from a digital terrain model and a digital surface model are much greater than the root mean square error differences observed between the two data types, which may be attributed to processing levels. Flood models from SRTM data underestimate the inundation extent, so their results may not be reliable for constructing flood risk maps. This analysis provides a broad-scale estimate of the potential consequences of SLR and uncertainties in the DEM-based bathtub type flood inundation modelling for London boroughs.
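
The tidally-adjusted bathtub approach reduces to a threshold operation on a DEM: a cell is flagged as inundated when its elevation falls below projected mean sea level plus a tidal allowance. The sketch below uses a synthetic DEM and invented SLR scenarios purely to show the mechanics; real applications would read a LiDAR or SRTM raster.

```python
import numpy as np

# Minimal tidally-adjusted bathtub sketch on a synthetic DEM (metres).
def bathtub_inundation(dem, slr, tidal_level):
    """Return a boolean mask of cells at or below the adjusted flood level."""
    return dem <= (slr + tidal_level)

rng = np.random.default_rng(2)
dem = rng.normal(loc=5.0, scale=3.0, size=(500, 500))    # synthetic elevations

for slr in (0.3, 0.6, 1.0):                              # illustrative SLR scenarios
    flooded = bathtub_inundation(dem, slr, tidal_level=1.5)
    print(f"SLR {slr:.1f} m: {100 * flooded.mean():.1f}% of cells inundated")
```

A fuller bathtub implementation would also enforce hydrological connectivity to the sea (for example with a connected-components labelling step) so that isolated low-lying cells are not counted as flooded.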

  13. Strength in numbers: achieving greater accuracy in MHC-I binding prediction by combining the results from multiple prediction tools

    Science.gov (United States)

    Trost, Brett; Bickis, Mik; Kusalik, Anthony

    2007-01-01

    Background Peptides derived from endogenous antigens can bind to MHC class I molecules. Those which bind with high affinity can invoke a CD8+ immune response, resulting in the destruction of infected cells. Much work in immunoinformatics has involved the algorithmic prediction of peptide binding affinity to various MHC-I alleles. A number of tools for MHC-I binding prediction have been developed, many of which are available on the web. Results We hypothesize that peptides predicted by a number of tools are more likely to bind than those predicted by just one tool, and that the likelihood of a particular peptide being a binder is related to the number of tools that predict it, as well as the accuracy of those tools. To this end, we have built and tested a heuristic-based method of making MHC-binding predictions by combining the results from multiple tools. The predictive performance of each individual tool is first ascertained. These performance data are used to derive weights such that the predictions of tools with better accuracy are given greater credence. The combined tool was evaluated using ten-fold cross-validation and was found to signicantly outperform the individual tools when a high specificity threshold is used. It performs comparably well to the best-performing individual tools at lower specificity thresholds. Finally, it also outperforms the combination of the tools resulting from linear discriminant analysis. Conclusion A heuristic-based method of combining the results of the individual tools better facilitates the scanning of large proteomes for potential epitopes, yielding more actual high-affinity binders while reporting very few false positives. PMID:17381846
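
The weighting heuristic described here, crediting each tool in proportion to its previously measured accuracy, can be sketched as follows. The tool names, accuracies, and decision threshold are invented for illustration and are not the values or tools used in the study.

```python
# Sketch of accuracy-weighted combination of binary binder/non-binder calls
# from several MHC-I prediction tools. All names and numbers are hypothetical.

tool_accuracy = {"toolA": 0.90, "toolB": 0.80, "toolC": 0.65}

def combined_score(calls, accuracies=tool_accuracy):
    """calls: dict of tool -> 1 (predicted binder) or 0 (non-binder)."""
    weights = {t: acc - 0.5 for t, acc in accuracies.items()}   # credit above chance
    total = sum(weights.values())
    return sum(weights[t] * calls.get(t, 0) for t in weights) / total

peptide_calls = {"toolA": 1, "toolB": 1, "toolC": 0}
score = combined_score(peptide_calls)
print(f"combined score {score:.2f} -> {'binder' if score >= 0.6 else 'non-binder'}")
```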

  14. Bright minds and dark attitudes: lower cognitive ability predicts greater prejudice through right-wing ideology and low intergroup contact.

    Science.gov (United States)

    Hodson, Gordon; Busseri, Michael A

    2012-02-01

    Despite their important implications for interpersonal behaviors and relations, cognitive abilities have been largely ignored as explanations of prejudice. We proposed and tested mediation models in which lower cognitive ability predicts greater prejudice, an effect mediated through the endorsement of right-wing ideologies (social conservatism, right-wing authoritarianism) and low levels of contact with out-groups. In an analysis of two large-scale, nationally representative United Kingdom data sets (N = 15,874), we found that lower general intelligence (g) in childhood predicts greater racism in adulthood, and this effect was largely mediated via conservative ideology. A secondary analysis of a U.S. data set confirmed a predictive effect of poor abstract-reasoning skills on antihomosexual prejudice, a relation partially mediated by both authoritarianism and low levels of intergroup contact. All analyses controlled for education and socioeconomic status. Our results suggest that cognitive abilities play a critical, albeit underappreciated, role in prejudice. Consequently, we recommend a heightened focus on cognitive ability in research on prejudice and a better integration of cognitive ability into prejudice models.
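
The mediation logic tested here (lower ability predicting prejudice through ideology) can be illustrated with the classic product-of-coefficients calculation on synthetic data. The variable names and effect sizes below are invented for the example and carry no substantive claim about the study's data.

```python
import numpy as np

# Toy product-of-coefficients mediation: ability -> ideology -> prejudice.
rng = np.random.default_rng(3)
n = 5000
ability = rng.normal(size=n)
ideology = -0.4 * ability + rng.normal(size=n)                   # a path
prejudice = 0.5 * ideology - 0.1 * ability + rng.normal(size=n)  # b and c' paths

def ols_coefs(y, predictors):
    """OLS coefficients for y ~ predictors (intercept added), excluding intercept."""
    X = np.column_stack([np.ones(len(y))] + list(predictors))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1:]

a = ols_coefs(ideology, [ability])[0]
b, c_prime = ols_coefs(prejudice, [ideology, ability])
print(f"indirect effect a*b = {a * b:.3f}, direct effect c' = {c_prime:.3f}")
```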

  15. Greater hunger and less restraint predict weight loss success with phentermine treatment.

    Science.gov (United States)

    Thomas, Elizabeth A; Mcnair, Bryan; Bechtell, Jamie L; Ferland, Annie; Cornier, Marc-Andre; Eckel, Robert H

    2016-01-01

Phentermine is thought to cause weight loss through a reduction in hunger. It was hypothesized that higher hunger ratings would predict greater weight loss with phentermine. This is an observational pilot study in which all subjects were treated with phentermine for 8 weeks and appetite and eating behaviors were measured at baseline and week 8. Outcomes were compared in subjects with ≥5% vs. < […] hunger (P = 0.017), desire to eat (P = 0.003), and prospective food consumption (P = 0.006) and lower baseline cognitive restraint (P = 0.01). In addition, higher baseline home prospective food consumption (P = 0.002) and lower baseline cognitive restraint (P < […] hunger and less restraint are more likely to achieve significant weight loss with phentermine. This information can be used clinically to determine who might benefit most from phentermine treatment. © 2015 The Obesity Society.

  16. Predicting geographically distributed adult dental decay in the greater Auckland region of New Zealand.

    Science.gov (United States)

    Rocha, C M; Kruger, E; Whyman, R; Tennant, M

    2014-06-01

To model the geographic distribution of current (and treated) dental decay on a high-resolution geographic basis for the Auckland region of New Zealand. The application of matrix-based mathematics to modelling adult dental disease, based on known population risk profiles, to provide a detailed map of the dental caries distribution for the greater Auckland region. Of the 29 million teeth in adults in the region, some 1.2 million (4%) are suffering decay whilst 7.2 million (25%) have previously suffered decay and are now restored. The model provides a high-resolution picture of where the disease burden lies geographically and presents to health planners a method for developing future service plans.
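
The "matrix-based mathematics" can be pictured as multiplying small-area population counts by age-specific tooth counts and decay/restoration rates to map the expected burden. The sketch below uses invented counts and rates purely to show the mechanics, not the risk profiles used in the study.

```python
import numpy as np

# Toy matrix calculation of decayed and restored teeth per small area.
age_bands = ["18-34", "35-54", "55+"]
population = np.array([[4200, 3900, 2100],    # rows = areas, cols = age bands
                       [1800, 2500, 3000]])
teeth_per_person = np.array([30.0, 28.0, 24.0])   # remaining teeth by age band
decay_rate = np.array([0.03, 0.04, 0.06])         # share of teeth currently decayed
filled_rate = np.array([0.15, 0.28, 0.35])        # share previously restored

teeth = population * teeth_per_person              # teeth per area and age band
decayed = (teeth * decay_rate).sum(axis=1)
filled = (teeth * filled_rate).sum(axis=1)
for i, (d, f) in enumerate(zip(decayed, filled)):
    print(f"area {i}: ~{d:,.0f} decayed teeth, ~{f:,.0f} restored teeth")
```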

  17. Greater tactile sensitivity and less use of immature psychological defense mechanisms predict women's penile-vaginal intercourse orgasm.

    Science.gov (United States)

    Brody, Stuart; Houde, Stephanie; Hess, Ursula

    2010-09-01

    Previous research has suggested that diminished tactile sensitivity might be associated with reduced sexual activity and function. Research has also demonstrated significant physiological and psychological differences between sexual behaviors, including immature psychological defense mechanisms (associated with various psychopathologies) impairing specifically women's orgasm from penile-vaginal intercourse (PVI). To examine the extent to which orgasm triggered by PVI (distinguished from other sexual activities) is associated with both greater tactile sensitivity and lesser use of immature psychological defenses. Seventy French-Canadian female university students (aged 18-30) had their finger sensitivity measured with von Frey type microfilaments, completed the Defense Style Questionnaire and a short form of the Marlowe-Crowne social desirability scale, and provided details of the 1 month (and ever) frequencies of engaging in, and having an orgasm from, PVI, masturbation, anal intercourse, partner masturbation, and cunnilingus. Logistic and linear regression prediction of orgasm triggered by PVI from tactile sensitivity, age, social desirability responding, and immature psychological defenses. Having a PVI orgasm in the past month was associated with greater tactile sensitivity (odds ratio=4.0 for each filament point) and less use of immature defense mechanisms (odds ratio=5.1 for each scale point). Lifetime PVI orgasm was associated only with less use of immature defense mechanisms (and lower social desirability responding score). Orgasms triggered by other activities were not associated with either tactile sensitivity or immature defense mechanisms. Tactile sensitivity was also associated with greater past month PVI frequency (inclusion of PVI frequency in a logistic regression model displaced tactile sensitivity), and lesser use of immature defenses was associated with greater past month PVI and PVI orgasm frequencies. Both diminished physical sensitivity and the

  18. Fathers' decline in testosterone and synchrony with partner testosterone during pregnancy predicts greater postpartum relationship investment.

    Science.gov (United States)

    Saxbe, Darby E; Edelstein, Robin S; Lyden, Hannah M; Wardecker, Britney M; Chopik, William J; Moors, Amy C

    2017-04-01

    The transition to parenthood has been associated with declines in testosterone among partnered fathers, which may reflect males' motivation to invest in the family. Moreover, preliminary evidence has found that couples show correlations in hormone levels across pregnancy that may also be linked to fathers' preparation for parenthood. The current study used repeated-measures sampling of testosterone across pregnancy to explore whether fathers' change in T, and correlations with mothers' T, were associated with fathers' and mothers' postpartum investment. In a sample of 27 couples (54 individuals) expecting their first child, both parents' salivary testosterone was measured multiple times across pregnancy. At approximately 3.5months postpartum, participants rated their investment, commitment, and satisfaction with their partner. A multilevel model was used to measure change in testosterone over time and associations between mother and father testosterone. Fathers who showed stronger declines in T across pregnancy, and stronger correlations with mothers' testosterone, reported higher postpartum investment, commitment, and satisfaction. Mothers reported more postpartum investment and satisfaction if fathers showed greater prenatal declines in T. These results held even after controlling for paternal investment, commitment, and satisfaction measured prenatally at study entry. Our results suggest that changes in paternal testosterone across pregnancy, and hormonal linkage with the pregnant partner, may underlie fathers' dedication to the partner relationship across the transition to parenthood. Copyright © 2016 Elsevier Inc. All rights reserved.

  19. Predictive Surface Complexation Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Sverjensky, Dimitri A. [Johns Hopkins Univ., Baltimore, MD (United States). Dept. of Earth and Planetary Sciences

    2016-11-29

    Surface complexation plays an important role in the equilibria and kinetics of processes controlling the compositions of soilwaters and groundwaters, the fate of contaminants in groundwaters, and the subsurface storage of CO2 and nuclear waste. Over the last several decades, many dozens of individual experimental studies have addressed aspects of surface complexation that have contributed to an increased understanding of its role in natural systems. However, there has been no previous attempt to develop a model of surface complexation that can be used to link all the experimental studies in order to place them on a predictive basis. Overall, my research has successfully integrated the results of the work of many experimentalists published over several decades. For the first time in studies of the geochemistry of the mineral-water interface, a practical predictive capability for modeling has become available. The predictive correlations developed in my research now enable extrapolations of experimental studies to provide estimates of surface chemistry for systems not yet studied experimentally and for natural and anthropogenically perturbed systems.

  20. Multimodal route choice models of public transport passengers in the Greater Copenhagen Area

    DEFF Research Database (Denmark)

    Anderson, Marie Karen; Nielsen, Otto Anker; Prato, Carlo Giacomo

    2014-01-01

Understanding route choice behavior is crucial to explain travelers’ preferences and to predict traffic flows under different scenarios. A growing body of literature has concentrated on public transport users without, however, concentrating on multimodal public transport networks because […],641 public transport users in the Greater Copenhagen Area. A two-stage approach consisting of choice set generation and route choice model estimation allowed uncovering the preferences of the users of this multimodal large-scale public transport network. The results illustrate the rates of substitution not only of the in-vehicle times for different public transport modes, but also of the other time components (e.g., access, walking, waiting, transfer) composing the door-to-door experience of using a multimodal public transport network, differentiating by trip length and purpose, and accounting …

  1. Predicted Liquefaction in the Greater Oakland and Northern Santa Clara Valley Areas for a Repeat of the 1868 Hayward Earthquake

    Science.gov (United States)

    Holzer, T. L.; Noce, T. E.; Bennett, M. J.

    2008-12-01

    Probabilities of surface manifestations of liquefaction due to a repeat of the 1868 (M6.7-7.0) earthquake on the southern segment of the Hayward Fault were calculated for two areas along the margin of San Francisco Bay, California: greater Oakland and the northern Santa Clara Valley. Liquefaction is predicted to be more common in the greater Oakland area than in the northern Santa Clara Valley owing to the presence of 57 km2 of susceptible sandy artificial fill. Most of the fills were placed into San Francisco Bay during the first half of the 20th century to build military bases, port facilities, and shoreline communities like Alameda and Bay Farm Island. Probabilities of liquefaction in the area underlain by this sandy artificial fill range from 0.2 to ~0.5 for a M7.0 earthquake, and decrease to 0.1 to ~0.4 for a M6.7 earthquake. In the greater Oakland area, liquefaction probabilities generally are less than 0.05 for Holocene alluvial fan deposits, which underlie most of the remaining flat-lying urban area. In the northern Santa Clara Valley for a M7.0 earthquake on the Hayward Fault and an assumed water-table depth of 1.5 m (the historically shallowest water level), liquefaction probabilities range from 0.1 to 0.2 along Coyote and Guadalupe Creeks, but are less than 0.05 elsewhere. For a M6.7 earthquake, probabilities are greater than 0.1 along Coyote Creek but decrease along Guadalupe Creek to less than 0.1. Areas with high probabilities in the Santa Clara Valley are underlain by latest Holocene alluvial fan levee deposits where liquefaction and lateral spreading occurred during large earthquakes in 1868 and 1906. The liquefaction scenario maps were created with ArcGIS ModelBuilder. Peak ground accelerations first were computed with the new Boore and Atkinson NGA attenuation relation (2008, Earthquake Spectra, 24:1, p. 99-138), using VS30 to account for local site response. Spatial liquefaction probabilities were then estimated using the predicted ground motions

  2. Candidate Prediction Models and Methods

    DEFF Research Database (Denmark)

    Nielsen, Henrik Aalborg; Nielsen, Torben Skov; Madsen, Henrik

    2005-01-01

    This document lists candidate prediction models for Work Package 3 (WP3) of the PSO-project called ``Intelligent wind power prediction systems'' (FU4101). The main focus is on the models transforming numerical weather predictions into predictions of power production. The document also outlines...

  3. Lower inhibitory control interacts with greater pain catastrophizing to predict greater pain intensity in women with migraine and overweight/obesity.

    Science.gov (United States)

    Galioto, Rachel; O'Leary, Kevin C; Thomas, J Graham; Demos, Kathryn; Lipton, Richard B; Gunstad, John; Pavlović, Jelena M; Roth, Julie; Rathier, Lucille; Bond, Dale S

    2017-12-01

Pain catastrophizing (PC) is associated with more severe and disabling migraine attacks. However, factors that moderate this relationship are unknown. Failure of inhibitory control (IC), or the ability to suppress automatic or inappropriate responses, may be one such factor given previous research showing a relationship between higher PC and lower IC in non-migraine samples, and research showing reduced IC in migraine. Therefore, we examined whether lower IC interacts with increased PC to predict greater migraine severity as measured by pain intensity, attack frequency, and duration. Women (n = 105) aged 18-50 years old (M = 38.0 ± 1.2) with overweight/obesity and migraine who were seeking behavioral treatment for weight loss and migraine reduction completed a 28-day smartphone-based headache diary assessing migraine headache severity. Participants then completed a modified computerized Stroop task as a measure of IC and self-report measures of PC (Pain Catastrophizing Scale [PCS]), anxiety, and depression. Linear regression was used to examine independent and joint associations of PC and IC with indices of migraine severity after controlling for age, body mass index (BMI), depression, and anxiety. Participants on average had BMI of 35.1 ± 6.5 kg/m2 and reported 5.3 ± 2.6 migraine attacks (8.3 ± 4.4 migraine days) over 28 days that produced moderate pain intensity (5.9 ± 1.4 out of 10) with duration of 20.0 ± 14.2 h. After adjusting for covariates, higher PCS total (β = .241, SE = .14, p = .03) and magnification subscale (β = .311, SE = .51, p < .01) scores were significant independent correlates of longer attack duration. IC interacted with total PCS (β = 1.106, SE = .001, p = .03), rumination (β = 1.098, SE = .001, p = .04), and helplessness (β = 1.026, SE = .001, p = .04) subscale scores to predict headache pain intensity, such that the association between PC …

  4. Modelling the emerging pollutant diclofenac with the GREAT-ER model: Application to the Llobregat River Basin

    Energy Technology Data Exchange (ETDEWEB)

    Aldekoa, Joana, E-mail: joaalma2@cam.upv.es [Universitat Politècnica de València, Camino de Vera s/n, 46022 Valencia (Spain); Medici, Chiara [Universitat Politècnica de València, Camino de Vera s/n, 46022 Valencia (Spain); Osorio, Victoria; Pérez, Sandra [Institute of Environmental Assessment and Water Research, Jordi Girona 18-26, 08034 Barcelona (Spain); Marcé, Rafael [Catalan Institute for Water Research, Emili Grahit 101, 17003 Girona (Spain); Barceló, Damià [Institute of Environmental Assessment and Water Research, Jordi Girona 18-26, 08034 Barcelona (Spain); Francés, Félix [Universitat Politècnica de València, Camino de Vera s/n, 46022 Valencia (Spain)

    2013-12-15

Highlights: • Diclofenac levels were measured in 14 sampling sites of the Llobregat River (Spain). • GREAT-ER model was used to simulate diclofenac concentrations in the Llobregat River. • Deterministic and stochastic modelling approaches were contrasted. • Diclofenac discharge into the basin was estimated for the studied period. • Consistent degradation rates were predicted and compared with literature values. -- Abstract: The present research aims at giving an insight into the increasingly important issue of water pollution due to emerging contaminants. In particular, the source and fate of the non-steroidal anti-inflammatory drug diclofenac have been analyzed at catchment scale for the Llobregat River in Catalonia (Spain). In fact, water from the Llobregat River is used to supply a significant part of the Metropolitan Area of Barcelona. At the same time, 59 wastewater treatment plants discharge into this basin. The GREAT-ER model has been implemented in this basin in order to reproduce a static balance for this pollutant for two field-campaign data sets. The results highlighted the ability of GREAT-ER to simulate the diclofenac concentrations in the Llobregat Catchment; however, this study also pointed out the urgent need for longer time series of observed data and a better knowledge of wastewater plant outputs and their parameterization in order to obtain more reliable results.

  5. Melanoma risk prediction models

    Directory of Open Access Journals (Sweden)

    Nikolić Jelena

    2014-01-01

Full Text Available Background/Aim. The lack of effective therapy for advanced stages of melanoma emphasizes the importance of preventive measures and screenings of population at risk. Identifying individuals at high risk should allow targeted screenings and follow-up involving those who would benefit most. The aim of this study was to identify the most significant factors for melanoma prediction in our population and to create prognostic models for identification and differentiation of individuals at risk. Methods. This case-control study included 697 participants (341 patients and 356 controls) that underwent extensive interview and skin examination in order to check risk factors for melanoma. Pairwise univariate statistical comparison was used for the coarse selection of the most significant risk factors. These factors were fed into logistic regression (LR) and alternating decision trees (ADT) prognostic models that were assessed for their usefulness in identification of patients at risk to develop melanoma. Validation of the LR model was done by the Hosmer and Lemeshow test, whereas the ADT was validated by 10-fold cross-validation. The achieved sensitivity, specificity, accuracy and AUC for both models were calculated. The melanoma risk score (MRS) based on the outcome of the LR model was presented. Results. The LR model showed that the following risk factors were associated with melanoma: sunbeds (OR = 4.018; 95% CI 1.724-9.366) for those that sometimes used sunbeds, solar damage of the skin (OR = 8.274; 95% CI 2.661-25.730) for those with severe solar damage, hair color (OR = 3.222; 95% CI 1.984-5.231) for light brown/blond hair, the number of common naevi (over 100 naevi had OR = 3.57; 95% CI 1.427-8.931), the number of dysplastic naevi (from 1 to 10 dysplastic naevi OR was 2.672; 95% CI 1.572-4.540; for more than 10 naevi OR was 6.487; 95% CI 1.993-21.119), Fitzpatrick's phototype and the presence of congenital naevi. Red hair, phototype I and large congenital naevi were …

  6. Higher dynamic medial knee load predicts greater cartilage loss over 12 months in medial knee osteoarthritis.

    Science.gov (United States)

    Bennell, Kim L; Bowles, Kelly-Ann; Wang, Yuanyuan; Cicuttini, Flavia; Davies-Tuck, Miranda; Hinman, Rana S

    2011-10-01

    Mechanical factors, in particular increased medial knee joint load, are believed to be important in the structural progression of knee osteoarthritis. This study evaluated the relationship of medial knee load during walking to indices of structural disease progression, measured on MRI, in people with medial knee osteoarthritis. A longitudinal cohort design utilising a subset of participants (n=144, 72%) enrolled in a randomised controlled trial of lateral wedge insoles was employed. Medial knee load parameters including the peak knee adduction moment (KAM) and the KAM impulse were measured at baseline using three-dimensional gait analysis during walking. MRI at baseline and at 12 months was used to assess structural indices. Multiple regression with adjustment for covariates assessed the relationship between medial knee load parameters and the annual change in medial tibial cartilage volume. Binary logistic regression was used for the dichotomous variables of progression of medial tibiofemoral cartilage defects and bone marrow lesions (BML). A higher KAM impulse, but not peak KAM, at baseline was independently associated with greater loss of medial tibial cartilage volume over 12 months (β=29.9, 95% CI 6.3 to 53.5, p=0.01). No significant relationships were seen between medial knee load parameters and the progression of medial tibiofemoral cartilage defects or BML. This study suggests knee loading, in particular the KAM impulse, may be a risk factor for loss of medial tibial cartilage volume. As knee load is modifiable, load-modifying treatments may potentially slow disease progression.

  7. Confidence scores for prediction models

    DEFF Research Database (Denmark)

    Gerds, Thomas Alexander; van de Wiel, MA

    2011-01-01

In medical statistics, many alternative strategies are available for building a prediction model based on training data. Prediction models are routinely compared by means of their prediction performance in independent validation data. If only one data set is available for training and validation, then rival strategies can still be compared based on repeated bootstraps of the same data. Often, however, the overall performance of rival strategies is similar and it is thus difficult to decide for one model. Here, we investigate the variability of the prediction models that results when the same […] to distinguish rival prediction models with similar prediction performances. Furthermore, on the subject level a confidence score may provide useful supplementary information for new patients who want to base a medical decision on predicted risk. The ideas are illustrated and discussed using data from cancer...
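
The core idea, refitting the same modelling strategy on bootstrap samples of a single data set and using the spread of the resulting predicted risks as a per-subject confidence score, can be sketched as follows. The data, the logistic-regression strategy, and the number of bootstrap replicates are illustrative choices, not those of the paper.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.utils import resample

# Bootstrap variability of predicted risk for one "new patient" (synthetic data).
rng = np.random.default_rng(4)
X = rng.normal(size=(300, 5))
y = (X[:, 0] - 0.5 * X[:, 1] + rng.normal(size=300) > 0).astype(int)
x_new = rng.normal(size=(1, 5))                      # a hypothetical new subject

preds = []
for b in range(200):
    Xb, yb = resample(X, y, random_state=b)          # bootstrap training sample
    model = LogisticRegression(max_iter=1000).fit(Xb, yb)
    preds.append(model.predict_proba(x_new)[0, 1])

preds = np.array(preds)
print(f"predicted risk {preds.mean():.2f}, bootstrap spread (SD) {preds.std():.2f}")
```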

  8. Self-reported posttraumatic growth predicts greater subsequent posttraumatic stress amidst war and terrorism.

    Science.gov (United States)

    Zalta, Alyson K; Gerhart, James; Hall, Brian J; Rajan, Kumar B; Vechiu, Catalina; Canetti, Daphna; Hobfoll, Stevan E

    2017-03-01

    This study tested three alternative explanations for research indicating a positive, but heterogeneous relationship between self-reported posttraumatic growth (PTG) and posttraumatic stress symptoms (PSS): (a) the third-variable hypothesis that the relationship between PTG and PSS is a spurious one driven by positive relationships with resource loss, (b) the growth over time hypothesis that the relationship between PTG and PSS is initially a positive one, but becomes negative over time, and (c) the moderator hypothesis that resource loss moderates the relationship between PTG and PSS such that PTG is associated with lower levels of PSS as loss increases. A nationally representative sample (N = 1622) of Israelis was assessed at three time points during a period of ongoing violence. PTG, resource loss, and the interaction between PTG and loss were examined as lagged predictors of PSS to test the proposed hypotheses. Results were inconsistent with all three hypotheses, showing that PTG positively predicted subsequent PSS when accounting for main and interactive effects of loss. Our results suggest that self-reported PTG is a meaningful but counterintuitive predictor of poorer mental health following trauma.

  9. Living alongside more affluent neighbors predicts greater involvement in antisocial behavior among low-income boys.

    Science.gov (United States)

    Odgers, Candice L; Donley, Sachiko; Caspi, Avshalom; Bates, Christopher J; Moffitt, Terrie E

    2015-10-01

    The creation of economically mixed communities has been proposed as one way to improve the life outcomes of children growing up in poverty. However, whether low-income children benefit from living alongside more affluent neighbors is unknown. Prospectively gathered data on over 1,600 children from the Environmental Risk (E-Risk) Longitudinal Twin Study living in urban environments is used to test whether living alongside more affluent neighbors (measured via high-resolution geo-spatial indices) predicts low-income children's antisocial behavior (reported by mothers and teachers at the ages of 5, 7, 10, and 12). Results indicated that low-income boys (but not girls) surrounded by more affluent neighbors had higher levels of antisocial behavior than their peers embedded in concentrated poverty. The negative effect of growing up alongside more affluent neighbors on low-income boys' antisocial behavior held across childhood and after controlling for key neighborhood and family-level factors. Findings suggest that efforts to create more economically mixed communities for children, if not properly supported, may have iatrogenic effects on boys' antisocial behavior. © 2015 Association for Child and Adolescent Mental Health.

  10. Assessing the Wave Energy Potential of Jamaica, a Greater Antilles Island, through Dynamic Modelling

    Science.gov (United States)

    Daley, A. P., Jr.; Dorville, J. F. M.; Taylor, M. A.

    2017-12-01

Globally wave energy has been on the rise as a result of the impacts of climate change and continuous fluctuation in oil prices. The water's inertia provides waves with greater stability than that of other renewable energy sources such as solar and wind. Jamaica is part of the Greater Antilles Arc and has over 1000 km of coastline with an abundance of shallow water, approximately 80% of which lies within a 50 km band. This configuration provides a wealth of sites for wave exploitation even in minimal wave energy conditions. Aside from harnessing the ocean's waves, converters can be viewed as a tool for protection of coastal areas against natural marine occurrences. Jamaica has done extensive studies where solar, hydro and wind resources are concerned. However, there have been no studies done to date on the country's wave energy resources. The aim of this study is to bridge this gap by characterizing Jamaica's wave energy resources generated in a half-closed Caribbean Sea using data available from buoys, altimetric satellites, and numerical models. Available data have been used to assess the available resource on the coastal area for the last 12 years. Statistical analysis of the available energy is carried out using the sea state (Hs, Tp and Dir) and the atmospheric forcing (10m-wind, atmospheric pressure, sea-air temperature) relating to the season. The chain of dynamical models (WW3-SWAN-SWASH) is presented, allowing for the tracking of the propagation of the wave energy from an offshore region to the nearshore zone along with its interaction with areas of shallow depth. This will provide a better assessment of the energy and the quality of the waves closer to the electrical grid. Climate prediction is used to estimate the sea state and wave energy exploitable up to 2100, together with an analysis of the possible usage of the available coastal resource up to 2100. The main results present small but exploitable resources with seasonal variability in the energy available but not in wave direction.

  11. Modeling a Spatio-Temporal Individual Travel Behavior Using Geotagged Social Network Data: a Case Study of Greater Cincinnati

    Science.gov (United States)

    Saeedimoghaddam, M.; Kim, C.

    2017-10-01

Understanding individual travel behavior is vital in travel demand management as well as in urban and transportation planning. New data sources including mobile phone data and location-based social media (LBSM) data allow us to understand mobility behavior on an unprecedented level of detail. Recent studies of trip purpose prediction tend to use machine learning (ML) methods, since they generally produce high levels of predictive accuracy. Few studies have used LBSM as a large data source to extend its potential in predicting individual travel destinations using ML techniques. In the presented research, we created a spatio-temporal probabilistic model based on an ensemble ML framework named "Random Forests", utilizing the trips extracted from geotagged Tweets in 419 census tracts of the Greater Cincinnati area, for predicting the tract ID of an individual's travel destination at any time using information about its origin. We evaluated the model accuracy using the travels extracted from the Tweets themselves as well as the travels from a household travel survey. Tweet- and survey-based travels that start from the same tract in the south-western part of the study area are more likely to share the same destination than those starting elsewhere. Also, both Tweet- and survey-based travels were affected by the attraction points in the downtown of Cincinnati and the tracts in the north-eastern part of the area. Finally, both evaluations show that the model predictions are acceptable, but the model cannot predict destinations using inputs from other data sources as precisely as with the Tweet-based data.
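
The Random Forests setup described above, predicting the destination tract of a trip from features of its origin and start time, can be sketched with a standard classifier. The trip table below is synthetic and the feature set deliberately minimal; the real study used trips extracted from geotagged Tweets across 419 census tracts.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Toy destination-choice model: predict destination tract from origin tract and hour.
rng = np.random.default_rng(5)
n_trips, n_tracts = 5000, 50
origin = rng.integers(0, n_tracts, n_trips)
hour = rng.integers(0, 24, n_trips)
# Build a weak origin/time signal into the synthetic destination choice.
destination = (origin + (hour // 8) * 3 + rng.integers(0, 5, n_trips)) % n_tracts

X = np.column_stack([origin, hour])
X_train, X_test, y_train, y_test = train_test_split(X, destination, test_size=0.2,
                                                    random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print(f"held-out destination accuracy: {model.score(X_test, y_test):.2f}")
# model.predict_proba gives a probability over candidate destination tracts per trip.
```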

  12. Bootstrap prediction and Bayesian prediction under misspecified models

    OpenAIRE

    Fushiki, Tadayoshi

    2005-01-01

    We consider a statistical prediction problem under misspecified models. In a sense, Bayesian prediction is an optimal prediction method when an assumed model is true. Bootstrap prediction is obtained by applying Breiman's `bagging' method to a plug-in prediction. Bootstrap prediction can be considered to be an approximation to the Bayesian prediction under the assumption that the model is true. However, in applications, there are frequently deviations from the assumed model. In this paper, bo...

  13. Prediction models in complex terrain

    DEFF Research Database (Denmark)

    Marti, I.; Nielsen, Torben Skov; Madsen, Henrik

    2001-01-01

The objective of the work is to investigate the performance of HIRLAM in complex terrain when used as input to energy production forecasting models, and to develop a statistical model to adapt HIRLAM predictions to the wind farm. The features of the terrain, especially the topography, influence the performance of HIRLAM in particular with respect to wind predictions. To estimate the performance of the model, two spatial resolutions (0.5 Deg. and 0.2 Deg.) and different sets of HIRLAM variables were used to predict wind speed and energy production. The predictions of energy production for the wind farms are calculated using on-line measurements of power production as well as HIRLAM predictions as input, thus taking advantage of the auto-correlation, which is present in the power production for shorter prediction horizons. Statistical models are used to describe the relationship between observed energy production...

  14. Greater deciduous shrub abundance extends tundra peak season and increases modeled net CO2 uptake.

    Science.gov (United States)

    Sweet, Shannan K; Griffin, Kevin L; Steltzer, Heidi; Gough, Laura; Boelman, Natalie T

    2015-06-01

    Satellite studies of the terrestrial Arctic report increased summer greening and longer overall growing and peak seasons since the 1980s, which increases productivity and the period of carbon uptake. These trends are attributed to increasing air temperatures and reduced snow cover duration in spring and fall. Concurrently, deciduous shrubs are becoming increasingly abundant in tundra landscapes, which may also impact canopy phenology and productivity. Our aim was to determine the influence of greater deciduous shrub abundance on tundra canopy phenology and subsequent impacts on net ecosystem carbon exchange (NEE) during the growing and peak seasons in the arctic foothills region of Alaska. We compared deciduous shrub-dominated and evergreen/graminoid-dominated community-level canopy phenology throughout the growing season using the normalized difference vegetation index (NDVI). We used a tundra plant-community-specific leaf area index (LAI) model to estimate LAI throughout the green season and a tundra-specific NEE model to estimate the impact of greater deciduous shrub abundance and associated shifts in both leaf area and canopy phenology on tundra carbon flux. We found that deciduous shrub canopies reached the onset of peak greenness 13 days earlier and the onset of senescence 3 days earlier compared to evergreen/graminoid canopies, resulting in a 10-day extension of the peak season. The combined effect of the longer peak season and greater leaf area of deciduous shrub canopies almost tripled the modeled net carbon uptake of deciduous shrub communities compared to evergreen/graminoid communities, while the longer peak season alone resulted in 84% greater carbon uptake in deciduous shrub communities. These results suggest that greater deciduous shrub abundance increases carbon uptake not only due to greater leaf area, but also due to an extension of the period of peak greenness, which extends the period of maximum carbon uptake. © 2015 John Wiley & Sons Ltd.

  15. MODEL PREDICTIVE CONTROL FUNDAMENTALS

    African Journals Online (AJOL)

    2012-07-02

Jul 2, 2012 ... Linear MPC: (1) uses a linear model, ẋ = Ax + Bu; (2) quadratic cost function, F = xᵀQx + uᵀRu; (3) linear constraints, Hx + Gu < 0; (4) solved as a quadratic program. Nonlinear MPC: (1) nonlinear model, ẋ = f(x, u); (2) cost function can be nonquadratic, F = F(x, u); (3) nonlinear constraints, h(x, u) < 0; (4) solved as a nonlinear program.
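
For readers unfamiliar with the notation, the items above correspond to the standard receding-horizon optimization below, written here in discrete time. This is a generic textbook statement of linear MPC, not quoted from the cited article.

```latex
\begin{aligned}
\min_{u_0,\ldots,u_{N-1}} \quad & \sum_{k=0}^{N-1} \bigl( x_k^{\top} Q x_k + u_k^{\top} R u_k \bigr) \\
\text{subject to} \quad & x_{k+1} = A x_k + B u_k, \qquad k = 0,\ldots,N-1, \\
& H x_k + G u_k \le 0, \qquad x_0 = x(t).
\end{aligned}
```

Only the first input u_0 of the optimal sequence is applied; the problem is re-solved at the next sampling instant with the newly measured state. In the nonlinear case the dynamics and constraints become x_{k+1} = f(x_k, u_k) and h(x_k, u_k) ≤ 0, yielding a nonlinear program.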

  16. Required Collaborative Work in Online Courses: A Predictive Modeling Approach

    Science.gov (United States)

    Smith, Marlene A.; Kellogg, Deborah L.

    2015-01-01

    This article describes a predictive model that assesses whether a student will have greater perceived learning in group assignments or in individual work. The model produces correct classifications 87.5% of the time. The research is notable in that it is the first in the education literature to adopt a predictive modeling methodology using data…

  17. Modelling bankruptcy prediction models in Slovak companies

    Directory of Open Access Journals (Sweden)

    Kovacova Maria

    2017-01-01

Full Text Available An intensive research effort from academics and practitioners has been devoted to models for bankruptcy prediction and credit risk management. In spite of numerous researches focusing on forecasting bankruptcy using traditional statistics techniques (e.g. discriminant analysis and logistic regression) and early artificial intelligence models (e.g. artificial neural networks), there is a trend for transition to machine learning models (support vector machines, bagging, boosting, and random forest) to predict bankruptcy one year prior to the event. Comparing the performance of this unconventional approach with results obtained by discriminant analysis, logistic regression, and neural networks application, it has been found that bagging, boosting, and random forest models outperform the other techniques, and that all prediction accuracy in the testing sample improves when the additional variables are included. On the other hand, the prediction accuracy of old and well-known bankruptcy prediction models is quite high. Therefore, we aim to analyse these older models on a dataset of Slovak companies to validate their prediction ability in specific conditions. Furthermore, these models will be modelled according to new trends by calculating the influence of elimination of selected variables on the overall prediction ability of these models.

  18. Melanoma Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing melanoma cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  19. Predictive models of moth development

    Science.gov (United States)

    Degree-day models link ambient temperature to insect life-stages, making such models valuable tools in integrated pest management. These models increase management efficacy by predicting pest phenology. In Wisconsin, the top insect pest of cranberry production is the cranberry fruitworm, Acrobasis v...
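
The degree-day idea can be made concrete in a few lines: accumulate daily heat units above a base temperature and flag the day a phenological threshold is crossed. The base temperature, threshold, and synthetic temperature series below are placeholders for illustration, not the cranberry fruitworm parameters.

```python
# Minimal degree-day accumulation sketch using the simple averaging method.
# Base temperature and the 900-degree-day threshold are hypothetical values.

def daily_degree_days(t_min, t_max, t_base=10.0):
    """Average-method degree days for one day (degree C * day)."""
    return max(0.0, (t_min + t_max) / 2.0 - t_base)

def accumulate(temps, threshold=900.0):
    """temps: list of (t_min, t_max); return day index when threshold is reached."""
    total = 0.0
    for day, (lo, hi) in enumerate(temps, start=1):
        total += daily_degree_days(lo, hi)
        if total >= threshold:
            return day, total
    return None, total

season = [(8 + 0.05 * d, 18 + 0.08 * d) for d in range(180)]   # synthetic season
day, total = accumulate(season)
print(f"threshold reached on day {day} ({total:.0f} degree-days)")
```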

  20. Predictive Models and Computational Embryology

    Science.gov (United States)

    EPA’s ‘virtual embryo’ project is building an integrative systems biology framework for predictive models of developmental toxicity. One schema involves a knowledge-driven adverse outcome pathway (AOP) framework utilizing information from public databases, standardized ontologies...

  1. Predictions models with neural nets

    Directory of Open Access Journals (Sweden)

    Vladimír Konečný

    2008-01-01

Full Text Available The contribution addresses basic problems in predicting trends of economic indicators using neural networks. These problems include the choice of a suitable model and, consequently, the configuration of the neural network, the choice of the neurons' computational functions, and the way prediction learning is carried out. The contribution presents two basic models that use multilayer neural network structures, along with a way of determining their configuration. A simple rule is postulated for the training period of the neural network in order to obtain the most credible prediction. Experiments are carried out with real data on the evolution of the Kč/Euro exchange rate. The main reason for choosing this time series is its availability over a sufficiently long period. In the experiments, both of the given basic kinds of prediction models are verified with the most frequently used neuron functions. The prediction results achieved are presented in both numerical and graphical forms.

  2. Greater ankle strength, anaerobic and aerobic capacity, and agility predict Ground Combat Military Occupational School graduation in female Marines.

    Science.gov (United States)

    Allison, Katelyn Fleishman; Keenan, Karen A; Wohleber, Meleesa F; Perlsweig, Katherine A; Pletcher, Erin R; Lovalekar, Mita; Beals, Kim; Coleman, Lawrence C; Nindl, Bradley C

    2017-11-01

Women can serve in all military occupational specialties (MOS); however, musculoskeletal and physiological characteristics that predict successful completion of ground combat MOS schools by female Marines are unknown. To determine which demographic, musculoskeletal, and physiological characteristics predict graduation from infantry and vehicle ground combat MOS schools in female Marines. Prospective cohort study. Prior to MOS school, the following were assessed in 62 female Marines (22.0±3.0 yrs, 163.9±5.8 cm, 63.4±7.2 kg): isokinetic shoulder, trunk, and knee and isometric ankle strength; body composition; anaerobic power (AP)/capacity (AC); maximal oxygen uptake (VO2max); and field-based fitness tests (broad jump, medicine ball throw, pro-agility). Both absolute and normalized (%body mass: %BM) values were utilized for strength, AP, AC, and VO2max. Select tests from each Marine's most recent Physical Fitness Test (PFT: abdominal crunches, 3-mile run time) and Combat Fitness Test (CFT: Maneuver Under Fire, Movement to Contact) were recorded. Participants were classified as graduated (N=46) or did not graduate (N=16). Simple logistic regression was performed to determine predictors of MOS school graduation. Statistical significance was set a priori at α=0.05. Absolute and normalized ankle inversion and eversion strength, normalized anaerobic capacity, absolute and normalized VO2max, right pro-agility, and PFT 3-mile run time significantly predicted MOS school graduation (p < […] agility, and greater anaerobic and aerobic capacity are important for successful completion of ground combat MOS school in female Marines. Prior to entering ground combat MOS school, it is recommended that female Marines should train to optimize these mobility-centric characteristics. Copyright © 2017 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.

  3. Greater striatopallidal adaptive coding during cue-reward learning and food reward habituation predict future weight gain.

    Science.gov (United States)

    Burger, Kyle S; Stice, Eric

    2014-10-01

    Animal experiments indicate that after repeated pairings of palatable food receipt and cues that predict palatable food receipt, dopamine signaling increases in response to predictive cues, but decreases in response to food receipt. Using functional MRI and mixed effects growth curve models with 35 females (M age=15.5±0.9; M BMI=24.5±5.4) we documented an increase in BOLD response in the caudate (r=.42) during exposure to cues predicting impending milkshake receipt over repeated exposures, demonstrating a direct measure of in vivo cue-reward learning in humans. Further, we observed a simultaneous decrease in putamen (r=-.33) and ventral pallidum (r=-.45) response during milkshake receipt that occurred over repeated exposures, putatively reflecting food reward habituation. We then tested whether cue-reward learning and habituation slopes predicted future weight over 2-year follow-up. Those who exhibited the greatest escalation in ventral pallidum responsivity to cues and the greatest decrease in caudate response to milkshake receipt showed significantly larger increases in BMI (r=.39 and -.69, respectively). Interestingly, cue-reward learning propensity and food reward habituation were not correlated, implying that these factors may constitute qualitatively distinct vulnerability pathways to excess weight gain. These two individual difference factors may provide insight as to why certain people have shown obesity onset in response to the current obesogenic environment in western cultures, whereas others have not. Copyright © 2014 Elsevier Inc. All rights reserved.

  4. Modelling the predictive performance of credit scoring

    Directory of Open Access Journals (Sweden)

    Shi-Wei Shen

    2013-07-01

    Research purpose: The purpose of this empirical paper was to examine the predictive performance of credit scoring systems in Taiwan. Motivation for the study: Corporate lending remains a major business line for financial institutions. However, in light of the recent global financial crises, it has become extremely important for financial institutions to implement rigorous means of assessing clients seeking access to credit facilities. Research design, approach and method: Using a data sample of 10 349 observations drawn between 1992 and 2010, logistic regression models were utilised to examine the predictive performance of credit scoring systems. Main findings: A goodness-of-fit test demonstrated that credit scoring models that incorporated the Taiwan Corporate Credit Risk Index (TCRI) together with micro- and macroeconomic variables possessed greater predictive power. This suggests that macroeconomic variables do have explanatory power for default credit risk. Practical/managerial implications: The originality of the study is that three models were developed to predict corporate firms’ defaults based on different microeconomic and macroeconomic factors such as the TCRI, asset growth rates, stock index and gross domestic product. Contribution/value-add: The study utilises different goodness-of-fit measures and receiver operating characteristics during the examination of the robustness of the predictive power of these factors.
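
    The following sketch illustrates the kind of default-prediction logit described above, combining a firm-level risk index with micro- and macroeconomic covariates. All data and variable names (tcri, asset_growth, gdp_growth) are hypothetical placeholders, not the study's dataset.

```python
# Minimal sketch: a default-prediction logit combining firm-level and
# macroeconomic covariates; data and coefficients are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 2000
tcri = rng.integers(1, 10, n)            # credit risk index (1 = best, 9 = worst)
asset_growth = rng.normal(0.05, 0.2, n)  # firm-level variable
gdp_growth = rng.normal(0.03, 0.02, n)   # macroeconomic variable
logit = -4 + 0.5 * tcri - 2 * asset_growth - 20 * gdp_growth
default = rng.binomial(1, 1 / (1 + np.exp(-logit)))  # simulated default flags

X = np.column_stack([tcri, asset_growth, gdp_growth])
model = LogisticRegression(max_iter=1000).fit(X, default)

# Discrimination is summarised by the area under the ROC curve.
auc = roc_auc_score(default, model.predict_proba(X)[:, 1])
print(f"in-sample AUC: {auc:.3f}")
```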

  5. LIS-HYMAP coupled Hydrological Modeling in the Nile River Basin and the Greater Horn of Africa

    Science.gov (United States)

    Jung, H. C.; Getirana, A.; Policelli, F. S.

    2015-12-01

    Water scarcity and stress on water resources in Africa have been exacerbated by periodic droughts and floods, yet few studies provide quantitative analyses of the water balance or basin-scale hydrological modeling in Northeast Africa. The NASA Land Information System (LIS) is implemented to simulate land surface processes in the Nile River Basin and the Greater Horn of Africa. In this context, the Noah land surface model (LSM) and the Hydrological Modeling and Analysis Platform (HYMAP) are used to reproduce the water budget and surface water (rivers and floodplains) dynamics in that region. The Global Data Assimilation System (GDAS) meteorological dataset is used to force the system. Due to the unavailability of recent ground-based observations, satellite data are used for a first evaluation of the model outputs. Water levels at 10 Envisat virtual stations and water discharges at a gauging station are used to provide model performance coefficients (e.g. Nash-Sutcliffe, delay index, relative error). We also compare the spatial and temporal variations of flooded areas from the model with the Global Inundation Extent from Multi-Satellites (GIEMS) and the Alaska Satellite Facility (ASF)'s MEaSUREs Wetland data. Finally, we estimate surface water storage variations using a hypsographic curve approach with Shuttle Radar Topography Mission (SRTM) topographic data and evaluate the model-derived water storage changes in both the river and the floodplain. This study demonstrates the feasibility of using LIS-HYMAP coupled modeling to support seasonal forecast methods for prediction of decision-relevant metrics of hydrologic extremes.
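
    For readers unfamiliar with the performance coefficients mentioned above, the sketch below computes two of them (Nash-Sutcliffe efficiency and relative error) for a simulated-versus-observed discharge series; the numbers are placeholders, not Envisat or gauge data.

```python
# Minimal sketch of two common hydrological skill scores; arrays are placeholders.
import numpy as np

def nash_sutcliffe(obs, sim):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def relative_error(obs, sim):
    """Relative volume error between simulated and observed totals."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return (sim.sum() - obs.sum()) / obs.sum()

observed = np.array([1200.0, 1350.0, 1800.0, 2400.0, 2100.0, 1600.0])
simulated = np.array([1100.0, 1400.0, 1700.0, 2500.0, 2000.0, 1500.0])
print(f"NSE = {nash_sutcliffe(observed, simulated):.3f}, "
      f"relative error = {relative_error(observed, simulated):+.3%}")
```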

  6. What do saliency models predict?

    Science.gov (United States)

    Koehler, Kathryn; Guo, Fei; Zhang, Sheng; Eckstein, Miguel P.

    2014-01-01

    Saliency models have been frequently used to predict eye movements made during image viewing without a specified task (free viewing), yet a single image set had never been used to systematically compare free viewing to other tasks. We investigated the effect of task differences on the ability of three models of saliency to predict the performance of humans viewing a novel database of 800 natural images. We introduced a novel task in which 100 observers made explicit perceptual judgments about the most salient image region. Other groups of observers performed a free viewing task, saliency search task, or cued object search task. Behavior on the popular free viewing task was not best predicted by standard saliency models. Instead, the models most accurately predicted the explicit saliency selections and eye movements made while performing saliency judgments. Observers' fixations varied similarly across images for the saliency and free viewing tasks, suggesting that these two tasks are related. The variability of observers' eye movements was modulated by the task (lowest for the object search task and greatest for the free viewing and saliency search tasks) as well as by the clutter content of the images. Eye movement variability in saliency search and free viewing might also be limited by inherent variation in what observers consider salient. Our results contribute to understanding the tasks and behavioral measures for which saliency models are best suited as predictors of human behavior, the relationship across various perceptual tasks, and the factors contributing to observer variability in fixational eye movements. PMID:24618107

  7. Gas phase and aerosol model simulations in the greater Athens area

    Science.gov (United States)

    Bossioli, E.; Tombrou, M.; Dandou, A.

    2003-04-01

    This study analyzes air quality data provided by numerical simulations for the Greater Athens Area (GAA) using the latest release of the emission inventory (industry, traffic, off-road activities, airport, railway, harbor). The three-dimensional photochemical Urban Airshed Model (UAM-V) was coupled with the meteorological Mesoscale Model (MM5). All the simulated days favored high concentration levels of air pollutants. The concentrations of the air pollutants produced by the simulations were compared with routine measurements from the operating stations of the existing air pollution monitoring network in Athens. The comparison revealed good agreement for the stations sited in the center of Athens, while the discrepancies observed at a few suburban stations could be explained by the fact that some sectors (e.g. biogenic emissions) are not included in the Athens emission inventory. Moreover, the importance of VOC reactivity for photochemical modeling, especially for ozone production, was investigated after constructing various speciation profiles of the VOC emissions in agreement with the different land uses (urban, semi-urban). These profiles were derived from a large number of VOC species (about 200) contained in detailed emission inventories. Furthermore, the role of biogenic emissions was examined by incorporating the rural environments. A modeling contribution to the aerosol concentration levels in the Greater Athens Area is then attempted using the three-dimensional Regional Modeling System for Aerosols and Deposition (REMSAD). The aerosol distribution/deposition and toxic chemistry is examined, making use of the emissions of particulate matter included in the emission inventory, such as PM, NH3 and toxics (Hg, Pb, Zn, As, Cu). Further simulations are performed by considering changes in the PM speciation. Finally, the correlation between the gaseous pollutants and the aerosol species is examined in order to provide important conclusions in areas or time

  8. Evaluation of the US Army fallout prediction model

    International Nuclear Information System (INIS)

    Pernick, A.; Levanon, I.

    1987-01-01

    The US Army fallout prediction method was evaluated against an advanced fallout prediction model--SIMFIC (Simplified Fallout Interpretive Code). The danger zone areas of the US Army method were found to be significantly greater (up to a factor of 8) than the areas of corresponding radiation hazard as predicted by SIMFIC. Nonetheless, because the US Army's method predicts danger zone lengths that are commonly shorter than the corresponding hot line distances of SIMFIC, the US Army's method is not reliably conservative

  9. Longer weekly sleep duration predicts greater 3-month BMI reduction among obese adolescents attending a clinical multidisciplinary weight management program.

    Science.gov (United States)

    Sallinen, Bethany J; Hassan, Fauziya; Olszewski, Amy; Maupin, Angela; Hoban, Timothy F; Chervin, Ronald D; Woolford, Susan J

    2013-01-01

    To determine whether baseline levels of self-reported sleep and sleep problems among obese adolescents referred to an outpatient multidisciplinary family-based weight management program predict reduction in BMI 3 months later. A retrospective medical chart review was conducted for 83 obese adolescents. The following baseline variables were extracted: self-reported sleep duration (weekdays and weekends), and presence of snoring, daytime fatigue, suspected sleep apnea, and physician-diagnosed sleep apnea. Anthropometric data at baseline and 3 months were also collected. On average, adolescents reported significantly less sleeping on weeknights (7.7 ± 1.3 h) compared to weekend nights (10.0 ± 1.8 h), t(82) = 10.5, p = 0.0001. Reduction in BMI after 3 months of treatment was predicted by more weekly sleep at baseline (R² = 0.113, F(1, 80) = 10.2, p = 0.002). Adolescents who reduced their BMI by ≥1 kg/m² reported greater weekly sleep at baseline compared to adolescents who experienced <1 kg/m² reduction (60.7 ± 7.5 h vs. 56.4 ± 8.6 h; F(1, 80) = 5.7, p = 0.02). Findings from this study, though correlational, raise the possibility that increased duration of sleep may be associated with weight loss among obese adolescents enrolled in a weight management program. Evidence-based behavioral techniques to improve sleep hygiene and increase sleep duration should be explored in pediatric weight management settings. Copyright © 2013 S. Karger GmbH, Freiburg

  10. Longer Weekly Sleep Duration Predicts Greater 3-Month BMI Reduction among Obese Adolescents Attending a Clinical Multidisciplinary Weight Management Program

    Directory of Open Access Journals (Sweden)

    Bethany J. Sallinen

    2013-05-01

    Full Text Available Aims: To determine whether baseline levels of self-reported sleep and sleep problems among obese adolescents referred to an outpatient multidisciplinary family-based weight management program predict reduction in BMI 3 months later. Methods: A retrospective medical chart review was conducted for 83 obese adolescents. The following baseline variables were extracted: self-reported sleep duration (weekdays and weekends), and presence of snoring, daytime fatigue, suspected sleep apnea, and physician-diagnosed sleep apnea. Anthropometric data at baseline and 3 months were also collected. Results: On average, adolescents reported significantly less sleeping on weeknights (7.7 ± 1.3 h) compared to weekend nights (10.0 ± 1.8 h), t(82) = 10.5, p = 0.0001. Reduction in BMI after 3 months of treatment was predicted by more weekly sleep at baseline (R² = 0.113, F(1, 80) = 10.2, p = 0.002). Adolescents who reduced their BMI by ≥1 kg/m² reported greater weekly sleep at baseline compared to adolescents who experienced <1 kg/m² reduction (60.7 ± 7.5 h vs. 56.4 ± 8.6 h; F(1, 80) = 5.7, p = 0.02). Conclusion: Findings from this study, though correlational, raise the possibility that increased duration of sleep may be associated with weight loss among obese adolescents enrolled in a weight management program. Evidence-based behavioral techniques to improve sleep hygiene and increase sleep duration should be explored in pediatric weight management settings.

  11. Towards greater realism in inclusive fitness models: the case of worker reproduction in insect societies.

    Science.gov (United States)

    Wenseleers, Tom; Helanterä, Heikki; Alves, Denise A; Dueñez-Guzmán, Edgar; Pamilo, Pekka

    2013-01-01

    The conflicts over sex allocation and male production in insect societies have long served as an important test bed for Hamilton's theory of inclusive fitness, but have for the most part been considered separately. Here, we develop new coevolutionary models to examine the interaction between these two conflicts and demonstrate that sex ratio and colony productivity costs of worker reproduction can lead to vastly different outcomes even in species that show no variation in their relatedness structure. Empirical data on worker-produced males in eight species of Melipona bees support the predictions from a model that takes into account the demographic details of colony growth and reproduction. Overall, these models contribute significantly to explaining behavioural variation that previous theories could not account for.

  12. A first approximation for modeling the liquid diffusion pathway at the greater confinement disposal facilities

    International Nuclear Information System (INIS)

    Olague, N.E.; Price, L.L.

    1991-01-01

    The greater confinement disposal (GCD) project is an ongoing project examining the disposal of orphan wastes in Area 5 of the Nevada Test Site. One of the major tasks for the project is performance assessment. With regard to performance assessment, a preliminary conceptual model for ground-water flow and radionuclide transport to the accessible environment at the GCD facilities has been developed. One of the transport pathways that has been postulated is diffusion of radionuclides in the liquid phase upward to the land surface. This pathway is not usually considered in a performance assessment, but is included in the GCD conceptual model because of relatively low recharge estimates at the GCD site and the proximity of the waste to the land surface. These low recharge estimates indicate that convective flow downward to the water table may be negligible; thus, diffusion upward to the land surface may then become important. As part of a preliminary performance assessment which considered a base-case scenario and a climate-change scenario, a first approximation for modeling the liquid-diffusion pathway was formulated. The model includes an analytical solution that incorporates both diffusion and radioactive decay. Overall, these results indicate that, despite the configuration of the GCD facilities that establishes the need for considering the liquid-diffusion pathway, the GCD disposal concept appears to be a technically feasible method for disposing of orphan wastes. Future analyses will consist of investigating the underlying assumptions of the liquid-diffusion model, refining the model as necessary, and reducing uncertainty in the input parameters. 11 refs., 6 figs

  13. Crop classification modelling using remote sensing and environmental data in the Greater Platte River Basin, USA

    Science.gov (United States)

    Howard, Daniel M.; Wylie, Bruce K.; Tieszen, Larry L.

    2012-01-01

    With an ever expanding population, potential climate variability and an increasing demand for agriculture-based alternative fuels, accurate agricultural land-cover classification for specific crops and their spatial distributions are becoming critical to researchers, policymakers, land managers and farmers. It is important to ensure the sustainability of these and other land uses and to quantify the net impacts that certain management practices have on the environment. Although other quality crop classification products are often available, temporal and spatial coverage gaps can create complications for certain regional or time-specific applications. Our goal was to develop a model capable of classifying major crops in the Greater Platte River Basin (GPRB) for the post-2000 era to supplement existing crop classification products. This study identifies annual spatial distributions and area totals of corn, soybeans, wheat and other crops across the GPRB from 2000 to 2009. We developed a regression tree classification model based on 2.5 million training data points derived from the National Agricultural Statistics Service (NASS) Cropland Data Layer (CDL) in relation to a variety of other relevant input environmental variables. The primary input variables included the weekly 250 m US Geological Survey Earth Observing System Moderate Resolution Imaging Spectroradiometer normalized difference vegetation index, average long-term growing season temperature, average long-term growing season precipitation and yearly start of growing season. An overall model accuracy rating of 78% was achieved for a test sample of roughly 215 000 independent points that were withheld from model training. Ten 250 m resolution annual crop classification maps were produced and evaluated for the GPRB region, one for each year from 2000 to 2009. In addition to the model accuracy assessment, our validation focused on spatial distribution and county-level crop area totals in comparison with the
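
    A minimal sketch of a tree-based crop classifier driven by the kinds of predictors listed above (peak NDVI, growing-season temperature and precipitation, start of season). The study used a regression-tree model trained on CDL points; here a plain decision tree on synthetic data stands in for it, and the labelling rule is purely hypothetical.

```python
# Minimal sketch: tree-based crop classification from remote-sensing and
# climate predictors; all data and the label-generating rule are synthetic.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
n = 5000
ndvi_peak = rng.uniform(0.3, 0.9, n)
season_temp = rng.uniform(14, 24, n)
season_precip = rng.uniform(300, 900, n)
start_of_season = rng.uniform(90, 160, n)   # day of year
# Hypothetical rule generating labels: 0 = corn, 1 = soybeans, 2 = wheat
crop = np.where(start_of_season < 110, 2, np.where(ndvi_peak > 0.65, 0, 1))

X = np.column_stack([ndvi_peak, season_temp, season_precip, start_of_season])
X_tr, X_te, y_tr, y_te = train_test_split(X, crop, test_size=0.3, random_state=0)

tree = DecisionTreeClassifier(max_depth=6, random_state=0).fit(X_tr, y_tr)
print(f"held-out accuracy: {tree.score(X_te, y_te):.2%}")
```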

  14. Indoor air quality in the Greater Beirut area: a characterization and modeling assessment

    International Nuclear Information System (INIS)

    El-Fadel, Mutasem; El-Hougeiri, Nisrine; Oulabi, Mawiya

    2003-01-01

    This report presents an assessment of IAQ in various environments selected from different geographic categories of the Greater Beirut area (GBA) in Lebanon. For this purpose, background information about indoor air quality was reviewed, existing conditions were characterized, an air-sampling program was implemented and mathematical modeling was conducted. Twenty-eight indoor buildings were selected from various geographic categories representing different environments (commercial and residential...). Indoor and outdoor air samples were collected and analyzed using carbon monoxide (CO), particulate matter (TSP), nitrogen dioxide (NO2) and total volatile organic compounds (TVOC) as indicators of indoor air pollution (IAP). Samples were further analyzed using the energy dispersive X-ray fluorescence technique (EDXRF) for the presence of major priority metals including iron (Fe), calcium (Ca), zinc (Zn), lead (Pb), manganese (Mn), copper (Cu) and bromine (Br). Indoor and outdoor measured levels were compared to the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) standards and the health-based National Ambient Air Quality Standards (NAAQS), respectively. For the priority metals, on the other hand, indoor measured values were compared to occupational standards recommended by the National Institute for Occupational Safety and Health (NIOSH) and the Occupational Safety and Health Administration (OSHA)

  15. A path model of different forms of impulsivity with externalizing and internalizing psychopathology: Towards greater specificity.

    Science.gov (United States)

    Johnson, Sheri L; Tharp, Jordan A; Peckham, Andrew D; Carver, Charles S; Haase, Claudia M

    2017-09-01

    A growing empirical literature indicates that emotion-related impulsivity (compared to impulsivity that is unrelated to emotion) is particularly relevant for understanding a broad range of psychopathologies. Recent work, however, has differentiated two forms of emotion-related impulsivity: A factor termed Pervasive Influence of Feelings captures tendencies for emotions (mostly negative emotions) to quickly shape thoughts, and a factor termed Feelings Trigger Action captures tendencies for positive and negative emotions to quickly and reflexively shape behaviour and speech. This study used path modelling to consider links from emotion-related and non-emotion-related impulsivity to a broad range of psychopathologies. Undergraduates completed self-report measures of impulsivity, depression, anxiety, aggression, and substance use symptoms. A path model (N = 261) indicated specificity of these forms of impulsivity. Pervasive Influence of Feelings was related to anxiety and depression, whereas Feelings Trigger Action and non-emotion-related impulsivity were related to aggression and substance use. The findings of this study suggest that emotion-relevant impulsivity could be a potentially important treatment target for a set of psychopathologies. Recent work has differentiated two forms of emotion-related impulsivity. This study tests a multivariate path model linking emotion-related and non-emotion-related impulsivity with multiple forms of psychopathology. Impulsive thoughts in response to negative emotions were related to anxiety and depression. Impulsive actions in response to emotions were related to aggression and substance use, as was non-emotion-related impulsivity. The study was limited by the reliance on self-report measures of impulsivity and psychopathology. There is a need for longitudinal work on how these forms of impulsivity predict the onset and course of psychopathology. © 2017 The British Psychological Society.

  16. Selecting the best stable isotope mixing model to estimate grizzly bear diets in the Greater Yellowstone Ecosystem.

    Directory of Open Access Journals (Sweden)

    John B Hopkins

    Full Text Available Past research indicates that whitebark pine seeds are a critical food source for Threatened grizzly bears (Ursus arctos) in the Greater Yellowstone Ecosystem (GYE). In recent decades, whitebark pine forests have declined markedly due to pine beetle infestation, invasive blister rust, and landscape-level fires. To date, no study has reliably estimated the contribution of whitebark pine seeds to the diets of grizzlies through time. We used stable isotope ratios (expressed as δ13C, δ15N, and δ34S values) measured in grizzly bear hair and their major food sources to estimate the diets of grizzlies sampled in Cooke City Basin, Montana. We found that stable isotope mixing models that included different combinations of stable isotope values for bears and their foods generated similar proportional dietary contributions. Estimates generated by our top model suggest that whitebark pine seeds (35±10%) and other plant foods (56±10%) were more important than meat (9±8%) to grizzly bears sampled in the study area. Stable isotope values measured in bear hair collected elsewhere in the GYE and North America support our conclusions about plant-based foraging. We recommend that researchers consider model selection when estimating the diets of animals using stable isotope mixing models. We also urge researchers to use the new statistical framework described here to estimate the dietary responses of grizzlies to declines in whitebark pine seeds and other important food sources (e.g., cutthroat trout) through time in the GYE, as such information could be useful in predicting how the population will adapt to future environmental change.
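
    The study itself uses Bayesian stable isotope mixing models; as a simplified illustration of the underlying mass-balance idea, the sketch below solves a deterministic two-isotope, three-source mixing system. The isotope values are illustrative placeholders, not the paper's measurements, and the resulting proportions are not the study's estimates.

```python
# Minimal sketch of a deterministic two-isotope, three-source mixing model.
# Rows: d13C, d15N, and the constraint that proportions sum to 1.
# Columns: whitebark pine seeds, other plants, meat (illustrative values).
import numpy as np

sources = np.array([
    [-22.0, -26.5, -19.0],   # d13C of each source
    [  2.0,   1.0,   7.5],   # d15N of each source
    [  1.0,   1.0,   1.0],   # mass-balance constraint
])
mixture = np.array([-24.5, 2.2, 1.0])  # hair d13C, d15N, and 1 for the constraint

proportions = np.linalg.solve(sources, mixture)
for name, p in zip(["pine seeds", "other plants", "meat"], proportions):
    print(f"{name}: {p:.2%}")
```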

  17. Greater exposure to sexual content in popular movies predicts earlier sexual debut and increased sexual risk taking.

    Science.gov (United States)

    O'Hara, Ross E; Gibbons, Frederick X; Gerrard, Meg; Li, Zhigang; Sargent, James D

    2012-09-01

    Early sexual debut is associated with risky sexual behavior and an increased risk of unplanned pregnancy and sexually transmitted infections later in life. The relations among early movie sexual exposure (MSE), sexual debut, and risky sexual behavior in adulthood (i.e., multiple sexual partners and inconsistent condom use) were examined in a longitudinal study of U.S. adolescents. MSE was measured using the Beach method, a comprehensive procedure for media content coding. Controlling for characteristics of adolescents and their families, analyses showed that MSE predicted age of sexual debut, both directly and indirectly through changes in sensation seeking. MSE also predicted engagement in risky sexual behaviors both directly and indirectly via early sexual debut. These results suggest that MSE may promote sexual risk taking both by modifying sexual behavior and by accelerating the normal rise in sensation seeking during adolescence.

  18. Empirical validation of landscape resistance models: insights from the Greater Sage-Grouse (Centrocercus urophasianus)

    Science.gov (United States)

    Andrew J. Shirk; Michael A. Schroeder; Leslie A. Robb; Samuel A. Cushman

    2015-01-01

    The ability of landscapes to impede species’ movement or gene flow may be quantified by resistance models. Few studies have assessed the performance of resistance models parameterized by expert opinion. In addition, resistance models differ in terms of spatial and thematic resolution as well as their focus on the ecology of a particular species or more generally on the...

  19. Iowa calibration of MEPDG performance prediction models.

    Science.gov (United States)

    2013-06-01

    This study aims to improve the accuracy of AASHTO Mechanistic-Empirical Pavement Design Guide (MEPDG) pavement performance predictions for Iowa pavement systems through local calibration of MEPDG prediction models. A total of 130 representative p...

  20. Model complexity control for hydrologic prediction

    NARCIS (Netherlands)

    Schoups, G.; Van de Giesen, N.C.; Savenije, H.H.G.

    2008-01-01

    A common concern in hydrologic modeling is overparameterization of complex models given limited and noisy data. This leads to problems of parameter nonuniqueness and equifinality, which may negatively affect prediction uncertainties. A systematic way of controlling model complexity is therefore

  1. Enhancing Flood Prediction Reliability Using Bayesian Model Averaging

    Science.gov (United States)

    Liu, Z.; Merwade, V.

    2017-12-01

    Uncertainty analysis is an indispensable part of modeling the hydrology and hydrodynamics of non-idealized environmental systems. Compared to relying on predictions from a single model simulation, using an ensemble of predictions that considers uncertainty from different sources is more reliable. In this study, Bayesian model averaging (BMA) is applied to the Black River watershed in Arkansas and Missouri by combining multi-model simulations to obtain reliable deterministic water stage and probabilistic inundation extent predictions. The simulation ensemble is generated from 81 LISFLOOD-FP subgrid model configurations that include uncertainty from channel shape, channel width, channel roughness and discharge. Model simulation outputs are trained with observed water stage data during one flood event, and BMA prediction ability is validated for another flood event. Results from this study indicate that BMA does not always outperform all members in the ensemble, but it provides relatively robust deterministic flood stage predictions across the basin. Station-based BMA (BMA_S) water stage prediction performs better than global BMA (BMA_G) prediction, which in turn is superior to the ensemble mean prediction. Additionally, the high-frequency flood inundation extent (probability greater than 60%) in the BMA_G probabilistic map is more accurate than the probabilistic flood inundation extent based on equal weights.
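
    As a simplified illustration of how BMA combines ensemble members, the sketch below weights members by the Gaussian likelihood of training observations and forms a weighted forecast. Operational BMA implementations estimate weights and member variances jointly (typically via EM); all numbers here are synthetic.

```python
# Minimal sketch of Bayesian model averaging over an ensemble of stage
# predictions; weights are proportional to each member's likelihood on the
# training (calibration) event. Data are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(3)
truth_train = rng.normal(5.0, 1.0, 50)                         # observed stages
ensemble_train = truth_train + rng.normal(0, [[0.2], [0.5], [1.0]], (3, 50))
ensemble_new = np.array([5.4, 5.9, 6.8])                       # forecasts for a new time

errors = truth_train - ensemble_train
sigma = errors.std(axis=1, ddof=1)                             # member error spread
# Log-likelihood of the training observations under each member
loglik = np.sum(-0.5 * (errors / sigma[:, None]) ** 2 - np.log(sigma[:, None]), axis=1)
weights = np.exp(loglik - loglik.max())
weights /= weights.sum()

bma_forecast = np.dot(weights, ensemble_new)
print("weights:", np.round(weights, 3), "BMA forecast:", round(bma_forecast, 2))
```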

  2. Staying Power of Churn Prediction Models

    NARCIS (Netherlands)

    Risselada, Hans; Verhoef, Peter C.; Bijmolt, Tammo H. A.

    In this paper, we study the staying power of various churn prediction models. Staying power is defined as the predictive performance of a model in a number of periods after the estimation period. We examine two methods, logit models and classification trees, both with and without applying a bagging

  3. Forecasting sagebrush ecosystem components and greater sage-grouse habitat for 2050: learning from past climate patterns and Landsat imagery to predict the future

    Science.gov (United States)

    Homer, Collin G.; Xian, George Z.; Aldridge, Cameron L.; Meyer, Debra K.; Loveland, Thomas R.; O'Donnell, Michael S.

    2015-01-01

    Sagebrush (Artemisia spp.) ecosystems constitute the largest single North American shrub ecosystem and provide vital ecological, hydrological, biological, agricultural, and recreational ecosystem services. Disturbances have altered and reduced this ecosystem historically, but climate change may ultimately represent the greatest future risk. Improved ways to quantify, monitor, and predict climate-driven gradual change in this ecosystem are vital to its future management. We examined the annual change of Daymet precipitation (daily gridded climate data) and five remotely sensed sagebrush ecosystem vegetation and soil components (bare ground, herbaceous, litter, sagebrush, and shrub) from 1984 to 2011 in southwestern Wyoming. Bare ground displayed an increasing trend in abundance over time, while herbaceous, litter, shrub, and sagebrush showed a decreasing trend. Total precipitation amounts showed a downward trend during the same period. We established statistically significant correlations between each sagebrush component and historical precipitation records using a simple least squares linear regression. Using the historical relationship between sagebrush component abundance and precipitation in a linear model, we forecasted the abundance of the sagebrush components in 2050 under Intergovernmental Panel on Climate Change (IPCC) precipitation scenarios A1B and A2. Bare ground was the only component that increased under both future scenarios, with a net increase of 48.98 km2 (1.1%) across the study area under the A1B scenario and 41.15 km2 (0.9%) under the A2 scenario. The remaining components decreased under both future scenarios: litter had the highest net reductions with 49.82 km2 (4.1%) under A1B and 50.8 km2 (4.2%) under A2, and herbaceous had the smallest net reductions with 39.95 km2 (3.8%) under A1B and 40.59 km2 (3.3%) under A2. We applied the 2050 forecast sagebrush component values to contemporary (circa 2006) greater sage-grouse (Centrocercus
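
    A minimal sketch of the forecasting step described above: regress a component's abundance on precipitation over the historical record, then apply the fitted line to an assumed 2050 precipitation value. All numbers are synthetic stand-ins for the Daymet and component time series.

```python
# Minimal sketch of the component-vs-precipitation regression and a 2050
# projection; values are illustrative, not the study's data or scenarios.
import numpy as np

years = np.arange(1984, 2012)
rng = np.random.default_rng(4)
precip = 320 - 0.8 * (years - 1984) + rng.normal(0, 15, years.size)  # mm, downward trend
bare_ground = 40 - 0.05 * precip + rng.normal(0, 1.0, years.size)    # % cover

# Least-squares fit of component abundance on precipitation
slope, intercept = np.polyfit(precip, bare_ground, 1)

precip_2050 = 270.0  # assumed scenario precipitation for 2050 (hypothetical)
bare_2050 = slope * precip_2050 + intercept
print(f"slope = {slope:.3f} %/mm, projected bare ground in 2050: {bare_2050:.1f}%")
```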

  4. Predicting forest height using the GOST, Landsat 7 ETM+, and airborne LiDAR for sloping terrains in the Greater Khingan Mountains of China

    Science.gov (United States)

    Gu, Chengyan; Clevers, Jan G. P. W.; Liu, Xiao; Tian, Xin; Li, Zhouyuan; Li, Zengyuan

    2018-03-01

    Sloping terrain of forests is an overlooked factor in many models simulating the canopy bidirectional reflectance distribution function, which limits the estimation accuracy of forest vertical structure parameters (e.g., forest height). The primary objective of this study was to predict forest height on sloping terrain over large areas with the Geometric-Optical Model for Sloping Terrains (GOST) using airborne Light Detection and Ranging (LiDAR) data and Landsat 7 imagery in the western Greater Khingan Mountains of China. The Sequential Maximum Angle Convex Cone (SMACC) algorithm was used to generate image endmembers and corresponding abundances in Landsat imagery. Then, LiDAR-derived forest metrics, topographical factors and SMACC abundances were used to calibrate and validate the GOST, which aimed to accurately decompose the SMACC mixed forest pixels into sunlit crown, sunlit background and shade components. Finally, the forest height of the study area was retrieved based on a back-propagation neural network and a look-up table. Results showed good performance for coniferous forests on all slopes and at all aspects, with significant coefficients of determination above 0.70 and root mean square errors (RMSEs) between 0.50 m and 1.00 m based on ground observed validation data. Higher RMSEs were found in areas with forest heights below 5 m and above 17 m. For 90% of the forested area, the average RMSE was 3.58 m. Our study demonstrates the tremendous potential of the GOST for quantitative mapping of forest height on sloping terrains with multispectral and LiDAR inputs.

  5. Comparison of Prediction-Error-Modelling Criteria

    DEFF Research Database (Denmark)

    Jørgensen, John Bagterp; Jørgensen, Sten Bay

    2007-01-01

    Single and multi-step prediction-error-methods based on the maximum likelihood and least squares criteria are compared. The prediction-error methods studied are based on predictions using the Kalman filter and Kalman predictors for a linear discrete-time stochastic state space model, which is a r...

  6. Land Use Scenarios for Greater Copenhagen: Modelling the Impact of the Fingerplan

    DEFF Research Database (Denmark)

    Fertner, Christian; Jørgensen, Gertrud; Nielsen, Thomas Alexander Sick

    2012-01-01

    ...region, as well as an approach to understand urban development patterns outside the ‘spatial masterplan’. In this context we will present the results of a modelling exercise addressing future land use change in the metropolitan area of Copenhagen, Denmark, and the impact of the current regional planning framework, the “Fingerplan 2007”. We test three policy scenarios and analyse different effects on urban growth by using the Metronamica model from the Dutch-based Research Institute for Knowledge Systems (RIKS). We analyse the possibilities to elaborate a practical and useful outcome within a relatively short period of time. The set-up and the results were discussed with a few experts from the Danish Ministry of the Environment and its value as discussion input recognized. The approach offers a lot of possibilities to discuss urban growth and spatial planning policies, even in a country with a strong...

  7. Calibration of PMIS pavement performance prediction models.

    Science.gov (United States)

    2012-02-01

    Improve the accuracy of TxDOT's existing pavement performance prediction models by calibrating these models using actual field data obtained from the Pavement Management Information System (PMIS). Ensure logical performance superiority patte...

  8. Predictive Model Assessment for Count Data

    National Research Council Canada - National Science Library

    Czado, Claudia; Gneiting, Tilmann; Held, Leonhard

    2007-01-01

    .... In case studies, we critique count regression models for patent data, and assess the predictive performance of Bayesian age-period-cohort models for larynx cancer counts in Germany. Key words: Calibration...

  9. Modeling and Prediction Using Stochastic Differential Equations

    DEFF Research Database (Denmark)

    Juhl, Rune; Møller, Jan Kloppenborg; Jørgensen, John Bagterp

    2016-01-01

    ...that describes the variation between subjects. The ODE setup implies that the variation for a single subject is described by a single parameter (or vector), namely the variance (covariance) of the residuals. Furthermore, the prediction of the states is given as the solution to the ODEs and is hence assumed deterministic and able to predict the future perfectly. A more realistic approach would be to allow for randomness in the model due to, e.g., the model being too simple or errors in input. We describe a modeling and prediction setup which better reflects reality and suggests stochastic differential equations (SDEs) for modeling and forecasting. It is argued that this gives models and predictions which better reflect reality. The SDE approach also offers a more adequate framework for modeling and a number of efficient tools for model building. A software package (CTSM-R) for SDE-based modeling is briefly described.
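
    As a sketch of the ODE-versus-SDE distinction described above, the example below simulates a simple linear model with and without system noise using the Euler-Maruyama scheme; the model and parameters are illustrative only and are not based on CTSM-R.

```python
# Minimal sketch: deterministic ODE prediction vs. an SDE with system noise,
# simulated with Euler-Maruyama. Model and parameters are illustrative.
import numpy as np

def simulate(x0, a, sigma, dt, n, rng=None):
    """dx = -a*x dt + sigma dW; sigma = 0 recovers the deterministic ODE."""
    x = np.empty(n + 1)
    x[0] = x0
    for k in range(n):
        dw = rng.normal(0.0, np.sqrt(dt)) if rng is not None else 0.0
        x[k + 1] = x[k] + (-a * x[k]) * dt + sigma * dw
    return x

rng = np.random.default_rng(5)
ode_path = simulate(1.0, a=0.5, sigma=0.0, dt=0.01, n=1000)
sde_paths = np.array([simulate(1.0, 0.5, 0.2, 0.01, 1000, rng) for _ in range(200)])

# The SDE ensemble mean tracks the ODE, but its spread quantifies prediction
# uncertainty that the deterministic model cannot express.
print(f"t=10: ODE {ode_path[-1]:.3f}, SDE mean {sde_paths[:, -1].mean():.3f} "
      f"± {sde_paths[:, -1].std():.3f}")
```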

  10. Predictive models for arteriovenous fistula maturation.

    Science.gov (United States)

    Al Shakarchi, Julien; McGrogan, Damian; Van der Veer, Sabine; Sperrin, Matthew; Inston, Nicholas

    2016-05-07

    Haemodialysis (HD) is a lifeline therapy for patients with end-stage renal disease (ESRD). A critical factor in the survival of renal dialysis patients is the surgical creation of vascular access, and international guidelines recommend arteriovenous fistulas (AVF) as the gold standard of vascular access for haemodialysis. Despite this, AVFs have been associated with high failure rates. Although risk factors for AVF failure have been identified, their utility for predicting AVF failure through predictive models remains unclear. The objectives of this review are to systematically and critically assess the methodology and reporting of studies developing prognostic predictive models for AVF outcomes and assess them for suitability in clinical practice. Electronic databases were searched for studies reporting prognostic predictive models for AVF outcomes. Dual review was conducted to identify studies that reported on the development or validation of a model constructed to predict AVF outcome following creation. Data were extracted on study characteristics, risk predictors, statistical methodology, model type, as well as validation process. We included four different studies reporting five different predictive models. Parameters identified that were common to all scoring system were age and cardiovascular disease. This review has found a small number of predictive models in vascular access. The disparity between each study limits the development of a unified predictive model.

  11. Model Predictive Control Fundamentals | Orukpe | Nigerian Journal ...

    African Journals Online (AJOL)

    Model Predictive Control (MPC) has developed considerably over the last two decades, both within the research control community and in industries. MPC strategy involves the optimization of a performance index with respect to some future control sequence, using predictions of the output signal based on a process model, ...

  12. Unreachable Setpoints in Model Predictive Control

    DEFF Research Database (Denmark)

    Rawlings, James B.; Bonné, Dennis; Jørgensen, John Bagterp

    2008-01-01

    In this work, a new model predictive controller is developed that handles unreachable setpoints better than traditional model predictive control methods. The new controller induces an interesting fast/slow asymmetry in the tracking response of the system. Nominal asymptotic stability of the optim...

  13. A cervid vocal fold model suggests greater glottal efficiency in calling at high frequencies.

    Directory of Open Access Journals (Sweden)

    Ingo R Titze

    2010-08-01

    Full Text Available Male Rocky Mountain elk (Cervus elaphus nelsoni) produce loud and high fundamental frequency bugles during the mating season, in contrast to the male European Red Deer (Cervus elaphus scoticus), who produces loud and low fundamental frequency roaring calls. A critical step in understanding vocal communication is to relate sound complexity to anatomy and physiology in a causal manner. Experimentation at the sound source, often difficult in vivo in mammals, is simulated here by a finite element model of the larynx and a wave propagation model of the vocal tract, both based on the morphology and biomechanics of the elk. The model can produce a wide range of fundamental frequencies. Low fundamental frequencies require low vocal fold strain, but large lung pressure and large glottal flow if sound intensity level is to exceed 70 dB at 10 m distance. A high-frequency bugle requires both large muscular effort (to strain the vocal ligament) and high lung pressure (to overcome phonation threshold pressure), but at least 10 dB more intensity level can be achieved. Glottal efficiency, the ratio of radiated sound power to aerodynamic power at the glottis, is higher in elk, suggesting an advantage of high-pitched signaling. This advantage is based on two aspects: first, the lower airflow required for aerodynamic power and, second, an acoustic radiation advantage at higher frequencies. Both signal types are used by the respective males during the mating season and probably serve as honest signals. The two signal types relate differently to physical qualities of the sender. The low-frequency sound (Red Deer call) relates to overall body size via a strong relationship between acoustic parameters and the size of vocal organs and body size. The high-frequency bugle may signal muscular strength and endurance, via a 'vocalizing at the edge' mechanism, for which efficiency is critical.

  14. Clinical Prediction Models for Cardiovascular Disease: Tufts Predictive Analytics and Comparative Effectiveness Clinical Prediction Model Database.

    Science.gov (United States)

    Wessler, Benjamin S; Lai Yh, Lana; Kramer, Whitney; Cangelosi, Michael; Raman, Gowri; Lutz, Jennifer S; Kent, David M

    2015-07-01

    Clinical prediction models (CPMs) estimate the probability of clinical outcomes and hold the potential to improve decision making and individualize care. For patients with cardiovascular disease, there are numerous CPMs available although the extent of this literature is not well described. We conducted a systematic review for articles containing CPMs for cardiovascular disease published between January 1990 and May 2012. Cardiovascular disease includes coronary heart disease, heart failure, arrhythmias, stroke, venous thromboembolism, and peripheral vascular disease. We created a novel database and characterized CPMs based on the stage of development, population under study, performance, covariates, and predicted outcomes. There are 796 models included in this database. The number of CPMs published each year is increasing steadily over time. Seven hundred seventeen (90%) are de novo CPMs, 21 (3%) are CPM recalibrations, and 58 (7%) are CPM adaptations. This database contains CPMs for 31 index conditions, including 215 CPMs for patients with coronary artery disease, 168 CPMs for population samples, and 79 models for patients with heart failure. There are 77 distinct index/outcome pairings. Of the de novo models in this database, 450 (63%) report a c-statistic and 259 (36%) report some information on calibration. There is an abundance of CPMs available for a wide assortment of cardiovascular disease conditions, with substantial redundancy in the literature. The comparative performance of these models, the consistency of effects and risk estimates across models and the actual and potential clinical impact of this body of literature is poorly understood. © 2015 American Heart Association, Inc.

  15. Hybrid approaches to physiologic modeling and prediction

    Science.gov (United States)

    Olengü, Nicholas O.; Reifman, Jaques

    2005-05-01

    This paper explores how the accuracy of a first-principles physiological model can be enhanced by integrating data-driven, "black-box" models with the original model to form a "hybrid" model system. Both linear (autoregressive) and nonlinear (neural network) data-driven techniques are separately combined with a first-principles model to predict human body core temperature. Rectal core temperature data from nine volunteers, subject to four 30/10-minute cycles of moderate exercise/rest regimen in both CONTROL and HUMID environmental conditions, are used to develop and test the approach. The results show significant improvements in prediction accuracy, with average improvements of up to 30% for prediction horizons of 20 minutes. The models developed from one subject's data are also used in the prediction of another subject's core temperature. Initial results for this approach for a 20-minute horizon show no significant improvement over the first-principles model by itself.
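
    A minimal sketch of the hybrid idea described above: a placeholder "first-principles" prediction is corrected by an autoregressive model fitted to its residuals, and the one-step-ahead accuracy of the two is compared. The physics model, data, and AR order are synthetic assumptions, not the paper's models or rectal-temperature data.

```python
# Minimal sketch of a hybrid model: physics-based prediction plus a
# data-driven (AR(1)) residual correction. All signals are synthetic.
import numpy as np

rng = np.random.default_rng(6)
t = np.arange(200)                                  # minutes
measured = 37.0 + 0.4 * np.sin(t / 30) + rng.normal(0, 0.05, t.size)
physics = 37.0 + 0.35 * np.sin(t / 30 + 0.1)        # imperfect first-principles output

resid = measured - physics
# Fit a first-order autoregressive correction on the first 150 minutes
phi = np.polyfit(resid[:149], resid[1:150], 1)[0]

# One-step-ahead hybrid prediction for the held-out span (minutes 150-199)
hybrid = physics[150:] + phi * resid[149:-1]
rmse_physics = np.sqrt(np.mean((measured[150:] - physics[150:]) ** 2))
rmse_hybrid = np.sqrt(np.mean((measured[150:] - hybrid) ** 2))
print(f"RMSE physics-only: {rmse_physics:.3f}, hybrid: {rmse_hybrid:.3f}")
```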

  16. Open Access Papers Have a Greater Citation Advantage in the Author-Pays Model

    Directory of Open Access Journals (Sweden)

    Elaine Sullo

    2016-03-01

    Full Text Available Objective – To investigate the citation performance of open access (OA) and toll access (TA) papers published in author-pays open access journals. Design – Longitudinal citation analysis. Setting – Publications in Springer and Elsevier’s author-pays open access journals. Subjects – 633 journals published using the author-pays model. This model encompasses both journals where the article processing charge (APC) is required and journals in which authors can request open access and voluntarily pay APCs for accepted manuscripts. Methods – The authors identified APC-funded journals (journals funded by mandatory author processing charges as well as those where authors voluntarily paid a fee in order to have their articles openly accessible) from both Springer and Elsevier, and analyzed papers published in these journals from 2007 to 2011. The authors excluded journals that adopted the APC model later than 2007. To identify Springer titles, the authors created a search strategy to identify open access articles in SpringerLink. A total of 576 journals were identified and double-checked in the Sherpa-Romeo database (a database of copyright and open access self-archiving policies of academic journals) to verify their open access policies. The authors then downloaded the journal content using SpringerLink, and using Springer Author-Mapper, separated out the open access articles from the toll access articles. In order to identify the Elsevier APC-funded journals, the authors referred to “Open Access Journal Directory: A-Z,” which contained 35 OA journals (p. 584). Once the authors consulted “Sponsored articles” issued by Elsevier and verified titles in Sherpa-Romeo, they identified 57 journals that fit the “author-pays” model. The bibliographic information was downloaded and OA articles were separated from TA articles. The authors confirmed that all journals were indeed OA publications by downloading the full-text from off-campus locations.

  17. Evaluating the Predictive Value of Growth Prediction Models

    Science.gov (United States)

    Murphy, Daniel L.; Gaertner, Matthew N.

    2014-01-01

    This study evaluates four growth prediction models--projection, student growth percentile, trajectory, and transition table--commonly used to forecast (and give schools credit for) middle school students' future proficiency. Analyses focused on vertically scaled summative mathematics assessments, and two performance standards conditions (high…

  18. Model predictive control classical, robust and stochastic

    CERN Document Server

    Kouvaritakis, Basil

    2016-01-01

    For the first time, a textbook that brings together classical predictive control with treatment of up-to-date robust and stochastic techniques. Model Predictive Control describes the development of tractable algorithms for uncertain, stochastic, constrained systems. The starting point is classical predictive control and the appropriate formulation of performance objectives and constraints to provide guarantees of closed-loop stability and performance. Moving on to robust predictive control, the text explains how similar guarantees may be obtained for cases in which the model describing the system dynamics is subject to additive disturbances and parametric uncertainties. Open- and closed-loop optimization are considered and the state of the art in computationally tractable methods based on uncertainty tubes presented for systems with additive model uncertainty. Finally, the tube framework is also applied to model predictive control problems involving hard or probabilistic constraints for the cases of multiplic...

  19. A Global Model for Bankruptcy Prediction.

    Science.gov (United States)

    Alaminos, David; Del Castillo, Agustín; Fernández, Manuel Ángel

    2016-01-01

    The recent world financial crisis has increased the number of bankruptcies in numerous countries and has resulted in a new area of research which responds to the need to predict this phenomenon, not only at the level of individual countries, but also at a global level, offering explanations of the common characteristics shared by the affected companies. Nevertheless, few studies focus on the prediction of bankruptcies globally. In order to compensate for this lack of empirical literature, this study has used a methodological framework of logistic regression to construct predictive bankruptcy models for Asia, Europe and America, and other global models for the whole world. The objective is to construct a global model with a high capacity for predicting bankruptcy in any region of the world. The results obtained have allowed us to confirm the superiority of the global model in comparison to regional models over periods of up to three years prior to bankruptcy.

  20. Fingerprint verification prediction model in hand dermatitis.

    Science.gov (United States)

    Lee, Chew K; Chang, Choong C; Johor, Asmah; Othman, Puwira; Baba, Roshidah

    2015-07-01

    Hand dermatitis associated fingerprint changes is a significant problem and affects fingerprint verification processes. This study was done to develop a clinically useful prediction model for fingerprint verification in patients with hand dermatitis. A case-control study involving 100 patients with hand dermatitis. All patients verified their thumbprints against their identity card. Registered fingerprints were randomized into a model derivation and model validation group. Predictive model was derived using multiple logistic regression. Validation was done using the goodness-of-fit test. The fingerprint verification prediction model consists of a major criterion (fingerprint dystrophy area of ≥ 25%) and two minor criteria (long horizontal lines and long vertical lines). The presence of the major criterion predicts it will almost always fail verification, while presence of both minor criteria and presence of one minor criterion predict high and low risk of fingerprint verification failure, respectively. When none of the criteria are met, the fingerprint almost always passes the verification. The area under the receiver operating characteristic curve was 0.937, and the goodness-of-fit test showed agreement between the observed and expected number (P = 0.26). The derived fingerprint verification failure prediction model is validated and highly discriminatory in predicting risk of fingerprint verification in patients with hand dermatitis. © 2014 The International Society of Dermatology.
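
    The derived decision rule can be expressed directly in code. The sketch below encodes the major criterion (fingerprint dystrophy area of ≥25%) and the two minor criteria (long horizontal lines, long vertical lines) exactly as stated in the abstract; the risk labels follow the abstract, and the function name and example inputs are illustrative.

```python
# Minimal sketch of the fingerprint verification failure prediction rule
# described in the abstract; inputs and the function name are illustrative.
def verification_failure_risk(dystrophy_area_pct, long_horizontal, long_vertical):
    if dystrophy_area_pct >= 25:
        return "almost always fails verification"       # major criterion met
    minor = int(long_horizontal) + int(long_vertical)
    if minor == 2:
        return "high risk of verification failure"      # both minor criteria
    if minor == 1:
        return "low risk of verification failure"       # one minor criterion
    return "almost always passes verification"           # no criteria met

print(verification_failure_risk(30, False, False))
print(verification_failure_risk(10, True, True))
print(verification_failure_risk(5, False, False))
```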

  1. Massive Predictive Modeling using Oracle R Enterprise

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    R is fast becoming the lingua franca for analyzing data via statistics, visualization, and predictive analytics. For enterprise-scale data, R users have three main concerns: scalability, performance, and production deployment. Oracle's R-based technologies - Oracle R Distribution, Oracle R Enterprise, Oracle R Connector for Hadoop, and the R package ROracle - address these concerns. In this talk, we introduce Oracle's R technologies, highlighting how each enables R users to achieve scalability and performance while making production deployment of R results a natural outcome of the data analyst/scientist efforts. The focus then turns to Oracle R Enterprise with code examples using the transparency layer and embedded R execution, targeting massive predictive modeling. One goal behind massive predictive modeling is to build models per entity, such as customers, zip codes, simulations, in an effort to understand behavior and tailor predictions at the entity level. Predictions...

  2. Predictive Model of Systemic Toxicity (SOT)

    Science.gov (United States)

    In an effort to ensure chemical safety in light of regulatory advances away from reliance on animal testing, USEPA and L’Oréal have collaborated to develop a quantitative systemic toxicity prediction model. Prediction of human systemic toxicity has proved difficult and remains a ...

  3. Testicular Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing testicular cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  4. Pancreatic Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing pancreatic cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  5. Colorectal Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing colorectal cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  6. Prostate Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing prostate cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  7. Bladder Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing bladder cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  8. Esophageal Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing esophageal cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  9. Cervical Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing cervical cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  10. Breast Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing breast cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  11. Lung Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing lung cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  12. Liver Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing liver cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  13. Ovarian Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing ovarian cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  14. Posterior Predictive Model Checking in Bayesian Networks

    Science.gov (United States)

    Crawford, Aaron

    2014-01-01

    This simulation study compared the utility of various discrepancy measures within a posterior predictive model checking (PPMC) framework for detecting different types of data-model misfit in multidimensional Bayesian network (BN) models. The investigated conditions were motivated by an applied research program utilizing an operational complex…

  15. Predicting and Modeling RNA Architecture

    Science.gov (United States)

    Westhof, Eric; Masquida, Benoît; Jossinet, Fabrice

    2011-01-01

    SUMMARY A general approach for modeling the architecture of large and structured RNA molecules is described. The method exploits the modularity and the hierarchical folding of RNA architecture that is viewed as the assembly of preformed double-stranded helices defined by Watson-Crick base pairs and RNA modules maintained by non-Watson-Crick base pairs. Despite the extensive molecular neutrality observed in RNA structures, specificity in RNA folding is achieved through global constraints like lengths of helices, coaxiality of helical stacks, and structures adopted at the junctions of helices. The Assemble integrated suite of computer tools allows for sequence and structure analysis as well as interactive modeling by homology or ab initio assembly with possibilities for fitting within electronic density maps. The local key role of non-Watson-Crick pairs guides RNA architecture formation and offers metrics for assessing the accuracy of three-dimensional models in a more useful way than usual root mean square deviation (RMSD) values. PMID:20504963

  16. Multiple Steps Prediction with Nonlinear ARX Models

    OpenAIRE

    Zhang, Qinghua; Ljung, Lennart

    2007-01-01

    NLARX (NonLinear AutoRegressive with eXogenous inputs) models are frequently used in black-box nonlinear system identification. Though it is easy to make one step ahead prediction with such models, multiple steps prediction is far from trivial. The main difficulty is that in general there is no easy way to compute the mathematical expectation of an output conditioned by past measurements. An optimal solution would require intensive numerical computations related to nonlinear filtering. The pur...
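
    A common way to approximate the conditional expectation mentioned above is Monte Carlo simulation: propagate many noise realizations through the one-step model and average the resulting trajectories. The sketch below illustrates the idea for a hypothetical scalar NLARX predictor f; the function, noise level and future inputs are illustrative assumptions, not the authors' model or method.

      import numpy as np

      def f(y_prev, u_prev):
          """Hypothetical NLARX one-step predictor: y(t) = f(y(t-1), u(t-1)) + e(t)."""
          return 0.8 * np.tanh(y_prev) + 0.3 * u_prev

      def multi_step_mc(y0, u_future, noise_std=0.1, n_paths=1000, rng=None):
          """Approximate E[y(t+k) | data] by averaging many simulated paths."""
          rng = np.random.default_rng() if rng is None else rng
          horizon = len(u_future)
          y = np.full(n_paths, y0, dtype=float)
          means = np.empty(horizon)
          for k in range(horizon):
              e = rng.normal(0.0, noise_std, size=n_paths)   # process noise draws
              y = f(y, u_future[k]) + e                      # propagate each path one step
              means[k] = y.mean()                            # MC estimate of E[y(t+k)]
          return means

      print(multi_step_mc(y0=0.5, u_future=[1.0, 0.5, 0.0, -0.5]))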

  17. Predictability of extreme values in geophysical models

    Directory of Open Access Journals (Sweden)

    A. E. Sterk

    2012-09-01

    Full Text Available Extreme value theory in deterministic systems is concerned with unlikely large (or small) values of an observable evaluated along evolutions of the system. In this paper we study the finite-time predictability of extreme values, such as convection, energy, and wind speeds, in three geophysical models. We study whether finite-time Lyapunov exponents are larger or smaller for initial conditions leading to extremes. General statements on whether extreme values are more or less predictable are not possible: the predictability of extreme values depends on the observable, the attractor of the system, and the prediction lead time.

  18. Model complexity control for hydrologic prediction

    Science.gov (United States)

    Schoups, G.; van de Giesen, N. C.; Savenije, H. H. G.

    2008-12-01

    A common concern in hydrologic modeling is overparameterization of complex models given limited and noisy data. This leads to problems of parameter nonuniqueness and equifinality, which may negatively affect prediction uncertainties. A systematic way of controlling model complexity is therefore needed. We compare three model complexity control methods for hydrologic prediction, namely, cross validation (CV), Akaike's information criterion (AIC), and structural risk minimization (SRM). Results show that simulation of water flow using non-physically-based models (polynomials in this case) leads to increasingly better calibration fits as the model complexity (polynomial order) increases. However, prediction uncertainty worsens for complex non-physically-based models because of overfitting of noisy data. Incorporation of physically based constraints into the model (e.g., storage-discharge relationship) effectively bounds prediction uncertainty, even as the number of parameters increases. The conclusion is that overparameterization and equifinality do not lead to a continued increase in prediction uncertainty, as long as models are constrained by such physical principles. Complexity control of hydrologic models reduces parameter equifinality and identifies the simplest model that adequately explains the data, thereby providing a means of hydrologic generalization and classification. SRM is a promising technique for this purpose, as it (1) provides analytic upper bounds on prediction uncertainty, hence avoiding the computational burden of CV, and (2) extends the applicability of classic methods such as AIC to finite data. The main hurdle in applying SRM is the need for an a priori estimation of the complexity of the hydrologic model, as measured by its Vapnik-Chervonenkis (VC) dimension. Further research is needed in this area.
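
    The overfitting trade-off described above can be reproduced on synthetic data by fitting polynomials of increasing order and scoring them with an information criterion. The sketch below uses AIC; the data-generating function and noise level are illustrative assumptions, not the paper's hydrologic case study.

      import numpy as np

      rng = np.random.default_rng(0)
      x = np.linspace(0, 1, 40)
      y = np.sin(2 * np.pi * x) + rng.normal(0, 0.3, x.size)   # noisy synthetic "observations"

      def aic_for_order(order):
          """AIC for a least-squares polynomial fit assuming Gaussian errors."""
          coef = np.polyfit(x, y, order)
          resid = y - np.polyval(coef, x)
          n, k = x.size, order + 1                 # k = number of fitted parameters
          sigma2 = np.mean(resid ** 2)
          return n * np.log(sigma2) + 2 * k        # AIC up to an additive constant

      for order in range(1, 10):
          print(order, round(aic_for_order(order), 2))   # lowest AIC flags the adequate complexity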

  19. Quantifying predictive accuracy in survival models.

    Science.gov (United States)

    Lirette, Seth T; Aban, Inmaculada

    2017-12-01

    For time-to-event outcomes in medical research, survival models are the most appropriate to use. Unlike logistic regression models, quantifying the predictive accuracy of these models is not a trivial task. We present the classes of concordance (C) statistics and R² statistics often used to assess the predictive ability of these models. The discussion focuses on Harrell's C, Kent and O'Quigley's R², and Royston and Sauerbrei's R². We present similarities and differences between the statistics, discuss the software options from the most widely used statistical analysis packages, and give a practical example using the Worcester Heart Attack Study dataset.
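
    Harrell's C can be computed directly from its pairwise definition, as in the simplified sketch below; the toy survival data are invented, and a real analysis would rely on an established survival package rather than this hand-rolled version.

      import numpy as np

      def harrell_c(time, event, risk_score):
          """Harrell's concordance index: among usable pairs, the fraction where the subject
          with the higher predicted risk fails earlier (ties in risk count 1/2).
          Simplified: pairs with identical event times are ignored."""
          time, event, risk_score = map(np.asarray, (time, event, risk_score))
          concordant = tied = usable = 0
          n = len(time)
          for i in range(n):
              for j in range(n):
                  # pair is usable if subject i has an observed event strictly before time[j]
                  if event[i] == 1 and time[i] < time[j]:
                      usable += 1
                      if risk_score[i] > risk_score[j]:
                          concordant += 1
                      elif risk_score[i] == risk_score[j]:
                          tied += 1
          return (concordant + 0.5 * tied) / usable

      # toy example: higher risk_score should correspond to earlier failure
      print(harrell_c(time=[2, 4, 6, 8], event=[1, 1, 0, 1], risk_score=[0.9, 0.7, 0.2, 0.1]))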

  20. Predictive power of nuclear-mass models

    Directory of Open Access Journals (Sweden)

    Yu. A. Litvinov

    2013-12-01

    Full Text Available Ten different theoretical models are tested for their predictive power in the description of nuclear masses. Two sets of experimental masses are used for the test: the older set of 2003 and the newer one of 2011. The predictive power is studied in two regions of nuclei: the global region (Z, N ≥ 8) and the heavy-nuclei region (Z ≥ 82, N ≥ 126). No clear correlation is found between the predictive power of a model and the accuracy of its description of the masses.

  1. Return Predictability, Model Uncertainty, and Robust Investment

    DEFF Research Database (Denmark)

    Lukas, Manuel

    Stock return predictability is subject to great uncertainty. In this paper we use the model confidence set approach to quantify uncertainty about expected utility from investment, accounting for potential return predictability. For monthly US data and six representative return prediction models, we find that confidence sets are very wide, change significantly with the predictor variables, and frequently include expected utilities for which the investor prefers not to invest. The latter motivates a robust investment strategy maximizing the minimal element of the confidence set. The robust investor allocates a much lower share of wealth to stocks compared to a standard investor.

  2. Spatial Economics Model Predicting Transport Volume

    Directory of Open Access Journals (Sweden)

    Lu Bo

    2016-10-01

    Full Text Available It is extremely important to predict logistics requirements in a scientific and rational way. However, in recent years the improvement in prediction methods has not been significant, and the traditional statistical prediction method suffers from low precision and poor interpretability: it can neither guarantee the generalization ability of the prediction model theoretically nor explain the model effectively. Therefore, in combination with the theories of spatial economics, industrial economics, and neo-classical economics, and taking the city of Zhuanghe as the research object, the study identifies the leading industry that can produce a large number of cargoes, and further predicts the static logistics generation of Zhuanghe and its hinterlands. By integrating the various factors that can affect regional logistics requirements, this study established a logistics requirements potential model based on spatial economic principles, and expanded logistics requirements prediction from single statistical principles to a new area of spatial and regional economics.

  3. Accuracy assessment of landslide prediction models

    International Nuclear Information System (INIS)

    Othman, A N; Mohd, W M N W; Noraini, S

    2014-01-01

    The increasing population and the expansion of settlements over hilly areas have greatly increased the impact of natural disasters such as landslides. It is therefore important to develop models that can accurately predict landslide hazard zones. Over the years, various techniques and models have been developed for this purpose. The aim of this paper is to assess the accuracy of landslide prediction models developed by the authors. The methodology involved the selection of the study area, data acquisition, data processing, model development and data analysis. The models are based on nine landslide-inducing parameters, i.e. slope, land use, lithology, soil properties, geomorphology, flow accumulation, aspect, proximity to river and proximity to road. Rank sum, rating, pairwise comparison and AHP techniques are used to determine the weights for each of the parameters. Four different models, each considering a different parameter combination, are developed by the authors. The results are compared to the landslide history; the accuracies for Model 1, Model 2, Model 3 and Model 4 are 66.7%, 66.7%, 60% and 22.9%, respectively. From the results, rank sum, rating and pairwise comparison can be useful techniques for predicting landslide hazard zones.
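
    The weighting techniques named in the abstract (rank sum and AHP-style pairwise comparison) can be illustrated generically; the three-parameter example and the comparison matrix below are hypothetical placeholders, not the authors' values.

      import numpy as np

      def rank_sum_weights(ranks):
          """Rank-sum weights: w_i = (n - r_i + 1) / sum_j (n - r_j + 1), rank 1 = most important."""
          ranks = np.asarray(ranks, dtype=float)
          n = len(ranks)
          scores = n - ranks + 1
          return scores / scores.sum()

      def ahp_weights(pairwise):
          """AHP weights: principal eigenvector of a reciprocal pairwise-comparison matrix."""
          pairwise = np.asarray(pairwise, dtype=float)
          eigvals, eigvecs = np.linalg.eig(pairwise)
          principal = eigvecs[:, np.argmax(eigvals.real)].real
          return principal / principal.sum()

      # hypothetical 3-parameter example (e.g. slope, land use, lithology)
      print(rank_sum_weights([1, 2, 3]))
      A = [[1, 3, 5],
           [1/3, 1, 2],
           [1/5, 1/2, 1]]
      print(ahp_weights(A))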

  4. Evolutionary neural network modeling for software cumulative failure time prediction

    International Nuclear Information System (INIS)

    Tian Liang; Noore, Afzel

    2005-01-01

    An evolutionary neural network modeling approach for software cumulative failure time prediction based on multiple-delayed-input single-output architecture is proposed. Genetic algorithm is used to globally optimize the number of the delayed input neurons and the number of neurons in the hidden layer of the neural network architecture. Modification of Levenberg-Marquardt algorithm with Bayesian regularization is used to improve the ability to predict software cumulative failure time. The performance of our proposed approach has been compared using real-time control and flight dynamic application data sets. Numerical results show that both the goodness-of-fit and the next-step-predictability of our proposed approach have greater accuracy in predicting software cumulative failure time compared to existing approaches

  5. Predictive validation of an influenza spread model.

    Directory of Open Access Journals (Sweden)

    Ayaz Hyder

    Full Text Available BACKGROUND: Modeling plays a critical role in mitigating impacts of seasonal influenza epidemics. Complex simulation models are currently at the forefront of evaluating optimal mitigation strategies at multiple scales and levels of organization. Given their evaluative role, these models remain limited in their ability to predict and forecast future epidemics leading some researchers and public-health practitioners to question their usefulness. The objective of this study is to evaluate the predictive ability of an existing complex simulation model of influenza spread. METHODS AND FINDINGS: We used extensive data on past epidemics to demonstrate the process of predictive validation. This involved generalizing an individual-based model for influenza spread and fitting it to laboratory-confirmed influenza infection data from a single observed epidemic (1998-1999. Next, we used the fitted model and modified two of its parameters based on data on real-world perturbations (vaccination coverage by age group and strain type. Simulating epidemics under these changes allowed us to estimate the deviation/error between the expected epidemic curve under perturbation and observed epidemics taking place from 1999 to 2006. Our model was able to forecast absolute intensity and epidemic peak week several weeks earlier with reasonable reliability and depended on the method of forecasting-static or dynamic. CONCLUSIONS: Good predictive ability of influenza epidemics is critical for implementing mitigation strategies in an effective and timely manner. Through the process of predictive validation applied to a current complex simulation model of influenza spread, we provided users of the model (e.g. public-health officials and policy-makers with quantitative metrics and practical recommendations on mitigating impacts of seasonal influenza epidemics. This methodology may be applied to other models of communicable infectious diseases to test and potentially improve

  6. Predictive Validation of an Influenza Spread Model

    Science.gov (United States)

    Hyder, Ayaz; Buckeridge, David L.; Leung, Brian

    2013-01-01

    Background Modeling plays a critical role in mitigating impacts of seasonal influenza epidemics. Complex simulation models are currently at the forefront of evaluating optimal mitigation strategies at multiple scales and levels of organization. Given their evaluative role, these models remain limited in their ability to predict and forecast future epidemics leading some researchers and public-health practitioners to question their usefulness. The objective of this study is to evaluate the predictive ability of an existing complex simulation model of influenza spread. Methods and Findings We used extensive data on past epidemics to demonstrate the process of predictive validation. This involved generalizing an individual-based model for influenza spread and fitting it to laboratory-confirmed influenza infection data from a single observed epidemic (1998–1999). Next, we used the fitted model and modified two of its parameters based on data on real-world perturbations (vaccination coverage by age group and strain type). Simulating epidemics under these changes allowed us to estimate the deviation/error between the expected epidemic curve under perturbation and observed epidemics taking place from 1999 to 2006. Our model was able to forecast absolute intensity and epidemic peak week several weeks earlier with reasonable reliability and depended on the method of forecasting-static or dynamic. Conclusions Good predictive ability of influenza epidemics is critical for implementing mitigation strategies in an effective and timely manner. Through the process of predictive validation applied to a current complex simulation model of influenza spread, we provided users of the model (e.g. public-health officials and policy-makers) with quantitative metrics and practical recommendations on mitigating impacts of seasonal influenza epidemics. This methodology may be applied to other models of communicable infectious diseases to test and potentially improve their predictive

  7. Risk assessment and management of brucellosis in the southern greater Yellowstone area (I): A citizen-science based risk model for bovine brucellosis transmission from elk to cattle.

    Science.gov (United States)

    Kauffman, Mandy; Peck, Dannele; Scurlock, Brandon; Logan, Jim; Robinson, Timothy; Cook, Walt; Boroff, Kari; Schumaker, Brant

    2016-09-15

    Livestock producers and state wildlife agencies have used multiple management strategies to control bovine brucellosis in the Greater Yellowstone Area (GYA). However, spillover from elk to domestic bison and cattle herds continues to occur. Although knowledge is increasing about the location and behavior of elk in the SGYA, predicting spatiotemporal overlap between elk and cattle requires locations of livestock operations and observations of elk contact by producers. We queried all producers in a three-county area using a questionnaire designed to determine location of cattle and whether producers saw elk comingle with their animals. This information was used to parameterize a spatially-explicit risk model to estimate the number of elk expected to overlap with cattle during the brucellosis transmission risk period. Elk-cattle overlap was predicted in areas further from roads and forest boundaries in areas with wolf activity, with higher slopes, lower hunter densities, and where the cost-distance to feedgrounds was very low or very high. The model was used to estimate the expected number of years until a cattle reactor will be detected, under alternative management strategies. The model predicted cattle cases every 4.28 years in the highest risk herd unit, a higher prediction than the one case in 26 years we have observed. This difference likely indicates that ongoing management strategies are at least somewhat effective in preventing potential elk-cattle brucellosis transmission in these areas. Using this model, we can infer the expected effectiveness of various management strategies for reducing the risk of brucellosis spillover from elk to cattle. Copyright © 2016 Elsevier B.V. All rights reserved.

  8. Posterior predictive checking of multiple imputation models.

    Science.gov (United States)

    Nguyen, Cattram D; Lee, Katherine J; Carlin, John B

    2015-07-01

    Multiple imputation is gaining popularity as a strategy for handling missing data, but there is a scarcity of tools for checking imputation models, a critical step in model fitting. Posterior predictive checking (PPC) has been recommended as an imputation diagnostic. PPC involves simulating "replicated" data from the posterior predictive distribution of the model under scrutiny. Model fit is assessed by examining whether the analysis from the observed data appears typical of results obtained from the replicates produced by the model. A proposed diagnostic measure is the posterior predictive "p-value", an extreme value of which (i.e., a value close to 0 or 1) suggests a misfit between the model and the data. The aim of this study was to evaluate the performance of the posterior predictive p-value as an imputation diagnostic. Using simulation methods, we deliberately misspecified imputation models to determine whether posterior predictive p-values were effective in identifying these problems. When estimating the regression parameter of interest, we found that more extreme p-values were associated with poorer imputation model performance, although the results highlighted that traditional thresholds for classical p-values do not apply in this context. A shortcoming of the PPC method was its reduced ability to detect misspecified models with increasing amounts of missing data. Despite the limitations of posterior predictive p-values, they appear to have a valuable place in the imputer's toolkit. In addition to automated checking using p-values, we recommend imputers perform graphical checks and examine other summaries of the test quantity distribution. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
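
    The posterior predictive p-value itself is straightforward to compute by simulation, as in the generic sketch below for a simple normal model with known variance; this is not the imputation-specific diagnostic of the paper, and the test quantity is an arbitrary illustrative choice.

      import numpy as np

      rng = np.random.default_rng(1)
      y = rng.normal(0, 1, 100)                      # observed data (stand-in)

      # posterior for the mean under a normal model with known sigma = 1 and a flat prior
      n, sigma = y.size, 1.0
      post_mean, post_sd = y.mean(), sigma / np.sqrt(n)

      def discrepancy(data):
          """Test quantity: distance between mean and median (sensitive to asymmetry)."""
          return abs(np.mean(data) - np.median(data))

      t_obs, exceed, n_rep = discrepancy(y), 0, 2000
      for _ in range(n_rep):
          mu = rng.normal(post_mean, post_sd)        # draw parameter from the posterior
          y_rep = rng.normal(mu, sigma, n)           # replicated data set from the model
          if discrepancy(y_rep) >= t_obs:
              exceed += 1
      ppp = exceed / n_rep                           # posterior predictive p-value
      print(ppp)                                     # values near 0 or 1 flag model-data misfit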

  9. Predicting Protein Secondary Structure with Markov Models

    DEFF Research Database (Denmark)

    Fischer, Paul; Larsen, Simon; Thomsen, Claus

    2004-01-01

    The task we are considering here is to predict the secondary structure from the primary one. To this end we train a Markov model on training data and then use it to classify parts of unknown protein sequences as sheets, helices or coils. We show how to exploit the directional information contained in the Markov model for this task. Classifications that are purely based on statistical models might not always be biologically meaningful. We present combinatorial methods to incorporate biological background knowledge to enhance the prediction performance.

  10. Energy based prediction models for building acoustics

    DEFF Research Database (Denmark)

    Brunskog, Jonas

    2012-01-01

    In order to reach robust and simplified yet accurate prediction models, energy based principles are commonly used in many fields of acoustics, especially in building acoustics. This includes simple energy flow models, the framework of statistical energy analysis (SEA), as well as more elaborate principles such as wave intensity analysis (WIA). The European standards for building acoustic predictions, the EN 12354 series, are based on energy flow and SEA principles. In the present paper, different energy based prediction models are discussed and critically reviewed. Special attention is placed on underlying basic assumptions, such as diffuse fields, high modal overlap, resonant field being dominant, etc., and the consequences of these in terms of limitations in the theory and in the practical use of the models.

  11. Comparative Study of Bankruptcy Prediction Models

    Directory of Open Access Journals (Sweden)

    Isye Arieshanti

    2013-09-01

    Full Text Available Early indication of bankruptcy is important for a company. If a company is aware of its potential bankruptcy, it can take preventive action. In order to detect the potential for bankruptcy, a company can utilize a bankruptcy prediction model. Such a prediction model can be built using machine learning methods. However, the choice of machine learning method should be made carefully, because the suitability of a model depends on the specific problem. Therefore, in this paper we perform a comparative study of several machine learning methods for bankruptcy prediction. Comparing models based on k-NN, fuzzy k-NN, SVM, Bagging Nearest Neighbour SVM, Multilayer Perceptron (MLP), and a hybrid of MLP + Multiple Linear Regression, the study shows that the fuzzy k-NN method achieves the best performance, with an accuracy of 77.5%.

  12. Prediction Models for Dynamic Demand Response

    Energy Technology Data Exchange (ETDEWEB)

    Aman, Saima; Frincu, Marc; Chelmis, Charalampos; Noor, Muhammad; Simmhan, Yogesh; Prasanna, Viktor K.

    2015-11-02

    As Smart Grids move closer to dynamic curtailment programs, Demand Response (DR) events will become necessary not only on fixed time intervals and weekdays predetermined by static policies, but also during changing decision periods and weekends to react to real-time demand signals. Unique challenges arise in this context vis-a-vis demand prediction and curtailment estimation and the transformation of such tasks into an automated, efficient dynamic demand response (D2R) process. While existing work has concentrated on increasing the accuracy of prediction models for DR, there is a lack of studies for prediction models for D2R, which we address in this paper. Our first contribution is the formal definition of D2R, and the description of its challenges and requirements. Our second contribution is a feasibility analysis of very-short-term prediction of electricity consumption for D2R over a diverse, large-scale dataset that includes both small residential customers and large buildings. Our third, and major, contribution is a set of insights into the predictability of electricity consumption in the context of D2R. Specifically, we focus on prediction models that can operate at a very small data granularity (here 15-min intervals), for both weekdays and weekends - all conditions that characterize scenarios for D2R. We find that short-term time series and simple averaging models used by Independent Service Operators and utilities achieve superior prediction accuracy. We also observe that workdays are more predictable than weekends and holidays. Also, smaller customers have large variation in consumption and are less predictable than larger buildings. Key implications of our findings are that better models are required for small customers and for non-workdays, both of which are critical for D2R. Also, prediction models require just a few days' worth of data, indicating that small amounts of
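
    The "simple averaging" baseline found to perform well can be sketched as predicting each 15-minute slot as the mean of the same slot over the preceding days; the synthetic load series below is an assumption used only for illustration, not the study's dataset.

      import numpy as np

      def same_slot_average(history, n_days=7, slots_per_day=96):
          """Predict the next day's consumption profile (96 x 15-min slots) as the
          mean of the same slot over the last n_days of history."""
          history = np.asarray(history, dtype=float)
          recent = history[-n_days * slots_per_day:].reshape(n_days, slots_per_day)
          return recent.mean(axis=0)

      # toy example: 14 days of synthetic 15-min consumption with a daily cycle
      rng = np.random.default_rng(2)
      t = np.arange(14 * 96)
      load = 50 + 10 * np.sin(2 * np.pi * (t % 96) / 96) + rng.normal(0, 2, t.size)
      forecast = same_slot_average(load)
      print(forecast[:4])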

  13. Are animal models predictive for humans?

    Directory of Open Access Journals (Sweden)

    Greek Ray

    2009-01-01

    Full Text Available It is one of the central aims of the philosophy of science to elucidate the meanings of scientific terms and also to think critically about their application. The focus of this essay is the scientific term predict and whether there is credible evidence that animal models, especially in toxicology and pathophysiology, can be used to predict human outcomes. Whether animals can be used to predict human response to drugs and other chemicals is apparently a contentious issue. However, when one empirically analyzes animal models using scientific tools they fall far short of being able to predict human responses. This is not surprising considering what we have learned from fields such as evolutionary and developmental biology, gene regulation and expression, epigenetics, complexity theory, and comparative genomics.

  14. Evaluation of CASP8 model quality predictions

    KAUST Repository

    Cozzetto, Domenico

    2009-01-01

    The model quality assessment problem consists in the a priori estimation of the overall and per-residue accuracy of protein structure predictions. Over the past years, a number of methods have been developed to address this issue and CASP established a prediction category to evaluate their performance in 2006. In 2008 the experiment was repeated and its results are reported here. Participants were invited to infer the correctness of the protein models submitted by the registered automatic servers. Estimates could apply to both whole models and individual amino acids. Groups involved in the tertiary structure prediction categories were also asked to assign local error estimates to each predicted residue in their own models and their results are also discussed here. The correlation between the predicted and observed correctness measures was the basis of the assessment of the results. We observe that consensus-based methods still perform significantly better than those accepting single models, similarly to what was concluded in the previous edition of the experiment. © 2009 WILEY-LISS, INC.

  15. Model predictive controller design of hydrocracker reactors

    OpenAIRE

    GÖKÇE, Dila

    2014-01-01

    This study summarizes the design of a Model Predictive Controller (MPC) for the Tüpraş İzmit Refinery Hydrocracker Unit reactors. The hydrocracking process, in which heavy vacuum gasoil is converted into lighter, more valuable products at high temperature and pressure, is described briefly. The controller design description, identification and modeling studies are examined and the model variables are presented. WABT (Weighted Average Bed Temperature) equalization and conversion increase are simulate...

  16. Multi-Model Ensemble Wake Vortex Prediction

    Science.gov (United States)

    Koerner, Stephan; Holzaepfel, Frank; Ahmad, Nash'at N.

    2015-01-01

    Several multi-model ensemble methods are investigated for predicting wake vortex transport and decay. This study is a joint effort between National Aeronautics and Space Administration and Deutsches Zentrum fuer Luft- und Raumfahrt to develop a multi-model ensemble capability using their wake models. An overview of different multi-model ensemble methods and their feasibility for wake applications is presented. The methods include Reliability Ensemble Averaging, Bayesian Model Averaging, and Monte Carlo Simulations. The methodologies are evaluated using data from wake vortex field experiments.

  17. The quest for the perfect model: Pre-World War 1 military land use modeling of the Greater Copenhagen area

    DEFF Research Database (Denmark)

    Svenningsen, Stig Roar; Brandt, Jesper; Christensen, Andreas Aagaard

    Anthropogenic land use practices are the single most important factor in the changing European landscapes. Accordingly, much attention has been devoted within Landscape Ecology to analysing changing patterns of land use and developing research strategies to understand the processes behind these changes and to inform policy makers. Models are used as an important tool in this research, partly due to the revolution in information technologies during the last 30 years, which has made modelling more widespread in the research community. However, modelling human decision making in the form of land use practices … the rotational system. At first the survey campaign seemed to be going very well, but relatively quickly the military ran into problems. The rapid urbanization of the landscape north of Copenhagen meant that farming did not take place, and on the island of Amager southwest of Copenhagen the farmers did not use …

  18. Thermodynamic modeling of activity coefficient and prediction of solubility: Part 1. Predictive models.

    Science.gov (United States)

    Mirmehrabi, Mahmoud; Rohani, Sohrab; Perry, Luisa

    2006-04-01

    A new activity coefficient model was developed from the excess Gibbs free energy in the form G^ex = c·A^a·x_1^b ... x_n^b. The constants of the proposed model were considered to be functions of the solute and solvent dielectric constants, Hildebrand solubility parameters and specific volumes of the solute and solvent molecules. The proposed model obeys the Gibbs-Duhem condition for activity coefficient models. To generalize the model and make it purely predictive, without any adjustable parameters, its constants were found using the experimental activity coefficients and physical properties of 20 vapor-liquid systems. The predictive capability of the proposed model was tested by calculating the activity coefficients of 41 binary vapor-liquid equilibrium systems and showed good agreement with the experimental data in comparison with two other predictive models, the UNIFAC and Hildebrand models. The only data used for the prediction of activity coefficients were dielectric constants, Hildebrand solubility parameters, and specific volumes of the solute and solvent molecules. Furthermore, the proposed model was used to predict the activity coefficient of an organic compound, stearic acid, whose physical properties were available, in methanol and 2-butanone. The predicted activity coefficient, along with the thermal properties of stearic acid, was used to calculate the solubility of stearic acid in these two solvents and resulted in better agreement with the experimental data than the UNIFAC and Hildebrand predictive models.
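
    For reference, the generic thermodynamic route from an excess Gibbs energy expression to activity coefficients, and the Gibbs-Duhem condition the abstract invokes, can be written as follows. This is the standard textbook relation (with G^ex taken as the molar excess Gibbs energy), not a derivation specific to the constants c, a and b above.

      % reported functional form of the excess Gibbs energy
      G^{ex} = c\,A^{a}\,x_1^{b}\cdots x_n^{b}
      % activity coefficients from the partial-molar derivative of the total excess Gibbs energy
      \ln\gamma_i = \left[\frac{\partial\big(n_T\,G^{ex}/RT\big)}{\partial n_i}\right]_{T,P,\,n_{j\neq i}},
      \qquad x_i = n_i/n_T
      % Gibbs-Duhem consistency condition at constant T and P
      \sum_i x_i\,\mathrm{d}\ln\gamma_i = 0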

  19. PREDICTIVE CAPACITY OF ARCH FAMILY MODELS

    Directory of Open Access Journals (Sweden)

    Raphael Silveira Amaro

    2016-03-01

    Full Text Available In the last decades, a remarkable number of models, variants of the Autoregressive Conditional Heteroscedastic family, have been developed and empirically tested, making the process of choosing a particular model extremely complex. This research aims to compare the predictive capacity of five conditional heteroskedasticity models, considering eight different statistical probability distributions, using the Model Confidence Set procedure. The financial series used refer to the log-return series of the Bovespa index and the Dow Jones Industrial Index in the period between 27 October 2008 and 30 December 2014. The empirical evidence shows that, in general, the competing models are largely homogeneous in their predictive ability, whether for the stock market of a developed country or for that of a developing country. An equivalent result can be inferred for the statistical probability distributions that were used.
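
    A minimal member of the ARCH family, a GARCH(1,1) recursion, illustrates how such models produce one-step-ahead variance forecasts; the parameter values and the simulated return series below are assumptions, not estimates from the Bovespa or Dow Jones data.

      import numpy as np

      def garch11_forecast(returns, omega=1e-6, alpha=0.08, beta=0.9):
          """One-step-ahead conditional variance from a GARCH(1,1) recursion:
          sigma2[t] = omega + alpha * r[t-1]**2 + beta * sigma2[t-1]."""
          r = np.asarray(returns, dtype=float)
          sigma2 = np.empty(r.size + 1)
          sigma2[0] = r.var()                      # initialize at the sample variance
          for t in range(1, r.size + 1):
              sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
          return sigma2[-1]                        # variance forecast for the next period

      rng = np.random.default_rng(3)
      log_returns = rng.normal(0, 0.01, 1000)      # stand-in for an index log-return series
      print(garch11_forecast(log_returns))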

  20. Modeled connectivity of Acropora millepora populations from reefs of the Spratly Islands and the greater South China Sea

    Science.gov (United States)

    Dorman, Jeffrey G.; Castruccio, Frederic S.; Curchitser, Enrique N.; Kleypas, Joan A.; Powell, Thomas M.

    2016-03-01

    The Spratly Island archipelago is a remote network of coral reefs and islands in the South China Sea that is a likely source of coral larvae to the greater region, but about which little is known. Using a particle-tracking model driven by oceanographic data from the Coral Triangle region, we simulated both spring and fall spawning events of Acropora millepora, a common coral species, over a 46-yr period (1960-2005). Simulated population biology of A. millepora included the acquisition and loss of competency, settlement over appropriate benthic habitat, and mortality based on experimental data. The simulations aimed to provide insights into the connectivity of reefs within the Spratly Islands, the settlement of larvae on reefs of the greater South China Sea, and the potential dispersal range of reef organisms from the Spratly Islands. Results suggest that (1) the Spratly Islands may be a significant source of A. millepora larvae for the Palawan reefs (Philippines) and some of the most isolated reefs of the South China Sea; and (2) the relatively isolated western Spratly Islands have limited source reefs supplying them with larvae and fewer of their larvae successfully settling on other reefs. Examination of particle dispersal without biology (settlement and mortality) suggests that larval connectivity is possible throughout the South China Sea and into the Coral Triangle region. Strong differences in the spring versus fall larval connectivity and dispersal highlight the need for a greater understanding of spawning dynamics of the region. This study confirms that the Spratly Islands are likely an important source of larvae for the South China Sea and Coral Triangle region.

  1. A revised prediction model for natural conception.

    Science.gov (United States)

    Bensdorp, Alexandra J; van der Steeg, Jan Willem; Steures, Pieternel; Habbema, J Dik F; Hompes, Peter G A; Bossuyt, Patrick M M; van der Veen, Fulco; Mol, Ben W J; Eijkemans, Marinus J C

    2017-06-01

    One of the aims in reproductive medicine is to differentiate between couples that have favourable chances of conceiving naturally and those that do not. Since the development of the prediction model of Hunault, characteristics of the subfertile population have changed. The objective of this analysis was to assess whether additional predictors can refine the Hunault model and extend its applicability. Consecutive subfertile couples with unexplained and mild male subfertility presenting in fertility clinics were asked to participate in a prospective cohort study. We constructed a multivariable prediction model with the predictors from the Hunault model and new potential predictors. The primary outcome, natural conception leading to an ongoing pregnancy, was observed in 1053 women of the 5184 included couples (20%). All predictors of the Hunault model were selected into the revised model plus an additional seven (woman's body mass index, cycle length, basal FSH levels, tubal status, history of previous pregnancies in the current relationship (ongoing pregnancies after natural conception, fertility treatment or miscarriages), semen volume, and semen morphology). Predictions from the revised model seem to concur better with observed pregnancy rates compared with the Hunault model; c-statistic of 0.71 (95% CI 0.69 to 0.73) compared with 0.59 (95% CI 0.57 to 0.61). Copyright © 2017. Published by Elsevier Ltd.

  2. Wyoming greater sage-grouse habitat prioritization: A collection of multi-scale seasonal models and geographic information systems land management tools

    Science.gov (United States)

    O'Donnell, Michael S.; Aldridge, Cameron L.; Doherty, Kevin E.; Fedy, Bradley C.

    2015-01-01

    With rapidly changing landscape conditions within Wyoming and the potential effects of landscape changes on sage-grouse habitat, land managers and conservation planners, among others, need procedures to assess the location and juxtaposition of important habitats, land-cover, and land-use patterns to balance wildlife requirements with multiple human land uses. Biologists frequently develop habitat-selection studies to identify prioritization efforts for species of conservation concern to increase understanding and help guide habitat-conservation efforts. Recently, the authors undertook a large-scale collaborative effort that developed habitat-selection models for Greater Sage-grouse (Centrocercus urophasianus) across large landscapes in Wyoming, USA and for multiple life-stages (nesting, late brood-rearing, and winter). We developed these habitat models using resource selection functions, based upon sage-grouse telemetry data collected for localized studies and within each life-stage. The models allowed us to characterize and spatially predict seasonal sage-grouse habitat use in Wyoming. Due to the quantity of models, the diversity of model predictors (in the form of geographic information system data) produced by analyses, and the variety of potential applications for these data, we present here a resource that complements our published modeling effort, which will further support land managers.

  3. Predicting turns in proteins with a unified model.

    Directory of Open Access Journals (Sweden)

    Qi Song

    Full Text Available MOTIVATION: Turns are a critical element of the structure of a protein; turns play a crucial role in loops, folds, and interactions. Current prediction methods are well developed for the prediction of individual turn types, including α-turn, β-turn, and γ-turn, etc. However, for further protein structure and function prediction it is necessary to develop a uniform model that can accurately predict all types of turns simultaneously. RESULTS: In this study, we present a novel approach, TurnP, which offers the ability to investigate all the turns in a protein based on a unified model. The main characteristics of TurnP are: (i) using newly exploited features of structural evolution information (secondary structure and shape string of the protein) based on structure homologies, (ii) considering all types of turns in a unified model, and (iii) practical capability of accurate prediction of all turns simultaneously for a query. TurnP utilizes predicted secondary structures and predicted shape strings, both of which have greater accuracy, based on innovative technologies which were both developed by our group. Then, sequence and structural evolution features, namely the profile of sequence, the profile of secondary structures and the profile of shape strings, are generated by sequence and structure alignment. When TurnP was validated on a non-redundant dataset (4,107 entries) by five-fold cross-validation, we achieved an accuracy of 88.8% and a sensitivity of 71.8%, which exceeded state-of-the-art predictors of individual turn types. Newly determined sequences, the EVA and CASP9 datasets were used as independent tests and the results we achieved were outstanding for turn predictions and confirmed the good performance of TurnP for practical applications.

  4. Models to predict the start of the airborne pollen season

    Science.gov (United States)

    Siniscalco, Consolata; Caramiello, Rosanna; Migliavacca, Mirco; Busetto, Lorenzo; Mercalli, Luca; Colombo, Roberto; Richardson, Andrew D.

    2015-07-01

    Aerobiological data can be used as indirect but reliable measures of flowering phenology to analyze the response of plant species to ongoing climate changes. The aims of this study are to evaluate the performance of several phenological models for predicting the pollen start of season (PSS) in seven spring-flowering trees ( Alnus glutinosa, Acer negundo, Carpinus betulus, Platanus occidentalis, Juglans nigra, Alnus viridis, and Castanea sativa) and in two summer-flowering herbaceous species ( Artemisia vulgaris and Ambrosia artemisiifolia) by using a 26-year aerobiological data set collected in Turin (Northern Italy). Data showed a reduced interannual variability of the PSS in the summer-flowering species compared to the spring-flowering ones. Spring warming models with photoperiod limitation performed best for the greater majority of the studied species, while chilling class models were selected only for the early spring flowering species. For Ambrosia and Artemisia, spring warming models were also selected as the best models, indicating that temperature sums are positively related to flowering. However, the poor variance explained by the models suggests that further analyses have to be carried out in order to develop better models for predicting the PSS in these two species. Modeling the pollen season start on a very wide data set provided a new opportunity to highlight the limits of models in elucidating the environmental factors driving the pollen season start when some factors are always fulfilled, as chilling or photoperiod or when the variance is very poor and is not explained by the models.
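
    A spring warming (thermal-time) model of the kind selected above can be sketched as accumulating degree-days above a base temperature from a fixed start date and declaring the pollen season start when a critical forcing sum is reached; the base temperature, start day and threshold below are placeholders, not the fitted values from the Turin data set.

      import numpy as np

      def predict_pss(daily_mean_temp, t_base=5.0, start_doy=1, forcing_req=120.0):
          """Return the day of year on which accumulated forcing
          sum(max(T - t_base, 0)) from start_doy first exceeds forcing_req."""
          forcing = 0.0
          for doy, temp in enumerate(daily_mean_temp, start=1):
              if doy < start_doy:
                  continue
              forcing += max(temp - t_base, 0.0)
              if forcing >= forcing_req:
                  return doy
          return None   # threshold never reached within the series

      # synthetic spring: mean temperature rising linearly through the year
      doy = np.arange(1, 181)
      temps = -2 + 0.15 * doy
      print(predict_pss(temps))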

  5. Modelling language evolution: Examples and predictions

    Science.gov (United States)

    Gong, Tao; Shuai, Lan; Zhang, Menghan

    2014-06-01

    We survey recent computer modelling research of language evolution, focusing on a rule-based model simulating the lexicon-syntax coevolution and an equation-based model quantifying the language competition dynamics. We discuss four predictions of these models: (a) correlation between domain-general abilities (e.g. sequential learning) and language-specific mechanisms (e.g. word order processing); (b) coevolution of language and relevant competences (e.g. joint attention); (c) effects of cultural transmission and social structure on linguistic understandability; and (d) commonalities between linguistic, biological, and physical phenomena. All these contribute significantly to our understanding of the evolutions of language structures, individual learning mechanisms, and relevant biological and socio-cultural factors. We conclude the survey by highlighting three future directions of modelling studies of language evolution: (a) adopting experimental approaches for model evaluation; (b) consolidating empirical foundations of models; and (c) multi-disciplinary collaboration among modelling, linguistics, and other relevant disciplines.

  6. Model Predictive Control of Sewer Networks

    DEFF Research Database (Denmark)

    Pedersen, Einar B.; Herbertsson, Hannes R.; Niemann, Henrik

    2016-01-01

    The developments in solutions for management of urban drainage are of vital importance, as the amount of sewer water from urban areas continues to increase due to the growth of the world's population and the change in climate conditions. How a sewer network is structured, monitored and controlled has thus become an essential factor for efficient performance of waste water treatment plants. This paper examines methods for simplified modelling and control of a sewer network. A practical approach to the problem is used by analysing a simplified design model, which is based on the Barcelona benchmark model. Due to the inherent constraints, the applied approach is based on Model Predictive Control.

  7. Bayesian Predictive Models for Rayleigh Wind Speed

    DEFF Research Database (Denmark)

    Shahirinia, Amir; Hajizadeh, Amin; Yu, David C

    2017-01-01

    One of the major challenges with the increase in wind power generation is the uncertain nature of wind speed. So far the uncertainty about wind speed has been presented through probability distributions. Also, the existing models that consider the uncertainty of the wind speed primarily view … The predictive model of the wind speed aggregates the non-homogeneous distributions into a single continuous distribution. Therefore, the result is able to capture the variation among the probability distributions of the wind speeds at the turbines' locations in a wind farm. More specifically, instead of using a wind speed distribution whose parameters are known or estimated, the parameters are considered as random, with variations governed by probability distributions. A Bayesian predictive model for a Rayleigh distribution, which has only a single scale parameter, has been proposed. Also, closed-form posterior…
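
    One closed-form construction consistent with this description places an inverse-gamma prior on the squared Rayleigh scale, giving an inverse-gamma posterior and an easily sampled predictive distribution; the prior hyperparameters and simulated data below are assumptions for illustration, not the paper's exact formulation.

      import numpy as np

      def rayleigh_posterior_predictive(wind_speeds, a0=2.0, b0=10.0, n_draws=5000, rng=None):
          """Rayleigh likelihood with theta = scale**2 and an inverse-gamma(a0, b0) prior gives
          an inverse-gamma(a0 + n, b0 + sum(x**2)/2) posterior; the predictive distribution is
          sampled by drawing theta, then a Rayleigh variate."""
          rng = np.random.default_rng() if rng is None else rng
          x = np.asarray(wind_speeds, dtype=float)
          a_post = a0 + x.size
          b_post = b0 + 0.5 * np.sum(x ** 2)
          theta = 1.0 / rng.gamma(shape=a_post, scale=1.0 / b_post, size=n_draws)  # inverse-gamma draws
          return rng.rayleigh(scale=np.sqrt(theta))   # predictive wind-speed samples

      rng = np.random.default_rng(4)
      observed = rng.rayleigh(scale=7.0, size=200)    # stand-in for turbine wind-speed data
      samples = rayleigh_posterior_predictive(observed, rng=rng)
      print(samples.mean(), np.percentile(samples, [5, 95]))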

  8. Comparison of two ordinal prediction models

    DEFF Research Database (Denmark)

    Kattan, Michael W; Gerds, Thomas A

    2015-01-01

    …system (i.e. old or new), such as the level of evidence for one or more factors included in the system or the general opinions of expert clinicians. However, given the major objective of estimating prognosis on an ordinal scale, we argue that the rival staging system candidates should be compared on their ability to predict outcome. We sought to outline an algorithm that would compare two rival ordinal systems on their predictive ability. RESULTS: We devised an algorithm based largely on the concordance index, which is appropriate for comparing two models in their ability to rank observations. We demonstrate our algorithm with a prostate cancer staging system example. CONCLUSION: We have provided an algorithm for selecting the preferred staging system based on prognostic accuracy. It appears to be useful for the purpose of selecting between two ordinal prediction models.

  9. Mathematical modelling methodologies in predictive food microbiology: a SWOT analysis.

    Science.gov (United States)

    Ferrer, Jordi; Prats, Clara; López, Daniel; Vives-Rego, Josep

    2009-08-31

    Predictive microbiology is the area of food microbiology that attempts to forecast the quantitative evolution of microbial populations over time. This is achieved to a great extent through models that include the mechanisms governing population dynamics. Traditionally, the models used in predictive microbiology are whole-system continuous models that describe population dynamics by means of equations applied to extensive or averaged variables of the whole system. Many existing models can be classified by specific criteria. We can distinguish between survival and growth models by seeing whether they tackle mortality or cell duplication. We can distinguish between empirical (phenomenological) models, which mathematically describe specific behaviour, and theoretical (mechanistic) models with a biological basis, which search for the underlying mechanisms driving already observed phenomena. We can also distinguish between primary, secondary and tertiary models, by examining their treatment of the effects of external factors and constraints on the microbial community. Recently, the use of spatially explicit Individual-based Models (IbMs) has spread through predictive microbiology, due to the current technological capacity of performing measurements on single individual cells and thanks to the consolidation of computational modelling. Spatially explicit IbMs are bottom-up approaches to microbial communities that build bridges between the description of micro-organisms at the cell level and macroscopic observations at the population level. They provide greater insight into the mesoscale phenomena that link unicellular and population levels. Every model is built in response to a particular question and with different aims. Even so, in this research we conducted a SWOT (Strength, Weaknesses, Opportunities and Threats) analysis of the different approaches (population continuous modelling and Individual-based Modelling), which we hope will be helpful for current and future

  10. Predictive analytics can support the ACO model.

    Science.gov (United States)

    Bradley, Paul

    2012-04-01

    Predictive analytics can be used to rapidly spot hard-to-identify opportunities to better manage care--a key tool in accountable care. When considering analytics models, healthcare providers should: Make value-based care a priority and act on information from analytics models. Create a road map that includes achievable steps, rather than major endeavors. Set long-term expectations and recognize that the effectiveness of an analytics program takes time, unlike revenue cycle initiatives that may show a quick return.

  11. Predictive modeling in homogeneous catalysis: a tutorial

    NARCIS (Netherlands)

    Maldonado, A.G.; Rothenberg, G.

    2010-01-01

    Predictive modeling has become a practical research tool in homogeneous catalysis. It can help to pinpoint ‘good regions’ in the catalyst space, narrowing the search for the optimal catalyst for a given reaction. Just like any other new idea, in silico catalyst optimization is accepted by some

  12. Model predictive control of smart microgrids

    DEFF Research Database (Denmark)

    Hu, Jiefeng; Zhu, Jianguo; Guerrero, Josep M.

    2014-01-01

    …required to realise high performance of distributed generations, and will realise innovative control techniques utilising model predictive control (MPC) to assist in coordinating the plethora of generation and load combinations, thus enabling the effective exploitation of the clean renewable energy sources...

  13. Feedback model predictive control by randomized algorithms

    NARCIS (Netherlands)

    Batina, Ivo; Stoorvogel, Antonie Arij; Weiland, Siep

    2001-01-01

    In this paper we present a further development of an algorithm for stochastic disturbance rejection in model predictive control with input constraints based on randomized algorithms. The algorithm presented in our work can solve the problem of stochastic disturbance rejection approximately but with

  14. A Robustly Stabilizing Model Predictive Control Algorithm

    Science.gov (United States)

    Ackmece, A. Behcet; Carson, John M., III

    2007-01-01

    A model predictive control (MPC) algorithm that differs from prior MPC algorithms has been developed for controlling an uncertain nonlinear system. This algorithm guarantees the resolvability of an associated finite-horizon optimal-control problem in a receding-horizon implementation.

  15. Hierarchical Model Predictive Control for Resource Distribution

    DEFF Research Database (Denmark)

    Bendtsen, Jan Dimon; Trangbæk, K; Stoustrup, Jakob

    2010-01-01

    This paper deals with hierarchical model predictive control (MPC) of distributed systems. A three-level hierarchical approach is proposed, consisting of a high level MPC controller, a second level of so-called aggregators, controlled by an online MPC-like algorithm, and a lower level of autonomous...

  16. Model Predictive Control based on Finite Impulse Response Models

    DEFF Research Database (Denmark)

    Prasath, Guru; Jørgensen, John Bagterp

    2008-01-01

    We develop a regularized l2 finite impulse response (FIR) predictive controller with input and input-rate constraints. Feedback is based on a simple constant output disturbance filter. The performance of the predictive controller in the face of plant-model mismatch is investigated by simulations ...

  17. Splitting and non-splitting air pollution models of photochemical reactions in the urban areas of the greater Tehran area

    International Nuclear Information System (INIS)

    Heidarinasab, A.; Dabir, B.; Sahimi, M.; Badii, Kh.

    2003-01-01

    During the past years, one of the most important problems has been air pollution in urban areas. In this regard, ozone, as one of the major products of photochemical reactions, is of great importance. The term 'photochemical' is applied to a number of secondary pollutants that appear as a result of sun-related reactions, ozone being the most important one. So far, various models have been suggested to predict these pollutants. In this paper, we developed the model that was introduced by Dabir, et al. [4]. In this model more than 48 chemical species and 114 chemical reactions are involved. The results of this development showed good to excellent agreement across the region for compounds such as O3, NO, NO2, CO, and SO2 with regard to VOC and NMHC. The results of the simulation were compared with previous work [4] and the effects of increasing the number of components and reactions were evaluated. The results of the operator splitting method were compared with those of the non-splitting solution method. The results showed that the splitting method with one-tenth of the time step coincided with the non-splitting method (Crank-Nicolson with under-relaxation iteration, without splitting of the equation terms). The one-dimensional model was then extended to 3-D and the results were compared with experimental data.

  18. Disease prediction models and operational readiness.

    Directory of Open Access Journals (Sweden)

    Courtney D Corley

    Full Text Available The objective of this manuscript is to present a systematic review of biosurveillance models that operate on select agents and can forecast the occurrence of a disease event. We define a disease event to be a biological event with focus on the One Health paradigm. These events are characterized by evidence of infection and/or disease condition. We reviewed models that attempted to predict a disease event, not merely its transmission dynamics, and we considered models involving pathogens of concern as determined by the US National Select Agent Registry (as of June 2011). We searched commercial and government databases and harvested Google search results for eligible models, using terms and phrases provided by public health analysts relating to biosurveillance, remote sensing, risk assessments, spatial epidemiology, and ecological niche modeling. After removal of duplications and extraneous material, a core collection of 6,524 items was established, and these publications along with their abstracts are presented in a semantic wiki at http://BioCat.pnnl.gov. As a result, we systematically reviewed 44 papers, and the results are presented in this analysis. We identified 44 models, classified as one or more of the following: event prediction (4), spatial (26), ecological niche (28), diagnostic or clinical (6), spread or response (9), and reviews (3). The model parameters (e.g., etiology, climatic, spatial, cultural) and data sources (e.g., remote sensing, non-governmental organizations, expert opinion, epidemiological) were recorded and reviewed. A component of this review is the identification of verification and validation (V&V) methods applied to each model, if any V&V method was reported. All models were classified as either having undergone Some Verification or Validation method, or No Verification or Validation. We close by outlining an initial set of operational readiness level guidelines for disease prediction models based upon established Technology

  19. Caries risk assessment models in caries prediction

    Directory of Open Access Journals (Sweden)

    Amila Zukanović

    2013-11-01

    Full Text Available Objective. The aim of this research was to assess the efficiency of different multifactor models in caries prediction. Material and methods. Data from the questionnaire and objective examination of 109 examinees was entered into the Cariogram, Previser and Caries-Risk Assessment Tool (CAT) multifactor risk assessment models. Caries risk was assessed with the help of all three models for each patient, classifying them as low, medium or high-risk patients. The development of new caries lesions over a period of three years [Decay Missing Filled Tooth (DMFT) increment = difference between Decay Missing Filled Tooth Surface (DMFTS) index at baseline and follow up] provided for examination of the predictive capacity concerning different multifactor models. Results. The data gathered showed that different multifactor risk assessment models give significantly different results (Friedman test: Chi square = 100.073, p=0.000). Cariogram is the model which identified the majority of examinees as medium risk patients (70%). The other two models were more radical in risk assessment, giving more unfavorable risk profiles for patients. In only 12% of the patients did the three multifactor models assess the risk in the same way. Previser and CAT gave the same results in 63% of cases – the Wilcoxon test showed that there is no statistically significant difference in caries risk assessment between these two models (Z = -1.805, p=0.071). Conclusions. Evaluation of three different multifactor caries risk assessment models (Cariogram, PreViser and CAT) showed that only the Cariogram can successfully predict new caries development in 12-year-old Bosnian children.

  20. Fournier's gangrene: a model for early prediction.

    Science.gov (United States)

    Palvolgyi, Roland; Kaji, Amy H; Valeriano, Javier; Plurad, David; Rajfer, Jacob; de Virgilio, Christian

    2014-10-01

    Early diagnosis remains the cornerstone of management of Fournier's gangrene. As a result of variable progression of disease, identifying early predictors of necrosis becomes a diagnostic challenge. We present a scoring system based on objective admission criteria, which can help distinguish Fournier's gangrene from nonnecrotizing scrotal infections. Ninety-six patients were identified, 38 diagnosed with Fournier's gangrene and 58 diagnosed with scrotal cellulitis or abscess. Statistical analyses comparing admission vital signs, laboratory values, and imaging studies were performed and Classification and Regression Tree analysis was used to construct a scoring system. Admission heart rate greater than 110 beats/minute, serum sodium less than 135 mmol/L, blood urea nitrogen greater than 15 mg/dL, and white blood cell count greater than 15 × 10(3)/μL were significant predictors of Fournier's gangrene. Using a threshold score of two or greater, our model differentiates patients with Fournier's gangrene from those with nonnecrotizing infections with a sensitivity of 84.2 per cent. Only 34.2 per cent of patients with Fournier's gangrene had hard signs of necrotizing infection on admission, which were not observed in patients with nonnecrotizing infections. Objective admission criteria assist in distinguishing Fournier's gangrene from scrotal cellulitis or abscess. In situations in which results of the physical examination are ambiguous, this scoring system can heighten the index of suspicion for Fournier's gangrene and prompt rapid surgical intervention.
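    A minimal sketch of the admission scoring rule described above, assuming one point per criterion with equal weighting; the function name, parameter names, and example values are illustrative, not taken from the study:

```python
# One point per admission criterion, threshold >= 2 suggesting Fournier's
# gangrene. Units follow the thresholds quoted above.

def fournier_score(heart_rate, sodium, bun, wbc):
    """Return (score, flag) from admission vitals and labs."""
    score = 0
    score += heart_rate > 110      # beats/minute
    score += sodium < 135          # mmol/L
    score += bun > 15              # mg/dL
    score += wbc > 15              # x10^3 per microliter
    return score, score >= 2

score, suspicious = fournier_score(heart_rate=118, sodium=131, bun=22, wbc=17.4)
print(score, suspicious)           # 4 True for this hypothetical patient
```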

  1. Link Prediction via Sparse Gaussian Graphical Model

    Directory of Open Access Journals (Sweden)

    Liangliang Zhang

    2016-01-01

    Full Text Available Link prediction is an important task in complex network analysis. Traditional link prediction methods are limited by network topology and the lack of node property information, which makes predicting links challenging. In this study, we address link prediction using a sparse Gaussian graphical model and demonstrate its theoretical and practical effectiveness. In theory, link prediction is executed by estimating the inverse covariance matrix of samples to overcome information limits. The proposed method was evaluated with four small and four large real-world datasets. The experimental results show that the area under the curve (AUC) value obtained by the proposed method improved by an average of 3% on the small datasets and 12.5% on the large datasets compared with 13 mainstream similarity methods. This method outperforms the baseline method, and the prediction accuracy is superior to mainstream methods when using only 80% of the training set. The method also provides significantly higher AUC values when using only 60% of the training set on the Dolphin and Taro datasets. Furthermore, the error rate of the proposed method demonstrates superior performance on all datasets compared with mainstream methods.
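    A hedged sketch of the core idea: estimate a sparse inverse covariance (precision) matrix and rank candidate links by its off-diagonal entries. The estimator (scikit-learn's graphical lasso), the synthetic node features, and the scoring rule are assumptions for illustration rather than the paper's exact method:

```python
# Estimate a sparse precision matrix from node-attribute samples and score
# candidate links by |precision_ij|; larger magnitude => more likely link.
import numpy as np
from sklearn.covariance import GraphicalLassoCV

rng = np.random.default_rng(1)
n_samples, n_nodes = 200, 15
X = rng.standard_normal((n_samples, n_nodes))   # synthetic samples of node attributes

model = GraphicalLassoCV().fit(X)
precision = model.precision_                    # sparse estimate of the inverse covariance

scores = {(i, j): abs(precision[i, j])
          for i in range(n_nodes) for j in range(i + 1, n_nodes)}
top_links = sorted(scores, key=scores.get, reverse=True)[:10]
print(top_links)
```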

  2. Greater ethanol-induced locomotor activation in DBA/2J versus C57BL/6J mice is not predicted by presynaptic striatal dopamine dynamics.

    Directory of Open Access Journals (Sweden)

    Jamie H Rose

    Full Text Available A large body of research has aimed to determine the neurochemical factors driving differential sensitivity to ethanol between individuals in an attempt to find predictors of ethanol abuse vulnerability. Here we find that the locomotor activating effects of ethanol are markedly greater in DBA/2J compared to C57BL/6J mice, although it is unclear as to what neurochemical differences between strains mediate this behavior. Dopamine elevations in the nucleus accumbens and caudate-putamen regulate locomotor behavior for most drugs, including ethanol; thus, we aimed to determine if differences in these regions predict strain differences in ethanol-induced locomotor activity. Previous studies suggest that ethanol interacts with the dopamine transporter, potentially mediating its locomotor activating effects; however, we found that ethanol had no effects on dopamine uptake in either strain. Ex vivo voltammetry allows for the determination of ethanol effects on presynaptic dopamine terminals, independent of drug-induced changes in firing rates of afferent inputs from either dopamine neurons or other neurotransmitter systems. However, differences in striatal dopamine dynamics did not predict the locomotor-activating effects of ethanol, since the inhibitory effects of ethanol on dopamine release were similar between strains. There were differences in presynaptic dopamine function between strains, with faster dopamine clearance in the caudate-putamen of DBA/2J mice; however, it is unclear how this difference relates to locomotor behavior. Because of the role of the dopamine system in reinforcement and reward learning, differences in dopamine signaling between the strains could have implications for addiction-related behaviors that extend beyond ethanol effects in the striatum.

  3. A Greater Extent of Insomnia Symptoms and Physician-Recommended Sleep Medication Use Predict Fall Risk in Community-Dwelling Older Adults.

    Science.gov (United States)

    Chen, Tuo-Yu; Lee, Soomi; Buxton, Orfeu M

    2017-11-01

    Cross-sectional studies suggest that insomnia symptoms are associated with falls in later life. This longitudinal study examines the independent and interactive effects of the extent of insomnia symptoms (i.e., multiple co-existing insomnia symptoms) and sleep medications on fall risk over a 2-year follow-up among community-dwelling older adults. Using data from the Health and Retirement Study (2006-2014, N = 6882, Mage = 74.5 years ± 6.6 years), we calculated the extent of insomnia symptoms (range = 0-4) participants reported (i.e., trouble falling asleep, waking up during the night, waking up too early, and not feeling rested). At each wave, participants reported recent sleep medication use and falls since the last wave, and were evaluated for balance and walking speed. A greater burden of insomnia symptoms and using physician-recommended sleep medications at baseline independently predicted falling after adjusting for known risk factors of falling. The effects of insomnia symptoms on fall risk differed by sleep medication use. The extent of insomnia symptoms exhibited a positive, dose-response relation with risk of falling among those not using sleep medications. Older adults using physician-recommended sleep medications exhibited a consistently higher fall risk irrespective of the extent of insomnia symptoms. The number of insomnia symptoms predicts 2-year fall risk in older adults. Taking physician-recommended sleep medications increases the risk of falling in older adults, irrespective of the presence of insomnia symptoms. Future efforts should be directed toward treating insomnia symptoms, and managing and selecting sleep medications effectively to decrease the risk of falling in older adults. © Sleep Research Society 2017. Published by Oxford University Press on behalf of the Sleep Research Society. All rights reserved. For permissions, please e-mail journals.permissions@oup.com.
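    A hedged sketch of one way to model the independent and interactive effects described above, using logistic regression with an insomnia-by-medication interaction on simulated data; the DataFrame columns and the single covariate are placeholders, not HRS variables:

```python
# Logistic model of 2-year fall risk with an insomnia-symptom count (0-4),
# a sleep-medication indicator, and their interaction, on simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 1000
df = pd.DataFrame({
    "fell": rng.integers(0, 2, n),
    "insomnia": rng.integers(0, 5, n),      # number of insomnia symptoms
    "sleep_med": rng.integers(0, 2, n),     # physician-recommended sleep medication (0/1)
    "age": rng.normal(74.5, 6.6, n),
})

model = smf.logit("fell ~ insomnia * sleep_med + age", data=df).fit(disp=False)
print(model.summary())
```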

  4. Electrostatic ion thrusters - towards predictive modeling

    Energy Technology Data Exchange (ETDEWEB)

    Kalentev, O.; Matyash, K.; Duras, J.; Lueskow, K.F.; Schneider, R. [Ernst-Moritz-Arndt Universitaet Greifswald, D-17489 (Germany); Koch, N. [Technische Hochschule Nuernberg Georg Simon Ohm, Kesslerplatz 12, D-90489 Nuernberg (Germany); Schirra, M. [Thales Electronic Systems GmbH, Soeflinger Strasse 100, D-89077 Ulm (Germany)

    2014-02-15

    The development of electrostatic ion thrusters so far has mainly been based on empirical and qualitative know-how, and on evolutionary iteration steps. This resulted in considerable effort regarding prototype design, construction and testing, and therefore in significant development and qualification costs and high time demands. For future developments it is anticipated to implement simulation tools which allow for quantitative prediction of ion thruster performance, long-term behavior and spacecraft interaction prior to hardware design and construction. Based on integrated numerical models combining self-consistent kinetic plasma models with plasma-wall interaction modules, a new quality in the description of electrostatic thrusters can be reached. These open the perspective for predictive modeling in this field. This paper reviews the application of a set of predictive numerical modeling tools on an ion thruster model of the HEMP-T (High Efficiency Multi-stage Plasma Thruster) type patented by Thales Electron Devices GmbH. (copyright 2014 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim) (orig.)

  5. Characterizing Attention with Predictive Network Models.

    Science.gov (United States)

    Rosenberg, M D; Finn, E S; Scheinost, D; Constable, R T; Chun, M M

    2017-04-01

    Recent work shows that models based on functional connectivity in large-scale brain networks can predict individuals' attentional abilities. While being some of the first generalizable neuromarkers of cognitive function, these models also inform our basic understanding of attention, providing empirical evidence that: (i) attention is a network property of brain computation; (ii) the functional architecture that underlies attention can be measured while people are not engaged in any explicit task; and (iii) this architecture supports a general attentional ability that is common to several laboratory-based tasks and is impaired in attention deficit hyperactivity disorder (ADHD). Looking ahead, connectivity-based predictive models of attention and other cognitive abilities and behaviors may potentially improve the assessment, diagnosis, and treatment of clinical dysfunction. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Genetic models of homosexuality: generating testable predictions

    Science.gov (United States)

    Gavrilets, Sergey; Rice, William R

    2006-01-01

    Homosexuality is a common occurrence in humans and other species, yet its genetic and evolutionary basis is poorly understood. Here, we formulate and study a series of simple mathematical models for the purpose of predicting empirical patterns that can be used to determine the form of selection that leads to polymorphism of genes influencing homosexuality. Specifically, we develop theory to make contrasting predictions about the genetic characteristics of genes influencing homosexuality including: (i) chromosomal location, (ii) dominance among segregating alleles and (iii) effect sizes that distinguish between the two major models for their polymorphism: the overdominance and sexual antagonism models. We conclude that the measurement of the genetic characteristics of quantitative trait loci (QTLs) found in genomic screens for genes influencing homosexuality can be highly informative in resolving the form of natural selection maintaining their polymorphism. PMID:17015344

  7. A statistical model for predicting muscle performance

    Science.gov (United States)

    Byerly, Diane Leslie De Caix

    The objective of these studies was to develop a capability for predicting muscle performance and fatigue to be utilized for both space- and ground-based applications. To develop this predictive model, healthy test subjects performed a defined, repetitive dynamic exercise to failure using a Lordex spinal machine. Throughout the exercise, surface electromyography (SEMG) data were collected from the erector spinae using a Mega Electronics ME3000 muscle tester and surface electrodes placed on both sides of the back muscle. These data were analyzed using a 5th order Autoregressive (AR) model and statistical regression analysis. It was determined that an AR derived parameter, the mean average magnitude of AR poles, significantly correlated with the maximum number of repetitions (designated Rmax) that a test subject was able to perform. Using the mean average magnitude of AR poles, a test subject's performance to failure could be predicted as early as the sixth repetition of the exercise. This predictive model has the potential to provide a basis for improving post-space flight recovery, monitoring muscle atrophy in astronauts and assessing the effectiveness of countermeasures, monitoring astronaut performance and fatigue during Extravehicular Activity (EVA) operations, providing pre-flight assessment of the ability of an EVA crewmember to perform a given task, improving the design of training protocols and simulations for strenuous International Space Station assembly EVA, and enabling EVA work task sequences to be planned enhancing astronaut performance and safety. Potential ground-based, medical applications of the predictive model include monitoring muscle deterioration and performance resulting from illness, establishing safety guidelines in the industry for repetitive tasks, monitoring the stages of rehabilitation for muscle-related injuries sustained in sports and accidents, and enhancing athletic performance through improved training protocols while reducing
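    A hedged sketch of the AR-based feature described above: fit a 5th-order autoregressive model to an SEMG-like signal and compute the mean magnitude of the AR poles; the signal is synthetic and preprocessing is omitted:

```python
# AR(5) fit to a synthetic SEMG-like window, then the mean magnitude of the
# AR poles (the feature reported to correlate with repetitions-to-failure).
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(3)
semg = rng.standard_normal(2000)              # stand-in for one repetition's SEMG window

fit = AutoReg(semg, lags=5).fit()
a = np.asarray(fit.params)[1:]                # AR coefficients a1..a5 (index 0 is the constant)

# Poles are the roots of the characteristic polynomial z^5 - a1*z^4 - ... - a5
poles = np.roots(np.concatenate(([1.0], -a)))
mean_pole_magnitude = np.abs(poles).mean()
print(f"mean AR pole magnitude: {mean_pole_magnitude:.3f}")
```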

  8. Prediction of Post-Closure Water Balance for Monolithic Soil Covers at Waste Disposal Sites in the Greater Accra Metropolitan Area of Ghana

    Directory of Open Access Journals (Sweden)

    Kodwo Beedu Keelson

    2014-04-01

    Full Text Available The Ghana Landfill Guidelines require the provision of a final cover system during landfill closure as a means of minimizing the harmful environmental effects of uncontrolled leachate discharges. However, this technical manual does not provide explicit guidance on the material types or configurations that would be suitable for the different climatic zones in Ghana. The aim of this study was to simulate and predict post-closure landfill cover water balance for waste disposal sites located in the Greater Accra Metropolitan Area using the USGS Thornthwaite monthly water balance computer program. Five different cover soil types were analyzed using historical climatic data for the metropolis from 1980 to 2001. The maximum annual percolation and evapotranspiration rates for the native soil type were 337 mm and 974 mm respectively. Monthly percolation rates exhibited a seasonal pattern similar to the bimodal precipitation regime whereas monthly evapotranspiration did not. It was also observed that even though soils with a high clay content would be the most suitable option as landfill cover material in the Accra metropolis, the maximum thickness of 600 mm recommended in the Ghana Landfill Guidelines does not seem to provide significant reduction in percolation rates into the buried waste mass when the annual rainfall exceeds 700 mm. The findings from this research should provide additional guidance to landfill managers on the specification of cover designs for waste disposal sites with similar climatic conditions.
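    A hedged sketch of a Thornthwaite-style monthly water balance for a soil cover: potential evapotranspiration is estimated from monthly temperature and percolation is taken as the surplus above the cover's water-holding capacity. The monthly inputs and capacity value are illustrative, not the study's data or the USGS program itself:

```python
# Simplified Thornthwaite monthly water balance for a monolithic soil cover.
import numpy as np

precip = np.array([15, 30, 70, 90, 140, 180, 70, 40, 90, 110, 60, 25])  # mm/month (bimodal regime)
temp = np.array([27, 28, 28, 28, 27, 26, 25, 25, 26, 26, 27, 27])       # deg C
whc = 150.0   # water-holding capacity of the cover soil, mm (depends on texture and thickness)

# Thornthwaite PET: annual heat index and empirical exponent (day-length correction omitted)
I = np.sum((temp / 5.0) ** 1.514)
alpha = 6.75e-7 * I**3 - 7.71e-5 * I**2 + 1.792e-2 * I + 0.49239
pet = 16.0 * (10.0 * temp / I) ** alpha          # mm/month

storage, percolation = whc, []
for p, e in zip(precip, pet):
    storage = storage + p - e
    surplus = max(storage - whc, 0.0)            # water above capacity drains into the waste
    storage = min(max(storage, 0.0), whc)
    percolation.append(surplus)

print(f"annual percolation ~ {sum(percolation):.0f} mm, annual PET ~ {pet.sum():.0f} mm")
```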

  9. Conservatism and the neural circuitry of threat: economic conservatism predicts greater amygdala-BNST connectivity during periods of threat vs safety.

    Science.gov (United States)

    Pedersen, Walker S; Muftuler, L Tugan; Larson, Christine L

    2018-01-01

    Political conservatism is associated with an increased negativity bias, including increased attention and reactivity toward negative and threatening stimuli. Although the human amygdala has been implicated in the response to threatening stimuli, no studies to date have investigated whether conservatism is associated with altered amygdala function toward threat. Furthermore, although an influential theory posits that connectivity between the amygdala and bed nucleus of the stria terminalis (BNST) is important in initiating the response to sustained or uncertain threat, whether individual differences in conservatism modulate this connectivity is unknown. To test whether conservatism is associated with increased reactivity in neural threat circuitry, we measured participants' self-reported social and economic conservatism and asked them to complete high-resolution fMRI scans while under threat of an unpredictable shock and while safe. We found that economic conservatism predicted greater connectivity between the BNST and a cluster of voxels in the left amygdala during threat vs safety. These results suggest that increased amygdala-BNST connectivity during threat may be a key neural correlate of the enhanced negativity bias found in conservatism. © The Author (2017). Published by Oxford University Press.

  10. Prediction models : the right tool for the right problem

    NARCIS (Netherlands)

    Kappen, Teus H.; Peelen, Linda M.

    2016-01-01

    PURPOSE OF REVIEW: Perioperative prediction models can help to improve personalized patient care by providing individual risk predictions to both patients and providers. However, the scientific literature on prediction model development and validation can be quite technical and challenging to

  11. Error Estimation of An Ensemble Statistical Seasonal Precipitation Prediction Model

    Science.gov (United States)

    Shen, Samuel S. P.; Lau, William K. M.; Kim, Kyu-Myong; Li, Gui-Long

    2001-01-01

    This NASA Technical Memorandum describes an optimal ensemble canonical correlation forecasting model for seasonal precipitation. Each individual forecast is based on canonical correlation analysis (CCA) in the spectral spaces whose bases are empirical orthogonal functions (EOF). The optimal weights in the ensemble forecasting crucially depend on the mean square error of each individual forecast. An estimate of the mean square error of a CCA prediction is also made using the spectral method. The error is decomposed onto EOFs of the predictand and decreases linearly according to the correlation between the predictor and predictand. Since the new CCA scheme is derived for continuous fields of predictor and predictand, an area-factor is automatically included. Thus our model is an improvement of the spectral CCA scheme of Barnett and Preisendorfer. The improvements include (1) the use of an area-factor, (2) the estimation of prediction error, and (3) the optimal ensemble of multiple forecasts. The new CCA model is applied to the seasonal forecasting of the United States (US) precipitation field. The predictor is the sea surface temperature (SST). The US Climate Prediction Center's reconstructed SST is used as the predictor's historical data. The US National Center for Environmental Prediction's optimally interpolated precipitation (1951-2000) is used as the predictand's historical data. Our forecast experiments show that the new ensemble canonical correlation scheme yields reasonable forecasting skill. For example, when using September-October-November SST to predict the next season's December-January-February precipitation, the spatial pattern correlation between the observed and predicted fields is positive in 46 of the 50 years of experiments. The positive correlations are close to or greater than 0.4 in 29 years, which indicates excellent performance of the forecasting model. The forecasting skill can be further enhanced when several predictors are used.
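    A hedged sketch of the inverse-error ensemble idea on synthetic data: several CCA forecasts are combined with weights proportional to the reciprocal of their estimated mean square errors. The data, the choice of ensemble members, and the in-sample error proxy are assumptions for illustration only, not the memorandum's spectral formulation:

```python
# Inverse-MSE weighted ensemble of CCA forecasts on synthetic predictor/predictand fields.
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(4)
years, n_sst, n_precip = 50, 40, 25
X = rng.standard_normal((years, n_sst))                              # SST predictor amplitudes
Y = X[:, :n_precip] * 0.5 + rng.standard_normal((years, n_precip))   # precipitation predictand

train, test = slice(0, 40), slice(40, 50)
members, errors = [], []
for k in (2, 4, 6):                            # ensemble members differ in retained CCA modes
    cca = CCA(n_components=k).fit(X[train], Y[train])
    members.append(cca.predict(X[test]))
    errors.append(np.mean((cca.predict(X[train]) - Y[train]) ** 2))  # in-sample MSE proxy

w = 1.0 / np.array(errors)
w /= w.sum()
ensemble = sum(wi * m for wi, m in zip(w, members))
skill = np.corrcoef(ensemble.ravel(), Y[test].ravel())[0, 1]
print(f"weights = {np.round(w, 2)}, pattern correlation = {skill:.2f}")
```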

  12. Neuro-fuzzy modeling in bankruptcy prediction

    Directory of Open Access Journals (Sweden)

    Vlachos D.

    2003-01-01

    Full Text Available For the past 30 years the problem of bankruptcy prediction has been thoroughly studied. From the paper of Altman in 1968 to the recent papers in the '90s, the progress of prediction accuracy was not satisfactory. This paper investigates an alternative modeling of the system (firm), combining neural networks and fuzzy controllers, i.e. using neuro-fuzzy models. Classical modeling is based on mathematical models that describe the behavior of the firm under consideration. The main idea of fuzzy control, on the other hand, is to build a model of a human control expert who is capable of controlling the process without thinking in terms of a mathematical model. This control expert specifies his control action in the form of linguistic rules. These control rules are translated into the framework of fuzzy set theory, providing a calculus which can simulate the behavior of the control expert and enhance its performance. The accuracy of the model is studied using datasets from previous research papers.

  13. Predictive Models for Carcinogenicity and Mutagenicity ...

    Science.gov (United States)

    Mutagenicity and carcinogenicity are endpoints of major environmental and regulatory concern. These endpoints are also important targets for development of alternative methods for screening and prediction due to the large number of chemicals of potential concern and the tremendous cost (in time, money, animals) of rodent carcinogenicity bioassays. Both mutagenicity and carcinogenicity involve complex, cellular processes that are only partially understood. Advances in technologies and generation of new data will permit a much deeper understanding. In silico methods for predicting mutagenicity and rodent carcinogenicity based on chemical structural features, along with current mutagenicity and carcinogenicity data sets, have performed well for local prediction (i.e., within specific chemical classes), but are less successful for global prediction (i.e., for a broad range of chemicals). The predictivity of in silico methods can be improved by improving the quality of the data base and endpoints used for modelling. In particular, in vitro assays for clastogenicity need to be improved to reduce false positives (relative to rodent carcinogenicity) and to detect compounds that do not interact directly with DNA or have epigenetic activities. New assays emerging to complement or replace some of the standard assays include VitotoxTM, GreenScreenGC, and RadarScreen. The needs of industry and regulators to assess thousands of compounds necessitate the development of high-t

  14. Disease Prediction Models and Operational Readiness

    Energy Technology Data Exchange (ETDEWEB)

    Corley, Courtney D.; Pullum, Laura L.; Hartley, David M.; Benedum, Corey M.; Noonan, Christine F.; Rabinowitz, Peter M.; Lancaster, Mary J.

    2014-03-19

    INTRODUCTION: The objective of this manuscript is to present a systematic review of biosurveillance models that operate on select agents and can forecast the occurrence of a disease event. One of the primary goals of this research was to characterize the viability of biosurveillance models to provide operationally relevant information for decision makers and to identify areas for future research. Two critical characteristics differentiate this work from other infectious disease modeling reviews. First, we reviewed models that attempted to predict the disease event, not merely its transmission dynamics. Second, we considered models involving pathogens of concern as determined by the US National Select Agent Registry (as of June 2011). Methods: We searched dozens of commercial and government databases and harvested Google search results for eligible models utilizing terms and phrases provided by public health analysts relating to biosurveillance, remote sensing, risk assessments, spatial epidemiology, and ecological niche modeling. The publication dates of the search results returned are bounded by the dates of coverage of each database and the date on which the search was performed; however, all searching was completed by December 31, 2010. This returned 13,767 webpages and 12,152 citations. After de-duplication and removal of extraneous material, a core collection of 6,503 items was established, and these publications along with their abstracts are presented in a semantic wiki at http://BioCat.pnnl.gov. Next, PNNL’s IN-SPIRE visual analytics software was used to cross-correlate these publications with the definition for a biosurveillance model, resulting in the selection of 54 documents that matched the criteria. Ten of these documents, however, dealt purely with disease spread models, inactivation of bacteria, or the modeling of human immune system responses to pathogens rather than predicting disease events. As a result, we systematically reviewed 44 papers, and the results are presented in this analysis.

  15. Nonlinear model predictive control theory and algorithms

    CERN Document Server

    Grüne, Lars

    2017-01-01

    This book offers readers a thorough and rigorous introduction to nonlinear model predictive control (NMPC) for discrete-time and sampled-data systems. NMPC schemes with and without stabilizing terminal constraints are detailed, and intuitive examples illustrate the performance of different NMPC variants. NMPC is interpreted as an approximation of infinite-horizon optimal control so that important properties like closed-loop stability, inverse optimality and suboptimality can be derived in a uniform manner. These results are complemented by discussions of feasibility and robustness. An introduction to nonlinear optimal control algorithms yields essential insights into how the nonlinear optimization routine—the core of any nonlinear model predictive controller—works. Accompanying software in MATLAB® and C++ (downloadable from extras.springer.com/), together with an explanatory appendix in the book itself, enables readers to perform computer experiments exploring the possibilities and limitations of NMPC. T...
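    A minimal receding-horizon sketch of the NMPC idea (not the book's accompanying MATLAB/C++ software): at each step a finite-horizon cost is optimized over future inputs for a toy nonlinear system, the first input is applied, and the horizon rolls forward. The dynamics, cost, and horizon length are illustrative assumptions:

```python
# Basic NMPC loop without terminal constraints, using a generic NLP solver.
import numpy as np
from scipy.optimize import minimize

def f(x, u):
    # toy nonlinear discrete-time system with input on the second state
    return np.array([x[0] + 0.1 * x[1], 0.9 * x[1] - 0.1 * np.sin(x[0]) + 0.1 * u])

def horizon_cost(u_seq, x0):
    x, cost = x0, 0.0
    for u in u_seq:
        x = f(x, u)
        cost += x @ x + 0.01 * u**2     # quadratic stage cost
    return cost

N, x = 10, np.array([1.0, 0.0])          # horizon length and initial state
for step in range(30):                   # receding-horizon (closed-loop) iteration
    res = minimize(horizon_cost, np.zeros(N), args=(x,), method="SLSQP",
                   bounds=[(-1.0, 1.0)] * N)
    x = f(x, res.x[0])                   # apply only the first optimized input
print("final state:", x)
```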

  16. A predictive model for dimensional errors in fused deposition modeling

    DEFF Research Database (Denmark)

    Stolfi, A.

    2015-01-01

    This work concerns the effect of deposition angle (a) and layer thickness (L) on the dimensional performance of FDM parts using a predictive model based on the geometrical description of the FDM filament profile. An experimental validation over the whole a range from 0° to 177° at 3° steps and two values of L (0.254 mm, 0.330 mm) was produced by comparing predicted values with external face-to-face measurements. After removing outliers, the results show that the developed two-parameter model can serve as a tool for modeling the FDM dimensional behavior in a wide range of deposition angles.

  17. Modelling the impact of urban form on household energy demand and related CO2 emissions in the Greater Dublin Region

    International Nuclear Information System (INIS)

    Liu Xiaochen; Sweeney, John

    2012-01-01

    This study aims to investigate the relationship between household space heating energy use and urban form (land use characteristics) for the Greater Dublin Region. The geographical distributions of household energy use are evaluated at the Enumeration Districts (ED) level based on the building thermal balance model. Moreover, it estimates the impact of possible factors on household space heating consumption. Results illustrate that the distribution profile of dwellings is a significant factor related to overall heating energy demand and individual dwelling energy consumption for space heating. Residents living in compact dwellings with small floor areas consume less energy for space heating than residents living in dwellings with big floor areas. Moreover, domestic heating energy demand per household was also estimated for two extreme urban development scenarios: the compact city scenario and the dispersed scenario. The results illustrate that the compact city scenario is likely to decrease the domestic heating energy consumption per household by 16.2% compared with the dispersed city scenario. Correspondingly, the energy-related CO2 emissions could be significantly decreased by the compact city scenario compared with the dispersed city scenario. - Highlights: ► A method was developed to investigate urban form impacts on energy demand. ► This study estimates impacts of possible factors on the household energy consumption. ► Household heating energy demand is sensitive to dwelling distribution profile. ► The compact case could reduce domestic energy demand compared with the dispersed case.

  18. Quantification and mapping of urban fluxes under climate change: Application of WRF-SUEWS model to Greater Porto area (Portugal).

    Science.gov (United States)

    Rafael, S; Martins, H; Marta-Almeida, M; Sá, E; Coelho, S; Rocha, A; Borrego, C; Lopes, M

    2017-05-01

    Climate change and the growth of urban populations are two of the main challenges facing Europe today. These issues are linked, as climate change results in serious challenges for cities. Recent attention has focused on how urban surface-atmosphere exchanges of heat and water will be affected by climate change and the implications for urban planning and sustainability. In this study, energy fluxes for the Greater Porto area, Portugal, were estimated and the influence of projected climate change evaluated. To accomplish this, the Weather Research and Forecasting Model (WRF) and the Surface Urban Energy and Water Balance Scheme (SUEWS) were applied for two climatological scenarios: a present (or reference, 1986-2005) scenario and a future scenario (2046-2065), in this case the Representative Concentration Pathway RCP8.5, which reflects the worst set of expectations (with the most onerous impacts). The results show that for the future climate conditions, the incoming shortwave radiation will increase by around 10%, the sensible heat flux by around 40% and the net storage heat flux by around 35%. In contrast, the latent heat flux will decrease by about 20%. The changes in the magnitude of the different fluxes result in an increase of the net all-wave radiation by 15%. The implications of the changes in the energy balance for the meteorological variables are discussed, particularly in terms of temperature and precipitation. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. Greater expectations: using hierarchical linear modeling to examine expectancy for treatment outcome as a predictor of treatment response.

    Science.gov (United States)

    Price, Matthew; Anderson, Page; Henrich, Christopher C; Rothbaum, Barbara Olasov

    2008-12-01

    A client's expectation that therapy will be beneficial has long been considered an important factor contributing to therapeutic outcomes, but recent empirical work examining this hypothesis has primarily yielded null findings. The present study examined the contribution of expectancies for treatment outcome to actual treatment outcome from the start of therapy through 12-month follow-up in a clinical sample of individuals (n=72) treated for fear of flying with either in vivo exposure or virtual reality exposure therapy. Using a piecewise hierarchical linear model, outcome expectancy predicted treatment gains made during therapy but not during follow-up. Compared to lower levels, higher expectations for treatment outcome yielded stronger rates of symptom reduction from the beginning to the end of treatment on 2 standardized self-report questionnaires on fear of flying. The analytic approach of the current study is one potential reason that findings contrast with prior literature. The advantages of using hierarchical linear modeling to assess interindividual differences in longitudinal data are discussed.
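    A hedged sketch of a piecewise growth model in the spirit of the analysis above: a random-intercept mixed model with separate slopes for the treatment and follow-up phases and an expectancy-by-treatment-slope interaction, fit to simulated long-format data (variable names and effect sizes are assumptions, not the study's data):

```python
# Piecewise hierarchical (mixed) model with random intercepts per subject.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(9)
subjects, waves = 72, 5
df = pd.DataFrame({
    "subject": np.repeat(np.arange(subjects), waves),
    "wave": np.tile(np.arange(waves), subjects),          # waves 0-2 treatment, 3-4 follow-up
    "expectancy": np.repeat(rng.normal(0, 1, subjects), waves),
})
df["t_tx"] = df["wave"].clip(upper=2)                     # piecewise time: slope during therapy
df["t_fu"] = (df["wave"] - 2).clip(lower=0)               # piecewise time: slope during follow-up
df["fear"] = (50 - 6 * df["t_tx"] - 1 * df["t_fu"]
              - 2 * df["expectancy"] * df["t_tx"]
              + rng.normal(0, 4, len(df)))                # simulated fear-of-flying score

model = smf.mixedlm("fear ~ t_tx + t_fu + expectancy + expectancy:t_tx",
                    data=df, groups=df["subject"]).fit()
print(model.summary())
```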

  20. Prediction Model for Relativistic Electrons at Geostationary Orbit

    Science.gov (United States)

    Khazanov, George V.; Lyatsky, Wladislaw

    2008-01-01

    We developed a new prediction model for forecasting relativistic (greater than 2 MeV) electrons, which provides a very high correlation between predicted and actually measured electron fluxes at geostationary orbit. This model implies multi-step particle acceleration and is based on numerically integrating two linked continuity equations for primarily accelerated particles and relativistic electrons. The model includes a source and losses, and uses solar wind data as its only input parameters. As the source we used a coupling function that is a best-fit combination of the solar wind/interplanetary magnetic field parameters responsible for the generation of geomagnetic activity. The loss function was derived from experimental data. We tested the model for the four-year period 2004-2007. The correlation coefficient between predicted and actual values of the electron fluxes for the whole four-year period, as well as for each of these years, is stable and high (about 0.9). The high and stable correlation between the computed and actual electron fluxes shows that reliable forecasting of these electrons at geostationary orbit is possible.
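    A hedged sketch of numerically integrating two linked continuity equations, one for a seed (primarily accelerated) population and one for relativistic electrons, driven by a stand-in solar-wind source; the functional forms, loss times, and coupling constant are placeholders, not the model's actual formulation:

```python
# Two coupled continuity equations: a source feeds population n1, which is
# promoted to the relativistic population n2; both decay with empirical loss times.
import numpy as np
from scipy.integrate import solve_ivp

def source(t):
    return 1.0 + 0.5 * np.sin(2 * np.pi * t / 27.0)   # stand-in solar-wind coupling function

def rhs(t, y, tau1=2.0, tau2=5.0, k=0.3):
    n1, n2 = y
    dn1 = source(t) - n1 / tau1 - k * n1              # fed by the source, lost and promoted
    dn2 = k * n1 - n2 / tau2                          # fed by promotion from n1, lost
    return [dn1, dn2]

sol = solve_ivp(rhs, (0.0, 120.0), [0.0, 0.0], dense_output=True, max_step=0.5)
t = np.linspace(0, 120, 241)
n1, n2 = sol.sol(t)
print(f"relativistic-electron proxy at t = 120 d: {n2[-1]:.3f}")
```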

  1. A predictive model for dimensional errors in fused deposition modeling

    DEFF Research Database (Denmark)

    Stolfi, A.

    2015-01-01

    This work concerns the effect of deposition angle (a) and layer thickness (L) on the dimensional performance of FDM parts using a predictive model based on the geometrical description of the FDM filament profile. An experimental validation over the whole a range from 0° to 177° at 3° steps and two values of L (0.254 mm, 0.330 mm) was produced by comparing predicted values with external face-to-face measurements. After removing outliers, the results show that the developed two-parameter model can serve as a tool for modeling the FDM dimensional behavior in a wide range of deposition angles.

  2. Predictive Modeling in Actinide Chemistry and Catalysis

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Ping [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-16

    These are slides from a presentation on predictive modeling in actinide chemistry and catalysis. The following topics are covered in these slides: Structures, bonding, and reactivity (bonding can be quantified by optical probes and theory, and electronic structures and reaction mechanisms of actinide complexes); Magnetic resonance properties (transition metal catalysts with multi-nuclear centers, and NMR/EPR parameters); Moving to more complex systems (surface chemistry of nanomaterials, and interactions of ligands with nanoparticles); Path forward and conclusions.

  3. Predictive modelling of evidence informed teaching

    OpenAIRE

    Zhang, Dell; Brown, C.

    2017-01-01

    In this paper, we analyse the questionnaire survey data collected from 79 English primary schools about the situation of evidence informed teaching, where the evidences could come from research journals or conferences. Specifically, we build a predictive model to see what external factors could help to close the gap between teachers’ belief and behaviour in evidence informed teaching, which is the first of its kind to our knowledge. The major challenge, from the data mining perspective, is th...

  4. A Predictive Model for Cognitive Radio

    Science.gov (United States)

    2006-09-14

    [Abstract fragment; only partially recoverable from garbled extraction] Vadde et al. have applied response surface methodology to produce a model for prediction of the response in a given situation, and to screen service configurations toward those that best meet communication delivery requirements in mobile ad hoc networks. (References cited in the fragment: [3] K. K. Vadde and V. R. Syrotiuk, "Factor interaction on service delivery in mobile ad hoc networks," 2004; [4] K. K. Vadde, M.-V. R. Syrotiuk, and D. C. Montgomery.)

  5. Tectonic predictions with mantle convection models

    Science.gov (United States)

    Coltice, Nicolas; Shephard, Grace E.

    2018-04-01

    Over the past 15 yr, numerical models of convection in Earth's mantle have made a leap forward: they can now produce self-consistent plate-like behaviour at the surface together with deep mantle circulation. These digital tools provide a new window into the intimate connections between plate tectonics and mantle dynamics, and can therefore be used for tectonic predictions, in principle. This contribution explores this assumption. First, initial conditions at 30, 20, 10 and 0 Ma are generated by driving a convective flow with imposed plate velocities at the surface. We then compute instantaneous mantle flows in response to the guessed temperature fields without imposing any boundary conditions. Plate boundaries self-consistently emerge at correct locations with respect to reconstructions, except for small plates close to subduction zones. As already observed for other types of instantaneous flow calculations, the structure of the top boundary layer and upper-mantle slab is the dominant character that leads to accurate predictions of surface velocities. Perturbations of the rheological parameters have little impact on the resulting surface velocities. We then compute fully dynamic model evolution from 30 and 10 to 0 Ma, without imposing plate boundaries or plate velocities. Contrary to instantaneous calculations, errors in kinematic predictions are substantial, although the plate layout and kinematics in several areas remain consistent with the expectations for the Earth. For these calculations, varying the rheological parameters makes a difference for plate boundary evolution. Also, identified errors in initial conditions contribute to first-order kinematic errors. This experiment shows that the tectonic predictions of dynamic models over 10 My are highly sensitive to uncertainties of rheological parameters and initial temperature field in comparison to instantaneous flow calculations. Indeed, the initial conditions and the rheological parameters can be good enough

  6. Predictive Modeling of the CDRA 4BMS

    Science.gov (United States)

    Coker, Robert F.; Knox, James C.

    2016-01-01

    As part of NASA's Advanced Exploration Systems (AES) program and the Life Support Systems Project (LSSP), fully predictive models of the Four Bed Molecular Sieve (4BMS) of the Carbon Dioxide Removal Assembly (CDRA) on the International Space Station (ISS) are being developed. This virtual laboratory will be used to help reduce mass, power, and volume requirements for future missions. In this paper we describe current and planned modeling developments in the area of carbon dioxide removal to support future crewed Mars missions as well as the resolution of anomalies observed in the ISS CDRA.

  7. A neural network based model for urban noise prediction.

    Science.gov (United States)

    Genaro, N; Torija, A; Ramos-Ridao, A; Requena, I; Ruiz, D P; Zamorano, M

    2010-10-01

    Noise is a global problem. In 1972 the World Health Organization (WHO) classified noise as a pollutant. Since then, most industrialized countries have enacted laws and local regulations to prevent and reduce acoustic environmental pollution. A further aim is to alert people to the dangers of this type of pollution. In this context, urban planners need to have tools that allow them to evaluate the degree of acoustic pollution. Scientists in many countries have modeled urban noise, using a wide range of approaches, but their results have not been as good as expected. This paper describes a model developed for the prediction of environmental urban noise using Soft Computing techniques, namely Artificial Neural Networks (ANN). The model is based on the analysis of variables regarded as influential by experts in the field and was applied to data collected on different types of streets. The results were compared to those obtained with other models. The study found that the ANN system was able to predict urban noise with greater accuracy, and thus, was an improvement over those models. The principal component analysis (PCA) was also used to try to simplify the model. Although there was a slight decline in the accuracy of the results, the values obtained were also quite acceptable.
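    A hedged sketch of an ANN regressor for street-level noise, mapping a few traffic and geometry predictors to an equivalent sound level; the feature set, units, and data are synthetic stand-ins for the expert-selected variables used in the study:

```python
# Small feed-forward network regressing an equivalent noise level from street features.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(5)
n = 500
X = np.column_stack([
    rng.integers(50, 2000, n),      # traffic flow (vehicles/hour)
    rng.uniform(0.0, 0.3, n),       # share of heavy vehicles
    rng.uniform(20, 60, n),         # average speed (km/h)
    rng.uniform(6, 30, n),          # street width (m)
])
leq = 45 + 8 * np.log10(X[:, 0]) + 15 * X[:, 1] + 0.1 * X[:, 2] + rng.normal(0, 1.5, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, leq, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=3000, random_state=0))
model.fit(X_tr, y_tr)
print(f"MAE = {mean_absolute_error(y_te, model.predict(X_te)):.2f} dB(A)")
```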

  8. Coal demand prediction based on a support vector machine model

    Energy Technology Data Exchange (ETDEWEB)

    Jia, Cun-liang; Wu, Hai-shan; Gong, Dun-wei [China University of Mining & Technology, Xuzhou (China). School of Information and Electronic Engineering

    2007-01-15

    A forecasting model for the coal demand of China using support vector regression was constructed. With the selected embedding dimension, the output vectors and input vectors were constructed based on the coal demand of China from 1980 to 2002. After comparison with linear and sigmoid kernels, a radial basis function (RBF) was adopted as the kernel function. By analyzing the relationship between the error margin of prediction and the model parameters, the proper parameters were chosen. A support vector machine (SVM) model with multiple inputs and a single output was proposed. Compared with a predictor based on RBF neural networks on test datasets, the results show that the SVM predictor has higher precision and greater generalization ability. In the end, the coal demand from 2003 to 2006 is accurately forecasted. 10 refs., 2 figs., 4 tabs.
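    A hedged sketch of the multi-input, single-output RBF-kernel support vector regressor described above, with lagged demand values as the embedded input vector; the demand series, embedding dimension, and hyperparameters are illustrative assumptions:

```python
# RBF-kernel SVR on a lag-embedded annual demand series (synthetic data).
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(6)
demand = np.cumsum(rng.normal(3.0, 1.0, 27)) + 60.0   # stand-in annual coal demand series

d = 3                                                  # embedding dimension (lag order)
X = np.array([demand[i:i + d] for i in range(len(demand) - d)])
y = demand[d:]

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
model.fit(X[:-4], y[:-4])                              # hold out the last 4 years
print("held-out predictions:", np.round(model.predict(X[-4:]), 1))
print("held-out actuals:    ", np.round(y[-4:], 1))
```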

  9. Predictive Modeling by the Cerebellum Improves Proprioception

    Science.gov (United States)

    Bhanpuri, Nasir H.; Okamura, Allison M.

    2013-01-01

    Because sensation is delayed, real-time movement control requires not just sensing, but also predicting limb position, a function hypothesized for the cerebellum. Such cerebellar predictions could contribute to perception of limb position (i.e., proprioception), particularly when a person actively moves the limb. Here we show that human cerebellar patients have proprioceptive deficits compared with controls during active movement, but not when the arm is moved passively. Furthermore, when healthy subjects move in a force field with unpredictable dynamics, they have active proprioceptive deficits similar to cerebellar patients. Therefore, muscle activity alone is likely insufficient to enhance proprioception and predictability (i.e., an internal model of the body and environment) is important for active movement to benefit proprioception. We conclude that cerebellar patients have an active proprioceptive deficit consistent with disrupted movement prediction rather than an inability to generally enhance peripheral proprioceptive signals during action and suggest that active proprioceptive deficits should be considered a fundamental cerebellar impairment of clinical importance. PMID:24005283

  10. Prediction of Chemical Function: Model Development and ...

    Science.gov (United States)

    The United States Environmental Protection Agency’s Exposure Forecaster (ExpoCast) project is developing both statistical and mechanism-based computational models for predicting exposures to thousands of chemicals, including those in consumer products. The high-throughput (HT) screening-level exposures developed under ExpoCast can be combined with HT screening (HTS) bioactivity data for the risk-based prioritization of chemicals for further evaluation. The functional role (e.g. solvent, plasticizer, fragrance) that a chemical performs can drive both the types of products in which it is found and the concentration in which it is present and therefore impacting exposure potential. However, critical chemical use information (including functional role) is lacking for the majority of commercial chemicals for which exposure estimates are needed. A suite of machine-learning based models for classifying chemicals in terms of their likely functional roles in products based on structure were developed. This effort required collection, curation, and harmonization of publically-available data sources of chemical functional use information from government and industry bodies. Physicochemical and structure descriptor data were generated for chemicals with function data. Machine-learning classifier models for function were then built in a cross-validated manner from the descriptor/function data using the method of random forests. The models were applied to: 1) predict chemi

  11. Gamma-Ray Pulsars Models and Predictions

    CERN Document Server

    Harding, A K

    2001-01-01

    Pulsed emission from gamma-ray pulsars originates inside the magnetosphere, from radiation by charged particles accelerated near the magnetic poles or in the outer gaps. In polar cap models, the high energy spectrum is cut off by magnetic pair production above an energy that is dependent on the local magnetic field strength. While most young pulsars with surface fields in the range B = 10^{12} - 10^{13} G are expected to have high energy cutoffs around several GeV, the gamma-ray spectra of old pulsars having lower surface fields may extend to 50 GeV. Although the gamma-ray emission of older pulsars is weaker, detecting pulsed emission at high energies from nearby sources would be an important confirmation of polar cap models. Outer gap models predict more gradual high-energy turnovers at around 10 GeV, but also predict an inverse Compton component extending to TeV energies. Detection of pulsed TeV emission, which would not survive attenuation at the polar caps, is thus an important test of outer gap models. N...

  12. A prediction model for Clostridium difficile recurrence

    Directory of Open Access Journals (Sweden)

    Francis D. LaBarbera

    2015-02-01

    Full Text Available Background: Clostridium difficile infection (CDI) is a growing problem in the community and hospital setting. Its incidence has been on the rise over the past two decades, and it is quickly becoming a major concern for the health care system. A high rate of recurrence is one of the major hurdles in the successful treatment of C. difficile infection. There have been few studies that have looked at patterns of recurrence. The studies currently available have shown a number of risk factors associated with C. difficile recurrence (CDR); however, there is little consensus on the impact of most of the identified risk factors. Methods: Our study was a retrospective chart review of 198 patients diagnosed with CDI via Polymerase Chain Reaction (PCR) from February 2009 to June 2013. In our study, we decided to use a machine learning algorithm called the Random Forest (RF) to analyze all of the factors proposed to be associated with CDR. This model is capable of making predictions based on a large number of variables, and has outperformed numerous other models and statistical methods. Results: We came up with a model that was able to accurately predict the CDR with a sensitivity of 83.3%, specificity of 63.1%, and area under curve of 82.6%. Like other similar studies that have used the RF model, we also had very impressive results. Conclusions: We hope that in the future, machine learning algorithms, such as the RF, will see a wider application.
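    A hedged sketch of a random-forest classifier for recurrence reported with the same metrics (AUC, sensitivity, specificity), trained on simulated predictors rather than the study's chart-review variables:

```python
# Random forest for a binary recurrence outcome, reporting AUC, sensitivity, specificity.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score, confusion_matrix

rng = np.random.default_rng(7)
n = 198
X = rng.standard_normal((n, 8))                       # placeholder clinical predictors
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 1, n) > 0.7).astype(int)   # recurrence flag

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr, y_tr)

proba = rf.predict_proba(X_te)[:, 1]
pred = (proba >= 0.5).astype(int)
tn, fp, fn, tp = confusion_matrix(y_te, pred).ravel()
print(f"AUC = {roc_auc_score(y_te, proba):.2f}, "
      f"sensitivity = {tp / (tp + fn):.2f}, specificity = {tn / (tn + fp):.2f}")
```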

  13. Artificial Neural Network Model for Predicting Compressive

    Directory of Open Access Journals (Sweden)

    Salim T. Yousif

    2013-05-01

    Full Text Available Compressive strength of concrete is a commonly used criterion in evaluating concrete. Although testing of the compressive strength of concrete specimens is done routinely, it is performed on the 28th day after concrete placement. Therefore, strength estimation of concrete at an early age is highly desirable. This study presents the effort in applying neural network-based system identification techniques to predict the compressive strength of concrete based on concrete mix proportions, maximum aggregate size (MAS), and slump of fresh concrete. A back-propagation neural network model is successively developed, trained, and tested using actual data sets of concrete mix proportions gathered from the literature. The test of the model with unused data within the range of the input parameters shows that the maximum absolute error for the model is about 20% and 88% of the output results have absolute errors of less than 10%. The parametric study shows that the water/cement ratio (w/c) is the most significant factor affecting the output of the model. The results showed that neural networks have strong potential as a feasible tool for predicting the compressive strength of concrete.

  14. Evaluating predictive models of software quality

    International Nuclear Information System (INIS)

    Ciaschini, V; Canaparo, M; Ronchieri, E; Salomoni, D

    2014-01-01

    Applications from the High Energy Physics scientific community are constantly growing and are implemented by a large number of developers. This implies strong churn on the code and an associated risk of faults, which is unavoidable as long as the software undergoes active evolution. However, the necessities of production systems run counter to this. Stability and predictability are of paramount importance; in addition, a short turn-around time for the defect discovery-correction-deployment cycle is required. A way to reconcile these opposing needs is to use a software quality model to obtain an approximation of the risk before releasing a program, so as to only deliver software with a risk lower than an agreed threshold. In this article we evaluated two quality predictive models to identify the operational risk and the quality of some software products. We applied these models to the development history of several EMI packages with the intent to discover the risk factor of each product and compare it with its real history. We attempted to determine whether the models reasonably map reality for the applications under evaluation, and finally we conclude by suggesting directions for further studies.

  15. A generative model for predicting terrorist incidents

    Science.gov (United States)

    Verma, Dinesh C.; Verma, Archit; Felmlee, Diane; Pearson, Gavin; Whitaker, Roger

    2017-05-01

    A major concern in coalition peace-support operations is the incidence of terrorist activity. In this paper, we propose a generative model for the occurrence of terrorist incidents, and illustrate that an increase in diversity, as measured by the number of different social groups to which an individual belongs, is inversely correlated with the likelihood of a terrorist incident in the society. A generative model is one that can predict the likelihood of events in new contexts, as opposed to statistical models which are used to predict future incidents based on the history of the incidents in an existing context. Generative models can be useful in planning for persistent Information Surveillance and Reconnaissance (ISR) since they allow an estimation of regions in the theater of operation where terrorist incidents may arise, and thus can be used to better allocate the assignment and deployment of ISR assets. In this paper, we present a taxonomy of terrorist incidents, identify factors related to the occurrence of terrorist incidents, and provide a mathematical analysis calculating the likelihood of occurrence of terrorist incidents in three common real-life scenarios arising in peace-keeping operations.

  16. PREDICTION MODELS OF GRAIN YIELD AND CHARACTERIZATION

    Directory of Open Access Journals (Sweden)

    Narciso Ysac Avila Serrano

    2009-06-01

    Full Text Available With the objective to characterize the grain yield of five cowpea cultivars and to find linear regression models to predict it, a study was developed in La Paz, Baja California Sur, Mexico. A complete randomized blocks design was used. Simple and multivariate analyses of variance were carried out using the canonical variables to characterize the cultivars. The variables clusters per plant, pods per plant, pods per cluster, seed weight per plant, seed hectoliter weight, 100-seed weight, seed length, seed width, seed thickness, pod length, pod width, pod weight, seeds per pod, and seed weight per pod showed significant differences (P ≤ 0.05) among cultivars. The Paceño and IT90K-277-2 cultivars showed the highest seed weight per plant. The linear regression models showed correlation coefficients ≥0.92. In these models, seed weight per plant, pods per cluster, pods per plant, clusters per plant and pod length showed significant correlations (P ≤ 0.05). In conclusion, the results showed that grain yield differs among cultivars and, for its estimation, the prediction models showed highly dependable determination coefficients.

  17. Predictive Models for Normal Fetal Cardiac Structures.

    Science.gov (United States)

    Krishnan, Anita; Pike, Jodi I; McCarter, Robert; Fulgium, Amanda L; Wilson, Emmanuel; Donofrio, Mary T; Sable, Craig A

    2016-12-01

    Clinicians rely on age- and size-specific measures of cardiac structures to diagnose cardiac disease. No universally accepted normative data exist for fetal cardiac structures, and most fetal cardiac centers do not use the same standards. The aim of this study was to derive predictive models for Z scores for 13 commonly evaluated fetal cardiac structures using a large heterogeneous population of fetuses without structural cardiac defects. The study used archived normal fetal echocardiograms in representative fetuses aged 12 to 39 weeks. Thirteen cardiac dimensions were remeasured by a blinded echocardiographer from digitally stored clips. Studies with inadequate imaging views were excluded. Regression models were developed to relate each dimension to estimated gestational age (EGA) by dates, biparietal diameter, femur length, and estimated fetal weight by the Hadlock formula. Dimension outcomes were transformed (e.g., using the logarithm or square root) as necessary to meet the normality assumption. Higher order terms, quadratic or cubic, were added as needed to improve model fit. Information criteria and adjusted R² values were used to guide final model selection. Each Z-score equation is based on measurements derived from 296 to 414 unique fetuses. EGA yielded the best predictive model for the majority of dimensions; adjusted R² values ranged from 0.72 to 0.893. However, each of the other highly correlated (r > 0.94) biometric parameters was an acceptable surrogate for EGA. In most cases, the best fitting model included squared and cubic terms to introduce curvilinearity. For each dimension, models based on EGA provided the best fit for determining normal measurements of fetal cardiac structures. Nevertheless, other biometric parameters, including femur length, biparietal diameter, and estimated fetal weight provided results that were nearly as good. Comprehensive Z-score results are available on the basis of highly predictive models derived from gestational age.
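    A hedged sketch of how such a Z-score equation can be built: a cubic regression of a square-root-transformed dimension on EGA, with the Z score computed from the residual spread. The synthetic data, the single transform, and the homoscedastic-residual assumption are simplifications relative to the published models, which select the transform and terms per dimension:

```python
# Cubic regression of a transformed cardiac dimension on gestational age, then Z scores.
import numpy as np

rng = np.random.default_rng(8)
ega = rng.uniform(12, 39, 400)                                              # weeks
dimension = 0.05 * ega**1.3 + rng.normal(0, 0.08 * np.sqrt(ega / 20), 400)  # cm, synthetic

y = np.sqrt(dimension)                                           # variance-stabilizing transform
X = np.column_stack([np.ones_like(ega), ega, ega**2, ega**3])    # cubic terms for curvilinearity
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid_sd = np.std(y - X @ beta, ddof=4)

def z_score(measured_cm, ega_weeks):
    x = np.array([1.0, ega_weeks, ega_weeks**2, ega_weeks**3])
    return (np.sqrt(measured_cm) - x @ beta) / resid_sd

print(f"Z = {z_score(1.10, 28):.2f}")
```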

  18. An analytical model for climatic predictions

    International Nuclear Information System (INIS)

    Njau, E.C.

    1990-12-01

    A climatic model based upon analytical expressions is presented. This model is capable of making long-range predictions of heat energy variations on regional or global scales. These variations can then be transformed into corresponding variations of some other key climatic parameters, since weather and climatic changes are basically driven by differential heating and cooling around the earth. On the basis of the mathematical expressions upon which the model is based, it is shown that the global heat energy structure (and hence the associated climatic system) is characterized by zonally as well as latitudinally propagating fluctuations at frequencies below 0.5 day⁻¹. We have calculated the propagation speeds for those particular frequencies that are well documented in the literature. The calculated speeds are in excellent agreement with the measured speeds. (author). 13 refs

  19. An Anisotropic Hardening Model for Springback Prediction

    International Nuclear Information System (INIS)

    Zeng, Danielle; Xia, Z. Cedric

    2005-01-01

    As more Advanced High-Strength Steels (AHSS) are heavily used for automotive body structures and closures panels, accurate springback prediction for these components becomes more challenging because of their rapid hardening characteristics and ability to sustain even higher stresses. In this paper, a modified Mroz hardening model is proposed to capture realistic Bauschinger effect at reverse loading, such as when material passes through die radii or drawbead during sheet metal forming process. This model accounts for material anisotropic yield surface and nonlinear isotropic/kinematic hardening behavior. Material tension/compression test data are used to accurately represent Bauschinger effect. The effectiveness of the model is demonstrated by comparison of numerical and experimental springback results for a DP600 straight U-channel test

  20. Validating predictions from climate envelope models

    Science.gov (United States)

    Watling, J.; Bucklin, D.; Speroterra, C.; Brandt, L.; Cabal, C.; Romañach, Stephanie S.; Mazzotti, Frank J.

    2013-01-01

    Climate envelope models are a potentially important conservation tool, but their ability to accurately forecast species’ distributional shifts using independent survey data has not been fully evaluated. We created climate envelope models for 12 species of North American breeding birds previously shown to have experienced poleward range shifts. For each species, we evaluated three different approaches to climate envelope modeling that differed in the way they treated climate-induced range expansion and contraction, using random forests and maximum entropy modeling algorithms. All models were calibrated using occurrence data from 1967–1971 (t1) and evaluated using occurrence data from 1998–2002 (t2). Model sensitivity (the ability to correctly classify species presences) was greater using the maximum entropy algorithm than the random forest algorithm. Although sensitivity did not differ significantly among approaches, for many species, sensitivity was maximized using a hybrid approach that assumed range expansion, but not contraction, in t2. Species for which the hybrid approach resulted in the greatest improvement in sensitivity have been reported from more land cover types than species for which there was little difference in sensitivity between hybrid and dynamic approaches, suggesting that habitat generalists may be buffered somewhat against climate-induced range contractions. Specificity (the ability to correctly classify species absences) was maximized using the random forest algorithm and was lowest using the hybrid approach. Overall, our results suggest cautious optimism for the use of climate envelope models to forecast range shifts, but also underscore the importance of considering non-climate drivers of species range limits. The use of alternative climate envelope models that make different assumptions about range expansion and contraction is a new and potentially useful way to help inform our understanding of climate change effects on species.

  1. Validating predictions from climate envelope models.

    Directory of Open Access Journals (Sweden)

    James I Watling

    Full Text Available Climate envelope models are a potentially important conservation tool, but their ability to accurately forecast species' distributional shifts using independent survey data has not been fully evaluated. We created climate envelope models for 12 species of North American breeding birds previously shown to have experienced poleward range shifts. For each species, we evaluated three different approaches to climate envelope modeling that differed in the way they treated climate-induced range expansion and contraction, using random forests and maximum entropy modeling algorithms. All models were calibrated using occurrence data from 1967-1971 (t1) and evaluated using occurrence data from 1998-2002 (t2). Model sensitivity (the ability to correctly classify species presences) was greater using the maximum entropy algorithm than the random forest algorithm. Although sensitivity did not differ significantly among approaches, for many species, sensitivity was maximized using a hybrid approach that assumed range expansion, but not contraction, in t2. Species for which the hybrid approach resulted in the greatest improvement in sensitivity have been reported from more land cover types than species for which there was little difference in sensitivity between hybrid and dynamic approaches, suggesting that habitat generalists may be buffered somewhat against climate-induced range contractions. Specificity (the ability to correctly classify species absences) was maximized using the random forest algorithm and was lowest using the hybrid approach. Overall, our results suggest cautious optimism for the use of climate envelope models to forecast range shifts, but also underscore the importance of considering non-climate drivers of species range limits. The use of alternative climate envelope models that make different assumptions about range expansion and contraction is a new and potentially useful way to help inform our understanding of climate change effects on species.

  2. Web tools for predictive toxicology model building.

    Science.gov (United States)

    Jeliazkova, Nina

    2012-07-01

    The development and use of web tools in chemistry have accumulated more than 15 years of history already. Powered by advances in Internet technologies, the current generation of web systems is starting to expand into areas traditionally reserved for desktop applications. The web platforms integrate data storage, cheminformatics and data analysis tools. The ease of use and the collaborative potential of the web are compelling, despite the challenges. The topic of this review is a set of recently published web tools that facilitate predictive toxicology model building. The focus is on software platforms offering web access to chemical structure-based methods, although some of the frameworks could also provide bioinformatics or hybrid data analysis functionalities. A number of historical and current developments are cited. In order to provide a comparable assessment, the following characteristics are considered: support for workflows, descriptor calculations, visualization, modeling algorithms, data management and data sharing capabilities, availability of a GUI or programmatic access, and implementation details. The success of the Web is largely due to its highly decentralized, yet sufficiently interoperable model for information access. The expected future convergence between cheminformatics and bioinformatics databases presents new challenges for the management and analysis of large data sets. The web tools in predictive toxicology will likely continue to evolve toward the right mix of flexibility, performance, scalability, interoperability, sets of unique features offered, friendly user interfaces, programmatic access for advanced users, platform independence, results reproducibility, curation and crowdsourcing utilities, collaborative sharing and secure access.

  3. [Endometrial cancer: Predictive models and clinical impact].

    Science.gov (United States)

    Bendifallah, Sofiane; Ballester, Marcos; Daraï, Emile

    2017-12-01

    In France, in 2015, endometrial cancer (EC) was the most common gynecological cancer in terms of incidence and the fourth most common cancer in women, with about 8151 new cases and nearly 2179 deaths reported. Treatments (surgery, external radiotherapy, brachytherapy and chemotherapy) are currently delivered on the basis of an estimate of the recurrence risk, the risk of lymph node metastasis, or the probability of survival. This risk is determined from prognostic factors (clinical, histological, imaging, biological) taken alone or grouped together in classification systems, which are currently insufficient to account for the evolutionary and prognostic heterogeneity of endometrial cancer. For endometrial cancer, the concept of mathematical modeling and its application to prediction have developed in recent years. These biomathematical tools have opened a new era of care oriented towards the promotion of targeted therapies and personalized treatments. Many predictive models have been published to estimate the risk of recurrence and lymph node metastasis, but only a tiny fraction of them are sufficiently relevant and of clinical utility. The avenues for optimization are multiple and varied, suggesting that these mathematical models may find a place in clinical practice in the near future. The development of high-throughput genomics is likely to offer a more detailed molecular characterization of the disease and its heterogeneity. Copyright © 2017 Société Française du Cancer. Published by Elsevier Masson SAS. All rights reserved.

  4. Predictive Capability Maturity Model for computational modeling and simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy; Pilch, Martin M.

    2007-10-01

    The Predictive Capability Maturity Model (PCMM) is a new model that can be used to assess the level of maturity of computational modeling and simulation (M&S) efforts. The development of the model is based on both the authors' experience and their analysis of similar investigations in the past. The perspective taken in this report is one of judging the usefulness of a predictive capability that relies on the numerical solution of partial differential equations to better inform and improve decision making. The review of past investigations, such as the Software Engineering Institute's Capability Maturity Model Integration and the National Aeronautics and Space Administration and Department of Defense Technology Readiness Levels, indicates that a more restricted, more interpretable method is needed to assess the maturity of an M&S effort. The PCMM addresses six contributing elements of M&S: (1) representation and geometric fidelity, (2) physics and material model fidelity, (3) code verification, (4) solution verification, (5) model validation, and (6) uncertainty quantification and sensitivity analysis. For each of these elements, attributes are identified that characterize four increasing levels of maturity. Importantly, the PCMM is a structured method for assessing the maturity of an M&S effort that is directed toward an engineering application of interest. The PCMM does not assess whether the M&S effort, the accuracy of the predictions, or the performance of the engineering system satisfies specified application requirements.
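    One way such an assessment might be recorded programmatically is sketched below. The six element names come from the abstract; the 0-3 level scale, the example scores, and the "required level" check are illustrative assumptions, not PCMM prescriptions.

```python
# Six PCMM elements listed in the abstract; scores and thresholds are illustrative.
PCMM_ELEMENTS = [
    "representation and geometric fidelity",
    "physics and material model fidelity",
    "code verification",
    "solution verification",
    "model validation",
    "uncertainty quantification and sensitivity analysis",
]

def pcmm_table(scores: dict, required_level: int = 2) -> str:
    """Render a per-element maturity table and flag elements below a target level."""
    lines = []
    for element in PCMM_ELEMENTS:
        level = scores.get(element, 0)
        flag = "" if level >= required_level else "  <-- below required level"
        lines.append(f"{element:<55} level {level}{flag}")
    return "\n".join(lines)

example_scores = {e: i % 4 for i, e in enumerate(PCMM_ELEMENTS)}   # dummy data
print(pcmm_table(example_scores))
```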

  5. Predictions of models for environmental radiological assessment

    International Nuclear Information System (INIS)

    Peres, Sueli da Silva; Lauria, Dejanira da Costa; Mahler, Claudio Fernando

    2011-01-01

    In the field of environmental impact assessment, models are used for estimating the source term, environmental dispersion and transfer of radionuclides, exposure pathways, radiation dose and the risk to human beings. Although it is recognized that site-specific local data are important for improving the quality of dose assessment results, in practice obtaining them can be very difficult and expensive. Sources of uncertainty are numerous, among which we can cite: the subjectivity of modelers, exposure scenarios and pathways, the codes used and general parameters. The various models available use different mathematical approaches of different complexity, which can result in different predictions. Thus, for the same inputs different models can produce very different outputs. This paper briefly presents the main advances in the field of environmental radiological assessment that aim to improve the reliability of the models used in the assessment of environmental radiological impact. A model intercomparison exercise supplied incompatible results for ¹³⁷Cs and ⁶⁰Co, highlighting the need to develop reference methodologies for environmental radiological assessment that allow dose estimates to be compared on a common basis. The results of the intercomparison exercise are presented briefly. (author)

  6. The use of the greater trochanter marker in the thigh segment model: Implications for hip and knee frontal and transverse plane motion

    Directory of Open Access Journals (Sweden)

    Valentina Graci

    2016-03-01

    Conclusion: Hip and knee kinematics differed across segment definitions that included or excluded the greater trochanter marker, especially in the transverse plane. Therefore, when considering whether to include the greater trochanter in the thigh segment model when using surface markers to calculate 3D kinematics for movement assessment, it is important to have a clear understanding of the effect of the different marker sets and segment models in use.

  7. Use of the GREAT-ER model to estimate mass fluxes of chemicals, carried into the Western Scheldt estuary from the Rupel basin

    OpenAIRE

    Schowanek, D.

    2002-01-01

    The poster illustrates the application of the GREAT-ER model to estimate the mass flux of chemicals carried from a river basin into an estuary. GREAT-ER (Geo-referenced Regional Exposure Assessment Tool for European Rivers) is a newly developed model (1999) for the management and risk assessment of chemicals in river basins (see www.great-er.org). Recently the Rupel basin has been made available for use within GREAT-ER. This now allows a reliable estimation of the contribution of pollu...

  8. Effect on Prediction when Modeling Covariates in Bayesian Nonparametric Models.

    Science.gov (United States)

    Cruz-Marcelo, Alejandro; Rosner, Gary L; Müller, Peter; Stewart, Clinton F

    2013-04-01

    In biomedical research, it is often of interest to characterize biologic processes giving rise to observations and to make predictions of future observations. Bayesian nonparametric methods provide a means for carrying out Bayesian inference making as few assumptions about restrictive parametric models as possible. There are several proposals in the literature for extending Bayesian nonparametric models to include dependence on covariates. Limited attention, however, has been directed to the following two aspects. In this article, we examine the effect on fitting and predictive performance of incorporating covariates in a class of Bayesian nonparametric models by one of two primary ways: either in the weights or in the locations of a discrete random probability measure. We show that different strategies for incorporating continuous covariates in Bayesian nonparametric models can result in big differences when used for prediction, even though they lead to otherwise similar posterior inferences. When one needs the predictive density, as in optimal design, and this density is a mixture, it is better to make the weights depend on the covariates. We demonstrate these points via a simulated data example and in an application in which one wants to determine the optimal dose of an anticancer drug used in pediatric oncology.

  9. Predictive Model for Permanent Shunting in Cryptococcal meningitis.

    Science.gov (United States)

    Phusoongnern, Woralux; Anunnatsiri, Siriluck; Sawanyawisuth, Kittisak; Kitkhuandee, Amnat

    2017-11-01

    Cryptococcal meningitis may cause long-term morbidity and require a permanent cerebrospinal fluid shunt. This study aimed to evaluate the risk factors and create a predictive model for permanent shunt treatment in cryptococcal meningitis patients. This was a retrospective analytical study conducted at Khon Kaen University. The study period was from January 2005 to December 2015. We enrolled all adult patients diagnosed with cryptococcal meningitis. Risk factors predictive of permanent shunt treatment were analyzed by multivariate logistic regression analysis. There were 341 patients diagnosed with cryptococcal meningitis. Of those, 64 patients (18.7%) were treated with permanent shunts. There were three independent factors associated with permanent shunt treatment. The presence of hydrocephalus had the highest adjusted odds ratio at 56.77. The resulting predictive model for permanent shunt treatment (y) is (-3.85) + (4.04 × hydrocephalus) + (2.13 × initial cerebrospinal fluid (CSF) opening pressure (OP) ≥ 25 cm H2O) + (1.87 × non-human immunodeficiency virus (HIV) status). In conclusion, non-HIV status, initial CSF OP greater than or equal to 25 cm H2O, and the presence of hydrocephalus are indicators of the future necessity for permanent shunt therapy.
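    Since the reported score comes from a multivariate logistic regression, the predicted probability of needing a permanent shunt can be obtained by passing the linear predictor through a logistic transform. A minimal sketch using the coefficients quoted in the abstract (treating the logistic link as an assumption, since only the linear score is reported):

```python
import math

def shunt_probability(hydrocephalus: bool, csf_op_ge_25: bool, non_hiv: bool) -> float:
    """Predicted probability of permanent shunt treatment from the reported score:
    y = -3.85 + 4.04*hydrocephalus + 2.13*(CSF OP >= 25 cm H2O) + 1.87*(non-HIV)."""
    y = -3.85 + 4.04 * hydrocephalus + 2.13 * csf_op_ge_25 + 1.87 * non_hiv
    return 1.0 / (1.0 + math.exp(-y))   # logistic transform (assumed link)

# Example: non-HIV patient with hydrocephalus and high opening pressure
print(f"{shunt_probability(True, True, True):.2f}")   # ~0.99
```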

  10. Predictive modeling of gingivitis severity and susceptibility via oral microbiota.

    Science.gov (United States)

    Huang, Shi; Li, Rui; Zeng, Xiaowei; He, Tao; Zhao, Helen; Chang, Alice; Bo, Cunpei; Chen, Jie; Yang, Fang; Knight, Rob; Liu, Jiquan; Davis, Catherine; Xu, Jian

    2014-09-01

    Predictive modeling of human disease based on the microbiota holds great potential yet remains challenging. Here, 50 adults underwent controlled transitions from naturally occurring gingivitis, to healthy gingivae (baseline), and to experimental gingivitis (EG). In diseased plaque microbiota, 27 bacterial genera changed in relative abundance and functional genes including 33 flagellar biosynthesis-related groups were enriched. Plaque microbiota structure exhibited a continuous gradient along the first principal component, reflecting transition from healthy to diseased states, which correlated with Mazza Gingival Index. We identified two host types with distinct gingivitis sensitivity. Our proposed microbial indices of gingivitis classified host types with 74% reliability, and, when tested on another 41-member cohort, distinguished healthy from diseased individuals with 95% accuracy. Furthermore, the state of the microbiota in naturally occurring gingivitis predicted the microbiota state and severity of subsequent EG (but not the state of the microbiota during the healthy baseline period). Because the effect of disease is greater than interpersonal variation in plaque, in contrast to the gut, plaque microbiota may provide advantages in predictive modeling of oral diseases.

  11. Combining GPS measurements and IRI model predictions

    International Nuclear Information System (INIS)

    Hernandez-Pajares, M.; Juan, J.M.; Sanz, J.; Bilitza, D.

    2002-01-01

    The free electrons distributed in the ionosphere (between one hundred and thousands of km in height) produce a frequency-dependent effect on Global Positioning System (GPS) signals: a delay in the pseudorange and an advance in the carrier phase. These effects are proportional to the columnar electron density between the satellite and receiver, i.e. the integrated electron density along the ray path. Global ionospheric TEC (total electron content) maps can be obtained with GPS data from a network of ground IGS (International GPS Service) reference stations with an accuracy of a few TEC units. The comparison with TOPEX TEC, mainly measured over the oceans far from the IGS stations, shows a mean bias and standard deviation of about 2 and 5 TECUs respectively. The discrepancies between the STEC predictions and the observed values show an RMS typically below 5 TECUs (which also includes the alignment code noise). The existence of a growing database of 2-hourly global TEC maps with a resolution of 5x2.5 degrees in longitude and latitude can be used to improve the IRI prediction capability for TEC. When the IRI predictions and the GPS estimations are compared for a three-month period around the solar maximum, they are in good agreement for middle latitudes. An overestimation of IRI TEC has been found at the extreme latitudes, the IRI predictions being typically two times higher than the GPS estimations. Finally, local fits of the IRI model can be done by tuning the SSN from STEC GPS observations.
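    The frequency-dependent effect mentioned here follows the standard first-order ionospheric relation, delay ≈ 40.3·TEC/f² (metres, with TEC in electrons/m²). A small sketch converting a slant TEC value into the group delay on the two GPS carriers; the 50 TECU value is illustrative only:

```python
# First-order ionospheric group delay: d = 40.3 * TEC / f**2  (metres),
# with TEC in electrons/m^2 and f in Hz. 1 TECU = 1e16 electrons/m^2.
TECU = 1.0e16
F_L1 = 1575.42e6   # GPS L1 carrier frequency, Hz
F_L2 = 1227.60e6   # GPS L2 carrier frequency, Hz

def group_delay_m(stec_tecu: float, freq_hz: float) -> float:
    """Pseudorange delay in metres for a given slant TEC (in TEC units)."""
    return 40.3 * stec_tecu * TECU / freq_hz**2

stec = 50.0   # illustrative slant TEC in TEC units
print(f"L1 delay: {group_delay_m(stec, F_L1):.2f} m")   # ~8.1 m
print(f"L2 delay: {group_delay_m(stec, F_L2):.2f} m")   # ~13.4 m
```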

  12. Mathematical models for indoor radon prediction

    International Nuclear Information System (INIS)

    Malanca, A.; Pessina, V.; Dallara, G.

    1995-01-01

    It is known that the indoor radon (Rn) concentration can be predicted by means of mathematical models. The simplest model relies on two variables only: the Rn source strength and the air exchange rate. In the Lawrence Berkeley Laboratory (LBL) model several environmental parameters are combined into a complex equation; in addition, a correlation between the ventilation rate and the Rn entry rate from the soil is assumed. The measurements were carried out using activated carbon canisters. Seventy-five measurements of Rn concentration were made inside two rooms on the second floor of a building block. One of the rooms had a single-glazed window whereas the other room had a double-pane window. During three different experimental protocols, the mean Rn concentration was always higher in the room with the double-glazed window. That behavior can be accounted for by the simplest model. A further set of 450 Rn measurements was collected inside a ground-floor room with a grounding well in it. This trend may be accounted for by the LBL model
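    The "simplest model" with two variables corresponds to a steady-state mass balance in which the indoor concentration is the radon entry rate per unit volume divided by the air-exchange rate. A minimal sketch with illustrative numbers (the assumption that double glazing roughly halves the air-exchange rate is made up for the example):

```python
def indoor_radon_steady_state(entry_rate_bq_per_h: float,
                              room_volume_m3: float,
                              air_exchange_per_h: float) -> float:
    """Steady-state indoor radon concentration (Bq/m3) from the two-variable
    mass-balance model C = S / (V * lambda_v). Radioactive decay is neglected
    because typical ventilation rates far exceed the Rn decay constant."""
    return entry_rate_bq_per_h / (room_volume_m3 * air_exchange_per_h)

# Same source strength, single vs double glazing (illustrative values):
print(indoor_radon_steady_state(2000.0, 50.0, 1.0))   # ~40 Bq/m3
print(indoor_radon_steady_state(2000.0, 50.0, 0.5))   # ~80 Bq/m3, tighter room
```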

  13. A Predictive Maintenance Model for Railway Tracks

    DEFF Research Database (Denmark)

    Li, Rui; Wen, Min; Salling, Kim Bang

    2015-01-01

    This paper presents a mathematical model based on Mixed Integer Programming (MIP) which is designed to optimize the predictive railway tamping activities for ballasted track for a time horizon of up to four years. The objective function is set up to minimize the actual costs for the tamping machine (measured by time......). Five technical and economic aspects are taken into account to schedule tamping: (1) track degradation of the standard deviation of the longitudinal level over time; (2) track geometrical alignment; (3) track quality thresholds based on the train speed limits; (4) the dependency of the track quality...... recovery on the track quality after the tamping operation and (5) tamping machine operation factors. A Danish railway track between Odense and Fredericia, 57.2 km in length, is used in the proposed maintenance model for a time period of two to four years. The total cost can be reduced by up to 50...

  14. An Operational Model for the Prediction of Jet Blast

    Science.gov (United States)

    2012-01-09

    This paper presents an operational model for the prediction of jet blast. The model was developed based upon three modules including a jet exhaust model, jet centerline decay model and aircraft motion model. The final analysis was compared with d...

  15. Continuous-Discrete Time Prediction-Error Identification Relevant for Linear Model Predictive Control

    DEFF Research Database (Denmark)

    Jørgensen, John Bagterp; Jørgensen, Sten Bay

    2007-01-01

    A prediction-error method tailored for model-based predictive control is presented. The prediction-error methods studied are based on predictions using the Kalman filter and Kalman predictors for a linear discrete-time stochastic state space model. The linear discrete-time stochastic state space...... model is realized from a continuous-discrete-time linear stochastic system specified using transfer functions with time-delays. It is argued that the prediction-error criterion should be selected such that it is compatible with the objective function of the predictive controller in which the model...

  16. Predictive modeling: potential application in prevention services.

    Science.gov (United States)

    Wilson, Moira L; Tumen, Sarah; Ota, Rissa; Simmers, Anthony G

    2015-05-01

    In 2012, the New Zealand Government announced a proposal to introduce predictive risk models (PRMs) to help professionals identify and assess children at risk of abuse or neglect as part of a preventive early intervention strategy, subject to further feasibility study and trialing. The purpose of this study is to examine technical feasibility and predictive validity of the proposal, focusing on a PRM that would draw on population-wide linked administrative data to identify newborn children who are at high priority for intensive preventive services. Data analysis was conducted in 2013 based on data collected in 2000-2012. A PRM was developed using data for children born in 2010 and externally validated for children born in 2007, examining outcomes to age 5 years. Performance of the PRM in predicting administratively recorded substantiations of maltreatment was good compared to the performance of other tools reviewed in the literature, both overall, and for indigenous Māori children. Some, but not all, of the children who go on to have recorded substantiations of maltreatment could be identified early using PRMs. PRMs should be considered as a potential complement to, rather than a replacement for, professional judgment. Trials are needed to establish whether risks can be mitigated and PRMs can make a positive contribution to frontline practice, engagement in preventive services, and outcomes for children. Deciding whether to proceed to trial requires balancing a range of considerations, including ethical and privacy risks and the risk of compounding surveillance bias. Crown Copyright © 2015. Published by Elsevier Inc. All rights reserved.

  17. Heuristic Modeling for TRMM Lifetime Predictions

    Science.gov (United States)

    Jordan, P. S.; Sharer, P. J.; DeFazio, R. L.

    1996-01-01

    Analysis time for computing the expected mission lifetimes of proposed frequently maneuvering, tightly altitude-constrained, Earth-orbiting spacecraft has been significantly reduced by means of a heuristic modeling method implemented in a commercial off-the-shelf spreadsheet product (QuattroPro) running on a personal computer (PC). The method uses a look-up table to estimate the maneuver frequency per month as a function of the spacecraft ballistic coefficient and the solar flux index, then computes the associated fuel use with a simple engine model. Maneuver frequency data points are produced by means of a single 1-month run of traditional mission analysis software for each of the 12 to 25 data points required for the table. As the data point computations are required only at mission design start-up and on the occasion of significant mission redesigns, the dependence on time-consuming traditional modeling methods is dramatically reduced. Results to date have agreed with traditional methods to within 1 to 1.5 percent. The spreadsheet approach is applicable to a wide variety of Earth-orbiting spacecraft with tight altitude constraints. It will be particularly useful for missions such as the Tropical Rainfall Measurement Mission scheduled for launch in 1997, whose mission lifetime calculations are heavily dependent on frequently revised solar flux predictions.
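    The heuristic described here amounts to a two-way look-up table followed by a simple fuel-use calculation. The sketch below shows the mechanics with entirely made-up table values and engine parameters; the real table entries would come from the 1-month mission-analysis runs mentioned in the abstract.

```python
import numpy as np

# Hypothetical look-up table: maneuvers per month indexed by
# ballistic coefficient (rows) and solar flux index F10.7 (columns).
BALLISTIC_COEFF = np.array([50.0, 100.0, 150.0])      # kg/m^2 (illustrative)
SOLAR_FLUX = np.array([70.0, 150.0, 230.0])           # sfu (illustrative)
MANEUVERS_PER_MONTH = np.array([[2.0, 5.0, 9.0],
                                [1.0, 3.0, 6.0],
                                [0.5, 2.0, 4.0]])      # made-up values

def maneuver_frequency(bc: float, flux: float) -> float:
    """Bilinear interpolation into the maneuver-frequency look-up table."""
    col = np.array([np.interp(flux, SOLAR_FLUX, row) for row in MANEUVERS_PER_MONTH])
    return float(np.interp(bc, BALLISTIC_COEFF, col))

def fuel_per_month(bc: float, flux: float, dv_per_maneuver=0.2, mass=3500.0, isp=220.0):
    """Fuel use (kg/month) from a simple rocket-equation engine model (all inputs illustrative)."""
    g0 = 9.80665
    n = maneuver_frequency(bc, flux)
    return n * mass * (1.0 - np.exp(-dv_per_maneuver / (isp * g0)))

print(f"{fuel_per_month(100.0, 180.0):.2f} kg/month")
```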

  18. A Computational Model for Predicting Gas Breakdown

    Science.gov (United States)

    Gill, Zachary

    2017-10-01

    Pulsed-inductive discharges are a common method of producing a plasma. They provide a mechanism for quickly and efficiently generating a large volume of plasma for rapid use and are seen in applications including propulsion, fusion power, and high-power lasers. However, some common designs see a delayed response time due to the plasma forming when the magnitude of the magnetic field in the thruster is at a minimum. New designs are difficult to evaluate due to the amount of time needed to construct a new geometry and the high monetary cost of changing the power generation circuit. To more quickly evaluate new designs and better understand the shortcomings of existing designs, a computational model is developed. This model uses a modified single-electron model as the basis for a Mathematica code to determine how the energy distribution in a system changes with regard to time and location. By analyzing this energy distribution, the approximate time and location of initial plasma breakdown can be predicted. The results from this code are then compared to existing data to show its validity and shortcomings. Missouri S&T APLab.

  19. Distributed model predictive control made easy

    CERN Document Server

    Negenborn, Rudy

    2014-01-01

    The rapid evolution of computer science, communication, and information technology has enabled the application of control techniques to systems beyond the possibilities of control theory just a decade ago. Critical infrastructures such as electricity, water, traffic and intermodal transport networks are now in the scope of control engineers. The sheer size of such large-scale systems requires the adoption of advanced distributed control approaches. Distributed model predictive control (MPC) is one of the promising control methodologies for the control of such systems. This book provides a state-of-the-art overview of distributed MPC approaches, while at the same time making clear which directions of research deserve more attention. The core and rationale of 35 approaches are carefully explained. Moreover, detailed step-by-step algorithmic descriptions of each approach are provided. These features make the book a comprehensive guide both for those seeking an introduction to distributed MPC as well as for those ...

  20. Which method predicts recidivism best?: A comparison of statistical, machine learning, and data mining predictive models

    OpenAIRE

    Tollenaar, N.; van der Heijden, P.G.M.

    2012-01-01

    Using criminal population conviction histories of recent offenders, prediction models are developed that predict three types of criminal recidivism: general recidivism, violent recidivism and sexual recidivism. The research question is whether prediction techniques from modern statistics, data mining and machine learning provide an improvement in predictive performance over classical statistical methods, namely logistic regression and linear discriminant analysis. These models are compared ...

  1. Greater body mass independently predicts less radiographic progression on X-ray and MRI over 1-2 years

    DEFF Research Database (Denmark)

    Baker, Joshua F; Østergaard, Mikkel; George, Michael

    2014-01-01

    INTRODUCTION: Greater body mass index (BMI) has been associated with less radiographic progression in rheumatoid arthritis (RA). We evaluated the association between BMI and joint damage progression as measured by X-ray and MRI. METHODS: 1068 subjects with RA from two clinical trials of golimumab...

  2. Fuzzy predictive filtering in nonlinear economic model predictive control for demand response

    DEFF Research Database (Denmark)

    Santos, Rui Mirra; Zong, Yi; Sousa, Joao M. C.

    2016-01-01

    The performance of a model predictive controller (MPC) is highly correlated with the model's accuracy. This paper introduces an economic model predictive control (EMPC) scheme based on a nonlinear model, which uses a branch-and-bound tree search for solving the inherent non-convex optimization...

  3. Impact of Lesion Visibility on Transrectal Ultrasound on the Prediction of Clinically Significant Prostate Cancer (Gleason Score 3 + 4 or Greater) with Transrectal Ultrasound-Magnetic Resonance Imaging Fusion Biopsy.

    Science.gov (United States)

    Garcia-Reyes, Kirema; Nguyen, Hao G; Zagoria, Ronald J; Shinohara, Katsuto; Carroll, Peter R; Behr, Spencer C; Westphalen, Antonio C

    2017-09-20

    The purpose of this study was to estimate the impact of lesion visibility with transrectal ultrasound on the prediction of clinically significant prostate cancer with transrectal ultrasound-magnetic resonance imaging fusion biopsy. This HIPAA (Health Insurance Portability and Accountability Act) compliant, institutional review board approved, retrospective study was performed in 178 men (mean age 64.7 years, mean prostate specific antigen 8.9 ng/ml) who underwent transrectal ultrasound-magnetic resonance imaging fusion biopsy from January 2013 to September 2016. Visible lesions on magnetic resonance imaging were assigned a PI-RADS™ (Prostate Imaging Reporting and Data System), version 2 score of 3 or greater. Transrectal ultrasound was positive when a hypoechoic lesion was identified. We used a 3-level, mixed effects logistic regression model to determine how transrectal ultrasound-magnetic resonance imaging concordance predicted the presence of clinically significant prostate cancer. The diagnostic performance of the 2 methods was estimated using ROC curves. A total of 1,331 sextants were targeted by transrectal ultrasound-magnetic resonance imaging fusion or systematic biopsies, of which 1,037 were negative, 183 were Gleason score 3 + 3 and 111 were Gleason score 3 + 4 or greater. Clinically significant prostate cancer was diagnosed by transrectal ultrasound and magnetic resonance imaging alone at 20.5% and 19.7% of these locations, respectively. Men with positive imaging had higher odds of clinically significant prostate cancer than men without visible lesions regardless of modality (transrectal ultrasound OR 14.75, 95% CI 5.22-41.69, magnetic resonance imaging OR 12.27, 95% CI 6.39-23.58 and the 2 modalities OR 28.68, 95% CI 14.45-56.89, all p magnetic resonance imaging alone (0.83, 95% CI 0.79-0.87, p = 0.04). The sensitivity and specificity of transrectal ultrasound were 42.3% and 91.6%, and the sensitivity and specificity of magnetic resonance imaging

  4. Performance assessment of the Greater Confinement Disposal facility on the Nevada Test Site: Comparing the performance of two conceptual site models

    International Nuclear Information System (INIS)

    Baer, T.A.; Price, L.L.; Gallegos, D.P.

    1993-01-01

    A small amount of transuranic (TRU) waste has been disposed of at the Greater Confinement Disposal (GCD) site located on the Nevada Test Site's (NTS) Radioactive Waste Management Site (RWMS). The waste has been buried in several deep (37 m) boreholes dug into the floor of an alluvial basin. For the waste to remain in its current configuration, the DOE must demonstrate compliance of the site with the TRU disposal requirements, 40 CFR 191. Sandia's approach to process modelling in performance assessment is to use demonstrably conservative models of the site. Choosing the most conservative model, however, can be uncertain. As an example, diffusion of contaminants upward from the buried waste in the vadose zone water is the primary mechanism of release. This process can be modelled as straight upward planar diffusion or as spherical diffusion in all directions. The former has high fluxes but small release areas; the latter has lower fluxes but is spread over a greater area. We have developed analytic solutions to a simple test problem for both models and compared the total integrated discharges. The spherical diffusion conceptual model results in at least five times greater release to the accessible environment than the planar model at all diffusivities. Modifying the planar model to allow for a larger release, however, compensated for the smaller original planar discharge and resulted in a new planar model that was more conservative than the spherical model except at low diffusivities.

  5. Exploring the processes of generating LOD (0-2) CityGML models in greater municipality of Istanbul

    NARCIS (Netherlands)

    Buyukaslih, I.; Isikdag, U.; Zlatanova, S.

    2013-01-01

    3D models of cities, visualised and explored in 3D virtual environments, have been available for several years. Currently a large number of impressive, realistic 3D models are regularly presented at scientific, professional and commercial events. One of the most promising developments is OGC

  6. Implante Autólogo Ovariano no Omento Maior: Estudo Experimental Ovarian Autotransplantation to the Greater Omentum: Experimental Model

    Directory of Open Access Journals (Sweden)

    Luiz Ronaldo Alberti

    2002-01-01

    Full Text Available Objectives: to evaluate the morphofunctional aspects of ovaries implanted into the greater omentum, as well as the best technique for implanting the ovary: intact or sliced. Methods: 40 female Wistar rats with normal estrous cycles were randomly divided into four groups: Group I (n = 5), control - laparotomy; Group II (n = 5), total bilateral oophorectomy; Group III (n = 15), intact autologous implant in the greater omentum; and Group IV (n = 15), sliced autologous implant in the greater omentum. Vaginal smears were performed in the 3rd and 6th postoperative months, together with histological studies of the ovarian implants, evaluating: degeneration, fibrosis, inflammatory reaction, angiogenesis, follicular cysts, follicular development and corpora lutea. Results: the animals in Group I cycled normally. The rats in Group II did not cycle, remaining in diestrus. In Group III, 11 rats remained in diestrus, three showed incomplete cycles and only one cycled normally. In Group IV, three animals did not cycle, eight had incomplete vaginal smears and four cycled normally. The histological findings for the animals in Group III showed normal histoarchitecture in ten rats, whereas in the other five there was ovarian degeneration. In Group IV, 14 rats had ovaries with preserved histoarchitecture and only one showed signs of degeneration. Conclusions: autologous ovarian implantation into the greater omentum was viable, with better morphofunctional preservation obtained by implanting slices. Purpose: in order to maintain gonadal function after oophorectomy, the morphofunctional aspects of ovarian autotransplantation to the greater omentum and the best kind of implantation, intact or sliced, were investigated. Methods: forty cycling female Wistar rats were randomly divided into four groups: Group I (n = 5), control - laparotomy; Group II (n = 5), bilateral oophorectomy; Group III (n = 10), intact ovarian

  7. Model for predicting mountain wave field uncertainties

    Science.gov (United States)

    Damiens, Florentin; Lott, François; Millet, Christophe; Plougonven, Riwal

    2017-04-01

    Studying the propagation of acoustic waves through the troposphere requires knowledge of wind speed and temperature gradients from the ground up to about 10-20 km. Typical planetary boundary layer flows are known to present vertical low-level shears that can interact with mountain waves, thereby triggering small-scale disturbances. Resolving these fluctuations for long-range propagation problems is, however, not feasible because of computer memory/time restrictions and thus, they need to be parameterized. When the disturbances are small enough, these fluctuations can be described by linear equations. Previous works by co-authors have shown that the critical layer dynamics that occur near the ground produce large horizontal flows and buoyancy disturbances that result in intense downslope winds and gravity wave breaking. While these phenomena manifest almost systematically for high Richardson numbers and when the boundary layer depth is relatively small compared to the mountain height, the process by which static stability affects downslope winds remains unclear. In the present work, new linear mountain gravity wave solutions are tested against numerical predictions obtained with the Weather Research and Forecasting (WRF) model. For Richardson numbers typically larger than unity, the mesoscale model is used to quantify the effect of neglected nonlinear terms on downslope winds and mountain wave patterns. At these regimes, the large downslope winds transport warm air, a so-called "Foehn" effect that can impact sound propagation properties. The sensitivity of small-scale disturbances to the Richardson number is quantified using two-dimensional spectral analysis. It is shown through a pilot study of subgrid-scale fluctuations of boundary layer flows over realistic mountains that the cross-spectrum of the mountain wave field is made up of the same components found in WRF simulations. The impact of each individual component on acoustic wave propagation is discussed in terms of

  8. Older Women, Deeper Learning, and Greater Satisfaction at University: Age and Gender Predict University Students' Learning Approach and Degree Satisfaction

    Science.gov (United States)

    Rubin, Mark; Scevak, Jill; Southgate, Erica; Macqueen, Suzanne; Williams, Paul; Douglas, Heather

    2018-01-01

    The present study explored the interactive effect of age and gender in predicting surface and deep learning approaches. It also investigated how these variables related to degree satisfaction. Participants were 983 undergraduate students at a large public Australian university. They completed a research survey either online or on paper. Consistent…

  9. Model Predictive Control for an Industrial SAG Mill

    DEFF Research Database (Denmark)

    Ohan, Valeriu; Steinke, Florian; Metzger, Michael

    2012-01-01

    We discuss Model Predictive Control (MPC) based on ARX models and a simple lower-order disturbance model. The advantage of this MPC formulation is that it has few tuning parameters and is based on an ARX prediction model that can readily be identified using standard technologies from system identic...

  10. Uncertainties in spatially aggregated predictions from a logistic regression model

    NARCIS (Netherlands)

    Horssen, P.W. van; Pebesma, E.J.; Schot, P.P.

    2002-01-01

    This paper presents a method to assess the uncertainty of an ecological spatial prediction model which is based on logistic regression models, using data from the interpolation of explanatory predictor variables. The spatial predictions are presented as approximate 95% prediction intervals. The

  11. Dealing with missing predictor values when applying clinical prediction models.

    NARCIS (Netherlands)

    Janssen, K.J.; Vergouwe, Y.; Donders, A.R.T.; Harrell Jr, F.E.; Chen, Q.; Grobbee, D.E.; Moons, K.G.

    2009-01-01

    BACKGROUND: Prediction models combine patient characteristics and test results to predict the presence of a disease or the occurrence of an event in the future. In the event that test results (predictor) are unavailable, a strategy is needed to help users applying a prediction model to deal with

  12. Neural Fuzzy Inference System-Based Weather Prediction Model and Its Precipitation Predicting Experiment

    Directory of Open Access Journals (Sweden)

    Jing Lu

    2014-11-01

    Full Text Available We propose a weather prediction model in this article based on neural network and fuzzy inference system (NFIS-WPM), and then apply it to predict daily fuzzy precipitation given meteorological premises for testing. The model consists of two parts: the first part is the “fuzzy rule-based neural network”, which simulates sequential relations among fuzzy sets using artificial neural network; and the second part is the “neural fuzzy inference system”, which is based on the first part, but could learn new fuzzy rules from the previous ones according to the algorithm we proposed. NFIS-WPM (High Pro) and NFIS-WPM (Ave) are improved versions of this model. It is well known that the need for accurate weather prediction is apparent when considering the benefits. However, the excessive pursuit of accuracy in weather prediction makes some of the “accurate” prediction results meaningless and the numerical prediction model is often complex and time-consuming. By adapting this novel model to a precipitation prediction problem, we make the predicted outcomes of precipitation more accurate and the prediction methods simpler than by using the complex numerical forecasting model that would occupy large computation resources, be time-consuming and which has a low predictive accuracy rate. Accordingly, we achieve more accurate predictive precipitation results than by using traditional artificial neural networks that have low predictive accuracy.

  13. Foundation Settlement Prediction Based on a Novel NGM Model

    Directory of Open Access Journals (Sweden)

    Peng-Yu Chen

    2014-01-01

    Full Text Available Prediction of foundation or subgrade settlement is very important during engineering construction. According to the fact that there are lots of settlement-time sequences with a nonhomogeneous index trend, a novel grey forecasting model called the NGM (1,1,k,c) model is proposed in this paper. With an optimized whitenization differential equation, the proposed NGM (1,1,k,c) model has the property of white exponential law coincidence and can predict a pure nonhomogeneous index sequence precisely. We used two case studies to verify the predictive effect of the NGM (1,1,k,c) model for settlement prediction. The results show that this model can achieve excellent prediction accuracy; thus, the model is quite suitable for simulation and prediction of approximate nonhomogeneous index sequences and has excellent application value in settlement prediction.
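    The NGM (1,1,k,c) formulation is not given in full in the abstract. As a reference point, the classical GM(1,1) grey model that it extends can be sketched in a few lines; this is the standard grey model, not the authors' NGM variant, and the input series is illustrative.

```python
import numpy as np

def gm11_forecast(x0: np.ndarray, steps: int = 3) -> np.ndarray:
    """Classical GM(1,1) grey forecasting: fit on the positive series x0 and
    return fitted values plus `steps` out-of-sample predictions."""
    n = len(x0)
    x1 = np.cumsum(x0)                                   # accumulated generating series
    z1 = 0.5 * (x1[1:] + x1[:-1])                        # background values
    B = np.column_stack([-z1, np.ones(n - 1)])
    Y = x0[1:]
    a, b = np.linalg.lstsq(B, Y, rcond=None)[0]          # development / grey input coefficients
    k = np.arange(n + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a    # time-response function
    x0_hat = np.empty(n + steps)
    x0_hat[0] = x0[0]
    x0_hat[1:] = np.diff(x1_hat)                         # inverse accumulation
    return x0_hat

# Example: a short, roughly exponential settlement-like sequence (illustrative)
print(gm11_forecast(np.array([10.2, 11.6, 13.1, 14.9, 16.8]), steps=2))
```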

  14. Predictive capabilities of various constitutive models for arterial tissue.

    Science.gov (United States)

    Schroeder, Florian; Polzer, Stanislav; Slažanský, Martin; Man, Vojtěch; Skácel, Pavel

    2018-02-01

    The aim of this study is to validate some constitutive models by assessing their capabilities in describing and predicting the uniaxial and biaxial behavior of porcine aortic tissue. 14 samples from porcine aortas were used to perform 2 uniaxial and 5 biaxial tensile tests. Transversal strains were furthermore stored for the uniaxial data. The experimental data were fitted by four constitutive models: the Holzapfel-Gasser-Ogden model (HGO), a model based on the generalized structure tensor (GST), the Four-Fiber-Family model (FFF) and the Microfiber model. Fitting was performed to the uniaxial and biaxial data sets separately and the descriptive capabilities of the models were compared. Their predictive capabilities were assessed in two ways. Firstly each model was fitted to the biaxial data and its accuracy (in terms of R² and NRMSE) in predicting both uniaxial responses was evaluated. Then this procedure was performed conversely: each model was fitted to both uniaxial tests and its accuracy in predicting the 5 biaxial responses was observed. The descriptive capabilities of all models were excellent. In predicting the uniaxial response from biaxial data, the Microfiber model was the most accurate while the other models also showed reasonable accuracy. The Microfiber and FFF models were capable of reasonably predicting biaxial responses from uniaxial data while the HGO and GST models failed completely in this task. The HGO and GST models are not capable of predicting biaxial arterial wall behavior while the FFF model is the most robust of the investigated constitutive models. Knowledge of transversal strains in uniaxial tests improves the robustness of constitutive models. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Comparing National Water Model Inundation Predictions with Hydrodynamic Modeling

    Science.gov (United States)

    Egbert, R. J.; Shastry, A.; Aristizabal, F.; Luo, C.

    2017-12-01

    The National Water Model (NWM) simulates the hydrologic cycle and produces streamflow forecasts, runoff, and other variables for 2.7 million reaches along the National Hydrography Dataset for the continental United States. NWM applies Muskingum-Cunge channel routing, which is based on the continuity equation. However, the momentum equation also needs to be considered to obtain better estimates of streamflow and stage in rivers, especially for applications such as flood inundation mapping. The Simulation Program for River NeTworks (SPRNT) is a fully dynamic model for large-scale river networks that solves the full nonlinear Saint-Venant equations for 1D flow and stage height in river channel networks with non-uniform bathymetry. For the current work, the steady-state version of the SPRNT model was leveraged. An evaluation of SPRNT's and NWM's abilities to predict inundation was conducted for the record flood of Hurricane Matthew in October 2016 along the Neuse River in North Carolina. This event was known to have been influenced by backwater effects from the Hurricane's storm surge. Retrospective NWM discharge predictions were converted to stage using synthetic rating curves. The stages from both models were used to produce flood inundation maps using the Height Above Nearest Drainage (HAND) method, which uses the local relative heights to provide a spatial representation of inundation depths. In order to validate the inundation produced by the models, Sentinel-1A synthetic aperture radar data in the VV and VH polarizations along with auxiliary data were used to produce a reference inundation map. A preliminary, binary comparison of the inundation maps to the reference, limited to the five HUC-12 areas of Goldsboro, NC, yielded flood inundation accuracies of 74.68% for NWM and 78.37% for SPRNT. The differences for all the relevant test statistics including accuracy, true positive rate, true negative rate, and positive predictive value were found
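    The binary comparison statistics mentioned here (accuracy, true positive rate, true negative rate, positive predictive value) are straightforward to compute once the modeled and reference inundation maps are expressed as boolean rasters. A small sketch with placeholder arrays standing in for the HAND-derived and Sentinel-1 maps:

```python
import numpy as np

def binary_map_metrics(modeled: np.ndarray, reference: np.ndarray) -> dict:
    """Cell-by-cell comparison of a modeled inundation map against a
    reference map; both are boolean arrays of equal shape."""
    tp = np.sum(modeled & reference)
    tn = np.sum(~modeled & ~reference)
    fp = np.sum(modeled & ~reference)
    fn = np.sum(~modeled & reference)
    return {
        "accuracy": (tp + tn) / (tp + tn + fp + fn),
        "true_positive_rate": tp / (tp + fn),
        "true_negative_rate": tn / (tn + fp),
        "positive_predictive_value": tp / (tp + fp),
    }

# Placeholder rasters (real inputs would be the HAND-based and SAR-derived maps)
rng = np.random.default_rng(1)
reference = rng.random((100, 100)) > 0.6
modeled = reference ^ (rng.random((100, 100)) > 0.9)   # reference plus some noise
print(binary_map_metrics(modeled, reference))
```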

  16. Predictive models for moving contact line flows

    Science.gov (United States)

    Rame, Enrique; Garoff, Stephen

    2003-01-01

    Modeling flows with moving contact lines poses the formidable challenge that the usual assumptions of a Newtonian fluid and the no-slip condition give rise to a well-known singularity. This singularity prevents one from satisfying the contact angle condition to compute the shape of the fluid-fluid interface, a crucial calculation without which design parameters such as the pressure drop needed to move an immiscible 2-fluid system through a solid matrix cannot be evaluated. Some progress has been made for low capillary number spreading flows. Combining experimental measurements of fluid-fluid interfaces very near the moving contact line with an analytical expression for the interface shape, we can determine a parameter that forms a boundary condition for the macroscopic interface shape when Ca << 1. This parameter, which plays the role of an "apparent" or macroscopic dynamic contact angle, is shown by the theory to depend on the system geometry through the macroscopic length scale. This theoretically established dependence on geometry allows this parameter to be "transferable" from the geometry of the measurement to any other geometry involving the same material system. Unfortunately this prediction of the theory cannot be tested on Earth.

  17. Developmental prediction model for early alcohol initiation in Dutch adolescents

    NARCIS (Netherlands)

    Geels, L.M.; Vink, J.M.; Beijsterveldt, C.E.M. van; Bartels, M.; Boomsma, D.I.

    2013-01-01

    Objective: Multiple factors predict early alcohol initiation in teenagers. Among these are genetic risk factors, childhood behavioral problems, life events, lifestyle, and family environment. We constructed a developmental prediction model for alcohol initiation below the Dutch legal drinking age

  18. Seasonal predictability of Kiremt rainfall in coupled general circulation models

    Science.gov (United States)

    Gleixner, Stephanie; Keenlyside, Noel S.; Demissie, Teferi D.; Counillon, François; Wang, Yiguo; Viste, Ellen

    2017-11-01

    The Ethiopian economy and population is strongly dependent on rainfall. Operational seasonal predictions for the main rainy season (Kiremt, June-September) are based on statistical approaches with Pacific sea surface temperatures (SST) as the main predictor. Here we analyse dynamical predictions from 11 coupled general circulation models for the Kiremt seasons from 1985-2005 with the forecasts starting from the beginning of May. We find skillful predictions from three of the 11 models, but no model beats a simple linear prediction model based on the predicted Niño3.4 indices. The skill of the individual models for dynamically predicting Kiremt rainfall depends on the strength of the teleconnection between Kiremt rainfall and concurrent Pacific SST in the models. Models that do not simulate this teleconnection fail to capture the observed relationship between Kiremt rainfall and the large-scale Walker circulation.
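    The "simple linear prediction model based on the predicted Niño3.4 indices" used as a benchmark here can be sketched as a regression of seasonal rainfall on the index, with skill measured by leave-one-out correlation over the hindcast years. The data below are synthetic, not the study's observations.

```python
import numpy as np

def loo_linear_skill(nino34: np.ndarray, rainfall: np.ndarray) -> float:
    """Leave-one-out hindcast of seasonal rainfall from the Nino3.4 index with
    a simple linear regression; skill is the correlation of predictions with
    observations."""
    n = len(rainfall)
    preds = np.empty(n)
    for i in range(n):
        idx = np.arange(n) != i
        slope, intercept = np.polyfit(nino34[idx], rainfall[idx], 1)
        preds[i] = slope * nino34[i] + intercept
    return float(np.corrcoef(preds, rainfall)[0, 1])

# Synthetic 21-year hindcast (1985-2005 style): rainfall anti-correlated with Nino3.4
rng = np.random.default_rng(2)
nino34 = rng.normal(size=21)
rainfall = -0.7 * nino34 + rng.normal(0.0, 0.7, 21)
print(f"LOO correlation skill: {loo_linear_skill(nino34, rainfall):.2f}")
```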

  19. MODELLING OF DYNAMIC SPEED LIMITS USING THE MODEL PREDICTIVE CONTROL

    Directory of Open Access Journals (Sweden)

    Andrey Borisovich Nikolaev

    2017-09-01

    Full Text Available The article considers the issues of traffic management using the intelligent system “Car-Road” (IVHS), which consists of interacting intelligent vehicles (IV) and intelligent roadside controllers. Vehicles are organized in convoys with small distances between them. All vehicles are assumed to be fully automated (throttle control, braking, steering). Approaches are proposed for determining speed limits for vehicles on the motorway using model predictive control (MPC). The article proposes an approach to dynamic speed limits to minimize the delay of vehicles in traffic.

  20. Environmental fate and ecotoxicological risk of the antibiotic sulfamethoxazole across the Katari catchment (Bolivian Altiplano) : application of the GREAT-ER model

    OpenAIRE

    Archundia, D.; Boithias, Laurie; Duwig, Céline; Morel, M. C.; Aviles, G. F.; Martins, J. M. F.

    2018-01-01

    Antibiotics are emergent contaminants that can induce adverse effects in terrestrial and aquatic organisms. The surface water compartment is of particular concern as it receives direct waste water discharge. Modeling is highlighted as an essential tool to understand the fate and behavior of these compounds and to assess their eco-toxicological risk. This study aims at testing the ability of the GREAT-ER model in simulating sulfamethoxazole (SMX) concentrations in the surface waters of the ari...

  1. Predictability in models of the atmospheric circulation

    NARCIS (Netherlands)

    Houtekamer, P.L.

    1992-01-01

    It will be clear from the above discussions that skill forecasts are still in their infancy. Operational skill predictions do not exist. One is still struggling to prove that skill predictions, at any range, have any quality at all. It is not clear what the statistics of the analysis error

  2. Linezolid Exerts Greater Bacterial Clearance but No Modification of Host Lung Gene Expression Profiling: A Mouse MRSA Pneumonia Model.

    Directory of Open Access Journals (Sweden)

    Jiwang Chen

    Full Text Available Linezolid (LZD) is beneficial to patients with MRSA pneumonia, but whether and how LZD influences global host lung immune responses at the mRNA level during MRSA-mediated pneumonia is still unknown. A lethal mouse model of MRSA pneumonia mediated by USA300 was employed to study the influence of LZD on survival, while the sublethal mouse model was used to examine the effect of LZD on bacterial clearance and lung gene expression during MRSA pneumonia. LZD (100 mg/kg/day, IP) was given to C57Bl6 mice for three days. On Day 1 and Day 3 post infection, bronchoalveolar lavage fluid (BALF) protein concentration and levels of cytokines including IL6, TNFα, IL1β, Interferon-γ and IL17 were measured. In the sublethal model, left lungs were used to determine bacterial clearance and right lungs for whole-genome transcriptional profiling of lung immune responses. LZD therapy significantly improved survival and bacterial clearance. It also significantly decreased BALF protein concentration and levels of cytokines including IL6, IL1β, Interferon-γ and IL17. No significant gene expression changes in the mouse lungs were associated with LZD therapy. LZD is beneficial for MRSA pneumonia, but it does not modulate host lung immune responses at the transcriptional level.

  3. Models for predicting compressive strength and water absorption of ...

    African Journals Online (AJOL)

    This work presents a mathematical model for predicting the compressive strength and water absorption of laterite-quarry dust cement block using augmented Scheffe's simplex lattice design. The statistical models developed can predict the mix proportion that will yield the desired property. The models were tested for lack of ...

  4. Regression models for predicting anthropometric measurements of ...

    African Journals Online (AJOL)

    measure anthropometric dimensions to predict difficult-to-measure dimensions required for ergonomic design of school furniture. A total of 143 students aged between 16 and 18 years from eight public secondary schools in Ogbomoso, Nigeria ...

  5. FINITE ELEMENT MODEL FOR PREDICTING RESIDUAL ...

    African Journals Online (AJOL)

    direction (σx) had a maximum value of 375 MPa (tensile) and a minimum value of ... These results show that the residual stresses predicted by the finite element method are in fair agreement with the experimental results.

  6. Probabilistic Modeling and Visualization for Bankruptcy Prediction

    DEFF Research Database (Denmark)

    Antunes, Francisco; Ribeiro, Bernardete; Pereira, Francisco Camara

    2017-01-01

    In the accounting and finance domains, bankruptcy prediction is of great utility for all of the economic stakeholders. The challenge of accurately assessing business failure prediction, especially under scenarios of financial crisis, is known to be complicated. Although there have been many successful......). Using real-world bankruptcy data, an in-depth analysis is conducted showing that, in addition to a probabilistic interpretation, the GP can effectively improve the bankruptcy prediction performance with high accuracy when compared to the other approaches. We additionally generate a complete graphical...... visualization to improve our understanding of the different attained performances, effectively compiling all the conducted experiments in a meaningful way. We complete our study with an entropy-based analysis that highlights the uncertainty handling properties provided by the GP, crucial for prediction tasks...

  7. Prediction for Major Adverse Outcomes in Cardiac Surgery: Comparison of Three Prediction Models

    Directory of Open Access Journals (Sweden)

    Cheng-Hung Hsieh

    2007-09-01

    Conclusion: The Parsonnet score performed as well as the logistic regression models in predicting major adverse outcomes. The Parsonnet score appears to be a very suitable model for clinicians to use in risk stratification of cardiac surgery.

  8. From Predictive Models to Instructional Policies

    Science.gov (United States)

    Rollinson, Joseph; Brunskill, Emma

    2015-01-01

    At their core, Intelligent Tutoring Systems consist of a student model and a policy. The student model captures the state of the student and the policy uses the student model to individualize instruction. Policies require different properties from the student model. For example, a mastery threshold policy requires the student model to have a way…

  9. Comparisons of Faulting-Based Pavement Performance Prediction Models

    Directory of Open Access Journals (Sweden)

    Weina Wang

    2017-01-01

    Full Text Available Faulting prediction is at the core of concrete pavement maintenance and design. Highway agencies are always faced with the problem of low prediction accuracy, which causes costly maintenance. Although many researchers have developed performance prediction models, the accuracy of prediction has remained a challenge. This paper reviews performance prediction models and JPCP faulting models that have been used in past research. Then three models including a multivariate nonlinear regression (MNLR) model, an artificial neural network (ANN) model, and a Markov Chain (MC) model are tested and compared using a set of actual pavement survey data taken on an interstate highway with varying design features, traffic, and climate data. It is found that the MNLR model needs further recalibration, while the ANN model needs more data for training the network. The MC model seems a good tool for pavement performance prediction when data are limited, but it is based on visual inspections and not explicitly related to quantitative physical parameters. This paper then suggests that the further direction for developing the performance prediction model is to incorporate the advantages and disadvantages of different models to obtain better accuracy.
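    Of the three models compared, the Markov Chain approach is the easiest to sketch: a distribution over pavement condition states is propagated year by year through a transition probability matrix. The matrix and state definitions below are illustrative, not calibrated to the paper's survey data.

```python
import numpy as np

# Illustrative annual transition matrix over five condition states
# (1 = best ... 5 = worst); rows sum to 1. Not calibrated data.
P = np.array([
    [0.80, 0.20, 0.00, 0.00, 0.00],
    [0.00, 0.75, 0.25, 0.00, 0.00],
    [0.00, 0.00, 0.70, 0.30, 0.00],
    [0.00, 0.00, 0.00, 0.65, 0.35],
    [0.00, 0.00, 0.00, 0.00, 1.00],
])

def predict_condition(initial: np.ndarray, years: int) -> np.ndarray:
    """Distribution over condition states after `years` of deterioration."""
    return initial @ np.linalg.matrix_power(P, years)

new_pavement = np.array([1.0, 0.0, 0.0, 0.0, 0.0])   # all sections start in state 1
print(np.round(predict_condition(new_pavement, 10), 3))
```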

  10. A model to predict the beginning of the pollen season

    DEFF Research Database (Denmark)

    Toldam-Andersen, Torben Bo

    1991-01-01

    In order to predict the beginning of the pollen season, a model comprising the Utah phenoclimatography Chill Unit (CU) and ASYMCUR-Growing Degree Hour (GDH) submodels was used to predict the first bloom in Alnus, Ulmus and Betula. The model relates environmental temperatures to rest completion...... and bud development. Fourteen years of pollen counts were used as the phenologic parameter. The observed dates for the beginning of the pollen seasons were defined from the pollen counts and compared with the model prediction. The CU and GDH submodels were used as: 1. A fixed day model, using only the GDH model...... for fruit trees are generally applicable, and give a reasonable description of the growth processes of other trees. This type of model can therefore be of value in predicting the start of the pollen season. The predicted dates were generally within 3-5 days of the observed. Finally the possibility of frost...

  11. Risk prediction model: Statistical and artificial neural network approach

    Science.gov (United States)

    Paiman, Nuur Azreen; Hariri, Azian; Masood, Ibrahim

    2017-04-01

    Prediction models are increasingly gaining popularity and have been used in numerous areas of study to complement and support clinical reasoning and decision making. The adoption of such models assists physicians' decision making and individuals' behavior, and consequently improves individual outcomes and the cost-effectiveness of care. The objective of this paper is to review articles related to risk prediction models in order to understand the suitable approach, development and validation process of a risk prediction model. A qualitative review of the aims, methods and significant main outcomes of nineteen published articles that developed risk prediction models in numerous fields was done. This paper also reviews how researchers develop and validate risk prediction models based on statistical and artificial neural network approaches. From the review, some methodological recommendations on developing and validating prediction models were highlighted. According to the studies reviewed, the artificial neural network approach to developing prediction models was more accurate than the statistical approach. However, only limited published literature currently discusses which approach is more accurate for risk prediction model development.

  12. Quantitative analysis of prediction models for hot cracking in ...

    Indian Academy of Sciences (India)

    A Rodríguez-Prieto

    2017-11-16

    ... enhancing safety margins and adding greater precision to quantitative accident prediction [45]. One deterministic methodology is the stringency level (SL) approach, which is recognized as a valuable decision tool in the selection of standardized materials specifications to prevent potential failures [3].

  13. Comparative Evaluation of Some Crop Yield Prediction Models ...

    African Journals Online (AJOL)

    A computer program was adopted from the work of Hill et al. (1982) to calibrate and test three of the existing yield prediction models using tropical cowpea yield–weather data. The models tested were the Hanks Model (first and second versions), the Stewart Model (first and second versions) and the Hall–Butcher Model. Three sets of ...

  14. Comparative Evaluation of Some Crop Yield Prediction Models ...

    African Journals Online (AJOL)

    (1982) to calibrate and test three of the existing yield prediction models using tropical cowpea yield–weather data. The models tested were the Hanks Model (first and second versions), the Stewart Model (first and second versions) and the Hall–Butcher Model. Three sets of cowpea yield-water use and weather data were collected.

  15. Prediction of speech intelligibility based on an auditory preprocessing model

    DEFF Research Database (Denmark)

    Christiansen, Claus Forup Corlin; Pedersen, Michael Syskind; Dau, Torsten

    2010-01-01

    Classical speech intelligibility models, such as the speech transmission index (STI) and the speech intelligibility index (SII) are based on calculations on the physical acoustic signals. The present study predicts speech intelligibility by combining a psychoacoustically validated model of auditory...

  16. Modelling microbial interactions and food structure in predictive microbiology

    NARCIS (Netherlands)

    Malakar, P.K.

    2002-01-01

    Keywords: modelling, dynamic models, microbial interactions, diffusion, microgradients, colony growth, predictive microbiology.

    Growth response of microorganisms in foods is a complex process. Innovations in food production and preservation techniques have resulted in adoption of

  17. Ocean wave prediction using numerical and neural network models

    Digital Repository Service at National Institute of Oceanography (India)

    Mandal, S.; Prabaharan, N.

    This paper presents an overview of the development of the numerical wave prediction models and recently used neural networks for ocean wave hindcasting and forecasting. The numerical wave models express the physical concepts of the phenomena...

  18. A Prediction Model of the Capillary Pressure J-Function.

    Directory of Open Access Journals (Sweden)

    W S Xu

    Full Text Available The capillary pressure J-function is a dimensionless measure of the capillary pressure of a fluid in a porous medium. The function was derived based on a capillary bundle model. However, the dependence of the J-function on the saturation Sw is not well understood. A prediction model for it is presented based on a capillary pressure model, and the J-function prediction model is a power function instead of an exponential or polynomial function. Relative permeability is calculated with the J-function prediction model, resulting in an easier calculation and results that are more representative.
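
    For reference, the Leverett J-function is conventionally defined as below; the second expression is only a sketch of the power-law saturation dependence the abstract describes, with a and b standing in for fitted coefficients (the actual fitted form and coefficients are not given here).

```latex
% Leverett J-function (standard definition) and an illustrative power-law fit
J(S_w) = \frac{P_c(S_w)}{\sigma \cos\theta} \sqrt{\frac{k}{\phi}},
\qquad
J(S_w) \approx a \, S_w^{-b}
```

    Here P_c is the capillary pressure, sigma the interfacial tension, theta the contact angle, k the permeability and phi the porosity.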

  19. Statistical model based gender prediction for targeted NGS clinical panels

    Directory of Open Access Journals (Sweden)

    Palani Kannan Kandavel

    2017-12-01

    The reference test dataset was used to test the model. The sensitivity of gender prediction has been increased relative to the current “genotype composition in ChrX” based approach. In addition, the prediction score given by the model can be used to evaluate the quality of a clinical dataset. A higher prediction score towards the respective gender indicates higher quality of the sequenced data.

  20. comparative analysis of two mathematical models for prediction

    African Journals Online (AJOL)

    Abstract. Mathematical modeling for the prediction of the compressive strength of sandcrete blocks was performed using statistical analysis of the sandcrete block data obtained from experimental work done in this study. The models used are Scheffe's and Osadebe's optimization theories to predict the compressive strength of ...

  1. Comparison of predictive models for the early diagnosis of diabetes

    NARCIS (Netherlands)

    M. Jahani (Meysam); M. Mahdavi (Mahdi)

    2016-01-01

    textabstractObjectives: This study develops neural network models to improve the prediction of diabetes using clinical and lifestyle characteristics. Prediction models were developed using a combination of approaches and concepts. Methods: We used memetic algorithms to update weights and to improve

  2. Testing and analysis of internal hardwood log defect prediction models

    Science.gov (United States)

    R. Edward. Thomas

    2011-01-01

    The severity and location of internal defects determine the quality and value of lumber sawn from hardwood logs. Models have been developed to predict the size and position of internal defects based on external defect indicator measurements. These models were shown to predict approximately 80% of all internal knots based on external knot indicators. However, the size...

  3. Hidden Markov Model for quantitative prediction of snowfall

    Indian Academy of Sciences (India)

    A Hidden Markov Model (HMM) has been developed for prediction of quantitative snowfall in the Pir-Panjal and Great Himalayan mountain ranges of the Indian Himalaya. The model predicts snowfall two days in advance using nine daily recorded meteorological variables from the past 20 winters (1992–2012). There are six ...
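
    A minimal sketch of the kind of Gaussian-HMM workflow implied here, using the third-party hmmlearn package on synthetic stand-ins for the nine meteorological variables; the six hidden states mirror the abstract, but the data and settings are assumptions, not the authors' calibrated model.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM  # assumes hmmlearn is installed

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 9))        # placeholder: 2000 days x 9 met. variables

# Fit an HMM with six hidden weather states, as in the abstract
model = GaussianHMM(n_components=6, covariance_type="diag",
                    n_iter=200, random_state=0)
model.fit(X)

states = model.predict(X)             # most likely hidden state for each day
print("state occupancy:", np.bincount(states))
```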

  4. Bayesian variable order Markov models: Towards Bayesian predictive state representations

    NARCIS (Netherlands)

    Dimitrakakis, C.

    2009-01-01

    We present a Bayesian variable order Markov model that shares many similarities with predictive state representations. The resulting models are compact and much easier to specify and learn than classical predictive state representations. Moreover, we show that they significantly outperform a more

  5. Demonstrating the improvement of predictive maturity of a computational model

    Energy Technology Data Exchange (ETDEWEB)

    Hemez, Francois M [Los Alamos National Laboratory; Unal, Cetin [Los Alamos National Laboratory; Atamturktur, Huriye S [CLEMSON UNIV.

    2010-01-01

    We demonstrate an improvement of predictive capability brought to a non-linear material model using a combination of test data, sensitivity analysis, uncertainty quantification, and calibration. A model that captures increasingly complicated phenomena, such as plasticity, temperature and strain rate effects, is analyzed. Predictive maturity is defined, here, as the accuracy of the model to predict multiple Hopkinson bar experiments. A statistical discrepancy quantifies the systematic disagreement (bias) between measurements and predictions. Our hypothesis is that improving the predictive capability of a model should translate into better agreement between measurements and predictions. This agreement, in turn, should lead to a smaller discrepancy. We have recently proposed to use discrepancy and coverage, that is, the extent to which the physical experiments used for calibration populate the regime of applicability of the model, as basis to define a Predictive Maturity Index (PMI). It was shown that predictive maturity could be improved when additional physical tests are made available to increase coverage of the regime of applicability. This contribution illustrates how the PMI changes as 'better' physics are implemented in the model. The application is the non-linear Preston-Tonks-Wallace (PTW) strength model applied to Beryllium metal. We demonstrate that our framework tracks the evolution of maturity of the PTW model. Robustness of the PMI with respect to the selection of coefficients needed in its definition is also studied.

  6. Refining the Committee Approach and Uncertainty Prediction in Hydrological Modelling

    NARCIS (Netherlands)

    Kayastha, N.

    2014-01-01

    Due to the complexity of hydrological systems, a single model may be unable to capture the full range of a catchment response and accurately predict the streamflows. The multi-modelling approach opens up possibilities for handling such difficulties and allows improvement of the predictive capability of

  7. Refining the committee approach and uncertainty prediction in hydrological modelling

    NARCIS (Netherlands)

    Kayastha, N.

    2014-01-01

    Due to the complexity of hydrological systems, a single model may be unable to capture the full range of a catchment response and accurately predict the streamflows. The multi-modelling approach opens up possibilities for handling such difficulties and allows improvement of the predictive capability of

  8. Wind turbine control and model predictive control for uncertain systems

    DEFF Research Database (Denmark)

    Thomsen, Sven Creutz

    as disturbance models for controller design. The theoretical study deals with Model Predictive Control (MPC). MPC is an optimal control method which is characterized by the use of a receding prediction horizon. MPC has risen in popularity due to its inherent ability to systematically account for time...

  9. Hidden Markov Model for quantitative prediction of snowfall and ...

    Indian Academy of Sciences (India)

    A Hidden Markov Model (HMM) has been developed for prediction of quantitative snowfall in the Pir-Panjal and Great Himalayan mountain ranges of the Indian Himalaya. The model predicts snowfall two days in advance using nine daily recorded meteorological variables from the past 20 winters (1992–2012). There are six ...

  10. Model predictive control of a 3-DOF helicopter system using ...

    African Journals Online (AJOL)

    ... by simulation, and its performance is compared with that achieved by linear model predictive control (LMPC). Keywords: nonlinear systems, helicopter dynamics, MIMO systems, model predictive control, successive linearization. International Journal of Engineering, Science and Technology, Vol. 2, No. 10, 2010, pp. 9-19 ...

  11. Models for predicting fuel consumption in sagebrush-dominated ecosystems

    Science.gov (United States)

    Clinton S. Wright

    2013-01-01

    Fuel consumption predictions are necessary to accurately estimate or model fire effects, including pollutant emissions during wildland fires. Fuel and environmental measurements on a series of operational prescribed fires were used to develop empirical models for predicting fuel consumption in big sagebrush (Artemisia tridentata Nutt.) ecosystems....

  12. Comparative Analysis of Two Mathematical Models for Prediction of ...

    African Journals Online (AJOL)

    A mathematical modeling for prediction of compressive strength of sandcrete blocks was performed using statistical analysis for the sandcrete block data obtained from experimental work done in this study. The models used are Scheffe's and Osadebe's optimization theories to predict the compressive strength of sandcrete ...

  13. A mathematical model for predicting earthquake occurrence ...

    African Journals Online (AJOL)

    We consider the continental crust under damage. We use the observed microseism results from many seismic stations around the world to study the time series of the activities of the continental crust, with a view to predicting the possible time of occurrence of an earthquake. We consider microseism time series ...

  14. Model for predicting the injury severity score.

    Science.gov (United States)

    Hagiwara, Shuichi; Oshima, Kiyohiro; Murata, Masato; Kaneko, Minoru; Aoki, Makoto; Kanbe, Masahiko; Nakamura, Takuro; Ohyama, Yoshio; Tamura, Jun'ichi

    2015-07-01

    The aim was to determine a formula that predicts the injury severity score from parameters obtained in the emergency department on arrival. We reviewed the medical records of trauma patients who were transferred to the emergency department of Gunma University Hospital between January 2010 and December 2010. The injury severity score, age, mean blood pressure, heart rate, Glasgow coma scale, hemoglobin, hematocrit, red blood cell count, platelet count, fibrinogen, international normalized ratio of prothrombin time, activated partial thromboplastin time, and fibrin degradation products were examined in those patients on arrival. To determine the formula that predicts the injury severity score, multiple linear regression analysis was carried out. The injury severity score was set as the dependent variable, and the other parameters were set as candidate explanatory variables. IBM SPSS Statistics 20 was used for the statistical analysis. Statistical significance was set at P < 0.05. ... The Durbin–Watson ratio was 2.200. A formula for predicting the injury severity score in trauma patients was developed with ordinary parameters such as fibrin degradation products and mean blood pressure. This formula is useful because the injury severity score can be predicted easily in the emergency department.
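
    A hedged sketch of the multiple linear regression step with statsmodels on synthetic data; the predictor names are loosely borrowed from the abstract and all coefficients are placeholders, so this is not the published formula.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson

rng = np.random.default_rng(1)
n = 300
# Placeholder predictors loosely named after those in the abstract
df = pd.DataFrame({
    "fdp": rng.gamma(2.0, 20.0, n),        # fibrin degradation products
    "mean_bp": rng.normal(90, 15, n),      # mean blood pressure
    "gcs": rng.integers(3, 16, n),         # Glasgow coma scale
})
iss = 10 + 0.2 * df["fdp"] - 0.1 * df["mean_bp"] - 0.5 * df["gcs"] + rng.normal(0, 5, n)

X = sm.add_constant(df)
model = sm.OLS(iss, X).fit()               # ISS as dependent variable
print(model.params)
print("Durbin-Watson:", durbin_watson(model.resid))
```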

  15. Econometric models for predicting confusion crop ratios

    Science.gov (United States)

    Umberger, D. E.; Proctor, M. H.; Clark, J. E.; Eisgruber, L. M.; Braschler, C. B. (Principal Investigator)

    1979-01-01

    Results for both the United States and Canada show that econometric models can provide estimates of confusion crop ratios that are more accurate than historical ratios. Whether these models can support the LACIE 90/90 accuracy criterion is uncertain. In the United States, experimenting with additional model formulations could provide improved models in some CRDs, particularly in winter wheat. Improved models may also be possible for the Canadian CDs. The more aggregated province/state models outperformed individual CD/CRD models. This result was expected partly because acreage statistics are based on sampling procedures, and the sampling precision declines from the province/state to the CD/CRD level. Declining sampling precision and the need to substitute province/state data for the CD/CRD data introduced measurement error into the CD/CRD models.

  16. Fixed recurrence and slip models better predict earthquake behavior than the time- and slip-predictable models 1: repeating earthquakes

    Science.gov (United States)

    Rubinstein, Justin L.; Ellsworth, William L.; Chen, Kate Huihsuan; Uchida, Naoki

    2012-01-01

    The behavior of individual events in repeating earthquake sequences in California, Taiwan and Japan is better predicted by a model with fixed inter-event time or fixed slip than it is by the time- and slip-predictable models for earthquake occurrence. Given that repeating earthquakes are highly regular in both inter-event time and seismic moment, the time- and slip-predictable models seem ideally suited to explain their behavior. Taken together with evidence from the companion manuscript that shows similar results for laboratory experiments we conclude that the short-term predictions of the time- and slip-predictable models should be rejected in favor of earthquake models that assume either fixed slip or fixed recurrence interval. This implies that the elastic rebound model underlying the time- and slip-predictable models offers no additional value in describing earthquake behavior in an event-to-event sense, but its value in a long-term sense cannot be determined. These models likely fail because they rely on assumptions that oversimplify the earthquake cycle. We note that the time and slip of these events is predicted quite well by fixed slip and fixed recurrence models, so in some sense they are time- and slip-predictable. While fixed recurrence and slip models better predict repeating earthquake behavior than the time- and slip-predictable models, we observe a correlation between slip and the preceding recurrence time for many repeating earthquake sequences in Parkfield, California. This correlation is not found in other regions, and the sequences with the correlative slip-predictable behavior are not distinguishable from nearby earthquake sequences that do not exhibit this behavior.

  17. Adding propensity scores to pure prediction models fails to improve predictive performance

    Directory of Open Access Journals (Sweden)

    Amy S. Nowacki

    2013-08-01

    Full Text Available Background. Propensity score usage seems to be growing in popularity leading researchers to question the possible role of propensity scores in prediction modeling, despite the lack of a theoretical rationale. It is suspected that such requests are due to the lack of differentiation regarding the goals of predictive modeling versus causal inference modeling. Therefore, the purpose of this study is to formally examine the effect of propensity scores on predictive performance. Our hypothesis is that a multivariable regression model that adjusts for all covariates will perform as well as or better than those models utilizing propensity scores with respect to model discrimination and calibration. Methods. The most commonly encountered statistical scenarios for medical prediction (logistic and proportional hazards regression) were used to investigate this research question. Random cross-validation was performed 500 times to correct for optimism. The multivariable regression models adjusting for all covariates were compared with models that included adjustment for or weighting with the propensity scores. The methods were compared based on three predictive performance measures: (1) concordance indices; (2) Brier scores; and (3) calibration curves. Results. Multivariable models adjusting for all covariates had the highest average concordance index, the lowest average Brier score, and the best calibration. Propensity score adjustment and inverse probability weighting models without adjustment for all covariates performed worse than full models and failed to improve predictive performance with full covariate adjustment. Conclusion. Propensity score techniques did not improve prediction performance measures beyond multivariable adjustment. Propensity scores are not recommended if the analytical goal is pure prediction modeling.
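
    The comparison can be sketched with scikit-learn on synthetic data: a cross-validated AUC for a full-covariate logistic model versus a model using only treatment plus an estimated propensity score. For brevity the propensity score is fitted once on the full sample, a simplification that a fold-wise scheme like the paper's would avoid.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n, p = 2000, 6
X = rng.normal(size=(n, p))                        # covariates
treat = rng.binomial(1, 1 / (1 + np.exp(-X[:, 0])))
logit_y = 0.8 * treat + X[:, 1] - 0.5 * X[:, 2]
y = rng.binomial(1, 1 / (1 + np.exp(-logit_y)))

# Full model: treatment plus all covariates
full = np.column_stack([treat, X])

# Propensity-score model: treatment plus the estimated propensity score only
ps = LogisticRegression(max_iter=1000).fit(X, treat).predict_proba(X)[:, 1]
ps_only = np.column_stack([treat, ps])

for name, design in [("full covariates", full), ("treatment + PS", ps_only)]:
    auc = cross_val_score(LogisticRegression(max_iter=1000), design, y,
                          cv=10, scoring="roc_auc").mean()
    print(f"{name}: mean AUC = {auc:.3f}")
```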

  18. PEEX Modelling Platform for Seamless Environmental Prediction

    Science.gov (United States)

    Baklanov, Alexander; Mahura, Alexander; Arnold, Stephen; Makkonen, Risto; Petäjä, Tuukka; Kerminen, Veli-Matti; Lappalainen, Hanna K.; Ezau, Igor; Nuterman, Roman; Zhang, Wen; Penenko, Alexey; Gordov, Evgeny; Zilitinkevich, Sergej; Kulmala, Markku

    2017-04-01

    The Pan-Eurasian EXperiment (PEEX) is a multidisciplinary, multi-scale research programme started in 2012 and aimed at resolving the major uncertainties in Earth System Science and global sustainability issues concerning the Arctic and boreal Northern Eurasian regions and in China. Such challenges include climate change, air quality, biodiversity loss, chemicalization, food supply, and the use of natural resources by mining, industry, energy production and transport. The research infrastructure introduces the current state-of-the-art modeling platform and observation systems in the Pan-Eurasian region and presents the future baselines for the coherent and coordinated research infrastructures in the PEEX domain. The PEEX modeling Platform is characterized by a complex seamless integrated Earth System Modeling (ESM) approach, in combination with specific models of different processes and elements of the system, acting on different temporal and spatial scales. The ensemble approach is taken to the integration of modeling results from different models, participants and countries. PEEX utilizes the full potential of a hierarchy of models: scenario analysis, inverse modeling, and modeling based on measurement needs and processes. The models are validated and constrained by available in-situ and remote sensing data of various spatial and temporal scales using data assimilation and top-down modeling. The analyses of the anticipated large volumes of data produced by available models and sensors will be supported by a dedicated virtual research environment developed for these purposes.

  19. Energy sector integration for low carbon development in Greater Mekong sub-region: Towards a model of South-South cooperation

    Energy Technology Data Exchange (ETDEWEB)

    Zhai, Yongping

    2010-09-15

    The Greater Mekong Sub-region (GMS) in Southeast Asia has embarked on a roadmap of power interconnection and expanded energy sector cooperation. A study commissioned by the Asian Development Bank, using the Model of Energy Supply Systems Alternatives and their General Environmental Impacts (MESSAGE), assessed the impacts of various scenarios; the results indicate that GMS integration will help these countries to achieve low-carbon and sustainable development. The article suggests that the experience of GMS cooperation be made a model for South-South cooperation in the global effort to fight climate change.

  20. Models Predicting Success of Infertility Treatment: A Systematic Review

    Science.gov (United States)

    Zarinara, Alireza; Zeraati, Hojjat; Kamali, Koorosh; Mohammad, Kazem; Shahnazari, Parisa; Akhondi, Mohammad Mehdi

    2016-01-01

    Background: Infertile couples are faced with problems that affect their marital life. Infertility treatment is expensive and time consuming and is sometimes simply not possible. Prediction models for infertility treatment have been proposed, and prediction of treatment success is a new field in infertility treatment. Because prediction of treatment success is a new need for infertile couples, this paper reviewed previous studies to obtain a general picture of the applicability of the models. Methods: This study was conducted as a systematic review at Avicenna Research Institute in 2015. Six databases were searched based on WHO definitions and MeSH key words. Papers about prediction models in infertility were evaluated. Results: Eighty-one papers were eligible for the study. Papers covered years after 1986, and studies were designed both retrospectively and prospectively. IVF prediction models accounted for the largest share of the papers. The most common predictors were age, duration of infertility, and ovarian and tubal problems. Conclusion: A prediction model can be clinically applied if it can be statistically evaluated and has good validation for treatment success. To achieve better results, physicians and couples need an estimate of the treatment success rate based on history, examination and clinical tests. Models must be checked for theoretical approach and appropriate validation. The advantages of applying prediction models are decreases in cost and time, avoiding painful treatment of patients, assessment of the treatment approach for physicians, and support for decision making by health managers. The selection of an approach for designing and using these models is therefore needed. PMID:27141461

  1. Towards a generalized energy prediction model for machine tools.

    Science.gov (United States)

    Bhinge, Raunak; Park, Jinkyoo; Law, Kincho H; Dornfeld, David A; Helu, Moneer; Rachuri, Sudarsan

    2017-04-01

    Energy prediction of machine tools can deliver many advantages to a manufacturing enterprise, ranging from energy-efficient process planning to machine tool monitoring. Physics-based, energy prediction models have been proposed in the past to understand the energy usage pattern of a machine tool. However, uncertainties in both the machine and the operating environment make it difficult to predict the energy consumption of the target machine reliably. Taking advantage of the opportunity to collect extensive, contextual, energy-consumption data, we discuss a data-driven approach to develop an energy prediction model of a machine tool in this paper. First, we present a methodology that can efficiently and effectively collect and process data extracted from a machine tool and its sensors. We then present a data-driven model that can be used to predict the energy consumption of the machine tool for machining a generic part. Specifically, we use Gaussian Process (GP) Regression, a non-parametric machine-learning technique, to develop the prediction model. The energy prediction model is then generalized over multiple process parameters and operations. Finally, we apply this generalized model with a method to assess uncertainty intervals to predict the energy consumed to machine any part using a Mori Seiki NVD1500 machine tool. Furthermore, the same model can be used during process planning to optimize the energy-efficiency of a machining process.
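
    A minimal sketch of the Gaussian Process regression step with scikit-learn; the process parameters, kernel choice and data below are assumptions, and the return_std output stands in for the uncertainty intervals mentioned in the abstract.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel, ConstantKernel

rng = np.random.default_rng(3)
# Placeholder process parameters: feed rate, spindle speed, depth of cut
X = rng.uniform([50, 1000, 0.5], [500, 6000, 3.0], size=(200, 3))
energy = 0.02 * X[:, 0] + 0.001 * X[:, 1] + 5.0 * X[:, 2] + rng.normal(0, 1.0, 200)

kernel = ConstantKernel(1.0) * RBF(length_scale=[100, 1000, 1.0]) + WhiteKernel(1.0)
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, energy)

X_new = np.array([[200.0, 3000.0, 1.5]])
mean, std = gpr.predict(X_new, return_std=True)   # predictive mean and std. dev.
print(f"predicted energy = {mean[0]:.2f} +/- {1.96 * std[0]:.2f}")
```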

  2. Poisson Mixture Regression Models for Heart Disease Prediction

    Science.gov (United States)

    Erol, Hamza

    2016-01-01

    Early heart disease control can be achieved by high disease prediction and diagnosis efficiency. This paper focuses on the use of model-based clustering techniques to predict and diagnose heart disease via Poisson mixture regression models. Analysis and application of Poisson mixture regression models is here addressed under two different classes: standard and concomitant variable mixture regression models. Results show that a two-component concomitant variable Poisson mixture regression model predicts heart disease better than both the standard Poisson mixture regression model and the ordinary general linear Poisson regression model due to its low Bayesian Information Criteria value. Furthermore, a Zero Inflated Poisson Mixture Regression model turned out to be the best model for heart disease prediction among all models, as it both clusters individuals into high- or low-risk categories and predicts the rate of heart disease component-wise given the available clusters. It is deduced that heart disease prediction can be done effectively by identifying the major risks component-wise using a Poisson mixture regression model. PMID:27999611

  3. Comparison of Predictive Models for the Early Diagnosis of Diabetes.

    Science.gov (United States)

    Jahani, Meysam; Mahdavi, Mahdi

    2016-04-01

    This study develops neural network models to improve the prediction of diabetes using clinical and lifestyle characteristics. Prediction models were developed using a combination of approaches and concepts. We used memetic algorithms to update weights and to improve the prediction accuracy of the models. In the first step, the optimum values for neural network parameters such as momentum rate, transfer function, and error function were obtained through trial and error and based on the results of previous studies. In the second step, optimum parameters were applied to memetic algorithms in order to improve the accuracy of prediction. This preliminary analysis showed that the accuracy of neural networks is 88%. In the third step, the accuracy of neural network models was improved using a memetic algorithm, and the resulting model was compared with a logistic regression model using a confusion matrix and receiver operating characteristic curve (ROC). The memetic algorithm improved the accuracy from 88.0% to 93.2%. We also found that the memetic algorithm had higher accuracy than the model from the genetic algorithm and the regression model. Among the models, the regression model had the least accuracy. For the memetic algorithm model, the sensitivity, specificity, positive predictive value, negative predictive value, and ROC are 96.2, 95.3, 93.8, 92.4, and 0.958, respectively. The results of this study provide a basis to design a Decision Support System for risk management and planning of care for individuals at risk of diabetes.

  4. Applications of modeling in polymer-property prediction

    Science.gov (United States)

    Case, F. H.

    1996-08-01

    A number of molecular modeling techniques have been applied for the prediction of polymer properties and behavior. Five examples illustrate the range of methodologies used. A simple atomistic simulation of small polymer fragments is used to estimate drug compatibility with a polymer matrix. The analysis of molecular dynamics results from a more complex model of a swollen hydrogel system is used to study gas diffusion in contact lenses. Statistical mechanics are used to predict conformation dependent properties — an example is the prediction of liquid-crystal formation. The effect of the molecular weight distribution on phase separation in polyalkanes is predicted using thermodynamic models. In some cases, the properties of interest cannot be directly predicted using simulation methods or polymer theory. Correlation methods may be used to bridge the gap between molecular structure and macroscopic properties. The final example shows how connectivity-indices-based quantitative structure-property relationships were used to predict properties for candidate polyimids in an electronics application.

  5. Artificial Neural Network Model for Predicting Compressive

    OpenAIRE

    Salim T. Yousif; Salwa M. Abdullah

    2013-01-01

      Compressive strength of concrete is a commonly used criterion in evaluating concrete. Although testing of the compressive strength of concrete specimens is done routinely, it is performed on the 28th day after concrete placement. Therefore, strength estimation of concrete at early time is highly desirable. This study presents the effort in applying neural network-based system identification techniques to predict the compressive strength of concrete based on concrete mix proportions, maximum...

  6. Computerized heat balance models to predict performance of operating nuclear power plants

    International Nuclear Information System (INIS)

    Breeding, C.L.; Carter, J.C.; Schaefer, R.C.

    1983-01-01

    The use of computerized heat balance models has greatly enhanced the decision making ability of TVA's Division of Nuclear Power. These models are utilized to predict the effects of various operating modes and to analyze changes in plant performance resulting from turbine cycle equipment modifications with greater speed and accuracy than was possible before. Computer models have been successfully used to optimize plant output by predicting the effects of abnormal condenser circulating water conditions. They were utilized to predict the degradation in performance resulting from installation of a baffle plate assembly to replace damaged low-pressure blading, thereby providing timely information allowing an optimal economic judgement as to when to replace the blading. Future use will be for routine performance test analysis. This paper presents the benefits of utility use of computerized heat balance models

  7. Hybrid Model Predictive Control for Optimizing Gestational Weight Gain Behavioral Interventions.

    Science.gov (United States)

    Dong, Yuwen; Rivera, Daniel E; Downs, Danielle S; Savage, Jennifer S; Thomas, Diana M; Collins, Linda M

    2013-01-01

    Excessive gestational weight gain (GWG) represents a major public health issue. In this paper, we pursue a control engineering approach to the problem by applying model predictive control (MPC) algorithms to act as decision policies in the intervention for assigning optimal intervention dosages. The intervention components consist of education, behavioral modification and active learning. The categorical nature of the intervention dosage assignment problem dictates the need for hybrid model predictive control (HMPC) schemes, ultimately leading to improved outcomes. The goal is to design a controller that generates an intervention dosage sequence which improves a participant's healthy eating behavior and physical activity to better control GWG. An improved formulation of self-regulation is also presented through the use of Internal Model Control (IMC), allowing greater flexibility in describing self-regulatory behavior. Simulation results illustrate the basic workings of the model and demonstrate the benefits of hybrid predictive control for optimized GWG adaptive interventions.

  8. Prediction of hourly solar radiation with multi-model framework

    International Nuclear Information System (INIS)

    Wu, Ji; Chan, Chee Keong

    2013-01-01

    Highlights: • A novel approach to predict solar radiation through the use of clustering paradigms. • Development of prediction models based on the intrinsic pattern observed in each cluster. • Prediction based on proper clustering and selection of model on current time provides better results than other methods. • Experiments were conducted on actual solar radiation data obtained from a weather station in Singapore. - Abstract: In this paper, a novel multi-model prediction framework for prediction of solar radiation is proposed. The framework started with the assumption that there are several patterns embedded in the solar radiation series. To extract the underlying pattern, the solar radiation series is first segmented into smaller subsequences, and the subsequences are further grouped into different clusters. For each cluster, an appropriate prediction model is trained. Hence a procedure for pattern identification is developed to identify the proper pattern that fits the current period. Based on this pattern, the corresponding prediction model is applied to obtain the prediction value. The prediction result of the proposed framework is then compared to other techniques. It is shown that the proposed framework provides superior performance as compared to others
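
    The segment-cluster-then-model idea can be sketched as follows on a synthetic series: fixed-length windows are clustered with k-means and a separate regressor is trained per cluster, with the current window routed to its cluster's model at prediction time. The window length, cluster count and data are assumptions, not the settings used with the Singapore weather-station data.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(4)
t = np.arange(5000)
series = np.sin(2 * np.pi * t / 24) + 0.3 * rng.normal(size=t.size)  # hourly-like signal

W = 24  # window length (assumption)
windows = np.array([series[i:i + W] for i in range(len(series) - W)])
targets = series[W:]                       # next-hour value after each window

km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(windows)
models = {c: LinearRegression().fit(windows[km.labels_ == c],
                                    targets[km.labels_ == c])
          for c in range(km.n_clusters)}

# Prediction: assign the current window to a cluster, then use that cluster's model
current = series[-W:].reshape(1, -1)
cluster = km.predict(current)[0]
print("next-hour prediction:", models[cluster].predict(current)[0])
```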

  9. Posterior Predictive Model Checking for Multidimensionality in Item Response Theory

    Science.gov (United States)

    Levy, Roy; Mislevy, Robert J.; Sinharay, Sandip

    2009-01-01

    If data exhibit multidimensionality, key conditional independence assumptions of unidimensional models do not hold. The current work pursues posterior predictive model checking, a flexible family of model-checking procedures, as a tool for criticizing models due to unaccounted for dimensions in the context of item response theory. Factors…

  10. Model predictive control of a crude oil distillation column

    Directory of Open Access Journals (Sweden)

    Morten Hovd

    1999-04-01

    Full Text Available The project of designing and implementing model based predictive control on the vacuum distillation column at the Nynäshamn Refinery of Nynäs AB is described in this paper. The paper describes in detail the modeling for the model based control, covers the controller implementation, and documents the benefits gained from the model based controller.

  11. Genomic prediction of complex human traits: relatedness, trait architecture and predictive meta-models

    Science.gov (United States)

    Spiliopoulou, Athina; Nagy, Reka; Bermingham, Mairead L.; Huffman, Jennifer E.; Hayward, Caroline; Vitart, Veronique; Rudan, Igor; Campbell, Harry; Wright, Alan F.; Wilson, James F.; Pong-Wong, Ricardo; Agakov, Felix; Navarro, Pau; Haley, Chris S.

    2015-01-01

    We explore the prediction of individuals' phenotypes for complex traits using genomic data. We compare several widely used prediction models, including Ridge Regression, LASSO and Elastic Nets estimated from cohort data, and polygenic risk scores constructed using published summary statistics from genome-wide association meta-analyses (GWAMA). We evaluate the interplay between relatedness, trait architecture and optimal marker density, by predicting height, body mass index (BMI) and high-density lipoprotein level (HDL) in two data cohorts, originating from Croatia and Scotland. We empirically demonstrate that dense models are better when all genetic effects are small (height and BMI) and target individuals are related to the training samples, while sparse models predict better in unrelated individuals and when some effects have moderate size (HDL). For HDL sparse models achieved good across-cohort prediction, performing similarly to the GWAMA risk score and to models trained within the same cohort, which indicates that, for predicting traits with moderately sized effects, large sample sizes and familial structure become less important, though still potentially useful. Finally, we propose a novel ensemble of whole-genome predictors with GWAMA risk scores and demonstrate that the resulting meta-model achieves higher prediction accuracy than either model on its own. We conclude that although current genomic predictors are not accurate enough for diagnostic purposes, performance can be improved without requiring access to large-scale individual-level data. Our methodologically simple meta-model is a means of performing predictive meta-analysis for optimizing genomic predictions and can be easily extended to incorporate multiple population-level summary statistics or other domain knowledge. PMID:25918167
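
    The meta-model idea can be sketched as a simple stack: out-of-fold predictions from Ridge and LASSO whole-genome predictors are combined with an external risk score by a second-level linear model. The genotypes, phenotype and stand-in GWAMA score below are synthetic placeholders, not the Croatian or Scottish cohort data.

```python
import numpy as np
from sklearn.linear_model import Ridge, Lasso, LinearRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(5)
n, p = 1000, 500
G = rng.binomial(2, 0.3, size=(n, p)).astype(float)   # genotype dosages (placeholder)
beta = np.zeros(p); beta[:20] = rng.normal(0, 0.2, 20)
pheno = G @ beta + rng.normal(0, 1.0, n)
gwama_score = G[:, :20] @ beta[:20] + rng.normal(0, 0.5, n)  # stand-in risk score

# Out-of-fold predictions from the whole-genome predictors
ridge_oof = cross_val_predict(Ridge(alpha=10.0), G, pheno, cv=5)
lasso_oof = cross_val_predict(Lasso(alpha=0.05, max_iter=5000), G, pheno, cv=5)

# Meta-model combining the predictors with the external risk score
Z = np.column_stack([ridge_oof, lasso_oof, gwama_score])
meta = LinearRegression().fit(Z, pheno)
print("meta-model weights:", meta.coef_)
```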

  12. Establishment and Application of Coalmine Gas Prediction Model Based on Multi-Sensor Data Fusion Technology

    Directory of Open Access Journals (Sweden)

    Wenyu Lv

    2014-04-01

    Full Text Available Undoubtedly an accident involving gas is one of the greater disasters that can occur in a coalmine, so being able to predict when an accident involving gas might occur is essential for loss prevention and the reduction of safety hazards. However, traditional methods of gas safety prediction are hindered by multi-objective and non-continuous problems. A coalmine gas prediction model based on multi-sensor data fusion technology (CGPM-MSDFT) was established through analysis of accidents involving gas, using an artificial neural network to fuse multi-sensor data, an improved algorithm to train the network, and an early stopping method to resolve the over-fitting problem. The network test and field application results show that this model can provide a new direction for research into predicting the likelihood of a gas-related incident within a coalmine. It will have broad application prospects in coal mining.
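
    A minimal sketch of the early-stopping idea on fused multi-sensor features, using scikit-learn's MLPRegressor; the sensor features, network size and data are assumptions, and the paper's improved training algorithm is not reproduced here.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(6)
# Placeholder fused sensor features: CH4, CO, temperature, airflow, dust
X = rng.normal(size=(3000, 5))
gas_risk = 0.6 * X[:, 0] + 0.3 * X[:, 1] - 0.2 * X[:, 3] + rng.normal(0, 0.1, 3000)

net = MLPRegressor(hidden_layer_sizes=(32, 16),
                   early_stopping=True,         # hold out part of the training data
                   validation_fraction=0.15,    # ...and stop when it stops improving
                   n_iter_no_change=20,
                   max_iter=2000,
                   random_state=0)
net.fit(X, gas_risk)
print("stopped after", net.n_iter_, "iterations")
```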

  13. Predictive models for acute kidney injury following cardiac surgery.

    Science.gov (United States)

    Demirjian, Sevag; Schold, Jesse D; Navia, Jose; Mastracci, Tara M; Paganini, Emil P; Yared, Jean-Pierre; Bashour, Charles A

    2012-03-01

    Accurate prediction of cardiac surgery-associated acute kidney injury (AKI) would improve clinical decision making and facilitate timely diagnosis and treatment. The aim of the study was to develop predictive models for cardiac surgery-associated AKI using presurgical and combined pre- and intrasurgical variables. Prospective observational cohort. 25,898 patients who underwent cardiac surgery at Cleveland Clinic in 2000-2008. Presurgical and combined pre- and intrasurgical variables were used to develop predictive models. Dialysis therapy and a composite of doubling of serum creatinine level or dialysis therapy within 2 weeks (or discharge if sooner) after cardiac surgery. Incidences of dialysis therapy and the composite of doubling of serum creatinine level or dialysis therapy were 1.7% and 4.3%, respectively. Kidney function parameters were strong independent predictors in all 4 models. Surgical complexity reflected by type and history of previous cardiac surgery were robust predictors in models based on presurgical variables. However, the inclusion of intrasurgical variables accounted for all explained variance by procedure-related information. Models predictive of dialysis therapy showed good calibration and superb discrimination; a combined (pre- and intrasurgical) model performed better than the presurgical model alone (C statistics, 0.910 and 0.875, respectively). Models predictive of the composite end point also had excellent discrimination with both presurgical and combined (pre- and intrasurgical) variables (C statistics, 0.797 and 0.825, respectively). However, the presurgical model predictive of the composite end point showed suboptimal calibration (P ...). Validation of the predictive models in other cohorts is required before wide-scale application. We developed and internally validated 4 new models that accurately predict cardiac surgery-associated AKI. These models are based on readily available clinical information and can be used for patient counseling, clinical

  14. Modeling number of claims and prediction of total claim amount

    Science.gov (United States)

    Acar, Aslıhan Şentürk; Karabey, Uǧur

    2017-07-01

    In this study we focus on annual number of claims of a private health insurance data set which belongs to a local insurance company in Turkey. In addition to Poisson model and negative binomial model, zero-inflated Poisson model and zero-inflated negative binomial model are used to model the number of claims in order to take into account excess zeros. To investigate the impact of different distributional assumptions for the number of claims on the prediction of total claim amount, predictive performances of candidate models are compared by using root mean square error (RMSE) and mean absolute error (MAE) criteria.
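
    A hedged sketch of this model comparison with statsmodels on synthetic claim counts with injected excess zeros, scoring a Poisson and a zero-inflated Poisson fit by RMSE and MAE; the covariates and parameters are placeholders, not the insurer's data.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 2000
X = sm.add_constant(rng.normal(size=(n, 2)))   # placeholder rating factors
mu = np.exp(0.2 + 0.3 * X[:, 1])
claims = rng.poisson(mu)
claims[rng.random(n) < 0.4] = 0                # inject excess zeros

poisson_fit = sm.Poisson(claims, X).fit(disp=0)
zip_fit = sm.ZeroInflatedPoisson(claims, X, exog_infl=X).fit(disp=0, maxiter=500)

for name, fit in [("Poisson", poisson_fit), ("ZIP", zip_fit)]:
    pred = fit.predict()                       # in-sample expected claim counts
    rmse = np.sqrt(np.mean((claims - pred) ** 2))
    mae = np.mean(np.abs(claims - pred))
    print(f"{name}: RMSE={rmse:.3f}  MAE={mae:.3f}")
```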

  15. Assessment of performance of survival prediction models for cancer prognosis

    Directory of Open Access Journals (Sweden)

    Chen Hung-Chia

    2012-07-01

    Full Text Available Abstract Background Cancer survival studies are commonly analyzed using survival-time prediction models for cancer prognosis. A number of different performance metrics are used to ascertain the concordance between the predicted risk score of each patient and the actual survival time, but these metrics can sometimes conflict. Alternatively, patients are sometimes divided into two classes according to a survival-time threshold, and binary classifiers are applied to predict each patient’s class. Although this approach has several drawbacks, it does provide natural performance metrics such as positive and negative predictive values to enable unambiguous assessments. Methods We compare the survival-time prediction and survival-time threshold approaches to analyzing cancer survival studies. We review and compare common performance metrics for the two approaches. We present new randomization tests and cross-validation methods to enable unambiguous statistical inferences for several performance metrics used with the survival-time prediction approach. We consider five survival prediction models consisting of one clinical model, two gene expression models, and two models from combinations of clinical and gene expression models. Results A public breast cancer dataset was used to compare several performance metrics using five prediction models. (1) For some prediction models, the hazard ratio from fitting a Cox proportional hazards model was significant, but the two-group comparison was insignificant, and vice versa. (2) The randomization test and cross-validation were generally consistent with the p-values obtained from the standard performance metrics. (3) Binary classifiers highly depended on how the risk groups were defined; a slight change of the survival threshold for assignment of classes led to very different prediction results. Conclusions (1) Different performance metrics for evaluation of a survival prediction model may give different conclusions in
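
    As a minimal illustration of the survival-time prediction route, the sketch below fits a Cox proportional hazards model with the third-party lifelines package on its bundled Rossi recidivism dataset (a stand-in, not the breast cancer data discussed above) and reports Harrell's concordance index.

```python
from lifelines import CoxPHFitter
from lifelines.datasets import load_rossi   # bundled example data (stand-in)

rossi = load_rossi()                        # 'week' is the duration, 'arrest' the event
cph = CoxPHFitter()
cph.fit(rossi, duration_col="week", event_col="arrest")

# Harrell's concordance between predicted risk and observed survival
print("concordance index:", round(cph.concordance_index_, 3))
```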

  16. Predictive models of choroidal neovascularization and geographic atrophy incidence applied to clinical trial design.

    Science.gov (United States)

    McCarthy, Linda C; Newcombe, Paul J; Whittaker, John C; Wurzelmann, John I; Fries, Michael A; Burnham, Nancy R; Cai, Gengqian; Stinnett, Sandra W; Trivedi, Trupti M; Xu, Chun-Fang

    2012-09-01

    To develop comprehensive predictive models for choroidal neovascularization (CNV) and geographic atrophy (GA) incidence within 3 years that can be applied realistically to clinical practice. Retrospective evaluation of data from a longitudinal study to develop and validate predictive models of CNV and GA. The predictive performance of clinical, environmental, demographic, and genetic risk factors was explored in regression models, using data from both eyes of 2011 subjects from the Age-Related Eye Disease Study (AREDS). The performance of predictive models was compared using 10-fold cross-validated receiver operating characteristic curves in the training data, followed by comparisons in an independent validation dataset (1410 AREDS subjects). Bayesian trial simulations were used to compare the usefulness of predictive models to screen patients for inclusion in prevention clinical trials. Logistic regression models that included clinical, demographic, and environmental factors had better predictive performance for 3-year CNV and GA incidence (area under the receiver operating characteristic curve of 0.87 and 0.89, respectively), compared with simple clinical criteria (AREDS simplified severity scale). Although genetic markers were associated significantly with 3-year CNV (CFH: Y402H; ARMS2: A69S) and GA incidence (CFH: Y402H), the inclusion of genetic factors in the models provided only marginal improvements in predictive performance. The logistic regression models combine good predictive performance with greater flexibility to optimize clinical trial design compared with simple clinical models (AREDS simplified severity scale). The benefit of including genetic factors to screen patients for recruitment to CNV prevention studies is marginal and is dependent on individual clinical trial economics. Copyright © 2012 Elsevier Inc. All rights reserved.

  17. Model-based uncertainty in species range prediction

    DEFF Research Database (Denmark)

    Pearson, R. G.; Thuiller, Wilfried; Bastos Araujo, Miguel

    2006-01-01

    algorithm when extrapolating beyond the range of data used to build the model. The effects of these factors should be carefully considered when using this modelling approach to predict species ranges. Main conclusions We highlight an important source of uncertainty in assessments of the impacts of climate......Aim Many attempts to predict the potential range of species rely on environmental niche (or 'bioclimate envelope') modelling, yet the effects of using different niche-based methodologies require further investigation. Here we investigate the impact that the choice of model can have on predictions......, identify key reasons why model output may differ and discuss the implications that model uncertainty has for policy-guiding applications. Location The Western Cape of South Africa. Methods We applied nine of the most widely used modelling techniques to model potential distributions under current...

  18. Prediction Model for Gastric Cancer Incidence in Korean Population.

    Science.gov (United States)

    Eom, Bang Wool; Joo, Jungnam; Kim, Sohee; Shin, Aesun; Yang, Hye-Ryung; Park, Junghyun; Choi, Il Ju; Kim, Young-Woo; Kim, Jeongseon; Nam, Byung-Ho

    2015-01-01

    Predicting high risk groups for gastric cancer and motivating these groups to receive regular checkups is required for the early detection of gastric cancer. The aim of this study was to develop a prediction model for gastric cancer incidence based on a large population-based cohort in Korea. Based on the National Health Insurance Corporation data, we analyzed 10 major risk factors for gastric cancer. The Cox proportional hazards model was used to develop gender-specific prediction models for gastric cancer development, and the performance of the developed model in terms of discrimination and calibration was also validated using an independent cohort. Discrimination ability was evaluated using Harrell's C-statistics, and the calibration was evaluated using a calibration plot and slope. During a median of 11.4 years of follow-up, 19,465 (1.4%) and 5,579 (0.7%) newly developed gastric cancer cases were observed among 1,372,424 men and 804,077 women, respectively. The prediction models included age, BMI, family history, meal regularity, salt preference, alcohol consumption, smoking and physical activity for men, and age, BMI, family history, salt preference, alcohol consumption, and smoking for women. This prediction model showed good accuracy and predictability in both the developing and validation cohorts (C-statistics: 0.764 for men, 0.706 for women). In this study, a prediction model for gastric cancer incidence was developed that displayed a good performance.

  19. AN EFFICIENT PATIENT INFLOW PREDICTION MODEL FOR HOSPITAL RESOURCE MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Kottalanka Srikanth

    2017-07-01

    Full Text Available There has been increasing demand for improved service provisioning in hospital resource management. Hospitals work under strict budget constraints while at the same time assuring quality care. To achieve quality care under budget constraints, an efficient prediction model is required. Recently, various time-series-based prediction models have been proposed to manage hospital resources such as ambulance monitoring, emergency care and so on. These models are not efficient as they do not consider the nature of the scenario, such as climate conditions. To address this, artificial intelligence is adopted. The issue with existing prediction models is that training suffers from local optima error. This induces overhead and affects the accuracy of prediction. To overcome the local minima error, this work presents a patient inflow prediction model that adopts a resilient backpropagation neural network. Experiments are conducted to evaluate the performance of the proposed model in terms of RMSE and MAPE. The outcome shows the proposed model reduces RMSE and MAPE compared with the existing backpropagation-based artificial neural network. The overall outcomes show the proposed prediction model improves the accuracy of prediction, which aids in improving the quality of health care management.
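
    Resilient backpropagation is available as an optimizer in PyTorch, so the idea can be sketched on a tiny feed-forward network with synthetic inflow features; the architecture, features and data below are assumptions, not the authors' model.

```python
import torch
from torch import nn

torch.manual_seed(0)
# Placeholder features: day-of-week index, seasonal index, lagged inflow
X = torch.randn(500, 3)
y = (50 + 10 * X[:, 0] - 5 * X[:, 2] + torch.randn(500)).unsqueeze(1)

model = nn.Sequential(nn.Linear(3, 16), nn.ReLU(), nn.Linear(16, 1))
optimizer = torch.optim.Rprop(model.parameters(), lr=0.01)  # resilient backprop
loss_fn = nn.MSELoss()

for epoch in range(200):            # Rprop is a full-batch training method
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimizer.step()

rmse = torch.sqrt(loss_fn(model(X), y)).item()
print(f"training RMSE: {rmse:.2f}")
```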

  20. Risk Prediction Model for Severe Postoperative Complication in Bariatric Surgery.

    Science.gov (United States)

    Stenberg, Erik; Cao, Yang; Szabo, Eva; Näslund, Erik; Näslund, Ingmar; Ottosson, Johan

    2018-01-12

    Factors associated with risk for adverse outcome are important considerations in the preoperative assessment of patients for bariatric surgery. As yet, prediction models based on preoperative risk factors have not been able to predict adverse outcome sufficiently. This study aimed to identify preoperative risk factors and to construct a risk prediction model based on these. Patients who underwent a bariatric surgical procedure in Sweden between 2010 and 2014 were identified from the Scandinavian Obesity Surgery Registry (SOReg). Associations between preoperative potential risk factors and severe postoperative complications were analysed using a logistic regression model. A multivariate model for risk prediction was created and validated in the SOReg for patients who underwent bariatric surgery in Sweden, 2015. Revision surgery (standardized OR 1.19, 95% confidence interval (CI) 1.14-1.24, p ... prediction model. Despite high specificity, the sensitivity of the model was low. Revision surgery, high age, low BMI, large waist circumference, and dyspepsia/GERD were associated with an increased risk for severe postoperative complication. The prediction model based on these factors, however, had a sensitivity that was too low to predict risk in the individual patient case.

  1. Prediction Model for Gastric Cancer Incidence in Korean Population.

    Directory of Open Access Journals (Sweden)

    Bang Wool Eom

    Full Text Available Predicting high risk groups for gastric cancer and motivating these groups to receive regular checkups is required for the early detection of gastric cancer. The aim of this study was to develop a prediction model for gastric cancer incidence based on a large population-based cohort in Korea. Based on the National Health Insurance Corporation data, we analyzed 10 major risk factors for gastric cancer. The Cox proportional hazards model was used to develop gender-specific prediction models for gastric cancer development, and the performance of the developed model in terms of discrimination and calibration was also validated using an independent cohort. Discrimination ability was evaluated using Harrell's C-statistics, and the calibration was evaluated using a calibration plot and slope. During a median of 11.4 years of follow-up, 19,465 (1.4%) and 5,579 (0.7%) newly developed gastric cancer cases were observed among 1,372,424 men and 804,077 women, respectively. The prediction models included age, BMI, family history, meal regularity, salt preference, alcohol consumption, smoking and physical activity for men, and age, BMI, family history, salt preference, alcohol consumption, and smoking for women. This prediction model showed good accuracy and predictability in both the developing and validation cohorts (C-statistics: 0.764 for men, 0.706 for women). In this study, a prediction model for gastric cancer incidence was developed that displayed a good performance.

  2. Stage-specific predictive models for breast cancer survivability.

    Science.gov (United States)

    Kate, Rohit J; Nadig, Ramya

    2017-01-01

    Survivability rates vary widely among the various stages of breast cancer. Although machine learning models built in the past to predict breast cancer survivability were given stage as one of the features, they were not trained or evaluated separately for each stage. The aim was to investigate whether there are differences in the performance of machine learning models trained and evaluated separately for different stages when predicting breast cancer survivability. Using three different machine learning methods, we built models to predict breast cancer survivability separately for each stage and compared them with the traditional joint models built for all the stages. We also evaluated the models separately for each stage and together for all the stages. Our results show that the most suitable model to predict survivability for a specific stage is the model trained for that particular stage. In our experiments, using additional examples of other stages during training did not help; in fact, it made performance worse in some cases. The most important features for predicting survivability were also found to be different for different stages. By evaluating the models separately on different stages we found that the performance varied widely across them. We also demonstrate that evaluating predictive models for survivability on all the stages together, as was done in the past, is misleading because it overestimates performance. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  3. Evaluation of wave runup predictions from numerical and parametric models

    Science.gov (United States)

    Stockdon, Hilary F.; Thompson, David M.; Plant, Nathaniel G.; Long, Joseph W.

    2014-01-01

    Wave runup during storms is a primary driver of coastal evolution, including shoreline and dune erosion and barrier island overwash. Runup and its components, setup and swash, can be predicted from a parameterized model that was developed by comparing runup observations to offshore wave height, wave period, and local beach slope. Because observations during extreme storms are often unavailable, a numerical model is used to simulate the storm-driven runup to compare to the parameterized model and then develop an approach to improve the accuracy of the parameterization. Numerically simulated and parameterized runup were compared to observations to evaluate model accuracies. The analysis demonstrated that setup was accurately predicted by both the parameterized model and numerical simulations. Infragravity swash heights were most accurately predicted by the parameterized model. The numerical model suffered from bias and gain errors that depended on whether a one-dimensional or two-dimensional spatial domain was used. Nonetheless, all of the predictions were significantly correlated to the observations, implying that the systematic errors can be corrected. The numerical simulations did not resolve the incident-band swash motions, as expected, and the parameterized model performed best at predicting incident-band swash heights. An assimilated prediction using a weighted average of the parameterized model and the numerical simulations resulted in a reduction in prediction error variance. Finally, the numerical simulations were extended to include storm conditions that have not been previously observed. These results indicated that the parameterized predictions of setup may need modification for extreme conditions; numerical simulations can be used to extend the validity of the parameterized predictions of infragravity swash; and numerical simulations systematically underpredict incident swash, which is relatively unimportant under extreme conditions.
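
    The assimilation step, averaging two imperfect predictors with weights chosen to reduce error variance, can be sketched as follows. The inverse-error-variance weighting used here is a standard choice and an assumption, not necessarily the exact scheme applied in the study.

```python
# Hypothetical sketch: blend parameterized and numerical runup predictions with
# weights inversely proportional to each model's error variance.
import numpy as np

rng = np.random.default_rng(3)
truth = rng.normal(1.5, 0.4, 500)            # "observed" runup (m), synthetic
param = truth + rng.normal(0.0, 0.25, 500)   # parameterized-model prediction
numer = truth + rng.normal(0.0, 0.40, 500)   # numerical-model prediction

var_p, var_n = np.var(param - truth), np.var(numer - truth)
w_p = (1.0 / var_p) / (1.0 / var_p + 1.0 / var_n)   # inverse-variance weight
blended = w_p * param + (1.0 - w_p) * numer

for name, pred in [("parameterized", param), ("numerical", numer), ("blended", blended)]:
    print(f"{name:13s} error variance: {np.var(pred - truth):.4f}")
```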

  4. Multi-year predictability in a coupled general circulation model

    Energy Technology Data Exchange (ETDEWEB)

    Power, Scott; Colman, Rob [Bureau of Meteorology Research Centre, Melbourne, VIC (Australia)

    2006-02-01

    Multi-year to decadal variability in a 100-year integration of a BMRC coupled atmosphere-ocean general circulation model (CGCM) is examined. The fractional contribution made by the decadal component generally increases with depth and latitude away from surface waters in the equatorial Indo-Pacific Ocean. The relative importance of decadal variability is enhanced in off-equatorial "wings" in the subtropical eastern Pacific. The model and observations exhibit "ENSO-like" decadal patterns. Analytic results are derived, which show that the patterns can, in theory, occur in the absence of any predictability beyond ENSO time-scales. In practice, however, modification to this stochastic view is needed to account for robust differences between ENSO-like decadal patterns and their interannual counterparts. An analysis of variability in the CGCM, a wind-forced shallow water model, and a simple mixed layer model, together with existing and new theoretical results, is used to improve upon this stochastic paradigm and to provide a new theory for the origin of decadal ENSO-like patterns like the Interdecadal Pacific Oscillation and Pacific Decadal Oscillation. In this theory, ENSO-driven wind-stress variability forces internal equatorially-trapped Kelvin waves that propagate towards the eastern boundary. Kelvin waves can excite reflected internal westward propagating equatorially-trapped Rossby waves (RWs) and coastally-trapped waves (CTWs). CTWs have no impact on the off-equatorial sub-surface ocean outside the coastal wave guide, whereas the RWs do. If the frequency of the incident wave is too high, then only CTWs are excited. At lower frequencies, both CTWs and RWs can be excited. The lower the frequency, the greater the fraction of energy transmitted to RWs. This lowers the characteristic frequency of variability off the equator relative to its equatorial counterpart. Both the eastern boundary interactions and the accumulation of

  5. Femtocells Sharing Management using mobility prediction model

    OpenAIRE

    Barth, Dominique; Choutri, Amira; Kloul, Leila; Marcé, Olivier

    2013-01-01

    The bandwidth sharing paradigm constitutes an incentive solution to the serious capacity management problem faced by operators, as femtocell owners are able to offer QoS-guaranteed network access to mobile users within their femtocell coverage. In this paper, we consider a technico-economic bandwidth sharing model based on a reinforcement learning algorithm. Because such a model does not allow the convergence of the learning algorithm, due to the small size of the femtocells, the mobile users velo...

  6. Toward integration of genomic selection with crop modelling: the development of an integrated approach to predicting rice heading dates.

    Science.gov (United States)

    Onogi, Akio; Watanabe, Maya; Mochizuki, Toshihiro; Hayashi, Takeshi; Nakagawa, Hiroshi; Hasegawa, Toshihiro; Iwata, Hiroyoshi

    2016-04-01

    It is suggested that accuracy in predicting plant phenotypes can be improved by integrating genomic prediction with crop modelling in a single hierarchical model. Accurate prediction of phenotypes is important for plant breeding and management. Although genomic prediction/selection aims to predict phenotypes on the basis of whole-genome marker information, it is often difficult to predict phenotypes of complex traits in diverse environments, because plant phenotypes are often influenced by genotype-environment interaction. A possible remedy is to integrate genomic prediction with crop/ecophysiological modelling, which enables us to predict plant phenotypes using environmental and management information. To this end, in the present study, we developed a novel method for integrating genomic prediction with phenological modelling of Asian rice (Oryza sativa, L.), allowing the heading date of untested genotypes in untested environments to be predicted. The method simultaneously infers the phenological model parameters and whole-genome marker effects on the parameters in a Bayesian framework. By cultivating backcross inbred lines of Koshihikari × Kasalath in nine environments, we evaluated the potential of the proposed method in comparison with conventional genomic prediction, phenological modelling, and two-step methods that applied genomic prediction to phenological model parameters inferred from Nelder-Mead or Markov chain Monte Carlo algorithms. In predicting heading dates of untested lines in untested environments, the proposed and two-step methods tended to provide more accurate predictions than the conventional genomic prediction methods, particularly in environments where phenotypes from environments similar to the target environment were unavailable for training genomic prediction. The proposed method showed greater accuracy in prediction than the two-step methods in all cross-validation schemes tested, suggesting the potential of the integrated approach in

  7. North Atlantic climate model bias influence on multiyear predictability

    Science.gov (United States)

    Wu, Y.; Park, T.; Park, W.; Latif, M.

    2018-01-01

    The influences of North Atlantic biases on multiyear predictability of unforced surface air temperature (SAT) variability are examined in the Kiel Climate Model (KCM). By employing a freshwater flux correction over the North Atlantic to the model, which strongly alleviates both North Atlantic sea surface salinity (SSS) and sea surface temperature (SST) biases, the freshwater flux-corrected integration depicts significantly enhanced multiyear SAT predictability in the North Atlantic sector in comparison to the uncorrected one. The enhanced SAT predictability in the corrected integration is due to a stronger and more variable Atlantic Meridional Overturning Circulation (AMOC) and its enhanced influence on North Atlantic SST. Results obtained from preindustrial control integrations of models participating in the Coupled Model Intercomparison Project Phase 5 (CMIP5) support the findings obtained from the KCM: models with large North Atlantic biases tend to have a weak AMOC influence on SAT and exhibit a smaller SAT predictability over the North Atlantic sector.

  8. Autonomy and social norms in a three factor grief model predicting perinatal grief in India.

    Science.gov (United States)

    Roberts, Lisa R; Lee, Jerry W

    2014-01-01

    Perinatal grief following stillbirth is a significant social and mental health burden. We examined associations among the latent variables autonomy, social norms, self-despair, strained coping, and acute grief in poor, rural women in India who had experienced stillbirth. A structural equation model was built and tested using quantitative data from 347 women of reproductive age in Chhattisgarh. Maternal acceptance of traditional social norms worsens self-despair and strained coping, and increases the autonomy granted to women. Greater autonomy increases acute grief. Greater despair and acute grief increase strained coping. Social and cultural factors were found to predict perinatal grief in India.

  9. Climate predictability and prediction skill on seasonal time scales over South America from CHFP models

    Science.gov (United States)

    Osman, Marisol; Vera, C. S.

    2017-10-01

    This work presents an assessment of the predictability and skill of climate anomalies over South America. The study considered a multi-model ensemble of seasonal forecasts for surface air temperature, precipitation and regional circulation from coupled global circulation models included in the Climate Historical Forecast Project. Predictability was evaluated through the estimation of the signal-to-total variance ratio, while prediction skill was assessed by computing anomaly correlation coefficients. Over the continent, both indicators show higher values in the tropics than in the extratropics for both surface air temperature and precipitation. Moreover, predictability and prediction skill for temperature are slightly higher in DJF than in JJA, while for precipitation they exhibit similar levels in both seasons. The largest values of predictability and skill for both variables and seasons are found over northwestern South America, while modest but still significant values are found for extratropical precipitation at southeastern South America and the extratropical Andes. The predictability levels of both variables in ENSO years are slightly higher, although with the same spatial distribution, than those obtained considering all years. Nevertheless, predictability at the tropics for both variables and seasons diminishes in both warm and cold ENSO years with respect to that in all years. The latter can be attributed to changes in signal rather than in the noise. Predictability and prediction skill for low-level winds and upper-level zonal winds over South America were also assessed. Maximum levels of predictability for low-level winds were found where maximum mean values are observed, i.e. the regions associated with the equatorial trade winds, the midlatitude westerlies and the South American Low-Level Jet. Predictability maxima for upper-level zonal winds are located where the subtropical jet peaks. Seasonal changes in wind predictability are observed that seem to be related to
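
    For reference, the two diagnostics named above (signal-to-total variance ratio and anomaly correlation coefficient) can be computed as in the sketch below for a toy ensemble of hindcasts; the array shapes and data are invented for illustration.

```python
# Hypothetical sketch: signal-to-total variance ratio and anomaly
# correlation coefficient (ACC) for an ensemble of seasonal hindcasts.
import numpy as np

rng = np.random.default_rng(4)
n_years, n_members = 30, 10
signal = rng.normal(0, 1.0, n_years)                        # predictable part
ens = signal[:, None] + rng.normal(0, 0.8, (n_years, n_members))  # ensemble anomalies
obs = signal + rng.normal(0, 0.8, n_years)                  # observed anomalies

ens_mean = ens.mean(axis=1)
signal_var = ens_mean.var()          # variance of the ensemble mean ("signal")
total_var = ens.var()                # total variance across years and members
ratio = signal_var / total_var

acc = np.corrcoef(ens_mean, obs)[0, 1]   # anomaly correlation coefficient
print(f"signal-to-total variance ratio: {ratio:.2f}   ACC: {acc:.2f}")
```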

  10. Prediction skill of rainstorm events over India in the TIGGE weather prediction models

    Science.gov (United States)

    Karuna Sagar, S.; Rajeevan, M.; Vijaya Bhaskara Rao, S.; Mitra, A. K.

    2017-12-01

    Extreme rainfall events pose a serious threat of leading to severe floods in many countries worldwide. Therefore, advance prediction of their occurrence and spatial distribution is essential. In this paper, an analysis has been made to assess the skill of numerical weather prediction models in predicting rainstorms over India. Using a gridded daily rainfall data set and objective criteria, 15 rainstorms were identified during the monsoon season (June to September). The analysis was made using three TIGGE (THe Observing System Research and Predictability Experiment (THORPEX) Interactive Grand Global Ensemble) models. The models considered are the European Centre for Medium-Range Weather Forecasts (ECMWF), the National Centre for Environmental Prediction (NCEP) and the UK Met Office (UKMO). Verification of the TIGGE models for 43 observed rainstorm days from 15 rainstorm events has been made for the period 2007-2015. The comparison reveals that rainstorm events are predictable up to 5 days in advance, although with a bias in spatial distribution and intensity. The statistical parameters mean error (ME) or bias, root mean square error (RMSE) and correlation coefficient (CC) have been computed over the rainstorm region using the multi-model ensemble (MME) mean. The study reveals that the spread is large in ECMWF and UKMO, followed by the NCEP model. Though the ensemble spread is quite small in NCEP, the ensemble member averages are not well predicted. The rank histograms suggest that the forecasts under-predict. The modified Contiguous Rain Area (CRA) technique was used to verify the spatial as well as the quantitative skill of the TIGGE models. Overall, the contribution from the displacement and pattern errors to the total RMSE is found to be larger in magnitude. The volume error increases from the 24 h forecast to the 48 h forecast in all three models.
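
    The basic verification statistics mentioned above (mean error or bias, RMSE, and the correlation coefficient of the multi-model ensemble mean against observations) reduce to a few lines of array arithmetic; the gridded fields below are synthetic placeholders.

```python
# Hypothetical sketch: ME (bias), RMSE and correlation coefficient for a
# multi-model ensemble (MME) mean rainfall forecast over a rainstorm region.
import numpy as np

rng = np.random.default_rng(5)
obs = rng.gamma(2.0, 10.0, size=(50, 60))               # observed rainfall (mm/day)
models = obs + rng.normal(3.0, 8.0, size=(3, 50, 60))   # three biased model forecasts
mme = models.mean(axis=0)                               # multi-model ensemble mean

me = np.mean(mme - obs)                                 # mean error (bias)
rmse = np.sqrt(np.mean((mme - obs) ** 2))               # root mean square error
cc = np.corrcoef(mme.ravel(), obs.ravel())[0, 1]        # correlation coefficient
print(f"ME={me:.2f} mm/day  RMSE={rmse:.2f} mm/day  CC={cc:.2f}")
```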

  11. Micro-mechanical studies on graphite strength prediction models

    Science.gov (United States)

    Kanse, Deepak; Khan, I. A.; Bhasin, V.; Vaze, K. K.

    2013-06-01

    The influence of the type of loading and of size effects on the failure strength of graphite was studied using the Weibull model. It was observed that this model over-predicts the size effect in tension. However, incorporating the grain size effect in the Weibull model allows a more realistic simulation of size effects. A numerical prediction of the strength of the four-point bend specimen was made using the Weibull parameters obtained from tensile test data. Effective volume calculations were carried out and the predicted strength was subsequently compared with experimental data. It was found that the Weibull model can predict mean flexural strength with reasonable accuracy even when the grain size effect was not incorporated. In addition, the effects of microstructural parameters on failure strength were analyzed using the Rose and Tucker model. Uni-axial tensile, three-point bend and four-point bend strengths were predicted using this model and compared with the experimental data. It was found that this model predicts flexural strength within 10%. For uni-axial tensile strength, the difference was 22%, which can be attributed to the smaller number of tests on tensile specimens. In order to develop a failure surface for graphite under a multi-axial state of stress, an open-ended hollow tube of graphite was subjected to internal pressure and axial load, and the Batdorf model was employed to calculate the failure probability of the tube. A bi-axial failure surface was generated in the first and fourth quadrants for 50% failure probability by varying both internal pressure and axial load.
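
    In the two-parameter Weibull model, the size effect enters through an effective volume term; a minimal sketch of that calculation is given below with invented parameter values, not the graphite data from the study.

```python
# Hypothetical sketch: two-parameter Weibull strength model with effective volume.
# P_f = 1 - exp(-(V_eff / V_0) * (sigma / sigma_0)^m)
import numpy as np

m = 10.0          # Weibull modulus (illustrative)
sigma_0 = 30.0    # characteristic strength (MPa) at reference volume V_0 (illustrative)
V_0 = 1.0e3       # reference volume, mm^3
V_eff = 4.0e3     # effective (stressed) volume of the specimen, mm^3

def failure_probability(sigma):
    return 1.0 - np.exp(-(V_eff / V_0) * (sigma / sigma_0) ** m)

for sigma in (20.0, 25.0, 30.0):
    print(f"sigma={sigma:5.1f} MPa  P_f={failure_probability(sigma):.3f}")

# Mean strength scales with effective volume as (V_0 / V_eff)^(1/m)
print(f"predicted mean-strength scale factor: {(V_0 / V_eff) ** (1 / m):.3f}")
```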

  12. New Approaches for Channel Prediction Based on Sinusoidal Modeling

    Directory of Open Access Journals (Sweden)

    Ekman Torbjörn

    2007-01-01

    Full Text Available Long-range channel prediction is considered to be one of the most important enabling technologies for future wireless communication systems. The prediction of Rayleigh fading channels is studied in the framework of sinusoidal modeling in this paper. A stochastic sinusoidal model to represent a Rayleigh fading channel is proposed. Three different predictors based on the statistical sinusoidal model are proposed. These methods outperform the standard linear predictor (LP) in Monte Carlo simulations, but underperform with real measurement data, probably due to nonstationary model parameters. To mitigate these modeling errors, a joint moving average and sinusoidal (JMAS) prediction model and the associated joint least-squares (LS) predictor are proposed. This approach combines the sinusoidal model with an LP to handle unmodeled dynamics in the signal. The joint LS predictor outperforms all the other sinusoidal LMMSE predictors in suburban environments, but still performs slightly worse than the standard LP in urban environments.
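
    For orientation, the standard linear predictor (LP) used as the benchmark above amounts to fitting AR coefficients by least squares and extrapolating. The sketch below does this on a synthetic narrowband signal; it is not the proposed JMAS predictor.

```python
# Hypothetical sketch: standard linear predictor (AR model fit by least squares)
# applied to a synthetic fading-channel-like time series.
import numpy as np

rng = np.random.default_rng(6)
t = np.arange(600)
# Synthetic narrowband signal (stand-in for one fading-channel tap)
x = np.sin(0.03 * t) + 0.5 * np.sin(0.071 * t + 1.0) + 0.1 * rng.normal(size=t.size)

p, horizon = 8, 5                          # AR order and prediction horizon
train, test = x[:500], x[500:]
N = len(train)

# Least-squares fit of AR(p) coefficients: x[n] ~ sum_k a[k-1] * x[n-k]
A = np.column_stack([train[p - k : N - k] for k in range(1, p + 1)])
a, *_ = np.linalg.lstsq(A, train[p:], rcond=None)

# Iterated multi-step prediction
recent = list(train[-p:])
preds = []
for _ in range(horizon):
    preds.append(float(np.dot(a, recent[::-1])))
    recent = recent[1:] + [preds[-1]]
print("predicted:", np.round(preds, 3))
print("actual:   ", np.round(test[:horizon], 3))
```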

  13. Prediction of high-energy radiation belt electron fluxes using a combined VERB-NARMAX model

    Science.gov (United States)

    Pakhotin, I. P.; Balikhin, M. A.; Shprits, Y.; Subbotin, D.; Boynton, R.

    2013-12-01

    This study is concerned with the modelling and forecasting of energetic electron fluxes that endanger satellites in space. By combining data-driven predictions from the NARMAX methodology with the physics-based VERB code, it becomes possible to predict electron fluxes with a high level of accuracy and across a radial distance from inside the local acceleration region to out beyond geosynchronous orbit. The model coupling also makes it possible to avoid accounting for seed electron variations at the outer boundary. Conversely, combining a convection code with the VERB and NARMAX models has the potential to provide even greater accuracy in forecasting that is not limited to geostationary orbit but makes predictions across the entire outer radiation belt region.

  14. Bayesian Age-Period-Cohort Modeling and Prediction - BAMP

    Directory of Open Access Journals (Sweden)

    Volker J. Schmid

    2007-10-01

    Full Text Available The software package BAMP provides a method for analyzing incidence or mortality data on the Lexis diagram, using a Bayesian version of an age-period-cohort model. A hierarchical model is assumed, with a binomial model in the first stage. As smoothing priors for the age, period and cohort parameters, random walks of first and second order, with and without an additional unstructured component, are available. Unstructured heterogeneity can also be included in the model. In order to evaluate the model fit, posterior deviance, DIC and predictive deviances are computed. By projecting the random walk prior into the future, future death rates can be predicted.

  15. Modeling for prediction of restrained shrinkage effect in concrete repair

    International Nuclear Information System (INIS)

    Yuan Yingshu; Li Guo; Cai Yue

    2003-01-01

    A general model of autogenous shrinkage caused by chemical reaction (chemical shrinkage) is developed by means of Arrhenius' law and a degree of chemical reaction. Models of tensile creep and relaxation modulus are built based on a viscoelastic, three-element model. Tests of free shrinkage and tensile creep were carried out to determine some coefficients in the models. Two-dimensional FEM analysis based on the models and other constitutive relations can predict the development of tensile strength and cracking. Three groups of patch-repaired beams were designed for analysis and testing. The prediction from the analysis shows agreement with the test results. The cracking mechanism after repair is discussed.

  16. A laboratory-scale comparison of rate of spread model predictions using chaparral fuel beds – preliminary results

    Science.gov (United States)

    D.R. Weise; E. Koo; X. Zhou; S. Mahalingam

    2011-01-01

    Observed fire spread rates from 240 laboratory fires in horizontally-oriented single-species live fuel beds were compared to predictions from various implementations and modifications of the Rothermel rate of spread model and a physical fire spread model developed by Pagni and Koo. Packing ratio of the laboratory fuel beds was generally greater than that observed in...

  17. Predicting Footbridge Response using Stochastic Load Models

    DEFF Research Database (Denmark)

    Pedersen, Lars; Frier, Christian

    2013-01-01

    Walking parameters such as step frequency, pedestrian mass, dynamic load factor, etc. are basically stochastic, although it is quite common to adopt deterministic models for these parameters. The present paper considers a stochastic approach to modeling the action of pedestrians, but when doing so, decisions need to be made in terms of the statistical distributions of walking parameters and in terms of the parameters describing those statistical distributions. The paper explores how sensitive computations of bridge response are to some of the decisions to be made in this respect. This is useful...

  18. Longitudinal Model Predicting Self-Concept in Pediatric Chronic Illness.

    Science.gov (United States)

    Emerson, Natacha D; Morrell, Holly E R; Neece, Cameron; Tapanes, Daniel; Distelberg, Brian

    2018-04-16

    Although self-concept has been identified as salient to the psychosocial adjustment of adolescents dealing with a chronic illness (CI), little research has focused on its predictors. Given that depression and parent-child attachment have been linked to self-concept in the population at large, the goal of this study was to evaluate these relationships longitudinally in a sample of adolescents with CI. Using participant data from the Mastering Each New Direction (MEND) program, a 3-month psychosocial, family-based intensive outpatient program for adolescents with CI, we employed multilevel modeling to test longitudinal changes in self-concept, as predicted by depressive symptoms and parent-child attachment, in a sample of 50 youths (mean age = 14.56 years, SD = 1.82) participating in MEND. Both "time spent in the program" and decreases in depressive symptoms were associated with increases in self-concept over time. Higher baseline levels of avoidant attachment to both mother and father were also associated with greater initial levels of self-concept. Targeting depressive symptoms and supporting adaptive changes in attachment may be key to promoting a healthy self-concept in pediatric CI populations. The association between avoidant attachment and higher baseline self-concept scores may reflect differences in participants' autonomy, self-confidence, or depression. Limitations of the study include variability in the amount of time spent in the program, attrition in final time-point measures, and the inability to fully examine and model all potential covariates due to a small sample size (e.g., power). © 2018 Family Process Institute.

  19. Uncertainties in model-based outcome predictions for treatment planning

    International Nuclear Information System (INIS)

    Deasy, Joseph O.; Chao, K.S. Clifford; Markman, Jerry

    2001-01-01

    Purpose: Model-based treatment-plan-specific outcome predictions (such as normal tissue complication probability [NTCP] or the relative reduction in salivary function) are typically presented without reference to underlying uncertainties. We provide a method to assess the reliability of treatment-plan-specific dose-volume outcome model predictions. Methods and Materials: A practical method is proposed for evaluating model prediction based on the original input data together with bootstrap-based estimates of parameter uncertainties. The general framework is applicable to continuous variable predictions (e.g., prediction of long-term salivary function) and dichotomous variable predictions (e.g., tumor control probability [TCP] or NTCP). Using bootstrap resampling, a histogram of the likelihood of alternative parameter values is generated. For a given patient and treatment plan we generate a histogram of alternative model results by computing the model predicted outcome for each parameter set in the bootstrap list. Residual uncertainty ('noise') is accounted for by adding a random component to the computed outcome values. The residual noise distribution is estimated from the original fit between model predictions and patient data. Results: The method is demonstrated using a continuous-endpoint model to predict long-term salivary function for head-and-neck cancer patients. Histograms represent the probabilities for the level of posttreatment salivary function based on the input clinical data, the salivary function model, and the three-dimensional dose distribution. For some patients there is significant uncertainty in the prediction of xerostomia, whereas for other patients the predictions are expected to be more reliable. In contrast, TCP and NTCP endpoints are dichotomous, and parameter uncertainties should be folded directly into the estimated probabilities, thereby improving the accuracy of the estimates. Using bootstrap parameter estimates, competing treatment
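
    The bootstrap procedure described above can be sketched in a few lines: refit the dose-response model on resampled patient data, then histogram the resulting per-plan predictions. The logistic dose-response form, the least-squares fitting, and all numbers below are illustrative assumptions rather than the authors' outcome model.

```python
# Hypothetical sketch: bootstrap-based uncertainty for a model-based outcome
# prediction (a logistic dose-response curve refitted to resampled data).
import numpy as np
from scipy.optimize import curve_fit

def logistic(d, d50, k):
    return 1.0 / (1.0 + np.exp(-k * (d - d50)))

rng = np.random.default_rng(7)
dose = rng.uniform(10, 70, 120)                      # mean organ dose (Gy), synthetic
outcome = (rng.random(dose.size) < logistic(dose, 45.0, 0.15)).astype(float)

plan_dose = 50.0                                     # dose for the plan being evaluated
preds = []
for _ in range(500):                                 # bootstrap resamples
    idx = rng.integers(0, dose.size, dose.size)
    try:
        popt, _ = curve_fit(logistic, dose[idx], outcome[idx],
                            p0=[45.0, 0.1], maxfev=5000)
        preds.append(logistic(plan_dose, *popt))
    except RuntimeError:
        continue                                     # skip non-converged resamples

preds = np.array(preds)
print(f"predicted complication probability: {preds.mean():.2f} "
      f"(90% interval {np.percentile(preds, 5):.2f}-{np.percentile(preds, 95):.2f})")
```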

  20. Validation of a tuber blight (Phytophthora infestans) prediction model

    Science.gov (United States)

    Potato tuber blight caused by Phytophthora infestans accounts for significant losses in storage. There is limited published quantitative data on predicting tuber blight. We validated a tuber blight prediction model developed in New York with cultivars Allegany, NY 101, and Katahdin using independent...

  1. Geospatial application of the Water Erosion Prediction Project (WEPP) Model

    Science.gov (United States)

    D. C. Flanagan; J. R. Frankenberger; T. A. Cochrane; C. S. Renschler; W. J. Elliot

    2011-01-01

    The Water Erosion Prediction Project (WEPP) model is a process-based technology for prediction of soil erosion by water at hillslope profile, field, and small watershed scales. In particular, WEPP utilizes observed or generated daily climate inputs to drive the surface hydrology processes (infiltration, runoff, ET) component, which subsequently impacts the rest of the...

  2. Reduced order modelling and predictive control of multivariable ...

    Indian Academy of Sciences (India)

    Anuj Abraham

    2018-03-16

    The performance of the constrained generalized predictive control scheme is found to be superior to that of the conventional PID controller in terms of overshoot, settling time and performance indices, mainly ISE, IAE and MSE. Keywords: predictive control; distillation column; reduced order model; dominant pole; ...

  3. Mixed models for predictive modeling in actuarial science

    NARCIS (Netherlands)

    Antonio, K.; Zhang, Y.

    2012-01-01

    We start with a general discussion of mixed (also called multilevel) models and continue with illustrating specific (actuarial) applications of this type of models. Technical details on (linear, generalized, non-linear) mixed models follow: model assumptions, specifications, estimation techniques

  4. Consensus models to predict endocrine disruption for all ...

    Science.gov (United States)

    Humans are potentially exposed to tens of thousands of man-made chemicals in the environment. It is well known that some environmental chemicals mimic natural hormones and thus have the potential to be endocrine disruptors. Most of these environmental chemicals have never been tested for their ability to disrupt the endocrine system, in particular, their ability to interact with the estrogen receptor. EPA needs tools to prioritize thousands of chemicals, for instance in the Endocrine Disruptor Screening Program (EDSP). The Collaborative Estrogen Receptor Activity Prediction Project (CERAPP) was intended to be a demonstration of the use of predictive computational models on HTS data, including ToxCast and Tox21 assays, to prioritize a large chemical universe of 32,464 unique structures for one specific molecular target: the estrogen receptor. CERAPP combined multiple computational models for prediction of estrogen receptor activity, and used the predicted results to build a unique consensus model. Models were developed in collaboration between 17 groups in the U.S. and Europe and applied to predict the common set of chemicals. Structure-based techniques such as docking and several QSAR modeling approaches were employed, mostly using a common training set of 1677 compounds provided by U.S. EPA, to build a total of 42 classification models and 8 regression models for binding, agonist and antagonist activity. All predictions were evaluated on ToxCast data and on an exte
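
    A consensus over many classifiers can be as simple as a (weighted) majority vote across their per-chemical predictions. The sketch below shows that idea on synthetic predictions; it is not the CERAPP consensus procedure itself.

```python
# Hypothetical sketch: majority-vote consensus over several binary
# activity predictions (1 = predicted estrogen-receptor active).
import numpy as np

rng = np.random.default_rng(8)
n_models, n_chemicals = 42, 1000
truth = rng.integers(0, 2, n_chemicals)
# Each model agrees with the truth ~75% of the time (synthetic accuracy)
preds = np.where(rng.random((n_models, n_chemicals)) < 0.75, truth, 1 - truth)

consensus = (preds.mean(axis=0) >= 0.5).astype(int)   # simple majority vote
acc_single = (preds[0] == truth).mean()
acc_consensus = (consensus == truth).mean()
print(f"single-model accuracy: {acc_single:.3f}  consensus accuracy: {acc_consensus:.3f}")
```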

  5. Dietary information improves cardiovascular disease risk prediction models.

    Science.gov (United States)

    Baik, I; Cho, N H; Kim, S H; Shin, C

    2013-01-01

    Data are limited on cardiovascular disease (CVD) risk prediction models that include dietary predictors. Using known risk factors and dietary information, we constructed and evaluated CVD risk prediction models. Data for modeling were from population-based prospective cohort studies comprising 9026 men and women aged 40-69 years. At baseline, all were free of known CVD and cancer, and they were followed up for CVD incidence during an 8-year period. We used Cox proportional hazards regression analysis to construct a traditional risk factor model, an office-based model, and two diet-containing models, and evaluated these models by calculating the Akaike information criterion (AIC), C-statistics, integrated discrimination improvement (IDI), net reclassification improvement (NRI) and a calibration statistic. We constructed diet-containing models with significant dietary predictors such as poultry, legumes, carbonated soft drinks or green tea consumption. Adding dietary predictors to the traditional model yielded a decrease in AIC (delta AIC=15), a 53% increase in relative IDI, and improved reclassification (category-free NRI=0.14); compared with the office-based model, the diet-containing models also showed improved reclassification (category-free NRI=0.08, P<0.01). The calibration plots for risk prediction demonstrated that the inclusion of dietary predictors contributes to better agreement in persons at high risk for CVD. C-statistics for the four models were acceptable and comparable. We suggest that dietary information may be useful in constructing CVD risk prediction models.

  6. Scanpath Based N-Gram Models for Predicting Reading Behavior

    DEFF Research Database (Denmark)

    Mishra, Abhijit; Bhattacharyya, Pushpak; Carl, Michael

    2013-01-01

    Predicting reading behavior is a difficult task. Reading behavior depends on various linguistic factors (e.g. sentence length, structural complexity etc.) and other factors (e.g individual's reading style, age etc.). Ideally, a reading model should be similar to a language model where the model i...

  7. Unsupervised ship trajectory modeling and prediction using compression and clustering

    NARCIS (Netherlands)

    de Vries, G.; van Someren, M.; van Erp, M.; Stehouwer, H.; van Zaanen, M.

    2009-01-01

    In this paper we show how to build a model of ship trajectories in a certain maritime region and use this model to predict future ship movements. The presented method is unsupervised and based on existing compression (line-simplification) and clustering techniques. We evaluate the model with a

  8. Prediction of annual rainfall pattern using Hidden Markov Model ...

    African Journals Online (AJOL)

    A hidden Markov model to predict annual rainfall pattern has been presented in this paper. The model is developed to provide necessary information for the farmers, agronomists, water resource management scientists and policy makers to enable them plan for the uncertainty of annual rainfall. The model classified annual ...

  9. The Selection of Turbulence Models for Prediction of Room Airflow

    DEFF Research Database (Denmark)

    Nielsen, Peter V.

    This paper discusses the use of different turbulence models and their advantages in given situations. As an example, it is shown that a simple zero-equation model can be used for the prediction of special situations such as flow with a low level of turbulence. A zero-equation model with compensation...

  10. Model Predictive Control of Wind Turbines using Uncertain LIDAR Measurements

    DEFF Research Database (Denmark)

    Mirzaei, Mahmood; Soltani, Mohsen; Poulsen, Niels Kjølstad

    2013-01-01

    The problem of model predictive control (MPC) of wind turbines using uncertain LIDAR (LIght Detection And Ranging) measurements is considered. A nonlinear dynamical model of the wind turbine is obtained. We linearize the obtained nonlinear model for different operating points, which are determined...

  11. Using Pareto points for model identification in predictive toxicology

    Science.gov (United States)

    2013-01-01

    Predictive toxicology is concerned with the development of models that are able to predict the toxicity of chemicals. A reliable prediction of toxic effects of chemicals in living systems is highly desirable in cosmetics, drug design or food protection to speed up the process of chemical compound discovery while reducing the need for lab tests. There is an extensive literature associated with the best practice of model generation and data integration but management and automated identification of relevant models from available collections of models is still an open problem. Currently, the decision on which model should be used for a new chemical compound is left to users. This paper intends to initiate the discussion on automated model identification. We present an algorithm, based on Pareto optimality, which mines model collections and identifies a model that offers a reliable prediction for a new chemical compound. The performance of this new approach is verified for two endpoints: IGC50 and LogP. The results show a great potential for automated model identification methods in predictive toxicology. PMID:23517649

  12. Integrating geophysics and hydrology for reducing the uncertainty of groundwater model predictions and improved prediction performance

    DEFF Research Database (Denmark)

    Christensen, Nikolaj Kruse; Christensen, Steen; Ferre, Ty

    constructed from geological and hydrological data. However, geophysical data are increasingly used to inform hydrogeologic models because they are collected at lower cost and much higher density than geological and hydrological data. Despite the increased use of geophysics, it is still unclear whether the integration of geophysical data in the construction of a groundwater model increases the prediction performance. We suggest that modelers should perform a hydrogeophysical “test-bench” analysis of the likely value of geophysical data for improving groundwater model prediction performance before actually collecting geophysical data. At a minimum, an analysis should be conducted assuming settings that are favorable for the chosen geophysical method. If the analysis suggests that data collected by the geophysical method is unlikely to improve model prediction performance under these favorable settings...

  13. Hybrid Corporate Performance Prediction Model Considering Technical Capability

    Directory of Open Access Journals (Sweden)

    Joonhyuck Lee

    2016-07-01

    Full Text Available Many studies have tried to predict corporate performance and stock prices to enhance investment profitability using qualitative approaches such as the Delphi method. However, developments in data processing technology and machine-learning algorithms have resulted in efforts to develop quantitative prediction models in various managerial subject areas. We propose a quantitative corporate performance prediction model that applies the support vector regression (SVR algorithm to solve the problem of the overfitting of training data and can be applied to regression problems. The proposed model optimizes the SVR training parameters based on the training data, using the genetic algorithm to achieve sustainable predictability in changeable markets and managerial environments. Technology-intensive companies represent an increasing share of the total economy. The performance and stock prices of these companies are affected by their financial standing and their technological capabilities. Therefore, we apply both financial indicators and technical indicators to establish the proposed prediction model. Here, we use time series data, including financial, patent, and corporate performance information of 44 electronic and IT companies. Then, we predict the performance of these companies as an empirical verification of the prediction performance of the proposed model.

  14. Preoperative prediction model of outcome after cholecystectomy for symptomatic gallstones

    DEFF Research Database (Denmark)

    Borly, L; Anderson, I B; Bardram, Linda

    1999-01-01

    and sonography evaluated gallbladder motility, gallstones, and gallbladder volume. Preoperative variables in patients with or without postcholecystectomy pain were compared statistically, and significant variables were combined in a logistic regression model to predict the postoperative outcome. RESULTS: Eighty... and by the absence of 'agonizing' pain and of symptoms coinciding with pain. In the model, 15 of 18 patients predicted to have pain had postoperative pain (PVpos = 0.83). Of 62 patients predicted as having no pain postoperatively, 56 were pain-free (PVneg = 0.90). Overall accuracy was 89%. CONCLUSION: From this prospective study a model based on preoperative symptoms was developed to predict postcholecystectomy pain. Since intrastudy reclassification may give too optimistic results, the model should be validated in future studies....

  15. Prediction of Chemical Function: Model Development and Application

    Science.gov (United States)

    The United States Environmental Protection Agency’s Exposure Forecaster (ExpoCast) project is developing both statistical and mechanism-based computational models for predicting exposures to thousands of chemicals, including those in consumer products. The high-throughput (...

  16. Linear regression crash prediction models : issues and proposed solutions.

    Science.gov (United States)

    2010-05-01

    The paper develops a linear regression model approach that can be applied to crash data to predict vehicle crashes. The proposed approach involves novel data aggregation to satisfy linear regression assumptions, namely error structure normality ...

  17. FPGA implementation of predictive degradation model for engine oil lifetime

    Science.gov (United States)

    Idros, M. F. M.; Razak, A. H. A.; Junid, S. A. M. Al; Suliman, S. I.; Halim, A. K.

    2018-03-01

    This paper presents the implementation of a linear regression model for degradation prediction at the Register Transfer Logic (RTL) level using Quartus II. A stationary model was identified in the degradation trend of the engine oil in a vehicle using a time series method. For the RTL implementation, the degradation model is written in Verilog HDL and the data inputs are taken at set times. A clock divider was designed to support the timing sequence of the input data. For every five data points, a regression analysis is applied to determine the slope variation and compute the prediction. Here, only negative values are taken into consideration for prediction purposes, to reduce the number of logic gates. The least squares method is applied to obtain the best linear model based on the mean values of the time series data. The coded algorithm has been implemented on an FPGA for validation purposes. The result shows the predicted time to change the engine oil.
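
    The underlying arithmetic, a least-squares slope over each window of five samples extrapolated to a threshold, is easy to prototype in software before committing it to RTL. The sketch below is such a software illustration with invented oil-quality data; it is not the Verilog implementation.

```python
# Hypothetical sketch: 5-point least-squares slope of an oil-degradation signal,
# extrapolated to estimate remaining lifetime before a threshold is reached.
import numpy as np

rng = np.random.default_rng(9)
t = np.arange(0, 100, 5.0)                                # sample times (hours)
quality = 100.0 - 0.6 * t + rng.normal(0, 1.0, t.size)    # synthetic oil-quality index
threshold = 40.0                  # change the oil when quality drops below this value

window = 5
for i in range(window, t.size + 1):
    tw, qw = t[i - window : i], quality[i - window : i]
    slope, intercept = np.polyfit(tw, qw, 1)              # least-squares line over window
    if slope < 0:                                         # only negative slopes matter
        t_change = (threshold - intercept) / slope
        remaining = t_change - tw[-1]
        print(f"t={tw[-1]:5.1f} h  slope={slope:6.3f}  est. change in {remaining:6.1f} h")
```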

  18. Predictive Modeling: A New Paradigm for Managing Endometrial Cancer.

    Science.gov (United States)

    Bendifallah, Sofiane; Daraï, Emile; Ballester, Marcos

    2016-03-01

    With the abundance of new options in diagnostic and treatment modalities, a shift in the medical decision process for endometrial cancer (EC) has been observed. The emergence of individualized medicine and the increasing complexity of available medical data have led to the development of several prediction models. In EC, such clinical models (algorithms, nomograms, and risk scoring systems) have been reported, especially for stratifying and subgrouping patients, with various questions still unanswered, such as the optimal surgical staging for lymph node metastasis and the assessment of recurrence and survival outcomes. In this review, we highlight existing prognostic and predictive models in EC, with a specific focus on their clinical applicability. We also discuss the methodologic aspects of the development of such predictive models and the steps that are required to integrate these tools into clinical decision making. In the future, the emerging field of molecular or biochemical marker research may substantially improve predictive and treatment approaches.

  19. On the Predictiveness of Single-Field Inflationary Models

    CERN Document Server

    Burgess, C.P.; Trott, Michael

    2014-01-01

    We re-examine the predictiveness of single-field inflationary models and discuss how an unknown UV completion can complicate determining inflationary model parameters from observations, even from precision measurements. Besides the usual naturalness issues associated with having a shallow inflationary potential, we describe another issue for inflation, namely, unknown UV physics modifies the running of Standard Model (SM) parameters and thereby introduces uncertainty into the potential inflationary predictions. We illustrate this point using the minimal Higgs Inflationary scenario, which is arguably the most predictive single-field model on the market, because its predictions for $A_s$, $r$ and $n_s$ are made using only one new free parameter beyond those measured in particle physics experiments, and run up to the inflationary regime. We find that this issue can already have observable effects. At the same time, this UV-parameter dependence in the Renormalization Group allows Higgs Inflation to occur (in prin...

  20. Predictive modeling in catalysis - from dream to reality

    NARCIS (Netherlands)

    Maldonado, A.G.; Rothenberg, G.

    2009-01-01

    In silico catalyst optimization is the ultimate application of computers in catalysis. This article provides an overview of the basic concepts of predictive modeling and describes how this technique can be used in catalyst and reaction design.

  1. Fuzzy model predictive control algorithm applied in nuclear power plant

    International Nuclear Information System (INIS)

    Zuheir, Ahmad

    2006-01-01

    The aim of this paper is to design a predictive controller based on a fuzzy model. The Takagi-Sugeno fuzzy model with an adaptive B-splines neuro-fuzzy implementation is used and incorporated as a predictor in a predictive controller. An optimization approach with a simplified gradient technique is used to calculate predictions of the future control actions. In this approach, adaptation of the fuzzy model using dynamic process information is carried out to build the predictive controller. The easy description of the fuzzy model and the easy computation of the gradient vector during the optimization procedure are the main advantages of the computation algorithm. The algorithm is applied to the control of a U-tube steam generator unit (UTSG) used for electricity generation. (author)

  2. Compensatory versus noncompensatory models for predicting consumer preferences

    Directory of Open Access Journals (Sweden)

    Anja Dieckmann

    2009-04-01

    Full Text Available Standard preference models in consumer research assume that people weigh and add all attributes of the available options to derive a decision, while there is growing evidence for the use of simplifying heuristics. Recently, a greedoid algorithm has been developed (Yee, Dahan, Hauser and Orlin, 2007; Kohli and Jedidi, 2007) to model lexicographic heuristics from preference data. We compare the predictive accuracies of the greedoid approach and standard conjoint analysis in an online study with a rating and a ranking task. The lexicographic model derived from the greedoid algorithm was better at predicting ranking compared to rating data, but overall, it achieved lower predictive accuracy for hold-out data than the compensatory model estimated by conjoint analysis. However, a considerable minority of participants was better predicted by lexicographic strategies. We conclude that the new algorithm will not replace standard tools for analyzing preferences, but can boost the study of situational and individual differences in preferential choice processes.
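
    The two model families being compared can be caricatured in a few lines: a weighted-additive (compensatory) scorer versus a lexicographic rule that orders options attribute by attribute. This is an illustration of the distinction only, not the greedoid algorithm or the conjoint estimation used in the study.

```python
# Hypothetical sketch: compensatory (weighted-additive) vs. lexicographic
# ranking of product profiles described by three attributes (higher = better).
import numpy as np

options = np.array([
    [3, 1, 2],    # option A: attribute levels
    [2, 3, 3],    # option B
    [3, 2, 1],    # option C
])

# Compensatory model: weighted sum of attribute levels
weights = np.array([0.5, 0.3, 0.2])
compensatory_rank = np.argsort(-options @ weights)

# Lexicographic heuristic: sort by the most important attribute,
# break ties with the next attribute, and so on (importance = column order)
lexicographic_rank = sorted(range(len(options)), key=lambda i: tuple(-options[i]))

print("compensatory order: ", compensatory_rank)
print("lexicographic order:", lexicographic_rank)
```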

  3. Predictive Modeling of Partitioned Systems: Implementation and Applications

    OpenAIRE

    Latten, Christine

    2014-01-01

    A general mathematical methodology for predictive modeling of coupled multi-physics systems is implemented and has been applied without change to an illustrative heat conduction example and reactor physics benchmarks.

  4. A new, accurate predictive model for incident hypertension

    DEFF Research Database (Denmark)

    Völzke, Henry; Fung, Glenn; Ittermann, Till

    2013-01-01

    Data mining represents an alternative approach to identify new predictors of multifactorial diseases. This work aimed at building an accurate predictive model for incident hypertension using data mining procedures....

  5. Model Predictive Control for Ethanol Steam Reformers

    OpenAIRE

    Li, Mingming

    2014-01-01

    This thesis firstly proposes a new approach of modelling an ethanol steam reformer (ESR) for producing pure hydrogen. Hydrogen has obvious benefits as an alternative for feeding the proton exchange membrane fuel cells (PEMFCs) to produce electricity. However, an important drawback is that the hydrogen distribution and storage have high cost. So the ESR is regarded as a way to overcome these difficulties. Ethanol is currently considered as a promising energy source under the res...

  6. Haskell financial data modeling and predictive analytics

    CERN Document Server

    Ryzhov, Pavel

    2013-01-01

    This book is a hands-on guide that teaches readers how to use Haskell's tools and libraries to analyze data from real-world sources in an easy-to-understand manner.This book is great for developers who are new to financial data modeling using Haskell. A basic knowledge of functional programming is not required but will be useful. An interest in high frequency finance is essential.

  7. Wireless model predictive control: Application to water-level system

    Directory of Open Access Journals (Sweden)

    Ramdane Hedjar

    2016-04-01

    Full Text Available This article deals with wireless model predictive control of a water-level control system. The objective of the model predictive control algorithm is to constrain the control signal within saturation limits and maintain the water level around the desired level. Linear modeling of any nonlinear plant leads to parameter uncertainties and non-modeled dynamics in the linearized mathematical model. These uncertainties induce a steady-state error in the output response of the water level. To eliminate this steady-state error and increase the robustness of the control algorithm, an integral action is included in the closed loop. To control the water-level system remotely, the communication between the controller and the process is performed over a radio channel. To validate the proposed scheme, simulation and real-time implementation of the algorithm have been conducted, and the results show the effectiveness of wireless model predictive control with integral action.

  8. Aqua/Aura Updated Inclination Adjust Maneuver Performance Prediction Model

    Science.gov (United States)

    Boone, Spencer

    2017-01-01

    This presentation will discuss the updated Inclination Adjust Maneuver (IAM) performance prediction model that was developed for Aqua and Aura following the 2017 IAM series. This updated model uses statistical regression methods to identify potential long-term trends in maneuver parameters, yielding improved predictions when re-planning past maneuvers. The presentation has been reviewed and approved by Eric Moyer, ESMO Deputy Project Manager.

  9. Approximating prediction uncertainty for random forest regression models

    Science.gov (United States)

    John W. Coulston; Christine E. Blinn; Valerie A. Thomas; Randolph H. Wynne

    2016-01-01

    Machine learning approaches such as random forest are increasingly used for the spatial modeling and mapping of continuous variables. Random forest is a non-parametric ensemble approach, and unlike traditional regression approaches there is no direct quantification of prediction error. Understanding prediction uncertainty is important when using model-based continuous maps as
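
    One common way to approximate prediction uncertainty for a random forest regressor is to look at the spread of the individual trees' predictions. The sketch below does this with scikit-learn on synthetic data; it illustrates the general idea rather than the specific method developed in the study.

```python
# Hypothetical sketch: approximate prediction uncertainty for random forest
# regression from the spread of individual tree predictions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(10)
X = rng.uniform(-3, 3, size=(500, 2))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + rng.normal(0, 0.2, 500)

rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

X_new = np.array([[0.5, -1.0], [2.5, 2.5]])
per_tree = np.stack([t.predict(X_new) for t in rf.estimators_])   # (n_trees, n_points)
mean, std = per_tree.mean(axis=0), per_tree.std(axis=0)
for i, x in enumerate(X_new):
    print(f"x={x}  prediction={mean[i]:.2f} +/- {std[i]:.2f} (tree spread)")
```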

  10. Prediction of cloud droplet number in a general circulation model

    Energy Technology Data Exchange (ETDEWEB)

    Ghan, S.J.; Leung, L.R. [Pacific Northwest National Lab., Richland, WA (United States)

    1996-04-01

    We have applied the Colorado State University Regional Atmospheric Modeling System (RAMS) bulk cloud microphysics parameterization to the treatment of stratiform clouds in the National Center for Atmospheric Research Community Climate Model (CCM2). The RAMS predicts mass concentrations of cloud water, cloud ice, rain and snow, and the number concentration of ice. We have introduced the droplet number conservation equation to predict droplet number and its dependence on aerosols.

  11. The Next Page Access Prediction Using Makov Model

    OpenAIRE

    Deepti Razdan

    2011-01-01

    Predicting the next page to be accessed by Web users has attracted a large amount of research. In this paper, a new web usage mining approach is proposed to predict next-page access. It is proposed to identify similar access patterns from the web log using K-means clustering, and then a Markov model is used for prediction of next page accesses. The tightness of clusters is improved by setting a similarity threshold while forming clusters. In traditional recommendation models, clustering by nonsequential d...
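
    A first-order Markov model for next-page prediction is essentially a table of transition counts normalized into probabilities. The sketch below builds one from synthetic sessions; the clustering step from the paper is omitted.

```python
# Hypothetical sketch: first-order Markov model for next-page prediction,
# built from page-visit sessions (clustering step omitted).
from collections import Counter, defaultdict

sessions = [                      # synthetic click-stream sessions
    ["home", "catalog", "item", "cart"],
    ["home", "search", "item", "item"],
    ["home", "catalog", "item", "item", "cart"],
]

transitions = defaultdict(Counter)
for s in sessions:
    for cur, nxt in zip(s, s[1:]):
        transitions[cur][nxt] += 1

def predict_next(page):
    """Most likely next page and its estimated transition probability."""
    counts = transitions[page]
    nxt, c = counts.most_common(1)[0]
    return nxt, c / sum(counts.values())

print(predict_next("home"))       # e.g. ('catalog', 0.67)
print(predict_next("item"))
```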

  12. Working Towards a Risk Prediction Model for Neural Tube Defects

    Science.gov (United States)

    Agopian, A.J.; Lupo, Philip J.; Tinker, Sarah C.; Canfield, Mark A.; Mitchell, Laura E.

    2015-01-01

    BACKGROUND Several risk factors have been consistently associated with neural tube defects (NTDs). However, the predictive ability of these risk factors in combination has not been evaluated. METHODS To assess the predictive ability of established risk factors for NTDs, we built predictive models using data from the National Birth Defects Prevention Study, which is a large, population-based study of nonsyndromic birth defects. Cases with spina bifida or anencephaly, or both (n = 1239), and controls (n = 8494) were randomly divided into separate training (75% of cases and controls) and validation (remaining 25%) samples. Multivariable logistic regression models were constructed with the training samples. The predictive ability of these models was evaluated in the validation samples by assessing the area under the receiver operating characteristic curves. An ordinal predictive risk index was also constructed and evaluated. In addition, the ability of classification and regression tree (CART) analysis to identify subgroups of women at increased risk for NTDs in offspring was evaluated. RESULTS The predictive ability of the multivariable models was poor (area under the receiver operating curve: 0.55 for spina bifida only, 0.59 for anencephaly only, and 0.56 for anencephaly and spina bifida combined). The predictive abilities of the ordinal risk indexes and CART models were also low. CONCLUSION Current established risk factors for NTDs are insufficient for population-level prediction of a woman's risk of having affected offspring. Identification of genetic risk factors and novel nongenetic risk factors will be critical to establishing models with good predictive ability for NTDs. PMID:22253139

  13. Watershed regressions for pesticides (warp) models for predicting atrazine concentrations in Corn Belt streams

    Science.gov (United States)

    Stone, Wesley W.; Gilliom, Robert J.

    2012-01-01

    Watershed Regressions for Pesticides (WARP) models, previously developed for atrazine at the national scale, are improved for application to the United States (U.S.) Corn Belt region by developing region-specific models that include watershed characteristics that are influential in predicting atrazine concentration statistics within the Corn Belt. WARP models for the Corn Belt (WARP-CB) were developed for annual maximum moving-average (14-, 21-, 30-, 60-, and 90-day durations) and annual 95th-percentile atrazine concentrations in streams of the Corn Belt region. The WARP-CB models accounted for 53 to 62% of the variability in the various concentration statistics among the model-development sites. Model predictions were within a factor of 5 of the observed concentration statistic for over 90% of the model-development sites. The WARP-CB residuals and uncertainty are lower than those of the National WARP model for the same sites. Although atrazine-use intensity is the most important explanatory variable in the National WARP models, it is not a significant variable in the WARP-CB models. The WARP-CB models provide improved predictions for Corn Belt streams draining watersheds with atrazine-use intensities of 17 kg/km2 of watershed area or greater.

  14. Predictive QSAR Models for the Toxicity of Disinfection Byproducts.

    Science.gov (United States)

    Qin, Litang; Zhang, Xin; Chen, Yuhan; Mo, Lingyun; Zeng, Honghu; Liang, Yanpeng

    2017-10-09

    Several hundred disinfection byproducts (DBPs) in drinking water have been identified, and are known to have potentially adverse health effects. There are toxicological data gaps for most DBPs, and the predictive method may provide an effective way to address this. The development of an in-silico model of toxicology endpoints of DBPs is rarely studied. The main aim of the present study is to develop predictive quantitative structure-activity relationship (QSAR) models for the reactive toxicities of 50 DBPs in the five bioassays of X-Microtox, GSH+, GSH-, DNA+ and DNA-. All-subset regression was used to select the optimal descriptors, and multiple linear-regression models were built. The developed QSAR models for five endpoints satisfied the internal and external validation criteria: coefficient of determination (R²) > 0.7, explained variance in leave-one-out prediction (Q²LOO) and in leave-many-out prediction (Q²LMO) > 0.6, variance explained in external prediction (Q²F1, Q²F2, and Q²F3) > 0.7, and concordance correlation coefficient (CCC) > 0.85. The application domains and the meaning of the selective descriptors for the QSAR models were discussed. The obtained QSAR models can be used in predicting the toxicities of the 50 DBPs.

  15. Predictive QSAR Models for the Toxicity of Disinfection Byproducts

    Directory of Open Access Journals (Sweden)

    Litang Qin

    2017-10-01

    Full Text Available Several hundred disinfection byproducts (DBPs) in drinking water have been identified, and are known to have potentially adverse health effects. There are toxicological data gaps for most DBPs, and the predictive method may provide an effective way to address this. The development of an in-silico model of toxicology endpoints of DBPs is rarely studied. The main aim of the present study is to develop predictive quantitative structure–activity relationship (QSAR) models for the reactive toxicities of 50 DBPs in the five bioassays of X-Microtox, GSH+, GSH−, DNA+ and DNA−. All-subset regression was used to select the optimal descriptors, and multiple linear-regression models were built. The developed QSAR models for five endpoints satisfied the internal and external validation criteria: coefficient of determination (R2) > 0.7, explained variance in leave-one-out prediction (Q2LOO) and in leave-many-out prediction (Q2LMO) > 0.6, variance explained in external prediction (Q2F1, Q2F2, and Q2F3) > 0.7, and concordance correlation coefficient (CCC) > 0.85. The application domains and the meaning of the selective descriptors for the QSAR models were discussed. The obtained QSAR models can be used in predicting the toxicities of the 50 DBPs.

  16. Nonconvex Model Predictive Control for Commercial Refrigeration

    DEFF Research Database (Denmark)

    Hovgaard, Tobias Gybel; Larsen, Lars F.S.; Jørgensen, John Bagterp

    2013-01-01

    function, however, is nonconvex due to the temperature dependence of thermodynamic efficiency. To handle this nonconvexity we propose a sequential convex optimization method, which typically converges in fewer than 5 or so iterations. We employ a fast convex quadratic programming solver to carry out the iterations, which is more than fast enough to run in real-time. We demonstrate our method on a realistic model, with a full year simulation and 15 minute time periods, using historical electricity prices and weather data, as well as random variations in thermal load. These simulations show substantial cost ... capacity associated with large penetration of intermittent renewable energy sources in a future smart grid.

  17. Maxent modelling for predicting the potential distribution of Thai Palms

    DEFF Research Database (Denmark)

    Tovaranonte, Jantrararuk; Barfod, Anders S.; Overgaard, Anne Blach

    2011-01-01

    Increasingly species distribution models are being used to address questions related to ecology, biogeography and species conservation on global and regional scales. We used the maximum entropy approach implemented in the MAXENT programme to build a habitat suitability model for Thai palms based...... overprediction of species distribution ranges. The models with the best predictive power were found by calculating the area under the curve (AUC) of receiver-operating characteristic (ROC). Here, we provide examples of contrasting predicted species distribution ranges as well as a map of modeled palm diversity...

  18. Validation of Fatigue Modeling Predictions in Aviation Operations

    Science.gov (United States)

    Gregory, Kevin; Martinez, Siera; Flynn-Evans, Erin

    2017-01-01

    Bio-mathematical fatigue models that predict levels of alertness and performance are one potential tool for use within integrated fatigue risk management approaches. A number of models have been developed that provide predictions based on acute and chronic sleep loss, circadian desynchronization, and sleep inertia. Some are publicly available and gaining traction in settings such as commercial aviation as a means of evaluating flight crew schedules for potential fatigue-related risks. Yet most models have not been rigorously evaluated and independently validated for the operations to which they are being applied, and many users are not fully aware of the limitations within which model results should be interpreted and applied.

  19. Aero-acoustic noise of wind turbines. Noise prediction models

    Energy Technology Data Exchange (ETDEWEB)

    Maribo Pedersen, B. [ed.]

    1997-12-31

    Semi-empirical and CAA (Computational AeroAcoustics) noise prediction techniques are the subject of this expert meeting. The meeting presents and discusses models and methods. The meeting may provide answers to the following questions: Which noise sources are the most important? How are the sources best modeled? What needs to be done to make better predictions? Does it boil down to correct prediction of the unsteady aerodynamics around the rotor? Or is the difficult part to convert the aerodynamics into acoustics? (LN)

  20. Using a Prediction Model to Manage Cyber Security Threats.

    Science.gov (United States)

    Jaganathan, Venkatesh; Cherurveettil, Priyesh; Muthu Sivashanmugam, Premapriya

    2015-01-01

    Cyber-attacks are an important issue faced by all organizations. Securing information systems is critical. Organizations should be able to understand the ecosystem and predict attacks. Predicting attacks quantitatively should be part of risk management. The cost impact due to worms, viruses, or other malicious software is significant. This paper proposes a mathematical model to predict the impact of an attack based on significant factors that influence cyber security. This model also considers the environmental information required. It is generalized and can be customized to the needs of the individual organization.

  1. Using a Prediction Model to Manage Cyber Security Threats

    Directory of Open Access Journals (Sweden)

    Venkatesh Jaganathan

    2015-01-01

    Full Text Available Cyber-attacks are an important issue faced by all organizations. Securing information systems is critical. Organizations should be able to understand the ecosystem and predict attacks. Predicting attacks quantitatively should be part of risk management. The cost impact due to worms, viruses, or other malicious software is significant. This paper proposes a mathematical model to predict the impact of an attack based on significant factors that influence cyber security. This model also considers the environmental information required. It is generalized and can be customized to the needs of the individual organization.

  2. Predictions for mt and MW in minimal supersymmetric models

    International Nuclear Information System (INIS)

    Buchmueller, O.; Ellis, J.R.; Flaecher, H.; Isidori, G.

    2009-12-01

    Using a frequentist analysis of experimental constraints within two versions of the minimal supersymmetric extension of the Standard Model, we derive the predictions for the top quark mass, m_t, and the W boson mass, m_W. We find that the supersymmetric predictions for both m_t and m_W, obtained by incorporating all the relevant experimental information and state-of-the-art theoretical predictions, are highly compatible with the experimental values with small remaining uncertainties, yielding an improvement compared to the case of the Standard Model. (orig.)

  3. Webinar of paper 2013, Which method predicts recidivism best? A comparison of statistical, machine learning and data mining predictive models

    NARCIS (Netherlands)

    Tollenaar, N.; Van der Heijden, P.G.M.

    2013-01-01

    Using criminal population conviction history information, prediction models are developed that predict three types of criminal recidivism: general recidivism, violent recidivism and sexual recidivism. The research question is whether prediction techniques from modern statistics, data mining

  4. Preprocedural Prediction Model for Contrast-Induced Nephropathy Patients.

    Science.gov (United States)

    Yin, Wen-Jun; Yi, Yi-Hu; Guan, Xiao-Feng; Zhou, Ling-Yun; Wang, Jiang-Lin; Li, Dai-Yang; Zuo, Xiao-Cong

    2017-02-03

    Several models have been developed for prediction of contrast-induced nephropathy (CIN); however, they only contain patients receiving intra-arterial contrast media for coronary angiographic procedures, which represent a small proportion of all contrast procedures. In addition, most of them evaluate radiological interventional procedure-related variables. So it is necessary for us to develop a model for prediction of CIN before radiological procedures among patients administered contrast media. A total of 8800 patients undergoing contrast administration were randomly assigned in a 4:1 ratio to development and validation data sets. CIN was defined as an increase of 25% and/or 0.5 mg/dL in serum creatinine within 72 hours above the baseline value. Preprocedural clinical variables were used to develop the prediction model from the training data set by the machine learning method of random forest, and 5-fold cross-validation was used to evaluate the prediction accuracies of the model. Finally we tested this model in the validation data set. The incidence of CIN was 13.38%. We built a prediction model with 13 preprocedural variables selected from 83 variables. The model obtained an area under the receiver-operating characteristic (ROC) curve (AUC) of 0.907 and gave prediction accuracy of 80.8%, sensitivity of 82.7%, specificity of 78.8%, and Matthews correlation coefficient of 61.5%. For the first time, 3 new factors are included in the model: the decreased sodium concentration, the INR value, and the preprocedural glucose level. The newly established model shows excellent predictive ability of CIN development and thereby provides preventative measures for CIN. © 2017 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.
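
    As a rough illustration of the pipeline described above (a random-forest classifier on preprocedural variables, 5-fold cross-validation, ROC AUC, and a 4:1 development/validation split), a scikit-learn sketch might look as follows; the file name and column names are placeholders, not the study's data.

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import cross_val_score, train_test_split

# Hypothetical table: one row per patient, preprocedural variables plus a binary CIN label.
df = pd.read_csv("cin_preprocedural.csv")          # assumed file name
X, y = df.drop(columns=["CIN"]), df["CIN"]

X_dev, X_val, y_dev, y_val = train_test_split(X, y, test_size=0.2,
                                              stratify=y, random_state=42)

rf = RandomForestClassifier(n_estimators=500, random_state=42)
cv_auc = cross_val_score(rf, X_dev, y_dev, cv=5, scoring="roc_auc")  # 5-fold CV
rf.fit(X_dev, y_dev)

val_auc = roc_auc_score(y_val, rf.predict_proba(X_val)[:, 1])
print(f"CV AUC = {cv_auc.mean():.3f}, validation AUC = {val_auc:.3f}")
```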

  5. Risk Prediction Models for Oral Clefts Allowing for Phenotypic Heterogeneity

    Directory of Open Access Journals (Sweden)

    Yalu eWen

    2015-08-01

    Full Text Available Oral clefts are common birth defects that have a major impact on the affected individual, their family and society. World-wide, the incidence of oral clefts is 1/700 live births, making them the most common craniofacial birth defects. The successful prediction of oral clefts may help identify sub-populations at high risk and promote new diagnostic and therapeutic strategies. Nevertheless, developing a clinically useful oral clefts risk prediction model remains a great challenge. Compelling evidence suggests the etiologies of oral clefts are highly heterogeneous, and the development of a risk prediction model that takes phenotypic heterogeneity into account may potentially improve its accuracy. In this study, we applied a previously developed statistical method to investigate risk prediction on sub-phenotypes of oral clefts. Our results suggested that subtypes of cleft lip and palate have genetic etiologies (AUC=0.572) similar to those of subtypes of cleft lip only (AUC=0.589), while the subtypes of cleft palate only (CPO) have heterogeneous underlying mechanisms (AUCs for soft CPO and hard CPO are 0.617 and 0.623, respectively). This highlights the potential that the hard and soft forms of CPO have their own mechanisms despite sharing some of the genetic risk factors. Compared with conventional methods for risk prediction modeling, our method considers the phenotypic heterogeneity of a disease, which potentially improves the accuracy of prediction for each sub-phenotype of oral clefts.

  6. Model output statistics applied to wind power prediction

    Energy Technology Data Exchange (ETDEWEB)

    Joensen, A.; Giebel, G.; Landberg, L. [Risoe National Lab., Roskilde (Denmark); Madsen, H.; Nielsen, H.A. [The Technical Univ. of Denmark, Dept. of Mathematical Modelling, Lyngby (Denmark)

    1999-03-01

    Being able to predict the output of a wind farm online for a day or two in advance has significant advantages for utilities, such as a better ability to schedule fossil-fuelled power plants and a better position on electricity spot markets. In this paper prediction methods based on Numerical Weather Prediction (NWP) models are considered. The spatial resolution used in NWP models implies that these predictions are not valid locally at a specific wind farm. Furthermore, due to the non-stationary nature and complexity of the processes in the atmosphere, and occasional changes of NWP models, the deviation between the predicted and the measured wind will be time dependent. If observational data is available, and if the deviation between the predictions and the observations exhibits systematic behavior, this should be corrected for; if statistical methods are used, this approach is usually referred to as MOS (Model Output Statistics). The influence of atmospheric turbulence intensity, topography, prediction horizon length and auto-correlation of wind speed and power is considered, and to take the time-variations into account, adaptive estimation methods are applied. Three estimation techniques are considered and compared: Extended Kalman filtering, recursive least squares and a new modified recursive least squares algorithm. (au) EU-JOULE-3. 11 refs.
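
    A minimal sketch of the adaptive estimation idea is given below, using a plain recursive least squares (RLS) update with a forgetting factor on synthetic data. This is a generic RLS stand-in, not the modified recursive least squares algorithm proposed in the paper, and the regressors are hypothetical.

```python
import numpy as np

def rls_update(theta, P, x, y, lam=0.995):
    """One recursive-least-squares step with forgetting factor lam.

    theta : current coefficients of the MOS correction model
    P     : current inverse information matrix
    x     : regressors (e.g. NWP wind speed and a constant bias term)
    y     : observed wind farm power
    """
    x = x.reshape(-1, 1)
    k = P @ x / (lam + x.T @ P @ x)            # gain vector
    err = y - (x.T @ theta).item()             # one-step prediction error
    theta = theta + k * err
    P = (P - k @ x.T @ P) / lam
    return theta, P

# Hypothetical streaming data: NWP forecast plus constant term -> measured power.
rng = np.random.default_rng(1)
theta, P = np.zeros((2, 1)), 1e3 * np.eye(2)
for t in range(500):
    nwp_speed = rng.uniform(3, 15)
    power = 0.9 * nwp_speed - 1.5 + rng.normal(scale=0.5)   # synthetic "truth"
    theta, P = rls_update(theta, P, np.array([nwp_speed, 1.0]), power)
print("estimated MOS coefficients:", theta.ravel())
```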

  7. Survival prediction model for postoperative hepatocellular carcinoma patients.

    Science.gov (United States)

    Ren, Zhihui; He, Shasha; Fan, Xiaotang; He, Fangping; Sang, Wei; Bao, Yongxing; Ren, Weixin; Zhao, Jinming; Ji, Xuewen; Wen, Hao

    2017-09-01

    This study aims to establish a predictive index (PI) model of the 5-year survival rate for patients with hepatocellular carcinoma (HCC) after radical resection and to evaluate its prediction sensitivity, specificity, and accuracy. Patients who underwent HCC surgical resection were enrolled and randomly divided into a prediction model group (101 patients) and a model evaluation group (100 patients). A Cox regression model was used for univariate and multivariate survival analysis. A PI model was established based on the multivariate analysis and a receiver operating characteristic (ROC) curve was drawn accordingly. The area under the ROC curve (AUROC) and the PI cutoff value were identified. Multiple Cox regression analysis of the prediction model group showed that the neutrophil-to-lymphocyte ratio, histological grade, microvascular invasion, positive resection margin, number of tumors, and postoperative transcatheter arterial chemoembolization (TACE) treatment were independent predictors of the 5-year survival rate for HCC patients. The model was PI = 0.377 × NLR + 0.554 × HG + 0.927 × PRM + 0.778 × MVI + 0.740 × NT - 0.831 × TACE. In the prediction model group, the AUROC was 0.832 and the PI cutoff value was 3.38. The sensitivity, specificity, and accuracy were 78.0%, 80.0%, and 79.2%, respectively. In the model evaluation group, the AUROC was 0.822, and the PI cutoff value corresponded well to that of the prediction model group, with sensitivity, specificity, and accuracy of 85.0%, 83.3%, and 84.0%, respectively. The PI model can quantify the mortality risk of hepatitis B-related HCC with high sensitivity, specificity, and accuracy.
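
    Because the abstract gives the index explicitly, it can be coded directly. The sketch below applies the published PI formula and the reported cutoff of 3.38; the example inputs are hypothetical, and the 0/1 coding of the categorical predictors is our assumption based on the abstract.

```python
def hcc_prognostic_index(nlr, hg, prm, mvi, nt, tace):
    """PI = 0.377*NLR + 0.554*HG + 0.927*PRM + 0.778*MVI + 0.740*NT - 0.831*TACE

    NLR is the neutrophil-to-lymphocyte ratio; the remaining inputs are assumed
    to be binary (1 = present / performed, 0 = absent).
    """
    return (0.377 * nlr + 0.554 * hg + 0.927 * prm
            + 0.778 * mvi + 0.740 * nt - 0.831 * tace)

CUTOFF = 3.38   # cutoff reported for the prediction-model group

# Hypothetical patient
pi = hcc_prognostic_index(nlr=3.1, hg=1, prm=0, mvi=1, nt=1, tace=1)
print(f"PI = {pi:.2f} -> {'high' if pi > CUTOFF else 'low'} 5-year mortality risk")
```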

  8. A prediction model for assessing residential radon concentration in Switzerland

    International Nuclear Information System (INIS)

    Hauri, Dimitri D.; Huss, Anke; Zimmermann, Frank; Kuehni, Claudia E.; Röösli, Martin

    2012-01-01

    Indoor radon is regularly measured in Switzerland. However, a nationwide model to predict residential radon levels has not been developed. The aim of this study was to develop a prediction model to assess indoor radon concentrations in Switzerland. The model was based on 44,631 measurements from the nationwide Swiss radon database collected between 1994 and 2004. Of these, 80% randomly selected measurements were used for model development and the remaining 20% for an independent model validation. A multivariable log-linear regression model was fitted and relevant predictors selected according to evidence from the literature, the adjusted R², the Akaike's information criterion (AIC), and the Bayesian information criterion (BIC). The prediction model was evaluated by calculating Spearman rank correlation between measured and predicted values. Additionally, the predicted values were categorised into three categories (50th, 50th–90th and 90th percentile) and compared with measured categories using a weighted Kappa statistic. The most relevant predictors for indoor radon levels were tectonic units and year of construction of the building, followed by soil texture, degree of urbanisation, floor of the building where the measurement was taken and housing type (P-values <0.001 for all). Mean predicted radon values (geometric mean) were 66 Bq/m³ (interquartile range 40–111 Bq/m³) in the lowest exposure category, 126 Bq/m³ (69–215 Bq/m³) in the medium category, and 219 Bq/m³ (108–427 Bq/m³) in the highest category. Spearman correlation between predictions and measurements was 0.45 (95%-CI: 0.44; 0.46) for the development dataset and 0.44 (95%-CI: 0.42; 0.46) for the validation dataset. Kappa coefficients were 0.31 for the development and 0.30 for the validation dataset, respectively. The model explained 20% overall variability (adjusted R²). In conclusion, this residential radon prediction model, based on a large number of measurements, was demonstrated to be
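
    A schematic version of such a model, assuming a hypothetical measurement table with placeholder column names, could be fitted as below: a log-linear regression on categorical predictors, Spearman correlation between measured and predicted radon, and a weighted kappa on the three exposure categories (the choice of linear weights is an assumption).

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import spearmanr
from sklearn.metrics import cohen_kappa_score

# Hypothetical table of indoor radon measurements and categorical predictors.
df = pd.read_csv("radon_measurements.csv")   # assumed file and column names
df["log_radon"] = np.log(df["radon_bq_m3"])

# Multivariable log-linear model with categorical predictors, as in the abstract.
fit = smf.ols("log_radon ~ C(tectonic_unit) + C(construction_period) + "
              "C(soil_texture) + C(urbanisation) + C(floor) + C(housing_type)",
              data=df).fit()
pred = np.exp(fit.fittedvalues)

rho, _ = spearmanr(df["radon_bq_m3"], pred)

# Compare <50th, 50th-90th and >90th percentile categories with a weighted kappa.
cuts = df["radon_bq_m3"].quantile([0.5, 0.9]).values
cat = lambda x: np.digitize(x, cuts)
kappa = cohen_kappa_score(cat(df["radon_bq_m3"]), cat(pred), weights="linear")
print(f"adj R2={fit.rsquared_adj:.2f}  Spearman={rho:.2f}  weighted kappa={kappa:.2f}")
```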

  9. Numerical modeling capabilities to predict repository performance

    International Nuclear Information System (INIS)

    1979-09-01

    This report presents a summary of current numerical modeling capabilities that are applicable to the design and performance evaluation of underground repositories for the storage of nuclear waste. The report includes codes that are available in-house, within Golder Associates and Lawrence Livermore Laboratories, as well as those that are generally available within industry and universities. The first listing of programs covers in-house codes in the subject areas of hydrology, solute transport, thermal and mechanical stress analysis, and structural geology. The second listing of programs is divided by subject into the following categories: site selection, structural geology, mine structural design, mine ventilation, hydrology, and mine design/construction/operation. These programs are not specifically designed for use in the design and evaluation of an underground repository for nuclear waste, but several or most of them may be so used.

  10. Model Predictive Control for Smart Energy Systems

    DEFF Research Database (Denmark)

    Halvgaard, Rasmus

    In this thesis, we consider control strategies for flexible distributed energy resources in the future intelligent energy system – the Smart Grid. The energy system is a large-scale complex network with many actors and objectives in different hierarchical layers. Specifically the power system must...... significantly. A Smart Grid calls for flexible consumers that can adjust their consumption based on the amount of green energy in the grid. This requires coordination through new large-scale control and optimization algorithms. Trading of flexibility is key to drive power consumption in a sustainable direction....... In Denmark, we expect that distributed energy resources such as heat pumps, and batteries in electric vehicles will mobilize part of the needed flexibility. Our primary objectives in the thesis were threefold: 1.Simulate the components in the power system based on simple models from literature (e.g. heat...

  11. Model Predictive Control of Wind Turbines

    DEFF Research Database (Denmark)

    Henriksen, Lars Christian

    Wind turbines play a major role in the transformation from a fossil fuel based energy production to a more sustainable production of energy. Total-cost-of-ownership is an important parameter when investors decide in which energy technology they should place their capital. Modern wind turbines...... are controlled by pitching the blades and by controlling the electro-magnetic torque of the generator, thus slowing the rotation of the blades. Improved control of wind turbines, leading to reduced fatigue loads, can be exploited by using less materials in the construction of the wind turbine or by reducing...... the need for maintenance of the wind turbine. Either way, better total-cost-of-ownership for wind turbine operators can be achieved by improved control of the wind turbines. Wind turbine control can be improved in two ways, by improving the model on which the controller bases its design or by improving...

  12. Comparison of Linear Prediction Models for Audio Signals

    Directory of Open Access Journals (Sweden)

    2009-03-01

    Full Text Available While linear prediction (LP) has become immensely popular in speech modeling, it does not seem to provide a good approach for modeling audio signals. This is somewhat surprising, since a tonal signal consisting of a number of sinusoids can be perfectly predicted based on an (all-pole) LP model with a model order that is twice the number of sinusoids. We provide an explanation why this result cannot simply be extrapolated to LP of audio signals. If noise is taken into account in the tonal signal model, a low-order all-pole model appears to be only appropriate when the tonal components are uniformly distributed in the Nyquist interval. Based on this observation, different alternatives to the conventional LP model can be suggested. Either the model should be changed to a pole-zero, a high-order all-pole, or a pitch prediction model, or the conventional LP model should be preceded by an appropriate frequency transform, such as a frequency warping or downsampling. By comparing these alternative LP models to the conventional LP model in terms of frequency estimation accuracy, residual spectral flatness, and perceptual frequency resolution, we obtain several new and promising approaches to LP-based audio modeling.
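
    For reference, a conventional all-pole LP fit of the kind discussed here can be computed with the autocorrelation method; the sketch below applies it to a synthetic two-sinusoid signal with a little noise. It illustrates only the baseline LP model, not the pole-zero, warped or pitch-prediction alternatives proposed in the paper.

```python
import numpy as np
from scipy.linalg import solve_toeplitz

def lpc_autocorr(x, order):
    """All-pole LP coefficients a[1..order] via the autocorrelation (Yule-Walker) method."""
    r = np.correlate(x, x, mode="full")[len(x) - 1:len(x) + order]   # lags 0..order
    a = solve_toeplitz(r[:order], r[1:order + 1])   # normal equations R a = r
    return a                                        # prediction: x[n] ~ sum_k a[k] x[n-k]

fs = 16000
n = np.arange(2048)
x = (np.sin(2 * np.pi * 440 * n / fs) + 0.5 * np.sin(2 * np.pi * 1000 * n / fs)
     + 0.01 * np.random.default_rng(0).normal(size=n.size))

a = lpc_autocorr(x, order=4)     # two sinusoids -> order 4 is (nearly) sufficient
pred = np.convolve(x, np.concatenate(([0.0], a)))[:x.size]
residual = x - pred
print("residual energy / signal energy:",
      np.sum(residual[4:] ** 2) / np.sum(x[4:] ** 2))
```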

  13. Model Predictive Control of a Wave Energy Converter

    DEFF Research Database (Denmark)

    Andersen, Palle; Pedersen, Tom Søndergård; Nielsen, Kirsten Mølgaard

    2015-01-01

    In this paper reactive control and Model Predictive Control (MPC) for a Wave Energy Converter (WEC) are compared. The analysis is based on a WEC from Wave Star A/S designed as a point absorber. The model predictive controller uses wave models based on the dominating sea states combined with a model......'s are designed for each sea state using a model assuming a linear loss torque. The mean power results from two controllers are compared using both loss models. Simulation results show that MPC can outperform a reactive controller if a good model of the conversion losses is available....... connecting undisturbed wave sequences to sequences of torque. Losses in the conversion from mechanical to electrical power are taken into account in two ways. Conventional reactive controllers are tuned for each sea state with the assumption that the converter has the same efficiency back and forth. MPC...

  14. Review of Model Predictions for Extensive Air Showers

    Science.gov (United States)

    Pierog, Tanguy

    In detailed air shower simulations, the uncertainty in the prediction of shower observables for different primary particles and energies is currently dominated by differences between hadronic interaction models. With the results of the first run of the LHC, the difference between post-LHC model predictions has been reduced to the same level as the experimental uncertainties of cosmic ray experiments. At the same time new types of air shower observables, like the muon production depth, have been measured, adding new constraints on hadronic models. Currently no model is able to reproduce consistently all mass composition measurements possible with the Pierre Auger Observatory, for instance. We review the current model predictions for various particle production observables and their link with air shower observables and discuss possible future improvements.

  15. Integrating predictive frameworks and cognitive models of face perception.

    Science.gov (United States)

    Trapp, Sabrina; Schweinberger, Stefan R; Hayward, William G; Kovács, Gyula

    2018-02-08

    The idea of a "predictive brain", that is, the interpretation of internal and external information based on prior expectations, has been elaborated intensely over the past decade. Several domains in cognitive neuroscience have embraced this idea, including studies in perception, motor control, language, and affective, social, and clinical neuroscience. Despite the various studies that have used face stimuli to address questions related to predictive processing, there has been surprisingly little connection between this work and established cognitive models of face recognition. Here we suggest that the predictive framework can serve as an important complement to established cognitive face models. Conversely, the link to cognitive face models has the potential to shed light on issues that remain open in predictive frameworks.

  16. A model for predicting lung cancer response to therapy

    International Nuclear Information System (INIS)

    Seibert, Rebecca M.; Ramsey, Chester R.; Hines, J. Wesley; Kupelian, Patrick A.; Langen, Katja M.; Meeks, Sanford L.; Scaperoth, Daniel D.

    2007-01-01

    Purpose: Volumetric computed tomography (CT) images acquired by image-guided radiation therapy (IGRT) systems can be used to measure tumor response over the course of treatment. Predictive adaptive therapy is a novel treatment technique that uses volumetric IGRT data to actively predict the future tumor response to therapy during the first few weeks of IGRT treatment. The goal of this study was to develop and test a model for predicting lung tumor response during IGRT treatment using serial megavoltage CT (MVCT). Methods and Materials: Tumor responses were measured for 20 lung cancer lesions in 17 patients that were imaged and treated with helical tomotherapy with doses ranging from 2.0 to 2.5 Gy per fraction. Five patients were treated with concurrent chemotherapy, and 1 patient was treated with neoadjuvant chemotherapy. Tumor response to treatment was retrospectively measured by contouring 480 serial MVCT images acquired before treatment. A nonparametric, memory-based locally weighted regression (LWR) model was developed for predicting tumor response using the retrospective tumor response data. This model predicts future tumor volumes and the associated confidence intervals based on limited observations during the first 2 weeks of treatment. The predictive accuracy of the model was tested using a leave-one-out cross-validation technique with the measured tumor responses. Results: The predictive algorithm was used to compare predicted versus measured tumor volume response for all 20 lesions. The average error for the predictions of the final tumor volume was 12%, with the true volumes always bounded by the 95% confidence interval. The greatest model uncertainty occurred near the middle of the course of treatment, in which the tumor response relationships were more complex, the model had less information, and the predictors were more varied. The optimal days for measuring the tumor response on the MVCT images were on elapsed Days 1, 2, 5, 9, 11, 12, 17, and 18 during
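
    A minimal memory-based locally weighted regression of this general kind is sketched below: the final relative tumor volume of a new lesion is predicted from its early measurements by weighting previously observed response curves with a Gaussian kernel. The kernel, bandwidth, feature layout and data are all assumptions, not the authors' implementation.

```python
import numpy as np

def lwr_predict(X_train, y_train, x_query, bandwidth=0.15):
    """Memory-based locally weighted regression with a Gaussian kernel.

    X_train : early relative tumor volumes of previously treated lesions, one row each
    y_train : their final relative volumes
    x_query : early volumes of the new lesion
    """
    d2 = np.sum((X_train - x_query) ** 2, axis=1)
    w = np.exp(-d2 / (2 * bandwidth ** 2))                 # kernel weights
    A = np.hstack([np.ones((len(X_train), 1)), X_train])   # intercept + features
    sw = np.sqrt(w)
    beta = np.linalg.lstsq(A * sw[:, None], y_train * sw, rcond=None)[0]  # weighted LS
    return float(np.concatenate(([1.0], x_query)) @ beta)

# Hypothetical library of measured response curves (fractions of initial volume).
rng = np.random.default_rng(3)
X_train = 1.0 - rng.uniform(0.0, 0.03, size=(19, 5)).cumsum(axis=1)  # early measurements
y_train = X_train[:, -1] - rng.uniform(0.05, 0.25, size=19)          # final volumes
x_new = np.array([0.99, 0.97, 0.96, 0.94, 0.92])
print(f"predicted final relative volume: {lwr_predict(X_train, y_train, x_new):.2f}")
```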

  17. New Application of Bioelectrical Impedance Analysis by the Back Propagation Artificial Neural Network Mathematically Predictive Model of Tissue Composition in the Lower Limbs of Elderly People

    Directory of Open Access Journals (Sweden)

    Tsang-Pai Liu

    2012-03-01

    Conclusion: In summary, the greater predictive accuracy and precision made the application of BIA with the BP–ANN mathematical model more feasible for the clinical measurement of FM and FFM in the lower limbs of elderly people.

  18. Predictive modeling of coupled multi-physics systems: I. Theory

    International Nuclear Information System (INIS)

    Cacuci, Dan Gabriel

    2014-01-01

    Highlights: • We developed “predictive modeling of coupled multi-physics systems (PMCMPS)”. • PMCMPS reduces predicted uncertainties in predicted model responses and parameters. • PMCMPS treats efficiently very large coupled systems. - Abstract: This work presents an innovative mathematical methodology for “predictive modeling of coupled multi-physics systems (PMCMPS).” This methodology takes into account fully the coupling terms between the systems but requires only the computational resources that would be needed to perform predictive modeling on each system separately. The PMCMPS methodology uses the maximum entropy principle to construct an optimal approximation of the unknown a priori distribution based on a priori known mean values and uncertainties characterizing the parameters and responses for both multi-physics models. This “maximum entropy”-approximate a priori distribution is combined, using Bayes’ theorem, with the “likelihood” provided by the multi-physics simulation models. Subsequently, the posterior distribution thus obtained is evaluated using the saddle-point method to obtain analytical expressions for the optimally predicted values for the multi-physics models parameters and responses along with corresponding reduced uncertainties. Noteworthy, the predictive modeling methodology for the coupled systems is constructed such that the systems can be considered sequentially rather than simultaneously, while preserving exactly the same results as if the systems were treated simultaneously. Consequently, very large coupled systems, which could perhaps exceed available computational resources if treated simultaneously, can be treated with the PMCMPS methodology presented in this work sequentially and without any loss of generality or information, requiring just the resources that would be needed if the systems were treated sequentially

  19. Embryo quality predictive models based on cumulus cells gene expression

    Directory of Open Access Journals (Sweden)

    Devjak R

    2016-06-01

    Full Text Available Since the introduction of in vitro fertilization (IVF) into the clinical practice of infertility treatment, indicators of high-quality embryos have been investigated. Cumulus cells (CC) have a specific gene expression profile according to the developmental potential of the oocyte they surround, and therefore specific gene expression could be used as a biomarker. The aim of our study was to combine more than one biomarker to observe improvement in the prediction of embryo development. In this study, 58 CC samples from 17 IVF patients were analyzed. This study was approved by the Republic of Slovenia National Medical Ethics Committee. Gene expression analysis [quantitative real-time polymerase chain reaction (qPCR)] for five genes, analyzed according to embryo quality level, was performed. Two prediction models were tested for embryo quality prediction: a binary logistic and a decision tree model. As the main outcome, gene expression levels for the five genes were taken and the area under the curve (AUC) for the two prediction models was calculated. Among the tested genes, AMHR2 and LIF showed a significant expression difference between high-quality and low-quality embryos. These two genes were used for the construction of the two prediction models: the binary logistic model yielded an AUC of 0.72 ± 0.08 and the decision tree model yielded an AUC of 0.73 ± 0.03. The two prediction models yielded similar predictive power to differentiate high- and low-quality embryos. In terms of eventual clinical decision making, the decision tree model resulted in easy-to-interpret rules that are highly applicable in clinical practice.
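
    The two classifiers compared in this abstract (binary logistic regression and a decision tree, scored by AUC) can be set up in a few lines; in the sketch below the two-gene feature matrix, the quality label and the file name are placeholders.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Hypothetical table: qPCR expression of the two informative genes + embryo quality label.
df = pd.read_csv("cumulus_qpcr.csv")
X, y = df[["AMHR2", "LIF"]], df["high_quality"]

for name, clf in [("logistic", LogisticRegression(max_iter=1000)),
                  ("decision tree", DecisionTreeClassifier(max_depth=3, random_state=0))]:
    auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
    print(f"{name}: AUC = {auc.mean():.2f} +/- {auc.std():.2f}")
```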

  20. Comparison of Predictive Modeling Methods of Aircraft Landing Speed

    Science.gov (United States)

    Diallo, Ousmane H.

    2012-01-01

    Expected increases in air traffic demand have stimulated the development of air traffic control tools intended to assist the air traffic controller in accurately and precisely spacing aircraft landing at congested airports. Such tools will require an accurate landing-speed prediction to increase throughput while decreasing necessary controller interventions for avoiding separation violations. There are many practical challenges to developing an accurate landing-speed model that has acceptable prediction errors. This paper discusses the development of a near-term implementation, using readily available information, to estimate/model final approach speed from the top of the descent phase of flight to the landing runway. As a first approach, all variables found to contribute directly to the landing-speed prediction model are used to build a multi-regression technique of the response surface equation (RSE). Data obtained from a major airline's operations of a passenger transport aircraft type into Dallas/Fort Worth International Airport are used to predict the landing speed. The approach was promising because it decreased the standard deviation of the landing-speed error prediction by at least 18% from the standard deviation of the baseline error, depending on the gust condition at the airport. However, when the number of variables is reduced to those most likely obtainable at other major airports, the RSE model shows little improvement over the existing methods. Consequently, a neural network that relies on a nonlinear regression technique is utilized as an alternative modeling approach. For the reduced-variable cases, the standard deviation of the neural network model errors represents more than a 5% reduction compared to the RSE model errors, and at least a 10% reduction relative to the baseline predicted landing-speed error standard deviation. Overall, the constructed models predict the landing speed more accurately and precisely than the current state-of-the-art.
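
    The two modeling approaches compared here, a quadratic response-surface regression and a neural network, can be prototyped as below; the predictor names, data file and hyperparameters are placeholders rather than the configuration used in the study.

```python
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures, StandardScaler

# Hypothetical predictors (gross weight, wind components, flap setting, ...) and
# the recorded final-approach speed; column names are placeholders.
df = pd.read_csv("approach_data.csv")
X, y = df.drop(columns=["landing_speed_kt"]), df["landing_speed_kt"]

rse = make_pipeline(StandardScaler(), PolynomialFeatures(degree=2), LinearRegression())
nn = make_pipeline(StandardScaler(), MLPRegressor(hidden_layer_sizes=(20, 10),
                                                  max_iter=5000, random_state=0))
for name, model in [("response surface (quadratic)", rse), ("neural network", nn)]:
    rmse = -cross_val_score(model, X, y, cv=5, scoring="neg_root_mean_squared_error")
    print(f"{name}: cross-validated RMSE = {rmse.mean():.1f} kt")
```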

  1. Model Predictive Control of Three Phase Inverter for PV Systems

    OpenAIRE

    Irtaza M. Syed; Kaamran Raahemifar

    2015-01-01

    This paper presents a model predictive control (MPC) of a utility interactive three phase inverter (TPI) for a photovoltaic (PV) system at commercial level. The proposed model uses phase locked loop (PLL) to synchronize the TPI with the power electric grid (PEG) and performs MPC control in a dq reference frame. TPI model consists of a boost converter (BC), maximum power point tracking (MPPT) control, and a three-leg voltage source inverter (VSI). The operational model of ...

  2. Research of combination model for prediction of the trend of outbreak of hepatitis B

    Directory of Open Access Journals (Sweden)

    Yin-ping CHEN

    2014-03-01

    Full Text Available Objective To establish a combination model of the autoregressive integrated moving average model and the grey dynamic model (ARIMA-GM) for the hepatitis B incidence rate (per 100,000) and to predict the trend of hepatitis B outbreaks, so as to provide a scientific basis for the early detection of the infectious disease and for implementing countermeasures to control its spread. Methods The monthly incidence of hepatitis B in Qian'an city, Hebei province, was collected from Jan 2004 to Dec 2012, and an ARIMA model was built with SPSS software. The GM(1,1) model was used to correct the residual sequence above a threshold value, and a combined forecasting model was constructed. This combination model was used to predict the monthly incidence rate in this city in 2013. Results The model ARIMA(0,1,1)(0,1,1)12 was established successfully and the residual sequence was a white noise sequence. The GM(1,1) model with a threshold of 3 was then used to correct the residuals and extract their nonlinear features. The forecasting model met the required precision standards (C=0.673, P=0.877), and its fitting accuracy was basically qualified. The results showed that the MAE and MAPE of the ARIMA-GM combined model were smaller than those of a single model, and the combined model could improve the prediction accuracy. Using the combined model to forecast the incidence of hepatitis B from Jan 2013 to Dec 2013, the overall trend was fairly consistent with that of previous years. Conclusion The ARIMA-GM combined model fits the incidence rate of hepatitis B with greater accuracy than the seasonal ARIMA model alone. The prediction results can provide a reference for the early warning system for HBV. DOI: 10.11855/j.issn.0577-7402.2014.01.12
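
    A sketch of the hybrid scheme, under several assumptions: a seasonal ARIMA(0,1,1)(0,1,1)12 fit to a hypothetical monthly incidence series, a textbook GM(1,1) grey model fitted to the (shifted) ARIMA residuals, and the grey forecast used as an additive correction. The paper's threshold-based residual handling is only approximated here, and the data file is a placeholder.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

def gm11_forecast(x0, horizon):
    """Grey GM(1,1) forecast of a positive series x0."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)
    z1 = 0.5 * (x1[1:] + x1[:-1])                       # background values
    B = np.column_stack([-z1, np.ones_like(z1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]    # grey parameters
    k = np.arange(1, len(x0) + horizon)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a    # time-response function
    x0_hat = np.diff(np.concatenate(([x0[0]], x1_hat)))  # back to the original series
    return x0_hat[-horizon:]

# Hypothetical monthly incidence (per 100,000), Jan 2004 - Dec 2012.
y = pd.read_csv("hepatitis_b_monthly.csv", index_col=0, parse_dates=True)["incidence"]

arima = SARIMAX(y, order=(0, 1, 1), seasonal_order=(0, 1, 1, 12)).fit(disp=False)
resid = arima.resid

# GM(1,1) needs a positive series, so shift the residuals before fitting and
# shift back afterwards; the paper instead corrects residuals above a threshold of 3.
shift = abs(float(resid.min())) + 1.0
resid_correction = gm11_forecast(resid + shift, horizon=12) - shift

forecast_2013 = arima.forecast(steps=12) + resid_correction
print(forecast_2013)
```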

  3. Prediction error, ketamine and psychosis: An updated model.

    Science.gov (United States)

    Corlett, Philip R; Honey, Garry D; Fletcher, Paul C

    2016-11-01

    In 2007, we proposed an explanation of delusion formation as aberrant prediction error-driven associative learning. Further, we argued that the NMDA receptor antagonist ketamine provided a good model for this process. Subsequently, we validated the model in patients with psychosis, relating aberrant prediction error signals to delusion severity. During the ensuing period, we have developed these ideas, drawing on the simple principle that brains build a model of the world and refine it by minimising prediction errors, as well as using it to guide perceptual inferences. While previously we focused on the prediction error signal per se, an updated view takes into account its precision, as well as the precision of prior expectations. With this expanded perspective, we see several possible routes to psychotic symptoms - which may explain the heterogeneity of psychotic illness, as well as the fact that other drugs, with different pharmacological actions, can produce psychotomimetic effects. In this article, we review the basic principles of this model and highlight specific ways in which prediction errors can be perturbed, in particular considering the reliability and uncertainty of predictions. The expanded model explains hallucinations as perturbations of the uncertainty mediated balance between expectation and prediction error. Here, expectations dominate and create perceptions by suppressing or ignoring actual inputs. Negative symptoms may arise due to poor reliability of predictions in service of action. By mapping from biology to belief and perception, the account proffers new explanations of psychosis. However, challenges remain. We attempt to address some of these concerns and suggest future directions, incorporating other symptoms into the model, building towards better understanding of psychosis. © The Author(s) 2016.

  4. Individualized prediction of perineural invasion in colorectal cancer: development and validation of a radiomics prediction model.

    Science.gov (United States)

    Huang, Yanqi; He, Lan; Dong, Di; Yang, Caiyun; Liang, Cuishan; Chen, Xin; Ma, Zelan; Huang, Xiaomei; Yao, Su; Liang, Changhong; Tian, Jie; Liu, Zaiyi

    2018-02-01

    To develop and validate a radiomics prediction model for individualized prediction of perineural invasion (PNI) in colorectal cancer (CRC). After computed tomography (CT) radiomics features extraction, a radiomics signature was constructed in derivation cohort (346 CRC patients). A prediction model was developed to integrate the radiomics signature and clinical candidate predictors [age, sex, tumor location, and carcinoembryonic antigen (CEA) level]. Apparent prediction performance was assessed. After internal validation, independent temporal validation (separate from the cohort used to build the model) was then conducted in 217 CRC patients. The final model was converted to an easy-to-use nomogram. The developed radiomics nomogram that integrated the radiomics signature and CEA level showed good calibration and discrimination performance [Harrell's concordance index (c-index): 0.817; 95% confidence interval (95% CI): 0.811-0.823]. Application of the nomogram in validation cohort gave a comparable calibration and discrimination (c-index: 0.803; 95% CI: 0.794-0.812). Integrating the radiomics signature and CEA level into a radiomics prediction model enables easy and effective risk assessment of PNI in CRC. This stratification of patients according to their PNI status may provide a basis for individualized auxiliary treatment.
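
    A much-simplified version of the final step, combining a precomputed radiomics signature score with CEA in a logistic model and reporting the c-index (equal to the ROC AUC for a binary endpoint), is sketched below; the LASSO-based signature construction is not reproduced, and the file and column names are placeholders.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Hypothetical derivation/validation tables with a precomputed radiomics signature
# score, CEA level and the PNI label.
dev = pd.read_csv("crc_derivation.csv")
val = pd.read_csv("crc_validation.csv")

model = LogisticRegression().fit(dev[["rad_score", "cea"]], dev["pni"])
for name, d in [("derivation", dev), ("temporal validation", val)]:
    p = model.predict_proba(d[["rad_score", "cea"]])[:, 1]
    print(f"{name}: c-index (AUC) = {roc_auc_score(d['pni'], p):.3f}")
```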

  5. The use of model-test data for predicting full-scale ACV resistance

    Science.gov (United States)

    Forstell, B. G.; Harry, C. W.

    The paper summarizes the analysis of test data obtained with a 1/12-scale model of the Amphibious Assault Landing Craft (AALC) JEFF(B). The analysis was conducted with the objective of improving the accuracy of drag predictions for a JEFF(B)-type air-cushion vehicle (ACV). Model test results, scaled to full-scale, are compared with full-scale drag obtained in various sea states during JEFF(B) trials. From the results of this comparison, it is found that the Froude-scale model rough-water drag data is consistently greater than full-scale derived drag, and is a function of both wave height and craft forward speed. Results are presented indicating that Froude scaling model data obtained in calm water also causes an over-prediction of calm-water drag at full-scale. An empirical correction that was developed for use on a JEFF(B)-type craft is discussed.

  6. Validation of the prostate health index in a predictive model of prostate cancer.

    Science.gov (United States)

    Sanchís-Bonet, A; Barrionuevo-González, M; Bajo-Chueca, A M; Pulido-Fonseca, L; Ortega-Polledo, L E; Tamayo-Ruiz, J C; Sánchez-Chapado, M

    To validate and analyse the clinical usefulness of a predictive model of prostate cancer that incorporates the biomarker «[-2] pro prostate-specific antigen» using the prostate health index (PHI) in decision making for performing prostate biopsies. We isolated serum from 197 men with an indication for prostate biopsy to determine the total prostate-specific antigen (tPSA), the free PSA fraction (fPSA) and the [-2] proPSA (p2PSA). The PHI was calculated as p2PSA/fPSA×√tPSA. We created 2 predictive models that incorporated clinical variables along with tPSA or PHI. The performance of PHI was assessed with a discriminant analysis using receiver operating characteristic curves, internal calibration and decision curves. The areas under the curve for the tPSA and PHI models were 0.71 and 0.85, respectively. The PHI model showed a better ability to discriminate and better calibration for predicting prostate cancer but not for predicting a Gleason score in the biopsy ≥7. The decision curves showed a greater net benefit with the PHI model for diagnosing prostate cancer when the probability threshold was 15-35% and greater savings (20%) in the number of biopsies. The incorporation of p2PSA through PHI in predictive models of prostate cancer improves the accuracy of the risk stratification and helps in the decision-making process for performing prostate biopsies. Copyright © 2017 AEU. Published by Elsevier España, S.L.U. All rights reserved.
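
    The index itself is a one-line calculation from the three serum measurements. The sketch below applies the formula quoted in the abstract and scores its discrimination with ROC AUC on a hypothetical cohort table; it does not reproduce the clinical predictive models or the decision-curve analysis.

```python
import numpy as np
import pandas as pd
from sklearn.metrics import roc_auc_score

def prostate_health_index(p2psa, fpsa, tpsa):
    """PHI = (p2PSA / fPSA) * sqrt(tPSA)."""
    return (p2psa / fpsa) * np.sqrt(tpsa)

# Hypothetical serum measurements and biopsy outcome for a small cohort.
df = pd.read_csv("psa_cohort.csv")    # assumed columns: p2psa, fpsa, tpsa, cancer
df["phi"] = prostate_health_index(df["p2psa"], df["fpsa"], df["tpsa"])
print("AUC of PHI alone:", round(roc_auc_score(df["cancer"], df["phi"]), 2))
```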

  7. A deep auto-encoder model for gene expression prediction.

    Science.gov (United States)

    Xie, Rui; Wen, Jia; Quitadamo, Andrew; Cheng, Jianlin; Shi, Xinghua

    2017-11-17

    Gene expression is a key intermediate level through which genotypes lead to a particular trait. Gene expression is affected by various factors, including the genotypes of genetic variants. With the aim of delineating the genetic impact on gene expression, we build a deep auto-encoder model to assess how well genetic variants contribute to gene expression changes. This new deep learning model is a regression-based predictive model based on the MultiLayer Perceptron and Stacked Denoising Auto-encoder (MLP-SAE). The model is trained using a stacked denoising auto-encoder for feature selection and a multilayer perceptron framework for backpropagation. We further improve the model by introducing dropout to prevent overfitting and improve performance. To demonstrate the usage of this model, we apply MLP-SAE to a real genomic dataset with genotypes and gene expression profiles measured in yeast. Our results show that the MLP-SAE model with dropout outperforms other models, including Lasso, Random Forests and the MLP-SAE model without dropout. Using the MLP-SAE model with dropout, we show that gene expression quantifications predicted by the model solely from genotypes align well with true gene expression patterns. We provide a deep auto-encoder model for predicting gene expression from SNP genotypes. This study demonstrates that deep learning is appropriate for tackling another genomic problem, i.e., building predictive models to understand genotypes' contribution to gene expression. With the emerging availability of richer genomic data, we anticipate that deep learning models will play a bigger role in modeling and interpreting genomics.
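
    A compact PyTorch sketch in the spirit of the described architecture: a single denoising auto-encoder layer is pre-trained on genotypes and then reused as the first layer of a dropout-regularised MLP regression head. It is a simplified stand-in (one auto-encoder layer rather than a stack, synthetic data, no train/test split), not the authors' MLP-SAE implementation.

```python
import torch
import torch.nn as nn

class DenoisingAutoencoder(nn.Module):
    """One denoising auto-encoder layer: corrupt the input, reconstruct it."""
    def __init__(self, d_in, d_hidden, noise_std=0.1):
        super().__init__()
        self.noise_std = noise_std
        self.encoder = nn.Sequential(nn.Linear(d_in, d_hidden), nn.ReLU())
        self.decoder = nn.Linear(d_hidden, d_in)

    def forward(self, x):
        x_noisy = x + self.noise_std * torch.randn_like(x)
        return self.decoder(self.encoder(x_noisy))

# Synthetic data: SNP genotypes coded 0/1/2 and one gene's expression level.
n, d = 500, 200
X = torch.randint(0, 3, (n, d)).float()
y = X[:, :5].sum(dim=1, keepdim=True) + 0.1 * torch.randn(n, 1)

# 1) Unsupervised pre-training of the auto-encoder on the genotype matrix.
dae = DenoisingAutoencoder(d, 64)
opt = torch.optim.Adam(dae.parameters(), lr=1e-3)
for _ in range(200):
    opt.zero_grad()
    loss = nn.functional.mse_loss(dae(X), X)
    loss.backward(); opt.step()

# 2) Supervised fine-tuning: encoder + dropout + MLP regression head.
model = nn.Sequential(dae.encoder, nn.Dropout(0.2), nn.Linear(64, 32),
                      nn.ReLU(), nn.Linear(32, 1))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(200):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(X), y)
    loss.backward(); opt.step()
print("training MSE:", float(nn.functional.mse_loss(model(X), y)))
```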

  8. The GREAT-ER model in China: Evaluating the risk of both treated and untreated wastewater discharges and a consideration to the future.

    Science.gov (United States)

    Jackson, Benjamin; Jones, Kevin; Sweetman, Andrew

    2016-04-01

    As a result of rapid economic development, the production and usage of chemicals in China has risen significantly, and China's environment has become degraded as a result. The Chinese government has attempted to ease these problems with significant investment towards upgrading the wastewater network. These efforts initially focused upon large cities, progressing towards smaller populations within the most recent five year plan. However, rural populations were largely overlooked; ~90% of rural settlements do not have treatment facilities for their wastewater. The next (13th) five year plan is a great opportunity to improve wastewater infrastructure. This transition is particularly important, and it is essential for the government to prioritise which settlements to provide with treatment facilities in order to improve water quality in receiving waters. This study focuses upon the use of a catchment model in order to make progress towards this goal. A reliable model is needed that can capture the complexity of the catchment without being overly complex itself, so that it can be developed and validated without an excessive requirement for data. The Geo-referenced Regional Exposure Assessment Tool for European Rivers (GREAT-ER) model is a catchment-scale stochastic-deterministic GIS model. It is primarily used for higher-tier chemical risk assessment. Emissions are from point sources only and are calculated based upon population and per-capita emission rates. Dilution and transportation are determined using low-flow statistics within each stretch, calculated based upon catchment soil and topographic properties. Removal of the contaminant can occur prior to emission and in-stream. The lowest-tier methodology applies a simple first-order removal rate for in-stream removal and a flat percentage removal for sewage treatment works. The data requirements are relatively low, although still challenging for many situations. Many authors have reported reasonable

  9. Predicting the Yield Stress of SCC using Materials Modelling

    DEFF Research Database (Denmark)

    Thrane, Lars Nyholm; Hasholt, Marianne Tange; Pade, Claus

    2005-01-01

    A conceptual model for predicting the Bingham rheological parameter yield stress of SCC has been established. The model used here is inspired by previous work of Oh et al. (1), predicting that the yield stress of concrete relative to the yield stress of paste is a function of the relative thickness of excess paste around the aggregate. The thickness of excess paste is itself a function of particle shape, particle size distribution, and particle packing. Seven types of SCC were tested at four different excess paste contents in order to verify the conceptual model. Paste composition and aggregate shape and distribution were varied between SCC types. The results indicate that yield stress of SCC may be predicted using the model.

  10. Predictive models of prolonged mechanical ventilation yield moderate accuracy.

    Science.gov (United States)

    Figueroa-Casas, Juan B; Dwivedi, Alok K; Connery, Sean M; Quansah, Raphael; Ellerbrook, Lowell; Galvis, Juan

    2015-06-01

    To develop a model to predict prolonged mechanical ventilation within 48 hours of its initiation. In 282 general intensive care unit patients, multiple variables from the first 2 days on mechanical ventilation and their total ventilation duration were prospectively collected. Three models accounting for early deaths were developed using different analyses: (a) multinomial logistic regression to predict duration > 7 days vs duration ≤ 7 days alive vs duration ≤ 7 days death; (b) binary logistic regression to predict duration > 7 days for the entire cohort and for survivors only, separately; and (c) Cox regression to predict time to being free of mechanical ventilation alive. Positive end-expiratory pressure, postoperative state (negatively), and Sequential Organ Failure Assessment score were independently associated with prolonged mechanical ventilation. The multinomial regression model yielded an accuracy (95% confidence interval) of 60% (53%-64%). The binary regression models yielded accuracies of 67% (61%-72%) and 69% (63%-75%) for the entire cohort and for survivors, respectively. The Cox regression model showed an equivalent to area under the curve of 0.67 (0.62-0.71). Different predictive models of prolonged mechanical ventilation in general intensive care unit patients achieve a moderate level of overall accuracy, likely insufficient to assist in clinical decisions. Copyright © 2015 Elsevier Inc. All rights reserved.
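
    For orientation, the multinomial logistic part of such an analysis can be set up with scikit-learn as below; the three predictors and the three-level outcome coding follow the abstract, while the data file and column names are placeholders.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Hypothetical day-1/-2 variables (PEEP, SOFA score, postoperative status) and a
# three-level outcome: >7 days, <=7 days alive, <=7 days dead.
df = pd.read_csv("icu_mv_cohort.csv")
X, y = df[["peep", "sofa", "postoperative"]], df["outcome_3class"]

# With the default lbfgs solver, LogisticRegression fits a multinomial model
# when the outcome has more than two classes.
clf = LogisticRegression(max_iter=1000)
acc = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
print(f"cross-validated accuracy: {acc.mean():.2f}")
```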

  11. Comparison of the models of financial distress prediction

    Directory of Open Access Journals (Sweden)

    Jiří Omelka

    2013-01-01

    Full Text Available Prediction of financial distress is generally understood as an assessment of whether a business entity is close to bankruptcy or at least to serious financial problems. Financial distress is defined as a situation in which a company is not able to satisfy its liabilities in any form, or in which its liabilities exceed its assets. Classification of the financial situation of business entities is a multidisciplinary scientific issue that draws not only on economic theory but also on statistical and econometric approaches. The first models of financial distress prediction originated in the 1960s. One of the best known is Altman's model, followed by a range of others constructed on more or less comparable bases. In many existing models it is possible to find common elements which could be regarded as elementary indicators of a company's potential financial distress. The objective of this article is, based on a comparison of existing models of financial distress prediction, to define a set of basic indicators of a company's financial distress while identifying their critical aspects. The sample defined this way will form the background for future research focused on the determination of a one-dimensional model of financial distress prediction, which would subsequently become the basis for the construction of a multi-dimensional prediction model.

  12. Plant water potential improves prediction of empirical stomatal models.

    Directory of Open Access Journals (Sweden)

    William R L Anderegg

    Full Text Available Climate change is expected to lead to increases in drought frequency and severity, with deleterious effects on many ecosystems. Stomatal responses to changing environmental conditions form the backbone of all ecosystem models, but are based on empirical relationships and are not well-tested during drought conditions. Here, we use a dataset of 34 woody plant species spanning global forest biomes to examine the effect of leaf water potential on stomatal conductance and test the predictive accuracy of three major stomatal models and a recently proposed model. We find that current leaf-level empirical models consistently over-predict stomatal conductance during dry conditions, particularly at low soil water potentials. Furthermore, the recently proposed stomatal conductance model yields increases in predictive capability compared to current models, with particular improvement during drought conditions. Our results reveal that including stomatal sensitivity to declining water potential and consequent impairment of plant water transport will improve predictions during drought conditions and show that many biomes contain a diversity of plant stomatal strategies that range from risky to conservative stomatal regulation during water stress. Such improvements in stomatal simulation are greatly needed to help unravel and predict the response of ecosystems to future climate extremes.

  13. Predicting the ungauged basin: model validation and realism assessment

    NARCIS (Netherlands)

    van Emmerik, Tim; Mulder, Gert; Eilander, Dirk; Piet, Marijn; Savenije, Hubert

    2015-01-01

    The hydrological decade on Predictions in Ungauged Basins (PUB) led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Due to a limited amount of published studies on genuinely ungauged basins, model validation and realism assessment of

  14. Predicting the ungauged basin : Model validation and realism assessment

    NARCIS (Netherlands)

    van Emmerik, Tim; Mulder, Gert; Eilander, Dirk; Piet, Marijn; Savenije, Hubert

    2015-01-01

    The hydrological decade on Predictions in Ungauged Basins (PUB) led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Due to a limited amount of published studies on genuinely ungauged basins, model validation and realism assessment of

  15. A Mathematical Model for the Prediction of Injectivity Decline | Odeh ...

    African Journals Online (AJOL)

    Injectivity impairment due to invasion of solid suspensions has been studied by several investigators and some modelling approaches have also been reported. Worthy of note is the development of analytical models for internal and external filtration coupled with transition time concept for predicting the overall decline in ...

  16. Mathematical Model for Prediction of Flexural Strength of Mound ...

    African Journals Online (AJOL)

    The mound soil-cement blended proportions were mathematically optimized by using Scheffe's approach and the optimization model developed. A computer program predicting the mix proportion for the model was written. The optimal proportion given by the program was used to prepare beam samples measuring 150 mm x 150 mm ...

  17. Katz model prediction of Caenorhabditis elegans mutagenesis on STS-42

    Science.gov (United States)

    Cucinotta, Francis A.; Wilson, John W.; Katz, Robert; Badhwar, Gautam D.

    1992-01-01

    Response parameters that describe the production of recessive lethal mutations in C. elegans from ionizing radiation are obtained with the Katz track structure model. The authors used models of the space radiation environment and radiation transport to predict and discuss mutation rates for C. elegans on the IML-1 experiment aboard STS-42.

  18. Accident Prediction Models for Akure – Ondo Carriageway, Ondo ...

    African Journals Online (AJOL)

    FIRST LADY

    traffic exposure and intersection effects as independent variables. They suggested that the Poisson distribution allows for the relationship between exposure and crashes to be more accurately modeled as opposed to. Accident Prediction Models for Akure-Ondo Carriageway…Using Multiple Linear Regression ...

  19. Multi-model prediction of downward short-wave radiation

    Czech Academy of Sciences Publication Activity Database

    Eben, Kryštof; Resler, Jaroslav; Krč, Pavel; Juruš, Pavel; Pelikán, Emil

    2012-01-01

    Roč. 9, - (2012), EMS2012-384 [EMS Annual Meeting /12./ and European Conference on Applied Climatology /9./. 10.09.2012-14.09.2012, Lodz] Institutional support: RVO:67985807 Keywords : multi-model prediction * NWP * model postprocessing Subject RIV: DG - Athmosphere Sciences, Meteorology

  20. Atmospheric modelling for seasonal prediction at the CSIR

    CSIR Research Space (South Africa)

    Landman, WA

    2014-10-01

    Full Text Available by observed monthly sea-surface temperature (SST) and sea-ice fields. The AGCM is the conformal-cubic atmospheric model (CCAM) administered by the Council for Scientific and Industrial Research. Since the model is forced with observed rather than predicted...

  1. Prediction Models and Decision Support: Chances and Challenges

    NARCIS (Netherlands)

    Kappen, T.H.

    2015-01-01

    A clinical prediction model can assist doctors in arriving at the most likely diagnosis or estimating the prognosis. By utilizing various patient- and disease-related properties, such models can yield objective estimations of the risk of a disease or the probability of a certain disease course for

  2. Validation of a multi-objective, predictive urban traffic model

    NARCIS (Netherlands)

    Wilmink, I.R.; Haak, P. van den; Woldeab, Z.; Vreeswijk, J.

    2013-01-01

    This paper describes the results of the verification and validation of the ecoStrategic Model, which was developed, implemented and tested in the eCoMove project. The model uses real-time and historical traffic information to determine the current, predicted and desired state of traffic in a

  3. Predictive ability of broiler production models | Ogundu | Animal ...

    African Journals Online (AJOL)

    The weekly body weight measurements of a growing strain of Ross broiler were used to compare the ability of three mathematical models (the multi-linear, quadratic and exponential) to predict 8-week body weight from early body measurements at weeks I, II, III, IV, V, VI and VII. The results suggest that the three models ...

  4. Predictive modelling of noise level generated during sawing of rocks ...

    Indian Academy of Sciences (India)

    2016-08-26

    Aug 26, 2016 ... Influence of the operating variables and rock properties on the noise level are investigated and analysed. Statistical analyses are then employed and models are built for the prediction of noise levels depending on the operating variables and the rock properties. The derived models are validated through ...

  5. Modelling and prediction of non-stationary optical turbulence behaviour

    NARCIS (Netherlands)

    Doelman, N.J.; Osborn, J.

    2016-01-01

    There is a strong need to model the temporal fluctuations in turbulence parameters, for instance for scheduling, simulation and prediction purposes. This paper aims at modelling the dynamic behaviour of the turbulence coherence length r0, utilising measurement data from the Stereo-SCIDAR instrument

  6. Inferential ecosystem models, from network data to prediction

    Science.gov (United States)

    James S. Clark; Pankaj Agarwal; David M. Bell; Paul G. Flikkema; Alan Gelfand; Xuanlong Nguyen; Eric Ward; Jun. Yang

    2011-01-01

    Recent developments suggest that predictive modeling could begin to play a larger role not only for data analysis, but also for data collection. We address the example of efficient wireless sensor networks, where inferential ecosystem models can be used to weigh the value of an observation against the cost of data collection. Transmission costs make observations ‘‘...

  7. Model prediction of maize yield responses to climate change in ...

    African Journals Online (AJOL)

    Observed data of the last three decades (1971 to 2000) from several climatological stations in north-eastern Zimbabwe and outputs from several global climate models were used. The downscaled model simulations consistently predicted a warming of between 1 and 2 ºC above the baseline period (1971-2000) at most of ...

  8. A theoretical model for predicting neutron fluxes for cyclic Neutron ...

    African Journals Online (AJOL)

    A theoretical model has been developed for prediction of thermal neutron fluxes required for cyclic irradiations of a sample to obtain the same activity previously used for the detection of any radionuclide of interest. The model is suitable for radiotracer production or for long-lived neutron activation products where the ...

  9. A model to predict the sound reflection from forests

    NARCIS (Netherlands)

    Wunderli, J.M.; Salomons, E.M.

    2009-01-01

    A model is presented to predict the reflection of sound at forest edges. A single tree is modelled as a vertical cylinder. For the reflection at a cylinder an analytical solution is given based on the theory of scattering of spherical waves. The entire forest is represented by a line of cylinders

  10. Model Predictive Control for Offset-Free Reference Tracking

    Czech Academy of Sciences Publication Activity Database

    Belda, Květoslav

    2016-01-01

    Roč. 5, č. 1 (2016), s. 8-13 ISSN 1805-3386 Institutional support: RVO:67985556 Keywords : offset-free reference tracking * predictive control * ARX model * state-space model * multi-input multi-output system * robotic system * mechatronic system Subject RIV: BC - Control Systems Theory http://library.utia.cas.cz/separaty/2016/AS/belda-0458355.pdf

  11. Multi-model ensemble schemes for predicting northeast monsoon ...

    Indian Academy of Sciences (India)

    An attempt has been made to improve the accuracy of predicted rainfall using three different multi-model ensemble (MME) schemes, viz., simple arithmetic mean of models (EM), principal component regression (PCR) and singular value decomposition based multiple linear regressions (SVD). It is found out that among ...

  12. Supervisory Model Predictive Control of the Heat Integrated Distillation Column

    DEFF Research Database (Denmark)

    Meyer, Kristian; Bisgaard, Thomas; Huusom, Jakob Kjøbsted

    2017-01-01

    This paper benchmarks a centralized control system based on model predictive control for the operation of the heat integrated distillation column (HIDiC) against a fully decentralized control system using the most complete column model currently available in the literature. The centralized contro...

  13. Evaluation of performance of Predictive Models for Deoxynivalenol in Wheat

    NARCIS (Netherlands)

    Fels, van der H.J.

    2014-01-01

    The aim of this study was to evaluate the performance of two predictive models for deoxynivalenol contamination of wheat at harvest in the Netherlands, including the use of weather forecast data and external model validation. Data were collected in a different year and from different wheat fields

  14. Three-model ensemble wind prediction in southern Italy

    Directory of Open Access Journals (Sweden)

    R. C. Torcasio

    2016-03-01

    Full Text Available Quality of wind prediction is of great importance since a good wind forecast allows the prediction of available wind power, improving the penetration of renewable energies into the energy market. Here, a 1-year (1 December 2012 to 30 November 2013) three-model ensemble (TME) experiment for wind prediction is considered. The models employed, run operationally at National Research Council – Institute of Atmospheric Sciences and Climate (CNR-ISAC), are RAMS (Regional Atmospheric Modelling System), BOLAM (BOlogna Limited Area Model), and MOLOCH (MOdello LOCale in H coordinates). The area considered for the study is southern Italy and the measurements used for the forecast verification are those of the GTS (Global Telecommunication System). Comparison with observations is made every 3 h up to 48 h of forecast lead time. Results show that the three-model ensemble outperforms the forecast of each individual model. The RMSE improvement compared to the best model is between 22 and 30 %, depending on the season. It is also shown that the three-model ensemble outperforms the IFS (Integrated Forecasting System) of the ECMWF (European Centre for Medium-Range Weather Forecast) for the surface wind forecasts. Notably, the three-model ensemble forecast performs better than each unbiased model, showing the added value of the ensemble technique. Finally, the sensitivity of the three-model ensemble RMSE to the length of the training period is analysed.

  15. Three-model ensemble wind prediction in southern Italy

    Science.gov (United States)

    Torcasio, Rosa Claudia; Federico, Stefano; Calidonna, Claudia Roberta; Avolio, Elenio; Drofa, Oxana; Landi, Tony Christian; Malguzzi, Piero; Buzzi, Andrea; Bonasoni, Paolo

    2016-03-01

    Quality of wind prediction is of great importance since a good wind forecast allows the prediction of available wind power, improving the penetration of renewable energies into the energy market. Here, a 1-year (1 December 2012 to 30 November 2013) three-model ensemble (TME) experiment for wind prediction is considered. The models employed, run operationally at National Research Council - Institute of Atmospheric Sciences and Climate (CNR-ISAC), are RAMS (Regional Atmospheric Modelling System), BOLAM (BOlogna Limited Area Model), and MOLOCH (MOdello LOCale in H coordinates). The area considered for the study is southern Italy and the measurements used for the forecast verification are those of the GTS (Global Telecommunication System). Comparison with observations is made every 3 h up to 48 h of forecast lead time. Results show that the three-model ensemble outperforms the forecast of each individual model. The RMSE improvement compared to the best model is between 22 and 30 %, depending on the season. It is also shown that the three-model ensemble outperforms the IFS (Integrated Forecasting System) of the ECMWF (European Centre for Medium-Range Weather Forecast) for the surface wind forecasts. Notably, the three-model ensemble forecast performs better than each unbiased model, showing the added value of the ensemble technique. Finally, the sensitivity of the three-model ensemble RMSE to the length of the training period is analysed.

  16. Development of Prediction Model and Experimental Validation in Predicting the Curcumin Content of Turmeric (Curcuma longa L.).

    Science.gov (United States)

    Akbar, Abdul; Kuanar, Ananya; Joshi, Raj K; Sandeep, I S; Mohanty, Sujata; Naik, Pradeep K; Mishra, Antaryami; Nayak, Sanghamitra

    2016-01-01

    The drug-yielding potential of turmeric (Curcuma longa L.) is largely due to the presence of the phyto-constituent 'curcumin.' Curcumin has been found to possess a myriad of therapeutic activities ranging from anti-inflammatory to neuroprotective. Lack of requisite high-curcumin-containing genotypes and variation in the curcumin content of turmeric across different agro-climatic regions are the major stumbling blocks in commercial production of turmeric. Curcumin content of turmeric is greatly influenced by environmental factors. Hence, a prediction model based on an artificial neural network (ANN) was developed to map the genome-environment interaction based on curcumin content and soil and climatic factors from different agroclimatic regions, for prediction of maximum curcumin content at various sites and to facilitate the selection of suitable regions for commercial cultivation of turmeric. The ANN model was developed and tested using a data set of 119 samples collected from 8 different agroclimatic regions of Odisha. The curcumin content measured from these samples varied from 0.4% to 7.2%. The ANN model was trained with 11 parameters of soil and climatic factors as input and curcumin content as output. The results showed that a feed-forward ANN model with 8 nodes (MLFN-8) was the most suitable one, with an R² value of 0.91. Sensitivity analysis revealed that minimum relative humidity, altitude, soil nitrogen content and soil pH had the greatest effect on curcumin content. This ANN model has shown proven efficiency for predicting and optimizing the curcumin content at a specific site.
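
    The record above describes a single-hidden-layer feed-forward network (MLFN-8) mapping 11 soil and climate descriptors to curcumin content. A minimal sketch of that kind of model is given below using scikit-learn; the data, train/test split and all variable names are synthetic placeholders rather than the study's 119 field samples.

```python
# Sketch of a feed-forward ANN with one hidden layer of 8 nodes (MLFN-8-style)
# mapping 11 soil/climate descriptors to curcumin content. The data below are
# synthetic placeholders; the study's 119 field samples are not reproduced.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.normal(size=(119, 11))        # 11 soil/climate inputs (placeholder)
y = 0.4 + 6.8 * rng.random(119)       # curcumin %, spanning roughly 0.4-7.2 (placeholder)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0),
)
model.fit(X_tr, y_tr)
print("held-out R^2:", round(r2_score(y_te, model.predict(X_te)), 3))
```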

  17. The predictive performance and stability of six species distribution models.

    Science.gov (United States)

    Duan, Ren-Yan; Kong, Xiao-Quan; Huang, Min-Yi; Fan, Wei-Yi; Wang, Zhi-Gao

    2014-01-01

    Predicting species' potential geographical range by species distribution models (SDMs) is central to understanding their ecological requirements. However, the effects of using different modeling techniques need further investigation. In order to improve the prediction effect, we need to assess the predictive performance and stability of different SDMs. We collected the distribution data of five common tree species (Pinus massoniana, Betula platyphylla, Quercus wutaishanica, Quercus mongolica and Quercus variabilis) and simulated their potential distribution area using 13 environmental variables and six widely used SDMs: BIOCLIM, DOMAIN, MAHAL, RF, MAXENT, and SVM. Each model run was repeated 100 times (trials). We compared the predictive performance by testing the consistency between observations and simulated distributions and assessed the stability by the standard deviation, coefficient of variation, and the 99% confidence interval of Kappa and AUC values. The mean values of AUC and Kappa from MAHAL, RF, MAXENT, and SVM trials were similar and significantly higher than those from BIOCLIM and DOMAIN trials (p < 0.05). These four SDMs (MAHAL, RF, MAXENT, and SVM) had higher prediction accuracy, smaller confidence intervals, and were more stable and less affected by the random variable (randomly selected pseudo-absence points). According to the prediction performance and stability of SDMs, we can divide these six SDMs into two categories: a high performance and stability group including MAHAL, RF, MAXENT, and SVM, and a low performance and stability group consisting of BIOCLIM and DOMAIN. We highlight that choosing appropriate SDMs to address a specific problem is an important part of the modeling process.
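
    The stability summaries used in this record (standard deviation, coefficient of variation and 99% confidence interval of per-trial AUC or Kappa values) are straightforward to compute. The sketch below assumes the per-trial scores are already available; the simulated scores are placeholders, not the study's results.

```python
# Sketch of the stability summaries described above: for each SDM's per-trial AUC
# (or Kappa) values, report the mean, standard deviation, coefficient of variation
# and a 99% confidence interval. The trial scores here are simulated placeholders.
import numpy as np
from scipy import stats

def stability_summary(scores, level=0.99):
    scores = np.asarray(scores, dtype=float)
    mean, sd = scores.mean(), scores.std(ddof=1)
    half = stats.t.ppf(0.5 + level / 2, df=scores.size - 1) * sd / np.sqrt(scores.size)
    return {"mean": mean, "sd": sd, "cv": sd / mean, "ci99": (mean - half, mean + half)}

rng = np.random.default_rng(1)
trials = {                              # 100 trials per model (placeholder AUC values)
    "MAXENT": rng.normal(0.90, 0.01, 100),
    "BIOCLIM": rng.normal(0.80, 0.04, 100),
}
for name, auc in trials.items():
    print(name, stability_summary(auc))
```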

  18. SHMF: Interest Prediction Model with Social Hub Matrix Factorization

    Directory of Open Access Journals (Sweden)

    Chaoyuan Cui

    2017-01-01

    Full Text Available With the development of social networks, microblog has become the major social communication tool. There is a lot of valuable information such as personal preference, public opinion, and marketing in microblog. Consequently, research on user interest prediction in microblog has a positive practical significance. In fact, how to extract information associated with user interest orientation from the constantly updated blog posts is not so easy. Existing prediction approaches based on probabilistic factor analysis use blog posts published by user to predict user interest. However, these methods are not very effective for the users who post less but browse more. In this paper, we propose a new prediction model, which is called SHMF, using social hub matrix factorization. SHMF constructs the interest prediction model by combining the information of blogs posts published by both user and direct neighbors in user’s social hub. Our proposed model predicts user interest by integrating user’s historical behavior and temporal factor as well as user’s friendships, thus achieving accurate forecasts of user’s future interests. The experimental results on Sina Weibo show the efficiency and effectiveness of our proposed model.

  19. Bayesian Genomic Prediction with Genotype × Environment Interaction Kernel Models

    Science.gov (United States)

    Cuevas, Jaime; Crossa, José; Montesinos-López, Osval A.; Burgueño, Juan; Pérez-Rodríguez, Paulino; de los Campos, Gustavo

    2016-01-01

    The phenomenon of genotype × environment (G × E) interaction in plant breeding decreases selection accuracy, thereby negatively affecting genetic gains. Several genomic prediction models incorporating G × E have been recently developed and used in genomic selection of plant breeding programs. Genomic prediction models for assessing multi-environment G × E interaction are extensions of a single-environment model, and have advantages and limitations. In this study, we propose two multi-environment Bayesian genomic models: the first model considers genetic effects (u) that can be assessed by the Kronecker product of variance–covariance matrices of genetic correlations between environments and genomic kernels through markers under two linear kernel methods, linear (genomic best linear unbiased predictors, GBLUP) and Gaussian (Gaussian kernel, GK). The other model has the same genetic component as the first model (u) plus an extra component, f, that captures random effects between environments that were not captured by the random effects u. We used five CIMMYT data sets (one maize and four wheat) that were previously used in different studies. Results show that models with G × E always have superior prediction ability than single-environment models, and the higher prediction ability of multi-environment models with u and f over the multi-environment model with only u occurred 85% of the time with GBLUP and 45% of the time with GK across the five data sets. The latter result indicated that including the random effect f is still beneficial for increasing prediction ability after adjusting by the random effect u. PMID:27793970

  20. Bayesian Genomic Prediction with Genotype × Environment Interaction Kernel Models

    Directory of Open Access Journals (Sweden)

    Jaime Cuevas

    2017-01-01

    Full Text Available The phenomenon of genotype × environment (G × E) interaction in plant breeding decreases selection accuracy, thereby negatively affecting genetic gains. Several genomic prediction models incorporating G × E have been recently developed and used in genomic selection of plant breeding programs. Genomic prediction models for assessing multi-environment G × E interaction are extensions of a single-environment model, and have advantages and limitations. In this study, we propose two multi-environment Bayesian genomic models: the first model considers genetic effects (u) that can be assessed by the Kronecker product of variance–covariance matrices of genetic correlations between environments and genomic kernels through markers under two linear kernel methods, linear (genomic best linear unbiased predictors, GBLUP) and Gaussian (Gaussian kernel, GK). The other model has the same genetic component as the first model (u) plus an extra component, f, that captures random effects between environments that were not captured by the random effects u. We used five CIMMYT data sets (one maize and four wheat) that were previously used in different studies. Results show that models with G × E always have superior prediction ability than single-environment models, and the higher prediction ability of multi-environment models with u and f over the multi-environment model with only u occurred 85% of the time with GBLUP and 45% of the time with GK across the five data sets. The latter result indicated that including the random effect f is still beneficial for increasing prediction ability after adjusting by the random effect u.

  1. Bayesian Genomic Prediction with Genotype × Environment Interaction Kernel Models.

    Science.gov (United States)

    Cuevas, Jaime; Crossa, José; Montesinos-López, Osval A; Burgueño, Juan; Pérez-Rodríguez, Paulino; de Los Campos, Gustavo

    2017-01-05

    The phenomenon of genotype × environment (G × E) interaction in plant breeding decreases selection accuracy, thereby negatively affecting genetic gains. Several genomic prediction models incorporating G × E have been recently developed and used in genomic selection of plant breeding programs. Genomic prediction models for assessing multi-environment G × E interaction are extensions of a single-environment model, and have advantages and limitations. In this study, we propose two multi-environment Bayesian genomic models: the first model considers genetic effects (u) that can be assessed by the Kronecker product of variance-covariance matrices of genetic correlations between environments and genomic kernels through markers under two linear kernel methods, linear (genomic best linear unbiased predictors, GBLUP) and Gaussian (Gaussian kernel, GK). The other model has the same genetic component as the first model (u) plus an extra component, f, that captures random effects between environments that were not captured by the random effects u. We used five CIMMYT data sets (one maize and four wheat) that were previously used in different studies. Results show that models with G × E always have superior prediction ability than single-environment models, and the higher prediction ability of multi-environment models with u and f over the multi-environment model with only u occurred 85% of the time with GBLUP and 45% of the time with GK across the five data sets. The latter result indicated that including the random effect f is still beneficial for increasing prediction ability after adjusting by the random effect u. Copyright © 2017 Cuevas et al.
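
    The three preceding records name two genomic kernels (a linear GBLUP kernel and a Gaussian kernel) and a Kronecker-structured across-environment genetic covariance. The sketch below illustrates how those pieces are typically assembled; the marker matrix, environment correlation matrix and Gaussian bandwidth are illustrative placeholders, not CIMMYT data or the papers' fitted values.

```python
# Sketch of the two genomic kernels named above (linear/GBLUP and Gaussian/GK) and
# of a Kronecker-structured G x E covariance for the genetic effects u. The marker
# matrix and the between-environment correlation matrix are random placeholders,
# and the Gaussian bandwidth choice is illustrative.
import numpy as np
from scipy.spatial.distance import pdist, squareform

rng = np.random.default_rng(0)
n_lines, n_markers = 50, 200
X = rng.choice([-1.0, 0.0, 1.0], size=(n_lines, n_markers))   # coded genotypes
X = X - X.mean(axis=0)                                        # centre marker columns

K_gblup = X @ X.T / n_markers                                 # linear (GBLUP) kernel

D2 = squareform(pdist(X, metric="sqeuclidean"))               # squared genetic distances
K_gauss = np.exp(-D2 / np.median(D2[D2 > 0]))                 # Gaussian kernel (GK)

E = np.array([[1.0, 0.5, 0.3],                                # genetic correlations
              [0.5, 1.0, 0.4],                                # between 3 environments
              [0.3, 0.4, 1.0]])                               # (placeholder values)

Cov_u = np.kron(E, K_gblup)        # covariance of the stacked genetic effects u
print("Cov(u) shape:", Cov_u.shape, "| GK shape:", K_gauss.shape)
```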

  2. Landscape characteristics influencing the genetic structure of greater sage-grouse within the stronghold of their range: a holistic modeling approach

    Science.gov (United States)

    Row, Jeff R; Oyler-McCance, Sara J.; Fike, Jennifer; O'Donnell, Michael; Doherty, Kevin E.; Aldridge, Cameron L.; Bowen, Zachary H.; Fedy, Brad C.

    2015-01-01

    Given the significance of animal dispersal to population dynamics and geographic variability, understanding how dispersal is impacted by landscape patterns has major ecological and conservation importance. Speaking to the importance of dispersal, the use of linear mixed models to compare genetic differentiation with pairwise resistance derived from landscape resistance surfaces has presented new opportunities to disentangle the menagerie of factors behind effective dispersal across a given landscape. Here, we combine these approaches with novel resistance surface parameterization to determine how the distribution of high- and low-quality seasonal habitat and individual landscape components shape patterns of gene flow for the greater sage-grouse (Centrocercus urophasianus) across Wyoming. We found that pairwise resistance derived from the distribution of low-quality nesting and winter, but not summer, seasonal habitat had the strongest correlation with genetic differentiation. Although the patterns were not as strong as with habitat distribution, multivariate models with sagebrush cover and landscape ruggedness or forest cover and ruggedness similarly had a much stronger fit with genetic differentiation than an undifferentiated landscape. In most cases, landscape resistance surfaces transformed with 17.33-km-diameter moving windows were preferred, suggesting small-scale differences in habitat were unimportant at this large spatial extent. Despite the emergence of these overall patterns, there were differences in the selection of top models depending on the model selection criteria, suggesting research into the most appropriate criteria for landscape genetics is required. Overall, our results highlight the importance of differences in seasonal habitat preferences to patterns of gene flow and suggest the combination of habitat suitability modeling and linear mixed models with our resistance parameterization is a powerful approach to discerning the effects of landscape

  3. Landscape characteristics influencing the genetic structure of greater sage-grouse within the stronghold of their range: a holistic modeling approach.

    Science.gov (United States)

    Row, Jeffrey R; Oyler-McCance, Sara J; Fike, Jennifer A; O'Donnell, Michael S; Doherty, Kevin E; Aldridge, Cameron L; Bowen, Zachary H; Fedy, Bradley C

    2015-05-01

    Given the significance of animal dispersal to population dynamics and geographic variability, understanding how dispersal is impacted by landscape patterns has major ecological and conservation importance. Speaking to the importance of dispersal, the use of linear mixed models to compare genetic differentiation with pairwise resistance derived from landscape resistance surfaces has presented new opportunities to disentangle the menagerie of factors behind effective dispersal across a given landscape. Here, we combine these approaches with novel resistance surface parameterization to determine how the distribution of high- and low-quality seasonal habitat and individual landscape components shape patterns of gene flow for the greater sage-grouse (Centrocercus urophasianus) across Wyoming. We found that pairwise resistance derived from the distribution of low-quality nesting and winter, but not summer, seasonal habitat had the strongest correlation with genetic differentiation. Although the patterns were not as strong as with habitat distribution, multivariate models with sagebrush cover and landscape ruggedness or forest cover and ruggedness similarly had a much stronger fit with genetic differentiation than an undifferentiated landscape. In most cases, landscape resistance surfaces transformed with 17.33-km-diameter moving windows were preferred, suggesting small-scale differences in habitat were unimportant at this large spatial extent. Despite the emergence of these overall patterns, there were differences in the selection of top models depending on the model selection criteria, suggesting research into the most appropriate criteria for landscape genetics is required. Overall, our results highlight the importance of differences in seasonal habitat preferences to patterns of gene flow and suggest the combination of habitat suitability modeling and linear mixed models with our resistance parameterization is a powerful approach to discerning the effects of landscape

  4. Stochastic models for predicting pitting corrosion damage of HLRW containers

    International Nuclear Information System (INIS)

    Henshall, G.A.

    1991-10-01

    Stochastic models for predicting aqueous pitting corrosion damage of high-level radioactive-waste containers are described. These models could be used to predict the time required for the first pit to penetrate a container and the increase in the number of breaches at later times, both of which would be useful in the repository system performance analysis. Monte Carlo implementations of the stochastic models are described, and predictions of induction time, survival probability and pit depth distributions are presented. These results suggest that the pit nucleation probability decreases with exposure time and that pit growth may be a stochastic process. The advantages and disadvantages of the stochastic approach, methods for modeling the effects of environment, and plans for future work are discussed
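
    A hedged sketch of the kind of Monte Carlo implementation described above: pits nucleate with a probability that decays with exposure time, grow stochastically, and the simulation records penetration (induction) times and a survival probability. All rates, the wall thickness and the growth distribution are illustrative assumptions, not the report's calibrated values.

```python
# Monte Carlo sketch: pits nucleate with a probability that decays with exposure
# time, each pit grows by random annual increments, and the first-penetration
# (induction) time is recorded per simulated container. Wall thickness, rates and
# distributions are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(0)
WALL_MM = 10.0                    # container wall thickness (placeholder)
YEARS = 1000                      # simulated exposure period
LAMBDA0, DECAY = 0.05, 0.01       # nucleation rate per year and its decay (placeholders)

def one_container():
    depths = []                                            # depths of nucleated pits
    for t in range(1, YEARS + 1):
        if rng.random() < LAMBDA0 * np.exp(-DECAY * t):    # decaying nucleation probability
            depths.append(0.0)
        depths = [d + rng.exponential(0.02) for d in depths]   # stochastic growth (mm/yr)
        if depths and max(depths) >= WALL_MM:
            return t                                       # penetration (induction) time
    return None                                            # container survived

times = [one_container() for _ in range(500)]
breached = [t for t in times if t is not None]
print("survival probability:", 1 - len(breached) / len(times))
if breached:
    print("median penetration time (yr):", float(np.median(breached)))
```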

  5. Verification and improvement of a predictive model for radionuclide migration

    International Nuclear Information System (INIS)

    Miller, C.W.; Benson, L.V.; Carnahan, C.L.

    1982-01-01

    Prediction of the rates of migration of contaminant chemical species in groundwater flowing through toxic waste repositories is essential to the assessment of a repository's capability of meeting standards for release rates. A large number of chemical transport models, of varying degrees of complexity, have been devised for the purpose of providing this predictive capability. In general, the transport of dissolved chemical species through a water-saturated porous medium is influenced by convection, diffusion/dispersion, sorption, formation of complexes in the aqueous phase, and chemical precipitation. The reliability of predictions made with the models which omit certain of these processes is difficult to assess. A numerical model, CHEMTRN, has been developed to determine which chemical processes govern radionuclide migration. CHEMTRN builds on a model called MCCTM developed previously by Lichtner and Benson

  6. A novel Bayesian hierarchical model for road safety hotspot prediction.

    Science.gov (United States)

    Fawcett, Lee; Thorpe, Neil; Matthews, Joseph; Kremer, Karsten

    2017-02-01

    In this paper, we propose a Bayesian hierarchical model for predicting accident counts in future years at sites within a pool of potential road safety hotspots. The aim is to inform road safety practitioners of the location of likely future hotspots to enable a proactive, rather than reactive, approach to road safety scheme implementation. A feature of our model is the ability to rank sites according to their potential to exceed, in some future time period, a threshold accident count which may be used as a criterion for scheme implementation. Our model specification enables the classical empirical Bayes formulation - commonly used in before-and-after studies, wherein accident counts from a single before period are used to estimate counterfactual counts in the after period - to be extended to incorporate counts from multiple time periods. This allows site-specific variations in historical accident counts (e.g. locally-observed trends) to offset estimates of safety generated by a global accident prediction model (APM), which itself is used to help account for the effects of global trend and regression-to-mean (RTM). The Bayesian posterior predictive distribution is exploited to formulate predictions and to properly quantify our uncertainty in these predictions. The main contributions of our model include (i) the ability to allow accident counts from multiple time-points to inform predictions, with counts in more recent years lending more weight to predictions than counts from time-points further in the past; (ii) where appropriate, the ability to offset global estimates of trend by variations in accident counts observed locally, at a site-specific level; and (iii) the ability to account for unknown/unobserved site-specific factors which may affect accident counts. We illustrate our model with an application to accident counts at 734 potential hotspots in the German city of Halle; we also propose some simple diagnostics to validate the predictive capability of our
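
    A much-simplified sketch of the ranking idea in this record follows: an accident prediction model (APM) estimate is combined with several years of observed counts through a conjugate gamma-Poisson update, and sites are ranked by the posterior predictive probability of exceeding a threshold count next year. This is not the paper's full hierarchical model; the prior strength, counts and threshold are placeholders.

```python
# Simplified gamma-Poisson sketch of the hotspot-ranking idea: combine an accident
# prediction model (APM) estimate with several years of observed counts via a
# conjugate update, then rank sites by the posterior predictive probability of
# exceeding a threshold count next year. Priors, counts and the threshold are
# placeholders; the paper's full hierarchical model is not reproduced.
import numpy as np
from scipy import stats

def exceedance_prob(apm_mean, counts, threshold, prior_strength=2.0):
    a0, b0 = apm_mean * prior_strength, prior_strength    # gamma prior centred on the APM
    a, b = a0 + sum(counts), b0 + len(counts)             # conjugate posterior
    p = b / (b + 1.0)                                     # posterior predictive is neg. binomial
    return 1.0 - stats.nbinom.cdf(threshold - 1, a, p)

sites = {            # site: (APM-expected annual count, observed counts over 3 years)
    "A": (4.0, [6, 7, 5]),
    "B": (4.0, [2, 3, 4]),
    "C": (1.5, [0, 1, 9]),
}
ranked = sorted(sites, key=lambda s: -exceedance_prob(*sites[s], threshold=6))
for s in ranked:
    print(s, round(float(exceedance_prob(*sites[s], threshold=6)), 3))
```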

  7. Artificial neural network models for prediction of intestinal permeability of oligopeptides

    Directory of Open Access Journals (Sweden)

    Kim Min-Kook

    2007-07-01

    Full Text Available Abstract Background Oral delivery is a highly desirable property for candidate drugs under development. Computational modeling could provide a quick and inexpensive way to assess the intestinal permeability of a molecule. Although there have been several studies aimed at predicting the intestinal absorption of chemical compounds, there have been no attempts to predict intestinal permeability on the basis of peptide sequence information. To develop models for predicting the intestinal permeability of peptides, we adopted an artificial neural network as a machine-learning algorithm. The positive control data consisted of intestinal barrier-permeable peptides obtained by the peroral phage display technique, and the negative control data were prepared from random sequences. Results The capacity of our models to make appropriate predictions was validated by statistical indicators including sensitivity, specificity, enrichment curve, and the area under the receiver operating characteristic (ROC) curve (the ROC score). The training and test set statistics indicated that our models were of strikingly good quality and could discriminate between permeable and random sequences with a high level of confidence. Conclusion We developed artificial neural network models to predict the intestinal permeabilities of oligopeptides on the basis of peptide sequence information. Both binary and VHSE (principal components score Vectors of Hydrophobic, Steric and Electronic properties) descriptors produced statistically significant training models; the models with simple neural network architectures showed slightly greater predictive power than those with complex ones. We anticipate that our models will be applicable to the selection of intestinal barrier-permeable peptides for generating peptide drugs or peptidomimetics.

  8. Intercomparison of model predictions of tritium concentrations in soil and foods following acute airborne HTO exposure

    International Nuclear Information System (INIS)

    Barry, P.J.; Watkins, B.M.; Belot, Y.; Davis, P.A.; Edlund, O.; Galeriu, D.; Raskob, W.; Russell, S.; Togawa, O.

    1998-01-01

    This paper describes the results of a model intercomparison exercise for predicting tritium transport through foodchains. Modellers were asked to assume that farmland was exposed for one hour to an average concentration in air of 10^4 MBq tritium m^-3. They were given the initial soil moisture content and 30 days of hourly averaged historical weather and asked to predict HTO and OBT concentrations in foods at selected times up to 30 days later, when crops were assumed to be harvested. Two fumigations were postulated, one at 10.00 h (i.e., in daylight), and the other at 24.00 h (i.e., in darkness). Predicted environmental media concentrations after the daytime exposure agreed within an order of magnitude in most cases. Important sources of differences were variations in choices of numerical values for transport parameters. The different depths of soil layers used in the models appeared to make important contributions to differences in predictions for the given scenario. Following the night-time exposure, however, greater differences in predicted concentrations appeared. These arose largely because of different ways key processes were assumed to be affected by darkness. Uptake of HTO by vegetation and the rate at which it is converted to OBT were prominent amongst these processes. Further research, experimental data and modelling intercomparisons are required to resolve some of these issues. (Copyright (c) 1998 Elsevier Science B.V., Amsterdam. All rights reserved.)

  9. Assessing the Climate Change Impact on Snow-Glacier Melting Dominated Basins in the Greater Himalaya Region Using a Distributed Glacio-Hydrologic Model

    Science.gov (United States)

    Wi, S.; Yang, Y. C. E.; Khalil, A.

    2014-12-01

    Glacier and snow melting is a main source of water supply, making a large contribution to streamflow of major river basins in the Greater Himalaya region, including the Syr Darya, the Amu Darya, the Indus, the Ganges and the Brahmaputra basins. Due to the critical role of glacier and snow melt as a water supply for both food production and hydropower generation in the region (especially during the low-flow season), it is important to evaluate the vulnerability of snow- and glacier-melt streamflow to different climate conditions. In this study, a distributed glacio-hydrologic model with high-resolution climate input is developed and calibrated that explicitly simulates all major hydrological processes and the glacier and snow dynamics for areas further discretized by elevation bands. The distributed modeling structure and the glacier and snow modules provide a better understanding of how temperature and precipitation alterations are likely to affect current glacier ice reserves. A climate stress test is used to explore changes in total streamflow, the snow/glacier melting contribution, and glacier accumulation and ablation under a variety of different temperature and precipitation conditions. The latest future climate projections provided by the World Climate Research Programme's Coupled Model Intercomparison Project Phase 5 (CMIP5) are used to inform the possibility of different climate conditions.
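
    As a hedged illustration of the climate stress test idea, the sketch below perturbs temperature and recomputes melt from a simple temperature-index (degree-day) component evaluated per elevation band. The degree-day formulation, lapse rate, band elevations and daily series are assumptions for illustration only; the study's distributed glacio-hydrologic model is considerably more detailed.

```python
# Hedged sketch of a climate stress test on a temperature-index (degree-day) melt
# component evaluated per elevation band. The degree-day factor, lapse rate, band
# elevations and daily temperature series are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)
days = np.arange(365)
T_station = 5 + 15 * np.sin(2 * np.pi * days / 365) + rng.normal(0, 3, days.size)
bands_m = np.array([3000.0, 4000.0, 5000.0])   # elevation band mid-points (placeholder)
station_m, lapse = 2500.0, -0.0065             # station elevation (m), lapse rate (K/m)
ddf = 4.0                                      # degree-day factor (mm / degC / day)

def annual_melt(delta_T):
    """Mean annual melt (mm) over the bands for a temperature perturbation delta_T."""
    melt = 0.0
    for z in bands_m:
        T_band = T_station + delta_T + lapse * (z - station_m)
        melt += ddf * np.clip(T_band, 0.0, None).sum()     # positive degree-days only
    return melt / bands_m.size

for dT in (0.0, 1.0, 2.0, 4.0):                # temperature axis of the stress test
    print(f"+{dT:.0f} K -> melt ~ {annual_melt(dT):.0f} mm/yr")
```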

  10. Numerical weather prediction (NWP) and hybrid ARMA/ANN model to predict global radiation

    International Nuclear Information System (INIS)

    Voyant, Cyril; Muselli, Marc; Paoli, Christophe; Nivet, Marie-Laure

    2012-01-01

    We propose in this paper an original technique to predict global radiation using a hybrid ARMA/ANN model and data issued from a numerical weather prediction model (NWP). We particularly look at the multi-layer perceptron (MLP). After optimizing our architecture with NWP and endogenous data previously made stationary and using an innovative pre-input layer selection method, we combined it to an ARMA model from a rule based on the analysis of hourly data series. This model has been used to forecast the hourly global radiation for five places in Mediterranean area. Our technique outperforms classical models for all the places. The nRMSE for our hybrid model MLP/ARMA is 14.9% compared to 26.2% for the naïve persistence predictor. Note that in the standalone ANN case the nRMSE is 18.4%. Finally, in order to discuss the reliability of the forecaster outputs, a complementary study concerning the confidence interval of each prediction is proposed. -- Highlights: ► Time series forecasting with hybrid method based on the use of ALADIN numerical weather model, ANN and ARMA. ► Innovative pre-input layer selection method. ► Combination of optimized MLP and ARMA model obtained from a rule based on the analysis of hourly data series. ► Stationarity process (method and control) for the global radiation time series.
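
    One common way to hybridise an MLP with an ARMA model (not necessarily the exact combination rule used in this record) is to let the MLP map lagged, stationarised radiation plus an NWP predictor to the next hour and to fit an ARMA model to the MLP's residuals as a correction, as sketched below on synthetic data.

```python
# Generic MLP + ARMA hybrid sketch: an MLP maps lagged, stationarised radiation plus
# an NWP predictor to the next hour, and an ARMA model fitted to the MLP's in-sample
# residuals adds a correction. The combination rule and ALADIN inputs of the paper
# are not reproduced; all series below are synthetic.
import numpy as np
from sklearn.neural_network import MLPRegressor
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
n = 500
nwp = rng.normal(size=n)                              # stand-in NWP predictor
rad = 0.6 * nwp + 0.3 * rng.normal(size=n)            # stationarised radiation (placeholder)
rad[1:] += 0.4 * rad[:-1]                             # add some hour-to-hour persistence

lags = 3
X = np.array([np.r_[rad[t - lags:t], nwp[t]] for t in range(lags, n)])
y = rad[lags:]

mlp = MLPRegressor(hidden_layer_sizes=(10,), max_iter=3000, random_state=0).fit(X, y)
resid = y - mlp.predict(X)

arma = ARIMA(resid, order=(2, 0, 2)).fit()            # ARMA(2,2) on the residuals
hybrid = mlp.predict(X[-1:])[0] + arma.forecast(1)[0] # illustrative one-step combination
print("hybrid one-step forecast:", round(float(hybrid), 3))
```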

  11. A simplified building airflow model for agent concentration prediction.

    Science.gov (United States)

    Jacques, David R; Smith, David A

    2010-11-01

    A simplified building airflow model is presented that can be used to predict the spread of a contaminant agent from a chemical or biological attack. If the dominant means of agent transport throughout the building is an air-handling system operating at steady-state, a linear time-invariant (LTI) model can be constructed to predict the concentration in any room of the building as a result of either an internal or external release. While the model does not capture weather-driven and other temperature-driven effects, it is suitable for concentration predictions under average daily conditions. The model is easily constructed using information that should be accessible to a building manager, supplemented with assumptions based on building codes and standard air-handling system design practices. The results of the model are compared with a popular multi-zone model for a simple building and are demonstrated for building examples containing one or more air-handling systems. The model can be used for rapid concentration prediction to support low-cost placement strategies for chemical and biological detection sensors.
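
    The LTI structure described above can be written as a small state-space system in which each zone's concentration evolves according to the fixed air-handling flows. The sketch below uses a three-zone layout with placeholder volumes, flow rates and release term; it illustrates the modelling idea only, not the paper's building.

```python
# Minimal LTI zonal sketch: V_i dC_i/dt = sum_j Q[j,i] C_j - (outflow_i) C_i + release_i,
# with steady air-handling flows. Zones, volumes, flows and the release are placeholders.
import numpy as np
from scipy.integrate import solve_ivp

V = np.array([50.0, 80.0, 60.0])              # zone volumes (m^3)
Q = np.array([[0.0, 5.0, 0.0],                # Q[i, j]: airflow from zone i to zone j
              [2.0, 0.0, 6.0],                # (m^3/min); diagonal unused
              [3.0, 1.0, 0.0]])
supply = np.array([2.0, 4.0, 0.0])            # clean outdoor supply air per zone
exhaust = Q.sum(axis=0) + supply - Q.sum(axis=1)   # exhaust closes each zone's air balance

A = (Q.T - np.diag(Q.sum(axis=1) + exhaust)) / V[:, None]   # LTI system matrix
release = np.array([10.0, 0.0, 0.0])          # internal release in zone 1 (mg/min)

def rhs(t, c):                                # dC/dt = A C + release / V
    return A @ c + release / V

sol = solve_ivp(rhs, (0.0, 60.0), np.zeros(3), t_eval=np.linspace(0.0, 60.0, 7))
print(np.round(sol.y, 3))                     # mg/m^3 per zone at the sampled times
```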

  12. Discrete fracture modelling for the Stripa tracer validation experiment predictions

    International Nuclear Information System (INIS)

    Dershowitz, W.; Wallmann, P.

    1992-02-01

    Groundwater flow and transport through three-dimensional networks of discrete fractures were modeled to predict the recovery of tracer from tracer injection experiments conducted during phase 3 of the Stripa site characterization and validation project. Predictions were made on the basis of an updated version of the site-scale discrete fracture conceptual model used for flow predictions and preliminary transport modelling. In this model, individual fractures were treated as stochastic features described by probability distributions of geometric and hydrologic properties. Fractures were divided into three populations: fractures in fracture zones near the drift, non-fracture-zone fractures within 31 m of the drift, and fractures in fracture zones over 31 m from the drift axis. Fractures outside fracture zones are not modelled beyond 31 m from the drift axis. Transport predictions were produced using the FracMan discrete fracture modelling package for each of five tracer experiments. Output was produced in the seven formats specified by the Stripa task force on fracture flow modelling. (au)

  13. Predicting nucleic acid binding interfaces from structural models of proteins.

    Science.gov (United States)

    Dror, Iris; Shazman, Shula; Mukherjee, Srayanta; Zhang, Yang; Glaser, Fabian; Mandel-Gutfreund, Yael

    2012-02-01

    The function of DNA- and RNA-binding proteins can be inferred from the characterization and accurate prediction of their binding interfaces. However, the main pitfall of various structure-based methods for predicting nucleic acid binding function is that they are all limited to a relatively small number of proteins for which high-resolution three-dimensional structures are available. In this study, we developed a pipeline for extracting functional electrostatic patches from surfaces of protein structural models, obtained using the I-TASSER protein structure predictor. The largest positive patches are extracted from the protein surface using the patchfinder algorithm. We show that functional electrostatic patches extracted from an ensemble of structural models highly overlap the patches extracted from high-resolution structures. Furthermore, by testing our pipeline on a set of 55 known nucleic acid binding proteins for which I-TASSER produces high-quality models, we show that the method accurately identifies the nucleic acids binding interface on structural models of proteins. Employing a combined patch approach we show that patches extracted from an ensemble of models better predicts the real nucleic acid binding interfaces compared with patches extracted from independent models. Overall, these results suggest that combining information from a collection of low-resolution structural models could be a valuable approach for functional annotation. We suggest that our method will be further applicable for predicting other functional surfaces of proteins with unknown structure. Copyright © 2011 Wiley Periodicals, Inc.

  14. Adaptive Gaussian Predictive Process Models for Large Spatial Datasets

    Science.gov (United States)

    Guhaniyogi, Rajarshi; Finley, Andrew O.; Banerjee, Sudipto; Gelfand, Alan E.

    2011-01-01

    Large point referenced datasets occur frequently in the environmental and natural sciences. Use of Bayesian hierarchical spatial models for analyzing these datasets is undermined by onerous computational burdens associated with parameter estimation. Low-rank spatial process models attempt to resolve this problem by projecting spatial effects to a lower-dimensional subspace. This subspace is determined by a judicious choice of “knots” or locations that are fixed a priori. One such representation yields a class of predictive process models (e.g., Banerjee et al., 2008) for spatial and spatial-temporal data. Our contribution here expands upon predictive process models with fixed knots to models that accommodate stochastic modeling of the knots. We view the knots as emerging from a point pattern and investigate how such adaptive specifications can yield more flexible hierarchical frameworks that lead to automated knot selection and substantial computational benefits. PMID:22298952
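
    The low-rank construction referred to above replaces the full spatial covariance with its projection onto a small set of knots, C_pp = c(s, knots) * C(knots, knots)^{-1} * c(knots, s). A short sketch with an exponential covariance and randomly placed knots (all parameters illustrative) is given below.

```python
# Sketch of a fixed-knot predictive process: the full covariance is replaced by its
# projection onto a small knot set, C_pp = c(s, knots) C(knots, knots)^-1 c(knots, s).
# Locations, knots and the exponential covariance parameters are placeholders.
import numpy as np
from scipy.spatial.distance import cdist

rng = np.random.default_rng(0)
locs = rng.uniform(0, 10, size=(500, 2))      # observation locations
knots = rng.uniform(0, 10, size=(25, 2))      # knots fixed a priori
sigma2, phi = 1.0, 0.5                        # exponential covariance parameters

cov = lambda a, b: sigma2 * np.exp(-phi * cdist(a, b))
C_knots = cov(knots, knots) + 1e-8 * np.eye(len(knots))   # small jitter for stability
C_cross = cov(locs, knots)

C_pp = C_cross @ np.linalg.solve(C_knots, C_cross.T)      # low-rank approximation
print("rank of predictive-process covariance:", np.linalg.matrix_rank(C_pp))
```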

  15. Retrosynthetic Reaction Prediction Using Neural Sequence-to-Sequence Models.

    Science.gov (United States)

    Liu, Bowen; Ramsundar, Bharath; Kawthekar, Prasad; Shi, Jade; Gomes, Joseph; Luu Nguyen, Quang; Ho, Stephen; Sloane, Jack; Wender, Paul; Pande, Vijay

    2017-10-25

    We describe a fully data driven model that learns to perform a retrosynthetic reaction prediction task, which is treated as a sequence-to-sequence mapping problem. The end-to-end trained model has an encoder-decoder architecture that consists of two recurrent neural networks, which has previously shown great success in solving other sequence-to-sequence prediction tasks such as machine translation. The model is trained on 50,000 experimental reaction examples from the United States patent literature, which span 10 broad reaction types that are commonly used by medicinal chemists. We find that our model performs comparably with a rule-based expert system baseline model, and also overcomes certain limitations associated with rule-based expert systems and with any machine learning approach that contains a rule-based expert system component. Our model provides an important first step toward solving the challenging problem of computational retrosynthetic analysis.

  16. Pulsatile fluidic pump demonstration and predictive model application

    International Nuclear Information System (INIS)

    Morgan, J.G.; Holland, W.D.

    1986-04-01

    Pulsatile fluidic pumps were developed as a remotely controlled method of transferring or mixing feed solutions. A test in the Integrated Equipment Test facility demonstrated the performance of a critically safe geometry pump suitable for use in a 0.1-ton/d heavy metal (HM) fuel reprocessing plant. A predictive model was developed to calculate output flows under a wide range of external system conditions. Predictive and experimental flow rates are compared for both submerged and unsubmerged fluidic pump cases

  17. Cloud Based Metalearning System for Predictive Modeling of Biomedical Data

    Directory of Open Access Journals (Sweden)

    Milan Vukićević

    2014-01-01

    Full Text Available The rapid growth and storage of biomedical data have enabled many opportunities for predictive modeling and improvement of healthcare processes. On the other hand, analysis of such large amounts of data is a difficult and computationally intensive task for most existing data mining algorithms. This problem is addressed by proposing a cloud-based system that integrates a metalearning framework for ranking and selecting the best predictive algorithms for the data at hand, together with open-source big data technologies for the analysis of biomedical data.

  18. Models for Predicting and Explaining Citation Count of Biomedical Articles

    OpenAIRE

    Fu, Lawrence D.; Aliferis, Constantin

    2008-01-01

    The single most important bibliometric criterion for judging the impact of biomedical papers and their authors’ work is the number of citations received which is commonly referred to as “citation count”. This metric however is unavailable until several years after publication time. In the present work, we build computer models that accurately predict citation counts of biomedical publications within a deep horizon of ten years using only predictive information available at publication time. O...

  19. Predictive Models of Li-ion Battery Lifetime

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Kandler; Wood, Eric; Santhanagopalan, Shriram; Kim, Gi-heon; Shi, Ying; Pesaran, Ahmad

    2015-06-15

    It remains an open question how best to predict real-world battery lifetime based on accelerated calendar and cycle aging data from the laboratory. Multiple degradation mechanisms due to (electro)chemical, thermal, and mechanical coupled phenomena influence Li-ion battery lifetime, each with different dependence on time, cycling and thermal environment. The standardization of life predictive models would benefit the industry by reducing test time and streamlining development of system controls.

  20. Modeling ecological minimum requirements for distribution of greater sage-grouse leks: implications for population connectivity across their western range, U.S.A.

    Science.gov (United States)

    Knick, Steven T; Hanser, Steven E; Preston, Kristine L

    2013-06-01

    Greater sage-grouse Centrocercus urophasianus (Bonaparte) currently occupy approximately half of their historical distribution across western North America. Sage-grouse are a candidate for endangered species listing due to habitat and population fragmentation coupled with inadequate regulation to control development in critical areas. Conservation planning would benefit from accurate maps delineating required habitats and movement corridors. However, developing a species distribution model that incorporates the diversity of habitats used by sage-grouse across their widespread distribution has statistical and logistical challenges. We first identified the ecological minimums limiting sage-grouse, mapped similarity to the multivariate set of minimums, and delineated connectivity across a 920,000 km² region. We partitioned a Mahalanobis D² model of habitat use into k separate additive components each representing independent combinations of species-habitat relationships to identify the ecological minimums required by sage-grouse. We constructed the model from abiotic, land cover, and anthropogenic variables measured at leks (breeding) and surrounding areas within 5 km. We evaluated model partitions using a random subset of leks and historic locations and selected D² (k = 10) for mapping a habitat similarity index (HSI). Finally, we delineated connectivity by converting the mapped HSI to a resistance surface. Sage-grouse required sagebrush-dominated landscapes containing minimal levels of human land use. Sage-grouse used relatively arid regions characterized by shallow slopes, even terrain, and low amounts of forest, grassland, and agriculture in the surrounding landscape. Most populations were interconnected although several outlying populations were isolated because of distance or lack of habitat corridors for exchange. Land management agencies currently are revising land-use plans and designating critical habitat to conserve sage-grouse and avoid endangered
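
    The core of the habitat similarity index described above is a Mahalanobis D² computed against the multivariate conditions at lek sites, rescaled to a 0-1 index and then inverted into a resistance surface. The sketch below shows that core calculation only (without the paper's partitioning into k additive components); the covariates and their values are synthetic placeholders.

```python
# Sketch of a Mahalanobis-based habitat similarity index (HSI): D^2 of each landscape
# cell from the multivariate mean of lek conditions, converted to a 0-1 index via the
# chi-square tail probability, then inverted into a resistance surface. The paper's
# partitioning of D^2 into k additive components is not reproduced; covariates are
# synthetic placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
lek_env = rng.normal(loc=[60.0, 5.0, 2.0], scale=[10.0, 3.0, 1.0], size=(300, 3))
cells = rng.uniform([0.0, 0.0, 0.0], [100.0, 30.0, 10.0], size=(1000, 3))

mu = lek_env.mean(axis=0)                          # multivariate lek-habitat centroid
S_inv = np.linalg.inv(np.cov(lek_env, rowvar=False))

diff = cells - mu
D2 = np.einsum("ij,jk,ik->i", diff, S_inv, diff)   # Mahalanobis D^2 per cell
hsi = stats.chi2.sf(D2, df=cells.shape[1])         # habitat similarity index in [0, 1]
resistance = 1.0 / np.clip(hsi, 1e-6, None)        # higher similarity -> lower resistance
print("median HSI:", round(float(np.median(hsi)), 3))
```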

  1. Modelling personality, plasticity and predictability in shelter dogs

    Science.gov (United States)

    2017-01-01

    Behavioural assessments of shelter dogs (Canis lupus familiaris) typically comprise standardized test batteries conducted at one time point, but test batteries have shown inconsistent predictive validity. Longitudinal behavioural assessments offer an alternative. We modelled longitudinal observational data on shelter dog behaviour using the framework of behavioural reaction norms, partitioning variance into personality (i.e. inter-individual differences in behaviour), plasticity (i.e. inter-individual differences in average behaviour) and predictability (i.e. individual differences in residual intra-individual variation). We analysed data on interactions of 3263 dogs (n = 19 281) with unfamiliar people during their first month after arrival at the shelter. Accounting for personality, plasticity (linear and quadratic trends) and predictability improved the predictive accuracy of the analyses compared to models quantifying personality and/or plasticity only. While dogs were, on average, highly sociable with unfamiliar people and sociability increased over days since arrival, group averages were unrepresentative of all dogs and predictions made at the individual level entailed considerable uncertainty. Effects of demographic variables (e.g. age) on personality, plasticity and predictability were observed. Behavioural repeatability was higher one week after arrival compared to arrival day. Our results highlight the value of longitudinal assessments on shelter dogs and identify measures that could improve the predictive validity of behavioural assessments in shelters. PMID:28989764

  2. Prediction of type A behaviour: A structural equation model

    Directory of Open Access Journals (Sweden)

    René van Wyk

    2009-05-01

    Full Text Available The predictability of Type A behaviour was measured in a sample of 375 professionals with a shortened version of the Jenkins Activity Survey (JAS. Two structural equation models were constructed with the Type A behaviour achievement sub-scale and global (total Type A as the predictor variables. The indices showed a reasonable-to-promising fit with the data. Type A achievement was reasonably predicted by service-career orientation, internal locus of control, power self-concept and economic innovation. Type A global was also predicted by internal locus of control, power self-concept and the entrepreneurial attitude of achievement and personal control.

  3. Modelling of physical properties - databases, uncertainties and predictive power

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    Physical and thermodynamic properties, in the form of raw data or estimated values for pure compounds and mixtures, are important prerequisites for performing tasks such as process design, simulation and optimization; computer-aided molecular/mixture (product) design; and product-process analysis ... in the estimated/predicted property values, how to assess the quality and reliability of the estimated/predicted property values? The paper will review a class of models for prediction of physical and thermodynamic properties of organic chemicals and their mixtures based on the combined group contribution – atom

  4. Determining the prediction limits of models and classifiers with applications for disruption prediction in JET

    Science.gov (United States)

    Murari, A.; Peluso, E.; Vega, J.; Gelfusa, M.; Lungaroni, M.; Gaudio, P.; Martínez, F. J.; Contributors, JET

    2017-01-01

    Understanding the many aspects of tokamak physics requires the development of quite sophisticated models. Moreover, in the operation of the devices, prediction of the future evolution of discharges can be of crucial importance, particularly in the case of the prediction of disruptions, which can cause serious damage to various parts of the machine. The determination of the limits of predictability is therefore an important issue for modelling, classifying and forecasting. In all these cases, once a certain level of performance has been reached, the question typically arises as to whether all the information available in the data has been exploited, or whether there are still margins for improvement of the tools being developed. In this paper, a theoretical information approach is proposed to address this issue. The excellent properties of the developed indicator, called the prediction factor (PF), have been proved with the help of a series of numerical tests. Its application to some typical behaviour relating to macroscopic instabilities in tokamaks has shown very positive results. The prediction factor has also been used to assess the performance of disruption predictors running in real time in the JET system, including the one systematically deployed in the feedback loop for mitigation purposes. The main conclusion is that the most advanced predictors basically exploit all the information contained in the locked mode signal on which they are based. Therefore, qualitative improvements in disruption prediction performance in JET would need the processing of additional signals, probably profiles.

  5. Predictive power of theoretical modelling of the nuclear mean field: examples of improving predictive capacities

    Science.gov (United States)

    Dedes, I.; Dudek, J.

    2018-03-01

    We examine the effects of the parametric correlations on the predictive capacities of the theoretical modelling keeping in mind the nuclear structure applications. The main purpose of this work is to illustrate the method of establishing the presence and determining the form of parametric correlations within a model as well as an algorithm of elimination by substitution (see text) of parametric correlations. We examine the effects of the elimination of the parametric correlations on the stabilisation of the model predictions further and further away from the fitting zone. It follows that the choice of the physics case and the selection of the associated model are of secondary importance in this case. Under these circumstances we give priority to the relative simplicity of the underlying mathematical algorithm, provided the model is realistic. Following such criteria, we focus specifically on an important but relatively simple case of doubly magic spherical nuclei. To profit from the algorithmic simplicity we chose working with the phenomenological spherically symmetric Woods–Saxon mean-field. We employ two variants of the underlying Hamiltonian, the traditional one involving both the central and the spin orbit potential in the Woods–Saxon form and the more advanced version with the self-consistent density-dependent spin–orbit interaction. We compare the effects of eliminating of various types of correlations and discuss the improvement of the quality of predictions (‘predictive power’) under realistic parameter adjustment conditions.

  6. Increased prediction accuracy in wheat breeding trials using a marker × environment interaction genomic selection model.

    Science.gov (United States)

    Lopez-Cruz, Marco; Crossa, Jose; Bonnett, David; Dreisigacker, Susanne; Poland, Jesse; Jannink, Jean-Luc; Singh, Ravi P; Autrique, Enrique; de los Campos, Gustavo

    2015-02-06

    Genomic selection (GS) models use genome-wide genetic information to predict genetic values of candidates of selection. Originally, these models were developed without considering genotype × environment interaction (G×E). Several authors have proposed extensions of the single-environment GS model that accommodate G×E using either covariance functions or environmental covariates. In this study, we model G×E using a marker × environment interaction (M×E) GS model; the approach is conceptually simple and can be implemented with existing GS software. We discuss how the model can be implemented by using an explicit regression of phenotypes on markers or using covariance structures (a genomic best linear unbiased prediction-type model). We used the M×E model to analyze three CIMMYT wheat data sets (W1, W2, and W3), where more than 1000 lines were genotyped using genotyping-by-sequencing and evaluated at CIMMYT's research station in Ciudad Obregon, Mexico, under simulated environmental conditions that covered different irrigation levels, sowing dates and planting systems. We compared the M×E model with a stratified (i.e., within-environment) analysis and with a standard (across-environment) GS model that assumes that effects are constant across environments (i.e., ignoring G×E). The prediction accuracy of the M×E model was substantially greater than that of an across-environment analysis that ignores G×E. Depending on the prediction problem, the M×E model had either similar or greater levels of prediction accuracy than the stratified analyses. The M×E model decomposes marker effects and genomic values into components that are stable across environments (main effects) and others that are environment-specific (interactions). Therefore, in principle, the interaction model could shed light on which variants have effects that are stable across environments and which ones are responsible for G×E. The data set and the scripts required to reproduce the analysis are
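
    The explicit-regression form of the M×E model mentioned in this record can be sketched as a design matrix with one main-effect marker block shared across environments plus one environment-specific block per environment. Below, a ridge penalty stands in for the Bayesian shrinkage actually used; genotypes, phenotypes and effect sizes are simulated placeholders.

```python
# Explicit-regression sketch of the M x E idea: a main-effect marker block shared
# across environments plus one environment-specific marker block per environment,
# fitted here with a ridge penalty as a stand-in for the Bayesian shrinkage used in
# the paper. Genotypes, phenotypes and effect sizes are simulated placeholders.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n_lines, n_markers, n_env = 100, 300, 3
X = rng.choice([0.0, 1.0, 2.0], size=(n_lines, n_markers))       # marker matrix

env = np.repeat(np.arange(n_env), n_lines)                       # environment index per record
Xs = np.tile(X, (n_env, 1))                                      # every line in every environment
b_main = rng.normal(0.0, 0.05, n_markers)                        # effects stable across environments
b_int = rng.normal(0.0, 0.05, (n_env, n_markers))                # environment-specific effects
y = Xs @ b_main + np.einsum("ij,ij->i", Xs, b_int[env]) + rng.normal(0.0, 1.0, n_env * n_lines)

# Design matrix: [main-effect block | one interaction block per environment].
Z = np.hstack([Xs] + [Xs * (env == e)[:, None] for e in range(n_env)])

fit = Ridge(alpha=10.0).fit(Z, y)
main_hat = fit.coef_[:n_markers]                                  # estimated stable marker effects
print("corr(estimated main effects, truth):",
      round(float(np.corrcoef(main_hat, b_main)[0, 1]), 2))
```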

  7. GA-ARMA Model for Predicting IGS RTS Corrections

    Directory of Open Access Journals (Sweden)

    Mingyu Kim

    2017-01-01

    Full Text Available The global navigation satellite system (GNSS) is widely used to estimate user positions. For precise positioning, users should correct for GNSS error components such as satellite orbit and clock errors as well as ionospheric delay. The international GNSS service (IGS) real-time service (RTS) can be used to correct orbit and clock errors in real-time. Since the IGS RTS provides real-time corrections via the Internet, intermittent data loss can occur due to software or hardware failures. We propose applying a genetic algorithm autoregressive moving average (GA-ARMA) model to predict the IGS RTS corrections during data loss periods. The RTS orbit and clock corrections are predicted up to 900 s via the GA-ARMA model, and the prediction accuracies are compared with the results from a generic ARMA model. The orbit prediction performance of the GA-ARMA is nearly equivalent to that of ARMA, but GA-ARMA’s clock prediction performance is clearly better than that of ARMA, achieving a 32% error reduction. Predicted RTS corrections are applied to the broadcast ephemeris, and precise point positioning accuracies are compared. GA-ARMA shows a significant accuracy improvement over ARMA, particularly in terms of vertical positioning.
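
    The sketch below illustrates only the prediction step implied by this record: fit an ARMA-type model to a clock-correction series and extrapolate 900 s ahead to bridge a data gap. The genetic-algorithm tuning of the ARMA configuration is replaced by a plain AIC search over a few candidate orders, and the 5 s synthetic series stands in for real IGS RTS data.

```python
# Sketch of the gap-bridging prediction step: fit an ARMA-type model to a clock
# correction series and extrapolate 900 s ahead. A plain AIC search over a few
# candidate orders stands in for the paper's genetic-algorithm tuning, and the 5 s
# synthetic series below is a placeholder for real IGS RTS corrections.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
t = np.arange(0.0, 3600.0, 5.0)               # one hour of corrections at 5 s (placeholder)
clock = 1e-3 * t + 0.05 * np.sin(2 * np.pi * t / 900.0) + rng.normal(0, 0.01, t.size)

best = None
for p in (1, 2, 3):
    for q in (0, 1, 2):                       # stand-in for the GA's configuration search
        res = ARIMA(clock, order=(p, 1, q)).fit()
        if best is None or res.aic < best[0]:
            best = (res.aic, (p, 1, q), res)

steps = int(900 / 5)                          # forecast horizon: 900 s at 5 s sampling
forecast = best[2].forecast(steps=steps)
print("selected order:", best[1], "| correction predicted 900 s ahead:",
      round(float(forecast[-1]), 4))
```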

  8. Modeling the prediction of business intelligence system effectiveness.

    Science.gov (United States)

    Weng, Sung-Shun; Yang, Ming-Hsien; Koo, Tian-Lih; Hsiao, Pei-I

    2016-01-01

    Although business intelligence (BI) technologies are continually evolving, the capability to apply them has become an indispensable resource for enterprises operating in today's complex, uncertain and dynamic business environment. This study performed pioneering work by constructing models and rules for the prediction of business intelligence system effectiveness (BISE) in relation to the implementation of BI solutions. For enterprises, effectively managing the critical attributes that determine BISE, and developing prediction models with a set of rules for self-evaluating the effectiveness of BI solutions, is necessary to improve BI implementation and ensure its success. The main findings identify the critical prediction indicators of BISE that are important for forecasting BI performance, and highlight five classification and prediction rules of BISE derived from decision tree structures, together with a refined prediction model with four critical indicators constructed by logistic regression analysis. These results can enable enterprises to improve BISE while effectively managing BI solution implementation, and they also address academics for whom theory is important.
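
    As a hedged illustration of the two techniques named in the abstract (decision-tree rules plus a logistic regression), the following Python sketch uses synthetic survey-style indicators; the indicator names and the effectiveness label are invented placeholders, not the study's variables.

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier, export_text
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(2)
        # Hypothetical survey indicators scored 1-7; "effective" is a placeholder label.
        X = rng.integers(1, 8, size=(150, 4)).astype(float)
        y = (X.mean(axis=1) + rng.normal(scale=0.5, size=150) > 4).astype(int)

        tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
        print(export_text(tree, feature_names=["data_quality", "user_skill",
                                               "mgmt_support", "infrastructure"]))

        logit = LogisticRegression().fit(X, y)
        print(logit.coef_)   # sign and size of each indicator's contribution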

  9. In silico modeling to predict drug-induced phospholipidosis

    International Nuclear Information System (INIS)

    Choi, Sydney S.; Kim, Jae S.; Valerio, Luis G.; Sadrieh, Nakissa

    2013-01-01

    Drug-induced phospholipidosis (DIPL) is a preclinical finding during pharmaceutical drug development that has implications on the course of drug development and regulatory safety review. A principal characteristic of drugs inducing DIPL is known to be a cationic amphiphilic structure. This provides evidence for a structure-based explanation and opportunity to analyze properties and structures of drugs with the histopathologic findings for DIPL. In previous work from the FDA, in silico quantitative structure–activity relationship (QSAR) modeling using machine learning approaches has shown promise with a large dataset of drugs but included unconfirmed data as well. In this study, we report the construction and validation of a battery of complementary in silico QSAR models using the FDA's updated database on phospholipidosis, new algorithms and predictive technologies, and in particular, we address high performance with a high-confidence dataset. The results of our modeling for DIPL include rigorous external validation tests showing 80–81% concordance. Furthermore, the predictive performance characteristics include models with high sensitivity and specificity, in most cases ≥ 80%, leading to desired high negative and positive predictivity. These models are intended to be utilized for regulatory toxicology applied science needs in screening new drugs for DIPL. - Highlights: • New in silico models for predicting drug-induced phospholipidosis (DIPL) are described. • The training set data in the models is derived from the FDA's phospholipidosis database. • We find excellent predictivity values of the models based on external validation. • The models can support drug screening and regulatory decision-making on DIPL

  10. Cardiopulmonary Circuit Models for Predicting Injury to the Heart

    Science.gov (United States)

    Ward, Richard; Wing, Sarah; Bassingthwaighte, James; Neal, Maxwell

    2004-11-01

    Circuit models have been used extensively in physiology to describe cardiopulmonary function. Such models are being used in the DARPA Virtual Soldier (VS) Project* to predict the response to injury or physiological stress. The most complex model consists of systemic circulation, pulmonary circulation, and a four-chamber heart sub-model. This model also includes baroreceptor feedback, airway mechanics, gas exchange, and pleural pressure influence on the circulation. As part of the VS Project, Oak Ridge National Laboratory has been evaluating various cardiopulmonary circuit models for predicting the effects of injury to the heart. We describe, from a physicist's perspective, the concept of building circuit models, discuss both unstressed and stressed models, and show how the stressed models are used to predict effects of specific wounds. *This work was supported by a grant from the DARPA, executed by the U.S. Army Medical Research and Materiel Command/TATRC Cooperative Agreement, Contract # W81XWH-04-2-0012. The submitted manuscript has been authored by the U.S. Department of Energy, Office of Science of the Oak Ridge National Laboratory, managed for the U.S. DOE by UT-Battelle, LLC, under contract No. DE-AC05-00OR22725. Accordingly, the U.S. Government retains a non-exclusive, royalty-free license to publish or reproduce the published form of this contribution, or allow others to do so, for U.S. Government purpose.

  11. Prediction of gas compressibility factor using intelligent models

    Directory of Open Access Journals (Sweden)

    Mohamad Mohamadi-Baghmolaei

    2015-10-01

    Full Text Available The gas compressibility factor, also known as the Z-factor, plays a determinative role in obtaining the thermodynamic properties of a gas reservoir. Typically, empirical correlations have been applied to determine this important property. However, the weak performance and limitations of these correlations have persuaded researchers to use intelligent models instead. In this work, prediction of the Z-factor is attempted using different popular intelligent models in order to find the most accurate one. The developed intelligent models include an Artificial Neural Network (ANN), a Fuzzy Inference System (FIS) and an Adaptive Neuro-Fuzzy Inference System (ANFIS). Optimization of an equation of state (EOS) by a Genetic Algorithm (GA) is performed as well. The validity of the developed intelligent models was tested using 1038 series of data points published in the literature. It was observed that the accuracy of the intelligent models for predicting the Z-factor is significantly better than that of conventional empirical models. Results also showed the improvement of the optimized EOS predictions when coupled with GA optimization. Moreover, of the three intelligent models, the ANN model outperforms the others considering all data and 263 field data points of an Iranian offshore gas condensate, with an R2 of 0.9999, while the R2 for the best empirical correlation was about 0.8334.
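
    A minimal sketch of the ANN route only, assuming pseudo-reduced pressure and temperature as inputs and a synthetic Z-factor target; the real study used published PVT data and also compared FIS, ANFIS and a GA-tuned EOS.

        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.preprocessing import StandardScaler
        from sklearn.pipeline import make_pipeline

        rng = np.random.default_rng(3)
        # Placeholder inputs: pseudo-reduced pressure and temperature; a real study would use measured PVT data.
        Ppr = rng.uniform(0.2, 15, 800)
        Tpr = rng.uniform(1.05, 3.0, 800)
        z = 1 - 0.06 * Ppr / Tpr + 0.01 * (Ppr / Tpr) ** 2 + rng.normal(scale=0.005, size=800)  # synthetic target

        X = np.column_stack([Ppr, Tpr])
        model = make_pipeline(StandardScaler(),
                              MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=2000, random_state=0))
        model.fit(X[:600], z[:600])
        print(model.score(X[600:], z[600:]))   # R^2 on held-out points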

  12. A single-system model predicts recognition memory and repetition priming in amnesia.

    Science.gov (United States)

    Berry, Christopher J; Kessels, Roy P C; Wester, Arie J; Shanks, David R

    2014-08-13

    We challenge the claim that there are distinct neural systems for explicit and implicit memory by demonstrating that a formal single-system model predicts the pattern of recognition memory (explicit) and repetition priming (implicit) in amnesia. In the current investigation, human participants with amnesia categorized pictures of objects at study and then, at test, identified fragmented versions of studied (old) and nonstudied (new) objects (providing a measure of priming), and made a recognition memory judgment (old vs new) for each object. Numerous results in the amnesic patients were predicted in advance by the single-system model, as follows: (1) deficits in recognition memory and priming were evident relative to a control group; (2) items judged as old were identified at greater levels of fragmentation than items judged new, regardless of whether the items were actually old or new; and (3) the magnitude of the priming effect (the identification advantage for old vs. new items) was greater for items judged old than for items judged new. Model evidence measures also favored the single-system model over two formal multiple-systems models. The findings support the single-system model, which explains the pattern of recognition and priming in amnesia primarily as a reduction in the strength of a single dimension of memory strength, rather than a selective explicit memory system deficit. Copyright © 2014 the authors 0270-6474/14/3410963-12$15.00/0.

  13. Risk Prediction Models for Incident Heart Failure: A Systematic Review of Methodology and Model Performance.

    Science.gov (United States)

    Sahle, Berhe W; Owen, Alice J; Chin, Ken Lee; Reid, Christopher M

    2017-09-01

    Numerous models predicting the risk of incident heart failure (HF) have been developed; however, evidence of their methodological rigor and reporting remains unclear. This study critically appraises the methods underpinning incident HF risk prediction models. EMBASE and PubMed were searched for articles published between 1990 and June 2016 that reported at least 1 multivariable model for prediction of HF. Model development information, including study design, variable coding, missing data, and predictor selection, was extracted. Nineteen studies reporting 40 risk prediction models were included. Existing models have acceptable discriminative ability (C-statistics > 0.70), although only 6 models were externally validated. Candidate variable selection was based on statistical significance from a univariate screening in 11 models, whereas it was unclear in 12 models. Continuous predictors were retained in 16 models, whereas it was unclear how continuous variables were handled in 16 models. Missing values were excluded in 19 of the 23 models that reported missing data, and the number of events per variable was low in several models. Only 2 models presented the recommended regression equations. There was significant heterogeneity in the discriminative ability of models with respect to age. Several prediction models had sufficient discriminative ability, although few are externally validated. Methods not recommended for the conduct and reporting of risk prediction modeling were frequently used, and the resulting algorithms should be applied with caution. Copyright © 2017 Elsevier Inc. All rights reserved.

  14. Predict Moral Distress Using Workplace Stress, Stress of Conscience Mediated by Coping Using Roy Adaptation Model: A Path Analysis.

    Science.gov (United States)

    Alkrisat, Muder

    2016-12-01

    Nurses can be predisposed to moral distress when they are exposed to ambiguous moral situations. The aim was to test a conceptual model based on the Roy adaptation model (RAM) to examine the relationships among workplace stress, stress of conscience, and moral distress as mediated by coping. A correlational, cross-sectional design was used. Data were collected from 199 licensed nurses. The findings indicated that workplace stress was related negatively to coping processes (β = -.12) and that stress of conscience predicted greater use of the coping process (β = -.21). The results indicated that the model based on RAM is saturated and provides a perfect fit. However, the alternative models indicated that workplace stress moderately predicted moral distress.

  15. Improving Saliency Models by Predicting Human Fixation Patches

    KAUST Repository

    Dubey, Rachit

    2015-04-16

    There is growing interest in studying the Human Visual System (HVS) to supplement and improve the performance of computer vision tasks. A major challenge for current visual saliency models is predicting saliency in cluttered scenes (i.e. high false positive rate). In this paper, we propose a fixation patch detector that predicts image patches that contain human fixations with high probability. Our proposed model detects sparse fixation patches with an accuracy of 84 % and eliminates non-fixation patches with an accuracy of 84 % demonstrating that low-level image features can indeed be used to short-list and identify human fixation patches. We then show how these detected fixation patches can be used as saliency priors for popular saliency models, thus, reducing false positives while maintaining true positives. Extensive experimental results show that our proposed approach allows state-of-the-art saliency methods to achieve better prediction performance on benchmark datasets.

  16. Model Predictive Control of Wind Turbines using Uncertain LIDAR Measurements

    DEFF Research Database (Denmark)

    Mirzaei, Mahmood; Soltani, Mohsen; Poulsen, Niels Kjølstad

    2013-01-01

    The problem of model predictive control (MPC) of wind turbines using uncertain LIDAR (LIght Detection And Ranging) measurements is considered. A nonlinear dynamical model of the wind turbine is obtained. We linearize the obtained nonlinear model for different operating points, which are determined by the effective wind speed on the rotor disc. We take the wind speed as a scheduling variable. The wind speed is measurable ahead of the turbine using LIDARs; therefore, the scheduling variable is known for the entire prediction horizon. By taking advantage of having future values of the scheduling variable, we simplify state prediction for the MPC. Consequently, the control problem of the nonlinear system is simplified into a quadratic program. We consider uncertainty in the wind propagation time, which is the traveling time of the wind from the LIDAR measurement point to the rotor. An algorithm based
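
    Not the controller from the paper, but a generic linear MPC step posed as a quadratic program, which is the class of problem the abstract describes; the system matrices, horizon, weights and input bound are placeholders.

        import numpy as np
        import cvxpy as cp

        def mpc_first_input(A, B, x0, x_ref, N=10, q=1.0, r=0.1, u_max=1.0):
            """Solve one receding-horizon step as a quadratic program and return the first input."""
            nx, nu = B.shape
            x = cp.Variable((nx, N + 1))
            u = cp.Variable((nu, N))
            cost, cons = 0, [x[:, 0] == x0]
            for k in range(N):
                cost += q * cp.sum_squares(x[:, k + 1] - x_ref) + r * cp.sum_squares(u[:, k])
                cons += [x[:, k + 1] == A @ x[:, k] + B @ u[:, k],
                         cp.abs(u[:, k]) <= u_max]
            cp.Problem(cp.Minimize(cost), cons).solve()
            return u.value[:, 0]

        # Placeholder 2-state linearized model; a real design would relinearize as the scheduling
        # variable (the LIDAR-previewed wind speed) changes along the horizon.
        A = np.array([[1.0, 0.1], [0.0, 0.95]])
        B = np.array([[0.0], [0.1]])
        print(mpc_first_input(A, B, x0=np.array([0.5, 0.0]), x_ref=np.zeros(2)))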

  17. Data Quality Enhanced Prediction Model for Massive Plant Data

    Energy Technology Data Exchange (ETDEWEB)

    Park, Moon-Ghu [Nuclear Engr. Sejong Univ., Seoul (Korea, Republic of); Kang, Seong-Ki [Monitoring and Diagnosis, Suwon (Korea, Republic of); Shin, Hajin [Saint Paul Preparatory Seoul, Seoul (Korea, Republic of)

    2016-10-15

    This paper introduces integrated signal preconditioning and model prediction based mainly on kernel functions. The performance and benefits of the methods are demonstrated by a case study with measurement data from a power plant and transient data from its components. The developed methods will be applied as part of a platform for monitoring massive (big) data, where human experts cannot detect fault behaviors because the measurement sets are too large. Recent extensive efforts at on-line monitoring implementation have shown that a big surprise in modeling for predicting process variables was the extent of data quality problems in measurement data, especially for data-driven modeling. Bad training data will be learned as normal and can significantly degrade prediction performance. For this reason, the quantity and quality of measurement data in the modeling phase need special care: bad-quality data must be removed from training sets to avoid the bad data being treated as normal system behavior. This paper presents an integrated supervisory system structure for monitoring the performance of plants or sensors. The quality of the data-driven model is improved with a bilateral kernel filter for preprocessing of the noisy data. The prediction module is also based on kernel regression, sharing the same basis with the noise filter. The model structure is optimized by a grouping process with a nonlinear Hoeffding correlation function.
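
    A rough sketch of the two kernel ingredients described (a bilateral filter for preconditioning and a Nadaraya-Watson kernel regressor sharing the same Gaussian basis), on a synthetic one-dimensional signal; bandwidths and window sizes are assumptions.

        import numpy as np

        rng = np.random.default_rng(4)
        t = np.linspace(0, 10, 500)
        signal = np.sin(t) + rng.normal(scale=0.15, size=t.size)   # noisy plant measurement (synthetic)

        def bilateral_1d(y, sigma_t=5.0, sigma_v=0.3, half_width=15):
            """Smooth y while preserving steps: weights combine sample distance and value distance."""
            out = np.empty_like(y)
            for i in range(y.size):
                lo, hi = max(0, i - half_width), min(y.size, i + half_width + 1)
                idx = np.arange(lo, hi)
                w = np.exp(-0.5 * ((idx - i) / sigma_t) ** 2) * \
                    np.exp(-0.5 * ((y[idx] - y[i]) / sigma_v) ** 2)
                out[i] = np.sum(w * y[idx]) / np.sum(w)
            return out

        def kernel_regression(x_train, y_train, x_query, bandwidth=0.3):
            """Nadaraya-Watson predictor built on the same Gaussian kernel basis as the filter."""
            d = (x_query[:, None] - x_train[None, :]) / bandwidth
            w = np.exp(-0.5 * d ** 2)
            return (w @ y_train) / w.sum(axis=1)

        clean = bilateral_1d(signal)
        print(kernel_regression(t, clean, np.array([2.5, 5.0, 7.5])))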

  18. Data Quality Enhanced Prediction Model for Massive Plant Data

    International Nuclear Information System (INIS)

    Park, Moon-Ghu; Kang, Seong-Ki; Shin, Hajin

    2016-01-01

    This paper introduces integrated signal preconditioning and model prediction based mainly on kernel functions. The performance and benefits of the methods are demonstrated by a case study with measurement data from a power plant and transient data from its components. The developed methods will be applied as part of a platform for monitoring massive (big) data, where human experts cannot detect fault behaviors because the measurement sets are too large. Recent extensive efforts at on-line monitoring implementation have shown that a big surprise in modeling for predicting process variables was the extent of data quality problems in measurement data, especially for data-driven modeling. Bad training data will be learned as normal and can significantly degrade prediction performance. For this reason, the quantity and quality of measurement data in the modeling phase need special care: bad-quality data must be removed from training sets to avoid the bad data being treated as normal system behavior. This paper presents an integrated supervisory system structure for monitoring the performance of plants or sensors. The quality of the data-driven model is improved with a bilateral kernel filter for preprocessing of the noisy data. The prediction module is also based on kernel regression, sharing the same basis with the noise filter. The model structure is optimized by a grouping process with a nonlinear Hoeffding correlation function

  19. Computational modeling of oligonucleotide positional densities for human promoter prediction.

    Science.gov (United States)

    Narang, Vipin; Sung, Wing-Kin; Mittal, Ankush

    2005-01-01

    The gene promoter region controls transcriptional initiation of a gene, which is the most important step in gene regulation. In-silico detection of promoter regions in genomic sequences has a number of applications in gene discovery and understanding gene expression regulation. However, computational prediction of eukaryotic Pol II promoters has remained a difficult task. This paper introduces a novel statistical technique for detecting promoter regions in long genomic sequences. A number of existing techniques analyze the occurrence frequencies of oligonucleotides in promoter sequences as compared to other genomic regions. In contrast, the present work studies the positional densities of oligonucleotides in promoter sequences. The analysis does not require any non-promoter sequence dataset or any model of the background oligonucleotide content of the genome. The statistical model learnt from a dataset of promoter sequences automatically recognizes a number of transcription factor binding sites simultaneously with their occurrence positions relative to the transcription start site. Based on this model, a continuous naïve Bayes classifier is developed for the detection of human promoters and transcription start sites in genomic sequences. The present study extends the scope of statistical models in general promoter modeling and prediction. Promoter sequence features learnt by the model correlate well with known biological facts. Results of human transcription start site prediction compare favorably with existing second-generation promoter prediction tools.
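
    As an assumed illustration of the positional-density idea, the sketch below collects k-mer positions relative to the TSS and smooths them with a Gaussian kernel; the sequences are random placeholders and the classifier built on top of these profiles is omitted.

        import numpy as np
        from collections import defaultdict

        rng = np.random.default_rng(9)
        bases = np.array(list("ACGT"))
        # Tiny synthetic "promoters": 1001 bp each, TSS at index 500; real data would be curated sequences.
        promoters = ["".join(rng.choice(bases, size=1001)) for _ in range(50)]

        def positional_densities(seqs, k=6, tss=500, bandwidth=10.0):
            """Gaussian-smoothed density of each k-mer's position relative to the TSS."""
            positions = defaultdict(list)
            for s in seqs:
                for i in range(len(s) - k + 1):
                    positions[s[i:i + k]].append(i - tss)
            grid = np.arange(-tss, len(seqs[0]) - k + 1 - tss)
            dens = {}
            for kmer, pos in positions.items():
                if len(pos) < 5:                       # keep only k-mers seen often enough
                    continue
                p = np.asarray(pos, dtype=float)
                d = np.exp(-0.5 * ((grid[:, None] - p[None, :]) / bandwidth) ** 2).sum(axis=1)
                dens[kmer] = d / d.sum()
            return dens

        dens = positional_densities(promoters)
        print(len(dens))          # number of k-mers with a usable positional profile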

  20. Predictive modeling of mosquito abundance and dengue transmission in Kenya

    Science.gov (United States)

    Caldwell, J.; Krystosik, A.; Mutuku, F.; Ndenga, B.; LaBeaud, D.; Mordecai, E.

    2017-12-01

    Approximately 390 million people are exposed to dengue virus every year, and with no widely available treatments or vaccines, predictive models of disease risk are valuable tools for vector control and disease prevention. The aim of this study was to modify and improve climate-driven predictive models of dengue vector abundance (Aedes spp. mosquitoes) and viral transmission to people in Kenya. We simulated disease transmission using a temperature-driven mechanistic model and compared model predictions with vector trap data for larvae, pupae, and adult mosquitoes collected between 2014 and 2017 at four sites across urban and rural villages in Kenya. We tested predictive capacity of our models using four temperature measurements (minimum, maximum, range, and anomalies) across daily, weekly, and monthly time scales. Our results indicate seasonal temperature variation is a key driving factor of Aedes mosquito abundance and disease transmission. These models can help vector control programs target specific locations and times when vectors are likely to be present, and can be modified for other Aedes-transmitted diseases and arboviral endemic regions around the world.
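
    The mechanistic model is temperature-driven; as a hedged example of the kind of thermal response such models often use (not necessarily this study's exact parameterization), a Briere function can map a seasonal temperature cycle to relative transmission suitability.

        import numpy as np

        def briere(T, c=2.0e-4, T0=13.0, Tm=40.0):
            """Briere thermal response: zero outside (T0, Tm), unimodal within."""
            T = np.asarray(T, dtype=float)
            out = c * T * (T - T0) * np.sqrt(np.clip(Tm - T, 0.0, None))
            return np.where((T > T0) & (T < Tm), out, 0.0)

        daily_temp = 24.0 + 6.0 * np.sin(2 * np.pi * np.arange(365) / 365)   # synthetic seasonal cycle
        relative_transmission = briere(daily_temp)
        print(relative_transmission.argmax(), relative_transmission.max())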

  1. Prediction models and control algorithms for predictive applications of setback temperature in cooling systems

    International Nuclear Information System (INIS)

    Moon, Jin Woo; Yoon, Younju; Jeon, Young-Hoon; Kim, Sooyoung

    2017-01-01

    Highlights: • An initial ANN model was developed for predicting the time to reach the setback temperature. • The initial model was optimized to produce accurate output. • The optimized model proved its prediction accuracy. • ANN-based algorithms were developed and their performance tested. • The ANN-based algorithms presented superior thermal comfort or energy efficiency. - Abstract: In this study, a temperature control algorithm was developed to apply a setback temperature predictively for the cooling system of a residential building during occupied periods. An artificial neural network (ANN) model was developed to determine the required time for increasing the current indoor temperature to the setback temperature. This study involved three phases: development of the initial ANN-based prediction model, optimization and testing of the initial model, and development and testing of three control algorithms. The development and performance testing of the model and algorithms were conducted using TRNSYS and MATLAB. Through the development and optimization process, the final ANN model employed the indoor temperature and the temperature difference between the current and target setback temperature as two input neurons. The optimal number of hidden layers, number of neurons, learning rate, and momentum were determined to be 4, 9, 0.6, and 0.9, respectively. The tangent–sigmoid and pure-linear transfer functions were used in the hidden and output neurons, respectively. The ANN model used 100 training data sets with a sliding-window method for data management. The Levenberg-Marquardt training method was employed for model training. The optimized model had a root-mean-square error of 0.9097 when compared with the simulated results. Employing the ANN model, the ANN-based algorithms maintained indoor temperatures within target ranges more effectively. Compared to the conventional algorithm, the ANN-based algorithms reduced the duration of time in which the indoor temperature

  2. Relative sensitivity analysis of the predictive properties of sloppy models.

    Science.gov (United States)

    Myasnikova, Ekaterina; Spirov, Alexander

    2018-01-25

    Among the model parameters characterizing complex biological systems, there are commonly some that do not significantly influence the quality of the fit to experimental data, the so-called "sloppy" parameters. The sloppiness can be mathematically expressed through saturating response functions (Hill's, sigmoid), thereby embodying biological mechanisms responsible for the system's robustness to external perturbations. However, if a sloppy model is used for the prediction of the system behavior at an altered input (e.g. knock-out mutations, natural expression variability), it may demonstrate poor predictive power due to ambiguity in the parameter estimates. We introduce a method for evaluating predictive power under parameter estimation uncertainty, Relative Sensitivity Analysis. The prediction problem is addressed in the context of gene circuit models describing the dynamics of segmentation gene expression in the Drosophila embryo. Gene regulation in these models is introduced by a saturating sigmoid function of the concentrations of the regulatory gene products. We show how our approach can be applied to characterize the essential difference between the sensitivity properties of robust and non-robust solutions and to select among the existing solutions those providing the correct system behavior at any reasonable input. In general, the method allows one to uncover the sources of incorrect predictions and suggests a way to overcome the estimation uncertainties.

  3. New Temperature-based Models for Predicting Global Solar Radiation

    International Nuclear Information System (INIS)

    Hassan, Gasser E.; Youssef, M. Elsayed; Mohamed, Zahraa E.; Ali, Mohamed A.; Hanafy, Ahmed A.

    2016-01-01

    Highlights: • New temperature-based models for estimating solar radiation are investigated. • The models are validated against 20-years measured data of global solar radiation. • The new temperature-based model shows the best performance for coastal sites. • The new temperature-based model is more accurate than the sunshine-based models. • The new model is highly applicable with weather temperature forecast techniques. - Abstract: This study presents new ambient-temperature-based models for estimating global solar radiation as alternatives to the widely used sunshine-based models owing to the unavailability of sunshine data at all locations around the world. Seventeen new temperature-based models are established, validated and compared with other three models proposed in the literature (the Annandale, Allen and Goodin models) to estimate the monthly average daily global solar radiation on a horizontal surface. These models are developed using a 20-year measured dataset of global solar radiation for the case study location (Lat. 30°51′N and long. 29°34′E), and then, the general formulae of the newly suggested models are examined for ten different locations around Egypt. Moreover, the local formulae for the models are established and validated for two coastal locations where the general formulae give inaccurate predictions. Commonly used statistical error measures are utilized to evaluate the performance of these models and identify the most accurate one. The obtained results show that the local formula for the most accurate new model provides good predictions for global solar radiation at different locations, especially at coastal sites. Moreover, the local and general formulas of the most accurate temperature-based model also perform better than the two most accurate sunshine-based models from the literature. The quick and accurate estimations of the global solar radiation using this approach can be employed in the design and evaluation of performance for
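
    The abstract does not give the seventeen model forms, so as an assumed example of a temperature-based formulation, the sketch below implements a Hargreaves-type estimate, Rs = k·sqrt(Tmax − Tmin)·Ra, with extraterrestrial radiation Ra computed FAO-56 style; the coefficient k is a placeholder.

        import numpy as np

        def extraterrestrial_radiation(doy, lat_deg):
            """Daily extraterrestrial radiation Ra (MJ m-2 day-1), FAO-56 style."""
            lat = np.radians(lat_deg)
            dr = 1 + 0.033 * np.cos(2 * np.pi * doy / 365)
            dec = 0.409 * np.sin(2 * np.pi * doy / 365 - 1.39)
            ws = np.arccos(-np.tan(lat) * np.tan(dec))
            return (24 * 60 / np.pi) * 0.0820 * dr * (
                ws * np.sin(lat) * np.sin(dec) + np.cos(lat) * np.cos(dec) * np.sin(ws))

        def hargreaves_rs(tmax, tmin, doy, lat_deg, k=0.16):
            """Global solar radiation estimated from the daily temperature range."""
            return k * np.sqrt(np.maximum(tmax - tmin, 0.0)) * extraterrestrial_radiation(doy, lat_deg)

        print(hargreaves_rs(tmax=32.0, tmin=21.0, doy=200, lat_deg=30.85))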

  4. A Deep Learning Prediction Model Based on Extreme-Point Symmetric Mode Decomposition and Cluster Analysis

    Directory of Open Access Journals (Sweden)

    Guohui Li

    2017-01-01

    Full Text Available Aiming at the irregularity of nonlinear signals and the difficulty of predicting them, a deep learning prediction model based on extreme-point symmetric mode decomposition (ESMD) and cluster analysis is proposed. Firstly, the original data is decomposed by ESMD to obtain a finite number of intrinsic mode functions (IMFs) and residuals. Secondly, fuzzy c-means is used to cluster the decomposed components, and then a deep belief network (DBN) is used to predict each cluster. Finally, the reconstructed IMFs and residuals give the final prediction results. Six kinds of prediction models are compared: the DBN prediction model, the EMD-DBN prediction model, the EEMD-DBN prediction model, the CEEMD-DBN prediction model, the ESMD-DBN prediction model, and the model proposed in this paper. The same sunspot time series are predicted with all six prediction models. The experimental results show that the proposed model has better prediction accuracy and smaller error.

  5. 4K Video Traffic Prediction using Seasonal Autoregressive Modeling

    Directory of Open Access Journals (Sweden)

    D. R. Marković

    2017-06-01

    Full Text Available From the perspective of the average viewer, high-definition video streams such as HD (High Definition) and UHD (Ultra HD) are increasing their internet presence year over year. This is not surprising given the expansion of HD streaming services such as YouTube and Netflix. Therefore, high-definition video streams are starting to challenge network resource allocation with their bandwidth requirements and statistical characteristics. Analysis and modeling of this demanding video traffic is essential for better quality-of-service and quality-of-experience support. In this paper we use an easy-to-apply statistical model for prediction of 4K video traffic. Namely, seasonal autoregressive modeling is applied to the prediction of 4K video traffic encoded with HEVC (High Efficiency Video Coding). Analysis and modeling were performed within the R programming environment using over 17,000 high-definition video frames. It is shown that the proposed methodology provides good accuracy in high-definition video traffic modeling.
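
    The study works in R; as an equivalent hedged sketch in Python, a seasonal autoregressive model can be fitted to a synthetic per-frame size series with an assumed GOP-like period of 12 frames.

        import numpy as np
        from statsmodels.tsa.statespace.sarimax import SARIMAX

        rng = np.random.default_rng(5)
        period = 12                          # assumed GOP-like periodicity in frame sizes
        n = 600
        seasonal = np.tile(np.array([60, 18, 22, 18, 30, 18, 22, 18, 30, 18, 22, 18]), n // period)
        frame_kbits = seasonal + rng.normal(scale=3.0, size=n)   # synthetic per-frame sizes

        model = SARIMAX(frame_kbits, order=(1, 0, 0), seasonal_order=(1, 0, 0, period))
        fit = model.fit(disp=False)
        print(fit.forecast(steps=24))        # predicted sizes for the next two GOPs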

  6. Greater-confinement disposal

    International Nuclear Information System (INIS)

    Trevorrow, L.E.; Schubert, J.P.

    1989-01-01

    Greater-confinement disposal (GCD) is a general term for low-level waste (LLW) disposal technologies that employ natural and/or engineered barriers and provide a degree of confinement greater than that of shallow-land burial (SLB) but possibly less than that of a geologic repository. Thus GCD is associated with lower risk/hazard ratios than SLB. Although any number of disposal technologies might satisfy the definition of GCD, eight have been selected for consideration in this discussion. These technologies include: (1) earth-covered tumuli, (2) concrete structures, both above and below grade, (3) deep trenches, (4) augered shafts, (5) rock cavities, (6) abandoned mines, (7) high-integrity containers, and (8) hydrofracture. Each of these technologies employs several operations that are mature; however, some are at more advanced stages of development and demonstration than others. Each is defined and further described by information on design, advantages and disadvantages, special equipment requirements, and characteristic operations such as construction, waste emplacement, and closure

  7. Deep structure of the continental margin and basin off Greater Kabylia, Algeria - New insights from wide-angle seismic data modeling and multichannel seismic interpretation

    Science.gov (United States)

    Aïdi, Chafik; Beslier, Marie-Odile; Yelles-Chaouche, Abdel Karim; Klingelhoefer, Frauke; Bracene, Rabah; Galve, Audrey; Bounif, Abdallah; Schenini, Laure; Hamai, Lamine; Schnurle, Philippe; Djellit, Hamou; Sage, Françoise; Charvis, Philippe; Déverchère, Jacques

    2018-03-01

    During the Algerian-French SPIRAL survey aimed at investigating the deep structure of the Algerian margin and basin, two coincident wide-angle and reflection seismic profiles were acquired in central Algeria, offshore Greater Kabylia, together with gravimetric, bathymetric and magnetic data. This 260 km-long offshore-onshore profile spans the Balearic basin, the central Algerian margin and the Greater Kabylia block up to the southward limit of the internal zones onshore. Results are obtained from modeling and interpretation of the combined data sets. The Algerian basin offshore Greater Kabylia is floored by a thin oceanic crust (∼4 km) with P-wave velocities ranging between 5.2 and 6.8 km/s. In the northern Hannibal High region, the atypical 3-layer crustal structure is interpreted as volcanic products stacked over a thin crust similar to that bordering the margin and related to Miocene post-accretion volcanism. These results support a two-step back-arc opening of the west-Algerian basin, comprising oceanic crust accretion during the first southward stage, and a magmatic and probably tectonic reworking of this young oceanic basement during the second, westward, opening phase. The structure of the central Algerian margin is that of a narrow (∼70 km), magma-poor rifted margin, with a wider zone of distal thinned continental crust than on the other margin segments. There is no evidence for mantle exhumation in the sharp ocean-continent transition, but transcurrent movements during the second opening phase may have changed its initial geometry. The Plio-Quaternary inversion of the margin related to ongoing convergence between Africa and Eurasia is expressed by a blind thrust system under the margin rising toward the surface at the slope toe, and by an isostatic disequilibrium resulting from opposite flexures of two plates decoupled at the continental slope. This disequilibrium is likely responsible for the peculiar asymmetrical shape of the crustal neck that may thus

  8. Vaginally delivered tenofovir disoproxil fumarate provides greater protection than tenofovir against genital herpes in a murine model of efficacy and safety.

    Science.gov (United States)

    Nixon, Briana; Jandl, Thomas; Teller, Ryan S; Taneva, Ekaterina; Wang, Yanhua; Nagaraja, Umadevi; Kiser, Patrick F; Herold, Betsy C

    2014-01-01

    Increased susceptibility to genital herpes in medroxyprogesterone-treated mice may provide a surrogate of increased HIV risk and a preclinical biomarker of topical preexposure prophylaxis safety. We evaluated tenofovir disoproxil fumarate (TDF) in this murine model because an intravaginal ring eluting this drug is being advanced into clinical trials. To avoid the complications of surgically inserting a ring, hydroxyethylcellulose (HEC)-stable formulations of TDF were prepared. One week of twice-daily 0.3% TDF gel was well tolerated and did not result in any increase in HSV-2 susceptibility but protected mice from herpes simplex virus 2 (HSV-2) disease compared to mice treated with the HEC placebo gel. No significant increase in inflammatory cytokines or chemokines in vaginal washes or change in cytokine, chemokine, or mitochondrial gene expression in RNA extracted from genital tract tissue was detected. To further evaluate efficacy, mice were treated with gel once daily beginning 12 h prior to high-dose HSV-2 challenge or 2 h before and after viral challenge (BAT24 dosing). The 0.3% TDF gel provided significant protection compared to the HEC gel following either daily (in 9/10 versus 1/10 mice, P < 0.01) or BAT24 (in 14/20 versus 4/20 mice, P < 0.01) dosing. In contrast, 1% tenofovir (TFV) gel protected only 4/10 mice treated with either regimen. Significant protection was also observed with daily 0.03% TDF compared to HEC. Protection was associated with greater murine cellular permeability of radiolabeled TDF than of TFV. Together, these findings suggest that TDF is safe, may provide substantially greater protection against HSV than TFV, and support the further clinical development of a TDF ring.

  9. Selection of References in Wind Turbine Model Predictive Control Design

    DEFF Research Database (Denmark)

    Odgaard, Peter Fogh; Hovgaard, Tobias

    2015-01-01

    In this paper we investigate the performance of a reference tracking model predictive controller for a wind turbine with two different setups of the optimal reference signals used. One of the important aspects of a tracking control problem is how to set up the optimal reference tracking problem, as it might be relevant to track, e.g., three concurrent references: the optimal pitch angle, the optimal rotational speed, and the optimal power. The importance of the individual references differs depending in particular on the wind speed. The controllers are evaluated using an industrial high

  10. A predictive model of music preference using pairwise comparisons

    DEFF Research Database (Denmark)

    Jensen, Bjørn Sand; Gallego, Javier Saez; Larsen, Jan

    2012-01-01

    Music recommendation is an important aspect of many streaming services and multi-media systems; however, it is typically based on so-called collaborative filtering methods. In this paper we consider the recommendation task from a personal viewpoint and examine to which degree music preference can be elicited and predicted using simple and robust queries such as pairwise comparisons. We propose to model - and in turn predict - the pairwise music preference using a very flexible model based on Gaussian Process priors, for which we describe the required inference. We further propose a specific covariance

  11. Evaluating the reliability of predictions made using environmental transfer models

    International Nuclear Information System (INIS)

    1989-01-01

    The development and application of mathematical models for predicting the consequences of releases of radionuclides into the environment from normal operations in the nuclear fuel cycle and in hypothetical accident conditions has increased dramatically in the last two decades. This Safety Practice publication has been prepared to provide guidance on the available methods for evaluating the reliability of environmental transfer model predictions. It provides a practical introduction to the subject, and particular emphasis has been given to worked examples in the text. It is intended to supplement existing IAEA publications on environmental assessment methodology. 60 refs, 17 figs, 12 tabs

  12. Physical/chemical modeling for photovoltaic module life prediction

    Science.gov (United States)

    Moacanin, J.; Carroll, W. F.; Gupta, A.

    1979-01-01

    The paper presents a generalized methodology for identification and evaluation of potential degradation and failure of terrestrial photovoltaic encapsulation. Failure progression modeling and an interaction matrix are utilized to complement the conventional approach to failure degradation mode identification. Comparison of the predicted performance based on these models can produce: (1) constraints on system or component design, materials or operating conditions, (2) qualification (predicted satisfactory function), and (3) uncertainty. The approach has been applied to an investigation of an unexpected delamination failure; it is being used to evaluate thermomechanical interactions in photovoltaic modules and to study corrosion of contacts and interconnects.

  13. A neural network model for olfactory glomerular activity prediction

    Science.gov (United States)

    Soh, Zu; Tsuji, Toshio; Takiguchi, Noboru; Ohtake, Hisao

    2012-12-01

    Recently, the importance of odors and methods for their evaluation have seen increased emphasis, especially in the fragrance and food industries. Although odors can be characterized by their odorant components, their chemical information cannot be directly related to the flavors we perceive. Biological research has revealed that neuronal activity related to glomeruli (which form part of the olfactory system) is closely connected to odor qualities. Here we report on a neural network model of the olfactory system that can predict glomerular activity from odorant molecule structures. We also report on the learning and prediction ability of the proposed model.

  14. Predicted and measured velocity distribution in a model heat exchanger

    International Nuclear Information System (INIS)

    Rhodes, D.B.; Carlucci, L.N.

    1984-01-01

    This paper presents a comparison between numerical predictions, using the porous media concept, and measurements of the two-dimensional isothermal shell-side velocity distributions in a model heat exchanger. Computations and measurements were done with and without tubes present in the model. The effect of tube-to-baffle leakage was also investigated. The comparison was made to validate certain porous media concepts used in a computer code being developed to predict the detailed shell-side flow in a wide range of shell-and-tube heat exchanger geometries

  15. Mathematical modeling to predict residential solid waste generation.

    Science.gov (United States)

    Benítez, Sara Ojeda; Lozano-Olvera, Gabriela; Morelos, Raúl Adalberto; Vega, Carolina Armijo de

    2008-01-01

    One of the challenges faced by waste management authorities is determining the amount of waste generated by households in order to establish waste management systems, charge rates compatible with principles applied worldwide, and design a fair payment system for households according to the amount of residential solid waste (RSW) they generate. The goal of this research work was to establish mathematical models that correlate the generation of RSW per capita with the following variables: education, income per household, and number of residents. This work was based on data from a three-stage study on the generation, quantification and composition of residential waste in a Mexican city. In order to define prediction models, five variables were identified and included in the model. For each waste sampling stage a different mathematical model was developed, in order to find the model that showed the best linear relation for predicting residential solid waste generation. Later on, models were established to explore combinations of the included variables and to select those which showed a higher R2. The tests applied were for normality, multicollinearity and heteroskedasticity. Another model, formulated with four variables, was generated, and the Durbin-Watson test was applied to it. Finally, a general mathematical model is proposed to predict residential waste generation, which accounts for 51% of the total.
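
    A hedged sketch of the modeling workflow described (multiple linear regression with a Durbin-Watson check), using synthetic household data; the variable names, coefficients and units are placeholders, not the study's measurements.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        from statsmodels.stats.stattools import durbin_watson

        rng = np.random.default_rng(6)
        n = 120
        df = pd.DataFrame({
            "education_years": rng.uniform(3, 18, n),
            "income": rng.uniform(2000, 20000, n),
            "residents": rng.integers(1, 8, n),
        })
        # Synthetic per-capita generation (kg/person/day); real data would come from the sampling campaigns.
        df["rsw_per_capita"] = (0.4 + 1.5e-5 * df["income"] - 0.02 * df["residents"]
                                + rng.normal(scale=0.05, size=n))

        X = sm.add_constant(df[["education_years", "income", "residents"]])
        fit = sm.OLS(df["rsw_per_capita"], X).fit()
        print(fit.rsquared, durbin_watson(fit.resid))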

  16. Using Deep Learning Model for Meteorological Satellite Cloud Image Prediction

    Science.gov (United States)

    Su, X.

    2017-12-01

    A satellite cloud image contains much weather information, such as precipitation. Short-term cloud movement forecasting is important for precipitation forecasting and is the primary means of typhoon monitoring. Traditional methods mostly use cloud feature matching and linear extrapolation to predict cloud movement, so nonstationary processes such as inversion and deformation during cloud movement are largely not considered. It is still a hard task to predict cloud movement timely and correctly. As deep learning models perform well in learning spatiotemporal features, to meet this challenge we regard cloud image prediction as a spatiotemporal sequence forecasting problem and introduce a deep learning model to solve it. In this research, we use a variant of the gated recurrent unit (GRU) that has convolutional structures to deal with spatiotemporal features and build an end-to-end model to solve this forecast problem. In this model, both the input and output are spatiotemporal sequences. Compared to the convolutional LSTM (ConvLSTM) model, this model has a lower number of parameters. We apply this model to GOES satellite data and the model performs well.
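
    Not the authors' implementation, but a minimal convolutional GRU cell of the general kind described, written in PyTorch with toy frame sizes; the channel counts and the 1×1 read-out layer are assumptions.

        import torch
        import torch.nn as nn

        class ConvGRUCell(nn.Module):
            """One convolutional GRU step: gates and candidate state are 2-D convolutions."""
            def __init__(self, in_ch, hid_ch, k=3):
                super().__init__()
                pad = k // 2
                self.gates = nn.Conv2d(in_ch + hid_ch, 2 * hid_ch, k, padding=pad)
                self.cand = nn.Conv2d(in_ch + hid_ch, hid_ch, k, padding=pad)

            def forward(self, x, h):
                z, r = torch.sigmoid(self.gates(torch.cat([x, h], dim=1))).chunk(2, dim=1)
                h_new = torch.tanh(self.cand(torch.cat([x, r * h], dim=1)))
                return (1 - z) * h + z * h_new

        # Toy rollout over a sequence of 64x64 single-channel "cloud" frames.
        cell = ConvGRUCell(in_ch=1, hid_ch=8)
        frames = torch.randn(10, 1, 1, 64, 64)        # (time, batch, channel, H, W)
        h = torch.zeros(1, 8, 64, 64)
        for x in frames:
            h = cell(x, h)
        next_frame = nn.Conv2d(8, 1, 1)(h)            # 1x1 conv reads out the predicted frame
        print(next_frame.shape)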

  17. Predictive models in cancer management: A guide for clinicians.

    Science.gov (United States)

    Kazem, Mohammed Ali

    2017-04-01

    Predictive tools in cancer management are used to predict different outcomes, including survival probability or risk of recurrence. The uptake of these tools by clinicians involved in cancer management has not been as widespread as that of other clinical tools, which may be due to the complexity of some of these tools or a lack of understanding of how they can aid decision-making in particular clinical situations. The aim of this article is to improve clinicians' knowledge and understanding of predictive tools used in cancer management, including how they are built, how they can be applied to medical practice, and what their limitations may be. A literature review was conducted to investigate the role of predictive tools in cancer management. All predictive models share similar characteristics, but depending on the type of tool, its ability to predict an outcome will differ. Each type has its own pros and cons, and its generalisability will depend on the cohort used to build the tool. These factors will affect the clinician's decision whether or not to apply the model to their cohort. Before a model is used in clinical practice, it is important to appreciate how the model is constructed, what its use may add over and above traditional decision-making tools, and what problems or limitations may be associated with it. Understanding all of the above is an important step for any clinician who wants to decide whether or not to use predictive tools in their practice. Copyright © 2016 Royal College of Surgeons of Edinburgh (Scottish charity number SC005317) and Royal College of Surgeons in Ireland. Published by Elsevier Ltd. All rights reserved.

  18. Ensemble ecosystem modeling for predicting ecosystem response to predator reintroduction.

    Science.gov (United States)

    Baker, Christopher M; Gordon, Ascelin; Bode, Michael

    2017-04-01

    Introducing a new or extirpated species to an ecosystem is risky, and managers need quantitative methods that can predict the consequences for the recipient ecosystem. Proponents of keystone predator reintroductions commonly argue that the presence of the predator will restore ecosystem function, but this has not always been the case, and mathematical modeling has an important role to play in predicting how reintroductions will likely play out. We devised an ensemble modeling method that integrates species interaction networks and dynamic community simulations and used it to describe the range of plausible consequences of 2 keystone-predator reintroductions: wolves (Canis lupus) to Yellowstone National Park and dingoes (Canis dingo) to a national park in Australia. Although previous methods for predicting ecosystem responses to such interventions focused on predicting changes around a given equilibrium, we used Lotka-Volterra equations to predict changing abundances through time. We applied our method to interaction networks for wolves in Yellowstone National Park and for dingoes in Australia. Our model replicated the observed dynamics in Yellowstone National Park and produced a larger range of potential outcomes for the dingo network. However, we also found that changes in small vertebrates or invertebrates gave a good indication about the potential future state of the system. Our method allowed us to predict when the systems were far from equilibrium. Our results showed that the method can also be used to predict which species may increase or decrease following a reintroduction and can identify species that are important to monitor (i.e., species whose changes in abundance give extra insight into broad changes in the system). Ensemble ecosystem modeling can also be applied to assess the ecosystem-wide implications of other types of interventions including assisted migration, biocontrol, and invasive species eradication. © 2016 Society for Conservation Biology.
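
    As an assumed illustration of ensemble ecosystem modeling, the sketch below integrates a generalized Lotka-Volterra system many times with interaction magnitudes resampled from plausible ranges and summarizes the spread of outcomes; the network, parameter ranges and species count are placeholders, not the Yellowstone or dingo networks.

        import numpy as np
        from scipy.integrate import solve_ivp

        rng = np.random.default_rng(7)
        n_species, n_draws = 5, 200

        def glv(t, x, r, A):
            return x * (r + A @ x)              # generalized Lotka-Volterra dynamics

        # Interaction signs (who benefits or harms whom) would come from the published network;
        # here they are random placeholders and only the magnitudes are resampled per draw.
        sign = np.sign(rng.normal(size=(n_species, n_species)))

        outcomes = []
        for _ in range(n_draws):
            A = sign * rng.uniform(0.01, 0.1, size=(n_species, n_species))
            np.fill_diagonal(A, -rng.uniform(0.5, 1.0, size=n_species))   # self-limitation
            r = rng.uniform(0.1, 0.5, size=n_species)
            x0 = rng.uniform(0.1, 1.0, size=n_species)
            sol = solve_ivp(glv, (0, 50), x0, args=(r, A))
            outcomes.append(sol.y[:, -1])

        print(np.percentile(np.array(outcomes), [5, 50, 95], axis=0))     # plausible abundance ranges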

  19. Hybrid multiscale modeling and prediction of cancer cell behavior.

    Directory of Open Access Journals (Sweden)

    Mohammad Hossein Zangooei

    Full Text Available Understanding cancer development across several spatiotemporal scales is of great practical significance for better understanding and treating cancers. It is difficult to tackle this challenge with purely biological means. Hybrid modeling techniques have therefore been proposed that combine the advantages of continuum and discrete methods to model multiscale problems. In light of these problems, we have proposed a new hybrid vascular model to facilitate the multiscale modeling and simulation of cancer development using agent-based, cellular automata and machine learning methods. The purpose of this simulation is to create a dataset that can be used for the prediction of cell phenotypes. By using a proposed Q-learning method based on SVR-NSGA-II, the cells have the capability to predict their phenotypes autonomously, that is, to act on their own without external direction in response to the situations they encounter. Computational simulations of the model were performed in order to analyze its performance. The most striking feature of our results is that each cell can select its phenotype at each time step according to its condition. We provide evidence that the prediction of cell phenotypes is reliable. Our proposed model, which we term a hybrid multiscale model of cancer cell behavior, has the potential to combine the best features of both continuum and discrete models. The in silico results indicate that the 3D model can represent key features of cancer growth, angiogenesis, and the related micro-environment, and show that the findings are in good agreement with biological tumor behavior. To the best of our knowledge, this paper presents the first hybrid vascular multiscale model of cancer cell behavior that has the capability to predict cell phenotypes individually from a self-generated dataset.

  20. Predicting Power Outages Using Multi-Model Ensemble Forecasts

    Science.gov (United States)

    Cerrai, D.; Anagnostou, E. N.; Yang, J.; Astitha, M.

    2017-12-01

    Power outages affect millions of people in the United States every year, impacting the economy and everyday life. An Outage Prediction Model (OPM) has been developed at the University of Connecticut to help utilities restore power quickly and limit the adverse consequences of outages for the population. The OPM, operational since 2015, combines several non-parametric machine learning (ML) models that use historical weather storm simulations, high-resolution weather forecasts, satellite remote sensing data, and infrastructure and land cover data to predict the number and spatial distribution of power outages. A new methodology for improving outage model performance by combining weather- and soil-related variables from three different weather models (WRF 3.7, WRF 3.8 and RAMS/ICLAMS) will be presented in this study. First, we will present a performance evaluation of each model variable by comparing historical weather analyses with station data or reanalysis over the entire storm data set. Each variable of the new outage model version is then extracted from the weather model that performs best for that variable, and sensitivity tests are performed to investigate the most efficient variable combination for outage prediction purposes. Although the final variable combination is extracted from different weather models, this power outage prediction ensemble, based on multiple weather forcings and multiple statistical models, outperforms the currently operational OPM version that is based on a single weather forcing (WRF 3.7), because each model component is the closest to the actual atmospheric state.

  1. Watershed Regressions for Pesticides (WARP) models for predicting stream concentrations of multiple pesticides

    Science.gov (United States)

    Stone, Wesley W.; Crawford, Charles G.; Gilliom, Robert J.

    2013-01-01

    Watershed Regressions for Pesticides for multiple pesticides (WARP-MP) are statistical models developed to predict concentration statistics for a wide range of pesticides in unmonitored streams. The WARP-MP models use the national atrazine WARP models in conjunction with an adjustment factor for each additional pesticide. The WARP-MP models perform best for pesticides with application timing and methods similar to those used with atrazine. For other pesticides, WARP-MP models tend to overpredict concentration statistics for the model development sites. For WARP and WARP-MP, the less-than-ideal sampling frequency for the model development sites leads to underestimation of the shorter-duration concentrations; hence, the WARP models tend to underpredict 4- and 21-d maximum moving-average concentrations, with median errors ranging from 9 to 38%. As a result of this sampling bias, pesticides that performed well with the model development sites are expected to have predictions that are biased low for these shorter-duration concentration statistics. The overprediction by WARP-MP apparent for some of the pesticides is variably offset by underestimation of the model development concentration statistics. Of the 112 pesticides used in the WARP-MP application to stream segments nationwide, 25 were predicted to have concentration statistics with a 50% or greater probability of exceeding one or more aquatic life benchmarks in one or more stream segments. Geographically, many of the modeled streams in the Corn Belt Region were predicted to have one or more pesticides that exceeded an aquatic life benchmark during 2009, indicating the potential vulnerability of streams in this region.

  2. An updated PREDICT breast cancer prognostication and treatment benefit prediction model with independent validation.

    Science.gov (United States)

    Candido Dos Reis, Francisco J; Wishart, Gordon C; Dicks, Ed M; Greenberg, David; Rashbass, Jem; Schmidt, Marjanka K; van den Broek, Alexandra J; Ellis, Ian O; Green, Andrew; Rakha, Emad; Maishman, Tom; Eccles, Diana M; Pharoah, Paul D P

    2017-05-22

    PREDICT is a breast cancer prognostic and treatment benefit model implemented online. The overall fit of the model has been good in multiple independent case series, but PREDICT has been shown to underestimate breast cancer specific mortality in women diagnosed under the age of 40. Another limitation is the use of discrete categories for tumour size and node status, resulting in 'step' changes in risk estimates on moving between categories. We have refitted the PREDICT prognostic model using the original cohort of cases from East Anglia with updated survival time in order to take into account age at diagnosis and to smooth out the survival function for tumour size and node status. Multivariable Cox regression models were used to fit separate models for ER negative and ER positive disease. Continuous variables were fitted using fractional polynomials, and a smoothed baseline hazard was obtained by regressing the baseline cumulative hazard for each patient against time using fractional polynomials. The fit of the prognostic models was then tested in three independent data sets that had also been used to validate the original version of PREDICT. In the model fitting data, after adjusting for other prognostic variables, there is an increase in the risk of breast cancer specific mortality in younger and older patients with ER positive disease, with a substantial increase in risk for women diagnosed before the age of 35. In ER negative disease the risk increases slightly with age. The association between breast cancer specific mortality and both tumour size and number of positive nodes was non-linear, with a more marked increase in risk with increasing size and increasing number of nodes in ER positive disease. The overall calibration and discrimination of the new version of PREDICT (v2) was good and comparable to that of the previous version in both model development and validation data sets. However, the calibration of v2 improved over v1 in patients diagnosed under the age
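
    A hedged sketch of the modeling approach described (a Cox proportional hazards fit with smooth, fractional-polynomial-style transforms of tumour size and node count instead of step categories), using the lifelines library on synthetic follow-up data; the transforms and simulated cohort are assumptions, not the refitted PREDICT equations.

        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        rng = np.random.default_rng(8)
        n = 500
        df = pd.DataFrame({
            "age": rng.uniform(25, 85, n),
            "size_mm": rng.uniform(5, 60, n),
            "nodes": rng.poisson(2, n),
        })
        # Synthetic follow-up; a real refit would use the East Anglia cohort described in the paper.
        risk = 0.02 * (df["age"] - 55) + 0.03 * df["size_mm"] + 0.15 * df["nodes"]
        df["time"] = rng.exponential(scale=np.exp(-0.05 * risk) * 10)
        df["event"] = (rng.uniform(size=n) < 0.7).astype(int)

        # Fractional-polynomial-style transforms keep size and node effects smooth (no step categories).
        df["size_sqrt"] = np.sqrt(df["size_mm"])
        df["nodes_log"] = np.log(df["nodes"] + 1)

        cph = CoxPHFitter()
        cph.fit(df[["age", "size_sqrt", "nodes_log", "time", "event"]],
                duration_col="time", event_col="event")
        cph.print_summary()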

  3. Performance of ANFIS versus MLP-NN dissolved oxygen prediction models in water quality monitoring.

    Science.gov (United States)

    Najah, A; El-Shafie, A; Karim, O A; El-Shafie, Amr H

    2014-02-01

    We discuss the accuracy and performance of the adaptive neuro-fuzzy inference system (ANFIS) in training and prediction of dissolved oxygen (DO) concentrations. The model was used to analyze historical data generated through continuous monitoring of water quality parameters at several stations on the Johor River to predict DO concentrations. Four water quality parameters were selected for ANFIS modeling, including temperature, pH, nitrate (NO3) concentration, and ammoniacal nitrogen concentration (NH3-NL). Sensitivity analysis was performed to evaluate the effects of the input parameters. The inputs with the greatest effect were those related to oxygen content (NO3) or oxygen demand (NH3-NL). Temperature was the parameter with the least effect, whereas pH provided the lowest contribution to the proposed model. To evaluate the performance of the model, three statistical indices were used: the coefficient of determination (R2), the mean absolute prediction error, and the correlation coefficient. The performance of the ANFIS model was compared with an artificial neural network model. The ANFIS model was capable of providing greater accuracy, particularly in the case of extreme events.

  4. Clinical and epidemiological round: Approach to clinical prediction models

    Directory of Open Access Journals (Sweden)

    Isaza-Jaramillo, Sandra

    2017-01-01

    Full Text Available Research related to prognosis can be classified as follows: fundamental, which shows differences in health outcomes; prognostic factors, which identifies and characterizes variables; development, validation and impact of predictive models; and finally, stratified medicine, to establish groups that share a risk factor associated with the outcome of interest. The outcome of a person regarding health or disease status can be predicted considering certain characteristics associated, before or simultaneously, with that outcome. This can be done by means of prognostic or diagnostic predictive models. The development of a predictive model requires care in the selection, definition, measurement and categorization of predictor variables; in the exploration of interactions; in the number of variables to be included; in the calculation of sample size; in the handling of missing data; in the statistical tests to be used; and in the presentation of the model. The model thus developed must be validated in a different group of patients to establish its calibration, discrimination and usefulness.

  5. Neural Network Modeling to Predict Shelf Life of Greenhouse Lettuce

    Directory of Open Access Journals (Sweden)

    Wei-Chin Lin

    2009-04-01

    Full Text Available Greenhouse-grown butter lettuce (Lactuca sativa L.) can potentially be stored for 21 days at constant 0°C. When storage temperature was increased to 5°C or 10°C, shelf life was shortened to 14 or 10 days, respectively, in our previous observations. Also, commercial shelf life of 7 to 10 days is common, due to postharvest temperature fluctuations. The objective of this study was to establish neural network (NN) models to predict the remaining shelf life (RSL) under fluctuating postharvest temperatures. A box of 12-24 lettuce heads constituted a sample unit. The end of the shelf life of each head was determined when it showed initial signs of decay or yellowing. Air temperatures inside a shipping box were recorded. Daily average temperatures in storage and averaged shelf life of each box were used as inputs, and the RSL was modeled as an output. An R² of 0.57 could be observed when a simple NN structure was employed. Since the "future" (or remaining) storage temperatures were unavailable at the time of making a prediction, a second NN model was introduced to accommodate a range of future temperatures and associated shelf lives. Using such 2-stage NN models, an R² of 0.61 could be achieved for predicting RSL. This study indicated that NN modeling has potential for cold chain quality control and shelf life prediction.

  6. A new, accurate predictive model for incident hypertension.

    Science.gov (United States)

    Völzke, Henry; Fung, Glenn; Ittermann, Till; Yu, Shipeng; Baumeister, Sebastian E; Dörr, Marcus; Lieb, Wolfgang; Völker, Uwe; Linneberg, Allan; Jørgensen, Torben; Felix, Stephan B; Rettig, Rainer; Rao, Bharat; Kroemer, Heyo K

    2013-11-01

    Data mining represents an alternative approach to identify new predictors of multifactorial diseases. This work aimed at building an accurate predictive model for incident hypertension using data mining procedures. The primary study population consisted of 1605 normotensive individuals aged 20-79 years with 5-year follow-up from the population-based Study of Health in Pomerania (SHIP). The initial set was randomly split into a training and a testing set. We used a probabilistic graphical model applying a Bayesian network to create a predictive model for incident hypertension and compared the predictive performance with the established Framingham risk score for hypertension. Finally, the model was validated in 2887 participants from INTER99, a Danish community-based intervention study. In the training set of SHIP data, the Bayesian network used a small subset of relevant baseline features including age, mean arterial pressure, rs16998073, serum glucose and urinary albumin concentrations. Furthermore, we detected relevant interactions between age and serum glucose as well as between rs16998073 and urinary albumin concentrations [area under the receiver operating characteristic curve (AUC) 0.76]. The model was confirmed in the SHIP validation set (AUC 0.78) and externally replicated in INTER99 (AUC 0.77). Compared with the established Framingham risk score for hypertension, the predictive performance of the new model was similar in the SHIP validation set and moderately better in INTER99. Data mining procedures identified a predictive model for incident hypertension, which included innovative and easy-to-measure variables. The findings promise great applicability in screening settings and clinical practice.
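
    A hedged sketch of the general workflow (not the SHIP analysis): a naive Bayes classifier stands in for the Bayesian network, synthetic baseline features stand in for the study variables, and predictive performance is summarised by the AUC on a held-out split.

```python
# Simplified stand-in for a probabilistic graphical model: naive Bayes on
# synthetic "baseline features", evaluated by AUC. Feature names, coefficients
# and data are hypothetical.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 1600
age     = rng.uniform(20, 79, n)
map_bp  = rng.normal(90, 10, n)                    # mean arterial pressure
glucose = rng.normal(5.3, 0.8, n)

# Synthetic 5-year incident hypertension, loosely increasing with age and MAP.
logit = -12 + 0.05 * age + 0.08 * map_bp + 0.3 * glucose
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = np.column_stack([age, map_bp, glucose])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

model = GaussianNB().fit(X_tr, y_tr)
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
print(f"held-out AUC: {auc:.2f}")
```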

  7. An international model to predict recurrent cardiovascular disease.

    Science.gov (United States)

    Wilson, Peter W F; D'Agostino, Ralph; Bhatt, Deepak L; Eagle, Kim; Pencina, Michael J; Smith, Sidney C; Alberts, Mark J; Dallongeville, Jean; Goto, Shinya; Hirsch, Alan T; Liau, Chiau-Suong; Ohman, E Magnus; Röther, Joachim; Reid, Christopher; Mas, Jean-Louis; Steg, Ph Gabriel

    2012-07-01

    Prediction models for cardiovascular events and cardiovascular death in patients with established cardiovascular disease are not generally available. Participants from the prospective REduction of Atherothrombosis for Continued Health (REACH) Registry provided a global outpatient population with known cardiovascular disease at entry. Cardiovascular prediction models were estimated from the 2-year follow-up data of 49,689 participants from around the world. A developmental prediction model was estimated from 33,419 randomly selected participants (2394 cardiovascular events with 1029 cardiovascular deaths) from the pool of 49,689. The number of vascular beds with clinical disease, diabetes, smoking, low body mass index, history of atrial fibrillation, cardiac failure, and history of cardiovascular event(s) <1 year before baseline examination increased risk of a subsequent cardiovascular event. Statin (hazard ratio 0.75; 95% confidence interval, 0.69-0.82) and acetylsalicylic acid therapy (hazard ratio 0.90; 95% confidence interval, 0.83-0.99) also were significantly associated with reduced risk of cardiovascular events. The prediction model was validated in the remaining 16,270 REACH subjects (1172 cardiovascular events, 494 cardiovascular deaths). Risk of cardiovascular death was similarly estimated with the same set of risk factors. Simple algorithms were developed for prediction of overall cardiovascular events and for cardiovascular death. This study establishes and validates a risk model to predict secondary cardiovascular events and cardiovascular death in outpatients with established atherothrombotic disease. Traditional risk factors, burden of disease, lack of treatment, and geographic location all are related to an increased risk of subsequent cardiovascular morbidity and cardiovascular mortality. Copyright © 2012 Elsevier Inc. All rights reserved.

  8. A hybrid predictive model for acoustic noise in urban areas based on time series analysis and artificial neural network

    Science.gov (United States)

    Guarnaccia, Claudio; Quartieri, Joseph; Tepedino, Carmine

    2017-06-01

    The dangerous effect of noise on human health is well known. Both the auditory and non-auditory effects are largely documented in the literature and represent an important hazard in human activities. Particular care is devoted to road traffic noise, since it is growing together with the growth of residential, industrial and commercial areas. For these reasons, it is important to develop effective models able to predict the noise in a certain area. In this paper, a hybrid predictive model is presented. The model is based on the mixing of two different approaches: Time Series Analysis (TSA) and the Artificial Neural Network (ANN). The TSA model is based on the evaluation of trend and seasonality in the data, while the ANN model is based on the capacity of the network to "learn" the behavior of the data. The mixed approach consists in evaluating noise levels by means of TSA and, once the differences (residuals) between the TSA estimations and the observed data have been calculated, in training an ANN on the residuals. This hybrid model exhibits interesting features and results, with a significant variation related to the number of steps forward in the prediction. It is shown that the best results, in terms of prediction, are achieved when predicting one step ahead in the future. A 7-day prediction can also be performed, with a slightly greater error, but offering a larger prediction range with respect to the single-day-ahead predictive model.
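
    The hybrid scheme lends itself to a compact sketch: fit trend plus weekly seasonality by least squares (the TSA step), then train a small neural network on the residuals and add its one-step-ahead residual forecast to the TSA extrapolation. The data, lag length and network size below are assumptions for illustration, not the authors' implementation.

```python
# Hedged sketch of a TSA + ANN hybrid on synthetic daily noise levels.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
days = np.arange(200, dtype=float)
leq = 65 + 0.01 * days + 3 * np.sin(2 * np.pi * days / 7) + rng.normal(0, 1, days.size)

# --- TSA part: linear trend plus weekly seasonality via least squares --------
X_tsa = np.column_stack([np.ones_like(days), days,
                         np.sin(2 * np.pi * days / 7), np.cos(2 * np.pi * days / 7)])
coef, *_ = np.linalg.lstsq(X_tsa, leq, rcond=None)
residuals = leq - X_tsa @ coef

# --- ANN part: predict tomorrow's residual from the last 7 residuals ---------
lag = 7
X_ann = np.array([residuals[i - lag:i] for i in range(lag, residuals.size - 1)])
y_ann = residuals[lag + 1:]
ann = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X_ann, y_ann)

# One-step-ahead forecast = TSA extrapolation + ANN-predicted residual.
next_day = days[-1] + 1
tsa_next = np.array([1.0, next_day,
                     np.sin(2 * np.pi * next_day / 7), np.cos(2 * np.pi * next_day / 7)]) @ coef
resid_next = ann.predict(residuals[-lag:].reshape(1, -1))[0]
print(f"one-step-ahead prediction: {tsa_next + resid_next:.1f} dB(A)")
```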

  9. Maximum likelihood Bayesian model averaging and its predictive analysis for groundwater reactive transport models

    Science.gov (United States)

    Curtis, Gary P.; Lu, Dan; Ye, Ming

    2015-01-01

    While Bayesian model averaging (BMA) has been widely used in groundwater modeling, it is infrequently applied to groundwater reactive transport modeling because of multiple sources of uncertainty in the coupled hydrogeochemical processes and because of the long execution time of each model run. To resolve these problems, this study analyzed different levels of uncertainty in a hierarchical way, and used the maximum likelihood version of BMA, i.e., MLBMA, to improve the computational efficiency. This study demonstrates the applicability of MLBMA to groundwater reactive transport modeling in a synthetic case in which twenty-seven reactive transport models were designed to predict the reactive transport of hexavalent uranium (U(VI)) based on observations at a former uranium mill site near Naturita, CO. These reactive transport models contain three uncertain model components, i.e., parameterization of hydraulic conductivity, configuration of model boundary, and surface complexation reactions that simulate U(VI) adsorption. These uncertain model components were aggregated into the alternative models by integrating a hierarchical structure into MLBMA. The modeling results of the individual models and MLBMA were analyzed to investigate their predictive performance. The predictive logscore results show that MLBMA generally outperforms the best model, suggesting that using MLBMA is a sound strategy to achieve more robust model predictions relative to a single model. MLBMA works best when the alternative models are structurally distinct and have diverse model predictions. When correlation in model structure exists, two strategies were used to improve predictive performance by retaining structurally distinct models or assigning smaller prior model probabilities to correlated models. Since the synthetic models were designed using data from the Naturita site, the results of this study are expected to provide guidance for real-world modeling. Limitations of applying MLBMA to the
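
    The averaging step of MLBMA can be illustrated in a few lines: posterior model probabilities derived from information-criterion values under equal priors, a BMA predictive mean and variance that includes the between-model term, and a Gaussian log score. All model outputs below are invented for illustration.

```python
# Minimal sketch of maximum likelihood Bayesian model averaging (MLBMA) with
# three hypothetical alternative models and made-up calibration results.
import numpy as np

bic       = np.array([102.4, 104.1, 109.8])        # BIC-like values, lower = better
pred_mean = np.array([0.85, 0.78, 0.60])           # hypothetical predicted U(VI) conc. (mg/L)
pred_var  = np.array([0.02, 0.03, 0.05])           # within-model predictive variance

# Posterior model probabilities: p(Mk|D) proportional to exp(-dBIC/2), equal priors.
delta = bic - bic.min()
w = np.exp(-0.5 * delta)
w /= w.sum()

# BMA predictive mean and variance (within-model + between-model terms).
mean_bma = np.sum(w * pred_mean)
var_bma = np.sum(w * (pred_var + (pred_mean - mean_bma) ** 2))

obs = 0.80                                          # hypothetical observation
logscore = -0.5 * np.log(2 * np.pi * var_bma) - (obs - mean_bma) ** 2 / (2 * var_bma)
print("weights:", np.round(w, 3), " BMA mean:", round(mean_bma, 3),
      " log score:", round(logscore, 3))
```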

  10. The development of U. S. soil erosion prediction and modeling

    Directory of Open Access Journals (Sweden)

    John M. Laflen

    2013-09-01

    Full Text Available Soil erosion prediction technology began over 70 years ago when Austin Zingg published a relationship between soil erosion (by water and land slope and length, followed shortly by a relationship by Dwight Smith that expanded this equation to include conservation practices. But, it was nearly 20 years before this work's expansion resulted in the Universal Soil Loss Equation (USLE, perhaps the foremost achievement in soil erosion prediction in the last century. The USLE has increased in application and complexity, and its usefulness and limitations have led to the development of additional technologies and new science in soil erosion research and prediction. Main among these new technologies is the Water Erosion Prediction Project (WEPP model, which has helped to overcome many of the shortcomings of the USLE, and increased the scale over which erosion by water can be predicted. Areas of application of erosion prediction include almost all land types: urban, rural, cropland, forests, rangeland, and construction sites. Specialty applications of WEPP include prediction of radioactive material movement with soils at a superfund cleanup site, and near real-time daily estimation of soil erosion for the entire state of Iowa.

  11. Intra prediction based on Markov process modeling of images.

    Science.gov (United States)

    Kamisli, Fatih

    2013-10-01

    In recent video coding standards, intraprediction of a block of pixels is performed by copying neighbor pixels of the block along an angular direction inside the block. Each block pixel is predicted from only one or few directionally aligned neighbor pixels of the block. Although this is a computationally efficient approach, it ignores potentially useful correlation of other neighbor pixels of the block. To use this correlation, a general linear prediction approach is proposed, where each block pixel is predicted using a weighted sum of all neighbor pixels of the block. The disadvantage of this approach is the increased complexity because of the large number of weights. In this paper, we propose an alternative approach to intraprediction, where we model image pixels with a Markov process. The Markov process model accounts for the ignored correlation in standard intraprediction methods, but uses few neighbor pixels and enables a computationally efficient recursive prediction algorithm. Compared with the general linear prediction approach that has a large number of independent weights, the Markov process modeling approach uses a much smaller number of independent parameters and thus offers significantly reduced memory or computation requirements, while achieving similar coding gains with offline computed parameters.
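
    Below is a minimal sketch of the recursive predictor implied by a separable first-order Markov image model, in which each block pixel is formed from its left, top and top-left neighbours with correlation coefficients rho_h and rho_v; the block size, coefficients and neighbour values are hypothetical, and the codec integration is omitted.

```python
# Illustrative recursive intra prediction under a separable first-order Markov
# image model. This is a sketch of the idea, not the method's exact parameters.
import numpy as np

def markov_intra_predict(top, left, top_left, size=4, rho_h=0.95, rho_v=0.95):
    """Predict a size x size block from its neighbour row/column, recursively."""
    # Padded array: row 0 = top neighbours, column 0 = left neighbours.
    p = np.zeros((size + 1, size + 1))
    p[0, 0], p[0, 1:], p[1:, 0] = top_left, top, left
    for i in range(1, size + 1):
        for j in range(1, size + 1):
            p[i, j] = (rho_v * p[i - 1, j] + rho_h * p[i, j - 1]
                       - rho_v * rho_h * p[i - 1, j - 1])
    return p[1:, 1:]

# Hypothetical neighbour pixels of a 4x4 block (a smooth gradient).
top  = np.array([100., 104., 108., 112.])
left = np.array([ 98.,  96.,  94.,  92.])
print(np.round(markov_intra_predict(top, left, top_left=101.), 1))
```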

  12. Nonlinear mixed-effects modeling: individualization and prediction.

    Science.gov (United States)

    Olofsen, Erik; Dinges, David F; Van Dongen, Hans P A

    2004-03-01

    The development of biomathematical models for the prediction of fatigue and performance relies on statistical techniques to analyze experimental data and model simulations. Statistical models of empirical data have adjustable parameters with a priori unknown values. Interindividual variability in estimates of those values requires a form of smoothing. This traditionally consists of averaging observations across subjects, or fitting a model to the data of individual subjects first and subsequently averaging the parameter estimates. However, the standard errors of the parameter estimates are assessed inaccurately by such averaging methods. The reason is that intra- and inter-individual variabilities are intertwined. They can be separated by mixed-effects modeling in which model predictions are not only determined by fixed effects (usually constant parameters or functions of time) but also by random effects, describing the sampling of subject-specific parameter values from probability distributions. By estimating the parameters of the distributions of the random effects, mixed-effects models can describe experimental observations involving multiple subjects properly (i.e., yielding correct estimates of the standard errors) and parsimoniously (i.e., estimating no more parameters than necessary). Using a Bayesian approach, mixed-effects models can be "individualized" as observations are acquired that capture the unique characteristics of the individual at hand. Mixed-effects models, therefore, have unique advantages in research on human neurobehavioral functions, which frequently show large inter-individual differences. To illustrate this we analyzed laboratory neurobehavioral performance data acquired during sleep deprivation, using a nonlinear mixed-effects model. The results serve to demonstrate the usefulness of mixed-effects modeling for data-driven development of individualized predictive models of fatigue and performance.

  13. Predictive assessment of models for dynamic functional connectivity.

    Science.gov (United States)

    Nielsen, Søren F V; Schmidt, Mikkel N; Madsen, Kristoffer H; Mørup, Morten

    2018-05-01

    In neuroimaging, it has become evident that models of dynamic functional connectivity (dFC), which characterize how intrinsic brain organization changes over time, can provide a more detailed representation of brain function than traditional static analyses. Many dFC models in the literature represent functional brain networks as a meta-stable process with a discrete number of states; however, there is a lack of consensus on how to perform model selection and learn the number of states, as well as a lack of understanding of how different modeling assumptions influence the estimated state dynamics. To address these issues, we consider a predictive likelihood approach to model assessment, where models are evaluated based on their predictive performance on held-out test data. Examining several prominent models of dFC (in their probabilistic formulations) we demonstrate our framework on synthetic data, and apply it on two real-world examples: a face recognition EEG experiment and resting-state fMRI. Our results evidence that both EEG and fMRI are better characterized using dynamic modeling approaches than by their static counterparts, but we also demonstrate that one must be cautious when interpreting dFC because parameter settings and modeling assumptions, such as window lengths and emission models, can have a large impact on the estimated states and consequently on the interpretation of the brain dynamics. Copyright © 2018 Elsevier Inc. All rights reserved.
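
    The predictive-likelihood comparison can be sketched as follows: score held-out data under a static Gaussian model and under a two-state Gaussian "dynamic" model evaluated with the forward algorithm. The hard-coded state parameters are assumptions; in practice both models would be learned on training data.

```python
# Sketch of predictive-likelihood model assessment on synthetic data.
import numpy as np
from scipy.stats import norm
from scipy.special import logsumexp

rng = np.random.default_rng(2)
# Held-out test sequence generated from two alternating regimes.
test = np.concatenate([rng.normal(-1, 1, 100), rng.normal(1, 1, 100)])

# --- static model: single Gaussian with assumed, pre-fitted parameters -------
ll_static = norm.logpdf(test, loc=0.0, scale=1.5).sum()

# --- dynamic model: 2-state Gaussian HMM, forward algorithm in log space -----
log_pi = np.log([0.5, 0.5])                      # initial state probabilities
log_A = np.log([[0.95, 0.05], [0.05, 0.95]])     # sticky transition matrix
means, sds = np.array([-1.0, 1.0]), np.array([1.0, 1.0])

log_alpha = log_pi + norm.logpdf(test[0], means, sds)
for x in test[1:]:
    log_alpha = logsumexp(log_alpha[:, None] + log_A, axis=0) + norm.logpdf(x, means, sds)
ll_dynamic = logsumexp(log_alpha)

print(f"held-out log-likelihood  static: {ll_static:.1f}   dynamic: {ll_dynamic:.1f}")
```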

  14. Estimation and prediction under local volatility jump-diffusion model

    Science.gov (United States)

    Kim, Namhyoung; Lee, Younhee

    2018-02-01

    Volatility is an important factor in operating a company and managing risk. In the portfolio optimization and risk hedging using the option, the value of the option is evaluated using the volatility model. Various attempts have been made to predict option value. Recent studies have shown that stochastic volatility models and jump-diffusion models reflect stock price movements accurately. However, these models have practical limitations. Combining them with the local volatility model, which is widely used among practitioners, may lead to better performance. In this study, we propose a more effective and efficient method of estimating option prices by combining the local volatility model with the jump-diffusion model and apply it using both artificial and actual market data to evaluate its performance. The calibration process for estimating the jump parameters and local volatility surfaces is divided into three stages. We apply the local volatility model, stochastic volatility model, and local volatility jump-diffusion model estimated by the proposed method to KOSPI 200 index option pricing. The proposed method displays good estimation and prediction performance.
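
    For orientation, the sketch below prices a European call under the textbook Merton jump-diffusion model, i.e. a Poisson-weighted sum of Black-Scholes prices. It is a simpler relative of the local volatility jump-diffusion calibration described above, and the parameters are not calibrated to KOSPI 200 data.

```python
# Textbook Merton jump-diffusion call pricer, included only as a simplified
# illustration of combining a diffusion with lognormal jumps.
import math
from scipy.stats import norm

def bs_call(S, K, T, r, sigma):
    """Black-Scholes European call."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm.cdf(d1) - K * math.exp(-r * T) * norm.cdf(d2)

def merton_call(S, K, T, r, sigma, lam, mu_j, sigma_j, n_terms=40):
    """Merton (1976) jump-diffusion call as a Poisson-weighted sum of BS prices."""
    k = math.exp(mu_j + 0.5 * sigma_j**2) - 1.0          # mean relative jump size
    lam_p = lam * (1.0 + k)
    price = 0.0
    for n in range(n_terms):
        sigma_n = math.sqrt(sigma**2 + n * sigma_j**2 / T)
        r_n = r - lam * k + n * (mu_j + 0.5 * sigma_j**2) / T
        weight = math.exp(-lam_p * T) * (lam_p * T) ** n / math.factorial(n)
        price += weight * bs_call(S, K, T, r_n, sigma_n)
    return price

# Hypothetical, uncalibrated parameters.
print(round(merton_call(S=100, K=100, T=0.5, r=0.02, sigma=0.2,
                        lam=0.5, mu_j=-0.1, sigma_j=0.15), 4))
```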

  15. Modeling and Prediction of Soil Water Vapor Sorption Isotherms

    DEFF Research Database (Denmark)

    Arthur, Emmanuel; Tuller, Markus; Moldrup, Per

    2015-01-01

    Soil water vapor sorption isotherms describe the relationship between water activity (aw) and moisture content along adsorption and desorption paths. The isotherms are important for modeling numerous soil processes and are also used to estimate several soil properties (specific surface area, clay content, …) … for a wide range of soils; and (ii) develop and test regression models for estimating the isotherms from clay content. Preliminary results show reasonable fits of the majority of the investigated empirical and theoretical models to the measured data, although some models were not capable of fitting both sorption directions accurately. Evaluation of the developed prediction equations showed good estimation of the sorption/desorption isotherms for the tested soils.

  16. Developing and Testing a Model to Predict Outcomes of Organizational Change

    Science.gov (United States)

    Gustafson, David H; Sainfort, François; Eichler, Mary; Adams, Laura; Bisognano, Maureen; Steudel, Harold

    2003-01-01

    Objective To test the effectiveness of a Bayesian model employing subjective probability estimates for predicting success and failure of health care improvement projects. Data Sources Experts' subjective assessment data for model development and independent retrospective data on 221 healthcare improvement projects in the United States, Canada, and the Netherlands collected between 1996 and 2000 for validation. Methods A panel of theoretical and practical experts and literature in organizational change were used to identify factors predicting the outcome of improvement efforts. A Bayesian model was developed to estimate probability of successful change using subjective estimates of likelihood ratios and prior odds elicited from the panel of experts. A subsequent retrospective empirical analysis of change efforts in 198 health care organizations was performed to validate the model. Logistic regression and ROC analysis were used to evaluate the model's performance using three alternative definitions of success. Data Collection For the model development, experts' subjective assessments were elicited using an integrative group process. For the validation study, a staff person intimately involved in each improvement project responded to a written survey asking questions about model factors and project outcomes. Results Logistic regression chi-square statistics and areas under the ROC curve demonstrated a high level of model performance in predicting success. Chi-square statistics were significant at the 0.001 level and areas under the ROC curve were greater than 0.84. Conclusions A subjective Bayesian model was effective in predicting the outcome of actual improvement projects. Additional prospective evaluations as well as testing the impact of this model as an intervention are warranted. PMID:12785571
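
    The core subjective-Bayesian calculation is compact: posterior odds of success equal the elicited prior odds multiplied by the likelihood ratios of the observed factor levels. The factor names and numbers below are hypothetical, not the study's elicited values.

```python
# Minimal sketch of a subjective Bayesian success prediction:
# posterior odds = prior odds x product of likelihood ratios.
prior_odds = 1.0                      # hypothetical elicited prior odds (1:1)

# Hypothetical likelihood ratios for the factor levels observed on a project.
likelihood_ratios = {
    "strong leadership support": 2.5,
    "dedicated project team":    1.8,
    "tension for change absent": 0.6,
}

posterior_odds = prior_odds
for factor, lr in likelihood_ratios.items():
    posterior_odds *= lr

prob_success = posterior_odds / (1.0 + posterior_odds)
print(f"posterior odds = {posterior_odds:.2f}, P(success) = {prob_success:.2f}")
```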

  17. Prediction of ultrasonic probe characteristics through modeling and simulation

    International Nuclear Information System (INIS)

    Amry Amin Abas; Mohamad Pauzi Ismail; Suhairy Sani

    2004-01-01

    One of the main components in an ultrasonic probe is the piezoelectric material. It converts electrical energy supplied to it into mechanical energy (i.e. sound waves) and vice versa. In industrial applications, the characteristics of ultrasonic probes are important as they affect the results obtained. The probes fabricated must possess characteristics suitable to the intended application. Through modeling and simulation, we can predict the characteristics of the probes. A Mason equivalent circuit is used to model and simulate the probes. In this model, the probe is treated and simplified as a one-dimensional electrical line. From the simulation, electrical properties such as impedance, operating frequency bandwidth and others can be predicted. From this model, the correct material to be used for actual probe construction can be obtained. A limitation of this method is that details such as the bond line between layers are not taken into consideration. (Author)

  18. Predictions of titanium alloy properties using thermodynamic modeling tools

    Science.gov (United States)

    Zhang, F.; Xie, F.-Y.; Chen, S.-L.; Chang, Y. A.; Furrer, D.; Venkatesh, V.

    2005-12-01

    Thermodynamic modeling tools have become essential in understanding the effect of alloy chemistry on the final microstructure of a material. Implementation of such tools to improve titanium processing via parameter optimization has resulted in significant cost savings through the elimination of shop/laboratory trials and tests. In this study, a thermodynamic modeling tool developed at CompuTherm, LLC, is being used to predict β transus, phase proportions, phase chemistries, partitioning coefficients, and phase boundaries of multicomponent titanium alloys. This modeling tool includes Pandat, software for multicomponent phase equilibrium calculations, and PanTitanium, a thermodynamic database for titanium alloys. Model predictions are compared with experimental results for one α-β alloy (Ti-64) and two near-β alloys (Ti-17 and Ti-10-2-3). The alloying elements, especially the interstitial elements O, N, H, and C, have been shown to have a significant effect on the β transus temperature, and are discussed in more detail herein.

  19. [A predictive model on turnover intention of nurses in Korea].

    Science.gov (United States)

    Moon, Sook Ja; Han, Sang Sook

    2011-10-01

    The purpose of this study was to propose and test a predictive model that could explain and predict Korean nurses' turnover intentions. A survey using a structured questionnaire was conducted with 445 nurses in Korea. Six instruments were used in this model. The data were analyzed using the SPSS 15.0 and Amos 7.0 programs. Based on the constructed model, organizational commitment and burnout were found to have a significant direct effect on turnover intention of nurses. In addition, factors such as empowerment, job satisfaction, and organizational commitment were found to indirectly affect turnover intention of nurses. The final modified model yielded χ²=402.30 (p < .001) and adequately explained turnover intention in Korean nurses. Findings from this study can be used to design appropriate strategies to further decrease nurses' turnover intention in Korea.

  20. PVT characterization and viscosity modeling and prediction of crude oils

    DEFF Research Database (Denmark)

    Cisneros, Eduardo Salvador P.; Dalberg, Anders; Stenby, Erling Halfdan

    2004-01-01

    In previous works, the general one-parameter friction theory (f-theory) models have been applied to the accurate viscosity modeling of reservoir fluids. As a base, the f-theory approach requires a compositional characterization procedure for the application of an equation of state (EOS); in most … a method based on an accurate description of the fluid mass distribution is presented. The characterization procedure accurately matches the fluid saturation pressure. Additionally, a Peneloux volume translation scheme, capable of accurately reproducing the fluid density above and below the saturation …, is used to deliver accurate viscosity predictions. The modeling approach presented in this work can deliver accurate viscosity and density modeling and prediction results over wide ranges of reservoir conditions, including the compositional changes induced by recovery processes such as gas injection.

  1. Prediction of conductivity by adaptive neuro-fuzzy model.

    Directory of Open Access Journals (Sweden)

    S Akbarzadeh

    Full Text Available Electrochemical impedance spectroscopy (EIS) is a key method for characterizing the ionic and electronic conductivity of materials. One of the requirements of this technique is a model to forecast conductivity in preliminary experiments. The aim of this paper is to examine the prediction of conductivity by neuro-fuzzy inference with basic experimental factors such as temperature, frequency, thickness of the film and weight percentage of salt. In order to provide the optimal sets of fuzzy logic rule bases, the grid partition fuzzy inference method was applied. The validation of the model was tested by four random data sets. To evaluate the validity of the model, eleven statistical features were examined. Statistical analysis of the results clearly shows that modeling with an adaptive neuro-fuzzy system is powerful enough for the prediction of conductivity.

  2. Predictive Model of Energy Consumption in Beer Production

    Directory of Open Access Journals (Sweden)

    Tiecheng Pu

    2013-07-01

    Full Text Available A predictive model of energy consumption in beer production is presented, based on subtractive clustering and an Adaptive-Network-Based Fuzzy Inference System (ANFIS). Using subtractive clustering on the historical energy-consumption data, the limits of artificial experience are overcome when determining the number of fuzzy rules. The parameters of the fuzzy inference system are acquired through the adaptive network structure and a hybrid online learning algorithm. The method can predict and guide the energy consumption of the actual production process, and a consumption-reduction scheme is provided based on the actual situation of the enterprise. Finally, concrete examples verified the feasibility of this method in comparison with a Radial Basis Function (RBF) neural network predictive model.

  3. Predicting water main failures using Bayesian model averaging and survival modelling approach

    International Nuclear Information System (INIS)

    Kabir, Golam; Tesfamariam, Solomon; Sadiq, Rehan

    2015-01-01

    To develop an effective preventive or proactive repair and replacement action plan, water utilities often rely on water main failure prediction models. However, in predicting the failure of water mains, uncertainty is inherent regardless of the quality and quantity of data used in the model. To improve the understanding of water main failure, a Bayesian framework is developed for predicting the failure of water mains considering uncertainties. In this study, the Bayesian model averaging method (BMA) is presented to identify the influential pipe-dependent and time-dependent covariates considering model uncertainties, whereas the Bayesian Weibull Proportional Hazard Model (BWPHM) is applied to develop the survival curves and to predict the failure rates of water mains. To accredit the proposed framework, it is implemented to predict the failure of cast iron (CI) and ductile iron (DI) pipes of the water distribution network of the City of Calgary, Alberta, Canada. Results indicate that the predicted 95% uncertainty bounds of the proposed BWPHMs capture effectively the observed breaks for both CI and DI water mains. Moreover, the performance of the proposed BWPHMs is better compared to the Cox Proportional Hazard Model (Cox-PHM), owing to the use of a Weibull distribution for the baseline hazard function and the consideration of model uncertainties. - Highlights: • Prioritize rehabilitation and replacements (R/R) strategies of water mains. • Consider the uncertainties for the failure prediction. • Improve the prediction capability of the water mains failure models. • Identify the influential and appropriate covariates for different models. • Determine the effects of the covariates on failure
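
    A brief sketch of the Weibull proportional hazards form underlying the BWPHM follows, showing how covariates shift the survival and hazard curves of a water main; the shape, scale and coefficients are invented rather than the Bayesian estimates from the study.

```python
# Hedged sketch of a Weibull proportional hazards survival/hazard curve for a
# water main, with hypothetical covariates and coefficients.
import numpy as np

def weibull_ph_survival(t, shape, scale, beta, x):
    """S(t|x) = exp(-(t/scale)^shape * exp(x.beta)) for a Weibull PH model."""
    lp = np.dot(x, beta)                       # linear predictor from covariates
    return np.exp(-((t / scale) ** shape) * np.exp(lp))

def weibull_ph_hazard(t, shape, scale, beta, x):
    """h(t|x) = (shape/scale) * (t/scale)^(shape-1) * exp(x.beta)."""
    lp = np.dot(x, beta)
    return (shape / scale) * (t / scale) ** (shape - 1) * np.exp(lp)

t = np.array([10.0, 25.0, 50.0, 75.0])         # pipe age in years
x_ci = np.array([1.0, 150.0])                  # [cast iron indicator, diameter in mm]
beta = np.array([0.40, -0.002])                # hypothetical covariate effects

print("S(t):", np.round(weibull_ph_survival(t, shape=1.8, scale=90.0, beta=beta, x=x_ci), 3))
print("h(t):", np.round(weibull_ph_hazard(t, shape=1.8, scale=90.0, beta=beta, x=x_ci), 4))
```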

  4. Predictive Models in Differentiating Vertebral Lesions Using Multiparametric MRI.

    Science.gov (United States)

    Rathore, R; Parihar, A; Dwivedi, D K; Dwivedi, A K; Kohli, N; Garg, R K; Chandra, A

    2017-12-01

    Conventional MR imaging has high sensitivity but limited specificity in differentiating various vertebral lesions. We aimed to assess the ability of multiparametric MR imaging in differentiating spinal vertebral lesions and to develop statistical models for predicting the probability of malignant vertebral lesions. One hundred twenty-six consecutive patients underwent multiparametric MRI (conventional MR imaging, diffusion-weighted MR imaging, and in-phase/opposed-phase imaging) for vertebral lesions. Vertebral lesions were divided into 3 subgroups: infectious, noninfectious benign, and malignant. The cutoffs for apparent diffusion coefficient (expressed as 10⁻³ mm²/s) and signal intensity ratio values were calculated, and 3 predictive models were established for differentiating these subgroups. Of the lesions of the 126 patients, 62 were infectious, 22 were noninfectious benign, and 42 were malignant. The mean ADC was 1.23 ± 0.16 for infectious, 1.41 ± 0.31 for noninfectious benign, and 1.01 ± 0.22 × 10⁻³ mm²/s for malignant lesions. The mean signal intensity ratio was 0.80 ± 0.13 for infectious, 0.75 ± 0.19 for noninfectious benign, and 0.98 ± 0.11 for the malignant group. The combination of ADC and signal intensity ratio showed strong discriminatory ability to differentiate lesion type. We found an area under the curve of 0.92 for the predictive model in differentiating infectious from malignant lesions and an area under the curve of 0.91 for the predictive model in differentiating noninfectious benign from malignant lesions. On the basis of the mean ADC and signal intensity ratio, we established automated statistical models that would be helpful in differentiating vertebral lesions. Our study shows that multiparametric MRI differentiates various vertebral lesions, and we established prediction models for the same. © 2017 by American Journal of Neuroradiology.
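
    In the same spirit, the sketch below fits a two-predictor logistic model on synthetic ADC and signal intensity ratio (SIR) values and reports an ROC AUC; the simulated group means only loosely echo those reported above, and the model is not the authors' automated statistical model.

```python
# Illustrative two-predictor classifier (ADC + SIR) on synthetic lesion data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)
n_benign, n_malig = 80, 40
adc = np.concatenate([rng.normal(1.25, 0.20, n_benign), rng.normal(1.01, 0.22, n_malig)])
sir = np.concatenate([rng.normal(0.79, 0.15, n_benign), rng.normal(0.98, 0.11, n_malig)])
y   = np.concatenate([np.zeros(n_benign), np.ones(n_malig)]).astype(int)

X = np.column_stack([adc, sir])
clf = LogisticRegression().fit(X, y)
auc = roc_auc_score(y, clf.predict_proba(X)[:, 1])
print(f"apparent AUC for ADC + SIR model: {auc:.2f}")
```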

  5. Predictive Modelling of Contagious Deforestation in the Brazilian Amazon

    Science.gov (United States)

    Rosa, Isabel M. D.; Purves, Drew; Souza, Carlos; Ewers, Robert M.

    2013-01-01

    Tropical forests are diminishing in extent due primarily to the rapid expansion of agriculture, but the future magnitude and geographical distribution of future tropical deforestation is uncertain. Here, we introduce a dynamic and spatially-explicit model of deforestation that predicts the potential magnitude and spatial pattern of Amazon deforestation. Our model differs from previous models in three ways: (1) it is probabilistic and quantifies uncertainty around predictions and parameters; (2) the overall deforestation rate emerges “bottom up”, as the sum of local-scale deforestation driven by local processes; and (3) deforestation is contagious, such that local deforestation rate increases through time if adjacent locations are deforested. For the scenarios evaluated–pre- and post-PPCDAM (“Plano de Ação para Proteção e Controle do Desmatamento na Amazônia”)–the parameter estimates confirmed that forests near roads and already deforested areas are significantly more likely to be deforested in the near future and less likely in protected areas. Validation tests showed that our model correctly predicted the magnitude and spatial pattern of deforestation that accumulates over time, but that there is very high uncertainty surrounding the exact sequence in which pixels are deforested. The model predicts that under pre-PPCDAM (assuming no change in parameter values due to, for example, changes in government policy), annual deforestation rates would halve between 2050 compared to 2002, although this partly reflects reliance on a static map of the road network. Consistent with other models, under the pre-PPCDAM scenario, states in the south and east of the Brazilian Amazon have a high predicted probability of losing nearly all forest outside of protected areas by 2050. This pattern is less strong in the post-PPCDAM scenario. Contagious spread along roads and through areas lacking formal protection could allow deforestation to reach the core, which is

  6. Predictive modelling of contagious deforestation in the Brazilian Amazon.

    Science.gov (United States)

    Rosa, Isabel M D; Purves, Drew; Souza, Carlos; Ewers, Robert M

    2013-01-01

    Tropical forests are diminishing in extent due primarily to the rapid expansion of agriculture, but the future magnitude and geographical distribution of future tropical deforestation is uncertain. Here, we introduce a dynamic and spatially-explicit model of deforestation that predicts the potential magnitude and spatial pattern of Amazon deforestation. Our model differs from previous models in three ways: (1) it is probabilistic and quantifies uncertainty around predictions and parameters; (2) the overall deforestation rate emerges "bottom up", as the sum of local-scale deforestation driven by local processes; and (3) deforestation is contagious, such that local deforestation rate increases through time if adjacent locations are deforested. For the scenarios evaluated-pre- and post-PPCDAM ("Plano de Ação para Proteção e Controle do Desmatamento na Amazônia")-the parameter estimates confirmed that forests near roads and already deforested areas are significantly more likely to be deforested in the near future and less likely in protected areas. Validation tests showed that our model correctly predicted the magnitude and spatial pattern of deforestation that accumulates over time, but that there is very high uncertainty surrounding the exact sequence in which pixels are deforested. The model predicts that under pre-PPCDAM (assuming no change in parameter values due to, for example, changes in government policy), annual deforestation rates would halve between 2050 compared to 2002, although this partly reflects reliance on a static map of the road network. Consistent with other models, under the pre-PPCDAM scenario, states in the south and east of the Brazilian Amazon have a high predicted probability of losing nearly all forest outside of protected areas by 2050. This pattern is less strong in the post-PPCDAM scenario. Contagious spread along roads and through areas lacking formal protection could allow deforestation to reach the core, which is currently
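
    A toy, spatially explicit sketch of the contagion idea described in the two records above: each forested cell's annual clearing probability grows with the number of already-deforested neighbours and shrinks under protection. The grid size, rates and protected block are invented; the published model is probabilistic and estimated from data.

```python
# Toy cellular sketch of "contagious" deforestation on a synthetic grid.
import numpy as np

rng = np.random.default_rng(4)
size, years = 50, 10
forest = np.ones((size, size), dtype=bool)          # True = forested
protected = np.zeros_like(forest)
protected[:, :15] = True                            # hypothetical protected block
forest[size // 2, size // 2] = False                # initial clearing

base_p, contagion, protection_factor = 0.002, 0.05, 0.1

for _ in range(years):
    cleared = ~forest
    # Count deforested 4-neighbours for every cell (wrap-around for simplicity).
    nbrs = (np.roll(cleared, 1, 0) + np.roll(cleared, -1, 0)
            + np.roll(cleared, 1, 1) + np.roll(cleared, -1, 1)).astype(float)
    p = base_p + contagion * nbrs
    p = np.where(protected, p * protection_factor, p)
    forest &= rng.random(forest.shape) >= p         # cells survive with prob 1 - p

print(f"forest remaining after {years} years: {forest.mean():.1%}")
```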

  7. Impact of the model resolution on the simulation of elevation-dependent warming in the Tibetan Plateau-Himalayas, Greater Alpine Region, and Rocky mountains

    Science.gov (United States)

    Palazzi, Elisa; Mortarini, Luca; Terzago, Silvia; von Hardenberg, Jost

    2017-04-01

    The enhancement of warming rates with elevation, the so-called elevation-dependent warming (EDW), is one of the clearest regional expressions of global warming. Real sentinels of climate and environmental changes, mountains have experienced more rapid and intense warming rates in the recent decades, leading to serious impacts on mountain ecosystems and downstream societies, some of which are already occurring. In this study we use the historical and scenario simulations of one state-of-the-art global climate model, the EC-Earth GCM, run at five different spatial resolutions, from ˜125 km to ˜16 km, to explore the existence, characteristics and driving mechanisms of EDW in three different mountain regions of the world - the Colorado Rocky Mountains, the Greater Alpine Region and the Tibetan Plateau-Himalayas. The aim of this study is twofold: to investigate the impact (if any) of increasing model resolution on the representation of EDW and to highlight possible differences in this phenomenon and its driving mechanisms in different mountain regions of the northern hemisphere. Preliminary results indicate that autumn (September to November) is the only season in which EDW is simulated by the model in both the maximum and the minimum temperature, in all three regions and across all model resolutions. Regional differences emerge in the other seasons: for example, the Tibetan Plateau-Himalayas is the only area in which EDW is detected in winter. As for the analysis of EDW drivers, we identify albedo and downward longwave radiation as being the most important variables for EDW, in all three areas considered and in all seasons. Further these results are robust to changes in model resolution, even though a clearer signal is associated with finer resolutions. We finally use the highest resolution EC-Earth simulations available (˜16 km) to identify what areas, within the three considered mountain ranges, are expected to undergo a significant reduction of snow or ice cover

  8. Predictive models in churn data mining: a review

    OpenAIRE

    García, David L.; Vellido Alcacena, Alfredo; Nebot Castells, M. Àngela

    2007-01-01

    The development of predictive models of customer abandonment plays a central role in any churn management strategy. These models can be developed using either qualitative approaches or can take a data-centred point of view. In the latter case, the use of Data Mining procedures and techniques can provide useful and actionable insights into the processes leading to abandonment. In this report, we provide a brief and structured review of some of the Data Mining approaches that have been put forw...

  9. Quantifying Confidence in Model Predictions for Hypersonic Aircraft Structures

    Science.gov (United States)

    2015-03-01

    [Only report front matter was harvested for this record: a figure on the falsification power of the posterior p-value approach for various sample sizes; Table 4.12 on correlations between model error parameters in simultaneous posterior samples (aerothermal model predictions and Glass and Hunt data); and a text fragment noting that model M1 is sampled using Latin Hypercube sampling, with a Markov Chain Monte Carlo (MCMC) algorithm called slice sampling employed for each sample.]

  10. The predictive performance and stability of six species distribution models.

    Directory of Open Access Journals (Sweden)

    Ren-Yan Duan

    Full Text Available Predicting species' potential geographical range by species distribution models (SDMs) is central to understanding their ecological requirements. However, the effects of using different modeling techniques need further investigation. In order to improve the prediction effect, we need to assess the predictive performance and stability of different SDMs. We collected the distribution data of five common tree species (Pinus massoniana, Betula platyphylla, Quercus wutaishanica, Quercus mongolica and Quercus variabilis) and simulated their potential distribution area using 13 environmental variables and six widely used SDMs: BIOCLIM, DOMAIN, MAHAL, RF, MAXENT, and SVM. Each model run was repeated 100 times (trials). We compared the predictive performance by testing the consistency between observations and simulated distributions and assessed the stability by the standard deviation, coefficient of variation, and the 99% confidence interval of Kappa and AUC values. The mean values of AUC and Kappa from MAHAL, RF, MAXENT, and SVM trials were similar and significantly higher than those from BIOCLIM and DOMAIN trials (p<0.05), while the associated standard deviations and coefficients of variation were larger for BIOCLIM and DOMAIN trials (p<0.05), and the 99% confidence intervals for AUC and Kappa values were narrower for MAHAL, RF, MAXENT, and SVM. Compared to BIOCLIM and DOMAIN, the other SDMs (MAHAL, RF, MAXENT, and SVM) had higher prediction accuracy, smaller confidence intervals, and were more stable and less affected by the random variable (randomly selected pseudo-absence points). According to the prediction performance and stability of SDMs, we can divide these six SDMs into two categories: a high performance and stability group including MAHAL, RF, MAXENT, and SVM, and a low performance and stability group consisting of BIOCLIM and DOMAIN. We highlight that choosing appropriate SDMs to address a specific problem is an important part of the modeling process.

  11. [Hyperspectrum based prediction model for nitrogen content of apple flowers].

    Science.gov (United States)

    Zhu, Xi-Cun; Zhao, Geng-Xing; Wang, Ling; Dong, Fang; Lei, Tong; Zhan, Bing

    2010-02-01

    The present paper aims to quantitatively retrieve nitrogen content in apple flowers, so as to provide an important basis for apple informationization management. Using an ASD FieldSpec 3 field spectrometer, the hyperspectral reflectivity of 120 apple flower samples at the full-bloom stage was measured and their nitrogen contents were analyzed. Based on the original spectrum and first derivative spectral characteristics of the apple flowers, correlation analysis was carried out between the original and first derivative spectral reflectivity and the nitrogen contents, so as to determine the sensitive bands. Based on characteristic spectral parameters, prediction models were built, optimized and tested. The results indicated that the nitrogen content of apple flowers was very significantly negatively correlated with the original spectral reflectance in the 374-696, 1340-1890 and 2052-2433 nm ranges, while in 736-913 nm it was very significantly positively correlated; the first derivative spectrum in 637-675 nm was very significantly negatively correlated, and in 676-746 nm very significantly positively correlated. All six spectral parameters established were significantly correlated with the nitrogen content of apple flowers. Through further comparison and selection, the prediction models built with the original spectral reflectance at 640 and 676 nm were determined to be the best for nitrogen content prediction of apple flowers. The test results showed that the coefficients of determination (R²) of the two models were 0.8258 and 0.8936, the total root mean square errors (RMSE) were 0.732 and 0.6386, and the slopes were 0.8361 and 1.0192, respectively. The models therefore produced the desired results for nitrogen content prediction of apple flowers, with average prediction accuracies of 92.9% and 94.0%. This study provides a theoretical basis and technical support for rapid apple flower nitrogen content prediction and nutrition diagnosis.

  12. Modeling a Predictive Energy Equation Specific for Maintenance Hemodialysis.

    Science.gov (United States)

    Byham-Gray, Laura D; Parrott, J Scott; Peters, Emily N; Fogerite, Susan Gould; Hand, Rosa K; Ahrens, Sean; Marcus, Andrea Fleisch; Fiutem, Justin J

    2017-03-01

    Hypermetabolism is theorized in patients diagnosed with chronic kidney disease who are receiving maintenance hemodialysis (MHD). We aimed to distinguish key disease-specific determinants of resting energy expenditure to create a predictive energy equation that more precisely establishes energy needs with the intent of preventing protein-energy wasting. For this 3-year multisite cross-sectional study (N = 116), eligible participants were diagnosed with chronic kidney disease and were receiving MHD for at least 3 months. Predictors for the model included weight, sex, age, C-reactive protein (CRP), glycosylated hemoglobin, and serum creatinine. The outcome variable was measured resting energy expenditure (mREE). Regression modeling was used to generate predictive formulas and Bland-Altman analyses to evaluate accuracy. The majority were male (60.3%), black (81.0%), and non-Hispanic (76.7%), and 23% were ≥65 years old. After screening for multicollinearity, the best predictive model of mREE (R² = 0.67) included weight, age, sex, and CRP. Two alternative models with acceptable predictability (R² = 0.66) were derived with glycosylated hemoglobin or serum creatinine. Based on Bland-Altman analyses, the maintenance hemodialysis equation that included CRP had the best precision, with the highest proportion of participants' predicted energy expenditure classified as accurate (61.2%) and with the lowest number of individuals with underestimation or overestimation. This study confirms disease-specific factors as key determinants of mREE in patients on MHD and provides a preliminary predictive energy equation. Further prospective research is necessary to test the reliability and validity of this equation across diverse populations of patients who are receiving MHD.
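
    The Bland-Altman accuracy check reads naturally as a few lines of code: bias, 95% limits of agreement, and the share of predictions falling within a tolerance of the measured mREE. The values and the ±10% "accurate" threshold below are assumptions for illustration, not the study's exact criterion.

```python
# Small sketch of a Bland-Altman comparison of predicted vs. measured resting
# energy expenditure, with invented values.
import numpy as np

mree      = np.array([1450, 1620, 1380, 1710, 1550, 1490], dtype=float)  # kcal/d
predicted = np.array([1500, 1580, 1420, 1650, 1600, 1460], dtype=float)

diff = predicted - mree
bias = diff.mean()
loa = (bias - 1.96 * diff.std(ddof=1), bias + 1.96 * diff.std(ddof=1))
accurate = np.mean(np.abs(diff) / mree <= 0.10)        # assumed +/-10% tolerance

print(f"bias = {bias:.0f} kcal/d, 95% limits of agreement = "
      f"({loa[0]:.0f}, {loa[1]:.0f}), accurate predictions = {accurate:.0%}")
```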

  13. Validation of an internal hardwood log defect prediction model

    Science.gov (United States)

    R. Edward. Thomas

    2011-01-01

    The type, size, and location of internal defects dictate the grade and value of lumber sawn from hardwood logs. However, acquiring internal defect knowledge with x-ray/computed-tomography or magnetic-resonance imaging technology can be expensive both in time and cost. An alternative approach uses prediction models based on correlations among external defect indicators...

  14. Transferring the Malaria Epidemic Prediction Model to Users in East ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Transferring the Malaria Epidemic Prediction Model to Users in East Africa. In the highlands of East Africa, epidemic malaria is an emerging climate-related hazard that urgently needs addressing. Malaria incidence increased by 337% during the 1987 epidemic in Rwanda. In Tanzania, Uganda and Kenya, malaria incidence ...

  15. Mathematical models for prediction of safety factors for a simply ...

    African Journals Online (AJOL)

    From the results obtained, mathematical prediction models were developed using a least square regression analysis for bending, shear and deflection modes of failure considered in the study. The results showed that the safety factors for material, dead and live load are not unique, but they are influenced by safety index ...

  16. Predictive Model Equations for Palm Kernel (Elaeis guneensis J ...

    African Journals Online (AJOL)

    A 3-factor experimental design was used to determine the influence of moisture content, roasting duration and temperature on palm kernel and sesame oil colours. Four levels each of these parameters were used. The data obtained were used to develop prediction models for palm kernel and sesame oil colours. Coefficient ...

  17. Large-area dry bean yield prediction modeling in Mexico

    Science.gov (United States)

    Given the importance of dry bean in Mexico, crop yield predictions before harvest are valuable for authorities of the agricultural sector, in order to define support for producers. The aim of this study was to develop an empirical model to estimate the yield of dry bean at the regional level prior t...

  18. Predictive ability of egg production models | Oni | Nigerian Journal of ...

    African Journals Online (AJOL)

    The monthly egg production data of a strain of Rhode Island chickens were used to compare three mathematical models (the Parabolic exponential, Wood's Gamma and modified Gamma by McNally) on their ability to predict 52 week total egg production from part-production at 16, 20, and 24 weeks, on a hen-housed basis.

  19. Model predictive control for cooperative control of space robots

    Science.gov (United States)

    Kannan, Somasundar; Alamdari, Seyed Amin Sajadi; Dentler, Jan; Olivares-Mendez, Miguel A.; Voos, Holger

    2017-01-01

    The problem of orbital manipulation of a passive body is discussed here. Two scenarios are considered: a passive object rigidly attached to robotic servicers, and a passive body attached to the servicers through manipulators. The Model Predictive Control (MPC) technique is briefly presented and successfully tested through simulations on two cases of position control of a passive body in orbit.

  20. Economic Model Predictive Control for Smart Energy Systems

    DEFF Research Database (Denmark)

    Halvgaard, Rasmus

    Model Predictive Control (MPC) can be used to control the energy distribution in a Smart Grid with a high share of stochastic energy production from renewable energy sources like wind. Heat pumps for heating residential buildings can exploit the slow heat dynamics of a building to store heat...
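
    A minimal sketch of the economic MPC idea for a heat pump follows: choose heat inputs over a 24-hour horizon that minimise electricity cost while a simple first-order building model keeps indoor temperature above a comfort bound. This shows a single open-loop solve; receding-horizon use would repeat it at every step. All parameters, prices and bounds are invented for illustration.

```python
# Hedged sketch of economic MPC for a heat pump with a first-order thermal model
# T[k+1] = a*T[k] + b*u[k] + (1-a)*T_out, solved as a linear program.
import numpy as np
from scipy.optimize import linprog

N = 24                                   # horizon (hours)
a, b = 0.9, 0.4                          # assumed building dynamics
T0, T_out, T_min, u_max = 21.0, 5.0, 20.0, 5.0
price = 0.2 + 0.15 * np.sin(np.arange(N) * 2 * np.pi / 24)   # hypothetical price profile

# Predicted free response (no heating) and input-response matrix G such that
# T[k+1] = free[k] + sum_j G[k, j] * u[j].
free = np.empty(N)
G = np.zeros((N, N))
Tk = T0
for k in range(N):
    Tk = a * Tk + (1 - a) * T_out
    free[k] = Tk
    for j in range(k + 1):
        G[k, j] = b * a ** (k - j)

# Comfort constraint T[k+1] >= T_min  <=>  -G u <= free - T_min.
res = linprog(c=price, A_ub=-G, b_ub=free - T_min,
              bounds=[(0.0, u_max)] * N, method="highs")
print("total cost (arbitrary units):", round(res.fun, 2))
print("heat input profile:", np.round(res.x, 2))
```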