WorldWideScience

Sample records for models predicting trophic

  1. Trophic dynamics of a simple model ecosystem.

    Science.gov (United States)

    Bell, Graham; Fortier-Dubois, Étienne

    2017-09-13

    We have constructed a model of community dynamics that is simple enough to enumerate all possible food webs, yet complex enough to represent a wide range of ecological processes. We use the transition matrix to predict the outcome of succession and then investigate how the transition probabilities are governed by resource supply and immigration. Low-input regimes lead to simple communities whereas trophically complex communities develop when there is an adequate supply of both resources and immigrants. Our interpretation of trophic dynamics in complex communities hinges on a new principle of mutual replenishment, defined as the reciprocal alternation of state in a pair of communities linked by the invasion and extinction of a shared species. Such neutral couples are the outcome of succession under local dispersal and imply that food webs will often be made up of suites of trophically equivalent species. When immigrants arrive from an external pool of fixed composition a similar principle predicts a dynamic core of webs constituting a neutral interchange network, although communities may express an extensive range of other webs whose membership is only in part predictable. The food web is not in general predictable from whole-community properties such as productivity or stability, although it may profoundly influence these properties. © 2017 The Author(s).
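
    The succession forecast described above amounts to iterating (or finding the fixed point of) a transition matrix over community states. A minimal sketch of that calculation, with an invented three-state matrix (states and probabilities are illustrative only, not values from the study):

```python
import numpy as np

# Hypothetical transition matrix among three community states
# (rows = current state, columns = next state); values are illustrative only.
P = np.array([
    [0.60, 0.30, 0.10],   # from a "bare" community
    [0.10, 0.70, 0.20],   # from a "simple" food web
    [0.05, 0.15, 0.80],   # from a "trophically complex" food web
])

# Long-run (stationary) distribution: eigenvector of P.T for eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
stationary = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
stationary = stationary / stationary.sum()
print("Predicted long-run state frequencies:", stationary.round(3))
```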

  2. Modelling emergent trophic strategies in plankton

    DEFF Research Database (Denmark)

    Andersen, Ken Haste; Aksnes, Dag L.; Berge, Terje

    2015-01-01

    Plankton are typically divided into phytoplankton and zooplankton in marine ecosystem models. Yet, most protists in the photic zone engage in some degree of phagotrophy, and it has been suggested that trophic strategy is really a continuum between pure phototrophs (phytoplankton) and pure...

  3. Damped trophic cascades driven by fishing in model marine ecosystems

    DEFF Research Database (Denmark)

    Andersen, Ken Haste; Pedersen, Martin

    2010-01-01

    The largest perturbation on upper trophic levels of many marine ecosystems stems from fishing. The reaction of the ecosystem goes beyond the trophic levels directly targeted by the fishery. This reaction has been described either as a change in slope of the overall size spectrum or as a trophic cascade triggered by the removal of top predators. Here we use a novel size- and trait-based model to explore how marine ecosystems might react to perturbations from different types of fishing pressure. The model explicitly resolves the whole life history of fish, from larvae to adults. The results show that fishing does not change the overall slope of the size spectrum, but depletes the largest individuals and induces trophic cascades. A trophic cascade can propagate both up and down in trophic levels driven by a combination of changes in predation mortality and food limitation. The cascade is damped…

  4. Use of Landsat data to predict the trophic state of Minnesota lakes

    Science.gov (United States)

    Lillesand, T. M.; Johnson, W. L.; Deuell, R. L.; Lindstrom, O. M.; Meisner, D. E.

    1983-01-01

    Near-concurrent Landsat Multispectral Scanner (MSS) and ground data were obtained for 60 lakes distributed in two Landsat scene areas. The ground data included measurement of Secchi disk depth, chlorophyll-a, total phosphorus, turbidity, color, and total nitrogen, as well as Carlson Trophic State Index (TSI) values derived from the first three parameters. The Landsat data best correlated with the TSI values. Prediction models were developed to classify some 100 'test' lakes appearing in the two analysis scenes on the basis of TSI estimates. Clouds, wind, poor image data, small lake size, and shallow lake depth caused some problems in lake TSI prediction. Overall, however, the Landsat-predicted TSI estimates were judged to be very reliable for the Secchi-derived TSI estimation, moderately reliable for prediction of the chlorophyll-a TSI, and unreliable for the phosphorus value. Numerous Landsat data extraction procedures were compared, and the success of the Landsat TSI prediction models was a strong function of the procedure employed.
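
    The Carlson TSI values referred to above are conventionally computed from Secchi depth, chlorophyll-a and total phosphorus with logarithmic formulas. A small sketch using the commonly cited Carlson (1977) forms (units assumed: Secchi depth in metres, chlorophyll-a and total phosphorus in µg/L; the example lake values are invented):

```python
import math

def tsi_secchi(sd_m):
    """Carlson TSI from Secchi disk depth (m)."""
    return 60.0 - 14.41 * math.log(sd_m)

def tsi_chlorophyll(chl_ug_l):
    """Carlson TSI from chlorophyll-a (ug/L)."""
    return 9.81 * math.log(chl_ug_l) + 30.6

def tsi_total_phosphorus(tp_ug_l):
    """Carlson TSI from total phosphorus (ug/L)."""
    return 14.42 * math.log(tp_ug_l) + 4.15

# Example: a moderately productive lake (illustrative values)
print(round(tsi_secchi(2.0), 1),
      round(tsi_chlorophyll(7.0), 1),
      round(tsi_total_phosphorus(24.0), 1))
```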

  5. Modeling lake trophic state: a random forest approach

    Science.gov (United States)

    Productivity of lentic ecosystems has been well studied and it is widely accepted that as nutrient inputs increase, productivity increases and lakes transition from low trophic state (e.g. oligotrophic) to higher trophic states (e.g. eutrophic). These broad trophic state classi...
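
    A minimal sketch of the random-forest approach to trophic state classification, using scikit-learn and synthetic predictor data (the predictors, class thresholds and parameter choices are placeholders, not those of the study):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic lake predictors (illustrative only): e.g. total N, total P, depth, elevation
X = rng.lognormal(mean=0.0, sigma=1.0, size=(500, 4))
# Synthetic trophic-state classes: 0 = oligotrophic, 1 = mesotrophic, 2 = eutrophic
y = np.digitize(X[:, 1], bins=[0.7, 2.0])

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=500, random_state=0)
model.fit(X_train, y_train)
print("Held-out accuracy:", model.score(X_test, y_test))
print("Variable importances:", model.feature_importances_.round(3))
```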

  6. Predicting Trophic Interactions and Habitat Utilization in the California Current Ecosystem

    Science.gov (United States)

    2015-09-30

    Foraging patterns: model-data comparison. Simulated (colored circles) and observed (black circles) foraging locations for male sea lion individuals off… focusing on trophic interactions affecting habitat utilization and foraging patterns of California sea lions (CSL) in the California Current Large Marine… by considering patterns of covariability between environmental variables (e.g., temperature, primary production) and foraging patterns and success of…

  7. Assessing Lake Trophic Status: A Proportional Odds Logistic Regression Model

    Science.gov (United States)

    Lake trophic state classifications are good predictors of ecosystem condition and are indicative of both ecosystem services (e.g., recreation and aesthetics), and disservices (e.g., harmful algal blooms). Methods for classifying trophic state are based on the foundational work o...
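
    A proportional odds (ordered) logistic regression for trophic state can be sketched as follows. One way to fit it is statsmodels' OrderedModel (recent statsmodels versions); the predictor, thresholds and class labels below are synthetic, invented for illustration:

```python
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(1)

# Synthetic data (illustrative only): log nutrient concentration drives an
# ordered trophic-state response: oligo < meso < eu < hyper(eutrophic).
n = 400
log_tp = rng.normal(0.0, 1.0, n)
latent = 1.5 * log_tp + rng.logistic(size=n)
state = pd.cut(latent, bins=[-np.inf, -1.0, 1.0, 3.0, np.inf],
               labels=["oligo", "meso", "eu", "hyper"], ordered=True)

model = OrderedModel(state, pd.DataFrame({"log_tp": log_tp}), distr="logit")
result = model.fit(method="bfgs", disp=False)
print(result.summary())
```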

  8. Modeling of nitrogen transformation in an integrated multi-trophic aquaculture (IMTA)

    Science.gov (United States)

    Silfiana; Widowati; Putro, S. P.; Udjiani, T.

    2018-03-01

    A dynamic model of nitrogen transformation in IMTA (Integrated Multi-Trophic Aquaculture) is proposed. IMTA is a polyculture in which several biota are maintained together to optimize waste recycling as a food source. The purpose of this paper is to predict nitrogen decrease and nitrogen transformation in IMTA, considering ammonia (NH3), nitrite (NO2) and nitrate (NO3). Nitrogen transformation involves several processes: nitrification, assimilation, and volatilization. Numerical simulations are performed by providing initial parameters and values based on a review of previous research. The numerical results show that the nitrogen concentrations in IMTA decrease and reach stable levels at different times.
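
    The kind of first-order nitrogen-transformation dynamics described (nitrification of ammonia to nitrite and nitrate, with volatilization and assimilation losses) can be sketched as a small ODE system; the rate constants and initial concentrations below are placeholders, not the paper's calibrated values:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative first-order rate constants (1/day); values are invented.
k_nitrif_1 = 0.30   # NH3 -> NO2 (nitrification step 1)
k_nitrif_2 = 0.50   # NO2 -> NO3 (nitrification step 2)
k_volat    = 0.05   # NH3 volatilization loss
k_assim    = 0.10   # NO3 assimilation by cultured biota/algae

def nitrogen(t, y):
    nh3, no2, no3 = y
    d_nh3 = -(k_nitrif_1 + k_volat) * nh3
    d_no2 = k_nitrif_1 * nh3 - k_nitrif_2 * no2
    d_no3 = k_nitrif_2 * no2 - k_assim * no3
    return [d_nh3, d_no2, d_no3]

sol = solve_ivp(nitrogen, t_span=(0, 60), y0=[10.0, 0.5, 1.0],
                t_eval=np.linspace(0, 60, 7))
print(sol.t)
print(sol.y.round(3))  # concentrations decline toward stable levels
```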

  9. Trophic strategies of unicellular plankton

    DEFF Research Database (Denmark)

    Chakraborty, Subhendu; Nielsen, Lasse Tor; Andersen, Ken Haste

    2017-01-01

    Unicellular plankton employ trophic strategies ranging from pure photoautotrophs over mixotrophy to obligate heterotrophs (phagotrophs), with cell sizes from 10⁻⁸ to 1 μg C. A full understanding of how trophic strategy and cell size depend on resource environment and predation is lacking… To this end, we develop and calibrate a trait-based model for unicellular planktonic organisms characterized by four traits: cell size and investments in phototrophy, nutrient uptake, and phagotrophy. We use the model to predict how optimal trophic strategies depend on cell size under various environmental… unicellulars are colimited by organic carbon and nutrients, and only large photoautotrophs and smaller mixotrophs are nutrient limited; (2) trophic strategy is bottom-up selected by the environment, while optimal size is top-down selected by predation. The focus on cell size and trophic strategies facilitates…

  10. The exploration of trophic structure modeling using mass balance Ecopath model of Tangerang coastal waters

    Science.gov (United States)

    Dewi, N. N.; Kamal, M.; Wardiatno, Y.; Rozi

    2018-04-01

    The Ecopath model approach was used to describe trophic interaction, energy flows and ecosystem condition of Tangerang coastal waters. This model consists of 42 ecological groups, of which 41 are living groups and one is a detritus group. Trophic levels of these groups vary between 1.0 (for primary producers and detritus) and 4.03 (for Tetraodontidae). Groups with trophic levels 2 ≤ TL … fish, while detritus has a positive impact on the majority of demersal fish. Leiognathidae have a negative impact on phytoplankton, zooplankton and several other groups. System omnivory index for this ecosystem is 0.151. System primary production/respiration (P/R) ratio of Tangerang coastal waters is 1.505. This coastal ecosystem is an immature ecosystem because it has degraded. Pedigree index for this model is 0.57. This model describes an ecosystem condition affected by overfishing and anthropogenic activities. Therefore, through the Ecopath model we provide some suggestions about ecosystem-based fisheries management.
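
    Ecopath mass balance rests on a per-group budget of the form B_i (P/B)_i EE_i = Σ_j B_j (Q/B)_j DC_ji + Y_i + other exports, where EE_i is the ecotrophic efficiency. A toy sketch of that bookkeeping for a three-group web (all numbers invented, not taken from the Tangerang model):

```python
import numpy as np

# Toy 3-group example (phytoplankton, zooplankton, fish); all numbers are illustrative.
B  = np.array([20.0, 5.0, 1.0])     # biomass (t/km^2)
PB = np.array([100.0, 20.0, 2.0])   # production/biomass (1/yr)
QB = np.array([0.0, 60.0, 8.0])     # consumption/biomass (1/yr)
Y  = np.array([0.0, 0.0, 0.5])      # fishery catch (t/km^2/yr)
# Diet composition DC[j, i] = fraction of predator j's diet made up of prey i
DC = np.array([
    [0.0, 0.0, 0.0],   # phytoplankton eats nothing
    [1.0, 0.0, 0.0],   # zooplankton eats phytoplankton
    [0.2, 0.8, 0.0],   # fish eat some phytoplankton, mostly zooplankton
])

predation = (B * QB) @ DC            # total consumption of each prey group
production = B * PB
EE = (predation + Y) / production    # ecotrophic efficiency of each group
print("Ecotrophic efficiencies:", EE.round(3))  # a balanced model needs EE <= 1
```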

  11. Sensitivity of secondary production and export flux to choice of trophic transfer formulation in marine ecosystem models

    Science.gov (United States)

    Anderson, Thomas R.; Hessen, Dag O.; Mitra, Aditee; Mayor, Daniel J.; Yool, Andrew

    2013-09-01

    The performance of four contemporary formulations describing trophic transfer, which make strongly contrasting assumptions about how consumer growth is calculated as a function of food C:N ratio and about the fate of non-limiting substrates, was compared in two settings: a simple steady-state ecosystem model and a 3D biogeochemical general circulation model. Considerable variation was seen in predictions for primary production, transfer to higher trophic levels and export to the ocean interior. The physiological basis of the various assumptions underpinning the chosen formulations is open to question. Assumptions include Liebig-style limitation of growth, strict homeostasis in zooplankton biomass, and whether excess C and N are released by voiding in faecal pellets or via respiration/excretion post-absorption by the gut. Deciding upon the most appropriate means of formulating trophic transfer is not straightforward because, despite advances in ecological stoichiometry, the physiological mechanisms underlying these phenomena remain incompletely understood. Nevertheless, worrying inconsistencies are evident in the way in which fundamental transfer processes are justified and parameterised in the current generation of marine ecosystem models, manifested in the resulting simulations of ocean biogeochemistry. Our work highlights the need for modellers to revisit and appraise the equations and parameter values used to describe trophic transfer in marine ecosystem models.
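
    A minimal sketch of why the choice of trophic-transfer formulation matters: two simplified ways of turning ingested food of varying C:N ratio into consumer growth, one Liebig-style (limited by the scarcer element) and one with a fixed gross growth efficiency. The functional forms and parameter values are illustrative assumptions, not the four formulations compared in the paper:

```python
# Minimal sketch contrasting two ways a model might compute zooplankton growth
# from ingested food of varying C:N ratio (all values are illustrative).

def growth_liebig(ingest_c, food_cn, consumer_cn=5.5, k_c=0.6, k_n=0.8):
    """Growth limited by whichever element (C or N) is in shortest supply."""
    ingest_n = ingest_c / food_cn
    growth_c_limited = k_c * ingest_c                 # growth allowed by carbon
    growth_n_limited = k_n * ingest_n * consumer_cn   # carbon growth allowed by nitrogen
    return min(growth_c_limited, growth_n_limited)

def growth_fixed_gge(ingest_c, gge=0.3):
    """Growth as a fixed fraction of ingested carbon, ignoring stoichiometry."""
    return gge * ingest_c

for cn in (5.0, 8.0, 15.0):
    print(cn, round(growth_liebig(1.0, cn), 3), growth_fixed_gge(1.0))
```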

  12. Successional changes in trophic interactions support a mechanistic model of post-fire population dynamics.

    Science.gov (United States)

    Smith, Annabel L

    2018-01-01

    Models based on functional traits have limited power in predicting how animal populations respond to disturbance because they do not capture the range of demographic and biological factors that drive population dynamics, including variation in trophic interactions. I tested the hypothesis that successional changes in vegetation structure, which affected invertebrate abundance, would influence growth rates and body condition in the early-successional, insectivorous gecko Nephrurus stellatus. I captured geckos at 17 woodland sites spanning a succession gradient from 2 to 48 years post-fire. Body condition and growth rates were analysed as a function of the best-fitting fire-related predictor (invertebrate abundance or time since fire) with different combinations of the co-variates age, sex and location. Body condition in the whole population was positively affected by increasing invertebrate abundance and, in the adult population, this effect was most pronounced for females. There was strong support for a decline in growth rates in weight with time since fire. The results suggest that increased early-successional invertebrate abundance has filtered through to a higher trophic level with physiological benefits for insectivorous geckos. I integrated the new findings about trophic interactions into a general conceptual model of mechanisms underlying post-fire population dynamics based on a long-term research programme. The model highlights how greater food availability during early succession could drive rapid population growth by contributing to previously reported enhanced reproduction and dispersal. This study provides a framework to understand links between ecological and physiological traits underlying post-fire population dynamics.

  13. A trophic model of fringing coral reefs in Nanwan Bay, southern Taiwan suggests overfishing.

    Science.gov (United States)

    Liu, Pi-Jen; Shao, Kwang-Tsao; Jan, Rong-Quen; Fan, Tung-Yung; Wong, Saou-Lien; Hwang, Jiang-Shiou; Chen, Jen-Ping; Chen, Chung-Chi; Lin, Hsing-Juh

    2009-09-01

    Several coral reefs of Nanwan Bay, Taiwan have recently undergone shifts to macroalgal or sea anemone dominance. Thus, a mass-balance trophic model was constructed to analyze the structure and functioning of the food web. The fringing reef model was comprised of 18 compartments, with the highest trophic level of 3.45 for piscivorous fish. Comparative analyses with other reef models demonstrated that Nanwan Bay was similar to reefs with high fishery catches. While coral biomass was not lower, fish biomass was lower than those of reefs with high catches. Consequently, the sums of consumption and respiratory flows and total system throughput were also decreased. The Nanwan Bay model potentially suggests an overfished status in which the mean trophic level of the catch, matter cycling, and trophic transfer efficiency are extremely reduced.

  14. Trophic modeling of Eastern Boundary Current Systems: a review and prospectus for solving the “Peruvian Puzzle”

    Directory of Open Access Journals (Sweden)

    Marc H. Taylor

    2013-04-01

    Eastern Boundary Current systems (EBCSs) are among the most productive fishing areas in the world. High primary and secondary productivity supports a large biomass of small planktivorous pelagic fish, “small pelagics”, which are important drivers of production to the entire system whereby they can influence both higher and lower trophic levels. Environmental variability causes changes in plankton (food quality and quantity), which can affect population sizes, distribution and dominance among small pelagics. This variability combined with impacts from the fishery complicate the development of management strategies. Consequently, much recent work has been in the development of multispecies trophic models to better understand interdependencies and system dynamics. Despite similarities in extent, structure and primary productivity between EBCSs, the Peruvian system greatly differs from the others in the magnitude of fish catches, due mainly to the incredible production of the anchovy Engraulis ringens. This paper reviews literature concerning EBCS dynamics and the state-of-the-art in the trophic modeling of EBCSs. The objective is to critically analyze the potential of this approach for system understanding and management and to adapt existing steady-state models of the Peruvian system for use in (future) dynamic simulations. A guideline for the construction of trophodynamic models is presented taking into account the important trophic and environmental interactions. In consideration of the importance of small pelagics for the system dynamics, emphasis is placed on developing appropriate model compartmentalization and spatial delineation that facilitates dynamic simulations. Methods of model validation to historical changes are presented to support hypotheses concerning EBCS dynamics and as a critical step to the development of predictive models. Finally, the identification of direct model links to easily obtainable abiotic parameters is…

  15. Modelling for an improved integrated multi-trophic aquaculture system for the production of highly valued marine species

    Directory of Open Access Journals (Sweden)

    Luana Granada

    2014-05-01

    Integrated multi-trophic aquaculture (IMTA) is regarded as a suitable approach to limit aquaculture nutrients and organic matter outputs through biomitigation. Here, species from different trophic or nutritional levels are connected through water transfer. The co-cultured species are used as biofilters, and each level has its own independent commercial value, providing both economic and environmental sustainability. In order to better understand and optimize aquaculture production systems, dynamic modelling has been developed towards the use of models for analysis and simulation of aquacultures. Several models available determine the carrying capacity of farms and the environmental effects of bivalve and fish aquaculture. Also, in the last two decades, modelling strategies have been designed in order to predict the dispersion and deposition of organic fish farm waste, usually using the mean settling velocity of faeces and feed pellets. Cultured organisms' growth, effects of light and temperature on algae growth, retention of suspended solids, biodegradation of nitrogen and wastewater treatment are examples of other modelled parameters in aquaculture. Most modelling equations have been developed for monocultures, despite the increasing importance of multi-species systems, such as polyculture and IMTA systems. The main reason for the development of multi-species models is to maximize the production and optimize species combinations in order to reduce the environmental impacts of aquaculture. Some multi-species system models are available, including from the polyculture of different species of bivalves with fish to more complex systems with four trophic levels. These can incorporate ecosystem models and use dynamic energy budgets for each trophic group. In the proposed IMTA system, the bioremediation potential of the marine seaweed Gracilaria vermiculophylla (nutrient removal performance) and the Mediterranean filter-feeding polychaete Sabella…

  16. Trophic models: What do we learn about Celtic Sea and Bay of Biscay ecosystems?

    Science.gov (United States)

    Moullec, Fabien; Gascuel, Didier; Bentorcha, Karim; Guénette, Sylvie; Robert, Marianne

    2017-08-01

    Trophic models are key tools to go beyond the single-species approaches used in stock assessments to adopt a more holistic view and implement the Ecosystem Approach to Fisheries Management (EAFM). This study aims to: (i) analyse the trophic functioning of the Celtic Sea and the Bay of Biscay, (ii) investigate ecosystem changes over the 1980-2013 period and, (iii) explore the response to management measures at the food web scale. Ecopath models were built for each ecosystem for years 1980 and 2013, and Ecosim models were fitted to time series data of biomass and catches. EcoTroph diagnosis showed that in both ecosystems, fishing pressure focuses on high trophic levels (TLs) and, to a lesser extent, on intermediate TLs. However, the interplay between local environmental conditions, species composition and ecosystem functioning could explain the different responses to fisheries management observed between these two contiguous ecosystems. Indeed, over the study period, the ecosystem's exploitation status has improved in the Bay of Biscay but not in the Celtic Sea. This improvement does not seem to be sufficient to achieve the objectives of an EAFM, as high trophic levels were still overexploited in 2013 and simulations conducted with Ecosim in the Bay of Biscay indicate that at current fishing effort the biomass will not be rebuilt by 2030. The ecosystem's response to a reduction in fishing mortality depends on which trophic levels receive protection. Reducing fishing mortality on pelagic fish, instead of on demersal fish, appears more efficient at maximising catch and total biomass and at conserving both top-predator and intermediate TLs. Such advice-oriented trophic models should be used on a regular basis to monitor the health status of marine food webs and analyse the trade-offs between multiple objectives in an ecosystem-based fisheries management context.

  17. Infusing considerations of trophic dependencies into species distribution modelling.

    Science.gov (United States)

    Trainor, Anne M; Schmitz, Oswald J

    2014-12-01

    Community ecology involves studying the interdependence of species with each other and their environment to predict their geographical distribution and abundance. Modern species distribution analyses characterise species-environment dependency well, but offer only crude approximations of species interdependency. Typically, the dependency between focal species and other species is characterised using other species' point occurrences as spatial covariates to constrain the focal species' predicted range. This implicitly assumes that the strength of interdependency is homogeneous across space, which is not generally supported by analyses of species interactions. This discrepancy has an important bearing on the accuracy of inferences about habitat suitability for species. We introduce a framework that integrates principles from consumer-resource analyses, resource selection theory and species distribution modelling to enhance quantitative prediction of species geographical distributions. We show how to apply the framework using a case study of lynx and snowshoe hare interactions with each other and their environment. The analysis shows how the framework offers a spatially refined understanding of species distribution that is sensitive to nuances in biophysical attributes of the environment that determine the location and strength of species interactions. © 2014 John Wiley & Sons Ltd/CNRS.

  18. Trophic modeling of the Northern Humboldt Current Ecosystem, Part I: Comparing trophic linkages under La Niña and El Niño conditions

    Science.gov (United States)

    Tam, Jorge; Taylor, Marc H.; Blaskovic, Verónica; Espinoza, Pepe; Michael Ballón, R.; Díaz, Erich; Wosnitza-Mendo, Claudia; Argüelles, Juan; Purca, Sara; Ayón, Patricia; Quipuzcoa, Luis; Gutiérrez, Dimitri; Goya, Elisa; Ochoa, Noemí; Wolff, Matthias

    2008-10-01

    The El Niño of 1997-98 was one of the strongest warming events of the past century; among many other effects, it impacted phytoplankton along the Peruvian coast by changing species composition and reducing biomass. While responses of the main fish resources to this natural perturbation are relatively well known, understanding the ecosystem response as a whole requires an ecotrophic multispecies approach. In this work, we construct trophic models of the Northern Humboldt Current Ecosystem (NHCE) and compare the La Niña (LN) years in 1995-96 with the El Niño (EN) years in 1997-98. The model area extends from 4°S-16°S and to 60 nm from the coast. The model consists of 32 functional groups of organisms and differs from previous trophic models of the Peruvian system through: (i) division of plankton into size classes to account for EN-associated changes and feeding preferences of small pelagic fish, (ii) increased division of demersal groups and separation of life history stages of hake, (iii) inclusion of mesopelagic fish, and (iv) incorporation of the jumbo squid (Dosidicus gigas), which became abundant following EN. Results show that EN reduced the size and organization of energy flows of the NHCE, but the overall functioning (proportion of energy flows used for respiration, consumption by predators, detritus and export) of the ecosystem was maintained. The reduction of diatom biomass during EN forced omnivorous planktivorous fish to switch to a more zooplankton-dominated diet, raising their trophic level. Consequently, in the EN model the trophic level increased for several predatory groups (mackerel, other large pelagics, sea birds, pinnipeds) and for fishery catch. A high modeled biomass of macrozooplankton was needed to balance the consumption by planktivores, especially during EN conditions when observed diatom biomass diminished dramatically. Despite overall lower planktivorous fish catches, the higher primary production required-to-catch ratio implied a…

  19. Trophic position and metabolic rate predict the long-term decay process of radioactive cesium in fish: a meta-analysis.

    Directory of Open Access Journals (Sweden)

    Hideyuki Doi

    Understanding the long-term behavior of radionuclides in organisms is important for estimating possible associated risks to human beings and ecosystems. As radioactive cesium (¹³⁷Cs) can be accumulated in organisms and has a long physical half-life, it is very important to understand its long-term decay in organisms; however, the underlying mechanisms determining the decay process are little known. We performed a meta-analysis to collect published data on the long-term ¹³⁷Cs decay process in fish species to estimate biological (metabolic rate) and ecological (trophic position, habitat, and diet type) influences on this process. From the linear mixed models, we found that (1) trophic position could predict the day of maximum ¹³⁷Cs activity concentration in fish; and (2) the metabolic rate of the fish species and environmental water temperature could predict ecological half-lives and decay rates for fish species. These findings revealed that ecological and biological traits are important to predict the long-term decay process of ¹³⁷Cs activity concentration in fish.
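
    The ecological half-lives discussed above are usually combined with the physical half-life of ¹³⁷Cs to give an effective first-order decay. A small sketch of that arithmetic (the ecological half-life and activity values in the example are invented):

```python
import math

T_PHYS_CS137 = 30.17 * 365.25   # physical half-life of 137Cs, in days

def effective_half_life(t_ecological_days):
    """Combine physical and ecological half-lives (both first-order)."""
    return 1.0 / (1.0 / T_PHYS_CS137 + 1.0 / t_ecological_days)

def activity(a0, t_days, t_ecological_days):
    """Activity concentration after t_days, given initial value a0."""
    lam = math.log(2) / effective_half_life(t_ecological_days)
    return a0 * math.exp(-lam * t_days)

# Example: a fish population with an (assumed) ecological half-life of 400 days
print(round(effective_half_life(400.0), 1), "days effective half-life")
print(round(activity(100.0, 365.0, 400.0), 1), "Bq/kg after one year")
```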

  20. A model of trophic flows in the northern Benguela upwelling system during the 1980s

    DEFF Research Database (Denmark)

    Shannon, L.J.; Jarre, Astrid

    1999-01-01

    A model of trophic flows through the northern Benguela between 1980 and 1989 was constructed using the ECOPATH approach. The model serves to close the temporal gap between models of the system for the 1970s and 1990s. The aim is to provide a workable model, with the intention of encouraging… in the northern Benguela during the 1980s was high, comparable to that of the Peruvian system in the 1960s and almost double that of the northern Benguela during the 1970s. Horse mackerel and hake catches were both high, with fishing on hake being ecologically more expensive. Biomass of benthic producers, meio…

  1. Spatially Explicit Modeling Reveals Cephalopod Distributions Match Contrasting Trophic Pathways in the Western Mediterranean Sea.

    Directory of Open Access Journals (Sweden)

    Patricia Puerta

    Populations of the same species can experience different responses to the environment throughout their distributional range as a result of spatial and temporal heterogeneity in habitat conditions. This highlights the importance of understanding the processes governing species distribution at local scales. However, research on species distribution often averages environmental covariates across large geographic areas, missing variability in population-environment interactions within geographically distinct regions. We used spatially explicit models to identify interactions between species and environmental, including chlorophyll a (Chla) and sea surface temperature (SST), and trophic (prey density) conditions, along with processes governing the distribution of two cephalopods with contrasting life-histories (octopus and squid) across the western Mediterranean Sea. This approach is relevant for cephalopods, since their population dynamics are especially sensitive to variations in habitat conditions and rarely stable in abundance and location. The regional distributions of the two cephalopod species matched two different trophic pathways present in the western Mediterranean Sea, associated with the Gulf of Lion upwelling and the Ebro river discharges respectively. The effects of the studied environmental and trophic conditions were spatially variant in both species, with usually stronger effects along their distributional boundaries. We identify areas where prey availability limited the abundance of cephalopod populations as well as contrasting effects of temperature in the warmest regions. Despite distributional patterns matching productive areas, a general negative effect of Chla on cephalopod densities suggests that competition pressure is common in the study area. Additionally, results highlight the importance of trophic interactions, beyond other common environmental factors, in shaping the distribution of cephalopod populations. Our study presents…

  2. Modelling impacts of offshore wind farms on trophic web: the Courseulles-sur-Mer case study

    Science.gov (United States)

    Raoux, Aurore; Pezy, Jean-Philippe; Dauvin, Jean-Claude; Tecchio, Samuele; Degraer, Steven; Wilhelmsson, Dan; Niquil, Nathalie

    2016-04-01

    The French government is planning the construction of three offshore wind farms in Normandy. These offshore wind farms will integrate into an ecosystem already subject to a growing number of anthropogenic disturbances such as transportation, fishing, sediment deposit, and sediment extraction. The possible effects of these cumulative stressors on ecosystem functioning are still unknown, but they could impact their resilience, making them susceptible to changes from one stable state to another. Understanding the behaviour of these marine coastal complex systems is essential in order to anticipate potential state changes, and to implement conservation actions in a sustainable manner. Currently, there are no global and integrated studies on the effects of construction and exploitation of offshore wind farms. Moreover, approaches are generally focused on the conservation of some species or groups of species. Here, we develop a holistic and integrated view of ecosystem impacts through the use of trophic web modelling tools. Trophic models describe the interaction between biological compartments at different trophic levels and are based on the quantification of flow of energy and matter in ecosystems. They allow the application of numerical methods for the characterization of emergent properties of the ecosystem, also called Ecological Network Analysis (ENA). These indices have been proposed as ecosystem health indicators as they have been demonstrated to be sensitive to different impacts on marine ecosystems. We present here in detail the strategy for analysing the potential environmental impacts of the construction of the Courseulles-sur-Mer offshore wind farm (Bay of Seine) such as the reef effect through the use of the Ecopath with Ecosim software. Similar Ecopath simulations will be made in the future on the Le Tréport offshore wind farm site. Results will contribute to a better knowledge of the impacts of the offshore wind farms on ecosystems. They also allow to…

  3. Predicting the effects of climate change on trophic status of three morphologically varying lakes: Implications for lake restoration and management

    DEFF Research Database (Denmark)

    Trolle, Dennis; Hamilton, David P.; Pilditch, Conrad A.

    2011-01-01

    To quantify the effects of a future climate on three morphologically different lakes that varied in trophic status from oligo-mesotrophic to highly eutrophic, we applied the one-dimensional lake ecosystem model DYRESM-CAEDYM to oligo-mesotrophic Lake Okareka, eutrophic Lake Rotoehu, both in the t… Therefore, future climate effects should be taken into account in the long-term planning and implementation of lake management as strategies may need to be refined and adapted to preserve or improve the present-day lake water quality…

  4. Description of the East Brazil Large Marine Ecosystem using a trophic model

    Directory of Open Access Journals (Sweden)

    Kátia M.F. Freire

    2008-09-01

    The objective of this study was to describe the marine ecosystem off northeastern Brazil. A trophic model was constructed for the 1970s using Ecopath with Ecosim. The impact of most of the forty-one functional groups was modest, probably due to the highly reticulated diet matrix. However, seagrass and macroalgae exerted a strong positive impact on manatee and herbivorous reef fishes, respectively. A high negative impact of omnivorous reef fishes on spiny lobsters and of sharks on swordfish was observed. Spiny lobsters and swordfish had the largest biomass changes for the simulation period (1978-2000); tunas, other large pelagics and sharks showed intermediate rates of biomass decline; and a slight increase in biomass was observed for toothed cetaceans, large carnivorous reef fishes, and dolphinfish. Recycling was an important feature of this ecosystem with low phytoplankton-originated primary production. The mean transfer efficiency between trophic levels was 11.4%. The gross efficiency of the fisheries was very low (0.00002), probably due to the low exploitation rate of most of the resources in the 1970s. Basic local information was missing for many groups. When information gaps are filled, this model may serve more credibly for the exploration of fishing policies for this area within an ecosystem approach.

  5. Holistic assessment of Chwaka Bay's multi-gear fishery - Using a trophic modeling approach

    Science.gov (United States)

    Rehren, Jennifer; Wolff, Matthias; Jiddawi, Narriman

    2018-04-01

    East African coastal communities highly depend on marine resources for not just income but also protein supply. The multi-species, multi-gear nature of East African fisheries makes this type of fishery particularly difficult to manage, as there is a trade-off between maximizing total catch from all gears and species and minimizing overfishing of target species and the disintegration of the ecosystem. The use and spatio-temporal overlap of multiple gears in Chwaka Bay (Zanzibar) has led to severe conflicts between fishermen. There is a general concern of overfishing in the bay because of the widespread use of small mesh sizes and destructive gears such as dragnets and spear guns. We constructed an Ecopath food web model to describe the current trophic flow structure and fishing pattern of the bay. Based on this model, we explored the impact of different gears on the ecosystem and the fishing community in order to give advice for gear based management in the bay. Results indicate that Chwaka Bay is a productive, shallow-water system, with biomass concentrations around the first and second trophic level. The system is greatly bottom-up driven and dominated by primary producers and invertebrates. The trophic and network indicators as well as the community energetics characterize Chwaka Bay as relatively mature. Traps and dragnets have the strongest impact on the ecosystem and on the catches obtained by other gears. Both gears potentially destabilize the ecosystem by reducing the biomass of top-down controlling key species (including important herbivores of macroalgae). The dragnet fishery is the least profitable, but provides most jobs for the fishing community. Thus, a complete ban of dragnets in the bay would require the provision of alternative livelihoods. Due to the low resource biomass of fish in the bay and the indication of a loss of structural control of certain fish groups, Chwaka Bay does not seem to provide scope for further expansion of the fishery

  6. Assessment, modelization and analysis of ¹⁰⁶Ru experimental transfers through a freshwater trophic system

    International Nuclear Information System (INIS)

    Vray, F.

    1994-01-01

    Experiments are carried out in order to study ¹⁰⁶Ru transfers through a freshwater ecosystem including 2 abiotic compartments (water and sediment) and 3 trophic levels (10 species). Experimental results are expressed mathematically so that they can be included into a global model which is then tested in two different situations. The comparison of the available data concerning the in situ measured concentrations to the corresponding calculated ones validates the whole procedure. Analysis of the so-validated results sheds light on the ruthenium distribution process in the environment. The rare detection of this radionuclide in organisms living in areas contaminated by known meaningful releases can be explained by a relatively high detection limit and by a slight role of the sediment as a secondary contamination source. (author). 78 figs., 18 tabs

  7. Trophic flow structure of a neotropical estuary in northeastern Brazil and the comparison of ecosystem model indicators of estuaries

    Science.gov (United States)

    Lira, Alex; Angelini, Ronaldo; Le Loc'h, François; Ménard, Frédéric; Lacerda, Carlos; Frédou, Thierry; Lucena Frédou, Flávia

    2018-06-01

    We developed an Ecopath model for the Estuary of Sirinhaém River (SIR), a small-sized system surrounded by mangroves and subject to high impact, mainly from the sugar cane and other farming industries, in order to describe the food web structure and trophic interactions. In addition, we compared our findings with those of 20 available Ecopath estuarine models for tropical, subtropical and temperate regions, aiming to synthesize the knowledge on trophic dynamics and provide a comprehensive analysis of the structures and functioning of estuaries. Our model consisted of 25 compartments and its indicators were within the expected range for estuarine areas around the world. The average trophic transfer efficiency for the entire system was 11.8%, similar to the theoretical value of 10%. The Keystone Index and MTI (Mixed Trophic Impact) analysis indicated that the snook (Centropomus undecimalis and Centropomus parallelus) and jack (Caranx latus and Caranx hippos) are considered as key resources in the system, revealing their high impact in the food web. Both groups have a high ecological and commercial relevance, despite the unregulated fisheries. As a result of the comparison of ecosystem model indicators in estuaries, differences in the ecosystem structure from the low latitude zones (tropical estuaries) to the high latitude zones (temperate system) were noticed. The structure of temperate and sub-tropical estuaries is based on high flows of detritus and export, while tropical systems have high biomass, respiration and consumption rates. Higher values of System Omnivory Index (SOI) and Overhead (SO) were observed in the tropical and subtropical estuaries, denoting a more complex food chain. Globally, none of the estuarine models were classified as fully mature ecosystems, although the tropical ecosystems were considered more mature than the subtropical and temperate ecosystems. This study is an important contribution to the trophic modeling of estuaries, which may also help…

  8. Predicting species distribution and abundance responses to climate change: why it is essential to include biotic interactions across trophic levels.

    Science.gov (United States)

    Van der Putten, Wim H; Macel, Mirka; Visser, Marcel E

    2010-07-12

    Current predictions on species responses to climate change strongly rely on projecting altered environmental conditions on species distributions. However, it is increasingly acknowledged that climate change also influences species interactions. We review and synthesize literature information on biotic interactions and use it to argue that the abundance of species and the direction of selection during climate change vary depending on how their trophic interactions become disrupted. Plant abundance can be controlled by aboveground and belowground multitrophic level interactions with herbivores, pathogens, symbionts and their enemies. We discuss how these interactions may alter during climate change and the resulting species range shifts. We suggest conceptual analogies between species responses to climate warming and exotic species introduced in new ranges. There are also important differences: the herbivores, pathogens and mutualistic symbionts of range-expanding species and their enemies may co-migrate, and the continuous gene flow under climate warming can make adaptation in the expansion zone of range expanders different from that of cross-continental exotic species. We conclude that under climate change, results of altered species interactions may vary, ranging from species becoming rare to disproportionately abundant. Taking these possibilities into account will provide a new perspective on predicting species distribution under climate change.

  9. Study of a tri-trophic prey-dependent food chain model of interacting populations.

    Science.gov (United States)

    Haque, Mainul; Ali, Nijamuddin; Chakravarty, Santabrata

    2013-11-01

    The current paper accounts for the influence of intra-specific competition among predators in a prey-dependent tri-trophic food chain model of interacting populations. We offer a detailed mathematical analysis of the proposed food chain model to illustrate some of the significant results that have arisen from the interplay of deterministic ecological phenomena and processes. Biologically feasible equilibria of the system are observed and the behaviours of the system around each of them are described. In particular, persistence, stability (local and global) and bifurcation (saddle-node, transcritical, Hopf-Andronov) analyses of this model are obtained. Relevant results from previous well known food chain models are compared with the current findings. Global stability analysis is also carried out by constructing appropriate Lyapunov functions. Numerical simulations show that the present system is capable enough to produce chaotic dynamics when the rate of self-interaction is very low. On the other hand, such chaotic behaviour disappears for a certain value of the rate of self-interaction. In addition, numerical simulations with experimental parameter values confirm the analytical results and show that intra-specific competition plays a potential role in controlling the chaotic dynamics of the system; thus, the role of self-interactions in a food chain model is illustrated for the first time. Finally, a discussion of the ecological applications of the analytical and numerical findings concludes the paper. Copyright © 2013 Elsevier Inc. All rights reserved.
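
    A sketch of a tri-trophic chain with optional intraspecific competition in the two predator levels, showing the kind of chaos-damping effect described; it uses a Hastings-Powell-type parameterization rather than the authors' exact equations, and the competition coefficients are illustrative:

```python
import numpy as np
from scipy.integrate import solve_ivp

def chain(t, s, c1=0.0, c2=0.0):
    """Hastings-Powell-type tri-trophic chain with optional intraspecific
    competition (c1, c2) among the two predator levels."""
    x, y, z = s
    f1 = 5.0 * x / (1.0 + 3.0 * x)      # prey -> intermediate predator
    f2 = 0.1 * y / (1.0 + 2.0 * y)      # intermediate -> top predator
    dx = x * (1.0 - x) - f1 * y
    dy = f1 * y - f2 * z - 0.4 * y - c1 * y * y
    dz = f2 * z - 0.01 * z - c2 * z * z
    return [dx, dy, dz]

t_eval = np.linspace(0, 2000, 4000)
chaotic = solve_ivp(chain, (0, 2000), [0.8, 0.2, 10.0], t_eval=t_eval)
damped  = solve_ivp(chain, (0, 2000), [0.8, 0.2, 10.0], t_eval=t_eval,
                    args=(0.05, 0.05))
print("Top-predator range, no self-limitation:",
      chaotic.y[2].min().round(2), chaotic.y[2].max().round(2))
print("Top-predator range, with self-limitation:",
      damped.y[2].min().round(2), damped.y[2].max().round(2))
```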

  10. Trophic State and Toxic Cyanobacteria Density in Optimization Modeling of Multi-Reservoir Water Resource Systems

    Directory of Open Access Journals (Sweden)

    Andrea Sulis

    2014-04-01

    The definition of a synthetic index for classifying the quality of water bodies is a key aspect in integrated planning and management of water resource systems. In previous works [1,2], a water system optimization modeling approach that requires a single quality index for stored water in reservoirs has been applied to a complex multi-reservoir system. Considering the same modeling field, this paper presents an improved quality index estimated both on the basis of the overall trophic state of the water body and on the basis of the density values of the most potentially toxic Cyanobacteria. The implementation of the index into the optimization model makes it possible to reproduce the conditions limiting water use due to excessive nutrient enrichment in the water body and to the health hazard linked to toxic blooms. The analysis of an extended limnological database (1996–2012) in four reservoirs of the Flumendosa-Campidano system (Sardinia, Italy) provides useful insights into the strengths and limitations of the proposed synthetic index.

  11. Trophic state and toxic cyanobacteria density in optimization modeling of multi-reservoir water resource systems.

    Science.gov (United States)

    Sulis, Andrea; Buscarinu, Paola; Soru, Oriana; Sechi, Giovanni M

    2014-04-22

    The definition of a synthetic index for classifying the quality of water bodies is a key aspect in integrated planning and management of water resource systems. In previous works [1,2], a water system optimization modeling approach that requires a single quality index for stored water in reservoirs has been applied to a complex multi-reservoir system. Considering the same modeling field, this paper presents an improved quality index estimated both on the basis of the overall trophic state of the water body and on the basis of the density values of the most potentially toxic Cyanobacteria. The implementation of the index into the optimization model makes it possible to reproduce the conditions limiting water use due to excessive nutrient enrichment in the water body and to the health hazard linked to toxic blooms. The analysis of an extended limnological database (1996-2012) in four reservoirs of the Flumendosa-Campidano system (Sardinia, Italy) provides useful insights into the strengths and limitations of the proposed synthetic index.

  12. Testing the generality of a trophic-cascade model for plague

    Science.gov (United States)

    Collinge, S.K.; Johnson, W.C.; Ray, C.; Matchett, R.; Grensten, J.; Cully, J.F.; Gage, K.L.; Kosoy, M.Y.; Loye, J.E.; Martin, A.P.

    2005-01-01

    Climate may affect the dynamics of infectious diseases by shifting pathogen, vector, or host species abundance, population dynamics, or community interactions. Black-tailed prairie dogs (Cynomys ludovicianus) are highly susceptible to plague, yet little is known about factors that influence the dynamics of plague epizootics in prairie dogs. We investigated temporal patterns of plague occurrence in black-tailed prairie dogs to assess the generality of links between climate and plague occurrence found in previous analyses of human plague cases. We examined long-term data on climate and plague occurrence in prairie dog colonies within two study areas. Multiple regression analyses revealed that plague occurrence in prairie dogs was not associated with climatic variables in our Colorado study area. In contrast, plague occurrence was strongly associated with climatic variables in our Montana study area. The models with most support included a positive association with precipitation in April-July of the previous year, in addition to a positive association with the number of "warm" days and a negative association with the number of "hot" days in the same year as reported plague events. We conclude that the timing and magnitude of precipitation and temperature may affect plague occurrence in some geographic areas. The best climatic predictors of plague occurrence in prairie dogs within our Montana study area are quite similar to the best climatic predictors of human plague cases in the southwestern United States. This correspondence across regions and species suggests support for a (temperature-modulated) trophic-cascade model for plague, including climatic effects on rodent abundance, flea abundance, and pathogen transmission, at least in regions that experience strong climatic signals. © 2005 EcoHealth Journal Consortium.

  13. Predictive modeling of complications.

    Science.gov (United States)

    Osorio, Joseph A; Scheer, Justin K; Ames, Christopher P

    2016-09-01

    Predictive analytic algorithms are designed to identify patterns in the data that allow for accurate predictions without the need for a hypothesis. Therefore, predictive modeling can provide detailed and patient-specific information that can be readily applied when discussing the risks of surgery with a patient. There are few studies using predictive modeling techniques in the adult spine surgery literature. These types of studies represent the beginning of the use of predictive analytics in spine surgery outcomes. We will discuss the advancements in the field of spine surgery with respect to predictive analytics, the controversies surrounding the technique, and the future directions.

  14. Exploring the Use of Multimedia Fate and Bioaccumulation Models to Calculate Trophic Magnification Factors (TMFs)

    Science.gov (United States)

    The trophic magnification factor (TMF) is considered to be a key metric for assessing the bioaccumulation potential of organic chemicals in food webs. Fugacity is an equilibrium criterion and thus reflects the relative thermodynamic status of a chemical in the environment and in ...
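
    TMFs are commonly estimated as the antilog of the slope from a regression of log10 chemical concentration on trophic level. A minimal sketch of that calculation with invented food-web data:

```python
import numpy as np

# Illustrative data: lipid-normalized chemical concentrations (ng/g) and
# trophic levels for organisms in one food web (values are invented).
trophic_level = np.array([1.0, 2.0, 2.5, 3.2, 3.8, 4.3])
concentration = np.array([1.2, 3.5, 6.0, 18.0, 35.0, 90.0])

slope, intercept = np.polyfit(trophic_level, np.log10(concentration), 1)
tmf = 10.0 ** slope
print(f"TMF = {tmf:.2f}")  # TMF > 1 indicates biomagnification up the food web
```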

  15. Archaeological predictive model set.

    Science.gov (United States)

    2015-03-01

    This report is the documentation for Task 7 of the Statewide Archaeological Predictive Model Set. The goal of this project is to develop a set of statewide predictive models to assist the planning of transportation projects. PennDOT is developing t...

  16. Wind power prediction models

    Science.gov (United States)

    Levy, R.; Mcginness, H.

    1976-01-01

    Investigations were performed to predict the power available from the wind at the Goldstone, California, antenna site complex. The background for power prediction was derived from a statistical evaluation of available wind speed data records at this location and at nearby locations similarly situated within the Mojave desert. In addition to a model for power prediction over relatively long periods of time, an interim simulation model that produces sample wind speeds is described. The interim model furnishes uncorrelated sample speeds at hourly intervals that reproduce the statistical wind distribution at Goldstone. A stochastic simulation model to provide speed samples representative of both the statistical speed distributions and correlations is also discussed.
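
    A minimal sketch of the kind of interim simulation described: draw uncorrelated hourly wind-speed samples from an assumed distribution and convert them to available wind power with P = ½ρAv³. The Weibull parameters, air density and rotor area below are illustrative assumptions, not the fitted Goldstone values:

```python
import numpy as np

rng = np.random.default_rng(42)

RHO = 1.2      # air density (kg/m^3), illustrative value
AREA = 100.0   # swept rotor area (m^2), illustrative value

# Hourly wind-speed samples from a Weibull distribution (shape/scale are assumptions).
shape, scale = 2.0, 7.0
speeds = scale * rng.weibull(shape, size=24 * 365)

power_w = 0.5 * RHO * AREA * speeds ** 3   # kinetic power carried by the wind
print("Mean wind speed (m/s):", speeds.mean().round(2))
print("Mean available power (kW):", (power_w.mean() / 1e3).round(2))
```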

  17. Establishment and metabolic analysis of a model microbial community for understanding trophic and electron accepting interactions of subsurface anaerobic environments

    Directory of Open Access Journals (Sweden)

    Yang Zamin K

    2010-05-01

    Background: Communities of microorganisms control the rates of key biogeochemical cycles, and are important for biotechnology, bioremediation, and industrial microbiological processes. For this reason, we constructed a model microbial community comprised of three species dependent on trophic interactions. The three-species microbial community was comprised of Clostridium cellulolyticum, Desulfovibrio vulgaris Hildenborough, and Geobacter sulfurreducens and was grown under continuous culture conditions. Cellobiose served as the carbon and energy source for C. cellulolyticum, whereas D. vulgaris and G. sulfurreducens derived carbon and energy from the metabolic products of cellobiose fermentation and were provided with sulfate and fumarate respectively as electron acceptors. Results: qPCR monitoring of the culture revealed C. cellulolyticum to be dominant as expected and confirmed the presence of D. vulgaris and G. sulfurreducens. Proposed metabolic modeling of carbon and electron flow of the three-species community indicated that the growth of C. cellulolyticum and D. vulgaris were electron-donor limited whereas G. sulfurreducens was electron-acceptor limited. Conclusions: The results demonstrate that C. cellulolyticum, D. vulgaris, and G. sulfurreducens can be grown in coculture in a continuous culture system in which D. vulgaris and G. sulfurreducens are dependent upon the metabolic byproducts of C. cellulolyticum for nutrients. This represents a step towards developing a tractable model ecosystem comprised of members representing the functional groups of a trophic network.

  18. Inverse and Predictive Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Syracuse, Ellen Marie [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-27

    The LANL Seismo-Acoustic team has a strong capability in developing data-driven models that accurately predict a variety of observations. These models range from the simple – one-dimensional models that are constrained by a single dataset and can be used for quick and efficient predictions – to the complex – multidimensional models that are constrained by several types of data and result in more accurate predictions. While team members typically build models of geophysical characteristics of Earth and source distributions at scales of 1 to 1000s of km, the techniques used are applicable to other types of physical characteristics at an even greater range of scales. The following cases provide a snapshot of some of the modeling work done by the Seismo-Acoustic team at LANL.

  19. Echinocandin treatment of pneumocystis pneumonia in rodent models depletes cysts leaving trophic burdens that cannot transmit the infection.

    Directory of Open Access Journals (Sweden)

    Melanie T Cushion

    2010-01-01

    Fungi in the genus Pneumocystis cause pneumonia (PCP) in hosts with debilitated immune systems and are emerging as co-morbidity factors associated with chronic diseases such as COPD. Limited therapeutic choices and poor understanding of the life cycle are a result of the inability of these fungi to grow outside the mammalian lung. Within the alveolar lumen, Pneumocystis spp. appear to have a bi-phasic life cycle consisting of an asexual phase characterized by binary fission of trophic forms and a sexual cycle resulting in formation of cysts, but the life cycle stage that transmits the infection is not known. The cysts, but not the trophic forms, express β-1,3-D-glucan synthetase and contain abundant β-1,3-D-glucan. Here we show that therapeutic and prophylactic treatment of PCP with echinocandins, compounds which inhibit the synthesis of β-1,3-D-glucan, depleted cysts in rodent models of PCP, while sparing the trophic forms which remained in significant numbers. Survival was enhanced in the echinocandin-treated mice, likely due to the decreased β-1,3-D-glucan content in the lungs of treated mice and rats which coincided with reductions of cyst numbers, and dramatic remodeling of organism morphology. Strong evidence for the cyst as the agent of transmission was provided by the failure of anidulafungin-treated mice to transmit the infection. We show for the first time that withdrawal of anidulafungin treatment with continued immunosuppression permitted the repopulation of cyst forms. Treatment of PCP with an echinocandin alone will not likely result in eradication of infection and cessation of echinocandin treatment while the patient remains immunosuppressed could result in relapse. Importantly, the echinocandins provide novel and powerful chemical tools to probe the still poorly understood bi-phasic life cycle of this genus of fungal pathogens.

  20. Trophic mass-balance model of Alaska's Prince William Sound ecosystem, for the post-spill period 1994-1996

    International Nuclear Information System (INIS)

    Okey, T.A.; Pauly, D.

    1998-01-01

    The Ecopath modelling approach for the Prince William Sound (PWS) ecosystem was described. The area is the site of the 1989 Exxon Valdez oil spill (EVOS), the largest spill in U.S. history. 36,000 tonnes of crude oil spread throughout the central and southwestern PWS into the Gulf of Alaska and along the Kenai and Alaska Peninsula. The initial effects of the oil spill were catastrophic. The Ecopath modelling approach discussed in this report is aimed at providing a cohesive picture of the PWS ecosystem by constructing a mass-balanced model of food-web interactions and trophic flows using information collected since the EVOS. The model includes all biotic components of the ecosystem and provides a quantitative description of food-web interactions and relationships, as well as energy flows among components. The model can provide an understanding of how ecosystems respond to disturbances, such as oil spills. 216 refs., 74 tabs., 13 figs., 8 appendices

  1. Seasonal Trophic Shift of Littoral Consumers in Eutrophic Lake Taihu (China) Revealed by a Two-Source Mixing Model

    Directory of Open Access Journals (Sweden)

    Qiong Zhou

    2011-01-01

    We evaluated the seasonal variation in the contributions of planktonic and benthic resources to 11 littoral predators in eutrophic Lake Taihu (China) from 2004 to 2005. Seasonal fluctuations in consumer δ13C and δ15N were attributed to the combined impacts of temporal variation in isotopic signatures of basal resources and the diet shift of fishes. Based on a two-end-member mixing model, all target consumers relied on energy sources from coupled benthic and planktonic pathways, but the predominant energy source for most species was highly variable across seasons, showing seasonal trophic shift of littoral consumers. Seasonality in energy mobilization of consumers focused on two aspects: (1) the species number of consumers that relied mainly on planktonic carbon showed the lowest values in the fall and the highest during spring/summer, and (2) most consumer species showed seasonal variation in the percentages of planktonic reliance. We concluded that seasonal trophic shifts of fishes and invertebrates were driven by phytoplankton production, but benthic resources were also important seasonally in supporting littoral consumers in Meiliang Bay. Energy mobilization of carnivorous fishes was more subject to the impact of resource availability than omnivorous species.
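
    A two-end-member (two-source) mixing model of the kind used here reduces to simple linear mixing arithmetic on δ13C. A minimal sketch (the end-member values and trophic fractionation are illustrative assumptions, not the study's values):

```python
def planktonic_reliance(d13c_consumer, d13c_planktonic, d13c_benthic,
                        fractionation_per_tl=0.4, trophic_steps=1):
    """Two-end-member d13C mixing model: fraction of a consumer's carbon
    derived from the planktonic pathway (result is clipped to 0-1)."""
    corrected = d13c_consumer - fractionation_per_tl * trophic_steps
    frac = (corrected - d13c_benthic) / (d13c_planktonic - d13c_benthic)
    return min(max(frac, 0.0), 1.0)

# Illustrative end-members (per mil): seston vs. benthic primary producers
print(planktonic_reliance(d13c_consumer=-24.0,
                          d13c_planktonic=-27.0,
                          d13c_benthic=-18.0))
```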

  2. Water trophicity of Utricularia microhabitats identified by means of SOFM as a tool in ecological modeling

    Directory of Open Access Journals (Sweden)

    Piotr Kosiba

    2011-01-01

    Full Text Available The study objects were 48 microhabitats of five Utricularia species in Lower and Upper Silesia (Poland). The aim of the paper was to focus on the application of the Self-Organizing Feature Map (SOFM) to the assessment of water trophicity in Utricularia microhabitats, and to describe how SOFM can be used in the study of ecological subjects. This method was compared with the hierarchical tree plot of cluster analysis to check whether the two techniques give similar results. In effect, both the topological map of SOFM and the dendrogram of cluster analysis show differences between the microhabitats of Utricularia species in respect of water quality, from eutrophic for U. vulgaris to dystrophic for U. minor and U. intermedia. The two methods give similar results and constitute a validation of the SOFM method in this type of study.
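
    For context, a SOFM clusters samples by iteratively pulling a grid of prototype vectors toward the data; the topological map referred to above is the trained grid. Below is a minimal, generic SOFM sketch in Python/numpy under simplified assumptions (random sample presentation, linearly decaying learning rate and neighbourhood radius); it is not the software used in the study.

        import numpy as np

        def train_sofm(data, grid=(5, 5), n_iter=2000, lr0=0.5, sigma0=2.0, seed=0):
            """Train a small self-organizing feature map; returns prototype vectors on a 2-D grid."""
            rng = np.random.default_rng(seed)
            n_units = grid[0] * grid[1]
            coords = np.array([(i, j) for i in range(grid[0]) for j in range(grid[1])], float)
            weights = rng.uniform(data.min(0), data.max(0), size=(n_units, data.shape[1]))
            for t in range(n_iter):
                x = data[rng.integers(len(data))]                    # present one random sample
                bmu = np.argmin(((weights - x) ** 2).sum(axis=1))    # best-matching unit
                frac = t / n_iter
                lr, sigma = lr0 * (1 - frac), sigma0 * (1 - frac) + 0.5
                d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)       # map-space distance to the BMU
                h = np.exp(-d2 / (2 * sigma ** 2))                   # neighbourhood function
                weights += lr * h[:, None] * (x - weights)
            return weights.reshape(grid[0], grid[1], -1)

        # hypothetical water-chemistry matrix: 48 microhabitats x 6 variables
        prototypes = train_sofm(np.random.default_rng(1).normal(size=(48, 6)))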

  3. Competition influence in the segregation of the trophic niche of otariids: a case study using isotopic Bayesian mixing models in Galapagos pinnipeds.

    Science.gov (United States)

    Páez-Rosas, Diego; Rodríguez-Pérez, Mónica; Riofrío-Lazo, Marjorie

    2014-12-15

    The feeding success of predators is associated with the level of competition for resources, and thus sympatric species are exposed to potential trophic overlap. Isotopic Bayesian mixing models should provide a better understanding of the contribution of prey to the diets of predators and of the feeding behavior of a species over time. The carbon and nitrogen isotopic signatures of pup hair samples from 93 Galapagos sea lions and 48 Galapagos fur seals collected between 2003 and 2009 in different regions (east and west) of the archipelago were analyzed. A PDZ Europa ANCA-GSL elemental analyzer interfaced with a PDZ Europa 20-20 continuous-flow gas-source mass spectrometer was employed. Bayesian models, SIAR and SIBER, were used to estimate the contribution of prey to the diet of the predators, the niche breadth, and the level of trophic overlap between the populations. Statistical differences in the isotopic values of both predators were observed over time. The mixing model determined that Galapagos fur seals had a primarily teuthophagous diet, whereas the Galapagos sea lions fed exclusively on fish in both regions of the archipelago. The SIBER analysis showed differences in the trophic niche between the two sea lion populations, with the western rookery of the Galapagos sea lion being the population with the largest trophic niche area. A trophic niche partitioning between Galapagos fur seals and Galapagos sea lions in the west of the archipelago is suggested by our results. At the intraspecific level, the western population of the Galapagos sea lion (ZwW) showed greater trophic breadth than the eastern population, a strategy adopted by the ZwW to decrease interspecific competition in the western region. Copyright © 2014 John Wiley & Sons, Ltd.
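
    The niche-area comparison above rests on the standard ellipse fitted to each population's δ13C-δ15N values. A minimal Python sketch of that calculation with hypothetical data is shown below; it gives the ellipse area from the sample covariance matrix and omits the small-sample correction and Bayesian estimation used in SIBER proper.

        import numpy as np

        def standard_ellipse_area(d13c, d15n):
            """Area (per mil squared) of the standard ellipse of a consumer's isotopic niche."""
            cov = np.cov(np.vstack([d13c, d15n]))    # 2 x 2 covariance matrix
            eigvals = np.linalg.eigvalsh(cov)        # squared semi-axis lengths of the ellipse
            return np.pi * np.sqrt(eigvals[0] * eigvals[1])

        # hypothetical pup-hair isotope values for one rookery
        rng = np.random.default_rng(1)
        print(standard_ellipse_area(rng.normal(-16, 0.5, 30), rng.normal(14, 0.8, 30)))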

  4. Network structure beyond food webs: mapping non-trophic and trophic interactions on Chilean rocky shores.

    Science.gov (United States)

    Sonia Kéfi; Berlow, Eric L; Wieters, Evie A; Joppa, Lucas N; Wood, Spencer A; Brose, Ulrich; Navarrete, Sergio A

    2015-01-01

    How multiple types of non-trophic interactions map onto trophic networks in real communities remains largely unknown. We present the first effort, to our knowledge, describing a comprehensive ecological network that includes all known trophic and diverse non-trophic links among >100 coexisting species for the marine rocky intertidal community of the central Chilean coast. Our results suggest that non-trophic interactions exhibit highly nonrandom structures both alone and with respect to food web structure. The occurrence of different types of interactions, relative to all possible links, was well predicted by trophic structure and simple traits of the source and target species. In this community, competition for space and positive interactions related to habitat/refuge provisioning by sessile and/or basal species were by far the most abundant non-trophic interactions. If these patterns are corroborated in other ecosystems, they may suggest potentially important dynamic constraints on the combined architecture of trophic and non-trophic interactions. The nonrandom patterning of non-trophic interactions suggests a path forward for developing a more comprehensive ecological network theory to predict the functioning and resilience of ecological communities.

  5. Predicting lake trophic state by relating Secchi-disk transparency measurements to Landsat-satellite imagery for Michigan inland lakes, 2003-05 and 2007-08

    Science.gov (United States)

    Fuller, L.M.; Jodoin, R.S.; Minnerick, R.J.

    2011-01-01

    Inland lakes are an important economic and environmental resource for Michigan. The U.S. Geological Survey and the Michigan Department of Natural Resources and Environment have been cooperatively monitoring the quality of selected lakes in Michigan through the Lake Water Quality Assessment program. Sampling for this program began in 2001; by 2010, 730 of Michigan’s 11,000 inland lakes are expected to have been sampled once. Volunteers coordinated by the Michigan Department of Natural Resources and Environment began sampling lakes in 1974 and continue to sample (in 2010) approximately 250 inland lakes each year through the Michigan Cooperative Lakes Monitoring Program. Despite these sampling efforts, it still is impossible to physically collect measurements for all Michigan inland lakes; however, Landsat-satellite imagery has been used successfully in Minnesota, Wisconsin, Michigan, and elsewhere to predict the trophic state of unsampled inland lakes greater than 20 acres by producing regression equations relating in-place Secchi-disk measurements to Landsat bands. This study tested three alternatives to methods previously used in Michigan to improve results for predicted statewide Trophic State Index (TSI) computed from Secchi-disk transparency (TSI (SDT)). The alternative methods were used on 14 Landsat-satellite scenes with statewide TSI (SDT) for two time periods (2003–05 and 2007–08). Specifically, the methods were (1) satellite-data processing techniques to remove areas affected by clouds, cloud shadows, haze, shoreline, and dense vegetation for inland lakes greater than 20 acres in Michigan; (2) comparison of the previous method for producing a single open-water predicted TSI (SDT) value (which was based on an area of interest (AOI) and lake-average approach) to an alternative Gethist method for identifying open-water areas in inland lakes (which follows the initial satellite-data processing and targets the darkest pixels, representing the deepest water
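
    The regression step described above ties Carlson's Secchi-based trophic state index to satellite reflectance. A minimal Python sketch is given below, assuming hypothetical in-place Secchi depths (metres) and a hypothetical Landsat band-combination predictor; the band combination actually used varies by scene and is not specified here.

        import numpy as np

        def tsi_sdt(secchi_m):
            """Carlson's trophic state index computed from Secchi-disk transparency in metres."""
            return 60.0 - 14.41 * np.log(secchi_m)

        # hypothetical calibration lakes: measured Secchi depth and a Landsat predictor, e.g. ln(band1/band3)
        secchi = np.array([0.5, 1.2, 2.5, 4.0, 6.5])
        predictor = np.array([0.10, 0.45, 0.90, 1.30, 1.80])

        slope, intercept = np.polyfit(predictor, np.log(secchi), 1)   # regress ln(SDT) on the band predictor
        unsampled_lake_pixel = 0.75
        predicted_tsi = tsi_sdt(np.exp(slope * unsampled_lake_pixel + intercept))
        print(round(float(predicted_tsi), 1))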

  6. Study of silver-110M transfer mechanisms in freshwater. Conceiving and utilization of an experimental model of ecosystem and of a mathematical model to simulate the radionuclide through a trophic chain

    International Nuclear Information System (INIS)

    Garnier-Laplace, J.

    1990-10-01

    Uptake and retention of 110mAg were quantified in laboratory studies carried out on an experimental freshwater ecosystem composed of two abiotic units, water and sediment, and four trophic levels: a primary producer (Scenedesmus obliquus), first-order consumers (Daphnia magna, Gammarus pulex, Chironomus sp.), a second-order consumer (Cyprinus carpio) and a third-order consumer (Salmo trutta). The chosen analytical approach consists in expressing each transfer by a mathematical equation whose formulation is based on a theoretical analysis. Experiments allow the parameters of these equations to be calibrated for each unit of the food chain. All experimental data concerning 110mAg uptake emphasize the radioprotection implications of this radioelement, because of the high values of the estimated radioecological parameters. On the basis of the results obtained, a deterministic mathematical model was developed to simulate the distribution of the radionuclide in the food chain as a function of a chronic or acute contamination mode. Its application gives the time course of the mean 110mAg concentration for each trophic level. Initial comparisons with field studies of ecosystems affected by chronic pollution (the Rhone river) or acute pollution (following the Chernobyl accident) give the model considerable explanatory and predictive value. The age of the fish and their dietary habits, which vary with the annual cycle of the prey species and with their position in the food chain, emerge as essential parameters. The trophic pathway is clearly predominant whatever the contamination mode and explains, for acute exposure, why accumulation of 110mAg can continue long after contamination of the surrounding environment. [fr]
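
    The transfer equations referred to above are typically first-order uptake-depuration balances calibrated per compartment. Below is a minimal, generic Python sketch of one such compartment exposed through water and food, with hypothetical rate constants and simple Euler integration; it is not the author's model.

        def simulate_compartment(c_water, c_food, days, ku=0.05, kf=0.02, ke=0.01, dt=0.1):
            """First-order uptake (from water and food) and depuration of 110mAg in one compartment.

            ku, kf: uptake rate constants from water and food; ke: depuration rate constant (per day).
            """
            c = 0.0
            for _ in range(int(days / dt)):
                dc = ku * c_water + kf * c_food - ke * c   # net flux (activity per unit biomass per day)
                c += dc * dt
            return c

        # hypothetical chronic exposure: constant water and prey activity concentrations
        print(simulate_compartment(c_water=1.0, c_food=5.0, days=60))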

  7. Models of Plankton Community Changes during a Warm Water Anomaly in Arctic Waters Show Altered Trophic Pathways with Minimal Changes in Carbon Export

    Directory of Open Access Journals (Sweden)

    Maria Vernet

    2017-05-01

    Full Text Available Carbon flow through pelagic food webs is an expression of the composition, biomass and activity of phytoplankton as primary producers. In the near future, severe environmental changes in the Arctic Ocean are expected to lead to modifications of phytoplankton communities. Here, we used a combination of linear inverse modeling and ecological network analysis to study changes in food webs before, during, and after an anomalous warm water event in the eastern Fram Strait of the West Spitsbergen Current (WSC) that resulted in a shift from diatoms to flagellates during the summer (June–July). The model predicts substantial differences in the pathways of carbon flow in diatom- vs. Phaeocystis/nanoflagellate-dominated phytoplankton communities, but relatively small differences in carbon export. The model suggests a change in the zooplankton community and activity through increasing microzooplankton abundance and the switching of meso- and macrozooplankton feeding from strict herbivory to omnivory, detritivory and coprophagy. When small cells and flagellates dominated, the phytoplankton carbon pathway through the food web was longer and the microbial loop more active. Furthermore, one step was added to the flow from phytoplankton to mesozooplankton, and phytoplankton carbon becomes available to higher trophic levels via detritus or microzooplankton. Model results highlight how specific changes in phytoplankton community composition, as expected in a climate change scenario, do not necessarily lead to a reduction in carbon export.

  8. Cultural Resource Predictive Modeling

    Science.gov (United States)

    2017-10-01

    CR = cultural resource; CRM = cultural resource management; CRPM = Cultural Resource Predictive Modeling; DoD = Department of Defense; ESTCP = Environmental... resource management (CRM) legal obligations under NEPA and the NHPA, military installations need to demonstrate that CRM decisions are based on objective... maxim "one size does not fit all," and demonstrate that DoD installations have many different CRM needs that can and should be met through a variety

  9. Candidate Prediction Models and Methods

    DEFF Research Database (Denmark)

    Nielsen, Henrik Aalborg; Nielsen, Torben Skov; Madsen, Henrik

    2005-01-01

    This document lists candidate prediction models for Work Package 3 (WP3) of the PSO project "Intelligent wind power prediction systems" (FU4101). The main focus is on the models transforming numerical weather predictions into predictions of power production. The document also outlines ... the possibilities w.r.t. different numerical weather predictions actually available to the project....

  10. Trophic redundancy reduces vulnerability to extinction cascades.

    Science.gov (United States)

    Sanders, Dirk; Thébault, Elisa; Kehoe, Rachel; Frank van Veen, F J

    2018-03-06

    Current species extinction rates are at unprecedentedly high levels. While human activities can be the direct cause of some extinctions, it is becoming increasingly clear that species extinctions themselves can be the cause of further extinctions, since species affect each other through the network of ecological interactions among them. There is concern that the simplification of ecosystems, due to the loss of species and ecological interactions, increases their vulnerability to such secondary extinctions. It is predicted that more complex food webs will be less vulnerable to secondary extinctions due to greater trophic redundancy that can buffer against the effects of species loss. Here, we demonstrate in a field experiment with replicated plant-insect communities, that the probability of secondary extinctions is indeed smaller in food webs that include trophic redundancy. Harvesting one species of parasitoid wasp led to secondary extinctions of other, indirectly linked, species at the same trophic level. This effect was markedly stronger in simple communities than for the same species within a more complex food web. We show that this is due to functional redundancy in the more complex food webs and confirm this mechanism with a food web simulation model by highlighting the importance of the presence and strength of trophic links providing redundancy to those links that were lost. Our results demonstrate that biodiversity loss, leading to a reduction in redundant interactions, can increase the vulnerability of ecosystems to secondary extinctions, which, when they occur, can then lead to further simplification and run-away extinction cascades. Copyright © 2018 the Author(s). Published by PNAS.

  11. Predictive Surface Complexation Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Sverjensky, Dimitri A. [Johns Hopkins Univ., Baltimore, MD (United States). Dept. of Earth and Planetary Sciences

    2016-11-29

    Surface complexation plays an important role in the equilibria and kinetics of processes controlling the compositions of soilwaters and groundwaters, the fate of contaminants in groundwaters, and the subsurface storage of CO2 and nuclear waste. Over the last several decades, many dozens of individual experimental studies have addressed aspects of surface complexation that have contributed to an increased understanding of its role in natural systems. However, there has been no previous attempt to develop a model of surface complexation that can be used to link all the experimental studies in order to place them on a predictive basis. Overall, my research has successfully integrated the results of the work of many experimentalists published over several decades. For the first time in studies of the geochemistry of the mineral-water interface, a practical predictive capability for modeling has become available. The predictive correlations developed in my research now enable extrapolations of experimental studies to provide estimates of surface chemistry for systems not yet studied experimentally and for natural and anthropogenically perturbed systems.

  12. Optimal harvesting of a stochastic delay tri-trophic food-chain model with Lévy jumps

    Science.gov (United States)

    Qiu, Hong; Deng, Wenmin

    2018-02-01

    In this paper, the optimal harvesting of a stochastic delay tri-trophic food-chain model with Lévy jumps is considered. We introduce two kinds of environmental perturbations in this model. One is white noise, which is continuous and is described by a stochastic integral with respect to standard Brownian motion. The other is jumping noise, which is modeled by a Lévy process. Under some mild assumptions, the critical values between extinction and persistence in the mean of each species are established. The sufficient and necessary criteria for the existence of an optimal harvesting policy are established, and the optimal harvesting effort and the maximum sustainable yield are also obtained. We utilize the ergodic method to discuss the optimal harvesting problem. The results show that white noises and Lévy noises significantly affect the optimal harvesting policy, while time delays are harmless for the optimal harvesting strategy in some cases. Finally, some numerical examples are introduced to show the validity of our results.

  13. Trophic model of the outer continental shelf and upper slope demersal community of the southeastern Brazilian Bight

    Directory of Open Access Journals (Sweden)

    Marcela C. Nascimento

    2012-10-01

    Full Text Available It is increasingly recognized that demersal communities are important for the functioning of continental shelf and slope ecosystems around the world, including tropical regions. Demersal communities are most prominent in areas of high detritus production and transport, and they link benthic and pelagic biological communities. To understand the structure and role of the demersal community of the southeastern Brazilian Bight, we constructed a trophodynamic model with 37 functional groups to represent the demersal community of the outer continental shelf and upper slope of this area, using the Ecopath with Ecosim 6 (EwE) approach and software. The model indicates high production and biomass of detritus and benthic invertebrates, and strong linkages of these components to the demersal and pelagic sub-webs. The omnivory indices in this ecosystem were high, indicating a highly connected trophic web reminiscent of tropical terrestrial systems. Although high levels of ascendency may indicate resistance and resilience to disturbance, recent and present fisheries trends are probably degrading the biological community and related ecosystem services.

  14. Trophic shifts of a generalist consumer in response to resource pulses.

    Directory of Open Access Journals (Sweden)

    Pei-Jen L Shaner

    2011-03-01

    Full Text Available Trophic shifts of generalist consumers can have broad food-web and biodiversity consequences through altered trophic flows and vertical diversity. Previous studies have used trophic shifts as indicators of food-web responses to perturbations, such as species invasion and spatial or temporal subsidies. Resource pulses, as a form of temporal subsidy, have been found to be quite common among various ecosystems, affecting organisms at multiple trophic levels. Although diet switching of generalist consumers in response to resource pulses is well documented, few studies have examined whether the switch involves trophic shifts, and if so, the directions and magnitudes of the shifts. In this study, we used stable carbon and nitrogen isotopes with a Bayesian multi-source mixing model to estimate the proportional contributions of three trophic groups (i.e. producer, consumer, and fungus-detritivore) to the diets of the white-footed mouse (Peromyscus leucopus) receiving an artificial seed pulse or a naturally occurring cicada pulse. Our results demonstrated that resource pulses can drive trophic shifts in the mice. Specifically, the producer contribution to the mouse diets was increased by 32% with the seed pulse at both sites examined. The consumer contribution to the mouse diets was also increased by 29% with the cicada pulse in one of the two grids examined. However, the pattern was reversed in the second grid, with a 13% decrease in the consumer contribution with the cicada pulse. These findings suggest that generalist consumers may play different functional roles in food webs under perturbations of resource pulses. This study provides one of the few highly quantitative descriptions of dietary and trophic shifts of a key consumer in forest food webs, which may help future studies to form specific predictions on changes in trophic interactions following resource pulses.

  15. Variable δ15N Diet-Tissue Discrimination Factors among Sharks: Implications for Trophic Position, Diet and Food Web Models

    Science.gov (United States)

    Olin, Jill A.; Hussey, Nigel E.; Grgicak-Mannion, Alice; Fritts, Mark W.; Wintner, Sabine P.; Fisk, Aaron T.

    2013-01-01

    The application of stable isotopes to characterize the complexities of a species' foraging behavior and trophic relationships is dependent on assumptions about δ15N diet-tissue discrimination factors (∆15N). As ∆15N values have been experimentally shown to vary amongst consumers, tissues and diet compositions, resolving appropriate species-specific ∆15N values can be complex. Given the logistical and ethical challenges of controlled feeding experiments for determining ∆15N values for large and/or endangered species, our objective was to conduct an assessment of a range of reported ∆15N values that can hypothetically serve as surrogates for describing the predator-prey relationships of four shark species that feed on prey from different trophic levels (i.e., different mean δ15N dietary values). Overall, the most suitable species-specific ∆15N values decreased with increasing dietary δ15N values, based on stable isotope Bayesian ellipse overlap estimates of the sharks and the principal prey functional groups contributing to the diet determined from stomach content analyses. Thus, a single ∆15N value was not supported for this speciose group of marine predatory fishes. For example, a ∆15N value of 3.7‰ provided the highest percent overlap between prey and predator isotope ellipses for the bonnethead shark (mean diet δ15N = 9‰), whereas a ∆15N value of less than 2.3‰ provided the highest percent overlap for the white shark (mean diet δ15N = 15‰). These data corroborate the previously reported inverse ∆15N-dietary δ15N relationship when both the isotope ellipses of the principal prey functional groups and the broader identified diet of each species were considered, supporting the adoption of different ∆15N values that reflect the predators' δ15N dietary value. These findings are critical for refining the application of stable isotope modeling approaches, as inferences regarding a species' ecological role in its community will be influenced, with consequences for conservation and management actions. PMID:24147026
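
    The trophic-position estimates that such discrimination factors feed into are usually computed with the standard baseline formula (a general formulation, not specific to this study):

        TP = \lambda + \frac{\delta^{15}N_{consumer} - \delta^{15}N_{base}}{\Delta^{15}N}

    where \lambda is the trophic position of the baseline organism (e.g. 2 for a primary consumer), so any bias in the assumed \Delta^{15}N propagates directly into the estimated trophic position.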

  16. Variable δ15N diet-tissue discrimination factors among sharks: implications for trophic position, diet and food web models.

    Directory of Open Access Journals (Sweden)

    Jill A Olin

    Full Text Available The application of stable isotopes to characterize the complexities of a species' foraging behavior and trophic relationships is dependent on assumptions about δ15N diet-tissue discrimination factors (∆15N). As ∆15N values have been experimentally shown to vary amongst consumers, tissues and diet compositions, resolving appropriate species-specific ∆15N values can be complex. Given the logistical and ethical challenges of controlled feeding experiments for determining ∆15N values for large and/or endangered species, our objective was to conduct an assessment of a range of reported ∆15N values that can hypothetically serve as surrogates for describing the predator-prey relationships of four shark species that feed on prey from different trophic levels (i.e., different mean δ15N dietary values). Overall, the most suitable species-specific ∆15N values decreased with increasing dietary δ15N values, based on stable isotope Bayesian ellipse overlap estimates of the sharks and the principal prey functional groups contributing to the diet determined from stomach content analyses. Thus, a single ∆15N value was not supported for this speciose group of marine predatory fishes. For example, a ∆15N value of 3.7‰ provided the highest percent overlap between prey and predator isotope ellipses for the bonnethead shark (mean diet δ15N = 9‰), whereas a ∆15N value < 2.3‰ provided the highest percent overlap between prey and predator isotope ellipses for the white shark (mean diet δ15N = 15‰). These data corroborate the previously reported inverse ∆15N-dietary δ15N relationship when both the isotope ellipses of the principal prey functional groups and the broader identified diet of each species were considered, supporting the adoption of different ∆15N values that reflect the predators' δ15N dietary value. These findings are critical for refining the application of stable isotope modeling approaches, as inferences regarding a species

  17. Confidence scores for prediction models

    DEFF Research Database (Denmark)

    Gerds, Thomas Alexander; van de Wiel, MA

    2011-01-01

    In medical statistics, many alternative strategies are available for building a prediction model based on training data. Prediction models are routinely compared by means of their prediction performance in independent validation data. If only one data set is available for training and validation,...

  18. Feeding habits and trophic levels of some demersal fish species in the Persian Gulf (Bushehr Province) using Ecopath model

    OpenAIRE

    Vahabnezhad, Arezoo

    2015-01-01

    A trophic study was carried out from February 2012 to January 2013 on the Persian Gulf ecosystem off Bushehr province. A total of 2,948 stomach-content samples were analyzed based on the weight and number of food items, and about 40 prey taxa were identified. Crustaceans and bony fish were the main prey in most of the stomach contents. The mean trophic level was estimated at 3.6 using the Ecopath software. In this research, the mean trophic levels of the eight species studied varied fr...

  19. Food web model output - Trophic impacts of bald eagles in the Puget Sound food web

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This project is developing models to examine the ecological roles of bald eagles in the Puget Sound region. It is primarily being done by NMFS FTEs, in collaboration...

  20. Bioenergetics model output - Trophic impacts of bald eagles in the Puget Sound food web

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This project is developing models to examine the ecological roles of bald eagles in the Puget Sound region. It is primarily being done by NMFS FTEs, in collaboration...

  1. Exploring optimal fishing scenarios for the multispecies artisanal fisheries of Eritrea using a trophic model

    NARCIS (Netherlands)

    Tsehaye, I.W.; Nagelkerke, L.A.J.

    2008-01-01

    This study represents the first attempt to assess the potential for fisheries in the artisanal Red Sea reef fisheries of Eritrea in an ecosystem context. We used an Ecopath with Ecosim model to integrate known aspects of the ecosystem and its inhabitants into a single framework, with the aim to gain

  2. A model of trophic flows in the northern Benguela upwelling system ...

    African Journals Online (AJOL)

    Ultimately, this type of model may form a basis for multispecies management approaches in the region. By the 1980s, sardine Sardinops sagax and hake Merluccius spp. stocks in the northern Benguela had both undergone a decline, yet were still heavily fished. Horse mackerel Trachurus trachurus capensis had increased ...

  3. PREDICTED PERCENTAGE DISSATISFIED (PPD) MODEL ...

    African Journals Online (AJOL)

    HOD

    their low power requirements, are relatively cheap and are environment friendly. ... PREDICTED PERCENTAGE DISSATISFIED MODEL EVALUATION OF EVAPORATIVE COOLING ... The performance of direct evaporative coolers is a.

  4. A method for improving predictive modeling by taking into account lag time: Example of selenium bioaccumulation in a flowing system

    Energy Technology Data Exchange (ETDEWEB)

    Beckon, William N., E-mail: William_Beckon@fws.gov

    2016-07-15

    Highlights: • A method for estimating response time in cause-effect relationships is demonstrated. • Predictive modeling is appreciably improved by taking into account this lag time. • Bioaccumulation lag is greater for organisms at higher trophic levels. • This methodology may be widely applicable in disparate disciplines. - Abstract: For bioaccumulative substances, efforts to predict concentrations in organisms at upper trophic levels, based on measurements of environmental exposure, have been confounded by the appreciable but hitherto unknown amount of time it may take for bioaccumulation to occur through various pathways and across several trophic transfers. The study summarized here demonstrates an objective method of estimating this lag time by testing a large array of potential lag times for selenium bioaccumulation, selecting the lag that provides the best regression between environmental exposure (concentration in ambient water) and concentration in the tissue of the target organism. Bioaccumulation lag is generally greater for organisms at higher trophic levels, reaching times of more than a year in piscivorous fish. Predictive modeling of bioaccumulation is improved appreciably by taking into account this lag. More generally, the method demonstrated here may improve the accuracy of predictive modeling in a wide variety of other cause-effect relationships in which lag time is substantial but inadequately known, in disciplines as diverse as climatology (e.g., the effect of greenhouse gases on sea levels) and economics (e.g., the effects of fiscal stimulus on employment).
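
    The lag-selection procedure described above can be illustrated with a short sketch: shift the exposure series by each candidate lag, regress the tissue concentration on the lagged exposure, and keep the lag with the best fit. The Python sketch below assumes hypothetical monthly series and a simple r-squared criterion; it is not the author's dataset or exact statistical procedure.

        import numpy as np

        def best_lag(exposure, tissue, max_lag):
            """Return the lag (in samples) that maximizes r^2 between lagged exposure and tissue level."""
            scores = {}
            for lag in range(max_lag + 1):
                x = exposure[: len(exposure) - lag]
                y = tissue[lag:]
                r = np.corrcoef(x, y)[0, 1]
                scores[lag] = r ** 2
            return max(scores, key=scores.get), scores

        # hypothetical monthly data: tissue tracks exposure about six samples later, plus noise
        rng = np.random.default_rng(0)
        exposure = rng.gamma(2.0, 1.0, 120)
        tissue = np.roll(exposure, 6) * 3.0 + rng.normal(0, 0.5, 120)
        print(best_lag(exposure, tissue, max_lag=12)[0])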

  5. A method for improving predictive modeling by taking into account lag time: Example of selenium bioaccumulation in a flowing system

    International Nuclear Information System (INIS)

    Beckon, William N.

    2016-01-01

    Highlights: • A method for estimating response time in cause-effect relationships is demonstrated. • Predictive modeling is appreciably improved by taking into account this lag time. • Bioaccumulation lag is greater for organisms at higher trophic levels. • This methodology may be widely applicable in disparate disciplines. - Abstract: For bioaccumulative substances, efforts to predict concentrations in organisms at upper trophic levels, based on measurements of environmental exposure, have been confounded by the appreciable but hitherto unknown amount of time it may take for bioaccumulation to occur through various pathways and across several trophic transfers. The study summarized here demonstrates an objective method of estimating this lag time by testing a large array of potential lag times for selenium bioaccumulation, selecting the lag that provides the best regression between environmental exposure (concentration in ambient water) and concentration in the tissue of the target organism. Bioaccumulation lag is generally greater for organisms at higher trophic levels, reaching times of more than a year in piscivorous fish. Predictive modeling of bioaccumulation is improved appreciably by taking into account this lag. More generally, the method demonstrated here may improve the accuracy of predictive modeling in a wide variety of other cause-effect relationships in which lag time is substantial but inadequately known, in disciplines as diverse as climatology (e.g., the effect of greenhouse gases on sea levels) and economics (e.g., the effects of fiscal stimulus on employment).

  6. Green Turtle Trophic Ecology

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — SWFSC is currently conducting a study of green sea turtle (Chelonia mydas) trophic ecology in the eastern Pacific. Tissue samples and stable carbon and stable...

  7. Bootstrap prediction and Bayesian prediction under misspecified models

    OpenAIRE

    Fushiki, Tadayoshi

    2005-01-01

    We consider a statistical prediction problem under misspecified models. In a sense, Bayesian prediction is an optimal prediction method when an assumed model is true. Bootstrap prediction is obtained by applying Breiman's `bagging' method to a plug-in prediction. Bootstrap prediction can be considered to be an approximation to the Bayesian prediction under the assumption that the model is true. However, in applications, there are frequently deviations from the assumed model. In this paper, bo...

  8. MODEL PREDICTIVE CONTROL FUNDAMENTALS

    African Journals Online (AJOL)

    2012-07-02

    Jul 2, 2012 ... signal based on a process model, coping with constraints on inputs and ... paper, we will present an introduction to the theory and application of MPC with Matlab codes ... section 5 presents the simulation results and section 6.

  9. Trophic cascades of bottom-up and top-down forcing on nutrients and plankton in the Kattegat, evaluated by modelling

    DEFF Research Database (Denmark)

    Petersen, Marcell Elo; Maar, Marie; Larsen, Janus

    2017-01-01

    The aim of the study was to investigate the relative importance of bottom-up and top-down forcing on trophic cascades in the pelagic food web and the implications for water quality indicators (summer phytoplankton biomass and winter nutrients) in relation to management. The 3D ecological model.... On an annual basis, the system was more bottom-up than top-down controlled. Microzooplankton was found to play an important role in the pelagic food web as a mediator of nutrient and energy fluxes. This study demonstrated that the best scenario for improved water quality was a combined reduction in nutrient

  10. Melanoma Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing melanoma cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  11. Modelling bankruptcy prediction models in Slovak companies

    Directory of Open Access Journals (Sweden)

    Kovacova Maria

    2017-01-01

    Full Text Available Intensive research by academics and practitioners has been devoted to models for bankruptcy prediction and credit risk management. In spite of numerous studies focusing on forecasting bankruptcy using traditional statistical techniques (e.g. discriminant analysis and logistic regression) and early artificial intelligence models (e.g. artificial neural networks), there is a trend towards machine learning models (support vector machines, bagging, boosting, and random forests) for predicting bankruptcy one year prior to the event. Comparing the performance of this unconventional approach with results obtained by discriminant analysis, logistic regression, and neural networks, it has been found that bagging, boosting, and random forest models outperform the other techniques, and that prediction accuracy in the testing sample improves when additional variables are included. On the other hand, the prediction accuracy of older, well-known bankruptcy prediction models is quite high. Therefore, we aim to analyse these older models on a dataset of Slovak companies to validate their prediction ability under specific conditions. Furthermore, these models will be remodelled in line with new trends by calculating the influence of eliminating selected variables on their overall prediction ability.

  12. Predictive models of moth development

    Science.gov (United States)

    Degree-day models link ambient temperature to insect life-stages, making such models valuable tools in integrated pest management. These models increase management efficacy by predicting pest phenology. In Wisconsin, the top insect pest of cranberry production is the cranberry fruitworm, Acrobasis v...
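
    A degree-day model of the kind mentioned here accumulates daily heat units above a developmental threshold and flags a life stage once a cumulative target is reached. The Python sketch below uses a hypothetical base temperature and degree-day target; values for Acrobasis would have to come from the published phenology literature.

        def degree_days(tmax, tmin, t_base=10.0):
            """Simple-average daily degree-days above a base temperature."""
            return max(0.0, (tmax + tmin) / 2.0 - t_base)

        def day_of_event(daily_temps, t_base=10.0, target_dd=400.0):
            """First day on which accumulated degree-days reach the target (or None)."""
            total = 0.0
            for day, (tmax, tmin) in enumerate(daily_temps, start=1):
                total += degree_days(tmax, tmin, t_base)
                if total >= target_dd:
                    return day
            return None

        # hypothetical warming spring/summer record: (daily max, daily min) in degrees C
        temps = [(15 + 0.15 * d, 5 + 0.15 * d) for d in range(120)]
        print(day_of_event(temps))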

  13. Predictive Models and Computational Embryology

    Science.gov (United States)

    EPA’s ‘virtual embryo’ project is building an integrative systems biology framework for predictive models of developmental toxicity. One schema involves a knowledge-driven adverse outcome pathway (AOP) framework utilizing information from public databases, standardized ontologies...

  14. Predictive Modeling in Race Walking

    Directory of Open Access Journals (Sweden)

    Krzysztof Wiktorowicz

    2015-01-01

    Full Text Available This paper presents the use of linear and nonlinear multivariable models as tools to support the training process of race walkers. These models are calculated using data collected from race walkers’ training events, and they are used to predict the result over a 3 km race based on training loads. The material consists of 122 training plans for 21 athletes. In order to choose the best model, the leave-one-out cross-validation method is used. The main contribution of the paper is to propose nonlinear modifications of linear models in order to achieve a smaller prediction error. It is shown that the best model is a modified LASSO regression with quadratic terms in the nonlinear part. This model has the smallest prediction error and a simplified structure obtained by eliminating some of the predictors.
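
    The modelling idea described above, a LASSO regression augmented with quadratic terms and scored by leave-one-out cross-validation, can be sketched with scikit-learn as below. The data are hypothetical training-load values, not the authors' dataset, and the pipeline is illustrative rather than their final model.

        import numpy as np
        from sklearn.linear_model import Lasso
        from sklearn.model_selection import LeaveOneOut, cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import PolynomialFeatures, StandardScaler

        # hypothetical data: rows = training plans, columns = training-load predictors, y = 3 km time (s)
        rng = np.random.default_rng(42)
        X = rng.uniform(0, 100, size=(122, 5))
        y = 900 - 0.8 * X[:, 0] + 0.003 * X[:, 1] ** 2 + rng.normal(0, 5, 122)

        model = make_pipeline(
            PolynomialFeatures(degree=2, include_bias=False),   # add quadratic and interaction terms
            StandardScaler(),
            Lasso(alpha=0.1, max_iter=50000),
        )
        scores = cross_val_score(model, X, y, cv=LeaveOneOut(), scoring="neg_mean_absolute_error")
        print(round(-scores.mean(), 2))                         # mean absolute leave-one-out error (s)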

  15. Expanding models of lake trophic state to predict cyanobacteria in lakes

    Science.gov (United States)

    Background/Question/Methods: Cyanobacteria are a primary taxonomic group associated with harmful algal blooms in lakes. Understanding the drivers of cyanobacteria presence has important implications for lake management and for the protection of human and ecosystem health. Chlor...

  16. Divergent trophic levels in two cryptic sibling bat species.

    Science.gov (United States)

    Siemers, Björn M; Greif, Stefan; Borissov, Ivailo; Voigt-Heucke, Silke L; Voigt, Christian C

    2011-05-01

    Changes in dietary preferences in animal species play a pivotal role in niche specialization. Here, we investigate how divergence of foraging behaviour affects the trophic position of animals and thereby their role for ecosystem processes. As a model, we used two closely related bat species, Myotis myotis and M. blythii oxygnathus, that are morphologically very similar and share the same roosts, but show clear behavioural divergence in habitat selection and foraging. Based on previous dietary studies on synanthropic populations in Central Europe, we hypothesised that M. myotis would mainly prey on predatory arthropods (i.e., secondary consumers) while M. blythii oxygnathus would eat herbivorous insects (i.e., primary consumers). We thus expected that the sibling bats would be at different trophic levels. We first conducted a validation experiment with captive bats in the laboratory and measured isotopic discrimination, i.e., the stepwise enrichment of heavy in relation to light isotopes between consumer and diet, in insectivorous bats for the first time. We then tested our trophic level hypothesis in the field at an ancient site of natural coexistence for the two species (Bulgaria, south-eastern Europe) using stable isotope analyses. As predicted, secondary consumer arthropods (carabid beetles; Coleoptera) were more enriched in 15N than primary consumer arthropods (tettigoniids; Orthoptera), and accordingly wing tissue of M. myotis was more enriched in 15N than tissue of M. blythii oxygnathus. According to a Bayesian mixing model, M. blythii oxygnathus indeed fed almost exclusively on primary consumers (98%), while M. myotis ate a mix of secondary (50%), but also, and to a considerable extent, primary consumers (50%). Our study highlights that morphologically almost identical, sympatric sibling species may forage at divergent trophic levels, and, thus may have different effects on ecosystem processes.

  17. Modelling exposure of oceanic higher trophic-level consumers to polychlorinated biphenyls: pollution 'hotspots' in relation to mass mortality events of marine mammals.

    Science.gov (United States)

    Handoh, Itsuki C; Kawai, Toru

    2014-08-30

    Marine mammals in past mass mortality events may have been susceptible to infection because their immune systems were suppressed through the bioaccumulation of environmental pollutants such as polychlorinated biphenyls (PCBs). We compiled mortality event data sets for 33 marine mammal species and employed a Finely-Advanced Transboundary Environmental model (FATE) to model the exposure of the global fish community to PCB congeners, in order to define critical exposure levels (CELs) of PCBs above which mass mortality events are likely to occur. Our modelling approach enabled us to describe the mass mortality events in the context of the exposure of higher-trophic-level consumers to PCBs and to identify marine pollution 'hotspots' such as the Mediterranean Sea and north-western European coasts. We demonstrated that the CELs can be applied to quantify a chemical pollution Planetary Boundary, under which a safe operating space for marine mammals and humanity can exist. Copyright © 2014 Elsevier Ltd. All rights reserved.

  18. A Trophic Flow Model of the Caeté Mangrove Estuary (North Brazil) with Considerations for the Sustainable Use of its Resources

    Science.gov (United States)

    Wolff, M.; Koch, V.; Isaac, V.

    2000-06-01

    The Caeté Estuary lies within the world's second largest mangrove region, 200 km south-east of the Amazon delta. It has an extension of about 220 km2 and is subject to considerable human impact through intensive harvest of mangrove crabs (Ucides cordatus) and logging of mangroves. In order to integrate available information on biomass, catches, food spectrum and dynamics of the main species populations of the system, a trophic steady-state model of 19 compartments was constructed using the ECOPATH II software (Christensen & Pauly, 1992). Ninety-nine percent of total system biomass is made up by mangroves (Rhizophora mangle, Avicennia germinans and Laguncularia racemosa), which are assumed to cover about 45% of the total area and contribute about 60% of the system's primary production. The remaining biomass (132 g m-2) is distributed between the pelagic and benthic domains in proportions of 10% and 90%, respectively. Through litter fall, mangroves inject the main primary food source into the system, which is either consumed directly by herbivores (principally land crabs, Ucides cordatus) or, once metabolized by bacteria, by detritivores (principally fiddler crabs, Uca spp.). These two groups are prominent in terms of biomass (80 g and 14.5 g m-2) and food intake (1120 g m-2 yr-1 and 1378 g m-2 yr-1, respectively). According to the model estimates, energy flow through the fish and shrimp compartments is of relatively low importance for the energy cycling within the system, a finding which is contrary to the situation in other mangrove estuaries reported in the literature. The dominance of mangrove epibenthos is attributed to the fact that a large part of the system's production remains within the mangrove forest, as material export to the estuary is restricted to spring tides, when the forest is completely inundated. This is also the reason for the low abundance of suspension feeders, which are restricted to a small belt along the Caeté River and the small

  19. Development and evaluation of a regression-based model to predict cesium-137 concentration ratios for saltwater fish

    International Nuclear Information System (INIS)

    Pinder, John E.; Rowan, David J.; Smith, Jim T.

    2016-01-01

    Data from published studies and World Wide Web sources were combined to develop a regression model to predict 137Cs concentration ratios for saltwater fish. Predictions were developed from 1) numeric trophic levels computed primarily from random resampling of known food items and 2) K concentrations in the saltwater for 65 samplings of 41 different species from both the Atlantic and Pacific Oceans. A number of different models were initially developed and evaluated for accuracy, which was assessed as the ratio of independently measured concentration ratios to those predicted by the model. In contrast to freshwater systems, where K concentrations are highly variable and are an important factor affecting fish concentration ratios, the less variable K concentrations in saltwater were relatively unimportant in affecting concentration ratios. As a result, the simplest model, which used only trophic level as a predictor, had accuracy comparable to more complex models that also included K concentrations. A test of model accuracy involving comparisons of 56 published concentration ratios from 51 species of marine fish to those predicted by the model indicated that 52 of the predicted concentration ratios were within a factor of 2 of the observed concentration ratios. - Highlights: • We developed a model to predict concentration ratios (C_r) for saltwater fish. • The model requires only a single input variable to predict C_r. • That variable is a mean numeric trophic level available at fishbase.org. • The K concentrations in seawater were not an important predictor variable. • The median observed-to-predicted ratio for 56 independently measured C_r values was 0.83.
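
    The regression described above is essentially a fit of the concentration ratio against numeric trophic level. The Python sketch below uses hypothetical calibration values and an illustrative log-linear form; the published model's exact functional form and coefficients may differ.

        import numpy as np

        # hypothetical calibration data: numeric trophic level (e.g. from fishbase.org) and measured 137Cs CR
        trophic_level = np.array([2.8, 3.1, 3.4, 3.7, 4.0, 4.3, 4.5])
        conc_ratio = np.array([40.0, 55.0, 70.0, 95.0, 130.0, 180.0, 230.0])

        slope, intercept = np.polyfit(trophic_level, np.log(conc_ratio), 1)   # ln(CR) = a*TL + b

        def predict_cr(tl):
            """Predicted 137Cs concentration ratio for a saltwater fish at trophic level tl."""
            return float(np.exp(slope * tl + intercept))

        print(round(predict_cr(3.9), 1))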

  20. Development and evaluation of a regression-based model to predict cesium concentration ratios for freshwater fish

    International Nuclear Information System (INIS)

    Pinder, John E.; Rowan, David J.; Rasmussen, Joseph B.; Smith, Jim T.; Hinton, Thomas G.; Whicker, F.W.

    2014-01-01

    Data from published studies and World Wide Web sources were combined to produce and test a regression model to predict Cs concentration ratios for freshwater fish species. The accuracies of predicted concentration ratios, which were computed using 1) species trophic levels obtained from random resampling of known food items and 2) K concentrations in the water for 207 fish from 44 species and 43 locations, were tested against independent observations of ratios for 57 fish from 17 species at 25 locations. Accuracy was assessed as the percentage of observed to predicted ratios within factors of 2 or 3. Conservatism, expressed as the lack of underprediction, was assessed as the percentage of observed to predicted ratios that were less than 2 or less than 3. The model's median observed to predicted ratio was 1.26, which was not significantly different from 1, and 50% of the ratios were between 0.73 and 1.85. The percentages of ratios within factors of 2 or 3 were 67 and 82%, respectively. The percentages of ratios that were <2 or <3 were 79 and 88%, respectively. An example for Perca fluviatilis demonstrated that increased prediction accuracy could be obtained when more detailed knowledge of diet was available to estimate trophic level. - Highlights: • We developed a model to predict Cs concentration ratios for freshwater fish species. • The model uses only two variables to predict a species' CR for any location. • One variable is the K concentration in the freshwater. • The other is a species' mean trophic level, easily obtained from fishbase.org. • The median observed to predicted ratio for 57 independent test cases was 1.26

  1. Trigeminal trophic syndrome

    Directory of Open Access Journals (Sweden)

    Parimalam Kumar

    2014-01-01

    Full Text Available Trigeminal trophic syndrome (TTS) is a rare cause of facial ulceration, consequent to damage to the trigeminal nerve or its central sensory connections. We report a case of TTS in a 48-year-old woman with Bell's palsy following herpes zoster infection. The patient was treated and counseled. There has not been any recurrence for 1 year, and the patient is being followed up. The diagnosis of TTS should be suspected when there is unilateral facial ulceration, especially involving the ala nasi, associated with sensory impairment.

  2. Model predictive control using fuzzy decision functions

    NARCIS (Netherlands)

    Kaymak, U.; Costa Sousa, da J.M.

    2001-01-01

    Fuzzy predictive control integrates conventional model predictive control with techniques from fuzzy multicriteria decision making, translating the goals and the constraints to predictive control in a transparent way. The information regarding the (fuzzy) goals and the (fuzzy) constraints of the

  3. The interacting effects of temperature and food chain length on trophic abundance and ecosystem function.

    Science.gov (United States)

    Beveridge, Oliver S; Humphries, Stuart; Petchey, Owen L

    2010-05-01

    1. While much is known about the independent effects of trophic structure and temperature on density and ecosystem processes, less is known about the interaction(s) between the two. 2. We manipulated the temperature of laboratory-based bacteria-protist communities that contained communities with one, two, or three trophic levels, and recorded species' densities and bacterial decomposition. 3. Temperature, food chain length and their interaction produced significant responses in microbial density and bacterial decomposition. Prey and resource density expressed different patterns of temperature dependency during different phases of population dynamics. The addition of a predator altered the temperature-density relationship of prey, from a unimodal trend to a negative one. Bacterial decomposition was greatest in the presence of consumers at higher temperatures. 4. These results are qualitatively consistent with a recent model of direct and indirect temperature effects on resource-consumer population dynamics. Results highlight and reinforce the importance of indirect effects of temperature mediated through trophic interactions. Understanding and predicting the consequences of environmental change will require that indirect effects, trophic structure, and individual species' tolerances be incorporated into theory and models.

  4. Diet compositions and trophic guild structure of the eastern Chukchi Sea demersal fish community

    Science.gov (United States)

    Whitehouse, George A.; Buckley, Troy W.; Danielson, Seth L.

    2017-01-01

    Fishes are an important link in Arctic marine food webs, connecting production of lower trophic levels to apex predators. We analyzed 1773 stomach samples from 39 fish species collected during a bottom trawl survey of the eastern Chukchi Sea in the summer of 2012. We used hierarchical cluster analysis of diet dissimilarities on 21 of the most well sampled species to identify four distinct trophic guilds: gammarid amphipod consumers, benthic invertebrate generalists, fish and shrimp consumers, and zooplankton consumers. The trophic guilds reflect dominant prey types in predator diets. We used constrained analysis of principal coordinates (CAP) to determine if variation within the composite guild diets could be explained by a suite of non-diet variables. All CAP models explained a significant proportion of the variance in the diet matrices, ranging from 7% to 25% of the total variation. Explanatory variables tested included latitude, longitude, predator length, depth, and water mass. These results indicate a trophic guild structure is present amongst the demersal fish community during summer in the eastern Chukchi Sea. Regular monitoring of the food habits of the demersal fish community will be required to improve our understanding of the spatial, temporal, and interannual variation in diet composition, and to improve our ability to identify and predict the impacts of climate change and commercial development on the structure and functioning of the Chukchi Sea ecosystem.

  5. Analysis and prediction of agricultural pest dynamics with Tiko'n, a generic tool to develop agroecological food web models

    Science.gov (United States)

    Malard, J. J.; Rojas, M.; Adamowski, J. F.; Anandaraja, N.; Tuy, H.; Melgar-Quiñonez, H.

    2016-12-01

    While several well-validated crop growth models are currently widely used, very few crop pest models of the same caliber have been developed or applied, and pest models that take trophic interactions into account are even rarer. This may be due to several factors, including 1) the difficulty of representing complex agroecological food webs in a quantifiable model, and 2) the general belief that pesticides effectively remove insect pests from immediate concern. However, pests currently claim a substantial amount of harvests every year (and account for additional control costs), and the impact of insects and of their trophic interactions on agricultural crops cannot be ignored, especially in the context of changing climates and increasing pressures on crops across the globe. Unfortunately, most integrated pest management frameworks rely on very simple models (if at all), and most examples of successful agroecological management remain more anecdotal than scientifically replicable. In light of this, there is a need for validated and robust agroecological food web models that allow users to predict the response of these webs to changes in management, crops or climate, both in order to predict future pest problems under a changing climate as well as to develop effective integrated management plans. Here we present Tiko'n, a Python-based software whose API allows users to rapidly build and validate trophic web agroecological models that predict pest dynamics in the field. The programme uses a Bayesian inference approach to calibrate the models according to field data, allowing for the reuse of literature data from various sources and reducing the need for extensive field data collection. We apply the model to the coconut black-headed caterpillar (Opisina arenosella) and associated parasitoid data from Sri Lanka, showing how the modeling framework can be used to rapidly develop, calibrate and validate models that elucidate how the internal structures of food webs

  6. Model Prediction Control For Water Management Using Adaptive Prediction Accuracy

    NARCIS (Netherlands)

    Tian, X.; Negenborn, R.R.; Van Overloop, P.J.A.T.M.; Mostert, E.

    2014-01-01

    In the field of operational water management, Model Predictive Control (MPC) has gained popularity owing to its versatility and flexibility. The MPC controller, which takes predictions, time delay and uncertainties into account, can be designed for multi-objective management problems and for

  7. Iowa calibration of MEPDG performance prediction models.

    Science.gov (United States)

    2013-06-01

    This study aims to improve the accuracy of AASHTO Mechanistic-Empirical Pavement Design Guide (MEPDG) pavement performance predictions for Iowa pavement systems through local calibration of MEPDG prediction models. A total of 130 representative p...

  8. Model complexity control for hydrologic prediction

    NARCIS (Netherlands)

    Schoups, G.; Van de Giesen, N.C.; Savenije, H.H.G.

    2008-01-01

    A common concern in hydrologic modeling is overparameterization of complex models given limited and noisy data. This leads to problems of parameter nonuniqueness and equifinality, which may negatively affect prediction uncertainties. A systematic way of controlling model complexity is therefore

  9. Bioenergetics, Trophic Ecology, and Niche Separation of Tunas.

    Science.gov (United States)

    Olson, R J; Young, J W; Ménard, F; Potier, M; Allain, V; Goñi, N; Logan, J M; Galván-Magaña, F

    Tunas are highly specialized predators that have evolved numerous adaptations for a lifestyle that requires large amounts of energy consumption. Here we review our understanding of the bioenergetics and feeding dynamics of tunas on a global scale, with an emphasis on yellowfin, bigeye, skipjack, albacore, and Atlantic bluefin tunas. Food consumption balances bioenergetics expenditures for respiration, growth (including gonad production), specific dynamic action, egestion, and excretion. Tunas feed across the micronekton and some large zooplankton. Some tunas appear to time their life history to take advantage of ephemeral aggregations of crustacean, fish, and molluscan prey. Ontogenetic and spatial diet differences are substantial, and significant interdecadal changes in prey composition have been observed. Diet shifts from larger to smaller prey taxa highlight ecosystem-wide changes in prey availability and diversity and provide implications for changing bioenergetics requirements into the future. Where tunas overlap, we show evidence of niche separation between them; resources are divided largely by differences in diet percentages and size ranges of prey taxa. The lack of long-term data limits the ability to predict impacts of climate change on tuna feeding behaviour. We note the need for systematic collection of feeding data as part of routine monitoring of these species, and we highlight the advantages of using biochemical techniques for broad-scale analyses of trophic relations. We support the continued development of ecosystem models, which all too often lack the regional-specific trophic data needed to adequately investigate climate and fishing impacts. © 2016 Elsevier Ltd. All rights reserved.

  10. Nonlinear chaotic model for predicting storm surges

    Directory of Open Access Journals (Sweden)

    M. Siek

    2010-09-01

    Full Text Available This paper addresses the use of the methods of nonlinear dynamics and chaos theory for building a predictive chaotic model from time series. The chaotic model predictions are made by the adaptive local models based on the dynamical neighbors found in the reconstructed phase space of the observables. We implemented the univariate and multivariate chaotic models with direct and multi-steps prediction techniques and optimized these models using an exhaustive search method. The built models were tested for predicting storm surge dynamics for different stormy conditions in the North Sea, and are compared to neural network models. The results show that the chaotic models can generally provide reliable and accurate short-term storm surge predictions.
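
    A rough sketch of the local-model idea described above, assuming a univariate series, a fixed delay embedding and simple nearest-neighbour averaging; the paper's exhaustive search over embedding and model parameters is not reproduced.

```python
# Sketch of a local chaotic-model predictor: embed the series, find dynamical
# neighbours of the current state, and average their one-step successors.
import numpy as np

def embed(x, dim=3, tau=2):
    """Delay-embed a 1-D series into points of dimension `dim` with lag `tau`."""
    n = len(x) - (dim - 1) * tau
    return np.array([x[i:i + (dim - 1) * tau + 1:tau] for i in range(n)])

def local_predict(x, dim=3, tau=2, k=5):
    """Predict the next value of `x` from the k nearest embedded neighbours."""
    pts = embed(x, dim, tau)
    query = pts[-1]
    cand = pts[:-1]                                  # neighbours with a known successor
    succ_idx = np.arange(len(cand)) + (dim - 1) * tau + 1
    dists = np.linalg.norm(cand - query, axis=1)
    nearest = np.argsort(dists)[:k]
    return x[succ_idx[nearest]].mean()

# Toy example with a noisy oscillatory "surge" series.
t = np.linspace(0, 40, 400)
series = np.sin(t) + 0.3 * np.sin(2.3 * t)
print("one-step prediction:", local_predict(series))
```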

  11. Staying Power of Churn Prediction Models

    NARCIS (Netherlands)

    Risselada, Hans; Verhoef, Peter C.; Bijmolt, Tammo H. A.

    In this paper, we study the staying power of various churn prediction models. Staying power is defined as the predictive performance of a model in a number of periods after the estimation period. We examine two methods, logit models and classification trees, both with and without applying a bagging

  12. Predictive user modeling with actionable attributes

    NARCIS (Netherlands)

    Zliobaite, I.; Pechenizkiy, M.

    2013-01-01

    Different machine learning techniques have been proposed and used for modeling individual and group user needs, interests and preferences. In the traditional predictive modeling instances are described by observable variables, called attributes. The goal is to learn a model for predicting the target

  13. Predictive models for monitoring and analysis of the total zooplankton

    Directory of Open Access Journals (Sweden)

    Obradović Milica

    2014-01-01

    Full Text Available In recent years, modeling and prediction of total zooplankton abundance have been performed by various tools and techniques, among which data mining tools have been less frequent. The purpose of this paper is to automatically determine the dependency degree and the influence of physical, chemical and biological parameters on the total zooplankton abundance, through design of specific data mining models. For this purpose, the analysis of key influencers was used. The analysis is based on the data obtained from the SeLaR information system - specifically, the data from two reservoirs (Gruža and Grošnica) with different morphometric characteristics and trophic state. The data are transformed into an optimal structure for data analysis, upon which a data mining model based on the Naïve Bayes algorithm is constructed. The results of the analysis imply that in both reservoirs, parameters of groups and species of zooplankton have the greatest influence on the total zooplankton abundance. If these inputs (group and zooplankton species) are left out, differences in the impact of physical, chemical and other biological parameters can be noted depending on the reservoir. In the Grošnica reservoir, the analysis showed that the temporal dimension (months), nitrates, water temperature, chemical oxygen demand, chlorophyll and chlorides had the key influence with strong relative impact. In the Gruža reservoir, the key influencing parameters for total zooplankton are: spatial dimension (location), water temperature and physiological groups of bacteria. The results show that the presented data mining model is usable on any kind of aquatic ecosystem and can also serve for the detection of inputs which could be the basis for future analysis and modeling.
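
    A minimal sketch of the kind of Naïve Bayes dependency analysis described above, using scikit-learn and randomly generated reservoir measurements rather than SeLaR data.

```python
# Sketch: Naive Bayes model relating physico-chemical inputs to a discretised
# zooplankton-abundance class. Data are randomly generated, not SeLaR records.
import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(1)
n = 200
X = np.column_stack([
    rng.normal(18, 4, n),     # water temperature (deg C)
    rng.normal(1.5, 0.5, n),  # nitrates (mg/L)
    rng.normal(12, 3, n),     # chemical oxygen demand (mg/L)
])
# Invented rule: warmer, more nutrient-rich water -> "high" zooplankton class.
y = ((X[:, 0] > 18) & (X[:, 1] > 1.5)).astype(int)

model = GaussianNB().fit(X, y)
print("training accuracy:", model.score(X, y))
print("P(high | 22 C, 2.0 mg/L NO3, 14 mg/L COD):",
      model.predict_proba([[22.0, 2.0, 14.0]])[0, 1])
```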

  14. Reef Fishes at All Trophic Levels Respond Positively to Effective Marine Protected Areas

    Science.gov (United States)

    Soler, German A.; Edgar, Graham J.; Thomson, Russell J.; Kininmonth, Stuart; Campbell, Stuart J.; Dawson, Terence P.; Barrett, Neville S.; Bernard, Anthony T. F.; Galván, David E.; Willis, Trevor J.; Alexander, Timothy J.; Stuart-Smith, Rick D.

    2015-01-01

    Marine Protected Areas (MPAs) offer a unique opportunity to test the assumption that fishing pressure affects some trophic groups more than others. Removal of larger predators through fishing is often suggested to have positive flow-on effects for some lower trophic groups, in which case protection from fishing should result in suppression of lower trophic groups as predator populations recover. We tested this by assessing differences in the trophic structure of reef fish communities associated with 79 MPAs and open-access sites worldwide, using a standardised quantitative dataset on reef fish community structure. The biomass of all major trophic groups (higher carnivores, benthic carnivores, planktivores and herbivores) was significantly greater (by 40% - 200%) in effective no-take MPAs relative to fished open-access areas. This effect was most pronounced for individuals in large size classes, but with no size class of any trophic group showing signs of depressed biomass in MPAs, as predicted from higher predator abundance. Thus, greater biomass in effective MPAs implies that exploitation on shallow rocky and coral reefs negatively affects biomass of all fish trophic groups and size classes. These direct effects of fishing on trophic structure appear stronger than any top down effects on lower trophic levels that would be imposed by intact predator populations. We propose that exploitation affects fish assemblages at all trophic levels, and that local ecosystem function is generally modified by fishing. PMID:26461104

  15. Reef Fishes at All Trophic Levels Respond Positively to Effective Marine Protected Areas.

    Directory of Open Access Journals (Sweden)

    German A Soler

    Full Text Available Marine Protected Areas (MPAs) offer a unique opportunity to test the assumption that fishing pressure affects some trophic groups more than others. Removal of larger predators through fishing is often suggested to have positive flow-on effects for some lower trophic groups, in which case protection from fishing should result in suppression of lower trophic groups as predator populations recover. We tested this by assessing differences in the trophic structure of reef fish communities associated with 79 MPAs and open-access sites worldwide, using a standardised quantitative dataset on reef fish community structure. The biomass of all major trophic groups (higher carnivores, benthic carnivores, planktivores and herbivores) was significantly greater (by 40% - 200%) in effective no-take MPAs relative to fished open-access areas. This effect was most pronounced for individuals in large size classes, but with no size class of any trophic group showing signs of depressed biomass in MPAs, as predicted from higher predator abundance. Thus, greater biomass in effective MPAs implies that exploitation on shallow rocky and coral reefs negatively affects biomass of all fish trophic groups and size classes. These direct effects of fishing on trophic structure appear stronger than any top down effects on lower trophic levels that would be imposed by intact predator populations. We propose that exploitation affects fish assemblages at all trophic levels, and that local ecosystem function is generally modified by fishing.

  16. EFFICIENT PREDICTIVE MODELLING FOR ARCHAEOLOGICAL RESEARCH

    OpenAIRE

    Balla, A.; Pavlogeorgatos, G.; Tsiafakis, D.; Pavlidis, G.

    2014-01-01

    The study presents a general methodology for designing, developing and implementing predictive modelling for identifying areas of archaeological interest. The methodology is based on documented archaeological data and geographical factors, geospatial analysis and predictive modelling, and has been applied to the identification of possible Macedonian tombs’ locations in Northern Greece. The model was tested extensively and the results were validated using a commonly used predictive gain, which...

  17. Robust predictions of the interacting boson model

    International Nuclear Information System (INIS)

    Casten, R.F.; Koeln Univ.

    1994-01-01

    While most recognized for its symmetries and algebraic structure, the IBA model has other less-well-known but equally intrinsic properties which give unavoidable, parameter-free predictions. These predictions concern central aspects of low-energy nuclear collective structure. This paper outlines these ''robust'' predictions and compares them with the data

  18. Comparison of Prediction-Error-Modelling Criteria

    DEFF Research Database (Denmark)

    Jørgensen, John Bagterp; Jørgensen, Sten Bay

    2007-01-01

    Single and multi-step prediction-error-methods based on the maximum likelihood and least squares criteria are compared. The prediction-error methods studied are based on predictions using the Kalman filter and Kalman predictors for a linear discrete-time stochastic state space model, which is a r...

  19. Long term patterns in the late summer trophic niche of the invasive pumpkinseed sunfish Lepomis gibbosus

    Directory of Open Access Journals (Sweden)

    Gkenas C.

    2016-01-01

    Full Text Available Quantifying the trophic dynamics of invasive species in novel habitats is important for predicting the success of potential invaders and evaluating their ecological effects. The North American pumpkinseed sunfish Lepomis gibbosus is a successful invader in Europe, where it has caused negative ecological effects primarily through trophic interactions. Here, we quantified variations in the late summer trophic niche of pumpkinseed during establishment and integration in the mainstem of the Guadiana river, using stomach content analyses over a period of 40 years. Pumpkinseed showed a shift from trophic specialization during establishment to trophic generalism during integration. These results were concomitant with an increase in diet breadth that was accompanied by higher individual diet specialization particularly in large individuals. Irrespective of their drivers, these changes in trophic niche suggest that the potential ecological effects of pumpkinseed on recipient ecosystems can vary temporally along the invasion process.

  20. Trophic signatures of seabirds suggest shifts in oceanic ecosystems

    Science.gov (United States)

    Gagne, Tyler O.; Hyrenbach, K. David; Hagemann, Molly E.; Van Houtan, Kyle S.

    2018-01-01

    Pelagic ecosystems are dynamic ocean regions whose immense natural capital is affected by climate change, pollution, and commercial fisheries. Trophic level–based indicators derived from fishery catch data may reveal the food web status of these systems, but the utility of these metrics has been debated because of targeting bias in fisheries catch. We analyze a unique, fishery-independent data set of North Pacific seabird tissues to inform ecosystem trends over 13 decades (1890s to 2010s). Trophic position declined broadly in five of eight species sampled, indicating a long-term shift from higher–trophic level to lower–trophic level prey. No species increased their trophic position. Given species prey preferences, Bayesian diet reconstructions suggest a shift from fishes to squids, a result consistent with both catch reports and ecosystem models. Machine learning models further reveal that trophic position trends have a complex set of drivers including climate, commercial fisheries, and ecomorphology. Our results show that multiple species of fish-consuming seabirds may track the complex changes occurring in marine ecosystems. PMID:29457134

  1. Trophic levels of fish species of commercial importance in the Colombian Caribbean

    Directory of Open Access Journals (Sweden)

    Camilo B García

    2011-09-01

    Full Text Available Ecological studies on commercially important fish species are of great value to support resource management issues. This study calculated trophic levels of those Colombian Caribbean fish species whose diet has been locally described. Usable diet data of 119 species resulted in 164 trophic level estimates. An ordinary regression model relating trophic level and fish size was formulated. The regression slope was positive and significantly different from zero (p<0.05), suggesting a scaling of trophic level with fish size. Both the list of trophic levels and the regression model should be of help in the formulation of trophic indicators and models of neotropical ecosystems. Rev. Biol. Trop. 59 (3): 1195-1203. Epub 2011 September 01.
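
    A sketch of the size-scaling regression described above, fitted with SciPy on invented trophic-level and length data rather than the 164 Colombian Caribbean estimates.

```python
# Sketch: ordinary least-squares regression of trophic level on fish size.
# The data below are invented; they only illustrate the fitting step.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
length_cm = rng.uniform(10, 200, 60)                       # maximum fish length
trophic = 2.5 + 0.008 * length_cm + rng.normal(0, 0.3, 60)

fit = stats.linregress(length_cm, trophic)
print(f"slope = {fit.slope:.4f}, p-value = {fit.pvalue:.3g}")
print(f"predicted trophic level at 100 cm: {fit.intercept + fit.slope * 100:.2f}")
```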

  2. Simulated tri-trophic networks reveal complex relationships between species diversity and interaction diversity.

    Science.gov (United States)

    Pardikes, Nicholas A; Lumpkin, Will; Hurtado, Paul J; Dyer, Lee A

    2018-01-01

    Most of earth's biodiversity is comprised of interactions among species, yet it is unclear what causes variation in interaction diversity across space and time. We define interaction diversity as the richness and relative abundance of interactions linking species together at scales from localized, measurable webs to entire ecosystems. Large-scale patterns suggest that two basic components of interaction diversity differ substantially and predictably between different ecosystems: overall taxonomic diversity and host specificity of consumers. Understanding how these factors influence interaction diversity, and quantifying the causes and effects of variation in interaction diversity are important goals for community ecology. While previous studies have examined the effects of sampling bias and consumer specialization on determining patterns of ecological networks, these studies were restricted to two trophic levels and did not incorporate realistic variation in species diversity and consumer diet breadth. Here, we developed a food web model to generate tri-trophic ecological networks, and evaluated specific hypotheses about how the diversity of trophic interactions and species diversity are related under different scenarios of species richness, taxonomic abundance, and consumer diet breadth. We investigated the accumulation of species and interactions and found that interactions accumulate more quickly; thus, the accumulation of novel interactions may require less sampling effort than sampling species in order to get reliable estimates of either type of diversity. Mean consumer diet breadth influenced the correlation between species and interaction diversity significantly more than variation in both species richness and taxonomic abundance. However, this effect of diet breadth on interaction diversity is conditional on the number of observed interactions included in the models. The results presented here will help develop realistic predictions of the relationships

  3. Extracting falsifiable predictions from sloppy models.

    Science.gov (United States)

    Gutenkunst, Ryan N; Casey, Fergal P; Waterfall, Joshua J; Myers, Christopher R; Sethna, James P

    2007-12-01

    Successful predictions are among the most compelling validations of any model. Extracting falsifiable predictions from nonlinear multiparameter models is complicated by the fact that such models are commonly sloppy, possessing sensitivities to different parameter combinations that range over many decades. Here we discuss how sloppiness affects the sorts of data that best constrain model predictions, makes linear uncertainty approximations dangerous, and introduces computational difficulties in Monte-Carlo uncertainty analysis. We also present a useful test problem and suggest refinements to the standards by which models are communicated.

  4. The prediction of epidemics through mathematical modeling.

    Science.gov (United States)

    Schaus, Catherine

    2014-01-01

    Mathematical models may be used in an effort to predict the development of epidemics. The SIR model is one such application. Such approaches remain too approximate, and the use of statistics awaits more data in order to come closer to reality.
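
    The SIR model mentioned above can be sketched in a few lines; the transmission and recovery rates used here are arbitrary.

```python
# Sketch of the SIR epidemic model: susceptible, infected, recovered fractions.
import numpy as np
from scipy.integrate import solve_ivp

beta, gamma = 0.3, 0.1          # arbitrary transmission and recovery rates (1/day)

def sir(t, y):
    s, i, r = y
    return [-beta * s * i, beta * s * i - gamma * i, gamma * i]

sol = solve_ivp(sir, (0, 160), [0.99, 0.01, 0.0], t_eval=np.linspace(0, 160, 161))
print("peak infected fraction:", sol.y[1].max())
```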

  5. Calibration of PMIS pavement performance prediction models.

    Science.gov (United States)

    2012-02-01

    Improve the accuracy of TxDOT's existing pavement performance prediction models through calibrating these models using actual field data obtained from the Pavement Management Information System (PMIS). Ensure logical performance superiority patte...

  6. Evaluating Predictive Uncertainty of Hyporheic Exchange Modelling

    Science.gov (United States)

    Chow, R.; Bennett, J.; Dugge, J.; Wöhling, T.; Nowak, W.

    2017-12-01

    Hyporheic exchange is the interaction of water between rivers and groundwater, and is difficult to predict. One of the largest contributions to predictive uncertainty for hyporheic fluxes has been attributed to the representation of heterogeneous subsurface properties. This research aims to evaluate which aspect of the subsurface representation - the spatial distribution of hydrofacies or the model for local-scale (within-facies) heterogeneity - most influences the predictive uncertainty. Also, we seek to identify data types that help reduce this uncertainty best. For this investigation, we conduct a modelling study of the Steinlach River meander, in Southwest Germany. The Steinlach River meander is an experimental site established in 2010 to monitor hyporheic exchange at the meander scale. We use HydroGeoSphere, a fully integrated surface water-groundwater model, to model hyporheic exchange and to assess the predictive uncertainty of hyporheic exchange transit times (HETT). A highly parameterized complex model is built and treated as 'virtual reality', which is in turn modelled with simpler subsurface parameterization schemes (Figure). Then, we conduct Monte-Carlo simulations with these models to estimate the predictive uncertainty. Results indicate that: Uncertainty in HETT is relatively small for early times and increases with transit times. Uncertainty from local-scale heterogeneity is negligible compared to uncertainty in the hydrofacies distribution. Introducing more data to a poor model structure may reduce predictive variance, but does not reduce predictive bias. Hydraulic head observations alone cannot constrain the uncertainty of HETT, however an estimate of hyporheic exchange flux proves to be more effective at reducing this uncertainty. Figure: Approach for evaluating predictive model uncertainty. A conceptual model is first developed from the field investigations. A complex model ('virtual reality') is then developed based on that conceptual model

  7. Case studies in archaeological predictive modelling

    NARCIS (Netherlands)

    Verhagen, Jacobus Wilhelmus Hermanus Philippus

    2007-01-01

    In this thesis, a collection of papers is put together dealing with various quantitative aspects of predictive modelling and archaeological prospection. Among the issues covered are the effects of survey bias on the archaeological data used for predictive modelling, and the complexities of testing

  8. Clinical Prediction Models for Cardiovascular Disease: Tufts Predictive Analytics and Comparative Effectiveness Clinical Prediction Model Database.

    Science.gov (United States)

    Wessler, Benjamin S; Lai Yh, Lana; Kramer, Whitney; Cangelosi, Michael; Raman, Gowri; Lutz, Jennifer S; Kent, David M

    2015-07-01

    Clinical prediction models (CPMs) estimate the probability of clinical outcomes and hold the potential to improve decision making and individualize care. For patients with cardiovascular disease, there are numerous CPMs available although the extent of this literature is not well described. We conducted a systematic review for articles containing CPMs for cardiovascular disease published between January 1990 and May 2012. Cardiovascular disease includes coronary heart disease, heart failure, arrhythmias, stroke, venous thromboembolism, and peripheral vascular disease. We created a novel database and characterized CPMs based on the stage of development, population under study, performance, covariates, and predicted outcomes. There are 796 models included in this database. The number of CPMs published each year is increasing steadily over time. Seven hundred seventeen (90%) are de novo CPMs, 21 (3%) are CPM recalibrations, and 58 (7%) are CPM adaptations. This database contains CPMs for 31 index conditions, including 215 CPMs for patients with coronary artery disease, 168 CPMs for population samples, and 79 models for patients with heart failure. There are 77 distinct index/outcome pairings. Of the de novo models in this database, 450 (63%) report a c-statistic and 259 (36%) report some information on calibration. There is an abundance of CPMs available for a wide assortment of cardiovascular disease conditions, with substantial redundancy in the literature. The comparative performance of these models, the consistency of effects and risk estimates across models and the actual and potential clinical impact of this body of literature is poorly understood. © 2015 American Heart Association, Inc.

  9. Trophic transfer of pyrene metabolites between aquatic invertebrates

    International Nuclear Information System (INIS)

    Carrasco Navarro, V.; Leppänen, M.T.; Kukkonen, J.V.K.; Godoy Olmos, S.

    2013-01-01

    The trophic transfer of pyrene metabolites was studied using Gammarus setosus as a predator and the invertebrates Lumbriculus variegatus and Chironomus riparius as prey. The results obtained by liquid scintillation counting confirmed that the pyrene metabolites produced by the aquatic invertebrates L. variegatus and C. riparius were transferred to G. setosus through the diet. More detailed analyses by liquid chromatography discovered that two of the metabolites produced by C. riparius appeared in the chromatograms of G. setosus tissue extracts, proving their trophic transfer. These metabolites were not present in chromatograms of G. setosus exclusively exposed to pyrene. The present study supports the trophic transfer of PAH metabolites between benthic macroinvertebrates and common species of an arctic amphipod. As some PAH metabolites are more toxic than the parent compounds, the present study raises concerns about the consequences of their trophic transfer and the fate and effects of PAHs in natural environments. - Highlights: ► The trophic transfer of pyrene metabolites between invertebrates was evaluated. ► Biotransformation of pyrene by L. variegatus and C. riparius is different. ► Metabolites produced by L. variegatus and C. riparius are transferred to G. setosus. ► Specifically, two metabolites produced by C. riparius were transferred. - Some of the pyrene metabolites produced by the model invertebrates L. variegatus and C. riparius are transferred to G. setosus through the diet, proving their trophic transfer.

  10. Incorporating uncertainty in predictive species distribution modelling.

    Science.gov (United States)

    Beale, Colin M; Lennon, Jack J

    2012-01-19

    Motivated by the need to solve ecological problems (climate change, habitat fragmentation and biological invasions), there has been increasing interest in species distribution models (SDMs). Predictions from these models inform conservation policy, invasive species management and disease-control measures. However, predictions are subject to uncertainty, the degree and source of which is often unrecognized. Here, we review the SDM literature in the context of uncertainty, focusing on three main classes of SDM: niche-based models, demographic models and process-based models. We identify sources of uncertainty for each class and discuss how uncertainty can be minimized or included in the modelling process to give realistic measures of confidence around predictions. Because this has typically not been performed, we conclude that uncertainty in SDMs has often been underestimated and a false precision assigned to predictions of geographical distribution. We identify areas where development of new statistical tools will improve predictions from distribution models, notably the development of hierarchical models that link different types of distribution model and their attendant uncertainties across spatial scales. Finally, we discuss the need to develop more defensible methods for assessing predictive performance, quantifying model goodness-of-fit and for assessing the significance of model covariates.

  11. Model Predictive Control for Smart Energy Systems

    DEFF Research Database (Denmark)

    Halvgaard, Rasmus

    pumps, heat tanks, electrical vehicle battery charging/discharging, wind farms, power plants). 2. Embed forecasting methodologies for the weather (e.g. temperature, solar radiation), the electricity consumption, and the electricity price in a predictive control system. 3. Develop optimization algorithms. ... Chapter 3 introduces Model Predictive Control (MPC) including state estimation, filtering and prediction for linear models. Chapter 4 simulates the models from Chapter 2 with the certainty equivalent MPC from Chapter 3. An economic MPC minimizes the costs of consumption based on real electricity prices ... that determined the flexibility of the units. A predictive control system easily handles constraints, e.g. limitations in power consumption, and predicts the future behavior of a unit by integrating predictions of electricity prices, consumption, and weather variables. The simulations demonstrate the expected...
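
    A stylised sketch of the economic-MPC idea described above (shifting flexible consumption towards cheap hours), posed as a linear program over a single heat-storage unit; the dynamics, bounds and price series are invented and far simpler than the thesis models.

```python
# Stylised economic MPC step: choose heating inputs u_0..u_{N-1} that keep a
# storage tank within bounds at minimum electricity cost. Numbers are invented.
import numpy as np
from scipy.optimize import linprog

N = 24
price = 0.20 + 0.10 * np.sin(np.arange(N) / 24 * 2 * np.pi)   # price forecast (per kWh)
demand = np.full(N, 1.0)                                       # heat demand per hour (kWh)
x0, x_min, x_max, u_max = 3.0, 1.0, 8.0, 3.0                   # tank state and bounds

# State evolution: x_k = x0 + sum_{i<=k} (u_i - demand_i); keep x_min <= x_k <= x_max.
L = np.tril(np.ones((N, N)))              # lower-triangular cumulative-sum matrix
cum_d = np.cumsum(demand)
A_ub = np.vstack([L, -L])                 # upper bounds, then lower bounds, on the state
b_ub = np.concatenate([x_max - x0 + cum_d, -(x_min - x0 + cum_d)])

res = linprog(c=price, A_ub=A_ub, b_ub=b_ub, bounds=[(0.0, u_max)] * N)
print("optimal cost:", round(res.fun, 2))
print("heating plan (kWh):", np.round(res.x, 2))
```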

  12. Evaluating the Predictive Value of Growth Prediction Models

    Science.gov (United States)

    Murphy, Daniel L.; Gaertner, Matthew N.

    2014-01-01

    This study evaluates four growth prediction models--projection, student growth percentile, trajectory, and transition table--commonly used to forecast (and give schools credit for) middle school students' future proficiency. Analyses focused on vertically scaled summative mathematics assessments, and two performance standards conditions (high…

  13. Model predictive control classical, robust and stochastic

    CERN Document Server

    Kouvaritakis, Basil

    2016-01-01

    For the first time, a textbook that brings together classical predictive control with treatment of up-to-date robust and stochastic techniques. Model Predictive Control describes the development of tractable algorithms for uncertain, stochastic, constrained systems. The starting point is classical predictive control and the appropriate formulation of performance objectives and constraints to provide guarantees of closed-loop stability and performance. Moving on to robust predictive control, the text explains how similar guarantees may be obtained for cases in which the model describing the system dynamics is subject to additive disturbances and parametric uncertainties. Open- and closed-loop optimization are considered and the state of the art in computationally tractable methods based on uncertainty tubes presented for systems with additive model uncertainty. Finally, the tube framework is also applied to model predictive control problems involving hard or probabilistic constraints for the cases of multiplic...

  14. Modeling, robust and distributed model predictive control for freeway networks

    NARCIS (Netherlands)

    Liu, S.

    2016-01-01

    In Model Predictive Control (MPC) for traffic networks, traffic models are crucial since they are used as prediction models for determining the optimal control actions. In order to reduce the computational complexity of MPC for traffic networks, macroscopic traffic models are often used instead of

  15. Deep Predictive Models in Interactive Music

    OpenAIRE

    Martin, Charles P.; Ellefsen, Kai Olav; Torresen, Jim

    2018-01-01

    Automatic music generation is a compelling task where much recent progress has been made with deep learning models. In this paper, we ask how these models can be integrated into interactive music systems; how can they encourage or enhance the music making of human users? Musical performance requires prediction to operate instruments, and perform in groups. We argue that predictive models could help interactive systems to understand their temporal context, and ensemble behaviour. Deep learning...

  16. Unreachable Setpoints in Model Predictive Control

    DEFF Research Database (Denmark)

    Rawlings, James B.; Bonné, Dennis; Jørgensen, John Bagterp

    2008-01-01

    In this work, a new model predictive controller is developed that handles unreachable setpoints better than traditional model predictive control methods. The new controller induces an interesting fast/slow asymmetry in the tracking response of the system. Nominal asymptotic stability of the optimal...... steady state is established for terminal constraint model predictive control (MPC). The region of attraction is the steerable set. Existing analysis methods for closed-loop properties of MPC are not applicable to this new formulation, and a new analysis method is developed. It is shown how to extend...

  17. Bayesian Predictive Models for Rayleigh Wind Speed

    DEFF Research Database (Denmark)

    Shahirinia, Amir; Hajizadeh, Amin; Yu, David C

    2017-01-01

    predictive model of the wind speed aggregates the non-homogeneous distributions into a single continuous distribution. Therefore, the result is able to capture the variation among the probability distributions of the wind speeds at the turbines’ locations in a wind farm. More specifically, instead of using...... a wind speed distribution whose parameters are known or estimated, the parameters are considered as random whose variations are according to probability distributions. The Bayesian predictive model for a Rayleigh which only has a single model scale parameter has been proposed. Also closed-form posterior...... and predictive inferences under different reasonable choices of prior distribution in sensitivity analysis have been presented....
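
    A small sketch of one possible Bayesian treatment of the Rayleigh scale parameter, assuming a conjugate inverse-gamma prior on the squared scale and Monte Carlo sampling of the posterior predictive; this illustrates the general approach, not the paper's exact closed-form results, and the wind data are simulated.

```python
# Sketch: Bayesian posterior and posterior-predictive sampling for Rayleigh wind
# speeds, with an (assumed) conjugate inverse-gamma prior on theta = sigma^2.
import numpy as np

rng = np.random.default_rng(3)
speeds = rng.rayleigh(scale=7.0, size=50)        # synthetic hourly wind speeds (m/s)

# Conjugate update: theta ~ InvGamma(a, b) gives posterior InvGamma(a + n, b + sum(x^2)/2).
a0, b0 = 2.0, 20.0                               # weakly informative prior (assumed)
a_post = a0 + len(speeds)
b_post = b0 + 0.5 * np.sum(speeds ** 2)

# Posterior-predictive wind speeds: draw theta, then draw Rayleigh(sqrt(theta)).
theta = 1.0 / rng.gamma(shape=a_post, scale=1.0 / b_post, size=10_000)
predictive = rng.rayleigh(scale=np.sqrt(theta))

print("sqrt of posterior mean of sigma^2:", np.sqrt(b_post / (a_post - 1)))
print("predictive 95th-percentile wind speed:", np.percentile(predictive, 95))
```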

  18. Food chain model to predict westslope cutthroat trout ovary selenium concentrations from water concentrations in the Elk Valley, BC

    International Nuclear Information System (INIS)

    Orr, P.; Wiramanaden, C.; Franklin, W.; Fraser, C.

    2010-01-01

    The 5 coal mines operated by Teck Coal Ltd. in British Columbia's Elk River watershed release selenium during weathering of mine waste rock. Since 1966, several field studies have been conducted in which selenium concentrations in biota were measured. They revealed that tissue concentrations are higher in aquatic biota sampled in lentic compared to lotic habitats of the watershed with similar water selenium concentrations. Two food chain models were developed based on the available data. The models described dietary selenium accumulation in the ovaries of lotic versus lentic westslope cutthroat trout (WCT), a valued aquatic resource in the Elk River system. The following 3 trophic transfer relationships were characterized for each model: (1) water to base of the food web, (2) base of the food web to benthic invertebrates, and (3) benthic invertebrates to WCT ovaries. The lotic and lentic models combined the resulting equations for each trophic transfer relationships to predict WCT ovary concentrations from water concentrations. The models were in very good agreement with the available data, despite fish movement and the fact that composite benthic invertebrate sample data were only an approximation of the feeding preferences of individual fish. Based on the observed rates of increase in water selenium concentrations throughout the watershed, the models predicted very small/slow increases in WCT ovary concentrations with time.
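
    The chained structure described above can be sketched as three composed transfer functions; the coefficients below are placeholders, not the Elk Valley models' fitted values, and a real application would use separate lotic and lentic parameterisations.

```python
# Sketch of a chained food-web transfer model: water Se -> base of food web ->
# benthic invertebrates -> westslope cutthroat trout (WCT) ovaries.
# All coefficients are placeholders, not the fitted Elk Valley relationships.
def water_to_base(se_water_ugL, k=2.0, n=0.6):
    """Periphyton/biofilm Se (mg/kg dw) from water Se; power-law placeholder."""
    return k * se_water_ugL ** n

def base_to_invertebrates(se_base, ttf=1.8):
    """Benthic invertebrate Se via a trophic transfer factor (placeholder)."""
    return ttf * se_base

def invertebrates_to_ovary(se_invert, ttf=1.4):
    """WCT ovary Se via a second trophic transfer factor (placeholder)."""
    return ttf * se_invert

for se_water in (1.0, 5.0, 20.0):                       # water Se scenarios (ug/L)
    ovary = invertebrates_to_ovary(base_to_invertebrates(water_to_base(se_water)))
    print(f"water {se_water:5.1f} ug/L -> predicted ovary Se {ovary:5.1f} mg/kg dw")
```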

  19. Predictive Modelling and Time: An Experiment in Temporal Archaeological Predictive Models

    OpenAIRE

    David Ebert

    2006-01-01

    One of the most common criticisms of archaeological predictive modelling is that it fails to account for temporal or functional differences in sites. However, a practical solution to temporal or functional predictive modelling has proven to be elusive. This article discusses temporal predictive modelling, focusing on the difficulties of employing temporal variables, then introduces and tests a simple methodology for the implementation of temporal modelling. The temporal models thus created ar...

  20. Fingerprint verification prediction model in hand dermatitis.

    Science.gov (United States)

    Lee, Chew K; Chang, Choong C; Johor, Asmah; Othman, Puwira; Baba, Roshidah

    2015-07-01

    Hand dermatitis associated fingerprint changes is a significant problem and affects fingerprint verification processes. This study was done to develop a clinically useful prediction model for fingerprint verification in patients with hand dermatitis. A case-control study involving 100 patients with hand dermatitis. All patients verified their thumbprints against their identity card. Registered fingerprints were randomized into a model derivation and model validation group. Predictive model was derived using multiple logistic regression. Validation was done using the goodness-of-fit test. The fingerprint verification prediction model consists of a major criterion (fingerprint dystrophy area of ≥ 25%) and two minor criteria (long horizontal lines and long vertical lines). The presence of the major criterion predicts it will almost always fail verification, while presence of both minor criteria and presence of one minor criterion predict high and low risk of fingerprint verification failure, respectively. When none of the criteria are met, the fingerprint almost always passes the verification. The area under the receiver operating characteristic curve was 0.937, and the goodness-of-fit test showed agreement between the observed and expected number (P = 0.26). The derived fingerprint verification failure prediction model is validated and highly discriminatory in predicting risk of fingerprint verification in patients with hand dermatitis. © 2014 The International Society of Dermatology.
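
    The decision rule reported above (one major criterion, two minor criteria) can be written out directly; the sketch below encodes the stratification as described in the abstract, with the category wording paraphrased.

```python
# Sketch of the reported fingerprint-verification risk rule: dystrophy covering
# >= 25% of the print is the major criterion; long horizontal and long vertical
# lines are the minor criteria. Category labels are paraphrased.
def verification_risk(dystrophy_pct, long_horizontal, long_vertical):
    if dystrophy_pct >= 25:
        return "almost always fails verification"
    minors = int(long_horizontal) + int(long_vertical)
    if minors == 2:
        return "high risk of verification failure"
    if minors == 1:
        return "low risk of verification failure"
    return "almost always passes verification"

print(verification_risk(30, False, False))   # major criterion present
print(verification_risk(10, True, True))     # both minor criteria present
print(verification_risk(5, False, False))    # no criteria present
```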

  1. Massive Predictive Modeling using Oracle R Enterprise

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    R is fast becoming the lingua franca for analyzing data via statistics, visualization, and predictive analytics. For enterprise-scale data, R users have three main concerns: scalability, performance, and production deployment. Oracle's R-based technologies - Oracle R Distribution, Oracle R Enterprise, Oracle R Connector for Hadoop, and the R package ROracle - address these concerns. In this talk, we introduce Oracle's R technologies, highlighting how each enables R users to achieve scalability and performance while making production deployment of R results a natural outcome of the data analyst/scientist efforts. The focus then turns to Oracle R Enterprise with code examples using the transparency layer and embedded R execution, targeting massive predictive modeling. One goal behind massive predictive modeling is to build models per entity, such as customers, zip codes, simulations, in an effort to understand behavior and tailor predictions at the entity level. Predictions...

  2. Multi-model analysis in hydrological prediction

    Science.gov (United States)

    Lanthier, M.; Arsenault, R.; Brissette, F.

    2017-12-01

    Hydrologic modelling, by nature, is a simplification of the real-world hydrologic system. Therefore ensemble hydrological predictions thus obtained do not present the full range of possible streamflow outcomes, thereby producing ensembles which demonstrate errors in variance such as under-dispersion. Past studies show that lumped models used in prediction mode can return satisfactory results, especially when there is not enough information available on the watershed to run a distributed model. But all lumped models greatly simplify the complex processes of the hydrologic cycle. To generate more spread in the hydrologic ensemble predictions, multi-model ensembles have been considered. In this study, the aim is to propose and analyse a method that gives an ensemble streamflow prediction that properly represents the forecast probabilities and reduced ensemble bias. To achieve this, three simple lumped models are used to generate an ensemble. These will also be combined using multi-model averaging techniques, which generally generate a more accurate hydrogram than the best of the individual models in simulation mode. This new predictive combined hydrogram is added to the ensemble, thus creating a large ensemble which may improve the variability while also improving the ensemble mean bias. The quality of the predictions is then assessed on different periods: 2 weeks, 1 month, 3 months and 6 months using a PIT Histogram of the percentiles of the real observation volumes with respect to the volumes of the ensemble members. Initially, the models were run using historical weather data to generate synthetic flows. This worked for individual models, but not for the multi-model and for the large ensemble. Consequently, by performing data assimilation at each prediction period and thus adjusting the initial states of the models, the PIT Histogram could be constructed using the observed flows while allowing the use of the multi-model predictions. The under-dispersion has been
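
    The PIT-histogram check described above amounts to asking where each observation falls within its forecast ensemble: a roughly flat histogram indicates a well-dispersed ensemble, while a U-shape indicates under-dispersion. The sketch below uses synthetic ensembles rather than streamflow volumes.

```python
# Sketch: probability integral transform (PIT) values for ensemble forecasts.
# Each PIT value is the fraction of ensemble members not exceeding the observation.
import numpy as np

rng = np.random.default_rng(4)
n_forecasts, n_members = 200, 30
ensembles = rng.normal(loc=100.0, scale=15.0, size=(n_forecasts, n_members))
observations = rng.normal(loc=100.0, scale=15.0, size=n_forecasts)

pit = (ensembles <= observations[:, None]).mean(axis=1)
counts, edges = np.histogram(pit, bins=10, range=(0.0, 1.0))
print("PIT histogram counts:", counts)   # under-dispersion shows up as U-shaped counts
```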

  3. Prostate Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing prostate cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  4. Colorectal Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing colorectal cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  5. Esophageal Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing esophageal cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  6. Bladder Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing bladder cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  7. Lung Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing lung cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  8. Breast Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing breast cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  9. Pancreatic Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing pancreatic cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  10. Ovarian Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing ovarian cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  11. Liver Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing liver cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  12. Testicular Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing testicular cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  13. Cervical Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing cervical cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  14. Modeling and Prediction Using Stochastic Differential Equations

    DEFF Research Database (Denmark)

    Juhl, Rune; Møller, Jan Kloppenborg; Jørgensen, John Bagterp

    2016-01-01

    Pharmacokinetic/pharmakodynamic (PK/PD) modeling for a single subject is most often performed using nonlinear models based on deterministic ordinary differential equations (ODEs), and the variation between subjects in a population of subjects is described using a population (mixed effects) setup...... deterministic and can predict the future perfectly. A more realistic approach would be to allow for randomness in the model due to e.g., the model be too simple or errors in input. We describe a modeling and prediction setup which better reflects reality and suggests stochastic differential equations (SDEs...

  15. Predictive Model of Systemic Toxicity (SOT)

    Science.gov (United States)

    In an effort to ensure chemical safety in light of regulatory advances away from reliance on animal testing, USEPA and L’Oréal have collaborated to develop a quantitative systemic toxicity prediction model. Prediction of human systemic toxicity has proved difficult and remains a ...

  16. Multiple attractors and boundary crises in a tri-trophic food chain.

    NARCIS (Netherlands)

    Boer, M.P.; Kooi, B.W.; Kooijman, S.A.L.M.

    2001-01-01

    The asymptotic behaviour of a model of a tri-trophic food chain in the chemostat is analysed in detail. The Monod growth model is used for all trophic levels, yielding a non-linear dynamical system of four ordinary differential equations. Mass conservation makes it possible to reduce the dimension
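
    A sketch of a Monod-based tri-trophic chemostat of the kind analysed above: substrate, prey, predator and top predator with Monod growth at every level. Parameter values are illustrative only, not those of the paper.

```python
# Sketch of a tri-trophic chemostat with Monod growth at every trophic level:
# substrate S, prey X1, predator X2, top predator X3. Parameters are illustrative.
import numpy as np
from scipy.integrate import solve_ivp

D, S_in = 0.1, 200.0                      # dilution rate (1/h) and feed substrate
mu = (0.5, 0.3, 0.2)                      # maximum growth rates
K = (10.0, 10.0, 10.0)                    # half-saturation constants
y = (0.5, 0.5, 0.5)                       # yield coefficients

def monod(mu_max, k, resource):
    return mu_max * resource / (k + resource)

def chemostat(t, state):
    s, x1, x2, x3 = state
    g1, g2, g3 = monod(mu[0], K[0], s), monod(mu[1], K[1], x1), monod(mu[2], K[2], x2)
    return [D * (S_in - s) - g1 * x1 / y[0],
            g1 * x1 - D * x1 - g2 * x2 / y[1],
            g2 * x2 - D * x2 - g3 * x3 / y[2],
            g3 * x3 - D * x3]

sol = solve_ivp(chemostat, (0.0, 500.0), [S_in, 5.0, 2.0, 1.0], rtol=1e-8)
print("final state (S, X1, X2, X3):", np.round(sol.y[:, -1], 2))
```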

  17. Trophic transfer of metal nanoparticles in freshwater ecosystems

    DEFF Research Database (Denmark)

    Tangaa, Stine Rosendal

    freshwater ecosystems range from a few ng/L in surface waters and up to mg/kg in sediments. Several studies have shown Ag ENPs to be toxic, bioaccumulative and harmful to aquatic biota within these concentration ranges. However, research on potential trophic transfer of Ag ENPs is limited. To investigate...... the aquatic ecosystems, Ag ENPs will undergo several transformation processes, ultimately leading to particles settling out of the water column. This will likely result in an increased concentration of ENPs in the sediment. In fact, predicted environmental concentrations of Ag ENPs in Danish and European...... freshwater food web. Future studies should concentrate on the internal distribution of Me-ENPs after uptake in both prey and predator, as this will increase the understanding of fate and effects of Me-ENPs on aquatic biota. Trophic transfer studies including more trophic levels, and higher pelagic organisms...

  18. Spent fuel: prediction model development

    International Nuclear Information System (INIS)

    Almassy, M.Y.; Bosi, D.M.; Cantley, D.A.

    1979-07-01

    The need for spent fuel disposal performance modeling stems from a requirement to assess the risks involved with deep geologic disposal of spent fuel, and to support licensing and public acceptance of spent fuel repositories. Through the balanced program of analysis, diagnostic testing, and disposal demonstration tests, highlighted in this presentation, the goal of defining risks and of quantifying fuel performance during long-term disposal can be attained

  19. Navy Recruit Attrition Prediction Modeling

    Science.gov (United States)

    2014-09-01

    have high correlation with attrition, such as age, job characteristics, command climate, marital status, behavior issues prior to recruitment, and the ... the additive model: glm(formula = Outcome ~ Age + Gender + Marital + AFQTCat + Pay + Ed + Dep, family = binomial, data = ltraining); dispersion parameter for binomial family taken to be 1; null deviance 105441 on 85221 degrees of freedom; residual deviance ...

  20. Predicting and Modeling RNA Architecture

    Science.gov (United States)

    Westhof, Eric; Masquida, Benoît; Jossinet, Fabrice

    2011-01-01

    SUMMARY A general approach for modeling the architecture of large and structured RNA molecules is described. The method exploits the modularity and the hierarchical folding of RNA architecture that is viewed as the assembly of preformed double-stranded helices defined by Watson-Crick base pairs and RNA modules maintained by non-Watson-Crick base pairs. Despite the extensive molecular neutrality observed in RNA structures, specificity in RNA folding is achieved through global constraints like lengths of helices, coaxiality of helical stacks, and structures adopted at the junctions of helices. The Assemble integrated suite of computer tools allows for sequence and structure analysis as well as interactive modeling by homology or ab initio assembly with possibilities for fitting within electronic density maps. The local key role of non-Watson-Crick pairs guides RNA architecture formation and offers metrics for assessing the accuracy of three-dimensional models in a more useful way than usual root mean square deviation (RMSD) values. PMID:20504963

  1. Predictive Models and Computational Toxicology (II IBAMTOX)

    Science.gov (United States)

    EPA’s ‘virtual embryo’ project is building an integrative systems biology framework for predictive models of developmental toxicity. One schema involves a knowledge-driven adverse outcome pathway (AOP) framework utilizing information from public databases, standardized ontologies...

  2. Finding furfural hydrogenation catalysts via predictive modelling

    NARCIS (Netherlands)

    Strassberger, Z.; Mooijman, M.; Ruijter, E.; Alberts, A.H.; Maldonado, A.G.; Orru, R.V.A.; Rothenberg, G.

    2010-01-01

    We combine multicomponent reactions, catalytic performance studies and predictive modelling to find transfer hydrogenation catalysts. An initial set of 18 ruthenium-carbene complexes were synthesized and screened in the transfer hydrogenation of furfural to furfurol with isopropyl alcohol complexes

  3. FINITE ELEMENT MODEL FOR PREDICTING RESIDUAL ...

    African Journals Online (AJOL)

    FINITE ELEMENT MODEL FOR PREDICTING RESIDUAL STRESSES IN ... the transverse residual stress in the x-direction (σx) had a maximum value of 375MPa ... the finite element method are in fair agreement with the experimental results.

  4. Evaluation of CASP8 model quality predictions

    KAUST Repository

    Cozzetto, Domenico; Kryshtafovych, Andriy; Tramontano, Anna

    2009-01-01

    established a prediction category to evaluate their performance in 2006. In 2008 the experiment was repeated and its results are reported here. Participants were invited to infer the correctness of the protein models submitted by the registered automatic

  5. Mental models accurately predict emotion transitions.

    Science.gov (United States)

    Thornton, Mark A; Tamir, Diana I

    2017-06-06

    Successful social interactions depend on people's ability to predict others' future actions and emotions. People possess many mechanisms for perceiving others' current emotional states, but how might they use this information to predict others' future states? We hypothesized that people might capitalize on an overlooked aspect of affective experience: current emotions predict future emotions. By attending to regularities in emotion transitions, perceivers might develop accurate mental models of others' emotional dynamics. People could then use these mental models of emotion transitions to predict others' future emotions from currently observable emotions. To test this hypothesis, studies 1-3 used data from three extant experience-sampling datasets to establish the actual rates of emotional transitions. We then collected three parallel datasets in which participants rated the transition likelihoods between the same set of emotions. Participants' ratings of emotion transitions predicted others' experienced transitional likelihoods with high accuracy. Study 4 demonstrated that four conceptual dimensions of mental state representation-valence, social impact, rationality, and human mind-inform participants' mental models. Study 5 used 2 million emotion reports on the Experience Project to replicate both of these findings: again people reported accurate models of emotion transitions, and these models were informed by the same four conceptual dimensions. Importantly, neither these conceptual dimensions nor holistic similarity could fully explain participants' accuracy, suggesting that their mental models contain accurate information about emotion dynamics above and beyond what might be predicted by static emotion knowledge alone.

  6. Mental models accurately predict emotion transitions

    Science.gov (United States)

    Thornton, Mark A.; Tamir, Diana I.

    2017-01-01

    Successful social interactions depend on people’s ability to predict others’ future actions and emotions. People possess many mechanisms for perceiving others’ current emotional states, but how might they use this information to predict others’ future states? We hypothesized that people might capitalize on an overlooked aspect of affective experience: current emotions predict future emotions. By attending to regularities in emotion transitions, perceivers might develop accurate mental models of others’ emotional dynamics. People could then use these mental models of emotion transitions to predict others’ future emotions from currently observable emotions. To test this hypothesis, studies 1–3 used data from three extant experience-sampling datasets to establish the actual rates of emotional transitions. We then collected three parallel datasets in which participants rated the transition likelihoods between the same set of emotions. Participants’ ratings of emotion transitions predicted others’ experienced transitional likelihoods with high accuracy. Study 4 demonstrated that four conceptual dimensions of mental state representation—valence, social impact, rationality, and human mind—inform participants’ mental models. Study 5 used 2 million emotion reports on the Experience Project to replicate both of these findings: again people reported accurate models of emotion transitions, and these models were informed by the same four conceptual dimensions. Importantly, neither these conceptual dimensions nor holistic similarity could fully explain participants’ accuracy, suggesting that their mental models contain accurate information about emotion dynamics above and beyond what might be predicted by static emotion knowledge alone. PMID:28533373

  7. Return Predictability, Model Uncertainty, and Robust Investment

    DEFF Research Database (Denmark)

    Lukas, Manuel

    Stock return predictability is subject to great uncertainty. In this paper we use the model confidence set approach to quantify uncertainty about expected utility from investment, accounting for potential return predictability. For monthly US data and six representative return prediction models, we...... find that confidence sets are very wide, change significantly with the predictor variables, and frequently include expected utilities for which the investor prefers not to invest. The latter motivates a robust investment strategy maximizing the minimal element of the confidence set. The robust investor...... allocates a much lower share of wealth to stocks compared to a standard investor....

  8. Model predictive Controller for Mobile Robot

    OpenAIRE

    Alireza Rezaee

    2017-01-01

    This paper proposes a Model Predictive Controller (MPC) for control of a P2AT mobile robot. MPC refers to a group of controllers that employ an explicit model of the process to predict its future behavior over an extended prediction horizon. The design of an MPC is formulated as an optimal control problem. This problem is then cast as a linear quadratic regulator (LQR) problem and solved using the Riccati equation. To show the effectiveness of the proposed method this controller is...

  9. Spatial Economics Model Predicting Transport Volume

    Directory of Open Access Journals (Sweden)

    Lu Bo

    2016-10-01

    Full Text Available It is extremely important to predict logistics requirements in a scientific and rational way. However, in recent years, improvements to prediction methods have not been very significant, and the traditional statistical prediction method suffers from low precision and poor interpretability, so that it can neither theoretically guarantee the generalization ability of the prediction model nor explain the models effectively. Therefore, in combination with the theories of spatial economics, industrial economics, and neo-classical economics, and taking the city of Zhuanghe as the research object, the study identifies the leading industries that can produce a large number of cargoes, and further predicts the static logistics generation of Zhuanghe and its hinterlands. By integrating various factors that can affect the regional logistics requirements, this study established a logistics requirements potential model based on spatial economic principles, and expanded the approach to logistics requirements prediction from purely statistical principles to a new area of spatial and regional economics.

  10. Accuracy assessment of landslide prediction models

    International Nuclear Information System (INIS)

    Othman, A N; Mohd, W M N W; Noraini, S

    2014-01-01

    The increasing population and expansion of settlements over hilly areas has greatly increased the impact of natural disasters such as landslides. Therefore, it is important to develop models which could accurately predict landslide hazard zones. Over the years, various techniques and models have been developed to predict landslide hazard zones. The aim of this paper is to assess the accuracy of landslide prediction models developed by the authors. The methodology involved the selection of the study area, data acquisition, data processing, model development and data analysis. The development of these models is based on nine different landslide-inducing parameters, i.e. slope, land use, lithology, soil properties, geomorphology, flow accumulation, aspect, proximity to river and proximity to road. Rank sum, rating, pairwise comparison and AHP techniques are used to determine the weights for each of the parameters used. Four (4) different models which consider different parameter combinations are developed by the authors. Results obtained are compared to landslide history, and the accuracies for Model 1, Model 2, Model 3 and Model 4 are 66.7%, 66.7%, 60% and 22.9% respectively. From the results, rank sum, rating and pairwise comparison can be useful techniques to predict landslide hazard zones
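
    The pairwise-comparison (AHP) weighting step mentioned above can be sketched as extracting the principal eigenvector of a reciprocal comparison matrix; the 3x3 matrix below is a toy example covering only three of the nine parameters, with invented judgements.

```python
# Sketch: AHP-style weights for landslide-inducing parameters from a pairwise
# comparison matrix (toy 3x3 example: slope, land use, lithology).
import numpy as np

# A[i, j] = how much more important factor i is than factor j (reciprocal matrix).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
principal = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, principal].real)
weights /= weights.sum()

# Consistency ratio: CI / RI, with RI = 0.58 for a 3x3 matrix (Saaty's table).
ci = (eigvals.real[principal] - len(A)) / (len(A) - 1)
print("weights (slope, land use, lithology):", np.round(weights, 3))
print("consistency ratio:", round(ci / 0.58, 3))
```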

  11. Trophic interaction modifications: an empirical and theoretical framework.

    Science.gov (United States)

    Terry, J Christopher D; Morris, Rebecca J; Bonsall, Michael B

    2017-10-01

    Consumer-resource interactions are often influenced by other species in the community. At present these 'trophic interaction modifications' are rarely included in ecological models despite demonstrations that they can drive system dynamics. Here, we advocate and extend an approach that has the potential to unite and represent this key group of non-trophic interactions by emphasising the change to trophic interactions induced by modifying species. We highlight the opportunities this approach brings in comparison to frameworks that coerce trophic interaction modifications into pairwise relationships. To establish common frames of reference and explore the value of the approach, we set out a range of metrics for the 'strength' of an interaction modification which incorporate increasing levels of contextual information about the system. Through demonstrations in three-species model systems, we establish that these metrics capture complimentary aspects of interaction modifications. We show how the approach can be used in a range of empirical contexts; we identify as specific gaps in current understanding experiments with multiple levels of modifier species and the distributions of modifications in networks. The trophic interaction modification approach we propose can motivate and unite empirical and theoretical studies of system dynamics, providing a route to confront ecological complexity. © 2017 The Authors. Ecology Letters published by CNRS and John Wiley & Sons Ltd.

  12. From neurons to epidemics: How trophic coherence affects spreading processes

    Science.gov (United States)

    Klaise, Janis; Johnson, Samuel

    2016-06-01

    Trophic coherence, a measure of the extent to which the nodes of a directed network are organised in levels, has recently been shown to be closely related to many structural and dynamical aspects of complex systems, including graph eigenspectra, the prevalence or absence of feedback cycles, and linear stability. Furthermore, non-trivial trophic structures have been observed in networks of neurons, species, genes, metabolites, cellular signalling, concatenated words, P2P users, and world trade. Here, we consider two simple yet apparently quite different dynamical models—one a susceptible-infected-susceptible epidemic model adapted to include complex contagion and the other an Amari-Hopfield neural network—and show that in both cases the related spreading processes are modulated in similar ways by the trophic coherence of the underlying networks. To do this, we propose a network assembly model which can generate structures with tunable trophic coherence, whose limiting cases are perfectly stratified networks and random graphs. We find that trophic coherence can exert a qualitative change in spreading behaviour, determining whether a pulse of activity will percolate through the entire network or remain confined to a subset of nodes, and whether such activity will quickly die out or endure indefinitely. These results could be important for our understanding of phenomena such as epidemics, rumours, shocks to ecosystems, neuronal avalanches, and many other spreading processes.
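
    A minimal sketch of how trophic coherence can be quantified for a directed network, assuming the common definition in which each node's trophic level is one plus the mean level of its in-neighbours and the incoherence parameter q is the standard deviation of the trophic distances over all edges. The toy graph is invented for illustration and is not from the record.

        import networkx as nx
        import numpy as np

        def trophic_coherence(G):
            """Trophic levels and incoherence parameter q of a directed graph.
            Basal nodes (in-degree 0) get level 1; every other node's level is
            1 plus the mean level of its in-neighbours (its 'prey')."""
            nodes = list(G.nodes())
            idx = {n: i for i, n in enumerate(nodes)}
            A = np.zeros((len(nodes), len(nodes)))
            b = np.ones(len(nodes))
            for i, n in enumerate(nodes):
                preds = list(G.predecessors(n))
                A[i, i] = 1.0
                for p in preds:
                    A[i, idx[p]] -= 1.0 / len(preds)
            levels = np.linalg.solve(A, b)
            # Trophic distance over every edge u -> v; its mean is 1 by construction,
            # and q is its standard deviation (q = 0 for a perfectly stratified network).
            x = np.array([levels[idx[v]] - levels[idx[u]] for u, v in G.edges()])
            q = x.std()
            return dict(zip(nodes, levels)), q

        # Toy example: a mostly coherent three-level chain plus one level-skipping edge.
        G = nx.DiGraph([(1, 2), (1, 3), (2, 4), (3, 4), (1, 4)])
        levels, q = trophic_coherence(G)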

  13. Predictive validation of an influenza spread model.

    Directory of Open Access Journals (Sweden)

    Ayaz Hyder

    Full Text Available BACKGROUND: Modeling plays a critical role in mitigating impacts of seasonal influenza epidemics. Complex simulation models are currently at the forefront of evaluating optimal mitigation strategies at multiple scales and levels of organization. Given their evaluative role, these models remain limited in their ability to predict and forecast future epidemics, leading some researchers and public-health practitioners to question their usefulness. The objective of this study is to evaluate the predictive ability of an existing complex simulation model of influenza spread. METHODS AND FINDINGS: We used extensive data on past epidemics to demonstrate the process of predictive validation. This involved generalizing an individual-based model for influenza spread and fitting it to laboratory-confirmed influenza infection data from a single observed epidemic (1998-1999). Next, we used the fitted model and modified two of its parameters based on data on real-world perturbations (vaccination coverage by age group and strain type). Simulating epidemics under these changes allowed us to estimate the deviation/error between the expected epidemic curve under perturbation and observed epidemics taking place from 1999 to 2006. Our model was able to forecast absolute intensity and epidemic peak week several weeks earlier with reasonable reliability that depended on the method of forecasting (static or dynamic). CONCLUSIONS: Good predictive ability of influenza epidemics is critical for implementing mitigation strategies in an effective and timely manner. Through the process of predictive validation applied to a current complex simulation model of influenza spread, we provided users of the model (e.g. public-health officials and policy-makers) with quantitative metrics and practical recommendations on mitigating impacts of seasonal influenza epidemics. This methodology may be applied to other models of communicable infectious diseases to test and potentially improve their predictive ability.

  14. Predictive Validation of an Influenza Spread Model

    Science.gov (United States)

    Hyder, Ayaz; Buckeridge, David L.; Leung, Brian

    2013-01-01

    Background Modeling plays a critical role in mitigating impacts of seasonal influenza epidemics. Complex simulation models are currently at the forefront of evaluating optimal mitigation strategies at multiple scales and levels of organization. Given their evaluative role, these models remain limited in their ability to predict and forecast future epidemics, leading some researchers and public-health practitioners to question their usefulness. The objective of this study is to evaluate the predictive ability of an existing complex simulation model of influenza spread. Methods and Findings We used extensive data on past epidemics to demonstrate the process of predictive validation. This involved generalizing an individual-based model for influenza spread and fitting it to laboratory-confirmed influenza infection data from a single observed epidemic (1998–1999). Next, we used the fitted model and modified two of its parameters based on data on real-world perturbations (vaccination coverage by age group and strain type). Simulating epidemics under these changes allowed us to estimate the deviation/error between the expected epidemic curve under perturbation and observed epidemics taking place from 1999 to 2006. Our model was able to forecast absolute intensity and epidemic peak week several weeks earlier with reasonable reliability that depended on the method of forecasting (static or dynamic). Conclusions Good predictive ability of influenza epidemics is critical for implementing mitigation strategies in an effective and timely manner. Through the process of predictive validation applied to a current complex simulation model of influenza spread, we provided users of the model (e.g. public-health officials and policy-makers) with quantitative metrics and practical recommendations on mitigating impacts of seasonal influenza epidemics. This methodology may be applied to other models of communicable infectious diseases to test and potentially improve their predictive ability.

  15. The trophic responses of two different rodent-vector-plague systems to climate change.

    Science.gov (United States)

    Xu, Lei; Schmid, Boris V; Liu, Jun; Si, Xiaoyan; Stenseth, Nils Chr; Zhang, Zhibin

    2015-02-07

    Plague, the causative agent of three devastating pandemics in history, is currently a re-emerging disease, probably due to climate change and other anthropogenic changes. Without understanding the response of plague systems to anthropogenic or climate changes in their trophic web, it is unfeasible to effectively predict years with high risks of plague outbreak, hampering our ability for effective prevention and control of the disease. Here, by using surveillance data, we apply structural equation modelling to reveal the drivers of plague prevalence in two very different rodent systems: those of the solitary Daurian ground squirrel and the social Mongolian gerbil. We show that plague prevalence in the Daurian ground squirrel is not detectably related to its trophic web, and that therefore surveillance efforts should focus on detecting plague directly in this ecosystem. On the other hand, plague in the Mongolian gerbil is strongly embedded in a complex, yet understandable trophic web of climate, vegetation, and rodent and flea densities, making the ecosystem suitable for more sophisticated low-cost surveillance practices, such as remote sensing. As for the trophic webs of the two rodent species, we find that increased vegetation is positively associated with higher temperatures and precipitation for both ecosystems. We furthermore find a positive association between vegetation and ground squirrel density, yet a negative association between vegetation and gerbil density. Our study thus shows how past surveillance records can be used to design and improve existing plague prevention and control measures, by tailoring them to individual plague foci. Such measures are indeed highly needed under present conditions with prevailing climate change. © 2014 The Author(s) Published by the Royal Society. All rights reserved.

  16. The trophic responses of two different rodent–vector–plague systems to climate change

    Science.gov (United States)

    Xu, Lei; Schmid, Boris V.; Liu, Jun; Si, Xiaoyan; Stenseth, Nils Chr.; Zhang, Zhibin

    2015-01-01

    Plague, the causative agent of three devastating pandemics in history, is currently a re-emerging disease, probably due to climate change and other anthropogenic changes. Without understanding the response of plague systems to anthropogenic or climate changes in their trophic web, it is unfeasible to effectively predict years with high risks of plague outbreak, hampering our ability for effective prevention and control of the disease. Here, by using surveillance data, we apply structural equation modelling to reveal the drivers of plague prevalence in two very different rodent systems: those of the solitary Daurian ground squirrel and the social Mongolian gerbil. We show that plague prevalence in the Daurian ground squirrel is not detectably related to its trophic web, and that therefore surveillance efforts should focus on detecting plague directly in this ecosystem. On the other hand, plague in the Mongolian gerbil is strongly embedded in a complex, yet understandable trophic web of climate, vegetation, and rodent and flea densities, making the ecosystem suitable for more sophisticated low-cost surveillance practices, such as remote sensing. As for the trophic webs of the two rodent species, we find that increased vegetation is positively associated with higher temperatures and precipitation for both ecosystems. We furthermore find a positive association between vegetation and ground squirrel density, yet a negative association between vegetation and gerbil density. Our study thus shows how past surveillance records can be used to design and improve existing plague prevention and control measures, by tailoring them to individual plague foci. Such measures are indeed highly needed under present conditions with prevailing climate change. PMID:25540277

  17. Enhanced understanding of ectoparasite: host trophic linkages on coral reefs through stable isotope analysis

    Science.gov (United States)

    Demopoulos, Amanda W. J.; Sikkel, Paul C.

    2015-01-01

    Parasitism, although the most common type of ecological interaction, is usually ignored in food web models and studies of trophic connectivity. Stable isotope analysis is widely used in assessing the flow of energy in ecological communities and thus is a potentially valuable tool in understanding the cryptic trophic relationships mediated by parasites. In an effort to assess the utility of stable isotope analysis in understanding the role of parasites in complex coral-reef trophic systems, we performed stable isotope analysis on three common Caribbean reef fish hosts and two kinds of ectoparasitic isopods: temporarily parasitic gnathiids (Gnathia marleyi) and permanently parasitic cymothoids (Anilocra). To further track the transfer of fish-derived carbon (energy) from parasites to parasite consumers, gnathiids from host fish were also fed to captive Pederson shrimp (Ancylomenes pedersoni) for at least 1 month. Parasitic isopods had δ13C and δ15N values similar to their host, comparable with results from the small number of other host–parasite studies that have employed stable isotopes. Adult gnathiids were enriched in 15N and depleted in 13C relative to juvenile gnathiids, providing insights into the potential isotopic fractionation associated with blood-meal assimilation and subsequent metamorphosis. Gnathiid-fed Pederson shrimp also had δ13C values consistent with their food source and were enriched in 15N, as predicted due to trophic fractionation. These results further indicate that stable isotopes can be an effective tool in deciphering cryptic feeding relationships involving parasites and their consumers, and the role of parasites and cleaners in carbon transfer in coral-reef ecosystems specifically.

  18. Enhanced understanding of ectoparasite–host trophic linkages on coral reefs through stable isotope analysis

    Directory of Open Access Journals (Sweden)

    Amanda W.J. Demopoulos

    2015-04-01

    Full Text Available Parasitism, although the most common type of ecological interaction, is usually ignored in food web models and studies of trophic connectivity. Stable isotope analysis is widely used in assessing the flow of energy in ecological communities and thus is a potentially valuable tool in understanding the cryptic trophic relationships mediated by parasites. In an effort to assess the utility of stable isotope analysis in understanding the role of parasites in complex coral-reef trophic systems, we performed stable isotope analysis on three common Caribbean reef fish hosts and two kinds of ectoparasitic isopods: temporarily parasitic gnathiids (Gnathia marleyi) and permanently parasitic cymothoids (Anilocra). To further track the transfer of fish-derived carbon (energy) from parasites to parasite consumers, gnathiids from host fish were also fed to captive Pederson shrimp (Ancylomenes pedersoni) for at least 1 month. Parasitic isopods had δ13C and δ15N values similar to their host, comparable with results from the small number of other host–parasite studies that have employed stable isotopes. Adult gnathiids were enriched in 15N and depleted in 13C relative to juvenile gnathiids, providing insights into the potential isotopic fractionation associated with blood-meal assimilation and subsequent metamorphosis. Gnathiid-fed Pederson shrimp also had δ13C values consistent with their food source and were enriched in 15N, as predicted due to trophic fractionation. These results further indicate that stable isotopes can be an effective tool in deciphering cryptic feeding relationships involving parasites and their consumers, and the role of parasites and cleaners in carbon transfer in coral-reef ecosystems specifically.

  19. Finding Furfural Hydrogenation Catalysts via Predictive Modelling.

    Science.gov (United States)

    Strassberger, Zea; Mooijman, Maurice; Ruijter, Eelco; Alberts, Albert H; Maldonado, Ana G; Orru, Romano V A; Rothenberg, Gadi

    2010-09-10

    We combine multicomponent reactions, catalytic performance studies and predictive modelling to find transfer hydrogenation catalysts. An initial set of 18 ruthenium-carbene complexes were synthesized and screened in the transfer hydrogenation of furfural to furfurol with isopropyl alcohol. The complexes gave varied yields, from 62% up to >99.9%, with no obvious structure/activity correlations. Control experiments proved that the carbene ligand remains coordinated to the ruthenium centre throughout the reaction. Deuterium-labelling studies showed a secondary isotope effect (k(H):k(D)=1.5). Further mechanistic studies showed that this transfer hydrogenation follows the so-called monohydride pathway. Using these data, we built a predictive model for 13 of the catalysts, based on 2D and 3D molecular descriptors. We tested and validated the model using the remaining five catalysts (cross-validation, R(2)=0.913). Then, with this model, the conversion and selectivity were predicted for four completely new ruthenium-carbene complexes. These four catalysts were then synthesized and tested. The results were within 3% of the model's predictions, demonstrating the validity and value of predictive modelling in catalyst optimization.
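
    A hedged sketch of descriptor-based catalyst modelling in the spirit described above: a regression model is trained on a small set of catalysts described by numeric molecular descriptors, cross-validated, and then used to predict the performance of unseen complexes. The descriptor values, yields and the choice of ridge regression are placeholders, not the authors' actual model.

        import numpy as np
        from sklearn.linear_model import Ridge
        from sklearn.metrics import r2_score
        from sklearn.model_selection import cross_val_predict

        # Placeholder data: 13 training catalysts x 6 molecular descriptors, with observed yields (%).
        X_train = np.random.rand(13, 6)
        y_train = np.random.uniform(62, 100, 13)

        model = Ridge(alpha=1.0)
        y_cv = cross_val_predict(model, X_train, y_train, cv=5)    # cross-validated predictions
        print("cross-validated R^2:", r2_score(y_train, y_cv))

        # Fit on all training catalysts, then predict the yields of new, untested complexes
        # from their descriptors alone (the paper verifies such predictions experimentally).
        model.fit(X_train, y_train)
        X_new = np.random.rand(4, 6)
        predicted_yields = model.predict(X_new)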

  20. Corporate prediction models, ratios or regression analysis?

    NARCIS (Netherlands)

    Bijnen, E.J.; Wijn, M.F.C.M.

    1994-01-01

    The models developed in the literature with respect to the prediction of a company's failure are based on ratios. It has been shown before that these models should be rejected on theoretical grounds. Our study of industrial companies in the Netherlands shows that the ratios which are used in

  1. Predicting Protein Secondary Structure with Markov Models

    DEFF Research Database (Denmark)

    Fischer, Paul; Larsen, Simon; Thomsen, Claus

    2004-01-01

    The problem we are considering here is to predict the secondary structure from the primary one. To this end we train a Markov model on training data and then use it to classify parts of unknown protein sequences as sheets, helices or coils. We show how to exploit the directional information contained...... in the Markov model for this task. Classifications that are purely based on statistical models might not always be biologically meaningful. We present combinatorial methods to incorporate biological background knowledge to enhance the prediction performance....
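
    A minimal sketch of the kind of per-class Markov chain the record describes: first-order amino-acid transition probabilities are estimated separately for helix, sheet and coil segments, and an unknown window is assigned to the class whose chain gives it the highest likelihood. The toy sequence and labels are invented, and the authors' directional and combinatorial refinements are not reproduced.

        import numpy as np

        AMINO = "ACDEFGHIKLMNPQRSTVWY"
        STATES = ("H", "E", "C")   # helix, sheet, coil

        def train(sequences, labels):
            """Estimate per-class first-order transition probabilities P(a_t | a_{t-1})."""
            aa = {a: i for i, a in enumerate(AMINO)}
            counts = {s: np.ones((20, 20)) for s in STATES}      # Laplace smoothing
            for seq, lab in zip(sequences, labels):
                for (a, b), s in zip(zip(seq, seq[1:]), lab[1:]):
                    counts[s][aa[a], aa[b]] += 1
            return {s: c / c.sum(axis=1, keepdims=True) for s, c in counts.items()}

        def classify_window(window, trans):
            """Assign the class whose Markov chain gives the window the highest log-likelihood."""
            aa = {a: i for i, a in enumerate(AMINO)}
            scores = {s: sum(np.log(P[aa[a], aa[b]]) for a, b in zip(window, window[1:]))
                      for s, P in trans.items()}
            return max(scores, key=scores.get)

        trans = train(["MKTAYIAKQR"], ["CCHHHHHHCC"])            # toy training data
        print(classify_window("TAYIA", trans))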

  2. Energy based prediction models for building acoustics

    DEFF Research Database (Denmark)

    Brunskog, Jonas

    2012-01-01

    In order to reach robust and simplified yet accurate prediction models, energy-based principles are commonly used in many fields of acoustics, especially in building acoustics. This includes simple energy flow models, the framework of statistical energy analysis (SEA) as well as more elaborated...... principles as, e.g., wave intensity analysis (WIA). The European standards for building acoustic predictions, the EN 12354 series, are based on energy flow and SEA principles. In the present paper, different energy-based prediction models are discussed and critically reviewed. Special attention is placed...... on underlying basic assumptions, such as diffuse fields, high modal overlap, resonant field being dominant, etc., and the consequences of these in terms of limitations in the theory and in the practical use of the models....

  3. Comparative Study of Bankruptcy Prediction Models

    Directory of Open Access Journals (Sweden)

    Isye Arieshanti

    2013-09-01

    Full Text Available Early indication of bankruptcy is important for a company. If companies are aware of the potential for bankruptcy, they can take preventive action to anticipate it. In order to detect this potential, a company can utilize a bankruptcy prediction model. The prediction model can be built using machine learning methods. However, the choice of machine learning method should be made carefully, because the suitability of a model depends on the specific problem. Therefore, in this paper we perform a comparative study of several machine learning methods for bankruptcy prediction. The comparison of several models based on machine learning methods (k-NN, fuzzy k-NN, SVM, Bagging Nearest Neighbour SVM, Multilayer Perceptron (MLP), and a hybrid of MLP + Multiple Linear Regression) shows that the fuzzy k-NN method achieves the best performance, with an accuracy of 77.5%.

  4. Prediction Models for Dynamic Demand Response

    Energy Technology Data Exchange (ETDEWEB)

    Aman, Saima; Frincu, Marc; Chelmis, Charalampos; Noor, Muhammad; Simmhan, Yogesh; Prasanna, Viktor K.

    2015-11-02

    As Smart Grids move closer to dynamic curtailment programs, Demand Response (DR) events will become necessary not only on fixed time intervals and weekdays predetermined by static policies, but also during changing decision periods and weekends to react to real-time demand signals. Unique challenges arise in this context vis-a-vis demand prediction and curtailment estimation and the transformation of such tasks into an automated, efficient dynamic demand response (D2R) process. While existing work has concentrated on increasing the accuracy of prediction models for DR, there is a lack of studies for prediction models for D2R, which we address in this paper. Our first contribution is the formal definition of D2R, and the description of its challenges and requirements. Our second contribution is a feasibility analysis of very-short-term prediction of electricity consumption for D2R over a diverse, large-scale dataset that includes both small residential customers and large buildings. Our third, and major, contribution is a set of insights into the predictability of electricity consumption in the context of D2R. Specifically, we focus on prediction models that can operate at a very small data granularity (here 15-min intervals), for both weekdays and weekends - all conditions that characterize scenarios for D2R. We find that short-term time series and simple averaging models used by Independent Service Operators and utilities achieve superior prediction accuracy. We also observe that workdays are more predictable than weekends and holidays. Also, smaller customers have large variation in consumption and are less predictable than larger buildings. Key implications of our findings are that better models are required for small customers and for non-workdays, both of which are critical for D2R. Also, prediction models require just a few days' worth of data, indicating that small amounts of historical data are sufficient.
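
    As a concrete example of the 'simple averaging' baselines the study finds effective, the sketch below predicts each 15-minute interval as the mean of the same interval on the previous few days of the same type (workday vs. non-workday). The column names and window length are assumptions for illustration, not details from the paper.

        import pandas as pd

        def average_baseline(consumption: pd.Series, window_days: int = 5) -> pd.Series:
            """Very-short-term baseline: predict each 15-minute interval as the mean of the
            same interval on the previous `window_days` days of the same day type.
            `consumption` is a 15-minute-resolution kWh series with a DatetimeIndex."""
            df = consumption.to_frame("kwh")
            df["slot"] = df.index.time                 # time-of-day slot (e.g. 14:15)
            df["workday"] = df.index.dayofweek < 5     # workday vs. weekend
            # Within each (day type, slot) group, average the previous window_days observations.
            return (df.groupby(["workday", "slot"])["kwh"]
                      .transform(lambda s: s.shift(1).rolling(window_days).mean()))

        # usage: forecast = average_baseline(building_load)
        #        mae = (forecast - building_load).abs().mean()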

  5. Intersexual Trophic Niche Partitioning in an Ant-Eating spider (Araneae: Zodariidae)

    DEFF Research Database (Denmark)

    Pekár, Stanislav; Martisová, Martina; Bilde, T.

    2011-01-01

    lead to higher energy demands in females driven by fecundity selection, while males invest in mate searching. We tested predictions of the two hypotheses underlying intersexual trophic niche partitioning in a natural population of spiders. Zodarion jozefienae spiders specialize on Messor barbarus ants...... that are polymorphic in body size and hence comprise potential trophic niches for the spider, making this system well-suited to study intersexual trophic niche partitioning. Methodology/Principal Findings Comparative analysis of trophic morphology (the chelicerae) and body size of males, females and juveniles...... demonstrated highly female biased SSD (Sexual Size Dimorphism) in body size, body weight, and in the size of chelicerae, the latter arising from sex-specific growth patterns in trophic morphology. In the field, female spiders actively selected ant sub-castes that were larger than the average prey size...

  6. Evaluation of CASP8 model quality predictions

    KAUST Repository

    Cozzetto, Domenico

    2009-01-01

    The model quality assessment problem consists in the a priori estimation of the overall and per-residue accuracy of protein structure predictions. Over the past years, a number of methods have been developed to address this issue and CASP established a prediction category to evaluate their performance in 2006. In 2008 the experiment was repeated and its results are reported here. Participants were invited to infer the correctness of the protein models submitted by the registered automatic servers. Estimates could apply to both whole models and individual amino acids. Groups involved in the tertiary structure prediction categories were also asked to assign local error estimates to each predicted residue in their own models and their results are also discussed here. The correlation between the predicted and observed correctness measures was the basis of the assessment of the results. We observe that consensus-based methods still perform significantly better than those accepting single models, similarly to what was concluded in the previous edition of the experiment. © 2009 WILEY-LISS, INC.

  7. Finding Furfural Hydrogenation Catalysts via Predictive Modelling

    Science.gov (United States)

    Strassberger, Zea; Mooijman, Maurice; Ruijter, Eelco; Alberts, Albert H; Maldonado, Ana G; Orru, Romano V A; Rothenberg, Gadi

    2010-01-01

    We combine multicomponent reactions, catalytic performance studies and predictive modelling to find transfer hydrogenation catalysts. An initial set of 18 ruthenium-carbene complexes were synthesized and screened in the transfer hydrogenation of furfural to furfurol with isopropyl alcohol. The complexes gave varied yields, from 62% up to >99.9%, with no obvious structure/activity correlations. Control experiments proved that the carbene ligand remains coordinated to the ruthenium centre throughout the reaction. Deuterium-labelling studies showed a secondary isotope effect (kH:kD=1.5). Further mechanistic studies showed that this transfer hydrogenation follows the so-called monohydride pathway. Using these data, we built a predictive model for 13 of the catalysts, based on 2D and 3D molecular descriptors. We tested and validated the model using the remaining five catalysts (cross-validation, R2=0.913). Then, with this model, the conversion and selectivity were predicted for four completely new ruthenium-carbene complexes. These four catalysts were then synthesized and tested. The results were within 3% of the model’s predictions, demonstrating the validity and value of predictive modelling in catalyst optimization. PMID:23193388

  8. Wind farm production prediction - The Zephyr model

    Energy Technology Data Exchange (ETDEWEB)

    Landberg, L. [Risoe National Lab., Wind Energy Dept., Roskilde (Denmark); Giebel, G. [Risoe National Lab., Wind Energy Dept., Roskilde (Denmark); Madsen, H. [IMM (DTU), Kgs. Lyngby (Denmark); Nielsen, T.S. [IMM (DTU), Kgs. Lyngby (Denmark); Joergensen, J.U. [Danish Meteorologisk Inst., Copenhagen (Denmark); Lauersen, L. [Danish Meteorologisk Inst., Copenhagen (Denmark); Toefting, J. [Elsam, Fredericia (DK); Christensen, H.S. [Eltra, Fredericia (Denmark); Bjerge, C. [SEAS, Haslev (Denmark)

    2002-06-01

    This report describes a project - funded by the Danish Ministry of Energy and the Environment - which developed a next generation prediction system called Zephyr. The Zephyr system is a merging between two state-of-the-art prediction systems: Prediktor of Risoe National Laboratory and WPPT of IMM at the Danish Technical University. The numerical weather predictions were generated by DMI's HIRLAM model. Due to technical difficulties programming the system, only the computational core and a very simple version of the originally very complex system were developed. The project partners were: Risoe, DMU, DMI, Elsam, Eltra, Elkraft System, SEAS and E2. (au)

  9. Model predictive controller design of hydrocracker reactors

    OpenAIRE

    GÖKÇE, Dila

    2011-01-01

    This study summarizes the design of a Model Predictive Controller (MPC) for the Hydrocracker Unit Reactors at the Tüpraş İzmit Refinery. The hydrocracking process, in which heavy vacuum gas oil is converted into lighter, more valuable products at high temperature and pressure, is described briefly. The controller design, identification and modeling studies are examined and the model variables are presented. WABT (Weighted Average Bed Temperature) equalization and conversion increase are simulated...

  10. Multi-Model Ensemble Wake Vortex Prediction

    Science.gov (United States)

    Koerner, Stephan; Holzaepfel, Frank; Ahmad, Nash'at N.

    2015-01-01

    Several multi-model ensemble methods are investigated for predicting wake vortex transport and decay. This study is a joint effort between National Aeronautics and Space Administration and Deutsches Zentrum fuer Luft- und Raumfahrt to develop a multi-model ensemble capability using their wake models. An overview of different multi-model ensemble methods and their feasibility for wake applications is presented. The methods include Reliability Ensemble Averaging, Bayesian Model Averaging, and Monte Carlo Simulations. The methodologies are evaluated using data from wake vortex field experiments.
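
    An illustrative sketch of the simplest of the ensemble ideas mentioned above: weighting member models by the inverse of their historical error and combining their forecasts. Real Reliability Ensemble Averaging and Bayesian Model Averaging involve more machinery (convergence criteria, posterior model probabilities), and the data structures here are assumptions, not the study's interfaces.

        import numpy as np

        def reliability_weights(predictions, observations):
            """Weight each wake-vortex model by the inverse of its historical RMS error.
            `predictions`: dict model_name -> array of past predicted values,
            `observations`: array of the corresponding measured values."""
            raw = {}
            for name, pred in predictions.items():
                rmse = np.sqrt(np.mean((np.asarray(pred) - np.asarray(observations)) ** 2))
                raw[name] = 1.0 / max(rmse, 1e-9)
            total = sum(raw.values())
            return {name: w / total for name, w in raw.items()}

        def ensemble_forecast(new_predictions, weights):
            """Weighted multi-model ensemble forecast for one new case."""
            return sum(weights[name] * value for name, value in new_predictions.items())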

  11. Risk terrain modeling predicts child maltreatment.

    Science.gov (United States)

    Daley, Dyann; Bachmann, Michael; Bachmann, Brittany A; Pedigo, Christian; Bui, Minh-Thuy; Coffman, Jamye

    2016-12-01

    As indicated by research on the long-term effects of adverse childhood experiences (ACEs), maltreatment has far-reaching consequences for affected children. Effective prevention measures have been elusive, partly due to difficulty in identifying vulnerable children before they are harmed. This study employs Risk Terrain Modeling (RTM), an analysis of the cumulative effect of environmental factors thought to be conducive for child maltreatment, to create a highly accurate prediction model for future substantiated child maltreatment cases in the City of Fort Worth, Texas. The model is superior to commonly used hotspot predictions and more beneficial in aiding prevention efforts in a number of ways: 1) it identifies the highest risk areas for future instances of child maltreatment with improved precision and accuracy; 2) it aids the prioritization of risk-mitigating efforts by informing about the relative importance of the most significant contributing risk factors; 3) since predictions are modeled as a function of easily obtainable data, practitioners do not have to undergo the difficult process of obtaining official child maltreatment data to apply it; 4) the inclusion of a multitude of environmental risk factors creates a more robust model with higher predictive validity; and, 5) the model does not rely on a retrospective examination of past instances of child maltreatment, but adapts predictions to changing environmental conditions. The present study introduces and examines the predictive power of this new tool to aid prevention efforts seeking to improve the safety, health, and wellbeing of vulnerable children. Copyright © 2016 The Authors. Published by Elsevier Ltd.. All rights reserved.

  12. PREDICTIVE CAPACITY OF ARCH FAMILY MODELS

    Directory of Open Access Journals (Sweden)

    Raphael Silveira Amaro

    2016-03-01

    Full Text Available In recent decades, a remarkable number of models, variants of the Autoregressive Conditional Heteroscedastic family, have been developed and empirically tested, making the process of choosing a particular model extremely complex. This research aims to compare, using the Model Confidence Set procedure, the predictive capacity of five conditional heteroskedasticity models, considering eight different statistical probability distributions. The financial series used are the log-return series of the Bovespa index and the Dow Jones Industrial Index in the period between 27 October 2008 and 30 December 2014. The empirical evidence showed that, in general, the competing models are highly homogeneous in their predictions, whether for the stock market of a developed country or for that of a developing country. An equivalent result can be inferred for the statistical probability distributions that were used.

  13. Alcator C-Mod predictive modeling

    International Nuclear Information System (INIS)

    Pankin, Alexei; Bateman, Glenn; Kritz, Arnold; Greenwald, Martin; Snipes, Joseph; Fredian, Thomas

    2001-01-01

    Predictive simulations for the Alcator C-mod tokamak [I. Hutchinson et al., Phys. Plasmas 1, 1511 (1994)] are carried out using the BALDUR integrated modeling code [C. E. Singer et al., Comput. Phys. Commun. 49, 275 (1988)]. The results are obtained for temperature and density profiles using the Multi-Mode transport model [G. Bateman et al., Phys. Plasmas 5, 1793 (1998)] as well as the mixed-Bohm/gyro-Bohm transport model [M. Erba et al., Plasma Phys. Controlled Fusion 39, 261 (1997)]. The simulated discharges are characterized by very high plasma density in both low and high modes of confinement. The predicted profiles for each of the transport models match the experimental data about equally well in spite of the fact that the two models have different dimensionless scalings. Average relative rms deviations are less than 8% for the electron density profiles and 16% for the electron and ion temperature profiles

  14. Analysis and prediction of pest dynamics in an agroforestry context using Tiko'n, a generic tool to develop food web models

    Science.gov (United States)

    Rojas, Marcela; Malard, Julien; Adamowski, Jan; Carrera, Jaime Luis; Maas, Raúl

    2017-04-01

    While it is known that climate change will impact future plant-pest population dynamics, potentially affecting crop damage, agroforestry with its enhanced biodiversity is said to reduce the outbreaks of pest insects by providing natural enemies for the control of pest populations. This premise is known in the literature as the natural enemy hypothesis and has been widely studied qualitatively. However, disagreement still exists on whether biodiversity enhancement reduces pest outbreaks, showing the need of quantitatively understanding the mechanisms behind the interactions between pests and natural enemies, also known as trophic interactions. Crop pest models that study insect population dynamics in agroforestry contexts are very rare, and pest models that take trophic interactions into account are even rarer. This may be due to the difficulty of representing complex food webs in a quantifiable model. There is therefore a need for validated food web models that allow users to predict the response of these webs to changes in climate in agroforestry systems. In this study we present Tiko'n, a Python-based software whose API allows users to rapidly build and validate trophic web models; the program uses a Bayesian inference approach to calibrate the models according to field data, allowing for the reuse of literature data from various sources and reducing the need for extensive field data collection. Tiko'n was run using coffee leaf miner (Leucoptera coffeella) and associated parasitoid data from a shaded coffee plantation, showing the mechanisms of insect population dynamics within a tri-trophic food web in an agroforestry system.

  15. Modelling the predictive performance of credit scoring

    Directory of Open Access Journals (Sweden)

    Shi-Wei Shen

    2013-07-01

    Research purpose: The purpose of this empirical paper was to examine the predictive performance of credit scoring systems in Taiwan. Motivation for the study: Corporate lending remains a major business line for financial institutions. However, in light of the recent global financial crises, it has become extremely important for financial institutions to implement rigorous means of assessing clients seeking access to credit facilities. Research design, approach and method: Using a data sample of 10 349 observations drawn between 1992 and 2010, logistic regression models were utilised to examine the predictive performance of credit scoring systems. Main findings: A test of goodness of fit demonstrated that credit scoring models that incorporated the Taiwan Corporate Credit Risk Index (TCRI), micro- and macroeconomic variables possessed greater predictive power. This suggests that macroeconomic variables do have explanatory power for default credit risk. Practical/managerial implications: The originality of the study was that three models were developed to predict corporate firms’ defaults based on different microeconomic and macroeconomic factors such as the TCRI, asset growth rates, stock index and gross domestic product. Contribution/value-add: The study utilises different goodness-of-fit measures and receiver operating characteristics during the examination of the robustness of the predictive power of these factors.
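
    A hedged sketch of a logistic-regression scoring model of the kind examined in the study, with hypothetical TCRI, microeconomic and macroeconomic covariates. The variable names, sample size and default rate are invented, and the hold-out AUC is only one possible check of predictive performance.

        import numpy as np
        import pandas as pd
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split

        # Hypothetical firm-year data: TCRI score plus micro- and macroeconomic covariates.
        df = pd.DataFrame({
            "tcri":         np.random.randint(1, 10, 1000),
            "asset_growth": np.random.normal(0.05, 0.2, 1000),
            "stock_index":  np.random.normal(8000, 1500, 1000),
            "gdp_growth":   np.random.normal(0.03, 0.02, 1000),
            "default":      np.random.binomial(1, 0.08, 1000),   # 1 = firm defaulted
        })

        X, y = df.drop(columns="default"), df["default"]
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)

        model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
        auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])   # discrimination on hold-out data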

  16. Comparison of two ordinal prediction models

    DEFF Research Database (Denmark)

    Kattan, Michael W; Gerds, Thomas A

    2015-01-01

    system (i.e. old or new), such as the level of evidence for one or more factors included in the system or the general opinions of expert clinicians. However, given the major objective of estimating prognosis on an ordinal scale, we argue that the rival staging system candidates should be compared...... on their ability to predict outcome. We sought to outline an algorithm that would compare two rival ordinal systems on their predictive ability. RESULTS: We devised an algorithm based largely on the concordance index, which is appropriate for comparing two models in their ability to rank observations. We...... demonstrate our algorithm with a prostate cancer staging system example. CONCLUSION: We have provided an algorithm for selecting the preferred staging system based on prognostic accuracy. It appears to be useful for the purpose of selecting between two ordinal prediction models....
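
    A small sketch of the concordance-index comparison such an algorithm is built on: for every pair of patients with different outcomes, check whether the staging system ranks them in the same order as the outcomes. Tie handling and the toy stage/outcome values are illustrative choices, and censoring (relevant for survival outcomes) is ignored here.

        from itertools import combinations

        def concordance_index(stage, outcome):
            """Fraction of usable patient pairs in which the staging system orders the
            patients the same way as their observed outcomes (higher stage = worse outcome).
            Ties in stage count as half-concordant, one common convention."""
            concordant, usable = 0.0, 0
            for (s1, o1), (s2, o2) in combinations(zip(stage, outcome), 2):
                if o1 == o2:
                    continue                       # pair carries no outcome information
                usable += 1
                if s1 == s2:
                    concordant += 0.5
                elif (s1 < s2) == (o1 < o2):
                    concordant += 1.0
            return concordant / usable

        # Toy comparison of two rival ordinal staging systems against observed outcomes.
        outcome   = [1, 2, 2, 3, 4, 4]
        old_stage = [1, 1, 2, 2, 3, 3]
        new_stage = [1, 2, 2, 3, 3, 4]
        print(concordance_index(old_stage, outcome), concordance_index(new_stage, outcome))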

  17. Trophic niche of squids: Insights from isotopic data in marine systems worldwide

    Science.gov (United States)

    Navarro, Joan; Coll, Marta; Somes, Christopher J.; Olson, Robert J.

    2013-10-01

    Cephalopods are an important prey resource for fishes, seabirds, and marine mammals, and are also voracious predators on crustaceans, fishes, squid and zooplankton. Because of their high feeding rates and abundance, squids have the potential to exert control on the recruitment of commercially important fishes. In this review, we synthesize the available information for two intrinsic markers (δ15N and δ13C isotopic values) in squids for all oceans and several types of ecosystems to obtain a global view of the trophic niches of squids in marine ecosystems. In particular, we aimed to examine whether the trophic positions and trophic widths of squid species vary among oceans and ecosystem types. To correctly compare across systems, we adjusted squid δ15N values for the isotopic variability of phytoplankton at the base of the food web provided by an ocean circulation-biogeochemistry-isotope model. Studies that focused on the trophic ecology of squids using isotopic techniques were few, and most of the information on squids was from studies on their predators. Our results showed that squids occupy a large range of trophic positions and exploit a large range of trophic resources, reflecting the versatility of their feeding behavior and confirming conclusions from food-web models. Clear differences in both trophic position and trophic width were found among oceans and ecosystem types. The study also reinforces the importance of considering the natural variation in isotopic values when comparing the isotopic values of consumers inhabiting different ecosystems.

  18. Predictive analytics can support the ACO model.

    Science.gov (United States)

    Bradley, Paul

    2012-04-01

    Predictive analytics can be used to rapidly spot hard-to-identify opportunities to better manage care--a key tool in accountable care. When considering analytics models, healthcare providers should: Make value-based care a priority and act on information from analytics models. Create a road map that includes achievable steps, rather than major endeavors. Set long-term expectations and recognize that the effectiveness of an analytics program takes time, unlike revenue cycle initiatives that may show a quick return.

  19. Predictive performance models and multiple task performance

    Science.gov (United States)

    Wickens, Christopher D.; Larish, Inge; Contorer, Aaron

    1989-01-01

    Five models that predict how performance of multiple tasks will interact in complex task scenarios are discussed. The models are shown in terms of the assumptions they make about human operator divided attention. The different assumptions about attention are then empirically validated in a multitask helicopter flight simulation. It is concluded from this simulation that the most important assumption relates to the coding of demand level of different component tasks.

  20. Model Predictive Control of Sewer Networks

    DEFF Research Database (Denmark)

    Pedersen, Einar B.; Herbertsson, Hannes R.; Niemann, Henrik

    2016-01-01

    The developments in solutions for management of urban drainage are of vital importance, as the amount of sewer water from urban areas continues to increase due to the increase of the world’s population and the change in the climate conditions. How a sewer network is structured, monitored and cont...... benchmark model. Due to the inherent constraints the applied approach is based on Model Predictive Control....

  1. Distributed Model Predictive Control via Dual Decomposition

    DEFF Research Database (Denmark)

    Biegel, Benjamin; Stoustrup, Jakob; Andersen, Palle

    2014-01-01

    This chapter presents dual decomposition as a means to coordinate a number of subsystems coupled by state and input constraints. Each subsystem is equipped with a local model predictive controller while a centralized entity manages the subsystems via prices associated with the coupling constraints...

  2. A stepwise model to predict monthly streamflow

    Science.gov (United States)

    Mahmood Al-Juboori, Anas; Guven, Aytac

    2016-12-01

    In this study, a stepwise model empowered with genetic programming is developed to predict the monthly flows of the Hurman River in Turkey and the Diyalah and Lesser Zab Rivers in Iraq. The model divides the monthly flow data into twelve intervals representing the number of months in a year. The flow of month t is considered a function of the antecedent month's flow (t - 1) and is predicted by multiplying the antecedent monthly flow by a constant value called K. The optimum value of K is obtained by a stepwise procedure which employs Gene Expression Programming (GEP) and Nonlinear Generalized Reduced Gradient Optimization (NGRGO) as an alternative to the traditional nonlinear regression technique. The coefficient of determination and the root mean squared error are used to evaluate the performance of the proposed models. The results of the proposed model are compared with the conventional Markovian and Auto Regressive Integrated Moving Average (ARIMA) models based on observed monthly flow data. The comparison results, based on five different statistical measures, show that the proposed stepwise model performed better than the Markovian and ARIMA models. The R2 values of the proposed model range between 0.81 and 0.92 for the three rivers in this study.
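
    A simplified sketch of the stepwise idea described above: the flow of month t is modelled as K times the antecedent month's flow, with one K per calendar month. Here K is found by ordinary least squares as a stand-in for the paper's GEP/NGRGO search, and the search bounds are arbitrary.

        import numpy as np
        from scipy.optimize import minimize_scalar

        def fit_monthly_k(flows, months):
            """For each calendar month m, find K_m minimising the squared error of the
            prediction Q_t = K_m * Q_{t-1}.
            `flows` is a 1-D array of monthly discharges, `months` the matching month numbers (1-12)."""
            K = {}
            for m in range(1, 13):
                idx = np.where(months == m)[0]
                idx = idx[idx > 0]                                  # need an antecedent flow
                q_prev, q_obs = flows[idx - 1], flows[idx]
                loss = lambda k: np.sum((k * q_prev - q_obs) ** 2)
                K[m] = minimize_scalar(loss, bounds=(0.0, 5.0), method="bounded").x
            return K

        def predict(flows, months, K):
            """One-step-ahead prediction of each month's flow from the antecedent month."""
            return np.array([K[m] * flows[i - 1] for i, m in enumerate(months) if i > 0])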

  3. The importance of predator–prey overlap: predicting North Sea cod recovery with a multispecies assessment model

    DEFF Research Database (Denmark)

    Kempf, Alexander; Dingsør, Gjert Endre; Huse, Geir

    2010-01-01

    The overlap between predator and prey is known as a sensitive parameter in multispecies assessment models for fish, and its parameterization is notoriously difficult. Overlap indices were derived from trawl surveys and used to parametrize the North Sea stochastic multispecies model. The effect...... of time-invariant and year- and quarter-specific overlap estimates on the historical (1991–2007) and predicted trophic interactions, as well as the development of predator and prey stocks, was investigated. The focus was set on a general comparison between single-species and multispecies forecasts...... and the sensitivity of the predicted development of North Sea cod for the two types of overlap implementation. The spatial–temporal overlap between cod and its predators increased with increasing temperature, indicating that foodweb processes might reduce the recovery potential of cod during warm periods...

  4. Electrostatic ion thrusters - towards predictive modeling

    Energy Technology Data Exchange (ETDEWEB)

    Kalentev, O.; Matyash, K.; Duras, J.; Lueskow, K.F.; Schneider, R. [Ernst-Moritz-Arndt Universitaet Greifswald, D-17489 (Germany); Koch, N. [Technische Hochschule Nuernberg Georg Simon Ohm, Kesslerplatz 12, D-90489 Nuernberg (Germany); Schirra, M. [Thales Electronic Systems GmbH, Soeflinger Strasse 100, D-89077 Ulm (Germany)

    2014-02-15

    The development of electrostatic ion thrusters so far has mainly been based on empirical and qualitative know-how, and on evolutionary iteration steps. This resulted in considerable effort regarding prototype design, construction and testing and therefore in significant development and qualification costs and high time demands. For future developments it is anticipated to implement simulation tools which allow for quantitative prediction of ion thruster performance, long-term behavior and space craft interaction prior to hardware design and construction. Based on integrated numerical models combining self-consistent kinetic plasma models with plasma-wall interaction modules a new quality in the description of electrostatic thrusters can be reached. These open the perspective for predictive modeling in this field. This paper reviews the application of a set of predictive numerical modeling tools on an ion thruster model of the HEMP-T (High Efficiency Multi-stage Plasma Thruster) type patented by Thales Electron Devices GmbH. (copyright 2014 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim) (orig.)

  5. An Intelligent Model for Stock Market Prediction

    Directory of Open Access Journals (Sweden)

    IbrahimM. Hamed

    2012-08-01

    Full Text Available This paper presents an intelligent model for stock market signal prediction using Multi-Layer Perceptron (MLP) Artificial Neural Networks (ANN). A blind source separation technique, from signal processing, is integrated with the learning phase of the constructed baseline MLP ANN to overcome the problems of prediction accuracy and lack of generalization. Kullback Leibler Divergence (KLD) is used as a learning algorithm because it converges fast and provides generalization in the learning mechanism. Both accuracy and efficiency of the proposed model were confirmed through Microsoft stock, from the Wall Street market, and various data sets from different sectors of the Egyptian stock market. In addition, sensitivity analysis was conducted on the various parameters of the model to ensure coverage of the generalization issue. Finally, statistical significance was examined using an ANOVA test.

  6. Predictive Models, How good are they?

    DEFF Research Database (Denmark)

    Kasch, Helge

    The WAD grading system has been used for more than 20 years by now. It has shown long-term viability, but with strengths and limitations. New bio-psychosocial assessment of the acute whiplash injured subject may provide better prediction of long-term disability and pain. Furthermore, the emerging......-up. It is important to obtain prospective identification of the relevant risk underreported disability could, if we were able to expose these hidden “risk-factors” during our consultations, provide us with better predictive models. New data from large clinical studies will present exciting new genetic risk markers...

  7. NONLINEAR MODEL PREDICTIVE CONTROL OF CHEMICAL PROCESSES

    Directory of Open Access Journals (Sweden)

    SILVA R. G.

    1999-01-01

    Full Text Available A new algorithm for model predictive control is presented. The algorithm utilizes a simultaneous solution and optimization strategy to solve the model's differential equations. The equations are discretized by equidistant collocation and, along with the algebraic model equations, are included as constraints in a nonlinear programming (NLP) problem. This algorithm is compared with the algorithm that uses orthogonal collocation on finite elements. The equidistant collocation algorithm results in simpler equations, providing a decrease in computation time for the control moves. Simulation results are presented and show a satisfactory performance of this algorithm.

  8. A statistical model for predicting muscle performance

    Science.gov (United States)

    Byerly, Diane Leslie De Caix

    The objective of these studies was to develop a capability for predicting muscle performance and fatigue to be utilized for both space- and ground-based applications. To develop this predictive model, healthy test subjects performed a defined, repetitive dynamic exercise to failure using a Lordex spinal machine. Throughout the exercise, surface electromyography (SEMG) data were collected from the erector spinae using a Mega Electronics ME3000 muscle tester and surface electrodes placed on both sides of the back muscle. These data were analyzed using a 5th order Autoregressive (AR) model and statistical regression analysis. It was determined that an AR derived parameter, the mean average magnitude of AR poles, significantly correlated with the maximum number of repetitions (designated Rmax) that a test subject was able to perform. Using the mean average magnitude of AR poles, a test subject's performance to failure could be predicted as early as the sixth repetition of the exercise. This predictive model has the potential to provide a basis for improving post-space flight recovery, monitoring muscle atrophy in astronauts and assessing the effectiveness of countermeasures, monitoring astronaut performance and fatigue during Extravehicular Activity (EVA) operations, providing pre-flight assessment of the ability of an EVA crewmember to perform a given task, improving the design of training protocols and simulations for strenuous International Space Station assembly EVA, and enabling EVA work task sequences to be planned enhancing astronaut performance and safety. Potential ground-based, medical applications of the predictive model include monitoring muscle deterioration and performance resulting from illness, establishing safety guidelines in the industry for repetitive tasks, monitoring the stages of rehabilitation for muscle-related injuries sustained in sports and accidents, and enhancing athletic performance through improved training protocols while reducing
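
    A minimal numpy sketch of the AR-pole parameter described above: a 5th-order AR model is fitted to one SEMG epoch via the Yule-Walker equations and the mean magnitude of its poles is returned. The epoch segmentation, preprocessing and the regression of this parameter against repetitions-to-failure are not reproduced here.

        import numpy as np

        def mean_ar_pole_magnitude(semg_epoch, order=5):
            """Fit an AR(order) model to one SEMG epoch by the Yule-Walker equations and
            return the mean magnitude of the AR poles (roots of the characteristic polynomial)."""
            x = np.asarray(semg_epoch, dtype=float)
            x = x - x.mean()
            # Biased autocovariance estimates r[0..order].
            r = np.array([np.dot(x[:len(x) - k], x[k:]) / len(x) for k in range(order + 1)])
            R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
            a = np.linalg.solve(R, r[1:])                        # AR coefficients a_1..a_p
            poles = np.roots(np.concatenate(([1.0], -a)))        # roots of z^p - a_1 z^(p-1) - ... - a_p
            return np.abs(poles).mean()

        # usage: one value per exercise repetition, tracked as fatigue develops
        # magnitudes = [mean_ar_pole_magnitude(epoch) for epoch in semg_epochs]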

  9. Time- and depth-wise trophic niche shifts in Antarctic benthos.

    Directory of Open Access Journals (Sweden)

    Edoardo Calizza

    Full Text Available Climate change is expected to affect resource-consumer interactions underlying stability in polar food webs. Polar benthic organisms have adapted to the marked seasonality characterising their habitats by concentrating foraging and reproductive activity in summer months, when inputs from sympagic and pelagic producers increase. While this enables the persistence of biodiverse food webs, the mechanisms underlying changes in resource use and nutrient transfer are poorly understood. Thus, our understanding of how temporal and spatial variations in the supply of resources may affect food web structure and functioning is limited. By means of C and N isotopic analyses of two key Antarctic benthic consumers (Adamussium colbecki, Bivalvia, and Sterechinus neumayeri, Echinoidea) and Bayesian mixing models, we describe changes in trophic niche and nutrient transfer across trophic levels associated with the long- and short-term diet and body size of specimens sampled in midsummer in both shallow and deep waters. Samplings occurred soon after the sea-ice broke up at Tethys Bay, an area characterised by extreme seasonality in sea-ice coverage and productivity in the Ross Sea. In the long term, the trophic niche was broader and variation between specimens was greater, with intermediate-size specimens generally consuming a higher number of resources than small and large specimens. The coupling of energy channels in the food web was consequently more direct than in the short term. Sediment and benthic algae were more frequently consumed in the long term, before the sea-ice broke up, while consumers specialised on sympagic algae and plankton in the short term. Regardless of the time scale, sympagic algae were more frequently consumed in shallow waters, while plankton was more frequently consumed in deep waters. Our results suggest a strong temporal relationship between resource availability and the trophic niche of benthic consumers in Antarctica. Potential

  10. Time- and depth-wise trophic niche shifts in Antarctic benthos.

    Science.gov (United States)

    Calizza, Edoardo; Careddu, Giulio; Sporta Caputi, Simona; Rossi, Loreto; Costantini, Maria Letizia

    2018-01-01

    Climate change is expected to affect resource-consumer interactions underlying stability in polar food webs. Polar benthic organisms have adapted to the marked seasonality characterising their habitats by concentrating foraging and reproductive activity in summer months, when inputs from sympagic and pelagic producers increase. While this enables the persistence of biodiverse food webs, the mechanisms underlying changes in resource use and nutrient transfer are poorly understood. Thus, our understanding of how temporal and spatial variations in the supply of resources may affect food web structure and functioning is limited. By means of C and N isotopic analyses of two key Antarctic benthic consumers (Adamussium colbecki, Bivalvia, and Sterechinus neumayeri, Echinoidea) and Bayesian mixing models, we describe changes in trophic niche and nutrient transfer across trophic levels associated with the long- and short-term diet and body size of specimens sampled in midsummer in both shallow and deep waters. Samplings occurred soon after the sea-ice broke up at Tethys Bay, an area characterised by extreme seasonality in sea-ice coverage and productivity in the Ross Sea. In the long term, the trophic niche was broader and variation between specimens was greater, with intermediate-size specimens generally consuming a higher number of resources than small and large specimens. The coupling of energy channels in the food web was consequently more direct than in the short term. Sediment and benthic algae were more frequently consumed in the long term, before the sea-ice broke up, while consumers specialised on sympagic algae and plankton in the short term. Regardless of the time scale, sympagic algae were more frequently consumed in shallow waters, while plankton was more frequently consumed in deep waters. Our results suggest a strong temporal relationship between resource availability and the trophic niche of benthic consumers in Antarctica. Potential climate-driven changes
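
    To make the mixing-model logic concrete, the sketch below shows the simplest two-source, one-isotope version underlying the Bayesian mixing models used in such studies. The trophic discrimination factor and end-member values are illustrative only; the actual analysis uses both δ13C and δ15N with priors and uncertainty propagation.

        def two_source_mixing(d13c_consumer, d13c_source_a, d13c_source_b, tdf=0.4):
            """Estimate the diet fraction of source A from carbon isotope values after
            correcting the consumer for a trophic discrimination factor (tdf, per mille).
            A full Bayesian mixing model would add more sources, both isotopes,
            priors on the fractions and source variability."""
            corrected = d13c_consumer - tdf
            f_a = (corrected - d13c_source_b) / (d13c_source_a - d13c_source_b)
            return min(max(f_a, 0.0), 1.0)           # clamp to the feasible range

        # Illustrative end-member values only: sympagic algae vs. plankton.
        frac_sympagic = two_source_mixing(d13c_consumer=-20.5,
                                          d13c_source_a=-18.0,    # sympagic algae
                                          d13c_source_b=-26.0)    # plankton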

  11. Prediction models : the right tool for the right problem

    NARCIS (Netherlands)

    Kappen, Teus H.; Peelen, Linda M.

    2016-01-01

    PURPOSE OF REVIEW: Perioperative prediction models can help to improve personalized patient care by providing individual risk predictions to both patients and providers. However, the scientific literature on prediction model development and validation can be quite technical and challenging to

  12. Neuro-fuzzy modeling in bankruptcy prediction

    Directory of Open Access Journals (Sweden)

    Vlachos D.

    2003-01-01

    Full Text Available For the past 30 years the problem of bankruptcy prediction has been thoroughly studied. From the paper of Altman in 1968 to the recent papers of the 1990s, the progress in prediction accuracy was not satisfactory. This paper investigates an alternative modeling of the system (firm), combining neural networks and fuzzy controllers, i.e. using neuro-fuzzy models. Classical modeling is based on mathematical models that describe the behavior of the firm under consideration. The main idea of fuzzy control, on the other hand, is to build a model of a human control expert who is capable of controlling the process without thinking in terms of a mathematical model. This control expert specifies his control action in the form of linguistic rules. These control rules are translated into the framework of fuzzy set theory, providing a calculus which can simulate the behavior of the control expert and enhance its performance. The accuracy of the model is studied using datasets from previous research papers.

  13. Trait-mediated trophic interactions: is foraging theory keeping up?

    Science.gov (United States)

    Railsback, Steven F; Harvey, Bret C

    2013-02-01

    Many ecologists believe that there is a lack of foraging theory that works in community contexts, for populations of unique individuals each making trade-offs between food and risk that are subject to feedbacks from behavior of others. Such theory is necessary to reproduce the trait-mediated trophic interactions now recognized as widespread and strong. Game theory can address feedbacks but does not provide foraging theory for unique individuals in variable environments. 'State- and prediction-based theory' (SPT) is a new approach that combines existing trade-off methods with routine updating: individuals regularly predict future food availability and risk from current conditions to optimize a fitness measure. SPT can reproduce a variety of realistic foraging behaviors and trait-mediated trophic interactions with feedbacks, even when the environment is unpredictable. Copyright © 2012 Elsevier Ltd. All rights reserved.

  14. Coupling age-structured stock assessment and fish bioenergetics models: a system of time-varying models for quantifying piscivory patterns during the rapid trophic shift in the main basin of Lake Huron

    Science.gov (United States)

    He, Ji X.; Bence, James R.; Madenjian, Charles P.; Pothoven, Steven A.; Dobiesz, Norine E.; Fielder, David G.; Johnson, James E.; Ebener, Mark P.; Cottrill, Adam R.; Mohr, Lloyd C.; Koproski, Scott R.

    2015-01-01

    We quantified piscivory patterns in the main basin of Lake Huron during 1984–2010 and found that the biomass transfer from prey fish to piscivores remained consistently high despite the rapid major trophic shift in the food webs. We coupled age-structured stock assessment models and fish bioenergetics models for lake trout (Salvelinus namaycush), Chinook salmon (Oncorhynchus tshawytscha), walleye (Sander vitreus), and lake whitefish (Coregonus clupeaformis). The model system also included time-varying parameters or variables of growth, length–mass relations, maturity schedules, energy density, and diets. These time-varying models reflected the dynamic connections that a fish cohort responded to year-to-year ecosystem changes at different ages and body sizes. We found that the ratio of annual predation by lake trout, Chinook salmon, and walleye combined with the biomass indices of age-1 and older alewives (Alosa pseudoharengus) and rainbow smelt (Osmerus mordax) increased more than tenfold during 1987–2010, and such increases in predation pressure were structured by relatively stable biomass of the three piscivores and stepwise declines in the biomass of alewives and rainbow smelt. The piscivore stability was supported by the use of alternative energy pathways and changes in relative composition of the three piscivores. In addition, lake whitefish became a new piscivore by feeding on round goby (Neogobius melanostomus). Their total fish consumption rivaled that of the other piscivores combined, although fish were still a modest proportion of their diet. Overall, the use of alternative energy pathways by piscivores allowed the increases in predation pressure on dominant diet species.

  15. Predictive Models for Carcinogenicity and Mutagenicity ...

    Science.gov (United States)

    Mutagenicity and carcinogenicity are endpoints of major environmental and regulatory concern. These endpoints are also important targets for development of alternative methods for screening and prediction due to the large number of chemicals of potential concern and the tremendous cost (in time, money, animals) of rodent carcinogenicity bioassays. Both mutagenicity and carcinogenicity involve complex, cellular processes that are only partially understood. Advances in technologies and generation of new data will permit a much deeper understanding. In silico methods for predicting mutagenicity and rodent carcinogenicity based on chemical structural features, along with current mutagenicity and carcinogenicity data sets, have performed well for local prediction (i.e., within specific chemical classes), but are less successful for global prediction (i.e., for a broad range of chemicals). The predictivity of in silico methods can be improved by improving the quality of the data base and endpoints used for modelling. In particular, in vitro assays for clastogenicity need to be improved to reduce false positives (relative to rodent carcinogenicity) and to detect compounds that do not interact directly with DNA or have epigenetic activities. New assays emerging to complement or replace some of the standard assays include VitotoxTM, GreenScreenGC, and RadarScreen. The needs of industry and regulators to assess thousands of compounds necessitate the development of high-t

  16. Assessment, modelization and analysis of {sup 106} Ru experimental transfers through a freshwater trophic system; Evaluation, modelisation et analyse des transferts experimentaux du {sup 106}Ru au sein d`un reseau trophique d`eau douce

    Energy Technology Data Exchange (ETDEWEB)

    Vray, F

    1994-11-24

    Experiments are carried out in order to study {sup 106}Ru transfers through a freshwater ecosystem including 2 abiotic compartments (water and sediment) and 3 trophic levels (10 species). Experimental results are expressed mathematically so that they can be included in a global model, which is then tested in two different situations. The comparison of the available data on in situ measured concentrations with the corresponding calculated ones validates the whole procedure. Analysis of the validated results sheds light on the ruthenium distribution process in the environment. The rare detection of this radionuclide in organisms living in areas contaminated by known meaningful releases can be explained by a relatively high detection limit and by the minor role of the sediment as a secondary contamination source. (author). 78 figs., 18 tabs.
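    Transfer experiments of this kind are commonly summarised with first-order uptake/elimination kinetics before being chained into a whole-food-web model; the sketch below is a generic one-compartment version with invented rate constants, not the thesis's actual model.

```python
import numpy as np
from scipy.integrate import solve_ivp

def organism(t, c, k_u=0.5, k_e=0.05, c_water=1.0):
    """First-order uptake from water and elimination: dC/dt = k_u*C_w - k_e*C (invented rates)."""
    return [k_u * c_water - k_e * c[0]]

sol = solve_ivp(organism, (0.0, 100.0), [0.0], t_eval=np.linspace(0.0, 100.0, 11))
print(sol.y[0])   # concentration rises toward the equilibrium k_u*C_w/k_e = 10
```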

  17. Validated predictive modelling of the environmental resistome.

    Science.gov (United States)

    Amos, Gregory C A; Gozzard, Emma; Carter, Charlotte E; Mead, Andrew; Bowes, Mike J; Hawkey, Peter M; Zhang, Lihong; Singer, Andrew C; Gaze, William H; Wellington, Elizabeth M H

    2015-06-01

    Multi-drug-resistant bacteria pose a significant threat to public health. The role of the environment in the overall rise in antibiotic-resistant infections and risk to humans is largely unknown. This study aimed to evaluate drivers of antibiotic-resistance levels across the River Thames catchment, model key biotic, spatial and chemical variables and produce predictive models for future risk assessment. Sediment samples from 13 sites across the River Thames basin were taken at four time points across 2011 and 2012. Samples were analysed for class 1 integron prevalence and enumeration of third-generation cephalosporin-resistant bacteria. Class 1 integron prevalence was validated as a molecular marker of antibiotic resistance; levels of resistance showed significant geospatial and temporal variation. The main explanatory variables of resistance levels at each sample site were the number, proximity, size and type of surrounding wastewater-treatment plants. Model 1 revealed treatment plants accounted for 49.5% of the variance in resistance levels. Other contributing factors were extent of different surrounding land cover types (for example, Neutral Grassland), temporal patterns and prior rainfall; when modelling all variables the resulting model (Model 2) could explain 82.9% of variations in resistance levels in the whole catchment. Chemical analyses correlated with key indicators of treatment plant effluent and a model (Model 3) was generated based on water quality parameters (contaminant and macro- and micro-nutrient levels). Model 2 was beta tested on independent sites and explained over 78% of the variation in integron prevalence showing a significant predictive ability. We believe all models in this study are highly useful tools for informing and prioritising mitigation strategies to reduce the environmental resistome.

  18. Nonlinear model predictive control theory and algorithms

    CERN Document Server

    Grüne, Lars

    2017-01-01

    This book offers readers a thorough and rigorous introduction to nonlinear model predictive control (NMPC) for discrete-time and sampled-data systems. NMPC schemes with and without stabilizing terminal constraints are detailed, and intuitive examples illustrate the performance of different NMPC variants. NMPC is interpreted as an approximation of infinite-horizon optimal control so that important properties like closed-loop stability, inverse optimality and suboptimality can be derived in a uniform manner. These results are complemented by discussions of feasibility and robustness. An introduction to nonlinear optimal control algorithms yields essential insights into how the nonlinear optimization routine—the core of any nonlinear model predictive controller—works. Accompanying software in MATLAB® and C++ (downloadable from extras.springer.com/), together with an explanatory appendix in the book itself, enables readers to perform computer experiments exploring the possibilities and limitations of NMPC. T...
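    The receding-horizon principle the book develops can be illustrated in a few lines (a toy sketch, unrelated to the book's accompanying MATLAB/C++ software): at each sampling instant a finite-horizon cost is minimized over future inputs for a simple nonlinear discrete-time system, only the first optimized input is applied, and the horizon is shifted; the system and cost here are invented.

```python
import numpy as np
from scipy.optimize import minimize

def step(x, u):
    """Toy nonlinear discrete-time system (a damped pendulum-like model, invented)."""
    return np.array([x[0] + 0.1 * x[1], x[1] + 0.1 * (-np.sin(x[0]) - 0.1 * x[1] + u)])

def horizon_cost(u_seq, x0, N):
    x, cost = np.array(x0), 0.0
    for k in range(N):
        x = step(x, u_seq[k])
        cost += x @ x + 0.1 * u_seq[k] ** 2        # quadratic stage cost on state and input
    return cost

x, N = np.array([1.0, 0.0]), 10
for t in range(40):
    res = minimize(horizon_cost, np.zeros(N), args=(x, N),
                   method="SLSQP", bounds=[(-2.0, 2.0)] * N)   # input constraints
    x = step(x, res.x[0])                          # apply only the first optimized input
print(x)                                           # the state should be driven near the origin
```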

  19. Baryogenesis model predicting antimatter in the Universe

    International Nuclear Information System (INIS)

    Kirilova, D.

    2003-01-01

    Cosmic ray and gamma-ray data do not rule out antimatter domains in the Universe, separated at distances bigger than 10 Mpc from us. Hence, it is interesting to analyze the possible generation of vast antimatter structures during the early Universe evolution. We discuss a SUSY-condensate baryogenesis model, predicting large separated regions of matter and antimatter. The model provides generation of the small locally observed baryon asymmetry for natural initial conditions, and it predicts vast antimatter domains, separated from the matter ones by baryonically empty voids. The characteristic scale of antimatter regions and their distance from the matter ones are in accordance with observational constraints from cosmic ray, gamma-ray and cosmic microwave background anisotropy data

  20. Predator-prey dynamics driven by feedback between functionally diverse trophic levels.

    Directory of Open Access Journals (Sweden)

    Katrin Tirok

    Full Text Available Neglecting the naturally existing functional diversity of communities and the resulting potential to respond to altered conditions may strongly reduce the realism and predictive power of ecological models. We therefore propose and study a predator-prey model that describes mutual feedback via species shifts in both predator and prey, using a dynamic trait approach. Species compositions of the two trophic levels were described by mean functional traits--prey edibility and predator food-selectivity--and functional diversities by the variances. Altered edibility triggered shifts in food-selectivity so that consumers continuously respond to the present prey composition, and vice versa. This trait-mediated feedback mechanism resulted in a complex dynamic behavior with ongoing oscillations in the mean trait values, reflecting continuous reorganization of the trophic levels. The feedback was only possible if sufficient functional diversity was present in both trophic levels. Functional diversity was internally maintained on the prey level, as no single prey niche in our system was ideal under every composition of the predator level, owing to the trade-offs between edibility, growth and carrying capacity. The predators were subject to only one trade-off, between food-selectivity and grazing ability, and in the absence of immigration one predator type became abundant, i.e., functional diversity declined to zero. In the absence of functional diversity the system showed the same dynamics as conventional models of predator-prey interactions that ignore the potential for shifts in species composition. In this way, our study identified the crucial role of trade-offs and their shape in physiological and ecological traits for preserving diversity.
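    A minimal sketch of a dynamic-trait predator-prey model in the spirit described above, with mean traits climbing their fitness gradients at a speed set by the functional diversity (trait variance); the Gaussian matching function, the edibility-growth trade-off and all parameter values are assumptions, not the authors' equations.

```python
import numpy as np
from scipy.integrate import solve_ivp

def attack(u, v, a0=2.0, w=0.5):
    # Grazing increases with prey edibility u and with matching of predator selectivity v to u
    return a0 * u * np.exp(-((u - v) ** 2) / (2.0 * w ** 2))

def rhs(t, y, Vu=0.05, Vv=0.05, r=1.0, K=5.0, e=0.3, m=0.2, h=1e-4):
    N, P, u, v = y
    u, v = min(max(u, 1e-3), 1.0), min(max(v, 0.0), 1.0)   # keep traits in their assumed range
    dN = r * u * N * (1 - N / K) - attack(u, v) * N * P    # edibility-growth trade-off
    dP = e * attack(u, v) * N * P - m * P
    # Mean traits climb per-capita fitness gradients; speed is set by trait variances Vu, Vv
    prey_fitness = lambda uu: r * uu * (1 - N / K) - attack(uu, v) * P
    pred_fitness = lambda vv: e * attack(u, vv) * N - m
    du = Vu * (prey_fitness(u + h) - prey_fitness(u - h)) / (2 * h)
    dv = Vv * (pred_fitness(v + h) - pred_fitness(v - h)) / (2 * h)
    return [dN, dP, du, dv]

sol = solve_ivp(rhs, (0.0, 400.0), [1.0, 0.5, 0.6, 0.3], max_step=0.5)
print(sol.y[:, -1])   # final densities and mean traits; the traits typically keep reorganizing
```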

  1. Finding Furfural Hydrogenation Catalysts via Predictive Modelling

    OpenAIRE

    Strassberger, Zea; Mooijman, Maurice; Ruijter, Eelco; Alberts, Albert H; Maldonado, Ana G; Orru, Romano V A; Rothenberg, Gadi

    2010-01-01

    Abstract We combine multicomponent reactions, catalytic performance studies and predictive modelling to find transfer hydrogenation catalysts. An initial set of 18 ruthenium-carbene complexes was synthesized and screened in the transfer hydrogenation of furfural to furfurol with isopropyl alcohol. The complexes gave varied yields, from 62% up to >99.9%, with no obvious structure/activity correlations. Control experiments proved that the carbene ligand remains coordinated to the ruthenium centre t...

  2. Predictive Modeling in Actinide Chemistry and Catalysis

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Ping [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-16

    These are slides from a presentation on predictive modeling in actinide chemistry and catalysis. The following topics are covered in these slides: Structures, bonding, and reactivity (bonding can be quantified by optical probes and theory, and electronic structures and reaction mechanisms of actinide complexes); Magnetic resonance properties (transition metal catalysts with multi-nuclear centers, and NMR/EPR parameters); Moving to more complex systems (surface chemistry of nanomaterials, and interactions of ligands with nanoparticles); Path forward and conclusions.

  3. Tectonic predictions with mantle convection models

    Science.gov (United States)

    Coltice, Nicolas; Shephard, Grace E.

    2018-04-01

    Over the past 15 yr, numerical models of convection in Earth's mantle have made a leap forward: they can now produce self-consistent plate-like behaviour at the surface together with deep mantle circulation. These digital tools provide a new window into the intimate connections between plate tectonics and mantle dynamics, and can therefore be used for tectonic predictions, in principle. This contribution explores this assumption. First, initial conditions at 30, 20, 10 and 0 Ma are generated by driving a convective flow with imposed plate velocities at the surface. We then compute instantaneous mantle flows in response to the guessed temperature fields without imposing any boundary conditions. Plate boundaries self-consistently emerge at correct locations with respect to reconstructions, except for small plates close to subduction zones. As already observed for other types of instantaneous flow calculations, the structure of the top boundary layer and upper-mantle slab is the dominant character that leads to accurate predictions of surface velocities. Perturbations of the rheological parameters have little impact on the resulting surface velocities. We then compute fully dynamic model evolution from 30 and 10 to 0 Ma, without imposing plate boundaries or plate velocities. Contrary to instantaneous calculations, errors in kinematic predictions are substantial, although the plate layout and kinematics in several areas remain consistent with the expectations for the Earth. For these calculations, varying the rheological parameters makes a difference for plate boundary evolution. Also, identified errors in initial conditions contribute to first-order kinematic errors. This experiment shows that the tectonic predictions of dynamic models over 10 My are highly sensitive to uncertainties of rheological parameters and initial temperature field in comparison to instantaneous flow calculations. Indeed, the initial conditions and the rheological parameters can be good enough

  4. Trophic magnification of PCBs and Its relationship to the octanol-water partition coefficient.

    Science.gov (United States)

    Walters, David M; Mills, Marc A; Cade, Brian S; Burkard, Lawrence P

    2011-05-01

    We investigated polychlorinated biphenyl (PCB) bioaccumulation relative to the octanol-water partition coefficient (K(OW)) and organism trophic position (TP) at the Lake Hartwell Superfund site (South Carolina). We measured PCBs (127 congeners) and stable isotopes (δ¹⁵N) in sediment, organic matter, phytoplankton, zooplankton, macroinvertebrates, and fish. TP, as calculated from δ¹⁵N, was significantly and positively related to PCB concentrations, and food web trophic magnification factors (TMFs) ranged from 1.5 to 6.6 among congeners. TMFs of individual congeners increased strongly with log K(OW), as did the predictive power (r²) of the individual TP-PCB regression models used to calculate TMFs. We developed log K(OW)-TMF models for eight food webs with vastly different environments (freshwater, marine, arctic, temperate) and species composition (cold- vs. warm-blooded consumers). The effect of K(OW) on congener TMFs varied strongly across food webs (model slopes 0.0-15.0) because the range of TMFs among studies was also highly variable. We standardized TMFs within studies to mean = 0, standard deviation (SD) = 1 to normalize for scale differences and found a remarkably consistent K(OW) effect on TMFs (no difference in model slopes among food webs). Our findings underscore the importance of hydrophobicity (as characterized by K(OW)) in regulating bioaccumulation of recalcitrant compounds in aquatic systems, and demonstrate that relationships between chemical K(OW) and bioaccumulation from field studies are more generalized than previously recognized.
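    A trophic magnification factor is conventionally obtained as 10 raised to the slope of a regression of log10 concentration on trophic position; the sketch below performs that calculation on made-up numbers, not on the Lake Hartwell data.

```python
import numpy as np
from scipy.stats import linregress

# Hypothetical congener data: trophic position (from d15N) and lipid-normalized concentration
tp   = np.array([1.0, 1.8, 2.4, 2.9, 3.5, 4.1])
conc = np.array([2.1, 5.0, 9.8, 22.0, 35.0, 90.0])      # invented values, ng/g lipid

fit = linregress(tp, np.log10(conc))
tmf = 10 ** fit.slope                                    # trophic magnification factor
print(f"TMF = {tmf:.2f}, r^2 = {fit.rvalue ** 2:.2f}")   # TMF > 1 indicates biomagnification
```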

  5. Trophic ecomorphology of Siluriformes (Pisces, Osteichthyes) from a tropical stream.

    Science.gov (United States)

    Pagotto, J P A; Goulart, E; Oliveira, E F; Yamamura, C B

    2011-05-01

    The present study analysed the relationship between morphology and trophic structure of Siluriformes (Pisces, Osteichthyes) from the Caracu Stream (22º 45' S and 53º 15' W), a tributary of the Paraná River (Brazil). Sampling was carried out at three sites using electrofishing, and two species of Loricariidae and four of Heptapteridae were obtained. A cluster analysis revealed the presence of three trophic guilds (detritivores, insectivores and omnivores). Principal components analysis demonstrated the segregation of two ecomorphotypes: at one extreme there were the detritivores (Loricariidae) with morphological structures that are fundamental in allowing them to fix themselves to substrates characterised by rushing torrents, thus permitting them to graze on the detritus and organic materials encrusted on the substrate; at the other extreme of the gradient there were the insectivores and omnivores (Heptapteridae), with morphological characteristics that promote superior performance in the exploitation of structurally complex habitats with low current velocity, colonised by insects and plants. Canonical discriminant analysis revealed an ecomorphological divergence between insectivores, which have morphological structures that permit them to capture prey in small spaces among rocks, and omnivores, which have a more compressed body and tend to explore food items deposited in marginal backwater zones. Mantel tests showed that trophic structure was significantly related to the body shape of a species, independently of the phylogenetic history, indicating that, in this case, there was an ecomorphotype for each trophic guild. Therefore, the present study demonstrated that the Siluriformes of the Caracu Stream were ecomorphologically structured and that morphology can be applied as an additional tool in predicting the trophic structure of this group.

  6. Trophic ecomorphology of Siluriformes (Pisces, Osteichthyes) from a tropical stream

    Directory of Open Access Journals (Sweden)

    JPA Pagotto

    Full Text Available The present study analysed the relationship between morphology and trophic structure of Siluriformes (Pisces, Osteichthyes) from the Caracu Stream (22º 45' S and 53º 15' W), a tributary of the Paraná River (Brazil). Sampling was carried out at three sites using electrofishing, and two species of Loricariidae and four of Heptapteridae were obtained. A cluster analysis revealed the presence of three trophic guilds (detritivores, insectivores and omnivores). Principal components analysis demonstrated the segregation of two ecomorphotypes: at one extreme there were the detritivores (Loricariidae) with morphological structures that are fundamental in allowing them to fix themselves to substrates characterised by rushing torrents, thus permitting them to graze on the detritus and organic materials encrusted on the substrate; at the other extreme of the gradient there were the insectivores and omnivores (Heptapteridae), with morphological characteristics that promote superior performance in the exploitation of structurally complex habitats with low current velocity, colonised by insects and plants. Canonical discriminant analysis revealed an ecomorphological divergence between insectivores, which have morphological structures that permit them to capture prey in small spaces among rocks, and omnivores, which have a more compressed body and tend to explore food items deposited in marginal backwater zones. Mantel tests showed that trophic structure was significantly related to the body shape of a species, independently of the phylogenetic history, indicating that, in this case, there was an ecomorphotype for each trophic guild. Therefore, the present study demonstrated that the Siluriformes of the Caracu Stream were ecomorphologically structured and that morphology can be applied as an additional tool in predicting the trophic structure of this group.

  7. Breast cancer risks and risk prediction models.

    Science.gov (United States)

    Engel, Christoph; Fischer, Christine

    2015-02-01

    BRCA1/2 mutation carriers have a considerably increased risk to develop breast and ovarian cancer. The personalized clinical management of carriers and other at-risk individuals depends on precise knowledge of the cancer risks. In this report, we give an overview of the present literature on empirical cancer risks, and we describe risk prediction models that are currently used for individual risk assessment in clinical practice. Cancer risks show large variability between studies. Breast cancer risks are at 40-87% for BRCA1 mutation carriers and 18-88% for BRCA2 mutation carriers. For ovarian cancer, the risk estimates are in the range of 22-65% for BRCA1 and 10-35% for BRCA2. The contralateral breast cancer risk is high (10-year risk after first cancer 27% for BRCA1 and 19% for BRCA2). Risk prediction models have been proposed to provide more individualized risk prediction, using additional knowledge on family history, mode of inheritance of major genes, and other genetic and non-genetic risk factors. User-friendly software tools have been developed that serve as basis for decision-making in family counseling units. In conclusion, further assessment of cancer risks and model validation is needed, ideally based on prospective cohort studies. To obtain such data, clinical management of carriers and other at-risk individuals should always be accompanied by standardized scientific documentation.

  8. A predictive model for dimensional errors in fused deposition modeling

    DEFF Research Database (Denmark)

    Stolfi, A.

    2015-01-01

    This work concerns the effect of deposition angle (a) and layer thickness (L) on the dimensional performance of FDM parts using a predictive model based on the geometrical description of the FDM filament profile. An experimental validation over the whole a range from 0° to 177° at 3° steps and two...... values of L (0.254 mm, 0.330 mm) was produced by comparing predicted values with external face-to-face measurements. After removing outliers, the results show that the developed two-parameter model can serve as tool for modeling the FDM dimensional behavior in a wide range of deposition angles....

  9. Two stage neural network modelling for robust model predictive control.

    Science.gov (United States)

    Patan, Krzysztof

    2018-01-01

    The paper proposes a novel robust model predictive control scheme realized by means of artificial neural networks. The neural networks are used twofold: to design the so-called fundamental model of a plant and to catch uncertainty associated with the plant model. In order to simplify the optimization process carried out within the framework of predictive control an instantaneous linearization is applied which renders it possible to define the optimization problem in the form of constrained quadratic programming. Stability of the proposed control system is also investigated by showing that a cost function is monotonically decreasing with respect to time. Derived robust model predictive control is tested and validated on the example of a pneumatic servomechanism working at different operating regimes. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  10. Vitamin D and muscle trophicity.

    Science.gov (United States)

    Domingues-Faria, Carla; Boirie, Yves; Walrand, Stéphane

    2017-05-01

    We review recent findings on the involvement of vitamin D in skeletal muscle trophicity. Vitamin D deficiencies are associated with reduced muscle mass and strength, and its supplementation seems effective to improve these parameters in vitamin D-deficient study participants. Latest investigations have also evidenced that vitamin D is essential in muscle development and repair. In particular, it modulates skeletal muscle cell proliferation and differentiation. However, discrepancies still exist about an enhancement or a decrease of muscle proliferation and differentiation by the vitamin D. Recently, it has been demonstrated that vitamin D influences skeletal muscle cell metabolism as it seems to regulate protein synthesis and mitochondrial function. Finally, apart from its genomic and nongenomic effects, recent investigations have demonstrated a genetic contribution of vitamin D to muscle functioning. Recent studies support the importance of vitamin D in muscle health, and the impact of its deficiency in regard to muscle mass and function. These 'trophic' properties are of particular importance for some specific populations such as elderly persons and athletes, and in situations of loss of muscle mass or function, particularly in the context of chronic diseases.

  11. Predicting extinction rates in stochastic epidemic models

    International Nuclear Information System (INIS)

    Schwartz, Ira B; Billings, Lora; Dykman, Mark; Landsman, Alexandra

    2009-01-01

    We investigate the stochastic extinction processes in a class of epidemic models. Motivated by the process of natural disease extinction in epidemics, we examine the rate of extinction as a function of disease spread. We show that the effective entropic barrier for extinction in a susceptible–infected–susceptible epidemic model displays scaling with the distance to the bifurcation point, with an unusual critical exponent. We make a direct comparison between predictions and numerical simulations. We also consider the effect of non-Gaussian vaccine schedules, and show numerically how the extinction process may be enhanced when the vaccine schedules are Poisson distributed
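    A minimal way to examine extinction times numerically (a generic Gillespie simulation of a susceptible-infected-susceptible model, not the authors' analysis) is sketched below; the population size and rates are arbitrary.

```python
import numpy as np

def sis_extinction_time(N=100, beta=1.3, gamma=1.0, I0=10, seed=None):
    """Gillespie simulation of an SIS model; returns the time until the infection dies out."""
    rng = np.random.default_rng(seed)
    I, t = I0, 0.0
    while I > 0:
        infection = beta * I * (N - I) / N       # rate of S -> I events
        recovery = gamma * I                     # rate of I -> S events
        total = infection + recovery
        t += rng.exponential(1.0 / total)        # waiting time to the next event
        I += 1 if rng.random() < infection / total else -1
    return t

times = [sis_extinction_time(seed=s) for s in range(200)]
print("mean extinction time:", round(float(np.mean(times)), 1))
```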

  12. Predictive Modeling of the CDRA 4BMS

    Science.gov (United States)

    Coker, Robert F.; Knox, James C.

    2016-01-01

    As part of NASA's Advanced Exploration Systems (AES) program and the Life Support Systems Project (LSSP), fully predictive models of the Four Bed Molecular Sieve (4BMS) of the Carbon Dioxide Removal Assembly (CDRA) on the International Space Station (ISS) are being developed. This virtual laboratory will be used to help reduce mass, power, and volume requirements for future missions. In this paper we describe current and planned modeling developments in the area of carbon dioxide removal to support future crewed Mars missions as well as the resolution of anomalies observed in the ISS CDRA.

  13. Assessment of agglomeration, co-sedimentation and trophic transfer of titanium dioxide nanoparticles in a laboratory-scale predator-prey model system

    Science.gov (United States)

    Gupta, Govind Sharan; Kumar, Ashutosh; Shanker, Rishi; Dhawan, Alok

    2016-08-01

    Nano titanium dioxide (nTiO2) is the most abundantly released engineered nanomaterial (ENM) in aquatic environments. Therefore, it is prudent to assess its fate and its effects on lower trophic-level organisms in the aquatic food chain. A predator-and-prey-based laboratory microcosm was established using Paramecium caudatum and Escherichia coli to evaluate the effects of nTiO2. The surface interaction of nTiO2 with E. coli significantly increased after the addition of Paramecium into the microcosm. This interaction favoured the hetero-agglomeration and co-sedimentation of nTiO2. The extent of nTiO2 agglomeration under experimental conditions was as follows: combined E. coli and Paramecium > Paramecium only > E. coli only > without E. coli or Paramecium. An increase in nTiO2 internalisation in Paramecium cells was also observed in the presence or absence of E. coli cells. These interactions and nTiO2 internalisation in Paramecium cells induced statistically significant (p < 0.05) effects on growth and the bacterial ingestion rate at 24 h. These findings provide new insights into the fate of nTiO2 in the presence of bacterial-ciliate interactions in the aquatic environment.

  14. Data Driven Economic Model Predictive Control

    Directory of Open Access Journals (Sweden)

    Masoud Kheradmandi

    2018-04-01

    Full Text Available This manuscript addresses the problem of data driven model based economic model predictive control (MPC) design. To this end, first, a data-driven Lyapunov-based MPC is designed, and shown to be capable of stabilizing a system at an unstable equilibrium point. The data driven Lyapunov-based MPC utilizes a linear time invariant (LTI) model cognizant of the fact that the training data, owing to the unstable nature of the equilibrium point, has to be obtained from closed-loop operation or experiments. Simulation results are first presented demonstrating closed-loop stability under the proposed data-driven Lyapunov-based MPC. The underlying data-driven model is then utilized as the basis to design an economic MPC. The economic improvements yielded by the proposed method are illustrated through simulations on a nonlinear chemical process system example.
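    The data-driven LTI modelling step can be sketched as an ARX model fitted by least squares to input-output records and then used as the one-step-ahead predictor inside an MPC; the data below are synthetic and this is an illustrative sketch, not the manuscript's algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic input-output records from an unknown first-order plant (stand-in for closed-loop data)
u = rng.normal(size=300)
y = np.zeros(300)
for k in range(1, 300):
    y[k] = 0.8 * y[k - 1] + 0.5 * u[k - 1] + 0.01 * rng.normal()

# Fit the ARX(1, 1) model y[k] = a*y[k-1] + b*u[k-1] by least squares
X = np.column_stack([y[:-1], u[:-1]])
a_hat, b_hat = np.linalg.lstsq(X, y[1:], rcond=None)[0]
print(round(a_hat, 3), round(b_hat, 3))            # should recover roughly 0.8 and 0.5

# The identified model then serves as the MPC prediction model, e.g. one step ahead:
y_next = a_hat * y[-1] + b_hat * u[-1]
```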

  15. Plant control using embedded predictive models

    International Nuclear Information System (INIS)

    Godbole, S.S.; Gabler, W.E.; Eschbach, S.L.

    1990-01-01

    B and W recently undertook the design of an advanced light water reactor control system. A concept new to nuclear steam system (NSS) control was developed. The concept, which is called the Predictor-Corrector, uses mathematical models of portions of the controlled NSS to calculate, at various levels within the system, demand and control element position signals necessary to satisfy electrical demand. The models give the control system the ability to reduce overcooling and undercooling of the reactor coolant system during transients and upsets. Two types of mathematical models were developed for use in designing and testing the control system. One model was a conventional, comprehensive NSS model that responds to control system outputs and calculates the resultant changes in plant variables that are then used as inputs to the control system. Two other models, embedded in the control system, were less conventional, inverse models. These models accept as inputs plant variables, equipment states, and demand signals and predict plant operating conditions and control element states that will satisfy the demands. This paper reports preliminary results of closed-loop Reactor Coolant (RC) pump trip and normal load reduction testing of the advanced concept. Results of additional transient testing, and of open and closed loop stability analyses will be reported as they are available

  16. Ground Motion Prediction Models for Caucasus Region

    Science.gov (United States)

    Jorjiashvili, Nato; Godoladze, Tea; Tvaradze, Nino; Tumanova, Nino

    2016-04-01

    Ground motion prediction models (GMPMs) relate ground motion intensity measures to variables describing earthquake source, path, and site effects. Estimation of expected ground motion is a fundamental earthquake hazard assessment. The most commonly used parameter for attenuation relation is peak ground acceleration or spectral acceleration because this parameter gives useful information for Seismic Hazard Assessment. Since 2003 development of Georgian Digital Seismic Network has started. In this study new GMP models are obtained based on new data from Georgian seismic network and also from neighboring countries. Estimation of models is obtained by classical, statistical way, regression analysis. In this study site ground conditions are additionally considered because the same earthquake recorded at the same distance may cause different damage according to ground conditions. Empirical ground-motion prediction models (GMPMs) require adjustment to make them appropriate for site-specific scenarios. However, the process of making such adjustments remains a challenge. This work presents a holistic framework for the development of a peak ground acceleration (PGA) or spectral acceleration (SA) GMPE that is easily adjustable to different seismological conditions and does not suffer from the practical problems associated with adjustments in the response spectral domain.
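    A ground motion prediction equation of the kind described is, at its simplest, a regression of log ground motion on magnitude and distance (plus site terms); the sketch fits such a form to synthetic records generated with invented coefficients, not to the Georgian data.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
M = rng.uniform(4.0, 7.0, n)                       # magnitudes
R = rng.uniform(5.0, 200.0, n)                     # source-to-site distances, km

# Synthetic "observations" following ln(PGA) = c1 + c2*M + c3*ln(R) + noise (invented coefficients)
ln_pga = -3.5 + 1.1 * M - 1.6 * np.log(R) + 0.4 * rng.normal(size=n)

# Least-squares regression of the attenuation relation
X = np.column_stack([np.ones(n), M, np.log(R)])
c = np.linalg.lstsq(X, ln_pga, rcond=None)[0]
print("c1, c2, c3 =", np.round(c, 2))

# Median prediction for a scenario event (arbitrary units)
print("PGA(M=6, R=30 km) =", round(float(np.exp(c @ np.array([1.0, 6.0, np.log(30.0)]))), 3))
```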

  17. Modeling and Prediction of Krueger Device Noise

    Science.gov (United States)

    Guo, Yueping; Burley, Casey L.; Thomas, Russell H.

    2016-01-01

    This paper presents the development of a noise prediction model for aircraft Krueger flap devices that are considered as alternatives to leading edge slotted slats. The prediction model decomposes the total Krueger noise into four components, generated by the unsteady flows, respectively, in the cove under the pressure side surface of the Krueger, in the gap between the Krueger trailing edge and the main wing, around the brackets supporting the Krueger device, and around the cavity on the lower side of the main wing. For each noise component, the modeling follows a physics-based approach that aims at capturing the dominant noise-generating features in the flow and developing correlations between the noise and the flow parameters that control the noise generation processes. The far field noise is modeled using each of the four noise component's respective spectral functions, far field directivities, Mach number dependencies, component amplitudes, and other parametric trends. Preliminary validations are carried out by using small scale experimental data, and two applications are discussed; one for conventional aircraft and the other for advanced configurations. The former focuses on the parametric trends of Krueger noise on design parameters, while the latter reveals its importance in relation to other airframe noise components.

  18. Prediction of Chemical Function: Model Development and ...

    Science.gov (United States)

    The United States Environmental Protection Agency’s Exposure Forecaster (ExpoCast) project is developing both statistical and mechanism-based computational models for predicting exposures to thousands of chemicals, including those in consumer products. The high-throughput (HT) screening-level exposures developed under ExpoCast can be combined with HT screening (HTS) bioactivity data for the risk-based prioritization of chemicals for further evaluation. The functional role (e.g. solvent, plasticizer, fragrance) that a chemical performs can drive both the types of products in which it is found and the concentration in which it is present and therefore impacting exposure potential. However, critical chemical use information (including functional role) is lacking for the majority of commercial chemicals for which exposure estimates are needed. A suite of machine-learning based models for classifying chemicals in terms of their likely functional roles in products based on structure were developed. This effort required collection, curation, and harmonization of publically-available data sources of chemical functional use information from government and industry bodies. Physicochemical and structure descriptor data were generated for chemicals with function data. Machine-learning classifier models for function were then built in a cross-validated manner from the descriptor/function data using the method of random forests. The models were applied to: 1) predict chemi
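    The described workflow (chemical descriptors in, functional-role class out, random forest classifier, cross-validated evaluation) can be sketched as follows; the descriptors and labels here are synthetic placeholders, not the curated functional-use data the abstract refers to.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Placeholder descriptor matrix (physicochemical/structural descriptors) and functional-role
# labels (e.g. 0 = solvent, 1 = plasticizer, 2 = fragrance); both are synthetic stand-ins
X = rng.normal(size=(300, 20))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int) + (X[:, 2] > 1.0).astype(int)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("cross-validated accuracy:", round(float(cross_val_score(clf, X, y, cv=5).mean()), 2))
```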

  19. Evaluating Predictive Models of Software Quality

    Science.gov (United States)

    Ciaschini, V.; Canaparo, M.; Ronchieri, E.; Salomoni, D.

    2014-06-01

    Applications from the High Energy Physics scientific community are constantly growing and are implemented by a large number of developers. This implies a strong churn on the code and an associated risk of faults, which is unavoidable as long as the software undergoes active evolution. However, the necessities of production systems run counter to this. Stability and predictability are of paramount importance; in addition, a short turn-around time for the defect discovery-correction-deployment cycle is required. A way to reconcile these opposite foci is to use a software quality model to obtain an approximation of the risk before releasing a program, so as to only deliver software with a risk lower than an agreed threshold. In this article we evaluated two quality predictive models to identify the operational risk and the quality of some software products. We applied these models to the development history of several EMI packages with the intent to discover the risk factor of each product and compare it with its real history. We attempted to determine whether the models reasonably map reality for the applications under evaluation, and finally we conclude by suggesting directions for further studies.

  20. Predicting FLDs Using a Multiscale Modeling Scheme

    Science.gov (United States)

    Wu, Z.; Loy, C.; Wang, E.; Hegadekatte, V.

    2017-09-01

    The measurement of a single forming limit diagram (FLD) requires significant resources and is time consuming. We have developed a multiscale modeling scheme to predict FLDs using a combination of limited laboratory testing, crystal plasticity (VPSC) modeling, and dual sequential-stage finite element (ABAQUS/Explicit) modeling with the Marciniak-Kuczynski (M-K) criterion to determine the limit strain. We have established a means to work around existing limitations in ABAQUS/Explicit by using an anisotropic yield locus (e.g., BBC2008) in combination with the M-K criterion. We further apply a VPSC model to reduce the number of laboratory tests required to characterize the anisotropic yield locus. In the present work, we show that the predicted FLD is in excellent agreement with the measured FLD for AA5182 in the O temper. Instead of 13 different tests as for a traditional FLD determination within Novelis, our technique uses just four measurements: tensile properties in three orientations; plane strain tension; biaxial bulge; and the sheet crystallographic texture. The turnaround time is consequently far less than for the traditional laboratory measurement of the FLD.

  1. PREDICTION MODELS OF GRAIN YIELD AND CHARACTERIZATION

    Directory of Open Access Journals (Sweden)

    Narciso Ysac Avila Serrano

    2009-06-01

    Full Text Available With the objective of characterizing the grain yield of five cowpea cultivars and finding linear regression models to predict it, a study was conducted in La Paz, Baja California Sur, Mexico. A complete randomized block design was used. Simple and multivariate analyses of variance were carried out using the canonical variables to characterize the cultivars. The variables clusters per plant, pods per plant, pods per cluster, seed weight per plant, seed hectoliter weight, 100-seed weight, seed length, seed width, seed thickness, pod length, pod width, pod weight, seeds per pod, and seed weight per pod showed significant differences (P≤ 0.05) among cultivars. The Paceño and IT90K-277-2 cultivars showed the highest seed weight per plant. The linear regression models showed correlation coefficients ≥0.92. In these models, seed weight per plant, pods per cluster, pods per plant, clusters per plant and pod length showed significant correlations (P≤ 0.05). In conclusion, the results showed that grain yield differs among cultivars and, for its estimation, the prediction models showed highly dependable determination coefficients.
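    A linear prediction model of the kind described can be sketched as an ordinary least-squares regression of grain yield on a few yield components; the data below are synthetic and the coefficients are not those of the study.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 40
pods_per_plant  = rng.uniform(5, 25, n)
seeds_per_pod   = rng.uniform(6, 14, n)
hundred_seed_wt = rng.uniform(10, 20, n)               # g

# Synthetic grain yield per plant (g) driven by the three components plus noise
grain_yield = 0.01 * pods_per_plant * seeds_per_pod * hundred_seed_wt + rng.normal(0, 1, n)

X = np.column_stack([np.ones(n), pods_per_plant, seeds_per_pod, hundred_seed_wt])
b = np.linalg.lstsq(X, grain_yield, rcond=None)[0]
r = np.corrcoef(X @ b, grain_yield)[0, 1]              # correlation of predicted vs. observed
print("coefficients:", np.round(b, 3), " r =", round(float(r), 2))
```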

  2. Evaluating predictive models of software quality

    International Nuclear Information System (INIS)

    Ciaschini, V; Canaparo, M; Ronchieri, E; Salomoni, D

    2014-01-01

    Applications from the High Energy Physics scientific community are constantly growing and are implemented by a large number of developers. This implies a strong churn on the code and an associated risk of faults, which is unavoidable as long as the software undergoes active evolution. However, the necessities of production systems run counter to this. Stability and predictability are of paramount importance; in addition, a short turn-around time for the defect discovery-correction-deployment cycle is required. A way to reconcile these opposite foci is to use a software quality model to obtain an approximation of the risk before releasing a program, so as to only deliver software with a risk lower than an agreed threshold. In this article we evaluated two quality predictive models to identify the operational risk and the quality of some software products. We applied these models to the development history of several EMI packages with the intent to discover the risk factor of each product and compare it with its real history. We attempted to determine whether the models reasonably map reality for the applications under evaluation, and finally we conclude by suggesting directions for further studies.

  3. Gamma-Ray Pulsars Models and Predictions

    CERN Document Server

    Harding, A K

    2001-01-01

    Pulsed emission from gamma-ray pulsars originates inside the magnetosphere, from radiation by charged particles accelerated near the magnetic poles or in the outer gaps. In polar cap models, the high energy spectrum is cut off by magnetic pair production above an energy that is dependent on the local magnetic field strength. While most young pulsars with surface fields in the range B = 10^{12} - 10^{13} G are expected to have high energy cutoffs around several GeV, the gamma-ray spectra of old pulsars having lower surface fields may extend to 50 GeV. Although the gamma-ray emission of older pulsars is weaker, detecting pulsed emission at high energies from nearby sources would be an important confirmation of polar cap models. Outer gap models predict more gradual high-energy turnovers at around 10 GeV, but also predict an inverse Compton component extending to TeV energies. Detection of pulsed TeV emission, which would not survive attenuation at the polar caps, is thus an important test of outer gap models. N...

  4. Artificial Neural Network Model for Predicting Compressive

    Directory of Open Access Journals (Sweden)

    Salim T. Yousif

    2013-05-01

    Full Text Available Compressive strength of concrete is a commonly used criterion in evaluating concrete. Although testing of the compressive strength of concrete specimens is done routinely, it is performed on the 28th day after concrete placement. Therefore, strength estimation of concrete at an early age is highly desirable. This study presents the effort in applying neural network-based system identification techniques to predict the compressive strength of concrete based on concrete mix proportions, maximum aggregate size (MAS), and slump of fresh concrete. A back-propagation neural network model is successively developed, trained, and tested using actual data sets of concrete mix proportions gathered from the literature. Testing of the model with unused data within the range of the input parameters shows that the maximum absolute error of the model is about 20% and that 88% of the output results have absolute errors of less than 10%. The parametric study shows that the water/cement ratio (w/c) is the most significant factor affecting the output of the model. The results show that neural networks have strong potential as a feasible tool for predicting the compressive strength of concrete.
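    A minimal sketch of the approach (mix proportions and slump in, 28-day compressive strength out) using a small feed-forward network; the mix variables, the synthetic strength relation and the network size are assumptions, not the paper's trained model.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 400
# Placeholder mix variables: cement, water (kg/m3), aggregate (kg/m3), MAS (mm), slump (mm)
X = np.column_stack([rng.uniform(250, 500, n), rng.uniform(140, 220, n),
                     rng.uniform(1600, 2000, n), rng.uniform(10, 40, n),
                     rng.uniform(30, 180, n)])
# Synthetic 28-day strength (MPa), dominated by the cement-to-water ratio (i.e. inverse w/c)
y = 15.0 * X[:, 0] / X[:, 1] - 0.1 * X[:, 3] + rng.normal(0, 2, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
net = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=0))
net.fit(X_tr, y_tr)
print("R^2 on held-out mixes:", round(net.score(X_te, y_te), 2))
```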

  5. Clinical Predictive Modeling Development and Deployment through FHIR Web Services.

    Science.gov (United States)

    Khalilia, Mohammed; Choi, Myung; Henderson, Amelia; Iyengar, Sneha; Braunstein, Mark; Sun, Jimeng

    2015-01-01

    Clinical predictive modeling involves two challenging tasks: model development and model deployment. In this paper we demonstrate a software architecture for developing and deploying clinical predictive models using web services via the Health Level 7 (HL7) Fast Healthcare Interoperability Resources (FHIR) standard. The services enable model development using electronic health records (EHRs) stored in OMOP CDM databases and model deployment for scoring individual patients through FHIR resources. The MIMIC2 ICU dataset and a synthetic outpatient dataset were transformed into OMOP CDM databases for predictive model development. The resulting predictive models are deployed as FHIR resources, which receive requests of patient information, perform prediction against the deployed predictive model and respond with prediction scores. To assess the practicality of this approach we evaluated the response and prediction time of the FHIR modeling web services. We found the system to be reasonably fast with one second total response time per patient prediction.

  6. Intersexual trophic niche partitioning in an ant-eating spider (Araneae: Zodariidae).

    Directory of Open Access Journals (Sweden)

    Stano Pekár

    2011-01-01

    Full Text Available Divergence in trophic niche between the sexes may function to reduce competition between the sexes ("intersexual niche partitioning hypothesis"), or may result from differential selection among the sexes on maximizing reproductive output ("sexual selection hypothesis"). The latter may lead to higher energy demands in females driven by fecundity selection, while males invest in mate searching. We tested predictions of the two hypotheses underlying intersexual trophic niche partitioning in a natural population of spiders. Zodarion jozefienae spiders specialize on Messor barbarus ants that are polymorphic in body size and hence comprise potential trophic niches for the spider, making this system well suited to the study of intersexual trophic niche partitioning. Comparative analysis of trophic morphology (the chelicerae) and body size of males, females and juveniles demonstrated highly female-biased SSD (Sexual Size Dimorphism) in body size, body weight, and the size of the chelicerae, the latter arising from sex-specific growth patterns in trophic morphology. In the field, female spiders actively selected ant sub-castes that were larger than the average prey size, and larger than ants captured by juveniles and males. Female fecundity was highly positively correlated with female body mass, which reflects foraging success during the adult stage. Females in laboratory experiments preferred the large ant sub-castes and displayed higher capture efficiency. In contrast, males occupied a different trophic niche and showed reduced foraging effort and reduced prey capture and feeding efficiency compared with females and juveniles. Our data indicate that female-biased dimorphism in trophic morphology and body size correlates with sex-specific reproductive strategies. We propose that intersexual trophic niche partitioning is shaped primarily by fecundity selection in females, and results from sex differences in the route to successful reproduction where females are

  7. An analytical model for climatic predictions

    International Nuclear Information System (INIS)

    Njau, E.C.

    1990-12-01

    A climatic model based upon analytical expressions is presented. This model is capable of making long-range predictions of heat energy variations on regional or global scales. These variations can then be transformed into corresponding variations of some other key climatic parameters since weather and climatic changes are basically driven by differential heating and cooling around the earth. On the basis of the mathematical expressions upon which the model is based, it is shown that the global heat energy structure (and hence the associated climatic system) is characterized by zonally as well as latitudinally propagating fluctuations at frequencies downward of 0.5 day⁻¹. We have calculated the propagation speeds for those particular frequencies that are well documented in the literature. The calculated speeds are in excellent agreement with the measured speeds. (author). 13 refs

  8. An Anisotropic Hardening Model for Springback Prediction

    Science.gov (United States)

    Zeng, Danielle; Xia, Z. Cedric

    2005-08-01

    As more Advanced High-Strength Steels (AHSS) are heavily used for automotive body structures and closures panels, accurate springback prediction for these components becomes more challenging because of their rapid hardening characteristics and ability to sustain even higher stresses. In this paper, a modified Mroz hardening model is proposed to capture realistic Bauschinger effect at reverse loading, such as when material passes through die radii or drawbead during sheet metal forming process. This model accounts for material anisotropic yield surface and nonlinear isotropic/kinematic hardening behavior. Material tension/compression test data are used to accurately represent Bauschinger effect. The effectiveness of the model is demonstrated by comparison of numerical and experimental springback results for a DP600 straight U-channel test.

  9. An Anisotropic Hardening Model for Springback Prediction

    International Nuclear Information System (INIS)

    Zeng, Danielle; Xia, Z. Cedric

    2005-01-01

    As more Advanced High-Strength Steels (AHSS) are heavily used for automotive body structures and closures panels, accurate springback prediction for these components becomes more challenging because of their rapid hardening characteristics and ability to sustain even higher stresses. In this paper, a modified Mroz hardening model is proposed to capture realistic Bauschinger effect at reverse loading, such as when material passes through die radii or drawbead during sheet metal forming process. This model accounts for material anisotropic yield surface and nonlinear isotropic/kinematic hardening behavior. Material tension/compression test data are used to accurately represent Bauschinger effect. The effectiveness of the model is demonstrated by comparison of numerical and experimental springback results for a DP600 straight U-channel test

  10. Web tools for predictive toxicology model building.

    Science.gov (United States)

    Jeliazkova, Nina

    2012-07-01

    The development and use of web tools in chemistry has accumulated more than 15 years of history already. Powered by the advances in the Internet technologies, the current generation of web systems are starting to expand into areas, traditional for desktop applications. The web platforms integrate data storage, cheminformatics and data analysis tools. The ease of use and the collaborative potential of the web is compelling, despite the challenges. The topic of this review is a set of recently published web tools that facilitate predictive toxicology model building. The focus is on software platforms, offering web access to chemical structure-based methods, although some of the frameworks could also provide bioinformatics or hybrid data analysis functionalities. A number of historical and current developments are cited. In order to provide comparable assessment, the following characteristics are considered: support for workflows, descriptor calculations, visualization, modeling algorithms, data management and data sharing capabilities, availability of GUI or programmatic access and implementation details. The success of the Web is largely due to its highly decentralized, yet sufficiently interoperable model for information access. The expected future convergence between cheminformatics and bioinformatics databases provides new challenges toward management and analysis of large data sets. The web tools in predictive toxicology will likely continue to evolve toward the right mix of flexibility, performance, scalability, interoperability, sets of unique features offered, friendly user interfaces, programmatic access for advanced users, platform independence, results reproducibility, curation and crowdsourcing utilities, collaborative sharing and secure access.

  11. Predictions of models for environmental radiological assessment

    International Nuclear Information System (INIS)

    Peres, Sueli da Silva; Lauria, Dejanira da Costa; Mahler, Claudio Fernando

    2011-01-01

    In the field of environmental impact assessment, models are used for estimating the source term, environmental dispersion and transfer of radionuclides, exposure pathways, radiation dose and the risk for human beings. Although it is recognized that specific local data are important to improve the quality of the dose assessment results, in practice obtaining them can be very difficult and expensive. Sources of uncertainty are numerous, among which we can cite: the subjectivity of modelers, exposure scenarios and pathways, the codes used and general parameters. The various models available utilize different mathematical approaches with different complexities that can result in different predictions. Thus, for the same inputs different models can produce very different outputs. This paper presents briefly the main advances in the field of environmental radiological assessment that aim to improve the reliability of the models used in the assessment of environmental radiological impact. The model intercomparison exercise supplied incompatible results for 137Cs and 60Co, highlighting the need to develop reference methodologies for environmental radiological assessment that allow dose estimates to be compared on a common basis. The results of the intercomparison exercise are presented briefly. (author)

  12. A Predictive Maintenance Model for Railway Tracks

    DEFF Research Database (Denmark)

    Li, Rui; Wen, Min; Salling, Kim Bang

    2015-01-01

    This paper presents a mathematical model based on Mixed Integer Programming (MIP) which is designed to optimize the predictive railway tamping activities for ballasted track for a time horizon of up to four years. The objective function is set up to minimize the actual costs for the tamping machine (measured by time ...). Five technical and economic aspects are taken into account to schedule tamping: (1) track degradation of the standard deviation of the longitudinal level over time; (2) track geometrical alignment; (3) track quality thresholds based on the train speed limits; (4) the dependency of the track quality

  13. Predictive Capability Maturity Model for computational modeling and simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy; Pilch, Martin M.

    2007-10-01

    The Predictive Capability Maturity Model (PCMM) is a new model that can be used to assess the level of maturity of computational modeling and simulation (M&S) efforts. The development of the model is based on both the authors experience and their analysis of similar investigations in the past. The perspective taken in this report is one of judging the usefulness of a predictive capability that relies on the numerical solution to partial differential equations to better inform and improve decision making. The review of past investigations, such as the Software Engineering Institute's Capability Maturity Model Integration and the National Aeronautics and Space Administration and Department of Defense Technology Readiness Levels, indicates that a more restricted, more interpretable method is needed to assess the maturity of an M&S effort. The PCMM addresses six contributing elements to M&S: (1) representation and geometric fidelity, (2) physics and material model fidelity, (3) code verification, (4) solution verification, (5) model validation, and (6) uncertainty quantification and sensitivity analysis. For each of these elements, attributes are identified that characterize four increasing levels of maturity. Importantly, the PCMM is a structured method for assessing the maturity of an M&S effort that is directed toward an engineering application of interest. The PCMM does not assess whether the M&S effort, the accuracy of the predictions, or the performance of the engineering system satisfies or does not satisfy specified application requirements.

  14. Effective modelling for predictive analytics in data science ...

    African Journals Online (AJOL)

    Effective modelling for predictive analytics in data science. ... the near-absence of empirical or factual predictive analytics in the mainstream research going on ... Keywords: Predictive Analytics, Big Data, Business Intelligence, Project Planning.

  15. Modeling the effects of dispersal on predicted contemporary and future fisher (Martes pennanti) distribution in the U.S

    Science.gov (United States)

    Lucretia Olson; M. Schwartz

    2013-01-01

    Many species at high trophic levels are predicted to be impacted by shifts in habitat associated with climate change. While temperate coniferous forests are predicted to be one of the least affected ecosystems, the impact of shifting habitat on terrestrial carnivores that live within these ecosystems may depend on the dispersal rates of the species and the patchiness...

  16. Combining GPS measurements and IRI model predictions

    International Nuclear Information System (INIS)

    Hernandez-Pajares, M.; Juan, J.M.; Sanz, J.; Bilitza, D.

    2002-01-01

    The free electrons distributed in the ionosphere (between one hundred and thousands of km in height) produce a frequency-dependent effect on Global Positioning System (GPS) signals: a delay in the pseudo-range and an advance in the carrier phase. These effects are proportional to the columnar electron density between the satellite and receiver, i.e. the integrated electron density along the ray path. Global ionospheric TEC (total electron content) maps can be obtained with GPS data from a network of ground IGS (international GPS service) reference stations with an accuracy of a few TEC units. The comparison with the TOPEX TEC, mainly measured over the oceans far from the IGS stations, shows a mean bias and standard deviation of about 2 and 5 TECUs respectively. The discrepancies between the STEC predictions and the observed values show an RMS typically below 5 TECUs (which also includes the alignment code noise). The existence of a growing database of 2-hourly global TEC maps with a resolution of 5x2.5 degrees in longitude and latitude can be used to improve the IRI prediction capability of the TEC. When the IRI predictions and the GPS estimations are compared for a three-month period around the Solar Maximum, they are in good agreement for middle latitudes. An overestimation of the IRI TEC has been found at the extreme latitudes, the IRI predictions being typically two times higher than the GPS estimations. Finally, local fits of the IRI model can be done by tuning the SSN from STEC GPS observations
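    The frequency-dependent effect mentioned above follows, to first order, delay [m] = 40.3 x TEC / f^2 with TEC in electrons per square metre and f in Hz; the sketch converts a few arbitrarily chosen TEC values into metres of code delay on the GPS L1 and L2 frequencies.

```python
# First-order ionospheric group delay on a GNSS code measurement:
#   delay [m] = 40.3 * TEC / f**2, TEC in electrons/m^2, f in Hz (1 TECU = 1e16 el/m^2)
F_L1 = 1575.42e6    # GPS L1 frequency, Hz
F_L2 = 1227.60e6    # GPS L2 frequency, Hz

def iono_delay_m(tec_tecu, freq_hz):
    return 40.3 * (tec_tecu * 1e16) / freq_hz ** 2

for tec in (5, 20, 50):                        # arbitrary quiet-to-disturbed TEC values, in TECU
    print(tec, "TECU ->", round(iono_delay_m(tec, F_L1), 2), "m on L1,",
          round(iono_delay_m(tec, F_L2), 2), "m on L2")
```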

  17. Effect on Prediction when Modeling Covariates in Bayesian Nonparametric Models.

    Science.gov (United States)

    Cruz-Marcelo, Alejandro; Rosner, Gary L; Müller, Peter; Stewart, Clinton F

    2013-04-01

    In biomedical research, it is often of interest to characterize biologic processes giving rise to observations and to make predictions of future observations. Bayesian nonparametric methods provide a means for carrying out Bayesian inference making as few assumptions about restrictive parametric models as possible. There are several proposals in the literature for extending Bayesian nonparametric models to include dependence on covariates. Limited attention, however, has been directed to the following two aspects. In this article, we examine the effect on fitting and predictive performance of incorporating covariates in a class of Bayesian nonparametric models in one of two primary ways: either in the weights or in the locations of a discrete random probability measure. We show that different strategies for incorporating continuous covariates in Bayesian nonparametric models can result in big differences when used for prediction, even though they lead to otherwise similar posterior inferences. When one needs the predictive density, as in optimal design, and this density is a mixture, it is better to make the weights depend on the covariates. We demonstrate these points via a simulated data example and in an application in which one wants to determine the optimal dose of an anticancer drug used in pediatric oncology.

  18. Mathematical models for indoor radon prediction

    International Nuclear Information System (INIS)

    Malanca, A.; Pessina, V.; Dallara, G.

    1995-01-01

    It is known that the indoor radon (Rn) concentration can be predicted by means of mathematical models. The simplest model relies on two variables only: the Rn source strength and the air exchange rate. In the Lawrence Berkeley Laboratory (LBL) model several environmental parameters are combined into a complex equation; in addition, a correlation between the ventilation rate and the Rn entry rate from the soil is assumed. The measurements were carried out using activated carbon canisters. Seventy-five measurements of Rn concentrations were made inside two rooms placed on the second floor of a building block. One of the rooms had a single-glazed window whereas the other room had a double-pane window. During three different experimental protocols, the mean Rn concentration was always higher in the room with the double-glazed window. That behavior can be accounted for by the simplest model. A further set of 450 Rn measurements was collected inside a ground-floor room with a grounding well in it. This trend may be accounted for by the LBL model.
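
    A minimal sketch of the "simplest model" referred to above: a single-compartment mass balance, dC/dt = E/V − λv·C, whose steady state depends only on the entry rate and the air exchange rate (radioactive decay is neglected, being small compared with typical ventilation rates). The numbers are invented for illustration.

```python
# Single-compartment indoor radon mass balance (illustrative values only).
def radon_steady_state(entry_rate_bq_per_h, volume_m3, air_changes_per_h):
    """Steady-state indoor Rn concentration in Bq/m^3."""
    return entry_rate_bq_per_h / (volume_m3 * air_changes_per_h)

# Halving the air exchange rate (e.g. a tighter, double-glazed room) doubles
# the steady-state concentration, which is the behaviour the abstract reports.
print(radon_steady_state(2000.0, 50.0, 1.0))   # 40 Bq/m^3
print(radon_steady_state(2000.0, 50.0, 0.5))   # 80 Bq/m^3
```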

  19. Towards predictive models for transitionally rough surfaces

    Science.gov (United States)

    Abderrahaman-Elena, Nabil; Garcia-Mayoral, Ricardo

    2017-11-01

    We analyze and model the previously presented decomposition for flow variables in DNS of turbulence over transitionally rough surfaces. The flow is decomposed into two contributions: one produced by the overlying turbulence, which has no footprint of the surface texture, and one induced by the roughness, which is essentially the time-averaged flow around the surface obstacles, but modulated in amplitude by the first component. The roughness-induced component closely resembles the laminar steady flow around the roughness elements at the same non-dimensional roughness size. For small - yet transitionally rough - textures, the roughness-free component is essentially the same as over a smooth wall. Based on these findings, we propose predictive models for the onset of the transitionally rough regime. Project supported by the Engineering and Physical Sciences Research Council (EPSRC).

  20. Resource-estimation models and predicted discovery

    International Nuclear Information System (INIS)

    Hill, G.W.

    1982-01-01

    Resources have been estimated by predictive extrapolation from past discovery experience, by analogy with better explored regions, or by inference from evidence of depletion of targets for exploration. Changes in technology and new insights into geological mechanisms have occurred sufficiently often in the long run to form part of the pattern of mature discovery experience. The criterion, that a meaningful resource estimate needs an objective measure of its precision or degree of uncertainty, excludes 'estimates' based solely on expert opinion. This is illustrated by development of error measures for several persuasive models of discovery and production of oil and gas in USA, both annually and in terms of increasing exploration effort. Appropriate generalizations of the models resolve many points of controversy. This is illustrated using two USA data sets describing discovery of oil and of U3O8; the latter set highlights an inadequacy of available official data. Review of the oil-discovery data set provides a warrant for adjusting the time-series prediction to a higher resource figure for USA petroleum. (author)
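
    A hedged illustration of "predictive extrapolation from past discovery experience" with the objective measure of precision the abstract calls for: fit a generic logistic cumulative-discovery curve to synthetic data and report parameter standard errors from the fit covariance. This is a stand-in for the class of models discussed, not the specific models reviewed in the report.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic_cumulative(t, q_ultimate, k, t_mid):
    """Cumulative discoveries as a logistic function of time (or cumulative effort)."""
    return q_ultimate / (1.0 + np.exp(-k * (t - t_mid)))

# Synthetic "discovery history" in arbitrary units, made up for the example.
t = np.arange(1900, 1981, 5, dtype=float)
q_obs = logistic_cumulative(t, 200.0, 0.08, 1955.0) \
        + np.random.default_rng(0).normal(0.0, 3.0, t.size)

popt, pcov = curve_fit(logistic_cumulative, t, q_obs, p0=[150.0, 0.05, 1950.0])
perr = np.sqrt(np.diag(pcov))           # standard errors = the "measure of precision"
print("ultimate resource estimate: %.1f +/- %.1f" % (popt[0], perr[0]))
```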

  1. Trophic factors as modulators of motor neuron physiology and survival: implications for ALS therapy

    Directory of Open Access Journals (Sweden)

    Luis B Tovar-y-Romo

    2014-02-01

    Motor neuron physiology and development depend on a continuous and tightly regulated trophic support from a variety of cellular sources. Trophic factors guide the generation and positioning of motor neurons during every stage of the developmental process. As well, they are involved in axon guidance and synapse formation. Even in the adult spinal cord an uninterrupted trophic input is required to maintain neuronal functioning and protection from noxious stimuli. Among the trophic factors that have been demonstrated to participate in motor neuron physiology are vascular endothelial growth factor (VEGF), glial-derived neurotrophic factor (GDNF), ciliary neurotrophic factor (CNTF) and insulin-like growth factor 1 (IGF-1). Upon binding to membrane receptors expressed in motor neurons or neighboring glia, these trophic factors activate intracellular signaling pathways that promote cell survival and have protective action on motor neurons, in both in vivo and in vitro models of neuronal degeneration. For these reasons these factors have been considered a promising therapeutic approach for amyotrophic lateral sclerosis (ALS) and other neurodegenerative diseases, although their efficacy in human clinical trials has not yet shown the expected protection. In this review we summarize experimental data on the role of these trophic factors in motor neuron function and survival, as well as their mechanisms of action. We also briefly discuss the potential therapeutic use of trophic factors and why these therapies may not yet have been successful in clinical use.

  2. Prediction of pipeline corrosion rate based on grey Markov models

    International Nuclear Information System (INIS)

    Chen Yonghong; Zhang Dafa; Peng Guichu; Wang Yuemin

    2009-01-01

    Based on a model combining a grey model with a Markov model, the prediction of the corrosion rate of nuclear power pipeline was studied. Work was done to improve the grey model, and an optimized unbiased grey model was obtained. This new model was used to predict the tendency of the corrosion rate, and the Markov model was used to predict the residual errors. In order to improve the prediction precision, a rolling operation method was used in these prediction processes. The results indicate that the improvement to the grey model is effective and that the prediction precision of the new model, which combines the optimized unbiased grey model with the Markov model, is better; the use of the rolling operation method may improve the prediction precision further. (authors)
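
    A minimal GM(1,1) grey-model sketch of the kind that commonly underlies grey-Markov predictors: fit the grey differential equation to a short series and extrapolate the trend. The Markov correction of residuals and the rolling update described above are not included, and the corrosion-rate numbers are invented.

```python
import numpy as np

def gm11_forecast(x0, steps_ahead=1):
    """Fit a GM(1,1) model to series x0 and return (fitted values, forecasts)."""
    x0 = np.asarray(x0, dtype=float)
    n = x0.size
    x1 = np.cumsum(x0)                              # accumulated generating series
    z1 = 0.5 * (x1[1:] + x1[:-1])                   # background values
    B = np.column_stack([-z1, np.ones(n - 1)])
    Y = x0[1:]
    a, b = np.linalg.lstsq(B, Y, rcond=None)[0]     # development coefficient, grey input
    k = np.arange(n + steps_ahead)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    x0_hat = np.concatenate([[x0[0]], np.diff(x1_hat)])
    return x0_hat[:n], x0_hat[n:]

fitted, forecast = gm11_forecast([0.32, 0.35, 0.39, 0.41, 0.46], steps_ahead=2)
print(forecast)   # extrapolated corrosion-rate trend (illustrative units)
```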

  3. An Operational Model for the Prediction of Jet Blast

    Science.gov (United States)

    2012-01-09

    This paper presents an operational model for the prediction of jet blast. The model was developed based upon three modules including a jet exhaust model, jet centerline decay model and aircraft motion model. The final analysis was compared with d...

  4. Data driven propulsion system weight prediction model

    Science.gov (United States)

    Gerth, Richard J.

    1994-10-01

    The objective of the research was to develop a method to predict the weight of paper engines, i.e., engines that are in the early stages of development. The impetus for the project was the Single Stage To Orbit (SSTO) project, where engineers need to evaluate alternative engine designs. Since the SSTO is a performance-driven project, the performance models for alternative designs were well understood. The next tradeoff is weight. Since it is known that engine weight varies with thrust levels, a model is required that would allow discrimination between engines that produce the same thrust. Above all, the model had to be rooted in data with assumptions that could be justified based on the data. The general approach was to collect data on as many existing engines as possible and build a statistical model of engine weight as a function of various component performance parameters. This was considered a reasonable level to begin the project because the data would be readily available, and it would be at the level of most paper engines, prior to detailed component design.
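
    A sketch of the data-driven pattern described above, using a log-log linear regression so that weight scales as a power law in thrust and one component parameter (here a hypothetical chamber pressure). Both the data and the choice of regressors are invented; the point is the modelling approach, not the actual SSTO weight model.

```python
import numpy as np

# Hypothetical engine database: thrust (kN), chamber pressure (bar), weight (kg)
thrust = np.array([500., 900., 1200., 1800., 2300., 3100.])
p_chamber = np.array([70., 100., 110., 180., 200., 230.])
weight = np.array([1400., 2100., 2600., 3300., 3900., 4800.])

# Fit log(weight) = c0 + c1*log(thrust) + c2*log(p_chamber) by least squares.
X = np.column_stack([np.ones(thrust.size), np.log(thrust), np.log(p_chamber)])
coef, *_ = np.linalg.lstsq(X, np.log(weight), rcond=None)

def predict_weight(thrust_kn, p_bar):
    return float(np.exp(coef[0] + coef[1] * np.log(thrust_kn) + coef[2] * np.log(p_bar)))

# Two "paper engines" with the same thrust but different chamber pressures now
# get different predicted weights, which is the discrimination the model needs.
print(predict_weight(1500., 120.), predict_weight(1500., 200.))
```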

  5. Predictive modeling of emergency cesarean delivery.

    Directory of Open Access Journals (Sweden)

    Carlos Campillo-Artero

    To increase discriminatory accuracy (DA) for emergency cesarean sections (ECSs), we prospectively collected data on and studied all 6,157 births occurring in 2014 at four public hospitals located in three different autonomous communities of Spain. To identify risk factors (RFs) for ECS, we used likelihood ratios and logistic regression, fitted a classification tree (CTREE), and analyzed a random forest model (RFM). We used the areas under the receiver-operating-characteristic (ROC) curves (AUCs) to assess their DA. The magnitude of the LR+ for all putative individual RFs and of the ORs in the logistic regression models was low to moderate. Except for parity, all putative RFs were positively associated with ECS, including hospital fixed-effects and night-shift delivery. The DA of all logistic models ranged from 0.74 to 0.81. The most relevant RFs (pH, induction, and previous C-section) in the CTREEs showed the highest ORs in the logistic models. The DA of the RFM and its most relevant interaction terms was even higher (AUC = 0.94; 95% CI: 0.93-0.95). Putative fetal, maternal, and contextual RFs alone fail to achieve reasonable DA for ECS. It is the combination of these RFs and the interactions between them at each hospital that make it possible to improve the DA for the type of delivery and tailor interventions through prediction to improve the appropriateness of ECS indications.
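
    A schematic of the comparison described above: fit a logistic regression and a random forest to the same risk factors and compare discriminatory accuracy by ROC AUC. The data below are synthetic, and the variable names (pH, induction, previous C-section) only mirror the abstract; they carry no clinical content.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 2000
X = np.column_stack([
    rng.normal(7.25, 0.08, n),      # fetal pH (synthetic)
    rng.integers(0, 2, n),          # labour induction (0/1)
    rng.integers(0, 2, n),          # previous C-section (0/1)
])
# Outcome depends on the risk factors plus an interaction, so the forest has
# something the additive logistic model cannot fully capture.
logit = -3.0 - 25 * (X[:, 0] - 7.25) + 0.8 * X[:, 1] + 1.0 * X[:, 2] + 1.5 * X[:, 1] * X[:, 2]
y = rng.random(n) < 1 / (1 + np.exp(-logit))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
for model in (LogisticRegression(max_iter=1000),
              RandomForestClassifier(n_estimators=300, random_state=0)):
    model.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(type(model).__name__, round(auc, 3))
```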

  6. Model Predictive Control based on Finite Impulse Response Models

    DEFF Research Database (Denmark)

    Prasath, Guru; Jørgensen, John Bagterp

    2008-01-01

    We develop a regularized l2 finite impulse response (FIR) predictive controller with input and input-rate constraints. Feedback is based on a simple constant output disturbance filter. The performance of the predictive controller in the face of plant-model mismatch is investigated by simulations and related to the uncertainty of the impulse response coefficients. The simulations can be used to benchmark l2 MPC against FIR based robust MPC as well as to estimate the maximum performance improvements by robust MPC.
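
    A compact, unconstrained sketch of an l2 FIR predictive controller: predict the output over a horizon from impulse-response coefficients, then choose future inputs minimising tracking error plus a penalty on input moves, and apply only the first move (receding horizon). The constraint handling and the constant output-disturbance filter of the paper are omitted, and the impulse response below is invented.

```python
import numpy as np

def fir_mpc_step(h, u_past, r, lam=0.1):
    """One receding-horizon step. h: FIR coefficients, u_past: past inputs
    (most recent last, at least len(h)-1 values), r: reference over the horizon."""
    h = np.asarray(h, float)
    u_past = np.asarray(u_past, float)
    m, N = h.size, len(r)
    Theta = np.zeros((N, N))        # maps future inputs to predicted outputs
    f = np.zeros(N)                 # free response from past inputs
    for j in range(1, N + 1):
        for i in range(1, m + 1):
            if j - i >= 0:
                Theta[j - 1, j - i] = h[i - 1]
            else:
                f[j - 1] += h[i - 1] * u_past[j - i]    # negative index into the past
    D = np.eye(N) - np.eye(N, k=-1)                     # input-move (rate) operator
    d0 = np.zeros(N)
    d0[0] = u_past[-1]                                  # reference the previous input
    A = np.vstack([Theta, np.sqrt(lam) * D])
    b = np.concatenate([np.asarray(r, float) - f, np.sqrt(lam) * d0])
    u_future = np.linalg.lstsq(A, b, rcond=None)[0]     # regularized least squares
    return u_future[0]                                  # apply the first move only

h = [0.0, 0.4, 0.3, 0.2, 0.1]        # toy impulse response (sums to 1.0)
print(fir_mpc_step(h, u_past=[0, 0, 0, 0], r=[1.0] * 10))
```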

  7. Predicting what helminth parasites a fish species should have using Parasite Co-occurrence Modeler (PaCo)

    Science.gov (United States)

    Strona, Giovanni; Lafferty, Kevin D.

    2013-01-01

    Fish pathologists are often interested in which parasites would likely be present in a particular host. Parasite Co-occurrence Modeler (PaCo) is a tool for identifying a list of parasites known from fish species that are similar ecologically, phylogenetically, and geographically to the host of interest. PaCo uses data from FishBase (maximum length, growth rate, life span, age at maturity, trophic level, phylogeny, and biogeography) to estimate compatibility between a target host and parasite species–genera from the major helminth groups (Acanthocephala, Cestoda, Monogenea, Nematoda, and Trematoda). Users can include any combination of host attributes in a model. These unique features make PaCo an innovative tool for addressing both theoretical and applied questions in parasitology. In addition to predicting the occurrence of parasites, PaCo can be used to investigate how host characteristics shape parasite communities. To test the performance of the PaCo algorithm, we created 12,400 parasite lists by applying any possible combination of model parameters (248) to 50 fish hosts. We then measured the relative importance of each parameter by assessing their frequency in the best models for each host. Host phylogeny and host geography were identified as the most important factors, with both present in 88% of the best models. Habitat (64%) was identified in more than half of the best models. Among ecological parameters, trophic level (41%) was the most relevant while life span (34%), growth rate (32%), maximum length (28%), and age at maturity (20%) were less commonly linked to best models. PaCo is free to use at www.purl.oclc.org/fishpest.

  8. Methodology for Designing Models Predicting Success of Infertility Treatment

    OpenAIRE

    Alireza Zarinara; Mohammad Mahdi Akhondi; Hojjat Zeraati; Koorsh Kamali; Kazem Mohammad

    2016-01-01

    Background: Prediction models for infertility treatment success have been presented over the past 25 years. There are scientific principles for designing and applying prediction models that are also used to predict the success rate of infertility treatment. The purpose of this study is to provide basic principles for designing models to predict infertility treatment success. Materials and Methods: In this paper, the principles for developing predictive models are explained and...

  9. Finite Unification: Theory, Models and Predictions

    CERN Document Server

    Heinemeyer, S; Zoupanos, G

    2011-01-01

    All-loop Finite Unified Theories (FUTs) are very interesting N=1 supersymmetric Grand Unified Theories (GUTs) realising an old field theory dream, and moreover have a remarkable predictive power due to the required reduction of couplings. The reduction of the dimensionless couplings in N=1 GUTs is achieved by searching for renormalization group invariant (RGI) relations among them holding beyond the unification scale. Finiteness results from the fact that there exist RGI relations among dimensional couplings that guarantee the vanishing of all beta-functions in certain N=1 GUTs even to all orders. Furthermore developments in the soft supersymmetry breaking sector of N=1 GUTs and FUTs lead to exact RGI relations, i.e. reduction of couplings, in this dimensionful sector of the theory, too. Based on the above theoretical framework phenomenologically consistent FUTs have been constructed. Here we review FUT models based on the SU(5) and SU(3)^3 gauge groups and their predictions. Of particular interest is the Hig...

  10. Simulation of fruit-set and trophic competition and optimization of yield advantages in six Capsicum cultivars using functional-structural plant modelling.

    Science.gov (United States)

    Ma, Y T; Wubs, A M; Mathieu, A; Heuvelink, E; Zhu, J Y; Hu, B G; Cournède, P H; de Reffye, P

    2011-04-01

    Many indeterminate plants can have wide fluctuations in the pattern of fruit-set and harvest. Fruit-set in these types of plants depends largely on the balance between source (assimilate supply) and sink strength (assimilate demand) within the plant. This study aims to evaluate the ability of functional-structural plant models to simulate different fruit-set patterns among Capsicum cultivars through source-sink relationships. A greenhouse experiment with six Capsicum cultivars characterized by different fruit weights and fruit-set patterns was conducted. Fruit-set patterns and potential fruit sink strength were determined through measurement. Source and sink strength of other organs were determined via the GREENLAB model, with a description of plant organ weight and dimensions according to plant topological structure established from the measured data as inputs. Parameter optimization was determined using a generalized least squares method for the entire growth cycle. Fruit sink strength differed among cultivars. Vegetative sink strength was generally lower for large-fruited cultivars than for small-fruited ones. The larger the size of the fruit, the larger the variation in fruit-set and fruit yield. Large-fruited cultivars need a higher source-sink ratio for fruit-set, which means higher demand for assimilates. Temporal heterogeneity of fruit-set affected both number and yield of fruit. The simulation study showed that reducing heterogeneity of fruit-set could be achieved by different approaches: for example, increasing source strength; decreasing vegetative sink strength, source-sink ratio for fruit-set and flower appearance rate; and harvesting individual fruits earlier before full ripeness. Simulation results showed that, when we increased source strength or decreased vegetative sink strength, fruit-set and fruit weight increased. However, no significant differences were found between large-fruited and small-fruited groups of cultivars regarding the effects of source

  11. Revised predictive equations for salt intrusion modelling in estuaries

    NARCIS (Netherlands)

    Gisen, J.I.A.; Savenije, H.H.G.; Nijzink, R.C.

    2015-01-01

    For one-dimensional salt intrusion models to be predictive, we need predictive equations to link model parameters to observable hydraulic and geometric variables. The one-dimensional model of Savenije (1993b) made use of predictive equations for the Van der Burgh coefficient $K$ and the dispersion

  12. Neutrino nucleosynthesis in supernovae: Shell model predictions

    International Nuclear Information System (INIS)

    Haxton, W.C.

    1989-01-01

    Almost all of the 3·10^53 ergs liberated in a core collapse supernova is radiated as neutrinos by the cooling neutron star. I will argue that these neutrinos interact with nuclei in the ejected shells of the supernovae to produce new elements. It appears that this nucleosynthesis mechanism is responsible for the galactic abundances of ^7Li, ^11B, ^19F, ^138La, and ^180Ta, and contributes significantly to the abundances of about 15 other light nuclei. I discuss shell model predictions for the charged and neutral current allowed and first-forbidden responses of the parent nuclei, as well as the spallation processes that produce the new elements. 18 refs., 1 fig., 1 tab

  13. Hierarchical Model Predictive Control for Resource Distribution

    DEFF Research Database (Denmark)

    Bendtsen, Jan Dimon; Trangbæk, K; Stoustrup, Jakob

    2010-01-01

    This paper deals with hierarchical model predictive control (MPC) of distributed systems. A three-level hierarchical approach is proposed, consisting of a high level MPC controller, a second level of so-called aggregators, controlled by an online MPC-like algorithm, and a lower level of autonomous units. The approach is inspired by smart-grid electric power production and consumption systems, where the flexibility of a large number of power producing and/or power consuming units can be exploited in a smart-grid solution. The objective is to accommodate the load variation on the grid, arising on one hand from varying consumption, on the other hand from natural variations in power production e.g. from wind turbines. The approach presented is based on quadratic optimization and possesses the properties of low algorithmic complexity and of scalability. In particular, the proposed design methodology...

  14. Distributed model predictive control made easy

    CERN Document Server

    Negenborn, Rudy

    2014-01-01

    The rapid evolution of computer science, communication, and information technology has enabled the application of control techniques to systems beyond the possibilities of control theory just a decade ago. Critical infrastructures such as electricity, water, traffic and intermodal transport networks are now in the scope of control engineers. The sheer size of such large-scale systems requires the adoption of advanced distributed control approaches. Distributed model predictive control (MPC) is one of the promising control methodologies for control of such systems.   This book provides a state-of-the-art overview of distributed MPC approaches, while at the same time making clear directions of research that deserve more attention. The core and rationale of 35 approaches are carefully explained. Moreover, detailed step-by-step algorithmic descriptions of each approach are provided. These features make the book a comprehensive guide both for those seeking an introduction to distributed MPC as well as for those ...

  15. Tempo of trophic evolution and its impact on mammalian diversification.

    Science.gov (United States)

    Price, Samantha A; Hopkins, Samantha S B; Smith, Kathleen K; Roth, V Louise

    2012-05-01

    Mammals are characterized by the complex adaptations of their dentition, which are an indication that diet has played a critical role in their evolutionary history. Although much attention has focused on diet and the adaptations of specific taxa, the role of diet in large-scale diversification patterns remains unresolved. Contradictory hypotheses have been proposed, making prediction of the expected relationship difficult. We show that net diversification rate (the cumulative effect of speciation and extinction), differs significantly among living mammals, depending upon trophic strategy. Herbivores diversify fastest, carnivores are intermediate, and omnivores are slowest. The tempo of transitions between the trophic strategies is also highly biased: the fastest rates occur into omnivory from herbivory and carnivory and the lowest transition rates are between herbivory and carnivory. Extant herbivore and carnivore diversity arose primarily through diversification within lineages, whereas omnivore diversity evolved by transitions into the strategy. The ability to specialize and subdivide the trophic niche allowed herbivores and carnivores to evolve greater diversity than omnivores.

  16. Model predictive control of a wind turbine modelled in Simpack

    International Nuclear Information System (INIS)

    Jassmann, U; Matzke, D; Reiter, M; Abel, D; Berroth, J; Schelenz, R; Jacobs, G

    2014-01-01

    Wind turbines (WT) are steadily growing in size to increase their power production, which also causes increasing loads acting on the turbine's components. At the same time large structures, such as the blades and the tower, get more flexible. To minimize this impact, the classical control loops for keeping the power production in an optimum state are more and more extended by load alleviation strategies. These additional control loops can be unified by a multiple-input multiple-output (MIMO) controller to achieve better balancing of tuning parameters. An example of MIMO control, which has recently been paid more attention by the wind industry, is Model Predictive Control (MPC). In an MPC framework a simplified model of the WT is used to predict its controlled outputs. Based on a user-defined cost function an online optimization calculates the optimal control sequence. Thereby MPC can intrinsically incorporate constraints, e.g. of actuators. Turbine models used for calculation within the MPC are typically simplified. For testing and verification usually multi body simulations, such as FAST, BLADED or FLEX5, are used to model system dynamics, but they are still limited in the number of degrees of freedom (DOF). Detailed information about load distribution (e.g. inside the gearbox) cannot be provided by such models. In this paper a Model Predictive Controller is presented and tested in a co-simulation with SIMPACK, a multi body system (MBS) simulation framework used for detailed load analysis. The analyses are performed on the basis of the IME6.0 MBS WT model, described in this paper. It is based on the rotor of the NREL 5MW WT and consists of a detailed representation of the drive train. This takes into account a flexible main shaft and its main bearings with a planetary gearbox, where all components are modelled as flexible, as well as a supporting flexible main frame. The wind loads are simulated using the NREL AERODYN v13 code which has been implemented as a routine

  17. Model predictive control of a wind turbine modelled in Simpack

    Science.gov (United States)

    Jassmann, U.; Berroth, J.; Matzke, D.; Schelenz, R.; Reiter, M.; Jacobs, G.; Abel, D.

    2014-06-01

    Wind turbines (WT) are steadily growing in size to increase their power production, which also causes increasing loads acting on the turbine's components. At the same time large structures, such as the blades and the tower, get more flexible. To minimize this impact, the classical control loops for keeping the power production in an optimum state are more and more extended by load alleviation strategies. These additional control loops can be unified by a multiple-input multiple-output (MIMO) controller to achieve better balancing of tuning parameters. An example of MIMO control, which has recently been paid more attention by the wind industry, is Model Predictive Control (MPC). In an MPC framework a simplified model of the WT is used to predict its controlled outputs. Based on a user-defined cost function an online optimization calculates the optimal control sequence. Thereby MPC can intrinsically incorporate constraints, e.g. of actuators. Turbine models used for calculation within the MPC are typically simplified. For testing and verification usually multi body simulations, such as FAST, BLADED or FLEX5, are used to model system dynamics, but they are still limited in the number of degrees of freedom (DOF). Detailed information about load distribution (e.g. inside the gearbox) cannot be provided by such models. In this paper a Model Predictive Controller is presented and tested in a co-simulation with SIMPACK, a multi body system (MBS) simulation framework used for detailed load analysis. The analyses are performed on the basis of the IME6.0 MBS WT model, described in this paper. It is based on the rotor of the NREL 5MW WT and consists of a detailed representation of the drive train. This takes into account a flexible main shaft and its main bearings with a planetary gearbox, where all components are modelled as flexible, as well as a supporting flexible main frame. The wind loads are simulated using the NREL AERODYN v13 code which has been implemented as a routine to
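
    A generic receding-horizon MPC sketch of the kind described in these two records: a simplified linear model predicts the outputs over a horizon, a quadratic cost is minimised online, and actuator limits enter directly as constraints. The two-state model and all numbers are placeholders, not the turbine model used with SIMPACK; the sketch assumes the cvxpy package is available.

```python
import numpy as np
import cvxpy as cp

A = np.array([[1.0, 0.1], [0.0, 0.95]])   # toy discrete-time model (placeholder)
B = np.array([[0.0], [0.1]])
Q = np.diag([10.0, 1.0])                  # state weighting
R = np.array([[0.5]])                     # input weighting
N, u_max = 20, 2.0                        # horizon length, actuator limit

def mpc_control(x0):
    x = cp.Variable((2, N + 1))
    u = cp.Variable((1, N))
    cost, constraints = 0, [x[:, 0] == x0]
    for k in range(N):
        cost += cp.quad_form(x[:, k], Q) + cp.quad_form(u[:, k], R)
        constraints += [x[:, k + 1] == A @ x[:, k] + B @ u[:, k],
                        cp.abs(u[:, k]) <= u_max]        # constraint handled intrinsically
    cp.Problem(cp.Minimize(cost), constraints).solve()
    return u.value[:, 0]                                  # apply the first move only

print(mpc_control(np.array([1.0, 0.0])))
```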

  18. Poisson Mixture Regression Models for Heart Disease Prediction.

    Science.gov (United States)

    Mufudza, Chipo; Erol, Hamza

    2016-01-01

    Early heart disease control can be achieved by high disease prediction and diagnosis efficiency. This paper focuses on the use of model-based clustering techniques to predict and diagnose heart disease via Poisson mixture regression models. Analysis and application of Poisson mixture regression models is here addressed under two different classes: standard and concomitant variable mixture regression models. Results show that a two-component concomitant variable Poisson mixture regression model predicts heart disease better than both the standard Poisson mixture regression model and the ordinary general linear Poisson regression model due to its low Bayesian Information Criterion value. Furthermore, a zero-inflated Poisson mixture regression model turned out to be the best model overall for heart disease prediction, as it both clusters individuals into high- or low-risk categories and predicts the rate of heart disease componentwise given the available clusters. It is deduced that heart disease prediction can be effectively done by identifying the major risks componentwise using Poisson mixture regression models.
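
    A minimal two-component Poisson mixture fitted by EM, with BIC for model comparison, to illustrate the model-based clustering idea discussed above. This is the plain mixture only (no regression covariates, concomitant variables, or zero inflation), and the count data are synthetic.

```python
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(1)
y = np.concatenate([rng.poisson(2.0, 300), rng.poisson(9.0, 150)])  # low/high-risk counts

def fit_poisson_mixture(y, n_iter=200):
    pi, lam = 0.5, np.array([y.mean() * 0.5, y.mean() * 1.5])        # crude initialisation
    for _ in range(n_iter):
        # E-step: responsibilities of the "high" component
        p0 = (1 - pi) * poisson.pmf(y, lam[0])
        p1 = pi * poisson.pmf(y, lam[1])
        r1 = p1 / (p0 + p1)
        # M-step: update mixing weight and component rates
        pi = r1.mean()
        lam = np.array([np.average(y, weights=1 - r1), np.average(y, weights=r1)])
    loglik = np.sum(np.log((1 - pi) * poisson.pmf(y, lam[0]) + pi * poisson.pmf(y, lam[1])))
    bic = -2 * loglik + 3 * np.log(y.size)          # 3 free parameters: pi, lam0, lam1
    return pi, lam, bic

pi, lam, bic = fit_poisson_mixture(y)
print(f"weights ~ {1 - pi:.2f}/{pi:.2f}, rates ~ {lam.round(2)}, BIC = {bic:.1f}")
```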

  19. Not all jellyfish are equal: isotopic evidence for inter- and intraspecific variation in jellyfish trophic ecology

    Directory of Open Access Journals (Sweden)

    Nicholas E.C. Fleming

    2015-07-01

    Jellyfish are highly topical within studies of pelagic food-webs and there is a growing realisation that their role is more complex than once thought. Efforts being made to include jellyfish within fisheries and ecosystem models are an important step forward, but our present understanding of their underlying trophic ecology can lead to their oversimplification in these models. Gelatinous zooplankton represent a polyphyletic assemblage spanning >2,000 species that inhabit coastal seas to the deep-ocean and employ a wide variety of foraging strategies. Despite this diversity, many contemporary modelling approaches include jellyfish as a single functional group feeding at one or two trophic levels at most. Recent reviews have drawn attention to this issue and highlighted the need for improved communication between biologists and theoreticians if this problem is to be overcome. We used stable isotopes to investigate the trophic ecology of three co-occurring scyphozoan jellyfish species (Aurelia aurita, Cyanea lamarckii and C. capillata) within a temperate, coastal food-web in the NE Atlantic. Using information on individual size, time of year and δ13C and δ15N stable isotope values, we examined: (1) whether all jellyfish could be considered as a single functional group, or showed distinct inter-specific differences in trophic ecology; (2) were size-based shifts in trophic position, found previously in A. aurita, a common trait across species?; (3) when considered collectively, did the trophic position of three sympatric species remain constant over time? Differences in δ15N (trophic position) were evident between all three species, with size-based and temporal shifts in δ15N apparent in A. aurita and C. capillata. The isotopic niche width for all species combined increased throughout the season, reflecting temporal shifts in trophic position and seasonal succession in these gelatinous species. Taken together, these findings support previous

  20. Not all jellyfish are equal: isotopic evidence for inter- and intraspecific variation in jellyfish trophic ecology.

    Science.gov (United States)

    Fleming, Nicholas E C; Harrod, Chris; Newton, Jason; Houghton, Jonathan D R

    2015-01-01

    Jellyfish are highly topical within studies of pelagic food-webs and there is a growing realisation that their role is more complex than once thought. Efforts being made to include jellyfish within fisheries and ecosystem models are an important step forward, but our present understanding of their underlying trophic ecology can lead to their oversimplification in these models. Gelatinous zooplankton represent a polyphyletic assemblage spanning >2,000 species that inhabit coastal seas to the deep-ocean and employ a wide variety of foraging strategies. Despite this diversity, many contemporary modelling approaches include jellyfish as a single functional group feeding at one or two trophic levels at most. Recent reviews have drawn attention to this issue and highlighted the need for improved communication between biologists and theoreticians if this problem is to be overcome. We used stable isotopes to investigate the trophic ecology of three co-occurring scyphozoan jellyfish species (Aurelia aurita, Cyanea lamarckii and C. capillata) within a temperate, coastal food-web in the NE Atlantic. Using information on individual size, time of year and δ13C and δ15N stable isotope values, we examined: (1) whether all jellyfish could be considered as a single functional group, or showed distinct inter-specific differences in trophic ecology; (2) were size-based shifts in trophic position, found previously in A. aurita, a common trait across species?; (3) when considered collectively, did the trophic position of three sympatric species remain constant over time? Differences in δ15N (trophic position) were evident between all three species, with size-based and temporal shifts in δ15N apparent in A. aurita and C. capillata. The isotopic niche width for all species combined increased throughout the season, reflecting temporal shifts in trophic position and seasonal succession in these gelatinous species. Taken together, these findings support previous assertions
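
    Both jellyfish records rest on the standard conversion of δ15N into a trophic-position estimate, TP = λ + (δ15N_consumer − δ15N_baseline) / Δ15N, where λ is the trophic level of the baseline organism and Δ15N the per-step enrichment (≈3.4‰ is a widely used default). The example values below are invented, not taken from the study.

```python
def trophic_position(d15n_consumer, d15n_baseline, lambda_baseline=2.0, tdf=3.4):
    """Estimate trophic position from nitrogen stable-isotope values (permil)."""
    return lambda_baseline + (d15n_consumer - d15n_baseline) / tdf

# e.g. a jellyfish at 12.5 permil against a primary-consumer baseline of 7.0 permil
print(round(trophic_position(12.5, 7.0), 2))   # ~3.62
```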

  1. Spider foraging strategy affects trophic cascades under natural and drought conditions.

    Science.gov (United States)

    Liu, Shengjie; Chen, Jin; Gan, Wenjin; Schaefer, Douglas; Gan, Jianmin; Yang, Xiaodong

    2015-07-23

    Spiders can cause trophic cascades affecting litter decomposition rates. However, it remains unclear how spiders with different foraging strategies influence faunal communities, or present cascading effects on decomposition. Furthermore, increased dry periods predicted in future climates will likely have important consequences for trophic interactions in detritus-based food webs. We investigated independent and interactive effects of spider predation and drought on litter decomposition in a tropical forest floor. We manipulated densities of dominant spiders with actively hunting or sit-and-wait foraging strategies in microcosms which mimicked the tropical-forest floor. We found a positive trophic cascade on litter decomposition was triggered by actively hunting spiders under ambient rainfall, but sit-and-wait spiders did not cause this. The drought treatment reversed the effect of actively hunting spiders on litter decomposition. Under drought conditions, we observed negative trophic cascade effects on litter decomposition in all three spider treatments. Thus, reduced rainfall can alter predator-induced indirect effects on lower trophic levels and ecosystem processes, and is an example of how such changes may alter trophic cascades in detritus-based webs of tropical forests.

  2. Trophic ulcers in the carpal tunnel syndrome

    Directory of Open Access Journals (Sweden)

    Abelardo Q.-C. Araújo

    1993-09-01

    A patient with carpal tunnel syndrome (CTS) and trophic ulcers is described. Despite the healing of the ulcers after surgery for CTS, the severe sensory deficit and the electrophysiological tests have not shown any significant improvement. We think these findings argue against the hypothesis of the sensory deficit being responsible for the trophic ulcers. We favor a major role for the sympathetic disturbances as the main cause of those lesions.

  3. Dynamic modeling predicts continued bioaccumulation of polybrominated diphenyl ethers (PBDEs) in smallmouth bass (Micropterus dolomieu) post phase-out due to invasive prey and shifts in predation

    International Nuclear Information System (INIS)

    Wallace, Joshua S.; Blersch, David M.

    2015-01-01

    Unprecedented food chain links between benthic and pelagic organisms are often thought to disrupt traditional contaminant transport and uptake due to changes in predation and mobilization of otherwise sequestered pollutants. A bioaccumulation model for polybrominated diphenyl ethers (PBDEs) is developed to simulate increases in biotic congener loads based upon trophic transfer through diet and gill uptake for a Lake Erie food chain including two invasive species as a benthic-pelagic link. The model utilizes species-specific bioenergetic parameters in a four-level food chain including the green alga Scenedesmus quadricauda, zebra mussels (Dreissena polymorpha), round goby (Appollonia melanostoma), and the smallmouth bass (Micropterus dolomieu). The model was calibrated to current biotic concentrations and predicts an increase in contaminant load by almost 48% in the upper trophic level in two years. Validation to archival data resulted in <2% error from reported values following a two-year simulation. - Highlights: • A dynamic model assesses continued bioaccumulation of PBDEs in predators of invasive prey. • The model incorporates novel benthic-pelagic energy links due to invasive prey. • Increases in total PBDEs in smallmouth bass due to invasive energy pathways are simulated. • The model is validated to archival data obtained prior to invasion of zebra mussels and round goby. - A dynamic model is developed to simulate continued bioaccumulation of polybrominated diphenyl ethers (PBDEs) in smallmouth bass due to emerging benthic-pelagic energy pathways.
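
    A generic first-order bioaccumulation sketch of the kind such models build on: uptake from water across the gills plus uptake from contaminated prey, balanced against elimination, integrated forward in time. All parameters and concentrations are placeholders, not the calibrated Lake Erie values.

```python
import numpy as np

def simulate_body_burden(c_water, c_prey, years=2, dt=0.01,
                         k_gill=0.02, feeding_rate=0.03, alpha=0.8, k_elim=0.002):
    """Euler integration of dC/dt = k_gill*C_water + alpha*feeding*C_prey - k_elim*C."""
    steps = int(years * 365 / dt)
    c_fish = np.zeros(steps + 1)
    for i in range(steps):
        uptake = k_gill * c_water + alpha * feeding_rate * c_prey   # gill + dietary uptake
        c_fish[i + 1] = c_fish[i] + dt * (uptake - k_elim * c_fish[i])
    return c_fish

burden = simulate_body_burden(c_water=0.1, c_prey=50.0)   # illustrative units only
print(f"predator concentration after 2 years: {burden[-1]:.1f} (arbitrary units)")
```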

  4. Predictive integrated modelling for ITER scenarios

    International Nuclear Information System (INIS)

    Artaud, J.F.; Imbeaux, F.; Aniel, T.; Basiuk, V.; Eriksson, L.G.; Giruzzi, G.; Hoang, G.T.; Huysmans, G.; Joffrin, E.; Peysson, Y.; Schneider, M.; Thomas, P.

    2005-01-01

    The uncertainty on the prediction of ITER scenarios is evaluated. Two transport models, which have been extensively validated against the multi-machine database, are used for the computation of the transport coefficients. The first model is GLF23; the second, called Kiauto, is a model in which the profile of the dilution coefficient is a gyro-Bohm-like analytical function, renormalized in order to get profiles consistent with a given global energy confinement scaling. The package of codes CRONOS is used; it gives access to the dynamics of the discharge and allows the study of the interplay between heat transport, current diffusion and sources. The main motivation of this work is to study the influence of parameters such as plasma current, heat, density, impurity and toroidal momentum transport. We can draw the following conclusions: 1) the target Q = 10 can be obtained in the ITER hybrid scenario at Ip = 13 MA, using either the DS03 two-term scaling or the GLF23 model based on the same pedestal; 2) at Ip = 11.3 MA, Q = 10 can be reached only assuming a very peaked pressure profile and a low pedestal; 3) at fixed Greenwald fraction, Q increases with density peaking; 4) achieving a stationary q-profile with q > 1 requires a large non-inductive current fraction (80%) that could be provided by 20 to 40 MW of LHCD; and 5) owing to the high temperature the q-profile penetration is delayed and q = 1 is reached at about 600 s in the ITER hybrid scenario at Ip = 13 MA, in the absence of active q-profile control. (A.C.)

  5. Analysis of trophic interactions reveals highly plastic response to climate change in a tri-trophic High-Arctic ecosystem

    DEFF Research Database (Denmark)

    Mortensen, Lars O.; Schmidt, Niels Martin; Hoye, Toke T.

    2016-01-01

    As a response to current climate changes, individual species have changed various biological traits, illustrating an inherent phenotypic plasticity. However, as species are embedded in an ecological network characterised by multiple consumer-resource interactions, ecological mismatches are likely to arise when interacting species do not respond homogeneously. The approach of biological network analysis calls for the use of structural equation modelling (SEM), a multidimensional analytical setup that has proven particularly useful for analysing multiple interactions across trophic levels. Here we...

  6. Biomass changes and trophic amplification of plankton in a warmer ocean

    KAUST Repository

    Chust, Guillem

    2014-05-07

    Ocean warming can modify the ecophysiology and distribution of marine organisms, and relationships between species, with nonlinear interactions between ecosystem components potentially resulting in trophic amplification. Trophic amplification (or attenuation) describes the propagation of a hydroclimatic signal up the food web, causing magnification (or depression) of biomass values along one or more trophic pathways. We have employed 3-D coupled physical-biogeochemical models to explore ecosystem responses to climate change with a focus on trophic amplification. The response of phytoplankton and zooplankton to global climate-change projections, carried out with the IPSL Earth System Model by the end of the century, is analysed on a global and regional basis, including European seas (NE Atlantic, Barents Sea, Baltic Sea, Black Sea, Bay of Biscay, Adriatic Sea, Aegean Sea) and the Eastern Boundary Upwelling System (Benguela). Results indicate that globally and in the Atlantic Margin and North Sea, increased ocean stratification causes primary production and zooplankton biomass to decrease in response to a warming climate, whilst in the Barents, Baltic and Black Seas, primary production and zooplankton biomass increase. Projected warming characterized by an increase in sea surface temperature of 2.29 ± 0.05 °C leads to a reduction in zooplankton and phytoplankton biomasses of 11% and 6%, respectively. This suggests negative amplification of climate-driven modifications of trophic level biomass through bottom-up control, leading to a reduced capacity of oceans to regulate climate through the biological carbon pump. Simulations suggest negative amplification is the dominant response across 47% of the ocean surface and prevails in the tropical oceans, whilst positive trophic amplification prevails in the Arctic and Antarctic oceans. Trophic attenuation is projected in temperate seas. Uncertainties in ocean plankton projections, associated with the use of single global and

  7. Biomass changes and trophic amplification of plankton in a warmer ocean

    KAUST Repository

    Chust, Guillem; Allen, Julian Icarus; Bopp, Laurent; Schrum, Corinna; Holt, Jason T.; Tsiaras, Kostas P.; Zavatarelli, Marco; Chifflet, Marina; Cannaby, Heather; Dadou, Isabelle C.; Daewel, Ute; Wakelin, Sarah L.; Machú, Eric; Pushpadas, Dhanya; Butenschön, Momme; Artioli, Yuri; Petihakis, George; Smith, Chris J M; Garçon, Véronique C.; Goubanova, Katerina; Le Vu, Briac; Fach, Bettina A.; Salihoglu, Baris; Clementi, Emanuela; Irigoien, Xabier

    2014-01-01

    Ocean warming can modify the ecophysiology and distribution of marine organisms, and relationships between species, with nonlinear interactions between ecosystem components potentially resulting in trophic amplification. Trophic amplification (or attenuation) describes the propagation of a hydroclimatic signal up the food web, causing magnification (or depression) of biomass values along one or more trophic pathways. We have employed 3-D coupled physical-biogeochemical models to explore ecosystem responses to climate change with a focus on trophic amplification. The response of phytoplankton and zooplankton to global climate-change projections, carried out with the IPSL Earth System Model by the end of the century, is analysed on a global and regional basis, including European seas (NE Atlantic, Barents Sea, Baltic Sea, Black Sea, Bay of Biscay, Adriatic Sea, Aegean Sea) and the Eastern Boundary Upwelling System (Benguela). Results indicate that globally and in the Atlantic Margin and North Sea, increased ocean stratification causes primary production and zooplankton biomass to decrease in response to a warming climate, whilst in the Barents, Baltic and Black Seas, primary production and zooplankton biomass increase. Projected warming characterized by an increase in sea surface temperature of 2.29 ± 0.05 °C leads to a reduction in zooplankton and phytoplankton biomasses of 11% and 6%, respectively. This suggests negative amplification of climate-driven modifications of trophic level biomass through bottom-up control, leading to a reduced capacity of oceans to regulate climate through the biological carbon pump. Simulations suggest negative amplification is the dominant response across 47% of the ocean surface and prevails in the tropical oceans, whilst positive trophic amplification prevails in the Arctic and Antarctic oceans. Trophic attenuation is projected in temperate seas. Uncertainties in ocean plankton projections, associated with the use of single global and

  8. Biomass changes and trophic amplification of plankton in a warmer ocean.

    Science.gov (United States)

    Chust, Guillem; Allen, J Icarus; Bopp, Laurent; Schrum, Corinna; Holt, Jason; Tsiaras, Kostas; Zavatarelli, Marco; Chifflet, Marina; Cannaby, Heather; Dadou, Isabelle; Daewel, Ute; Wakelin, Sarah L; Machu, Eric; Pushpadas, Dhanya; Butenschon, Momme; Artioli, Yuri; Petihakis, George; Smith, Chris; Garçon, Veronique; Goubanova, Katerina; Le Vu, Briac; Fach, Bettina A; Salihoglu, Baris; Clementi, Emanuela; Irigoien, Xabier

    2014-07-01

    Ocean warming can modify the ecophysiology and distribution of marine organisms, and relationships between species, with nonlinear interactions between ecosystem components potentially resulting in trophic amplification. Trophic amplification (or attenuation) describes the propagation of a hydroclimatic signal up the food web, causing magnification (or depression) of biomass values along one or more trophic pathways. We have employed 3-D coupled physical-biogeochemical models to explore ecosystem responses to climate change with a focus on trophic amplification. The response of phytoplankton and zooplankton to global climate-change projections, carried out with the IPSL Earth System Model by the end of the century, is analysed on a global and regional basis, including European seas (NE Atlantic, Barents Sea, Baltic Sea, Black Sea, Bay of Biscay, Adriatic Sea, Aegean Sea) and the Eastern Boundary Upwelling System (Benguela). Results indicate that globally and in the Atlantic Margin and North Sea, increased ocean stratification causes primary production and zooplankton biomass to decrease in response to a warming climate, whilst in the Barents, Baltic and Black Seas, primary production and zooplankton biomass increase. Projected warming characterized by an increase in sea surface temperature of 2.29 ± 0.05 °C leads to a reduction in zooplankton and phytoplankton biomasses of 11% and 6%, respectively. This suggests negative amplification of climate-driven modifications of trophic level biomass through bottom-up control, leading to a reduced capacity of oceans to regulate climate through the biological carbon pump. Simulations suggest negative amplification is the dominant response across 47% of the ocean surface and prevails in the tropical oceans, whilst positive trophic amplification prevails in the Arctic and Antarctic oceans. Trophic attenuation is projected in temperate seas. Uncertainties in ocean plankton projections, associated with the use of single global and

  9. Effect of stock size, climate, predation, and trophic status on recruitment of alewives in Lake Ontario, 1978-2000

    Science.gov (United States)

    O'Gorman, Robert; Lantry, Brian F.; Schneider, Clifford P.

    2004-01-01

    The population of alewives Alosa pseudoharengus in Lake Ontario is of great concern to fishery managers because alewives are the principal prey of introduced salmonines and because alewives negatively influence many endemic fishes. We used spring bottom trawl catches of alewives to investigate the roles of stock size, climate, predation, and lake trophic status on recruitment of alewives to age 2 in Lake Ontario during 1978–2000. Climate was indexed from the temperature of water entering a south-shore municipal treatment plant, lake trophic status was indexed by the mean concentration of total phosphorus (TP) in surface water in spring, and predation was indexed by the product of the number of salmonines stocked and relative first-year survival of Chinook salmon Oncorhynchus tshawytscha. A Ricker-type parent–progeny model suggested that peak production of age-1 alewives could occur over a broad range of spawning stock sizes, and the fit of the model was improved most by the addition of terms for spring water temperature and winter duration. With the addition of the two climate terms, the Ricker model indicated that when water was relatively warm in spring and the winter was relatively short, peak potential production of young was nine times higher than when water temperature and winters were average, and 73 times higher than when water was cold in spring and winters were long. Relative survival from age 1 to recruitment at age 2 was best described by a multiple linear regression with terms for adult abundance, TP, and predation. Mean recruitment of age-2 fish in the 1978–1998 year-classes predicted by using the two models in sequence was only about 20% greater than the observed mean recruitment. Model estimates fit the measured data exceptionally well for all but the largest four year-classes, which suggests that the models will facilitate improvement in estimates of trophic transfer due to alewives.
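
    A sketch of a Ricker-type recruitment function with climate covariates, of the general form the abstract describes: R = a·S·exp(−b·S + c1·T_spring + c2·W_winter). The way the covariates enter (log-linearly in the exponent) and every parameter value are assumptions for illustration, not the fitted Lake Ontario model.

```python
import numpy as np

def ricker_recruits(spawners, t_spring, w_winter, a=1.2, b=2e-4, c1=0.3, c2=-0.02):
    """Recruits predicted from spawning stock size and two climate covariates."""
    return a * spawners * np.exp(-b * spawners + c1 * t_spring + c2 * w_winter)

# Same spawning stock, contrasting climate: a warm spring after a short winter
# versus a cold spring after a long winter.
print(ricker_recruits(5000, t_spring=2.0, w_winter=90))    # favourable year
print(ricker_recruits(5000, t_spring=-1.0, w_winter=150))  # unfavourable year
```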

  10. Trophic niche shifts driven by phytoplankton in sandy beach ecosystems

    Science.gov (United States)

    Bergamino, Leandro; Martínez, Ana; Han, Eunah; Lercari, Diego; Defeo, Omar

    2016-10-01

    Stable isotopes (δ13C and δ15N) together with chlorophyll a and densities of surf diatoms were used to analyze changes in trophic niches of species in two sandy beaches of Uruguay with contrasting morphodynamics (i.e. dissipative vs. reflective). Consumers and food sources were collected over four seasons, including sediment organic matter (SOM), suspended particulate organic matter (POM) and the surf zone diatom Asterionellopsis guyunusae. Circular statistics and a Bayesian isotope mixing model were used to quantify food web differences between beaches. Consumers changed their trophic niche between beaches in the same direction of the food web space towards higher reliance on surf diatoms in the dissipative beach. Mixing models indicated that A. guyunusae was the primary nutrition source for suspension feeders in the dissipative beach, explaining their change in dietary niche compared to the reflective beach where the proportional contribution of surf diatoms was low. The high C/N ratios in A. guyunusae indicated its high nutritional value and N content, and may help to explain the high assimilation by suspension feeders at the dissipative beach. Furthermore, density of A. guyunusae was higher in the dissipative than in the reflective beach, and cell density was positively correlated with chlorophyll a only in the dissipative beach. Therefore, surf diatoms are important drivers in the dynamics of sandy beach food webs, determining the trophic niche space and productivity. Our study provides valuable insights on shifting foraging behavior by beach fauna in response to changes in resource availability.

  11. Integrating geophysics and hydrology for reducing the uncertainty of groundwater model predictions and improved prediction performance

    DEFF Research Database (Denmark)

    Christensen, Nikolaj Kruse; Christensen, Steen; Ferre, Ty

    A major purpose of groundwater modeling is to help decision-makers in efforts to manage the natural environment. Increasingly, it is recognized that both the predictions of interest and their associated uncertainties should be quantified to support robust decision making. In particular, decision ... the integration of geophysical data in the construction of a groundwater model increases the prediction performance. We suggest that modelers should perform a hydrogeophysical “test-bench” analysis of the likely value of geophysics data for improving groundwater model prediction performance before actually ... and the resulting predictions can be compared with predictions from the ‘true’ model. By performing this analysis we expect to give the modeler insight into how the uncertainty of model-based prediction can be reduced.

  12. Trophic interactions, ecosystem structure and function in the southern Yellow Sea

    Science.gov (United States)

    Lin, Qun; Jin, Xianshi; Zhang, Bo

    2013-01-01

    The southern Yellow Sea is an important fishing ground, providing abundant fishery resources. However, overfishing and climate change have caused a decline in the resource and damaged the ecosystem. We developed an ecosystem model to analyze the trophic interactions and ecosystem structure and function to guide sustainable development of the ecosystem. A trophic mass-balance model of the southern Yellow Sea during 2000-2001 was constructed using Ecopath with Ecosim software. We defined 22 important functional groups and studied their diet composition. The trophic levels of fish, shrimp, crabs, and cephalopods were between 2.78 and 4.39, and the mean trophic level of the fisheries was 3.24. The trophic flows within the food web occurred primarily in the lower trophic levels. The mean trophic transfer efficiency was 8.1%, of which 7.1% was from primary producers and 9.3% was from detritus within the ecosystem. The transfer efficiencies from trophic level II to III, III to IV, IV to V, and V to >V were 5.0%, 5.7%, 18.5%, and 19.7%-20.4%, respectively. Of the total flow, phytoplankton contributed 61% and detritus contributed 39%. Fishing is treated as a top predator within the ecosystem, and has a negative impact on most commercial species. Moreover, the ecosystem had a high gross efficiency of the fishery and a high value of primary production required to sustain the fishery. Together, our data suggest there is high fishing pressure in the southern Yellow Sea. Based on analysis of Odum's ecological parameters, this ecosystem was at an immature stage. Our results provide some insights into the structure and development of this ecosystem.
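
    For context, the mass-balance solved by Ecopath for each functional group i is usually written as below (quoted in its standard textbook form, not taken verbatim from this study): production balanced against catches, predation losses, net migration and biomass accumulation, with B biomass, P/B the production rate, EE the ecotrophic efficiency, Q/B the consumption rate, Y the fishery catch, and DC_ji the fraction of prey i in the diet of predator j.

```latex
% Ecopath master (mass-balance) equation for functional group i:
B_i \left(\frac{P}{B}\right)_i EE_i
  \;=\; Y_i \;+\; \sum_j B_j \left(\frac{Q}{B}\right)_j DC_{ji} \;+\; E_i \;+\; BA_i
```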

  13. Neural Fuzzy Inference System-Based Weather Prediction Model and Its Precipitation Predicting Experiment

    Directory of Open Access Journals (Sweden)

    Jing Lu

    2014-11-01

    We propose a weather prediction model in this article based on a neural network and fuzzy inference system (NFIS-WPM), and then apply it to predict daily fuzzy precipitation given meteorological premises for testing. The model consists of two parts: the first part is the “fuzzy rule-based neural network”, which simulates sequential relations among fuzzy sets using an artificial neural network; and the second part is the “neural fuzzy inference system”, which is based on the first part, but can learn new fuzzy rules from the previous ones according to the algorithm we proposed. NFIS-WPM (High Pro) and NFIS-WPM (Ave) are improved versions of this model. It is well known that the need for accurate weather prediction is apparent when considering the benefits. However, the excessive pursuit of accuracy in weather prediction makes some of the “accurate” prediction results meaningless, and the numerical prediction model is often complex and time-consuming. By adapting this novel model to a precipitation prediction problem, we make the predicted outcomes of precipitation more accurate and the prediction methods simpler than those of the complex numerical forecasting model, which occupies large computation resources, is time-consuming and has a low predictive accuracy rate. Accordingly, we achieve more accurate predictive precipitation results than by using traditional artificial neural networks that have low predictive accuracy.

  14. Foundation Settlement Prediction Based on a Novel NGM Model

    Directory of Open Access Journals (Sweden)

    Peng-Yu Chen

    2014-01-01

    Prediction of foundation or subgrade settlement is very important during engineering construction. Given that there are many settlement-time sequences with a nonhomogeneous index trend, a novel grey forecasting model called the NGM(1,1,k,c) model is proposed in this paper. With an optimized whitenization differential equation, the proposed NGM(1,1,k,c) model has the property of white exponential law coincidence and can predict a pure nonhomogeneous index sequence precisely. We used two case studies to verify the predictive effect of the NGM(1,1,k,c) model for settlement prediction. The results show that this model can achieve excellent prediction accuracy; thus, the model is quite suitable for simulation and prediction of approximately nonhomogeneous index sequences and has excellent application value in settlement prediction.

  15. Nonconvex model predictive control for commercial refrigeration

    Science.gov (United States)

    Gybel Hovgaard, Tobias; Boyd, Stephen; Larsen, Lars F. S.; Bagterp Jørgensen, John

    2013-08-01

    We consider the control of a commercial multi-zone refrigeration system, consisting of several cooling units that share a common compressor and are used to cool multiple areas or rooms. In each time period we choose the cooling capacity for each unit and a common evaporation temperature. The goal is to minimise the total energy cost, using real-time electricity prices, while obeying temperature constraints on the zones. We propose a variation on model predictive control to achieve this goal. When the right variables are used, the dynamics of the system are linear, and the constraints are convex. The cost function, however, is nonconvex due to the temperature dependence of thermodynamic efficiency. To handle this nonconvexity we propose a sequential convex optimisation method, which typically converges in about five iterations or fewer. We employ a fast convex quadratic programming solver to carry out the iterations, which is more than fast enough to run in real time. We demonstrate our method on a realistic model, with a full-year simulation and 15-minute time periods, using historical electricity prices and weather data, as well as random variations in thermal load. These simulations show substantial cost savings, on the order of 30%, compared to a standard thermostat-based control system. Perhaps more important, we see that the method exhibits sophisticated response to real-time variations in electricity prices. This demand response is critical to help balance real-time uncertainties in generation capacity associated with large penetration of intermittent renewable energy sources in a future smart grid.

  16. Ecosystem regime shifts disrupt trophic structure.

    Science.gov (United States)

    Hempson, Tessa N; Graham, Nicholas A J; MacNeil, M Aaron; Hoey, Andrew S; Wilson, Shaun K

    2018-01-01

    Regime shifts between alternative stable ecosystem states are becoming commonplace due to the combined effects of local stressors and global climate change. Alternative states are characterized as substantially different in form and function from pre-disturbance states, disrupting the delivery of ecosystem services and functions. On coral reefs, regime shifts are typically characterized by a change in the benthic composition from coral to macroalgal dominance. Such fundamental shifts in the benthos are anticipated to impact associated fish communities that are reliant on the reef for food and shelter, yet there is limited understanding of how regime shifts propagate through the fish community over time, relative to initial or recovery conditions. This study addresses this knowledge gap using long-term data of coral reef regime shifts and recovery on Seychelles reefs following the 1998 mass bleaching event. It shows how trophic structure of the reef fish community becomes increasingly dissimilar between alternative reef ecosystem states (regime-shifted vs. recovering) with time since disturbance. Regime-shifted reefs developed a concave trophic structure, with increased biomass in base trophic levels as herbivorous species benefitted from increased algal resources. Mid trophic level species, including specialists such as corallivores, declined with loss of coral habitat, while biomass was retained in upper trophic levels by large-bodied, generalist invertivores. Recovering reefs also experienced an initial decline in mid trophic level biomass, but moved toward a bottom-heavy pyramid shape, with a wide range of feeding groups (e.g., planktivores, corallivores, omnivores) represented at mid trophic levels. Given the importance of coral reef fishes in maintaining the ecological function of coral reef ecosystems and their associated fisheries, understanding the effects of regime shifts on these communities is essential to inform decisions that enhance ecological

  17. Predictive Modelling of Heavy Metals in Urban Lakes

    OpenAIRE

    Lindström, Martin

    2000-01-01

    Heavy metals are well-known environmental pollutants. In this thesis predictive models for heavy metals in urban lakes are discussed and new models presented. The base of predictive modelling is empirical data from field investigations of many ecosystems covering a wide range of ecosystem characteristics. Predictive models focus on the variabilities among lakes and processes controlling the major metal fluxes. Sediment and water data for this study were collected from ten small lakes in the ...

  18. Seasonal predictability of Kiremt rainfall in coupled general circulation models

    Science.gov (United States)

    Gleixner, Stephanie; Keenlyside, Noel S.; Demissie, Teferi D.; Counillon, François; Wang, Yiguo; Viste, Ellen

    2017-11-01

    The Ethiopian economy and population is strongly dependent on rainfall. Operational seasonal predictions for the main rainy season (Kiremt, June-September) are based on statistical approaches with Pacific sea surface temperatures (SST) as the main predictor. Here we analyse dynamical predictions from 11 coupled general circulation models for the Kiremt seasons from 1985-2005 with the forecasts starting from the beginning of May. We find skillful predictions from three of the 11 models, but no model beats a simple linear prediction model based on the predicted Niño3.4 indices. The skill of the individual models for dynamically predicting Kiremt rainfall depends on the strength of the teleconnection between Kiremt rainfall and concurrent Pacific SST in the models. Models that do not simulate this teleconnection fail to capture the observed relationship between Kiremt rainfall and the large-scale Walker circulation.

  19. MODELLING OF DYNAMIC SPEED LIMITS USING THE MODEL PREDICTIVE CONTROL

    Directory of Open Access Journals (Sweden)

    Andrey Borisovich Nikolaev

    2017-09-01

    Full Text Available The article considers the issues of traffic management using the intelligent “Car-Road” system (IVHS), which consists of interacting intelligent vehicles (IV) and intelligent roadside controllers. Vehicles are organized in convoys with small distances between them. All vehicles are assumed to be fully automated (throttle control, braking, steering). Approaches are proposed for determining speed limits for cars on the motorway using model predictive control (MPC). The article proposes a dynamic speed limit approach to minimize the downtime of vehicles in traffic.

  20. MJO prediction skill of the subseasonal-to-seasonal (S2S) prediction models

    Science.gov (United States)

    Son, S. W.; Lim, Y.; Kim, D.

    2017-12-01

    The Madden-Julian Oscillation (MJO), the dominant mode of tropical intraseasonal variability, provides the primary source of tropical and extratropical predictability on subseasonal to seasonal timescales. To better understand its predictability, this study conducts quantitative evaluation of MJO prediction skill in the state-of-the-art operational models participating in the subseasonal-to-seasonal (S2S) prediction project. Based on a bivariate correlation coefficient of 0.5, the S2S models exhibit MJO prediction skill ranging from 12 to 36 days. These prediction skills are affected by both the MJO amplitude and phase errors, the latter becoming more important with forecast lead times. Consistent with previous studies, the MJO events with stronger initial amplitude are typically better predicted. However, essentially no sensitivity to the initial MJO phase is observed. Overall MJO prediction skill and its inter-model spread are further related to the model mean biases in moisture fields and longwave cloud-radiation feedbacks. In most models, a dry bias quickly builds up in the deep tropics, especially across the Maritime Continent, weakening the horizontal moisture gradient. This likely dampens the organization and propagation of the MJO. Most S2S models also underestimate the longwave cloud-radiation feedbacks in the tropics, which may affect the maintenance of the MJO convective envelope. In general, the models with a smaller bias in horizontal moisture gradient and longwave cloud-radiation feedbacks show a higher MJO prediction skill, suggesting that improving those processes would enhance MJO prediction skill.
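
    The 0.5 threshold refers to the bivariate correlation between observed and forecast RMM indices that is conventionally used to score MJO forecasts; a minimal sketch of that metric (the standard formula, not code from this study) is:

        import numpy as np

        def bivariate_correlation(o1, o2, f1, f2):
            """Bivariate correlation of observed (o1, o2) and forecast (f1, f2) RMM indices
            at a fixed lead time; MJO skill is often defined as the lead at which this drops to 0.5."""
            num = np.sum(o1 * f1 + o2 * f2)
            den = np.sqrt(np.sum(o1**2 + o2**2)) * np.sqrt(np.sum(f1**2 + f2**2))
            return num / den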

  1. Trophic transfer of microplastics in aquatic ecosystems: Identifying critical research needs.

    Science.gov (United States)

    Au, Sarah Y; Lee, Cindy M; Weinstein, John E; van den Hurk, Peter; Klaine, Stephen J

    2017-05-01

    To evaluate the process of trophic transfer of microplastics, it is important to consider various abiotic and biotic factors involved in their ingestion, egestion, bioaccumulation, and biomagnification. Toward this end, a review of the literature on microplastics has been conducted to identify factors influencing their uptake and absorption; their residence times in organisms and bioaccumulation; the physical effects of their aggregation in gastrointestinal tracts; and their potential to act as vectors for the transfer of other contaminants. Limited field evidence from higher trophic level organisms in a variety of habitats suggests that trophic transfer of microplastics may be a common phenomenon and occurs concurrently with direct ingestion. Critical research needs include standardizing methods of field characterization of microplastics, quantifying uptake and depuration rates in organisms at different trophic levels, quantifying the influence that microplastics have on the uptake and/or depuration of environmental contaminants among different trophic levels, and investigating the potential for biomagnification of microplastic-associated chemicals. More integrated approaches involving computational modeling are required to fully assess trophic transfer of microplastics. Integr Environ Assess Manag 2017;13:505-509. © 2017 SETAC.

  2. Persistence of trophic hotspots and relation to human impacts within an upwelling marine ecosystem.

    Science.gov (United States)

    Santora, Jarrod A; Sydeman, William J; Schroeder, Isaac D; Field, John C; Miller, Rebecca R; Wells, Brian K

    2017-03-01

    Human impacts (e.g., fishing, pollution, and shipping) on pelagic ecosystems are increasing, causing concerns about stresses on marine food webs. Maintaining predator-prey relationships through protection of pelagic hotspots is crucial for conservation and management of living marine resources. Biotic components of pelagic, plankton-based, ecosystems exhibit high variability in abundance in time and space (i.e., extreme patchiness), requiring investigation of persistence of abundance across trophic levels to resolve trophic hotspots. Using a 26-yr record of indicators for primary production, secondary (zooplankton and larval fish), and tertiary (seabirds) consumers, we show distributions of trophic hotspots in the southern California Current Ecosystem result from interactions between a strong upwelling center and a productive retention zone with enhanced nutrients, which concentrate prey and predators across multiple trophic levels. Trophic hotspots also overlap with human impacts, including fisheries extraction of coastal pelagic and groundfish species, as well as intense commercial shipping traffic. Spatial overlap of trophic hotspots with fisheries and shipping increases vulnerability of the ecosystem to localized depletion of forage fish, ship strikes on marine mammals, and pollution. This study represents a critical step toward resolving pelagic areas of high conservation interest for planktonic ecosystems and may serve as a model for other ocean regions where ecosystem-based management and marine spatial planning of pelagic ecosystems is warranted. © 2016 by the Ecological Society of America.

  3. Butterfly, Recurrence, and Predictability in Lorenz Models

    Science.gov (United States)

    Shen, B. W.

    2017-12-01

    Over the span of 50 years, the original three-dimensional Lorenz model (3DLM; Lorenz, 1963) and its high-dimensional versions (e.g., Shen 2014a and references therein) have been used for improving our understanding of the predictability of weather and climate with a focus on chaotic responses. Although the Lorenz studies focus on nonlinear processes and chaotic dynamics, people often apply a "linear" conceptual model to understand the nonlinear processes in the 3DLM. In this talk, we present examples to illustrate the common misunderstandings regarding the butterfly effect and discuss the importance of solutions' recurrence and boundedness in the 3DLM and high-dimensional LMs. The first example is discussed with the following folklore that has been widely used as an analogy of the butterfly effect: "For want of a nail, the shoe was lost. For want of a shoe, the horse was lost. For want of a horse, the rider was lost. For want of a rider, the battle was lost. For want of a battle, the kingdom was lost. And all for the want of a horseshoe nail." However, in 2008, Prof. Lorenz stated that he did not feel that this verse described true chaos but that it better illustrated the simpler phenomenon of instability; and that the verse implicitly suggests that subsequent small events will not reverse the outcome (Lorenz, 2008). Lorenz's comments suggest that the verse neither describes negative (nonlinear) feedback nor indicates recurrence, the latter of which is required for the appearance of a butterfly pattern. The second example is to illustrate that the divergence of two nearby trajectories should be bounded and recurrent, as shown in Figure 1. Furthermore, we will discuss how high-dimensional LMs were derived to illustrate (1) negative nonlinear feedback that stabilizes the system within the five- and seven-dimensional LMs (5D and 7D LMs; Shen 2014a; 2015a; 2016); (2) positive nonlinear feedback that destabilizes the system within the 6D and 8D LMs (Shen 2015b; 2017); and (3
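
    The bounded, recurrent divergence of nearby trajectories discussed here can be reproduced with the original three-dimensional Lorenz (1963) system; the short integration below uses the standard parameter values and an arbitrary 1e-8 perturbation as an illustrative sketch, not the higher-dimensional models cited.

        import numpy as np
        from scipy.integrate import solve_ivp

        def lorenz63(t, s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
            x, y, z = s
            return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

        t_eval = np.linspace(0.0, 30.0, 3000)
        a = solve_ivp(lorenz63, (0.0, 30.0), [1.0, 1.0, 1.0], t_eval=t_eval).y
        b = solve_ivp(lorenz63, (0.0, 30.0), [1.0, 1.0, 1.0 + 1e-8], t_eval=t_eval).y

        # The separation grows roughly exponentially at first but remains bounded on the attractor.
        sep = np.linalg.norm(a - b, axis=0)
        print(sep[[0, 500, 1500, -1]])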

  4. Auditing predictive models : a case study in crop growth

    NARCIS (Netherlands)

    Metselaar, K.

    1999-01-01

    Methods were developed to assess and quantify the predictive quality of simulation models, with the intent to contribute to evaluation of model studies by non-scientists. In a case study, two models of different complexity, LINTUL and SUCROS87, were used to predict yield of forage maize

  5. Models for predicting compressive strength and water absorption of ...

    African Journals Online (AJOL)

    This work presents a mathematical model for predicting the compressive strength and water absorption of laterite-quarry dust cement block using augmented Scheffe's simplex lattice design. The statistical models developed can predict the mix proportion that will yield the desired property. The models were tested for lack of ...

  6. Statistical and Machine Learning Models to Predict Programming Performance

    OpenAIRE

    Bergin, Susan

    2006-01-01

    This thesis details a longitudinal study on factors that influence introductory programming success and on the development of machine learning models to predict incoming student performance. Although numerous studies have developed models to predict programming success, the models struggled to achieve high accuracy in predicting the likely performance of incoming students. Our approach overcomes this by providing a machine learning technique, using a set of three significant...

  7. Probabilistic Modeling and Visualization for Bankruptcy Prediction

    DEFF Research Database (Denmark)

    Antunes, Francisco; Ribeiro, Bernardete; Pereira, Francisco Camara

    2017-01-01

    In accounting and finance domains, bankruptcy prediction is of great utility for all of the economic stakeholders. The challenge of accurately assessing business failure, especially under scenarios of financial crisis, is known to be complicated. Although there have been many successful studies on bankruptcy detection, probabilistic approaches have seldom been carried out. In this paper we assume a probabilistic point of view by applying Gaussian Processes (GP) in the context of bankruptcy prediction, comparing them against Support Vector Machines (SVM) and Logistic Regression (LR). Using real-world bankruptcy data, an in-depth analysis is conducted showing that, in addition to a probabilistic interpretation, the GP can effectively improve bankruptcy prediction performance with high accuracy when compared to the other approaches. We additionally generate a complete graphical...

  8. Accurate and dynamic predictive model for better prediction in medicine and healthcare.

    Science.gov (United States)

    Alanazi, H O; Abdullah, A H; Qureshi, K N; Ismail, A S

    2018-05-01

    Information and communication technologies (ICTs) have shifted all fields of life towards new integrated operations and methods. The health sector has also adopted new technologies to improve its systems and provide better services to patients. Predictive models in health care likewise draw on new technologies to predict different disease outcomes. However, existing predictive models still suffer from limitations in predictive performance. To improve predictive model performance, this paper proposes a predictive model that classifies disease predictions into different categories, using traumatic brain injury (TBI) datasets. TBI is one of the most serious diseases worldwide and needs more attention due to its severe impact on human life. The proposed model improves predictive performance for TBI. The TBI dataset was developed and its features approved by neurologists. The experimental results show that the proposed model achieves significant results for accuracy, sensitivity, and specificity.

  9. A new ensemble model for short term wind power prediction

    DEFF Research Database (Denmark)

    Madsen, Henrik; Albu, Razvan-Daniel; Felea, Ioan

    2012-01-01

    The objective of this study is to use a non-linear ensemble system to develop a new model for predicting wind speed on a short-term time scale. Short-term wind power prediction has become an extremely important field of research for the energy sector. Despite recent advancements in research on prediction models, it has been observed that different models have different capabilities and no single model is suitable under all situations. The idea behind EPS (ensemble prediction systems) is to take advantage of the unique features of each subsystem to capture the diverse patterns that exist in the dataset...

  10. Testing the predictive power of nuclear mass models

    International Nuclear Information System (INIS)

    Mendoza-Temis, J.; Morales, I.; Barea, J.; Frank, A.; Hirsch, J.G.; Vieyra, J.C. Lopez; Van Isacker, P.; Velazquez, V.

    2008-01-01

    A number of tests are introduced which probe the ability of nuclear mass models to extrapolate. Three models are analyzed in detail: the liquid drop model, the liquid drop model plus empirical shell corrections and the Duflo-Zuker mass formula. If predicted nuclei are close to the fitted ones, average errors in predicted and fitted masses are similar. However, the challenge of predicting nuclear masses in a region stabilized by shell effects (e.g., the lead region) is far more difficult. The Duflo-Zuker mass formula emerges as a powerful predictive tool
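
    For orientation, the liquid drop (semi-empirical) mass formula named above has the familiar textbook form shown below; the coefficient values are typical fits quoted for illustration, not the values calibrated in the cited study.

        B(A,Z) = a_V A - a_S A^{2/3} - a_C \frac{Z(Z-1)}{A^{1/3}} - a_A \frac{(A-2Z)^{2}}{A} + \delta(A,Z),
        \qquad \delta(A,Z) \approx \pm 11.2\,A^{-1/2}\ \mathrm{MeV}

    with a_V ≈ 15.8 MeV, a_S ≈ 18.3 MeV, a_C ≈ 0.71 MeV and a_A ≈ 23.2 MeV; the pairing term is positive for even-even nuclei, negative for odd-odd nuclei and zero for odd A. The shell-corrected liquid drop and Duflo-Zuker formulas add further terms on top of this baseline.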

  11. From Predictive Models to Instructional Policies

    Science.gov (United States)

    Rollinson, Joseph; Brunskill, Emma

    2015-01-01

    At their core, Intelligent Tutoring Systems consist of a student model and a policy. The student model captures the state of the student and the policy uses the student model to individualize instruction. Policies require different properties from the student model. For example, a mastery threshold policy requires the student model to have a way…

  12. Using social network analysis tools in ecology : Markov process transition models applied to the seasonal trophic network dynamics of the Chesapeake Bay

    NARCIS (Netherlands)

    Johnson, Jeffrey C.; Luczkovich, Joseph J.; Borgatti, Stephen P.; Snijders, Tom A. B.; Luczkovich, S.P.

    2009-01-01

    Ecosystem components interact in complex ways and change over time due to a variety of both internal and external influences (climate change, season cycles, human impacts). Such processes need to be modeled dynamically using appropriate statistical methods for assessing change in network structure.

  13. The influence of nutrients, biliary-pancreatic secretions, and systemic trophic hormones on intestinal adaptation in a Roux-en-Y bypass model

    DEFF Research Database (Denmark)

    Taqi, Esmaeel; Wallace, Laurie E; de Heuvel, Elaine

    2010-01-01

    The signals that govern the upregulation of nutrient absorption (adaptation) after intestinal resection are not well understood. A Gastric Roux-en-Y bypass (GRYB) model was used to isolate the relative contributions of direct mucosal stimulation by nutrients, biliary-pancreatic secretions......, and systemic enteric hormones on intestinal adaptation in short bowel syndrome....

  14. Size-based predictions of food web patterns

    DEFF Research Database (Denmark)

    Zhang, Lai; Hartvig, Martin; Knudsen, Kim

    2014-01-01

    We employ size-based theoretical arguments to derive simple analytic predictions of ecological patterns and properties of natural communities: size-spectrum exponent, maximum trophic level, and susceptibility to invasive species. The predictions are brought about by assuming that an infinite number of species are continuously distributed on a size-trait axis. It is, however, an open question whether such predictions are valid for a food web with a finite number of species embedded in a network structure. We address this question by comparing the size-based predictions to results from dynamic food web simulations with varying species richness. To this end, we develop a new size- and trait-based food web model that can be simplified into an analytically solvable size-based model. We confirm existing solutions for the size distribution and derive novel predictions for maximum trophic level and invasion...

  15. The Complexity of Developmental Predictions from Dual Process Models

    Science.gov (United States)

    Stanovich, Keith E.; West, Richard F.; Toplak, Maggie E.

    2011-01-01

    Drawing developmental predictions from dual-process theories is more complex than is commonly realized. Overly simplified predictions drawn from such models may lead to premature rejection of the dual process approach as one of many tools for understanding cognitive development. Misleading predictions can be avoided by paying attention to several…

  16. Sweat loss prediction using a multi-model approach.

    Science.gov (United States)

    Xu, Xiaojiang; Santee, William R

    2011-07-01

    A new multi-model approach (MMA) for sweat loss prediction is proposed to improve prediction accuracy. MMA was computed as the average of sweat loss predicted by two existing thermoregulation models: i.e., the rational model SCENARIO and the empirical model Heat Strain Decision Aid (HSDA). Three independent physiological datasets, a total of 44 trials, were used to compare predictions by MMA, SCENARIO, and HSDA. The observed sweat losses were collected under different combinations of uniform ensembles, environmental conditions (15-40°C, RH 25-75%), and exercise intensities (250-600 W). Root mean square deviation (RMSD), residual plots, and paired t tests were used to compare predictions with observations. Overall, MMA reduced RMSD by 30-39% in comparison with either SCENARIO or HSDA, and increased the prediction accuracy to 66% from 34% or 55%. Of the MMA predictions, 70% fell within the range of mean observed value ± SD, while only 43% of SCENARIO and 50% of HSDA predictions fell within the same range. Paired t tests showed that differences between observations and MMA predictions were not significant, but differences between observations and SCENARIO or HSDA predictions were significantly different for two datasets. Thus, MMA predicted sweat loss more accurately than either of the two single models for the three datasets used. Future work will be to evaluate MMA using additional physiological data to expand the scope of populations and conditions.
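
    Because MMA is defined simply as the mean of the two single-model outputs, evaluated against observations by RMSD, the combination and the comparison metric can be sketched as below; the function names and numbers are hypothetical, not the SCENARIO or HSDA code or data.

        import numpy as np

        def mma_predict(pred_scenario, pred_hsda):
            """Multi-model approach: the average of the two single-model sweat-loss predictions."""
            return 0.5 * (np.asarray(pred_scenario, float) + np.asarray(pred_hsda, float))

        def rmsd(pred, obs):
            pred, obs = np.asarray(pred, float), np.asarray(obs, float)
            return np.sqrt(np.mean((pred - obs) ** 2))

        # Illustrative sweat-loss values (g) for three trials; not data from the study.
        obs = [550.0, 720.0, 610.0]
        scenario, hsda = [500.0, 800.0, 560.0], [610.0, 690.0, 700.0]
        print(rmsd(scenario, obs), rmsd(hsda, obs), rmsd(mma_predict(scenario, hsda), obs))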

  17. Comparisons of Faulting-Based Pavement Performance Prediction Models

    Directory of Open Access Journals (Sweden)

    Weina Wang

    2017-01-01

    Full Text Available Faulting prediction is the core of concrete pavement maintenance and design. Highway agencies are always faced with the problem of low prediction accuracy, which causes costly maintenance. Although many researchers have developed performance prediction models, prediction accuracy has remained a challenge. This paper reviews performance prediction models and JPCP faulting models that have been used in past research. Three models, a multivariate nonlinear regression (MNLR) model, an artificial neural network (ANN) model, and a Markov Chain (MC) model, are then tested and compared using a set of actual pavement survey data taken on an interstate highway with varying design features, traffic, and climate data. It is found that the MNLR model needs further recalibration, while the ANN model needs more data for training the network. The MC model seems a good tool for pavement performance prediction when data are limited, but it is based on visual inspections and is not explicitly related to quantitative physical parameters. The paper then suggests that a further direction for developing performance prediction models is to incorporate the advantages and disadvantages of different models to obtain better accuracy.
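
    Of the three approaches compared, the Markov Chain model is the most compact to sketch: condition states evolve through a one-step transition matrix. The probabilities below are invented placeholders, not calibrated values from the paper.

        import numpy as np

        # Hypothetical 4-state faulting-severity transition matrix (rows sum to 1).
        P = np.array([[0.85, 0.15, 0.00, 0.00],
                      [0.00, 0.80, 0.20, 0.00],
                      [0.00, 0.00, 0.75, 0.25],
                      [0.00, 0.00, 0.00, 1.00]])

        state = np.array([1.0, 0.0, 0.0, 0.0])   # all pavement sections start in the best condition state
        for year in range(10):
            state = state @ P                    # propagate the condition distribution one year forward
        print(state.round(3))                    # expected share of sections in each severity state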

  18. Changes in the trophic structure of the northern Benguela before ...

    African Journals Online (AJOL)

    The dominant small pelagic fish, characteristic of upwelling systems, were replaced ... as did the weighted trophic level of the community (excluding plankton), after the ... may have altered the trophic control mechanism operating in the system, ...

  19. Spring diet and trophic partitioning in an alpine lizard community ...

    African Journals Online (AJOL)

    The influences of species interactions on habitat use, restrictions in trophic availability and evolutionary history as determinant factors are discussed. Keywords: trophic ecology, communities, pseudocommunity analysis, Lacerta perspicillata, Lacerta andreanszkyi, Podarcis vaucheri, Quedenfeldtia trachyblepharus, Morocco ...

  20. Modeling of Complex Life Cycle Prediction Based on Cell Division

    Directory of Open Access Journals (Sweden)

    Fucheng Zhang

    2017-01-01

    Full Text Available Effective fault diagnosis and reasonable life prediction are of great significance and practical engineering value for the safety, reliability, and maintenance cost of equipment and its working environment. Current equipment life prediction methods include prediction based on condition monitoring, combined forecasting models, and data-driven approaches, most of which need to be based on a large amount of data. To address this issue, we propose learning from the mechanism of cell division in organisms. By studying the complex multifactor correlations of the life cycle, we have established a life prediction model of moderate complexity that models life prediction on cell division. Experiments show that our model can effectively simulate the state of cell division. Using this model as a reference, we will apply it to the life prediction of complex equipment.

  1. Risk prediction model: Statistical and artificial neural network approach

    Science.gov (United States)

    Paiman, Nuur Azreen; Hariri, Azian; Masood, Ibrahim

    2017-04-01

    Prediction models are increasingly gaining popularity and have been used in numerous areas of study to complement and support clinical reasoning and decision making. The adoption of such models assists physicians' decision making and individuals' behaviour, and consequently improves individual outcomes and the cost-effectiveness of care. The objective of this paper is to review articles related to risk prediction models in order to understand the suitable approaches to, and the development and validation process of, risk prediction models. A qualitative review of the aims, methods and significant main outcomes of nineteen published articles that developed risk prediction models in numerous fields was conducted. This paper also reviews how researchers develop and validate risk prediction models based on statistical and artificial neural network approaches. From the review, some methodological recommendations for developing and validating prediction models are highlighted. According to the studies reviewed, the artificial neural network approach to developing prediction models was more accurate than the statistical approach. However, only limited published literature currently discusses which approach is more accurate for risk prediction model development.
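
    The comparison the review describes, a statistical model against an artificial neural network on the same risk data, can be sketched on synthetic data as below; the dataset, features and scores are placeholders, not results from any of the reviewed studies.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPClassifier

        X, y = make_classification(n_samples=1000, n_features=10, random_state=0)  # synthetic risk factors
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

        models = [("statistical (logistic regression)", LogisticRegression(max_iter=1000)),
                  ("artificial neural network", MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0))]
        for name, model in models:
            model.fit(X_tr, y_tr)
            auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
            print(f"{name}: AUC = {auc:.3f}")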

  2. Predictive modeling and reducing cyclic variability in autoignition engines

    Science.gov (United States)

    Hellstrom, Erik; Stefanopoulou, Anna; Jiang, Li; Larimore, Jacob

    2016-08-30

    Methods and systems are provided for controlling a vehicle engine to reduce cycle-to-cycle combustion variation. A predictive model is applied to predict cycle-to-cycle combustion behavior of an engine based on observed engine performance variables. Conditions are identified, based on the predicted cycle-to-cycle combustion behavior, that indicate high cycle-to-cycle combustion variation. Corrective measures are then applied to prevent the predicted high cycle-to-cycle combustion variation.

  3. Dynamic Simulation of Human Gait Model With Predictive Capability.

    Science.gov (United States)

    Sun, Jinming; Wu, Shaoli; Voglewede, Philip A

    2018-03-01

    In this paper, it is proposed that the central nervous system (CNS) controls human gait using a predictive control approach in conjunction with classical feedback control instead of exclusive classical feedback control theory that controls based on past error. To validate this proposition, a dynamic model of human gait is developed using a novel predictive approach to investigate the principles of the CNS. The model developed includes two parts: a plant model that represents the dynamics of human gait and a controller that represents the CNS. The plant model is a seven-segment, six-joint model that has nine degrees-of-freedom (DOF). The plant model is validated using data collected from able-bodied human subjects. The proposed controller utilizes model predictive control (MPC). MPC uses an internal model to predict the output in advance, compare the predicted output to the reference, and optimize the control input so that the predicted error is minimal. To decrease the complexity of the model, two joints are controlled using a proportional-derivative (PD) controller. The developed predictive human gait model is validated by simulating able-bodied human gait. The simulation results show that the developed model is able to simulate the kinematic output close to experimental data.

  4. Comparative Evaluation of Some Crop Yield Prediction Models ...

    African Journals Online (AJOL)

    A computer program was adopted from the work of Hill et al. (1982) to calibrate and test three of the existing yield prediction models using tropical cowpea yield–weather data. The models tested were the Hanks Model (first and second versions), the Stewart Model (first and second versions) and the Hall–Butcher Model. Three sets of ...

  5. A model to predict the power output from wind farms

    Energy Technology Data Exchange (ETDEWEB)

    Landberg, L. [Riso National Lab., Roskilde (Denmark)

    1997-12-31

    This paper describes a model that can predict the power output from wind farms. To give examples of input, the model is applied to a wind farm in Texas. The predictions are generated from forecasts from the NGM model of NCEP. These predictions are made valid at individual sites (wind farms) by applying a matrix calculated by the sub-models of WAsP (Wind Atlas Analysis and Application Program). The actual wind farm production is calculated using the Riso PARK model. Because of the preliminary nature of the results, they will not be given here; however, similar results from Europe will be presented.
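
    Conceptually the chain runs from an NWP wind-speed forecast, through a site correction, to a turbine power curve and a wake-adjusted farm total; the sketch below compresses that into one function, with an invented power curve and flat factors standing in for the WAsP matrix and the PARK wake model.

        import numpy as np

        # Hypothetical turbine power curve (hub-height wind speed in m/s -> power in kW).
        curve_ws = np.array([3.0, 5.0, 7.0, 9.0, 11.0, 13.0, 25.0])
        curve_kw = np.array([0.0, 100.0, 350.0, 750.0, 1200.0, 1500.0, 1500.0])

        def farm_power(predicted_ws, site_factor=1.05, n_turbines=20, wake_efficiency=0.92):
            """NWP-forecast wind speed -> farm output; site_factor and wake_efficiency are placeholders
            for the WAsP site matrix and the PARK wake model."""
            ws = site_factor * np.asarray(predicted_ws, float)
            per_turbine = np.interp(ws, curve_ws, curve_kw, left=0.0, right=0.0)
            return n_turbines * wake_efficiency * per_turbine

        print(farm_power([6.5, 8.2, 12.0]))   # kW for three forecast hours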

  6. Modelling microbial interactions and food structure in predictive microbiology

    NARCIS (Netherlands)

    Malakar, P.K.

    2002-01-01

    Keywords: modelling, dynamic models, microbial interactions, diffusion, microgradients, colony growth, predictive microbiology.

    Growth response of microorganisms in foods is a complex process. Innovations in food production and preservation techniques have resulted in adoption of

  7. Ocean wave prediction using numerical and neural network models

    Digital Repository Service at National Institute of Oceanography (India)

    Mandal, S.; Prabaharan, N.

    This paper presents an overview of the development of the numerical wave prediction models and recently used neural networks for ocean wave hindcasting and forecasting. The numerical wave models express the physical concepts of the phenomena...

  8. A mathematical model for predicting earthquake occurrence ...

    African Journals Online (AJOL)

    We consider the continental crust under damage. We use observed microseism results from many seismic stations around the world, established to study the time series of continental crust activity, with a view to predicting the possible time of occurrence of an earthquake. We consider microseism time series ...

  9. Model for predicting the injury severity score.

    Science.gov (United States)

    Hagiwara, Shuichi; Oshima, Kiyohiro; Murata, Masato; Kaneko, Minoru; Aoki, Makoto; Kanbe, Masahiko; Nakamura, Takuro; Ohyama, Yoshio; Tamura, Jun'ichi

    2015-07-01

    To determine a formula that predicts the injury severity score from parameters obtained in the emergency department on arrival, we reviewed the medical records of trauma patients who were transferred to the emergency department of Gunma University Hospital between January 2010 and December 2010. The injury severity score, age, mean blood pressure, heart rate, Glasgow coma scale, hemoglobin, hematocrit, red blood cell count, platelet count, fibrinogen, international normalized ratio of prothrombin time, activated partial thromboplastin time, and fibrin degradation products were examined in those patients on arrival. To determine the formula that predicts the injury severity score, multiple linear regression analysis was carried out, with the injury severity score as the dependent variable and the other parameters as candidate explanatory variables. IBM SPSS Statistics 20 was used for the statistical analysis. Statistical significance was set at P < 0.05, and the Durbin-Watson ratio was 2.200. A formula for predicting the injury severity score in trauma patients was developed from ordinary parameters such as fibrin degradation products and mean blood pressure. This formula is useful because the injury severity score can be predicted easily in the emergency department.
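
    The formula described is an ordinary multiple linear regression with the injury severity score as the dependent variable; a minimal sketch of fitting and applying such a formula (toy numbers, not the Gunma University data or coefficients) is:

        import numpy as np
        from sklearn.linear_model import LinearRegression

        # Hypothetical arrival values: columns are mean blood pressure (mmHg), GCS, and FDP (ug/mL).
        X = np.array([[80, 14, 12], [65, 9, 60], [90, 15, 5], [70, 12, 30], [60, 7, 90]], dtype=float)
        y = np.array([9, 29, 4, 17, 41], dtype=float)         # illustrative injury severity scores

        model = LinearRegression().fit(X, y)
        print(model.intercept_, model.coef_)                  # coefficients of the predictive formula
        print(model.predict([[75.0, 13.0, 20.0]]))            # predicted ISS for a new patient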

  10. Predicting Career Advancement with Structural Equation Modelling

    Science.gov (United States)

    Heimler, Ronald; Rosenberg, Stuart; Morote, Elsa-Sofia

    2012-01-01

    Purpose: The purpose of this paper is to use the authors' prior findings concerning basic employability skills in order to determine which skills best predict career advancement potential. Design/methodology/approach: Utilizing survey responses of human resource managers, the employability skills showing the largest relationships to career…

  11. Evolutionary trade-offs in plants mediate the strength of trophic cascades.

    Science.gov (United States)

    Mooney, Kailen A; Halitschke, Rayko; Kessler, Andre; Agrawal, Anurag A

    2010-03-26

    Predators determine herbivore and plant biomass via so-called trophic cascades, and the strength of such effects is influenced by ecosystem productivity. To determine whether evolutionary trade-offs among plant traits influence patterns of trophic control, we manipulated predators and soil fertility and measured impacts of a major herbivore (the aphid Aphis nerii) on 16 milkweed species (Asclepias spp.) in a phylogenetic field experiment. Herbivore density was determined by variation in predation and trade-offs between herbivore resistance and plant growth strategy. Neither herbivore density nor predator effects on herbivores predicted the cascading effects of predators on plant biomass. Instead, cascade strength was strongly and positively associated with milkweed response to soil fertility. Accordingly, contemporary patterns of trophic control are driven by evolutionarily convergent trade-offs faced by plants.

  12. Statistical model based gender prediction for targeted NGS clinical panels

    Directory of Open Access Journals (Sweden)

    Palani Kannan Kandavel

    2017-12-01

    The reference test dataset is used to test the model. The sensitivity of gender prediction is increased relative to the current “genotype composition in ChrX” based approach. In addition, the prediction score given by the model can be used to evaluate the quality of a clinical dataset: a higher prediction score towards the respective gender indicates higher quality of the sequenced data.

  13. A predictive pilot model for STOL aircraft landing

    Science.gov (United States)

    Kleinman, D. L.; Killingsworth, W. R.

    1974-01-01

    An optimal control approach has been used to model pilot performance during STOL flare and landing. The model is used to predict pilot landing performance for three STOL configurations, each having a different level of automatic control augmentation. Model predictions are compared with flight simulator data. It is concluded that the model can be an effective design tool for analytically studying the effects of display modifications, different stability augmentation systems, and proposed changes in the landing area geometry.

  14. Estimating Model Prediction Error: Should You Treat Predictions as Fixed or Random?

    Science.gov (United States)

    Wallach, Daniel; Thorburn, Peter; Asseng, Senthold; Challinor, Andrew J.; Ewert, Frank; Jones, James W.; Rotter, Reimund; Ruane, Alexander

    2016-01-01

    Crop models are important tools for impact assessment of climate change, as well as for exploring management options under current climate. It is essential to evaluate the uncertainty associated with predictions of these models. We compare two criteria of prediction error; MSEP_fixed, which evaluates mean squared error of prediction for a model with fixed structure, parameters and inputs, and MSEP_uncertain(X), which evaluates mean squared error averaged over the distributions of model structure, inputs and parameters. Comparison of model outputs with data can be used to estimate the former. The latter has a squared bias term, which can be estimated using hindcasts, and a model variance term, which can be estimated from a simulation experiment. The separate contributions to MSEP_uncertain(X) can be estimated using a random effects ANOVA. It is argued that MSEP_uncertain(X) is the more informative uncertainty criterion, because it is specific to each prediction situation.
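
    A schematic rendering of the two criteria as a bias-variance split (our reading of the abstract, not the paper's exact notation) is:

        \mathrm{MSEP}_{\mathrm{fixed}} = E\left[\bigl(Y - \hat{f}(X;\hat{\theta})\bigr)^{2}\right], \qquad
        \mathrm{MSEP}_{\mathrm{uncertain}}(X) = \bigl(E[Y] - E_{M}[\hat{f}_{M}(X)]\bigr)^{2} + \operatorname{Var}_{M}\bigl[\hat{f}_{M}(X)\bigr],

    where M ranges over the distributions of model structure, inputs and parameters; the first term is the squared bias (estimable from hindcasts) and the second is the model variance (estimable from a simulation experiment).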

  15. Model-based uncertainty in species range prediction

    DEFF Research Database (Denmark)

    Pearson, R. G.; Thuiller, Wilfried; Bastos Araujo, Miguel

    2006-01-01

    Aim Many attempts to predict the potential range of species rely on environmental niche (or 'bioclimate envelope') modelling, yet the effects of using different niche-based methodologies require further investigation. Here we investigate the impact that the choice of model can have on predictions...

  16. Wind turbine control and model predictive control for uncertain systems

    DEFF Research Database (Denmark)

    Thomsen, Sven Creutz

    as disturbance models for controller design. The theoretical study deals with Model Predictive Control (MPC). MPC is an optimal control method which is characterized by the use of a receding prediction horizon. MPC has risen in popularity due to its inherent ability to systematically account for time...

  17. Testing and analysis of internal hardwood log defect prediction models

    Science.gov (United States)

    R. Edward Thomas

    2011-01-01

    The severity and location of internal defects determine the quality and value of lumber sawn from hardwood logs. Models have been developed to predict the size and position of internal defects based on external defect indicator measurements. These models were shown to predict approximately 80% of all internal knots based on external knot indicators. However, the size...

  18. Comparison of Simple Versus Performance-Based Fall Prediction Models

    Directory of Open Access Journals (Sweden)

    Shekhar K. Gadkaree BS

    2015-05-01

    Full Text Available Objective: To compare the predictive ability of standard falls prediction models based on physical performance assessments with more parsimonious prediction models based on self-reported data. Design: We developed a series of fall prediction models progressing in complexity and compared area under the receiver operating characteristic curve (AUC) across models. Setting: National Health and Aging Trends Study (NHATS), which surveyed a nationally representative sample of Medicare enrollees (age ≥65) at baseline (Round 1: 2011-2012) and 1-year follow-up (Round 2: 2012-2013). Participants: In all, 6,056 community-dwelling individuals participated in Rounds 1 and 2 of NHATS. Measurements: Primary outcomes were 1-year incidence of “any fall” and “recurrent falls.” Prediction models were compared and validated in development and validation sets, respectively. Results: A prediction model that included demographic information, self-reported problems with balance and coordination, and previous fall history was the most parsimonious model that optimized AUC for both any fall (AUC = 0.69, 95% confidence interval [CI] = [0.67, 0.71]) and recurrent falls (AUC = 0.77, 95% CI = [0.74, 0.79]) in the development set. Physical performance testing provided a marginal additional predictive value. Conclusion: A simple clinical prediction model that does not include physical performance testing could facilitate routine, widespread falls risk screening in the ambulatory care setting.

  19. Models for predicting fuel consumption in sagebrush-dominated ecosystems

    Science.gov (United States)

    Clinton S. Wright

    2013-01-01

    Fuel consumption predictions are necessary to accurately estimate or model fire effects, including pollutant emissions during wildland fires. Fuel and environmental measurements on a series of operational prescribed fires were used to develop empirical models for predicting fuel consumption in big sagebrush (Artemisia tridentata Nutt.) ecosystems....

  20. Refining the Committee Approach and Uncertainty Prediction in Hydrological Modelling

    NARCIS (Netherlands)

    Kayastha, N.

    2014-01-01

    Due to the complexity of hydrological systems, a single model may be unable to capture the full range of a catchment response and accurately predict the streamflows. The multi-modelling approach opens up possibilities for handling such difficulties and allows improving the predictive capability of

  1. A new, accurate predictive model for incident hypertension

    DEFF Research Database (Denmark)

    Völzke, Henry; Fung, Glenn; Ittermann, Till

    2013-01-01

    Data mining represents an alternative approach to identify new predictors of multifactorial diseases. This work aimed at building an accurate predictive model for incident hypertension using data mining procedures.

  2. Prediction models for successful external cephalic version: a systematic review

    NARCIS (Netherlands)

    Velzel, Joost; de Hundt, Marcella; Mulder, Frederique M.; Molkenboer, Jan F. M.; van der Post, Joris A. M.; Mol, Ben W.; Kok, Marjolein

    2015-01-01

    To provide an overview of existing prediction models for successful ECV, and to assess their quality, development and performance. We searched MEDLINE, EMBASE and the Cochrane Library to identify all articles reporting on prediction models for successful ECV published from inception to January 2015.

  3. Hidden Markov Model for quantitative prediction of snowfall

    Indian Academy of Sciences (India)

    A Hidden Markov Model (HMM) has been developed for prediction of quantitative snowfall in the Pir-Panjal and Great Himalayan mountain ranges of the Indian Himalaya. The model predicts snowfall two days in advance using nine daily recorded meteorological variables from the past 20 winters (1992–2012). There are six ...

  4. Mathematical model for dissolved oxygen prediction in Cirata ...

    African Journals Online (AJOL)

    This paper presents the implementation and performance of a mathematical model to predict the concentration of dissolved oxygen in the Cirata Reservoir, West Java, using an Artificial Neural Network (ANN). The simulation program was created using Visual Studio 2012 C# software with the ANN model implemented in it. Prediction ...

  5. Econometric models for predicting confusion crop ratios

    Science.gov (United States)

    Umberger, D. E.; Proctor, M. H.; Clark, J. E.; Eisgruber, L. M.; Braschler, C. B. (Principal Investigator)

    1979-01-01

    Results for both the United States and Canada show that econometric models can provide estimates of confusion crop ratios that are more accurate than historical ratios. Whether these models can support the LACIE 90/90 accuracy criterion is uncertain. In the United States, experimenting with additional model formulations could provide improved models in some CRDs, particularly in winter wheat. Improved models may also be possible for the Canadian CDs. The more aggregated province/state models outperformed the individual CD/CRD models. This result was expected, partly because acreage statistics are based on sampling procedures, and the sampling precision declines from the province/state to the CD/CRD level. Declining sampling precision and the need to substitute province/state data for the CD/CRD data introduced measurement error into the CD/CRD models.

  6. PEEX Modelling Platform for Seamless Environmental Prediction

    Science.gov (United States)

    Baklanov, Alexander; Mahura, Alexander; Arnold, Stephen; Makkonen, Risto; Petäjä, Tuukka; Kerminen, Veli-Matti; Lappalainen, Hanna K.; Ezau, Igor; Nuterman, Roman; Zhang, Wen; Penenko, Alexey; Gordov, Evgeny; Zilitinkevich, Sergej; Kulmala, Markku

    2017-04-01

    The Pan-Eurasian EXperiment (PEEX) is a multidisciplinary, multi-scale research programme started in 2012 and aimed at resolving the major uncertainties in Earth System Science and global sustainability issues concerning the Arctic and boreal Northern Eurasian regions and in China. Such challenges include climate change, air quality, biodiversity loss, chemicalization, food supply, and the use of natural resources by mining, industry, energy production and transport. The research infrastructure introduces the current state-of-the-art modeling platform and observation systems in the Pan-Eurasian region and presents the future baselines for the coherent and coordinated research infrastructures in the PEEX domain. The PEEX modeling Platform is characterized by a complex seamless integrated Earth System Modeling (ESM) approach, in combination with specific models of different processes and elements of the system, acting on different temporal and spatial scales. The ensemble approach is taken to the integration of modeling results from different models, participants and countries. PEEX utilizes the full potential of a hierarchy of models: scenario analysis, inverse modeling, and modeling based on measurement needs and processes. The models are validated and constrained by available in-situ and remote sensing data of various spatial and temporal scales using data assimilation and top-down modeling. The analyses of the anticipated large volumes of data produced by available models and sensors will be supported by a dedicated virtual research environment developed for these purposes.

  7. Using Species Distribution Models to Predict Potential Landscape Restoration Effects on Puma Conservation.

    Science.gov (United States)

    Angelieri, Cintia Camila Silva; Adams-Hosking, Christine; Ferraz, Katia Maria Paschoaletto Micchi de Barros; de Souza, Marcelo Pereira; McAlpine, Clive Alexander

    2016-01-01

    A mosaic of intact native and human-modified vegetation can provide important habitat for top predators such as the puma (Puma concolor), avoiding negative effects on other species and ecological processes due to cascading trophic interactions. This study investigates the effects of restoration scenarios on the puma's habitat suitability in the most developed Brazilian region (São Paulo State). Species Distribution Models incorporating restoration scenarios were developed using the species' occurrence information to (1) map habitat suitability of pumas in São Paulo State, Southeast, Brazil; (2) test the relative contribution of environmental variables ecologically relevant to the species' habitat suitability; and (3) project the predicted habitat suitability to future native vegetation restoration scenarios. The Maximum Entropy algorithm was used (Test AUC of 0.84 ± 0.0228) based on seven environmental non-correlated variables and non-autocorrelated presence-only records (n = 342). The percentage of native vegetation (positive influence), elevation (positive influence) and density of roads (negative influence) were considered the most important environmental variables in the model. Model projections to restoration scenarios reflected the high positive relationship between pumas and native vegetation. These projections identified new high suitability areas for pumas (probability of presence >0.5) in highly deforested regions. High suitability areas increased from 5.3% to 8.5% of the total State area when landscapes were restored to at least the minimum native vegetation cover (20%) established by the Brazilian Forest Code for private lands. This study highlights the importance of a landscape planning approach to improve the conservation outlook for pumas and other species, including not only the establishment and management of protected areas, but also habitat restoration on private lands. Importantly, the results may inform environmental

  8. Impact of modellers' decisions on hydrological a priori predictions

    Science.gov (United States)

    Holländer, H. M.; Bormann, H.; Blume, T.; Buytaert, W.; Chirico, G. B.; Exbrayat, J.-F.; Gustafsson, D.; Hölzel, H.; Krauße, T.; Kraft, P.; Stoll, S.; Blöschl, G.; Flühler, H.

    2014-06-01

    In practice, the catchment hydrologist is often confronted with the task of predicting discharge without having the needed records for calibration. Here, we report the discharge predictions of 10 modellers - using the model of their choice - for the man-made Chicken Creek catchment (6 ha, northeast Germany, Gerwin et al., 2009b) and we analyse how well they improved their prediction in three steps based on adding information prior to each following step. The modellers predicted the catchment's hydrological response in its initial phase without having access to the observed records. They used conceptually different physically based models and their modelling experience differed largely. Hence, they encountered two problems: (i) to simulate discharge for an ungauged catchment and (ii) using models that were developed for catchments, which are not in a state of landscape transformation. The prediction exercise was organized in three steps: (1) for the first prediction the modellers received a basic data set describing the catchment to a degree somewhat more complete than usually available for a priori predictions of ungauged catchments; they did not obtain information on stream flow, soil moisture, nor groundwater response and had therefore to guess the initial conditions; (2) before the second prediction they inspected the catchment on-site and discussed their first prediction attempt; (3) for their third prediction they were offered additional data by charging them pro forma with the costs for obtaining this additional information. Holländer et al. (2009) discussed the range of predictions obtained in step (1). Here, we detail the modeller's assumptions and decisions in accounting for the various processes. We document the prediction progress as well as the learning process resulting from the availability of added information. For the second and third steps, the progress in prediction quality is evaluated in relation to individual modelling experience and costs of

  9. Adding propensity scores to pure prediction models fails to improve predictive performance

    Directory of Open Access Journals (Sweden)

    Amy S. Nowacki

    2013-08-01

    Full Text Available Background. Propensity score usage seems to be growing in popularity leading researchers to question the possible role of propensity scores in prediction modeling, despite the lack of a theoretical rationale. It is suspected that such requests are due to the lack of differentiation regarding the goals of predictive modeling versus causal inference modeling. Therefore, the purpose of this study is to formally examine the effect of propensity scores on predictive performance. Our hypothesis is that a multivariable regression model that adjusts for all covariates will perform as well as or better than those models utilizing propensity scores with respect to model discrimination and calibration. Methods. The most commonly encountered statistical scenarios for medical prediction (logistic and proportional hazards regression) were used to investigate this research question. Random cross-validation was performed 500 times to correct for optimism. The multivariable regression models adjusting for all covariates were compared with models that included adjustment for or weighting with the propensity scores. The methods were compared based on three predictive performance measures: (1) concordance indices; (2) Brier scores; and (3) calibration curves. Results. Multivariable models adjusting for all covariates had the highest average concordance index, the lowest average Brier score, and the best calibration. Propensity score adjustment and inverse probability weighting models without adjustment for all covariates performed worse than full models and failed to improve predictive performance with full covariate adjustment. Conclusion. Propensity score techniques did not improve prediction performance measures beyond multivariable adjustment. Propensity scores are not recommended if the analytical goal is pure prediction modeling.
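
    The comparison made in the study, full covariate adjustment versus a model that carries a propensity score, can be sketched on synthetic data as below; the cohort, features and AUC values are illustrative only and do not reproduce the paper's analysis.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        # Synthetic cohort: the first column is dichotomized into a "treatment", the rest are covariates.
        X, y = make_classification(n_samples=2000, n_features=8, random_state=1)
        treatment, covariates = (X[:, 0] > 0).astype(float), X[:, 1:]

        # Propensity score: modelled probability of treatment given the other covariates.
        ps = LogisticRegression(max_iter=1000).fit(covariates, treatment).predict_proba(covariates)[:, 1]

        designs = {"treatment + all covariates": np.column_stack([treatment, covariates]),
                   "treatment + propensity score": np.column_stack([treatment, ps])}
        for name, design in designs.items():
            auc = cross_val_score(LogisticRegression(max_iter=1000), design, y, cv=5, scoring="roc_auc").mean()
            print(f"{name}: cross-validated AUC = {auc:.3f}")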

  10. Impact of late glacial climate variations on stratification and trophic state of the meromictic lake Längsee (Austria): validation of a conceptual model by multi proxy studies

    Directory of Open Access Journals (Sweden)

    Jens MÜLLER

    2002-02-01

    Full Text Available Selected pigments, diatoms and diatom-inferred phosphorus (Di-TP) concentrations of a late glacial sediment core section of the meromictic Längsee, Austria, were compared with tephra- and varve-dated pollen stratigraphic and geochemical results. A conceptual model was adopted for Längsee and evaluated using multi proxy data. During the unforested late Pleniglacial, a holomictic lake stage with low primary productivity prevailed. Subsequent to the Lateglacial Betula expansion, at about 14,300 cal. y BP, okenone and isorenieratene, pigments from purple and green sulphur bacteria, indicate the onset of anoxic conditions in the hypolimnion. The formation of laminae coincides with this anoxic, meromictic period with high, though fluctuating, amounts of okenone that persisted throughout the Lateglacial interstadial. The occurrence of unlaminated sediment sections of allochthonous origin, and concurrent low concentrations of okenone, were related to cool and wet climate fluctuations during this period, probably coupled with a complete mixing of the water column. Two of these oscillations of the Lateglacial interstadial have been correlated tentatively with the Aegelsee and Gerzensee oscillations in the Alps. The latter climate fluctuation divides a period of enhanced anoxia and primary productivity, correlated with the Alleröd chronozone. Continental climate conditions were assumed to be the main driving forces for meromictic stability during Alleröd times. In addition, calcite dissolution due to severe hypolimnetic anoxia appears to have supported meromictic stability. Increased pigment concentrations, which are in contrast to low diatom-inferred total phosphorus (Di-TP), indicate the formation of a productive metalimnion during this period, probably due to a clear-water phase (low catchment erosion), increased temperatures, and a steep gradient between the phosphorus enriched hypolimnion and the oligotrophic epilimnion. Meltwater impacts from an

  11. Trophically available metal - A variable feast

    International Nuclear Information System (INIS)

    Rainbow, Philip S.; Luoma, Samuel N.; Wang Wenxiong

    2011-01-01

    Assimilation of trace metals by predators from prey is affected by the physicochemical form of the accumulated metal in the prey, leading to the concept of a Trophically Available Metal (TAM) component in the food item definable in terms of particular subcellular fractions of accumulated metal. As originally defined TAM consists of soluble metal forms and metal associated with cell organelles, the combination of separated fractions which best explained particular results involving a decapod crustacean predator feeding on bivalve mollusc tissues. Unfortunately TAM as originally defined has subsequently frequently been used in the literature as an absolute description of that component of accumulated metal that is trophically available in all prey to all consumers. It is now clear that what is trophically available varies between food items, consumers and metals. TAM as originally defined should be seen as a useful starting hypothesis, not as a statement of fact. - Trophically Available Metal (TAM), the component of accumulated metal in food that is taken up by a feeding animal, varies with food type and consumer.

  12. Trophically available metal - A variable feast

    Energy Technology Data Exchange (ETDEWEB)

    Rainbow, Philip S., E-mail: p.rainbow@nhm.ac.uk [Department of Zoology, Natural History Museum, Cromwell Rd, London SW7 5BD (United Kingdom); Luoma, Samuel N. [Department of Zoology, Natural History Museum, Cromwell Rd, London SW7 5BD (United Kingdom); John Muir Institute of the Environment, University of California, Davis, CA 95616 (United States); Wang Wenxiong [College of Marine and Environmental Sciences, State Key Laboratory for Marine Environmental Sciences, Xiamen University, Fujian (China)

    2011-10-15

    Assimilation of trace metals by predators from prey is affected by the physicochemical form of the accumulated metal in the prey, leading to the concept of a Trophically Available Metal (TAM) component in the food item definable in terms of particular subcellular fractions of accumulated metal. As originally defined TAM consists of soluble metal forms and metal associated with cell organelles, the combination of separated fractions which best explained particular results involving a decapod crustacean predator feeding on bivalve mollusc tissues. Unfortunately TAM as originally defined has subsequently frequently been used in the literature as an absolute description of that component of accumulated metal that is trophically available in all prey to all consumers. It is now clear that what is trophically available varies between food items, consumers and metals. TAM as originally defined should be seen as a useful starting hypothesis, not as a statement of fact. - Trophically Available Metal (TAM), the component of accumulated metal in food that is taken up by a feeding animal, varies with food type and consumer.

  13. NOx PREDICTION FOR FBC BOILERS USING EMPIRICAL MODELS

    Directory of Open Access Journals (Sweden)

    Jiří Štefanica

    2014-02-01

    Full Text Available Reliable prediction of NOx emissions can provide useful information for boiler design and fuel selection. Recently used kinetic prediction models for FBC boilers are overly complex and require large computing capacity. Even so, there are many uncertainties in the case of FBC boilers. An empirical modeling approach for NOx prediction has been used exclusively for PCC boilers. No reference is available for modifying this method for FBC conditions. This paper presents possible advantages of empirical modeling based prediction of NOx emissions for FBC boilers, together with a discussion of its limitations. Empirical models are reviewed, and are applied to operation data from FBC boilers used for combusting Czech lignite coal or coal-biomass mixtures. Modifications to the model are proposed in accordance with theoretical knowledge and prediction accuracy.

  14. Complex versus simple models: ion-channel cardiac toxicity prediction.

    Science.gov (United States)

    Mistry, Hitesh B

    2018-01-01

    There is growing interest in applying detailed mathematical models of the heart for ion-channel related cardiac toxicity prediction. However, there is debate as to whether such complex models are required. Here, the predictive performance of two established large-scale biophysical cardiac models was compared with that of a simple linear model, Bnet. Three ion-channel data-sets were extracted from the literature. Each compound was assigned a cardiac risk category using two different classification schemes based on information within CredibleMeds. The predictive performance of each model within each data-set for each classification scheme was assessed via leave-one-out cross-validation. Overall, the Bnet model performed as well as the leading cardiac models in two of the data-sets and outperformed both cardiac models on the third. These results highlight the importance of benchmarking complex versus simple models and also encourage the development of simple models.
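
    As an illustration of the leave-one-out benchmarking described above, the sketch below runs a simple linear classifier over synthetic ion-channel block features. The feature names, data and risk labels are invented for illustration; they are not taken from the paper or from CredibleMeds.

```python
# Hypothetical leave-one-out benchmark of a simple linear classifier on
# ion-channel block features; feature names and risk labels are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(0)
n = 40
X = rng.uniform(0.0, 1.0, size=(n, 3))   # columns: hERG, Na, Ca fractional block
risk = (X[:, 0] - 0.5 * X[:, 1] - 0.5 * X[:, 2] + rng.normal(0, 0.1, n) > 0).astype(int)

clf = LogisticRegression()
pred = cross_val_predict(clf, X, risk, cv=LeaveOneOut())
print("leave-one-out accuracy of the simple linear model:", (pred == risk).mean())
```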

  15. Complex versus simple models: ion-channel cardiac toxicity prediction

    Directory of Open Access Journals (Sweden)

    Hitesh B. Mistry

    2018-02-01

    Full Text Available There is growing interest in applying detailed mathematical models of the heart for ion-channel related cardiac toxicity prediction. However, there is debate as to whether such complex models are required. Here, the predictive performance of two established large-scale biophysical cardiac models was compared with that of a simple linear model, Bnet. Three ion-channel data-sets were extracted from the literature. Each compound was assigned a cardiac risk category using two different classification schemes based on information within CredibleMeds. The predictive performance of each model within each data-set for each classification scheme was assessed via leave-one-out cross-validation. Overall, the Bnet model performed as well as the leading cardiac models in two of the data-sets and outperformed both cardiac models on the third. These results highlight the importance of benchmarking complex versus simple models and also encourage the development of simple models.

  16. Fixed recurrence and slip models better predict earthquake behavior than the time- and slip-predictable models 1: repeating earthquakes

    Science.gov (United States)

    Rubinstein, Justin L.; Ellsworth, William L.; Chen, Kate Huihsuan; Uchida, Naoki

    2012-01-01

    The behavior of individual events in repeating earthquake sequences in California, Taiwan and Japan is better predicted by a model with fixed inter-event time or fixed slip than it is by the time- and slip-predictable models for earthquake occurrence. Given that repeating earthquakes are highly regular in both inter-event time and seismic moment, the time- and slip-predictable models seem ideally suited to explain their behavior. Taken together with evidence from the companion manuscript that shows similar results for laboratory experiments, we conclude that the short-term predictions of the time- and slip-predictable models should be rejected in favor of earthquake models that assume either fixed slip or fixed recurrence interval. This implies that the elastic rebound model underlying the time- and slip-predictable models offers no additional value in describing earthquake behavior in an event-to-event sense, but its value in a long-term sense cannot be determined. These models likely fail because they rely on assumptions that oversimplify the earthquake cycle. We note that the time and slip of these events are predicted quite well by fixed slip and fixed recurrence models, so in some sense they are time- and slip-predictable. While fixed recurrence and slip models better predict repeating earthquake behavior than the time- and slip-predictable models, we observe a correlation between slip and the preceding recurrence time for many repeating earthquake sequences in Parkfield, California. This correlation is not found in other regions, and the sequences with the correlative slip-predictable behavior are not distinguishable from nearby earthquake sequences that do not exhibit this behavior.

  17. [Application of ARIMA model on prediction of malaria incidence].

    Science.gov (United States)

    Jing, Xia; Hua-Xun, Zhang; Wen, Lin; Su-Jian, Pei; Ling-Cong, Sun; Xiao-Rong, Dong; Mu-Min, Cao; Dong-Ni, Wu; Shunxiang, Cai

    2016-01-29

    To predict the incidence of local malaria in Hubei Province by applying the Autoregressive Integrated Moving Average (ARIMA) model. SPSS 13.0 software was used to construct the ARIMA model based on the monthly local malaria incidence in Hubei Province from 2004 to 2009. The local malaria incidence data of 2010 were used for model validation and evaluation. The seasonal model ARIMA(1,1,1)(1,1,0)12 (seasonal period of 12 months) was identified as the best fit, with an AIC of 76.085 and an SBC of 84.395. All the actual incidence data fell within the 95% CI of the values predicted by the model, so the prediction performance was acceptable. The ARIMA model could effectively fit and predict the incidence of local malaria in Hubei Province.
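
    A minimal sketch of fitting a seasonal ARIMA(1,1,1)(1,1,0)12 model in Python, assuming statsmodels rather than SPSS and using a synthetic monthly incidence series; the data and forecast horizon are illustrative only.

```python
# Fit a seasonal ARIMA(1,1,1)(1,1,0)12 to a synthetic monthly incidence series
# and produce a 12-month forecast with 95% prediction intervals.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(1)
months = pd.date_range("2004-01", periods=72, freq="MS")
incidence = pd.Series(5 + 3 * np.sin(2 * np.pi * months.month / 12)
                      + rng.normal(0, 0.5, len(months)), index=months)

fit = SARIMAX(incidence, order=(1, 1, 1), seasonal_order=(1, 1, 0, 12)).fit(disp=False)
print("AIC:", round(fit.aic, 1))

forecast = fit.get_forecast(steps=12)          # the validation year
print(forecast.predicted_mean.head(3))
print(forecast.conf_int(alpha=0.05).head(3))   # 95% prediction intervals
```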

  18. Qualitative models to predict impacts of human interventions in a wetland ecosystem

    Directory of Open Access Journals (Sweden)

    S. Loiselle

    2002-07-01

    Full Text Available The large shallow wetlands that dominate much of the South American continent are rich in biodiversity and complexity. Many of these undamaged ecosystems are presently being examined for their potential economic utility, putting pressure on local authorities and the conservation community to find ways of correctly utilising the available natural resources without compromising the ecosystem functioning and overall integrity. In contrast to many northern hemisphere ecosystems, there have been few long-term ecological studies of these systems, leading to a lack of quantitative data on which to construct ecological or resource use models. As a result, decision makers, even well-meaning ones, have difficulty in determining if particular economic activities can potentially cause significant damage to the ecosystem and how one should go about monitoring the impacts of such activities. While the direct impact of many activities is often known, the secondary indirect impacts are usually less clear and can depend on local ecological conditions.

    The use of qualitative models is a helpful tool to highlight potential feedback mechanisms and secondary effects of management action on ecosystem integrity. The harvesting of a single, apparently abundant, species can have indirect secondary effects on key trophic and abiotic compartments. In this paper, loop model analysis is used to qualitatively examine secondary effects of potential economic activities in a large wetland area in northeast Argentina, the Esteros del Ibera. Based on interaction with local actors together with observed ecological information, loop models were constructed to reflect relationships between biotic and abiotic compartments. A series of analyses were made to study the effect of different economic scenarios on key ecosystem compartments. Important impacts on key biotic compartments (phytoplankton, zooplankton, ichthyofauna, aquatic macrophytes and on the abiotic environment
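
    The press-perturbation logic behind loop models can be sketched numerically: near equilibrium, the long-term response of each compartment to a sustained input is proportional to the negative inverse of the community matrix. The three-compartment web and interaction strengths below are invented for illustration and are not from the Esteros del Ibera study; only the sign pattern of the result is meaningful in a qualitative analysis.

```python
# Press-perturbation (loop-analysis style) response for an invented 3-compartment
# web: phytoplankton -> zooplankton -> fish. a[i, j] is the effect of j on i.
import numpy as np

A = np.array([
    [-0.5, -0.3,  0.0],   # phytoplankton: self-limited, grazed by zooplankton
    [ 0.4, -0.1, -0.3],   # zooplankton: eats phytoplankton, eaten by fish
    [ 0.0,  0.3, -0.2],   # fish: eats zooplankton, self-limited
])

# Long-term change of each compartment per unit of sustained input into each
# compartment (harvesting a group would be a negative input): -A^{-1}.
response = -np.linalg.inv(A)
print(np.sign(response))   # the sign pattern is what a qualitative model predicts
```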

  19. Mobility Modelling through Trajectory Decomposition and Prediction

    OpenAIRE

    Faghihi, Farbod

    2017-01-01

    The ubiquity of mobile devices with positioning sensors makes it possible to derive a user's location at any time. However, constantly sensing the position in order to track the user's movement is not feasible, either due to the unavailability of sensors or due to computational and storage burdens. In this thesis, we present and evaluate a novel approach for efficiently tracking a user's movement trajectories using decomposition and prediction of trajectories. We facilitate tracking by taking advantage ...

  20. Poisson Mixture Regression Models for Heart Disease Prediction

    Science.gov (United States)

    Erol, Hamza

    2016-01-01

    Early heart disease control can be achieved by high disease prediction and diagnosis efficiency. This paper focuses on the use of model-based clustering techniques to predict and diagnose heart disease via Poisson mixture regression models. Analysis and application of Poisson mixture regression models are addressed here under two different classes: standard and concomitant variable mixture regression models. Results show that a two-component concomitant variable Poisson mixture regression model predicts heart disease better than both the standard Poisson mixture regression model and the ordinary generalized linear Poisson regression model due to its low Bayesian Information Criterion value. Furthermore, a zero-inflated Poisson mixture regression model turned out to be the best model for heart disease prediction over all models, as it both clusters individuals into high- or low-risk categories and predicts the rate of heart disease component-wise given the available clusters. It is deduced that heart disease prediction can be done effectively by identifying the major risks component-wise using a Poisson mixture regression model. PMID:27999611

  1. Predicting birth weight with conditionally linear transformation models.

    Science.gov (United States)

    Möst, Lisa; Schmid, Matthias; Faschingbauer, Florian; Hothorn, Torsten

    2016-12-01

    Low and high birth weight (BW) are important risk factors for neonatal morbidity and mortality. Gynecologists must therefore accurately predict BW before delivery. Most prediction formulas for BW are based on prenatal ultrasound measurements carried out within one week prior to birth. Although successfully used in clinical practice, these formulas focus on point predictions of BW but do not systematically quantify uncertainty of the predictions, i.e. they result in estimates of the conditional mean of BW but do not deliver prediction intervals. To overcome this problem, we introduce conditionally linear transformation models (CLTMs) to predict BW. Instead of focusing only on the conditional mean, CLTMs model the whole conditional distribution function of BW given prenatal ultrasound parameters. Consequently, the CLTM approach delivers both point predictions of BW and fetus-specific prediction intervals. Prediction intervals constitute an easy-to-interpret measure of prediction accuracy and allow identification of fetuses subject to high prediction uncertainty. Using a data set of 8712 deliveries at the Perinatal Centre at the University Clinic Erlangen (Germany), we analyzed variants of CLTMs and compared them to standard linear regression estimation techniques used in the past and to quantile regression approaches. The best-performing CLTM variant was competitive with quantile regression and linear regression approaches in terms of conditional coverage and average length of the prediction intervals. We propose that CLTMs be used because they are able to account for possible heteroscedasticity, kurtosis, and skewness of the distribution of BWs. © The Author(s) 2014.

  2. Prediction of hourly solar radiation with multi-model framework

    International Nuclear Information System (INIS)

    Wu, Ji; Chan, Chee Keong

    2013-01-01

    Highlights: • A novel approach to predict solar radiation through the use of clustering paradigms. • Development of prediction models based on the intrinsic pattern observed in each cluster. • Prediction based on proper clustering and selection of a model for the current period provides better results than other methods. • Experiments were conducted on actual solar radiation data obtained from a weather station in Singapore. - Abstract: In this paper, a novel multi-model framework for the prediction of solar radiation is proposed. The framework starts with the assumption that there are several patterns embedded in the solar radiation series. To extract the underlying patterns, the solar radiation series is first segmented into smaller subsequences, and the subsequences are further grouped into different clusters. For each cluster, an appropriate prediction model is trained. A pattern identification procedure is then used to identify the pattern that fits the current period; based on this pattern, the corresponding prediction model is applied to obtain the predicted value. The prediction results of the proposed framework are then compared to other techniques. It is shown that the proposed framework provides superior performance compared to the others.
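
    A minimal sketch of the cluster-then-predict idea described above, assuming scikit-learn and a synthetic series: subsequences are clustered, one regressor is trained per cluster, and the model matching the current pattern is used for the forecast. Window length, cluster count and model choice are illustrative, not those of the paper.

```python
# Cluster-then-predict: cluster fixed-length subsequences of a synthetic series,
# train one ridge regressor per cluster, and forecast with the matching model.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import Ridge

rng = np.random.default_rng(2)
series = np.sin(np.linspace(0, 60, 2000)) + 0.1 * rng.normal(size=2000)

window = 24                                    # previous 24 samples form one pattern
X = np.array([series[i:i + window] for i in range(len(series) - window)])
y = series[window:]                            # one-step-ahead target

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
models = {c: Ridge().fit(X[kmeans.labels_ == c], y[kmeans.labels_ == c])
          for c in range(3)}

current = X[-1:]                               # identify the current pattern ...
cluster = kmeans.predict(current)[0]           # ... then apply that cluster's model
print("cluster:", cluster, "prediction:", models[cluster].predict(current)[0])
```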

  3. Coastal habitats as surrogates for taxonomic, functional and trophic structures of benthic faunal communities.

    Science.gov (United States)

    Törnroos, Anna; Nordström, Marie C; Bonsdorff, Erik

    2013-01-01

    Due to human impact, there is extensive degradation and loss of marine habitats, which calls for measures that incorporate taxonomic as well as functional and trophic aspects of biodiversity. Since such data is less easily quantifiable in nature, the use of habitats as surrogates or proxies for biodiversity is on the rise in marine conservation and management. However, there is a critical gap in knowledge of whether pre-defined habitat units adequately represent the functional and trophic structure of communities. We also lack comparisons of different measures of community structure in terms of both between- (β) and within-habitat (α) variability when accounting for species densities. Thus, we evaluated a priori defined coastal habitats as surrogates for traditional taxonomic, functional and trophic zoobenthic community structure. We focused on four habitats (bare sand, canopy-forming algae, seagrass above- and belowground), all easily delineated in nature and defined through classification systems. We analyzed uni- and multivariate data on species and trait diversity as well as stable isotope ratios of benthic macrofauna. A good fit between habitat types and taxonomic and functional structure was found, although habitats were more similar functionally. This was attributed to within-habitat heterogeneity so when habitat divisions matched the taxonomic structure, only bare sand was functionally distinct. The pre-defined habitats did not meet the variability of trophic structure, which also proved to differentiate on a smaller spatial scale. The quantification of trophic structure using species density only identified an epi- and an infaunal unit. To summarize the results we present a conceptual model illustrating the match between pre-defined habitat types and the taxonomic, functional and trophic community structure. Our results show the importance of including functional and trophic aspects more comprehensively in marine management and spatial planning.

  4. Genomic prediction of complex human traits: relatedness, trait architecture and predictive meta-models

    Science.gov (United States)

    Spiliopoulou, Athina; Nagy, Reka; Bermingham, Mairead L.; Huffman, Jennifer E.; Hayward, Caroline; Vitart, Veronique; Rudan, Igor; Campbell, Harry; Wright, Alan F.; Wilson, James F.; Pong-Wong, Ricardo; Agakov, Felix; Navarro, Pau; Haley, Chris S.

    2015-01-01

    We explore the prediction of individuals' phenotypes for complex traits using genomic data. We compare several widely used prediction models, including Ridge Regression, LASSO and Elastic Nets estimated from cohort data, and polygenic risk scores constructed using published summary statistics from genome-wide association meta-analyses (GWAMA). We evaluate the interplay between relatedness, trait architecture and optimal marker density, by predicting height, body mass index (BMI) and high-density lipoprotein level (HDL) in two data cohorts, originating from Croatia and Scotland. We empirically demonstrate that dense models are better when all genetic effects are small (height and BMI) and target individuals are related to the training samples, while sparse models predict better in unrelated individuals and when some effects have moderate size (HDL). For HDL sparse models achieved good across-cohort prediction, performing similarly to the GWAMA risk score and to models trained within the same cohort, which indicates that, for predicting traits with moderately sized effects, large sample sizes and familial structure become less important, though still potentially useful. Finally, we propose a novel ensemble of whole-genome predictors with GWAMA risk scores and demonstrate that the resulting meta-model achieves higher prediction accuracy than either model on its own. We conclude that although current genomic predictors are not accurate enough for diagnostic purposes, performance can be improved without requiring access to large-scale individual-level data. Our methodologically simple meta-model is a means of performing predictive meta-analysis for optimizing genomic predictions and can be easily extended to incorporate multiple population-level summary statistics or other domain knowledge. PMID:25918167
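
    A hedged sketch of comparing dense and sparse whole-genome predictors with scikit-learn; the marker matrix, effect sizes and accuracy measure (correlation between predicted and observed phenotype) are synthetic stand-ins for the cohort data used in the study.

```python
# Compare dense (ridge) and sparse (lasso, elastic net) predictors on a synthetic
# marker matrix with a few moderate-sized effects.
import numpy as np
from sklearn.linear_model import RidgeCV, LassoCV, ElasticNetCV
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n, p = 500, 2000                                  # individuals x markers (synthetic)
X = rng.binomial(2, 0.3, size=(n, p)).astype(float)
beta = np.zeros(p)
beta[:20] = rng.normal(0, 0.5, 20)                # only 20 markers have an effect
y = X @ beta + rng.normal(0, 1.0, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
for name, model in [("ridge", RidgeCV()), ("lasso", LassoCV(cv=5)),
                    ("elastic net", ElasticNetCV(cv=5))]:
    r = np.corrcoef(model.fit(X_tr, y_tr).predict(X_te), y_te)[0, 1]
    print(f"{name:12s} prediction accuracy (correlation): {r:.2f}")
```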

  5. Model predictive control of a crude oil distillation column

    Directory of Open Access Journals (Sweden)

    Morten Hovd

    1999-04-01

    Full Text Available The project of designing and implementing model based predictive control on the vacuum distillation column at the Nynäshamn Refinery of Nynäs AB is described in this paper. The paper describes in detail the modeling for the model based control, covers the controller implementation, and documents the benefits gained from the model based controller.

  6. A burnout prediction model based around char morphology

    Energy Technology Data Exchange (ETDEWEB)

    T. Wu; E. Lester; M. Cloke [University of Nottingham, Nottingham (United Kingdom). Nottingham Energy and Fuel Centre

    2005-07-01

    Poor burnout in a coal-fired power plant has marked penalties in the form of reduced energy efficiency and elevated waste material that cannot be utilized. The prediction of coal combustion behaviour in a furnace is of great significance in providing valuable information not only for process optimization but also for coal buyers in the international market. Coal combustion models have been developed that can make predictions about burnout behaviour and burnout potential. Most of these kinetic models require standard parameters such as volatile content, particle size and assumed char porosity in order to make a burnout prediction. This paper presents a new model called the Char Burnout Model (ChB) that also uses detailed information about char morphology in its prediction. The model can use data input from one of two sources. Both sources are derived from image analysis techniques. The first from individual analysis and characterization of real char types using an automated program. The second from predicted char types based on data collected during the automated image analysis of coal particles. Modelling results were compared with a different carbon burnout kinetic model and burnout data from re-firing the chars in a drop tube furnace operating at 1300 °C, 5% oxygen, across several residence times. An improved agreement between the ChB model and DTF experimental data proved that the inclusion of char morphology in combustion models can improve model predictions. 27 refs., 4 figs., 4 tabs.

  7. Questioning the Faith - Models and Prediction in Stream Restoration (Invited)

    Science.gov (United States)

    Wilcock, P.

    2013-12-01

    River management and restoration demand prediction at and beyond our present ability. Management questions, framed appropriately, can motivate fundamental advances in science, although the connection between research and application is not always easy, useful, or robust. Why is that? This presentation considers the connection between models and management, a connection that requires critical and creative thought on both sides. Essential challenges for managers include clearly defining project objectives and accommodating uncertainty in any model prediction. Essential challenges for the research community include matching the appropriate model to project duration, space, funding, information, and social constraints and clearly presenting answers that are actually useful to managers. Better models do not lead to better management decisions or better designs if the predictions are not relevant to and accepted by managers. In fact, any prediction may be irrelevant if the need for prediction is not recognized. The predictive target must be developed in an active dialog between managers and modelers. This relationship, like any other, can take time to develop. For example, large segments of stream restoration practice have remained resistant to models and prediction because the foundational tenet - that channels built to a certain template will be able to transport the supplied sediment with the available flow - has no essential physical connection between cause and effect. Stream restoration practice can be steered in a predictive direction in which project objectives are defined as predictable attributes and testable hypotheses. If stream restoration design is defined in terms of the desired performance of the channel (static or dynamic, sediment surplus or deficit), then channel properties that provide these attributes can be predicted and a basis exists for testing approximations, models, and predictions.

  8. Predicting Magazine Audiences with a Loglinear Model.

    Science.gov (United States)

    1987-07-01

    Title: Predicting Magazine Audiences with a Loglinear Model. Personal author(s): Peter J. Danaher. ... An important use of exposure distribution (e.d.) estimates is in media selection (Aaker 1975; Lee 1962, 1963; Little and Lodish 1969). All advertising campaigns have a budget. It ... From the BBD (beta-binomial distribution) we obtain the modified BBD (MBBD). Let X be the number of exposures a person has to k insertions in a single magazine. The mass function of the

  9. Predicting and Modelling of Survival Data when Cox's Regression Model does not hold

    DEFF Research Database (Denmark)

    Scheike, Thomas H.; Zhang, Mei-Jie

    2002-01-01

    Keywords: Aalen model; additive risk model; counting processes; competing risk; Cox regression; flexible modeling; goodness of fit; prediction of survival; survival analysis; time-varying effects.

  10. Predicting Error Bars for QSAR Models

    International Nuclear Information System (INIS)

    Schroeter, Timon; Schwaighofer, Anton; Mika, Sebastian; Ter Laak, Antonius; Suelzle, Detlev; Ganzer, Ursula; Heinrich, Nikolaus; Mueller, Klaus-Robert

    2007-01-01

    Unfavorable physicochemical properties often cause drug failures. It is therefore important to take lipophilicity and water solubility into account early on in lead discovery. This study presents log D7 models built using Gaussian Process regression, Support Vector Machines, decision trees and ridge regression algorithms based on 14556 drug discovery compounds of Bayer Schering Pharma. A blind test was conducted using 7013 new measurements from recent months. We also present independent evaluations using public data. Apart from accuracy, we discuss the quality of error bars that can be computed by Gaussian Process models, and ensemble and distance-based techniques for the other modelling approaches
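
    A minimal sketch of the error-bar idea using a Gaussian Process regressor in scikit-learn: the predictive standard deviation supplies a per-compound uncertainty estimate. Descriptors and logD values below are synthetic; the study's actual descriptors, kernels and data are not reproduced here.

```python
# Gaussian Process regression with per-compound error bars (predictive std).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(4)
X = rng.normal(size=(200, 5))                       # 5 synthetic molecular descriptors
logd = X[:, 0] - 0.5 * X[:, 1] + rng.normal(0, 0.2, 200)

gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gp.fit(X[:150], logd[:150])

mean, std = gp.predict(X[150:], return_std=True)    # std supplies the error bar
for m, s, obs in list(zip(mean, std, logd[150:]))[:3]:
    print(f"predicted logD = {m:.2f} +/- {1.96 * s:.2f} (observed {obs:.2f})")
```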

  11. Trophic diversity of Poznań Lakeland lakes

    Directory of Open Access Journals (Sweden)

    Dzieszko Piotr

    2015-06-01

    Full Text Available The main goal of the presented work is to determine the current trophic state of 31 lakes located in Poznań Lakeland. These lakes are included in the lake monitoring programme executed by the Voivodship Environmental Protection Inspectorate in Poznań. The position of the investigated lakes in the trophic classification was determined, as well as the relationships between their trophic state indices. The trophic state of the investigated lakes in the research area is poor: more than half of them are eutrophic. Depending on the factor taken into account, the trophic state assigned to an individual lake can differ radically.
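
    For readers unfamiliar with trophic state indices, the sketch below computes the commonly cited Carlson (1977) indices from Secchi depth, chlorophyll-a and total phosphorus. The exact index definitions and thresholds used in the Poznań Lakeland study may differ, so treat these formulas as illustrative.

```python
# Carlson (1977) trophic state indices; values above roughly 50 suggest eutrophy.
# Coefficients are quoted from the commonly cited formulas and may need checking.
import math

def tsi_secchi(sd_m):          # Secchi depth in metres
    return 60.0 - 14.41 * math.log(sd_m)

def tsi_chlorophyll(chl_ugl):  # chlorophyll-a in micrograms per litre
    return 9.81 * math.log(chl_ugl) + 30.6

def tsi_total_p(tp_ugl):       # total phosphorus in micrograms per litre
    return 14.42 * math.log(tp_ugl) + 4.15

# Example lake: 1.5 m Secchi depth, 12 ug/L chlorophyll-a, 40 ug/L total phosphorus
print(round(tsi_secchi(1.5)), round(tsi_chlorophyll(12)), round(tsi_total_p(40)))
```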

  12. Influence of dispersants on trophic transfer of petroleum hydrocarbons in a marine food chain

    International Nuclear Information System (INIS)

    Wolfe, M. F.; Schwartz, G. J. B.; Singaram, S.; Tjeerdema, R. S.

    1997-01-01

    Experiments were conducted to determine the impact of dispersing agents on petroleum hydrocarbon (PH) bioavailability and trophic transfer in primary levels of a marine food chain. Uptake, bioaccumulation and metabolic transformation of a model PH, (14C)naphthalene, were measured and compared with Prudhoe Bay Crude Oil (PBCO) dispersed with Corexit 9527, and undispersed preparations of PBCO. The model food chain consisted of a primary algae producer and a primary rotifer consumer. Results showed that uptake of naphthalene by algae increased significantly in the presence of a dispersant. A significant increase in uptake was also recorded in rotifers via trophic transfer. Trophic transfer played a significant, sometimes even dominant, role in uptake and bioaccumulation. 27 refs., 6 figs

  13. Prediction models for successful external cephalic version: a systematic review.

    Science.gov (United States)

    Velzel, Joost; de Hundt, Marcella; Mulder, Frederique M; Molkenboer, Jan F M; Van der Post, Joris A M; Mol, Ben W; Kok, Marjolein

    2015-12-01

    To provide an overview of existing prediction models for successful ECV, and to assess their quality, development and performance. We searched MEDLINE, EMBASE and the Cochrane Library to identify all articles reporting on prediction models for successful ECV published from inception to January 2015. We extracted information on study design, sample size, model-building strategies and validation. We evaluated the phases of model development and summarized their performance in terms of discrimination, calibration and clinical usefulness. We collected different predictor variables together with their defined significance, in order to identify important predictor variables for successful ECV. We identified eight articles reporting on seven prediction models. All models were subjected to internal validation. Only one model was also validated in an external cohort. Two prediction models had a low overall risk of bias, of which only one showed promising predictive performance at internal validation. This model also completed the phase of external validation. For none of the models was the impact on clinical practice evaluated. The most important predictor variables for successful ECV described in the selected articles were parity, placental location, breech engagement and the fetal head being palpable. One model was assessed for discrimination and calibration using both internal (AUC 0.71) and external (AUC 0.64) validation, while two other models were assessed with discrimination and calibration, respectively. We found one prediction model for breech presentation that was validated in an external cohort and had acceptable predictive performance. This model should be used to counsel women considering ECV. Copyright © 2015. Published by Elsevier Ireland Ltd.

  14. Risk Prediction Model for Severe Postoperative Complication in Bariatric Surgery.

    Science.gov (United States)

    Stenberg, Erik; Cao, Yang; Szabo, Eva; Näslund, Erik; Näslund, Ingmar; Ottosson, Johan

    2018-01-12

    Factors associated with risk for adverse outcome are important considerations in the preoperative assessment of patients for bariatric surgery. As yet, prediction models based on preoperative risk factors have not been able to predict adverse outcome sufficiently. This study aimed to identify preoperative risk factors and to construct a risk prediction model based on these. Patients who underwent a bariatric surgical procedure in Sweden between 2010 and 2014 were identified from the Scandinavian Obesity Surgery Registry (SOReg). Associations between preoperative potential risk factors and severe postoperative complications were analysed using a logistic regression model. A multivariate model for risk prediction was created and validated in the SOReg for patients who underwent bariatric surgery in Sweden, 2015. Revision surgery (standardized OR 1.19, 95% confidence interval (CI) 1.14-1.24, p [...] prediction model. Despite high specificity, the sensitivity of the model was low. Revision surgery, high age, low BMI, large waist circumference, and dyspepsia/GERD were associated with an increased risk for severe postoperative complication. The prediction model based on these factors, however, had a sensitivity that was too low to predict risk in the individual patient case.
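
    A hedged sketch of the modelling step described above: a logistic regression risk model evaluated with AUC, sensitivity and specificity. The predictors, coefficients and threshold are synthetic illustrations, not SOReg data; the point is only to show why a rare outcome can yield high specificity but low sensitivity.

```python
# Logistic regression risk model on synthetic preoperative data, evaluated with
# AUC; a threshold tuned for high specificity leaves sensitivity low when the
# outcome is rare, mirroring the limitation noted in the abstract.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
n = 5000
revision = rng.integers(0, 2, n)
age = rng.normal(42, 11, n)
bmi = rng.normal(42, 6, n)
waist = rng.normal(125, 15, n)
X = np.column_stack([revision, age, bmi, waist])

logit = -4.5 + 0.9 * revision + 0.04 * (age - 42) - 0.05 * (bmi - 42) + 0.02 * (waist - 125)
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))        # rare severe complication

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
prob = LogisticRegression(max_iter=1000).fit(X_tr, y_tr).predict_proba(X_te)[:, 1]
print("AUC:", round(roc_auc_score(y_te, prob), 3))

pred = prob > 0.10                                   # high-specificity threshold
sens = (pred & (y_te == 1)).mean() / (y_te == 1).mean()
spec = (~pred & (y_te == 0)).mean() / (y_te == 0).mean()
print(f"sensitivity {sens:.2f}, specificity {spec:.2f}")
```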

  15. AN EFFICIENT PATIENT INFLOW PREDICTION MODEL FOR HOSPITAL RESOURCE MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Kottalanka Srikanth

    2017-07-01

    Full Text Available There has been increasing demand for improved service provisioning in hospital resource management. Hospitals work under strict budget constraints while at the same time assuring quality care. To achieve quality care under budget constraints, an efficient prediction model is required. Recently, various time-series-based prediction models have been proposed to manage hospital resources such as ambulance monitoring, emergency care and so on. These models are not efficient because they do not consider the nature of the scenario, such as climate conditions. To address this, artificial intelligence is adopted. The issue with existing prediction approaches is that training suffers from local optima error, which induces overhead and affects prediction accuracy. To overcome the local minima error, this work presents a patient inflow prediction model that adopts a resilient backpropagation neural network. Experiments were conducted to evaluate the performance of the proposed model in terms of RMSE and MAPE. The outcome shows that the proposed model reduces RMSE and MAPE compared with an existing backpropagation-based artificial neural network. The overall outcomes show that the proposed prediction model improves prediction accuracy, which aids in improving the quality of health care management.

  16. Prediction Model for Gastric Cancer Incidence in Korean Population.

    Directory of Open Access Journals (Sweden)

    Bang Wool Eom

    Full Text Available Predicting high risk groups for gastric cancer and motivating these groups to receive regular checkups is required for the early detection of gastric cancer. The aim of this study was to develop a prediction model for gastric cancer incidence based on a large population-based cohort in Korea. Based on the National Health Insurance Corporation data, we analyzed 10 major risk factors for gastric cancer. The Cox proportional hazards model was used to develop gender-specific prediction models for gastric cancer development, and the performance of the developed model in terms of discrimination and calibration was also validated using an independent cohort. Discrimination ability was evaluated using Harrell's C-statistics, and the calibration was evaluated using a calibration plot and slope. During a median of 11.4 years of follow-up, 19,465 (1.4%) and 5,579 (0.7%) newly developed gastric cancer cases were observed among 1,372,424 men and 804,077 women, respectively. The prediction models included age, BMI, family history, meal regularity, salt preference, alcohol consumption, smoking and physical activity for men, and age, BMI, family history, salt preference, alcohol consumption, and smoking for women. This prediction model showed good accuracy and predictability in both the development and validation cohorts (C-statistics: 0.764 for men, 0.706 for women). In this study, a prediction model for gastric cancer incidence was developed that displayed a good performance.
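
    A minimal sketch of a Cox proportional hazards risk model with Harrell's C, assuming the lifelines package and synthetic data; the variables shown are a subset of those named in the abstract and the simulated hazards are invented.

```python
# Cox proportional hazards model with lifelines on synthetic cohort data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(6)
n = 10_000
df = pd.DataFrame({
    "age": rng.normal(52, 9, n),
    "bmi": rng.normal(24, 3, n),
    "family_history": rng.integers(0, 2, n),
    "smoking": rng.integers(0, 2, n),
})
hazard = 0.0005 * np.exp(0.04 * (df.age - 52) + 0.5 * df.family_history + 0.3 * df.smoking)
df["time"] = rng.exponential(1 / hazard)              # years until (possible) event
df["event"] = (df["time"] < 11.4).astype(int)         # follow-up ends at 11.4 years
df.loc[df["event"] == 0, "time"] = 11.4               # administrative censoring

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
print(cph.summary[["coef", "exp(coef)"]])
print("Harrell's C:", round(cph.concordance_index_, 3))
```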

  17. Stage-specific predictive models for breast cancer survivability.

    Science.gov (United States)

    Kate, Rohit J; Nadig, Ramya

    2017-01-01

    Survivability rates vary widely among various stages of breast cancer. Although machine learning models built in the past to predict breast cancer survivability were given stage as one of the features, they were not trained or evaluated separately for each stage. To investigate whether there are differences in performance of machine learning models trained and evaluated across different stages for predicting breast cancer survivability. Using three different machine learning methods we built models to predict breast cancer survivability separately for each stage and compared them with the traditional joint models built for all the stages. We also evaluated the models separately for each stage and together for all the stages. Our results show that the most suitable model to predict survivability for a specific stage is the model trained for that particular stage. In our experiments, using additional examples of other stages during training did not help; in fact, it made performance worse in some cases. The most important features for predicting survivability were also found to be different for different stages. By evaluating the models separately on different stages we found that the performance varied widely across them. We also demonstrate that evaluating predictive models for survivability on all the stages together, as was done in the past, is misleading because it overestimates performance. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  18. Predictive modeling of pedestal structure in KSTAR using EPED model

    Energy Technology Data Exchange (ETDEWEB)

    Han, Hyunsun; Kim, J. Y. [National Fusion Research Institute, Daejeon 305-806 (Korea, Republic of); Kwon, Ohjin [Department of Physics, Daegu University, Gyeongbuk 712-714 (Korea, Republic of)

    2013-10-15

    A predictive calculation is given for the structure of edge pedestal in the H-mode plasma of the KSTAR (Korea Superconducting Tokamak Advanced Research) device using the EPED model. Particularly, the dependence of pedestal width and height on various plasma parameters is studied in detail. The two codes, ELITE and HELENA, are utilized for the stability analysis of the peeling-ballooning and kinetic ballooning modes, respectively. Summarizing the main results, the pedestal slope and height have a strong dependence on plasma current, rapidly increasing with it, while the pedestal width is almost independent of it. The plasma density or collisionality gives initially a mild stabilization, increasing the pedestal slope and height, but above some threshold value its effect turns to a destabilization, reducing the pedestal width and height. Among several plasma shape parameters, the triangularity gives the most dominant effect, rapidly increasing the pedestal width and height, while the effect of elongation and squareness appears to be relatively weak. Implication of these edge results, particularly in relation to the global plasma performance, is discussed.

  19. Model predictions for auxiliary heating in spheromaks

    International Nuclear Information System (INIS)

    Fauler, T.K.; Khua, D.D.

    1997-01-01

    Calculations are presented of the plasma temperatures expected under auxiliary heating in spheromaks. A model that gave good agreement with earlier Joule-heating experiments is used. The model includes heat losses due to magnetic fluctuations and shows that plasma temperatures of the order of a kilo-electron-volt may be achieved in a small device with a radius of only 0.3 m.

  20. Validating predictions from climate envelope models.

    Directory of Open Access Journals (Sweden)

    James I Watling

    Full Text Available Climate envelope models are a potentially important conservation tool, but their ability to accurately forecast species' distributional shifts using independent survey data has not been fully evaluated. We created climate envelope models for 12 species of North American breeding birds previously shown to have experienced poleward range shifts. For each species, we evaluated three different approaches to climate envelope modeling that differed in the way they treated climate-induced range expansion and contraction, using random forests and maximum entropy modeling algorithms. All models were calibrated using occurrence data from 1967-1971 (t1 and evaluated using occurrence data from 1998-2002 (t2. Model sensitivity (the ability to correctly classify species presences was greater using the maximum entropy algorithm than the random forest algorithm. Although sensitivity did not differ significantly among approaches, for many species, sensitivity was maximized using a hybrid approach that assumed range expansion, but not contraction, in t2. Species for which the hybrid approach resulted in the greatest improvement in sensitivity have been reported from more land cover types than species for which there was little difference in sensitivity between hybrid and dynamic approaches, suggesting that habitat generalists may be buffered somewhat against climate-induced range contractions. Specificity (the ability to correctly classify species absences was maximized using the random forest algorithm and was lowest using the hybrid approach. Overall, our results suggest cautious optimism for the use of climate envelope models to forecast range shifts, but also underscore the importance of considering non-climate drivers of species range limits. The use of alternative climate envelope models that make different assumptions about range expansion and contraction is a new and potentially useful way to help inform our understanding of climate change effects on

  1. Validating predictions from climate envelope models

    Science.gov (United States)

    Watling, J.; Bucklin, D.; Speroterra, C.; Brandt, L.; Cabal, C.; Romañach, Stephanie S.; Mazzotti, Frank J.

    2013-01-01

    Climate envelope models are a potentially important conservation tool, but their ability to accurately forecast species’ distributional shifts using independent survey data has not been fully evaluated. We created climate envelope models for 12 species of North American breeding birds previously shown to have experienced poleward range shifts. For each species, we evaluated three different approaches to climate envelope modeling that differed in the way they treated climate-induced range expansion and contraction, using random forests and maximum entropy modeling algorithms. All models were calibrated using occurrence data from 1967–1971 (t1) and evaluated using occurrence data from 1998–2002 (t2). Model sensitivity (the ability to correctly classify species presences) was greater using the maximum entropy algorithm than the random forest algorithm. Although sensitivity did not differ significantly among approaches, for many species, sensitivity was maximized using a hybrid approach that assumed range expansion, but not contraction, in t2. Species for which the hybrid approach resulted in the greatest improvement in sensitivity have been reported from more land cover types than species for which there was little difference in sensitivity between hybrid and dynamic approaches, suggesting that habitat generalists may be buffered somewhat against climate-induced range contractions. Specificity (the ability to correctly classify species absences) was maximized using the random forest algorithm and was lowest using the hybrid approach. Overall, our results suggest cautious optimism for the use of climate envelope models to forecast range shifts, but also underscore the importance of considering non-climate drivers of species range limits. The use of alternative climate envelope models that make different assumptions about range expansion and contraction is a new and potentially useful way to help inform our understanding of climate change effects on species.
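
    A hedged sketch of the calibrate-at-t1, evaluate-at-t2 workflow with a random forest, using an invented climatic envelope and a synthetic warming shift; sensitivity and specificity are computed as in the abstract. Real climate envelope studies use observed occurrences and many more predictors.

```python
# Calibrate a presence/absence classifier on t1 climate, evaluate on t2 climate.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(7)

def presence(temp, precip, shift=0.0):
    # Invented climatic window; `shift` mimics the species tracking a warmer climate.
    return ((temp > 8 + shift) & (temp < 16 + shift) & (precip > 600)).astype(int)

temp_t1 = rng.normal(12, 4, 3000)
precip = rng.normal(800, 200, 3000)
temp_t2 = temp_t1 + 1.0                               # uniformly warmer at t2

X_t1 = np.column_stack([temp_t1, precip])
X_t2 = np.column_stack([temp_t2, precip])
y_t1 = presence(temp_t1, precip)
y_t2 = presence(temp_t2, precip, shift=1.0)

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_t1, y_t1)
pred = model.predict(X_t2)
sensitivity = (pred[y_t2 == 1] == 1).mean()           # correctly classified presences
specificity = (pred[y_t2 == 0] == 0).mean()           # correctly classified absences
print(f"sensitivity {sensitivity:.2f}, specificity {specificity:.2f}")
```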

  2. Trophic specialization drives morphological evolution in sea snakes.

    Science.gov (United States)

    Sherratt, Emma; Rasmussen, Arne R; Sanders, Kate L

    2018-03-01

    Viviparous sea snakes are the most rapidly speciating reptiles known, yet the ecological factors underlying this radiation are poorly understood. Here, we reconstructed dated trees for 75% of sea snake species and quantified body shape (forebody relative to hindbody girth), maximum body length and trophic diversity to examine how dietary specialization has influenced morphological diversification in this rapid radiation. We show that sea snake body shape and size are strongly correlated with the proportion of burrowing prey in the diet. Specialist predators of burrowing eels have convergently evolved a 'microcephalic' morphotype with dramatically reduced forebody relative to hindbody girth and intermediate body length. By comparison, snakes that predominantly feed on burrowing gobies are generally short-bodied and small-headed, but there is no evidence of convergent evolution. The eel specialists also exhibit faster rates of size and shape evolution compared to all other sea snakes, including those that feed on gobies. Our results suggest that trophic specialization to particular burrowing prey (eels) has invoked strong selective pressures that manifest as predictable and rapid morphological changes. Further studies are needed to examine the genetic and developmental mechanisms underlying these dramatic morphological changes and assess their role in sea snake speciation.

  3. Evaluation of wave runup predictions from numerical and parametric models

    Science.gov (United States)

    Stockdon, Hilary F.; Thompson, David M.; Plant, Nathaniel G.; Long, Joseph W.

    2014-01-01

    Wave runup during storms is a primary driver of coastal evolution, including shoreline and dune erosion and barrier island overwash. Runup and its components, setup and swash, can be predicted from a parameterized model that was developed by comparing runup observations to offshore wave height, wave period, and local beach slope. Because observations during extreme storms are often unavailable, a numerical model is used to simulate the storm-driven runup to compare to the parameterized model and then develop an approach to improve the accuracy of the parameterization. Numerically simulated and parameterized runup were compared to observations to evaluate model accuracies. The analysis demonstrated that setup was accurately predicted by both the parameterized model and numerical simulations. Infragravity swash heights were most accurately predicted by the parameterized model. The numerical model suffered from bias and gain errors that depended on whether a one-dimensional or two-dimensional spatial domain was used. Nonetheless, all of the predictions were significantly correlated to the observations, implying that the systematic errors can be corrected. The numerical simulations did not resolve the incident-band swash motions, as expected, and the parameterized model performed best at predicting incident-band swash heights. An assimilated prediction using a weighted average of the parameterized model and the numerical simulations resulted in a reduction in prediction error variance. Finally, the numerical simulations were extended to include storm conditions that have not been previously observed. These results indicated that the parameterized predictions of setup may need modification for extreme conditions; numerical simulations can be used to extend the validity of the parameterized predictions of infragravity swash; and numerical simulations systematically underpredict incident swash, which is relatively unimportant under extreme conditions.
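
    The parameterized runup model referred to above is usually written in terms of offshore wave height, wave period (via deep-water wavelength) and foreshore beach slope. The sketch below uses the widely quoted coefficients attributed to Stockdon et al. (2006); they are reproduced from memory and should be checked against the original paper before any real use.

```python
# Hedged sketch of a parameterized 2% runup exceedance estimate from offshore
# wave height H0 (m), peak period Tp (s) and foreshore slope (dimensionless).
# Coefficients follow the widely quoted Stockdon et al. (2006) form and should
# be verified against the original paper.
import math

def runup_2pct(H0, Tp, slope):
    L0 = 9.81 * Tp ** 2 / (2 * math.pi)               # deep-water wavelength
    setup = 0.35 * slope * math.sqrt(H0 * L0)
    swash = math.sqrt(H0 * L0 * (0.563 * slope ** 2 + 0.004)) / 2
    return 1.1 * (setup + swash)

# Example: 3 m, 12 s storm waves on a 1:20 (0.05) foreshore slope
print(round(runup_2pct(3.0, 12.0, 0.05), 2), "m")
```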

  4. A neighborhood statistics model for predicting stream pathogen indicator levels.

    Science.gov (United States)

    Pandey, Pramod K; Pasternack, Gregory B; Majumder, Mahbubul; Soupir, Michelle L; Kaiser, Mark S

    2015-03-01

    Because elevated levels of water-borne Escherichia coli in streams are a leading cause of water quality impairments in the U.S., water-quality managers need tools for predicting aqueous E. coli levels. Presently, E. coli levels may be predicted using complex mechanistic models that have a high degree of unchecked uncertainty or simpler statistical models. To assess spatio-temporal patterns of instream E. coli levels, herein we measured E. coli, a pathogen indicator, at 16 sites (at four different times) within the Squaw Creek watershed, Iowa, and subsequently, the Markov Random Field model was exploited to develop a neighborhood statistics model for predicting instream E. coli levels. Two observed covariates, local water temperature (degrees Celsius) and mean cross-sectional depth (meters), were used as inputs to the model. Predictions of E. coli levels in the water column were compared with independent observational data collected from 16 in-stream locations. The results revealed that spatio-temporal averages of predicted and observed E. coli levels were extremely close. Approximately 66 % of individual predicted E. coli concentrations were within a factor of 2 of the observed values. In only one event, the difference between prediction and observation was beyond one order of magnitude. The mean of all predicted values at 16 locations was approximately 1 % higher than the mean of the observed values. The approach presented here will be useful while assessing instream contaminations such as pathogen/pathogen indicator levels at the watershed scale.

  5. Prediction skill of rainstorm events over India in the TIGGE weather prediction models

    Science.gov (United States)

    Karuna Sagar, S.; Rajeevan, M.; Vijaya Bhaskara Rao, S.; Mitra, A. K.

    2017-12-01

    Extreme rainfall events pose a serious threat of severe floods in many countries worldwide. Therefore, advance prediction of their occurrence and spatial distribution is essential. In this paper, an analysis has been made to assess the skill of numerical weather prediction models in predicting rainstorms over India. Using a gridded daily rainfall data set and objective criteria, 15 rainstorms were identified during the monsoon season (June to September). The analysis was made using three TIGGE (THe Observing System Research and Predictability Experiment (THORPEX) Interactive Grand Global Ensemble) models. The models considered are the European Centre for Medium-Range Weather Forecasts (ECMWF), National Centre for Environmental Prediction (NCEP) and the UK Met Office (UKMO). Verification of the TIGGE models for 43 observed rainstorm days from 15 rainstorm events has been made for the period 2007-2015. The comparison reveals that rainstorm events are predictable up to 5 days in advance, albeit with a bias in spatial distribution and intensity. Statistical parameters such as the mean error (ME) or bias, root mean square error (RMSE) and correlation coefficient (CC) have been computed over the rainstorm region using the multi-model ensemble (MME) mean. The study reveals that the spread is large in ECMWF and UKMO, followed by the NCEP model. Though the ensemble spread is quite small in NCEP, the ensemble member averages are not well predicted. The rank histograms suggest that the forecasts tend to under-predict. The modified Contiguous Rain Area (CRA) technique was used to verify the spatial as well as the quantitative skill of the TIGGE models. Overall, the contribution from the displacement and pattern errors to the total RMSE is found to be larger in magnitude. The volume error increases from the 24 h forecast to the 48 h forecast in all three models.
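
    A minimal sketch of the grid-point verification scores named above (mean error or bias, RMSE and correlation coefficient), applied to a synthetic forecast/observation pair; the CRA decomposition into displacement, volume and pattern errors is not reproduced here.

```python
# Grid-point verification scores: mean error (bias), RMSE and correlation.
import numpy as np

def verification_scores(forecast, observed):
    diff = forecast - observed
    return {
        "ME": diff.mean(),
        "RMSE": np.sqrt((diff ** 2).mean()),
        "CC": np.corrcoef(forecast.ravel(), observed.ravel())[0, 1],
    }

rng = np.random.default_rng(8)
obs = rng.gamma(2.0, 20.0, size=(50, 50))      # synthetic rainstorm field (mm/day)
fc = 0.8 * obs + rng.normal(0, 10, obs.shape)  # a biased, noisy "forecast"
print(verification_scores(fc, obs))
```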

  6. Ecosystem structure and trophic analysis of Angolan fishery landings

    Directory of Open Access Journals (Sweden)

    Ronaldo Angelini

    2011-06-01

    Full Text Available Information on the mean trophic level of fishery landings in Angola and the output from a preliminary Ecopath with Ecosim (EwE) model were used to examine the dynamics of the Angolan marine ecosystem. Results were compared with the nearby Namibian and South African ecosystems, which share some of the exploited fish populations. The results show that: (i) The mean trophic level of Angola’s fish landings has not decreased over the years; (ii) There are significant correlations between the landings of Angola, Namibia and South Africa; (iii) The ecosystem attributes calculated by the EwE models for the three ecosystems were similar, and the main differences were related to the magnitude of flows and biomass; (iv) The similarity among ecosystem trends for Namibia, South Africa and Angola re-emphasizes the need to continue collaborative regional studies on the fish stocks and their ecosystems. To improve the Angolan model it is necessary to gain a better understanding of plankton dynamics because plankton are essential for Sardinella spp. An expanded analysis of the gut contents of the fish species occupying Angola’s coastline is also necessary.
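
    The mean trophic level of landings used above is a catch-weighted average of the trophic levels of the landed groups. The sketch below shows the calculation with invented trophic levels and tonnages; it is not Angolan landings data.

```python
# Mean trophic level of landings: catch-weighted average of group trophic levels.
import numpy as np

trophic_level = np.array([2.8, 3.2, 4.1, 4.4])            # invented landed groups
landings_t = np.array([120_000, 40_000, 15_000, 5_000])   # tonnes landed per group

mtl = (trophic_level * landings_t).sum() / landings_t.sum()
print(round(mtl, 2))   # a decline of this value over years signals "fishing down"
```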

  7. Preclinical models used for immunogenicity prediction of therapeutic proteins.

    Science.gov (United States)

    Brinks, Vera; Weinbuch, Daniel; Baker, Matthew; Dean, Yann; Stas, Philippe; Kostense, Stefan; Rup, Bonita; Jiskoot, Wim

    2013-07-01

    All therapeutic proteins are potentially immunogenic. Antibodies formed against these drugs can decrease efficacy, leading to drastically increased therapeutic costs and in rare cases to serious and sometimes life threatening side-effects. Many efforts are therefore undertaken to develop therapeutic proteins with minimal immunogenicity. For this, immunogenicity prediction of candidate drugs during early drug development is essential. Several in silico, in vitro and in vivo models are used to predict immunogenicity of drug leads, to modify potentially immunogenic properties and to continue development of drug candidates with expected low immunogenicity. Despite the extensive use of these predictive models, their actual predictive value varies. Important reasons for this uncertainty are the limited/insufficient knowledge on the immune mechanisms underlying immunogenicity of therapeutic proteins, the fact that different predictive models explore different components of the immune system and the lack of an integrated clinical validation. In this review, we discuss the predictive models in use, summarize aspects of immunogenicity that these models predict and explore the merits and the limitations of each of the models.

  8. Development of Interpretable Predictive Models for BPH and Prostate Cancer.

    Science.gov (United States)

    Bermejo, Pablo; Vivo, Alicia; Tárraga, Pedro J; Rodríguez-Montes, J A

    2015-01-01

    Traditional methods for deciding whether to recommend a patient for a prostate biopsy are based on cut-off levels of stand-alone markers such as prostate-specific antigen (PSA) or any of its derivatives. However, in the last decade we have seen the increasing use of predictive models that combine, in a non-linear manner, several predictors that are better able to predict prostate cancer (PC), but these fail to help the clinician to distinguish between PC and benign prostate hyperplasia (BPH) patients. We construct two new models that are capable of predicting both PC and BPH. An observational study was performed on 150 patients with PSA ≥3 ng/mL and age >50 years. We built a decision tree and a logistic regression model, validated with the leave-one-out methodology, in order to predict PC or BPH, or reject both. Statistical dependence with PC and BPH was found for prostate volume (P-value [...] BPH prediction. PSA and volume together help to build predictive models that accurately distinguish among PC, BPH, and patients without any of these pathologies. Our decision tree and logistic regression models exceed the AUC obtained in the compared studies. Using these models as decision support, the number of unnecessary biopsies might be significantly reduced.

  9. Predicting Footbridge Response using Stochastic Load Models

    DEFF Research Database (Denmark)

    Pedersen, Lars; Frier, Christian

    2013-01-01

    Walking parameters such as step frequency, pedestrian mass, dynamic load factor, etc. are basically stochastic, although it is quite common to adopt deterministic models for these parameters. The present paper considers a stochastic approach to modeling the action of pedestrians; when doing so, decisions need to be made in terms of the statistical distributions of walking parameters and in terms of the parameters describing those statistical distributions. The paper explores how sensitive computations of bridge response are to some of the decisions to be made in this respect. This is useful
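
    A minimal Monte Carlo sketch of the stochastic treatment described above: walking parameters are drawn from assumed distributions and propagated to the amplitude of the first load harmonic and to the share of pedestrians pacing near an assumed bridge natural frequency. The distributions and numbers are illustrative, not those of the paper.

```python
# Monte Carlo over walking parameters drawn from assumed distributions.
import numpy as np

rng = np.random.default_rng(9)
n = 100_000
step_freq = rng.normal(1.87, 0.19, n)                      # pacing rate, Hz
mass = rng.lognormal(mean=np.log(75), sigma=0.15, size=n)  # pedestrian mass, kg
dlf = rng.normal(0.4, 0.1, n).clip(min=0.05)               # first-harmonic load factor

amplitude = dlf * mass * 9.81                              # harmonic force amplitude, N
print("mean amplitude %.0f N, 95th percentile %.0f N"
      % (amplitude.mean(), np.percentile(amplitude, 95)))

bridge_freq = 2.0                                          # assumed natural frequency, Hz
near_resonance = np.abs(step_freq - bridge_freq) < 0.1
print("share of pedestrians pacing near resonance: %.1f%%" % (100 * near_resonance.mean()))
```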

  10. A COMPARISON BETWEEN THREE PREDICTIVE MODELS OF COMPUTATIONAL INTELLIGENCE

    Directory of Open Access Journals (Sweden)

    DUMITRU CIOBANU

    2013-12-01

    Full Text Available Time series prediction is an open problem, and many researchers are trying to find new predictive methods and improvements to the existing ones. Lately, methods based on neural networks have been used extensively for time series prediction. Support vector machines have also solved some of the problems faced by neural networks and have begun to be widely used for time series prediction. The main drawback of those two methods is that they are global models, and in the case of a chaotic time series it is unlikely that such a model can be found. This paper presents a comparison between three predictive models from the computational intelligence field: one based on neural networks, one based on support vector machines, and one based on chaos theory. We show that the model based on chaos theory is an alternative to the other two methods.

  11. A biogeochemical model of Lake Pusiano (North Italy) and its use in the predictability of phytoplankton blooms: first preliminary results

    Directory of Open Access Journals (Sweden)

    Alessandro OGGIONI

    2006-02-01

    Full Text Available This study reports the first preliminary results of the DYRESM-CAEDYM model application to a mid-size sub-alpine lake (Lake Pusiano, North Italy). The in-lake modelling is part of a more general project called the Pusiano Integrated Lake/Catchment project (PILE), whose final goal is to understand the hydrological and trophic relationship between lake and catchment, supporting the restoration plan of the lake through field data analysis and numerical models. DYRESM is a 1D-3D hydrodynamics model for predicting the vertical profile of temperature, salinity and density. CAEDYM is a multi-component ecological model, used here as a process-based phytoplankton-zooplankton model, which includes algorithms to simulate the nutrient cycles within the water column as well as the air-water gas exchanges and the water-sediment fluxes. The first results of the hydrodynamic simulations underline the capability of the model to accurately simulate the surface temperature seasonal trend and the thermal gradient whereas, during summer stratification, the model underestimates the bottom temperature by around 2 °C. The ecological model describes the epilimnetic reactive phosphorus (PO4) depletion (due to the phytoplankton uptake) and the increase in PO4 concentrations in the deepest layers of the lake (due to the mineralization processes and the sediment release). In terms of phytoplankton dynamics the model accounts for the Planktothrix rubescens dominance during the whole season, whereas it seems to underestimate the peak in primary production related to both the simulated algal groups (P. rubescens and the rest of the other species aggregated in a single class). The future aims of the project are to complete the model parameterization and to connect the in-lake and the catchment modelling in order to gain an integrated view of the lake-catchment ecosystem as well as to develop a three-dimensional model of the lake.

  12. Predicting Market Impact Costs Using Nonparametric Machine Learning Models.

    Directory of Open Access Journals (Sweden)

    Saerom Park

    Full Text Available Market impact cost is the most significant portion of implicit transaction costs that can reduce the overall transaction cost, although it cannot be measured directly. In this paper, we employed state-of-the-art nonparametric machine learning models: neural networks, Bayesian neural networks, Gaussian processes, and support vector regression, to predict market impact cost accurately and to provide a predictive model that is versatile in the number of variables. We collected a large amount of real single-transaction data of the US stock market from the Bloomberg Terminal and generated three independent input variables. As a result, most nonparametric machine learning models outperformed a state-of-the-art benchmark parametric model, the I-star model, in four error measures. Although these models encounter certain difficulties in separating the permanent and temporary cost directly, nonparametric machine learning models can be good alternatives for reducing transaction costs by considerably improving prediction performance.
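
    As a loose illustration of the comparison described above, the sketch below fits a Gaussian process, a support vector regression and a small neural network to synthetic data and reports two error measures. The three input variables, the synthetic impact function and the model settings are assumptions for demonstration only; the I-star benchmark and the Bloomberg data are not reproduced.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.svm import SVR
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error, mean_squared_error

rng = np.random.default_rng(1)
n = 500
# Hypothetical inputs: relative order size, volatility, inverse turnover.
X = rng.random((n, 3))
# Synthetic, concave market impact function plus noise (purely illustrative).
y = 0.5 * X[:, 0] ** 0.6 * X[:, 1] + 0.05 * X[:, 2] + rng.normal(0, 0.02, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)
models = {
    "Gaussian process": GaussianProcessRegressor(),
    "SVR": SVR(C=10.0, epsilon=0.01),
    "neural network": MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=1),
}
for name, m in models.items():
    m.fit(X_tr, y_tr)
    p = m.predict(X_te)
    print(f"{name}: MAE={mean_absolute_error(y_te, p):.4f}  "
          f"RMSE={mean_squared_error(y_te, p) ** 0.5:.4f}")
```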

  13. Predicting Market Impact Costs Using Nonparametric Machine Learning Models.

    Science.gov (United States)

    Park, Saerom; Lee, Jaewook; Son, Youngdoo

    2016-01-01

    Market impact cost is the most significant portion of implicit transaction costs that can reduce the overall transaction cost, although it cannot be measured directly. In this paper, we employed state-of-the-art nonparametric machine learning models: neural networks, Bayesian neural networks, Gaussian processes, and support vector regression, to predict market impact cost accurately and to provide a predictive model that is versatile in the number of variables. We collected a large amount of real single-transaction data of the US stock market from the Bloomberg Terminal and generated three independent input variables. As a result, most nonparametric machine learning models outperformed a state-of-the-art benchmark parametric model, the I-star model, in four error measures. Although these models encounter certain difficulties in separating the permanent and temporary cost directly, nonparametric machine learning models can be good alternatives for reducing transaction costs by considerably improving prediction performance.

  14. Trophic interactions between native and introduced fish species in a littoral fish community.

    Science.gov (United States)

    Monroy, M; Maceda-Veiga, A; Caiola, N; De Sostoa, A

    2014-11-01

    The trophic interactions between 15 native and two introduced fish species, silverside Odontesthes bonariensis and rainbow trout Oncorhynchus mykiss, collected in a major fishery area at Lake Titicaca were explored by integrating traditional ecological knowledge and stable-isotope analyses (SIA). SIA suggested the existence of six trophic groups in this fish community based on δ(13)C and δ(15)N signatures. This was supported by ecological evidence illustrating marked spatial segregation between groups, but a similar trophic level for most of the native groups. Based on Bayesian ellipse analyses, niche overlap appeared to occur between small O. bonariensis (<90 mm) and benthopelagic native species (31.6%), and between the native pelagic killifish Orestias ispi and large O. bonariensis (39%) or O. mykiss (19.7%). In addition, Bayesian mixing models suggested that O. ispi and epipelagic species are likely to be the main prey items for the two introduced fish species. This study reveals a trophic link between native and introduced fish species, and demonstrates the utility of combining both SIA and traditional ecological knowledge to understand trophic relationships between fish species with similar feeding habits. © 2014 The Fisheries Society of the British Isles.

  15. A burnout prediction model based around char morphology

    Energy Technology Data Exchange (ETDEWEB)

    Tao Wu; Edward Lester; Michael Cloke [University of Nottingham, Nottingham (United Kingdom). School of Chemical, Environmental and Mining Engineering

    2006-05-15

    Several combustion models have been developed that can make predictions about coal burnout and burnout potential. Most of these kinetic models require standard parameters such as volatile content and particle size to make a burnout prediction. This article presents a new model called the char burnout (ChB) model, which also uses detailed information about char morphology in its prediction. The input data to the model is based on information derived from two different image analysis techniques. One technique generates characterization data from real char samples, and the other predicts char types based on characterization data from image analysis of coal particles. The pyrolyzed chars in this study were created in a drop tube furnace operating at 1300°C, 200 ms, and 1% oxygen. Modeling results were compared with a different carbon burnout kinetic model as well as the actual burnout data from refiring the same chars in a drop tube furnace operating at 1300°C, 5% oxygen, and residence times of 200, 400, and 600 ms. A good agreement between ChB model and experimental data indicates that the inclusion of char morphology in combustion models could well improve model predictions. 38 refs., 5 figs., 6 tabs.

  16. Bayesian Age-Period-Cohort Modeling and Prediction - BAMP

    Directory of Open Access Journals (Sweden)

    Volker J. Schmid

    2007-10-01

    Full Text Available The software package BAMP provides a method of analyzing incidence or mortality data on the Lexis diagram, using a Bayesian version of an age-period-cohort model. A hierarchical model is assumed, with a binomial model at the first stage. Random walks of first and second order, with and without an additional unstructured component, are available as smoothing priors for the age, period and cohort parameters. Unstructured heterogeneity can also be included in the model. In order to evaluate the model fit, posterior deviance, DIC and predictive deviances are computed. By projecting the random-walk prior into the future, future death rates can be predicted.

  17. Modeling for prediction of restrained shrinkage effect in concrete repair

    International Nuclear Information System (INIS)

    Yuan Yingshu; Li Guo; Cai Yue

    2003-01-01

    A general model of autogenous shrinkage caused by chemical reaction (chemical shrinkage) is developed by means of Arrhenius' law and a degree of chemical reaction. Models of tensile creep and relaxation modulus are built based on a viscoelastic, three-element model. Tests of free shrinkage and tensile creep were carried out to determine some coefficients in the models. Two-dimensional FEM analysis based on these models and other constitutive relations can predict the development of tensile strength and cracking. Three groups of patch-repaired beams were designed for analysis and testing. The prediction from the analysis shows agreement with the test results. The cracking mechanism after repair is discussed.

  18. Evaluation of two models for predicting elemental accumulation by arthropods

    International Nuclear Information System (INIS)

    Webster, J.R.; Crossley, D.A. Jr.

    1978-01-01

    Two different models have been proposed for predicting elemental accumulation by arthropods. Parameters of both models can be quantified from radioisotope elimination experiments. Our analysis of the 2 models shows that both predict identical elemental accumulation for a whole organism, though differing in the accumulation in body and gut. We quantified both models with experimental data from ¹³⁴Cs and ⁸⁵Sr elimination by crickets. Computer simulations of radioisotope accumulation were then compared with actual accumulation experiments. Neither model showed exact fit to the experimental data, though both showed the general pattern of elemental accumulation

  19. Uncertainties in model-based outcome predictions for treatment planning

    International Nuclear Information System (INIS)

    Deasy, Joseph O.; Chao, K.S. Clifford; Markman, Jerry

    2001-01-01

    Purpose: Model-based treatment-plan-specific outcome predictions (such as normal tissue complication probability [NTCP] or the relative reduction in salivary function) are typically presented without reference to underlying uncertainties. We provide a method to assess the reliability of treatment-plan-specific dose-volume outcome model predictions. Methods and Materials: A practical method is proposed for evaluating model prediction based on the original input data together with bootstrap-based estimates of parameter uncertainties. The general framework is applicable to continuous variable predictions (e.g., prediction of long-term salivary function) and dichotomous variable predictions (e.g., tumor control probability [TCP] or NTCP). Using bootstrap resampling, a histogram of the likelihood of alternative parameter values is generated. For a given patient and treatment plan we generate a histogram of alternative model results by computing the model predicted outcome for each parameter set in the bootstrap list. Residual uncertainty ('noise') is accounted for by adding a random component to the computed outcome values. The residual noise distribution is estimated from the original fit between model predictions and patient data. Results: The method is demonstrated using a continuous-endpoint model to predict long-term salivary function for head-and-neck cancer patients. Histograms represent the probabilities for the level of posttreatment salivary function based on the input clinical data, the salivary function model, and the three-dimensional dose distribution. For some patients there is significant uncertainty in the prediction of xerostomia, whereas for other patients the predictions are expected to be more reliable. In contrast, TCP and NTCP endpoints are dichotomous, and parameter uncertainties should be folded directly into the estimated probabilities, thereby improving the accuracy of the estimates. Using bootstrap parameter estimates, competing treatment
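
    The bootstrap procedure sketched in this abstract, resampling the original patient data, refitting the outcome model, evaluating each refit on the treatment plan of interest and adding residual noise, can be illustrated with a toy dose-response example. The exponential dose-response form, the synthetic data and the 30 Gy test plan below are assumptions for illustration; they are not the authors' salivary-function model.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical data: post-treatment salivary function (% of baseline) versus
# mean parotid dose (Gy), one observation per patient.
dose = rng.uniform(5, 60, size=80)
outcome = 100 * np.exp(-0.04 * dose) + rng.normal(0, 8, size=80)

def fit(d, y):
    # Fit y = a * exp(b * d) by least squares on log(y); clip protects against y <= 0.
    b, log_a = np.polyfit(d, np.log(np.clip(y, 1e-3, None)), 1)
    return np.exp(log_a), b

a_hat, b_hat = fit(dose, outcome)
resid_sd = np.std(outcome - a_hat * np.exp(b_hat * dose))  # residual "noise" estimate

# Bootstrap: refit on resampled patients, predict for a new plan with mean dose 30 Gy,
# and add residual noise to represent irreducible patient-to-patient variation.
new_dose, preds = 30.0, []
for _ in range(2000):
    idx = rng.integers(0, len(dose), len(dose))
    a, b = fit(dose[idx], outcome[idx])
    preds.append(a * np.exp(b * new_dose) + rng.normal(0, resid_sd))

lo, hi = np.percentile(preds, [2.5, 97.5])
print(f"predicted salivary function at 30 Gy: {np.mean(preds):.1f}% "
      f"(95% interval {lo:.1f} to {hi:.1f}%)")
```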

  20. Geospatial application of the Water Erosion Prediction Project (WEPP) Model

    Science.gov (United States)

    D. C. Flanagan; J. R. Frankenberger; T. A. Cochrane; C. S. Renschler; W. J. Elliot

    2011-01-01

    The Water Erosion Prediction Project (WEPP) model is a process-based technology for prediction of soil erosion by water at hillslope profile, field, and small watershed scales. In particular, WEPP utilizes observed or generated daily climate inputs to drive the surface hydrology processes (infiltration, runoff, ET) component, which subsequently impacts the rest of the...

  1. Techniques for discrimination-free predictive models (Chapter 12)

    NARCIS (Netherlands)

    Kamiran, F.; Calders, T.G.K.; Pechenizkiy, M.; Custers, B.H.M.; Calders, T.G.K.; Schermer, B.W.; Zarsky, T.Z.

    2013-01-01

    In this chapter, we give an overview of the techniques developed ourselves for constructing discrimination-free classifiers. In discrimination-free classification the goal is to learn a predictive model that classifies future data objects as accurately as possible, yet the predicted labels should be

  2. A model to predict the beginning of the pollen season

    DEFF Research Database (Denmark)

    Toldam-Andersen, Torben Bo

    1991-01-01

    for fruit trees are generally applicable, and give a reasonable description of the growth processes of other trees. This type of model can therefore be of value in predicting the start of the pollen season. The predicted dates were generally within 3-5 days of the observed. Finally the possibility of frost...

  3. Statistical models to predict flows at monthly level in Salvajina

    International Nuclear Information System (INIS)

    Gonzalez, Harold O

    1994-01-01

    Linear regression models are proposed and evaluated at the monthly level to predict flows at Salvajina, based on predictor variables such as the pressure difference between Darwin and Tahiti, precipitation in Piendamó (Cauca), temperature in Puerto Chicama (Peru), and pressure in Tahiti

  4. Validation of a tuber blight (Phytophthora infestans) prediction model

    Science.gov (United States)

    Potato tuber blight caused by Phytophthora infestans accounts for significant losses in storage. There is limited published quantitative data on predicting tuber blight. We validated a tuber blight prediction model developed in New York with cultivars Allegany, NY 101, and Katahdin using independent...

  5. Global vegetation change predicted by the modified Budyko model

    Energy Technology Data Exchange (ETDEWEB)

    Monserud, R.A.; Tchebakova, N.M.; Leemans, R. (US Department of Agriculture, Moscow, ID (United States). Intermountain Research Station, Forest Service)

    1993-09-01

    A modified Budyko global vegetation model is used to predict changes in global vegetation patterns resulting from climate change (CO₂ doubling). Vegetation patterns are predicted using a model based on a dryness index and potential evaporation determined by solving radiation balance equations. Climate change scenarios are derived from predictions from four General Circulation Models (GCMs) of the atmosphere (GFDL, GISS, OSU, and UKMO). All four GCM scenarios show similar trends in vegetation shifts and in areas that remain stable, although the UKMO scenario predicts greater warming than the others. Climate change maps produced by all four GCM scenarios show good agreement with the current climate vegetation map for the globe as a whole, although over half of the vegetation classes show only poor to fair agreement. The most stable areas are Desert and Ice/Polar Desert. Because most of the predicted warming is concentrated in the Boreal and Temperate zones, vegetation there is predicted to undergo the greatest change. Most vegetation classes in the Subtropics and Tropics are predicted to expand. Any shift in the Tropics favouring either Forest over Savanna, or vice versa, will be determined by the magnitude of the increased precipitation accompanying global warming. Although the model predicts equilibrium conditions to which many plant species cannot adjust (through migration or microevolution) in the 50-100 y needed for CO₂ doubling, it is not clear if projected global warming will result in drastic or benign vegetation change. 72 refs., 3 figs., 3 tabs.

  6. Moment based model predictive control for systems with additive uncertainty

    NARCIS (Netherlands)

    Saltik, M.B.; Ozkan, L.; Weiland, S.; Ludlage, J.H.A.

    2017-01-01

    In this paper, we present a model predictive control (MPC) strategy based on the moments of the state variables and the cost functional. The statistical properties of the state predictions are calculated through the open loop iteration of dynamics and used in the formulation of MPC cost function. We

  7. Risk predictive modelling for diabetes and cardiovascular disease.

    Science.gov (United States)

    Kengne, Andre Pascal; Masconi, Katya; Mbanya, Vivian Nchanchou; Lekoubou, Alain; Echouffo-Tcheugui, Justin Basile; Matsha, Tandi E

    2014-02-01

    Absolute risk models or clinical prediction models have been incorporated in guidelines, and are increasingly advocated as tools to assist risk stratification and guide prevention and treatment decisions relating to common health conditions such as cardiovascular disease (CVD) and diabetes mellitus. We have reviewed the historical development and principles of prediction research, including their statistical underpinning, as well as implications for routine practice, with a focus on predictive modelling for CVD and diabetes. Predictive modelling for CVD risk, which has developed over the last five decades, has been largely influenced by the Framingham Heart Study investigators, while it is only ∼20 years ago that similar efforts were started in the field of diabetes. Identification of predictive factors is an important preliminary step which provides the knowledge base on potential predictors to be tested for inclusion during the statistical derivation of the final model. The derived models must then be tested both on the development sample (internal validation) and on other populations in different settings (external validation). Updating procedures (e.g. recalibration) should be used to improve the performance of models that fail the tests of external validation. Ultimately, the effect of introducing validated models in routine practice on the process and outcomes of care as well as its cost-effectiveness should be tested in impact studies before wide dissemination of models beyond the research context. Several prediction models have been developed for CVD or diabetes, but very few have been externally validated or tested in impact studies, and their comparative performance has yet to be fully assessed. A shift of focus from developing new CVD or diabetes prediction models to validating the existing ones will improve their adoption in routine practice.

  8. Consensus models to predict endocrine disruption for all ...

    Science.gov (United States)

    Humans are potentially exposed to tens of thousands of man-made chemicals in the environment. It is well known that some environmental chemicals mimic natural hormones and thus have the potential to be endocrine disruptors. Most of these environmental chemicals have never been tested for their ability to disrupt the endocrine system, in particular, their ability to interact with the estrogen receptor. EPA needs tools to prioritize thousands of chemicals, for instance in the Endocrine Disruptor Screening Program (EDSP). Collaborative Estrogen Receptor Activity Prediction Project (CERAPP) was intended to be a demonstration of the use of predictive computational models on HTS data including ToxCast and Tox21 assays to prioritize a large chemical universe of 32464 unique structures for one specific molecular target – the estrogen receptor. CERAPP combined multiple computational models for prediction of estrogen receptor activity, and used the predicted results to build a unique consensus model. Models were developed in collaboration between 17 groups in the U.S. and Europe and applied to predict the common set of chemicals. Structure-based techniques such as docking and several QSAR modeling approaches were employed, mostly using a common training set of 1677 compounds provided by U.S. EPA, to build a total of 42 classification models and 8 regression models for binding, agonist and antagonist activity. All predictions were evaluated on ToxCast data and on an exte

  9. Mixed models for predictive modeling in actuarial science

    NARCIS (Netherlands)

    Antonio, K.; Zhang, Y.

    2012-01-01

    We start with a general discussion of mixed (also called multilevel) models and continue with illustrating specific (actuarial) applications of this type of models. Technical details on (linear, generalized, non-linear) mixed models follow: model assumptions, specifications, estimation techniques

  10. A multivariate model for predicting segmental body composition.

    Science.gov (United States)

    Tian, Simiao; Mioche, Laurence; Denis, Jean-Baptiste; Morio, Béatrice

    2013-12-01

    The aims of the present study were to propose a multivariate model for predicting simultaneously body, trunk and appendicular fat and lean masses from easily measured variables and to compare its predictive capacity with that of the available univariate models that predict body fat percentage (BF%). The dual-energy X-ray absorptiometry (DXA) dataset (52% men and 48% women) with White, Black and Hispanic ethnicities (1999-2004, National Health and Nutrition Examination Survey) was randomly divided into three sub-datasets: a training dataset (TRD), a test dataset (TED), and a validation dataset (VAD), comprising 3835, 1917, and 1917 subjects, respectively. For each sex, several multivariate prediction models were fitted from the TRD using age, weight, height and possibly waist circumference. The most accurate model was selected from the TED and then applied to the VAD and a French DXA dataset (French DB) (526 men and 529 women) to assess the prediction accuracy in comparison with that of five published univariate models, for which adjusted formulas were re-estimated using the TRD. Waist circumference was found to improve the prediction accuracy, especially in men. For BF%, the standard error of prediction (SEP) values were 3.26 (3.75) % for men and 3.47 (3.95)% for women in the VAD (French DB), as good as those of the adjusted univariate models. Moreover, the SEP values for the prediction of body and appendicular lean masses ranged from 1.39 to 2.75 kg for both the sexes. The prediction accuracy was best for age < 65 years, BMI < 30 kg/m², and Hispanic ethnicity. The application of our multivariate model to large populations could be useful to address various public health issues.
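
    The central idea, a single multivariate (multi-output) regression that predicts several body-composition compartments at once and is judged by its standard error of prediction (SEP) per output, can be sketched briefly. The predictor set, the outputs and the synthetic data below are assumptions for illustration, not the NHANES or French DXA datasets.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n = 1000
# Hypothetical predictors: age (y), weight (kg), height (cm), waist circumference (cm).
X = np.column_stack([rng.uniform(20, 80, n), rng.normal(75, 12, n),
                     rng.normal(170, 9, n), rng.normal(90, 11, n)])
# Hypothetical outputs: body fat %, trunk lean mass (kg), appendicular lean mass (kg).
Y = np.column_stack([
    0.4 * X[:, 3] - 0.1 * X[:, 2] + 20 + rng.normal(0, 3, n),
    0.3 * X[:, 1] + 0.05 * X[:, 2] - 5 + rng.normal(0, 1.5, n),
    0.25 * X[:, 1] + 0.04 * X[:, 2] - 8 + rng.normal(0, 1.5, n),
])

X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.33, random_state=3)
model = LinearRegression().fit(X_tr, Y_tr)        # one multivariate (multi-output) fit
resid = Y_te - model.predict(X_te)
sep = np.sqrt((resid ** 2).mean(axis=0))          # standard error of prediction per output
for name, s in zip(["BF%", "trunk lean (kg)", "appendicular lean (kg)"], sep):
    print(f"SEP {name}: {s:.2f}")
```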

  11. The Selection of Turbulence Models for Prediction of Room Airflow

    DEFF Research Database (Denmark)

    Nielsen, Peter V.

    This paper discusses the use of different turbulence models and their advantages in given situations. As an example, it is shown that a simple zero-equation model can be used for the prediction of special situations as flow with a low level of turbulence. A zero-equation model with compensation...

  12. Testing the Predictions of the Central Capacity Sharing Model

    Science.gov (United States)

    Tombu, Michael; Jolicoeur, Pierre

    2005-01-01

    The divergent predictions of 2 models of dual-task performance are investigated. The central bottleneck and central capacity sharing models argue that a central stage of information processing is capacity limited, whereas stages before and after are capacity free. The models disagree about the nature of this central capacity limitation. The…

  13. Scanpath Based N-Gram Models for Predicting Reading Behavior

    DEFF Research Database (Denmark)

    Mishra, Abhijit; Bhattacharyya, Pushpak; Carl, Michael

    2013-01-01

    Predicting reading behavior is a difficult task. Reading behavior depends on various linguistic factors (e.g. sentence length, structural complexity, etc.) and other factors (e.g. an individual's reading style, age, etc.). Ideally, a reading model should be similar to a language model where the model i...

  14. Droplet-model predictions of charge moments

    International Nuclear Information System (INIS)

    Myers, W.D.

    1982-04-01

    The Droplet Model expressions for calculating various moments of the nuclear charge distribution are given. There are contributions to the moments from the size and shape of the system, from the internal redistribution induced by the Coulomb repulsion, and from the diffuseness of the surface. A case is made for the use of diffuse charge distributions generated by convolution as an alternative to Fermi-functions

  15. Haskell financial data modeling and predictive analytics

    CERN Document Server

    Ryzhov, Pavel

    2013-01-01

    This book is a hands-on guide that teaches readers how to use Haskell's tools and libraries to analyze data from real-world sources in an easy-to-understand manner. This book is great for developers who are new to financial data modeling using Haskell. A basic knowledge of functional programming is not required but will be useful. An interest in high frequency finance is essential.

  16. An analysis of seasonal predictability in coupled model forecasts

    Energy Technology Data Exchange (ETDEWEB)

    Peng, P.; Wang, W. [NOAA, Climate Prediction Center, Washington, DC (United States); Kumar, A. [NOAA, Climate Prediction Center, Washington, DC (United States); NCEP/NWS/NOAA, Climate Prediction Center, Camp Springs, MD (United States)

    2011-02-15

    In the last decade, operational seasonal prediction systems based on initialized coupled models have been developed. An analysis of how the predictability of seasonal means in the initialized coupled predictions evolves with lead-time is presented. Because of the short lead-time, such an analysis for the temporal behavior of seasonal predictability involves a mix of both the predictability of the first and the second kind. The analysis focuses on the lead-time dependence of ensemble mean variance, and the forecast spread. Further, the analysis is for a fixed target season of December-January-February, and is for sea surface temperature, rainfall, and 200-mb height. The analysis is based on a large set of hindcasts from an initialized coupled seasonal prediction system. Various aspects of predictability of the first and the second kind are highlighted for variables with long (for example, SST) and fast (for example, atmospheric) adjustment time scales. An additional focus of the analysis is how the predictability in the initialized coupled seasonal predictions compares with estimates based on the AMIP simulations. The results indicate that differences in the setup of AMIP simulations and coupled predictions, for example, representation of air-sea interactions, and evolution of forecast spread from initial conditions do not change the fundamental conclusions about seasonal predictability. A discussion of the analysis presented herein, and its implications for the use of AMIP simulations for climate attribution, and for time-slice experiments to provide regional information, is also included. (orig.)

  17. Using Pareto points for model identification in predictive toxicology

    Science.gov (United States)

    2013-01-01

    Predictive toxicology is concerned with the development of models that are able to predict the toxicity of chemicals. A reliable prediction of toxic effects of chemicals in living systems is highly desirable in cosmetics, drug design or food protection to speed up the process of chemical compound discovery while reducing the need for lab tests. There is an extensive literature associated with the best practice of model generation and data integration but management and automated identification of relevant models from available collections of models is still an open problem. Currently, the decision on which model should be used for a new chemical compound is left to users. This paper intends to initiate the discussion on automated model identification. We present an algorithm, based on Pareto optimality, which mines model collections and identifies a model that offers a reliable prediction for a new chemical compound. The performance of this new approach is verified for two endpoints: IGC50 and LogP. The results show a great potential for automated model identification methods in predictive toxicology. PMID:23517649

  18. Hybrid Corporate Performance Prediction Model Considering Technical Capability

    Directory of Open Access Journals (Sweden)

    Joonhyuck Lee

    2016-07-01

    Full Text Available Many studies have tried to predict corporate performance and stock prices to enhance investment profitability using qualitative approaches such as the Delphi method. However, developments in data processing technology and machine-learning algorithms have resulted in efforts to develop quantitative prediction models in various managerial subject areas. We propose a quantitative corporate performance prediction model that applies the support vector regression (SVR) algorithm, which addresses the problem of overfitting the training data and can be applied to regression problems. The proposed model optimizes the SVR training parameters based on the training data, using the genetic algorithm to achieve sustainable predictability in changeable markets and managerial environments. Technology-intensive companies represent an increasing share of the total economy. The performance and stock prices of these companies are affected by their financial standing and their technological capabilities. Therefore, we apply both financial indicators and technical indicators to establish the proposed prediction model. Here, we use time series data, including financial, patent, and corporate performance information of 44 electronic and IT companies. Then, we predict the performance of these companies as an empirical verification of the prediction performance of the proposed model.
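
    The core mechanism described here, tuning SVR hyperparameters with a genetic algorithm so that the model generalizes rather than overfits, can be sketched compactly. The indicator set, the synthetic data, the hyperparameter bounds and the tiny genetic algorithm below are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(4)
# Hypothetical panel: financial and patent indicators -> next-period performance.
X = rng.normal(size=(200, 6))
y = X @ np.array([0.8, -0.5, 0.3, 0.0, 0.2, 0.1]) + rng.normal(0, 0.3, 200)

def fitness(params):
    # Cross-validated R^2 of an SVR with the candidate (C, epsilon, gamma).
    C, eps, gamma = params
    model = make_pipeline(StandardScaler(), SVR(C=C, epsilon=eps, gamma=gamma))
    return cross_val_score(model, X, y, cv=5, scoring="r2").mean()

# A minimal genetic algorithm over log-scaled hyperparameters (assumed bounds).
bounds = np.array([[1e-2, 1e3], [1e-3, 1.0], [1e-4, 1.0]])
def random_individual():
    lo, hi = np.log(bounds[:, 0]), np.log(bounds[:, 1])
    return np.exp(lo + rng.random(3) * (hi - lo))

pop = [random_individual() for _ in range(20)]
for generation in range(15):
    scored = sorted(pop, key=fitness, reverse=True)   # evaluate and rank candidates
    parents = scored[:10]                              # selection: keep the better half
    children = []
    for _ in range(10):
        a, b = rng.choice(10, 2, replace=False)
        child = np.sqrt(parents[a] * parents[b])       # crossover: geometric mean
        child *= np.exp(rng.normal(0, 0.2, 3))         # mutation in log space
        children.append(np.clip(child, bounds[:, 0], bounds[:, 1]))
    pop = parents + children

best = max(pop, key=fitness)
print("best (C, epsilon, gamma):", best, " CV R^2:", round(fitness(best), 3))
```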

  19. Form and function of damselfish skulls: rapid and repeated evolution into a limited number of trophic niches

    OpenAIRE

    Cooper, W James; Westneat, Mark W

    2009-01-01

    Abstract Background Damselfishes (Perciformes, Pomacentridae) are a major component of coral reef communities, and the functional diversity of their trophic anatomy is an important constituent of the ecological morphology of these systems. Using shape analyses, biomechanical modelling, and phylogenetically based comparative methods, we examined the anatomy of damselfish feeding among all genera and trophic groups. Coordinate based shape analyses of anatomical landmarks were used to describe p...

  20. Modeling Jambo wastewater treatment system to predict water re ...

    African Journals Online (AJOL)

    user

    C++ programme to implement Brown's model for determining water quality usage ... predicting the re-use options of the wastewater treatment system was a ... skins from rural slaughter slabs/butchers, slaughter .... City (Karnataka State, India).

  1. FPGA implementation of predictive degradation model for engine oil lifetime

    Science.gov (United States)

    Idros, M. F. M.; Razak, A. H. A.; Junid, S. A. M. Al; Suliman, S. I.; Halim, A. K.

    2018-03-01

    This paper presents the implementation of a linear regression model for degradation prediction on Register Transfer Logic (RTL) using Quartus II. A stationary trend was identified in the time series of engine oil degradation in a vehicle. For the RTL implementation, the degradation model is written in Verilog HDL and the input data are sampled at fixed times. A clock divider was designed to support the timing sequence of the input data. At every five data points, a regression analysis is applied to determine the slope variation and compute the prediction. Only negative slope values are considered for prediction, which reduces the number of logic gates required. The least-squares method is used to obtain the best linear model based on the mean values of the time series data. The coded algorithm has been implemented on an FPGA for validation purposes. The result gives the predicted time to change the engine oil.
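
    The prediction logic, a least-squares slope re-estimated over every window of five samples and used only when the slope is negative, is easy to prototype in software before committing it to Verilog. The sketch below is such a prototype; the quality index, the threshold and the sampling interval are assumed values, not those of the paper.

```python
import numpy as np

# Hypothetical oil-quality index sampled at fixed intervals (declining over time).
samples = np.array([0.98, 0.97, 0.97, 0.95, 0.92, 0.90, 0.85, 0.82, 0.80, 0.74])
threshold = 0.50        # quality level at which an oil change is recommended (assumed)
window = 5              # slope re-estimated over every five samples, as in the record

t = np.arange(window, dtype=float)
for end in range(window, len(samples) + 1):
    y = samples[end - window:end]
    slope, intercept = np.polyfit(t, y, 1)              # least-squares line over the window
    if slope < 0:                                        # only degradation (negative slope) is used
        remaining = (threshold - y[-1]) / slope          # samples until the threshold is crossed
        print(f"after sample {end}: slope={slope:.4f}, "
              f"predicted remaining life of about {remaining:.1f} sampling intervals")
```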

  2. Linear regression crash prediction models : issues and proposed solutions.

    Science.gov (United States)

    2010-05-01

    The paper develops a linear regression model approach that can be applied to crash data to predict vehicle crashes. The proposed approach involves novel data aggregation to satisfy linear regression assumptions; namely error structure normality ...

  3. On the Predictiveness of Single-Field Inflationary Models

    CERN Document Server

    Burgess, C.P.; Trott, Michael

    2014-01-01

    We re-examine the predictiveness of single-field inflationary models and discuss how an unknown UV completion can complicate determining inflationary model parameters from observations, even from precision measurements. Besides the usual naturalness issues associated with having a shallow inflationary potential, we describe another issue for inflation, namely, unknown UV physics modifies the running of Standard Model (SM) parameters and thereby introduces uncertainty into the potential inflationary predictions. We illustrate this point using the minimal Higgs Inflationary scenario, which is arguably the most predictive single-field model on the market, because its predictions for $A_s$, $r$ and $n_s$ are made using only one new free parameter beyond those measured in particle physics experiments, and run up to the inflationary regime. We find that this issue can already have observable effects. At the same time, this UV-parameter dependence in the Renormalization Group allows Higgs Inflation to occur (in prin...

  4. Compensatory versus noncompensatory models for predicting consumer preferences

    Directory of Open Access Journals (Sweden)

    Anja Dieckmann

    2009-04-01

    Full Text Available Standard preference models in consumer research assume that people weigh and add all attributes of the available options to derive a decision, while there is growing evidence for the use of simplifying heuristics. Recently, a greedoid algorithm has been developed (Yee, Dahan, Hauser and Orlin, 2007; Kohli and Jedidi, 2007) to model lexicographic heuristics from preference data. We compare predictive accuracies of the greedoid approach and standard conjoint analysis in an online study with a rating and a ranking task. The lexicographic model derived from the greedoid algorithm was better at predicting ranking compared to rating data, but overall, it achieved lower predictive accuracy for hold-out data than the compensatory model estimated by conjoint analysis. However, a considerable minority of participants was better predicted by lexicographic strategies. We conclude that the new algorithm will not replace standard tools for analyzing preferences, but can boost the study of situational and individual differences in preferential choice processes.

  5. Preoperative prediction model of outcome after cholecystectomy for symptomatic gallstones

    DEFF Research Database (Denmark)

    Borly, L; Anderson, I B; Bardram, L

    1999-01-01

    and sonography evaluated gallbladder motility, gallstones, and gallbladder volume. Preoperative variables in patients with or without postcholecystectomy pain were compared statistically, and significant variables were combined in a logistic regression model to predict the postoperative outcome. RESULTS: Eighty...... and by the absence of 'agonizing' pain and of symptoms coinciding with pain (P model 15 of 18 predicted patients had postoperative pain (PVpos = 0.83). Of 62 patients predicted as having no pain postoperatively, 56 were pain-free (PVneg = 0.90). Overall accuracy...... was 89%. CONCLUSION: From this prospective study a model based on preoperative symptoms was developed to predict postcholecystectomy pain. Since intrastudy reclassification may give too optimistic results, the model should be validated in future studies....

  6. Prediction of Chemical Function: Model Development and Application

    Science.gov (United States)

    The United States Environmental Protection Agency’s Exposure Forecaster (ExpoCast) project is developing both statistical and mechanism-based computational models for predicting exposures to thousands of chemicals, including those in consumer products. The high-throughput (...

  7. models for predicting compressive strength and water absorption

    African Journals Online (AJOL)

    user

    presents a mathematical model for predicting the compressive strength and water absorption of laterite-quarry dust cement block using ... building and construction of new infrastructure and .... In (6), R is a vector containing the real ratios of the.

  8. Fuzzy model predictive control algorithm applied in nuclear power plant

    International Nuclear Information System (INIS)

    Zuheir, Ahmad

    2006-01-01

    The aim of this paper is to design a predictive controller based on a fuzzy model. The Takagi-Sugeno fuzzy model with an Adaptive B-splines neuro-fuzzy implementation is used and incorporated as a predictor in a predictive controller. An optimization approach with a simplified gradient technique is used to calculate predictions of the future control actions. In this approach, adaptation of the fuzzy model using dynamic process information is carried out to build the predictive controller. The easy description of the fuzzy model and the easy computation of the gradient vector during the optimization procedure are the main advantages of the computation algorithm. The algorithm is applied to the control of a U-tube steam generation unit (UTSG) used for electricity generation. (author)

  9. MDOT Pavement Management System : Prediction Models and Feedback System

    Science.gov (United States)

    2000-10-01

    As a primary component of a Pavement Management System (PMS), prediction models are crucial for one or more of the following analyses: : maintenance planning, budgeting, life-cycle analysis, multi-year optimization of maintenance works program, and a...

  10. Predictive modelling using neuroimaging data in the presence of confounds.

    Science.gov (United States)

    Rao, Anil; Monteiro, Joao M; Mourao-Miranda, Janaina

    2017-04-15

    When training predictive models from neuroimaging data, we typically have available non-imaging variables such as age and gender that affect the imaging data but which we may be uninterested in from a clinical perspective. Such variables are commonly referred to as 'confounds'. In this work, we firstly give a working definition for confound in the context of training predictive models from samples of neuroimaging data. We define a confound as a variable which affects the imaging data and has an association with the target variable in the sample that differs from that in the population-of-interest, i.e., the population over which we intend to apply the estimated predictive model. The focus of this paper is the scenario in which the confound and target variable are independent in the population-of-interest, but the training sample is biased due to a sample association between the target and confound. We then discuss standard approaches for dealing with confounds in predictive modelling such as image adjustment and including the confound as a predictor, before deriving and motivating an Instance Weighting scheme that attempts to account for confounds by focusing model training so that it is optimal for the population-of-interest. We evaluate the standard approaches and Instance Weighting in two regression problems with neuroimaging data in which we train models in the presence of confounding, and predict samples that are representative of the population-of-interest. For comparison, these models are also evaluated when there is no confounding present. In the first experiment we predict the MMSE score using structural MRI from the ADNI database with gender as the confound, while in the second we predict age using structural MRI from the IXI database with acquisition site as the confound. Considered over both datasets we find that none of the methods for dealing with confounding gives more accurate predictions than a baseline model which ignores confounding, although
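
    A much simplified sketch of the instance-weighting idea is given below for a binary target and a binary confound: each training sample is weighted by P(y)/P(y|confound) estimated from the sample, so that the weighted training set behaves as if the target and the confound were independent, as assumed for the population of interest. The synthetic data, the binary coding and the use of logistic regression are illustrative assumptions; the paper addresses regression problems with a more general scheme.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(5)
n = 2000
confound = rng.integers(0, 2, n)                  # e.g. acquisition site (binary, assumed)
# Biased training sample: the target is associated with the confound here,
# although the two are independent in the population of interest.
y = (rng.random(n) < np.where(confound == 1, 0.8, 0.2)).astype(int)
X = rng.normal(size=(n, 10)) + 0.5 * y[:, None] + 0.3 * confound[:, None]

# Instance weights w_i = P(y_i) / P(y_i | c_i), estimated from the training sample,
# make the weighted sample look like one in which y and the confound are independent.
p_y = np.array([np.mean(y == k) for k in (0, 1)])
p_y_given_c = np.array([[np.mean(y[confound == c] == k) for k in (0, 1)] for c in (0, 1)])
weights = p_y[y] / p_y_given_c[confound, y]

unweighted = LogisticRegression(max_iter=1000).fit(X, y)
weighted = LogisticRegression(max_iter=1000).fit(X, y, sample_weight=weights)

# Unconfounded test set representing the population of interest (y independent of c).
y_te = rng.integers(0, 2, 1000)
c_te = rng.integers(0, 2, 1000)
X_te = rng.normal(size=(1000, 10)) + 0.5 * y_te[:, None] + 0.3 * c_te[:, None]
print("unweighted accuracy:", unweighted.score(X_te, y_te))
print("weighted accuracy:  ", weighted.score(X_te, y_te))
```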

  11. Modeling Seizure Self-Prediction: An E-Diary Study

    Science.gov (United States)

    Haut, Sheryl R.; Hall, Charles B.; Borkowski, Thomas; Tennen, Howard; Lipton, Richard B.

    2013-01-01

    Purpose A subset of patients with epilepsy successfully self-predicted seizures in a paper diary study. We conducted an e-diary study to ensure that prediction precedes seizures, and to characterize the prodromal features and time windows that underlie self-prediction. Methods Subjects 18 or older with LRE and ≥3 seizures/month maintained an e-diary, reporting AM/PM data daily, including mood, premonitory symptoms, and all seizures. Self-prediction was rated by the question, “How likely are you to experience a seizure [time frame]?” Five choices ranged from almost certain (>95% chance) to very unlikely. Relative odds of seizure (OR) within time frames was examined using Poisson models with log-normal random effects to adjust for multiple observations. Key Findings Nineteen subjects reported 244 eligible seizures. OR for prediction choices within 6hrs was as high as 9.31 (1.92,45.23) for “almost certain”. Prediction was most robust within 6hrs of diary entry, and remained significant up to 12hrs. For 9 best predictors, average sensitivity was 50%. Older age contributed to successful self-prediction, and self-prediction appeared to be driven by mood and premonitory symptoms. In multivariate modeling of seizure occurrence, self-prediction (2.84; 1.68,4.81), favorable change in mood (0.82; 0.67,0.99) and number of premonitory symptoms (1.11; 1.00,1.24) were significant. Significance Some persons with epilepsy can self-predict seizures. In these individuals, the odds of a seizure following a positive prediction are high. Predictions were robust, not attributable to recall bias, and were related to self-awareness of mood and premonitory features. The 6-hour prediction window is suitable for the development of pre-emptive therapy. PMID:24111898

  12. Prediction of cloud droplet number in a general circulation model

    Energy Technology Data Exchange (ETDEWEB)

    Ghan, S.J.; Leung, L.R. [Pacific Northwest National Lab., Richland, WA (United States)

    1996-04-01

    We have applied the Colorado State University Regional Atmospheric Modeling System (RAMS) bulk cloud microphysics parameterization to the treatment of stratiform clouds in the National Center for Atmospheric Research Community Climate Model (CCM2). The RAMS predicts mass concentrations of cloud water, cloud ice, rain and snow, and number concentration of ice. We have introduced the droplet number conservation equation to predict droplet number and its dependence on aerosols.

  13. Approximating prediction uncertainty for random forest regression models

    Science.gov (United States)

    John W. Coulston; Christine E. Blinn; Valerie A. Thomas; Randolph H. Wynne

    2016-01-01

    Machine learning approaches such as random forest have increased for the spatial modeling and mapping of continuous variables. Random forest is a non-parametric ensemble approach, and unlike traditional regression approaches there is no direct quantification of prediction error. Understanding prediction uncertainty is important when using model-based continuous maps as...
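
    One common way to approximate prediction uncertainty for a random forest regressor is to look at the spread of the individual tree predictions around the ensemble mean, as sketched below. This is only one of several possible estimators and is not necessarily the one developed by the authors; the data are synthetic.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(6)
X = rng.uniform(-3, 3, size=(400, 4))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + rng.normal(0, 0.2, 400)

forest = RandomForestRegressor(n_estimators=500, random_state=6).fit(X, y)

X_new = rng.uniform(-3, 3, size=(5, 4))
# Collect each tree's prediction and use the across-tree spread as a rough
# per-point uncertainty measure to report alongside the ensemble mean.
per_tree = np.stack([tree.predict(X_new) for tree in forest.estimators_])
mean, sd = per_tree.mean(axis=0), per_tree.std(axis=0)
for m, s in zip(mean, sd):
    print(f"prediction = {m:.2f} +/- {s:.2f}")
```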

  14. Aqua/Aura Updated Inclination Adjust Maneuver Performance Prediction Model

    Science.gov (United States)

    Boone, Spencer

    2017-01-01

    This presentation will discuss the updated Inclination Adjust Maneuver (IAM) performance prediction model that was developed for Aqua and Aura following the 2017 IAM series. This updated model uses statistical regression methods to identify potential long-term trends in maneuver parameters, yielding improved predictions when re-planning past maneuvers. The presentation has been reviewed and approved by Eric Moyer, ESMO Deputy Project Manager.

  15. Model Predictive Control of Wind Turbines

    DEFF Research Database (Denmark)

    Henriksen, Lars Christian

    Wind turbines play a major role in the transformation from a fossil fuel based energy production to a more sustainable production of energy. Total-cost-of-ownership is an important parameter when investors decide in which energy technology they should place their capital. Modern wind turbines...... the need for maintenance of the wind turbine. Either way, better total-cost-of-ownership for wind turbine operators can be achieved by improved control of the wind turbines. Wind turbine control can be improved in two ways, by improving the model on which the controller bases its design or by improving...

  16. Nonconvex Model Predictive Control for Commercial Refrigeration

    DEFF Research Database (Denmark)

    Hovgaard, Tobias Gybel; Larsen, Lars F.S.; Jørgensen, John Bagterp

    2013-01-01

    function, however, is nonconvex due to the temperature dependence of thermodynamic efficiency. To handle this nonconvexity we propose a sequential convex optimization method, which typically converges in fewer than 5 or so iterations. We employ a fast convex quadratic programming solver to carry out...... the iterations, which is more than fast enough to run in real-time. We demonstrate our method on a realistic model, with a full year simulation and 15 minute time periods, using historical electricity prices and weather data, as well as random variations in thermal load. These simulations show substantial cost...... capacity associated with large penetration of intermittent renewable energy sources in a future smart grid....

  17. Catalytic cracking models developed for predictive control purposes

    Directory of Open Access Journals (Sweden)

    Dag Ljungqvist

    1993-04-01

    Full Text Available The paper deals with state-space modeling issues in the context of model-predictive control, with application to catalytic cracking. Emphasis is placed on model establishment, verification and online adjustment. Both the Fluid Catalytic Cracking (FCC and the Residual Catalytic Cracking (RCC units are discussed. Catalytic cracking units involve complex interactive processes which are difficult to operate and control in an economically optimal way. The strong nonlinearities of the FCC process mean that the control calculation should be based on a nonlinear model with the relevant constraints included. However, the model can be simple compared to the complexity of the catalytic cracking plant. Model validity is ensured by a robust online model adjustment strategy. Model-predictive control schemes based on linear convolution models have been successfully applied to the supervisory dynamic control of catalytic cracking units, and the control can be further improved by the SSPC scheme.

  18. Toward a predictive model for elastomer seals

    Science.gov (United States)

    Molinari, Nicola; Khawaja, Musab; Sutton, Adrian; Mostofi, Arash

    Nitrile butadiene rubber (NBR) and hydrogenated-NBR (HNBR) are widely used elastomers, especially as seals in oil and gas applications. During exposure to well-hole conditions, ingress of gases causes degradation of performance, including mechanical failure. We use computer simulations to investigate this problem at two different length and time-scales. First, we study the solubility of gases in the elastomer using a chemically-inspired description of HNBR based on the OPLS all-atom force-field. Starting with a model of NBR, C=C double bonds are saturated with either hydrogen or intramolecular cross-links, mimicking the hydrogenation of NBR to form HNBR. We validate against trends for the mass density and glass transition temperature for HNBR as a function of cross-link density, and for NBR as a function of the fraction of acrylonitrile in the copolymer. Second, we study mechanical behaviour using a coarse-grained model that overcomes some of the length and time-scale limitations of an all-atom approach. Nanoparticle fillers added to the elastomer matrix to enhance mechanical response are also included. Our initial focus is on understanding the mechanical properties at the elevated temperatures and pressures experienced in well-hole conditions.

  19. Predictive QSAR Models for the Toxicity of Disinfection Byproducts

    Directory of Open Access Journals (Sweden)

    Litang Qin

    2017-10-01

    Full Text Available Several hundred disinfection byproducts (DBPs) in drinking water have been identified, and are known to have potentially adverse health effects. There are toxicological data gaps for most DBPs, and the predictive method may provide an effective way to address this. The development of an in-silico model of toxicology endpoints of DBPs is rarely studied. The main aim of the present study is to develop predictive quantitative structure–activity relationship (QSAR) models for the reactive toxicities of 50 DBPs in the five bioassays of X-Microtox, GSH+, GSH−, DNA+ and DNA−. All-subset regression was used to select the optimal descriptors, and multiple linear-regression models were built. The developed QSAR models for five endpoints satisfied the internal and external validation criteria: coefficient of determination (R²) > 0.7, explained variance in leave-one-out prediction (Q²LOO) and in leave-many-out prediction (Q²LMO) > 0.6, variance explained in external prediction (Q²F1, Q²F2, and Q²F3) > 0.7, and concordance correlation coefficient (CCC) > 0.85. The application domains and the meaning of the selective descriptors for the QSAR models were discussed. The obtained QSAR models can be used in predicting the toxicities of the 50 DBPs.
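
    The validation statistics quoted above are straightforward to compute for a multiple linear-regression QSAR model. The sketch below does so for R² and Q²LOO on synthetic descriptors; the descriptor matrix, the endpoint and the acceptance thresholds in the comment are assumptions for illustration, not the authors' data.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.metrics import r2_score

rng = np.random.default_rng(7)
n_compounds, n_descriptors = 50, 4
X = rng.normal(size=(n_compounds, n_descriptors))        # hypothetical molecular descriptors
y = X @ np.array([1.2, -0.8, 0.5, 0.3]) + rng.normal(0, 0.4, n_compounds)  # toxicity endpoint

mlr = LinearRegression().fit(X, y)
r2 = r2_score(y, mlr.predict(X))                          # goodness of fit on the training set

# Q2_LOO: explained variance in leave-one-out prediction.
y_loo = cross_val_predict(LinearRegression(), X, y, cv=LeaveOneOut())
q2_loo = 1 - np.sum((y - y_loo) ** 2) / np.sum((y - y.mean()) ** 2)

print(f"R^2 = {r2:.3f}, Q^2_LOO = {q2_loo:.3f}")  # acceptance criteria cited: R^2 > 0.7, Q^2 > 0.6
```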

  20. Predictive QSAR Models for the Toxicity of Disinfection Byproducts.

    Science.gov (United States)

    Qin, Litang; Zhang, Xin; Chen, Yuhan; Mo, Lingyun; Zeng, Honghu; Liang, Yanpeng

    2017-10-09

    Several hundred disinfection byproducts (DBPs) in drinking water have been identified, and are known to have potentially adverse health effects. There are toxicological data gaps for most DBPs, and the predictive method may provide an effective way to address this. The development of an in-silico model of toxicology endpoints of DBPs is rarely studied. The main aim of the present study is to develop predictive quantitative structure-activity relationship (QSAR) models for the reactive toxicities of 50 DBPs in the five bioassays of X-Microtox, GSH+, GSH-, DNA+ and DNA-. All-subset regression was used to select the optimal descriptors, and multiple linear-regression models were built. The developed QSAR models for five endpoints satisfied the internal and external validation criteria: coefficient of determination (R²) > 0.7, explained variance in leave-one-out prediction (Q²LOO) and in leave-many-out prediction (Q²LMO) > 0.6, variance explained in external prediction (Q²F1, Q²F2, and Q²F3) > 0.7, and concordance correlation coefficient (CCC) > 0.85. The application domains and the meaning of the selective descriptors for the QSAR models were discussed. The obtained QSAR models can be used in predicting the toxicities of the 50 DBPs.

  1. Predictions for mt and MW in minimal supersymmetric models

    International Nuclear Information System (INIS)

    Buchmueller, O.; Ellis, J.R.; Flaecher, H.; Isidori, G.

    2009-12-01

    Using a frequentist analysis of experimental constraints within two versions of the minimal supersymmetric extension of the Standard Model, we derive the predictions for the top quark mass, m_t, and the W boson mass, m_W. We find that the supersymmetric predictions for both m_t and m_W, obtained by incorporating all the relevant experimental information and state-of-the-art theoretical predictions, are highly compatible with the experimental values with small remaining uncertainties, yielding an improvement compared to the case of the Standard Model. (orig.)

  2. Using a Prediction Model to Manage Cyber Security Threats

    Directory of Open Access Journals (Sweden)

    Venkatesh Jaganathan

    2015-01-01

    Full Text Available Cyber-attacks are an important issue faced by all organizations. Securing information systems is critical. Organizations should be able to understand the ecosystem and predict attacks. Predicting attacks quantitatively should be part of risk management. The cost impact due to worms, viruses, or other malicious software is significant. This paper proposes a mathematical model to predict the impact of an attack based on significant factors that influence cyber security. This model also considers the environmental information required. It is generalized and can be customized to the needs of the individual organization.

  3. Using a Prediction Model to Manage Cyber Security Threats.

    Science.gov (United States)

    Jaganathan, Venkatesh; Cherurveettil, Priyesh; Muthu Sivashanmugam, Premapriya

    2015-01-01

    Cyber-attacks are an important issue faced by all organizations. Securing information systems is critical. Organizations should be able to understand the ecosystem and predict attacks. Predicting attacks quantitatively should be part of risk management. The cost impact due to worms, viruses, or other malicious software is significant. This paper proposes a mathematical model to predict the impact of an attack based on significant factors that influence cyber security. This model also considers the environmental information required. It is generalized and can be customized to the needs of the individual organization.

  4. Using a Prediction Model to Manage Cyber Security Threats

    Science.gov (United States)

    Muthu Sivashanmugam, Premapriya

    2015-01-01

    Cyber-attacks are an important issue faced by all organizations. Securing information systems is critical. Organizations should be able to understand the ecosystem and predict attacks. Predicting attacks quantitatively should be part of risk management. The cost impact due to worms, viruses, or other malicious software is significant. This paper proposes a mathematical model to predict the impact of an attack based on significant factors that influence cyber security. This model also considers the environmental information required. It is generalized and can be customized to the needs of the individual organization. PMID:26065024

  5. Aero-acoustic noise of wind turbines. Noise prediction models

    Energy Technology Data Exchange (ETDEWEB)

    Maribo Pedersen, B. [ed.

    1997-12-31

    Semi-empirical and CAA (Computational AeroAcoustics) noise prediction techniques are the subject of this expert meeting. The meeting presents and discusses models and methods. The meeting may provide answers to the following questions: What Noise sources are the most important? How are the sources best modeled? What needs to be done to do better predictions? Does it boil down to correct prediction of the unsteady aerodynamics around the rotor? Or is the difficult part to convert the aerodynamics into acoustics? (LN)

  6. Fish community reassembly after a coral mass mortality: higher trophic groups are subject to increased rates of extinction.

    Science.gov (United States)

    Alonso, David; Pinyol-Gallemí, Aleix; Alcoverro, Teresa; Arthur, Rohan

    2015-05-01

    Since Gleason and Clements, our understanding of community dynamics has been influenced by theories emphasising either dispersal or niche assembly as central to community structuring. Determining the relative importance of these processes in structuring real-world communities remains a challenge. We tracked reef fish community reassembly after a catastrophic coral mortality in a relatively unfished archipelago. We revisited the stochastic model underlying MacArthur and Wilson's Island Biogeography Theory, with a simple extension to account for trophic identity. Colonisation and extinction rates calculated from decadal presence-absence data based on (1) species neutrality, (2) trophic identity and (3) site-specificity were used to model post-disturbance reassembly, and compared with empirical observations. Results indicate that species neutrality holds within trophic guilds, and trophic identity significantly increases overall model performance. Strikingly, extinction rates increased clearly with trophic position, indicating that fish communities may be inherently susceptible to trophic downgrading even without targeted fishing of top predators. © 2015 John Wiley & Sons Ltd/CNRS.

  7. Model Predictive Control of a Wave Energy Converter

    DEFF Research Database (Denmark)

    Andersen, Palle; Pedersen, Tom Søndergård; Nielsen, Kirsten Mølgaard

    2015-01-01

    In this paper reactive control and Model Predictive Control (MPC) for a Wave Energy Converter (WEC) are compared. The analysis is based on a WEC from Wave Star A/S designed as a point absorber. The model predictive controller uses wave models based on the dominating sea states combined with a model...... connecting undisturbed wave sequences to sequences of torque. Losses in the conversion from mechanical to electrical power are taken into account in two ways. Conventional reactive controllers are tuned for each sea state with the assumption that the converter has the same efficiency back and forth. MPC...

  8. Time dependent patient no-show predictive modelling development.

    Science.gov (United States)

    Huang, Yu-Li; Hanauer, David A

    2016-05-09

    Purpose - The purpose of this paper is to develop evidence-based predictive no-show models that treat each of a patient's past appointment statuses, a time-dependent component, as an independent predictor to improve predictability. Design/methodology/approach - A ten-year retrospective data set was extracted from a pediatric clinic. It consisted of 7,291 distinct patients who had at least two visits, along with their appointment characteristics, patient demographics, and insurance information. Logistic regression was adopted to develop no-show models using two-thirds of the data for training and the remaining data for validation. The no-show threshold was then determined by minimizing the misclassification of show/no-show assignments. A total of 26 predictive models were developed, based on the number of available past appointments. Simulation was employed to test the effectiveness of each model on the costs of patient wait time, physician idle time, and overtime. Findings - The results demonstrated that the misclassification rate and the area under the receiver operating characteristic curve gradually improved as more appointment history was included, up to around the 20th predictive model. The overbooking method with no-show predictive models suggested incorporating up to the 16th model and outperformed other overbooking methods by as much as 9.4 per cent in cost per patient while allowing two additional patients in a clinic day. Research limitations/implications - The challenge now is to implement the no-show predictive model systematically to further demonstrate its robustness and simplicity in various scheduling systems. Originality/value - This paper provides examples of how to build no-show predictive models with time-dependent components to improve the overbooking policy. Accurately identifying scheduled patients' show/no-show status allows clinics to proactively schedule patients and reduce the negative impact of patient no-shows.
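
The record above only sketches the modelling recipe, so here is a minimal, hedged illustration of it in Python: a logistic-regression no-show model whose predictors include each of a patient's recent appointment outcomes, with the decision threshold chosen to minimise misclassification. The predictors and synthetic data are illustrative assumptions, not the study's variables.

```python
# Hedged sketch of the idea above: logistic regression in which each of a patient's
# k most recent appointment outcomes is a separate predictor. Column meanings and
# the synthetic data are illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n, k = 5000, 5                                   # patients, past appointments used
past_status = rng.integers(0, 2, size=(n, k))    # 1 = prior no-show, 0 = showed
lead_time = rng.integers(1, 60, size=(n, 1))     # days between booking and visit
X = np.hstack([past_status, lead_time])
logit = -1.5 + 1.2 * past_status.mean(axis=1) + 0.02 * lead_time.ravel()
y = rng.random(n) < 1 / (1 + np.exp(-logit))     # synthetic no-show outcome

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=1/3, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

# Choose the probability threshold that minimises show/no-show misclassification,
# mirroring the threshold-selection step described in the abstract.
probs = model.predict_proba(X_te)[:, 1]
thresholds = np.linspace(0.05, 0.95, 19)
errors = [np.mean((probs >= t) != y_te) for t in thresholds]
best_t = thresholds[int(np.argmin(errors))]
print(f"threshold={best_t:.2f}, misclassification={min(errors):.3f}")
```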

  9. Preprocedural Prediction Model for Contrast-Induced Nephropathy Patients.

    Science.gov (United States)

    Yin, Wen-Jun; Yi, Yi-Hu; Guan, Xiao-Feng; Zhou, Ling-Yun; Wang, Jiang-Lin; Li, Dai-Yang; Zuo, Xiao-Cong

    2017-02-03

    Several models have been developed for prediction of contrast-induced nephropathy (CIN); however, they only contain patients receiving intra-arterial contrast media for coronary angiographic procedures, which represent a small proportion of all contrast procedures. In addition, most of them evaluate radiological interventional procedure-related variables. So it is necessary for us to develop a model for prediction of CIN before radiological procedures among patients administered contrast media. A total of 8800 patients undergoing contrast administration were randomly assigned in a 4:1 ratio to development and validation data sets. CIN was defined as an increase of 25% and/or 0.5 mg/dL in serum creatinine within 72 hours above the baseline value. Preprocedural clinical variables were used to develop the prediction model from the training data set by the machine learning method of random forest, and 5-fold cross-validation was used to evaluate the prediction accuracies of the model. Finally we tested this model in the validation data set. The incidence of CIN was 13.38%. We built a prediction model with 13 preprocedural variables selected from 83 variables. The model obtained an area under the receiver-operating characteristic (ROC) curve (AUC) of 0.907 and gave prediction accuracy of 80.8%, sensitivity of 82.7%, specificity of 78.8%, and Matthews correlation coefficient of 61.5%. For the first time, 3 new factors are included in the model: the decreased sodium concentration, the INR value, and the preprocedural glucose level. The newly established model shows excellent predictive ability of CIN development and thereby provides preventative measures for CIN. © 2017 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.
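
A rough sketch of the pipeline described in this record (random forest on preprocedural predictors, 5-fold cross-validation, ROC AUC) is given below; the synthetic data stand in for the 8800-patient cohort and the feature set is assumed.

```python
# Illustrative sketch (not the authors' code): a random-forest classifier for
# contrast-induced nephropathy built from preprocedural variables, evaluated with
# 5-fold cross-validation and ROC AUC as in the abstract.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for the 8800-patient data set (13 preprocedural predictors,
# ~13.4% CIN incidence as reported).
X, y = make_classification(n_samples=8800, n_features=13, n_informative=6,
                           weights=[0.866, 0.134], random_state=0)

clf = RandomForestClassifier(n_estimators=500, random_state=0)
auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
print(f"5-fold AUC: {auc.mean():.3f} +/- {auc.std():.3f}")
```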

  10. Numerical modeling capabilities to predict repository performance

    International Nuclear Information System (INIS)

    1979-09-01

    This report presents a summary of current numerical modeling capabilities that are applicable to the design and performance evaluation of underground repositories for the storage of nuclear waste. The report includes codes that are available in-house, within Golder Associates and Lawrence Livermore Laboratories; as well as those that are generally available within the industry and universities. The first listing of programs are in-house codes in the subject areas of hydrology, solute transport, thermal and mechanical stress analysis, and structural geology. The second listing of programs are divided by subject into the following categories: site selection, structural geology, mine structural design, mine ventilation, hydrology, and mine design/construction/operation. These programs are not specifically designed for use in the design and evaluation of an underground repository for nuclear waste; but several or most of them may be so used

  11. Model output statistics applied to wind power prediction

    Energy Technology Data Exchange (ETDEWEB)

    Joensen, A; Giebel, G; Landberg, L [Risoe National Lab., Roskilde (Denmark); Madsen, H; Nielsen, H A [The Technical Univ. of Denmark, Dept. of Mathematical Modelling, Lyngby (Denmark)

    1999-03-01

    Being able to predict the output of a wind farm online for a day or two in advance has significant advantages for utilities, such as a better ability to schedule fossil-fuelled power plants and a better position on electricity spot markets. In this paper, prediction methods based on Numerical Weather Prediction (NWP) models are considered. The spatial resolution used in NWP models implies that these predictions are not valid locally at a specific wind farm. Furthermore, due to the non-stationary nature and complexity of the processes in the atmosphere, and occasional changes of NWP models, the deviation between the predicted and the measured wind will be time dependent. If observational data are available, and if the deviation between the predictions and the observations exhibits systematic behavior, this should be corrected for; if statistical methods are used, this approach is usually referred to as MOS (Model Output Statistics). The influence of atmospheric turbulence intensity, topography, prediction horizon length and auto-correlation of wind speed and power is considered, and to take the time variations into account, adaptive estimation methods are applied. Three estimation techniques are considered and compared: Extended Kalman Filtering, recursive least squares, and a new modified recursive least squares algorithm. (au) EU-JOULE-3. 11 refs.
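
Since the abstract names recursive least squares as one of the adaptive MOS estimators, a small hedged sketch of that idea follows: an RLS update with a forgetting factor that keeps correcting NWP forecasts towards local observations. The data, coefficients, and forgetting factor are illustrative.

```python
# Minimal MOS sketch: adaptively correct NWP wind-speed forecasts with recursive
# least squares (forgetting factor lam), so the correction tracks slow changes in
# the model/site relationship. Data and true coefficients are synthetic.
import numpy as np

def rls_update(theta, P, x, y, lam=0.995):
    """One recursive-least-squares step for y ~ theta @ x."""
    Px = P @ x
    k = Px / (lam + x @ Px)          # gain vector
    err = y - theta @ x              # prediction error before the update
    theta = theta + k * err
    P = (P - np.outer(k, Px)) / lam
    return theta, P, err

rng = np.random.default_rng(1)
theta = np.zeros(2)                  # [bias, slope] of the MOS correction
P = np.eye(2) * 100.0
for t in range(1000):
    nwp = 8 + 3 * np.sin(t / 50) + rng.normal(0, 1)     # NWP forecast
    observed = 0.8 * nwp + 1.5 + rng.normal(0, 0.5)     # local measurement
    x = np.array([1.0, nwp])
    theta, P, err = rls_update(theta, P, x, observed)
print("estimated MOS correction (bias, slope):", np.round(theta, 2))
```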

  12. Predictive modeling of neuroanatomic structures for brain atrophy detection

    Science.gov (United States)

    Hu, Xintao; Guo, Lei; Nie, Jingxin; Li, Kaiming; Liu, Tianming

    2010-03-01

    In this paper, we present an approach to predictive modeling of neuroanatomic structures for the detection of brain atrophy based on cross-sectional MRI images. The underlying premise of applying predictive modeling to atrophy detection is that brain atrophy is defined as a significant deviation of part of the anatomy from what the remaining normal anatomy predicts for that part. The steps of predictive modeling are as follows. The central cortical surface under consideration is reconstructed from the brain tissue map, and Regions of Interest (ROIs) on it are predicted from other reliable anatomies. The vertex pair-wise distance between the predicted vertex and the true one within an abnormal region is expected to be larger than that for a vertex in a normal brain region. The change in the white matter/gray matter ratio within a spherical region is used to identify the direction of vertex displacement. In this way, the severity of brain atrophy can be defined quantitatively by the displacements of those vertices. The proposed predictive modeling method has been evaluated using both simulated atrophies and MRI images of Alzheimer's disease.

  13. Outcome Prediction in Mathematical Models of Immune Response to Infection.

    Directory of Open Access Journals (Sweden)

    Manuel Mai

    Clinicians need to predict patient outcomes with high accuracy as early as possible after disease inception. In this manuscript, we show that patient-to-patient variability sets a fundamental limit on outcome prediction accuracy for a general class of mathematical models for the immune response to infection. However, accuracy can be increased at the expense of delayed prognosis. We investigate several systems of ordinary differential equations (ODEs) that model the host immune response to a pathogen load. Advantages of systems of ODEs for investigating the immune response to infection include the ability to collect data on large numbers of 'virtual patients', each with a given set of model parameters, and to obtain many time points during the course of the infection. We implement patient-to-patient variability v in the ODE models by randomly selecting the model parameters from distributions with coefficients of variation v that are centered on physiological values. We use logistic regression with one-versus-all classification to predict the discrete steady-state outcomes of the system. We find that the prediction algorithm achieves near 100% accuracy for v = 0, and the accuracy decreases with increasing v for all ODE models studied. The fact that multiple steady-state outcomes can be obtained for a given initial condition, i.e. the basins of attraction overlap in the space of initial conditions, limits the prediction accuracy for v > 0. Increasing the elapsed time of the variables used to train and test the classifier increases the prediction accuracy, while adding explicit external noise to the ODE models decreases the prediction accuracy. Our results quantify the competition between early prognosis and high prediction accuracy that is frequently encountered by clinicians.

  14. Evaluation of burst pressure prediction models for line pipes

    International Nuclear Information System (INIS)

    Zhu, Xian-Kui; Leis, Brian N.

    2012-01-01

    Accurate prediction of burst pressure plays a central role in engineering design and integrity assessment of oil and gas pipelines. Theoretical and empirical solutions for such prediction are evaluated in this paper relative to a burst pressure database comprising more than 100 tests covering a variety of pipeline steel grades and pipe sizes. Solutions considered include three based on plasticity theory for the end-capped, thin-walled, defect-free line pipe subjected to internal pressure in terms of the Tresca, von Mises, and ZL (or Zhu-Leis) criteria, one based on a cylindrical instability stress (CIS) concept, and a large group of analytical and empirical models previously evaluated by Law and Bowie (International Journal of Pressure Vessels and Piping, 84, 2007: 487–492). It is found that these models can be categorized into either a Tresca-family or a von Mises-family of solutions, except for those due to Margetson and Zhu-Leis models. The viability of predictions is measured via statistical analyses in terms of a mean error and its standard deviation. Consistent with an independent parallel evaluation using another large database, the Zhu-Leis solution is found best for predicting burst pressure, including consideration of strain hardening effects, while the Tresca strength solutions including Barlow, Maximum shear stress, Turner, and the ASME boiler code provide reasonably good predictions for the class of line-pipe steels with intermediate strain hardening response. - Highlights: ► This paper evaluates different burst pressure prediction models for line pipes. ► The existing models are categorized into two major groups of Tresca and von Mises solutions. ► Prediction quality of each model is assessed statistically using a large full-scale burst test database. ► The Zhu-Leis solution is identified as the best predictive model.

  15. Evaluation of burst pressure prediction models for line pipes

    Energy Technology Data Exchange (ETDEWEB)

    Zhu, Xian-Kui, E-mail: zhux@battelle.org [Battelle Memorial Institute, 505 King Avenue, Columbus, OH 43201 (United States); Leis, Brian N. [Battelle Memorial Institute, 505 King Avenue, Columbus, OH 43201 (United States)

    2012-01-15

    Accurate prediction of burst pressure plays a central role in engineering design and integrity assessment of oil and gas pipelines. Theoretical and empirical solutions for such prediction are evaluated in this paper relative to a burst pressure database comprising more than 100 tests covering a variety of pipeline steel grades and pipe sizes. Solutions considered include three based on plasticity theory for the end-capped, thin-walled, defect-free line pipe subjected to internal pressure in terms of the Tresca, von Mises, and ZL (or Zhu-Leis) criteria, one based on a cylindrical instability stress (CIS) concept, and a large group of analytical and empirical models previously evaluated by Law and Bowie (International Journal of Pressure Vessels and Piping, 84, 2007: 487-492). It is found that these models can be categorized into either a Tresca-family or a von Mises-family of solutions, except for those due to Margetson and Zhu-Leis models. The viability of predictions is measured via statistical analyses in terms of a mean error and its standard deviation. Consistent with an independent parallel evaluation using another large database, the Zhu-Leis solution is found best for predicting burst pressure, including consideration of strain hardening effects, while the Tresca strength solutions including Barlow, Maximum shear stress, Turner, and the ASME boiler code provide reasonably good predictions for the class of line-pipe steels with intermediate strain hardening response. - Highlights: ► This paper evaluates different burst pressure prediction models for line pipes. ► The existing models are categorized into two major groups of Tresca and von Mises solutions. ► Prediction quality of each model is assessed statistically using a large full-scale burst test database. ► The Zhu-Leis solution is identified as the best predictive model.

  16. Survival prediction model for postoperative hepatocellular carcinoma patients.

    Science.gov (United States)

    Ren, Zhihui; He, Shasha; Fan, Xiaotang; He, Fangping; Sang, Wei; Bao, Yongxing; Ren, Weixin; Zhao, Jinming; Ji, Xuewen; Wen, Hao

    2017-09-01

    This study aims to establish a predictive index (PI) model of the 5-year survival rate for patients with hepatocellular carcinoma (HCC) after radical resection and to evaluate its prediction sensitivity, specificity, and accuracy. Patients who underwent HCC surgical resection were enrolled and randomly divided into a prediction model group (101 patients) and a model evaluation group (100 patients). A Cox regression model was used for univariate and multivariate survival analysis. A PI model was established based on the multivariate analysis, and a receiver operating characteristic (ROC) curve was drawn accordingly. The area under the ROC curve (AUROC) and the PI cutoff value were identified. Multiple Cox regression analysis of the prediction model group showed that neutrophil to lymphocyte ratio (NLR), histological grade (HG), microvascular invasion (MVI), positive resection margin (PRM), number of tumors (NT), and postoperative transcatheter arterial chemoembolization (TACE) treatment were independent predictors of the 5-year survival rate for HCC patients. The model was PI = 0.377 × NLR + 0.554 × HG + 0.927 × PRM + 0.778 × MVI + 0.740 × NT - 0.831 × TACE. In the prediction model group, the AUROC was 0.832 and the PI cutoff value was 3.38. The sensitivity, specificity, and accuracy were 78.0%, 80.0%, and 79.2%, respectively. In the model evaluation group, the AUROC was 0.822, and the PI cutoff value corresponded well to that of the prediction model group, with sensitivity, specificity, and accuracy of 85.0%, 83.3%, and 84.0%, respectively. The PI model can quantify the mortality risk of hepatitis B related HCC with high sensitivity, specificity, and accuracy.
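
The reported predictive index can be transcribed directly into a small helper; how each predictor is coded (binary versus continuous) is an assumption here and should be taken from the original paper.

```python
# Direct transcription of the reported PI formula. The coding of the categorical
# predictors (HG, PRM, MVI, NT, TACE) as 0/1 below is an assumption for illustration.
def hcc_predictive_index(nlr, hg, prm, mvi, nt, tace):
    return (0.377 * nlr + 0.554 * hg + 0.927 * prm +
            0.778 * mvi + 0.740 * nt - 0.831 * tace)

PI_CUTOFF = 3.38  # reported cutoff; values above it indicate the higher-risk group

pi = hcc_predictive_index(nlr=3.2, hg=1, prm=0, mvi=1, nt=1, tace=1)
print(f"PI = {pi:.2f}, high risk: {pi > PI_CUTOFF}")
```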

  17. A prediction model for assessing residential radon concentration in Switzerland

    International Nuclear Information System (INIS)

    Hauri, Dimitri D.; Huss, Anke; Zimmermann, Frank; Kuehni, Claudia E.; Röösli, Martin

    2012-01-01

    Indoor radon is regularly measured in Switzerland. However, a nationwide model to predict residential radon levels has not been developed. The aim of this study was to develop a prediction model to assess indoor radon concentrations in Switzerland. The model was based on 44,631 measurements from the nationwide Swiss radon database collected between 1994 and 2004. Of these, 80% randomly selected measurements were used for model development and the remaining 20% for an independent model validation. A multivariable log-linear regression model was fitted and relevant predictors selected according to evidence from the literature, the adjusted R², the Akaike's information criterion (AIC), and the Bayesian information criterion (BIC). The prediction model was evaluated by calculating Spearman rank correlation between measured and predicted values. Additionally, the predicted values were categorised into three categories (50th, 50th–90th and 90th percentile) and compared with measured categories using a weighted Kappa statistic. The most relevant predictors for indoor radon levels were tectonic units and year of construction of the building, followed by soil texture, degree of urbanisation, floor of the building where the measurement was taken and housing type (P-values <0.001 for all). Mean predicted radon values (geometric mean) were 66 Bq/m³ (interquartile range 40–111 Bq/m³) in the lowest exposure category, 126 Bq/m³ (69–215 Bq/m³) in the medium category, and 219 Bq/m³ (108–427 Bq/m³) in the highest category. Spearman correlation between predictions and measurements was 0.45 (95%-CI: 0.44; 0.46) for the development dataset and 0.44 (95%-CI: 0.42; 0.46) for the validation dataset. Kappa coefficients were 0.31 for the development and 0.30 for the validation dataset, respectively. The model explained 20% overall variability (adjusted R²). In conclusion, this residential radon prediction model, based on a large number of measurements, was demonstrated to be

  18. Predictive models of benthic invertebrate methylmercury in Ontario and Quebec lakes

    Energy Technology Data Exchange (ETDEWEB)

    Rennie, M.D.; Collins, N.C.; Purchase, C.F. [Toronto Univ., ON (Canada). Dept. of Biology; Tremblay, A. [Hydro-Quebec, Montreal, PQ (Canada)

    2005-12-01

    In both North America and Europe, high levels of mercury have been reported in lakes that do not receive obvious point-source mercury inputs. Concern over high contaminant levels in waterfowl and fish has prompted several government-issued advisories on safe levels of fish and wildlife intake for humans. Although the primary source of mercury in pristine lakes is directly through atmospheric deposition or indirectly via terrestrial runoff, there can be large variations in mercury concentrations in organisms in neighbouring lakes. Therefore, factors other than atmospheric deposition must influence the bioavailability and accumulation of mercury in aquatic organisms. For that reason, multivariate analyses of benthic invertebrate methylmercury concentrations and water chemistry from 12 Quebec water bodies were used to construct simple, predictive models of benthic invertebrate methylmercury in 23 lakes in Ontario and Quebec. The study showed that the primary means of mercury accumulation for organisms at higher trophic positions is dietary, through the assimilation of organic forms of mercury, principally methylmercury. The data from 12 Quebec water bodies revealed that benthic invertebrates in reservoirs have higher methylmercury than those in natural lakes, and methylmercury is generally higher in predatory invertebrates. Reservoir age was found to correlate with fish and benthic invertebrate methylmercury, and also with lake chemistry parameters such as pH and dissolved organic carbon (DOC). The objective of the study was to determine the appropriate level of taxonomic or functional resolution for generating benthic invertebrate methylmercury models, and to identify which environmental variables correlate most strongly with benthic invertebrate methylmercury. Empirical models using these correlations were constructed and their predictive efficiency was tested by cross-validation. In addition, the effect of exposure to fish digestive enzymes on invertebrate methylmercury was

  19. Comparison of Linear Prediction Models for Audio Signals

    Directory of Open Access Journals (Sweden)

    2009-03-01

    While linear prediction (LP) has become immensely popular in speech modeling, it does not seem to provide a good approach for modeling audio signals. This is somewhat surprising, since a tonal signal consisting of a number of sinusoids can be perfectly predicted based on an (all-pole) LP model with a model order that is twice the number of sinusoids. We provide an explanation why this result cannot simply be extrapolated to LP of audio signals. If noise is taken into account in the tonal signal model, a low-order all-pole model appears to be appropriate only when the tonal components are uniformly distributed in the Nyquist interval. Based on this observation, different alternatives to the conventional LP model can be suggested. Either the model should be changed to a pole-zero, a high-order all-pole, or a pitch prediction model, or the conventional LP model should be preceded by an appropriate frequency transform, such as a frequency warping or downsampling. By comparing these alternative LP models to the conventional LP model in terms of frequency estimation accuracy, residual spectral flatness, and perceptual frequency resolution, we obtain several new and promising approaches to LP-based audio modeling.
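
The claim that a noiseless sum of K sinusoids is captured by an all-pole LP model of order 2K can be checked with a few lines of Python; plain least squares is used for the LP coefficients purely for brevity.

```python
# Sketch of the point made in the abstract: a noiseless sum of K sinusoids is
# (almost) perfectly predicted by an all-pole LP model of order 2K.
import numpy as np

fs, K = 8000, 3
order = 2 * K
t = np.arange(2000) / fs
x = sum(np.sin(2 * np.pi * f * t) for f in (440.0, 660.0, 880.0))

# Build the linear prediction problem x[n] ~ sum_k a_k * x[n-k].
X = np.column_stack([x[order - k - 1:-k - 1] for k in range(order)])
target = x[order:]
a, *_ = np.linalg.lstsq(X, target, rcond=None)

residual = target - X @ a
print("relative prediction error:", np.linalg.norm(residual) / np.linalg.norm(target))
```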

  20. State-space prediction model for chaotic time series

    Science.gov (United States)

    Alparslan, A. K.; Sayar, M.; Atilgan, A. R.

    1998-08-01

    A simple method for predicting the continuation of scalar chaotic time series ahead in time is proposed. The false nearest neighbors technique in connection with the time-delayed embedding is employed so as to reconstruct the state space. A local forecasting model based upon the time evolution of the topological neighboring in the reconstructed phase space is suggested. A moving root-mean-square error is utilized in order to monitor the error along the prediction horizon. The model is tested for the convection amplitude of the Lorenz model. The results indicate that for approximately 100 cycles of the training data, the prediction follows the actual continuation very closely for about six cycles. The proposed model, like other state-space forecasting models, captures the long-term behavior of the system due to the use of spatial neighbors in the state space.
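
A toy version of this local state-space forecasting scheme is sketched below: delay-embed the scalar series, locate the nearest neighbours of the current state, and average where those neighbours went next. A logistic-map series replaces the Lorenz convection amplitude to keep the sketch self-contained.

```python
# Toy local forecasting in a reconstructed state space: delay embedding plus a
# k-nearest-neighbour prediction of the next value.
import numpy as np

def delay_embed(x, dim, tau=1):
    """Rows are delay vectors [x[i], x[i+tau], ..., x[i+(dim-1)*tau]]."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

def knn_forecast(series, dim=3, k=5, steps=20):
    series = list(series)
    preds = []
    for _ in range(steps):
        x = np.asarray(series)
        emb = delay_embed(x, dim)
        query, library = emb[-1], emb[:-1]
        nxt = x[dim:]                                  # value following each library state
        idx = np.argsort(np.linalg.norm(library - query, axis=1))[:k]
        preds.append(nxt[idx].mean())
        series.append(preds[-1])                       # feed prediction back in
    return np.array(preds)

# Chaotic logistic-map series used as stand-in training data.
x = [0.4]
for _ in range(999):
    x.append(3.9 * x[-1] * (1 - x[-1]))
print(np.round(knn_forecast(x[:900]), 3))
```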

  1. Predicting artificially drained areas by means of selective model ensemble

    DEFF Research Database (Denmark)

    Møller, Anders Bjørn; Beucher, Amélie; Iversen, Bo Vangsø

    . The approaches employed include decision trees, discriminant analysis, regression models, neural networks and support vector machines amongst others. Several models are trained with each method, using variously the original soil covariates and principal components of the covariates. With a large ensemble...... out since the mid-19th century, and it has been estimated that half of the cultivated area is artificially drained (Olesen, 2009). A number of machine learning approaches can be used to predict artificially drained areas in geographic space. However, instead of choosing the most accurate model....... The study aims firstly to train a large number of models to predict the extent of artificially drained areas using various machine learning approaches. Secondly, the study will develop a method for selecting the models, which give a good prediction of artificially drained areas, when used in conjunction...

  2. An intermittency model for predicting roughness induced transition

    Science.gov (United States)

    Ge, Xuan; Durbin, Paul

    2014-11-01

    An extended model for roughness-induced transition is proposed based on an intermittency transport equation for RANS modeling formulated in local variables. To predict roughness effects in the fully turbulent boundary layer, published boundary conditions for k and ω are used, which depend on the equivalent sand grain roughness height, and account for the effective displacement of wall distance origin. Similarly in our approach, wall distance in the transition model for smooth surfaces is modified by an effective origin, which depends on roughness. Flat plate test cases are computed to show that the proposed model is able to predict the transition onset in agreement with a data correlation of transition location versus roughness height, Reynolds number, and inlet turbulence intensity. Experimental data for a turbine cascade are compared with the predicted results to validate the applicability of the proposed model. Supported by NSF Award Number 1228195.

  3. Driver's mental workload prediction model based on physiological indices.

    Science.gov (United States)

    Yan, Shengyuan; Tran, Cong Chi; Wei, Yingying; Habiyaremye, Jean Luc

    2017-09-15

    Developing an early warning model to predict the driver's mental workload (MWL) is critical and helpful, especially for new or less experienced drivers. The present study aims to investigate the correlation between new drivers' MWL and their work performance, in terms of the number of errors. Additionally, the group method of data handling is used to establish the driver's MWL predictive model based on subjective rating (NASA task load index [NASA-TLX]) and six physiological indices. The results indicate that the NASA-TLX and the number of errors are positively correlated, and the predictive model demonstrates the validity of the proposed approach with an R² value of 0.745. The proposed model is expected to provide new drivers with a reference value for their MWL from the physiological indices, and driving lesson plans can be proposed to sustain an appropriate MWL as well as improve the driver's work performance.

  4. Modeling, Prediction, and Control of Heating Temperature for Tube Billet

    Directory of Open Access Journals (Sweden)

    Yachun Mao

    2015-01-01

    Annular furnaces have multivariate, nonlinear, large-time-lag, and cross-coupling characteristics. The prediction and control of the exit temperature of a tube billet are important but difficult. We establish a prediction model for the final temperature of a tube billet through the OS-ELM-DRPLS method. We address the complex production characteristics, integrate the advantages of the PLS and ELM algorithms in establishing linear and nonlinear models, and consider model updating and data lag. Based on the proposed model, we design a prediction control algorithm for tube billet temperature. The algorithm is validated using practical production data from Baosteel Co., Ltd. Results show that the model achieves the precision required in industrial applications. The temperature of the tube billet can be controlled within the required temperature range through a compensation control method.

  5. A model for predicting lung cancer response to therapy

    International Nuclear Information System (INIS)

    Seibert, Rebecca M.; Ramsey, Chester R.; Hines, J. Wesley; Kupelian, Patrick A.; Langen, Katja M.; Meeks, Sanford L.; Scaperoth, Daniel D.

    2007-01-01

    Purpose: Volumetric computed tomography (CT) images acquired by image-guided radiation therapy (IGRT) systems can be used to measure tumor response over the course of treatment. Predictive adaptive therapy is a novel treatment technique that uses volumetric IGRT data to actively predict the future tumor response to therapy during the first few weeks of IGRT treatment. The goal of this study was to develop and test a model for predicting lung tumor response during IGRT treatment using serial megavoltage CT (MVCT). Methods and Materials: Tumor responses were measured for 20 lung cancer lesions in 17 patients who were imaged and treated with helical tomotherapy with doses ranging from 2.0 to 2.5 Gy per fraction. Five patients were treated with concurrent chemotherapy, and 1 patient was treated with neoadjuvant chemotherapy. Tumor response to treatment was retrospectively measured by contouring 480 serial MVCT images acquired before treatment. A nonparametric, memory-based locally weighted regression (LWR) model was developed for predicting tumor response using the retrospective tumor response data. This model predicts future tumor volumes and the associated confidence intervals based on limited observations during the first 2 weeks of treatment. The predictive accuracy of the model was tested using a leave-one-out cross-validation technique with the measured tumor responses. Results: The predictive algorithm was used to compare predicted versus measured tumor volume response for all 20 lesions. The average error for the predictions of the final tumor volume was 12%, with the true volumes always bounded by the 95% confidence interval. The greatest model uncertainty occurred near the middle of the course of treatment, where the tumor response relationships were more complex, the model had less information, and the predictors were more varied. The optimal days for measuring the tumor response on the MVCT images were on elapsed Days 1, 2, 5, 9, 11, 12, 17, and 18 during
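
A minimal locally weighted regression sketch in the spirit of the memory-based model described here is shown below; the Gaussian kernel, bandwidth, and the tiny synthetic "library" of past response curves are all illustrative assumptions.

```python
# Minimal locally weighted regression (LWR): predictions at a query point come from a
# weighted linear fit over stored historical cases, with weights decaying with distance.
import numpy as np

def lwr_predict(query, X_lib, y_lib, bandwidth=1.0):
    d = np.linalg.norm(X_lib - query, axis=1)
    w = np.exp(-(d / bandwidth) ** 2)                  # Gaussian kernel weights
    A = np.hstack([np.ones((len(X_lib), 1)), X_lib])   # weighted linear fit
    W = np.diag(w)
    beta = np.linalg.pinv(A.T @ W @ A) @ A.T @ W @ y_lib
    return np.array([1.0, *query]) @ beta

# Toy library: (relative volume at day 7, relative volume at day 14) -> final volume.
X_lib = np.array([[0.95, 0.85], [0.90, 0.75], [0.98, 0.95], [0.85, 0.65]])
y_lib = np.array([0.60, 0.45, 0.85, 0.30])
print(round(lwr_predict(np.array([0.92, 0.80]), X_lib, y_lib, bandwidth=0.2), 3))
```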

  6. Updated climatological model predictions of ionospheric and HF propagation parameters

    International Nuclear Information System (INIS)

    Reilly, M.H.; Rhoads, F.J.; Goodman, J.M.; Singh, M.

    1991-01-01

    The prediction performances of several climatological models, including the ionospheric conductivity and electron density model, RADAR C, and Ionospheric Communications Analysis and Predictions Program, are evaluated for different regions and sunspot number inputs. Particular attention is given to the near-real-time (NRT) predictions associated with single-station updates. It is shown that a dramatic improvement can be obtained by using single-station ionospheric data to update the driving parameters for an ionospheric model for NRT predictions of foF2 and other ionospheric and HF circuit parameters. For middle latitudes, the improvement extends out thousands of kilometers from the update point to points of comparable corrected geomagnetic latitude. 10 refs

  7. Modelling earth current precursors in earthquake prediction

    Directory of Open Access Journals (Sweden)

    R. Di Maio

    1997-06-01

    This paper deals with the theory of earth current precursors of earthquakes. A dilatancy-diffusion-polarization model is proposed to explain the anomalies of the electric potential which are observed on the ground surface prior to some earthquakes. The electric polarization is believed to be the electrokinetic effect due to the invasion of fluids into new pores, which are opened inside a stressed-dilated rock body. The time and space variation of the distribution of the electric potential in a layered earth, as well as in a faulted half-space, is studied in detail. The results show that the surface response depends on the underground conductivity distribution and on the relative disposition of the measuring dipole with respect to the buried bipole source. A field procedure based on the use of an areal layout of the recording sites is proposed, in order to obtain the most complete information on the time and space evolution of the precursory phenomena in any given seismic region.

  8. Predictive modeling of coupled multi-physics systems: I. Theory

    International Nuclear Information System (INIS)

    Cacuci, Dan Gabriel

    2014-01-01

    Highlights: • We developed “predictive modeling of coupled multi-physics systems (PMCMPS)”. • PMCMPS reduces uncertainties in predicted model responses and parameters. • PMCMPS treats very large coupled systems efficiently. - Abstract: This work presents an innovative mathematical methodology for “predictive modeling of coupled multi-physics systems (PMCMPS).” This methodology takes into account fully the coupling terms between the systems but requires only the computational resources that would be needed to perform predictive modeling on each system separately. The PMCMPS methodology uses the maximum entropy principle to construct an optimal approximation of the unknown a priori distribution based on a priori known mean values and uncertainties characterizing the parameters and responses for both multi-physics models. This “maximum entropy”-approximate a priori distribution is combined, using Bayes’ theorem, with the “likelihood” provided by the multi-physics simulation models. Subsequently, the posterior distribution thus obtained is evaluated using the saddle-point method to obtain analytical expressions for the optimally predicted values of the multi-physics models' parameters and responses, along with the corresponding reduced uncertainties. Notably, the predictive modeling methodology for the coupled systems is constructed such that the systems can be considered sequentially rather than simultaneously, while preserving exactly the same results as if the systems were treated simultaneously. Consequently, very large coupled systems, which could perhaps exceed available computational resources if treated simultaneously, can be treated with the PMCMPS methodology presented in this work sequentially and without any loss of generality or information, requiring just the resources that would be needed if the systems were treated sequentially.

  9. Embryo quality predictive models based on cumulus cells gene expression

    Directory of Open Access Journals (Sweden)

    Devjak R

    2016-06-01

    Since the introduction of in vitro fertilization (IVF) in the clinical practice of infertility treatment, indicators of high-quality embryos have been investigated. Cumulus cells (CC) have a specific gene expression profile according to the developmental potential of the oocyte they surround, and therefore, specific gene expression could be used as a biomarker. The aim of our study was to combine more than one biomarker to observe improvement in the prediction of embryo development. In this study, 58 CC samples from 17 IVF patients were analyzed. This study was approved by the Republic of Slovenia National Medical Ethics Committee. Gene expression analysis [quantitative real-time polymerase chain reaction (qPCR)] for five genes, analyzed according to embryo quality level, was performed. Two prediction models were tested for embryo quality prediction: a binary logistic model and a decision tree model. As the main outcome, gene expression levels for five genes were taken and the area under the curve (AUC) for the two prediction models was calculated. Among the tested genes, AMHR2 and LIF showed a significant expression difference between high-quality and low-quality embryos. These two genes were used for the construction of two prediction models: the binary logistic model yielded an AUC of 0.72 ± 0.08 and the decision tree model yielded an AUC of 0.73 ± 0.03. The two prediction models yielded similar predictive power to differentiate high- and low-quality embryos. In terms of eventual clinical decision making, the decision tree model resulted in easy-to-interpret rules that are highly applicable in clinical practice.

  10. Comparison of Predictive Modeling Methods of Aircraft Landing Speed

    Science.gov (United States)

    Diallo, Ousmane H.

    2012-01-01

    Expected increases in air traffic demand have stimulated the development of air traffic control tools intended to assist the air traffic controller in accurately and precisely spacing aircraft landing at congested airports. Such tools will require an accurate landing-speed prediction to increase throughput while decreasing necessary controller interventions for avoiding separation violations. There are many practical challenges to developing an accurate landing-speed model that has acceptable prediction errors. This paper discusses the development of a near-term implementation, using readily available information, to estimate/model final approach speed from the top of the descent phase of flight to the landing runway. As a first approach, all variables found to contribute directly to the landing-speed prediction model are used to build a multi-regression technique of the response surface equation (RSE). Data obtained from the operations of a major airline for a passenger transport aircraft type flying into Dallas/Fort Worth International Airport are used to predict the landing speed. The approach was promising because it decreased the standard deviation of the landing-speed error prediction by at least 18% from the standard deviation of the baseline error, depending on the gust condition at the airport. However, when the number of variables is reduced to those most likely obtainable at other major airports, the RSE model shows little improvement over existing methods. Consequently, a neural network that relies on a nonlinear regression technique is utilized as an alternative modeling approach. For the reduced-variable cases, the standard deviation of the neural network models' errors represents over a 5% reduction compared to the RSE model errors, and at least a 10% reduction relative to the baseline predicted landing-speed error standard deviation. Overall, the constructed models predict the landing speed more accurately and precisely than the current state of the art.

  11. Individualized prediction of perineural invasion in colorectal cancer: development and validation of a radiomics prediction model.

    Science.gov (United States)

    Huang, Yanqi; He, Lan; Dong, Di; Yang, Caiyun; Liang, Cuishan; Chen, Xin; Ma, Zelan; Huang, Xiaomei; Yao, Su; Liang, Changhong; Tian, Jie; Liu, Zaiyi

    2018-02-01

    To develop and validate a radiomics prediction model for individualized prediction of perineural invasion (PNI) in colorectal cancer (CRC). After computed tomography (CT) radiomics features extraction, a radiomics signature was constructed in derivation cohort (346 CRC patients). A prediction model was developed to integrate the radiomics signature and clinical candidate predictors [age, sex, tumor location, and carcinoembryonic antigen (CEA) level]. Apparent prediction performance was assessed. After internal validation, independent temporal validation (separate from the cohort used to build the model) was then conducted in 217 CRC patients. The final model was converted to an easy-to-use nomogram. The developed radiomics nomogram that integrated the radiomics signature and CEA level showed good calibration and discrimination performance [Harrell's concordance index (c-index): 0.817; 95% confidence interval (95% CI): 0.811-0.823]. Application of the nomogram in validation cohort gave a comparable calibration and discrimination (c-index: 0.803; 95% CI: 0.794-0.812). Integrating the radiomics signature and CEA level into a radiomics prediction model enables easy and effective risk assessment of PNI in CRC. This stratification of patients according to their PNI status may provide a basis for individualized auxiliary treatment.

  12. Longitudinal modeling to predict vital capacity in amyotrophic lateral sclerosis.

    Science.gov (United States)

    Jahandideh, Samad; Taylor, Albert A; Beaulieu, Danielle; Keymer, Mike; Meng, Lisa; Bian, Amy; Atassi, Nazem; Andrews, Jinsy; Ennist, David L

    2018-05-01

    Death in amyotrophic lateral sclerosis (ALS) patients is related to respiratory failure, which is assessed in clinical settings by measuring vital capacity. We developed ALS-VC, a modeling tool for longitudinal prediction of vital capacity in ALS patients. A gradient boosting machine (GBM) model was trained using the PRO-ACT (Pooled Resource Open-access ALS Clinical Trials) database of over 10,000 ALS patient records. We hypothesized that a reliable vital capacity predictive model could be developed using PRO-ACT. The model was used to compare FVC predictions with a 30-day run-in period to predictions made from just baseline. The internal root mean square deviations (RMSD) of the run-in and baseline models were 0.534 and 0.539, respectively, across the 7L FVC range captured in PRO-ACT. The RMSDs of the run-in and baseline models using an unrelated, contemporary external validation dataset (0.553 and 0.538, respectively) were comparable to the internal validation. The model was shown to have similar accuracy for predicting SVC (RMSD = 0.562). The most important features for both run-in and baseline models were "Baseline forced vital capacity" and "Days since baseline." We developed ALS-VC, a GBM model trained with the PRO-ACT ALS dataset that provides vital capacity predictions generalizable to external datasets. The ALS-VC model could be helpful in advising and counseling patients, and, in clinical trials, it could be used to generate virtual control arms against which observed outcomes could be compared, or used to stratify patients into slowly, average, and rapidly progressing subgroups.
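
A hedged sketch of the ALS-VC setup follows: a gradient boosting regressor trained on the two features the abstract reports as most important (baseline FVC and days since baseline), scored by RMSD. The synthetic records merely stand in for PRO-ACT data.

```python
# Illustrative gradient boosting machine (GBM) for longitudinal FVC prediction,
# trained on synthetic stand-ins for PRO-ACT records and evaluated with RMSD.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n = 5000
baseline_fvc = rng.uniform(1.0, 5.5, n)            # litres
days = rng.uniform(0, 540, n)                      # days since baseline
slope = rng.normal(-0.0015, 0.0008, n)             # per-patient decline rate (L/day)
fvc = np.clip(baseline_fvc + slope * days + rng.normal(0, 0.2, n), 0.3, None)

X = np.column_stack([baseline_fvc, days])
X_tr, X_te, y_tr, y_te = train_test_split(X, fvc, test_size=0.2, random_state=0)
gbm = GradientBoostingRegressor(n_estimators=300, max_depth=3).fit(X_tr, y_tr)
rmsd = np.sqrt(np.mean((gbm.predict(X_te) - y_te) ** 2))
print(f"RMSD on held-out data: {rmsd:.3f} L")
```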

  13. Prediction error, ketamine and psychosis: An updated model.

    Science.gov (United States)

    Corlett, Philip R; Honey, Garry D; Fletcher, Paul C

    2016-11-01

    In 2007, we proposed an explanation of delusion formation as aberrant prediction error-driven associative learning. Further, we argued that the NMDA receptor antagonist ketamine provided a good model for this process. Subsequently, we validated the model in patients with psychosis, relating aberrant prediction error signals to delusion severity. During the ensuing period, we have developed these ideas, drawing on the simple principle that brains build a model of the world and refine it by minimising prediction errors, as well as using it to guide perceptual inferences. While previously we focused on the prediction error signal per se, an updated view takes into account its precision, as well as the precision of prior expectations. With this expanded perspective, we see several possible routes to psychotic symptoms - which may explain the heterogeneity of psychotic illness, as well as the fact that other drugs, with different pharmacological actions, can produce psychotomimetic effects. In this article, we review the basic principles of this model and highlight specific ways in which prediction errors can be perturbed, in particular considering the reliability and uncertainty of predictions. The expanded model explains hallucinations as perturbations of the uncertainty mediated balance between expectation and prediction error. Here, expectations dominate and create perceptions by suppressing or ignoring actual inputs. Negative symptoms may arise due to poor reliability of predictions in service of action. By mapping from biology to belief and perception, the account proffers new explanations of psychosis. However, challenges remain. We attempt to address some of these concerns and suggest future directions, incorporating other symptoms into the model, building towards better understanding of psychosis. © The Author(s) 2016.

  14. Looplessness in networks is linked to trophic coherence.

    Science.gov (United States)

    Johnson, Samuel; Jones, Nick S

    2017-05-30

    Many natural, complex systems are remarkably stable thanks to an absence of feedback acting on their elements. When described as networks, these exhibit few or no cycles, and associated matrices have small leading eigenvalues. It has been suggested that this architecture can confer advantages to the system as a whole, such as "qualitative stability," but this observation does not in itself explain how a loopless structure might arise. We show here that the number of feedback loops in a network, as well as the eigenvalues of associated matrices, is determined by a structural property called trophic coherence, a measure of how neatly nodes fall into distinct levels. Our theory correctly classifies a variety of networks, including those derived from genes, metabolites, species, neurons, words, computers, and trading nations, into two distinct regimes of high and low feedback, and provides a null model to gauge the significance of related magnitudes. Because trophic coherence suppresses feedback, whereas an absence of feedback alone does not lead to coherence, our work suggests that the reasons for "looplessness" in nature should be sought in coherence-inducing mechanisms.
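
Trophic coherence itself is straightforward to compute, and a short sketch of the calculation the paper builds on is given below: solve for trophic levels (basal nodes at level 1, every other node one level above the mean of its in-neighbours) and take the standard deviation of level differences across edges as the incoherence parameter q.

```python
# Sketch of the trophic-coherence calculation; q = 0 corresponds to a perfectly
# coherent, feedback-suppressing hierarchy.
import numpy as np

def trophic_levels(adj):
    """adj[i, j] = 1 for an edge i -> j (e.g. prey i eaten by predator j); no self-loops."""
    n = adj.shape[0]
    k_in = adj.sum(axis=0)
    A = np.eye(n)
    b = np.ones(n)
    for j in range(n):
        if k_in[j] > 0:
            A[j, :] -= adj[:, j] / k_in[j]   # s_j - mean(level of in-neighbours) = 1
    return np.linalg.solve(A, b)

def incoherence(adj):
    s = trophic_levels(adj)
    i, j = np.nonzero(adj)
    return float(np.std(s[j] - s[i]))        # spread of trophic distances over edges

# Small example: 0 -> 1 -> 2 plus a "short-cut" edge 0 -> 2 that lowers coherence.
adj = np.zeros((3, 3))
adj[0, 1] = adj[1, 2] = adj[0, 2] = 1
print(trophic_levels(adj), incoherence(adj))
```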

  15. Predicting soil acidification trends at Plynlimon using the SAFE model

    Directory of Open Access Journals (Sweden)

    B. Reynolds

    1997-01-01

    Full Text Available The SAFE model has been applied to an acid grassland site, located on base-poor stagnopodzol soils derived from Lower Palaeozoic greywackes. The model predicts that acidification of the soil has occurred in response to increased acid deposition following the industrial revolution. Limited recovery is predicted following the decline in sulphur deposition during the mid to late 1970s. Reducing excess sulphur and NOx deposition in 1998 to 40% and 70% of 1980 levels results in further recovery but soil chemical conditions (base saturation, soil water pH and ANC do not return to values predicted in pre-industrial times. The SAFE model predicts that critical loads (expressed in terms of the (Ca+Mg+K:Alcrit ratio for six vegetation species found in acid grassland communities are not exceeded despite the increase in deposited acidity following the industrial revolution. The relative growth response of selected vegetation species characteristic of acid grassland swards has been predicted using a damage function linking growth to soil solution base cation to aluminium ratio. The results show that very small growth reductions can be expected for 'acid tolerant' plants growing in acid upland soils. For more sensitive species such as Holcus lanatus, SAFE predicts that growth would have been reduced by about 20% between 1951 and 1983, when acid inputs were greatest. Recovery to c. 90% of normal growth (under laboratory conditions is predicted as acidic inputs decline.

  16. A deep auto-encoder model for gene expression prediction.

    Science.gov (United States)

    Xie, Rui; Wen, Jia; Quitadamo, Andrew; Cheng, Jianlin; Shi, Xinghua

    2017-11-17

    Gene expression is a key intermediate level through which genotypes lead to a particular trait. Gene expression is affected by various factors, including the genotypes of genetic variants. With the aim of delineating the genetic impact on gene expression, we build a deep auto-encoder model to assess how well genetic variants contribute to gene expression changes. This new deep learning model is a regression-based predictive model based on the MultiLayer Perceptron and Stacked Denoising Auto-encoder (MLP-SAE). The model is trained using a stacked denoising auto-encoder for feature selection and a multilayer perceptron framework for backpropagation. We further improve the model by introducing dropout to prevent overfitting and improve performance. To demonstrate the usage of this model, we apply MLP-SAE to a real genomic dataset with genotypes and gene expression profiles measured in yeast. Our results show that the MLP-SAE model with dropout outperforms other models, including Lasso, Random Forests, and the MLP-SAE model without dropout. Using the MLP-SAE model with dropout, we show that gene expression quantifications predicted by the model solely from genotypes align well with true gene expression patterns. We provide a deep auto-encoder model for predicting gene expression from SNP genotypes. This study demonstrates that deep learning is appropriate for tackling another genomic problem, i.e., building predictive models to understand genotypes' contribution to gene expression. With the emerging availability of richer genomic data, we anticipate that deep learning models will play a bigger role in modeling and interpreting genomic data.
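
A rough two-stage sketch of the MLP-SAE idea (not the authors' implementation) is given below in PyTorch: a denoising auto-encoder learns a representation of the genotypes, and an MLP with dropout regresses expression from that representation. Dimensions and data are synthetic.

```python
# Two-stage sketch: (1) denoising auto-encoder on SNP genotypes, (2) MLP regressor
# with dropout predicting gene expression from the learned representation.
import torch
import torch.nn as nn

torch.manual_seed(0)
n_samples, n_snps, n_genes, hidden = 512, 200, 50, 64
genotypes = torch.randint(0, 3, (n_samples, n_snps)).float()      # 0/1/2 allele counts
expression = genotypes[:, :n_genes] * 0.3 + torch.randn(n_samples, n_genes) * 0.1

encoder = nn.Sequential(nn.Linear(n_snps, hidden), nn.ReLU())
decoder = nn.Linear(hidden, n_snps)
regressor = nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU(),
                          nn.Dropout(0.2), nn.Linear(hidden, n_genes))

# Stage 1: train the denoising auto-encoder (reconstruct clean genotypes from noisy input).
opt = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3)
for _ in range(200):
    noisy = genotypes + torch.randn_like(genotypes) * 0.3
    loss = nn.functional.mse_loss(decoder(encoder(noisy)), genotypes)
    opt.zero_grad(); loss.backward(); opt.step()

# Stage 2: train the MLP regressor (with dropout) on the learned representation.
opt = torch.optim.Adam(regressor.parameters(), lr=1e-3)
for _ in range(200):
    pred = regressor(encoder(genotypes).detach())
    loss = nn.functional.mse_loss(pred, expression)
    opt.zero_grad(); loss.backward(); opt.step()
print("final regression MSE:", float(loss))
```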

  17. Predictive modeling of coral disease distribution within a reef system.

    Directory of Open Access Journals (Sweden)

    Gareth J Williams

    2010-02-01

    Diseases often display complex and distinct associations with their environment due to differences in etiology, modes of transmission between hosts, and the shifting balance between pathogen virulence and host resistance. Statistical modeling has been underutilized in coral disease research to explore the spatial patterns that result from this triad of interactions. We tested the hypotheses that: (1) coral diseases show distinct associations with multiple environmental factors, (2) incorporating interactions (synergistic collinearities) among environmental variables is important when predicting coral disease spatial patterns, and (3) modeling overall coral disease prevalence (the prevalence of multiple diseases as a single proportion value) will increase predictive error relative to modeling the same diseases independently. Four coral diseases: Porites growth anomalies (PorGA), Porites tissue loss (PorTL), Porites trematodiasis (PorTrem), and Montipora white syndrome (MWS), and their interactions with 17 predictor variables were modeled using boosted regression trees (BRT) within a reef system in Hawaii. Each disease showed distinct associations with the predictors. The environmental predictors showing the strongest overall associations with the coral diseases were both biotic and abiotic. PorGA was optimally predicted by a negative association with turbidity, PorTL and MWS by declines in butterflyfish and juvenile parrotfish abundance respectively, and PorTrem by a modal relationship with Porites host cover. Incorporating interactions among predictor variables contributed to the predictive power of our models, particularly for PorTrem. Combining diseases (using overall disease prevalence as the model response) led to an average six-fold increase in cross-validation predictive deviance over modeling the diseases individually. We therefore recommend that coral diseases be modeled separately, unless known to have etiologies that respond in a similar manner to

  18. Plant water potential improves prediction of empirical stomatal models.

    Directory of Open Access Journals (Sweden)

    William R L Anderegg

    Climate change is expected to lead to increases in drought frequency and severity, with deleterious effects on many ecosystems. Stomatal responses to changing environmental conditions form the backbone of all ecosystem models, but are based on empirical relationships and are not well-tested during drought conditions. Here, we use a dataset of 34 woody plant species spanning global forest biomes to examine the effect of leaf water potential on stomatal conductance and test the predictive accuracy of three major stomatal models and a recently proposed model. We find that current leaf-level empirical models have consistent biases of over-prediction of stomatal conductance during dry conditions, particularly at low soil water potentials. Furthermore, the recently proposed stomatal conductance model yields increases in predictive capability compared to current models, and with particular improvement during drought conditions. Our results reveal that including stomatal sensitivity to declining water potential and consequent impairment of plant water transport will improve predictions during drought conditions and show that many biomes contain a diversity of plant stomatal strategies that range from risky to conservative stomatal regulation during water stress. Such improvements in stomatal simulation are greatly needed to help unravel and predict the response of ecosystems to future climate extremes.

  19. Cross-Validation of Aerobic Capacity Prediction Models in Adolescents.

    Science.gov (United States)

    Burns, Ryan Donald; Hannon, James C; Brusseau, Timothy A; Eisenman, Patricia A; Saint-Maurice, Pedro F; Welk, Greg J; Mahar, Matthew T

    2015-08-01

    Cardiorespiratory endurance is a component of health-related fitness. FITNESSGRAM recommends the Progressive Aerobic Cardiovascular Endurance Run (PACER) or One mile Run/Walk (1MRW) to assess cardiorespiratory endurance by estimating VO2 Peak. No research has cross-validated prediction models from both PACER and 1MRW, including the New PACER Model and PACER-Mile Equivalent (PACER-MEQ), using current standards. The purpose of this study was to cross-validate prediction models from the PACER and 1MRW against measured VO2 Peak in adolescents. Cardiorespiratory endurance data were collected on 90 adolescents aged 13-16 years (Mean = 14.7 ± 1.3 years; 32 girls, 52 boys) who completed the PACER and 1MRW in addition to a laboratory maximal treadmill test to measure VO2 Peak. Multiple correlations among the various models with measured VO2 Peak were considered moderately strong (R = 0.74-0.78), and prediction error (RMSE) ranged from 5.95 ml·kg⁻¹·min⁻¹ to 8.27 ml·kg⁻¹·min⁻¹. Criterion-referenced agreement with FITNESSGRAM's Healthy Fitness Zones was considered fair-to-good among models (Kappa = 0.31-0.62; Agreement = 75.5-89.9%; F = 0.08-0.65). In conclusion, the prediction models demonstrated moderately strong linear relationships with measured VO2 Peak, fair prediction error, and fair-to-good criterion-referenced agreement with measured VO2 Peak classification into FITNESSGRAM's Healthy Fitness Zones.
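
The evaluation described here combines a continuous error metric with criterion-referenced agreement, which can be reproduced in a few lines; the VO2 Peak values and the Healthy Fitness Zone cut-point below are illustrative, not FITNESSGRAM standards.

```python
# Sketch of the evaluation: RMSE against measured VO2 Peak plus Cohen's kappa after
# classifying both measured and predicted values with a (hypothetical) HFZ cut-point.
import numpy as np
from sklearn.metrics import cohen_kappa_score

measured = np.array([38.5, 42.0, 47.3, 51.8, 35.2, 44.6, 49.9, 40.1])
predicted = np.array([40.2, 41.1, 45.0, 53.5, 37.8, 43.0, 51.2, 38.7])

rmse = np.sqrt(np.mean((predicted - measured) ** 2))
cutoff = 42.0                                  # illustrative cut-point (ml/kg/min)
kappa = cohen_kappa_score(measured >= cutoff, predicted >= cutoff)
print(f"RMSE = {rmse:.2f} ml/kg/min, kappa = {kappa:.2f}")
```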

  20. Comparison of the models of financial distress prediction

    Directory of Open Access Journals (Sweden)

    Jiří Omelka

    2013-01-01

    Prediction of financial distress generally approximates whether a business entity is close to bankruptcy, or at least to serious financial problems. Financial distress is defined as a situation in which a company is not able to satisfy its liabilities in any form, or in which its liabilities exceed its assets. Classification of the financial situation of business entities represents a multidisciplinary scientific issue that draws not only on economic theory but also on statistical and econometric approaches. The first models of financial distress prediction originated in the 1960s. One of the best known is Altman's model, followed by a range of others constructed on broadly similar bases. In many existing models it is possible to find common elements which could be marked as elementary indicators of potential financial distress of a company. The objective of this article is, based on a comparison of existing models of financial distress prediction, to define a set of basic indicators of a company's financial distress and to identify their critical aspects. The sample defined in this way will be the background for future research focused on the determination of a one-dimensional model of financial distress prediction which would subsequently become the basis for the construction of a multi-dimensional prediction model.

  1. Predictive assessment of models for dynamic functional connectivity

    DEFF Research Database (Denmark)

    Nielsen, Søren Føns Vind; Schmidt, Mikkel Nørgaard; Madsen, Kristoffer Hougaard

    2018-01-01

    In neuroimaging, it has become evident that models of dynamic functional connectivity (dFC), which characterize how intrinsic brain organization changes over time, can provide a more detailed representation of brain function than traditional static analyses. Many dFC models in the literature represent functional brain networks as a meta-stable process with a discrete number of states; however, there is a lack of consensus on how to perform model selection and learn the number of states, as well as a lack of understanding of how different modeling assumptions influence the estimated state dynamics. To address these issues, we consider a predictive likelihood approach to model assessment, where models are evaluated based on their predictive performance on held-out test data. Examining several prominent models of dFC (in their probabilistic formulations) we demonstrate our framework...
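
    A minimal sketch of the general idea of predictive-likelihood model selection described above: candidate models with different numbers of latent states are scored by their log-likelihood on held-out data. A Gaussian mixture fitted to synthetic data stands in for the probabilistic dFC models; it is not the authors' pipeline.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Hypothetical "connectivity feature" time series drawn from 3 latent states
X = np.vstack([rng.normal(loc=m, scale=0.5, size=(200, 4)) for m in (-2, 0, 2)])
rng.shuffle(X)
train, test = X[:450], X[450:]

# Evaluate each candidate number of states by mean held-out log-likelihood
for k in range(1, 7):
    gmm = GaussianMixture(n_components=k, random_state=0).fit(train)
    print(f"K={k}: held-out log-likelihood per sample = {gmm.score(test):.3f}")
```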

  2. Genome-Scale Metabolic Model for the Green Alga Chlorella vulgaris UTEX 395 Accurately Predicts Phenotypes under Autotrophic, Heterotrophic, and Mixotrophic Growth Conditions.

    Science.gov (United States)

    Zuñiga, Cristal; Li, Chien-Ting; Huelsman, Tyler; Levering, Jennifer; Zielinski, Daniel C; McConnell, Brian O; Long, Christopher P; Knoshaug, Eric P; Guarnieri, Michael T; Antoniewicz, Maciek R; Betenbaugh, Michael J; Zengler, Karsten

    2016-09-01

    The green microalga Chlorella vulgaris has been widely recognized as a promising candidate for biofuel production due to its ability to store high lipid content and its natural metabolic versatility. Compartmentalized genome-scale metabolic models constructed from genome sequences enable quantitative insight into the transport and metabolism of compounds within a target organism. These metabolic models have long been utilized to generate optimized design strategies for an improved production process. Here, we describe the reconstruction, validation, and application of a genome-scale metabolic model for C. vulgaris UTEX 395, iCZ843. The reconstruction represents the most comprehensive model for any eukaryotic photosynthetic organism to date, based on the genome size and number of genes in the reconstruction. The highly curated model accurately predicts phenotypes under photoautotrophic, heterotrophic, and mixotrophic conditions. The model was validated against experimental data and lays the foundation for model-driven strain design and medium alteration to improve yield. Calculated flux distributions under different trophic conditions show that a number of key pathways are affected by nitrogen starvation conditions, including central carbon metabolism and amino acid, nucleotide, and pigment biosynthetic pathways. Furthermore, model prediction of growth rates under various medium compositions and subsequent experimental validation showed an increased growth rate with the addition of tryptophan and methionine. © 2016 American Society of Plant Biologists. All rights reserved.

  3. Genome-Scale Metabolic Model for the Green Alga Chlorella vulgaris UTEX 395 Accurately Predicts Phenotypes under Autotrophic, Heterotrophic, and Mixotrophic Growth Conditions1

    Science.gov (United States)

    Zuñiga, Cristal; Li, Chien-Ting; Zielinski, Daniel C.; Guarnieri, Michael T.; Antoniewicz, Maciek R.; Zengler, Karsten

    2016-01-01

    The green microalga Chlorella vulgaris has been widely recognized as a promising candidate for biofuel production due to its ability to store high lipid content and its natural metabolic versatility. Compartmentalized genome-scale metabolic models constructed from genome sequences enable quantitative insight into the transport and metabolism of compounds within a target organism. These metabolic models have long been utilized to generate optimized design strategies for an improved production process. Here, we describe the reconstruction, validation, and application of a genome-scale metabolic model for C. vulgaris UTEX 395, iCZ843. The reconstruction represents the most comprehensive model for any eukaryotic photosynthetic organism to date, based on the genome size and number of genes in the reconstruction. The highly curated model accurately predicts phenotypes under photoautotrophic, heterotrophic, and mixotrophic conditions. The model was validated against experimental data and lays the foundation for model-driven strain design and medium alteration to improve yield. Calculated flux distributions under different trophic conditions show that a number of key pathways are affected by nitrogen starvation conditions, including central carbon metabolism and amino acid, nucleotide, and pigment biosynthetic pathways. Furthermore, model prediction of growth rates under various medium compositions and subsequent experimental validation showed an increased growth rate with the addition of tryptophan and methionine. PMID:27372244

  4. SHMF: Interest Prediction Model with Social Hub Matrix Factorization

    Directory of Open Access Journals (Sweden)

    Chaoyuan Cui

    2017-01-01

    Full Text Available With the development of social networks, microblogs have become a major social communication tool. Microblogs contain a great deal of valuable information, such as personal preferences, public opinion, and marketing signals. Consequently, research on user interest prediction in microblogs has clear practical significance. However, extracting information related to user interest from constantly updated blog posts is not easy. Existing prediction approaches based on probabilistic factor analysis use only the blog posts published by a user to predict that user's interests, which is not very effective for users who post little but browse a lot. In this paper, we propose a new prediction model, called SHMF, based on social hub matrix factorization. SHMF builds the interest prediction model by combining the blog posts published by both the user and the direct neighbors in the user's social hub. The proposed model predicts user interest by integrating the user's historical behavior and temporal factors as well as the user's friendships, thus achieving accurate forecasts of the user's future interests. Experimental results on Sina Weibo show the efficiency and effectiveness of the proposed model.
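
    A minimal sketch of plain matrix factorization for interest prediction, the building block that SHMF extends; the social-hub weighting and temporal factors of the actual model are not reproduced, and the toy matrix and hyperparameters are purely illustrative.

```python
import numpy as np

def factorize(R, k=2, steps=5000, lr=0.01, reg=0.02, seed=0):
    """Factor a user x topic interest matrix R (0 = unobserved) into U @ V.T."""
    rng = np.random.default_rng(seed)
    n_users, n_items = R.shape
    U = rng.normal(scale=0.1, size=(n_users, k))
    V = rng.normal(scale=0.1, size=(n_items, k))
    mask = R > 0
    for _ in range(steps):
        err = mask * (R - U @ V.T)          # error only on observed entries
        U += lr * (err @ V - reg * U)
        V += lr * (err.T @ U - reg * V)
    return U, V

# Toy interest matrix: rows = users, columns = topics, 0 = not observed
R = np.array([[5, 3, 0, 1],
              [4, 0, 0, 1],
              [1, 1, 0, 5],
              [0, 1, 5, 4]], dtype=float)
U, V = factorize(R)
print(np.round(U @ V.T, 2))   # predicted interest scores, including unobserved cells
```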

  5. Prediction Models and Decision Support: Chances and Challenges

    NARCIS (Netherlands)

    Kappen, T.H.

    2015-01-01

    A clinical prediction model can assist doctors in arriving at the most likely diagnosis or estimating the prognosis. By utilizing various patient- and disease-related properties, such models can yield objective estimations of the risk of a disease or the probability of a certain disease course for

  6. A model to predict the sound reflection from forests

    NARCIS (Netherlands)

    Wunderli, J.M.; Salomons, E.M.

    2009-01-01

    A model is presented to predict the reflection of sound at forest edges. A single tree is modelled as a vertical cylinder. For the reflection at a cylinder an analytical solution is given based on the theory of scattering of spherical waves. The entire forest is represented by a line of cylinders

  7. Computationally efficient model predictive control algorithms a neural network approach

    CERN Document Server

    Ławryńczuk, Maciej

    2014-01-01

    This book thoroughly discusses computationally efficient (suboptimal) Model Predictive Control (MPC) techniques based on neural models. The subjects treated include: · A few types of suboptimal MPC algorithms in which a linear approximation of the model or of the predicted trajectory is successively calculated on-line and used for prediction. · Implementation details of the MPC algorithms for feedforward perceptron neural models, neural Hammerstein models, neural Wiener models and state-space neural models. · The MPC algorithms based on neural multi-models (inspired by the idea of predictive control). · The MPC algorithms with neural approximation with no on-line linearization. · The MPC algorithms with guaranteed stability and robustness. · Cooperation between the MPC algorithms and set-point optimization. Thanks to linearization (or neural approximation), the presented suboptimal algorithms do not require d...

  8. Prediction of speech intelligibility based on an auditory preprocessing model

    DEFF Research Database (Denmark)

    Christiansen, Claus Forup Corlin; Pedersen, Michael Syskind; Dau, Torsten

    2010-01-01

    A speech-in-noise experiment was used for training and an ideal binary mask experiment was used for evaluation. All three models were able to capture the trends in the speech-in-noise training data well, but the proposed model provides a better prediction of the binary mask test data, particularly when the binary masks degenerate to a noise vocoder.

  19. Predictive ability of broiler production models | Ogundu | Animal ...

    African Journals Online (AJOL)

    The weekly body weight measurements of a growing strain of Ross broiler were used to compare the ability of three mathematical models (multiple linear, quadratic, and exponential) to predict 8-week body weight from early body measurements at weeks I, II, III, IV, V, VI and VII. The results suggest that the three models ...

  10. Inferential ecosystem models, from network data to prediction

    Science.gov (United States)

    James S. Clark; Pankaj Agarwal; David M. Bell; Paul G. Flikkema; Alan Gelfand; Xuanlong Nguyen; Eric Ward; Jun Yang

    2011-01-01

    Recent developments suggest that predictive modeling could begin to play a larger role not only for data analysis, but also for data collection. We address the example of efficient wireless sensor networks, where inferential ecosystem models can be used to weigh the value of an observation against the cost of data collection. Transmission costs make observations ...

  11. Validation of a multi-objective, predictive urban traffic model

    NARCIS (Netherlands)

    Wilmink, I.R.; Haak, P. van den; Woldeab, Z.; Vreeswijk, J.

    2013-01-01

    This paper describes the results of the verification and validation of the ecoStrategic Model, which was developed, implemented and tested in the eCoMove project. The model uses real-time and historical traffic information to determine the current, predicted and desired state of traffic in a

  12. Predicting the ungauged basin : Model validation and realism assessment

    NARCIS (Netherlands)

    Van Emmerik, T.H.M.; Mulder, G.; Eilander, D.; Piet, M.; Savenije, H.H.G.

    2015-01-01

    The hydrological decade on Predictions in Ungauged Basins (PUB) led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Due to a limited amount of published studies on genuinely ungauged basins, model validation and realism assessment of

  13. Modelling and prediction of non-stationary optical turbulence behaviour

    NARCIS (Netherlands)

    Doelman, N.J.; Osborn, J.

    2016-01-01

    There is a strong need to model the temporal fluctuations in turbulence parameters, for instance for scheduling, simulation and prediction purposes. This paper aims at modelling the dynamic behaviour of the turbulence coherence length r0, utilising measurement data from the Stereo-SCIDAR instrument

  14. A Mathematical Model for the Prediction of Injectivity Decline | Odeh ...

    African Journals Online (AJOL)

    Injectivity impairment due to invasion of solid suspensions has been studied by several investigators and some modelling approaches have also been reported. Worthy of note is the development of analytical models for internal and external filtration coupled with transition time concept for predicting the overall decline in ...

  15. Mathematical Model for Prediction of Flexural Strength of Mound ...

    African Journals Online (AJOL)

    The mound soil-cement blend proportions were mathematically optimized using Scheffe's approach, and the optimization model was developed. A computer program predicting the mix proportions for the model was written. The optimal proportions from the program were used to prepare beam samples measuring 150mm x 150mm ...

  16. Predictive Model Equations for Palm Kernel (Elaeis guneensis J ...

    African Journals Online (AJOL)

    Estimated errors of ±0.18 and ±0.2 are envisaged when applying the models for predicting palm kernel and sesame oil colours, respectively. Keywords: Palm kernel, Sesame, Oil Colour, Process Parameters, Model. Journal of Applied Science, Engineering and Technology Vol. 6 (1) 2006 pp. 34-38 ...

  17. Predicting the ungauged basin: model validation and realism assessment

    NARCIS (Netherlands)

    van Emmerik, Tim; Mulder, Gert; Eilander, Dirk; Piet, Marijn; Savenije, Hubert

    2015-01-01

    The hydrological decade on Predictions in Ungauged Basins (PUB) led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Due to a limited amount of published studies on genuinely ungauged basins, model validation and realism assessment of

  18. A theoretical model for predicting neutron fluxes for cyclic Neutron ...

    African Journals Online (AJOL)

    A theoretical model has been developed for prediction of thermal neutron fluxes required for cyclic irradiations of a sample to obtain the same activity previously used for the detection of any radionuclide of interest. The model is suitable for radiotracer production or for long-lived neutron activation products where the ...

  19. Model Predictive Control for Offset-Free Reference Tracking

    Czech Academy of Sciences Publication Activity Database

    Belda, Květoslav

    2016-01-01

    Vol. 5, No. 1 (2016), pp. 8-13, ISSN 1805-3386. Institutional support: RVO:67985556. Keywords: offset-free reference tracking * predictive control * ARX model * state-space model * multi-input multi-output system * robotic system * mechatronic system. Subject RIV: BC - Control Systems Theory. http://library.utia.cas.cz/separaty/2016/AS/belda-0458355.pdf

  20. Modeling the Temporal Nature of Human Behavior for Demographics Prediction

    DEFF Research Database (Denmark)

    Felbo, Bjarke; Sundsøy, Pål; Pentland, Alex

    2017-01-01

    Mobile phone metadata is increasingly used for humanitarian purposes in developing countries as traditional data is scarce. Basic demographic information is however often absent from mobile phone datasets, limiting the operational impact of the datasets. For these reasons, there has been a growing interest in predicting demographic information from mobile phone metadata. Previous work focused on creating increasingly advanced features to be modeled with standard machine learning algorithms. We here instead model the raw mobile phone metadata directly using deep learning, exploiting the temporal [...] on both age and gender prediction using only the temporal modality in mobile metadata. We finally validate our method on low activity users and evaluate the modeling assumptions.

  1. Numerical Modelling and Prediction of Erosion Induced by Hydrodynamic Cavitation

    Science.gov (United States)

    Peters, A.; Lantermann, U.; el Moctar, O.

    2015-12-01

    The present work aims to predict cavitation erosion using a numerical flow solver together with a newly developed erosion model. The erosion model is based on the hypothesis that collapses of single cavitation bubbles near solid boundaries form high velocity microjets, which cause sonic impacts with high pressure amplitudes damaging the surface. The erosion model uses information from a numerical Euler-Euler flow simulation to predict erosion sensitive areas and assess the erosion aggressiveness of the flow. The obtained numerical results were compared to experimental results from tests of an axisymmetric nozzle.

  2. Robust Output Model Predictive Control of an Unstable Rijke Tube

    Directory of Open Access Journals (Sweden)

    Fabian Jarmolowitz

    2012-01-01

    Full Text Available This work investigates the active control of an unstable Rijke tube using robust output model predictive control (RMPC. As internal model a polytopic linear system with constraints is assumed to account for uncertainties. For guaranteed stability, a linear state feedback controller is designed using linear matrix inequalities and used within a feedback formulation of the model predictive controller. For state estimation a robust gain-scheduled observer is developed. It is shown that the proposed RMPC ensures robust stability under constraints over the considered operating range.

  3. Bayesian Genomic Prediction with Genotype × Environment Interaction Kernel Models

    Science.gov (United States)

    Cuevas, Jaime; Crossa, José; Montesinos-López, Osval A.; Burgueño, Juan; Pérez-Rodríguez, Paulino; de los Campos, Gustavo

    2016-01-01

    The phenomenon of genotype × environment (G × E) interaction in plant breeding decreases selection accuracy, thereby negatively affecting genetic gains. Several genomic prediction models incorporating G × E have been recently developed and used in genomic selection of plant breeding programs. Genomic prediction models for assessing multi-environment G × E interaction are extensions of a single-environment model, and have advantages and limitations. In this study, we propose two multi-environment Bayesian genomic models: the first model considers genetic effects (u) that can be assessed by the Kronecker product of variance–covariance matrices of genetic correlations between environments and genomic kernels through markers under two linear kernel methods, linear (genomic best linear unbiased predictors, GBLUP) and Gaussian (Gaussian kernel, GK). The other model has the same genetic component as the first model (u) plus an extra component, f, that captures random effects between environments that were not captured by the random effects u. We used five CIMMYT data sets (one maize and four wheat) that were previously used in different studies. Results show that models with G × E always have superior prediction ability to single-environment models, and the higher prediction ability of multi-environment models with u and f over the multi-environment model with only u occurred 85% of the time with GBLUP and 45% of the time with GK across the five data sets. The latter result indicated that including the random effect f is still beneficial for increasing prediction ability after adjusting by the random effect u. PMID:27372244
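
    A minimal sketch of the two genomic kernels mentioned above (linear GBLUP and Gaussian) and of the Kronecker construction of the multi-environment genetic covariance for u; the genotypes, the environment correlation matrix, and the bandwidth choice are all hypothetical, and the full Bayesian estimation is not shown.

```python
import numpy as np

rng = np.random.default_rng(1)
n_lines, n_markers, n_envs = 20, 100, 3

# Centered marker matrix (hypothetical genotypes coded 0/1/2)
Z = rng.integers(0, 3, size=(n_lines, n_markers)).astype(float)
Z -= Z.mean(axis=0)

# Linear (GBLUP) kernel and Gaussian kernel between lines
K_gblup = Z @ Z.T / n_markers
d2 = np.sum((Z[:, None, :] - Z[None, :, :]) ** 2, axis=-1)
K_gauss = np.exp(-d2 / np.median(d2[d2 > 0]))

# Genetic covariance across environments (illustrative correlation matrix)
E = np.array([[1.0, 0.6, 0.4],
              [0.6, 1.0, 0.5],
              [0.4, 0.5, 1.0]])

# Covariance of the stacked genetic effects u (environment-within-line ordering)
cov_u = np.kron(E, K_gblup)
print(cov_u.shape)   # (n_envs * n_lines, n_envs * n_lines) -> (60, 60)
```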

  4. Bayesian Genomic Prediction with Genotype × Environment Interaction Kernel Models

    Directory of Open Access Journals (Sweden)

    Jaime Cuevas

    2017-01-01

    Full Text Available The phenomenon of genotype × environment (G × E) interaction in plant breeding decreases selection accuracy, thereby negatively affecting genetic gains. Several genomic prediction models incorporating G × E have been recently developed and used in genomic selection of plant breeding programs. Genomic prediction models for assessing multi-environment G × E interaction are extensions of a single-environment model, and have advantages and limitations. In this study, we propose two multi-environment Bayesian genomic models: the first model considers genetic effects (u) that can be assessed by the Kronecker product of variance–covariance matrices of genetic correlations between environments and genomic kernels through markers under two linear kernel methods, linear (genomic best linear unbiased predictors, GBLUP) and Gaussian (Gaussian kernel, GK). The other model has the same genetic component as the first model (u) plus an extra component, f, that captures random effects between environments that were not captured by the random effects u. We used five CIMMYT data sets (one maize and four wheat) that were previously used in different studies. Results show that models with G × E always have superior prediction ability to single-environment models, and the higher prediction ability of multi-environment models with u and f over the multi-environment model with only u occurred 85% of the time with GBLUP and 45% of the time with GK across the five data sets. The latter result indicated that including the random effect f is still beneficial for increasing prediction ability after adjusting by the random effect u.

  5. Bayesian Genomic Prediction with Genotype × Environment Interaction Kernel Models.

    Science.gov (United States)

    Cuevas, Jaime; Crossa, José; Montesinos-López, Osval A; Burgueño, Juan; Pérez-Rodríguez, Paulino; de Los Campos, Gustavo

    2017-01-05

    The phenomenon of genotype × environment (G × E) interaction in plant breeding decreases selection accuracy, thereby negatively affecting genetic gains. Several genomic prediction models incorporating G × E have been recently developed and used in genomic selection of plant breeding programs. Genomic prediction models for assessing multi-environment G × E interaction are extensions of a single-environment model, and have advantages and limitations. In this study, we propose two multi-environment Bayesian genomic models: the first model considers genetic effects (u) that can be assessed by the Kronecker product of variance-covariance matrices of genetic correlations between environments and genomic kernels through markers under two linear kernel methods, linear (genomic best linear unbiased predictors, GBLUP) and Gaussian (Gaussian kernel, GK). The other model has the same genetic component as the first model (u) plus an extra component, f, that captures random effects between environments that were not captured by the random effects u. We used five CIMMYT data sets (one maize and four wheat) that were previously used in different studies. Results show that models with G × E always have superior prediction ability to single-environment models, and the higher prediction ability of multi-environment models with u and f over the multi-environment model with only u occurred 85% of the time with GBLUP and 45% of the time with GK across the five data sets. The latter result indicated that including the random effect f is still beneficial for increasing prediction ability after adjusting by the random effect u. Copyright © 2017 Cuevas et al.

  6. Prediction of resource volumes at untested locations using simple local prediction models

    Science.gov (United States)

    Attanasi, E.D.; Coburn, T.C.; Freeman, P.A.

    2006-01-01

    This paper shows how local spatial nonparametric prediction models can be applied to estimate volumes of recoverable gas resources at individual undrilled sites, at multiple sites on a regional scale, and to compute confidence bounds for regional volumes based on the distribution of those estimates. An approach that combines cross-validation, the jackknife, and bootstrap procedures is used to accomplish this task. Simulation experiments show that cross-validation can be applied beneficially to select an appropriate prediction model. The cross-validation procedure worked well for a wide range of different states of nature and levels of information. Jackknife procedures are used to compute individual prediction estimation errors at undrilled locations. The jackknife replicates also are used with a bootstrap resampling procedure to compute confidence bounds for the total volume. The method was applied to data (partitioned into a training set and target set) from the Devonian Antrim Shale continuous-type gas play in the Michigan Basin in Otsego County, Michigan. The analysis showed that the model estimate of total recoverable volumes at prediction sites is within 4 percent of the total observed volume. The model predictions also provide frequency distributions of the cell volumes at the production unit scale. Such distributions are the basis for subsequent economic analyses. © Springer Science+Business Media, LLC 2007.
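
    A minimal sketch of the jackknife-plus-bootstrap idea described above, with a simple nearest-neighbour predictor standing in for the local nonparametric model; the data, the predictor, and the resampling sizes are illustrative rather than the authors' procedure.

```python
import numpy as np

rng = np.random.default_rng(2)

def knn_predict(x_train, y_train, x_new, k=5):
    """Local nonparametric prediction: mean response of the k nearest training sites."""
    idx = np.argsort(np.abs(x_train - x_new))[:k]
    return y_train[idx].mean()

# Hypothetical training data: 1-D site coordinate -> recoverable volume at drilled sites
x = rng.uniform(0, 100, 80)
y = 50 + 0.5 * x + rng.normal(0, 5, 80)
targets = rng.uniform(0, 100, 30)                 # undrilled prediction sites

# Jackknife: drop one training site at a time and re-predict every target site
jack = np.array([[knn_predict(np.delete(x, i), np.delete(y, i), t) for t in targets]
                 for i in range(len(x))])
site_se = jack.std(axis=0, ddof=0) * np.sqrt(len(x) - 1)   # jackknife SE per site

# Bootstrap the jackknife replicates of the regional total to get confidence bounds
totals = jack.sum(axis=1)
boot = rng.choice(totals, size=(2000, totals.size), replace=True).mean(axis=1)
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"median per-site jackknife SE = {np.median(site_se):.2f}")
print(f"estimated regional total = {totals.mean():.0f}, 95% bounds [{lo:.0f}, {hi:.0f}]")
```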

  7. Three-model ensemble wind prediction in southern Italy

    Science.gov (United States)

    Torcasio, Rosa Claudia; Federico, Stefano; Calidonna, Claudia Roberta; Avolio, Elenio; Drofa, Oxana; Landi, Tony Christian; Malguzzi, Piero; Buzzi, Andrea; Bonasoni, Paolo

    2016-03-01

    Quality of wind prediction is of great importance since a good wind forecast allows the prediction of available wind power, improving the penetration of renewable energies into the energy market. Here, a 1-year (1 December 2012 to 30 November 2013) three-model ensemble (TME) experiment for wind prediction is considered. The models employed, run operationally at National Research Council - Institute of Atmospheric Sciences and Climate (CNR-ISAC), are RAMS (Regional Atmospheric Modelling System), BOLAM (BOlogna Limited Area Model), and MOLOCH (MOdello LOCale in H coordinates). The area considered for the study is southern Italy and the measurements used for the forecast verification are those of the GTS (Global Telecommunication System). Comparison with observations is made every 3 h up to 48 h of forecast lead time. Results show that the three-model ensemble outperforms the forecast of each individual model. The RMSE improvement compared to the best model is between 22 and 30 %, depending on the season. It is also shown that the three-model ensemble outperforms the IFS (Integrated Forecasting System) of the ECMWF (European Centre for Medium-Range Weather Forecast) for the surface wind forecasts. Notably, the three-model ensemble forecast performs better than each unbiased model, showing the added value of the ensemble technique. Finally, the sensitivity of the three-model ensemble RMSE to the length of the training period is analysed.
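
    A minimal sketch of a bias-corrected multi-model mean and its RMSE, the basic mechanism behind the three-model ensemble; the synthetic "observations", model biases, and training window are hypothetical and are not meant to reproduce the reported scores.

```python
import numpy as np

def rmse(pred, obs):
    return np.sqrt(np.mean((pred - obs) ** 2))

rng = np.random.default_rng(3)
obs = 5 + 2 * np.sin(np.linspace(0, 20, 400))          # hypothetical 10 m wind speed (m/s)

# Three hypothetical model forecasts with different biases and error levels
models = {
    "RAMS":   obs + 0.8 + rng.normal(0, 1.2, obs.size),
    "BOLAM":  obs - 0.5 + rng.normal(0, 1.0, obs.size),
    "MOLOCH": obs + 0.3 + rng.normal(0, 1.4, obs.size),
}

train = slice(0, 200)                                   # training period for bias removal
test = slice(200, None)

unbiased = {name: f - (f[train] - obs[train]).mean() for name, f in models.items()}
ensemble = np.mean(list(unbiased.values()), axis=0)     # simple three-model mean

for name, f in unbiased.items():
    print(f"{name:7s} RMSE = {rmse(f[test], obs[test]):.2f}")
print(f"TME     RMSE = {rmse(ensemble[test], obs[test]):.2f}")
```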

  8. A novel Bayesian hierarchical model for road safety hotspot prediction.

    Science.gov (United States)

    Fawcett, Lee; Thorpe, Neil; Matthews, Joseph; Kremer, Karsten

    2017-02-01

    In this paper, we propose a Bayesian hierarchical model for predicting accident counts in future years at sites within a pool of potential road safety hotspots. The aim is to inform road safety practitioners of the location of likely future hotspots to enable a proactive, rather than reactive, approach to road safety scheme implementation. A feature of our model is the ability to rank sites according to their potential to exceed, in some future time period, a threshold accident count which may be used as a criterion for scheme implementation. Our model specification enables the classical empirical Bayes formulation - commonly used in before-and-after studies, wherein accident counts from a single before period are used to estimate counterfactual counts in the after period - to be extended to incorporate counts from multiple time periods. This allows site-specific variations in historical accident counts (e.g. locally-observed trends) to offset estimates of safety generated by a global accident prediction model (APM), which itself is used to help account for the effects of global trend and regression-to-mean (RTM). The Bayesian posterior predictive distribution is exploited to formulate predictions and to properly quantify our uncertainty in these predictions. The main contributions of our model include (i) the ability to allow accident counts from multiple time-points to inform predictions, with counts in more recent years lending more weight to predictions than counts from time-points further in the past; (ii) where appropriate, the ability to offset global estimates of trend by variations in accident counts observed locally, at a site-specific level; and (iii) the ability to account for unknown/unobserved site-specific factors which may affect accident counts. We illustrate our model with an application to accident counts at 734 potential hotspots in the German city of Halle; we also propose some simple diagnostics to validate the predictive capability of our
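
    A minimal sketch of the gamma-Poisson empirical Bayes step that the hierarchical model builds on: an APM-based prior is combined with several years of observed counts, and the negative binomial posterior predictive gives the probability of exceeding a threshold count. All numbers (prior mean, dispersion, counts, threshold) are hypothetical.

```python
import numpy as np
from scipy import stats

# Site-level inputs (hypothetical): APM-predicted annual mean and a gamma dispersion
apm_mean, phi = 4.0, 2.5          # prior mean accidents/year; gamma shape parameter
counts = np.array([6, 7, 5])      # observed counts in three past years

# Gamma(phi, phi/apm_mean) prior + Poisson likelihood -> Gamma posterior
post_shape = phi + counts.sum()
post_rate = phi / apm_mean + len(counts)

# Posterior predictive for next year's count is negative binomial
threshold = 8                     # scheme-implementation criterion
p_exceed = 1 - stats.nbinom.cdf(threshold - 1, post_shape, post_rate / (post_rate + 1))
print(f"posterior mean rate = {post_shape / post_rate:.2f}, "
      f"P(next-year count >= {threshold}) = {p_exceed:.3f}")
```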

  9. Stochastic models for predicting pitting corrosion damage of HLRW containers

    International Nuclear Information System (INIS)

    Henshall, G.A.

    1991-10-01

    Stochastic models for predicting aqueous pitting corrosion damage of high-level radioactive-waste containers are described. These models could be used to predict the time required for the first pit to penetrate a container and the increase in the number of breaches at later times, both of which would be useful in the repository system performance analysis. Monte Carlo implementations of the stochastic models are described, and predictions of induction time, survival probability and pit depth distributions are presented. These results suggest that the pit nucleation probability decreases with exposure time and that pit growth may be a stochastic process. The advantages and disadvantages of the stochastic approach, methods for modeling the effects of environment, and plans for future work are discussed
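
    A minimal Monte Carlo sketch in the spirit of the stochastic pitting models described above: pit nucleation with a probability that decays over time and stochastic pit growth, from which an induction time to first wall penetration is sampled. The wall thickness, rates, and distributions are illustrative, not the report's parameters.

```python
import numpy as np

rng = np.random.default_rng(4)

def first_penetration_time(wall_mm=10.0, years=1000, p0=0.05, decay=0.01,
                           growth_mean=0.02, growth_sd=0.05):
    """One Monte Carlo history: return the year the deepest pit penetrates the wall."""
    depths = []                                    # depths of nucleated pits (mm)
    for t in range(1, years + 1):
        if rng.random() < p0 * np.exp(-decay * t): # nucleation probability decays with time
            depths.append(0.0)
        if depths:                                 # non-negative stochastic growth increments
            depths = [d + max(0.0, rng.normal(growth_mean, growth_sd)) for d in depths]
            if max(depths) >= wall_mm:
                return t
    return None                                    # no penetration within the horizon

times = [first_penetration_time() for _ in range(200)]
hits = [t for t in times if t is not None]
print(f"penetration within 1000 y in {len(hits)}/200 runs; "
      f"median induction time = {np.median(hits) if hits else float('nan'):.0f} y")
```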

  10. Verification and improvement of a predictive model for radionuclide migration

    International Nuclear Information System (INIS)

    Miller, C.W.; Benson, L.V.; Carnahan, C.L.

    1982-01-01

    Prediction of the rates of migration of contaminant chemical species in groundwater flowing through toxic waste repositories is essential to the assessment of a repository's capability of meeting standards for release rates. A large number of chemical transport models, of varying degrees of complexity, have been devised for the purpose of providing this predictive capability. In general, the transport of dissolved chemical species through a water-saturated porous medium is influenced by convection, diffusion/dispersion, sorption, formation of complexes in the aqueous phase, and chemical precipitation. The reliability of predictions made with the models which omit certain of these processes is difficult to assess. A numerical model, CHEMTRN, has been developed to determine which chemical processes govern radionuclide migration. CHEMTRN builds on a model called MCCTM developed previously by Lichtner and Benson

  11. Ecosystem differences in the trophic enrichment of 13C in aquatic food webs

    International Nuclear Information System (INIS)

    France, R.L.; Peters, R.H.

    1997-01-01

    Data from 35 published studies were collated to examine patterns in the trophic enrichment of 13C in consumers. Because both δ13C and δ15N vary systematically across ecosystems, it was necessary to standardize for such differences before combining data from numerous sources. Relationships of these measures of ecosystem-standardized δ13C to ecosystem-standardized trophic position (Δδ15N) for freshwater, estuarine, coastal, and open-ocean and for all aquatic ecosystems yielded regression equations of low predictive capability (average of 20% explained variance in δ13C). However, differences were observed in the slopes between δ13C and standardized trophic position when data were examined study-specifically: the average trophic fractionation of 13C was found to increase from +0.2‰ for freshwater, to +0.5‰ for estuarine, to +0.8‰ for coastal, and to +1.1‰ for open-ocean food webs. This ecosystem-specific gradient in 13C enrichment for consumers supports previous findings of a similar continuum existing for zooplankton - particulate organic matter differences in δ13C. Possible mechanisms to explain these ecosystem-specific patterns in 13C enrichment may be related to the relative importance of detritus, heterotrophic respiration, partial reliance on alternative food sources, and lipid influences in the different ecosystems. (author)
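
    A minimal sketch of the study-specific regression described above: ecosystem-standardized δ13C regressed on trophic position (here approximated as δ15N divided by an assumed per-level enrichment of 3.4‰), whose slope is the per-trophic-level 13C enrichment. The data are synthetic and only illustrate the calculation.

```python
import numpy as np

rng = np.random.default_rng(5)

def trophic_fractionation(delta13c, delta15n, tef_n=3.4):
    """Slope of delta13C vs. trophic position; trophic position taken as
    delta15N / tef_n (tef_n = assumed per-level 15N enrichment, per mil)."""
    trophic_pos = delta15n / tef_n
    slope, _ = np.polyfit(trophic_pos, delta13c, 1)
    return slope

# Hypothetical standardized data for two ecosystem types
for name, true_slope in (("freshwater", 0.2), ("open ocean", 1.1)):
    d15n = rng.uniform(0, 12, 40)                        # spans roughly 3-4 trophic levels
    d13c = true_slope * d15n / 3.4 + rng.normal(0, 0.8, 40)
    print(f"{name:11s}: estimated 13C enrichment per trophic level "
          f"= {trophic_fractionation(d13c, d15n):+.2f} per mil")
```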

  12. Numerical weather prediction (NWP) and hybrid ARMA/ANN model to predict global radiation

    International Nuclear Information System (INIS)

    Voyant, Cyril; Muselli, Marc; Paoli, Christophe; Nivet, Marie-Laure

    2012-01-01

    We propose in this paper an original technique to predict global radiation using a hybrid ARMA/ANN model and data issued from a numerical weather prediction model (NWP). We particularly look at the multi-layer perceptron (MLP). After optimizing our architecture with NWP and endogenous data previously made stationary and using an innovative pre-input layer selection method, we combined it to an ARMA model from a rule based on the analysis of hourly data series. This model has been used to forecast the hourly global radiation for five places in Mediterranean area. Our technique outperforms classical models for all the places. The nRMSE for our hybrid model MLP/ARMA is 14.9% compared to 26.2% for the naïve persistence predictor. Note that in the standalone ANN case the nRMSE is 18.4%. Finally, in order to discuss the reliability of the forecaster outputs, a complementary study concerning the confidence interval of each prediction is proposed. -- Highlights: ► Time series forecasting with hybrid method based on the use of ALADIN numerical weather model, ANN and ARMA. ► Innovative pre-input layer selection method. ► Combination of optimized MLP and ARMA model obtained from a rule based on the analysis of hourly data series. ► Stationarity process (method and control) for the global radiation time series.
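
    A minimal sketch of the evaluation metric used above, nRMSE against the naïve persistence predictor, on a synthetic hourly global-radiation series; the simple persistence/clear-sky blend is only a stand-in for the ARMA/ANN hybrid, and none of the numbers correspond to the reported results.

```python
import numpy as np

def nrmse(pred, obs):
    """RMSE normalised by the mean of the observations."""
    return np.sqrt(np.mean((pred - obs) ** 2)) / np.mean(obs)

rng = np.random.default_rng(6)
hours = np.arange(24 * 60)                                   # 60 synthetic days, hourly
clear_sky = np.clip(np.sin((hours % 24 - 6) / 12 * np.pi), 0, None) * 800
obs = clear_sky * rng.uniform(0.6, 1.0, hours.size)          # hypothetical GHI (W/m2)

persistence = np.roll(obs, 24)                               # same hour, previous day
blend = 0.5 * persistence + 0.5 * 0.8 * clear_sky            # stand-in for the hybrid model

day = obs > 0                                                # score daylight hours only
print(f"persistence nRMSE = {nrmse(persistence[day], obs[day]):.1%}")
print(f"blended model nRMSE = {nrmse(blend[day], obs[day]):.1%}")
```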

  13. Key Questions in Building Defect Prediction Models in Practice

    Science.gov (United States)

    Ramler, Rudolf; Wolfmaier, Klaus; Stauder, Erwin; Kossak, Felix; Natschläger, Thomas

    The information about which modules of a future version of a software system are defect-prone is a valuable planning aid for quality managers and testers. Defect prediction promises to indicate these defect-prone modules. However, constructing effective defect prediction models in an industrial setting involves a number of key questions. In this paper we discuss ten key questions identified in context of establishing defect prediction in a large software development project. Seven consecutive versions of the software system have been used to construct and validate defect prediction models for system test planning. Furthermore, the paper presents initial empirical results from the studied project and, by this means, contributes answers to the identified questions.

  14. Hierarchical Neural Regression Models for Customer Churn Prediction

    Directory of Open Access Journals (Sweden)

    Golshan Mohammadi

    2013-01-01

    Full Text Available As customers are the main assets of each industry, customer churn prediction is becoming a major task for companies to remain in competition with competitors. In the literature, the better applicability and efficiency of hierarchical data mining techniques has been reported. This paper considers three hierarchical models by combining four different data mining techniques for churn prediction, which are backpropagation artificial neural networks (ANN), self-organizing maps (SOM), alpha-cut fuzzy c-means (α-FCM), and the Cox proportional hazards regression model. The hierarchical models are ANN + ANN + Cox, SOM + ANN + Cox, and α-FCM + ANN + Cox. In particular, the first component of the models aims to cluster data into churner and nonchurner groups and also filter out unrepresentative data or outliers. Then, the clustered data as the outputs are used to assign customers to churner and nonchurner groups by the second technique. Finally, the correctly classified data are used to create the Cox proportional hazards model. To evaluate the performance of the hierarchical models, an Iranian mobile dataset is considered. The experimental results show that the hierarchical models outperform the single Cox regression baseline model in terms of prediction accuracy, Types I and II errors, RMSE, and MAD metrics. In addition, the α-FCM + ANN + Cox model significantly performs better than the two other hierarchical models.

  15. Discrete fracture modelling for the Stripa tracer validation experiment predictions

    International Nuclear Information System (INIS)

    Dershowitz, W.; Wallmann, P.

    1992-02-01

    Groundwater flow and transport through three-dimensional networks of discrete fractures was modeled to predict the recovery of tracer from tracer injection experiments conducted during phase 3 of the Stripa site characterization and validation project. Predictions were made on the basis of an updated version of the site scale discrete fracture conceptual model used for flow predictions and preliminary transport modelling. In this model, individual fractures were treated as stochastic features described by probability distributions of geometric and hydrologic properties. Fractures were divided into three populations: fractures in fracture zones near the drift, non-fracture zone fractures within 31 m of the drift, and fractures in fracture zones over 31 meters from the drift axis. Fractures outside fracture zones are not modelled beyond 31 meters from the drift axis. Transport predictions were produced using the FracMan discrete fracture modelling package for each of five tracer experiments. Output was produced in the seven formats specified by the Stripa task force on fracture flow modelling. (au)

  16. Enhancing pavement performance prediction models for the Illinois Tollway System

    Directory of Open Access Journals (Sweden)

    Laxmikanth Premkumar

    2016-01-01

    Full Text Available Accurate pavement performance prediction plays an important role in prioritizing future maintenance and rehabilitation needs, and in predicting future pavement condition in a pavement management system. The Illinois State Toll Highway Authority (Tollway), with over 2000 lane miles of pavement, utilizes the condition rating survey (CRS) methodology to rate pavement performance. Pavement performance models developed in the past for the Illinois Department of Transportation (IDOT) are used by the Tollway to predict the future condition of its network. The model projects future CRS ratings based on pavement type, thickness, traffic, pavement age and current CRS rating. However, with time and the inclusion of newer pavement types there was a need to calibrate the existing pavement performance models, as well as develop models for newer pavement types. This study presents the results of calibrating the existing models, and developing new models, for the various pavement types in the Illinois Tollway network. The predicted future condition of the pavements is used in estimating their remaining service life to failure, which is of immediate use in recommending future maintenance and rehabilitation requirements for the network. Keywords: Pavement performance models, Remaining life, Pavement management

  17. Error analysis in predictive modelling demonstrated on mould data.

    Science.gov (United States)

    Baranyi, József; Csernus, Olívia; Beczner, Judit

    2014-01-17

    The purpose of this paper was to develop a predictive model for the effect of temperature and water activity on the growth rate of Aspergillus niger and to determine the sources of the error when the model is used for prediction. Parallel mould growth curves, derived from the same spore batch, were generated and fitted to determine their growth rate. The variances of replicate ln(growth-rate) estimates were used to quantify the experimental variability, inherent to the method of determining the growth rate. The environmental variability was quantified by the variance of the respective means of replicates. The idea is analogous to the "within group" and "between groups" variability concepts of ANOVA procedures. A (secondary) model, with temperature and water activity as explanatory variables, was fitted to the natural logarithm of the growth rates determined by the primary model. The model error and the experimental and environmental errors were ranked according to their contribution to the total error of prediction. Our method can readily be applied to analysing the error structure of predictive models of bacterial growth, too. © 2013.
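
    A minimal sketch of the within/between variance decomposition described above: replicate ln(growth-rate) values at each environmental condition give the experimental variance, while the spread of the condition means gives the environmental variance. The numbers are hypothetical.

```python
import numpy as np

# Hypothetical ln(growth-rate) replicates, one row per (temperature, a_w) condition
replicates = np.array([
    [-2.10, -2.05, -2.18],
    [-1.60, -1.72, -1.65],
    [-1.15, -1.08, -1.20],
    [-0.70, -0.81, -0.75],
])

# Experimental (within-condition) variability: variance of replicates around their mean
var_experimental = replicates.var(axis=1, ddof=1).mean()

# Environmental (between-condition) variability: variance of the condition means
var_environmental = replicates.mean(axis=1).var(ddof=1)

print(f"experimental variance  = {var_experimental:.4f}")
print(f"environmental variance = {var_environmental:.4f}")
```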

  18. Predicting nucleic acid binding interfaces from structural models of proteins.

    Science.gov (United States)

    Dror, Iris; Shazman, Shula; Mukherjee, Srayanta; Zhang, Yang; Glaser, Fabian; Mandel-Gutfreund, Yael

    2012-02-01

    The function of DNA- and RNA-binding proteins can be inferred from the characterization and accurate prediction of their binding interfaces. However, the main pitfall of various structure-based methods for predicting nucleic acid binding function is that they are all limited to a relatively small number of proteins for which high-resolution three-dimensional structures are available. In this study, we developed a pipeline for extracting functional electrostatic patches from surfaces of protein structural models, obtained using the I-TASSER protein structure predictor. The largest positive patches are extracted from the protein surface using the patchfinder algorithm. We show that functional electrostatic patches extracted from an ensemble of structural models highly overlap the patches extracted from high-resolution structures. Furthermore, by testing our pipeline on a set of 55 known nucleic acid binding proteins for which I-TASSER produces high-quality models, we show that the method accurately identifies the nucleic acids binding interface on structural models of proteins. Employing a combined patch approach we show that patches extracted from an ensemble of models better predicts the real nucleic acid binding interfaces compared with patches extracted from independent models. Overall, these results suggest that combining information from a collection of low-resolution structural models could be a valuable approach for functional annotation. We suggest that our method will be further applicable for predicting other functional surfaces of proteins with unknown structure. Copyright © 2011 Wiley Periodicals, Inc.

  19. Robust Model Predictive Control of a Wind Turbine

    DEFF Research Database (Denmark)

    Mirzaei, Mahmood; Poulsen, Niels Kjølstad; Niemann, Hans Henrik

    2012-01-01

    In this work the problem of robust model predictive control (robust MPC) of a wind turbine in the full load region is considered. A minimax robust MPC approach is used to tackle the problem. Nonlinear dynamics of the wind turbine are derived by combining blade element momentum (BEM) theory and first principle modeling of the turbine flexible structure. Thereafter the nonlinear model is linearized using Taylor series expansion around system operating points. Operating points are determined by effective wind speed and an extended Kalman filter (EKF) is employed to estimate this. In addition [...] of the uncertain system is employed and a norm-bounded uncertainty model is used to formulate a minimax model predictive control. The resulting optimization problem is simplified by semidefinite relaxation and the controller obtained is applied on a full complexity, high fidelity wind turbine model. Finally...

  20. Enhancing pavement performance prediction models for the Illinois Tollway System

    OpenAIRE

    Laxmikanth Premkumar; William R. Vavrik

    2016-01-01

    Accurate pavement performance prediction represents an important role in prioritizing future maintenance and rehabilitation needs, and predicting future pavement condition in a pavement management system. The Illinois State Toll Highway Authority (Tollway) with over 2000 lane miles of pavement utilizes the condition rating survey (CRS) methodology to rate pavement performance. Pavement performance models developed in the past for the Illinois Department of Transportation (IDOT) are used by th...