WorldWideScience

Sample records for models assessing nutrient-limited

  1. Emergence of nutrient limitation in tropical dry forests: hypotheses from simulation models

    Science.gov (United States)

    Medvigy, D.; Waring, B. G.; Xu, X.; Trierweiler, A.; Werden, L. K.; Wang, G.; Zhu, Q.; Powers, J. S.

    2017-12-01

    It is unclear to what extent tropical dry forest productivity may be limited by nutrients. Direct assessment of nutrient limitation through fertilization experiments has been rare, and paradigms pertaining to other ecosystems may not extend to tropical dry forests. For example, because dry tropical forests have a lower water supply than moist tropical forests, dry forests can have lower decomposition rates, higher soil carbon and nitrogen concentrations, and a more open nitrogen cycle than moist forests. We used a mechanistic, numerical model to generate hypotheses about nutrient limitation in tropical dry forests. The model dynamically couples ED2 (vegetation dynamics), MEND (biogeochemistry), and N-COM (plant-microbe competition for nutrients). Here, the MEND component of the model has been extended to include nitrogen (N) and phosphorus (P) cycles. We focus on simulation of sixteen 25 m x 25 m plots in Costa Rica where a fertilization experiment has been underway since 2015. Baseline simulations are characterized by both nitrogen and phosphorus limitation of vegetation. Fertilization with N and P increased vegetation biomass, with N fertilization having a somewhat stronger effect. Nutrient limitation was also sensitive to climate and was more pronounced during drought periods. Overflow respiration was identified as a key process that mitigated nutrient limitation. These results suggest that, despite often having richer soils than tropical moist forests, tropical dry forests can also become nutrient-limited. If the climate becomes drier in the next century, as is expected for Central America, drier soils may decrease microbial activity and exacerbate nutrient limitation. The importance of overflow respiration underscores the need for appropriate treatment of microbial dynamics in ecosystem models. Ongoing and new nutrient fertilization experiments will present opportunities for testing whether, and how, nutrient limitation may indeed be emerging in tropical dry forests.
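
    As a rough illustration of the two ideas this record combines, the sketch below applies Liebig's law of the minimum to a carbon supply and routes carbon that cannot be used for growth into an overflow-respiration term. The function, the assumed N:C and P:C ratios, and the example numbers are invented for demonstration; they are not the ED2/MEND/N-COM formulation.

```python
# Illustrative sketch only, not the ED2/MEND/N-COM code. It shows how a
# Liebig-style co-limitation term and an "overflow respiration" flux can be
# written down; all parameter values are invented for demonstration.

def npp_with_nutrient_limitation(gpp, n_supply, p_supply,
                                 n_per_c=0.02, p_per_c=0.001):
    """Return (npp, overflow_respiration) given a carbon supply (gpp) and the
    N and P available for building new tissue.

    n_per_c, p_per_c: assumed N:C and P:C ratios of new biomass (g g-1).
    """
    c_allowed_by_n = n_supply / n_per_c      # carbon growth allowed by N alone
    c_allowed_by_p = p_supply / p_per_c      # carbon growth allowed by P alone

    # Liebig's law of the minimum: growth is set by the most limiting resource
    npp = min(gpp, c_allowed_by_n, c_allowed_by_p)

    # Carbon fixed but not usable for growth is respired ("overflow")
    overflow_respiration = gpp - npp
    return npp, overflow_respiration


if __name__ == "__main__":
    # Hypothetical numbers: ample C supply, scarce P
    npp, overflow = npp_with_nutrient_limitation(gpp=10.0, n_supply=0.15, p_supply=0.005)
    print(f"NPP = {npp:.2f}, overflow respiration = {overflow:.2f}")
```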

  2. Assessment of Nutrient Limitation in Flood plain Forests with Two Different Techniques

    International Nuclear Information System (INIS)

    Neatrour, M.A.; Jones, R.H.; Golladay, S.W.

    2008-01-01

    We assessed nitrogen and phosphorus limitation in a flood plain forest in southern Georgia, USA, using two common methods: nitrogen to phosphorus (N:P) ratios in litterfall and fertilized ingrowth cores. We measured nitrogen (N) and phosphorus (P) concentrations in litterfall to determine N:P mass ratios. We also installed ingrowth cores within each site containing native soil amended with nitrogen (N), phosphorus (P), or nitrogen and phosphorus (N + P) fertilizers or without added fertilizer (C). Litter N:P ratios ranged from 16 to 22, suggesting P limitation. However, fertilized ingrowth cores indicated N limitation because fine-root length density was greater in cores fertilized with N or N + P than in those fertilized with P or without added fertilizer. We feel that these two methods of assessing nutrient limitation should be corroborated with fertilization trials prior to use on a wider basis.
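
    The litter N:P diagnostic can be illustrated with a small calculation; the cutoff values below (roughly 14 for N limitation and 16 for P limitation) are commonly cited for vegetation but are an assumption here, not thresholds taken from this study.

```python
# Illustrative classification of litter N:P mass ratios against commonly cited
# thresholds (roughly N:P < 14 suggesting N limitation and N:P > 16 suggesting
# P limitation); the exact cutoffs vary by ecosystem and are an assumption here.

def classify_np_ratio(n_conc, p_conc, n_cutoff=14.0, p_cutoff=16.0):
    """n_conc, p_conc: nutrient concentrations in the same units (e.g. mg g-1)."""
    ratio = n_conc / p_conc
    if ratio < n_cutoff:
        status = "N limitation suggested"
    elif ratio > p_cutoff:
        status = "P limitation suggested"
    else:
        status = "ambiguous / possible co-limitation"
    return ratio, status

# Example in the range reported for the floodplain litterfall (N:P of 16-22)
print(classify_np_ratio(n_conc=9.0, p_conc=0.45))  # ratio 20.0 -> P limitation suggested
```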

  3. Monitoring of nutrient limitation in growing E. coli: a mathematical model of a ppGpp-based biosensor.

    Science.gov (United States)

    Pokhilko, Alexandra

    2017-11-21

    E. coli can be used as bacterial cell factories for production of biofuels and other useful compounds. The efficient production of the desired products requires careful monitoring of growth conditions and the optimization of metabolic fluxes. To avoid nutrient depletion and maximize product yields we suggest using a natural mechanism for sensing nutrient limitation, related to biosynthesis of an intracellular messenger - guanosine tetraphosphate (ppGpp). We propose a design for a biosensor, which monitors changes in the intracellular concentration of ppGpp by coupling it to a fluorescent output. We used mathematical modelling to analyse the intracellular dynamics of ppGpp, its fluorescent reporter, and cell growth in normal and fatty acid-producing E. coli lines. The model integrates existing mechanisms of ppGpp regulation and predicts the biosensor response to changes in nutrient state. In particular, the model predicts that excessive stimulation of fatty acid production depletes fatty acid intermediates, downregulates growth and increases the levels of ppGpp-related fluorescence. Our analysis demonstrates that the ppGpp sensor can be used for early detection of nutrient limitation during cell growth and for testing productivity of engineered lines.
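
    A toy set of ordinary differential equations conveys the sensing principle: as the nutrient is depleted, ppGpp rises, growth slows, and a ppGpp-driven fluorescent reporter accumulates. The equations and parameter values below are invented for illustration and are far simpler than the published model.

```python
# Minimal toy ODE sketch of a ppGpp-based nutrient sensor: when nutrients run
# low, ppGpp rises, growth slows, and a ppGpp-responsive fluorescent reporter
# accumulates. All equations and parameters are invented for illustration.
from scipy.integrate import solve_ivp

def model(t, y, k_syn=2.0, k_deg=1.0, mu_max=1.0, K_s=0.2, k_rep=0.5, Y=0.5):
    S, ppGpp, F, X = y                      # nutrient, ppGpp, fluorescence, biomass
    S = max(S, 0.0)                         # guard against small negative overshoot
    mu = mu_max * S / (K_s + S)             # Monod growth on the nutrient
    dS = -mu * X / Y                        # nutrient consumption
    dppGpp = k_syn * (1 - S / (K_s + S)) - k_deg * ppGpp   # rises as S falls
    dF = k_rep * ppGpp - mu * F             # reporter expression, diluted by growth
    dX = mu * X                             # biomass growth
    return [dS, dppGpp, dF, dX]

sol = solve_ivp(model, (0, 20), [1.0, 0.0, 0.0, 0.01], dense_output=True)
print("final ppGpp:", sol.y[1, -1], "final fluorescence:", sol.y[2, -1])
```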

  4. Unlocking Ft: Modeling thermodynamic controls and isotope fractionation factors in nutrient limited environments

    Science.gov (United States)

    Druhan, J. L.; Giannetta, M.; Sanford, R. A.

    2017-12-01

    , resulting in model behavior which is, in effect, a microbial redox analog to the variable observed fractionation factor resulting from a transition state theory rate law as derived by DePaolo (2011).

  5. Nutrient limitation reduces land carbon uptake in simulations with a model of combined carbon, nitrogen and phosphorus cycling

    Directory of Open Access Journals (Sweden)

    D. S. Goll

    2012-09-01

    Terrestrial carbon (C) cycle models applied for climate projections simulate a strong increase in net primary productivity (NPP) due to elevated atmospheric CO2 concentration during the 21st century. These models usually neglect the limited availability of nitrogen (N) and phosphorus (P), nutrients that commonly limit plant growth and soil carbon turnover. To investigate how the projected C sequestration is altered when stoichiometric constraints on C cycling are considered, we incorporated a P cycle into the land surface model JSBACH (Jena Scheme for Biosphere–Atmosphere Coupling in Hamburg), which already includes representations of coupled C and N cycles.

    The model reveals a distinct geographic pattern of P and N limitation. Under the SRES (Special Report on Emissions Scenarios) A1B scenario, the accumulated land C uptake between 1860 and 2100 is 13% (particularly at high latitudes) and 16% (particularly at low latitudes) lower in simulations with N and P cycling, respectively, than in simulations without nutrient cycles. The combined effect of both nutrients reduces land C uptake by 25% compared to simulations without N or P cycling. The simulated strength of nutrient limitation may be biased by the model's simplicity, but the ranking of limitations is robust against the parameterization and the inflexibility of stoichiometry. After 2100, increased temperature and high CO2 concentration cause a shift from N to P limitation at high latitudes, while nutrient limitation in the tropics declines. The increase in P limitation at high latitudes is induced by a strong increase in NPP and the low P sorption capacity of soils, while a decline in tropical NPP due to high autotrophic respiration rates alleviates N and P limitations. The quantification of P limitation remains challenging. The poorly constrained processes of soil P sorption and biochemical mineralization are identified as the main uncertainties in the strength of P limitation.

  6. Internal cycling, not external loading, decides the nutrient limitation in eutrophic lake: A dynamic model with temporal Bayesian hierarchical inference.

    Science.gov (United States)

    Wu, Zhen; Liu, Yong; Liang, Zhongyao; Wu, Sifeng; Guo, Huaicheng

    2017-06-01

    Lake eutrophication is associated with excessive anthropogenic nutrients (mainly nitrogen (N) and phosphorus (P)) and unobserved internal nutrient cycling. Despite the advances in understanding the role of external loadings, the contribution of internal nutrient cycling is still an open question. A dynamic mass-balance model was developed to simulate and measure the contributions of internal cycling and external loading. It was based on the temporal Bayesian Hierarchical Framework (BHM), within which we explored the seasonal patterns in the dynamics of nutrient cycling processes and the limitation of N and P on phytoplankton growth in hyper-eutrophic Lake Dianchi, China. The dynamic patterns of the five state variables (Chl-a, TP, ammonia, nitrate and organic N) were simulated based on the model. Five parameters (algae growth rate, sediment exchange rate of N and P, nitrification rate and denitrification rate) were estimated based on BHM. The model provided a good fit to observations. Our model results highlighted the role of internal cycling of N and P in Lake Dianchi. The internal cycling processes contributed more than external loading to the N and P changes in the water column. Further insights from the nutrient limitation analysis indicated that the sediment exchange of P determined the P limitation. Allowing for the contribution of denitrification to N removal, N was the more limiting nutrient most of the time; however, P was the more important nutrient for eutrophication management. For Lake Dianchi, recovery would not be possible solely by reducing the external watershed nutrient load; the mechanisms of internal cycling should also be considered as an approach to inhibit the release of nutrients from the sediments and to enhance denitrification. Copyright © 2017 Elsevier Ltd. All rights reserved.
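
    The comparison of internal and external sources can be illustrated with a deliberately simplified phosphorus mass balance; the rate laws and numbers below are placeholders, not the five-state-variable Bayesian hierarchical model described in the record.

```python
# Highly simplified phosphorus mass-balance sketch illustrating how external
# loading and internal (sediment) cycling can be compared as source terms for
# the water column. All numbers and rate laws are invented for demonstration.

def simulate_tp(days=365, tp0=0.15,     # initial water-column TP (mg/L)
                ext_load=0.0004,        # external load (mg/L per day), assumed
                k_release=0.003,        # internal sediment release (mg/L per day), assumed
                k_loss=0.02):           # settling + outflow loss (per day), assumed
    tp = tp0
    ext_total = int_total = 0.0
    for _ in range(days):
        ext_total += ext_load
        int_total += k_release
        tp += ext_load + k_release - k_loss * tp
    return tp, ext_total, int_total

tp, ext, internal = simulate_tp()
print(f"TP after 1 yr: {tp:.3f} mg/L; external input {ext:.3f}, internal input {internal:.3f} mg/L")
```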

  7. Nutrient Limitation in Central Red Sea Mangroves

    KAUST Repository

    Almahasheer, Hanan; Duarte, Carlos M.; Irigoien, Xabier

    2016-01-01

    Red Sea have characteristic heights of ~2 m, suggesting nutrient limitation. We assessed the nutrient status of mangrove stands in the Central Red Sea and conducted a fertilization experiment (N, P and Fe and various combinations thereof) on 4-week

  8. Successional dynamics drive tropical forest nutrient limitation

    Science.gov (United States)

    Chou, C.; Hedin, L. O. O.

    2017-12-01

    It is increasingly recognized that nutrients such as N and P may significantly constrain the land carbon sink. However, we currently lack a complete understanding of these nutrient cycles in forest ecosystems and how to incorporate them into Earth System Models. We have developed a framework of dynamic forest nutrient limitation, focusing on the role of secondary forest succession and canopy gap disturbances as bottlenecks of high plant nutrient demand and limitation. We used succession biomass data to parameterize a simple ecosystem model and examined the dynamics of nutrient limitation throughout tropical secondary forest succession. Due to the patterns of biomass recovery in secondary tropical forests, we found high nutrient demand from rapid biomass accumulation in the earliest years of succession. Depending on previous land use scenarios, soil nutrient availability may also be low in this time period. Coupled together, this is evidence that there may be high biomass nutrient limitation early in succession, which is partially met by abundant symbiotic nitrogen fixation from certain tree species. We predict a switch from nitrogen limitation in early succession to one of three conditions: (i) phosphorus only, (ii) phosphorus plus nitrogen, or (iii) phosphorus, nitrogen, plus light co-limitation. We will discuss the mechanisms that govern the exact trajectory of limitation as forests build biomass. In addition, we used our model to explore scenarios of tropical secondary forest impermanence and the impacts of these dynamics on ecosystem nutrient limitation. We found that secondary forest impermanence exacerbates nutrient limitation and the need for nitrogen fixation early in succession. Together, these results indicate that biomass recovery dynamics early in succession as well as their connection to nutrient demand and limitation are fundamental for understanding and modeling nutrient limitation of the tropical forest carbon sink.

  9. Nutrient Limitation in Central Red Sea Mangroves

    KAUST Repository

    Almahasheer, Hanan

    2016-12-24

    As coastal plants that can survive in salt water, mangroves play an essential role in large marine ecosystems (LMEs). The Red Sea, where the growth of mangroves is stunted, is one of the least studied LMEs in the world. Mangroves along the Central Red Sea have characteristic heights of ~2 m, suggesting nutrient limitation. We assessed the nutrient status of mangrove stands in the Central Red Sea and conducted a fertilization experiment (N, P and Fe and various combinations thereof) on 4-week-old seedlings of Avicennia marina to identify limiting nutrients and stoichiometric effects. We measured height, number of leaves, number of nodes and root development at different time periods as well as the leaf content of C, N, P, Fe, and Chl a in the experimental seedlings. Height, number of nodes and number of leaves differed significantly among treatments. Iron treatment resulted in significantly taller plants compared with other nutrients, demonstrating that iron is the primary limiting nutrient in the tested mangrove population and confirming Liebig's law of the minimum: iron addition alone yielded results comparable to those using complete fertilizer. This result is consistent with the biogenic nature of the sediments in the Red Sea, which are dominated by carbonates, and the lack of riverine sources of iron.

  10. Plant ecosystem responses to rising atmospheric CO2: applying a "two-timing" approach to assess alternative hypotheses for mechanisms of nutrient limitation

    Science.gov (United States)

    Medlyn, B.; Jiang, M.; Zaehle, S.

    2017-12-01

    There is now ample experimental evidence that the response of terrestrial vegetation to rising atmospheric CO2 concentration is modified by soil nutrient availability. How to represent nutrient cycling processes is thus a key consideration for vegetation models. We have previously used model intercomparison to demonstrate that models incorporating different assumptions predict very different responses at Free-Air CO2 Enrichment experiments. Careful examination of model outputs has provided some insight into the reasons for the different model outcomes, but it is difficult to attribute outcomes to specific assumptions. Here we investigate the impact of individual assumptions in a generic plant carbon-nutrient cycling model. The G'DAY (Generic Decomposition And Yield) model is modified to incorporate alternative hypotheses for nutrient cycling. We analyse the impact of these assumptions in the model using a simple analytical approach known as "two-timing". This analysis identifies the quasi-equilibrium behaviour of the model at the time scales of the component pools. The analysis provides a useful mathematical framework for probing model behaviour and identifying the most critical assumptions for experimental study.
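
    The quasi-equilibrium ("two-timing") idea can be sketched symbolically: set the derivative of a fast pool to zero and substitute the result into the slow pool's equation. The pools and rate constants below are generic placeholders, not the actual G'DAY variables.

```python
# Sketch of the "two-timing" / quasi-equilibrium idea with sympy: for a fast
# pool F feeding a slow pool S, set dF/dt ~ 0 on the slow time scale and
# substitute the resulting quasi-equilibrium F* into dS/dt. The pools and rate
# constants here are generic placeholders, not the actual G'DAY variables.
import sympy as sp

F, S, I, k_f, k_s = sp.symbols("F S I k_f k_s", positive=True)

dF = I - k_f * F            # fast pool: input I, first-order turnover k_f
dS = k_f * F - k_s * S      # slow pool fed by the fast pool

F_qe = sp.solve(sp.Eq(dF, 0), F)[0]      # quasi-equilibrium of the fast pool
dS_qe = sp.simplify(dS.subs(F, F_qe))    # slow dynamics on the slow time scale

print("F* =", F_qe)        # I/k_f
print("dS/dt =", dS_qe)    # I - k_s*S
```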

  11. Nutrient limitation of soil microbial activity during the earliest stages of ecosystem development.

    Science.gov (United States)

    Castle, Sarah C; Sullivan, Benjamin W; Knelman, Joseph; Hood, Eran; Nemergut, Diana R; Schmidt, Steven K; Cleveland, Cory C

    2017-11-01

    A dominant paradigm in ecology is that plants are limited by nitrogen (N) during primary succession. Whether generalizable patterns of nutrient limitation are also applicable to metabolically and phylogenetically diverse soil microbial communities, however, is not well understood. We investigated if measures of N and phosphorus (P) pools inform our understanding of the nutrient(s) most limiting to soil microbial community activities during primary succession. We evaluated soil biogeochemical properties and microbial processes using two complementary methodological approaches (a nutrient addition microcosm experiment and extracellular enzyme assays) to assess microbial nutrient limitation across three actively retreating glacial chronosequences. Microbial respiratory responses in the microcosm experiment provided evidence for N, P and N/P co-limitation at Easton Glacier, Washington, USA, Puca Glacier, Peru, and Mendenhall Glacier, Alaska, USA, respectively, and patterns of nutrient limitation generally reflected site-level differences in soil nutrient availability. The activities of three key extracellular enzymes known to vary with soil N and P availability developed in broadly similar ways among sites, increasing with succession and consistently correlating with changes in soil total N pools. Together, our findings demonstrate that during the earliest stages of soil development, microbial nutrient limitation and activity generally reflect soil nutrient supply, a result that is broadly consistent with biogeochemical theory.

  12. Investment in secreted enzymes during nutrient-limited growth is utility dependent.

    Science.gov (United States)

    Cezairliyan, Brent; Ausubel, Frederick M

    2017-09-12

    Pathogenic bacteria secrete toxins and degradative enzymes that facilitate their growth by liberating nutrients from the environment. To understand bacterial growth under nutrient-limited conditions, we studied resource allocation between cellular and secreted components by the pathogenic bacterium Pseudomonas aeruginosa during growth on a protein substrate that requires extracellular digestion by secreted proteases. We identified a quantitative relationship between the rate of increase of cellular biomass under nutrient-limiting growth conditions and the rate of increase in investment in secreted proteases. Production of secreted proteases is stimulated by secreted signals that convey information about the utility of secreted proteins during nutrient-limited growth. Growth modeling using this relationship recapitulated the observed kinetics of bacterial growth on a protein substrate. The proposed regulatory strategy suggests a rationale for quorum-sensing-dependent stimulation of the production of secreted enzymes whereby investment in secreted enzymes occurs in proportion to the utility they confer. Our model provides a framework that can be applied toward understanding bacterial growth in many environments where growth rate is limited by the availability of nutrients.
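
    A toy version of growth on a protein substrate, with a fixed fraction of new biomass invested in secreted protease, is sketched below; the functional forms and parameter values are assumptions for illustration, not the authors' fitted model.

```python
# Toy growth model on a protein substrate requiring extracellular digestion:
# cells allocate a fraction phi of new biomass to secreted protease, which
# liberates amino acids that fuel further growth. Functional forms and numbers
# are illustrative assumptions, not the published fitted model.
from scipy.integrate import solve_ivp

def protease_growth(t, y, phi=0.1, k_cat=5.0, mu_max=0.8, K=0.05, Y=0.5):
    protein, aa, enzyme, X = y
    hydrolysis = k_cat * enzyme * protein / (K + protein)  # protease-driven digestion
    mu = mu_max * aa / (K + aa)                             # growth on liberated amino acids
    dprotein = -hydrolysis
    daa = hydrolysis - mu * X / Y
    denzyme = phi * mu * X                                  # investment in secreted enzyme
    dX = (1 - phi) * mu * X                                 # remainder goes to cellular biomass
    return [dprotein, daa, denzyme, dX]

sol = solve_ivp(protease_growth, (0, 48), [1.0, 0.0, 1e-3, 1e-2])
print("final biomass:", sol.y[3, -1])
```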

  13. Independence of nutrient limitation and carbon dioxide impacts on the Southern Ocean coccolithophore Emiliania huxleyi.

    Science.gov (United States)

    Müller, Marius N; Trull, Thomas W; Hallegraeff, Gustaaf M

    2017-08-01

    Future oceanic conditions induced by anthropogenic greenhouse gas emissions include warming, acidification and reduced nutrient supply due to increased stratification. Some parts of the Southern Ocean are expected to show rapid changes, especially for carbonate mineral saturation. Here we compare the physiological response of the model coccolithophore Emiliania huxleyi (strain EHSO 5.14, originating from 50°S, 149°E) to pH/CO2 gradients (mimicking ocean acidification ranging from 1 to 4 × current pCO2 levels) under nutrient-limited (nitrogen and phosphorus) and -replete conditions. Both nutrient limitations decreased per-cell photosynthesis (particulate organic carbon (POC) production) and calcification (particulate inorganic carbon (PIC) production) rates for all pCO2 levels, with more than 50% reductions under nitrogen limitation. These impacts, however, became indistinguishable from nutrient-replete conditions when normalized to cell volume. Calcification decreased three-fold and linearly with increasing pCO2 under all nutrient conditions, and was accompanied by a smaller ~30% nonlinear reduction in POC production, manifested mainly above 3 × current pCO2. Our results suggest that normalization to cell volume allows the major impacts of nutrient limitation (changed cell sizes and reduced PIC and POC production rates) to be treated independently of the major impacts of increasing pCO2 and, additionally, stresses the importance of including cell volume measurements in the toolbox of standard physiological analysis of coccolithophores in field and laboratory studies.

  14. Long-term seasonal nutrient limiting patterns at Meiliang Bay in a large, shallow and subtropical Lake Taihu, China

    Directory of Open Access Journals (Sweden)

    Rui Ye

    2015-04-01

    Lake Taihu has undergone severe eutrophication in the past three decades, and harmful cyanobacteria blooms occur nearly every year in Meiliang Bay at the north end of the lake. To elucidate the potential relationship between seasonal nutrient limitation and phytoplankton proliferation, a 20-year (1991-2012) time series of nutrient limitation in Meiliang Bay was analyzed for deviations between trophic state index (TSI) parameters. Results showed that patterns of nutrient limitation in Meiliang Bay were distinctly seasonal: phytoplankton growth was generally phosphorus (P) limited in winter and spring, whereas nitrogen (N) limitation mainly occurred in summer and fall. This general pattern, however, shifted to N limitation across all four seasons during the mid-1990s because a rapid increase in industrialization led to a significant rise in the input of N and P from inflowing tributaries. The initial patterns were restored by environmental regulation at the end of the 1990s, including the Zero Actions plan. Using routine monitoring data, a generalised additive model (GAM) with time and the deviation between the trophic state indexes for nitrogen and phosphorus (TSIN-TSIP) as explanatory variables was used to explore which nutrient was responsible for limitation of phytoplankton chlorophyll-a (Chl-a) in different seasons. Surprisingly, the model revealed that a weak N limitation (TSIN-TSIP = -10) corresponded to peak values of Chl-a in the summer-autumn season, probably because the phytoplankton community is co-limited by N and P during this period. The shift of nutrient limitation during winter-spring would partially explain the high values of Chl-a throughout 1996. This study suggests that seasonal patterns of nutrient limitation must be considered to develop effective management measures to control cyanobacterial blooms.
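
    The TSI-deviation diagnostic can be illustrated with commonly cited Carlson-type index formulas (TP and Chl-a in µg/L, TN in mg/L); whether the study used exactly these formulations, and the sample values below, are assumptions.

```python
# Sketch of the TSI-deviation diagnostic, using commonly cited Carlson-type
# index formulas (TP, Chl-a in ug/L; TN in mg/L). Whether the study used
# exactly these formulations is an assumption; negative TSI_N - TSI_P values
# are read as relatively scarcer N.
import math

def tsi_tn(tn_mg_l):   return 54.45 + 14.43 * math.log(tn_mg_l)
def tsi_tp(tp_ug_l):   return  4.15 + 14.42 * math.log(tp_ug_l)
def tsi_chl(chl_ug_l): return 30.6  +  9.81 * math.log(chl_ug_l)

# Hypothetical summer sample from a hyper-eutrophic bay
tn, tp, chl = 2.5, 150.0, 60.0   # mg/L, ug/L, ug/L
dev = tsi_tn(tn) - tsi_tp(tp)
print(f"TSI_N - TSI_P = {dev:.1f}  (negative -> N relatively more limiting)")
```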

  15. Detecting terrestrial nutrient limitation: a global meta-analysis of foliar nutrient concentrations after fertilization

    Directory of Open Access Journals (Sweden)

    Rebecca eOstertag

    2016-03-01

    Examining foliar nutrient concentrations after fertilization provides an alternative method for detecting nutrient limitation of ecosystems, which is logistically simpler to measure than biomass change. We present a meta-analysis of response ratios of foliar nitrogen and phosphorus (RRN, RRP) after addition of nitrogen (N) fertilizer, phosphorus (P) fertilizer, or the two elements in combination, in relation to climate, ecosystem type, life form, family, and methodological factors. Results support other meta-analyses using biomass, and demonstrate there is strong evidence for nutrient limitation in natural communities. However, because N fertilization experiments greatly outnumber P fertilization trials, it is difficult to discern the absolute importance of N vs. P vs. co-limitation across ecosystems. Despite these caveats, it is striking that results did not follow conventional wisdom that temperate ecosystems are N-limited and tropical ones are P-limited. In addition, ratios of N to P, rather than response ratios, are also a useful index of nutrient limitation, but due to large overlap in values, there are unlikely to be universal cutoff values for delimiting N vs. P limitation. Differences in RRN and RRP were most significant across ecosystem types, plant families, life forms, and between competitive environments, but not across climatic variables.
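
    Response ratios of this kind are typically computed as the (log) ratio of the fertilized to the control foliar concentration; the log form and the example values below are assumptions for illustration.

```python
# The meta-analytic response ratio for foliar nutrients: typically the (log)
# ratio of the fertilized to the control foliar concentration. The log form
# and the example numbers are assumptions for illustration.
import math

def log_response_ratio(fertilized_mean, control_mean):
    return math.log(fertilized_mean / control_mean)

# Hypothetical foliar N before/after N addition (mg g-1)
rr_n = log_response_ratio(fertilized_mean=22.0, control_mean=18.0)
print(f"RRN = {rr_n:.3f}  (> 0 indicates foliar N increased after fertilization)")
```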

  16. Shifts in lake N: P stoichiometry and nutrient limitation driven by atmospheric nitrogen deposition

    Science.gov (United States)

    Elser, J.J.; Andersen, T.; Baron, Jill S.; Bergstrom, A.-K.; Jansson, M.; Kyle, M.; Nydick, K.R.; Steger, L.; Hessen, D.O.

    2009-01-01

    Human activities have more than doubled the amount of nitrogen (N) circulating in the biosphere. One major pathway of this anthropogenic N input into ecosystems has been increased regional deposition from the atmosphere. Here we show that atmospheric N deposition increased the stoichiometric ratio of N and phosphorus (P) in lakes in Norway, Sweden, and Colorado, United States, and, as a result, patterns of ecological nutrient limitation were shifted. Under low N deposition, phytoplankton growth is generally N-limited; however, in high-N deposition lakes, phytoplankton growth is consistently P-limited. Continued anthropogenic amplification of the global N cycle will further alter ecological processes, such as biogeochemical cycling, trophic dynamics, and biological diversity, in the world's lakes, even in lakes far from direct human disturbance.

  17. Analogous nutrient limitations in unicellular diazotrophs and Prochlorococcus in the South Pacific Ocean.

    Science.gov (United States)

    Moisander, Pia H; Zhang, Ruifeng; Boyle, Edward A; Hewson, Ian; Montoya, Joseph P; Zehr, Jonathan P

    2012-04-01

    Growth limitation of phytoplankton and unicellular nitrogen (N2) fixers (diazotrophs) was investigated in the oligotrophic Western South Pacific Ocean. Based on changes in abundances of nifH or 23S rRNA gene copies during nutrient-enrichment experiments, the factors limiting net growth of the unicellular diazotrophs UCYN-A (Group A), Crocosphaera watsonii, γ-Proteobacterium 24774A11, and the non-diazotrophic picocyanobacterium Prochlorococcus, varied within the region. At the westernmost stations, numbers were enhanced by organic carbon added as simple sugars, a combination of iron and an organic chelator, or iron added with phosphate. At stations nearest the equator, nutrient limitation of growth was not apparent. Maximum net growth rates for UCYN-A, C. watsonii and γ-24774A11 were 0.19, 0.61 and 0.52 d-1, respectively, which are the first known empirical growth rates reported for the uncultivated UCYN-A and the γ-24774A11. The addition of N enhanced total phytoplankton biomass up to 5-fold, and the non-N2-fixing Synechococcus was among the groups that responded favorably to N addition. Nitrogen was the major nutrient limiting phytoplankton biomass in the Western South Pacific Ocean, while availability of organic carbon or iron and organic chelator appear to limit abundances of unicellular diazotrophs. Lack of phytoplankton response to nutrient additions in the Pacific warm pool waters suggests diazotroph growth in this area is controlled by different factors than in the higher latitudes, which may partially explain previously observed variability in community composition in the region.
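
    Net growth rates derived from changes in gene-copy abundance over an incubation are commonly computed assuming exponential growth; the abundances below are hypothetical and chosen only to reproduce a rate near the reported 0.61 d-1.

```python
# Net growth rates from changes in gene-copy abundance during an enrichment
# incubation are commonly computed as an exponential rate; the example numbers
# are hypothetical, not data from the study.
import math

def net_growth_rate(n_final, n_initial, days):
    """Exponential net growth rate (d-1) from initial/final abundances."""
    return math.log(n_final / n_initial) / days

# e.g. nifH copies increasing ~3.4-fold over 2 days gives ~0.61 d-1
print(round(net_growth_rate(3.4e4, 1.0e4, 2.0), 2))
```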

  18. Kinetics of growth and lipids accumulation in Chlorella vulgaris during batch heterotrophic cultivation: Effect of different nutrient limitation strategies.

    Science.gov (United States)

    Sakarika, Myrsini; Kornaros, Michael

    2017-11-01

    The present study aimed at: (1) determining the effect of sulfur addition on biomass growth and (2) assessing the effect of sulfur, phosphorus and nitrogen limitation on lipid accumulation by C. vulgaris SAG 211-11b. The sulfur cellular content was more than two-fold higher under nitrogen and phosphorus limitation (0.52% and 0.54% w/w, respectively) compared to the sulfur requirement (0.20% w/w) under sulfur-limiting conditions. The nitrogen needs are significantly lower (2.81-3.35% w/w) when compared to other microalgae and become 23% lower under nitrogen or phosphorus limitation. The microalga exhibited substrate inhibition above 30 g L-1 initial glucose concentration. Sulfur limitation had the most significant effect on lipid accumulation, resulting in a maximum total lipid content of 53.43 ± 3.93% (g g-1 DW). In addition to enhancing lipid productivity, adopting the optimal nutrient limitation strategy can result in cost savings by avoiding unnecessary nutrient additions and can eliminate the environmental burden of wasted resources. Copyright © 2017 Elsevier Ltd. All rights reserved.
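
    Substrate inhibition of growth at high glucose is often described with a Haldane-type rate law; using it here, with invented parameters tuned so the rate peaks near 30 g/L, is purely illustrative and is not the kinetic model fitted in this study.

```python
# Substrate inhibition of growth at high glucose is often described with a
# Haldane-type rate law, mu = mu_max * S / (Ks + S + S^2/Ki). The parameters
# below are invented so that the rate peaks near 30 g/L, mimicking the reported
# onset of inhibition; they are not fitted values from the study.

def haldane(S, mu_max=0.06, Ks=1.0, Ki=900.0):
    """Specific growth rate (h-1) vs glucose concentration S (g/L)."""
    return mu_max * S / (Ks + S + S**2 / Ki)

for S in (5, 15, 30, 60, 90):
    print(f"S = {S:3d} g/L  ->  mu = {haldane(S):.3f} h-1")
```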

  19. Adaptation of Bacillus subtilis carbon core metabolism to simultaneous nutrient limitation and osmotic challenge : a multi-omics perspective

    NARCIS (Netherlands)

    Kohlstedt, Michael; Sappa, Praveen K; Meyer, Hanna; Maaß, Sandra; Zaprasis, Adrienne; Hoffmann, Tamara; Becker, Judith; Steil, Leif; Hecker, Michael; van Dijl, Jan Maarten; Lalk, Michael; Mäder, Ulrike; Stülke, Jörg; Bremer, Erhard; Völker, Uwe; Wittmann, Christoph

    The Gram-positive bacterium Bacillus subtilis encounters nutrient limitations and osmotic stress in its natural soil ecosystem. To ensure survival and sustain growth, highly integrated adaptive responses are required. Here, we investigated the system-wide response of B. subtilis to different,

  20. Nutrient Limitation in Surface Waters of the Oligotrophic Eastern Mediterranean Sea: an Enrichment Microcosm Experiment

    KAUST Repository

    Tsiola, A.

    2015-12-01

    The growth rates of planktonic microbes in the pelagic zone of the Eastern Mediterranean Sea are nutrient limited, but the type of limitation is still uncertain. During this study, we investigated the occurrence of N and P limitation among different groups of the prokaryotic and eukaryotic (pico-, nano-, and micro-) plankton using a microcosm experiment during stratified water column conditions in the Cretan Sea (Eastern Mediterranean). Microcosms were enriched with N and P (either solely or simultaneously), and the PO4 turnover time, prokaryotic heterotrophic activity, primary production, and the abundance of the different microbial components were measured. Flow cytometric and molecular fingerprint analyses showed that different heterotrophic prokaryotic groups were limited by different nutrients; total heterotrophic prokaryotic growth was limited by P, but only when both N and P were added, changes in community structure and cell size were detected. Phytoplankton were N and P co-limited, with autotrophic pico-eukaryotes being the exception as they increased even when only P was added after a 2-day time lag. The populations of Synechococcus and Prochlorococcus were highly competitive with each other; Prochlorococcus abundance increased during the first 2 days of P addition but kept increasing only when both N and P were added, whereas Synechococcus exhibited higher pigment content and increased in abundance 3 days after simultaneous N and P additions. Dinoflagellates also showed opportunistic behavior at simultaneous N and P additions, in contrast to diatoms and coccolithophores, which diminished in all incubations. High DNA content viruses, selective grazing, and the exhaustion of N sources probably controlled the populations of diatoms and coccolithophores.

  1. Aggregated filter-feeding consumers alter nutrient limitation: consequences for ecosystem and community dynamics.

    Science.gov (United States)

    Atkinson, Carla L; Vaughn, Caryn C; Forshay, Kenneth J; Cooper, Joshua T

    2013-06-01

    Nutrient cycling is a key process linking organisms in ecosystems. This is especially apparent in stream environments in which nutrients are taken up readily and cycled through the system in a downstream trajectory. Ecological stoichiometry predicts that biogeochemical cycles of different elements are interdependent because the organisms that drive these cycles require fixed ratios of nutrients. There is growing recognition that animals play an important role in biogeochemical cycling across ecosystems. In particular, dense aggregations of consumers can create biogeochemical hotspots in aquatic ecosystems via nutrient translocation. We predicted that filter-feeding freshwater mussels, which occur as speciose, high-biomass aggregates, would create biogeochemical hotspots in streams by altering nutrient limitation and algal dynamics. In a field study, we manipulated nitrogen and phosphorus using nutrient-diffusing substrates in areas with high and low mussel abundance, recorded algal growth and community composition, and determined in situ mussel excretion stoichiometry at 18 sites in three rivers (Kiamichi, Little, and Mountain Fork Rivers, south-central United States). Our results indicate that mussels greatly influence ecosystem processes by modifying the nutrients that limit primary productivity. Sites without mussels were N-limited with ~26% higher relative abundances of N-fixing blue-green algae, while sites with high mussel densities were co-limited (N and P) and dominated by diatoms. These results corroborated the results of our excretion experiments; our path analysis indicated that mussel excretion has a strong influence on stream water column N:P. Due to the high N:P of mussel excretion, strict N-limitation was alleviated, and the system switched to being co-limited by both N and P. This shows that translocation of nutrients by mussel aggregations is important to nutrient dynamics and algal species composition in these rivers. Our study highlights the

  2. Global comparison reveals biogenic weathering as driven by nutrient limitation at ecosystem scale

    Science.gov (United States)

    Boy, Jens; Godoy, Roberto; Dechene, Annika; Shibistova, Olga; Amir, Hamid; Iskandar, Issi; Fogliano, Bruno; Boy, Diana; McCulloch, Robert; Andrino, Alberto; Gschwendtner, Silvia; Marin, Cesar; Sauheitl, Leopold; Dultz, Stefan; Mikutta, Robert; Guggenberger, Georg

    2017-04-01

    A substantial contribution of biogenic weathering to ecosystem nutrition, especially by symbiotic microorganisms, has often been proposed, but large-scale in vivo studies are still missing. Here we compare a set of ecosystems spanning from the Antarctic to tropical forests for their potential biogenic weathering and its drivers. To address biogenic weathering rates, we installed mineral mesocosms only accessible for bacteria and fungi for up to 4 years, which contained freshly broken and defined nutrient-bearing minerals in soil A horizons of ecosystems along a gradient of soil development differing in climate and plant species communities. Alterations of the buried minerals were analyzed by grid-intersection, confocal laser scanning microscopy, energy-dispersive X-ray spectroscopy, and X-ray photoelectron spectroscopy on the surface and on thin sections. On selected sites, carbon fluxes were tracked by 13C labeling, and microbial community was identified by DNA sequencing. In young ecosystems (protosoils) biogenic weathering is almost absent and starts after first carbon accumulation by aeolian (later litter) inputs and is mainly performed by bacteria. With ongoing soil development and appearance of symbiotic (mycorrhized) plants, nutrient availability in soil increasingly drove biogenic weathering, and fungi became far more important players than bacteria. We found a close relation between fungal biogenic weathering and available potassium across all 16 forested sites in the study, regardless of the dominant mycorrhiza type (AM or EM), climate, and plant-species composition. We conclude that nutrient limitations at ecosystem scale are generally counteracted by adapted fungal biogenic weathering. The close relation between fungal weathering and plant-available nutrients over a large range of severely contrasting ecosystems points towards a direct energetic support of these weathering processes by the photoautotrophic community, making biogenic weathering a

  3. The repertoire and dynamics of evolutionary adaptations to controlled nutrient-limited environments in yeast.

    Directory of Open Access Journals (Sweden)

    David Gresham

    2008-12-01

    The experimental evolution of laboratory populations of microbes provides an opportunity to observe the evolutionary dynamics of adaptation in real time. Until very recently, however, such studies have been limited by our inability to systematically find mutations in evolved organisms. We overcome this limitation by using a variety of DNA microarray-based techniques to characterize genetic changes -- including point mutations, structural changes, and insertion variation -- that resulted from the experimental adaptation of 24 haploid and diploid cultures of Saccharomyces cerevisiae to growth in either glucose-, sulfate-, or phosphate-limited chemostats for approximately 200 generations. We identified frequent genomic amplifications and rearrangements as well as novel retrotransposition events associated with adaptation. Global nucleotide variation detection in ten clonal isolates identified 32 point mutations. On the basis of mutation frequencies, we infer that these mutations and the subsequent dynamics of adaptation are determined by the batch phase of growth prior to initiation of the continuous phase in the chemostat. We relate these genotypic changes to phenotypic outcomes, namely global patterns of gene expression, and to increases in fitness by 5-50%. We found that the spectrum of available mutations in glucose- or phosphate-limited environments combined with the batch phase population dynamics early in our experiments allowed several distinct genotypic and phenotypic evolutionary pathways in response to these nutrient limitations. By contrast, sulfate-limited populations were much more constrained in both genotypic and phenotypic outcomes. Thus, the reproducibility of evolution varies with specific selective pressures, reflecting the constraints inherent in the system-level organization of metabolic processes in the cell. We were able to relate some of the observed adaptive mutations (e.g., transporter gene amplifications) to known features

  4. A set of nutrient limitations trigger yeast cell death in a nitrogen-dependent manner during wine alcoholic fermentation.

    Directory of Open Access Journals (Sweden)

    Camille Duc

    Yeast cell death can occur during wine alcoholic fermentation. It is generally considered to result from ethanol stress that impacts membrane integrity. This cell death mainly occurs when the processing of grape musts reduces lipid availability, resulting in weaker membrane resistance to ethanol. However, the mechanisms underlying cell death in these conditions remain unclear. We examined cell death occurrence considering yeast cells' ability to elicit an appropriate response to a given nutrient limitation and thus survive starvation. We show here that a set of micronutrients (oleic acid, ergosterol, pantothenic acid and nicotinic acid) in low, growth-restricting concentrations triggers cell death in alcoholic fermentation when the nitrogen level is high. We provide evidence that nitrogen signaling is involved in cell death and that either SCH9 deletion or Tor inhibition prevents cell death in several types of micronutrient limitation. Under such limitations, yeast cells fail to acquire any stress resistance and are unable to store glycogen. Unexpectedly, transcriptome analyses did not reveal any major changes in stress gene expression, suggesting that post-transcriptional events critical for the stress response were not triggered by micronutrient starvation. Our data point to the fact that yeast cell death results from the inability of yeast to trigger an appropriate stress response under some conditions of nutrient limitation most likely not encountered by yeast in the wild. Our conclusions provide a novel frame for considering both cell death and the management of nutrients during alcoholic fermentation.

  5. Diagnosis & Correction of Soil Nutrient Limitations in Intensively managed southern pine forests

    Energy Technology Data Exchange (ETDEWEB)

    University of Florida

    2002-10-25

    Forest productivity is one means of sequestering carbon, and forests are a renewable energy source. Likewise, efficient use of fertilization can yield significant energy savings. To date, site-specific use of fertilization for the purpose of maximizing forest productivity has not been well developed. Site evaluation of nutrient deficiencies is primarily based on empirical approaches to soil testing and plot fertilizer tests, with little consideration for soil water regimes and contributing site factors. This project uses mass-flow and diffusion theory in a modeling context, combined with process-level knowledge of soil chemistry, to evaluate nutrient bioavailability to fast-growing juvenile forest stands growing on coastal plain Spodosols of the southeastern U.S. The model is not soil or site specific and should be useful for a wide range of soil management/nutrient management conditions. In order to use the model, field data from fast-growing southern pine needed to be measured and used in the validation of the model. The field aspect of the study was mainly to provide data that could be used to verify the model. However, we learned much about the growth and development of fast-growing loblolly pine. Carbon allocation patterns, root-shoot relationships and leaf area-root relationships proved to be new, important information. The project objectives were to: (1) develop a mechanistic nutrient management model based on the COMP8 uptake model; (2) collect field data that could be used to verify and test the model; and (3) test the model.

  6. Protein Redox Dynamics During Light-to-Dark Transitions in Cyanobacteria and Impacts Due to Nutrient Limitation

    Directory of Open Access Journals (Sweden)

    Aaron T Wright

    2014-07-01

    Protein redox chemistry constitutes a major gap in knowledge pertaining to photoautotrophic system regulation and signaling processes. We have employed a chemical biology approach to analyze redox-sensitive proteins in live Synechococcus sp. PCC 7002 cells in both light and dark periods, and to understand how cellular redox balance is disrupted during nutrient perturbation. The present work identified 300 putative redox-sensitive proteins that are involved in the generation of reductant, macromolecule synthesis, and carbon flux through central metabolic pathways, and may be involved in cell signaling and response mechanisms. Furthermore, our research suggests that dynamic redox changes in response to specific nutrient limitations, including carbon and nitrogen limitations, contribute to the regulatory changes driven by a shift from light to dark. Taken together, these results contribute to a high-level understanding of post-translational mechanisms regulating flux distributions and suggest potential metabolic engineering targets for redirecting carbon towards biofuel precursors.

  7. Role of nutrient limitation and stationary-phase existence in Klebsiella pneumoniae biofilm resistance to ampicillin and ciprofloxacin.

    Science.gov (United States)

    Anderl, Jeff N; Zahller, Jeff; Roe, Frank; Stewart, Philip S

    2003-04-01

    Biofilms formed by Klebsiella pneumoniae resisted killing during prolonged exposure to ampicillin or ciprofloxacin even though these agents have been shown to penetrate bacterial aggregates. Bacteria dispersed from biofilms into medium quickly regained most of their susceptibility. Experiments with free-floating bacteria showed that stationary-phase bacteria were protected from killing by either antibiotic, especially when the test was performed in medium lacking carbon and nitrogen sources. These results suggested that the antibiotic tolerance of biofilm bacteria could be explained by nutrient limitation in the biofilm leading to stationary-phase existence of at least some of the cells in the biofilm. This mechanism was supported by experimental characterization of nutrient availability and growth status in biofilms. The average specific growth rate of bacteria in biofilms was only 0.032 h-1 compared to the specific growth rate of planktonic bacteria of 0.59 h-1 measured in the same medium. Glucose did not penetrate all the way through the biofilm, and oxygen was shown to penetrate only into the upper 100 µm. The specific catalase activity was elevated in biofilm bacteria to a level similar to that of stationary-phase planktonic cells. Transmission electron microscopy revealed that bacteria were affected by ampicillin near the periphery of the biofilm but were not affected in the interior. Taken together, these results indicate that K. pneumoniae in this system experience nutrient limitation locally within the biofilm, leading to zones in which the bacteria enter stationary phase and are growing slowly or not at all. In these inactive regions, bacteria are less susceptible to killing by antibiotics.
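
    The limited oxygen penetration (~100 µm) is the kind of observation a simple reaction-diffusion estimate can rationalize: for zero-order consumption the penetration depth is sqrt(2*D*C0/q). The diffusivity, surface concentration, and consumption rate below are hypothetical order-of-magnitude values, not measurements from the study.

```python
# Zero-order reaction-diffusion estimate of oxygen penetration into a biofilm:
# depth = sqrt(2 * D * C0 / q). D, C0, and q are hypothetical order-of-magnitude
# values chosen for illustration, not measurements from the study.
import math

D  = 1.5e-9   # effective O2 diffusivity in the biofilm (m^2/s), assumed
C0 = 6.0      # O2 concentration at the biofilm surface (g/m^3), assumed
q  = 2.0      # volumetric O2 consumption rate (g m^-3 s^-1), assumed

depth = math.sqrt(2 * D * C0 / q)     # penetration depth (m)
print(f"estimated O2 penetration depth: {depth * 1e6:.0f} um")
```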

  8. Lactational Stage of Pasteurized Human Donor Milk Contributes to Nutrient Limitations for Infants

    Directory of Open Access Journals (Sweden)

    Christina J. Valentine

    2017-03-01

    Background. Mother’s own milk is the first choice for feeding preterm infants, but when not available, pasteurized human donor milk (PDM) is often used. Infants fed PDM have difficulties maintaining appropriate growth velocities. To assess the most basic elements of nutrition, we tested the hypotheses that fatty acid and amino acid composition of PDM is highly variable and standard pooling practices attenuate variability; however, total nutrients may be limiting without supplementation due to late lactational stage of the milk. Methods. A prospective cross-sectional sampling of milk was obtained from five donor milk banks located in Ohio, Michigan, Colorado, Texas-Ft Worth, and California. Milk samples were collected after Institutional Review Board (#07-0035) approval and informed consent. Fatty acid and amino acid contents were measured in milk from individual donors and donor pools (pooled per Human Milk Banking Association of North America guidelines). Statistical comparisons were performed using Kruskal–Wallis, Spearman’s, or Multivariate Regression analyses with center as the fixed factor and lactational stage as co-variate. Results. Ten of the fourteen fatty acids and seventeen of the nineteen amino acids analyzed differed across Banks in the individual milk samples. Pooling minimized these differences in amino acid and fatty acid contents. Concentrations of lysine and docosahexaenoic acid (DHA) were not different across Banks, but concentrations were low compared to recommended levels. Conclusions. Individual donor milk fatty acid and amino acid contents are highly variable. Standardized pooling practice reduces this variability. Lysine and DHA concentrations were consistently low across geographic regions in North America due to lactational stage of the milk, and thus not adequately addressed by pooling. Targeted supplementation is needed to optimize PDM, especially for the preterm or volume restricted infant.

  9. Elemental economy: microbial strategies for optimizing growth in the face of nutrient limitation.

    Science.gov (United States)

    Merchant, Sabeeha S; Helmann, John D

    2012-01-01

    Microorganisms play a dominant role in the biogeochemical cycling of nutrients. They are rightly praised for their facility for fixing both carbon and nitrogen into organic matter, and microbially driven processes have tangibly altered the chemical composition of the biosphere and its surrounding atmosphere. Despite their prodigious capacity for molecular transformations, microorganisms are powerless in the face of the immutability of the elements. Limitations for specific elements, either fleeting or persisting over eons, have left an indelible trace on microbial genomes, physiology, and their very atomic composition. We here review the impact of elemental limitation on microbes, with a focus on selected genetic model systems and representative microbes from the ocean ecosystem. Evolutionary adaptations that enhance growth in the face of persistent or recurrent elemental limitations are evident from genome and proteome analyses. These range from the extreme (such as dispensing with a requirement for a hard-to-obtain element) to the extremely subtle (changes in protein amino acid sequences that slightly, but significantly, reduce cellular carbon, nitrogen, or sulfur demand). One near-universal adaptation is the development of sophisticated acclimation programs by which cells adjust their chemical composition in response to a changing environment. When specific elements become limiting, acclimation typically begins with an increased commitment to acquisition and a concomitant mobilization of stored resources. If elemental limitation persists, the cell implements austerity measures including elemental sparing and elemental recycling. Insights into these fundamental cellular properties have emerged from studies at many different levels, including ecology, biological oceanography, biogeochemistry, molecular genetics, genomics, and microbial physiology. Here, we present a synthesis of these diverse studies and attempt to discern some overarching themes.

  10. Nutrient limitation on ecosystem productivity and processes of mature and old-growth subtropical forests in China.

    Directory of Open Access Journals (Sweden)

    Enqing Hou

    Nitrogen (N) is considered the dominant limiting nutrient in temperate regions, while phosphorus (P) limitation frequently occurs in tropical regions, but in subtropical regions nutrient limitation is poorly understood. In this study, we investigated N and P contents and N:P ratios of foliage, forest floors, fine roots and mineral soils, and their relationships with community biomass, litterfall C, N and P productions, forest floor turnover rate, and microbial processes in eight mature and old-growth subtropical forests (stand age >80 yr) at Dinghushan Biosphere Reserve, China. Average N:P ratios (mass based) in foliage, litter (L) layer and mixture of fermentation and humus (F/H) layer, and fine roots were 28.3, 42.3, 32.0 and 32.7, respectively. These values are higher than the critical N:P ratios for P limitation proposed (16-20 for foliage, ca. 25 for forest floors). The markedly high N:P ratios were mainly attributed to the high N concentrations of these plant materials. Community biomass, litterfall C, N and P productions, forest floor turnover rate and microbial properties were more strongly related to measures of P than N and frequently negatively related to the N:P ratios, suggesting a significant role of P availability in determining ecosystem production and productivity and nutrient cycling at all the study sites except for one prescribed disturbed site where N availability may also be important. We propose that N enrichment is probably a significant driver of the potential P limitation in the study area. Low P parent material may also contribute to the potential P limitation. In general, our results provided strong evidence supporting a significant role for P availability, rather than N availability, in determining ecosystem primary productivity and ecosystem processes in subtropical forests of China.

  11. Community Structure and Activity of a Highly Dynamic and Nutrient-Limited Hypersaline Microbial Mat in Um Alhool Sabkha, Qatar

    KAUST Repository

    Al-Thani, Roda

    2014-03-21

    The Um Alhool area in Qatar is a dynamic evaporative ecosystem that receives seawater from below as it is surrounded by sand dunes. We investigated the chemical composition, the microbial activity and biodiversity of the four main layers (L1–L4) in the photosynthetic mats. Chlorophyll a (Chl a) concentration and distribution (measured by HPLC and hyperspectral imaging, respectively), the phycocyanin distribution (scanned with hyperspectral imaging), oxygenic photosynthesis (determined by microsensor), and the abundance of photosynthetic microorganisms (from 16S and 18S rRNA sequencing) decreased with depth in the euphotic layer (L1). Incident irradiance exponentially attenuated in the same zone reaching 1% at 1.7-mm depth. Proteobacteria dominated all layers of the mat (24%–42% of the identified bacteria). Anoxygenic photosynthetic bacteria (dominated by Chloroflexus) were most abundant in the third red layer of the mat (L3), evidenced by the spectral signature of Bacteriochlorophyll as well as by sequencing. The deep, black layer (L4) was dominated by sulfate reducing bacteria belonging to the Deltaproteobacteria, which were responsible for high sulfate reduction rates (measured using 35S tracer). Members of Halobacteria were the dominant Archaea in all layers of the mat (92%–97%), whereas Nematodes were the main Eukaryotes (up to 87%). Primary productivity rates of Um Alhool mat were similar to those of other hypersaline microbial mats. However, sulfate reduction rates were relatively low, indicating that oxygenic respiration contributes more to organic material degradation than sulfate reduction, because of bioturbation. Although Um Alhool hypersaline mat is a nutrient-limited ecosystem, it is interestingly dynamic and phylogenetically highly diverse. All its components work in a highly efficient and synchronized way to compensate for the lack of nutrient supply provided during regular inundation periods.

  12. Persistence of Only a Minute Viable Population in Chlorotic Microcystis aeruginosa PCC 7806 Cultures Obtained by Nutrient Limitation.

    Directory of Open Access Journals (Sweden)

    Diogo de Abreu Meireles

    Cultures of the cyanobacterial strain Microcystis aeruginosa PCC 7806 subjected to nutrient limitation become chlorotic. When returned to nutrient-rich conditions these cultures regain their green colour. The aim of this study was to verify whether the cells in these cultures could be considered resting stages allowing the survival of periods of nutrient starvation as has been reported for Synechococcus PCC 7942. The experiments with Microcystis were carried out in parallel with Synechococcus cultures to rule out the possibility that any results obtained with Microcystis were due to our particular experimental conditions. The results of the experiments with Synechococcus PCC 7942 cultures were comparable to those reported in the literature. For Microcystis PCC 7806 a different response was observed. Analysis of chlorotic Microcystis cultures by flow cytometry showed that the phenotype of the cells in the population was not homogeneous: the amount of nucleic acids was about the same in all cells but only around one percent of the population emitted red autofluorescence indicating the presence of chlorophyll. Monitoring of the reversion of chlorosis by flow cytometry showed that the re-greening was most likely the result of the division of the small population of red autofluorescent cells originally present in the chlorotic cultures. This assumption was confirmed by analysing the integrity of the DNA and the membrane permeability of the cells of chlorotic cultures. Most of the DNA of these cultures was degraded and only the autofluorescent population of the chlorotic cultures showed membrane integrity. Thus, contrary to what has been reported for other cyanobacterial genera, most of the cells in chlorotic Microcystis cultures are not resting stages but dead. It is interesting to note that the red autofluorescent cells of green and chlorotic cultures obtained in double strength ASM-1 medium differ with respect to metabolism: levels of emission of

  13. Persistence of Only a Minute Viable Population in Chlorotic Microcystis aeruginosa PCC 7806 Cultures Obtained by Nutrient Limitation.

    Science.gov (United States)

    Meireles, Diogo de Abreu; Schripsema, Jan; Arnholdt, Andrea Cristina Vetö; Dagnino, Denise

    2015-01-01

    Cultures of the cyanobacterial strain Microcystis aeruginosa PCC 7806 subjected to nutrient limitation become chlorotic. When returned to nutrient-rich conditions these cultures regain their green colour. The aim of this study was to verify whether the cells in these cultures could be considered resting stages allowing the survival of periods of nutrient starvation as has been reported for Synechococcus PCC 7942. The experiments with Microcystis were carried out in parallel with Synechococcus cultures to rule out the possibility that any results obtained with Microcystis were due to our particular experimental conditions. The results of the experiments with Synechococcus PCC 7942 cultures were comparable to those reported in the literature. For Microcystis PCC 7806 a different response was observed. Analysis of chlorotic Microcystis cultures by flow cytometry showed that the phenotype of the cells in the population was not homogeneous: the amount of nucleic acids was about the same in all cells but only around one percent of the population emitted red autofluorescence indicating the presence of chlorophyll. Monitoring of the reversion of chlorosis by flow cytometry showed that the re-greening was most likely the result of the division of the small population of red autofluorescent cells originally present in the chlorotic cultures. This assumption was confirmed by analysing the integrity of the DNA and the membrane permeability of the cells of chlorotic cultures. Most of the DNA of these cultures was degraded and only the autofluorescent population of the chlorotic cultures showed membrane integrity. Thus, contrary to what has been reported for other cyanobacterial genera, most of the cells in chlorotic Microcystis cultures are not resting stages but dead. It is interesting to note that the red autofluorescent cells of green and chlorotic cultures obtained in double strength ASM-1 medium differ with respect to metabolism: levels of emission of red autofluorescence

  14. Integrated Assessment Model Evaluation

    Science.gov (United States)

    Smith, S. J.; Clarke, L.; Edmonds, J. A.; Weyant, J. P.

    2012-12-01

    Integrated assessment models of climate change (IAMs) are widely used to provide insights into the dynamics of the coupled human and socio-economic system, including emission mitigation analysis and the generation of future emission scenarios. Similar to the climate modeling community, the integrated assessment community has a two-decade history of model inter-comparison, which has served as one of the primary venues for model evaluation and confirmation. While analysis of historical trends in the socio-economic system has long played a key role in diagnostics of future scenarios from IAMs, formal hindcast experiments are just now being contemplated as evaluation exercises. Some initial thoughts on setting up such IAM evaluation experiments are discussed. Socio-economic systems do not follow strict physical laws, which means that evaluation needs to take place in a context, unlike that of physical system models, in which there are few fixed, unchanging relationships. Of course, strict validation of even earth system models is not possible (Oreskes et al. 2004), a fact borne out by the inability of models to constrain the climate sensitivity. Energy-system models have also been grappling with some of the same questions over the last quarter century. For example, one of "the many questions in the energy field that are waiting for answers in the next 20 years" identified by Hans Landsberg in 1985 was "Will the price of oil resume its upward movement?" Of course, we are still asking this question today. While, arguably, even fewer constraints apply to socio-economic systems, numerous historical trends and patterns have been identified, although often only in broad terms, that are used to guide the development of model components, parameter ranges, and scenario assumptions. IAM evaluation exercises are expected to provide useful information for interpreting model results and improving model behavior. A key step is the recognition of model boundaries, that is, what is inside

  15. Seasonal variability in the persistence of dissolved environmental DNA (eDNA) in a marine system: The role of microbial nutrient limitation.

    Directory of Open Access Journals (Sweden)

    Ian Salter

    be linked to the metabolic response of microbial communities to nutrient limitation. Future studies should consider the effect of natural environmental gradients on the seasonal persistence of eDNA, which will be of particular relevance for time-series biomonitoring programs.

  16. Models for Pesticide Risk Assessment

    Science.gov (United States)

    In risk assessment, EPA considers the toxicity of the pesticide as well as the amount of pesticide to which a person or the environment may be exposed. In exposure assessment, scientists use mathematical models to predict pesticide concentrations.

  17. Acclimation of Emiliania huxleyi (1516) to nutrient limitation involves precise modification of the proteome to scavenge alternative sources of N and P.

    Science.gov (United States)

    McKew, Boyd A; Metodieva, Gergana; Raines, Christine A; Metodiev, Metodi V; Geider, Richard J

    2015-10-01

    Limitation of marine primary production by the availability of nitrogen or phosphorus is common. Emiliania huxleyi, a ubiquitous phytoplankter that plays key roles in primary production, calcium carbonate precipitation and production of dimethyl sulfide, often blooms in mid-latitude at the beginning of summer when inorganic nutrient concentrations are low. To understand physiological mechanisms that allow such blooms, we examined how the proteome of E. huxleyi (strain 1516) responds to N and P limitation. We observed modest changes in much of the proteome despite large physiological changes (e.g. cellular biomass, C, N and P) associated with nutrient limitation of growth rate. Acclimation to nutrient limitation did however involve significant increases in the abundance of transporters for ammonium and nitrate under N limitation and for phosphate under P limitation. More notable were large increases in proteins involved in the acquisition of organic forms of N and P, including urea and amino acid/polyamine transporters and numerous C-N hydrolases under N limitation and a large upregulation of alkaline phosphatase under P limitation. This highly targeted reorganization of the proteome towards scavenging organic forms of macronutrients gives unique insight into the molecular mechanisms that underpin how E. huxleyi has found its niche to bloom in surface waters depleted of inorganic nutrients. © 2015 The Authors. Environmental Microbiology published by Society for Applied Microbiology and John Wiley & Sons Ltd.

  18. Suppressed translation as a mechanism of initiation of CASP8 (caspase 8)-dependent apoptosis in autophagy-deficient NSCLC cells under nutrient limitation.

    Science.gov (United States)

    Allavena, Giulia; Cuomo, Francesca; Baumgartner, Georg; Bele, Tadeja; Sellgren, Alexander Yarar; Oo, Kyaw Soe; Johnson, Kaylee; Gogvadze, Vladimir; Zhivotovsky, Boris; Kaminskyy, Vitaliy O

    2018-01-01

    Macroautophagy/autophagy inhibition under stress conditions is often associated with increased cell death. We found that under nutrient limitation, activation of CASP8/caspase-8 was significantly increased in autophagy-deficient lung cancer cells, which precedes mitochondria outer membrane permeabilization (MOMP), CYCS/cytochrome c release, and activation of CASP9/caspase-9, indicating that under such conditions the activation of CASP8 is a primary event in the initiation of apoptosis as well as essential to reduce clonogenic survival of autophagy-deficient cells. Starvation leads to suppression of CFLAR proteosynthesis and accumulation of CASP8 in SQSTM1 puncta. Overexpression of CFLARs reduces CASP8 activation and apoptosis during starvation, while its silencing promotes efficient activation of CASP8 and apoptosis in autophagy-deficient U1810 lung cancer cells even under nutrient-rich conditions. Similar to starvation, inhibition of protein translation leads to efficient activation of CASP8 and cell death in autophagy-deficient lung cancer cells. Thus, here for the first time we report that suppressed translation leads to activation of CASP8-dependent apoptosis in autophagy-deficient NSCLC cells under conditions of nutrient limitation. Our data suggest that targeting translational machinery can be beneficial for elimination of autophagy-deficient cells via the CASP8-dependent apoptotic pathway.

  19. Productivity and residual benefits of grain legumes to sorghum under semi-arid conditions in south-western Zimbabwe: Unravelling the effects of water and nitrogen using a simulation model

    NARCIS (Netherlands)

    Ncube, B.; Dimes, J.P.; Wijk, van M.T.; Twomlow, S.J.; Giller, K.E.

    2009-01-01

    The APSIM model was used to assess the impact of legumes on sorghum grown in rotation in a nutrient-limited system under dry conditions in south-western Zimbabwe. An experiment was conducted at Lucydale, Matopos Research Station, between 2002 and 2005. The model was used to simulate soil and plant

  20. Nutrient limitation leads to penetrative growth into agar and affects aroma formation in Pichia fabianii, P. kudriavzevii and Saccharomyces cerevisiae

    NARCIS (Netherlands)

    van Rijswijck, Irma M H; Dijksterhuis, Jan; Wolkers-Rooijackers, Judith C M; Abee, Tjakko; Smid, Eddy J

    Among fermentative yeast species, Saccharomyces cerevisiae is most frequently used as a model organism, although other yeast species may have special features that make them interesting candidates to apply in food-fermentation processes. In this study, we used three yeast species isolated from

  1. Nutrient limitation leads to penetrative growth into agar and affects aroma formation in Pichia fabianii, P. kudriavzevii and Saccharomyces cerevisiae

    NARCIS (Netherlands)

    Rijswijck, van I.M.H.; Dijksterhuis, J.; Wolkers-Rooijackers, J.C.M.; Abee, T.; Smid, E.J.

    2015-01-01

    Among fermentative yeast species, Saccharomyces cerevisiae is most frequently used as a model organism, although other yeast species may have special features that make them interesting candidates to apply in food-fermentation processes. In this study, we used three yeast species isolated from

  2. Integrated Environmental Assessment Modelling

    Energy Technology Data Exchange (ETDEWEB)

    Guardanz, R; Gimeno, B S; Bermejo, V; Elvira, S; Martin, F; Palacios, M; Rodriguez, E; Donaire, I [Ciemat, Madrid (Spain)

    2000-07-01

    This report describes the results of the Spanish participation in the project 'Coupling CORINAIR data to cost-effective emission reduction strategies based on critical thresholds' (EU/LIFE97/ENV/FIN/336). The subproject focused on three tasks: developing tools to improve knowledge of the spatial and temporal details of emissions of air pollutants in Spain; exploiting existing experimental information on plant response to air pollutants in temperate ecosystems; and integrating these findings into a modelling framework that can assess with greater accuracy the impact of air pollutants on temperate ecosystems. The results obtained during the execution of this project have significantly improved the models of the impact of alternative emission control strategies on ecosystems and crops in the Iberian Peninsula. (Author) 375 refs.

  3. Seasonal changes in nutrient limitation and nitrate sources in the green macroalga Ulva lactuca at sites with and without green tides in a northeastern Pacific embayment.

    Science.gov (United States)

    Van Alstyne, Kathryn L

    2016-02-15

    In Penn Cove, ulvoid green algal mats occur annually. To examine seasonal variation in their causes, nitrogen and carbon were measured in Ulva lactuca in May, July, and September and stable nitrogen and oxygen isotope ratios were quantified in U. lactuca, Penn Cove seawater, upwelled water from Saratoga Passage, water near the Skagit River outflow, and effluents from wastewater treatment facilities. Ulvoid growth was nitrogen limited and the sources of nitrogen used by the algae changed during the growing season. Algal nitrogen concentrations were 0.85-4.55% and were highest in September and at sites where algae were abundant. Upwelled waters were the primary nitrogen source for the algae, but anthropogenic sources also contributed to algal growth towards the end of the growing season. This study suggests that small nitrogen inputs can result in crossing a "tipping point", causing the release of nutrient limitation and localized increases in algal growth. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. Nutrient limitation and microbially mediated chemistry: studies using tuff inoculum obtained from the Exploratory Studies Facility, Yucca Mountain

    International Nuclear Information System (INIS)

    Chen, C. I.; Chuu, Y. J.; Meike, A.; Ringelberg, D.; Sawvel, A.

    1998-01-01

    Flow-through bioreactors are used to investigate the relationship between the supply (and limitation) of major nutrients required by microorganisms (C, N, P, S) and effluent chemistry, to obtain data useful for developing models of microbially mediated aqueous chemistry. The bioreactors were inoculated with crushed tuff from Yucca Mountain. Six of the 14 bioreactor experiments currently in operation have shown growth, which occurred in as little as 5 days and as long as a few months after initiation of the experiment. All of the bioreactors exhibiting growth contained glucose as a carbon source, but other nutritional components varied. Chemical signatures of each bioreactor were compared to each other and selected results were compared to computer simulations of the equivalent abiotic chemical reactions. At 21 °C, the richest medium formulation produced a microbial community that lowered the effluent pH from 6.4 to as low as 3.9. The same medium formulation at 50 °C produced no significant change in pH but caused a significant increase in Cl after a period of 200 days. Variations in concentrations of other elements, some of which appear to be periodic (Ca, Mg, etc.), also occur. Bioreactors fed with low C, N, P, S media showed growth, but had stabilized at lower cell densities. The room temperature bioreactor in this group exhibited a phospholipid fatty acid (PLFA) signature of sulfur- or iron-reducing bacteria, which produced a significant chemical signature in the effluent from that bioreactor. Growth had not been observed yet in the alkaline bioreactors, even in those containing glucose. The value of combining detailed chemical and community (e.g., ester-linked PLFA) analyses, long-duration experiments, and abiotic chemical models to distinguish chemical patterns is evident. Although all of the bioreactors contain the same initial microorganisms and mineral constituents, PLFA analysis demonstrates that both input chemistry and temperature determine the

  5. Uncertainties in radioecological assessment models

    International Nuclear Information System (INIS)

    Hoffman, F.O.; Miller, C.W.; Ng, Y.C.

    1983-01-01

    Environmental radiological assessments rely heavily on the use of mathematical models. The predictions of these models are inherently uncertain because models are inexact representations of real systems. The major sources of this uncertainty are related to bias in model formulation and imprecision in parameter estimation. The magnitude of uncertainty is a function of the questions asked of the model and the specific radionuclides and exposure pathways of dominant importance. It is concluded that models developed as research tools should be distinguished from models developed for assessment applications. Furthermore, increased model complexity does not necessarily guarantee increased accuracy. To improve the realism of assessment modeling, stochastic procedures are recommended that translate uncertain parameter estimates into a distribution of predicted values. These procedures also permit the importance of model parameters to be ranked according to their relative contribution to the overall predicted uncertainty. Although confidence in model predictions can be improved through site-specific parameter estimation and increased model validation, health risk factors and internal dosimetry models will probably remain important contributors to the amount of uncertainty that is irreducible. 41 references, 4 figures, 4 tables
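
    A minimal sketch of the stochastic procedure recommended in this record is shown below: uncertain parameters of a toy dose model are sampled from assumed distributions, propagated to a distribution of predicted doses, and ranked by their rank correlation with the output. The model form, parameter names and distributions are illustrative assumptions, not taken from the record itself.

      import numpy as np
      from scipy.stats import spearmanr

      rng = np.random.default_rng(42)
      n = 10_000

      # Assumed lognormal parameter distributions for a single ingestion pathway
      transfer_factor = rng.lognormal(np.log(0.1), 0.5, n)    # Bq/kg crop per Bq/m2 deposited
      consumption     = rng.lognormal(np.log(100.0), 0.3, n)  # kg consumed per year
      dose_coeff      = rng.lognormal(np.log(1e-8), 0.2, n)   # Sv per Bq ingested

      deposition = 1_000.0                                     # Bq/m2, assumed ground deposition
      dose = deposition * transfer_factor * consumption * dose_coeff   # Sv/year, toy model

      # A distribution of predicted values rather than a single point estimate
      print("median dose  :", np.median(dose))
      print("95th pct dose:", np.percentile(dose, 95))

      # Rank parameters by their relative contribution to the spread of the prediction
      for name, samples in [("transfer_factor", transfer_factor),
                            ("consumption", consumption),
                            ("dose_coeff", dose_coeff)]:
          rho, _ = spearmanr(samples, dose)
          print(f"{name:16s} Spearman rho = {rho:.2f}")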

  6. Model uncertainty in safety assessment

    International Nuclear Information System (INIS)

    Pulkkinen, U.; Huovinen, T.

    1996-01-01

    Uncertainty analyses are an essential part of any risk assessment. Usually the uncertainties of reliability model parameter values are described by probability distributions and the uncertainty is propagated through the whole risk model. In addition to the parameter uncertainties, the assumptions behind the risk models may be based on insufficient experimental observations and the models themselves may not be exact descriptions of the phenomena under analysis. The description and quantification of this type of uncertainty, model uncertainty, is the topic of this report. Model uncertainty is characterized and some approaches to model and quantify it are discussed. The emphasis is on so-called mixture models, which have been applied in PSAs. Some of the possible disadvantages of the mixture model are addressed. In addition to quantitative analyses, qualitative analysis is also discussed briefly. To illustrate the models, two simple case studies on failure intensity and human error modeling are described. In both examples, the analysis is based on simple mixture models, which are observed to apply in PSA analyses. (orig.) (36 refs., 6 figs., 2 tabs.)

  7. Model uncertainty in safety assessment

    Energy Technology Data Exchange (ETDEWEB)

    Pulkkinen, U; Huovinen, T [VTT Automation, Espoo (Finland). Industrial Automation

    1996-01-01

    Uncertainty analyses are an essential part of any risk assessment. Usually the uncertainties of reliability model parameter values are described by probability distributions and the uncertainty is propagated through the whole risk model. In addition to the parameter uncertainties, the assumptions behind the risk models may be based on insufficient experimental observations and the models themselves may not be exact descriptions of the phenomena under analysis. The description and quantification of this type of uncertainty, model uncertainty, is the topic of this report. Model uncertainty is characterized and some approaches to model and quantify it are discussed. The emphasis is on so-called mixture models, which have been applied in PSAs. Some of the possible disadvantages of the mixture model are addressed. In addition to quantitative analyses, qualitative analysis is also discussed briefly. To illustrate the models, two simple case studies on failure intensity and human error modeling are described. In both examples, the analysis is based on simple mixture models, which are observed to apply in PSA analyses. (orig.) (36 refs., 6 figs., 2 tabs.).
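
    To make the mixture-model idea from the two records above concrete, the sketch below (not taken from the report) mixes two candidate descriptions of a failure intensity, a nominal and a degraded Poisson rate, with subjective weights expressing the model uncertainty between them. Rates, weights and mission time are invented for illustration.

      import numpy as np

      # Hypothetical two-component mixture for a failure intensity (per year)
      rates   = np.array([1e-3, 1e-2])   # nominal vs. degraded failure rate
      weights = np.array([0.8, 0.2])     # subjective probability of each model being correct

      t = 10.0  # mission time in years

      # Probability of at least one failure under each model, then mixed
      p_fail_each = 1.0 - np.exp(-rates * t)
      p_fail_mix  = np.sum(weights * p_fail_each)

      # Mixture mean and spread of the intensity itself
      mean_rate = np.sum(weights * rates)
      var_rate  = np.sum(weights * (rates - mean_rate) ** 2)

      print(f"P(failure within {t} y) under mixture: {p_fail_mix:.4f}")
      print(f"mixture mean rate: {mean_rate:.2e}, std: {np.sqrt(var_rate):.2e}")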

  8. Heterologous expression of Anabaena PCC 7120 all3940 (a Dps family gene) protects Escherichia coli from nutrient limitation and abiotic stresses

    International Nuclear Information System (INIS)

    Narayan, Om Prakash; Kumari, Nidhi; Rai, Lal Chand

    2010-01-01

    This study presents first-hand data on the cloning and heterologous expression of Anabaena PCC 7120 all3940 (a dps family gene) in combating nutrient limitation and multiple abiotic stresses. Escherichia coli transformed with the pGEX-5X-2-all3940 construct, when subjected to iron, carbon, nitrogen and phosphorus limitation and to carbofuron, copper, UV-B, heat, salt and cadmium stress, registered a significant increase in growth over the cells transformed with the empty vector under iron (0%), carbon (0.05%), nitrogen (3.7 mM) and phosphorus (2 mM) limitation and carbofuron (0.025 mg ml⁻¹), CuCl2 (1 mM), UV-B (10 min), heat (47 °C), NaCl (6% w/v) and CdCl2 (4 mM) stress. Enhanced expression of the all3940 gene, measured by semi-quantitative RT-PCR at different time points under the above-mentioned treatments, clearly demonstrates its role in tolerance against these abiotic stresses. This study opens the way for developing transgenic cyanobacteria capable of growing successfully under such stresses.

  9. Climate Modeling Computing Needs Assessment

    Science.gov (United States)

    Petraska, K. E.; McCabe, J. D.

    2011-12-01

    This paper discusses early findings of an assessment of computing needs for NASA science, engineering and flight communities. The purpose of this assessment is to document a comprehensive set of computing needs that will allow us to better evaluate whether our computing assets are adequately structured to meet evolving demand. The early results are interesting, already pointing out improvements we can make today to get more out of the computing capacity we have, as well as potential game changing innovations for the future in how we apply information technology to science computing. Our objective is to learn how to leverage our resources in the best way possible to do more science for less money. Our approach in this assessment is threefold: Development of use case studies for science workflows; Creating a taxonomy and structure for describing science computing requirements; and characterizing agency computing, analysis, and visualization resources. As projects evolve, science data sets increase in a number of ways: in size, scope, timelines, complexity, and fidelity. Generating, processing, moving, and analyzing these data sets places distinct and discernable requirements on underlying computing, analysis, storage, and visualization systems. The initial focus group for this assessment is the Earth Science modeling community within NASA's Science Mission Directorate (SMD). As the assessment evolves, this focus will expand to other science communities across the agency. We will discuss our use cases, our framework for requirements and our characterizations, as well as our interview process, what we learned and how we plan to improve our materials after using them in the first round of interviews in the Earth Science Modeling community. We will describe our plans for how to expand this assessment, first into the Earth Science data analysis and remote sensing communities, and then throughout the full community of science, engineering and flight at NASA.

  10. Removal of nutrient limitations in forest gaps enhances growth rate and resistance to cavitation in subtropical canopy tree species differing in shade tolerance.

    Science.gov (United States)

    Villagra, Mariana; Campanello, Paula I; Montti, Lia; Goldstein, Guillermo

    2013-03-01

    A 4-year fertilization experiment with nitrogen (N) and phosphorus (P) was carried out in natural gaps of a subtropical forest in northeastern Argentina. Saplings of six dominant canopy species differing in shade tolerance were grown in five control and five N + P fertilized gaps. Hydraulic architectural traits such as wood density, the leaf area to sapwood area ratio (LA : SA), vulnerability to cavitation (P50) and specific and leaf-specific hydraulic conductivity were measured, as well as the relative growth rate, specific leaf area (SLA) and percentage of leaf damage by insect herbivores. Plant growth rates and resistance to drought-induced embolisms increased when nutrient limitations were removed. On average, the P50 of control plants was -1.1 MPa, while the P50 of fertilized plants was -1.6 MPa. Wood density and LA : SA decreased with N + P additions. A trade-off between vulnerability to cavitation and efficiency of water transport was not observed. The relative growth rate was positively related to the total leaf surface area per plant and negatively related to LA : SA, while P50 was positively related to SLA across species and treatments. Plants with higher growth rates and higher total leaf area in fertilized plots were able to avoid hydraulic dysfunction by becoming less vulnerable to cavitation (more negative P50). Two high-light-requiring species exhibited relatively low growth rates due to heavy herbivore damage. Contrary to expectations, shade-tolerant plants with relatively high resistance to hydraulic dysfunction and reduced herbivory damage were able to grow faster. These results suggest that during the initial phase of sapling establishment in gaps, species that were less vulnerable to cavitation and exhibited reduced herbivory damage had faster realized growth rates than less shade-tolerant species with higher potential growth rates. Finally, functional relationships between hydraulic traits and growth rate across species and treatments

  11. The SAVI vulnerability assessment model

    International Nuclear Information System (INIS)

    Winblad, A.E.

    1987-01-01

    The assessment model "Systematic Analysis of Vulnerability to Intrusion" (SAVI) presented in this report is a PC-based path analysis model. It can provide estimates of protection system effectiveness (or vulnerability) against a spectrum of outsider threats, including collusion with an insider adversary. It calculates one measure of system effectiveness, the probability of interruption P(I), for all potential adversary paths. SAVI can perform both theft and sabotage vulnerability analyses. For theft, the analysis is based on the assumption that adversaries should be interrupted either before they can remove the target material from its normal location or before they can remove it from the site boundary. For sabotage, the analysis is based on the assumption that adversaries should be interrupted before completion of their sabotage task
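
    The record does not give SAVI's internal equations, but a common simplification of path-analysis models of this kind treats P(I) as the probability that the adversary is first detected at some path element early enough for the response force to arrive before task completion. The sketch below illustrates that idea only; it is not the SAVI algorithm, and the detection probabilities, delays and response time are invented.

      # Hypothetical adversary path: (detection probability, delay in seconds) per element
      path = [(0.3, 60), (0.5, 120), (0.9, 300), (0.2, 90)]
      response_time = 400  # seconds needed by the response force to interrupt

      def probability_of_interruption(path, response_time):
          """P(I): detection at element i counts only if the delay remaining
          along the path from that element exceeds the response time."""
          p_not_detected_so_far = 1.0
          p_interrupt = 0.0
          for i, (p_detect, _) in enumerate(path):
              remaining_delay = sum(delay for _, delay in path[i:])
              if remaining_delay >= response_time:
                  # first detection happens here AND the response arrives in time
                  p_interrupt += p_not_detected_so_far * p_detect
              p_not_detected_so_far *= 1.0 - p_detect
          return p_interrupt

      print(f"P(I) = {probability_of_interruption(path, response_time):.3f}")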

  12. Irrigation in dose assessment models

    Energy Technology Data Exchange (ETDEWEB)

    Bergstroem, Ulla; Barkefors, Catarina [Studsvik RadWaste AB, Nykoeping (Sweden)

    2004-05-01

    SKB has carried out several safety analyses for repositories for radioactive waste, one of which was SR 97, a multi-site study concerned with a future deep bedrock repository for high-level waste. In the case of future releases due to unforeseen failure of the protective multiple barrier system, radionuclides may be transported with groundwater and may reach the biosphere. Assessments of doses have to be carried out with a long-term perspective. Specific models are therefore employed to estimate consequences to man. It has been determined that the main pathway for nuclides from groundwater or surface water to soil is via irrigation. Irrigation may cause contamination of crops directly by e.g. interception or rain-splash, and indirectly via root-uptake from contaminated soil. In many safety assessments the exposed people are assumed to be self-sufficient, i.e. their food is produced locally where the concentration of radionuclides may be the highest. Irrigation therefore plays an important role when estimating consequences. The present study is therefore concerned with a more extensive analysis of the role of irrigation for possible future doses to people living in the area surrounding a repository. Current irrigation practices in Sweden are summarised, showing that vegetables and potatoes are the most common crops for irrigation. In general, however, irrigation is not so common in Sweden. The irrigation model used in the latest assessments is described. A sensitivity analysis is performed showing that, as expected, interception of irrigation water and retention on vegetation surfaces are important parameters. The parameters used to describe this are discussed. A summary is also given of how irrigation is proposed to be handled in the international BIOMASS (BIOsphere Modelling and ASSessment) project and in models like TAME and BIOTRAC. Similarities and differences are pointed out. Some numerical results are presented showing that surface contamination in general gives the

  13. Irrigation in dose assessment models

    International Nuclear Information System (INIS)

    Bergstroem, Ulla; Barkefors, Catarina

    2004-05-01

    SKB has carried out several safety analyses for repositories for radioactive waste, one of which was SR 97, a multi-site study concerned with a future deep bedrock repository for high-level waste. In the case of future releases due to unforeseen failure of the protective multiple barrier system, radionuclides may be transported with groundwater and may reach the biosphere. Assessments of doses have to be carried out with a long-term perspective. Specific models are therefore employed to estimate consequences to man. It has been determined that the main pathway for nuclides from groundwater or surface water to soil is via irrigation. Irrigation may cause contamination of crops directly by e.g. interception or rain-splash, and indirectly via root-uptake from contaminated soil. In many safety assessments the exposed people are assumed to be self-sufficient, i.e. their food is produced locally where the concentration of radionuclides may be the highest. Irrigation therefore plays an important role when estimating consequences. The present study is therefore concerned with a more extensive analysis of the role of irrigation for possible future doses to people living in the area surrounding a repository. Current irrigation practices in Sweden are summarised, showing that vegetables and potatoes are the most common crops for irrigation. In general, however, irrigation is not so common in Sweden. The irrigation model used in the latest assessments is described. A sensitivity analysis is performed showing that, as expected, interception of irrigation water and retention on vegetation surfaces are important parameters. The parameters used to describe this are discussed. A summary is also given of how irrigation is proposed to be handled in the international BIOMASS (BIOsphere Modelling and ASSessment) project and in models like TAME and BIOTRAC. Similarities and differences are pointed out. Some numerical results are presented showing that surface contamination in general gives the
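
    As a rough illustration of why interception and retention matter in the irrigation models described in the two records above, the sketch below computes crop contamination from irrigation water using a simple interception/weathering formulation of the kind used in generic biosphere models: the activity retained on foliage decays with a weathering half-life until harvest. The formula and all parameter values are assumptions for illustration, not the SKB model.

      import numpy as np

      # Hypothetical parameter values (illustrative only)
      c_water     = 50.0   # radionuclide concentration in irrigation water, Bq/m3
      irr_amount  = 0.3    # irrigation water applied, m3 per m2 and growing season
      f_intercept = 0.3    # fraction of applied activity retained on foliage
      t_weather   = 14.0   # weathering half-life on vegetation, days
      t_harvest   = 30.0   # time from irrigation to harvest, days
      crop_yield  = 2.0    # edible yield, kg fresh weight per m2

      lam_w     = np.log(2.0) / t_weather
      deposited = c_water * irr_amount                       # Bq/m2 applied with the water
      on_crop   = deposited * f_intercept * np.exp(-lam_w * t_harvest)

      # Surface contamination only; root uptake from contaminated soil would be added separately
      conc_crop = on_crop / crop_yield                       # Bq/kg at harvest
      print(f"crop concentration from interception: {conc_crop:.2f} Bq/kg")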

  14. Behavior model for performance assessment

    International Nuclear Information System (INIS)

    Brown-VanHoozer, S. A.

    1999-01-01

    Every individual channels information differently based on their preference for the sensory modality or representational system (visual, auditory or kinesthetic) we tend to favor most (our primary representational system, PRS). Therefore, some of us access and store our information primarily visually first, some auditorily, and others kinesthetically (through feel and touch), which in turn establishes our information-processing patterns and strategies and our external-to-internal (and subsequently vice versa) experiential language representation. Because of the different ways we channel our information, each of us will respond differently to a task--the way we gather and process the external information (input), our response time (process), and the outcome (behavior). Traditional human models of decision making and response time focus on perception, cognitive and motor systems stimulated and influenced by the three sensory modalities: visual, auditory and kinesthetic. For us, these are the building blocks to knowing how someone is thinking. Being aware of what is taking place and how to ask questions is essential in assessing performance toward reducing human errors. Existing models give predictions based on time values or response times for a particular event, which may be summed and averaged for a generalization of behavior(s). However, without establishing a basic understanding of how the behavior was generated through a decision-making strategy process, predictive models are overall inefficient in their analysis of the means by which behavior was generated. What is seen is the end result

  15. Behavior model for performance assessment.

    Energy Technology Data Exchange (ETDEWEB)

    Brown-VanHoozer, S. A.

    1999-07-23

    Every individual channels information differently based on their preference for the sensory modality or representational system (visual, auditory or kinesthetic) we tend to favor most (our primary representational system, PRS). Therefore, some of us access and store our information primarily visually first, some auditorily, and others kinesthetically (through feel and touch), which in turn establishes our information-processing patterns and strategies and our external-to-internal (and subsequently vice versa) experiential language representation. Because of the different ways we channel our information, each of us will respond differently to a task--the way we gather and process the external information (input), our response time (process), and the outcome (behavior). Traditional human models of decision making and response time focus on perception, cognitive and motor systems stimulated and influenced by the three sensory modalities: visual, auditory and kinesthetic. For us, these are the building blocks to knowing how someone is thinking. Being aware of what is taking place and how to ask questions is essential in assessing performance toward reducing human errors. Existing models give predictions based on time values or response times for a particular event, which may be summed and averaged for a generalization of behavior(s). However, without establishing a basic understanding of how the behavior was generated through a decision-making strategy process, predictive models are overall inefficient in their analysis of the means by which behavior was generated. What is seen is the end result.

  16. Utility of Social Modeling for Proliferation Assessment - Preliminary Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Coles, Garill A.; Gastelum, Zoe N.; Brothers, Alan J.; Thompson, Sandra E.

    2009-06-01

    This Preliminary Assessment draft report will present the results of a literature search and preliminary assessment of the body of research, analysis methods, models and data deemed to be relevant to the Utility of Social Modeling for Proliferation Assessment research. This report will provide: 1) a description of the problem space and the kinds of information pertinent to the problem space, 2) a discussion of key relevant or representative literature, 3) a discussion of models and modeling approaches judged to be potentially useful to the research, and 4) the next steps of this research that will be pursued based on this preliminary assessment. This draft report represents a technical deliverable for the NA-22 Simulations, Algorithms, and Modeling (SAM) program. Specifically this draft report is the Task 1 deliverable for project PL09-UtilSocial-PD06, Utility of Social Modeling for Proliferation Assessment. This project investigates non-traditional use of social and cultural information to improve nuclear proliferation assessment, including nonproliferation assessment, proliferation resistance assessments, safeguards assessments and other related studies. These assessments often use and create technical information about the State’s posture towards proliferation, the vulnerability of a nuclear energy system to an undesired event, and the effectiveness of safeguards. This project will find and fuse social and technical information by explicitly considering the role of cultural, social and behavioral factors relevant to proliferation. The aim of this research is to describe and demonstrate if and how social science modeling has utility in proliferation assessment.

  17. Dose assessment models. Annex A

    International Nuclear Information System (INIS)

    1982-01-01

    The models presented in this chapter have been separated into two general categories: environmental transport models, which describe the movement of radioactive materials through all sectors of the environment after their release, and dosimetric models, which calculate the absorbed dose following an intake of radioactive materials or exposure to external irradiation. Various sections of this chapter also deal with atmospheric transport models, terrestrial models, and aquatic models.

  18. Assessing uncertainty in mechanistic models

    Science.gov (United States)

    Edwin J. Green; David W. MacFarlane; Harry T. Valentine

    2000-01-01

    Concern over potential global change has led to increased interest in the use of mechanistic models for predicting forest growth. The rationale for this interest is that empirical models may be of limited usefulness if environmental conditions change. Intuitively, we expect that mechanistic models, grounded as far as possible in an understanding of the biology of tree...

  19. Modeling for operational event risk assessment

    International Nuclear Information System (INIS)

    Sattison, M.B.

    1997-01-01

    The U.S. Nuclear Regulatory Commission has been using risk models to evaluate the risk significance of operational events in U.S. commercial nuclear power plants for more than seventeen years. During that time, the models have evolved in response to advances in risk assessment technology and insights gained with experience. Evaluation techniques fall into two categories: initiating event assessments and condition assessments. The models used for these analyses have become uniquely specialized for just this purpose

  20. Sensitivity Assessment of Ozone Models

    Energy Technology Data Exchange (ETDEWEB)

    Shorter, Jeffrey A.; Rabitz, Herschel A.; Armstrong, Russell A.

    2000-01-24

    The activities under this contract effort were aimed at developing sensitivity analysis techniques and fully equivalent operational models (FEOMs) for applications in the DOE Atmospheric Chemistry Program (ACP). MRC developed a new model representation algorithm that uses a hierarchical, correlated function expansion containing a finite number of terms. A full expansion of this type is an exact representation of the original model, and each of the expansion functions is explicitly calculated using the original model. Once the expansion functions have been calculated, they are assembled into a fully equivalent operational model (FEOM) that can directly replace the original model.
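
    Hierarchical, correlated function expansions of the kind described above are usually written f(x) = f0 + sum_i f_i(x_i) + sum_{i<j} f_ij(x_i, x_j) + ..., with every component function evaluated directly from the original model. As an illustration only (not MRC's implementation), the sketch below builds the zeroth- and first-order cut-type component functions of a toy model around a reference point and uses them as a cheap surrogate.

      import numpy as np

      def model(x):
          """Toy 'original model' standing in for an expensive ozone-chemistry code."""
          x1, x2, x3 = x
          return np.sin(x1) + 0.5 * x2**2 + 0.1 * x1 * x3

      x_ref = np.array([0.5, 1.0, 2.0])   # reference (cut) point, an assumption
      f0 = model(x_ref)

      def f_i(i, xi):
          """First-order component: vary one input, hold the rest at the cut point."""
          x = x_ref.copy()
          x[i] = xi
          return model(x) - f0

      def surrogate(x):
          """First-order FEOM-style approximation of the original model."""
          return f0 + sum(f_i(i, x[i]) for i in range(len(x)))

      x_test = np.array([0.7, 1.2, 1.5])
      print("original :", model(x_test))
      print("surrogate:", surrogate(x_test))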

  1. Assessment of Molecular Modeling & Simulation

    Energy Technology Data Exchange (ETDEWEB)

    None

    2002-01-03

    This report reviews the development and applications of molecular and materials modeling in Europe and Japan in comparison to those in the United States. Topics covered include computational quantum chemistry, molecular simulations by molecular dynamics and Monte Carlo methods, mesoscale modeling of material domains, molecular-structure/macroscale property correlations like QSARs and QSPRs, and related information technologies like informatics and special-purpose molecular-modeling computers. The panel's findings include the following: The United States leads this field in many scientific areas. However, Canada has particular strengths in DFT methods and homogeneous catalysis; Europe in heterogeneous catalysis, mesoscale, and materials modeling; and Japan in materials modeling and special-purpose computing. Major government-industry initiatives are underway in Europe and Japan, notably in multi-scale materials modeling and in development of chemistry-capable ab-initio molecular dynamics codes.

  2. The Model for Assessment of Telemedicine (MAST)

    DEFF Research Database (Denmark)

    Kidholm, Kristian; Clemensen, Jane; Caffery, Liam J

    2017-01-01

    The evaluation of telemedicine can be achieved using different evaluation models or theoretical frameworks. This paper presents a scoping review of published studies which have applied the Model for Assessment of Telemedicine (MAST). MAST includes pre-implementation assessment (e.g. by use...

  3. Assessment of the Rescorla-Wagner model.

    Science.gov (United States)

    Miller, R R; Barnet, R C; Grahame, N J

    1995-05-01

    The Rescorla-Wagner model has been the most influential theory of associative learning to emerge from the study of animal behavior over the last 25 years. Recently, equivalence to this model has become a benchmark in assessing connectionist models, with such equivalence often achieved by incorporating the Widrow-Hoff delta rule. This article presents the Rescorla-Wagner model's basic assumptions, reviews some of the model's predictive successes and failures, relates the failures to the model's assumptions, and discusses the model's heuristic value. It is concluded that the model has had a positive influence on the study of simple associative learning by stimulating research and contributing to new model development. However, this benefit should neither lead to the model being regarded as inherently "correct" nor imply that its predictions can be profitably used to assess other models.
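
    For readers unfamiliar with the model discussed above, its core learning rule is the error-correcting update dV_A = alpha_A * beta * (lambda - sum(V)): the associative strength of cue A changes in proportion to the difference between the obtained outcome and the summed prediction of all cues present, which is the sense in which it matches the Widrow-Hoff delta rule. The short sketch below reproduces the classic blocking prediction; learning rates and trial numbers are arbitrary choices for illustration.

      alpha, beta, lam = 0.3, 1.0, 1.0   # cue salience, learning rate, outcome magnitude (arbitrary)
      V = {"A": 0.0, "B": 0.0}

      def trial(cues, outcome):
          """Rescorla-Wagner update: dV = alpha*beta*(lambda - sum of V for cues present)."""
          error = outcome - sum(V[c] for c in cues)
          for c in cues:
              V[c] += alpha * beta * error

      # Phase 1: cue A alone is paired with the outcome
      for _ in range(50):
          trial(["A"], lam)
      # Phase 2: the compound AB is paired with the same outcome
      for _ in range(50):
          trial(["A", "B"], lam)

      print(f"V(A) = {V['A']:.2f}, V(B) = {V['B']:.2f}  (B acquires little strength: blocking)")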

  4. Understanding National Models for Climate Assessments

    Science.gov (United States)

    Dave, A.; Weingartner, K.

    2017-12-01

    National-level climate assessments have been produced or are underway in a number of countries. These efforts showcase a variety of approaches to mapping climate impacts onto human and natural systems, and involve a variety of development processes, organizational structures, and intended purposes. This presentation will provide a comparative overview of national `models' for climate assessments worldwide, drawing from a geographically diverse group of nations with varying capacities to conduct such assessments. Using an illustrative sampling of assessment models, the presentation will highlight the range of assessment mandates and requirements that drive this work, methodologies employed, focal areas, and the degree to which international dimensions are included for each nation's assessment. This not only allows the U.S. National Climate Assessment to be better understood within an international context, but provides the user with an entry point into other national climate assessments around the world, enabling a better understanding of the risks and vulnerabilities societies face.

  5. Ecological models and pesticide risk assessment: current modeling practice.

    Science.gov (United States)

    Schmolke, Amelie; Thorbek, Pernille; Chapman, Peter; Grimm, Volker

    2010-04-01

    Ecological risk assessments of pesticides usually focus on risk at the level of individuals, and are carried out by comparing exposure and toxicological endpoints. However, in most cases the protection goal is populations rather than individuals. On the population level, effects of pesticides depend not only on exposure and toxicity, but also on factors such as life history characteristics, population structure, timing of application, presence of refuges in time and space, and landscape structure. Ecological models can integrate such factors and have the potential to become important tools for the prediction of population-level effects of exposure to pesticides, thus allowing extrapolations, for example, from laboratory to field. Indeed, a broad range of ecological models have been applied to chemical risk assessment in the scientific literature, but so far such models have only rarely been used to support regulatory risk assessments of pesticides. To better understand the reasons for this situation, the current modeling practice in this field was assessed in the present study. The scientific literature was searched for relevant models and assessed according to nine characteristics: model type, model complexity, toxicity measure, exposure pattern, other factors, taxonomic group, risk assessment endpoint, parameterization, and model evaluation. The present study found that, although most models were of a high scientific standard, many of them would need modification before they are suitable for regulatory risk assessments. The main shortcomings of currently available models in the context of regulatory pesticide risk assessments were identified. When ecological models are applied to regulatory risk assessments, we recommend reviewing these models according to the nine characteristics evaluated here. (c) 2010 SETAC.

  6. Ecosystem Model Skill Assessment. Yes We Can!

    Science.gov (United States)

    Olsen, Erik; Fay, Gavin; Gaichas, Sarah; Gamble, Robert; Lucey, Sean; Link, Jason S

    2016-01-01

    Accelerated changes to global ecosystems call for holistic and integrated analyses of past, present and future states under various pressures to adequately understand current and projected future system states. Ecosystem models can inform management of human activities in a complex and changing environment, but are these models reliable? Ensuring that models are reliable for addressing management questions requires evaluating their skill in representing real-world processes and dynamics. Skill has been evaluated for only a limited set of biophysical models. A range of skill assessment methods have been reviewed, but skill assessment of full marine ecosystem models has not yet been attempted. We assessed the skill of the Northeast U.S. (NEUS) Atlantis marine ecosystem model by comparing 10-year model forecasts with observed data. Model forecast performance was compared to that obtained from a 40-year hindcast. Multiple metrics (average absolute error, root mean squared error, modeling efficiency, and Spearman rank correlation) and a suite of time series (species biomass, fisheries landings, and ecosystem indicators) were used to adequately measure model skill. Overall, the NEUS model performed above average and thus better than expected for the key species that had been the focus of the model tuning. Model forecast skill was comparable to the hindcast skill, showing that model performance does not degenerate in a 10-year forecast mode, an important characteristic for an end-to-end ecosystem model to be useful for strategic management purposes. We identify best-practice approaches for end-to-end ecosystem model skill assessment that would improve both operational use of other ecosystem models and future model development. We show that it is not only possible to assess the skill of a complicated marine ecosystem model, but that it is necessary to do so to instill confidence in model results and encourage their use for strategic management. Our methods are applicable
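
    The skill metrics named in the abstract above are all simple functions of paired observed and predicted time series. A minimal sketch of such a comparison is given below; the data are made up, and "modeling efficiency" is taken here as the Nash-Sutcliffe form commonly used in model skill assessment, which may differ in detail from the paper's definition.

      import numpy as np
      from scipy.stats import spearmanr

      def skill_metrics(obs, pred):
          obs, pred = np.asarray(obs, float), np.asarray(pred, float)
          aae  = np.mean(np.abs(pred - obs))                  # average absolute error
          rmse = np.sqrt(np.mean((pred - obs) ** 2))          # root mean squared error
          mef  = 1.0 - np.sum((pred - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)  # modeling efficiency
          rho, _ = spearmanr(obs, pred)                       # Spearman rank correlation
          return {"AAE": aae, "RMSE": rmse, "MEF": mef, "rho": rho}

      # Made-up 10-year biomass series (kt) for one species
      observed  = [120, 115, 130, 140, 138, 150, 149, 160, 158, 170]
      predicted = [118, 120, 128, 135, 142, 148, 155, 157, 162, 168]
      print(skill_metrics(observed, predicted))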

  7. Model of MSD Risk Assessment at Workplace

    OpenAIRE

    K. Sekulová; M. Šimon

    2015-01-01

    This article focuses on a model for assessing the risk of upper-extremity musculoskeletal disorders (MSD) in the workplace. The model uses risk factors that are responsible for damage to the musculoskeletal system. Based on statistical calculations, the model can determine the MSD risk faced by workers exposed to these risk factors. The model can also estimate how the MSD risk would decrease if these risk factors were eliminated.

  8. Attention modeling for video quality assessment

    DEFF Research Database (Denmark)

    You, Junyong; Korhonen, Jari; Perkis, Andrew

    2010-01-01

    averaged spatiotemporal pooling. The local quality is derived from visual attention modeling and quality variations over frames. Saliency, motion, and contrast information are taken into account in modeling visual attention, which is then integrated into IQMs to calculate the local quality of a video frame ... average between the global quality and the local quality. Experimental results demonstrate that the combination of the global quality and local quality outperforms both sole global quality and local quality, as well as other quality models, in video quality assessment. In addition, the proposed video ... quality modeling algorithm can improve the performance of image quality metrics on video quality assessment compared to the normal averaged spatiotemporal pooling scheme...

  9. evaluation of models for assessing groundwater vulnerability

    African Journals Online (AJOL)

    DR. AMINU

    applied models for groundwater vulnerability assessment mapping. The approaches .... The overall 'pollution potential' or DRASTIC index is established by applying the formula: DRASTIC Index: ... affected by the structure of the soil surface.
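
    The DRASTIC formula truncated in the record above is, in its standard formulation, a weighted sum of ratings for seven hydrogeological factors (Depth to water, net Recharge, Aquifer media, Soil media, Topography, Impact of the vadose zone, hydraulic Conductivity). The sketch below uses the weights conventionally attributed to Aller et al. (1987); the ratings are invented example values for a single map cell.

      # Conventional DRASTIC weights; ratings (1-10) below are hypothetical examples
      weights = {"D": 5, "R": 4, "A": 3, "S": 2, "T": 1, "I": 5, "C": 3}
      ratings = {"D": 7, "R": 6, "A": 8, "S": 5, "T": 9, "I": 6, "C": 4}

      drastic_index = sum(weights[k] * ratings[k] for k in weights)
      print("DRASTIC index:", drastic_index)   # a higher index means higher vulnerability to pollution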

  10. A Model for Situation and Threat Assessment

    Science.gov (United States)

    2006-12-01

    A model is presented for situation and threat assessment (Alan Steinberg, Calspan-UB Research Center, CUBRC, Inc., steinberg@cubrc.org, November 2005). Objectives: advance the state-of

  11. Modeling of Communication in a Computational Situation Assessment Model

    International Nuclear Information System (INIS)

    Lee, Hyun Chul; Seong, Poong Hyun

    2009-01-01

    Operators in nuclear power plants have to acquire information from human-system interfaces (HSIs) and the environment in order to create, update, and confirm their understanding of the plant state, or situation awareness, because failures of situation assessment may result in wrong decisions for process control and, ultimately, errors of commission in nuclear power plants. Quantitative or prescriptive models that predict an operator's situation assessment in a given situation, i.e. the results of situation assessment, provide many benefits such as HSI design solutions, human performance data, and human reliability estimates. Unfortunately, only a few computational situation assessment models for NPP operators have been proposed, and those insufficiently embed human cognitive characteristics. We therefore proposed a new computational situation assessment model of nuclear power plant operators. The proposed model, which incorporates significant cognitive factors, uses a Bayesian belief network (BBN) as its model architecture. It is believed that communication between nuclear power plant operators affects the operators' situation assessment and its result, situation awareness. We tried to verify that the proposed model represents the effects of communication on situation assessment. As a result, the proposed model succeeded in representing the operators' behavior, and this paper shows the details
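
    The record above does not publish its network structure, but the role of communicated evidence in a BBN-based situation assessment can be illustrated with a toy two-indication example: the assessed probability of a plant state shifts once an operator also incorporates information communicated by a colleague. The states, indications and probability tables below are hypothetical.

      # Hypothetical plant states and conditional probabilities of two indications
      p_state        = {"normal": 0.9, "loca": 0.1}            # prior over plant state
      p_alarm_given  = {"normal": 0.05, "loca": 0.95}          # P(pressure alarm | state)
      p_level_given  = {"normal": 0.10, "loca": 0.80}          # P(low level reading | state)

      def posterior(alarm, low_level):
          """P(state | evidence) by enumeration, assuming conditionally independent indications."""
          unnorm = {}
          for s, prior in p_state.items():
              like = (p_alarm_given[s] if alarm else 1 - p_alarm_given[s]) * \
                     (p_level_given[s] if low_level else 1 - p_level_given[s])
              unnorm[s] = prior * like
          z = sum(unnorm.values())
          return {s: round(v / z, 3) for s, v in unnorm.items()}

      # The second indication stands in for information communicated by a co-operator
      print("alarm only              :", posterior(alarm=True, low_level=False))
      print("alarm + communicated info:", posterior(alarm=True, low_level=True))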

  12. STAMINA - Model description. Standard Model Instrumentation for Noise Assessments

    NARCIS (Netherlands)

    Schreurs EM; Jabben J; Verheijen ENG; CMM; mev

    2010-01-01

    This report describes the STAMINA model, which stands for Standard Model Instrumentation for Noise Assessments and was developed by RIVM. The institute uses this standard model to map environmental noise in the Netherlands. The model is based on the Standaard Karteringsmethode (Standard Mapping Method)

  13. Models and parameters for environmental radiological assessments

    International Nuclear Information System (INIS)

    Miller, C.W.

    1984-01-01

    This book presents a unified compilation of models and parameters appropriate for assessing the impact of radioactive discharges to the environment. Models examined include those developed for the prediction of atmospheric and hydrologic transport and deposition, for terrestrial and aquatic food-chain bioaccumulation, and for internal and external dosimetry. Chapters have been entered separately into the data base

  14. Models and parameters for environmental radiological assessments

    Energy Technology Data Exchange (ETDEWEB)

    Miller, C W [ed.

    1984-01-01

    This book presents a unified compilation of models and parameters appropriate for assessing the impact of radioactive discharges to the environment. Models examined include those developed for the prediction of atmospheric and hydrologic transport and deposition, for terrestrial and aquatic food-chain bioaccumulation, and for internal and external dosimetry. Chapters have been entered separately into the data base. (ACR)

  15. A Simple Model of Self-Assessments

    NARCIS (Netherlands)

    S. Dominguez Martinez (Silvia); O.H. Swank (Otto)

    2006-01-01

    We develop a simple model that describes individuals' self-assessments of their abilities. We assume that individuals learn about their abilities from appraisals of others and experience. Our model predicts that if communication is imperfect, then (i) appraisals of others tend to be too

  16. A simple model of self-assessment

    NARCIS (Netherlands)

    Dominguez-Martinez, S.; Swank, O.H.

    2009-01-01

    We develop a simple model that describes individuals' self-assessments of their abilities. We assume that individuals learn about their abilities from appraisals of others and experience. Our model predicts that if communication is imperfect, then (i) appraisals of others tend to be too positive and

  17. Growth of the coccolithophore Emiliania huxleyi in light- and nutrient-limited batch reactors: relevance for the BIOSOPE deep ecological niche of coccolithophores

    Science.gov (United States)

    Perrin, Laura; Probert, Ian; Langer, Gerald; Aloisi, Giovanni

    2016-11-01

    Coccolithophores are unicellular calcifying marine algae that play an important role in the oceanic carbon cycle via their cellular processes of photosynthesis (a CO2 sink) and calcification (a CO2 source). In contrast to the well-studied, surface-water coccolithophore blooms visible from satellites, the lower photic zone is a poorly known but potentially important ecological niche for coccolithophores in terms of primary production and carbon export to the deep ocean. In this study, the physiological responses of an Emiliania huxleyi strain to conditions simulating the deep niche in the oligotrophic gyres along the BIOSOPE transect in the South Pacific Gyre were investigated. We carried out batch culture experiments with an E. huxleyi strain isolated from the BIOSOPE transect, reproducing the in situ conditions of light and nutrient (nitrate and phosphate) limitation. By simulating coccolithophore growth using an internal stores (Droop) model, we were able to constrain fundamental physiological parameters for this E. huxleyi strain. We show that simple batch experiments, in conjunction with physiological modelling, can provide reliable estimates of fundamental physiological parameters for E. huxleyi that are usually obtained experimentally in more time-consuming and costly chemostat experiments. The combination of culture experiments, physiological modelling and in situ data from the BIOSOPE cruise show that E. huxleyi growth in the deep BIOSOPE niche is limited by availability of light and nitrate. This study contributes more widely to the understanding of E. huxleyi physiology and behaviour in a low-light and oligotrophic environment of the ocean.
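
    The internal-stores (Droop) formulation referred to above relates growth rate to the cell quota Q of the limiting nutrient rather than to its external concentration, mu = mu_max_inf * (1 - Q_min/Q), usually combined with Michaelis-Menten uptake. The sketch below integrates a minimal nitrate-limited batch version of such a model; all parameter values and the unit conversion are illustrative assumptions, not the values fitted for the BIOSOPE strain.

      import numpy as np

      # Illustrative parameter values for a Droop-type nitrate-limited batch culture
      mu_max_inf = 1.0     # 1/day, growth rate at infinite internal quota
      q_min      = 0.05    # pmol N per cell, subsistence quota
      v_max      = 0.1     # pmol N per cell per day, maximum uptake rate
      k_s        = 0.5     # umol/L, half-saturation constant for nitrate uptake

      def simulate(days=10.0, dt=0.01, N=5.0, X=1e3, Q=0.1):
          """N: nitrate (umol/L), X: cells per mL, Q: internal quota (pmol N per cell)."""
          for _ in range(int(days / dt)):
              uptake = v_max * N / (k_s + N)             # Michaelis-Menten uptake per cell
              mu = mu_max_inf * (1.0 - q_min / Q)        # Droop growth rate from the quota
              N = max(N - uptake * X * 1e-3 * dt, 0.0)   # pmol/cell x cells/mL -> ~umol/L drawdown
              Q += (uptake - mu * Q) * dt                # quota is diluted by growth
              X += mu * X * dt
          return N, X, Q

      N, X, Q = simulate()
      print(f"nitrate left: {N:.2f} umol/L, cells: {X:.2e} /mL, quota: {Q:.3f} pmol N/cell")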

  18. Predictions of models for environmental radiological assessment

    International Nuclear Information System (INIS)

    Peres, Sueli da Silva; Lauria, Dejanira da Costa; Mahler, Claudio Fernando

    2011-01-01

    In the field of environmental impact assessment, models are used for estimating the source term, environmental dispersion and transfer of radionuclides, exposure pathways, radiation dose and the risk for human beings. Although it is recognized that specific local data are important to improve the quality of the dose assessment results, obtaining them can in fact be very difficult and expensive. Sources of uncertainty are numerous, among which we can cite the subjectivity of modelers, exposure scenarios and pathways, the codes used and general parameters. The various models available utilize different mathematical approaches with different complexities that can result in different predictions. Thus, for the same inputs, different models can produce very different outputs. This paper briefly presents the main advances in the field of environmental radiological assessment that aim to improve the reliability of the models used in the assessment of environmental radiological impact. A model intercomparison exercise supplied incompatible results for 137Cs and 60Co, reinforcing the need to develop reference methodologies for environmental radiological assessment that allow dose estimates to be compared on a common basis. The results of the intercomparison exercise are presented briefly. (author)

  19. Underwater noise modelling for environmental impact assessment

    Energy Technology Data Exchange (ETDEWEB)

    Farcas, Adrian [Centre for Environment, Fisheries and Aquaculture Science (Cefas), Pakefield Road, Lowestoft, NR33 0HT (United Kingdom); Thompson, Paul M. [Lighthouse Field Station, Institute of Biological and Environmental Sciences, University of Aberdeen, Cromarty IV11 8YL (United Kingdom); Merchant, Nathan D., E-mail: nathan.merchant@cefas.co.uk [Centre for Environment, Fisheries and Aquaculture Science (Cefas), Pakefield Road, Lowestoft, NR33 0HT (United Kingdom)

    2016-02-15

    Assessment of underwater noise is increasingly required by regulators of development projects in marine and freshwater habitats, and noise pollution can be a constraining factor in the consenting process. Noise levels arising from the proposed activity are modelled and the potential impact on species of interest within the affected area is then evaluated. Although there is considerable uncertainty in the relationship between noise levels and impacts on aquatic species, the science underlying noise modelling is well understood. Nevertheless, many environmental impact assessments (EIAs) do not reflect best practice, and stakeholders and decision makers in the EIA process are often unfamiliar with the concepts and terminology that are integral to interpreting noise exposure predictions. In this paper, we review the process of underwater noise modelling and explore the factors affecting predictions of noise exposure. Finally, we illustrate the consequences of errors and uncertainties in noise modelling, and discuss future research needs to reduce uncertainty in noise assessments.

  20. Underwater noise modelling for environmental impact assessment

    International Nuclear Information System (INIS)

    Farcas, Adrian; Thompson, Paul M.; Merchant, Nathan D.

    2016-01-01

    Assessment of underwater noise is increasingly required by regulators of development projects in marine and freshwater habitats, and noise pollution can be a constraining factor in the consenting process. Noise levels arising from the proposed activity are modelled and the potential impact on species of interest within the affected area is then evaluated. Although there is considerable uncertainty in the relationship between noise levels and impacts on aquatic species, the science underlying noise modelling is well understood. Nevertheless, many environmental impact assessments (EIAs) do not reflect best practice, and stakeholders and decision makers in the EIA process are often unfamiliar with the concepts and terminology that are integral to interpreting noise exposure predictions. In this paper, we review the process of underwater noise modelling and explore the factors affecting predictions of noise exposure. Finally, we illustrate the consequences of errors and uncertainties in noise modelling, and discuss future research needs to reduce uncertainty in noise assessments.
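
    Although the two records above do not give the propagation equations, a common first-order approach in such noise assessments is to estimate the received level as the source level minus a transmission loss of the form TL = N*log10(r) + alpha*r and to compare it with an impact threshold. The sketch below uses that simplified formulation with invented numbers; real assessments use range- and frequency-dependent propagation models.

      import numpy as np

      source_level = 220.0   # dB re 1 uPa @ 1 m, e.g. an impulsive pile-driving source (assumed)
      geom_factor  = 15.0    # between spherical (20) and cylindrical (10) spreading, assumed
      absorption   = 0.05    # dB/km, frequency-dependent in reality
      threshold    = 160.0   # dB re 1 uPa, hypothetical behavioural disturbance threshold

      def received_level(r_m):
          """Simplified received level at range r (metres)."""
          tl = geom_factor * np.log10(r_m) + absorption * (r_m / 1000.0)
          return source_level - tl

      for r in (100.0, 1_000.0, 10_000.0, 50_000.0):
          rl = received_level(r)
          flag = "above" if rl > threshold else "below"
          print(f"{r:>8.0f} m : RL = {rl:5.1f} dB ({flag} threshold)")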

  1. Assessing Ecosystem Model Performance in Semiarid Systems

    Science.gov (United States)

    Thomas, A.; Dietze, M.; Scott, R. L.; Biederman, J. A.

    2017-12-01

    In ecosystem process modelling, comparing outputs to benchmark datasets observed in the field is an important way to validate models, allowing the modelling community to track model performance over time and compare models at specific sites. Multi-model comparison projects as well as models themselves have largely been focused on temperate forests and similar biomes. Semiarid regions, on the other hand, are underrepresented in land surface and ecosystem modelling efforts, and yet will be disproportionately impacted by disturbances such as climate change due to their sensitivity to changes in the water balance. Benchmarking models at semiarid sites is an important step in assessing and improving models' suitability for predicting the impact of disturbance on semiarid ecosystems. In this study, several ecosystem models were compared at a semiarid grassland in southwestern Arizona using PEcAn, or the Predictive Ecosystem Analyzer, an open-source eco-informatics toolbox ideal for creating the repeatable model workflows necessary for benchmarking. Models included SIPNET, DALEC, JULES, ED2, GDAY, LPJ-GUESS, MAESPA, CLM, CABLE, and FATES. Comparison between model output and benchmarks such as net ecosystem exchange (NEE) tended to produce high root mean square error and low correlation coefficients, reflecting poor simulation of seasonality and the tendency for models to create much higher carbon sources than observed. These results indicate that ecosystem models do not currently adequately represent semiarid ecosystem processes.
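
    The benchmarking described above reduces to comparing paired series of modelled and observed fluxes. As a minimal illustration (not the PEcAn implementation, and with purely synthetic numbers), the error statistics mentioned in the abstract can be computed as follows:

```python
import numpy as np

def benchmark_stats(modelled, observed):
    """Bias, root mean square error and Pearson correlation of paired series."""
    modelled = np.asarray(modelled, dtype=float)
    observed = np.asarray(observed, dtype=float)
    bias = np.mean(modelled - observed)
    rmse = np.sqrt(np.mean((modelled - observed) ** 2))
    r = np.corrcoef(modelled, observed)[0, 1]
    return bias, rmse, r

# Synthetic daily NEE values (gC m-2 d-1); positive = carbon source to the atmosphere.
observed_nee = np.array([0.2, -0.1, 0.0, 0.3, 0.1, -0.2, 0.0])
modelled_nee = np.array([0.8, 0.4, 0.7, 0.9, 0.6, 0.5, 0.5])  # model too strong a source
bias, rmse, r = benchmark_stats(modelled_nee, observed_nee)
print(f"bias = {bias:.2f}, RMSE = {rmse:.2f}, r = {r:.2f}")
```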

  2. Modeling inputs to computer models used in risk assessment

    International Nuclear Information System (INIS)

    Iman, R.L.

    1987-01-01

    Computer models for various risk assessment applications are closely scrutinized both from the standpoint of questioning the correctness of the underlying mathematical model with respect to the process it is attempting to model and from the standpoint of verifying that the computer model correctly implements the underlying mathematical model. A process that receives less scrutiny, but is nonetheless of equal importance, concerns the individual and joint modeling of the inputs. This modeling effort clearly has a great impact on the credibility of results. Model characteristics are reviewed in this paper that have a direct bearing on the model input process and reasons are given for using probabilities-based modeling with the inputs. The authors also present ways to model distributions for individual inputs and multivariate input structures when dependence and other constraints may be present
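
    As a hedged illustration of the kind of joint input modelling the abstract argues for, the sketch below draws two dependent inputs with specified marginal distributions through a Gaussian copula. The variable names (release_rate, dilution_factor), the distributions and the correlation value are assumptions for demonstration, not quantities from the paper:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Target correlation between two uncertain inputs (illustrative value)
rho = 0.7
cov = [[1.0, rho], [rho, 1.0]]

# Gaussian copula: correlated standard normals -> uniforms -> chosen marginals
z = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=10_000)
u = stats.norm.cdf(z)

release_rate = stats.lognorm(s=0.5, scale=1.0).ppf(u[:, 0])          # hypothetical lognormal input
dilution_factor = stats.uniform(loc=10.0, scale=90.0).ppf(u[:, 1])   # hypothetical uniform(10, 100) input

# The induced rank correlation is close to the target
print(round(stats.spearmanr(release_rate, dilution_factor)[0], 2))
```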

  3. The metabolic response of P. putida KT2442 producing high levels of polyhydroxyalkanoate under single- and multiple-nutrient-limited growth: Highlights from a multi-level omics approach

    Directory of Open Access Journals (Sweden)

    Poblete-Castro Ignacio

    2012-03-01

    Background: Pseudomonas putida KT2442 is a natural producer of polyhydroxyalkanoates (PHAs), which can substitute petroleum-based, non-renewable plastics and form the basis for the production of tailor-made biopolymers. However, despite the substantial body of work on PHA production by P. putida strains, it is not yet clear how the bacterium re-arranges its whole metabolism when it senses the limitation of nitrogen and an excess of fatty acids as carbon source, resulting in a large accumulation of PHAs within the cell. In the present study we investigated the metabolic response of KT2442 using a systems biology approach to highlight the differences between single- and multiple-nutrient-limited growth in chemostat cultures. Results: We found that 26, 62, and 81% of the cell dry weight consisted of PHA under conditions of carbon, dual, and nitrogen limitation, respectively. Under nitrogen limitation a specific PHA production rate of 0.43 g·(g·h)-1 was obtained. The residual biomass was not constant for dual- and strictly nitrogen-limited growth, a feature that differs from other P. putida strains. Dual limitation resulted in patterns of gene expression, protein levels, and metabolite concentrations that substantially differ from those observed under exclusive carbon or nitrogen limitation. The most pronounced differences were found in energy metabolism, fatty acid metabolism, as well as stress proteins and enzymes belonging to the transport system. Conclusion: This is the first study in which the interrelationship between nutrient limitations and PHA synthesis has been investigated under well-controlled conditions using a system-level approach. The knowledge generated will be of great assistance for the development of bioprocesses and further metabolic engineering work in this versatile organism to both enhance and diversify the industrial production of PHAs.

  4. Bioavailability in the boris assessment model

    International Nuclear Information System (INIS)

    Norden, M.; Avila, R.; Gonze, M.A.; Tamponnet, C.

    2004-01-01

    The fifth framework EU project BORIS (Bioavailability Of Radionuclides In Soils: role of biological components and resulting improvement of prediction models) has three scientific objectives. The first is to improve understanding of the mechanisms governing the transfer of radionuclides to plants. The second is to improve existing predictive models of radionuclide interaction with soils by incorporating the knowledge acquired from the experimental results. The third and last objective is to extract from the experimental results a scientific basis for the development of bioremediation methods for radionuclide-contaminated soils and to understand the role of additional non-radioactive pollutants in radionuclide bioavailability. This paper is focused on the second objective. The purpose of the BORIS assessment model is to describe the behaviour of radionuclides in the soil-plant system with the aim of predicting the time dynamics of the bioavailability of radionuclides in soil and the radionuclide concentrations in plants. To be useful, the assessment model should be rather simple and use only a few parameters that are commonly available or possible to measure for different sites. The model shall take into account, as much as possible, the results of the experimental studies and the mechanistic models developed in the BORIS project. One possible approach is to introduce in the assessment model a quantitative relationship between the bioavailability of radionuclides in soil and the soil properties. To do this, an operational definition of bioavailability is needed. Here, operational means experimentally measurable, directly or indirectly, and that the bioavailability can be translated into a mathematical expression. This paper describes the reasoning behind the chosen definition of bioavailability for the assessment model, how to derive operational expressions for the bioavailability, and how to use them in the assessment model. (author)

  5. Assessing alternative conceptual models of fracture flow

    International Nuclear Information System (INIS)

    Ho, C.K.

    1995-01-01

    The numerical code TOUGH2 was used to assess alternative conceptual models of fracture flow. The models that were considered included the equivalent continuum model (ECM) and the dual permeability (DK) model. A one-dimensional, layered, unsaturated domain was studied with a saturated bottom boundary and a constant infiltration at the top boundary. Two different infiltration rates were used in the studies. In addition, the connection areas between the fracture and matrix elements in the dual permeability model were varied. Results showed that the two conceptual models of fracture flow produced different saturation and velocity profiles-even under steady-state conditions. The magnitudes of the discrepancies were sensitive to two parameters that affected the flux between the fractures and matrix in the dual permeability model: (1) the fracture-matrix connection areas and (2) the capillary pressure gradients between the fracture and matrix elements

  6. Personalized pseudophakic model for refractive assessment.

    Directory of Open Access Journals (Sweden)

    Filomena J Ribeiro

    PURPOSE: To test a pseudophakic eye model that allows for intraocular lens power (IOL) calculation, both in normal eyes and in extreme conditions, such as post-LASIK. METHODS: PARTICIPANTS: The model's efficacy was tested in 54 participants (104 eyes) who underwent LASIK and were assessed before and after surgery, thus allowing the same method to be tested in the same eye after only changing corneal topography. MODELLING: The Liou-Brennan eye model was used as a starting point, and biometric values were replaced by individual measurements. Detailed corneal surface data were obtained from topography (Orbscan®) and a grid of elevation values was used to define corneal surfaces in an optical ray-tracing software (Zemax®). To determine IOL power, optimization criteria based on values of the modulation transfer function (MTF), weighted according to the contrast sensitivity function (CSF), were applied. RESULTS: Pre-operative refractive assessment calculated by our eye model correlated very strongly with SRK/T (r = 0.959, p < 0.05). Comparison of post-operative refractive assessment obtained using our eye model with the average of currently used formulas showed a strong correlation (r = 0.778, p < 0.05). CONCLUSIONS: Results suggest that personalized pseudophakic eye models and ray-tracing allow for the use of the same methodology, regardless of previous LASIK, independent of population averages and commonly used regression correction factors, which represents a clinical advantage.

  7. Personalized pseudophakic model for refractive assessment.

    Science.gov (United States)

    Ribeiro, Filomena J; Castanheira-Dinis, António; Dias, João M

    2012-01-01

    To test a pseudophakic eye model that allows for intraocular lens power (IOL) calculation, both in normal eyes and in extreme conditions, such as post-LASIK. The model's efficacy was tested in 54 participants (104 eyes) who underwent LASIK and were assessed before and after surgery, thus allowing the same method to be tested in the same eye after only changing corneal topography. MODELLING: The Liou-Brennan eye model was used as a starting point, and biometric values were replaced by individual measurements. Detailed corneal surface data were obtained from topography (Orbscan®) and a grid of elevation values was used to define corneal surfaces in an optical ray-tracing software (Zemax®). To determine IOL power, optimization criteria based on values of the modulation transfer function (MTF), weighted according to the contrast sensitivity function (CSF), were applied. Pre-operative refractive assessment calculated by our eye model correlated very strongly with SRK/T (r = 0.959, p < 0.05). Comparison of post-operative refractive assessment obtained using our eye model with the average of currently used formulas showed a strong correlation (r = 0.778, p < 0.05). Results suggest that personalized pseudophakic eye models and ray-tracing allow for the use of the same methodology, regardless of previous LASIK, independent of population averages and commonly used regression correction factors, which represents a clinical advantage.

  8. Assessment of Venous Thrombosis in Animal Models.

    Science.gov (United States)

    Grover, Steven P; Evans, Colin E; Patel, Ashish S; Modarai, Bijan; Saha, Prakash; Smith, Alberto

    2016-02-01

    Deep vein thrombosis and common complications, including pulmonary embolism and post-thrombotic syndrome, represent a major source of morbidity and mortality worldwide. Experimental models of venous thrombosis have provided considerable insight into the cellular and molecular mechanisms that regulate thrombus formation and subsequent resolution. Here, we critically appraise the ex vivo and in vivo techniques used to assess venous thrombosis in these models. Particular attention is paid to imaging modalities, including magnetic resonance imaging, micro-computed tomography, and high-frequency ultrasound that facilitate longitudinal assessment of thrombus size and composition. © 2015 American Heart Association, Inc.

  9. A model for assessment of telemedicine applications

    DEFF Research Database (Denmark)

    Kidholm, Kristian; Ekeland, Anne Granstrøm; Jensen, Lise Kvistgaard

    2012-01-01

    Telemedicine applications could potentially solve many of the challenges faced by the healthcare sectors in Europe. However, a framework for assessment of these technologies is needed by decision makers to assist them in choosing the most efficient and cost-effective technologies. Therefore, in 2009 the European Commission initiated the development of a framework for assessing telemedicine applications, based on the users' need for information for decision making. This article presents the Model for ASsessment of Telemedicine applications (MAST) developed in this study.

  10. Models and parameters for environmental radiological assessments

    International Nuclear Information System (INIS)

    Miller, C.W.

    1983-01-01

    This article reviews the forthcoming book Models and Parameters for Environmental Radiological Assessments, which presents a unified compilation of models and parameters for assessing the impact on man of radioactive discharges, both routine and accidental, into the environment. Models presented in this book include those developed for the prediction of atmospheric and hydrologic transport and deposition, for terrestrial and aquatic food-chain bioaccumulation, and for internal and external dosimetry. Summaries are presented for each of the transport and dosimetry areas previously mentioned, and details are available in the literature cited. A chapter of example problems illustrates many of the methodologies presented throughout the text. Models and parameters presented are based on the results of extensive literature reviews and evaluations performed primarily by the staff of the Health and Safety Research Division of Oak Ridge National Laboratory

  11. Conceptual models for cumulative risk assessment.

    Science.gov (United States)

    Linder, Stephen H; Sexton, Ken

    2011-12-01

    In the absence of scientific consensus on an appropriate theoretical framework, cumulative risk assessment and related research have relied on speculative conceptual models. We argue for the importance of theoretical backing for such models and discuss 3 relevant theoretical frameworks, each supporting a distinctive "family" of models. Social determinant models postulate that unequal health outcomes are caused by structural inequalities; health disparity models envision social and contextual factors acting through individual behaviors and biological mechanisms; and multiple stressor models incorporate environmental agents, emphasizing the intermediary role of these and other stressors. The conclusion is that more careful reliance on established frameworks will lead directly to improvements in characterizing cumulative risk burdens and accounting for disproportionate adverse health effects.

  12. PRACTICAL APPLICATION OF A MODEL FOR ASSESSING

    Directory of Open Access Journals (Sweden)

    Petr NOVOTNÝ

    2015-12-01

    Rail transport is an important sub-sector of transport infrastructure. Disruption of its operation due to emergencies can result in a reduction of the functional parameters of the services provided, with consequent impacts on society. Identification of the critical elements of this system enables its timely and effective protection. On that ground, the article presents a draft model for assessing the criticality of railway infrastructure elements. This model uses a systems approach and a multicriteria semi-quantitative analysis with weighted criteria for calculating the criticality of individual elements of the railway infrastructure. In the conclusion, it presents a practical application of the proposed model, including a discussion of the results.

  13. Model based risk assessment - the CORAS framework

    Energy Technology Data Exchange (ETDEWEB)

    Gran, Bjoern Axel; Fredriksen, Rune; Thunem, Atoosa P-J.

    2004-04-15

    Traditional risk analysis and assessment is based on failure-oriented models of the system. In contrast to this, model-based risk assessment (MBRA) utilizes success-oriented models describing all intended system aspects, including functional, operational and organizational aspects of the target. The target models are then used as input sources for complementary risk analysis and assessment techniques, as well as a basis for the documentation of the assessment results. The EU-funded CORAS project developed a tool-supported methodology for the application of MBRA in security-critical systems. The methodology has been tested with successful outcomes through a series of seven trials within the telemedicine and e-commerce areas. The CORAS project in general, and the CORAS application of MBRA in particular, have contributed positively to the visibility of model-based risk assessment and thus to the disclosure of several potentials for further exploitation of various aspects within this important research field. In that connection, the CORAS methodology's possibilities for further improvement towards utilization in more complex architectures, and also in other application domains such as the nuclear field, can be addressed. The latter calls for adapting the framework to address nuclear standards such as IEC 60880 and IEC 61513. For this development we recommend applying a trial-driven approach within the nuclear field. The tool-supported approach for combining risk analysis and system development also fits well with the HRP proposal for developing an Integrated Design Environment (IDE) providing efficient methods and tools to support control room systems design. (Author)

  14. Radionuclide transport and dose assessment modelling in biosphere assessment 2009

    International Nuclear Information System (INIS)

    Hjerpe, T.; Broed, R.

    2010-11-01

    Following the guidelines set forth by the Ministry of Trade and Industry (now Ministry of Employment and Economy), Posiva is preparing to submit a construction license application for the final disposal of spent nuclear fuel at the Olkiluoto site, Finland, by the end of the year 2012. Disposal will take place in a geological repository implemented according to the KBS-3 method. The long-term safety section supporting the license application will be based on a safety case that, according to the internationally adopted definition, will be a compilation of the evidence, analyses and arguments that quantify and substantiate the safety and the level of expert confidence in the safety of the planned repository. This report documents in detail the conceptual and mathematical models and key data used in the landscape model set-up, radionuclide transport modelling, and radiological consequences analysis applied in the 2009 biosphere assessment. The resulting environmental activity concentrations in the landscape model due to constant unit geosphere release rates, and the corresponding annual doses, are also calculated and presented in this report. This provides the basis for understanding the behaviour of the applied landscape model and the subsequent dose calculations. (orig.)

  15. Review and assessment of pool scrubbing models

    International Nuclear Information System (INIS)

    Herranz, L.E.; Escudero, M.J.; Peyres, V.; Polo, J.; Lopez, J.

    1996-01-01

    Decontamination of fission products bearing bubbles as they pass through aqueous pools becomes a crucial phenomenon for source term evaluation of hypothetical risk dominant sequences of Light Water Reactors. In the present report a peer review and assessment of models encapsulated in SPARC and BUSCA codes is presented. Several aspects of pool scrubbing have been addressed: particle removal, fission product vapour retention and bubble hydrodynamics. Particular emphasis has been given to the close link between retention and hydrodynamics, from both modelling and experimental point of view. In addition, RHR and SGTR sequences were simulated with SPARC90 and BUSCA-AUG92 codes, and their results were compared with those obtained with MAAP 3.0B. As a result of this work, model capabilities and shortcomings have been assessed and some areas susceptible of further research have been identified. (Author) 73 refs

  16. Review and assessment of pool scrubbing models

    Energy Technology Data Exchange (ETDEWEB)

    Herranz, L.E.; Escudero, M.J.; Peyres, V.; Polo, J.; Lopez, J.

    1996-07-01

    Decontamination of fission products bearing bubbles as they pass through aqueous pools becomes a crucial phenomenon for source term evaluation of hypothetical risk dominant sequences of Light Water Reactors. In the present report a peer review and assessment of models encapsulated in SPARC and BUSCA codes is presented. Several aspects of pool scrubbing have been addressed: particle removal, fission product vapour retention and bubble hydrodynamics. Particular emphasis has been given to the close link between retention and hydrodynamics, from both modelling and experimental point of view. In addition, RHR and SGTR sequences were simulated with SPARC90 and BUSCA-AUG92 codes, and their results were compared with those obtained with MAAP 3.0B. As a result of this work, model capabilities and shortcomings have been assessed and some areas susceptible of further research have been identified. (Author) 73 refs.

  17. Model evaluation methodology applicable to environmental assessment models

    International Nuclear Information System (INIS)

    Shaeffer, D.L.

    1979-08-01

    A model evaluation methodology is presented to provide a systematic framework within which the adequacy of environmental assessment models might be examined. The necessity for such a tool is motivated by the widespread use of models for predicting the environmental consequences of various human activities and by the reliance on these model predictions for deciding whether a particular activity requires the deployment of costly control measures. Consequently, the uncertainty associated with prediction must be established for the use of such models. The methodology presented here consists of six major tasks: model examination, algorithm examination, data evaluation, sensitivity analyses, validation studies, and code comparison. This methodology is presented in the form of a flowchart to show the logical interrelatedness of the various tasks. Emphasis has been placed on identifying those parameters which are most important in determining the predictive outputs of a model. Importance has been attached to the process of collecting quality data. A method has been developed for analyzing multiplicative chain models when the input parameters are statistically independent and lognormally distributed. Latin hypercube sampling has been offered as a promising candidate for doing sensitivity analyses. Several different ways of viewing the validity of a model have been presented. Criteria are presented for selecting models for environmental assessment purposes
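
    One concrete piece of the methodology, the analysis of multiplicative chain models with independent lognormal inputs, can be sketched as follows. The chain, its parameter values and the Monte Carlo check are illustrative assumptions; the analytic step only uses the fact that a product of independent lognormal variables is itself lognormal:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative multiplicative chain: output = factor1 * factor2 * factor3 * factor4,
# each factor lognormal with log-mean mu_i and log-standard-deviation sigma_i.
mus = np.array([0.0, -1.0, 0.5, -2.0])
sigmas = np.array([0.3, 0.5, 0.4, 0.2])

# Analytic result: the product of independent lognormals is lognormal with
# log-mean sum(mu_i) and log-variance sum(sigma_i**2).
mu_prod = mus.sum()
sigma_prod = np.sqrt(np.sum(sigmas ** 2))
print("analytic median:", np.exp(mu_prod), " analytic GSD:", np.exp(sigma_prod))

# Monte Carlo check of the analytic result
samples = np.prod(rng.lognormal(mus, sigmas, size=(100_000, 4)), axis=1)
print("sampled  median:", np.median(samples),
      " sampled  GSD:", np.exp(np.std(np.log(samples))))
```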

  18. Models for assessing and managing credit risk

    Directory of Open Access Journals (Sweden)

    Neogradi Slađana

    2014-01-01

    This essay deals with the definition of a model for assessing and managing credit risk. Risk is an inseparable component of any average and normal credit transaction. The essay looks at different aspects of the identification and classification of risk in the banking industry, as well as the key components of modern risk management. The first part of the essay analyses the impact of credit risk on a bank and empirical models for detecting the financial difficulties in which a company may find itself; on the basis of these models, a bank can reduce the number of approved risky assets. The second part considers models for improving credit risk management, with emphasis on Basel I, II and III, and in the third part we conclude which model is the most appropriate and gives the best results for measuring credit risk in domestic banks.

  19. Integrated assessment models of global climate change

    International Nuclear Information System (INIS)

    Parson, E.A.; Fisher-Vanden, K.

    1997-01-01

    The authors review recent work in the integrated assessment modeling of global climate change. This field has grown rapidly since 1990. Integrated assessment models seek to combine knowledge from multiple disciplines in formal integrated representations; inform policy-making, structure knowledge, and prioritize key uncertainties; and advance knowledge of broad system linkages and feedbacks, particularly between socio-economic and bio-physical processes. They may combine simplified representations of the socio-economic determinants of greenhouse gas emissions, the atmosphere and oceans, impacts on human activities and ecosystems, and potential policies and responses. The authors summarize current projects, grouping them according to whether they emphasize the dynamics of emissions control and optimal policy-making, uncertainty, or spatial detail. They review the few significant insights that have been claimed from work to date and identify important challenges for integrated assessment modeling in its relationships to disciplinary knowledge and to broader assessment seeking to inform policy- and decision-making. 192 refs., 2 figs

  20. Exposure factors for marine eutrophication impacts assessment based on a mechanistic biological model

    DEFF Research Database (Denmark)

    Cosme, Nuno Miguel Dias; Koski, Marja; Hauschild, Michael Zwicky

    2015-01-01

    Emissions of nitrogen (N) from anthropogenic sources enrich marine waters and promote planktonic growth. This newly synthesised organic carbon is eventually exported to benthic waters where aerobic respiration by heterotrophic bacteria results in the consumption of dissolved oxygen (DO). This pathway is typical of marine eutrophication. A model is proposed to mechanistically estimate the response of coastal marine ecosystems to N inputs. It addresses the biological processes of nutrient-limited primary production (PP), metazoan consumption, and bacterial degradation, in four distinct sinking ... marine ecosystem (LME), five climate zones, and site-generic. The exposure factors (XFs) obtained range from 0.45 (Central Arctic Ocean) to 15.9 kg O2·kg N-1 (Baltic Sea). While LME resolution is recommended, aggregated PE or XF values per climate zone can be adopted, but not global aggregation, due to the high variability. The XF ...

  1. Tailored model abstraction in performance assessments

    International Nuclear Information System (INIS)

    Kessler, J.H.

    1995-01-01

    Total System Performance Assessments (TSPAs) are likely to be one of the most significant parts of making safety cases for the continued development and licensing of geologic repositories for the disposal of spent fuel and HLW. Thus, it is critical that the TSPA model capture the 'essence' of the physical processes relevant to demonstrating the appropriate regulation is met. But how much detail about the physical processes must be modeled and understood before there is enough confidence that the appropriate essence has been captured? In this summary the level of model abstraction that is required is discussed. Approaches for subsystem and total system performance analyses are outlined, and the role of best estimate models is examined. It is concluded that a conservative approach for repository performance, based on limited amount of field and laboratory data, can provide sufficient confidence for a regulatory decision

  2. Performance assessment modeling of pyrometallurgical process wasteforms

    International Nuclear Information System (INIS)

    Nutt, W.M.; Hill, R.N.; Bullen, D.B.

    1995-01-01

    Performance assessment analyses have been completed to estimate the behavior of high-level nuclear wasteforms generated from the pyrometallurgical processing of liquid metal reactor (LMR) and light water reactor (LWR) spent nuclear fuel. Waste emplaced in the proposed repository at Yucca Mountain is investigated as the basis for the study. The resulting cumulative actinide and fission product releases to the accessible environment within a 100,000 year period from the various pyrometallurgical process wasteforms are compared to those of directly disposed LWR spent fuel using the same total repository system model. The impact of differing radionuclide transport models on the overall release characteristics is investigated

  3. Nuclear security assessment with Markov model approach

    International Nuclear Information System (INIS)

    Suzuki, Mitsutoshi; Terao, Norichika

    2013-01-01

    Nuclear security risk assessment with a Markov model based on random events is performed to explore an evaluation methodology for physical protection in nuclear facilities. Because security incidences are initiated by malicious and intentional acts, expert judgment and Bayes updating are used to estimate scenario and initiation likelihood, and it is assumed that the Markov model derived from a stochastic process can be applied to the incidence sequence. Both an unauthorized intrusion as a Design Based Threat (DBT) and a stand-off attack as beyond-DBT are assumed for hypothetical facilities, and the performance of physical protection and the mitigation and minimization of consequences are investigated to develop the assessment methodology in a semi-quantitative manner. It is shown that cooperation between the facility operator and the security authority is important to respond to the beyond-DBT incidence. (author)
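
    A minimal sketch of a Markov treatment of an incidence sequence is given below. The states, transition probabilities and absorbing outcomes are invented for illustration and are not taken from the paper; the point is only how absorption probabilities for a protection sequence can be propagated:

```python
import numpy as np

# Invented incidence-sequence states:
# 0 attempt, 1 detected, 2 delayed, 3 neutralised (absorbing), 4 sabotage completed (absorbing)
P = np.array([
    [0.0, 0.9, 0.0, 0.0, 0.1],   # attempt: detected with p = 0.9, otherwise undetected success
    [0.0, 0.0, 0.8, 0.0, 0.2],   # detected: delayed by barriers with p = 0.8
    [0.0, 0.0, 0.0, 0.7, 0.3],   # delayed: response force neutralises with p = 0.7
    [0.0, 0.0, 0.0, 1.0, 0.0],   # neutralised (absorbing)
    [0.0, 0.0, 0.0, 0.0, 1.0],   # completed (absorbing)
])

state = np.array([1.0, 0.0, 0.0, 0.0, 0.0])   # sequence starts with an attempt
for _ in range(10):                           # enough steps for the chain to be absorbed
    state = state @ P

print("P(neutralised)        =", round(state[3], 3))   # 0.9 * 0.8 * 0.7 = 0.504
print("P(sabotage completed) =", round(state[4], 3))
```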

  4. Improving Flood Damage Assessment Models in Italy

    Science.gov (United States)

    Amadio, M.; Mysiak, J.; Carrera, L.; Koks, E.

    2015-12-01

    The use of Stage-Damage Curve (SDC) models is prevalent in ex-ante assessments of flood risk. To assess the potential damage of a flood event, SDCs describe a relation between water depth and the associated potential economic damage for each land-use class. This relation is normally developed and calibrated through site-specific analysis based on ex-post damage observations. In some cases (e.g. Italy) SDCs are transferred from other countries, undermining the accuracy and reliability of simulation results. Against this background, we developed a refined SDC model for Northern Italy, underpinned by damage compensation records from a recent flood event. Our analysis considers both damage to physical assets and production losses from business interruptions. While the first is calculated based on land use information, production losses are measured through the spatial distribution of Gross Value Added (GVA). An additional component of the model assesses crop-specific agricultural losses as a function of flood seasonality. Our results show an overestimation of asset damage from non-calibrated SDC values of up to a factor of 4.5 for the tested land use categories. Furthermore, we estimate that production losses amount to around 6 per cent of the annual GVA. Also, maximum yield losses are less than a half of the amount predicted by the standard SDC methods.
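
    A stage-damage curve is, in essence, a depth-to-damage lookup per land-use class scaled by the exposed value. The sketch below shows that structure with invented curve points and unit values (not the calibrated Italian curves described above):

```python
import numpy as np

# Invented stage-damage curves: damage fraction (0-1) versus water depth (m) per land-use class.
DEPTHS_M = np.array([0.0, 0.5, 1.0, 2.0, 3.0])
DAMAGE_FRACTION = {
    "residential": np.array([0.0, 0.25, 0.40, 0.60, 0.75]),
    "industrial":  np.array([0.0, 0.15, 0.30, 0.50, 0.65]),
    "agriculture": np.array([0.0, 0.30, 0.55, 0.80, 0.90]),
}
MAX_DAMAGE_EUR_PER_M2 = {"residential": 600.0, "industrial": 450.0, "agriculture": 2.0}

def flood_damage(depth_m, land_use, area_m2):
    """Interpolate the stage-damage curve and scale by the exposed value."""
    fraction = np.interp(depth_m, DEPTHS_M, DAMAGE_FRACTION[land_use])
    return fraction * MAX_DAMAGE_EUR_PER_M2[land_use] * area_m2

# Damage for a 1 ha residential cell flooded to 1.3 m
print(round(flood_damage(1.3, "residential", 10_000)))
```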

  5. Expert judgement models in quantitative risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Rosqvist, T. [VTT Automation, Helsinki (Finland); Tuominen, R. [VTT Automation, Tampere (Finland)

    1999-12-01

    Expert judgement is a valuable source of information in risk management. Especially, risk-based decision making relies significantly on quantitative risk assessment, which requires numerical data describing the initiator event frequencies and conditional probabilities in the risk model. This data is seldom found in databases and has to be elicited from qualified experts. In this report, we discuss some modelling approaches to expert judgement in risk modelling. A classical and a Bayesian expert model is presented and applied to real case expert judgement data. The cornerstone in the models is the log-normal distribution, which is argued to be a satisfactory choice for modelling degree-of-belief type probability distributions with respect to the unknown parameters in a risk model. Expert judgements are qualified according to bias, dispersion, and dependency, which are treated differently in the classical and Bayesian approaches. The differences are pointed out and related to the application task. Differences in the results obtained from the different approaches, as applied to real case expert judgement data, are discussed. Also, the role of a degree-of-belief type probability in risk decision making is discussed.
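
    As a rough illustration of lognormal expert-judgement modelling (one simple pooling choice, not the classical or Bayesian models of the report), each expert's best estimate and uncertainty factor can be mapped to a lognormal and combined in log space; the numbers and weights below are assumed:

```python
import numpy as np

# Hypothetical elicitation for one initiator event frequency (per year):
# each expert gives a best estimate (median) and a 95th-percentile uncertainty factor.
medians = np.array([1e-3, 3e-3, 5e-4])
factors = np.array([10.0, 5.0, 20.0])      # expert i: 95th percentile = median * factor
weights = np.array([0.5, 0.3, 0.2])        # assumed performance-based weights, summing to 1

mu = np.log(medians)                       # log-median of each expert's lognormal
sigma = np.log(factors) / 1.645            # 95th percentile -> log-standard-deviation

# One simple pooling choice: weighted combination in log space
mu_pool = np.sum(weights * mu)
sigma_pool = np.sqrt(np.sum(weights * sigma ** 2))

print("pooled median         :", np.exp(mu_pool))
print("pooled 95th percentile:", np.exp(mu_pool + 1.645 * sigma_pool))
```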

  6. Expert judgement models in quantitative risk assessment

    International Nuclear Information System (INIS)

    Rosqvist, T.; Tuominen, R.

    1999-01-01

    Expert judgement is a valuable source of information in risk management. Especially, risk-based decision making relies significantly on quantitative risk assessment, which requires numerical data describing the initiator event frequencies and conditional probabilities in the risk model. This data is seldom found in databases and has to be elicited from qualified experts. In this report, we discuss some modelling approaches to expert judgement in risk modelling. A classical and a Bayesian expert model is presented and applied to real case expert judgement data. The cornerstone in the models is the log-normal distribution, which is argued to be a satisfactory choice for modelling degree-of-belief type probability distributions with respect to the unknown parameters in a risk model. Expert judgements are qualified according to bias, dispersion, and dependency, which are treated differently in the classical and Bayesian approaches. The differences are pointed out and related to the application task. Differences in the results obtained from the different approaches, as applied to real case expert judgement data, are discussed. Also, the role of a degree-of-belief type probability in risk decision making is discussed

  7. Review and assessment of pool scrubbing models

    International Nuclear Information System (INIS)

    Herranz, L.E.; Escudero, M.J.; Peyres, V.; Polo, J.; Lopez-Jimenez, J.

    1996-01-01

    Decontamination of fission products bearing bubbles as they pass through aqueous pools becomes a crucial phenomenon for source term evaluation of hypothetical risk dominant sequences of Light Water Reactors. In the present report a peer review and assessment of models encapsulated in SPARC and BUSCA codes is presented. Several aspects of pool scrubbing have been addressed: particle removal, fission product vapour retention and bubble hydrodynamics. Particular emphasis has been given to the close link between retention and hydrodynamics, from both modelling and experimental point of view. In addition, RHR and SGTR sequences were simulated with SPARC90 and BUSCA-AUG92 codes, and their results were compared with those obtained with MAAP 3.0B. As a result of this work, model capabilities and shortcomings have been assessed and some areas susceptible of further research have been identified. (Author) 73 refs

  8. Assessment model validity document FARF31

    International Nuclear Information System (INIS)

    Elert, Mark; Gylling Bjoern; Lindgren, Maria

    2004-08-01

    The prime goal of model validation is to build confidence in the model concept and in the model being fit for its intended purpose. In other words: does the model predict transport in fractured rock adequately to be used in repository performance assessments? Are the results reasonable for the type of modelling tasks the model is designed for? Commonly, in performance assessments a large number of realisations of flow and transport is made to cover the associated uncertainties. Thus, the flow and transport, including radioactive chain decay, are preferably calculated in the same model framework. A rather sophisticated concept is necessary to be able to model flow and radionuclide transport in the near field and far field of a deep repository, also including radioactive chain decay. In order to avoid excessively long computational times there is a need for well-based simplifications. For this reason, the far field code FARF31 is made relatively simple, and calculates transport by using averaged entities to represent the most important processes. FARF31 has been shown to be suitable for the performance assessments within the SKB studies, e.g. SR 97. Among the advantages are that it is a fast, simple and robust code, which enables the handling of many realisations with a wide spread in parameters in combination with chain decay of radionuclides. Being a component in the model chain PROPER, it is easy to assign statistical distributions to the input parameters. Due to the formulation of the advection-dispersion equation in FARF31 it is possible to perform the groundwater flow calculations separately. The basis for the modelling is a stream tube, i.e. a volume of rock including fractures with flowing water, with the walls of the imaginary stream tube defined by streamlines. The transport within the stream tube is described using a dual porosity continuum approach, where it is assumed that the rock can be divided into two distinct domains with different types of porosity

  9. Probabilistic Radiological Performance Assessment Modeling and Uncertainty

    Science.gov (United States)

    Tauxe, J.

    2004-12-01

    A generic probabilistic radiological Performance Assessment (PA) model is presented. The model, built using the GoldSim systems simulation software platform, concerns contaminant transport and dose estimation in support of decision making with uncertainty. Both the U.S. Nuclear Regulatory Commission (NRC) and the U.S. Department of Energy (DOE) require assessments of potential future risk to human receptors of disposal of LLW. Commercially operated LLW disposal facilities are licensed by the NRC (or agreement states), and the DOE operates such facilities for disposal of DOE-generated LLW. The type of PA model presented is probabilistic in nature, and hence reflects the current state of knowledge about the site by using probability distributions to capture what is expected (central tendency or average) and the uncertainty (e.g., standard deviation) associated with input parameters, and propagating through the model to arrive at output distributions that reflect expected performance and the overall uncertainty in the system. Estimates of contaminant release rates, concentrations in environmental media, and resulting doses to human receptors well into the future are made by running the model in Monte Carlo fashion, with each realization representing a possible combination of input parameter values. Statistical summaries of the results can be compared to regulatory performance objectives, and decision makers are better informed of the inherently uncertain aspects of the model which supports their decision-making. While this information may make some regulators uncomfortable, they must realize that uncertainties which were hidden in a deterministic analysis are revealed in a probabilistic analysis, and the chance of making a correct decision is now known rather than hoped for. The model includes many typical features and processes that would be part of a PA, but is entirely fictitious. This does not represent any particular site and is meant to be a generic example. A
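
    The probabilistic propagation described above can be sketched in a few lines: draw each uncertain parameter from its distribution, push every realisation through a (here entirely fictitious) transport-and-dose chain, and summarise the output distribution against a performance objective. All distributions, the dose chain and the limit below are illustrative assumptions, in keeping with the generic character of the model:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 50_000                                   # number of Monte Carlo realisations

# Entirely fictitious input distributions
release_rate = rng.lognormal(mean=np.log(1e3), sigma=0.8, size=n)        # Bq/yr
dilution = rng.triangular(1e5, 1e6, 1e7, size=n)                         # m3/yr
water_intake = rng.normal(loc=0.6, scale=0.1, size=n).clip(min=0.1)      # m3/yr
dose_coefficient = 2.8e-8                                                # Sv/Bq (fixed here)

# Illustrative transport-and-dose chain: concentration * intake * dose coefficient
dose_msv = release_rate / dilution * water_intake * dose_coefficient * 1e3   # mSv/yr

print("mean dose (mSv/yr)      :", dose_msv.mean())
print("95th percentile (mSv/yr):", np.percentile(dose_msv, 95))
print("P(dose > 1e-3 mSv/yr)   :", np.mean(dose_msv > 1e-3))   # compare with an assumed objective
```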

  10. Triangular model integrating clinical teaching and assessment.

    Science.gov (United States)

    Abdelaziz, Adel; Koshak, Emad

    2014-01-01

    Structuring clinical teaching is a challenge facing medical education curriculum designers. A variety of instructional methods on different domains of learning are indicated to accommodate different learning styles. Conventional methods of clinical teaching, like training in ambulatory care settings, are prone to the factor of coincidence in having varieties of patient presentations. Accordingly, alternative methods of instruction are indicated to compensate for the deficiencies of these conventional methods. This paper presents an initiative that can be used to design a checklist as a blueprint to guide appropriate selection and implementation of teaching/learning and assessment methods in each of the educational courses and modules based on educational objectives. Three categories of instructional methods were identified, and within each a variety of methods were included. These categories are classroom-type settings, health services-based settings, and community service-based settings. Such categories have framed our triangular model of clinical teaching and assessment.

  11. The MESORAD dose assessment model: Computer code

    International Nuclear Information System (INIS)

    Ramsdell, J.V.; Athey, G.F.; Bander, T.J.; Scherpelz, R.I.

    1988-10-01

    MESORAD is a dose equivalent model for emergency response applications that is designed to be run on minicomputers. It has been developed by the Pacific Northwest Laboratory for use as part of the Intermediate Dose Assessment System in the US Nuclear Regulatory Commission Operations Center in Washington, DC, and the Emergency Management System in the US Department of Energy Unified Dose Assessment Center in Richland, Washington. This volume describes the MESORAD computer code and contains a listing of the code. The technical basis for MESORAD is described in the first volume of this report (Scherpelz et al. 1986). A third volume of the documentation is planned. That volume will contain utility programs and input and output files that can be used to check the implementation of MESORAD. 18 figs., 4 tabs

  12. Triangular model integrating clinical teaching and assessment

    Directory of Open Access Journals (Sweden)

    Abdelaziz A

    2014-03-01

    Adel Abdelaziz,1,2 Emad Koshak3; 1Medical Education Development Unit, Faculty of Medicine, Al Baha University, Al Baha, Saudi Arabia; 2Medical Education Department, Faculty of Medicine, Suez Canal University, Egypt; 3Dean and Internal Medicine Department, Faculty of Medicine, Al Baha University, Al Baha, Saudi Arabia. Abstract: Structuring clinical teaching is a challenge facing medical education curriculum designers. A variety of instructional methods on different domains of learning are indicated to accommodate different learning styles. Conventional methods of clinical teaching, like training in ambulatory care settings, are prone to the factor of coincidence in having varieties of patient presentations. Accordingly, alternative methods of instruction are indicated to compensate for the deficiencies of these conventional methods. This paper presents an initiative that can be used to design a checklist as a blueprint to guide appropriate selection and implementation of teaching/learning and assessment methods in each of the educational courses and modules based on educational objectives. Three categories of instructional methods were identified, and within each a variety of methods were included. These categories are classroom-type settings, health services-based settings, and community service-based settings. Such categories have framed our triangular model of clinical teaching and assessment. Keywords: curriculum development, teaching, learning, assessment, apprenticeship, community-based settings, health service-based settings

  13. Assessing elders using the functional health pattern assessment model.

    Science.gov (United States)

    Beyea, S; Matzo, M

    1989-01-01

    The impact of older Americans on the health care system requires that we increase our students' awareness of their unique needs. The authors discuss strategies to develop skills using Gordon's Functional Health Patterns Assessment for assessing older clients.

  14. An Exploratory Study: Assessment of Modeled Dioxin ...

    Science.gov (United States)

    EPA has released an external review draft entitled, An Exploratory Study: Assessment of Modeled Dioxin Exposure in Ceramic Art Studios (External Review Draft). The public comment period and the external peer-review workshop are separate processes that provide opportunities for all interested parties to comment on the document. In addition to consideration by EPA, all public comments submitted in accordance with this notice will also be forwarded to EPA's contractor for the external peer-review panel prior to the workshop. EPA has released this draft document solely for the purpose of pre-dissemination peer review under applicable information quality guidelines. This document has not been formally disseminated by EPA. It does not represent and should not be construed to represent any Agency policy or determination. The purpose of this report is to describe an exploratory investigation of potential dioxin exposures to artists/hobbyists who use ball clay to make pottery and related products.

  15. Modelling saline intrusion for repository performance assessment

    International Nuclear Information System (INIS)

    Jackson, C.P.

    1989-04-01

    UK Nirex Ltd are currently considering the possibility of disposal of radioactive waste by burial in deep underground repositories. The natural pathway for radionuclides from such a repository to return to Man's immediate environment (the biosphere) is via groundwater. Thus analyses of the groundwater flow in the neighbourhood of a possible repository, and consequent radionuclide transport form an important part of a performance assessment for a repository. Some of the areas in the UK that might be considered as possible locations for a repository are near the coast. If a repository is located in a coastal region seawater may intrude into the groundwater flow system. As seawater is denser than fresh water buoyancy forces acting on the intruding saline water may have significant effects on the groundwater flow system, and consequently on the time for radionuclides to return to the biosphere. Further, the chemistry of the repository near-field may be strongly influenced by the salinity of the groundwater. It is therefore important for Nirex to have a capability for reliably modelling saline intrusion to an appropriate degree of accuracy in order to make performance assessments for a repository in a coastal region. This report describes work undertaken in the Nirex Research programme to provide such a capability. (author)

  16. Accuracy Assessment of Different Digital Surface Models

    Directory of Open Access Journals (Sweden)

    Ugur Alganci

    2018-03-01

    Digital elevation models (DEMs), which can occur in the form of digital surface models (DSMs) or digital terrain models (DTMs), are widely used as important geospatial information sources for various remote sensing applications, including the precise orthorectification of high-resolution satellite images, 3D spatial analyses, multi-criteria decision support systems, and deformation monitoring. The accuracy of DEMs has direct impacts on specific calculations and process chains; therefore, it is important to select the most appropriate DEM by considering the aim, accuracy requirement, and scale of each study. In this research, DSMs obtained from a variety of satellite sensors were compared to analyze their accuracy and performance. For this purpose, freely available Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) 30 m, Shuttle Radar Topography Mission (SRTM) 30 m, and Advanced Land Observing Satellite (ALOS) 30 m resolution DSM data were obtained. Additionally, 3 m and 1 m resolution DSMs were produced from tri-stereo images from the SPOT 6 and Pleiades high-resolution (PHR 1A) satellites, respectively. Elevation reference data provided by the General Command of Mapping, the national mapping agency of Turkey—produced from 30 cm spatial resolution stereo aerial photos, with a 5 m grid spacing and ±3 m or better overall vertical accuracy at the 90% confidence interval (CI)—were used to perform accuracy assessments. Gross errors and water surfaces were removed from the reference DSM. The relative accuracies of the different DSMs were tested using a different number of checkpoints determined by different methods. In the first method, 25 checkpoints were selected from bare lands to evaluate the accuracies of the DSMs on terrain surfaces. In the second method, 1000 randomly selected checkpoints were used to evaluate the methods' accuracies for the whole study area. In addition to the control point approach, vertical cross
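
    The checkpoint-based accuracy assessment reduces to summarising the elevation differences between each DSM and the reference data. A minimal sketch with invented checkpoint values (not the Turkish reference data set) is:

```python
import numpy as np

def vertical_accuracy(dsm_heights, reference_heights):
    """Summary statistics commonly reported in DEM/DSM accuracy assessment."""
    dz = np.asarray(dsm_heights, dtype=float) - np.asarray(reference_heights, dtype=float)
    return {
        "mean_error": dz.mean(),                  # systematic bias
        "rmse": np.sqrt(np.mean(dz ** 2)),
        "std": dz.std(ddof=1),
        "le90": np.percentile(np.abs(dz), 90),    # linear error at 90% confidence
    }

# Invented checkpoint elevations in metres
reference = np.array([812.4, 815.1, 820.7, 805.3, 799.8])
dsm = np.array([813.9, 816.0, 822.5, 806.1, 801.2])
print(vertical_accuracy(dsm, reference))
```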

  17. Hierarchical Model of Assessing and Selecting Experts

    Science.gov (United States)

    Chernysheva, T. Y.; Korchuganova, M. A.; Borisov, V. V.; Min'kov, S. L.

    2016-04-01

    Revealing experts' competences is a multi-objective issue. The authors of the paper deal with methods for assessing the competence of experts, treated as assessment objects, and with the corresponding quality criteria. An analytic hierarchy process for assessing and ranking experts is offered, which is based on paired comparison matrices and scores; quality parameters are taken into account as well. A calculation and assessment of experts is given as an example.
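
    The analytic hierarchy process step mentioned above derives criterion weights from a paired comparison matrix via its principal eigenvector and checks consistency. The matrix below is a hypothetical example, not one elicited in the paper:

```python
import numpy as np

# Hypothetical pairwise comparison matrix (Saaty 1-9 scale) for three quality criteria;
# a_ij is the judged importance of criterion i relative to criterion j.
A = np.array([
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
])

# Priority vector = principal eigenvector of A, normalised to sum to one
eigenvalues, eigenvectors = np.linalg.eig(A)
k = np.argmax(eigenvalues.real)
weights = np.abs(eigenvectors[:, k].real)
weights /= weights.sum()

# Saaty consistency check: CI = (lambda_max - n) / (n - 1), CR = CI / RI
n = A.shape[0]
lambda_max = eigenvalues.real[k]
consistency_index = (lambda_max - n) / (n - 1)
random_index = 0.58                      # Saaty's random index for n = 3
print("weights:", np.round(weights, 3), " CR:", round(consistency_index / random_index, 3))
```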

  18. Hierarchical Model of Assessing and Selecting Experts

    OpenAIRE

    Chernysheva, Tatiana Yurievna; Korchuganova, Mariya Anatolievna; Borisov, V. V.; Minkov, S. L.

    2016-01-01

    Revealing experts' competences is a multi-objective issue. The authors of the paper deal with methods for assessing the competence of experts, treated as assessment objects, and with the corresponding quality criteria. An analytic hierarchy process for assessing and ranking experts is offered, which is based on paired comparison matrices and scores; quality parameters are taken into account as well. A calculation and assessment of experts is given as an example.

  19. Critical assessment of nuclear mass models

    International Nuclear Information System (INIS)

    Moeller, P.; Nix, J.R.

    1992-01-01

    Some of the physical assumptions underlying various nuclear mass models are discussed. The ability of different mass models to predict new masses that were not taken into account when the models were formulated and their parameters determined is analyzed. The models are also compared with respect to their ability to describe nuclear-structure properties in general. The analysis suggests future directions for mass-model development

  20. The Development of a Secondary School Health Assessment Model

    Science.gov (United States)

    Sriring, Srinual; Erawan, Prawit; Sriwarom, Monoon

    2015-01-01

    The objectives of this research were: 1) to survey information relating to secondary school health, 2) to construct a model of health assessment and a handbook for using the model in secondary schools, and 3) to develop an assessment model for secondary schools. The research included 3 phases. Phase (1) involved a survey of…

  1. FORMATIVE ASSESSMENT MODEL OF LEARNING SUCCESS ACHIEVEMENTS

    Directory of Open Access Journals (Sweden)

    Mikhailova Elena Konstantinovna

    2013-05-01

    The paper is devoted to the problem of assessing school students' learning achievements. The problem is investigated from the viewpoint of assessing students' learning outcomes, with the aim of providing teachers and students with the means and conditions to improve the educational process and its results.

  2. Assessing Asset Pricing Models Using Revealed Preference

    OpenAIRE

    Jonathan B. Berk; Jules H. van Binsbergen

    2014-01-01

    We propose a new method of testing asset pricing models that relies on using quantities rather than prices or returns. We use the capital flows into and out of mutual funds to infer which risk model investors use. We derive a simple test statistic that allows us to infer, from a set of candidate models, the model that is closest to the model that investors use in making their capital allocation decisions. Using this methodology, we find that of the models most commonly used in the literature,...

  3. Assessing NARCCAP climate model effects using spatial confidence regions

    Directory of Open Access Journals (Sweden)

    J. P. French

    2017-07-01

    We assess similarities and differences between model effects for the North American Regional Climate Change Assessment Program (NARCCAP) climate models using varying classes of linear regression models. Specifically, we consider how the average temperature effect differs for the various global and regional climate model combinations, including assessment of possible interaction between the effects of global and regional climate models. We use both pointwise and simultaneous inference procedures to identify regions where global and regional climate model effects differ. We also show conclusively that results from pointwise inference are misleading, and that accounting for multiple comparisons is important for making proper inference.
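
    The contrast between pointwise and simultaneous inference can be illustrated with a deliberately simplified stand-in (a Bonferroni adjustment over many grid cells, rather than the spatial confidence regions used in the paper); the synthetic example below shows how pointwise tests flag spurious differences that a simultaneous procedure does not:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
m = 500                                      # number of grid cells
effect_difference = rng.normal(0.0, 0.3, m)  # synthetic estimated effect differences (true value 0)
standard_error = np.full(m, 0.3)             # synthetic standard errors

alpha = 0.05
z_pointwise = stats.norm.ppf(1 - alpha / 2)            # per-cell critical value
z_simultaneous = stats.norm.ppf(1 - alpha / (2 * m))   # Bonferroni-adjusted critical value

z_scores = np.abs(effect_difference / standard_error)
print("cells flagged pointwise     :", int(np.sum(z_scores > z_pointwise)))      # roughly 5% false alarms
print("cells flagged simultaneously:", int(np.sum(z_scores > z_simultaneous)))   # usually none
```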

  4. A comparison of models for risk assessment

    International Nuclear Information System (INIS)

    Kellerer, A.M.; Jing Chen

    1993-01-01

    Various mathematical models have been used to represent the dependence of excess cancer risk on dose, age and time since exposure. For solid cancers, i.e. all cancers except leukaemia, the so-called relative risk model is usually employed. However, there can be quite different relative risk models. The most usual model for the quantification of excess tumour rates among the atomic bomb survivors has been a dependence of the relative risk on age at exposure, but it has been shown recently that an attained-age model can equally be applied to represent the observations among the atomic bomb survivors. The differences between the models and their implications are explained. It is also shown that the attained-age model is similar to the approaches that have been used in the analysis of lung cancer incidence among radon-exposed miners. A more unified approach to the modelling of radiation risks can thus be achieved. (3 figs.)
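
    The two relative risk parameterisations being contrasted can be written down as simple excess-relative-risk (ERR) functions. The functional forms and parameter values below are illustrative only and are not fitted values from the atomic bomb survivor data:

```python
import numpy as np

def err_age_at_exposure(dose_gy, age_at_exposure, beta=0.5, gamma=-0.03):
    """ERR = beta * dose * exp(gamma * (age at exposure - 30)); illustrative form only."""
    return beta * dose_gy * np.exp(gamma * (age_at_exposure - 30.0))

def err_attained_age(dose_gy, attained_age, beta=0.5, power=-1.5):
    """ERR = beta * dose * (attained age / 70)**power; illustrative form only."""
    return beta * dose_gy * (attained_age / 70.0) ** power

dose = 1.0  # Gy
for age_at_exposure, attained_age in [(10, 50), (30, 70), (50, 90)]:
    print(age_at_exposure, attained_age,
          round(err_age_at_exposure(dose, age_at_exposure), 2),
          round(err_attained_age(dose, attained_age), 2))
```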

  5. Assessment of multi class kinematic wave models

    NARCIS (Netherlands)

    Van Wageningen-Kessels, F.L.M.; Van Lint, J.W.C.; Vuik, C.; Hoogendoorn, S.P.

    2012-01-01

    In the last decade many multi class kinematic wave (MCKW) traffic flow models have been proposed. MCKW models introduce heterogeneity among vehicles and drivers. For example, they take into account differences in (maximum) velocities and driving style. Nevertheless, the models are macroscopic and the

  6. Theoretical Models, Assessment Frameworks and Test Construction.

    Science.gov (United States)

    Chalhoub-Deville, Micheline

    1997-01-01

    Reviews the usefulness of proficiency models influencing second language testing. Findings indicate that several factors contribute to the lack of congruence between models and test construction and make a case for distinguishing between theoretical models. Underscores the significance of an empirical, contextualized and structured approach to the…

  7. THE MODEL FOR RISK ASSESSMENT ERP-SYSTEMS INFORMATION SECURITY

    Directory of Open Access Journals (Sweden)

    V. S. Oladko

    2016-12-01

    The article deals with the problem of assessing information security risks in ERP systems. ERP-system functions and architecture are studied. A model of malicious impacts on the levels of the ERP-system architecture is composed. A model-based risk assessment, combining quantitative and qualitative approaches, is developed, built on a partial unification of three methods for studying information security risks: security models with full overlapping, the CRAMM technique, and the FRAP technique.
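
    A semi-quantitative scoring of risks per ERP architecture level, of the general kind combined in the paper, might be sketched as below. The layers, threats and 1-5 likelihood/impact ratings are assumptions for illustration and do not reproduce the paper's model:

```python
# Assumed layers, threats and 1-5 likelihood/impact ratings, for illustration only.
THREATS_BY_LEVEL = {
    "presentation layer": [("session hijacking", 3, 4), ("phishing", 4, 3)],
    "application layer":  [("privilege escalation", 2, 5), ("business logic tampering", 2, 4)],
    "database layer":     [("SQL injection", 3, 5), ("backup theft", 2, 5)],
}

def level_risk(threats):
    """Aggregate risk for one architecture level: worst likelihood x impact, scaled to 0-1."""
    return max(likelihood * impact for _, likelihood, impact in threats) / 25.0

for level, threats in THREATS_BY_LEVEL.items():
    print(f"{level:20s} risk = {level_risk(threats):.2f}")
```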

  8. Conceptual Models and Guidelines for Clinical Assessment of Financial Capacity.

    Science.gov (United States)

    Marson, Daniel

    2016-09-01

    The ability to manage financial affairs is a life skill of critical importance, and neuropsychologists are increasingly asked to assess financial capacity across a variety of settings. Sound clinical assessment of financial capacity requires knowledge and appreciation of applicable clinical conceptual models and principles. However, the literature has presented relatively little conceptual guidance for clinicians concerning financial capacity and its assessment. This article seeks to address this gap. The article presents six clinical models of financial capacity : (1) the early gerontological IADL model of Lawton, (2) the clinical skills model and (3) related cognitive psychological model developed by Marson and colleagues, (4) a financial decision-making model adapting earlier decisional capacity work of Appelbaum and Grisso, (5) a person-centered model of financial decision-making developed by Lichtenberg and colleagues, and (6) a recent model of financial capacity in the real world developed through the Institute of Medicine. Accompanying presentation of the models is discussion of conceptual and practical perspectives they represent for clinician assessment. Based on the models, the article concludes by presenting a series of conceptually oriented guidelines for clinical assessment of financial capacity. In summary, sound assessment of financial capacity requires knowledge and appreciation of clinical conceptual models and principles. Awareness of such models, principles and guidelines will strengthen and advance clinical assessment of financial capacity. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  9. Accuracy assessment of landslide prediction models

    International Nuclear Information System (INIS)

    Othman, A N; Mohd, W M N W; Noraini, S

    2014-01-01

    The increasing population and expansion of settlements over hilly areas have greatly increased the impact of natural disasters such as landslides. It is therefore important to develop models which can accurately predict landslide hazard zones. Over the years, various techniques and models have been developed for this purpose. The aim of this paper is to assess the accuracy of landslide prediction models developed by the authors. The methodology involved the selection of the study area, data acquisition, data processing, model development and data analysis. The models are based on nine different landslide-inducing parameters, i.e. slope, land use, lithology, soil properties, geomorphology, flow accumulation, aspect, proximity to river and proximity to road. Rank sum, rating, pairwise comparison and AHP techniques are used to determine the weights for each of the parameters. Four (4) different models, which consider different parameter combinations, are developed by the authors. Results are compared to the landslide history; the accuracies for Model 1, Model 2, Model 3 and Model 4 are 66.7%, 66.7%, 60% and 22.9%, respectively. From the results, rank sum, rating and pairwise comparison can be useful techniques for predicting landslide hazard zones.
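
    Of the weighting techniques listed above, AHP is the most formalised; the sketch below shows the usual principal-eigenvector derivation of criterion weights from a pairwise comparison matrix, together with the consistency ratio. The 3×3 example judgements are hypothetical and not taken from the paper.

    ```python
    # Deriving criterion weights from an AHP pairwise comparison matrix via the
    # principal eigenvector (standard AHP procedure; example judgements are made up).
    import numpy as np

    # A[i, j] = how much more important criterion i is than criterion j
    A = np.array([
        [1.0, 3.0, 5.0],   # e.g. slope vs land use vs lithology
        [1/3, 1.0, 2.0],
        [1/5, 1/2, 1.0],
    ])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)                  # principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                 # normalised weights

    # Consistency ratio (RI = 0.58 is the usual random index for n = 3)
    lam_max = eigvals.real[k]
    n = A.shape[0]
    CI = (lam_max - n) / (n - 1)
    CR = CI / 0.58

    print("weights:", np.round(w, 3), "consistency ratio:", round(CR, 3))
    ```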

  10. The Automation of Nowcast Model Assessment Processes

    Science.gov (United States)

    2016-09-01

    secondly, provide modelers with the information needed to understand the model errors and how their algorithm changes might mitigate these errors. In...by ARL modelers. 2. Development Environment The automation of Point-Stat processes (i.e., PSA) was developed using Python 3.5.* Python was selected...because it is easy to use, widely used for scripting, and satisfies all the requirements to automate the implementation of the Point-Stat tool. In

  11. Quality assessment for radiological model parameters

    International Nuclear Information System (INIS)

    Funtowicz, S.O.

    1989-01-01

    A prototype framework for representing uncertainties in radiological model parameters is introduced. This follows earlier development in this journal of a corresponding framework for representing uncertainties in radiological data. Refinements and extensions to the earlier framework are needed in order to take account of the additional contextual factors consequent on using data entries to quantify model parameters. The parameter coding can in turn feed in to methods for evaluating uncertainties in calculated model outputs. (author)

  12. Polytomous Rasch Models in Counseling Assessment

    Science.gov (United States)

    Willse, John T.

    2017-01-01

    This article provides a brief introduction to the Rasch model. Motivation for using Rasch analyses is provided. Important Rasch model concepts and key aspects of result interpretation are introduced, with major points reinforced using a simulation demonstration. Concrete guidelines are provided regarding sample size and the evaluation of items.

  13. Assessing a Theoretical Model on EFL College Students

    Science.gov (United States)

    Chang, Yu-Ping

    2011-01-01

    This study aimed to (1) integrate relevant language learning models and theories, (2) construct a theoretical model of college students' English learning performance, and (3) assess the model fit between empirically observed data and the theoretical model proposed by the researchers of this study. Subjects of this study were 1,129 Taiwanese EFL…

  14. Assessing Model Characterization of Single Source ...

    Science.gov (United States)

    Aircraft measurements made downwind from specific coal fired power plants during the 2013 Southeast Nexus field campaign provide a unique opportunity to evaluate single source photochemical model predictions of both O3 and secondary PM2.5 species. The model did well at predicting downwind plume placement. The model shows similar patterns of an increasing fraction of PM2.5 sulfate ion to the sum of SO2 and PM2.5 sulfate ion by distance from the source compared with ambient based estimates. The model was less consistent in capturing downwind ambient based trends in conversion of NOX to NOY from these sources. Source sensitivity approaches capture near-source O3 titration by fresh NO emissions, in particular subgrid plume treatment. However, capturing this near-source chemical feature did not translate into better downwind peak estimates of single source O3 impacts. The model estimated O3 production from these sources but often was lower than ambient based source production. The downwind transect ambient measurements, in particular secondary PM2.5 and O3, have some level of contribution from other sources which makes direct comparison with model source contribution challenging. Model source attribution results suggest contribution to secondary pollutants from multiple sources even where primary pollutants indicate the presence of a single source. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, deci

  15. Mathematical Models for Camouflage Pattern Assessment

    Science.gov (United States)

    2013-04-01

    Centro de Modelamiento Matemático, Facultad de Ciencias Físicas y Matemáticas, University of Chile (http://www.cmm.uchile.cl). DISTRIBUTION A: Distribution approved for public release. Final Report: Camouflage Assessment, January 2013. Abstract: The main…

  16. Structure ignition assessment model (SIAM)

    Science.gov (United States)

    Jack D. Cohen

    1995-01-01

    Major wildland/urban interface fire losses, principally residences, continue to occur. Although the problem is not new, the specific mechanisms are not well known on how structures ignite in association with wildland fires. In response to the need for a better understanding of wildland/urban interface ignition mechanisms and a method of assessing the ignition risk,...

  17. Auditory modelling for assessing room acoustics

    NARCIS (Netherlands)

    Van Dorp Schuitman, J.

    2011-01-01

    The acoustics of a concert hall, or any other room, are generally assessed by measuring room impulse responses for one or multiple source and receiver location(s). From these responses, objective parameters can be determined that should be related to various perceptual attributes of room acoustics.

  18. Specialty Payment Model Opportunities and Assessment

    Science.gov (United States)

    Mulcahy, Andrew W.; Chan, Chris; Hirshman, Samuel; Huckfeldt, Peter J.; Kofner, Aaron; Liu, Jodi L.; Lovejoy, Susan L.; Popescu, Ioana; Timbie, Justin W.; Hussey, Peter S.

    2015-01-01

    Gastroenterology and cardiology services are common and costly among Medicare beneficiaries. Episode-based payment, which aims to create incentives for high-quality, low-cost care, has been identified as a promising alternative payment model. This article describes research related to the design of episode-based payment models for ambulatory gastroenterology and cardiology services for possible testing by the Center for Medicare and Medicaid Innovation at the Centers for Medicare and Medicaid Services (CMS). The authors analyzed Medicare claims data to describe the frequency and characteristics of gastroenterology and cardiology index procedures, the practices that delivered index procedures, and the patients that received index procedures. The results of these analyses can help inform CMS decisions about the definition of episodes in an episode-based payment model; payment adjustments for service setting, multiple procedures, or other factors; and eligibility for the payment model. PMID:28083363

  19. Higher Education Quality Assessment Model: Towards Achieving Educational Quality Standard

    Science.gov (United States)

    Noaman, Amin Y.; Ragab, Abdul Hamid M.; Madbouly, Ayman I.; Khedra, Ahmed M.; Fayoumi, Ayman G.

    2017-01-01

    This paper presents a developed higher education quality assessment model (HEQAM) that can be applied for enhancement of university services. This is because there is no universal unified quality standard model that can be used to assess the quality criteria of higher education institutes. The analytical hierarchy process is used to identify the…

  20. Assessing The Performance of Hydrological Models

    Science.gov (United States)

    van der Knijff, Johan

    The performance of hydrological models is often characterized using the coefficient of efficiency, E. The sensitivity of E to extreme streamflow values, and the difficulty of deciding what value of E should be used as a threshold to identify 'good' models or model parameterizations, have proven to be serious shortcomings of this index. This paper reviews some alternative performance indices that have appeared in the literature. Legates and McCabe (1999) suggested a more generalized form of E, E'(j,B). Here, j is a parameter that controls how much emphasis is put on extreme streamflow values, and B defines a benchmark or 'null hypothesis' against which the results of the model are tested. E'(j,B) was used to evaluate a large number of parameterizations of a conceptual rainfall-runoff model, using 6 different combinations of j and B. First, the effect of j and B is explained. Second, it is demonstrated how the index can be used to explicitly test hypotheses about the model and the data. This approach appears to be particularly attractive if the index is used as a likelihood measure within a GLUE-type analysis.
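
    As a concrete illustration of the indices discussed above, the sketch below computes the classical coefficient of efficiency and the generalised E'(j,B), in which j controls the emphasis on extreme flows and B is a benchmark series; the variable names and sample series are mine, and with j = 2 and the observed mean as benchmark the generalised form reduces to the usual E.

    ```python
    # Coefficient of efficiency E and the generalised Legates-McCabe form E'(j, B).
    # With j = 2 and the observed mean as benchmark, the function returns the
    # classical Nash-Sutcliffe E.
    import numpy as np

    def efficiency(obs, sim, j=2.0, benchmark=None):
        obs, sim = np.asarray(obs, float), np.asarray(sim, float)
        if benchmark is None:
            benchmark = np.full_like(obs, obs.mean())   # 'null hypothesis' model
        num = np.sum(np.abs(obs - sim) ** j)
        den = np.sum(np.abs(obs - benchmark) ** j)
        return 1.0 - num / den

    obs = [1.2, 3.4, 10.5, 2.2, 0.9, 5.1]   # illustrative streamflow series
    sim = [1.0, 3.0, 12.0, 2.5, 1.1, 4.8]

    print("E  (j=2, mean benchmark):", round(efficiency(obs, sim), 3))
    print("E' (j=1, mean benchmark):", round(efficiency(obs, sim, j=1.0), 3))
    ```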

  1. A normative model for assessing competitive strategy

    OpenAIRE

    Ungerer, Gerard David; Cayzer, Steve

    2016-01-01

    The hyper-competitive nature of e-business has raised the need for a generic way to appraise the merit of a developed business strategy. Although progress has been made in the domain of strategy evaluation, the established literature differs over the ‘tests’ that a strategy must pass to be considered well-constructed. This paper therefore investigates the existing strategy-evaluation literature to propose a more integrated and comprehensive normative strategic assessment that can be used to e...

  2. Uncertainties in environmental radiological assessment models and their implications

    International Nuclear Information System (INIS)

    Hoffman, F.O.; Miller, C.W.

    1983-01-01

    Environmental radiological assessments rely heavily on the use of mathematical models. The predictions of these models are inherently uncertain because these models are inexact representations of real systems. The major sources of this uncertainty are related to biases in model formulation and parameter estimation. The best approach for estimating the actual extent of over- or underprediction is model validation, a procedure that requires testing over the range of the intended realm of model application. Other approaches discussed are the use of screening procedures, sensitivity and stochastic analyses, and model comparison. The magnitude of uncertainty in model predictions is a function of the questions asked of the model and the specific radionuclides and exposure pathways of dominant importance. Estimates are made of the relative magnitude of uncertainty for situations requiring predictions of individual and collective risks for both chronic and acute releases of radionuclides. It is concluded that models developed as research tools should be distinguished from models developed for assessment applications. Furthermore, increased model complexity does not necessarily guarantee increased accuracy. To improve the realism of assessment modeling, stochastic procedures are recommended that translate uncertain parameter estimates into a distribution of predicted values. These procedures also permit the importance of model parameters to be ranked according to their relative contribution to the overall predicted uncertainty. Although confidence in model predictions can be improved through site-specific parameter estimation and increased model validation, risk factors and internal dosimetry models will probably remain important contributors to the amount of uncertainty that is irreducible
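
    The stochastic procedures recommended above are typically implemented as Monte Carlo propagation of parameter distributions through the assessment model; the sketch below illustrates the idea on a toy two-parameter dose calculation (the model form, distributions and values are illustrative, not taken from the article).

    ```python
    # Monte Carlo propagation of parameter uncertainty through a toy dose model:
    # dose = intake * dose_coefficient * 365. Both parameters are given lognormal
    # distributions (illustrative values only); the output distribution is
    # summarised by percentiles and a crude rank-correlation importance ranking.
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000

    intake = rng.lognormal(mean=np.log(1.0), sigma=0.5, size=n)       # Bq/day
    dose_coeff = rng.lognormal(mean=np.log(2e-8), sigma=0.8, size=n)  # Sv/Bq

    dose = intake * dose_coeff * 365.0                                # Sv/year

    p5, p50, p95 = np.percentile(dose, [5, 50, 95])
    print(f"median {p50:.2e} Sv/y, 90% interval [{p5:.2e}, {p95:.2e}] Sv/y")

    def rank_corr(a, b):
        """Spearman-type rank correlation used to rank parameter importance."""
        ra = np.argsort(np.argsort(a))
        rb = np.argsort(np.argsort(b))
        return np.corrcoef(ra, rb)[0, 1]

    print("importance of intake:    ", round(rank_corr(intake, dose), 2))
    print("importance of dose coeff:", round(rank_corr(dose_coeff, dose), 2))
    ```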

  3. Assessing physical models used in nuclear aerosol transport models

    International Nuclear Information System (INIS)

    McDonald, B.H.

    1987-01-01

    Computer codes used to predict the behaviour of aerosols in water-cooled reactor containment buildings after severe accidents contain a variety of physical models. Special models are in place for describing agglomeration processes where small aerosol particles combine to form larger ones. Other models are used to calculate the rates at which aerosol particles are deposited on building structures. Condensation of steam on aerosol particles is currently a very active area in aerosol modelling. In this paper, the physical models incorporated in the current available international codes for all of these processes are reviewed and documented. There is considerable variation in models used in different codes, and some uncertainties exist as to which models are superior. 28 refs

  4. Review of early assessment models of innovative medical technologies.

    Science.gov (United States)

    Fasterholdt, Iben; Krahn, Murray; Kidholm, Kristian; Yderstræde, Knud Bonnet; Pedersen, Kjeld Møller

    2017-08-01

    Hospitals increasingly make decisions regarding the early development of and investment in technologies, but a formal evaluation model for assisting hospitals early on in assessing the potential of innovative medical technologies is lacking. This article provides an overview of models for early assessment in different health organisations and discusses which models hold most promise for hospital decision makers. A scoping review of published studies between 1996 and 2015 was performed using nine databases. The following information was collected: decision context, decision problem, and a description of the early assessment model. 2362 articles were identified and 12 studies fulfilled the inclusion criteria. An additional 12 studies were identified and included in the review by searching reference lists. The majority of the 24 early assessment studies were variants of traditional cost-effectiveness analysis. Around one fourth of the studies presented an evaluation model with a broader focus than cost-effectiveness. Uncertainty was mostly handled by simple sensitivity or scenario analysis. This review shows that evaluation models using known methods for assessing cost-effectiveness are most prevalent in early assessment but seem ill-suited for early assessment in hospitals. Four models provided some usable elements for the development of a hospital-based model.

  5. Regional Models for Sediment Toxicity Assessment

    Science.gov (United States)

    This paper investigates the use of empirical models to predict the toxicity of sediment samples within a region to laboratory test organisms based on sediment chemistry. In earlier work, we used a large nationwide database of matching sediment chemistry and marine amphipod sedim...

  6. Persistence Modeling for Assessing Marketing Strategy Performance

    NARCIS (Netherlands)

    M.G. Dekimpe (Marnik); D.M. Hanssens (Dominique)

    2003-01-01

    The question of long-run market response lies at the heart of any marketing strategy that tries to create a sustainable competitive advantage for the firm or brand. A key challenge, however, is that only short-run results of marketing actions are readily observable. Persistence modeling…

  7. The air emissions risk assessment model (AERAM)

    International Nuclear Information System (INIS)

    Gratt, L.B.

    1991-01-01

    AERAM is an environmental analysis and power generation station investment decision support tool. AERAM calculates the public health risk (in terms of lifetime cancers) in the nearby population from pollutants released into the air. AERAM consists of four main subroutines: Emissions, Air, Exposure and Risk. The Emissions subroutine uses power plant parameters to calculate the expected release of the pollutants; coal-fired and oil-fired power plant models are currently available, and a gas-fired plant model is under preparation. The release of the pollutants into the air is followed by their dispersal in the environment. The dispersion in the Air subroutine uses the Environmental Protection Agency's model, Industrial Source Complex - Long Term. Additional dispersion models (Industrial Source Complex - Short Term and Cooling Tower Drift) are being implemented for future AERAM versions. The Exposure subroutine uses the ambient concentrations to compute population exposures for the pollutants of concern. The exposures are used with the corresponding dose-response models in the Risk subroutine to estimate both the total population risk and individual risk. The risk at the receptor-population centroid with the maximum concentration is also calculated for regulatory purposes. In addition, automated interfaces with AirTox (an air risk decision model) have been implemented to extend AERAM's steady-state single solution to the decision-under-uncertainty domain. AERAM has been used to assess public health risks, to support investment decisions on additional pollution control systems based on health risk reductions, and to analyse the economics of fuel versus health risk tradeoffs. AERAM provides a state-of-the-art capability for evaluating the public health impact of airborne toxic substances in response to regulations and public concern.
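
    The Emissions-Air-Exposure-Risk chain described above can be illustrated with a highly simplified sketch: a textbook Gaussian plume formula stands in for the ISC Long Term dispersion step, and the emission rate, dispersion coefficients, population and unit-risk factor are invented for illustration rather than taken from AERAM.

    ```python
    # Toy emissions -> dispersion -> exposure -> risk chain in the spirit of the
    # AERAM subroutine structure. All numbers are illustrative, not AERAM inputs.
    import math

    def plume_concentration(Q, u, sigma_y, sigma_z, y, z, H):
        """Gaussian plume concentration (g/m^3) for a continuous point source."""
        lateral = math.exp(-y**2 / (2 * sigma_y**2))
        vertical = (math.exp(-(z - H)**2 / (2 * sigma_z**2)) +
                    math.exp(-(z + H)**2 / (2 * sigma_z**2)))
        return Q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

    # Emissions: hypothetical stack emitting 10 g/s of a pollutant
    Q, u, H = 10.0, 4.0, 50.0   # emission rate (g/s), wind speed (m/s), stack height (m)

    # Air: concentration at a receptor on the plume centreline, some km downwind
    conc_ugm3 = 1e6 * plume_concentration(Q, u, sigma_y=120.0, sigma_z=60.0,
                                          y=0.0, z=1.5, H=H)

    # Exposure and Risk: linear dose-response with an invented unit risk factor
    population = 10_000
    unit_risk = 1e-6            # lifetime risk per ug/m^3 (hypothetical)
    individual_risk = conc_ugm3 * unit_risk
    expected_cases = individual_risk * population

    print(f"concentration {conc_ugm3:.1f} ug/m3, "
          f"individual lifetime risk {individual_risk:.1e}, "
          f"expected cases {expected_cases:.2f}")
    ```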

  8. A normative model for assessing competitive strategy

    Directory of Open Access Journals (Sweden)

    Ungerer, Gerard David

    2016-12-01

    The hyper-competitive nature of e-business has raised the need for a generic way to appraise the merit of a developed business strategy. Although progress has been made in the domain of strategy evaluation, the established literature differs over the ‘tests’ that a strategy must pass to be considered well-constructed. This paper therefore investigates the existing strategy-evaluation literature to propose a more integrated and comprehensive normative strategic assessment that can be used to evaluate and refine a business’s competitive strategy, adding to its robustness and survivability.

  9. Sustainability Assessment Model in Product Development

    Science.gov (United States)

    Turan, Faiz Mohd; Johan, Kartina; Nor, Nik Hisyamudin Muhd; Omar, Badrul

    2017-08-01

    Faster and more efficient development of innovative and sustainable products has become the focus for manufacturing companies seeking to remain competitive in today’s technologically driven world. Design concept evaluation, which concludes the conceptual design stage, is one of the most critical decision points: it relates to the final success of product development, because poor criteria assessment during design concept evaluation can rarely be compensated for at later stages. Furthermore, consumers, investors, shareholders and even competitors base their decisions on what to buy or invest in, and from whom, on what companies report, and sustainability is a critical component of such reporting. In this research, a new methodology for sustainability assessment in product development for Malaysian industry has been developed by integrating green project management, a new “weighting criteria” scale and Rough-Grey Analysis. This method will help design engineers to improve the effectiveness and objectivity of sustainable design concept evaluation, enable them to make better-informed decisions before finalising their choice and, consequently, create value for the company or industry. The new framework is expected to provide an alternative to existing methods.

  10. Interactive Rapid Dose Assessment Model (IRDAM): user's guide

    International Nuclear Information System (INIS)

    Poeton, R.W.; Moeller, M.P.; Laughlin, G.J.; Desrosiers, A.E.

    1983-05-01

    As part of the continuing emphasis on emergency preparedness the US Nuclear Regulatory Commission (NRC) sponsored the development of a rapid dose assessment system by Pacific Northwest Laboratory (PNL). This system, the Interactive Rapid Dose Assessment Model (IRDAM) is a micro-computer based program for rapidly assessing the radiological impact of accidents at nuclear power plants. This User's Guide provides instruction in the setup and operation of the equipment necessary to run IRDAM. Instructions are also given on how to load the magnetic disks and access the interactive part of the program. Two other companion volumes to this one provide additional information on IRDAM. Reactor Accident Assessment Methods (NUREG/CR-3012, Volume 2) describes the technical bases for IRDAM including methods, models and assumptions used in calculations. Scenarios for Comparing Dose Assessment Models (NUREG/CR-3012, Volume 3) provides the results of calculations made by IRDAM and other models for specific accident scenarios

  11. ITER plasma safety interface models and assessments

    International Nuclear Information System (INIS)

    Uckan, N.A.; Bartels, H-W.; Honda, T.; Amano, T.; Boucher, D.; Post, D.; Wesley, J.

    1996-01-01

    Physics models and requirements to be used as a basis for safety analysis studies are developed and physics results motivated by safety considerations are presented for the ITER design. Physics specifications are provided for enveloping plasma dynamic events for Category I (operational event), Category II (likely event), and Category III (unlikely event). A safety analysis code SAFALY has been developed to investigate plasma anomaly events. The plasma response to ex-vessel component failure and machine response to plasma transients are considered

  12. Survivability Assessment: Modeling A Recovery Process

    OpenAIRE

    Paputungan, Irving Vitra; Abdullah, Azween

    2009-01-01

    Survivability is the ability of a system to continue operating, in a timely manner, in the presence of attacks, failures, or accidents. Recovery in survivability is the process by which a system heals or recovers from damage as early as possible so that it can fulfil its mission as conditions permit. In this paper, we show a preliminary recovery model to enhance system survivability. The model focuses on how we preserve the system and resume its critical services under attack as soon as possible. Keywords: surv…

  13. Enterprise Cloud Adoption - Cloud Maturity Assessment Model

    OpenAIRE

    Conway, Gerry; Doherty, Eileen; Carcary, Marian; Crowley, Catherine

    2017-01-01

    The introduction and use of cloud computing by an organization has the promise of significant benefits that include reduced costs, improved services, and a pay-per-use model. Organizations that successfully harness these benefits will potentially have a distinct competitive edge, due to their increased agility and flexibility to rapidly respond to an ever changing and complex business environment. However, as cloud technology is a relatively new ph...

  14. Assessing testamentary and decision-making capacity: Approaches and models.

    Science.gov (United States)

    Purser, Kelly; Rosenfeld, Tuly

    2015-09-01

    The need for better and more accurate assessments of testamentary and decision-making capacity grows as Australian society ages and incidences of mentally disabling conditions increase. Capacity is a legal determination, but one on which medical opinion is increasingly being sought. The difficulties inherent within capacity assessments are exacerbated by the ad hoc approaches adopted by legal and medical professionals based on individual knowledge and skill, as well as the numerous assessment paradigms that exist. This can negatively affect the quality of assessments, and results in confusion as to the best way to assess capacity. This article begins by assessing the nature of capacity. The most common general assessment models used in Australia are then discussed, as are the practical challenges associated with capacity assessment. The article concludes by suggesting a way forward to satisfactorily assess legal capacity given the significant ramifications of getting it wrong.

  15. Performability assessment by model checking of Markov reward models

    NARCIS (Netherlands)

    Baier, Christel; Cloth, L.; Haverkort, Boudewijn R.H.M.; Hermanns, H.; Katoen, Joost P.

    2010-01-01

    This paper describes efficient procedures for model checking Markov reward models, that allow us to evaluate, among others, the performability of computer-communication systems. We present the logic CSRL (Continuous Stochastic Reward Logic) to specify performability measures. It provides flexibility
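
    As a minimal illustration of the Markov reward models referred to above (not of the CSRL model-checking algorithms themselves), the sketch below solves a two-state availability model for its steady-state distribution and computes the long-run reward rate; the failure/repair rates and reward structure are invented.

    ```python
    # Long-run expected reward rate of a tiny continuous-time Markov reward model
    # (a two-state up/down availability model). Rates and rewards are illustrative.
    import numpy as np

    # Generator matrix Q: state 0 = up, state 1 = down
    lam, mu = 0.01, 0.5              # failure rate, repair rate (per hour)
    Q = np.array([[-lam,  lam],
                  [  mu,  -mu]])

    reward = np.array([1.0, 0.0])    # reward rate: 1 while up, 0 while down

    # Steady state: pi Q = 0 with sum(pi) = 1, solved as an augmented linear system
    A = np.vstack([Q.T, np.ones(2)])
    b = np.array([0.0, 0.0, 1.0])
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)

    print("steady-state distribution:", np.round(pi, 4))
    print("long-run reward rate (availability):", round(float(pi @ reward), 4))
    ```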

  16. Model-Based Approaches for Teaching and Practicing Personality Assessment.

    Science.gov (United States)

    Blais, Mark A; Hopwood, Christopher J

    2017-01-01

    Psychological assessment is a complex professional skill. Competence in assessment requires an extensive knowledge of personality, neuropsychology, social behavior, and psychopathology, a background in psychometrics, familiarity with a range of multimethod tools, cognitive flexibility, skepticism, and interpersonal sensitivity. This complexity makes assessment a challenge to teach and learn, particularly as the investment of resources and time in assessment has waned in psychological training programs over the last few decades. In this article, we describe 3 conceptual models that can assist teaching and learning psychological assessments. The transtheoretical model of personality provides a personality systems-based framework for understanding how multimethod assessment data relate to major personality systems and can be combined to describe and explain complex human behavior. The quantitative psychopathology-personality trait model is an empirical model based on the hierarchical organization of individual differences. Application of this model can help students understand diagnostic comorbidity and symptom heterogeneity, focus on more meaningful high-order domains, and identify the most effective assessment tools for addressing a given question. The interpersonal situation model is rooted in interpersonal theory and can help students connect test data to here-and-now interactions with patients. We conclude by demonstrating the utility of these models using a case example.

  17. A Hierarchal Risk Assessment Model Using the Evidential Reasoning Rule

    Directory of Open Access Journals (Sweden)

    Xiaoxiao Ji

    2017-02-01

    This paper aims to develop a hierarchical risk assessment model using the newly-developed evidential reasoning (ER) rule, which constitutes a generic conjunctive probabilistic reasoning process. In this paper, we first provide a brief introduction to the basics of the ER rule and emphasise its strengths for representing and aggregating uncertain information from multiple experts and sources. Further, we discuss the key steps of developing the hierarchical risk assessment framework systematically, including (1) formulation of the risk assessment hierarchy; (2) representation of both qualitative and quantitative information; (3) elicitation of attribute weights and information reliabilities; (4) aggregation of assessment information using the ER rule; and (5) quantification and ranking of risks using a utility-based transformation. The proposed hierarchical risk assessment framework can potentially be applied to various complex and uncertain systems. A case study on the fire/explosion risk assessment of marine vessels demonstrates the applicability of the proposed risk assessment model.
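
    Step (5) above, turning an aggregated belief distribution into a scalar score for ranking, is the simplest part to show in code. The sketch below assumes a five-grade assessment scale with hypothetical grade utilities and belief degrees; it illustrates only the utility-based transformation, not the ER aggregation rule that would produce the beliefs.

    ```python
    # Utility-based transformation of an aggregated belief distribution into a
    # scalar risk score for ranking. Grades, utilities and belief degrees are
    # hypothetical; the ER aggregation that produces the beliefs is not shown.
    grades = ["negligible", "minor", "moderate", "major", "catastrophic"]
    utility = dict(zip(grades, [0.0, 0.25, 0.5, 0.75, 1.0]))

    # Aggregated belief degrees for two risks; any remainder to 1.0 is unassigned
    # belief and is reported as a min/max utility interval.
    risks = {
        "engine room fire": [0.05, 0.20, 0.40, 0.25, 0.05],
        "cargo explosion":  [0.10, 0.30, 0.30, 0.15, 0.05],
    }

    for name, beliefs in risks.items():
        unassigned = 1.0 - sum(beliefs)
        expected = sum(b * utility[g] for g, b in zip(grades, beliefs))
        u_min = expected + unassigned * utility[grades[0]]   # remainder on lowest grade
        u_max = expected + unassigned * utility[grades[-1]]  # remainder on highest grade
        print(f"{name:18s} risk utility in [{u_min:.3f}, {u_max:.3f}]")
    ```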

  18. Modelling fog in probabilistic consequence assessment

    International Nuclear Information System (INIS)

    Underwood, B.Y.

    1993-02-01

    Earlier work examined the potential influence of foggy weather conditions on the probabilistic assessment of the consequences of accidental releases of radioactive material to the atmosphere (PCA), in particular the impact of a fraction of the released aerosol becoming incorporated into droplets. A major uncertainty emerging from the initial scoping study concerned estimation of the fraction of the released material that would be taken up into droplets. An objective is to construct a method for handling, in a PCA context, the effect of fog on deposition, basing the method on the experience gained from prior investigations. There are two aspects to explicitly including the effect of fog in PCA: estimating the probability of occurrence of various types of foggy condition and calculating the impact on the conventional end-points of consequence assessment. For the first, a brief outline is given of the use of meteorological data by PCA computer codes, followed by a discussion of some routinely-recorded meteorological parameters that are pertinent to fog, such as the present-weather code and horizontal visibility. Four stylized scenarios are defined to cover a wide range of situations in which particle growth by uptake of water may have an important impact on deposition. A description is then given of the way in which routine meteorological data could be used to flag the presence of each of these conditions in the meteorological data file used by the PCA code. The approach developed to calculate the impact on deposition is pitched at a level of complexity appropriate to the PCA context; it reflects the physical constraints of the system and accounts for the specific characteristics of the released aerosol. (Author)

  19. A new assessment model and tool for pediatric nurse practitioners.

    Science.gov (United States)

    Burns, C

    1992-01-01

    This article presents a comprehensive assessment model for pediatric nurse practitioner (PNP) practice that integrates familiar elements of the classical medical history, Gordon's Functional Health Patterns, and developmental fields into one system. This model drives the diagnostic reasoning process toward consideration of a broad range of disease, daily living (nursing diagnosis), and developmental diagnoses, which represents PNP practice better than the medical model does.

  20. Uncertainty Assessment in Urban Storm Water Drainage Modelling

    DEFF Research Database (Denmark)

    Thorndahl, Søren

    The objective of this paper is to give an overall description of the author's PhD study concerning uncertainties in numerical urban storm water drainage models. Initially, an uncertainty localization and assessment of model inputs and parameters, as well as uncertainties caused by different model…

  1. The importance of trajectory modelling in accident consequence assessments

    International Nuclear Information System (INIS)

    Jones, J.A.; Williams, J.A.; Hill, M.D.

    1988-01-01

    Most atmospheric dispersion models used at present for probabilistic risk assessment (PRA) are linear: they take account of the wind speed but not of changes in wind direction after the first hour. A trajectory model therefore gives a more realistic description of the cloud's behaviour. However, the extra complexity means that the computing costs increase. This is an important factor for the MARIA code, which is intended to be run on computers of varying power. The numbers of early effects predicted by a linear model and a trajectory model in a probabilistic risk assessment were compared to see which model should be preferred. The trajectory model predicted about 25% fewer expected early deaths and 30% more people evacuated than the linear model. However, the trajectory model took about ten times longer to calculate its results. The choice between the two models may therefore depend on the speed of the computer available.

  2. Road Assessment Model and Pilot Application in China

    Directory of Open Access Journals (Sweden)

    Tiejun Zhang

    2014-01-01

    Risk assessment of roads is an effective approach for road agencies to determine safety improvement investments: it can increase the cost-effective returns in crash and injury reductions. To obtain a powerful Chinese risk assessment model, the Research Institute of Highway (RIOH) is developing the China Road Assessment Programme (ChinaRAP) model to characterise traffic crashes in China, in partnership with the International Road Assessment Programme (iRAP). The ChinaRAP model is based upon RIOH’s achievements and the iRAP models. This paper documents part of ChinaRAP’s research work, mainly including the RIOH model and its pilot application in a province in China.

  3. Modeling and assessing international climate financing

    Science.gov (United States)

    Wu, Jing; Tang, Lichun; Mohamed, Rayman; Zhu, Qianting; Wang, Zheng

    2016-06-01

    Climate financing is a key issue in current negotiations on climate protection. This study establishes a climate financing model based on a mechanism in which donor countries set up funds for climate financing and recipient countries use the funds exclusively for carbon emission reduction. The burden-sharing principles are based on GDP, historical emissions, and consumption-based emissions. Using this model, we develop and analyze a series of scenario simulations, including a financing program negotiated at the Cancun Climate Change Conference (2010) and several subsequent programs. Results show that sustained climate financing can help to combat global climate change. However, the Cancun Agreements are projected to result in a reduction of only 0.01°C in global warming by 2100 compared to the scenario without climate financing. Longer-term climate financing programs should be established to achieve more significant benefits. Our model and simulations also show that climate financing has economic benefits for developing countries. Developed countries will suffer a slight GDP loss in the early stages of climate financing, but long-term economic growth and the eventual benefits of climate mitigation will compensate for this slight loss. Different burden-sharing principles have very similar effects on global temperature change and on the economic growth of recipient countries, but they do result in differences in GDP changes for Japan and the FSU. The GDP-based principle results in a larger share of the financial burden for Japan, while the historical emissions-based principle results in a larger share of the financial burden for the FSU. A larger burden share leads to a greater GDP loss.

  4. Computational model for the assessment of oil spill damages

    Energy Technology Data Exchange (ETDEWEB)

    Seip, K L; Heiberg, A B; Brekke, K A

    1985-06-01

    A description is given of the method and the required data of a model for calculating oil spill damages. Eleven damage attributes are defined: shore length contaminated, shore restitution time, birds dead, restitution time for three groups of birds, open-sea damages (two types), and damages to recreation, the economy and fisheries. The model has been applied in several oil pollution assessments: in an examination of alternative models for the organization of oil spill combat in Norway, in the assessment of the damages caused by a blowout at Tromsoeflaket, and in assessing a possible increase in oil spill preparedness for Svalbard. 56 references.

  5. Mathematical modeling in biology: A critical assessment

    Energy Technology Data Exchange (ETDEWEB)

    Buiatti, M. [Florence, Univ. (Italy). Dipt. di Biologia Animale e Genetica

    1998-01-01

    The molecular revolution and the development of biology-derived industry have led in the last fifty years to an unprecedented 'leap forward' of the life sciences in terms of experimental data. Less success has been achieved in the organisation of such data and in the consequent development of adequate explanatory and predictive theories and models. After a brief historical excursus, inborn difficulties of the mathematisation of biological objects and processes, deriving from the complex dynamics of life, are discussed along with the logical tools (simplifications, choice of observation points, etc.) used to overcome them. 'Autistic', monodisciplinary attitudes towards biological modeling among mathematicians, physicists and biologists, aimed in each case at using the tools of other disciplines to solve 'selfish' problems, are also taken into account, and a warning is given against the derived dangers (reification of monodisciplinary metaphors, lack of falsification, etc.). Finally, 'top-down' (deductive) and 'bottom-up' (inductive) heuristic interactive approaches to mathematisation are critically discussed with the help of a series of examples.

  6. Mathematical modeling in biology: A critical assessment

    International Nuclear Information System (INIS)

    Buiatti, M.

    1998-01-01

    The molecular revolution and the development of biology-derived industry have led in the last fifty years to an unprecedented 'leap forward' of the life sciences in terms of experimental data. Less success has been achieved in the organisation of such data and in the consequent development of adequate explanatory and predictive theories and models. After a brief historical excursus, inborn difficulties of the mathematisation of biological objects and processes, deriving from the complex dynamics of life, are discussed along with the logical tools (simplifications, choice of observation points, etc.) used to overcome them. 'Autistic', monodisciplinary attitudes towards biological modeling among mathematicians, physicists and biologists, aimed in each case at using the tools of other disciplines to solve 'selfish' problems, are also taken into account, and a warning is given against the derived dangers (reification of monodisciplinary metaphors, lack of falsification, etc.). Finally, 'top-down' (deductive) and 'bottom-up' (inductive) heuristic interactive approaches to mathematisation are critically discussed with the help of a series of examples.

  7. Assessing work disability for social security benefits: international models for the direct assessment of work capacity.

    Science.gov (United States)

    Geiger, Ben Baumberg; Garthwaite, Kayleigh; Warren, Jon; Bambra, Clare

    2017-08-25

    It has been argued that social security disability assessments should directly assess claimants' work capacity, rather than relying on proxies such as functioning. However, there is little academic discussion of how such assessments could be conducted. The article presents an account of different models of direct disability assessment based on case studies of the Netherlands, Germany, Denmark, Norway, the United States of America, Canada, Australia, and New Zealand, utilising over 150 documents and 40 expert interviews. Three models of direct work disability assessment can be observed: (i) structured assessment, which measures the functional demands of jobs across the national economy and compares these to claimants' functional capacities; (ii) demonstrated assessment, which looks at claimants' actual experiences in the labour market and infers a lack of work capacity from the failure of a concerted rehabilitation attempt; and (iii) expert assessment, based on the judgement of skilled professionals. Direct disability assessment within social security is not just theoretically desirable, but can be implemented in practice. We have shown that there are three distinct ways that this can be done, each with different strengths and weaknesses. Further research is needed to clarify the costs, validity/legitimacy, and consequences of these different models. Implications for rehabilitation: It has recently been argued that social security disability assessments should directly assess work capacity rather than simply assessing functioning, but we have no understanding of how this can be done in practice. Based on case studies of nine countries, we show that direct disability assessment can be implemented, and argue that there are three different ways of doing it. These are "demonstrated assessment" (using claimants' experiences in the labour market), "structured assessment" (matching functional requirements to workplace demands), and "expert assessment" (the

  8. Pipeline modeling and assessment in unstable slopes

    Energy Technology Data Exchange (ETDEWEB)

    Caceres, Carlos Nieves [Oleoducto Central S.A., Bogota, Cundinamarca (Colombia); Ordonez, Mauricio Pereira [SOLSIN S.A.S, Bogota, Cundinamarca (Colombia)

    2010-07-01

    The OCENSA pipeline system is vulnerable to geotechnical problems such as faults, landslides or creeping slopes, which are well-known in the Andes Mountains and tropical countries like Colombia. This paper proposes a methodology to evaluate the pipe behaviour during the soil displacements of slow landslides. Three different cases of analysis are examined, according to site characteristics. The process starts with a simplified analytical model and develops into 3D finite element numerical simulations applied to the on-site geometry of soil and pipe. Case 1 should be used when the unstable site is subject to landslides impacting significant lengths of pipeline, pipeline is straight, and landslide is simple from the geotechnical perspective. Case 2 should be used when pipeline is straight and landslide is complex (creeping slopes and non-conventional stabilization solutions). Case 3 should be used if the pipeline presents vertical or horizontal bends.

  9. A Multi-Actor Dynamic Integrated Assessment Model (MADIAM)

    OpenAIRE

    Weber, Michael

    2004-01-01

    The interactions between climate and the socio-economic system are investigated with a Multi-Actor Dynamic Integrated Assessment Model (MADIAM) obtained by coupling a nonlinear impulse response model of the climate sub-system (NICCS) to a multi-actor dynamic economic model (MADEM). The main goal is to initiate a model development that is able to treat the dynamics of the coupled climate socio-economic system, including endogenous technological change, in a non-equilibrium situation, thereby o...

  10. Conceptual adsorption models and open issues pertaining to performance assessment

    International Nuclear Information System (INIS)

    Serne, R.J.

    1992-01-01

    Recently several articles have been published that question the appropriateness of the distribution coefficient, Rd, concept to quantify radionuclide migration. Several distinct issues surround the modeling of nuclide retardation. The first section defines adsorption terminology and discusses various adsorption processes. The next section describes five commonly used adsorption conceptual models, specifically emphasizing what attributes that affect adsorption are explicitly accommodated in each model. I also review efforts to incorporate each adsorption model into performance assessment transport computer codes. The five adsorption conceptual models are (1) the constant Rd model, (2) the parametric Rd model, (3) isotherm adsorption models, (4) mass action adsorption models, and (5) surface-complexation with electrostatics models. The final section discusses the adequacy of the distribution ratio concept, the adequacy of transport calculations that rely on constant retardation factors and the status of incorporating sophisticated adsorption models into transport codes. 86 refs., 1 fig., 1 tab

  11. Conceptual adsorption models and open issues pertaining to performance assessment

    International Nuclear Information System (INIS)

    Serne, R.J.

    1991-10-01

    Recently several articles have been published that question the appropriateness of the distribution coefficient, Rd, concept to quantify radionuclide migration. Several distinct issues are raised by various critics. In this paper I provide some perspective on issues surrounding the modeling of nuclide retardation. The first section defines adsorption terminology and discusses various adsorption processes. The next section describes five commonly used adsorption conceptual models, specifically emphasizing what attributes that affect adsorption are explicitly accommodated in each model. I also review efforts to incorporate each adsorption model into performance assessment transport computer codes. The five adsorption conceptual models are (1) the constant Rd model, (2) the parametric Rd model, (3) isotherm adsorption models, (4) mass-action adsorption models, and (5) surface-complexation with electrostatics models. The final section discusses the adequacy of the distribution ratio concept, the adequacy of transport calculations that rely on constant retardation factors and the status of incorporating sophisticated adsorption models into transport codes
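
    For context on the constant Rd model listed above, the sketch below computes the retardation factor R = 1 + (ρb/θ)·Kd that transport calculations with constant retardation typically use, and the corresponding retarded migration velocity; the bulk density, porosity, Kd and groundwater velocity are illustrative values, not from the report.

    ```python
    # Retardation factor for the constant-Rd (distribution coefficient) model:
    # R = 1 + (bulk_density / porosity) * Kd, and the resulting retarded velocity.
    def retardation_factor(kd_ml_per_g, bulk_density_g_cm3, porosity):
        return 1.0 + (bulk_density_g_cm3 / porosity) * kd_ml_per_g

    # Illustrative values (not from the report)
    kd = 5.0          # distribution coefficient, mL/g
    rho_b = 1.6       # bulk density, g/cm^3
    theta = 0.3       # porosity (-)
    v_water = 10.0    # groundwater velocity, m/yr

    R = retardation_factor(kd, rho_b, theta)
    print(f"R = {R:.1f}, radionuclide velocity = {v_water / R:.2f} m/yr")
    ```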

  12. Indoor Air Quality Building Education and Assessment Model

    Science.gov (United States)

    The Indoor Air Quality Building Education and Assessment Model (I-BEAM), released in 2002, is a guidance tool designed for use by building professionals and others interested in indoor air quality in commercial buildings.

  13. Indoor Air Quality Building Education and Assessment Model Forms

    Science.gov (United States)

    The Indoor Air Quality Building Education and Assessment Model (I-BEAM) is a guidance tool designed for use by building professionals and others interested in indoor air quality in commercial buildings.

  14. Assessment and development of implementation models of health ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Assessment and development of implementation models of health-related ...

  15. Route Assessment for Unmanned Aerial Vehicle Based on Cloud Model

    Directory of Open Access Journals (Sweden)

    Xixia Sun

    2014-01-01

    An integrated route assessment approach based on the cloud model is proposed in this paper, in which various sources of uncertainty are well kept and modeled by cloud theory. Firstly, a systemic criteria framework incorporating models for scoring subcriteria is developed. Then, the cloud model is introduced to represent linguistic variables, and the survivability probability histogram of each route is converted into normal clouds by cloud transformation, enabling both randomness and fuzziness in the assessment environment to be managed simultaneously. Finally, a new way to measure the similarity between two normal clouds, satisfying reflexivity, symmetry, transitivity, and overlapping, is proposed. Experimental results demonstrate that the proposed route assessment approach outperforms a fuzzy-logic-based assessment approach with regard to feasibility, reliability, and consistency with human thinking.
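
    To make the normal cloud representation concrete, here is a sketch of the standard forward normal cloud generator, which turns the numerical characteristics (Ex, En, He) of a linguistic variable into cloud drops with membership degrees; the parameter values are illustrative and the paper's similarity measure is not reproduced.

    ```python
    # Forward normal cloud generator: from (Ex, En, He) produce cloud drops
    # (x_i, mu_i). Ex = expectation, En = entropy, He = hyper-entropy.
    # Parameter values are illustrative.
    import numpy as np

    def normal_cloud(Ex, En, He, n_drops=1000, seed=0):
        rng = np.random.default_rng(seed)
        En_prime = rng.normal(En, He, n_drops)               # per-drop entropy
        En_prime = np.abs(En_prime) + 1e-12                  # keep positive
        x = rng.normal(Ex, En_prime)                         # drop positions
        mu = np.exp(-(x - Ex) ** 2 / (2 * En_prime ** 2))    # membership degrees
        return x, mu

    # A linguistic grade such as "high survivability" might be encoded as:
    x, mu = normal_cloud(Ex=0.8, En=0.05, He=0.01)
    print("mean drop position:", round(float(x.mean()), 3),
          "mean membership:", round(float(mu.mean()), 3))
    ```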

  16. Model of environmental life cycle assessment for coal mining operations.

    Science.gov (United States)

    Burchart-Korol, Dorota; Fugiel, Agata; Czaplicka-Kolarz, Krystyna; Turek, Marian

    2016-08-15

    This paper presents a novel approach to the environmental assessment of coal mining operations, which enables assessment of the factors that are both directly and indirectly affecting the environment and are associated with the production of raw materials and energy used in processes. The primary novelty of the paper is the development of a computational environmental life cycle assessment (LCA) model for coal mining operations and the application of the model to coal mining operations in Poland. The LCA model enables the assessment of environmental indicators for all identified unit processes in hard coal mines with the life cycle approach. The proposed model enables the assessment of greenhouse gas emissions (GHGs) based on the IPCC method and the assessment of damage categories, such as human health, ecosystems and resources, based on the ReCiPe method. The model enables the assessment of GHGs for hard coal mining operations in three time frames: 20, 100 and 500 years. The model was used to evaluate the coal mines in Poland. It was demonstrated that the largest environmental impacts in the damage categories were associated with the use of fossil fuels, methane emissions and the use of electricity, processing of wastes, heat, and steel supports. It was concluded that an environmental assessment of coal mining operations, apart from the direct influence of processing waste, methane emissions and drainage water, should include the use of electricity, heat and steel, particularly for steel supports. Because the model allows the comparison of environmental impact assessments for various unit processes, it can be used for all hard coal mines, not only in Poland but also worldwide. This development is an important step forward in the study of the impacts of fossil fuels on the environment, with the potential to mitigate the impact of the coal industry on the environment.
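
    The greenhouse-gas part of the assessment amounts to multiplying inventoried emissions by global warming potentials for the chosen time horizon; the sketch below uses GWP factors taken approximately from IPCC reports and made-up emission quantities, not the Polish mine inventory.

    ```python
    # GHG accounting as CO2-equivalents for three time horizons. GWP factors are
    # illustrative, approximately from IPCC reports (AR5 for 20/100 yr, AR4 for
    # 500 yr); the emission quantities are made up, not the mine inventory.
    GWP = {
        20:  {"CO2": 1.0, "CH4": 84.0, "N2O": 264.0},
        100: {"CO2": 1.0, "CH4": 28.0, "N2O": 265.0},
        500: {"CO2": 1.0, "CH4": 7.6,  "N2O": 153.0},
    }

    emissions_t = {"CO2": 120_000.0, "CH4": 3_500.0, "N2O": 12.0}  # tonnes per year

    for horizon, factors in GWP.items():
        co2e = sum(q * factors[gas] for gas, q in emissions_t.items())
        print(f"GWP-{horizon}: {co2e:,.0f} t CO2-eq per year")
    ```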

  17. Predicting the ungauged basin: Model validation and realism assessment

    Directory of Open Access Journals (Sweden)

    Tim van Emmerik

    2015-10-01

    The hydrological decade on Predictions in Ungauged Basins (PUB) led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Due to the limited number of published studies on genuinely ungauged basins, model validation and realism assessment of model outcomes have not been discussed to a great extent. With this paper we aim to contribute to the discussion on how one can determine the value and validity of a hydrological model developed for an ungauged basin. As in many cases no local, or even regional, data are available, alternative methods should be applied. Using a PUB case study in a genuinely ungauged basin in southern Cambodia, we give several examples of how one can use different types of soft data to improve model design, calibrate and validate the model, and assess the realism of the model output. A rainfall-runoff model was coupled to an irrigation reservoir, allowing the use of additional and unconventional data. The model was mainly forced with remote sensing data, and local knowledge was used to constrain the parameters. Model realism assessment was done using data from surveys. This resulted in a successful reconstruction of the reservoir dynamics, and revealed the different hydrological characteristics of the two topographical classes. This paper does not present a generic approach that can be transferred to other ungauged catchments, but it aims to show how clever model design and alternative data acquisition can result in a valuable hydrological model for an ungauged catchment.

  18. ASSESSING INDIVIDUAL PERFORMANCE ON INFORMATION TECHNOLOGY ADOPTION: A NEW MODEL

    OpenAIRE

    Diah Hari Suryaningrum

    2012-01-01

    This paper aims to propose a new model for assessing individual performance in information technology adoption. The new model was derived from two different theories: the decomposed theory of planned behavior and task-technology fit theory. Although many researchers have tried to expand these theories, some of their efforts might lack theoretical assumptions. To overcome this problem and enhance the coherence of the integration, I used a theory from social scien…

  19. Adaptation in integrated assessment modeling: where do we stand?

    OpenAIRE

    Patt, A.; van Vuuren, D.P.; Berkhout, F.G.H.; Aaheim, A.; Hof, A.F.; Isaac, M.; Mechler, R.

    2010-01-01

    Adaptation is an important element on the climate change policy agenda. Integrated assessment models, which are key tools to assess climate change policies, have begun to address adaptation, either by including it implicitly in damage cost estimates, or by making it an explicit control variable. We analyze how modelers have chosen to describe adaptation within an integrated framework, and suggest many ways they could improve the treatment of adaptation by considering more of its bottom-up cha...

  20. Semantic modeling of portfolio assessment in e-learning environment

    Directory of Open Access Journals (Sweden)

    Lucila Romero

    2017-01-01

    In a learning environment, a portfolio is used as a tool to keep track of a learner’s progress. Particularly in e-learning, continuous assessment allows greater customization and efficiency in the learning process and prevents students from losing interest in their studies. Moreover, each student has his or her own characteristics and learning skills that must be taken into account in order to keep the learner’s interest, so personalized monitoring is the key to guaranteeing the success of technology-based education. In this context, portfolio assessment emerges as the solution, because it is an easy way to allow teachers to organize and personalize assessment according to students’ characteristics and needs. A portfolio assessment can contain various types of assessment, such as formative assessment, summative assessment, and hetero- or self-assessment, and can use different instruments such as multiple-choice questions, conceptual maps, and essays, among others. A portfolio assessment thus represents a compilation of all the assessments a student must complete in a course; it documents progress and sets targets. In previous work, a conceptual framework was proposed consisting of an ontology network named AOnet, a semantic tool conceptualizing different types of assessments. Continuing that work, this paper presents a proposal to implement portfolio assessment in e-learning environments. The proposal consists of a semantic model that describes the key components and relations of this domain, to set the bases for developing a tool to generate, manage and perform portfolio assessments.

  1. Adaptation in integrated assessment modeling: where do we stand?

    NARCIS (Netherlands)

    Patt, A.; van Vuuren, D.P.; Berkhout, F.G.H.; Aaheim, A.; Hof, A.F.; Isaac, M.; Mechler, R.

    2010-01-01

    Adaptation is an important element on the climate change policy agenda. Integrated assessment models, which are key tools to assess climate change policies, have begun to address adaptation, either by including it implicitly in damage cost estimates, or by making it an explicit control variable. We

  2. Performance and Cognitive Assessment in 3-D Modeling

    Science.gov (United States)

    Fahrer, Nolan E.; Ernst, Jeremy V.; Branoff, Theodore J.; Clark, Aaron C.

    2011-01-01

    The purpose of this study was to investigate identifiable differences between performance and cognitive assessment scores in a 3-D modeling unit of an engineering drafting course curriculum. The study aimed to provide further investigation of the need of skill-based assessments in engineering/technical graphics courses to potentially increase…

  3. Advanced REACH tool: A Bayesian model for occupational exposure assessment

    NARCIS (Netherlands)

    McNally, K.; Warren, N.; Fransman, W.; Entink, R.K.; Schinkel, J.; Van Tongeren, M.; Cherrie, J.W.; Kromhout, H.; Schneider, T.; Tielemans, E.

    2014-01-01

    This paper describes a Bayesian model for the assessment of inhalation exposures in an occupational setting; the methodology underpins a freely available web-based application for exposure assessment, the Advanced REACH Tool (ART). The ART is a higher tier exposure tool that combines disparate
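
    The ART combines a mechanistic exposure model with measurement data in a Bayesian framework; as a loose illustration of that idea (not the ART algorithm itself), the sketch below updates a lognormal prior on an exposure concentration with a few measurements, using a conjugate normal model on the log scale. All values are invented.

    ```python
    # Bayesian updating of a lognormal exposure estimate with measurements,
    # working on the log scale with a conjugate normal prior on the log geometric
    # mean. This is only an illustration of the idea, not the ART methodology.
    import numpy as np

    # Prior from a (hypothetical) mechanistic model: GM = 0.5 mg/m3, GSD = 2.5
    prior_mu = np.log(0.5)
    prior_sd = np.log(2.5)

    # Hypothetical personal measurements (mg/m3), assumed lognormal with GSD = 2.0
    measurements = np.array([0.8, 1.1, 0.6, 0.9])
    like_sd = np.log(2.0)
    y = np.log(measurements)

    # Conjugate normal update for the mean of a normal with known variance
    post_var = 1.0 / (1.0 / prior_sd**2 + len(y) / like_sd**2)
    post_mu = post_var * (prior_mu / prior_sd**2 + y.sum() / like_sd**2)

    gm_post = np.exp(post_mu)
    ci = np.exp(post_mu + np.array([-1.96, 1.96]) * np.sqrt(post_var))
    print(f"posterior GM = {gm_post:.2f} mg/m3, "
          f"95% credible interval = [{ci[0]:.2f}, {ci[1]:.2f}] mg/m3")
    ```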

  4. Using models in Integrated Ecosystem Assessment of coastal areas

    Science.gov (United States)

    Solidoro, Cosimo; Bandelj, Vinko; Cossarini, Gianpiero; Melaku Canu, Donata; Libralato, Simone

    2014-05-01

    Numerical models can greatly contribute to the integrated ecological assessment of coastal and marine systems. Indeed, models can: (i) assist in the identification of efficient sampling strategies; (ii) provide spatial interpolation and temporal extrapolation of experimental data, based on the knowledge of process dynamics and causal relationships coded within the model; and (iii) provide estimates of hardly measurable indicators. Furthermore, models can provide indications of the potential effects of implementing alternative management policies. Finally, by providing a synthetic representation of an ideal system based on its essential dynamics, a model returns a picture of the ideal behaviour of a system in the absence of external perturbation, alteration and noise, which can help in the identification of reference behaviour. As an important example, model-based reanalyses of biogeochemical and ecological properties are an urgent need for estimating environmental status and assessing the efficacy of conservation and environmental policies, also with reference to the enforcement of the European MSFD. However, the use of numerical models, and particularly of ecological models, in environmental modelling and management is still far from being the rule, possibly because the benefits that a full integration of modelling and monitoring systems might provide are not fully realized, possibly because of a lack of trust in modelling results, or because many problems still exist in the development, validation and implementation of models. For instance, assessing the validity of model results is a complex process that requires the definition of appropriate indicators, metrics and methodologies, and faces the scarcity of real-time in-situ biogeochemical data. Furthermore, biogeochemical models typically consider dozens of variables which are heavily undersampled. Here we show how the integration of mathematical models and monitoring data can support integrated ecosystem…

  5. The Effect of Computer Models as Formative Assessment on Student Understanding of the Nature of Models

    Science.gov (United States)

    Park, Mihwa; Liu, Xiufeng; Smith, Erica; Waight, Noemi

    2017-01-01

    This study reports the effect of computer models as formative assessment on high school students' understanding of the nature of models. Nine high school teachers integrated computer models and associated formative assessments into their yearlong high school chemistry course. A pre-test and post-test of students' understanding of the nature of…

  6. Users guide to REGIONAL-1: a regional assessment model

    International Nuclear Information System (INIS)

    Davis, W.E.; Eadie, W.J.; Powell, D.C.

    1979-09-01

    A guide was prepared to allow a user to run the PNL long-range transport model, REGIONAL 1. REGIONAL 1 is a computer model set up to run atmospheric assessments on a regional basis. The model can be run in three modes for a single time period: (1) no deposition, (2) dry deposition, (3) wet and dry deposition. The guide provides the physical and mathematical basis used in the model for calculating transport, diffusion, and deposition in all three modes. The guide also includes a program listing with explanations, and an example in the form of a short-term assessment for 48 hours. The purpose of the example is to allow a person with past experience in programming and meteorology to operate the assessment model and compare their results with the guide results. This comparison assures the user that the program is operating properly.
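
    As a rough illustration of the three run modes, the sketch below depletes an airborne inventory using a dry deposition velocity and a wet scavenging coefficient; the exponential source-depletion form and all parameter values are assumptions chosen for illustration, not the actual REGIONAL 1 formulation.

      import numpy as np

      # Illustrative source-depletion sketch for the three run modes described above
      # (no deposition, dry deposition, wet and dry deposition). The exponential form
      # and the parameter values are assumptions, not the REGIONAL 1 internals.
      def airborne_fraction(travel_time_s, mode, vd=0.01, mixing_height=1000.0, scavenging=1.0e-4):
          """Fraction of the released material still airborne after travel_time_s."""
          rate = 0.0
          if mode in ("dry", "wet+dry"):
              rate += vd / mixing_height      # dry deposition velocity [m/s] over the mixed layer [m]
          if mode == "wet+dry":
              rate += scavenging              # wet scavenging coefficient [1/s]
          return np.exp(-rate * travel_time_s)

      for mode in ("none", "dry", "wet+dry"):
          print(mode, airborne_fraction(48 * 3600.0, mode))   # 48-hour assessment, as in the guide's example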

  7. The role of computer modelling in participatory integrated assessments

    International Nuclear Information System (INIS)

    Siebenhuener, Bernd; Barth, Volker

    2005-01-01

    In a number of recent research projects, computer models have been included in participatory procedures to assess global environmental change. The intention was to support knowledge production and to help the involved non-scientists to develop a deeper understanding of the interactions between natural and social systems. This paper analyses the experiences made in three projects with the use of computer models from a participatory and a risk management perspective. Our cross-cutting analysis of the objectives, the employed project designs and moderation schemes and the observed learning processes in participatory processes with model use shows that models play a mixed role in informing participants and stimulating discussions. However, no deeper reflection on values and belief systems could be achieved. In terms of the risk management phases, computer models serve best the purposes of problem definition and option assessment within participatory integrated assessment (PIA) processes

  8. Accident consequence assessments with different atmospheric dispersion models

    International Nuclear Information System (INIS)

    Panitz, H.J.

    1989-11-01

    An essential aim of the improvements of the new program system UFOMOD for Accident Consequence Assessments (ACAs) was to substitute the straight-line Gaussian plume model conventionally used in ACA models by more realistic atmospheric dispersion models. To identify improved models which can be applied in ACA codes and to quantify the implications of different dispersion models on the results of an ACA, probabilistic comparative calculations with different atmospheric dispersion models have been performed. The study showed that there are trajectory models available which can be applied in ACAs and that they provide more realistic results of ACAs than straight-line Gaussian models. This led to a completely novel concept of atmospheric dispersion modelling in which two different distance ranges of validity are distinguished: the near range of some ten kilometres distance and the adjacent far range which are assigned to respective trajectory models. (orig.) [de

  9. Specialty Payment Model Opportunities and Assessment: Oncology Model Design Report.

    Science.gov (United States)

    Huckfeldt, Peter J; Chan, Chris; Hirshman, Samuel; Kofner, Aaron; Liu, Jodi L; Mulcahy, Andrew W; Popescu, Ioana; Stevens, Clare; Timbie, Justin W; Hussey, Peter S

    2015-07-15

    This article describes research related to the design of a payment model for specialty oncology services for possible testing by the Center for Medicare and Medicaid Innovation at the Centers for Medicare & Medicaid Services (CMS). Cancer is a common and costly condition. Episode-based payment, which aims to create incentives for high-quality, low-cost care, has been identified as a promising alternative payment model for oncology care. Episode-based payment systems can provide flexibility to health care providers to select among the most effective and efficient treatment alternatives, including activities that are not currently reimbursed under Medicare payment policies. However, the model design also needs to ensure that high-quality care is delivered and that beneficial treatments are not withheld from patients. CMS asked MITRE and RAND to conduct analyses to inform design decisions related to an episode-based oncology model for Medicare beneficiaries undergoing chemotherapy treatment for cancer. In particular, this study focuses on analyses of Medicare claims data related to the definition of the initiation of an episode of chemotherapy, patterns of spending during and surrounding episodes of chemotherapy, and attribution of episodes of chemotherapy to physician practices. We found that the time between the primary cancer diagnosis and chemotherapy initiation varied widely across patients, ranging from one day to over seven years, with a median of 2.4 months. The average level of total monthly payments varied considerably across cancers, with the highest spending peak of $9,972 for lymphoma, and peaks of $3,109 for breast cancer and $2,135 for prostate cancer.

  10. A parsimonious dynamic model for river water quality assessment.

    Science.gov (United States)

    Mannina, Giorgio; Viviani, Gaspare

    2010-01-01

    Water quality modelling is of crucial importance for the assessment of physical, chemical, and biological changes in water bodies. Mathematical approaches to water modelling have become more prevalent over recent years. Different model types, ranging from detailed physical models to simplified conceptual models, are available. A possible middle ground between detailed and simplified models is the parsimonious model, which represents the simplest approach that fits the application. The appropriate modelling approach depends on the research goal as well as on the data available for correct model application. When data are inadequate, it is necessary to focus on a simple river water quality model rather than a detailed one. The study presents a parsimonious river water quality model to evaluate the propagation of pollutants in natural rivers. The model is made up of two sub-models: a quantity sub-model and a quality sub-model. The model employs a river schematisation that considers different stretches according to the geometric characteristics and the gradient of the river bed. Each stretch is represented with a conceptual model of a series of linear channels and reservoirs. The channels determine the delay in the pollution wave and the reservoirs cause its dispersion. To assess the river water quality, the model employs four state variables: DO, BOD, NH4, and NO. The model was applied to the Savena River (Italy), which is the focus of a European-financed project in which quantity and quality data were gathered. A sensitivity analysis of the model output with respect to the model inputs and parameters was carried out based on the Generalised Likelihood Uncertainty Estimation (GLUE) methodology. The results demonstrate the suitability of such a model as a tool for river water quality management.
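
    The routing idea (linear channels delaying the pollution wave, linear reservoirs dispersing it) can be sketched as below; the time step, rate constants and the added first-order decay term are illustrative assumptions, not the published model equations.

      import numpy as np

      # Minimal sketch of the linear channel + linear reservoir routing described above:
      # the channel delays the pollution wave, the reservoir disperses it, and an optional
      # first-order term mimics in-stream decay. All parameter values are illustrative.
      def route_stretch(load_in, dt, delay_steps, k_reservoir, k_decay=0.0):
          delayed = np.concatenate([np.zeros(delay_steps), load_in])[: len(load_in)]  # linear channel
          out = np.zeros_like(delayed)
          storage = 0.0
          for t in range(len(delayed)):
              storage += dt * (delayed[t] - (k_reservoir + k_decay) * storage)  # linear reservoir + decay
              out[t] = k_reservoir * storage
          return out

      bod_upstream = np.zeros(200)
      bod_upstream[10:20] = 5.0                                   # pulse load entering the stretch
      bod_downstream = route_stretch(bod_upstream, dt=0.1, delay_steps=15, k_reservoir=0.3, k_decay=0.02)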

  11. A Process Model for Assessing Adolescent Risk for Suicide.

    Science.gov (United States)

    Stoelb, Matt; Chiriboga, Jennifer

    1998-01-01

    This comprehensive assessment process model includes primary, secondary, and situational risk factors and their combined implications and significance in determining an adolescent's level of risk for suicide. Empirical data and clinical intuition are integrated to form a working client model that guides the professional in continuously reassessing…

  12. Predicting the ungauged basin : Model validation and realism assessment

    NARCIS (Netherlands)

    Van Emmerik, T.H.M.; Mulder, G.; Eilander, D.; Piet, M.; Savenije, H.H.G.

    2015-01-01

    The hydrological decade on Predictions in Ungauged Basins (PUB) led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Due to a limited amount of published studies on genuinely ungauged basins, model validation and realism assessment of

  13. EASETECH – A LCA model for assessment of environmental technologies

    DEFF Research Database (Denmark)

    Damgaard, Anders; Baumeister, Hubert; Astrup, Thomas Fruergaard

    2014-01-01

    EASETECH is a new model for the environmental assessment of environmental technologies developed in collaboration between DTU Environment and DTU Compute. EASETECH is based on experience gained in the field of waste management modelling over the last decade and applies the same concepts to systems...

  14. Goodness-of-Fit Assessment of Item Response Theory Models

    Science.gov (United States)

    Maydeu-Olivares, Alberto

    2013-01-01

    The article provides an overview of goodness-of-fit assessment methods for item response theory (IRT) models. It is now possible to obtain accurate "p"-values of the overall fit of the model if bivariate information statistics are used. Several alternative approaches are described. As the validity of inferences drawn on the fitted model…

  15. Review of early assessment models of innovative medical technologies

    DEFF Research Database (Denmark)

    Fasterholdt, Iben; Krahn, Murray D; Kidholm, Kristian

    2017-01-01

    INTRODUCTION: Hospitals increasingly make decisions regarding the early development of and investment in technologies, but a formal evaluation model for assisting hospitals early on in assessing the potential of innovative medical technologies is lacking. This article provides an overview of models...

  16. Predicting the ungauged basin: model validation and realism assessment

    NARCIS (Netherlands)

    van Emmerik, Tim; Mulder, Gert; Eilander, Dirk; Piet, Marijn; Savenije, Hubert

    2015-01-01

    The hydrological decade on Predictions in Ungauged Basins (PUB) led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Due to a limited amount of published studies on genuinely ungauged basins, model validation and realism assessment of

  17. Application of the cognitive therapy model to initial crisis assessment.

    Science.gov (United States)

    Calvert, Patricia; Palmer, Christine

    2003-03-01

    This article provides a background to the development of cognitive therapy and cognitive therapeutic skills with a specific focus on the treatment of a depressive episode. It discusses the utility of cognitive therapeutic strategies to the model of crisis theory and initial crisis assessment currently used by the Community Assessment & Treatment Team of Waitemata District Health Board on the North Shore of Auckland, New Zealand. A brief background to cognitive therapy is provided, followed by a comprehensive example of the use of the Socratic questioning method in guiding collaborative assessment and treatment of suicidality by nurses during the initial crisis assessment.

  18. Assessment of Teacher Perceived Skill in Classroom Assessment Practices Using IRT Models

    Science.gov (United States)

    Koloi-Keaikitse, Setlhomo

    2017-01-01

    The purpose of this study was to assess teacher perceived skill in classroom assessment practices. Data were collected from a sample of (N = 691) teachers selected from government primary, junior secondary, and senior secondary schools in Botswana. Item response theory models were used to identify teacher response on items that measured their…

  19. Model summary report for the safety assessment SR-Site

    International Nuclear Information System (INIS)

    Vahlund, Fredrik; Zetterstroem Evins, Lena; Lindgren, Maria

    2010-12-01

    This document is the model summary report for the safety assessment SR-Site. In the report, the quality assurance (QA) measures conducted for assessment codes are presented together with the chosen QA methodology. In the safety assessment project SR-Site, a large number of numerical models are used to analyse the system and to show compliance. In order to better understand how the different models interact and how information is transferred between them, Assessment Model Flowcharts (AMFs) are used. From these, the different modelling tasks and the computer codes used can be identified. As a large number of computer codes are used in the assessment, their complexity differs to a large extent; some of the codes are commercial while others were developed especially for the assessment at hand. QA requirements must on the one hand take this diversity into account and on the other hand be well defined. In the methodology section of the report the following requirements are defined for all codes: - It must be demonstrated that the code is suitable for its purpose. - It must be demonstrated that the code has been properly used. - It must be demonstrated that the code development process has followed appropriate procedures and that the code produces accurate results. - It must be described how data are transferred between the different computational tasks. Although the requirements are identical for all codes in the assessment, the measures used to show that the requirements are fulfilled will be different for different types of codes (for instance because for some software the source code is not available for review). Subsequent to the methodology section, each assessment code is presented together with a discussion of how the requirements are met.

  20. Model summary report for the safety assessment SR-Site

    Energy Technology Data Exchange (ETDEWEB)

    Vahlund, Fredrik; Zetterstroem Evins, Lena (Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden)); Lindgren, Maria (Kemakta Konsult AB, Stockholm (Sweden))

    2010-12-15

    This document is the model summary report for the safety assessment SR-Site. In the report, the quality assurance (QA) measures conducted for assessment codes are presented together with the chosen QA methodology. In the safety assessment project SR-Site, a large number of numerical models are used to analyse the system and to show compliance. In order to better understand how the different models interact and how information is transferred between them, Assessment Model Flowcharts (AMFs) are used. From these, the different modelling tasks and the computer codes used can be identified. As a large number of computer codes are used in the assessment, their complexity differs to a large extent; some of the codes are commercial while others were developed especially for the assessment at hand. QA requirements must on the one hand take this diversity into account and on the other hand be well defined. In the methodology section of the report the following requirements are defined for all codes: - It must be demonstrated that the code is suitable for its purpose. - It must be demonstrated that the code has been properly used. - It must be demonstrated that the code development process has followed appropriate procedures and that the code produces accurate results. - It must be described how data are transferred between the different computational tasks. Although the requirements are identical for all codes in the assessment, the measures used to show that the requirements are fulfilled will be different for different types of codes (for instance because for some software the source code is not available for review). Subsequent to the methodology section, each assessment code is presented together with a discussion of how the requirements are met.

  1. Sensitivity and uncertainty analyses for performance assessment modeling

    International Nuclear Information System (INIS)

    Doctor, P.G.

    1988-08-01

    Sensitivity and uncertainty analysis methods for computer models are being applied in performance assessment modeling in the geologic high level radioactive waste repository program. The models used in performance assessment tend to be complex physical/chemical models with large numbers of input variables. There are two basic approaches to sensitivity and uncertainty analysis: deterministic and statistical. The deterministic approach to sensitivity analysis involves numerical calculation or employs the adjoint form of a partial differential equation to compute partial derivatives; the uncertainty analysis is based on Taylor series expansions of the input variables propagated through the model to compute means and variances of the output variable. The statistical approach to sensitivity analysis involves a response surface approximation to the model, with the sensitivity coefficients calculated from the response surface parameters; the uncertainty analysis is based on simulation. The methods each have strengths and weaknesses. 44 refs
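
    The contrast between the two approaches can be illustrated on a toy model: first-order Taylor-series propagation of input variances versus plain Monte Carlo simulation. The model function and input statistics below are arbitrary placeholders, not quantities from any repository assessment.

      import numpy as np

      def toy_model(x):                        # stand-in for a complex performance-assessment code
          return x[0] * np.exp(-x[1]) + x[2] ** 2

      mean = np.array([1.0, 0.5, 2.0])         # illustrative input means
      std = np.array([0.1, 0.05, 0.2])         # illustrative input standard deviations

      # Deterministic: finite-difference sensitivities, variance = sum_i (df/dx_i)^2 * var_i
      eps = 1.0e-6
      grad = np.array([(toy_model(mean + eps * np.eye(3)[i]) - toy_model(mean)) / eps for i in range(3)])
      taylor_variance = np.sum((grad * std) ** 2)

      # Statistical: propagate sampled inputs through the model (simulation)
      samples = np.random.normal(mean, std, size=(100000, 3))
      mc_variance = np.var([toy_model(s) for s in samples])

      print(taylor_variance, mc_variance)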

  2. Geologic modeling in risk assessment methodology for radioactive waste management

    International Nuclear Information System (INIS)

    Logan, S.E.; Berbano, M.C.

    1977-01-01

    Under contract to the U.S. Environmental Protection Agency (EPA), the University of New Mexico is developing a computer-based assessment methodology for evaluating public health and environmental impacts from the disposal of radioactive waste in geologic formations. The methodology incorporates a release or fault tree model, an environmental model, and an economic model. The release model and its application to a model repository in bedded salt are described. Fault trees are constructed to provide the relationships between various geologic and man-caused events which are potential mechanisms for release of radioactive material beyond the immediate environs of the repository. The environmental model includes: 1) the transport to and accumulation at various receptors in the biosphere, 2) pathways from these environmental concentrations, and 3) radiation dose to man. Finally, economic results are used to compare and assess various disposal configurations as a basis for formulating

  3. Application of Physiologically Based Pharmacokinetic Models in Chemical Risk Assessment

    Directory of Open Access Journals (Sweden)

    Moiz Mumtaz

    2012-01-01

    Post-exposure risk assessment of chemical and environmental stressors is a public health challenge. Linking exposure to health outcomes is a 4-step process: exposure assessment, hazard identification, dose response assessment, and risk characterization. This process is increasingly adopting "in silico" tools such as physiologically based pharmacokinetic (PBPK) models to fine-tune exposure assessments and determine internal doses in target organs/tissues. Many excellent PBPK models have been developed, but most, because of their scientific sophistication, have found limited field application; health assessors rarely use them. Over the years, government agencies, stakeholders/partners, and the scientific community have attempted to use these models or their underlying principles in combination with other practical procedures. During the past two decades, through cooperative agreements and contracts at several research and higher education institutions, ATSDR-funded translational research has encouraged the use of various types of models. Such collaborative efforts have led to the development and use of transparent and user-friendly models. The "human PBPK model toolkit" is one such project. While not necessarily state of the art, this toolkit is sufficiently accurate for screening purposes. Highlighted in this paper are some selected examples of environmental and occupational exposure assessments of chemicals and their mixtures.
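
    A drastically simplified, one-compartment sketch of the internal-dose idea is given below; real PBPK models resolve many physiological compartments and chemical-specific parameters, and every value here is an assumption for illustration only.

      import numpy as np

      # One-compartment sketch: an external intake rate is converted into an internal
      # concentration via a volume of distribution and a clearance rate.
      # Values are placeholders, not chemical-specific PBPK parameters.
      def internal_concentration(intake_mg_per_h, hours, volume_l=42.0, clearance_l_per_h=1.5, dt=0.1):
          conc = 0.0                                        # mg/L in the single compartment
          for _ in np.arange(0.0, hours, dt):
              conc += dt * (intake_mg_per_h - clearance_l_per_h * conc) / volume_l
          return conc

      print(internal_concentration(intake_mg_per_h=0.5, hours=72.0))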

  4. A test-bed modeling study for wave resource assessment

    Science.gov (United States)

    Yang, Z.; Neary, V. S.; Wang, T.; Gunawan, B.; Dallman, A.

    2016-02-01

    Hindcasts from phase-averaged wave models are commonly used to estimate the standard statistics used in wave energy resource assessments. However, the research community and the wave energy converter (WEC) industry lack a well-documented and consistent modeling approach for conducting these resource assessments at different phases of WEC project development and at different spatial scales, e.g., from a small-scale pilot study to a large-scale commercial deployment. Therefore, it is necessary to evaluate current wave model codes, as well as limitations and knowledge gaps in predicting sea states, in order to establish best wave modeling practices and to identify future research needs to improve wave prediction for resource assessment. This paper presents the first phase of an ongoing modeling study to address these concerns. The modeling study is being conducted at a test-bed site off the central Oregon coast using two of the most widely used third-generation wave models, WaveWatchIII and SWAN. A nested-grid modeling approach, with domain dimensions ranging from global to regional scales, was used to provide wave spectral boundary conditions to a local-scale model domain, which has a spatial dimension of around 60 km by 60 km and a grid resolution of 250 m - 300 m. Model results simulated by WaveWatchIII and SWAN in a structured-grid framework are compared to NOAA wave buoy data for six wave parameters: omnidirectional wave power, significant wave height, energy period, spectral width, direction of maximum directionally resolved wave power, and directionality coefficient. Model performance and computational efficiency are evaluated, and best practices for wave resource assessments are discussed, based on a set of standard error statistics and model run times.
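
    For reference, one of the six parameters listed above, omnidirectional wave power, is often estimated from significant wave height and energy period with the deep-water expression sketched below; the buoy values used are placeholders.

      import math

      # Deep-water estimate of omnidirectional wave power per metre of wave crest,
      # J = rho * g^2 * Hs^2 * Te / (64 * pi). Input values are placeholders.
      def wave_power_kw_per_m(hs_m, te_s, rho=1025.0, g=9.81):
          return rho * g ** 2 * hs_m ** 2 * te_s / (64.0 * math.pi) / 1000.0

      print(wave_power_kw_per_m(hs_m=2.5, te_s=9.0))        # roughly a few tens of kW per metre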

  5. Economic assessment model architecture for AGC/AVLIS selection

    International Nuclear Information System (INIS)

    Hoglund, R.L.

    1984-01-01

    The economic assessment model architecture described provides the flexibility and completeness in economic analysis that the selection between AGC and AVLIS demands. Process models which are technology-specific will provide the first-order responses of process performance and cost to variations in process parameters. The economics models can be used to test the impacts of alternative deployment scenarios for a technology. Enterprise models provide global figures of merit for evaluating the DOE perspective on the uranium enrichment enterprise, and business analysis models compute the financial parameters from the private investor's viewpoint

  6. Fire models for assessment of nuclear power plant fires

    International Nuclear Information System (INIS)

    Nicolette, V.F.; Nowlen, S.P.

    1989-01-01

    This paper reviews the state of the art in available fire models for the assessment of nuclear power plant fires. The advantages and disadvantages of three basic types of fire models (zone, field, and control volume) and Sandia's experience with these models are discussed. It is shown that the type of fire model selected to solve a particular problem should be based on the information that is required. Areas of concern which relate to all nuclear power plant fire models are identified. 17 refs., 6 figs

  7. A model for assessing human cognitive reliability in PRA studies

    International Nuclear Information System (INIS)

    Hannaman, G.W.; Spurgin, A.J.; Lukic, Y.

    1985-01-01

    This paper summarizes the status of a research project sponsored by EPRI as part of the Probabilistic Risk Assessment (PRA) technology improvement program and conducted by NUS Corporation to develop a model of Human Cognitive Reliability (HCR). The model was synthesized from features identified in a review of existing models. The model development was based on the hypothesis that the key factors affecting crew response times are separable. The inputs to the model consist of key parameters the values of which can be determined by PRA analysts for each accident situation being assessed. The output is a set of curves which represent the probability of control room crew non-response as a function of time for different conditions affecting their performance. The non-response probability is then a contributor to the overall non-success of operating crews to achieve a functional objective identified in the PRA study. Simulator data and some small scale tests were utilized to illustrate the calibration of interim HCR model coefficients for different types of cognitive processing since the data were sparse. The model can potentially help PRA analysts make human reliability assessments more explicit. The model incorporates concepts from psychological models of human cognitive behavior, information from current collections of human reliability data sources and crew response time data from simulator training exercises
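
    Time-reliability curves of the kind described above are commonly expressed as a function of time normalised by the median crew response time; the Weibull-type form and the coefficients below are placeholders chosen for illustration, not the calibrated HCR values for any type of cognitive processing.

      import math

      # Sketch of a crew non-response curve: probability that the crew has NOT yet
      # responded by time t, as a function of t normalised by the median response
      # time. The Weibull form and the coefficients are illustrative assumptions.
      def non_response_probability(t_min, median_response_min, gamma=0.6, eta=0.9, beta=1.2):
          x = (t_min / median_response_min - gamma) / eta
          return 1.0 if x <= 0.0 else math.exp(-x ** beta)

      for t in (5.0, 10.0, 20.0, 40.0):
          print(t, non_response_probability(t, median_response_min=10.0))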

  8. Dependability modeling and assessment in UML-based software development.

    Science.gov (United States)

    Bernardi, Simona; Merseguer, José; Petriu, Dorina C

    2012-01-01

    Assessment of software nonfunctional properties (NFP) is an important problem in software development. In the context of model-driven development, an emerging approach for the analysis of different NFPs consists of the following steps: (a) to extend the software models with annotations describing the NFP of interest; (b) to transform automatically the annotated software model to the formalism chosen for NFP analysis; (c) to analyze the formal model using existing solvers; (d) to assess the software based on the results and give feedback to designers. Such a modeling→analysis→assessment approach can be applied to any software modeling language, be it general purpose or domain specific. In this paper, we focus on UML-based development and on the dependability NFP, which encompasses reliability, availability, safety, integrity, and maintainability. The paper presents the profile used to extend UML with dependability information, the model transformation to generate a DSPN formal model, and the assessment of the system properties based on the DSPN results.

  9. Modelling the pre-assessment learning effects of assessment: evidence in the validity chain.

    Science.gov (United States)

    Cilliers, Francois J; Schuwirth, Lambert W T; van der Vleuten, Cees P M

    2012-11-01

    We previously developed a model of the pre-assessment learning effects of consequential assessment and started to validate it. The model comprises assessment factors, mechanism factors and learning effects. The purpose of this study was to continue the validation process. For stringency, we focused on a subset of assessment factor-learning effect associations that featured least commonly in a baseline qualitative study. Our aims were to determine whether these uncommon associations were operational in a broader but similar population to that in which the model was initially derived. A cross-sectional survey of 361 senior medical students at one medical school was undertaken using a purpose-made questionnaire based on a grounded theory and comprising pairs of written situational tests. In each pair, the manifestation of an assessment factor was varied. The frequencies at which learning effects were selected were compared for each item pair, using an adjusted alpha to assign significance. The frequencies at which mechanism factors were selected were calculated. There were significant differences in the learning effect selected between the two scenarios of an item pair for 13 of this subset of 21 uncommon associations, even with the adjusted alpha. For this subset of uncommon associations in the model, the role of most assessment factor-learning effect associations and the mechanism factors involved was supported in a broader but similar population to that in which the model was derived. Although model validation is an ongoing process, these results move the model one step closer to the stage of usefully informing interventions. Results illustrate how factors not typically included in studies of the learning effects of assessment could confound the results of interventions aimed at using assessment to influence learning. © Blackwell Publishing Ltd 2012.

  10. Utility of Social Modeling for Proliferation Assessment - Preliminary Findings

    International Nuclear Information System (INIS)

    Coles, Garill A.; Gastelum, Zoe N.; Brothers, Alan J.; Thompson, Sandra E.

    2009-01-01

    Often the methodologies for assessing proliferation risk are focused on the inherent vulnerability of nuclear energy systems and associated safeguards. For example, an accepted approach involves ways to measure the intrinsic and extrinsic barriers to potential proliferation. This paper describes a preliminary investigation into the non-traditional use of social and cultural information to improve proliferation assessment and advance the approach to assessing nuclear material diversion. Proliferation resistance assessments, safeguards assessments and related studies typically create technical information about the vulnerability of a nuclear energy system to diversion of nuclear material. The purpose of this research project is to find ways to integrate social information with technical information by explicitly considering the role of culture, groups and/or individuals in the factors that affect the possibility of proliferation. When final, this work is expected to describe and demonstrate the utility of social science modeling in proliferation and proliferation risk assessments.

  11. Integrated assessment models of climate change. An incomplete overview

    International Nuclear Information System (INIS)

    Dowlatabadi, H.

    1995-01-01

    Integrated assessment is a trendy phrase that has recently entered the vocabulary of folks in Washington, DC and elsewhere. The novelty of the term in policy analysis and policy making circles belies the longevity of this approach in the sciences and past attempts at their application to policy issues. This paper is an attempt at providing an overview of integrated assessment with a special focus on policy motivated integrated assessments of climate change. The first section provides an introduction to integrated assessments in general, followed by a discussion of the bounds to the climate change issue. The next section is devoted to a taxonomy of the policy motivated models. Then the integrated assessment effort at Carnegie Mellon is described briefly. A perspective on the challenges ahead in successful representation of natural and social dynamics in integrated assessments of global climate change is presented in the final section. (Author)

  12. Model of environmental life cycle assessment for coal mining operations

    Energy Technology Data Exchange (ETDEWEB)

    Burchart-Korol, Dorota, E-mail: dburchart@gig.eu; Fugiel, Agata, E-mail: afugiel@gig.eu; Czaplicka-Kolarz, Krystyna, E-mail: kczaplicka@gig.eu; Turek, Marian, E-mail: mturek@gig.eu

    2016-08-15

    This paper presents a novel approach to environmental assessment of coal mining operations, which enables assessment of the factors that are both directly and indirectly affecting the environment and are associated with the production of raw materials and energy used in processes. The primary novelty of the paper is the development of a computational environmental life cycle assessment (LCA) model for coal mining operations and the application of the model for coal mining operations in Poland. The LCA model enables the assessment of environmental indicators for all identified unit processes in hard coal mines with the life cycle approach. The proposed model enables the assessment of greenhouse gas emissions (GHGs) based on the IPCC method and the assessment of damage categories, such as human health, ecosystems and resources based on the ReCiPe method. The model enables the assessment of GHGs for hard coal mining operations in three time frames: 20, 100 and 500 years. The model was used to evaluate the coal mines in Poland. It was demonstrated that the largest environmental impacts in damage categories were associated with the use of fossil fuels, methane emissions and the use of electricity, processing of wastes, heat, and steel supports. It was concluded that an environmental assessment of coal mining operations, apart from direct influence from processing waste, methane emissions and drainage water, should include the use of electricity, heat and steel, particularly for steel supports. Because the model allows the comparison of environmental impact assessment for various unit processes, it can be used for all hard coal mines, not only in Poland but also in the world. This development is an important step forward in the study of the impacts of fossil fuels on the environment with the potential to mitigate the impact of the coal industry on the environment. - Highlights: • A computational LCA model for assessment of coal mining operations • Identification of
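
    The GHG indicator described above amounts to weighting inventoried emissions by global warming potentials for the chosen time horizon; the sketch below uses approximate IPCC GWP factors (which differ between assessment reports) and placeholder emission figures, not values from the Polish mine data.

      # Illustrative GWP-weighted aggregation for the three time horizons mentioned above.
      # Emission figures per tonne of coal are placeholders; GWP factors are approximate
      # IPCC values and differ between assessment reports.
      GWP = {                          # kg CO2-eq per kg of gas
          20:  {"CO2": 1.0, "CH4": 84.0, "N2O": 264.0},
          100: {"CO2": 1.0, "CH4": 28.0, "N2O": 265.0},
          500: {"CO2": 1.0, "CH4": 7.6,  "N2O": 153.0},
      }
      emissions_kg_per_t = {"CO2": 35.0, "CH4": 8.0, "N2O": 0.01}   # illustrative inventory

      for horizon, factors in GWP.items():
          total = sum(emissions_kg_per_t[gas] * factors[gas] for gas in factors)
          print(f"GHG ({horizon}-year horizon): {total:.1f} kg CO2-eq per tonne of coal")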

  13. Model of environmental life cycle assessment for coal mining operations

    International Nuclear Information System (INIS)

    Burchart-Korol, Dorota; Fugiel, Agata; Czaplicka-Kolarz, Krystyna; Turek, Marian

    2016-01-01

    This paper presents a novel approach to environmental assessment of coal mining operations, which enables assessment of the factors that are both directly and indirectly affecting the environment and are associated with the production of raw materials and energy used in processes. The primary novelty of the paper is the development of a computational environmental life cycle assessment (LCA) model for coal mining operations and the application of the model for coal mining operations in Poland. The LCA model enables the assessment of environmental indicators for all identified unit processes in hard coal mines with the life cycle approach. The proposed model enables the assessment of greenhouse gas emissions (GHGs) based on the IPCC method and the assessment of damage categories, such as human health, ecosystems and resources based on the ReCiPe method. The model enables the assessment of GHGs for hard coal mining operations in three time frames: 20, 100 and 500 years. The model was used to evaluate the coal mines in Poland. It was demonstrated that the largest environmental impacts in damage categories were associated with the use of fossil fuels, methane emissions and the use of electricity, processing of wastes, heat, and steel supports. It was concluded that an environmental assessment of coal mining operations, apart from direct influence from processing waste, methane emissions and drainage water, should include the use of electricity, heat and steel, particularly for steel supports. Because the model allows the comparison of environmental impact assessment for various unit processes, it can be used for all hard coal mines, not only in Poland but also in the world. This development is an important step forward in the study of the impacts of fossil fuels on the environment with the potential to mitigate the impact of the coal industry on the environment. - Highlights: • A computational LCA model for assessment of coal mining operations • Identification of

  14. Model summary report for the safety assessment SR-Can

    Energy Technology Data Exchange (ETDEWEB)

    Vahlund, Fredrik

    2006-10-15

    This document is the model summary report for the safety assessment SR-Can. In the report, the quality assurance measures conducted for the assessment codes are presented together with the chosen methodology. In the safety assessment SR-Can, a number of different computer codes are used. In order to better understand how these codes are related, Assessment Model Flowcharts (AMFs) have been produced within the project. From these, it is possible to identify the different modelling tasks and consequently also the different computer codes used. A large number of different computer codes are used in the assessment, of which some are commercial while others were developed especially for the current assessment project. QA requirements must on the one hand take this diversity into account and on the other hand be well defined. In the methodology section of the report the following requirements are defined: it must be demonstrated that the code is suitable for its purpose; it must be demonstrated that the code has been properly used; and it must be demonstrated that the code development process has followed appropriate procedures and that the code produces accurate results. Although the requirements are identical for all codes, the measures used to show that the requirements are fulfilled will be different for different codes (for instance because for some software the source code is not available for review). Subsequent to the methodology section, each assessment code is presented and it is shown how the requirements are met.

  15. Model summary report for the safety assessment SR-Can

    International Nuclear Information System (INIS)

    Vahlund, Fredrik

    2006-10-01

    This document is the model summary report for the safety assessment SR-Can. In the report, the quality assurance measures conducted for the assessment codes are presented together with the chosen methodology. In the safety assessment SR-Can, a number of different computer codes are used. In order to better understand how these codes are related, Assessment Model Flowcharts (AMFs) have been produced within the project. From these, it is possible to identify the different modelling tasks and consequently also the different computer codes used. A large number of different computer codes are used in the assessment, of which some are commercial while others were developed especially for the current assessment project. QA requirements must on the one hand take this diversity into account and on the other hand be well defined. In the methodology section of the report the following requirements are defined: it must be demonstrated that the code is suitable for its purpose; it must be demonstrated that the code has been properly used; and it must be demonstrated that the code development process has followed appropriate procedures and that the code produces accurate results. Although the requirements are identical for all codes, the measures used to show that the requirements are fulfilled will be different for different codes (for instance because for some software the source code is not available for review). Subsequent to the methodology section, each assessment code is presented and it is shown how the requirements are met.

  16. PARALLEL MODELS OF ASSESSMENT: INFANT MENTAL HEALTH AND THERAPEUTIC ASSESSMENT MODELS INTERSECT THROUGH EARLY CHILDHOOD CASE STUDIES.

    Science.gov (United States)

    Gart, Natalie; Zamora, Irina; Williams, Marian E

    2016-07-01

    Therapeutic Assessment (TA; S.E. Finn & M.E. Tonsager, 1997; J.D. Smith, 2010) is a collaborative, semistructured model that encourages self-discovery and meaning-making through the use of assessment as an intervention approach. This model shares core strategies with infant mental health assessment, including close collaboration with parents and caregivers, active participation of the family, a focus on developing new family stories and increasing parents' understanding of their child, and reducing isolation and increasing hope through the assessment process. The intersection of these two theoretical approaches is explored, using case studies of three infants/young children and their families to illustrate the application of TA to infant mental health. The case of an 18-month-old girl whose parents fear that she has bipolar disorder illustrates the core principles of the TA model, highlighting the use of assessment intervention sessions and the clinical approach to preparing assessment feedback. The second case follows an infant with a rare genetic syndrome from ages 2 to 24 months, focusing on the assessor-parent relationship and the importance of a developmental perspective. Finally, assessment of a 3-year-old boy illustrates the development and use of a fable as a tool to provide feedback to a young child about assessment findings and recommendations. © 2016 Michigan Association for Infant Mental Health.

  17. NEW MODEL OF QUALITY ASSESSMENT IN PUBLIC ADMINISTRATION - UPGRADING THE COMMON ASSESSMENT FRAMEWORK (CAF

    Directory of Open Access Journals (Sweden)

    Mirna Macur

    2017-01-01

    In our study, we developed a new model of quality assessment in public administration. The Common Assessment Framework (CAF) is frequently used in continental Europe for this purpose. Its use has many benefits; however, we believe its assessment logic is not adequate for public administration. The upgraded version of the CAF is conceptually different: instead of the analytical and linear CAF, we obtain an instrument that measures the organisation as a network of complex processes. The original and upgraded assessment approaches are presented in the paper and compared in a case of self-assessment of a selected public administration organisation. The two approaches produced different, sometimes contradictory results. The upgraded model proved to be logically more consistent and produced higher interpretation capacity.

  18. Fish habitat simulation models and integrated assessment tools

    International Nuclear Information System (INIS)

    Harby, A.; Alfredsen, K.

    1999-01-01

    Because of human development, water use is increasing in importance, and this worldwide trend is leading to an increasing number of user conflicts and a strong need for assessment tools to measure the impacts both on the ecosystem and on the different users and user groups. The quantitative tools must allow a comparison of alternatives, different user groups, etc., and the tools must be integrated, since impact assessments involve different disciplines. Fish species, especially young ones, are indicators of the environmental state of a riverine system, and monitoring them is a way to follow environmental changes. The direct and indirect impacts on the ecosystem itself are measured, while impacts on user groups are not included. The paper concentrates on fish habitat simulation models, with methods and examples from Norway, and includes some ideas on integrated modelling tools for impact assessment studies. One-dimensional hydraulic models are rapidly calibrated and do not require any expert knowledge in hydraulics. Two- and three-dimensional models require somewhat more skilled users, especially if the topography is very heterogeneous. The advantages of using two- and three-dimensional models include: they do not need any calibration, just validation; they are predictive; and they can be more cost effective than traditional habitat hydraulic models when combined with modern data acquisition systems and tailored to a multi-disciplinary study. The choice of model should be based on available data and possible data acquisition, available manpower, computer and software resources, and the needed output and accuracy of the output. 58 refs

  19. Addressing challenges in single species assessments via a simple state-space assessment model

    DEFF Research Database (Denmark)

    Nielsen, Anders

    Single-species and age-structured fish stock assessments still remain the main tool for managing fish stocks. A simple state-space assessment model is presented as an alternative to (semi-)deterministic procedures and to fully parametric statistical catch-at-age models. It offers a solution to some of the key challenges of these models. Compared to the deterministic procedures it solves a list of problems originating from falsely assuming that age-classified catches are known without errors, and allows quantification of uncertainties of estimated quantities of interest. Compared to full...
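
    A one-dimensional caricature of the state-space idea is sketched below: the unobserved state (for example, log fishing mortality) follows a random walk, the observations (for example, log catches) carry observation error, and a Kalman filter separates the two error sources. Variances and dimensions are illustrative; the actual assessment model is multivariate and age-structured.

      import numpy as np

      # 1-D state-space sketch: random-walk state, noisy observations, Kalman filter.
      # Process and observation variances are illustrative assumptions.
      rng = np.random.default_rng(1)
      q, r, n_years = 0.05, 0.2, 40
      state = np.cumsum(rng.normal(0.0, np.sqrt(q), n_years))      # unobserved "truth"
      obs = state + rng.normal(0.0, np.sqrt(r), n_years)           # observed with error

      estimate, p = 0.0, 1.0
      filtered = []
      for y in obs:
          p += q                               # predict: state uncertainty grows
          gain = p / (p + r)                   # Kalman gain
          estimate += gain * (y - estimate)    # update with the new observation
          p *= (1.0 - gain)
          filtered.append(estimate)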

  20. Model and Analytic Processes for Export License Assessments

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, Sandra E.; Whitney, Paul D.; Weimar, Mark R.; Wood, Thomas W.; Daly, Don S.; Brothers, Alan J.; Sanfilippo, Antonio P.; Cook, Diane; Holder, Larry

    2011-09-29

    This paper represents the Department of Energy Office of Nonproliferation Research and Development (NA-22) Simulations, Algorithms and Modeling (SAM) Program's first effort to identify and frame analytical methods and tools to aid export control professionals in effectively predicting proliferation intent; a complex, multi-step and multi-agency process. The report focuses on analytical modeling methodologies that alone, or combined, may improve the proliferation export control license approval process. It is a follow-up to an earlier paper describing information sources and environments related to international nuclear technology transfer. This report describes the decision criteria used to evaluate modeling techniques and tools to determine which approaches will be investigated during the final 2 years of the project. The report also details the motivation for why new modeling techniques and tools are needed. The analytical modeling methodologies will enable analysts to evaluate the information environment for relevance to detecting proliferation intent, with specific focus on assessing risks associated with transferring dual-use technologies. Dual-use technologies can be used in both weapons and commercial enterprises. A decision-framework was developed to evaluate which of the different analytical modeling methodologies would be most appropriate conditional on the uniqueness of the approach, data availability, laboratory capabilities, relevance to NA-22 and Office of Arms Control and Nonproliferation (NA-24) research needs and the impact if successful. Modeling methodologies were divided into whether they could help micro-level assessments (e.g., help improve individual license assessments) or macro-level assessment. Macro-level assessment focuses on suppliers, technology, consumers, economies, and proliferation context. Macro-level assessment technologies scored higher in the area of uniqueness because less work has been done at the macro level. An

  1. Combining catchment and instream modelling to assess physical habitat quality

    DEFF Research Database (Denmark)

    Olsen, Martin

    Study objectives: After the implementation of the EU's Water Framework Directive (WFD) in Denmark, ecological impacts from groundwater exploitation on surface waters have to receive additional consideration. Small streams in particular are susceptible to changes in run-off but have only received little attention in past studies of run-off impact on the quality of stream physical habitats. This study combined catchment and instream models with instream habitat observations to assess the ecological impacts from groundwater exploitation on a small stream. The main objectives of this study were: • to assess which factors are controlling the run-off conditions in stream Ledreborg and to what degree • to assess the run-off reference condition of stream Ledreborg, where intensive groundwater abstraction has taken place for 67 years, using a simple rainfall-run-off model • to assess how stream run-off affects...

  2. A mathematical model for environmental risk assessment in manufacturing industry

    Institute of Scientific and Technical Information of China (English)

    何莉萍; 徐盛明; 陈大川; 党创寅

    2002-01-01

    Environmentally conscious manufacturing has become an important issue in industry because of market pressure and environmental regulations. An environmental risk assessment model was developed based on the network analytic method and fuzzy set theory. The "interval analysis method" was applied to treat the on-site monitoring data as basic information for the assessment. In addition, fuzzy set theory was employed to allow uncertain, interactive and dynamic information to be effectively incorporated into the environmental risk assessment. This model is a simple, practical and effective tool for evaluating the environmental risk of the manufacturing industry and for analyzing the relative impacts of emission wastes, which are hazardous to both human and ecosystem health. Furthermore, the model is considered useful for design engineers and decision-makers in designing and selecting processes when the costs, environmental impacts and performance of a product are taken into consideration.
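
    The interval-analysis step can be sketched as below: monitored values are carried as [min, max] intervals, normalised against permitted levels and combined with importance weights into an interval-valued risk index. The pollutants, limits and weights are placeholders, and the fuzzy, network-analytic part of the model is not reproduced here.

      # Interval-valued risk index from on-site monitoring data (illustrative only).
      readings = {                     # monitored emission, [min, max] over the sampling period
          "SO2":  (12.0, 18.0),
          "dust": (4.0, 9.0),
          "COD":  (30.0, 55.0),
      }
      limits  = {"SO2": 50.0, "dust": 10.0, "COD": 100.0}   # permitted levels (placeholders)
      weights = {"SO2": 0.40, "dust": 0.35, "COD": 0.25}    # relative importance (placeholders)

      low  = sum(weights[k] * readings[k][0] / limits[k] for k in readings)
      high = sum(weights[k] * readings[k][1] / limits[k] for k in readings)
      print(f"risk index interval: [{low:.2f}, {high:.2f}]")  # values near 1 indicate emissions near limits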

  3. Environmental impact assessments and geological repositories: A model process

    International Nuclear Information System (INIS)

    Webster, S.

    2000-01-01

    In a recent study carried out for the European Commission, the scope and application of environmental impact assessment (EIA) legislation and current EIA practice in European Union Member States and applicant countries of Central and Eastern Europe were investigated, specifically in relation to the geological disposal of radioactive waste. This paper reports the study's investigations into a model approach to EIA in the context of geological repositories, including the role of the assessment in the overall decision process and public involvement. (author)

  4. Proposing an Environmental Excellence Self-Assessment Model

    DEFF Research Database (Denmark)

    Meulengracht Jensen, Peter; Johansen, John; Wæhrens, Brian Vejrum

    2013-01-01

    This paper presents an Environmental Excellence Self-Assessment (EEA) model based on the structure of the European Foundation for Quality Management Business Excellence Framework. Four theoretical scenarios for deploying the model are presented as well as managerial implications, suggesting that the EEA model can be used in global organizations to differentiate environmental efforts depending on the maturity stage of the individual sites. Furthermore, the model can be used to support the decision-making process regarding when organizations should embark on more complex environmental efforts.

  5. Report on the model developments in the sectoral assessments

    DEFF Research Database (Denmark)

    Iglesias, Ana; Termansen, Mette; Bouwer, Laurens

    2014-01-01

    The objective of this Deliverable D3.2 is to describe the models developed in BASE, that is, the experimental setup for the sectoral modelling. The model development described in this deliverable will then be implemented in the adaptation and economic analysis in WP6 in order to integrate adaptation into the economic assessments. At the same time, the models will link to the case studies in two ways. First, they use the data in the case studies for model validation and then they provide information to inform stakeholders on adaptation strategies. Therefore, Deliverable 3.2 aims to address three main questions: How to address climate adaptation options with the sectoral bottom-up models? This includes a quantification of the costs of adaptation with the sectoral models, in monetary terms or in other measures of costs. The benefits in this framework will be the avoided damages, therefore a measure...

  6. Predictive assessment of models for dynamic functional connectivity

    DEFF Research Database (Denmark)

    Nielsen, Søren Føns Vind; Schmidt, Mikkel Nørgaard; Madsen, Kristoffer Hougaard

    2018-01-01

    In neuroimaging, it has become evident that models of dynamic functional connectivity (dFC), which characterize how intrinsic brain organization changes over time, can provide a more detailed representation of brain function than traditional static analyses. Many dFC models in the literature represent functional brain networks as a meta-stable process with a discrete number of states; however, there is a lack of consensus on how to perform model selection and learn the number of states, as well as a lack of understanding of how different modeling assumptions influence the estimated state dynamics. To address these issues, we consider a predictive likelihood approach to model assessment, where models are evaluated based on their predictive performance on held-out test data. Examining several prominent models of dFC (in their probabilistic formulations) we demonstrate our framework...

  7. Agricultural climate impacts assessment for economic modeling and decision support

    Science.gov (United States)

    Thomson, A. M.; Izaurralde, R. C.; Beach, R.; Zhang, X.; Zhao, K.; Monier, E.

    2013-12-01

    A range of approaches can be used in the application of climate change projections to agricultural impacts assessment. Climate projections can be used directly to drive crop models, which in turn can be used to provide inputs for agricultural economic or integrated assessment models. These model applications, and the transfer of information between models, must be guided by the state of the science. But the methodology must also account for the specific needs of stakeholders and the intended use of model results beyond pure scientific inquiry, including meeting the requirements of agencies responsible for designing and assessing policies, programs, and regulations. Here we present the methodology and results of two climate impacts studies that applied climate model projections from CMIP3 and from the EPA Climate Impacts and Risk Analysis (CIRA) project in a crop model (EPIC - Environmental Policy Integrated Climate) in order to generate estimates of changes in crop productivity for use in an agricultural economic model for the United States (FASOM - Forest and Agricultural Sector Optimization Model). The FASOM model is a forward-looking dynamic model of the US forest and agricultural sector used to assess market responses to changing productivity of alternative land uses. The first study, focused on climate change impacts on the USDA crop insurance program, was designed to use available daily climate projections from the CMIP3 archive. The decision to focus on daily data for this application limited the climate model and time period selection significantly; however, for the intended purpose of assessing impacts on crop insurance payments, consideration of extreme event frequency was critical for assessing periodic crop failures. In a second, coordinated impacts study designed to assess the relative difference in climate impacts under a no-mitigation policy and different future climate mitigation scenarios, the stakeholder specifically requested an assessment of a

  8. Consensus-based training and assessment model for general surgery.

    Science.gov (United States)

    Szasz, P; Louridas, M; de Montbrun, S; Harris, K A; Grantcharov, T P

    2016-05-01

    Surgical education is becoming competency-based with the implementation of in-training milestones. Training guidelines should reflect these changes and determine the specific procedures for such milestone assessments. This study aimed to develop a consensus view regarding operative procedures and tasks considered appropriate for junior and senior trainees, and the procedures that can be used as technical milestone assessments for trainee progression in general surgery. A Delphi process was followed where questionnaires were distributed to all 17 Canadian general surgery programme directors. Items were ranked on a 5-point Likert scale, with consensus defined as Cronbach's α of at least 0·70. Items rated 4 or above on the 5-point Likert scale by 80 per cent of the programme directors were included in the models. Two Delphi rounds were completed, with 14 programme directors taking part in round one and 11 in round two. The overall consensus was high (Cronbach's α = 0·98). The training model included 101 unique procedures and tasks, 24 specific to junior trainees, 68 specific to senior trainees, and nine appropriate to all. The assessment model included four procedures. A system of operative procedures and tasks for junior- and senior-level trainees has been developed along with an assessment model for trainee progression. These can be used as milestones in competency-based assessments. © 2016 BJS Society Ltd Published by John Wiley & Sons Ltd.

  9. Assessment of the assessment: Evaluation of the model quality estimates in CASP10

    KAUST Repository

    Kryshtafovych, Andriy

    2013-08-31

    The article presents an assessment of the ability of the thirty-seven model quality assessment (MQA) methods participating in CASP10 to provide an a priori estimation of the quality of structural models, and of the 67 tertiary structure prediction groups to provide confidence estimates for their predicted coordinates. The assessment of MQA predictors is based on the methods used in previous CASPs, such as correlation between the predicted and observed quality of the models (both at the global and local levels), accuracy of methods in distinguishing between good and bad models as well as good and bad regions within them, and ability to identify the best models in the decoy sets. Several numerical evaluations were used in our analysis for the first time, such as comparison of global and local quality predictors with reference (baseline) predictors and a ROC analysis of the predictors' ability to differentiate between the well and poorly modeled regions. For the evaluation of the reliability of self-assessment of the coordinate errors, we used the correlation between the predicted and observed deviations of the coordinates and a ROC analysis of correctly identified errors in the models. A modified two-stage procedure for testing MQA methods in CASP10 whereby a small number of models spanning the whole range of model accuracy was released first followed by the release of a larger number of models of more uniform quality, allowed a more thorough analysis of abilities and inabilities of different types of methods. Clustering methods were shown to have an advantage over the single- and quasi-single-model methods on the larger datasets. At the same time, the evaluation revealed that the size of the dataset has smaller influence on the global quality assessment scores (for both clustering and nonclustering methods), than its diversity. Narrowing the quality range of the assessed models caused significant decrease in accuracy of ranking for global quality predictors but
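    Two of the headline measures used in this kind of assessment, correlation between predicted and observed model quality and a ROC analysis of how well predictors separate well- from poorly-modelled regions, can be illustrated in a few lines. All numbers below are synthetic, and the 3.8 Å cut-off for a "well-modelled" residue is only an illustrative choice, not a value taken from the paper.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)

# Synthetic global scores: an observed GDT_TS-like quality and a noisy prediction of it.
observed = rng.uniform(20, 90, size=200)
predicted = observed + rng.normal(0, 10, size=200)
print("global quality correlation r =", round(np.corrcoef(predicted, observed)[0, 1], 2))

# ROC analysis for local quality: can a predictor separate well- from poorly-modelled residues?
residue_error = rng.gamma(shape=2.0, scale=2.0, size=1000)          # "observed" per-residue error
predicted_error = residue_error + rng.normal(0, 1.5, size=1000)     # noisy local error prediction
well_modelled = residue_error < 3.8                                 # illustrative cut-off only
auc = roc_auc_score(well_modelled, -predicted_error)                # lower predicted error => "good"
print("ROC AUC for well- vs poorly-modelled residues =", round(auc, 2))
```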

  10. Radioactive waste disposal assessment - overview of biosphere processes and models

    International Nuclear Information System (INIS)

    Coughtrey, P.J.

    1992-09-01

    This report provides an overview of biosphere processes and models in the general context of the radiological assessment of radioactive waste disposal as a basis for HMIP's response to biosphere aspects of Nirex's submissions for disposal of radioactive wastes in a purpose-built repository at Sellafield, Cumbria. The overview takes into account published information from the UK as available from Nirex's safety and assessment research programme and HMIP's disposal assessment programme, as well as that available from studies in the UK and elsewhere. (Author)

  11. Testing of an accident consequence assessment model using field data

    International Nuclear Information System (INIS)

    Homma, Toshimitsu; Matsubara, Takeshi; Tomita, Kenichi

    2007-01-01

    This paper presents the results obtained from the application of an accident consequence assessment model, OSCAAR to the Iput dose reconstruction scenario of BIOMASS and also to the Chernobyl ¹³¹I fallout scenario of EMRAS, both organized by International Atomic Energy Agency. The Iput Scenario deals with ¹³⁷Cs contamination of the catchment basin and agricultural area in the Bryansk Region of Russia, which was heavily contaminated after the Chernobyl accident. This exercise was used to test the chronic exposure pathway models in OSCAAR with actual measurements and to identify the most important sources of uncertainty with respect to each part of the assessment. The OSCAAR chronic exposure pathway models had some limitations but the refined model, COLINA almost successfully reconstructed the whole 10-year time course of ¹³⁷Cs activity concentrations in most requested types of agricultural products and natural foodstuffs. The Plavsk scenario provides a good opportunity to test not only the food chain transfer model of ¹³¹I but also the method of assessing ¹³¹I thyroid burden. OSCAAR showed in general good capabilities for assessing the important ¹³¹I exposure pathways. (author)

  12. Confidence assessment. Site-descriptive modelling SDM-Site Laxemar

    International Nuclear Information System (INIS)

    2009-06-01

    The objective of this report is to assess the confidence that can be placed in the Laxemar site descriptive model, based on the information available at the conclusion of the surface-based investigations (SDM-Site Laxemar). In this exploration, an overriding question is whether remaining uncertainties are significant for repository engineering design or long-term safety assessment and could successfully be further reduced by more surface-based investigations or more usefully by explorations underground made during construction of the repository. Procedures for this assessment have been progressively refined during the course of the site descriptive modelling, and applied to all previous versions of the Forsmark and Laxemar site descriptive models. They include assessment of whether all relevant data have been considered and understood, identification of the main uncertainties and their causes, possible alternative models and their handling, and consistency between disciplines. The assessment then forms the basis for an overall confidence statement. The confidence in the Laxemar site descriptive model, based on the data available at the conclusion of the surface based site investigations, has been assessed by exploring: - Confidence in the site characterization data base, - remaining issues and their handling, - handling of alternatives, - consistency between disciplines and - main reasons for confidence and lack of confidence in the model. Generally, the site investigation database is of high quality, as assured by the quality procedures applied. It is judged that the Laxemar site descriptive model has an overall high level of confidence. Because of the relatively robust geological model that describes the site, the overall confidence in the Laxemar Site Descriptive model is judged to be high, even though details of the spatial variability remain unknown. The overall reason for this confidence is the wide spatial distribution of the data and the consistency between

  13. Confidence assessment. Site-descriptive modelling SDM-Site Laxemar

    Energy Technology Data Exchange (ETDEWEB)

    2008-12-15

    The objective of this report is to assess the confidence that can be placed in the Laxemar site descriptive model, based on the information available at the conclusion of the surface-based investigations (SDM-Site Laxemar). In this exploration, an overriding question is whether remaining uncertainties are significant for repository engineering design or long-term safety assessment and could successfully be further reduced by more surface-based investigations or more usefully by explorations underground made during construction of the repository. Procedures for this assessment have been progressively refined during the course of the site descriptive modelling, and applied to all previous versions of the Forsmark and Laxemar site descriptive models. They include assessment of whether all relevant data have been considered and understood, identification of the main uncertainties and their causes, possible alternative models and their handling, and consistency between disciplines. The assessment then forms the basis for an overall confidence statement. The confidence in the Laxemar site descriptive model, based on the data available at the conclusion of the surface based site investigations, has been assessed by exploring: - Confidence in the site characterization data base, - remaining issues and their handling, - handling of alternatives, - consistency between disciplines and - main reasons for confidence and lack of confidence in the model. Generally, the site investigation database is of high quality, as assured by the quality procedures applied. It is judged that the Laxemar site descriptive model has an overall high level of confidence. Because of the relatively robust geological model that describes the site, the overall confidence in the Laxemar Site Descriptive model is judged to be high, even though details of the spatial variability remain unknown. The overall reason for this confidence is the wide spatial distribution of the data and the consistency between

  14. Guide for developing conceptual models for ecological risk assessments

    International Nuclear Information System (INIS)

    Suter, G.W., II.

    1996-05-01

    Ecological conceptual models are the result of the problem formulation phase of an ecological risk assessment, which is an important component of the Remedial Investigation process. They present hypotheses of how the site contaminants might affect the site ecology. The contaminant sources, media, exposure routes, and endpoint receptors are presented in the form of a flow chart. This guide is for preparing the conceptual models; use of this guide will standardize the models so that they will be of high quality, useful to the assessment process, and sufficiently consistent so that connections between sources of exposure and receptors can be extended across operable units (OU). Generic conceptual models are presented for source, aquatic integrator, groundwater integrator, and terrestrial OUs

  15. Persistent hemifacial spasm after microvascular decompression: a risk assessment model.

    Science.gov (United States)

    Shah, Aalap; Horowitz, Michael

    2017-06-01

    Microvascular decompression (MVD) for hemifacial spasm (HFS) provides resolution of disabling symptoms such as eyelid twitching and muscle contractions of the entire hemiface. The primary aim of this study was to evaluate the predictive value of patient demographics and spasm characteristics on long-term outcomes, with or without intraoperative lateral spread response (LSR) as an additional variable in a risk assessment model. A retrospective study was undertaken to evaluate the associations of pre-operative patient characteristics, as well as intraoperative LSR and need for a staged procedure on the presence of persistent or recurrent HFS at the time of hospital discharge and at follow-up. A risk assessment model was constructed with the inclusion of six clinically or statistically significant variables from the univariate analyses. A receiver operating characteristic curve was generated, and area under the curve was calculated to determine the strength of the predictive model. A risk assessment model was first created consisting of significant pre-operative variables (Model 1) (age >50, female gender, history of botulinum toxin use, platysma muscle involvement). This model demonstrated borderline predictive value for persistent spasm at discharge (AUC .60; p=.045) and fair predictive value at follow-up (AUC .75; p=.001). Intraoperative variables (e.g. LSR persistence) demonstrated little additive value (Model 2) (AUC .67). Patients with a higher risk score (three or greater) demonstrated greater odds of persistent HFS at the time of discharge (OR 1.5 [95%CI 1.16-1.97]; p=.035), as well as greater odds of persistent or recurrent spasm at the time of follow-up (OR 3.0 [95%CI 1.52-5.95]; p=.002). Conclusions: A risk assessment model consisting of pre-operative clinical characteristics is useful in prognosticating HFS persistence at follow-up.
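    As a rough illustration of how such a point-based risk model can be scored and evaluated, the sketch below assigns one point per pre-operative factor named in the abstract and checks discrimination with a ROC AUC and a crude odds ratio. The outcomes and probabilities are synthetic and do not reproduce the study's data.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
n = 300

# Hypothetical pre-operative factors (the Model 1 variables); each contributes one point.
factors = {
    "age_over_50":       rng.integers(0, 2, n),
    "female":            rng.integers(0, 2, n),
    "prior_botox":       rng.integers(0, 2, n),
    "platysma_involved": rng.integers(0, 2, n),
}
risk_score = sum(factors.values())                 # 0-4 points per patient

# Synthetic outcome: probability of persistent spasm rises with the score.
p_persist = 0.15 + 0.12 * risk_score
persistent = rng.random(n) < p_persist

def odds(x):
    p = x.mean()
    return p / (1 - p)

high_risk = risk_score >= 3
print("AUC of the 0-4 point score:", round(roc_auc_score(persistent, risk_score), 2))
print("odds ratio, score >= 3 vs < 3:",
      round(odds(persistent[high_risk]) / odds(persistent[~high_risk]), 2))
```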

  16. Modeling Composite Assessment Data Using Item Response Theory

    Science.gov (United States)

    Ueckert, Sebastian

    2018-01-01

    Composite assessments aim to combine different aspects of a disease in a single score and are utilized in a variety of therapeutic areas. The data arising from these evaluations are inherently discrete with distinct statistical properties. This tutorial presents the framework of the item response theory (IRT) for the analysis of this data type in a pharmacometric context. The article considers both conceptual (terms and assumptions) and practical questions (modeling software, data requirements, and model building). PMID:29493119

  17. Assessing biocomputational modelling in transforming clinical guidelines for osteoporosis management.

    Science.gov (United States)

    Thiel, Rainer; Viceconti, Marco; Stroetmann, Karl

    2011-01-01

    Biocomputational modelling as developed by the European Virtual Physiological Human (VPH) Initiative is the area of ICT most likely to revolutionise in the longer term the practice of medicine. Using the example of osteoporosis management, a socio-economic assessment framework is presented that captures how the transformation of clinical guidelines through VPH models can be evaluated. Applied to the Osteoporotic Virtual Physiological Human Project, a consequent benefit-cost analysis delivers promising results, both methodologically and substantially.

  18. The Generalised Ecosystem Modelling Approach in Radiological Assessment

    International Nuclear Information System (INIS)

    Klos, Richard

    2008-03-01

    An independent modelling capability is required by SSI in order to evaluate dose assessments carried out in Sweden by, amongst others, SKB. The main focus is the evaluation of the long-term radiological safety of radioactive waste repositories for both spent fuel and low-level radioactive waste. To meet the requirement for an independent modelling tool for use in biosphere dose assessments, SSI through its modelling team CLIMB commissioned the development of a new model in 2004, a project to produce an integrated model of radionuclides in the landscape. The generalised ecosystem modelling approach (GEMA) is the result. GEMA is a modular system of compartments representing the surface environment. It can be configured, through water and solid material fluxes, to represent local details in the range of ecosystem types found in the past, present and future Swedish landscapes. The approach is generic but fine tuning can be carried out using local details of the surface drainage system. The modular nature of the modelling approach means that GEMA modules can be linked to represent large scale surface drainage features over an extended domain in the landscape. System change can also be managed in GEMA, allowing a flexible and comprehensive model of the evolving landscape to be constructed. Environmental concentrations of radionuclides can be calculated and the GEMA dose pathway model provides a means of evaluating the radiological impact of radionuclide release to the surface environment. This document sets out the philosophy and details of GEMA and illustrates the functioning of the model with a range of examples featuring the recent CLIMB review of SKB's SR-Can assessment

  19. The Generalised Ecosystem Modelling Approach in Radiological Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Klos, Richard

    2008-03-15

    An independent modelling capability is required by SSI in order to evaluate dose assessments carried out in Sweden by, amongst others, SKB. The main focus is the evaluation of the long-term radiological safety of radioactive waste repositories for both spent fuel and low-level radioactive waste. To meet the requirement for an independent modelling tool for use in biosphere dose assessments, SSI through its modelling team CLIMB commissioned the development of a new model in 2004, a project to produce an integrated model of radionuclides in the landscape. The generalised ecosystem modelling approach (GEMA) is the result. GEMA is a modular system of compartments representing the surface environment. It can be configured, through water and solid material fluxes, to represent local details in the range of ecosystem types found in the past, present and future Swedish landscapes. The approach is generic but fine tuning can be carried out using local details of the surface drainage system. The modular nature of the modelling approach means that GEMA modules can be linked to represent large scale surface drainage features over an extended domain in the landscape. System change can also be managed in GEMA, allowing a flexible and comprehensive model of the evolving landscape to be constructed. Environmental concentrations of radionuclides can be calculated and the GEMA dose pathway model provides a means of evaluating the radiological impact of radionuclide release to the surface environment. This document sets out the philosophy and details of GEMA and illustrates the functioning of the model with a range of examples featuring the recent CLIMB review of SKB's SR-Can assessment

  20. Mesorad dose assessment model. Volume 1. Technical basis

    International Nuclear Information System (INIS)

    Scherpelz, R.I.; Bander, T.J.; Athey, G.F.; Ramsdell, J.V.

    1986-03-01

    MESORAD is a dose assessment model for emergency response applications. Using release data for as many as 50 radionuclides, the model calculates: (1) external doses resulting from exposure to radiation emitted by radionuclides contained in elevated or deposited material; (2) internal dose commitment resulting from inhalation; and (3) total whole-body doses. External doses from airborne material are calculated using semi-infinite and finite cloud approximations. At each stage in model execution, the appropriate approximation is selected after considering the cloud dimensions. Atmospheric processes are represented in MESORAD by a combination of Lagrangian puff and Gaussian plume dispersion models, a source depletion (deposition velocity) dry deposition model, and a wet deposition model using washout coefficients based on precipitation rates
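    For orientation, the ground-reflected Gaussian plume formula referred to above can be written in a few lines. The release rate, wind speed and dispersion parameters below are invented for illustration; this is a generic textbook form, not code taken from MESORAD.

```python
import numpy as np

def gaussian_plume_conc(Q, u, y, z, H, sigma_y, sigma_z):
    """
    Ground-reflected Gaussian plume air concentration (Bq/m^3) for a continuous
    release Q (Bq/s) at effective height H (m) in a wind of speed u (m/s).
    sigma_y, sigma_z are the dispersion parameters at the downwind distance of interest.
    """
    lateral = np.exp(-0.5 * (y / sigma_y) ** 2)
    vertical = (np.exp(-0.5 * ((z - H) / sigma_z) ** 2) +
                np.exp(-0.5 * ((z + H) / sigma_z) ** 2))
    return Q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

# Illustrative values only: 1e10 Bq/s release, 3 m/s wind, receptor on the plume axis.
conc = gaussian_plume_conc(Q=1e10, u=3.0, y=0.0, z=1.0, H=50.0, sigma_y=80.0, sigma_z=40.0)
print(f"air concentration ~ {conc:.2e} Bq/m^3")
```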

  1. Biosphere models for safety assessment of radioactive waste disposal

    International Nuclear Information System (INIS)

    Proehl, G.; Olyslaegers, G.; Zeevaert, T.; Kanyar, B.; Bergstroem, U.; Hallberg, B.; Mobbs, S.; Chen, Q.; Kowe, R.

    2004-01-01

    The aim of the BioMoSA project has been to contribute to confidence building in biosphere models for application in performance assessments of radioactive waste disposal. The detailed objectives of this project are: development and testing of practical biosphere models for application in long-term safety studies of radioactive waste disposal at different European locations, identification of features, events and processes that need to be modelled on a site-specific rather than on a generic basis, comparison of the results and quantification of the variability of site-specific models developed according to the reference biosphere methodology, development of a generic biosphere tool for application in long term safety studies, comparison of results from site-specific models to those from the generic one, and identification of possibilities and limitations for the application of the generic biosphere model. (orig.)

  2. Skill and independence weighting for multi-model assessments

    International Nuclear Information System (INIS)

    Sanderson, Benjamin M.; Wehner, Michael; Knutti, Reto

    2017-01-01

    We present a weighting strategy for use with the CMIP5 multi-model archive in the fourth National Climate Assessment, which considers both skill in the climatological performance of models over North America and the inter-dependency of models arising from common parameterizations or tuning practices. The method exploits information relating to the climatological mean state of a number of projection-relevant variables as well as metrics representing long-term statistics of weather extremes. The weights, once computed, can be used to compute weighted means and significance information from an ensemble containing multiple initial condition members from potentially co-dependent models of varying skill. Two parameters in the algorithm determine the degree to which model climatological skill and model uniqueness are rewarded; these parameters are explored and final values are defended for the assessment. The influence of model weighting on projected temperature and precipitation changes is found to be moderate, partly due to a compensating effect between model skill and uniqueness. However, more aggressive skill weighting and weighting by targeted metrics is found to have a more significant effect on inferred ensemble confidence in future patterns of change for a given projection.
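    A minimal sketch of a combined skill-and-independence weighting in the spirit of the approach described (not a reproduction of the published algorithm): skill decays with each model's distance to observations, models that sit close to many neighbours are down-weighted, and the two radius parameters stand in for the tuning parameters mentioned above. All values are invented.

```python
import numpy as np

def combined_weights(dist_to_obs, model_dist, d_skill, d_indep):
    """
    Skill-and-independence weights for an ensemble of models.
    dist_to_obs[i]: distance of model i to observations (skill term).
    model_dist[i, j]: inter-model distance (independence term).
    """
    skill = np.exp(-(dist_to_obs / d_skill) ** 2)
    similarity = np.exp(-(model_dist / d_indep) ** 2)
    np.fill_diagonal(similarity, 0.0)                 # a model is not its own neighbour
    independence = 1.0 / (1.0 + similarity.sum(axis=1))
    w = skill * independence
    return w / w.sum()

# Toy ensemble: 5 models, the last two nearly identical (e.g. a shared code base).
dist_to_obs = np.array([0.6, 0.9, 1.4, 0.7, 0.7])
model_dist = np.array([
    [0.0, 1.0, 1.2, 1.1, 1.1],
    [1.0, 0.0, 1.3, 0.9, 0.9],
    [1.2, 1.3, 0.0, 1.4, 1.4],
    [1.1, 0.9, 1.4, 0.0, 0.1],
    [1.1, 0.9, 1.4, 0.1, 0.0],
])
print(combined_weights(dist_to_obs, model_dist, d_skill=0.8, d_indep=0.5))
```

    In this toy case the two near-duplicate models end up sharing roughly the weight that a single independent model of the same skill would receive, which is the qualitative behaviour such schemes aim for.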

  3. A Corrosion Risk Assessment Model for Underground Piping

    Science.gov (United States)

    Datta, Koushik; Fraser, Douglas R.

    2009-01-01

    The Pressure Systems Manager at NASA Ames Research Center (ARC) has embarked on a project to collect data and develop risk assessment models to support risk-informed decision making regarding future inspections of underground pipes at ARC. This paper shows progress in one area of this project - a corrosion risk assessment model for the underground high-pressure air distribution piping system at ARC. It consists of a Corrosion Model for pipe segments, a Pipe Wrap Protection Model, and a Pipe Stress Model for a pipe segment. A Monte Carlo simulation of the combined models provides a distribution of the failure probabilities. Sensitivity study results show that the model uncertainty, or lack of knowledge, is the dominant contributor to the calculated unreliability of the underground piping system. As a result, the Pressure Systems Manager may consider investing resources specifically focused on reducing these uncertainties. Future work includes completing the data collection effort for the existing ground based pressure systems and applying the risk models to risk-based inspection strategies of the underground pipes at ARC.
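    A toy Monte Carlo combination of the three sub-models named above (corrosion, wrap protection, stress) shows the general pattern of propagating uncertainty through chained models to a failure probability. Every distribution and parameter value here is invented and unrelated to the ARC system.

```python
import numpy as np

rng = np.random.default_rng(3)
n_trials = 50_000

wall0_mm = 6.0     # assumed original wall thickness
age_yr = 40.0      # assumed pipe age

# Pipe Wrap Protection Model: is the wrap still intact? (invented probability)
wrap_intact = rng.random(n_trials) < 0.7

# Corrosion Model: lognormal corrosion rate, slowed where the wrap is intact.
corr_rate = rng.lognormal(mean=np.log(0.05), sigma=0.5, size=n_trials)   # mm/yr
corr_rate = np.where(wrap_intact, 0.2 * corr_rate, corr_rate)
wall_now = wall0_mm - corr_rate * age_yr

# Pipe Stress Model: wall thickness required to carry the pressure load.
required_wall = rng.normal(2.5, 0.3, size=n_trials)

fail = wall_now < required_wall
print(f"estimated failure probability: {fail.mean():.3f}")
```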

  4. An analytical model for the assessment of airline expansion strategies

    Directory of Open Access Journals (Sweden)

    Mauricio Emboaba Moreira

    2014-01-01

    Purpose: The purpose of this article is to develop an analytical model to assess airline expansion strategies by combining generic business strategy models with airline business models. Methodology and approach: A number of airline business models are examined, as are Porter's (1983) industry five forces that drive competition, complemented by Nalebuff/Brandenburger's (1996) sixth force, and the basic elements of the general environment in which the expansion process takes place. A system of points and weights is developed to create a score among the 904,736 possible combinations considered. The model's outputs are generic expansion strategies with quantitative assessments for each specific combination of elements inputted. Originality and value: The analytical model developed is original because it combines for the first time and explicitly elements of the general environment, industry environment, airline business models and the generic expansion strategy types. Besides, it creates a system of scores that may be used to drive the decision process toward the choice of a specific strategic expansion path. Research implications: The analytical model may be adapted to other industries apart from the airline industry by substituting the element "airline business model" with the corresponding elements of other industries, related to their different specific business models.

  5. A Methodology to Assess Ionospheric Models for GNSS

    Science.gov (United States)

    Rovira-Garcia, Adria; Juan, José Miguel; Sanz, Jaume; González-Casado, Guillermo; Ibánez, Deimos

    2015-04-01

    Testing the accuracy of the ionospheric models used in the Global Navigation Satellite System (GNSS) is a long-standing issue. It is still a challenging problem due to the lack of accurate enough slant ionospheric determinations to be used as a reference. The present study proposes a methodology to assess any ionospheric model used in satellite-based applications and, in particular, GNSS ionospheric models. The methodology complements other analyses that compare navigation performance when different models are used to correct the code and carrier-phase observations. Specifically, the following ionospheric models are assessed: the operational models broadcast in the Global Positioning System (GPS), Galileo and the European Geostationary Navigation Overlay System (EGNOS), the post-process Global Ionospheric Maps (GIMs) from different analysis centers belonging to the International GNSS Service (IGS) and, finally, a new GIM computed by the gAGE/UPC research group. The methodology is based on the comparison of the predictions of the ionospheric model with actual unambiguous carrier-phase measurements from a global distribution of permanent receivers. The differences are then separated into the hardware delays (a receiver constant plus a satellite constant) per data interval, e.g., a day. The condition that these Differential Code Biases (DCBs) are commonly shared throughout the world-wide network of receivers and satellites provides a global character to the assessment. This approach generalizes simple tests based on double differenced Slant Total Electron Contents (STECs) between pairs of satellites and receivers on a much more local scale. The present study covers the entire year 2014, i.e., the last Solar Maximum. The seasonal and latitudinal structures of the results clearly reflect the different strategies used by the different models. On one hand, ionospheric model corrections based on a grid (IGS-GIMs or EGNOS) are shown to be several times better than the models
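    The separation of model-minus-measurement differences into a receiver constant plus a satellite constant per day amounts to a rank-deficient least-squares problem. The small synthetic sketch below illustrates the idea; the zero-mean constraint on the satellite biases is an assumed (though common) choice, not necessarily the one used by the authors.

```python
import numpy as np

rng = np.random.default_rng(4)
n_rec, n_sat = 6, 8

# Synthetic "model minus measurement" residuals: receiver bias + satellite bias + noise.
true_rec = rng.normal(0, 3, n_rec)
true_sat = rng.normal(0, 3, n_sat)
obs = true_rec[:, None] + true_sat[None, :] + rng.normal(0, 0.2, (n_rec, n_sat))

# Design matrix for one day: one column per receiver constant and per satellite constant.
rows, y = [], []
for r in range(n_rec):
    for s in range(n_sat):
        row = np.zeros(n_rec + n_sat)
        row[r] = 1.0
        row[n_rec + s] = 1.0
        rows.append(row)
        y.append(obs[r, s])
A, y = np.array(rows), np.array(y)

# The system is rank-deficient (a constant can move freely between receivers and
# satellites), so pin the satellite biases to zero mean with an extra equation.
constraint = np.zeros(n_rec + n_sat)
constraint[n_rec:] = 1.0
A = np.vstack([A, constraint])
y = np.append(y, 0.0)

est = np.linalg.lstsq(A, y, rcond=None)[0]
print("recovered satellite biases (zero-mean):", np.round(est[n_rec:], 2))
```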

  6. Risk assessment and remedial policy evaluation using predictive modeling

    International Nuclear Information System (INIS)

    Linkov, L.; Schell, W.R.

    1996-01-01

    As a result of nuclear industry operation and accidents, large areas of natural ecosystems have been contaminated by radionuclides and toxic metals. Extensive societal pressure has been exerted to decrease the radiation dose to the population and to the environment. Thus, in making abatement and remediation policy decisions, not only economic costs but also human and environmental risk assessments are desired. This paper introduces a general framework for risk assessment and remedial policy evaluation using predictive modeling. Ecological risk assessment requires evaluation of the radionuclide distribution in ecosystems. The FORESTPATH model is used for predicting the radionuclide fate in forest compartments after deposition as well as for evaluating the efficiency of remedial policies. The time of intervention and the radionuclide deposition profile were predicted to be crucial for remediation efficiency. Risk assessment conducted for a critical group of forest users in Belarus shows that consumption of forest products (berries and mushrooms) leads to about 0.004% risk of a fatal cancer annually. Cost-benefit analysis for forest cleanup suggests that complete removal of the organic layer is too expensive for application in Belarus and a better methodology is required. In conclusion, the FORESTPATH modeling framework could have wide applications in environmental remediation of radionuclides and toxic metals as well as in dose reconstruction and risk assessment

  7. Peer Assessment with Online Tools to Improve Student Modeling

    Science.gov (United States)

    Atkins, Leslie J.

    2012-11-01

    Introductory physics courses often require students to develop precise models of phenomena and represent these with diagrams, including free-body diagrams, light-ray diagrams, and maps of field lines. Instructors expect that students will adopt a certain rigor and precision when constructing these diagrams, but we want that rigor and precision to be an aid to sense-making rather than meeting seemingly arbitrary requirements set by the instructor. By giving students the authority to develop their own models and establish requirements for their diagrams, the sense that these are arbitrary requirements diminishes and students are more likely to see modeling as a sense-making activity. The practice of peer assessment can help students take ownership; however, it can be difficult for instructors to manage. Furthermore, it is not without risk: students can be reluctant to critique their peers, they may view this as the job of the instructor, and there is no guarantee that students will employ greater rigor and precision as a result of peer assessment. In this article, we describe one approach for peer assessment that can establish norms for diagrams in a way that is student driven, where students retain agency and authority in assessing and improving their work. We show that such an approach does indeed improve students' diagrams and abilities to assess their own work, without sacrificing students' authority and agency.

  8. Interactive Rapid Dose Assessment Model (IRDAM): reactor-accident assessment methods. Vol.2

    International Nuclear Information System (INIS)

    Poeton, R.W.; Moeller, M.P.; Laughlin, G.J.; Desrosiers, A.E.

    1983-05-01

    As part of the continuing emphasis on emergency preparedness, the US Nuclear Regulatory Commission (NRC) sponsored the development of a rapid dose assessment system by Pacific Northwest Laboratory (PNL). This system, the Interactive Rapid Dose Assessment Model (IRDAM) is a micro-computer based program for rapidly assessing the radiological impact of accidents at nuclear power plants. This document describes the technical bases for IRDAM including methods, models and assumptions used in calculations. IRDAM calculates whole body (5-cm depth) and infant thyroid doses at six fixed downwind distances between 500 and 20,000 meters. Radionuclides considered primarily consist of noble gases and radioiodines. In order to provide a rapid assessment capability consistent with the capacity of the Osborne-1 computer, certain simplifying approximations and assumptions are made. These are described, along with default values (assumptions used in the absence of specific input) in the text of this document. Two companion volumes to this one provide additional information on IRDAM. The user's Guide (NUREG/CR-3012, Volume 1) describes the setup and operation of equipment necessary to run IRDAM. Scenarios for Comparing Dose Assessment Models (NUREG/CR-3012, Volume 3) provides the results of calculations made by IRDAM and other models for specific accident scenarios

  9. A critique of recent models for human error rate assessment

    International Nuclear Information System (INIS)

    Apostolakis, G.E.

    1988-01-01

    This paper critically reviews two groups of models for assessing human error rates under accident conditions. The first group, which includes the US Nuclear Regulatory Commission (NRC) handbook model and the human cognitive reliability (HCR) model, considers as fundamental the time that is available to the operators to act. The second group, which is represented by the success likelihood index methodology multiattribute utility decomposition (SLIM-MAUD) model, relies on ratings of the human actions with respect to certain qualitative factors and the subsequent derivation of error rates. These models are evaluated with respect to two criteria: the treatment of uncertainties and the internal coherence of the models. In other words, this evaluation focuses primarily on normative aspects of these models. The principal findings are as follows: (1) Both of the time-related models provide human error rates as a function of the available time for action and the prevailing conditions. However, the HCR model ignores the important issue of state-of-knowledge uncertainties, dealing exclusively with stochastic uncertainty, whereas the model presented in the NRC handbook handles both types of uncertainty. (2) SLIM-MAUD provides a highly structured approach for the derivation of human error rates under given conditions. However, the treatment of the weights and ratings in this model is internally inconsistent. (author)

  10. Assessment of the Eu migration experiments and their modelling

    International Nuclear Information System (INIS)

    Klotz, D.

    2001-01-01

    The humic acid-mediated transport of heavy metals in groundwater was investigated in laboratory experiments using the lanthanide Eu in the form of ¹⁵²Eu³⁺, which serves both as a model heavy metal and as an indicator for assessing the potential hazards of ultimate storage sites for radioactive waste

  11. Application of mixed models for the assessment genotype and ...

    African Journals Online (AJOL)

    Application of mixed models for the assessment of genotype and environment interactions in cotton (Gossypium hirsutum) cultivars in Mozambique. ... The cultivars ISA 205, STAM 42 and REMU 40 showed superior productivity when they were selected by the Harmonic Mean of Genotypic Values (HMGV) criterion in relation ...

  12. Groundwater Impacts of Radioactive Wastes and Associated Environmental Modeling Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Ma, Rui; Zheng, Chunmiao; Liu, Chongxuan

    2012-11-01

    This article provides a review of the major sources of radioactive wastes and their impacts on groundwater contamination. The review discusses the major biogeochemical processes that control the transport and fate of radionuclide contaminants in groundwater, and describes the evolution of mathematical models designed to simulate and assess the transport and transformation of radionuclides in groundwater.

  13. Task-based dermal exposure models for regulatory risk assessment

    NARCIS (Netherlands)

    Warren, N.D.; Marquart, H.; Christopher, Y.; Laitinen, J.; Hemmen, J.J. van

    2006-01-01

    The regulatory risk assessment of chemicals requires the estimation of occupational dermal exposure. Until recently, the models used were either based on limited data or were specific to a particular class of chemical or application. The EU project RISKOFDERM has gathered a considerable number of

  14. Confidence Intervals for Assessing Heterogeneity in Generalized Linear Mixed Models

    Science.gov (United States)

    Wagler, Amy E.

    2014-01-01

    Generalized linear mixed models are frequently applied to data with clustered categorical outcomes. The effect of clustering on the response is often difficult to practically assess partly because it is reported on a scale on which comparisons with regression parameters are difficult to make. This article proposes confidence intervals for…

  15. Modeling current climate conditions for forest pest risk assessment

    Science.gov (United States)

    Frank H. Koch; John W. Coulston

    2010-01-01

    Current information on broad-scale climatic conditions is essential for assessing potential distribution of forest pests. At present, sophisticated spatial interpolation approaches such as the Parameter-elevation Regressions on Independent Slopes Model (PRISM) are used to create high-resolution climatic data sets. Unfortunately, these data sets are based on 30-year...

  16. Assessment of the Quality Management Models in Higher Education

    Science.gov (United States)

    Basar, Gulsun; Altinay, Zehra; Dagli, Gokmen; Altinay, Fahriye

    2016-01-01

    This study involves the assessment of the quality management models in Higher Education by explaining the importance of quality in higher education and by examining the higher education quality assurance system practices in other countries. The qualitative study was carried out with the members of the Higher Education Planning, Evaluation,…

  17. Model assessment of protective barrier designs: Part 2

    International Nuclear Information System (INIS)

    Fayer, M.J.

    1987-11-01

    Protective barriers are being considered for use at the Hanford Site to enhance the isolation of radioactive wastes from water, plant, and animal intrusion. This study assesses the effectiveness of protective barriers for isolation of wastes from water. In this report, barrier designs are reviewed and several barrier modeling assumptions are tested. 20 refs., 16 figs., 6 tabs

  18. Modeling Logistic Performance in Quantitative Microbial Risk Assessment

    NARCIS (Netherlands)

    Rijgersberg, H.; Tromp, S.O.; Jacxsens, L.; Uyttendaele, M.

    2010-01-01

    In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage

  19. Heuristic Model Of The Composite Quality Index Of Environmental Assessment

    Science.gov (United States)

    Khabarov, A. N.; Knyaginin, A. A.; Bondarenko, D. V.; Shepet, I. P.; Korolkova, L. N.

    2017-01-01

    The goal of the paper is to present the heuristic model of the composite environmental quality index based on the integrated application of the elements of utility theory, multidimensional scaling, expert evaluation and decision-making. The composite index is synthesized in linear-quadratic form; it provides higher adequacy of the assessment results to the preferences of experts and decision-makers.

  20. An Empirical Study of a Solo Performance Assessment Model

    Science.gov (United States)

    Russell, Brian E.

    2015-01-01

    The purpose of this study was to test a hypothesized model of solo music performance assessment. Specifically, this study investigates the influence of technique and musical expression on perceptions of overall performance quality. The Aural Musical Performance Quality (AMPQ) measure was created to measure overall performance quality, technique,…

  1. A model for assessing Medicago Sativa L. hay quality | Scholtz ...

    African Journals Online (AJOL)

    A study was conducted to identify chemical parameters and/or models for assessing Medicago sativa L. (L) hay quality, using near infrared reflectance spectroscopy (NIRS) analysis and Cornell Net Carbohydrate and Protein System (CNCPS) milk prediction as a criterion of accuracy. Milk yield (MY) derived from the ...

  2. A Comprehensive Assessment Model for Critical Infrastructure Protection

    Directory of Open Access Journals (Sweden)

    Häyhtiö Markus

    2017-12-01

    International business demands seamless service and IT-infrastructure throughout the entire supply chain. However, dependencies between different parts of this vulnerable ecosystem form a fragile web. Assessment of the financial effects of any abnormalities in any part of the network is demanded in order to protect this network in a financially viable way. The contractual environment between the actors in a supply chain, different business domains and functions requires a management model that enables network-wide protection of critical infrastructure. In this paper the authors introduce such a model. It can be used to assess financial differences between centralized and decentralized protection of critical infrastructure. As an end result of this assessment, business resilience to unknown threats can be improved across the entire supply chain.

  3. Validation study of safety assessment model for radioactive waste disposal

    Energy Technology Data Exchange (ETDEWEB)

    Munakata, Masahiro; Takeda, Seiji; Kimura, Hideo [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2001-12-01

    The JAERI-AECL collaboration research program has been conducted to validate groundwater flow and radionuclide transport models for safety assessment. JAERI has developed a geostatistical model for radionuclide transport through heterogeneous geological media and verified it using experimental results from field tracer tests. The simulated tracer plumes explain the experimental tracer plumes favorably. A regional groundwater flow and transport model using site-scale parameters obtained from the tracer tests has been verified by comparing simulation results with observations of a natural environmental tracer. (author)

  4. Model Test Bed for Evaluating Wave Models and Best Practices for Resource Assessment and Characterization

    Energy Technology Data Exchange (ETDEWEB)

    Neary, Vincent Sinclair [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Water Power Technologies; Yang, Zhaoqing [Pacific Northwest National Lab. (PNNL), Richland, WA (United States). Coastal Sciences Division; Wang, Taiping [Pacific Northwest National Lab. (PNNL), Richland, WA (United States). Coastal Sciences Division; Gunawan, Budi [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Water Power Technologies; Dallman, Ann Renee [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Water Power Technologies

    2016-03-01

    A wave model test bed is established to benchmark, test and evaluate spectral wave models and modeling methodologies (i.e., best practices) for predicting the wave energy resource parameters recommended by the International Electrotechnical Commission, IEC TS 62600-101Ed. 1.0 ©2015. Among other benefits, the model test bed can be used to investigate the suitability of different models, specifically what source terms should be included in spectral wave models under different wave climate conditions and for different classes of resource assessment. The overarching goal is to use these investigations to provide industry guidance for model selection and modeling best practices depending on the wave site conditions and desired class of resource assessment. Modeling best practices are reviewed, and limitations and knowledge gaps in predicting wave energy resource parameters are identified.

  5. Spatial variability and parametric uncertainty in performance assessment models

    International Nuclear Information System (INIS)

    Pensado, Osvaldo; Mancillas, James; Painter, Scott; Tomishima, Yasuo

    2011-01-01

    The problem of defining an appropriate treatment of distribution functions (which could represent spatial variability or parametric uncertainty) is examined based on a generic performance assessment model for a high-level waste repository. The generic model incorporated source term models available in GoldSim®, the TDRW code for contaminant transport in sparse fracture networks with a complex fracture-matrix interaction process, and a biosphere dose model known as BDOSE™. Using the GoldSim framework, several Monte Carlo sampling approaches and transport conceptualizations were evaluated to explore the effect of various treatments of spatial variability and parametric uncertainty on dose estimates. Results from a model employing a representative source and ensemble-averaged pathway properties were compared to results from a model allowing for stochastic variation of transport properties along streamline segments (i.e., explicit representation of spatial variability within a Monte Carlo realization). We concluded that the sampling approach and the definition of an ensemble representative do influence consequence estimates. In the examples analyzed in this paper, approaches considering limited variability of a transport resistance parameter along a streamline increased the frequency of fast pathways resulting in relatively high dose estimates, while those allowing for broad variability along streamlines increased the frequency of 'bottlenecks' reducing dose estimates. On this basis, simplified approaches with limited consideration of variability may suffice for intended uses of the performance assessment model, such as evaluation of site safety. (author)

  6. Assessing groundwater policy with coupled economic-groundwater hydrologic modeling

    Science.gov (United States)

    Mulligan, Kevin B.; Brown, Casey; Yang, Yi-Chen E.; Ahlfeld, David P.

    2014-03-01

    This study explores groundwater management policies and the effect of modeling assumptions on the projected performance of those policies. The study compares an optimal economic allocation for groundwater use subject to streamflow constraints, achieved by a central planner with perfect foresight, with a uniform tax on groundwater use and a uniform quota on groundwater use. The policies are compared with two modeling approaches, the Optimal Control Model (OCM) and the Multi-Agent System Simulation (MASS). The economic decision models are coupled with a physically based representation of the aquifer using a calibrated MODFLOW groundwater model. The results indicate that uniformly applied policies perform poorly when simulated with more realistic, heterogeneous, myopic, and self-interested agents. In particular, the effects of the physical heterogeneity of the basin and the agents undercut the perceived benefits of policy instruments assessed with simple, single-cell groundwater modeling. This study demonstrates the results of coupling realistic hydrogeology and human behavior models to assess groundwater management policies. The Republican River Basin, which overlies a portion of the Ogallala aquifer in the High Plains of the United States, is used as a case study for this analysis.

  7. Some considerations for validation of repository performance assessment models

    International Nuclear Information System (INIS)

    Eisenberg, N.

    1991-01-01

    Validation is an important aspect of the regulatory uses of performance assessment. A substantial body of literature exists indicating the manner in which validation of models is usually pursued. Because performance models for a nuclear waste repository cannot be tested over the long time periods for which the model must make predictions, the usual avenue for model validation is precluded. Further impediments to model validation include a lack of fundamental scientific theory to describe important aspects of repository performance and an inability to easily deduce the complex, intricate structures characteristic of a natural system. A successful strategy for validation must attempt to resolve these difficulties in a direct fashion. Although some procedural aspects will be important, the main reliance of validation should be on scientific substance and logical rigor. The level of validation needed will be mandated, in part, by the uses to which these models are put, rather than by the ideal of validation of a scientific theory. Because of the importance of the validation of performance assessment models, the NRC staff has engaged in a program of research and international cooperation to seek progress in this important area. 2 figs., 16 refs

  8. Assessment of health surveys: fitting a multidimensional graded response model.

    Science.gov (United States)

    Depaoli, Sarah; Tiemensma, Jitske; Felt, John M

    The multidimensional graded response model, an item response theory (IRT) model, can be used to improve the assessment of surveys, even when sample sizes are restricted. Typically, health-based survey development utilizes classical statistical techniques (e.g. reliability and factor analysis). In a review of four prominent journals within the field of Health Psychology, we found that IRT-based models were used in less than 10% of the studies examining scale development or assessment. However, implementing IRT-based methods can provide more details about individual survey items, which is useful when determining the final item content of surveys. An example using a quality of life survey for Cushing's syndrome (CushingQoL) highlights the main components for implementing the multidimensional graded response model. Patients with Cushing's syndrome (n = 397) completed the CushingQoL. Results from the multidimensional graded response model supported a 2-subscale scoring process for the survey. All items were deemed as worthy contributors to the survey. The graded response model can accommodate unidimensional or multidimensional scales, be used with relatively lower sample sizes, and is implemented in free software (example code provided in online Appendix). Use of this model can help to improve the quality of health-based scales being developed within the Health Sciences.
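    For readers unfamiliar with the model, the category probabilities of Samejima's graded response model for a single item can be written compactly. The discrimination and threshold values below are made up, and the item is only loosely styled on a CushingQoL-type question; this is not the code referenced in the article's appendix.

```python
import numpy as np

def grm_category_probs(theta, a, thresholds):
    """
    Samejima's graded response model for one item: probability of each response
    category given latent trait theta, discrimination a, and ordered category
    thresholds b_1 < ... < b_{K-1}.
    """
    def p_at_least(b):
        return 1.0 / (1.0 + np.exp(-a * (theta - b)))
    cum = np.array([1.0] + [p_at_least(b) for b in thresholds] + [0.0])
    return -np.diff(cum)          # P(X = k) = P(X >= k) - P(X >= k + 1)

# Hypothetical 4-category item for a respondent with above-average trait level.
probs = grm_category_probs(theta=0.5, a=1.8, thresholds=[-1.0, 0.0, 1.2])
print(np.round(probs, 3), "sum =", probs.sum())
```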

  9. Modeling human intention formation for human reliability assessment

    International Nuclear Information System (INIS)

    Woods, D.D.; Roth, E.M.; Pople, H. Jr.

    1988-01-01

    This paper describes a dynamic simulation capability for modeling how people form intentions to act in nuclear power plant emergency situations. This modeling tool, Cognitive Environment Simulation or CES, was developed based on techniques from artificial intelligence. It simulates the cognitive processes that determine situation assessment and intention formation. It can be used to investigate analytically what situations and factors lead to intention failures, what actions follow from intention failures (e.g. errors of omission, errors of commission, common mode errors), the ability to recover from errors or additional machine failures, and the effects of changes in the NPP person-machine system. One application of the CES modeling environment is to enhance the measurement of the human contribution to risk in probabilistic risk assessment studies. (author)

  10. Connecting single-stock assessment models through correlated survival

    DEFF Research Database (Denmark)

    Albertsen, Christoffer Moesgaard; Nielsen, Anders; Thygesen, Uffe Høgsbro

    2017-01-01

    times. We propose a simple alternative. In three case studies each with two stocks, we improve the single-stock models, as measured by Akaike information criterion, by adding correlation in the cohort survival. To limit the number of parameters, the correlations are parameterized through ... the corresponding partial correlations. We consider six models where the partial correlation matrix between stocks follows a band structure ranging from independent assessments to complex correlation structures. Further, a simulation study illustrates the importance of handling correlated data sufficiently ... by investigating the coverage of confidence intervals for estimated fishing mortality. The results presented will allow managers to evaluate stock statuses based on a more accurate evaluation of model output uncertainty. The methods are directly implementable for stocks with an analytical assessment and do
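    One way to see how a banded partial-correlation parameterization induces a full correlation matrix across the survival deviations of two stocks is sketched below. The construction (a precision matrix with unit diagonal) is a generic device, not necessarily the exact parameterization used in the paper, and the band values are invented.

```python
import numpy as np

def corr_from_partial(partial):
    """
    Build a correlation matrix from a matrix of partial correlations by forming
    a precision matrix with unit diagonal (one valid construction among several).
    """
    K = np.eye(len(partial)) - (partial - np.diag(np.diag(partial)))
    cov = np.linalg.inv(K)
    d = np.sqrt(np.diag(cov))
    return cov / np.outer(d, d)

# Banded partial-correlation structure across 6 ages and two stocks:
# only neighbouring ages within a stock, and the matching age in the other stock, interact.
n_ages = 6
partial = np.eye(2 * n_ages)
for i in range(2 * n_ages - 1):
    if (i + 1) % n_ages != 0:                       # adjacent ages within the same stock
        partial[i, i + 1] = partial[i + 1, i] = 0.3
for i in range(n_ages):                              # same age, other stock
    partial[i, i + n_ages] = partial[i + n_ages, i] = 0.2

print(np.round(corr_from_partial(partial)[:4, :4], 2))
```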

  11. Dynamic model based on Bayesian method for energy security assessment

    International Nuclear Information System (INIS)

    Augutis, Juozas; Krikštolaitis, Ričardas; Pečiulytė, Sigita; Žutautaitė, Inga

    2015-01-01

    Highlights: • Methodology for dynamic indicator model construction and forecasting of indicators. • Application of dynamic indicator model for energy system development scenarios. • Expert judgement involvement using Bayesian method. - Abstract: The methodology for the dynamic indicator model construction and forecasting of indicators for the assessment of energy security level is presented in this article. An indicator is a special index, which provides numerical values to important factors for the investigated area. In real life, models of different processes take into account various factors that are time-dependent and dependent on each other. Thus, it is advisable to construct a dynamic model in order to describe these dependences. The energy security indicators are used as factors in the dynamic model. Usually, the values of indicators are obtained from statistical data. The developed dynamic model enables forecasting of indicators' variation, taking into account changes in system configuration. The energy system development is usually based on the construction of a new object. Since the parameters of changes of the new system are not exactly known, information about their influence on indicators cannot be incorporated in the model by deterministic methods. Thus, the dynamic indicators' model based on historical data is adjusted by a probabilistic model with the influence of new factors on indicators using the Bayesian method

  12. An Improved Nested Sampling Algorithm for Model Selection and Assessment

    Science.gov (United States)

    Zeng, X.; Ye, M.; Wu, J.; WANG, D.

    2017-12-01

    The multimodel strategy is a general approach for treating model structure uncertainty in recent research. The unknown groundwater system is represented by several plausible conceptual models. Each alternative conceptual model is assigned a weight which represents the plausibility of that model. In the Bayesian framework, the posterior model weight is computed as the product of the model prior weight and the marginal likelihood (also termed the model evidence). As a result, estimating marginal likelihoods is crucial for reliable model selection and assessment in multimodel analysis. The nested sampling estimator (NSE) is a newly proposed algorithm for marginal likelihood estimation. The implementation of NSE comprises searching the parameter space gradually from low-likelihood to high-likelihood areas, and this evolution proceeds iteratively via a local sampling procedure. Thus, the efficiency of NSE is dominated by the strength of the local sampling procedure. Currently, the Metropolis-Hastings (M-H) algorithm and its variants are often used for local sampling in NSE. However, M-H is not an efficient sampling algorithm for high-dimensional or complex likelihood functions. To improve the performance of NSE, it is feasible to integrate a more efficient and elaborate sampling algorithm, DREAMzs, into the local sampling. In addition, in order to overcome the computational burden of the large number of repeated model executions in marginal likelihood estimation, an adaptive sparse grid stochastic collocation method is used to build surrogates for the original groundwater model.
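    The core loop of a nested sampling estimator is short enough to sketch. The toy below uses a one-dimensional uniform prior and a Gaussian likelihood, and replaces the worst live point by naive rejection sampling; a real implementation would use an MCMC kernel such as M-H or DREAMzs, and this is only an illustration of the idea, not the authors' algorithm.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy problem: uniform prior on [0, 10], Gaussian likelihood centred at 5 with sd 0.5.
def loglike(x):
    return -0.5 * ((x - 5.0) / 0.5) ** 2 - np.log(0.5 * np.sqrt(2.0 * np.pi))

def nested_sampling(n_live=100, n_iter=600):
    live = rng.uniform(0.0, 10.0, n_live)
    live_logl = loglike(live)
    log_z, log_x_prev = -np.inf, 0.0
    for i in range(1, n_iter + 1):
        worst = np.argmin(live_logl)
        log_x = -i / n_live                                  # expected shrinkage of prior volume
        log_w = np.log(np.exp(log_x_prev) - np.exp(log_x))   # weight of the discarded point
        log_z = np.logaddexp(log_z, live_logl[worst] + log_w)
        # Replace the worst point with a new prior draw above the likelihood threshold.
        threshold = live_logl[worst]
        x_new = rng.uniform(0.0, 10.0)
        while loglike(x_new) <= threshold:
            x_new = rng.uniform(0.0, 10.0)
        live[worst], live_logl[worst] = x_new, loglike(x_new)
        log_x_prev = log_x
    # Add the contribution of the remaining live points.
    log_z = np.logaddexp(log_z, np.log(np.exp(live_logl).mean()) + log_x_prev)
    return log_z

# Analytic evidence is ~0.1: the Gaussian integrates to 1 over a prior of width 10.
print("estimated log-evidence:", round(float(nested_sampling()), 2),
      " analytic:", round(np.log(0.1), 2))
```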

  13. Agent Model Development for Assessing Climate-Induced Geopolitical Instability.

    Energy Technology Data Exchange (ETDEWEB)

    Boslough, Mark B.; Backus, George A.

    2005-12-01

    We present the initial stages of development of new agent-based computational methods to generate and test hypotheses about linkages between environmental change and international instability. This report summarizes the first year's effort of an originally proposed three-year Laboratory Directed Research and Development (LDRD) project. The preliminary work focused on a set of simple agent-based models and benefited from lessons learned in previous related projects and case studies of human response to climate change and environmental scarcity. Our approach was to define a qualitative model using extremely simple cellular agent models akin to Lovelock's Daisyworld and Schelling's segregation model. Such models do not require significant computing resources, and users can modify behavior rules to gain insights. One of the difficulties in agent-based modeling is finding the right balance between model simplicity and real-world representation. Our approach was to keep agent behaviors as simple as possible during the development stage (described herein) and to ground them with a realistic geospatial Earth system model in subsequent years. This work is directed toward incorporating projected climate data--including various CO2 scenarios from the Intergovernmental Panel on Climate Change (IPCC) Third Assessment Report--and ultimately toward coupling a useful agent-based model to a general circulation model.
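    A Schelling-style relocation rule on a small grid captures the flavour of the "extremely simple cellular agent models" mentioned above. The grid size, tolerance and relocation dynamics below are arbitrary choices for illustration, not a description of the Sandia model.

```python
import numpy as np

rng = np.random.default_rng(6)
SIZE, EMPTY_FRAC, TOLERANCE = 30, 0.2, 0.5

# 0 = empty cell; 1 and 2 are two agent types on a square lattice.
grid = rng.choice([0, 1, 2], size=(SIZE, SIZE),
                  p=[EMPTY_FRAC, (1 - EMPTY_FRAC) / 2, (1 - EMPTY_FRAC) / 2])

def similarity(g, i, j):
    """Fraction of occupied neighbours sharing the agent's type (NaN if isolated)."""
    nb = g[max(i - 1, 0):i + 2, max(j - 1, 0):j + 2].ravel()
    nb = nb[nb != 0]
    if len(nb) <= 1:               # only the agent itself
        return np.nan
    return ((nb == g[i, j]).sum() - 1) / (len(nb) - 1)

def step(g):
    """Move every dissatisfied agent to a randomly chosen empty cell."""
    movers = [(i, j) for i in range(SIZE) for j in range(SIZE)
              if g[i, j] != 0 and similarity(g, i, j) < TOLERANCE]
    empties = np.argwhere(g == 0)
    rng.shuffle(empties)
    for (i, j), (ei, ej) in zip(movers, empties):
        g[ei, ej], g[i, j] = g[i, j], 0
    return len(movers)

for _ in range(50):                # iterate until no agent wants to move (or 50 steps)
    if step(grid) == 0:
        break

occupied = [(i, j) for i in range(SIZE) for j in range(SIZE) if grid[i, j] != 0]
print("mean same-type neighbour fraction:",
      round(np.nanmean([similarity(grid, i, j) for i, j in occupied]), 2))
```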

  14. Models for waste life cycle assessment: Review of technical assumptions

    DEFF Research Database (Denmark)

    Gentil, Emmanuel; Damgaard, Anders; Hauschild, Michael Zwicky

    2010-01-01

    A number of waste life cycle assessment (LCA) models have been gradually developed since the early 1990s, in a number of countries, usually independently from each other. Large discrepancies in results have been observed among different waste LCA models, although it has also been shown that results ... from different LCA studies can be consistent. This paper is an attempt to identify, review and analyse methodologies and technical assumptions used in various parts of selected waste LCA models. Several criteria were identified, which could have significant impacts on the results ..., such as the functional unit, system boundaries, waste composition and energy modelling. The modelling assumptions of waste management processes, ranging from collection, transportation, intermediate facilities, recycling, thermal treatment, biological treatment, and landfilling, are obviously critical when comparing

  15. Assessing uncertainty in SRTM elevations for global flood modelling

    Science.gov (United States)

    Hawker, L. P.; Rougier, J.; Neal, J. C.; Bates, P. D.

    2017-12-01

    The SRTM DEM is widely used as the topography input to flood models in data-sparse locations. Understanding spatial error in the SRTM product is crucial in constraining uncertainty about elevations and assessing the impact of these upon flood prediction. Assessment of SRTM error was carried out by Rodriguez et al (2006), but this did not explicitly quantify the spatial structure of vertical errors in the DEM, nor did it distinguish between errors over different types of landscape. As a result, there is a lack of information about the spatial structure of vertical errors of the SRTM in the landscape that matters most to flood models - the floodplain. Therefore, this study attempts this task by comparing SRTM, an error-corrected SRTM product (the MERIT DEM of Yamazaki et al., 2017) and near-truth LIDAR elevations for 3 deltaic floodplains (Mississippi, Po, Wax Lake) and a large lowland region (the Fens, UK). Using the error covariance function, calculated by comparing SRTM elevations to the near-truth LIDAR, perturbations of the 90m SRTM DEM were generated, producing a catalogue of plausible DEMs. This allows modellers to simulate a suite of plausible DEMs at any aggregated block size above native SRTM resolution. Finally, the generated DEMs were input into a hydrodynamic model of the Mekong Delta, built using the LISFLOOD-FP hydrodynamic model, to assess how DEM error affects the hydrodynamics and inundation extent across the domain. The end product of this is an inundation map with the probability of each pixel being flooded based on the catalogue of DEMs. In a world of increasing computer power, but a lack of detailed datasets, this powerful approach can be used throughout natural hazard modelling to understand how errors in the SRTM DEM can impact the hazard assessment.
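    A compact way to generate a catalogue of plausible DEMs of the kind described above is to draw correlated error fields from a fitted covariance model and add them to the SRTM surface. The exponential covariance form, its parameters, and the toy flood rule below are assumptions for illustration, not values or methods taken from the study.

```python
import numpy as np

rng = np.random.default_rng(7)

def correlated_error_field(n, cell_size, sigma, corr_length):
    """
    One realization of a spatially correlated vertical-error field on an n x n grid,
    drawn from a Gaussian with an exponential covariance (an assumed form; in practice
    the covariance would be fitted to SRTM-minus-LIDAR differences).
    """
    x = np.arange(n) * cell_size
    xx, yy = np.meshgrid(x, x)
    pts = np.column_stack([xx.ravel(), yy.ravel()])
    dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    cov = sigma ** 2 * np.exp(-dist / corr_length)
    L = np.linalg.cholesky(cov + 1e-8 * np.eye(len(cov)))
    return (L @ rng.standard_normal(len(cov))).reshape(n, n)

# Small illustrative DEM tile (20 x 20 cells at 90 m) and a 10-member perturbed ensemble.
dem = rng.uniform(0.0, 5.0, size=(20, 20))
ensemble = [dem + correlated_error_field(20, 90.0, sigma=3.0, corr_length=500.0)
            for _ in range(10)]

# Toy "hazard" summary: probability that each cell sits below a 2 m water level.
flood_prob = np.mean([d < 2.0 for d in ensemble], axis=0)
print("fraction of cells flooded in >50% of the ensemble:", round((flood_prob > 0.5).mean(), 2))
```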

  16. GEMA3D - landscape modelling for dose assessments

    International Nuclear Information System (INIS)

    Klos, Richard

    2010-08-01

    Concerns have been raised about SKB's interpretation of landscape objects in their radiological assessment models, specifically in relation to the size of the objects represented - leading to excessive volumetric dilution - and to the interpretation of local hydrology - leading to non-conservative hydrologic dilution. Developed from the Generic Ecosystem Modelling Approach, GEMA3D is an attempt to address these issues in a simple radiological assessment landscape model. In GEMA3D landscape features are modelled as landscape elements (lels) based on a three-compartment structure which is able to represent both terrestrial and aquatic lels. The area of the lels can be chosen to coincide with the bedrock fracture from which radionuclides are assumed to be released and the dispersion of radionuclides throughout the landscape can be traced. Results indicate that released contaminants remain localised close to the release location and follow the main flow axis of the surface drainage system. This is true even for relatively weakly sorbing species. An interpretation of the size of landscape elements suitable to represent dilution in the biosphere for radiological assessment purposes is suggested, though the concept remains flexible. For reference purposes an agricultural area of one hectare is the baseline. The Quaternary deposits (QD) at the Forsmark site are only a few metres thick above the crystalline bedrock in which the planned repository for spent fuel will be constructed. The biosphere model is assumed to be the upper one metre of the QD. A further model has been implemented for advective-dispersive transport in the deeper QD. The effects of chemical zonation have been briefly investigated. The results confirm the importance of retention close to the release point from the bedrock and clearly indicate that there is a need for a better description of the hydrology of the QD on the spatial scales relevant to the lels required for radiological assessments.
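
    To make the three-compartment lel structure concrete, the sketch below sets up a generic linear compartment model with first-order transfers between an upper soil box, a deeper box and a water body; the compartment names and rate constants are placeholders and are not GEMA3D's calibrated values.

        # Minimal three-compartment landscape element (lel) sketch: first-order transfers
        # between compartments; the rate constants and compartment names are placeholders.
        import numpy as np
        from scipy.integrate import solve_ivp

        k = {"top_to_deep": 0.05, "deep_to_top": 0.01, "top_to_water": 0.02,
             "decay": np.log(2) / 30.0}                  # 1/yr; illustrative values only

        def dydt(t, y):
            top, deep, water = y
            return [-(k["top_to_deep"] + k["top_to_water"] + k["decay"]) * top + k["deep_to_top"] * deep,
                    k["top_to_deep"] * top - (k["deep_to_top"] + k["decay"]) * deep,
                    k["top_to_water"] * top - k["decay"] * water]

        sol = solve_ivp(dydt, (0, 200), [1.0, 0.0, 0.0], dense_output=True)  # unit release in the top box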

  17. GEMA3D - landscape modelling for dose assessments

    Energy Technology Data Exchange (ETDEWEB)

    Klos, Richard (Aleksandria Sciences (United Kingdom))

    2010-08-15

    Concerns have been raised about SKB's interpretation of landscape objects in their radiological assessment models, specifically in relation to the size of the objects represented - leading to excessive volumetric dilution - and to the interpretation of local hydrology - leading to non-conservative hydrologic dilution. Developed from the Generic Ecosystem Modelling Approach, GEMA3D is an attempt to address these issues in a simple radiological assessment landscape model. In GEMA3D landscape features are modelled as landscape elements (lels) based on a three-compartment structure which is able to represent both terrestrial and aquatic lels. The area of the lels can be chosen to coincide with the bedrock fracture from which radionuclides are assumed to be released and the dispersion of radionuclides throughout the landscape can be traced. Results indicate that released contaminants remain localised close to the release location and follow the main flow axis of the surface drainage system. This is true even for relatively weakly sorbing species. An interpretation of the size of landscape elements suitable to represent dilution in the biosphere for radiological assessment purposes is suggested, though the concept remains flexible. For reference purposes an agricultural area of one hectare is the baseline. The Quaternary deposits (QD) at the Forsmark site are only a few metres thick above the crystalline bedrock in which the planned repository for spent fuel will be constructed. The biosphere model is assumed to be the upper one metre of the QD. A further model has been implemented for advective-dispersive transport in the deeper QD. The effects of chemical zonation have been briefly investigated. The results confirm the importance of retention close to the release point from the bedrock and clearly indicate that there is a need for a better description of the hydrology of the QD on the spatial scales relevant to the lels required for radiological assessments.

  18. The Assessment of Patient Clinical Outcome: Advantages, Models, Features of an Ideal Model

    Directory of Open Access Journals (Sweden)

    Mou’ath Hourani

    2016-06-01

    Background: The assessment of patient clinical outcome focuses on measuring various aspects of the health status of a patient who is under healthcare intervention. Patient clinical outcome assessment is a very significant process in the clinical field as it allows health care professionals to better understand the effectiveness of their health care programs and thus enhance health care quality in general. It is thus vital that a high quality, informative review of current issues regarding the assessment of patient clinical outcome should be conducted. Aims & Objectives: (1) summarize the advantages of the assessment of patient clinical outcome; (2) review some of the existing patient clinical outcome assessment models, namely simulation, Markov, Bayesian belief networks, Bayesian statistics and conventional statistics, and Kaplan-Meier analysis models; and (3) demonstrate the desired features that should be fulfilled by a well-established ideal patient clinical outcome assessment model. Material & Methods: An integrative review of the literature has been performed using Google Scholar to explore the field of patient clinical outcome assessment. Conclusion: This paper will directly support researchers, clinicians and health care professionals in their understanding of developments in the domain of the assessment of patient clinical outcome, thus enabling them to propose ideal assessment models.

  20. Avian collision risk models for wind energy impact assessments

    Energy Technology Data Exchange (ETDEWEB)

    Masden, E.A., E-mail: elizabeth.masden@uhi.ac.uk [Environmental Research Institute, North Highland College-UHI, University of the Highlands and Islands, Ormlie Road, Thurso, Caithness KW14 7EE (United Kingdom); Cook, A.S.C.P. [British Trust for Ornithology, The Nunnery, Thetford IP24 2PU (United Kingdom)

    2016-01-15

    With the increasing global development of wind energy, collision risk models (CRMs) are routinely used to assess the potential impacts of wind turbines on birds. We reviewed and compared the avian collision risk models currently available in the scientific literature, exploring aspects such as the calculation of a collision probability, inclusion of stationary components e.g. the tower, angle of approach and uncertainty. 10 models were cited in the literature and of these, all included a probability of collision of a single bird colliding with a wind turbine during passage through the rotor swept area, and the majority included a measure of the number of birds at risk. 7 out of the 10 models calculated the probability of birds colliding, whilst the remainder used a constant. We identified four approaches to calculate the probability of collision and these were used by others. 6 of the 10 models were deterministic and included the most frequently used models in the UK, with only 4 including variation or uncertainty in some way, the most recent using Bayesian methods. Despite their appeal, CRMs have their limitations and can be ‘data hungry’ as well as assuming much about bird movement and behaviour. As data become available, these assumptions should be tested to ensure that CRMs are functioning to adequately answer the questions posed by the wind energy sector. - Highlights: • We highlighted ten models available to assess avian collision risk. • Only 4 of the models included variability or uncertainty. • Collision risk models have limitations and can be ‘data hungry’. • It is vital that the most appropriate model is used for a given task.
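
    As a hedged illustration of the structure shared by most of the reviewed CRMs (a measure of the number of birds at risk combined with a single-transit collision probability and an avoidance rate), the sketch below computes an expected annual collision count; all input values are placeholders rather than parameters from any published model.

        # Generic structure shared by most collision risk models: expected collisions =
        # number of transits through the rotor-swept area x probability of collision on a
        # single transit x (1 - avoidance rate). All input values below are placeholders.
        def expected_collisions(transits_per_year, p_single_transit, avoidance_rate):
            return transits_per_year * p_single_transit * (1.0 - avoidance_rate)

        print(expected_collisions(transits_per_year=12_000, p_single_transit=0.08, avoidance_rate=0.98))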

  1. Avian collision risk models for wind energy impact assessments

    International Nuclear Information System (INIS)

    Masden, E.A.; Cook, A.S.C.P.

    2016-01-01

    With the increasing global development of wind energy, collision risk models (CRMs) are routinely used to assess the potential impacts of wind turbines on birds. We reviewed and compared the avian collision risk models currently available in the scientific literature, exploring aspects such as the calculation of a collision probability, inclusion of stationary components e.g. the tower, angle of approach and uncertainty. 10 models were cited in the literature and of these, all included a probability of collision of a single bird colliding with a wind turbine during passage through the rotor swept area, and the majority included a measure of the number of birds at risk. 7 out of the 10 models calculated the probability of birds colliding, whilst the remainder used a constant. We identified four approaches to calculate the probability of collision and these were used by others. 6 of the 10 models were deterministic and included the most frequently used models in the UK, with only 4 including variation or uncertainty in some way, the most recent using Bayesian methods. Despite their appeal, CRMs have their limitations and can be ‘data hungry’ as well as assuming much about bird movement and behaviour. As data become available, these assumptions should be tested to ensure that CRMs are functioning to adequately answer the questions posed by the wind energy sector. - Highlights: • We highlighted ten models available to assess avian collision risk. • Only 4 of the models included variability or uncertainty. • Collision risk models have limitations and can be ‘data hungry’. • It is vital that the most appropriate model is used for a given task.

  2. Modeling risk assessment for nuclear processing plants with LAVA

    International Nuclear Information System (INIS)

    Smith, S.T.; Tisinger, R.M.

    1988-01-01

    Using the Los Alamos Vulnerability and Risk Assessment (LAVA) methodology, the authors developed a model for assessing risks associated with nuclear processing plants. LAVA is a three-part systematic approach to risk assessment. The first part is the mathematical methodology; the second is the general personal computer-based software engine; and the third is the application itself. The methodology provides a framework for creating applications for the software engine to operate upon; all application-specific information is data. Using LAVA, the authors build knowledge-based expert systems to assess risks in applications systems comprising a subject system and a safeguards system. The subject system model is sets of threats, assets, and undesirable outcomes. The safeguards system model is sets of safeguards functions for protecting the assets from the threats by preventing or ameliorating the undesirable outcomes, sets of safeguards subfunctions whose performance determine whether the function is adequate and complete, and sets of issues, appearing as interactive questionnaires, whose measures (in both monetary and linguistic terms) define both the weaknesses in the safeguards system and the potential costs of an undesirable outcome occurring

  3. Radionuclide release rates from spent fuel for performance assessment modeling

    International Nuclear Information System (INIS)

    Curtis, D.B.

    1994-01-01

    In a scenario of aqueous transport from a high-level radioactive waste repository, the concentration of radionuclides in water in contact with the waste constitutes the source term for transport models, and as such represents a fundamental component of all performance assessment models. Many laboratory experiments have been done to characterize release rates and understand processes influencing radionuclide release rates from irradiated nuclear fuel. Natural analogues of these waste forms have been studied to obtain information regarding the long-term stability of potential waste forms in complex natural systems. This information from diverse sources must be brought together to develop and defend methods used to define source terms for performance assessment models. In this manuscript examples of measures of radionuclide release rates from spent nuclear fuel or analogues of nuclear fuel are presented. Each example represents a very different approach to obtaining a numerical measure and each has its limitations. There is no way to obtain an unambiguous measure of this or any parameter used in performance assessment codes for evaluating the effects of processes operative over many millennia. The examples are intended to suggest by example that in the absence of the ability to evaluate accuracy and precision, consistency of a broadly based set of data can be used as circumstantial evidence to defend the choice of parameters used in performance assessments

  4. Training courses on integrated safety assessment modelling for waste repositories

    International Nuclear Information System (INIS)

    Mallants, D.

    2007-01-01

    Near-surface or deep repositories of radioactive waste are being developed and evaluated all over the world. Also, existing repositories for low- and intermediate-level waste often need to be re-evaluated to extend their license or to obtain permission for final closure. The evaluation encompasses both a technical feasibility as well as a safety analysis. The long term safety is usually demonstrated by means of performance or safety assessment. For this purpose computer models are used that calculate the migration of radionuclides from the conditioned radioactive waste, through engineered barriers to the environment (groundwater, surface water, and biosphere). Integrated safety assessment modelling addresses all relevant radionuclide pathways from source to receptor (man), using in combination various computer codes in which the most relevant physical, chemical, mechanical, or even microbiological processes are mathematically described. SCK-CEN organizes training courses in Integrated safety assessment modelling that are intended for individuals who have either a controlling or supervising role within the national radwaste agencies or regulating authorities, or for technical experts that carry out the actual post-closure safety assessment for an existing or new repository. Courses are organised by the Department of Waste and Disposal

  5. Comparison of models used for ecological risk assessment and human health risk assessment

    International Nuclear Information System (INIS)

    Ryti, R.T.; Gallegos, A.F.

    1994-01-01

    Models are used to derive action levels for site screening, or to estimate potential ecological or human health risks posed by potentially hazardous sites. At the Los Alamos National Laboratory (LANL), which is RCRA-regulated, the human-health screening action levels are based on hazardous constituents described in RCRA Subpart S and RESRAD-derived soil guidelines (based on 10 mrem/year) for radiological constituents. Also, an ecological risk screening model was developed for a former firing site, where the primary constituents include depleted uranium, beryllium and lead. Sites that fail the screening models are evaluated with site-specific human risk assessment (using RESRAD and other approaches) and a detailed ecological effect model (ECOTRAN). ECOTRAN is based on pharmacokinetics transport modeling within a multitrophic-level biological-growth dynamics model. ECOTRAN provides detailed temporal records of contaminant concentrations in biota, and annual averages of these body burdens are compared to equivalent site-specific runs of the RESRAD model. The results show that thoughtful interpretation of the results of these models must be applied before they can be used for evaluation of current risk posed by sites and the benefits of various remedial options. This presentation compares the concentrations of biological media in the RESRAD screening runs to the concentrations in ecological endpoints predicted by the ecological screening model. The assumptions and limitations of these screening models, and the decision process in which these screening models are applied, are discussed.

  6. Modelling requirements for future assessments based on FEP analysis

    International Nuclear Information System (INIS)

    Locke, J.; Bailey, L.

    1998-01-01

    This report forms part of a suite of documents describing the Nirex model development programme. The programme is designed to provide a clear audit trail from the identification of significant features, events and processes (FEPs) to the models and modelling processes employed within a detailed safety assessment. A scenario approach to performance assessment has been adopted. It is proposed that potential evolutions of a deep geological radioactive waste repository can be represented by a base scenario and a number of variant scenarios. The base scenario is chosen to be broad-ranging and to represent the natural evolution of the repository system and its surrounding environment. The base scenario is defined to include all those FEPs that are certain to occur and those which are judged likely to occur for a significant period of the assessment timescale. The structuring of FEPs on a Master Directed Diagram (MDD) provides a systematic framework for identifying those FEPs that form part of the natural evolution of the system and those, which may define alternative potential evolutions of the repository system. In order to construct a description of the base scenario, FEPs have been grouped into a series of conceptual models. Conceptual models are groups of FEPs, identified from the MDD, representing a specific component or process within the disposal system. It has been found appropriate to define conceptual models in terms of the three main components of the disposal system: the repository engineered system, the surrounding geosphere and the biosphere. For each of these components, conceptual models provide a description of the relevant subsystem in terms of its initial characteristics, subsequent evolution and the processes affecting radionuclide transport for the groundwater and gas pathways. The aim of this document is to present the methodology that has been developed for deriving modelling requirements and to illustrate the application of the methodology by

  7. Student Generated Rubrics: An Assessment Model To Help All Students Succeed. Assessment Bookshelf Series.

    Science.gov (United States)

    Ainsworth, Larry; Christinson, Jan

    The assessment model described in this guide was initially developed by a team of fifth-grade teachers who wrote objectives of integrating social studies and language arts. It helps the teacher guide students to create a task-specific rubric that they use to evaluate their own and peers' work. Teachers review the student evaluations, determine the…

  8. Industrial process system assessment: bridging process engineering and life cycle assessment through multiscale modeling.

    Science.gov (United States)

    The Industrial Process System Assessment (IPSA) methodology is a multiple step allocation approach for connecting information from the production line level up to the facility level and vice versa using a multiscale model of process systems. The allocation procedure assigns inpu...

  9. AgMIP: Next Generation Models and Assessments

    Science.gov (United States)

    Rosenzweig, C.

    2014-12-01

    Next steps in developing next-generation crop models fall into several categories: significant improvements in simulation of important crop processes and responses to stress; extension from simplified crop models to complex cropping systems models; and scaling up from site-based models to landscape, national, continental, and global scales. Crop processes that require major leaps in understanding and simulation in order to narrow uncertainties around how crops will respond to changing atmospheric conditions include genetics; carbon, temperature, water, and nitrogen; ozone; and nutrition. The field of crop modeling has been built on a single crop-by-crop approach. It is now time to create a new paradigm, moving from 'crop' to 'cropping system.' A first step is to set up the simulation technology so that modelers can rapidly incorporate multiple crops within fields, and multiple crops over time. Then the response of these more complex cropping systems can be tested under different sustainable intensification management strategies utilizing the updated simulation environments. Model improvements for diseases, pests, and weeds include developing process-based models for important diseases, frameworks for coupling air-borne diseases to crop models, gathering significantly more data on crop impacts, and enabling the evaluation of pest management strategies. Most smallholder farming in the world involves integrated crop-livestock systems that cannot be represented by crop modeling alone. Thus, next-generation cropping system models need to include key linkages to livestock. Livestock linkages to be incorporated include growth and productivity models for grasslands and rangelands as well as the usual annual crops. There are several approaches for scaling up, including use of gridded models and development of simpler quasi-empirical models for landscape-scale analysis. On the assessment side, AgMIP is leading a community process for coordinated contributions to IPCC AR6

  10. Cost Model for Risk Assessment of Company Operation in Audit

    Directory of Open Access Journals (Sweden)

    S. V.

    2017-12-01

    This article explores the approach to assessing the risk of termination of company activities by building a cost model. This model gives auditors information on managers’ understanding of factors influencing change in the value of assets and liabilities, and the methods to identify it in more effective and reliable ways. Based on this information, the auditor can assess the adequacy of use of the assumption on continuity of company operation by management personnel when preparing financial statements. Financial uncertainty entails real manifestations of factors creating risks of the occurrence of costs and revenue losses due to their manifestations, which in the long run can be a reason for termination of company operation, and, therefore, need to be foreseen in the auditor’s assessment of the adequacy of use of the continuity assumption when preparing financial statements by company management. The purpose of the study is to explore and develop a methodology for use of cost models to assess the risk of termination of company operation in audit. The issue of methodology for assessing the audit risk through analyzing methods for company valuation has not been dealt with. The review of methodologies for assessing the risks of termination of company operation in the course of an audit gives grounds for the conclusion that use of cost models can be an effective methodology for identification and assessment of such risks. The analysis of the above methods gives understanding of the existing system for company valuation, integrated into the management system, and the consequences of its use, i.e., comparison of the asset price data with the accounting data and the market value of the asset data. Overvalued or undervalued company assets may be a sign of future sale or liquidation of a company, which may signal a high probability of termination of company operation. A wrong choice or application of valuation methods can be indicative of the risk of non

  11. Fuel cycle assessment: A compendium of models, methodologies, and approaches

    Energy Technology Data Exchange (ETDEWEB)

    1994-07-01

    The purpose of this document is to profile analytical tools and methods which could be used in a total fuel cycle analysis. The information in this document provides a significant step towards: (1) Characterizing the stages of the fuel cycle. (2) Identifying relevant impacts which can feasibly be evaluated quantitatively or qualitatively. (3) Identifying and reviewing other activities that have been conducted to perform a fuel cycle assessment or some component thereof. (4) Reviewing the successes/deficiencies and opportunities/constraints of previous activities. (5) Identifying methods and modeling techniques/tools that are available, tested and could be used for a fuel cycle assessment.

  12. Assessing fit in Bayesian models for spatial processes

    KAUST Repository

    Jun, M.

    2014-09-16

    © 2014 John Wiley & Sons, Ltd. Gaussian random fields are frequently used to model spatial and spatial-temporal data, particularly in geostatistical settings. As much of the attention of the statistics community has been focused on defining and estimating the mean and covariance functions of these processes, little effort has been devoted to developing goodness-of-fit tests to allow users to assess the models' adequacy. We describe a general goodness-of-fit test and related graphical diagnostics for assessing the fit of Bayesian Gaussian process models using pivotal discrepancy measures. Our method is applicable for both regularly and irregularly spaced observation locations on planar and spherical domains. The essential idea behind our method is to evaluate pivotal quantities defined for a realization of a Gaussian random field at parameter values drawn from the posterior distribution. Because the nominal distribution of the resulting pivotal discrepancy measures is known, it is possible to quantitatively assess model fit directly from the output of Markov chain Monte Carlo algorithms used to sample from the posterior distribution on the parameter space. We illustrate our method in a simulation study and in two applications.
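
    As a hedged illustration of a pivotal discrepancy: for data y ~ N(mu(theta), Sigma(theta)), the quantity (y - mu)' Sigma^{-1} (y - mu) follows a chi-square distribution with n degrees of freedom at the data-generating parameters, so evaluating it at posterior draws and comparing with chi-square quantiles gives a quantitative fit check. The posterior draws in the sketch are stand-ins, not output from the paper's examples.

        # Pivotal discrepancy sketch for a Gaussian-process model: evaluate the Mahalanobis
        # statistic at posterior draws of (mu, Sigma); under a well-fitting model it should
        # look like chi-square with n degrees of freedom. Posterior draws here are assumed
        # to come from elsewhere (e.g., an MCMC run); the generator below is a stand-in.
        import numpy as np
        from scipy import stats

        def pivotal_discrepancy(y, mu, cov):
            r = y - mu
            return r @ np.linalg.solve(cov, r)

        n = 40
        y = np.random.default_rng(1).standard_normal(n)          # stand-in data
        draws = [(np.zeros(n), np.eye(n)) for _ in range(200)]   # stand-in posterior draws
        d = np.array([pivotal_discrepancy(y, mu, cov) for mu, cov in draws])
        p_values = 1.0 - stats.chi2.cdf(d, df=n)                 # extreme p-values signal poor fit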

  13. Assessing fit in Bayesian models for spatial processes

    KAUST Repository

    Jun, M.; Katzfuss, M.; Hu, J.; Johnson, V. E.

    2014-01-01

    © 2014 John Wiley & Sons, Ltd. Gaussian random fields are frequently used to model spatial and spatial-temporal data, particularly in geostatistical settings. As much of the attention of the statistics community has been focused on defining and estimating the mean and covariance functions of these processes, little effort has been devoted to developing goodness-of-fit tests to allow users to assess the models' adequacy. We describe a general goodness-of-fit test and related graphical diagnostics for assessing the fit of Bayesian Gaussian process models using pivotal discrepancy measures. Our method is applicable for both regularly and irregularly spaced observation locations on planar and spherical domains. The essential idea behind our method is to evaluate pivotal quantities defined for a realization of a Gaussian random field at parameter values drawn from the posterior distribution. Because the nominal distribution of the resulting pivotal discrepancy measures is known, it is possible to quantitatively assess model fit directly from the output of Markov chain Monte Carlo algorithms used to sample from the posterior distribution on the parameter space. We illustrate our method in a simulation study and in two applications.

  14. Assessing Local Model Adequacy in Bayesian Hierarchical Models Using the Partitioned Deviance Information Criterion

    Science.gov (United States)

    Wheeler, David C.; Hickson, DeMarc A.; Waller, Lance A.

    2010-01-01

    Many diagnostic tools and goodness-of-fit measures, such as the Akaike information criterion (AIC) and the Bayesian deviance information criterion (DIC), are available to evaluate the overall adequacy of linear regression models. In addition, visually assessing adequacy in models has become an essential part of any regression analysis. In this paper, we focus on a spatial consideration of the local DIC measure for model selection and goodness-of-fit evaluation. We use a partitioning of the DIC into the local DIC, leverage, and deviance residuals to assess local model fit and influence for both individual observations and groups of observations in a Bayesian framework. We use visualization of the local DIC and differences in local DIC between models to assist in model selection and to visualize the global and local impacts of adding covariates or model parameters. We demonstrate the utility of the local DIC in assessing model adequacy using HIV prevalence data from pregnant women in the Butare province of Rwanda during 1989-1993 using a range of linear model specifications, from global effects only to spatially varying coefficient models, and a set of covariates related to sexual behavior. Results of applying the diagnostic visualization approach include more refined model selection and greater understanding of the models as applied to the data. PMID:21243121
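
    A minimal sketch of one way DIC can be partitioned per observation, assuming a Gaussian regression likelihood: DIC_i = 2 * mean_s D_i(theta_s) - D_i(theta_bar), where D_i is -2 times the log-likelihood of observation i; summed over i this recovers the usual DIC. This is a generic decomposition, not necessarily the paper's exact local DIC/leverage/residual partition, and the posterior samples below are stand-ins for MCMC output.

        # Sketch of a per-observation (local) DIC for a Gaussian regression:
        # DIC_i = 2 * mean_s D_i(theta_s) - D_i(theta_bar), with D_i = -2 log p(y_i | theta).
        import numpy as np
        from scipy import stats

        def local_dic(y, X, beta_samples, sigma_samples):
            dev = lambda b, s: -2.0 * stats.norm.logpdf(y, loc=X @ b, scale=s)  # per-observation deviance
            d_samples = np.array([dev(b, s) for b, s in zip(beta_samples, sigma_samples)])
            d_bar = d_samples.mean(axis=0)                        # mean deviance per observation
            d_at_mean = dev(beta_samples.mean(axis=0), sigma_samples.mean())
            return 2.0 * d_bar - d_at_mean                        # local DIC; sums to the usual DIC

        rng = np.random.default_rng(0)
        X = rng.standard_normal((50, 2))
        y = X @ np.array([1.0, -0.5]) + rng.standard_normal(50)
        dic_i = local_dic(y, X, rng.normal([1.0, -0.5], 0.1, (300, 2)), np.abs(rng.normal(1.0, 0.05, 300)))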

  15. A multi-model assessment of terrestrial biosphere model data needs

    Science.gov (United States)

    Gardella, A.; Cowdery, E.; De Kauwe, M. G.; Desai, A. R.; Duveneck, M.; Fer, I.; Fisher, R.; Knox, R. G.; Kooper, R.; LeBauer, D.; McCabe, T.; Minunno, F.; Raiho, A.; Serbin, S.; Shiklomanov, A. N.; Thomas, A.; Walker, A.; Dietze, M.

    2017-12-01

    Terrestrial biosphere models provide us with the means to simulate the impacts of climate change and their uncertainties. Going beyond direct observation and experimentation, models synthesize our current understanding of ecosystem processes and can give us insight on data needed to constrain model parameters. In previous work, we leveraged the Predictive Ecosystem Analyzer (PEcAn) to assess the contribution of different parameters to the uncertainty of the Ecosystem Demography model v2 (ED) model outputs across various North American biomes (Dietze et al., JGR-G, 2014). While this analysis identified key research priorities, the extent to which these priorities were model- and/or biome-specific was unclear. Furthermore, because the analysis only studied one model, we were unable to comment on the effect of variability in model structure to overall predictive uncertainty. Here, we expand this analysis to all biomes globally and a wide sample of models that vary in complexity: BioCro, CABLE, CLM, DALEC, ED2, FATES, G'DAY, JULES, LANDIS, LINKAGES, LPJ-GUESS, MAESPA, PRELES, SDGVM, SIPNET, and TEM. Prior to performing uncertainty analyses, model parameter uncertainties were assessed by assimilating all available trait data from the combination of the BETYdb and TRY trait databases, using an updated multivariate version of PEcAn's Hierarchical Bayesian meta-analysis. Next, sensitivity analyses were performed for all models across a range of sites globally to assess sensitivities for a range of different outputs (GPP, ET, SH, Ra, NPP, Rh, NEE, LAI) at multiple time scales from the sub-annual to the decadal. Finally, parameter uncertainties and model sensitivities were combined to evaluate the fractional contribution of each parameter to the predictive uncertainty for a specific variable at a specific site and timescale. Facilitated by PEcAn's automated workflows, this analysis represents the broadest assessment of the sensitivities and uncertainties in terrestrial
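
    As a hedged sketch of how parameter uncertainty and model sensitivity combine into fractional contributions to predictive uncertainty (a first-order, one-at-a-time approximation, not PEcAn's actual workflow), the fragment below normalises squared sensitivities weighted by parameter variances; all numbers are placeholders.

        # First-order variance decomposition sketch: contribution of parameter j ~
        # (d f / d theta_j)^2 * Var(theta_j), normalised to fractions. Sensitivities and
        # variances below are placeholders, not values from PEcAn or any specific model.
        import numpy as np

        sensitivity = np.array([2.0, -0.5, 1.2])   # d(output)/d(parameter), from a sensitivity analysis
        param_var = np.array([0.1, 0.8, 0.05])     # posterior variances from the meta-analysis
        contrib = sensitivity**2 * param_var
        fraction = contrib / contrib.sum()         # fractional contribution to predictive variance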

  16. Comparative Assessment of Nonlocal Continuum Solvent Models Exhibiting Overscreening

    Directory of Open Access Journals (Sweden)

    Ren Baihua

    2017-01-01

    Nonlocal continua have been proposed to offer a more realistic model for the electrostatic response of solutions such as the electrolyte solvents prominent in biology and electrochemistry. In this work, we review three nonlocal models based on the Landau-Ginzburg framework which have been proposed but not directly compared previously, due to different expressions of the nonlocal constitutive relationship. To understand the relationships between these models and the underlying physical insights from which they are derived, we situate these models into a single, unified Landau-Ginzburg framework. One of the models offers the capacity to interpret how temperature changes affect dielectric response, and we note that the variations with temperature are qualitatively reasonable even though predictions at ambient temperatures are not quantitatively in agreement with experiment. Two of these models correctly reproduce overscreening (oscillations between positive and negative polarization charge densities), and we observe small differences between them when we simulate the potential between parallel plates held at constant potential. These computations require reformulating the two models as coupled systems of local partial differential equations (PDEs), and we use spectral methods to discretize both problems. We propose further assessments to discriminate between the models, particularly with regard to establishing boundary conditions and comparing to explicit-solvent molecular dynamics simulations.
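
    The common structure of the compared models is a nonlocal constitutive relationship between displacement and field, which in Fourier space reduces to a wavevector-dependent permittivity; this is a generic statement of the shared form, and the specific kernels differ between the three models:

        \mathbf{D}(\mathbf{r}) = \varepsilon_0 \int \varepsilon(\mathbf{r}-\mathbf{r}')\,\mathbf{E}(\mathbf{r}')\,\mathrm{d}\mathbf{r}'
        \quad\Longleftrightarrow\quad
        \hat{\mathbf{D}}(\mathbf{k}) = \varepsilon_0\,\hat{\varepsilon}(k)\,\hat{\mathbf{E}}(\mathbf{k}).

    Overscreening then corresponds to a polarization charge density that alternates in sign with distance from a source charge, which requires a non-monotone behaviour of the effective permittivity in k.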

  17. Plasma-safety assessment model and safety analyses of ITER

    International Nuclear Information System (INIS)

    Honda, T.; Okazaki, T.; Bartels, H.-H.; Uckan, N.A.; Sugihara, M.; Seki, Y.

    2001-01-01

    A plasma-safety assessment model has been provided on the basis of the plasma physics database of the International Thermonuclear Experimental Reactor (ITER) to analyze events including plasma behavior. The model was implemented in a safety analysis code (SAFALY), which consists of a 0-D dynamic plasma model and a 1-D thermal behavior model of the in-vessel components. Unusual plasma events of ITER, e.g., overfueling, were calculated using the code and plasma burning is found to be self-bounded by operation limits or passively shut down due to impurity ingress from overheated divertor targets. Sudden transition of divertor plasma might lead to failure of the divertor target because of a sharp increase of the heat flux. However, the effects of the aggravating failure can be safely handled by the confinement boundaries. (author)

  18. Modeling issues associated with production reactor safety assessment

    International Nuclear Information System (INIS)

    Stack, D.W.; Thomas, W.R.

    1990-01-01

    This paper describes several Probabilistic Safety Assessment (PSA) modeling issues that are related to the unique design and operation of the production reactors. The identification of initiating events and determination of a set of success criteria for the production reactors is of concern because of their unique design. The modeling of accident recovery must take into account the unique operation of these reactors. Finally, a more thorough search and evaluation of common-cause events is required to account for combinations of unique design features and operation that might otherwise not be included in the PSA. It is expected that most of these modeling issues also would be encountered when modeling some of the other more unique reactor and nonreactor facilities that are part of the DOE nuclear materials production complex. 9 refs., 2 figs

  19. PORFLOW Modeling Supporting The H-Tank Farm Performance Assessment

    International Nuclear Information System (INIS)

    Jordan, J. M.; Flach, G. P.; Westbrook, M. L.

    2012-01-01

    Numerical simulations of groundwater flow and contaminant transport in the vadose and saturated zones have been conducted using the PORFLOW code in support of an overall Performance Assessment (PA) of the H-Tank Farm. This report provides technical detail on selected aspects of PORFLOW model development and describes the structure of the associated electronic files. The PORFLOW models for the H-Tank Farm PA, Rev. 1 were updated with grout, solubility, and inventory changes. The aquifer model was refined. In addition, a set of flow sensitivity runs were performed to allow flow to be varied in the related probabilistic GoldSim models. The final PORFLOW concentration values are used as input into a GoldSim dose calculator

  20. PORFLOW Modeling Supporting The H-Tank Farm Performance Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Jordan, J. M.; Flach, G. P.; Westbrook, M. L.

    2012-08-31

    Numerical simulations of groundwater flow and contaminant transport in the vadose and saturated zones have been conducted using the PORFLOW code in support of an overall Performance Assessment (PA) of the H-Tank Farm. This report provides technical detail on selected aspects of PORFLOW model development and describes the structure of the associated electronic files. The PORFLOW models for the H-Tank Farm PA, Rev. 1 were updated with grout, solubility, and inventory changes. The aquifer model was refined. In addition, a set of flow sensitivity runs were performed to allow flow to be varied in the related probabilistic GoldSim models. The final PORFLOW concentration values are used as input into a GoldSim dose calculator.

  1. Tackling Biocomplexity with Meta-models for Species Risk Assessment

    Directory of Open Access Journals (Sweden)

    Philip J. Nyhus

    2007-06-01

    We describe results of a multi-year effort to strengthen consideration of the human dimension in endangered species risk assessments and to strengthen research capacity to understand biodiversity risk assessment in the context of coupled human-natural systems. A core group of social and biological scientists have worked with a network of more than 50 individuals from four countries to develop a conceptual framework illustrating how human-mediated processes influence biological systems and to develop tools to gather, translate, and incorporate these data into existing simulation models. A central theme of our research focused on (1) the difficulties often encountered in identifying and securing diverse bodies of expertise and information that is necessary to adequately address complex species conservation issues; and (2) the development of quantitative simulation modeling tools that could explicitly link these datasets as a way to gain deeper insight into these issues. To address these important challenges, we promote a "meta-modeling" approach where computational links are constructed between discipline-specific models already in existence. In this approach, each model can function as a powerful stand-alone program, but interaction between applications is achieved by passing data structures describing the state of the system between programs. As one example of this concept, an integrated meta-model of wildlife disease and population biology is described. A goal of this effort is to improve science-based capabilities for decision making by scientists, natural resource managers, and policy makers addressing environmental problems in general, and focusing on biodiversity risk assessment in particular.

  2. Modeling Of Construction Noise For Environmental Impact Assessment

    Directory of Open Access Journals (Sweden)

    Mohamed F. Hamoda

    2008-06-01

    This study measured the noise levels generated at different construction sites in reference to the stage of construction and the equipment used, and examined the methods to predict such noise in order to assess the environmental impact of noise. It included 33 construction sites in Kuwait and used artificial neural networks (ANNs) for the prediction of noise. A back-propagation neural network (BPNN) model was compared with a general regression neural network (GRNN) model. The results obtained indicated that the mean equivalent noise level was 78.7 dBA, which exceeds the threshold limit. The GRNN model was superior to the BPNN model in its accuracy of predicting construction noise due to its ability to train quickly on sparse data sets. Over 93% of the predictions were within 5% of the observed values. The mean absolute error between the predicted and observed data was only 2 dBA. The ANN modeling proved to be a useful technique for noise predictions required in the assessment of environmental impact of construction activities.
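
    A GRNN is, in essence, Nadaraya-Watson kernel regression with a Gaussian kernel, which is why it trains quickly on sparse data; the minimal sketch below illustrates the prediction step. The spread parameter, feature encoding and training values are placeholders, not the study's 33-site dataset.

        # Minimal GRNN (general regression neural network) sketch, equivalent to
        # Nadaraya-Watson kernel regression with a Gaussian kernel; sigma and the
        # training data are placeholders, not the study's 33-site dataset.
        import numpy as np

        def grnn_predict(X_train, y_train, x, sigma=1.0):
            d2 = np.sum((X_train - x)**2, axis=1)
            w = np.exp(-d2 / (2.0 * sigma**2))
            return np.sum(w * y_train) / np.sum(w)

        X_train = np.array([[10.0, 1.0], [50.0, 2.0], [90.0, 3.0]])  # e.g. [% completion, equipment class]
        y_train = np.array([72.0, 80.0, 76.0])                       # measured Leq in dBA
        print(grnn_predict(X_train, y_train, np.array([60.0, 2.0]), sigma=20.0))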

  3. Empirical assessment of a threshold model for sylvatic plague

    DEFF Research Database (Denmark)

    Davis, Stephen; Leirs, Herwig; Viljugrein, H.

    2007-01-01

    Plague surveillance programmes established in Kazakhstan, Central Asia, during the previous century, have generated large plague archives that have been used to parameterize an abundance threshold model for sylvatic plague in great gerbil (Rhombomys opimus) populations. Here, we assess the model...... examine six hypotheses that could explain the resulting false positive predictions, namely (i) including end-of-outbreak data erroneously lowers the estimated threshold, (ii) too few gerbils were tested, (iii) plague becomes locally extinct, (iv) the abundance of fleas was too low, (v) the climate...

  4. Model-based pH monitor for sensor assessment.

    Science.gov (United States)

    van Schagen, Kim; Rietveld, Luuk; Veersma, Alex; Babuska, Robert

    2009-01-01

    Owing to the nature of the treatment processes, monitoring the processes based on individual online measurements is difficult or even impossible. However, the measurements (online and laboratory) can be combined with a priori process knowledge, using mathematical models, to objectively monitor the treatment processes and measurement devices. The pH measurement is a commonly used measurement at different stages in the drinking water treatment plant, although it is an unreliable instrument, requiring significant maintenance. It is shown that, using a grey-box model, it is possible to assess the measurement devices effectively, even if detailed information about the specific processes is unknown.
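
    As a hedged sketch of the residual-based idea behind model-based sensor assessment (compare the measurement with a model prediction and flag persistent deviation), the fragment below uses a running mean of absolute residuals; the model prediction, window length and tolerance are placeholders, not the plant's grey-box model.

        # Residual-based sensor assessment sketch: flag a pH sensor when the running mean
        # of |measured - modelled| exceeds a tolerance. The model prediction, window and
        # tolerance below are placeholders, not the plant's grey-box model.
        import numpy as np

        def assess_sensor(measured, modelled, window=24, tol=0.3):
            resid = np.abs(np.asarray(measured) - np.asarray(modelled))
            running = np.convolve(resid, np.ones(window) / window, mode="valid")
            return bool(np.any(running > tol))          # True -> sensor needs maintenance/calibration

        drifting = assess_sensor(measured=7.4 + 0.02 * np.arange(100), modelled=np.full(100, 7.4))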

  5. Risk Assessment of Engineering Project Financing Based on PPP Model

    Directory of Open Access Journals (Sweden)

    Ma Qiuli

    2017-01-01

    At present, project financing channels are limited, urban facilities are in short supply, and the risk assessment and prevention mechanisms for financing need further improvement to reduce project financing risk. In view of this, a fuzzy comprehensive evaluation model of project financing risk is established, combining fuzzy comprehensive evaluation with the analytic hierarchy process. The soundness and effectiveness of the model are verified with the example of the world port project in Luohe city, providing a basis and reference for engineering project financing based on the PPP model.
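
    The sketch below illustrates the combined method in miniature: criterion weights are taken from the principal eigenvector of an AHP pairwise-comparison matrix and then applied to a fuzzy membership matrix, B = w . R. The comparison values, membership degrees and grade labels are placeholders, not the Luohe case-study data.

        # Sketch of fuzzy comprehensive evaluation combined with AHP: criterion weights from
        # the principal eigenvector of a pairwise-comparison matrix, then B = w . R where R
        # holds membership degrees of each criterion in each risk grade. Values are placeholders.
        import numpy as np

        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])                 # pairwise comparisons of three risk factors
        vals, vecs = np.linalg.eig(A)
        w = np.real(vecs[:, np.argmax(np.real(vals))])
        w /= w.sum()                                    # AHP weights

        R = np.array([[0.1, 0.3, 0.4, 0.2],             # membership of each factor in the
                      [0.2, 0.4, 0.3, 0.1],             # grades (low, medium, high, very high)
                      [0.3, 0.4, 0.2, 0.1]])
        B = w @ R                                       # fuzzy evaluation vector over the grades
        risk_grade = ["low", "medium", "high", "very high"][int(np.argmax(B))]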

  6. Radiological assessments of land disposal options: recent model developments

    International Nuclear Information System (INIS)

    Fearn, H.S.; Pinner, A.V.; Hemming, C.R.

    1984-10-01

    This report describes progress in the development of methodologies and models for assessing the radiological impact of the disposal of low and intermediate level wastes by (i) shallow land burial in simple trenches (land 1), (ii) shallow land burial in engineered facilities (land 2), and (iii) emplacement in mined repositories or existing cavities (land 3/4). In particular the report describes wasteform leaching models, for unconditioned and cemented waste, the role of engineered barriers of a shallow land burial facility in reducing the magnitude of doses arising from groundwater contact and a detailed consideration of the interactions between radioactive carbon and various media. (author)

  7. A Fuzzy Knowledge Representation Model for Student Performance Assessment

    DEFF Research Database (Denmark)

    Badie, Farshad

    Knowledge representation models based on Fuzzy Description Logics (DLs) can provide a foundation for reasoning in intelligent learning environments. While basic DLs are suitable for expressing crisp concepts and binary relationships, Fuzzy DLs are capable of processing degrees of truth/completeness about vague or imprecise information. This paper tackles the issue of representing fuzzy classes using OWL2 in a dataset describing Performance Assessment Results of Students (PARS).

  8. Probabilistic Modeling and Risk Assessment of Cable Icing

    DEFF Research Database (Denmark)

    Roldsgaard, Joan Hee

    This dissertation addresses the issues related to icing of structures with special emphasis on bridge cables. Cable supported bridges in cold climate suffers for ice accreting on the cables, this poses three different undesirable situations. Firstly the changed shape of the cable due to ice...... preliminary framework is modified for assessing the probability of occurrence of in-cloud and precipitation icing and its duration. Different probabilistic models are utilized for the representation of the meteorological variables and their appropriateness is evaluated both through goodness-of-fit tests...... are influencing the two icing mechanisms and their duration. The model is found to be more sensitive to changes in the discretization levels of the input variables. Thirdly the developed operational probabilistic framework for the assessment of the expected number of occurrences of ice/snow accretion on bridge...

  9. Development of tools and models for computational fracture assessment

    International Nuclear Information System (INIS)

    Talja, H.; Santaoja, K.

    1998-01-01

    The aim of the work presented in this paper has been to develop and test new computational tools and theoretically more sound methods for fracture mechanical analysis. The applicability of the engineering integrity assessment system MASI for evaluation of piping components has been extended. The most important motivation for the theoretical development has been the well-known fundamental limitations in the validity of the J-integral, which limit its applicability in many important practical safety assessment cases. Examples are extensive plastic deformation, multimaterial structures and ascending loading paths (especially warm prestress, WPS). Further, the micromechanical Gurson model has been applied to several reactor pressure vessel materials. Special attention is paid to the transferability of Gurson model parameters from tensile test results to prediction of ductile failure behaviour of cracked structures. (author)
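
    For reference, the Gurson yield condition in its commonly cited Gurson-Tvergaard-Needleman (GTN) form; the paper's exact parameterisation is not given in the abstract, so this is stated only as the standard form from the literature:

        \Phi = \left(\frac{\sigma_{\mathrm{eq}}}{\sigma_y}\right)^{2}
             + 2 q_1 f^{*} \cosh\!\left(\frac{3 q_2 \sigma_m}{2 \sigma_y}\right)
             - \left(1 + q_3 {f^{*}}^{2}\right) = 0,

    where \sigma_{\mathrm{eq}} is the von Mises equivalent stress, \sigma_m the mean (hydrostatic) stress, \sigma_y the matrix flow stress, f^{*} the effective void volume fraction, and q_1, q_2, q_3 (often taken with q_3 = q_1^2) the Tvergaard parameters whose transferability from tensile tests is at issue.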

  10. A model for assessing social impacts of nuclear technology

    International Nuclear Information System (INIS)

    Suzuki, Atsuyuki; Kiyose, Ryohei

    1981-01-01

    A theoretical framework is given for assessing the social or environmental impacts of nuclear technology. A two-act problem concerning the incentive-penalty system is posed to formulate the principle of ALAP. An observation plan to make a decision on the problem is optimized with Bayesian decision theory. The optimized solution, resting on the amount of incentive or penalty, is compared with an actual or practical plan. Then, by finding the indifference between the two plans, an impact is assessed in monetary terms. As regards the third step, the model does not provide the details since it is beyond the scope of the description. If there exists an actual plan, it can be easily compared with the results from this theory. If there is no such plan, or it is still being prepared, its feasibility must be studied by another model or by different approaches. (J.P.N.)

  11. Regional Persistent Organic Pollutants' Environmental Impact Assessment and Control Model

    Directory of Open Access Journals (Sweden)

    Jurgis Staniskis

    2008-10-01

    The sources of formation, environmental distribution and fate of persistent organic pollutants (POPs) are increasingly seen as topics to be addressed and solved at the global scale. Therefore, there are already two international agreements concerning persistent organic pollutants: the Protocol of 1998 to the 1979 Convention on Long-Range Transboundary Air Pollution on Persistent Organic Pollutants (Aarhus Protocol); and the Stockholm Convention on Persistent Organic Pollutants. For the assessment of environmental pollution by POPs, for the risk assessment, and for the evaluation of new pollutants as potential candidates to be included in the POPs list of the Stockholm and/or Aarhus Protocol, a set of different models are developed or under development. Multimedia models help describe and understand environmental processes leading to global contamination through POPs and actual risk to the environment and human health. However, there is a lack of tools based on a systematic and integrated approach to the difficulties of POPs management in the region.

  12. Model error assessment of burst capacity models for energy pipelines containing surface cracks

    International Nuclear Information System (INIS)

    Yan, Zijian; Zhang, Shenwei; Zhou, Wenxing

    2014-01-01

    This paper develops the probabilistic characteristics of the model errors associated with five well-known burst capacity models/methodologies for pipelines containing longitudinally-oriented external surface cracks, namely the Battelle and CorLAS™ models as well as the failure assessment diagram (FAD) methodologies recommended in the BS 7910 (2005), API RP579 (2007) and R6 (Rev 4, Amendment 10). A total of 112 full-scale burst test data for cracked pipes subjected to internal pressure only were collected from the literature. The model error for a given burst capacity model is evaluated based on the ratios of the test to predicted burst pressures for the collected data. Analysis results suggest that the CorLAS™ model is the most accurate model among the five models considered and the Battelle, BS 7910, API RP579 and R6 models are in general conservative; furthermore, the API RP579 and R6 models are markedly more accurate than the Battelle and BS 7910 models. The results will facilitate the development of reliability-based structural integrity management of pipelines. - Highlights: • Model errors for five burst capacity models for pipelines containing surface cracks are characterized. • Basic statistics of the model errors are obtained based on test-to-predicted ratios. • Results will facilitate reliability-based design and assessment of energy pipelines
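
    A minimal sketch of the test-to-predicted ratio statistics that such a characterisation reports (mean bias and coefficient of variation); the four pressure pairs below are placeholders, not any of the 112 collected test records.

        # Model error sketch: ratio of test to predicted burst pressure; the mean of the
        # ratios is the bias factor and std/mean is the coefficient of variation (COV).
        # The pressures below are placeholders, not the collected test records.
        import numpy as np

        p_test = np.array([12.1, 10.5, 14.2, 9.8])    # measured burst pressures (MPa)
        p_pred = np.array([11.0, 10.9, 12.5, 10.2])   # model-predicted burst pressures (MPa)
        ratio = p_test / p_pred
        bias, cov = ratio.mean(), ratio.std(ddof=1) / ratio.mean()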

  13. Exploring harmonization between integrated assessment and capacity expansion models

    Science.gov (United States)

    Iyer, G.; Brown, M.; Cohen, S.; Macknick, J.; Patel, P.; Wise, M. A.; Horing, J.

    2017-12-01

    Forward-looking quantitative models of the electric sector are extensively used to provide science-based strategic decision support to national, international and private-sector entities. Given that these models are used to inform a wide range of stakeholders and influence policy decisions, it is vital to examine how the models' underlying data and structure influence their outcomes. We conduct several experiments harmonizing key model characteristics between ReEDS, an electric-sector-only model, and GCAM, an integrated assessment model, to understand how different degrees of harmonization impact model outcomes. ReEDS has high spatial, temporal, and process detail but lacks electricity demand elasticity and endogenous representations of other economic sectors, while GCAM has internally consistent representations of energy (including the electric sector), agriculture, and land-use systems but relatively aggregate representations of the factors influencing electric sector investments. We vary the degree of harmonization in electricity demand, fuel prices, technology costs and performance, and variable renewable energy resource characteristics. We then identify the prominent sources of divergence in key outputs (electricity capacity, generation, and price) across the models and study how the convergence between models can be improved with permutations of harmonized characteristics. The remaining inconsistencies help to establish how differences in the models' underlying data, construction, perspective, and methodology play into each model's outcome. There are three broad contributions of this work. First, our study provides a framework to link models with similar scope but different resolutions. Second, our work provides insight into how the harmonization of assumptions contributes to a unified and robust portrayal of the US electricity sector under various potential futures. Finally, our study enhances the understanding of the influence of structural uncertainty

  14. Modeling marine surface microplastic transport to assess optimal removal locations

    OpenAIRE

    Sherman, Peter; Van Sebille, Erik

    2016-01-01

    Marine plastic pollution is an ever-increasing problem that demands immediate mitigation and reduction plans. Here, a model based on satellite-tracked buoy observations and scaled to a large data set of observations on microplastic from surface trawls was used to simulate the transport of plastics floating on the ocean surface from 2015 to 2025, with the goal to assess the optimal marine microplastic removal locations for two scenarios: removing the most surface microplastic and reducing the ...

  15. A maturity model to assess organisational readiness for change

    OpenAIRE

    Zephir, Olivier; Minel, Stéphanie; Chapotot, Emilie

    2011-01-01

    The presented model, developed in a European project, allows project management teams to assess organisational maturity to integrate new practices under structural or technological change. Maturity for change is defined here as the workforce's capability to operate effectively in transformed processes. This methodology addresses organisational readiness to fulfil business objectives through technological and structural improvements. The tool integrate...

  16. Melodie: A global risk assessment model for radioactive waste repositories

    International Nuclear Information System (INIS)

    Lewi, J.; Assouline, M.; Bareau, J.; Raimbault, P.

    1987-03-01

    The Institute of Protection and Nuclear Safety (IPSN), which is part of the French Atomic Energy Commission (C.E.A.), has been developing since 1984, in collaboration with different groups inside and outside the C.E.A., a computer model for risk assessment of nuclear waste repositories in deep geological formations. The main characteristics of the submodels, the data processing structure and some examples of applications are presented.

  17. Advancing Integrated Systems Modelling Framework for Life Cycle Sustainability Assessment

    Directory of Open Access Journals (Sweden)

    Anthony Halog

    2011-02-01

    The need for an integrated methodological framework for sustainability assessment has been widely discussed and is urgent due to increasingly complex environmental system problems. These problems have impacts on ecosystems and human well-being which represent a threat to the economic performance of countries and corporations. Integrated assessment crosses issues; spans spatial and temporal scales; looks forward and backward; and incorporates multi-stakeholder inputs. This study aims to develop an integrated methodology by capitalizing on the complementary strengths of different methods used by industrial ecologists and biophysical economists. The computational methodology proposed here is a systems-perspective, integrative, and holistic approach for sustainability assessment which attempts to link basic science and technology to policy formulation. The framework adopts life cycle thinking methods—LCA, LCC, and SLCA; stakeholder analysis supported by multi-criteria decision analysis (MCDA); and dynamic system modelling. Following the Pareto principle, the critical sustainability criteria, indicators and metrics (i.e., hotspots) can be identified and further modelled using system dynamics or agent-based modelling and improved by data envelopment analysis (DEA) and sustainability network theory (SNT). The framework is being applied to the development of biofuel supply chain networks. The framework can provide new ways of integrating knowledge across the divides between social and natural sciences as well as between critical and problem-solving research.

  18. Efficiency assessment models of higher education institution staff activity

    Directory of Open Access Journals (Sweden)

    K. A. Dyusekeyev

    2016-01-01

    Full Text Available The paper substantiates the need to improve university staff incentive systems under the conditions of competition in higher education, and the need to develop a separate model for evaluating the effectiveness of department heads. The authors analysed methods for assessing the production function of units and show the advantage of applying such methods to assess the effectiveness of economic structures in the field of higher education. The choice of the data envelopment analysis (DEA) method to solve the problem is justified, and a model for evaluating the activity of university departments on the basis of the DEA methodology has been developed. Drawing on staff pay systems operating in universities in Russia, Kazakhstan and other countries, the structure of the criteria system for evaluating university staff activity has been designed. To clarify and specify the departments' activity efficiency criteria, a strategic map was developed that allowed the input and output parameters of the model to be determined. The use of the DEA methodology takes into account a large number of input and output parameters, increases the objectivity of the assessment by excluding experts, and provides interim data to identify the strengths and weaknesses of the evaluated object.
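
    As a rough illustration of the kind of calculation the DEA methodology performs, the sketch below solves the input-oriented CCR multiplier model for a few hypothetical departments with SciPy; the choice of inputs, outputs and figures is invented for illustration and is not the criteria system developed in the paper.

        import numpy as np
        from scipy.optimize import linprog

        # Toy data: 5 departments, 2 inputs (staff FTE, budget), 2 outputs (graduates, papers).
        X = np.array([[10, 200], [8, 150], [12, 300], [6, 120], [9, 180]], float)   # inputs
        Y = np.array([[60, 15], [50, 20], [70, 10], [40, 18], [55, 25]], float)     # outputs

        def ccr_efficiency(o):
            """Input-oriented CCR efficiency of department o (multiplier form)."""
            n, m = X.shape                                      # number of units, inputs
            s = Y.shape[1]                                      # number of outputs
            c = np.concatenate([-Y[o], np.zeros(m)])            # maximise u.y_o
            A_eq = [np.concatenate([np.zeros(s), X[o]])]        # normalisation v.x_o = 1
            A_ub = np.hstack([Y, -X])                           # u.y_j - v.x_j <= 0 for all j
            res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n), A_eq=A_eq, b_eq=[1.0],
                          bounds=[(0, None)] * (s + m), method="highs")
            return -res.fun                                     # efficiency score in (0, 1]

        for o in range(len(X)):
            print(f"department {o}: efficiency = {ccr_efficiency(o):.3f}")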

  19. Ensemble atmospheric dispersion modeling for emergency response consequence assessments

    International Nuclear Information System (INIS)

    Addis, R.P.; Buckley, R.L.

    2003-01-01

    Full text: Prognostic atmospheric dispersion models are used to generate consequence assessments, which assist decision-makers in the event of a release from a nuclear facility. Differences in the forecast wind fields generated by various meteorological agencies, differences in the transport and diffusion models themselves, as well as differences in the way these models treat the release source term, all may result in differences in the simulated plumes. This talk will address the U.S. participation in the European ENSEMBLE project, and present a perspective on how ensemble techniques may be used to enable atmospheric modelers to provide decision-makers with a more realistic understanding of how both the atmosphere and the models behave. Meteorological forecasts generated by numerical models from national and multinational meteorological agencies provide individual realizations of three-dimensional, time dependent atmospheric wind fields. These wind fields may be used to drive atmospheric dispersion (transport and diffusion) models, or they may be used to initiate other, finer resolution meteorological models, which in turn drive dispersion models. Many modeling agencies now utilize ensemble-modeling techniques to determine how sensitive the prognostic fields are to minor perturbations in the model parameters. However, the European Union programs RTMOD and ENSEMBLE are the first projects to utilize a WEB based ensemble approach to interpret the output from atmospheric dispersion models. The ensembles produced are different from those generated by meteorological forecasting centers in that they are ensembles of dispersion model outputs from many different atmospheric transport and diffusion models utilizing prognostic atmospheric fields from several different forecast centers. As such, they enable a decision-maker to consider the uncertainty in the plume transport and growth as a result of the differences in the forecast wind fields as well as the differences in the
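
    To make the idea of an ensemble of dispersion-model outputs concrete, the sketch below computes a simple agreement-in-exceedance map from several simulated concentration fields; the random fields, grid and threshold are placeholders, and this is only one generic way such multi-model ensembles can be summarised, not the specific products of the RTMOD/ENSEMBLE web system.

        import numpy as np

        rng = np.random.default_rng(5)

        # Stand-in for time-integrated concentration fields from 8 different dispersion
        # models on a common grid (members x ny x nx); real model output would replace this.
        members = rng.lognormal(mean=0.0, sigma=1.0, size=(8, 50, 50))
        threshold = 2.0                     # e.g. an intervention-level concentration

        # Fraction of ensemble members exceeding the threshold in each grid cell.
        agreement = (members > threshold).mean(axis=0)
        confident_cells = int((agreement > 0.75).sum())
        print(f"cells where more than 75% of members agree on exceedance: {confident_cells}")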

  20. Individual-based model for radiation risk assessment

    Science.gov (United States)

    Smirnova, O.

    A mathematical model is developed which enables one to predict the life span probability for mammals exposed to radiation. It relates statistical biometric functions with statistical and dynamic characteristics of an organism's critical system. To calculate the dynamics of the latter, the respective mathematical model is used too. This approach is applied to describe the effects of low level chronic irradiation on mice when the hematopoietic system (namely, thrombocytopoiesis) is the critical one. For identification of the joint model, experimental data on hematopoiesis in nonirradiated and irradiated mice, as well as on mortality dynamics of those in the absence of radiation are utilized. The life span probability and life span shortening predicted by the model agree with corresponding experimental data. Modeling results show the significance of accounting for the variability of the individual radiosensitivity of critical system cells when estimating the radiation risk. These findings are corroborated by clinical data on persons involved in the elimination of the Chernobyl catastrophe aftereffects. All this makes it feasible to use the model for radiation risk assessments for cosmonauts and astronauts on long-term missions such as a voyage to Mars or a lunar colony. In this case the model coefficients have to be determined by making use of the available data for humans. Scenarios for the dynamics of dose accumulation during space flights should also be taken into account.

  1. Modelling Tradescantia fluminensis to assess long term survival

    Directory of Open Access Journals (Sweden)

    Alex James

    2015-06-01

    Full Text Available We present a simple Poisson process model for the growth of Tradescantia fluminensis, an invasive plant species that inhibits the regeneration of native forest remnants in New Zealand. The model was parameterised with data derived from field experiments in New Zealand and then verified with independent data. The model gave good predictions which showed that its underlying assumptions are sound. However, this simple model had less predictive power for outputs based on variance, suggesting that some assumptions were lacking. Therefore, we extended the model to include higher variability between plants, thereby improving its predictions. This high variance model suggests that control measures that promote node death at the base of the plant or restrict the main stem growth rate will be more effective than those that reduce the number of branching events. The extended model forms a good basis for assessing the efficacy of various forms of control of this weed, including the recently-released leaf-feeding tradescantia leaf beetle (Neolema ogloblini).
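
    A minimal sketch of a Poisson-process growth simulation in the spirit described here is given below; the event types (node addition, branching, basal node death) follow the abstract, but the rates, time step and initial state are invented and do not reproduce the authors' parameterisation.

        import numpy as np

        rng = np.random.default_rng(1)

        def simulate(weeks=52, stem_growth=0.8, branch_rate=0.05, basal_death=0.3):
            """Toy Poisson-process growth of one T. fluminensis plant (rates per week)."""
            nodes, tips = 10, 1
            history = []
            for _ in range(weeks):
                new_nodes = rng.poisson(stem_growth * tips)    # main-stem extension at each tip
                branches = rng.poisson(branch_rate * nodes)    # branching events along the plant
                deaths = rng.poisson(basal_death)              # node death at the base
                nodes = max(nodes + new_nodes - deaths, 0)
                tips += branches
                history.append(nodes)
            return np.array(history)

        runs = np.array([simulate() for _ in range(200)])
        print("nodes after one year: mean", runs[:, -1].mean(), "variance", runs[:, -1].var())

    Comparing the simulated mean and variance across runs mirrors the point made above that variance-based outputs are a stricter test of the model than means.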

  2. An integrated model for the assessment of global water resources – Part 2: Applications and assessments

    Directory of Open Access Journals (Sweden)

    N. Hanasaki

    2008-07-01

    Full Text Available To assess global water resources from the perspective of subannual variation in water availability and water use, an integrated water resources model was developed. In a companion report, we presented the global meteorological forcing input used to drive the model and six modules, namely, the land surface hydrology module, the river routing module, the crop growth module, the reservoir operation module, the environmental flow requirement module, and the anthropogenic withdrawal module. Here, we present the results of the model application and global water resources assessments. First, the timing and volume of simulated agriculture water use were examined because agricultural use composes approximately 85% of total consumptive water withdrawal in the world. The estimated crop calendar showed good agreement with earlier reports for wheat, maize, and rice in major countries of production. In major countries, the error in the planting date was ±1 mo, but there were some exceptional cases. The estimated irrigation water withdrawal also showed fair agreement with country statistics, but tended to be underestimated in countries in the Asian monsoon region. The results indicate the validity of the model and the input meteorological forcing because site-specific parameter tuning was not used in the series of simulations. Finally, global water resources were assessed on a subannual basis using a newly devised index. This index located water-stressed regions that were undetected in earlier studies. These regions, which are indicated by a gap in the subannual distribution of water availability and water use, include the Sahel, the Asian monsoon region, and southern Africa. The simulation results show that the reservoir operations of major reservoirs (>1 km3 and the allocation of environmental flow requirements can alter the population under high water stress by approximately −11% to +5% globally. The integrated model is applicable to
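
    The contrast between annual and sub-annual water stress indices can be illustrated with a small calculation like the one below; the monthly availability and withdrawal figures and the 0.4 "high stress" threshold are illustrative conventions, not the newly devised index used in the paper.

        import numpy as np

        # Illustrative monthly water availability Q and withdrawal W for one region (km3/month).
        Q = np.array([4.0, 3.5, 3.0, 2.0, 1.2, 0.8, 0.6, 0.7, 1.0, 1.8, 2.5, 3.5])
        W = np.array([0.5, 0.5, 0.8, 1.0, 1.1, 1.0, 0.9, 0.8, 0.7, 0.6, 0.5, 0.5])

        annual_ratio = W.sum() / Q.sum()      # classic annual withdrawal-to-availability ratio
        monthly_ratio = W / Q                 # exposes shortfalls hidden by the annual average
        stressed_months = int((monthly_ratio > 0.4).sum())

        print(f"annual ratio {annual_ratio:.2f}; months above 0.4: {stressed_months}")

    Here the annual ratio stays below the conventional 0.4 line while several dry-season months exceed it, which is exactly the kind of gap between the subannual distribution of water availability and water use that a sub-annual index is designed to detect.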

  3. Confidence assessment. Site descriptive modelling SDM-Site Forsmark

    International Nuclear Information System (INIS)

    2008-09-01

    The objective of this report is to assess the confidence that can be placed in the Forsmark site descriptive model, based on the information available at the conclusion of the surface-based investigations (SDM-Site Forsmark). In this exploration, an overriding question is whether remaining uncertainties are significant for repository engineering design or long-term safety assessment and could successfully be further reduced by more surface-based investigations or more usefully by explorations underground made during construction of the repository. The confidence in the Forsmark site descriptive model, based on the data available at the conclusion of the surface-based site investigations, has been assessed by exploring: confidence in the site characterisation database; key remaining issues and their handling; handling of alternative models; consistency between disciplines; and main reasons for confidence and lack of confidence in the model. It is generally found that the key aspects of importance for safety assessment and repository engineering of the Forsmark site descriptive model are associated with a high degree of confidence. Because of the robust geological model that describes the site, the overall confidence in the Forsmark site descriptive model is judged to be high. While some aspects have lower confidence, this lack of confidence is handled by providing wider uncertainty ranges, bounding estimates and/or alternative models. Most, but not all, of the low-confidence aspects have little impact on repository engineering design or long-term safety. Poor precision in the measured data is judged to have limited impact on uncertainties in the site descriptive model, with the exceptions of inaccuracy in determining the position of some boreholes at depth in 3-D space, the poor precision of the orientation of BIPS images in some boreholes, and the poor precision of stress data determined by overcoring at the locations where the pre

  4. Sustainable BECCS pathways evaluated by an integrated assessment model

    Science.gov (United States)

    Kato, E.

    2017-12-01

    Negative emissions technologies, particularly Bioenergy with Carbon Capture and Storage (BECCS), are key components of mitigation strategies in ambitious future socioeconomic scenarios analysed by integrated assessment models. Generally, scenarios aiming to keep the mean global temperature rise below 2°C above pre-industrial levels would require net negative carbon emissions at the end of the 21st century. Also, in the context of the Paris Agreement, which acknowledges "a balance between anthropogenic emissions by sources and removals by sinks of greenhouse gases in the second half of this century", RD&D for negative emissions technologies in this decade has a crucial role for the possibility of early deployment of the technology. Because of the potentially extensive use of land and water required to produce the bioenergy feedstock for the anticipated level of gross negative emissions, research on how to develop sustainable BECCS scenarios is needed. Here, we present BECCS deployment scenarios that consider an economically viable flow of the bioenergy system, including power generation and conversion processes to liquid and gaseous fuels for transportation and heat, with consideration of sustainable global biomass use. In the modelling process, detailed bioenergy representations, i.e. various feedstock and conversion technologies with and without CCS, are implemented in the integrated assessment (IA) model GRAPE (Global Relationship Assessment to Protect the Environment). Also, to overcome a general discrepancy in assumed future agricultural yields between 'top-down' IA models and 'bottom-up' estimates, which would crucially affect the land-use pattern, we applied yield changes of food and energy crops consistent with process-based biophysical crop models in consideration of changing climate conditions. Using the framework, economically viable strategies for implementing sustainable bioenergy and BECCS flows are evaluated in scenarios targeting to keep global average

  5. The Gain-Loss Model: A Probabilistic Skill Multimap Model for Assessing Learning Processes

    Science.gov (United States)

    Robusto, Egidio; Stefanutti, Luca; Anselmi, Pasquale

    2010-01-01

    Within the theoretical framework of knowledge space theory, a probabilistic skill multimap model for assessing learning processes is proposed. The learning process of a student is modeled as a function of the student's knowledge and of an educational intervention on the attainment of specific skills required to solve problems in a knowledge…

  6. Modeling Exposure to Persistent Chemicals in Hazard and Risk Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Cowan-Ellsberry, Christina E.; McLachlan, Michael S.; Arnot, Jon A.; MacLeod, Matthew; McKone, Thomas E.; Wania, Frank

    2008-11-01

    Fate and exposure modeling has not thus far been explicitly used in the risk profile documents prepared to evaluate significant adverse effect of candidate chemicals for either the Stockholm Convention or the Convention on Long-Range Transboundary Air Pollution. However, we believe models have considerable potential to improve the risk profiles. Fate and exposure models are already used routinely in other similar regulatory applications to inform decisions, and they have been instrumental in building our current understanding of the fate of POP and PBT chemicals in the environment. The goal of this paper is to motivate the use of fate and exposure models in preparing risk profiles in the POP assessment procedure by providing strategies for incorporating and using models. The ways that fate and exposure models can be used to improve and inform the development of risk profiles include: (1) Benchmarking the ratio of exposure and emissions of candidate chemicals to the same ratio for known POPs, thereby opening the possibility of combining this ratio with the relative emissions and relative toxicity to arrive at a measure of relative risk. (2) Directly estimating the exposure of the environment, biota and humans to provide information to complement measurements, or where measurements are not available or are limited. (3) To identify the key processes and chemical and/or environmental parameters that determine the exposure; thereby allowing the effective prioritization of research or measurements to improve the risk profile. (4) Predicting future time trends including how quickly exposure levels in remote areas would respond to reductions in emissions. Currently there is no standardized consensus model for use in the risk profile context. Therefore, to choose the appropriate model the risk profile developer must evaluate how appropriate an existing model is for a specific setting and whether the assumptions and input data are relevant in the context of the application

  7. Modeling exposure to persistent chemicals in hazard and risk assessment.

    Science.gov (United States)

    Cowan-Ellsberry, Christina E; McLachlan, Michael S; Arnot, Jon A; Macleod, Matthew; McKone, Thomas E; Wania, Frank

    2009-10-01

    Fate and exposure modeling has not, thus far, been explicitly used in the risk profile documents prepared for evaluating the significant adverse effect of candidate chemicals for either the Stockholm Convention or the Convention on Long-Range Transboundary Air Pollution. However, we believe models have considerable potential to improve the risk profiles. Fate and exposure models are already used routinely in other similar regulatory applications to inform decisions, and they have been instrumental in building our current understanding of the fate of persistent organic pollutants (POP) and persistent, bioaccumulative, and toxic (PBT) chemicals in the environment. The goal of this publication is to motivate the use of fate and exposure models in preparing risk profiles in the POP assessment procedure by providing strategies for incorporating and using models. The ways that fate and exposure models can be used to improve and inform the development of risk profiles include 1) benchmarking the ratio of exposure and emissions of candidate chemicals to the same ratio for known POPs, thereby opening the possibility of combining this ratio with the relative emissions and relative toxicity to arrive at a measure of relative risk; 2) directly estimating the exposure of the environment, biota, and humans to provide information to complement measurements or where measurements are not available or are limited; 3) to identify the key processes and chemical or environmental parameters that determine the exposure, thereby allowing the effective prioritization of research or measurements to improve the risk profile; and 4) forecasting future time trends, including how quickly exposure levels in remote areas would respond to reductions in emissions. Currently there is no standardized consensus model for use in the risk profile context. Therefore, to choose the appropriate model the risk profile developer must evaluate how appropriate an existing model is for a specific setting and

  8. Surrogacy assessment using principal stratification and a Gaussian copula model.

    Science.gov (United States)

    Conlon, Asc; Taylor, Jmg; Elliott, M R

    2017-02-01

    In clinical trials, a surrogate outcome (S) can be measured before the outcome of interest (T) and may provide early information regarding the treatment (Z) effect on T. Many methods of surrogacy validation rely on models for the conditional distribution of T given Z and S. However, S is a post-randomization variable, and unobserved, simultaneous predictors of S and T may exist, resulting in a non-causal interpretation. Frangakis and Rubin developed the concept of principal surrogacy, stratifying on the joint distribution of the surrogate marker under treatment and control to assess the association between the causal effects of treatment on the marker and the causal effects of treatment on the clinical outcome. Working within the principal surrogacy framework, we address the scenario of an ordinal categorical variable as a surrogate for a censored failure time true endpoint. A Gaussian copula model is used to model the joint distribution of the potential outcomes of T, given the potential outcomes of S. Because the proposed model cannot be fully identified from the data, we use a Bayesian estimation approach with prior distributions consistent with reasonable assumptions in the surrogacy assessment setting. The method is applied to data from a colorectal cancer clinical trial, previously analyzed by Burzykowski et al.

  9. Architecture for Integrated Medical Model Dynamic Probabilistic Risk Assessment

    Science.gov (United States)

    Jaworske, D. A.; Myers, J. G.; Goodenow, D.; Young, M.; Arellano, J. D.

    2016-01-01

    Probabilistic Risk Assessment (PRA) is a modeling tool used to predict potential outcomes of a complex system based on a statistical understanding of many initiating events. Utilizing a Monte Carlo method, thousands of instances of the model are considered and outcomes are collected. PRA is considered static, utilizing probabilities alone to calculate outcomes. Dynamic Probabilistic Risk Assessment (dPRA) is an advanced concept where modeling predicts the outcomes of a complex system based not only on the probabilities of many initiating events, but also on a progression of dependencies brought about by progressing down a time line. Events are placed in a single time line, adding each event to a queue, as managed by a planner. Progression down the time line is guided by rules, as managed by a scheduler. The recently developed Integrated Medical Model (IMM) summarizes astronaut health as governed by the probabilities of medical events and mitigation strategies. Managing the software architecture process provides a systematic means of creating, documenting, and communicating a software design early in the development process. The software architecture process begins with establishing requirements and the design is then derived from the requirements.

  10. Improving treatment outcome assessment in a mouse tuberculosis model.

    Science.gov (United States)

    Mourik, Bas C; Svensson, Robin J; de Knegt, Gerjo J; Bax, Hannelore I; Verbon, Annelies; Simonsson, Ulrika S H; de Steenwinkel, Jurriaan E M

    2018-04-09

    Preclinical treatment outcome evaluation of tuberculosis (TB) occurs primarily in mice. Current designs compare relapse rates of different regimens at selected time points, but lack information about the correlation between treatment length and treatment outcome, which is required to efficiently estimate a regimen's treatment-shortening potential. Therefore, we developed a new approach. BALB/c mice were infected with a Mycobacterium tuberculosis Beijing genotype strain and were treated with rifapentine-pyrazinamide-isoniazid-ethambutol (RpZHE), rifampicin-pyrazinamide-moxifloxacin-ethambutol (RZME) or rifampicin-pyrazinamide-moxifloxacin-isoniazid (RZMH). Treatment outcome was assessed in n = 3 mice after 9 different treatment lengths between 2 and 6 months. Next, we created a mathematical model that best fitted the observational data and used this for inter-regimen comparison. The observed data were best described by a sigmoidal Emax model, in favor of linear or conventional Emax models. Estimating regimen-specific parameters showed significantly higher curative potentials for RZME and RpZHE compared to RZMH. In conclusion, we provide a new design for treatment outcome evaluation in a mouse TB model, which (i) provides accurate tools for assessment of the relationship between treatment length and predicted cure, (ii) allows for efficient comparison between regimens and (iii) adheres to the reduction and refinement principles of laboratory animal use.
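
    For readers unfamiliar with the model class, the sketch below fits a sigmoidal Emax (Hill-type) curve relating treatment length to the probability of cure; the data points and starting values are made up, and the actual analysis in the paper is a formal model comparison, not this simple least-squares fit.

        import numpy as np
        from scipy.optimize import curve_fit

        def sigmoid_emax(t, emax, et50, gamma):
            """Sigmoidal Emax model: predicted probability of cure after t months of treatment."""
            return emax * t**gamma / (et50**gamma + t**gamma)

        # Illustrative observations: treatment length (months) and fraction of mice cured.
        t_obs = np.array([2.0, 2.5, 3.0, 3.5, 4.0, 4.5, 5.0, 5.5, 6.0])
        cured = np.array([0.0, 0.1, 0.2, 0.4, 0.6, 0.8, 0.9, 1.0, 1.0])

        popt, _ = curve_fit(sigmoid_emax, t_obs, cured, p0=[1.0, 3.5, 4.0], maxfev=10000)
        print("Emax = %.2f, ET50 = %.2f months, gamma = %.1f" % tuple(popt))
        print("predicted cure after 3 months: %.2f" % sigmoid_emax(3.0, *popt))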

  11. Arc-related porphyry molybdenum deposit model: Chapter D in Mineral deposit models for resource assessment

    Science.gov (United States)

    Taylor, Ryan D.; Hammarstrom, Jane M.; Piatak, Nadine M.; Seal, Robert R.

    2012-01-01

    This report provides a descriptive model for arc-related porphyry molybdenum deposits. Presented within are geological, geochemical, and mineralogical characteristics that differentiate this deposit type from porphyry copper and alkali-feldspar rhyolite-granite porphyry molybdenum deposits. The U.S. Geological Survey's effort to update existing mineral deposit models spurred this research, which is intended to supplement previously published models for this deposit type that help guide mineral-resource and mineral-environmental assessments.

  12. Assigning probability distributions to input parameters of performance assessment models

    Energy Technology Data Exchange (ETDEWEB)

    Mishra, Srikanta [INTERA Inc., Austin, TX (United States)

    2002-02-01

    This study presents an overview of various approaches for assigning probability distributions to input parameters and/or future states of performance assessment models. Specifically, three broad approaches are discussed for developing input distributions: (a) fitting continuous distributions to data, (b) subjective assessment of probabilities, and (c) Bayesian updating of prior knowledge based on new information. The report begins with a summary of the nature of data and distributions, followed by a discussion of several common theoretical parametric models for characterizing distributions. Next, various techniques are presented for fitting continuous distributions to data. These include probability plotting, method of moments, maximum likelihood estimation and nonlinear least squares analysis. The techniques are demonstrated using data from a recent performance assessment study for the Yucca Mountain project. Goodness of fit techniques are also discussed, followed by an overview of how distribution fitting is accomplished in commercial software packages. The issue of subjective assessment of probabilities is dealt with in terms of the maximum entropy distribution selection approach, as well as some common rules for codifying informal expert judgment. Formal expert elicitation protocols are discussed next, and are based primarily on the guidance provided by the US NRC. The Bayesian framework for updating prior distributions (beliefs) when new information becomes available is discussed. A simple numerical approach is presented for facilitating practical applications of the Bayes theorem. Finally, a systematic framework for assigning distributions is presented: (a) for the situation where enough data are available to define an empirical CDF or fit a parametric model to the data, and (b) to deal with the situation where only a limited amount of information is available.
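
    Two of the fitting approaches mentioned above, maximum likelihood and the method of moments, can be sketched for a lognormal input parameter as follows; the synthetic data stand in for real performance assessment measurements and the lognormal choice is only an example.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)
        data = rng.lognormal(mean=1.0, sigma=0.5, size=40)      # stand-in for measured values

        # (a) Maximum likelihood fit of a lognormal with the location fixed at zero.
        shape, loc, scale = stats.lognorm.fit(data, floc=0)
        print("MLE:     mu = %.3f, sigma = %.3f" % (np.log(scale), shape))

        # (b) Method of moments for the same model.
        m, v = data.mean(), data.var()
        sigma_mom = np.sqrt(np.log(1.0 + v / m**2))
        mu_mom = np.log(m) - 0.5 * sigma_mom**2
        print("Moments: mu = %.3f, sigma = %.3f" % (mu_mom, sigma_mom))

        # Goodness of fit: Kolmogorov-Smirnov test of the data against the MLE fit.
        print(stats.kstest(data, "lognorm", args=(shape, loc, scale)))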

  13. Assigning probability distributions to input parameters of performance assessment models

    International Nuclear Information System (INIS)

    Mishra, Srikanta

    2002-02-01

    This study presents an overview of various approaches for assigning probability distributions to input parameters and/or future states of performance assessment models. Specifically, three broad approaches are discussed for developing input distributions: (a) fitting continuous distributions to data, (b) subjective assessment of probabilities, and (c) Bayesian updating of prior knowledge based on new information. The report begins with a summary of the nature of data and distributions, followed by a discussion of several common theoretical parametric models for characterizing distributions. Next, various techniques are presented for fitting continuous distributions to data. These include probability plotting, method of moments, maximum likelihood estimation and nonlinear least squares analysis. The techniques are demonstrated using data from a recent performance assessment study for the Yucca Mountain project. Goodness of fit techniques are also discussed, followed by an overview of how distribution fitting is accomplished in commercial software packages. The issue of subjective assessment of probabilities is dealt with in terms of the maximum entropy distribution selection approach, as well as some common rules for codifying informal expert judgment. Formal expert elicitation protocols are discussed next, and are based primarily on the guidance provided by the US NRC. The Bayesian framework for updating prior distributions (beliefs) when new information becomes available is discussed. A simple numerical approach is presented for facilitating practical applications of the Bayes theorem. Finally, a systematic framework for assigning distributions is presented: (a) for the situation where enough data are available to define an empirical CDF or fit a parametric model to the data, and (b) to deal with the situation where only a limited amount of information is available

  14. Evaluating intersectoral collaboration: a model for assessment by service users

    Directory of Open Access Journals (Sweden)

    Bengt Ahgren

    2009-02-01

    Full Text Available Introduction: DELTA was launched as a project in 1997 to improve intersectoral collaboration in the rehabilitation field. In 2005 DELTA was transformed into a local association for financial co-ordination between the institutions involved. Based on a study of the DELTA service users, the purpose of this article is to develop and to validate a model that can be used to assess the integration of welfare services from the perspective of the service users. Theory: The foundation of integration is a well-functioning structure of integration. Without such structural conditions, it is difficult to develop a process of integration that combines the resources and competences of the collaborating organisations to create services advantageous for the service users. In this way, both the structure and the process will contribute to the outcome of integration. Method: The study was carried out as a retrospective cross-sectional survey during two weeks, including all the current service users of DELTA. The questionnaire contained 32 questions, which were derived from the theoretical framework and research on service users, capturing perceptions of integration structure, process and outcome. Ordinal scales and open questions were used for the assessment. Results: The survey had a response rate of 82% and no serious biases of the results were detected. The study shows that the users of the rehabilitation services perceived the services as well integrated, relevant and adapted to their needs. The assessment model was tested for reliability and validity and a few modifications were suggested. Some key measurement themes were derived from the study. Conclusion: The model developed in this study is an important step towards an assessment of service integration from the perspective of the service users. It needs to be further refined, however, before it can be used in other evaluations of collaboration in the provision of integrated welfare services.

  15. Are revised models better models? A skill score assessment of regional interannual variability

    Science.gov (United States)

    Sperber, Kenneth R.; Participating AMIP Modelling Groups

    1999-05-01

    Various skill scores are used to assess the performance of revised models relative to their original configurations. The interannual variability of all-India, Sahel and Nordeste rainfall and summer monsoon windshear is examined in integrations performed under the experimental design of the Atmospheric Model Intercomparison Project. For the indices considered, the revised models exhibit greater fidelity at simulating the observed interannual variability. Interannual variability of all-India rainfall is better simulated by models that have a more realistic rainfall climatology in the vicinity of India, indicating the beneficial effect of reducing systematic model error.
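
    A generic example of the kind of skill score used for such comparisons is sketched below: the mean-squared-error skill of each model configuration is measured against a zero-anomaly (climatology) reference. The rainfall anomaly numbers are invented, and the actual study uses several different skill scores.

        import numpy as np

        def skill_score(sim, obs, ref):
            """1 - MSE(sim)/MSE(ref); positive values mean the model beats the reference."""
            mse = lambda x: np.mean((np.asarray(x) - np.asarray(obs)) ** 2)
            return 1.0 - mse(sim) / mse(ref)

        obs = [1.2, -0.5, 0.8, -1.1, 0.3, 0.9, -0.7]           # observed rainfall anomalies
        original = [0.4, -0.1, 0.2, -0.3, 0.1, 0.2, -0.2]      # original model configuration
        revised = [0.9, -0.4, 0.6, -0.8, 0.2, 0.7, -0.5]       # revised model configuration
        climatology = np.zeros(len(obs))                       # zero-anomaly reference forecast

        print("original vs climatology:", round(skill_score(original, obs, climatology), 2))
        print("revised vs climatology: ", round(skill_score(revised, obs, climatology), 2))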

  16. Habitat hydraulic models - a tool for Danish stream quality assessment?

    DEFF Research Database (Denmark)

    Olsen, Martin

    and hydromorphological and chemical characteristics has to be elucidated (EUROPA, 2005). This study links catchment hydrology, stream discharge and physical habitat in a small Danish stream, the stream Ledreborg, and discusses the utility of habitat hydraulic models in relation to the present criteria and methods used ... Hydromorphological conditions in the stream are measured through a field study, using a habitat mapping approach, and modelled using a habitat hydraulic model (RHYHABSIM). Using RHYHABSIM and both "site-specific" and general HSIs, Weighted Usable Area (WUA) for the trout population at different discharges is assessed ... and differences between simulated WUA using "site-specific" and general habitat preferences are discussed. In RHYHABSIM it is possible to use two different approaches to investigate the hydromorphological conditions in a river: the habitat mapping approach used in this project and the representative reach...

  17. Permafrost Degradation Risk Zone Assessment using Simulation Models

    DEFF Research Database (Denmark)

    Daanen, R.P.; Ingeman-Nielsen, Thomas; Marchenko, S.

    2011-01-01

    In this proof-of-concept study we focus on linking large-scale climate and permafrost simulations to small-scale engineering projects by bridging the gap between climate and permafrost sciences on the one hand and, on the other, technical recommendations for the adaptation of planned infrastructures to climate change in a region generally underlain by permafrost. We present the current and future state of permafrost in Greenland as modelled numerically with the GIPL model driven by HIRHAM climate projections up to 2080. We develop a concept called Permafrost Thaw Potential (PTP), defined as the potential active layer increase due to climate warming and surface alterations. PTP is then used in a simple risk assessment procedure useful for engineering applications. The modelling shows that climate warming will result in continuing widespread permafrost warming and degradation in Greenland...

  18. Application of a leakage model to assess exfiltration from sewers.

    Science.gov (United States)

    Karpf, C; Krebs, P

    2005-01-01

    The exfiltration of wastewater from sewer systems in urban areas causes a deterioration of soil and possibly groundwater quality. Besides the simulation of transport and degradation processes in the unsaturated zone and in the aquifer, the analysis of the potential impact requires the estimation of the quantity and temporal variation of wastewater exfiltration. Exfiltration can be assessed by the application of a leakage model. This hydrological approach was originally developed to simulate the interactions between groundwater and surface water; it was adapted to allow for modelling of interactions between the groundwater and the sewer system. In order to approximate the exfiltration-specific model parameters, infiltration-specific parameters were used as a basis. Scenario analysis of the exfiltration in the City of Dresden from 1997 to 1999 and during the flood event in August 2002 shows the variation and the extent of exfiltration rates.
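
    A hydrological leakage approach of the general kind described here can be sketched as a head-difference-driven flux; the leakage coefficient, wetted defect area and water levels below are invented values, not the Dresden parameterisation.

        import numpy as np

        def exfiltration_rate(h_sewer, h_gw, leakage_coeff, wetted_area):
            """Leakage-type exfiltration flux (m3/s); negative head differences, which
            would correspond to infiltration, are clipped to zero in this sketch."""
            return leakage_coeff * wetted_area * max(h_sewer - h_gw, 0.0)

        hours = np.arange(24)
        h_sewer = 0.05 + 0.03 * np.sin(2 * np.pi * (hours - 8) / 24)   # m above pipe invert
        h_gw = -0.10            # groundwater table below the invert (m)
        leakage_coeff = 1e-6    # 1/s, lumping defect size and colmation-layer conductivity
        wetted_area = 0.4       # m2 of defective pipe surface in the reach

        q = [exfiltration_rate(h, h_gw, leakage_coeff, wetted_area) for h in h_sewer]
        print("daily exfiltration volume: %.3f m3" % (sum(q) * 3600))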

  19. Amazon rainforest responses to elevated CO2: Deriving model-based hypotheses for the AmazonFACE experiment

    Science.gov (United States)

    Rammig, A.; Fleischer, K.; Lapola, D.; Holm, J.; Hoosbeek, M.

    2017-12-01

    Increasing atmospheric CO2 concentration is assumed to have a stimulating effect ("CO2 fertilization effect") on forest growth and resilience. Empirical evidence, however, for the existence and strength of such a tropical CO2 fertilization effect is scarce and thus a major impediment for constraining the uncertainties in Earth System Model projections. The implications of the tropical CO2 effect are far-reaching, as it strongly influences the global carbon and water cycle, and hence future global climate. In the scope of the Amazon Free Air CO2 Enrichment (FACE) experiment, we addressed these uncertainties by assessing the CO2 fertilization effect at ecosystem scale. AmazonFACE is the first FACE experiment in an old-growth, highly diverse tropical rainforest. Here, we present a priori model-based hypotheses for the experiment derived from a set of 12 ecosystem models. Model simulations identified key uncertainties in our understanding of limiting processes and derived model-based hypotheses of expected ecosystem responses to elevated CO2 that can directly be tested during the experiment. Ambient model simulations compared satisfactorily with in-situ measurements of ecosystem carbon fluxes, as well as carbon, nitrogen, and phosphorus stocks. Models consistently predicted an increase in photosynthesis with elevated CO2, which declined over time due to developing limitations. The conversion of enhanced photosynthesis into biomass, and hence ecosystem carbon sequestration, varied strongly among the models due to different assumptions on nutrient limitation. Models with flexible allocation schemes consistently predicted an increased investment in belowground structures to alleviate nutrient limitation, in turn accelerating turnover rates of soil organic matter. The models diverged on the prediction for carbon accumulation after 10 years of elevated CO2, mainly due to contrasting assumptions in their phosphorus cycle representation. These differences define the expected

  20. Assessment of realizability constraints in v2-f turbulence models

    International Nuclear Information System (INIS)

    Sveningsson, A.; Davidson, L.

    2004-01-01

    The use of the realizability constraint in v2-f turbulence models is assessed by computing a stator vane passage flow. In this flow the stagnation region is large and it is shown that the time scale bound suggested by [Int. J. Heat Fluid Flow 17 (1995) 89] is well suited to prevent unphysical growth of turbulence kinetic energy. However, this constraint causes numerical instabilities when used in the equation for the relaxation parameter, f. It is also shown that the standard use of the realizability constraint in the v2-f model is inconsistent and some modifications are suggested. These changes to the v2-f model are examined and shown to have a negligible effect on the overall performance of the v2-f model. In this work two different versions of the v2-f model are investigated and the results obtained are compared with experimental data. The model in a form similar to that originally suggested by Durbin (e.g. [AIAA J. 33 (1995) 659]) produced the overall best agreement with the stator vane heat transfer data

  1. ACCURACY ASSESSMENT OF RECENT GLOBAL OCEAN TIDE MODELS AROUND ANTARCTICA

    Directory of Open Access Journals (Sweden)

    J. Lei

    2017-09-01

    Full Text Available Due to the coverage limitation of T/P-series altimeters, the lack of bathymetric data under large ice shelves, and the inaccurate definitions of coastlines and grounding lines, the accuracy of ocean tide models around Antarctica is poorer than that in deep oceans. Using tidal measurements from tide gauges, gravimetric data and GPS records, the accuracy of seven state-of-the-art global ocean tide models (DTU10, EOT11a, GOT4.8, FES2012, FES2014, HAMTIDE12, TPXO8) is assessed, as well as the most widely-used conventional model FES2004. Four regions (Antarctic Peninsula region, Amery ice shelf region, Filchner-Ronne ice shelf region and Ross ice shelf region) are reported separately. The standard deviations of the eight main constituents between the selected models are large in polar regions, especially under the big ice shelves, suggesting that the uncertainty in these regions remains large. Comparisons with in situ tidal measurements show that the most accurate model is TPXO8, and that all models show the worst performance in the Weddell Sea and Filchner-Ronne ice shelf regions. The accuracy of tidal predictions around Antarctica is gradually improving.
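
    Model-versus-gauge comparisons of the sort reported here are often summarised with a vector-difference RMS per constituent; the sketch below shows one common convention with made-up M2 constants, not the paper's station data.

        import numpy as np

        def constituent_rms(amp_obs, pha_obs, amp_mod, pha_mod):
            """Vector-difference RMS (amplitude units) between modelled and observed
            tidal constants at N sites, for a single constituent."""
            d = (amp_obs * np.exp(1j * np.radians(pha_obs))
                 - amp_mod * np.exp(1j * np.radians(pha_mod)))
            return np.sqrt(0.5 * np.mean(np.abs(d) ** 2))

        # Illustrative M2 constants (amplitude in cm, phase lag in degrees) at four gauges.
        amp_obs, pha_obs = np.array([95.0, 60.0, 42.0, 110.0]), np.array([30.0, 75.0, 200.0, 310.0])
        amp_mod, pha_mod = np.array([92.0, 63.0, 50.0, 104.0]), np.array([33.0, 70.0, 190.0, 315.0])
        print("M2 RMS misfit: %.1f cm" % constituent_rms(amp_obs, pha_obs, amp_mod, pha_mod))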

  2. Task-based dermal exposure models for regulatory risk assessment.

    Science.gov (United States)

    Warren, Nicholas D; Marquart, Hans; Christopher, Yvette; Laitinen, Juha; VAN Hemmen, Joop J

    2006-07-01

    The regulatory risk assessment of chemicals requires the estimation of occupational dermal exposure. Until recently, the models used were either based on limited data or were specific to a particular class of chemical or application. The EU project RISKOFDERM has gathered a considerable number of new measurements of dermal exposure together with detailed contextual information. This article describes the development of a set of generic task-based models capable of predicting potential dermal exposure to both solids and liquids in a wide range of situations. To facilitate modelling of the wide variety of dermal exposure situations, six separate models were made for groupings of exposure scenarios called Dermal Exposure Operation units (DEO units). These task-based groupings cluster exposure scenarios with regard to the expected routes of dermal exposure and the expected influence of exposure determinants. Within these groupings, linear mixed-effect models were used to estimate the influence of various exposure determinants and to estimate components of variance. The models predict median potential dermal exposure rates for the hands and the rest of the body from the values of relevant exposure determinants. These rates are expressed as mg or µl of product per minute. Using these median potential dermal exposure rates and an accompanying geometric standard deviation allows a range of exposure percentiles to be calculated.
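
    The last step, turning a predicted median rate and a geometric standard deviation into exposure percentiles, is a standard lognormal calculation, sketched below with invented numbers rather than model output from RISKOFDERM.

        from statistics import NormalDist

        def lognormal_percentile(median, gsd, p):
            """Percentile of a lognormal exposure distribution from its median and GSD."""
            return median * gsd ** NormalDist().inv_cdf(p)

        # Illustrative model output: median potential dermal exposure rate for the hands.
        median_rate, gsd = 2.5, 3.0          # mg of product per minute, and its GSD
        for p in (0.50, 0.75, 0.90, 0.95):
            print(f"P{int(p * 100)}: {lognormal_percentile(median_rate, gsd, p):.1f} mg/min")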

  3. Dose Assessment Model for Chronic Atmospheric Releases of Tritium

    International Nuclear Information System (INIS)

    Shen Huifang; Yao Rentai

    2010-01-01

    An improved dose assessment model for chronic atmospheric releases of tritium was proposed. The proposed model explicitly considered two chemical forms of tritium. It was based on a conservative assumption for the transfer of tritiated water (HTO) from air to the concentrations of HTO and organically bound tritium (OBT) in vegetable and animal products. The concentration of tritium in plant products was calculated by considering leafy and non-leafy plants separately, while the contribution of tritium in soil to the tritium concentration in the different plants was taken into account. In calculating the concentration of HTO in animal products, the average water fraction of the animal products and the weighted average tritium concentration of ingested water, based on the fraction of water supplied by each source, were considered, including skin absorption, inhalation, drinking water and food. In calculating the annual doses, the ingestion doses were considered together with the contributions of inhalation and skin absorption. Concentrations in foodstuffs and annual adult doses calculated with the specific activity model, the NEWTRI model and the model proposed in this paper were compared. The results indicate that the model proposed in this paper can accurately predict tritium doses through the food chain from chronic atmospheric releases. (authors)

  4. Flightdeck Automation Problems (FLAP) Model for Safety Technology Portfolio Assessment

    Science.gov (United States)

    Ancel, Ersin; Shih, Ann T.

    2014-01-01

    NASA's Aviation Safety Program (AvSP) develops and advances methodologies and technologies to improve air transportation safety. The Safety Analysis and Integration Team (SAIT) conducts a safety technology portfolio assessment (PA) to analyze the program content, to examine the benefits and risks of products with respect to program goals, and to support programmatic decision making. The PA process includes systematic identification of current and future safety risks as well as tracking several quantitative and qualitative metrics to ensure the program goals are addressing prominent safety risks accurately and effectively. One of the metrics within the PA process involves using quantitative aviation safety models to gauge the impact of the safety products. This paper demonstrates the role of aviation safety modeling by providing model outputs and evaluating a sample of portfolio elements using the Flightdeck Automation Problems (FLAP) model. The model enables not only ranking of the quantitative relative risk reduction impact of all portfolio elements, but also highlighting the areas with high potential impact via sensitivity and gap analyses in support of the program office. Although the model outputs are preliminary and products are notional, the process shown in this paper is essential to a comprehensive PA of NASA's safety products in the current program and future programs/projects.

  5. USING COPULAS TO MODEL DEPENDENCE IN SIMULATION RISK ASSESSMENT

    Energy Technology Data Exchange (ETDEWEB)

    Dana L. Kelly

    2007-11-01

    Typical engineering systems in applications with high failure consequences such as nuclear reactor plants often employ redundancy and diversity of equipment in an effort to lower the probability of failure and therefore risk. However, it has long been recognized that dependencies exist in these redundant and diverse systems. Some dependencies, such as common sources of electrical power, are typically captured in the logic structure of the risk model. Others, usually referred to as intercomponent dependencies, are treated implicitly by introducing one or more statistical parameters into the model. Such common-cause failure models have limitations in a simulation environment. In addition, substantial subjectivity is associated with parameter estimation for these models. This paper describes an approach in which system performance is simulated by drawing samples from the joint distributions of dependent variables. The approach relies on the notion of a copula distribution, a notion which has been employed by the actuarial community for ten years or more, but which has seen only limited application in technological risk assessment. The paper also illustrates how equipment failure data can be used in a Bayesian framework to estimate the parameter values in the copula model. This approach avoids much of the subjectivity required to estimate parameters in traditional common-cause failure models. Simulation examples are presented for failures in time. The open-source software package R is used to perform the simulations. The open-source software package WinBUGS is used to perform the Bayesian inference via Markov chain Monte Carlo sampling.
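
    A minimal version of the simulation idea, sampling dependent failure times of two redundant components through a Gaussian copula with exponential margins, is sketched below in Python rather than R; the correlation, failure rates and mission time are illustrative assumptions.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(42)
        rho = 0.6                            # dependence between two redundant pumps
        mttf = np.array([5000.0, 5000.0])    # mean time to failure of each pump (hours)

        # Gaussian copula: correlated standard normals -> uniforms -> exponential margins.
        cov = [[1.0, rho], [rho, 1.0]]
        z = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=100_000)
        u = stats.norm.cdf(z)
        t = stats.expon.ppf(u, scale=mttf)   # correlated failure times (hours)

        mission = 1000.0                     # mission time (hours)
        both_fail = np.mean((t < mission).all(axis=1))
        independent = np.prod(stats.expon.cdf(mission, scale=mttf))
        print(f"P(both fail) with copula: {both_fail:.4f} vs independent: {independent:.4f}")

    With positive dependence the joint failure probability exceeds the product of the marginals, which is the effect that traditional common-cause failure parameters try to capture implicitly.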

  6. Accuracy Assessment of Recent Global Ocean Tide Models around Antarctica

    Science.gov (United States)

    Lei, J.; Li, F.; Zhang, S.; Ke, H.; Zhang, Q.; Li, W.

    2017-09-01

    Due to the coverage limitation of T/P-series altimeters, the lack of bathymetric data under large ice shelves, and the inaccurate definitions of coastlines and grounding lines, the accuracy of ocean tide models around Antarctica is poorer than that in deep oceans. Using tidal measurements from tide gauges, gravimetric data and GPS records, the accuracy of seven state-of-the-art global ocean tide models (DTU10, EOT11a, GOT4.8, FES2012, FES2014, HAMTIDE12, TPXO8) is assessed, as well as the most widely-used conventional model FES2004. Four regions (Antarctic Peninsula region, Amery ice shelf region, Filchner-Ronne ice shelf region and Ross ice shelf region) are reported separately. The standard deviations of the eight main constituents between the selected models are large in polar regions, especially under the big ice shelves, suggesting that the uncertainty in these regions remains large. Comparisons with in situ tidal measurements show that the most accurate model is TPXO8, and that all models show the worst performance in the Weddell Sea and Filchner-Ronne ice shelf regions. The accuracy of tidal predictions around Antarctica is gradually improving.

  7. Modeling logistic performance in quantitative microbial risk assessment.

    Science.gov (United States)

    Rijgersberg, Hajo; Tromp, Seth; Jacxsens, Liesbeth; Uyttendaele, Mieke

    2010-01-01

    In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage times, temperatures, gas conditions, and their distributions) are determined. However, the logistic chain with its queues (storages, shelves) and mechanisms for ordering products is usually not taken into account. As a consequence, storage times, which are mutually dependent in successive steps in the chain, cannot be described adequately. This may have a great impact on the tails of risk distributions. Because food safety risks are generally very small, it is crucial to model the tails of (underlying) distributions as accurately as possible. Logistic performance can be modeled by describing the underlying planning and scheduling mechanisms in discrete-event modeling. This is common practice in operations research, specifically in supply chain management. In this article, we present the application of discrete-event modeling in the context of a QMRA for Listeria monocytogenes in fresh-cut iceberg lettuce. We show the potential value of discrete-event modeling in QMRA by calculating logistic interventions (modifications in the logistic chain) and determining their significance with respect to food safety.
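
    A toy discrete-event sketch of the logistic mechanism discussed here, a FIFO retail shelf with daily deliveries and varying demand, is given below; the delivery and demand figures are invented and the real lettuce QMRA is far more detailed.

        import heapq
        import random

        random.seed(3)

        def simulate_shelf(days=60, packs_per_delivery=20, mean_demand=18):
            """FIFO shelf: daily delivery, uniformly varying daily demand;
            returns the storage time (days) of every pack sold."""
            shelf = []                 # heap of arrival days, so the oldest pack sells first
            storage_times = []
            for day in range(days):
                for _ in range(packs_per_delivery):                  # morning delivery
                    heapq.heappush(shelf, day)
                sold = min(len(shelf), random.randint(mean_demand - 5, mean_demand + 5))
                for _ in range(sold):                                # FIFO sales during the day
                    storage_times.append(day - heapq.heappop(shelf))
            return sorted(storage_times)

        times = simulate_shelf()
        print("median shelf time:", times[len(times) // 2], "days;",
              "95th percentile:", times[int(0.95 * len(times))], "days")

    Feeding storage-time distributions like this one into the microbial growth model is what lets the tails of the risk distribution reflect logistics rather than an assumed independent storage time per step.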

  8. Nutrient limitation and vegetation changes in a coastal dune slack

    NARCIS (Netherlands)

    Lammerts, EJ; Pegtel, DM; Grootjans, AP; van der Veen, A.

    Basiphilous pioneer species are among the most endangered plant species in The Netherlands. They find most of their refuges in young coastal dune slacks, especially on the Wadden Sea islands. For the purpose of nature management it is important to know which processes control the presence of

  9. Nutrient limitation of phytoplankton in five impoundments on the ...

    African Journals Online (AJOL)

    2011-02-08

    The Manyame River, which rises near the town of Marondera about 65 km east ... A 20 ℓ sample of depth integrated water ... so because the local authorities responsible for sewage treat- ... rural catchment the 2 arms of the.

  10. Literature Review and Assessment of Plant and Animal Transfer Factors Used in Performance Assessment Modeling

    International Nuclear Information System (INIS)

    Robertson, David E.; Cataldo, Dominic A.; Napier, Bruce A.; Krupka, Kenneth M.; Sasser, Lyle B.

    2003-01-01

    A literature review and assessment was conducted by Pacific Northwest National Laboratory (PNNL) to update information on plant and animal radionuclide transfer factors used in performance-assessment modeling. A group of 15 radionuclides was included in this review and assessment. The review is composed of four main sections, not including the Introduction. Section 2.0 provides a review of the critically important issue of physicochemical speciation and geochemistry of the radionuclides in natural soil-water systems as it relates to the bioavailability of the radionuclides. Section 3.0 provides an updated review of the parameters of importance in the uptake of radionuclides by plants, including root uptake via the soil-groundwater system and foliar uptake due to overhead irrigation. Section 3.0 also provides a compilation of concentration ratios (CRs) for soil-to-plant uptake for the 15 selected radionuclides. Section 4.0 provides an updated review on radionuclide uptake data for animal products related to absorption, homeostatic control, approach to equilibration, chemical and physical form, diet, and age. Compiled transfer coefficients are provided for cow's milk, sheep's milk, goat's milk, beef, goat meat, pork, poultry, and eggs. Section 5.0 discusses the use of transfer coefficients in soil, plant, and animal modeling using regulatory models for evaluating radioactive waste disposal or decommissioned sites. Each section makes specific suggestions for future research in its area.
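
    A minimal sketch of how such transfer factors enter a pathway calculation is given below; the soil concentration, concentration ratio and feed-to-milk transfer coefficient are placeholder values, since the real numbers are nuclide- and site-specific and should come from compilations such as the one reviewed here.

        # Placeholder values only; real CR and Fm values are nuclide- and site-specific.
        soil_conc = 50.0      # Bq/kg dry soil
        cr_plant = 0.05       # soil-to-plant concentration ratio, (Bq/kg plant)/(Bq/kg soil)
        feed_intake = 16.0    # kg dry feed eaten per day by a dairy cow
        f_milk = 1.0e-2       # d/L, feed-to-milk transfer coefficient

        plant_conc = soil_conc * cr_plant                 # Bq/kg dry plant
        milk_conc = plant_conc * feed_intake * f_milk     # Bq/L at equilibrium

        print(f"plant: {plant_conc:.2f} Bq/kg, milk: {milk_conc:.3f} Bq/L")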

  11. Literature Review and Assessment of Plant and Animal Transfer Factors Used in Performance Assessment Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Robertson, David E.; Cataldo, Dominic A.; Napier, Bruce A.; Krupka, Kenneth M.; Sasser, Lyle B.

    2003-07-20

    A literature review and assessment was conducted by Pacific Northwest National Laboratory (PNNL) to update information on plant and animal radionuclide transfer factors used in performance-assessment modeling. A group of 15 radionuclides was included in this review and assessment. The review is composed of four main sections, not including the Introduction. Section 2.0 provides a review of the critically important issue of physicochemical speciation and geochemistry of the radionuclides in natural soil-water systems as it relates to the bioavailability of the radionuclides. Section 3.0 provides an updated review of the parameters of importance in the uptake of radionuclides by plants, including root uptake via the soil-groundwater system and foliar uptake due to overhead irrigation. Section 3.0 also provides a compilation of concentration ratios (CRs) for soil-to-plant uptake for the 15 selected radionuclides. Section 4.0 provides an updated review on radionuclide uptake data for animal products related to absorption, homeostatic control, approach to equilibration, chemical and physical form, diet, and age. Compiled transfer coefficients are provided for cow’s milk, sheep’s milk, goat’s milk, beef, goat meat, pork, poultry, and eggs. Section 5.0 discusses the use of transfer coefficients in soil, plant, and animal modeling using regulatory models for evaluating radioactive waste disposal or decommissioned sites. Each section makes specific suggestions for future research in its area.

  12. A prediction model for assessing residential radon concentration in Switzerland

    International Nuclear Information System (INIS)

    Hauri, Dimitri D.; Huss, Anke; Zimmermann, Frank; Kuehni, Claudia E.; Röösli, Martin

    2012-01-01

    Indoor radon is regularly measured in Switzerland. However, a nationwide model to predict residential radon levels has not been developed. The aim of this study was to develop a prediction model to assess indoor radon concentrations in Switzerland. The model was based on 44,631 measurements from the nationwide Swiss radon database collected between 1994 and 2004. Of these, 80% randomly selected measurements were used for model development and the remaining 20% for an independent model validation. A multivariable log-linear regression model was fitted and relevant predictors selected according to evidence from the literature, the adjusted R², the Akaike's information criterion (AIC), and the Bayesian information criterion (BIC). The prediction model was evaluated by calculating Spearman rank correlation between measured and predicted values. Additionally, the predicted values were categorised into three categories (50th, 50th–90th and 90th percentile) and compared with measured categories using a weighted Kappa statistic. The most relevant predictors for indoor radon levels were tectonic units and year of construction of the building, followed by soil texture, degree of urbanisation, floor of the building where the measurement was taken and housing type (P-values <0.001 for all). Mean predicted radon values (geometric mean) were 66 Bq/m³ (interquartile range 40–111 Bq/m³) in the lowest exposure category, 126 Bq/m³ (69–215 Bq/m³) in the medium category, and 219 Bq/m³ (108–427 Bq/m³) in the highest category. Spearman correlation between predictions and measurements was 0.45 (95%-CI: 0.44; 0.46) for the development dataset and 0.44 (95%-CI: 0.42; 0.46) for the validation dataset. Kappa coefficients were 0.31 for the development and 0.30 for the validation dataset, respectively. The model explained 20% overall variability (adjusted R²). In conclusion, this residential radon prediction model, based on a large number of measurements, was demonstrated to be
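
    The modelling steps described above, a multivariable log-linear regression on categorical predictors followed by a rank correlation between predicted and measured radon, can be sketched on synthetic data as follows; the predictor levels, effect sizes and noise are invented, not the Swiss database.

        import numpy as np
        import pandas as pd
        from scipy import stats

        rng = np.random.default_rng(0)
        n = 2000

        # Synthetic stand-in for the measurement database: categorical predictors + log radon.
        df = pd.DataFrame({
            "tectonic_unit": rng.choice(["Jura", "Molasse", "Alps"], n),
            "construction": rng.choice(["<1960", "1960-1990", ">1990"], n),
            "floor": rng.choice(["basement", "ground", "upper"], n),
        })
        df["log_radon"] = (4.0 + 0.5 * (df.tectonic_unit == "Jura")
                           - 0.3 * (df.floor == "upper") + rng.normal(0, 0.7, n))

        # Multivariable log-linear model: least squares on dummy-coded predictors.
        X = pd.get_dummies(df[["tectonic_unit", "construction", "floor"]], drop_first=True)
        X.insert(0, "intercept", 1.0)
        beta, *_ = np.linalg.lstsq(X.to_numpy(float), df.log_radon.to_numpy(), rcond=None)
        predicted = X.to_numpy(float) @ beta

        rho, _ = stats.spearmanr(np.exp(predicted), np.exp(df.log_radon))
        print(f"Spearman correlation, predicted vs. measured radon: {rho:.2f}")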

  13. Assessing global vegetation activity using spatio-temporal Bayesian modelling

    Science.gov (United States)

    Mulder, Vera L.; van Eck, Christel M.; Friedlingstein, Pierre; Regnier, Pierre A. G.

    2016-04-01

    This work demonstrates the potential of modelling vegetation activity using a hierarchical Bayesian spatio-temporal model. This approach allows modelling changes in vegetation and climate simultaneously in space and time. Changes in vegetation activity such as phenology are modelled as a dynamic process depending on climate variability in both space and time. Additionally, differences in observed vegetation status can be attributed to other abiotic ecosystem properties, e.g. soil and terrain properties. Although these properties do not change in time, they do change in space and may provide valuable information in addition to the climate dynamics. The spatio-temporal Bayesian models were calibrated at a regional scale because the local trends in space and time can be better captured by the model. The regional subsets were defined according to the SREX segmentation, as defined by the IPCC. Each region is considered to be relatively homogeneous in terms of large-scale climate and biomes, while still capturing small-scale (grid-cell level) variability. Modelling within these regions is hence expected to be less uncertain due to the absence of these large-scale patterns, compared to a global approach. This overall modelling approach allows the comparison of model behavior for the different regions and may provide insights on the main dynamic processes driving the interaction between vegetation and climate within different regions. The data employed in this study encompass the global datasets for soil properties (SoilGrids), terrain properties (Global Relief Model based on SRTM DEM and ETOPO), monthly time series of satellite-derived vegetation indices (GIMMS NDVI3g) and climate variables (Princeton Meteorological Forcing Dataset). The findings proved the potential of a spatio-temporal Bayesian modelling approach for assessing vegetation dynamics at a regional scale. The observed interrelationships of the employed data and the different spatial and temporal trends support

  14. Assessing climate change impact by integrated hydrological modelling

    Science.gov (United States)

    Lajer Hojberg, Anker; Jørgen Henriksen, Hans; Olsen, Martin; der Keur Peter, van; Seaby, Lauren Paige; Troldborg, Lars; Sonnenborg, Torben; Refsgaard, Jens Christian

    2013-04-01

    showed some unexpected results, where climate models predicting the largest increase in net precipitation did not result in the largest increase in groundwater heads. This was found to be the result of different initial conditions (1990 - 2010) for the various climate models. In some areas a combination of a high initial groundwater head and an increase in precipitation towards 2021 - 2050 resulted in a groundwater head rise that reached the drainage or the surface water system. This will increase the exchange from the groundwater to the surface water system, but reduce the rise in groundwater heads. An alternative climate model, with a lower initial head, can thus predict a higher increase in the groundwater head, although the increase in precipitation is lower. This illustrates an extra dimension in the uncertainty assessment, namely the climate models' capability of simulating the current climatic conditions in a way that can reproduce the observed hydrological response. Højberg, AL, Troldborg, L, Stisen, S, et al. (2012) Stakeholder driven update and improvement of a national water resources model - http://www.sciencedirect.com/science/article/pii/S1364815212002423 Seaby, LP, Refsgaard, JC, Sonnenborg, TO, et al. (2012) Assessment of robustness and significance of climate change signals for an ensemble of distribution-based scaled climate projections (submitted) Journal of Hydrology Stisen, S, Højberg, AL, Troldborg, L et al., (2012): On the importance of appropriate rain-gauge catch correction for hydrological modelling at mid to high latitudes - http://www.hydrol-earth-syst-sci.net/16/4157/2012/

  15. Development of good modelling practice for physiologically based pharmacokinetic models for use in risk assessment: The first steps

    Science.gov (United States)

    The increasing use of tissue dosimetry estimated using pharmacokinetic models in chemical risk assessments in multiple countries necessitates the need to develop internationally recognized good modelling practices. These practices would facilitate sharing of models and model eva...

  16. Psychometric model for safety culture assessment in nuclear research facilities

    International Nuclear Information System (INIS)

    Nascimento, C.S. do; Andrade, D.A.; Mesquita, R.N. de

    2017-01-01

    Highlights: • A psychometric model to evaluate ‘safety climate’ at nuclear research facilities. • The model presented evidence of good psychometric qualities. • The model was applied to nuclear research facilities in Brazil. • Some ‘safety culture’ weaknesses were detected in the assessed organization. • A potential tool to develop safety management programs in nuclear facilities. - Abstract: A safe and reliable operation of nuclear power plants depends not only on technical performance, but also on the people and on the organization. Organizational factors have been recognized as the main causal mechanisms of accidents by research organizations throughout the USA, Europe and Japan. Deficiencies related to these factors reveal weaknesses in the organization’s safety culture. A significant number of instruments to assess safety culture, based on psychometric models that evaluate safety climate through questionnaires and that are supported by reliability and validity evidence, have been published in the health and ‘safety at work’ areas. However, there are few safety culture assessment instruments with these characteristics (reliability and validity) available in the nuclear literature. Therefore, this work proposes an instrument to evaluate, with valid and reliable measures, the safety climate of nuclear research facilities. The instrument was developed based on methodological principles applied to research modeling, and its psychometric properties were evaluated by a reliability analysis and by validation of content, face and construct. The instrument was applied to an important nuclear research organization in Brazil. This organization comprises 4 research reactors and many nuclear laboratories. The survey results made possible a demographic characterization and the identification of some possible safety culture weaknesses, pointing out potential areas to be improved in the assessed organization. Good evidence of reliability with Cronbach's alpha

  17. Psychometric model for safety culture assessment in nuclear research facilities

    Energy Technology Data Exchange (ETDEWEB)

    Nascimento, C.S. do, E-mail: claudio.souza@ctmsp.mar.mil.br [Centro Tecnológico da Marinha em São Paulo (CTMSP), Av. Professor Lineu Prestes 2468, 05508-000 São Paulo, SP (Brazil); Andrade, D.A., E-mail: delvonei@ipen.br [Instituto de Pesquisas Energéticas e Nucleares (IPEN/CNEN – SP), Av. Professor Lineu Prestes 2242, 05508-000 São Paulo, SP (Brazil); Mesquita, R.N. de, E-mail: rnavarro@ipen.br [Instituto de Pesquisas Energéticas e Nucleares (IPEN/CNEN – SP), Av. Professor Lineu Prestes 2242, 05508-000 São Paulo, SP (Brazil)

    2017-04-01

    Highlights: • A psychometric model to evaluate ‘safety climate’ at nuclear research facilities. • The model presented evidence of good psychometric qualities. • The model was applied to nuclear research facilities in Brazil. • Some ‘safety culture’ weaknesses were detected in the assessed organization. • A potential tool to develop safety management programs in nuclear facilities. - Abstract: A safe and reliable operation of nuclear power plants depends not only on technical performance, but also on the people and on the organization. Organizational factors have been recognized as the main causal mechanisms of accidents by research organizations throughout the USA, Europe and Japan. Deficiencies related to these factors reveal weaknesses in the organization’s safety culture. A significant number of instruments to assess safety culture, based on psychometric models that evaluate safety climate through questionnaires and that are supported by reliability and validity evidence, have been published in the health and ‘safety at work’ areas. However, there are few safety culture assessment instruments with these characteristics (reliability and validity) available in the nuclear literature. Therefore, this work proposes an instrument to evaluate, with valid and reliable measures, the safety climate of nuclear research facilities. The instrument was developed based on methodological principles applied to research modeling, and its psychometric properties were evaluated by a reliability analysis and by validation of content, face and construct. The instrument was applied to an important nuclear research organization in Brazil. This organization comprises 4 research reactors and many nuclear laboratories. The survey results made possible a demographic characterization and the identification of some possible safety culture weaknesses, pointing out potential areas to be improved in the assessed organization. Good evidence of reliability with Cronbach's alpha

  18. Potential of 3D City Models to assess flood vulnerability

    Science.gov (United States)

    Schröter, Kai; Bochow, Mathias; Schüttig, Martin; Nagel, Claus; Ross, Lutz; Kreibich, Heidi

    2016-04-01

    Vulnerability, as the product of exposure and susceptibility, is a key factor of the flood risk equation. Furthermore, the estimation of flood loss is very sensitive to the choice of the vulnerability model. Still, in contrast to elaborate hazard simulations, vulnerability is often considered in a simplified manner concerning the spatial resolution and geo-location of exposed objects as well as the susceptibility of these objects at risk. Usually, area specific potential flood loss is quantified on the level of aggregated land-use classes, and both hazard intensity and resistance characteristics of affected objects are represented in highly simplified terms. We investigate the potential of 3D City Models and spatial features derived from remote sensing data to improve the differentiation of vulnerability in flood risk assessment. 3D City Models are based on CityGML, an application scheme of the Geography Markup Language (GML), which represents the 3D geometry, 3D topology, semantics and appearance of objects on different levels of detail. As such, 3D City Models offer detailed spatial information which is useful to describe the exposure and to characterize the susceptibility of residential buildings at risk. This information is further consolidated with spatial features of the building stock derived from remote sensing data. Using this database a spatially detailed flood vulnerability model is developed by means of data-mining. Empirical flood damage data are used to derive and to validate flood susceptibility models for individual objects. We present first results from a prototype application in the city of Dresden, Germany. The vulnerability modeling based on 3D City Models and remote sensing data is compared i) to the generally accepted good engineering practice based on area specific loss potential and ii) to a highly detailed representation of flood vulnerability based on a building typology using urban structure types. Comparisons are drawn in terms of

  19. Assessing women's lacrosse head impacts using finite element modelling.

    Science.gov (United States)

    Clark, J Michio; Hoshizaki, T Blaine; Gilchrist, Michael D

    2018-04-01

    Recently, studies have assessed the ability of helmets to reduce peak linear and rotational acceleration for women's lacrosse head impacts. However, such measures have had low correlation with injury. Maximum principal strain is derived from loading curves, which provide better injury prediction than peak linear and rotational acceleration, especially in compliant situations that create low-magnitude accelerations but long impact durations. The purpose of this study was to assess head and helmet impacts in women's lacrosse using finite element modelling. Linear and rotational acceleration loading curves from women's lacrosse impacts to a helmeted and an unhelmeted Hybrid III headform were input into the University College Dublin Brain Trauma Model. The finite element model was used to calculate maximum principal strain in the cerebrum. The results demonstrated that for unhelmeted impacts, falls and ball impacts produce higher maximum principal strain values than stick and shoulder collisions. The strain values for falls and ball impacts were found to be within the range of concussion and traumatic brain injury. The results also showed that men's lacrosse helmets reduced maximum principal strain for follow-through slashing, falls and ball impacts. These findings are novel and demonstrate that for high-risk events, maximum principal strain can be reduced by implementing the use of helmets if the rules of the sport do not effectively manage such situations. Copyright © 2018 Elsevier Ltd. All rights reserved.

  20. A Zebrafish Heart Failure Model for Assessing Therapeutic Agents.

    Science.gov (United States)

    Zhu, Xiao-Yu; Wu, Si-Qi; Guo, Sheng-Ya; Yang, Hua; Xia, Bo; Li, Ping; Li, Chun-Qi

    2018-03-20

    Heart failure is a leading cause of death, and the development of effective and safe therapeutic agents for heart failure has proven challenging. In this study, taking advantage of larval zebrafish, we developed a zebrafish heart failure model for drug screening and efficacy assessment. Zebrafish at 2 dpf (days postfertilization) were treated with verapamil at a concentration of 200 μM for 30 min, which were determined as the optimum conditions for model development. Tested drugs were administered into zebrafish either by direct soaking or by circulation microinjection. After treatment, zebrafish were randomly selected and subjected to either visual observation and image acquisition or video recording under a Zebralab Blood Flow System. The therapeutic effects of drugs on zebrafish heart failure were quantified by calculating the efficiency of heart dilatation, venous congestion, cardiac output, and blood flow dynamics. All 8 human heart failure therapeutic drugs (LCZ696, digoxin, irbesartan, metoprolol, qiliqiangxin capsule, enalapril, shenmai injection, and hydrochlorothiazide) showed significant preventive and therapeutic effects on zebrafish heart failure (p < 0.05). The zebrafish heart failure model developed and validated in this study could be used for in vivo heart failure studies and for rapid screening and efficacy assessment of preventive and therapeutic drugs.

  1. Model assessment using a multi-metric ranking technique

    Science.gov (United States)

    Fitzpatrick, P. J.; Lau, Y.; Alaka, G.; Marks, F.

    2017-12-01

    Validation comparisons of multiple models present challenges when skill levels are similar, especially in regimes dominated by the climatological mean. Assessing skill separation requires advanced validation metrics and the identification of adeptness in extreme events, while maintaining simplicity for management decisions. Flexibility for operations is also an asset. This work postulates a weighted tally and consolidation technique which ranks results by multiple types of metrics. Variables include absolute error, bias, acceptable absolute error percentages, outlier metrics, model efficiency, Pearson correlation, Kendall's Tau, reliability index, multiplicative gross error, and root mean squared differences. Other metrics, such as root mean square difference and rank correlation, were also explored, but removed when the information was discovered to be generally duplicative of other metrics. While equal weights are applied, the weights could be altered depending on preferred metrics. Two examples are shown comparing ocean models' currents and tropical cyclone products, including experimental products. The importance of using magnitude and direction for tropical cyclone track forecasts, instead of distance, along-track, and cross-track errors, is discussed. Tropical cyclone intensity and structure prediction are also assessed. Vector correlations are not included in the ranking process, but were found useful in an independent context, and will be briefly reported.
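
    The weighted tally and consolidation idea described above can be sketched in a few lines: each model is ranked separately on each metric, the per-metric ranks are combined with (here equal) weights, and the smallest tally wins. The metric values, model names and weights below are invented placeholders, not data from the study.

```python
# Hedged sketch of a weighted-tally consolidation over several validation metrics.
# Metric values and weights are illustrative; a smaller tally means a better model.
import pandas as pd

# Example scores for three hypothetical models (rows) on three metrics (columns).
scores = pd.DataFrame(
    {"mean_abs_error": [1.2, 0.9, 1.0],      # lower is better
     "bias": [0.3, -0.5, 0.1],               # closer to zero is better
     "pearson_r": [0.82, 0.88, 0.85]},       # higher is better
    index=["model_A", "model_B", "model_C"])

# Convert each metric to a rank in which 1 is best.
ranks = pd.DataFrame(index=scores.index)
ranks["mean_abs_error"] = scores["mean_abs_error"].rank()
ranks["bias"] = scores["bias"].abs().rank()
ranks["pearson_r"] = scores["pearson_r"].rank(ascending=False)

weights = {"mean_abs_error": 1.0, "bias": 1.0, "pearson_r": 1.0}  # equal weights, as in the abstract
tally = sum(w * ranks[m] for m, w in weights.items())
print(tally.sort_values())   # consolidated ranking, smallest tally first
```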

  2. An ethical assessment model for digital disease detection technologies.

    Science.gov (United States)

    Denecke, Kerstin

    2017-09-20

    Digital epidemiology, also referred to as digital disease detection (DDD), has successfully provided methods and strategies for using information technology to support infectious disease monitoring and surveillance or to understand attitudes and concerns about infectious diseases. However, Internet-based research and social media usage in epidemiology and healthcare pose new technical, functional and formal challenges. The focus of this paper is on the ethical issues to be considered when integrating digital epidemiology with existing practices. Taking existing ethical guidelines and the results from the EU project M-Eco and SORMAS as a starting point, we develop an ethical assessment model aimed at providing support in identifying relevant ethical concerns in future DDD projects. The assessment model has four dimensions: user, application area, data source and methodology. The model supports becoming aware of, identifying and describing the ethical dimensions of a DDD technology or use case, and helps identify the ethical issues of the technology's use from different perspectives. It can be applied in an interdisciplinary meeting to collect different viewpoints on a DDD system even before implementation starts, and aims at triggering discussions and finding solutions for risks that might not be acceptable even in the development phase. From the answers, ethical issues concerning confidence, privacy, data and patient security or justice may be judged and weighted.

  3. Accuracy of virtual models in the assessment of maxillary defects

    International Nuclear Information System (INIS)

    Kamburoglu, Kivanc; Kursun, Sebnem; Kilic, Cenk; Eozen, Tuncer

    2015-01-01

    This study aimed to assess the reliability of measurements performed on three-dimensional (3D) virtual models of maxillary defects obtained using cone-beam computed tomography (CBCT) and 3D optical scanning. Mechanical cavities simulating maxillary defects were prepared on the hard palate of nine cadavers. Images were obtained using a CBCT unit at three different fields-of-views (FOVs) and voxel sizes: 1) 60 X 60 mm FOV, 0.125 mm³ (FOV60); 2) 80 X 80 mm FOV, 0.160 mm³ (FOV80); and 3) 100 X 100 mm FOV, 0.250 mm³ (FOV100). Superimposition of the images was performed using software called VRMesh Design. Automated volume measurements were conducted, and differences between surfaces were demonstrated. Silicon impressions obtained from the defects were also scanned with a 3D optical scanner. Virtual models obtained using VRMesh Design were compared with impressions obtained by scanning silicon models. Gold standard volumes of the impression models were then compared with CBCT and 3D scanner measurements. Further, the general linear model was used, and the significance was set to p=0.05. A comparison of the results obtained by the observers and methods revealed the p values to be smaller than 0.05, suggesting that the measurement variations were caused by both methods and observers along with the different cadaver specimens used. Further, the 3D scanner measurements were closer to the gold standard measurements when compared to the CBCT measurements. In the assessment of artificially created maxillary defects, the 3D scanner measurements were more accurate than the CBCT measurements.

  4. Accuracy of virtual models in the assessment of maxillary defects

    Energy Technology Data Exchange (ETDEWEB)

    Kamburoglu, Kivanc [Dept. of Dentomaxillofacial Radiology, Faculty of Dentistry, Ankara University, Ankara (Turkey); Kursun, Sebnem [Division of Dentomaxillofacial Radiology, Ministry of Health, Oral and Dental Health Center, Bolu (Turkey); Kilic, Cenk; Eozen, Tuncer [Gealhane Military Medical Academy, Ankara (Turkey)]

    2015-03-15

    This study aimed to assess the reliability of measurements performed on three-dimensional (3D) virtual models of maxillary defects obtained using cone-beam computed tomography (CBCT) and 3D optical scanning. Mechanical cavities simulating maxillary defects were prepared on the hard palate of nine cadavers. Images were obtained using a CBCT unit at three different fields-of-views (FOVs) and voxel sizes: 1) 60 X 60 mm FOV, 0.125 mm³ (FOV60); 2) 80 X 80 mm FOV, 0.160 mm³ (FOV80); and 3) 100 X 100 mm FOV, 0.250 mm³ (FOV100). Superimposition of the images was performed using software called VRMesh Design. Automated volume measurements were conducted, and differences between surfaces were demonstrated. Silicon impressions obtained from the defects were also scanned with a 3D optical scanner. Virtual models obtained using VRMesh Design were compared with impressions obtained by scanning silicon models. Gold standard volumes of the impression models were then compared with CBCT and 3D scanner measurements. Further, the general linear model was used, and the significance was set to p=0.05. A comparison of the results obtained by the observers and methods revealed the p values to be smaller than 0.05, suggesting that the measurement variations were caused by both methods and observers along with the different cadaver specimens used. Further, the 3D scanner measurements were closer to the gold standard measurements when compared to the CBCT measurements. In the assessment of artificially created maxillary defects, the 3D scanner measurements were more accurate than the CBCT measurements.

  5. Forensic DNA phenotyping: Developing a model privacy impact assessment.

    Science.gov (United States)

    Scudder, Nathan; McNevin, Dennis; Kelty, Sally F; Walsh, Simon J; Robertson, James

    2018-05-01

    Forensic scientists around the world are adopting new technology platforms capable of efficiently analysing a larger proportion of the human genome. Undertaking this analysis could provide significant operational benefits, particularly in giving investigators more information about the donor of genetic material, a particularly useful investigative lead. Such information could include predicting externally visible characteristics such as eye and hair colour, as well as biogeographical ancestry. This article looks at the adoption of this new technology from a privacy perspective, using this to inform and critique the application of a Privacy Impact Assessment to this emerging technology. Noting the benefits and limitations, the article develops a number of themes that would influence a model Privacy Impact Assessment as a contextual framework for forensic laboratories and law enforcement agencies considering implementing forensic DNA phenotyping for operational use. Copyright © 2018 Elsevier B.V. All rights reserved.

  6. Methods for Developing Emissions Scenarios for Integrated Assessment Models

    Energy Technology Data Exchange (ETDEWEB)

    Prinn, Ronald [MIT; Webster, Mort [MIT

    2007-08-20

    The overall objective of this research was to contribute data and methods to support the future development of new emissions scenarios for integrated assessment of climate change. Specifically, this research had two main objectives: 1. Use historical data on economic growth and energy efficiency changes, and develop probability density functions (PDFs) for the appropriate parameters for two or three commonly used integrated assessment models. 2. Using the parameter distributions developed through the first task and previous work, we will develop methods of designing multi-gas emission scenarios that usefully span the joint uncertainty space in a small number of scenarios. Results on the autonomous energy efficiency improvement (AEEI) parameter are summarized, an uncertainty analysis of elasticities of substitution is described, and the probabilistic emissions scenario approach is presented.

  7. Bioprocesses: Modelling needs for process evaluation and sustainability assessment

    DEFF Research Database (Denmark)

    Jiménez-Gonzaléz, Concepcion; Woodley, John

    2010-01-01

    The next generation of process engineers will face a new set of challenges, with the need to devise new bioprocesses, with high selectivity for pharmaceutical manufacture, and for lower value chemicals manufacture based on renewable feedstocks. In this paper the current and predicted future roles of process system engineering and life cycle inventory and assessment in the design, development and improvement of sustainable bioprocesses are explored. The existing process systems engineering software tools will prove essential to assist this work. However, the existing tools will also require further development such that they can also be used to evaluate processes against sustainability metrics, as well as economics as an integral part of assessments. Finally, property models will also be required based on compounds not currently present in existing databases. It is clear that many new opportunities...

  8. Assessing the limitations of the Banister model in monitoring training

    Science.gov (United States)

    Hellard, Philippe; Avalos, Marta; Lacoste, Lucien; Barale, Frédéric; Chatard, Jean-Claude; Millet, Grégoire P.

    2006-01-01

    The aim of this study was to carry out a statistical analysis of the Banister model to verify how useful it is in monitoring the training programmes of elite swimmers. The accuracy, the ill-conditioning and the stability of this model were thus investigated. Training loads of nine elite swimmers, measured over one season, were related to performances with the Banister model. Firstly, to assess accuracy, the 95% bootstrap confidence interval (95% CI) of parameter estimates and modelled performances were calculated. Secondly, to study ill-conditioning, the correlation matrix of parameter estimates was computed. Finally, to analyse stability, iterative computation was performed with the same data but minus one performance, chosen randomly. Performances were significantly related to training loads in all subjects (R² = 0.79 ± 0.13, P < 0.05) and the estimation procedure seemed to be stable. Nevertheless, the 95% CI of the most useful parameters for monitoring training were wide: τa = 38 (17, 59), τf = 19 (6, 32), tn = 19 (7, 35), tg = 43 (25, 61). Furthermore, some parameters were highly correlated, making their interpretation worthless. The study suggested possible ways to deal with these problems and reviewed alternative methods to model the training-performance relationships. PMID:16608765
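
    The record refers to the standard Banister impulse-response formulation, in which modelled performance is a baseline plus a slowly decaying "fitness" term minus a faster decaying "fatigue" term, both driven by past training loads. The sketch below is an illustrative implementation of that textbook form, not the authors' code; the gain parameters k1 and k2 and the baseline p0 are placeholders, while the decay constants reuse the τa = 38 and τf = 19 days quoted above.

```python
# Illustrative implementation of the standard Banister impulse-response model:
# predicted performance on day t is a baseline plus a "fitness" term minus a
# "fatigue" term, each an exponentially weighted sum of past training loads.
# Parameter values p0, k1, k2 are placeholders; tau_a and tau_f echo the abstract.
import numpy as np

def banister(loads, p0=500.0, k1=1.0, k2=2.0, tau_a=38.0, tau_f=19.0):
    """Return modelled performance for each day given daily training loads."""
    n = len(loads)
    perf = np.zeros(n)
    for t in range(n):
        days = np.arange(t)                      # training sessions before day t
        decay_fit = np.exp(-(t - days) / tau_a)  # slow "fitness" response
        decay_fat = np.exp(-(t - days) / tau_f)  # fast "fatigue" response
        perf[t] = p0 + k1 * np.sum(loads[:t] * decay_fit) - k2 * np.sum(loads[:t] * decay_fat)
    return perf

loads = np.random.default_rng(1).uniform(0, 100, size=120)  # synthetic season of daily loads
print(banister(loads)[-5:])
```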

  9. Korean risk assessment model for breast cancer risk prediction.

    Science.gov (United States)

    Park, Boyoung; Ma, Seung Hyun; Shin, Aesun; Chang, Myung-Chul; Choi, Ji-Yeob; Kim, Sungwan; Han, Wonshik; Noh, Dong-Young; Ahn, Sei-Hyun; Kang, Daehee; Yoo, Keun-Young; Park, Sue K

    2013-01-01

    We evaluated the performance of the Gail model for a Korean population and developed a Korean breast cancer risk assessment tool (KoBCRAT) based upon equations developed for the Gail model for predicting breast cancer risk. Using 3,789 sets of cases and controls, risk factors for breast cancer among Koreans were identified. Individual probabilities were projected using Gail's equations and Korean hazard data. We compared the 5-year and lifetime risk produced using the modified Gail model, which applied Korean incidence and mortality data and the parameter estimators from the original Gail model, with those produced using the KoBCRAT. We validated the KoBCRAT based on the expected/observed breast cancer incidence and area under the curve (AUC) using two Korean cohorts: the Korean Multicenter Cancer Cohort (KMCC) and National Cancer Center (NCC) cohort. The major risk factors under the age of 50 were family history, age at menarche, age at first full-term pregnancy, menopausal status, breastfeeding duration, oral contraceptive usage, and exercise, while those at and over the age of 50 were family history, age at menarche, age at menopause, pregnancy experience, body mass index, oral contraceptive usage, and exercise. The modified Gail model produced lower 5-year risk for the cases than for the controls (p = 0.017), while the KoBCRAT produced higher 5-year and lifetime risk for the cases than for the controls (p < 0.05). The KoBCRAT thus appears better suited to Korean women, especially urban women.

  10. A transportable system of models for natural resource damage assessment

    International Nuclear Information System (INIS)

    Reed, M.; French, D.

    1992-01-01

    A system of computer models has been developed for assessment of natural resource economic damages resulting from spills of oil and hazardous materials in marine and fresh water environments. Under USA federal legislation, the results of the model system are presumed correct in damage litigation proceedings. The model can address a wide range of spatial and temporal scales. The equations describing the motion of both pollutants and biota are solved in three dimensions. The model can simulate continuous releases of a contaminant, with representation of complex coastal boundaries, variable bathymetry, multiple shoreline types, and spatially variable ecosystem habitats. A graphic user interface provides easy control of the system in addition to the ability to display elements of the underlying geographical information system data base. The model is implemented on a personal computer and on a UNIX workstation. The structure of the system is such that transport to new geographic regions can be accomplished relatively easily, requiring only the development of the appropriate physical, toxicological, biological, and economic data sets. Applications are currently in progress for USA inland and coastal waters, the Adriatic Sea, the Strait of Sicily, the Gulf of Suez, and the Baltic Sea. 4 refs., 2 figs

  11. Assessment of a Low-Cost Ultrasound Pericardiocentesis Model

    Directory of Open Access Journals (Sweden)

    Marco Campo dell'Orto

    2013-01-01

    Full Text Available Introduction. The use of ultrasound during resuscitation is emphasized in the latest European Resuscitation Council guidelines of 2013 to identify treatable conditions such as pericardial tamponade. The recommended standard treatment of tamponade in various guidelines is pericardiocentesis. As ultrasound guidance lowers the complication rates and increases the patient's safety, pericardiocentesis should be performed under ultrasound guidance. Acute care physicians need to train for emergency pericardiocentesis. Methods. We describe in detail a pericardiocentesis ultrasound model, using materials at a cost of about 60 euros. During training courses of focused echocardiography (n = 67), participants tested the phantom and completed a 16-item questionnaire, assessing the model using a visual analogue scale (VAS). Results. Eleven of fourteen questions were answered with a mean VAS score higher than 60% and thus regarded as showing the strengths of the model. Unrealistic outer appearance and heart shape were rated as weaknesses of the model. A total mean VAS score of all questions of 63% showed that participants gained confidence for further interventions. Conclusions. Our low-cost pericardiocentesis model, which can be easily constructed, may serve as an effective training tool for ultrasound-guided pericardiocentesis for acute and critical care physicians.

  12. Assessment of a Low-Cost Ultrasound Pericardiocentesis Model

    Science.gov (United States)

    Campo dell'Orto, Marco; Hempel, Dorothea; Starzetz, Agnieszka; Seibel, Armin; Hannemann, Ulf; Walcher, Felix; Breitkreutz, Raoul

    2013-01-01

    Introduction. The use of ultrasound during resuscitation is emphasized in the latest European Resuscitation Council guidelines of 2013 to identify treatable conditions such as pericardial tamponade. The recommended standard treatment of tamponade in various guidelines is pericardiocentesis. As ultrasound guidance lowers the complication rates and increases the patient's safety, pericardiocentesis should be performed under ultrasound guidance. Acute care physicians need to train for emergency pericardiocentesis. Methods. We describe in detail a pericardiocentesis ultrasound model, using materials at a cost of about 60 euros. During training courses of focused echocardiography (n = 67), participants tested the phantom and completed a 16-item questionnaire, assessing the model using a visual analogue scale (VAS). Results. Eleven of fourteen questions were answered with a mean VAS score higher than 60% and thus regarded as showing the strengths of the model. Unrealistic outer appearance and heart shape were rated as weaknesses of the model. A total mean VAS score of all questions of 63% showed that participants gained confidence for further interventions. Conclusions. Our low-cost pericardiocentesis model, which can be easily constructed, may serve as an effective training tool for ultrasound-guided pericardiocentesis for acute and critical care physicians. PMID:24288616

  13. Energy-based numerical models for assessment of soil liquefaction

    Directory of Open Access Journals (Sweden)

    Amir Hossein Alavi

    2012-07-01

    Full Text Available This study presents promising variants of genetic programming (GP), namely linear genetic programming (LGP) and multi expression programming (MEP), to evaluate the liquefaction resistance of sandy soils. Generalized LGP- and MEP-based relationships were developed between the strain energy density required to trigger liquefaction (capacity energy) and the factors affecting the liquefaction characteristics of sands. The correlations were established based on well-established and widely dispersed experimental results obtained from the literature. To verify the applicability of the derived models, they were employed to estimate the capacity energy values of parts of the test results that were not included in the analysis. The external validation of the models was verified using statistical criteria recommended by researchers. Sensitivity and parametric analyses were performed for further verification of the correlations. The results indicate that the proposed correlations are effectively capable of capturing the liquefaction resistance of a number of sandy soils. The developed correlations provide significantly better prediction performance than the models found in the literature. Furthermore, the best LGP and MEP models perform better than the optimal traditional GP model. The verification phases confirm the efficiency of the derived correlations for their general application to the assessment of the strain energy at the onset of liquefaction.

  14. An integrated urban drainage system model for assessing renovation scheme.

    Science.gov (United States)

    Dong, X; Zeng, S; Chen, J; Zhao, D

    2012-01-01

    Due to sustained economic growth in China over the last three decades, urbanization has been on a rapidly expanding track. In recent years, regional industrial relocations have also accelerated across the country, from the east coast to the west inland. These changes have led to a large-scale redesign of urban infrastructures, including the drainage system. To help move the reconstructed infrastructure towards better sustainability, a tool is required for assessing the efficiency and environmental performance of different renovation schemes. This paper developed an integrated dynamic modeling tool, which consists of three models describing the sewer, the wastewater treatment plant (WWTP) and the receiving water body, respectively. Three auxiliary modules were also incorporated to conceptualize the model, calibrate the simulations, and analyze the results. The developed integrated modeling tool was applied to a case study in Shenzhen City, which is one of the most dynamic cities and faces considerable challenges from environmental degradation. The renovation scheme proposed to improve the environmental performance of Shenzhen City's urban drainage system was modeled and evaluated. The simulation results provided suggestions for further improvement of the renovation scheme.

  15. Assessing the Validity of the Simplified Potential Energy Clock Model for Modeling Glass-Ceramics

    Energy Technology Data Exchange (ETDEWEB)

    Jamison, Ryan Dale [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Grillet, Anne M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Stavig, Mark E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Strong, Kevin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dai, Steve Xunhu [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-10-01

    Glass-ceramic seals may be the future of hermetic connectors at Sandia National Laboratories. They have been shown capable of surviving higher temperatures and pressures than amorphous glass seals. More advanced finite-element material models are required to enable model-based design and provide evidence that the hermetic connectors can meet design requirements. Glass-ceramics are composite materials with both crystalline and amorphous phases. The latter gives rise to (non-linearly) viscoelastic behavior. Given their complex microstructures, glass-ceramics may be thermorheologically complex, a behavior outside the scope of currently implemented constitutive models at Sandia. However, it was desired to assess if the Simplified Potential Energy Clock (SPEC) model is capable of capturing the material response. Available data for SL 16.8 glass-ceramic was used to calibrate the SPEC model. Model accuracy was assessed by comparing model predictions with shear moduli temperature dependence and high temperature 3-point bend creep data. It is shown that the model can predict the temperature dependence of the shear moduli and 3-point bend creep data. Analysis of the results is presented. Suggestions for future experiments and model development are presented. Though further calibration is likely necessary, SPEC has been shown capable of modeling glass-ceramic behavior in the glass transition region but requires further analysis below the transition region.

  16. Assessment of RANS CFD modelling for pressurised thermal shock analysis

    International Nuclear Information System (INIS)

    Sander M Willemsen; Ed MJ Komen; Sander Willemsen

    2005-01-01

    Full text of publication follows: The most severe Pressurised Thermal Shock (PTS) scenario is a cold water Emergency Core Coolant (ECC) injection into the cold leg during a LOCA. The injected ECC water mixes with the hot fluid present in the cold leg and flows towards the downcomer where further mixing takes place. When the cold mixture comes into contact with the Reactor Pressure Vessel (RPV) wall, it may lead to large temperature gradients and consequently to high stresses in the RPV wall. Knowledge of these thermal loads is important for RPV remnant life assessments. The existing thermal-hydraulic system codes currently applied for this purpose are based on one-dimensional approximations and can, therefore, not predict the complex three-dimensional flows occurring during ECC injection. Computational Fluid Dynamics (CFD) can be applied to predict these phenomena, with the ultimate benefit of improved remnant RPV life assessment. The present paper presents an assessment of various Reynolds Averaged Navier Stokes (RANS) CFD approaches for modeling the complex mixing phenomena occurring during ECC injection. This assessment has been performed by comparing the numerical results obtained using advanced turbulence models available in the CFX 5.6 CFD code in combination with a hybrid meshing strategy with experimental results of the Upper Plenum Test Facility (UPTF). The UPTF was a full-scale 'simulation' of the primary system of the four loop 1300 MWe Siemens/KWU Pressurised Water Reactor at Grafenrheinfeld. The test vessel upper plenum internals, downcomer and primary coolant piping were replicas of the reference plant, while other components, such as core, coolant pump and steam generators were replaced by simulators. From the extensive test programme, a single-phase fluid-fluid mixing experiment in the cold leg and downcomer was selected. Prediction of the mixing and stratification is assessed by comparison with the measured temperature profiles at several locations

  17. Simplified Predictive Models for CO2 Sequestration Performance Assessment

    Science.gov (United States)

    Mishra, Srikanta; RaviGanesh, Priya; Schuetter, Jared; Mooney, Douglas; He, Jincong; Durlofsky, Louis

    2014-05-01

    We present results from an ongoing research project that seeks to develop and validate a portfolio of simplified modeling approaches that will enable rapid feasibility and risk assessment for CO2 sequestration in deep saline formation. The overall research goal is to provide tools for predicting: (a) injection well and formation pressure buildup, and (b) lateral and vertical CO2 plume migration. Simplified modeling approaches that are being developed in this research fall under three categories: (1) Simplified physics-based modeling (SPM), where only the most relevant physical processes are modeled, (2) Statistical-learning based modeling (SLM), where the simulator is replaced with a "response surface", and (3) Reduced-order method based modeling (RMM), where mathematical approximations reduce the computational burden. The system of interest is a single vertical well injecting supercritical CO2 into a 2-D layered reservoir-caprock system with variable layer permeabilities. In the first category (SPM), we use a set of well-designed full-physics compositional simulations to understand key processes and parameters affecting pressure propagation and buoyant plume migration. Based on these simulations, we have developed correlations for dimensionless injectivity as a function of the slope of fractional-flow curve, variance of layer permeability values, and the nature of vertical permeability arrangement. The same variables, along with a modified gravity number, can be used to develop a correlation for the total storage efficiency within the CO2 plume footprint. In the second category (SLM), we develop statistical "proxy models" using the simulation domain described previously with two different approaches: (a) classical Box-Behnken experimental design with a quadratic response surface fit, and (b) maximin Latin Hypercube sampling (LHS) based design with a Kriging metamodel fit using a quadratic trend and Gaussian correlation structure. For roughly the same number of
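
    The statistical-learning (SLM) branch described above pairs a space-filling design with a Kriging-type metamodel. The sketch below illustrates that workflow with a Latin hypercube design and a Gaussian-process regressor; the two input parameters, the toy response function standing in for the reservoir simulator, and the kernel settings are all assumptions for illustration (scikit-learn's GP uses a constant mean rather than the quadratic trend mentioned in the abstract).

```python
# Hedged sketch of the statistical-learning (SLM) approach: a Latin hypercube
# design over two inputs and a Gaussian-process (Kriging-type) metamodel fitted
# to the responses. The "simulator" is a toy analytic stand-in for the full
# compositional simulation.
import numpy as np
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def toy_simulator(x):
    """Placeholder for, e.g., pressure buildup vs. (permeability variance, injection rate)."""
    return np.sin(3 * x[:, 0]) + 0.5 * x[:, 1] ** 2

sampler = qmc.LatinHypercube(d=2, seed=42)
X = sampler.random(n=30)                      # 30 design points in [0, 1]^2
y = toy_simulator(X)

gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(length_scale=0.2),
                              normalize_y=True)
gp.fit(X, y)

X_new = sampler.random(n=5)
pred, std = gp.predict(X_new, return_std=True)
print(np.c_[pred, std])                        # metamodel prediction and its uncertainty
```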

  18. Assessment of the five-factor model of personality.

    Science.gov (United States)

    Widiger, T A; Trull, T J

    1997-04-01

    The five-factor model (FFM) of personality is obtaining construct validation, recognition, and practical consideration across a broad domain of fields, including clinical psychology, industrial-organizational psychology, and health psychology. As a result, an array of instruments has been developed and existing instruments are being modified to assess the FFM. In this article, we present an overview and critique of five such instruments (the Goldberg Big Five Markers, the revised NEO Personality Inventory, the Interpersonal Adjective Scales-Big Five, the Personality Psychopathology-Five, and the Hogan Personality Inventory), focusing in particular on their representation of the lexical FFM and their practical application.

  19. Human Factor Modelling in the Risk Assessment of Port Manoeuvers

    Directory of Open Access Journals (Sweden)

    Teresa Abramowicz-Gerigk

    2015-09-01

    Full Text Available The documentation of human factor influence on scenario development in maritime accidents, compared with expert methods, is commonly used as a basis for setting up safety regulations and instructions. New accidents and near misses show the necessity for further studies to determine the human factor influence on both risk acceptance criteria and the development of risk control options for manoeuvres in restricted waters. The paper presents a model of human error probability proposed for assessing ship masters' and marine pilots' erroneous decisions and their influence on the risk of port manoeuvres.

  20. Modelling and performance assessment of an antenna-control system

    Science.gov (United States)

    Burrows, C. R.

    1982-03-01

    An assessment is made of a surveillance-radar control system designed to provide a sector-search capability and continuous control of antenna speed without unwanted torque-reaction on the supporting mast. These objectives are attained by utilizing regenerative braking, and control is exercised through Perbury CVTs. A detailed analysis of the system is given. The models derived for the Perbury CVTs supplement the qualitative data contained in earlier papers. Some results from a computer simulation are presented. Although the paper is concerned with a particular problem, the analysis of the CVTs, and the concept of using energy transfer to control large inertial loads, are of more general interest.

  1. Predicting Performance on MOOC Assessments using Multi-Regression Models

    OpenAIRE

    Ren, Zhiyun; Rangwala, Huzefa; Johri, Aditya

    2016-01-01

    The past few years have seen the rapid growth of data mining approaches for the analysis of data obtained from Massive Open Online Courses (MOOCs). The objectives of this study are to develop approaches to predict the scores a student may achieve on a given grade-related assessment based on information considered as prior performance or prior activity in the course. We develop a personalized linear multiple regression (PLMR) model to predict the grade for a student, prior to attempt...
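
    A stripped-down version of the prediction task described above can be written as an ordinary regularised linear regression from prior-activity features to an assessment score. The sketch below is only that baseline; the feature names and synthetic data are invented, and the per-student personalisation that distinguishes the PLMR model is not reproduced here.

```python
# Hedged sketch of grade prediction from prior activity with a regularised
# linear regression. Features and data are invented placeholders.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(7)
n = 500
X = np.column_stack([
    rng.poisson(20, n),      # e.g. videos watched before the assessment
    rng.poisson(5, n),       # forum posts
    rng.uniform(0, 100, n),  # average score on earlier quizzes
])
y = 0.2 * X[:, 0] + 0.5 * X[:, 2] + rng.normal(0, 5, n)   # synthetic grades

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = Ridge(alpha=1.0).fit(X_tr, y_tr)
rmse = mean_squared_error(y_te, model.predict(X_te)) ** 0.5
print(f"test RMSE = {rmse:.2f}")
```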

  2. Usage models in reliability assessment of software-based systems

    Energy Technology Data Exchange (ETDEWEB)

    Haapanen, P.; Pulkkinen, U. [VTT Automation, Espoo (Finland); Korhonen, J. [VTT Electronics, Espoo (Finland)

    1997-04-01

    This volume in the OHA-project report series deals with the statistical reliability assessment of software based systems on the basis of dynamic test results and qualitative evidence from the system design process. Other reports to be published later on in the OHA-project report series will handle the diversity requirements in safety critical software-based systems, generation of test data from operational profiles and handling of programmable automation in plant PSA-studies. In this report the issues related to the statistical testing and especially automated test case generation are considered. The goal is to find an efficient method for building usage models for the generation of statistically significant set of test cases and to gather practical experiences from this method by applying it in a case study. The scope of the study also includes the tool support for the method, as the models may grow quite large and complex. (32 refs., 30 figs.).

  3. Assessing ecological sustainability in urban planning - EcoBalance model

    Energy Technology Data Exchange (ETDEWEB)

    Wahlgren, I., Email: irmeli.wahlgren@vtt.fi

    2012-06-15

    Urban planning solutions and decisions have large-scale significance for ecological sustainability (eco-efficiency): the consumption of energy and other natural resources, the production of greenhouse gas and other emissions, and the costs caused by urban form. Climate change brings new and growing challenges for urban planning. The EcoBalance model was developed to assess the sustainability of urban form and has been applied at various planning levels: regional plans, local master plans and detailed plans. The EcoBalance model estimates the total consumption of energy and other natural resources, the production of emissions and wastes, and the costs caused directly and indirectly by urban form on a life cycle basis. The results of the case studies provide information about the ecological impacts of various solutions in urban development. (orig.)

  4. Operation quality assessment model for video conference system

    Science.gov (United States)

    Du, Bangshi; Qi, Feng; Shao, Sujie; Wang, Ying; Li, Weijian

    2018-01-01

    Video conference systems have become an important support platform for smart grid operation and management, and their operation quality is of growing concern to grid enterprises. First, an evaluation indicator system covering network, business and operation-maintenance aspects was established on the basis of the video conference system's operation statistics. Then, an operation quality assessment model combining a genetic algorithm with a regularized BP neural network was proposed, which outputs the operation quality level of the system within a time period and provides company managers with optimization advice. The simulation results show that the proposed evaluation model offers the advantages of fast convergence and high prediction accuracy in contrast with a regularized BP neural network alone, and its generalization ability is superior to the LM-BP neural network and the Bayesian BP neural network.
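
    The regularised-network part of the approach can be illustrated with a small multilayer perceptron whose L2 penalty (alpha) plays the role of regularisation, classifying an operation-quality level from a few indicators. The indicator names and data below are invented, and the genetic-algorithm optimisation described in the abstract is deliberately left out of this sketch.

```python
# Hedged sketch of the regularised neural-network component: a small MLP with an
# L2 penalty (alpha) classifying an operation-quality level from illustrative
# network/business/maintenance indicators. The GA step from the abstract is omitted.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(3)
n = 400
X = np.column_stack([
    rng.uniform(0, 5, n),     # e.g. packet loss (%)
    rng.uniform(0, 300, n),   # latency (ms)
    rng.uniform(0, 1, n),     # maintenance ticket rate
])
quality = np.digitize(X[:, 0] + X[:, 1] / 100, [1.5, 3.5])   # synthetic 3-level quality label

clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(16,), alpha=1e-2,
                                  max_iter=2000, random_state=0))
clf.fit(X, quality)
print(clf.predict(X[:5]))
```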

  5. Considerations on assessment of different time depending models adequacy

    International Nuclear Information System (INIS)

    Constantinescu, C.

    2015-01-01

    The operating period of nuclear power plants can be prolonged if it can be shown that their safety has remained at a high level, and for this it is necessary to estimate how ageing systems, structures and components (SSCs) influence NPP reliability and safety. To emphasize the ageing aspects, the case study presented in this paper assesses different time-dependent models for the rate of occurrence of failures, with the goal of obtaining the best-fitting model. A sensitivity analysis of the impact of burn-in failures was performed to improve the result of the goodness-of-fit test. Based on the analysis results, a conclusion about the existence or absence of an ageing trend could be drawn. A sensitivity analysis regarding the reliability parameters was performed, and the results were used to observe the impact on the time-dependent rate of occurrence of failures. (authors)
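
    The record does not spell out which time-dependent models were fitted, so the sketch below shows only a common preliminary step for this kind of analysis: a Laplace trend test on failure times, which flags whether the rate of occurrence of failures (ROCOF) is increasing, an ageing-like pattern. The failure times used here are synthetic.

```python
# Illustrative sketch (the abstract does not give its models): the Laplace trend
# test is one standard way to check failure times for an ageing trend in the
# rate of occurrence of failures. U >> 0 suggests an increasing ROCOF.
import numpy as np
from scipy.stats import norm

def laplace_trend(failure_times, T):
    """Laplace test statistic and two-sided p-value for a time-truncated window (0, T]."""
    t = np.asarray(failure_times, dtype=float)
    n = len(t)
    u = (t.mean() - T / 2.0) * np.sqrt(12.0 * n) / T
    p_value = 2.0 * (1.0 - norm.cdf(abs(u)))
    return u, p_value

# Synthetic example: failures that cluster late in the window (ageing-like pattern).
times = np.array([120, 410, 650, 800, 870, 930, 960, 990])
print(laplace_trend(times, T=1000.0))
```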

  6. Assessing policies towards sustainable transport in Europe: an integrated model

    International Nuclear Information System (INIS)

    Zachariadis, Theodoros

    2005-01-01

    A transport simulation and forecast model is presented, which is designed for the assessment of policy options aiming to achieve sustainability in transportation. Starting from a simulation of the economic behaviour of consumers and producers within a microeconomic optimisation framework and the resulting calculation of the modal split, the allocation of the vehicle stock into vintages and technological groups is modelled. In a third step, a technology-oriented algorithm, which incorporates the relevant state-of-the-art knowledge in Europe, calculates emissions of air pollutants and greenhouse gases as well as appropriate indicators for traffic congestion, noise and road accidents. The paper outlines the methodology and the basic data sources used in connection with work done so far in Europe, presents the outlook according to a 'reference case' run for the 15 current European Union Member States up to 2030, displays aggregate results from a number of alternative scenarios and outlines elements of future work

  7. Lysimeter data as input to performance assessment models

    International Nuclear Information System (INIS)

    McConnell, J.W. Jr.

    1998-01-01

    The Field Lysimeter Investigations: Low-Level Waste Data Base Development Program is obtaining information on the performance of radioactive waste forms in a disposal environment. Waste forms fabricated using ion-exchange resins from EPICOR-117 prefilters employed in the cleanup of the Three Mile Island (TMI) Nuclear Power Station are being tested to develop a low-level waste data base and to obtain information on survivability of waste forms in a disposal environment. The program includes reviewing radionuclide releases from those waste forms in the first 7 years of sampling and examining the relationship between code input parameters and lysimeter data. Also, lysimeter data are applied to performance assessment source term models, and initial results from use of data in two models are presented

  8. Usage models in reliability assessment of software-based systems

    International Nuclear Information System (INIS)

    Haapanen, P.; Pulkkinen, U.; Korhonen, J.

    1997-04-01

    This volume in the OHA-project report series deals with the statistical reliability assessment of software based systems on the basis of dynamic test results and qualitative evidence from the system design process. Other reports to be published later on in the OHA-project report series will handle the diversity requirements in safety critical software-based systems, generation of test data from operational profiles and handling of programmable automation in plant PSA-studies. In this report the issues related to the statistical testing and especially automated test case generation are considered. The goal is to find an efficient method for building usage models for the generation of statistically significant set of test cases and to gather practical experiences from this method by applying it in a case study. The scope of the study also includes the tool support for the method, as the models may grow quite large and complex. (32 refs., 30 figs.)

  9. Assessment of tropospheric delay mapping function models in Egypt: Using PTD database model

    Science.gov (United States)

    Abdelfatah, M. A.; Mousa, Ashraf E.; El-Fiky, Gamal S.

    2018-06-01

    For space geodetic measurements, estimates of tropospheric delays are highly correlated with site coordinates and receiver clock biases. Thus, it is important to use the most accurate models for the tropospheric delay to reduce errors in the estimates of the other parameters. Both the zenith delay value and the mapping function should be assigned correctly to reduce such errors. Several mapping function models can treat the troposphere slant delay. The recent models have not been evaluated for Egyptian local climate conditions, and an assessment of these models is needed to choose the most suitable one. The goal of this paper is to test the quality of global mapping functions, checking which provides the highest consistency with the precise troposphere delay (PTD) mapping functions. The PTD model is derived from radiosonde data using ray tracing and is considered in this paper as the true value. The PTD mapping functions were compared with three recent total mapping function models and another three separate dry and wet mapping function models. The results of the research indicate that the models agree closely up to a zenith angle of 80°. The Saastamoinen and 1/cos z models lag behind in accuracy, the Niell model is better than the VMF model, and the model of Black and Eisner performs well. The results also indicate that the geometric range error has an insignificant effect on the slant delay and that the azimuthal anti-symmetric fluctuation is about 1%.
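
    Two of the simpler mapping functions named above are easy to state explicitly: the plain 1/cos(z) factor and the Black and Eisner form, commonly quoted as m(E) = 1.001 / sqrt(0.002001 + sin²E). The sketch below simply evaluates both at a few elevation angles for illustration; it is not the comparison code used in the study.

```python
# Sketch comparing two simple mapping functions: the plain 1/cos(z) factor and
# the commonly quoted Black and Eisner form m(E) = 1.001 / sqrt(0.002001 + sin^2(E)).
import numpy as np

def mf_cosec(elev_deg):
    """1/cos(z) mapping factor, with z = 90 deg - elevation."""
    z = np.radians(90.0 - elev_deg)
    return 1.0 / np.cos(z)

def mf_black_eisner(elev_deg):
    """Black & Eisner mapping factor."""
    e = np.radians(elev_deg)
    return 1.001 / np.sqrt(0.002001 + np.sin(e) ** 2)

for elev in (90, 30, 15, 10, 5):
    print(elev, round(mf_cosec(elev), 3), round(mf_black_eisner(elev), 3))
```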

  10. Erosion Assessment Modeling Using the Sateec Gis Model on the Prislop Catchment

    Directory of Open Access Journals (Sweden)

    Damian Gheorghe

    2014-05-01

    Full Text Available The Sediment Assessment Tool for Effective Erosion Control (SATEEC) acts as an extension for ArcView GIS 3, with easy-to-use commands. The erosion assessment is divided into two modules, consisting of the Universal Soil Loss Equation (USLE) for sheet/rill erosion and nLS/USPED modeling for gully head erosion. The SATEEC erosion modules can be successfully implemented for areas where sheet, rill and gully erosion occur, such as the Prislop Catchment. The enhanced SATEEC system does not require experienced GIS users to operate it and is therefore suitable for local authorities and/or students less familiar with erosion modeling.
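
    The sheet/rill module mentioned above rests on the Universal Soil Loss Equation, A = R · K · LS · C · P. The sketch below applies that product cell-by-cell to small illustrative rasters; the factor values are placeholders, not data for the Prislop Catchment.

```python
# Minimal sketch of the USLE relation underlying the sheet/rill module,
# A = R * K * LS * C * P, applied cell-by-cell to illustrative raster values.
import numpy as np

R  = 800.0                                   # rainfall erosivity (placeholder value)
K  = np.array([[0.20, 0.25], [0.30, 0.28]])  # soil erodibility per grid cell
LS = np.array([[1.1, 2.4], [0.8, 3.0]])      # slope length-steepness factor
C  = np.array([[0.05, 0.10], [0.02, 0.20]])  # cover management factor
P  = np.array([[1.0, 0.8], [1.0, 0.6]])      # support practice factor

A = R * K * LS * C * P                       # annual soil loss per cell
print(A)
```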

  11. Assessment of bullet effectiveness based on a human vulnerability model.

    Science.gov (United States)

    Liu, Susu; Xu, C; Wen, Y; Li, G; Zhou, J

    2017-12-25

    Penetrating wounds from explosively propelled fragments and bullets are the most common causes of combat injury. There is a requirement to assess the potential effectiveness of bullets penetrating human tissues in order to optimise preventive measures and wound trauma management. An advanced voxel model based on the Chinese Visible Human data was built. A digital human vulnerability model was established in combination with wound reconstruction and vulnerability assessment rules, in which wound penetration profiles were obtained by recreating the penetration of projectiles into ballistic gelatin. An effectiveness evaluation method of bullet penetration using the Abbreviated Injury Scale (AIS) was developed and solved using the Monte Carlo sampling method. The effectiveness of rifle bullets was demonstrated to increase with increasing velocity in the range of 300-700 m/s. When imparting the same energy, the effectiveness of the 5.56 mm bullet was higher than the 7.62 mm bullet in this model. The superimposition of simulant penetration profiles produced from ballistic gelatin simulant has been used to predict wound tracts in damaged tissues. The authors recognise that determining clinical effectiveness based on the AIS scores alone without verification of outcome by review of clinical hospital records means that this technique should be seen more as a manner of comparing the effectiveness of bullets than an injury prediction model. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  12. Quality assessment in higher education using the SERVQUAL model

    Directory of Open Access Journals (Sweden)

    Sabina Đonlagić

    2015-01-01

    Full Text Available The economy of Bosnia and Herzegovina is striving towards growth and increased employment, and empirical studies worldwide have shown that higher education contributes to the socio-economic development of a country. Universities are important for the generation, preservation and dissemination of knowledge in order to contribute to the socio-economic benefits of a country. Higher education institutions are under pressure to improve the value of their activities, and providing quality higher education services to students should be taken seriously. In this paper we address the emerging demand for quality in higher education. Higher education institutions should assess the quality of their services and establish methods for improving quality. Activities of quality assurance should be integrated into the management process at higher education institutions. This paper addresses the issue of service quality measurement in higher education institutions. The most frequently used model in this context is the SERVQUAL model. This model measures quality from the students' point of view, since students are considered to be one of the most important stakeholders for a higher education institution. The main objective of this research is to provide empirical evidence that the adapted SERVQUAL model can be used in higher education and to identify the service quality gap based on its application at one institution of higher education (the Faculty of Economics) in Bosnia and Herzegovina. Furthermore, the results of the gap analysis using the SERVQUAL methodology provide relevant information on the areas in which improvement is necessary in order to enhance service quality.
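
    The SERVQUAL gap analysis mentioned above reduces, per quality dimension, to the mean perception score minus the mean expectation score; negative gaps mark areas needing improvement. The sketch below computes such gaps for the five standard dimensions using invented Likert-scale responses.

```python
# Sketch of the SERVQUAL gap calculation: for each quality dimension the gap is
# the mean perception score minus the mean expectation score. Scores below are
# invented Likert-scale examples, not data from the study.
import numpy as np

dimensions = ["tangibles", "reliability", "responsiveness", "assurance", "empathy"]
expectations = np.array([[6, 7, 6, 7, 6],     # one row per surveyed student
                         [7, 7, 6, 6, 7],
                         [6, 6, 7, 7, 6]], dtype=float)
perceptions  = np.array([[5, 6, 5, 6, 6],
                         [6, 5, 5, 6, 6],
                         [5, 6, 6, 7, 5]], dtype=float)

gaps = perceptions.mean(axis=0) - expectations.mean(axis=0)
for dim, gap in zip(dimensions, gaps):
    print(f"{dim:15s} gap = {gap:+.2f}")   # negative gap = quality shortfall
```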

  13. SCORING ASSESSMENT AND FORECASTING MODELS BANKRUPTCY RISK OF COMPANIES

    Directory of Open Access Journals (Sweden)

    SUSU Stefanita

    2014-07-01

    Full Text Available Bankruptcy risk has been the subject of many research studies that aim to identify the moment of bankruptcy, the factors that contribute to this state, and the indicators that best express this orientation (bankruptcy). The threats to enterprises require managers to maintain continuous knowledge of the economic and financial situation and of vulnerable areas with development potential. Managers need to identify and properly manage the threats that would prevent them from achieving their targets. The methods known in the literature for assessing and evaluating bankruptcy risk include static, functional, strategic, scoring and non-financial models. This article addresses the internationally known Altman and Conan-Holder models, as well as the Robu-Mironiuc model developed at national level by two professors from prestigious universities in our country. These models are applied to data from the profit and loss account and balance sheet of the Turism Covasna company, on which the bankruptcy risk analysis is performed. The results of the analysis are interpreted while formulating solutions for the economic and financial viability of the entity.
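
    For orientation, the sketch below evaluates the original (1968) Altman Z-score for listed manufacturing firms; the coefficients are the standard published ones, while the financial figures are invented and are not Turism Covasna's data.

```python
# Sketch of the original (1968) Altman Z-score for listed manufacturing firms.
# All monetary figures below are illustrative placeholders.
def altman_z(working_capital, retained_earnings, ebit,
             market_value_equity, sales, total_assets, total_liabilities):
    x1 = working_capital / total_assets
    x2 = retained_earnings / total_assets
    x3 = ebit / total_assets
    x4 = market_value_equity / total_liabilities
    x5 = sales / total_assets
    return 1.2 * x1 + 1.4 * x2 + 3.3 * x3 + 0.6 * x4 + 1.0 * x5

z = altman_z(working_capital=150_000, retained_earnings=220_000, ebit=90_000,
             market_value_equity=400_000, sales=1_100_000,
             total_assets=1_000_000, total_liabilities=500_000)
zone = "safe" if z > 2.99 else "grey" if z >= 1.81 else "distress"
print(f"Z = {z:.2f} -> {zone} zone")
```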

  14. Biosphere model for assessing doses from nuclear waste disposal

    International Nuclear Information System (INIS)

    Zach, R.; Amiro, B.D.; Davis, P.A.; Sheppard, S.C.; Szekeley, J.G.

    1994-01-01

    The biosphere model, BIOTRAC, for predicting long term nuclide concentrations and radiological doses from Canada's nuclear fuel waste disposal concept of a vault deep in plutonic rock of the Canadian Shield is presented. This generic, boreal zone biosphere model is based on scenario analysis and systems variability analysis using Monte Carlo simulation techniques. Conservatism is used to bridge uncertainties, even though this creates a small amount of extra nuclide mass. Environmental change over the very long assessment period is mainly handled through distributed parameter values. The dose receptors are a critical group of humans and four generic non-human target organisms. BIOTRAC includes six integrated submodels and it interfaces smoothly with a geosphere model. This interface includes a bedrock well. The geosphere model defines the discharge zones of deep groundwater where nuclides released from the vault enter the biosphere occupied by the dose receptors. The size of one of these zones is reduced when water is withdrawn from the bedrock well. Sensitivity analysis indicates 129I is by far the most important radionuclide. Results also show bedrock-well water leads to higher doses to man than lake water, but the former doses decrease with the size of the critical group. Under comparable circumstances, doses to the non-human biota are greater than those for man.

  15. Spatially Informed Plant PRA Models for Security Assessment

    International Nuclear Information System (INIS)

    Wheeler, Timothy A.; Thomas, Willard; Thornsbury, Eric

    2006-01-01

    Traditional risk models can be adapted to evaluate plant response for situations where plant systems and structures are intentionally damaged, such as from sabotage or terrorism. This paper describes a process by which traditional risk models can be spatially informed to analyze the effects of compound and widespread harsh environments through the use of 'damage footprints'. A 'damage footprint' is a spatial map of regions of the plant (zones) where equipment could be physically destroyed or disabled as a direct consequence of an intentional act. The use of 'damage footprints' requires that the basic events from the traditional probabilistic risk assessment (PRA) be spatially transformed so that the failure of individual components can be linked to the destruction of or damage to specific spatial zones within the plant. Given the nature of intentional acts, extensive modifications must be made to the risk models to account for the special nature of the 'initiating events' associated with deliberate adversary actions. Intentional acts might produce harsh environments that in turn could subject components and structures to one or more insults, such as structural, fire, flood, and/or vibration and shock damage. Furthermore, the potential for widespread damage from some of these insults requires an approach that addresses the impacts of these potentially severe insults even when they occur in locations distant from the actual physical location of a component or structure modeled in the traditional PRA. (authors)

  16. Intrinsic ethics regarding integrated assessment models for climate management.

    Science.gov (United States)

    Schienke, Erich W; Baum, Seth D; Tuana, Nancy; Davis, Kenneth J; Keller, Klaus

    2011-09-01

    In this essay we develop and argue for the adoption of a more comprehensive model of research ethics than is included within current conceptions of responsible conduct of research (RCR). We argue that our model, which we label the ethical dimensions of scientific research (EDSR), is a more comprehensive approach to encouraging ethically responsible scientific research compared to the currently typically adopted approach in RCR training. This essay focuses on developing a pedagogical approach that enables scientists to better understand and appreciate one important component of this model, what we call intrinsic ethics. Intrinsic ethical issues arise when values and ethical assumptions are embedded within scientific findings and analytical methods. Through a close examination of a case study and its application in teaching, namely, evaluation of climate change integrated assessment models, this paper develops a method and case for including intrinsic ethics within research ethics training to provide scientists with a comprehensive understanding and appreciation of the critical role of values and ethical choices in the production of research outcomes.

  17. The Use of Logistic Model in RUL Assessment

    Science.gov (United States)

    Gumiński, R.; Radkowski, S.

    2017-12-01

    The paper takes on the issue of assessing remaining useful life (RUL). The goal of the paper was to develop a method that would enable the use of diagnostic information in the task of reducing the uncertainty related to technical risk. Prediction of the remaining useful life (RUL) of a system is a very important task for maintenance strategy. In the literature, the RUL of an engineering system is defined as the first future time instant at which thresholds on conditions (safety, operational quality, maintenance cost, etc.) are violated. Knowledge of RUL offers the possibility of planning testing and repair activities. Building models of damage development is important in this task. In the presented work, a logistic function is used to model fatigue crack development. Modelling every phase of damage development together is very difficult, and modelling each phase separately, especially when on-line diagnostic information is included, is more effective. Particular attention was paid to the possibility of forecasting the occurrence of damage due to fatigue while relying on the analysis of the structure of a vibroacoustic signal.
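
    A minimal sketch of the idea (not the authors' implementation): fit a logistic growth curve to hypothetical crack-size observations and read off the time remaining until an assumed critical crack length is reached. All numbers below are illustrative.

```python
# Fit a logistic damage-growth curve and estimate RUL as time to an assumed critical size.
# Inspection history, parameters and threshold are invented for illustration.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, L, k, t0):
    return L / (1.0 + np.exp(-k * (t - t0)))

# Hypothetical inspection history: load cycles (x1000) vs. crack length (mm).
cycles = np.array([0, 10, 20, 30, 40, 50, 60], dtype=float)
crack = np.array([0.5, 0.7, 1.0, 1.6, 2.6, 4.0, 5.5])

params, _ = curve_fit(logistic, cycles, crack, p0=[10.0, 0.08, 70.0], maxfev=10_000)
L, k, t0 = params

critical = 8.0                                        # assumed critical crack length (mm)
if L <= critical:
    print("Fitted asymptote below critical length; threshold never reached in this model.")
else:
    t_critical = t0 - np.log(L / critical - 1.0) / k  # invert the logistic at the threshold
    rul = t_critical - cycles[-1]
    print(f"Estimated RUL: {rul:.1f} thousand cycles beyond the last inspection")
```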

  18. Assessing the Hydrogeomorphic Effects of Environmental Flows using Hydrodynamic Modeling.

    Science.gov (United States)

    Gregory, Angela; Morrison, Ryan R; Stone, Mark

    2018-04-13

    Water managers are increasingly using environmental flows (e-flows) as a tool to improve ecological conditions downstream from impoundments. Recent studies have called for e-flow approaches that explicitly consider impacts on hydrogeomorphic processes when developing management alternatives. Process-based approaches are particularly relevant in river systems that have been highly modified and where water supplies are over allocated. One-dimensional (1D) and two-dimensional (2D) hydrodynamic models can be used to resolve hydrogeomorphic processes at different spatial and temporal scales to support the development, testing, and refinement of e-flow hypotheses. Thus, the objective of this paper is to demonstrate the use of hydrodynamic models as a tool for assisting stakeholders in targeting and assessing environmental flows within a decision-making framework. We present a case study of e-flows on the Rio Chama in northern New Mexico, USA, where 1D and 2D hydrodynamic modeling was used within a collaborative process to implement an e-flow experiment. A specific goal of the e-flow process was to improve spawning habitat for brown trout by flushing fine sediments from gravel features. The results revealed that the 2D hydrodynamic model provided much greater insight with respect to hydrodynamic and sediment transport processes, which led to a reduction in the recommended e-flow discharge. The results suggest that 2D hydrodynamic models can be useful tools for improving process understanding, developing e-flow recommendations, and supporting adaptive management even when limited or no data are available for model calibration and validation.

  19. THE PENA BLANCA NATURAL ANALOGUE PERFORMANCE ASSESSMENT MODEL

    Energy Technology Data Exchange (ETDEWEB)

    G. Saulnier and W. Statham

    2006-04-16

    The Nopal I uranium mine in the Sierra Pena Blanca, Chihuahua, Mexico serves as a natural analogue to the Yucca Mountain repository. The Pena Blanca Natural Analogue Performance Assessment Model simulates the mobilization and transport of radionuclides that are released from the mine and transported to the saturated zone. The Pena Blanca Natural Analogue Performance Assessment Model uses probabilistic simulations of hydrogeologic processes that are analogous to the processes that occur at the Yucca Mountain site. The Nopal I uranium deposit lies in fractured, welded, and altered rhyolitic ash-flow tuffs that overlie carbonate rocks, a setting analogous to the geologic formations at the Yucca Mountain site. The Nopal I mine site has the following analogous characteristics as compared to the Yucca Mountain repository site: (1) Analogous source--UO2 uranium ore deposit = spent nuclear fuel in the repository; (2) Analogous geology--(i.e. fractured, welded, and altered rhyolitic ash-flow tuffs); (3) Analogous climate--Semiarid to arid; (4) Analogous setting--Volcanic tuffs overlie carbonate rocks; and (5) Analogous geochemistry--Oxidizing conditions. Analogous hydrogeology: The ore deposit lies in the unsaturated zone above the water table.

  20. THE PENA BLANCA NATURAL ANALOGUE PERFORMANCE ASSESSMENT MODEL

    International Nuclear Information System (INIS)

    G. Saulnier; W. Statham

    2006-01-01

    The Nopal I uranium mine in the Sierra Pena Blanca, Chihuahua, Mexico serves as a natural analogue to the Yucca Mountain repository. The Pena Blanca Natural Analogue Performance Assessment Model simulates the mobilization and transport of radionuclides that are released from the mine and transported to the saturated zone. The Pena Blanca Natural Analogue Performance Assessment Model uses probabilistic simulations of hydrogeologic processes that are analogous to the processes that occur at the Yucca Mountain site. The Nopal I uranium deposit lies in fractured, welded, and altered rhyolitic ash-flow tuffs that overlie carbonate rocks, a setting analogous to the geologic formations at the Yucca Mountain site. The Nopal I mine site has the following analogous characteristics as compared to the Yucca Mountain repository site: (1) Analogous source--UO2 uranium ore deposit = spent nuclear fuel in the repository; (2) Analogous geology--(i.e. fractured, welded, and altered rhyolitic ash-flow tuffs); (3) Analogous climate--Semiarid to arid; (4) Analogous setting--Volcanic tuffs overlie carbonate rocks; and (5) Analogous geochemistry--Oxidizing conditions. Analogous hydrogeology: The ore deposit lies in the unsaturated zone above the water table.

  1. Analytical Modeling for Underground Risk Assessment in Smart Cities

    Directory of Open Access Journals (Sweden)

    Israr Ullah

    2018-06-01

    Full Text Available In the developed world, underground facilities are increasing day by day, as they are considered an improved utilization of the available space in smart cities. Typical facilities include underground railway lines, electricity lines, parking lots, water supply systems, sewerage networks, etc. Besides their utility, these facilities also pose serious threats to citizens and property. To preempt accidental loss of precious human lives and property, a real-time monitoring system is highly desirable for conducting risk assessment on a continuous basis and reporting any abnormality in a timely manner. In this paper, we present an analytical formulation to model system behavior for risk analysis and assessment based on various risk contributing factors. Based on the proposed analytical model, we have evaluated three approximation techniques for computing the final risk index: (a) a simple linear approximation based on multiple linear regression analysis; (b) a hierarchical fuzzy-logic-based technique in which related risk factors are combined in a tree-like structure; and (c) a hybrid approximation approach which is a combination of (a) and (b). Experimental results show that the simple linear approximation fails to estimate the final risk index accurately, whereas the hierarchical fuzzy-logic-based system provides an efficient method for monitoring and forecasting critical issues in underground facilities and may assist in maintenance efficiency as well. Estimation results based on the hybrid approach also fail to estimate the final risk index accurately. However, the hybrid scheme reveals some interesting and detailed information by performing automatic clustering based on the location risk index.
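
    To make approach (a) concrete, the sketch below fits a multiple linear regression of a synthetic risk index on a few made-up, normalised risk factors; the factor names and the underlying relationship are assumptions for illustration only.

```python
# Minimal sketch of a linear approximation of a final risk index from risk factors.
# Factor values and the "true" index are synthetic, not data from the paper.
import numpy as np

rng = np.random.default_rng(42)
n = 200
# Hypothetical normalised risk factors, e.g. gas level, water leakage, structural strain.
factors = rng.uniform(0.0, 1.0, size=(n, 3))
# Synthetic "true" risk index with a nonlinear term the linear model cannot fully capture.
risk_index = 0.5 * factors[:, 0] + 0.3 * factors[:, 1] + 0.2 * factors[:, 2] ** 2

X = np.column_stack([np.ones(n), factors])           # add intercept column
coef, *_ = np.linalg.lstsq(X, risk_index, rcond=None)
pred = X @ coef
rmse = np.sqrt(np.mean((pred - risk_index) ** 2))
print("coefficients:", np.round(coef, 3), "RMSE:", round(rmse, 4))
# A noticeable residual error is the kind of misfit that motivates the hierarchical
# fuzzy-logic alternative described in the paper.
```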

  2. Pluripotent stem cells: An in vitro model for nanotoxicity assessments.

    Science.gov (United States)

    Handral, Harish K; Tong, Huei Jinn; Islam, Intekhab; Sriram, Gopu; Rosa, Vinicus; Cao, Tong

    2016-10-01

    The advent of technology has led to an established range of engineered nanoparticles that are used in diverse applications, such as cell-cell interactions, cell-material interactions, medical therapies and the targeted modulation of cellular processes. The exponential increase in the utilization of nanomaterials and the growing number of associated criticisms have highlighted the potential risks of nanomaterials to human health and the ecosystem. The existing in vivo and in vitro platforms show limitations, with fluctuations being observed in the results of toxicity assessments. Pluripotent stem cells (PSCs) are a viable source of cells that are capable of developing into specialized cells of the human body. PSCs can be efficiently used to screen new biomaterials/drugs and are potential candidates for studying impairments of biophysical morphology at both the cellular and tissue levels during interactions with nanomaterials and for diagnosing toxicity. Three-dimensional in vitro models obtained using PSC-derived cells would provide a realistic, patient-specific platform for toxicity assessments and drug screening applications. The current review focuses on PSCs as an alternative in vitro platform for assessing the hazardous effects of nanomaterials on health systems and highlights the importance of PSC-derived in vitro platforms. Copyright © 2016 John Wiley & Sons, Ltd.

  3. Frameworks for Assessing the Quality of Modeling and Simulation Capabilities

    Science.gov (United States)

    Rider, W. J.

    2012-12-01

    The importance of assuring quality in modeling and simulation has spawned several frameworks for structuring the examination of quality. The format and content of these frameworks provide an emphasis, completeness and flow to assessment activities. I will examine four frameworks that have been developed and describe how they can be improved and applied to a broader set of high-consequence applications. Perhaps the first of these frameworks was CSAU (code scaling, applicability and uncertainty) [Boyack], used for nuclear reactor safety and endorsed by the United States Nuclear Regulatory Commission (USNRC). This framework was shaped by nuclear safety practice and the practical structure needed after the Three Mile Island accident. It incorporated the dominant experimental program, the dominant analysis approach, and concerns about the quality of modeling. The USNRC gave it the force of law, which made the nuclear industry take it seriously. After the cessation of nuclear weapons testing, the United States began a program of examining the reliability of these weapons without testing. This program utilizes science including theory, modeling, simulation and experimentation to replace underground testing. The emphasis on modeling and simulation necessitated attention to the quality of these simulations. Sandia developed the PCMM (predictive capability maturity model) to structure this attention [Oberkampf]. PCMM divides simulation into six core activities to be examined and graded relative to the needs of the modeling activity. NASA [NASA] has built yet another framework in response to the tragedy of the space shuttle accidents. Finally, Ben-Haim and Hemez focus upon modeling robustness and predictive fidelity in another approach. These frameworks are similar, and applied in a similar fashion. The adoption of these frameworks at Sandia and NASA has been slow and arduous because the force of law has not assisted acceptance. All existing frameworks are

  4. EXPERT MODEL OF LAND SUITABILITY ASSESSMENT FOR CROPS

    Directory of Open Access Journals (Sweden)

    Boris Đurđević

    2010-12-01

    Full Text Available A total of 17404 soil samples (2003-2009) were analysed in eastern Croatia. The largest number of soil samples belongs to the Osijek-Baranya county, which, together with both eastern sugar beet factories (Osijek and Županja), conducts the soil fertility control (~4200 samples/yr). The computer model for suitability assessment for crops, supported by GIS, proved to be fast, efficient and sufficiently reliable in terms of the number of analyzed soil samples. It allows the visualization of the agricultural area and prediction of its production properties for the purposes of analysis, planning and rationalization of agricultural production. With more precise data about the soil (soil, climate) and a reliable Digital Soil Map of Croatia, the model could be acceptable not only for evaluating the suitability for growing different crops but also their need for fertilizer, necessary machinery, repairs (liming), and other measures of organic matter input. The aim of the above is to eliminate or reduce the effects of limiting factors in primary agricultural production. Assessment of the relative benefits of soil by the computer model for crop production and the geostatistical kriging method in the Osijek-Baranya county showed: 1) average soil suitability of 60.06 percent; 2) kriging predicted that 51751 ha (17.16%) are of limited resources (N1) for growing crops, whereas a) 86142 ha (28.57%) of land are of limited suitability (S3), b) 132789 ha (44.04%) are moderately suitable (S2) and c) 30772 ha (10.28%) are of excellent fertility (S1). The large amount of eastern Croatian land data showed that the computer-geostatistical model for determining soil benefits for growing crops was automated, fast and simple to use and suitable for the implementation of GIS, automatically retrieving the necessary benefit indicators from the input bases (land, analytical and climate) as well as data from the digital soil maps, able to: a) visualize the suitability for soil tillage, b) predict the
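
    The kriging step can be illustrated with a closely related Gaussian-process interpolation, sketched below; the coordinates, suitability scores and kernel settings are invented and do not come from the Osijek-Baranya survey.

```python
# Hedged sketch of spatial interpolation of a suitability score: ordinary kriging is closely
# related to Gaussian-process regression, used here with fabricated sample points.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)
coords = rng.uniform(0, 10_000, size=(40, 2))                 # sample locations (m)
suitability = 60 + 10 * np.sin(coords[:, 0] / 3000) + rng.normal(0, 2, 40)  # percent

gp = GaussianProcessRegressor(kernel=RBF(length_scale=2000) + WhiteKernel(1.0),
                              normalize_y=True)
gp.fit(coords, suitability)

grid = np.array([[2500, 2500], [5000, 5000], [7500, 7500]])   # prediction points
pred, std = gp.predict(grid, return_std=True)
for (x, y), p, s in zip(grid, pred, std):
    print(f"({x:.0f}, {y:.0f}) m -> suitability {p:.1f}% +/- {s:.1f}")
```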

  5. Developing a Model for Assessing Public Culture Indicators at Universities

    Directory of Open Access Journals (Sweden)

    Meisam Latifi

    2015-06-01

    Full Text Available The present study aims to develop a model for assessing public culture at universities and to evaluate its indicators at public universities in Mashhad. The research follows an exploratory mixed approach. The research strategies in the qualitative and quantitative sections are thematic network analysis and the descriptive-survey method, respectively. In the qualitative section, document analysis and semi-structured interviews with cultural experts are used as research tools, and targeted sampling is carried out. In the quantitative section, a questionnaire developed from the findings of the qualitative section is used as the research tool. The research population of the quantitative section consists of all the students admitted to public universities in Mashhad between 2009 and 2012. The sample size was calculated according to Cochran's formula, and stratified sampling was used to select the sample. The qualitative section led to the identification of 44 basic themes, referred to as the micro indicators. These themes were clustered into similar groups, and 10 organizer themes were then identified and recognized as macro indicators. In the next phase, the importance factor of each indicator is determined according to the AHP method. The results of the assessment of indicators at public universities of Mashhad show that the overall cultural index declines during the years the student attends the university. Additionally, the highest correlation exists between national identity and revolutionary identity. The only negative correlations are observed between family and two indicators, social capital and cultural consumption. The results of the present study can be used to assess the state of public culture among university students and can also be considered a basis for cultural planning.
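
    As a sketch of the AHP weighting step, the snippet below derives priority weights from a pairwise comparison matrix via the principal eigenvector and checks consistency; the 3x3 matrix is a made-up example, not the study's indicator comparisons.

```python
# Illustrative AHP weighting: priority weights from a pairwise comparison matrix.
# The matrix below is a fabricated example for three indicators.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                    # principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                       # normalised priority weights

lam_max = eigvals.real[k]
n = A.shape[0]
ci = (lam_max - n) / (n - 1)                   # consistency index
cr = ci / 0.58                                 # Saaty's random index for n = 3 is 0.58
print("weights:", np.round(weights, 3), "consistency ratio:", round(cr, 3))
```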

  6. Model of affective assessment of primary school students

    Directory of Open Access Journals (Sweden)

    Amir Syamsudin

    2016-06-01

    Full Text Available This study aims to develop an affective assessment instrument to measure the social competence of elementary school students in the learning process at school. The study used Borg & Gall's development model, modified into five phases: needs analysis, development of a draft product by experts, development of the affective assessment instrument, try-out of the instrument by primary education teachers in Yogyakarta, and dissemination and implementation of the developed instrument. The subjects were elementary school students whose schools implemented Curriculum 2013 in the academic year 2013/2014. The validity and reliability of each construct of the affective instrument were established using the PLS-SEM WarpPLS 3.0 analysis program. The study finds the following results. First, the constructs of Honesty, Discipline, Responsibility, Decency, Care, and Self-Confidence in the limited, main, and extended testing are supported by empirical data. Second, the validity of Honesty, Discipline, Responsibility, Decency, Care, and Self-Confidence in the limited, main, and extended testing meets the criterion of above 0.70 for each indicator's loading factor and below 0.50 for each indicator's cross-loading factor. Third, the reliability of Honesty, Discipline, Responsibility, Decency, Care, and Self-Confidence in the limited, main, and extended testing meets the criterion of above 0.70 for both composite reliability and Cronbach's alpha. Fourth, of the 53 indicators drafted before the research, 10 were rejected in the limited testing, four in the main testing, and one in the extended testing.
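
    The 0.70 reliability criterion can be illustrated with a quick Cronbach's alpha computation, sketched below; the item responses are fabricated ratings for a single construct and are not the study's data.

```python
# Cronbach's alpha for one construct (e.g. "Honesty") from fabricated 1-4 item ratings.
import numpy as np

items = np.array([            # rows = students, columns = indicator items
    [4, 3, 4, 4],
    [3, 3, 3, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 3],
    [3, 2, 3, 3],
    [1, 2, 2, 1],
])
k = items.shape[1]
item_var = items.var(axis=0, ddof=1).sum()       # sum of item variances
total_var = items.sum(axis=1).var(ddof=1)        # variance of the total score
alpha = k / (k - 1) * (1 - item_var / total_var)
print(f"Cronbach's alpha = {alpha:.2f} ({'acceptable' if alpha >= 0.70 else 'below threshold'})")
```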

  7. An ensemble model of QSAR tools for regulatory risk assessment.

    Science.gov (United States)

    Pradeep, Prachi; Povinelli, Richard J; White, Shannon; Merrill, Stephen J

    2016-01-01

    Quantitative structure-activity relationships (QSARs) are theoretical models that relate a quantitative measure of chemical structure to a physical property or a biological effect. QSAR predictions can be used in chemical risk assessment for the protection of human and environmental health, which makes them interesting to regulators, especially in the absence of experimental data. For compatibility with regulatory use, QSAR models should be transparent, reproducible and optimized to minimize the number of false negatives. In silico QSAR tools are gaining wide acceptance as a faster alternative to otherwise time-consuming clinical and animal testing methods. However, different QSAR tools often make conflicting predictions for a given chemical and may also vary in their predictive performance across different chemical datasets. In a regulatory context, conflicting predictions raise interpretation, validation and adequacy concerns. To address these concerns, ensemble learning techniques from the machine learning paradigm can be used to integrate predictions from multiple tools. By leveraging various underlying QSAR algorithms and training datasets, the resulting consensus prediction should yield better overall predictive ability. We present a novel ensemble QSAR model using Bayesian classification. The model includes a cut-off parameter that allows selection of the desired trade-off between model sensitivity and specificity. The predictive performance of the ensemble model is compared with four in silico tools (Toxtree, Lazar, OECD Toolbox, and Danish QSAR) in predicting carcinogenicity for a dataset of air toxins (332 chemicals) and a subset of the gold carcinogenic potency database (480 chemicals). Leave-one-out cross-validation results show that the ensemble model achieves the best trade-off between sensitivity and specificity (accuracy: 83.8% and 80.4%, and balanced accuracy: 80.6% and 80.8%) and the highest inter-rater agreement [kappa (κ): 0
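
    A hedged sketch of the ensemble idea follows: combine binary predictions from several QSAR tools via a naive-Bayes-style update, then apply a tunable cut-off on the posterior to trade sensitivity against specificity. The tool names, performance figures and prior are placeholders, not the paper's values.

```python
# Naive-Bayes-style combination of per-tool predictions with an adjustable cut-off.
# Tool names, sensitivities/specificities and the prior are invented placeholders.
def posterior_positive(predictions, performance, prior=0.3):
    """predictions: dict tool -> 0/1; performance: dict tool -> (sensitivity, specificity)."""
    p_pos, p_neg = prior, 1.0 - prior
    for tool, pred in predictions.items():
        sens, spec = performance[tool]
        if pred == 1:
            p_pos *= sens            # P(tool says positive | carcinogen)
            p_neg *= (1.0 - spec)    # P(tool says positive | non-carcinogen)
        else:
            p_pos *= (1.0 - sens)
            p_neg *= spec
    return p_pos / (p_pos + p_neg)

performance = {"tool_a": (0.80, 0.75), "tool_b": (0.70, 0.85), "tool_c": (0.65, 0.90)}
predictions = {"tool_a": 1, "tool_b": 1, "tool_c": 0}

post = posterior_positive(predictions, performance)
for cutoff in (0.3, 0.5, 0.7):   # lowering the cut-off favours sensitivity over specificity
    label = "positive" if post >= cutoff else "negative"
    print(f"cut-off {cutoff:.1f}: classified {label} (posterior {post:.2f})")
```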

  8. The Terrestrial Investigation Model: A probabilistic risk assessment model for birds exposed to pesticides

    Science.gov (United States)

    One of the major recommendations of the National Academy of Science to the USEPA, NMFS and USFWS was to utilize probabilistic methods when assessing the risks of pesticides to federally listed endangered and threatened species. The Terrestrial Investigation Model (TIM, version 3....

  9. The MARINA model (Model to Assess River Inputs of Nutrients to seAs)

    NARCIS (Netherlands)

    Strokal, Maryna; Kroeze, Carolien; Wang, Mengru; Bai, Zhaohai; Ma, Lin

    2016-01-01

    Chinese agriculture has been developing fast towards industrial food production systems that discharge nutrient-rich wastewater into rivers. As a result, nutrient export by rivers has been increasing, resulting in coastal water pollution. We developed a Model to Assess River Inputs of Nutrients

  10. Environmental assessment of amine-based carbon capture: Scenario modelling with life cycle assessment (LCA)

    Energy Technology Data Exchange (ETDEWEB)

    Brekke, Andreas; Askham, Cecilia; Modahl, Ingunn Saur; Vold, Bjoern Ivar; Johnsen, Fredrik Moltu

    2012-07-01

    This report contains a first attempt at introducing the environmental impacts associated with amines and their derivatives into a life cycle assessment (LCA) of gas power production with carbon capture, and at comparing these with the other environmental impacts associated with the production system. The report aims to identify data gaps and methodological challenges connected both to modelling the toxicity of amines and derivatives and to the weighting of environmental impacts. A scenario-based modelling exercise was performed on a theoretical gas power plant with carbon capture, in which emission levels of nitrosamines were varied from zero (gas power without CCS) to a worst-case level (outside the probable range of actual carbon capture facilities). Because of extensive research and development on solvents and emissions from carbon capture facilities in recent years, the data used in the exercise may be outdated, and the results should therefore not be taken at face value. The results from the exercise showed that, according to USEtox, emissions of nitrosamines are less important than emissions of formaldehyde with regard to toxicity related to the operation of (i.e. both inputs to and outputs from) a carbon capture facility. If characterisation factors for emissions of metals are included, these outweigh all other toxic emissions in the study. None of the most recent weighting methods in LCA includes characterisation factors for nitrosamines, and these are therefore not part of the environmental ranking. These results show that the EDecIDe project has an important role to play in developing LCA methodology useful for assessing the environmental performance of amine-based carbon capture in particular and CCS in general. The EDecIDe project will examine the toxicity models used in LCA in more detail, specifically USEtox. The applicability of the LCA compartment models and site-specificity issues for a Norwegian/Arctic situation will be explored. This applies to the environmental compartments

  11. Models for dose assessments. Modules for various biosphere types

    Energy Technology Data Exchange (ETDEWEB)

    Bergstroem, U.; Nordlinder, S.; Aggeryd, I. [Studsvik Eco and Safety AB, Nykoeping (Sweden)]

    1999-12-01

    The main objective of this study was to provide a basis for illustrations of yearly dose rates to the most exposed individual from hypothetical leakages of radionuclides from a deep bedrock repository for spent nuclear fuel and other radioactive waste. The results of this study will be used in the safety assessment SR 97 and in a study on the design and long-term safety for a repository planned to contain long-lived low and intermediate level waste. The repositories will be designed to isolate the radionuclides for several hundred thousands of years. In the SR 97 study, however, hypothetical scenarios for leakage are postulated. Radionuclides are hence assumed to be transported in the geosphere by groundwater, and probably discharge into the biosphere. This may occur in several types of ecosystems. A number of categories of such ecosystems were identified, and turnover of radionuclides was modelled separately for each ecosystem. Previous studies had focused on generic models for wells, lakes and coastal areas. These models were, in this study, developed further to use site-specific data. In addition, flows of groundwater, containing radionuclides, to agricultural land and peat bogs were considered. All these categories are referred to as modules in this report. The forest ecosystems were not included, due to a general lack of knowledge of biospheric processes in connection with discharge of groundwater in forested areas. Examples of each type of module were run with the assumption of a continuous annual release into the biosphere of 1 Bq for each radionuclide during 10 000 years. The results are presented as ecosystem specific dose conversion factors (EDFs) for each nuclide at the year 10 000, assuming stationary ecosystems and prevailing living conditions and habits. All calculations were performed with uncertainty analyses included. Simplifications and assumptions in the modelling of biospheric processes are discussed. The use of modules may be seen as a step

  12. Models for dose assessments. Modules for various biosphere types

    International Nuclear Information System (INIS)

    Bergstroem, U.; Nordlinder, S.; Aggeryd, I.

    1999-12-01

    The main objective of this study was to provide a basis for illustrations of yearly dose rates to the most exposed individual from hypothetical leakages of radionuclides from a deep bedrock repository for spent nuclear fuel and other radioactive waste. The results of this study will be used in the safety assessment SR 97 and in a study on the design and long-term safety for a repository planned to contain long-lived low and intermediate level waste. The repositories will be designed to isolate the radionuclides for several hundred thousands of years. In the SR 97 study, however, hypothetical scenarios for leakage are postulated. Radionuclides are hence assumed to be transported in the geosphere by groundwater, and probably discharge into the biosphere. This may occur in several types of ecosystems. A number of categories of such ecosystems were identified, and turnover of radionuclides was modelled separately for each ecosystem. Previous studies had focused on generic models for wells, lakes and coastal areas. These models were, in this study, developed further to use site-specific data. In addition, flows of groundwater, containing radionuclides, to agricultural land and peat bogs were considered. All these categories are referred to as modules in this report. The forest ecosystems were not included, due to a general lack of knowledge of biospheric processes in connection with discharge of groundwater in forested areas. Examples of each type of module were run with the assumption of a continuous annual release into the biosphere of 1 Bq for each radionuclide during 10 000 years. The results are presented as ecosystem specific dose conversion factors (EDFs) for each nuclide at the year 10 000, assuming stationary ecosystems and prevailing living conditions and habits. All calculations were performed with uncertainty analyses included. Simplifications and assumptions in the modelling of biospheric processes are discussed. The use of modules may be seen as a step

  13. Comparative assessment of condensation models for horizontal tubes

    International Nuclear Information System (INIS)

    Schaffrath, A.; Kruessenberg, A.K.; Lischke, W.; Gocht, U.; Fjodorow, A.

    1999-01-01

    The condensation in horizontal tubes plays an important role, e.g. for the determination of the operation mode of horizontal steam generators of VVER reactors or of passive safety systems for the next generation of nuclear power plants. Two different approaches (HOTKON and KONWAR) for modelling this process have been developed by Forschungszentrum Juelich (FZJ) and the University of Applied Sciences Zittau/Goerlitz (HTWS) and implemented into the 1D thermohydraulic code ATHLET, which is developed by the Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) mbH for the analysis of anticipated and abnormal transients in light water reactors. Although the improvements of the condensation models were developed for different applications (VVER steam generators and the emergency condenser of the SWR1000) with strongly different operating conditions (e.g. the temperature difference over the tube wall in HORUS is up to 30 K and in NOKO up to 250 K; the heat flux density in HORUS is up to 40 kW/m2 and in NOKO up to 1 GW/m2), both models are now compared and assessed by Forschungszentrum Rossendorf FZR e.V. Therefore, post-test calculations of selected HORUS experiments were performed with ATHLET/KONWAR and compared to existing ATHLET and ATHLET/HOTKON calculations of HTWS. It can be seen that the calculations with the extension KONWAR as well as HOTKON significantly improve the agreement between computational and experimental data. (orig.)

  14. A hierarchical network modeling method for railway tunnels safety assessment

    Science.gov (United States)

    Zhou, Jin; Xu, Weixiang; Guo, Xin; Liu, Xumin

    2017-02-01

    Using network theory to model risk-related knowledge on accidents is regarded as potentially very helpful in risk management. A large amount of defect detection data for railway tunnels is collected every autumn in China, and it is extremely important to discover the regularities hidden in this database. In this paper, based on network theories and using data mining techniques, a new method is proposed for mining risk-related regularities to support risk management in railway tunnel projects. A hierarchical network (HN) model which takes into account the tunnel structures, tunnel defects, potential failures and accidents is established. An improved Apriori algorithm is designed to rapidly and effectively mine correlations between tunnel structures and tunnel defects. An algorithm is then presented to mine the risk-related regularities table (RRT) from the frequent patterns. Finally, a safety assessment method is proposed that considers the actual defects and the possible risks of defects obtained from the RRT. This method can not only generate quantitative risk results but also reveal the key defects and the critical risks of defects. This paper further develops accident causation network modeling methods, which can provide guidance for specific maintenance measures.
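
    A simplified association-rule sketch in the spirit of the Apriori step is shown below: it mines frequent (structure, defect) pairs and reports their support and confidence. The inspection records, item names and thresholds are invented placeholders.

```python
# Mine frequent (structure, defect) pairs with support and confidence, Apriori-style.
# The inspection records below are fabricated examples.
from itertools import combinations
from collections import Counter

records = [
    {"lining:plain_concrete", "defect:crack"},
    {"lining:plain_concrete", "defect:leakage"},
    {"lining:plain_concrete", "defect:crack"},
    {"lining:reinforced", "defect:leakage"},
    {"lining:reinforced", "defect:crack"},
    {"lining:plain_concrete", "defect:crack", "defect:leakage"},
]
min_support = 0.3

item_counts = Counter(item for rec in records for item in rec)
pair_counts = Counter(frozenset(p) for rec in records for p in combinations(sorted(rec), 2))

n = len(records)
for pair, count in pair_counts.items():
    support = count / n
    if support < min_support:
        continue
    a, b = sorted(pair)
    confidence = count / item_counts[a]          # confidence of the rule a -> b
    print(f"{a} -> {b}: support={support:.2f}, confidence={confidence:.2f}")
```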

  15. Assessing Mental Models of Emergencies Through Two Knowledge Elicitation Tasks.

    Science.gov (United States)

    Whitmer, Daphne E; Sims, Valerie K; Torres, Michael E

    2017-05-01

    The goals of this study were to assess the risk identification aspect of mental models using standard elicitation methods and to examine how university campus alerts were related to these mental models. People fail to follow protective action recommendations in emergency warnings. Past research has yet to examine the cognitive processes that influence emergency decision-making. Study 1 examined 2 years of emergency alerts distributed by a large southeastern university. In Study 2, participants listed emergencies in a thought-listing task. Study 3 measured participants' time to decide whether a situation was an emergency. The university distributed the most alerts about an armed person, theft, and fire. In Study 2, participants most frequently listed fire, car accident, heart attack, and theft. In Study 3, participants quickly decided that a bomb, murder, fire, tornado, and rape were emergencies. They most slowly decided that a suspicious package and identity theft were emergencies. Recent interaction with warnings was only somewhat related to participants' mental models of emergencies. Risk identification precedes decision-making and applying protective actions. Examining these characteristics of people's mental representations of emergencies is fundamental to further understanding why some emergency warnings go ignored. Someone must believe a situation is serious in order to categorize it as an emergency before taking the protective action recommendations in an emergency warning. Present-day research must continue to examine the problem of people ignoring warning communication, as there are important cognitive factors that had not been explored until the present research.

  16. Model-driven Privacy Assessment in the Smart Grid

    Energy Technology Data Exchange (ETDEWEB)

    Knirsch, Fabian [Salzburg Univ. (Austria)]; Engel, Dominik [Salzburg Univ. (Austria)]; Neureiter, Christian [Salzburg Univ. (Austria)]; Frincu, Marc [Univ. of Southern California, Los Angeles, CA (United States)]; Prasanna, Viktor [Univ. of Southern California, Los Angeles, CA (United States)]

    2015-02-09

    In a smart grid, data and information are transported, transmitted, stored, and processed with various stakeholders having to cooperate effectively. Furthermore, personal data is the key to many smart grid applications and therefore privacy impacts have to be taken into account. For an effective smart grid, well integrated solutions are crucial and for achieving a high degree of customer acceptance, privacy should already be considered at design time of the system. To assist system engineers in early design phase, frameworks for the automated privacy evaluation of use cases are important. For evaluation, use cases for services and software architectures need to be formally captured in a standardized and commonly understood manner. In order to ensure this common understanding for all kinds of stakeholders, reference models have recently been developed. In this paper we present a model-driven approach for the automated assessment of such services and software architectures in the smart grid that builds on the standardized reference models. The focus of qualitative and quantitative evaluation is on privacy. For evaluation, the framework draws on use cases from the University of Southern California microgrid.

  17. Toxicological risk assessment of complex mixtures through the Wtox model

    Directory of Open Access Journals (Sweden)

    William Gerson Matias

    2015-01-01

    Full Text Available Mathematical models are important tools for environmental management and risk assessment. Predictions about the toxicity of chemical mixtures must be enhanced due to the complexity of the effects that can be caused to living species. In this work, the environmental risk was assessed by addressing the need to study the relationship between organisms and xenobiotics. Five toxicological endpoints were therefore applied through the WTox Model, and with this methodology we obtained the risk classification of potentially toxic substances. Acute and chronic toxicity, cytotoxicity and genotoxicity were observed in the organisms Daphnia magna, Vibrio fischeri and Oreochromis niloticus. A case study was conducted with solid wastes from the textile, metal-mechanic, and pulp and paper industries. The results showed that several industrial wastes induced mortality, reproductive effects, micronucleus formation and increases in the rate of lipid peroxidation and DNA methylation in the organisms tested. These results, analyzed together through the WTox Model, allowed the classification of the environmental risk of the industrial wastes. The evaluation showed that the toxicological environmental risk of the samples analyzed can be classified as significant or critical.

  18. Distributional aspects of emissions in climate change integrated assessment models

    International Nuclear Information System (INIS)

    Cantore, Nicola

    2011-01-01

    The recent failure of the Copenhagen negotiations shows that concrete actions are needed to create the conditions for a consensus over global emission reduction policies. A wide coalition of countries in international climate change agreements could be facilitated if rich and poor countries perceive the sharing of abatement at the international level as fair. In this paper I use two popular climate change integrated assessment models to investigate the path of future inequality in the emissions distribution and to decompose its components and sources. Results prove to be consistent with previous empirical studies, are robust to model comparison, and show that gaps in GDP across world regions will still play a crucial role in explaining different countries' contributions to global warming. - Research highlights: → I implement a scenario analysis with two global climate change models. → I analyse inequality in the distribution of emissions. → I decompose emissions inequality components. → I find that GDP per capita is the main Kaya identity source of emissions inequality. → Current rich countries will mostly remain responsible for emissions inequality.

  19. Precast concrete unit assessment through GPR survey and FDTD modelling

    Science.gov (United States)

    Campo, Davide

    2017-04-01

    Precast concrete elements are widely used in United Kingdom house building, offering ease of assembly and added value such as structural integrity and sound and thermal insulation; the most common concrete components include walls, beams, floors, panels, lintels, stairs, etc. Failure to follow the manufacturer's instructions during assembly, however, may induce cracking and short- or long-term loss of bearing capacity. GPR is a well-established non-destructive technique employed in the assessment of structural elements because of its real-time imaging, the speed of data collection and its ability to discriminate fine structural details. In this work, GPR was used to investigate two different precast elements: precast reinforced concrete planks constituting the roof slab of a school, and precast wood-cement blocks with pre-fitted insulation material used to build the perimeter wall of a private building. Visible cracks affected both constructions. For the assessment surveys, a GSSI 2.0 GHz GPR antenna was used because of the high resolution required and the small size of the antenna case (155 by 90 by 105 mm), enabling scanning up to 45 mm from any obstruction. Finite-difference time-domain (FDTD) numerical modelling was also performed to build a scenario of the expected GPR signal response for a preliminary real-time interpretation and to help resolve uncertainties due to complex reflection patterns: simulated radargrams were built using Reflex Software v. 8.2, reproducing the same GPR pulse used in the surveys in terms of wavelet, nominal frequency, sample frequency and time window. Model geometries were derived from the design projects available both for the planks and the blocks; the electromagnetic properties of the materials (concrete, reinforcing bars, air-filled void, insulation and wood-cement) were inferred from both values reported in the literature and a preliminary interpretation of radargrams where internal layer interfaces were clearly recognizable and

  20. Model for assessing alpha doses for a Reference Japanese Man

    International Nuclear Information System (INIS)

    Kawamura, Hisao

    1993-01-01

    In view of the development of the nuclear fuel cycle in this country, it is urgently important to establish dose assessment models and related human and environmental parameters for long-lived radionuclides. In the current program, the intake and body content of actinides (Pu, Th, U) and related alpha-emitting nuclides (Ra and daughters) have been studied, as well as physiological aspects of Reference Japanese Man as the basic model of man for dosimetry. The ultimate objective is to examine the applicability to members of the public of the existing models, particularly those recommended by the ICRP for workers. The result of an interlaboratory intercomparison of 239Pu + 240Pu determination, including our result, was published. Alpha-spectrometric determinations of 226Ra in bone yielded a representative bone concentration level in Tokyo and a Ra-Ca O.R. (bone-diet) which appear consistent with the literature values for Sapporo and Kyoto obtained by Ohno using a Rn emanation method. Specific effective energies for alpha radiation from 226Ra and daughters were calculated using the ICRP dosimetric model for bone, incorporating the masses of source and target organs of Reference Japanese Man. Reference Japanese data, including the adult, adolescent, child and infant of both sexes, were extensively and intensively studied by Tanaka as part of the activities of the ICRP Task Group on Reference Man Revision. Normal data for the physical measurements, the mass and dimensions of internal organs and body surfaces, and some of the body composition were analysed in view of the nutritional data for the Japanese population. Some of the above works are to be continued. (author)

  1. Status of thermalhydraulic modelling and assessment: Open issues

    Energy Technology Data Exchange (ETDEWEB)

    Bestion, D.; Barre, F. [CEA, Grenoble (France)]

    1997-07-01

    This paper presents the status of the physical modelling in present codes used for Nuclear Reactor Thermalhydraulics (TRAC, RELAP 5, CATHARE, ATHLET,...) and attempts to list the unresolved or partially resolved issues. First, the capabilities and limitations of present codes are presented. They are mainly known from a synthesis of the assessment calculations performed for both separate effect tests and integral effect tests. It is also interesting to list all the assumptions and simplifications which were made in the establishment of the system of equations and of the constitutive relations. Many of the present limitations are associated with physical situations where these assumptions are not valid. Then, recommendations are proposed to extend the capabilities of these codes.

  2. Model quality assessment using distance constraints from alignments

    DEFF Research Database (Denmark)

    Paluszewski, Martin; Karplus, Kevin

    2008-01-01

    that model which is closest to the true structure. In this article, we present a new approach for addressing the MQA problem. It is based on distance constraints extracted from alignments to templates of known structure, and is implemented in the Undertaker program for protein structure prediction. One novel...... feature is that we extract noncontact constraints as well as contact constraints. We describe how the distance constraint extraction is done and we show how they can be used to address the MQA problem. We have compared our method on CASP7 targets and the results show that our method is at least comparable...... with the best MQA methods that were assessed at CASP7. We also propose a new evaluation measure, Kendall's tau, that is more interpretable than conventional measures used for evaluating MQA methods (Pearson's r and Spearman's rho). We show clear examples where Kendall's tau agrees much more with our intuition...
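
    As a quick sketch of the evaluation idea, the snippet below rank-correlates predicted model-quality scores with true similarity-to-native scores, reporting Kendall's tau alongside Pearson's r and Spearman's rho; the scores are invented for illustration.

```python
# Compare an MQA method's predicted scores against true model quality with rank correlations.
# The scores below are fabricated examples, not CASP7 data.
from scipy.stats import kendalltau, pearsonr, spearmanr

true_quality = [0.82, 0.75, 0.64, 0.55, 0.41, 0.33]   # e.g. similarity to the native structure
predicted    = [0.78, 0.80, 0.60, 0.50, 0.45, 0.30]   # scores assigned by the MQA method

tau, _ = kendalltau(true_quality, predicted)
r, _ = pearsonr(true_quality, predicted)
rho, _ = spearmanr(true_quality, predicted)
print(f"Kendall tau={tau:.2f}, Pearson r={r:.2f}, Spearman rho={rho:.2f}")
```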

  3. Status of thermalhydraulic modelling and assessment: Open issues

    International Nuclear Information System (INIS)

    Bestion, D.; Barre, F.

    1997-01-01

    This paper presents the status of the physical modelling in present codes used for Nuclear Reactor Thermalhydraulics (TRAC, RELAP 5, CATHARE, ATHLET,...) and attempts to list the unresolved or partially resolved issues. First, the capabilities and limitations of present codes are presented. They are mainly known from a synthesis of the assessment calculations performed for both separate effect tests and integral effect tests. It is also interesting to list all the assumptions and simplifications which were made in the establishment of the system of equations and of the constitutive relations. Many of the present limitations are associated with physical situations where these assumptions are not valid. Then, recommendations are proposed to extend the capabilities of these codes.

  4. Modelling Global Land Use and Social Implications in the Sustainability Assessment of Biofuels

    DEFF Research Database (Denmark)

    Kløverpris, Jesper; Wenzel, Henrik

    2007-01-01

    Cross-fertilising environmental, economic and geographical modelling to improve the environmental assessment of biofuel...

  5. WORKSHOP ON APPLICATION OF STATISTICAL METHODS TO BIOLOGICALLY-BASED PHARMACOKINETIC MODELING FOR RISK ASSESSMENT

    Science.gov (United States)

    Biologically-based pharmacokinetic models are being increasingly used in the risk assessment of environmental chemicals. These models are based on biological, mathematical, statistical and engineering principles. Their potential uses in risk assessment include extrapolation betwe...

  6. Development of a model to assess orthostatic responses

    Science.gov (United States)

    Rubin, Marilyn

    1993-01-01

    A major change for crewmembers during weightlessness in microgravity is the redistribution of body fluids from the legs into the abdomen, thorax, and head. The fluids remain sequestered in these areas throughout the flight. Upon reentry into gravity at landing, these same body fluids are displaced again to their normal locations, however not without hazardous consequences for the crewmembers. The problem remains that upon landing, crewmembers are subject to orthostasis; that is, the blood flowing into the legs reduces the blood supply to the brain and may result in the crewmember fainting. The purpose of this study was to develop a model for testing the orthostatic responses of the cardiovascular system's blood-pressure-regulating mechanisms when challenged to maintain blood pressure to the brain. To accomplish this, subjects' responses were assessed as they proceeded from the supine position through progressive head-up tilt positions of 30, 60, and 90 degrees. A convenience sample consisted of 21 subjects, females (N=11) and males (N=10), selected from a list of potential subjects available through the NASA subject screening office. The methodology included all non-invasive measurements of blood pressure, heart rate, echocardiograms, cardiac output, cardiac stroke volume, fluid shifts in the thorax, ventricular ejection and velocity times, and skin blood perfusion. Fisher statistical analysis was performed on all data with the significance level at .05. Significant differences were demonstrated across the changes of posture for all variables. Based on the significance of the findings of this study, this model for assessing orthostatic responses does provide an adequate challenge to the blood pressure regulatory systems. While individuals may use different adaptations to incremental changes in gravity, the subjects, in aggregate, demonstrated significant adaptive cardiovascular changes to the orthostatic challenges presented to them.

  7. Low-level radioactive waste performance assessments: Source term modeling

    International Nuclear Information System (INIS)

    Icenhour, A.S.; Godbee, H.W.; Miller, L.F.

    1995-01-01

    Low-level radioactive wastes (LLW) generated by government and commercial operations need to be isolated from the environment for at least 300 to 500 yr. Most existing sites for the storage or disposal of LLW employ the shallow-land burial approach. However, the U.S. Department of Energy currently emphasizes the use of engineered systems (e.g., packaging, concrete and metal barriers, and water collection systems). Future commercial LLW disposal sites may include such systems to mitigate radionuclide transport through the biosphere. Performance assessments must be conducted for LLW disposal facilities. These studies include comprehensive evaluations of radionuclide migration from the waste package, through the vadose zone, and within the water table. Atmospheric transport mechanisms are also studied. Figure 1 illustrates the performance assessment process. Estimates of the release of radionuclides from the waste packages (i.e., source terms) are used for the subsequent hydrogeologic calculations required by a performance assessment. Computer models are typically used to describe the complex interactions of water with LLW and to determine the transport of radionuclides. Several commonly used computer programs for evaluating source terms include GWSCREEN, BLT (Breach-Leach-Transport), DUST (Disposal Unit Source Term), and BARRIER (Ref. 5), as well as SOURCE1 and SOURCE2 (which are used in this study). The SOURCE1 and SOURCE2 codes were prepared by Rogers and Associates Engineering Corporation for the Oak Ridge National Laboratory (ORNL). SOURCE1 is designed for tumulus-type facilities, and SOURCE2 is tailored for silo, well-in-silo, and trench-type disposal facilities. This paper focuses on the source term for ORNL disposal facilities, and it describes improved computational methods for determining radionuclide transport from waste packages.

  8. Assessing Mediational Models: Testing and Interval Estimation for Indirect Effects.

    Science.gov (United States)

    Biesanz, Jeremy C; Falk, Carl F; Savalei, Victoria

    2010-08-06

    Theoretical models specifying indirect or mediated effects are common in the social sciences. An indirect effect exists when an independent variable's influence on the dependent variable is mediated through an intervening variable. Classic approaches to assessing such mediational hypotheses (Baron & Kenny, 1986; Sobel, 1982) have in recent years been supplemented by computationally intensive methods such as bootstrapping, the distribution of the product methods, and hierarchical Bayesian Markov chain Monte Carlo (MCMC) methods. These different approaches for assessing mediation are illustrated using data from Dunn, Biesanz, Human, and Finn (2007). However, little is known about how these methods perform relative to each other, particularly in more challenging situations, such as with data that are incomplete and/or nonnormal. This article presents an extensive Monte Carlo simulation evaluating a host of approaches for assessing mediation. We examine Type I error rates, power, and coverage. We study normal and nonnormal data as well as complete and incomplete data. In addition, we adapt a method, recently proposed in statistical literature, that does not rely on confidence intervals (CIs) to test the null hypothesis of no indirect effect. The results suggest that the new inferential method, the partial posterior p value, slightly outperforms existing ones in terms of maintaining Type I error rates while maximizing power, especially with incomplete data. Among confidence interval approaches, the bias-corrected accelerated (BCa) bootstrapping approach often has inflated Type I error rates and inconsistent coverage and is not recommended. In contrast, the bootstrapped percentile confidence interval and the hierarchical Bayesian MCMC method perform best overall, maintaining Type I error rates, exhibiting reasonable power, and producing stable and accurate coverage rates.
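
    A minimal percentile-bootstrap sketch of an indirect effect a*b (X -> M -> Y) follows; the data are simulated and the setup illustrates one of the interval methods compared in the article rather than reproducing its simulation design.

```python
# Percentile bootstrap for the indirect effect a*b in a simple X -> M -> Y mediation model.
# Simulated data; population values a = 0.5, b = 0.4, direct effect c' = 0.1.
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(size=n)              # mediator model
y = 0.4 * m + 0.1 * x + rng.normal(size=n)    # outcome model

def indirect(idx):
    xs, ms, ys = x[idx], m[idx], y[idx]
    a = np.polyfit(xs, ms, 1)[0]                          # slope of M on X
    Xmat = np.column_stack([np.ones_like(ms), ms, xs])
    b = np.linalg.lstsq(Xmat, ys, rcond=None)[0][1]       # slope of Y on M, controlling X
    return a * b

boot = np.array([indirect(rng.integers(0, n, n)) for _ in range(2000)])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect = {indirect(np.arange(n)):.3f}, 95% percentile CI [{lo:.3f}, {hi:.3f}]")
```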

  9. Model analysis: Representing and assessing the dynamics of student learning

    Directory of Open Access Journals (Sweden)

    Lei Bao

    2006-02-01

    Full Text Available Decades of education research have shown that students can simultaneously possess alternate knowledge frameworks and that the development and use of such knowledge are context dependent. As a result of extensive qualitative research, standardized multiple-choice tests such as the Force Concept Inventory and Force-Motion Concept Evaluation tests provide instructors with tools to probe their students' conceptual knowledge of physics. However, many existing quantitative analysis methods often focus on the binary question of whether a student answers a question correctly or not. This greatly limits the capacity of using the standardized multiple-choice tests in assessing students' alternative knowledge. In addition, the context dependence issue, which suggests that a student may apply the correct knowledge in some situations and revert to using alternative types of knowledge in others, is often treated as random noise in current analyses. In this paper, we present model analysis, which applies qualitative research to establish a quantitative representation framework. With this method, students' alternative knowledge and the probabilities for students to use such knowledge in a range of equivalent contexts can be quantitatively assessed. This provides a way to analyze research-based multiple-choice questions, which can generate much richer information than what is available from score-based analysis.

  10. Strategic Assessment Model and Its Application: A Case Study

    Institute of Scientific and Technical Information of China (English)

    ZHU Xiu-wen; CAO Meng-xia; ZHU Ning; ZUO Ming-jian

    2001-01-01

    Accurate and effective assessment of strategic alternatives of an organization directly affects the decision-making and execution of its development strategy. In evaluation of strategic alternatives, relevant elements from both internal and external environments of an organization must be considered. In this paper we use the strategic assessment model (SAM) to evaluate strategic alternatives of an air-conditioning company. Strategic objectives and alternatives of the company are developed through analysis of the competitive environment, key competitors and internal conditions. The environment factors are classified into internal, task, and general opportunities and threats. The analytic hierarchy process, subjective probabilities, the entropy concept, and utility theory are used to enhance the decision-maker's ability in evaluating strategic alternatives. The evaluation results show that the most effective strategic alternative for the company is to reduce types of products, concentrate its effort on producing window-type and cupboard-type air-conditioners, enlarge the production scale, and pre-empt the market. The company has made great progress by implementing this alternative. We conclude that SAM is an appropriate tool for evaluating strategic alternatives.
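
    The abstract names the analytic hierarchy process (AHP) among the techniques used. As a minimal illustration of how AHP priority weights can be derived, the sketch below computes principal-eigenvector weights and a consistency ratio for a hypothetical 3 x 3 pairwise comparison matrix; the matrix values are not taken from the case study.

    ```python
    # Minimal AHP priority-weight calculation from a pairwise comparison matrix.
    # The 3x3 matrix below is hypothetical, not taken from the case study.
    import numpy as np

    A = np.array([
        [1.0, 3.0, 5.0],   # criterion 1 compared with criteria 1, 2, 3
        [1/3, 1.0, 2.0],
        [1/5, 1/2, 1.0],
    ])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)                 # principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                # normalized priority weights

    # Consistency ratio (CR) using Saaty's random index 0.58 for n = 3
    ci = (eigvals[k].real - len(A)) / (len(A) - 1)
    cr = ci / 0.58
    print("priority weights:", np.round(w, 3), " CR =", round(cr, 3))
    ```

    A consistency ratio below about 0.1 is the usual threshold for accepting the pairwise judgments.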

  11. AUTHENTIC SELF-ASSESSMENT MODEL FOR DEVELOPING EMPLOYABILITY SKILLS OF HIGHER VOCATIONAL EDUCATION STUDENTS

    Directory of Open Access Journals (Sweden)

    I Made Suarta

    2015-06-01

    The purpose of this research is to develop assessment tools to evaluate achievement of employability skills which are integrated in the learning of database applications. The assessment model developed is a combination of self-assessment and authentic assessment, proposed as a model of authentic self-assessment. The steps of developing the authentic self-assessment model include: identifying the standards, selecting an authentic task, identifying the criteria for the task, and creating the rubric. The resulting assessment tools include: (1) a problem-solving skills assessment model, (2) a self-management skills assessment model, and (3) a database application competence assessment model. This model can be used to assess cognitive, affective, and psychomotor achievement. The results indicate that achievement of problem-solving and self-management ability was in the good category, and competence in designing conceptual and logical databases was in the high category. This model also meets the basic principles of assessment, i.e.: validity, reliability, focus on competencies, comprehensiveness, objectivity, and the principle of educating. Keywords: authentic assessment, self-assessment, problem solving skills, self-management skills, vocational education

  12. Modelling future impacts of air pollution using the multi-scale UK Integrated Assessment Model (UKIAM).

    Science.gov (United States)

    Oxley, Tim; Dore, Anthony J; ApSimon, Helen; Hall, Jane; Kryza, Maciej

    2013-11-01

    Integrated assessment modelling has evolved to support policy development in relation to air pollutants and greenhouse gases by providing integrated simulation tools able to produce quick and realistic representations of emission scenarios and their environmental impacts without the need to re-run complex atmospheric dispersion models. The UK Integrated Assessment Model (UKIAM) has been developed to investigate strategies for reducing UK emissions by bringing together information on projected UK emissions of SO2, NOx, NH3, PM10 and PM2.5, atmospheric dispersion, criteria for protection of ecosystems, urban air quality and human health, and data on potential abatement measures to reduce emissions, which may subsequently be linked to associated analyses of costs and benefits. We describe the multi-scale model structure ranging from continental to roadside, UK emission sources, atmospheric dispersion of emissions, implementation of abatement measures, integration with European-scale modelling, and environmental impacts. The model generates outputs from a national perspective which are used to evaluate alternative strategies in relation to emissions, deposition patterns, air quality metrics and ecosystem critical load exceedance. We present a selection of scenarios in relation to the 2020 Business-As-Usual projections and identify potential further reductions beyond those currently being planned. © 2013.

  13. A new model of Ishikawa diagram for quality assessment

    Science.gov (United States)

    Liliana, Luca

    2016-11-01

    The paper presents the results of a study concerning the use of the Ishikawa diagram in analyzing the causes that determine errors in the evaluation of parts precision in the machine construction field. The studied problem was "errors in the evaluation of parts precision", and this constitutes the head of the Ishikawa diagram skeleton. All the possible main and secondary causes that could generate the studied problem were identified. The best-known Ishikawa models are 4M, 5M, and 6M, the initials standing for: materials, methods, man, machines, mother nature, and measurement. The paper shows the potential causes of the studied problem, which were first grouped into three categories, as follows: causes that lead to errors in assessing the dimensional accuracy, causes that determine errors in the evaluation of shape and position deviations, and causes of errors in roughness evaluation. We took into account the main components of parts precision in the machine construction field. For each of the three categories, potential secondary causes were distributed among the M groups (man, methods, machines, materials, environment). We opted for a new model of Ishikawa diagram, resulting from the composition of three fish skeletons corresponding to the main categories of parts accuracy.

  14. A Remote Sensing-Derived Corn Yield Assessment Model

    Science.gov (United States)

    Shrestha, Ranjay Man

    be further associated with the actual yield. Utilizing satellite remote sensing products, such as daily NDVI derived from the Moderate Resolution Imaging Spectroradiometer (MODIS) at 250 m pixel size, crop yield estimation can be performed at a very fine spatial resolution. Therefore, this study examined the potential of these daily NDVI products within agricultural studies and crop yield assessments. In this study, a regression-based approach was proposed to estimate annual corn yield through changes in the MODIS daily NDVI time series. The relationship between daily NDVI and corn yield was well defined and established, and as changes in corn phenology and yield were directly reflected by changes in NDVI within the growing season, these two entities were combined to develop a relational model. The model was trained using 15 years (2000-2014) of historical NDVI and county-level corn yield data for four major corn producing states: Kansas, Nebraska, Iowa, and Indiana, representing four climatic regions (South, West North Central, East North Central, and Central, respectively) within the U.S. Corn Belt area. The model's goodness of fit was well defined, with a high coefficient of determination (R2 > 0.81). Using 2015 yield data for validation, an average accuracy of 92% demonstrated the performance of the model in estimating corn yield at the county level. Besides providing county-level corn yield estimates, the derived model was also accurate enough to estimate yield at a finer spatial resolution (field level). The model's assessment accuracy was evaluated using randomly selected field-level corn yields within the study area for 2014, 2015, and 2016. A total of over 120 plot-level corn yield records were used for validation, and the overall average accuracy was 87%, which statistically justified the model's capability to estimate plot-level corn yield. Additionally, the proposed model was applied to impact estimation by examining the changes in corn yield
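
    A minimal sketch of the regression idea described above, assuming a single cumulative-NDVI predictor and a simple linear form; the synthetic data, coefficients, and units are placeholders rather than the published model.

    ```python
    # Sketch of a regression-based yield model: relate a seasonal NDVI metric
    # (here, cumulative growing-season NDVI) to county-level corn yield.
    # Data, variable names, and the simple linear form are illustrative only.
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical training data: cumulative NDVI summed over the growing season
    cum_ndvi = rng.uniform(20, 40, size=300)                     # unitless sum of daily NDVI
    yield_bu_ac = 4.2 * cum_ndvi + 15 + rng.normal(0, 8, 300)    # bushels per acre

    # Fit yield = b0 + b1 * cumNDVI and report the coefficient of determination
    b1, b0 = np.polyfit(cum_ndvi, yield_bu_ac, 1)
    pred = b0 + b1 * cum_ndvi
    ss_res = np.sum((yield_bu_ac - pred) ** 2)
    ss_tot = np.sum((yield_bu_ac - yield_bu_ac.mean()) ** 2)
    print(f"yield = {b0:.1f} + {b1:.2f} * cumNDVI,  R^2 = {1 - ss_res / ss_tot:.2f}")
    ```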

  15. A Model for Assessing the Gender Aspect in Economic Policy

    Directory of Open Access Journals (Sweden)

    Ona Rakauskienė

    2015-06-01

    Full Text Available The purpose of this research is to develop a conceptual model for assessing the impact of the gender aspect on economic policy at the macro- and microeconomic levels. The research methodology is based on analysing scientific approaches to the gender aspect in economics and gender-responsive budgeting as well as determining the impact of the gender aspect on GDP, foreign trade, the state budget and the labour market. The major findings are twofold. First, the main idea of the conceptual model is that a socio-economic picture of society can be considered complete only when, alongside the public and private sectors, it includes the care/reproductive sector, which is dominated by women and creates added value in the form of educated human resources. Second, macroeconomics is not neutral in terms of gender equality. Gender asymmetry is manifested not only at the level of microeconomics (the labour market and business) but also at the level of macroeconomics (GDP, the state budget and foreign trade), which has a negative impact on economic growth and state budget revenues. In this regard, in order to achieve gender equality in economics, economic decisions must be made according to the principles of gender equality, and the gender aspect has to be implemented at the macroeconomic level as well.

  16. Modeling marine surface microplastic transport to assess optimal removal locations

    Science.gov (United States)

    Sherman, Peter; van Sebille, Erik

    2016-01-01

    Marine plastic pollution is an ever-increasing problem that demands immediate mitigation and reduction plans. Here, a model based on satellite-tracked buoy observations and scaled to a large data set of observations on microplastic from surface trawls was used to simulate the transport of plastics floating on the ocean surface from 2015 to 2025, with the goal to assess the optimal marine microplastic removal locations for two scenarios: removing the most surface microplastic and reducing the impact on ecosystems, using plankton growth as a proxy. The simulations show that the optimal removal locations are primarily located off the coast of China and in the Indonesian Archipelago for both scenarios. Our estimates show that 31% of the modeled microplastic mass can be removed by 2025 using 29 plastic collectors operating at a 45% capture efficiency from these locations, compared to only 17% when the 29 plastic collectors are moored in the North Pacific garbage patch, between Hawaii and California. The overlap of ocean surface microplastics and phytoplankton growth can be reduced by 46% at our proposed locations, while sinks in the North Pacific can only reduce the overlap by 14%. These results are an indication that oceanic plastic removal might be more effective in removing a greater microplastic mass and in reducing potential harm to marine life when closer to shore than inside the plastic accumulation zones in the centers of the gyres.

  17. Revenue Risk Modelling and Assessment on BOT Highway Project

    Science.gov (United States)

    Novianti, T.; Setyawan, H. Y.

    2018-01-01

    An infrastructure project delivered through a public-private partnership under a BOT (Build-Operate-Transfer) arrangement, such as a highway, is risky. Assessment of risk factors is therefore essential, as the project has a long concession period and is influenced by macroeconomic conditions over that period. In this study, pre-construction risks of a highway were examined by using a Delphi method to create a space for offline expert discussions; a fault tree analysis to map the intuition of experts and to create a model from the underlying risk events; and fuzzy logic to interpret the linguistic data of the risk models. The losses of revenue associated with tariff risk, traffic volume risk, force majeure, and non-revenue events were then measured. The results showed that the loss of revenue caused by the tariff risk was 10.5% of the normal total revenue. The loss of revenue caused by the risk of traffic volume was 21.0% of total revenue. The loss of revenue caused by force majeure was 12.2% of the normal income. The loss of income caused by the non-revenue events was 6.9% of the normal revenue. It was also found that traffic volume was the major risk of the highway project because it relates to customer preferences.

  18. The modelling and assessment of whale-watching impacts

    Science.gov (United States)

    New, Leslie; Hall, Ailsa J.; Harcourt, Robert; Kaufman, Greg; Parsons, E.C.M.; Pearson, Heidi C.; Cosentino, A. Mel; Schick, Robert S

    2015-01-01

    In recent years there has been significant interest in modelling cumulative effects and the population consequences of individual changes in cetacean behaviour and physiology due to disturbance. One potential source of disturbance that has garnered particular interest is whale-watching. Though perceived as ‘green’ or eco-friendly tourism, there is evidence that whale-watching can result in statistically significant and biologically meaningful changes in cetacean behaviour, raising the question whether whale-watching is in fact a long term sustainable activity. However, an assessment of the impacts of whale-watching on cetaceans requires an understanding of the potential behavioural and physiological effects, data to effectively address the question and suitable modelling techniques. Here, we review the current state of knowledge on the viability of long-term whale-watching, as well as logistical limitations and potential opportunities. We conclude that an integrated, coordinated approach will be needed to further understanding of the possible effects of whale-watching on cetaceans.

  19. Mentalized affectivity: A new model and assessment of emotion regulation.

    Directory of Open Access Journals (Sweden)

    David M Greenberg

    Full Text Available Here we introduce a new assessment of emotion regulation called the Mentalized Affectivity Scale (MAS). A large online adult sample (N = 2,840) completed the 60-item MAS along with a battery of psychological measures. Results revealed a robust three-component structure underlying mentalized affectivity, which we labeled: Identifying emotions (the ability to identify emotions and to reflect on the factors that influence them); Processing emotions (the ability to modulate and distinguish complex emotions); and Expressing emotions (the tendency to express emotions outwardly or inwardly). Hierarchical modeling suggested that Processing emotions delineates from Identifying them, and Expressing emotions delineates from Processing them. We then showed how these components are associated with personality traits, well-being, trauma, and 18 different psychological disorders (including mood, neurological, and personality disorders). Notably, those with anxiety, mood, and personality disorders showed a profile of high Identifying and low Processing compared to controls. Further, results showed how mentalized affectivity scores varied across psychological treatment modalities and years spent in therapy. Taken together, the model of mentalized affectivity advances prior theory and research on emotion regulation, and the MAS is a useful and reliable instrument that can be used in both clinical and non-clinical settings in psychology, psychiatry, and neuroscience.

  20. Modeling marine surface microplastic transport to assess optimal removal locations

    International Nuclear Information System (INIS)

    Sherman, Peter; Van Sebille, Erik

    2016-01-01

    Marine plastic pollution is an ever-increasing problem that demands immediate mitigation and reduction plans. Here, a model based on satellite-tracked buoy observations and scaled to a large data set of observations on microplastic from surface trawls was used to simulate the transport of plastics floating on the ocean surface from 2015 to 2025, with the goal to assess the optimal marine microplastic removal locations for two scenarios: removing the most surface microplastic and reducing the impact on ecosystems, using plankton growth as a proxy. The simulations show that the optimal removal locations are primarily located off the coast of China and in the Indonesian Archipelago for both scenarios. Our estimates show that 31% of the modeled microplastic mass can be removed by 2025 using 29 plastic collectors operating at a 45% capture efficiency from these locations, compared to only 17% when the 29 plastic collectors are moored in the North Pacific garbage patch, between Hawaii and California. The overlap of ocean surface microplastics and phytoplankton growth can be reduced by 46% at our proposed locations, while sinks in the North Pacific can only reduce the overlap by 14%. These results are an indication that oceanic plastic removal might be more effective in removing a greater microplastic mass and in reducing potential harm to marine life when closer to shore than inside the plastic accumulation zones in the centers of the gyres. (letter)

  1. A model for radiological dose assessment in an urban environment

    International Nuclear Information System (INIS)

    Hwang, Won Tae; Kim, Eun Han; Jeong, Hyo Joon; Suh, Kyung Suk; Han, Moon Hee

    2007-01-01

    A model for radiological dose assessment in an urban environment, METRO-K, has been developed. The characteristics of the model are as follows: 1) its mathematical structures are simple (i.e., simplified input parameters) and easy to understand, because the results are obtained by analytical methods using experimental and empirical data; 2) a complex urban environment can easily be composed using only 5 types of basic surfaces; 3) various remediation measures can be applied to different surfaces by evaluating the exposure doses contributed by each contaminated surface. Exposure doses contributed by each contaminated surface at a particular receptor location were evaluated using a data library of kerma values as a function of gamma energy and contamination surface. The kerma data library was prepared for 7 representative types of Korean urban building by extending the data given for 4 representative types of European urban buildings. Initial input data are the daily radionuclide concentration in air, precipitation, and the fraction of each chemical form. Final outputs are the absorbed dose rate in air contributed by the basic surfaces as a function of time following radionuclide deposition, and the exposure dose rate contributed by the various surfaces constituting the urban environment at a particular receptor location. For a contamination scenario in an apartment built-up area, the exposure dose rates showed distinct differences depending on the surrounding environment as well as the receptor location.
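
    A highly simplified sketch of the surface-contribution idea: the dose rate at a receptor is accumulated as the product of each basic surface's deposited activity and a kerma-based conversion factor for that surface. The surface names and all numbers below are hypothetical placeholders, not values from the METRO-K library.

    ```python
    # Simplified illustration of the surface-contribution idea:
    # dose rate at a receptor = sum over contaminated surfaces of
    # (deposited surface activity) x (kerma-based conversion factor).
    # All surface names and numbers below are hypothetical placeholders.

    surface_activity = {        # Bq/m2 deposited on each basic surface type
        "roof": 5.0e4,
        "wall": 1.2e4,
        "road": 8.0e4,
        "lawn": 6.5e4,
        "tree": 2.0e4,
    }

    kerma_factor = {            # (nGy/h) per (Bq/m2) at the receptor location
        "roof": 1.1e-4,
        "wall": 0.6e-4,
        "road": 2.3e-4,
        "lawn": 1.8e-4,
        "tree": 0.4e-4,
    }

    dose_rate = sum(surface_activity[s] * kerma_factor[s] for s in surface_activity)
    print(f"air kerma rate at receptor: {dose_rate:.1f} nGy/h")
    ```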

  2. Hydrodynamic and Ecological Assessment of Nearshore Restoration: A Modeling Study

    International Nuclear Information System (INIS)

    Yang, Zhaoqing; Sobocinski, Kathryn L.; Heatwole, Danelle W.; Khangaonkar, Tarang; Thom, Ronald M.; Fuller, Roger

    2010-01-01

    Along the Pacific Northwest coast, much of the estuarine habitat has been diked over the last century for agricultural land use, residential and commercial development, and transportation corridors. As a result, many of the ecological processes and functions have been disrupted. To protect coastal habitats that are vital to aquatic species, many restoration projects are currently underway to restore the estuarine and coastal ecosystems through dike breaches, setbacks, and removals. Information on physical processes and hydrodynamic conditions is critical for the assessment of the success of restoration actions. Restoration of a 160-acre property at the mouth of the Stillaguamish River in Puget Sound has been proposed. The goal is to restore native tidal habitats and estuary-scale ecological processes by removing the dike. In this study, a three-dimensional hydrodynamic model was developed for the Stillaguamish River estuary to simulate estuarine processes. The model was calibrated to observed tide, current, and salinity data for existing conditions and applied to simulate the hydrodynamic responses to two restoration alternatives. Responses were evaluated at the scale of the restoration footprint. Model data were combined with biophysical data to predict habitat responses at the site. Results showed that the proposed dike removal would result in the desired tidal flushing and conditions that would support four habitat types on the restoration footprint. At the estuary scale, restoration would substantially increase the proportion of area flushed with freshwater (< 5 ppt) at flood tide. Potential implications of predicted changes in salinity and flow dynamics are discussed relative to the distribution of tidal marsh habitat.

  3. Mixture modeling methods for the assessment of normal and abnormal personality, part II: longitudinal models.

    Science.gov (United States)

    Wright, Aidan G C; Hallquist, Michael N

    2014-01-01

    Studying personality and its pathology as it changes, develops, or remains stable over time offers exciting insight into the nature of individual differences. Researchers interested in examining personal characteristics over time have a number of time-honored analytic approaches at their disposal. In recent years there have also been considerable advances in person-oriented analytic approaches, particularly longitudinal mixture models. In this methodological primer we focus on mixture modeling approaches to the study of normative and individual change in the form of growth mixture models and ipsative change in the form of latent transition analysis. We describe the conceptual underpinnings of each of these models, outline approaches for their implementation, and provide accessible examples for researchers studying personality and its assessment.

  4. Towards an Integrated Model for Developing Sustainable Assessment Skills

    Science.gov (United States)

    Fastre, Greet M. J.; van der Klink, Marcel R.; Sluijsmans, Dominique; van Merrienboer, Jeroen J. G.

    2013-01-01

    One of the goals of current education is to ensure that graduates can act as independent lifelong learners. Graduates need to be able to assess their own learning and interpret assessment results. The central question in this article is how to acquire sustainable assessment skills, enabling students to assess their performance and learning…

  5. Dynamic Assessment and Its Implications for RTI Models

    Science.gov (United States)

    Wagner, Richard K.; Compton, Donald L.

    2011-01-01

    Dynamic assessment refers to assessment that combines elements of instruction for the purpose of learning something about an individual that cannot be learned as easily or at all from conventional assessment. The origins of dynamic assessment can be traced to Thorndike (1924), Rey (1934), and Vygotsky (1962), who shared three basic assumptions.…

  6. An Assessment of Mean Areal Precipitation Methods on Simulated Stream Flow: A SWAT Model Performance Assessment

    Directory of Open Access Journals (Sweden)

    Sean Zeiger

    2017-06-01

    Full Text Available Accurate mean areal precipitation (MAP) estimates are essential input forcings for hydrologic models. However, the selection of the most accurate method to estimate MAP can be daunting because there are numerous methods to choose from (e.g., proximate gauge, direct weighted average, surface-fitting, and remotely sensed methods). Multiple methods (n = 19) were used to estimate MAP with precipitation data from 11 distributed monitoring sites and 4 remotely sensed data sets. Each method was validated against the hydrologic model simulated stream flow using the Soil and Water Assessment Tool (SWAT). SWAT was validated using a split-site method and the observed stream flow data from five nested-scale gauging sites in a mixed-land-use watershed of the central USA. Cross-validation results showed the error associated with surface-fitting and remotely sensed methods ranging from −4.5 to −5.1%, and −9.8 to −14.7%, respectively. Split-site validation results showed percent bias (PBIAS) values that ranged from −4.5 to −160%. Second-order polynomial functions especially overestimated precipitation and subsequent stream flow simulations (PBIAS = −160) in the headwaters. The results indicated that using an inverse-distance weighted, linear polynomial interpolation or multiquadric function method to estimate MAP may improve SWAT model simulations. Collectively, the results highlight the importance of spatially distributed observed hydroclimate data for precipitation and subsequent stream flow estimations. The MAP methods demonstrated in the current work can be used to reduce hydrologic model uncertainty caused by watershed physiographic differences.
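
    Inverse-distance weighting is one of the surface-fitting methods evaluated above. A minimal sketch, with hypothetical gauge coordinates and precipitation depths:

    ```python
    # Minimal inverse-distance-weighted (IDW) estimate of precipitation at a
    # point of interest from a set of rain gauges. Coordinates and depths are
    # hypothetical; the power parameter p = 2 is a common default.
    import numpy as np

    gauges = np.array([[0.0, 0.0], [10.0, 2.0], [4.0, 8.0], [7.0, 5.0]])  # x, y in km
    precip = np.array([12.0, 18.0, 15.0, 20.0])                            # mm

    def idw(point, gauges, values, p=2.0):
        d = np.linalg.norm(gauges - point, axis=1)
        if np.any(d == 0):                 # exactly at a gauge: return its value
            return values[np.argmin(d)]
        w = 1.0 / d ** p
        return np.sum(w * values) / np.sum(w)

    print(f"IDW precipitation at (5, 5): {idw(np.array([5.0, 5.0]), gauges, precip):.1f} mm")
    ```

    A MAP value for a subbasin is then obtained by averaging such point estimates over the subbasin area (or over a grid covering it).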

  7. Assessing moderated mediation in linear models requires fewer confounding assumptions than assessing mediation.

    Science.gov (United States)

    Loeys, Tom; Talloen, Wouter; Goubert, Liesbet; Moerkerke, Beatrijs; Vansteelandt, Stijn

    2016-11-01

    It is well known from the mediation analysis literature that the identification of direct and indirect effects relies on strong assumptions of no unmeasured confounding. Even in randomized studies the mediator may still be correlated with unobserved prognostic variables that affect the outcome, in which case the mediator's role in the causal process may not be inferred without bias. In the behavioural and social science literature very little attention has been given so far to the causal assumptions required for moderated mediation analysis. In this paper we focus on the index for moderated mediation, which measures by how much the mediated effect is larger or smaller for varying levels of the moderator. We show that in linear models this index can be estimated without bias in the presence of unmeasured common causes of the moderator, mediator and outcome under certain conditions. Importantly, one can thus use the test for moderated mediation to support evidence for mediation under less stringent confounding conditions. We illustrate our findings with data from a randomized experiment assessing the impact of being primed with social deception upon observer responses to others' pain, and from an observational study of individuals who ended a romantic relationship assessing the effect of attachment anxiety during the relationship on mental distress 2 years after the break-up. © 2016 The British Psychological Society.
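
    In a common linear formulation with the first-stage (a-path) effect moderated by W, the index of moderated mediation reduces to a single product of coefficients; the parametrization below is the standard textbook one and is shown only to make the quantity concrete, not necessarily the exact model used in the paper.

    ```latex
    % Common linear formulation with a first-stage moderator W (illustrative):
    \begin{align}
      M &= i_M + a_1 X + a_2 W + a_3 XW + \varepsilon_M, \\
      Y &= i_Y + c' X + b_1 M + \varepsilon_Y, \\
      \text{conditional indirect effect}(W) &= (a_1 + a_3 W)\, b_1, \\
      \text{index of moderated mediation} &= a_3 b_1 .
    \end{align}
    ```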

  8. Modeling the World Health Organization Disability Assessment Schedule II using non-parametric item response models.

    Science.gov (United States)

    Galindo-Garre, Francisca; Hidalgo, María Dolores; Guilera, Georgina; Pino, Oscar; Rojo, J Emilio; Gómez-Benito, Juana

    2015-03-01

    The World Health Organization Disability Assessment Schedule II (WHO-DAS II) is a multidimensional instrument developed for measuring disability. It comprises six domains (understanding and communicating, getting around, self-care, getting along with others, life activities, and participation in society). The main purpose of this paper is the evaluation of the psychometric properties of each domain of the WHO-DAS II with parametric and non-parametric Item Response Theory (IRT) models. A secondary objective is to assess whether the WHO-DAS II items within each domain form a hierarchy of invariantly ordered severity indicators of disability. A sample of 352 patients with a schizophrenia spectrum disorder is used in this study. The 36-item WHO-DAS II was administered during the consultation. Partial Credit and Mokken scale models are used to study the psychometric properties of the questionnaire. The psychometric properties of the WHO-DAS II scale are satisfactory for all the domains. However, we identify a few items that do not discriminate satisfactorily between different levels of disability and cannot be invariantly ordered in the scale. In conclusion, the WHO-DAS II can be used to assess overall disability in patients with schizophrenia, but some domains are too general to assess functionality in these patients because they contain items that are not applicable to this pathology. Copyright © 2014 John Wiley & Sons, Ltd.
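
    For reference, the parametric model named above (Masters' Partial Credit Model) gives the probability that a respondent at disability level theta scores in category x of item i as:

    ```latex
    % Masters' Partial Credit Model for an item i with categories x = 0, ..., m_i
    % and step parameters \delta_{ik}; the empty sum (x = 0 or h = 0) equals zero.
    \begin{equation}
      P(X_i = x \mid \theta) =
        \frac{\exp\!\left(\sum_{k=1}^{x} (\theta - \delta_{ik})\right)}
             {\sum_{h=0}^{m_i} \exp\!\left(\sum_{k=1}^{h} (\theta - \delta_{ik})\right)},
      \qquad x = 0, 1, \dots, m_i .
    \end{equation}
    ```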

  9. Two agricultural production data libraries for risk assessment models

    International Nuclear Information System (INIS)

    Baes, C.F. III; Shor, R.W.; Sharp, R.D.; Sjoreen, A.L.

    1985-01-01

    Two data libraries based on the 1974 US Census of Agriculture are described. The data packages (AGDATC and AGDATG) are available from the Radiation Shielding Information Center (RSIC), Oak Ridge National Laboratory, Oak Ridge, Tennessee 37831. Agricultural production and land-use information by county (AGDATC) or by 1/2 by 1/2 degree longitude-latitude grid cell (AGDATG) provide geographical resolution of the data. The libraries were designed for use in risk assessment models that simulate the transport of radionuclides from sources of airborne release through food chains to man. However, they are also suitable for use in the assessment of other airborne pollutants that can affect man from a food ingestion pathway such as effluents from synfuels or coal-fired power plants. The principal significance of the data libraries is that they provide default location-specific food-chain transport parameters when site-specific information are unavailable. Plant food categories in the data libraries include leafy vegetables, vegetables and fruits exposed to direct deposition of airborne pollutants, vegetables and fruits protected from direct deposition, and grains. Livestock feeds are also tabulated in four categories: pasture, grain, hay, and silage. Pasture was estimated by a material balance of cattle and sheep inventories, forage feed requirements, and reported harvested forage. Cattle (Bos spp.), sheep (Ovis aries), goat (Capra hircus), hog (Sus scrofa), chicken (Gallus domesticus), and turkey (Meleagris gallopavo) inventories or sales are also tabulated in the data libraries and can be used to provide estimates of meat, eggs, and milk production. Honey production also is given. Population, irrigation, and meteorological information are also listed

  10. THE PENA BLANCA NATURAL ANALOGUE PERFORMANCE ASSESSMENT MODEL

    Energy Technology Data Exchange (ETDEWEB)

    G.J. Saulnier Jr; W. Statham

    2006-03-10

    The Nopal I uranium mine in the Sierra Pena Blanca, Chihuahua, Mexico serves as a natural analogue to the Yucca Mountain repository. The Pena Blanca Natural Analogue Performance Assessment Model simulates the mobilization and transport of radionuclides that are released from the mine and transported to the saturated zone. The Pena Blanca Natural Analogue Model uses probabilistic simulations of hydrogeologic processes that are analogous to the processes that occur at the Yucca Mountain site. The Nopal I uranium deposit lies in fractured, welded, and altered rhyolitic ash flow tuffs that overlie carbonate rocks, a setting analogous to the geologic formations at the Yucca Mountain site. The Nopal I mine site has the following characteristics as compared to the Yucca Mountain repository site. (1) Analogous source: UO2 uranium ore deposit = spent nuclear fuel in the repository; (2) Analogous geologic setting: fractured, welded, and altered rhyolitic ash flow tuffs overlying carbonate rocks; (3) Analogous climate: Semiarid to arid; (4) Analogous geochemistry: Oxidizing conditions; and (5) Analogous hydrogeology: The ore deposit lies in the unsaturated zone above the water table. The Nopal I deposit is approximately 8 ± 0.5 million years old and has been exposed to oxidizing conditions during the last 3.2 to 3.4 million years. The Pena Blanca Natural Analogue Model considers that the uranium oxide and uranium silicates in the ore deposit were originally analogous to uranium-oxide spent nuclear fuel. The Pena Blanca site has been characterized using field and laboratory investigations of its fault and fracture distribution, mineralogy, fracture fillings, seepage into the mine adits, regional hydrology, and mineralization that shows the extent of radionuclide migration. Three boreholes were drilled at the Nopal I mine site in 2003 and these boreholes have provided samples for lithologic characterization, water-level measurements, and water samples for laboratory

  11. THE PENA BLANCA NATURAL ANALOGUE PERFORMANCE ASSESSMENT MODEL

    International Nuclear Information System (INIS)

    G.J. Saulnier Jr; W. Statham

    2006-01-01

    The Nopal I uranium mine in the Sierra Pena Blanca, Chihuahua, Mexico serves as a natural analogue to the Yucca Mountain repository. The Pena Blanca Natural Analogue Performance Assessment Model simulates the mobilization and transport of radionuclides that are released from the mine and transported to the saturated zone. The Pena Blanca Natural Analogue Model uses probabilistic simulations of hydrogeologic processes that are analogous to the processes that occur at the Yucca Mountain site. The Nopal I uranium deposit lies in fractured, welded, and altered rhyolitic ash flow tuffs that overlie carbonate rocks, a setting analogous to the geologic formations at the Yucca Mountain site. The Nopal I mine site has the following characteristics as compared to the Yucca Mountain repository site. (1) Analogous source: UO2 uranium ore deposit = spent nuclear fuel in the repository; (2) Analogous geologic setting: fractured, welded, and altered rhyolitic ash flow tuffs overlying carbonate rocks; (3) Analogous climate: Semiarid to arid; (4) Analogous geochemistry: Oxidizing conditions; and (5) Analogous hydrogeology: The ore deposit lies in the unsaturated zone above the water table. The Nopal I deposit is approximately 8 ± 0.5 million years old and has been exposed to oxidizing conditions during the last 3.2 to 3.4 million years. The Pena Blanca Natural Analogue Model considers that the uranium oxide and uranium silicates in the ore deposit were originally analogous to uranium-oxide spent nuclear fuel. The Pena Blanca site has been characterized using field and laboratory investigations of its fault and fracture distribution, mineralogy, fracture fillings, seepage into the mine adits, regional hydrology, and mineralization that shows the extent of radionuclide migration. Three boreholes were drilled at the Nopal I mine site in 2003 and these boreholes have provided samples for lithologic characterization, water-level measurements, and water samples for laboratory analysis

  12. A generic hydroeconomic model to assess future water scarcity

    Science.gov (United States)

    Neverre, Noémie; Dumas, Patrice

    2015-04-01

    We developed a generic hydroeconomic model able to confront future water supply and demand on a large scale, taking into account man-made reservoirs. The assessment is done at the scale of river basins, using only globally available data; the methodology can thus be generalized. On the supply side, we evaluate the impacts of climate change on water resources. The available quantity of water at each site is computed using the following information: runoff is taken from the outputs of the CNRM climate model (Dubois et al., 2010), reservoirs are located using Aquastat, and the sub-basin flow-accumulation area of each reservoir is determined based on a Digital Elevation Model (HYDRO1k). On the demand side, agricultural and domestic demands are projected in terms of both quantity and economic value. For the agricultural sector, globally available data on irrigated areas and crops are combined in order to determine the localization of irrigated crops. Then, crop irrigation requirements are computed for the different stages of the growing season using the Allen (1998) method with Hargreaves potential evapotranspiration. The economic value of irrigation water is based on a yield comparison approach between rainfed and irrigated crops. Potential irrigated and rainfed yields are taken from LPJmL (Blondeau et al., 2007), or from FAOSTAT by making simple assumptions on yield ratios. For the domestic sector, we project the combined effects of demographic growth, economic development and water cost evolution on future demands. The method consists in building three-block inverse demand functions whose block volume limits evolve with the level of GDP per capita. The value of water along the demand curve is determined from price-elasticity, price and demand data from the literature, using the point-expansion method, and from water cost data. Projected demands are then confronted with future water availability. Operating rules of the reservoirs and water allocation between demands are based on
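
    A toy sketch of the three-block inverse demand idea for the domestic sector: the marginal value of water declines across three consumption blocks whose upper limits grow with GDP per capita. All block limits, prices, and the scaling rule below are hypothetical placeholders, not the calibrated functions of the model.

    ```python
    # Illustrative three-block inverse demand function for domestic water:
    # marginal value declines across three consumption blocks whose upper
    # limits grow with GDP per capita. All parameters are hypothetical.

    def block_limits(gdp_per_capita):
        """Upper bounds of the three blocks (m3 per person per year)."""
        base = (20.0, 60.0, 120.0)                 # essential, comfort, discretionary uses
        scale = 1.0 + 0.2 * (gdp_per_capita / 10_000.0)
        return tuple(b * scale for b in base)

    def marginal_value(volume, gdp_per_capita):
        """Willingness to pay (USD/m3) for the marginal unit at a given volume."""
        b1, b2, b3 = block_limits(gdp_per_capita)
        if volume <= b1:
            return 3.0          # high value for essential uses
        if volume <= b2:
            return 1.0
        if volume <= b3:
            return 0.3
        return 0.0              # demand saturated beyond the last block

    for v in (15, 50, 100, 200):
        print(v, "m3 ->", marginal_value(v, gdp_per_capita=8_000), "USD/m3")
    ```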

  13. Concepts, methods and models to assess environmental impact

    International Nuclear Information System (INIS)

    Pentreath, R.J.

    2002-01-01

    individual sites, is also planned in Canada. A somewhat conceptually different approach is that of an attempt to develop a hierarchical system for environmental protection based on a narrowly defined set of Reference Fauna and Flora analogous to that of Reference Man - consisting of defined dose models, data sets to estimate exposures, and data on biological effects, to provide a set of 'derived consideration levels' of dose-effect relationships for individual fauna and flora that could be used to help decision making (along with other relevant biological information) in different circumstances. Research work is also underway to produce systematic frameworks - also using a 'reference fauna and flora approach' - for assessing environmental impact in specific geographic areas, such as European and Arctic ecosystems. (author)

  14. Retrofitting Non-Cognitive-Diagnostic Reading Assessment under the Generalized DINA Model Framework

    Science.gov (United States)

    Chen, Huilin; Chen, Jinsong

    2016-01-01

    Cognitive diagnosis models (CDMs) are psychometric models developed mainly to assess examinees' specific strengths and weaknesses in a set of skills or attributes within a domain. By adopting the Generalized-DINA model framework, the recently developed general modeling framework, we attempted to retrofit the PISA reading assessments, a…

  15. Tsunami Risk Assessment Modelling in Chabahar Port, Iran

    Science.gov (United States)

    Delavar, M. R.; Mohammadi, H.; Sharifi, M. A.; Pirooz, M. D.

    2017-09-01

    The well-known historical tsunami in the Makran Subduction Zone (MSZ) region was generated by the earthquake of November 28, 1945 in Makran Coast in the North of Oman Sea. This destructive tsunami killed over 4,000 people in Southern Pakistan and India, caused great loss of life and devastation along the coasts of Western India, Iran and Oman. According to the report of "Remembering the 1945 Makran Tsunami", compiled by the Intergovernmental Oceanographic Commission (UNESCO/IOC), the maximum inundation of Chabahar port was 367 m toward the dry land, which had a height of 3.6 meters from the sea level. In addition, the maximum amount of inundation at Pasni (Pakistan) reached to 3 km from the coastline. For the two beaches of Gujarat (India) and Oman the maximum run-up height was 3 m from the sea level. In this paper, we first use Makran 1945 seismic parameters to simulate the tsunami in generation, propagation and inundation phases. The effect of tsunami on Chabahar port is simulated using the ComMIT model which is based on the Method of Splitting Tsunami (MOST). In this process the results are compared with the documented eyewitnesses and some reports from researchers for calibration and validation of the result. Next we have used the model to perform risk assessment for Chabahar port in the south of Iran with the worst case scenario of the tsunami. The simulated results showed that the tsunami waves will reach Chabahar coastline 11 minutes after generation and 9 minutes later, over 9.4 Km2 of the dry land will be flooded with maximum wave amplitude reaching up to 30 meters.

  16. TSUNAMI RISK ASSESSMENT MODELLING IN CHABAHAR PORT, IRAN

    Directory of Open Access Journals (Sweden)

    M. R. Delavar

    2017-09-01

    Full Text Available The well-known historical tsunami in the Makran Subduction Zone (MSZ) region was generated by the earthquake of November 28, 1945 in Makran Coast in the North of Oman Sea. This destructive tsunami killed over 4,000 people in Southern Pakistan and India, caused great loss of life and devastation along the coasts of Western India, Iran and Oman. According to the report of "Remembering the 1945 Makran Tsunami", compiled by the Intergovernmental Oceanographic Commission (UNESCO/IOC), the maximum inundation of Chabahar port was 367 m toward the dry land, which had a height of 3.6 meters from the sea level. In addition, the maximum amount of inundation at Pasni (Pakistan) reached to 3 km from the coastline. For the two beaches of Gujarat (India) and Oman the maximum run-up height was 3 m from the sea level. In this paper, we first use Makran 1945 seismic parameters to simulate the tsunami in generation, propagation and inundation phases. The effect of tsunami on Chabahar port is simulated using the ComMIT model which is based on the Method of Splitting Tsunami (MOST). In this process the results are compared with the documented eyewitnesses and some reports from researchers for calibration and validation of the result. Next we have used the model to perform risk assessment for Chabahar port in the south of Iran with the worst case scenario of the tsunami. The simulated results showed that the tsunami waves will reach Chabahar coastline 11 minutes after generation and 9 minutes later, over 9.4 km2 of the dry land will be flooded with maximum wave amplitude reaching up to 30 meters.

  17. Implications of model uncertainty for the practice of risk assessment

    International Nuclear Information System (INIS)

    Laskey, K.B.

    1994-01-01

    A model is a representation of a system that can be used to answer questions about the system's behavior. The term model uncertainty refers to problems in which there is no generally agreed upon, validated model that can be used as a surrogate for the system itself. Model uncertainty affects both the methodology appropriate for building models and how models should be used. This paper discusses representations of model uncertainty, methodologies for exercising and interpreting models in the presence of model uncertainty, and the appropriate use of fallible models for policy making

  18. Verification and validation of the decision analysis model for assessment of TWRS waste treatment strategies

    International Nuclear Information System (INIS)

    Awadalla, N.G.; Eaton, S.C.F.

    1996-01-01

    This document is the verification and validation final report for the Decision Analysis Model for Assessment of Tank Waste Remediation System Waste Treatment Strategies. This model is also known as the INSIGHT Model

  19. An Introduction to the Partial Credit Model for Developing Nursing Assessments.

    Science.gov (United States)

    Fox, Christine

    1999-01-01

    Demonstrates how the partial credit model, a variation of the Rasch Measurement Model, can be used to develop performance-based assessments for nursing education. Applies the model using the Practical Knowledge Inventory for Nurses. (SK)

  20. Risk assessment to an integrated planning model for UST programs

    International Nuclear Information System (INIS)

    Ferguson, K.W.

    1993-01-01

    The US Postal Service maintains the largest civilian fleet in the United States, totaling approximately 180,000 vehicles. To support the fleet's daily energy requirements, the Postal Service also operates one of the largest networks of underground storage tanks, nearly 7,500 nationwide. A program to apply risk assessment to planning, budget development and other management actions was implemented during September 1989. Working closely with a consultant, the Postal Service developed regulatory and environmental risk criteria and weighting factors for a ranking model. The primary objective was to identify relative risks for each underground tank at individual facilities. Relative risks at each facility were central to prioritizing scheduled improvements to the tank network. The survey was conducted on 302 underground tanks in the Northeast Region of the US. An environmental and regulatory risk score was computed for each UST. By ranking the tanks according to their risk score, tanks were classified into management action categories including, but not limited to, underground tank testing, retrofit, repair, replacement and closure
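
    A minimal sketch of the kind of weighted ranking model described above: each tank receives a weighted sum of normalized criterion scores, and tanks are sorted by total score. The criteria, weights, and scores shown are hypothetical, not the Postal Service's actual factors.

    ```python
    # Sketch of a relative-risk ranking for underground storage tanks (USTs):
    # each tank gets a weighted score over environmental and regulatory criteria,
    # and tanks are ranked by total score. Criteria and weights are hypothetical.

    WEIGHTS = {
        "tank_age": 0.30,               # older tanks score higher
        "leak_history": 0.25,
        "depth_to_groundwater": 0.20,
        "distance_to_receptors": 0.15,
        "regulatory_deadline": 0.10,
    }

    tanks = [
        {"id": "UST-001", "tank_age": 0.9, "leak_history": 0.0,
         "depth_to_groundwater": 0.7, "distance_to_receptors": 0.4, "regulatory_deadline": 0.5},
        {"id": "UST-002", "tank_age": 0.4, "leak_history": 0.8,
         "depth_to_groundwater": 0.9, "distance_to_receptors": 0.9, "regulatory_deadline": 1.0},
    ]

    def risk_score(tank):
        """Weighted sum of normalized (0-1) criterion scores."""
        return sum(WEIGHTS[c] * tank[c] for c in WEIGHTS)

    for t in sorted(tanks, key=risk_score, reverse=True):
        print(f"{t['id']}: relative risk = {risk_score(t):.2f}")
    ```

    The ranked list can then be mapped to management action categories (testing, retrofit, repair, replacement, closure) using score thresholds.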

  1. PFI redux? Assessing a new model for financing hospitals.

    Science.gov (United States)

    Hellowell, Mark

    2013-11-01

    There is a growing need for investments in hospital facilities to improve the efficiency and quality of health services. In recent years, publicly financed hospital organisations in many countries have utilised private finance arrangements, variously called private finance initiatives (PFIs), public-private partnerships (PPPs) or P3s, to address their capital requirements. However, such projects have become more difficult to implement since the onset of the global financial crisis, which has led to a reduction in the supply of debt capital and an increase in its price. In December 2012, the government of the United Kingdom outlined a comprehensive set of reforms to the private finance model in order to revive this important source of capital for hospital investments. This article provides a critical assessment of the 'Private Finance 2' reforms, focusing on their likely impact on the supply and cost of capital. It concludes that constraints in supply are likely to continue, in part due to regulatory constraints facing both commercial banks and institutional investors, while the cost of capital is likely to increase, at least in the short term. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  2. Using urban forest assessment tools to model bird habitat potential

    Science.gov (United States)

    Lerman, Susannah B.; Nislow, Keith H.; Nowak, David J.; DeStefano, Stephen; King, David I.; Jones-Farrand, D. Todd

    2014-01-01

    The alteration of forest cover and the replacement of native vegetation with buildings, roads, exotic vegetation, and other urban features pose one of the greatest threats to global biodiversity. As more land becomes slated for urban development, identifying effective urban forest wildlife management tools becomes paramount to ensure the urban forest provides habitat to sustain bird and other wildlife populations. The primary goal of this study was to integrate wildlife suitability indices into an existing national urban forest assessment tool, i-Tree. We quantified available habitat characteristics of urban forests for ten northeastern U.S. cities, and summarized bird habitat relationships from the literature in terms of variables that were represented in the i-Tree datasets. With these data, we generated habitat suitability equations for nine bird species, representing a range of life history traits and conservation status, that predict habitat suitability based on i-Tree data. We applied these equations to the urban forest datasets to calculate the overall habitat suitability for each city and the habitat suitability for different types of land-use (e.g., residential, commercial, parkland) for each bird species. The proposed habitat models will help guide wildlife managers, urban planners, and landscape designers who require specific information, such as desirable habitat conditions within an urban management project, to help improve the suitability of urban forests for birds.
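
    An illustrative sketch of a habitat-suitability equation built from i-Tree-style stand descriptors; the predictor set, weights, thresholds, and functional form are hypothetical placeholders for the species-specific equations developed in the study.

    ```python
    # Illustrative habitat-suitability index (HSI) built from i-Tree-style urban
    # forest variables. Predictors, weights, and thresholds are hypothetical.

    def habitat_suitability(canopy_cover, native_fraction, large_tree_density):
        """Return an HSI in [0, 1] from three stand descriptors."""
        # canopy_cover and native_fraction in [0, 1]; large_tree_density in trees/ha
        canopy_term = min(canopy_cover / 0.6, 1.0)           # saturates at 60% cover
        large_tree_term = min(large_tree_density / 25.0, 1.0)
        hsi = 0.5 * canopy_term + 0.3 * native_fraction + 0.2 * large_tree_term
        return round(hsi, 2)

    # Example: a residential land-use class with moderate canopy and few large trees
    print(habitat_suitability(canopy_cover=0.35, native_fraction=0.6, large_tree_density=8))
    ```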

  3. Econometric model as a regulatory tool in electricity distribution - Case Network Performance Assessment Model

    International Nuclear Information System (INIS)

    Honkapuro, S.; Lassila, J.; Viljainen, S.; Tahvanainen, K.; Partanen, J.

    2004-01-01

    Electricity distribution companies operate in the state of natural monopolies since building of parallel networks is not cost-effective. Monopoly companies do not have pressure from the open markets to keep their prices and costs at a reasonable level. The regulation of these companies is needed to prevent the misuse of the monopoly position. Regulation is usually focused either on the profit of the company or on the price of electricity. In this document, the usability of an econometric model in the regulation of electricity distribution companies is evaluated. A regulation method which determines the allowed income for each company with a generic computation model can be seen as an econometric model. As the special case of an econometric model, the method called Network Performance Assessment Model, NPAM (Naetnyttomodellen in Swedish), is analysed. NPAM is developed by the Swedish Energy Agency (STEM) for the regulation of electricity distribution companies. Both theoretical analysis and calculations of an example network area are presented in this document to find the major directing effects of the model. The parameters of NPAM, which are used in the calculations of this research report, were dated 30 March 2004. These parameters were the most recent available at the time the analysis was done. However, since NPAM is under development, the parameters have been constantly changing. Therefore, slight changes in the results could occur if the calculations were made with the latest parameters. However, the main conclusions are the same and do not depend on the exact parameters. (orig.)

  4. Econometric model as a regulatory tool in electricity distribution. Case network performance assessment model

    International Nuclear Information System (INIS)

    Honkapuro, S.; Lassila, J.; Viljainen, S.; Tahvanainen, K.; Partanen, J.

    2004-01-01

    Electricity distribution companies operate in the state of natural monopolies since building of parallel networks is not cost- effective. Monopoly companies do not have pressure from the open markets to keep their prices and costs at reasonable level. The regulation of these companies is needed to prevent the misuse of the monopoly position. Regulation is usually focused either on the profit of company or on the price of electricity. Regulation method which determines allowed income for each company with generic computation model can be seen as an econometric model. In this document, the usability of an econometric model in the regulation of electricity distribution companies is evaluated. As the special case of an econometric model, the method called Network Performance Assessment Model, NPAM (Naetnyttomodellen in Swedish), is analysed. NPAM is developed by Swedish Energy Agency (STEM) for the regulation of electricity distribution companies. Both theoretical analysis and calculations of an example network area are presented in this document to find the major directing effects of the model. The parameters of NPAM, which are used in the calculations of this research report, were dated on 30th of March 2004. These parameters were most recent ones available at the time when analysis was done. However, since NPAM have been under development, the parameters have been constantly changing. Therefore slight changes might occur in the numerical results of calculations if they were made with the latest set of parameters. However, main conclusions are same and do not depend on exact parameters

  5. SR 97 - Alternative models project. Discrete fracture network modelling for performance assessment of Aberg

    International Nuclear Information System (INIS)

    Dershowitz, B.; Eiben, T.; Follin, S.; Andersson, Johan

    1999-08-01

    As part of studies into the siting of a deep repository for nuclear waste, Swedish Nuclear Fuel and Waste Management Company (SKB) has commissioned the Alternative Models Project (AMP). The AMP is a comparison of three alternative modeling approaches for geosphere performance assessment for a single hypothetical site. The hypothetical site, arbitrarily named Aberg is based on parameters from the Aespoe Hard Rock Laboratory in southern Sweden. The Aberg model domain, boundary conditions and canister locations are defined as a common reference case to facilitate comparisons between approaches. This report presents the results of a discrete fracture pathways analysis of the Aberg site, within the context of the SR 97 performance assessment exercise. The Aberg discrete fracture network (DFN) site model is based on consensus Aberg parameters related to the Aespoe HRL site. Discrete fracture pathways are identified from canister locations in a prototype repository design to the surface of the island or to the sea bottom. The discrete fracture pathways analysis presented in this report is used to provide the following parameters for SKB's performance assessment transport codes FARF31 and COMP23: * F-factor: Flow wetted surface normalized with regards to flow rate (yields an appreciation of the contact area available for diffusion and sorption processes) [T L^-1]. * Travel Time: Advective transport time from a canister location to the environmental discharge [T]. * Canister Flux: Darcy flux (flow rate per unit area) past a representative canister location [L T^-1]. In addition to the above, the discrete fracture pathways analysis in this report also provides information about: additional pathway parameters such as pathway length, pathway width, transport aperture, reactive surface area and transmissivity, percentage of canister locations with pathways to the surface discharge, spatial pattern of pathways and pathway discharges, visualization of pathways, and statistical
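
    For a single discrete pathway, the three quantities listed above are commonly computed as follows (assuming a planar pathway of length L, width W, transport aperture e_t, volumetric flow Q, and both fracture walls wetted). This is a generic formulation consistent with the quoted units, not necessarily the exact expressions used in the report:

    ```latex
    % Generic single-pathway expressions consistent with the units quoted above.
    \begin{align}
      F &= \frac{A_{\mathrm{wet}}}{Q} \approx \frac{2\,W\,L}{Q} \quad [\mathrm{T\,L^{-1}}], \\[4pt]
      t_{\mathrm{adv}} &= \frac{W\,L\,e_t}{Q} \quad [\mathrm{T}], \\[4pt]
      q &= \frac{Q}{A} \quad [\mathrm{L\,T^{-1}}]
        \quad \text{(Darcy flux past a canister of cross-sectional area } A\text{)} .
    \end{align}
    ```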

  6. SR 97 - Alternative models project. Discrete fracture network modelling for performance assessment of Aberg

    Energy Technology Data Exchange (ETDEWEB)

    Dershowitz, B.; Eiben, T. [Golder Associates Inc., Seattle (United States); Follin, S.; Andersson, Johan [Golder Grundteknik KB, Stockholm (Sweden)

    1999-08-01

    As part of studies into the siting of a deep repository for nuclear waste, Swedish Nuclear Fuel and Waste Management Company (SKB) has commissioned the Alternative Models Project (AMP). The AMP is a comparison of three alternative modeling approaches for geosphere performance assessment for a single hypothetical site. The hypothetical site, arbitrarily named Aberg is based on parameters from the Aespoe Hard Rock Laboratory in southern Sweden. The Aberg model domain, boundary conditions and canister locations are defined as a common reference case to facilitate comparisons between approaches. This report presents the results of a discrete fracture pathways analysis of the Aberg site, within the context of the SR 97 performance assessment exercise. The Aberg discrete fracture network (DFN) site model is based on consensus Aberg parameters related to the Aespoe HRL site. Discrete fracture pathways are identified from canister locations in a prototype repository design to the surface of the island or to the sea bottom. The discrete fracture pathways analysis presented in this report is used to provide the following parameters for SKB's performance assessment transport codes FARF31 and COMP23: * F-factor: Flow wetted surface normalized with regards to flow rate (yields an appreciation of the contact area available for diffusion and sorption processes) [T L^-1]. * Travel Time: Advective transport time from a canister location to the environmental discharge [T]. * Canister Flux: Darcy flux (flow rate per unit area) past a representative canister location [L T^-1]. In addition to the above, the discrete fracture pathways analysis in this report also provides information about: additional pathway parameters such as pathway length, pathway width, transport aperture, reactive surface area and transmissivity, percentage of canister locations with pathways to the surface discharge, spatial pattern of pathways and pathway discharges, visualization of pathways, and

  7. Utility of Social Modeling for Proliferation Assessment - Enhancing a Facility-Level Model for Proliferation Resistance Assessment of a Nuclear Enegry System

    Energy Technology Data Exchange (ETDEWEB)

    Coles, Garill A.; Brothers, Alan J.; Gastelum, Zoe N.; Olson, Jarrod; Thompson, Sandra E.

    2009-10-26

    The Utility of Social Modeling for Proliferation Assessment project (PL09-UtilSocial) investigates the use of social and cultural information to improve nuclear proliferation assessments, including nonproliferation assessments, Proliferation Resistance (PR) assessments, safeguards assessments, and other related studies. These assessments often use and create technical information about a host State and its posture towards proliferation, the vulnerability of a nuclear energy system (NES) to an undesired event, and the effectiveness of safeguards. This objective of this project is to find and integrate social and technical information by explicitly considering the role of cultural, social, and behavioral factors relevant to proliferation; and to describe and demonstrate if and how social science modeling has utility in proliferation assessment. This report describes a modeling approach and how it might be used to support a location-specific assessment of the PR assessment of a particular NES. The report demonstrates the use of social modeling to enhance an existing assessment process that relies on primarily technical factors. This effort builds on a literature review and preliminary assessment performed as the first stage of the project and compiled in PNNL-18438. [ T his report describes an effort to answer questions about whether it is possible to incorporate social modeling into a PR assessment in such a way that we can determine the effects of social factors on a primarily technical assessment. This report provides: 1. background information about relevant social factors literature; 2. background information about a particular PR assessment approach relevant to this particular demonstration; 3. a discussion of social modeling undertaken to find and characterize social factors that are relevant to the PR assessment of a nuclear facility in a specific location; 4. description of an enhancement concept that integrates social factors into an existing, technically

  8. Forest Ecosystem Dynamics Assessment and Predictive Modelling in Eastern Himalaya

    Science.gov (United States)

    Kushwaha, S. P. S.; Nandy, S.; Ahmad, M.; Agarwal, R.

    2011-09-01

    This study focused on forest ecosystem dynamics assessment and predictive modelling of deforestation and forest cover in a part of north-eastern India, i.e. forest areas along the West Bengal, Bhutan, Arunachal Pradesh and Assam border in the Eastern Himalaya, using temporal satellite imagery of 1975, 1990 and 2009, and predicted the forest cover for the year 2028 using a Cellular Automata Markov Model (CAMM). The exercise highlighted large-scale deforestation in the study area in both the 1975-1990 and 1990-2009 forest cover change vectors. A net loss of 2,334.28 km2 of forest cover was noticed between 1975 and 2009, and at the current rate of deforestation, a forest area of 4,563.34 km2 will be lost by 2028. The annual rate of deforestation worked out to be 0.35% and 0.78% during 1975-1990 and 1990-2009 respectively. Bamboo forest increased by 24.98% between 1975 and 2009 due to opening up of the forests. Forests in the Kokrajhar, Barpeta, Darrang, Sonitpur, and Dhemaji districts in Assam were noticed to be worst-affected, while Lower Subansiri, West and East Siang, Dibang Valley, Lohit and Changlang in Arunachal Pradesh were severely affected. Among the different forest types, the maximum loss was seen in the case of sal forest (37.97%) between 1975 and 2009, which is expected to deplete further to 60.39% by 2028. Tropical moist deciduous forest was the next category, which decreased from 5,208.11 km2 to 3,447.28 km2 (33.81%) during the same period, with further depletion to 2,288.81 km2 (56.05%) likely by 2028. The study noted a progressive loss of forests in the study area between 1975 and 2009 through 1990 and predicted that, unless checked, the area is in for further depletion of the invaluable climax forests in the region, especially sal and moist deciduous forests. The exercise demonstrated the high potential of remote sensing and geographic information systems for forest ecosystem dynamics assessment and the efficacy of CAMM to predict forest cover change.
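
    The annual deforestation rates quoted above (0.35% and 0.78% per year) are the kind of figure obtained from forest areas at two dates; the record does not state which rate formula the authors used. The sketch below uses a continuous (logarithmic) annual rate, one common convention, with purely illustrative areas.

```python
import math

def annual_change_rate(area_t1, area_t2, years):
    """Annual rate of forest-cover change in % per year, using a continuous (logarithmic) rate."""
    return 100.0 * math.log(area_t2 / area_t1) / years

# Illustrative values only (not taken from the study): cover shrinking
# from 30,000 km2 to 28,500 km2 over 15 years.
r = annual_change_rate(30_000.0, 28_500.0, 15.0)
print(f"annual change ~ {r:.2f} %/yr")   # negative value => net forest loss
```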

  9. FOREST ECOSYSTEM DYNAMICS ASSESSMENT AND PREDICTIVE MODELLING IN EASTERN HIMALAYA

    Directory of Open Access Journals (Sweden)

    S. P. S. Kushwaha

    2012-09-01

    Full Text Available This study focused on forest ecosystem dynamics assessment and predictive modelling of deforestation and forest cover in a part of north-eastern India, i.e. forest areas along the West Bengal, Bhutan, Arunachal Pradesh and Assam border in the Eastern Himalaya, using temporal satellite imagery of 1975, 1990 and 2009, and predicted the forest cover for the year 2028 using a Cellular Automata Markov Model (CAMM). The exercise highlighted large-scale deforestation in the study area in both the 1975–1990 and 1990–2009 forest cover change vectors. A net loss of 2,334.28 km2 of forest cover was noticed between 1975 and 2009, and at the current rate of deforestation, a forest area of 4,563.34 km2 will be lost by 2028. The annual rate of deforestation worked out to be 0.35% and 0.78% during 1975–1990 and 1990–2009 respectively. Bamboo forest increased by 24.98% between 1975 and 2009 due to opening up of the forests. Forests in the Kokrajhar, Barpeta, Darrang, Sonitpur, and Dhemaji districts in Assam were noticed to be worst-affected, while Lower Subansiri, West and East Siang, Dibang Valley, Lohit and Changlang in Arunachal Pradesh were severely affected. Among the different forest types, the maximum loss was seen in the case of sal forest (37.97%) between 1975 and 2009, which is expected to deplete further to 60.39% by 2028. Tropical moist deciduous forest was the next category, which decreased from 5,208.11 km2 to 3,447.28 km2 (33.81%) during the same period, with further depletion to 2,288.81 km2 (56.05%) likely by 2028. The study noted a progressive loss of forests in the study area between 1975 and 2009 through 1990 and predicted that, unless checked, the area is in for further depletion of the invaluable climax forests in the region, especially sal and moist deciduous forests. The exercise demonstrated the high potential of remote sensing and geographic information systems for forest ecosystem dynamics assessment and the efficacy of CAMM to predict the forest cover change.

  10. Reliability assessment using degradation models: bayesian and classical approaches

    Directory of Open Access Journals (Sweden)

    Marta Afonso Freitas

    2010-04-01

    Full Text Available Traditionally, reliability assessment of devices has been based on (accelerated) life tests. However, for highly reliable products, little information about reliability is provided by life tests in which few or no failures are typically observed. Since most failures arise from a degradation mechanism at work, with characteristics that degrade over time, one alternative is to monitor the device for a period of time and assess its reliability from the changes in performance (degradation) observed during that period. The goal of this article is to illustrate how degradation data can be modeled and analyzed by using "classical" and Bayesian approaches. Four methods of data analysis based on classical inference are presented. Next we show how Bayesian methods can also be used to provide a natural approach to analyzing degradation data. The approaches are applied to a real data set regarding train wheel degradation.
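
    One of the classical degradation-data methods the article alludes to is the approximate degradation-path approach: fit a simple path model to each unit's measurements and extrapolate to a failure threshold to obtain a pseudo failure time. The sketch below shows only that step, with made-up wheel-wear-style numbers and a hypothetical threshold; it is not the article's analysis.

```python
import numpy as np

def pseudo_failure_time(times, degradation, threshold):
    """Fit a linear degradation path and extrapolate when it crosses the failure threshold."""
    slope, intercept = np.polyfit(times, degradation, 1)
    return (threshold - intercept) / slope

# Hypothetical wear measurements (mm) versus mileage (thousand km) for one wheel.
t = np.array([0.0, 50.0, 100.0, 150.0, 200.0])
wear = np.array([0.0, 1.1, 2.0, 3.2, 4.1])
print(f"pseudo failure time ~ {pseudo_failure_time(t, wear, threshold=25.0):.0f} thousand km")
```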

  11. Forecasting consequences of accidental release: how reliable are current assessment models

    International Nuclear Information System (INIS)

    Rohwer, P.S.; Hoffman, F.O.; Miller, C.W.

    1983-01-01

    This paper focuses on uncertainties in model output used to assess accidents. We begin by reviewing the historical development of assessment models and the associated interest in uncertainties as these evolutionary processes occurred in the United States. This is followed by a description of the sources of uncertainties in assessment calculations. Types of models appropriate for the assessment of accidents are identified. A summary is then provided of results from our analysis of uncertainty in results obtained with current methodology for assessing routine and accidental radionuclide releases to the environment. We conclude with a discussion of preferred procedures and suggested future directions to improve the state-of-the-art of radiological assessments

  12. Assessment of Land Surface Models in a High-Resolution Atmospheric Model during Indian Summer Monsoon

    KAUST Repository

    Attada, Raju

    2018-04-17

    Assessment of land surface models (LSMs) for monsoon studies over the Indian summer monsoon (ISM) region is essential. In this study, we evaluate the skill of LSMs at 10 km spatial resolution in simulating the 2010 monsoon season. The thermal diffusion scheme (TDS), rapid update cycle (RUC), Noah, and Noah with multi-parameterization (Noah-MP) LSMs are chosen based on the nature of their complexity, ranging from a simple slab model to multi-parameterization options, coupled with the Weather Research and Forecasting (WRF) model. Model results are compared with the available in situ observations and reanalysis fields. The sensitivity of monsoon elements, surface characteristics, and vertical structures to different LSMs is discussed. Our results reveal that the monsoon features are reproduced by the WRF model with all LSMs, but with some regional discrepancies. The model simulations with the selected LSMs are able to reproduce the broad rainfall patterns, orography-induced rainfall over the Himalayan region, and the dry zone over the southern tip of India. An unrealistic precipitation pattern over the equatorial western Indian Ocean is simulated by the WRF–LSM-based experiments. The spatial and temporal distributions of top 2-m soil characteristics (soil temperature and soil moisture) are well represented in the RUC and Noah-MP LSM-based experiments during the ISM. Results show that the WRF simulations with the RUC, Noah, and Noah-MP LSMs significantly improved the skill of 2-m temperature and moisture compared to the TDS (chosen as a base) LSM-based experiments. Furthermore, the simulations with the Noah, RUC, and Noah-MP LSMs exhibit the minimum error in thermodynamic fields. In the case of surface wind speed, the TDS LSM performed better than the other LSM experiments. A significant improvement in simulated rainfall is noticeable in the WRF model with the Noah, RUC, and Noah-MP LSMs over the TDS LSM. Thus, this study emphasizes the importance of choosing/improving LSMs for simulating the ISM phenomena
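
    Skill comparisons of this kind typically reduce each LSM experiment to a few error statistics against observations. The sketch below computes bias and RMSE of 2-m temperature for two experiments using hypothetical station values; it is not the study's verification code, and the experiment names are reused only as labels.

```python
import numpy as np

def bias_rmse(simulated, observed):
    """Mean bias and RMSE of a simulated series against observations."""
    err = np.asarray(simulated, dtype=float) - np.asarray(observed, dtype=float)
    return err.mean(), np.sqrt((err ** 2).mean())

# Hypothetical 2-m temperature series (deg C): observations and two LSM experiments.
obs = [27.1, 28.4, 29.0, 30.2, 29.5]
runs = {"TDS": [28.9, 30.1, 30.8, 32.0, 31.2],
        "Noah-MP": [27.4, 28.8, 29.3, 30.6, 29.9]}
for name, sim in runs.items():
    b, rmse = bias_rmse(sim, obs)
    print(f"{name:8s} bias = {b:+.2f} K, rmse = {rmse:.2f} K")
```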

  13. Assessment of Land Surface Models in a High-Resolution Atmospheric Model during Indian Summer Monsoon

    Science.gov (United States)

    Attada, Raju; Kumar, Prashant; Dasari, Hari Prasad

    2018-04-01

    Assessment of land surface models (LSMs) for monsoon studies over the Indian summer monsoon (ISM) region is essential. In this study, we evaluate the skill of LSMs at 10 km spatial resolution in simulating the 2010 monsoon season. The thermal diffusion scheme (TDS), rapid update cycle (RUC), Noah, and Noah with multi-parameterization (Noah-MP) LSMs are chosen based on the nature of their complexity, ranging from a simple slab model to multi-parameterization options, coupled with the Weather Research and Forecasting (WRF) model. Model results are compared with the available in situ observations and reanalysis fields. The sensitivity of monsoon elements, surface characteristics, and vertical structures to different LSMs is discussed. Our results reveal that the monsoon features are reproduced by the WRF model with all LSMs, but with some regional discrepancies. The model simulations with the selected LSMs are able to reproduce the broad rainfall patterns, orography-induced rainfall over the Himalayan region, and the dry zone over the southern tip of India. An unrealistic precipitation pattern over the equatorial western Indian Ocean is simulated by the WRF-LSM-based experiments. The spatial and temporal distributions of top 2-m soil characteristics (soil temperature and soil moisture) are well represented in the RUC and Noah-MP LSM-based experiments during the ISM. Results show that the WRF simulations with the RUC, Noah, and Noah-MP LSMs significantly improved the skill of 2-m temperature and moisture compared to the TDS (chosen as a base) LSM-based experiments. Furthermore, the simulations with the Noah, RUC, and Noah-MP LSMs exhibit the minimum error in thermodynamic fields. In the case of surface wind speed, the TDS LSM performed better than the other LSM experiments. A significant improvement in simulated rainfall is noticeable in the WRF model with the Noah, RUC, and Noah-MP LSMs over the TDS LSM. Thus, this study emphasizes the importance of choosing/improving LSMs for simulating the ISM phenomena in

  14. Assessment of Land Surface Models in a High-Resolution Atmospheric Model during Indian Summer Monsoon

    KAUST Repository

    Attada, Raju; Kumar, Prashant; Dasari, Hari Prasad

    2018-01-01

    Assessment of land surface models (LSMs) for monsoon studies over the Indian summer monsoon (ISM) region is essential. In this study, we evaluate the skill of LSMs at 10 km spatial resolution in simulating the 2010 monsoon season. The thermal diffusion scheme (TDS), rapid update cycle (RUC), Noah, and Noah with multi-parameterization (Noah-MP) LSMs are chosen based on the nature of their complexity, ranging from a simple slab model to multi-parameterization options, coupled with the Weather Research and Forecasting (WRF) model. Model results are compared with the available in situ observations and reanalysis fields. The sensitivity of monsoon elements, surface characteristics, and vertical structures to different LSMs is discussed. Our results reveal that the monsoon features are reproduced by the WRF model with all LSMs, but with some regional discrepancies. The model simulations with the selected LSMs are able to reproduce the broad rainfall patterns, orography-induced rainfall over the Himalayan region, and the dry zone over the southern tip of India. An unrealistic precipitation pattern over the equatorial western Indian Ocean is simulated by the WRF–LSM-based experiments. The spatial and temporal distributions of top 2-m soil characteristics (soil temperature and soil moisture) are well represented in the RUC and Noah-MP LSM-based experiments during the ISM. Results show that the WRF simulations with the RUC, Noah, and Noah-MP LSMs significantly improved the skill of 2-m temperature and moisture compared to the TDS (chosen as a base) LSM-based experiments. Furthermore, the simulations with the Noah, RUC, and Noah-MP LSMs exhibit the minimum error in thermodynamic fields. In the case of surface wind speed, the TDS LSM performed better than the other LSM experiments. A significant improvement in simulated rainfall is noticeable in the WRF model with the Noah, RUC, and Noah-MP LSMs over the TDS LSM. Thus, this study emphasizes the importance of choosing/improving LSMs for simulating the ISM phenomena

  15. Computation Modeling and Assessment of Nanocoatings for Ultra Supercritical Boilers

    Energy Technology Data Exchange (ETDEWEB)

    J. Shingledecker; D. Gandy; N. Cheruvu; R. Wei; K. Chan

    2011-06-21

    Forced outages and boiler unavailability of coal-fired fossil plants are most often caused by fire-side corrosion of boiler waterwalls and tubing. Reliable coatings are required for Ultrasupercritical (USC) applications to mitigate corrosion since these boilers will operate at much higher temperatures and pressures than supercritical (565 C at 24 MPa) boilers. Computational modeling efforts have been undertaken to design and assess potential Fe-Cr-Ni-Al systems to produce stable nanocrystalline coatings that form a protective, continuous scale of either Al2O3 or Cr2O3. The computational modeling results identified a new series of Fe-25Cr-40Ni with or without 10 wt.% Al nanocrystalline coatings that maintain long-term stability by forming a diffusion barrier layer at the coating/substrate interface. The computational modeling predictions of microstructure, formation of continuous Al2O3 scale, inward Al diffusion, grain growth, and sintering behavior were validated with experimental results. Advanced coatings, such as MCrAl (where M is Fe, Ni, or Co) nanocrystalline coatings, have been processed using different magnetron sputtering deposition techniques. Several coating trials were performed and, among the processing methods evaluated, the DC pulsed magnetron sputtering technique produced the best quality coating with a minimum number of shallow defects, and the results of multiple deposition trials showed that the process is repeatable. The cyclic oxidation test results revealed that the nanocrystalline coatings offer better oxidation resistance, in terms of weight loss, localized oxidation, and formation of mixed oxides in the Al2O3 scale, than widely used MCrAlY coatings. However, the ultra-fine grain structure in these coatings, consistent with the computational model predictions, resulted in accelerated Al
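
    Oxidation weight-change data of the kind mentioned above are often interpreted with a parabolic growth law whose rate constant follows an Arrhenius temperature dependence. The sketch below shows only that generic relation; the constants are placeholders, not values fitted to this project's coatings.

```python
import math

R = 8.314  # gas constant [J/(mol K)]

def parabolic_weight_gain(time_h, k0, activation_energy, temperature_K):
    """Specific weight gain [mg/cm^2] from a parabolic oxidation law with an Arrhenius rate constant."""
    kp = k0 * math.exp(-activation_energy / (R * temperature_K))  # parabolic rate constant [mg^2 cm^-4 h^-1]
    return math.sqrt(kp * time_h)

# Placeholder constants for an alumina-forming coating held at 750 C (1023 K) for 1000 h.
gain = parabolic_weight_gain(1000.0, k0=5.0e3, activation_energy=2.5e5, temperature_K=1023.0)
print(f"predicted weight gain ~ {gain:.2e} mg/cm^2")
```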

  16. The PP&L Nuclear Department model for conducting self-assessments

    International Nuclear Information System (INIS)

    Murthy, M.L.R.; Vernick, H.R.; Male, A.M.; Burchill, W.E.

    1995-01-01

    The nuclear department of Pennsylvania Power & Light Company (PP&L) has initiated an aggressive, methodical, self-assessment program. Self-assessments are conducted to prevent problems, improve performance, and monitor results. The assessment activities are conducted by, or for, an individual having responsibility for performing the work being assessed. This individual, or customer, accepts ownership of the assessment effort and commits to implementing the recommendations agreed on during the assessment. This paper discusses the main elements of the assessment model developed by PP&L and the results the model has achieved to date

  17. [Homeostasis model assessment (HOMA) values in Chilean elderly subjects].

    Science.gov (United States)

    Garmendia, María Luisa; Lera, Lydia; Sánchez, Hugo; Uauy, Ricardo; Albala, Cecilia

    2009-11-01

    The homeostasis model assessment of insulin resistance (HOMA-IR) estimates insulin resistance using basal insulin and glucose values and has good concordance with values obtained with the euglycemic clamp. However, it has high variability that depends on environmental, genetic and physiologic factors. Therefore it is imperative to establish normal HOMA values in different populations. To report HOMA-IR values in Chilean elderly subjects and to determine the best cutoff point to diagnose insulin resistance. Cross-sectional study of 1003 subjects older than 60 years, of whom 803 (71% women) did not have diabetes. In 154 subjects, an oral glucose tolerance test was also performed. Insulin resistance (IR) was defined as the HOMA value corresponding to the 75th percentile of subjects without overweight or underweight. The behavior of HOMA-IR in the metabolic syndrome was studied and receiver operating characteristic (ROC) curves were calculated, using glucose intolerance, defined as a blood glucose over 140 mg/dl, and hyperinsulinemia, defined as a serum insulin over 60 microU/ml, two hours after the glucose load. Median HOMA-IR values were 1.7. The 75th percentile in subjects without obesity or underweight was 2.57. The area under the ROC curve, when comparing HOMA-IR with glucose intolerance and hyperinsulinemia, was 0.8 (95% confidence interval 0.72-0.87), with HOMA-IR values ranging from 2.04 to 2.33. HOMA-IR is a useful method to determine insulin resistance in epidemiological studies. The HOMA-IR cutoff point for insulin resistance defined in this population was 2.6.
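
    The abstract does not spell out the index itself: HOMA-IR is conventionally computed as fasting glucose times fasting insulin divided by a normalizing constant (405 when glucose is in mg/dL and insulin in microU/mL). A minimal sketch using that convention and the population-specific cutoff of 2.6 reported above, with illustrative input values:

```python
def homa_ir(fasting_glucose_mg_dl, fasting_insulin_uU_ml):
    """HOMA-IR from fasting glucose [mg/dL] and fasting insulin [microU/mL]."""
    return fasting_glucose_mg_dl * fasting_insulin_uU_ml / 405.0

def is_insulin_resistant(homa, cutoff=2.6):
    """Classify insulin resistance with the cutoff reported for this elderly population."""
    return homa >= cutoff

# Illustrative values only.
value = homa_ir(fasting_glucose_mg_dl=95.0, fasting_insulin_uU_ml=12.0)
print(f"HOMA-IR = {value:.2f}, insulin resistant: {is_insulin_resistant(value)}")
```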

  18. Benchmark on residual stress modeling in fracture mechanics assessment

    International Nuclear Information System (INIS)

    Marie, S.; Deschanels, H.; Chapuliot, S.; Le Delliou, P.

    2014-01-01

    In the framework of developments in analytical defect assessment methods for the RSE-M and RCC-MRx codes, new work on the consideration of residual stresses has been initiated by AREVA, CEA and EDF. The first step of this work is the creation of a database of finite element (F.E.) reference cases. To validate assumptions and develop a good-practice guideline for the consideration of residual stresses in finite element calculations, a benchmark between AREVA, CEA and EDF is ongoing. A first application presented in this paper focuses on the analysis of crack initiation in aged duplex stainless steel pipes submitted to an increasing pressure loading. Residual stresses are related to the pipe fabrication process and act as a shell bending condition. Two tests were performed: the first with an internal longitudinal semi-elliptical crack and the second with an external crack. The analysis first focuses on the ability to accurately estimate the measured pressure at crack initiation in the two tests. For that purpose, the results obtained with different methods of taking the residual stresses into account (i.e. thermal fields or an initial strain field) are compared. It then validates post-treatment procedures for J or G determination, and finally compares the results obtained by the different partners. It is shown that the numerical models can properly integrate the impact of residual stresses on the crack initiation pressure. An excellent agreement is obtained between the different numerical evaluations of G provided by the participants in the benchmark, so that best practice and reference F.E. solutions for the consideration of residual stresses can be provided based on this work. (authors)

  19. Caries Risk Assessment in School Children Using Reduced Cariogram Model

    OpenAIRE

    Taqi, Muhammad; Razak, Ishak Abdul; Ab-Murat, Norintan

    2017-01-01

    Objective: To estimate the percentage of children with low, moderate and high caries risk; and to determine the predictors of caries risk amongst 11-12 year old Pakistani school children. Methods: Subjects’ caries risk was assessed using the Cariogram programme. The survey was done among school children in Bhakkar district of Punjab, Pakistan. Caries and plaque levels were assessed using the DMFT and Silness–Löe indices respectively, while diet content and frequency were assessed using a t...

  20. Risk assessment of power systems models, methods, and applications

    CERN Document Server

    Li, Wenyuan

    2014-01-01

    Risk Assessment of Power Systems addresses the regulations and functions of risk assessment with regard to its relevance in system planning, maintenance, and asset management. Brimming with practical examples, this edition introduces the latest risk information on renewable resources, the smart grid, voltage stability assessment, and fuzzy risk evaluation. It is a comprehensive reference of a highly pertinent topic for engineers, managers, and upper-level students who seek examples of risk theory applications in the workplace.

  1. Assessing Models of Public Understanding In ELSI Outreach Materials

    Energy Technology Data Exchange (ETDEWEB)

    Bruce V. Lewenstein, Ph.D.; Dominique Brossard, Ph.D.

    2006-03-01

    issues has been used in educational public settings to affect public understanding of science. After a theoretical background discussion, our approach is three-fold. First, we will provide an overview, a "map", of DOE-funded outreach programs within the overall ELSI context to identify the importance of the educational component, and to present the criteria we used to select relevant and representative case studies. Second, we will document the history of the case studies. Finally, we will explore an intertwined set of research questions: (1) to identify what we can expect such projects to accomplish - in other words, to determine the goals that can reasonably be achieved by different types of outreach; (2) to point out how the case study approach could be useful for DOE-ELSI outreach as a whole; and (3) to use the case study approach as a basis to test theoretical models of science outreach in order to assess to what extent those models accord with real-world outreach activities. For this last goal, we aim at identifying which practices among ELSI outreach activities contribute most to dissemination, or to participation, in other words, in which cases outreach materials spark action in terms of public participation in decisions about scientific issues.

  2. Assessment and improvement of biotransfer models to cow's milk and beef used in exposure assessment tools for organic pollutants.

    Science.gov (United States)

    Takaki, Koki; Wade, Andrew J; Collins, Chris D

    2015-11-01

    The aim of this study was to assess and improve the accuracy of biotransfer models for organic pollutants (PCBs, PCDD/Fs, PBDEs, PFCAs, and pesticides) into cow's milk and beef used in human exposure assessment. The metabolic rate in cattle is known to be a key parameter for this biotransfer; however, few experimental data and no simulation methods are currently available. In this research, the metabolic rate was estimated using existing QSAR biodegradation models of microorganisms (BioWIN) and fish (EPI-HL and IFS-HL). This simulated metabolic rate was then incorporated into the mechanistic cattle biotransfer models (RAIDAR, ACC-HUMAN, OMEGA, and CKow). The goodness-of-fit tests showed that the RAIDAR, ACC-HUMAN, and OMEGA model performances were significantly improved using either of the QSARs when comparing the new model outputs to observed data. The CKow model is the only one that separates the processes in the gut and liver. This model showed the lowest residual error of all the models tested when the BioWIN model was used to represent the ruminant metabolic process in the gut and the two fish QSARs were used to represent the metabolic process in the liver. Our testing included EUSES and CalTOX, which are KOW-regression models that are widely used in regulatory assessment. New regressions based on the simulated rates of the two metabolic processes are also proposed as an alternative to KOW-regression models for a screening risk assessment. The modified CKow model is more physiologically realistic, but has equivalent usability to existing KOW-regression models for estimating cattle biotransfer of organic pollutants. Copyright © 2015. Published by Elsevier Ltd.
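
    A central step described above is turning a QSAR-predicted degradation half-life into a first-order metabolic rate constant that a biotransfer model can use. The sketch below shows only that conversion plus a generic one-compartment steady-state mass balance; it is not the CKow, RAIDAR, ACC-HUMAN, or OMEGA implementation, and all numbers are hypothetical screening values.

```python
import math

def rate_constant_from_half_life(half_life_days):
    """First-order metabolic rate constant [1/day] from a predicted degradation half-life."""
    return math.log(2.0) / half_life_days

def steady_state_body_burden(intake_ug_per_day, k_metabolism, k_other_losses):
    """Steady-state body burden [ug] for a one-compartment cow with first-order losses."""
    return intake_ug_per_day / (k_metabolism + k_other_losses)

# Hypothetical screening case: 30-day metabolic half-life, 0.005 1/day other losses (milk, excretion).
k_met = rate_constant_from_half_life(30.0)
burden = steady_state_body_burden(intake_ug_per_day=10.0, k_metabolism=k_met, k_other_losses=0.005)
print(f"k_met = {k_met:.4f} 1/day, steady-state burden ~ {burden:.0f} ug")
```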

  3. Ecological models for regulatory risk assessments of pesticides: Developing a strategy for the future.

    NARCIS (Netherlands)

    Thorbek, P.; Forbes, V.; Heimbach, F.; Hommen, U.; Thulke, H.H.; Brink, van den P.J.

    2010-01-01

    Ecological Models for Regulatory Risk Assessments of Pesticides: Developing a Strategy for the Future provides a coherent, science-based view on ecological modeling for regulatory risk assessments. It discusses the benefits of modeling in the context of registrations, identifies the obstacles that

  4. Crop modelling for integrated assessment of risk to food production from climate change

    NARCIS (Netherlands)

    Ewert, F.; Rötter, R.P.; Bindi, M.; Webber, Heidi; Trnka, M.; Kersebaum, K.C.; Olesen, J.E.; Ittersum, van M.K.; Janssen, S.J.C.; Rivington, M.; Semenov, M.A.; Wallach, D.; Porter, J.R.; Stewart, D.; Verhagen, J.; Gaiser, T.; Palosuo, T.; Tao, F.; Nendel, C.; Roggero, P.P.; Bartosová, L.; Asseng, S.

    2015-01-01

    The complexity of risks posed by climate change and possible adaptations for crop production has called for integrated assessment and modelling (IAM) approaches linking biophysical and economic models. This paper attempts to provide an overview of the present state of crop modelling to assess

  5. Assessing and improving the quality of modeling : a series of empirical studies about the UML

    NARCIS (Netherlands)

    Lange, C.F.J.

    2007-01-01

    Assessing and Improving the Quality of Modeling: A Series of Empirical Studies about the UML. This thesis addresses the assessment and improvement of the quality of modeling in software engineering. In particular, we focus on the Unified Modeling Language (UML), which is the de facto standard in

  6. Sensitivity of hydrological performance assessment analysis to variations in material properties, conceptual models, and ventilation models

    Energy Technology Data Exchange (ETDEWEB)

    Sobolik, S.R.; Ho, C.K.; Dunn, E. [Sandia National Labs., Albuquerque, NM (United States); Robey, T.H. [Spectra Research Inst., Albuquerque, NM (United States); Cruz, W.T. [Univ. del Turabo, Gurabo (Puerto Rico)

    1996-07-01

    The Yucca Mountain Site Characterization Project is studying Yucca Mountain in southwestern Nevada as a potential site for a high-level nuclear waste repository. Site characterization includes surface-based and underground testing. Analyses have been performed to support the design of an Exploratory Studies Facility (ESF) and the design of the tests performed as part of the characterization process, in order to ascertain that they have minimal impact on the natural ability of the site to isolate waste. The information in this report pertains to sensitivity studies evaluating previous hydrological performance assessment analyses to variation in the material properties, conceptual models, and ventilation models, and the implications of this sensitivity on previous recommendations supporting ESF design. This document contains information that has been used in preparing recommendations for Appendix I of the Exploratory Studies Facility Design Requirements document.

  7. Sensitivity of hydrological performance assessment analysis to variations in material properties, conceptual models, and ventilation models

    International Nuclear Information System (INIS)

    Sobolik, S.R.; Ho, C.K.; Dunn, E.; Robey, T.H.; Cruz, W.T.

    1996-07-01

    The Yucca Mountain Site Characterization Project is studying Yucca Mountain in southwestern Nevada as a potential site for a high-level nuclear waste repository. Site characterization includes surface-based and underground testing. Analyses have been performed to support the design of an Exploratory Studies Facility (ESF) and the design of the tests performed as part of the characterization process, in order to ascertain that they have minimal impact on the natural ability of the site to isolate waste. The information in this report pertains to sensitivity studies evaluating previous hydrological performance assessment analyses to variation in the material properties, conceptual models, and ventilation models, and the implications of this sensitivity on previous recommendations supporting ESF design. This document contains information that has been used in preparing recommendations for Appendix I of the Exploratory Studies Facility Design Requirements document

  8. Using toxicokinetic-toxicodynamic modeling as an acute risk assessment refinement approach in vertebrate ecological risk assessment.

    Science.gov (United States)

    Ducrot, Virginie; Ashauer, Roman; Bednarska, Agnieszka J; Hinarejos, Silvia; Thorbek, Pernille; Weyman, Gabriel

    2016-01-01

    Recent guidance identified toxicokinetic-toxicodynamic (TK-TD) modeling as a relevant approach for risk assessment refinement. Yet, its added value compared to other refinement options is not detailed, and how to conduct the modeling appropriately is not explained. This case study addresses these issues through 2 examples of individual-level risk assessment for 2 hypothetical plant protection products: 1) evaluating the risk for small granivorous birds and small omnivorous mammals of a single application, as a seed treatment in winter cereals, and 2) evaluating the risk for fish after a pulsed treatment in the edge-of-field zone. Using acute test data, we conducted the first tier risk assessment as defined in the European Food Safety Authority (EFSA) guidance. When first tier risk assessment highlighted a concern, refinement options were discussed. Cases where the use of models should be preferred over other existing refinement approaches were highlighted. We then practically conducted the risk assessment refinement by using 2 different models as examples. In example 1, a TK model accounting for toxicokinetics and relevant feeding patterns in the skylark and in the wood mouse was used to predict internal doses of the hypothetical active ingredient in individuals, based on relevant feeding patterns in an in-crop situation, and identify the residue levels leading to mortality. In example 2, a TK-TD model accounting for toxicokinetics, toxicodynamics, and relevant exposure patterns in the fathead minnow was used to predict the time-course of fish survival for relevant FOCUS SW exposure scenarios and identify which scenarios might lead to mortality. Models were calibrated using available standard data and implemented to simulate the time-course of internal dose of active ingredient or survival for different exposure scenarios. Simulation results were discussed and used to derive the risk assessment refinement endpoints used for decision. Finally, we compared the
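
    The toxicokinetic part of a refinement like the fish example above can be pictured as a one-compartment model with first-order uptake and elimination driven by a pulsed exposure profile. The sketch below integrates that single equation with explicit Euler steps; the rate constants and exposure pattern are hypothetical, and the toxicodynamic (survival) part, e.g. a GUTS-type model, is not shown.

```python
import numpy as np

def internal_concentration(times, external, k_uptake, k_elimination):
    """One-compartment TK model: dC_int/dt = k_uptake*C_ext - k_elimination*C_int (explicit Euler)."""
    c_int = np.zeros_like(times)
    for i in range(1, len(times)):
        dt = times[i] - times[i - 1]
        c_int[i] = c_int[i - 1] + dt * (k_uptake * external[i - 1] - k_elimination * c_int[i - 1])
    return c_int

# Hypothetical pulsed exposure: 2 days at 1 ug/L, then clean water, hourly time steps over 10 days.
t = np.arange(0.0, 10.0, 1.0 / 24.0)
c_ext = np.where(t < 2.0, 1.0, 0.0)
c_int = internal_concentration(t, c_ext, k_uptake=5.0, k_elimination=0.5)
print(f"peak internal concentration ~ {c_int.max():.2f} ug/kg")
```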

  9. Modelling self-assessed vulnerability to HIV and its associated ...

    African Journals Online (AJOL)

    Background: Globally, individuals' self-assessment of vulnerability to HIV infection is important to maintain safer sexual behaviour and reduce risky behaviours. However, determinants of self-perceived risk of HIV infection are not well documented and differ. We assessed the level of self-perceived vulnerability to HIV ...

  10. Assessing the Needs of Adults for Continuing Education: A Model.

    Science.gov (United States)

    Moore, Donald E., Jr.

    1980-01-01

    Reviews the needs assessment studies described in this journal issue. Concludes that (1) lessons from completed needs assessments can help continuing education practitioners plan and conduct future studies, and (2) a rational, need-reduction, decision-making approach can improve continuing education programs. (CT)