WorldWideScience

Sample records for probabilistic source term

  1. Probabilistic source term predictions for use with decision support systems

    International Nuclear Information System (INIS)

    Grindon, E.; Kinniburgh, C.G.

    2003-01-01

    Full text: Decision Support Systems for use in off-site emergency management, following an incident at a Nuclear Power Plant (NPP) within Europe, are becoming accepted as a useful and appropriate tool to aid decision makers. An area which is not so well developed is the 'upstream' prediction of the source term released into the environment. Rapid prediction of this source term is crucial to the appropriate early management of a nuclear emergency. The initial source term prediction would today be typically based on simple tabulations taking little, or no, account of plant status. It is the interface between the inward looking plant control room team and the outward looking off-site emergency management team that needs to be addressed. This is not an easy proposition, as these two distinct disciplines have little common basis from which to communicate their immediate findings and concerns. Within the Euratom Fifth Framework Programme (FP5), complementary approaches are being developed for the pre-release stage, each based on software tools to help bridge this gap. Traditionally, source terms (or releases into the environment) provided for use with Decision Support Systems are estimated on a deterministic basis. These approaches use a single, deterministic assumption about plant status. The associated source term represents the 'best estimate' based on available information. No information is provided on the potential for uncertainty in the source term estimate. Using probabilistic methods, the outcome is typically a number of possible plant states, each with an associated source term and probability. These represent both the best estimate and the spread of the likely source term. However, this is a novel approach and the usefulness of such source term prediction tools is yet to be tested on a wide scale. The benefits of probabilistic source term estimation are presented here, using as an example the SPRINT tool developed within the FP5 STERPS project. System for the
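
    As a toy illustration of how such a probabilistic prediction might be summarised for decision makers, the Python sketch below (not the SPRINT software; all plant states, probabilities and release fractions are invented) condenses a handful of weighted plant states into a best-estimate release and a high-percentile value describing its spread.

      # Minimal sketch (not the SPRINT software): summarising a probabilistic source
      # term given as a set of possible plant states, each with a probability and an
      # associated release estimate. All numbers are hypothetical.
      import numpy as np

      states = {                              # state: (probability, I-131 release fraction)
          "containment intact":       (0.80, 1e-5),
          "late containment failure": (0.15, 5e-3),
          "containment bypass":       (0.05, 1e-1),
      }
      probs = np.array([p for p, _ in states.values()])
      releases = np.array([r for _, r in states.values()])

      best_estimate = np.average(releases, weights=probs)   # probability-weighted mean
      order = np.argsort(releases)
      cdf = np.cumsum(probs[order])
      p95 = releases[order][np.searchsorted(cdf, 0.95)]     # 95th percentile of the release

      print(f"best-estimate release fraction:   {best_estimate:.2e}")
      print(f"95th-percentile release fraction: {p95:.2e}")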

  2. Probabilistic Dose Assessment from SB-LOCA Accident in Ujung Lemahabang Using TMI-2 Source Term

    Directory of Open Access Journals (Sweden)

    Sunarko

    2017-01-01

    Full Text Available Probabilistic dose assessment and mapping for a nuclear accident condition are performed for the Ujung Lemahabang site in the Muria Peninsula region in Indonesia. The source term is obtained from inverse modeling of the Three-Mile Island unit 2 (TMI-2) PWR-type SB-LOCA reactor accident. The effluent consisted of Xe-133, Kr-88, I-131, and Cs-137 released from a 50 m stack. A Lagrangian Particle Dispersion Method (LPDM) and a 3-dimensional mass-consistent wind field are employed to obtain the surface-level time-integrated air concentration and the spatial distribution of ground-level total dose in dry conditions. Site-specific meteorological data are taken from hourly records obtained during the Site Feasibility Study period in Ujung Lemahabang. The effluent is released from a height of 50 meters at a uniform rate during a 6-hour period, and the dose is integrated over this period in a neutrally stable atmospheric condition. The maximum dose noted is below the regulatory limit of 1 mSv, and the radioactive plume is spread mostly to the W-SW inland and to the N-NE from the proposed plant towards the Java Sea. This paper demonstrates, for the first time for the Ujung Lemahabang region, a probabilistic analysis method for assessing the possible spatial dose distribution from a hypothetical release using a set of site-specific meteorological data.

  3. Review and evaluation of the Millstone Unit 3 probabilistic safety study. Containment failure modes, radiological source terms and offsite consequences

    International Nuclear Information System (INIS)

    Khatib-Rahbar, M.; Pratt, W.; Ludewig, H.

    1985-09-01

    A technical review and evaluation of the Millstone Unit 3 probabilistic safety study has been performed. It was determined that: (1) long-term damage indices (latent fatalities, person-rem, etc.) are dominated by late failure of the containment, and (2) short-term damage indices (early fatalities, etc.) are dominated by bypass sequences for internally initiated events, while severe seismic sequences can also contribute significantly to early damage indices. These overall estimates of severe accident risk are extremely low compared with other societal sources of risk. Furthermore, the risks for Millstone-3 are comparable to risks from other nuclear plants at high population sites. Seismically induced accidents dominate the severe accident risks at Millstone-3. Potential mitigative features were shown not to be cost-effective for internal events. A value-impact analysis for seismic events showed that a manually actuated containment spray system might be cost-effective.

  4. PARTITION: A program for defining the source term/consequence analysis interface in the NUREG-1150 probabilistic risk assessments

    International Nuclear Information System (INIS)

    Iman, R.L.; Helton, J.C.; Johnson, J.D.

    1990-05-01

    The individual plant analyses in the US Nuclear Regulatory Commission's reassessment of the risk from commercial nuclear power plants (NUREG-1150) consist of four parts: systems analysis, accident progression analysis, source term analysis, and consequence analysis. Careful definition of the interfaces between these parts is necessary for both information flow and computational efficiency. This document has been designed for users of the PARTITION computer program developed by the authors at Sandia National Laboratories for defining the interface between the source term analysis (performed with the XXSOR programs) and the consequence analysis (performed with the MACCS program). This report provides a tutorial that details how the interactive partitioning is performed, along with detailed information on the partitioning process. The PARTITION program was written in ANSI standard FORTRAN 77 to make the code as machine-independent (i.e., portable) as possible. 9 refs., 4 figs
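
    The abstract does not describe the partitioning algorithm itself. Purely as an illustration of the general idea of the source term/consequence interface, that is, reducing a very large set of source terms to a few representative groups before running consequence calculations, the hedged sketch below clusters invented source-term vectors with a plain k-means; this is an analogy, not the PARTITION procedure.

      # Illustrative only: grouping many source-term vectors into a few representative
      # bins before consequence calculations. This is NOT the PARTITION algorithm of
      # NUREG-1150; it merely conveys the idea of the interface it defines.
      import numpy as np

      rng = np.random.default_rng(0)
      # Invented source terms: release fractions for (noble gases, iodine, cesium)
      source_terms = rng.uniform(0.0, 1.0, size=(2000, 3))

      def kmeans(x, k, iters=50):
          centroids = x[rng.choice(len(x), k, replace=False)]
          for _ in range(iters):
              labels = np.argmin(((x[:, None, :] - centroids[None]) ** 2).sum(-1), axis=1)
              centroids = np.array([x[labels == j].mean(axis=0) if np.any(labels == j)
                                    else centroids[j] for j in range(k)])
          return centroids, labels

      representatives, labels = kmeans(source_terms, k=10)
      weights = np.bincount(labels, minlength=10) / len(labels)   # frequency weight per group
      print(representatives.round(2))
      print(weights.round(3))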

  5. Assessing the long-term probabilistic volcanic hazard for tephra fallout in Reykjavik, Iceland: a preliminary multi-source analysis

    Science.gov (United States)

    Tonini, Roberto; Barsotti, Sara; Sandri, Laura; Tumi Guðmundsson, Magnús

    2015-04-01

    Icelandic volcanism is largely dominated by basaltic magma. Nevertheless, the presence of glaciers over many Icelandic volcanic systems results in frequent phreatomagmatic eruptions and associated tephra production, making explosive eruptions the most common type of volcanic activity. Jökulhlaups are commonly considered the major volcanic hazard in Iceland for their high frequency and potentially very devastating local impact. Tephra fallout is also frequent and can impact larger areas. It is driven by the wind direction, which can change with both altitude and season, making it impossible to predict a priori where the tephra will be deposited during the next eruptions. Most of the volcanic activity in Iceland occurs in the central eastern part, over 100 km to the east of the main population centre around the capital Reykjavík. Therefore, the hazard from tephra fallout in Reykjavík is expected to be smaller than for communities settled near the main volcanic systems. However, within the framework of quantitative hazard and risk analyses, less frequent and/or less intense phenomena should not be neglected, since their risk evaluation depends on the effects suffered by the selected target. This is particularly true if the target is highly vulnerable, such as large urban areas or important infrastructures. In this work we present a preliminary analysis aiming to perform a Probabilistic Volcanic Hazard Assessment (PVHA) for tephra fallout focused on a target area which includes the municipality of Reykjavík and the Keflavík international airport. This approach reverses the more common perspective, where the hazard analysis is focused on the source (the volcanic system), and follows a multi-source approach: the idea is to quantify, homogeneously, the hazard due to the main hazardous volcanoes that could pose a tephra fallout threat to the municipality of Reykjavík and the Keflavík airport. PVHA for each volcanic system is calculated independently and the results

  6. Probabilistic forward model for electroencephalography source analysis

    International Nuclear Information System (INIS)

    Plis, Sergey M; George, John S; Jun, Sung C; Ranken, Doug M; Volegov, Petr L; Schmidt, David M

    2007-01-01

    Source localization by electroencephalography (EEG) requires an accurate model of head geometry and tissue conductivity. The estimation of source time courses from EEG or from EEG in conjunction with magnetoencephalography (MEG) requires a forward model consistent with true activity for the best outcome. Although MRI provides an excellent description of soft tissue anatomy, a high resolution model of the skull (the dominant resistive component of the head) requires CT, which is not justified for routine physiological studies. Although a number of techniques have been employed to estimate tissue conductivity, no present techniques provide the noninvasive 3D tomographic mapping of conductivity that would be desirable. We introduce a formalism for probabilistic forward modeling that allows the propagation of uncertainties in model parameters into possible errors in source localization. We consider uncertainties in the conductivity profile of the skull, but the approach is general and can be extended to other kinds of uncertainties in the forward model. We and others have previously suggested the possibility of extracting conductivity of the skull from measured electroencephalography data by simultaneously optimizing over dipole parameters and the conductivity values required by the forward model. Using Cramer-Rao bounds, we demonstrate that this approach does not improve localization results nor does it produce reliable conductivity estimates. We conclude that the conductivity of the skull has to be either accurately measured by an independent technique, or that the uncertainties in the conductivity values should be reflected in uncertainty in the source location estimates

  7. Is Probabilistic Evidence a Source of Knowledge?

    Science.gov (United States)

    Friedman, Ori; Turri, John

    2015-01-01

    We report a series of experiments examining whether people ascribe knowledge for true beliefs based on probabilistic evidence. Participants were less likely to ascribe knowledge for beliefs based on probabilistic evidence than for beliefs based on perceptual evidence (Experiments 1 and 2A) or testimony providing causal information (Experiment 2B).…

  8. Procedures for conducting probabilistic safety assessments of nuclear power plants (level 2). Accident progression, containment analysis and estimation of accident source terms

    International Nuclear Information System (INIS)

    1995-01-01

    The present publication on Level 2 PSA is based on a compilation and review of practices in various Member States. It complements Safety Series No. 50-P-4, issued in 1992, on Procedures for Conducting Probabilistic Safety Assessments of Nuclear Power Plants (Level 1). Refs, figs and tabs

  9. Fully probabilistic seismic source inversion – Part 1: Efficient parameterisation

    Directory of Open Access Journals (Sweden)

    S. C. Stähler

    2014-11-01

    Full Text Available Seismic source inversion is a non-linear problem in seismology where not just the earthquake parameters themselves but also estimates of their uncertainties are of great practical importance. Probabilistic source inversion (Bayesian inference) is well suited to this challenge, provided that the parameter space can be chosen small enough to make Bayesian sampling computationally feasible. We propose a framework for PRobabilistic Inference of Seismic source Mechanisms (PRISM) that parameterises and samples earthquake depth, moment tensor, and source time function efficiently by using information from previous non-Bayesian inversions. The source time function is expressed as a weighted sum of a small number of empirical orthogonal functions, which were derived from a catalogue of >1000 source time functions (STFs) by a principal component analysis. We use a likelihood model based on the cross-correlation misfit between observed and predicted waveforms. The resulting ensemble of solutions provides full uncertainty and covariance information for the source parameters, and permits propagating these source uncertainties into travel time estimates used for seismic tomography. The computational effort is such that routine, global estimation of earthquake mechanisms and source time functions from teleseismic broadband waveforms is feasible.
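
    The parameterisation step described above, expressing a source time function as a weighted sum of a few empirical orthogonal functions derived by principal component analysis, can be illustrated with synthetic data; the sketch below is a toy version and does not use the PRISM catalogue.

      # Toy version of the STF parameterisation described above: build a synthetic
      # catalogue of source time functions, extract empirical orthogonal functions by
      # principal component analysis (via SVD), and represent any STF by a few weights.
      import numpy as np

      rng = np.random.default_rng(1)
      t = np.linspace(0.0, 30.0, 300)                  # time axis in seconds

      def synthetic_stf(duration, rise):
          """A one-sided pulse standing in for a real source time function."""
          s = np.where(t < duration, np.sin(np.pi * t / duration) ** rise, 0.0)
          return s / (s.sum() * (t[1] - t[0]))         # normalise to unit moment

      catalogue = np.array([synthetic_stf(rng.uniform(5, 25), rng.integers(1, 4))
                            for _ in range(1000)])

      mean_stf = catalogue.mean(axis=0)
      _, _, Vt = np.linalg.svd(catalogue - mean_stf, full_matrices=False)
      eofs = Vt[:5]                                    # first 5 empirical orthogonal functions

      # Any STF in the catalogue is now summarised by 5 weights instead of 300 samples:
      weights = (catalogue[0] - mean_stf) @ eofs.T
      reconstruction = mean_stf + weights @ eofs
      print("max reconstruction error:", np.abs(reconstruction - catalogue[0]).max())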

  10. Probabilistic methods applied to electric source problems in nuclear safety

    International Nuclear Information System (INIS)

    Carnino, A.; Llory, M.

    1979-01-01

    Nuclear safety assessments are frequently required to quantify safety margins and evaluate hazards. For this purpose, probabilistic methods have proved to be the most promising. Without completely replacing deterministic safety analysis, they are now commonly used at the reliability or availability assessment stage of systems, as well as for determining the likely accident sequences. In this paper an application related to the problem of electric power sources is described, along with the methods used: the calculation of the probability of losing all the electric power sources of a pressurized water nuclear power station, the evaluation of the reliability of the diesel generators by failure event trees, and the determination of the accident sequences which could be initiated by the 'total loss of electric sources' event and affect the installation or the environment [fr]

  11. Chernobyl source term estimation

    International Nuclear Information System (INIS)

    Gudiksen, P.H.; Harvey, T.F.; Lange, R.

    1990-09-01

    The Chernobyl source term available for long-range transport was estimated by integration of radiological measurements with atmospheric dispersion modeling and by reactor core radionuclide inventory estimation in conjunction with WASH-1400 release fractions associated with specific chemical groups. The model simulations revealed that the radioactive cloud became segmented during the first day, with the lower section heading toward Scandinavia and the upper part heading in a southeasterly direction with subsequent transport across Asia to Japan, the North Pacific, and the west coast of North America. By optimizing the agreement between the observed cloud arrival times and duration of peak concentrations measured over Europe, Japan, Kuwait, and the US with the model-predicted concentrations, it was possible to derive source term estimates for those radionuclides measured in airborne radioactivity. This was extended to radionuclides that were largely unmeasured in the environment by performing a reactor core radionuclide inventory analysis to obtain release fractions for the various chemical transport groups. These analyses indicated that essentially all of the noble gases, 60% of the radioiodines, 40% of the radiocesium, 10% of the tellurium and about 1% or less of the more refractory elements were released. These estimates are in excellent agreement with those obtained on the basis of worldwide deposition measurements. The Chernobyl source term was several orders of magnitude greater than those associated with the Windscale and TMI reactor accidents. However, the Cs-137 from the Chernobyl event is about 6% of that released by the US and USSR atmospheric nuclear weapon tests, while the I-131 and Sr-90 released by the Chernobyl accident were only about 0.1% of that released by the weapon tests. 13 refs., 2 figs., 7 tabs
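
    The release fractions quoted above turn into a source term once a core inventory is supplied; the sketch below shows that arithmetic with round, illustrative inventory numbers rather than the actual Chernobyl inventory.

      # Worked illustration of "released activity = core inventory x release fraction",
      # using the chemical-group release fractions quoted in the abstract. The inventory
      # values are round, illustrative numbers, not the actual Chernobyl core inventory.
      release_fraction = {            # from the abstract above
          "noble gases": 1.00,
          "radioiodine": 0.60,
          "radiocesium": 0.40,
          "tellurium":   0.10,
          "refractory":  0.01,
      }
      core_inventory_bq = {           # illustrative inventories, in becquerels
          "noble gases": 6.5e18,
          "radioiodine": 3.2e18,
          "radiocesium": 2.8e17,
          "tellurium":   2.7e18,
          "refractory":  1.0e19,
      }

      for group, fraction in release_fraction.items():
          released = fraction * core_inventory_bq[group]
          print(f"{group:12s} released activity ~ {released:.2e} Bq")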

  12. Some practical implications of source term reassessment

    International Nuclear Information System (INIS)

    1988-03-01

    This report provides a brief summary of the current knowledge of severe accident source terms and suggests how this knowledge might be applied to a number of specific aspects of reactor safety. In preparing the report, consideration has been restricted to source term issues relating to light water reactors (LWRs). Consideration has also generally been restricted to the consequences of hypothetical severe accidents rather than their probability of occurrence, although it is recognized that, in the practical application of source term research, it is necessary to take account of probability as well as consequences. The specific areas identified were as follows: Exploration of the new insights that are available into the management of severe accidents; Investigating the impact of source term research on emergency planning and response; Assessing the possibilities which exist in present reactor designs for preventing or mitigating the consequences of severe accidents and how these might be used effectively; Exploring the need for backfitting and assessing the implications of source term research for future designs; and Improving the quantification of the radiological consequences of hypothetical severe accidents for probabilistic safety assessments (PSAs) and informing the public about the realistic risks associated with nuclear power plants. 7 refs

  13. Advanced neutron source reactor probabilistic flow blockage assessment

    International Nuclear Information System (INIS)

    Ramsey, C.T.

    1995-08-01

    The Phase I Level I Probabilistic Risk Assessment (PRA) of the conceptual design of the Advanced Neutron Source (ANS) Reactor identified core flow blockage as the most likely internal event leading to fuel damage. The flow blockage event frequency used in the original ANS PRA was based primarily on the flow blockage work done for the High Flux Isotope Reactor (HFIR) PRA. This report examines potential flow blockage scenarios and calculates an estimate of the likelihood of debris-induced fuel damage. The bulk of the report is based specifically on the conceptual design of ANS with a 93%-enriched, two-element core; insights into the impact of the proposed three-element core are examined in Sect. 5. In addition to providing a probability (uncertainty) distribution for the likelihood of core flow blockage, this ongoing effort will serve to indicate potential areas of concern to be focused on in the preliminary design for elimination or mitigation. It will also serve as a loose-parts management tool.

  14. A global probabilistic tsunami hazard assessment from earthquake sources

    Science.gov (United States)

    Davies, Gareth; Griffin, Jonathan; Lovholt, Finn; Glimsdal, Sylfest; Harbitz, Carl; Thio, Hong Kie; Lorito, Stefano; Basili, Roberto; Selva, Jacopo; Geist, Eric L.; Baptista, Maria Ana

    2017-01-01

    Large tsunamis occur infrequently but have the capacity to cause enormous numbers of casualties, damage to the built environment and critical infrastructure, and economic losses. A sound understanding of tsunami hazard is required to underpin management of these risks, and while tsunami hazard assessments are typically conducted at regional or local scales, globally consistent assessments are required to support international disaster risk reduction efforts, and can serve as a reference for local and regional studies. This study presents a global-scale probabilistic tsunami hazard assessment (PTHA), extending previous global-scale assessments based largely on scenario analysis. Only earthquake sources are considered, as they represent about 80% of the recorded damaging tsunami events. Globally extensive estimates of tsunami run-up height are derived at various exceedance rates, and the associated uncertainties are quantified. Epistemic uncertainties in the exceedance rates of large earthquakes often lead to large uncertainties in tsunami run-up. Deviations between modelled tsunami run-up and event observations are quantified, and found to be larger than suggested in previous studies. Accounting for these deviations in PTHA is important, as it leads to a pronounced increase in predicted tsunami run-up for a given exceedance rate.

  15. Fission-product source terms

    International Nuclear Information System (INIS)

    Lorenz, R.A.

    1981-01-01

    This presentation consists of a review of fission-product source terms for light water reactor (LWR) fuel. A source term is the quantity of fission products released under specified conditions that can be used to calculate the consequences of the release. The source term usually defines release from breached fuel-rod cladding but could also describe release from the primary coolant system, the reactor containment shell, or the site boundary. The source term would be different for each locality, and the chemical and physical forms of the fission products could also differ

  16. A probabilistic analysis of cumulative carbon emissions and long-term planetary warming

    International Nuclear Information System (INIS)

    Fyke, Jeremy; Matthews, H Damon

    2015-01-01

    Efforts to mitigate and adapt to long-term climate change could benefit greatly from probabilistic estimates of cumulative carbon emissions due to fossil fuel burning and resulting CO2-induced planetary warming. Here we demonstrate the use of a reduced-form model to project these variables. We performed simulations using a large-ensemble framework with parametric uncertainty sampled to produce distributions of future cumulative emissions and consequent planetary warming. A hind-cast ensemble of simulations captured 1980–2012 historical CO2 emissions trends and an ensemble of future projection simulations generated a distribution of emission scenarios that qualitatively resembled the suite of Representative and Extended Concentration Pathways. The resulting cumulative carbon emission and temperature change distributions are characterized by 5–95th percentile ranges of 0.96–4.9 teratonnes C (Tt C) and 1.4 °C–8.5 °C, respectively, with 50th percentiles at 3.1 Tt C and 4.7 °C. Within the wide range of policy-related parameter combinations that produced these distributions, we found that low-emission simulations were characterized by both high carbon prices and low costs of non-fossil fuel energy sources, suggesting the importance of these two policy levers in particular for avoiding dangerous levels of climate warming. With this analysis we demonstrate a probabilistic approach to the challenge of identifying strategies for limiting cumulative carbon emissions and assessing likelihoods of surpassing dangerous temperature thresholds. (letter)
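
    The percentile summaries reported above are produced by a large ensemble in which uncertain parameters are sampled and the model is run for each draw; a minimal sketch of that Monte Carlo workflow, with an invented toy model in place of the authors' reduced-form carbon-climate model, is given below.

      # Minimal Monte Carlo sketch of the ensemble workflow described above: sample the
      # uncertain parameters, run a model for each draw, and summarise the outputs by
      # percentiles. The two-line "model" here is an invented placeholder, not the
      # reduced-form carbon-climate model used in the paper.
      import numpy as np

      rng = np.random.default_rng(42)
      n = 100_000

      carbon_price_growth = rng.uniform(0.0, 0.08, n)            # invented policy lever
      non_fossil_cost_decline = rng.uniform(0.0, 0.05, n)        # invented policy lever
      climate_sensitivity = rng.lognormal(np.log(3.0), 0.3, n)   # degC per CO2 doubling

      cumulative_emissions_TtC = 5.0 * np.exp(-20.0 * (carbon_price_growth
                                                       + non_fossil_cost_decline))
      warming_degC = 0.5 * climate_sensitivity * cumulative_emissions_TtC

      for name, x in (("cumulative emissions (Tt C)", cumulative_emissions_TtC),
                      ("warming (degC)", warming_degC)):
          p5, p50, p95 = np.percentile(x, [5, 50, 95])
          print(f"{name}: 5th = {p5:.2f}, median = {p50:.2f}, 95th = {p95:.2f}")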

  17. Design parameters and source terms: Volume 3, Source terms

    International Nuclear Information System (INIS)

    1987-10-01

    The Design Parameters and Source Terms Document was prepared in accordance with DOE request and to provide data for the environmental impact study to be performed in the future for the Deaf Smith County, Texas site for a nuclear waste repository in salt. This document updates a previous unpublished report by Stearns Catalytic Corporation (SCC), entitled ''Design Parameters and Source Terms for a Two-Phase Repository in Salt,'' 1985, to the level of the Site Characterization Plan - Conceptual Design Report. The previous unpublished SCC Study identifies the data needs for the Environmental Assessment effort for seven possible Salt Repository sites. 11 refs., 9 tabs

  18. Some problems in the categorization of source terms

    International Nuclear Information System (INIS)

    Abbey, F.; Dunbar, I.H.; Hayns, M.R.; Nixon, W.

    1985-01-01

    In recent years techniques for calculating source terms have been considerably improved. It would be unfortunate if the new information were to be blurred by the use of old schemes for the categorization of source terms. In the past, categorization schemes have been devised without the question of the general principles of categorization and the available options being addressed explicitly. In this paper these principles are set out, providing a framework within which categorization schemes used in past probabilistic risk assessments and possible future improvements are discussed. In particular, the use of input from scoping consequence calculations in deciding how to group source terms, and the question of how modelling uncertainties may be expressed as uncertainties in the final category source terms, are considered.

  19. A Probabilistic Palimpsest Model of Visual Short-term Memory

    Science.gov (United States)

    Matthey, Loic; Bays, Paul M.; Dayan, Peter

    2015-01-01

    Working memory plays a key role in cognition, and yet its mechanisms remain much debated. Human performance on memory tasks is severely limited; however, the two major classes of theory explaining the limits leave open questions about key issues such as how multiple simultaneously-represented items can be distinguished. We propose a palimpsest model, with the occurrent activity of a single population of neurons coding for several multi-featured items. Using a probabilistic approach to storage and recall, we show how this model can account for many qualitative aspects of existing experimental data. In our account, the underlying nature of a memory item depends entirely on the characteristics of the population representation, and we provide analytical and numerical insights into critical issues such as multiplicity and binding. We consider representations in which information about individual feature values is partially separate from the information about binding that creates single items out of multiple features. An appropriate balance between these two types of information is required to capture fully the different types of error seen in human experimental data. Our model provides the first principled account of misbinding errors. We also suggest a specific set of stimuli designed to elucidate the representations that subjects actually employ. PMID:25611204

  20. From sub-source to source: Interpreting results of biological trace investigations using probabilistic models

    NARCIS (Netherlands)

    Oosterman, W.T.; Kokshoorn, B.; Maaskant-van Wijk, P.A.; de Zoete, J.

    2015-01-01

    The current method of reporting a putative cell type is based on a non-probabilistic assessment of test results by the forensic practitioner. Additionally, the association between donor and cell type in mixed DNA profiles can be exceedingly complex. We present a probabilistic model for

  1. Improved Point-source Detection in Crowded Fields Using Probabilistic Cataloging

    Science.gov (United States)

    Portillo, Stephen K. N.; Lee, Benjamin C. G.; Daylan, Tansu; Finkbeiner, Douglas P.

    2017-10-01

    Cataloging is challenging in crowded fields because sources are extremely covariant with their neighbors and blending makes even the number of sources ambiguous. We present the first optical probabilistic catalog, cataloging a crowded (~0.1 sources per pixel brighter than 22nd mag in F606W) Sloan Digital Sky Survey r-band image from M2. Probabilistic cataloging returns an ensemble of catalogs inferred from the image and thus can capture source-source covariance and deblending ambiguities. By comparing to a traditional catalog of the same image and a Hubble Space Telescope catalog of the same region, we show that our catalog ensemble better recovers sources from the image. It goes more than a magnitude deeper than the traditional catalog while having a lower false-discovery rate brighter than 20th mag. We also present an algorithm for reducing this catalog ensemble to a condensed catalog that is similar to a traditional catalog, except that it explicitly marginalizes over source-source covariances and nuisance parameters. We show that this condensed catalog has a similar completeness and false-discovery rate to the catalog ensemble. Future telescopes will be more sensitive, and thus more of their images will be crowded. Probabilistic cataloging performs better than existing software in crowded fields and so should be considered when creating photometric pipelines in the Large Synoptic Survey Telescope era.

  2. A probabilistic justification for using tf.idf term weighting in information retrieval

    NARCIS (Netherlands)

    Hiemstra, Djoerd

    2000-01-01

    This paper presents a new probabilistic model of information retrieval. The most important modeling assumption made is that documents and queries are defined by an ordered sequence of single terms. This assumption is not made in well known existing models of information retrieval, but is essential
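
    For readers unfamiliar with the weighting scheme being justified, a small self-contained tf.idf computation in its standard textbook form is sketched below; the paper's probabilistic derivation is not reproduced.

      # Small self-contained tf.idf example (standard textbook weighting, shown only to
      # make the term-weighting scheme discussed above concrete; this is not the
      # probabilistic derivation given in the paper).
      import math
      from collections import Counter

      docs = [
          "probabilistic model of information retrieval",
          "term weighting in information retrieval",
          "probabilistic source term estimation",
      ]
      tokenized = [d.split() for d in docs]
      N = len(tokenized)
      df = Counter(term for doc in tokenized for term in set(doc))

      def tfidf(doc):
          tf = Counter(doc)
          return {t: tf[t] * math.log(N / df[t]) for t in tf}

      for doc in tokenized:
          print(tfidf(doc))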

  3. Probabilistic blind deconvolution of non-stationary sources

    DEFF Research Database (Denmark)

    Olsson, Rasmus Kongsgaard; Hansen, Lars Kai

    2004-01-01

    We solve a class of blind signal separation problems using a constrained linear Gaussian model. The observed signal is modelled by a convolutive mixture of colored noise signals with additive white noise. We derive a time-domain EM algorithm `KaBSS' which estimates the source signals...

  4. Pennsylvania Source Term Tracking System

    International Nuclear Information System (INIS)

    1992-08-01

    The Pennsylvania Source Term Tracking System tabulates surveys received from radioactive waste generators in the Commonwealth of Pennsylvania. Information on radioactive waste is collected each quarter from generators using the Low-Level Radioactive Waste Management Quarterly Report Form (hereafter called the survey) and then entered into the tracking system data base. This personal computer-based tracking system can generate 12 types of tracking reports. The first four sections of this reference manual supply complete instructions for installing and setting up the tracking system on a PC. Section 5 presents instructions for entering quarterly survey data, and Section 6 discusses generating reports. The appendix includes samples of each report.

  5. Severe accident source term reassessment

    International Nuclear Information System (INIS)

    Hazzan, M.J.; Gardner, R.; Warman, E.A.; Jacobs, S.B.

    1987-01-01

    This paper summarizes the status of the reassessment of severe reactor accident source terms, which are defined as the quantity, type, and timing of fission product releases from such accidents. The focus is on the major results and conclusions of analyses with modern methods for both pressurized water reactors (PWRs) and boiling water reactors (BWRs), and the special case of containment bypass. Some distinctions are drawn between analyses for PWRs and BWRs. In general, the more the matter is examined, the smaller the consequences, or the probability of serious consequences, appear to be. (author)

  6. Development of a Risk-Based Probabilistic Performance-Assessment Method for Long-Term Cover Systems - 2nd Edition

    International Nuclear Information System (INIS)

    HO, CLIFFORD K.; ARNOLD, BILL W.; COCHRAN, JOHN R.; TAIRA, RANDAL Y.

    2002-01-01

    A probabilistic, risk-based performance-assessment methodology has been developed to assist designers, regulators, and stakeholders in the selection, design, and monitoring of long-term covers for contaminated subsurface sites. This report describes the method, the software tools that were developed, and an example that illustrates the probabilistic performance-assessment method using a repository site in Monticello, Utah. At the Monticello site, a long-term cover system is being used to isolate long-lived uranium mill tailings from the biosphere. Computer models were developed to simulate relevant features, events, and processes that include water flux through the cover, source-term release, vadose-zone transport, saturated-zone transport, gas transport, and exposure pathways. The component models were then integrated into a total-system performance-assessment model, and uncertainty distributions of important input parameters were constructed and sampled in a stochastic Monte Carlo analysis. Multiple realizations were simulated using the integrated model to produce cumulative distribution functions of the performance metrics, which were used to assess cover performance for both present- and long-term future conditions. Performance metrics for this study included the water percolation reaching the uranium mill tailings, radon gas flux at the surface, groundwater concentrations, and dose. Results from uncertainty analyses, sensitivity analyses, and alternative design comparisons are presented for each of the performance metrics. The benefits from this methodology include a quantification of uncertainty, the identification of parameters most important to performance (to prioritize site characterization and monitoring activities), and the ability to compare alternative designs using probabilistic evaluations of performance (for cost savings)

  7. Probabilistic tsunami hazard assessment at Seaside, Oregon, for near- and far-field seismic sources

    Science.gov (United States)

    Gonzalez, F.I.; Geist, E.L.; Jaffe, B.; Kanoglu, U.; Mofjeld, H.; Synolakis, C.E.; Titov, V.V.; Areas, D.; Bellomo, D.; Carlton, D.; Horning, T.; Johnson, J.; Newman, J.; Parsons, T.; Peters, R.; Peterson, C.; Priest, G.; Venturato, A.; Weber, J.; Wong, F.; Yalciner, A.

    2009-01-01

    The first probabilistic tsunami flooding maps have been developed. The methodology, called probabilistic tsunami hazard assessment (PTHA), integrates tsunami inundation modeling with methods of probabilistic seismic hazard assessment (PSHA). Application of the methodology to Seaside, Oregon, has yielded estimates of the spatial distribution of 100- and 500-year maximum tsunami amplitudes, i.e., amplitudes with 1% and 0.2% annual probability of exceedance. The 100-year tsunami is generated most frequently by far-field sources in the Alaska-Aleutian Subduction Zone and is characterized by maximum amplitudes that do not exceed 4 m, with an inland extent of less than 500 m. In contrast, the 500-year tsunami is dominated by local sources in the Cascadia Subduction Zone and is characterized by maximum amplitudes in excess of 10 m and an inland extent of more than 1 km. The primary sources of uncertainty in these results include those associated with interevent time estimates, modeling of background sea level, and accounting for temporal changes in bathymetry and topography. Nonetheless, PTHA represents an important contribution to tsunami hazard assessment techniques; viewed in the broader context of risk analysis, PTHA provides a method for quantifying estimates of the likelihood and severity of the tsunami hazard, which can then be combined with vulnerability and exposure to yield estimates of tsunami risk. Copyright 2009 by the American Geophysical Union.
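
    The return-period language used above maps directly onto annual exceedance probabilities; the short computation below shows the relation, together with the chance of at least one exceedance over a 50-year window under the standard assumption of independent years (a generic illustration, not part of this study).

      # Relation between return period, annual exceedance probability, and the
      # probability of at least one exceedance in a time window, assuming independent
      # years (a standard simplification, not specific to this study).
      def prob_at_least_one(annual_p, years):
          return 1.0 - (1.0 - annual_p) ** years

      for return_period in (100, 500):
          annual_p = 1.0 / return_period        # e.g. a 100-year event -> 1% per year
          print(f"{return_period}-year tsunami: annual P = {annual_p:.3%}, "
                f"P(>=1 in 50 years) = {prob_at_least_one(annual_p, 50):.1%}")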

  8. From probabilistic forecasts to statistical scenarios of short-term wind power production

    DEFF Research Database (Denmark)

    Pinson, Pierre; Papaefthymiou, George; Klockl, Bernd

    2009-01-01

    Short-term (up to 2-3 days ahead) probabilistic forecasts of wind power provide forecast users with highly valuable information on the uncertainty of expected wind generation. Whatever the type of these probabilistic forecasts, they are produced on a per horizon basis, and hence do not inform on the development of the forecast uncertainty through forecast series. However, this additional information may be paramount for a large class of time-dependent and multistage decision-making problems, e.g. optimal operation of combined wind-storage systems or multiple-market trading with different gate closures. This issue is addressed here by describing a method that permits the generation of statistical scenarios of short-term wind generation that accounts for both the interdependence structure of prediction errors and the predictive distributions of wind power production. The method is based on the conversion...
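
    The 'conversion' mentioned at the end of the abstract can be illustrated with a Gaussian-copula-style construction: sample temporally correlated Gaussian variables, map them to uniforms, and push them through per-horizon predictive distributions. The sketch below uses an assumed exponential correlation structure and toy Beta predictive marginals, not the model fitted in the paper.

      # Hedged sketch of turning per-horizon probabilistic forecasts into temporally
      # correlated scenarios via a Gaussian copula. The exponential correlation
      # structure and the toy predictive distributions are assumptions for
      # illustration, not the model fitted in the paper.
      import numpy as np
      from scipy.stats import norm, beta

      rng = np.random.default_rng(7)
      H = 24                                    # forecast horizons (hours ahead)
      n_scenarios = 10

      # Toy predictive marginals: normalised wind power ~ Beta(a_h, b_h) per horizon
      a = 2.0 + np.sin(np.linspace(0, np.pi, H))
      b = 3.0 * np.ones(H)

      # Assumed interdependence of forecast errors across horizons (exponential decay)
      lags = np.abs(np.subtract.outer(np.arange(H), np.arange(H)))
      corr = np.exp(-lags / 6.0)

      # 1) sample correlated Gaussians, 2) map to uniforms, 3) map through marginals
      z = rng.multivariate_normal(np.zeros(H), corr, size=n_scenarios)
      u = norm.cdf(z)
      scenarios = beta.ppf(u, a, b)             # shape (n_scenarios, H), values in [0, 1]
      print(scenarios.round(2))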

  9. Very Short-term Nonparametric Probabilistic Forecasting of Renewable Energy Generation - with Application to Solar Energy

    DEFF Research Database (Denmark)

    Golestaneh, Faranak; Pinson, Pierre; Gooi, Hoay Beng

    2016-01-01

    Due to the inherent uncertainty involved in renewable energy forecasting, uncertainty quantification is a key input to maintain acceptable levels of reliability and profitability in power system operation. A proposal is formulated and evaluated here for the case of solar power generation, when only...... approach to generate very short-term predictive densities, i.e., for lead times between a few minutes to one hour ahead, with fast frequency updates. We rely on an Extreme Learning Machine (ELM) as a fast regression model, trained in varied ways to obtain both point and quantile forecasts of solar power...... generation. Four probabilistic methods are implemented as benchmarks. Rival approaches are evaluated based on a number of test cases for two solar power generation sites in different climatic regions, allowing us to show that our approach results in generation of skilful and reliable probabilistic forecasts...

  10. Very-short-term wind power probabilistic forecasts by sparse vector autoregression

    DEFF Research Database (Denmark)

    Dowell, Jethro; Pinson, Pierre

    2016-01-01

    A spatio-temporal method for producing very-short-term parametric probabilistic wind power forecasts at a large number of locations is presented. Smart grids containing tens, or hundreds, of wind generators require skilled very-short-term forecasts to operate effectively, and spatial information...... is highly desirable. In addition, probabilistic forecasts are widely regarded as necessary for optimal power system management as they quantify the uncertainty associated with point forecasts. Here we work within a parametric framework based on the logit-normal distribution and forecast its parameters....... The location parameter for multiple wind farms is modelled as a vector-valued spatiotemporal process, and the scale parameter is tracked by modified exponential smoothing. A state-of-the-art technique for fitting sparse vector autoregressive models is employed to model the location parameter and demonstrates...

  11. Estimation of Source terms for Emergency Planning and Preparedness

    Energy Technology Data Exchange (ETDEWEB)

    Yi, Chul Un; Chung, Bag Soon; Ahn, Jae Hyun; Yoon, Duk Ho; Jeong, Chul Young; Lim, Jong Dae [Korea Electric Power Research Institute, Taejon (Korea, Republic of); Kang, Sun Gu; Suk, Ho; Park, Sung Kyu; Lim, Hac Kyu; Lee, Kwang Nam [Korea Power Engineering Company Consulting and Architecture Engineers, (Korea, Republic of)

    1997-12-31

    In this study, the severe accident sequences for each plant of concern (accident sequences with a high core damage frequency and significant accident consequences) were selected based on the results of probabilistic safety assessments, and the source terms and time-histories of various safety parameters under severe accidents were evaluated. Accident progression analysis for each selected accident sequence was performed with the MAAP code. It was determined that the measured values of dose rate and radioisotope concentration could provide information to the operators on the occurrence and timing of core damage, reactor vessel failure, and containment failure during severe accidents. The radioisotope concentration in the containment atmosphere, which may be measured by PASS, was estimated. The radioisotope concentration can be used in emergency planning, evaluation of source term behavior in the containment, estimation of the degree of core damage, analysis of severe accident phenomena and core damage timing, and estimation of the amount of radioisotopes released to the environment. (author). 50 refs., 60 figs.

  12. Uncertainty Quantification in Earthquake Source Characterization with Probabilistic Centroid Moment Tensor Inversion

    Science.gov (United States)

    Dettmer, J.; Benavente, R. F.; Cummins, P. R.

    2017-12-01

    This work considers probabilistic, non-linear centroid moment tensor inversion of data from earthquakes at teleseismic distances. The moment tensor is treated as deviatoric and centroid location is parametrized with fully unknown latitude, longitude, depth and time delay. The inverse problem is treated as fully non-linear in a Bayesian framework and the posterior density is estimated with interacting Markov chain Monte Carlo methods which are implemented in parallel and allow for chain interaction. The source mechanism and location, including uncertainties, are fully described by the posterior probability density, and complex trade-offs between various metrics are studied. These include the percentage of double-couple component as well as fault orientation, and the probabilistic results are compared to results from earthquake catalogs. Additional focus is on the analysis of complex events which are commonly not well described by a single point source. These events are studied by jointly inverting for multiple centroid moment tensor solutions. The optimal number of sources is estimated by the Bayesian information criterion to ensure parsimonious solutions. [Supported by NSERC.]

  13. Very short-term probabilistic forecasting of wind power with generalized logit-Normal distributions

    DEFF Research Database (Denmark)

    Pinson, Pierre

    2012-01-01

    Very-short-term probabilistic forecasts, which are essential for an optimal management of wind generation, ought to account for the non-linear and double-bounded nature of that stochastic process. They take here the form of discrete–continuous mixtures of generalized logit–normal distributions and probability masses at the bounds. Both auto-regressive and conditional parametric auto-regressive models are considered for the dynamics of their location and scale parameters. Estimation is performed in a recursive least squares framework with exponential forgetting. The superiority of this proposal over...
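
    Recursive least squares with exponential forgetting, named above as the estimation scheme, takes only a few lines; the sketch below tracks the coefficients of a simple auto-regressive model on toy data and is a generic illustration rather than the paper's generalized logit-normal model.

      # Generic recursive least squares (RLS) with exponential forgetting, the kind of
      # scheme named above for tracking slowly varying model parameters. This sketch
      # fits a simple AR(2) model to a toy signal; it is not the paper's generalized
      # logit-normal forecasting model.
      import numpy as np

      rng = np.random.default_rng(3)
      T = 500
      y = np.zeros(T)
      for t in range(2, T):                    # toy AR(2) data with drifting dynamics
          phi1 = 0.6 + 0.2 * np.sin(2 * np.pi * t / T)
          y[t] = phi1 * y[t - 1] - 0.3 * y[t - 2] + 0.1 * rng.standard_normal()

      lam = 0.98                               # forgetting factor (0 < lam <= 1)
      theta = np.zeros(2)                      # AR coefficient estimates
      P = np.eye(2) * 1e3                      # inverse information matrix

      for t in range(2, T):
          x = np.array([y[t - 1], y[t - 2]])   # regressor
          err = y[t] - x @ theta
          k = P @ x / (lam + x @ P @ x)        # gain
          theta = theta + k * err
          P = (P - np.outer(k, x) @ P) / lam

      print("final AR coefficient estimates:", theta.round(3))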

  14. A probabilistic framework for acoustic emission source localization in plate-like structures

    International Nuclear Information System (INIS)

    Dehghan Niri, E; Salamone, S

    2012-01-01

    This paper proposes a probabilistic approach for acoustic emission (AE) source localization in isotropic plate-like structures based on an extended Kalman filter (EKF). The proposed approach consists of two main stages. During the first stage, time-of-flight (TOF) measurements of Lamb waves are carried out by a continuous wavelet transform (CWT), accounting for systematic errors due to the Heisenberg uncertainty; the second stage uses an EKF to iteratively estimate the AE source location and the wave velocity. The advantages of the proposed algorithm over the traditional methods include the capability of: (1) taking into account uncertainties in TOF measurements and wave velocity and (2) efficiently fusing multi-sensor data to perform AE source localization. The performance of the proposed approach is validated through pencil-lead breaks performed on an aluminum plate at systematic grid locations. The plate was instrumented with an array of four piezoelectric transducers in two different configurations. (paper)
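
    The second stage described above, iteratively refining the source location and wave velocity from time-of-flight measurements, can be illustrated with a compact iterated extended Kalman filter update; the sensor layout, noise level, and the simplifying assumption that the emission time is known are inventions for this sketch, not the authors' setup.

      # Hedged sketch of the idea in stage two above: an iterated extended Kalman filter
      # update that refines an acoustic-emission source location and wave velocity from
      # time-of-flight (TOF) measurements. Sensor layout, noise level and the assumption
      # that the emission time is known (so TOF is observed directly) are simplifications
      # for illustration only.
      import numpy as np

      sensors = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])  # m
      true_state = np.array([0.31, 0.62, 5100.0])                           # x, y (m), v (m/s)
      rng = np.random.default_rng(5)

      def h(state):
          """Predicted TOF from the source to every sensor."""
          d = np.linalg.norm(sensors - state[:2], axis=1)
          return d / state[2]

      def jacobian(state):
          d = np.linalg.norm(sensors - state[:2], axis=1)
          return np.column_stack([(state[0] - sensors[:, 0]) / (d * state[2]),
                                  (state[1] - sensors[:, 1]) / (d * state[2]),
                                  -d / state[2] ** 2])

      z = h(true_state) + rng.normal(0.0, 2e-6, len(sensors))   # noisy TOF measurements
      R = np.eye(len(sensors)) * (2e-6) ** 2

      x0 = np.array([0.5, 0.5, 4000.0])        # prior guess: plate centre, rough velocity
      P = np.diag([0.1, 0.1, 4e6])             # prior covariance
      x = x0.copy()
      for _ in range(10):                      # iterated EKF measurement update
          H = jacobian(x)
          K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
          x = x0 + K @ (z - h(x) - H @ (x0 - x))
      P = (np.eye(3) - K @ H) @ P              # posterior covariance after the update

      print("estimated source (m):", x[:2].round(3), " velocity (m/s):", round(x[2], 1))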

  15. Scenario for a Short-Term Probabilistic Seismic Hazard Assessment (PSHA) in Chiayi, Taiwan

    Directory of Open Access Journals (Sweden)

    Chung-Han Chan

    2013-01-01

    Full Text Available Using seismic activity and the Meishan earthquake sequence that occurred from 1904 to 1906, a scenario for short-term probabilistic seismic hazards in the Chiayi region of Taiwan is assessed. The long-term earthquake occurrence rate in Taiwan was evaluated using a smoothing kernel. The highest seismicity rate was calculated around the Chiayi region. To consider earthquake interactions, the rate-and-state friction model was introduced to estimate the seismicity rate evolution due to the Coulomb stress change. As imparted by the 1904 Touliu earthquake, the stress changes near the 1906 Meishan and Yangshuigang epicenters were higher than the magnitude of tidal triggering. With regard to the impact of the Meishan earthquake, the region close to the Yangshuigang earthquake epicenter had a +0.75 bar stress increase. The results indicated significant interaction between the three damaging events. Considering the path and site effects using ground motion prediction equations, a probabilistic seismic hazard, in the form of a hazard evolution and a hazard map, was assessed. A significant elevation in hazard following the three earthquakes in the sequence was determined. The results illustrate a possible scenario for seismic hazards in the Chiayi region which may take place repeatedly in the future. Such a scenario provides essential information on earthquake preparation, devastation estimation, emergency sheltering, utility restoration, and structure reconstruction.
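
    The rate-and-state seismicity-rate evolution mentioned above is commonly written in the Dieterich (1994) form shown below; the parameter values in the sketch are illustrative and are not those calibrated for the Chiayi region.

      # Hedged sketch of the rate-and-state seismicity-rate evolution used above to map a
      # Coulomb stress change into a time-varying earthquake rate (the commonly used
      # Dieterich-1994 form; parameter values are illustrative, not those calibrated for
      # the Chiayi region).
      import numpy as np

      def seismicity_rate(t_years, r_background, dCFS_bar, A_sigma_bar=0.4, t_a_years=10.0):
          """R(t) = r / ((exp(-dCFS/Asigma) - 1) * exp(-t/ta) + 1)."""
          gamma = (np.exp(-dCFS_bar / A_sigma_bar) - 1.0) * np.exp(-t_years / t_a_years) + 1.0
          return r_background / gamma

      t = np.linspace(0.0, 20.0, 5)                       # years after the stress step
      print(seismicity_rate(t, r_background=1.0, dCFS_bar=0.75).round(2))
      # A +0.75 bar stress step raises the rate immediately after the event; the rate
      # then relaxes back towards the background value over the aftershock duration t_a.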

  16. Application of the source term code package to obtain a specific source term for the Laguna Verde Nuclear Power Plant

    International Nuclear Information System (INIS)

    Souto, F.J.

    1991-06-01

    The main objective of the project was to use the Source Term Code Package (STCP) to obtain a specific source term for those accident sequences deemed dominant as a result of probabilistic safety analyses (PSA) for the Laguna Verde Nuclear Power Plant (CNLV). The following programme has been carried out to meet this objective: (a) implementation of the STCP, (b) acquisition of specific data for CNLV to execute the STCP, and (c) calculations of specific source terms for accident sequences at CNLV. The STCP has been implemented and validated on CDC 170/815 and CDC 180/860 mainframes as well as on a MicroVAX 3800 system. In order to get a plant-specific source term, data on the CNLV including initial core inventory, burn-up, primary containment structures, and materials used for the calculations have been obtained. Because STCP does not explicitly model containment failure, dry well failure in the form of a catastrophic rupture has been assumed. One of the most significant sequences from the point of view of possible off-site risk is the loss of off-site power with failure of the diesel generators and simultaneous loss of high pressure core spray and reactor core isolation cooling systems. The probability for that event is approximately 4.5 × 10⁻⁶. This sequence has been analysed in detail and the release fractions of radioisotope groups are given in the full report. 18 refs, 4 figs, 3 tabs

  17. Integration of highly probabilistic sources into optical quantum architectures: perpetual quantum computation

    International Nuclear Information System (INIS)

    Devitt, Simon J; Stephens, Ashley M; Munro, William J; Nemoto, Kae

    2011-01-01

    In this paper, we introduce a design for an optical topological cluster state computer constructed exclusively from a single quantum component. Unlike previous efforts we eliminate the need for on demand, high fidelity photon sources and detectors and replace them with the same device utilized to create photon/photon entanglement. This introduces highly probabilistic elements into the optical architecture while maintaining complete specificity of the structure and operation for a large-scale computer. Photons in this system are continually recycled back into the preparation network, allowing for an arbitrarily deep three-dimensional cluster to be prepared using a comparatively small number of photonic qubits and consequently the elimination of high-frequency, deterministic photon sources.

  18. Computation of probabilistic hazard maps and source parameter estimation for volcanic ash transport and dispersion

    Energy Technology Data Exchange (ETDEWEB)

    Madankan, R. [Department of Mechanical and Aerospace Engineering, University at Buffalo (United States); Pouget, S. [Department of Geology, University at Buffalo (United States); Singla, P., E-mail: psingla@buffalo.edu [Department of Mechanical and Aerospace Engineering, University at Buffalo (United States); Bursik, M. [Department of Geology, University at Buffalo (United States); Dehn, J. [Geophysical Institute, University of Alaska, Fairbanks (United States); Jones, M. [Center for Computational Research, University at Buffalo (United States); Patra, A. [Department of Mechanical and Aerospace Engineering, University at Buffalo (United States); Pavolonis, M. [NOAA-NESDIS, Center for Satellite Applications and Research (United States); Pitman, E.B. [Department of Mathematics, University at Buffalo (United States); Singh, T. [Department of Mechanical and Aerospace Engineering, University at Buffalo (United States); Webley, P. [Geophysical Institute, University of Alaska, Fairbanks (United States)

    2014-08-15

    Volcanic ash advisory centers are charged with forecasting the movement of volcanic ash plumes, for aviation, health and safety preparation. Deterministic mathematical equations model the advection and dispersion of these plumes. However, initial plume conditions – height, profile of particle location, volcanic vent parameters – are known only approximately at best, and other features of the governing system such as the windfield are stochastic. These uncertainties make forecasting plume motion difficult. As a result of these uncertainties, ash advisories based on a deterministic approach tend to be conservative, and many times over- or underestimate the extent of a plume. This paper presents an end-to-end framework for generating a probabilistic approach to ash plume forecasting. This framework uses an ensemble of solutions, guided by the Conjugate Unscented Transform (CUT) method for evaluating expectation integrals. This ensemble is used to construct a polynomial chaos expansion that can be sampled cheaply, to provide a probabilistic model forecast. The CUT method is then combined with a minimum variance condition, to provide a full posterior pdf of the uncertain source parameters, based on observed satellite imagery. The April 2010 eruption of the Eyjafjallajökull volcano in Iceland is employed as a test example. The puff advection/dispersion model is used to hindcast the motion of the ash plume through time, concentrating on the period 14–16 April 2010. Variability in the height and particle loading of that eruption is introduced through a volcano column model called bent. Output uncertainty due to the assumed input parameter probability distributions and a probabilistic spatial-temporal estimate of ash presence are computed.

  19. A probabilistic assessment of large scale wind power development for long-term energy resource planning

    Science.gov (United States)

    Kennedy, Scott Warren

    A steady decline in the cost of wind turbines and increased experience in their successful operation have brought this technology to the forefront of viable alternatives for large-scale power generation. Methodologies for understanding the costs and benefits of large-scale wind power development, however, are currently limited. In this thesis, a new and widely applicable technique for estimating the social benefit of large-scale wind power production is presented. The social benefit is based upon wind power's energy and capacity services and the avoidance of environmental damages. The approach uses probabilistic modeling techniques to account for the stochastic interaction between wind power availability, electricity demand, and conventional generator dispatch. A method for including the spatial smoothing effect of geographically dispersed wind farms is also introduced. The model has been used to analyze potential offshore wind power development to the south of Long Island, NY. If natural gas combined cycle (NGCC) and integrated gasifier combined cycle (IGCC) are the alternative generation sources, wind power exhibits a negative social benefit due to its high capacity cost and the relatively low emissions of these advanced fossil-fuel technologies. Environmental benefits increase significantly if charges for CO2 emissions are included. Results also reveal a diminishing social benefit as wind power penetration increases. The dependence of wind power benefits on natural gas and coal prices is also discussed. In power systems with a high penetration of wind generated electricity, the intermittent availability of wind power may influence hourly spot prices. A price responsive electricity demand model is introduced that shows a small increase in wind power value when consumers react to hourly spot prices. The effectiveness of this mechanism depends heavily on estimates of the own- and cross-price elasticities of aggregate electricity demand. This work makes a valuable

  20. Calculation of source terms for NUREG-1150

    International Nuclear Information System (INIS)

    Breeding, R.J.; Williams, D.C.; Murfin, W.B.; Amos, C.N.; Helton, J.C.

    1987-10-01

    The source terms estimated for NUREG-1150 are generally based on the Source Term Code Package (STCP), but the actual source term calculations used in computing risk are performed by much smaller codes which are specific to each plant. This was done because the method of estimating the uncertainty in risk for NUREG-1150 requires hundreds of source term calculations for each accident sequence. This is clearly impossible with a large, detailed code like the STCP. The small plant-specific codes are based on simple algorithms and utilize adjustable parameters. The values of the parameters appearing in these codes are derived from the available STCP results. To determine the uncertainty in the estimation of the source terms, these parameters were varied as specified by an expert review group. This method was used to account for the uncertainties in the STCP results and the uncertainties in phenomena not considered by the STCP

  1. An Advanced Bayesian Method for Short-Term Probabilistic Forecasting of the Generation of Wind Power

    Directory of Open Access Journals (Sweden)

    Antonio Bracale

    2015-09-01

    Full Text Available Currently, among renewable distributed generation systems, wind generators are receiving a great deal of interest due to the great economic, technological, and environmental incentives they involve. However, the uncertainties due to the intermittent nature of wind energy make it difficult to operate electrical power systems optimally and make decisions that satisfy the needs of all the stakeholders of the electricity energy market. Thus, there is increasing interest in determining how to forecast wind power production accurately. Most of the methods that have been published in the relevant literature provided deterministic forecasts, even though great interest has been focused recently on probabilistic forecast methods. In this paper, an advanced probabilistic method is proposed for short-term forecasting of wind power production. A mixture of two Weibull distributions was used as a probability function to model the uncertainties associated with wind speed. Then, a Bayesian inference approach with a particularly effective autoregressive integrated moving average model was used to determine the parameters of the mixture Weibull distribution. Numerical applications are also presented to provide evidence of the forecasting performance of the Bayesian-based approach.
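
    The two-component Weibull mixture named above is straightforward to write down; the sketch below evaluates its density and draws samples with invented parameter values, and does not reproduce the Bayesian/ARIMA estimation used in the paper.

      # Sketch of the wind-speed model named above: a two-component Weibull mixture.
      # Parameter values are invented for illustration; the paper estimates them with a
      # Bayesian approach coupled to an ARIMA model, which is not reproduced here.
      import numpy as np
      from scipy.stats import weibull_min

      w = np.array([0.6, 0.4])        # component weights
      k = np.array([2.0, 3.5])        # shape parameters
      lam = np.array([6.0, 11.0])     # scale parameters (m/s)

      def mixture_pdf(v):
          v = np.atleast_1d(v)[:, None]
          return (w * weibull_min.pdf(v, k, scale=lam)).sum(axis=1)

      def mixture_sample(n, rng=np.random.default_rng(0)):
          comp = rng.choice(len(w), size=n, p=w)
          return weibull_min.rvs(k[comp], scale=lam[comp], random_state=rng)

      v = np.linspace(0.0, 25.0, 6)
      print("pdf:", mixture_pdf(v).round(4))
      print("mean of 100k samples (m/s):", mixture_sample(100_000).mean().round(2))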

  2. SOURCE TERMS FOR HLW GLASS CANISTERS

    International Nuclear Information System (INIS)

    J.S. Tang

    2000-01-01

    This calculation is prepared by the Monitored Geologic Repository (MGR) Waste Package Design Section. The objective of this calculation is to determine the source terms that include radionuclide inventory, decay heat, and radiation sources due to gamma rays and neutrons for the high-level radioactive waste (HLW) from the West Valley Demonstration Project (WVDP), Savannah River Site (SRS), Hanford Site (HS), and Idaho National Engineering and Environmental Laboratory (INEEL). This calculation also determines the source terms of the canister containing the SRS HLW glass and immobilized plutonium. The scope of this calculation is limited to source terms for a time period out to one million years. The results of this calculation may be used to carry out performance assessment of the potential repository and to evaluate radiation environments surrounding the waste packages (WPs). This calculation was performed in accordance with the Development Plan ''Source Terms for HLW Glass Canisters'' (Ref. 7.24)

  3. Regulatory impact of nuclear reactor accident source term assumptions. Technical report

    International Nuclear Information System (INIS)

    Pasedag, W.F.; Blond, R.M.; Jankowski, M.W.

    1981-06-01

    This report addresses the reactor accident source term implications on accident evaluations, regulations and regulatory requirements, engineered safety features, emergency planning, probabilistic risk assessment, and licensing practice. Assessment of the impact of source term modifications and evaluation of the effects in Design Basis Accident analyses, assuming a change of the chemical form of iodine from elemental to cesium iodide, has been provided. Engineered safety features used in current LWR designs are found to be effective for all postulated combinations of iodine source terms under DBA conditions. In terms of potential accident consequences, it is not expected that the difference in chemical form between elemental iodine and cesium iodide would be significant. In order to account for the current information on source terms, a spectrum of accident scenarios is discussed to realistically estimate the source terms resulting from a range of potential accident conditions

  4. Mechanistic facility safety and source term analysis

    International Nuclear Information System (INIS)

    PLYS, M.G.

    1999-01-01

    A PC-based computer program was created for facility safety and source term analysis at Hanford. The program has been successfully applied to mechanistic prediction of source terms from chemical reactions in underground storage tanks, hydrogen combustion in double contained receiver tanks, and process evaluation including the potential for runaway reactions in spent nuclear fuel processing. Model features include user-defined facility room, flow path geometry, and heat conductors, user-defined non-ideal vapor and aerosol species, pressure- and density-driven gas flows, aerosol transport and deposition, and structure to accommodate facility-specific source terms. Example applications are presented here

  5. A probabilistic quantitative risk assessment model for the long-term work zone crashes.

    Science.gov (United States)

    Meng, Qiang; Weng, Jinxian; Qu, Xiaobo

    2010-11-01

    Work zones, especially long-term work zones, increase traffic conflicts and cause safety problems. Proper casualty risk assessment for a work zone is important for both traffic safety engineers and travelers. This paper develops a novel probabilistic quantitative risk assessment (QRA) model to evaluate the casualty risk by combining the frequency and consequence of all accident scenarios triggered by long-term work zone crashes. The casualty risk is measured by the individual risk and the societal risk. The individual risk can be interpreted as the frequency of a driver/passenger being killed or injured, and the societal risk describes the relation between frequency and the number of casualties. The proposed probabilistic QRA model consists of the estimation of work zone crash frequency, an event tree, and consequence estimation models. There are seven intermediate events--age (A), crash unit (CU), vehicle type (VT), alcohol (AL), light condition (LC), crash type (CT) and severity (S)--in the event tree. Since the estimated probability of some intermediate events may be highly uncertain, these probabilities are characterized as random variables. The consequence estimation model takes into account the combined effects of speed and emergency medical service response time (ERT) on the consequence of a work zone crash. Finally, a numerical example based on the Southeast Michigan work zone crash data is carried out. The numerical results show that there will be a 62% decrease in individual fatality risk and a 44% reduction in individual injury risk if the mean travel speed is slowed down by 20%. In addition, there will be a 5% reduction in individual fatality risk and a 0.05% reduction in individual injury risk if ERT is reduced by 20%. In other words, slowing down speed is more effective than reducing ERT for casualty risk mitigation. 2010 Elsevier Ltd. All rights reserved.
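
    The following sketch illustrates the generic event-tree arithmetic described in this record: the initiating crash frequency is multiplied by conditional branch probabilities along one path to obtain a scenario frequency. The branch labels follow the abstract, but all numerical values are invented.

    ```python
    # Toy event-tree evaluation: frequency of the initiating event (work zone crash)
    # times conditional branch probabilities along one path through the tree.
    # Branch names follow the abstract; all numbers are illustrative.
    crash_frequency = 0.8          # crashes per year in the work zone (assumed)

    # One path through the intermediate events A, CU, VT, AL, LC, CT, S.
    branch_probabilities = {
        "age_over_65": 0.15,
        "single_unit": 0.40,
        "heavy_vehicle": 0.10,
        "alcohol_involved": 0.05,
        "dark_unlit": 0.30,
        "rear_end": 0.50,
        "fatal": 0.02,
    }

    path_probability = 1.0
    for p in branch_probabilities.values():
        path_probability *= p

    scenario_frequency = crash_frequency * path_probability   # fatal outcomes per year on this path
    print(f"frequency of this fatal scenario: {scenario_frequency:.2e} per year")
    ```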

  6. Probabilistic assessment of the long-term performance of the Panel Mine tailings area

    International Nuclear Information System (INIS)

    Balins, J.K.; Davis, J.B.; Payne, R.A.

    1994-01-01

    Rio Algom's Panel Uranium Mine originally operated between 1958 and 1961. It was reactivated in 1979 and operated continuously until 1990. In all, the mine produced about 14 million tons of potentially acid generating, low level radioactive uranium tailings; about 5% pyrite (by weight) with less than 0.1% U3O8. The tailings area consists of two rock rimmed basins. Topographic lows around the perimeter are closed by a total of six containment dams. To minimize the acid generating potential within the tailings, a decommissioning plan to flood the impounded tailings is being implemented. The anticipated performance of engineered structures (dams, spillways, channels, etc.) and the flooded tailings concept, over time periods in the order of thousands of years, have been addressed using probabilistic methods, based on subjective probability distributions consistent with available site specific information. The probable costs associated with long-term inspection and maintenance of the facility, as well as the probable costs and environmental consequences (e.g. tailings releases) associated with potential dam failures due to disruptive events such as floods, droughts and earthquakes were determined using a probabilistic model which consists of five, essentially independent, sub-models: a Maintenance Model, an Earthquake Response Model, a Flood Response Model, a Drought Model and an Integration Model. The principal conclusion derived from this assessment is that, for a well designed, constructed and maintained facility, there is very little likelihood that water and/or tailings solids will be released as a result of a containment dam failure; annual probability of the order of 10^-6. Failure to maintain the facility over the long-term significantly increases the likelihood of dam failure with resultant release of water and suspended tailings solids

  7. Evolution of source term definition and analysis

    International Nuclear Information System (INIS)

    Lutz, R.J. Jr.

    2004-01-01

    The objective of this presentation was to provide an overview of the evolution of accident fission product release analysis methodology and the results obtained, and to provide an overview of how source term analysis has been implemented in regulatory decisions

  8. Force Limited Vibration Testing: Computation C2 for Real Load and Probabilistic Source

    Science.gov (United States)

    Wijker, J. J.; de Boer, A.; Ellenbroek, M. H. M.

    2014-06-01

    method is suitable to compute the value of the parameter C2. When no mathematical model of the source can be made available, estimations of the value C2 can be found in the literature. In this paper a probabilistic mathematical representation of the unknown source is proposed, such that the asparagus patch model of the source can be approximated. The computation of the value C2 can be done in conjunction with the CMSA method, knowing the apparent mass of the load and the random acceleration specification at the interface between load and source, respectively. Strength & stiffness design rules for spacecraft, instrumentation, units, etc. will be practiced, as mentioned in ECSS Standards and Handbooks, Launch Vehicle User's manuals, papers, books, etc. A probabilistic description of the design parameters is foreseen. As an example, a simple experiment has been worked out.

  9. Probabilistic short-term volcanic hazard in phases of unrest: A case study for tephra fallout

    Science.gov (United States)

    Selva, Jacopo; Costa, Antonio; Sandri, Laura; Macedonio, Giovanni; Marzocchi, Warner

    2014-12-01

    During volcanic crises, volcanologists estimate the impact of possible imminent eruptions usually through deterministic modeling of the effects of one or a few preestablished scenarios. Although such an approach may bring important information to the decision makers, the sole use of deterministic scenarios does not allow scientists to properly take into consideration all uncertainties, and it cannot be used to assess quantitatively the risk because the latter unavoidably requires a probabilistic approach. We present a model based on the concept of Bayesian event tree (hereinafter named BET_VH_ST, standing for Bayesian event tree for short-term volcanic hazard), for short-term near-real-time probabilistic volcanic hazard analysis formulated for any potential hazardous phenomenon accompanying an eruption. The specific goal of BET_VH_ST is to produce a quantitative assessment of the probability of exceedance of any potential level of intensity for a given volcanic hazard due to eruptions within restricted time windows (hours to days) in any area surrounding the volcano, accounting for all natural and epistemic uncertainties. BET_VH_ST properly assesses the conditional probability at each level of the event tree accounting for any relevant information derived from the monitoring system, theoretical models, and the past history of the volcano, propagating any relevant epistemic uncertainty underlying these assessments. As an application example of the model, we apply BET_VH_ST to assess short-term volcanic hazard related to tephra loading during Major Emergency Simulation Exercise, a major exercise at Mount Vesuvius that took place from 19 to 23 October 2006, consisting of a blind simulation of Vesuvius reactivation, from the early warning phase up to the final eruption, including the evacuation of a sample of about 2000 people from the area at risk. The results show that BET_VH_ST is able to produce short-term forecasts of the impact of tephra fall during a rapidly
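
    A minimal sketch of the kind of event-tree probability propagation this record describes, where the short-term exceedance probability is a product of conditional probabilities summed over eruption scenarios; the states and numbers are invented and are not the Vesuvius exercise values.

    ```python
    # Minimal sketch of an event-tree style hazard computation in the spirit of
    # BET_VH_ST: the probability of exceeding a tephra-load threshold at a site is
    # the product of conditional probabilities along the tree, summed over scenarios.
    # All numbers are illustrative.
    p_unrest_leads_to_eruption = 0.30          # short-term, monitoring-informed (assumed)
    scenarios = [
        # (P(eruption size | eruption), P(load > threshold at site | size))
        (0.70, 0.05),   # small eruption
        (0.25, 0.40),   # medium eruption
        (0.05, 0.90),   # large eruption
    ]

    p_exceed = p_unrest_leads_to_eruption * sum(p_size * p_load for p_size, p_load in scenarios)
    print(f"probability of exceeding the tephra-load threshold: {p_exceed:.3f}")
    ```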

  10. A Probabilistic Short-Term Water Demand Forecasting Model Based on the Markov Chain

    Directory of Open Access Journals (Sweden)

    Francesca Gagliardi

    2017-07-01

    Full Text Available This paper proposes a short-term water demand forecasting method based on the use of the Markov chain. This method provides estimates of future demands by calculating probabilities that the future demand value will fall within pre-assigned intervals covering the expected total variability. More specifically, two models based on homogeneous and non-homogeneous Markov chains were developed and presented. These models, together with two benchmark models (based on artificial neural network and naïve methods), were applied to three real-life case studies for the purpose of forecasting the respective water demands from 1 to 24 h ahead. The results obtained show that the model based on a homogeneous Markov chain provides more accurate short-term forecasts than the one based on a non-homogeneous Markov chain, which is in line with the artificial neural network model. Both Markov chain models enable probabilistic information regarding the stochastic demand forecast to be easily obtained.
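
    The sketch below illustrates the homogeneous Markov chain idea: a transition matrix between discretized demand intervals is propagated forward to give interval probabilities 1 to 24 hours ahead. The matrix and states are invented for illustration.

    ```python
    import numpy as np

    # Hourly demand discretized into 3 intervals (low / medium / high); the
    # homogeneous-chain transition matrix would be estimated from historical data.
    P = np.array([[0.70, 0.25, 0.05],
                  [0.20, 0.60, 0.20],
                  [0.05, 0.35, 0.60]])

    state = np.array([0.0, 1.0, 0.0])   # current hour: demand observed in the "medium" interval

    # Probabilistic forecast: probability of each demand interval 1 to 24 hours ahead.
    for lead in range(1, 25):
        state = state @ P
        if lead in (1, 6, 24):
            print(f"+{lead:2d} h  P(low, medium, high) = {np.round(state, 3)}")
    ```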

  11. Rockfall hazard assessment integrating probabilistic physically based rockfall source detection (Norddal municipality, Norway).

    Science.gov (United States)

    Yugsi Molina, F. X.; Oppikofer, T.; Fischer, L.; Hermanns, R. L.; Taurisano, A.

    2012-04-01

    Traditional techniques to assess rockfall hazard are partially based on probabilistic analysis. Stochastic methods have been used for run-out analysis of rock blocks to estimate the trajectories that a detached block will follow during its fall until it stops due to kinetic energy loss. However, the selection of rockfall source areas is usually defined either by multivariate analysis or by field observations. In either case, a physically based approach is not used for source area detection. We present an example of rockfall hazard assessment that integrates a probabilistic rockfall run-out analysis with a stochastic assessment of the rockfall source areas using kinematic stability analysis in a GIS environment. The method has been tested for a steep, more than 200 m high rock wall, located in the municipality of Norddal (Møre og Romsdal county, Norway), where a large number of people are exposed to snow avalanches, rockfalls, or debris flows. The area was selected following the recently published hazard mapping plan of Norway. The cliff is formed by medium to coarse-grained quartz-dioritic to granitic gneisses of Proterozoic age. Scree deposits, the product of recent rockfall activity, are found at the bottom of the rock wall. Large blocks can be found several tens of meters away from the cliff in Sylte, the main locality in the Norddal municipality. Structural characterization of the rock wall was done using terrestrial laser scanning (TLS) point clouds in the software Coltop3D (www.terranum.ch), and results were validated with field data. Orientation data sets from the structural characterization were analyzed separately to assess best-fit probability density functions (PDFs) for both dip angle and dip direction angle of each discontinuity set. A GIS-based stochastic kinematic analysis was then carried out using the discontinuity set orientations and the friction angle as random variables. An airborne laser scanning digital elevation model (ALS-DEM) with 1 m
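
    A rough sketch of a stochastic kinematic test of the type this record describes: discontinuity orientations and the friction angle are sampled from assumed probability density functions, and the fraction of samples satisfying a simplified planar-sliding criterion is taken as the failure probability. All distributions, slope values and the simplified daylighting test are assumptions, not the study's data.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Illustrative best-fit distributions for one discontinuity set (degrees);
    # in the study these PDFs come from the TLS-based structural characterisation.
    dip = rng.normal(loc=65.0, scale=8.0, size=100_000)
    dip_dir = rng.normal(loc=210.0, scale=15.0, size=100_000)
    friction = rng.normal(loc=32.0, scale=3.0, size=100_000)

    slope_angle, slope_aspect = 75.0, 205.0   # from the DEM cell (assumed)

    # Simplified planar-sliding kinematic test: daylighting and steeper than the friction angle.
    daylights = (np.abs(((dip_dir - slope_aspect + 180) % 360) - 180) < 20.0) & (dip < slope_angle)
    slides = daylights & (dip > friction)

    print("probability of kinematic planar failure:", slides.mean())
    ```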

  12. Subsurface Shielding Source Term Specification Calculation

    International Nuclear Information System (INIS)

    S.Su

    2001-01-01

    The purpose of this calculation is to establish appropriate and defensible waste-package radiation source terms for use in repository subsurface shielding design. This calculation supports the shielding design for the waste emplacement and retrieval system, and subsurface facility system. The objective is to identify the limiting waste package and specify its associated source terms including source strengths and energy spectra. Consistent with the Technical Work Plan for Subsurface Design Section FY 01 Work Activities (CRWMS M and O 2001, p. 15), the scope of work includes the following: (1) Review source terms generated by the Waste Package Department (WPD) for various waste forms and waste package types, and compile them for shielding-specific applications. (2) Determine acceptable waste package specific source terms for use in subsurface shielding design, using a reasonable and defensible methodology that is not unduly conservative. This calculation is associated with the engineering and design activity for the waste emplacement and retrieval system, and subsurface facility system. The technical work plan for this calculation is provided in CRWMS M and O 2001. Development and performance of this calculation conforms to the procedure, AP-3.12Q, Calculations

  13. Performance assessment of deterministic and probabilistic weather predictions for the short-term optimization of a tropical hydropower reservoir

    Science.gov (United States)

    Mainardi Fan, Fernando; Schwanenberg, Dirk; Alvarado, Rodolfo; Assis dos Reis, Alberto; Naumann, Steffi; Collischonn, Walter

    2016-04-01

    Hydropower is the most important electricity source in Brazil. During recent years, it accounted for 60% to 70% of the total electric power supply. Marginal costs of hydropower are lower than for thermal power plants; therefore, there is a strong economic motivation to maximize its share. On the other hand, hydropower depends on the availability of water, which has a natural variability. Its extremes lead to the risks of power production deficits during droughts and safety issues in the reservoir and downstream river reaches during flood events. One building block of the proper management of hydropower assets is the short-term forecast of reservoir inflows as input for an online, event-based optimization of its release strategy. While deterministic forecasts and optimization schemes are the established techniques for the short-term reservoir management, the use of probabilistic ensemble forecasts and stochastic optimization techniques receives growing attention and a number of studies have shown its benefits. The present work shows one of the first hindcasting and closed-loop control experiments for a multi-purpose hydropower reservoir in a tropical region in Brazil. The case study is the hydropower project (HPP) Três Marias, located in southeast Brazil. The HPP reservoir is operated with two main objectives: (i) hydroelectricity generation and (ii) flood control at Pirapora City located 120 km downstream of the dam. In the experiments, precipitation forecasts based on observed data, deterministic and probabilistic forecasts with 50 ensemble members of the ECMWF are used as forcing of the MGB-IPH hydrological model to generate streamflow forecasts over a period of 2 years. The online optimization depends on a deterministic and multi-stage stochastic version of a model predictive control scheme. Results for the perfect forecasts show the potential benefit of the online optimization and indicate a desired forecast lead time of 30 days. In comparison, the use of

  14. A probabilistic approach for the estimation of earthquake source parameters from spectral inversion

    Science.gov (United States)

    Supino, M.; Festa, G.; Zollo, A.

    2017-12-01

    The amplitude spectrum of a seismic signal related to an earthquake source carries information about the size of the rupture, moment, stress and energy release. Furthermore, it can be used to characterize the Green's function of the medium crossed by the seismic waves. We describe the earthquake amplitude spectrum assuming a generalized Brune's (1970) source model, and direct P- and S-waves propagating in a layered velocity model, characterized by a frequency-independent Q attenuation factor. The observed displacement spectrum thus depends on three source parameters: the seismic moment (through the low-frequency spectral level), the corner frequency (a proxy of the fault length) and the high-frequency decay parameter. These parameters are strongly correlated with each other and with the quality factor Q; a rigorous estimation of the associated uncertainties and parameter resolution is thus needed to obtain reliable estimates. In this work, the uncertainties are characterized by adopting a probabilistic approach for the parameter estimation. Assuming an L2-norm based misfit function, we perform a global exploration of the parameter space to find the absolute minimum of the cost function, and then we explore the joint a-posteriori probability density function associated with the cost function around such a minimum, to extract the correlation matrix of the parameters. The global exploration relies on building a Markov chain in the parameter space and on combining a deterministic minimization with a random exploration of the space (basin-hopping technique). The joint pdf is built from the misfit function using the maximum likelihood principle and assuming a Gaussian-like distribution of the parameters. It is then computed on a grid centered at the global minimum of the cost-function. The numerical integration of the pdf finally provides mean, variance and correlation matrix associated with the set of best-fit parameters describing the model. Synthetic tests are performed to
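
    To make the spectral model concrete, the sketch below evaluates a generalized Brune-type displacement spectrum with frequency-independent Q attenuation, parameterized by the three source parameters named in the abstract; all numerical values are illustrative.

    ```python
    import numpy as np

    def displacement_spectrum(f, omega0, fc, gamma, travel_time, Q):
        """Generalized Brune-type source spectrum with frequency-independent Q attenuation.

        omega0      : low-frequency spectral level (proportional to seismic moment)
        fc          : corner frequency (proxy for fault size)
        gamma       : high-frequency decay parameter
        travel_time : P- or S-wave travel time along the ray
        Q           : quality factor of the layered medium
        """
        source = omega0 / (1.0 + (f / fc) ** gamma)
        attenuation = np.exp(-np.pi * f * travel_time / Q)
        return source * attenuation

    f = np.logspace(-1, 2, 5)     # 0.1 to 100 Hz
    print(displacement_spectrum(f, omega0=1e-6, fc=5.0, gamma=2.0, travel_time=4.0, Q=300.0))
    ```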

  15. Source term estimation for small sized HTRs

    International Nuclear Information System (INIS)

    Moormann, R.

    1992-08-01

    Accidents which have to be considered are core heat-up, reactivity transients, water or air ingress and primary circuit depressurization. The main focus of this paper is on water/air ingress and depressurization, which requires consideration of fission product plateout under normal operating conditions; for the latter it is clearly shown that absorption (penetration) mechanisms are much less important than sometimes assumed in the past. Source term estimation procedures for core heat-up events are briefly reviewed; reactivity transients are apparently covered by them. Besides a general literature survey, including identification of areas with insufficient knowledge, this paper contains some estimations of the thermomechanical behaviour of fission products in water and air ingress accidents. Typical source term examples are also presented. In an appendix, evaluations of the AVR experiments VAMPYR-I and -II with respect to plateout and fission product filter efficiency are outlined and used for a validation step of the new plateout code SPATRA. (orig.)

  16. Reevaluation of HFIR source term: Supplement 2

    International Nuclear Information System (INIS)

    Thomas, W.E.

    1986-11-01

    The HFIR source term has been reevaluated to assess the impact of the increase in core lifetime from 15 to 24 days. Calculations were made to determine the nuclide activities of the iodines, noble gases, and other fission products. The results show that there is no significant change in off-site dose due to the increased fuel cycle for the release scenario postulated in ORNL-3573

  17. Short-term Probabilistic Load Forecasting with the Consideration of Human Body Amenity

    Directory of Open Access Journals (Sweden)

    Ning Lu

    2013-02-01

    Full Text Available Load forecasting is the basis of power system planning and design. It is important for the economic operation and reliability assurance of the power system. However, the results of load forecasting given by most existing methods are deterministic. This study aims at probabilistic load forecasting. First, support vector machine regression is used to obtain deterministic load forecasts that take human body amenity into consideration. Probabilistic load forecasts at a certain confidence level are then given after analysing the error distribution law corresponding to each heat index interval. The final simulation shows that this probabilistic forecasting method is easy to implement and can provide more information than deterministic forecasting results, and thus helps decision-makers to make reasonable decisions.
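
    A minimal sketch of the two-step idea described in this record: a support vector regression gives a deterministic forecast from a heat-index style predictor, and empirical quantiles of the residuals supply the probabilistic interval. The data, feature, and parameters are synthetic stand-ins, not the study's.

    ```python
    import numpy as np
    from sklearn.svm import SVR

    rng = np.random.default_rng(3)

    # Toy data: load as a function of a heat index combining temperature and humidity
    # (a stand-in for the "human body amenity" feature described in the abstract).
    heat_index = rng.uniform(15, 40, size=(300, 1))
    load = 500 + 12 * heat_index[:, 0] + rng.normal(0, 20, size=300)

    model = SVR(kernel="rbf", C=100.0, epsilon=5.0).fit(heat_index, load)

    # Deterministic forecast plus an empirical error distribution.
    residuals = load - model.predict(heat_index)
    lo, hi = np.percentile(residuals, [5, 95])          # 90% band from the error law

    point = model.predict([[35.0]])[0]
    print(f"point forecast: {point:.0f} MW, 90% interval: [{point + lo:.0f}, {point + hi:.0f}] MW")
    ```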

  18. Hazardous constituent source term. Revision 2

    International Nuclear Information System (INIS)

    1994-01-01

    The Department of Energy (DOE) has several facilities that generate and/or store transuranic (TRU) waste from weapons program research and production. Much of this waste also contains hazardous waste constituents as regulated under Subtitle C of the Resource Conservation and Recovery Act (RCRA). Toxicity characteristic metals in the waste principally include lead, occurring in leaded rubber gloves and shielding. Other RCRA metals may occur as contaminants in pyrochemical salt, soil, debris, and sludge and solidified liquids, as well as in equipment resulting from decontamination and decommissioning activities. Volatile organic compounds (VOCs) contaminate many waste forms as a residue adsorbed on surfaces or occur in sludge and solidified liquids. Due to the presence of these hazardous constituents, applicable disposal regulations include land disposal restrictions established by the Hazardous and Solid Waste Amendments (HSWA). The DOE plans to dispose of TRU-mixed waste from the weapons program in the Waste Isolation Pilot Plant (WIPP) by demonstrating no-migration of hazardous constituents. This paper documents the current technical basis for methodologies proposed to develop a post-closure RCRA hazardous constituent source term. For the purposes of demonstrating no-migration, the hazardous constituent source term is defined as the quantities of hazardous constituents that are available for transport after repository closure. Development of the source term is only one of several activities that will be involved in the no-migration demonstration. The demonstration will also include uncertainty and sensitivity analyses of contaminant transport

  19. Spent fuel assembly source term parameters

    International Nuclear Information System (INIS)

    Barrett, P.R.; Foadian, H.; Rashid, Y.R.; Seager, K.D.; Gianoulakis, S.E.

    1993-01-01

    Containment of cask contents by a transport cask is a function of the cask body, one or more closure lids, and various bolting hardware and seals associated with the cavity closure and other containment penetrations. In addition, characteristics of cask contents that impede the ability of radionuclides to move from an origin to the external environment also provide containment. In essence, multiple release barriers exist in series in transport casks, and the magnitude of the releasable activity in the cask is considerably lower than the total activity of its contents. A source term approach accounts for the magnitude of the releasable activity available in the cask by assessing the degree of barrier resistance to release provided by material characteristics and inherent barriers that impede the release of radioactive contents. Standardized methodologies for defining the source terms of spent-fuel transport packages in compliance with specified regulations have recently been developed. An essential part of applying the source term methodology involves characterizing the response of the spent fuel under regulatory conditions of transport. Thermal and structural models of the cask and fuel are analyzed and used to predict fuel rod failure probabilities. Input to these analyses and failure evaluations covers a wide range of geometrical and material properties. An important issue in the development of these models is the sensitivity of the radioactive source term generated during transport to individual parameters such as temperature and fluence level. This paper provides a summary of sensitivity analyses concentrating on the structural response and failure predictions of the spent fuel assemblies

  20. Real time source term and dose assessment

    International Nuclear Information System (INIS)

    Breznik, B.; Kovac, A.; Mlakar, P.

    2001-01-01

    The Dose Projection Programme is a tool for decision making in the case of a nuclear emergency. The essential input data for quick emergency evaluation in the case of a hypothetical pressurised water reactor accident are the following: source term, core damage assessment, fission product radioactivity, release source term and critical exposure pathways for an early phase of the release. A reduced number of radio-nuclides and simplified calculations can be used in the dose calculation algorithm. A simple expert-system personal computer programme has been developed for the Krsko Nuclear Power Plant for dose projection within a radius of a few kilometers from the pressurised water reactor in the early phase of an accident. The input data are instantaneous data of core activity, core damage indicators, release fractions, reduction factor of the release pathways, spray operation, release timing, and dispersion coefficient. The main dose projection steps are: accurate in-core radioactivity determination using reactor power input; core damage and in-containment source term assessment based on quick indications of instrumentation or on activity analysis data; the user defines the release pathway for a typical PWR accident scenario; the dose calculation is performed only for the exposure pathways critical for the decision about evacuation or sheltering in the early phase of an accident. (author)
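
    The sketch below illustrates the kind of simplified early-phase dose projection chain this record describes (core activity, release fraction, dispersion, exposure pathway); all nuclide inventories, fractions and coefficients are invented order-of-magnitude values, not plant or programme data.

    ```python
    # Minimal sketch of a simplified early-phase inhalation dose projection;
    # every numerical value below is an illustrative assumption.
    dose_conversion = {"I-131": 7.4e-9, "Cs-137": 4.6e-9}   # Sv per Bq inhaled (assumed)
    core_activity   = {"I-131": 3.0e18, "Cs-137": 2.0e17}   # Bq in core at shutdown (assumed)

    release_fraction = {"I-131": 1.0e-5, "Cs-137": 5.0e-6}  # core damage x pathway reduction (assumed)
    chi_over_q = 1.0e-5        # s/m3, dispersion coefficient at the receptor (assumed)
    breathing_rate = 3.3e-4    # m3/s, adult

    release_duration = 3600.0  # s; assume a constant release rate over one hour
    dose_sv = 0.0
    for nuclide, activity in core_activity.items():
        release_rate = activity * release_fraction[nuclide] / release_duration   # Bq/s
        air_concentration = release_rate * chi_over_q                            # Bq/m3
        intake = air_concentration * breathing_rate * release_duration           # Bq
        dose_sv += intake * dose_conversion[nuclide]

    print(f"projected inhalation dose: {dose_sv * 1000:.2f} mSv")
    ```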

  1. Using Bayesian Belief Network (BBN) modelling for Rapid Source Term Prediction. RASTEP Phase 1

    International Nuclear Information System (INIS)

    Knochenhauer, M.; Swaling, V.H.; Alfheim, P.

    2012-09-01

    The project is connected to the development of RASTEP, a computerized source term prediction tool aimed at providing a basis for improving off-site emergency management. RASTEP uses Bayesian belief networks (BBN) to model severe accident progression in a nuclear power plant in combination with pre-calculated source terms (i.e., amount, timing, and pathway of released radio-nuclides). The output is a set of possible source terms with associated probabilities. In the NKS project, a number of complex issues associated with the integration of probabilistic and deterministic analyses are addressed. This includes issues related to the method for estimating source terms, signal validation, and sensitivity analysis. One major task within Phase 1 of the project addressed the problem of how to make the source term module flexible enough to give reliable and valid output throughout the accident scenario. Of the alternatives evaluated, it is recommended that RASTEP is connected to a fast running source term prediction code, e.g., MARS, with a possibility of updating source terms based on real-time observations. (Author)
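
    A minimal sketch of the underlying idea, not of RASTEP itself: a prior over a few pre-analysed plant states is updated with an observation via Bayes' rule, and each state maps to a pre-calculated source term; the states, likelihoods and release values are invented.

    ```python
    import numpy as np

    # Prior over pre-analysed accident states (invented for illustration).
    states = ["intact containment", "filtered venting", "containment bypass"]
    prior = np.array([0.90, 0.08, 0.02])

    # P(observation | state) for the observation "high dose rate in the stack" (assumed).
    likelihood = np.array([0.05, 0.70, 0.90])

    posterior = prior * likelihood
    posterior /= posterior.sum()

    # Pre-calculated Cs-137 releases (Bq) associated with each state (illustrative).
    source_terms = np.array([1e10, 5e13, 2e16])

    for s, p, q in zip(states, posterior, source_terms):
        print(f"{s:22s} P = {p:.2f}, Cs-137 release = {q:.1e} Bq")
    ```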

  2. Using Bayesian Belief Network (BBN) modelling for Rapid Source Term Prediction. RASTEP Phase 1

    Energy Technology Data Exchange (ETDEWEB)

    Knochenhauer, M.; Swaling, V.H.; Alfheim, P. [Scandpower AB, Sundbyberg (Sweden)

    2012-09-15

    The project is connected to the development of RASTEP, a computerized source term prediction tool aimed at providing a basis for improving off-site emergency management. RASTEP uses Bayesian belief networks (BBN) to model severe accident progression in a nuclear power plant in combination with pre-calculated source terms (i.e., amount, timing, and pathway of released radio-nuclides). The output is a set of possible source terms with associated probabilities. In the NKS project, a number of complex issues associated with the integration of probabilistic and deterministic analyses are addressed. This includes issues related to the method for estimating source terms, signal validation, and sensitivity analysis. One major task within Phase 1 of the project addressed the problem of how to make the source term module flexible enough to give reliable and valid output throughout the accident scenario. Of the alternatives evaluated, it is recommended that RASTEP is connected to a fast running source term prediction code, e.g., MARS, with a possibility of updating source terms based on real-time observations. (Author)

  3. CONSTRUCTION OF A CALIBRATED PROBABILISTIC CLASSIFICATION CATALOG: APPLICATION TO 50k VARIABLE SOURCES IN THE ALL-SKY AUTOMATED SURVEY

    International Nuclear Information System (INIS)

    Richards, Joseph W.; Starr, Dan L.; Miller, Adam A.; Bloom, Joshua S.; Brink, Henrik; Crellin-Quick, Arien; Butler, Nathaniel R.

    2012-01-01

    With growing data volumes from synoptic surveys, astronomers necessarily must become more abstracted from the discovery and introspection processes. Given the scarcity of follow-up resources, there is a particularly sharp onus on the frameworks that replace these human roles to provide accurate and well-calibrated probabilistic classification catalogs. Such catalogs inform the subsequent follow-up, allowing consumers to optimize the selection of specific sources for further study and permitting rigorous treatment of classification purities and efficiencies for population studies. Here, we describe a process to produce a probabilistic classification catalog of variability with machine learning from a multi-epoch photometric survey. In addition to producing accurate classifications, we show how to estimate calibrated class probabilities and motivate the importance of probability calibration. We also introduce a methodology for feature-based anomaly detection, which allows discovery of objects in the survey that do not fit within the predefined class taxonomy. Finally, we apply these methods to sources observed by the All-Sky Automated Survey (ASAS), and release the Machine-learned ASAS Classification Catalog (MACC), a 28 class probabilistic classification catalog of 50,124 ASAS sources in the ASAS Catalog of Variable Stars. We estimate that MACC achieves a sub-20% classification error rate and demonstrate that the class posterior probabilities are reasonably calibrated. MACC classifications compare favorably to the classifications of several previous domain-specific ASAS papers and to the ASAS Catalog of Variable Stars, which had classified only 24% of those sources into one of 12 science classes.
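
    The sketch below illustrates what a calibrated probabilistic classifier means in practice, by binning predicted probabilities and comparing them with observed frequencies; the predictions are simulated and perfectly calibrated by construction, so this is only a reliability-check illustration, not the MACC pipeline.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Simulated class probabilities and labels: among sources assigned probability p,
    # roughly a fraction p should truly belong to the class if calibration holds.
    predicted_p = rng.uniform(0, 1, 5000)
    true_label = rng.uniform(0, 1, 5000) < predicted_p     # calibrated by construction

    bins = np.linspace(0, 1, 11)
    for lo, hi in zip(bins[:-1], bins[1:]):
        in_bin = (predicted_p >= lo) & (predicted_p < hi)
        if in_bin.any():
            print(f"predicted {lo:.1f}-{hi:.1f}: observed frequency {true_label[in_bin].mean():.2f}")
    ```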

  4. CONSTRUCTION OF A CALIBRATED PROBABILISTIC CLASSIFICATION CATALOG: APPLICATION TO 50k VARIABLE SOURCES IN THE ALL-SKY AUTOMATED SURVEY

    Energy Technology Data Exchange (ETDEWEB)

    Richards, Joseph W.; Starr, Dan L.; Miller, Adam A.; Bloom, Joshua S.; Brink, Henrik; Crellin-Quick, Arien [Astronomy Department, University of California, Berkeley, CA 94720-3411 (United States); Butler, Nathaniel R., E-mail: jwrichar@stat.berkeley.edu [School of Earth and Space Exploration, Arizona State University, Tempe, AZ 85287 (United States)

    2012-12-15

    With growing data volumes from synoptic surveys, astronomers necessarily must become more abstracted from the discovery and introspection processes. Given the scarcity of follow-up resources, there is a particularly sharp onus on the frameworks that replace these human roles to provide accurate and well-calibrated probabilistic classification catalogs. Such catalogs inform the subsequent follow-up, allowing consumers to optimize the selection of specific sources for further study and permitting rigorous treatment of classification purities and efficiencies for population studies. Here, we describe a process to produce a probabilistic classification catalog of variability with machine learning from a multi-epoch photometric survey. In addition to producing accurate classifications, we show how to estimate calibrated class probabilities and motivate the importance of probability calibration. We also introduce a methodology for feature-based anomaly detection, which allows discovery of objects in the survey that do not fit within the predefined class taxonomy. Finally, we apply these methods to sources observed by the All-Sky Automated Survey (ASAS), and release the Machine-learned ASAS Classification Catalog (MACC), a 28 class probabilistic classification catalog of 50,124 ASAS sources in the ASAS Catalog of Variable Stars. We estimate that MACC achieves a sub-20% classification error rate and demonstrate that the class posterior probabilities are reasonably calibrated. MACC classifications compare favorably to the classifications of several previous domain-specific ASAS papers and to the ASAS Catalog of Variable Stars, which had classified only 24% of those sources into one of 12 science classes.

  5. Advanced Reactor PSA Methodologies for System Reliability Analysis and Source Term Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Grabaskas, D.; Brunett, A.; Passerini, S.; Grelle, A.; Bucknor, M.

    2017-06-26

    Beginning in 2015, a project was initiated to update and modernize the probabilistic safety assessment (PSA) of the GE-Hitachi PRISM sodium fast reactor. This project is a collaboration between GE-Hitachi and Argonne National Laboratory (Argonne), and funded in part by the U.S. Department of Energy. Specifically, the role of Argonne is to assess the reliability of passive safety systems, complete a mechanistic source term calculation, and provide component reliability estimates. The assessment of passive system reliability focused on the performance of the Reactor Vessel Auxiliary Cooling System (RVACS) and the inherent reactivity feedback mechanisms of the metal fuel core. The mechanistic source term assessment attempted to provide a sequence specific source term evaluation to quantify offsite consequences. Lastly, the reliability assessment focused on components specific to the sodium fast reactor, including electromagnetic pumps, intermediate heat exchangers, the steam generator, and sodium valves and piping.

  6. Source terms in relation to air cleaning

    International Nuclear Information System (INIS)

    Bernero, R.M.

    1985-01-01

    There are two sets of source terms for consideration in air cleaning: those for routine releases and those for accident releases. With about 1000 reactor-years of commercial operating experience accumulated in the US, there is an excellent database for routine and expected transient releases. Specifications for air cleaning can be based on this body of experience with confidence. Specifications for air cleaning in accident situations are another matter. Recent investigations of severe accident behavior are offering a new basis for source terms and air cleaning specifications. Reports by many experts in the field describe an accident environment notably different from previous models. It is an atmosphere heavy with aerosols, both radioactive and inert. Temperatures are sometimes very high; radioiodine is typically in the form of cesium iodide aerosol particles; other nuclides, such as tellurium, are also important aerosols. Some of the present air cleaning requirements may be very important in light of these new accident behavior models. Others may be wasteful or even counterproductive. The use of the new data on accident behavior models to reevaluate requirements promptly is discussed

  7. A probabilistic assessment of health risks associated with short-term exposure to tropospheric ozone

    Energy Technology Data Exchange (ETDEWEB)

    Whitfield, R.G; Biller, W.F.; Jusko, M.J.; Keisler, J.M.

    1996-06-01

    The work described in this report is part of a larger risk assessment sponsored by the U.S. Environmental Protection Agency. Earlier efforts developed exposure-response relationships for acute health effects among populations engaged in heavy exertion. Those efforts also developed a probabilistic national ambient air quality standards exposure model and a general methodology for integrating probabilistic exposure-response relationships and exposure estimates to calculate overall risk results. Recently published data make it possible to model additional health endpoints (for exposure at moderate exertion), including hospital admissions. New air quality and exposure estimates for alternative national ambient air quality standards for ozone are combined with exposure-response models to produce the risk results for hospital admissions and acute health effects. Sample results explain the methodology and introduce risk output formats.

  8. Design parameters and source terms: Volume 2, Source terms: Revision 0

    International Nuclear Information System (INIS)

    1987-10-01

    The Design Parameters and Source Terms Document was prepared in accordance with DOE request and to provide data for the environmental impact study to be performed in the future for the Deaf Smith County, Texas site for a nuclear waste repository in salt. This document updates a previous unpublished report by Stearns Catalytic Corporation (SCC), entitled ''Design Parameters and Source Terms for a Two-Phase Repository Salt,'' 1985, to the level of the Site Characterization Plan - Conceptual Design Report. The previous unpublished SCC Study identifies the data needs for the Environmental Assessment effort for seven possible Salt Repository sites. 2 tabs

  9. Design parameters and source terms: Volume 2, Source terms: Revision 0

    International Nuclear Information System (INIS)

    1987-09-01

    The Design Parameters and Source Terms Document was prepared in accordance with DOE request and to provide data for the environmental impact study to be performed in the future for the Deaf Smith County, Texas site for a nuclear waste repository in salt. This document updates a previous unpublished report to the level of the Site Characterization Plan---Conceptual Design Report SCP-CDR. The previous study identifies the data needs for the Environmental Assessment effort for seven possible salt repository sites. Volume 2 contains tables of source terms

  10. Influence of Chemistry on source term assessment

    International Nuclear Information System (INIS)

    Herranz Puebla, L.E.; Lopez Diez, I.; Rodriguez Maroto, J.J.; Martinez Lopez-Alcorocho, A.

    1991-01-01

    The major goal of a phenomenological analysis of containment during a severe accident situation can be split into the following ones: to know the containment response to the different loads and to predict accurately the fission product and aerosol behavior. In this report, the main results coming from the study of a hypothetical accident scenario, based on the LA-4 experiment of the LACE project, are presented. To do this, several codes have been coupled: CONTEMPT4/MOD5 (thermalhydraulics), NAUA/MOD5 (aerosol physics) and IODE (iodine chemistry). It has been demonstrated that the Source Term cannot be assessed with confidence if the chemical behaviour of some radionuclides is not taken into account. In particular, the influence of variables such as pH on the iodine retention efficiency of the sump has been demonstrated. (Author). 12 refs.

  11. An analog ensemble for short-term probabilistic solar power forecast

    International Nuclear Information System (INIS)

    Alessandrini, S.; Delle Monache, L.; Sperati, S.; Cervone, G.

    2015-01-01

    Highlights: • A novel method for solar power probabilistic forecasting is proposed. • The forecast accuracy does not depend on the nominal power. • The impact of climatology on forecast accuracy is evaluated. - Abstract: The energy produced by photovoltaic farms has a variable nature depending on astronomical and meteorological factors. The former are the solar elevation and the solar azimuth, which are easily predictable without any uncertainty. The amount of liquid water met by the solar radiation within the troposphere is the main meteorological factor influencing the solar power production, as a fraction of short wave solar radiation is reflected by the water particles and cannot reach the earth's surface. The total cloud cover is a meteorological variable often used to indicate the presence of liquid water in the troposphere and has a limited predictability, which is also reflected in the global horizontal irradiance and, as a consequence, in solar photovoltaic power prediction. This lack of predictability makes the integration of solar energy into the grid challenging. A cost-effective utilization of solar energy over a grid strongly depends on the accuracy and reliability of the power forecasts available to the Transmission System Operators (TSOs). Furthermore, several countries have in place legislation requiring solar power producers to pay penalties proportional to the errors of day-ahead energy forecasts, which makes the accuracy of such predictions a determining factor for producers to reduce their economic losses. Probabilistic predictions can provide accurate deterministic forecasts along with a quantification of their uncertainty, as well as a reliable estimate of the probability of exceeding a certain production threshold. In this paper we propose the application of an analog ensemble (AnEn) method to generate probabilistic solar power forecasts (SPF). The AnEn is based on an historical set of deterministic numerical weather prediction (NWP) model
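
    A minimal sketch of the analog ensemble idea described in this record: the current NWP forecast is matched against an archive of past forecasts, and the observations that verified the closest analogs form the ensemble. The predictor, archive and power model are synthetic.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # Historical archive: NWP-predicted total cloud cover (predictor) and the solar
    # power actually observed afterwards (the analogs' verifying observations). Toy data.
    hist_cloud = rng.uniform(0, 1, 2000)
    hist_power = (1 - 0.8 * hist_cloud) * 50 + rng.normal(0, 3, 2000)   # MW, illustrative

    def analog_ensemble(current_cloud_forecast, n_members=20):
        """Return the observations paired with the n_members most similar past forecasts."""
        distance = np.abs(hist_cloud - current_cloud_forecast)
        idx = np.argsort(distance)[:n_members]
        return hist_power[idx]

    members = analog_ensemble(current_cloud_forecast=0.35)
    print("ensemble mean: %.1f MW" % members.mean())
    print("10th-90th percentile: %s MW" % np.round(np.percentile(members, [10, 90]), 1))
    ```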

  12. Comparing Categorical and Probabilistic Fingerprint Evidence.

    Science.gov (United States)

    Garrett, Brandon; Mitchell, Gregory; Scurich, Nicholas

    2018-04-23

    Fingerprint examiners traditionally express conclusions in categorical terms, opining that impressions do or do not originate from the same source. Recently, probabilistic conclusions have been proposed, with examiners estimating the probability of a match between recovered and known prints. This study presented a nationally representative sample of jury-eligible adults with a hypothetical robbery case in which an examiner opined on the likelihood that a defendant's fingerprints matched latent fingerprints in categorical or probabilistic terms. We studied model language developed by the U.S. Defense Forensic Science Center to summarize results of statistical analysis of the similarity between prints. Participant ratings of the likelihood the defendant left prints at the crime scene and committed the crime were similar when exposed to categorical and strong probabilistic match evidence. Participants reduced these likelihoods when exposed to the weaker probabilistic evidence, but did not otherwise discriminate among the prints assigned different match probabilities. © 2018 American Academy of Forensic Sciences.

  13. Radioactivity source terms for underground engineering application

    Energy Technology Data Exchange (ETDEWEB)

    Tewes, H A [Lawrence Radiation Laboratory, Livermore, CA (United States)

    1969-07-01

    The constraints on nuclide production are usually very similar in any underground engineering application of nuclear explosives. However, in some applications the end product could be contaminated unless the proper nuclear device is used. This fact can be illustrated from two underground engineering experiments: Gasbuggy and Sloop. In the Gasbuggy experiment, appreciable tritium has been shown to be present in the gas currently being produced. However, in future gas stimulation applications (as distinct from experiments), a minimum production of tritium by the explosive is desirable since product contamination by this nuclide may place severe limitations on the use of the tritiated gas. In Sloop, where production of copper is the goal of the experiment, product contamination would not be caused by tritium but could result from other nuclides: Thus, gas stimulation could require the use of fission explosives while the lower cost per kiloton of thermonuclear explosives could make them attractive for ore-crushing applications. Because of this consideration, radionuclide production calculations must be made for both fission and for thermonuclear explosives in the underground environment. Such activation calculations for materials of construction are performed in a manner similar to that described in another paper, but radionuclide production in the environment must be computed using both fission neutron and 14-MeV neutron sources in order to treat the 'source term' problem realistically. In making such computations, parameter studies including the effects of environmental temperature, neutron shielding, and rock types have been carried out. Results indicate the importance of carefully evaluating the radionuclide production for each individual underground engineering application. (author)

  14. Radioactivity source terms for underground engineering application

    International Nuclear Information System (INIS)

    Tewes, H.A.

    1969-01-01

    The constraints on nuclide production are usually very similar in any underground engineering application of nuclear explosives. However, in some applications the end product could be contaminated unless the proper nuclear device is used. This fact can be illustrated from two underground engineering experiments: Gasbuggy and Sloop. In the Gasbuggy experiment, appreciable tritium has been shown to be present in the gas currently being produced. However, in future gas stimulation applications (as distinct from experiments), a minimum production of tritium by the explosive is desirable since product contamination by this nuclide may place severe limitations on the use of the tritiated gas. In Sloop, where production of copper is the goal of the experiment, product contamination would not be caused by tritium but could result from other nuclides: Thus, gas stimulation could require the use of fission explosives while the lower cost per kiloton of thermonuclear explosives could make them attractive for ore-crushing applications. Because of this consideration, radionuclide production calculations must be made for both fission and for thermonuclear explosives in the underground environment. Such activation calculations for materials of construction are performed in a manner similar to that described in another paper, but radionuclide production in the environment must be computed using both fission neutron and 14-MeV neutron sources in order to treat the 'source term' problem realistically. In making such computations, parameter studies including the effects of environmental temperature, neutron shielding, and rock types have been carried out. Results indicate the importance of carefully evaluating the radionuclide production for each individual underground engineering application. (author)

  15. Source Term Model for Fine Particle Resuspension from Indoor Surfaces

    National Research Council Canada - National Science Library

    Kim, Yoojeong; Gidwani, Ashok; Sippola, Mark; Sohn, Chang W

    2008-01-01

    This Phase I effort developed a source term model for particle resuspension from indoor surfaces to be used as a source term boundary condition for CFD simulation of particle transport and dispersion in a building...

  16. Source term modelling parameters for Project-90

    International Nuclear Information System (INIS)

    Shaw, W.; Smith, G.; Worgan, K.; Hodgkinson, D.; Andersson, K.

    1992-04-01

    This document summarises the input parameters for the source term modelling within Project-90. In the first place, the parameters relate to the CALIBRE near-field code which was developed for the Swedish Nuclear Power Inspectorate's (SKI) Project-90 reference repository safety assessment exercise. An attempt has been made to give best estimate values and, where appropriate, a range which is related to variations around base cases. It should be noted that the data sets contain amendments to those considered by KBS-3. In particular, a completely new set of inventory data has been incorporated. The information given here does not constitute a complete set of parameter values for all parts of the CALIBRE code. Rather, it gives the key parameter values which are used in the constituent models within CALIBRE and the associated studies. For example, the inventory data acts as an input to the calculation of the oxidant production rates, which influence the generation of a redox front. The same data is also an initial value data set for the radionuclide migration component of CALIBRE. Similarly, the geometrical parameters of the near-field are common to both sub-models. The principal common parameters are gathered here for ease of reference and avoidance of unnecessary duplication and transcription errors. (au)

  17. Source term calculations - Ringhals 2 PWR

    International Nuclear Information System (INIS)

    Johansson, L.L.

    1998-02-01

    This project was performed within the fifth and final phase of sub-project RAK-2.1 of the Nordic Co-operative Reactor Safety Program, NKS. RAK-2.1 has also included studies of reflooding of a degraded core, recriticality and late phase melt progression. Earlier source term calculations for Swedish nuclear power plants are based on the integral code MAAP. A need was recognised to compare these calculations with calculations done with mechanistic codes. In the present work SCDAP/RELAP5 and CONTAIN were used. Only limited results could be obtained within the frame of RAK-2.1, since many problems were encountered using the SCDAP/RELAP5 code. The main obstacle was the extremely long execution times of the MOD3.1 version, but some dubious fission product calculations were also encountered. However, some interesting results were obtained for the studied sequence, a total loss of AC power. The report describes the modelling approach for SCDAP/RELAP5 and CONTAIN, and discusses results for the transient including the event of a surge line creep rupture. The study will probably be completed later, providing that an improved SCDAP/RELAP5 code version becomes available. (au)

  18. Case studies in the application of probabilistic safety assessment techniques to radiation sources. Final report of a coordinated research project 2001-2003

    International Nuclear Information System (INIS)

    2006-04-01

    Radiation sources are used worldwide in many industrial and medical applications. In general, the safety record associated with their use has been very good. However, accidents involving these sources have occasionally resulted in unplanned exposures to individuals. When assessed prospectively, this type of exposure is termed a 'potential exposure'. The International Commission on Radiological Protection (ICRP) has recommended the assessment of potential exposures that may result from radiation sources and has suggested that probabilistic safety assessment (PSA) techniques may be used in this process. Also, Paragraph 2.13 of the International Basic Safety Standards for Protection against Ionizing Radiation and for the Safety of Radiation Sources (BSS) requires that the authorization process for radiation sources include an assessment of all exposures, including potential exposures, which may result from the use of a radiation source. In light of the ICRP's work described above, and the possibility that PSA techniques could be used in exposure assessments that are required by the BSS, the IAEA initiated a coordinated research project (CRP) to study the benefits and limitations of the application of PSA techniques to radiation sources. The results of this CRP are presented in this publication. It should be noted that these results are based solely on the work performed, and the conclusions drawn, by the research teams involved in this CRP. It is intended that international organizations involved in radiation protection will review the information in this report and will take account of it during the development of guidance and requirements related to the assessment of potential exposures from radiation sources. Also, it is anticipated that the risk insights obtained through the studies will be considered by medical practitioners, facility staff and management, equipment designers, and regulators in their safety management and risk evaluation activities. A draft

  19. 10 CFR 50.67 - Accident source term.

    Science.gov (United States)

    2010-01-01

    ... Conditions of Licenses and Construction Permits, § 50.67 Accident source term. (a) Applicability. The requirements of this section apply to holders of operating licenses issued prior to January 10, 1997, who seek to revise the current accident source term used in their design basis...

  20. Trading wind generation from short-term probabilistic forecasts of wind power

    DEFF Research Database (Denmark)

    Pinson, Pierre; Chevallier, Christophe; Kariniotakis, Georges

    2007-01-01

    Due to the fluctuating nature of the wind resource, a wind power producer participating in a liberalized electricity market is subject to penalties related to regulation costs. Accurate forecasts of wind generation are therefore paramount for reducing such penalties and thus maximizing revenue. This paper formulates a general methodology for deriving optimal bidding strategies based on probabilistic forecasts of wind generation, as well as on modeling of the sensitivity a wind power producer may have to regulation costs. Such strategies permit to further increase revenues and thus enhance competitiveness of wind generation compared to other forms of dispatchable generation. The benefits resulting from the application of these strategies are clearly demonstrated on the test case of the participation of a multi-MW wind farm in the Dutch electricity market over a year.
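
    A short sketch of the classic quantile (newsvendor-style) bidding rule that such strategies build on: with asymmetric regulation costs, the optimal nomination is a specific quantile of the probabilistic forecast. The cost values and ensemble are illustrative and are not the paper's market data or exact formulation.

    ```python
    import numpy as np

    # Asymmetric regulation costs (illustrative, in EUR per MWh of imbalance).
    penalty_shortfall = 15.0   # cost per MWh of under-delivery (up-regulation)
    penalty_surplus = 5.0      # cost per MWh of over-delivery (down-regulation)

    # Setting the derivative of the expected regulation cost to zero gives the optimal
    # nomination at the quantile level surplus / (shortfall + surplus).
    optimal_quantile = penalty_surplus / (penalty_shortfall + penalty_surplus)

    # Probabilistic forecast for one delivery hour, given as an ensemble of MWh values.
    forecast_ensemble = np.array([12.0, 15.0, 18.0, 20.0, 22.0, 25.0, 28.0, 31.0, 35.0, 40.0])
    bid = np.quantile(forecast_ensemble, optimal_quantile)

    print(f"bid at the {optimal_quantile:.0%} quantile: {bid:.1f} MWh")
    ```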

  1. Short-term Probabilistic Forecasting of Wind Speed Using Stochastic Differential Equations

    DEFF Research Database (Denmark)

    Iversen, Jan Emil Banning; Morales González, Juan Miguel; Møller, Jan Kloppenborg

    2016-01-01

    It is widely accepted today that probabilistic forecasts of wind power production constitute valuable information for both wind power producers and power system operators to economically exploit this form of renewable energy, while mitigating the potential adverse effects related to its variable and uncertain nature. In this paper, we propose a modeling framework for wind speed that is based on stochastic differential equations. We show that stochastic differential equations allow us to naturally capture the time dependence structure of wind speed prediction errors (from 1 up to 24 hours ahead) and, most importantly, to derive point and quantile forecasts, predictive distributions, and time-path trajectories (also referred to as scenarios or ensemble forecasts), all by one single stochastic differential equation model characterized by a few parameters.
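
    To illustrate how a single SDE can yield trajectories and predictive quantiles, the sketch below runs an Euler-Maruyama simulation of a simple mean-reverting process around a time-varying forecast; this toy process is not the specific model proposed in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    # Euler-Maruyama simulation of a simple mean-reverting SDE for wind speed,
    #   dX_t = theta * (mu_t - X_t) dt + sigma * sqrt(X_t) dW_t,
    # used only to illustrate how one SDE yields trajectories, quantiles and
    # predictive distributions; parameters are illustrative.
    theta, sigma = 0.5, 0.8
    hours, n_paths, dt = 24, 500, 0.1
    steps = int(hours / dt)
    mu = 8.0 + 2.0 * np.sin(np.linspace(0, 2 * np.pi, steps))   # time-varying point forecast (m/s)

    x = np.full(n_paths, 8.0)
    quantiles = []
    for k in range(steps):
        dw = rng.normal(0.0, np.sqrt(dt), n_paths)
        x = x + theta * (mu[k] - x) * dt + sigma * np.sqrt(np.maximum(x, 0.0)) * dw
        x = np.maximum(x, 0.0)                      # wind speed stays non-negative
        quantiles.append(np.percentile(x, [10, 50, 90]))

    print("10/50/90% quantiles at +24 h:", np.round(quantiles[-1], 1), "m/s")
    ```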

  2. Probabilistic logics and probabilistic networks

    CERN Document Server

    Haenni, Rolf; Wheeler, Gregory; Williamson, Jon; Andrews, Jill

    2014-01-01

    Probabilistic Logic and Probabilistic Networks presents a groundbreaking framework within which various approaches to probabilistic logic naturally fit. Additionally, the text shows how to develop computationally feasible methods to mesh with this framework.

  3. Conditional Probabilistic Population Forecasting

    OpenAIRE

    Sanderson, W.C.; Scherbov, S.; O'Neill, B.C.; Lutz, W.

    2003-01-01

Since policy makers often prefer to think in terms of scenarios, the question has arisen as to whether it is possible to make conditional population forecasts in a probabilistic context. This paper shows that it is both possible and useful to make these forecasts. We do this with two different kinds of examples. The first is the probabilistic analog of deterministic scenario analysis. Conditional probabilistic scenario analysis is essential for policy makers because it allows them to answer "what if"...

  4. Conditional probabilistic population forecasting

    OpenAIRE

    Sanderson, Warren; Scherbov, Sergei; O'Neill, Brian; Lutz, Wolfgang

    2003-01-01

    Since policy-makers often prefer to think in terms of alternative scenarios, the question has arisen as to whether it is possible to make conditional population forecasts in a probabilistic context. This paper shows that it is both possible and useful to make these forecasts. We do this with two different kinds of examples. The first is the probabilistic analog of deterministic scenario analysis. Conditional probabilistic scenario analysis is essential for policy-makers because it allows them...

  5. Conditional Probabilistic Population Forecasting

    OpenAIRE

    Sanderson, Warren C.; Scherbov, Sergei; O'Neill, Brian C.; Lutz, Wolfgang

    2004-01-01

    Since policy-makers often prefer to think in terms of alternative scenarios, the question has arisen as to whether it is possible to make conditional population forecasts in a probabilistic context. This paper shows that it is both possible and useful to make these forecasts. We do this with two different kinds of examples. The first is the probabilistic analog of deterministic scenario analysis. Conditional probabilistic scenario analysis is essential for policy-makers because...

  6. A Bayesian Method for Short-Term Probabilistic Forecasting of Photovoltaic Generation in Smart Grid Operation and Control

    Directory of Open Access Journals (Sweden)

    Gabriella Ferruzzi

    2013-02-01

Full Text Available A new short-term probabilistic forecasting method is proposed to predict the probability density function of the hourly active power generated by a photovoltaic system. Firstly, the probability density function of the hourly clearness index is forecast using a Bayesian autoregressive time series model; the model takes into account the dependence of the solar radiation on some meteorological variables, such as the cloud cover and humidity. Then, a Monte Carlo simulation procedure is used to evaluate the predictive probability density function of the hourly active power by applying the photovoltaic system model to the random sampling of the clearness index distribution. A numerical application demonstrates the effectiveness and advantages of the proposed forecasting method.
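To make the Monte Carlo step of this record concrete, the sketch below pushes samples of the hourly clearness index through a toy photovoltaic plant model to obtain a predictive distribution of hourly active power. The Beta distribution standing in for the Bayesian autoregressive predictive density, the plant model, and all numerical values are assumptions for illustration only.

```python
# A minimal sketch of the Monte Carlo propagation step (assumed, simplified PV model).
import numpy as np

rng = np.random.default_rng(42)

def pv_power(clearness, clear_sky_irradiance=900.0, area_m2=50.0, efficiency=0.17):
    """Toy PV model: AC power (kW) from global irradiance = kt * clear-sky irradiance."""
    irradiance = clearness * clear_sky_irradiance          # W/m^2
    return efficiency * area_m2 * irradiance / 1000.0      # kW

# Stand-in for the predictive density of the clearness index kt (the paper uses a
# Bayesian autoregressive time-series model; a Beta distribution is assumed here).
kt_samples = rng.beta(a=5.0, b=3.0, size=10_000)

power_samples = pv_power(kt_samples)
p50, p05, p95 = np.quantile(power_samples, [0.5, 0.05, 0.95])
print(f"median {p50:.1f} kW, 90% interval [{p05:.1f}, {p95:.1f}] kW")
```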

  7. Probabilistic composition of preferences, theory and applications

    CERN Document Server

    Parracho Sant'Anna, Annibal

    2015-01-01

Putting forward a unified presentation of the features and possible applications of probabilistic preference composition, and serving as a methodology for decisions employing multiple criteria, this book maximizes reader insights into evaluation in probabilistic terms and the development of composition approaches that do not depend on assigning weights to the criteria. With key applications in important areas of management such as failure modes and effects analysis and productivity analysis – together with explanations about the application of the concepts involved – this book makes available numerical examples of probabilistic transformation development and probabilistic composition. Useful not only as a reference source for researchers, but also in teaching classes of graduate courses in Production Engineering and Management Science, the key themes of the book will be of especial interest to researchers in the field of Operational Research.

  8. Using Bayesian Belief Network (BBN) modelling for rapid source term prediction. Final report

    International Nuclear Information System (INIS)

    Knochenhauer, M.; Swaling, V.H.; Dedda, F.D.; Hansson, F.; Sjoekvist, S.; Sunnegaerd, K.

    2013-10-01

The project presented in this report deals with a number of complex issues related to the development of a tool for rapid source term prediction (RASTEP), based on a plant model represented as a Bayesian belief network (BBN) and a source term module which is used for assigning relevant source terms to BBN end states. Thus, RASTEP uses a BBN to model severe accident progression in a nuclear power plant in combination with pre-calculated source terms (i.e., amount, composition, timing, and release path of released radio-nuclides). The output is a set of possible source terms with associated probabilities. One major issue addressed has been the integration of probabilistic and deterministic analyses, dealing with the challenge of making the source term determination flexible enough to give reliable and valid output throughout the accident scenario. The potential for connecting RASTEP to a fast running source term prediction code has been explored, as well as alternative ways of improving the deterministic connections of the tool. As part of the investigation, a comparison of two deterministic severe accident analysis codes has been performed. A second important task has been to develop a general method where experts' beliefs can be included in a systematic way when defining the conditional probability tables (CPTs) in the BBN. The proposed method includes expert judgement in a systematic way when defining the CPTs of a BBN. Using this iterative method results in a reliable BBN even though expert judgements, with their associated uncertainties, have been used. It also simplifies verification and validation of the considerable amounts of quantitative data included in a BBN. (Author)
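The following sketch illustrates the basic mechanics behind such a BBN-based ranking, not the RASTEP model itself: a single latent plant-state node is connected to two observable indications through hand-written conditional probability tables, and hypothetical pre-assigned source-term categories are ranked by posterior probability. Every node, probability and category name below is invented for illustration.

```python
# A purely illustrative BBN-style calculation: latent core-damage state, two observed
# indications, and source-term categories assigned to end states. All numbers invented.
p_core_damage = {True: 0.05, False: 0.95}                      # prior
p_high_dose_signal = {True: {True: 0.90, False: 0.10},         # P(signal | damage)
                      False: {True: 0.02, False: 0.98}}
p_containment_intact = {True: {True: 0.70, False: 0.30},       # P(intact | damage)
                        False: {True: 0.99, False: 0.01}}
source_term_category = {(True, False): "large early release",
                        (True, True): "contained/filtered release",
                        (False, True): "negligible release",
                        (False, False): "negligible release"}

def rank_source_terms(dose_signal: bool, containment_intact: bool) -> dict:
    """P(source-term category | observed indications), enumerating the latent state."""
    weights = {}
    for damage in (True, False):
        w = (p_core_damage[damage]
             * p_high_dose_signal[damage][dose_signal]
             * p_containment_intact[damage][containment_intact])
        cat = source_term_category[(damage, containment_intact)]
        weights[cat] = weights.get(cat, 0.0) + w
    total = sum(weights.values())
    return {cat: round(w / total, 3) for cat, w in weights.items()}

print(rank_source_terms(dose_signal=True, containment_intact=False))
```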

  9. Using Bayesian Belief Network (BBN) modelling for rapid source term prediction. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Knochenhauer, M.; Swaling, V.H.; Dedda, F.D.; Hansson, F.; Sjoekvist, S.; Sunnegaerd, K. [Lloyd' s Register Consulting AB, Sundbyberg (Sweden)

    2013-10-15

The project presented in this report deals with a number of complex issues related to the development of a tool for rapid source term prediction (RASTEP), based on a plant model represented as a Bayesian belief network (BBN) and a source term module which is used for assigning relevant source terms to BBN end states. Thus, RASTEP uses a BBN to model severe accident progression in a nuclear power plant in combination with pre-calculated source terms (i.e., amount, composition, timing, and release path of released radio-nuclides). The output is a set of possible source terms with associated probabilities. One major issue addressed has been the integration of probabilistic and deterministic analyses, dealing with the challenge of making the source term determination flexible enough to give reliable and valid output throughout the accident scenario. The potential for connecting RASTEP to a fast running source term prediction code has been explored, as well as alternative ways of improving the deterministic connections of the tool. As part of the investigation, a comparison of two deterministic severe accident analysis codes has been performed. A second important task has been to develop a general method where experts' beliefs can be included in a systematic way when defining the conditional probability tables (CPTs) in the BBN. The proposed method includes expert judgement in a systematic way when defining the CPTs of a BBN. Using this iterative method results in a reliable BBN even though expert judgements, with their associated uncertainties, have been used. It also simplifies verification and validation of the considerable amounts of quantitative data included in a BBN. (Author)

  10. Design parameters and source terms: Volume 3, Source terms: Revision 0

    International Nuclear Information System (INIS)

    1987-09-01

The Design Parameters and Source Terms Document was prepared in accordance with DOE request and to provide data for the environmental impact study to be performed in the future for the Deaf Smith County, Texas site for a nuclear waste repository in salt. This document updates a previous unpublished report to the level of the Site Characterization Plan – Conceptual Design Report, SCP-CDR. The previous unpublished SCC Study identifies the data needs for the Environmental Assessment effort for seven possible salt repository sites

  11. US Department of Energy Approach to Probabilistic Evaluation of Long-Term Safety for a Potential Yucca Mountain Repository

    International Nuclear Information System (INIS)

    Dr. R. Dyer; Dr. R. Andrews; Dr. A. Van Luik

    2005-01-01

Regulatory requirements being addressed in the US geological repository program for spent nuclear fuel and high-level waste disposal specify probabilistically defined mean-value dose limits. These dose limits reflect acceptable levels of risk. The probabilistic approach mandated by regulation calculates a "risk of a dose," a risk of a potential given dose value at a specific time in the future to a hypothetical person. The mean value of the time-dependent performance measure needs to remain below an acceptable level defined by regulation. Because there are uncertain parameters that are important to system performance, the regulation mandates an analysis focused on the mean value of the performance measure, but that also explores the "full range of defensible and reasonable parameter distributions" ... System performance evaluations should not be unduly influenced by ... "extreme physical situations and parameter values". Challenges in this approach lie in defending the scientific basis for the models selected, and the data and distributions sampled. A significant challenge lies in showing that uncertainties are properly identified and evaluated. A single-value parameter has no uncertainty, and where used such values need to be supported by scientific information showing the selected value is appropriate. Uncertainties are inherent in data, but are also introduced by creating parameter distributions from data sets, selecting models from among alternative models, abstracting models for use in probabilistic analysis, and in selecting the range of initiating event probabilities for unlikely events. The goal of the assessment currently in progress is to evaluate the level of risk inherent in moving ahead to the next phase of repository development: construction. During the construction phase, more will be learned to inform a new long-term risk evaluation to support moving to the next phase: accepting waste. Therefore, though there was sufficient confidence of safety

  12. Phase 1 immobilized low-activity waste operational source term

    International Nuclear Information System (INIS)

    Burbank, D.A.

    1998-01-01

    This report presents an engineering analysis of the Phase 1 privatization feeds to establish an operational source term for storage and disposal of immobilized low-activity waste packages at the Hanford Site. The source term information is needed to establish a preliminary estimate of the numbers of remote-handled and contact-handled waste packages. A discussion of the uncertainties and their impact on the source term and waste package distribution is also presented. It should be noted that this study is concerned with operational impacts only. Source terms used for accident scenarios would differ due to alpha and beta radiation which were not significant in this study

  13. Radiological and chemical source terms for Solid Waste Operations Complex

    International Nuclear Information System (INIS)

    Boothe, G.F.

    1994-01-01

The purpose of this document is to describe the radiological and chemical source terms for the major projects of the Solid Waste Operations Complex (SWOC), including Project W-112, Project W-133 and Project W-100 (WRAP 2A). For purposes of this document, the term "source term" means the design basis inventory. All of the SWOC source terms involve the estimation of the radiological and chemical contents of various waste packages from different waste streams, and the inventories of these packages within facilities or within a scope of operations. The composition of some of the waste is not known precisely; consequently, conservative assumptions were made to ensure that the source term represents a bounding case (i.e., it is expected that the source term would not be exceeded). As better information is obtained on the radiological and chemical contents of waste packages and more accurate facility specific models are developed, this document should be revised as appropriate. Radiological source terms are needed to perform shielding and external dose calculations, to estimate routine airborne releases, to perform release calculations and dose estimates for safety documentation, to calculate the maximum possible fire loss and specific source terms for individual fire areas, etc. Chemical source terms (i.e., inventories of combustible, flammable, explosive or hazardous chemicals) are used to determine combustible loading, fire protection requirements, personnel exposures to hazardous chemicals from routine and accident conditions, and a wide variety of other safety and environmental requirements

  14. Probabilistic M/EEG source imaging from sparse spatio-temporal event structure

    DEFF Research Database (Denmark)

    Stahlhut, Carsten; Attias, Hagai T.; Wipf, David

While MEG and EEG source imaging methods have to tackle a severely ill-posed problem, their success rests on their ability to constrain the solutions using appropriate priors. In this paper we propose a hierarchical Bayesian model facilitating spatio-temporal patterns through the use of bo...

  15. Environmental radiation safety source term evaluation program

    International Nuclear Information System (INIS)

    Moss, O.R.; Filipy, R.E.; Cannon, W.C.; Craig, D.K.

    1977-04-01

Plutonium-238 is currently used in the form of a pure refractory oxide as a power source on a number of space vehicles that have already been launched or will be launched during the next few years. Although the sources are designed and built to withstand re-entry into the earth's atmosphere and impact with the earth's surface without releasing any plutonium, the possibility of such an event can never be absolutely excluded. Three separate tasks were undertaken in this study. The interactions between soils and 238PuO2 aerosols which might be created in a space launch abort environment were examined. Aging of the plutonium-soil mixture under a humid atmosphere showed a trend toward the slow coagulation of two dilute aerosols. Studies on marine animals were conducted to assess the response of 238PuO2 pellets to conditions found 60 feet below the ocean surface. Ultrafilterability studies measured the solubility of 238PuO2 as a function of time, temperature, suspension concentration and molality of solvent

  16. A Probabilistic Clustering Theory of the Organization of Visual Short-Term Memory

    Science.gov (United States)

    Orhan, A. Emin; Jacobs, Robert A.

    2013-01-01

    Experimental evidence suggests that the content of a memory for even a simple display encoded in visual short-term memory (VSTM) can be very complex. VSTM uses organizational processes that make the representation of an item dependent on the feature values of all displayed items as well as on these items' representations. Here, we develop a…

  17. Probabilistic Unawareness

    Directory of Open Access Journals (Sweden)

    Mikaël Cozic

    2016-11-01

Full Text Available The modeling of awareness and unawareness is a significant topic in the doxastic logic literature, where it is usually tackled in terms of full belief operators. The present paper aims at a treatment in terms of partial belief operators. It draws upon the modal probabilistic logic that was introduced by Aumann (1999) at the semantic level, and then axiomatized by Heifetz and Mongin (2001). The paper embodies in this framework those properties of unawareness that have been highlighted in the seminal paper by Modica and Rustichini (1999). Their paper deals with full belief, but we argue that the properties in question also apply to partial belief. Our main result is a (soundness and completeness) theorem that reunites the two strands—modal and probabilistic—of doxastic logic.

  18. Insights provided by Probabilistic Safety Assessment Relating to the Loss of Electrical Sources

    International Nuclear Information System (INIS)

    Lanore, Jeanne-Marie

    2015-01-01

The loss of electrical sources is generally an important contributor to the risk related to nuclear plants. In particular, external hazard initiating events generally lead to a loss of electrical sources. This importance was underscored by the Fukushima accident. A strength of PSA is to provide insights not only into the causes of the event but also into the potential consequences (core damage prevention, large release prevention, and mitigation) with the corresponding risk impact. PSA could provide a measure of Defence-in-Depth in case of loss of a safety function. The task intends to illustrate the PSA capabilities with outstanding practical examples. The task will rely on a survey of existing PSAs. It will provide a complementary view for the ROBELSYS task. The content and status of the task are summarized in 2 slides

  19. Source term and radiological consequences of the Chernobyl accident

    International Nuclear Information System (INIS)

    Mourad, R.

    1987-09-01

This report presents the results of a study of the source term and radiological consequences of the Chernobyl accident. The results are presented in two parts. The first part was performed during the first 2 months following the accident and dealt with the evaluation of the source term and an estimate of individual doses in the European countries outside the Soviet Union. The second part was performed after August 25-29, 1986, when the Soviets presented detailed information about the accident at an IAEA conference in Vienna, including the source term and the radiological consequences in the Soviet Union. The second part of the study reconfirms the source term evaluated in the first part and in addition deals with the radiological consequences in the Soviet Union. The source term and individual doses are calculated from measured post-accident data, reported by the Soviet Union and European countries, using the microcomputer program PEAR (Public Exposure from Accident Releases). 22 refs

  20. Aerosol behavior and light water reactor source terms

    International Nuclear Information System (INIS)

    Abbey, F.; Schikarski, W.O.

    1988-01-01

The major developments in nuclear aerosol modeling following the accident at pressurized water reactor Unit 2 at Three Mile Island are briefly reviewed and the state of the art summarized. The importance and implications of these developments for severe accident source terms for light water reactors are then discussed in general terms. The treatment is not aimed at identifying specific source term values but is intended rather to illustrate trends, to assess the adequacy of the understanding of major aspects of aerosol behavior for source term prediction, and to demonstrate in qualitative terms the effect of various aspects of reactor design. Areas where improved understanding of aerosol behavior might lead to further reductions in current source term predictions are also considered

  1. Seismic Hazard characterization study using an earthquake source with Probabilistic Seismic Hazard Analysis (PSHA) method in the Northern of Sumatra

    International Nuclear Information System (INIS)

    Yahya, A.; Palupi, M. I. R.; Suharsono

    2016-01-01

The Sumatra region is one of the earthquake-prone areas in Indonesia because it lies on an active tectonic zone. In 2004 an earthquake with a moment magnitude of 9.2 occurred off the coast, about 160 km west of Nanggroe Aceh Darussalam, and triggered a tsunami. These events caused many casualties and heavy material losses, especially in the provinces of Nanggroe Aceh Darussalam and North Sumatra. To minimize the impact of future earthquake disasters, a fundamental assessment of the earthquake hazard in the region is needed. The stages of the research include a literature study, the collection and processing of seismic data, seismic source characterization, and analysis of the earthquake hazard by the probabilistic method (PSHA) using an earthquake catalogue from 1907 through 2014. The earthquake hazard is represented by the Peak Ground Acceleration (PGA) and Spectral Acceleration (SA) at periods of 0.2 and 1 second on bedrock, presented as maps with a return period of 2475 years and as earthquake hazard curves for the cities of Medan and Banda Aceh. (paper)
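A deliberately simplified hazard-curve calculation in the spirit of PSHA is sketched below for a single point source: a truncated Gutenberg-Richter recurrence model and a toy ground-motion relation give the annual rate of exceeding each PGA level, and the 2475-year hazard value is the PGA whose exceedance rate equals 1/2475. The recurrence parameters, ground-motion coefficients and source distance are assumptions for illustration, not values from the study.

```python
# A minimal, illustrative PSHA sketch for one point source (all parameters assumed).
import numpy as np
from scipy.stats import norm

a, b = 4.0, 1.0                       # Gutenberg-Richter: log10 N(>=M) = a - b*M (assumed)
m_min, m_max, dm = 5.0, 9.0, 0.1
mags = np.arange(m_min, m_max + dm, dm)
# Annual rate of events in each magnitude bin (truncated G-R recurrence).
rates = 10**(a - b * (mags - dm / 2)) - 10**(a - b * (mags + dm / 2))

dist_km = 160.0                       # source-to-site distance (assumed)
def median_pga_g(m, r_km):
    """Toy ground-motion model: ln PGA = c0 + c1*M - c2*ln(R); coefficients assumed."""
    return np.exp(-3.5 + 0.8 * m - 1.0 * np.log(r_km))

sigma_ln = 0.6                        # aleatory variability of ln PGA (assumed)
pga_levels = np.logspace(-3, 0, 60)   # 0.001 g .. 1 g

# lambda(PGA > x) = sum over magnitude bins of rate * P(PGA > x | M, R)
ln_med = np.log(median_pga_g(mags, dist_km))
prob_exceed = norm.sf((np.log(pga_levels)[:, None] - ln_med[None, :]) / sigma_ln)
annual_rate = (prob_exceed * rates[None, :]).sum(axis=1)

target = 1.0 / 2475.0                 # 2475-year return period (2% in 50 years)
pga_2475 = np.interp(target, annual_rate[::-1], pga_levels[::-1])
print(f"PGA with 2475-year return period ~ {pga_2475:.3f} g")
```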

  2. SHEDS-HT: an integrated probabilistic exposure model for prioritizing exposures to chemicals with near-field and dietary sources.

    Science.gov (United States)

    Isaacs, Kristin K; Glen, W Graham; Egeghy, Peter; Goldsmith, Michael-Rock; Smith, Luther; Vallero, Daniel; Brooks, Raina; Grulke, Christopher M; Özkaynak, Halûk

    2014-11-04

    United States Environmental Protection Agency (USEPA) researchers are developing a strategy for high-throughput (HT) exposure-based prioritization of chemicals under the ExpoCast program. These novel modeling approaches for evaluating chemicals based on their potential for biologically relevant human exposures will inform toxicity testing and prioritization for chemical risk assessment. Based on probabilistic methods and algorithms developed for The Stochastic Human Exposure and Dose Simulation Model for Multimedia, Multipathway Chemicals (SHEDS-MM), a new mechanistic modeling approach has been developed to accommodate high-throughput (HT) assessment of exposure potential. In this SHEDS-HT model, the residential and dietary modules of SHEDS-MM have been operationally modified to reduce the user burden, input data demands, and run times of the higher-tier model, while maintaining critical features and inputs that influence exposure. The model has been implemented in R; the modeling framework links chemicals to consumer product categories or food groups (and thus exposure scenarios) to predict HT exposures and intake doses. Initially, SHEDS-HT has been applied to 2507 organic chemicals associated with consumer products and agricultural pesticides. These evaluations employ data from recent USEPA efforts to characterize usage (prevalence, frequency, and magnitude), chemical composition, and exposure scenarios for a wide range of consumer products. In modeling indirect exposures from near-field sources, SHEDS-HT employs a fugacity-based module to estimate concentrations in indoor environmental media. The concentration estimates, along with relevant exposure factors and human activity data, are then used by the model to rapidly generate probabilistic population distributions of near-field indirect exposures via dermal, nondietary ingestion, and inhalation pathways. Pathway-specific estimates of near-field direct exposures from consumer products are also modeled
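A toy version of the high-throughput idea, with no connection to the actual SHEDS-HT inputs, is sketched below: per-person product-use and exposure factors are sampled and combined into a population distribution of intake dose. Every distribution, numerical value and pathway choice is an assumption made for illustration.

```python
# A minimal sketch of probabilistic population exposure for one chemical/product pair.
import numpy as np

rng = np.random.default_rng(7)
n_people = 100_000

use_prevalence = 0.4                                     # fraction of population using product
uses_product = rng.random(n_people) < use_prevalence
mass_per_use_g = rng.lognormal(mean=np.log(1.0), sigma=0.8, size=n_people)   # g per use
uses_per_day = rng.poisson(lam=1.5, size=n_people)
chem_fraction = 0.02                                     # chemical weight fraction in product
dermal_transfer = rng.beta(2, 20, size=n_people)         # fraction reaching skin
body_weight_kg = np.clip(rng.normal(70, 12, size=n_people), 30, None)

dose_mg_per_kg_day = (uses_product * mass_per_use_g * 1000.0 * uses_per_day
                      * chem_fraction * dermal_transfer / body_weight_kg)
print("median, 95th percentile (mg/kg/day):",
      np.quantile(dose_mg_per_kg_day, [0.5, 0.95]))
```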

  3. Medium-Term Probabilistic Forecasting of Extremely Low Prices in Electricity Markets: Application to the Spanish Case

    Directory of Open Access Journals (Sweden)

    Antonio Bello

    2016-03-01

Full Text Available One of the most relevant challenges that have arisen in electricity markets during the last few years is the emergence of extremely low prices. Trying to predict these events is crucial for market agents in a competitive environment. This paper proposes a novel methodology to simultaneously accomplish point and probabilistic hourly predictions about the appearance of extremely low electricity prices in a medium-term scope. The proposed approach for making real ex ante forecasts consists of a nested compounding of different forecasting techniques, which incorporate Monte Carlo simulation, combined with spatial interpolation techniques. The procedure is based on the statistical identification of the process key drivers. Logistic regression for rare events, decision trees, multilayer perceptrons and a hybrid approach, which combines a market equilibrium model with logistic regression, are used. Moreover, this paper assesses whether periodic models in which parameters switch according to the day of the week can be even more accurate. The proposed techniques are compared to a Markov regime switching model and several naive methods. The proposed methodology empirically demonstrates its effectiveness by achieving promising results on a real case study based on the Spanish electricity market. This approach can provide valuable information for market agents when they face decision making and risk-management processes. Our findings support the additional benefit of using a hybrid approach for deriving more accurate predictions.
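One of the building blocks named in this record, logistic regression for a rare "extremely low price" event, can be sketched as follows on synthetic data; the explanatory variables and all coefficients are assumptions for illustration and are not the paper's key drivers.

```python
# A minimal sketch of a rare-event probability forecast with logistic regression.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 20_000
# Toy explanatory variables: forecast wind share of demand, demand level, weekend flag.
wind_share = rng.uniform(0, 0.7, n)
demand = rng.normal(28_000, 4_000, n)          # MW
weekend = rng.integers(0, 2, n)
# Synthetic rare event: low prices more likely with high wind, low demand, weekends.
logit = -6.0 + 8.0 * wind_share - 0.00015 * (demand - 28_000) + 0.8 * weekend
y = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([wind_share, demand, weekend])
model = LogisticRegression(class_weight="balanced", max_iter=1000).fit(X, y)

# Probabilistic hourly forecast for a new market situation.
x_new = np.array([[0.55, 24_000, 1]])
print("P(extremely low price) =", model.predict_proba(x_new)[0, 1])
```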

  4. Dose assessments for Greifswald and Cadarache with new source terms from ITER NSSR-1

    International Nuclear Information System (INIS)

    Raskob, W.; Forschungszentrum Karlsruhe GmbH Technik und Umwelt; Hasemann, I.

    1997-08-01

Probabilistic dose assessments for accidental atmospheric releases of various ITER source terms which contain tritium and/or activation products were performed for the sites of Greifswald, Germany, and Cadarache, France. No country-specific rules were applied and the input parameters were adapted as far as possible to those used within former ITER studies to achieve a better comparability with site independent dose assessments performed in the frame of ITER. The calculations were based on source terms which, for the first time, contain a combination of tritium and activation products. This allowed a better judgement of the contribution of the individual fusion relevant materials to the total dose. The results were compared to site independent dose limits defined in the frame of ITER. Source terms for two different categories, representing 'extremely unlikely events' (CAT-IV) and 'hypothetical sequences' (CAT-V), were investigated. In no case did the release scenarios of category CAT-IV exceed the ITER limits. In addition, early doses from the hypothetical scenarios of type CAT-V were still below 50 mSv or 100 mSv, values which are commonly used as lower reference values for evacuation in many potential home countries of ITER. Only the banning of food products was found to be a potential countermeasure which may affect larger areas. (orig.)

  5. When probabilistic seismic hazard climbs volcanoes: the Mt. Etna case, Italy - Part 1: Model components for sources parameterization

    Science.gov (United States)

    Azzaro, Raffaele; Barberi, Graziella; D'Amico, Salvatore; Pace, Bruno; Peruzza, Laura; Tuvè, Tiziana

    2017-11-01

The volcanic region of Mt. Etna (Sicily, Italy) represents a perfect lab for testing innovative approaches to seismic hazard assessment. This is largely due to the long record of historical and recent observations of seismic and tectonic phenomena, the high quality of various geophysical monitoring networks and, in particular, the rapid geodynamics, which clearly demonstrates some seismotectonic processes. We present here the model components and the procedures adopted for defining seismic sources to be used in a new generation of probabilistic seismic hazard assessment (PSHA), the first results and maps of which are presented in a companion paper, Peruzza et al. (2017). The sources include, with increasing complexity, seismic zones, individual faults and gridded point sources that are obtained by integrating geological field data with long and short earthquake datasets (the historical macroseismic catalogue, which covers about 3 centuries, and a high-quality instrumental location database for the last decades). The analysis of the frequency-magnitude distribution identifies two main fault systems within the volcanic complex featuring different seismic rates that are controlled essentially by volcano-tectonic processes. We discuss the variability of the mean occurrence times of major earthquakes along the main Etnean faults by using an historical approach and a purely geologic method. We derive a magnitude-size scaling relationship specifically for this volcanic area, which has been implemented into a recently developed software tool - FiSH (Pace et al., 2016) - that we use to calculate the characteristic magnitudes and the related mean recurrence times expected for each fault. Results suggest that for the Mt. Etna area, the traditional assumptions of uniform and Poissonian seismicity can be relaxed; a time-dependent fault-based modeling, joined with a 3-D imaging of volcano-tectonic sources depicted by the recent instrumental seismicity, can therefore be implemented in PSHA maps

  6. ITER Safety Task NID-5A, Subtask 1-1: Source terms and energies - initial tritium source terms. Final report

    International Nuclear Information System (INIS)

    Fong, C.; Kalyanam, K.M.; Tanaka, M.R.; Sood, S.; Natalizio, A.; Delisle, M.

    1995-02-01

The overall objective of the Early Safety and Environmental Characterization Study (ESECS) is to assess the environmental impact of tritium using appropriate assumptions on a hypothetical site for ITER, having the reference site characteristics as proposed by the JCT. The objective of this work under the above subtask 1-1, NID-5a, is to determine environmental source terms (i.e., process source term x containment release fraction) for the fuel cycle and cooling systems. The work is based on inventories and process source terms (i.e., inventory x mobilization fraction), provided by others (under Task NID 3b). The results of this work form the basis for the determination, by others, of the off-site dose (i.e., environmental source term x dose/release ratio). For the determination of the environmental source terms, the TMAP4 code has been utilized (ref 1). This code is approved by ITER for safety assessment. 6 refs

  7. Short-term wind power forecasting: probabilistic and space-time aspects

    DEFF Research Database (Denmark)

    Tastu, Julija

Optimal integration of wind energy into power systems calls for high quality wind power predictions. State-of-the-art forecasting systems typically provide forecasts for every location individually, without taking into account information coming from the neighbouring territories. This work deals with the proposal and evaluation of new mathematical models and forecasting methods for short-term wind power forecasting, accounting for space-time dynamics based on geographically distributed information. Different forms of power predictions are considered, starting from traditional point forecasts, and the ways in which spatio-temporal information is integrated into the corresponding models are analysed. As a final step, emphasis is placed on generating space-time trajectories: this calls for the prediction of joint multivariate predictive densities describing wind power generation at a number of distributed locations and for a number of successive lead times.

  8. Probabilistic diffusion tractography of the optic radiations and visual function in preterm infants at term equivalent age.

    Science.gov (United States)

    Bassi, Laura; Ricci, Daniela; Volzone, Anna; Allsop, Joanna M; Srinivasan, Latha; Pai, Aakash; Ribes, Carmen; Ramenghi, Luca A; Mercuri, Eugenio; Mosca, Fabio; Edwards, A David; Cowan, Frances M; Rutherford, Mary A; Counsell, Serena J

    2008-02-01

    Children born prematurely have a high incidence of visual disorders which cannot always be explained by focal retinal or brain lesions. The aim of this study was to test the hypothesis that visual function in preterm infants is related to the microstructural development of white matter in the optic radiations. We used diffusion tensor imaging (DTI) with probabilistic diffusion tractography to delineate the optic radiations at term equivalent age and compared the fractional anisotropy (FA) to a contemporaneous evaluation of visual function. Thirty-seven preterm infants (19 male) born at median (range) 28(+4) (24(+1)-32(+3)) weeks gestational age, were examined at a post-menstrual age of 42 (39(+6)-43) weeks. MRI and DTI were acquired on a 3 Tesla MR system with DTI obtained in 15 non-collinear directions with a b value of 750 s/mm(2). Tracts were generated from a seed mask placed in the white matter lateral to the lateral geniculate nucleus and mean FA values of these tracts were determined. Visual assessment was performed using a battery of nine items assessing different aspects of visual abilities. Ten infants had evidence of cerebral lesions on conventional MRI. Multiple regression analysis demonstrated that the visual assessment score was independently correlated with FA values, but not gestational age at birth, post-menstrual age at scan or the presence of lesions on conventional MRI. The occurrence of mild retinopathy of prematurity did not affect the FA measures or visual scores. We then performed a secondary analysis using tract-based spatial statistics to determine whether global brain white matter development was related to visual function and found that only FA in the optic radiations was correlated with visual assessment score. Our results suggest that in preterm infants at term equivalent age visual function is directly related to the development of white matter in the optic radiations.

  9. Probabilistic integrated risk assessment of human exposure risk to environmental bisphenol A pollution sources.

    Science.gov (United States)

    Fu, Keng-Yen; Cheng, Yi-Hsien; Chio, Chia-Pin; Liao, Chung-Min

    2016-10-01

Environmental bisphenol A (BPA) exposure has been linked to a variety of adverse health effects such as developmental and reproductive issues. However, establishing a clear association between BPA and the likelihood of adverse human health outcomes is complex and fundamentally uncertain. The purpose of this study was to assess the potential exposure risks from environmental BPA among the Chinese population based on five human health outcomes, namely immune response, uterotrophic assay, cardiovascular disease (CVD), diabetes, and behavior change. We addressed these health concerns by using a stochastic integrated risk assessment approach. The BPA dose-dependent likelihood of effects was reconstructed by a series of Hill models based on animal models or epidemiological data. We developed a physiologically based pharmacokinetic (PBPK) model that allows estimation of urinary BPA concentration from external exposures. We showed that the daily average exposure concentrations of BPA and the urinary BPA estimates were consistent with published data. We found that BPA exposures were less likely to pose significant risks for infants (0-1 year) and adults (male and female >20 years). The approach can be used to characterize human long-term BPA susceptibility in relation to multiple exposure pathways, and to inform the public of the negligible magnitude of environmental BPA pollution impacts on human health.
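The dose-response building block mentioned in this record, a Hill model linking an internal dose metric to the likelihood of an effect, can be sketched as below, with exposure uncertainty propagated by simple Monte Carlo; the parameter values and the exposure distribution are illustrative assumptions, not the study's fitted estimates.

```python
# A minimal sketch: Hill dose-response function with Monte Carlo exposure propagation.
import numpy as np

def hill(dose, emax=1.0, ed50=5.0, n=2.0):
    """Hill function: effect = Emax * dose^n / (ED50^n + dose^n)."""
    dose = np.asarray(dose, dtype=float)
    return emax * dose**n / (ed50**n + dose**n)

# Propagate uncertainty in an internal dose metric (e.g., a reconstructed urinary
# concentration, in ug/L) through the dose-response curve.
rng = np.random.default_rng(1)
exposure = rng.lognormal(mean=np.log(2.0), sigma=0.7, size=50_000)
risk = hill(exposure)
print("median and 97.5th percentile effect probability:",
      np.quantile(risk, [0.5, 0.975]))
```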

  10. Scoping Analysis of Source Term and Functional Containment Attenuation Factors

    Energy Technology Data Exchange (ETDEWEB)

    Pete Lowry

    2012-10-01

    In order to meet future regulatory requirements, the Next Generation Nuclear Plant (NGNP) Project must fully establish and validate the mechanistic modular high temperature gas-cooled reactor (HTGR) source term. This is not possible at this stage in the project, as significant uncertainties in the final design remain unresolved. In the interim, however, there is a need to establish an approximate characterization of the source term. The NGNP team developed a simplified parametric model to establish mechanistic source term estimates for a set of proposed HTGR configurations.

  11. Scoping Analysis of Source Term and Functional Containment Attenuation Factors

    Energy Technology Data Exchange (ETDEWEB)

    Pete Lowry

    2012-02-01

    In order to meet future regulatory requirements, the Next Generation Nuclear Plant (NGNP) Project must fully establish and validate the mechanistic modular high temperature gas-cooled reactor (HTGR) source term. This is not possible at this stage in the project, as significant uncertainties in the final design remain unresolved. In the interim, however, there is a need to establish an approximate characterization of the source term. The NGNP team developed a simplified parametric model to establish mechanistic source term estimates for a set of proposed HTGR configurations.

  12. Scoping Analysis of Source Term and Functional Containment Attenuation Factors

    Energy Technology Data Exchange (ETDEWEB)

    Pete Lowry

    2012-01-01

    In order to meet future regulatory requirements, the Next Generation Nuclear Plant (NGNP) Project must fully establish and validate the mechanistic modular high temperature gas-cooled reactor (HTGR) source term. This is not possible at this stage in the project, as significant uncertainties in the final design remain unresolved. In the interim, however, there is a need to establish an approximate characterization of the source term. The NGNP team developed a simplified parametric model to establish mechanistic source term estimates for a set of proposed HTGR configurations.

  13. Effect of source term composition on offsite doses

    International Nuclear Information System (INIS)

    Karahalios, P.; Gardner, R.

    1985-01-01

    The development of new realistic accident source terms has identified the need to establish a basis for comparing the impact of such source terms. This paper attempts to develop a generalized basis of comparison by investigating contributions to offsite acute whole body doses from each group of radionuclides being released to the atmosphere, using CRAC2. The paper also investigates the effect of important parameters such as regional meteorology, sheltering, and duration of release. Finally, the paper focuses on significant changes in the relative importance of individual radionuclide groups in PWR2, SST1, and a revision of the Stone and Webster proposed interim source term

  14. Source term analyses under severe accidents for KNGR

    Energy Technology Data Exchange (ETDEWEB)

    Song, Yong Mann; Park, Soo Yong

    2001-03-01

In this study, the in-containment source term for LOFW (Loss of Feed Water), which has appeared as the most frequent core melt accident, is calculated and compared with the NUREG-1465 source term. This study provides not only new source term data, using MELCOR 1.8.4 and its state-of-the-art models, but also a basis for evaluating the KNGR design and its mitigation capability under severe accidents. As the selected accident is identical to LOFW-S17, which has been analyzed using MAAP by KEPCO with the only difference being 2 SITs, a mutual comparison of the results is especially expected.

  15. Long-term medical costs and life expectancy of acute myeloid leukemia: a probabilistic decision model.

    Science.gov (United States)

    Wang, Han-I; Aas, Eline; Howell, Debra; Roman, Eve; Patmore, Russell; Jack, Andrew; Smith, Alexandra

    2014-03-01

    Acute myeloid leukemia (AML) can be diagnosed at any age and treatment, which can be given with supportive and/or curative intent, is considered expensive compared with that for other cancers. Despite this, no long-term predictive models have been developed for AML, mainly because of the complexities associated with this disease. The objective of the current study was to develop a model (based on a UK cohort) to predict cost and life expectancy at a population level. The model developed in this study combined a decision tree with several Markov models to reflect the complexity of the prognostic factors and treatments of AML. The model was simulated with a cycle length of 1 month for a time period of 5 years and further simulated until age 100 years or death. Results were compared for two age groups and five different initial treatment intents and responses. Transition probabilities, life expectancies, and costs were derived from a UK population-based specialist registry-the Haematological Malignancy Research Network (www.hmrn.org). Overall, expected 5-year medical costs and life expectancy ranged from £8,170 to £81,636 and 3.03 to 34.74 months, respectively. The economic and health outcomes varied with initial treatment intent, age at diagnosis, trial participation, and study time horizon. The model was validated by using face, internal, and external validation methods. The results show that the model captured more than 90% of the empirical costs, and it demonstrated good fit with the empirical overall survival. Costs and life expectancy of AML varied with patient characteristics and initial treatment intent. The robust AML model developed in this study could be used to evaluate new diagnostic tools/treatments, as well as enable policy makers to make informed decisions. Copyright © 2014 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
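A generic sketch of the modelling approach described in this record, a Markov cohort model with monthly cycles accumulating costs and survival, is given below; the states, transition probabilities and unit costs are invented for illustration and are not the study's estimates.

```python
# A minimal, generic Markov cohort model with monthly cycles (all numbers invented).
import numpy as np

states = ["remission", "relapse", "dead"]
# Monthly transition matrix (rows are "from" states and sum to 1).
P = np.array([[0.96, 0.03, 0.01],
              [0.00, 0.90, 0.10],
              [0.00, 0.00, 1.00]])
monthly_cost = np.array([800.0, 4500.0, 0.0])    # cost per month in each state (GBP)

cohort = np.array([1.0, 0.0, 0.0])               # everyone starts in remission
months = 60                                       # 5-year horizon
total_cost, life_months = 0.0, 0.0
for _ in range(months):
    total_cost += cohort @ monthly_cost
    life_months += cohort[:2].sum()               # expected months spent alive
    cohort = cohort @ P                           # advance the cohort one cycle

print(f"expected 5-year cost: ~GBP {total_cost:,.0f}")
print(f"expected survival within 5 years: {life_months:.1f} months")
```

In a full analysis the transition probabilities and costs would differ by age, treatment intent and response, as the record describes, and the horizon would be extended to age 100 or death.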

  16. Development of source term PIRT of Fukushima Daiichi NPPs accident

    International Nuclear Information System (INIS)

    Suehiro, S.; Okamoto, K.

    2017-01-01

The severe accident evaluation committee of the AESJ (Atomic Energy Society of Japan) developed a thermal hydraulic PIRT (Phenomena Identification and Ranking Table) and a source term PIRT based on findings from the Fukushima Daiichi NPPs accident. These PIRTs aimed to explore the debris distribution and the current condition of the NPPs with high accuracy, and to identify the phenomena of higher priority for improving the analytical technology used by the codes to predict severe accident phenomena. The source term PIRT was divided into 3 phases for the time domain and 9 categories for the spatial domain. A total of 68 phenomena were extracted, and their importance from the viewpoint of the source term was ranked through brainstorming and discussion. This paper describes the developed source term PIRT list and summarizes the highly ranked phenomena in each phase. (author)

  17. Revised accident source terms and control room habitability

    International Nuclear Information System (INIS)

    Lahti, G.P.; Hubner, R.S.; Johnson, W.J.; Schwartz, B.C.

    1993-01-01

In April 1992, the NRC staff presented to the Commissioners the draft NUREG "Revised Accident Source Terms for Light-Water Nuclear Power Plants." This document is the culmination of more than ten years of NRC-sponsored research and represents the first change in the NRC's position on source terms since TID-14844 was issued in 1962. The purpose of this paper is to investigate the impact of the revised source terms on the current approach to analyzing control room habitability as required by 10 CFR 50. Sample calculations are presented that identify aspects of the model requiring clarification before the implementation of the revised source terms. 6 refs., 4 tabs

  18. The latest results from source term research. Overview and outlook

    Energy Technology Data Exchange (ETDEWEB)

    Herranz, Luis E. [Centro de Investigaciones Energeticas Medio Ambientales y Tecnologica (CIEMAT), Madrid (Spain); Haste, Tim [Centre d' Etudes de Cadarache, Paul-Lez-Durance (France). Institut de Radioprotection et de Surete Nucleaire (IRSN); Kaerkelae, Teemu [VTT Technical Research Centre of Finland Ltd, Espoo (Finland)

    2016-12-15

    Source term research has continued internationally for more than 30 years, increasing confidence in calculations of the potential radioactive release to the environment after a severe reactor accident. Important experimental data have been obtained, mainly under international frameworks such as OECD/NEA and EURATOM. Specifically, Phebus FP provides major insights into fission product release and transport. Results are included in severe accident analysis codes. Data from international projects are being interpreted with a view to further improvements in these codes. This paper synthesizes the recent main outcomes from source term research on these topics, and on source term mitigation. It highlights knowledge gaps remaining and discusses ways to proceed. Aside from this further knowledge-driven research, there is consensus on the need to assess the source term predictive ability of current system codes, taking account of scale-up from experiment to reactor conditions.

  19. Revised accident source terms for light-water reactors

    Energy Technology Data Exchange (ETDEWEB)

    Soffer, L. [Nuclear Regulatory Commission, Washington, DC (United States)

    1995-02-01

This paper presents revised accident source terms for light-water reactors incorporating the severe accident research insights gained in this area over the last 15 years. Current LWR reactor accident source terms used for licensing date from 1962 and are contained in Regulatory Guides 1.3 and 1.4. These specify that 100% of the core inventory of noble gases and 25% of the iodine fission products are assumed to be instantaneously available for release from the containment. The chemical form of the iodine fission products is also assumed to be predominantly elemental iodine. These assumptions have strongly affected present nuclear air cleaning requirements by emphasizing rapid actuation of spray systems and filtration systems optimized to retain elemental iodine. A proposed revision of reactor accident source terms and some implications for nuclear air cleaning requirements was presented at the 22nd DOE/NRC Nuclear Air Cleaning Conference. A draft report was issued by the NRC for comment in July 1992. Extensive comments were received, with the most significant comments involving (a) release fractions for both volatile and non-volatile species in the early in-vessel release phase, (b) gap release fractions of the noble gases, iodine and cesium, and (c) the timing and duration for the release phases. The final source term report is expected to be issued in late 1994. Although the revised source terms are intended primarily for future plants, current nuclear power plants may request use of revised accident source term insights as well in licensing. This paper emphasizes additional information obtained since the 22nd Conference, including studies on fission product removal mechanisms, results obtained from improved severe accident code calculations and resolution of major comments, and their impact upon the revised accident source terms. Revised accident source terms for both BWRs and PWRs are presented.

  20. The Multimedia Environmental Pollutant Assessment System (MEPAS)®: Source-term release formulations

    International Nuclear Information System (INIS)

    Streile, G.P.; Shields, K.D.; Stroh, J.L.; Bagaasen, L.M.; Whelan, G.; McDonald, J.P.; Droppo, J.G.; Buck, J.W.

    1996-11-01

    This report is one of a series of reports that document the mathematical models in the Multimedia Environmental Pollutant Assessment System (MEPAS). Developed by Pacific Northwest National Laboratory for the US Department of Energy, MEPAS is an integrated impact assessment software implementation of physics-based fate and transport models in air, soil, and water media. Outputs are estimates of exposures and health risk assessments for radioactive and hazardous pollutants. Each of the MEPAS formulation documents covers a major MEPAS component such as source-term, atmospheric, vadose zone/groundwater, surface water, and health exposure/health impact assessment. Other MEPAS documentation reports cover the sensitivity/uncertainty formulations and the database parameter constituent property estimation methods. The pollutant source-term release component is documented in this report. MEPAS simulates the release of contaminants from a source, transport through the air, groundwater, surface water, or overland pathways, and transfer through food chains and exposure pathways to the exposed individual or population. For human health impacts, risks are computed for carcinogens and hazard quotients for noncarcinogens. MEPAS is implemented on a desktop computer with a user-friendly interface that allows the user to define the problem, input the required data, and execute the appropriate models for both deterministic and probabilistic analyses

  1. Probabilistic insurance

    OpenAIRE

    Wakker, P.P.; Thaler, R.H.; Tversky, A.

    1997-01-01

Probabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in the premium to compensate for a 1% default risk. While these preferences are intuitively appealing they are difficult to reconcile with expected utility theory. Under highly plausible assumptions about the utility function, willingness to pay for probabilistic i...

  2. Probabilistic short-term forecasting of eruption rate at Kīlauea Volcano using a physics-based model

    Science.gov (United States)

    Anderson, K. R.

    2016-12-01

    Deterministic models of volcanic eruptions yield predictions of future activity conditioned on uncertainty in the current state of the system. Physics-based eruption models are well-suited for deterministic forecasting as they can relate magma physics with a wide range of observations. Yet, physics-based eruption forecasting is strongly limited by an inadequate understanding of volcanic systems, and the need for eruption models to be computationally tractable. At Kīlauea Volcano, Hawaii, episodic depressurization-pressurization cycles of the magma system generate correlated, quasi-exponential variations in ground deformation and surface height of the active summit lava lake. Deflations are associated with reductions in eruption rate, or even brief eruptive pauses, and thus partly control lava flow advance rates and associated hazard. Because of the relatively well-understood nature of Kīlauea's shallow magma plumbing system, and because more than 600 of these events have been recorded to date, they offer a unique opportunity to refine a physics-based effusive eruption forecasting approach and apply it to lava eruption rates over short (hours to days) time periods. A simple physical model of the volcano ascribes observed data to temporary reductions in magma supply to an elastic reservoir filled with compressible magma. This model can be used to predict the evolution of an ongoing event, but because the mechanism that triggers events is unknown, event durations are modeled stochastically from previous observations. A Bayesian approach incorporates diverse data sets and prior information to simultaneously estimate uncertain model parameters and future states of the system. Forecasts take the form of probability distributions for eruption rate or cumulative erupted volume at some future time. Results demonstrate the significant uncertainties that still remain even for short-term eruption forecasting at a well-monitored volcano - but also the value of a physics
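A highly simplified sketch of this forecasting idea is given below: during a deflation event the eruption rate is reduced, the event duration is drawn from a distribution standing in for past observations, and repeated simulation yields a probabilistic forecast of erupted volume over the next 24 hours. The functional forms and all parameter values are assumptions for illustration, not the published model.

```python
# A minimal sketch of probabilistic short-term effusion forecasting (all values assumed).
import numpy as np

rng = np.random.default_rng(3)

q0 = 4.0            # background eruption rate, m^3/s (assumed)
drop = 0.8          # fractional rate reduction during the event (assumed)
tau_h = 6.0         # exponential recovery time constant after the event, hours (assumed)
horizon_h, dt_h = 24.0, 0.1
t = np.arange(0.0, horizon_h, dt_h)

# Stochastic event durations, e.g. resampled from previous events (here lognormal).
durations_h = rng.lognormal(mean=np.log(10.0), sigma=0.5, size=5000)

volumes = []
for d in durations_h:
    rate = np.where(t < d,
                    q0 * (1.0 - drop),                          # reduced rate during event
                    q0 - q0 * drop * np.exp(-(t - d) / tau_h))  # quasi-exponential recovery
    volumes.append(rate.sum() * dt_h * 3600.0)                  # m^3 erupted over the horizon
volumes = np.array(volumes)

print("24-hour erupted volume (m^3), 5th/50th/95th percentiles:",
      np.round(np.quantile(volumes, [0.05, 0.5, 0.95])))
```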

  3. Bayesian source term determination with unknown covariance of measurements

    Science.gov (United States)

    Belal, Alkomiet; Tichý, Ondřej; Šmídl, Václav

    2017-04-01

Determination of a source term of release of a hazardous material into the atmosphere is a very important task for emergency response. We are concerned with the problem of estimation of the source term in the conventional linear inverse problem, y = Mx, where the relationship between the vector of observations y and the unknown source term x is described using the source-receptor-sensitivity (SRS) matrix M. Since the system is typically ill-conditioned, the problem is recast as the minimization of (y - Mx)^T R^{-1} (y - Mx) + x^T B^{-1} x over x. The first term minimizes the error of the measurements with covariance matrix R, and the second term is a regularization of the source term. Different types of regularization arise for different choices of the matrices R and B; for example, Tikhonov regularization assumes the covariance matrix B to be the identity matrix multiplied by a scalar parameter. In this contribution, we adopt a Bayesian approach to make inference on the unknown source term x as well as the unknown R and B. We assume the prior on x to be Gaussian with zero mean and unknown diagonal covariance matrix B. The covariance matrix of the likelihood R is also unknown. We consider two potential choices of the structure of the matrix R. The first is a diagonal matrix and the second is a locally correlated structure using information on the topology of the measuring network. Since the inference of the model is intractable, an iterative variational Bayes algorithm is used for simultaneous estimation of all model parameters. The practical usefulness of our contribution is demonstrated on an application of the resulting algorithm to real data from the European Tracer Experiment (ETEX). This research is supported by EEA/Norwegian Financial Mechanism under project MSMT-28477/2014 Source-Term Determination of Radionuclide Releases by Inverse Atmospheric Dispersion Modelling (STRADI).
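The regularized inversion written above has a closed-form minimizer when R and B are held fixed (the contribution described here goes further and estimates them by variational Bayes). The sketch below solves that fixed-covariance case on synthetic data; the SRS matrix, the noise level and the prior covariance are stand-in assumptions, not ETEX data.

```python
# A minimal sketch of the fixed-covariance (Tikhonov-like) source term inversion.
import numpy as np

rng = np.random.default_rng(5)
n_obs, n_src = 40, 12
M = rng.lognormal(mean=-2.0, sigma=1.0, size=(n_obs, n_src))    # toy SRS matrix
x_true = np.zeros(n_src); x_true[3] = 50.0; x_true[4] = 20.0    # sparse release profile
y = M @ x_true + rng.normal(0.0, 0.5, size=n_obs)               # noisy observations

R = 0.25 * np.eye(n_obs)          # measurement-error covariance (assumed known here)
B = 100.0 * np.eye(n_src)         # prior covariance of the source term

# Minimizer of (y - Mx)^T R^-1 (y - Mx) + x^T B^-1 x:
#   x_hat = (M^T R^-1 M + B^-1)^-1 M^T R^-1 y
A = M.T @ np.linalg.solve(R, M) + np.linalg.inv(B)
x_hat = np.linalg.solve(A, M.T @ np.linalg.solve(R, y))
print(np.round(x_hat, 1))
```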

  4. Utility view of the source term and air cleaning

    International Nuclear Information System (INIS)

    Littlefield, P.S.

    1985-01-01

    The utility view of the source term and air cleaning is discussed. The source term is made up of: (1) noble gases, which there has been a tendency to ignore in the past because it was thought there was nothing that could be done with them anyway, (2) the halogens, which have been dealt with in Air Cleaning Conferences in the past in terms of charcoal and other systems for removing them, and (3) the solid components of the source term which particulate filters are designed to handle. Air cleaning systems consist of filters, adsorbers, containment sprays, suppression pools in boiling water reactors and ice beds in ice condenser-equipped plants. The feasibility and cost of air cleaning systems are discussed

  5. Low-level radioactive waste performance assessments: Source term modeling

    International Nuclear Information System (INIS)

    Icenhour, A.S.; Godbee, H.W.; Miller, L.F.

    1995-01-01

Low-level radioactive wastes (LLW) generated by government and commercial operations need to be isolated from the environment for at least 300 to 500 yr. Most existing sites for the storage or disposal of LLW employ the shallow-land burial approach. However, the U.S. Department of Energy currently emphasizes the use of engineered systems (e.g., packaging, concrete and metal barriers, and water collection systems). Future commercial LLW disposal sites may include such systems to mitigate radionuclide transport through the biosphere. Performance assessments must be conducted for LLW disposal facilities. These studies include comprehensive evaluations of radionuclide migration from the waste package, through the vadose zone, and within the water table. Atmospheric transport mechanisms are also studied. Figure 1 illustrates the performance assessment process. Estimates of the release of radionuclides from the waste packages (i.e., source terms) are used for subsequent hydrogeologic calculations required by a performance assessment. Computer models are typically used to describe the complex interactions of water with LLW and to determine the transport of radionuclides. Several commonly used computer programs for evaluating source terms include GWSCREEN, BLT (Breach-Leach-Transport), DUST (Disposal Unit Source Term), BARRIER (Ref. 5), as well as SOURCE1 and SOURCE2 (which are used in this study). The SOURCE1 and SOURCE2 codes were prepared by Rogers and Associates Engineering Corporation for the Oak Ridge National Laboratory (ORNL). SOURCE1 is designed for tumulus-type facilities, and SOURCE2 is tailored for silo, well-in-silo, and trench-type disposal facilities. This paper focuses on the source term for ORNL disposal facilities, and it describes improved computational methods for determining radionuclide transport from waste packages

  6. Probabilistic Networks

    DEFF Research Database (Denmark)

    Jensen, Finn Verner; Lauritzen, Steffen Lilholt

    2001-01-01

    This article describes the basic ideas and algorithms behind specification and inference in probabilistic networks based on directed acyclic graphs, undirected graphs, and chain graphs.

  7. Probabilistic Insurance

    NARCIS (Netherlands)

    Wakker, P.P.; Thaler, R.H.; Tversky, A.

    1997-01-01

    Probabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in premium to compensate for a 1% default risk. These observations cannot be

  8. Probabilistic Insurance

    NARCIS (Netherlands)

    P.P. Wakker (Peter); R.H. Thaler (Richard); A. Tversky (Amos)

    1997-01-01

    textabstractProbabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in the premium to compensate for a 1% default risk. While these

  9. Selection of models to calculate the LLW source term

    International Nuclear Information System (INIS)

    Sullivan, T.M.

    1991-10-01

    Performance assessment of a LLW disposal facility begins with an estimation of the rate at which radionuclides migrate out of the facility (i.e., the source term). The focus of this work is to develop a methodology for calculating the source term. In general, the source term is influenced by the radionuclide inventory, the wasteforms and containers used to dispose of the inventory, and the physical processes that lead to release from the facility (fluid flow, container degradation, wasteform leaching, and radionuclide transport). In turn, many of these physical processes are influenced by the design of the disposal facility (e.g., infiltration of water). The complexity of the problem and the absence of appropriate data prevent development of an entirely mechanistic representation of radionuclide release from a disposal facility. Typically, a number of assumptions, based on knowledge of the disposal system, are used to simplify the problem. This document provides a brief overview of disposal practices and reviews existing source term models as background for selecting appropriate models for estimating the source term. The selection rationale and the mathematical details of the models are presented. Finally, guidance is presented for combining the inventory data with appropriate mechanisms describing release from the disposal facility. 44 refs., 6 figs., 1 tab

  10. Review of SFR In-Vessel Radiological Source Term Studies

    International Nuclear Information System (INIS)

    Suk, Soo Dong; Lee, Yong Bum

    2008-10-01

    An effort has been made in this study to search for and review the literature in the public domain on studies of the phenomena related to the release of radionuclides and aerosols to the reactor containment of sodium fast reactor (SFR) plants (i.e., the in-vessel source term), carried out in Japan and in Europe, including France, Germany and the UK, over the last few decades. The review focuses on the experimental programs to investigate the phenomena related to determining the source terms, with a brief review of supporting analytical models and computer programs. In this report, the research programs conducted to investigate the CDA (core disruptive accident) bubble behavior in the sodium pool for determining the 'primary' or 'instantaneous' source term are first introduced. The studies performed to determine the 'delayed source term' are then described, including the various stages of phenomena and processes: fission product (FP) release from fuel, evaporation release from the surface of the pool, iodine mass transfer from fission gas bubbles, FP deposition, and aerosol release from core-concrete interaction. The research programs to investigate the release and transport of FPs and aerosols in the reactor containment (i.e., the in-containment source term) are not described in this report.

  11. The Analytical Repository Source-Term (AREST) model: Description and documentation

    International Nuclear Information System (INIS)

    Liebetrau, A.M.; Apted, M.J.; Engel, D.W.; Altenhofen, M.K.; Strachan, D.M.; Reid, C.R.; Windisch, C.F.; Erikson, R.L.; Johnson, K.I.

    1987-10-01

    The geologic repository system consists of several components, one of which is the engineered barrier system. The engineered barrier system interfaces with natural barriers that constitute the setting of the repository. A model that simulates the releases from the engineered barrier system into the natural barriers of the geosphere, called a source-term model, is an important component of any model for assessing the overall performance of the geologic repository system. The Analytical Repository Source-Term (AREST) model being developed is one such model. This report describes the current state of development of the AREST model and the code in which the model is implemented. The AREST model consists of three component models and five process models that describe the post-emplacement environment of a waste package. All of these components are combined within a probabilistic framework. The component models are a waste package containment (WPC) model that simulates the corrosion and degradation processes which eventually result in waste package containment failure; a waste package release (WPR) model that calculates the rates of radionuclide release from the failed waste package; and an engineered system release (ESR) model that controls the flow of information among all AREST components and process models and combines release output from the WPR model with failure times from the WPC model to produce estimates of total release. 167 refs., 40 figs., 12 tabs
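
    As a rough illustration of how such a probabilistic framework can combine containment-failure times with post-failure release rates, the sketch below samples per-package failure times and applies a simple first-order leach model. The distributions, inventory and rate constants are illustrative assumptions and do not correspond to AREST's actual component models.

        import numpy as np

        # Sample per-package containment failure times (WPC stage), then apply a
        # first-order fractional release model after failure (WPR stage) and
        # aggregate the results (ESR stage). All values below are assumed.
        rng = np.random.default_rng(1)
        n_packages = 10_000
        inventory_bq = 1.0e12               # assumed initial inventory per package (Bq)
        half_life_yr = 3.0e4                # assumed half-life of the nuclide (years)
        leach_rate = 1.0e-4                 # assumed fractional release rate after failure (1/yr)
        horizon_yr = 1.0e4                  # assessment period (years)

        lam = np.log(2.0) / half_life_yr
        fail_time = 5.0e3 * rng.weibull(2.0, n_packages)    # assumed corrosion failure times (yr)

        # Activity remaining at failure, then leached over the rest of the horizon
        # (decay during the leaching window is neglected in this sketch).
        window = np.clip(horizon_yr - fail_time, 0.0, None)
        released = inventory_bq * np.exp(-lam * fail_time) * (1.0 - np.exp(-leach_rate * window))

        print(f"mean release per package: {released.mean():.3e} Bq")
        print(f"95th percentile release:  {np.percentile(released, 95):.3e} Bq")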

  12. Actinide Source Term Program, position paper. Revision 1

    International Nuclear Information System (INIS)

    Novak, C.F.; Papenguth, H.W.; Crafts, C.C.; Dhooge, N.J.

    1994-01-01

    The Actinide Source Term represents the quantity of actinides that could be mobilized within WIPP brines and could migrate with the brines away from the disposal room vicinity. This document presents the various proposed methods for estimating this source term, with a particular focus on defining these methods and evaluating the defensibility of the models for mobile actinide concentrations. The conclusions reached in this document are: the 92 PA 'expert panel' model for mobile actinide concentrations is not defensible; and, although it is extremely conservative, the 'inventory limits' model is the only existing defensible model for the actinide source term. The model effort in progress, 'chemical modeling of mobile actinide concentrations', supported by a laboratory effort that is also in progress, is designed to provide a reasonable description of the system, be scientifically realistic, and supplant the 'inventory limits' model.

  13. A Study on Improvement of Algorithm for Source Term Evaluation

    International Nuclear Information System (INIS)

    Park, Jeong Ho; Park, Do Hyung; Lee, Jae Hee

    2010-03-01

    The program developed by KAERI for source term assessment of radwastes from the advanced nuclear fuel cycle consists of a spent fuel database analysis module, a spent fuel arising projection module, and an automatic characterization module for radwastes from the pyroprocess. To improve the algorithms adopted in the developed program, the following items were carried out: development of an algorithm to decrease the analysis time for the spent fuel database; development of a setup routine for the analysis procedure; improvement of the interface for the spent fuel arising projection module; and optimization of the data management algorithm needed for the massive calculations used to estimate the source terms of radwastes from the advanced fuel cycle. The program developed through this study is capable of performing source term estimation even when several spent fuel assemblies with different fuel designs, initial enrichments, irradiation histories, discharge burnups, and cooling times are processed at the same time in the pyroprocess. It is expected that this program will be very useful for the design of the unit processes of the pyroprocess and of the disposal system.

  14. Determination of source term for Krsko NPP extended fuel cycle

    International Nuclear Information System (INIS)

    Nemec, T.; Persic, A.; Zagar, T.; Zefran, B.

    2004-01-01

    The activity and composition of the potential radioactive releases (source term) are important in the decision making about off-site emergency measures in case of a release into the environment. The power uprate of the Krsko NPP during modernization in 2000, as well as the change of fuel type and core design, have influenced the source term value. In 2003 a project of the 'Jozef Stefan' Institute and the Slovenian nuclear safety administration determined a plant-specific source term for the new conditions of fuel type and burnup for the extended fuel cycle. Calculations of the activity and isotopic composition of the core have been performed with the ORIGEN-ARP program. Results showed that the core activity for the extended 15-month fuel cycle is slightly lower than for the 12-month cycles, mainly due to the larger share of fresh fuel. (author)

  15. Directional Unfolded Source Term (DUST) for Compton Cameras.

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, Dean J.; Horne, Steven M.; O'Brien, Sean; Thoreson, Gregory G.

    2018-03-01

    A Directional Unfolded Source Term (DUST) algorithm was developed to enable improved spectral analysis capabilities using data collected by Compton cameras. Achieving this objective required modification of the detector response function in the Gamma Detector Response and Analysis Software (GADRAS). Experimental data that were collected in support of this work include measurements of calibration sources at a range of separation distances and cylindrical depleted uranium castings.

  16. Spallation Neutron Source Accident Terms for Environmental Impact Statement Input

    Energy Technology Data Exchange (ETDEWEB)

    Devore, J.R.; Harrington, R.M.

    1998-08-01

    This report is about accidents with the potential to release radioactive materials into the environment surrounding the Spallation Neutron Source (SNS). As shown in Chap. 2, the inventories of radioactivity at the SNS are dominated by the target facility. Source terms for a wide range of target facility accidents, from anticipated events to worst-case beyond-design-basis events, are provided in Chaps. 3 and 4. The most important criterion applied to these accident source terms is that they should not underestimate potential release. Therefore, conservative methodology was employed for the release estimates. Although the source terms are very conservative, excessive conservatism has been avoided by basing the releases on physical principles. Since it is envisioned that the SNS facility may eventually (after about 10 years) be expanded and modified to support a 4-MW proton beam operational capability, the source terms estimated in this report are applicable to a 4-MW operating proton beam power unless otherwise specified. This is bounding with regard to the 1-MW facility that will be built and operated initially. See further discussion below in Sect. 1.2.

  17. Flowsheets and source terms for radioactive waste projections

    International Nuclear Information System (INIS)

    Forsberg, C.W.

    1985-03-01

    Flowsheets and source terms used to generate radioactive waste projections in the Integrated Data Base (IDB) Program are given. Volumes of each waste type generated per unit product throughput have been determined for the following facilities: uranium mining, UF6 conversion, uranium enrichment, fuel fabrication, boiling-water reactors (BWRs), pressurized-water reactors (PWRs), and fuel reprocessing. Source terms for DOE/defense wastes have been developed. Expected wastes from typical decommissioning operations for each facility type have been determined. All wastes are also characterized by isotopic composition at time of generation and by general chemical composition. 70 references, 21 figures, 53 tables

  18. ITER Safety Task NID-5A, Subtask 1-1: Source terms and energies - initial tritium source terms. Final report

    International Nuclear Information System (INIS)

    Fong, C.; Kalyanam, K.M.; Tanaka, M.R.; Sood, S.; Natalizio, A.; Delisle, M.

    1995-02-01

    The overall objective of the Early Safety and Environmental Characterization Study (ESECS) is to assess the environmental impact of tritium using appropriate assumptions on a hypothetical site for ITER, having the reference site characteristics as proposed by the JCT. The objective of this work under the above subtask 1-1, NID-5a, is to determine environmental source terms (i.e., process source term x containment release fraction) for the fuel cycle and cooling systems. The work is based on inventories and process source terms (i.e., inventory x mobilization fraction), provided by others (under Task NID 3b). The results of this work form the basis for the determination, by others, of the off-site dose (i.e., environmental source term x dose/release ratio). For the determination of the environmental source terms, the TMAP4 code has been utilized (ref 1). This code is approved by ITER for safety assessment. Volume 3 is a compilation of appendices giving detailed results of the study.

  19. ITER Safety Task NID-5A, Subtask 1-1: Source terms and energies - initial tritium source terms. Final report

    International Nuclear Information System (INIS)

    Fong, C.; Kalyanam, K.M.; Tanaka, M.R.; Sood, S.; Natalizio, A.; Delisle, M.

    1995-02-01

    The overall objective of the Early Safety and Environmental Characterization Study (ESECS) is to assess the environmental impact of tritium using appropriate assumptions on a hypothetical site for ITER, having the reference site characteristics as proposed by the JCT. The objective of this work under the above subtask 1-1, NID-5a, is to determine environmental source terms (i.e., process source term x containment release fraction) for the fuel cycle and cooling systems. The work is based on inventories and process source terms (i.e., inventory x mobilization fraction), provided by others (under Task NID 3b). The results of this work form the basis for the determination, by others, of the off-site dose (i.e., environmental source term x dose/release ratio). For the determination of the environmental source terms, the TMAP4 code has been utilized (ref 1). This code is approved by ITER for safety assessment. Volume 2 is a compilation of appendices giving detailed results of the study. 5 figs

  20. Perspectives on source terms based on early research and development

    International Nuclear Information System (INIS)

    Pressesky, A.J.

    1985-07-01

    This report presents an overview of the key documentation of the research and development programs relevant to the source term issue which were undertaken by the Atomic Energy Commission between 1950 and 1970. The source term is taken to be the amount, composition (physical and chemical), and timing of the projected release of radioactivity to the environment in the hypothetical event of a severe reactor accident in a light water reactor of the type currently being licensed, built and operated. The objective is to illuminate and provide perspectives on (a) the maturity of the technical data base and the analytical methodology, (b) the extent to which remaining conservatisms can be applied to compensate for uncertainties, (c) the purpose for which the technology and methodology will be used, and (d) the need to keep problems and uncertainties in proper perspective. Comments that can provide some context for the difficult programmatic choices to be made are included, and technical considerations that may be inadequately applied or neglected in some current source term calculations were studied. This review has not uncovered any significant technical considerations that have been omitted or are being inadequately treated in current source term analyses, except perhaps the contribution made to in-containment aerosols by coolant comminution upon escape at pressure from the reactor coolant system. 11 refs

  1. STACE: Source Term Analyses for Containment Evaluations of transport casks

    International Nuclear Information System (INIS)

    Seager, K.D.; Gianoulakis, S.E.; Barrett, P.R.; Rashid, Y.R.; Reardon, P.C.

    1992-01-01

    Following the guidance of ANSI N14.5, the STACE methodology provides a technically defensible means for estimating maximum permissible leakage rates. These containment criteria attempt to reflect the true radiological hazard by performing a detailed examination of the spent fuel, CRUD, and residual contamination contributions to the releasable source term. The evaluation of the spent fuel contribution to the source term has been modeled fairly accurately using the STACE methodology. The structural model predicts the cask drop load history, the mechanical response of the fuel assembly, and the probability of cladding breach. These data are then used to predict the amount of fission gas, volatile species, and fuel fines that are releasable from the cask. There are some areas where data are sparse or lacking (e.g., the quantity and size distribution of fuel rod breaches), for which experimental validation is planned. The CRUD spallation fraction is the major area where no quantitative data have been found; therefore, this also requires experimental validation. In the interim, STACE conservatively assumes a 100% spallation fraction for computing the releasable activity. The source term methodology also conservatively assumes that there is 1 Ci of residual contamination available for release in the transport cask. However, residual contamination is still by far the smallest contributor to the source term activity.

  2. Literature study of source term research for PWRs

    Energy Technology Data Exchange (ETDEWEB)

    Sponton, L.L.; Nilsson, Lars

    2001-04-01

    A literature survey has been carried out in support of ongoing source term calculations with the MELCOR code of some severe accident scenarios for the Swedish Ringhals 2 pressurised water reactor (PWR). Research in the field of severe accidents in power reactors and the source term for the subsequent release of radioisotopes was intensified after the Harrisburg accident and has produced a large number of reports and papers. This survey was therefore limited to research concerning PWR-type reactors, with emphasis on papers related to MELCOR code development. A background is given, relating to some historic documents, and then more recent research after 1990 is reviewed. Of special interest is the ongoing Phebus programme, which is producing new and important results of benefit to code development and to the validation of, among others, the MELCOR code. It is concluded that source term calculations involve the simulation of many interacting complex physical phenomena, which results in large uncertainties. The research has, however, over the years led to considerable improvements. Thus the uncertainty in source term predictions has been reduced by one to two orders of magnitude, from the simpler codes of the early 1980s to the more realistic codes of today, like MELCOR.

  3. Fission product source term research at Oak Ridge National Laboratory

    International Nuclear Information System (INIS)

    Malinauskas, A.P.

    1985-01-01

    The purpose of this work is to describe some of the research being performed at ORNL in support of the effort to describe, as realistically as possible, fission product source terms for nuclear reactor accidents. In order to make this presentation manageable, only those studies directly concerned with fission product behavior, as opposed to thermal hydraulics, accident sequence progression, etc., will be discussed

  4. Literature study of source term research for PWRs

    International Nuclear Information System (INIS)

    Sponton, L.L.; Nilsson, Lars

    2001-04-01

    A literature survey has been carried out in support of ongoing source term calculations with the MELCOR code of some severe accident scenarios for the Swedish Ringhals 2 pressurised water reactor (PWR). Research in the field of severe accidents in power reactors and the source term for the subsequent release of radioisotopes was intensified after the Harrisburg accident and has produced a large number of reports and papers. This survey was therefore limited to research concerning PWR-type reactors, with emphasis on papers related to MELCOR code development. A background is given, relating to some historic documents, and then more recent research after 1990 is reviewed. Of special interest is the ongoing Phebus programme, which is producing new and important results of benefit to code development and to the validation of, among others, the MELCOR code. It is concluded that source term calculations involve the simulation of many interacting complex physical phenomena, which results in large uncertainties. The research has, however, over the years led to considerable improvements. Thus the uncertainty in source term predictions has been reduced by one to two orders of magnitude, from the simpler codes of the early 1980s to the more realistic codes of today, like MELCOR.

  5. EDF source term reduction project main outcomes and further developments

    International Nuclear Information System (INIS)

    Ranchoux, Gilles; Bonnefon, Julien; Benfarah, Moez; Wintergerst, Matthieu; Gressier, Frederic; Leclercq, Stephanie

    2012-09-01

    Dose reduction is a strategic objective for EDF, linked to the stakes of nuclear acceptability, regulatory compliance and productivity gains. It involves not only improving the reactor shutdown organization (time spent in the controlled area, biological shielding,...) but also improving the radiological state of the unit and the efficiency of source term reduction operations. Since 2003, EDF has been running an innovative project called 'Source Term Reduction', federating the different EDF research and engineering centers in order to: - contribute to the long-term view on radiological protection issues (international feedback analyses), - develop contamination prediction tools (OSCAR software) suitable for industrial needs (operating units and EPR design), - develop scientific models useful for understanding contamination mechanisms in support of strategic decision processes, - continue updating and analyzing the feedback from contamination measurements of corrosion products (EMECC and CZT campaigns), - continue short- and medium-term operational support by optimizing startup and shutdown processes and pre-oxidation, and by improving purification efficiency and material characteristics. This paper first presents the main 2011 results on occupational exposure (collective and individual dose, RCS index...). It then gives an overview of the main EDF outcomes of the last 3 years in the field of source term reduction. Future developments extended to contamination issues in EDF NPPs are also pointed out. (authors)

  6. Near-source mobile methane emission estimates using EPA Method33a and a novel probabilistic approach as a basis for leak quantification in urban areas

    Science.gov (United States)

    Albertson, J. D.

    2015-12-01

    Methane emissions from underground pipeline leaks remain an ongoing issue in the development of accurate methane emission inventories for the natural gas supply chain. Application of mobile methods during routine street surveys would help address this issue, but there are large uncertainties in current approaches. In this paper, we describe results from a series of near-source (< 30 m) controlled methane releases where an instrumented van was used to measure methane concentrations during both fixed location sampling and during mobile traverses immediately downwind of the source. The measurements were used to evaluate the application of EPA Method 33A for estimating methane emissions downwind of a source and also to test the application of a new probabilistic approach for estimating emission rates from mobile traverse data.

  7. Probabilistic Role Models and the Guarded Fragment

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2004-01-01

    We propose a uniform semantic framework for interpreting probabilistic concept subsumption and probabilistic role quantification through statistical sampling distributions. This general semantic principle serves as the foundation for the development of a probabilistic version of the guarded fragment of first-order logic. A characterization of equivalence in that logic in terms of bisimulations is given.

  8. Probabilistic role models and the guarded fragment

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2006-01-01

    We propose a uniform semantic framework for interpreting probabilistic concept subsumption and probabilistic role quantification through statistical sampling distributions. This general semantic principle serves as the foundation for the development of a probabilistic version of the guarded fragment of first-order logic. A characterization of equivalence in that logic in terms of bisimulations is given.

  9. Visualizing Probabilistic Proof

    OpenAIRE

    Guerra-Pujol, Enrique

    2015-01-01

    The author revisits the Blue Bus Problem, a famous thought-experiment in law involving probabilistic proof, and presents simple Bayesian solutions to different versions of the blue bus hypothetical. In addition, the author expresses his solutions in standard and visual formats, i.e. in terms of probabilities and natural frequencies.
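
    To make the Bayesian reading concrete, the short sketch below works one hypothetical version of the problem in both probability and natural-frequency form; the 80% market share and the witness reliability figures are illustrative assumptions, not numbers taken from the article.

        # Hypothetical numbers: Blue Bus Co. runs 80% of the buses; an eyewitness who
        # identifies bus colour correctly 75% of the time says the bus was blue.
        base_rate_blue = 0.80
        p_says_blue_given_blue = 0.75
        p_says_blue_given_other = 0.25

        # Bayes' rule: P(blue bus | witness says blue)
        numerator = base_rate_blue * p_says_blue_given_blue
        denominator = numerator + (1.0 - base_rate_blue) * p_says_blue_given_other
        posterior = numerator / denominator
        print(f"P(blue bus | testimony) = {posterior:.3f}")

        # The same answer in natural frequencies: of 1000 accidents, 800 involve a blue
        # bus (600 reported blue) and 200 involve another bus (50 reported blue), so
        # 600 / (600 + 50) ~ 0.923.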

  10. A nuclear source term analysis for spacecraft power systems

    International Nuclear Information System (INIS)

    McCulloch, W.H.

    1998-01-01

    All US space missions involving on board nuclear material must be approved by the Office of the President. To be approved the mission and the hardware systems must undergo evaluations of the associated nuclear health and safety risk. One part of these evaluations is the characterization of the source terms, i.e., the estimate of the amount, physical form, and location of nuclear material, which might be released into the environment in the event of credible accidents. This paper presents a brief overview of the source term analysis by the Interagency Nuclear Safety Review Panel for the NASA Cassini Space Mission launched in October 1997. Included is a description of the Energy Interaction Model, an innovative approach to the analysis of potential releases from high velocity impacts resulting from launch aborts and reentries

  11. Realistic minimum accident source terms - Evaluation, application, and risk acceptance

    International Nuclear Information System (INIS)

    Angelo, P. L.

    2009-01-01

    The evaluation, application, and risk acceptance for realistic minimum accident source terms can represent a complex and arduous undertaking. This effort poses a very high impact on design, construction cost, operations and maintenance, and integrated safety over the expected facility lifetime. At the 2005 Nuclear Criticality Safety Division (NCSD) Meeting in Knoxville, Tenn., two papers were presented that summarized the Y-12 effort that reduced the number of criticality accident alarm system (CAAS) detectors originally designed for the new Highly Enriched Uranium Materials Facility (HEUMF) from 258 to an eventual as-built number of 60. Part of that effort relied on determining a realistic minimum accident source term specific to the facility. Since that time, the rationale for an alternate minimum accident has been strengthened by an evaluation process that incorporates realism. A recent update to the HEUMF CAAS technical basis highlights the concepts presented here. (authors)

  12. Considerations about source term now used aiming to emergency planning

    International Nuclear Information System (INIS)

    Austregesilo Filho, H.

    1987-01-01

    The applicability of source terms in parametric studies for improving the off-site emergency plan for the Angra-I reactor is presented. The source term is defined as the quantity of radioactive material available for release to the environment in case of a severe accident at a nuclear power plant. The following hypotheses are adopted: once the accident occurs, 100% of the noble gases, 50% of the halogens and 1% of the solid fission products contained in the reactor core are released immediately into the containment building; the release of radioactivity to the environment occurs at a constant rate of 0.1% in mass per day; the actuation of release-mitigating systems, such as the containment spray or the filtered air recirculation system, is not considered; and the release occurs at ground level. (M.C.K.) [pt
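
    The small sketch below simply applies the stated release fractions and the constant 0.1%-per-day leak rate to obtain a cumulative release over an assumed duration; the core inventory values are placeholders, since the abstract does not give them.

        # Cumulative release implied by the stated hypotheses over an assumed duration.
        core_inventory_bq = {"noble gases": 1e18, "halogens": 5e17, "solids": 2e18}  # placeholder inventories
        containment_fraction = {"noble gases": 1.00, "halogens": 0.50, "solids": 0.01}
        leak_rate_per_day = 0.001            # 0.1% of the containment activity per day
        days = 10                            # assumed duration of interest

        for group, inventory in core_inventory_bq.items():
            in_containment = inventory * containment_fraction[group]
            released = in_containment * leak_rate_per_day * days   # constant-rate leak, decay ignored
            print(f"{group:12s}: {released:.2e} Bq released after {days} days")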

  13. Development of in-vessel source term analysis code, tracer

    International Nuclear Information System (INIS)

    Miyagi, K.; Miyahara, S.

    1996-01-01

    Analyses of radionuclide transport in fuel failure accidents (generally referred to as source terms) are considered to be important, especially in severe accident evaluation. The TRACER code has been developed to realistically predict the time-dependent behavior of FPs and aerosols within the primary cooling system for a wide range of fuel failure events. This paper presents the model description, results of a validation study, the recent model advancement status of the code, and results of check-out calculations under reactor conditions. (author)

  14. PFLOTRAN-RepoTREND Source Term Comparison Summary.

    Energy Technology Data Exchange (ETDEWEB)

    Frederick, Jennifer M

    2018-03-01

    Code inter-comparison studies are useful exercises to verify and benchmark independently developed software to ensure proper function, especially when the software is used to model high-consequence systems which cannot be physically tested in a fully representative environment. This summary describes the results of the first portion of the code inter-comparison between PFLOTRAN and RepoTREND, which compares the radionuclide source term used in a typical performance assessment.

  15. Fission product source terms and engineered safety features

    International Nuclear Information System (INIS)

    Malinauskas, A.P.

    1984-01-01

    The author states that new, technically defensible, methodologies to establish realistic source term values for nuclear reactor accidents will soon be available. Although these methodologies will undoubtedly find widespread use in the development of accident response procedures, the author states that it is less clear that the industry is preparing to employ the newer results to develop a more rational approach to strategies for the mitigation of fission product releases. Questions concerning the performance of existing engineered safety systems are reviewed

  16. Basic repository source term and data sheet report: Lavender Canyon

    International Nuclear Information System (INIS)

    1988-01-01

    This report is one of a series describing studies undertaken in support of the US Department of Energy Civilian Radioactive Waste Management (CRWM) Program. This study contains the derivation of values for environmental source terms and resources consumed for a CRWM repository. Estimates include heavy construction equipment; support equipment; shaft-sinking equipment; transportation equipment; and consumption of fuel, water, electricity, and natural gas. Data are presented for construction and operation at an assumed site in Lavender Canyon, Utah. 3 refs; 6 tabs

  17. Basic repository source term and data sheet report: Davis Canyon

    International Nuclear Information System (INIS)

    1988-01-01

    This report is one of a series describing studies undertaken in support of the US Department of Energy Civilian Radioactive Waste Management (CRWM) Program. This study contains the derivation of values for environmental source terms and resources consumed for a CRWM repository. Estimates include heavy construction equipment; support equipment; shaft-sinking equipment; transportation equipment; and consumption of fuel, water, electricity, and natural gas. Data are presented for construction and operation at an assumed site in Davis Canyon, Utah. 6 tabs

  18. Source terms for airborne radioactivity arising from uranium mill wastes

    International Nuclear Information System (INIS)

    O'Riordan, M.C.; Downing, A.L.

    1978-01-01

    One of the problems in assessing the radiological impact of uranium milling is to determine the rates of release to the air of material from the various sources of radioactivity. Such source terms are required for modelling the transport of radioactive material in the atmosphere. Activity arises from various point and area sources in the mill itself and from the mill tailings. The state of the tailings changes in time from slurry to solid. A layer of water may be maintained over the solids during the life of the mine, and the tailings may be covered with inert material on abandonment. Releases may be both gaseous and particulate. This paper indicates ways in which radon emanation and the suspension of long-lived particulate activity might be quantified, and areas requiring further exploration are identified

  19. Automatized near-real-time short-term Probabilistic Volcanic Hazard Assessment of tephra dispersion before eruptions: BET_VHst for Vesuvius and Campi Flegrei during recent exercises

    Science.gov (United States)

    Selva, Jacopo; Costa, Antonio; Sandri, Laura; Rouwet, Dmitri; Tonini, Roberto; Macedonio, Giovanni; Marzocchi, Warner

    2015-04-01

    Probabilistic Volcanic Hazard Assessment (PVHA) represents the most complete scientific contribution for planning rational strategies aimed at mitigating the risk posed by volcanic activity at different time scales. The definition of the space-time window for PVHA is related to the kind of risk mitigation actions that are under consideration. Short temporal intervals (days to weeks) are important for short-term risk mitigation actions like the evacuation of a volcanic area. During volcanic unrest episodes or eruptions, it is of primary importance to produce short-term tephra fallout forecast, and frequently update it to account for the rapidly evolving situation. This information is obviously crucial for crisis management, since tephra may heavily affect building stability, public health, transportations and evacuation routes (airports, trains, road traffic) and lifelines (electric power supply). In this study, we propose a methodology named BET_VHst (Selva et al. 2014) for short-term PVHA of volcanic tephra dispersal based on automatic interpretation of measures from the monitoring system and physical models of tephra dispersal from all possible vent positions and eruptive sizes based on frequently updated meteorological forecasts. The large uncertainty at all the steps required for the analysis, both aleatory and epistemic, is treated by means of Bayesian inference and statistical mixing of long- and short-term analyses. The BET_VHst model is here presented through its implementation during two exercises organized for volcanoes in the Neapolitan area: MESIMEX for Mt. Vesuvius, and VUELCO for Campi Flegrei. References Selva J., Costa A., Sandri L., Macedonio G., Marzocchi W. (2014) Probabilistic short-term volcanic hazard in phases of unrest: a case study for tephra fallout, J. Geophys. Res., 119, doi: 10.1002/2014JB011252

  20. The Dependency of Probabilistic Tsunami Hazard Assessment on Magnitude Limits of Seismic Sources in the South China Sea and Adjoining Basins

    Science.gov (United States)

    Li, Hongwei; Yuan, Ye; Xu, Zhiguo; Wang, Zongchen; Wang, Juncheng; Wang, Peitao; Gao, Yi; Hou, Jingming; Shan, Di

    2017-06-01

    The South China Sea (SCS) and its adjacent small basins, including the Sulu Sea and the Celebes Sea, are commonly identified as a tsunami-prone region by their historical records of seismicity and tsunamis. However, quantification of tsunami hazard in the SCS region has remained an intractable issue due to the highly complex tectonic setting and multiple seismic sources within and surrounding this area. Probabilistic Tsunami Hazard Assessment (PTHA) is performed in the present study to evaluate tsunami hazard in the SCS region based on a brief review of seismological and tsunami records. Five regional and local potential tsunami sources are tentatively identified, and earthquake catalogs are generated using Monte Carlo simulation following the Tapered Gutenberg-Richter relationship for each zone. Considering the lack of consensus on the magnitude upper bound for each seismic source, as well as its critical role in PTHA, the major concern of the present study is to define the upper and lower limits of tsunami hazard in the SCS region comprehensively by adopting different corner magnitudes derived by multiple principles and approaches, including TGR regression of the historical catalog, fault-length scaling, tectonic and seismic moment balance, and repetition of the historical largest event. The results show that tsunami hazard in the SCS and adjoining basins is subject to large variations when adopting different corner magnitudes, with the upper bounds 2-6 times the lower. The probabilistic tsunami hazard maps for specified return periods reveal a much higher threat from the Cotabato Trench and Sulawesi Trench in the Celebes Sea, whereas the tsunami hazard received by the coasts of the SCS and Sulu Sea is relatively moderate, yet non-negligible. By combining an empirical method with numerical study of historical tsunami events, the present PTHA results are tentatively validated. The correspondence lends confidence to our study. Considering the proximity of major sources to population-laden cities
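
    As an illustration of the catalog-generation step, the sketch below draws synthetic moment magnitudes from a tapered Gutenberg-Richter law by taking the minimum of a Pareto draw and an exponentially tapered draw; the beta value and corner magnitude are assumed for illustration and are not those adopted in the study.

        import numpy as np

        rng = np.random.default_rng(42)

        def moment_from_mw(mag):
            return 10.0 ** (1.5 * mag + 9.1)            # seismic moment in N*m

        def mw_from_moment(moment):
            return (np.log10(moment) - 9.1) / 1.5

        beta = 0.65                       # assumed TGR index
        mw_min, mw_corner = 7.0, 8.6      # assumed threshold and corner magnitudes
        m_t, m_c = moment_from_mw(mw_min), moment_from_mw(mw_corner)
        n_events = 100_000

        # The TGR survival function is a Pareto tail times an exponential taper, so a
        # TGR sample is the minimum of a Pareto draw and an exponentially tapered draw.
        pareto_part = m_t * (1.0 - rng.random(n_events)) ** (-1.0 / beta)
        taper_part = m_t + rng.exponential(scale=m_c, size=n_events)
        magnitudes = mw_from_moment(np.minimum(pareto_part, taper_part))

        print(f"largest simulated Mw: {magnitudes.max():.2f}")
        print(f"fraction of events with Mw >= 8.0: {(magnitudes >= 8.0).mean():.4f}")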

  1. Dose assessments for Greifswald and Cadarache with updated source terms from ITER NSSR-2

    International Nuclear Information System (INIS)

    Raskob, W.; Hasemann, I.

    1998-08-01

    The International Thermonuclear Experimental Reactor ITER is in its late engineering phase. One of the most important safety aspects - in particular for achieving public acceptance - is to assure that the releases of hazardous material are minimal during normal operation and for accidental events, even if very unlikely. For this purpose, probabilistic dose assessments for accidental atmospheric releases of various ITER source terms containing tritium and/or activation products were performed for the sites of Greifswald, Germany, and Cadarache, France. In addition, routine releases into the atmosphere and hydrosphere have been evaluated. No country-specific rules were applied, and the input parameters were adapted as far as possible to those used within former studies to achieve better comparability with site-independent dose assessments performed in the frame of ITER. The calculations were based on source terms which, for the first time, contain a combination of tritium and activation products. This allowed a better judgment of the contribution of the individual fusion-relevant materials to the total dose. The results were compared to site-independent dose limits defined in the frame of ITER. Annual doses from routine releases (CAT-I) are below 0.1 μSv for the aquatic scenarios and are close to 1 μSv for the atmospheric source terms. Source terms for two different categories of accidental releases, representing 'extremely unlikely events' (CAT-IV) and 'hypothetical sequences' (CAT-V), were investigated. In none of these cases do the release scenarios of category CAT-IV exceed the ITER limits. In addition, the relevant characteristic quantities of the early dose distribution from the hypothetical scenarios of type CAT-V are still below 50 mSv or 100 mSv, values which are commonly used as lower reference values for evacuation in many potential home countries of ITER. These site-specific assessments confirmed that the proposed release limits and thus the derived dose

  2. Cumulative Dominance and Probabilistic Sophistication

    NARCIS (Netherlands)

    Wakker, P.P.; Sarin, R.H.

    2000-01-01

    Machina & Schmeidler (Econometrica, 60, 1992) gave preference conditions for probabilistic sophistication, i.e. decision making where uncertainty can be expressed in terms of (subjective) probabilities without commitment to expected utility maximization. This note shows that simpler and more general

  3. A simple method for estimating potential source term bypass fractions from confinement structures

    International Nuclear Information System (INIS)

    Kalinich, D.A.; Paddleford, D.F.

    1997-01-01

    Confinement structures house many of the operating processes at the Savannah River Site (SRS). Under normal operating conditions, a confinement structure in conjunction with its associated ventilation systems prevents the release of radiological material to the environment. However, under potential accident conditions, the performance of the ventilation systems and the integrity of the structure may be challenged. In order to calculate the radiological consequences associated with a potential accident (e.g., fires, explosions, spills, etc.), it is necessary to determine the fraction of the source term initially generated by the accident that escapes from the confinement structure to the environment. While it would be desirable to estimate the potential bypass fraction using sophisticated control-volume/flow-path computer codes (e.g., CONTAIN, MELCOR, etc.) in order to take as much credit as possible for the mitigative effects of the confinement structure, there are many instances where using such codes is not tractable due to limits on the level of effort allotted to perform the analysis. Moreover, the current review environment, with its emphasis on deterministic/bounding versus probabilistic/best-estimate analysis, discourages the use of analytical techniques that require the consideration of a large number of parameters. Discussed herein is a simplified control-volume/flow-path approach for calculating the source term bypass fraction that is amenable to solution in a spreadsheet or with a commercial mathematical solver (e.g., MathCad or Mathematica). It considers the effects of wind and fire pressure gradients on the structure, ventilation system operation, and Halon discharges. Simple models are used to characterize the engineered and non-engineered flow paths. By making judicious choices for the limited set of problem parameters, the results from this approach can be defended as bounding and conservative.
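
    A toy version of such a control-volume/flow-path estimate is sketched below for a single volume with one filtered exhaust and one unfiltered leak path; the areas, flows, pressures and filter efficiency are illustrative assumptions rather than SRS facility data.

        import math

        rho_air = 1.2                 # kg/m^3
        dp = 250.0                    # Pa, assumed fire/wind pressure difference

        def orifice_flow(area_m2, dp_pa, cd=0.6, rho=rho_air):
            """Volumetric flow (m^3/s) through an opening at a given pressure difference."""
            return cd * area_m2 * math.sqrt(2.0 * dp_pa / rho)

        q_leak = orifice_flow(area_m2=0.01, dp_pa=dp)     # unfiltered cracks and penetrations
        q_exhaust = 2.0                                   # m^3/s, assumed fan exhaust flow
        filter_efficiency = 0.999                         # assumed filter decontamination efficiency

        # Fraction of airborne activity leaving the volume that reaches the environment,
        # assuming a well-mixed volume so both paths see the same concentration.
        escaping = q_leak + q_exhaust * (1.0 - filter_efficiency)
        bypass_fraction = escaping / (q_leak + q_exhaust)
        print(f"leak flow: {q_leak:.3f} m^3/s, bypass fraction: {bypass_fraction:.3f}")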

  4. The European source-term evaluation code ASTEC: status and applications, including CANDU plant applications

    International Nuclear Information System (INIS)

    Van Dorsselaere, J.P.; Giordano, P.; Kissane, M.P.; Montanelli, T.; Schwinges, B.; Ganju, S.; Dickson, L.

    2004-01-01

    Research on light-water reactor severe accidents (SA) is still required in a limited number of areas in order to confirm accident-management plans. Thus, 49 European organizations have linked their SA research in a durable way through SARNET (Severe Accident Research and management NETwork), part of the European 6th Framework Programme. One goal of SARNET is to consolidate the integral code ASTEC (Accident Source Term Evaluation Code, developed by IRSN and GRS) as the European reference tool for safety studies; SARNET efforts include extending the application scope to reactor types other than PWR (including VVER) such as BWR and CANDU. ASTEC is used in IRSN's Probabilistic Safety Analysis level 2 of 900 MWe French PWRs. An earlier version of ASTEC's SOPHAEROS module, including improvements by AECL, is being validated as the Canadian Industry Standard Toolset code for FP-transport analysis in the CANDU Heat Transport System. Work with ASTEC has also been performed by Bhabha Atomic Research Centre, Mumbai, on IPHWR containment thermal hydraulics. (author)

  5. Overview of plant specific source terms and their impact on risk

    International Nuclear Information System (INIS)

    Desaedeleer, G.

    2004-01-01

    Probabilistic risk assessment and safety assessment focus on systems and measures to prevent core meltdown, and they integrate many aspects of design and operation. A PRA provides a mapping of initiating-event frequencies onto plant damage states through plant systems analysis, utilizes fault tree and event tree logic models, and may include 'external event' analyses such as fire, flood, wind and seismic events. Percent contributions of sequences to the core damage frequency are shown for the following plants, taken as examples: ZION, EDISON, OCONEE 3, SEABROOK, SIZEWELL B, MILLSTONE 3, RINGHALS 2. The presentation includes a comparison of the following initiating-event frequencies: loss of off-site power; small LOCA; large LOCA; steam generator tube rupture; loss of feedwater; turbine trip; reactor trip. Consequence analysis deals with the dispersion and depletion of radioactivity in the atmosphere, health effects, and factors in the off-site emergency plan, analyzed with codes that address the weather conditions; it provides a mapping of source terms and risk diagrams for early fatalities and for latent cancer fatalities.
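
    The toy sketch below shows the kind of mapping described above, propagating assumed initiating-event frequencies through a two-branch event tree to sequence frequencies and a total core damage frequency; all numbers are invented for illustration and are not taken from any of the plants listed.

        # Assumed initiating-event frequencies (per reactor-year) and conditional
        # failure probabilities of two successive mitigating functions.
        initiating_events = {
            "loss of off-site power": 5e-2,
            "small LOCA": 1e-3,
        }
        mitigation = {
            "loss of off-site power": [("emergency AC power", 3e-3), ("feed and bleed", 1e-2)],
            "small LOCA": [("high-pressure injection", 2e-3), ("recirculation", 5e-3)],
        }

        cdf = 0.0
        for ie, freq in initiating_events.items():
            (sys1, p1), (sys2, p2) = mitigation[ie]
            seq1 = freq * p1                   # sequence: first mitigating function fails
            seq2 = freq * (1.0 - p1) * p2      # sequence: first succeeds, second fails
            cdf += seq1 + seq2
            print(f"{ie}: {sys1} fails -> {seq1:.2e}/yr; {sys2} fails -> {seq2:.2e}/yr")
        print(f"total core damage frequency (toy model): {cdf:.2e} per reactor-year")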

  6. Probabilistic linguistics

    NARCIS (Netherlands)

    Bod, R.; Heine, B.; Narrog, H.

    2010-01-01

    Probabilistic linguistics takes all linguistic evidence as positive evidence and lets statistics decide. It allows for accurate modelling of gradient phenomena in production and perception, and suggests that rule-like behaviour is no more than a side effect of maximizing probability. This chapter

  7. Probabilistic Design

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Burcharth, H. F.

    This chapter describes how partial safety factors can be used in design of vertical wall breakwaters and an example of a code format is presented. The partial safety factors are calibrated on a probabilistic basis. The code calibration process used to calibrate some of the partial safety factors...

  8. COMPASS: A source term code for investigating capillary barrier performance

    International Nuclear Information System (INIS)

    Zhou, Wei; Apted, J.J.

    1996-01-01

    A computer code COMPASS based on compartment model approach is developed to calculate the near-field source term of the High-Level-Waste repository under unsaturated conditions. COMPASS is applied to evaluate the expected performance of Richard's (capillary) barriers as backfills to divert infiltrating groundwater at Yucca Mountain. Comparing the release rates of four typical nuclides with and without the Richard's barrier, it is shown that the Richard's barrier significantly decreases the peak release rates from the Engineered-Barrier-System (EBS) into the host rock
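
    A compartment-model source term of the kind COMPASS implements can be caricatured as a chain of first-order transfers; the sketch below steps a waste-form/backfill/host-rock chain forward in time with assumed rate constants, purely to illustrate the approach rather than to reproduce COMPASS results.

        import numpy as np

        # First-order transfer chain: waste form -> backfill -> host rock, with decay.
        k_wf_to_bf = 1.0e-4                 # 1/yr, assumed leaching rate from waste form
        k_bf_to_rock = 5.0e-4               # 1/yr, assumed transfer rate from backfill
        lam = np.log(2.0) / 2.1e5           # 1/yr, assumed decay constant (long-lived nuclide)

        dt, years = 1.0, 200_000
        n_wf, n_bf, released = 1.0, 0.0, 0.0    # inventories normalized to the initial waste form
        peak_rate = 0.0

        for _ in range(int(years / dt)):
            leach = k_wf_to_bf * n_wf * dt
            out = k_bf_to_rock * n_bf * dt
            n_wf += -leach - lam * n_wf * dt
            n_bf += leach - out - lam * n_bf * dt
            released += out
            peak_rate = max(peak_rate, out / dt)

        print(f"fraction of initial inventory released to host rock: {released:.3f}")
        print(f"peak release rate: {peak_rate:.2e} per year")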

  9. Probabilistic Logic and Probabilistic Networks

    NARCIS (Netherlands)

    Haenni, R.; Romeijn, J.-W.; Wheeler, G.; Williamson, J.

    2009-01-01

    While in principle probabilistic logics might be applied to solve a range of problems, in practice they are rarely applied at present. This is perhaps because they seem disparate, complicated, and computationally intractable. However, we shall argue in this programmatic paper that several approaches

  10. Levels, sources and probabilistic health risks of polycyclic aromatic hydrocarbons in the agricultural soils from sites neighboring suburban industries in Shanghai.

    Science.gov (United States)

    Tong, Ruipeng; Yang, Xiaoyi; Su, Hanrui; Pan, Yue; Zhang, Qiuzhuo; Wang, Juan; Long, Mingce

    2018-03-01

    The levels, sources and quantitative probabilistic health risks of polycyclic aromatic hydrocarbons (PAHs) in agricultural soils in the vicinity of power, steel and petrochemical plants in the suburbs of Shanghai are discussed. The total concentration of 16 PAHs in the soils ranges from 223 to 8214 ng g⁻¹. The sources of PAHs were analyzed by both isomeric ratios and a principal component analysis-multiple linear regression method. The results indicate that PAHs mainly originated from the incomplete combustion of coal and oil. The probabilistic assessments of both the carcinogenic and non-carcinogenic risks posed by PAHs in soils, with adult farmers as the receptors of concern, were quantitatively calculated by Monte Carlo simulation. The estimated total carcinogenic risk (TCR) for the agricultural soils has a 45% probability of exceeding the acceptable threshold value (10⁻⁶), indicating potential adverse health effects. However, all non-carcinogenic risks are below the threshold value. Oral intake is the dominant exposure pathway, accounting for 77.7% of the TCR, while inhalation intake is negligible. The three PAHs with the highest contributions to the TCR are BaP (64.35%), DBA (17.56%) and InP (9.06%). Sensitivity analyses indicate that exposure frequency has the greatest impact on the total risk uncertainty, followed by the exposure dose through oral intake and the exposure duration. These results indicate that it is essential to manage the health risks of PAH-contaminated agricultural soils in the vicinity of typical industries in megacities. Copyright © 2017 Elsevier B.V. All rights reserved.
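
    For readers unfamiliar with the Monte Carlo step, the sketch below propagates assumed distributions for soil concentration and exposure factors through a standard soil-ingestion incremental lifetime cancer risk calculation; every parameter value and distribution is a generic placeholder, not a value reported in the study.

        import numpy as np

        rng = np.random.default_rng(7)
        n = 100_000

        # Assumed distributions for a BaP-equivalent soil concentration and exposure factors.
        c_soil = rng.lognormal(mean=np.log(500.0), sigma=0.8, size=n)    # ng/g, assumed
        ingestion_rate_mg = 100.0                                        # mg soil/day, assumed
        exposure_freq = rng.triangular(180.0, 250.0, 350.0, size=n)      # days/yr, assumed
        exposure_dur = rng.uniform(20.0, 40.0, size=n)                   # years, assumed
        body_weight = rng.normal(60.0, 8.0, size=n)                      # kg, assumed
        averaging_time = 70.0 * 365.0                                    # days
        slope_factor = 7.3                                               # (mg/kg-day)^-1, assumed oral slope factor

        # Chronic daily intake (mg/kg-day) via soil ingestion, then incremental risk.
        cdi = (c_soil * 1e-6) * (ingestion_rate_mg * 1e-3) * exposure_freq * exposure_dur \
              / (body_weight * averaging_time)
        ilcr = cdi * slope_factor

        print(f"median ILCR: {np.median(ilcr):.2e}")
        print(f"probability ILCR exceeds 1e-6: {(ilcr > 1e-6).mean():.1%}")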

  11. Development of Reference Source Terms for EU-APR1400

    Energy Technology Data Exchange (ETDEWEB)

    Kim, ByungIl; Lee, Chonghui; Lee, Dongsu; Ko, Heejin; Kang, Sangho [KEPCO Engineering and Construction Co. Inc., Yongin (Korea, Republic of)]

    2014-05-15

    These source terms were developed for a typical U.S. NPP and do not reflect the design characteristics of the EU-APR1400 (1,400 MWe PWR), which will be submitted for EUR certification in European countries. The process of developing the RST for the EU-APR1400 follows a process similar to the one NUREG-1465 went through when it came out with its proposed source terms. The purpose of this study is to develop the EU-APR1400 design-specific RST in compliance with the EUR. The large LOCA is the reference sequence used in the NUREG-1465 evaluation, whereas the EU-APR1400 risk-significant sequences are dominated by small LOCA and non-LOCA sequences. Moreover, considering that the EU-APR1400 has many design features to mitigate the consequences of severe accident phenomena, it is not surprising that both the release fractions and the release durations are distinctly different from NUREG-1465. This RST will be continuously updated to reflect the design features of the EU-APR1400 and will then be used as the reference for design purposes such as demonstrating that criteria on radioactivity releases are met, equipment survivability, control room habitability for severe accidents, and so on.

  12. Centrifugal Filtration System for Severe Accident Source Term Treatment

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Shu Chang; Yim, Man Sung [KAIST, Daejeon (Korea, Republic of)]

    2016-05-15

    The objective of this paper is to present the conceptual design of a filtration system that can be used to process airborne severe accident source term. Reactor containment may lose its structural integrity due to over-pressurization during a severe accident. This can lead to uncontrolled radioactive releases to the environment. For preventing the dispersion of these uncontrolled radioactive releases to the environment, several ways to capture or mitigate these radioactive source term releases are under investigation at KAIST. Such technologies are based on concepts like a vortex-like air curtain, a chemical spray, and a suction arm. Treatment of the radioactive material captured by these systems would be required, before releasing to environment. For current filtration systems in the nuclear industry, IAEA lists sand, multi-venturi scrubber, high efficiency particulate arresting (HEPA), charcoal and combinations of the above in NS-G-1-10, 4.143. Most if not all of the requirements of the scenario for applying this technology near the containment of an NPP site and the environmental constraints were analyzed for use in the design of the centrifuge filtration system.

  13. Atucha-I source terms for sequences initiated by transients

    International Nuclear Information System (INIS)

    Baron, J.; Bastianelli, B.

    1997-01-01

    The present work is part of a study of the expected source terms for the Atucha I nuclear power plant during severe accidents. Among the accident sequences with a significant probability of producing core damage, those initiated by operational transients have been identified as the most relevant. These sequences have some common characteristics, in the sense that all of them result in the opening of the primary system safety valves and leave this path open for coolant loss. If these sequences progress to severe accidents, the same path will be used for the release of radionuclides from the core, through the primary system, to the containment. Later in the severe accident sequence, the failure of the pressure vessel will occur and the corium will fall into the reactor cavity, interacting with the concrete. During these processes, more radioactive products will be released inside the containment. In the present work, the simulation of a severe accident initiated by a blackout is performed, from the point of view of the phenomenology of the behavior of the radioactive products as they are transported through the piping, during the core-concrete interactions, and inside the containment buildings until containment failure. The final result is the source term released to the atmosphere. (author) [es

  14. NRC source term assessment for incident response dose projections

    International Nuclear Information System (INIS)

    Easley, P.; Pasedag, W.

    1984-01-01

    The NRC provides advice and assistance to licensees and State and local authorities in responding to accidents. The TACT code supports this function by providing source term projections for two situations during early (15 to 60 minutes) accident response: (1) Core/containment damage is indicated, but there are no measured releases. Quantification of a predicted release permits emergency response before people are exposed. With TACT, response personnel can estimate releases based on fuel and cladding conditions, coolant boundary and containment integrity, and mitigative systems operability. For this type of estimate, TACT is intermediate between default assumptions and time-consuming mechanistic codes. (2) A combination of plant status and limited release data are available. For this situation, iterations between predictions based on known conditions which are compared to measured releases gives reasonable confidence in supplemental source term information otherwise unavailable: nuclide mix, releases not monitored, and trending or abrupt changes. The assumptions and models used in TACT, and examples of its use, are given in this paper

  15. Chernobyl source term, atmospheric dispersion, and dose estimation

    International Nuclear Information System (INIS)

    Gudiksen, P.H.; Harvey, T.F.; Lange, R.

    1988-02-01

    The Chernobyl source term available for long-range transport was estimated by integration of radiological measurements with atmospheric dispersion modeling, and by reactor core radionuclide inventory estimation in conjunction with WASH-1400 release fractions associated with specific chemical groups. These analyses indicated that essentially all of the noble gases, 80% of the radioiodines, 40% of the radiocesium, 10% of the tellurium, and about 1% or less of the more refractory elements were released. Atmospheric dispersion modeling of the radioactive cloud over the Northern Hemisphere revealed that the cloud became segmented during the first day, with the lower section heading toward Scandinavia and the upper part heading in a southeasterly direction with subsequent transport across Asia to Japan, the North Pacific, and the west coast of North America. The inhalation doses due to direct cloud exposure were estimated to exceed 10 mGy near the Chernobyl area, to range between 0.1 and 0.001 mGy within most of Europe, and to be generally less than 0.00001 mGy within the US. The Chernobyl source term was several orders of magnitude greater than those associated with the Windscale and TMI reactor accidents, while the 137Cs from the Chernobyl event is about 6% of that released by the US and USSR atmospheric nuclear weapon tests. 9 refs., 3 figs., 6 tabs

  16. Analysis of the source term in the Chernobyl-4 accident

    International Nuclear Information System (INIS)

    Alonso, A.; Lopez Montero, J.V.; Pinedo Garrido, P.

    1990-01-01

    The report presents the analysis of the Chernobyl accident and of the phenomena with major influence on the source term, including the chemical effects of materials dumped over the reactor, carried out by the Chair of Nuclear Technology at Madrid University under a contract with the CEC. It also includes the comparison of the ratio (Cs-137/Cs-134) between measurements performed by Soviet authorities and countries belonging to the Community and OECD area. Chapter II contains a summary of both isotope measurements (Cs-134 and Cs-137), and their ratios, in samples of air, water, soil and agricultural and animal products collected by the Soviets in their report presented in Vienna (1986). Chapter III reports on the inventories of cesium isotopes in the core, while Chapter IV analyses the transient, especially the fuel temperature reached, as a way to deduce the mechanisms involved in the cesium escape. The cesium source term is analyzed in Chapter V. Normal conditions have been considered, as well as the transient and the post-accidental period, including the effects of deposited materials. The conclusion of this study is that the Chernobyl accident sequence is specific to the RBMK type of reactor, and that in the Western world, basic research on fuel behaviour for reactivity transients has already been carried out

  17. Deterministic and probabilistic interval prediction for short-term wind power generation based on variational mode decomposition and machine learning methods

    International Nuclear Information System (INIS)

    Zhang, Yachao; Liu, Kaipei; Qin, Liang; An, Xueli

    2016-01-01

    Highlights: • Variational mode decomposition is adopted to process the original wind power series. • A novel combined model based on machine learning methods is established. • An improved differential evolution algorithm is proposed for weight adjustment. • Probabilistic interval prediction is performed by quantile regression averaging. - Abstract: Due to the increasingly significant energy crisis nowadays, the exploitation and utilization of new clean energy gains more and more attention. As an important category of renewable energy, wind power generation has become the most rapidly growing renewable energy source in China. However, the intermittency and volatility of wind power have restricted the large-scale integration of wind turbines into power systems. High-precision wind power forecasting is an effective measure to alleviate the negative influence of wind power generation on the power systems. In this paper, a novel combined model is proposed to improve the prediction performance for short-term wind power forecasting. Variational mode decomposition is firstly adopted to handle the instability of the raw wind power series, and the subseries can be reconstructed by measuring the sample entropy of the decomposed modes. Then the base models can be established for each subseries respectively. On this basis, the combined model is developed based on the optimal virtual prediction scheme, the weight matrix of which is dynamically adjusted by a self-adaptive multi-strategy differential evolution algorithm. Besides, a probabilistic interval prediction model based on quantile regression averaging and variational mode decomposition-based hybrid models is presented to quantify the potential risks of the wind power series. The simulation results indicate that: (1) the normalized mean absolute errors of the proposed combined model from one-step to three-step forecasting are 4.34%, 6.49% and 7.76%, respectively, which are much lower than those of the base models and the hybrid
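
    The sketch below illustrates two ingredients named in this abstract: combining base-model forecasts with weights tuned by a differential evolution optimizer, and forming a simple prediction interval from residual quantiles. The synthetic data, the three hypothetical base models and the 90% interval rule are assumptions for demonstration only, not the paper's scheme (which works on VMD subseries and uses quantile regression averaging).

```python
# Hedged sketch: weighted combination of base forecasts (weights from differential
# evolution) plus a residual-quantile interval. All data are synthetic assumptions.
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(0)
y = rng.uniform(0.2, 0.8, size=200)                    # observed (normalized) wind power
base = np.vstack([y + rng.normal(0, s, size=y.size)    # three hypothetical base models
                  for s in (0.05, 0.08, 0.12)])

def mae(w):
    w = np.abs(w) + 1e-12
    w = w / w.sum()                                    # project weights onto the simplex
    return np.mean(np.abs(w @ base - y))

res = differential_evolution(mae, bounds=[(0.0, 1.0)] * base.shape[0], seed=1)
w = np.abs(res.x) + 1e-12
w /= w.sum()
combined = w @ base

resid = y - combined
lo, hi = combined + np.quantile(resid, 0.05), combined + np.quantile(resid, 0.95)
print("weights:", np.round(w, 3), "MAE:", round(res.fun, 4))
print("90% interval coverage:", np.mean((y >= lo) & (y <= hi)))
```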

  18. Use of source term uncoupled in radionuclide migration equations

    International Nuclear Information System (INIS)

    Silveira, Claudia Siqueira da; Lima, Zelmo Rodrigues de; Alvim, Antonio Carlos Marques

    2008-01-01

    Final repositories for high-level radioactive waste have been considered in deep, low-permeability and stable geological formations. A common problem is the modeling of radionuclide migration in a fractured rock. In this work, the physical system adopted consists of the rock matrix containing a single planar fracture situated in water-saturated porous rock. The partial differential equations that describe the radionuclide transport were discretized using finite difference techniques, of which the following methods were adopted: Explicit Euler, Implicit Euler and Crank-Nicolson. For each of these methods, the advective term was discretized with the following numerical schemes: backward differences, centered differences and forward differences. A comparison was made to determine which temporal and spatial discretization gives the best result relative to a reference solution. The results obtained show that the Explicit Euler method with forward discretization of the advective term has good accuracy. Next, with the objective of improving the results of the Implicit Euler and Crank-Nicolson methods, the source term (the diffusive flux) was uncoupled. The results obtained were considered satisfactory by comparison with previous studies. (author)
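
    A minimal sketch of the kind of discretization the abstract compares: an explicit-Euler time step with an upwind (backward) difference for the advective term in a 1-D advection-dispersion-decay equation, dC/dt = -v dC/dx + D d2C/dx2 - lam*C. Grid, parameters and boundary conditions are illustrative assumptions, not those of the cited study.

```python
# Hedged sketch: explicit Euler + upwind advection for 1-D radionuclide transport.
import numpy as np

nx, L = 200, 10.0                          # grid points, domain length (m)
dx = L / (nx - 1)
v, D, lam = 1.0e-2, 1.0e-4, 1.0e-3         # velocity (m/s), dispersion (m2/s), decay (1/s)
dt = 0.4 * min(dx / v, dx**2 / (2 * D))    # respect advective and diffusive stability limits

C = np.zeros(nx)
C[0] = 1.0                                 # constant-concentration inlet boundary
for _ in range(5000):
    adv = -v * (C[1:-1] - C[:-2]) / dx                    # backward (upwind) difference
    dif = D * (C[2:] - 2 * C[1:-1] + C[:-2]) / dx**2      # centered second difference
    C[1:-1] += dt * (adv + dif - lam * C[1:-1])
    C[-1] = C[-2]                          # zero-gradient outlet
print("concentration at mid-domain:", round(C[nx // 2], 4))
```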

  19. Release modes and processes relevant to source-term calculations at Yucca Mountain

    International Nuclear Information System (INIS)

    Apted, M.J.

    1994-01-01

    The feasibility of permanent disposal of radioactive high-level waste (HLW) in repositories located in deep geologic formations is being studied world-wide. The most credible release pathway is interaction between groundwater and nuclear waste forms, followed by migration of radionuclide-bearing groundwater to the accessible environment. Under hydrologically unsaturated conditions, vapor transport of volatile radionuclides is also possible. The near-field encompasses the waste packages composed of engineered barriers (e.g. man-made materials, such as vitrified waste forms and corrosion-resistant containers), while the far-field includes the natural barriers (e.g. host rock, hydrologic setting). Taken together, these two subsystems define a series of multiple, redundant barriers that act to assure the safe isolation of nuclear waste. In the U.S., the Department of Energy (DOE) is investigating the feasibility of safe, long-term disposal of high-level nuclear waste at the Yucca Mountain site in Nevada. The proposed repository horizon is located in non-welded tuffs within the unsaturated zone (i.e. above the water table) at Yucca Mountain. The purpose of this paper is to describe the source-term models for radionuclide release from waste packages at the Yucca Mountain site. The first section describes the conceptual release modes that are relevant for this site and waste package design, based on a consideration of the performance of currently proposed engineered barriers under expected and unexpected conditions. No attempt is made to assess the reasonableness or probability of occurrence of any specific release mode. The following section reviews the waste-form characteristics that are required to model and constrain the release of radionuclides from the waste package. The next section presents mathematical models for the conceptual release modes, selected from those that have been implemented into a probabilistic total system assessment code developed for the Electric Power

  20. Preliminary investigation of processes that affect source term identification

    International Nuclear Information System (INIS)

    Wickliff, D.S.; Solomon, D.K.; Farrow, N.D.

    1991-09-01

    Solid Waste Storage Area (SWSA) 5 is known to be a significant source of contaminants, especially tritium (3H), to the White Oak Creek (WOC) watershed. For example, Solomon et al. (1991) estimated the total 3H discharge in Melton Branch (most of which originates in SWSA 5) for the 1988 water year to be 1210 Ci. A critical issue for making decisions concerning remedial actions at SWSA 5 is knowing whether the annual contaminant discharge is increasing or decreasing. Because (1) the magnitude of the annual contaminant discharge is highly correlated to the amount of annual precipitation (Solomon et al., 1991) and (2) a significant lag may exist between the time of peak contaminant release from primary sources (i.e., waste trenches) and the time of peak discharge into streams, short-term stream monitoring by itself is not sufficient for predicting future contaminant discharges. In this study we use 3H to examine the link between contaminant release from primary waste sources and contaminant discharge into streams. By understanding and quantifying subsurface transport processes, realistic predictions of future contaminant discharge, along with an evaluation of the effectiveness of remedial action alternatives, will be possible. The objectives of this study are (1) to characterize the subsurface movement of contaminants (primarily 3H) with an emphasis on the effects of matrix diffusion; (2) to determine the relative strength of primary vs secondary sources; and (3) to establish a methodology capable of determining whether the 3H discharge from SWSA 5 to streams is increasing or decreasing

  1. Source term identification in atmospheric modelling via sparse optimization

    Science.gov (United States)

    Adam, Lukas; Branda, Martin; Hamburger, Thomas

    2015-04-01

    Inverse modelling plays an important role in identifying the amount of harmful substances released into the atmosphere during major incidents such as power plant accidents or volcano eruptions. Another possible application of inverse modelling lies in monitoring CO2 emission limits, where only observations at certain places are available and the task is to estimate the total releases at given locations. This gives rise to minimizing the discrepancy between the observations and the model predictions. There are two standard ways of solving such problems. In the first one, this discrepancy is regularized by adding additional terms. Such terms may include Tikhonov regularization, distance from a priori information or a smoothing term. The resulting, usually quadratic, problem is then solved via standard optimization solvers. The second approach assumes that the error term has a (normal) distribution and makes use of Bayesian modelling to identify the source term. Instead of following the above-mentioned approaches, we utilize techniques from the field of compressive sensing. Such techniques look for the sparsest solution (the solution with the smallest number of nonzeros) of a linear system, where a maximal allowed error term may be added to this system. Even though this is a well-developed field with many possible solution techniques, most of them do not consider even the simplest constraints which are naturally present in atmospheric modelling. One such example is the nonnegativity of release amounts. We believe that the concept of a sparse solution is natural both in the problem of identifying the source location and in that of the time profile of the source release. In the first case, it is usually assumed that there are only a few release points and the task is to find them. In the second case, the time window is usually much longer than the duration of the actual release. In both cases, the optimal solution should contain a large number of zeros, giving rise to the
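
    A minimal sketch of the sparse, non-negative recovery idea described above: a projected iterative soft-thresholding loop that approximately minimizes ||Ax - y||^2 + lam*||x||_1 subject to x >= 0. The source-receptor matrix, the "true" release episode and the penalty weight are synthetic assumptions standing in for an atmospheric dispersion model; this is not the authors' solver.

```python
# Hedged sketch: non-negative sparse recovery of a release time profile (projected ISTA).
import numpy as np

rng = np.random.default_rng(3)
n_obs, n_times = 40, 120
A = rng.random((n_obs, n_times))            # synthetic source-receptor sensitivities
x_true = np.zeros(n_times)
x_true[30:34] = [2.0, 5.0, 4.0, 1.0]        # short release episode (sparse in time)
y = A @ x_true + rng.normal(0, 0.05, n_obs)

lam = 0.5                                   # sparsity weight (assumed)
step = 1.0 / np.linalg.norm(A, 2) ** 2      # gradient step from the spectral norm
x = np.zeros(n_times)
for _ in range(2000):
    grad = A.T @ (A @ x - y)
    x = x - step * grad
    x = np.maximum(x - step * lam, 0.0)     # soft-threshold, then project onto x >= 0

print("nonzeros recovered:", int(np.sum(x > 1e-3)), "(true: 4)")
print("estimated total release:", round(float(x.sum()), 2), "true:", x_true.sum())
```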

  2. A comparison of world-wide uses of severe reactor accident source terms

    International Nuclear Information System (INIS)

    Ang, M.L.; Frid, W.; Kersting, E.J.; Friederichs, H.G.; Lee, R.Y.; Meyer-Heine, A.; Powers, D.A.; Soda, K.; Sweet, D.

    1994-09-01

    The definitions of source terms to reactor containments and source terms to the environment are discussed. A comparison is made between the TID-14844 example source term and the alternative source term described in NUREG-1465. Comparisons of these source terms to the containments and those used in France, Germany, Japan, Sweden, and the United Kingdom are made. Source terms to the environment calculated in NUREG-1500 and WASH-1400 are discussed. Again, these source terms are compared to those now being used in France, Germany, Japan, Sweden, and the United Kingdom. It is concluded that source terms to the containment suggested in NUREG-1465 are not greatly more conservative than those used in other countries. Technical bases for the source terms are similar. The regulatory use of the current understanding of radionuclide behavior varies among countries

  3. Influence of iodine chemistry on source term assessment

    International Nuclear Information System (INIS)

    Herranz Puebla, L. E.; Lopez Diez, I.; Rodriguez Maroto, J. J.; Martinez Lopez-Alcorocho, A.

    1991-01-01

    The major goal of a phenomenological analysis of the containment during a severe accident can be split into the following objectives: to know the containment response to the different loads and to predict accurately the fission product and aerosol behavior. In this report, the main results from the study of a hypothetical accident scenario, based on the LA-4 experiment of the LACE project, are presented. To do so, several codes have been coupled: CONTEMPT4/MOD5 (thermohydraulics), NAUA/MOD5 (aerosol physics) and IODE (iodine chemistry). It has been demonstrated that the Source Term cannot be assessed with confidence if the chemical behaviour of some radionuclides is not taken into account. In particular, the influence of variables such as pH on the iodine retention efficiency of the sump has been proven. (Author) 12 refs

  4. Tank waste source term inventory validation. Volume II. Letter report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-04-01

    This document comprises Volume II of the Letter Report entitled Tank Waste Source Term Inventory Validation. This volume contains Appendix C, Radionuclide Tables, and Appendix D, Chemical Analyte Tables. The sample data for selection of 11 radionuclides and 24 chemical analytes were extracted from six separate sample data sets, were arranged in a tabular format and were plotted on scatter plots for all of the 149 single-shell tanks, the 24 double-shell tanks and the four aging waste tanks. The solid and liquid sample data was placed in separate tables and plots. The sample data and plots were compiled from the following data sets: characterization raw sample data, recent core samples, D. Braun data base, Wastren (Van Vleet) data base, TRAC and HTCE inventories.

  5. Tank waste source term inventory validation. Volume 1. Letter report

    International Nuclear Information System (INIS)

    Brevick, C.H.; Gaddis, L.A.; Johnson, E.D.

    1995-01-01

    The sample data for selection of 11 radionuclides and 24 chemical analytes were extracted from six separate sample data sets, were arranged in a tabular format and were plotted on scatter plots for all of the 149 single-shell tanks, the 24 double-shell tanks and the four aging waste tanks. The solid and liquid sample data was placed in separate tables and plots. The sample data and plots were compiled from the following data sets: characterization raw sample data, recent core samples, D. Braun data base, Wastren (Van Vleet) data base, TRAC and HTCE inventories. This document is Volume I of the Letter Report entitled Tank Waste Source Term Inventory Validation

  6. Running the source term code package in Elebra MX-850

    International Nuclear Information System (INIS)

    Guimaraes, A.C.F.; Goes, A.G.A.

    1988-01-01

    The Source Term Code Package (STCP) is one of the main tools applied in calculations of the behavior of fission products from nuclear power plants. It is a set of computer codes to assist the calculation of the radioactive materials released from the metallic containment of power reactors to the environment during a severe reactor accident. The original version of STCP runs on SDC computer systems, but as it is written in FORTRAN 77 it is possible to run it on other systems such as IBM, Burroughs, Elebra, etc. The Elebra MX-8500 version of STCP contains 5 codes: March 3, Trapmelt, Tcca, Vanessa and Nava. The example presented in this report considers a small-LOCA accident in a PWR-type reactor. (M.I.)

  7. Lysimeter data as input to performance assessment source term codes

    International Nuclear Information System (INIS)

    McConnell, J.W. Jr.; Rogers, R.D.; Sullivan, T.

    1992-01-01

    The Field Lysimeter Investigation: Low-Level Waste Data Base Development Program is obtaining information on the performance of radioactive waste in a disposal environment. Waste forms fabricated using ion-exchange resins from EPICOR-II prefilters employed in the cleanup of the Three Mile Island (TMI) Nuclear Power Station are being tested to develop a low-level waste data base and to obtain information on the survivability of waste forms in a disposal environment. In this paper, radionuclide releases from waste forms in the first seven years of sampling are presented and discussed. Application of the lysimeter data to performance assessment source term models is presented. Initial results from the use of the data in two models are discussed

  8. Tank waste source term inventory validation. Volume II. Letter report

    International Nuclear Information System (INIS)

    1995-04-01

    This document comprises Volume II of the Letter Report entitled Tank Waste Source Term Inventory Validation. This volume contains Appendix C, Radionuclide Tables, and Appendix D, Chemical Analyte Tables. The sample data for selection of 11 radionuclides and 24 chemical analytes were extracted from six separate sample data sets, were arranged in a tabular format and were plotted on scatter plots for all of the 149 single-shell tanks, the 24 double-shell tanks and the four aging waste tanks. The solid and liquid sample data was placed in separate tables and plots. The sample data and plots were compiled from the following data sets: characterization raw sample data, recent core samples, D. Braun data base, Wastren (Van Vleet) data base, TRAC and HTCE inventories

  9. The EC CAST project (carbon-14 source term)

    International Nuclear Information System (INIS)

    Williams, S. J.

    2015-01-01

    Carbon-14 is a key radionuclide in the assessment of the safety of underground geological disposal facilities for radioactive wastes. It is possible for carbon-14 to be released from waste packages in a variety of chemical forms, both organic and inorganic, and as dissolved or gaseous species. The EC CAST (CArbon-14 Source Term) project aims to develop understanding of the generation and release of carbon-14 from radioactive waste materials under conditions relevant to packaging and disposal. It focuses on the release of carbon-14 from irradiated metals (steels and zirconium alloys), from irradiated graphite and from spent ion-exchange resins. The CAST consortium brings together 33 partners. CAST commenced in October 2013 and this paper describes progress to March 2015. The main activities during this period were reviews of the current status of knowledge, the identification and acquisition of suitable samples and the design of experiments and analytical procedures. (authors)

  10. Source term development for tritium at the Sheffield disposal site

    International Nuclear Information System (INIS)

    MacKenzie, D.R.; Barletta, R.E.; Smalley, J.F.; Kempf, C.R.; Davis, R.E.

    1984-01-01

    The Sheffield low-level radioactive waste disposal site, which ceased operation in 1978, has been the focus of modeling efforts by the NRC for the purpose of predicting long-term site behavior. To provide the NRC with the information required for its modeling effort, a study to define the source term for tritium in eight trenches at the Sheffield site has been undertaken. Tritium is of special interest since significant concentrations of the isotope have been found in groundwater samples taken at the site and at locations outside the site boundary. Previous estimates of tritium site inventory at Sheffield are in wide disagreement. In this study, the tritium inventory in the eight trenches was estimated by reviewing the radioactive shipping records (RSRs) for waste buried in these trenches. It has been found that the tritium shipped for burial at the site was probably higher than previously estimated. In the eight trenches surveyed, which amount to roughly one half the total volume and activity buried at Sheffield, approximately 2350 Ci of tritium from non-fuel cycle sources were identified. The review of RSRs also formed the basis for obtaining waste package descriptions and for contacting large waste generators to obtain more detailed information regarding these waste packages. As a result of this review and the selected generator contacts, the non-fuel cycle tritium waste was categorized. The tritium releases from each of these waste categories were modeled. The results of this modeling effort are presented for each of the eight trenches selected. 3 references, 2 figures

  11. The Phebus Fission Product and Source Term International Programmes

    International Nuclear Information System (INIS)

    Clement, B.; Zeyen, R.

    2005-01-01

    The international Phebus FP programme, initiated in 1988, is one of the major research programmes on light water reactor severe accidents. After a short description of the facility and of the test matrix, the main outcomes and results of the first four integral tests are provided and analysed. Several results were unexpected and some are of importance for safety analyses, particularly concerning fuel degradation, cladding oxidation, the chemical form of some fission products, especially iodine, the effect of control rod materials on degradation and chemistry, and iodine behaviour in the containment. Prediction capabilities of calculation tools have largely been improved as a result of this research effort. However, significant uncertainties remain for a number of phenomena, requiring detailed physical analysis and the implementation of improved models in codes, sustained by a number of separate-effect experiments. This is the subject of the new Source Term programme for a better understanding of the phenomenology of important safety issues, in accordance with priorities defined in the EURSAFE project of the 5th European Framework Programme, which aims at reducing the uncertainties in Source Term analyses. It covers iodine chemistry, the impact of boron carbide control rod degradation and oxidation, air ingress situations and fission product release from fuel. Regarding the interpretation of Phebus, an international co-operation has been established for over ten years, which has been particularly helpful for the improvement and common understanding of severe accident phenomena. A few months ago, the Phebus community welcomed representatives of a large number of organisations from the following new European countries: the Czech Republic, Hungary, Lithuania, Slovakia and Slovenia, and also from Bulgaria and Romania. (author)

  12. On synchronous parallel computations with independent probabilistic choice

    International Nuclear Information System (INIS)

    Reif, J.H.

    1984-01-01

    This paper introduces probabilistic choice to synchronous parallel machine models, in particular parallel RAMs. The power of probabilistic choice in parallel computations is illustrated by parallelizing some known probabilistic sequential algorithms. The authors characterize the computational complexity of time-, space-, and processor-bounded probabilistic parallel RAMs in terms of the computational complexity of probabilistic sequential RAMs. They show that parallelism uniformly speeds up time-bounded probabilistic sequential RAM computations by nearly a quadratic factor. They also show that probabilistic choice can be eliminated from parallel computations by introducing nonuniformity

  13. Conceptual model for deriving the repository source term

    International Nuclear Information System (INIS)

    Alexander, D.H.; Apted, M.J.; Liebetrau, A.M.; Doctor, P.G.; Williford, R.E.; Van Luik, A.E.

    1984-11-01

    Part of a strategy for evaluating the compliance of geologic repositories with federal regulations is a modeling approach that would provide realistic release estimates for a particular configuration of the engineered-barrier system. The objective is to avoid worst-case bounding assumptions that are physically impossible or excessively conservative and to obtain probabilistic estimates of (1) the penetration time for metal barriers and (2) radionuclide-release rates for individually simulated waste packages after penetration has occurred. The conceptual model described in this paper will assume that release rates are explicitly related to such time-dependent processes as mass transfer, dissolution and precipitation, radionuclide decay, and variations in the geochemical environment. The conceptual model will take into account the reduction in the rates of waste-form dissolution and metal corrosion due to a buildup of chemical reaction products. The sorptive properties of the metal-barrier corrosion products in proximity to the waste form surface will also be included. Cumulative releases from the engineered-barrier system will be calculated by summing the releases from a probabilistically generated population of individual waste packages. 14 refs., 7 figs
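
    The final sentence of this abstract, summing releases over a probabilistically generated population of waste packages, can be illustrated with a very small Monte Carlo sketch: each package gets a sampled container penetration time and a sampled fractional dissolution rate, and the cumulative release is the sum over packages. The distributions, parameter values and the omission of decay are illustrative assumptions, not the authors' model.

```python
# Hedged sketch: cumulative engineered-barrier release summed over sampled waste packages.
import numpy as np

rng = np.random.default_rng(7)
n_packages = 1000
inventory = 1.0e3                                                         # activity per package (arbitrary units)
t_pen = rng.lognormal(mean=np.log(3000.0), sigma=0.5, size=n_packages)   # penetration times (years)
frac_rate = rng.lognormal(mean=np.log(1e-5), sigma=0.7, size=n_packages) # fractional dissolution rate (1/year)

def cumulative_release(t):
    """Total release from all packages up to time t (decay neglected for clarity)."""
    active = np.clip(t - t_pen, 0.0, None)         # time since each package was penetrated
    frac = 1.0 - np.exp(-frac_rate * active)       # fraction of the inventory dissolved so far
    return float(np.sum(inventory * frac))

for t in (1_000, 10_000, 100_000):
    print(f"t = {t:>7} yr  cumulative release = {cumulative_release(t):10.1f}")
```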

  14. Development of a seismic source model for probabilistic seismic hazard assessment of nuclear power plant sites in Switzerland: the view from PEGASOS Expert Group 4 (EG1d)

    International Nuclear Information System (INIS)

    Wiemer, S.; Garcia-Fernandez, M.; Burg, J.-P.

    2009-01-01

    We present a seismogenic source model for site-specific probabilistic seismic hazard assessment at the sites of Swiss nuclear power plants. Our model is one of four developed in the framework of the PEGASOS project; it contains a logic tree with nine levels of decision-making. The two primary sources of input used in the areal zonation developed by us are the historical and instrumental seismicity record and large-scale geological/rheological units. From this, we develop a zonation of six macro zones, refined in a series of seven decision steps up to a maximum of 13 zones. Within zones, activity rates are either assumed homogeneous or smoothed using a Gaussian kernel with width of 5 or 15 km. To estimate recurrence rate, we assume a double truncated Gutenberg-Richter law, and consider five models of recurrence parameters with different degrees of freedom. Models are weighted in the logic tree using a weighted Akaike score. The maximum magnitude is estimated following the EPRI approach. We perform extensive sensitivity analyses in rate and hazard space in order to assess the role of de-clustering, the completeness model, quarry contamination, border properties, stationarity, regional b-value and magnitude-dependent hypocentral depth. (author)
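
    As a worked illustration of the recurrence model named in this abstract, the sketch below evaluates annual event rates per magnitude bin from a doubly truncated Gutenberg-Richter law. The a- and b-values, the truncation magnitudes and the bin width are placeholders, not PEGASOS results.

```python
# Hedged sketch: doubly truncated Gutenberg-Richter recurrence for one source zone.
import numpy as np

a, b = 3.5, 1.0            # assumed productivity and b-value
m_min, m_max = 4.0, 6.5    # lower magnitude of interest, assumed maximum magnitude
beta = b * np.log(10.0)

def rate_above(m):
    """Annual rate of events with magnitude >= m under the truncated G-R law."""
    if m >= m_max:
        return 0.0
    lam_min = 10.0 ** (a - b * m_min)      # rate of events with magnitude >= m_min
    num = np.exp(-beta * (m - m_min)) - np.exp(-beta * (m_max - m_min))
    den = 1.0 - np.exp(-beta * (m_max - m_min))
    return lam_min * num / den

edges = np.arange(m_min, m_max + 1e-9, 0.5)
for lo, hi in zip(edges[:-1], edges[1:]):
    print(f"M {lo:.1f}-{hi:.1f}: {rate_above(lo) - rate_above(hi):.4f} events/yr")
```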

  15. Development of a seismic source model for probabilistic seismic hazard assessment of nuclear power plant sites in Switzerland: the view from PEGASOS Expert Group 4 (EG1d)

    Energy Technology Data Exchange (ETDEWEB)

    Wiemer, S. [Institute of Geophysics, ETH Zuerich, Zuerich (Switzerland); Garcia-Fernandez, M. [Spanish Council for Scientific Research, Museum of Natural History, Dept. of Volcanology and Geophysics, Madrid (Spain); Burg, J.-P. [Institute of Geology, ETH Zuerich, Zuerich (Switzerland)

    2009-05-15

    We present a seismogenic source model for site-specific probabilistic seismic hazard assessment at the sites of Swiss nuclear power plants. Our model is one of four developed in the framework of the PEGASOS project; it contains a logic tree with nine levels of decision-making. The two primary sources of input used in the areal zonation developed by us are the historical and instrumental seismicity record and large-scale geological/rheological units. From this, we develop a zonation of six macro zones, refined in a series of seven decision steps up to a maximum of 13 zones. Within zones, activity rates are either assumed homogeneous or smoothed using a Gaussian kernel with width of 5 or 15 km. To estimate recurrence rate, we assume a double truncated Gutenberg-Richter law, and consider five models of recurrence parameters with different degrees of freedom. Models are weighted in the logic tree using a weighted Akaike score. The maximum magnitude is estimated following the EPRI approach. We perform extensive sensitivity analyses in rate and hazard space in order to assess the role of de-clustering, the completeness model, quarry contamination, border properties, stationarity, regional b-value and magnitude-dependent hypocentral depth. (author)

  16. A linear process-algebraic format for probabilistic systems with data (extended version)

    NARCIS (Netherlands)

    Katoen, Joost P.; van de Pol, Jan Cornelis; Stoelinga, Mariëlle Ida Antoinette; Timmer, Mark

    2010-01-01

    This paper presents a novel linear process-algebraic format for probabilistic automata. The key ingredient is a symbolic transformation of probabilistic process algebra terms that incorporate data into this linear format while preserving strong probabilistic bisimulation. This generalises similar

  17. A linear process-algebraic format for probabilistic systems with data

    NARCIS (Netherlands)

    Katoen, Joost P.; van de Pol, Jan Cornelis; Stoelinga, Mariëlle Ida Antoinette; Timmer, Mark; Gomes, L.; Khomenko, V.; Fernandes, J.M.

    This paper presents a novel linear process algebraic format for probabilistic automata. The key ingredient is a symbolic transformation of probabilistic process algebra terms that incorporate data into this linear format while preserving strong probabilistic bisimulation. This generalises similar

  18. A simplified approach to evaluating severe accident source term for PWR

    International Nuclear Information System (INIS)

    Huang, Gaofeng; Tong, Lili; Cao, Xuewu

    2014-01-01

    Highlights: • Traditional source term evaluation approaches have been studied. • A simplified approach to source term evaluation for a 600 MW PWR is studied. • Five release categories are established. - Abstract: In the early design of NPPs, no specific severe accident source term evaluation was considered, and some generic source terms have been used for some NPPs. In order to obtain a best estimate, a specific source term evaluation should be carried out for each NPP. The traditional source term evaluation approaches (the mechanistic approach and the parametric approach) have some difficulties associated with their implementation and are not consistent with cost-benefit assessment. A simplified approach for evaluating the severe accident source term for a PWR is studied here. For the simplified approach, a simplified containment event tree is established. Through representative case selection, weighted coefficient evaluation, computation of the representative source term cases and a weighted computation, five containment release categories are established: containment bypass, containment isolation failure, containment early failure, containment late failure and intact containment
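
    One plausible reading of the "weighted computation" step is sketched below: representative source term cases are combined with case weights to give a mean source term per release category, and category conditional probabilities then give a frequency-weighted overall release. The category names follow the abstract; every number, weight and the two-case structure are invented placeholders, not results of the cited study.

```python
# Hedged sketch: weighted combination of representative source term cases per category.
import numpy as np

categories = ["bypass", "isolation failure", "early failure", "late failure", "intact"]
# conditional probability of each category given core damage (assumed, sums to 1)
p_cat = np.array([0.02, 0.03, 0.10, 0.35, 0.50])
# two representative Cs release-fraction cases per category and their weights (assumed)
rep_frac = np.array([[0.30, 0.10], [0.08, 0.03], [0.05, 0.02], [0.008, 0.003], [1e-5, 1e-6]])
w_case = np.array([[0.6, 0.4]] * 5)

cat_mean = np.sum(rep_frac * w_case, axis=1)     # weighted source term per release category
overall = float(np.sum(p_cat * cat_mean))        # frequency-weighted mean release fraction
for name, f in zip(categories, cat_mean):
    print(f"{name:18s} mean Cs release fraction ~ {f:.2e}")
print("frequency-weighted mean release fraction:", f"{overall:.2e}")
```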

  19. Uncertainty analysis methods for quantification of source terms using a large computer code

    International Nuclear Information System (INIS)

    Han, Seok Jung

    1997-02-01

    Quantification of uncertainties in the source term estimations by a large computer code, such as MELCOR or MAAP, is an essential part of current probabilistic safety assessments (PSAs). The main objectives of the present study are (1) to investigate the applicability of a combined procedure of the response surface method (RSM), based on input determined from a statistical design, and the Latin hypercube sampling (LHS) technique for the uncertainty analysis of CsI release fractions under a hypothetical severe accident sequence of a station blackout at the Young-Gwang nuclear power plant, using the MAAP3.0B code as a benchmark problem; and (2) to propose a new measure of uncertainty importance based on distributional sensitivity analysis. On the basis of the results obtained in the present work, the RSM is recommended as the principal tool for an overall uncertainty analysis in source term quantification, while the LHS is used in the calculation of standardized regression coefficients (SRC) and standardized rank regression coefficients (SRRC) to determine the subset of the most important input parameters in the final screening step and to check the cumulative distribution functions (cdfs) obtained by the RSM. Verification of the response surface model for sufficient accuracy is a prerequisite for the reliability of the final results obtained by the combined procedure proposed in the present work. In the present study a new measure has been developed that utilizes the metric distance between cumulative distribution functions (cdfs). The measure has been evaluated for three different cases of distributions in order to assess its characteristics: in the first two cases the distributions are known analytically, while in the third the distribution is unknown. The first case is given by symmetric analytical distributions; the second consists of two asymmetric distributions whose skewness is non-zero
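
    The sketch below shows the sampling-and-ranking part of such a procedure in miniature: Latin hypercube sampling of uncertain inputs, evaluation of a toy response standing in for a MAAP/MELCOR run, and standardized (and rank) regression coefficients used to rank input importance. Input names, ranges and the response function are assumptions; requires SciPy 1.7+ for scipy.stats.qmc.

```python
# Hedged sketch: LHS sampling plus SRC/SRRC importance ranking on a toy response.
import numpy as np
from scipy.stats import qmc, rankdata

rng = np.random.default_rng(11)
n, names = 200, ["clad_temp", "sump_pH", "leak_area"]

sampler = qmc.LatinHypercube(d=len(names), seed=11)
u = sampler.random(n)                                   # LHS points on the unit cube
x = qmc.scale(u, l_bounds=[1500.0, 5.0, 1e-4], u_bounds=[2500.0, 9.0, 1e-2])

# toy response: CsI release fraction driven by clad temperature and leak area (not pH)
y = 1e-4 * np.exp((x[:, 0] - 1500) / 400) + 5.0 * x[:, 2] + rng.normal(0, 0.002, n)

def std_coeffs(xm, ym):
    xs = (xm - xm.mean(0)) / xm.std(0)
    ys = (ym - ym.mean()) / ym.std()
    return np.linalg.lstsq(xs, ys, rcond=None)[0]       # standardized regression coefficients

src = std_coeffs(x, y)
srrc = std_coeffs(rankdata(x, axis=0).astype(float), rankdata(y).astype(float))
for nm, a_, b_ in zip(names, src, srrc):
    print(f"{nm:10s} SRC = {a_:+.2f}  SRRC = {b_:+.2f}")
```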

  20. Evaluation Plan on In-vessel Source Term in PGSFR

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seung Won; Ha, Kwi-Seok; Ahn, Sang June; Lee, Kwi Lim; Jeong, Taekyeong [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    This strategy requires nuclear plants to have features that prevent radionuclide release and multiple barriers to the escape from the plant of any radionuclides that are released despite preventive measures. Considerations of the ability to prevent and mitigate the release of radionuclides arise at numerous places in the safety regulation of nuclear plants. The effectiveness of mitigative capabilities in nuclear plants is subject to quantitative analysis. The radionuclide input to these quantitative analyses of effectiveness is the Source Term (ST). All features of the composition, magnitude, timing, chemical form and physical form of an accidental radionuclide release constitute the ST. The ST is also defined as the release of radionuclides from the fuel and coolant into the containment, and subsequently to the environment. The in-vessel STs of PGSFR will be estimated using the methodology of the ANL-ART-38 report in addition to the 4S methodology. The in-vessel STs are calculated in several phases: the inventory of each radionuclide is calculated with the ORIGEN-2 code using realistic burnup conditions; the ST released from the core to the primary sodium is calculated using the assumptions of the ANL methodology; and lastly, the ST released from the primary sodium to the cover gas space is calculated using equations and experimental data.

  1. Source term calculations - Ringhals 2 PWR. Interim report

    Energy Technology Data Exchange (ETDEWEB)

    Johansson, Lise-Lotte

    1998-03-01

    This project was performed within the fifth and final phase of sub-project RAK-2.1 of the Nordic Co-operative Reactor Safety Program, NKS. RAK-2.1 has also included studies of reflooding of a degraded core, recriticality and late-phase melt progression. Earlier source term calculations for Swedish nuclear power plants are based on the integral code MAAP. A need was recognised to compare these calculations with calculations done with mechanistic codes. In the present work SCDAP/RELAP5 and CONTAIN were used. Only limited results could be obtained within the frame of RAK-2.1, since many problems were encountered using the SCDAP/RELAP5 code. The main obstacle was the extremely long execution times of the MOD3.1 version, but also some dubious fission product calculations. However, some interesting results were obtained for the studied sequence, a total loss of AC power. The report describes the modelling approach for SCDAP/RELAP5 and CONTAIN, and discusses results for the transient including the event of a surge line creep rupture. The study will probably be completed later, provided that an improved SCDAP/RELAP5 code version becomes available. 8 refs, 16 figs, 5 tabs

  2. Source term experiments project (STEP): aerosol characterization system

    International Nuclear Information System (INIS)

    Schlenger, B.J.; Dunn, P.F.

    1985-01-01

    A series of four experiments has been conducted at Argonne National Laboratory's TREAT reactor. These experiments, which are sponsored by an international consortium organized by the Electric Power Research Institute, are designed to investigate the source term, i.e., the type, quantity and timing of release of radioactive fission products from a light water reactor to the environment in the event of a severe accident in which the core is insufficiently cooled. The STEP tests have been designed to provide some of the necessary data regarding the magnitude and release rates of volatile fission products from degraded fuel pins, their physical and chemical characteristics, and the aerosol formation and transport phenomena of those fission products that condense to form particles in the cooler regions of the reactor beyond the core. These are in-pile experiments, in which the test fuels are heated in a nuclear test reactor by neutron-induced fission and subsequent cladding oxidation in steam environments that simulate as closely as practical the predicted severe reactor accident conditions. The test sequences cover a range of pressure and fuel heatup rate, and include the effect of Ag/In/Cd control rod material. 1 ref., 8 figs., 1 tab

  3. Verification test calculations for the Source Term Code Package

    International Nuclear Information System (INIS)

    Denning, R.S.; Wooton, R.O.; Alexander, C.A.; Curtis, L.A.; Cybulskis, P.; Gieseke, J.A.; Jordan, H.; Lee, K.W.; Nicolosi, S.L.

    1986-07-01

    The purpose of this report is to demonstrate the reasonableness of the Source Term Code Package (STCP) results. Hand calculations have been performed spanning a wide variety of phenomena within the context of a single accident sequence, a loss of all ac power with late containment failure, in the Peach Bottom (BWR) plant, and compared with STCP results. The report identifies some of the limitations of the hand calculation effort. The processes involved in a core meltdown accident are complex and coupled. Hand calculations by their nature must deal with gross simplifications of these processes. Their greatest strength is as an indicator that a computer code contains an error, for example that it doesn't satisfy basic conservation laws, rather than in showing the analysis accurately represents reality. Hand calculations are an important element of verification but they do not satisfy the need for code validation. The code validation program for the STCP is a separate effort. In general the hand calculation results show that models used in the STCP codes (e.g., MARCH, TRAP-MELT, VANESA) obey basic conservation laws and produce reasonable results. The degree of agreement and significance of the comparisons differ among the models evaluated. 20 figs., 26 tabs

  4. Evaluation Plan on In-vessel Source Term in PGSFR

    International Nuclear Information System (INIS)

    Lee, Seung Won; Ha, Kwi-Seok; Ahn, Sang June; Lee, Kwi Lim; Jeong, Taekyeong

    2016-01-01

    This strategy requires nuclear plants to have features that prevent radionuclide release and multiple barriers to the escape from the plant of any radionuclides that are released despite preventive measures. Considerations of the ability to prevent and mitigate the release of radionuclides arise at numerous places in the safety regulation of nuclear plants. The effectiveness of mitigative capabilities in nuclear plants is subject to quantitative analysis. The radionuclide input to these quantitative analyses of effectiveness is the Source Term (ST). All features of the composition, magnitude, timing, chemical form and physical form of an accidental radionuclide release constitute the ST. The ST is also defined as the release of radionuclides from the fuel and coolant into the containment, and subsequently to the environment. The in-vessel STs of PGSFR will be estimated using the methodology of the ANL-ART-38 report in addition to the 4S methodology. The in-vessel STs are calculated in several phases: the inventory of each radionuclide is calculated with the ORIGEN-2 code using realistic burnup conditions; the ST released from the core to the primary sodium is calculated using the assumptions of the ANL methodology; and lastly, the ST released from the primary sodium to the cover gas space is calculated using equations and experimental data

  5. Evaluation of severe accident risks: Methodology for the containment, source term, consequence, and risk integration analyses. Volume 1, Revision 1

    International Nuclear Information System (INIS)

    Gorham, E.D.; Breeding, R.J.; Brown, T.D.; Harper, F.T.; Helton, J.C.; Murfin, W.B.; Hora, S.C.

    1993-12-01

    NUREG-1150 examines the risk to the public from five nuclear power plants. The NUREG-1150 plant studies are Level III probabilistic risk assessments (PRAs) and, as such, they consist of four analysis components: accident frequency analysis, accident progression analysis, source term analysis, and consequence analysis. This volume summarizes the methods utilized in performing the last three components and the assembly of these analyses into an overall risk assessment. The NUREG-1150 analysis approach is based on the following ideas: (1) general and relatively fast-running models for the individual analysis components, (2) well-defined interfaces between the individual analysis components, (3) use of Monte Carlo techniques together with an efficient sampling procedure to propagate uncertainties, (4) use of expert panels to develop distributions for important phenomenological issues, and (5) automation of the overall analysis. Many features of the new analysis procedures were adopted to facilitate a comprehensive treatment of uncertainty in the complete risk analysis. Uncertainties in the accident frequency, accident progression and source term analyses were included in the overall uncertainty assessment. The uncertainties in the consequence analysis were not included in this assessment. A large effort was devoted to the development of procedures for obtaining expert opinion and the execution of these procedures to quantify parameters and phenomena for which there is large uncertainty and divergent opinions in the reactor safety community

  6. Iodine chemistry effect on source term assessments. A MELCOR 1.8.6 YT study of a PWR severe accident sequence

    International Nuclear Information System (INIS)

    Herranz, Luis E.; Garcia, Monica; Otero, Bernadette

    2009-01-01

    Level-2 Probabilistic Safety Analysis has proven to be a powerful tool for giving insights into multiple aspects of severe accidents: the phenomena with the greatest potential to lead to containment failure, safety system performance, and even the identification of additional accident management measures that could mitigate the consequences of such an event. A major result of level-2 PSA is the iodine content of the Source Term, since it is mainly responsible for the radiological impact during the first few days after a hypothetical severe accident. Iodine chemistry is known to considerably affect iodine behavior and, although understanding has improved substantially since the early 90's, a thorough understanding is still missing and most PSA studies do not address it when assessing severe accident scenarios. This paper emphasizes the quantitative and qualitative significance of considering iodine chemistry in level-2 PSA estimates. To do so, a cold leg break, low pressure severe accident sequence of an actual pressurized water reactor has been analyzed with the MELCOR 1.8.6 YT code. Two sets of calculations, with and without chemistry, have been carried out and compared. The study shows that iodine chemistry could result in an iodine release to the environment roughly twice as high, with around 60% of the released iodine in gaseous form. From these results it is concluded that exploratory studies on the potential effect of iodine chemistry on source term estimates should be carried out. (author)

  7. A Bootstrap-Based Probabilistic Optimization Method to Explore and Efficiently Converge in Solution Spaces of Earthquake Source Parameter Estimation Problems: Application to Volcanic and Tectonic Earthquakes

    Science.gov (United States)

    Dahm, T.; Heimann, S.; Isken, M.; Vasyura-Bathke, H.; Kühn, D.; Sudhaus, H.; Kriegerowski, M.; Daout, S.; Steinberg, A.; Cesca, S.

    2017-12-01

    Seismic source and moment tensor waveform inversion is often ill-posed or non-unique if station coverage is poor or signals are weak. Therefore, the interpretation of moment tensors can become difficult if the full model space, including all its trade-offs and uncertainties, is not explored. This is especially true for non-double-couple components of weak or shallow earthquakes, as for instance found in volcanic, geothermal or mining environments. We developed a bootstrap-based probabilistic optimization scheme (Grond), which is based on pre-calculated Green's function full waveform databases (e.g. the fomosto tool, doi.org/10.5880/GFZ.2.1.2017.001). Grond is able to efficiently explore the full model space, the trade-offs and the uncertainties of source parameters. The program is highly flexible with respect to adaptation to specific problems, the design of objective functions, and the diversity of empirical datasets. It uses integrated, robust waveform data processing based on a newly developed Python toolbox for seismology (Pyrocko, see Heimann et al., 2017, http://doi.org/10.5880/GFZ.2.1.2017.001), and allows for visual inspection of many aspects of the optimization problem. Grond has been applied to CMT moment tensor inversion using W-phases, to nuclear explosions in Korea, to meteorite atmospheric explosions, to volcano-tectonic events during caldera collapse and to intra-plate volcanic and tectonic crustal events. Grond can be used to simultaneously optimize seismological waveforms, amplitude spectra and static displacements from geodetic data such as InSAR and GPS (e.g. KITE, Isken et al., 2017, http://doi.org/10.5880/GFZ.2.1.2017.002). We present examples of Grond optimizations to demonstrate the advantage of a full exploration of source parameter uncertainties for interpretation.

  8. Probabilistic finite elements

    Science.gov (United States)

    Belytschko, Ted; Wing, Kam Liu

    1987-01-01

    In the Probabilistic Finite Element Method (PFEM), finite element methods have been efficiently combined with second-order perturbation techniques to provide an effective method for informing the designer of the range of response which is likely in a given problem. The designer must provide as input the statistical character of the input variables, such as yield strength, load magnitude, and Young's modulus, by specifying their mean values and their variances. The output then consists of the mean response and the variance in the response. Thus the designer is given a much broader picture of the predicted performance than with simply a single response curve. These methods are applicable to a wide class of problems, provided that the scale of randomness is not too large and the probabilistic density functions possess decaying tails. By incorporating the computational techniques we have developed in the past 3 years for efficiency, the probabilistic finite element methods are capable of handling large systems with many sources of uncertainties. Sample results for an elastic-plastic ten-bar structure and an elastic-plastic plane continuum with a circular hole subject to cyclic loadings with the yield stress on the random field are given.
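
    The mean-and-variance propagation idea described in this abstract can be shown on a scalar stand-in: a one-element axial bar with displacement u = F L / (E A), random Young's modulus and load, a first-order variance estimate and a second-order mean correction, checked against Monte Carlo. This is an illustrative sketch of the perturbation idea, not Belytschko and Liu's PFEM implementation, and all numbers are assumed.

```python
# Hedged sketch: perturbation propagation of input mean/variance to response mean/variance.
import numpy as np

L_len, A = 2.0, 1e-3                       # bar length (m), cross-section (m^2)
mu_E, sd_E = 200e9, 20e9                   # Young's modulus mean/std (Pa)
mu_F, sd_F = 1e4, 2e3                      # load mean/std (N)

u = lambda E, F: F * L_len / (E * A)       # response (tip displacement)

# first-order variance and second-order mean correction, independent inputs E and F
du_dE = -mu_F * L_len / (mu_E**2 * A)
du_dF = L_len / (mu_E * A)
d2u_dE2 = 2 * mu_F * L_len / (mu_E**3 * A)
mean_u = u(mu_E, mu_F) + 0.5 * d2u_dE2 * sd_E**2
var_u = (du_dE * sd_E) ** 2 + (du_dF * sd_F) ** 2

rng = np.random.default_rng(5)
u_mc = u(rng.normal(mu_E, sd_E, 200_000), rng.normal(mu_F, sd_F, 200_000))
print(f"perturbation: mean = {mean_u:.4e} m, std = {np.sqrt(var_u):.4e} m")
print(f"Monte Carlo : mean = {u_mc.mean():.4e} m, std = {u_mc.std():.4e} m")
```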

  9. The influence of source term release parameters on health effects

    International Nuclear Information System (INIS)

    Jeong, Jong Tae; Ha, Jae Joo

    1998-08-01

    In this study, the influence of source term release parameters on health effects was examined. This is very useful in identifying the relative importance of release parameters and can be an important factor in developing a strategy for reducing offsite risks. The release parameters investigated in this study are release height, heat content, fuel burnup, release time, release duration, and warning time. The health effects affected by the change of release parameters are early fatalities, cancer fatalities, early injuries, cancer injuries, early fatality risk, population-weighted early fatality risk, population-weighted cancer fatality risk, effective whole body population dose, population exceeding an early acute red bone marrow dose of 1.5 Sv, and the distance at which early fatalities are expected to occur. As release height increases, the values of early health effects such as early fatalities and injuries decrease; however, the release height does not have a significant influence on late health effects. The values of both early and late health effects decrease as heat content increases. An increase in fuel burnup, i.e., an increase of the core inventories, increases the late health effects but has a small influence on the early health effects, although the number of early injuries increases as the fuel burnup increases. An increase in release time has a very similar influence on both the early and late health effects: as the release time increases to 2 hours, the values of the health effects increase and then decrease rapidly. As release duration increases, the values of late health effects increase slightly while the values of early health effects decrease. As warning time increases to 2 hours, the values of late health effects decrease and then show no further variation. The number of early injuries decreases rapidly as the warning time increases to 2 hours; however, the number of early fatalities and the early fatality risk increase as the warning time increases

  10. Probabilistic Harmonic Modeling of Wind Power Plants

    DEFF Research Database (Denmark)

    Guest, Emerson; Jensen, Kim H.; Rasmussen, Tonny Wederberg

    2017-01-01

    A probabilistic sequence domain (SD) harmonic model of a grid-connected voltage-source converter is used to estimate harmonic emissions in a wind power plant (WPP) comprised of Type-IV wind turbines. The SD representation naturally partitions converter-generated voltage harmonics into those with deterministic phase and those with probabilistic phase. A case study performed on a string of ten 3 MW, Type-IV wind turbines implemented in PSCAD was used to verify the probabilistic SD harmonic model. The probabilistic SD harmonic model can be employed in the planning phase of WPP projects to assess harmonic...
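
    The deterministic-versus-probabilistic-phase distinction mentioned above can be illustrated with a small Monte Carlo phasor sum: ten units emitting the same harmonic magnitude but with uniformly random phases add far below the arithmetic sum. The per-unit magnitude, the ten-turbine string and the uniform phase assumption are placeholders echoing the case study, not the paper's model.

```python
# Hedged sketch: aggregating one harmonic order across turbines with random phase.
import numpy as np

rng = np.random.default_rng(2)
n_turbines, n_trials = 10, 50_000
mag = 0.01                                  # assumed per-unit harmonic magnitude per converter

phases = rng.uniform(0.0, 2 * np.pi, size=(n_trials, n_turbines))
total = np.abs(np.sum(mag * np.exp(1j * phases), axis=1))   # phasor sum per trial

print("deterministic-phase (arithmetic) sum:", n_turbines * mag)
print("mean probabilistic-phase sum        :", round(float(total.mean()), 4))
print("95th percentile                     :", round(float(np.quantile(total, 0.95)), 4))
```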

  11. Conceptual model for deriving the repository source term

    International Nuclear Information System (INIS)

    Alexander, D.H.; Apted, M.J.; Liebetrau, A.M.; Van Luik, A.E.; Williford, R.E.; Doctor, P.G.; Pacific Northwest Lab., Richland, WA; Roy F. Weston, Inc./Rogers and Assoc. Engineering Corp., Rockville, MD)

    1984-01-01

    Part of a strategy for evaluating the compliance of geologic repositories with Federal regulations is a modeling approach that would provide realistic release estimates for a particular configuration of the engineered-barrier system. The objective is to avoid worst-case bounding assumptions that are physically impossible or excessively conservative and to obtain probabilistic estimates of (1) the penetration time for metal barriers and (2) radionuclide-release rates for individually simulated waste packages after penetration has occurred. The conceptual model described in this paper will assume that release rates are explicitly related to such time-dependent processes as mass transfer, dissolution and precipitation, radionuclide decay, and variations in the geochemical environment. The conceptual model will take into account the reduction in the rates of waste-form dissolution and metal corrosion due to a buildup of chemical reaction products. The sorptive properties of the metal-barrier corrosion products in proximity to the waste form surface will also be included. Cumulative releases from the engineered-barrier system will be calculated by summing the releases from a probabilistically generated population of individual waste packages. 14 refs., 7 figs

  12. When probabilistic seismic hazard climbs volcanoes: the Mt. Etna case, Italy – Part 1: Model components for sources parameterization

    Directory of Open Access Journals (Sweden)

    R. Azzaro

    2017-11-01

    Full Text Available The volcanic region of Mt. Etna (Sicily, Italy) represents a perfect laboratory for testing innovative approaches to seismic hazard assessment. This is largely due to the long record of historical and recent observations of seismic and tectonic phenomena, the high quality of various types of geophysical monitoring and, particularly, the rapid geodynamics, which clearly reveal some seismotectonic processes. We present here the model components and the procedures adopted for defining the seismic sources to be used in a new generation of probabilistic seismic hazard assessment (PSHA), the first results and maps of which are presented in a companion paper, Peruzza et al. (2017). The sources include, with increasing complexity, seismic zones, individual faults and gridded point sources that are obtained by integrating geological field data with long and short earthquake datasets (the historical macroseismic catalogue, which covers about 3 centuries, and a high-quality instrumental location database for the last decades). The analysis of the frequency-magnitude distribution identifies two main fault systems within the volcanic complex featuring different seismic rates that are controlled essentially by volcano-tectonic processes. We discuss the variability of the mean occurrence times of major earthquakes along the main Etnean faults by using an historical approach and a purely geologic method. We derive a magnitude-size scaling relationship specifically for this volcanic area, which has been implemented into a recently developed software tool – FiSH (Pace et al., 2016) – that we use to calculate the characteristic magnitudes and the related mean recurrence times expected for each fault. Results suggest that for the Mt. Etna area the traditional assumptions of uniform and Poissonian seismicity can be relaxed; a time-dependent fault-based modeling, joined with a 3-D imaging of volcano-tectonic sources depicted by the recent instrumental seismicity, can therefore be

  13. Input to the PRAST computer code used in the SRS probabilistic risk assessment

    International Nuclear Information System (INIS)

    Kearnaghan, D.P.

    1992-01-01

    The PRAST (Production Reactor Algorithm for Source Terms) computer code was developed by Westinghouse Savannah River Company and Science Applications International Corporation for the quantification of source terms for the Savannah River Site (SRS) Reactor Probabilistic Risk Assessment. PRAST requires as input a set of release fractions, decontamination factors, transfer fractions and source term characteristics that accurately reflect the conditions evaluated by PRAST. This document links the analyses which form the basis for the PRAST input parameters. In addition, it gives the distributions of those input parameters that are uncertain and considered to be important to the evaluation of the source terms to the environment

  14. Probabilistic Tsunami Hazard Analysis

    Science.gov (United States)

    Thio, H. K.; Ichinose, G. A.; Somerville, P. G.; Polet, J.

    2006-12-01

    thousands of earthquake scenarios. We have carried out preliminary tsunami hazard calculations for different return periods for western North America and Hawaii based on thousands of earthquake scenarios around the Pacific rim and along the coast of North America. We will present tsunami hazard maps for several return periods and also discuss how to use these results for probabilistic inundation and runup mapping. Our knowledge of certain types of tsunami sources is very limited (e.g. submarine landslides), but a probabilistic framework for tsunami hazard evaluation can include even such sources and their uncertainties and present the overall hazard in a meaningful and consistent way.
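
    The scenario-based hazard calculation described here can be illustrated in miniature: each earthquake scenario contributes its annual rate times the probability that its modelled wave height exceeds a given level, and the summed rates give a hazard curve at one coastal site. The scenario rates, median heights and the lognormal scatter term are synthetic assumptions, not values from the cited work.

```python
# Hedged sketch: aggregating scenario exceedance probabilities into a tsunami hazard curve.
import numpy as np
from scipy.stats import lognorm

rng = np.random.default_rng(4)
n_scen = 2000
rate = rng.uniform(1e-5, 1e-3, n_scen)          # annual rate of each scenario (1/yr)
median_h = rng.uniform(0.1, 6.0, n_scen)        # median wave height at the site (m)
sigma = 0.5                                     # log-std of height given the scenario

levels = np.array([0.5, 1.0, 2.0, 4.0])         # hazard levels of interest (m)
# annual rate of exceeding each level: sum over scenarios of rate * P(h > level | scenario)
exceed_rate = np.array([
    np.sum(rate * lognorm.sf(lev, s=sigma, scale=median_h)) for lev in levels
])
for lev, lam in zip(levels, exceed_rate):
    p50 = 1.0 - np.exp(-lam * 50.0)             # Poissonian probability of exceedance in 50 yr
    print(f"h > {lev:3.1f} m: annual rate {lam:.2e}, 50-yr exceedance prob {p50:.3f}")
```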

  15. Techniques for long term conditioning and storage of radium sources

    International Nuclear Information System (INIS)

    Dogaru, Gheorghe; Dragolici, Felicia; Nicu, Mihaela

    2008-01-01

    The Horia Hulubei National Institute of Research and Development for Physics and Nuclear Engineering developed its own technology for conditioning the radium spent sealed radioactive sources. The laboratory dedicated to radiological characterization, identification of radium sources as well as the encapsulation of spent sealed radioactive sources was equipped with a local ventilation system, welding devices, tightness test devices as well as radiometric portable devices. Two types of capsules have been designed for conditioning of radium spent sealed radioactive sources. For these kinds of capsules different types of storage packaging were developed. Data on the radium inventory will be presented in the paper. The paper contains the description of the process of conditioning of spent sealed radioactive sources as well as the description of the capsules and packaging. The paper describes the equipment used for the conditioning of the radium spent sealed sources. (authors)

  16. Probabilistic evaluation of scenarios in long-term safety analyses. Results of the project ISIBEL; Probabilistische Bewertung von Szenarien in Langzeitsicherheitsanalysen. Ergebnisse des Vorhabens ISIBEL

    Energy Technology Data Exchange (ETDEWEB)

    Buhmann, Dieter; Becker, Dirk-Alexander; Laggiard, Eduardo; Ruebel, Andre; Spiessl, Sabine; Wolf, Jens

    2016-07-15

    Within the framework of the project ISIBEL, deterministic analyses of the radiological consequences of several possible developments of the final repository were performed (VSG: preliminary safety analysis of the Gorleben site). The report describes the probabilistic evaluation of the VSG scenarios using uncertainty and sensitivity analyses. It was shown that probabilistic analyses are important for evaluating the influence of uncertainties. The translation of the selected scenarios into computational cases and the modeling parameters used are discussed.

  17. Site-to-Source Finite Fault Distance Probability Distribution in Probabilistic Seismic Hazard and the Relationship Between Minimum Distances

    Science.gov (United States)

    Ortega, R.; Gutierrez, E.; Carciumaru, D. D.; Huesca-Perez, E.

    2017-12-01

    We present a method to compute the conditional and unconditional probability density function (PDF) of the finite fault distance distribution (FFDD). Two cases are described: lines and areas. The case of lines has a simple analytical solution while, in the case of areas, the geometrical probability of a fault based on the strike, dip, and fault segment vertices is obtained using the projection of spheres onto a piecewise rectangular surface. The cumulative distribution is computed by measuring the projection of a sphere of radius r in an effective area using an algorithm that estimates the area of a circle within a rectangle. In addition, we introduce the finite fault distance metric: the distance at which the maximum stress release occurs within the fault plane and generates the peak ground motion. The appropriate ground motion prediction equations (GMPE) for PSHA can then be applied. The conditional probability of distance given magnitude is also presented using different scaling laws. A simple model with a constant distribution of the centroid at the geometrical mean is discussed; in this model, hazard is reduced at the edges because the effective size is reduced. Nowadays there is a trend toward using extended-source distances in PSHA; however, it is not possible to separate the fault geometry from the GMPE. With this new approach it is possible to add fault rupture models, separating geometrical and propagation effects.
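
    The distance distribution for the area case can be cross-checked numerically; the sketch below is an illustrative Monte Carlo sampling over a hypothetical rectangular fault plane, not the authors' analytical geometric-probability method.

```python
import numpy as np

# Illustrative Monte Carlo cross-check: sample candidate rupture centroids uniformly
# over a rectangular fault plane and build the empirical site-to-rupture distance
# distribution. Fault geometry and site location are hypothetical.
rng = np.random.default_rng(0)

L, W, dip, z_top = 40.0, 15.0, np.radians(60.0), 2.0   # length, width (km), dip, top depth
site = np.array([10.0, 25.0, 0.0])                     # site on the surface (km)

u = rng.uniform(0.0, L, 100_000)                       # along-strike position
w = rng.uniform(0.0, W, 100_000)                       # down-dip position
# Fault plane with strike along x: the y and z coordinates follow the dip.
pts = np.column_stack([u, w * np.cos(dip), z_top + w * np.sin(dip)])

dist = np.linalg.norm(pts - site, axis=1)
print("median distance %.1f km, 5-95%% range %.1f-%.1f km"
      % (np.median(dist), *np.percentile(dist, [5, 95])))
```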

  18. Hanford tank residual waste - Contaminant source terms and release models

    International Nuclear Information System (INIS)

    Deutsch, William J.; Cantrell, Kirk J.; Krupka, Kenneth M.; Lindberg, Michael L.; Jeffery Serne, R.

    2011-01-01

    Highlights: • Residual waste from five Hanford spent fuel process storage tanks was evaluated. • Gibbsite is a common mineral in tanks with high Al concentrations. • Non-crystalline U-Na-C-O-P ± H phases are common in the U-rich residual. • Iron oxides/hydroxides have been identified in all residual waste samples. • Uranium release is highly dependent on waste and leachant compositions. - Abstract: Residual waste is expected to be left in 177 underground storage tanks after closure at the US Department of Energy's Hanford Site in Washington State, USA. In the long term, the residual wastes may represent a potential source of contamination to the subsurface environment. Residual materials that cannot be completely removed during the tank closure process are being studied to identify and characterize the solid phases and estimate the release of contaminants from these solids to water that might enter the closed tanks in the future. As of the end of 2009, residual waste from five tanks has been evaluated. Residual wastes from adjacent tanks C-202 and C-203 have high U concentrations of 24 and 59 wt.%, respectively, while residual wastes from nearby tanks C-103 and C-106 have low U concentrations of 0.4 and 0.03 wt.%, respectively. Aluminum concentrations are high (8.2-29.1 wt.%) in some tanks (C-103, C-106, and S-112) and relatively low in the others. Contaminant release was measured by leaching the residual wastes with a Ca(OH)2-saturated solution or a CaCO3-saturated water. Uranium release concentrations are highly dependent on waste and leachant compositions, with dissolved U concentrations one or two orders of magnitude higher in the tests with high-U residual wastes, and also higher when leached with the CaCO3-saturated solution than with the Ca(OH)2-saturated solution. Technetium leachability is not as strongly dependent on the concentration of Tc in the waste, and it appears to be slightly more leachable by the Ca(OH)2-saturated solution than by the CaCO3-saturated solution. In general, Tc is much less leachable (<10 wt.% of the

  19. Probabilistic reasoning with graphical security models

    NARCIS (Netherlands)

    Kordy, Barbara; Pouly, Marc; Schweitzer, Patrick

    This work provides a computational framework for meaningful probabilistic evaluation of attack–defense scenarios involving dependent actions. We combine the graphical security modeling technique of attack–defense trees with probabilistic information expressed in terms of Bayesian networks. In order

  20. Quantification of severe accident source terms of a Westinghouse 3-loop plant

    International Nuclear Information System (INIS)

    Lee Min; Ko, Y.-C.

    2008-01-01

    Integrated severe accident analysis codes are used to quantify the source terms of the representative sequences identified in PSA studies. The characteristics of these source terms depend on the detailed design of the plant and on the accident scenario. A historical perspective on radioactive source terms is provided. The grouping of radionuclides in different source terms or source term quantification tools based on TID-14844, NUREG-1465, and WASH-1400 is compared. The radionuclide release phenomena and models adopted in the integrated severe accident analysis codes STCP and MAAP4 are described. In the present study, the severe accident source terms for risk quantification of the Maanshan Nuclear Power Plant of Taiwan Power Company are quantified using the MAAP 4.0.4 code. A methodology is developed to quantify the source terms of each source term category (STC) identified in the Level II PSA analysis of the plant. The characteristics of the source terms obtained are compared with other source terms. The plant analyzed employs a Westinghouse-designed 3-loop pressurized water reactor (PWR) with a large dry containment.

  1. Reassessment of the technical bases for estimating source terms. Final report

    International Nuclear Information System (INIS)

    Silberberg, M.; Mitchell, J.A.; Meyer, R.O.; Ryder, C.P.

    1986-07-01

    This document describes a major advance in the technology for calculating source terms from postulated accidents at US light-water reactors. The improved technology consists of (1) an extensive data base from severe accident research programs initiated following the TMI accident, (2) a set of coupled and integrated computer codes (the Source Term Code Package), which models key aspects of fission product behavior under severe accident conditions, and (3) a number of detailed mechanistic codes that bridge the gap between the data base and the Source Term Code Package. The improved understanding of severe accident phenomena has also allowed an identification of significant sources of uncertainty, which should be considered in estimating source terms. These sources of uncertainty are also described in this document. The current technology provides a significant improvement in evaluating source terms over that available at the time of the Reactor Safety Study (WASH-1400) and, because of this significance, the Nuclear Regulatory Commission staff is recommending its use.

  2. Source term analysis for a RCRA mixed waste disposal facility

    International Nuclear Information System (INIS)

    Jordan, D.L.; Blandford, T.N.; MacKinnon, R.J.

    1996-01-01

    A Monte Carlo transport scheme was used to estimate the source strength resulting from potential releases from a mixed waste disposal facility. Infiltration rates were estimated using the HELP code, and transport through the facility was modeled using the DUST code, linked to a Monte Carlo driver.

  3. Probabilistic tsunami hazard assessment based on the long-term evaluation of subduction-zone earthquakes along the Sagami Trough, Japan

    Science.gov (United States)

    Hirata, K.; Fujiwara, H.; Nakamura, H.; Osada, M.; Ohsumi, T.; Morikawa, N.; Kawai, S.; Maeda, T.; Matsuyama, H.; Toyama, N.; Kito, T.; Murata, Y.; Saito, R.; Takayama, J.; Akiyama, S.; Korenaga, M.; Abe, Y.; Hashimoto, N.; Hakamata, T.

    2017-12-01

    For the forthcoming large earthquakes along the Sagami Trough, where the Philippine Sea Plate is subducting beneath the northeast Japan arc, the Earthquake Research Committee (ERC)/Headquarters for Earthquake Research Promotion, Japanese government (2014a) assessed that M7 and M8 class earthquakes will occur there and defined the possible extent of the earthquake source areas. They assessed occurrence probabilities within the next 30 years (from Jan. 1, 2014) of 70% for the M7 class and 0-5% for the M8 class earthquakes. First, we set 10 possible earthquake source areas (ESAs) and 920 ESAs, respectively, for M8 and M7 class earthquakes. Next, we constructed 125 characterized earthquake fault models (CEFMs) and 938 CEFMs, respectively, for M8 and M7 class earthquakes, based on the "tsunami recipe" of ERC (2017) (Kitoh et al., 2016, JpGU). All the CEFMs are allowed to have a large slip area to express fault slip heterogeneity. For all the CEFMs, we calculate tsunamis by solving the nonlinear long wave equations with a finite difference method, including runup calculation, over a nesting grid system with a minimum grid size of 50 meters. Finally, we re-distributed the occurrence probability over all CEFMs (Abe et al., 2014, JpGU) and gathered the exceedance probabilities for variable tsunami heights, calculated from all the CEFMs, at every observation point along the Pacific coast to obtain the PTHA. We incorporated aleatory uncertainties inherent in the tsunami calculation and in earthquake fault slip heterogeneity. We considered two kinds of probabilistic hazard models: one is a "present-time hazard model" under the assumption that earthquake occurrence basically follows a renewal process based on the BPT distribution when the latest faulting time is known; the other is a "long-time averaged hazard model" under the assumption that earthquake occurrence follows a stationary Poisson process. We fixed our viewpoint, for example, on the probability that the tsunami height will exceed 3 meters at coastal points in next
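
    For reference, the two occurrence models named in this record take the following standard forms; this is a generic sketch, and the parameter values actually used in the study are not reproduced here.

```latex
% Stationary Poisson ("long-time averaged") model with mean recurrence time T_mean:
P(\text{at least one event in } \Delta T) = 1 - e^{-\Delta T / T_{\mathrm{mean}}}

% Renewal ("present-time") model with a Brownian Passage Time density
% (mean recurrence \mu, aperiodicity \alpha), conditioned on elapsed time t_e:
f(t) = \sqrt{\frac{\mu}{2\pi \alpha^{2} t^{3}}}\,
       \exp\!\left(-\frac{(t-\mu)^{2}}{2\,\mu\,\alpha^{2}\,t}\right),
\qquad
P\bigl(\text{event in } (t_e, t_e+\Delta T] \mid \text{no event by } t_e\bigr)
  = \frac{F(t_e+\Delta T) - F(t_e)}{1 - F(t_e)}
```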

  4. Source term estimation via monitoring data and its implementation to the RODOS system

    International Nuclear Information System (INIS)

    Bohunova, J.; Duranova, T.

    2000-01-01

    A methodology and a computer code for the interpretation of environmental data from an on-line environmental monitoring network, i.e. source term assessment, were developed. The method is based on the conversion of measured dose rates to the source term, i.e. the airborne radioactivity release rate, taking into account real meteorological data and the locations of the monitoring points. The bootstrap estimation methodology and the bipivot method are used to estimate the source term from on-site gamma dose rate monitors. These methods provide an estimate of the mean value of the source term and a confidence interval for it. (author)
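
    The bootstrap step can be sketched as follows; this is an illustrative outline only, with hypothetical dose rates and a hypothetical dose-per-unit-release factor standing in for the dispersion calculation, and it is not the bipivot method of the record.

```python
import numpy as np

# Illustrative sketch: convert each monitor's dose rate to a release-rate estimate
# with a (hypothetical) dose-per-unit-release factor, then bootstrap the mean
# release rate and a confidence interval.
rng = np.random.default_rng(1)

dose_rates = np.array([12.0, 9.5, 15.2, 11.1, 8.7])        # uGy/h at the monitors
dose_per_release = np.array([3.1e-6, 2.4e-6, 4.0e-6,
                             2.9e-6, 2.2e-6])               # uGy/h per Bq/s (hypothetical)
release_estimates = dose_rates / dose_per_release           # Bq/s, one estimate per monitor

boot_means = np.array([
    rng.choice(release_estimates, size=release_estimates.size, replace=True).mean()
    for _ in range(10_000)
])
lo, hi = np.percentile(boot_means, [2.5, 97.5])
print(f"release rate ~ {release_estimates.mean():.2e} Bq/s, 95% CI [{lo:.2e}, {hi:.2e}]")
```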

  5. Probabilistic numerical discrimination in mice.

    Science.gov (United States)

    Berkay, Dilara; Çavdaroğlu, Bilgehan; Balcı, Fuat

    2016-03-01

    Previous studies showed that both human and non-human animals can discriminate between different quantities (i.e., time intervals, numerosities) with a limited level of precision due to their endogenous/representational uncertainty. In addition, other studies have shown that subjects can modulate their temporal categorization responses adaptively by incorporating information gathered regarding probabilistic contingencies into their time-based decisions. Despite the psychophysical similarities between the interval timing and nonverbal counting functions, the sensitivity of count-based decisions to probabilistic information remains an unanswered question. In the current study, we investigated whether exogenous probabilistic information can be integrated into numerosity-based judgments by mice. In the task employed in this study, reward was presented either after few (i.e., 10) or many (i.e., 20) lever presses, the last of which had to be emitted on the lever associated with the corresponding trial type. In order to investigate the effect of probabilistic information on performance in this task, we manipulated the relative frequency of different trial types across different experimental conditions. We evaluated the behavioral performance of the animals under models that differed in terms of their assumptions regarding the cost of responding (e.g., logarithmically increasing vs. no response cost). Our results showed for the first time that mice could adaptively modulate their count-based decisions based on the experienced probabilistic contingencies in directions predicted by optimality.

  6. Determination of source terms in a degenerate parabolic equation

    International Nuclear Information System (INIS)

    Cannarsa, P; Tort, J; Yamamoto, M

    2010-01-01

    In this paper, we prove Lipschitz stability results for inverse source problems relative to parabolic equations. We use the method introduced by Imanuvilov and Yamamoto in 1998 based on Carleman estimates. What is new here is that we study a class of one-dimensional degenerate parabolic equations. In our model, the diffusion coefficient vanishes at one extreme point of the domain. Instead of the classical Carleman estimates obtained by Fursikov and Imanuvilov for non-degenerate equations, we use and extend some recent Carleman estimates for degenerate equations obtained by Cannarsa, Martinez and Vancostenoble. Finally, we obtain Lipschitz stability results in inverse source problems for our class of degenerate parabolic equations, both in the case of a boundary observation and in the case of a locally distributed observation
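
    A representative form of the class of problems described is sketched below; the exact assumptions on the degeneracy exponent and on the known factor R are those of the paper and are not reproduced here.

```latex
% One-dimensional degenerate parabolic equation on (0,1) x (0,T),
% with diffusion coefficient a(x) = x^{\alpha} vanishing at x = 0:
u_t - \left( x^{\alpha} u_x \right)_x = f(x)\, R(x,t)

% Inverse source problem: recover the spatial factor f from either
%   a boundary observation   u_x(1, t)  for  t \in (t_0, T),   or
%   a locally distributed observation of u on \omega \times (t_0, T), \ \omega \subset (0,1).
% Lipschitz stability means the norm of f is bounded by a constant times the norm
% of the observation (plus a norm of the solution at an intermediate time).
```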

  7. Probabilistic metric spaces

    CERN Document Server

    Schweizer, B

    2005-01-01

    Topics include special classes of probabilistic metric spaces, topologies, and several related structures, such as probabilistic normed and inner-product spaces. 1983 edition, updated with 3 new appendixes. Includes 17 illustrations.

  8. Revised reactor accident source terms in the U.S. and implementation for light water reactors

    International Nuclear Information System (INIS)

    Soffer, L.; Lee, J.Y.

    1992-01-01

    Current NRC reactor accident source terms used for licensing are contained in Regulatory Guides 1.3 and 1.4 and specify that 100% of the core inventory of noble gases and 25% of the iodine fission products are assumed to be instantaneously available for release from the containment. The chemical form of the iodine fission products is also assumed to be predominantly elemental (I2) iodine. These assumptions have strongly affected present nuclear plant designs. Severe accident research results have confirmed that although the current source term is very substantial and has resulted in a very high level of plant capability, the present source term is no longer compatible with a realistic understanding of severe accidents. The NRC has issued a proposed revision of the reactor accident source terms as part of several regulatory activities to incorporate severe accident insights for future plants. A revision to 10 CFR 100 is also being proposed to specify site criteria directly and to eliminate source terms and doses for site evaluation. Reactor source terms will continue to be important in evaluating plant designs. Although intended primarily for future plants, existing and evolutionary power plants may voluntarily apply revised accident source term insights as well in licensing. The proposed revised accident source terms are presented in terms of fission product composition, magnitude, timing and iodine chemical form. Some implications for light water reactors are discussed. (author)

  9. Source terms derived from analyses of hypothetical accidents, 1950-1986

    International Nuclear Information System (INIS)

    Stratton, W.R.

    1987-01-01

    This paper reviews the history of reactor accident source term assumptions. After the Three Mile Island accident, a number of theoretical and experimental studies re-examined possible accident sequences and source terms. Some of these results are summarized in this paper

  10. Conditioning of disused sealed sources in countries without disposal facility: Short term gain - long term pain

    International Nuclear Information System (INIS)

    Benitez-Navarro, J.C.; Salgado-Mojena, M.

    2002-01-01

    Owing to the considerable development in managing disused sealed radioactive sources (DSRS), the limited availability of disposal practices for them, and the new recommendations for the use of the borehole disposal concept, it was felt that a paper reviewing the existing recommendations could be a starting point for discussion on the retrievability of the sources. Even though no international consensus exists as to an acceptable solution for the challenge of disposal of disused sealed sources, the 'Best Available Technology' for managing most of them, recommended for developing countries, includes the cementation of the sources. Waste packages prepared in such a way do not allow any flexibility to accommodate possible future disposal requirements. Therefore, the 'Wait and See' approach could also be recommended for managing not only sources with long-lived radionuclides and high activity, but probably all kinds of existing disused sealed sources. The general aim of the current paper is to identify and review the current recommendations for managing disused sealed sources and to reflect on the most convenient management schemes for disused sealed radioactive sources in Member States without disposal capacities (Latin America, Africa). The risk that cemented DSRS could be incompatible with future disposal requirements was taken into account. (author)

  11. Source term estimation during incident response to severe nuclear power plant accidents. Draft

    Energy Technology Data Exchange (ETDEWEB)

    McKenna, T J; Giitter, J

    1987-07-01

    The various methods of estimating radionuclide release to the environment (source terms) as a result of an accident at a nuclear power reactor are discussed. The major factors affecting potential radionuclide releases off site (source terms) as a result of nuclear power plant accidents are described. The quantification of these factors based on plant instrumentation also is discussed. A range of accident conditions from those within the design basis to the most severe accidents possible are included in the text. A method of gross estimation of accident source terms and their consequences off site is presented. The goal is to present a method of source term estimation that reflects the current understanding of source term behavior and that can be used during an event. (author)

  12. Source term estimation during incident response to severe nuclear power plant accidents. Draft

    International Nuclear Information System (INIS)

    McKenna, T.J.; Giitter, J.

    1987-01-01

    The various methods of estimating radionuclide release to the environment (source terms) as a result of an accident at a nuclear power reactor are discussed. The major factors affecting potential radionuclide releases off site (source terms) as a result of nuclear power plant accidents are described. The quantification of these factors based on plant instrumentation also is discussed. A range of accident conditions from those within the design basis to the most severe accidents possible are included in the text. A method of gross estimation of accident source terms and their consequences off site is presented. The goal is to present a method of source term estimation that reflects the current understanding of source term behavior and that can be used during an event. (author)

  13. Source term estimation during incident response to severe nuclear power plant accidents

    International Nuclear Information System (INIS)

    McKenna, T.J.; Glitter, J.G.

    1988-10-01

    This document presents a method of source term estimation that reflects the current understanding of source term behavior and that can be used during an event. The various methods of estimating radionuclide release to the environment (source terms) as a result of an accident at a nuclear power reactor are discussed. The major factors affecting potential radionuclide releases off site (source terms) as a result of nuclear power plant accidents are described. The quantification of these factors based on plant instrumentation also is discussed. A range of accident conditions from those within the design basis to the most severe accidents possible are included in the text. A method of gross estimation of accident source terms and their consequences off site is presented. 39 refs., 48 figs., 19 tabs

  14. Operator aids for prediction of source term attenuation

    International Nuclear Information System (INIS)

    Powers, D.A.

    2004-01-01

    Simplified expressions for the attenuation of radionuclide releases by sprays and by water pools are devised. These expressions are obtained by correlation of the 10th, 50th and 90th percentiles of uncertainty distributions for the water pool decontamination factor and the spray decontamination coefficient. These uncertainty distributions were obtained by Monte Carlo uncertainty analyses using detailed, mechanistic models of the pools and sprays. Uncertainties considered in the analyses include uncertainties in the phenomena and uncertainties in the initial and boundary conditions dictated by the progression of severe accidents. Final results are graphically displayed in terms of the decontamination factor achieved at selected levels of conservatism versus pool depth and water subcooling or, in the case of sprays, versus time. (author)
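
    For orientation, the quantities correlated in this record are conventionally defined as follows; this is a generic sketch of the definitions only, not the correlations devised in the paper.

```latex
% Decontamination factor of a pool (or of the containment atmosphere under sprays):
\mathrm{DF} = \frac{\text{activity entering (or initially airborne)}}{\text{activity leaving (or remaining airborne)}}

% With a constant spray removal coefficient \lambda_s, the airborne mass decays as
m(t) = m_0\, e^{-\lambda_s t},
\qquad
\mathrm{DF}(t) = \frac{m_0}{m(t)} = e^{\lambda_s t}
```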

  15. Probabilistic cellular automata.

    Science.gov (United States)

    Agapie, Alexandru; Andreica, Anca; Giuclea, Marius

    2014-09-01

    Cellular automata are binary lattices used for modeling complex dynamical systems. The automaton evolves iteratively from one configuration to another, using some local transition rule based on the number of ones in the neighborhood of each cell. With respect to the number of cells allowed to change per iteration, we speak of either synchronous or asynchronous automata. If randomness is involved to some degree in the transition rule, we speak of probabilistic automata, otherwise they are called deterministic. With either type of cellular automaton we are dealing with, the main theoretical challenge stays the same: starting from an arbitrary initial configuration, predict (with highest accuracy) the end configuration. If the automaton is deterministic, the outcome simplifies to one of two configurations, all zeros or all ones. If the automaton is probabilistic, the whole process is modeled by a finite homogeneous Markov chain, and the outcome is the corresponding stationary distribution. Based on our previous results for the asynchronous case (connecting the probability of a configuration in the stationary distribution to its number of zero-one borders), the article offers both numerical and theoretical insight into the long-term behavior of synchronous cellular automata.
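
    The Markov-chain view can be made concrete with a small synchronous example; the local rule below is a hypothetical illustration (chosen so the chain is ergodic and the stationary distribution is unique), not the rule studied in the paper.

```python
import itertools
import numpy as np

# Illustrative sketch: a synchronous probabilistic cellular automaton on a ring of
# n binary cells. Each cell independently becomes 1 with probability
# eps + (1 - 2*eps) * k/3, where k is the number of ones in its 3-cell neighbourhood;
# eps > 0 keeps the 2^n-state Markov chain ergodic.
n, eps = 6, 0.05
configs = list(itertools.product([0, 1], repeat=n))

def p_one(cfg, i):
    k = cfg[(i - 1) % n] + cfg[i] + cfg[(i + 1) % n]
    return eps + (1.0 - 2.0 * eps) * k / 3.0

# Transition matrix: cells update independently given the current configuration.
P = np.zeros((2 ** n, 2 ** n))
for a, x in enumerate(configs):
    probs = [p_one(x, i) for i in range(n)]
    for b, y in enumerate(configs):
        P[a, b] = np.prod([p if bit else 1.0 - p for p, bit in zip(probs, y)])

# Stationary distribution by power iteration.
pi = np.full(2 ** n, 1.0 / 2 ** n)
for _ in range(2000):
    pi = pi @ P
print("P(all zeros) = %.3f, P(all ones) = %.3f" % (pi[0], pi[-1]))
```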

  16. Prospects of renewable energy sources in India: Prioritization of alternative sources in terms of Energy Index

    International Nuclear Information System (INIS)

    Jha, Shibani K.; Puppala, Harish

    2017-01-01

    The growing energy demand of a progressing civilization drives the exploitation of various renewable sources over conventional sources. Wind, solar, hydro, biomass, and waste & bagasse are the renewable sources available in India. A reliable non-conventional geothermal source is also available in India, but it is restricted to direct heat applications. This study archives the status of renewable alternatives in India. The techno-economic factors and environmental aspects associated with each of these alternatives are discussed. This study focuses on prioritizing the renewable sources based on a parameter introduced as the Energy Index. This index is evaluated using cumulative scores obtained for each of the alternatives. The cumulative score is obtained by evaluating each alternative over a range of eleven environmental and techno-economic criteria following the Fuzzy Analytical Hierarchy Process. The eleven criteria considered in the study are carbon dioxide emissions (CO2), sulphur dioxide emissions (SO2), nitrogen oxide emissions (NOx), land requirement, current energy cost, potential future energy cost, turnkey investment, capacity factor, energy efficiency, design period and water consumption. It is concluded from the study that the geothermal source is the most preferable alternative, with the highest Energy Index. Hydro, wind, biomass and solar sources are the subsequently preferred alternatives. - Highlights: • FAH process is used to obtain a cumulative score for each renewable alternative. • The cumulative score is normalized by the highest score of the ideal source. • The Energy Index shows how good a renewable alternative is. • A priority order is obtained for the alternatives based on the Energy Index. • Geothermal is the most preferable source, followed by Hydro, Wind, Biomass and Solar.
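
    The final aggregation step described in this record can be sketched as follows; the fuzzy-AHP weighting itself is not reproduced, and all weights and scores below are placeholders, not the study's values.

```python
# Illustrative sketch: combine criterion scores with criterion weights into a
# cumulative score per alternative, then normalise by the best score to obtain
# an Energy Index in (0, 1]. All numbers are hypothetical.
weights = {"CO2": 0.15, "cost": 0.25, "land": 0.10,
           "capacity_factor": 0.30, "water": 0.20}

scores = {  # hypothetical 0-1 scores per criterion (higher = better)
    "Geothermal": {"CO2": 0.9, "cost": 0.7, "land": 0.8, "capacity_factor": 0.9, "water": 0.6},
    "Hydro":      {"CO2": 0.8, "cost": 0.8, "land": 0.5, "capacity_factor": 0.7, "water": 0.4},
    "Solar":      {"CO2": 0.9, "cost": 0.5, "land": 0.4, "capacity_factor": 0.3, "water": 0.9},
}

cumulative = {alt: sum(weights[c] * s[c] for c in weights) for alt, s in scores.items()}
best = max(cumulative.values())
energy_index = {alt: v / best for alt, v in cumulative.items()}

for alt, ei in sorted(energy_index.items(), key=lambda kv: -kv[1]):
    print(f"{alt:10s} Energy Index = {ei:.2f}")
```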

  17. Operational short-term Probabilistic Volcanic Hazard Assessment of tephra fallout: an example from the 1982-1984 unrest at Campi Flegrei

    Science.gov (United States)

    Sandri, Laura; Selva, Jacopo; Costa, Antonio; Macedonio, Giovanni; Marzocchi, Warner

    2014-05-01

    Probabilistic Volcanic Hazard Assessment (PVHA) represents the most complete scientific contribution for planning rational strategies aimed at mitigating the risk posed by volcanic activity at different time scales. The definition of the space-time window for PVHA is related to the kind of risk mitigation actions that are under consideration. Short intervals (days to weeks) are important for short-term risk mitigation actions like the evacuation of a volcanic area. During volcanic unrest episodes or eruptions, it is of primary importance to produce short-term tephra fallout forecasts and to update them frequently to account for the rapidly evolving situation. This information is obviously crucial for crisis management, since tephra may heavily affect building stability, public health, transportation and evacuation routes (airports, trains, road traffic) and lifelines (electric power supply). In this study, we propose a methodology for short-term PVHA and its operational implementation, based on the model BET_EF, in which measures from the monitoring system are used to routinely update the forecast of some parameters related to the eruption dynamics, that is, the probabilities of eruption, of every possible vent position and of every possible eruption size. Then, considering all possible vent positions and eruptive sizes, tephra dispersal models are coupled with frequently updated meteorological forecasts. Finally, these results are merged through a Bayesian procedure, accounting for epistemic uncertainties at all the considered steps. As a case study we retrospectively examine some stages of the volcanic unrest that took place in Campi Flegrei (CF) in 1982-1984. In particular, we aim at presenting a practical example of possible operational tephra fall PVHA on a daily basis, in the surroundings of CF, at different stages of the 1982-84 unrest. Tephra dispersal is simulated using the analytical HAZMAP code. We consider three possible eruptive sizes (a low, a medium and a
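
    The chaining of probabilities described in this record can be sketched as a simple point-estimate calculation; this is an illustrative outline only, not the BET_EF implementation, and it omits the epistemic-uncertainty treatment. All numbers are hypothetical.

```python
# Illustrative sketch: probability that tephra load exceeds a threshold at a target
# site within the forecast window, assembled from monitoring-updated eruption,
# vent and size probabilities and conditional dispersal results for today's wind.
p_eruption = 0.10                                    # P(eruption in next 24 h | monitoring)

vents = {"vent_A": 0.6, "vent_B": 0.4}               # P(vent | eruption)
sizes = {"low": 0.7, "medium": 0.25, "high": 0.05}   # P(size | eruption)

# P(load > threshold at site | vent, size, wind forecast), from dispersal runs.
p_exceed = {
    ("vent_A", "low"): 0.02, ("vent_A", "medium"): 0.15, ("vent_A", "high"): 0.55,
    ("vent_B", "low"): 0.01, ("vent_B", "medium"): 0.08, ("vent_B", "high"): 0.40,
}

p_hazard = p_eruption * sum(
    vents[v] * sizes[s] * p_exceed[(v, s)] for v in vents for s in sizes
)
print(f"P(tephra load exceeds threshold in next 24 h) = {p_hazard:.4f}")
```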

  18. Source term reduction at DAEC (including stellite ball recycling)

    International Nuclear Information System (INIS)

    Smith, R.; Schebler, D.

    1995-01-01

    The Duane Arnold Energy Center was seeking methods to reduce drywell dose rates due to Co-60. Duane Arnold is known in the industry to have one of the highest drywell dose rates according to the industry-standardized 'BRAC' point survey. A prime method to reduce dose rates due to Co-60 is the accelerated replacement of stellite pins and rollers in control rod blades, owing to their high stellite (cobalt) content. Usually the cobalt content in stellite alloys is greater than 60% by weight. During the RFO-12 refueling outage at Duane Arnold, all of the remaining cobalt-bearing control rod blades were replaced and new stellite-free control rod blades were installed in the core. This left Duane Arnold with the disposal of highly radioactive stellite pins and rollers. The processing of control rod blades for disposal is a very difficult evolution. First, the velocity limiter (the bottom portion of the component) and the highly radioactive upper stellite control rod blade pins and rollers are separated from the control rod blade. Next, the remainder of the control rod blade is processed (chopped and/or crushed) to aid packaging the waste for disposal. The stellite bearings are then often carefully placed with the rest of the waste in a burial liner to provide shielding for disposal, or more often are left as 'orphans' in the spent fuel pool because their high specific activity creates shipping and packaging problems. Further investigation by the utility showed that the stellite balls and pins could be recycled to a source manufacturer rather than disposed of at a low-level burial site. The cost savings to the utility were on the order of $200,000, with gross savings of $400,000 in burial site charges. A second advantage of recycling the stellite pins and rollers was a reduction in radioactive waste shipments.

  19. Prospects for probabilistic safety assessment

    International Nuclear Information System (INIS)

    Hirschberg, S.

    1992-01-01

    This article provides some reflections on future developments of Probabilistic Safety Assessment (PSA) in view of the present state of the art and evaluates current trends in the use of PSA for safety management. The main emphasis is on Level 1 PSA, although Level 2 aspects are also highlighted to some extent. As a starting point, the role of PSA is outlined from a historical perspective, demonstrating the rapid expansion of the uses of PSA. In this context the wide spectrum of PSA applications and the associated benefits to the users are in focus. It should be kept in mind, however, that PSA, in spite of its merits, is not a self-standing safety tool. It complements deterministic analysis and thus improves understanding and facilitates prioritization of safety issues. Significant progress in handling PSA limitations - such as reliability data, common-cause failures, human interactions, external events, accident progression, containment performance, and source-term issues - is described. This forms a background for expected future developments of PSA. Among the most important issues on the agenda for the future are PSA scope extensions, methodological improvements and computer code advancements, and full exploitation of the potential benefits of applications to operational safety management. Many PSA uses, if properly exercised, lead to safety improvements as well as major burden reductions. The article provides, in addition, the International Atomic Energy Agency (IAEA) perspective on the topics covered, as reflected in the current PSA programs of the agency. 74 refs., 6 figs., 1 tab

  20. Probabilistic maps of the white matter tracts with known associated functions on the neonatal brain atlas: Application to evaluate longitudinal developmental trajectories in term-born and preterm-born infants.

    Science.gov (United States)

    Akazawa, Kentaro; Chang, Linda; Yamakawa, Robyn; Hayama, Sara; Buchthal, Steven; Alicata, Daniel; Andres, Tamara; Castillo, Deborrah; Oishi, Kumiko; Skranes, Jon; Ernst, Thomas; Oishi, Kenichi

    2016-03-01

    Diffusion tensor imaging (DTI) has been widely used to investigate the development of the neonatal and infant brain, and deviations related to various diseases or medical conditions like preterm birth. In this study, we created a probabilistic map of fiber pathways with known associated functions, on a published neonatal multimodal atlas. The pathways-of-interest include the superficial white matter (SWM) fibers just beneath the specific cytoarchitectonically defined cortical areas, which were difficult to evaluate with existing DTI analysis methods. The Jülich cytoarchitectonic atlas was applied to define cortical areas related to specific brain functions, and the Dynamic Programming (DP) method was applied to delineate the white matter pathways traversing through the SWM. Probabilistic maps were created for pathways related to motor, somatosensory, auditory, visual, and limbic functions, as well as major white matter tracts, such as the corpus callosum, the inferior fronto-occipital fasciculus, and the middle cerebellar peduncle, by delineating these structures in eleven healthy term-born neonates. In order to characterize maturation-related changes in diffusivity measures of these pathways, the probabilistic maps were then applied to DTIs of 49 healthy infants who were longitudinally scanned at three time-points, approximately five weeks apart. First, we investigated the normal developmental pattern based on 19 term-born infants. Next, we analyzed 30 preterm-born infants to identify developmental patterns related to preterm birth. Last, we investigated the difference in diffusion measures between these groups to evaluate the effects of preterm birth on the development of these functional pathways. Term-born and preterm-born infants both demonstrated a time-dependent decrease in diffusivity, indicating postnatal maturation in these pathways, with laterality seen in the corticospinal tract and the optic radiation. The comparison between term- and preterm

  1. Integrated Deterministic-Probabilistic Safety Assessment Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Kudinov, P.; Vorobyev, Y.; Sanchez-Perea, M.; Queral, C.; Jimenez Varas, G.; Rebollo, M. J.; Mena, L.; Gomez-Magin, J.

    2014-02-01

    IDPSA (Integrated Deterministic-Probabilistic Safety Assessment) is a family of methods which use tightly coupled probabilistic and deterministic approaches to address their respective sources of uncertainty, enabling risk-informed decision making in a consistent manner. The starting point of the IDPSA framework is that safety justification must be based on the coupling of deterministic (consequence) and probabilistic (frequency) considerations to address the mutual interactions between stochastic disturbances (e.g. failures of equipment, human actions, stochastic physical phenomena) and the deterministic response of the plant (i.e. transients). This paper gives a general overview of some IDPSA methods as well as some possible applications to PWR safety analyses. (Author)

  2. Recent advances in the source term area within the SARNET European severe accident research network

    International Nuclear Information System (INIS)

    Herranz, L.E.; Haste, T.; Kärkelä, T.

    2015-01-01

    Highlights: • Main achievements of source term research in SARNET are given. • Emphasis on the radiologically important iodine and ruthenium fission products. • Conclusions on FP release, transport in the RCS and containment behaviour. • Significance of large-scale integral experiments to validate the analyses used. • A thorough list of the most recent references on source term research results. - Abstract: Source Term has been one of the main research areas addressed within the SARNET network during the 7th EC Framework Programme of EURATOM. The entire source term domain was split into three major areas: oxidising impact on source term, iodine chemistry in the reactor coolant system and containment and data and code assessment. The present paper synthesises the main technical outcome stemming from the SARNET FWP7 project in the area of source term and includes an extensive list of references in which deeper insights on specific issues may be found. Besides, based on the analysis of the current state of the art, an outlook of future source term research is outlined, where major changes in research environment are discussed (i.e., the end of the Phébus FP project; the end of the SARNET projects; and the launch of HORIZON 2020). Most probably research projects will be streamlined towards: release and transport under oxidising conditions, containment chemistry, existing and innovative filtered venting systems and others. These will be in addition to a number of projects that have been completed or are ongoing under different national and international frameworks, like VERDON, CHIP and EPICUR started under the International Source Term Programme (ISTP), the OECD/CSNI programmes BIP, BIP2, STEM, THAI and THAI2, and the French national programme MIRE. The experimental PASSAM project under the 7th EC Framework programme, focused on source term mitigation systems, is highlighted as a good example of a project addressing potential enhancement of safety systems

  3. Recent advances in the source term area within the SARNET European severe accident research network

    Energy Technology Data Exchange (ETDEWEB)

    Herranz, L.E., E-mail: luisen.herranz@ciemat.es [Centro de Investigaciones Energeticas Medio Ambientales y Tecnologica, CIEMAT, Avda. Complutense 40, E-28040 Madrid (Spain); Haste, T. [Institut de Radioprotection et de Sûreté Nucléaire, IRSN, BP 3, F-13115 St Paul lez Durance Cedex (France); Kärkelä, T. [VTT Technical Research Centre of Finland, P.O. Box 1000, FI-02044 VTT Espoo (Finland)

    2015-07-15

    Highlights: • Main achievements of source term research in SARNET are given. • Emphasis on the radiologically important iodine and ruthenium fission products. • Conclusions on FP release, transport in the RCS and containment behaviour. • Significance of large-scale integral experiments to validate the analyses used. • A thorough list of the most recent references on source term research results. - Abstract: Source Term has been one of the main research areas addressed within the SARNET network during the 7th EC Framework Programme of EURATOM. The entire source term domain was split into three major areas: oxidising impact on source term, iodine chemistry in the reactor coolant system and containment and data and code assessment. The present paper synthesises the main technical outcome stemming from the SARNET FWP7 project in the area of source term and includes an extensive list of references in which deeper insights on specific issues may be found. Besides, based on the analysis of the current state of the art, an outlook of future source term research is outlined, where major changes in research environment are discussed (i.e., the end of the Phébus FP project; the end of the SARNET projects; and the launch of HORIZON 2020). Most probably research projects will be streamlined towards: release and transport under oxidising conditions, containment chemistry, existing and innovative filtered venting systems and others. These will be in addition to a number of projects that have been completed or are ongoing under different national and international frameworks, like VERDON, CHIP and EPICUR started under the International Source Term Programme (ISTP), the OECD/CSNI programmes BIP, BIP2, STEM, THAI and THAI2, and the French national programme MIRE. The experimental PASSAM project under the 7th EC Framework programme, focused on source term mitigation systems, is highlighted as a good example of a project addressing potential enhancement of safety systems

  4. Source term model evaluations for the low-level waste facility performance assessment

    Energy Technology Data Exchange (ETDEWEB)

    Yim, M.S.; Su, S.I. [North Carolina State Univ., Raleigh, NC (United States)

    1995-12-31

    The estimation of release of radionuclides from various waste forms to the bottom boundary of the waste disposal facility (source term) is one of the most important aspects of LLW facility performance assessment. In this work, several currently used source term models are comparatively evaluated for the release of carbon-14 based on a test case problem. The models compared include PRESTO-EPA-CPG, IMPACTS, DUST and NEFTRAN-II. Major differences in assumptions and approaches between the models are described and key parameters are identified through sensitivity analysis. The source term results from different models are compared and other concerns or suggestions are discussed.

  5. Source-term reevaluation for US commercial nuclear power reactors: a status report

    International Nuclear Information System (INIS)

    Herzenberg, C.L.; Ball, J.R.; Ramaswami, D.

    1984-12-01

    Only results that had been discussed publicly, had been published in the open literature, or were available in preliminary reports as of September 30, 1984, are included here. More than 20 organizations are participating in source-term programs, which have been undertaken to examine severe accident phenomena in light-water power reactors (including the chemical and physical behavior of fission products under accident conditions), update and reevaluate source terms, and resolve differences between predictions and observations of radiation releases and related phenomena. Results from these source-term activities have been documented in over 100 publications to date

  6. An investigation of the closure problem applied to reactor accident source terms

    International Nuclear Information System (INIS)

    Brearley, I.R.; Nixon, W.; Hayns, M.R.

    1987-01-01

    The closure problem, as considered here, focuses attention on the question of when in current research programmes enough has been learned about the source terms for reactor accident releases. Noting that current research is tending to reduce the estimated magnitude of the aerosol component of atmospheric, accidental releases, several possible criteria for closure are suggested. Moreover, using the reactor accident consequence model CRACUK, the effect of gradually reducing the aerosol release fractions of a pressurized water reactor (PWR2) source term (as defined in the WASH-1400 study) is investigated and the implications of applying the suggested criteria to current source term research discussed. (author)

  7. Evaluation of the LMFBR cover gas source term and synthesis of the associated R and D

    International Nuclear Information System (INIS)

    Balard, F.; Carluec, B.

    1996-01-01

    At the end of the seventies and the beginning of the eighties, there was a pressing need for experimental results to assess the safety level of LMFBRs. Because of the urgency, analytical studies were not systematically undertaken, and maximum credible cover gas instantaneous source terms (radionuclide core release fractions) were obtained directly from crude interpretations of out-of-pile experiments. Two types of studies and mock-ups were undertaken depending on the timescale of the phenomena: instantaneous source terms (corresponding to an unlikely energetic core disruptive accident, CDA) and delayed ones (tens of minutes to some hours). The experiments performed in this framework are reviewed in this presentation: 1) instantaneous source term: FAUST experiments, I, Cs, UO2 source terms (FzK, Germany); FAST experiments, pool depth influence on the non-volatile source term (USA); CARAVELLE experiments, non-volatile source term in SPX1 geometry (CEA, France); 2) delayed source term: NALA experiments, I, Cs, Sr, UO2 source terms (FzK, Germany); PAVE experiments, I source term (CEA, France); NACOWA experiments, cover gas aerosol enrichment in I and Cs (FzK, Germany); other French experiments in the COPACABANA and GULLIVER facilities. The release of volatile fission products is tightly bound to sodium evaporation, and a large part of the fission products is dissolved in the liquid sodium aerosols present in the cover gas. Thus knowledge of the amount of aerosol released to the cover gas is important for the evaluation of the source term. The maximum credible cover gas instantaneous source terms deduced from the experiments have led to conservative source terms to be taken into account in safety analysis. Nevertheless, modelling attempts of the observed (in-pile or out-of-pile) physico-chemical phenomena have been undertaken for extrapolation to the reactor case. The main topics of this theoretical research are as follows: fission products evaporation in the cover gas (Fz

  8. Development of source term evaluation method for Korean Next Generation Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Keon Jae; Cheong, Jae Hak; Park, Jin Baek; Kim, Guk Gee [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1997-10-15

    This project investigated several design features of the radioactive waste processing system, methods to predict nuclide concentrations in the primary coolant, the basic concept of the next generation reactor, and safety goals in its earlier phase. In this project, several prediction methods for the source term are evaluated comprehensively. The detailed contents of this project are: evaluation of models for nuclide concentrations in the Reactor Coolant System; evaluation of the primary and secondary coolant concentrations of the reference Nuclear Power Plant (NPP); investigation of the prediction parameters for source term evaluation, namely basic PWR parameters and operational parameters, respectively; the radionuclide removal system and adjustment values of the reference NPP; and a suggested source term prediction method for the next generation NPP.

  9. Development of source term evaluation method for Korean Next Generation Reactor(III)

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Geon Jae; Park, Jin Baek; Lee, Yeong Il; Song, Min Cheonl; Lee, Ho Jin [Korea Advanced Institue of Science and Technology, Taejon (Korea, Republic of)

    1998-06-15

    This project investigated the irradiation characteristics of MOX fuel, methods to predict nuclide concentrations in the primary and secondary coolant for a core containing 100% MOX fuel, and the development of a source term evaluation tool. In this study, several prediction methods for the source term are evaluated. The detailed contents of this project are: evaluation of models for nuclide concentrations in the Reactor Coolant System; evaluation of the primary and secondary coolant concentrations of a reference Nuclear Power Plant using purely MOX fuel; and a suggested source term prediction method for an NPP with a core using MOX fuel.

  10. Learning Probabilistic Logic Models from Probabilistic Examples.

    Science.gov (United States)

    Chen, Jianzhong; Muggleton, Stephen; Santos, José

    2008-10-01

    We revisit an application developed originally using abductive Inductive Logic Programming (ILP) for modeling inhibition in metabolic networks. The example data was derived from studies of the effects of toxins on rats using Nuclear Magnetic Resonance (NMR) time-trace analysis of their biofluids together with background knowledge representing a subset of the Kyoto Encyclopedia of Genes and Genomes (KEGG). We now apply two Probabilistic ILP (PILP) approaches - abductive Stochastic Logic Programs (SLPs) and PRogramming In Statistical modeling (PRISM) - to the application. Both approaches support abductive learning and probability predictions. Abductive SLPs are a PILP framework that provides possible worlds semantics to SLPs through abduction. Instead of learning logic models from non-probabilistic examples as done in ILP, the PILP approach applied in this paper is based on a general technique for introducing probability labels within a standard scientific experimental setting involving control and treated data. Our results demonstrate that the PILP approach provides a way of learning probabilistic logic models from probabilistic examples, and the PILP models learned from probabilistic examples lead to a significant decrease in error accompanied by improved insight from the learned results compared with the PILP models learned from non-probabilistic examples.

  11. Estimation of Source Term Behaviors in SBO Sequence in a Typical 1000MWth PWR and Comparison with Other Source Term Results

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Tae Woon; Han, Seok Jung; Ahn, Kwang Il; Fynan, Douglas; Jung, Yong Hoon [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    Since the Three Mile Island (TMI, 1979), Chernobyl (1986) and Fukushima Daiichi (2011) accidents, the assessment of radiological source term effects on the environment has been a key concern of nuclear safety. In the Fukushima Daiichi accident, a long-term SBO (station blackout) accident occurred. Using worst-case assumptions, as in the Fukushima accident, on the accident sequences and on the availability of safety systems, the thermal-hydraulic behavior, core relocation and environmental source term behavior are estimated for a long-term SBO accident in the OPR-1000 reactor. MELCOR code version 1.8.6 is used in this analysis. The source term results estimated in this study are compared with other previous studies and with the estimates for the Fukushima accident in the UNSCEAR-2013 report. This study estimated that 11% of the iodine and 2% of the cesium can be released to the environment. The UNSCEAR-2013 report estimated that 2-8% of the iodine and 1-3% of the cesium were released to the environment. The results are thus similar in terms of the release fractions of iodine and cesium to the environment.

  12. Economical comparison of imported energy sources in terms of long-term production planning

    International Nuclear Information System (INIS)

    Gungor, Z.

    1999-01-01

    In this paper, the Turkish energy production sector is studied, and power plants fueled by natural gas, imported coal and nuclear power are compared in terms of long-term (1996-2010) production economics. A net present value approach is used for comparing nuclear, coal and natural gas power plants. A scenario approach is used to establish the effects of different factors, such as the inflation rate, unit investment costs, load factor changes, the discount rate and fuel price changes. Six different scenarios of interest are developed and discussed. The study ends with conclusions and recommendations based on a reference scenario and alternative scenarios. (author)
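
    The comparison rests on the standard net present value criterion; the generic form is sketched below, without the study's cash-flow data or scenario parameters.

```latex
% Net present value of a plant over an N-year horizon with discount rate r:
\mathrm{NPV} = \sum_{t=0}^{N} \frac{B_t - C_t}{(1+r)^{t}}
% B_t : revenues in year t;  C_t : investment, fuel and O&M costs in year t.
% Under a given scenario, the alternative (nuclear, imported coal, natural gas)
% with the highest NPV (equivalently, the lowest levelized cost) is preferred.
```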

  13. Variational approach to probabilistic finite elements

    Science.gov (United States)

    Belytschko, T.; Liu, W. K.; Mani, A.; Besterfield, G.

    1991-08-01

    Probabilistic finite element methods (PFEM), synthesizing the power of finite element methods with second-moment techniques, are formulated for various classes of problems in structural and solid mechanics. Time-invariant random materials, geometric properties and loads are incorporated in terms of their fundamental statistics viz. second-moments. Analogous to the discretization of the displacement field in finite element methods, the random fields are also discretized. Preserving the conceptual simplicity, the response moments are calculated with minimal computations. By incorporating certain computational techniques, these methods are shown to be capable of handling large systems with many sources of uncertainties. By construction, these methods are applicable when the scale of randomness is not very large and when the probabilistic density functions have decaying tails. The accuracy and efficiency of these methods, along with their limitations, are demonstrated by various applications. Results obtained are compared with those of Monte Carlo simulation and it is shown that good accuracy can be obtained for both linear and nonlinear problems. The methods are amenable to implementation in deterministic FEM based computer codes.
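
    The second-moment idea underlying these methods can be summarized by the standard first-order propagation relations; this is a generic sketch, and the PFEM formulation additionally discretizes the random fields before applying them.

```latex
% Response u(b) of the finite element model, expanded to first order about the
% mean \bar{b} of the discretized random variables b:
u(b) \approx u(\bar{b}) + \left.\frac{\partial u}{\partial b}\right|_{\bar{b}} (b - \bar{b})

% Resulting first two moments of the response:
E[u] \approx u(\bar{b}),
\qquad
\mathrm{Cov}(u) \approx
  \left.\frac{\partial u}{\partial b}\right|_{\bar{b}}
  \,\mathrm{Cov}(b)\,
  \left.\frac{\partial u}{\partial b}\right|_{\bar{b}}^{\!\top}
```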

  14. Relation between source term and emergency planning for nuclear power plants

    International Nuclear Information System (INIS)

    Shi Zhongqi; Yang Ling

    1992-01-01

    Some background information on the severe accidents and source terms related to nuclear power plant emergency planning is presented. The new source term information in NUREG-0956 and NUREG-1150, and possible changes in emergency planning requirements in the U.S.A., are briefly summarized. A principle is suggested for selecting source terms when establishing emergency planning policy, together with a method for determining the Emergency Planning Zone (EPZ) size in China. Based on the research results of (1) the EPZ size of PWR nuclear power plants being built in China, and (2) the impact of reactor size and selected source terms on the EPZ size, it is concluded that the suggested principle and method are suitable and feasible for PWR nuclear power plants in China

  15. Consideration of emergency source terms for pebble-bed high temperature gas-cooled reactor

    International Nuclear Information System (INIS)

    Tao, Liu; Jun, Zhao; Jiejuan, Tong; Jianzhu, Cao

    2009-01-01

    Being the last barrier in the nuclear power plant defense-in-depth strategy, emergency planning (EP) is an integrated undertaking. One of the key elements in this process is the selection of emergency source terms. Emergency source terms for light water reactor (LWR) nuclear power plants (NPPs) have been introduced in many technical documents, and emergency planning for advanced NPPs has been attracting attention recently. Commercial deployment of advanced NPPs is under way around the world; a pebble-bed high-temperature gas-cooled reactor (HTGR) power plant, considered representative of advanced NPPs, is under construction in China. The paper tries to draw some suggestions from our investigation. The discussion of advanced NPP EP is summarized first, and then the characteristics of the pebble-bed HTGR relating to EP are described. Finally, PSA insights on emergency source term selection and current pebble-bed HTGR emergency source term suggestions are proposed

  16. Accident source terms for boiling water reactors with high burnup cores.

    Energy Technology Data Exchange (ETDEWEB)

    Gauntt, Randall O.; Powers, Dana Auburn; Leonard, Mark Thomas

    2007-11-01

    The primary objective of this report is to provide the technical basis for development of recommendations for updates to the NUREG-1465 Source Term for BWRs that will extend its applicability to accidents involving high burnup (HBU) cores. However, a secondary objective is to re-examine the fundamental characteristics of the prescription for fission product release to containment described by NUREG-1465. This secondary objective is motivated by an interest to understand the extent to which research into the release and behaviors of radionuclides under accident conditions has altered best-estimate calculations of the integral response of BWRs to severe core damage sequences and the resulting radiological source terms to containment. This report, therefore, documents specific results of fission product source term analyses that will form the basis for the HBU supplement to NUREG-1465. However, commentary is also provided on observed differences between the composite results of the source term calculations performed here and those reflected in NUREG-1465 itself.

  17. Reassessment of the technical bases for estimating source terms. Draft report for comment

    International Nuclear Information System (INIS)

    Silberberg, M.; Mitchell, J.A.; Meyer, R.O.; Pasedag, W.F.; Ryder, C.P.; Peabody, C.A.; Jankowski, M.W.

    1985-07-01

    NUREG-0956 describes the NRC staff and contractor efforts to reassess and update the agency's analytical procedures for estimating accident source terms for nuclear power plants. The effort included development of a new source term analytical procedure - a set of computer codes - that is intended to replace the methodology of the Reactor Safety Study (WASH-1400) and to be used in reassessing the use of TID-14844 assumptions (10 CFR 100). NUREG-0956 describes the development of these codes, the demonstration of the codes to calculate source terms for specific cases, the peer review of this work, some perspectives on the overall impact of new source terms on plant risks, the plans for related research projects, and the conclusions and recommendations resulting from the effort

  18. Source terms: an investigation of uncertainties, magnitudes, and recommendations for research. [PWR; BWR]

    Energy Technology Data Exchange (ETDEWEB)

    Levine, S.; Kaiser, G. D.; Arcieri, W. C.; Firstenberg, H.; Fulford, P. J.; Lam, P. S.; Ritzman, R. L.; Schmidt, E. R.

    1982-03-01

    The purpose of this document is to assess the state of knowledge and expert opinions that exist about fission product source terms from potential nuclear power plant accidents. This is so that recommendations can be made for research and analyses which have the potential to reduce the uncertainties in these estimated source terms and to derive improved methods for predicting their magnitudes. The main reasons for writing this report are to indicate the major uncertainties involved in defining realistic source terms that could arise from severe reactor accidents, to determine which factors would have the most significant impact on public risks and emergency planning, and to suggest research and analyses that could result in the reduction of these uncertainties. Source terms used in the conventional consequence calculations in the licensing process are not explicitly addressed.

  19. Selected source term topics. Report to CSNI by an OECD/NEA Group of experts

    International Nuclear Information System (INIS)

    1987-04-01

    CSNI Report 136 summarizes the results of the work performed by the Group of Experts on the Source Term and Environmental Consequences (PWG4) during the period extending from 1983 to 1986. This report is complementary to Part 1, 'Technical Status of the Source Term' of CSNI Report 135, 'Report to CSNI on Source Term Assessment, Containment atmosphere control systems, and accident consequences'; it considers in detail a number of very specific issues thought to be important in the source term area. It consists of: an executive summary (prepared by the Chairman of the Group), a section on conclusions and recommendations, and five technical chapters (fission product chemistry in the primary circuit of a LWR during severe accidents; resuspension/re-entrainment of aerosols in LWRs following a meltdown accident; iodine chemistry under severe accident conditions; effects of combustion, steam explosions and pressurized melt ejection on fission product behaviour; radionuclide removal by pool scrubbing), a technical annex and two appendices

  20. Probabilistic Logical Characterization

    DEFF Research Database (Denmark)

    Hermanns, Holger; Parma, Augusto; Segala, Roberto

    2011-01-01

    Probabilistic automata exhibit both probabilistic and non-deterministic choice. They are therefore a powerful semantic foundation for modeling concurrent systems with random phenomena arising in many applications, ranging from artificial intelligence, security, and systems biology to performance modeling. Several variations of bisimulation and simulation relations have proved to be useful as means to abstract and compare different automata. This paper develops a taxonomy of logical characterizations of these relations on image-finite and image-infinite probabilistic automata.

  1. Impact of source terms on distances to which reactor accident consequences occur

    International Nuclear Information System (INIS)

    Ostmeyer, R.M.

    1982-01-01

    Estimates of the distances over which reactor accident consequences might occur are important for development of siting criteria and for emergency response planning. This paper summarizes the results of a series of CRAC2 calculations performed to estimate these distances. Because of the current controversy concerning the magnitude of source terms for severe accidents, the impact of source term reductions upon distance estimates is also examined

  2. Data assimilation and source term estimation during the early phase of a nuclear accident

    Energy Technology Data Exchange (ETDEWEB)

    Golubenkov, A.; Borodin, R. [SPA Typhoon, Emergency Centre (Russian Federation); Sohier, A.; Rojas Palma, C. [Centre de l'Etude de l'Energie Nucleaire, Mol (Belgium)

    1996-02-01

    The mathematical/physical basis of possible methods to model the source term during an accidental release of radionuclides is discussed. Knowledge of the source term is important for optimizing urgent countermeasures for the population. In most cases, however, it will be impossible to assess the release dynamics directly. Therefore, methods are under development in which the source term is modelled based on the comparison of off-site monitoring data and model predictions using an atmospheric dispersion model. The degree of agreement between the measured and calculated characteristics of the radioactive contamination of the air and the ground surface is an important criterion in this process. Due to the inherent complexity, some geometrical transformations taking into account space-time discrepancies between observed and modelled contamination fields are defined before the source term is adapted. This work describes the developed algorithms, which are also tested against data from tracer experiments performed in the past. This method is also used to reconstruct the dynamics of the Chernobyl source term. Finally, this report presents a concept for software to reconstruct a multi-isotopic source term in real time.
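    A minimal sketch of the underlying idea, not the authors' algorithm: if the dispersion model is linear in the release rate, each monitoring station sees a dilution factor times the source strength, and the source term can be adjusted by least squares until modelled and measured fields agree. The dilution factors and measurements below are invented for illustration:

        import numpy as np

        # chi/Q dilution factors (s/m^3) predicted by a dispersion model at four
        # monitoring stations for a unit release -- assumed values
        chi_over_q = np.array([2.1e-6, 8.4e-7, 3.0e-7, 1.2e-7])

        # measured air concentrations at the same stations (Bq/m^3), with noise
        measured = np.array([4.4e3, 1.6e3, 7.1e2, 2.0e2])

        # least-squares release-rate estimate: minimize ||measured - Q*chi_over_q||^2
        q_hat = (chi_over_q @ measured) / (chi_over_q @ chi_over_q)

        # residuals quantify the model/measurement agreement criterion
        residuals = measured - q_hat * chi_over_q
        print(f"estimated release rate: {q_hat:.3e} Bq/s")
        print("relative residuals   :", residuals / measured)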

  3. Data assimilation and source term estimation during the early phase of a nuclear accident

    International Nuclear Information System (INIS)

    Golubenkov, A.; Borodin, R.; Sohier, A.; Rojas Palma, C.

    1996-02-01

    The mathematical/physical basis of possible methods to model the source term during an accidental release of radionuclides is discussed. Knowledge of the source term is important for optimizing urgent countermeasures for the population. In most cases, however, it will be impossible to assess the release dynamics directly. Therefore, methods are under development in which the source term is modelled based on the comparison of off-site monitoring data and model predictions using an atmospheric dispersion model. The degree of agreement between the measured and calculated characteristics of the radioactive contamination of the air and the ground surface is an important criterion in this process. Due to the inherent complexity, some geometrical transformations taking into account space-time discrepancies between observed and modelled contamination fields are defined before the source term is adapted. This work describes the developed algorithms, which are also tested against data from tracer experiments performed in the past. This method is also used to reconstruct the dynamics of the Chernobyl source term. Finally, this report presents a concept for software to reconstruct a multi-isotopic source term in real time

  4. Procedures for the elicitation of expert judgements in the probabilistic risk analysis of the long-term effects of radioactive waste repositories: an annotated bibliography

    International Nuclear Information System (INIS)

    Watson, S.R.

    1993-01-01

    This annotated bibliography describes the key literature relevant to the elicitation of expert judgements in radioactive waste management. The bibliography is divided into seven sections; section 2 lists the literature exploring the proper interpretation of probabilities used in Probabilistic Risk Analysis (PRA). Section 3 lists literature describing other calculi for handling uncertainty in a numerical fashion. In section 4 comments are given on how to elicit probabilities from individuals as a measure of subjective degrees of belief and section 5 lists the literature concerning how expert judgements can be combined. Sections 6 and 7 list literature giving an overview of the issues involved in PRA for radioactive waste repositories. (author)

  5. Long term leaching of chlorinated solvents from source zones in low permeability settings with fractures

    DEFF Research Database (Denmark)

    Bjerg, Poul Løgstrup; Chambon, Julie Claire Claudia; Troldborg, Mads

    2008-01-01

    spreads to the low permeability matrix by diffusion. This results in a long term source of contamination due to back-diffusion. Leaching from such sources is further complicated by microbial degradation under anaerobic conditions to sequentially form the daughter products trichloroethylene, cis...

  6. Backup Sourcing Decisions for Coping with Supply Disruptions under Long-Term Horizons

    Directory of Open Access Journals (Sweden)

    Jing Hou

    2016-01-01

    This paper studies a buyer's inventory control problem under a long-term horizon. The buyer has one major supplier that is prone to disruption risks and one backup supplier with a higher wholesale price. Two kinds of sourcing methods are available for the buyer: single sourcing with/without contingent supply and dual sourcing. In contingent sourcing, the backup supplier is capacitated and/or has yield uncertainty, whereas in dual sourcing the backup supplier has an incentive to offer output flexibility during disrupted periods. The buyer's expected cost functions and the optimal base-stock levels using each sourcing method under a long-term horizon are obtained, respectively. The effects of three risk parameters, disruption probability, contingent capacity or uncertainty, and backup flexibility, are examined using comparative studies and numerical computations. Four sourcing methods, namely, single sourcing with contingent supply, dual sourcing, and single sourcing from either of the two suppliers, are also compared. These findings can be used as a valuable guideline for companies to select an appropriate sourcing strategy under supply disruption risks.
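    A toy Monte Carlo sketch, not the paper's analytical model, of how the expected per-period cost of a base-stock policy can be evaluated when the major supplier is disrupted with some probability and a pricier backup delivers contingently; all parameters are invented:

        import numpy as np

        def expected_cost(base_stock, p_disrupt, n_periods=200_000, seed=0,
                          w_major=1.0, w_backup=1.3, h=0.2, b=4.0, mean_demand=10.0):
            """Average per-period cost of an order-up-to (base-stock) policy with
            contingent sourcing and zero lead time: each period the stock is raised
            to base_stock, demand occurs, and holding/backorder costs are charged."""
            rng = np.random.default_rng(seed)
            demand = rng.poisson(mean_demand, n_periods)
            disrupted = rng.random(n_periods) < p_disrupt
            wholesale = np.where(disrupted, w_backup, w_major)  # backup price when disrupted
            purchase = wholesale * demand                       # replenishment equals demand
            holding = h * np.maximum(base_stock - demand, 0)
            backorder = b * np.maximum(demand - base_stock, 0)
            return float(np.mean(purchase + holding + backorder))

        # compare candidate base-stock levels under a 10% disruption probability
        for s in (8, 10, 12, 14, 16):
            print(s, round(expected_cost(s, p_disrupt=0.10), 2))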

  7. Duplicate Detection in Probabilistic Data

    NARCIS (Netherlands)

    Panse, Fabian; van Keulen, Maurice; de Keijzer, Ander; Ritter, Norbert

    2009-01-01

    Collected data often contains uncertainties. Probabilistic databases have been proposed to manage uncertain data. To combine data from multiple autonomous probabilistic databases, an integration of probabilistic data has to be performed. Until now, however, data integration approaches have focused

  8. Selective application of revised source terms to operating nuclear power plants

    International Nuclear Information System (INIS)

    Moon, Joo Hyun; Song, Jae Hyuk; Lee, Young Wook; Ko, Hyun Seok; Kang, Chang Sun

    2001-01-01

    More than 30 years after TID-14844 was promulgated in 1962, there has been a big change in the US NRC's regulatory position on using accident source terms for radiological assessment following a design basis accident (DBA). To replace the instantaneous source terms of TID-14844, the time-dependent source terms of NUREG-1465 were published in 1995. In the meantime, the radiological acceptance criteria for reactor site evaluation in 10 CFR Part 100 were also revised. In particular, the concept of total effective dose equivalent has been incorporated in accordance with the radiation protection standards set forth in the revised 10 CFR Part 20. Subsequently, the publication of Regulatory Guide 1.183 and the revision of Standard Review Plan 15.0.1 followed in 2000, which provided licensees of operating nuclear power reactors with acceptable guidance for applying the revised source term. The guidance allowed the holder of an operating license issued prior to January 10, 1997 to voluntarily revise the accident source terms used in the radiological consequence analyses of DBAs. Regarding the type of application, full and selective applications are suggested. Whether full or selective, based upon the scope and nature of the associated plant modifications being proposed, the actual application of the revised source terms to an operating plant is expected to have a large impact on its facility design basis. Considering the scope and cost of the analyses required for licensing, selective application seems more appealing to a licensee of an operating plant than full application. In this paper, hence, the selective application methodology is reviewed and is actually applied to the assessment of offsite radiological consequences following a LOCA at Ulchin Unit 3 and 4, in order to identify and analyze the potential impacts due to application of the revised source terms and to assess the considerations taken in each application prior to its actual

  9. Source term determination from subcritical multiplication measurements at Koral-1 reactor

    International Nuclear Information System (INIS)

    Blazquez, J.B.; Barrado, J.M.

    1978-01-01

    By using an AmBe neutron source, two independent procedures have been established for the zero-power experimental fast reactor Coral-1 in order to measure the source term which appears in the point kinetics equations. In the first one, the source term is measured when the reactor is just critical with the source, by taking advantage of the wide range of the linear approach to critical for Coral-1. In the second one, the measurement is made in the subcritical state by making use of the previously calibrated control rods. Several applications are also included, such as the measurement of the detector dead time, the determination of the reactivity of small samples, and the shape of the neutron importance of the source. (author)

  10. Review of radionuclide source terms used for performance-assessment analyses

    International Nuclear Information System (INIS)

    Barnard, R.W.

    1993-06-01

    Two aspects of the radionuclide source terms used for total-system performance assessment (TSPA) analyses have been reviewed. First, a detailed radionuclide inventory (i.e., one in which the reactor type, decay, and burnup are specified) is compared with the standard source-term inventory used in prior analyses. The latter assumes a fixed ratio of pressurized-water reactor (PWR) to boiling-water reactor (BWR) spent fuel, at specific amounts of burnup and at 10-year decay. TSPA analyses have been used to compare the simplified source term with the detailed one. The TSPA-91 analyses did not show a significant difference between the source terms. Second, the radionuclides used in source terms for TSPA aqueous-transport analyses have been reviewed to select ones that are representative of the entire inventory. It is recommended that two actinide decay chains be included (the 4n+2 "uranium" and 4n+3 "actinium" decay series), since these include several radionuclides that have potentially important release and dose characteristics. In addition, several fission products are recommended for the same reason. The choice of radionuclides should be influenced by other parameter assumptions, such as the solubility and retardation of the radionuclides.
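    A small illustration, separate from the review itself, of why whole decay chains matter for a source-term inventory: daughter nuclides grow in according to the Bateman equations, sketched here for a generic two-member chain with purely illustrative half-lives:

        import numpy as np

        def chain_inventory(n0, half_lives_s, t_s):
            """Atom inventory of a linear decay chain at time t, from the Bateman
            system dN/dt = A N solved by eigen-decomposition of the decay matrix."""
            lam = np.log(2.0) / np.asarray(half_lives_s)
            a = np.diag(-lam)
            for i in range(1, len(lam)):
                a[i, i - 1] = lam[i - 1]          # parent feeds daughter
            w, v = np.linalg.eig(a)
            return ((v @ np.diag(np.exp(w * t_s)) @ np.linalg.inv(v)) @ n0).real

        year = 3.156e7                             # seconds per year
        n0 = np.array([1.0e24, 0.0])               # atoms: parent only at closure
        half_lives = [2.4e4 * year, 4.3e2 * year]  # hypothetical parent/daughter
        for years in (10, 100, 1000, 10000):
            print(years, chain_inventory(n0, half_lives, years * year))

    Extended to the full 4n+2 and 4n+3 series, the same machinery captures the daughter ingrowth that gives those chains their long-term dose significance.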

  11. Lessons Learned from Characterization, Performance Assessment, and EPA Regulatory Review of the 1996 Actinide Source Term for the Waste Isolation Pilot Plant

    International Nuclear Information System (INIS)

    Larson, K.W.; Moore, R.C.; Nowak, E.J.; Papenguth, H.W.; Jow, H.

    1999-01-01

    The Waste Isolation Pilot Plant (WIPP) is a US Department of Energy (DOE) facility for the permanent disposal of transuranic waste from defense activities. In 1996, the DOE submitted the Title 40 CFR Part 191 Compliance Certification Application for the Waste Isolation Pilot Plant (CCA) to the US Environmental Protection Agency (EPA). The CCA included a probabilistic performance assessment (PA) conducted by Sandia National Laboratories to establish compliance with the quantitative release limits defined in 40 CFR 191.13. An experimental program to collect data relevant to the actinide source term began around 1989, which eventually supported the 1996 CCA PA actinide source term model. The actinide source term provided an estimate of mobile dissolved and colloidal Pu, Am, U, Th, and Np concentrations in their stable oxidation states, and accounted for effects of uncertainty in the chemistry of brines in waste disposal areas. The experimental program and the actinide source term included in the CCA PA underwent EPA review lasting more than 1 year. Experiments were initially conducted to develop data relevant to the wide range of potential future conditions in waste disposal areas. Interim, preliminary performance assessments and actinide source term models provided insight allowing refinement of experiments and models. Expert peer review provided additional feedback and confidence in the evolving experimental program. By 1995, the chemical database and PA predictions of WIPP performance were considered reliable enough to support the decision to add an MgO backfill to waste rooms to control chemical conditions and reduce uncertainty in actinide concentrations, especially for Pu and Am. Important lessons learned through the characterization, PA modeling, and regulatory review of the actinide source term are (1) experimental characterization and PA should evolve together, with neither activity completely dominating the other, (2) the understanding of physical processes

  12. ITER safety task NID-5a: ITER tritium environmental source terms - safety analysis basis

    International Nuclear Information System (INIS)

    Natalizio, A.; Kalyanam, K.M.

    1994-09-01

    The Canadian Fusion Fuels Technology Project (CFFTP) contribution is part of ITER task NID-5a, Initial Tritium Source Term. This safety analysis basis constitutes the first part of the work for establishing tritium source terms and is intended to solicit comments and obtain agreement. The analysis objective is to provide an early estimate of tritium environmental source terms for the events to be analyzed. Events that would result in the loss of tritium are: a Loss of Coolant Accident (LOCA), a vacuum vessel boundary breach, a torus exhaust line failure, a fuelling machine process boundary failure, a fuel processing system process boundary failure, a water detritiation system process boundary failure and an isotope separation system process boundary failure. 9 figs

  13. Review of the accident source terms for aluminide fuel: Application to the BR2 reactor

    International Nuclear Information System (INIS)

    Joppen, F.

    2005-01-01

    A major safety review of the BR2, a materials test reactor, is to be conducted for the year 2006. One of the subjects selected for the safety review is the definition of source terms for emergency planning and, in particular, the development of accident scenarios. For nuclear power plants, the behaviour of fuel under accident conditions is a well-studied subject. In the case of non-power reactors, this basic knowledge is rather scarce. The usefulness of information from power plant fuels is limited due to the differences in fuel type, power level and thermohydraulic conditions. Initial investigation indicates that using data from power plant fuel leads to an overestimation of the source terms. Further research on this subject could be very useful for the research reactor community, in order to define more realistic source terms and to improve emergency preparedness. (author)

  14. Analysis of safety information for nuclear power plants and development of source term estimation program

    International Nuclear Information System (INIS)

    Kim, Tae Woon; Choi, Seong Soo; Park, Jin Hee

    1999-12-01

    The current CARE (Computerized Advisory System for Radiological Emergency) in KINS (Korea Institute of Nuclear Safety) has no STES (Source Term Estimation System) linking SIDS (Safety Information Display System) and FADAS (Following Accident Dose Assessment System). In this study, therefore, an STES is being developed. The STES estimates the source term based on the safety information provided by SIDS. The estimated source term is given to FADAS as an input for estimating the environmental effects of the radiation release. Through this first-year project, an STES for Kori 3,4 and Younggwang 1,2 has been developed. Since there is no CARE for the Wolsong (PHWR) plants yet, CARE for Wolsong is under construction. The safety parameters have been selected, and the safety information display screens and the alarm logic for plant status changes have been developed for Wolsong Unit 2 based on the design documents for CANDU plants.

  15. Probabilistic Structural Analysis Program

    Science.gov (United States)

    Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.

    2010-01-01

    NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and lifing (life-prediction) methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.
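    A toy sketch of the kind of quantity such a program computes, not NESSUS itself: the probability of failure of a simple limit state g = capacity - demand under random inputs, here by plain Monte Carlo with invented distributions (a production tool would typically use more efficient methods such as the advanced mean value method):

        import numpy as np

        rng = np.random.default_rng(42)
        n = 1_000_000

        # illustrative random inputs: lognormal strength, normal applied stress (MPa)
        strength = rng.lognormal(mean=np.log(400.0), sigma=0.08, size=n)
        stress = rng.normal(loc=300.0, scale=30.0, size=n)

        # limit state: g < 0 means failure
        g = strength - stress
        pf = np.mean(g < 0.0)
        print(f"probability of failure ~ {pf:.2e}")

        # crude probabilistic sensitivity: correlation of each input with the margin
        for name, x in (("strength", strength), ("stress", stress)):
            print(name, np.corrcoef(x, g)[0, 1])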

  16. Use of source term code package in the ELEBRA MX-850 system

    International Nuclear Information System (INIS)

    Guimaraes, A.C.F.; Goes, A.G.A.

    1988-12-01

    The implementation of the source term code package in the ELEBRA MX-850 system is presented. The source term is formed when radioactive materials generated in the nuclear fuel leak toward the containment and the environment external to the reactor containment. The version implemented in the ELEBRA system is composed of five codes: MARCH 3, TRAPMELT 3, THCCA, VANESA and NAVA. The original example case was used; it consists of a small LOCA accident in a PWR-type reactor. A sensitivity study for the TRAPMELT 3 code was carried out, modifying the 'TIME STEP' to estimate the CPU processing time for executing the original example case. (M.C.K.) [pt

  17. Source term estimation based on in-situ gamma spectrometry using a high purity germanium detector

    International Nuclear Information System (INIS)

    Pauly, J.; Rojas-Palma, C.; Sohier, A.

    1997-06-01

    An alternative method to reconstruct the source term of a nuclear accident is proposed. The technique discussed here involves the use of in-situ gamma spectrometry. Validation of the applied methodology has been possible through the monitoring of routine releases of Ar-41 originating at a Belgian site from an air-cooled graphite research reactor. This technique provides a quick nuclide-specific decomposition of the source term and therefore has enormous potential if implemented in nuclear emergency preparedness and radiological assessments of nuclear accidents during the early phase

  18. Source-term characterisation and solid speciation of plutonium at the Semipalatinsk NTS, Kazakhstan.

    Science.gov (United States)

    Nápoles, H Jiménez; León Vintró, L; Mitchell, P I; Omarova, A; Burkitbayev, M; Priest, N D; Artemyev, O; Lukashenko, S

    2004-01-01

    New data on the concentrations of key fission/activation products and transuranium nuclides in samples of soil and water from the Semipalatinsk Nuclear Test Site are presented and interpreted. Sampling was carried out at Ground Zero, Lake Balapan, the Tel'kem craters and reference locations within the test site boundary well removed from localised sources. Radionuclide ratios have been used to characterise the source term(s) at each of these sites. The geochemical partitioning of plutonium has also been examined and it is shown that the bulk of the plutonium contamination at most of the sites examined is in a highly refractory, non-labile form.

  19. Source-term characterisation and solid speciation of plutonium at the Semipalatinsk NTS, Kazakhstan

    Energy Technology Data Exchange (ETDEWEB)

    Napoles, H.J.H. Jimenez; Leon Vintro, L. E-mail: luis.leon@ucd.ie; Mitchell, P.I.; Omarova, A.; Burkitbayev, M.; Priest, N.D.; Artemyev, O.; Lukashenko, S

    2004-09-01

    New data on the concentrations of key fission/activation products and transuranium nuclides in samples of soil and water from the Semipalatinsk Nuclear Test Site are presented and interpreted. Sampling was carried out at Ground Zero, Lake Balapan, the Tel'kem craters and reference locations within the test site boundary well removed from localised sources. Radionuclide ratios have been used to characterise the source term(s) at each of these sites. The geochemical partitioning of plutonium has also been examined and it is shown that the bulk of the plutonium contamination at most of the sites examined is in a highly refractory, non-labile form.

  20. The long-term problems of contaminated land: Sources, impacts and countermeasures

    Energy Technology Data Exchange (ETDEWEB)

    Baes, C.F. III

    1986-11-01

    This report examines the various sources of radiological land contamination; its extent; its impacts on man, agriculture, and the environment; countermeasures for mitigating exposures; radiological standards; alternatives for achieving land decontamination and cleanup; and possible alternatives for utilizing the land. The major potential sources of extensive long-term land contamination with radionuclides, in order of decreasing extent, are nuclear war, detonation of a single nuclear weapon (e.g., a terrorist act), serious reactor accidents, and nonfission nuclear weapons accidents that disperse the nuclear fuels (termed "broken arrows").

  1. The long-term problems of contaminated land: Sources, impacts and countermeasures

    International Nuclear Information System (INIS)

    Baes, C.F. III.

    1986-11-01

    This report examines the various sources of radiological land contamination; its extent; its impacts on man, agriculture, and the environment; countermeasures for mitigating exposures; radiological standards; alternatives for achieving land decontamination and cleanup; and possible alternatives for utilizing the land. The major potential sources of extensive long-term land contamination with radionuclides, in order of decreasing extent, are nuclear war, detonation of a single nuclear weapon (e.g., a terrorist act), serious reactor accidents, and nonfission nuclear weapons accidents that disperse the nuclear fuels (termed "broken arrows")

  2. Design parameters and source terms: Volume 1, Design parameters: Revision 0

    International Nuclear Information System (INIS)

    1987-10-01

    The Design Parameters and Source Terms Document was prepared in accordance with a DOE request and to provide data for the environmental impact study to be performed in the future for the Deaf Smith County, Texas site for a nuclear waste repository in salt. This document updates a previous unpublished report by Stearns Catalytic Corporation (SCC), entitled "Design Parameters and Source Terms for a Two-Phase Repository in Salt," 1985, to the level of the Site Characterization Plan - Conceptual Design Report. The previous unpublished SCC study identifies the data needs for the Environmental Assessment effort for seven possible Salt Repository sites.

  3. Probabilistic programmable quantum processors

    International Nuclear Information System (INIS)

    Buzek, V.; Ziman, M.; Hillery, M.

    2004-01-01

    We analyze how to improve the performance of probabilistic programmable quantum processors. We show how the probability of success of the probabilistic processor can be enhanced by using the processor in loops. In addition, we show that an arbitrary SU(2) transformation of qubits can be encoded in the program state of a universal programmable probabilistic quantum processor. The probability of success of this processor can be enhanced by a systematic correction of errors via conditional loops. Finally, we show that all our results can also be generalized to qudits. (Abstract Copyright [2004], Wiley Periodicals, Inc.)

  4. Probabilistic safety assessment framework of pebble-bed modular high-temperature gas-cooled reactor

    International Nuclear Information System (INIS)

    Liu Tao; Tong Jiejuan; Zhao Jun; Cao Jianzhu; Zhang Liguo

    2009-01-01

    After an investigation of probabilistic safety assessment (PSA) frameworks for similar reactor types, a PSA framework for the Pebble-bed Modular High-Temperature Gas-cooled Reactor (HTR-PM) is presented that reflects its own design characteristics. It is an integral framework organized around event sequences, with initiating events at the beginning and source term categories at the end. The analysis shows that it is the HTR-PM design features that determine its PSA framework. (authors)

  5. A reconnaissance assessment of probabilistic earthquake accelerations at the Nevada Test Site

    International Nuclear Information System (INIS)

    Perkins, D.M.; Thenhaus, P.C.; Hanson, S.L.; Algermissen, S.T.

    1986-01-01

    We have made two interim assessments of the probabilistic ground-motion hazard for the potential nuclear-waste disposal facility at the Nevada Test Site (NTS). The first assessment used historical seismicity and generalized source zones and source faults in the immediate vicinity of the facility. This model produced relatively high probabilistic ground motions, comparable to the higher of two earlier estimates, which was obtained by averaging seismicity in a 400-km-radius circle around the site. The high ground-motion values appear to be caused in part by nuclear-explosion aftershocks remaining in the catalog even after the explosions themselves have been removed. The second assessment used particularized source zones and source faults in a region substantially larger than NTS to provide a broad context of probabilistic ground motion estimates at other locations of the study region. Source faults are mapped or inferred faults having lengths of 5 km or more. Source zones are defined by boundaries separating fault groups on the basis of direction and density. For this assessment, earthquake recurrence has been estimated primarily from historic seismicity prior to nuclear testing. Long-term recurrence for large-magnitude events is constrained by geological estimates of recurrence in a regime in which the large-magnitude earthquakes would occur with predominantly normal mechanisms. 4 refs., 10 figs
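    A compact sketch of the standard hazard integral behind such assessments, not the study's own model: the annual rate at which a ground-motion level is exceeded is the source activity rate times the exceedance probability averaged over magnitude and distance. The recurrence parameters, distances and ground-motion relation below are placeholders:

        import numpy as np
        from math import erf

        def pr_exceed(a_g, m, r_km, c0=-1.0, c1=0.55, c2=1.0, sigma=0.6):
            """P(PGA > a | m, r) for a toy attenuation relation
            ln PGA = c0 + c1*m - c2*ln(r) with lognormal scatter sigma."""
            mu = c0 + c1 * m - c2 * np.log(r_km)
            z = (np.log(a_g) - mu) / sigma
            return 0.5 * (1.0 - erf(z / np.sqrt(2.0)))

        # truncated Gutenberg-Richter magnitude density on [m_min, m_max]
        m_min, m_max, beta, nu = 5.0, 7.5, np.log(10.0) * 0.9, 0.05   # nu: events/yr >= m_min
        m = np.linspace(m_min, m_max, 200)
        dm = m[1] - m[0]
        f_m = beta * np.exp(-beta * (m - m_min)) / (1.0 - np.exp(-beta * (m_max - m_min)))

        # assumed source-to-site distances, equally weighted (km)
        dists = [15.0, 30.0, 60.0]

        def annual_rate(a_g):
            rate = 0.0
            for r in dists:
                p = np.array([pr_exceed(a_g, mi, r) for mi in m])
                rate += np.sum(p * f_m) * dm / len(dists)
            return nu * rate

        for a in (0.05, 0.1, 0.2, 0.4):   # PGA in g
            print(f"PGA > {a:4.2f} g : {annual_rate(a):.3e} per year")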

  6. Using Reactive Transport Modeling to Evaluate the Source Term at Yucca Mountain

    Energy Technology Data Exchange (ETDEWEB)

    Y. Chen

    2001-12-19

    The conventional approach to source-term evaluation for performance assessment of nuclear waste repositories uses speciation-solubility modeling tools and assumes that pure phases of radioelements control their solubility. This assumption may not reflect reality, as most radioelements (except for U) may not form their own pure phases. As a result, solubility limits predicted using the conventional approach are several orders of magnitude higher than the concentrations of radioelements measured in spent fuel dissolution experiments. This paper presents the author's attempt to use a non-conventional approach to evaluate the source term of radionuclide release for Yucca Mountain. Based on the general reactive-transport code AREST-CT, a model for spent fuel dissolution and secondary phase precipitation has been constructed. The model accounts for both equilibrium and kinetic reactions. Its predictions have been compared against laboratory experiments and natural analogues. It is found that, without calibrations, the simulated results match laboratory and field observations very well in many aspects. More important is the fact that no contradictions between them have been found. This provides confidence in the predictive power of the model. Based on the concept of Np incorporated into uranyl minerals, the model not only predicts a lower Np source term than that given by conventional Np solubility models, but also produces results which are consistent with laboratory measurements and observations. Moreover, two hypotheses, whether or not Np enters tertiary uranyl minerals, have been tested by comparing model predictions against laboratory observations; the results favor the former. It is concluded that this non-conventional approach to source term evaluation not only eliminates over-conservatism in the conventional solubility approach to some extent, but also gives a realistic representation of the system of interest, which is a prerequisite for truly understanding the long-term

  7. Using Reactive Transport Modeling to Evaluate the Source Term at Yucca Mountain

    International Nuclear Information System (INIS)

    Y. Chen

    2001-01-01

    The conventional approach to source-term evaluation for performance assessment of nuclear waste repositories uses speciation-solubility modeling tools and assumes that pure phases of radioelements control their solubility. This assumption may not reflect reality, as most radioelements (except for U) may not form their own pure phases. As a result, solubility limits predicted using the conventional approach are several orders of magnitude higher than the concentrations of radioelements measured in spent fuel dissolution experiments. This paper presents the author's attempt to use a non-conventional approach to evaluate the source term of radionuclide release for Yucca Mountain. Based on the general reactive-transport code AREST-CT, a model for spent fuel dissolution and secondary phase precipitation has been constructed. The model accounts for both equilibrium and kinetic reactions. Its predictions have been compared against laboratory experiments and natural analogues. It is found that, without calibrations, the simulated results match laboratory and field observations very well in many aspects. More important is the fact that no contradictions between them have been found. This provides confidence in the predictive power of the model. Based on the concept of Np incorporated into uranyl minerals, the model not only predicts a lower Np source term than that given by conventional Np solubility models, but also produces results which are consistent with laboratory measurements and observations. Moreover, two hypotheses, whether or not Np enters tertiary uranyl minerals, have been tested by comparing model predictions against laboratory observations; the results favor the former. It is concluded that this non-conventional approach to source term evaluation not only eliminates over-conservatism in the conventional solubility approach to some extent, but also gives a realistic representation of the system of interest, which is a prerequisite for truly understanding the long-term

  8. Further development of probabilistic exposure modeling within the framework of the long-term safety analysis of repositories for radioactive residues; Weiterentwicklung der probabilistischen Expositionsmodellierung im Rahmen der Langzeitsicherheitsanalyse von Endlagern fuer radioaktive Reststoffe

    Energy Technology Data Exchange (ETDEWEB)

    Ciecior, Willy

    2017-04-28

    The long-term safety analysis of repositories for radioactive waste is based on modeling the release of nuclides from the waste matrix and the subsequent transport through the near and far field of the repository system to the living part of the environment (biosphere). For the conversion of the nuclide release into a potential hazard (e.g. into an effective dose), a conceptual biosphere model and a mathematical exposure model are used. The parametrization of the mathematical model can be carried out deterministically as well as probabilistically, using distributions and Monte Carlo simulation. However, to date, particularly in the context of probabilistic safety analysis for deep geological repositories, there is no uniform procedure for deriving the distributions to be used. The distributions used by the analyst are mostly chosen according to personal conviction and are often illogical with respect to the underlying nature of the actual model parameter, but model results are in part very dependent on the type of distribution selected for the input parameters. Furthermore, few studies are available on the influence of interactions and correlations or other dependencies between the radiological input parameters of the model. Therefore, the impact of different types of distributions (empirical, parametric) for different input parameters, as well as the influence of interactions and correlations between input parameters, on the results of the mathematical exposure modeling was analyzed in the present study. The influence of the type of distribution used to represent the variability of the physical input parameters, as well as of their interactions and dependencies, was identified as less relevant. However, by means of Monte Carlo simulation of the second order, the composition of the corresponding samples and the condition of the sample moments to be used for the construction of parametric distributions were determined to be the essential factors for

  9. Evaluation of Long-term Performance of Enhanced Anaerobic Source Zone Bioremediation using mass flux

    Science.gov (United States)

    Haluska, A.; Cho, J.; Hatzinger, P.; Annable, M. D.

    2017-12-01

    Chlorinated ethene DNAPL source zones in groundwater act as potential long-term sources of contamination as they dissolve, yielding concentrations well above MCLs and posing an on-going public health risk. Enhanced bioremediation has been applied to treat many source zones with significant promise, but the long-term sustainability of this technology has not been thoroughly assessed. This study evaluated the long-term effectiveness of enhanced anaerobic source zone bioremediation at chloroethene-contaminated sites to determine if the treatment prevented contaminant rebound and removed NAPL from the source zone. Long-term performance was evaluated based on achieving MCL-based contaminant mass fluxes in parent compound concentrations during different monitoring periods. Groundwater concentration versus time data were compiled for six sites, and post-remedial contaminant mass fluxes were then measured using passive flux meters at wells both within and down-gradient of the source zone. The post-remedial mass flux data were then combined with pre-remedial water quality data to estimate the pre-remedial mass flux. This information was used to characterize a DNAPL dissolution source strength function, such as the Power Law Model and the Equilibrium Streamtube Model. The six sites characterized for this study were (1) Former Charleston Air Force Base, Charleston, SC; (2) Dover Air Force Base, Dover, DE; (3) Treasure Island Naval Station, San Francisco, CA; (4) Former Raritan Arsenal, Edison, NJ; (5) Naval Air Station, Jacksonville, FL; and (6) Former Naval Air Station, Alameda, CA. Contaminant mass fluxes decreased for all the sites by the end of the post-treatment monitoring period, and rebound was limited within the source zone. Post-remedial source strength function estimates suggest that decreases in contaminant mass flux will continue to occur at these sites, but a mass flux based on MCL levels may never be exceeded. Thus, site clean-up goals should be evaluated as order
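    A minimal sketch of the Power Law source strength model named in the abstract, which ties the flux-averaged source concentration to the remaining source mass; the parameter values are illustrative, not from the six sites studied:

        import numpy as np

        def power_law_depletion(c0, m0, gamma, q, t_end, dt=1.0):
            """Integrate dM/dt = -q*C with C = c0*(M/m0)**gamma (Power Law Model).
            c0: initial flux-averaged concentration (g/m^3), m0: initial mass (g),
            q: water flow through the source zone (m^3/d), times in days."""
            t, m = 0.0, m0
            ts, cs = [0.0], [c0]
            while t < t_end and m > 0.0:
                c = c0 * (m / m0) ** gamma
                m = max(m - q * c * dt, 0.0)
                t += dt
                ts.append(t)
                cs.append(c0 * (m / m0) ** gamma if m > 0.0 else 0.0)
            return np.array(ts), np.array(cs)

        # illustrative source zone: 10 kg of DNAPL, gamma = 1 (stream-tube-like behaviour)
        t, c = power_law_depletion(c0=5.0, m0=1.0e4, gamma=1.0, q=0.5, t_end=365 * 30)
        target = 0.005   # g/m^3, illustrative flux-averaged clean-up criterion
        reached = np.any(c <= target)
        print("criterion reached within 30 yr:", reached)
        print("years elapsed:", t[np.argmax(c <= target)] / 365 if reached else ">30")

    Fitting gamma and the post-treatment mass to measured pre- and post-remediation fluxes is what allows the kind of forward extrapolation of source strength described in the abstract.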

  10. Probabilistic biosphere modeling for the long-term safety assessment of geological disposal facilities for radioactive waste using first- and second-order Monte Carlo simulation.

    Science.gov (United States)

    Ciecior, Willy; Röhlig, Klaus-Jürgen; Kirchner, Gerald

    2018-10-01

    In the present paper, deterministic as well as first- and second-order probabilistic biosphere modeling approaches are compared. Furthermore, the sensitivity of the influence of the probability distribution function shape (empirical distribution functions and fitted lognormal probability functions) representing the aleatory uncertainty (also called variability) of a radioecological model parameter as well as the role of interacting parameters are studied. Differences in the shape of the output distributions for the biosphere dose conversion factor from first-order Monte Carlo uncertainty analysis using empirical and fitted lognormal distribution functions for input parameters suggest that a lognormal approximation is possibly not always an adequate representation of the aleatory uncertainty of a radioecological parameter. Concerning the comparison of the impact of aleatory and epistemic parameter uncertainty on the biosphere dose conversion factor, the latter here is described using uncertain moments (mean, variance) while the distribution itself represents the aleatory uncertainty of the parameter. From the results obtained, the solution space of second-order Monte Carlo simulation is much larger than that from first-order Monte Carlo simulation. Therefore, the influence of epistemic uncertainty of a radioecological parameter on the output result is much larger than that one caused by its aleatory uncertainty. Parameter interactions are only of significant influence in the upper percentiles of the distribution of results as well as only in the region of the upper percentiles of the model parameters. Copyright © 2018 Elsevier Ltd. All rights reserved.
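    A schematic of the first- versus second-order Monte Carlo distinction discussed in these records, for a toy dose conversion factor built from two radioecological parameters; every distribution and number below is invented for illustration:

        import numpy as np

        rng = np.random.default_rng(1)
        n_inner = 10_000

        def bdcf(transfer, intake):
            """Toy biosphere dose conversion factor: transfer factor times intake term."""
            return transfer * intake

        # first-order MC: aleatory variability only, with fixed best-estimate moments
        transfer = rng.lognormal(np.log(0.02), 0.5, n_inner)
        intake = rng.lognormal(np.log(200.0), 0.3, n_inner)
        print("1st order, 95th percentile:", np.percentile(bdcf(transfer, intake), 95))

        # second-order MC: the outer loop samples the uncertain moments (epistemic
        # uncertainty), the inner loop samples variability given those moments
        p95 = []
        for _ in range(200):
            mu_t = rng.normal(np.log(0.02), 0.25)   # uncertain mean of ln(transfer)
            sd_t = rng.uniform(0.3, 0.7)            # uncertain spread of ln(transfer)
            transfer = rng.lognormal(mu_t, sd_t, n_inner)
            intake = rng.lognormal(np.log(200.0), 0.3, n_inner)
            p95.append(np.percentile(bdcf(transfer, intake), 95))
        print("2nd order, spread of the 95th percentile:", np.percentile(p95, [5, 95]))

    The spread of the percentile across outer samples is exactly the larger solution space that the abstract attributes to epistemic parameter uncertainty.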

  11. Probabilistic Infinite Secret Sharing

    OpenAIRE

    Csirmaz, László

    2013-01-01

    The study of probabilistic secret sharing schemes using arbitrary probability spaces and possibly infinite number of participants lets us investigate abstract properties of such schemes. It highlights important properties, explains why certain definitions work better than others, connects this topic to other branches of mathematics, and might yield new design paradigms. A probabilistic secret sharing scheme is a joint probability distribution of the shares and the secret together with a colle...

  12. Probabilistic Programming (Invited Talk)

    OpenAIRE

    Yang, Hongseok

    2017-01-01

    Probabilistic programming refers to the idea of using standard programming constructs for specifying probabilistic models from machine learning and statistics, and employing generic inference algorithms for answering various queries on these models, such as posterior inference and estimation of model evidence. Although this idea itself is not new and was, in fact, explored by several programming-language and statistics researchers in the early 2000, it is only in the last few years that proba...

  13. Analysis of the primary source term for meltdown accidents using MELCOR 1.8.2

    International Nuclear Information System (INIS)

    Schmuck, P.

    1995-01-01

    The MELCOR code describing accident phenomena in the core and primary systems was used for source term calculations and - in the context of the MELCOR Cooperative Assessment Programme - for studying two-phase flows through components such as valves and chokes. Results of the latter studies in comparison to experiments gave hints for an improved calculation of momentum transfer between the phases. (orig.)

  14. Proposal for implementation of alternative source term in the nuclear power plant of Laguna Verde

    International Nuclear Information System (INIS)

    Bazan L, A.; Lopez L, M.; Vargas A, A.; Cardenas J, J. B.

    2009-10-01

    In 2010 the nuclear power plant of Laguna Verde will implement the extended power uprate in both units of the plant. In accordance with the methodology of NEDC-33004P-A (constant pressure power uprate), the core source terms for accident evaluations were increased in proportion to the power level ratio. This means that, for a design basis loss-of-coolant accident, a 15% increase in power produced a 15% increase in the dose to the main control room. Applying the NEDC-33004P-A method at extended power uprate conditions, it was determined that the dose to the main control room is very near the regulatory limit established by SRP 6.4. For this reason, and in order to recover the margin, the nuclear power plant of Laguna Verde will calculate an alternative source term following the criteria established in RG 1.183 (alternative radiological source terms for evaluating design basis accidents at nuclear power reactors). This approach also yields a more realistic dose value using the criterion of 10 CFR 50.67, and it is expected to provide additional operational flexibility. This paper presents the proposal for implementing the alternative source term at Laguna Verde. (Author)

  15. Reciprocity relations and the mode conversion-absorption equation with an inhomogeneous source term

    International Nuclear Information System (INIS)

    Cho, S.; Swanson, D.G.

    1990-01-01

    The fourth-order mode conversion equation is solved completely via the Green's function to include an inhomogeneous source term. This Green's function itself contains all the plasma responsive effects such as mode conversion and absorption, and can be used to describe the spontaneous emission. In the course of the analysis, the reciprocity relations between coupling parameters are proved
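    In generic terms (the paper's specific fourth-order operator and boundary conditions are not reproduced here), solving a linear equation with an inhomogeneous source term via its Green's function has the form

        \mathcal{L}\,u(x) = S(x), \qquad
        \mathcal{L}\,G(x,x') = \delta(x - x'), \qquad
        u(x) = \int G(x,x')\,S(x')\,\mathrm{d}x',

    so the single kernel G, which carries the mode-conversion and absorption physics of the operator, maps any prescribed source, including a spontaneous-emission source, onto the plasma response.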

  16. PLOTLIB: a computerized nuclear waste source-term library storage and retrieval system

    International Nuclear Information System (INIS)

    Marshall, J.R.; Nowicki, J.A.

    1978-01-01

    The PLOTLIB code was written to provide computer access to the Nuclear Waste Source-Term Library for those users with little previous computer programming experience. The principles of user orientation, quick accessibility, and versatility were extensively employed in the development of the PLOTLIB code to accomplish this goal. The Nuclear Waste Source-Term Library consists of 16 ORIGEN computer runs incorporating a wide variety of differing light water reactor (LWR) fuel cycles and waste streams. The typical isotopic source-term data consist of information on watts, curies, grams, etc., all of which are compiled as a function of time after reactor discharge and unitized on a per metric ton heavy metal basis. The information retrieval code, PLOTLIB, is used to process source-term information requests into computer plots and/or user-specified output tables. This report will serve both as documentation of the current data library and as an operations manual for the PLOTLIB computer code. The accompanying input description, program listing, and sample problems make this code package an easily understood tool for the various nuclear waste studies under way at the Office of Waste Isolation

  17. Short-Term Memory Stages in Sign vs. Speech: The Source of the Serial Span Discrepancy

    Science.gov (United States)

    Hall, Matthew L.; Bavelier, Daphne

    2011-01-01

    Speakers generally outperform signers when asked to recall a list of unrelated verbal items. This phenomenon is well established, but its source has remained unclear. In this study, we evaluate the relative contribution of the three main processing stages of short-term memory--perception, encoding, and recall--in this effect. The present study…

  18. Radioiodine source term and its potential impact on the use of potassium iodide

    International Nuclear Information System (INIS)

    Malinauskas, A.P.

    1982-01-01

    Information is presented concerning chemical forms of fission product iodine in the primary circuit; chemical forms of fission product iodine in the containment building; a summary of iodine chemistry in light water reactor accidents; and the impact of the radioiodine source term on the potassium iodide issue

  19. New source terms: what do they tell us about engineered safety feature performance

    International Nuclear Information System (INIS)

    Bernero, R.M.

    1985-01-01

    The accident behavior models which are the basis of engineered safety feature design are generally simple, non-mechanistic and concentrated on volatile radioiodine. Now data from source term studies show that models should be more mechanistic and look at other species than volatile iodine. A complete reevaluation of engineered safety features is needed

  20. Model description for calculating the source term of the Angra 1 environmental control system

    International Nuclear Information System (INIS)

    Oliveira, L.F.S. de; Amaral Neto, J.D.; Salles, M.R.

    1988-01-01

    This work presents the model used for evaluating the source term released from the Angra 1 Nuclear Power Plant in case of an accident. An application of the model to the case of a Fuel Assembly Drop Accident inside the Fuel Handling Building during reactor refueling is then presented. (author) [pt

  1. Determination of Source Term for an Annual Stack Release of Gas Reactor G.A. Siwabessy

    International Nuclear Information System (INIS)

    Sudiyati; Syahrir; Unggul Hartoyo; Nugraha Luhur

    2008-01-01

    The radionuclides released from the reactor are noble gases, halides and particulates. The measurements were carried out directly on the air monitoring system of the stack. The results of these measurements are compared with the annual source term data from the Safety Analysis Report (SAR) of RSG-GAS. The measurement results are smaller than the data reported in the SAR document. (author)

  2. The Role of Language in Building Probabilistic Thinking

    Science.gov (United States)

    Nacarato, Adair Mendes; Grando, Regina Célia

    2014-01-01

    This paper is based on research that investigated the development of probabilistic language and thinking by students 10-12 years old. The focus was on the adequate use of probabilistic terms in social practice. A series of tasks was developed for the investigation and completed by the students working in groups. The discussions were video recorded…

  3. A Probabilistic Framework for Security Scenarios with Dependent Actions

    NARCIS (Netherlands)

    Kordy, Barbara; Pouly, Marc; Schweizer, Patrick; Albert, Elvira; Sekereinsk, Emil

    2014-01-01

    This work addresses the growing need of performing meaningful probabilistic analysis of security. We propose a framework that integrates the graphical security modeling technique of attack–defense trees with probabilistic information expressed in terms of Bayesian networks. This allows us to perform

  4. Added Value of uncertainty Estimates of SOurce term and Meteorology (AVESOME)

    DEFF Research Database (Denmark)

    Sørensen, Jens Havskov; Schönfeldt, Fredrik; Sigg, Robert

    In the early phase of a nuclear accident, two large sources of uncertainty exist: one related to the source term and one associated with the meteorological data. Operational methods are being developed in AVESOME for quantitative estimation of uncertainties in atmospheric dispersion prediction ... e.g. at national meteorological services, the proposed methodology is feasible for real-time use, thereby adding value to decision support. In the recent NKS-B projects MUD, FAUNA and MESO, the implications of meteorological uncertainties for nuclear emergency preparedness and management have been studied ... uncertainty in atmospheric dispersion model forecasting stemming from both the source term and the meteorological data is examined. Ways to implement the uncertainties of forecasting in DSSs, and the impacts on real-time emergency management, are described. The proposed methodology allows for efficient real...

  5. Coarse Grid Modeling of Turbine Film Cooling Flows Using Volumetric Source Terms

    Science.gov (United States)

    Heidmann, James D.; Hunter, Scott D.

    2001-01-01

    The recent trend in numerical modeling of turbine film cooling flows has been toward higher fidelity grids and more complex geometries. This trend has been enabled by the rapid increase in computing power available to researchers. However, the turbine design community requires fast turnaround time in its design computations, rendering these comprehensive simulations ineffective in the design cycle. The present study describes a methodology for implementing a volumetric source term distribution in a coarse grid calculation that can model the small-scale and three-dimensional effects present in turbine film cooling flows. This model could be implemented in turbine design codes or in multistage turbomachinery codes such as APNASA, where the computational grid size may be larger than the film hole size. Detailed computations of a single row of 35 deg round holes on a flat plate have been obtained for blowing ratios of 0.5, 0.8, and 1.0, and density ratios of 1.0 and 2.0 using a multiblock grid system to resolve the flows on both sides of the plate as well as inside the hole itself. These detailed flow fields were spatially averaged to generate a field of volumetric source terms for each conservative flow variable. Solutions were also obtained using three coarse grids having streamwise and spanwise grid spacings of 3d, 1d, and d/3. These coarse grid solutions used the integrated hole exit mass, momentum, energy, and turbulence quantities from the detailed solutions as volumetric source terms. It is shown that a uniform source term addition over a distance from the wall on the order of the hole diameter is able to predict adiabatic film effectiveness better than a near-wall source term model, while strictly enforcing correct values of integrated boundary layer quantities.
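    A simplified sketch of the source-term idea described above: the mass, momentum and energy fluxes leaving a film-cooling hole are deposited as volumetric sources over the coarse cells lying within roughly one hole diameter of the wall, rather than resolving the hole itself. The grid, hole and flow values are placeholders, and in practice the routine would sit inside the flow solver:

        import numpy as np

        def film_cooling_sources(y_cells, cell_vol, d_hole, mdot, u_jet, h_jet):
            """Distribute hole-exit mass, streamwise-momentum and energy fluxes
            uniformly over cells whose wall distance is below one hole diameter."""
            near_wall = y_cells < d_hole
            vol = np.sum(cell_vol[near_wall])
            src = np.zeros((len(y_cells), 3))       # columns: mass, momentum, energy
            src[near_wall, 0] = mdot / vol          # kg/(m^3 s)
            src[near_wall, 1] = mdot * u_jet / vol  # N/m^3
            src[near_wall, 2] = mdot * h_jet / vol  # W/m^3
            return src

        # coarse-grid column above one hole: cell-centre wall distances and volumes
        y = np.array([0.0005, 0.0015, 0.003, 0.006, 0.012])   # m
        vol = np.full_like(y, 1.0e-9)                         # m^3 (placeholder)
        print(film_cooling_sources(y, vol, d_hole=0.002, mdot=1.0e-5,
                                   u_jet=30.0, h_jet=3.0e5))

    Because only integrated hole-exit quantities enter, the same source field can be applied on grids whose spacing is several hole diameters, which is the design-cycle use case the abstract targets.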

  6. Regulatory Technology Development Plan - Sodium Fast Reactor: Mechanistic Source Term - Trial Calculation

    International Nuclear Information System (INIS)

    Grabaskas, David

    2016-01-01

    The potential release of radioactive material during a plant incident, referred to as the source term, is a vital design metric and will be a major focus of advanced reactor licensing. The U.S. Nuclear Regulatory Commission has stated an expectation for advanced reactor vendors to present a mechanistic assessment of the potential source term in their license applications. The mechanistic source term presents an opportunity for vendors to realistically assess the radiological consequences of an incident, and may allow reduced emergency planning zones and smaller plant sites. However, the development of a mechanistic source term for advanced reactors is not without challenges, as there are often numerous phenomena impacting the transportation and retention of radionuclides. This project sought to evaluate U.S. capabilities regarding the mechanistic assessment of radionuclide release from core damage incidents at metal fueled, pool-type sodium fast reactors (SFRs). The purpose of the analysis was to identify, and prioritize, any gaps regarding computational tools or data necessary for the modeling of radionuclide transport and retention phenomena. To accomplish this task, a parallel-path analysis approach was utilized. One path, led by Argonne and Sandia National Laboratories, sought to perform a mechanistic source term assessment using available codes, data, and models, with the goal of identifying gaps in the current knowledge base. The second path, performed by an independent contractor, performed sensitivity analyses to determine the importance of particular radionuclides and transport phenomena in regard to offsite consequences. The results of the two pathways were combined to prioritize gaps in current capabilities.

  7. Regulatory Technology Development Plan - Sodium Fast Reactor: Mechanistic Source Term – Trial Calculation

    Energy Technology Data Exchange (ETDEWEB)

    Grabaskas, David [Argonne National Lab. (ANL), Argonne, IL (United States). Nuclear Engineering Division; Bucknor, Matthew [Argonne National Lab. (ANL), Argonne, IL (United States). Nuclear Engineering Division; Jerden, James [Argonne National Lab. (ANL), Argonne, IL (United States). Nuclear Engineering Division; Brunett, Acacia J. [Argonne National Lab. (ANL), Argonne, IL (United States). Nuclear Engineering Division; Denman, Matthew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Nuclear Engineering Division; Clark, Andrew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Nuclear Engineering Division; Denning, Richard S. [Consultant, Columbus, OH (United States)

    2016-10-01

    The potential release of radioactive material during a plant incident, referred to as the source term, is a vital design metric and will be a major focus of advanced reactor licensing. The U.S. Nuclear Regulatory Commission has stated an expectation for advanced reactor vendors to present a mechanistic assessment of the potential source term in their license applications. The mechanistic source term presents an opportunity for vendors to realistically assess the radiological consequences of an incident, and may allow reduced emergency planning zones and smaller plant sites. However, the development of a mechanistic source term for advanced reactors is not without challenges, as there are often numerous phenomena impacting the transportation and retention of radionuclides. This project sought to evaluate U.S. capabilities regarding the mechanistic assessment of radionuclide release from core damage incidents at metal fueled, pool-type sodium fast reactors (SFRs). The purpose of the analysis was to identify and prioritize any gaps regarding computational tools or data necessary for the modeling of radionuclide transport and retention phenomena. To accomplish this task, a parallel-path analysis approach was utilized. One path, led by Argonne and Sandia National Laboratories, sought to perform a mechanistic source term assessment using available codes, data, and models, with the goal of identifying gaps in the current knowledge base. The second path, performed by an independent contractor, carried out sensitivity analyses to determine the importance of particular radionuclides and transport phenomena in regard to offsite consequences. The results of the two pathways were combined to prioritize gaps in current capabilities.

  8. Low-level radioactive waste source terms for the 1992 integrated data base

    International Nuclear Information System (INIS)

    Loghry, S.L.; Kibbey, A.H.; Godbee, H.W.; Icenhour, A.S.; DePaoli, S.M.

    1995-01-01

    This technical manual presents updated generic source terms (i.e., unitized amounts and radionuclide compositions) which have been developed for use in the Integrated Data Base (IDB) Program of the U.S. Department of Energy (DOE). These source terms were used in the IDB annual report, Integrated Data Base for 1992: Spent Fuel and Radioactive Waste Inventories, Projections, and Characteristics, DOE/RW-0006, Rev. 8, October 1992. They are useful as a basis for projecting future amounts (volume and radioactivity) of low-level radioactive waste (LLW) shipped for disposal at commercial burial grounds or sent for storage at DOE solid-waste sites. Commercial fuel cycle LLW categories include boiling-water reactor, pressurized-water reactor, fuel fabrication, and uranium hexafluoride (UF6) conversion. Commercial nonfuel cycle LLW includes institutional/industrial (I/I) waste. The LLW from DOE operations is categorized as uranium/thorium, fission product, induced activity, tritium, alpha, and "other". Fuel cycle commercial LLW source terms are normalized on the basis of net electrical output [MW(e)-year], except for UF6 conversion, which is normalized on the basis of heavy metal requirement [metric tons of initial heavy metal]. The nonfuel cycle commercial LLW source term is normalized on the basis of volume (cubic meters) and radioactivity (curies) for each subclass within the I/I category. The DOE LLW is normalized in a manner similar to that for commercial I/I waste. The revised source terms are based on the best available historical data through 1992.

  9. Probabilistic analysis showing that a combination of bacteroides and methanobrevibacter source tracking markers is effective for identifying waters contaminated by human fecal pollution

    Science.gov (United States)

    Johnston, Christopher; Byappanahalli, Muruleedhara N.; Gibson, Jacqueline MacDonald; Ufnar, Jennifer A.; Whitman, Richard L.; Stewart, Jill R.

    2013-01-01

    Microbial source tracking assays to identify sources of waterborne contamination typically target genetic markers of host-specific microorganisms. However, no bacterial marker has been shown to be 100% host-specific, and cross-reactivity has been noted in studies evaluating known source samples. Using 485 challenge samples from 20 different human and animal fecal sources, this study evaluated microbial source tracking markers including the Bacteroides HF183 16S rRNA, M. smithii nifH, and Enterococcus esp gene targets that have been proposed as potential indicators of human fecal contamination. Bayes' Theorem was used to calculate the conditional probability that these markers or a combination of markers can correctly identify human sources of fecal pollution. All three human-associated markers were detected in 100% of the sewage samples analyzed. Bacteroides HF183 was the most effective marker for determining whether contamination was specifically from a human source, and greater than 98% certainty that contamination was from a human source was shown when both Bacteroides HF183 and M. smithii nifH markers were present. A high degree of certainty was attained even in cases where the prior probability of human fecal contamination was as low as 8.5%. The combination of Bacteroides HF183 and M. smithii nifH source tracking markers can help identify surface waters impacted by human fecal contamination, information useful for prioritizing restoration activities or assessing health risks from exposure to contaminated waters.
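
    A minimal sketch of the Bayes'-theorem update described above is given below. The sensitivities and cross-reactivity rates are hypothetical placeholders (the study derives marker-specific values from 485 challenge samples), and the sequential update assumes the two markers behave conditionally independently given the source.

```python
def posterior_human_source(prior, sens, false_pos, detected=True):
    """Posterior probability of a human fecal source given one marker result, via Bayes' theorem.

    prior     : prior probability of human fecal contamination
    sens      : P(marker detected | human source)       -- sensitivity
    false_pos : P(marker detected | non-human source)   -- cross-reactivity rate
    """
    if detected:
        num = sens * prior
        den = sens * prior + false_pos * (1.0 - prior)
    else:
        num = (1.0 - sens) * prior
        den = (1.0 - sens) * prior + (1.0 - false_pos) * (1.0 - prior)
    return num / den

# Hypothetical marker performance (illustrative only, not values from the study)
prior = 0.085                                              # low prior, as in the abstract
p = posterior_human_source(prior, sens=1.00, false_pos=0.05)   # HF183-like marker positive
p = posterior_human_source(p, sens=0.95, false_pos=0.04)       # nifH-like marker also positive
print(f"posterior after both markers: {p:.3f}")
```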

  10. Effectiveness of Partition and Graph Theoretic Clustering Algorithms for Multiple Source Partial Discharge Pattern Classification Using Probabilistic Neural Network and Its Adaptive Version: A Critique Based on Experimental Studies

    Directory of Open Access Journals (Sweden)

    S. Venkatesh

    2012-01-01

    Partial discharge (PD) is a major cause of failure of power apparatus and hence its measurement and analysis have emerged as a vital field in assessing the condition of the insulation system. Several efforts have been undertaken by researchers to classify PD pulses utilizing artificial intelligence techniques. Recently, the focus has shifted to the identification of multiple sources of PD since it is often encountered in real-time measurements. Studies have indicated that classification of multi-source PD becomes difficult with the degree of overlap and that several techniques such as mixed Weibull functions, neural networks, and wavelet transformation have been attempted with limited success. Since digital PD acquisition systems record data for a substantial period, the database becomes large, posing considerable difficulties during classification. This research work aims firstly at analyzing aspects concerning classification capability during the discrimination of multisource PD patterns. Secondly, it attempts at extending the previous work of the authors in utilizing the novel approach of probabilistic neural network versions for classifying moderate sets of PD sources to that of large sets. The third focus is on comparing the ability of partition-based algorithms, namely, the labelled (learning vector quantization) and unlabelled (K-means) versions, with that of a novel hypergraph-based clustering method in providing parsimonious sets of centers during classification.

  11. Algorithms and analytical solutions for rapidly approximating long-term dispersion from line and area sources

    Science.gov (United States)

    Barrett, Steven R. H.; Britter, Rex E.

    Predicting long-term mean pollutant concentrations in the vicinity of airports, roads and other industrial sources is frequently of concern in regulatory and public health contexts. Many emissions are represented geometrically as ground-level line or area sources. Well-developed modelling tools such as AERMOD and ADMS are able to model dispersion from finite (i.e. non-point) sources with considerable accuracy, drawing upon an up-to-date understanding of boundary layer behaviour. Due to mathematical difficulties associated with line and area sources, computationally expensive numerical integration schemes have been developed. For example, some models decompose area sources into a large number of line sources orthogonal to the mean wind direction, for which an analytical (Gaussian) solution exists. Models also employ a time-series approach, which involves computing mean pollutant concentrations for every hour over one or more years of meteorological data. This can give rise to computer runtimes of several days for assessment of a site. While this may be acceptable for assessment of a single industrial complex, airport, etc., this level of computational cost precludes national or international policy assessments at the level of detail available with dispersion modelling. In this paper, we extend previous work [S.R.H. Barrett, R.E. Britter, 2008. Development of algorithms and approximations for rapid operational air quality modelling. Atmospheric Environment 42 (2008) 8105-8111] to line and area sources. We introduce approximations which allow for the development of new analytical solutions for long-term mean dispersion from line and area sources, based on hypergeometric functions. We describe how these solutions can be parameterized from a single point source run from an existing advanced dispersion model, thereby accounting for all processes modelled in the more costly algorithms. The parameterization method combined with the analytical solutions for long-term mean
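
    The decomposition mentioned above (a line or area source treated as a superposition of point sources with an analytical Gaussian crosswind solution) can be illustrated with a short numerical check: the sketch below compares quadrature over point sources with the closed-form crosswind integral for a finite ground-level line source. The power-law dispersion coefficients are hypothetical stand-ins, and this is not the hypergeometric-function solution developed in the paper.

```python
import numpy as np
from math import erf, sqrt, pi

# Hypothetical power-law dispersion coefficients (illustrative only, not from the paper)
def sigma_y(x): return 0.08 * x**0.9
def sigma_z(x): return 0.06 * x**0.85

def point_conc(Q, x, y, u):
    """Ground-level concentration from a ground-level point source (full reflection)."""
    sy, sz = sigma_y(x), sigma_z(x)
    return Q / (pi * sy * sz * u) * np.exp(-y**2 / (2.0 * sy**2))

def line_conc_numeric(q_l, L, x, u, n=2000):
    """Crosswind line source of length L, receptor on its centreline, by quadrature."""
    ys = np.linspace(-L / 2, L / 2, n)
    return np.trapz(point_conc(q_l, x, ys, u), ys)

def line_conc_analytic(q_l, L, x, u):
    """Same integral evaluated analytically (error function)."""
    sy, sz = sigma_y(x), sigma_z(x)
    return q_l * sqrt(2.0 / pi) / (sz * u) * erf(L / (2.0 * sqrt(2.0) * sy))

q_l, L, x, u = 1.0e-3, 50.0, 500.0, 4.0   # kg/s per m, m, m, m/s (hypothetical)
print(line_conc_numeric(q_l, L, x, u), line_conc_analytic(q_l, L, x, u))
```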

  12. Evaluation of short- and long-term fission product sources at the Fukushima Daiichi NPP

    International Nuclear Information System (INIS)

    Uchida, Shunsuke; Naitoh, Masanori; Suzuki, Hiroaki; Okada, Hidetoshi; Pellegrini, Marco; Achilli, Andrea; Hanamoto, Yukio; Sasaki, Hiroaki

    2014-01-01

    Research on fission product (FP) behavior used to be one of the most important subjects in water chemistry, but it is seldom pursued nowadays as a consequence of the increased integrity of nuclear fuels and the minimization of FP release into the environment. Evaluation of FP release into the environment is, however, still one of the key issues for severe accident analysis. Although there has been a long quiet period in nuclear safety research, how to detect the initiation of severe accidents, how to prevent them and how to mitigate them are still important subjects for nuclear engineering, and how to control severe accidents after their occurrence, especially how to control FP release into the environment, has seldom been discussed in the water chemistry group recently. The paper is intended to address this reduced level of activity in FP studies. FP sources are divided into two categories, short- and long-term FP sources. The short-term FP source can be evaluated based on the measured data obtained from monitoring posts (MPs), which give clear evidence of the importance of radioactive iodine and cesium releases into the environment. It used to be considered that during primary containment vessel (PCV) venting, the release of each element, e.g., iodine and cesium, was determined by the suppression pool scrubbing efficiency, and that most of the cesium would likely be removed in the pool due to its large scrubbing efficiency. However, analysis of the MP data from the early stage of the Fukushima Daiichi nuclear power plant (NPP) accident confirmed that the releases of both elements were in proportion to their inventories in the reactors and that their scrubbing efficiencies were almost the same. The scrubbing efficiency, which increased with the pool water temperature, became almost the same for iodine and cesium around the boiling temperature of the pool water. As a result of the mass balance analysis for FPs in the contaminated water accumulated at the Fukushima Daiichi plant site, it

  13. LS-APC v1.0: a tuning-free method for the linear inverse problem and its application to source-term determination

    Directory of Open Access Journals (Sweden)

    O. Tichý

    2016-11-01

    Estimation of pollutant releases into the atmosphere is an important problem in the environmental sciences. It is typically formalized as an inverse problem using a linear model that can explain observable quantities (e.g., concentrations or deposition values) as a product of the source-receptor sensitivity (SRS) matrix obtained from an atmospheric transport model multiplied by the unknown source-term vector. Since this problem is typically ill-posed, current state-of-the-art methods are based on regularization of the problem and solution of a formulated optimization problem. This procedure depends on manual settings of uncertainties that are often very poorly quantified, effectively making them tuning parameters. We formulate a probabilistic model that has the same maximum likelihood solution as the conventional method using pre-specified uncertainties. Replacement of the maximum likelihood solution by full Bayesian estimation also allows estimation of all tuning parameters from the measurements. The estimation procedure is based on the variational Bayes approximation, which is evaluated by an iterative algorithm. The resulting method is thus very similar to the conventional approach, but with the possibility to also estimate all tuning parameters from the observations. The proposed algorithm is tested and compared with the standard methods on data from the European Tracer Experiment (ETEX) where advantages of the new method are demonstrated. A MATLAB implementation of the proposed algorithm is available for download.

  14. LS-APC v1.0: a tuning-free method for the linear inverse problem and its application to source-term determination

    Science.gov (United States)

    Tichý, Ondřej; Šmídl, Václav; Hofman, Radek; Stohl, Andreas

    2016-11-01

    Estimation of pollutant releases into the atmosphere is an important problem in the environmental sciences. It is typically formalized as an inverse problem using a linear model that can explain observable quantities (e.g., concentrations or deposition values) as a product of the source-receptor sensitivity (SRS) matrix obtained from an atmospheric transport model multiplied by the unknown source-term vector. Since this problem is typically ill-posed, current state-of-the-art methods are based on regularization of the problem and solution of a formulated optimization problem. This procedure depends on manual settings of uncertainties that are often very poorly quantified, effectively making them tuning parameters. We formulate a probabilistic model that has the same maximum likelihood solution as the conventional method using pre-specified uncertainties. Replacement of the maximum likelihood solution by full Bayesian estimation also allows estimation of all tuning parameters from the measurements. The estimation procedure is based on the variational Bayes approximation, which is evaluated by an iterative algorithm. The resulting method is thus very similar to the conventional approach, but with the possibility to also estimate all tuning parameters from the observations. The proposed algorithm is tested and compared with the standard methods on data from the European Tracer Experiment (ETEX) where advantages of the new method are demonstrated. A MATLAB implementation of the proposed algorithm is available for download.
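
    A minimal sketch of the underlying linear inverse problem is shown below: a synthetic source-receptor sensitivity matrix, a nonnegative source-term vector to be recovered, and a Tikhonov-regularized nonnegative least-squares solve whose answer depends visibly on the regularization weight. This is the tuning-parameter dependence that LS-APC removes by estimating such parameters within a variational Bayes framework; the sketch is not the LS-APC algorithm itself, and all matrices and numbers are synthetic.

```python
import numpy as np
from scipy.optimize import nnls

# Synthetic example: recover a nonnegative source-term vector x from y = M x + noise.
rng = np.random.default_rng(0)
n_obs, n_src = 200, 40
M = rng.lognormal(mean=-2.0, sigma=1.0, size=(n_obs, n_src))     # stand-in SRS matrix
x_true = np.zeros(n_src); x_true[15:20] = [1.0, 3.0, 5.0, 3.0, 1.0]  # a short release episode
y = M @ x_true + 0.01 * rng.standard_normal(n_obs)

def regularized_nnls(M, y, alpha):
    """Tikhonov-regularized, nonnegative least squares:
       min_x ||y - M x||^2 + alpha ||x||^2  subject to  x >= 0,
    solved by augmenting the system and calling NNLS."""
    A = np.vstack([M, np.sqrt(alpha) * np.eye(M.shape[1])])
    b = np.concatenate([y, np.zeros(M.shape[1])])
    x, _ = nnls(A, b)
    return x

for alpha in (1e-4, 1e-2, 1.0):   # the manual "tuning parameter" LS-APC aims to infer automatically
    x_hat = regularized_nnls(M, y, alpha)
    print(alpha, np.round(x_hat[13:22], 2))
```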

  15. Evaluating bacterial gene-finding HMM structures as probabilistic logic programs.

    Science.gov (United States)

    Mørk, Søren; Holmes, Ian

    2012-03-01

    Probabilistic logic programming offers a powerful way to describe and evaluate structured statistical models. To investigate the practicality of probabilistic logic programming for structure learning in bioinformatics, we undertook a simplified bacterial gene-finding benchmark in PRISM, a probabilistic dialect of Prolog. We evaluate Hidden Markov Model structures for bacterial protein-coding gene potential, including a simple null model structure, three structures based on existing bacterial gene finders and two novel model structures. We test standard versions as well as ADPH length modeling and three-state versions of the five model structures. The models are all represented as probabilistic logic programs and evaluated using the PRISM machine learning system in terms of statistical information criteria and gene-finding prediction accuracy, in two bacterial genomes. Neither of our implementations of the two currently most used model structures is best performing in terms of statistical information criteria or prediction performance, suggesting that better-fitting models might be achievable. The source code of all PRISM models, data and additional scripts are freely available for download at: http://github.com/somork/codonhmm. Supplementary data are available at Bioinformatics online.
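
    To make the model-scoring step concrete, the sketch below evaluates a toy two-state discrete HMM on a random nucleotide sequence with the scaled forward algorithm and reports a BIC-style information criterion. It is a plain Python/numpy illustration with hypothetical parameters, not one of the PRISM logic programs evaluated in the paper.

```python
import numpy as np

def forward_loglik(obs, pi, A, B):
    """Log-likelihood of an observation sequence under a discrete HMM
    (scaled forward algorithm). pi: initial probs, A: transitions, B: emissions."""
    alpha = pi * B[:, obs[0]]
    loglik = np.log(alpha.sum()); alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        loglik += np.log(alpha.sum()); alpha /= alpha.sum()
    return loglik

def bic(loglik, n_params, n_obs):
    return -2.0 * loglik + n_params * np.log(n_obs)

# Toy 2-state "coding vs non-coding" model over a 4-letter alphabet (hypothetical numbers)
pi = np.array([0.5, 0.5])
A  = np.array([[0.99, 0.01],
               [0.02, 0.98]])
B  = np.array([[0.30, 0.20, 0.20, 0.30],    # state 0 emission probs (A, C, G, T)
               [0.15, 0.35, 0.35, 0.15]])   # state 1 emission probs
rng = np.random.default_rng(1)
seq = rng.integers(0, 4, size=5000)
ll = forward_loglik(seq, pi, A, B)
# free parameters: (K-1) initial + K(K-1) transition + K(M-1) emission = 1 + 2 + 6 = 9
print("log-likelihood:", ll, " BIC:", bic(ll, n_params=9, n_obs=len(seq)))
```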

  16. Scoping-level Probabilistic Safety Assessment of a complex experimental facility: Challenges and first results from the application to a neutron source facility (MEGAPIE)

    International Nuclear Information System (INIS)

    Podofillini, L.; Dang, V.N.; Thomsen, K.

    2008-01-01

    This paper presents a scoping-level application of Probabilistic Safety Assessment (PSA) to selected systems of a complex experimental facility. In performing a PSA for this type of facility, a number of challenges arise, mainly due to the extensive use of electronic and programmable components and of one-of-a-kind components. The experimental facility is the Megawatt Pilot Target Experiment (MEGAPIE), which was hosted at the Paul Scherrer Institut (PSI). MEGAPIE demonstrated the feasibility of a liquid lead-bismuth target for spallation facilities at a proton beam power level of 1 MW. Given the challenges to estimate initiating event frequencies and failure event probabilities, emphasis is placed on the qualitative results obtainable from the PSA. Even though this does not allow a complete and appropriate characterization of the risk profile, some level of importance/significance evaluation was feasible, and practical and detailed recommendations on potential system improvements were derived. The second part of the work reports on a preliminary quantification of the facility risk. This provides more information on risk significance, which allows prioritizing the insights and recommendations obtained from the PSA. At the present stage, the limited knowledge on initiating and failure events is reflected in the uncertainties in their probabilities as well as in inputs quantified with bounding values. Detailed analyses to improve the quantification of these inputs, many of which turn out to be important contributors, were out of the scope of this study. Consequently, the reported results should be primarily considered as a demonstration of how quantification of the facility risk by a PSA can support risk-informed decisions, rather than precise figures of the facility risk

  17. Validation of in vitro probabilistic tractography

    DEFF Research Database (Denmark)

    Dyrby, Tim B.; Sogaard, L.V.; Parker, G.J.

    2007-01-01

    assessed the anatomical validity and reproducibility of in vitro multi-fiber probabilistic tractography against two invasive tracers: the histochemically detectable biotinylated dextran amine and manganese enhanced magnetic resonance imaging. Post mortem DWI was used to ensure that most of the sources...

  18. Inventory and source term evaluation of Russian nuclear power plants for marine applications

    International Nuclear Information System (INIS)

    Reistad, O.; Oelgaard, P.L.

    2006-04-01

    This report discusses inventory and source term properties in regard to operation and possible releases due to accidents from Russian marine reactor systems. The first part of the report discusses relevant accidents on the basis of both Russian and western sources. The overview shows that certain vessels were much more accident prone than others; in addition, there has been a noteworthy reduction in accidents over the last two decades. However, in recent years new types of incidents, such as collisions, have occurred more frequently. The second part of the study considers in detail the most important factors for the source term: reactor operational characteristics and the radionuclide inventory. While Russian icebreakers have been operated on a similar basis to commercial power plants, the submarines have different power cyclograms, which result in considerably lower values for the fission product inventory. Theoretical values for the radionuclide inventory are compared with computed results using the modelling tool HELIOS. Regarding the inventory of transuranic elements, the results of the calculations are discussed in detail for selected vessels. Criticality accidents, loss-of-cooling accidents and sinking accidents are considered, based on actual experience with these types of accident and on theoretical considerations, and source terms for these accidents are discussed in the last chapter. (au)

  19. Inventory and source term evaluation of Russian nuclear power plants for marine applications

    Energy Technology Data Exchange (ETDEWEB)

    Reistad, O. [Norwegian Radiation Protection Authority (Norway); Oelgaard, P.L. [Risoe National Lab. (Denmark)

    2006-04-15

    This report discusses inventory and source term properties in regard to operation and possible releases due to accidents from Russian marine reactor systems. The first part of the report discusses relevant accidents on the basis of both Russian and western sources. The overview shows that certain vessels were much more accident prone than others; in addition, there has been a noteworthy reduction in accidents over the last two decades. However, in recent years new types of incidents, such as collisions, have occurred more frequently. The second part of the study considers in detail the most important factors for the source term: reactor operational characteristics and the radionuclide inventory. While Russian icebreakers have been operated on a similar basis to commercial power plants, the submarines have different power cyclograms, which result in considerably lower values for the fission product inventory. Theoretical values for the radionuclide inventory are compared with computed results using the modelling tool HELIOS. Regarding the inventory of transuranic elements, the results of the calculations are discussed in detail for selected vessels. Criticality accidents, loss-of-cooling accidents and sinking accidents are considered, based on actual experience with these types of accident and on theoretical considerations, and source terms for these accidents are discussed in the last chapter. (au)

  20. Accident source terms for Light-Water Nuclear Power Plants. Final report

    International Nuclear Information System (INIS)

    Soffer, L.; Burson, S.B.; Ferrell, C.M.; Lee, R.Y.; Ridgely, J.N.

    1995-02-01

    In 1962 the US Atomic Energy Commission published TID-14844, "Calculation of Distance Factors for Power and Test Reactors", which specified a release of fission products from the core to the reactor containment for a postulated accident involving "substantial meltdown of the core". This "source term", the basis for the NRC's Regulatory Guides 1.3 and 1.4, has been used to determine compliance with the NRC's reactor site criteria, 10 CFR Part 100, and to evaluate other important plant performance requirements. During the past 30 years substantial additional information on fission product releases has been developed based on significant severe accident research. This document utilizes this research by providing more realistic estimates of the "source term" release into containment, in terms of timing, nuclide types, quantities and chemical form, given a severe core-melt accident. This revised "source term" is to be applied to the design of future light water reactors (LWRs). Current LWR licensees may voluntarily propose applications based upon it.

  1. Loss of confinement of liquefied gases. Evaluation of the source term; Perte de confinement de gaz liquefies. Evaluation du terme source

    Energy Technology Data Exchange (ETDEWEB)

    Alix, P.; Novat, E.; Hocquet, J.; Bigot, J.P. [Ecole Nationale Superieure des Mines, Centre SPIN, 42 - Saint-Etienne (France)

    2001-07-01

    In this work, a corresponding-states law is applied to flow-rate measurements of two-phase flows performed with five different fluids (water, butane, R11, ethyl acetate, methanol). This shows that the critical mass flux (which is used as the source term in the scenario of loss of confinement of liquefied gas reservoirs) is a 'universal' function of the reduced initial pressure P0*, valid for most of the single-constituent fluids of the process industry. It is therefore easy to make a relatively precise estimate of the critical mass flux (uncertainty < 20% for P0* < 15%) without the need for any model. It is also shown that no improvement of the models can be expected from the use of vaporization kinetics. On the contrary, a qualitative argument indicates that the use of slip seems more promising. (J.S.)

  2. A well-balanced scheme for Ten-Moment Gaussian closure equations with source term

    Science.gov (United States)

    Meena, Asha Kumari; Kumar, Harish

    2018-02-01

    In this article, we consider the Ten-Moment equations with source term, which occurs in many applications related to plasma flows. We present a well-balanced second-order finite volume scheme. The scheme is well-balanced for general equation of state, provided we can write the hydrostatic solution as a function of the space variables. This is achieved by combining hydrostatic reconstruction with contact preserving, consistent numerical flux, and appropriate source discretization. Several numerical experiments are presented to demonstrate the well-balanced property and resulting accuracy of the proposed scheme.

  3. A Geometric Presentation of Probabilistic Satisfiability

    OpenAIRE

    Morales-Luna, Guillermo

    2010-01-01

    By considering probability distributions over the set of assignments, the expected truth values assigned to propositional variables are extended through linear operators, and the expected truth values of the clauses in any given conjunctive form are also extended through linear maps. The probabilistic satisfiability problems are discussed in terms of the introduced linear extensions. The case of multiple truth values is also discussed.

  4. Probabilistic solution of the Dirac equation

    International Nuclear Information System (INIS)

    Blanchard, P.; Combe, P.

    1985-01-01

    Various probabilistic representations of the 2, 3 and 4 dimensional Dirac equation are given in terms of expectation with respect to stochastic jump processes and are used to derive the nonrelativistic limit even in the presence of an external electromagnetic field. (orig.)

  5. Probabilistic record linkage.

    Science.gov (United States)

    Sayers, Adrian; Ben-Shlomo, Yoav; Blom, Ashley W; Steele, Fiona

    2016-06-01

    Studies involving the use of probabilistic record linkage are becoming increasingly common. However, the methods underpinning probabilistic record linkage are not widely taught or understood, and therefore these studies can appear to be a 'black box' research tool. In this article, we aim to describe the process of probabilistic record linkage through a simple exemplar. We first introduce the concept of deterministic linkage and contrast this with probabilistic linkage. We illustrate each step of the process using a simple exemplar and describe the data structure required to perform a probabilistic linkage. We describe the process of calculating and interpreting matched weights and how to convert matched weights into posterior probabilities of a match using Bayes theorem. We conclude this article with a brief discussion of some of the computational demands of record linkage, how you might assess the quality of your linkage algorithm, and how epidemiologists can maximize the value of their record-linked research using robust record linkage methods. © The Author 2015; Published by Oxford University Press on behalf of the International Epidemiological Association.
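
    A minimal sketch of the weight calculation and Bayes'-theorem conversion described above (in the spirit of the Fellegi-Sunter framework) is given below; the m- and u-probabilities, fields, and prior are hypothetical.

```python
import math

def match_weight(m, u, agree):
    """Field-level match weight: log2(m/u) on agreement, log2((1-m)/(1-u)) on disagreement.
    m = P(field agrees | records truly match), u = P(field agrees | records do not match)."""
    return math.log2(m / u) if agree else math.log2((1 - m) / (1 - u))

def posterior_match(total_weight, prior):
    """Convert a summed match weight into a posterior match probability via Bayes' theorem."""
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * 2.0 ** total_weight
    return posterior_odds / (1 + posterior_odds)

# Hypothetical m/u probabilities for three comparison fields
fields = {"surname":   (0.95, 0.01),
          "birthyear": (0.98, 0.05),
          "postcode":  (0.90, 0.02)}
agreement = {"surname": True, "birthyear": True, "postcode": False}

w = sum(match_weight(m, u, agreement[f]) for f, (m, u) in fields.items())
print("total weight:", round(w, 2), " P(match):", round(posterior_match(w, prior=1e-4), 4))
```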

  6. Overview of waste isolation safety assessment program and description of source term characterization task at PNL

    International Nuclear Information System (INIS)

    Bradley, D.

    1977-01-01

    A project is being conducted to develop and illustrate the methods and obtain the data necessary to assess the safety of long-term disposal of high-level radioactive waste in geologic formations. The methods and data will initially focus on generic geologic isolation systems but will ultimately be applied to the long-term safety assessment of specific candidate sites that are selected in the NWTS Program. The activities of waste isolation safety assessment (WISAP) are divided into six tasks: (1) Safety Assessment Concepts and Methods, (2) Disruptive Event Analysis, (3) Source Characterization, (4) Transport Modeling, (5) Transport Data and (6) Societal Acceptance

  7. A Source Term for Wave Attenuation by Sea Ice in WAVEWATCH III®: IC4

    Science.gov (United States)

    2017-06-07

    [Report front-matter and figure residue; recoverable details follow.] Figure caption fragment: wave spectra at four locations in the ice (1, 2, 5, and 10 km), showing steepening of the high-frequency face and a shift of the peak to slightly lower frequencies. Authors: Collins, Clarence O., III; Rogers, W. Erick (Ocean Dynamics and Prediction Branch, Naval Research Laboratory). Keywords: wave model; sea ice; ocean surface waves; Arctic Ocean; WAVEWATCH III; spectral wave modeling; source terms; wave hindcasting.

  8. Integrating probabilistic models of perception and interactive neural networks: a historical and tutorial review.

    Science.gov (United States)

    McClelland, James L

    2013-01-01

    This article seeks to establish a rapprochement between explicitly Bayesian models of contextual effects in perception and neural network models of such effects, particularly the connectionist interactive activation (IA) model of perception. The article is in part an historical review and in part a tutorial, reviewing the probabilistic Bayesian approach to understanding perception and how it may be shaped by context, and also reviewing ideas about how such probabilistic computations may be carried out in neural networks, focusing on the role of context in interactive neural networks, in which both bottom-up and top-down signals affect the interpretation of sensory inputs. It is pointed out that connectionist units that use the logistic or softmax activation functions can exactly compute Bayesian posterior probabilities when the bias terms and connection weights affecting such units are set to the logarithms of appropriate probabilistic quantities. Bayesian concepts such as the prior, likelihood, (joint and marginal) posterior, probability matching and maximizing, and calculating vs. sampling from the posterior are all reviewed and linked to neural network computations. Probabilistic and neural network models are explicitly linked to the concept of a probabilistic generative model that describes the relationship between the underlying target of perception (e.g., the word intended by a speaker or other source of sensory stimuli) and the sensory input that reaches the perceiver for use in inferring the underlying target. It is shown how a new version of the IA model called the multinomial interactive activation (MIA) model can sample correctly from the joint posterior of a proposed generative model for perception of letters in words, indicating that interactive processing is fully consistent with principled probabilistic computation. Ways in which these computations might be realized in real neural systems are also considered.
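
    The stated equivalence between softmax units and exact posterior computation can be checked directly: with biases set to log priors (plus the 'feature absent' log-likelihood terms) and weights set to log-likelihood ratios, the softmax output equals the Bayesian posterior. The sketch below does this for a hypothetical three-cause, three-binary-feature generative model; it is an illustration of the principle, not the MIA model itself.

```python
import numpy as np

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

# Generative model: hidden cause h in {0,1,2} with priors, binary features with
# per-cause likelihoods (all numbers hypothetical).
prior = np.array([0.5, 0.3, 0.2])
p_feat_given_h = np.array([[0.9, 0.2, 0.1],
                           [0.4, 0.8, 0.3],
                           [0.1, 0.3, 0.7]])   # rows: causes, cols: features
x = np.array([1, 0, 1])                        # observed binary feature vector

# Exact Bayesian posterior over causes
lik = np.prod(p_feat_given_h**x * (1 - p_feat_given_h)**(1 - x), axis=1)
posterior = prior * lik / np.sum(prior * lik)

# Same computation as a softmax unit: bias = log prior (+ "feature absent" terms),
# weights = log-likelihood ratios for each present feature
bias = np.log(prior) + np.log(1 - p_feat_given_h).sum(axis=1)
W = np.log(p_feat_given_h) - np.log(1 - p_feat_given_h)
net = bias + W @ x
print(np.allclose(softmax(net), posterior))    # True: the softmax unit outputs the posterior
```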

  9. Probabilistic numerics and uncertainty in computations.

    Science.gov (United States)

    Hennig, Philipp; Osborne, Michael A; Girolami, Mark

    2015-07-08

    We deliver a call to arms for probabilistic numerical methods: algorithms for numerical tasks, including linear algebra, integration, optimization and solving differential equations, that return uncertainties in their calculations. Such uncertainties, arising from the loss of precision induced by numerical calculation with limited time or hardware, are important for much contemporary science and industry. Within applications such as climate science and astrophysics, the need to make decisions on the basis of computations with large and complex data has led to a renewed focus on the management of numerical uncertainty. We describe how several seminal classic numerical methods can be interpreted naturally as probabilistic inference. We then show that the probabilistic view suggests new algorithms that can flexibly be adapted to suit application specifics, while delivering improved empirical performance. We provide concrete illustrations of the benefits of probabilistic numeric algorithms on real scientific problems from astrometry and astronomical imaging, while highlighting open problems with these new algorithms. Finally, we describe how probabilistic numerical methods provide a coherent framework for identifying the uncertainty in calculations performed with a combination of numerical algorithms (e.g. both numerical optimizers and differential equation solvers), potentially allowing the diagnosis (and control) of error sources in computations.

  10. Calculation of the isotope concentrations, source terms and radiation shielding of the SAFARI-1 irradiation products

    International Nuclear Information System (INIS)

    Stoker, C.C.; Ball, G.

    2000-01-01

    The ever increasing expansion of the irradiation product portfolio of the SAFARI-1 reactor leads to the need to routinely calculate the radio-isotope concentrations and source terms for the materials irradiated in the reactor accurately. In addition to this, the required shielding for the transportation and processing of these irradiation products needs to be determined. In this paper the calculational methodology applied is described with special attention given to the spectrum dependence of the one-group cross sections of selected SAFARI-1 irradiation materials and the consequent effect on the determination of the isotope concentrations and source terms. Comparisons of the calculated isotopic concentrations and dose rates with experimental analysis and measurements provide confidence in the calculational methodologies and data used. (author)

  11. Source terms for analysis of accidents at a high level waste repository

    International Nuclear Information System (INIS)

    Mubayi, V.; Davis, R.E.; Youngblood, R.

    1989-01-01

    This paper describes an approach to identifying source terms from possible accidents during the preclosure phase of a high-level nuclear waste repository. A review of the literature on repository safety analyses indicated that source term estimation is in a preliminary stage, largely based on judgement-based scoping analyses. The approach developed here was to partition the accident space into domains defined by certain threshold values of temperature and impact energy density which may arise in potential accidents and specify release fractions of various radionuclides, present in the waste form, in each domain. Along with a more quantitative understanding of accident phenomenology, this approach should help in achieving a clearer perspective on scenarios important to preclosure safety assessments of geologic repositories. 18 refs., 3 tabs

  12. Final report of the inter-institutional project ININ-CNSNS 'Source Terms Specific for the CNLV'

    International Nuclear Information System (INIS)

    Anaya M, R.A.

    1991-02-01

    The purpose of the inter-institutional ININ-CNSNS project 'Source Terms Specific for the CNLV' is to install the Source Term Code Package (STCP) on the ININ CYBER (CDC 180-830) computer, to carry out the corresponding installation and operation tests using the sample problem data, and finally to release the package once analysis of the results shows this to be appropriate. This report presents the results of simulating the sequences 'Loss of off-site power (station blackout)' and 'Total loss of AC power with failure of the RCIC and success of the HPCS', both with data from the Laguna Verde nuclear power plant. (Author)

  13. The SSI TOOLBOX Source Term Model SOSIM - Screening for important radionuclides and parameter sensitivity analysis

    Energy Technology Data Exchange (ETDEWEB)

    Avila Moreno, R.; Barrdahl, R.; Haegg, C.

    1995-05-01

    The main objective of the present study was to carry out a screening and a sensitivity analysis of the SSI TOOLBOX source term model SOSIM. This model is a part of the SSI TOOLBOX for radiological impact assessment of the Swedish disposal concept for high-level waste, KBS-3. The outputs of interest for this purpose were: the total released fraction, the time of total release, the time and value of the maximum release rate, and the dose rates after direct releases to the biosphere. The source term equations were derived, and simple equations and methods were proposed for calculation of these quantities. A literature survey has been performed in order to determine a characteristic variation range and a nominal value for each model parameter. In order to reduce the model uncertainties the authors recommend a change in the initial boundary condition for the solution of the diffusion equation for highly soluble nuclides. 13 refs.

  14. New Source Term Model for the RESRAD-OFFSITE Code Version 3

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Charley [Argonne National Lab. (ANL), Argonne, IL (United States); Gnanapragasam, Emmanuel [Argonne National Lab. (ANL), Argonne, IL (United States); Cheng, Jing-Jy [Argonne National Lab. (ANL), Argonne, IL (United States); Kamboj, Sunita [Argonne National Lab. (ANL), Argonne, IL (United States); Chen, Shih-Yew [Argonne National Lab. (ANL), Argonne, IL (United States)

    2013-06-01

    This report documents the new source term model developed and implemented in Version 3 of the RESRAD-OFFSITE code. This new source term model includes: (1) "first order release with transport" option, in which the release of the radionuclide is proportional to the inventory in the primary contamination and the user-specified leach rate is the proportionality constant, (2) "equilibrium desorption release" option, in which the user specifies the distribution coefficient which quantifies the partitioning of the radionuclide between the solid and aqueous phases, and (3) "uniform release" option, in which the radionuclides are released from a constant fraction of the initially contaminated material during each time interval and the user specifies the duration over which the radionuclides are released.
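
    A minimal sketch of the three release options as they are described above is given below, with radioactive decay added for concreteness. The rates, durations, and partitioning parameters are hypothetical, and the functions are illustrative stand-ins rather than the actual RESRAD-OFFSITE implementation.

```python
import numpy as np

def release_first_order(inventory0, leach_rate, decay_const, t):
    """'First-order release': release rate proportional to the remaining inventory,
    with the leach rate as the proportionality constant (plus radioactive decay)."""
    inv = inventory0 * np.exp(-(leach_rate + decay_const) * t)
    return inv, leach_rate * inv

def release_uniform(inventory0, duration, decay_const, t):
    """'Uniform release': a constant fraction of the initially contaminated material
    is released per unit time over a fixed duration (decay-corrected)."""
    rate0 = inventory0 / duration
    return np.where(t <= duration, rate0 * np.exp(-decay_const * t), 0.0)

def pore_water_conc(total_conc, kd, bulk_density, moisture):
    """'Equilibrium desorption': partition between solid and water with a distribution
    coefficient Kd; returns the aqueous concentration (illustrative units)."""
    return total_conc / (moisture + kd * bulk_density)

t = np.linspace(0.0, 100.0, 5)                                    # years
inv, rate = release_first_order(1.0e10, leach_rate=0.05, decay_const=0.023, t=t)  # Cs-137-like decay constant
print(rate)
print(release_uniform(1.0e10, duration=30.0, decay_const=0.023, t=t))
print(pore_water_conc(total_conc=500.0, kd=280.0, bulk_density=1.6, moisture=0.3))
```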

  15. On the application of subcell resolution to conservation laws with stiff source terms

    International Nuclear Information System (INIS)

    Chang, S.

    1989-11-01

    LeVeque and Yee recently investigated a one-dimensional scalar conservation law with stiff source terms modeling the reacting flow problems and discovered that for the very stiff case most of the current finite difference methods developed for non-reacting flows would produce wrong solutions when there is a propagating discontinuity. A numerical scheme, essentially nonoscillatory/subcell resolution - characteristic direction (ENO/SRCD), is proposed for solving conservation laws with stiff source terms. This scheme is a modification of Harten's ENO scheme with subcell resolution, ENO/SR. The locations of the discontinuities and the characteristic directions are essential in the design. Strang's time-splitting method is used and time evolutions are done by advancing along the characteristics. Numerical experiment using this scheme shows excellent results on the model problem of LeVeque and Yee. Comparisons of the results of ENO, ENO/SR, and ENO/SRCD are also presented
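
    The sketch below sets up the LeVeque-Yee model problem (u_t + u_x = -mu u(u-1)(u-1/2)) with a plain first-order upwind step and Strang splitting for the source. It is not the ENO/SRCD scheme; rather, for large mu it typically reproduces the pathology referred to above, with the captured discontinuity propagating at a spurious speed, which is what subcell-resolution schemes are designed to cure. The grid size and mu are arbitrary choices.

```python
import numpy as np

def source_rhs(u, mu):
    return -mu * u * (u - 1.0) * (u - 0.5)

def step_source(u, mu, dt, nsub=50):
    """Integrate the stiff ODE u' = -mu*u*(u-1)*(u-0.5) with small explicit sub-steps."""
    h = dt / nsub
    for _ in range(nsub):
        u = u + h * source_rhs(u, mu)
    return u

def step_advection(u, cfl):
    """First-order upwind step for u_t + u_x = 0 (wave speed +1, periodic boundaries)."""
    return u - cfl * (u - np.roll(u, 1))

def strang_split_solve(nx=200, cfl=0.8, t_end=0.3, mu=1000.0):
    x = np.linspace(0.0, 1.0, nx, endpoint=False)
    dt = cfl * (x[1] - x[0])
    u = np.where(x < 0.3, 1.0, 0.0)            # right-moving discontinuity, exact speed 1
    t = 0.0
    while t < t_end - 1e-12:
        u = step_source(u, mu, 0.5 * dt)       # Strang splitting: half source step
        u = step_advection(u, cfl)             # full advection step
        u = step_source(u, mu, 0.5 * dt)       # half source step
        t += dt
    return x, u

x, u = strang_split_solve()
lead = np.where(u > 0.5)[0]
print("computed leading edge near x =", x[lead.max()] if lead.size else None,
      "; exact location: x = 0.6")
```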

  16. Finite volume schemes with equilibrium type discretization of source terms for scalar conservation laws

    International Nuclear Information System (INIS)

    Botchorishvili, Ramaz; Pironneau, Olivier

    2003-01-01

    We develop here a new class of finite volume schemes on unstructured meshes for scalar conservation laws with stiff source terms. The schemes are of equilibrium type, hence with uniform bounds on approximate solutions, valid in cell entropy inequalities and exact for some equilibrium states. Convergence is investigated in the framework of kinetic schemes. Numerical tests show high computational efficiency and a significant advantage over standard cell centered discretization of source terms. Equilibrium type schemes produce accurate results even on test problems for which the standard approach fails. For some numerical tests they exhibit exponential type convergence rate. In two of our numerical tests an equilibrium type scheme with 441 nodes on a triangular mesh is more accurate than a standard scheme with 5000^2 grid points

  17. Assessing the joint impact of DNAPL source-zone behavior and degradation products on the probabilistic characterization of human health risk

    Science.gov (United States)

    Henri, Christopher V.; Fernàndez-Garcia, Daniel; de Barros, Felipe P. J.

    2016-02-01

    The release of industrial contaminants into the subsurface has led to a rapid degradation of groundwater resources. Contamination caused by Dense Non-Aqueous Phase Liquids (DNAPLs) is particularly severe owing to their limited solubility, slow dissolution and in many cases high toxicity. A greater insight into how the DNAPL source zone behavior and the contaminant release towards the aquifer impact human health risk is crucial for an appropriate risk management. Risk analysis is further complicated by the uncertainty in aquifer properties and contaminant conditions. This study focuses on the impact of the DNAPL release mode on the human health risk propagation along the aquifer under uncertain conditions. Contaminant concentrations released from the source zone are described using a screening approach with a set of parameters representing several scenarios of DNAPL architecture. The uncertainty in the hydraulic properties is systematically accounted for by high-resolution Monte Carlo simulations. We simulate the release and the transport of the chlorinated solvent perchloroethylene and its carcinogenic degradation products in randomly heterogeneous porous media. The human health risk posed by the chemical mixture of these contaminants is characterized by the low-order statistics and the probability density function of common risk metrics. We show that the zone of high risk (hot spot) is independent of the DNAPL mass release mode, and that the risk amplitude is mostly controlled by heterogeneities and by the source zone architecture. The risk is lower and less uncertain when the source zone is formed mostly by ganglia than by pools. We also illustrate how the source zone efficiency (intensity of the water flux crossing the source zone) affects the risk posed by an exposure to the chemical mixture. Results display that high source zone efficiencies are counter-intuitively beneficial, decreasing the risk because of a reduction in the time available for the production

  18. Quantification of source-term profiles from near-field geochemical models

    International Nuclear Information System (INIS)

    McKinley, I.G.

    1985-01-01

    A geochemical model of the near-field is described which quantitatively treats the processes of engineered barrier degradation, buffering of aqueous chemistry by solid phases, nuclide solubilization and transport through the near-field and release to the far-field. The radionuclide source-terms derived from this model are compared with those from a simpler model used for repository safety analysis. 10 refs., 2 figs., 2 tabs

  19. Short-Term Memory Stages in Sign vs. Speech: The Source of the Serial Span Discrepancy

    OpenAIRE

    Hall, Matthew L.

    2011-01-01

    Speakers generally outperform signers when asked to recall a list of unrelated verbal items. This phenomenon is well established, but its source has remained unclear. In this study, we evaluate the relative contribution of the three main processing stages of short-term memory – perception, encoding, and recall – in this effect. The present study factorially manipulates whether American Sign Language (ASL) or English was used for perception, memory encoding, and recall in hearing ASL-English b...

  20. Short-term memory stages in sign vs. speech: The source of the serial span discrepancy

    OpenAIRE

    Hall, Matthew L.; Bavelier, Daphné

    2011-01-01

    Speakers generally outperform signers when asked to recall a list of unrelated verbal items. This phenomenon is well established, but its source has remained unclear. In this study, we evaluate the relative contribution of the three main processing stages of short-term memory – perception, encoding, and recall – in this effect. The present study factorially manipulates whether American Sign Language (ASL) or English is used for perception, memory encoding, and recall in hearing ASL-English bi...

  1. On the sequence of core-melt accidents: Fission product release, source terms and Chernobyl release

    Energy Technology Data Exchange (ETDEWEB)

    Albrecht, H

    1986-01-01

    The first part sketches our ideas on the course of a core-melt accident in a PWR. The second part surveys the most important results on fission product release obtained from experiments at the SASCHA melt facility. The third part considers questions that are important for determining source terms for the environment, and the last part contains some considerations on the release of radioactivity from the Chernobyl reactor.

  2. Refined Source Terms in WAVEWATCH III with Wave Breaking and Sea Spray Forecasts

    Science.gov (United States)

    2015-09-30

    dissipation and breaking, nonlinear wave-wave interaction, bottom friction, wave-mud interaction, wave-current interaction as well as sea spray flux. These...shallow water outside the surf zone. After careful testing within a comprehensive suite of test bed cases, these refined source terms will be...aim to refine the parameterization of air-sea and upper ocean fluxes, including wind input and sea spray as well as dissipation, and hence improve

  3. Optimization method for identifying the source term in an inverse wave equation

    Directory of Open Access Journals (Sweden)

    Arumugam Deiveegan

    2017-08-01

    In this work, we investigate the inverse problem of identifying a space-wise dependent source term of the wave equation from measurements on the boundary. On the basis of the optimal control framework, the inverse problem is transformed into an optimization problem. The existence and necessary condition of the minimizer for the cost functional are obtained. The projected gradient method and two-parameter model function method are applied to the minimization problem and numerical results are illustrated.

  4. Reachability Analysis in Probabilistic Biological Networks.

    Science.gov (United States)

    Gabr, Haitham; Todor, Andrei; Dobra, Alin; Kahveci, Tamer

    2015-01-01

    Extra-cellular molecules trigger a response inside the cell by initiating a signal at special membrane receptors (i.e., sources), which is then transmitted to reporters (i.e., targets) through various chains of interactions among proteins. Understanding whether such a signal can reach from membrane receptors to reporters is essential in studying the cell response to extra-cellular events. This problem is drastically complicated due to the unreliability of the interaction data. In this paper, we develop a novel method, called PReach (Probabilistic Reachability), that precisely computes the probability that a signal can reach from a given collection of receptors to a given collection of reporters when the underlying signaling network is uncertain. This is a very difficult computational problem with no known polynomial-time solution. PReach represents each uncertain interaction as a bi-variate polynomial. It transforms the reachability problem to a polynomial multiplication problem. We introduce novel polynomial collapsing operators that associate polynomial terms with possible paths between sources and targets as well as the cuts that separate sources from targets. These operators significantly shrink the number of polynomial terms and thus the running time. PReach has much better time complexity than the recent solutions for this problem. Our experimental results on real data sets demonstrate that this improvement leads to orders of magnitude of reduction in the running time over the most recent methods. Availability: All the data sets used, the software implemented and the alignments found in this paper are available at http://bioinformatics.cise.ufl.edu/PReach/.
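
    PReach computes the source-to-target reachability probability exactly via polynomial collapsing; as a rough way to make the quantity itself concrete, the sketch below estimates it by Monte Carlo sampling of a small hypothetical interaction network (edge probabilities and node names are made up). For networks of realistic size this brute-force approach is exactly what PReach's polynomial formulation is designed to avoid.

```python
import random
from collections import deque

# Edges of a toy signaling network with existence probabilities (hypothetical)
edges = {("R1", "A"): 0.9, ("R2", "A"): 0.6, ("A", "B"): 0.8,
         ("B", "T1"): 0.7, ("A", "T1"): 0.3, ("B", "T2"): 0.5}
sources, targets = {"R1", "R2"}, {"T1"}

def reachable(present_edges, sources, targets):
    """BFS: can any source reach any target using only the sampled (present) edges?"""
    adj = {}
    for (u, v) in present_edges:
        adj.setdefault(u, []).append(v)
    seen, queue = set(sources), deque(sources)
    while queue:
        u = queue.popleft()
        if u in targets:
            return True
        for v in adj.get(u, []):
            if v not in seen:
                seen.add(v); queue.append(v)
    return False

def mc_reach_probability(edges, sources, targets, n_samples=20000, seed=0):
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_samples):
        sampled = [e for e, p in edges.items() if rng.random() < p]   # sample each uncertain edge
        hits += reachable(sampled, sources, targets)
    return hits / n_samples

print("P(signal reaches T1) ~", mc_reach_probability(edges, sources, targets))
```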

  5. Formalizing Probabilistic Safety Claims

    Science.gov (United States)

    Herencia-Zapana, Heber; Hagen, George E.; Narkawicz, Anthony J.

    2011-01-01

    A safety claim for a system is a statement that the system, which is subject to hazardous conditions, satisfies a given set of properties. Following work by John Rushby and Bev Littlewood, this paper presents a mathematical framework that can be used to state and formally prove probabilistic safety claims. It also enables hazardous conditions, their uncertainties, and their interactions to be integrated into the safety claim. This framework provides a formal description of the probabilistic composition of an arbitrary number of hazardous conditions and their effects on system behavior. An example is given of a probabilistic safety claim for a conflict detection algorithm for aircraft in a 2D airspace. The motivation for developing this mathematical framework is that it can be used in an automated theorem prover to formally verify safety claims.

  6. Evaluation of applicability of alternative source terms to operating nuclear power plants in Korea

    International Nuclear Information System (INIS)

    Lim, S. N.; Park, Y. S.; Nam, K. M.; Song, D. B.; Bae, Y. J.; Lee, Y. J.; Jung, C. Y.

    2002-01-01

    In 1995 and 2000, the NRC issued NUREG-1465 and Regulatory Guide 1.183 with respect to Alternative Source Terms (AST), replacing the existing source terms of TID-14844 and Regulatory Guides 1.4, 1.25, and 1.77 for the radiological analysis of Design Basis Accidents (DBA). In 1990, the ICRP published ICRP Pub. 60, which presents new recommendations on dose criteria and concepts. In Korea, alternative source terms were used for the evaluation of effective doses for design basis accidents of the Advanced Power Reactor (APR1400) using a computer program developed by an overseas company. Recently, DBADOSE, a new computer program for DBA analysis incorporating AST and the effective dose concept, was developed by KHNP and KOPEC, and a reanalysis applying AST to operating nuclear power plants in Korea, Kori units 3 and 4, has been performed using DBADOSE. As a result of this analysis, it was concluded that some conservative variables or operating procedures of operating plants could be relaxed or simplified by virtue of the increased safety margin, and consequently economical and operational benefits ensue. In this paper, the methodologies and results of the Kori 3 and 4 DBA reanalysis and of a sensitivity analysis for the relaxation of main design variables are introduced

  7. Unsplit schemes for hyperbolic conservation laws with source terms in one space dimension

    International Nuclear Information System (INIS)

    Papalexandris, M.V.; Leonard, A.; Dimotakis, P.E.

    1997-01-01

    The present work is concerned with an application of the theory of characteristics to conservation laws with source terms in one space dimension, such as the Euler equations for reacting flows. Space-time paths are introduced on which the flow/chemistry equations decouple to a characteristic set of ODE's for the corresponding homogeneous laws, thus allowing the introduction of functions analogous to the Riemann invariants in classical theory. The geometry of these paths depends on the spatial gradients of the solution. This particular decomposition can be used in the design of efficient unsplit algorithms for the numerical integration of the equations. As a first step, these ideas are implemented for the case of a scalar conservation law with a nonlinear source term. The resulting algorithm belongs to the class of MUSCL-type, shock-capturing schemes. Its accuracy and robustness are checked through a series of tests. The stiffness of the source term is also studied. Then, the algorithm is generalized for a system of hyperbolic equations, namely the Euler equations for reacting flows. A numerical study of unstable detonations is performed. 57 refs

  8. The Chernobyl reactor accident source term: development of a consensus view

    International Nuclear Information System (INIS)

    Devell, L.; Guntay, S.; Powers, D.A.

    1995-11-01

    Ten years after the reactor accident at Chernobyl, a great deal more data is available concerning the events, phenomena, and processes that took place. The purpose of this document is to examine what is known about the radioactive materials released during the accident, a task that is substantially more difficult than it might first appear to be. The Chernobyl station, like other nuclear power plants, was not instrumented to characterize a disastrous accident. The accident was peculiar in the sense that radioactive materials were released, at least initially, in an exceptionally energetic plume and were transported far from the reactor site. Release of radioactivity from the plant continued for several days. Characterization of the contamination caused by the releases of radioactivity has had a much lower priority than remediation of the contamination. Consequently, an assessment of the Chernobyl accident source term must rely to a significant extent on inferential evidence. The assessment presented here begins with an examination of the core inventories of radioactive materials. In subsequent sections of the report, the magnitude and timing of the releases of radioactivity are described. Then, the composition, chemical forms, and physical forms of the releases are discussed. A number of more recent publications and results from scientists in Russia and elsewhere have significantly improved the understanding of the Chernobyl source term. Because of the special features of the reactor design and the peculiarities of the Chernobyl accident, the source term for the Chernobyl accident is of limited applicability to the safety analysis of other types of reactors

  9. Least-squares finite-element method for shallow-water equations with source terms

    Institute of Scientific and Technical Information of China (English)

    Shin-Jye Liang; Tai-Wen Hsu

    2009-01-01

    Numerical solution of the shallow-water equations (SWE) has been a challenging task because of their nonlinear hyperbolic nature, which admits discontinuous solutions, and the need to satisfy the C-property. The presence of source terms in the momentum equations, such as the bottom slope and bed friction, compounds the difficulties further. In this paper, a least-squares finite-element method for the space discretization and a θ-method for the time integration is developed for the 2D non-conservative SWE including the source terms. Advantages of the method include: the source terms can be approximated easily with interpolation functions, no upwind scheme is needed, and the resulting system of equations is symmetric and positive-definite and can therefore be solved efficiently with the conjugate gradient method. The method is applied to steady and unsteady flows, subcritical and transcritical flow over a bump, 1D and 2D circular dam-break, wave past a circular cylinder, as well as wave past a hump. Computed results show good C-property and conservation property and compare well with exact solutions and other numerical results for flows with weak and mild gradient changes, but lead to inaccurate predictions for flows with strong gradient changes and discontinuities.

  10. Development of dose calculation program (DBADOSE) incorporating alternative source term due to design basis accident

    International Nuclear Information System (INIS)

    Bae, Young Jig; Nam, Ki Mun; Lee, Yu Jong; Chung, Chan Young

    2003-01-01

    The source terms presented in TID-14844 and Regulatory Guide 1.4 have been used for radiological analysis of design basis accidents when licensing existing pressurized water reactors (PWRs). However, a more realistic and physically based source term, built on roughly 30 years of studies and experiments after the publication of TID-14844, was developed and presented in NUREG-1465, published by the U.S. NRC in 1995. In addition, the ICRP has revised dose concepts and criteria through the publication of ICRP-9, 26 and 60, and has recommended the effective dose concept rather than the critical organ concept since the publication of ICRP-26. Accordingly, a multipurpose computer program called DBADOSE, incorporating the alternative source terms of NUREG-1465 and the effective dose concepts of ICRP-60, was developed. The results of DBADOSE were compared with those of POSTDBA and STARDOSE, and no significant differences or inaccuracies were found. DBADOSE will be used to evaluate accident doses for licensing applications according to the domestic laws that are expected to be revised in the near future

  11. Quantification of uncertainties in source term estimates for a BWR with Mark I containment

    International Nuclear Information System (INIS)

    Khatib-Rahbar, M.; Cazzoli, E.; Davis, R.; Ishigami, T.; Lee, M.; Nourbakhsh, H.; Schmidt, E.; Unwin, S.

    1988-01-01

    A methodology for quantification and uncertainty analysis of source terms for severe accidents in light water reactors (QUASAR) has been developed. The objectives of the QUASAR program are (1) to develop a framework for performing an uncertainty evaluation of the input parameters of the phenomenological models used in the Source Term Code Package (STCP), and (2) to quantify the uncertainties in certain phenomenological aspects of source terms (that are not modeled by STCP) using state-of-the-art methods. The QUASAR methodology consists of (1) screening sensitivity analysis, where the most sensitive input variables are selected for detailed uncertainty analysis, (2) uncertainty analysis, where probability density functions (PDFs) are established for the parameters identified in the screening stage and propagated through the codes to obtain PDFs for the outputs (i.e., release fractions to the environment), and (3) distribution sensitivity analysis, which is performed to determine the sensitivity of the output PDFs to the input PDFs. In this paper attention is limited to a single accident progression sequence, namely a station blackout accident in a BWR with a Mark I containment building. Identified as an important accident in the draft NUREG-1150, a station blackout involves loss of both off-site power and DC power, resulting in failure of the diesels to start and in the unavailability of the high-pressure injection and core isolation cooling systems
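
    The propagation step in stage (2) of such a methodology can be illustrated with a few lines of Python: sample the input PDFs, push each sample through the model, and summarize the output PDF. The toy "release fraction" model and both input distributions below are assumptions for illustration only; they are not the STCP models or the parameters screened in the QUASAR study.

        import numpy as np

        rng = np.random.default_rng(1)
        N = 10_000

        # Assumed input PDFs for two phenomenological parameters (illustrative only).
        core_release_frac = rng.beta(2.0, 5.0, N)              # fraction released from the core
        containment_df = rng.lognormal(np.log(10.0), 0.5, N)   # containment decontamination factor

        # Toy model: environmental release fraction = core release / decontamination factor.
        env_release = core_release_frac / containment_df

        # Output PDF summarized by percentiles, as in the uncertainty analysis stage.
        print(np.percentile(env_release, [5, 50, 95]))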

  12. Modelling and simulation the radioactive source-term of fission products in PWR type reactors

    International Nuclear Information System (INIS)

    Porfirio, Rogilson Nazare da Silva

    1996-01-01

    The source term is defined with the purpose of quantifying all radioactive nuclides released from a nuclear reactor in the case of accidents. Nowadays the source term is limited to the coolant of the primary circuit of reactors and may be measured or modelled with computer codes such as the TFP code developed in this work. The calculational process is based on the linear chain techniques used in the CINDER-2 code. The TFP code considers three forms of fission product release from the fuel pellet: recoil, knockout and migration. The release from the gap to the coolant fluid is determined from the ratio between the activity measured in the coolant and the calculated activity in the gap. Using the operational data of the SURRY-1 reactor, the TFP code was run to obtain the source term of this reactor. The measured activities were used to verify the reliability of the model and of the computational logic employed. The agreement between the calculated quantities and the measured data was considered satisfactory. (author)
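
    The linear-chain bookkeeping referred to above can be pictured with a two-member chain and a simple first-order escape from the fuel to the gap, as in the hedged Python sketch below. The half-lives, the escape-rate coefficient and the initial inventory are illustrative assumptions, not TFP or CINDER-2 data.

        import numpy as np

        lam1, lam2 = np.log(2.0) / 8.0, np.log(2.0) / 2.0   # decay constants for assumed 8-day and 2-day half-lives [1/day]
        nu = 1.0e-3                                          # escape-rate coefficient to the gap [1/day] (assumed)

        def chain_inventory(t, n1_0=1.0e20, n2_0=0.0):
            """Bateman solution for the fuel inventory of a parent/daughter chain (atoms)."""
            n1 = n1_0 * np.exp(-lam1 * t)
            n2 = (n1_0 * lam1 / (lam2 - lam1) * (np.exp(-lam1 * t) - np.exp(-lam2 * t))
                  + n2_0 * np.exp(-lam2 * t))
            return n1, n2

        def gap_activity(t_end, dt=0.01):
            """Parent activity in the gap from dG/dt = nu*N1 - lam1*G (decays per day)."""
            g = 0.0
            for tk in np.arange(0.0, t_end, dt):
                n1, _ = chain_inventory(tk)
                g += dt * (nu * n1 - lam1 * g)
            return lam1 * g

        print(chain_inventory(10.0))   # fuel inventory after 10 days
        print(gap_activity(10.0))      # parent gap activity after 10 days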

  13. Source-term model for the SYVAC3-NSURE performance assessment code

    International Nuclear Information System (INIS)

    Rowat, J.H.; Rattan, D.S.; Dolinar, G.M.

    1996-11-01

    Radionuclide contaminants in wastes emplaced in disposal facilities will not remain in those facilities indefinitely. Engineered barriers will eventually degrade, allowing radioactivity to escape from the vault. The radionuclide release rate from a low-level radioactive waste (LLRW) disposal facility, the source term, is a key component in the performance assessment of the disposal system. This report describes the source-term model that has been implemented in Ver. 1.03 of the SYVAC3-NSURE (Systems Variability Analysis Code generation 3-Near Surface Repository) code. NSURE is a performance assessment code that evaluates the impact of near-surface disposal of LLRW through the groundwater pathway. The source-term model described here was developed for the Intrusion Resistant Underground Structure (IRUS) disposal facility, which is a vault that is to be located in the unsaturated overburden at AECL's Chalk River Laboratories. The processes included in the vault model are roof and waste package performance, and diffusion, advection and sorption of radionuclides in the vault backfill. The model presented here was developed for the IRUS vault; however, it is applicable to other near-surface disposal facilities. (author). 40 refs., 6 figs

  14. Standardization of iridium-192 coiled source in terms of air kerma output

    International Nuclear Information System (INIS)

    Shanta, A.; Unnikrishnan, K.; Tripathi, U.B.; Kannan, A.; Iyer, P.S.

    1996-01-01

    ICRU (1985) recommended that the output of gamma ray brachytherapy sources should be specified in terms of reference air kerma rate, defined as the kerma rate to air in air at a reference distance of 1 meter, perpendicular to the long axis of the source, corrected for air attenuation and scattering. As these measurements are difficult to carry out in the routine clinical use, it is the common practice to calibrate the re-entrant ionization chamber with respect to open air measurements and use the re-entrant chamber for routine measurements. This paper reports on the measurements carried out to correlate the nominal activity and air kerma rate of 192 Ir wire sources supplied by the Board of Radiation and Isotope Technology, Department of Atomic Energy. (author). 3 refs, 1 tab

  15. Standardization of iridium-192 coiled source in terms of air kerma output

    Energy Technology Data Exchange (ETDEWEB)

    Shanta, A; Unnikrishnan, K; Tripathi, U B; Kannan, A; Iyer, P S [Bhabha Atomic Research Centre, Bombay (India)

    1996-08-01

    ICRU (1985) recommended that the output of gamma ray brachytherapy sources should be specified in terms of reference air kerma rate, defined as the kerma rate to air in air at a reference distance of 1 meter, perpendicular to the long axis of the source, corrected for air attenuation and scattering. As these measurements are difficult to carry out in the routine clinical use, it is the common practice to calibrate the re-entrant ionization chamber with respect to open air measurements and use the re-entrant chamber for routine measurements. This paper reports on the measurements carried out to correlate the nominal activity and air kerma rate of 192Ir wire sources supplied by the Board of Radiation and Isotope Technology, Department of Atomic Energy. (author). 3 refs, 1 tab.

  16. Long-term program up to fiscal 1993 of electric power source development

    International Nuclear Information System (INIS)

    Kawakami, Shin-ichi

    1984-01-01

    The long-term, ten-year program of electric power source development up to fiscal 1993, determined by the Government, aims at stable power supply and the expansion of the utilization of petroleum-substitute energy. The annual growth in the gross national product (GNP) during the ten years was taken as about 4 %. Thus, the total electric power demand in fiscal 1993 is projected to be 731,000 million kWh, about 34 % up from 547,000 million kWh in fiscal 1983. The structure of electric power sources at the end of fiscal 1993 will be hydraulic 19.7 %, thermal 58.3 %, and nuclear 21.9 %. The development of electric power sources to be initiated in fiscal 1984 is hydraulic 500 MW, thermal 2,000 MW, and nuclear 6,000 MW. (Mori, K.)

  17. Probabilistic inversion for chicken processing lines

    International Nuclear Information System (INIS)

    Cooke, Roger M.; Nauta, Maarten; Havelaar, Arie H.; Fels, Ine van der

    2006-01-01

    We discuss an application of probabilistic inversion techniques to a model of campylobacter transmission in chicken processing lines. Such techniques are indicated when we wish to quantify a model which is new and perhaps unfamiliar to the expert community. In this case there are no measurements for estimating model parameters, and experts are typically unable to give a considered judgment. In such cases, experts are asked to quantify their uncertainty regarding variables which can be predicted by the model. The experts' distributions (after combination) are then pulled back onto the parameter space of the model, a process termed 'probabilistic inversion'. This study illustrates two such techniques, iterative proportional fitting (IPF) and PARmeter fitting for uncertain models (PARFUM). In addition, we illustrate how expert judgement on predicted observable quantities in combination with probabilistic inversion may be used for model validation and/or model criticism
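
    Of the two techniques named in the abstract, iterative proportional fitting is the simpler to sketch: a joint table is rescaled until its margins match target margins (here standing in for the combined expert distributions). The starting table and the target margins below are toy assumptions.

        import numpy as np

        def ipf(table, row_targets, col_targets, iters=100):
            """Iterative proportional fitting of a 2-D table to prescribed margins."""
            t = table.astype(float).copy()
            for _ in range(iters):
                t *= (row_targets / t.sum(axis=1))[:, None]   # match row margins
                t *= (col_targets / t.sum(axis=0))[None, :]   # match column margins
            return t

        start = np.array([[0.25, 0.25],
                          [0.25, 0.25]])                      # initial joint distribution (assumed)
        fitted = ipf(start, row_targets=np.array([0.7, 0.3]),
                     col_targets=np.array([0.4, 0.6]))
        print(fitted, fitted.sum())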

  18. Low-level waste disposal performance assessments - Total source-term analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wilhite, E.L.

    1995-12-31

    Disposal of low-level radioactive waste at Department of Energy (DOE) facilities is regulated by DOE. DOE Order 5820.2A establishes policies, guidelines, and minimum requirements for managing radioactive waste. Requirements for disposal of low-level waste emplaced after September 1988 include providing reasonable assurance of meeting stated performance objectives by completing a radiological performance assessment. Recently, the Defense Nuclear Facilities Safety Board issued Recommendation 94-2, "Conformance with Safety Standards at Department of Energy Low-Level Nuclear Waste and Disposal Sites." One of the elements of the recommendation is that low-level waste performance assessments do not include the entire source term because low-level waste emplaced prior to September 1988, as well as other DOE sources of radioactivity in the ground, are excluded. DOE has developed and issued guidance for preliminary assessments of the impact of including the total source term in performance assessments. This paper will present issues resulting from the inclusion of all DOE sources of radioactivity in performance assessments of low-level waste disposal facilities.

  19. Probabilistic modeling of wind energy sources integrated in a conventional power system; Modelagem probabilistica de fontes eolicas de energia integradas em sistema de potencia convencional

    Energy Technology Data Exchange (ETDEWEB)

    Dalence, G W.H.

    1990-06-15

    This work describes a model capable of including non-conventional energy sources in a stochastic energy production model for conventional power sources. A wind energy system is initially treated as statistically independent of the hourly demand. The correlation between two wind systems is then considered by means of a joint wind speed distribution. The joint wind system is thereafter submitted to the stochastic energy production model, assuming independence between demand and wind speed. Finally, the correlation between the wind systems and the hourly demand is studied. (author). 29 figs, 31 tabs

  20. Probabilistic Mu-Calculus

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand; Mardare, Radu Iulian; Xue, Bingtian

    2016-01-01

    We introduce a version of the probabilistic µ-calculus (PMC) built on top of a probabilistic modal logic that allows encoding n-ary inequational conditions on transition probabilities. PMC extends previously studied calculi and we prove that, despite its expressiveness, it enjoys a series of good...... metaproperties. Firstly, we prove the decidability of satisfiability checking by establishing the small model property. An algorithm for deciding the satisfiability problem is developed. As a second major result, we provide a complete axiomatization for the alternation-free fragment of PMC. The completeness proof...

  1. Probabilistic conditional independence structures

    CERN Document Server

    Studeny, Milan

    2005-01-01

    Probabilistic Conditional Independence Structures provides the mathematical description of probabilistic conditional independence structures; the author uses non-graphical methods of their description, and takes an algebraic approach. The monograph presents the methods of structural imsets and supermodular functions, and deals with independence implication and equivalence of structural imsets. Motivation, mathematical foundations and areas of application are included, and a rough overview of graphical methods is also given. In particular, the author has been careful to use suitable terminology, and presents the work so that it will be understood by both statisticians, and by researchers in artificial intelligence. The necessary elementary mathematical notions are recalled in an appendix.

  2. Probabilistic approach to mechanisms

    CERN Document Server

    Sandler, BZ

    1984-01-01

    This book discusses the application of probabilistics to the investigation of mechanical systems. The book shows, for example, how random function theory can be applied directly to the investigation of random processes in the deflection of cam profiles, pitch or gear teeth, pressure in pipes, etc. The author also deals with some other technical applications of probabilistic theory, including, amongst others, those relating to pneumatic and hydraulic mechanisms and roller bearings. Many of the aspects are illustrated by examples of applications of the techniques under discussion.

  3. An appreciation of the events, models and data used for LMFBR radiological source term estimations

    International Nuclear Information System (INIS)

    Keir, D.; Clough, P.N.

    1989-01-01

    In this report, the events, models and data currently available for analysis of accident source terms in liquid metal cooled fast neutron reactors are reviewed. The types of hypothetical accidents considered are the low probability, more extreme types of severe accident, involving significant degradation of the core and which may lead to the release of radionuclides. The base case reactor design considered is a commercial scale sodium pool reactor of the CDFR type. The feasibility of an integrated calculational approach to radionuclide transport and speciation (such as is used for LWR accident analysis) is explored. It is concluded that there is no fundamental obstacle, in terms of scientific data or understanding of the phenomena involved, to such an approach. However this must be regarded as a long-term goal because of the large amount of effort still required to advance development to a stage comparable with LWR studies. Particular aspects of LMFBR severe accident phenomenology which require attention are the behaviour of radionuclides during core disruptive accident bubble formation and evolution, and during the less rapid sequences of core melt under sodium. The basic requirement for improved thermal hydraulic modelling of core, coolant and structural materials, in these and other scenarios, is highlighted as fundamental to the accuracy and realism of source term estimations. The coupling of such modelling to that of radionuclide behaviour is seen as the key to future development in this area

  4. Inverse kinetics method with source term for subcriticality measurements during criticality approach in the IPEN/MB-01 research reactor

    International Nuclear Information System (INIS)

    Loureiro, Cesar Augusto Domingues; Santos, Adimir dos

    2009-01-01

    In the reactor physics tests performed at startup after refueling of commercial PWRs, it is important to monitor subcriticality continuously during the approach to criticality. Reactivity measurements by the inverse kinetics method are widely used during the operation of a nuclear reactor, and it is possible to perform an online reactivity measurement based on the point reactor kinetics equations. This technique is successfully applied at sufficiently high power levels, or to a core without an external neutron source, where the neutron source term in the point reactor kinetics equations may be neglected. For operation at low power levels, the contribution of the neutron source must be taken into account; this implies knowledge of a quantity proportional to the source strength, which must therefore be determined. Some experiments have been performed in the IPEN/MB-01 Research Reactor for the determination of the source term, using the Least Square Inverse Kinetics Method (LSIKM). A digital reactivity meter which neglects the source term is used to calculate the reactivity, and the source term can then be determined by the LSIKM. After determining the source term, its value can be added to the algorithm and the reactivity can be determined again, now considering the source term. The new digital reactivity meter can then be used to monitor reactivity during the approach to criticality, and the measured reactivity is more precise than that of the meter which neglects the source term. (author)
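
    The idea of retaining the source term in the inverse kinetics equations can be sketched as follows: from point kinetics with a source, dn/dt = ((rho - beta)/Lambda) n + sum(lambda_i C_i) + S, the reactivity follows as rho = beta + Lambda (dn/dt - sum(lambda_i C_i) - S)/n, with the precursor concentrations rebuilt from the measured flux history. The Python sketch below uses a single delayed-neutron group, assumed kinetics parameters, an assumed source strength and a fabricated count-rate history; it is not the IPEN/MB-01 reactivity meter or the LSIKM fit itself.

        import numpy as np

        beta, lam, Lam, S = 0.0065, 0.08, 2.0e-5, 1.0e4   # beta, lambda [1/s], Lambda [s], source (assumed)
        dt = 0.01
        t = np.arange(0.0, 50.0, dt)
        n = 1.0e5 * (1.0 + 0.002 * t)                     # assumed measured neutron density history

        def inverse_kinetics(n, dt):
            C = beta * n[0] / (lam * Lam)                 # precursors assumed in equilibrium with n(0)
            rho = np.zeros_like(n)
            for k in range(1, len(n)):
                C += dt * (beta / Lam * n[k - 1] - lam * C)        # rebuild precursors from the history
                dndt = (n[k] - n[k - 1]) / dt
                rho[k] = beta + Lam * (dndt - lam * C - S) / n[k]  # source term retained, not neglected
            return rho

        print(inverse_kinetics(n, dt)[-1])   # reactivity estimate at the end of the record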

  5. Stochastic Modeling of Long-Term and Extreme Value Estimation of Wind and Sea Conditions for Probabilistic Reliability Assessments of Wave Energy Devices

    DEFF Research Database (Denmark)

    Ambühl, Simon; Kofoed, Jens Peter; Sørensen, John Dalsgaard

    2014-01-01

    Wave energy power plants are expected to become one of the major future contributions to sustainable electricity production. Optimal design of wave energy power plants is associated with modeling of physical, statistical, measurement and model uncertainties. This paper presents stochastic models...... for the significant wave height, the mean zero-crossing wave period and the wind speed for long-term and extreme estimations. The long-term estimation focuses on annual statistical distributions, the inter-annual variation of distribution parameters and the statistical uncertainty due to the limited amount of data...

  6. Global Infrasound Association Based on Probabilistic Clutter Categorization

    Science.gov (United States)

    Arora, Nimar; Mialle, Pierrick

    2016-04-01

    The IDC advances its methods and continuously improves its automatic system for the infrasound technology. The IDC focuses on enhancing the automatic system for the identification of valid signals and the optimization of the network detection threshold by identifying ways to refine signal characterization methodology and association criteria. An objective of this study is to reduce the number of associated infrasound arrivals that are rejected from the automatic bulletins when generating the reviewed event bulletins. Indeed, a considerable number of signal detections are due to local clutter sources such as microbaroms, waterfalls, dams, gas flares, surf (ocean breaking waves) etc. These sources are either too diffuse or too local to form events. Worse still, the repetitive nature of this clutter leads to a large number of false event hypotheses due to the random matching of clutter at multiple stations. Previous studies, for example [1], have worked on categorization of clutter using long term trends on detection azimuth, frequency, and amplitude at each station. In this work we continue the same line of reasoning to build a probabilistic model of clutter that is used as part of NETVISA [2], a Bayesian approach to network processing. The resulting model is a fusion of seismic, hydroacoustic and infrasound processing built on a unified probabilistic framework. References: [1] Infrasound categorization Towards a statistics based approach. J. Vergoz, P. Gaillard, A. Le Pichon, N. Brachet, and L. Ceranna. ITW 2011 [2] NETVISA: Network Processing Vertically Integrated Seismic Analysis. N. S. Arora, S. Russell, and E. Sudderth. BSSA 2013

  7. Development of the methodology for application of revised source term to operating nuclear power plants in Korea

    International Nuclear Information System (INIS)

    Kang, M.S.; Kang, P.; Kang, C.S.; Moon, J.H.

    2004-01-01

    Considering the current trend of applying the revised source term proposed by NUREG-1465 to nuclear power plants in the U.S., it is expected that the revised source term will be applied to the Korean operating nuclear power plants in the near future, even though the exact timing cannot be estimated. To meet the future technical demands, it is necessary to prepare the technical framework, including the related regulatory requirements, in advance. In this research, therefore, the aim is to develop a methodology for applying the revised source term to operating nuclear power plants in Korea. Several principles were established in developing the application methodologies. First, it is not necessary to modify the existing regulations about the source term (i.e., any back-fitting to operating nuclear plants is not necessary). Second, if the pertinent margin of safety is guaranteed, the revised source term suggested by NUREG-1465 may be suitable for full application. Finally, only part of the revised source term could be selected for application, based on technical feasibility. As the result of this research, several methodologies for applying the revised source term to the Korean operating nuclear power plants have been developed, which include: 1) selective (or limited) application, using only some of the characteristics of the revised source term, such as the release timing of fission products and the chemical form of radio-iodine, and 2) full application, using all the characteristics of the revised source term. The developed methodologies were applied to the Ulchin 9 and 4 units and their feasibility was reviewed. The results of this research can be used either as a manual for establishing the plan and procedure for applying the revised source term to domestic nuclear plants, from the utility's viewpoint, or as a technical basis for revising the related regulations, from the regulatory body's viewpoint. The application of revised source term to operating nuclear

  8. Probabilistic systems coalgebraically: A survey

    Science.gov (United States)

    Sokolova, Ana

    2011-01-01

    We survey the work on both discrete and continuous-space probabilistic systems as coalgebras, starting with how probabilistic systems are modeled as coalgebras and followed by a discussion of their bisimilarity and behavioral equivalence, mentioning results that follow from the coalgebraic treatment of probabilistic systems. It is interesting to note that, for different reasons, for both discrete and continuous probabilistic systems it may be more convenient to work with behavioral equivalence than with bisimilarity. PMID:21998490

  9. Probabilistic programming in Python using PyMC3

    Directory of Open Access Journals (Sweden)

    John Salvatier

    2016-04-01

    Full Text Available Probabilistic programming allows for automatic Bayesian inference on user-defined probabilistic models. Recent advances in Markov chain Monte Carlo (MCMC) sampling allow inference on increasingly complex models. This class of MCMC, known as Hamiltonian Monte Carlo, requires gradient information which is often not readily available. PyMC3 is a new open source probabilistic programming framework written in Python that uses Theano to compute gradients via automatic differentiation as well as compile probabilistic programs on-the-fly to C for increased speed. Contrary to other probabilistic programming languages, PyMC3 allows model specification directly in Python code. The lack of a domain-specific language allows for great flexibility and direct interaction with the model. This paper is a tutorial-style introduction to this software package.
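
    A minimal model specified directly in Python, in the style the abstract describes, might look like the sketch below (assuming the PyMC3 3.x API; the data are simulated and the variable names are ours, not taken from the paper's tutorial).

        import numpy as np
        import pymc3 as pm

        # Simulated data for a simple normal-mean model.
        data = np.random.default_rng(0).normal(loc=1.0, scale=2.0, size=100)

        with pm.Model() as model:
            mu = pm.Normal("mu", mu=0.0, sd=10.0)          # prior on the mean
            sigma = pm.HalfNormal("sigma", sd=5.0)         # prior on the spread
            obs = pm.Normal("obs", mu=mu, sd=sigma, observed=data)
            trace = pm.sample(1000, tune=1000)             # NUTS (gradient-based MCMC) by default

        print(pm.summary(trace))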

  10. Confluence reduction for probabilistic systems

    NARCIS (Netherlands)

    Timmer, Mark; van de Pol, Jan Cornelis; Stoelinga, Mariëlle Ida Antoinette

    In this presentation we introduce a novel technique for state space reduction of probabilistic specifications, based on a newly developed notion of confluence for probabilistic automata. We proved that this reduction preserves branching probabilistic bisimulation and can be applied on-the-fly. To

  11. Use of the t-distribution to construct seismic hazard curves for seismic probabilistic safety assessments

    Energy Technology Data Exchange (ETDEWEB)

    Yee, Eric [KEPCO International Nuclear Graduate School, Dept. of Nuclear Power Plant Engineering, Ulsan (Korea, Republic of)

    2017-03-15

    Seismic probabilistic safety assessments are used to help understand the impact potential seismic events can have on the operation of a nuclear power plant. An important component of a seismic probabilistic safety assessment is the seismic hazard curve, which shows the frequency of seismic events. However, these hazard curves are estimated assuming a normal distribution of the seismic events. This may not be a well-supported assumption given the number of recorded events at each source-to-site distance. The use of a normal distribution makes the calculations significantly easier but may underestimate or overestimate the rarer events, which is of concern to nuclear power plants. This paper presents a preliminary exploration of the effect of using a distribution that perhaps better represents the distribution of events, such as the t-distribution, to describe the data. Integrating a probability distribution with potentially larger tails essentially pushes the hazard curves outward, suggesting a different range of frequencies for use in seismic probabilistic safety assessments. Therefore, the use of a more realistic distribution results in an increase in the calculated frequencies, suggesting rare events are less rare than previously thought in terms of seismic probabilistic safety assessment. However, the opposite was observed with the ground motion prediction equation considered.
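
    The paper's point about tails can be illustrated with a toy comparison of exceedance probabilities under a normal model and a heavier-tailed t model with the same location and scale; the level, scale and degrees of freedom below are assumptions, not values from the study.

        from scipy import stats

        level, mu, sigma, dof = 3.0, 0.0, 1.0, 5     # exceedance level and assumed t degrees of freedom

        p_norm = stats.norm.sf(level, loc=mu, scale=sigma)
        p_t = stats.t.sf(level, df=dof, loc=mu, scale=sigma)

        # The heavier-tailed model assigns a noticeably larger frequency to the rare event.
        print(p_norm, p_t, p_t / p_norm)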

  12. Use of the t-distribution to construct seismic hazard curves for seismic probabilistic safety assessments

    International Nuclear Information System (INIS)

    Yee, Eric

    2017-01-01

    Seismic probabilistic safety assessments are used to help understand the impact potential seismic events can have on the operation of a nuclear power plant. An important component of a seismic probabilistic safety assessment is the seismic hazard curve, which shows the frequency of seismic events. However, these hazard curves are estimated assuming a normal distribution of the seismic events. This may not be a well-supported assumption given the number of recorded events at each source-to-site distance. The use of a normal distribution makes the calculations significantly easier but may underestimate or overestimate the rarer events, which is of concern to nuclear power plants. This paper presents a preliminary exploration of the effect of using a distribution that perhaps better represents the distribution of events, such as the t-distribution, to describe the data. Integrating a probability distribution with potentially larger tails essentially pushes the hazard curves outward, suggesting a different range of frequencies for use in seismic probabilistic safety assessments. Therefore, the use of a more realistic distribution results in an increase in the calculated frequencies, suggesting rare events are less rare than previously thought in terms of seismic probabilistic safety assessment. However, the opposite was observed with the ground motion prediction equation considered

  13. Probabilistic Flood Defence Assessment Tools

    Directory of Open Access Journals (Sweden)

    Slomp Robert

    2016-01-01

    institutions managing the flood defences, and not just by a small number of experts in probabilistic assessment. Therefore, data management and use of software are main issues that have been covered in courses and training in 2016 and 2017. All in all, this is the largest change in the assessment of Dutch flood defences since 1996. In 1996 probabilistic techniques were first introduced to determine hydraulic boundary conditions (water levels and waves (wave height, wave period and direction)) for different return periods. To simplify the process, the assessment continues to consist of a three-step approach, moving from simple decision rules, to the methods for semi-probabilistic assessment, and finally to a fully probabilistic analysis to compare the strength of flood defences with the hydraulic loads. The formal assessment results are thus mainly based on the fully probabilistic analysis and the ultimate limit state of the strength of a flood defence. For complex flood defences, additional models and software were developed. The current Hydra software suite (for policy analysis, formal flood defence assessment and design) will be replaced by the model Ringtoets. New stand-alone software has been developed for revetments, geotechnical analysis and slope stability of the foreshore. Design software and policy analysis software, including the Delta model, will be updated in 2018. A fully probabilistic method results in more precise assessments and more transparency in the process of assessment and reconstruction of flood defences. This is of increasing importance, as large-scale infrastructural projects in a highly urbanized environment are increasingly subject to political and societal pressure to add additional features. For this reason, it is of increasing importance to be able to determine which new feature really adds to flood protection, to quantify how much it adds to the level of flood protection and to evaluate if it is really worthwhile. Please note: The Netherlands

  14. Probabilistic thread algebra

    NARCIS (Netherlands)

    Bergstra, J.A.; Middelburg, C.A.

    2015-01-01

    We add probabilistic features to basic thread algebra and its extensions with thread-service interaction and strategic interleaving. Here, threads represent the behaviours produced by instruction sequences under execution and services represent the behaviours exhibited by the components of execution

  15. Probabilistic simple sticker systems

    Science.gov (United States)

    Selvarajoo, Mathuri; Heng, Fong Wan; Sarmin, Nor Haniza; Turaev, Sherzod

    2017-04-01

    A model for DNA computing using the recombination behavior of DNA molecules, known as a sticker system, was introduced by L. Kari, G. Paun, G. Rozenberg, A. Salomaa, and S. Yu in the paper entitled "DNA computing, sticker systems and universality", Acta Informatica, vol. 35, pp. 401-420, 1998. A sticker system uses the Watson-Crick complementary feature of DNA molecules: starting from incomplete double-stranded sequences and iteratively applying sticking operations until a complete double-stranded sequence is obtained. It is known that sticker systems with finite sets of axioms and sticker rules generate only regular languages. Hence, different types of restrictions have been considered to increase the computational power of sticker systems. Recently, a variant of restricted sticker systems, called probabilistic sticker systems, has been introduced [4]. In this variant, the probabilities are initially associated with the axioms, and the probability of a generated string is computed by multiplying the probabilities of all occurrences of the initial strings in the computation of the string. Strings for the language are selected according to some probabilistic requirements. In this paper, we study fundamental properties of probabilistic simple sticker systems. We prove that the probabilistic enhancement increases the computational power of simple sticker systems.

  16. Memristive Probabilistic Computing

    KAUST Repository

    Alahmadi, Hamzah

    2017-10-01

    In the era of the Internet of Things and Big Data, unconventional techniques are rising to accommodate the large size of data and the resource constraints. New computing structures are advancing based on non-volatile memory technologies and different processing paradigms. Additionally, the intrinsic resiliency of current applications leads to the development of creative techniques in computation. In those applications, approximate computing provides a perfect fit to optimize energy efficiency while compromising on accuracy. In this work, we build probabilistic adders based on stochastic memristors. The probabilistic adders are analyzed with respect to the stochastic behavior of the underlying memristors. Multiple adder implementations are investigated and compared. The memristive probabilistic adder provides a different approach from typical approximate CMOS adders. Furthermore, it allows for high area savings and design flexibility in trading off performance against power saving. While reaching a performance level similar to approximate CMOS adders, the memristive adder achieves 60% power saving. An image-compression application is investigated using the memristive probabilistic adders to illustrate the performance and energy trade-off.

  17. Probabilistic Load Flow

    DEFF Research Database (Denmark)

    Chen, Peiyuan; Chen, Zhe; Bak-Jensen, Birgitte

    2008-01-01

    This paper reviews the development of the probabilistic load flow (PLF) techniques. Applications of the PLF techniques in different areas of power system steady-state analysis are also discussed. The purpose of the review is to identify different available PLF techniques and their corresponding...

  18. Transitive probabilistic CLIR models.

    NARCIS (Netherlands)

    Kraaij, W.; de Jong, Franciska M.G.

    2004-01-01

    Transitive translation could be a useful technique to enlarge the number of supported language pairs for a cross-language information retrieval (CLIR) system in a cost-effective manner. The paper describes several setups for transitive translation based on probabilistic translation models. The

  19. Inverse modeling of the Chernobyl source term using atmospheric concentration and deposition measurements

    Science.gov (United States)

    Evangeliou, Nikolaos; Hamburger, Thomas; Cozic, Anne; Balkanski, Yves; Stohl, Andreas

    2017-07-01

    This paper describes the results of an inverse modeling study for the determination of the source term of the radionuclides 134Cs, 137Cs and 131I released after the Chernobyl accident. The accident occurred on 26 April 1986 in the Former Soviet Union and released about 10^19 Bq of radioactive materials that were transported as far away as the USA and Japan. Thereafter, several attempts to assess the magnitude of the emissions were made that were based on the knowledge of the core inventory and the levels of the spent fuel. More recently, when modeling tools were further developed, inverse modeling techniques were applied to the Chernobyl case for source term quantification. However, because radioactivity is a sensitive topic for the public and attracts a lot of attention, high-quality measurements, which are essential for inverse modeling, were not made available except for a few sparse activity concentration measurements far from the source and far from the main direction of the radioactive fallout. For the first time, we apply Bayesian inversion of the Chernobyl source term using not only activity concentrations but also deposition measurements from the most recent public data set. These observations refer to a data rescue attempt that started more than 10 years ago, with a final goal to provide available measurements to anyone interested. In regards to our inverse modeling results, emissions of 134Cs were estimated to be 80 PBq or 30-50 % higher than what was previously published. From the released amount of 134Cs, about 70 PBq were deposited all over Europe. Similar to 134Cs, emissions of 137Cs were estimated as 86 PBq, on the same order as previously reported results. Finally, 131I emissions of 1365 PBq were found, which are about 10 % less than the prior total releases. The inversion pushes the injection heights of the three radionuclides to higher altitudes (up to about 3 km) than previously assumed (≈ 2.2 km) in order to better match both concentration
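
    The inversion underlying such studies can be reduced to its linear-algebra core: observations y are linked to time-resolved emissions x by a source-receptor matrix M from a dispersion model, and the posterior emissions minimize a misfit to the data plus a misfit to the prior. In the Python sketch below, M, the prior, the uncertainties and the "observations" are random stand-ins; they are not dispersion-model output, the study's prior, or the Chernobyl data set.

        import numpy as np

        rng = np.random.default_rng(0)
        n_obs, n_src = 200, 24                      # observations and release intervals (assumed sizes)

        M = rng.random((n_obs, n_src))              # stand-in source-receptor matrix
        x_true = rng.uniform(0.5, 2.0, n_src)       # "true" emissions used to fabricate data
        y = M @ x_true + rng.normal(0.0, 0.1, n_obs)

        x_prior = np.ones(n_src)
        sigma_obs, sigma_prior = 0.1, 1.0           # assumed observation and prior uncertainties

        # Bayesian/Tikhonov solution of min ||y - Mx||^2/sigma_obs^2 + ||x - x_prior||^2/sigma_prior^2
        A = M.T @ M / sigma_obs**2 + np.eye(n_src) / sigma_prior**2
        b = M.T @ y / sigma_obs**2 + x_prior / sigma_prior**2
        x_post = np.linalg.solve(A, b)

        print(np.abs(x_post - x_true).max())        # posterior emissions recover the synthetic truth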

  20. Inverse modeling of the Chernobyl source term using atmospheric concentration and deposition measurements

    Directory of Open Access Journals (Sweden)

    N. Evangeliou

    2017-07-01

    Full Text Available This paper describes the results of an inverse modeling study for the determination of the source term of the radionuclides 134Cs, 137Cs and 131I released after the Chernobyl accident. The accident occurred on 26 April 1986 in the Former Soviet Union and released about 10^19 Bq of radioactive materials that were transported as far away as the USA and Japan. Thereafter, several attempts to assess the magnitude of the emissions were made that were based on the knowledge of the core inventory and the levels of the spent fuel. More recently, when modeling tools were further developed, inverse modeling techniques were applied to the Chernobyl case for source term quantification. However, because radioactivity is a sensitive topic for the public and attracts a lot of attention, high-quality measurements, which are essential for inverse modeling, were not made available except for a few sparse activity concentration measurements far from the source and far from the main direction of the radioactive fallout. For the first time, we apply Bayesian inversion of the Chernobyl source term using not only activity concentrations but also deposition measurements from the most recent public data set. These observations refer to a data rescue attempt that started more than 10 years ago, with a final goal to provide available measurements to anyone interested. In regards to our inverse modeling results, emissions of 134Cs were estimated to be 80 PBq or 30–50 % higher than what was previously published. From the released amount of 134Cs, about 70 PBq were deposited all over Europe. Similar to 134Cs, emissions of 137Cs were estimated as 86 PBq, on the same order as previously reported results. Finally, 131I emissions of 1365 PBq were found, which are about 10 % less than the prior total releases. The inversion pushes the injection heights of the three radionuclides to higher altitudes (up to about 3 km) than previously assumed (≈ 2.2 km) in order

  1. Recent developments of the NESSUS probabilistic structural analysis computer program

    Science.gov (United States)

    Millwater, H.; Wu, Y.-T.; Torng, T.; Thacker, B.; Riha, D.; Leung, C. P.

    1992-01-01

    The NESSUS probabilistic structural analysis computer program combines state-of-the-art probabilistic algorithms with general purpose structural analysis methods to compute the probabilistic response and the reliability of engineering structures. Uncertainty in loading, material properties, geometry, boundary conditions and initial conditions can be simulated. The structural analysis methods include nonlinear finite element and boundary element methods. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. The scope of the code has recently been expanded to include probabilistic life and fatigue prediction of structures in terms of component and system reliability and risk analysis of structures considering cost of failure. The code is currently being extended to structural reliability considering progressive crack propagation. Several examples are presented to demonstrate the new capabilities.
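
    One of the probabilistic algorithms named above, importance sampling, can be sketched in a few lines for a toy limit state: sample from a density shifted toward the failure region and reweight by the likelihood ratio. The capacity, the load distribution and the (non-adaptive) sampling density below are assumptions, not NESSUS models.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        N = 50_000

        mu, sd = 5.0, 1.0                     # load ~ Normal(5, 1) (assumed)
        capacity = 10.0                       # fixed capacity; failure when load > capacity (assumed)

        mu_is = capacity                      # importance density centred at the assumed design point
        x = rng.normal(mu_is, sd, N)
        w = stats.norm.pdf(x, mu, sd) / stats.norm.pdf(x, mu_is, sd)   # likelihood ratios
        p_fail = np.mean((x > capacity) * w)

        print(p_fail, stats.norm.sf((capacity - mu) / sd))   # estimate vs the exact normal tail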

  2. Source term for the bounding assessment of the Canadian nuclear fuel waste disposal concept

    International Nuclear Information System (INIS)

    Flavelle, P.

    1996-02-01

    This is the second in a series of reports deriving the bounds of the post-closure hazard of the Canadian nuclear fuel waste disposal concept, based on the premise that it is unnecessary to predict the real hazard accurately if the bounding hazard can be shown to be acceptable. In this report a reference used fuel (Bruce A fuel, 865 GJ/kgU average burnup) is used to derive the source term for contaminant releases from the emplacement canisters. This requires the development of a container failure function which defines the age of the fuel when the canister is perforated and flooded. The source term is expressed as the time-dependent fractional release rate from the used fuel or as the time-dependent contaminant concentrations in the canister porewater. It is derived as the superposition of an instant release, comprising the upper bound of the gap and grain-boundary inventory in the used fuel, and the long-term dissolution of the used fuel matrix. Several dissolution models (stoichiometric dissolution/preferential leaching) under different conditions (matrix solubility limited/unlimited; oxidizing/reducing solubility limits; groundwater flow/no flow) are evaluated, and the one resulting in the highest release rate/highest porewater concentration is adopted as the bounding case. Comparisons between the models are made on the basis of the potential ingestion hazard of the canister porewater, to account for differences in the hazard of different radionuclides. (author) 20 refs., 4 tabs., 9 figs

  3. Seismic source characterization of the Alpine foreland in the context of a probabilistic seismic hazard analysis by PEGASOS Expert Group 1 (EG1a)

    Energy Technology Data Exchange (ETDEWEB)

    Schmid, S. M. [Geologisch-Palaeontologisches Institut, University of Basel, Basel (Switzerland); Slejko, D. [Istituto Nazionale di Oceanografia e di Geofisica Sperimentale, Trieste (Italy)

    2009-05-15

    Seismic source characterization is performed as part of the PEGASOS project for the assessment of the seismic hazard at the 4 sites of the Swiss Nuclear Power Plants. The analysis is performed according to the Level 4 procedures for expert elicitation defined in the guidelines of the US Nuclear Regulatory Commission, whereby the quantification of uncertainties plays a crucial role. According to our analysis, which is one amongst four that were performed in the frame of PEGASOS, the most important epistemic uncertainty is related to the question of whether or not basement-rooted faults at the margins of pre-existing Permo-Carboniferous troughs are prone to compressive or transpressive reactivation under the present-day stress field. The question of the present-day style of deformation in the Alpine foreland (thick-skinned versus thin-skinned) is closely related to this key question. Together with the consideration of uncertainties regarding the mapping of seismogenic zones and/or line sources, alternative zonations are presented in the form of a logic tree with 21 branches. Area sources play a predominant role in the working area, located at the margin of a diffuse plate boundary. Earthquake recurrence relationships are discussed by taking into account a series of uncertainties. These concern the evaluation of b-values and the evaluation of a-values once the b-values were fixed. Both parameters in the Gutenberg-Richter law are based on imperfect and incomplete catalogue data that were carefully analysed beforehand. Since PEGASOS demanded an analysis of annual probabilities down to one event in 10^7 years, the question of the value of the maximum possible earthquake magnitude Mmax and the related error in Mmax estimates plays a crucial role. We estimate Mmax by using geological as well as statistical methods. Mmax = 6.9 cannot be excluded in most areas; in the Basel area Mmax = 7.3 is possible. Uncertainties in a, b and Mmax
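
    The recurrence side of such a characterization can be pictured with the Gutenberg-Richter law truncated at a maximum magnitude. In the Python sketch below, the a and b values are illustrative assumptions (only the Mmax of 6.9 echoes the value quoted above), and the simple truncation used is one of several possible forms, not the PEGASOS recurrence model.

        import numpy as np

        a, b, m_min, m_max = 3.5, 1.0, 4.0, 6.9     # a and b are assumed; m_max echoes the abstract

        def annual_rate(m):
            """Annual rate of events with magnitude >= m, forced to zero at m_max (simple truncation)."""
            m = np.asarray(m, dtype=float)
            rate = 10.0**(a - b * m) - 10.0**(a - b * m_max)
            return np.where((m >= m_min) & (m <= m_max), np.maximum(rate, 0.0), 0.0)

        for m in (5.0, 6.0, 6.5):
            print(m, annual_rate(m), 1.0 / annual_rate(m))   # rate and return period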

  4. Long-term storage life of light source modules by temperature cycling accelerated life test

    International Nuclear Information System (INIS)

    Sun Ningning; Tan Manqing; Li Ping; Jiao Jian; Guo Xiaofeng; Guo Wentao

    2014-01-01

    Light source modules are the most crucial and fragile devices affecting the life and reliability of the interferometric fiber optic gyroscope (IFOG). While the light-emitting chips were stable in most cases, the module packaging proved to be less satisfactory. In long-term storage or in the working environment, the ambient temperature changes constantly, and the packaging and coupling performance of light source modules is therefore likely to degrade slowly because the bonding interface joins materials with different coefficients of thermal expansion. A constant-temperature accelerated life test cannot evaluate the impact of temperature variation on the performance of a module package, so a temperature cycling accelerated life test was studied. The main failure mechanism affecting light source modules is package failure due to solder fatigue, including fiber coupling shift, loss of cooling efficiency and thermal resistor degradation, so the Norris-Landzberg model was used to model solder fatigue life and to determine the activation energy related to the solder fatigue failure mechanism. By analyzing the test data, the activation energy was determined and the mean life of light source modules in different storage environments with a continuously changing temperature was then simulated, which provides direct reference data for the storage life prediction of the IFOG. (semiconductor devices)
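
    The Norris-Landzberg model mentioned above relates cycles-to-failure under field and test cycling through the cycling frequency, the temperature swing and the peak temperature. The sketch below uses the commonly quoted SnPb-solder exponents and activation energy and invented cycling conditions; none of the numbers are taken from the paper's test data.

        import math

        # AF = (f_use/f_test)**m * (dT_test/dT_use)**n * exp(Ea/k * (1/Tmax_use - 1/Tmax_test))
        m, n = 1.0 / 3.0, 1.9          # commonly quoted SnPb exponents (assumed applicable here)
        Ea_over_k = 1414.0             # K, i.e. Ea of roughly 0.12 eV (assumed)

        def norris_landzberg(f_use, f_test, dT_use, dT_test, Tmax_use, Tmax_test):
            """Acceleration factor: field cycles-to-failure per test cycle-to-failure (temperatures in K)."""
            return ((f_use / f_test)**m * (dT_test / dT_use)**n
                    * math.exp(Ea_over_k * (1.0 / Tmax_use - 1.0 / Tmax_test)))

        # Example: 1 field cycle/day with a 30 K swing peaking at 318 K, versus a test at
        # 12 cycles/day with a 100 K swing peaking at 398 K (all assumed).
        af = norris_landzberg(1.0, 12.0, 30.0, 100.0, 318.0, 398.0)
        print(af)   # projected field life in cycles is roughly AF x observed test life in cycles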

  5. Modification to ORIGEN2 for generating N Reactor source terms. Volume 1

    International Nuclear Information System (INIS)

    Schwarz, R.A.

    1997-04-01

    This report discusses work that has been done to upgrade the ORIGEN2 code cross sections to be compatible with the WIMS computer code data, and the resulting changes in the ORIGEN2 calculations. Details on the changes made to the ORIGEN2 computer code and the Radnuc code are discussed, along with additional work that should be done in the future to upgrade both ORIGEN2 and Radnuc. A detailed historical description of how source terms have been generated for N Reactor fuel stored in the K Basins has also been prepared. The neutron source discussed in this description was generated by the WIMS computer code (Gubbins et al. 1982) because of known shortcomings in the ORIGEN2 (Croff 1980) cross sections. Another document includes a discussion of the ORIGEN2 cross sections

  6. An Organizational-Technical Concept to Deal with Open Source Software License Terms

    Directory of Open Access Journals (Sweden)

    Sergius Dyck

    2016-06-01

    Full Text Available Open source software (OSS released under various license terms is widely used as third party libraries in today's software projects. To ensure open source compliance within an organization, a strategic approach to OSS management is needed. As basis for such an approach, we introduce an organizational-technical concept for dealing with the various OSS licenses by using procedural instructions and build automation software. The concept includes the careful consideration of OSS license conditions. The results obtained from this consideration and additional necessary commitments are documented in a so-called license playbook. We introduce procedure instructions enabling a consistent approach for software development using OSS libraries. The procedure instructions are described in a way such that they can be implemented for example for Java projects using the popular build automation tool Apache Maven and the software repository tool Nexus. We give guidance on how to realize such an implementation on basis of automation tools in practice.

  7. ORIGAMI Automator Primer. Automated ORIGEN Source Terms and Spent Fuel Storage Pool Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wieselquist, William A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Thompson, Adam B. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bowman, Stephen M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Peterson, Joshua L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-04-01

    Source terms and spent nuclear fuel (SNF) storage pool decay heat load analyses for operating nuclear power plants require a large number of Oak Ridge Isotope Generation and Depletion (ORIGEN) calculations. SNF source term calculations also require a significant amount of bookkeeping to track quantities such as core and assembly operating histories, spent fuel pool (SFP) residence times, heavy metal masses, and enrichments. The ORIGEN Assembly Isotopics (ORIGAMI) module in the SCALE code system provides a simple scheme for entering these data. However, given the large scope of the analysis, extensive scripting is necessary to convert formats and process data to create thousands of ORIGAMI input files (one per assembly) and to process the results into formats readily usable by follow-on analysis tools. This primer describes a project within the SCALE Fulcrum graphical user interface (GUI) called ORIGAMI Automator that was developed to automate the scripting and bookkeeping in large-scale source term analyses. The ORIGAMI Automator enables the analyst to (1) easily create, view, and edit the reactor site and assembly information, (2) automatically create and run ORIGAMI inputs, and (3) analyze the results from ORIGAMI. ORIGAMI Automator uses the standard ORIGEN binary concentrations files produced by ORIGAMI, with concentrations available at all time points in each assembly’s life. The GUI plots results such as mass, concentration, activity, and decay heat using a powerful new ORIGEN Post-Processing Utility for SCALE (OPUS) GUI component. This document includes a description and user guide for the GUI, a step-by-step tutorial for a simplified scenario, and appendices that document the file structures used.

  8. Source Term Characterization for Structural Components in 17 x 17 KOFA Spent Fuel Assembly

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Dong Keun; Kook, Dong Hak; Choi, Heui Joo; Choi, Jong Won [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2010-12-15

    Source terms of the metal waste comprising a spent fuel assembly are relatively important when the spent fuel is pyroprocessed, because cesium, strontium, and transuranics are no longer a concern with respect to the source term for permanent disposal. In this study, the characteristics of the radiation source terms for each structural component in a spent fuel assembly were analyzed by using ORIGEN-S with the assumption that 10 metric tons of uranium is pyroprocessed. At first, the mass and volume of each structural component of the fuel assembly were calculated in detail. An activation cross-section library was generated by using the KENO-VI/ORIGEN-S module for the top-end piece and bottom-end piece, because those are located at the outer core with a different neutron spectrum compared to that of the inner core. As a result, the values of radioactivity, decay heat, and hazard index were revealed to be 1.40 x 10^15 becquerels, 236 watts, and 4.34 x 10^9 m^3-water, respectively, at 10 years after discharge. Those values correspond to 0.7 %, 1.1 %, and 0.1 %, respectively, compared to those of the spent fuel. The Inconel 718 grid plate was shown to be the most important component in all aspects of radioactivity, decay heat, and hazard index, although its mass occupies only 1 % of the total. It was also shown that if the Inconel 718 grid plate is managed separately, the radioactivity and hazard index of the metal waste could be decreased to 20-45 % and 30-45 %, respectively. As a whole, the decay heat of the metal waste was shown to be negligible in the aspect of disposal system design, while the radioactivity and hazard index are important.

  9. Source Term Characteristics Analysis for Structural Components in PWR spent fuel assembly

    Energy Technology Data Exchange (ETDEWEB)

    Kook, Dong Hak; Choi, Heui Joo; Cho, Dong Keun [KAERI, Daejeon (Korea, Republic of)

    2010-12-15

    Source terms of the metal waste comprising a spent fuel assembly are relatively important when the spent fuel is pyroprocessed, because cesium, strontium, and transuranics are no longer a concern with respect to the source term for permanent disposal. In this study, the characteristics of the radiation source terms for each structural component in a spent fuel assembly were analyzed by using ORIGEN-S with the assumption that 10 metric tons of uranium is pyroprocessed. At first, the mass and volume of each structural component of the fuel assembly were calculated in detail. An activation cross-section library was generated by using the KENO-VI/ORIGEN-S module for the top-end piece and bottom-end piece, because those are located at the outer core under a different neutron spectrum compared to that of the inner core. As a result, the values of radioactivity, decay heat, and hazard index were revealed to be 1.32 x 10^15 becquerels, 238 watts, and 4.32 x 10^9 m^3-water, respectively, at 10 years after discharge. Those values correspond to 0.6 %, 1.1 %, and 0.1 %, respectively, compared to those of the spent fuel. The Inconel 718 grid plate was shown to be the most important component in all aspects of radioactivity, decay heat, and hazard index, although its mass occupies only 1 % of the total. It was also shown that if the Inconel 718 grid plate is managed separately, the radioactivity and hazard index of the metal waste could be decreased to 25-50 % and 35-40 %, respectively. As a whole, the decay heat of the metal waste was shown to be negligible in the aspect of disposal system design, while the radioactivity and hazard index are important

  10. Basic repository source term and data sheet report: Deaf Smith County

    International Nuclear Information System (INIS)

    1987-01-01

    This report is one of a series describing studies undertaken in support of the US Department of Energy Civilian Radioactive Waste Management (CRWM) Program. This study contains the derivation of values for environmental source terms and resources consumed for a CRWM repository. Estimates include heavy construction equipment; support equipment; shaft-sinking equipment; transportation equipment; and consumption of fuel, water, electricity, and natural gas. Data are presented for construction and operation at an assumed site in Deaf Smith County, Texas. 2 refs., 6 tabs

  11. EXPERIENCES FROM THE SOURCE-TERM ANALYSIS OF A LOW AND INTERMEDIATE LEVEL RADWASTE DISPOSAL FACILITY

    International Nuclear Information System (INIS)

    Park, Jin Beak; Park, Joo-Wan; Lee, Eun-Young; Kim, Chang-Lak

    2003-01-01

    Enhancement of the computer code SAGE for evaluation of the Korean concept for a LILW disposal facility is discussed. Several features of source term analysis are embedded into SAGE to analyze: (1) the effects of the degradation mode of an engineered barrier, (2) the effects of dispersion phenomena in the unsaturated zone and (3) the effects of a time-dependent sorption coefficient in the unsaturated zone. The IAEA's Vault Safety Case (VSC) approach is used to demonstrate the capability of this assessment code. Results of MASCOT are used for comparison purposes. These enhancements of the safety assessment code SAGE can contribute to a realistic evaluation of the Korean concept for LILW disposal in the near future

  12. Description of apparatus for determining radiological source terms of nuclear fuels

    International Nuclear Information System (INIS)

    Baldwin, D.L.; Woodley, R.E.; Holt, F.E.; Archer, D.V.; Steele, R.T.; Whitkop, P.G.

    1985-01-01

    A new apparatus has been designed and built and is currently being employed to measure the release of volatile fission products from irradiated nuclear fuel. The system is capable of measuring radiological source terms, particularly for cesium-137, cesium-134, iodine-129 and krypton-85, in various atmospheres at temperatures up to 1200 °C. The design allows a rapid transient heatup from ambient to full temperature, a hold at maximum temperature for a specified period, and rapid cooldown. Released fission products are measured as deposition on a platinum thermal gradient tube or in a filter/charcoal trap. Noble gases pass through to a multi-channel gamma analyzer. 1 ref., 4 figs.

  13. The uranium source-term mineralogy and geochemistry at the Broubster natural analogue site, Caithness

    International Nuclear Information System (INIS)

    Milodowski, A.E.; Pearce, J.M.; Basham, I.R.; Hyslop, E.K.

    1991-01-01

    The British Geological Survey (BGS) has been conducting a coordinated research programme at the Broubster natural analogue site in Caithness, north Scotland. This work on a natural radioactive geochemical system has been carried out with the aim of improving our confidence in using predictive models of radionuclide migration in the geosphere. This report is one of a series being produced and it concentrates on the mineralogical characterization of the uranium distribution in the limestone unit considered as the 'source-term' in the natural analogue model

  14. Source term analysis for a criticality accident in metal production line glove boxes

    International Nuclear Information System (INIS)

    Nguyen, D.H.

    1991-06-01

    A recent development in criticality accident analysis is the deterministic calculation of the transport of fission products and actinides through the barriers of the physical facility. Knowledge of the redistribution of the materials inside the facility will help determine the re-entry and clean-up procedures. The amount of radioactive material released to the environment is the source term for dispersion calculations. We have used an integrated computer model to determine the release of fission products to the environment from a hypothetical criticality event in a glove box of the metal production line (MPL) at the Lawrence Livermore National Laboratory (LLNL)

  15. A source term and risk calculations using level 2+PSA methodology

    International Nuclear Information System (INIS)

    Park, S. I.; Jea, M. S.; Jeon, K. D.

    2002-01-01

    The scope of Level 2+ PSA includes the assessment of the dose risk associated with exposures to the radioactive nuclides escaping from nuclear power plants during severe accidents. The establishment of a database for the exposure dose at Korean nuclear power plants may contribute to preparing accident management programs and periodic safety reviews. In this study the ORIGEN, MELCOR and MACCS codes were employed to produce an integrated framework to assess the radiation source term risk. The framework was applied to a reference plant. Using IPE results, the dose rate for the reference plant was calculated quantitatively
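
    A Level 2+ dose-risk figure of this kind is, in essence, a frequency-weighted sum of offsite consequences over release categories. The sketch below illustrates only that aggregation step; the categories, frequencies, and doses are invented for illustration and are not results for the reference plant.

```python
# Illustrative aggregation only; frequencies and doses are invented, not
# results for the reference plant.
release_categories = [
    # (label, frequency per reactor-year, conditional population dose [person-Sv])
    ("early containment failure", 1.0e-7, 5.0e4),
    ("late containment failure",  1.0e-6, 8.0e3),
    ("intact containment leak",   5.0e-6, 2.0e1),
]

dose_risk = sum(freq * dose for _, freq, dose in release_categories)
print(f"Dose risk: {dose_risk:.3g} person-Sv per reactor-year")
```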

  16. The source term and waste optimization of molten salt reactors with processing

    International Nuclear Information System (INIS)

    Gat, U.; Dodds, H.L.

    1993-01-01

    The source term of a molten salt reactor (MSR) with fuel processing is reduced by the ratio of processing time to refueling time as compared to solid fuel reactors. The reduction, which can be one to two orders of magnitude, is due to removal of the long-lived fission products. The waste from MSRs can be optimized with respect to its chemical composition, concentration, mixture, shape, and size. The actinides and long-lived isotopes can be separated out and returned to the reactor for transmutation. These features make MSRs more acceptable and simpler in operation and handling
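
    The reduction quoted above follows from a simple ratio of time scales. As a hedged illustration (the times below are assumed for the example, not taken from the paper):

```python
# Assumed illustrative times, not values from the paper.
processing_time_days = 10.0      # assumed on-line salt processing cycle
refueling_interval_days = 540.0  # assumed 18-month refueling interval of a solid-fuel reactor

reduction_factor = processing_time_days / refueling_interval_days
print(f"Source-term reduction factor ~ {reduction_factor:.3f} "
      f"(about {1.0 / reduction_factor:.0f}x lower, i.e. one to two orders of magnitude)")
```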

  17. Basic repository source term and data sheet report, Cypress Creek Dome: Draft

    International Nuclear Information System (INIS)

    1988-01-01

    This report is one of a series describing studies undertaken in support of the US Department of Energy Civilian Radioactive Waste Management (CRWM) Program. This study contains the derivation of values for environmental source terms and resources consumed for a CRWM repository. Estimates include heavy construction equipment; support equipment; shaft-sinking equipment; transportation equipment; and consumption of fuel, water, electricity, and natural gas. Data are presented for construction and operation at an assumed site in Cypress Creek Dome, Mississippi. 2 refs., 6 tabs

  18. Design parameters and source terms: Volume 1, Design parameters: Revision 0

    International Nuclear Information System (INIS)

    1987-09-01

    The Design Parameters and Source Terms Document was prepared in accordance with a DOE request and to provide data for the environmental impact study to be performed in the future for the Deaf Smith County, Texas, site for a nuclear waste repository in salt. This document updates a previous unpublished report to the level of the Site Characterization Plan - Conceptual Design Report (SCP-CDR). The previous unpublished SCC Study identified the data needs for the Environmental Assessment effort for seven possible salt repository sites

  19. Adiabatic energization in the ring current and its relation to other source and loss terms

    Science.gov (United States)

    Liemohn, M. W.; Kozyra, J. U.; Clauer, C. R.; Khazanov, G. V.; Thomsen, M. F.

    2002-04-01

    The influence of adiabatic energization and deenergization effects, caused by particle drift in radial distance, on ring current growth rates and loss lifetimes is investigated. Growth and loss rates from simulation results of four storms (5 June 1991, 15 May 1997, 19 October 1998, and 25 September 1998) are examined and compared against the y component of the solar wind electric field (Ey,sw). Energy change rates with and without the inclusion of adiabatic energy changes are considered to isolate the influence of this mechanism in governing changes of ring current strength. It is found that the influence of adiabatic drift effects on the energy change rates is very large when energization and deenergization are considered separately as gain and loss mechanisms, often about an order of magnitude larger than all other source or loss terms combined. This is true not only during storm times, when the open drift path configuration of the hot ions dominates the physics of the ring current, but also during quiet times, when the small oscillation in L of the closed trajectories creates a large source and loss of energy each drift orbit. However, the net energy change from adiabatic drift is often smaller than other source and loss processes, especially during quiet times. Energization from adiabatic drift dominates ring current growth only during portions of the main phase of storms. Furthermore, the net-adiabatic energization is often positive, because some particles are lost in the inner magnetosphere before they can adiabatically deenergize. It is shown that the inclusion of only this net-adiabatic drift effect in the total source rate or loss lifetime (depending on the sign of the net-adiabatic energization) best matches the observed source and loss values from empirical Dst predictor methods (that is, for consistency, these values should be compared between the calculation methods). While adiabatic deenergization dominates the loss timescales for all Ey,sw values

  20. PHENOstruct: Prediction of human phenotype ontology terms using heterogeneous data sources.

    Science.gov (United States)

    Kahanda, Indika; Funk, Christopher; Verspoor, Karin; Ben-Hur, Asa

    2015-01-01

    The human phenotype ontology (HPO) was recently developed as a standardized vocabulary for describing the phenotype abnormalities associated with human diseases. At present, only a small fraction of human protein coding genes have HPO annotations. But, researchers believe that a large portion of currently unannotated genes are related to disease phenotypes. Therefore, it is important to predict gene-HPO term associations using accurate computational methods. In this work we demonstrate the performance advantage of the structured SVM approach which was shown to be highly effective for Gene Ontology term prediction in comparison to several baseline methods. Furthermore, we highlight a collection of informative data sources suitable for the problem of predicting gene-HPO associations, including large scale literature mining data.

  1. SARNET. Severe Accident Research Network - key issues in the area of source term

    International Nuclear Information System (INIS)

    Giordano, P.; Micaelli, J.C.; Haste, T.; Herranz, L.

    2005-01-01

    About fifty European organisations integrate their research capacities in SARNET (a Network of Excellence of the EU 6th Framework Programme) to better resolve the most important remaining uncertainties and safety issues concerning existing and future Nuclear Power Plants (NPPs) under hypothetical Severe Accident (SA) conditions. Wishing to maintain a long-lasting cooperation, they conduct three types of activities: integrating activities, spreading of excellence, and jointly executed research. This paper summarises the main results obtained by the network after the first year, giving more prominence to those from jointly executed research in the Source Term area. Integrating activities have been performed through different means: the ASTEC integral computer code for severe accident transient modelling, the development of PSA2 methodologies, the setting up of a structure for the definition of evolving R and D priorities, and the development of a web network of databases that hosts experimental data. Such activities have been facilitated by the development of an Advanced Communication Tool. Concerning spreading of excellence, educational courses covering Severe Accident Analysis Methodology and Level 2 PSA have been set up, to be given in early 2006. A detailed textbook on Severe Accident Phenomenology has been designed and agreed amongst SARNET members. A mobility programme for students and young researchers is being developed; some detachments are already completed or in progress, and examples are quoted. Jointly executed research activities concern key issues grouped in the Corium, Containment and Source Term areas. In Source Term, the behaviour of the highly radio-toxic ruthenium under oxidising conditions (like air ingress) for HBU and MOX fuel has been investigated. First modelling proposals for ASTEC have been made for the oxidation of fuel and of ruthenium. Experiments on the transport of highly volatile ruthenium oxide species have been performed. Reactor

  2. Regulatory Technology Development Plan Sodium Fast Reactor. Mechanistic Source Term Development

    Energy Technology Data Exchange (ETDEWEB)

    Grabaskas, David S. [Argonne National Lab. (ANL), Argonne, IL (United States); Brunett, Acacia Joann [Argonne National Lab. (ANL), Argonne, IL (United States); Bucknor, Matthew D. [Argonne National Lab. (ANL), Argonne, IL (United States); Sienicki, James J. [Argonne National Lab. (ANL), Argonne, IL (United States); Sofu, Tanju [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-02-28

    Construction and operation of a nuclear power installation in the U.S. requires licensing by the U.S. Nuclear Regulatory Commission (NRC). A vital part of this licensing process and integrated safety assessment entails the analysis of a source term (or source terms) that represents the release of radionuclides during normal operation and accident sequences. Historically, nuclear plant source term analyses have utilized deterministic, bounding assessments of the radionuclides released to the environment. Significant advancements in technical capabilities and the knowledge state have enabled the development of more realistic analyses such that a mechanistic source term (MST) assessment is now expected to be a requirement of advanced reactor licensing. This report focuses on the state of development of an MST for a sodium fast reactor (SFR), with the intent of aiding in the process of MST definition by qualitatively identifying and characterizing the major sources and transport processes of radionuclides. Due to common design characteristics among current U.S. SFR vendor designs, a metal-fuel, pool-type SFR has been selected as the reference design for this work, with all phenomenological discussions geared toward this specific reactor configuration. This work also aims to identify the key gaps and uncertainties in the current knowledge state that must be addressed for SFR MST development. It is anticipated that this knowledge state assessment can enable the coordination of technology and analysis tool development discussions such that any knowledge gaps may be addressed. Sources of radionuclides considered in this report include releases originating both in-vessel and ex-vessel, including in-core fuel, primary sodium and cover gas cleanup systems, and spent fuel movement and handling. Transport phenomena affecting various release groups are identified and qualitatively discussed, including fuel pin and primary coolant retention, and behavior in the cover gas and

  3. Probabilistic Tsunami Hazard Assessment: the Seaside, Oregon Pilot Study

    Science.gov (United States)

    Gonzalez, F. I.; Geist, E. L.; Synolakis, C.; Titov, V. V.

    2004-12-01

    A pilot study of Seaside, Oregon is underway, to develop methodologies for probabilistic tsunami hazard assessments that can be incorporated into Flood Insurance Rate Maps (FIRMs) developed by FEMA's National Flood Insurance Program (NFIP). Current NFIP guidelines for tsunami hazard assessment rely on the science, technology and methodologies developed in the 1970s; although generally regarded as groundbreaking and state-of-the-art for its time, this approach is now superseded by modern methods that reflect substantial advances in tsunami research achieved in the last two decades. In particular, post-1990 technical advances include: improvements in tsunami source specification; improved tsunami inundation models; better computational grids by virtue of improved bathymetric and topographic databases; a larger database of long-term paleoseismic and paleotsunami records and short-term, historical earthquake and tsunami records that can be exploited to develop improved probabilistic methodologies; better understanding of earthquake recurrence and probability models. The NOAA-led U.S. National Tsunami Hazard Mitigation Program (NTHMP), in partnership with FEMA, USGS, NSF and Emergency Management and Geotechnical agencies of the five Pacific States, incorporates these advances into site-specific tsunami hazard assessments for coastal communities in Alaska, California, Hawaii, Oregon and Washington. NTHMP hazard assessment efforts currently focus on developing deterministic, "credible worst-case" scenarios that provide valuable guidance for hazard mitigation and emergency management. The NFIP focus, on the other hand, is on actuarial needs that require probabilistic hazard assessments such as those that characterize 100- and 500-year flooding events. There are clearly overlaps in NFIP and NTHMP objectives. NTHMP worst-case scenario assessments that include an estimated probability of occurrence could benefit the NFIP; NFIP probabilistic assessments of 100- and 500-yr
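
    As an aside on the probabilistic framing referred to above (not a result of the pilot study), the relation between a return period and the chance of exceedance over an exposure window is a one-line calculation:

```python
def exceedance_probability(return_period_years, exposure_years):
    """Chance of at least one exceedance of a T-year event over an n-year window."""
    annual_probability = 1.0 / return_period_years
    return 1.0 - (1.0 - annual_probability) ** exposure_years

for return_period in (100, 500):
    p = exceedance_probability(return_period, exposure_years=30)
    print(f"{return_period}-year event over a 30-year exposure: {p:.1%}")
```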

  4. Probabilistic tsunami hazard assessment for Point Lepreau Generating Station

    Energy Technology Data Exchange (ETDEWEB)

    Mullin, D., E-mail: dmullin@nbpower.com [New Brunswick Power Corporation, Point Lepreau Generating Station, Point Lepreau (Canada); Alcinov, T.; Roussel, P.; Lavine, A.; Arcos, M.E.M.; Hanson, K.; Youngs, R., E-mail: trajce.alcinov@amecfw.com, E-mail: patrick.roussel@amecfw.com [AMEC Foster Wheeler Environment & Infrastructure, Dartmouth, NS (Canada)

    2015-07-01

    In 2012 the Geological Survey of Canada published a preliminary probabilistic tsunami hazard assessment in Open File 7201 that presents the most up-to-date information on all potential tsunami sources in a probabilistic framework on a national level, thus providing the underlying basis for conducting site-specific tsunami hazard assessments. However, the assessment identified a poorly constrained hazard for the Atlantic coastline and recommended further evaluation. As a result, NB Power has embarked on performing a Probabilistic Tsunami Hazard Assessment (PTHA) for Point Lepreau Generating Station. This paper provides the methodology and progress of the hazard evaluation for Point Lepreau G.S. (author)

  5. Chernobyl radiocesium in freshwater fish: Long-term dynamics and sources of variation

    Energy Technology Data Exchange (ETDEWEB)

    Sundbom, M [Uppsala Univ., Dept. of Limnology, Uppsala (Sweden)

    2002-04-01

    The aim of this thesis was to investigate both the long-term temporal pattern and the sources of individual variation for radiocesium in freshwater fish. The basis for the study is time series of 137Cs activity concentrations in fish from three lakes in the area north-west of Uppsala, Sweden, that received considerable amounts of 137Cs from Chernobyl in May 1986. The lakes were Lake Ekholmssjoen, Lake Flatsjoen and Lake Siggeforasjoen, all small forest lakes but with different morphometrical and chemical characteristics. The data were collected regularly, usually several times per year, during 1986-2000, using consistent methods. More than 7600 fish individuals from 7 species, covering wide size ranges and feeding habits, were analysed for 137Cs. For each fish, the length, weight, sex, and often the stomach content were recorded. The evaluation of long-term trends was based on data from all three lakes, while the study of sources of variation evaluated data from Lake Flatsjoen only. (au)

  6. The Chernobyl reactor accident source term: Development of a consensus view

    International Nuclear Information System (INIS)

    Guntay, S.; Powers, D.A.; Devell, L.

    1997-01-01

    In August 1986, scientists from the former Soviet Union provided the nuclear safety community with an impressively detailed account of what was then known about the Chernobyl accident. This included assessments of the magnitudes, rates, and compositions of radionuclide releases during the ten days following initiation of the accident. A summary report based on the Soviet report, the oral presentations, and the discussions with scientists from various countries was issued by the International Atomic Energy Agency shortly thereafter. Ten years have elapsed since the reactor accident at Chernobyl. A great deal more data is now available concerning the events, phenomena, and processes that took place. The purpose of this document is to examine what is known about the radioactive materials released during the accident. The accident was peculiar in the sense that radioactive materials were released, at least initially, in an exceptionally energetic plume and were transported far from the reactor site. Release of radioactivity from the plant continued for about ten days. A number of more recent publications and results from scientists in Russia and elsewhere have significantly improved our understanding of the Chernobyl source term. Because of the special features of the reactor design and the peculiarities of the Chernobyl accident, the source term for the Chernobyl accident is of limited applicability to the safety analysis of other types of reactors

  7. Source terms associated with two severe accident sequences in a 900 MWe PWR

    International Nuclear Information System (INIS)

    Fermandjian, J.; Evrard, J.M.; Berthion, Y.; Lhiaubet, G.; Lucas, M.

    1983-12-01

    Hypothetical accidents taken into account in PWR risk assessment result in fission product release from the fuel, transfer through the primary circuit, transfer into the reactor containment building (RCB) and finally release to the environment. The objective of this paper is to define the characteristics of the source term (noble gases, particles and volatile iodine forms) released from the reactor containment building during two dominant core-melt accident sequences: S2CD and TLB according to the 'Reactor Safety Study' terminology. The reactor chosen for this study is a French 900 MWe PWR unit. The reactor building is a prestressed concrete containment with an internal liner. The first core-melt accident sequence is a 2-inch break loss-of-coolant accident on the cold leg, with failure of both the safety injection system and the containment spray system. The second one is a transient initiated by a loss of offsite and onsite power supply and of the auxiliary feedwater system. These two sequences have been chosen because they are representative of risk-dominant scenarios. The source terms associated with the hypothetical core-melt accidents S2CD and TLB in a French 900 MWe PWR have been evaluated using French computer codes (in particular, the JERICHO code for containment response analysis and AEROSOLS/31 for aerosol behavior in the containment)

  8. Source term and behavioural parameters for a postulated HIFAR loss-of-coolant accident

    International Nuclear Information System (INIS)

    May, F.G.

    1987-01-01

    The fraction of the fission product inventory which might be released into the atmosphere of the HIFAR reactor containment building (RCB) during a postulated loss-of-coolant accident (LOCA) has been evaluated as a function of time, for each classification of airborne radioactivity. This appraisal will be used as the source term for a computer program which uses realistic attenuation of the fission product aerosol in a single-compartment model with a defined leak rate to predict possible radioactive releases into the environment in a hypothetical bounding-case reactor accident that is rather more severe in all major aspects than any single LOCA. Also given are the parameters governing the attenuation of the aerosol and vapours in the atmosphere of the RCB, so that their behaviour may be accurately modelled. The source terms for several other types of accident involving the meltdown of fuel elements have also been considered, but in less detail than the LOCA case. In some of these cases, the fission products are released directly to atmosphere, so there is no attenuation of the release by deposition within the RCB

  9. Source terms; isolation and radiological consequences of carbon-14 waste in the Swedish SFR repository

    International Nuclear Information System (INIS)

    Hesboel, R.; Puigdomenech, I.; Evans, S.

    1990-01-01

    The source term, isolation capacity, and long-term radiological exposure from 14C in the Swedish underground repository for low and intermediate level waste (SFR) are assessed. The prospective amount of 14C in the repository is assumed to be 5 TBq. Spent ion exchange resins will be the dominant source of 14C. The pore water in the concrete repository is expected to maintain a pH of >10.5 for a period of at least 10^6 y. The cement matrix of the repository will retain most of the 14CO3^2- initially present. Bacterial production of CO2 and CH4 from degradation of ion-exchange resins and bitumen may contribute to 14C release to the biosphere. However, CH4 contributes only to a small extent to the overall carbon loss from freshwater ecosystems. The individual doses to local and regional individuals peaked at 5x10^-3 and 8x10^-4 μSv y^-1, respectively, at about 2.4x10^4 years. A total leakage of 8.4 GBq of 14C from the repository will cause a total collective dose commitment of 1.1 manSv, or 130 manSv TBq^-1. (authors)
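
    The normalized collective dose quoted at the end is consistent with the other figures in the abstract, as the following one-line check shows (values copied from the abstract):

```python
release_TBq = 8.4e9 / 1.0e12   # 8.4 GBq expressed in TBq
collective_dose_manSv = 1.1    # quoted collective dose commitment

print(f"{collective_dose_manSv / release_TBq:.0f} manSv per TBq released")  # roughly 130
```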

  10. Projected Source Terms for Potential Sabotage Events Related to Spent Fuel Shipments

    International Nuclear Information System (INIS)

    Luna, R.E.; Neuhauser, K.S.; Vigil, M.G.

    1999-01-01

    Two major studies, one sponsored by the U.S. Department of Energy and the other by the U.S. Nuclear Regulatory Commission, were conducted in the late 1970s and early 1980s to provide information and source terms for an optimally successful act of sabotage on spent fuel casks typical of those available for use. This report applies the results of those studies and additional analysis to derive potential source terms for certain classes of sabotage events on spent fuel casks and spent fuel typical of those which could be shipped in the early decades of the 21st century. In addition to updating the cask and spent fuel characteristics used in the analysis, two release mechanisms not included in the earlier works were identified and evaluated. As would be expected, inclusion of these additional release mechanisms resulted in a somewhat higher total release from the postulated sabotage events. Although health effects from estimated releases were addressed in the earlier study conducted for the U.S. Department of Energy, they have not been addressed in this report. The results from this report may be used to estimate health effects

  11. Chernobyl radiocesium in freshwater fish: Long-term dynamics and sources of variation

    International Nuclear Information System (INIS)

    Sundbom, M.

    2002-01-01

    The aim of this thesis was to investigate both the long-term temporal pattern and the sources of individual variation for radiocesium in freshwater fish. The basis for the study is time series of 137Cs activity concentrations in fish from three lakes in the area north-west of Uppsala, Sweden, that received considerable amounts of 137Cs from Chernobyl in May 1986. The lakes were Lake Ekholmssjoen, Lake Flatsjoen and Lake Siggeforasjoen, all small forest lakes but with different morphometrical and chemical characteristics. The data were collected regularly, usually several times per year, during 1986-2000, using consistent methods. More than 7600 fish individuals from 7 species, covering wide size ranges and feeding habits, were analysed for 137Cs. For each fish, the length, weight, sex, and often the stomach content were recorded. The evaluation of long-term trends was based on data from all three lakes, while the study of sources of variation evaluated data from Lake Flatsjoen only. (au)

  12. Long-term dust aerosol production from natural sources in Iceland.

    Science.gov (United States)

    Dagsson-Waldhauserova, Pavla; Arnalds, Olafur; Olafsson, Haraldur

    2017-02-01

    Iceland is a volcanic island in the North Atlantic Ocean with a maritime climate. In spite of the moist climate, large areas have limited vegetation cover: >40% of Iceland is classified as having considerable to very severe erosion and 21% of Iceland is volcanic sandy desert. Not only do natural emissions from these sources, driven by strong winds, affect regional air quality in Iceland ("Reykjavik haze"), but dust particles are at times transported >1000 km over the Atlantic and Arctic Oceans. The aim of this paper is to place the Icelandic dust production area into an international perspective, present the long-term frequency of dust storm events in northeast Iceland, and estimate dust aerosol concentrations during reported dust events. Meteorological observations with dust presence codes and related visibility were used to identify the frequency of, and the long-term changes in, dust production in northeast Iceland. There were on average 16.4 days annually with reported dust observations at weather stations within the northeastern erosion area, indicating extreme dust plume activity and erosion within the northeastern deserts, even though the area is covered with snow during the major part of winter. During the 2000s the highest occurrence of dust events in six decades was reported. We have measured saltation and aeolian transport during dust/volcanic ash storms in Iceland, which include some of the most intense wind erosion events ever measured. Icelandic dust affects the ecosystems over much of Iceland and causes regional haze. It is likely to affect the ecosystems of the oceans around Iceland, and it brings dust that lowers the albedo of the Icelandic glaciers, increasing melt-off due to global warming. The study indicates that Icelandic dust may contribute to Arctic air pollution. Long-term records of meteorological dust observations from Northeast Iceland indicate the frequency of dust events from the Icelandic deserts. The research covers a 60-year period and

  13. Challenges in defining a radiologic and hydrologic source term for underground nuclear test centers, Nevada Test Site, Nye County, Nevada

    International Nuclear Information System (INIS)

    Smith, D.K.

    1995-06-01

    The compilation of a radionuclide inventory for long-lived radioactive contaminants residual from nuclear testing provides a partial measure of the radiologic source term at the Nevada Test Site. The radiologic source term also includes potentially mobile short-lived radionuclides excluded from the inventory. The radiologic source term for tritium is known with accuracy and is equivalent to the hydrologic source term within the saturated zone. Definition of the total hydrologic source term for fission and activation products that have high activities for decades following underground testing involves knowledge and assumptions which are presently unavailable. Systematic investigation of the behavior of fission products, activation products and actinides under saturated or partially saturated conditions is imperative to define a representative total hydrologic source term. This is particularly important given the heterogeneous distribution of radionuclides within testing centers. Data quality objectives which emphasize a combination of measurements and credible estimates of the hydrologic source term are a priority for near-field investigations at the Nevada Test Site

  14. Probabilistic assessment of faults

    International Nuclear Information System (INIS)

    Foden, R.W.

    1987-01-01

    Probabilistic safety analysis (PSA) is the process by which the probability (or frequency of occurrence) of reactor fault conditions which could lead to unacceptable consequences is assessed. The basic objective of a PSA is to allow a judgement to be made as to whether or not the principal probabilistic requirement is satisfied. It also gives insights into the reliability of the plant which can be used to identify possible improvements. This is explained in the article. The scope of a PSA and the PSA performed by the National Nuclear Corporation (NNC) for the Heysham II and Torness AGRs and Sizewell-B PWR are discussed. The NNC methods for hazards, common cause failure and operator error are mentioned. (UK)

  15. Environmental radiation safety: source term modification by soil aerosols. Interim report

    International Nuclear Information System (INIS)

    Moss, O.R.; Allen, M.D.; Rossignol, E.J.; Cannon, W.C.

    1980-08-01

    The goal of this project is to provide information useful in estimating hazards related to the use of a pure refractory oxide of 238Pu as a power source in some of the space vehicles to be launched during the next few years. Although the sources are designed and built to withstand re-entry into the earth's atmosphere and to impact with the earth's surface without releasing any plutonium, the possibility that such an event might produce aerosols composed of soil and 238PuO2 cannot be absolutely excluded. This report presents the results of our most recent efforts to measure the degree to which the plutonium aerosol source term might be modified in a terrestrial environment. The five experiments described represent our best effort to use the original experimental design to study the change in the size distribution and concentration of a 238PuO2 aerosol due to coagulation with an aerosol of clay or sandy loam soil

  16. Source-term development for a contaminant plume for use by multimedia risk assessment models

    International Nuclear Information System (INIS)

    Whelan, Gene; McDonald, John P.; Taira, Randal Y.; Gnanapragasam, Emmanuel K.; Yu, Charley; Lew, Christine S.; Mills, William B.

    1999-01-01

    Multimedia modelers from the U.S. Environmental Protection Agency (EPA) and the U.S. Department of Energy (DOE) are collaborating to conduct a comprehensive and quantitative benchmarking analysis of four intermedia models: DOE's Multimedia Environmental Pollutant Assessment System (MEPAS), EPA's MMSOILS, EPA's PRESTO, and DOE's RESidual RADioactivity (RESRAD). These models represent typical analytically, semi-analytically, and empirically based tools that are utilized in human risk and endangerment assessments at installations containing radioactive and/or hazardous contaminants. Although the benchmarking exercise traditionally emphasizes the application and comparison of these models, the establishment of a Conceptual Site Model (CSM) should be viewed with equal importance. This paper reviews an approach for developing a CSM of an existing, real-world Sr-90 plume at DOE's Hanford installation in Richland, Washington, for use in a multimedia-based benchmarking exercise between MEPAS, MMSOILS, PRESTO, and RESRAD. In an unconventional move for analytically based modeling, the benchmarking exercise will begin with the plume as the source of contamination. The source and release mechanism are developed and described within the context of performing a preliminary risk assessment utilizing these analytical models. By beginning with the plume as the source term, this paper reviews a typical process and procedure an analyst would follow in developing a CSM for use in a preliminary assessment using this class of analytical tool

  17. Jet flow analysis of liquid poison injection in a CANDU reactor using source term

    International Nuclear Information System (INIS)

    Chae, Kyung Myung; Choi, Hang Bok; Rhee, Bo Wook

    2001-01-01

    For the performance analysis of the Canadian deuterium uranium (CANDU) reactor shutdown system number 2 (SDS2), a computational fluid dynamics model of the poison jet flow has been developed to estimate the flow field and the poison concentration formed inside the CANDU reactor calandria. As the ratio of the calandria shell radius to the injection nozzle hole diameter is very large (1055), it is impractical to develop a full-size model encompassing the whole calandria shell. In order to reduce the model to a manageable size, a quarter of a one-pitch-length segment of the shell was modeled using the symmetric nature of the jet, and the injected jet was treated as a source term to avoid the modeling difficulty caused by the large difference in scale between the shell and the nozzle holes. For the analysis of an actual CANDU-6 SDS2 poison injection, the grid structure was determined based on the results of two-dimensional real- and source-jet simulations. The maximum injection velocity of the liquid poison is 27.8 m/s and the mass fraction of the poison is 8000 ppm (mg/kg). The simulation results have shown a well-established jet flow field. In general, the jet develops narrowly at first but stretches rapidly. Then the flow recirculates a little in the r-x plane, while it recirculates largely in the r-θ plane. As time goes on, the adjacent jets contact each other and form a wavy front such that the whole jet develops into a plate-like form. This study has shown that the source term model can be effectively used for the analysis of the poison injection and that the simulation result for the CANDU reactor is consistent with the model currently being used for the safety analysis. In the future, it is strongly recommended to analyze the transient (from the helium tank to the injection nozzle hole) of the poison injection by applying the Bernoulli equation with real boundary conditions
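
    The "source term" treatment mentioned above amounts to adding volumetric mass, momentum, and species sources to the cells receiving the jet instead of resolving the small nozzle hole. The sketch below is a generic illustration of that idea, not the actual CANDU SDS2 model; the hole diameter, cell volume, and liquid density are assumed values, while the 27.8 m/s velocity and 8000 ppm mass fraction are taken from the abstract.

```python
import math

# Quoted in the abstract:
u_inj = 27.8           # injection velocity [m/s]
y_poison = 8000e-6     # poison mass fraction [kg/kg]

# Assumed for illustration only:
rho = 1000.0           # liquid density [kg/m^3]
d_hole = 0.006         # nozzle hole diameter [m]
v_cell = 1.0e-4        # volume of the cell receiving the jet [m^3]

a_hole = math.pi * (d_hole / 2.0) ** 2
m_dot = rho * u_inj * a_hole            # mass flow through one hole [kg/s]

s_mass = m_dot / v_cell                 # continuity source [kg/(m^3 s)]
s_momentum = m_dot * u_inj / v_cell     # momentum source [N/m^3]
s_species = m_dot * y_poison / v_cell   # poison mass-fraction source [kg/(m^3 s)]

print(f"S_mass = {s_mass:.3g}, S_momentum = {s_momentum:.3g}, S_poison = {s_species:.3g}")
```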

  18. Regulatory Technology Development Plan - Sodium Fast Reactor. Mechanistic Source Term - Metal Fuel Radionuclide Release

    International Nuclear Information System (INIS)

    Grabaskas, David; Bucknor, Matthew; Jerden, James

    2016-01-01

    The development of an accurate and defensible mechanistic source term will be vital for the future licensing efforts of metal fuel, pool-type sodium fast reactors. To assist in the creation of a comprehensive mechanistic source term, the current effort sought to estimate the release fraction of radionuclides from metal fuel pins to the primary sodium coolant during fuel pin failures at a variety of temperature conditions. These release estimates were based on the findings of an extensive literature search, which reviewed past experimentation and reactor fuel damage accidents. Data sources for each radionuclide of interest were reviewed to establish release fractions, along with possible release dependencies, and the corresponding uncertainty levels. Although the current knowledge base is substantial, and radionuclide release fractions were established for the elements deemed important for the determination of offsite consequences following a reactor accident, gaps were found pertaining to several radionuclides. First, there is uncertainty regarding the transport behavior of several radionuclides (iodine, barium, strontium, tellurium, and europium) during metal fuel irradiation to high burnup levels. The migration of these radionuclides within the fuel matrix and bond sodium region can greatly affect their release during pin failure incidents. Post-irradiation examination of existing high burnup metal fuel can likely resolve this knowledge gap. Second, data regarding the radionuclide release from molten high burnup metal fuel in sodium is sparse, which makes the assessment of radionuclide release from fuel melting accidents at high fuel burnup levels difficult. This gap could be addressed through fuel melting experimentation with samples from the existing high burnup metal fuel inventory.

  19. Regulatory Technology Development Plan - Sodium Fast Reactor. Mechanistic Source Term - Metal Fuel Radionuclide Release

    Energy Technology Data Exchange (ETDEWEB)

    Grabaskas, David [Argonne National Lab. (ANL), Argonne, IL (United States); Bucknor, Matthew [Argonne National Lab. (ANL), Argonne, IL (United States); Jerden, James [Argonne National Lab. (ANL), Argonne, IL (United States)

    2016-02-01

    The development of an accurate and defensible mechanistic source term will be vital for the future licensing efforts of metal fuel, pool-type sodium fast reactors. To assist in the creation of a comprehensive mechanistic source term, the current effort sought to estimate the release fraction of radionuclides from metal fuel pins to the primary sodium coolant during fuel pin failures at a variety of temperature conditions. These release estimates were based on the findings of an extensive literature search, which reviewed past experimentation and reactor fuel damage accidents. Data sources for each radionuclide of interest were reviewed to establish release fractions, along with possible release dependencies, and the corresponding uncertainty levels. Although the current knowledge base is substantial, and radionuclide release fractions were established for the elements deemed important for the determination of offsite consequences following a reactor accident, gaps were found pertaining to several radionuclides. First, there is uncertainty regarding the transport behavior of several radionuclides (iodine, barium, strontium, tellurium, and europium) during metal fuel irradiation to high burnup levels. The migration of these radionuclides within the fuel matrix and bond sodium region can greatly affect their release during pin failure incidents. Post-irradiation examination of existing high burnup metal fuel can likely resolve this knowledge gap. Second, data regarding the radionuclide release from molten high burnup metal fuel in sodium is sparse, which makes the assessment of radionuclide release from fuel melting accidents at high fuel burnup levels difficult. This gap could be addressed through fuel melting experimentation with samples from the existing high burnup metal fuel inventory.

  20. Probabilistic Model Development

    Science.gov (United States)

    Adam, James H., Jr.

    2010-01-01

    Objective: Develop a Probabilistic Model for the Solar Energetic Particle Environment. Develop a tool to provide a reference solar particle radiation environment that: 1) Will not be exceeded at a user-specified confidence level; 2) Will provide reference environments for: a) Peak flux; b) Event-integrated fluence; and c) Mission-integrated fluence. The reference environments will consist of: a) Elemental energy spectra; b) For protons, helium and heavier ions.
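
    One common reading of a "reference environment that will not be exceeded at a user-specified confidence level" is a percentile of the underlying distribution. The sketch below illustrates that reading with synthetic lognormal samples standing in for solar energetic particle data; it is not the model described above.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in for simulated/observed event-integrated fluences (arbitrary units).
event_fluence = rng.lognormal(mean=7.0, sigma=1.2, size=10_000)

confidence = 0.95  # user-specified confidence level
reference_fluence = np.percentile(event_fluence, 100.0 * confidence)
print(f"Reference fluence not exceeded at {confidence:.0%} confidence: {reference_fluence:.3g}")
```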

  1. Geothermal probabilistic cost study

    Energy Technology Data Exchange (ETDEWEB)

    Orren, L.H.; Ziman, G.M.; Jones, S.C.; Lee, T.K.; Noll, R.; Wilde, L.; Sadanand, V.

    1981-08-01

    A tool is presented to quantify the risks of geothermal projects, the Geothermal Probabilistic Cost Model (GPCM). The GPCM model is used to evaluate a geothermal reservoir for a binary-cycle electric plant at Heber, California. Three institutional aspects of the geothermal risk which can shift the risk among different agents are analyzed. The leasing of geothermal land, contracting between the producer and the user of the geothermal heat, and insurance against faulty performance are examined. (MHR)

  2. Probabilistic approaches to recommendations

    CERN Document Server

    Barbieri, Nicola; Ritacco, Ettore

    2014-01-01

    The importance of accurate recommender systems has been widely recognized by academia and industry, and recommendation is rapidly becoming one of the most successful applications of data mining and machine learning. Understanding and predicting the choices and preferences of users is a challenging task: real-world scenarios involve users behaving in complex situations, where prior beliefs, specific tendencies, and reciprocal influences jointly contribute to determining the preferences of users toward huge amounts of information, services, and products. Probabilistic modeling represents a robust

  3. Probabilistic liver atlas construction.

    Science.gov (United States)

    Dura, Esther; Domingo, Juan; Ayala, Guillermo; Marti-Bonmati, Luis; Goceri, E

    2017-01-13

    Anatomical atlases are 3D volumes or shapes representing an organ or structure of the human body. They contain either the prototypical shape of the object of interest together with other shapes representing its statistical variations (statistical atlas) or a probability map of belonging to the object (probabilistic atlas). Probabilistic atlases are mostly built with simple estimations only involving the data at each spatial location. A new method for probabilistic atlas construction that uses a generalized linear model is proposed. This method aims to improve the estimation of the probability of being covered by the liver. Furthermore, all methods to build an atlas involve a previous coregistration of the sample of shapes available. The influence of the geometrical transformation adopted for registration on the quality of the final atlas has not been sufficiently investigated. The ability of an atlas to adapt to a new case is one of the most important quality criteria that should be taken into account. The presented experiments show that some methods for atlas construction are severely affected by the previous coregistration step. We show the good performance of the new approach. Furthermore, results suggest that extremely flexible registration methods are not always beneficial, since they can reduce the variability of the atlas and hence its ability to give sensible values of probability when used as an aid in segmentation of new cases.
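
    As a rough illustration of the difference between a per-voxel frequency atlas and a GLM-based atlas, the sketch below uses synthetic 2D masks and simple spatial covariates in a logistic regression; this is an illustrative stand-in under assumed covariates, not the authors' model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n_subjects, nx, ny = 20, 32, 32

# Synthetic "registered" binary masks: discs whose radius varies per subject.
xx, yy = np.meshgrid(np.linspace(-1, 1, nx), np.linspace(-1, 1, ny), indexing="ij")
r = np.sqrt(xx**2 + yy**2)
masks = np.stack([(r < rng.normal(0.6, 0.05)).astype(int) for _ in range(n_subjects)])

# Simple probabilistic atlas: per-voxel frequency of coverage.
atlas_freq = masks.mean(axis=0)

# GLM-style atlas: logistic regression on spatial covariates pools information
# across voxels instead of estimating each voxel independently.
X = np.column_stack([xx.ravel(), yy.ravel(), r.ravel(), r.ravel() ** 2])
X_all = np.tile(X, (n_subjects, 1))
y = masks.reshape(n_subjects, -1).ravel()
glm = LogisticRegression(max_iter=1000).fit(X_all, y)
atlas_glm = glm.predict_proba(X)[:, 1].reshape(nx, ny)

print("frequency atlas, centre voxel:", atlas_freq[nx // 2, ny // 2])
print("GLM atlas, centre voxel      :", round(atlas_glm[nx // 2, ny // 2], 3))
```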

  4. The characterisation and evaluation of uncertainty in probabilistic risk analysis

    International Nuclear Information System (INIS)

    Parry, G.W.; Winter, P.W.

    1980-10-01

    The sources of uncertainty in probabilistic risk analysis are discussed using the event/fault tree methodology as an example. The role of statistics in quantifying these uncertainties is investigated. A class of uncertainties is identified which is, at present, unquantifiable, using either classical or Bayesian statistics. It is argued that Bayesian statistics is the more appropriate vehicle for the probabilistic analysis of rare events and a short review is given with some discussion on the representation of ignorance. (author)

  5. DUSTMS-D: DISPOSAL UNIT SOURCE TERM - MULTIPLE SPECIES - DISTRIBUTED FAILURE DATA INPUT GUIDE.

    Energy Technology Data Exchange (ETDEWEB)

    SULLIVAN, T.M.

    2006-01-01

    Performance assessment of a low-level waste (LLW) disposal facility begins with an estimation of the rate at which radionuclides migrate out of the facility (i.e., the source term). The focus of this work is to develop a methodology for calculating the source term. In general, the source term is influenced by the radionuclide inventory, the wasteforms and containers used to dispose of the inventory, and the physical processes that lead to release from the facility (fluid flow, container degradation, wasteform leaching, and radionuclide transport). Many of these physical processes are influenced by the design of the disposal facility (e.g., how the engineered barriers control infiltration of water). The complexity of the problem and the absence of appropriate data prevent development of an entirely mechanistic representation of radionuclide release from a disposal facility. Typically, a number of assumptions, based on knowledge of the disposal system, are used to simplify the problem. This has been done and the resulting models have been incorporated into the computer code DUST-MS (Disposal Unit Source Term-Multiple Species). The DUST-MS computer code is designed to model water flow, container degradation, release of contaminants from the wasteform to the contacting solution, and transport through the subsurface media. Water flow through the facility over time is modeled using tabular input. Container degradation models include three types of failure rates: (a) instantaneous (all containers in a control volume fail at once), (b) uniformly distributed failures (containers fail at a linear rate between a specified starting and ending time), and (c) Gaussian failure rates (containers fail at a rate determined by a mean failure time, standard deviation and Gaussian distribution). Wasteform release models include four release mechanisms: (a) rinse with partitioning (inventory is released instantly upon container failure subject to equilibrium partitioning (sorption) with
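
    The three container-failure options described above can be summarized as cumulative failed fractions over time. The sketch below illustrates those distributions only; it is not the DUST-MS implementation, and all parameter values are hypothetical.

```python
import math

def failed_fraction_instantaneous(t, t_fail):
    """All containers in the control volume fail at once at t_fail."""
    return 1.0 if t >= t_fail else 0.0

def failed_fraction_uniform(t, t_start, t_end):
    """Containers fail at a constant rate between t_start and t_end."""
    if t <= t_start:
        return 0.0
    if t >= t_end:
        return 1.0
    return (t - t_start) / (t_end - t_start)

def failed_fraction_gaussian(t, t_mean, sigma):
    """Failure times normally distributed about t_mean (cumulative normal)."""
    return 0.5 * (1.0 + math.erf((t - t_mean) / (sigma * math.sqrt(2.0))))

for t in (0.0, 50.0, 100.0, 200.0):  # years, hypothetical
    print(t,
          failed_fraction_instantaneous(t, t_fail=100.0),
          round(failed_fraction_uniform(t, t_start=50.0, t_end=150.0), 3),
          round(failed_fraction_gaussian(t, t_mean=100.0, sigma=25.0), 3))
```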

  6. A note on variational multiscale methods for high-contrast heterogeneous porous media flows with rough source terms

    KAUST Repository

    Calo, Victor M.

    2011-09-01

    In this short note, we discuss variational multiscale methods for solving porous media flows in high-contrast heterogeneous media with rough source terms. Our objective is to separate, as much as possible, subgrid effects induced by the media properties from those due to heterogeneous source terms. For this reason, enriched coarse spaces designed for high-contrast multiscale problems are used to represent the effects of heterogeneities of the media. Furthermore, rough source terms are captured via auxiliary correction equations that appear in the formulation of variational multiscale methods [23]. These auxiliary equations are localized and one can use additive or multiplicative constructions for the subgrid corrections as discussed in the current paper. Our preliminary numerical results show that one can capture the effects due to both spatial heterogeneities in the coefficients (such as permeability field) and source terms (e.g., due to singular well terms) in one iteration. We test the cases for both smooth source terms and rough source terms and show that with the multiplicative correction, the numerical approximations are more accurate compared to the additive correction. © 2010 Elsevier Ltd.

  7. A novel integrated approach for the hazardous radioactive dust source terms estimation in future nuclear fusion power plants.

    Science.gov (United States)

    Poggi, L A; Malizia, A; Ciparisse, J F; Gaudio, P

    2016-10-01

    An open issue still under investigation by several international entities working in the safety and security field for the foreseen nuclear fusion reactors is the estimation of source terms that are a hazard for the operators and the public, and for the machine itself in terms of efficiency and integrity in case of severe accident scenarios. Source term estimation is a crucial safety issue to be addressed in future reactor safety assessments, and the estimates available at this time are not sufficiently satisfactory. The lack of neutronic data, along with the insufficiently accurate methodologies used until now, calls for an integrated methodology for source term estimation that can provide predictions with adequate accuracy. This work proposes a complete methodology to estimate dust source terms starting from a broad information-gathering exercise. The large number of parameters that can influence dust source term production is reduced with statistical tools using a combination of screening, sensitivity analysis, and uncertainty analysis. Finally, a preliminary and simplified methodology for predicting dust source term production in future devices is presented.

  8. Novel Slope Source Term Treatment for Preservation of Quiescent Steady States in Shallow Water Flows

    Directory of Open Access Journals (Sweden)

    Khawar Rehman

    2016-10-01

    This paper proposes a robust method for modeling shallow-water flows and near-shore tsunami propagation, applicable to both simple and complex geometries with uneven beds. The novel aspect of the model is the introduction of a new method of slope source-term treatment to preserve quiescent equilibrium over uneven topographies, applicable to both structured and unstructured mesh systems with equal accuracy. Our model is based on a Godunov-type finite volume numerical approximation. Second-order spatial and temporal accuracy is achieved through high-resolution gradient reconstruction and the predictor-corrector method, respectively. The approximate Riemann solver of Harten, Lax, and van Leer with contact wave restoration (HLLC) is used to compute fluxes. Comparisons of the model's results with analytical, experimental, and published numerical solutions show that the proposed method is capable of accurately predicting experimental and real-time tsunami propagation/inundation and dam-break flows over varying topographies.
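
    The quiescent-equilibrium (lake-at-rest) property mentioned above requires the discrete bed-slope source term to cancel the hydrostatic pressure-flux gradient exactly. The sketch below illustrates that balance in one dimension with a simple interface-averaged source; it is a generic illustration of well-balancing under assumed discretizations, not the scheme proposed in the paper.

```python
import numpy as np

np.random.seed(1)
n = 50
x = np.linspace(0.0, 1.0, n)
dx = x[1] - x[0]
g = 9.81

z = 0.2 * np.sin(4 * np.pi * x) + 0.1 * np.random.rand(n)  # uneven bed elevation
eta = 1.0                                                  # flat free surface (quiescent state)
h = eta - z                                                # still-water depth

# For still water, d/dx(g h^2 / 2) + g h dz/dx should vanish in the discrete sense.

# Naive discretization: centered differences for flux gradient and source term.
flux_naive = g * (h[2:] ** 2 - h[:-2] ** 2) / (4.0 * dx)
slope_naive = g * h[1:-1] * (z[2:] - z[:-2]) / (2.0 * dx)

# Interface-averaged ("well-balanced") form: source built from the same
# interface depths used in the flux gradient, so the two cancel exactly.
h_right = 0.5 * (h[1:-1] + h[2:])
h_left = 0.5 * (h[:-2] + h[1:-1])
flux_wb = g * (h_right ** 2 - h_left ** 2) / (2.0 * dx)
slope_wb = g * 0.5 * (h_right + h_left) * (z[2:] - z[:-2]) / (2.0 * dx)

print("max residual, naive        :", np.max(np.abs(flux_naive + slope_naive)))
print("max residual, well-balanced:", np.max(np.abs(flux_wb + slope_wb)))
```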

  9. A Mechanistic Source Term Calculation for a Metal Fuel Sodium Fast Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Grabaskas, David; Bucknor, Matthew; Jerden, James

    2017-06-26

    A mechanistic source term (MST) calculation attempts to realistically assess the transport and release of radionuclides from a reactor system to the environment during a specific accident sequence. The U.S. Nuclear Regulatory Commission (NRC) has repeatedly stated its expectation that advanced reactor vendors will utilize an MST during the U.S. reactor licensing process. As part of a project to examine possible impediments to sodium fast reactor (SFR) licensing in the U.S., an analysis was conducted regarding the current capabilities to perform an MST for a metal fuel SFR. The purpose of the project was to identify and prioritize any gaps in current computational tools, and the associated database, for the accurate assessment of an MST. The results of the study demonstrate that an SFR MST is possible with current tools and data, but several gaps exist that may lead to possibly unacceptable levels of uncertainty, depending on the goals of the MST analysis.

  10. A study on the safety of spent fuel management. Radioactive source term modelling

    Energy Technology Data Exchange (ETDEWEB)

    Chun, Kwan Sik; Lee, Hoo Keun; Park, Keun Il; Hwoang, Jung Ki; Chung, Choong Hwan [Korea Atomic Energy Research Inst., Daeduk (Korea, Republic of)

    1992-02-01

    The types and probabilities of events which may occur during the process of reception, transfer and storage of spent fuels in an away-from-reactor (AFR) spent fuel storage facility were analyzed in order to calculate the amount of radioactive material released to the operating area and the atmosphere, and a basic model for predicting the radioactive source term under normal and abnormal operations was developed. Also, the oxidation and dissolution of UO2 pellets were investigated to estimate the amount of radioactive material released from spent fuel, and the release characteristics of radionuclides from defected spent fuel rods were analyzed. Basic information for using the FIRAC code to analyze the ventilation system during a fire accident was prepared, and FIRIN was detached from FIRAC and modified to simulate a compartment fire on a personal computer. (Author).

  11. Microbial characterization for the Source-Term Waste Test Program (STTP) at Los Alamos

    International Nuclear Information System (INIS)

    Leonard, P.A.; Strietelmeier, B.A.; Pansoy-Hjelvik, M.E.; Villarreal, R.

    1999-01-01

    The effects of microbial activity on the performance of the proposed underground nuclear waste repository, the Waste Isolation Pilot Plant (WIPP) at Carlsbad, New Mexico are being studied at Los Alamos National Laboratory (LANL) as part of an ex situ large-scale experiment. Actual actinide-containing waste is being used to predict the effect of potential brine inundation in the repository in the distant future. The study conditions are meant to simulate what might exist should the underground repository be flooded hundreds of years after closure as a result of inadvertent drilling into brine pockets below the repository. The Department of Energy (DOE) selected LANL to conduct the Actinide Source-Term Waste Test Program (STTP) to confirm the predictive capability of computer models being developed at Sandia National Laboratory

  12. A source term estimation method for a nuclear accident using atmospheric dispersion models

    DEFF Research Database (Denmark)

    Kim, Minsik; Ohba, Ryohji; Oura, Masamichi

    2015-01-01

    The objective of this study is to develop an operational source term estimation (STE) method applicable to a nuclear accident like the incident that occurred at the Fukushima Dai-ichi nuclear power station in 2011. The new STE method presented here is based on data from atmospheric dispersion models and short-range observational data around the nuclear power plants. The accuracy of this method is validated with data from a wind tunnel study that involved a tracer gas release from a scaled model experiment at Tokai Daini nuclear power station in Japan. We then use the methodology developed and validated through the effort described in this manuscript to estimate the release rate of radioactive material from the Fukushima Dai-ichi nuclear power station.
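
    In its simplest form, this kind of STE problem reduces to estimating a release rate from observed concentrations and model-computed dilution factors. The sketch below shows that least-squares step with synthetic numbers; it is an illustration of the general idea, not the method of the paper, and all values are assumed.

```python
import numpy as np

# Model-computed dilution factors s_i (concentration per unit release rate) at
# four monitoring points, and synthetic "observed" concentrations c_i = s_i * Q.
s = np.array([2.1e-6, 8.4e-7, 3.0e-7, 1.2e-7])   # [s/m^3], assumed values
Q_true = 5.0e10                                   # hypothetical release rate [Bq/s]
rng = np.random.default_rng(42)
c_obs = s * Q_true * rng.normal(1.0, 0.2, size=s.size)

# Least-squares estimate of the release rate from c_obs = s * Q.
Q_est = float(np.dot(s, c_obs) / np.dot(s, s))
print(f"Estimated release rate: {Q_est:.3g} Bq/s (true value {Q_true:.3g})")
```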

  13. Distributed source term analysis, a new approach to nuclear material inventory verification

    CERN Document Server

    Beddingfield, D H

    2002-01-01

    The Distributed Source-Term Analysis (DSTA) technique is a new approach to measuring in-process material holdup that is a significant departure from traditional hold-up measurement methodology. The DSTA method is a means of determining the mass of nuclear material within a large, diffuse, volume using passive neutron counting. The DSTA method is a more efficient approach than traditional methods of holdup measurement and inventory verification. The time spent in performing DSTA measurement and analysis is a fraction of that required by traditional techniques. The error ascribed to a DSTA survey result is generally less than that from traditional methods. Also, the negative bias ascribed to gamma-ray methods is greatly diminished because the DSTA method uses neutrons which are more penetrating than gamma-rays.

  14. Distributed source term analysis, a new approach to nuclear material inventory verification

    International Nuclear Information System (INIS)

    Beddingfield, D.H.; Menlove, H.O.

    2002-01-01

    The Distributed Source-Term Analysis (DSTA) technique is a new approach to measuring in-process material holdup that is a significant departure from traditional hold-up measurement methodology. The DSTA method is a means of determining the mass of nuclear material within a large, diffuse, volume using passive neutron counting. The DSTA method is a more efficient approach than traditional methods of holdup measurement and inventory verification. The time spent in performing DSTA measurement and analysis is a fraction of that required by traditional techniques. The error ascribed to a DSTA survey result is generally less than that from traditional methods. Also, the negative bias ascribed to γ-ray methods is greatly diminished because the DSTA method uses neutrons which are more penetrating than γ-rays

  15. Methods to prevent the source term of methyl iodide during a core melt accident

    Energy Technology Data Exchange (ETDEWEB)

    Karhu, A. [VTT Energy (Finland)

    1999-11-01

    The purpose of this literature review is to gather available information on methods to prevent a source term of methyl iodide during a core melt accident. The most widely studied methods for nuclear power plants include impregnated carbon filters and alkaline additives and sprays. Some deficiencies of these methods may emerge, and more reactive impregnants and additives could bring a significant improvement. As a new method in the field of nuclear applications, the potential of transition metals to decompose methyl iodide is introduced in this review. This area would require additional research to elucidate the remaining questions about the reactions. The ionization of gaseous methyl iodide by corona-discharge reactors is also briefly described. (au)

  16. Microbial characterization for the Source-Term Waste Test Program (STTP) at Los Alamos

    Energy Technology Data Exchange (ETDEWEB)

    Leonard, P.A.; Strietelmeier, B.A.; Pansoy-Hjelvik, M.E.; Villarreal, R.

    1999-04-01

    The effects of microbial activity on the performance of the proposed underground nuclear waste repository, the Waste Isolation Pilot Plant (WIPP) at Carlsbad, New Mexico, are being studied at Los Alamos National Laboratory (LANL) as part of an ex situ large-scale experiment. Actual actinide-containing waste is being used to predict the effect of potential brine inundation in the repository in the distant future. The study conditions are meant to simulate what might exist should the underground repository be flooded hundreds of years after closure as a result of inadvertent drilling into brine pockets below the repository. The Department of Energy (DOE) selected LANL to conduct the Actinide Source-Term Waste Test Program (STTP) to confirm the predictive capability of computer models being developed at Sandia National Laboratories.

  17. Nuclear reaction models - source term estimation for safety design in accelerators

    International Nuclear Information System (INIS)

    Nandy, Maitreyee

    2013-01-01

    Accelerator driven subcritical systems (ADSS) employ proton-induced spallation reactions at a few GeV. Safety design of these systems involves source term estimation in two steps - multiple fragmentation of the target with n+γ emission through a fast process, followed by statistical decay of the primary fragments. The prompt radiation field is estimated in the framework of quantum molecular dynamics (QMD) theory, intra-nuclear cascade or Monte Carlo calculations. A few nuclear reaction model codes used for this purpose are QMD, JQMD, Bertini, INCL4 and PHITS, followed by statistical decay codes like ABLA, GEM, GEMINI, etc. In the case of electron accelerators, photons and photoneutrons dominate the prompt radiation field. The high energy photon yield through Bremsstrahlung is estimated in the framework of the Born approximation, while photoneutron production is calculated using giant dipole resonance and quasi-deuteron formation cross sections. In this talk, hybrid and exciton PEQ models and the QMD formalism will be discussed briefly.

  18. Operational techniques employed for the liquid sodium source term control loops

    International Nuclear Information System (INIS)

    Chulos, L.E.

    1976-01-01

    Four Source Term Control Loops (STCLs) have been designed, constructed, and placed into operation at the Hanford Engineering Development Laboratory (HEDL) as part of the Radioactivity Control Technology program. The data obtained are used to determine the corrosion and deposition of LMFBR materials, including corrosion product radionuclides, in a non-isothermal flowing sodium system. The paper discusses operation of the STCL Facilities and, in particular, the methods used for controlling the oxygen content of the liquid sodium. These methods include cold trapping techniques, hot trapping, seeding the cold traps with sodium oxide, and precipitating the oxygen in the cold trap in a controlled manner. Operational problems encountered with the STCL Facilities and the techniques for correcting these problems are also discussed

  19. Low-level radioactive waste source term model development and testing: Topical report

    International Nuclear Information System (INIS)

    Sullivan, T.M.; Kempf, C.R.; Suen, C.J.; Mughabghab, S.M.

    1988-08-01

    The objective of the Low-Level Waste Source Term Evaluation Project is to develop a system model capable of predicting radionuclide release rates from a shallow land burial facility. The previous topical report for this project discussed the framework and methodology for developing a system model and divided the problem into four compartments: water flow, container degradation, waste form leaching, and radionuclide transport. Each of these compartments is described by submodels which will be coupled into the system model. From February 1987 to March 1988, computer models were selected to predict water flow (FEMWATER) and radionuclide transport (FEMWASTE), and separate models were developed to predict pitting corrosion of steel containers and leaching from porous waste forms contained in corrodible containers. This report discusses each of the models in detail and presents results obtained from applying the models to shallow land burial trenches over a range of expected conditions. 68 refs., 34 figs., 14 tabs

  20. Improved thermal source term generation capability for use in performance assessment and system studies

    International Nuclear Information System (INIS)

    King, J.; Rhodes, C.

    1994-01-01

    This paper describes work performed by the Civilian Radioactive Waste Management System (CRWMS) Management and Operating (M&O) Contractor to improve spent nuclear fuel (SNF) waste stream characterization for system studies. It discusses how these new capabilities may be exploited for thermal source term generation for use in repository performance assessment modeling. SNF historical discharges have been exhaustively tracked, and significant effort has gone into capturing, verifying, and electronically managing spent fuel inventory data. Future discharge projections are produced annually by the Energy Information Administration (EIA) using sophisticated computer models. The output of these models is coupled with annually updated SNF historical discharges to produce what is referred to as the "reactor database." This database and related data are published in a variety of ways, including on magnetic media, for consistent use by analysts or other interested parties

  1. Some probabilistic aspects of fracture

    International Nuclear Information System (INIS)

    Thomas, J.M.

    1982-01-01

    Some probabilistic aspects of fracture in structural and mechanical components are examined. The principles of fracture mechanics, material quality and inspection uncertainty are formulated into a conceptual and analytical framework for prediction of failure probability. The role of probabilistic fracture mechanics in a more global context of risk and optimization of decisions is illustrated. An example, where Monte Carlo simulation was used to implement a probabilistic fracture mechanics analysis, is discussed. (orig.)
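    As a worked illustration of the kind of Monte Carlo implementation mentioned in the abstract, the sketch below estimates a failure probability by comparing a sampled applied stress intensity factor with a sampled fracture toughness; the distributions and the geometry factor are hypothetical and not taken from the cited study.

```python
import numpy as np

# Minimal Monte Carlo sketch of probabilistic fracture mechanics:
# failure occurs when the applied stress intensity K_I = Y * sigma * sqrt(pi * a)
# exceeds the fracture toughness K_Ic.  All distributions are hypothetical.

rng = np.random.default_rng(42)
n = 1_000_000

sigma = rng.normal(200.0, 20.0, n)                          # applied stress, MPa
a = rng.lognormal(mean=np.log(2e-3), sigma=0.5, size=n)     # crack depth, m
k_ic = rng.normal(60.0, 6.0, n)                             # fracture toughness, MPa*sqrt(m)
Y = 1.12                                                    # geometry factor (surface crack)

k_i = Y * sigma * np.sqrt(np.pi * a)                        # stress intensity factor
p_fail = np.mean(k_i >= k_ic)                               # probability of fracture

print(f"estimated failure probability: {p_fail:.2e}")
```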

  2. Source term estimation for small sized HTRs: status and further needs - a german approach

    International Nuclear Information System (INIS)

    Moormann, R.; Schenk, W.; Verfondern, K.

    2000-01-01

    The main results of German studies on source term estimation for small pebble-bed HTRs with their strict safety demands are outlined. Core heat-up events are no longer dominant for modern high quality fuel, but fission product transport during water ingress accidents (steam cycle plants) and depressurization is relevant, mainly due to remobilization of fission products which were plated out in the course of normal operation or became dust borne. An important gap in knowledge was identified concerning data on plate-out under normal operation, as well as on the behaviour of dust borne activity as a whole. Improved knowledge in this field is also important for maintenance/repair and design/shielding. For core heat-up events, the influence of burn-up on temperature-induced fission product release has to be measured for future high burn-up fuel. Also, transport mechanisms out of the He circuit into the environment require further examination. For water/steam ingress events, mobilization of plated-out fission products by steam or water has to be considered in detail, along with steam interaction with kernels of particles with defective coatings. For source terms of depressurization, a more detailed knowledge of the flow pattern and shear forces on the various surfaces is necessary. In order to improve the knowledge on plate-out and dust in normal operation and to generate specimens for experimental remobilization studies, planning/design of plate-out/dust examination facilities which could be added to the next generation of HTRs (HTR10, HTTR) is proposed. For severe air ingress and reactivity accidents, the behaviour of future advanced fuel elements has to be experimentally tested. (authors)

  3. An artificial neural network approach to reconstruct the source term of a nuclear accident

    International Nuclear Information System (INIS)

    Giles, J.; Palma, C. R.; Weller, P.

    1997-01-01

    This work makes use of one of the main features of artificial neural networks, which is their ability to 'learn' from sets of known input and output data. Indeed, a trained artificial neural network can be used to make predictions on the input data when the output is known, and this feedback process enables one to reconstruct the source term from field observations. With this aim, an artificial neural network has been trained, using the projections of a segmented plume atmospheric dispersion model at fixed points, simulating a set of gamma detectors located outside the perimeter of a nuclear facility. The resulting set of artificial neural networks was used to determine the release fraction and rate for each of the noble gases, iodines and particulate fission products that could originate from a nuclear accident. Model projections were made using a large data set consisting of effective release height, release fraction of noble gases, iodines and particulate fission products, atmospheric stability, wind speed and wind direction. The model computed nuclide-specific gamma dose rates. The locations of the detectors were chosen taking into account both building shine and wake effects, and varied in distance between 800 and 1200 m from the reactor. The inputs to the artificial neural networks consisted of the measurements from the detector array, atmospheric stability, wind speed and wind direction; the outputs comprised a set of release fractions and heights. Once trained, the artificial neural networks were used to reconstruct the source term from the detector responses for data sets not used in training. The preliminary results are encouraging and show that the noble gas and particulate fission product release fractions are well determined.
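    The sketch below illustrates the same train-on-forward-model, invert-from-measurements idea on synthetic data; the random linear map standing in for the segmented plume model, the network size and all numerical values are assumptions rather than details of the cited work.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

# Sketch of the inverse-mapping idea: train a neural network on forward-model
# projections (detector dose rates for known releases), then use it to
# reconstruct release fractions from new detector readings.
# A random linear map stands in for the real plume/dose model here.

rng = np.random.default_rng(0)
n_detectors, n_nuclide_groups, n_train = 16, 3, 5000

release = rng.uniform(0.0, 1.0, (n_train, n_nuclide_groups))      # release fractions
forward = rng.uniform(0.1, 1.0, (n_nuclide_groups, n_detectors))  # surrogate dispersion model
doses = release @ forward * (1.0 + 0.05 * rng.normal(size=(n_train, n_detectors)))

scaler = StandardScaler().fit(doses)
ann = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
ann.fit(scaler.transform(doses), release)

# Reconstruct the source term for an unseen "measurement"
true_release = np.array([[0.3, 0.7, 0.1]])
measured = true_release @ forward
print("reconstructed release fractions:", ann.predict(scaler.transform(measured)))
```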

  4. Radiological consequence evaluation of DBAs with alternative source term method for a Chinese PWR

    International Nuclear Information System (INIS)

    Li, J.X.; Cao, X.W.; Tong, L.L.; Huang, G.F.

    2012-01-01

    Highlights: ► Radiological consequence evaluation of DBAs with the alternative source term method for a Chinese 900 MWe PWR has been investigated. ► Six typical DBA sequences are analyzed. ► The doses at the control room, EAB and outer boundary of the LPZ are acceptable. ► The differences between the AST method and the TID-14844 method are investigated. - Abstract: Since a large amount of fission products may be released into the environment during accident progression in nuclear power plants (NPPs), posing a potential hazard to the public, the radiological consequences should be evaluated to alleviate the hazard. In most Chinese NPPs the method of TID-14844, in which whole-body and thyroid doses are employed as the dose criteria, is currently adopted to evaluate the radiological consequences for design-basis accidents (DBAs); however, because the total effective dose equivalent is employed as the dose criterion in the alternative radiological source term (AST) method, it is necessary to evaluate the radiological consequences for DBAs with the AST method and to discuss the differences between the two methods. By using an integral safety analysis code, an analytical model of the 900 MWe pressurized water reactor (PWR) is built and the radiological consequences of DBAs at the control room (CR), exclusion area boundary (EAB) and low population zone (LPZ) are analyzed, including LOCA and non-LOCA DBAs such as the fuel handling accident (FHA), rod ejection accident (REA), main steam line break (MSLB), steam generator tube rupture (SGTR) and locked rotor accident (LRA), following the guidance of RG 1.183. The results show that the doses at the CR, EAB and LPZ are acceptable compared with the dose criteria in RG 1.183, and the differences between the AST method and the TID-14844 method are also discussed.
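    For orientation, the fragment below shows the basic arithmetic behind a total-effective-dose estimate at a site boundary from a given release; the dispersion factor, breathing rate and dose conversion factors are placeholder values for illustration, not the RG 1.183 inputs used in the cited analysis.

```python
# Minimal sketch of a total effective dose (TEDE-style) estimate at a site
# boundary from a released activity, considering the inhalation pathway only.
# chi/Q, breathing rate and dose conversion factors are placeholders.

CHI_Q = 1.0e-4           # atmospheric dispersion factor at the EAB, s/m^3 (placeholder)
BREATHING_RATE = 3.5e-4  # m^3/s (typical light-activity assumption)

released_activity = {    # Bq released to the environment (placeholder values)
    "I-131": 5.0e12,
    "Cs-137": 1.0e11,
}
inhalation_dcf = {       # committed effective dose per unit intake, Sv/Bq (placeholder)
    "I-131": 2.0e-8,
    "Cs-137": 4.6e-9,
}

tede = sum(
    released_activity[n] * CHI_Q * BREATHING_RATE * inhalation_dcf[n]
    for n in released_activity
)
print(f"inhalation TEDE at the EAB: {tede * 1e3:.2f} mSv")
```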

  5. Source term and activation calculations for the new TR-FLEX cyclotron for medical applications at HZDR

    Energy Technology Data Exchange (ETDEWEB)

    Konheiser, Joerg [Helmholtz-Zentrum Dresden-Rossendorf e.V., Dresden (Germany). Reactor Safety; Ferrari, A. [Helmholtz-Zentrum Dresden-Rossendorf e.V., Dresden (Germany). Inst. of Radiation Physics; Magin, A. [Karlsruher Institut fuer Technologie (KIT), Karlsruhe (Germany); Naumann, B. [Helmholtz-Zentrum Dresden-Rossendorf e.V., Dresden (Germany). Dept. of Radiation Protection and Safety; Mueller, S.E.

    2017-06-01

    The neutron source terms for a proton beam hitting an {sup 18}O-enriched water target were calculated with the radiation transport programs MCNP6 and FLUKA and were compared to source terms for exclusive {sup 18}O(p,n){sup 18}F production. To validate the radiation fields obtained in the simulations, an experimental program has been started using activation samples originally used in reactor dosimetry.

  6. A simplistic view of the iodine chemistry influence on source term assessment

    International Nuclear Information System (INIS)

    Herranz, L.E.; Rodriguez, J.J.

    1994-01-01

    The intrinsic characteristics of iodine make it a relevant concern as to its potential radiobiological impact in case of a hypothetical severe accident in nuclear power plants. This paper summarizes the major results drawn from a very simple but illustrative calculation exercise aimed at weighing how significant taking iodine chemistry in containment into account could be for source term assessments in case of a postulated severe reactor accident. The scenario chosen as representative of expected conditions in containment was the LA-4 test of the LACE programme. Several approximations and hypotheses concerning the scenario were necessary. Iodine chemistry analyses were performed with the IODE code, while thermal-hydraulic and aerosol behaviour analyses, providing initial and boundary conditions for the iodine calculations, were carried out with the CONTEMPT4/MOD5 and NAUA/MOD5 codes, respectively. In general, the results obtained agreed qualitatively with the current knowledge in the area; from a quantitative point of view, one of the major results was that iodine chemistry under acidic conditions could provide a substantial increase in the mass leaked from containment under the postulated circumstances. Hence, this study underlines the need to include iodine chemistry in source term assessments. (author)

  7. Parameterizing unresolved obstacles with source terms in wave modeling: A real-world application

    Science.gov (United States)

    Mentaschi, Lorenzo; Kakoulaki, Georgia; Vousdoukas, Michalis; Voukouvalas, Evangelos; Feyen, Luc; Besio, Giovanni

    2018-06-01

    Parameterizing the dissipative effects of small, unresolved coastal features is fundamental to improving the skill of wave models. The established technique to deal with this problem consists in reducing the amount of energy advected within the propagation scheme, and is currently available only for regular grids. To find a more general approach, Mentaschi et al., 2015b formulated a technique based on source terms, and validated it on synthetic case studies. This technique separates the parameterization of the unresolved features from the energy advection, and can therefore be applied to any numerical scheme and to any type of mesh. Here we developed an open-source library for the estimation of the transparency coefficients needed by this approach, from bathymetric data and for any type of mesh. The spectral wave model WAVEWATCH III was used to show that in a real-world domain, such as the Caribbean Sea, the proposed approach has skills comparable to and sometimes better than the established propagation-based technique.

  8. Uncertainties in source term calculations generated by the ORIGEN2 computer code for Hanford Production Reactors

    International Nuclear Information System (INIS)

    Heeb, C.M.

    1991-03-01

    The ORIGEN2 computer code is the primary calculational tool for computing isotopic source terms for the Hanford Environmental Dose Reconstruction (HEDR) Project. The ORIGEN2 code computes the amounts of radionuclides that are created or remain in spent nuclear fuel after neutron irradiation and radioactive decay have occurred as a result of nuclear reactor operation. ORIGEN2 was chosen as the primary code for these calculations because it is widely used and accepted by the nuclear industry, both in the United States and the rest of the world. Its comprehensive library of over 1,600 nuclides includes any possible isotope of interest to the HEDR Project. It is important to evaluate the uncertainties expected from use of ORIGEN2 in the HEDR Project because these uncertainties may have a pivotal impact on the final accuracy and credibility of the results of the project. There are three primary sources of uncertainty in an ORIGEN2 calculation: basic nuclear data uncertainty in neutron cross sections, radioactive decay constants, energy per fission, and fission product yields; calculational uncertainty due to input data; and code uncertainties (i.e., numerical approximations, and neutron spectrum-averaged cross-section values from the code library). 15 refs., 5 figs., 5 tabs
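    The uncertainty sources listed above lend themselves to Monte Carlo propagation. The toy calculation below samples a capture cross section and a half-life through a single activation/decay formula to show the mechanics; it is not an ORIGEN2 calculation, and the nominal values and spreads are illustrative only.

```python
import numpy as np

# Sketch of propagating basic nuclear-data uncertainty through a simple
# activation/decay calculation by Monte Carlo sampling (a toy stand-in for
# the much larger ORIGEN2 problem).  Values and uncertainties are illustrative.

rng = np.random.default_rng(1)
n = 100_000

phi = 3.0e13                       # neutron flux, n/cm^2/s (fixed input)
n_target = 1.0e20                  # number of target atoms (fixed input)
t_irr = 30 * 24 * 3600.0           # 30-day irradiation, s

sigma = rng.normal(2.7e-24, 0.1e-24, n)                       # capture cross section, cm^2
half_life = rng.normal(5.27, 0.05, n) * 365.25 * 24 * 3600.0  # half-life, years -> s
lam = np.log(2.0) / half_life                                 # decay constant, 1/s

production = phi * sigma * n_target                     # activation rate, atoms/s
activity = production * (1.0 - np.exp(-lam * t_irr))    # activity at end of irradiation, Bq

print(f"activity: {activity.mean():.3e} Bq "
      f"(1-sigma {100 * activity.std() / activity.mean():.1f} %)")
```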

  9. Long-term monitoring on environmental disasters using multi-source remote sensing technique

    Science.gov (United States)

    Kuo, Y. C.; Chen, C. F.

    2017-12-01

    Environmental disasters are extreme events within the earth's system that cause deaths and injuries to humans, as well as damage and loss of valuable assets such as buildings, communication systems, farmland, forests, and so on. In disaster management, a large amount of multi-temporal spatial data is required. Multi-source remote sensing data with different spatial, spectral and temporal resolutions is widely applied to environmental disaster monitoring. With multi-source and multi-temporal high resolution images, we conduct rapid, systematic and serial observations of economic damages and environmental disasters on earth. The approach is based on three monitoring platforms: remote sensing, UAS (Unmanned Aircraft Systems) and ground investigation. The advantages of using UAS technology include great mobility, real-time availability, and more flexible operation under varying weather conditions. The system can produce long-term spatial distribution information on environmental disasters, obtaining high-resolution remote sensing data and field verification data in key monitoring areas. It also supports the prevention and control of ocean pollution, illegally disposed wastes and pine pests at different scales. Meanwhile, digital photogrammetry can be applied, using the camera interior and exterior orientation parameters, to produce Digital Surface Model (DSM) data. The latest terrain environment information is simulated using DSM data, and can be used as a reference in disaster recovery in the future.

  10. Integral migration and source term experiments on cement and bitumen waste forms

    International Nuclear Information System (INIS)

    Ewart, F.T.; Howse, R.M.; Sharpe, B.M.; Smith, A.J.; Thomason, H.P.; Williams, S.J.; Young, M.

    1986-01-01

    This is the final report of a programme of research which formed a part of the CEC joint research project into radionuclide migration in the geosphere (MIRAGE). This study addressed the aspects of integral migration and source term. The integral migration experiment simulated, in the laboratory, the intrusion of water into the repository, the leaching of radionuclides from two intermediate level wasteforms and the subsequent migration through the geosphere. The simulation consisted of a source of natural ground water which flowed over a sample of wasteform, at a controlled redox potential, and then through backfill and geological material packed in columns. The two wasteforms used here were cemented waste from the WAK plant at Karlsruhe, W. Germany and bitumenised intermediate concentrates from the Marcoule plant in France. The soluble fission products such as caesium were rapidly released from the cemented waste but the actinides, and technetium in the reduced state, were retained in the wasteform. The release of all nuclides from the bitumenised waste was very low. (author)

  11. Integral migration and source-term experiments on cement and bitumen waste forms

    International Nuclear Information System (INIS)

    Ewart, F.T.; Howse, R.M.; Sharpe, B.M.; Smith, A.J.; Thomason, H.P.; Williams, S.J.; Young, M.

    1986-01-01

    This is the final report of a programme of research which formed a part of the CEC joint research project into radionuclide migration in the geosphere (MIRAGE). This study addressed the aspects of integral migration and source term. The integral migration experiment simulated, in the laboratory, the intrusion of water into the repository, the leaching of radionuclides from two intermediate-level waste-forms and the subsequent migration through the geosphere. The simulation consisted of a source of natural ground water which flowed over a sample of waste-form, at a controlled redox potential, and then through backfill and geological material packed in columns. The two waste forms used here were cemented waste from the WAK plant at Karlsruhe in the Federal Republic of Germany and bitumenized intermediate concentrates from the Marcoule plant in France. The soluble fission products such as caesium were rapidly released from the cemented waste but the actinides, and technetium in the reduced state, were retained in the waste-form. The release of all nuclides from the bitumenized waste was very low.

  12. Derivation of the source term, dose results and associated radiological consequences for the Greek Research Reactor – 1

    Energy Technology Data Exchange (ETDEWEB)

    Pappas, Charalampos, E-mail: chpappas@ipta.demokritos.gr; Ikonomopoulos, Andreas; Sfetsos, Athanasios; Andronopoulos, Spyros; Varvayanni, Melpomeni; Catsaros, Nicolas

    2014-07-01

    Highlights: • Source term derivation of postulated accident sequences in a research reactor. • Various containment ventilation scenarios considered for source term calculations. • Source term parametric analysis performed in case of lack of ventilation. • JRODOS employed for dose calculations under eighteen modeled scenarios. • Estimation of radiological consequences during typical and adverse weather scenarios. - Abstract: The estimated source term, dose results and radiological consequences of selected accident sequences in the Greek Research Reactor – 1 are presented and discussed. A systematic approach has been adopted to perform the necessary calculations in accordance with the latest computational developments and IAEA recommendations. Loss-of-coolant, reactivity insertion and fuel channel blockage accident sequences have been selected to derive the associated source terms under three distinct containment ventilation scenarios. Core damage has been conservatively assessed for each accident sequence while the ventilation has been assumed to function within the efficiency limits defined at the Safety Analysis Report. In case of lack of ventilation a parametric analysis is also performed to examine the dependency of the source term on the containment leakage rate. A typical as well as an adverse meteorological scenario have been defined in the JRODOS computational platform in order to predict the effective, lung and thyroid doses within a region defined by a 15 km radius downwind from the reactor building. The radiological consequences of the eighteen scenarios associated with the accident sequences are presented and discussed.
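    The parametric dependence on containment leakage rate mentioned in the highlights can be illustrated with a one-nuclide balance between leakage and radioactive decay; the sketch below uses this simplified model with made-up inventory and leak-rate values, not the data of the cited analysis.

```python
import numpy as np

# Sketch of a parametric study of the environmental release as a function of
# containment leak rate, for a single nuclide with airborne inventory I0,
# decay constant lam and no deposition.  Numbers are illustrative only.

I0 = 1.0e15                       # initial airborne activity in containment, Bq
half_life_h = 8.02 * 24.0         # e.g. I-131 half-life, hours
lam = np.log(2.0) / half_life_h   # decay constant, 1/h
t = 24.0                          # integration time, h

for leak_per_day in [0.1, 0.5, 1.0, 5.0]:             # % of containment volume per day
    L = leak_per_day / 100.0 / 24.0                    # leak rate, 1/h
    # Released activity = integral of L * I0 * exp(-(L + lam) * t') dt' from 0 to t
    released = I0 * L / (L + lam) * (1.0 - np.exp(-(L + lam) * t))
    print(f"leak {leak_per_day:4.1f} %/day -> release in 24 h: {released:.2e} Bq")
```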

  13. Independent assessment of MELCOR as a severe accident thermal-hydraulic/source term analysis tool

    International Nuclear Information System (INIS)

    Madni, I.K.; Eltawila, F.

    1994-01-01

    MELCOR is a fully integrated computer code that models all phases of the progression of severe accidents in light water reactor nuclear power plants, and is being developed for the US Nuclear Regulatory Commission (NRC) by Sandia National Laboratories (SNL). Brookhaven National Laboratory (BNL) has a program with the NRC called ''MELCOR Verification, Benchmarking, and Applications,'' whose aim is to provide independent assessment of MELCOR as a severe accident thermal-hydraulic/source term analysis tool. The scope of this program is to perform quality control verification on all released versions of MELCOR, to benchmark MELCOR against more mechanistic codes and experimental data from severe fuel damage tests, and to evaluate the ability of MELCOR to simulate long-term severe accident transients in commercial LWRs, by applying the code to model both BWRs and PWRs. Under this program, BNL provided input to the NRC-sponsored MELCOR Peer Review, and is currently contributing to the MELCOR Cooperative Assessment Program (MCAP). This paper presents a summary of MELCOR assessment efforts at BNL and their contribution to NRC goals with respect to MELCOR

  14. Effect of Fuel Structure Materials on Radiation Source Term in Reactor Core Meltdown

    International Nuclear Information System (INIS)

    Jeong, Hae Sun; Ha, Kwang Soon

    2014-01-01

    The release of fission products (the radiation source) from the reactor core into the containment must be evaluated to guarantee the safety of a Nuclear Power Plant (NPP) under a hypothetical accident involving a core meltdown. The initial core inventory is used as the starting point of all radiological consequences and affects the subsequent results of the accident assessment. Hence, a proper evaluation of the inventory can be regarded as one of the most important parts of the entire accident analysis procedure. The inventory of fission products is typically evaluated on the basis of the uranium material (e.g., UO2 and USi2) loaded in the nuclear fuel assembly, excluding structural materials such as the end fittings, grids, and some kinds of springs. However, the structural materials are continually activated by the neutrons generated from nuclear fission, and some of their nuclides (e.g., 14C and 60Co) can significantly influence the accident assessment. During a severe core accident, the structural components can also melt, since their melting temperatures are relatively lower than that of the uranium material. A series of calculations was performed using the ORIGEN-S module in the SCALE 6.1 package code system, and the total activity in each part of the structural materials was analyzed from these calculations. The fission product inventory is generally evaluated based on the uranium materials of the fuel only, even though the structural components of the assembly are continually activated by the neutrons generated from nuclear fission. In this study, the activation calculation of the fuel structure materials was performed for the initial source term assessment in a reactor core meltdown accident. As a result, the lower end fitting and the upper plenum contribute greatly to the total activity, apart from the cladding material. The nuclides 56Mn, 51Cr, 55Fe, 58Co, 54Mn, and 60Co are found to be the main contributors to this activity. This result

  15. Probabilistic safety assessment

    International Nuclear Information System (INIS)

    Hoertner, H.; Schuetz, B.

    1982-09-01

    For the purpose of assessing the applicability and informativeness of risk-analysis methods in licensing procedures under atomic law, the choice of instruments for probabilistic analysis, the problems in and experience gained from their application, and the discussion of safety goals with respect to such instruments are of paramount significance. Naturally, such a complex field can only be dealt with step by step, making contributions relative to specific problems. The report at hand shows the essentials of a 'stocktaking' of systems reliability studies in the licensing procedure under atomic law and of an American report (NUREG-0739) on 'Quantitative Safety Goals'. (orig.) [de

  16. Probabilistic methods for physics

    International Nuclear Information System (INIS)

    Cirier, G

    2013-01-01

    We present an asymptotic method giving a probability of presence of the iterated spots of R^d under a polynomial function f. We use the well-known Perron-Frobenius operator (PF) that leaves certain sets and measures invariant under f. Probabilistic solutions can exist for the deterministic iteration. While the theoretical result is already known, here we quantify these probabilities. This approach seems interesting for computing situations where deterministic methods do not work. Among the examined applications are asymptotic solutions of the Lorenz, Navier-Stokes and Hamilton equations. In this approach, linearity induces many difficult problems, not all of which we have yet resolved.

  17. Quantum probability for probabilists

    CERN Document Server

    Meyer, Paul-André

    1993-01-01

    In recent years, the classical theory of stochastic integration and stochastic differential equations has been extended to a non-commutative set-up to develop models for quantum noises. The author, a specialist of classical stochastic calculus and martingale theory, tries to provide an introduction to this rapidly expanding field in a way which should be accessible to probabilists familiar with the Ito integral. It can also, on the other hand, provide a means of access to the methods of stochastic calculus for physicists familiar with Fock space analysis.

  18. Integration of Probabilistic Exposure Assessment and Probabilistic Hazard Characterization

    NARCIS (Netherlands)

    Voet, van der H.; Slob, W.

    2007-01-01

    A method is proposed for integrated probabilistic risk assessment where exposure assessment and hazard characterization are both included in a probabilistic way. The aim is to specify the probability that a random individual from a defined (sub)population will have an exposure high enough to cause a

  19. Variational Iterative Refinement Source Term Estimation Algorithm Assessment for Rural and Urban Environments

    Science.gov (United States)

    Delle Monache, L.; Rodriguez, L. M.; Meech, S.; Hahn, D.; Betancourt, T.; Steinhoff, D.

    2016-12-01

    It is necessary to accurately estimate the initial source characteristics in the event of an accidental or intentional release of a Chemical, Biological, Radiological, or Nuclear (CBRN) agent into the atmosphere. Accurate estimation of the source characteristics is important because they are often unknown, and Atmospheric Transport and Dispersion (AT&D) models rely heavily on these estimates to create hazard assessments. To correctly assess the source characteristics in an operational environment where time is critical, the National Center for Atmospheric Research (NCAR) has developed a Source Term Estimation (STE) method, known as the Variational Iterative Refinement STE algorithm (VIRSA). VIRSA consists of a combination of modeling systems. These systems include an AT&D model, its corresponding STE model, a Hybrid Lagrangian-Eulerian Plume Model (H-LEPM), and its mathematical adjoint model. In an operational scenario where we have information regarding the infrastructure of a city, the AT&D model used is the Urban Dispersion Model (UDM), and when using this model in VIRSA we refer to the system as uVIRSA. In all other scenarios where we do not have the city infrastructure information readily available, the AT&D model used is the Second-order Closure Integrated PUFF model (SCIPUFF) and the system is referred to as sVIRSA. VIRSA was originally developed using SCIPUFF 2.4 for the Defense Threat Reduction Agency and integrated into the Hazard Prediction and Assessment Capability and Joint Program for Information Systems Joint Effects Model. The results discussed here are the verification and validation of the upgraded system with SCIPUFF 3.0 and the newly implemented UDM capability. To verify uVIRSA and sVIRSA, synthetic concentration observation scenarios were created in urban and rural environments and the results of this verification are shown. Finally, we validate the STE performance of uVIRSA using scenarios from the Joint Urban 2003 (JU03
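    The core of any such STE scheme is an inversion of a forward dispersion model against sensor readings. The sketch below shows that idea in its simplest non-negative least-squares form with a synthetic source-receptor matrix; it is not the VIRSA cost function or its iterative refinement procedure.

```python
import numpy as np
from scipy.optimize import nnls

# Sketch of the source-term estimation idea: given a source-receptor matrix H
# (concentration at each sensor per unit release rate from each candidate
# source, as produced by a dispersion model) and observed concentrations
# c_obs, estimate non-negative release rates q by minimizing ||H q - c_obs||^2.
# H and the observations are synthetic here.

rng = np.random.default_rng(3)
n_sensors, n_sources = 12, 4

H = rng.uniform(0.0, 1.0, (n_sensors, n_sources))   # surrogate dispersion model
q_true = np.array([0.0, 2.5, 0.0, 0.8])             # only two sources actually emit
c_obs = H @ q_true + 0.02 * rng.normal(size=n_sensors)

q_est, residual = nnls(H, c_obs)
print("estimated release rates:", np.round(q_est, 2))
print("true release rates:     ", q_true)
```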

  20. Probabilistic Structural Analysis of SSME Turbopump Blades: Probabilistic Geometry Effects

    Science.gov (United States)

    Nagpal, V. K.

    1985-01-01

    A probabilistic study was initiated to evaluate the effects of the geometric and material property tolerances on the structural response of turbopump blades. To complete this study, a number of important probabilistic variables were identified which are considered to affect the structural response of the blade. In addition, a methodology was developed to statistically quantify the influence of these probabilistic variables in an optimized way. The identified variables include random geometric and material property perturbations, different loadings and a probabilistic combination of these loadings. The influences of these probabilistic variables are planned to be quantified by evaluating the blade structural response. Studies of the geometric perturbations were conducted for a flat plate geometry as well as for a space shuttle main engine blade geometry using a special purpose code which uses the finite element approach. Analyses indicate that the variances of the perturbations about given mean values have a significant influence on the response.
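    As a stand-in for the flat-plate studies described above, the sketch below propagates random thickness and modulus perturbations through the closed-form fundamental frequency of a simply supported square plate; the nominal values and tolerances are assumptions, not those of the cited study.

```python
import numpy as np

# Sketch of how random geometric/material perturbations propagate into a
# structural response, using the fundamental frequency of a simply supported
# square plate as a stand-in for a blade model.  Tolerances are hypothetical.

rng = np.random.default_rng(7)
n = 200_000

a = 0.10                                   # plate side length, m (nominal, deterministic)
t = rng.normal(2.0e-3, 0.05e-3, n)         # thickness, m (about +/- 2.5% scatter)
E = rng.normal(200e9, 10e9, n)             # Young's modulus, Pa
rho, nu = 7850.0, 0.3                      # density, Poisson ratio (deterministic)

D = E * t**3 / (12.0 * (1.0 - nu**2))                        # flexural rigidity
omega1 = (2.0 * np.pi**2 / a**2) * np.sqrt(D / (rho * t))    # first natural frequency, rad/s
f1 = omega1 / (2.0 * np.pi)

print(f"mean f1 = {f1.mean():.1f} Hz, c.o.v. = {100 * f1.std() / f1.mean():.2f} %")
```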

  1. PRECIS -- A probabilistic risk assessment system

    International Nuclear Information System (INIS)

    Peterson, D.M.; Knowlton, R.G. Jr.

    1996-01-01

    A series of computer tools has been developed to conduct the exposure assessment and risk characterization phases of human health risk assessments within a probabilistic framework. The tools are collectively referred to as the Probabilistic Risk Evaluation and Characterization Investigation System (PRECIS). With this system, a risk assessor can calculate the doses and risks associated with multiple environmental and exposure pathways, for both chemicals and radioactive contaminants. Exposure assessment models in the system account for transport of contaminants to receptor points from a source zone originating in unsaturated soils above the water table. In addition to performing calculations of dose and risk based on initial concentrations, PRECIS can also be used in an inverse manner to compute soil concentrations in the source area that must not be exceeded if prescribed limits on dose or risk are to be met. Such soil contaminant levels, referred to as soil guidelines, are computed for both single contaminants and chemical mixtures and can be used as action levels or cleanup levels. Probabilistic estimates of risk, dose and soil guidelines are derived using Monte Carlo techniques
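    The two calculations described for PRECIS, a forward Monte Carlo dose estimate and an inverse soil guideline, can be illustrated for a single hypothetical soil-ingestion pathway as below; the parameter distributions, dose conversion factor and dose limit are placeholders, not PRECIS data.

```python
import numpy as np

# Sketch of a PRECIS-style calculation for one (hypothetical) soil-ingestion
# pathway: a Monte Carlo dose distribution for a given soil concentration,
# and the inverse "soil guideline" concentration that keeps a chosen dose
# percentile below a limit.  All parameter values are illustrative.

rng = np.random.default_rng(11)
n = 100_000

c_soil = 500.0                                        # soil concentration, Bq/kg (given)
ingestion = rng.lognormal(np.log(5e-5), 0.5, n)       # kg of soil ingested per day
exposure_days = rng.triangular(200, 350, 365, n)      # days per year on site
dcf = 1.3e-8                                          # Sv per Bq ingested (placeholder)

dose = c_soil * ingestion * exposure_days * dcf       # Sv/yr, one value per realization
dose_per_unit_conc = dose / c_soil                    # Sv/yr per Bq/kg

limit = 1.0e-3                                        # 1 mSv/yr dose limit
guideline = limit / np.percentile(dose_per_unit_conc, 95)

print(f"95th-percentile dose at {c_soil} Bq/kg: {np.percentile(dose, 95) * 1e3:.3f} mSv/yr")
print(f"soil guideline (95th percentile <= 1 mSv/yr): {guideline:.0f} Bq/kg")
```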

  2. Short-term X-ray variability of the globular cluster source 4U 1820 - 30 (NGC 6624)

    Science.gov (United States)

    Stella, L.; Kahn, S. M.; Grindlay, J. E.

    1984-01-01

    Analytical techniques for improved identification of the temporal and spectral variability properties of globular cluster and galactic bulge X-ray sources are described in terms of their application to a large set of observations of the source 4U 1820 - 30 in the globular cluster NGC 6624. The autocorrelation function, cross-correlations, time skewness function, erratic periodicities, and pulse trains are examined. The results are discussed in terms of current models with particular emphasis on recent accretion disk models. It is concluded that the analyzed observations provide the first evidence for shot-noise variability in a globular cluster X-ray source.
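    To make the shot-noise interpretation concrete, the sketch below builds a toy light curve from exponentially decaying shots at Poisson-distributed times and computes its autocorrelation function; the shot rate and decay time are arbitrary and not fitted to 4U 1820 - 30.

```python
import numpy as np

# Sketch of autocorrelation analysis applied to a simulated shot-noise light
# curve: exponential "shots" at random times, whose ACF decays on the shot
# timescale.  Parameters are illustrative only.

rng = np.random.default_rng(5)
dt, n_bins = 0.1, 5000                   # 0.1 s bins, 500 s of data
shot_rate, shot_tau = 0.5, 3.0           # shots per second, shot decay time (s)

t = np.arange(n_bins) * dt
curve = np.zeros(n_bins)
n_shots = rng.poisson(shot_rate * n_bins * dt)
for t0 in rng.uniform(0.0, n_bins * dt, n_shots):
    mask = t >= t0
    curve[mask] += np.exp(-(t[mask] - t0) / shot_tau)

x = curve - curve.mean()
acf = np.correlate(x, x, mode="full")[n_bins - 1:]   # lags 0 .. n_bins-1
acf /= acf[0]
print("ACF at lags 0, 1, 3, 10 s:", np.round(acf[[0, 10, 30, 100]], 3))
```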

  3. Probabilistic Modeling of Wind Turbine Drivetrain Components

    DEFF Research Database (Denmark)

    Rafsanjani, Hesam Mirzaei

    Wind energy is one of several energy sources in the world and a rapidly growing industry in the energy sector. When placed in offshore or onshore locations, wind turbines are exposed to wave excitations, highly dynamic wind loads and/or the wakes from other wind turbines. Therefore, most components...... in a wind turbine experience highly dynamic and time-varying loads. These components may fail due to wear or fatigue, and this can lead to unplanned shutdown repairs that are very costly. The design by deterministic methods using safety factors is generally unable to account for the many uncertainties. Thus......, a reliability assessment should be based on probabilistic methods where stochastic modeling of failures is performed. This thesis focuses on probabilistic models and the stochastic modeling of the fatigue life of the wind turbine drivetrain. Hence, two approaches are considered for stochastic modeling...

  4. A General Framework for Probabilistic Characterizing Formulae

    DEFF Research Database (Denmark)

    Sack, Joshua; Zhang, Lijun

    2012-01-01

    Recently, a general framework on characteristic formulae was proposed by Aceto et al. It offers a simple theory that allows one to easily obtain characteristic formulae of many non-probabilistic behavioral relations. Our paper studies their techniques in a probabilistic setting. We provide...... a general method for determining characteristic formulae of behavioral relations for probabilistic automata using fixed-point probability logics. We consider such behavioral relations as simulations and bisimulations, probabilistic bisimulations, probabilistic weak simulations, and probabilistic forward...

  5. Probabilistic pathway construction.

    Science.gov (United States)

    Yousofshahi, Mona; Lee, Kyongbum; Hassoun, Soha

    2011-07-01

    Expression of novel synthesis pathways in host organisms amenable to genetic manipulations has emerged as an attractive metabolic engineering strategy to overproduce natural products, biofuels, biopolymers and other commercially useful metabolites. We present a pathway construction algorithm for identifying viable synthesis pathways compatible with balanced cell growth. Rather than exhaustive exploration, we investigate probabilistic selection of reactions to construct the pathways. Three different selection schemes are investigated for the selection of reactions: high metabolite connectivity, low connectivity and uniformly random. For all case studies, which involved a diverse set of target metabolites, the uniformly random selection scheme resulted in the highest average maximum yield. When compared to an exhaustive search enumerating all possible reaction routes, our probabilistic algorithm returned nearly identical distributions of yields, while requiring far less computing time (minutes vs. years). The pathways identified by our algorithm have previously been confirmed in the literature as viable, high-yield synthesis routes. Prospectively, our algorithm could facilitate the design of novel, non-native synthesis routes by efficiently exploring the diversity of biochemical transformations in nature. Copyright © 2011 Elsevier Inc. All rights reserved.
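    A minimal sketch of the probabilistic construction idea, assuming a toy reaction network and the uniformly random selection scheme (the connectivity-based schemes described above would only change the selection weights):

```python
import random

# Toy sketch of probabilistic pathway construction: starting from a target
# metabolite, repeatedly pick one producing reaction at random and recurse on
# its substrates until only host-native metabolites remain.  The reaction
# network below is invented for illustration.

reactions = {                      # product -> list of (reaction name, substrates)
    "target": [("r1", ["A"]), ("r2", ["B", "C"])],
    "A": [("r3", ["native1"])],
    "B": [("r4", ["native2"])],
    "C": [("r5", ["native1", "native2"])],
}
native = {"native1", "native2"}    # metabolites already present in the host

def build_pathway(metabolite, rng):
    """Return a list of reactions converting native metabolites into `metabolite`."""
    if metabolite in native:
        return []
    name, substrates = rng.choice(reactions[metabolite])   # uniformly random scheme
    pathway = []
    for s in substrates:
        pathway.extend(build_pathway(s, rng))
    return pathway + [name]

print(build_pathway("target", random.Random(0)))
```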

  6. Probabilistic risk assessment methodology

    International Nuclear Information System (INIS)

    Shinaishin, M.A.

    1988-06-01

    The objective of this work is to provide the tools necessary for clear identification of: the purpose of a Probabilistic Risk Study, the bounds and depth of the study, the proper modeling techniques to be used, the failure modes contributing to the analysis, the classical and Bayesian approaches for manipulating the data necessary for quantification, ways of treating uncertainties, and available computer codes that may be used in performing such probabilistic analyses. In addition, it provides the means for measuring the importance of a safety feature to maintaining a level of risk at a Nuclear Power Plant and the worth of optimizing a safety system for risk reduction. In applying these techniques so that they accommodate our national resources and needs, it was felt that emphasis should be put on the system reliability analysis level of PRA. Objectives of such studies could include: comparing the system designs of the various vendors in the bidding stage, and performing grid reliability and human performance analyses using nation-specific data. (author)

  7. Probabilistic population aging

    Science.gov (United States)

    2017-01-01

    We merge two methodologies, prospective measures of population aging and probabilistic population forecasts. We compare the speed of change and variability in forecasts of the old age dependency ratio and the prospective old age dependency ratio as well as the same comparison for the median age and the prospective median age. While conventional measures of population aging are computed on the basis of the number of years people have already lived, prospective measures are computed also taking account of the expected number of years they have left to live. Those remaining life expectancies change over time and differ from place to place. We compare the probabilistic distributions of the conventional and prospective measures using examples from China, Germany, Iran, and the United States. The changes over time and the variability of the prospective indicators are smaller than those that are observed in the conventional ones. A wide variety of new results emerge from the combination of methodologies. For example, for Germany, Iran, and the United States the likelihood that the prospective median age of the population in 2098 will be lower than it is today is close to 100 percent. PMID:28636675

  8. Probabilistic biological network alignment.

    Science.gov (United States)

    Todor, Andrei; Dobra, Alin; Kahveci, Tamer

    2013-01-01

    Interactions between molecules are probabilistic events. An interaction may or may not happen with some probability, depending on a variety of factors such as the size, abundance, or proximity of the interacting molecules. In this paper, we consider the problem of aligning two biological networks. Unlike existing methods, we allow one of the two networks to contain probabilistic interactions. Allowing interaction probabilities makes the alignment more biologically relevant at the expense of explosive growth in the number of alternative topologies that may arise from different subsets of interactions that take place. We develop a novel method that efficiently and precisely characterizes this massive search space. We represent the topological similarity between pairs of aligned molecules (i.e., proteins) with the help of random variables and compute their expected values. We validate our method showing that, without sacrificing the running time performance, it can produce novel alignments. Our results also demonstrate that our method identifies biologically meaningful mappings under a comprehensive set of criteria used in the literature as well as the statistical coherence measure that we developed to analyze the statistical significance of the similarity of the functions of the aligned protein pairs.

  9. Quantum probabilistic logic programming

    Science.gov (United States)

    Balu, Radhakrishnan

    2015-05-01

    We describe a quantum mechanics based logic programming language that supports Horn clauses, random variables, and covariance matrices to express and solve problems in probabilistic logic. The Horn clauses of the language wrap random variables, including infinite valued ones, to express probability distributions and statistical correlations, a powerful feature to capture relationships between distributions that are not independent. The expressive power of the language is based on a mechanism to implement statistical ensembles and to solve the underlying SAT instances using quantum mechanical machinery. We exploit the fact that classical random variables have quantum decompositions to build the Horn clauses. We establish the semantics of the language in a rigorous fashion by considering an existing probabilistic logic language called PRISM with classical probability measures defined on the Herbrand base and extending it to the quantum context. In the classical case, H-interpretations form the sample space and probability measures defined on them lead to a consistent definition of probabilities for well formed formulae. In the quantum counterpart, we define probability amplitudes on H-interpretations, facilitating model generation and verification via quantum mechanical superpositions and entanglements. We cast the well formed formulae of the language as quantum mechanical observables, thus providing an elegant interpretation for their probabilities. We discuss several examples to combine statistical ensembles and predicates of first order logic to reason with situations involving uncertainty.

  10. Probabilistic risk assessment methodology

    Energy Technology Data Exchange (ETDEWEB)

    Shinaishin, M A

    1988-06-15

    The objective of this work is to provide the tools necessary for clear identification of: the purpose of a Probabilistic Risk Study, the bounds and depth of the study, the proper modeling techniques to be used, the failure modes contributing to the analysis, the classical and Bayesian approaches for manipulating the data necessary for quantification, ways of treating uncertainties, and available computer codes that may be used in performing such probabilistic analyses. In addition, it provides the means for measuring the importance of a safety feature to maintaining a level of risk at a Nuclear Power Plant and the worth of optimizing a safety system for risk reduction. In applying these techniques so that they accommodate our national resources and needs, it was felt that emphasis should be put on the system reliability analysis level of PRA. Objectives of such studies could include: comparing the system designs of the various vendors in the bidding stage, and performing grid reliability and human performance analyses using nation-specific data. (author)

  11. The European PASSAM project. R and D outcomes towards enhanced severe accident source term mitigation

    International Nuclear Information System (INIS)

    Albiol, T.; Herranz, L.; Riera, E.; Dalibart, C.; Lind, T.; Corno, A. Del; Kärkelä, T.; Losch, N.; Azambre, B.

    2017-01-01

    The European PASSAM project (Passive and Active Systems on Severe Accident source term Mitigation) involved nine partners from six countries over four years (2013 - 2016): IRSN (project coordinator), EDF and University of Lorraine (France); CIEMAT and CSIC (Spain); PSI (Switzerland); RSE (Italy); VTT (Finland) and AREVA GmbH (Germany). It was mainly of an R and D experimental nature and aimed at investigating phenomena that might enhance source term mitigation in case of a severe accident in a LWR. Both already existing systems and innovative ones were experimentally studied. This paper presents the main outcomes of this project, including experimental results, understanding of phenomena and corresponding models and correlations, with some preliminary analyses for potential use in severe accident management strategies, taking into account the passive or non-passive nature of the systems studied. Pool scrubbing represented the most studied domain of the PASSAM project. As an example of results, it was shown that gas hydrodynamics, at least in some relevant scenarios, is significantly different from what is nowadays encapsulated in severe accident analysis codes, particularly at high velocities, and that in the long run maintaining an alkaline pH in the scrubber solution is absolutely necessary for preventing a delayed iodine release. Regarding sand bed filters plus metallic pre-filters, implemented on all French nuclear power plants, the filtration efficiency for gaseous molecular and organic iodine was checked. Other experiments showed that under severe accident conditions, cesium iodide aerosols trapped in the sand filter are unstable and may constitute a delayed source term, which is not the case for CsI particles trapped on the metallic pre-filter. As innovative processes, both acoustic agglomeration and high pressure spray systems were studied, mainly with the aim of producing bigger particles upstream of filtered containment venting systems (FCVS), and so enhancing

  12. Topics in Probabilistic Judgment Aggregation

    Science.gov (United States)

    Wang, Guanchun

    2011-01-01

    This dissertation is a compilation of several studies that are united by their relevance to probabilistic judgment aggregation. In the face of complex and uncertain events, panels of judges are frequently consulted to provide probabilistic forecasts, and aggregation of such estimates in groups often yields better results than could have been made…

  13. Probabilistic studies of accident sequences

    International Nuclear Information System (INIS)

    Villemeur, A.; Berger, J.P.

    1986-01-01

    For several years, Electricite de France has carried out probabilistic assessments of accident sequences for nuclear power plants. Many methods were developed in the framework of this program. As interest in these studies increased and as adapted methods became available, Electricite de France undertook a probabilistic safety assessment of a nuclear power plant [fr

  14. Compression of Probabilistic XML documents

    NARCIS (Netherlands)

    Veldman, Irma

    2009-01-01

    Probabilistic XML (PXML) files resulting from data integration can become extremely large, which is undesired. For XML there are several techniques available to compress the document and since probabilistic XML is in fact (a special form of) XML, it might benefit from these methods even more. In

  15. Fission Product Transport and Source Terms in HTRs: Experience from AVR Pebble Bed Reactor

    Directory of Open Access Journals (Sweden)

    Rainer Moormann

    2008-01-01

    Full Text Available Fission products deposited in the coolant circuit outside of the active core play a dominant role in source term estimations for advanced small pebble bed HTRs, particularly in design basis accidents (DBA). The deposited fission products may be released in depressurization accidents because present pebble bed HTR concepts abstain from a gas tight containment. Contamination of the circuit also hinders maintenance work. Experiments performed from 1972 to 88 on the AVR, an experimental pebble bed HTR, allow for a deeper insight into fission product transport behavior. The activity deposition per coolant pass was lower than expected and was influenced by fission product chemistry and by the presence of carbonaceous dust. The latter also led to inconsistencies between Cs plate-out experiments in the laboratory and in the AVR. The deposition behavior of Ag was in line with present models. Dust as an activity carrier is of safety relevance because of its mobility and its sorption capability for fission products. All metal surfaces in pebble bed reactors were covered by a carbonaceous dust layer. Dust in AVR was produced by abrasion in amounts of about 5 kg/y. Additional dust sources in AVR were oil ingress and peeling of fuel element surfaces due to an air ingress. Dust has a size of about 1 μm, consists mainly of graphite, is partly remobilized by flow perturbations, and deposits with time constants of 1 to 2 hours. In future reactors, efficient filtering via a gas tight containment is required because accidents with fast depressurizations induce dust mobilization. Enhanced core temperatures in normal operation as in AVR and broken fuel pebbles have to be considered, as well as inflammable dust concentrations in the gas phase.

  16. Probabilistic Linguistic Power Aggregation Operators for Multi-Criteria Group Decision Making

    Directory of Open Access Journals (Sweden)

    Agbodah Kobina

    2017-12-01

    Full Text Available As an effective aggregation tool, the power average (PA) allows the input arguments being aggregated to support and reinforce each other, which provides more versatility in the information aggregation process. Under the probabilistic linguistic term environment, we deeply investigate new power aggregation (PA) operators for fusing probabilistic linguistic term sets (PLTSs). In this paper, we firstly develop the probabilistic linguistic power average (PLPA) and the weighted probabilistic linguistic power average (WPLPA) operators, as well as the probabilistic linguistic power geometric (PLPG) and the weighted probabilistic linguistic power geometric (WPLPG) operators. At the same time, we carefully analyze the properties of these new aggregation operators. With the aid of the WPLPA and WPLPG operators, we further design the approaches for the application of multi-criteria group decision-making (MCGDM) with PLTSs. Finally, we use an illustrated example to expound our proposed methods and verify their performances.
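    The PLPA/WPLPA operators extend the classical power average to probabilistic linguistic term sets. The sketch below implements only the underlying numeric (weighted) power average with a common support function Sup(a, b) = 1 - |a - b|; the linguistic-term machinery of the paper is not reproduced.

```python
import numpy as np

# Classical (weighted) power average: arguments close to the others receive
# more support T(a_i) and therefore more weight, so outliers are de-emphasized.
# Support is Sup(a_i, a_j) = 1 - |a_i - a_j|, a common choice for values in [0, 1].

def power_average(a, weights=None):
    a = np.asarray(a, dtype=float)
    n = len(a)
    w = np.ones(n) / n if weights is None else np.asarray(weights, dtype=float)

    support = 1.0 - np.abs(a[:, None] - a[None, :])   # Sup(a_i, a_j)
    np.fill_diagonal(support, 0.0)                    # exclude the j == i term
    T = (w[None, :] * support).sum(axis=1)            # weighted overall support T(a_i)

    coeff = w * (1.0 + T)
    return float((coeff * a).sum() / coeff.sum())

# The outlying 0.9 receives less reinforcement than the clustered values,
# so the result is pulled slightly below the plain arithmetic mean.
print(power_average([0.6, 0.65, 0.9, 0.58]))
```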

  17. Probabilistic Structural Analysis Theory Development

    Science.gov (United States)

    Burnside, O. H.

    1985-01-01

    The objective of the Probabilistic Structural Analysis Methods (PSAM) project is to develop analysis techniques and computer programs for predicting the probabilistic response of critical structural components for current and future space propulsion systems. This technology will play a central role in establishing system performance and durability. The first year's technical activity is concentrating on probabilistic finite element formulation strategy and code development. Work is also in progress to survey critical materials and space shuttle main engine components. The probabilistic finite element computer program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) is being developed. The final probabilistic code will have, in the general case, the capability of performing nonlinear dynamic analysis of stochastic structures. It is the goal of the approximate methods effort to increase problem solving efficiency relative to finite element methods by using energy methods to generate trial solutions which satisfy the structural boundary conditions. These approximate methods will be less computer intensive relative to the finite element approach.

  18. Effect of hypoiodous acid volatility on the iodine source term in reactor accidents

    International Nuclear Information System (INIS)

    Routamo, T.

    1996-01-01

    A FORTRAN code ACT WATCH has been developed to establish an improved understanding of essential radionuclide behaviour mechanisms, especially related to iodine chemistry, in reactor accidents. The accident scenarios calculated in this paper are based on the Loss of Coolant accident at the Loviisa Nuclear Power Plant. The effect of different airborne species, especially HIO, on the iodine source term has been studied. The main cause of the high HIO release in the system modelled is the increase of the I2 hydrolysis rate along with the temperature increase, which accelerates HIO production. Due to the high radiation level near the reactor core, I2 is produced from I- very rapidly. High temperature in the reactor coolant causes I2 to be transformed into HIO, and through the boiling of the coolant volatile I2 and HIO are transferred efficiently into the gas phase. High filtration efficiency for particulate iodine causes the I- release to be much lower than those of I2 and HIO. (author) 15 figs., 1 tab., refs.

  19. LMFBR source term experiments in the Fuel Aerosol Simulant Test (FAST) facility

    International Nuclear Information System (INIS)

    Petrykowski, J.C.; Longest, A.W.

    1985-01-01

    The transport of uranium dioxide (UO2) aerosol through liquid sodium was studied in a series of ten experiments in the Fuel Aerosol Simulant Test (FAST) facility at Oak Ridge National Laboratory (ORNL). The experiments were designed to provide a mechanistic basis for evaluating the radiological source term associated with a postulated, energetic core disruptive accident (CDA) in a liquid metal fast breeder reactor (LMFBR). Aerosol was generated by capacitor discharge vaporization of UO2 pellets which were submerged in a sodium pool under an argon cover gas. Measurements of the pool and cover gas pressures were used to study the transport of aerosol contained by vapor bubbles within the pool. Samples of cover gas were filtered to determine the quantity of aerosol released from the pool. The depth at which the aerosol was generated was found to be the most critical parameter affecting release. The largest release was observed in the baseline experiment where the sample was vaporized above the sodium pool. In the nine ''undersodium'' experiments aerosol was generated beneath the surface of the pool at depths varying from 30 to 1060 mm. The mass of aerosol released from the pool was found to be a very small fraction of the original specimen. It appears that the bulk of aerosol was contained by bubbles which collapsed within the pool. 18 refs., 11 figs., 4 tabs

  20. Effect of hypoiodous acid volatility on the iodine source term in reactor accidents

    Energy Technology Data Exchange (ETDEWEB)

    Routamo, T. [Imatran Voima Oy, Vantaa (Finland)]

    1996-12-01

    A FORTRAN code ACT WATCH has been developed to establish an improved understanding of essential radionuclide behaviour mechanisms, especially related to iodine chemistry, in reactor accidents. The accident scenarios calculated in this paper are based on the Loss of Coolant accident at the Loviisa Nuclear Power Plant. The effect of different airborne species, especially HIO, on the iodine source term has been studied. The main cause of the high HIO release in the system modelled is the increase of I₂ hydrolysis rate along with the temperature increase, which accelerates HIO production. Due to the high radiation level near the reactor core, I₂ is produced from I⁻ very rapidly. High temperature in the reactor coolant causes I₂ to be transformed into HIO and through the boiling of the coolant volatile I₂ and HIO are transferred efficiently into the gas phase. High filtration efficiency for particulate iodine causes I⁻ release to be much lower than those of I₂ and HIO. (author) 15 figs., 1 tab., refs.

  1. Regulatory Technology Development Plan - Sodium Fast Reactor. Mechanistic Source Term - Trial Calculation. Work Plan

    International Nuclear Information System (INIS)

    Grabaskas, David; Bucknor, Matthew; Jerden, James; Brunett, Acacia J.

    2016-01-01

    The overall objective of the SFR Regulatory Technology Development Plan (RTDP) effort is to identify and address potential impediments to the SFR regulatory licensing process. In FY14, an analysis by Argonne identified the development of an SFR-specific MST methodology as an existing licensing gap with high regulatory importance and a potentially long lead-time to closure. This work was followed by an initial examination of the current state-of-knowledge regarding SFR source term development (ANLART-3), which reported several potential gaps. Among these were the potential inadequacies of current computational tools to properly model and assess the transport and retention of radionuclides during a metal fuel pool-type SFR core damage incident. The objective of the current work is to determine the adequacy of existing computational tools, and the associated knowledge database, for the calculation of an SFR MST. To accomplish this task, a trial MST calculation will be performed using available computational tools to establish their limitations with regard to relevant radionuclide release/retention/transport phenomena. The application of existing modeling tools will provide a definitive test to assess their suitability for an SFR MST calculation, while also identifying potential gaps in the current knowledge base and providing insight into open issues regarding regulatory criteria/requirements. The findings of this analysis will assist in determining future research and development needs.

  2. The European source term code ESTER - basic ideas and tools for coupling of ATHLET and ESTER

    International Nuclear Information System (INIS)

    Schmidt, F.; Schuch, A.; Hinkelmann, M.

    1993-04-01

    The French software house CISI and IKE of the University of Stuttgart developed, during 1990 and 1991 in the frame of the Shared Cost Action Reactor Safety, the informatic structure of the European Source TERm Evaluation System (ESTER). This work produced tools that allow code development and code application in the area of severe core accident research to be unified on a European basis. The behaviour of reactor cores is determined by thermal-hydraulic conditions. For the development of ESTER it was therefore important to investigate how to integrate thermal-hydraulic code systems with ESTER applications. This report describes the basic ideas of ESTER and improvements of the ESTER tools in view of a possible coupling of the thermal-hydraulic code system ATHLET and ESTER. Through the work performed during this project, the ESTER tools became the most modern informatic tools presently available in the area of severe accident research. A sample application is given which demonstrates the use of the new tools. (orig.) [de

  3. Source Term Analysis of the Irradiated Graphite in the Core of HTR-10

    Directory of Open Access Journals (Sweden)

    Xuegang Liu

    2017-01-01

    The high temperature gas-cooled reactor (HTGR) has potential for utilization due to its characteristic features such as inherent safety and a wide diversity of applications. One distinct difference between the HTGR and the traditional pressurized water reactor (PWR) is the large inventory of graphite in the core acting as reflector, moderator, or structural material. Some radionuclides are generated in the graphite during the period of irradiation, and these play significant roles in reactor safety, environmental release, waste disposal, and so forth. Based on the actual operation of the 10 MW pebble bed high temperature gas-cooled reactor (HTR-10) at Tsinghua University, China, an experimental study on source term analysis of the irradiated graphite has been done. An irradiated graphite sphere was randomly collected from the core of HTR-10 as a sample for this study. This paper focuses on the analytical procedure and the establishment of the analytical methodology, including sample collection, graphite sample preparation, and analytical parameters. The results reveal that Co-60, Cs-137, Eu-152, and Eu-154 are the major γ contributors, while H-3 and C-14 are the dominating β-emitting nuclides in the post-irradiation graphite material of HTR-10. The distribution profiles of the above four nuclides are also presented.

  4. Release of radionuclides following severe accident in interim storage facility. Source term determination

    International Nuclear Information System (INIS)

    Morandi, S.; Mariani, M.; Giacobbo, F.; Covini, R.

    2006-01-01

    Among the severe accidents that can cause the release of radionuclides from an interim storage facility, with a consequent relevant radiological impact on the population, there is the impact of an aircraft on the facility. In this work, a safety assessment analysis for the case of an aircraft crash into an interim storage facility is tackled. To this aim a methodology, based upon DOE, IAEA and NUREG standard procedures and upon conservative yet realistic hypotheses, has been developed in order to evaluate the total radioactivity (source term) released to the biosphere as a consequence of the impact, without resorting to complicated numerical codes. The procedure consists of the identification of the accidental scenarios, the evaluation of the consequent damage to the building structures and to the waste packages, and the determination of the total release of radionuclides through the building-atmosphere interface. The methodology developed here has been applied to the case of an aircraft crash into an interim storage facility currently under design. Results show that in the case of perforation followed by a fire the total released activity would be greater by some orders of magnitude than in the case of mere perforation. (author)
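
    The abstract does not spell out the release formula used. A common screening approach for building releases of this kind is the DOE-HDBK-3010 five-factor formula; the sketch below applies it with purely illustrative values, and is not necessarily the methodology developed in the paper:

```python
# DOE-HDBK-3010 five-factor screening formula for a building release
# (illustrative numbers only; the study above may use different factors).
MAR = 1.0e14   # material at risk in the impacted packages [Bq] (assumed)
DR  = 0.1      # damage ratio: fraction of MAR in breached packages (assumed)
ARF = 1.0e-3   # airborne release fraction for mechanical impact (assumed)
RF  = 0.1      # respirable fraction (assumed)
LPF = 0.5      # leak path factor through the damaged building (assumed)

source_term = MAR * DR * ARF * RF * LPF
print(f"released respirable activity: {source_term:.2e} Bq")
# With a fire, ARF (and hence the release) can grow by orders of magnitude,
# consistent with the conclusion of the abstract above.
```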

  5. Study of source term evaluation from fuel solution under simulated nuclear criticality accident in TRACY

    International Nuclear Information System (INIS)

    Abe, Hitoshi; Tashiro, Shinsuke; Nagai, Hitoshi; Koike, Tadao; Okagawa, Seigo; Murata, Mikio

    1999-01-01

    In an accident at the dissolver in a reprocessing plant, various fission products and radiolysis gases will be produced in the fuel solution, and volatile radioactive nuclides, radiolysis gases and nitrogen oxides will be released spontaneously into the vent gas. Moreover, non-volatile nuclides will be released as radioactive aerosol (mist) by bubbles bursting at the surface of the solution. Therefore, quantitative estimation of the release and transport behavior of the radioactive material from the solution, as a source term, is very important. TRACY is a transient criticality experimental facility for studying the transient criticality characteristics of low enriched uranium. In this paper, experimental methods and results concerning the release behavior of hydrogen, radioactive aerosol and iodine species from the fuel solutions are reported. As a result of the experiments, the release patterns of H₂, ¹⁴⁰Ba and ¹³¹I could be determined. Concentrations of H₂ in the vent gas and ¹⁴⁰Ba in the gas phase of the core tank peaked just after the transient criticality and decreased exponentially with time. On the other hand, concentrations of ¹³¹I in the gas phase of the tank began to increase with a time lag of several minutes from the transient criticality and attained approximately constant values. (J.P.N.)

  6. Sensitivity analysis and benchmarking of the BLT low-level waste source term code

    International Nuclear Information System (INIS)

    Suen, C.J.; Sullivan, T.M.

    1993-07-01

    To evaluate the source term for low-level waste disposal, a comprehensive model has been developed and incorporated into a computer code called BLT (Breach-Leach-Transport). Since the release of the original version, many new features and improvements have been added to the Leach model of the code. This report consists of two different studies based on the new version of the BLT code: (1) a series of verification/sensitivity tests; and (2) benchmarking of the BLT code using field data. Based on the results of the verification/sensitivity tests, the authors concluded that the new version represents a significant improvement and is capable of providing more realistic simulations of the leaching process. Benchmarking work was carried out to provide a reasonable level of confidence in the model predictions. In this study, the experimentally measured release curves for nitrate, technetium-99 and tritium from the saltstone lysimeters operated by Savannah River Laboratory were used. The model results are in general agreement with the experimental data, within acceptable limits of uncertainty.
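
    The leach model itself is not detailed in the abstract. As a hedged illustration of the kind of release curve such codes are benchmarked against, the sketch below uses a standard semi-infinite diffusion approximation in which the cumulative fraction released grows with the square root of time; the effective diffusivity and surface-to-volume ratio are assumed values, not those of the saltstone lysimeters:

```python
import numpy as np

# Semi-infinite diffusion leaching sketch (ANS 16.1-style), not the BLT
# Leach model itself. Cumulative fraction released (CFR) from a monolith:
#   CFR(t) = (S/V) * 2 * sqrt(De * t / pi)
De = 1.0e-13        # effective diffusivity [m^2/s] (assumed)
S_over_V = 10.0     # surface-to-volume ratio of the waste form [1/m] (assumed)

t_years = np.array([0.1, 0.5, 1.0, 2.0, 5.0, 10.0])
t_sec = t_years * 365.25 * 24 * 3600.0
cfr = np.minimum(S_over_V * 2.0 * np.sqrt(De * t_sec / np.pi), 1.0)

for ty, f in zip(t_years, cfr):
    print(f"t = {ty:5.1f} yr   cumulative fraction released = {f:.3f}")
```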

  7. A Source-Term Based Boundary Layer Bleed/Effusion Model for Passive Shock Control

    Science.gov (United States)

    Baurle, Robert A.; Norris, Andrew T.

    2011-01-01

    A modeling framework for boundary layer effusion has been developed based on the use of source (or sink) terms instead of the usual practice of specifying bleed directly as a boundary condition. This framework allows the surface boundary condition (i.e. isothermal wall, adiabatic wall, slip wall, etc.) to remain unaltered in the presence of bleed. This approach also makes it easy to add empirical models for second-order effects that are not easily accounted for by simply defining effective transpiration values. Two effusion models formulated for supersonic flows have been implemented into this framework: the Doerffer/Bohning law and the Slater formulation. These models were applied to unit problems that contain key aspects of the flow physics applicable to bleed systems designed for hypersonic air-breathing propulsion systems. The ability of each model to predict bulk bleed properties was assessed, as well as the response of the boundary layer as it passes through and downstream of a porous bleed system. The model assessment was performed with and without the presence of shock waves. Three-dimensional CFD simulations that included the geometric details of the porous plate bleed systems were also carried out to supplement the experimental data and provide additional insights into the bleed flow physics. Overall, both bleed formulations fared well for the tests performed in this study. However, the sample of test problems considered in this effort was not large enough to permit a comprehensive validation of the models.
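
    To make the "source term instead of boundary condition" idea concrete, the toy sketch below removes mass through a sink term applied to the cells above a porous plate in a 1-D finite-volume update; the bleed-rate expression and all numbers are assumptions for illustration and are not the Doerffer/Bohning or Slater formulations:

```python
import numpy as np

# Toy 1-D sketch: bleed applied as a mass sink in the cells over a porous
# plate, rather than as a modified wall boundary condition (so the thermal
# wall treatment would stay untouched). All values are assumptions.
nx, dx, h, dt = 100, 0.01, 0.02, 1.0e-5  # cells, cell length [m], cell height [m], step [s]
rho = np.full(nx, 1.2)                   # density [kg/m^3]
u = 200.0                                # uniform convection speed [m/s]
v_bleed, porosity = 5.0, 0.2             # assumed bleed velocity [m/s] and plate porosity
plate = slice(40, 60)                    # cells above the porous plate

for _ in range(200):
    # first-order upwind convection of density (inflow cell 0 held fixed)
    rho[1:] -= u * dt / dx * (rho[1:] - rho[:-1])
    # sink term: wall mass flux rho * v_bleed * porosity spread over the cell height
    rho[plate] -= dt * rho[plate] * v_bleed * porosity / h

print(f"fraction of mass removed through the plate: {1.0 - rho.mean() / 1.2:.3%}")
```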

  8. Analysis of the Variability of Classified and Unclassified Radiological Source term Inventories in the Frenchman Flat Area, Nevada test Site

    International Nuclear Information System (INIS)

    Zhao, P.; Zavarin, M.

    2008-01-01

    It has been proposed that unclassified source terms used in the reactive transport modeling investigations at NTS CAUs should be based on yield-weighted source terms calculated using the average source term from Bowen et al. (2001) and the unclassified announced yields reported in DOE/NV-209. This unclassified inventory is likely to be used in unclassified contaminant boundary calculations and is, thus, relevant to compare to the classified inventory. The authors examined the classified radionuclide inventory produced by 10 underground nuclear tests conducted in the Frenchman Flat (FF) area of the Nevada Test Site. The goals were to (1) evaluate the variability in classified radiological source terms among the 10 tests and (2) compare that variability and the inventory uncertainties to an average unclassified inventory (e.g. Bowen et al., 2001). To evaluate source term variability among the 10 tests, radiological inventories were compared on two relative scales: geometric mean and yield-weighted geometric mean. Furthermore, radiological inventories were decay corrected either to a common date (9/23/1992) or to the time zero (t₀) of each test. Thus, a total of four data sets were produced. The date of 9/23/1992 was chosen based on the date of the last underground nuclear test at the Nevada Test Site.
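
    The two relative scales mentioned above are easy to state precisely. The sketch below computes a geometric mean and a yield-weighted geometric mean of per-test inventories decay-corrected to a common date, using invented numbers (the actual per-test inventories are classified):

```python
import numpy as np

# Geometric mean and yield-weighted geometric mean of a per-test inventory,
# decay corrected to a common date. All inventories, yields and times are
# invented for illustration.
half_life_yr = 30.07                 # e.g. Cs-137
lam = np.log(2.0) / half_life_yr

# hypothetical tests: (inventory at t0 [Ci], announced yield [kt], years from t0 to 9/23/1992)
tests = {
    "test_A": (1.2e3, 10.0, 25.0),
    "test_B": (4.5e2,  2.0, 30.0),
    "test_C": (2.0e3, 20.0, 18.0),
}

inv_1992 = np.array([a0 * np.exp(-lam * t) for a0, _, t in tests.values()])
yields = np.array([y for _, y, _ in tests.values()])

geo_mean = np.exp(np.mean(np.log(inv_1992)))
yw_geo_mean = np.exp(np.sum(yields * np.log(inv_1992)) / yields.sum())

print(f"geometric mean inventory (1992):      {geo_mean:.3e} Ci")
print(f"yield-weighted geometric mean (1992): {yw_geo_mean:.3e} Ci")
```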

  9. Semi-implicit and fully implicit shock-capturing methods for hyperbolic conservation laws with stiff source terms

    International Nuclear Information System (INIS)

    Yee, H.C.; Shinn, J.L.

    1986-12-01

    Some numerical aspects of finite-difference algorithms for nonlinear multidimensional hyperbolic conservation laws with stiff nonhomogeneous (source) terms are discussed. If the stiffness is entirely dominated by the source term, a semi-implicit shock-capturing method is proposed provided that the Jacobian of the source terms possesses certain properties. The proposed semi-implicit method can be viewed as a variant of the Bussing and Murman point-implicit scheme with a more appropriate numerical dissipation for the computation of strong shock waves. However, if the stiffness is not solely dominated by the source terms, a fully implicit method would be a better choice. The situation is complicated by problems that are higher than one dimension, and the presence of stiff source terms further complicates the solution procedures for alternating direction implicit (ADI) methods. Several alternatives are discussed. The primary motivation for constructing these schemes was to address thermally and chemically nonequilibrium flows in the hypersonic regime. Due to the unique structure of the eigenvalues and eigenvectors for fluid flows of this type, the computation can be simplified, thus providing a more efficient solution procedure than one might have anticipated.

  10. Semi-implicit and fully implicit shock-capturing methods for hyperbolic conservation laws with stiff source terms

    International Nuclear Information System (INIS)

    Yee, H.C.; Shinn, J.L.

    1987-01-01

    Some numerical aspects of finite-difference algorithms for nonlinear multidimensional hyperbolic conservation laws with stiff nonhomogeneous (source) terms are discussed. If the stiffness is entirely dominated by the source term, a semi-implicit shock-capturing method is proposed provided that the Jacobian of the source terms possesses certain properties. The proposed semi-implicit method can be viewed as a variant of the Bussing and Murman point-implicit scheme with a more appropriate numerical dissipation for the computation of strong shock waves. However, if the stiffness is not solely dominated by the source terms, a fully implicit method would be a better choice. The situation is complicated by problems that are higher than one dimension, and the presence of stiff source terms further complicates the solution procedures for alternating direction implicit (ADI) methods. Several alternatives are discussed. The primary motivation for constructing these schemes was to address thermally and chemically nonequilibrium flows in the hypersonic regime. Due to the unique structure of the eigenvalues and eigenvectors for fluid flows of this type, the computation can be simplified, thus providing a more efficient solution procedure than one might have anticipated. 46 references
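
    A minimal scalar illustration of the point-implicit idea in the two abstracts above (not the Yee-Shinn scheme itself, which adds shock-capturing dissipation and multidimensional ADI machinery): the flux is advanced explicitly while a stiff relaxation source is treated implicitly through its Jacobian.

```python
import numpy as np

# Point-implicit (semi-implicit) sketch: advect u with an explicit upwind flux
# while the stiff relaxation source S(u) = -k (u - u_eq) is treated implicitly
# through its Jacobian dS/du = -k. Parameters are illustrative.
nx = 200
dx = 1.0 / nx
a, k, u_eq = 1.0, 1.0e4, 0.0         # advection speed, stiff rate, equilibrium value
dt = 0.8 * dx / a                    # CFL-limited step; note k * dt >> 1 (stiff source)

x = (np.arange(nx) + 0.5) * dx
u = np.exp(-200.0 * (x - 0.3) ** 2)  # initial pulse

for _ in range(100):
    r_conv = -a * (u - np.roll(u, 1)) / dx      # explicit upwind flux residual (periodic)
    s = -k * (u - u_eq)                         # stiff source evaluated at u^n
    du = dt * (r_conv + s) / (1.0 + dt * k)     # (1 - dt dS/du) du = dt (r_conv + S)
    u += du

print(f"max |u - u_eq| after 100 steps: {np.abs(u - u_eq).max():.3e}")
```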

  11. Probabilistic escalation modelling

    Energy Technology Data Exchange (ETDEWEB)

    Korneliussen, G.; Eknes, M.L.; Haugen, K.; Selmer-Olsen, S. [Det Norske Veritas, Oslo (Norway)

    1997-12-31

    This paper describes how structural reliability methods may successfully be applied within quantitative risk assessment (QRA) as an alternative to traditional event tree analysis. The emphasis is on fire escalation in hydrocarbon production and processing facilities. This choice was made due to potential improvements over current QRA practice associated with both the probabilistic approach and more detailed modelling of the dynamics of escalating events. The physical phenomena important for the events of interest are explicitly modelled as functions of time. Uncertainties are represented through probability distributions. The uncertainty modelling enables the analysis to be simple when possible and detailed when necessary. The methodology features several advantages compared with traditional risk calculations based on event trees. (Author)
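
    A toy version of the approach described above, with invented distributions rather than those of the DNV analysis: the time to barrier failure and the fire duration are sampled directly and the escalation probability is estimated by Monte Carlo instead of being fixed as an event-tree branch probability.

```python
import numpy as np

# Toy Monte Carlo escalation estimate: P(escalation) = P(t_fail < t_fire).
# Distributions and parameters are invented for illustration only.
rng = np.random.default_rng(7)
n = 200_000

heat_flux = rng.lognormal(mean=np.log(150.0), sigma=0.3, size=n)  # incident flux [kW/m^2]
capacity = rng.normal(35.0, 5.0, size=n)                          # thermal dose to failure [MJ/m^2]
t_fail = capacity * 1.0e3 / heat_flux                             # time to barrier failure [s]
t_fire = rng.weibull(1.5, size=n) * 400.0                         # fire duration [s]

p_escalation = float(np.mean(t_fail < t_fire))
print(f"estimated escalation probability: {p_escalation:.3f}")
```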

  12. Probabilistic fracture finite elements

    Science.gov (United States)

    Liu, W. K.; Belytschko, T.; Lua, Y. J.

    1991-05-01

    Probabilistic Fracture Mechanics (PFM) is a promising method for estimating the fatigue life and inspection cycles of mechanical and structural components. The Probability Finite Element Method (PFEM), which is based on second moment analysis, has proved to be a promising, practical approach to handling problems with uncertainties. As the PFEM provides a powerful computational tool for determining the first and second moments of random parameters, the second moment reliability method can easily be combined with the PFEM to obtain measures of the reliability of the structural system. The method is also being applied to fatigue crack growth. Uncertainties in the material properties of advanced materials such as polycrystalline alloys, ceramics, and composites are commonly observed in experimental tests. This is mainly attributed to intrinsic microcracks, which are randomly distributed as a result of the applied load and the residual stress.
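
    As a compact illustration of what a second-moment reliability estimate involves (a first-order second-moment sketch with invented numbers, not the PFEM code itself):

```python
import numpy as np
from scipy.stats import norm

# First-order second-moment (FOSM) sketch: propagate means and standard
# deviations of independent random inputs through a limit-state function
# g = strength - stress and report the reliability index beta.
mu = np.array([350.0e6, 250.0e6])   # means: yield strength, applied stress [Pa] (assumed)
sd = np.array([30.0e6, 25.0e6])     # standard deviations (assumed independent inputs)

def g(x):
    """Limit state: failure when g(x) < 0."""
    strength, stress = x
    return strength - stress

# gradient of g at the mean point by central finite differences
eps = 1.0e-3 * mu
grad = np.array([(g(mu + eps[i] * e) - g(mu - eps[i] * e)) / (2.0 * eps[i])
                 for i, e in enumerate(np.eye(2))])

mean_g = g(mu)
std_g = np.sqrt(np.sum((grad * sd) ** 2))
beta = mean_g / std_g
print(f"reliability index beta = {beta:.2f}, P(failure) = {norm.cdf(-beta):.3e}")
```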

  13. Probabilistic retinal vessel segmentation

    Science.gov (United States)

    Wu, Chang-Hua; Agam, Gady

    2007-03-01

    Optic fundus assessment is widely used for diagnosing vascular and non-vascular pathology. Inspection of the retinal vasculature may reveal hypertension, diabetes, arteriosclerosis, cardiovascular disease and stroke. Due to various imaging conditions retinal images may be degraded. Consequently, the enhancement of such images and vessels in them is an important task with direct clinical applications. We propose a novel technique for vessel enhancement in retinal images that is capable of enhancing vessel junctions in addition to linear vessel segments. This is an extension of vessel filters we have previously developed for vessel enhancement in thoracic CT scans. The proposed approach is based on probabilistic models which can discern vessels and junctions. Evaluation shows the proposed filter is better than several known techniques and is comparable to the state of the art when evaluated on a standard dataset. A ridge-based vessel tracking process is applied on the enhanced image to demonstrate the effectiveness of the enhancement filter.

  14. Probabilistic sensory recoding.

    Science.gov (United States)

    Jazayeri, Mehrdad

    2008-08-01

    A hallmark of higher brain functions is the ability to contemplate the world rather than to respond reflexively to it. To do so, the nervous system makes use of a modular architecture in which sensory representations are dissociated from areas that control actions. This flexibility however necessitates a recoding scheme that would put sensory information to use in the control of behavior. Sensory recoding faces two important challenges. First, recoding must take into account the inherent variability of sensory responses. Second, it must be flexible enough to satisfy the requirements of different perceptual goals. Recent progress in theory, psychophysics, and neurophysiology indicates that cortical circuitry might meet these challenges by evaluating sensory signals probabilistically.

  15. Source term assessment using inverse modeling of radiation dose measured with environmental radiation monitors located at different positions

    International Nuclear Information System (INIS)

    Srinivas, C.V.; Rakesh, P.T.; Baskaran, R.; Venkatraman, B.

    2018-01-01

    Source term is an important input for consequence analysis using Decision Support Systems (DSS) to project the radiological impact in the event of nuclear emergencies. A source term model called 'ASTER' is incorporated in the Online Nuclear Emergency Response System (ONERS) operational at the Kalpakkam site for decision making during nuclear emergencies. It computes release rates by an inverse method, employing an atmospheric dispersion model and the gamma dose rates measured by environmental radiation monitors (ERM) deployed around the nuclear plant. The estimates may depend on the distribution of ERMs around the release location. In this work, data from gamma monitors located at different radii, 0.75 km and 1.5 km, are used to assess the accuracy of the source term estimation for stack releases of MAPS-PHWR at Kalpakkam.
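
    The inverse step can be illustrated with a very small synthetic example: dose rates at fixed monitors are modeled as linear in the unknown release rate through dispersion and dose-conversion factors, and the release rate is recovered by least squares. The plume model and all numbers below are assumptions, not the ASTER implementation:

```python
import numpy as np

# Synthetic inverse estimate of a release rate Q from monitor dose rates,
# D_i = c_i * Q, where c_i is a dispersion times dose-conversion factor.
# Plume parameters and factors are assumptions for illustration.
rng = np.random.default_rng(1)

def plume_factor(x, y, u=3.0, H=30.0):
    """Ground-level concentration per unit release rate [s/m^3] (illustrative Gaussian plume)."""
    sigma_y, sigma_z = 0.08 * x, 0.06 * x   # crude downwind growth of plume widths
    return (np.exp(-0.5 * (y / sigma_y) ** 2) * np.exp(-0.5 * (H / sigma_z) ** 2)
            / (np.pi * u * sigma_y * sigma_z))

dose_per_conc = 5.0e-14                     # dose rate per unit concentration [Sv/s per Bq/m^3] (assumed)
monitors = [(750.0, 0.0), (750.0, 100.0), (1500.0, 0.0), (1500.0, 150.0)]  # (x, y) in metres

c = np.array([plume_factor(x, y) for x, y in monitors]) * dose_per_conc
Q_true = 1.0e9                              # "true" release rate [Bq/s] for this synthetic test
measured = c * Q_true * rng.normal(1.0, 0.05, len(monitors))   # dose rates with 5 % noise

Q_est = float(np.dot(c, measured) / np.dot(c, c))              # least-squares estimate of Q
print(f"estimated release rate: {Q_est:.3e} Bq/s (true value {Q_true:.1e} Bq/s)")
```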

  16. Long-term aerosol climatology over Indo-Gangetic Plain: Trend, prediction and potential source fields

    Science.gov (United States)

    Kumar, M.; Parmar, K. S.; Kumar, D. B.; Mhawish, A.; Broday, D. M.; Mall, R. K.; Banerjee, T.

    2018-05-01

    Long-term aerosol climatology is derived using the Terra MODIS (Collection 6) enhanced Deep Blue (DB) AOD retrieval algorithm to investigate the decadal trend (2006-2015) in columnar aerosol loading, future scenarios and potential source fields over the Indo-Gangetic Plain (IGP), South Asia. The satellite-based aerosol climatology was analyzed in two contexts: for the entire IGP, considering area-weighted mean AOD, and for nine individual stations located in the upper (Karachi, Multan, Lahore), central (Delhi, Kanpur, Varanasi, Patna) and lower IGP (Kolkata, Dhaka). A comparatively high aerosol loading (AOD: 0.50 ± 0.25) was evident over the IGP, with a statistically insignificant increasing trend of 0.002 year⁻¹. The analysis highlights the existing spatial and temporal gradients in aerosol loading, with stations over the central IGP such as Varanasi (decadal mean AOD ± SD: 0.67 ± 0.28) and Patna (0.65 ± 0.30) exhibiting the highest AOD, followed by stations over the lower IGP (Kolkata: 0.58 ± 0.21; Dhaka: 0.60 ± 0.24), with a statistically significant increasing trend (0.0174-0.0206 year⁻¹). In contrast, stations over the upper IGP reveal a comparatively low aerosol loading with an insignificant increasing trend. Variation in AOD across the IGP is found to be mainly influenced by seasonality and topography. A distinct "aerosol pool" region over the eastern part of the Ganges plain is identified, where meteorology, topography, and aerosol sources favor the persistence of airborne particulates. A strong seasonality in aerosol loading and types is also witnessed, with high AOD and dominance of fine particulates over the central to lower IGP, especially during post-monsoon and winter. Time series analyses by autoregressive integrated moving average (ARIMA) indicate contrasting patterns in the randomness of AOD over individual stations, with better performance especially over the central IGP. Concentration weighted trajectory analyses identify the crucial contributions of western dry regions and partial contributions from
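
    As a small illustration of the ARIMA step mentioned above (fitted here to a synthetic monthly AOD series rather than the MODIS record, with an illustrative model order):

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# ARIMA fit on a synthetic monthly AOD series spanning 2006-2015. The (1, 1, 1)
# order is illustrative, not the order selected in the study above.
rng = np.random.default_rng(42)
months = pd.date_range("2006-01-01", "2015-12-01", freq="MS")
trend = 0.002 * np.arange(len(months)) / 12.0                 # ~0.002 per year
seasonal = 0.15 * np.sin(2.0 * np.pi * months.month / 12.0)   # crude seasonality
aod = 0.50 + trend + seasonal + rng.normal(0.0, 0.05, len(months))
series = pd.Series(aod, index=months)

result = ARIMA(series, order=(1, 1, 1)).fit()
forecast = result.forecast(steps=12)          # one year ahead
print(f"AIC = {result.aic:.1f}")
print(forecast.round(3).head())
```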

  17. Long-term trends in California mobile source emissions and ambient concentrations of black carbon and organic aerosol.

    Science.gov (United States)

    McDonald, Brian C; Goldstein, Allen H; Harley, Robert A

    2015-04-21

    A fuel-based approach is used to assess long-term trends (1970-2010) in mobile source emissions of black carbon (BC) and organic aerosol (OA, including both primary emissions and secondary formation). The main focus of this analysis is the Los Angeles Basin, where a long record of measurements is available to infer trends in ambient concentrations of BC and organic carbon (OC), with OC used here as a proxy for OA. Mobile source emissions and ambient concentrations have decreased similarly, reflecting the importance of on- and off-road engines as sources of BC and OA in urban areas. In 1970, the on-road sector accounted for ∼90% of total mobile source emissions of BC and OA (primary + secondary). Over time, as on-road engine emissions have been controlled, the relative importance of off-road sources has grown. By 2010, off-road engines were estimated to account for 37 ± 20% and 45 ± 16% of total mobile source contributions to BC and OA, respectively, in the Los Angeles area. This study highlights both the success of efforts to control on-road emission sources, and the importance of considering off-road engine and other VOC source contributions when assessing long-term emission and ambient air quality trends.

  18. Effect of seasonal and long-term changes in stress on sources of water to wells

    Science.gov (United States)

    Reilly, Thomas E.; Pollock, David W.

    1995-01-01

    The source of water to wells is ultimately the location where the water flowing to a well enters the boundary surface of the ground-water system. In ground-water systems that receive most of their water from areal recharge, the location of the water entering the system is at the water table. The area contributing recharge to a discharging well is the surface area that defines the location of the water entering the groundwater system. Water entering the system at the water table flows to the well and is eventually discharged from the well. Many State agencies are currently (1994) developing wellhead-protection programs. The thrust of some of these programs is to protect water supplies by determining the areas contributing recharge to water-supply wells and by specifying regulations to minimize the opportunity for contamination of the recharge water by activities at the land surface. In the analyses of ground-water flow systems, steady-state average conditions are frequently used to simplify the problem and make a solution tractable. Recharge is usually cyclic in nature, however, having seasonal cycles and longer term climatic cycles. A hypothetical system is quantitatively analyzed to show that, in many cases, these cyclic changes in the recharge rates apparently do not significantly affect the location and size of the areas contributing recharge to wells. The ratio of the mean travel time to the length of the cyclic stress period appears to indicate whether the transient effects of the cyclic stress must be explicitly represented in the analysis of contributing areas to wells. For the cases examined, if the ratio of the mean travel time to the period of the cyclic stress was much greater than one, then the transient area contributing recharge to wells was similar to the area calculated using an average steady-state condition. Noncyclic long-term transient changes in water use, however, and cyclic stresses on systems with ratios less than 1 can and do affect the
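
    The ratio criterion in this abstract can be made concrete with a short worked example; the residence-time estimate below (porosity times saturated thickness divided by recharge) is a standard approximation, and all numbers are illustrative:

```python
# Worked example of the travel-time-to-cycle-length ratio criterion.
porosity = 0.30          # effective porosity [-] (assumed)
thickness = 30.0         # saturated thickness [m] (assumed)
recharge = 0.25          # areal recharge rate [m/yr] (assumed)
cycle = 1.0              # length of the seasonal recharge cycle [yr]

mean_travel_time = porosity * thickness / recharge   # approximate mean residence time [yr]
ratio = mean_travel_time / cycle

print(f"mean travel time ~ {mean_travel_time:.0f} yr, ratio to cycle length = {ratio:.0f}")
if ratio > 1.0:
    print("ratio >> 1: the steady-state (average) contributing area is a reasonable surrogate")
else:
    print("ratio <= 1: transient (cyclic) effects should be represented explicitly")
```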

  19. Radiation Protection Aspects of Primary Water Chemistry and Source-term Management Report

    International Nuclear Information System (INIS)

    2014-04-01

    Since the beginning of the 1990s, occupational exposures in nuclear power plants have strongly decreased, reflecting the efforts made by nuclear operators worldwide to reach and maintain occupational exposure as low as reasonably achievable (ALARA) in accordance with international recommendations and national regulations. These efforts have focused on both technical and organisational aspects. According to many radiation protection experts, one of the key factors in reaching this goal is the management of the primary system water chemistry and the ability to avoid dissemination of radioactivity within the system. This underlines the importance for radiation protection staff to work closely with chemistry staff (as well as operations staff) and thus to have sufficient knowledge to understand the links between chemistry and the generation of radiation fields. This report was prepared with the primary objective of providing such knowledge to 'non-chemists'. The publication primarily focuses on three topics: water chemistry, source term management and remediation techniques. One key objective of the report is to provide current knowledge regarding these topics and to clearly address the related radiation protection issues. With that in mind, the report prepared by the EGWC was also reviewed by radiation protection experts. In order to address various designs, PWRs, VVERs, PHWRs and BWRs are covered within the document. Additionally, available information on current operating units and lessons learnt is outlined, together with choices that have been made for the design of new plants. Chapter 3 of this report addresses current practices regarding primary chemistry management for different designs, 'how to limit activity in the primary circuit and to minimise contamination'. General information is provided regarding activation, corrosion and transport of activated materials in the primary circuit (background on radiation field generation). Primary chemistry aspects that

  20. Qualitative uncertainty analysis in probabilistic safety assessment context

    International Nuclear Information System (INIS)

    Apostol, M.; Constantin, M.; Turcu, I.

    2007-01-01

    In the Probabilistic Safety Assessment (PSA) context, an uncertainty analysis is performed either to estimate the uncertainty in the final results (the risk to public health and safety) or to estimate the uncertainty in some intermediate quantities (the core damage frequency, the radionuclide release frequency or the fatality frequency). The identification and evaluation of uncertainty are important tasks because they lend credibility to the results and help in the decision-making process. Uncertainty analysis can be performed qualitatively or quantitatively. This paper performs a preliminary qualitative uncertainty analysis by identifying the major uncertainties in the PSA Level 1 - Level 2 interface and in the other two major procedural steps of a Level 2 PSA, i.e. the analysis of accident progression and of the containment, and the analysis of the source term for severe accidents. One should mention that a Level 2 PSA for a Nuclear Power Plant (NPP) involves the evaluation and quantification of the mechanisms, amounts and probabilities of subsequent radioactive material releases from the containment. According to NUREG-1150, an important task in source term analysis is the fission product transport analysis. The uncertainties related to the isotope distribution in the CANDU NPP primary circuit and to the isotope masses transferred to the containment, obtained using the SOPHAEROS module of the ASTEC computer code, will also be presented. (authors)